Algorithms: A procedure consisting of a sequence of algebraic formulas and/or logical steps to calculate or determine a given task.
Software: Sequential operating programs and data which instruct the functioning of a digital computer.
Pattern Recognition, Automated: In INFORMATION RETRIEVAL, machine-sensing or identification of visible patterns (shapes, forms, and configurations). (Harrod's Librarians' Glossary, 7th ed)
Computer Simulation: Computer-based representation of physical systems and phenomena such as chemical processes.
Computational Biology: A field of biology concerned with the development of techniques for the collection and manipulation of biological data, and the use of such data to make biological discoveries or predictions. This field encompasses all computational methods and theories for solving biological problems, including manipulation of models and datasets.
Reproducibility of Results: The statistical reproducibility of measurements (often in a clinical context), including the testing of instrumentation or techniques to obtain reproducible results. The concept includes reproducibility of physiological measurements, which may be used to develop rules to assess probability or prognosis, or response to a stimulus; reproducibility of occurrence of a condition; and reproducibility of experimental results.
Artificial Intelligence: Theory and development of COMPUTER SYSTEMS which perform tasks that normally require human intelligence. Such tasks may include speech recognition, LEARNING, VISUAL PERCEPTION, MATHEMATICAL COMPUTING, reasoning, PROBLEM SOLVING, DECISION-MAKING, and translation of language.
Models, Statistical: Statistical formulations or analyses which, when applied to data and found to fit the data, are then used to verify the assumptions and parameters used in the analysis. Examples of statistical models are the linear model, binomial model, polynomial model, two-parameter model, etc.
Sensitivity and Specificity: Binary classification measures used to assess test results. Sensitivity (or recall) is the proportion of actual positives that the test correctly identifies; specificity is the proportion of actual negatives that the test correctly identifies. (From Last, Dictionary of Epidemiology, 2d ed)
Cluster Analysis: A set of statistical methods used to group variables or observations into strongly inter-related subgroups. In epidemiology, it may be used to analyze a closely grouped series of events or cases of disease or other health-related phenomena with well-defined distribution patterns in relation to time or place or both.
Image Processing, Computer-Assisted: A technique of inputting two-dimensional images into a computer and then enhancing or analyzing the imagery into a form that is more useful to the human observer.
Sequence Analysis, Protein: A process that includes the determination of the AMINO ACID SEQUENCE of a protein (or peptide, oligopeptide or peptide fragment) and the information analysis of the sequence.
Sequence Alignment: The arrangement of two or more amino acid or base sequences from an organism or organisms in such a way as to align areas of the sequences sharing common properties. The degree of relatedness or homology between the sequences is predicted computationally or statistically based on weights assigned to the elements aligned between the sequences. This in turn can serve as a potential indicator of the genetic relatedness between the organisms.
Image Interpretation, Computer-Assisted: Methods developed to aid in the interpretation of ultrasound, radiographic images, etc., for diagnosis of disease.
Phantoms, Imaging: Devices or objects used in various imaging techniques to visualize or enhance visualization by simulating conditions encountered in the procedure. Phantoms are very often used in procedures employing or measuring x-irradiation or radioactive material to evaluate performance, and often have properties similar to human tissue. Water demonstrates absorbing properties similar to normal tissue, hence water-filled phantoms are used to map radiation levels. Phantoms are also used as teaching aids to simulate real conditions with x-ray or ultrasonic machines. (From Iturralde, Dictionary and Handbook of Nuclear Medicine and Clinical Imaging, 1990)
Models, Genetic: Theoretical representations that simulate the behavior or activity of genetic processes or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.
Signal Processing, Computer-Assisted: Computer-assisted processing of electric, ultrasonic, or electronic signals to interpret function and activity.
Software Validation: The act of testing software for compliance with a standard.
Imaging, Three-Dimensional: The process of generating three-dimensional images by electronic, photographic, or other methods. For example, three-dimensional images can be generated by assembling multiple tomographic images with the aid of a computer, while photographic 3-D images (HOLOGRAPHY) can be made by exposing film to the interference pattern created when two laser light sources shine on an object.
Sequence Analysis, DNA: A multistage process that includes cloning, physical mapping, subcloning, determination of the DNA SEQUENCE, and information analysis.
Image Enhancement: Improvement of the quality of a picture by various techniques, including computer processing, digital filtering, echocardiographic techniques, light and ultrastructural MICROSCOPY, fluorescence spectrometry and microscopy, scintigraphy, and in vitro image processing at the molecular level.
Markov Chains: A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
Proteins: Linear POLYPEPTIDES that are synthesized on RIBOSOMES and may be further modified, crosslinked, cleaved, or assembled into complex proteins with several subunits. The specific sequence of AMINO ACIDS determines the shape the polypeptide will take during PROTEIN FOLDING, and the function of the protein.
Databases, Protein: Databases containing information about PROTEINS such as AMINO ACID SEQUENCE, PROTEIN CONFORMATION, and other properties.
Bayes Theorem: A theorem in probability theory named for Thomas Bayes (1702-1761). In epidemiology, it is used to obtain the probability of disease in a group of people with some characteristic on the basis of the overall rate of that disease and of the likelihood of that characteristic in healthy and diseased individuals. The most familiar application is in clinical decision analysis, where it is used for estimating the probability of a particular diagnosis given the appearance of some symptoms or test result.
Gene Expression Profiling: The determination of the pattern of genes expressed at the level of GENETIC TRANSCRIPTION, under specific circumstances or in a specific cell.
Monte Carlo Method: In statistics, a technique for numerically approximating the solution of a mathematical problem by studying the distribution of some random variable, often generated by a computer. The name alludes to the randomness characteristic of the games of chance played at the gambling casinos in Monte Carlo. (From Random House Unabridged Dictionary, 2d ed, 1993)
Computer Graphics: The process of pictorial communication between humans and computers, in which the computer input and output have the form of charts, drawings, or other appropriate pictorial representation.
Automation: Controlled operation of an apparatus, process, or system by mechanical or electronic devices that take the place of human organs of observation, effort, and decision. (From Webster's Collegiate Dictionary, 1993)
Databases, Factual: Extensive collections, reputedly complete, of facts and data garnered from material of a specialized subject area and made available for analysis and application. The collection can be automated by various contemporary methods for retrieval. The concept should be differentiated from DATABASES, BIBLIOGRAPHIC, which is restricted to collections of bibliographic references.
Oligonucleotide Array Sequence Analysis: Hybridization of a nucleic acid sample to a very large set of OLIGONUCLEOTIDE PROBES, which have been attached individually in columns and rows to a solid support, to determine a BASE SEQUENCE, or to detect variations in a gene sequence, GENE EXPRESSION, or for GENE MAPPING.
Neural Networks (Computer): A computer architecture, implementable in either hardware or software, modeled after biological neural networks. Like the biological system, in which the processing capability is a result of the interconnection strengths between arrays of nonlinear processing nodes, computerized neural networks, often called perceptrons or multilayer connectionist models, consist of neuron-like units. A homogeneous group of units makes up a layer. These networks are good at pattern recognition. They are adaptive, performing tasks by example, and thus are better for decision-making than are linear learning machines or cluster analysis. They do not require explicit programming.
Numerical Analysis, Computer-Assisted: Computer-assisted study of methods for obtaining useful quantitative solutions to problems that have been expressed mathematically.
Models, Theoretical: Theoretical representations that simulate the behavior or activity of systems, processes, or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.
User-Computer Interface: The portion of an interactive computer program that issues messages to and receives commands from a user.
Data Compression: Information application based on a variety of coding methods to minimize the amount of data to be stored, retrieved, or transmitted. Data compression can be applied to various forms of data, such as images and signals. It is used to reduce costs and increase efficiency in the maintenance of large volumes of data.
Fuzzy Logic: Approximate, quantitative reasoning that is concerned with the linguistic ambiguity which exists in natural or synthetic language. At its core are variables such as good, bad, and young, as well as modifiers such as more, less, and very. These ordinary terms represent fuzzy sets in a particular problem. Fuzzy logic plays a key role in many medical expert systems.
Artifacts: Any visible result of a procedure which is caused by the procedure itself and not by the entity being analyzed. Common examples include histological structures introduced by tissue processing, radiographic images of structures that are not naturally present in living tissue, and products of chemical reactions that occur during analysis.
Diagnosis, Computer-Assisted: Application of computer programs designed to assist the physician in solving a diagnostic problem.
Databases, Genetic: Databases devoted to knowledge about specific genes and gene products.
Data Interpretation, Statistical: Application of statistical procedures to analyze specific observed or assumed facts from a particular study.
Models, Biological: Theoretical representations that simulate the behavior or activity of biological processes or diseases. For disease models in living animals, DISEASE MODELS, ANIMAL is available. Biological models include the use of mathematical equations, computers, and other electronic equipment.
Normal Distribution: Continuous frequency distribution of infinite range. Its properties are as follows: 1) a continuous, symmetrical distribution with both tails extending to infinity; 2) arithmetic mean, mode, and median identical; and 3) shape completely determined by the mean and standard deviation.
Information Storage and Retrieval: Organized activities related to the storage, location, search, and retrieval of information.
Likelihood Functions: Functions constructed from a statistical model and a set of observed data which give the probability of that data for various values of the unknown model parameters. Those parameter values that maximize the probability are the maximum likelihood estimates of the parameters.
Radiographic Image Interpretation, Computer-Assisted: Computer systems or networks designed to provide radiographic interpretive information.
Genomics: The systematic study of the complete DNA sequences (GENOME) of organisms.
Internet: A loose confederation of computer communication networks around the world. The networks that make up the Internet are connected through several backbone networks. The Internet grew out of the US Government ARPAnet project and was designed to facilitate information exchange.
Decision Trees: A graphic device used in decision analysis in which a series of decision options is represented as branches (hierarchical).
Radiographic Image Enhancement: Improvement in the quality of an x-ray image by use of an intensifying screen, tube, or filter and by optimum exposure techniques. Digital processing methods are often employed.
Subtraction Technique: Combination or superimposition of two images for demonstrating differences between them (e.g., radiograph with contrast vs. one without, radionuclide images using different radionuclides, radiograph vs. radionuclide image) and in the preparation of audiovisual materials (e.g., offsetting identical images, coloring of vessels in angiograms).
Programming Languages: Specific languages used to prepare computer programs.
Wavelet Analysis: A signal and data processing method that uses decomposition of wavelets to approximate, estimate, or compress signals with finite time and frequency domains. It represents a signal or data in terms of a fast-decaying wavelet series derived from the original prototype wavelet, called the mother wavelet. This mathematical algorithm has been adopted widely in biomedical disciplines for data and signal processing in noise removal and audio/image compression (e.g., EEG and MRI).
Computing Methodologies: Computer-assisted analysis and processing of problems in a particular area.
Signal-To-Noise Ratio: The comparison of the quantity of meaningful data to the irrelevant or incorrect data.
Data Mining: Use of sophisticated analysis tools to sort through, organize, examine, and combine large sets of information.
Protein Interaction Mapping: Methods for determining interactions between PROTEINS.
Models, Molecular: Models used experimentally or theoretically to study molecular shape, electronic properties, or interactions; includes analogous molecules, computer-generated graphics, and mechanical structures.
Wireless Technology: Techniques using energy such as radio frequency, infrared light, laser light, visible light, or acoustic energy to transfer information without the use of wires, over both short and long distances.
Support Vector Machines: A set of related supervised learning methods that analyze data and recognize patterns, used for classification and regression analysis.
Automatic Data Processing: Data processing largely performed by automatic means.
Software Design: Specifications and instructions applied to software.
Sequence Analysis, RNA: A multistage process that includes cloning, physical mapping, subcloning, sequencing, and information analysis of an RNA SEQUENCE.
Computers
Molecular Sequence Data: Descriptions of specific amino acid, carbohydrate, or nucleotide sequences which have appeared in the published literature and/or are deposited in and maintained by databanks such as GENBANK, the European Molecular Biology Laboratory (EMBL), the National Biomedical Research Foundation (NBRF), or other sequence repositories.
Stochastic Processes: Processes that incorporate some element of randomness, used particularly to refer to a time series of random variables.
Genome: The genetic complement of an organism, including all of its GENES, as represented in its DNA, or in some cases, its RNA.
Gene Regulatory Networks: Interacting DNA-encoded regulatory subsystems in the GENOME that coordinate input from activator and repressor TRANSCRIPTION FACTORS during development, cell differentiation, or in response to environmental cues. The networks function to ultimately specify expression of particular sets of GENES for specific conditions, times, or locations.
ROC Curve: A graphic means for assessing the ability of a screening test to discriminate between healthy and diseased persons; may also be used in other studies, e.g., distinguishing responses to faint stimuli from responses to nonstimuli.
Equipment Design: Methods of creating machines and devices.
Models, Chemical: Theoretical representations that simulate the behavior or activity of chemical processes or phenomena; includes the use of mathematical equations, computers, and other electronic equipment.
Probability: The study of chance processes or the relative frequency characterizing a chance process.
Predictive Value of Tests: In screening and diagnostic tests, the probability that a person with a positive test is a true positive (i.e., has the disease) is referred to as the predictive value of a positive test, whereas the predictive value of a negative test is the probability that the person with a negative test does not have the disease. Predictive value is related to the sensitivity and specificity of the test.
Chromosome Mapping: Any method used for determining the location of and relative distances between genes on a chromosome.
Phylogeny: The relationships of groups of organisms as reflected by their genetic makeup.
Magnetic Resonance Imaging: Non-invasive method of demonstrating internal anatomy based on the principle that atomic nuclei in a strong magnetic field absorb pulses of radiofrequency energy and emit them as radiowaves, which can be reconstructed into computerized images. The concept includes proton spin tomographic techniques.
Time Factors: Elements of limited time intervals, contributing to particular results or situations.
Base Sequence: The sequence of PURINES and PYRIMIDINES in nucleic acids and polynucleotides. It is also called the nucleotide sequence.
Discriminant Analysis: A statistical analytic technique used with discrete dependent variables, concerned with separating sets of observed values and allocating new values. It is sometimes used instead of regression analysis.
Cone-Beam Computed Tomography: Computed tomography modalities which use a cone- or pyramid-shaped beam of radiation.
Tomography, X-Ray Computed: Tomography using x-ray transmission and a computer algorithm to reconstruct the image.
Least-Squares Analysis: A principle of estimation in which the estimates of a set of parameters in a statistical model are those quantities minimizing the sum of squared differences between the observed values of a dependent variable and the values predicted by the model.
Nonlinear Dynamics: The study of systems which respond disproportionately (nonlinearly) to initial conditions or perturbing stimuli. Nonlinear systems may exhibit "chaos," which is classically characterized as sensitive dependence on initial conditions. Chaotic systems, while distinguished from more ordered periodic systems, are not random. When their behavior over time is appropriately displayed (in "phase space"), constraints are evident which are described by "strange attractors." Phase space representations of chaotic systems, or strange attractors, usually reveal fractal (FRACTALS) self-similarity across time scales. Natural, including biological, systems often display nonlinear dynamics and chaos.
Programming, Linear: A technique of operations research for solving certain kinds of problems involving many variables, where a best value or set of best values is to be found. It is most likely to be feasible when the quantity to be optimized, sometimes called the objective function, can be stated as a mathematical expression in terms of the various activities within the system, when this expression is simply proportional to the measure of the activities (i.e., is linear), and when all the restrictions are also linear. It is different from computer programming, although problems using linear programming techniques may be programmed on a computer.
Equipment Failure Analysis: The evaluation of incidents involving the loss of function of a device. These evaluations are used for a variety of purposes, such as to determine failure rates, the causes of failures, the costs of failures, and the reliability and maintainability of devices.
Genome, Human: The complete genetic complement contained in the DNA of a set of CHROMOSOMES in a HUMAN. The length of the human genome is about 3 billion base pairs.
Proteomics: The systematic study of the complete complement of proteins (PROTEOME) of organisms.
Databases, Nucleic Acid: Databases containing information about NUCLEIC ACIDS such as BASE SEQUENCE, SNPS, NUCLEIC ACID CONFORMATION, and other properties. Information about the DNA fragments kept in a GENE LIBRARY or GENOMIC LIBRARY is often maintained in DNA databases.
Principal Component Analysis: A mathematical procedure that transforms a number of possibly correlated variables into a smaller number of uncorrelated variables called principal components.
Polymorphism, Single Nucleotide: A single nucleotide variation in a genetic sequence that occurs at appreciable frequency in the population.
Amino Acid Sequence: The order of amino acids as they occur in a polypeptide chain. This is referred to as the primary structure of proteins. It is of fundamental importance in determining PROTEIN CONFORMATION.
Computer Communication Networks: A system containing any combination of computers, computer terminals, printers, audio or visual display devices, or telephones interconnected by telecommunications equipment or cables, used to transmit or receive information. (Random House Unabridged Dictionary, 2d ed)
Natural Language Processing: Computer processing of a language with rules that reflect and describe current usage rather than prescribed usage.
Tomography: Imaging methods that result in sharp images of objects located on a chosen plane, and blurred images of objects located above or below the plane.
Proteome: The protein complement of an organism coded for by its genome.
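Several of the entries above (Sensitivity and Specificity, Bayes Theorem, Predictive Value of Tests) are tied together by a single identity: Bayes' theorem converts a test's sensitivity and specificity, plus disease prevalence, into the positive predictive value. A minimal sketch, with illustrative numbers:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Bayes' theorem: P(disease | positive test)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A test with 90% sensitivity and 95% specificity applied where
# prevalence is only 1% yields a PPV of about 15%.
ppv = positive_predictive_value(0.90, 0.95, 0.01)
print(round(ppv, 3))  # -> 0.154
```

This is why a "good" test can still produce mostly false positives when the condition is rare.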

*  Algorithms Software - SourceForge.net

Algorithms Software. Free, secure and fast downloads from the largest Open Source applications and software directory ...
https://sourceforge.net/directory/development/algorithms/natlanguage:arabic/language:asp_dot_net/?sort=update

*  Algorithms Software - SourceForge.net

Algorithms Software. Free, secure and fast downloads from the largest Open Source applications and software directory ... Hot topics in Algorithms Software: photogrammetry, photogrammetry software, detectmite, simplex method java, linear regex word list ... The Numerical Learning Library intends to provide a wide range of machine learning algorithms. It is a generic and efficient ... Library of terrain level-of-detail algorithm nodes for Coin (an implementation of an Open Inventor scene graph) and other ...
https://sourceforge.net/directory/development/algorithms/?sort=popular&page=12

*  Algorithms Software - SourceForge.net

Algorithms Software. Free, secure and fast downloads from the largest Open Source applications and software directory ...
https://sourceforge.net/directory/development/algorithms/developmentstatus:inactive/language:matlab/?sort=popular

*  Algorithms Software - SourceForge.net

Algorithms Software. Free, secure and fast downloads from the largest Open Source applications and software directory ...
https://sourceforge.net/directory/development/algorithms/language:plsql/language:javascript/?sort=rating

*  Algorithms Software - SourceForge.net

Algorithms Software. Free, secure and fast downloads from the largest Open Source applications and software directory ... it will compare the Genetic Algorithm solution of the Knapsack problem to a greedy algorithm. ... genetic algorithm simulation toolbox 1 wire c# net temp dsp public domain fft ...
https://sourceforge.net/directory/development/algorithms/developmentstatus:alpha/license:publicdomain/?sort=name
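The listing above compares a genetic-algorithm knapsack solver to a greedy one. The comparison is interesting because greedy selection by value/weight ratio is fast but not always optimal; a small sketch with made-up item data shows the gap:

```python
from itertools import combinations

def greedy_knapsack(items, capacity):
    """Greedy by value/weight ratio: fast, but can miss the optimum."""
    total_v = total_w = 0
    for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if total_w + w <= capacity:
            total_v += v
            total_w += w
    return total_v

def optimal_knapsack(items, capacity):
    """Exhaustive search: exact, but exponential in len(items)."""
    best = 0
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            if sum(w for _, w in combo) <= capacity:
                best = max(best, sum(v for v, _ in combo))
    return best

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight), illustrative
print(greedy_knapsack(items, 50), optimal_knapsack(items, 50))  # -> 160 220
```

A GA sits between these extremes: it searches the subset space stochastically rather than exhaustively.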

*  Algorithms Software - SourceForge.net

Algorithms Software. Free, secure and fast downloads from the largest Open Source applications and software directory ... An easy-to-extend, highly graphical, easy-to-use 2D robot simulator specialized for path planning algorithms. ... Can be used in testing various robotic algorithms, and already used for comparison of path planning algorithms like RRT ...
https://sourceforge.net/directory/development/algorithms/natlanguage:turkish/developmentstatus:alpha/?sort=name

*  Linux Algorithms Software - SourceForge.net

Linux Algorithms Software. Free, secure and fast downloads from the largest Open Source applications and software directory ... NSketch, .Net sketch-based algorithms. The NSketch library provides implementations of most common sketch-based algorithms ( ... In this project we develop software that uses an evolutionary algorithm to evolve a solution that can drive a car on an ...
https://sourceforge.net/directory/development/algorithms/language:csharp/os:linux/?page=3
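NSketch is a .NET library; as a language-neutral illustration of one of the "sketch-based algorithms" such libraries implement, here is a count-min sketch in Python. The class and its parameters are illustrative, not NSketch's actual API:

```python
import hashlib

class CountMinSketch:
    """Approximate frequency counts in sublinear memory. Estimates never
    undercount; hash collisions can only inflate them."""
    def __init__(self, width=256, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, item, row):
        digest = hashlib.sha256(f"{row}:{item}".encode()).hexdigest()
        return int(digest, 16) % self.width

    def add(self, item):
        for row in range(self.depth):
            self.table[row][self._index(item, row)] += 1

    def estimate(self, item):
        # The minimum over rows discards most collision inflation.
        return min(self.table[row][self._index(item, row)]
                   for row in range(self.depth))

cms = CountMinSketch()
for word in ["lzma"] * 5 + ["zlib"] * 2:
    cms.add(word)
print(cms.estimate("lzma"), cms.estimate("zlib"))  # -> 5 2
```

With so few distinct items the estimates are exact; the approximation error only appears as the stream's cardinality grows relative to the table width.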

*  BSD Algorithms Software - SourceForge.net

BSD Algorithms Software. Free, secure and fast downloads from the largest Open Source applications and software directory ... Hot topics in Algorithms Software: quine-mccluskey c++ lzma samtools zlib quine mccluskey c code acl-tizen codec win64 disk ... BlueDS is a library that contains implementations of frequently used data structures and algorithms in Computer Science.
https://sourceforge.net/directory/development/algorithms/os:bsd/license:publicdomain/?sort=popular

*  Algorithms for Manipulating Compressed Images

A family of algorithms that implement operations on compressed digital images is described. These algorithms allow many ... Brian C. Smith, Lawrence A. Rowe, "Algorithms for Manipulating Compressed Images", IEEE Computer Graphics and Applications, vol ...
https://computer.org/csdl/mags/cg/1993/05/mcg1993050034-abs.html
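Smith and Rowe's algorithms work in the JPEG/DCT domain. As a much simpler illustration of the same idea, operating on an image in its compressed form without decompressing it, here is a sketch using run-length encoding (a toy format chosen for clarity, not the paper's method):

```python
def rle_encode(pixels):
    """Run-length encode a row of binary pixels as [value, length] runs."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1
        else:
            runs.append([p, 1])
    return runs

def rle_invert(runs):
    """Invert a binary image directly in compressed form: O(number of
    runs), never touching individual pixels."""
    return [[1 - value, length] for value, length in runs]

row = [0, 0, 0, 1, 1, 0]
runs = rle_encode(row)   # [[0, 3], [1, 2], [0, 1]]
print(rle_invert(runs))  # -> [[1, 3], [0, 2], [1, 1]]
```

The payoff is the same as in the DCT case: the cost scales with the compressed size, not the pixel count.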

*  Symmetric-key algorithm - Wikipedia

Symmetric-key algorithms are algorithms for cryptography that use the same cryptographic keys for both encryption of ... Encryption algorithm example: the following steps should be followed to develop an encrypted text ... Types of symmetric-key algorithms: symmetric-key encryption can use either stream ciphers or block ciphers. ... Symmetric-key algorithms require both the sender and the recipient of a message to have the same secret key. All early ...
https://en.wikipedia.org/wiki/Symmetric_key
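The defining same-key property is easy to demonstrate with a toy XOR stream cipher. This is a sketch for illustration only: the keystream construction is ad hoc and not a secure or standardized algorithm.

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    """Derive an endless keystream from the key (toy construction)."""
    for counter in count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream. Applying it twice restores the
    input, so the same secret key both encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

key = b"shared secret"
ciphertext = xor_cipher(key, b"attack at dawn")
print(xor_cipher(key, ciphertext))  # -> b'attack at dawn'
```

Both parties needing the identical `key` is exactly the key-distribution problem the article alludes to.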

*  De novo peptide sequencing and spectral alignment algorithm via tandem mass spectrometry :: University of Southern California...

Tandem mass spectrometry (MS/MS) has become an important experimental method for high-throughput proteomics-based biological discovery. The most common usage of MS/MS in biological applications is peptide sequencing. In this thesis, we focus on algorithms for MS/MS peptide identification and spectral alignment. We carry out two studies: (1) We have developed a de novo sequencing algorithm called MSNovo that integrates a new probabilistic scoring function with a mass-array-based dynamic programming algorithm. MSNovo works on various MS data generated from both LCQ and LTQ mass spectrometers and interprets singly, doubly and triply charged ions. In tests, MSNovo performed better than previous algorithms on several datasets. (2) We have developed spectrum-peptide and spectrum-spectrum alignment algorithms, collectively called MSPEP. MSPEP identifies post-translational modifications through the spectrum-peptide alignment algorithm and reveals the relationship among unknown ...
digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/266475/rec/6
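A heavily simplified sketch of the idea behind de novo sequencing: on an ideal spectrum, the gaps between consecutive prefix masses identify the residues. Real algorithms such as MSNovo must instead score noisy, incomplete spectra probabilistically; the residue set, nominal integer masses, and peptide below are all illustrative.

```python
# Nominal integer residue masses for a few amino acids (simplified).
RESIDUE_MASS = {"G": 57, "A": 71, "S": 87, "P": 97, "V": 99, "L": 113}

def sequence_from_ladder(prefix_masses):
    """Read a peptide off an ideal prefix-mass ladder: each gap between
    consecutive prefix masses must match exactly one residue mass."""
    mass_to_residue = {m: aa for aa, m in RESIDUE_MASS.items()}
    peptide = []
    for prev, curr in zip(prefix_masses, prefix_masses[1:]):
        peptide.append(mass_to_residue[curr - prev])
    return "".join(peptide)

# Ladder for the hypothetical peptide G-A-S-L: 0, 57, 128, 215, 328.
print(sequence_from_ladder([0, 57, 128, 215, 328]))  # -> GASL
```

Missing or spurious peaks are what turn this lookup into the dynamic programming problem the thesis addresses.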

*  Model structure selection for a discrete-time non-linear system using genetic algorithm - Universiti Teknologi Malaysia...

In recent years, extensive work on genetic algorithms has been reported covering various applications. Genetic algorithms (GAs) have received significant interest from researchers and have been applied to various optimization problems. They offer many advantages, such as global search characteristics, and this has led to the idea of using this programming method in modelling dynamic non-linear systems. In this paper, a methodology for model structure selection based on a genetic algorithm was developed and applied to non-linear discrete-time dynamic systems. First, the effect of different combinations of GA operators on the performance of the developed model is studied. A proposed algorithm, called modified GA or MGA, is presented, and a comparison between a simple GA and the modified GA is carried out. The performance of the proposed algorithm is also compared to the model developed using the orthogonal least squares (OLS) algorithm. The adequacy of the developed models is ...
eprints.utm.my/7107/
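A minimal sketch of the GA machinery such work builds on: tournament selection, one-point crossover, and bit-flip mutation over bit strings, where each bit marks a candidate model term as included or excluded. The 16-bit "target structure" and the fitness function are made-up stand-ins for a least-squares criterion, not the paper's MGA or its OLS comparison.

```python
import random

random.seed(1)

def evolve(fitness, n_bits=16, pop_size=30, generations=80, p_mut=0.05):
    """Minimal genetic algorithm over fixed-length bit strings."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            a = max(random.sample(pop, 3), key=fitness)   # tournament
            b = max(random.sample(pop, 3), key=fitness)
            cut = random.randrange(1, n_bits)             # one-point crossover
            child = [bit ^ (random.random() < p_mut)      # bit-flip mutation
                     for bit in a[:cut] + b[cut:]]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)              # track best-ever
    return best

# Hypothetical target structure; fitness counts matching term choices.
target = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
best = evolve(lambda bits: -sum(b != t for b, t in zip(bits, target)))
print(sum(b != t for b, t in zip(best, target)))  # mismatches remaining
```

Modified GAs like the paper's MGA typically vary exactly these operators and their rates.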

*  Evolutionary multi-objective optimization algorithms with probabilistic representation based on pheromone trails - Nottingham...

Recently, the research on quantum-inspired evolutionary algorithms (QEA) has attracted some attention in the area of evolutionary computation. QEA use a probabilistic representation, called Q-bit, to encode individuals in population. Unlike standard evolutionary algorithms, each Q-bit individual is a probability model, which can represent multiple solutions. Since probability models store global statistical information of good solutions found previously in the search, QEA have good potential to deal with hard optimization problems with many local optimal solutions. So far, not much work has been done on evolutionary multi-objective (EMO) algorithms with probabilistic representation. In this paper, we investigate the performance of two state-of-the-art EMO algorithms - MOEA/D and NSGA-II, with probabilistic representation based on pheromone trails, on the multi-objective travelling salesman problem. Our experimental results show that MOEA/D ...
eprints.nottingham.ac.uk/35592/
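The core notion shared by EMO algorithms such as MOEA/D and NSGA-II is Pareto dominance: with multiple objectives there is no single "best" tour, only a front of non-dominated trade-offs. A small sketch with made-up bi-objective values (minimization):

```python
def dominates(a, b):
    """a Pareto-dominates b if it is no worse in every objective and
    strictly better in at least one (minimization)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep the points no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (tour_length_1, tour_length_2) pairs for a bi-objective TSP.
pts = [(10, 4), (8, 7), (9, 5), (12, 3), (11, 6)]
print(pareto_front(pts))  # -> [(10, 4), (8, 7), (9, 5), (12, 3)]
```

NSGA-II ranks the population by repeated applications of exactly this dominance test; MOEA/D instead decomposes the front into scalar subproblems.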

*  Parallel genetic algorithms: an efficient model and applications in control systems - SEA

Optimisation is an adaptive process and it is widely applied in science and engineering, from scheduling a manufacturing process to controlling a spacecraft. Genetic algorithms are search and optimisation methods based on the mechanics of natural evolution. They provide promising results in non-linear and complex search problems, and have been proven to be parallel and global in nature. Genetic algorithms run slowly on sequential machines, which is their major drawback. Most applications of genetic algorithms in engineering are in the area of design and schedule optimisation, where usually enough time is available to simulate the algorithm. The computer architecture is a main bottleneck, since sequential computation does not reflect the true spatial structure of the algorithm. There are a couple of parallel models and implementations available which improve the performance of these algorithms. The aim of this research is to develop a new model ...
ssudl.solent.ac.uk/1266/

*  Biologically inspired algorithms applied to prosthetic control

Biologically inspired algorithms were used in this work to approach different components of pattern recognition applied to the control of robotic prosthetics. In order to contribute a different training paradigm, Evolutionary Algorithms (EA) and Particle Swarm Optimization (PSO) were used to train an Artificial Neural Network (ANN). Since the optimal input set of signal features is yet unknown, a Genetic Algorithm (GA) was used to approach this problem. The training length and rate of convergence were considered in the search for an optimal set of signal features, as well as for the optimal time-window length. The ANN proved to be an accurate pattern recognition algorithm, predicting 10 movements with over 95% accuracy. Moreover, new combinations of signal features with higher convergence rates than those commonly found in the literature were discovered by the GA. It was also found that the PSO had better performance than the EA as a training algorithm, but worse than the ...
publications.lib.chalmers.se/publication/162938-biologically-inspired-algorithms-applied-to-prosthetic-control
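To make the GA-based feature selection concrete, here is a minimal, hypothetical Python sketch: a GA searches over binary feature masks with tournament selection, uniform crossover and bit-flip mutation. The fitness function is an invented stand-in for the ANN accuracy used in the thesis (the informative feature indices and the penalty weight are purely illustrative).

```python
import random

# Hypothetical stand-in for an ANN-based fitness: a mask scores high when it
# covers the "informative" features (indices 0-4) and is penalized for each
# redundant feature it drags in.
INFORMATIVE = set(range(5))

def fitness(mask):
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & INFORMATIVE) - 0.2 * len(chosen - INFORMATIVE)

def select_features(n_features=20, pop_size=30, generations=40, seed=0):
    """GA over binary feature masks. A sketch of the approach, not the
    thesis implementation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)                      # tournament for parent 1
            p1 = a if fitness(a) >= fitness(b) else b
            c, d = rng.sample(pop, 2)                      # tournament for parent 2
            p2 = c if fitness(c) >= fitness(d) else d
            child = [x if rng.random() < 0.5 else y for x, y in zip(p1, p2)]
            child = [bit ^ (rng.random() < 0.02) for bit in child]  # mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = select_features()
chosen = {i for i, bit in enumerate(best) if bit}
```

In the thesis the fitness evaluation would instead train the ANN on the masked feature set and score its accuracy and convergence rate, which is far more expensive per individual.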

*  Introduction to Parallel Algorithms and Architectures - O'Reilly Media

Introduction to Parallel Algorithms and Architectures: Arrays Trees Hypercubes provides an introduction to the expanding field of parallel algorithms and architectures. This book focuses on parallel computation involving the most popular network...
shop.oreilly.com/product/9781558601178.do

*  genetic algorithm | Hackaday

[Kory] has been writing genetic algorithms for a few months now. This in itself isn't anything unique or exceptional, except for what he's getting these genetic algorithms to do. [Kory] has been using genetic algorithms to write programs in Brainfuck. Yes, it's a computer programming a computer. Be thankful Skynet is 18 years late. When we first saw [Kory]'s work, he had programmed a computer to write and run its own programs in Brainfuck. Although the name of the language [Kory] chose could use some work, it's actually the ideal language for computer-generated programs. With only eight commands, each consisting of a single character, it greatly reduces the overhead of what any genetic algorithm must produce and what a fitness function must evaluate. There was one shortcoming to [Kory]'s initial efforts: functions. It's relatively easy to get a program to say Hello World, but to do something complex, you're going to need something like a macro or a ...
hackaday.com/tag/genetic-algorithm/
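Because Brainfuck has only eight single-character commands, the interpreter, which is essentially the fitness-evaluation harness such a GA needs, fits in a few lines. A minimal Python sketch (not [Kory]'s code):

```python
def run_bf(code, tape_len=30000):
    """Minimal Brainfuck interpreter: eight single-character commands, which
    is what makes the language attractive for machine-generated programs."""
    # precompute matching brackets so loops jump in O(1)
    stack, jumps = [], {}
    for i, ch in enumerate(code):
        if ch == '[':
            stack.append(i)
        elif ch == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    tape, ptr, pc, out = [0] * tape_len, 0, 0, []
    while pc < len(code):
        ch = code[pc]
        if ch == '>': ptr += 1
        elif ch == '<': ptr -= 1
        elif ch == '+': tape[ptr] = (tape[ptr] + 1) % 256
        elif ch == '-': tape[ptr] = (tape[ptr] - 1) % 256
        elif ch == '.': out.append(chr(tape[ptr]))
        elif ch == '[' and tape[ptr] == 0: pc = jumps[pc]   # skip loop body
        elif ch == ']' and tape[ptr] != 0: pc = jumps[pc]   # repeat loop body
        pc += 1
    return ''.join(out)

# adds 2 and 3 in two cells, then prints the byte with value 5
result = run_bf('++>+++[<+>-]<.')
```

A GA would feed candidate command strings to such an interpreter and score their output against a target, exactly the kind of cheap evaluation loop the eight-command instruction set makes possible.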

*  A computationally efficient method for automatic registration of orthogonal x-ray images with volumetric CT data. - The...

The paper presents a computationally efficient 3D-2D image registration algorithm for automatic pre-treatment validation in radiotherapy. The novel aspects of the algorithm include (a) a hybrid cost function based on partial digitally reconstructed radiographs (DRRs) generated along projected anatomical contours and a level set term for similarity measurement; and (b) a fast search method based on parabola fitting and a sensitivity-based search order. Using CT and orthogonal x-ray images from a skull and a pelvis phantom, the proposed algorithm is compared with the conventional ray-casting full-DRR based registration method. The algorithm is not only computationally more efficient, reducing registration time by a factor of 8, but also offers a 50% larger capture range, allowing an initial patient displacement of up to 15 mm (measured by mean target registration error). For the simulated data, high registration accuracy with average errors of 0.53 mm +/- ...
christie.openrepository.com/christie/handle/10541/68836
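The "fast search method based on parabola fitting" can be illustrated with a one-dimensional sketch: sample the cost at three points, jump to the vertex of the interpolating parabola, and shrink the sampling step. This is a generic illustration of the idea, not the paper's actual search procedure.

```python
def parabola_vertex(x2, h, y1, y2, y3):
    """Given cost samples at x2-h, x2, x2+h, return the abscissa of the
    vertex of the interpolating parabola (the predicted cost minimum)."""
    denom = y1 - 2.0 * y2 + y3
    if denom == 0:                      # samples are collinear: no curvature
        return x2
    return x2 + h * (y1 - y3) / (2.0 * denom)

def parabola_search(cost, x0, h=1.0, iters=5):
    """One-dimensional search that repeatedly jumps to the fitted parabola's
    vertex and halves the sampling step."""
    x = x0
    for _ in range(iters):
        x = parabola_vertex(x, h, cost(x - h), cost(x), cost(x + h))
        h *= 0.5
    return x

# on a quadratic cost the very first vertex jump lands on the minimum
xmin = parabola_search(lambda t: (t - 3.7) ** 2, x0=0.0)
```

The appeal for registration is economy: each cost sample requires rendering a (partial) DRR, so predicting the minimum from three samples is far cheaper than an exhaustive sweep along each pose parameter.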

*  Kaisa Miettinen - Abstracts

Sindhya, K., Ojalehto, V., Savolainen, J., Niemistö, H., Hakanen, J., Miettinen, K., Coupling Dynamic Simulation and Interactive Multiobjective Optimization for Complex Problems: An APROS-NIMBUS Case Study, Expert Systems with Applications, 41(5), 2546-2558, 2014. Dynamic process simulators for plant-wide process simulation and multiobjective optimization tools can be used by industries as a means to cut costs and enhance profitability. Specifically, dynamic process simulators are useful in the process plant design phase, as they provide several benefits such as savings in time and costs. On the other hand, multiobjective optimization tools are useful in obtaining the best possible process designs when multiple conflicting objectives are to be optimized simultaneously. Here we concentrate on interactive multiobjective optimization. When multiobjective optimization methods are used in process design, they need access to dynamic process simulators; hence it is desirable for them to coexist on ...
users.jyu.fi/~miettine/publ_abstracts.htm
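At the core of any multiobjective method, interactive or not, is Pareto dominance among conflicting objectives: the method's job is to help the decision maker choose among the non-dominated designs. A small generic Python sketch, unrelated to the APROS-NIMBUS implementation itself:

```python
def dominates(a, b):
    """True if objective vector a dominates b (minimisation): no worse in
    every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Filter a set of objective vectors down to the non-dominated set."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# two conflicting objectives, e.g. hypothetical (cost, emissions) pairs
front = pareto_front([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)])
```

In an interactive method like NIMBUS the decision maker repeatedly states preferences (e.g. which objectives to improve, which to relax), and the solver, calling the dynamic simulator for each candidate, returns new non-dominated solutions in the preferred region rather than the whole front.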

*  Which machine learning algorithm should I use? - Subconscious Musings

We generally do not want to feed a large number of features directly into a machine learning algorithm, since some features may be irrelevant or the "intrinsic" dimensionality may be smaller than the number of features. Principal component analysis (PCA), singular value decomposition (SVD), and latent Dirichlet allocation (LDA) can all be used to perform dimension reduction. PCA is an unsupervised method which maps the original data space into a lower-dimensional space while preserving as much information as possible. PCA basically finds a subspace that best preserves the data variance, with the subspace defined by the dominant eigenvectors of the data's covariance matrix. SVD is related to PCA in the sense that the SVD of the centered data matrix (features versus samples) provides the dominant left singular vectors that define the same subspace as found by PCA. However, SVD is a more versatile technique, as it can also do things that PCA may not. For example, the SVD of a ...
blogs.sas.com/content/subconsciousmusings/2017/04/12/machine-learning-algorithm-use/
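The relationship described above, PCA as the dominant eigenvectors of the covariance matrix, can be shown in a tiny two-dimensional example where the eigen-decomposition has a closed form. A minimal pure-Python sketch:

```python
import math

def pca_2d(points):
    """PCA for 2-D data: centre the data, form the 2x2 covariance matrix,
    and take its dominant eigenvector in closed form. In higher dimensions
    one would instead take the SVD of the centred data matrix, which spans
    the same subspace (as the article notes)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    centred = [(x - mx, y - my) for x, y in points]
    sxx = sum(x * x for x, _ in centred) / n
    syy = sum(y * y for _, y in centred) / n
    sxy = sum(x * y for x, y in centred) / n
    # dominant eigenvalue of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))
    # corresponding eigenvector (handle the axis-aligned case sxy == 0)
    if sxy != 0:
        v = (lam - syy, sxy)
    else:
        v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm)

# points spread along the line y = 2x: the principal axis is (1, 2)/sqrt(5)
axis = pca_2d([(-2, -4), (-1, -2), (0, 0), (1, 2), (2, 4)])
```

Projecting each centred point onto this axis gives the one-dimensional representation that preserves the most variance, which is exactly the dimension reduction the post describes.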

*  Invisible entities in your system: Encapsulated Genetic Algorithm within Input Framework creates Closed-loop model

System Owner develops "Unconventional Algorithm" for solving complex scenario in rationalization process. Input 2 data with "Encapsulated Unconventional Algorithm" process "Output 2 frameworks" through Information Processing System. Transparent analogical patterns between Input 2 and Output 2 can identify allocated "Unconventional Algorithm" within Input 2 Framework. Unconventional Algorithm focuses on instance parameters of complexity in rationalization process. Unconventional Algorithm would fail to respond to internal & external resources; besides, it could not perceive and define requirements for functional & operational level strategy. Eventually, outcome mapping for encapsulated "Unconventional Algorithm" is a partial open-loop model structure. Parameter complexity requires multiple sub-optimizations on the evolutionary path of systems performances. (Fig 1) ...
invisibleentities.blogspot.com/2015/09/encapsulated-genetic-algorithm-within.html

*  A novel algorithm and its VLSI architecture for connected component labeling | (2011) | Zhao | Publications | Spie

A novel line-based streaming labeling algorithm and its VLSI architecture are proposed in this paper. A line-based neighborhood examination scheme is used for efficient local connected component extraction. A novel reversed rooted tree hook-up strategy, which is very suitable for hardware implementation, is applied at the merging stage of equivalent connected components. The reversed rooted tree hook-up strategy significantly reduces the on-chip memory requirement, which makes the chip area smaller. Clock-domain-crossing FIFOs are also applied to connect the label core and the external memory interface, which lets the label engine work at a higher frequency and raises its throughput. Several performance tests have been performed on our proposed hardware implementation. The processing bandwidth of our hardware architecture can reach the I/O transfer boundary set by the external interface clock in all the real image tests. Besides the advantage of reducing the ...
spie.org/Publications/Proceedings/Paper/10.1117/12.902126
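The merging of equivalent provisional labels that the paper implements in hardware as a reversed rooted tree corresponds, in software, to the classic two-pass labeling algorithm with union-find. A generic Python sketch of that software analogue (not the paper's VLSI design):

```python
def label_components(img):
    """Classic two-pass connected-component labeling (4-connectivity) with
    union-find merging of equivalent provisional labels."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    parent = [0]                        # parent[l] = representative of label l

    def find(x):                        # with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    next_label = 1
    for y in range(h):                  # pass 1: provisional labels
        for x in range(w):
            if not img[y][x]:
                continue
            up = labels[y - 1][x] if y else 0
            left = labels[y][x - 1] if x else 0
            if up and left:
                ru, rl = find(up), find(left)
                labels[y][x] = min(ru, rl)
                parent[max(ru, rl)] = min(ru, rl)   # merge equivalence classes
            elif up or left:
                labels[y][x] = find(up or left)
            else:
                parent.append(next_label)           # brand-new component
                labels[y][x] = next_label
                next_label += 1
    # pass 2: resolve equivalences and compact the label numbers
    remap = {}
    for y in range(h):
        for x in range(w):
            if labels[y][x]:
                r = find(labels[y][x])
                labels[y][x] = remap.setdefault(r, len(remap) + 1)
    return labels, len(remap)

# a U shape: the two arms get separate provisional labels that merge at the bottom
labels, n = label_components([[1, 0, 1],
                              [1, 0, 1],
                              [1, 1, 1]])
```

The equivalence table is exactly the structure the hardware's reversed rooted tree compresses; a streaming design only ever needs the previous line plus this table, which is why on-chip memory can stay small.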

*  Design a framework of a genetic algorithm, Data Structure & Algorithms

Data Structure & Algorithms Assignment Help, Design a framework of a genetic algorithm, You have to design a framework of a Genetic Algorithm (GA) with basic functionality. The basic functionality includes representation, recombination operators, fitness function and selection criteria. You can implement this framework in any language of
expertsmind.com/questions/design-a-framework-of-a-genetic-algorithm-30148551.aspx
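A minimal sketch of such a framework in Python, with the four required components (representation, recombination, fitness, selection) as pluggable callables. The concrete choices below, OneMax fitness, one-point crossover, tournament selection, are just one possible instantiation:

```python
import random

class GeneticAlgorithm:
    """Minimal GA framework exposing the four components the assignment
    lists: representation (init), recombination (crossover), fitness, and
    selection. Each is a pluggable callable."""
    def __init__(self, init, fitness, crossover, mutate, select, pop_size=40):
        self.init, self.fitness = init, fitness
        self.crossover, self.mutate, self.select = crossover, mutate, select
        self.pop = [init() for _ in range(pop_size)]

    def step(self):
        scored = [(self.fitness(ind), ind) for ind in self.pop]
        self.pop = [self.mutate(self.crossover(self.select(scored),
                                               self.select(scored)))
                    for _ in range(len(self.pop))]

    def best(self):
        return max(self.pop, key=self.fitness)

rng = random.Random(7)
N = 24
ga = GeneticAlgorithm(
    init=lambda: [rng.randint(0, 1) for _ in range(N)],        # representation
    fitness=sum,                                               # OneMax objective
    crossover=lambda a, b: a[:N // 2] + b[N // 2:],            # one-point crossover
    mutate=lambda ind: [b ^ (rng.random() < 0.03) for b in ind],
    select=lambda scored: max(rng.sample(scored, 3))[1],       # tournament of 3
)
for _ in range(60):
    ga.step()
```

Swapping in a different representation or selection scheme only means passing different callables, which is the point of framing the GA as a framework rather than a single hard-coded loop.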

*  Items where Division is "College Of Engineering Sciences | Electrical Engineering Dept" and Year is 2003 - KFUPM ePrints

Abu-Al-Saud, Wajih A. and Stüber, Gordon L. (2003) Modified CIC Filter for Sample Rate Conversion in Software Radio Systems. IEEE SIGNAL PROCESSING LETTERS, 10 (5). Abdel-Magid, Y.L. and Abido, M.A. (2003) AGC tuning of interconnected reheat thermal systems with particle swarm optimization. Electronics, Circuits and Systems, 2003. ICECS 2003. Proceedings of the 2003 10th IEEE International Conference, 1. Abdel-Magid, Y.L. and Abido, M.A. (2003) Optimal multiobjective design of robust power system stabilizers using genetic algorithms. Power Systems, IEEE Transactions on, 18. Abido, M.A. (2003) Environmental/Economic Power Dispatch Using Multiobjective Evolutionary Algorithms. IEEE Trans. on Power Systems, 18 (4). pp. 1529-1537. Abido, M.A. (2003) Environmental/economic power dispatch using multiobjective evolutionary algorithms. Power Systems, IEEE Transactions on, 18. Abido, M.A. (2003) Environmental/economic power dispatch using multiobjective ...
eprints.kfupm.edu.sa/view/divisions/B4/2003.default.html

*  A Parallel Implementation of the Cellular Potts Model for Simulation of Cell-Based Morphogenesis | iCeNSA

The Cellular Potts Model (CPM) has been used in a wide variety of biological simulations. However, most current CPM implementations use a sequential modified Metropolis algorithm which restricts the size of simulations. In this paper we present a parallel CPM algorithm for simulations of morphogenesis, which includes cell-cell adhesion, a cell volume constraint, and cell haptotaxis. The algorithm uses appropriate data structures and checkerboard subgrids for parallelization. Communication and updating algorithms synchronize properties of cells simulated on different processor nodes. Tests show that the parallel algorithm has good scalability, permitting large-scale simulations of cell morphogenesis (10^7 or more cells) and broadening the scope of CPM applications. The new algorithm satisfies the balance condition, which is sufficient for convergence of the underlying Markov chain. ...
icensa.com/content/parallel-implementation-cellular-potts-model-simulation-cell-based-morphogenesis
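The checkerboard subgrid idea can be illustrated generically: partition the lattice into blocks colored like a checkerboard, so that concurrently updated sites (same color, different blocks) are never lattice neighbors and Metropolis updates cannot conflict. A hypothetical Python sketch of the decomposition only, not the CPM dynamics:

```python
def checkerboard_subgrids(width, height, block=2):
    """Partition a lattice into 4 checkerboard subgrids of block x block
    squares. Different blocks of the same colour are separated by at least
    `block` sites, so updating one colour at a time (one block per
    processor) never modifies two neighbouring sites concurrently."""
    subgrids = {0: [], 1: [], 2: [], 3: []}
    for y in range(height):
        for x in range(width):
            colour = 2 * ((y // block) % 2) + (x // block) % 2
            subgrids[colour].append((x, y))
    return subgrids

grids = checkerboard_subgrids(8, 8)
```

A parallel sweep then cycles through the four colours; within a colour, each block can be handed to a different processor, with boundary synchronization needed only between colour phases.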

*  A Framework for RNAV trajectory generation minimizing noise nuisances

This work presents a framework for a global optimization tool that takes into account aircraft dynamics and performance, noise nuisances and RNAV radionavigation requirements in order to assess an optimum flight departure or approach procedure. This strategy could be used as an optimization process performed by the authority in charge of air traffic management at the involved airport, or by an on-board optimization algorithm integrated in the Flight Management and Guidance System (FMGS). In both cases the optimization framework is the same; the differences reside in the specific implementation of the optimization algorithms and the availability of the data in real time. In addition, the aircraft's dynamic equations are developed in order to compute the flight trajectory from a set of flight guidance control variables, and a first glance at a noise optimization criterion is given. Finally, the global optimization problem is properly formulated and the proposed ...
upcommons.upc.edu/handle/2117/553

*  CSE/PSU Theory and Algorithm Group

Finding the length of the longest increasing subsequence (LIS) is a classic algorithmic problem. A small example should explain the problem. In the array 1 9 4 10 6 7, the LIS has length 4 and is 1 4 6 7. Let n denote the size of the input array. Simple textbook solutions achieve an O(n log n) running time, using dynamic programming. What can a sublinear time algorithm say about the LIS? For any constant delta > 0, we construct a polylogarithmic time randomized algorithm that estimates the length of the LIS within an additive error of (delta n). Previously, the best known polylogarithmic time algorithms could only achieve an additive n/2 approximation. Why is this problem challenging? The LIS length is the output of a dynamic program, which means that unless we solve many (read: linearly many) subproblems, we cannot get the exact LIS length. We are looking to get extremely accurate (read: delta n) approximations in *polylogarithmic time*. The algorithm we construct attempts to follow the progress of ...
cse.psu.edu/theory/sem11f.html
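The O(n log n) solution mentioned above is the classic "tails" method with binary search; a short Python sketch using the text's own example:

```python
from bisect import bisect_left

def lis_length(a):
    """O(n log n) longest increasing subsequence length: tails[k] holds the
    smallest possible tail of an increasing subsequence of length k+1.
    The tails array stays sorted, so each element binary-searches its slot."""
    tails = []
    for x in a:
        i = bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)     # x extends the longest subsequence so far
        else:
            tails[i] = x        # x gives a smaller tail for length i+1
    return len(tails)

# the example from the text: the LIS of 1 9 4 10 6 7 is 1 4 6 7
n = lis_length([1, 9, 4, 10, 6, 7])
```

The sublinear-time algorithm discussed in the abstract cannot afford even this single pass; it must estimate the same quantity from polylogarithmically many probes into the array.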

*  Optimization Technique for Maximization Problem in Evolutionary Programming of Genetic Algorithm in Data Mining - TechRepublic

The optimization technique is used for the identification of some best values from the various populations. The Evolutionary algorithm is...
https://techrepublic.com/resource-library/whitepapers/optimization-technique-for-maximization-problem-in-evolutionary-programming-of-genetic-algorithm-in-data-mining/

*  Evaluation of different structural models for target detection in hyperspectral imagery | (2010) | Peña-Ortega | Publications ...

Target detection is an essential component of defense, security and medical applications of hyperspectral imagery. Structured and unstructured models are used to model the variability of spectral signatures in the design of information extraction algorithms. In structured models, spectral variability is modeled using different geometric representations. In linear approaches, the spectral signatures are assumed to be generated by a linear combination of basis vectors. The nature of the basis vectors, and their allowable linear combinations, defines different structural models such as vector subspaces, polyhedral cones, and convex hulls. In this paper, we investigate the use of these models to describe the background of hyperspectral images, and study the performance of target detection algorithms based on these models. We also study the effect of the model order on the performance of target detection algorithms based on these models. Results show that model order ...
spie.org/Publications/Proceedings/Paper/10.1117/12.851743
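The vector-subspace background model can be illustrated with a toy detector: project a pixel's spectrum onto the background subspace and flag pixels whose orthogonal residual is large. A hypothetical sketch with a single basis vector; real detectors use multi-vector subspaces and the other geometric models discussed:

```python
def residual_norm(x, basis):
    """Norm of the component of pixel spectrum x orthogonal to a 1-D
    background subspace spanned by `basis`. In a subspace-model detector,
    a large residual flags a pixel the background model cannot explain
    (a target candidate)."""
    dot_xb = sum(a * b for a, b in zip(x, basis))
    dot_bb = sum(b * b for b in basis)
    proj = [dot_xb / dot_bb * b for b in basis]       # projection onto span(basis)
    return sum((a - p) ** 2 for a, p in zip(x, proj)) ** 0.5

background = [1.0, 2.0, 2.0]    # hypothetical background signature
in_model = [2.0, 4.0, 4.0]      # a scaled background pixel: residual ~ 0
anomaly = [2.0, 0.0, 1.0]       # off-subspace pixel: large residual
```

Raising the model order (more basis vectors) lets the background absorb more variability, shrinking residuals everywhere; the paper's question is where that trade-off stops helping detection.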

*  Feature Selection Using Age Layered Genetic Programming

This research presents the FSALPS (Feature Selection Age Layered Population Structure) evolutionary algorithm. FSALPS performs effective feature subset selection and classification on varied supervised learning tasks. It is a modification of Hornby's ALPS algorithm, a renowned meta-heuristic for overcoming premature convergence in evolutionary algorithms and for improving search in the fitness landscape. FSALPS uses a novel frequency count system to rank features in the GP population based on evolved feature frequencies. The ranked features are translated into probabilities, which are used to control evolutionary processes such as terminal selection for the construction of GP trees/sub-trees. FSALPS continuously refines the feature subset selection process while simultaneously evolving efficient classifiers through a non-converging evolutionary process that favors selection of features with high discrimination of class labels. The research applies FSALPS to an assortment of ...
cosc.brocku.ca/~bross/FSALPS/
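The frequency-to-probability step described above can be sketched generically: normalize evolved feature counts and sample GP terminals with a roulette wheel. The names and counts below are hypothetical, and FSALPS's own machinery differs in detail:

```python
import random

def frequencies_to_probabilities(freq):
    """Normalise evolved feature-frequency counts into selection
    probabilities, as FSALPS does before biasing terminal selection."""
    total = sum(freq.values())
    return {f: c / total for f, c in freq.items()}

def pick_terminal(probs, rng):
    """Roulette-wheel choice of a GP terminal according to the feature
    probabilities, assuming a simple cumulative-probability wheel."""
    r, acc = rng.random(), 0.0
    for feature, p in probs.items():
        acc += p
        if r < acc:
            return feature
    return feature                  # guard against rounding when r ~ 1.0

counts = {'f0': 60, 'f1': 30, 'f2': 10}   # hypothetical evolved frequencies
probs = frequencies_to_probabilities(counts)
```

Frequently useful features thus appear more often as terminals in new trees, which is how the population's own history steers the feature subset without a separate selection pass.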

*  Development of antibiotic treatment algorithms based on local ecology and respiratory surveillance cultures to restrict the use...

Both guidance by SC and the use of ICU-specific empirical schemes that incorporate local microbiology data have been shown to increase appropriate empirical prescription and reduce the use of broad-spectrum antimicrobials as compared to general guidelines [4, 7, 8, 17-19]. Our study is the first to demonstrate the benefit of SC over and above tailoring guidelines to local susceptibility data. We found that incorporating the results of SC (SCBA) in a clinical algorithm (LEBA) to guide the choice of an empirical antibiotic regimen in suspected HAP would allow a reduction in the use of broad-spectrum antimicrobials for equal rates of appropriate coverage. In particular, a 60% decrease in the empirical use of carbapenems would be attained, which is an important achievement in terms of antibiotic stewardship. Similarly, as compared to actually prescribed antibiotics, which were at the discretion of the attending physician with access to SC results but without guidance by a treatment algorithm, ...
https://ccforum.biomedcentral.com/articles/10.1186/cc13990

*  Genetic Algorithms for Models Optimization for Recognition of Translation Initiation Sites - KAUST Repository

This work uses genetic algorithms (GA) to reduce the complexity of artificial neural networks (ANNs) and decision trees (DTs) for the accurate recognition of translation initiation sites (TISs) in Arabidopsis thaliana. The Arabidopsis data was extracted directly from genomic DNA sequences. Methods derived in this work resulted both in reduced complexity of the predictors and in improved prediction accuracy (generalization). Optimization through the use of GA is generally a computationally intensive task. One approach to overcome this problem is to parallelize the code that implements the GA, allowing computation on multiprocessing infrastructure. However, further improvement in the performance of a GA implementation can be achieved through modifications to basic GA operations such as selection, crossover and mutation. In this work we explored two such improvements, namely evolutive mutation and the GA-Simplex crossover operation. In this thesis we studied the benefit ...
repository.kaust.edu.sa/kaust/handle/10754/136690

*  Peer Reviewed Abstracts/Posters - Dr. Ryan J. Urbanowicz

Abstract: One important aspect of epidemiological research is the advancement of machine learning strategies that can detect complex patterns of association between genetic or environmental variants and common disease risk. In particular, methods that can detect, model, and characterize epistatic interactions and heterogeneous patterns of association can offer new insights, in contrast with traditional methods that rely on simplifying assumptions. Data mining methods that can handle heterogeneity are almost non-existent; however, rule-based machine learning algorithms have been demonstrated to possess this ability. Over the last few years, we have been developing a rule-based machine learning algorithm called ExSTraCS, or Extended Supervised Tracking and Classifying System. This adaptive algorithm evolves a population of human-interpretable 'IF:THEN' rules that can break complex and heterogeneous problems into accessible pieces. In previous publications we have adapted rule-based machine ...
ryanurbanowicz.com/index.php/publications/presented-abstracts/
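The flavor of the evolved 'IF:THEN' rules can be sketched with a toy matcher and majority-vote classifier. This is a hypothetical illustration, ignoring the fitness and numerosity weighting a real learning classifier system such as ExSTraCS uses:

```python
def rule_matches(rule, instance):
    """A rule's condition is a dict of attribute -> required value;
    don't-care attributes are simply absent, so matching is a subset test.
    This is the sense in which each rule covers its own niche of a
    heterogeneous problem."""
    return all(instance.get(attr) == val for attr, val in rule['if'].items())

def classify(rules, instance):
    """Vote among all matching rules (simple majority; no weighting)."""
    votes = {}
    for rule in rules:
        if rule_matches(rule, instance):
            votes[rule['then']] = votes.get(rule['then'], 0) + 1
    return max(votes, key=votes.get) if votes else None

# hypothetical rules over SNP genotypes: two distinct 'case' niches
rules = [
    {'if': {'snp1': 1, 'snp2': 0}, 'then': 'case'},
    {'if': {'snp3': 2}, 'then': 'case'},
    {'if': {'snp1': 0}, 'then': 'control'},
]
label = classify(rules, {'snp1': 1, 'snp2': 0, 'snp3': 1})
```

Because different rules can predict the same class from disjoint conditions (the first two rules above), the rule population as a whole can represent heterogeneous associations that a single global model would average away.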

*  Separability legal definition of separability

Definition of separability in the Legal Dictionary - by Free online English dictionary and encyclopedia. What is separability? Meaning of separability as a legal term. What does separability mean in law?
legal-dictionary.thefreedictionary.com/separability

*  Nickle

... encourages (but does not require) the developer to use type declarations in programming and on the command line, to allow static type checking. Nickle backs this up with full run-time type checking. The numeric datatypes within Nickle make it a good choice for the design and implementation of numeric algorithms. Nickle provides three numeric data types: arbitrary-precision integers, arbitrary-precision rationals and unbounded floating-point "reals" with specifiable precision. These datatypes permit computations which would be difficult or impossible using fixed-format numeric datatypes. Along with the basic numeric datatypes, Nickle includes multi-dimensional arrays, strings, structures, tagged unions and pointers. Nickle provides support for parallel computation via a simple thread model, along with ancillary datatypes like semaphores and mutexes. These provide an excellent environment for parallel algorithm development. Nickle also has features inspired by modern languages like ...
nickle.org
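Arbitrary-precision rationals of the kind Nickle provides avoid the drift that fixed-format floats accumulate; Python's stdlib fractions module illustrates the same idea (this is an illustration of the concept, not Nickle code):

```python
from fractions import Fraction

# One tenth summed ten times: binary floating point accumulates rounding
# error, while exact rationals give exactly one.
float_sum = sum(0.1 for _ in range(10))             # not exactly 1.0
exact_sum = sum(Fraction(1, 10) for _ in range(10)) # exactly 1

# arbitrary precision holds through arbitrarily deep computation
big = Fraction(1, 3) ** 50
```

This is the kind of computation the blurb calls "difficult or impossible using fixed format numeric datatypes": every intermediate value stays exact no matter how many operations are chained.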

Clonal Selection Algorithm: In artificial immune systems, clonal selection algorithms are a class of algorithms inspired by the clonal selection theory of acquired immunity, which explains how B and T lymphocytes improve their response to antigens over time, a process called affinity maturation. These algorithms focus on the Darwinian attributes of the theory, where selection is inspired by the affinity of antigen-antibody interactions, reproduction is inspired by cell division, and variation is inspired by somatic hypermutation.Mac OS X Server 1.0Interval boundary element method: The interval boundary element method is the classical boundary element method with interval parameters.
PSI Protein Classifier: PSI Protein Classifier is a program generalizing the results of both successive and independent iterations of the PSI-BLAST program. PSI Protein Classifier determines whether the proteins found by PSI-BLAST belong to known families.Generalizability theory: Generalizability theory, or G Theory, is a statistical framework for conceptualizing, investigating, and designing reliable observations. It is used to determine the reliability (i.Mexican International Conference on Artificial Intelligence: MICAI (short for Mexican International Conference on Artificial Intelligence) is the name of an annual conference covering all areas of Artificial Intelligence (AI), held in Mexico. The first MICAI conference was held in 2000.Inverse probability weighting: Inverse probability weighting is a statistical technique for calculating statistics standardized to a population different from that in which the data was collected. Study designs with a disparate sampling population and population of target inference (target population) are common in application.Assay sensitivity: Assay sensitivity is a property of a clinical trial defined as the ability of a trial to distinguish an effective treatment from a less effective or ineffective intervention. Without assay sensitivity, a trial is not internally valid and is not capable of comparing the efficacy of two interventions.Image fusion: In computer vision, multisensor image fusion is the process of combining relevant information from two or more images into a single image.Protein subcellular localization prediction: Protein subcellular localization prediction (or just protein localization prediction) involves the computational prediction of where a protein resides in a cell.CS-BLASTDemodulation: Demodulation is the act of extracting the original information-bearing signal from a modulated carrier wave.
A demodulator is an electronic circuit (or computer program in a software-defined radio) that is used to recover the information content from the modulated carrier wave.Jigsaw (power tool): A jigsaw power tool is a jigsaw made up of an electric motor and a reciprocating saw blade.Volume rendering: A volume-rendered cadaver head using view-aligned texture mapping and diffuse reflection.DNA sequencer: A DNA sequencer is a scientific instrument used to automate the DNA sequencing process. Given a sample of DNA, a DNA sequencer is used to determine the order of the four bases: G (guanine), C (cytosine), A (adenine) and T (thymine).Vladimir Andreevich Markov: Vladimir Andreevich Markov (May 8, 1871 – January 18, 1897) was a Russian mathematician, known for proving the Markov brothers' inequality with his older brother Andrey Markov. He died of tuberculosis at the age of 25.Lattice protein: Lattice proteins are highly simplified computer models of proteins which are used to investigate protein folding.Human Proteinpedia: Human Proteinpedia is a portal for sharing and integration of human proteomic data.Hyperparameter: In Bayesian statistics, a hyperparameter is a parameter of a prior distribution; the term is used to distinguish them from parameters of the model for the underlying system under analysis.Gene signature: A gene signature is a group of genes in a cell whose combined expression pattern is uniquely characteristic of a biological phenotype.Monte Carlo methods for option pricing: In mathematical finance, a Monte Carlo option model uses Monte Carlo methods. Although the term 'Monte Carlo method' was coined by Stanislaw Ulam in the 1940s, some trace such methods to the 18th century French naturalist Buffon, and a question he asked about the results of dropping a needle randomly on a striped floor or table.
See Buffon's needle.List of molecular graphics systems: This is a list of software systems that are used for visualizing macromolecules.XAP Home Automation protocol: xAP is an open protocol used for home automation and supports integration of telemetry and control devices primarily within the home. Common communications networks include RS232, RS485, Ethernet & wireless.Cellular microarray: A cellular microarray is a laboratory tool that allows for the multiplex interrogation of living cells on the surface of a solid support. The support, sometimes called a "chip", is spotted with varying materials, such as antibodies, proteins, or lipids, which can interact with the cells, leading to their capture on specific spots.Physical neural network: A physical neural network is a type of artificial neural network in which an electrically adjustable resistance material is used to emulate the function of a neural synapse. "Physical" neural network is used to emphasize the reliance on physical hardware used to emulate neurons as opposed to software-based approaches which simulate neural networks.Von Neumann regular ring: In mathematics, a von Neumann regular ring is a ring R such that for every a in R there exists an x in R such that a = axa. To avoid the possible confusion with the regular rings and regular local rings of commutative algebra (which are unrelated notions), von Neumann regular rings are also called absolutely flat rings, because these rings are characterized by the fact that every left module is flat.Immersive technologyLempel–Ziv–Oberhumer: Lempel–Ziv–Oberhumer (LZO) is a lossless data compression algorithm that is focused on decompression speed.Vague setEEGLAB: EEGLAB is a MATLAB toolbox distributed under the free GNU GPL license for processing data from electroencephalography (EEG), magnetoencephalography (MEG), and other electrophysiological signals.
Along with all the basic processing tools, EEGLAB implements independent component analysis (ICA), time/frequency analysis, artifact rejection, and several modes of data visualization.Computer-aided diagnosis: In radiology, computer-aided detection (CADe), also called computer-aided diagnosis (CADx), refers to procedures in medicine that assist doctors in the interpretation of medical images. Imaging techniques in X-ray, MRI, and Ultrasound diagnostics yield a great deal of information, which the radiologist has to analyze and evaluate comprehensively in a short time.Extracellular: In cell biology, molecular biology and related fields, the word extracellular (or sometimes extracellular space) means "outside the cell". This space is usually taken to be outside the plasma membranes, and occupied by fluid.STO-nG basis sets: STO-nG basis sets are minimal basis sets, where n primitive Gaussian orbitals are fitted to a single Slater-type orbital (STO). n originally took the values 2 - 6.Conference and Labs of the Evaluation Forum: The Conference and Labs of the Evaluation Forum (formerly Cross-Language Evaluation Forum), or CLEF, is an organization promoting research in multilingual information access (currently focusing on European languages). Its specific functions are to maintain an underlying framework for testing information retrieval systems and to create repositories of data for researchers to use in developing comparable standards.Decoding methods: In coding theory, decoding is the process of translating received messages into codewords of a given code.
There have been many common methods of mapping messages to codewords.Ontario Genomics Institute: The Ontario Genomics Institute (OGI) is a not-for-profit organization that manages cutting-edge genomics research projects and platforms. OGI also helps scientists find paths to the marketplace for their discoveries and the products to which they lead, and it works through diverse outreach and educational activities to raise awareness and facilitate informed public dialogue about genomics and its social impacts.Internet organizations: This is a list of Internet organizations, or organizations that play or played a key role in the evolution of the Internet by developing recommendations, standards, and technology; deploying infrastructure and services; and addressing other major issues.Recursive partitioning: Recursive partitioning is a statistical method for multivariable analysis. Recursive partitioning creates a decision tree that strives to correctly classify members of the population by splitting it into sub-populations based on several dichotomous independent variables. The process is termed recursive because each sub-population may in turn be split an indefinite number of times until the splitting process terminates after a particular stopping criterion is reached.UnsharpnessAngiographyRDF query language: An RDF query language is a computer language, specifically a query language for databases, able to retrieve and manipulate data stored in Resource Description Framework format.List of transforms: This is a list of transforms in mathematics.Andy HardyProcess mining: Process mining is a process management technique that allows for the analysis of business processes based on event logs.
The basic idea is to extract knowledge from event logs recorded by an information system.Protein–protein interactionReaction coordinateBody area network: A body area network (BAN), also referred to as a wireless body area network (WBAN) or a body sensor network (BSN), is a wireless network of wearable computing devices (see Sana Ullah, Henry Higgins, Bart Braem, Benoit Latre, Chris Blondia, Ingrid Moerman, Shahnaz Saleem, Ziaur Rahman and Kyung Sup Kwak, A Comprehensive Survey of Wireless Body Area Networks: On PHY, MAC, and Network Layers Solutions, Journal of Medical Systems (Springer), 2010).Corinna CortesVisionxList of software development philosophies: This is a list of approaches, styles, and philosophies in software development not included in the category tree of software development philosophies. It also contains software development processes, software development methodologies and single practices, principles and laws.Calculator: An electronic calculator is a small, portable electronic device used to perform both basic operations of arithmetic and complex mathematical operations.Coles PhillipsDoob decomposition theorem: In the theory of stochastic processes in discrete time, a part of the mathematical theory of probability, the Doob decomposition theorem gives a unique decomposition of every adapted and integrable stochastic process as the sum of a martingale and a predictable process (or "drift") starting at zero. The theorem was proved by and is named for Joseph L. Doob.List of sequenced eukaryotic genomesBiological network: A biological network is any network that applies to biological systems.
A network is any system with sub-units that are linked into a whole, such as species units linked into a whole food web.Beta encoder: A beta encoder is an analog to digital conversion (A/D) system in which a real number in the unit interval is represented by a finite representation of a sequence in base beta, with beta being a real number between 1 and 2. Beta encoders are an alternative to traditional approaches to pulse code modulation.Negative probability: The probability of the outcome of an experiment is never negative, but quasiprobability distributions can be defined that allow a negative probability for some events. These distributions may apply to unobservable events or conditional probabilities.Chromosome regionsBranching order of bacterial phyla (Gupta, 2001): There are several models of the branching order of bacterial phyla; one of these was proposed in 2001 by Gupta based on conserved indels in proteins, termed "protein signatures", an alternative approach to molecular phylogeny. Some problematic exceptions and conflicts are present to these conserved indels; however, they are in agreement with several groupings of classes and phyla.HyperintensityTemporal analysis of products: Temporal Analysis of Products (TAP), (TAP-2), (TAP-3) is an experimental technique for studyingSymmetry element: A symmetry element is a point of reference about which symmetry operations can take place. In particular, symmetry elements can be centers of inversion, axes of rotation and mirror planes.Modified Maddrey's discriminant function: The modified Maddrey's discriminant function was originally described by Maddrey and Boitnott to predict prognosis in alcoholic hepatitis. It is calculated by a simple formula:PlanmecaDense artery sign: In medicine, the dense artery sign or hyperdense artery sign is a radiologic sign seen on computer tomography (CT) scans suggestive of early ischemic stroke.
In earlier studies of medical imaging in patients with strokes, it was the earliest sign of ischemic stroke in a significant minority of cases.The Unscrambler: The Unscrambler® X is a commercial software product for multivariate data analysis, used for calibration of multivariate data which is often in the application of analytical data such as near infrared spectroscopy and Raman spectroscopy, and development of predictive models for use in real-time spectroscopic analysis of materials. The software was originally developed in 1986 by Harald MartensHarald Martens, Terje Karstang, Tormod Næs (1987) Improved selectivity in spectroscopy by multivariate calibration Journal of Chemometrics 1(4):201-219 and later by CAMO Software.Nonlinear system: In physics and other sciences, a nonlinear system, in contrast to a linear system, is a system which does not satisfy the superposition principle – meaning that the output of a nonlinear system is not directly proportional to the input.Ideal number: In number theory an ideal number is an algebraic integer which represents an ideal in the ring of integers of a number field; the idea was developed by Ernst Kummer, and led to Richard Dedekind's definition of ideals for rings. An ideal in the ring of integers of an algebraic number field is principal if it consists of multiples of a single element of the ring, and nonprincipal otherwise.Cable fault location: Cable fault location is the process of locating periodic faults, such as insulation faults in underground cables, and is an application of electrical measurement systems. In this process, mobile shock discharge generators are among the devices used.Proteomics Standards Initiative: The Proteomics Standards Initiative (PSI) is a working group of Human Proteome Organization. 
It aims to define data standards for proteomics in order to facilitate data comparison, exchange and verification.RV coefficient: In statistics, the RV coefficientWGAViewer: WGAViewer is a bioinformatics software tool which is designed to visualize, annotate, and help interpret the results generated from a genome wide association study (GWAS). Alongside the P values of association, WGAViewer allows a researcher to visualize and consider other supporting evidence, such as the genomic context of the SNP, linkage disequilibrium (LD) with ungenotyped SNPs, gene expression database, and the evidence from other GWAS projects, when determining the potential importance of an individual SNP.Protein primary structure: The primary structure of a peptide or protein is the linear sequence of its amino acid structural units, and partly comprises its overall biomolecular structure. By convention, the primary structure of a protein is reported starting from the amino-terminal (N) end to the carboxyl-terminal (C) end.Acknowledgement (data networks): In data networking, an acknowledgement (or acknowledgment) is a signal passed between communicating processes or computers to signify acknowledgement, or receipt of response, as part of a communications protocol. For instance, ACK packets are used in the Transmission Control Protocol (TCP) to acknowledge the receipt of SYN packets when establishing a connection, data packets while a connection is being used, and FIN packets when terminating a connection.Dragomir R. Radev: Dragomir R. Radev is a University of Michigan computer science professor and Columbia University computer science adjunct professor working on natural language processing and information retrieval.Electrical impedance tomography: Electrical Impedance Tomography (EIT) is a non-invasive medical imaging technique in which an image of the conductivity or permittivity of part of the body is inferred from surface electrode measurements. 
Electrical conductivity depends on free ion content and differs considerably between various biological tissues (absolute EIT) or different functional states of one and the same tissue or organ (relative or functional EIT).Plant Proteome Database: The Plant Proteome Database is a National Science Foundation-funded project to determine the biological function of each protein in plants.Sun Q, Zybailov B, Majeran W, Friso G, Olinares PD, van Wijk KJ.
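The Doob decomposition mentioned in the entry above can be stated concretely. For an adapted, integrable discrete-time process (X_n) with filtration (F_n), the decomposition is

```latex
X_n = M_n + A_n, \qquad
A_n = \sum_{k=1}^{n} \mathbb{E}\!\left[\, X_k - X_{k-1} \mid \mathcal{F}_{k-1} \right], \qquad A_0 = 0,
```

where A is the predictable "drift" part starting at zero and M_n = X_n − A_n is a martingale; uniqueness follows because a predictable martingale starting at zero is identically zero.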

(1/42270) An effective approach for analyzing "prefinished" genomic sequence data.

Ongoing efforts to sequence the human genome are already generating large amounts of data, with substantial increases anticipated over the next few years. In most cases, a shotgun sequencing strategy is being used, which rapidly yields most of the primary sequence in incompletely assembled sequence contigs ("prefinished" sequence) and more slowly produces the final, completely assembled sequence ("finished" sequence). Thus, in general, prefinished sequence is produced in excess of finished sequence, and this trend is certain to continue and even accelerate over the next few years. Even at a prefinished stage, genomic sequence represents a rich source of important biological information that is of great interest to many investigators. However, analyzing such data is a challenging and daunting task, both because of its sheer volume and because it can change on a day-by-day basis. To facilitate the discovery and characterization of genes and other important elements within prefinished sequence, we have developed an analytical strategy and system that uses readily available software tools in new combinations. Implementation of this strategy for the analysis of prefinished sequence data from human chromosome 7 has demonstrated that this is a convenient, inexpensive, and extensible solution to the problem of analyzing the large amounts of preliminary data being produced by large-scale sequencing efforts. Our approach is accessible to any investigator who wishes to assimilate additional information about particular sequence data en route to developing richer annotations of a finished sequence.

(2/42270) A computational screen for methylation guide snoRNAs in yeast.

Small nucleolar RNAs (snoRNAs) are required for ribose 2'-O-methylation of eukaryotic ribosomal RNA. Many of the genes for this snoRNA family have remained unidentified in Saccharomyces cerevisiae, despite the availability of a complete genome sequence. Probabilistic modeling methods akin to those used in speech recognition and computational linguistics were used to computationally screen the yeast genome and identify 22 methylation guide snoRNAs, snR50 to snR71. Gene disruptions and other experimental characterization confirmed their methylation guide function. In total, 51 of the 55 ribose methylated sites in yeast ribosomal RNA were assigned to 41 different guide snoRNAs.
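The screen itself used trained probabilistic models of snoRNA features; as a much simpler illustration of the same idea, the sketch below scans a toy sequence with a zeroth-order log-odds score. The model probabilities, window width, threshold, and sequence are all invented for illustration and are not taken from the study.

```python
import math

# Toy 0th-order models: uniform background vs. a hypothetical G-biased
# "signal" composition. Probabilities are illustrative, not fitted.
background = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}
signal     = {"A": 0.15, "C": 0.20, "G": 0.45, "T": 0.20}

def log_odds(window):
    """Log-likelihood ratio of a window under signal vs. background."""
    return sum(math.log(signal[b] / background[b]) for b in window)

def scan(genome, width=8, threshold=2.0):
    """Return (position, score) for windows scoring above threshold."""
    hits = []
    for i in range(len(genome) - width + 1):
        s = log_odds(genome[i:i + width])
        if s >= threshold:
            hits.append((i, s))
    return hits

genome = "ATATATGGGGGGGGATAT"
print(scan(genome))
```

A real screen would score candidates under a structured model (e.g. covariance or hidden Markov models) rather than base composition, but the "scan and keep high log-odds windows" shape is the same.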

(3/42270) Referenceless interleaved echo-planar imaging.

Interleaved echo-planar imaging (EPI) is an ultrafast imaging technique important for applications that require high time resolution or short total acquisition times. Unfortunately, EPI is prone to significant ghosting artifacts, resulting primarily from system time delays that cause data matrix misregistration. In this work, it is shown mathematically and experimentally that system time delays are orientation dependent, resulting from anisotropic physical gradient delays. This analysis characterizes the behavior of time delays in oblique coordinates, and a new ghosting artifact caused by anisotropic delays is described. "Compensation blips" are proposed for time delay correction. These blips are shown to remove the effects of anisotropic gradient delays, eliminating the need for repeated reference scans and postprocessing corrections. Examples of phantom and in vivo images are shown.
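In the paper the correction is applied in hardware via compensation blips; purely as an illustration of why a constant acquisition delay is correctable at all, the sketch below shows that a delay corresponds to a linear phase ramp across (1D) k-space, which an opposite ramp removes exactly. The signal values and delay are made up, and real EPI correction must handle the alternating readout direction, which this sketch ignores.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for tiny examples)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def apply_delay(X, tau):
    """A readout time delay of tau samples multiplies k-space sample k
    by the linear phase ramp exp(-2*pi*i*k*tau/N)."""
    N = len(X)
    return [X[k] * cmath.exp(-2j * cmath.pi * k * tau / N) for k in range(N)]

x = [0, 0, 1, 2, 3, 2, 1, 0]            # toy 1D "profile"
X = dft(x)
X_delayed = apply_delay(X, 0.37)        # mis-timed acquisition
X_fixed = apply_delay(X_delayed, -0.37) # compensation: the opposite ramp
x_fixed = idft(X_fixed)
```

Because the two phase ramps multiply to one, `x_fixed` recovers the original profile to floating-point precision.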

(4/42270) An evaluation of elongation factor 1 alpha as a phylogenetic marker for eukaryotes.

Elongation factor 1 alpha (EF-1 alpha) is a highly conserved ubiquitous protein involved in translation that has been suggested to have desirable properties for phylogenetic inference. To examine the utility of EF-1 alpha as a phylogenetic marker for eukaryotes, we studied three properties of EF-1 alpha trees: congruency with other phylogenetic markers, the impact of species sampling, and the degree of substitutional saturation occurring between taxa. Our analyses indicate that the EF-1 alpha tree is congruent with some other molecular phylogenies in identifying both the deepest branches and some recent relationships in the eukaryotic line of descent. However, the topology of the intermediate portion of the EF-1 alpha tree, occupied by most of the protist lineages, differs for different phylogenetic methods, and bootstrap values for branches are low. Most problematic in this region is the failure of all phylogenetic methods to resolve the monophyly of two higher-order protistan taxa, the Ciliophora and the Alveolata. JACKMONO analyses indicated that the impact of species sampling on bootstrap support for most internal nodes of the eukaryotic EF-1 alpha tree is extreme. Furthermore, a comparison of observed versus inferred numbers of substitutions indicates that multiple overlapping substitutions have occurred, especially on the branch separating the Eukaryota from the Archaebacteria, suggesting that the rooting of the eukaryotic tree on the diplomonad lineage should be treated with caution. Overall, these results suggest that the phylogenies obtained from EF-1 alpha are congruent with other molecular phylogenies in recovering the monophyly of groups such as the Metazoa, Fungi, Magnoliophyta, and Euglenozoa. However, the interrelationships between these and other protist lineages are not well resolved.
This lack of resolution may result from the combined effects of poor taxonomic sampling, relatively few informative positions, large numbers of overlapping substitutions that obscure phylogenetic signal, and lineage-specific rate increases in the EF-1 alpha data set. It is also consistent with the nearly simultaneous diversification of major eukaryotic lineages implied by the "big-bang" hypothesis of eukaryote evolution.
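The saturation problem raised above (observed differences undercounting actual substitutions because later changes overwrite earlier ones) can be illustrated with the standard Jukes-Cantor correction. This is a generic textbook formula, not the specific method used in the study.

```python
import math

def jc69(p):
    """Jukes-Cantor estimate of substitutions per site from the observed
    proportion of differing sites p (only defined for p < 0.75)."""
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

# As p grows, the inferred number of substitutions diverges from p:
# overlapping substitutions hide more and more of the true signal.
for p in (0.05, 0.30, 0.60, 0.70):
    print(f"observed p = {p:.2f} -> inferred d = {jc69(p):.3f}")
```

At small divergences the correction barely matters, but near saturation the inferred distance is several times the observed one, which is why deep branches estimated from saturated data deserve caution.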

(5/42270) Hierarchical cluster analysis applied to workers' exposures in fiberglass insulation manufacturing.

The objectives of this study were to explore the application of cluster analysis to the characterization of multiple exposures in industrial hygiene practice and to compare exposure groupings based on the results of cluster analysis with those based on non-measurement-based approaches commonly used in epidemiology. Cluster analysis was performed for 37 workers simultaneously exposed to three agents (endotoxin, phenolic compounds and formaldehyde) in fiberglass insulation manufacturing. Different clustering algorithms, including complete-linkage (or farthest-neighbor), single-linkage (or nearest-neighbor), group-average and model-based clustering approaches, were used to construct the tree structures from which clusters can be formed. Differences were observed between the exposure clusters constructed by these different clustering algorithms. When contrasting the exposure classification based on tree structures with that based on non-measurement-based information, the results indicate that the exposure clusters identified from the tree structures had little in common with the classification results from either the traditional exposure zone or the work group classification approach. In terms of defining homogeneous exposure groups, or from the standpoint of health risk, some toxicological normalization in the components of the exposure vector appears to be required in order to form meaningful exposure groupings from cluster analysis. Finally, it remains important to see if the lack of correspondence between exposure groups based on epidemiological classification and measurement data is a peculiarity of the data or a more general problem in multivariate exposure analysis.
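As a sketch of the complete-linkage (farthest-neighbor) approach named above, the following pure-Python example agglomerates four hypothetical workers described by standardized three-agent exposure vectors. The data values are invented; the study's own vectors and normalization are not reproduced here.

```python
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def complete_linkage(points, k):
    """Agglomerative clustering with the farthest-neighbor (complete-
    linkage) merge criterion; stops when k clusters remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # complete linkage: cluster distance is the LARGEST
                # pairwise distance between members
                d = max(euclid(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]   # merge the closest pair of clusters
        del clusters[j]
    return clusters

# Hypothetical standardized exposures: (endotoxin, phenolics, formaldehyde)
workers = [(0.1, 0.2, 0.1), (0.2, 0.1, 0.2),   # low-exposure pair
           (1.9, 2.1, 1.8), (2.0, 2.0, 2.1)]   # high-exposure pair
print(complete_linkage(workers, 2))
```

Swapping `max` for `min` in the cluster-distance line turns this into single linkage (nearest neighbor), which is one way the different dendrograms compared in the study arise from the same data.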

(6/42270) A new filtering algorithm for medical magnetic resonance and computed tomography images.

Inner views of tubular structures based on computed tomography (CT) and magnetic resonance (MR) data sets may be created by virtual endoscopy. After a preliminary segmentation procedure for selecting the organ to be represented, virtual endoscopy is a new postprocessing technique using surface or volume rendering of the data sets. In the case of surface rendering, the segmentation is based on a grey level thresholding technique. To avoid artifacts owing to the noise created in the imaging process, and to restore spurious resolution degradations, a robust Wiener filter was applied. This filter, working in Fourier space, approximates the noise spectrum by a simple function that is proportional to the square root of the signal amplitude. Thus, only points with tiny amplitudes consisting mostly of noise are suppressed. Further artifacts are avoided by the correct selection of the threshold range. Afterwards, the lumen and the inner walls of the tubular structures are well represented and allow one to distinguish between harmless fluctuations and medically significant structures.
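The effect of modeling the noise amplitude as proportional to the square root of the signal amplitude can be sketched directly: with N² = c²·S, the Wiener gain S²/(S² + N²) reduces to S/(S + c²), which leaves large amplitudes nearly untouched and suppresses only tiny ones, matching the behavior described above. The constant c here is arbitrary, and the actual filter operates on the 2D Fourier transform of the images rather than on a bare amplitude list.

```python
def wiener_gain(s, c=1.0):
    """Wiener-style gain with the noise amplitude modeled as c*sqrt(s):
    gain = s^2 / (s^2 + c^2 * s) = s / (s + c^2)."""
    return s / (s + c * c)

# Large Fourier amplitudes pass almost unchanged; tiny, noise-dominated
# amplitudes are driven toward zero.
for s in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f"amplitude {s:7.2f} -> gain {wiener_gain(s):.4f}")
```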

(7/42270) Efficacy of ampicillin plus ceftriaxone in treatment of experimental endocarditis due to Enterococcus faecalis strains highly resistant to aminoglycosides.

The purpose of this work was to evaluate the in vitro possibilities of ampicillin-ceftriaxone combinations for 10 Enterococcus faecalis strains with high-level resistance to aminoglycosides (HLRAg) and to assess the efficacy of ampicillin plus ceftriaxone, both administered with humanlike pharmacokinetics, for the treatment of experimental endocarditis due to HLRAg E. faecalis. A reduction of 1 to 4 dilutions in MICs of ampicillin was obtained when ampicillin was combined with a fixed subinhibitory ceftriaxone concentration of 4 micrograms/ml. This potentiating effect was also observed by the double disk method with all 10 strains. Time-kill studies performed with 1 and 2 micrograms of ampicillin alone per ml or in combination with 5, 10, 20, 40, and 60 micrograms of ceftriaxone per ml showed a > or = 2 log10 reduction in CFU per milliliter with respect to ampicillin alone and to the initial inoculum for all 10 E. faecalis strains studied. This effect was obtained for seven strains with the combination of 2 micrograms of ampicillin per ml plus 10 micrograms of ceftriaxone per ml and for six strains with 5 micrograms of ceftriaxone per ml. Animals with catheter-induced endocarditis were infected intravenously with 10(8) CFU of E. faecalis V48 or 10(5) CFU of E. faecalis V45 and were treated for 3 days with humanlike pharmacokinetics of 2 g of ampicillin every 4 h, alone or combined with 2 g of ceftriaxone every 12 h. The levels in serum and the pharmacokinetic parameters of the humanlike pharmacokinetics of ampicillin or ceftriaxone in rabbits were similar to those found in humans treated with 2 g of ampicillin or ceftriaxone intravenously. Results of the therapy for experimental endocarditis caused by E. faecalis V48 or V45 showed that the residual bacterial titers in aortic valve vegetations were significantly lower in the animals treated with the combinations of ampicillin plus ceftriaxone than in those treated with ampicillin alone (P < 0.001). 
The combination of ampicillin and ceftriaxone showed in vitro and in vivo synergism against HLRAg E. faecalis.

(8/42270) The muscle chloride channel ClC-1 has a double-barreled appearance that is differentially affected in dominant and recessive myotonia.

Single-channel recordings of the currents mediated by the muscle Cl- channel, ClC-1, expressed in Xenopus oocytes, provide the first direct evidence that this channel has two equidistant open conductance levels like the Torpedo ClC-0 prototype. As for the case of ClC-0, the probabilities and dwell times of the closed and conducting states are consistent with the presence of two independently gated pathways with approximately 1.2 pS conductance enabled in parallel via a common gate. However, the voltage dependence of the common gate is different and the kinetics are much faster than for ClC-0. Estimates of single-channel parameters from the analysis of macroscopic current fluctuations agree with those from single-channel recordings. Fluctuation analysis was used to characterize changes in the apparent double-gate behavior of the ClC-1 mutations I290M and I556N causing, respectively, a dominant and a recessive form of myotonia. We find that both mutations reduce about equally the open probability of single protopores and that mutation I290M yields a stronger reduction of the common gate open probability than mutation I556N. Our results suggest that the mammalian ClC-homologues have the same structure and mechanism proposed for the Torpedo channel ClC-0. Differential effects on the two gates that appear to modulate the activation of ClC-1 channels may be important determinants for the different patterns of inheritance of dominant and recessive ClC-1 mutations.
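Fluctuation (noise) analysis of the kind used above rests on a simple identity: for N independent two-state channels with unitary current i and open probability p, the macroscopic mean is I = N·i·p and the variance is sigma² = N·i²·p·(1−p), equivalently sigma² = i·I − I²/N. The sketch below checks that identity with hypothetical values (the channel count and unitary current are invented, not the ClC-1 estimates).

```python
def macroscopic_stats(N, i, p):
    """Mean current and variance for N independent two-state channels
    with single-channel current i and open probability p."""
    mean = N * i * p
    var = N * i * i * p * (1.0 - p)
    return mean, var

N, i = 400, 1.2  # hypothetical channel count and unitary current (pA)
for p in (0.1, 0.5, 0.9):
    mean, var = macroscopic_stats(N, i, p)
    # parabolic relation exploited in noise analysis: var = i*mean - mean^2/N
    assert abs(var - (i * mean - mean * mean / N)) < 1e-9
    print(f"p = {p}: mean = {mean:.1f} pA, variance = {var:.1f} pA^2")
```

Fitting that parabola of variance against mean over a range of open probabilities is what lets i and N be estimated from macroscopic recordings alone.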



implementations


  • Multicore SWARM (Software and Algorithms for Running on Multicore Processors) is an open source library for developing efficient and portable implementations that make use of multi-core processors. (sourceforge.net)

implements


  • NOMAD is a C++ code that implements the MADS algorithm (Mesh Adaptive Direct Search) for difficult blackbox optimization problems. (sourceforge.net)
  • This Project implements the mergesort algorithm in an MPICH2 (parallel programming) environment. (sourceforge.net)
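Stripped of the MPI distribution, the algorithm that project parallelizes is plain mergesort. A sequential sketch (in an MPICH2 setting, each rank would sort its slice of the data before the final merges):

```python
def mergesort(xs):
    """Classic top-down mergesort: split, sort each half, merge."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]   # append whichever half remains

print(mergesort([5, 2, 9, 1, 5, 6]))
```

Mergesort parallelizes naturally because the two recursive halves are independent; only the merge step needs communication between processes.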

solve


  • Software to solve Linear Programming problems by applying the Revised Simplex Algorithm (2-Phase Method) and performing a Sensitivity Analysis. (sourceforge.net)
  • Pollard's rho algorithm for logarithms is an algorithm introduced by John Pollard in 1978 to solve the discrete logarithm problem, analogous to Pollard's rho algorithm to solve the integer factorization problem. (wikipedia.org)
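A compact sketch of Pollard's rho for logarithms, using Floyd cycle detection and the usual three-way partition of the group. The restart policy and the candidate enumeration for non-coprime collisions are simplifications for illustration, not Pollard's original presentation.

```python
import math

def pollard_rho_log(g, h, p, n):
    """Find x with g^x = h (mod p), where g has order n in Z_p^*."""
    def step(x, a, b):
        # classic partition of the group into three sets by residue
        s = x % 3
        if s == 0:
            return (x * x) % p, (2 * a) % n, (2 * b) % n
        if s == 1:
            return (x * g) % p, (a + 1) % n, b
        return (x * h) % p, a, (b + 1) % n

    for a0 in range(1, n):                 # restart if a collision is useless
        x, a, b = pow(g, a0, p), a0, 0
        X, A, B = x, a, b
        while True:                        # Floyd: tortoise and hare
            x, a, b = step(x, a, b)
            X, A, B = step(*step(X, A, B))
            if x == X:
                break
        # collision: g^a * h^b = g^A * h^B, so a + x*b = A + x*B (mod n)
        r = (B - b) % n
        u = (a - A) % n
        d = math.gcd(r, n)
        if r == 0 or u % d != 0:
            continue
        m = n // d
        base = (u // d) * pow((r // d) % m, -1, m) % m
        for k in range(d):                 # d candidate solutions mod n
            cand = (base + k * m) % n
            if pow(g, cand, p) == h:       # always verify before returning
                return cand
    return None

# 2 is a primitive root mod 1019, so its order is n = 1018
print(pollard_rho_log(2, 5, 1019, 1018))
```

The rho method needs only constant memory and roughly sqrt(n) group operations, which is why it displaced brute force for moderate-size discrete logarithm instances.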

Simulator


  • An easy-to-extend, highly graphical, easy-to-use 2D robot simulator specialized for path-planning algorithms. (sourceforge.net)
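A minimal example of the kind of planner such a simulator hosts: breadth-first search on a 4-connected occupancy grid. The grid and endpoints are invented, and real planners in such tools are usually richer (A*, RRT, etc.).

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest path on a 4-connected grid; '#' cells are obstacles.
    Returns the list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}                 # also serves as the visited set
    q = deque([start])
    while q:
        cell = q.popleft()
        if cell == goal:                 # reconstruct path by backtracking
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "#" and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                q.append((nr, nc))
    return None                          # goal unreachable

grid = ["....",
        ".##.",
        "....",
        "...."]
print(bfs_path(grid, (0, 0), (2, 3)))
```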

Software


  • In this project we develop software that uses an evolutionary algorithm to evolve a solution that can drive a car along an arbitrary path. (sourceforge.net)
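The project's real fitness function (distance driven along the path) needs the simulator, but the skeleton of the evolutionary loop itself (selection, crossover, mutation, elitism) can be shown with a stand-in OneMax fitness; everything below is a generic sketch, not the project's code.

```python
import random

random.seed(42)

def fitness(bits):
    """Toy stand-in for 'distance driven': just counts 1-bits."""
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=40, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        next_gen = pop[:2]                      # elitism: keep the two best
        while len(next_gen) < pop_size:
            # tournament selection of two parents
            p1 = max(random.sample(pop, 3), key=fitness)
            p2 = max(random.sample(pop, 3), key=fitness)
            cut = random.randrange(1, n_bits)   # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ 1 if random.random() < mutation_rate else b
                     for b in child]            # bit-flip mutation
            next_gen.append(child)
        pop = next_gen
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Swapping `fitness` for a simulator call that returns distance driven turns this generic loop into the car-driving setup the project describes.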

uses


  • Like most Business Rule Engines (BRE) it uses the Rete algorithm. (sourceforge.net)
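Rete's contribution is caching partial matches in a network of nodes so that facts are not re-tested against every rule on every cycle. A naive forward-chaining engine that reaches the same fixpoint makes the baseline it improves on concrete; the rules and facts here are invented.

```python
# Rules: (frozenset of premises) -> conclusion (hypothetical examples)
rules = [
    (frozenset({"customer", "order>100"}), "discount"),
    (frozenset({"discount", "loyal"}), "free-shipping"),
]

def forward_chain(facts, rules):
    """Naive forward chaining to a fixpoint. A Rete network draws the
    same conclusions but avoids rescanning all rules on each pass by
    propagating only the changed facts through a match network."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(sorted(forward_chain({"customer", "order>100", "loyal"}, rules)))
```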

Search


  • Under the additional assumption that matches between segments are transitive, we further improve the running time for finding the optimal solution by restricting the search space of the dynamic programming algorithm. (computer.org)

solution


  • Application to test a genetic algorithm (GA) solution for the Knapsack problem; it will compare the GA solution of the Knapsack problem to a greedy algorithm. (sourceforge.net)
  • We present several techniques for making the dynamic programming algorithm more efficient, while still finding an optimal solution under these restrictions. (computer.org)
  • Empirical study shows that, taken together, these observations lead to an improved running time over the basic dynamic programming algorithm by 4 to 12 orders of magnitude, while still obtaining an optimal solution. (computer.org)
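The greedy-versus-optimal gap that a GA comparison would probe can be shown directly, with exact dynamic programming standing in as the reference optimum. This is a classic textbook knapsack instance, not data from any of the projects above.

```python
def knapsack_dp(items, capacity):
    """Exact 0/1 knapsack via dynamic programming over capacities."""
    best = [0] * (capacity + 1)
    for value, weight in items:
        # iterate capacities downward so each item is used at most once
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

def knapsack_greedy(items, capacity):
    """Greedy by value/weight ratio: fast, but not always optimal."""
    total = 0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1],
                                reverse=True):
        if weight <= capacity:
            capacity -= weight
            total += value
    return total

# (value, weight) pairs chosen so the greedy heuristic is suboptimal
items = [(60, 10), (100, 20), (120, 30)]
print(knapsack_greedy(items, 50), knapsack_dp(items, 50))
```

Here greedy commits to the best-ratio item first and ends up with 160, while the exact optimum is 220; a well-tuned GA would typically land between the two, which is exactly the comparison the application above sets up.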

Project


  • This project aims to implement several network fault diagnosis algorithms. (sourceforge.net)

Images


  • A family of algorithms that implement operations on compressed digital images is described. (computer.org)

Library


  • The Numerical Learning Library intends to provide a wide range of machine learning algorithms. (sourceforge.net)
  • Library of terrain level of detail algorithm nodes for Coin (an implementation of an Open Inventor scene graph) and other utility classes. (sourceforge.net)
  • BlueDS is a library that contains the implementation of frequently used data structures and algorithms in Computer Science. (sourceforge.net)