Experimental constraints on fluid-rock reactions during incipient serpentinization of harzburgite
Klein, Frieder
2014-10-20
Search results
1,869 records were found.
We describe a novel application of a stochastic name-passing calculus to the study of biomolecular systems. We specify the structure and dynamics of biochemical networks in a variant of the stochastic π-calculus, yielding a model which is mathematically well-defined and biologically faithful. We adapt the operational semantics of the calculus to account for both the time and probability of biochemical reactions, and present a computer implementation of the calculus for biochemical simulations.
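The timed, probabilistic reaction semantics described in this abstract can be sketched as a Gillespie-style stochastic simulation (a minimal stand-in assuming mass-action propensities; the reaction set and rate constant below are illustrative, not taken from the paper):

```python
import random

def gillespie(counts, reactions, t_end, seed=0):
    """Minimal Gillespie-style stochastic simulation.

    Each reaction is (rate_constant, reactants, products); its propensity is
    the rate constant times the product of reactant counts (mass-action
    assumption, matching the exponential race semantics of timed channels).
    """
    rng = random.Random(seed)
    counts = dict(counts)
    t = 0.0
    while t < t_end:
        # propensity of every reaction in the current state
        props = []
        for rate, reactants, _ in reactions:
            a = rate
            for s in reactants:
                a *= counts[s]
            props.append(a)
        total = sum(props)
        if total == 0:
            break                        # no reaction can fire any more
        t += rng.expovariate(total)      # exponential waiting time
        r = rng.uniform(0.0, total)      # choose a reaction ~ propensity
        for (rate, reactants, products), a in zip(reactions, props):
            if r < a:
                for s in reactants:
                    counts[s] -= 1
                for s in products:
                    counts[s] += 1
                break
            r -= a
    return counts

# Hypothetical binding reaction A + B -> C with rate constant 0.01
final = gillespie({"A": 100, "B": 100, "C": 0},
                  [(0.01, ("A", "B"), ("C",))], t_end=100.0)
```

Because each firing consumes one A and one B and produces one C, the totals A + C and B + C are conserved, which gives a quick sanity check on any run.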
In the last ten years, knowledge management (KM) has become a newly fashionable managerial practice. Though KM theories seem to benefit from a "contamination" with the cognitive and social sciences, which emphasize a subjective, contextual, and distributed approach to knowledge representation and integration, current technologies support what we may call a "god's eye" paradigm, in which knowledge is viewed as an objective resource. In this paper we discuss artificial intelligence theories and technologies that can support a shift to a new paradigm, called the "distributed intelligence" paradigm, in designing KM systems. Using the evolution of KM systems within Arthur Andersen Consulting as a motivating case study, we propose the framework of MultiContext Systems as a specification language for distributed intelligence KM systems, and sketch...
This paper presents a preliminary investigation of the relationship between the process of functional division of labour and the modes in which activities and plans are coordinated. We consider a very simple production process: a given heap of bank-notes has to be counted by a group of accountants. Because of limited individual capabilities and/or the possibility of mistakes and external disturbances, the task has to be divided among several accountants, and a hierarchical coordination problem arises. We can imagine several different ways of socially implementing coordination of divided tasks: 1) a central planner can compute the optimal architecture of the system; 2) a central planner can promote quantity adjustments by moving accountants from hierarchical levels where there exist idle resources to levels where resources are insuffic...
In this paper, we propose a system for monitoring abnormal NO2 emissions in the troposphere by using remote-sensing sensors. In particular, the system aims at estimating the amount of NO2 resulting from biomass burning by exploiting the synergies between the GOME and the ATSR-2 sensors mounted on board the ERS-2 satellite. Two different approaches to the estimation of NO2 are proposed: the former, which is the simpler one, assumes a linear relationship between the GOME and ATSR-2 measurements and the NO2 concentration; the latter exploits a nonlinear and nonparametric method based on a radial basis function (RBF) neural network. The architecture of such a network is defined in order to retrieve the values of NO2 concentration on the basis of the GOME and ATSR-2 measurements, as well as of other ancillary input parameters. Experimental ...
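The nonlinear retrieval approach can be sketched as a one-dimensional RBF regression trained by least-mean-squares updates (a toy sketch on synthetic data, fitting y = x²; the centers, width, and learning rate are assumptions, not the retrieval network or data of the paper):

```python
import math

def rbf_features(x, centers, width):
    """Gaussian basis-function activations for a scalar input x."""
    return [math.exp(-((x - c) ** 2) / (2 * width ** 2)) for c in centers]

def train_rbf(xs, ys, centers, width, lr=0.3, epochs=2000):
    """Fit the output weights with stochastic gradient (LMS) updates."""
    w = [0.0] * len(centers)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            phi = rbf_features(x, centers, width)
            err = sum(wi * pi for wi, pi in zip(w, phi)) - y
            w = [wi - lr * err * pi for wi, pi in zip(w, phi)]
    return w

def predict(w, x, centers, width):
    phi = rbf_features(x, centers, width)
    return sum(wi * pi for wi, pi in zip(w, phi))

# Synthetic stand-in for the retrieval task: learn y = x^2 on [0, 1]
xs = [i / 8 for i in range(9)]
ys = [x * x for x in xs]
centers = [0.0, 0.25, 0.5, 0.75, 1.0]   # assumed fixed centers
width = 0.3                             # assumed common width
w = train_rbf(xs, ys, centers, width)
pred = predict(w, 0.5, centers, width)
```

In a real retrieval the input vector would combine the GOME and ATSR-2 measurements plus ancillary parameters, and the centers would typically be chosen from the training data rather than fixed on a grid.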
The median problem transforms a set of $N$ numbers in such a way that none of the first $\frac{N}{2}$ numbers exceeds any of the last $\frac{N}{2}$ numbers. A comparator network that solves the median problem on a set of $r$ numbers is commonly called an $r$-{\em classifier}. This paper shows how Leighton's well-known Columnsort algorithm can be modified to solve the median problem on $N=rs$ numbers, with $1 \le s \le r$, using an $r$-classifier instead of an $r$-sorting network. Overall the $r$-classifier is used $O(s)$ times, namely the same number of times that Columnsort applies an $r$-sorter. A hardware implementation is proposed that runs in optimal $O(s + \log r)$ time and performs $O(r\log r(s + \log r))$ work. The implementation shows that when $N= r\log r$ there is a classifier network solving the median problem on $N$ numb...
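The classifier specification above (no element of the first half exceeds any element of the second half) can be pinned down in a few lines. This is a software stand-in that meets the specification by fully sorting, which is deliberately stronger than needed; the point of an $r$-classifier network is to achieve the same partition obliviously without a full sort:

```python
def classify(xs):
    """Split xs into (low, high) halves so that max(low) <= min(high).

    Sketch only: sorting trivially satisfies the classifier property,
    whereas a comparator-network classifier avoids the full sort.
    """
    n = len(xs)
    if n % 2:
        raise ValueError("even length required")
    s = sorted(xs)
    return s[: n // 2], s[n // 2:]

def is_classified(low, high):
    """Check the median-problem property for a candidate partition."""
    return max(low) <= min(high)

low, high = classify([7, 1, 5, 3, 9, 2, 8, 4])
```

`is_classified` is the useful piece here: it is exactly the predicate a modified Columnsort run with an $r$-classifier must establish on each column, without requiring the halves themselves to be sorted.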
We use a structural operational semantics which guides us in inferring quantitative measures of system evolution. The transitions of the system are labelled, and we assign rates to them by looking only at these labels. The rates reflect the possibly distributed architecture on which applications run. We then map transition systems to Markov chains, and performance evaluation is carried out using standard tools. As a working example, we compare the performance of a conventional uniprocessor with a prefetch-pipeline machine. We also consider two case studies from the literature involving mobile computation to show that our framework is feasible.
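The label-to-rate mapping and the translation of a labelled transition system into a continuous-time Markov chain can be sketched as follows (the two-state system, labels, and rate values are illustrative assumptions, not the paper's case studies):

```python
def rate_of(label, rates):
    """Assign a rate by inspecting the transition label only
    (the label is assumed to encode where the action executes,
    e.g. locally vs. across the distributed architecture)."""
    return rates[label]

def generator(states, transitions, rates):
    """Build the CTMC generator matrix Q from labelled transitions:
    off-diagonal entry Q[i][j] accumulates the rates of i -> j
    transitions; diagonal entries make each row sum to zero."""
    idx = {s: i for i, s in enumerate(states)}
    n = len(states)
    Q = [[0.0] * n for _ in range(n)]
    for src, label, dst in transitions:
        r = rate_of(label, rates)
        Q[idx[src]][idx[dst]] += r
        Q[idx[src]][idx[src]] -= r
    return Q

# Hypothetical two-state server: requests arrive, then are served
states = ["idle", "busy"]
transitions = [("idle", "request", "busy"),
               ("busy", "serve", "idle")]
Q = generator(states, transitions, {"request": 2.0, "serve": 3.0})

# For a two-state chain the stationary distribution follows directly
# from flow balance: pi_idle * 2.0 == pi_busy * 3.0
pi = (3.0 / 5.0, 2.0 / 5.0)
```

Once Q is in hand, standard tools take over: steady-state throughput and utilization follow from the stationary distribution, which is how the uniprocessor and prefetch-pipeline configurations would be compared.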
