# The table algebra of infinite tables is considered.
# The signature of the table algebra of infinite tables is extended with outer set operations.
# A formal mathematical semantics of these operations is defined.
# Key words: relational databases, table algebra of infinite tables, outer union, outer difference, outer intersection.
# The purpose of this work is to develop a software tool that automates the analysis
# and simplifies the understanding of software system behavior.
# Methods for translation, abstraction, debugging and test generation for COBOL are proposed.
# We developed a software system that implements these methods.
# Key words: translation, COBOL, modeling, abstraction, test generation, debugging.
# This paper presents the results of a numerical experiment
# aimed at clarifying the actual performance of arithmetic algorithms implemented in the C++ and Python
# programming languages using arbitrary precision arithmetic.
# The "addition machine" has been chosen as a mathematical model for integer arithmetic algorithms.
# An "addition machine" is a mathematical abstraction introduced by R. Floyd and D. Knuth.
# Its essence is the following: it uses only the operations of addition, subtraction, comparison
# and assignment together with a limited number of registers, yet it enables more complex operations,
# such as computing residues, multiplication, finding the greatest common divisor
# and modular exponentiation, to be carried out with reasonable computational efficiency.
# One of the features of the proposed implementation is the use of arbitrary precision arithmetic,
# which makes it applicable to cryptographic algorithms.
# Key words: C++, Python, GMP, addition, subtraction, greatest common divisor, exponentiation, Fibonacci numbers.
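As a minimal illustration of the addition-machine idea (a sketch based on Floyd and Knuth's definition, not the paper's actual benchmark code), the greatest common divisor can be computed using only addition, subtraction, comparison and assignment on a few registers:

```python
# Sketch of an "addition machine": gcd for positive integers using
# only addition, subtraction, comparison and assignment.
# Division and modulo are deliberately avoided; the subtraction step
# is kept efficient by doubling via repeated addition.

def gcd_addition_machine(x, y):
    """Floyd-Knuth style gcd; assumes x > 0 and y > 0."""
    while x != y:
        if x < y:
            x, y = y, x          # register exchange so that x >= y
        # build the largest power-of-two multiple of y that fits in x,
        # using only addition and comparison
        t = y
        while t + t <= x:
            t = t + t            # t is doubled by adding it to itself
        x = x - t
        if x == 0:               # y divided x exactly
            return y
    return x
```

For example, `gcd_addition_machine(1071, 462)` follows the same reductions as the subtractive Euclidean algorithm, but the doubling loop keeps the number of subtractions logarithmic rather than linear in the quotient.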
# A method for solving the adaptation problem of a logical network with multiple outputs,
# which restores the input set of binary vectors when only the lower values of this set
# and the output values are given, is considered.
# The synthesis algorithm of the logical network is based on the description of its Zhegalkin polynomial.
# Key words: adaptation, Boolean functions, Zhegalkin polynomial.
# The paper is a logical continuation of previously published work
# dedicated to the creation of data manipulation methods.
# We construct mappings of the ALC extension into the relational data model (RDM)
# based on the previously created binary relational data structure.
# The results of previous research, namely the RM2 data structure and the mappings of the basic ALC concepts
# into the RDM, are used in this paper.
# Key words: Semantic Web, description logic mapping, ALC, RDM, relational data model, description logic extensions,
# binary relational structure, RM2.
# Free-quantifier composition nominative logics of partial quasiary predicates are considered.
# We specify the following levels of these logics: renominative, renominative with predicates of weak equality,
# renominative with predicates of strong equality, free-quantifier, free-quantifier with composition of weak equality,
# and free-quantifier with composition of strong equality.
# The paper is mainly dedicated to the investigation of logics of the free-quantifier levels with equality.
# Languages and semantic models of such logics are described, and their semantic properties are studied.
# Normal forms of terms and formulas are considered, and the properties of logical consequence relations are investigated.
# Key words: free-quantifier logic, predicate, renomination, superposition, equality, logical consequence.
# The article describes a graph model of the text that enables faster information processing.
# This model provides identification of identical fragments in documents in which the order
# of sentences and other parts has been changed.
# The use of a constructive-synthesizing structure to formalize this model is a promising approach
# to further automating work with the model and the text, respectively.
# Key words: graph model of the text, graph compression, constructive-synthesizing structure, text comparison.
# Pure first-order logics of partial and total, single-valued and multi-valued quasiary predicates
# are investigated. For these logics we describe semantic models and languages,
# paying special attention to composition algebras of predicates and interpretation classes (semantics).
# We specify a number of subalgebras of the first-order algebra of quasiary predicates and describe the corresponding semantics.
# We consider logical consequence relations for sets of formulas and study their properties.
# For the defined relations a number of sequent-type calculi are constructed;
# their characteristic features are generalized sequent closedness conditions
# and special forms of quantifier elimination.
# Key words: logic, partial predicate, semantics, logical consequence, sequent calculus.
# The paper examines computation processes (CP) for array sorting, from the simple to the efficient.
# For clarity, the computation processes are represented as flowcharts.
# Macro operations (MO) are functionally complete pieces of a CP; they are extracted for each process.
# The lists of MOs for the individual sorting algorithms are combined and consolidated by minimizing the number of MOs.
# This approach allows the majority of CPs to be expressed within the scope of the dedicated MOs.
# Key words: sorting algorithm, computation process, flowchart, macro operations.
# The automated development of a parallel, distributed, dynamically scalable, fault-tolerant system
# for processing large amounts of streaming data is performed.
# The system is based on the Hazelcast framework for distributed computing
# and uses a toolkit for generating programs from high-level specifications of algorithms.
# The system is inspected and studied on an example of data processing in the Twitter social network,
# for which sentiment analysis functionality is implemented.
# The mechanism for deploying the created system on a cloud platform is examined.
# Key words: network, analysis, fault-tolerance, scaling, cluster, stream, cloud, node, program generation.
# The use of Johnson's algorithm for finding shortest paths between all pairs
# of nodes of a weighted directed graph is proposed.
# Its formalization in terms of Glushkov's modified systems of algorithmic algebras is realized.
# The expediency of using GPGPU technology to accelerate the algorithm is demonstrated.
# A number of parallel algorithm schemes optimized for GPGPU use are obtained.
# An approach to implementing the obtained schemes on the NVIDIA CUDA computing architecture is proposed.
# An experimental study of the performance improvement gained by using the GPU for computations is carried out.
# Key words: NVIDIA CUDA, GPGPU, SSSP, APSP, Thrust, SAA scheme, Johnson's algorithm.
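For reference, a sequential sketch of Johnson's algorithm is given below (plain Python, not the GPGPU schemes the paper derives): a Bellman-Ford pass from a virtual source computes vertex potentials, edges are reweighted to be non-negative, and Dijkstra is then run from every node.

```python
import heapq

def johnson(n, edges):
    """All-pairs shortest paths via Johnson's algorithm.
    n: number of nodes (0..n-1); edges: list of (u, v, w), w may be negative.
    Returns a dist[u][v] matrix, or None if a negative cycle exists."""
    INF = float('inf')

    # 1. Bellman-Ford from a virtual source linked to every node by a
    #    zero-weight edge; starting h = [0]*n encodes that source.
    h = [0] * n
    for _ in range(n):
        updated = False
        for u, v, w in edges:
            if h[u] + w < h[v]:
                h[v] = h[u] + w
                updated = True
        if not updated:
            break
    else:
        # still relaxing after n full passes -> negative cycle
        for u, v, w in edges:
            if h[u] + w < h[v]:
                return None

    # 2. Reweight: w'(u, v) = w + h[u] - h[v] is always >= 0.
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((v, w + h[u] - h[v]))

    # 3. Dijkstra from every node on the reweighted graph.
    dist = [[INF] * n for _ in range(n)]
    for s in range(n):
        d = [INF] * n
        d[s] = 0
        pq = [(0, s)]
        while pq:
            du, u = heapq.heappop(pq)
            if du > d[u]:
                continue
            for v, w in adj[u]:
                if du + w < d[v]:
                    d[v] = du + w
                    heapq.heappush(pq, (d[v], v))
        # 4. Undo the reweighting to recover true path lengths.
        for v in range(n):
            if d[v] < INF:
                dist[s][v] = d[v] - h[s] + h[v]
    return dist
```

The per-source Dijkstra runs in step 3 are independent, which is what makes the algorithm a natural candidate for the GPGPU parallelization the paper describes.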
# The total correctness of Peterson's algorithm is proved.
# States and transitions are fixed by the program.
# The runtime environment considered is interleaving concurrency with shared memory.
# An invariant of the program is constructed.
# All reasoning is carried out in terms of the Method for proving software properties in
# Interleaving Parallel Compositional Languages (IPCL).
# Conclusions about the adequacy of the Method for such tasks
# (based on the flexibility of the composition-nominative platform), as well as its practicality
# and ease of use for real-world systems, are drawn in this and the author's other works.
# Key words: Peterson's algorithm, mutual exclusion, software total correctness, formal verification,
# liveness property, concurrent program, interleaving, IPCL, composition-nominative languages.
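For readers unfamiliar with the verified protocol, here is a runnable sketch of Peterson's two-thread mutual exclusion algorithm (an illustration only, not the paper's IPCL formalization; on real hardware the shared writes would need memory fences, while CPython's GIL happens to provide the sequentially consistent interleaving assumed by this style of reasoning):

```python
import threading
import time

flag = [False, False]   # flag[i]: thread i wants to enter
turn = 0                # which thread yields when both want in
counter = 0             # shared resource protected by the protocol

def worker(i, iterations):
    """Repeatedly enter the critical section using Peterson's protocol."""
    global turn, counter
    other = 1 - i
    for _ in range(iterations):
        # entry protocol
        flag[i] = True
        turn = other
        while flag[other] and turn == other:
            time.sleep(0)        # busy-wait; sleep(0) yields the GIL
        # critical section: a non-atomic read-modify-write
        counter += 1
        # exit protocol
        flag[i] = False

threads = [threading.Thread(target=worker, args=(i, 5000)) for i in (0, 1)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # mutual exclusion holds iff this equals 10000
```

If the entry protocol were removed, the unprotected `counter += 1` could lose updates under interleaving; the invariant-based proof in the paper establishes that this cannot happen with the protocol in place.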
# The method for proving properties of parallel programs running as multiple-instance interleaving
# with shared memory is applied to prove the correctness property of a banking system
# for remittance payments.
# In this work, a transition system was built for a model with simplified state,
# and the program invariant was formulated and proved to hold for the software system
# at any given time.
# Conclusions about the convenience and adequacy of applying the method to prove the correctness
# of parallel systems were drawn.
# Key words: software correctness, safety property proof, concurrent program, interleaving, invariant, IPCL,
# composition-nominative languages, formal verification.
# An approach to the automatic transformation of legacy Fortran code for execution
# on cloud computing platforms is proposed.
# A system architecture based on web-service choreography is developed.
# This architecture allows the system to scale without a fixed limit and reduces message-passing overhead.
# The approach is studied on the example of a program from the quantum chemistry domain.
# Key words: virtualization, cloud computing, scalable parallelism, web-services choreography.
# Parallel tiered and dynamic models of fuzzy inference in expert-diagnostic software systems
# with fuzzy-rule knowledge bases are considered.
# Tiered parallel and dynamic fuzzy inference procedures that speed up computations
# in a software system for evaluating the quality of scientific papers are developed.
# Estimates of the effectiveness of parallel tiered and dynamic computation schemes
# with a complex dependency graph between blocks of fuzzy Takagi-Sugeno rules are constructed.
# A comparison of the efficiency of the parallel tiered and dynamic models is carried out.
# Key words: fuzzy Takagi-Sugeno systems, parallel tiered scheme, dynamic scheme of computing, speed-up,
# dependency graph, fuzzy inference.
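The Takagi-Sugeno inference that such rule blocks perform can be sketched as follows (a minimal single-input illustration with hypothetical rules, not the paper's rule base or its parallel scheduling):

```python
def takagi_sugeno(rules, x):
    """Minimal first-order Takagi-Sugeno inference for a scalar input:
    each rule is (membership_fn, consequent_fn); the output is the
    firing-strength-weighted average of the rule consequents."""
    weights = [mu(x) for mu, _ in rules]
    total = sum(weights)
    if total == 0:
        raise ValueError("no rule fires for this input")
    return sum(w * f(x) for w, (_, f) in zip(weights, rules)) / total

# Hypothetical membership functions on the range [0, 5]
def low(x):  return max(0.0, min(1.0, (5 - x) / 5))
def high(x): return max(0.0, min(1.0, x / 5))

rules = [(low,  lambda x: 2 * x),      # IF x is low  THEN y = 2x
         (high, lambda x: 10 - x)]     # IF x is high THEN y = 10 - x

print(takagi_sugeno(rules, 2.5))
```

Because each rule's firing strength and consequent are computed independently before the final weighted average, blocks of such rules can be evaluated concurrently, which is the basis for the tiered parallel schemes the paper analyzes.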
# The article describes machine learning methods for named entity recognition.
# Two basic machine learning models, Naive Bayes and Conditional Random Fields, were used
# to build named entity classifiers.
# A model for multi-class classification of named entities that uses Error-Correcting Output Codes was also investigated.
# The paper describes a method for training the classifiers and the results of test experiments.
# Conditional Random Fields outperform the other models in precision and recall.
# Key words: machine learning, natural language processing, named entity recognition.
# The research addresses the problem of the large volume of results produced by sequential pattern mining.
# A new form of sequential patterns is proposed.
# Requirements for a software implementation of the described method are introduced.
# The results of experiments based on real malware behavior data are demonstrated.
# Key words: data mining, sequential pattern mining, regular expressions.
# The authors describe an application for solving the video context detection problem.
# The application architecture uses the state-of-the-art deep learning framework TensorFlow
# together with the computer vision library OpenCV in an isolated agent environment.
# The experimental results demonstrate the effectiveness of the developed product.
# Key words: deep learning, TensorFlow, computer vision, video context prediction.
# The article deals with different approaches to calculating the probability of a fuzzy event.
# Problems in fuzzy formulation that are solved with the help of the proposed approaches
# to the calculation of fuzzy probability are considered.
# Key words: probability, fuzzy event.
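One classical approach to this problem is Zadeh's definition, in which the probability of a fuzzy event is the expected value of its membership function (a sketch under that assumption, with a hypothetical die example; the article may treat further approaches as well):

```python
def fuzzy_event_probability(membership, prob):
    """Zadeh's probability of a fuzzy event A over a finite sample
    space: P(A) = sum over x of mu_A(x) * p(x)."""
    assert abs(sum(prob.values()) - 1.0) < 1e-9  # p must be a distribution
    return sum(membership.get(x, 0.0) * p for x, p in prob.items())

# Hypothetical fuzzy event: "the die shows a large number"
mu_large = {4: 0.3, 5: 0.7, 6: 1.0}          # membership degrees
p_die = {x: 1 / 6 for x in range(1, 7)}      # fair-die distribution

print(fuzzy_event_probability(mu_large, p_die))  # (0.3 + 0.7 + 1.0) / 6
```

For a crisp event (membership values only 0 or 1) the formula reduces to ordinary probability, which is why it is a natural baseline among the approaches compared.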
# The basic components of an information technology for inductive modeling of causation under uncertainty,
# based on fuzzy object-oriented Bayesian networks, are proposed.
# The technology is built on algorithms for transforming a Bayesian network into a junction tree.
# New, more efficient transformation algorithms, obtained by modifying known algorithms
# to use more information from the graphical representation of the network, are considered.
# A functional model designed for the transformation of fuzzy object-oriented Bayesian networks is described structurally.
# Key words: Bayesian network, object-oriented Bayesian network, fuzzy Bayesian network, transformation algorithms,
# moral graph, junction tree, fuzzy probability.
# The necessity of using personalized knowledge in modern distributed applications
# and the sources of such knowledge are considered.
# The specific features of Wiki resources and their semantic markup means are analyzed.
# A method of processing a semantic Wiki resource to generate personalized ontologies for user tasks is proposed.
# Ways of using these ontologies in intelligent Web-based applications are considered
# (using a personalized semantic search system as an example).
# Key words: Semantic Wiki, Wiki-resource, ontology, semantic search.
# The automated composition of services described by their process models is a difficult but vital task.
# Its solution requires strict formalization and strong semantization of a service.
# The similarity between the definitions of intelligent planning tasks and service composition objectives
# demonstrates the possibility of using AI-planning approaches to solve Web service composition problems,
# provided that a number of existing problems are resolved.
# This article analyzes the similarities and differences between HTN-planning tasks and the composition
# of Web services represented by a process model.
# BPEL services are considered.
# The main problems that arise when AI-planning methods are used are identified.
# Approaches to resolving them by integrating DL and HTN-planning are proposed.
# Translation algorithms from BPEL to HTN-DL are also discussed.
# Key words: semantic Web-service, description logic, process model, semantic Web-service composition,
# AI-planning task, HTN-planning, BPEL-processes, HTN-DL.
# An approach to constructing a personalized ontological model of a domain of interest to the user is proposed.
# The model is based on processing a semantically marked-up Wiki resource the user has interacted with,
# and uses semantic properties, links, and categories to build the ontology classes and instances.
# This approach allows an application to be provided with a relevant, formalized system of knowledge represented on the Web.
# Key words: Wiki-resource, ontology, RDF, semantic search.
# The article proposes a mapping from UML class diagrams to the SHOIQ dialect of description logic.
# An extension of the UML notation that uses stereotypes to achieve the closest approximation
# to semantic structures is proposed.
# The causes of and problems arising in such a mapping are specified.
# Key words: UML, description logic, Semantic Web, UML class diagram validation.
# This study describes an application of the decision tree approach to the problem of data analysis
# and forecasting. Data processing is based on real observations representing sales levels in the period
# between 2006 and 2009.
# R (a programming language and software environment) is used as the tool for statistical computing.
# The paper compares the method with well-known approaches and solutions in order
# to improve the accuracy of the obtained results.
# Key words: data mining, forecasting, decision making, decision trees, R language.
# A non-classical method for contact center load forecasting, as well as methods
# (Erlang C, simulation modelling) for calculating the necessary number of operators, is applied.
# The task of predicting the volume of a contact center's incoming calls for a certain month,
# and of calculating the necessary number of operators, has been set and solved.
# Conclusions about the model's forecast quality and precision are presented.
# Key words: contact center, load planning, man-hours, forecast, Erlang C, simulation modelling.
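The Erlang C formula mentioned above gives the probability that an arriving call must wait, from which a minimal staffing level can be searched for (a sketch of the standard formula; the paper's actual staffing procedure and parameters may differ):

```python
from math import factorial

def erlang_c(traffic, agents):
    """Erlang C probability of waiting for offered traffic in Erlangs
    and a given number of agents; meaningful when agents > traffic."""
    if agents <= traffic:
        return 1.0  # unstable system: every call waits
    top = (traffic ** agents / factorial(agents)) * (agents / (agents - traffic))
    bottom = sum(traffic ** k / factorial(k) for k in range(agents)) + top
    return top / bottom

def agents_needed(traffic, max_wait_prob):
    """Smallest number of agents keeping P(wait) at or below the target."""
    n = int(traffic) + 1  # start just above the stability threshold
    while erlang_c(traffic, n) > max_wait_prob:
        n += 1
    return n

# Hypothetical load: 100 calls/hour at 3 minutes average handling
# time gives 100 * 3/60 = 5 Erlangs of offered traffic.
print(agents_needed(5.0, 0.2))
```

Adding agents beyond the stability threshold drives the waiting probability down monotonically, so the linear search in `agents_needed` always terminates.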
# An information technology based on dispersion evaluation, which allows automated identification
# of semantic terms in the content of educational materials with sufficiently high efficiency, is presented.
# Factors that hinder the effective analysis of educational materials are considered.
# The high efficiency of the proposed technologies makes it possible to use them for a number of problems,
# such as estimating the correspondence of educational materials to requirements,
# estimating the correspondence of test tasks to educational materials, semantic assistance in test creation,
# and automated generation of keyword lists and abstracts.
# Key words: information technology, semantic terms, training materials, summary.
# The article deals with the further development of a service-oriented portal solution
# for providing meteorological forecasting services.
# In particular, the three-dimensional algorithm for atmosphere circulation modeling
# was implemented using graphics processing units (GPU) as the computing device for forecast calculation
# as well as for its visualization.
# Key words: parallel computing, meteorological forecasting, web portal, web, GPU, visualization.
# The paper describes the modeling technology and the main results of a simulation of the hemodynamic effects
# of cardiac hypertrophy (HH), conducted using a previously published mathematical model (MM) [9].
# The dynamics of hemodynamic abnormalities are not modeled.
# The MM simulates changes in central hemodynamics at different degrees and forms of myocardial hypertrophy (MH).
# The software technology provides simulation of three types of HH:
# a) adaptive HH arising in response to chronic insufficiency of the systemic circulation;
# b) abnormal HH, which is the extreme stage of adaptive HH;
# c) abnormal MH of the left ventricle.
# The first two variants of HH have been simulated by increasing the myocardium's stiffness,
# while the third variant is simulated via an additional decrease of the unstressed volume of the left ventricle.
# For each variant of HH, the compensatory potential of self-regulation mechanisms
# (a model of the uncontrolled cardiovascular system) is studied, and then the similar capabilities of baroreflex
# regulators of hemodynamics are evaluated. The MM satisfactorily reproduces the main changes in blood pressure,
# cardiac output, and heart rate.
# The likely role of cellular energy mechanisms in the adaptation of the cardiovascular system to high loads is discussed.
# The simulator is an autonomous program which can serve both as a tool to support medical-physiological research
# and as an educational means for demonstrating causal relationships to medical students.
# An implementation of the program in a more general program-modeling complex focused on the identification
# of patterns of functioning of super-human energy is planned.
# Key words: mathematical model, hemodynamics, computer, simulation.