Search results for: Models of information systems

8790 DJess A Knowledge-Sharing Middleware to Deploy Distributed Inference Systems

Authors: Federico Cabitza, Bernardo Dal Seno

Abstract:

This paper presents DJess, a novel distributed production system that provides an infrastructure for sharing factual and procedural knowledge. DJess is a Java package that provides programmers with a lightweight middleware through which inference systems implemented in Jess and running on different nodes of a network can communicate. Communication and coordination among inference systems (agents) are achieved through the ability of each agent to transparently and asynchronously reason on inferred knowledge (facts) that may be collected and asserted by other agents, on the basis of inference code (rules) that may be either local or transmitted by any node to any other node.

Keywords: Knowledge-Based Systems, Expert Systems, Distributed Inference Systems, Parallel Production Systems, Ambient Intelligence, Mobile Agents.

Downloads: 1796
8789 Fusion of Colour and Depth Information to Enhance Wound Tissue Classification

Authors: Darren Thompson, Philip Morrow, Bryan Scotney, John Winder

Abstract:

Patients with diabetes are susceptible to chronic foot wounds which may be difficult to manage and slow to heal. Diagnosis and treatment currently rely on the subjective judgement of experienced professionals. An objective method of tissue assessment is required. In this paper, a data fusion approach was taken to wound tissue classification. The supervised Maximum Likelihood and unsupervised Multi-Modal Expectation Maximisation algorithms were used to classify tissues within simulated wound models by weighting the contributions of both colour and 3D depth information. It was found that, at low weightings, depth information could show significant improvements in classification accuracy when compared to classification by colour alone, particularly when using the maximum likelihood method. However, larger weightings were found to have an entirely negative effect on accuracy.
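
As an illustration of the data-fusion idea above, the sketch below combines per-pixel colour features with a weighted depth feature and classifies them with a class-conditional Gaussian (maximum likelihood) model. It is not the authors' pipeline: the feature layout, the weighting scheme and the synthetic data are assumptions made only for illustration.

```python
# Minimal sketch (not the authors' pipeline): weighted fusion of colour and
# depth features followed by a class-conditional Gaussian ML classifier.
# Feature layout, weighting scheme and data are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

def fuse_features(colour, depth, depth_weight=0.2):
    """Concatenate per-pixel colour features with a weighted depth feature."""
    return np.hstack([colour, depth_weight * depth.reshape(-1, 1)])

rng = np.random.default_rng(0)
# Synthetic stand-ins: 500 labelled pixels, 3 colour channels + 1 depth value.
colour = rng.normal(size=(500, 3))
depth = rng.normal(size=500)
labels = rng.integers(0, 3, size=500)          # e.g. granulation / slough / necrosis

X = fuse_features(colour, depth, depth_weight=0.2)
clf = QuadraticDiscriminantAnalysis()          # Gaussian maximum-likelihood classifier
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```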

Keywords: Classification, data fusion, diabetic foot, stereophotogrammetry, tissue colour.

Downloads: 1710
8788 A Fast Sign Localization System Using Discriminative Color Invariant Segmentation

Authors: G.P. Nguyen, H.J. Andersen

Abstract:

Building intelligent traffic guidance systems has recently become an interesting research subject. A good system should be able to observe all important visual information in order to analyze the context of the scene. To do so, signs in general, and traffic signs in particular, are usually taken into account, as they carry rich information for these systems. Therefore, many researchers have put effort into the field of sign recognition. Sign localization, or sign detection, is the most important step in the sign recognition process: it filters out non-informative areas of the scene and locates candidates for the later steps. In this paper, we apply a new approach to detecting sign locations using a new color invariant model. Experiments are carried out on different datasets introduced in other works, whose authors reported difficulty in detecting signs under unfavorable imaging conditions. Our method is simple and fast and, most importantly, it achieves a high detection rate in locating signs.
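
To make the idea of colour-invariant segmentation concrete, the sketch below thresholds a generic intensity-invariant chromaticity space (normalised rg); the paper's specific colour invariant model is not reproduced here, and the threshold values are arbitrary assumptions.

```python
# Illustrative sketch only: segmentation in a generic colour-invariant space
# (normalised rg chromaticity), not the specific invariant model of the paper.
# Threshold values are arbitrary assumptions for a red traffic sign.
import numpy as np

def red_sign_mask(rgb, r_min=0.45, g_max=0.30):
    """Return a boolean mask of candidate red-sign pixels.

    rgb: H x W x 3 float array in [0, 1].
    """
    s = rgb.sum(axis=2) + 1e-6          # avoid division by zero
    r = rgb[..., 0] / s                 # chromaticity is invariant to intensity
    g = rgb[..., 1] / s
    return (r > r_min) & (g < g_max)

# Example usage with a random image stand-in.
image = np.random.rand(120, 160, 3)
mask = red_sign_mask(image)
print("candidate pixels:", int(mask.sum()))
```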

Keywords: Sign localization, color-based segmentation.

Downloads: 1293
8787 An Optimal Steganalysis Based Approach for Embedding Information in Image Cover Media with Security

Authors: Ahlem Fatnassi, Hamza Gharsellaoui, Sadok Bouamama

Abstract:

This paper deals with the fields of steganography and steganalysis. Steganography involves hiding information in a cover medium to obtain the stego medium, in such a way that the cover medium is perceived not to carry any embedded message by its unintended recipients. Steganalysis is the mechanism of detecting the presence of hidden information in the stego medium, and it can lead to the prevention of disastrous security incidents. In this paper, we provide a critical review of the steganalysis algorithms available for analyzing the characteristics of an image stego medium against the corresponding cover medium, and for understanding the process of embedding the information and its detection. We anticipate that this paper can also give a clear picture of current trends in steganography, so that appropriate steganalysis algorithms can be developed and improved.
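
To make the notions of cover and stego media concrete, the sketch below shows textbook least-significant-bit (LSB) embedding and extraction; it is a generic illustration, not the embedding approach proposed or reviewed in the paper.

```python
# Textbook least-significant-bit (LSB) embedding, shown only to make the idea
# of a "stego medium" concrete; the paper itself surveys steganalysis methods
# and does not prescribe this particular scheme.
import numpy as np

def embed_lsb(cover, bits):
    """Write one message bit into the LSB of each of the first len(bits) pixels."""
    stego = cover.copy().ravel()
    stego[: len(bits)] = (stego[: len(bits)] & ~np.uint8(1)) | np.asarray(bits, dtype=np.uint8)
    return stego.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    """Recover the first n_bits message bits from the LSB plane."""
    return (stego.ravel()[:n_bits] & 1).tolist()

cover = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, message)
assert extract_lsb(stego, len(message)) == message
```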

Keywords: Optimization, heuristic and metaheuristic algorithms, embedded systems, low-power consumption, steganalysis heuristic approach.

Downloads: 1183
8786 Chaos Theory and Application in Foreign Exchange Rates vs. IRR (Iranian Rial)

Authors: M. A. Torkamani, S. Mahmoodzadeh, S. Pourroostaei, C. Lucas

Abstract:

The daily production of information, and the importance of the sequence of produced data for forecasting future market performance, turn the analysis of data behavior into a time series problem. Very complicated time series are usually treated as random, and their changes are therefore considered unpredictable. However, such series may be the product of a deterministic, nonlinear dynamical (chaotic) process and, as a result, be predictable. From the point of view of chaos theory, complicated systems only appear chaotic and therefore seem unregulated and random, while they may in fact obey a specific mathematical formula. In this article, using the test for a strange attractor and the largest Lyapunov exponent, the probability of chaos in several foreign exchange rates versus the IRR (Iranian Rial) is investigated. The results show that data in this market exhibit complex chaotic behavior with a large number of degrees of freedom.
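
For reference, the largest Lyapunov exponent test mentioned above relies on the standard definition below (written out here, not transcribed from the paper): two initially close trajectories separated by δZ₀ diverge exponentially, and a positive λ_max, together with a strange attractor, is taken as evidence of deterministic chaos.

```latex
% Standard definition of the largest Lyapunov exponent (textbook form, not
% reproduced from the paper):
\[
  \lambda_{\max} \;=\; \lim_{t \to \infty} \; \lim_{\|\delta Z_0\| \to 0}
  \frac{1}{t}\, \ln \frac{\|\delta Z(t)\|}{\|\delta Z_0\|}
\]
```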

Keywords: Chaos, Exchange Rate, Nonlinear Models.

Downloads: 2477
8785 A Comparative Analysis of the Performance of COSMO and WRF Models in Quantitative Rainfall Prediction

Authors: Isaac Mugume, Charles Basalirwa, Daniel Waiswa, Mary Nsabagwa, Triphonia Jacob Ngailo, Joachim Reuder, Schättler Ulrich, Musa Semujju

Abstract:

Numerical weather prediction (NWP) models are considered powerful tools for guiding quantitative rainfall prediction. A number of NWP models exist and are used at many operational weather prediction centers. This study considers two of them, namely the Consortium for Small-scale Modeling (COSMO) model and the Weather Research and Forecasting (WRF) model. It compares the models' ability to predict rainfall over Uganda for the period 21st April 2013 to 10th May 2013 using the root mean square error (RMSE) and the mean error (ME). In comparing the performance of the models, the study assesses their ability to predict light rainfall events and extreme rainfall events. All the experiments used the default parameterization configurations and the same horizontal resolution (7 km). The results show that the COSMO model had a tendency to largely predict no rain, which explains its under-prediction. The COSMO model (RMSE: 14.16; ME: -5.91) presented a significantly (p = 0.014) higher magnitude of error than the WRF model (RMSE: 11.86; ME: -1.09). However, the COSMO model (RMSE: 3.85; ME: 1.39) performed significantly (p = 0.003) better than the WRF model (RMSE: 8.14; ME: 5.30) in simulating light rainfall events. Both models under-predicted extreme rainfall events, with the COSMO model (RMSE: 43.63; ME: -39.58) presenting significantly higher error magnitudes than the WRF model (RMSE: 35.14; ME: -26.95). This study recommends additional diagnosis of the models' treatment of deep convection over the tropics.
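
A minimal sketch of the verification statistics quoted above (RMSE and mean error) together with a paired significance test is shown below; the abstract does not state which test the authors used, so the paired t-test and all rainfall values here are assumptions.

```python
# Minimal sketch of the verification statistics quoted in the abstract (RMSE and
# mean error) plus a paired significance test. The authors' exact test is not
# given in the abstract, so the paired t-test below is an assumption.
import numpy as np
from scipy import stats

def rmse(forecast, observed):
    return float(np.sqrt(np.mean((forecast - observed) ** 2)))

def mean_error(forecast, observed):
    return float(np.mean(forecast - observed))   # negative values mean under-prediction

obs = np.array([12.0, 0.0, 5.5, 30.2, 1.1, 18.4])    # hypothetical station rainfall (mm)
cosmo = np.array([2.0, 0.0, 1.5, 10.0, 0.0, 6.0])    # hypothetical COSMO forecasts
wrf = np.array([10.5, 1.0, 4.0, 22.0, 2.5, 15.0])    # hypothetical WRF forecasts

for name, fc in [("COSMO", cosmo), ("WRF", wrf)]:
    print(name, "RMSE:", round(rmse(fc, obs), 2), "ME:", round(mean_error(fc, obs), 2))

# Paired test on the absolute errors of the two models.
t_stat, p_value = stats.ttest_rel(np.abs(cosmo - obs), np.abs(wrf - obs))
print("paired t-test p-value:", round(p_value, 3))
```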

Keywords: Comparative performance, the COSMO model, the WRF model, light rainfall events, extreme rainfall events.

Downloads: 1541
8784 Framework for Government ICT Projects

Authors: Manal Rayes

Abstract:

In their efforts to use information and communication technology to enhance the quality of public service delivery, national and local governments around the world are competing to introduce more ICT applications as tools to automate processes related to law enforcement or policy execution, increase citizen orientation, trust, and satisfaction, and create one-stop shops for public services. In its implementation, e-Government ICT needs to maintain transparency, participation, and collaboration. Because of this diverse mix of goals and requirements, e-Government systems need to be designed around special design considerations in order to eliminate the risks of failure to comply with government regulations, citizen dissatisfaction, or market rejection. In this article, we suggest a framework with guidelines for designing government information systems that takes into consideration the special requirements of the public sector. We then introduce two case studies and show how applying those guidelines would result in a more solid system design.

Keywords: e-government, framework, guidelines, system design.

Downloads: 1658
8783 Comparing and Combining the Axial with the Network Maps for Analyzing Urban Street Pattern

Authors: Nophaket Napong

Abstract:

Rooted in the study of the social functioning of space in architecture, Space Syntax (SS) and the more recent Network Pattern (NP) research demonstrate the 'spatial structures' of the city, i.e. the hierarchical patterns of streets, junctions and alley ends. Applying SS and NP models, planners can conceptualize the real city's patterns. Although both models yield the optimal paths of the city, their underlying representations of the city's spatial configuration differ. The Axial Map analyzes the topological, non-distance-based connectivity structure, whereas the Central-Node Map and the Shortcut-Path Map analyze metrical, distance-based structures. This research contrasts and combines them to understand various forms of the city's structure. It concludes that, while they reveal different spatial structures, the Space Syntax and Network Pattern urban models support each other. Combined, they simulate the global access structure and the locally compact structures, namely the central nodes and the shortcuts, of the city.

Keywords: Street pattern, space syntax, syntactic and metrical models, network pattern models.

Downloads: 1460
8782 Application of RP Technology with Polycarbonate Material for Wind Tunnel Model Fabrication

Authors: A. Ahmadi Nadooshan, S. Daneshmand, C. Aghanajafi

Abstract:

Traditionally, wind tunnel models are made of metal and are very expensive. In recent years, everyone has been looking for ways to do more with less. Under the right test conditions, a rapid prototype part can be tested in a wind tunnel. Using rapid prototype manufacturing techniques and materials in this way significantly reduces the time and cost of producing wind tunnel models. This study examined fused deposition modeling (FDM) and its ability to make components for wind tunnel models in a timely and cost-effective manner. The paper discusses the application of a wind tunnel model configuration constructed using FDM for transonic wind tunnel testing. A study was undertaken comparing a rapid prototyping model constructed with FDM technology using polycarbonate to a standard machined steel model. Testing covered the Mach range of Mach 0.3 to Mach 0.75 at an angle-of-attack range of -2° to +12°. Results from this study show relatively good agreement between the two models, and the rapid prototyping method reduces the time and cost of producing wind tunnel models. It can be concluded from this study that wind tunnel models constructed using rapid prototyping methods and materials can be used in wind tunnel testing for initial baseline aerodynamic database development.

Keywords: Polycarbonate, Fabrication, FDM, Model, Rapid Prototyping, Wind Tunnel.

Downloads: 2446
8781 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed to help customers find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based, analytical approach to stocking the right movies for local audiences and retaining more customers. We used classification models to extract features from thousands of customers' demographic, behavioral and social information to predict their movie genre preference. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from the sample data, and their in-sample test errors were compared. Out-of-sample errors were also compared under different Vapnik-Chervonenkis (VC) dimensions of the machine learning algorithm to find and prevent overfitting. The Gaussian kernel SVM prediction model correctly predicts movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers' preferences with a small data set, and of how to design prediction tools for such enterprises.
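
A sketch of the model comparison described above, an RBF-kernel SVM versus logistic regression under cross-validation, is shown below; the features and labels are random placeholders, since the customer data used in the study are not available here.

```python
# Sketch of the comparison described in the abstract: RBF-kernel SVM versus
# logistic regression with cross-validation. Features and labels are synthetic
# placeholders, not the authors' customer data.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
X = rng.normal(size=(600, 10))               # demographic/behavioural/social features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=600) > 0).astype(int)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
logit = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

for name, model in [("Gaussian-kernel SVM", svm), ("Logistic regression", logit)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```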

Keywords: Computational social science, movie preference, machine learning, SVM.

Downloads: 1649
8780 On the Dynamic Model of Service Innovation in Manufacturing Industry

Authors: Yongyoon Suh, Chulhyun Kim, Moon-soo Kim

Abstract:

As manufacturing is increasingly dominated by services, products and processes are more and more intertwined with sophisticated services. This research therefore starts with a discussion of the integration of product, process, and service in the innovation process. In particular, the paper sets out some foundations for a theory of service innovation in the field of manufacturing, and proposes dynamic models of service innovation related to product and process. Two dynamic models of service innovation are suggested to investigate major tendencies and dynamic variations during the innovation cycle: co-innovation and sequential innovation. To structure the dynamic models of product, process, and service innovation, the innovation stages in which the two models are mainly achieved are identified. This research should encourage manufacturers to formulate strategies and plans for service development together with product and process.

Keywords: dynamic model, service innovation, service innovation models, innovation cycle, manufacturing industry.

Downloads: 2019
8779 Group Contribution Parameters for Nonrandom Lattice Fluid Equation of State involving COSMO-RS

Authors: Alexander Breitholz, Wolfgang Arlt, Ki-Pung Yoo

Abstract:

Group contribution based models are widely used in industrial applications for their convenience and flexibility. Although a number of group contribution models have been proposed, certain limitations are inherent to those models. Models based on group contribution excess Gibbs free energy are limited to low pressures, and models based on an equation of state (EOS) cannot properly describe highly nonideal mixtures, including acids, without additional modifications such as chemical theory. In the present study, a new approach derived from quantum chemistry has been used to calculate the necessary EOS group interaction parameters. The COSMO-RS method, based on quantum mechanics, provides a reliable tool for fluid phase thermodynamics. Benefits of the group contribution EOS are the consistent extension to hydrogen-bonded mixtures and the capability to predict polymer-solvent equilibria up to high pressures. The authors are confident that, with a sufficient parameter matrix, the performance of the lattice EOS can be improved significantly.

Keywords: COSMO-RS, Equation of State, Group contribution, Lattice Fluid, Phase equilibria.

Downloads: 1920
8778 Hydrological Modeling of Watersheds Using the Only Corresponding Competitor Method: The Case of M’Zab Basin, South East Algeria

Authors: Oulad Naoui Noureddine, Cherif ELAmine, Djehiche Abdelkader

Abstract:

Water resources management includes several disciplines; among them, the modeling of the rainfall-runoff relationship is the most important for preventing natural risks. There are several models for studying the rainfall-runoff relationship in watersheds. However, the majority of these models are not applicable to all basins of the world. In this study, a new stochastic method called the Only Corresponding Competitor (OCC) method was used for the hydrological modeling of the M'Zab watershed (South East Algeria) in order to adapt a few empirical models to any hydrological regime. The results obtained suggest several directions in which it would be interesting to experiment with hydrological models that, collectively or separately, improve the representation of a catchment's data through the OCC method.

Keywords: Empirical model, modeling, OCC, rainfall-runoff relationship.

Downloads: 1155
8777 Estimating Regression Parameters in Linear Regression Model with a Censored Response Variable

Authors: Jesus Orbe, Vicente Nunez-Anton

Abstract:

In this work, we study the effect of several covariates X on a censored response variable T with unknown probability distribution. In this context, most of the studies in the literature can be placed in two general classes of regression models: models that study the effect the covariates have on the hazard function, and models that study the effect the covariates have on the censored response variable. The proposals in this paper belong to the second class of models and, more specifically, to the least squares based model approach. Thus, using the bootstrap estimate of the bias, we try to improve the estimation of the regression parameters by reducing their bias for small sample sizes. Simulation results presented in the paper show that, for reasonable sample sizes and censoring levels, the bias is always smaller for the new proposals.
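
The bias-correction step described above can be sketched as follows for an ordinary least-squares fit; the authors' estimator additionally handles the censoring of the response, which is omitted here, and the data are synthetic.

```python
# Minimal sketch of a bootstrap bias correction for least-squares coefficients,
# in the spirit of the proposal above; the censoring-specific parts of the
# authors' estimator are not reproduced, and the data are synthetic.
import numpy as np

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

def bias_corrected_ols(X, y, n_boot=500, seed=0):
    rng = np.random.default_rng(seed)
    beta_hat = ols(X, y)
    boot = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, len(y), size=len(y))   # resample observations
        boot[b] = ols(X[idx], y[idx])
    bias = boot.mean(axis=0) - beta_hat              # bootstrap estimate of the bias
    return beta_hat - bias                           # bias-corrected coefficients

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(60), rng.normal(size=60)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(scale=1.0, size=60)
print("corrected beta:", bias_corrected_ols(X, y).round(3))
```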

Keywords: Censored response variable, regression, bias.

Downloads: 1475
8776 Application of Adaptive Neuro-Fuzzy Inference System in the Prediction of Economic Crisis Periods in USA

Authors: Eleftherios Giovanis

Abstract:

In this paper, the discrete choice models Logit and Probit are examined in order to predict economic recession or expansion periods in the USA. Additionally, we propose an adaptive neuro-fuzzy inference system with a triangular membership function. We examine the in-sample period 1947-2005 and test the models in the out-of-sample period 2006-2009. The forecasting results indicate that the adaptive neuro-fuzzy inference system (ANFIS) model significantly outperforms the Logit and Probit models in the out-of-sample period. This indicates that the neuro-fuzzy model provides a better and more reliable signal on whether or not a financial crisis will take place.
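
A minimal sketch of the Logit and Probit benchmarks, fitted here with statsmodels on synthetic data, is shown below; the recession indicator and predictors used in the paper are not reproduced, and the ANFIS model itself would require a dedicated neuro-fuzzy implementation that is not shown.

```python
# Sketch of the Logit and Probit benchmarks mentioned above, fitted with
# statsmodels on synthetic data; the author's recession indicator and
# predictors are not reproduced here.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(200, 3)))       # e.g. yield spread, returns, growth
latent = X @ np.array([-0.2, 1.0, -0.8, 0.5]) + rng.normal(size=200)
y = (latent > 0).astype(int)                         # 1 = recession period, 0 = expansion

logit_res = sm.Logit(y, X).fit(disp=False)
probit_res = sm.Probit(y, X).fit(disp=False)
print("Logit coefficients: ", logit_res.params.round(3))
print("Probit coefficients:", probit_res.params.round(3))
```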

Keywords: ANFIS, discrete choice models, financial crisis, US economy.

Downloads: 1610
8775 Interactive Model Based On an Extended CPN

Authors: Shuzhen Yao, Fengjing Zhao, Jianwei He

Abstract:

The UML modeling of complex distributed systems is often a great challenge due to the large number of parallel, real-time operating components. In this paper, the problems of verifying such systems are discussed. ECPN, an Extended Colored Petri Net, is defined to formally describe the state transitions of components and the interactions among components. The relationship between sequence diagrams and Free Choice Petri Nets is investigated; Free Choice Petri Net theory helps verify the liveness of sequence diagrams. By converting sequence diagrams to ECPNs and then comparing the behaviors of the sequence diagram ECPNs and the statecharts, the consistency among models is analyzed. Finally, a verification process for an example model is demonstrated.

Keywords: Consistency, liveness, Petri Net, sequence diagram.

Downloads: 1610
8774 Parallel Particle Swarm Optimization Optimized LDI Controller with Lyapunov Stability Criterion for Nonlinear Structural Systems

Authors: P.-W. Tsai, W.-L. Hong, C.-W. Chen, C.-Y. Chen

Abstract:

In this paper, we present a neural-network (NN) based approach to represent a nonlinear Takagi-Sugeno (T-S) system. A linear differential inclusion (LDI) state-space representation is utilized to deal with the NN models. Taking advantage of the LDI representation, the stability conditions and controller design are derived for a class of nonlinear structural systems. Moreover, the concept of utilizing the Parallel Particle Swarm Optimization (PPSO) algorithm to solve for the common P matrix under the stability criteria is presented in this paper.

Keywords: Lyapunov Stability, Parallel Particle Swarm Optimization, Linear Differential Inclusion, Artificial Intelligence.

Downloads: 1865
8773 Statistical Analysis for Overdispersed Medical Count Data

Authors: Y. N. Phang, E. F. Loh

Abstract:

Many researchers have suggested the use of zero inflated Poisson (ZIP) and zero inflated negative binomial (ZINB) models in modeling overdispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB always provide a better fit than the ordinary Poisson and negative binomial models in modeling overdispersed medical count data. In this study, we propose the use of zero inflated inverse trinomial (ZIIT), zero inflated Poisson inverse Gaussian (ZIPIG) and zero inflated strict arcsine (ZISA) models in modeling overdispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that the three suggested models can serve as alternatives in modeling overdispersed medical count data, as supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian and strict arcsine distributions are discrete distributions with a cubic variance function of the mean. Therefore, ZIIT, ZIPIG and ZISA are able to accommodate data with excess zeros and very heavy tails, and they are recommended for modeling overdispersed medical count data when ZIP and ZINB are inadequate.
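
Of the models discussed, the ZIP baseline is readily available in mainstream software; the sketch below fits it with statsmodels on synthetic counts with injected zeros. ZIIT, ZIPIG and ZISA have no standard Python implementation and are therefore not shown.

```python
# Sketch of a zero-inflated Poisson fit (the ZIP baseline discussed above) using
# statsmodels on synthetic counts; the proposed ZIIT, ZIPIG and ZISA models are
# not available in mainstream Python packages.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(0)
n = 300
x = rng.normal(size=n)
X = sm.add_constant(x)
counts = rng.poisson(np.exp(0.3 + 0.6 * x))
counts[rng.random(n) < 0.3] = 0                      # inject excess zeros

zip_model = ZeroInflatedPoisson(counts, X, exog_infl=np.ones((n, 1)))
zip_res = zip_model.fit(maxiter=200, disp=False)
print("ZIP parameters:", np.round(zip_res.params, 3))
```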

Keywords: Zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit.

Downloads: 3315
8772 Distributed Manufacturing (DM) - Smart Units and Collaborative Processes

Authors: Hermann Kuehnle

Abstract:

Applications of the Hausdorff space and its mappings into tangent spaces are outlined, including their fractal dimensions and self-similarities. The paper details this theoretical setup and further describes the virtualization and atomization of manufacturing processes. It demonstrates novel concurrency principles that will guide manufacturing processes and resource configurations. Moreover, varying levels of detail may be produced by folding up and breaking down the newly introduced generic models. This choice of layered generic models for unit and system aspects allows research to proceed in parallel with other disciplines sharing the same focus, at all levels of detail. It also grants outside disciplines more credit and easier access for enriching the grounds of manufacturing. The specific mappings and layers give hints about opportunities for interdisciplinary outcomes and may highlight further details for interoperability standards, as already being worked on at the international level. New rules are described, which require additional properties of all involved entities for defining distributed decision cycles, again on the basis of self-similarity. All properties are further detailed and assigned to a maturity scale, eventually displaying the smartness maturity of an entire shopfloor or factory. The paper contributes to the intensive ongoing discussion in the field of intelligent distributed manufacturing and promotes solid concepts for implementing Cyber-Physical Systems and the Internet of Things in the manufacturing industry, in the sense of Industry 4.0 as discussed in German-speaking countries.

Keywords: Autonomous unit, Networkability, Smart manufacturing unit, Virtualization.

Downloads: 2074
8771 Survival of Neutrino Mass Models in Nonthermal Leptogenesis

Authors: Amal Kr Sarma, H Zeen Devi, N Nimai Singh

Abstract:

The constraints imposed by non-thermal leptogenesis on the survival of neutrino mass models describing the presently available neutrino mass patterns are studied numerically. We consider the Majorana CP-violating phases coming from the right-handed Majorana mass matrices to estimate the baryon asymmetry of the universe for different neutrino mass models, namely the quasi-degenerate, inverted hierarchical and normal hierarchical models with tribimaximal mixings. Considering two possible diagonal forms of the Dirac neutrino mass matrix, as either the charged lepton or the up-quark mass matrix, the heavy right-handed mass matrices are constructed from the light neutrino mass matrix. Only the normal hierarchical model leads to the best predictions of the baryon asymmetry of the universe, consistent with observations, in the non-thermal leptogenesis scenario.

Keywords: Thermal leptogenesis, Non-thermal leptogenesis.

Downloads: 1286
8770 Theoretical Analysis of Capacities in Dynamic Spatial Multiplexing MIMO Systems

Authors: Imen Sfaihi, Noureddine Hamdi

Abstract:

In this paper, we investigate techniques for scheduling users for resource allocation in multiple-input multiple-output (MIMO) packet transmission systems. In these systems, the transmit antennas are assigned to one user or dynamically to different users using spatial multiplexing. Allocating all transmit antennas to one user cannot take full advantage of multi-user diversity. Therefore, we consider the case in which resources are allocated dynamically. At each time slot, users have to feed back their channel information on an uplink feedback channel. The channel information considered available to the schedulers is the zero-forcing (ZF) post-detection signal to interference plus noise ratio. Our analysis concerns the round robin and the opportunistic schemes. In this paper, we present an overview and a complete capacity analysis of these schemes. The main result of our study is an analytical form of the system capacity using the ZF receiver at the user terminal. Simulations have been carried out to validate all proposed analytical solutions and to compare the performance of these schemes.
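
The ZF post-detection SNR that the schedulers rely on can be sketched as below for one Rayleigh channel realisation; the antenna counts, the SNR value and the power normalisation are assumptions, not the paper's simulation setup.

```python
# Sketch of the zero-forcing post-detection SNR fed back by the users above,
# for a single channel realisation; antenna counts and SNR are assumptions.
import numpy as np

def zf_post_detection_snr(H, snr_linear):
    """Per-stream post-detection SNR of a ZF receiver for channel H (Nr x Nt)."""
    # ZF noise enhancement is governed by the diagonal of (H^H H)^{-1}.
    gram_inv = np.linalg.inv(H.conj().T @ H)
    return snr_linear / np.real(np.diag(gram_inv))

rng = np.random.default_rng(0)
Nr, Nt = 4, 4
H = (rng.normal(size=(Nr, Nt)) + 1j * rng.normal(size=(Nr, Nt))) / np.sqrt(2)
snrs = zf_post_detection_snr(H, snr_linear=10.0)          # 10 dB per-stream SNR
print("per-stream ZF post-detection SNR (linear):", snrs.round(2))
print("sum rate (bits/s/Hz):", np.log2(1 + snrs).sum().round(2))
```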

Keywords: MIMO, scheduling, ZF receiver, spatial multiplexing, round robin scheduling, opportunistic.

Downloads: 1317
8769 On the Need to have an Additional Methodology for the Psychological Product Measurement and Evaluation

Authors: Corneliu Sofronie, Roxana Zubcov

Abstract:

Cognitive science appeared about 40 years ago, subsequent to the challenge of artificial intelligence, as a territory common to several scientific disciplines: IT, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The newly born science was justified by the complexity of the problems related to human knowledge on the one hand, and on the other by the fact that none of the above-mentioned sciences could alone explain mental phenomena. Based on the data supplied by the experimental sciences, such as psychology and neurology, models of the operation of the human mind are built in cognitive science. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence), namely cognitive systems, whose competences and performances are compared to human ones, leading to the reinterpretation of psychological and neurological data and to the construction of new models. In these processes, psychology provides the experimental basis, while philosophy and mathematics provide the level of abstraction utterly necessary for the mediation between the mentioned sciences. The general problematic of the cognitive approach offers two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculus operations on 1s and 0s, and the connectionist one, which considers the products of thinking to be the result of the interaction between all the composing (included) systems. In the field of psychology, measurements in the computational register use classical questionnaires and psychometric tests, generally based on calculus methods. Considering both sides of cognitive science, we can notice a gap in the possibilities for measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality-quantity whole. In such an approach, measurement by calculus proves to be inefficient. Our research, carried out for longer than 20 years, leads to the conclusion that measuring by forms properly fits the laws and principles of connectionism.

Keywords: complementary methodology, connection approach, networks without scaling, quantum psychology.

Downloads: 3670
8768 Accuracy of Autonomy Navigation of Unmanned Aircraft Systems through Imagery

Authors: Sidney A. Lima, Hermann J. H. Kux, Elcio H. Shiguemori

Abstract:

Unmanned Aircraft Systems (UAS) usually navigate using the Global Navigation Satellite System (GNSS) associated with an Inertial Navigation System (INS). However, GNSS accuracy can be degraded at any time, and the GNSS signal can even be switched off. In addition, there is the possibility of malicious interference, known as jamming. An image navigation system can therefore solve the autonomy problem: if GNSS is disabled or degraded, the image navigation system continues to provide coordinate information to the INS, allowing the autonomy of the system. This work aims to evaluate the accuracy of positioning through photogrammetry concepts. The methodology uses orthophotos and Digital Surface Models (DSM) as a reference to represent the object space, and photographs obtained during the flight to represent the image space. To calculate the coordinates of the perspective center and the camera attitudes, it is necessary to know the coordinates of homologous points in the object space (orthophoto coordinates and DSM altitude) and in the image space (column and line of the photograph). Thus, if the homologous points can be identified automatically in real time, the coordinates and attitudes can be calculated with their respective accuracies. With the methodology applied in this work, maximum errors in the order of 0.5 m in positioning and 0.6° in camera attitude were verified, so navigation through imagery can reach accuracies equal to or better than those of GNSS receivers without differential correction. Therefore, navigating through imagery is a good alternative for enabling autonomous navigation.
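
The spatial resection referred to above rests on the standard collinearity equations, reproduced here in their usual textbook form (not transcribed from the paper): image coordinates (x, y), focal length f, perspective centre (X₀, Y₀, Z₀), rotation matrix elements r_ij and a ground point (X, Y, Z) taken from the orthophoto and DSM.

```latex
% Collinearity equations of photogrammetric spatial resection (standard form,
% not transcribed from the paper):
\[
  x = -f\,\frac{r_{11}(X - X_0) + r_{12}(Y - Y_0) + r_{13}(Z - Z_0)}
               {r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)}, \qquad
  y = -f\,\frac{r_{21}(X - X_0) + r_{22}(Y - Y_0) + r_{23}(Z - Z_0)}
               {r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)}.
\]
```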

Keywords: Autonomy, navigation, security, photogrammetry, remote sensing, spatial resection, UAS.

Downloads: 1321
8767 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine

Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li

Abstract:

Machine learning and data mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique to predict qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, air quality classification has become one of the most important topics in air quality research and modelling. This study introduces a hybrid classification model based on information theory and the Support Vector Machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection has classified daily air quality into six levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good and Excellent, based on the respective Air Quality Index (AQI) values. Using information theory, the information gain (IG) is calculated and feature selection is performed for both categorical features and continuous numeric features. Then the SVM machine learning algorithm is applied to the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than the SVM (alone), Artificial Neural Network (ANN) and K-Nearest Neighbours (KNN) models in terms of accuracy as well as complexity.
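
A minimal sketch of the hybrid idea above, using mutual information as the information-gain style feature score ahead of an RBF SVM, is shown below; the features, class levels and data are placeholders, not the AQI records used in the study.

```python
# Sketch of the hybrid pipeline described above: information-gain style feature
# scoring (mutual information) followed by an SVM with cross-validation. The
# feature set and quality levels are placeholders, not the authors' AQI data.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(800, 12))                        # e.g. PM2.5, PM10, SO2, NO2, CO, O3, ...
y = np.digitize(X[:, 0] + 0.5 * X[:, 1], bins=[-1, 0, 1])   # 4 synthetic quality levels

model = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=6),            # keep the 6 most informative features
    SVC(kernel="rbf"),
)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))
```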

Keywords: Machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation.

Downloads: 948
8766 Models to Customise Web Service Discovery Result using Static and Dynamic Parameters

Authors: Kee-Leong Tan, Cheng-Suan Lee, Hui-Na Chua

Abstract:

This paper presents three models which enable the customisation of Universal Description, Discovery and Integration (UDDI) query results based on pre-defined and/or real-time changing parameters. The proposed models detail the requirements, design and techniques which make ranking of Web service discovery results from a service registry possible. Our contribution is twofold. First, we present an extension to the UDDI inquiry capabilities, which enables a private UDDI registry owner to customise or rank the query results based on its business requirements. Second, our proposal utilises existing technologies and standards, requiring minimal changes to existing UDDI interfaces or data structures. We believe these models will serve as a valuable reference for enhancing the service discovery methods within a private UDDI registry environment.

Keywords: Web service, discovery, semantic, SOA, registry, UDDI.

Downloads: 1486
8765 Predicting DHF Incidence in Northern Thailand using Time Series Analysis Technique

Authors: S. Wongkoon, M. Pollar, M. Jaroensutasinee, K. Jaroensutasinee

Abstract:

This study aimed to develop a forecasting model for the number of Dengue Haemorrhagic Fever (DHF) incidences in Northern Thailand using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on data collected between 2003 and 2006 and then validated the models using data collected between January and September 2007. The results showed that the regressive forecast curves were consistent with the pattern of the actual values. The most suitable model was the SARIMA(2,0,1)(0,2,0)12 model, with an Akaike Information Criterion (AIC) of 12.2931 and a Mean Absolute Percent Error (MAPE) of 8.91713. The SARIMA(2,0,1)(0,2,0)12 model fit was adequate for the data, with a Portmanteau statistic Q(20) = 8.98644 against a χ² critical value of 27.5871 (P > 0.05). This indicated that there was no significant autocorrelation between residuals at different lag times in the SARIMA(2,0,1)(0,2,0)12 model.
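
Fitting the reported SARIMA(2,0,1)(0,2,0)12 specification can be sketched with statsmodels as below; the monthly series used here is random placeholder data, not the Northern Thailand DHF surveillance counts.

```python
# Sketch of fitting the SARIMA(2,0,1)(0,2,0)12 specification reported above with
# statsmodels; the monthly series is a random placeholder, not the DHF data.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
y = rng.poisson(lam=50, size=60).astype(float)        # 5 years of monthly counts (placeholder)

model = SARIMAX(y, order=(2, 0, 1), seasonal_order=(0, 2, 0, 12))
res = model.fit(disp=False)
print("AIC:", round(res.aic, 3))
print(res.forecast(steps=9).round(1))                 # e.g. January-September of the next year
```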

Keywords: Dengue, SARIMA, Time Series Analysis, Northern Thailand.

Downloads: 1990
8764 A Gnutella-based P2P System Using Cross-Layer Design for MANET

Authors: Ho-Hyun Park, Woosik Kim, Miae Woo

Abstract:

It is expected that the ubiquitous era will come soon. A ubiquitous environment has features of both peer-to-peer and nomadic environments. Such features can be represented by peer-to-peer systems and mobile ad-hoc networks (MANETs). The features of P2P systems and MANETs are similar, which makes it appealing to implement P2P systems in a MANET environment. It has been shown, however, that P2P systems designed for wired networks do not perform satisfactorily in a mobile ad-hoc environment. Consequently, this paper proposes a method to improve P2P performance using cross-layer design and the goodness of a node as a peer. The proposed method uses a routing metric as well as a P2P metric to choose favorable peers to connect to. It also utilizes a proactive approach for distributing peer information. According to the simulation results, the proposed method provides a higher query success rate, a shorter query response time and less energy consumption by constructing an efficient overlay network.

Keywords: Ad-hoc Networks, Cross-layer, Peer-to-Peer, Performance Analysis.

Downloads: 1671
8763 Modelling Export Dynamics in the CSEE Countries Using GVAR Model

Authors: S. Jakšić, B. Žmuk

Abstract:

The paper investigates the key factors of export dynamics for a set of Central and Southeast European (CSEE) countries in the context of the current economic and financial crisis. In order to model the export dynamics, a Global Vector Autoregressive (GVAR) model is defined. As opposed to models which treat each country separately, the GVAR combines all country models in a global model, which makes it possible to obtain important information on spillover effects in the context of globalisation and rising international linkages. The results of the study indicate that, for most of the CSEE countries, exports are mainly driven by domestic shocks, both in the short run and in the long run. This study is the first application of the GVAR model to the export dynamics of the CSEE countries, and the results therefore present an important empirical contribution.

Keywords: Export, GFEVD, Global VAR, International trade, weak exogeneity.

Downloads: 2492
8762 Lumped Parameter Models for Numerical Simulation of the Dynamic Response of Hoisting Appliances

Authors: Giovanni Incerti, Luigi Solazzi, Candida Petrogalli

Abstract:

This paper describes three lumped parameter models for the study of the dynamic behavior of a boom crane. The models proposed here allow evaluation of the fluctuations of the load arising from the elasticity of the rope and structure and from the type of motion command imposed by the winch. Calculation software was developed in order to determine the actual acceleration of the lifted mass and the dynamic overload during the lifting phase. Some application examples are presented, with the aim of showing the correlation between the magnitude of the stress and the type of motion command employed.
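
In the spirit of the lumped parameter models described above, the sketch below integrates a single-degree-of-freedom load-on-elastic-rope model driven by a ramped winch command and reports the dynamic overload factor; the paper's three models are richer, and every numerical value here is an illustrative assumption.

```python
# A minimal single-degree-of-freedom lumped-parameter sketch in the spirit of
# the models described above: a lifted mass hanging on an elastic rope whose
# upper end follows the winch motion command. All numbers are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

m = 1000.0          # lifted mass (kg)
k = 2.0e5           # rope stiffness (N/m)
c = 800.0           # rope damping (N*s/m)
g = 9.81

def winch_displacement(t, ramp=1.0, v_max=0.5):
    """Rope payout imposed by the winch: speed ramps to v_max over `ramp` seconds."""
    if t < ramp:
        return 0.5 * (v_max / ramp) * t ** 2
    return 0.5 * v_max * ramp + v_max * (t - ramp)

def winch_velocity(t, ramp=1.0, v_max=0.5):
    return v_max * t / ramp if t < ramp else v_max

def rhs(t, state):
    x, v = state                                  # load position and velocity (upwards)
    u, du = winch_displacement(t), winch_velocity(t)
    rope_force = k * (u - x) + c * (du - v)       # rope tension from elasticity and damping
    return [v, (rope_force - m * g) / m]

# Start from static equilibrium: rope stretched by m*g/k.
sol = solve_ivp(rhs, (0.0, 6.0), [-m * g / k, 0.0], max_step=0.005)
u = np.array([winch_displacement(t) for t in sol.t])
du = np.array([winch_velocity(t) for t in sol.t])
tension = k * (u - sol.y[0]) + c * (du - sol.y[1])
print("dynamic overload factor:", round(tension.max() / (m * g), 3))
```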

Keywords: Crane, dynamic model, overloading condition, vibration.

Downloads: 1964
8761 Performance Analysis in 5th Generation Massive Multiple-Input-Multiple-Output Systems

Authors: Jihad S. Daba, Jean-Pierre Dubois, Georges El Soury

Abstract:

Fifth generation wireless networks guarantee significant capacity enhancement to serve more clients and services at higher information rates with better reliability while consuming less power. The deployment of massive multiple-input multiple-output (MIMO) technology guarantees broadband wireless networks through the use of base station antenna arrays to serve a large number of users on the same frequency and time-slot channels. In this work, we evaluate the performance of massive MIMO systems in 5th generation cellular networks in terms of capacity and bit error rate. Several cases were considered and analyzed to compare the performance of massive MIMO systems while varying the number of antennas at both the transmitting and receiving ends. We found that, unlike in classical MIMO systems, reducing the number of transmit antennas while increasing the number of antennas at the receiver end provides a better solution for performance enhancement. In addition, enhanced orthogonal frequency division multiplexing and beam division multiple access schemes further improve the performance of massive MIMO systems and make them more reliable.
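
The capacity comparison described above can be sketched as a Monte-Carlo estimate of the ergodic MIMO capacity C = E[log2 det(I + (SNR/Nt) H Hᴴ)] under an i.i.d. Rayleigh channel assumption; the antenna configurations and SNR below are illustrative, not the paper's simulation parameters.

```python
# Sketch of an ergodic-capacity comparison for different antenna counts under
# an i.i.d. Rayleigh channel assumption; SNR and configurations are assumptions,
# not the parameters used in the paper.
import numpy as np

def ergodic_capacity(n_tx, n_rx, snr_db=10.0, trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    caps = np.empty(trials)
    for i in range(trials):
        H = (rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))) / np.sqrt(2)
        A = np.eye(n_rx) + (snr / n_tx) * (H @ H.conj().T)   # I + (SNR/Nt) H H^H
        caps[i] = np.log2(np.linalg.det(A).real)
    return caps.mean()

# Fewer transmit antennas with more receive antennas vs. the symmetric case.
for n_tx, n_rx in [(4, 4), (2, 8), (8, 2)]:
    print(f"Nt={n_tx}, Nr={n_rx}: {ergodic_capacity(n_tx, n_rx):.2f} bit/s/Hz")
```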

Keywords: Beam division multiple access, D2D communication, enhanced OFDM, fifth generation broadband, massive MIMO.

Downloads: 748