Search results for: hybrid approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5594

3854 Comparison of Conventional and "ECO" Transportation Pavements in Cyprus Using Life Cycle Approach

Authors: Constantia Achilleos, Diofantos G. Hadjimitsis

Abstract:

The road industry has taken up the challenge of eco-construction, and pavements can fit within the framework of sustainable development. Research therefore assesses the environmental impacts of conventional pavements using a life cycle approach. To meet global, and often national, pollution control targets, newly introduced pavement designs are under study. This is the case of the Cyprus demonstration, carried out within the EcoLanes project. The alternative pavement differs in its concrete layer, which is reinforced with a tire recycling product: processing post-consumer tires produces steel fibers that improve resistance to cracking. Maintenance works are therefore limited in comparison to a flexible pavement, making the design more eco-friendly according to the outputs of the current study. More specifically, the life cycle processes of the proposed concrete pavement emit 15% less air pollutants and consume 28% less embodied energy than those of the asphalt pavement, and costs are also reduced by 0.06%.

Keywords: Environmental impact assessment, life cycle, tire recycling, transportation pavement.

3853 Class Outliers Mining: Distance-Based Approach

Authors: Nabil M. Hewahi, Motaz K. Saad

Abstract:

In large datasets, identifying exceptional or rare cases with respect to a group of similar cases is considered a very significant problem. The traditional problem (outlier mining) is to find exceptional or rare cases in a dataset irrespective of their class labels; such cases are rare with respect to the whole dataset. In this research, we pose the problem of class outliers mining and propose a method to find such outliers. The general definition of this problem is "given a set of observations with class labels, find those that arouse suspicions, taking into account the class labels". We introduce a novel definition of outlier, the class outlier, and propose the Class Outlier Factor (COF), which measures the degree to which a data object is a class outlier. Our work includes a new algorithm for mining class outliers, experimental results on real-world datasets from various domains, and a comparison study with other related methods.
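
The abstract does not spell out the COF formula, so the following Python sketch only illustrates the general distance-based idea: score each observation by how strongly its nearest neighbours disagree with its own class label. The function name and the scoring heuristic are assumptions for illustration, not the authors' definition.

```python
import numpy as np

def class_outlier_scores(X, y, k=5):
    """Illustrative distance-based class outlier score (not the paper's exact COF).

    For each point, combine (a) the fraction of its k nearest neighbours carrying a
    different class label with (b) its mean distance to same-class points; higher
    scores suggest a stronger class outlier.
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n = len(X)
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    scores = np.empty(n)
    for i in range(n):
        order = np.argsort(dists[i])[1:k + 1]            # k nearest neighbours (self excluded)
        disagree = np.mean(y[order] != y[i])              # neighbours with a different label
        same = dists[i][y == y[i]]
        same_mean = same[same > 0].mean() if np.any(same > 0) else 0.0
        scores[i] = disagree + same_mean / (dists[i].max() + 1e-12)
    return scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
    y = np.array([0] * 20 + [1] * 20)
    y[3] = 1                                              # mislabel one point: a class outlier
    print(np.argsort(class_outlier_scores(X, y))[-3:])    # indices of the most suspicious cases
```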

Keywords: Class Outliers, Distance-Based Approach, Outliers Mining.

3852 Dynamic Process Monitoring of an Ammonia Synthesis Fixed-Bed Reactor

Authors: Bothinah Altaf, Gary Montague, Elaine B. Martin

Abstract:

This study involves the modeling and monitoring of an ammonia synthesis fixed-bed reactor using partial least squares (PLS) and its variants. The process exhibits complex dynamic behavior due to the presence of heat recycling and feed quench. One limitation of a static PLS model in this situation is that it does not take account of the process dynamics, hence dynamic PLS was used. Although dynamic PLS showed superior performance to static PLS in terms of prediction, the monitoring scheme was inappropriate, so adaptive PLS was considered. A limitation of adaptive PLS is that non-conforming observations also contribute to the model; therefore, a new adaptive approach was developed, robust adaptive dynamic PLS. This approach updates a dynamic PLS model and is robust to non-representative data. The developed methodology showed a clear improvement over existing approaches in terms of the modeling of the reactor and the detection of faults.
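
A minimal sketch of window-based adaptive PLS monitoring is shown below, assuming process data arrive as (X, y) samples. It simply refits scikit-learn's PLSRegression on a moving window; the authors' robust adaptive dynamic PLS additionally down-weights non-representative samples and includes lagged (dynamic) inputs, which are omitted here. The function name and synthetic data are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def adaptive_pls_monitor(X, y, window=100, n_components=3):
    preds = np.full(len(y), np.nan)
    model = PLSRegression(n_components=n_components)
    for t in range(window, len(y)):
        model.fit(X[t - window:t], y[t - window:t])      # refit on the most recent window
        preds[t] = model.predict(X[t:t + 1]).ravel()[0]
    residuals = y - preds                                 # large residuals flag potential faults
    return preds, residuals

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0, 0.0]) + 0.1 * rng.normal(size=500)
preds, residuals = adaptive_pls_monitor(X, y)
print(np.nanmean(np.abs(residuals)))                      # average one-step prediction error
```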

Keywords: Ammonia synthesis fixed-bed reactor, dynamic partial least squares modeling, recursive partial least squares, robust modeling.

3851 Knowledge Representation and Inconsistency Reasoning of Class Diagram Maintenance in Big Data

Authors: Chi-Lun Liu

Abstract:

Requirements modeling and analysis are important to the successful maintenance of information systems. Unified Modeling Language (UML) class diagrams are a useful standard for modeling information systems. To the best of our knowledge, there is no systems development methodology described by the organism metaphor, whose core concept is adaptation. Knowledge representation and reasoning approaches that use ontologies to adopt new requirements have emerged in recent years. This paper proposes an organic methodology based on constructivism theory: a knowledge representation and reasoning approach for analyzing new requirements during class diagram maintenance. The process and rules in the proposed methodology automatically analyze inconsistencies in the class diagram. In the big data era, developing an automatic tool based on the proposed methodology to analyze large amounts of class diagram data is an important topic for future research.
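
The abstract does not detail the ontology or the rules, so the sketch below only illustrates rule-based inconsistency checking over class diagram facts represented as simple triples. The two example rules (duplicate attribute, cyclic inheritance) are hypothetical stand-ins, not the paper's rule set.

```python
from collections import defaultdict

def check_inconsistencies(facts):
    """facts: iterable of (subject, predicate, object) triples describing a class diagram."""
    issues, attrs, parents = [], defaultdict(set), defaultdict(set)
    for s, p, o in facts:
        if p == "hasAttribute":
            if o in attrs[s]:
                issues.append(f"duplicate attribute '{o}' in class '{s}'")
            attrs[s].add(o)
        elif p == "inheritsFrom":
            parents[s].add(o)

    def reaches(a, b, seen=()):                 # detect cyclic inheritance by walking parents
        return b in parents[a] or any(reaches(p, b, seen + (a,)) for p in parents[a] if p not in seen)

    for c in list(parents):
        if reaches(c, c):
            issues.append(f"cyclic inheritance involving class '{c}'")
    return issues

print(check_inconsistencies([("Order", "hasAttribute", "id"),
                             ("Order", "hasAttribute", "id"),
                             ("A", "inheritsFrom", "B"),
                             ("B", "inheritsFrom", "A")]))
```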

Keywords: Knowledge representation, reasoning, ontology, class diagram, software engineering.

3850 The Integrated Management of Health Care Strategies and Differential Diagnosis by Expert System Technology: A Single-Dimensional Approach

Authors: A. B. Adehor, P. R. Burrell

Abstract:

The Integrated Management of Child Illnesses (IMCI) and the surveillance Health Information Systems (HIS) are related strategies designed to manage child illnesses and community practices of diseases. However, the two strategies do not function well together because of classification incompatibilities and, as such, are difficult to use by health care personnel in rural areas, where a majority of people lack the basic knowledge needed to interpret disease classifications from these methods. This paper discusses a single approach in which a stand-alone expert system is used as a prompt diagnostic tool for all cases of illness presented. The system combines the action-oriented IMCI and the disease-oriented HIS approaches to diagnose malaria and typhoid fever in the rural areas of the Niger Delta region.
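
A minimal sketch of the kind of stand-alone rule-based expert system the paper describes is given below. The rule base is purely illustrative: the symptom sets are placeholders, not clinical guidance and not the authors' IMCI/HIS rules.

```python
RULES = {
    "suspected malaria": {"fever", "chills"},                 # placeholder rule, not clinical guidance
    "suspected typhoid fever": {"fever", "abdominal pain"},   # placeholder rule, not clinical guidance
}

def differential_diagnosis(observed_symptoms):
    observed = set(observed_symptoms)
    # Rank candidate conditions by the fraction of each rule's symptoms that were observed.
    matches = {cond: len(req & observed) / len(req) for cond, req in RULES.items()}
    return sorted(matches.items(), key=lambda kv: kv[1], reverse=True)

print(differential_diagnosis(["fever", "chills"]))
```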

Keywords: Differential diagnosis, Health Information System (HIS), Integrated Management of Child Illnesses (IMCI), Malaria and Typhoid fever.

3849 Stereo Motion Tracking

Authors: Yudhajit Datta, Jonathan Bandi, Ankit Sethia, Hamsi Iyer

Abstract:

Motion tracking and stereo vision are complicated, albeit well-understood, problems in computer vision. Existing software that combines the two approaches to perform stereo motion tracking typically employs complicated and computationally expensive procedures. The purpose of this study is to create a simple and effective solution capable of combining the two approaches. The study explores a strategy that combines two-dimensional motion tracking using a Kalman filter with depth detection of the object using stereo vision. In conventional approaches, objects in the scene of interest are observed using a single camera; for stereo motion tracking, the scene is observed using video feeds from two calibrated cameras. Using two simultaneous measurements from the two cameras, the depth of the object from the plane containing the cameras is calculated. The approach attempts to capture the entire three-dimensional spatial information of each object in the scene and represent it through a software estimator object. At discrete intervals, the estimator tracks object motion in the plane parallel to the plane containing the cameras and updates the perpendicular distance of the object from that plane as its depth. The ability to efficiently track the motion of objects in three-dimensional space using a simplified approach could prove to be an indispensable tool in a variety of surveillance scenarios, from high-security scenes such as bank vaults, prisons, or other detention facilities to low-cost applications in supermarkets and car parking lots.
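
A minimal sketch of the two ingredients described above follows: depth from the disparity between two calibrated (rectified) cameras, and a constant-velocity Kalman filter for in-plane tracking. The focal length, baseline, and noise values are illustrative assumptions, not the study's calibration.

```python
import numpy as np

def depth_from_disparity(x_left, x_right, focal_px=800.0, baseline_m=0.12):
    """Z = f * B / d for rectified cameras; disparity d is in pixels."""
    disparity = x_left - x_right
    return focal_px * baseline_m / disparity

class ConstantVelocityKF:
    def __init__(self, dt=1.0, q=1e-2, r=1.0):
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1.0]])
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0.0]])
        self.Q, self.R = q * np.eye(4), r * np.eye(2)
        self.x, self.P = np.zeros(4), np.eye(4)

    def step(self, z):                           # predict, then correct with measurement z = (u, v)
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                        # filtered in-plane position

kf = ConstantVelocityKF()
print(kf.step((100, 50)), depth_from_disparity(320.0, 300.0))   # tracked position and depth [m]
```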

Keywords: Kalman Filter, Stereo Vision, Motion Tracking, Matlab, Object Tracking, Camera Calibration, Computer Vision System Toolbox.

3848 Modal Propagation Properties of Elliptical Core Optical Fibers Considering Stress-Optic Effects

Authors: M. Shah Alam, Sarkar Rahat M. Anwar

Abstract:

The effect of thermally induced stress on the modal properties of highly elliptical core optical fibers is studied in this work using a finite element method. The stress analysis is carried out and the anisotropic refractive index change is calculated using both the conventional plane strain approximation and the generalized plane strain approach. After the stress-optic effect is taken into account, the modal analysis of the fiber is performed to obtain the solutions of the fundamental and higher order modes. The modal effective index, modal birefringence, group effective index, group birefringence, and dispersion of the different modes of the fiber are presented. The propagation properties are found to depend strongly on the stress analysis approach used.

Keywords: Birefringence, dispersion, elliptical core fiber, optical mode analysis, stress-optic effect, stress analysis.

3847 Face Image Coding Using Face Prototyping

Authors: Jaroslav Polec, Lenka Krulikovská, Natália Helešová, Tomáš Hirner

Abstract:

In this paper, we present a novel approach to face image coding. The proposed method makes use of features of video encoders such as motion prediction. First, the encoder selects an appropriate prototype from the database and warps it according to the features of the face being encoded. The warped prototype is placed as the first frame (an I frame), and the face to be encoded is placed as the second frame (a P frame). Information about feature positions, color change, the selected prototype, and the data flow of the P frame is sent to the decoder; the condition is that both encoder and decoder hold the same database of prototypes. We ran experiments with an H.264 video encoder and compared the results to those achieved by JPEG and JPEG2000. The results show that our approach achieves about three times lower bitrate and twice the PSNR compared with JPEG; compared with JPEG2000 the bitrate is very similar, but the subjective quality achieved by the proposed method is better.

Keywords: Triangulation, H.264, Model-based coding, Average face

3846 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models

Authors: Rossella Arcucci, Luisa D’Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti

Abstract:

This work is the first building block in a rather wide research activity, in collaboration with the Euro Mediterranean Center for Climate Changes, aimed at introducing scalable approaches into ocean circulation models. We discuss the design and implementation of a parallel algorithm for solving the variational data assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model previously proposed by the authors, which uses a domain decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
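
For context, a minimal serial sketch of the standard 3DVar cost function that such schemes minimize is given below; the paper's actual contribution (the domain decomposition and the CUDA port) is not reproduced. Matrix sizes and values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def make_threedvar_cost(xb, B, H, y, R):
    """J(x) = 0.5 (x - xb)^T B^-1 (x - xb) + 0.5 (Hx - y)^T R^-1 (Hx - y)."""
    Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
    def J(x):
        dx, dy = x - xb, H @ x - y
        return 0.5 * dx @ Binv @ dx + 0.5 * dy @ Rinv @ dy
    return J

n, m = 8, 4
rng = np.random.default_rng(1)
xb = rng.normal(size=n)                         # background state
B = np.eye(n)                                   # background error covariance
H = rng.normal(size=(m, n))                     # observation operator
y = H @ (xb + 0.1 * rng.normal(size=n))         # synthetic observations
R = 0.01 * np.eye(m)                            # observation error covariance
xa = minimize(make_threedvar_cost(xb, B, H, y, R), xb).x   # analysis state
print(np.round(xa - xb, 3))                      # analysis increment
```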

Keywords: Data Assimilation, Parallel Algorithm, GPU architectures, Ocean Models.

3845 A New Fast Skin Color Detection Technique

Authors: Tarek M. Mahmoud

Abstract:

Skin color can provide a useful and robust cue for human-related image analysis, such as face detection, pornographic image filtering, hand detection and tracking, and people retrieval in databases and on the Internet. The major problem with such skin color detection algorithms is that they are time consuming and hence cannot be applied in a real-time system. To overcome this problem, we introduce a new fast technique for skin detection that can be applied in a real-time system. In this technique, instead of testing every image pixel to label it as skin or non-skin (as in classic techniques), we skip a set of pixels. The rationale for skipping is the high probability that neighbors of skin color pixels are also skin pixels, especially in adult images, and vice versa. The proposed method can rapidly detect skin and non-skin color pixels, which in turn dramatically reduces the CPU time required for the protection process. Since many fast detection techniques are based on image resizing, we apply the proposed pixel skipping technique together with image resizing to obtain better results. The performance evaluation of the proposed skipping and hybrid techniques in terms of the measured CPU time is presented. Experimental results demonstrate that the proposed methods achieve better results than the relevant classic method.
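
A minimal sketch of skin detection with pixel skipping in YCbCr space follows. The Cb/Cr thresholds are commonly used literature values and the skip stride is arbitrary; the paper's exact thresholds and skipping rule are not given in the abstract.

```python
import numpy as np

def skin_mask_with_skipping(rgb, step=2):
    """rgb: H x W x 3 uint8 array. Classify every `step`-th pixel, then fill the gaps."""
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b          # BT.601 RGB -> Cb
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b          # BT.601 RGB -> Cr
    sub = (cb[::step, ::step] >= 77) & (cb[::step, ::step] <= 127) & \
          (cr[::step, ::step] >= 133) & (cr[::step, ::step] <= 173)
    # Propagate each tested pixel's label to its skipped neighbours (nearest-neighbour fill).
    mask = np.repeat(np.repeat(sub, step, axis=0), step, axis=1)
    return mask[:rgb.shape[0], :rgb.shape[1]]

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
print(skin_mask_with_skipping(img).mean())                     # fraction of pixels labeled as skin
```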

Keywords: Adult images filtering, image resizing, skin color detection, YCbCr color space.

3844 Novel Hybrid Method for Gene Selection and Cancer Prediction

Authors: Liping Jing, Michael K. Ng, Tieyong Zeng

Abstract:

Microarray data profile gene expression on a whole-genome scale and therefore provide a good way to study associations between gene expression and the occurrence or progression of cancer. More and more researchers have realized that microarray data are helpful for predicting cancer samples. However, the dimension of the gene expression data is much larger than the sample size, which makes this task very difficult. Identifying the significant genes causing cancer has therefore become an urgent, hot, and hard research topic. Many feature selection algorithms have been proposed in the past, focusing on improving cancer predictive accuracy at the expense of ignoring the correlations between features. In this work, a novel framework (named SGS) is presented for stable gene selection and efficient cancer prediction. The proposed framework first performs a clustering algorithm to find gene groups in which genes have high correlation coefficients, then selects the significant genes in each group with the Bayesian Lasso and important gene groups with the group Lasso, and finally builds a prediction model on the shrunken gene space with an efficient classification algorithm (such as SVM, 1NN, or regression). Experimental results on real-world data show that the proposed framework often outperforms existing feature selection and prediction methods such as SAM, IG, and Lasso-type prediction models.
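
A minimal sketch of an SGS-style pipeline is given below: cluster correlated genes, keep one representative per group, then classify. The correlation-based hierarchical clustering and the simple F-test selection are stand-ins for the paper's Bayesian Lasso / group Lasso steps, which are not reproduced here; all data are synthetic.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform
from sklearn.feature_selection import f_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def sgs_like_pipeline(X, y, n_groups=20):
    corr = np.corrcoef(X, rowvar=False)                               # gene-gene correlation
    Z = linkage(squareform(1 - np.abs(corr), checks=False), "average")  # cluster by correlation distance
    groups = fcluster(Z, t=n_groups, criterion="maxclust")
    F, _ = f_classif(X, y)
    keep = [np.where(groups == g)[0][np.argmax(F[groups == g])]       # most discriminative gene per group
            for g in np.unique(groups)]
    return cross_val_score(SVC(kernel="linear"), X[:, keep], y, cv=5).mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))
y = np.r_[np.zeros(30), np.ones(30)].astype(int)
X[:, :5] += y[:, None]                                                # make a few genes informative
print(round(sgs_like_pipeline(X, y), 3))                              # cross-validated accuracy
```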

Keywords: Gene Selection, Cancer Prediction, Lasso, Clustering, Classification.

3843 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data

Authors: Benjamin D. Leiby, Darryl K. Ahner

Abstract:

This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country conflict dataset motivates the search to impute missing values at levels well over the common threshold of 20% missingness. The methodology capitalizes on correlation while using model residuals to provide the uncertainty in estimating unknown values. Examination of the methodology provides insight toward choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit residuals to fit each data element. The evaluation of the methodology includes observing computation time, assessing model fit, and comparing known values to values replaced through imputation. Overall, the country conflict dataset shows promise with modeling first-order interactions, while indicating a need for further refinement that mimics predictive mean matching.
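
The core idea of stochastic regression imputation can be sketched briefly: predict a missing entry from correlated columns, then add a randomly drawn model residual so imputed values carry uncertainty. The column choices and linear model below are illustrative assumptions; the paper additionally tailors tolerances and considers nonlinear terms.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def stochastic_impute(X, target_col, rng=np.random.default_rng(0)):
    X = X.copy()
    miss = np.isnan(X[:, target_col])
    predictors = [c for c in range(X.shape[1]) if c != target_col]
    complete = ~miss & ~np.isnan(X[:, predictors]).any(axis=1)
    model = LinearRegression().fit(X[np.ix_(complete, predictors)], X[complete, target_col])
    resid = X[complete, target_col] - model.predict(X[np.ix_(complete, predictors)])
    fill = miss & ~np.isnan(X[:, predictors]).any(axis=1)
    X[fill, target_col] = (model.predict(X[np.ix_(fill, predictors)])
                           + rng.choice(resid, size=fill.sum()))   # add sampled residuals for uncertainty
    return X

rng = np.random.default_rng(1)
data = rng.normal(size=(200, 3))
data[:, 2] = 2 * data[:, 0] - data[:, 1] + 0.1 * rng.normal(size=200)
data[rng.random(200) < 0.3, 2] = np.nan                  # roughly 30% missingness in one column
print(np.isnan(stochastic_impute(data, 2)).sum())         # remaining missing values: 0
```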

Keywords: Correlation, country conflict, imputation, stochastic regression.

3842 A Genetic Algorithm Based Classification Approach for Finding Fault Prone Classes

Authors: Parvinder S. Sandhu, Satish Kumar Dhiman, Anmol Goyal

Abstract:

Fault-proneness of a software module is the probability that the module contains faults. A correlation exists between the fault-proneness of software and measurable attributes of the code (static metrics) and of the testing (dynamic metrics). Early detection of fault-prone software components enables verification experts to concentrate their time and resources on the problem areas of the system under development. This paper introduces genetic algorithm based software fault prediction models with object-oriented metrics. The contribution of this paper is that metric values of the JEdit open source software are used to generate rules for classifying software modules as faulty or non-faulty, and the rules are then validated empirically. The results show that the genetic algorithm approach can be used for finding fault-proneness in object-oriented software components.

Keywords: Genetic Algorithms, Software Fault, Classification, Object Oriented Metrics.

3841 A Relational Case-Based Reasoning Framework for Project Delivery System Selection

Authors: Yang Cui, Yong Qiang Chen

Abstract:

An appropriate project delivery system (PDS) is crucial to the success of a construction project. Case-based reasoning (CBR) is a useful support for PDS selection. However, the traditional CBR approach represents cases as attribute-value vectors without taking relations among attributes into consideration and cannot calculate the similarity when the structures of cases are not strictly the same. Therefore, this paper solves this problem by adopting the relational case-based reasoning (RCBR) approach for PDS selection, considering both structural similarity and feature similarity. To develop the feature terms of construction projects, the criteria and factors governing the PDS selection process are first identified. Then the feature terms for construction projects are developed. Finally, the mechanism of similarity calculation and a case study show how RCBR works for PDS selection. The adoption of RCBR in PDS selection expands the scope of application of the traditional CBR method and improves the accuracy of the PDS selection system.

Keywords: Relational Case-based Reasoning, Case-based Reasoning, Project delivery system, Selection.

3840 One-Class Support Vector Machines for Aerial Images Segmentation

Authors: Chih-Hung Wu, Chih-Chin Lai, Chun-Yen Chen, Yan-He Chen

Abstract:

Interpretation of aerial images is an important task in various applications. Image segmentation can be viewed as the essential step for extracting information from aerial images. Among the many segmentation methods developed, the technique of clustering has been extensively investigated and used. However, determining the number of clusters in an image is inherently a difficult problem, especially when a priori information on the aerial image is unavailable. This study proposes a support vector machine approach for clustering aerial images. Three cluster validity indices, the distance-based index, the Davies-Bouldin index, and the Xie-Beni index, are utilized as quantitative measures of the quality of the clustering results. Comparisons of the effectiveness of these indices and of various parameter settings for the proposed method are conducted. Experimental results are provided to illustrate the feasibility of the proposed approach.
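
A minimal sketch of scoring a one-class-SVM segmentation of pixel features with the Davies-Bouldin index (one of the three validity indices named above) follows. The synthetic "image", the feature choice, and the nu values are illustrative assumptions only.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import davies_bouldin_score

rng = np.random.default_rng(0)
# Fake aerial-image pixel features: most pixels from one land-cover class, some from another.
features = np.vstack([rng.normal(0.2, 0.05, (800, 3)), rng.normal(0.7, 0.05, (200, 3))])

best = None
for nu in (0.1, 0.2, 0.3):                                   # compare parameter settings
    labels = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit_predict(features)   # +1 / -1 segments
    if len(set(labels)) > 1:                                  # the index needs at least two segments
        score = davies_bouldin_score(features, labels)        # lower is better
        best = min(best, (score, nu)) if best else (score, nu)
print("best (Davies-Bouldin, nu):", best)
```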

Keywords: Aerial imaging, image segmentation, machine learning, support vector machine, cluster validity index

3839 Simulation of Thin Film Relaxation by Buried Misfit Networks

Authors: A. Derardja

Abstract:

The present work is motivated by the idea that the layer deformation in anisotropic elasticity can be estimated from the theory of interfacial dislocations. In effect, this work, which is an extension of a previous approach given by one of the authors, determines the anisotropic displacement fields and the critical thickness due to a complex biperiodic network of misfit dislocations (MDs) lying just below the free surface, in view of the arrangement of the dislocations. The elastic fields of such arrangements observed along interfaces play a crucial part in the improvement of the physical properties of epitaxial systems. New results are proposed in anisotropic elasticity for hexagonal networks of MDs that contain intrinsic and extrinsic stacking faults. Using a previous approach based on the relative interfacial displacement and a Fourier series formulation of the displacement fields, we develop the expressions of the elastic fields when a dissociation of the MDs is possible. Numerical investigations for the observed system Si/(111)Si with low twist angles clearly show the effect of the anisotropy and the thickness when the misfit networks are dissociated.

Keywords: Angular misfit, dislocation networks, plane interfaces, stacking faults.

3838 The Influencing Factors and the Approach to Enhance the Standard of E-Commerce for Small and Medium Enterprises in Bangkok

Authors: Wanida Suwunniponth

Abstract:

The objectives of this research paper were to study the influencing factors that contribute to the success of electronic commerce (e-commerce) and to study the approach to enhance the standard of e-commerce for small and medium enterprises (SMEs). The research focused only on sole proprietorship SMEs in Bangkok, Thailand. The factors contributing to the success of SMEs included business management, learning in the organization, business collaboration, and the quality of the website. A mixed quantitative and qualitative research methodology was used. In terms of the quantitative method, a questionnaire was used to collect data from 251 sole proprietorships, and the structural equation model (SEM) was utilized as the tool for data analysis. In terms of the qualitative method, an in-depth interview, a dialogue with experts in the field of e-commerce for SMEs, and content analysis were used. Using the adjusted causal relationship structure model, it was revealed that the factors affecting the success of e-commerce for SMEs were congruent with the empirical data. The hypothesis testing indicated that business management influenced learning in the organization, learning in the organization influenced business collaboration and the quality of the website, and these factors in turn influenced the success of SMEs. Moreover, regarding the approach to enhance the standard of SMEs, the majority of respondents wanted to enhance the standard of SMEs to a high level in the categories of safety of the e-commerce system, basic e-commerce infrastructure, development of staff potential, assistance with budget and tax reduction, and improvement of laws regarding e-commerce, respectively.

Keywords: Electronic Commerce, Influencing Factors, Small and Medium Enterprises.

3837 Financial Portfolio Optimization in Turkish Electricity Market via Value at Risk

Authors: F. Gökgöz, M. E. Atmaca

Abstract:

Electricity has an indispensable role in human daily life, technological development, and the economy. It is a special product or service that must be generated and consumed instantaneously. The resources of the world are limited, so their effective and efficient use is very important not only for human life and the environment but also for technological and economic development. A competitive electricity market is one important way of providing a suitable platform for the effective and efficient use of electricity. Besides its benefits, it brings along some risks that should be carefully managed by a market player such as an electricity generation company. Risk management is an essential part of market players' decision making. In this paper, risk management through diversification is applied with the help of Value at Risk methods for case studies. The performance of the optimal electricity sale solutions is measured, and the portfolio performance is evaluated via the Sharpe ratio and compared with a conventional approach. Two years of historical electricity price data from the Turkish Day-Ahead Market are used to demonstrate the approach.
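
A minimal sketch of the two measures named above follows: historical Value at Risk of a portfolio of sale strategies and the Sharpe ratio of its returns. The return series, weights, and the zero risk-free rate are illustrative assumptions, not the study's data.

```python
import numpy as np

def historical_var(returns, alpha=0.95):
    """Loss threshold not exceeded with probability `alpha` (reported as a positive number)."""
    return -np.quantile(returns, 1 - alpha)

def sharpe_ratio(returns, risk_free=0.0):
    excess = returns - risk_free
    return excess.mean() / excess.std(ddof=1)

rng = np.random.default_rng(0)
asset_returns = rng.normal([0.001, 0.0005, 0.002], [0.02, 0.01, 0.03], size=(730, 3))  # two years, daily
weights = np.array([0.5, 0.3, 0.2])                       # candidate sale portfolio
portfolio = asset_returns @ weights
print(f"95% VaR: {historical_var(portfolio):.4f}, Sharpe: {sharpe_ratio(portfolio):.3f}")
```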

Keywords: Electricity market, portfolio optimization, risk management, Sharpe ratio, value at risk.

3836 Parameters Optimization of the Laminated Composite Plate for Sound Transmission Problem

Authors: Yu T. Tsai, Jin H. Huang

Abstract:

In this paper, the sound transmission loss (TL) of a laminated composite plate (LCP) with different material properties in each layer is investigated. A numerical method to obtain the TL of the LCP is proposed using elastic plate theory. The transfer matrix approach is presented as a novelty for computational efficiency in handling the dynamic stiffness matrices (D-matrices) of the numerous layers of the LCP. Besides the numerical simulations for calculating the TL of the LCP, a material properties inverse method is presented for the design of a laminated composite plate analogous to a metallic plate with a specified TL. The results demonstrate that the proposed computational algorithm exhibits high efficiency, requiring only a small number of iterations to achieve the goal. This method can be effectively employed to design and develop tailor-made materials for various applications.

Keywords: Sound transmission loss, laminated composite plate, transfer matrix approach, inverse problem, elastic plate theory, material properties.

3835 A Methodological Test to Study the Concrete Workability with the Fractal Model

Authors: F. Achouri, K. Chouicha

Abstract:

The main parameters affecting workability are the water content, the particle size, and the total surface area of the grains: the mixing water first wets the surface of the grains and then fills the voids between them to form entrapped water, and the remaining quantity of water is called free water. The aim of this study is to develop a fractal approach relating the concrete formulation parameters to workability. To develop this approach, a series of concretes taken from the literature was investigated by varying formulation parameters such as G/S, the quantity of cement C, and the quantity of water W. We also use two other models, the water layer thickness model and the paste layer thickness model, to judge their relevance, with the following results: the water layer thickness model is considered relevant when the water quantity varies, while the paste layer thickness model is only applicable if the paste is assumed to be made with the grain value Dmax = 2.85, the value from which the model becomes stable.

Keywords: Concrete, fractal method, paste layer thickness, water layer thickness, workability.

3834 Effective Image and Video Error Concealment using RST-Invariant Partial Patch Matching Model and Exemplar-based Inpainting

Authors: Shiraz Ahmad, Zhe-Ming Lu

Abstract:

An effective visual error concealment method is presented, employing a robust rotation, scale, and translation (RST) invariant partial patch matching model (RSTI-PPMM) and exemplar-based inpainting. While the proposed robust and inherently feature-enhanced texture synthesis approach ensures the generation of excellent and perceptually plausible visual error concealment results, the outlier pruning property guarantees significant quality improvements, both quantitatively and qualitatively. No intermediate user interaction is required for pre-segmented media, and the presented method follows a bootstrapping approach for automatic visual loss recovery and image and video error concealment.

Keywords: Exemplar-based image and video inpainting, outlier pruning, RST-invariant partial patch matching model (RSTI-PPMM), visual error concealment.

3833 Comparative Analysis of Machine Learning Tools: A Review

Authors: S. Sarumathi, M. Vaishnavi, S. Geetha, P. Ranjetha

Abstract:

Machine learning is a new and exciting area of artificial intelligence. It is a valuable, time- and cost-effective approach, and it is not a narrow learning approach: it includes a wide range of methods and techniques that can be applied to a wide range of complex real-world problems and time domains. Biological image classification, adaptive testing, computer vision, natural language processing, object detection, cancer detection, face recognition, handwriting recognition, speech recognition, and many other applications of machine learning are widely used in research, industry, and government. Every day, more data are generated, and conventional machine learning techniques are becoming obsolete as users move to distributed and real-time operations. By providing fundamental knowledge of machine learning tools and research opportunities in the field, this article aims to serve as both a comprehensive overview and a guide. A diverse set of machine learning resources is demonstrated and contrasted with respect to their key features in this survey.

Keywords: Artificial intelligence, machine learning, deep learning, machine learning algorithms, machine learning tools.

3832 Semantic Enhanced Social Media Sentiments for Stock Market Prediction

Authors: K. Nirmala Devi, V. Murali Bhaskaran

Abstract:

Traditional document representation for classification follows the Bag of Words (BoW) approach to represent term weights. The conventional method uses the Vector Space Model (VSM) to exploit the statistical information of terms in the documents, but it fails to address the semantic information and the order of the terms present in the documents. The phrase-based approach follows the order of the terms present in the documents but not the semantics behind the words. Therefore, a semantic concept based approach is used in this paper to enhance the semantics by incorporating ontology information. A novel method is proposed to forecast the intraday directional movement of stock market prices based on sentiments from Twitter and Money Control news articles. Stock market forecasting is a very difficult and highly complicated task because it is affected by many factors such as economic conditions, political events, and investor sentiment. Stock market series are generally dynamic, nonparametric, noisy, and chaotic by nature. Sentiment analysis, together with the wisdom of crowds, can automatically compute the collective intelligence of future performance in many areas, such as stock markets, box office sales, and election outcomes. The proposed method utilizes collective sentiments about the stock market to predict directional movements of the stock price. The collective sentiments in the above social media have strong predictive power for the up/down directional movements of the stock price, as shown using the Granger causality test.
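
A minimal sketch of the Granger causality test mentioned above follows, checking whether a daily sentiment series helps predict the next day's price movement. The synthetic series and the lag order are illustrative assumptions, not the paper's data.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
sentiment = rng.normal(size=300)
movement = np.r_[0.0, 0.6 * sentiment[:-1]] + 0.1 * rng.normal(size=300)   # movement lags sentiment

# Column order matters: the test asks whether column 2 Granger-causes column 1.
data = np.column_stack([movement, sentiment])
result = grangercausalitytests(data, maxlag=2, verbose=False)
print({lag: round(res[0]["ssr_ftest"][1], 4) for lag, res in result.items()})  # p-values per lag
```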

Keywords: Bag of Words, Collective Sentiments, Ontology, Semantic relations, Sentiments, Social media, Stock Prediction, Twitter, Vector Space Model and wisdom of crowds.

3831 Effect of Different Diesel Fuels on Formation of the Cavitation Phenomena

Authors: Mohammadreza Nezamirad, Sepideh Amirahmadian, Nasim Sabetpour, Azadeh Yazdi, Amirmasoud Hamedi

Abstract:

Cavitation inside a diesel injector nozzle is investigated numerically in this study. The Reynolds-Averaged Navier-Stokes (RANS) equations are utilized to investigate the flow behavior inside the nozzle numerically, and the k-ε turbulence model is found to be a better approach than the k-ω turbulence model. The Winklhofer rectangular nozzle is also simulated in order to verify the current numerical scheme, and the solution is verified against the mass flow rate. Afterward, a six-hole real-size nozzle is simulated, and it is found that, among the different fuels used in this study under the same conditions, diesel fuel produces the largest cavitation length. It is also found that, at the same boundary conditions, rapeseed methyl ester (RME) fuel leads to the highest discharge coefficient and mass flow rate.
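
For reference, a minimal sketch of the discharge coefficient named above: the ratio of the measured mass flow rate to the ideal (Bernoulli) rate through the nozzle hole. The geometry, fuel density, and pressure drop values are illustrative assumptions only.

```python
import math

def discharge_coefficient(mass_flow_kg_s, area_m2, density_kg_m3, delta_p_pa):
    ideal = area_m2 * math.sqrt(2.0 * density_kg_m3 * delta_p_pa)   # ideal (Bernoulli) mass flow rate
    return mass_flow_kg_s / ideal

d = 150e-6                                        # hypothetical nozzle hole diameter [m]
area = math.pi * d ** 2 / 4
print(round(discharge_coefficient(0.0035, area, 830.0, 80e6), 3))   # Cd for an assumed operating point
```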

Keywords: Cavitation, diesel fuel, CFD, real size nozzle, discharge coefficient.

3830 Holistic Approach to Assess the Potential of Using Traditional and Advance Insulation Materials for Energy Retrofit of Office Buildings

Authors: Marco Picco, Mahmood Alam

Abstract:

Improving the energy performance of existing buildings can be challenging, particularly when facades cannot be modified, and the only available option is internal insulation. In such cases, the choice of the most suitable material becomes increasingly complex, as in addition to thermal transmittance and capital cost, the designer needs to account for the impact of the intervention on the internal spaces, and in particular the loss of usable space due to the additional layers of materials installed. This paper explores this issue by analyzing a case study of an average office building needing to go through a refurbishment in order to reach the limits imposed by current regulations to achieve energy efficiency in buildings. The building is simulated through dynamic performance simulation under three different climate conditions in order to evaluate its energy needs. The use of Vacuum Insulated Panels as an option for energy refurbishment is compared to traditional insulation materials (XPS, Mineral Wool). For each scenario, energy consumptions are calculated and, in combination with their expected capital costs, used to perform a financial feasibility analysis. A holistic approach is proposed, taking into account the impact of the intervention on internal space by quantifying the value of the lost usable space and used in the financial feasibility analysis. The proposed approach highlights how taking into account different drivers will lead to the choice of different insulation materials, showing how accounting for the economic value of space can make VIPs an attractive solution for energy retrofitting under various climate conditions.
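
A minimal sketch of the holistic payback comparison described above: annual energy savings are credited, and the rental value of floor area lost to thicker insulation is charged against each option. All prices, thicknesses, and savings below are illustrative assumptions, not figures from the study.

```python
def simple_payback_years(capital_cost, annual_energy_saving, lost_area_m2, rent_per_m2_year):
    net_annual_benefit = annual_energy_saving - lost_area_m2 * rent_per_m2_year
    return float("inf") if net_annual_benefit <= 0 else capital_cost / net_annual_benefit

options = {
    # name: (capital cost, annual energy saving, floor area lost to insulation thickness)
    "Mineral wool (100 mm)": (4000.0, 900.0, 6.0),
    "XPS (80 mm)":           (5000.0, 900.0, 4.8),
    "VIP (20 mm)":           (12000.0, 900.0, 1.2),
}
for name, (cost, saving, lost) in options.items():
    years = simple_payback_years(cost, saving, lost, rent_per_m2_year=120.0)
    print(name, round(years, 1), "years")        # valuing the lost space narrows the gap toward VIPs
```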

Keywords: Vacuum insulated panels, building performance simulation, payback period, building energy retrofit.

3829 Robust H∞ Filter Design for Uncertain Fuzzy Descriptor Systems: LMI-Based Design

Authors: Wudhichai Assawinchaichote, Sing Kiong Nguang

Abstract:

This paper examines the problem of designing a robust H∞ filter for a class of uncertain fuzzy descriptor systems described by a Takagi-Sugeno (TS) fuzzy model. Based on a linear matrix inequality (LMI) approach, LMI-based sufficient conditions for the uncertain nonlinear descriptor systems to have an H∞ performance are derived. To alleviate the ill-conditioning resulting from the interaction of slow and fast dynamic modes, solutions to the problem are given in terms of linear matrix inequalities which are independent of the singular perturbation ε, when ε is sufficiently small. The proposed approach does not involve the separation of states into slow and fast ones and it can be applied not only to standard, but also to nonstandard uncertain nonlinear descriptor systems. A numerical example is provided to illustrate the design developed in this paper.

Keywords: H∞ control, Takagi-Sugeno (TS) fuzzy model, Linear Matrix Inequalities (LMIs), Descriptor systems.

3828 Soft Real-Time Fuzzy Task Scheduling for Multiprocessor Systems

Authors: Mahdi Hamzeh, Sied Mehdi Fakhraie, Caro Lucas

Abstract:

All practical real-time scheduling algorithms for multiprocessor systems present a trade-off between computational complexity and performance. In real-time systems, tasks have to be performed both correctly and on time, and finding a minimal schedule in multiprocessor systems with real-time constraints is known to be NP-hard. Although some optimal algorithms have been employed in uniprocessor systems, they fail when applied to multiprocessor systems, and practical scheduling algorithms in real-time systems do not have deterministic response times. Deterministic timing behavior is an important parameter for system robustness analysis, and the intrinsic uncertainty in dynamic real-time systems increases the difficulty of the scheduling problem. To alleviate these difficulties, we propose a fuzzy scheduling approach to arrange real-time periodic and non-periodic tasks in multiprocessor systems. Static and dynamic optimal scheduling algorithms fail under non-critical overload; in contrast, our approach balances the task loads of the processors successfully while considering starvation prevention and fairness, so that higher-priority tasks have a higher probability of running. A simulation is conducted to evaluate the performance of the proposed approach. Experimental results show that the proposed fuzzy scheduler creates feasible schedules for homogeneous and heterogeneous tasks, and considering task priorities leads to higher system utilization and lower deadline miss time. According to the results, it performs very close to the optimal schedule of uniprocessor systems.

Keywords: Computational complexity, Deadline, Feasible scheduling, Fuzzy scheduling, Priority, Real-time multiprocessor systems, Robustness, System utilization.

3827 A Model for Estimation of Efforts in Development of Software Systems

Authors: Parvinder S. Sandhu, Manisha Prashar, Pourush Bassi, Atul Bisht

Abstract:

Software effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain, and/or noisy input. Effort estimates may be used as input to project plans, iteration plans, and budgets. There are various models, such as the Halstead, Walston-Felix, Bailey-Basili, Doty, and GA based models, which have already been used to estimate the software effort for projects. In this study, statistical models, a Fuzzy-GA inference system, and a neuro-fuzzy (NF) inference system are tested for estimating the software effort for projects. The performance of the developed models was tested on NASA software project datasets, and the results are compared with the Halstead, Walston-Felix, Bailey-Basili, Doty, and genetic algorithm based models mentioned in the literature. The results show that the NF model has the lowest MMRE and RMSE values, performing best compared with the Fuzzy-GA based hybrid inference system and the other existing models used for effort prediction.
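
A minimal sketch of the classic effort models the study compares against is given below, in their commonly cited forms (effort in person-months, size in KLOC), together with the MMRE measure used to rank models. The coefficients are the textbook values; the paper's datasets and the neuro-fuzzy model itself are not reproduced, and the sample projects are hypothetical.

```python
def halstead(kloc):       return 0.7 * kloc ** 1.50
def walston_felix(kloc):  return 5.2 * kloc ** 0.91
def bailey_basili(kloc):  return 5.5 + 0.73 * kloc ** 1.16
def doty(kloc):           return 5.288 * kloc ** 1.047        # variant commonly quoted for KLOC > 9

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error: lower is better."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical (size in KLOC, actual effort) pairs, used only to exercise the functions.
projects = [(10.0, 24.0), (46.5, 96.0), (3.0, 8.4)]
for name, model in [("Halstead", halstead), ("Walston-Felix", walston_felix),
                    ("Bailey-Basili", bailey_basili), ("Doty", doty)]:
    print(name, round(mmre([e for _, e in projects], [model(k) for k, _ in projects]), 2))
```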

Keywords: Neuro-Fuzzy Model, Halstead Model, Walston-Felix Model, Bailey-Basili Model, Doty Model, GA Based Model, Genetic Algorithm.

3826 A Novel Fuzzy Logic Based Controller to Adjust the Brightness of the Television Screen with Respect to Surrounding Light

Authors: A. V. Sai Balasubramanian, N. Ravi Shankar, S. Subbaraman, R. Rengaraj

Abstract:

One of the major causes of eye strain and other problems while watching television is the relative illumination between the screen and its surroundings. This can be overcome by adjusting the brightness of the screen with respect to the surrounding light. A controller based on fuzzy logic is proposed in this paper. The fuzzy controller takes the intensity of the light surrounding the screen and the present brightness of the screen as inputs, and its output is the grid voltage corresponding to the required brightness. This voltage is applied to the CRT and the brightness is controlled dynamically. For the given test system data, different defuzzification methods have been implemented and the results are compared. In order to validate the effectiveness of the proposed approach, a fuzzy controller has been designed using test data obtained from a real-time system. The simulations are performed in MATLAB and verified with standard system data. The proposed approach can be implemented for real-time applications.
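
A minimal sketch of a two-input Mamdani-style fuzzy controller of the kind described above follows: triangular memberships over ambient light and current screen brightness, a small rule table, and centroid defuzzification to a grid-voltage command. The membership ranges, the rule table, and the 0-100 V output scale are illustrative assumptions only, not the paper's design.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership value of x for the triangle (a, b, c)."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

OUT = np.linspace(0.0, 100.0, 501)                              # candidate grid voltages [V]
AMBIENT = {"dark": (-50, 0, 50), "bright": (50, 100, 150)}       # ambient light, % of full scale
SCREEN = {"dim": (-50, 0, 50), "bright": (50, 100, 150)}         # current screen brightness
OUT_SETS = {"low": (-50, 0, 50), "medium": (25, 50, 75), "high": (50, 100, 150)}
RULES = [("dark", "dim", "low"), ("dark", "bright", "low"),
         ("bright", "dim", "high"), ("bright", "bright", "medium")]

def grid_voltage(ambient, brightness):
    aggregate = np.zeros_like(OUT)
    for amb, scr, out in RULES:
        strength = min(tri(ambient, *AMBIENT[amb]), tri(brightness, *SCREEN[scr]))   # rule firing strength
        aggregate = np.maximum(aggregate, np.minimum(strength, tri(OUT, *OUT_SETS[out])))
    return float(np.sum(OUT * aggregate) / (np.sum(aggregate) + 1e-12))              # centroid defuzzification

print(round(grid_voltage(ambient=80, brightness=20), 1))          # bright room, dim screen -> high voltage
```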

Keywords: Fuzzy controller, Grid voltage

3825 Detection of Sags, Swells, and Transients Using Windowing Technique Based On Continuous S-Transform (CST)

Authors: K. Daud, A. F. Abidin, N. Hamzah, H. S. Nagindar Singh

Abstract:

This paper presents a new approach for power quality analysis using a windowing technique based on the continuous S-transform (CST). The half-cycle window technique can detect the initial occurrence of disturbances, i.e. voltage sags, swells, and transients, almost always correctly. The samples in each half-cycle window are analyzed with the continuous S-transform over the entire disturbance waveform, and the modified parameters are produced by a MATLAB m-file based on the continuous S-transform. The CST has better time-frequency and localization properties than traditional approaches and is also able to detect disturbances correctly under noisy conditions. The excellent time-frequency resolution of the CST makes it an attractive candidate for the analysis of power system disturbance signals.

Keywords: Power quality disturbances, initial detection, half cycle windowing, continuous S-transform.
