Search results for: Machine Tool
1927 Generating Qualitative Causal Graph using Modeling Constructs of Qualitative Process Theory for Explaining Organic Chemistry Reactions
Authors: Alicia Y. C. Tang, Rukaini Abdullah, Sharifuddin M. Zain, Noorsaadah A. Rahman
Abstract:
This paper discusses the causal explanation capability of QRIOM, a tool aimed at supporting the learning of organic chemistry reactions. The development of the tool is based on the hybrid use of the Qualitative Reasoning (QR) technique and the Qualitative Process Theory (QPT) ontology. Our simulation combines symbolic, qualitative description of relations with quantity analysis to generate causal graphs. The pedagogy embedded in the simulator is to both simulate and explain organic reactions. Qualitative reasoning through a causal chain is presented to explain the overall changes made to the substrate, from the initial substrate to the production of the final outputs. Several uses of the QPT modeling constructs in supporting behavioral and causal explanation at run-time are also demonstrated. Explaining organic reactions through a causal graph trace can help improve the reasoning ability of learners, in that their conceptual understanding of the subject is nurtured.
Keywords: Qualitative reasoning, causal graph, organic reactions, explanation, QPT, modeling constructs.
1926 FSM-based Recognition of Dynamic Hand Gestures via Gesture Summarization Using Key Video Object Planes
Authors: M. K. Bhuyan
Abstract:
The use of the human hand as a natural interface for human-computer interaction (HCI) serves as the motivation for research in hand gesture recognition. Vision-based hand gesture recognition involves visual analysis of hand shape, position and/or movement. In this paper, we use the concept of object-based video abstraction for segmenting the frames into video object planes (VOPs), as used in MPEG-4, with each VOP corresponding to one semantically meaningful hand position. Next, the key VOPs are selected on the basis of the amount of change in hand shape: for a given key frame in the sequence, the next key frame is the one in which the hand changes its shape significantly. Thus, an entire video clip is transformed into a small number of representative frames that are sufficient to represent a gesture sequence. Subsequently, we model a particular gesture as a sequence of key frames, each bearing information about its duration; these constitute a finite state machine. For recognition, the states of the incoming gesture sequence are matched with the states of all the different FSMs contained in the database of the gesture vocabulary. The core idea of our proposed representation is that redundant frames of the gesture video sequence bear only the temporal information of a gesture and hence are discarded for computational efficiency. Experimental results demonstrate the effectiveness of our proposed scheme for key frame extraction, subsequent gesture summarization, and finally gesture recognition.
Keywords: Hand gesture, MPEG-4, Hausdorff distance, finite state machine.
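The key-VOP selection step lends itself to a short illustration. Below is a minimal Python sketch, assuming each frame's hand is available as a 2-D contour point set and using the Hausdorff distance named in the keywords as the shape-change measure; the pixel threshold is an assumption, not a value from the paper.

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two 2-D point sets."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def select_key_vops(contours, threshold=15.0):
    """Keep a frame as a key VOP when its hand contour has moved more than
    `threshold` pixels (Hausdorff distance) from the last key frame."""
    keys = [0]
    for i in range(1, len(contours)):
        if hausdorff(contours[keys[-1]], contours[i]) > threshold:
            keys.append(i)
    return keys
```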
1925 Computational Method for Annotation of Protein Sequence According to Gene Ontology Terms
Authors: Razib M. Othman, Safaai Deris, Rosli M. Illias
Abstract:
Annotation of a protein sequence is pivotal for the understanding of its function. The accuracy of manual annotation provided by curators is still questionable, as it carries lesser evidence strength, and manual annotation remains a hard and time-consuming task. A number of computational methods, including tools, have been developed to tackle this challenging task. However, they require high-cost hardware, are difficult for bioscientists to set up, or depend on time-intensive and blind sequence similarity searches such as the Basic Local Alignment Search Tool. This paper introduces a new method of assigning highly correlated Gene Ontology terms of annotated protein sequences to partially annotated or newly discovered protein sequences. The method is fully based on Gene Ontology data and annotations. Two problems had to be solved to realize this method. The first relates to splitting the single monolithic Gene Ontology RDF/XML file into a set of smaller files that are easier to access and process; these files can then be enriched with protein sequences and Inferred from Electronic Annotation evidence associations. The second involves searching for a set of Gene Ontology terms semantically similar to a given query. The details of the macro and micro problems involved and their solutions, including the objective of this study, are described. This paper also describes protein sequence annotation and the Gene Ontology. The methodology of this study and the Gene Ontology-based protein sequence annotation tool, named extended UTMGO, are presented. Furthermore, its basic version, a Gene Ontology browser based on semantic similarity search, is also introduced.
Keywords: automatic clustering, bioinformatics tool, gene ontology, protein sequence annotation, semantic similarity search
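The abstract does not specify the semantic similarity measure used by the extended UTMGO browser. As one hedged illustration of a similarity search over GO terms, the following Python sketch scores two terms by the Jaccard overlap of their ancestor sets in the is-a hierarchy; the GO IDs and the measure itself are stand-ins, not the paper's method.

```python
def ancestors(term, parents):
    """All ancestors of a GO term, given a child -> parents mapping."""
    seen, stack = set(), [term]
    while stack:
        for p in parents.get(stack.pop(), []):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def go_similarity(t1, t2, parents):
    """Jaccard similarity of the two terms' ancestor sets (term included)."""
    a1 = ancestors(t1, parents) | {t1}
    a2 = ancestors(t2, parents) | {t2}
    return len(a1 & a2) / len(a1 | a2)

# toy fragment of an is-a hierarchy with hypothetical IDs
parents = {"GO:3": ["GO:1"], "GO:4": ["GO:1"], "GO:5": ["GO:3", "GO:4"]}
print(go_similarity("GO:5", "GO:3", parents))  # shared ancestry -> 0.5
```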
1924 Computational Study on Cardiac-Coronary Interaction in Terms of Coronary Flow-Pressure Waveforms in Presence of Drugs: Comparison Between Simulated and In Vivo Data
Authors: C. De Lazzari, E. Del Prete, I. Genuini, F. Fedele
Abstract:
A cardiovascular human simulator can be a useful tool for understanding complex physiopathological processes in the cardiocirculatory system. It can also be useful for investigating the effects of different drugs on hemodynamic parameters. The aim of this work is to test the ability of our cardiovascular numerical simulator CARDIOSIM© to reproduce coronary flow/pressure waveforms in the presence of two different drugs: amlodipine (AMLO) and adenosine (ADO). In particular, a time-varying intramyocardial compression, assumed to be proportional to the left ventricular pressure, was related to the venous coronary compliances in order to study its effects on the coronary blood flow and the flow/pressure loop. Since coronary circulation dynamics are strongly interrelated with the mechanics of left ventricular contraction, relaxation, and filling, the numerical model allowed us to analyze the effects induced by the left ventricular pressure on the coronary flow.
Keywords: Cardiovascular system, coronary blood flow, hemodynamics, numerical simulation.
1923 Carbon Isotope Discrimination, A Tool for Screening of Salinity Tolerance of Genotypes
Authors: Alireza Dadkhah, Mahmoud Ghorbanzadeh-Neghab
Abstract:
This study was carried out to investigate the effects of salinity on the carbon isotope discrimination (Δ) of shoots and roots of four sugar beet cultivars (cv): Madison (of British origin) and three Iranian cultivars (7233-P12, 7233-P21 and 7233-P29). Plants were grown in a sand culture medium under greenhouse conditions and irrigated with saline water (tap water as control; 50 mM, 150 mM, 250 mM and 350 mM of NaCl + CaCl2 in a 5:1 molar ratio) from the four-leaf stage for 16 weeks. Carbon isotope discrimination decreased significantly with increasing salinity. Significant differences in Δ between shoot and root were observed in all cultivars at all levels of salinity. The Madison cultivar showed lower Δ in shoot and root than the other three cultivars at all levels of salinity except control, whereas cultivar 7233-P29 had significantly higher Δ values at saline conditions of 150 mM and above. Therefore, Δ might be applicable as a useful tool for studying the salinity tolerance of sugar beet genotypes.
Keywords: Carbon isotope discrimination, photosynthesis, salt stress, sugar beet.
1922 Decision Tree Based Scheduling for Flexible Job Shops with Multiple Process Plans
Authors: H.-H. Doh, J.-M. Yu, Y.-J. Kwon, J.-H. Shin, H.-W. Kim, S.-H. Nam, D.-H. Lee
Abstract:
This paper suggests a decision tree based approach for flexible job shop scheduling with multiple process plans, i.e. each job can be processed through alternative operations, each of which can be processed on alternative machines. The main decision variables are: (a) selecting an operation/machine pair; and (b) sequencing the jobs assigned to each machine. As an extension of the priority scheduling approach that selects the best priority rule combination after many simulation runs, this study suggests a decision tree based approach in which a decision tree is used to select a priority rule combination adequate for a specific system state, so the burden of developing simulation models and carrying out simulation runs can be eliminated. The decision tree based scheduling approach consists of construction and scheduling modules. In the construction module, a decision tree is constructed using a four-stage algorithm; in the scheduling module, a priority rule combination is selected using the decision tree. To show the performance of the suggested approach, a case study was done on a flexible job shop with reconfigurable manufacturing cells and a conventional job shop, and the results are reported by comparing the approach with individual priority rule combinations for the objectives of minimizing total flow time and total tardiness.
Keywords: Flexible job shop scheduling, Decision tree, Priority rules, Case study.
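The core idea, training a decision tree to map a system state to a priority rule combination, can be sketched as follows. The state features, rule names, and training values are hypothetical, and an off-the-shelf CART learner stands in for the paper's four-stage construction algorithm.

```python
from sklearn.tree import DecisionTreeClassifier

# hypothetical training data: system-state features (utilization, queue
# length, tardy jobs) labeled with the best rule combination for that state
X = [[0.9, 12, 3], [0.4, 2, 1], [0.8, 9, 4], [0.3, 1, 2]]
y = ["SPT+EDD", "FIFO+SLACK", "SPT+EDD", "FIFO+SLACK"]

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(tree.predict([[0.85, 10, 3]]))   # rule combination chosen for a new state
```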
1921 Performance Analysis of Genetic Algorithm with kNN and SVM for Feature Selection in Tumor Classification
Authors: C. Gunavathi, K. Premalatha
Abstract:
Tumor classification is a key area of research in the field of bioinformatics. Microarray technology is commonly used in the study of disease diagnosis using gene expression levels. The main drawback of gene expression data is that it contains thousands of genes and very few samples. Feature selection methods are used to select the informative genes from the microarray and considerably improve the classification accuracy. In the proposed method, a Genetic Algorithm (GA) is used for effective feature selection. Informative genes are identified based on T-statistics, Signal-to-Noise Ratio (SNR) and F-test values, and the initial candidate solutions of the GA are obtained from the top-m informative genes. The classification accuracy of the k-Nearest Neighbor (kNN) method is used as the fitness function for the GA. In this work, kNN and the Support Vector Machine (SVM) are used as the classifiers. The experimental results show that the proposed work is suitable for effective feature selection. With the help of the selected genes, the GA-kNN method achieves 100% accuracy on 4 of the 10 datasets, and the GA-SVM method on 5 of the 10. GA with kNN and SVM is demonstrated to be an accurate approach for microarray-based tumor classification.
Keywords: F-Test, Gene Expression, Genetic Algorithm, k-Nearest Neighbor, Microarray, Signal-to-Noise Ratio, Support Vector Machine, T-statistics, Tumor Classification.
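The GA-with-kNN-fitness loop described above can be sketched compactly. The paper seeds the initial population from the top-m genes ranked by T-statistics, SNR, and F-test; this illustrative version uses a random initial population, single-point crossover, and truncation selection, so all GA settings here are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """GA fitness: cross-validated kNN accuracy on the selected genes."""
    if not mask.any():
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(knn, X[:, mask], y, cv=3).mean()

def ga_select(X, y, pop=20, gens=30, p_mut=0.05):
    n = X.shape[1]
    population = rng.random((pop, n)) < 0.5              # random gene masks
    for _ in range(gens):
        scores = [fitness(m, X, y) for m in population]
        parents = population[np.argsort(scores)[-pop // 2:]]  # keep the best half
        cut = rng.integers(1, n)                              # single-point crossover
        kids = np.concatenate([parents[:, :cut], parents[::-1, cut:]], axis=1)
        kids ^= rng.random(kids.shape) < p_mut                # bit-flip mutation
        population = np.concatenate([parents, kids])
    best = max(population, key=lambda m: fitness(m, X, y))
    return np.flatnonzero(best)                          # indices of selected genes
```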
1920 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images
Authors: Jameela Ali Alkrimi, Loay E. George, Azizah Suliman, Abdul Rahim Ahmad, Karim Al-Jashamy
Abstract:
Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. Anemia, a lack of RBCs, is characterized by a hemoglobin level below normal. In this study, an image processing-based methodology was developed to localize and extract RBCs from microscopic images, and a machine learning approach was adopted to classify the localized anemic RBC images. Several textural and geometrical features were calculated for each extracted RBC. The training set of features was analyzed using principal component analysis (PCA), chosen for its low computational complexity and its suitability for finding the most discriminating features, which can lead to accurate classification decisions. With the proposed method, RBCs were isolated in 4.3 seconds from an image containing 18 to 27 cells. Our classifiers yielded accuracy rates of 100%, 99.99%, and 96.50% for the k-nearest neighbor (K-NN) algorithm, support vector machine (SVM), and RBF neural network (RBFNN), respectively. Classification was evaluated using sensitivity, specificity, and kappa statistical parameters. In conclusion, the classification results were obtained within a short time period, and the results improved when PCA was used.
Keywords: Red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis (PCA), confusion matrix, kappa statistical parameters, ROC.
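The classification stage reduces to a standard pipeline; a minimal sketch follows, with the number of principal components and neighbors assumed rather than taken from the paper.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

# X: textural/geometrical features per extracted RBC, y: cell class labels
clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                    KNeighborsClassifier(n_neighbors=5))
# clf.fit(X_train, y_train); clf.score(X_test, y_test)
```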
1919 Computer-based Systems for High Speed Vessels Navigators – Engineers Training
Authors: D. E. Gourgoulis, C. G. Yakinthos, M. G. Vassiliadou
Abstract:
With high speed vessels getting ever more sophisticated, travelling at higher and higher speeds and operating in areas of high maritime traffic density, training becomes of the highest priority to ensure that safety levels are maintained and risks are adequately mitigated. Training onboard the actual craft on the actual route still remains the most effective way for crews to gain experience. However, operational experience and incidents during the last 10 years demonstrate the need for supplementary training, whether in the area of simulation or man-to-man and man/machine interaction. Training and familiarisation of the crew is the most important aspect in preventing incidents. The use of simulator, computer and web based training systems in conjunction with onboard training focusing on critical situations will improve the man-machine interaction and thereby reduce the risk of accidents. Today, both ship simulator and bridge teamwork courses are becoming the norm in order to further improve emergency response and crisis management skills. One of the main causes of accidents is the human factor. An efficient way to reduce human errors is to provide high-quality training to the personnel and to select the navigators carefully.
Keywords: CBT - WBT systems, Human factors.
1918 A TIPSO-SVM Expert System for Efficient Classification of TSTO Surrogates
Authors: Ali Sarosh, Dong Yun-Feng, Muhammad Umer
Abstract:
Fully reusable spaceplanes do not yet exist. This implies that design qualification for an optimized, highly integrated forebody-inlet configuration of the booster-stage vehicle cannot be based on archival data from other spaceplanes. Therefore, this paper proposes a novel TIPSO-SVM expert system methodology and solves a non-trivial problem: optimization and classification of the hypersonic forebody-inlet configuration in conjunction with the mass model of the two-stage-to-orbit (TSTO) vehicle. The hybrid-heuristic machine learning methodology is based on the two-step improved particle swarm optimizer (TIPSO) algorithm and a two-step support vector machine (SVM) data classification method. The efficacy of the method is tested by first evolving an optimal configuration for the hypersonic compression system using the TIPSO algorithm and thereafter classifying the results using the two-step SVM method. In the first step, extensive but non-classified mass-model training data for multiple optimized configurations is segregated and pre-classified for learning of the SVM algorithm. In the second step, the TIPSO-optimized mass-model data is classified using SVM. The results showed remarkable improvement in configuration and mass model along with sizing parameters.
Keywords: TIPSO-SVM expert system, TIPSO algorithm, two-step SVM method, aerothermodynamics, mass-modeling, TSTO vehicle.
1917 Optimization of Wire EDM Parameters for Fabrication of Micro Channels
Authors: Gurinder Singh Brar, Sarbjeet Singh, Harry Garg
Abstract:
Wire Electric Discharge Machining (WEDM) is a thermal machining process capable of machining very hard, electrically conductive materials irrespective of their hardness. WEDM is widely used to machine micro-scale parts with high dimensional accuracy and surface finish. The objective of this paper is to optimize the process parameters of wire EDM to fabricate micro channels and to measure the surface finish and material removal rate of the fabricated micro channels. The material used is aluminum 6061 alloy. The experiments were performed using a CNC wire-cut electric discharge machine. The effects of various WEDM parameters, namely pulse-on time (TON) at levels 100, 150 and 200, pulse-off time (TOFF) at levels 25, 35 and 45, and peak current (IP) at levels 105, 110 and 115, on the output parameters, i.e. surface roughness and material removal rate (MRR), were investigated. Each experiment was conducted under different conditions of pulse-on time, pulse-off time and peak current. For material removal rate, TON and IP were the most significant process parameters: MRR increases with increasing TON and IP and decreases with increasing TOFF. For surface roughness, TON and IP have the maximum effect, while TOFF was found to be less effective.
Keywords: Micro channels, Wire Electric Discharge Machining (WEDM), Metal Removal Rate (MRR), Surface finish.
1916 Benchmarking of Pentesting Tools
Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba
Abstract:
The benchmarking of tools for dynamic analysis of vulnerabilities in web applications is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or on non-academic websites, always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific approach use the same methodology as the empirical authors. This paper is motivated by the interest in finding answers to questions that many users of this type of tool have been asking over the years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers a complete report of all the vulnerabilities tested and exploited. These kinds of questions have also motivated previous work, but without real answers. The aim of this paper is to present results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in those previous works.
Keywords: Cybersecurity, IDS, security, web scanners, web vulnerabilities.
1915 Multi-Stage Multi-Period Production Planning in Wire and Cable Industry
Authors: Mahnaz Hosseinzadeh, Shaghayegh Rezaee Amiri
Abstract:
This paper presents a methodology for the serial production planning problem in the wire and cable manufacturing process that addresses input-output imbalance between consecutive stations, with the aim of minimizing machine halts at each stage. To this end, a linear Goal Programming (GP) model is developed, in which four main categories of constraints are considered: the number of runs per machine, machine sequences, acceptable machine inventories at the end of each period, and the necessity of fulfilling customers' orders. The model is formulated based upon real data obtained from IKO TAK Company, an important supplier of wire and cable for the oil, gas and automotive industries in Iran. By solving the model in GAMS software, the optimal number of runs, end-of-period inventories, and the minimum possible idle time for each machine are calculated. The application of the numerical results in the target company has shown the efficiency of the proposed model and solution, decreasing the lead time of end-product delivery to customers by 20%. Accordingly, the developed model can easily be applied in wire and cable companies for optimal production planning that reduces machine halts across manufacturing stages.
Keywords: Serial manufacturing process, production planning, wire and cable industry, goal programming approach.
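A goal-programming formulation of the station-balance idea can be sketched with a small open-source solver. The paper's model was built in GAMS on IKO TAK's data; the two-stage line, production rates, and demands below are invented for illustration, and PuLP stands in for GAMS.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum

stages, T = ["drawing", "stranding"], range(3)       # hypothetical machines, periods
rate = {"drawing": 40.0, "stranding": 30.0}          # output per run (invented)
demand = [60.0, 80.0, 70.0]                          # demand per period (invented)

prob = LpProblem("wire_cable_gp", LpMinimize)
runs = {(s, t): LpVariable(f"runs_{s}_{t}", lowBound=0) for s in stages for t in T}
under = [LpVariable(f"under_{t}", lowBound=0) for t in T]   # goal deviations
over = [LpVariable(f"over_{t}", lowBound=0) for t in T]

for t in T:
    # stage balance: stranding cannot process more than drawing produced
    prob += rate["stranding"] * runs["stranding", t] <= rate["drawing"] * runs["drawing", t]
    # soft demand goal with under/over-achievement deviation variables
    prob += rate["stranding"] * runs["stranding", t] + under[t] - over[t] == demand[t]

prob += lpSum(under) + lpSum(over)                   # minimize total goal deviation
prob.solve()
print({v.name: v.value() for v in prob.variables()})
```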
1914 Effects of Manufacture and Assembly Errors on the Output Error of Globoidal Cam Mechanisms
Authors: Shuting Ji, Yueming Zhang, Jing Zhao
Abstract:
The output error of the globoidal cam mechanism can be considered a relevant indicator of mechanism performance, because it determines the kinematic and dynamic behavior of the mechanical transmission. Based on differential geometry and rigid body transformations, the mathematical model of the surface geometry of the globoidal cam is established. We then present the analytical expression of the output error (including the transmission error and the displacement error along the output axis) considering different manufacture and assembly errors. The effects of the center distance error, the perpendicularity error between the input and output axes, and the rotational angle error of the globoidal cam on the output error are systematically analyzed. A globoidal cam mechanism widely used in the automatic tool changers of CNC machines is applied for illustration. Our results show that the perpendicularity error and the rotational angle error have little effect on the transmission error but great effect on the displacement error along the output axis. This study plays an important role in the design, manufacture and assembly of globoidal cam mechanisms.
Keywords: Globoidal cam mechanism, manufacture error, transmission error, automatic tool changer.
1913 Artificial Intelligence: A Comprehensive and Systematic Literature Review of Applications and Comparative Technologies
Authors: Z. M. Najmi
Abstract:
Over the years, the question around Artificial Intelligence has always been one with many answers. Whether by means of use in business and industry or complicated algorithmic programming, management of these technologies has always been the core focus. More recently, these technologies have been questioned in industry and society alike as to whether they have improved human-centred design, assisted choices and objectives, and played a part in systematic processes across the board. The answers may lie within AI technologies and the steps needed to remove common human error. Elements such as machine learning, deep learning, recommender systems and natural language processing will all be features to consider moving forward. Our previous interventions with AI applications have resulted in increased productivity but have raised concerns for the continuation of traditional human-centred occupations. Emerging technologies such as augmented reality and virtual reality have all played a part in AI's prominent rise. As mentioned, AI has been constantly under the microscope; its benefits and drawbacks may seem endless, but AI is something we must take notice of and adapt into our everyday lives. The aim of this paper is to give an overview of the technologies surrounding AI and its related technologies. A comprehensive review has been written as a timeline of the developing events and key points in the history of Artificial Intelligence. This research is gathered entirely from secondary sources and academic statements of knowledge, assembled to produce an understanding of the timeline of AI.
Keywords: Artificial Intelligence, Deep Learning, Augmented Reality, Reinforcement Learning, Machine Learning, Supervised Learning.
1912 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting
Authors: Kemal Polat
Abstract:
In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) has been used for the classification of a diabetes disease dataset. The aims of this study are to reduce the variance within the attributes of the diabetes dataset and to improve the classification accuracy of classifier algorithms by transforming non-linearly separable datasets into linearly separable ones. The Pima Indians Diabetes dataset has two classes: normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of the K-means clustering method and is one of the most used clustering methods in data mining and machine learning applications. In this study, as the first stage, the fuzzy C-means clustering process was used to find the centers of the attributes in the Pima Indians diabetes dataset, and the dataset was then weighted according to the ratios of the means of the attributes to their centers. Secondly, after the weighting process, classifier algorithms including the support vector machine (SVM) and k-NN (k-nearest neighbor) classifiers were used to classify the weighted Pima Indians diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) obtains very promising results in the classification of the Pima Indians diabetes dataset.
Keywords: Fuzzy C-means clustering, Fuzzy C-means clustering based attribute weighting, Pima Indians diabetes dataset, SVM.
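The weighting rule stated in the abstract, multiplying each attribute by the ratio of its mean to its fuzzy C-means center, can be sketched directly. Taking the attribute's "center" as the average of two FCM cluster centers is an assumption, as is the choice of c=2.

```python
import numpy as np

def fcm_centers(x, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy C-means on one attribute; returns the c cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                        # memberships: columns sum to one
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)
    return centers

def fcm_attribute_weighting(X):
    """Multiply each attribute by mean(attribute) / center(attribute)."""
    Xw = X.astype(float).copy()
    for j in range(X.shape[1]):
        center = fcm_centers(X[:, j]).mean()  # aggregate the c centers (assumption)
        Xw[:, j] *= X[:, j].mean() / center
    return Xw
```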
1911 Towards a Compliance Reporting using a Balanced Scorecard
Authors: Michael Amberg, Dipl. Kfm. Johannes C. Panitz
Abstract:
Compliance requires effective communication within an enterprise as well as towards a company's external environment. This requirement commences with the implementation of compliance within large-scale compliance projects and persists in the compliance reporting within standard operations. On the one hand, an understanding of compliance necessities within the organization is promoted; on the other hand, a reduction of asymmetric information with compliance stakeholders is achieved. To reach this goal, a central reporting must provide a consolidated view of the statuses of different compliance efforts. A concept which could be adapted for this purpose is the balanced scorecard by Kaplan and Norton. This concept has not been analyzed in detail concerning its adequacy for holistic compliance reporting, from compliance projects through to later usage in regular compliance operations. This paper first evaluates whether a holistic compliance reporting can be designed using the balanced scorecard concept. The current status of compliance reporting clearly shows that scorecards are generally accepted as a compliance reporting tool and are already used for corporate governance reporting; additionally, specialized compliance IT solutions exist in the market. After the scorecard's adequacy is thoroughly examined and proved, an example strategy map is defined as the basis for deriving a compliance balanced scorecard. This definition answers the question of how to proceed in designing a compliance reporting tool.
Keywords: Balanced Scorecard, Compliance, Compliance Reporting, Compliance Scorecard.
1910 Development of Genetic-based Machine Learning for Network Intrusion Detection (GBML-NID)
Authors: Wafa' S. Al-Sharafat, Reyadh Naoum
Abstract:
Society has grown to rely on Internet services, and the number of Internet users increases every day. As more and more users become connected to the network, the window of opportunity for malicious users to do damage becomes great and lucrative. The objective of this paper is to incorporate different techniques into a classifier system to detect and classify intrusions from normal network packets. Among several techniques, a Steady State Genetic-based Machine Learning algorithm (SSGBML) is used to detect intrusions, and the Steady State Genetic Algorithm (SSGA), Simple Genetic Algorithm (SGA), a Modified Genetic Algorithm, and the Zeroth Level Classifier system are investigated in this research. SSGA is used as the discovery mechanism instead of SGA, since SGA replaces all old rules with newly produced rules, preventing old good rules from participating in the next rule generation. The Zeroth Level Classifier System plays the role of detector by matching incoming environment messages with classifiers to determine whether the current message is normal or an intrusion, and by receiving feedback from the environment. Finally, in order to attain the best results, the modified SSGA enhances the discovery engine by using fuzzy logic to optimize the crossover and mutation probabilities. The experiments and evaluations of the proposed method were performed with the KDD 99 intrusion detection dataset.
Keywords: MSSGBML, Network Intrusion Detection, SGA, SSGA.
1909 A Novel SVM-Based OOK Detector in Low SNR Infrared Channels
Authors: J. P. Dubois, O. M. Abdul-Latif
Abstract:
The Support Vector Machine (SVM) is a recent class of statistical classification and regression techniques playing an increasing role in detection problems across various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM is applied to an infrared (IR) binary communication system with different channel models, including Ricean multipath fading and a partially developed scattering channel, with additive white Gaussian noise (AWGN) at the receiver. The structure and performance of the SVM in terms of the bit error rate (BER) metric are derived and simulated for these stochastic channel models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of the SVM is then compared to classical binary maximum likelihood detection using a matched filter driven by on-off keying (OOK) modulation. We found that the performance of the SVM is superior to that of the traditional optimal detection schemes used in statistical communication, especially in very low signal-to-noise ratio (SNR) ranges. For large SNR, the performance of the SVM is similar to that of the classical detectors. The implication of these results is that SVM can prove very beneficial to IR communication systems, which notoriously suffer from low SNR, at the cost of increased computational complexity.
Keywords: Least square-support vector machine, on-off keying, matched filter, maximum likelihood detector, wireless infrared communication.
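The detector idea, training an SVM on received OOK samples and comparing its decisions to the transmitted bits, can be illustrated on a plain AWGN channel. The one-sample-per-bit model, the training-set size, and the SNR convention below are deliberate simplifications; the paper additionally models Ricean fading and partially developed scattering.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_bits, snr_db = 5000, 2.0                   # frame length and a low SNR point
bits = rng.integers(0, 2, n_bits)
sigma = 10 ** (-snr_db / 20)                 # noise std for unit "on" amplitude
# one received sample per bit over AWGN (a deliberate simplification)
r = bits + sigma * rng.standard_normal(n_bits)

n_train = 1000                               # pilot symbols for training (assumed)
svm = SVC(kernel="rbf").fit(r[:n_train, None], bits[:n_train])
ber = np.mean(svm.predict(r[n_train:, None]) != bits[n_train:])
print(f"SVM BER at {snr_db} dB: {ber:.3f}")
```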
1908 Modeling Aeration of Sharp Crested Weirs by Using Support Vector Machines
Authors: Arun Goel
Abstract:
The present paper investigates the prediction of the air entrainment rate and aeration efficiency of free overfall jets issuing from a triangular sharp-crested weir by using regression-based modelling. Empirical equations, support vector machine models (with polynomial and radial basis function kernels) and linear regression techniques were applied to the triangular sharp-crested weir data, relating the air entrainment rate and the aeration efficiency to the input parameters, namely drop height, discharge, and vertex angle. A good agreement was observed between the measured values and the values obtained using the empirical equations, the support vector machine (polynomial and RBF) models and the linear regression techniques. The test results demonstrated that the SVM-based models provided acceptable predictions of the measured values with reasonable accuracy, alongside the empirical equations and linear regression techniques, in modelling the air entrainment rate and the aeration efficiency of free overfall jets issuing from a triangular sharp-crested weir. A sensitivity analysis was also performed to study the impact of each input parameter on the output air entrainment rate and aeration efficiency.
Keywords: Air entrainment rate, dissolved oxygen, regression, SVM, weir.
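The two SVM variants compared in the paper correspond to standard support vector regression with polynomial and RBF kernels; a minimal sketch follows, with the kernel hyperparameters assumed rather than taken from the paper.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# X columns: drop height, discharge, vertex angle; y: aeration efficiency
poly_model = make_pipeline(StandardScaler(), SVR(kernel="poly", degree=2, C=10.0))
rbf_model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
# poly_model.fit(X_train, y_train); rbf_model.fit(X_train, y_train)
```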
1907 Usability and Affordances: Examinations of Object-Naming and Object-Task Performance in Haptic Interfaces
Authors: Mia Sorensen
Abstract:
The introduction of haptic elements into graphic user interfaces is becoming more widespread. Since haptics are being introduced rapidly into computational tools, investigating how these models affect human-computer interaction would help define how to integrate and model new modes of interaction. The interest of this paper is to discuss and investigate the issues surrounding haptic and graphic user interface (GUI) designs as separate systems, as well as to understand how they work in tandem. The development of these systems is explored from a psychological perspective, based on how usability is addressed through learning and affordances, as defined by J. J. Gibson. Haptic design can be a powerful tool, aiding intuitive learning. The problem discussed within the text is how haptic interfaces can be integrated within a GUI without a sense of frivolity. Juxtaposing haptics and graphic user interfaces raises issues of motivation: GUIs tend to involve a performatory process, while haptic interfaces use affordances to learn tool use. In a deeper view, it is noted that two modes of perception, foveal and ambient, dictate perception. These two modes were once thought to work in tandem; however, it has been discovered that they work independently of each other. The foveal mode interprets orientation in space, which provides for posture, locomotion, and motor skills, with variations of the sensory information instructing perceptions of object-task performance. It is contended here that object-task performance is a key element in the use of haptic interfaces, because exploratory learning uses affordances in order to use an object without mediating the experience cognitively. It is a direct experience that, through iteration, can lead to skill sets. It is also indicated that object-task performance will not work as efficiently without the use of exploratory or kinesthetic learning practices. Therefore, object-task performance is not as congruently explored in GUIs as it is practiced in haptic interfaces.
Keywords: Affordances, Graphic User Interface, Haptic Interfaces, Tool-Use, Object-Naming, Object-Task Performance
1906 From Experiments to Numerical Modeling: A Tool for Teaching Heat Transfer in Mechanical Engineering
Authors: D. Zabala, Y. Cárdenas, G. Núñez
Abstract:
In this work, the numerical simulation of transient heat transfer in a cylindrical probe is carried out. An experiment was conducted by introducing a steel cylinder into a heating chamber and registering its surface temperature over one hour. In parallel, a mathematical model for one-dimensional transient heat transfer in cylindrical coordinates was solved, considering the boundary conditions of the test. The model was solved using the finite difference method, because the thermal conductivity of the cylindrical steel bar and the convection heat transfer coefficient used in the model are considered temperature-dependent functions, and both conditions prevent the use of the analytical solution. The comparison between theoretical and experimental results showed an average deviation below 2%. It was concluded that numerical methods are useful for solving complex engineering problems. For constant k and h, the experimental methodology used here can serve as a tool for teaching heat transfer in mechanical engineering, using mathematically simplified models with analytical solutions.
Keywords: Heat transfer experiment, thermal conductivity, finite difference, engineering education.
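The described solution, an explicit finite-difference march of one-dimensional transient conduction in cylindrical coordinates with temperature-dependent k and h, can be sketched as follows. The property functions, geometry, and chamber temperature are hypothetical stand-ins, since the abstract does not give them, and spatial gradients of k are neglected for brevity.

```python
import numpy as np

def k(T):  return 14.6 + 0.0127 * T          # W/(m K), hypothetical steel k(T)
def h(T):  return 25.0 + 0.1 * T             # W/(m^2 K), hypothetical h(T)

R, nr = 0.025, 26                            # bar radius (m), radial nodes
rho, cp = 7800.0, 490.0                      # density and specific heat of steel
r = np.linspace(0.0, R, nr)
dr = r[1] - r[0]
dt = 0.02                                    # s, below the explicit stability limit
T = np.full(nr, 25.0)                        # initial temperature (deg C)
T_inf = 600.0                                # heating-chamber temperature

for _ in range(int(3600 / dt)):              # march through one hour of heating
    alpha = k(T) / (rho * cp)
    Tn = T.copy()
    # interior nodes: dT/dt = alpha * (d2T/dr2 + (1/r) dT/dr)
    Tn[1:-1] = T[1:-1] + dt * alpha[1:-1] * (
        (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dr**2
        + (T[2:] - T[:-2]) / (2.0 * dr * r[1:-1]))
    Tn[0] = Tn[1]                            # symmetry at the axis
    # convective surface: -k dT/dr = h (T - T_inf), discretized
    kc, hc = k(T[-1]), h(T[-1])
    Tn[-1] = (kc / dr * Tn[-2] + hc * T_inf) / (kc / dr + hc)
    T = Tn

print(f"surface temperature after 1 h: {T[-1]:.1f} deg C")
```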
1905 The Development of the Quality Management Processes for the Building and Environment of the Basic Education Schools
Authors: Suppara Charoenpoom
Abstract:
The objective of this research was to design and develop a quality management process for school buildings and environment. A mixed quantitative and qualitative research methodology was used. The population sample included 14 directors of primary schools. Two research tools were used: the first comprised an in-depth interview and a questionnaire; the second comprised the Quality Business Process and Quality Work Procedure, with a Key Performance Indicator for each activity. The statistics included mean and standard deviation. The findings for the development of a quality management process for school buildings and environment administration in basic schools consisted of one Quality Business Process (QBP) and seven Quality Work Processes (QWP). The experts' evaluation revealed that the process and implementation of quality management of the school buildings and environment passed the inspection process with consensus, implying that the process is suitable for implementation. Moreover, the level of agreement on the feasibility of implementation had means in the range of 0.64-1.00, which suggests the design of the new plan is acceptable.
Keywords: Process, Building, Environment.
1904 A Critical Review of the Adequacy of EIA Reports-Evidence from Pakistan
Authors: Obaidullah Nadeem, Rizwan Hameed
Abstract:
The preparation of good-quality Environmental Impact Assessment (EIA) reports contributes to enhancing the overall effectiveness of EIA. This component of the EIA process becomes more important where public participation is weak and the competent authority lacks expertise. In Pakistan, EIA became mandatory in July 1994 for every project likely to cause adverse environmental impacts. The competent authority also formulated guidelines for the preparation and review of EIA reports in 1997. However, EIA has yet to prove itself a successful decision-support tool for environmental protection. One of the several reasons for this ineffectiveness is the generally poor quality of EIA reports. This paper critically reviews the EIA reports of some randomly selected projects. Interviews with EIA consultants, project proponents and concerned government officials were also conducted to underpin the root causes of the poor quality of EIA reports. The analysis reveals several inadequacies, particularly in areas relating to the identification, evaluation and mitigation of key impacts and the consideration of alternatives. The paper identifies some opportunities and suggests measures for improving the quality of EIA reports, thereby making EIA an effective tool for environmental protection.
Keywords: Environmental Impact Assessment, EIA Guidelines, EIA Reports, Pakistan.
1903 A Book Cover as an Expression of Conceptualization and a Tool of Social Identity Construction: The Interpretation Based on the Example of G. Ritzer's book McDonaldization of Society
Authors: Jiří Pavelka
Abstract:
The study is based on the assumption that media products are appropriate subjects for the exploration of social and cultural identities as a keystone of the value orientations of their authors, producers and target audiences. The research object of the study is the title page of the book cover of a professional publication, which serves as a medium of marketing, scientific and intercultural communication and is the result of semiotic and intercultural transfer. The study aims to answer the question of whether a book cover is an expression of conceptualization and a tool for social identity construction. It attempts to determine what value orientations and what concepts of social and cultural identities are hidden in the narrative structures of the book cover of the Czech translation of G. Ritzer's The McDonaldization of Society (1993), issued in the Czech Republic in 1996, after the fall of the iron curtain.
Keywords: Social and cultural identity, book cover, marketing communication, semiotic and intercultural transfer, narrative structure, the McDonaldisation of Society.
1902 Intelligent Modeling of the Electrical Activity of the Human Heart
Authors: Lambros V. Skarlas, Grigorios N. Beligiannis, Efstratios F. Georgopoulos, Adam V. Adamopoulos
Abstract:
The aim of this contribution is to present a new approach to modeling the electrical activity of the human heart. A recurrent artificial neural network is used to exhibit a subset of the dynamics of the electrical behavior of the human heart. The proposed model can also be used, when integrated, as a diagnostic tool for the human heart system. What makes this approach unique is the fact that every model is developed from the physiological measurements of an individual. This kind of approach is very difficult to apply successfully in many modeling problems because of the complexity and entropy of the free variables describing the complex system. Differences between the modeled variables and the variables of an individual, measured at specific moments, can be used for diagnostic purposes. The sensor fusion used to optimize the utilization of biomedical sensors is another point on which this paper focuses. Sensor fusion is known for its advantages in applications such as the control and diagnostics of mechanical and chemical processes.
Keywords: Artificial Neural Networks, Diagnostic System, Health Condition Modeling Tool, Heart Diagnostics Model, Heart Electricity Model.
1901 Feasibility of Risk Assessment for Type 2 Diabetes in Community Pharmacies Using Two Different Approaches: A Pilot Study in Thailand
Authors: Thitaporn Thoopputra, Tipaporn Pongmesa, Shuchuen Li
Abstract:
Aims: To evaluate the application of a non-invasive diabetes risk assessment tool in the community pharmacy setting. Methods: The Thai diabetes risk score was applied to assess individuals at risk of developing type 2 diabetes. Interactive computer-based risk screening (IT) and paper-based risk screening (PT) tools were applied. Participants aged over 25 years with no known diabetes were recruited in six participating pharmacies. Results: A total of 187 clients participated; their mean age (+SD) was 48.6 (+10.9) years, and 35% were at high risk. The mean willingness-to-pay for the service fee in the IT group was significantly higher than in the PT group (p=0.013). No significant difference was observed in satisfaction between the groups. Conclusions: A non-invasive risk assessment tool, whether paper-based or computer-based, can be applied in community pharmacies to support the expanding role of pharmacists in chronic disease management. Long-term follow-up is needed to determine the impact of its application on clinical, humanistic and economic outcomes.
Keywords: Community pharmacy, intervention, prevention, risk assessment, type 2 diabetes.
1900 Identifying Autism Spectrum Disorder Using Optimization-Based Clustering
Authors: Sharifah Mousli, Sona Taheri, Jiayuan He
Abstract:
Autism spectrum disorder (ASD) is a complex developmental condition involving persistent difficulties with social communication, restricted interests, and repetitive behavior. The challenges associated with ASD can interfere with an affected individual's ability to function in social, academic, and employment settings. Although, to the best of our knowledge, there is no effective medication known to treat ASD, early intervention can significantly improve an affected individual's overall development. Hence, an accurate diagnosis of ASD at an early phase is essential. The use of machine learning approaches improves and speeds up the diagnosis of ASD. In this paper, we focus on the application of unsupervised clustering methods to ASD, as a large volume of ASD data generated through hospitals, therapy centers, and mobile applications has no pre-existing labels. We conduct a comparative analysis using eight clustering approaches: K-means, agglomerative hierarchical, model-based, fuzzy C-means, affinity propagation, self-organizing maps, and linear vector quantisation, as well as the recently developed optimization-based clustering (COMSEP-Clust) approach. We evaluate the performances of the clustering methods extensively on real-world ASD datasets encompassing different age groups: toddlers, children, adolescents, and adults. Our experimental results suggest that the COMSEP-Clust approach outperforms the other seven methods in recognizing ASD with well-separated clusters.
Keywords: Autism spectrum disorder, clustering, optimization, unsupervised machine learning.
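A slice of the comparative analysis can be sketched with off-the-shelf implementations. Only three of the eight surveyed methods are shown, and the silhouette score replaces the paper's evaluation against labeled ASD screening data, so this is an assumption-laden outline rather than a reproduction.

```python
from sklearn.cluster import KMeans, AgglomerativeClustering, AffinityPropagation
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

def compare_clusterings(X, n_clusters=2):
    """Compare three clustering methods on feature matrix X by silhouette."""
    X = StandardScaler().fit_transform(X)
    methods = {
        "k-means": KMeans(n_clusters=n_clusters, n_init=10),
        "agglomerative": AgglomerativeClustering(n_clusters=n_clusters),
        "affinity propagation": AffinityPropagation(),  # chooses its own k
    }
    for name, model in methods.items():
        labels = model.fit_predict(X)
        if len(set(labels)) > 1:              # silhouette needs >= 2 clusters
            print(f"{name}: silhouette = {silhouette_score(X, labels):.3f}")
```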
1899 Toward a Use of Ontology to Reinforcing Semantic Classification of Message Based On LSA
Authors: S. Lgarch, M. Khalidi Idrissi, S. Bennani
Abstract:
For best collaboration, asynchronous tools, and particularly discussion forums, are the most used thanks to their flexibility in terms of time. To convey only the messages that belong to a theme of interest to the tutor, in order to help him during his tutoring work, a tool for classifying these messages is indispensable. For this purpose we have proposed a semantic classification tool for the messages of a discussion forum, based on LSA (Latent Semantic Analysis), which includes a thesaurus to organize the vocabulary. The benefits offered by a formal ontology can overcome the insufficiencies that a thesaurus exhibits during its use, which encourages us to use one in our semantic classifier. In this work we propose the use of some functionalities that an OWL ontology offers. We then explain how functionalities like "ObjectProperty", "SubClassOf" and the "Datatype" property make our classification more intelligent by way of integrating new terms. The new terms found are generated based on the first terms introduced by the tutor and the semantic relations described by the OWL formalism.
Keywords: Classification of messages, collaborative communication tools, discussion forum, e-learning, formal description, latente semantic analysis, ontology, owl, semantic relations, semantic web, thesaurus, tutoring.
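The LSA core of the classifier, before any thesaurus or OWL-based term expansion, can be sketched as follows; the toy messages and themes are invented, and the ontology integration that is the paper's contribution is deliberately omitted.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.pipeline import make_pipeline

# forum messages and the tutor's themes of interest (toy data)
messages = ["how do I submit the homework", "the server is down again",
            "question about assignment deadline", "network outage tonight"]
themes = ["homework and assignments", "technical problems"]

lsa = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2))
doc_vecs = lsa.fit_transform(messages + themes)
msg_vecs, theme_vecs = doc_vecs[:len(messages)], doc_vecs[len(messages):]

sims = cosine_similarity(msg_vecs, theme_vecs)
for msg, row in zip(messages, sims):
    print(msg, "->", themes[row.argmax()])   # assign each message to a theme
```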
1898 Antecedents of Word-of-Mouth for Meat with Traceability: Evidence from Thai Consumers
Authors: Kawpong Polyorat, Nathamon Buaprommee
Abstract:
Because of the outbreaks of mad cow disease and bird flu, consumers have become more concerned with the quality and safety of meat and poultry. As a consequence, meat traceability has been implemented as a tool to raise standards in the meat production industry. In Thailand, while traceability is relatively common within the manufacturer-wholesaler-retailer cycle, it is rarely used as a marketing tool specifically designed to persuade consumers, who are the actual meat end-users. Therefore, the present study attempts to understand what influences consumers to spread word-of-mouth (WOM) regarding meat with traceability by conducting a study in Thailand, where research in this area is rather scant. Data were collected from one hundred and sixty-seven consumers in the northeastern region and analyzed with SEM. The study results reveal that the perceived usefulness of the traceability system, social norms, and product class knowledge are significant antecedents of consumers spreading positive word-of-mouth regarding meat with a traceability system. A number of theoretical and managerial implications as well as future study directions are offered at the end of this study report.
Keywords: Perceived usefulness, product knowledge, social norms, traceability, word-of-mouth.