Search results for: bioinformatics tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1706

1286 W3-Miner: Mining Weighted Frequent Subtree Patterns in a Collection of Trees

Authors: R. AliMohammadzadeh, M. Haghir Chehreghani, A. Zarnani, M. Rahgozar

Abstract:

Mining frequent tree patterns has many useful applications in XML mining, bioinformatics, network routing, etc. Most frequent subtree mining algorithms (e.g., FREQT, TreeMiner and CMTreeMiner) rely on the anti-monotone property in the candidate subtree generation phase. However, none of these algorithms has verified that this property actually holds for tree-structured data. In this research it is shown that anti-monotonicity does not generally hold when weighted support is used in tree pattern discovery. As a result, tree mining algorithms based on this property may miss some valid frequent subtree patterns in a collection of trees. In this paper, we investigate the correctness of the anti-monotone property for the problem of weighted frequent subtree mining. In addition, we propose W3-Miner, a new algorithm for the full extraction of frequent subtrees. The experimental results confirm that W3-Miner finds frequent subtrees that the previously proposed algorithms are not able to discover.
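
To see why anti-monotonicity can fail under weighted support, consider the following minimal sketch. It simplifies patterns to label sets rather than embedded subtrees, and the weighting scheme is an invented assumption, not W3-Miner's definition:

```python
# Weighted support here: (number of trees containing the pattern) multiplied by
# the mean weight of the pattern's labels -- an assumed, illustrative measure.
weights = {"a": 0.1, "b": 0.9}
trees = [{"a", "b"}, {"a", "b"}]  # two trees, each containing labels a and b

def weighted_support(pattern):
    count = sum(pattern <= tree for tree in trees)
    return count * sum(weights[l] for l in pattern) / len(pattern)

print(weighted_support({"a"}))       # 2 * 0.1 = 0.2 -> infrequent at threshold 0.5
print(weighted_support({"a", "b"}))  # 2 * 0.5 = 1.0 -> frequent at threshold 0.5
# Pruning the pattern {a} under the anti-monotone assumption would wrongly
# discard its frequent extension {a, b}.
```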

Keywords: Semi-Structured Data Mining, Anti-Monotone Property, Trees.

1285 A Novel Machining Signal Filtering Technique: Z-notch Filter

Authors: Nuawi M. Z., Lamin F., Ismail A. R., Abdullah S., Wahid Z.

Abstract:

A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor and the machine environment. By correlating the noise components with the measured machining signal, the components of interest, which are less contaminated by noise, can be extracted; the filtered signal is therefore more reliable to analyse in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal: a larger scattering space and a higher value of Z∞ indicate that the signal is more heavily interrupted by noise. The method can therefore be utilised as a proactive tool for evaluating the noise content of a signal, which is as important as its elimination, especially for machining fault diagnosis. The Z-notch filtering technique reliably extracted the noise components from the measured machining signal with high efficiency: even though the measured signal was exposed to heavy noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be recovered. The noise interference that would otherwise alter the original signal features and deteriorate the useful sensory information can thus be eliminated.
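
The paper's Z-notch technique itself is not reproduced in the abstract; as a conventional point of comparison, a standard IIR notch filter that rejects one identified noise frequency can be sketched with SciPy (the sampling rate, noise frequency and signals are illustrative assumptions):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 10_000.0          # sampling rate [Hz] (assumed)
f_noise = 50.0         # identified noise component to reject, e.g. mains hum (assumed)

t = np.arange(0, 1.0, 1 / fs)
machining = np.sin(2 * np.pi * 800 * t)                    # stand-in tool/workpiece signal
noisy = machining + 0.8 * np.sin(2 * np.pi * f_noise * t)  # noise-contaminated measurement

b, a = iirnotch(f_noise, Q=30.0, fs=fs)   # narrow notch centred on the noise frequency
filtered = filtfilt(b, a, noisy)          # zero-phase filtering preserves signal shape

# Residual deviation from the clean signal shrinks after filtering.
print(np.std(noisy - machining), np.std(filtered - machining))
```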

Keywords: Digital signal filtering, I-kaz method, Machining monitoring, Noise cancelling, Sound.

1284 PmSPARQL: Extended SPARQL for Multi-paradigm Path Extraction

Authors: Thabet Slimani, Boutheina Ben Yaghlane, Khaled Mellouli

Abstract:

In the last few years, the Semantic Web has gained scientific acceptance as a means of identifying relationships in knowledge bases, widely known as semantic associations. Querying complex relationships between entities is a strong requirement for many applications in analytical domains; in bioinformatics, for example, it is critical to extract exchanges between proteins. Currently, such queries typically return the paths between connected entities in the data graph. However, they do not always satisfy the user's need for the best association, or a limited set of best associations, because they enumerate all existing paths but ignore path evaluation. In this paper, we present an approach for supporting association discovery queries. Our proposal includes (i) a query language, PmSPARQL, which provides multi-paradigm query expressions for association extraction, and (ii) quantification measures that ease the process of association ranking. The originality of our proposal is demonstrated by a performance evaluation of our approach on real-world datasets.
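
PmSPARQL's syntax is not given in the abstract; purely to illustrate path evaluation versus plain path enumeration (the graph, weights and scoring below are invented assumptions), candidate association paths can be ranked rather than merely listed:

```python
import networkx as nx

# Hypothetical knowledge graph; edge weights encode association strength.
G = nx.DiGraph()
G.add_edge("proteinA", "complex1", weight=0.9)
G.add_edge("complex1", "proteinB", weight=0.8)
G.add_edge("proteinA", "pathwayX", weight=0.4)
G.add_edge("pathwayX", "proteinB", weight=0.3)

def score(path):
    # One possible quantification measure: product of edge weights along the path.
    s = 1.0
    for u, v in zip(path, path[1:]):
        s *= G[u][v]["weight"]
    return s

paths = list(nx.all_simple_paths(G, "proteinA", "proteinB", cutoff=4))
for p in sorted(paths, key=score, reverse=True):
    print(p, round(score(p), 3))   # best associations first, not just all paths
```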

Keywords: Association extraction, query language, relationships, knowledge base, multi-paradigm query.

1283 Correction of Frequent English Writing Errors by Using Coded Indirect Corrective Feedback and Error Treatment: The Case of Reading and Writing English for Academic Purposes II

Authors: Chaiwat Tantarangsee

Abstract:

The purposes of this study are 1) to study the frequent English writing errors of students registered in the course Reading and Writing English for Academic Purposes II, and 2) to find out the results of writing error correction using coded indirect corrective feedback and writing error treatments. The sample comprised 28 second-year English major students of the Faculty of Education, Suan Sunandha Rajabhat University. The tool for the experimental study was the lesson plan of the course, and the tool for data collection was a set of 4 writing tests of short texts. The findings disclose that the frequent English writing errors found in this course comprise 7 types of grammatical errors, namely sentence fragments, subject-verb agreement, wrong verb tense form, singular or plural noun endings, run-on sentences, wrong verb pattern form, and lack of parallel structure. Moreover, the results of writing error correction using coded indirect corrective feedback and error treatment reveal an overall reduction of the frequent English writing errors and an increase in students' achievement in the writing of short texts, significant at the .05 level.

Keywords: Coded indirect corrective feedback, error correction, error treatment.

1282 A Case Study on the Value of Corporate Social Responsibility Systems

Authors: José M. Brotons, Manuel E. Sansalvador

Abstract:

The relationship between Corporate Social Responsibility (CSR) and financial performance (FP) is a subject of great interest that has not yet been resolved. In this work, we have developed a new and original tool to measure this relationship by quantifying the value contributed to companies that are committed to CSR. The theoretical model used is the fuzzy discounted cash flow method. Two scenarios are considered: in the first, the company has implemented the IQNet SR10 certification; in the second, it has not. For the first scenario, the growth rate used over the time horizon is the rate maintained by the company after obtaining the IQNet SR10 certificate. For the second, both the company's growth rate prior to the implementation of the certification and the evolution of the sector are taken into account. By using triangular fuzzy numbers, it is possible to deal adequately with each company's forecasts as well as with the information corresponding to the sector. Once the annual growth rate of sales is obtained, the profit and loss accounts are generated from the estimated annual sales; the remaining elements of the account are obtained by regression against net sales. The difference between these two valuations, made in a fuzzy environment, yields the value of the IQNet SR10 certification. Although this study presents an innovative methodology for quantifying the relationship between CSR and FP, the authors are aware that only one company has been analyzed. This is the main limitation of the study, and it opens an interesting line for future research: broadening the sample of companies.
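
As a minimal sketch of the fuzzy discounted cash flow idea (triangular fuzzy numbers written as (pessimistic, most likely, optimistic) triples; all figures invented, not the paper's data), the certification's value is the fuzzy difference between the two scenario valuations:

```python
import numpy as np

def fuzzy_npv(cash_flows, rate):
    # Discount triangular fuzzy cash flows with a crisp rate, component by component.
    total = np.zeros(3)
    for t, cf in enumerate(cash_flows, start=1):
        total += np.asarray(cf, dtype=float) / (1 + rate) ** t
    return total

def fuzzy_sub(x, y):
    # Standard triangular fuzzy subtraction: (a1 - c2, b1 - b2, c1 - a2).
    return (x[0] - y[2], x[1] - y[1], x[2] - y[0])

with_cert = [(90, 100, 115), (95, 110, 130), (100, 120, 145)]   # invented forecasts
without_cert = [(85, 95, 105), (88, 100, 112), (90, 105, 120)]

value = fuzzy_sub(fuzzy_npv(with_cert, 0.08), fuzzy_npv(without_cert, 0.08))
print(value)   # triangular fuzzy value attributable to the IQNet SR10 certification
```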

Keywords: Corporate social responsibility, case study, financial performance, company valuation.

1281 Analysis and Classification of HIV-1 Subtype Viruses by AR Model through Artificial Neural Networks

Authors: O. Yavuz, L. Ozyilmaz

Abstract:

The HIV-1 genome is highly heterogeneous, and because of this variation its features span a wide range. For this reason, the infectivity of the virus changes depending on the chemokine receptor used: R5 HIV viruses use the CCR5 coreceptor, X4 viruses use CXCR4, and R5X4 viruses can utilize both coreceptors. Recently, bioinformatics studies have sought to classify R5X4 viruses using experiments on the HIV-1 genome. In this study, R5X4-type HIV viruses were classified using an Auto-Regressive (AR) model together with Artificial Neural Networks (ANNs). The statistical data of R5X4, R5 and X4 viruses were analyzed using signal processing methods and ANNs. Accessible residues of these virus sequences were obtained and modeled by an AR model, since the residue sequences are long and differ in length from each other. The pre-processed data were then used to train various ANN structures for identifying R5X4 viruses, and ROC analysis was applied to the ANNs to assess their real performance. The results indicate that R5X4 viruses were successfully classified, with high sensitivity and specificity in both training and testing ROC analyses for the RBF network, which gives the best performance among the ANN structures.
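
The AR-feature step can be sketched in a few lines (the numeric encoding of residues and the model order are assumptions): a least-squares AR fit maps sequences of different lengths to fixed-size coefficient vectors suitable as ANN inputs:

```python
import numpy as np

def ar_coefficients(x, order=4):
    # Least-squares AR(order) fit: x[t] is predicted from x[t-1] ... x[t-order].
    X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a   # fixed-length feature vector, independent of sequence length

# Hypothetical numeric encoding of accessible residues (e.g., a physico-chemical scale).
seq = np.array([0.2, 1.1, -0.3, 0.8, 0.5, -0.9, 1.3, 0.4, 0.1, -0.2, 0.7, 0.9])
print(ar_coefficients(seq))   # would be fed to an ANN (e.g., RBF) as R5/X4/R5X4 features
```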

Keywords: Auto-Regressive Model, HIV, Neural Networks, ROC Analysis.

1280 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece

Authors: Panagiotis Karadimos, Leonidas Anthopoulos

Abstract:

Predicting the actual cost and duration of construction projects remains a persistent problem for the construction sector. This paper addresses the problem with modern methods and data available from past public construction projects: 39 bridge projects constructed in Greece, with similar available data, were examined. Considering each project's attributes together with the actual cost and the actual duration, correlation analysis is performed and the most appropriate predictive project variables are defined. Additionally, the most efficient subgroup of variables is selected with the WEKA application, through its attribute selection function. The selected variables are then used as input neurons for neural network models, which are built with the FANN Tool application. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was likewise based on the budgeted cost and the quantity of deck concrete.
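
FANN Tool itself is not reproduced here; a rough scikit-learn equivalent of the two-input cost network (budgeted cost and deck concrete quantity, with invented training values) looks like this:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Invented rows: [budgeted cost (M EUR), deck concrete (m3)] -> actual cost (M EUR).
X = np.array([[1.2, 410], [2.5, 890], [0.9, 300], [3.1, 1200], [1.8, 640], [2.2, 760]])
y = np.array([1.45, 2.90, 1.00, 3.60, 2.10, 2.60])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([[2.0, 700]]))   # predicted actual cost for a new bridge project
```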

Keywords: Actual cost and duration, attribute selection, bridge projects, neural networks, predicting models, FANN TOOL, WEKA.

1279 Info-participation of the Disabled Using the Mixed Preference Data in Improving Their Travel Quality

Authors: Y. Duvarci, S. Mizokami

Abstract:

Today, the preferences and participation of transportation-disadvantaged (TD) groups such as the elderly and disabled are still lacking in transportation planning decision-making, and their reactions to certain types of policies are not well known; a clear methodology is therefore needed. This study aimed to develop a method for extracting the preferences of the disabled to be used in the policy-making stage, one that can also guide future estimations. The method combines cluster analysis and data filtering, using data from the city of Arao (Japan). It proceeds as follows: the TD group is defined with the cluster analysis tool; their travel preferences are tabulated from the household surveys by policy variable-impact pairs, zones, and trip purposes; and the final outcome is the preference probabilities of the disabled. The preferences vary by trip purpose. For work trips, accessibility and transit system quality policies dominate, with accompanying impacts of a modal shift towards public modes, decreasing travel costs, and an increased trip rate; for social trips, the same accessibility and transit system policies lead to the same mode shift impact, together with the travel quality policy area leading to an increased trip rate. These results indicate which policies to focus on, and can be used for scenario generation in models or for any other planning purpose as a decision support tool.
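
A compact sketch of the two-step idea (synthetic survey records; the clustering features and probability definition are assumptions, not the Arao data): cluster respondents, filter the TD cluster, and estimate a preference probability as a conditional frequency:

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic household-survey rows: [age, mobility limitation score, car access].
X = np.array([[72, 0.9, 0], [35, 0.1, 1], [68, 0.8, 0], [29, 0.0, 1],
              [75, 0.7, 0], [41, 0.2, 1], [80, 0.9, 0], [33, 0.1, 1]])
# Stated reaction to a transit-quality policy: 1 = would shift to public modes.
reaction = np.array([1, 0, 1, 0, 1, 0, 1, 1])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
td_cluster = labels[X[:, 1].argmax()]   # cluster of the most mobility-limited respondent
print(reaction[labels == td_cluster].mean())   # preference probability of the TD group
```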

Keywords: Transportation Disadvantaged, Disabled, Mixed Preference, Stated Preference Data.

1278 Data Mining for Cancer Management in Egypt Case Study: Childhood Acute Lymphoblastic Leukemia

Authors: Nevine M. Labib, Michael N. Malek

Abstract:

Data mining aims at discovering knowledge from data and presenting it in a form that is easily comprehensible to humans. One useful application in Egypt is cancer management, especially the management of Acute Lymphoblastic Leukemia (ALL), the most common type of cancer in children. This paper discusses the process of designing a prototype that can help in the management of childhood ALL, which has great significance in the health care field and a social impact by helping to decrease the rate of the disease among children in Egypt. It also provides valuable information about the distribution and segmentation of ALL in Egypt, which may be linked to possible risk factors. Undirected knowledge discovery is used since, in this research project, there is no target field, as the data provided are mainly subjective; this is done in order to quantify the subjective variables. The computer is therefore asked to identify significant patterns in the provided medical data about ALL. This is achieved by collecting the data necessary for the system, determining the data mining technique to be used, and choosing the most suitable implementation tool for the domain. The research makes use of the data mining tool Clementine to apply the decision trees technique, fed with data extracted from real-life cases taken from specialized cancer institutes. Relevant medical case details, such as patient medical history and diagnosis, are analyzed, classified, and clustered in order to improve disease management.
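
Clementine is proprietary software; as a stand-in on invented case records, the decision-tree step might look like this with scikit-learn:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented records: [age at diagnosis, WBC count (x10^9/L), region code].
X = np.array([[4, 12.0, 1], [6, 85.0, 2], [3, 9.5, 1], [8, 120.0, 3],
              [5, 15.0, 2], [7, 95.0, 3], [2, 7.0, 1], [9, 140.0, 2]])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])   # hypothetical risk-group labels

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["age", "wbc", "region"]))   # readable rules
```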

Keywords: Data Mining, Decision Trees, Knowledge Discovery, Leukemia.

1277 Study on Numerical Simulation Applied to Moisture Buffering Design Method – The Case Study of Pine Wood in a Single Zone Residential Unit in Taiwan

Authors: Y.C. Yeh, Y.S. Tsay, C.M. Chiang

Abstract:

In a good green building design project, designers should consider not only energy consumption but also the health and comfort needs of the inhabitants. In recent years, the Taiwan government has paid attention to both carbon reduction and indoor air quality issues, as reflected in the Building Codes and other regulations. Taiwan has a hot and humid climate, and dampness in buildings leads to significant microbial pollution and building damage; high temperature and humidity therefore present a serious indoor air quality issue. The interactions between vapor transfer and energy fluxes are essential for the whole-building Heat, Air and Moisture (HAM) response, but a simulation tool with short calculation times, adequate accuracy and a practical interface is needed for real building design processes. In this research, we consider the vapor transfer behavior of building materials as well as the temperature, humidity and energy consumption in a building space. The simulation is based on the Effective Moisture Penetration Depth (EMPD) method, which is well suited to practical building design, and was performed with EnergyPlus, a simulation tool developed by the DOE, to simulate the indoor moisture variation in a one-zone residential unit.
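
EnergyPlus's EMPD implementation is not reproduced here; in the same spirit, a heavily simplified two-node moisture balance (all coefficients invented) shows how a buffering surface layer damps indoor humidity swings:

```python
dt = 60.0                       # time step [s]
V, n_ach = 36.0, 0.5            # room volume [m3] and air changes per hour (assumed)
A, h_m = 40.0, 2e-3             # buffering area [m2], moisture transfer coeff. [m/s] (assumed)
C_buf = 0.5                     # effective buffer capacity of the thin layer (assumed)

w_out, gen = 0.008, 5e-6        # outdoor humidity [kg/m3], vapor generation [kg/s] (assumed)
w_air = w_surf = 0.008          # initial humidity of room air and buffer layer

for _ in range(int(24 * 3600 / dt)):                  # simulate one day
    flow = n_ach / 3600.0 * V                         # ventilation airflow [m3/s]
    exchange = A * h_m * (w_air - w_surf)             # air <-> buffer moisture flux [kg/s]
    w_air += (flow * (w_out - w_air) + gen - exchange) / V * dt
    w_surf += exchange / (A * C_buf) * dt             # large C_buf (e.g., pine) reacts slowly

print(w_air, w_surf)   # the buffer keeps w_air closer to w_surf than a bare room would
```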

Keywords: Effective Moisture Penetration Depth Method, Moisture Buffering Effect, Interior Material, Green Material, EnergyPlus

1276 Additive Friction Stir Manufacturing Process: Interest in Understanding Thermal Phenomena and Numerical Modeling of the Temperature Rise Phase

Authors: A. Lauvray, F. Poulhaon, P. Michaud, P. Joyot, E. Duc

Abstract:

Additive Friction Stir Manufacturing, or AFSM, is a new industrial process that follows the emergence of friction-based processes. It is a solid-state additive process using the energy produced by friction at the interface between a rotating non-consumable tool and a substrate; friction depends on various parameters such as axial force, rotation speed and friction coefficient. The feed material is a metallic rod that flows through a hole in the tool. There is still a lack of understanding of the physical phenomena taking place during the process. This research aims at a better understanding and implementation of the AFSM process through numerical simulation and experimental validation performed on a prototype effector; such an approach is considered a promising way to study the influence of the process parameters and finally identify a relevant process window. The deposition of material in AFSM takes place in several phases: in chronological order, the docking phase, the dwell time phase, the deposition phase, and the removal phase. The present work focuses on the dwell time phase, during which the temperature of the system rises due to pure friction. An analytic model of frictional heat generation takes the rotational speed and the contact pressure as its main parameters. Another influential parameter is the friction coefficient, assumed to be variable due to the self-lubrication of the system as temperature rises and to the smoothing of the contact surfaces' roughness over time. Through numerical modeling followed by experimental validation, this study examines the influence of the various input parameters on the dwell time phase. Rotation speed, temperature, spindle torque and axial force are the main parameters monitored during the experiments and serve as reference data for the calibration of the numerical model. This research shows that the geometry of the tool, as well as fluctuations of input parameters such as axial force and rotational speed, strongly influence the temperature reached and/or the time required to reach the target temperature. The main outcome is the prediction of a process window, which is a key result for a more efficient process implementation.
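
The paper's full thermal model is not given in the abstract, but the classical friction-heat estimate for a rotating circular contact under uniform pressure, Q = (2/3)·π·μ·p·ω·R³, already makes the parameter dependencies concrete (all numbers invented):

```python
import math

mu = 0.3          # friction coefficient (assumed constant here; variable in the study)
R = 0.008         # tool contact radius [m] (assumed)
rpm = 1200.0      # rotation speed (assumed)
F = 4000.0        # axial force [N] (assumed)

p = F / (math.pi * R**2)              # uniform contact pressure [Pa]
omega = rpm * 2.0 * math.pi / 60.0    # rotational speed [rad/s]
Q = (2.0 / 3.0) * math.pi * mu * p * omega * R**3   # frictional heat input [W]
print(round(Q, 1), "W")   # rises linearly with axial force and rotation speed
```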

Keywords: numerical model, additive manufacturing, frictional heat generation, process

1275 Structuring and Visualizing Healthcare Claims Data Using Systems Architecture Methodology

Authors: Inas S. Khayal, Weiping Zhou, Jonathan Skinner

Abstract:

Healthcare delivery systems around the world are in crisis. The need to improve health outcomes while decreasing healthcare costs has led to an imminent call to action to transform the healthcare delivery system. While bioinformatics and biomedical engineering have primarily focused on biological-level data and biomedical technology, there is clear evidence of the importance of the delivery of care for patient outcomes. Classic singular decomposition approaches from reductionist science are not capable of explaining complex systems, so approaches and methods from systems science and systems engineering are utilized to structure healthcare delivery system data. Specifically, systems architecture is used to develop a multi-scale and multi-dimensional characterization of the healthcare delivery system, defined here as the Healthcare Delivery System Knowledge Base. This paper is the first to contribute a new method of structuring and visualizing a multi-dimensional and multi-scale healthcare delivery system using systems architecture in order to better understand healthcare delivery.

Keywords: Health informatics, systems thinking, systems architecture, healthcare delivery system, data analytics.

1274 A Case Study on Optimization of Contractor’s Financing through Allocation of Subcontractors

Authors: Helen S. Ghali, Engy Serag, A. Samer Ezeldin

Abstract:

In many countries, the construction industry relies heavily on outsourcing models to execute projects and expand businesses to fit the diverse market. Such extensive integration of subcontractors is becoming an influential factor in the contractor's cash flow management; accordingly, subcontractors' financial terms are pivotal components of the well-being of the contractor's cash flow. The aim of this research is to study the contractor's cash flow with respect to the owner's and subcontractors' payment management plans, considering variable advance payments, payment frequency, and lag and retention policies. The model provides contractors with a decision support tool that can assist in selecting the optimum subcontracting plan to minimize the contractor's financing limits and optimize profit. It is built using Microsoft Excel VBA, with a genetic algorithm as the optimization tool. Three objective functions are investigated: minimizing the highest negative overdraft value, minimizing the net present worth of the overdraft, and maximizing the project net profit. The model is validated on a full-scale project which includes both self-performed and subcontracted work packages. The results show potential in optimizing the contractor's negative cash flow values while assisting contractors in selecting suitable subcontractors to achieve the objective function.
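
A stripped-down sketch of the cash-flow bookkeeping being optimized (monthly buckets, invented figures; the genetic algorithm itself is omitted): shift subcontractor payments by their lag, withhold retention, and track the worst overdraft:

```python
import numpy as np

months = 6
owner_receipts = np.array([0, 300, 300, 300, 300, 400.0])   # invented, in $1000s
direct_costs = np.array([150, 150, 100, 100, 50, 0.0])      # self-performed work
sub_work_done = np.array([200, 200, 150, 150, 100, 0.0])    # subcontracted work

lag, retention = 2, 0.10                                    # one candidate payment plan
sub_payments = np.zeros(months)
sub_payments[lag:] = sub_work_done[: months - lag] * (1 - retention)
sub_payments[-1] += sub_work_done.sum() * retention         # release retention at the end

cash = np.cumsum(owner_receipts - direct_costs - sub_payments)
print(cash, "worst overdraft:", cash.min())
# A GA would search lag/retention/frequency per subcontractor to minimize this
# overdraft (or its net present worth) or to maximize project profit.
```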

Keywords: Cash flow optimization, payment plan, procurement management, subcontracting plan.

1273 Technical, Environmental, and Financial Assessment for the Optimal Sizing of a Run-of-River Small Hydropower Project: A Case Study in Colombia

Authors: David Calderón Villegas, Thomas Kalitzky

Abstract:

Run-of-river (RoR) hydropower projects represent a viable, clean, and cost-effective alternative to dam-based plants and provide decentralized power production. However, the cost-effectiveness of RoR schemes depends on the proper selection of site and design flow, which is a challenging task requiring multivariate analysis. This study presents the development of an investment decision support tool for assessing the optimal size of an RoR scheme considering technical, environmental, and cost constraints, with the net present value (NPV) from a project perspective used as the objective function. The tool has been tested by applying it to an actual RoR project recently proposed in Colombia. The results show that the financial optimum does not match the flow that maximizes energy generation from the river's available flow: for the case study, the energy-maximizing flow is 5.1 m³/s, whereas 2.1 m³/s maximizes the investors' NPV. Finally, a sensitivity analysis determines the NPV as a function of changes in the debt rate, electricity prices, and CapEx. Even in the worst-case scenario, the optimal size represents a positive business case, with an NPV of USD 2.2 million and an internal rate of return (IRR) 1.5 times higher than the discount rate.
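
A minimal version of the sizing sweep (site and market figures all invented): compute revenue and NPV for each candidate design flow and pick the financial optimum, which need not coincide with the energy-maximizing flow:

```python
import numpy as np

rho, g, H, eta = 1000.0, 9.81, 45.0, 0.85   # density, gravity, head [m], efficiency (assumed)
price = 60.0                                # electricity price [$/MWh] (assumed)
discount, life = 0.10, 25                   # discount rate, project life [years] (assumed)

def capacity_factor(q):
    # Crude stand-in for the flow-duration curve: big designs run at part flow more often.
    return 0.6 * min(1.0, 3.0 / q)

def npv(q):
    power_mw = rho * g * q * H * eta / 1e6
    revenue = power_mw * 8760 * capacity_factor(q) * price     # annual revenue [$]
    capex = 1.0e6 * q ** 0.8                                   # invented cost curve
    annuity = sum(revenue / (1 + discount) ** t for t in range(1, life + 1))
    return annuity - capex

flows = np.arange(1.0, 6.1, 0.5)
best = max(flows, key=npv)
print(best, "m3/s ->", round(npv(best) / 1e6, 2), "MUSD")
```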

Keywords: small hydropower, renewable energy, RoR schemes, optimal sizing, financial analysis

1272 Decision Support System for Flood Crisis Management using Artificial Neural Network

Authors: Muhammad Aqil, Ichiro Kita, Akira Yano, Nishiyama Soichi

Abstract:

This paper presents an alternative approach that uses an artificial neural network to simulate flood level dynamics in a river basin. The algorithm was developed in a decision support system environment in order to enable users to process the data. The decision support system is found to be useful due to its interactive nature, flexible approach and evolving graphical features, and can be adopted in any similar situation for predicting flood levels. The main data processing includes gauging station selection, input generation, lead-time selection/generation, and length of prediction. The program enables users to process the flood level data, to train and test the model using various inputs, and to visualize the results; its code consists of a set of files which can be modified for other purposes. The results indicate that the decision support system achieves encouraging results for the river basin under examination: the model predictions agreed satisfactorily with the observed data, forecasting the flood level up to 5 hours in advance with reasonable accuracy. The program may also serve as a tool for real-time flood monitoring and process control.
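
The core of the input generation and lead-time selection steps is a lagged design matrix; a sketch on synthetic gauge readings (the lag count and lead time are arbitrary choices here):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged(series, n_lags, lead):
    # X holds the last n_lags readings; y is the level `lead` steps ahead.
    X, y = [], []
    for t in range(n_lags, len(series) - lead):
        X.append(series[t - n_lags : t])
        y.append(series[t + lead])
    return np.array(X), np.array(y)

rng = np.random.RandomState(0)
levels = np.sin(np.linspace(0, 12, 400)) + 0.05 * rng.randn(400)   # synthetic stage data
X, y = make_lagged(levels, n_lags=6, lead=5)   # forecast 5 steps (e.g., hours) ahead

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
model.fit(X[:300], y[:300])
print(model.score(X[300:], y[300:]))           # R^2 on held-out data
```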

Keywords: Decision Support System, Neural Network, Flood Level

1271 A Construction Management Tool: Determining a Project Schedule Typical Behaviors Using Cluster Analysis

Authors: Natalia Rudeli, Elisabeth Viles, Adrian Santilli

Abstract:

Delays in the construction industry are a global phenomenon, and many construction projects experience extensive delays exceeding the initially estimated completion time. The main purpose of this study is to identify typical behaviors of construction project schedules in order to develop a prognosis and management tool: knowing a project's schedule tendency enables evidence-based decision-making, so that resolutions can be made before delays occur. This study presents an innovative approach that uses cluster analysis to support predictions during Earned Value Analyses. Clustering was used to predict the future behavior of schedules and of the principal Earned Value Management (EVM) and Earned Schedule (ES) indexes in construction projects. The analysis was made using a database of 90 construction projects and validated with additional data extracted from the literature and with another 15 contrasting projects. For all projects, planned and executed schedules were collected and the principal EVM and ES indexes were calculated. A complete linkage classification method was used, meaning that the distance (or similarity) between two clusters is measured by their most disparate elements, i.e., the distance is given by the maximum span among their components. Finally, using the EVM and ES indexes and Tukey and Fisher pairwise comparisons, the statistical dissimilarity was verified and four clusters were obtained. On average, construction projects show a delay of 35% of their planned completion time. The four typical behaviors found are: (1) projects that perform 5% of the work in the first two tenths and maintain a constant rhythm until completion (greater than 10% for each remaining tenth), finishing on the initially estimated time; (2) projects that start with an adequate construction rate but suffer minor delays, culminating in a total delay of almost 27% of the planned time; (3) projects that start below the planned rate and end with an average delay of 64%; and (4) projects that begin with poor performance, suffer great delays, and end with an average delay of 120% of the planned completion time. The obtained clusters compose a tool to identify the behavior of new construction projects by comparing their current work performance to the validated database, thus allowing the correction of initial estimations towards more accurate completion schedules.
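
The clustering step maps directly onto SciPy's hierarchical tools, where method='complete' implements exactly the most-disparate-elements distance described (the progress curves below are invented):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Each row: cumulative fraction of work done at successive tenths of the schedule (invented).
progress = np.array([
    [0.05, 0.15, 0.30, 0.50, 0.75, 1.00],
    [0.04, 0.12, 0.28, 0.48, 0.72, 0.98],
    [0.02, 0.06, 0.15, 0.30, 0.50, 0.80],
    [0.01, 0.05, 0.12, 0.25, 0.40, 0.60],
])

Z = linkage(progress, method="complete")       # complete linkage = maximum pairwise distance
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)                                  # typical-behavior cluster of each project
```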

Keywords: Cluster analysis, construction management, earned value, schedule.

1270 Proposing an Efficient Method for Frequent Pattern Mining

Authors: Vaibhav Kant Singh, Vijay Shah, Yogendra Kumar Jain, Anupam Shukla, A.S. Thoke, Vinay Kumar Singh, Chhaya Dule, Vivek Parganiha

Abstract:

Data mining is the extraction of knowledge from the large sets of data generated by various data processing activities, and frequent pattern mining is a very important task within it. Previous approaches to generating frequent sets generally adopt candidate generation and pruning techniques to satisfy the desired objective. This paper shows how the different approaches achieve the objective of frequent mining, along with the complexities required to perform the job. It also examines a hardware approach based on cache coherence to improve the efficiency of the process. Data mining is helpful in building support systems for management, bioinformatics, biotechnology, medical science, statistics, mathematics, banking, networking and other computer-related applications. This paper proposes the use of both the upward and downward closure properties for the extraction of frequent itemsets, which reduces the total number of scans required for the generation of candidate sets.
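
The downward-closure pruning that such algorithms build on fits in a few lines (tiny invented transaction set): a candidate k-itemset survives only if every (k-1)-subset was itself frequent:

```python
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
min_sup = 3

def support(itemset):
    return sum(itemset <= t for t in transactions)

items = {i for t in transactions for i in t}
frequent = {frozenset([i]) for i in items if support({i}) >= min_sup}
k = 2
while frequent:
    print(sorted(tuple(sorted(s)) for s in frequent))
    candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
    # Downward closure: prune candidates with any infrequent (k-1)-subset.
    candidates = {c for c in candidates
                  if all(frozenset(s) in frequent for s in combinations(c, k - 1))}
    frequent = {c for c in candidates if support(c) >= min_sup}
    k += 1
```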

Keywords: Data Mining, Candidate Sets, Frequent Item set, Pruning.

1269 Domain Knowledge Representation through Multiple Sub Ontologies: An Application Interoperability

Authors: Sunitha Abburu, Golla Suresh Babu

Abstract:

Application interoperability is limited by the lack of a common vocabulary and a common structure; ontology-based semantic technology provides solutions that resolve these interoperability issues. Ontologies are broadly used in diverse applications such as artificial intelligence, bioinformatics, biomedicine, and information integration, and can be used to interpret the knowledge of various domains. To reuse and enrich the available ontologies and reduce the duplication of ontologies within the same domain, there is a strong need to integrate the ontologies of a particular domain: the integrated ontology gives complete knowledge about the domain by sharing this comprehensive domain ontology among groups. According to the literature survey, there is no well-defined methodology for representing the knowledge of a whole domain. The current research therefore proposes a systematic methodology for knowledge representation using multiple sub-ontologies at different levels, which addresses application interoperability and enables semantic information retrieval. The method represents the complete knowledge of a domain by importing concepts from multiple sub-ontologies of the same and related domains, reducing ontology duplication, rework, and implementation cost through ontology reusability.
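
The import-and-merge idea can be sketched with rdflib (two tiny hand-built sub-ontologies stand in for real OWL files; the namespace and class names are hypothetical):

```python
from rdflib import Graph, Namespace, RDF, RDFS

EX = Namespace("http://example.org/")

sub_onto_a = Graph()                           # e.g., an anatomy sub-ontology
sub_onto_a.add((EX.Heart, RDF.type, RDFS.Class))

sub_onto_b = Graph()                           # e.g., a disease sub-ontology of a related domain
sub_onto_b.add((EX.Cardiomyopathy, RDF.type, RDFS.Class))
sub_onto_b.add((EX.Cardiomyopathy, EX.affects, EX.Heart))

domain = Graph()                               # the integrated domain-level ontology
for sub in (sub_onto_a, sub_onto_b):           # import concepts instead of duplicating them
    for triple in sub:
        domain.add(triple)

print(len(domain), "triples in the merged domain knowledge base")
```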

Keywords: Knowledge acquisition, knowledge representation, knowledge transfer, ontologies, semantics.

1268 An Empirical Study of the Effect of Robot Programming Education on the Computational Thinking of Young Children: The Role of Flowcharts

Authors: Wei Sun, Yan Dong

Abstract:

There is increasing interest in introducing computational thinking at an early age. Computational thinking, like mathematical, engineering, and scientific thinking, is a kind of analytical thinking; learning computational thinking skills not only improves technological literacy but also equips learners with practical skills such as problem solving. As the importance of computational thinking is recognized, the field of educational technology faces a question: how to choose appropriate tools and activities to help students develop computational thinking skills. Robots are becoming a popular teaching tool, as they provide a tangible way for young children to access technology, and controlling a robot through programming offers opportunities to develop computational thinking. This study explores whether introducing flowcharts into robotics programming courses can help children convert natural language into a programming language more easily, and thus better cultivate their computational thinking skills. An experimental design was adopted with a sample of children aged six to seven (N = 16), using a one-meter-tall humanoid robot as the teaching tool. Results show that children can master basic programming concepts through robotics courses, and their computational thinking improved significantly. Results also suggest that flowcharts do have an impact on young children's computational thinking development, but only a significant effect on the 'sequencing' and 'correspondence' skills. Overall, the study demonstrates that the humanoid robot and flowcharts have qualities that help young children learn programming and develop computational thinking skills.

Keywords: Robotics, computational thinking, programming, young children, flowcharts.

1267 Influence of Deep Cold Rolling and Low Plasticity Burnishing on Surface Hardness and Surface Roughness of AISI 4140 Steel

Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma

Abstract:

Deep cold rolling (DCR) and low plasticity burnishing (LPB) are cold working processes that produce a smooth, work-hardened surface by plastic deformation of surface irregularities. The present study focuses on the surface roughness and surface hardness of AISI 4140 work material, using a fractional factorial design of experiments. The surface integrity aspects of the work material were assessed in order to identify the predominant factors among the selected parameters, categorize them in order of significance, and set the factor levels for minimizing surface roughness and/or maximizing surface hardness. The influence of the main process parameters (force, feed rate, number of tool passes/overruns, initial roughness of the workpiece, ball material, ball diameter and lubricant) on the surface roughness and hardness of AISI 4140 steel was studied for both the LPB and DCR processes, and the results are compared. Surface hardness improved by 167% with the LPB process and by 442% with the DCR process. It was also found that the force, ball diameter, number of tool passes and initial roughness of the workpiece are the most pronounced parameters, having a significant effect on the workpiece surface during deep cold rolling and low plasticity burnishing.

Keywords: Deep cold rolling, burnishing, surface roughness, surface hardness, design of experiments, AISI 4140 steel.

1266 Efficient Supplies to Assembly Areas from Storage Stages

Authors: Matthias Schmidt, Steffen C. Eickemeyer, Prof. Peter Nyhuis

Abstract:

Guaranteeing the availability of the required parts at the scheduled time represents a key logistical challenge, especially when several parts are required together. This article describes a tool that supports positioning within the trade-off between low stock costs and a high service level for a consumer.

Keywords: Systems modeling, manufacturing systems, simulation and control, logistics and supply chain management.

1265 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is essential to reveal all software issues; unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting it or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run; it helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
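
The iFDAQ itself is C++/Qt, but the handling-of-system-signals idea has a small Python analogue (the file name and handler details are assumptions): install handlers that dump a stack report instead of dying silently:

```python
import faulthandler
import signal
import sys
import traceback

REPORT = "daq_debug_report.txt"                # hypothetical report file

def report_and_exit(signum, frame):
    # Write a report with the information needed for deeper investigation.
    with open(REPORT, "a") as f:
        f.write(f"--- caught signal {signum} ---\n")
        f.write("".join(traceback.format_stack(frame)))
    sys.exit(1)

for sig in (signal.SIGTERM, signal.SIGINT):    # catchable termination signals
    signal.signal(sig, report_and_exit)

_report_file = open(REPORT, "a")               # must stay open while the handler is active
faulthandler.enable(_report_file)              # tracebacks even on hard crashes (SIGSEGV etc.)

print("running with debug hooks installed; terminate the process to generate a report")
```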

Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.

1264 ARCS for Critical Information Retrieval Development

Authors: Suttipong Boonphadung

Abstract:

The research on ARCS for critical information retrieval development aimed to (1) investigate the critical information retrieval skills of mathematics pre-service teachers before applying the ARCS model in learning activities, (2) study and analyze the development of those skills after utilizing the ARCS model in learning activities, and (3) evaluate the pre-service teachers' satisfaction with using the ARCS model in learning activities as a tool for developing critical information retrieval skills. Forty-one fourth-year mathematics pre-service teachers who had enrolled in the subject Research for Learning Development in semester 2 of 2012 were purposively selected as the research cohort. The research tools were a self-report and an interview questionnaire whose content validity and reliability were established (IOC = .66-1.00, α = .834). The research found that the critical information retrieval skills of the sample before using the ARCS model in learning activities were at a moderately high level; however, in-depth interviews and a focus group showed that the pre-service teachers still lacked adequate and effective knowledge of information retrieval. After applying the ARCS model in learning activities, their critical information retrieval skills reached a high level: the pre-service teachers were able to explain methods of searching, extracting, and selecting information, to evaluate the quality of information, and to make effective decisions about accepting information. Moreover, the pre-service teachers showed moderately high to highest levels of satisfaction with using the ARCS model in learning activities as a tool for developing their critical information retrieval skills.

Keywords: Critical information retrieval skill, ARCS model, Satisfaction.

1263 Rural – Urban Partnership for Balanced Spatial Development in Latvia

Authors: Zane Bulderberga

Abstract:

The spatial dimension of development planning is becoming more topical in the 21st century as a result of changes in population structure. Sustainable spatial development focuses on identifying and using territorial advantages to foster the harmonized development of the entire country, reducing the negative effects of population concentration and increasing availability and mobility. EU and national development planning documents state polycentrism as the main tool for balanced spatial development, including the concentration of investment in growth centres. If mutual cooperation among growth centres as well as urban–rural cooperation is not fostered, territorial differences can deepen and create unbalanced development.

The aim of the research is to evaluate urban–rural interaction, elaborating spatial development scenarios within the framework of Latvian regional policy. The monographic and comparison methods, the abstract–logical method, and synthesis and analysis are used to study the theoretical aspects of the research, collecting the ideas of scientists from different countries, concepts and regulations, and building a meaningful scientific discussion. The Analytic Hierarchy Process (AHP) is used to determine further scenarios of spatial development in Latvia.

Experts from various institutions recognized urban–rural interaction and cooperation as an essential tool for development. The most important factors for balanced spatial development in Latvia are the availability of public transportation and the improvement of service availability. Evaluating the three alternative scenarios, it was concluded that urban–rural partnership will ensure balanced development in the Latvian regions.
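
The AHP step can be made concrete in a few lines (the pairwise judgments below are invented, not the experts' actual comparisons): derive scenario priorities from a judgment matrix and check its consistency:

```python
import numpy as np

# Pairwise comparison of three scenarios on Saaty's 1-9 scale (invented judgments).
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

w = np.prod(A, axis=1) ** (1.0 / len(A))   # geometric-mean approximation
w /= w.sum()                               # priority vector of the scenarios

lam = (A @ w / w).mean()                   # estimate of the principal eigenvalue
ci = (lam - len(A)) / (len(A) - 1)
cr = ci / 0.58                             # random index for n = 3 matrices
print(w, "consistency ratio:", round(cr, 3))   # CR < 0.1 = acceptably consistent
```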

Keywords: Rural – urban interaction, rural – urban cooperation, spatial development, AHP.

1262 Web-Based Cognitive Writing Instruction (WeCWI): A Theoretical-and-Pedagogical e-Framework for Language Development

Authors: Boon Yih Mah

Abstract:

The contribution of Web-based Cognitive Writing Instruction (WeCWI) to language development can be divided into linguistic and non-linguistic perspectives. From the linguistic perspective, WeCWI focuses on literacy and language discoveries, while cognitive and psychological discoveries are the hubs of the non-linguistic perspective. On the linguistic side, WeCWI draws attention to free reading and enterprises, which are supported by language acquisition theories, and its adoption of the process genre approach as a hybrid guided writing approach fosters literacy development. Literacy and language development are interconnected in the communication process; hence, WeCWI encourages meaningful discussion based on the interactionist theory that involves input, negotiation, output, and interactional feedback. Rooted in the e-learning interaction-based model, WeCWI promotes online discussion via synchronous and asynchronous communication, which allows interaction among the learners, the instructor, and the digital content. From the non-linguistic perspective, WeCWI highlights the contribution of reading, discussion, and writing to cognitive development. Based on inquiry models, learners' critical thinking is fostered during the information exploration process through interaction and questioning. Lastly, to lower writing anxiety, WeCWI develops the instructional tool with supportive features to facilitate the writing process. To give learners a positive user experience, WeCWI aims to offer the instructional tool with different interface designs based on two types of perceptual learning style.

Keywords: WeCWI, literacy discovery, language discovery, cognitive discovery, psychological discovery.

1261 Investigating the Effectiveness of a 3D Printed Composite Mold

Authors: Peng Hao Wang, Garam Kim, Ronald Sterkenburg

Abstract:

In composite manufacturing, tooling fabrication and tooling maintenance contribute a large portion of the total cost. As the applications of composite materials continue to increase, there is also a growing demand for more tooling, which places heavy emphasis on the industry's ability to fabricate high-quality tools while maintaining cost effectiveness. One popular tool fabrication technique currently being developed utilizes the additive manufacturing technology known as 3D printing, whose popularity is due to its low material waste, low cost, and quick fabrication time. In this study, a team of Purdue University School of Aviation and Transportation Technology (SATT) faculty and students investigated the effectiveness of a 3D printed composite mold. A steel valve cover from an aircraft reciprocating engine was modeled utilizing 3D scanning and computer-aided design (CAD) to create a 3D printed composite mold, which was then used to fabricate carbon fiber versions of the valve cover. The carbon fiber valve covers were evaluated for dimensional accuracy and quality, while the mold was evaluated for durability and dimensional stability. The data collected from this study provided valuable information for understanding 3D printed composite molds, potential improvements to the molds, and considerations for future tooling design.

Keywords: Additive manufacturing, carbon fiber, composite tooling, molds.

1260 An Approach to Image Extraction and Accurate Skin Detection from Web Pages

Authors: Moheb R. Girgis, Tarek M. Mahmoud, Tarek Abd-El-Hafeez

Abstract:

This paper proposes a system to extract images from web pages and then detect the skin color regions of these images. As part of the proposed system, we built a toolbar named Filter Tool Bar (FTB) using the BandObject control, by modifying the Pavel Zolnikov implementation; the Yahoo! SDK API, which also supports image search, proved very useful. The proposed system introduces three new methods for extracting images from web pages (after loading the web page with the proposed FTB, before loading the web page physically from the localhost, and before loading the web page from any server). These methods overcome the drawbacks of the regular-expressions method for extracting images suggested by Ilan Assayag. The second part of the proposed system concerns the detection of skin color regions in the extracted images. We studied two well-known skin color detection techniques: the first is based on the RGB color space and the second on the YUV and YIQ color spaces. We modified the second technique to overcome its failure on images with complex backgrounds by using the saturation parameter, obtaining accurate skin detection results. The performance of the proposed system in extracting images before and after loading the web page from the localhost or any server is evaluated in terms of the number of extracted images. Finally, the two skin detection techniques are compared in terms of the number of pixels detected.
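
RGB-space techniques of this family typically reduce to explicit channel rules; a widely cited rule (one published variant, not necessarily the exact rule used here) vectorizes as:

```python
import numpy as np

def skin_mask_rgb(img):
    # img: H x W x 3 uint8 array; returns a boolean mask of likely-skin pixels
    # using the widely cited RGB rule of Peer et al. (one of several variants).
    r, g, b = (img[..., i].astype(int) for i in range(3))
    spread = img.max(-1).astype(int) - img.min(-1).astype(int)
    return ((r > 95) & (g > 40) & (b > 20) & (spread > 15)
            & (abs(r - g) > 15) & (r > g) & (r > b))

img = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)   # stand-in extracted image
print(skin_mask_rgb(img).sum(), "candidate skin pixels")
```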

Keywords: Browser Helper Object, Color spaces, Image and URL extraction, Skin detection, Web Browser events.

1259 A Short Reflection on the Strengths and Weaknesses of Simulation Optimization

Authors: P. Vazan, P. Tanuska

Abstract:

The paper provides a basic overview of simulation optimization, and its practical use is demonstrated on a real example in the Witness simulator. Simulation optimization is presented as a good tool for solving many problems in practice, especially in production systems. The authors also describe their own experiences and discuss the strengths and weaknesses of simulation optimization.

Keywords: discrete event simulation, simulation optimization, Witness

1258 Modified Genome-Scale Metabolic Model of Escherichia coli by Adding Hyaluronic Acid Biosynthesis-Related Enzymes (GLMU2 and HYAD) from Pasteurella multocida

Authors: P. Pasomboon, P. Chumnanpuen, T. E-kobon

Abstract:

Hyaluronic acid (HA) consists of linear heteropolysaccharide repeats of D-glucuronic acid and N-acetyl-D-glucosamine. HA has various useful properties: it maintains skin elasticity and moisture, reduces inflammation, and lubricates the movement of various body parts without causing immunogenic allergy. HA can be found in several animal tissues as well as in the capsule component of some bacteria, including Pasteurella multocida. This study aimed to modify a genome-scale metabolic model of Escherichia coli by adding two HA biosynthesis-related enzymes (GLMU2 and HYAD) from P. multocida, and to use computational simulation and flux analysis methods to predict and improve HA production under different carbon sources and nitrogen supplements. The results revealed that threonine and aspartate supplementation raised HA production by 12.186%. Our analyses show that the genome-scale metabolic model is useful for improving HA production and narrows the number of conditions to be tested further.
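
A sketch of the model-modification step with COBRApy (the model file, metabolite identifiers and stoichiometry are placeholders, not the paper's exact additions):

```python
import cobra
from cobra import Metabolite, Reaction

model = cobra.io.read_sbml_model("iML1515.xml")   # an E. coli model file (assumed present)

ha = Metabolite("ha_c", name="hyaluronic acid", compartment="c")

hyad = Reaction("HYAD")                           # placeholder HA synthase reaction
hyad.add_metabolites({
    model.metabolites.get_by_id("udpglcur_c"): -1,   # UDP-glucuronate (placeholder id)
    model.metabolites.get_by_id("uacgam_c"): -1,     # UDP-N-acetylglucosamine (placeholder id)
    ha: 1,
})
model.add_reactions([hyad])
model.add_boundary(ha, type="demand")             # let HA leave the system

model.objective = "DM_ha_c"                       # maximize HA production flux (FBA)
print(model.optimize().objective_value)
```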

Keywords: Pasteurella multocida, Escherichia coli, hyaluronic acid, genome-scale metabolic model, bioinformatics.

1257 Gene Expression Signature for Classification of Metastasis Positive and Negative Oral Cancer in Homo sapiens

Authors: A. Shukla, A. Tarsauliya, R. Tiwari, S. Sharma

Abstract:

Classifying cancers into their corresponding cohorts has been a key area of bioinformatics research, aiming at better prognosis of the disease. The high dimensionality of gene data makes this a complex task and requires techniques for identifying significant data in order to reduce the dimensionality and extract the significant information. In this paper, we propose a novel approach for classifying oral cancer into metastasis-positive and metastasis-negative patients. We used Significance Analysis of Microarrays (SAM) to identify the significant genes that constitute a gene signature. Three different gene signatures were identified using SAM from three different combinations of training datasets, and their classification accuracy was calculated on the corresponding testing datasets using k-Nearest Neighbour (kNN), Fuzzy C-Means clustering (FCM), Support Vector Machine (SVM) and Backpropagation Neural Network (BPNN) classifiers. A final gene signature of only 9 genes was obtained from the three individual signatures, and its classification capability was compared using the same classifiers on the same testing datasets. The experiments show that the 9-gene signature classified all samples in the testing dataset accurately, while the individual signatures could not classify all of them.
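
The final evaluation step, classifying test samples on the 9-gene signature, takes a few lines with scikit-learn (expression values synthetic; gene indices hypothetical):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.RandomState(0)
n_genes = 2000
X_train = rng.randn(40, n_genes)        # synthetic microarray expression profiles
y_train = rng.randint(0, 2, 40)         # metastasis negative (0) / positive (1)

# Hypothetical indices of the 9 SAM-selected signature genes.
signature = [5, 42, 101, 256, 512, 777, 1024, 1500, 1999]

knn = KNeighborsClassifier(n_neighbors=3).fit(X_train[:, signature], y_train)
X_test = rng.randn(10, n_genes)
print(knn.predict(X_test[:, signature]))   # predicted cohort for each test sample
```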

Keywords: Cancer, Gene Signature, SAM, Classification.
