Search results for: missing data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7424

7394 Experimental Teaching, Perceived Usefulness, Ease of Use, Learning Interest and Science Achievement of Taiwan 8th Graders in the TIMSS 2007 Database

Authors: Pei Wen Liao, Tsung Hau Jen

Abstract:

The data of Taiwanese 8th graders in the 4th cycle of the Trends in International Mathematics and Science Study (TIMSS) are analyzed to examine the influence of science teachers' preference for experimental teaching on the relationships between the affective variables (the perceived usefulness of science, the ease of using science, and science learning interest) and academic achievement in science. After dealing with the missing data, data from 3,711 students and 145 science teachers were analyzed with a Hierarchical Linear Modeling (HLM) technique. The major objective of this study was to determine whether experimental teaching moderates the relationship between perceived usefulness and achievement.
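
A moderation question of this kind maps onto a two-level model with a cross-level interaction. Below is a minimal sketch using statsmodels' mixed-effects API, a random-intercept simplification of the full HLM analysis; the file and column names are illustrative, not from the TIMSS database.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per student; teacher-level variables repeat within a class.
# Columns (illustrative): science_score, usefulness, ease, interest,
# exp_teaching (teacher's preference for experimental teaching), teacher_id.
df = pd.read_csv("timss2007_taiwan.csv")

# Random intercept per teacher; the usefulness x exp_teaching term tests
# whether experimental teaching moderates the usefulness-achievement link.
model = smf.mixedlm(
    "science_score ~ usefulness * exp_teaching + ease + interest",
    data=df,
    groups=df["teacher_id"],
)
result = model.fit()
print(result.summary())
```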

Keywords: TIMSS database, Science achievement, Experimental teaching, Perceived Usefulness, Perceived Ease of Use

7393 SIMGraph: Simplifying Contig Graph to Improve de Novo Genome Assembly Using Next-generation Sequencing Data

Authors: Chien-Ju Li, Chun-Hui Yu, Chi-Chuan Hwang, Tsunglin Liu, Darby Tien-Hao Chang

Abstract:

De novo genome assembly is always fragmented. Assembly fragmentation is more serious with the popular next-generation sequencing (NGS) data because NGS sequences are shorter than traditional Sanger sequences. As the data throughput of NGS is high, the fragmentation in assemblies is usually not the result of missing data. On the contrary, the assembled sequences, called contigs, are often connected to more than one other contig in a complicated manner, leading to the fragmentation. False connections in such complicated connections between contigs, known as a contig graph, are inevitable because of repeats and sequencing/assembly errors. Simplifying a contig graph by removing false connections directly improves genome assembly. In this work, we developed a tool, SIMGraph, to resolve ambiguous connections between contigs using NGS data. Applying SIMGraph to the assemblies of a fungus and a fish genome, we resolved 27.6% and 60.3% of ambiguous contig connections, respectively. These results can reduce the experimental effort required to resolve contig connections.

Keywords: Contig graph, NGS, de novo assembly, scaffold.

7392 Increasing the System Availability of Data Centers by Using Virtualization Technologies

Authors: Chris Ewe, Naoum Jamous, Holger Schrödl

Abstract:

Like most entrepreneurs, data center operators pursue goals such as profit maximization, improvement of the company’s reputation, or simply continued existence on the market. Part of those aims is to guarantee a given quality of service. Quality characteristics are specified in a contract called the service level agreement. A central part of this agreement is the non-functional properties of an IT service. System availability is one of the most important of these properties, as will be shown in this paper. To comply with availability requirements, data center operators can use virtualization technologies. A clear model to assess the effect of virtualization functions on the parts of a data center in relation to system availability is still missing. This paper aims to introduce a basic model that shows these connections and to consider whether the identified effects are positive or negative. Thus, this work also points out possible disadvantages of the technology. In consequence, the paper shows opportunities as well as risks of data center virtualization in relation to system availability.

Keywords: Availability, cloud computing, IT service, quality of service, service level agreement, virtualization.

7391 Remaining Useful Life Prediction Using Elliptical Basis Function Network and Markov Chain

Authors: Yi Yu, Lin Ma, Yong Sun, Yuantong Gu

Abstract:

This paper presents a novel method for remaining useful life prediction using an Elliptical Basis Function (EBF) network and a Markov chain. The EBF structure is trained by a modified Expectation-Maximization (EM) algorithm in order to take the missing covariate set into account. No explicit extrapolation is needed for internal covariates, while a Markov chain is constructed to represent the evolution of external covariates in the study. The estimated external covariates and the unknown internal covariates constitute an incomplete covariate set, which is then analyzed by the EBF network to provide survival information about the asset. The case study shows that the method slightly underestimates the remaining useful life of an asset, which is a desirable result for early maintenance decisions and resource planning.
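
The Markov-chain piece, estimating a transition matrix for discretized external covariates and stepping it forward, can be sketched on its own; the EBF/EM training is not reproduced here, and the states below are invented.

```python
import numpy as np

def fit_transition_matrix(states, n_states):
    """Estimate a Markov transition matrix from an observed state sequence."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    # Normalize rows to probabilities (uniform if a state was never visited).
    rows = counts.sum(axis=1, keepdims=True)
    return np.where(rows > 0, counts / np.maximum(rows, 1), 1.0 / n_states)

# Illustrative sequence of discretized external covariates (e.g. load levels).
observed = np.array([0, 0, 1, 1, 2, 1, 0, 1, 2, 2, 1, 0])
P = fit_transition_matrix(observed, n_states=3)

# Distribution of the external covariate 5 steps ahead of the last state.
dist = np.eye(3)[observed[-1]] @ np.linalg.matrix_power(P, 5)
print(P, dist)
```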

Keywords: Elliptical Basis Function Network, Markov Chain, Missing Covariates, Remaining Useful Life

7390 Microgrid: Low Power Network Topology and Control

Authors: Amit Sachan

Abstract:

Network design and data modeling are the two significant research tasks in enabling power control of a microgrid using IEC 61850 data models and services. The current coverage of IEC 61850 includes substation automation systems, communication between substations, and communication to DERs. Therefore, to achieve LV microgrid power control with the IEC 61850 services that control smart electrical devices, we have to model those devices as IEC 61850 data models and design a network topology to support all-in-one communication among those devices. In addition, although IEC 61850 supports such modeling in part by providing several object models for common functions such as measurement, metering, and monitoring, certain pieces are still missing for building a variety of functions for household appliances, such as tuning the temperature of an electric heater or refrigerator.

Keywords: IEC 61850, RCMC, HCMC, DER Unit Controller.

7389 Estimation of Missing or Incomplete Data in Road Performance Measurement Systems

Authors: Kristjan Kuhi, Kati K. Kaare, Ott Koppel

Abstract:

Modern management in most fields is performance based; both planning and implementation of maintenance and operational activities are driven by appropriately defined performance indicators. Continuous real-time data collection for management is becoming feasible due to technological advancements. Outdated and insufficient input data may result in incorrect decisions. When deterministic models are used, the uncertainty of the object state is not visible; thus, applying deterministic models is more likely to give a false diagnosis. Constructing structured probabilistic models of the performance indicators, taking into consideration the surrounding indicator environment, makes it possible to estimate the trustworthiness of the indicator values. It also helps fill gaps in the data to improve the quality of performance analysis and management decisions. In this paper, the authors discuss the application of probabilistic graphical models in road performance measurement and propose a high-level conceptual model that enables analyzing and predicting future pavement deterioration more precisely based on road utilization.

Keywords: Probabilistic graphical models, performance indicators, road performance management, data collection

7388 Estimation of PM2.5 Emissions and Source Apportionment Using Receptor and Dispersion Models

Authors: Swetha Priya Darshini Thammadi, Sateesh Kumar Pisini, Sanjay Kumar Shukla

Abstract:

Source apportionment using a dispersion model depends primarily on the quality of the emission inventory. In the present study, a CMB receptor model has been used to identify the sources of PM2.5, while the AERMOD dispersion model has been used to account for sources of PM2.5 missing from the emission inventory. A statistical approach has been developed to quantify the missing sources not considered in the emission inventory. The inventory of each grid was improved by adjusting emissions based on road lengths and the deficit between measured and modelled concentrations. The results showed that in the CMB analyses, fugitive sources (soil and road dust) contribute significantly to ambient PM2.5 pollution. As a result, AERMOD significantly underestimated the ambient air concentration at most locations. The revised emission inventory showed a significant improvement in AERMOD performance, which is evident through statistical tests.

Keywords: CMB, GIS, AERMOD, PM2.5, fugitive, emission inventory.

7387 A Reasoning Method of Cyber-Attack Attribution Based on Threat Intelligence

Authors: Li Qiang, Yang Ze-Ming, Liu Bao-Xu, Jiang Zheng-Wei

Abstract:

With the increasing complexity of cyberspace security, cyber-attack attribution has become an important challenge for security protection systems. The difficulties of cyber-attack attribution center on the problems of handling huge volumes of data and missing key data. To address this situation, this paper presents a reasoning method for cyber-attack attribution based on threat intelligence. The method utilizes the intrusion kill chain model and a Bayesian network to build the attack chain and evidence chain of a cyber-attack on a threat intelligence platform through data calculation, analysis, and reasoning. A number of cyber-attack events that we observed and analyzed were then used to test the reasoning method and demo system; the test results indicate that the reasoning method can provide real assistance in cyber-attack attribution.
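
As a toy illustration of the attack-chain idea (not the paper's actual network), a three-stage kill-chain Bayesian network can be built and queried with pgmpy; all structure and probabilities below are invented.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Three kill-chain stages; each CPD is illustrative, not from the paper.
net = BayesianNetwork([("Recon", "Delivery"), ("Delivery", "Exploitation")])
net.add_cpds(
    TabularCPD("Recon", 2, [[0.7], [0.3]]),
    TabularCPD("Delivery", 2, [[0.9, 0.4], [0.1, 0.6]],
               evidence=["Recon"], evidence_card=[2]),
    TabularCPD("Exploitation", 2, [[0.95, 0.3], [0.05, 0.7]],
               evidence=["Delivery"], evidence_card=[2]),
)

# Given observed reconnaissance indicators, infer exploitation likelihood.
posterior = VariableElimination(net).query(["Exploitation"],
                                           evidence={"Recon": 1})
print(posterior)
```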

Keywords: Reasoning, Bayesian networks, cyber-attack attribution, kill chain, threat intelligence.

7386 Probabilistic Approach of Dealing with Uncertainties in Distributed Constraint Optimization Problems and Situation Awareness for Multi-agent Systems

Authors: Sagir M. Yusuf, Chris Baber

Abstract:

In this paper, we describe how Bayesian inferential reasoning contributes to obtaining well-satisfied predictions for Distributed Constraint Optimization Problems (DCOPs) with uncertainties. We also demonstrate how DCOPs can be merged with multi-agent knowledge understanding and prediction (i.e., situation awareness). The DCOP functions were merged with a Bayesian Belief Network (BBN) in the form of situation, awareness, and utility nodes. We describe how the uncertainties can be represented in the BBN to make effective predictions using the expectation-maximization algorithm or the conjugate gradient descent algorithm. The idea of variable prediction using Bayesian inference may reduce the number of variables in the agents' sampling domain and also allows estimation of missing variables. Experimental results showed that the BBN performs more compelling predictions with samples containing uncertainties than with perfect samples. That is, Bayesian inference can help in handling the uncertainties and dynamism of DCOPs, which is a current issue in the DCOP community. We show how Bayesian inference can be formalized with Distributed Situation Awareness (DSA) using uncertain and missing agents' data. The whole framework was tested on a multi-UAV mission for forest fire searching. Future work focuses on augmenting the existing architecture to deal with dynamic DCOP algorithms and multi-agent information merging.

Keywords: DCOP, multi-agent reasoning, Bayesian reasoning, swarm intelligence.

7385 Optimized Vector Quantization for Bayer Color Filter Array

Authors: M. Lakshmi, J. Senthil Kumar

Abstract:

To reduce cost, digital cameras use a single image sensor to capture color images. The Color Filter Array (CFA) in digital cameras permits only one of the three primary (red, green, blue) colors to be sensed at each pixel; the two missing components are interpolated through a method named demosaicking. The captured data are interpolated into a full color image and compressed in applications. Color interpolation before compression leads to data redundancy. This paper proposes a new Vector Quantization (VQ) technique that constructs a VQ codebook with a Differential Evolution (DE) algorithm. The new technique is compared to the conventional Linde-Buzo-Gray (LBG) method.
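
For reference, the conventional LBG baseline mentioned at the end can be sketched in a few lines; the split-and-refine loop below is a generic LBG, with random blocks standing in for CFA image data.

```python
import numpy as np

def lbg_codebook(vectors, size, eps=1e-2, iters=10):
    """Linde-Buzo-Gray: grow a VQ codebook by splitting and k-means refinement."""
    codebook = vectors.mean(axis=0, keepdims=True)
    while len(codebook) < size:
        # Split every codeword into a perturbed pair, then refine.
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(iters):
            # Assign each vector to its nearest codeword, then recenter.
            d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
            nearest = d.argmin(axis=1)
            for k in range(len(codebook)):
                members = vectors[nearest == k]
                if len(members) > 0:
                    codebook[k] = members.mean(axis=0)
    return codebook

rng = np.random.default_rng(0)
blocks = rng.random((1024, 4))          # stand-in for 2x2 CFA sample blocks
codebook = lbg_codebook(blocks, size=16)
print(codebook.shape)                   # (16, 4)
```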

Keywords: Color Filter Array (CFA), Biorthogonal Wavelet, Vector Quantization (VQ), Differential Evolution (DE).

7384 Development and Performance Evaluation of a Gladiolus Planter in Field for Planting Corms

Authors: T. P. Singh, Vijay Gautam

Abstract:

Gladiolus is an important cash crop grown mainly for its elegant spikes. Traditionally, gladiolus corms are planted manually, which is a very tedious, time-consuming and labor-intensive operation. So far, no planter has been available for planting gladiolus corms. With a view to mechanizing the planting operation of this horticultural crop, a prototype 4-row gladiolus planter was developed and its performance evaluated under in-situ conditions. A cup-chain type metering device was used to place each single gladiolus corm in the furrow at the required spacing while planting. Three levels of corm spacing (15, 20 and 25 cm) and four levels of forward speed (1.0, 1.5, 2.0 and 2.5 km/h) were taken as evaluation parameters for the planter. The performance indicators, namely corm spacing in each row, coefficient of uniformity, missing index, multiple index, quality of feed index, number of corms per meter length, and mechanical damage to the corms, were determined during the field test. The data were statistically analyzed using a Completely Randomized Design (CRD) to test the significance of the parameters. The results indicated that the planter was able to drop the corms at the required nominal spacing with minor variations. The highest deviation from the mean corm spacing was observed as 3.53 cm, with a maximum coefficient of variation of 13.88%. The highest missing index and quality of feed index were observed as 6.33% and 97.45%, respectively, with no multiples. The performance of the planter was better at lower forward speeds and wider corm spacings. The field capacity of the planter was found to be 0.103 ha/h with an observed field efficiency of 76.57%.
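
The spacing indices reported here are commonly computed from the measured corm-to-corm spacings relative to the nominal spacing; the 0.5x/1.5x thresholds in the sketch below follow the usual planter-metering convention, which the abstract does not spell out, and the measurements are invented.

```python
import numpy as np

def spacing_indices(spacings_cm, nominal_cm):
    """Missing, multiple and quality-of-feed indices from measured spacings."""
    s = np.asarray(spacings_cm, dtype=float)
    miss = np.mean(s > 1.5 * nominal_cm) * 100      # skipped drop
    multiple = np.mean(s < 0.5 * nominal_cm) * 100  # double drop
    quality = 100.0 - miss - multiple               # acceptable singles
    cv = s.std() / s.mean() * 100                   # coefficient of variation
    return miss, multiple, quality, cv

# Illustrative measurements for a 20 cm nominal spacing.
measured = [19.5, 21.0, 20.4, 43.0, 18.8, 20.1, 19.9, 21.3]
print(spacing_indices(measured, nominal_cm=20))
```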

Keywords: Coefficient of uniformity, corm spacing, gladiolus planter, mechanization.

7383 Generalized Method for Estimating Best-Fit Vertical Alignments for Profile Data

Authors: Said M. Easa, Shinya Kikuchi

Abstract:

When the profile information of an existing road is missing or not up to date and the parameters of the vertical alignment are needed for engineering analysis, the engineer has to recreate the geometric design features of the road alignment using collected profile data. The profile data may be collected using traditional surveying methods, global positioning systems, or digital imagery. This paper develops a method that estimates the parameters of the geometric features that best characterize existing vertical alignments in terms of tangents and the expressions of the curves, which may be symmetrical, asymmetrical, reverse, or complex vertical curves. The method is implemented using an Excel-based optimization that minimizes the differences between the observed profile and the profiles estimated from the equations of the vertical curve. The method uses a 'wireframe' representation of the profile that makes the proposed method applicable to all types of vertical curves. A secondary contribution of this paper is to introduce the properties of the equal-arc asymmetrical curve that has recently been developed in the highway geometric design field.
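
The Excel-based optimization can be mirrored with a generic least-squares fit; below is a sketch for the simplest case, a single parabolic curve between two tangents, with assumed parameter names (the paper's wireframe covers more curve types).

```python
import numpy as np
from scipy.optimize import least_squares

def vertical_profile(p, x):
    """Tangent -> parabolic vertical curve -> tangent elevation model."""
    x0, y0, g1, g2, L = p          # curve start, elevation, grades, length
    dx = x - x0
    before = y0 + g1 * dx
    on_curve = y0 + g1 * dx + (g2 - g1) * dx**2 / (2 * L)
    after = y0 + g1 * L + (g2 - g1) * L / 2 + g2 * (dx - L)
    return np.where(dx < 0, before, np.where(dx <= L, on_curve, after))

# x_obs, z_obs: stations and observed elevations from the collected profile
# (synthetic here: a known alignment plus survey noise).
x_obs = np.linspace(0, 500, 51)
true = vertical_profile([200, 100, 0.03, -0.02, 150], x_obs)
z_obs = true + np.random.default_rng(0).normal(0, 0.05, x_obs.size)

fit = least_squares(lambda p: vertical_profile(p, x_obs) - z_obs,
                    x0=[180, 99, 0.02, -0.01, 120])
print(fit.x)   # estimated [x0, y0, g1, g2, L]
```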

Keywords: Optimization, parameters, data, reverse, spreadsheet, vertical curves

7382 Mining Genes Relations in Microarray Data Combined with Ontology in Colon Cancer Automated Diagnosis System

Authors: A. Gruzdz, A. Ihnatowicz, J. Siddiqi, B. Akhgar

Abstract:

The MATCH project [1] entails the development of an automatic diagnosis system that aims to support the treatment of colon cancer by discovering mutations that occur in tumour suppressor genes (TSGs) and contribute to the development of cancerous tumours. The constitution of the system is based on a) colon cancer clinical data and b) biological information derived by data mining techniques from genomic and proteomic sources. The core mining module will consist of popular, well-tested hybrid feature extraction methods and new combined algorithms designed especially for the project. Elements of rough sets, evolutionary computing, cluster analysis, self-organizing maps and association rules will be used to discover the annotations between genes and their influence on tumours [2]-[11]. The methods used to process the data have to address its high complexity, potential inconsistency, and the problem of dealing with missing values. They must integrate all the useful information necessary to answer the expert's question. For this purpose, the system has to learn from data, or allow a domain specialist to interactively specify, the part of the knowledge structure it needs to answer a given query. The program should also take into account the importance/rank of the particular parts of the data it analyses, and adjust the algorithms used accordingly.

Keywords: Bioinformatics, gene expression, ontology, self-organizing maps.

7381 Comparison of Multivariate Adaptive Regression Splines and Random Forest Regression in Predicting Forced Expiratory Volume in One Second

Authors: P. V. Pramila, V. Mahesh

Abstract:

Pulmonary Function Tests are important non-invasive diagnostic tests for assessing respiratory impairments and provide quantifiable measures of lung function. Spirometry is the most frequently used measure of lung function and plays an essential role in the diagnosis and management of pulmonary diseases. However, the test requires considerable patient effort and cooperation, markedly related to the age of the patients, resulting in incomplete data sets. This paper presents a nonlinear model built using Multivariate Adaptive Regression Splines (MARS) and a Random Forest regression model to predict the missing spirometric features. Random-forest-based feature selection is used to enhance both the generalization capability and the interpretability of the model. In the present study, flow-volume data were recorded for N = 198 subjects. The ranked order of the feature importance index calculated by the Random Forest model shows that the spirometric features FVC, FEF25, PEF, FEF25-75, FEF50 and the demographic parameter height are the important descriptors. A comparison of the performance of both models proves that the prediction ability of MARS with the top two ranked features, namely FVC and FEF25, is higher, yielding a model fit of R2 = 0.96 and R2 = 0.99 for normal and abnormal subjects, respectively. The Root Mean Square Error analysis of the RF model and the MARS model also shows that the latter is capable of predicting the missing values of FEV1 with notably lower error values of 0.0191 (normal subjects) and 0.0106 (abnormal subjects) using the aforementioned input features. It is concluded that combining feature selection with a prediction model provides a minimum subset of predominant features to train the model, as well as better prediction performance. This analysis can assist clinicians as an intelligent decision support system in medical diagnosis and the improvement of clinical care.
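
The Random Forest half of the comparison reduces to a standard regression on the top-ranked descriptors; a sketch with synthetic stand-in data follows (the clinical records for N = 198 subjects are not available here, so the numbers below are illustrative only).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
fvc = rng.normal(3.5, 0.8, 198)                  # synthetic FVC (L)
fef25 = rng.normal(6.0, 1.5, 198)                # synthetic FEF25 (L/s)
fev1 = 0.8 * fvc + 0.05 * fef25 + rng.normal(0, 0.1, 198)

X = np.column_stack([fvc, fef25])                # top two ranked features
X_tr, X_te, y_tr, y_te = train_test_split(X, fev1, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)
print("R2:", r2_score(y_te, pred),
      "RMSE:", np.sqrt(mean_squared_error(y_te, pred)))
print("importances:", rf.feature_importances_)
```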

Keywords: FEV1, Multivariate Adaptive Regression Splines, Pulmonary Function Test, Random Forest.

7380 Extended Well-Founded Semantics in Bilattices

Authors: Daniel Stamate

Abstract:

One of the most used assumptions in logic programming and deductive databases is the so-called Closed World Assumption (CWA), according to which the atoms that cannot be inferred from a program are considered to be false (i.e., a pessimistic assumption). One of the most successful semantics of conventional logic programs based on the CWA is the well-founded semantics. However, the CWA is not applicable in all circumstances in which information is handled; that is, the well-founded semantics, if conventionally defined, would behave inadequately in various cases. The solution we adopt in this paper is to extend the well-founded semantics so that it can be based on other assumptions as well. The basis of (default) negative information in the well-founded semantics is given by the so-called unfounded sets. We extend this concept by considering optimistic, pessimistic, skeptical and paraconsistent assumptions, used to complete missing information in a program. Our semantics, called the extended well-founded semantics, also expresses imperfect information, considered to be missing/incomplete, uncertain and/or inconsistent, by using bilattices as multivalued logics. We provide a method for computing the extended well-founded semantics and show that the Kripke-Kleene semantics is captured by considering a skeptical assumption. We also show that the complexity of computing our semantics is polynomial time.

Keywords: Logic programs, imperfect information, multivalued logics, bilattices, assumptions.

7379 A Review of Lortie’s Schoolteacher

Authors: Tsai-Hsiu Lin

Abstract:

Dan C. Lortie’s Schoolteacher: A Sociological Study is one of the best works on the sociology of teaching since W. Waller’s classic study, and it is a book worthy of review. Following the tradition of the symbolic interactionists, Lortie brought their qualities to the study of the occupation of teaching. Using several methods to gather effective data, Lortie portrayed the ethos of the teaching profession; the work is therefore an important book on the teaching profession and teacher culture. Though outstanding, Lortie’s work is also flawed in that his perspectives and methodology were adopted largely from symbolic interactionism. First, Lortie analyzed many points regarding teacher culture; for example, he was interested in exploring “sentiment,” “cathexis,” and “ethos.” Thus, he was more a psychologist than a sociologist. Second, symbolic interactionism led him to view the teacher culture from a micro perspective, thereby missing its structural aspects. For example, he did not fully discuss the issue of gender, and he ignored the issue of race. Finally, following the qualitative sociological tradition, Lortie employed many qualitative methods to gather data but focused only on obtaining and presenting interview data. Moreover, the measurement methods he used were too simplistic to analyze quantitative data fully.

Keywords: Lortie’s Schoolteacher, symbolic interactionism, teacher culture, teaching profession.

7378 Role-Governed Categorization and Category Learning as a Result from Structural Alignment: The RoleMap Model

Authors: Yolina A. Petrova, Georgi I. Petkov

Abstract:

The paper presents a symbolic model for category learning and categorization (called RoleMap). Unlike other models, which implement learning in a separate working mode, role-governed category learning and categorization emerge in RoleMap while it does its usual reasoning. The model is based on several basic mechanisms known to reflect the sub-processes of analogy-making. It builds on the assumption that in their everyday life people constantly compare what they experience with what they know. Various commonalities between the incoming information (current experience) and the stored one (long-term memory) emerge from those comparisons. Some of those commonalities are considered highly important, and they are transformed into concepts for further use. This process constitutes category learning. When there is missing knowledge in the incoming information (i.e., the perceived object is still not recognized), the model makes anticipations about what is missing, based on similar episodes from its long-term memory. Various such anticipations may emerge for different reasons. However, with time only one of them wins and is transformed into a category member. This process constitutes the act of categorization.

Keywords: Categorization, category learning, role-governed category, analogy-making, cognitive modeling.

7377 Classifier Based Text Mining for Neural Network

Authors: M. Govindarajan, R. M. Chandrasekaran

Abstract:

Applying knowledge discovery techniques to unstructured text is termed knowledge discovery in text (KDT), text data mining, or text mining. In neural networks that address classification problems, the training set, the testing set and the learning rate are key elements: the collection of input/output patterns used to train the network, the patterns used to assess network performance, and the rate at which weight adjustments are made. This paper describes a proposed back-propagation neural network classifier that performs cross-validation on the original neural network in order to optimize classification accuracy and training time. The feasibility and benefits of the proposed approach are demonstrated by means of five data sets: contact-lenses, cpu, weather.symbolic, weather, and labor-neg-data. It is shown that, compared to the existing neural network, training is more than 10 times faster when the dataset is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all datasets except contact-lenses, which is the only one with missing attributes. For contact-lenses, the accuracy with the proposed neural network was on average around 0.3% lower than with the original neural network. The algorithm is independent of specific data sets, so many of its ideas and solutions can be transferred to other classifier paradigms.
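
The cross-validation wrapper described here follows a routine pattern; a sketch with scikit-learn, using a built-in dataset as a stand-in for the Weka-style sets named above.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)   # stand-in for contact-lenses, cpu, ...

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
# 5-fold cross-validation of the back-propagation classifier.
scores = cross_val_score(clf, X, y, cv=5)
print("percent correct per fold:", scores * 100)
print("mean:", scores.mean() * 100)
```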

Keywords: Back propagation, classification accuracy, text mining, time complexity.

7376 AniMoveMineR: Animal Behavior Exploratory Analysis Using Association Rules Mining

Authors: Suelane Garcia Fontes, Silvio Luiz Stanzani, Pedro L. Pizzigatti Corrêa, Ronaldo G. Morato

Abstract:

Environmental changes and major natural disasters are increasingly prevalent in the world due to the damage that humanity has caused to nature, and this damage directly affects the lives of animals. Thus, the study of animal behavior and of animals' interactions with the environment can provide knowledge that guides researchers and public agencies in preservation and conservation actions. Exploratory analysis of animal movement can determine patterns of animal behavior, and with technological advances the ability to track animals, and consequently behavioral studies, has expanded. There is a lot of research on animal movement and behavior, but a proposal that combines resources to allow exploratory analysis of animal movement and provide statistical measures on individual animal behavior and its interaction with the environment is missing. The contribution of this paper is to present the framework AniMoveMineR, a unified solution that aggregates trajectory analysis and data mining techniques to explore animal movement data, as a first step in answering questions about individual animal behavior and interactions with other animals over time and space. We evaluated the framework using data from monitored jaguars near Miranda, in the Pantanal, Brazil, in order to verify whether AniMoveMineR allows the interaction level among these jaguars to be identified. The results were positive and provided indications about the individual behavior of the jaguars and about which jaguars had the highest or lowest correlation.

Keywords: Data mining, data science, trajectory, animal behavior.

7375 Identifying Missing Component in the Bechdel Test Using Principal Component Analysis Method

Authors: Raghav Lakhotia, Chandra Kanth Nagesh, Krishna Madgula

Abstract:

A lot has been said and discussed regarding the rationale and significance of the Bechdel score. It became a digital sensation in 2013, when Swedish cinemas began to showcase the Bechdel test score of a film alongside its rating. The test has drawn criticism from experts and the film fraternity regarding its use to rate the female presence in a movie. The pundits believe that the score is too simplified, and that the underlying criteria for a film to pass the test, that it must include 1) at least two women, 2) who have at least one dialogue, 3) about something other than a man, are egregious. In this research, we have considered a few more parameters that highlight how females are represented in film, such as the number of female dialogues in a movie, the dialogue genre, and the part-of-speech tags in the dialogue; these parameters were missing from the existing criteria for calculating the Bechdel score. The research analyzes 342 movie scripts to test the hypothesis that these extra parameters, together with the current Bechdel criteria, are significant in calculating the female representation score. The results of the Principal Component Analysis method conclude that the female dialogue content is a key component and should be considered while measuring the representation of women in a work of fiction.
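
The PCA step itself is compact; a sketch with an invented feature matrix follows (the paper's actual 342-script features are not reproduced, and the column names are illustrative).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Rows = movies; columns (illustrative): bechdel_pass, n_female_dialogues,
# dialogue_genre_score, pos_tag_ratio, n_named_women.
X = rng.random((342, 5))

Z = StandardScaler().fit_transform(X)    # PCA on standardized features
pca = PCA().fit(Z)
print("explained variance:", pca.explained_variance_ratio_.round(3))
# Large absolute loadings on the first components flag the key indicators.
print("PC1 loadings:", pca.components_[0].round(3))
```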

Keywords: Bechdel test, dialogue genre, parts of speech tags, principal component analysis.

7374 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests conducted according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with expert opinions. A procedure is proposed to manage missing data and is validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming at reducing the number of tests.

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

7373 Development of a Complex Meteorological Support System for UAVs

Authors: Z. Bottyán, F. Wantuch, A. Z. Gyöngyösi, Z. Tuba, K. Hadobács, P. Kardos, R. Kurunczi

Abstract:

The sensitivity of UAVs to atmospheric effects is apparent. All the same, the meteorological support for UAV missions is often inadequate or partly missing. In this paper we present a new complex meteorological support system for different types of UAV pilots, specialists and decision makers. The system has two important parts with different forecasting approaches: a statistical one and a dynamical one. The statistical prediction approach is based on a large climatological database and a special analog method which is able to select similar weather situations from that database and apply them during the forecasting procedure. The dynamic approach uses specific WRF model runs twice a day and produces 96-hour, high-resolution weather forecasts for UAV users over Hungary. An easy-to-use web-based system provides important weather information over the Carpathian Basin in Central Europe. The mentioned products can be reached via an internet connection.

Keywords: Aviation meteorology, statistical weather prediction, unmanned aerial systems, WRF.

7372 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data

Authors: Benjamin D. Leiby, Darryl K. Ahner

Abstract:

This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country conflict dataset motivates the search to impute missing values well over a common threshold of 20% missingness. The methodology capitalizes on correlation while using model residuals to provide the uncertainty in estimating unknown values. Examination of the methodology provides insight toward choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit residuals to fit each data element. The methodology evaluation includes observing computation time, model fit, and the comparison of known values to replaced values created through imputation. Overall, the country conflict dataset illustrates promise with modeling first-order interactions, while presenting a need for further refinement that mimics predictive mean matching.
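
The core move, a regression prediction plus residual-scaled noise, can be sketched generically as below; the paper's tailorable tolerances and term-selection logic are beyond this snippet, and the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def stochastic_impute(X, y, rng):
    """Fill missing y values with regression predictions plus residual noise,
    so imputations carry the model's estimation uncertainty."""
    missing = np.isnan(y)
    model = LinearRegression().fit(X[~missing], y[~missing])
    resid_sd = np.std(y[~missing] - model.predict(X[~missing]))
    filled = y.copy()
    filled[missing] = (model.predict(X[missing])
                       + rng.normal(0, resid_sd, missing.sum()))
    return filled

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                        # correlated predictors
y = X @ np.array([1.0, -0.5, 0.2]) + rng.normal(0, 0.3, 500)
y[rng.random(500) < 0.25] = np.nan                   # ~25% missingness
print(np.isnan(stochastic_impute(X, y, rng)).sum())  # 0 remaining
```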

Keywords: Correlation, country conflict, imputation, stochastic regression.

7371 Comparison of Machine Learning Techniques for Single Imputation on Audiograms

Authors: Sarah Beaver, Renee Bryce

Abstract:

Audiograms detect hearing impairment, but missing values pose problems. This work explores imputation in an attempt to improve accuracy, implementing Linear Regression, Lasso, Linear Support Vector Regression, Bayesian Ridge, K Nearest Neighbors (KNN), and Random Forest machine learning techniques to impute audiogram frequencies ranging from 125 Hz to 8000 Hz. The data contain patients who had, or were candidates for, cochlear implants. Accuracy is compared across two different Nested Cross-Validation k values. Over 4000 audiograms from 800 unique patients were used. Additionally, training compares combined left- and right-ear audiograms against single-ear audiograms. The Root Mean Square Error (RMSE) values for the best Random Forest models range from 4.74 to 6.37, and their R2 values range from .91 to .96. The RMSE values for the best KNN models range from 5.00 to 7.72, and their R2 values range from .89 to .95. The best imputation models achieved R2 between .89 and .96 and RMSE values less than 8 dB. We also show that the accuracy of classification predictive models was two percent higher with our imputation models than with constant imputations.
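
Of the six techniques compared, KNN imputation is the quickest to reproduce; a sketch with scikit-learn's KNNImputer on a toy audiogram matrix (the dB HL threshold values are invented).

```python
import numpy as np
from sklearn.impute import KNNImputer

# Rows = patients; columns = thresholds at 125, 250, 500, 1000, 4000, 8000 Hz.
audiograms = np.array([
    [20.0, 25.0, np.nan, 40.0, 55.0, 60.0],
    [15.0, 20.0, 30.0, np.nan, 50.0, 65.0],
    [10.0, 15.0, 25.0, 35.0, 45.0, np.nan],
    [25.0, 30.0, 40.0, 50.0, 60.0, 70.0],
    [22.0, 27.0, 33.0, 42.0, 57.0, 63.0],
])

# Each missing frequency is filled from the 2 most similar audiograms.
imputer = KNNImputer(n_neighbors=2)
print(imputer.fit_transform(audiograms).round(1))
```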

Keywords: Machine Learning, audiograms, data imputations, single imputations.

7370 Complex Energy Signal Model for Digital Human Fingerprint Matching

Authors: Jason Zalev, Reza Sedaghat

Abstract:

This paper describes a complex energy signal model that is isomorphic with digital human fingerprint images. By using signal models, the problem of fingerprint matching is transformed into the signal processing problem of finding a correlation between two complex signals that differ by phase-rotation and time-scaling. A technique for minutiae matching that is independent of image translation, rotation and linear-scaling, and is resistant to missing minutiae is proposed. The method was tested using random data points. The results show that for matching prints the scaling and rotation angles are closely estimated and a stronger match will have a higher correlation.

Keywords: Affine Invariant, Fingerprint Recognition, Matching, Minutiae.

7369 Complex Wavelet Transform Based Image Denoising and Zooming Under the LMMSE Framework

Authors: T. P. Athira, Gibin Chacko George

Abstract:

This paper proposes a dual-tree complex wavelet transform (DT-CWT) based directional interpolation scheme for noisy images. The problems of denoising and interpolation are modelled as estimating the noiseless and missing samples under the same framework of optimal estimation. Initially, the DT-CWT is used to decompose an input low-resolution noisy image into low- and high-frequency subbands. The high-frequency subband images are interpolated by linear minimum mean square error estimation (LMMSE) based interpolation, which preserves the edges of the interpolated images. For each noisy LR image sample, we compute multiple estimates of it along different directions and then fuse those directional estimates for a more accurately denoised LR image. The estimation parameters calculated in the denoising process can be readily reused to interpolate the missing samples. The inverse DT-CWT is applied to the denoised input and the interpolated high-frequency subband images to obtain the high-resolution image. Compared with conventional schemes that perform denoising and interpolation in tandem, the proposed DT-CWT based noisy image interpolation method can reduce many noise-caused interpolation artifacts and preserve the image edge structures well. The visual and quantitative results show that the proposed technique outperforms many of the existing denoising and interpolation methods.
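
The LMMSE estimator underlying both steps has a simple closed form under an additive-noise model y = x + n; a generic sketch for one subband follows (the directional fusion is not reproduced, and the variances are placeholders).

```python
import numpy as np

def lmmse_shrink(noisy_coeffs, noise_var):
    """LMMSE estimate of clean coefficients from y = x + n, n ~ N(0, noise_var).
    Signal variance is estimated from the observed coefficients."""
    mu = noisy_coeffs.mean()
    signal_var = max(noisy_coeffs.var() - noise_var, 0.0)
    gain = signal_var / (signal_var + noise_var)   # Wiener-style shrinkage
    return mu + gain * (noisy_coeffs - mu)

rng = np.random.default_rng(0)
clean = rng.laplace(0, 2.0, 1000)        # heavy-tailed, like wavelet subbands
noisy = clean + rng.normal(0, 1.0, 1000)
est = lmmse_shrink(noisy, noise_var=1.0)
# Mean squared error before and after shrinkage.
print(np.mean((noisy - clean) ** 2), np.mean((est - clean) ** 2))
```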

Keywords: Dual-tree complex wavelet transform (DT-CWT), denoising, interpolation, optimal estimation, super resolution.

7368 A Comparative Study on the Dimensional Error of 3D CAD Model and SLS RP Model for Reconstruction of Cranial Defect

Authors: L. Siva Rama Krishna, Sriram Venkatesh, M. Sastish Kumar, M. Uma Maheswara Chary

Abstract:

Rapid Prototyping (RP) is a technology that produces models and prototype parts from 3D CAD model data, CT/MRI scan data, and model data created from 3D object digitizing systems. There are several RP processes, such as Stereolithography (SLA), Solid Ground Curing (SGC), Selective Laser Sintering (SLS), Fused Deposition Modeling (FDM), and 3D Printing (3DP); among them, the SLS and FDM processes are used to fabricate patterns for custom cranial implants. RP technology is useful in engineering and biomedical applications. In engineering, it is helpful for product design, tooling, manufacture, etc. RP biomedical applications include the design and development of medical devices, instruments, prosthetics and implants; it is also helpful in planning complex surgical operations. The traditional approach limits the full appreciation of various bony structure movements, and with the custom implants thus produced it is difficult to measure the anatomy of the parts and to analyze changes in facial appearance accurately. Cranioplasty is the surgical correction of a defect in a cranial bone by implanting a metal or plastic replacement to restore the missing part. This paper aims to carry out a comparative study on the dimensional error of 3D CAD and SLS RP models for the reconstruction of a cranial defect by comparing the virtual CAD model with the physical RP model of the defect.

Keywords: Rapid Prototyping, Selective Laser Sintering, Cranial defect, Dimensional Error.

7367 2D Rigid Registration of MR Scans Using 1D Binary Projections

Authors: Panos D. Kotsas

Abstract:

This paper presents the application of a signal-intensity-independent registration criterion for 2D rigid-body registration of medical images using 1D binary projections. The criterion is defined as the weighted ratio of two projections. The ratio is computed on a pixel-per-pixel basis, and weighting is performed by setting the ratios between one and zero pixels to a standard high value. The mean squared value of the weighted ratio is computed over the union of the one areas of the two projections and is minimized using Chebyshev polynomial approximation with n = 5 points. The sum of the x and y projections is used for translational adjustment and a 45-degree projection for rotational adjustment. Twenty T1-T2 registration experiments were performed and gave mean errors of 1.19 degrees and 1.78 pixels. The method is suitable for contour/surface matching. Further research is necessary to determine the robustness of the method with regard to threshold, shape and missing data.

Keywords: Medical image, projections, registration, rigid.

7366 Robust Ellipse Detection by Fitting Randomly Selected Edge Patches

Authors: Watcharin Kaewapichai, Pakorn Kaewtrakulpong

Abstract:

In this paper, a method to detect multiple ellipses is presented. The technique is efficient and robust against incomplete ellipses due to partial occlusion, noise, missing edges and outliers. It is an iterative technique that finds and removes the best ellipse until no reasonable ellipse is found. At each run, the best ellipse is extracted from randomly selected edge patches, and its fitness is calculated and compared to a fitness threshold. The RANSAC algorithm is applied as the sampling process, together with the Direct Least Square fitting of ellipses (DLS) as the fitting algorithm. In our experiments, the method performs very well and is robust against noise and spurious edges on both synthetic and real-world image data.
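
The same sampling-plus-direct-least-squares combination is available off the shelf in scikit-image; a sketch on a synthetic, partially occluded ellipse with outliers follows (all geometry parameters are chosen arbitrarily).

```python
import numpy as np
from skimage.measure import EllipseModel, ransac

rng = np.random.default_rng(0)
t = np.linspace(0, 1.2 * np.pi, 80)          # only ~60% of the arc (occlusion)
pts = np.column_stack([120 + 60 * np.cos(t), 90 + 35 * np.sin(t)])
pts += rng.normal(0, 1.0, pts.shape)         # edge noise
clutter = rng.uniform(0, 200, (25, 2))       # spurious edges / outliers
data = np.vstack([pts, clutter])

# RANSAC samples minimal 5-point subsets and fits each by direct least squares.
model, inliers = ransac(data, EllipseModel, min_samples=5,
                        residual_threshold=2.0, max_trials=1000)
xc, yc, a, b, theta = model.params
print(round(xc), round(yc), round(a), round(b))   # recovered center and axes
```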

Keywords: Direct Least Square Fitting, Ellipse Detection, RANSAC

7365 Analysis and Research of Two-Level Scheduling Profile for Open Real-Time System

Authors: Yongxian Jin, Jingzhou Huang

Abstract:

In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is introduced, and it is pointed out that when hard and soft real-time applications are scheduled non-distinctively, as the same type of real-time application, the Quality of Service (QoS) cannot be guaranteed. This has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is to say, it neglects the characteristic differences between hard and soft real-time applications, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to expend much effort to ensure that no soft real-time application misses its deadline, and doing so may waste resources. In order to solve this problem, a novel two-level real-time scheduling mechanism (including a scheduling profile and a scheduling algorithm) which adds a process for dealing with soft real-time applications is proposed. Finally, we verify the real-time scheduling mechanism from the two aspects of theory and experiment. The results indicate that our scheduling mechanism achieves the following objectives: (1) it reflects the difference in priority when scheduling hard and soft real-time applications; (2) it ensures the schedulability of hard real-time applications, that is, their deadline miss rate is 0; (3) the overall deadline miss rate of soft real-time applications can be kept below 1; (4) the deadline of a non-real-time application is not set, whereas the scheduling algorithm used by the server can avoid the "starvation" of jobs and increase QoS. In this way, our scheduling mechanism is more compatible with different types of applications and can be applied more widely.

Keywords: Hard real-time, two-level scheduling profile, open real-time system, non-distinctive schedule, soft real-time
