Search results for: Four methods
3015 Emotion Classification for Students with Autism in Mathematics E-learning using Physiological and Facial Expression Measures
Authors: Hui-Chuan Chu, Min-Ju Liao, Wei-Kai Cheng, William Wei-Jen Tsai, Yuh-Min Chen
Abstract:
Avoiding learning failures in mathematics e-learning environments caused by emotional problems in students with autism has become an important topic in combining special education with information and communications technology. This study presents an adaptive emotional adjustment model in mathematics e-learning for students with autism, emphasizing the lack of emotional perception in mathematics e-learning systems. In addition, an emotion classification for students with autism was developed by inducing emotions in mathematical learning environments and recording changes in the physiological signals and facial expressions of students. Using these methods, 58 emotional features were obtained. These features were then processed using one-way ANOVA and information gain (IG). After reducing the feature dimension, support vector machines (SVM), k-nearest neighbors (KNN), and classification and regression trees (CART) were used to classify four emotional categories: baseline, happy, angry, and anxious. After testing and comparison, without feature selection the accuracy of the SVM classification reached as high as 79.3%. After using IG to reduce the feature dimension, with only 28 features remaining, SVM still achieved a classification accuracy of 78.2%. The results of this research could enhance the effectiveness of e-learning in special education.
Keywords: Emotion classification, Physiological and facial expression measures, Students with autism, Mathematics e-learning.
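A minimal sketch of the classification stage described above, assuming a feature matrix of 58 physiological and facial-expression features and four emotion labels are already available. scikit-learn's mutual_info_classif stands in for the information-gain ranking, and all data and variable names are illustrative placeholders, not the authors' code or dataset.

```python
# Hedged sketch: information-gain-style feature ranking followed by
# SVM / KNN / CART classification of four emotion categories.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 58))        # placeholder for the 58 extracted features
y = rng.integers(0, 4, size=120)      # 0=baseline, 1=happy, 2=angry, 3=anxious

# Rank features by mutual information (a stand-in for information gain)
scores = mutual_info_classif(X, y, random_state=0)
top28 = np.argsort(scores)[::-1][:28]  # keep the 28 best features, as in the study
X_reduced = X[:, top28]

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("CART", DecisionTreeClassifier(random_state=0))]:
    acc = cross_val_score(clf, X_reduced, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```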
3014 Analyzing of Temperature-Dependent Thermal Conductivity Effect in the Numerical Modeling of Fin-Tube Radiators: Introduction of a New Method
Authors: Farzad Bazdidi-Tehrani, Mohammad Hadi Kamrava
Abstract:
In all industries related to heat, suitable thermal ranges are defined for each device to operate well. Respecting these limits requires a thermal control unit alongside the main system. The Satellite Thermal Control Unit exploits different methods and facilities, individually or in combination. To enhance heat transfer between a primary surface and the environment, the use of radiating extended surfaces is common. Especially for large temperature differences, variable thermal conductivity has a strong effect on the performance of such a surface. In most of the literature, thermo-physical properties, such as thermal conductivity, are assumed to be constant. However, some recent studies consider the variation of these parameters, which may be helpful for evaluating the fin's temperature distribution over relatively large temperature differences. A new method is introduced to evaluate temperature-dependent thermal conductivity values. The finite volume method is employed to simulate numerically the temperature distribution in a space radiating fin. The present modeling is carried out for aluminum as the fin material and compared with a previous method. The present results are also compared with those of two other analytical methods, and good agreement is shown.
Keywords: Variable thermal conductivity, New method, Finite volume method, Combined heat transfer, Extended surface.
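To illustrate the kind of computation involved, here is a minimal one-dimensional sketch of a space-radiating fin with temperature-dependent conductivity, discretized on a finite-volume-style grid and solved by Picard (successive substitution) iteration. The geometry, the property law k(T) = k0(1 + beta*T), and all boundary values are illustrative assumptions, not the authors' model or results.

```python
# Hedged sketch: 1D radiating fin with temperature-dependent conductivity,
# k(T) = k0*(1 + beta*T), solved with Gauss-Seidel sweeps and Picard-frozen radiation.
import numpy as np

n, length, thick = 50, 0.1, 0.002     # cells, fin length [m], thickness [m] (assumed)
k0, beta = 200.0, 5e-4                # aluminum-like conductivity and its temperature coefficient
eps, sigma = 0.8, 5.670e-8            # emissivity, Stefan-Boltzmann constant
T_base, T_sink = 400.0, 0.0           # base temperature [K], deep-space sink [K]

dx = length / n
T = np.full(n, T_base)                # initial guess

def k(temp):
    return k0 * (1.0 + beta * temp)

for _ in range(5000):
    T_old = T.copy()
    for i in range(n):
        # face conductivities (arithmetic mean; base boundary treated with a full-cell spacing for brevity)
        kw = 0.5 * (k(T[i]) + (k(T[i - 1]) if i > 0 else k(T_base)))
        ke = 0.5 * (k(T[i]) + k(T[i + 1])) if i < n - 1 else 0.0   # insulated tip
        aW, aE = kw / dx**2, ke / dx**2
        TW = T[i - 1] if i > 0 else T_base
        TE = T[i + 1] if i < n - 1 else 0.0
        # radiative sink per unit volume, frozen at the previous iterate (Picard)
        S = (2.0 * eps * sigma / thick) * (T_old[i] ** 4 - T_sink ** 4)
        T[i] = (aW * TW + aE * TE - S) / (aW + aE)
    if np.max(np.abs(T - T_old)) < 1e-6:
        break

print("tip temperature [K]:", round(float(T[-1]), 2))
```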
3013 Skin Lesion Segmentation Using Color Channel Optimization and Clustering-based Histogram Thresholding
Authors: Rahil Garnavi, Mohammad Aldeen, M. Emre Celebi, Alauddin Bhuiyan, Constantinos Dolianitis, George Varigos
Abstract:
Automatic segmentation of skin lesions is the first step towards the automated analysis of malignant melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most effective color space for melanoma applications. This paper proposes an automatic segmentation algorithm based on color space analysis and clustering-based histogram thresholding, a process which is able to determine the optimal color channel for detecting the borders in dermoscopy images. The algorithm is tested on a set of 30 high-resolution dermoscopy images. A comprehensive evaluation of the results is provided, in which borders manually drawn by four dermatologists are compared to automated borders detected by the proposed algorithm, applying three previously used metrics of accuracy, sensitivity, and specificity and a new metric of similarity. By performing ROC analysis and ranking the metrics, it is demonstrated that the best results are obtained with the X and XoYoR color channels, resulting in an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods.
Keywords: Border detection, Color space analysis, Dermoscopy, Histogram thresholding, Melanoma, Segmentation.
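A minimal sketch of the color-channel thresholding step, assuming scikit-image is available. Otsu's histogram threshold stands in for the paper's clustering-based histogram thresholding, the X channel of CIE XYZ is used as one of the channels studied, and the file name and post-processing parameters are placeholders.

```python
# Hedged sketch: segment a lesion by thresholding the histogram of one color channel.
import numpy as np
from skimage import io, color, filters, morphology

img = io.imread("dermoscopy_sample.png")   # placeholder file name
xyz = color.rgb2xyz(img[..., :3])          # convert to CIE XYZ
channel = xyz[..., 0]                      # X channel, one of the channels studied

thresh = filters.threshold_otsu(channel)   # histogram threshold (stand-in for the clustering approach)
mask = channel < thresh                    # assumption: lesion pixels are darker than surrounding skin
mask = morphology.remove_small_objects(mask, min_size=500)  # simple post-processing

print("lesion pixels:", int(mask.sum()))
```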
3012 Determinants of Never Users of Contraception – Results from Pakistan Demographic and Health Survey 2012-13
Authors: Arsalan Jabbar, Wajiha Javed, Nelofer Mehboob, Zahid Memon
Abstract:
Introduction: There are multiple social, individual and cultural factors that influence an individual’s decision to adopt family planning methods, especially among non-users in patriarchal societies like Pakistan. Non-users, if targeted efficiently, can contribute significantly to a country’s contraceptive prevalence rate (CPR). A research study showed that non-users, if convinced to adopt the lactational amenorrhea method, can shift to long-term methods in the future. Research also shows that if non-users are targeted efficiently, a 59% reduction in unintended pregnancies in Saharan Africa and South-Central and South-East Asia is anticipated. Methods: We did a secondary data analysis on the Pakistan Demographic and Health Survey (2012-13) dataset. Use of contraception (never-use/ever-use) was the outcome variable. At the univariate level, the Chi-square/Fisher exact test was used to assess the relationship of baseline covariates with contraception use. Variables to be incorporated in the model were then checked for multicollinearity, confounding and interaction, and binary logistic regression (with an urban-rural stratification) was done to find the relationship between contraception use and baseline demographic and social variables. Results: The multivariate analyses showed that younger women (≤ 29 years) were more prone to be never users compared to those older than 30 years, and this trend was seen in urban areas (AOR 1.92, CI 1.453-2.536) as well as rural areas (AOR 1.809, CI 1.421-2.303). Regarding regional variation, women from urban Sindh (AOR 1.548, CI 1.142-2.099) and urban Balochistan (AOR 2.403, CI 1.504-3.839) had more never users compared to other urban regions. Women in the rich wealth quintile were more likely to be never users, both in urban (AOR 1.106, CI .753-1.624) and rural localities (AOR 1.162, CI .887-1.524), even though these results were not statistically significant. Women idealizing more children (>4) were more likely to be never users than those idealizing fewer children, in both urban (AOR 1.854, CI 1.275-2.697) and rural areas (AOR 2.101, CI 1.514-2.916). Women who never lost a pregnancy were more inclined to be non-users in rural areas (AOR 1.394, CI 1.127-1.723). Women familiar with only traditional methods or no method had more never users in rural areas (AOR 1.717, CI 1.127-1.723), but in urban areas the effect was not significant. Women unaware of a Lady Health Worker’s presence in their area were more likely to be never users, especially in rural areas (AOR 1.276, CI 1.014-1.607). Women who did not visit any care provider were more likely to be never users (urban: AOR 11.738, CI 9.112-15.121; rural: AOR 7.832, CI 6.243-9.826). Discussion/Conclusion: This study concluded that government, policy makers and private-sector family planning programs should focus on the untapped pool of never users (younger women from underserved provinces, in higher wealth quintiles, who desire more children). Catchment areas with fewer LHWs and fewer providers must be covered, as ignorance of modern methods and never having been visited by an LHW are important determinants of never use. This is all in line with previous literature from similar developing countries.
Keywords: Contraception, Demographic and Health Survey, Family Planning, Never users.
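A minimal sketch of how adjusted odds ratios (AOR) of the kind reported above can be obtained with binary logistic regression. statsmodels is assumed, the data are synthetic, and the column names are illustrative stand-ins, not the actual DHS variable names.

```python
# Hedged sketch: binary logistic regression for never-use of contraception,
# reporting adjusted odds ratios (AOR) with 95% confidence intervals.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "never_user": rng.integers(0, 2, n),         # outcome: 1 = never used contraception
    "age_le_29": rng.integers(0, 2, n),          # 1 = aged 29 or younger
    "ideal_children_gt4": rng.integers(0, 2, n), # 1 = ideal number of children > 4
    "knows_lhw": rng.integers(0, 2, n),          # 1 = aware of a Lady Health Worker nearby
})

model = smf.logit("never_user ~ age_le_29 + ideal_children_gt4 + knows_lhw", data=df).fit(disp=0)
aor = np.exp(model.params)                       # odds ratios
ci = np.exp(model.conf_int())                    # 95% CI on the odds-ratio scale
print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```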
3011 Application of Computational Methods MM2 and Gaussian for Studying Unimolecular Decomposition of Vinyl Ethers Based on the Mechanism of Hydrogen Bonding
Authors: Behnaz Shahrokh, Garnik N. Sargsyan, Arkadi B. Harutyunyan
Abstract:
Investigations of the unimolecular decomposition of vinyl ethyl ether (VEE), vinyl propyl ether (VPE) and vinyl butyl ether (VBE) have shown that activation of an ether molecule results in the formation of a cyclic construction, the transition state (TS), which may lead to displacement of the thermodynamic equilibrium towards the reaction products. The TS is obtained by applying energy minimization relative to the ground state of an ether using the MM2 program, taking into account hydrogen bond formation between a hydrogen atom of the alkyl residue and the terminal carbon atom of the vinyl group. The dissociation of the TS into the products is studied by an energy minimization procedure using the Gaussian program. The calculated data for VEE indicate that the decomposition of this ether may proceed via hydrogen bond formation in two possible ways: when the α- or β-hydrogen atoms of the ethyl group are bound to a carbon atom of the vinyl group. Applying the same calculation methods to the other ethers (VPE and VBE), it is shown that only hydrogen bonding between the α-hydrogen atom of the alkyl residue and the terminal carbon atom of the vinyl group (αH---C) results in decomposition of these ethers.
Keywords: Gaussian, MM2, ethers, TS, decomposition.
3010 Optimizing Forecasting for Indonesia's Coal and Palm Oil Exports: A Comparative Analysis of ARIMA, ANN, and LSTM Methods
Authors: Mochammad Dewo, Sumarsono Sudarto
Abstract:
The Exponential Triple Smoothing approach currently used to forecast the export value of Indonesia's two major commodities, coal and palm oil, has a Mean Absolute Percentage Error (MAPE) of 30-50%, which may be considered a "reasonable" forecasting error. Forecasting errors of more than 30% have a domino effect on industrial output, as extra production adds to raw material, manufacturing and storage expenses. In contrast, reaching an "excellent" classification with an error value of less than 10% would give new investors and exporters confidence in the commercial development of related sectors. Industrial growth in turn has a positive impact on economic development. The approach can also be applied to other commodities if the forecast error is less than 10%. The purpose of this project is to create a forecasting technique that can produce precise forecasting results with an error of less than 10%. This research analyzes forecasting methods such as ARIMA (Autoregressive Integrated Moving Average), ANN (Artificial Neural Network) and LSTM (Long Short-Term Memory). With a MAPE of 1%, this study reveals that ANN is the most successful strategy for forecasting coal and palm oil commodities in Indonesia.
Keywords: ANN, Artificial Neural Network, ARIMA, Autoregressive Integrated Moving Average, export value, forecast, LSTM, Long Short Term Memory.
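A minimal sketch of the evaluation loop implied above, assuming a monthly export-value series is already available as a pandas Series. statsmodels' ARIMA and scikit-learn's MLPRegressor (a small ANN stand-in) are compared via MAPE on a hold-out set; the ARIMA order, the lag window and the synthetic series are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch: compare ARIMA and a simple ANN on a commodity export series using MAPE.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

rng = np.random.default_rng(0)
y = pd.Series(100 + np.cumsum(rng.normal(0, 2, 120)))   # placeholder export-value series
train, test = y.iloc[:-12], y.iloc[-12:]

# ARIMA forecast (the order is an illustrative choice)
arima_fc = ARIMA(train, order=(1, 1, 1)).fit().forecast(steps=12)

# ANN forecast from 12 lagged values, recursive multi-step prediction
lags = 12
Xtr = np.array([y.values[i:i + lags] for i in range(len(train) - lags)])
ytr = train.values[lags:]
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(Xtr, ytr)
history = list(train.values[-lags:])
ann_fc = []
for _ in range(12):
    pred = ann.predict(np.array(history[-lags:]).reshape(1, -1))[0]
    ann_fc.append(pred)
    history.append(pred)

print(f"ARIMA MAPE: {mape(test, arima_fc):.2f}%")
print(f"ANN   MAPE: {mape(test, ann_fc):.2f}%")
```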
3009 Ensemble Approach for Predicting Student's Academic Performance
Authors: L. A. Muhammad, M. S. Argungu
Abstract:
Educational data mining (EDM) has received substantial attention. Various data mining techniques have been proposed to uncover hidden knowledge in educational data. The results of such studies assist academic institutions in further enhancing their learning processes and methods of passing knowledge to students. Consequently, the performance of students is boosted and the educational products are undoubtedly enhanced. This study adopted a student performance prediction model premised on data mining techniques with Students' Essential Features (SEF). SEF are linked to the learner's interactivity with the e-learning management system. The performance of the student predictive model is assessed by a set of classifiers, viz. Bayes Network, Logistic Regression, and Reduced Error Pruning (REP) Tree. Ensemble methods of Bagging, Boosting, and Random Forest (RF) are then applied to improve the performance of these single classifiers. The study reveals a robust affinity between learners' behaviors and their academic attainment. The results show that the REP Tree and its ensembles record the highest accuracy of 83.33% using SEF. In terms of the Receiver Operating Characteristic (ROC) curve, the boosted REP Tree records 0.903, which is the best value. This result further demonstrates the dependability of the proposed model.
Keywords: Ensemble, bagging, Random Forest, boosting, data mining, classifiers, machine learning.
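A minimal sketch of the ensemble comparison described above, assuming a numeric feature matrix built from learners' interaction features (SEF) and binary pass/fail labels. A pruned scikit-learn DecisionTreeClassifier is used as a rough stand-in for Weka's REP Tree, the data are synthetic, and recent scikit-learn (≥ 1.2) parameter names are assumed, so the results are only indicative.

```python
# Hedged sketch: compare a single tree with bagging, boosting, and random-forest ensembles.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)  # placeholder SEF data

base = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0)  # pruned tree (REP Tree stand-in)
models = {
    "single tree": base,
    "bagging": BaggingClassifier(estimator=base, n_estimators=50, random_state=0),
    "boosting": AdaBoostClassifier(estimator=base, n_estimators=50, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:13s} accuracy={acc:.3f}  ROC AUC={auc:.3f}")
```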
3008 Complex Condition Monitoring System of Aircraft Gas Turbine Engine
Authors: A. M. Pashayev, D. D. Askerov, C. Ardil, R. A. Sadiqov, P. S. Abdullayev
Abstract:
Research shows that the application of probability-statistical methods is unfounded at the early stages of diagnosing the technical condition of an aviation Gas Turbine Engine (GTE), when the flight information is fuzzy, limited and uncertain. Hence, the efficiency of applying the new Soft Computing technology, using Fuzzy Logic and Neural Network methods, at these diagnostic stages is considered. To this end, fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To build a more adequate model of GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Analysis of the changes in skewness and kurtosis coefficient values shows that the distributions of GTE operating and output parameters of the generalised multiple linear and non-linear models are estimated in the presence of measurement noise using a new recursive Least Squares Method (LSM). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical conditions. As an application of the given technique, the technical condition of a newly operating aviation engine was estimated.
Keywords: aviation gas turbine engine, technical condition, fuzzy logic, neural networks, fuzzy statistics
3007 Recommended Practice for Experimental Evaluation of the Seepage Sensitivity Damage of Coalbed Methane Reservoirs
Authors: Hao Liu, Lihui Zheng, Chinedu J. Okere, Chao Wang, Xiangchun Wang, Peng Zhang
Abstract:
The coalbed methane (CBM) extraction industry (an unconventional energy source) has not established guidelines for the experimental evaluation of sensitivity damage in coal samples. The experimental process in previous research mainly followed the industry standard for conventional oil and gas reservoirs (CIS). However, the existing evaluation method ignores certain critical differences between CBM reservoirs and conventional reservoirs, which inevitably results in an inaccurate evaluation of sensitivity damage and, eventually, poor decisions regarding the formulation of formation damage prevention measures. In this study, we propose improved experimental guidelines for evaluating the seepage sensitivity damage of CBM reservoirs by addressing the shortcomings of the existing methods. The proposed method was established via a theoretical analysis of the main drawbacks of the existing methods and validated through comparative experiments. The results show that the proposed evaluation technique provides reliable experimental results that better reflect actual reservoir conditions and correctly guide the future development of CBM reservoirs. This study pioneers research on the optimization of experimental parameters for the efficient exploration and development of CBM reservoirs.
Keywords: Coalbed methane, formation damage, permeability, unconventional energy source.
3006 An Implementation of Fuzzy Logic Technique for Prediction of the Power Transformer Faults
Authors: Omar M. Elmabrouk, Roaa Y. Taha, Najat M. Ebrahim, Sabbreen A. Mohammed
Abstract:
Power transformers are the most crucial part of the electrical power system and the distribution and transmission grid. They are maintained using a predictive or condition-based maintenance approach. The diagnosis of power transformer condition is performed based on Dissolved Gas Analysis (DGA). Five main methods are utilized for analyzing these gases: the International Electrotechnical Commission (IEC) gas ratio, Key Gas, Rogers gas ratio, Doernenburg, and Duval Triangle methods. Moreover, due to the importance of transformers, there is a need for an accurate technique to diagnose and hence predict the transformer condition. The main objective of such a technique is to avoid transformer faults and hence to maintain the electrical power system and the distribution and transmission grid. In this paper, DGA was performed on data collected from the transformer records available at the General Electricity Company of Libya (GECOL), located in Benghazi, Libya. The Fuzzy Logic (FL) technique was implemented as a diagnostic approach based on the IEC gas ratio method. The FL technique gave better results and proved to be an accurate prediction technique for power transformer faults. The technique should also be of interest to readers and researchers concerned with FL mathematics and power transformers.
Keywords: Fuzzy logic, dissolved gas-in-oil analysis, DGA, prediction, power transformer.
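To make the gas-ratio idea concrete, here is a minimal sketch that computes the three IEC-style gas ratios from dissolved-gas concentrations and applies simple triangular fuzzy memberships to one of them. The membership breakpoints and the example gas values are illustrative assumptions, not GECOL data or the paper's rule base.

```python
# Hedged sketch: IEC-style DGA gas ratios with a simple triangular fuzzy membership.
def ratio(num, den):
    return num / den if den else float("inf")

def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Example dissolved-gas concentrations in ppm (illustrative values only)
gas = {"H2": 100.0, "CH4": 120.0, "C2H6": 65.0, "C2H4": 50.0, "C2H2": 4.0}

r1 = ratio(gas["C2H2"], gas["C2H4"])   # acetylene / ethylene
r2 = ratio(gas["CH4"], gas["H2"])      # methane / hydrogen
r3 = ratio(gas["C2H4"], gas["C2H6"])   # ethylene / ethane
print(f"ratios: C2H2/C2H4={r1:.2f}, CH4/H2={r2:.2f}, C2H4/C2H6={r3:.2f}")

# Fuzzify one ratio into "low / medium / high" with assumed breakpoints
memberships = {
    "low":    tri(r2, -0.1, 0.0, 0.3),
    "medium": tri(r2, 0.1, 0.6, 1.1),
    "high":   tri(r2, 0.9, 2.0, 5.0),
}
print("CH4/H2 memberships:", {k: round(v, 2) for k, v in memberships.items()})
```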
3005 Predictive Analytics of Student Performance Determinants in Education
Authors: Mahtab Davari, Charles Edward Okon, Somayeh Aghanavesi
Abstract:
Every institution of learning is usually interested in the performance of its enrolled students. The level of these performances determines the approach an institution may adopt in rendering academic services. The focus of this paper is to evaluate students' academic performance in given courses of study using machine learning methods. This study evaluated various supervised machine learning classification algorithms such as Logistic Regression (LR), Support Vector Machine (SVM), Random Forest, Decision Tree, K-Nearest Neighbors, Linear Discriminant Analysis (LDA), and Quadratic Discriminant Analysis, using selected features to predict study performance. The accuracy, precision, recall, and F1 score obtained from a 5-fold cross-validation were used to determine the best classification algorithm to predict students’ performance. SVM (using a linear kernel), LDA, and LR were identified as the best-performing machine learning methods. Also, using the LR model, this study identified students' educational habits, such as reading and paying attention in class, as strong determinants of above-average performance. Other important features include the student's academic history and work. Demographic factors such as age, gender, and high school graduation had no significant effect on a student's performance.
Keywords: Student performance, supervised machine learning, prediction, classification, cross-validation.
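A minimal sketch of the model-comparison protocol described above, assuming a numeric feature matrix of student attributes and a binary above/below-average label. Metrics are averaged over 5-fold cross-validation; the feature data are synthetic placeholders, not the study's dataset.

```python
# Hedged sketch: compare several classifiers with 5-fold cross-validation on
# accuracy, precision, recall, and F1.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=12, random_state=0)  # placeholder student data

models = {
    "LR": LogisticRegression(max_iter=1000),
    "SVM (linear)": SVC(kernel="linear"),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
}
scoring = ["accuracy", "precision", "recall", "f1"]
for name, model in models.items():
    cv = cross_validate(model, X, y, cv=5, scoring=scoring)
    summary = ", ".join(f"{m}={cv['test_' + m].mean():.3f}" for m in scoring)
    print(f"{name:14s} {summary}")
```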
3004 Methods for Analyzing the Energy Efficiency and Cost Effectiveness of Evaporative Cooling Air Conditioning
Authors: A. Fouda, Z. Melikyan
Abstract:
Air conditioning systems in houses consume a large quantity of electricity. To reduce energy consumption for air conditioning purposes, the use of evaporative cooling air conditioning, which consumes less energy than air chillers, is becoming attractive. However, the higher energy efficiency of evaporative cooling is not by itself enough to judge whether evaporative cooling is economically competitive with other types of cooling systems. To prove the higher energy efficiency and cost effectiveness of evaporative cooling, a comparative analysis of the various types of cooling systems should be carried out. For this purpose, an optimization mathematical model should be composed for each system based on a systems-approach analysis. In this paper, different types of evaporative cooling-heating systems are discussed, and methods for increasing their energy efficiency as well as for determining their design parameters are developed. The optimization mathematical models for each of them are composed, with the help of which the least specific costs for each of them are revealed. The comparison of specific costs showed that the most efficient and cost-effective is the "direct evaporating" system, if it is applicable for the given climatic conditions. The next, more universal, system, applicable to many climatic conditions and providing the least cost of heating and cooling, is also considered the "direct evaporating" system.
Keywords: Air conditioning system, evaporative cooling, mathematical model, optimization, thermoeconomic.
3003 Performance Analysis of Traffic Classification with Machine Learning
Authors: Htay Htay Yi, Zin May Aye
Abstract:
Network security plays a central role in the ICT environment because malicious users are continually growing in the realms of education, business and other ICT-related areas. Network security violations are typically described and examined centrally based on a security event management system. Firewalls, Intrusion Detection Systems (IDS) and Intrusion Prevention Systems are becoming essential to monitor or prevent potential violations, attack incidents and imminent threats. In this system, the firewall rules are set only where the system policies are needed. The dataset deployed in this system is derived from the testbed environment. DoS and PortScan traffic is applied in the testbed with firewall and IDS implementations. The network traffic is classified as normal or attack in the existing testbed environment based on six machine learning classification methods applied in the system, which are tested on the datasets obtained for DoS and PortScan. The dataset is based on CICIDS2017, and some features have been added. The system tested 26 features from the applied dataset. The aim is to reduce false positive rates and to improve accuracy in the implemented testbed design. The system also demonstrates good performance by selecting important features and comparing against an existing dataset using machine learning classifiers.
Keywords: False negative rate, intrusion detection system, machine learning methods, performance.
3002 Technology for Enhancing the Learning and Teaching Experience in Higher Education
Authors: Sara M. Ismael, Ali H. Al-Badi
Abstract:
The rapid development and growth of technology has changed the way educators and learners obtain information. Technology has created a new world of collaboration and communication among people. Incorporating new technology into the teaching process can enhance learning outcomes. Billions of individuals across the world are now connected, cooperating and contributing their knowledge and intelligence. Time is no longer wasted waiting until the teacher is ready to share information, as learners can go online and get it immediately.
The objectives of this paper are to understand the reasons why changes in teaching and learning methods are necessary, to find ways of improving them, and to investigate the challenges that present themselves in the adoption of new ICT tools in higher education institutes.
To achieve these objectives, two primary research methods were used: questionnaires, which were distributed among students at higher educational institutes, and multiple interviews with faculty members (teachers) from different colleges and universities, which were conducted to find out why teaching and learning methodology should change.
The findings show that both learners and educators agree that educational technology plays a significant role in enhancing instructors’ teaching style and students’ overall learning experience; however, time constraints, privacy issues, and not being provided with enough up-to-date technology do create some challenges.
Keywords: E-books, educational technology, educators, e-learning, learners, social media, Web 2.0, LMS.
3001 An Efficient Protocol for Cyclic Somatic Embryogenesis in Neem (Azadirachta indica A. Juss.)
Authors: Mithilesh Singh, Rakhi Chaturvedi
Abstract:
Neem is a highly heterozygous and commercially important perennial plant. Conventionally, it is propagated by seeds, which lose viability within two weeks. The strictly cross-pollinating nature of the plant poses a serious barrier to genetic improvement by conventional methods. Alternative methods of tree improvement, such as somatic hybridization, mutagenesis and genetic transformation, require an efficient in vitro plant regeneration system. In this regard, somatic embryogenesis, particularly secondary somatic embryogenesis, may offer an effective system for large-scale plant propagation without affecting the clonal fidelity of the regenerants. It can be used for synthetic seed production, which further bolsters conservation of this tree species, which is otherwise very difficult. The present report describes the culture conditions necessary to induce and maintain repetitive somatic embryogenesis, for the first time, in neem. Out of the various treatments tested, somatic embryos were induced directly from immature zygotic embryos of neem on MS + TDZ (0.1 μM) + ABA (4 μM) in more than 76% of cultures. Direct secondary somatic embryogenesis occurred from primary somatic embryos on MS + IAA (5 μM) + GA3 (5 μM) in 12.5% of cultures. Embryogenic competence of the explant as well as of the primary embryos was maintained for a long period by repeated subcultures at frequent intervals. A maximum of 10% of these somatic embryos were converted into plantlets.
Keywords: Azadirachta indica A. Juss., Cytokinin, Somatic embryogenesis, Zygotic embryo culture.
3000 Entropy Based Spatial Design: A Genetic Algorithm Approach (Case Study)
Authors: Abbas Siefi, Mohammad Javad Karimifar
Abstract:
We study the spatial design of experiments, in which we want to select the most informative subset, of prespecified size, from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology. In these applications, observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely describes any objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as that of maximizing the determinant of the covariance matrix of the chosen subset. This problem is NP-hard. When using these designs in computer experiments, the design space is in many cases very large, and it is not possible to calculate the exact optimal solution. Heuristic optimization methods can discover efficient experiment designs in situations where traditional designs cannot be applied, exchange methods are ineffective, and an exact solution is not possible. We developed a genetic algorithm (GA) to take advantage of its exploratory power. The successful application of this method is demonstrated on a large design space. We consider a real case of design of experiments; in our problem, the design space is very large, and the proposed GA was used to solve it.
Keywords: Spatial design of experiments, maximum entropy sampling, computer experiments, genetic algorithm.
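A minimal sketch of the entropy-based selection idea, assuming the design problem has been reduced to choosing k of n correlated sites so as to maximize the log-determinant of the corresponding covariance submatrix. The GA below (fixed-size subset individuals, truncation selection, union-and-repair crossover, swap mutation) is a generic implementation on synthetic data, not the authors' algorithm or case-study problem.

```python
# Hedged sketch: genetic algorithm for maximum-entropy subset selection
# (maximize log-determinant of the covariance submatrix of the chosen sites).
import numpy as np

rng = np.random.default_rng(0)
n, k = 40, 8                                  # candidate sites, subset size (illustrative)
A = rng.normal(size=(n, n))
cov = A @ A.T + n * np.eye(n)                 # synthetic positive-definite covariance

def fitness(subset):
    sign, logdet = np.linalg.slogdet(cov[np.ix_(subset, subset)])
    return logdet if sign > 0 else -np.inf

def random_subset():
    return np.sort(rng.choice(n, size=k, replace=False))

def crossover(a, b):
    pool = np.unique(np.concatenate([a, b]))  # union of parents, then repair to size k
    return np.sort(rng.choice(pool, size=k, replace=False))

def mutate(s, rate=0.2):
    s = s.copy()
    if rng.random() < rate:                   # swap one chosen site for an unchosen one
        out = np.setdiff1d(np.arange(n), s)
        s[rng.integers(k)] = rng.choice(out)
    return np.sort(s)

pop = [random_subset() for _ in range(60)]
for gen in range(100):
    parents = sorted(pop, key=fitness, reverse=True)[:20]   # truncation selection
    pop = parents + [mutate(crossover(parents[rng.integers(20)], parents[rng.integers(20)]))
                     for _ in range(40)]

best = max(pop, key=fitness)
print("best subset:", best, " log-det:", round(fitness(best), 3))
```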
2999 Technologies of Acylation of Hydroxyanthraquinones
Authors: Dmitry Yu. Korulkin, Raissa A. Muzychkina
Abstract:
This review presents generalized data on the different methods of synthesis of biologically active acylated hydroxyanthraquinones. The basic regularities of the syntheses are analyzed, and the effect of temperature, pH, solubility, catalysts and other factors on reaction product yield is revealed.
Keywords: Aminoacidic acylation, hydroxyanthraquinones, nucleophilic exchange, physiologically active substances.
2998 Technologies of Halogenation of Hydroxyanthraquinones
Authors: Dmitriy Yu. Korulkin, Raissa A. Muzychkina
Abstract:
This review presents generalized data on the different methods of synthesis of biologically active halogenated di-, tri- and tetrahydroxyanthraquinones. The basic regularities of the syntheses are analyzed, and the effect of temperature, pH, solubility, catalysts and other factors on reaction product yield is revealed.
Keywords: Electrophilic substitution, halogenation, hydroxyanthraquinones, physiologically active substances.
2997 Rotorcraft Performance and Environmental Impact Evaluation by Multidisciplinary Modelling
Authors: Pierre-Marie Basset, Gabriel Reboul, Binh DangVu, Sébastien Mercier
Abstract:
Rotorcraft provide invaluable services thanks to their Vertical Take-Off and Landing (VTOL), hover and low-speed capabilities. Yet their use is still often limited by their cost and environmental impact, especially noise and energy consumption. One of the main brakes on the expanded use of rotorcraft for urban missions is their environmental impact, and the first concern for the population is noise. In order to develop the transversal competency to assess the rotorcraft environmental footprint, a collaboration has been launched between six research departments within ONERA. The progress in terms of models and methods is capitalized in the numerical workshop C.R.E.A.T.I.O.N. ("Concepts of Rotorcraft Enhanced Assessment Through Integrated Optimization Network"). A typical mission for which the environmental impact issue is of great relevance has been defined. The first milestone is to perform the pre-sizing of a reference helicopter for this mission. In a second milestone, an alternative rotorcraft concept has been defined: a tandem rotorcraft with optional propulsion. The key design trends are given for the pre-sizing of this rotorcraft, aiming at a significant reduction of the global environmental impact while still giving flight performance and safety equivalent to the reference helicopter. The models and methods have been improved to capture earlier, and more globally, the relative variations in environmental impact when changing the rotorcraft architecture, the pre-design variables and the operation parameters.
Keywords: Environmental impact, flight performance, helicopter, rotorcraft pre-sizing.
2996 A Comparative Analysis of Heuristics Applied to Collecting Used Lubricant Oils Generated in the City of Pereira, Colombia
Authors: Diana Fajardo, Sebastián Ortiz, Oscar Herrera, Angélica Santis
Abstract:
Currently, a problem is arising in Colombia related to the collection of used lubricant oils, which are generated in growing amounts by the increasing vehicle fleet. This situation does not allow proper disposal of this type of waste, which in turn results in a negative impact on the environment. Therefore, through a comparative analysis of various heuristics, the best solution to the VRP (Vehicle Routing Problem) was selected by comparing costs and times for the collection of used lubricant oils in the city of Pereira, Colombia, since there are no management companies engaged in the direct administration of the collection of this pollutant. To achieve this aim, six two-phase solution proposals were discussed. First, the previously identified waste generation points were assigned to groups. Proposals one and four are based on the closeness of points, proposals two and five use the scanning method, and proposals three and six consider the capacity restriction of the collection vehicle. Subsequently, the routes were developed, in the first three proposals by Clarke and Wright's savings algorithm and in the remaining proposals by the Traveling Salesman optimization mathematical model. After applying these techniques, a comparative analysis of the results was performed, and it was determined which of the proposals presented the best values in terms of distance, cost and travel time.
Keywords: Heuristics, optimization model, savings algorithm, used vehicular oil, VRP.
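A minimal sketch of the Clarke and Wright savings step used in the first three proposals, assuming Euclidean distances, a single depot at index 0 and a fixed vehicle capacity. The coordinates and demands are illustrative, not the Pereira collection points.

```python
# Hedged sketch: Clarke and Wright savings algorithm for a capacitated routing problem.
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(0, 10, size=(9, 2))          # index 0 = depot, 1..8 = collection points
demand = [0] + list(rng.integers(1, 5, 8))     # litres of used oil per point (illustrative)
capacity = 12

def dist(i, j):
    return float(np.linalg.norm(pts[i] - pts[j]))

# Savings s(i, j) = d(0, i) + d(0, j) - d(i, j), processed in descending order
savings = sorted(((dist(0, i) + dist(0, j) - dist(i, j), i, j)
                  for i in range(1, 9) for j in range(i + 1, 9)), reverse=True)

routes = {i: [i] for i in range(1, 9)}         # start with one route per customer
load = {i: demand[i] for i in range(1, 9)}

for s, i, j in savings:
    ri = next((r for r, seq in routes.items() if seq[0] == i or seq[-1] == i), None)
    rj = next((r for r, seq in routes.items() if seq[0] == j or seq[-1] == j), None)
    if ri is None or rj is None or ri == rj:   # merge only routes with i, j as endpoints
        continue
    if load[ri] + load[rj] > capacity:         # respect vehicle capacity
        continue
    a, b = routes[ri], routes[rj]
    if a[-1] != i:
        a.reverse()                            # make i the tail of route ri
    if b[0] != j:
        b.reverse()                            # make j the head of route rj
    routes[ri] = a + b                         # merge, keeping i and j adjacent
    load[ri] += load[rj]
    del routes[rj], load[rj]

for r, seq in routes.items():
    print("route:", [0] + seq + [0], "load:", load[r])
```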
2995 A Renovated Cook's Distance Based On The Buckley-James Estimate In Censored Regression
Authors: Nazrina Aziz, Dong Q. Wang
Abstract:
Various methods based on regression ideas have been created to deal with data sets containing censored observations, i.e. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for the Buckley-James method thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's Distance, RD*_i, and has been developed based on Cook's idea. The renovated Cook's Distance RD*_i has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*_i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the estimate of the coefficients when the ith case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a single diagnostic measure such as RD*_i, in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
Keywords: Buckley-James estimators, censored regression, censored data, diagnostic analysis, product-limit estimator, renovated Cook's Distance.
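For reference, a minimal sketch of the classical Cook's distance that the renovated measure builds on, computed for an ordinary (uncensored) linear regression on synthetic data. This is not the Buckley-James-based RD*_i proposed in the paper, only the standard diagnostic that inspired it.

```python
# Hedged sketch: classical Cook's distance for ordinary least squares,
# the diagnostic idea that the renovated RD*_i extends to censored regression.
import numpy as np

rng = np.random.default_rng(3)
n, p = 40, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])   # design matrix with intercept
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)
y[5] += 6.0                                                   # plant one influential point

H = X @ np.linalg.inv(X.T @ X) @ X.T      # hat matrix
h = np.diag(H)                            # leverages
resid = y - H @ y
s2 = resid @ resid / (n - X.shape[1])     # residual variance estimate

# Cook's distance: D_i = e_i^2 / (p * s^2) * h_i / (1 - h_i)^2
D = (resid**2 / (X.shape[1] * s2)) * (h / (1 - h) ** 2)
print("most influential case:", int(np.argmax(D)), " D =", round(float(D.max()), 3))
```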
2994 A Methodology for Automatic Diversification of Document Categories
Authors: Dasom Kim, Chen Liu, Myungsu Lim, Soo-Hyeon Jeon, Byeoung Kug Jeon, Kee-Young Kwahk, Namgyu Kim
Abstract:
Recently, numerous documents including large volumes of unstructured data and text have been created because of the rapid increase in the use of social media and the Internet. Usually, these documents are categorized for the convenience of users, but the accuracy of manual categorization is not guaranteed, and such categorization requires a large amount of time and incurs huge costs. Many studies on automatic categorization have been conducted to help mitigate the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to categorize complex documents with multiple topics, because they work on the assumption that individual documents can be categorized into single categories only. Therefore, to overcome this limitation, some studies have attempted to categorize each document into multiple categories. However, the learning process employed in these studies involves training using a multi-categorized document set; these methods therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets built with traditional multi-categorization algorithms are provided. To overcome this limitation, in this study, we review our novel methodology for extending the category of a single-categorized document to multiple categories, and then introduce a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.
Keywords: Big Data Analysis, Document Classification, Text Mining, Topic Analysis.
2993 Socio-Technical Systems: Transforming Theory into Practice
Authors: L. Ngowi, N. H. Mvungi
Abstract:
This paper critically examines the evolution of socio-technical systems theory, its practices, and its challenges in system design and development. It examines concepts put forward by researchers focusing on the application of the theory in software engineering. Various methods have been developed that use socio-technical concepts based on systems engineering, without remarkable success. The main constraints are the large amount of data and the inefficient techniques used in applying the concepts in systems engineering for developing time-bound systems within a limited/controlled budget. This paper critically examines each of these methods, highlights bottlenecks and suggests the way forward. Since socio-technical systems theory only explains what to do, but not how to do it, engineers are not using the concept to save time and costs or to reduce the risks associated with new frameworks. Hence, a new framework, which can be considered a practical approach, is proposed; it borrows concepts from the soft systems method, agile systems development and object-oriented analysis and design to bridge the gap between theory and practice. The approach will enable systems engineers and software developers to use socio-technical systems theory in building worthwhile information systems and to avoid fragilities and hostilities in the work environment.
Keywords: Socio-technical systems, human centered design, software engineering, cognitive engineering, soft systems, systems engineering.
2992 Statistical Measures and Optimization Algorithms for Gene Selection in Lung and Ovarian Tumor
Authors: C. Gunavathi, K. Premalatha
Abstract:
Microarray technology is universally used in the study of disease diagnosis using gene expression levels. The main shortcoming of gene expression data is that they include thousands of genes and a small number of samples. Abundant methods and techniques have been proposed for tumor classification using microarray gene expression data. Feature or gene selection methods can be used to mine the genes that are directly involved in the classification and to eliminate irrelevant genes. In this paper, statistical measures such as T-statistics, Signal-to-Noise Ratio (SNR) and F-statistics are used to rank the genes. The ranked genes are used for further classification. The Particle Swarm Optimization (PSO) and Shuffled Frog Leaping (SFL) algorithms are used to find the significant genes among the top-m ranked genes. The Naïve Bayes Classifier (NBC) is used to classify the samples based on the significant genes. The proposed work is applied to lung and ovarian tumor datasets. The experimental results show that the proposed method achieves 100% accuracy on all three datasets, and the results are compared with previous works.
Keywords: Microarray, T-Statistics, Signal-to-Noise Ratio, F-Statistics, Particle Swarm Optimization, Shuffled Frog Leaping, Naïve Bayes Classifier.
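A minimal sketch of the ranking-then-classification pipeline described above, assuming a gene expression matrix with binary tumor/normal labels: the signal-to-noise ratio is computed per gene, the top-ranked genes are kept, and a Gaussian Naïve Bayes classifier is evaluated. The data are synthetic placeholders and the PSO/SFL search stage is omitted.

```python
# Hedged sketch: signal-to-noise ratio (SNR) gene ranking followed by Naive Bayes classification.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(4)
n_samples, n_genes = 60, 2000
X = rng.normal(size=(n_samples, n_genes))     # placeholder expression matrix
y = rng.integers(0, 2, n_samples)             # 0 = normal, 1 = tumor
X[y == 1, :25] += 1.5                         # make the first 25 genes informative

# SNR per gene: (mean_1 - mean_0) / (std_1 + std_0)
m1, m0 = X[y == 1].mean(axis=0), X[y == 0].mean(axis=0)
s1, s0 = X[y == 1].std(axis=0), X[y == 0].std(axis=0)
snr = (m1 - m0) / (s1 + s0)

# Ranking is done on all samples here for brevity; in practice it should be nested inside the CV.
top_m = np.argsort(np.abs(snr))[::-1][:50]    # keep the 50 top-ranked genes
acc = cross_val_score(GaussianNB(), X[:, top_m], y, cv=5).mean()
print(f"mean CV accuracy with top-50 SNR genes: {acc:.3f}")
```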
2991 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance
Authors: Rajinder Singh, Ram Valluru
Abstract:
The Chain Ladder (CL) method, the Expected Loss Ratio (ELR) method and the Bornhuetter-Ferguson (BF) method, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium- to longer-term liabilities. The relative strengths and weaknesses among the various alternative approaches revolve around stability in the recent loss development pattern, sufficiency and reliability of loss development data, and agreement or disagreement between reported losses to date and the ultimate loss estimate. The CL method results in volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good trade-off between the loss development approach (CL) and ELR, it generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years that have more development experience. Further, BF is based on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, like the economy, which impact various cohorts at the same chronological time but at staggered points along their lifetime development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period). The methodology was tested on an actual MI claim development dataset in which various cohorts followed a sigmoidal trend, but levels varied substantially depending upon the economic and operational conditions during the development period spanning many years. The proposed approach provides the ability to indirectly incorporate such exogenous factors and produce more stable loss forecasts for reserving purposes compared to the traditional CL and BF methods.
Keywords: Actuarial loss reserving techniques, logistic regression, parametric function, volatility.
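A minimal sketch of the parametric idea described above: fit a logistic (sigmoid) development curve to one cohort's cumulative losses and read the ultimate loss off the curve's asymptote. scipy's curve_fit is assumed, and the development data and starting values are illustrative, not actual MI experience.

```python
# Hedged sketch: fit a logistic development curve L(t) = U / (1 + exp(-k*(t - t0)))
# to one cohort's cumulative losses; the fitted U is the implied ultimate loss.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, U, k, t0):
    return U / (1.0 + np.exp(-k * (t - t0)))

# Illustrative cumulative reported losses by development period (e.g., quarters)
t = np.arange(1, 13)
losses = np.array([5, 14, 30, 52, 74, 90, 101, 108, 112, 114, 115, 116], dtype=float)

params, _ = curve_fit(logistic, t, losses, p0=[losses[-1], 0.5, 6.0], maxfev=10000)
U, k, t0 = params
print(f"ultimate loss estimate U = {U:.1f}, growth rate k = {k:.2f}, inflection t0 = {t0:.1f}")
print("indicated reserve (ultimate minus ever-to-date):", round(U - losses[-1], 1))
```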
2990 A Simple Affymetrix Ratio-transformation Method Yields Comparable Expression Level Quantifications with cDNA Data
Authors: Chintanu K. Sarmah, Sandhya Samarasinghe, Don Kulasiri, Daniel Catchpoole
Abstract:
Gene expression profiling is rapidly evolving into a powerful technique for investigating tumor malignancies. Researchers are overwhelmed with the microarray-based platforms and methods that give them the freedom to conduct large-scale gene expression profiling measurements. Simultaneously, investigations into cross-platform integration methods have started gaining momentum due to their underlying potential to help comprehend a myriad of broad biological issues in tumor diagnosis, prognosis, and therapy. However, comparing results from different platforms remains a challenging task, as various inherent technical differences exist between the microarray platforms. In this paper, we explain a simple ratio-transformation method, which can provide some common ground for the cDNA and Affymetrix platforms towards cross-platform integration. The method is based on the characteristic data attributes of the Affymetrix and cDNA platforms. In this work, we considered seven childhood leukemia patients and their gene expression levels on both platforms. With a dataset of 822 differentially expressed genes from both platforms, we applied a specific ratio treatment to the Affymetrix data, which subsequently showed an improvement in its relationship with the cDNA data.
Keywords: Gene expression profiling, microarray, cDNA, Affymetrix, childhood leukaemia.
2989 Technologies of Amination of Hydroxyanthraquinones
Authors: Dmitry Yu. Korulkin, Raissa A. Muzychkina
Abstract:
This review presents generalized data on the different methods of synthesis of biologically active aminated hydroxyanthraquinones. The basic regularities of the syntheses are analyzed, and the effect of temperature, pH, solubility, catalysts and other factors on reaction product yield is revealed.
Keywords: Amination, hydroxyanthraquinones, nucleophilic exchange, physiologically active substances.
2988 Through Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition
Authors: Hariton N. Costin, Iulian Ciocoiu, Tudor Barbu, Cristian Rotariu
Abstract:
In this paper, three different approaches for person verification and identification, i.e. by means of fingerprint, face and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach. The assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Components Analysis (ICA); c) NMF with sparse constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. As to voice/speaker recognition, mel cepstral and delta-delta mel cepstral analysis were used as the main methods in order to construct a supervised, speaker-dependent voice recognition system. The final decision (e.g. "accept/reject" for a verification task) is taken by using a majority voting technique applied to the three biometrics. The preliminary results, obtained for medium-sized databases of fingerprints, faces and voice recordings, indicate the feasibility of our study and an overall recognition precision (about 92%) permitting the utilization of our system for a future complex biometric card.
Keywords: Biometry, image processing, pattern recognition, speech analysis.
2987 Bilingual Gaming Kit to Teach English Language through Collaborative Learning
Authors: Sarayu Agarwal
Abstract:
This paper aims to teach English (a secondary language) by bridging the understanding between the regional language (the primary language) and the English language (the secondary language). Here, the primary language is the one a person has learned from birth or within the critical period, while a secondary language is any other language one learns or speaks. The paper also focuses on evolving old teaching methods into a contemporary participatory model of learning and teaching. Pilot studies were conducted to gauge students' knowledge of the English language. Teachers and students were interviewed and their academic curriculum was assessed as part of the initial study. Extensive literature study and design-thinking principles were used to devise a solution to the problem. The objective is met using a holistic learning kit/card game to teach children word recognition, word pronunciation, word spelling and word writing. The implication of the paper is a noticeable improvement in the understanding and grasp of the English language. With the increasing usage and applicability of English as a second language (ESL) the world over, the paper is relevant due to its easy replicability to any other primary or secondary language. The future scope of this paper is to transform the idea of participatory learning into self-regulated learning methods. With the upcoming government learning centres in rural areas and the provision of smart devices such as tablets, the development of the card games into digital applications seems very feasible.
Keywords: English as a second language, vocabulary-building, learning through gamification.
2986 EEG Analysis of Brain Dynamics in Children with Language Disorders
Authors: Hamed Alizadeh Dashagholi, Hossein Yousefi-Banaem, Mina Naeimi
Abstract:
The current study was established for EEG signal analysis in patients with language disorders. A language disorder can be defined as a meaningful delay in the use or understanding of spoken or written language. The disorder can include the content or meaning of language, its form, or its use. Here, we applied Z-score, power spectrum, and coherence methods to discriminate the language disorder data from those of healthy subjects. The power spectrum of each channel in the alpha, beta, gamma, delta, and theta frequency bands was measured. In addition, the intra-hemispheric Z-score was obtained by a scoring algorithm. The obtained results showed high Z-scores and power spectra in posterior regions. We can conclude that people with language disorders have high brain activity in the frontal region of the brain in comparison with healthy people. The results also showed that high coherence correlates with irregularities in the ERP and is often found during complex tasks, whereas low coherence is often found in pathological conditions. The Z-score analysis of the brain dynamics showed a higher Z-score peak frequency in the delta, theta and beta sub-bands of language disorder patients. In this analysis there were signs of activity in both hemispheres, and the left-dominant hemisphere was more active than the right.
Keywords: EEG, electroencephalography, coherence methods, language disorder, power spectrum, z-score.
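A minimal sketch of the power-spectrum and coherence computations mentioned above, assuming two EEG channels sampled at 256 Hz. scipy.signal's Welch and coherence estimators are used; the band limits, the synthetic signals and the segment length are illustrative assumptions, not the study's recording setup.

```python
# Hedged sketch: band power via Welch's method and inter-channel coherence for two EEG channels.
import numpy as np
from scipy.signal import welch, coherence

fs = 256                                     # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
ch1 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)        # alpha-dominated channel
ch2 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * rng.normal(size=t.size)  # correlated second channel

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

f, pxx = welch(ch1, fs=fs, nperseg=512)
for name, (lo, hi) in bands.items():
    idx = (f >= lo) & (f < hi)
    power = np.trapz(pxx[idx], f[idx])       # integrate the PSD over the band
    print(f"{name:5s} band power: {power:.3f}")

f_c, cxy = coherence(ch1, ch2, fs=fs, nperseg=512)
alpha = (f_c >= 8) & (f_c < 13)
print("mean alpha-band coherence between channels:", round(float(cxy[alpha].mean()), 3))
```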