Search results for: Genetic Algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4669

1309 Genesis and Survival Chance of Autotriploid in Natural Diploid Population of Lilium lancifolium Thunb

Authors: Ji-Won Park, Jong-Wha Kim

Abstract:

Triploid L. lancifolium have a wide geographic distribution. By contrast, diploid L. lancifolium have limited distributions in the islands and coastal regions of the South and West Korean Peninsula and northern Tsushima Island, Japan. L. lancifolium diploids and triploids are not sympatrically distributed with other lily species or ploidy lines in the West Sea and South Sea Islands of the Korean Peninsula. This observation raises the following questions: 'Why have autotriploid L. lancifolium never been observed on those isolated islands?' and 'What mechanism excludes the occurrence of autotriploids, if they arise?'. To determine the occurrence and survival of triploid plants in natural diploid populations of tiger lily (Lilium lancifolium), ploidy analysis was conducted on natural open-pollinated seeds produced from plants grown on isolated islands, and on hybrid seeds produced by artificial crossing between plant populations originating on different Korean islands. Normal seeds were classified into five grades depending on the ratio of embryo/endosperm lengths: 5/5, 4/5, 3/5, 2/5, and 1/5. Triploids were not observed among seedlings produced from natural open pollination on isolated islands. Triploids were detected only in seedlings of underdeveloped seed grades (3/5 and 2/5) from artificial crosses between populations from different isolated islands. The triploid occurrence frequency was calculated as 0.0 for natural open-pollinated seedlings and 0.000582 for artificial crosses (6 triploids from 10,303 seedlings). Triploids were produced from crosses between isolated populations located at least 70 km apart; no triploids were detected in inter-population crosses of plants originating on the same islands. Triploid seedlings have very low viability in soil. We analyzed the factors affecting triploid occurrence and survival in natural diploid populations of L. lancifolium. The results suggest that triploids originate from fertilization between plants that are genetically isolated due to geographical isolation and/or genotypic differences.

Keywords: Lilium lancifolium, autotriploid, natural population, genetic distance, 2n female gamete

Procedia PDF Downloads 511
1308 Maximum Power Point Tracking for Small Scale Wind Turbine Using Multilayer Perceptron Neural Network Implementation without Mechanical Sensor

Authors: Piyangkun Kukutapan, Siridech Boonsang

Abstract:

This article proposes maximum power point tracking (MPPT) for a small-scale wind turbine without a mechanical sensor, using a Multilayer Perceptron Neural Network (MLPNN). The aim is to reduce cost and complexity while retaining efficiency. The controller generates the duty cycle that delivers maximum power under the prevailing operating conditions. The measured data from the DC generator, voltage (V), current (I), power (P), rate of change of power (dP), and rate of change of voltage (dV), are used as inputs to the MLPNN model. The output of the model is the duty cycle that drives the converter. The experiment was implemented on an Arduino Uno board, and the MLPNN-based MPPT was compared with P&O (Perturb and Observe) control. The experimental results show that the proposed MLPNN-based approach is more efficient than the P&O algorithm for this application.
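
A minimal sketch of the idea in Python is given below, assuming a small synthetic training set in place of the Arduino/DC-generator measurements; the five inputs (V, I, P, dP, dV) and the duty-cycle output follow the abstract, while the data, network size, and target rule are illustrative assumptions.

```python
# Minimal sketch: train an MLP to map generator measurements to a duty cycle.
# The training data here is synthetic; in the paper the samples come from a
# DC generator and the duty-cycle labels from an MPPT search (assumption).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 2000
V = rng.uniform(5, 30, n)            # generator voltage (V)
I = rng.uniform(0.1, 5, n)           # generator current (A)
P = V * I                            # power (W)
dP = rng.normal(0, 1, n)             # change in power between samples
dV = rng.normal(0, 0.5, n)           # change in voltage between samples
X = np.column_stack([V, I, P, dP, dV])

# Hypothetical target: duty cycle in [0, 1] that a P&O search would converge to.
duty = np.clip(0.5 + 0.01 * dP / (np.abs(dV) + 1e-3), 0, 1)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                                   random_state=0))
model.fit(X, duty)

# At run time the five measurements are fed in and the predicted duty cycle
# drives the converter's PWM signal.
sample = np.array([[24.0, 2.1, 50.4, 0.3, -0.1]])
print("predicted duty cycle:", model.predict(sample)[0])
```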

Keywords: maximum power point tracking, multilayer perceptron neural network, optimal duty cycle, DC generator

Procedia PDF Downloads 315
1307 A Review of Encryption Algorithms Used in Cloud Computing

Authors: Derick M. Rakgoale, Topside E. Mathonsi, Vusumuzi Malele

Abstract:

Cloud computing offers distributed, online, and on-demand computational services from anywhere in the world. Cloud computing services have grown immensely over the past years, especially in the past year due to the Coronavirus pandemic. Cloud computing has changed the working environment and introduced the work-from-home phenomenon, which drove the adoption of technologies, including cloud service offerings, to support the new ways of working. The increased cloud computing adoption has come with new challenges regarding data privacy and integrity in the cloud environment. Previously proposed advanced encryption algorithms failed to reduce the memory space required for cloud computing performance, thus increasing the computational cost. This paper reviews the existing encryption algorithms used in cloud computing. In future work, an artificial neural network (ANN) based algorithm design will be presented as a security solution to ensure data integrity, confidentiality, privacy, and availability of user data in cloud computing. Moreover, MATLAB will be used to evaluate the proposed solution, and simulation results will be presented.

Keywords: cloud computing, data integrity, confidentiality, privacy, availability

Procedia PDF Downloads 117
1306 Molecular Characterization of Major Isolated Organism Involved in Bovine Subclinical Mastitis

Authors: H. K. Ratre, M. Roy, S. Roy, M. S. Parmar, V. Bhagat

Abstract:

Mastitis is a common problem in dairy industries. Reduction in milk production and irreparable damage to the udder associated with the disease are common causes of culling of dairy cows. Milk from infected animals is not suitable for drinking or for making different milk products, so mastitis has major economic importance in dairy cattle. The aims of this study were to investigate the bacteriological panorama in milk from udder quarters with subclinical mastitis and to carry out the molecular characterization of the major isolated organisms from subclinical mastitis-affected cows in and around the Durg and Rajnandgaon districts of Chhattisgarh. Isolation and identification of bacteria from the milk samples of subclinical mastitis-affected cows were done by standard and routine culture procedures. A total of 78 isolates were obtained from cows, and among the various bacteria isolated, Staphylococcus spp. occupied the prime position with an occurrence rate of 51.282%. The other bacteria isolated included Streptococcus spp. (20.512%), Micrococcus spp. (14.102%), E. coli (8.974%), Klebsiella spp. (2.564%), Salmonella spp. (1.282%) and Proteus spp. (1.282%). Staphylococcus spp. was thus isolated as the major causative agent of subclinical mastitis in the studied area. Molecular characterization of Staphylococcus aureus isolates was done for the expression of virulence genes such as 'nuc' (encoding the thermonuclease exoenzyme), coa and spa by PCR amplification of the respective genes in 25 Staphylococcus isolates. In the present study, 15 isolates (77.27%) out of 20 coagulase-positive isolates were found to be genotypically positive for 'nuc', whereas 20 isolates (52.63%) out of 38 CNS expressed the presence of the same virulence gene. Three Staphylococcus isolates were found to be genotypically positive for the coa gene, and amplification of the coa gene yielded two different products of 627 and 710 bp. Amplification of the gene segment encoding the IgG-binding region of protein A (spa) revealed products of 220 and 253 bp in two Staphylococcus isolates. The X-region binding of the spa gene produced an amplicon of 315 bp in one Staphylococcal isolate. Staphylococcus aureus was found to be the major isolate (51.28%) responsible for causing subclinical mastitis in cows and also showed expression of the virulence genes nuc, coa and spa.

Keywords: mastitis, bacteria, characterization, expression, gene

Procedia PDF Downloads 205
1305 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia

Authors: Triano Nurhikmat

Abstract:

Along with the progress of science and technology, the industrialized world in Indonesia has developed very rapidly, leading to faster industrialization of Indonesian society and the establishment of diverse companies and workplaces. Industrial development is closely tied to worker activity, and these work activities do not exclude the possibility of an accident involving either the workers or a construction project. Common causes of industrial accidents are electrical damage, faulty work procedures, and technique errors. The association rule method is one of the main techniques in data mining and is the most common form used for finding patterns in data collections. This research investigates the association relations between the attributes of industrial accidents. Using association rule analysis, the patterns obtained after two iterations (the 2-large itemsets) associate each accident factor with a location: for West Jakarta, industrial accidents caused by electrical damage have a support value of 0.2 and a confidence value of 1, and the reverse pattern has a support value of 0.2 and a confidence of 0.75.
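
The pattern mining described above can be sketched as follows; the toy accident records, the support/confidence thresholds, and the restriction to 2-itemsets are illustrative assumptions rather than the paper's data.

```python
# Minimal sketch of Apriori-style rule mining on toy accident records.
# Each transaction lists attributes of one industrial accident (records and
# thresholds here are illustrative, not the paper's data).
from itertools import combinations
from collections import Counter

records = [
    {"West Jakarta", "electrical damage"},
    {"West Jakarta", "electrical damage"},
    {"East Jakarta", "work procedure"},
    {"West Jakarta", "technique error"},
    {"Bandung", "electrical damage"},
]
min_support, min_confidence = 0.2, 0.6
n = len(records)

# Support counts for single items and pairs (the "2 large itemsets").
item_count = Counter(item for r in records for item in r)
pair_count = Counter(frozenset(p) for r in records for p in combinations(sorted(r), 2))

frequent_pairs = {p: c / n for p, c in pair_count.items() if c / n >= min_support}
for pair, support in frequent_pairs.items():
    a, b = tuple(pair)
    for lhs, rhs in ((a, b), (b, a)):                     # both rule directions
        confidence = support / (item_count[lhs] / n)
        if confidence >= min_confidence:
            print(f"{lhs} -> {rhs}: support={support:.2f}, confidence={confidence:.2f}")
```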

Keywords: association rule, data mining, industrial accidents, rules

Procedia PDF Downloads 284
1304 DNA Fingerprinting of Some Major Genera of Subterranean Termites (Isoptera) (Anacanthotermes, Psammotermes and Microtermes) from Western Saudi Arabia

Authors: AbdelRahman A. Faragalla, Mohamed H. Alqhtani, Mohamed M. M. Ahmed

Abstract:

Saudi Arabia is currently beset by diverse assemblages of subterranean termites, which inflict heavy damage on valued property in homes, storage facilities and warehouses, on agricultural and horticultural crops including okra, sweet pepper, tomatoes, sorghum, date palms and citrus, and on many forest domains and lush desert oases. The most pressing priority is to use modern technologies to overcome the painstaking obstacle of taxonomic identification of these injurious pests, which could lead to effective pest control in both infested agricultural commodities and field crops. Our study applied DNA fingerprinting technologies to generate basic information on the genetic similarity between three predominant families containing the most destructive termite species. The methodology included DNA extraction and isolation from members of the major families and the use of randomly selected primers and PCR amplification of the nucleotide sequences. GC content and annealing temperatures for all primers, PCR amplification and agarose gel electrophoresis were also determined, in addition to the scoring and analysis of Random Amplified Polymorphic DNA-PCR (RAPD). A phylogenetic analysis of the different species, based on the RAPD-DNA results and represented as a dendrogram of the average band-sharing ratio between species, was carried out with a statistical computer program. Our study aims to shed more light on this intriguing subject, which may lead to an expedited display of the kinship and relatedness of species in an ambitious undertaking to arrive at a correct taxonomic classification of termite species and to discover sibling species, so that a rational pest management strategy can be delineated.

Keywords: DNA fingerprinting, Western Saudi Arabia, DNA primers, RAPD

Procedia PDF Downloads 417
1303 The Influence of Beta Shape Parameters in Project Planning

Authors: Alexios Kotsakis, Stefanos Katsavounis, Dimitra Alexiou

Abstract:

Networks can be utilized to represent project planning problems, using nodes for activities and arcs to indicate precedence relationships between them. For fixed activity durations, a simple algorithm calculates the amount of time required to complete a project, followed by the activities that comprise the critical path. Program Evaluation and Review Technique (PERT) generalizes the above model by incorporating uncertainty, allowing activity durations to be random variables, yet producing a relatively crude solution in planning problems. In this paper, based on the findings of the relevant literature, which strongly suggests that a Beta distribution can be employed to model earthmoving activities, we utilize Monte Carlo simulation to estimate the project completion time distribution and measure the influence of skewness, an element inherent in activities of modern technical projects. We also extract the activity criticality index, with the ultimate goal of producing more accurate planning estimations.
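
A minimal Python sketch of the Monte Carlo approach follows, assuming a toy three-activity network and illustrative Beta shape parameters; it estimates the completion time distribution and a simple criticality index as described above.

```python
# Minimal sketch: Monte Carlo estimate of the project completion time when
# activity durations follow scaled Beta distributions. The small network and
# the Beta shape parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

def beta_duration(a, b, low, high, size):
    """Duration = low + Beta(a, b) * (high - low); skewness is set by (a, b)."""
    return low + rng.beta(a, b, size) * (high - low)

# Three activities: A and B in parallel, both preceding C.
dur_A = beta_duration(2.0, 5.0, 4, 10, N)    # right-skewed
dur_B = beta_duration(5.0, 2.0, 3, 12, N)    # left-skewed
dur_C = beta_duration(2.0, 2.0, 2, 6, N)     # symmetric

completion = np.maximum(dur_A, dur_B) + dur_C
print("mean completion time:", completion.mean())
print("95th percentile     :", np.percentile(completion, 95))
# Criticality index of activity A = fraction of runs in which A lies on the
# longest (critical) path.
print("criticality of A    :", np.mean(dur_A >= dur_B))
```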

Keywords: beta distribution, PERT, Monte Carlo simulation, skewness, project completion time distribution

Procedia PDF Downloads 137
1302 'Low Electronic Noise' Detector Technology in Computed Tomography

Authors: A. Ikhlef

Abstract:

Image noise in computed tomography is mainly caused by statistical noise, system noise, and the reconstruction algorithm's filters. Over the last few years, low-dose x-ray imaging has become more and more desirable and is seen as a technically differentiating technology among CT manufacturers. In order to achieve this goal, several technologies and techniques are being investigated, including both hardware (integrated electronics and photon counting) and software (artificial intelligence and machine learning) based solutions. From a hardware point of view, electronic noise could indeed be a potential driver for low and ultra-low dose imaging. We demonstrated that the reduction or elimination of this term could lead to a reduction of dose without affecting image quality. Also, in this study, we will show that this goal can be achieved using conventional electronics (a low-cost and affordable technology), designed carefully and optimized for maximum detective quantum efficiency. We conducted the tests using large imaging objects such as 30 cm water and 43 cm polyethylene phantoms. We compared the image quality with conventional imaging protocols with radiation as low as 10 mAs (<< 1 mGy). Clinical validation of these results has been performed as well.

Keywords: computed tomography, electronic noise, scintillation detector, x-ray detector

Procedia PDF Downloads 109
1301 Inverse Problem Method for Microwave Intrabody Medical Imaging

Authors: J. Chamorro-Servent, S. Tassani, M. A. Gonzalez-Ballester, L. J. Roca, J. Romeu, O. Camara

Abstract:

Electromagnetic and microwave imaging (MWI) have been used in medical imaging in recent years, the most common applications being breast cancer and stroke detection or monitoring. In those applications, the subject or zone to observe is surrounded by a number of antennas, and the Nyquist criterion can be satisfied. Additionally, the space between the antennas (transmitting and receiving the electromagnetic fields) and the zone to study can be prepared as a homogeneous scenario. However, this may differ in other cases, such as intracardiac catheters, stomach monitoring devices, pelvic organ systems, liver ablation monitoring devices, or uterine fibroid ablation systems. In this work, we analyzed different MWI algorithms to find the most suitable method for dealing with an intrabody scenario. Due to the space limitations usually confronted in those applications, the device would have a cylindrical configuration with a maximum of eight transmitting and eight receiving antennas. This, together with the positioning of the supposed device inside a body tract, imposes additional constraints on the choice of a reconstruction method; for instance, it prevents the use of well-known algorithms such as filtered backpropagation for diffraction tomography (due to the unusual configuration with probes enclosed by the imaging region). Finally, the difficulty of simulating a realistic non-homogeneous background inside the body (due to the incomplete knowledge of the dielectric properties of the tissues between the antennas' position and the zone to observe) also prevents the use of Born and Rytov algorithms, owing to their limitations with a heterogeneous background. Instead, we decided to use a time-reversal algorithm (mostly used in geophysics) due to its ability to ignore heterogeneities in the background medium and to focus its generated field onto the scatterers. Therefore, a 2D time-reversed finite difference time domain code was developed based on the time-reversal approach for microwave breast cancer detection. Simultaneously, an in-silico testbed was developed to compare ground-truth dielectric properties with the corresponding microwave imaging reconstruction. Forward and inverse problems were computed varying: the frequency used, related to a small zone to observe (7, 7.5 and 8 GHz); a small polyp diameter (5, 7 and 10 mm); two polyp positions with respect to the closest antenna (aligned or misaligned); and the (transmitter-to-receiver) antenna combination used for the reconstruction (1-1, 8-1, 8-8 or 8-3). Results indicate that when using the existing time-reversal method for breast cancer with the different combinations of transmitters and receivers, we found false positives due to the high degrees of freedom and unusual configuration (and the possible violation of the Nyquist criterion). The false positives found with the 8-1 and 8-8 combinations were highly reduced with the 1-1 and 8-3 combinations, the 8-3 configuration (three neighboring receivers at each time) being the most suitable. The 8-3 configuration creates a reduced region-of-interest problem, decreasing the ill-posedness of the inverse problem. To conclude, the proposed algorithm solves the main limitations of the described intrabody application, successfully detecting the angular position of targets inside the body tract.

Keywords: FDTD, time-reversed, medical imaging, microwave imaging

Procedia PDF Downloads 112
1300 Machine Learning Approach for Yield Prediction in Semiconductor Production

Authors: Heramb Somthankar, Anujoy Chakraborty

Abstract:

This paper presents a classification study on yield prediction in semiconductor production using machine learning approaches. A complicated semiconductor production process is generally monitored continuously by signals acquired from sensors and measurement sites. A monitoring system contains a variety of signals, all of which contain useful information, irrelevant information, and noise. In the case of each signal being considered a feature, "Feature Selection" is used to find the most relevant signals. The open-source UCI SECOM Dataset provides 1567 such samples, out of which 104 fail in quality assurance. Feature extraction and selection are performed on the dataset, and useful signals were considered for further study. Afterward, common machine learning algorithms were employed to predict whether the signal yields pass or fail. The most relevant algorithm is selected for prediction based on the accuracy and loss of the ML model.
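
A minimal sketch of the feature selection plus classification pipeline is shown below; the file name, label encoding, chosen classifier, and the choice of 50 retained signals are assumptions for illustration, not the paper's exact setup.

```python
# Minimal sketch: feature selection plus classification on SECOM-style data.
# The file name and column layout below are assumptions; the UCI SECOM set
# stores 590 sensor signals per sample and a pass/fail label.
import pandas as pd
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import VarianceThreshold, SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# X: (1567, 590) sensor signals, y: pass/fail label (assumed encoding).
data = pd.read_csv("secom.csv")                     # hypothetical path
X = data.drop(columns=["label"]).values
y = data["label"].values

pipe = Pipeline([
    ("impute", SimpleImputer(strategy="median")),   # sensors have missing readings
    ("drop_constant", VarianceThreshold(0.0)),      # remove dead signals
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=50)),       # keep 50 most relevant signals
    ("clf", RandomForestClassifier(n_estimators=300, class_weight="balanced",
                                   random_state=0)),
])
scores = cross_val_score(pipe, X, y, cv=5, scoring="f1_macro")
print("5-fold macro-F1:", scores.mean())
```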

Keywords: deep learning, feature extraction, feature selection, machine learning classification algorithms, semiconductor production monitoring, signal processing, time-series analysis

Procedia PDF Downloads 99
1299 Local Texture and Global Color Descriptors for Content Based Image Retrieval

Authors: Tajinder Kaur, Anu Bala

Abstract:

An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. A new algorithm for content-based image retrieval (CBIR) is presented in this paper. The proposed method combines color and texture features that capture the global and local information of the image. The local texture feature is extracted using local binary patterns (LBP), which are evaluated by considering the local difference between the center pixel and its neighbors. For the global color feature, the color histogram (CH) is used, calculated over the RGB (red, green, and blue) channels separately. In this paper, this combination of color and texture features is proposed for content-based image retrieval. The performance of the proposed method is tested on the Corel 1000 database, a natural image database. The results show a significant improvement in terms of the evaluation measures as compared to LBP and CH alone.
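
The combined descriptor can be sketched as follows; the basic 3x3 LBP, 16-bin channel histograms, and L1 ranking are illustrative choices consistent with, but not necessarily identical to, the paper's implementation.

```python
# Minimal sketch of the combined descriptor: a local binary pattern histogram
# for texture plus per-channel RGB histograms for color, compared with an
# L1 distance. The LBP here is the basic 3x3, 8-neighbour variant.
import numpy as np

def lbp_histogram(gray):
    """gray: 2-D uint8 array; returns a 256-bin normalised LBP histogram."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(shifts):
        neighbour = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= ((neighbour >= c).astype(np.int32) << bit)
    hist = np.bincount(code.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def color_histogram(rgb, bins=16):
    """rgb: (H, W, 3) uint8; concatenated per-channel histograms."""
    hists = [np.histogram(rgb[..., ch], bins=bins, range=(0, 256))[0]
             for ch in range(3)]
    h = np.concatenate(hists).astype(float)
    return h / h.sum()

def describe(rgb):
    gray = rgb.mean(axis=2).astype(np.uint8)
    return np.concatenate([lbp_histogram(gray), color_histogram(rgb)])

def retrieve(query_desc, database_descs, top_k=10):
    d = np.abs(database_descs - query_desc).sum(axis=1)   # L1 distance
    return np.argsort(d)[:top_k]
```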

Keywords: color, texture, feature extraction, local binary patterns, image retrieval

Procedia PDF Downloads 348
1298 Blind Super-Resolution Reconstruction Based on PSF Estimation

Authors: Osama A. Omer, Amal Hamed

Abstract:

Successful blind image super-resolution algorithms require exact estimation of the Point Spread Function (PSF). In the absence of any prior information about the imaging system and the true image, this estimation is normally done by trial-and-error experimentation until an acceptable restored image quality is obtained. Multi-frame blind super-resolution algorithms often suffer from slow convergence and sensitivity to complex noise. This paper presents a super-resolution image reconstruction algorithm based on an estimate of the PSF that yields the optimum restored image quality. The PSF is estimated by the knife-edge method, implemented by measuring the spreading of the edges in the reproduced HR image itself during the reconstruction process. The proposed image reconstruction approach uses L1-norm minimization and robust regularization based on a bilateral prior to deal with different data and noise models. A series of experimental results shows that the proposed method outperforms previous work robustly and efficiently.

Keywords: blind, PSF, super-resolution, knife-edge, blurring, bilateral, L1 norm

Procedia PDF Downloads 354
1297 Clustering Based Level Set Evaluation for Low Contrast Images

Authors: Bikshalu Kalagadda, Srikanth Rangu

Abstract:

The main objective of image segmentation is to extract objects with respect to some input features. One of the important methods for image segmentation is the level set method. Medical images and synthetic images generally have low-contrast pixel profiles, and for such images it is difficult to locate the features of interest. A conventional level set function develops irregularities during the evolution of the object contour, which destroys the stability of the evolution process. As a remedy for this problem, a new hybrid algorithm, Clustering Level Set Evolution, is proposed. Kernel fuzzy particle swarm optimization clustering is used with the Distance Regularized Level Set (DRLS) and the Selective Binary and Gaussian Filtering Regularized Level Set (SBGFRLS) methods. The ability to identify different regions becomes easier, with improved speed. The efficiency of the modified method can be evaluated by comparing it with the previous method for similar specifications. The comparison can be carried out on medical and synthetic images.

Keywords: segmentation, clustering, level set function, re-initialization, Kernel fuzzy, swarm optimization

Procedia PDF Downloads 342
1296 Spatial-Temporal Awareness Approach for Extensive Re-Identification

Authors: Tyng-Rong Roan, Fuji Foo, Wenwey Hseush

Abstract:

Recent development of AI and edge computing plays a critical role in capturing meaningful events such as the detection of an unattended bag. One of the core problems is re-identification across multiple CCTVs. The step immediately following the detection of a meaningful event is to track and trace the objects related to the event. In an extensive environment, the challenge becomes severe when the number of CCTVs increases substantially, imposing difficulties in achieving high accuracy while maintaining real-time performance. The algorithm that re-identifies cross-boundary objects for extensive tracking is referred to as Extensive Re-Identification, which emphasizes the issues related to the complexity behind a great number of CCTVs. The Spatial-Temporal Awareness approach challenges the conventional thinking and concept of operations, which is labor intensive and time consuming. The ability to perform Extensive Re-Identification through a multi-sensory network provides next-level insights, creating value beyond traditional risk management.

Keywords: long-short-term memory, re-identification, security critical application, spatial-temporal awareness

Procedia PDF Downloads 103
1295 Optimal Bayesian Chart for Controlling Expected Number of Defects in Production Processes

Authors: V. Makis, L. Jafari

Abstract:

In this paper, we develop an optimal Bayesian chart to control the expected number of defects per inspection unit in production processes with long production runs. We formulate this control problem in the optimal stopping framework. The objective is to determine the optimal stopping rule minimizing the long-run expected average cost per unit time considering partial information obtained from the process sampling at regular epochs. We prove the optimality of the control limit policy, i.e., the process is stopped and the search for assignable causes is initiated when the posterior probability that the process is out of control exceeds a control limit. An algorithm in the semi-Markov decision process framework is developed to calculate the optimal control limit and the corresponding average cost. Numerical examples are presented to illustrate the developed optimal control chart and to compare it with the traditional u-chart.
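
A minimal sketch of the control-limit policy is given below, assuming Poisson defect counts, a fixed shift probability between sampling epochs, and illustrative rates; the paper instead derives the optimal limit from the semi-Markov decision process formulation.

```python
# Minimal sketch of the control-limit policy: defect counts per inspection
# unit are Poisson, the process can shift from an in-control rate lam0 to an
# out-of-control rate lam1, and each sampling epoch updates the posterior
# probability of being out of control. All numerical values are illustrative.
import math

lam0, lam1 = 2.0, 5.0        # expected defects per unit, in/out of control
p_shift = 0.05               # probability of a shift between epochs
control_limit = 0.90         # stop and search for assignable causes above this

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def update_posterior(pi, k):
    """pi: prior P(out of control); k: observed defect count this epoch."""
    pi_pred = pi + (1 - pi) * p_shift           # a shift may occur before sampling
    num = pi_pred * poisson_pmf(k, lam1)
    den = num + (1 - pi_pred) * poisson_pmf(k, lam0)
    return num / den

pi = 0.0
for epoch, defects in enumerate([2, 1, 3, 6, 7], start=1):
    pi = update_posterior(pi, defects)
    print(f"epoch {epoch}: defects={defects}, P(out of control)={pi:.3f}")
    if pi > control_limit:
        print("stop the process and search for assignable causes")
        break
```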

Keywords: Bayesian u-chart, economic design, optimal stopping, semi-Markov decision process, statistical process control

Procedia PDF Downloads 559
1294 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection

Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari

Abstract:

In this paper, an Artificial Neural Network (ANN) was developed to predict Asphaltene Precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. In this study, the experimental data from six different oil fields were collected. Seventy percent of the data was used to develop the ANN model, and different ANN architectures were examined. A network with the trainlm training algorithm was found to be the best network to estimate the AP. To check the validity of the proposed model, the model was used to predict the AP for the remaining thirty percent of the data, which had not been used in training. The Mean Square Error (MSE) of the prediction was 0.0018, which confirms the excellent prediction capability of the proposed model. In the second part of this study, the ANN model predictions were compared with modified Hirschberg model predictions. The ANN was found to provide more accurate estimates than the modified Hirschberg model. Finally, the proposed model was employed to examine the effect of different operating parameters during gas injection on the AP. It was found that the AP is mostly sensitive to the reservoir temperature. Furthermore, the carbon dioxide concentration in the liquid phase increases the AP.
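
A minimal sketch of the 70/30 training workflow follows, using scikit-learn's MLPRegressor in place of the MATLAB trainlm network; the file name, column names, and network size are assumptions for illustration.

```python
# Minimal sketch of the 70/30 split and MSE check described in the abstract,
# with an MLPRegressor standing in for the MATLAB trainlm network.
# Column names and the CSV file are assumptions for illustration.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error

df = pd.read_csv("asphaltene_data.csv")          # hypothetical data file
X = df[["temperature", "pressure", "co2_mole_fraction", "oil_field_id"]]
y = df["asphaltene_precipitation"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.30,
                                                    random_state=42)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000,
                                   random_state=42))
model.fit(X_train, y_train)
print("test MSE:", mean_squared_error(y_test, model.predict(X_test)))
```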

Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs

Procedia PDF Downloads 359
1293 Research on Knowledge Graph Inference Technology Based on Proximal Policy Optimization

Authors: Yihao Kuang, Bowen Ding

Abstract:

With the increasing scale and complexity of knowledge graphs, a modern knowledge graph contains more and more types of entity, relationship, and attribute information. Therefore, in recent years, it has been a trend for knowledge graph inference to use reinforcement learning to deal with large-scale, incomplete, and noisy knowledge graphs and to improve the inference effect and interpretability. The Proximal Policy Optimization (PPO) algorithm uses a proximal policy-update approach: it allows substantial updates of the policy parameters while constraining the update extent to maintain training stability. This characteristic enables PPO to converge to improved strategies more rapidly, often demonstrating enhanced performance early in the training process. Furthermore, PPO can effectively reuse historical experience data for training, enhancing sample utilization; this means that even with limited resources, PPO can train efficiently for reinforcement learning tasks. Based on these characteristics, this paper aims to obtain a better and more efficient inference effect by introducing PPO into knowledge inference technology.
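
For reference, a minimal sketch of PPO's clipped surrogate objective, the mechanism that constrains the update extent, is shown below with toy numbers; it is not the paper's knowledge-graph agent.

```python
# Minimal sketch of PPO's clipped surrogate objective. The arrays are toy
# numbers standing in for a batch collected while an agent walks relation
# paths in the graph (assumption for illustration).
import numpy as np

def ppo_clip_loss(log_prob_new, log_prob_old, advantages, eps=0.2):
    """Negative clipped surrogate objective (to be minimised)."""
    ratio = np.exp(log_prob_new - log_prob_old)          # pi_new / pi_old
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1 - eps, 1 + eps) * advantages
    return -np.mean(np.minimum(unclipped, clipped))

# Toy batch: the ratio is kept within [0.8, 1.2], so a single large advantage
# cannot pull the policy arbitrarily far from the behaviour policy.
log_prob_old = np.log(np.array([0.30, 0.10, 0.50]))
log_prob_new = np.log(np.array([0.45, 0.08, 0.55]))
advantages = np.array([1.5, -0.5, 0.2])
print("loss:", ppo_clip_loss(log_prob_new, log_prob_old, advantages))
```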

Keywords: reinforcement learning, PPO, knowledge inference

Procedia PDF Downloads 220
1292 Development of Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum

Authors: Abdulrahman Sumayli, Saad M. AlShahrani

Abstract:

For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical to improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques. Temperature, pressure, and feedstock type were considered as the inputs to the models, while the hydrogen solubility was the sole response. Specifically, we employed three different models: Support Vector Regression (SVR), Gaussian Process Regression (GPR), and Bayesian Ridge Regression (BRR). To achieve the best performance, the hyper-parameters of these models were optimized using the Whale Optimization Algorithm (WOA). We evaluated the models using a dataset of solubility measurements in various feedstocks, and we compared their performance based on several metrics. Our results show that the SVR model tuned with WOA achieves the best performance overall, with an RMSE of 1.38 × 10⁻² and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C–350 °C and 1.2 MPa–10.8 MPa, respectively.
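
A minimal sketch of the SVR branch is given below; randomized search stands in for the whale optimization algorithm used in the paper, and the file and column names are assumptions for illustration.

```python
# Minimal sketch of the SVR branch: predict H2 solubility from temperature,
# pressure and a feedstock index. Randomised search stands in for the whale
# optimisation algorithm used in the paper to tune the hyper-parameters;
# column names and the data file are assumptions.
import pandas as pd
from scipy.stats import loguniform
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import RandomizedSearchCV

df = pd.read_csv("h2_solubility.csv")            # hypothetical data file
X = df[["temperature_C", "pressure_MPa", "feedstock_id"]]
y = df["h2_solubility"]

pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
param_dist = {
    "svr__C": loguniform(1e-1, 1e3),
    "svr__gamma": loguniform(1e-3, 1e1),
    "svr__epsilon": loguniform(1e-4, 1e-1),
}
search = RandomizedSearchCV(pipe, param_dist, n_iter=50, cv=5,
                            scoring="neg_root_mean_squared_error", random_state=0)
search.fit(X, y)
print("best RMSE:", -search.best_score_)
print("best params:", search.best_params_)
```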

Keywords: temperature, pressure variations, machine learning, oil treatment

Procedia PDF Downloads 58
1291 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application

Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro

Abstract:

This paper presents a novel statistical methodology for measuring and founding constructs in latent regression analysis. The approach uses the qualities of factor analysis on binary data with interpretations from Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and a convergence of many ideas from IRT, we propose an algorithm not just to solve the dimensionality problem (nowadays an open discussion) but to open a new research field that promises fairer and more realistic qualifications for examinees and a revolution in IRT and educational research. Finally, the methodology is applied to a real data set, presenting impressive results in terms of coherence, speed and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs', and both authors belong to the SICS Research Group at Universidad Nacional de Colombia.

Keywords: item response theory, dimensionality, submodel theory, factorial analysis

Procedia PDF Downloads 355
1290 Cryptic Diversity: Identifying Two Morphologically Similar Species of Invasive Apple Snails in Peninsular Malaysia

Authors: Suganiya Rama Rao, Yoon-Yen Yow, Thor-Seng Liew, Shyamala Ratnayeke

Abstract:

Invasive snails in the genus Pomacea have spread across Southeast Asia, including Peninsular Malaysia. Apart from significant economic costs to wetland crops, very little is known about the snails' effects on native species and on wetland function through their alteration of macrophyte communities. This study was conducted to establish diagnostic characteristics of Pomacea species in the Malaysian environment using genetic and morphological criteria. Snails were collected from eight localities in the northern and central regions of Peninsular Malaysia. The mitochondrial COI gene of 52 adult snails was amplified and sequenced. Maximum likelihood analysis was used to analyse species identity and assess phylogenetic relationships among snails from different geographic locations. Shells of the two species were compared using geometric morphometric analysis and covariance analyses. Shell height accounted for most of the observed variation between P. canaliculata and P. maculata, with the latter possessing a smaller mean ratio of shell height to aperture height (p < 0.0001) and of shell height to shell width (p < 0.0001). Genomic and phylogenetic analysis demonstrated the presence of two monophyletic taxa, P. canaliculata and P. maculata, in the Peninsular Malaysia samples. P. maculata co-occurred with P. canaliculata in 5 localities, but samples from 3 localities contained only P. canaliculata. This study is the first to confirm the presence of two of the most invasive species of Pomacea in Peninsular Malaysia using a genomic approach. P. canaliculata appears to be the more widespread species. Despite statistical differences, both quantitative and qualitative morphological characteristics demonstrate much interspecific overlap and intraspecific variability; thus morphology alone cannot reliably verify species identity. Molecular techniques for distinguishing between these two highly invasive Pomacea species are needed to understand their specific ecological niches and develop effective protocols for their management.

Keywords: Pomacea canaliculata, Pomacea maculata, invasive species, phylogenetic analysis, geometric morphometric analysis

Procedia PDF Downloads 249
1289 Representativity Based Wasserstein Active Regression

Authors: Benjamin Bobbia, Matthias Picard

Abstract:

In recent years, active learning methodologies based on the representativity of the data have seemed more promising for limiting overfitting. The presented query methodology for regression uses the Wasserstein distance to measure the representativity of the labelled dataset compared to the global distribution. In this work, crucial use is made of GroupSort neural networks, which brings a double advantage: the Wasserstein distance can be exactly expressed in terms of such neural networks, and one can provide explicit bounds for their size and depth together with rates of convergence. Moreover, the heterogeneity of the dataset is also considered by weighting the Wasserstein distance with the approximation error at the previous step of active learning. Such an approach leads to a reduction of overfitting and high prediction performance after few query steps. After detailing the methodology and algorithm, an empirical study is presented in order to investigate the range of our hyperparameters. The performance of this method is compared, in terms of the number of queries needed, with other classical and recent query methods on several UCI datasets.

Keywords: active learning, Lipschitz regularization, neural networks, optimal transport, regression

Procedia PDF Downloads 71
1288 Forensic Medical Capacities of Research of Saliva Stains on Physical Evidence after Washing

Authors: Saule Mussabekova

Abstract:

Recent advances in genetics have sharply increased the capacity to form reliable evidence in forensic examinations; traces of biological origin are important sources of information about a crime. Currently, sexual offenses have increased around the world, and among them are those in which the criminals use various detergents to remove traces of their crime. A feature of modern synthetic detergents is the presence of biological additives, enzymes, which purposefully destroy stains of biological origin. To study the nature and extent of the impact of modern washing powders on saliva stains on physical evidence, specially prepared test specimens of different types of tissue to which saliva had been applied were examined. Materials and Methods: Washing machines from well-known manufacturers of household appliances, with different production characteristics, and advertised brands of washing powder were used for the test washing. Over 3,500 experimental samples were tested. After washing, the traces of saliva were identified using modern methods of forensic medicine. Results: The dependence of saliva-trace detection and identification on the washing program, type of washing machine, and washing powder used was revealed. The results of experimental and practical expert studies have shown that in most cases it is not possible to draw conclusions in the identification of saliva traces on physical evidence after washing. This is a consequence of the effect of biological additives and other additional factors on traces of saliva during washing. Conclusions: On the basis of the results of the study, the feasibility of studying saliva stains on physical evidence after washing is established. The use of modern molecular genetic methods makes it possible to partially solve the problems arising in the study of washed evidence. Additional study of physical evidence after washing facilitates the detection and investigation of sexual offenses against women and children.

Keywords: saliva research, modern synthetic detergents, laundry detergents, forensic medicine

Procedia PDF Downloads 208
1287 Phytoremediation-A Plant Based Cleansing Method to Obtain Quality Medicinal Plants and Natural Products

Authors: Hannah S. Elizabeth, D. Gnanasekaran, M. R. Manju Gowda, Antony George

Abstract:

Phytoremediation is a new technology for remediating contaminated soil, water and air using plants, and it serves as a green technology with an environmentally friendly approach. The main aim of this technique is the cleansing and detoxification of organic compounds, organophosphorus pesticides, heavy metals such as arsenic, iron, cadmium and gold, and radioactive elements, which cause teratogenic and life-threatening diseases in humans and animals when food crops, vegetables, fruits, cereals, and millets obtained from the contaminated soil are consumed. These contaminants may also directly damage genetic material, thereby altering the biosynthetic pathways of secondary metabolites and other phytoconstituents with pharmacological activities, leading to loss of their efficacy and potency. This may be reflected in mutagenicity, drug resistance and altered antagonistic properties of normal metabolism. Phytoremediation is a technology for the real clean-up of contaminated soils and of contaminants that are potentially toxic. It reduces the risks posed by a contaminated soil by decreasing contaminants using plants. Its advantages are cost-effectiveness and less ecosystem disruption. Plants may also help to stabilize contaminants by accumulating and precipitating toxic trace elements in the roots. Organic pollutants can potentially be chemically degraded and ultimately mineralized into harmless biological compounds. Hence, the use of plants to revitalize contaminated sites is gaining more attention and is preferred for its cost-effectiveness when compared to other chemical methods. The introduction of harmful substances into the environment has been shown to have many adverse effects on human health, agricultural productivity, and natural ecosystems. Because the costs of growing a crop are minimal compared to those of soil removal and replacement, the use of plants to remediate hazardous soils is seen as having great promise.

Keywords: cost effective, eco-friendly, phytoremediation, secondary metabolites

Procedia PDF Downloads 267
1286 Breast Cancer in Very Young (Less Than 25 Years) Women: An Institutional Analysis from a Developing Country

Authors: Ajay Gogia, Svs Deo, Dn Sharma, Atul Batra, Ashutash Mishra

Abstract:

Background and Aims: Breast cancer in women aged less than 25 years (defined as very young breast cancer, VYBC) is rare and accounts for 0.25% of all breast cancer in the West. There are no data available on VYBC from developing countries. The aim of this study was to analyze the clinical, pathological, and prognostic factors and outcomes in VYBC. Methods: This retrospective analysis was performed on 80 patients aged 25 years or less (screened from 8000 files of female BC) who were registered at the All India Institute of Medical Sciences (AIIMS), New Delhi, India, over a 15-year period between 2011 and 2023. Results: The median age was 21.5 years (range 16-25). A positive family history (siblings and parents) was elicited in 30% of cases, and a breast cancer gene (BRCA1/2) mutation was found in 33% of patients. Ten patients (12.5%) had pregnancy-associated breast cancer (BC detected during pregnancy or within 1 year postpartum). The TNM stage distribution was stage I, 0%; stage II, 30%; stage III, 60%; and stage IV, 10%. Seventy percent of tumors were high grade, and 90% had pathologically node-positive disease. Estrogen, progesterone, and human epidermal growth factor receptor 2 (HER2)/neu positivity were 25%, 25% and 35%, respectively. Triple-negative breast cancer constituted 40% of patients. With a median follow-up of 42 months, the 3-year relapse-free survival (non-metastatic disease), progression-free survival (metastatic disease) and overall survival were 30%, 15% and 50%, respectively. Conclusions: Very young women constituted 1% of all breast cancer cases. Advanced disease at presentation and high-risk pathological features result in poor outcomes. One-third of VYBCs are associated with a BRCA mutation, which requires genetic counseling and risk-reduction surgery if required. Due to the aggressive behavior of BC in this age group, early diagnosis and prompt treatment are needed.

Keywords: very young, breast cancer, outcome, developing country, India

Procedia PDF Downloads 12
1285 Double Encrypted Data Communication Using Cryptography and Steganography

Authors: Adine Barett, Jermel Watson, Anteneh Girma, Kacem Thabet

Abstract:

In information security, secure communication of data across networks has always been a problem at the forefront. Transfer of information across networks is susceptible to being exploited by attackers engaging in malicious activity. In this paper, we leverage steganography and cryptography to create a layered security solution to protect the information being transmitted. The first layer of security leverages cryptographic techniques to scramble the information so that it cannot be deciphered even if the steganography-based layer is compromised. The second layer of security relies on steganography to disguise the encrypted information so that it cannot be seen. We consider three cipher methods in the cryptography layer, namely the Playfair cipher, the Blowfish cipher, and the Hill cipher. Then, the encrypted message is passed to the least significant bit (LSB) steganography algorithm for further concealment. Both approaches are combined efficiently to help secure information in transit over a network. This multi-layered encryption is a solution that will benefit cloud platforms, social media platforms and networks that regularly transfer private information, such as banks and insurance companies.
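
A minimal sketch of the layered idea follows; a toy XOR stream cipher stands in for the Playfair/Blowfish/Hill ciphers considered in the paper, and the cover image and message are illustrative.

```python
# Minimal sketch of the two layers: a simple XOR stream cipher stands in for
# the Playfair/Blowfish/Hill step (assumption, not the paper's ciphers), and
# the ciphertext bits are then hidden in the least significant bit of each
# pixel of a greyscale cover image.
import numpy as np

def xor_encrypt(message: bytes, key: bytes) -> bytes:
    return bytes(m ^ key[i % len(key)] for i, m in enumerate(message))

def embed_lsb(cover: np.ndarray, payload: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("cover image too small for payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

key = b"secret-key"
ciphertext = xor_encrypt(b"transfer 5000 to account 1234", key)
cover = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
stego = embed_lsb(cover, ciphertext)

recovered = xor_encrypt(extract_lsb(stego, len(ciphertext)), key)
print(recovered)   # b'transfer 5000 to account 1234'
```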

Keywords: cryptography, steganography, layered security, Cipher, encryption

Procedia PDF Downloads 67
1284 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction

Authors: Sajid Abas, Jon Pyo Hong, Jung-Ryun Le, Seungryong Cho

Abstract:

Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Nowadays, researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better quality images. However, the effects of the sampled data's properties on the reconstructed image's quality, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigated the effects of two data properties, i.e., sampling density and data incoherence, on the image reconstructed by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We found that, in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data is uniformly sampled.

Keywords: computed tomography, computed laminography, compressive sensing, low-dose

Procedia PDF Downloads 456
1283 Component Based Testing Using Clustering and Support Vector Machine

Authors: Iqbaldeep Kaur, Amarjeet Kaur

Abstract:

Software reusability is an important part of software development, so component-based software development, in the context of software testing, has gained a lot of practical importance in the field of software engineering, both from the academic research and the software development industry perspective. Finding test cases for efficient reuse is one of the important problems targeted by researchers. Clustering reduces the search space and enables test case reuse by grouping similar entities according to requirements, ensuring reduced time complexity as it reduces the search time for retrieving test cases. In this research paper, we propose an approach for the reusability of test cases based on unsupervised learning, using k-means clustering together with a Support Vector Machine. We have designed an algorithm for clustering requirement and test case documents according to their tf-idf vector space, and the output is a set of highly cohesive pattern groups.
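
A minimal sketch of the tf-idf plus k-means grouping is shown below; the documents, the number of clusters, and the retrieval step are illustrative assumptions.

```python
# Minimal sketch: requirement/test-case documents are turned into tf-idf
# vectors and grouped with k-means so that similar test cases can be
# retrieved from the same cluster. The documents below are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

test_cases = [
    "verify login with valid username and password",
    "verify login fails with invalid password",
    "check report export to pdf format",
    "check report export to csv format",
    "validate password reset email is sent",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(test_cases)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for cluster, text in sorted(zip(km.labels_, test_cases)):
    print(cluster, text)

# To reuse test cases for a new requirement, vectorise it and search only the
# nearest cluster instead of the whole repository.
new_req = vectorizer.transform(["user login with password"])
print("nearest cluster:", km.predict(new_req)[0])
```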

Keywords: software testing, reusability, clustering, k-mean, SVM

Procedia PDF Downloads 419
1282 Constructing White-Box Implementations Based on Threshold Shares and Composite Fields

Authors: Tingting Lin, Manfred von Willich, Dafu Lou, Phil Eisen

Abstract:

A white-box implementation of a cryptographic algorithm is a software implementation intended to resist extraction of the secret key by an adversary. To date, most of the white-box techniques are used to protect block cipher implementations. However, a large proportion of the white-box implementations are proven to be vulnerable to affine equivalence attacks and other algebraic attacks, as well as differential computation analysis (DCA). In this paper, we identify a class of block ciphers for which we propose a method of constructing white-box implementations. Our method is based on threshold implementations and operations in composite fields. The resulting implementations consist of lookup tables and few exclusive OR operations. All intermediate values (inputs and outputs of the lookup tables) are masked. The threshold implementation makes the distribution of the masked values uniform and independent of the original inputs, and the operations in composite fields reduce the size of the lookup tables. The white-box implementations can provide resistance against algebraic attacks and DCA-like attacks.

Keywords: white-box, block cipher, composite field, threshold implementation

Procedia PDF Downloads 157
1281 Model of Transhipment and Routing Applied to the Cargo Sector in Small and Medium Enterprises of Bogotá, Colombia

Authors: Oscar Javier Herrera Ochoa, Ivan Dario Romero Fonseca

Abstract:

This paper presents the design of a model for planning the distribution logistics operation. The significance of this work lies in its applicability to the analysis of small and medium enterprises (SMEs) handling dry freight in Bogotá. The implementation consists of two stages: in the first, optimal planning is achieved through a hybrid model developed with mixed integer programming, which treats the transshipment operation as a combined load-allocation model in the manner of a classic transshipment model; in the second, the specific routing of that operation is obtained through the Clarke and Wright savings heuristic. As a result, an integral model is obtained to carry out the step-by-step planning of the distribution of dry freight for SMEs in Bogotá. In this manner, optimal assignments are established by utilizing transshipment centers, with the purpose of determining the specific routing based on the shortest distance traveled.
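
A minimal sketch of the Clarke and Wright savings heuristic used in the routing stage follows; the depot distance matrix, demands, and vehicle capacity are illustrative assumptions.

```python
# Minimal sketch of the Clarke-Wright savings heuristic: customers start on
# individual depot round trips, then routes are merged in order of decreasing
# savings while the vehicle capacity allows. Data below is illustrative.
import itertools

dist = [  # index 0 is the transshipment centre / depot
    [0, 10, 12, 8, 15],
    [10, 0, 5, 9, 20],
    [12, 5, 0, 7, 18],
    [8, 9, 7, 0, 11],
    [15, 20, 18, 11, 0],
]
demand = {1: 4, 2: 3, 3: 5, 4: 6}
capacity = 10

routes = {c: [c] for c in demand}            # one round trip per customer
savings = sorted(((dist[0][i] + dist[0][j] - dist[i][j], i, j)
                  for i, j in itertools.combinations(demand, 2)), reverse=True)

for s, i, j in savings:
    ri, rj = routes.get(i), routes.get(j)
    if ri is None or rj is None or ri is rj:
        continue
    # merge only if i ends one route and j starts the other (or vice versa)
    if ri[-1] == i and rj[0] == j and sum(demand[c] for c in ri + rj) <= capacity:
        merged = ri + rj
    elif rj[-1] == j and ri[0] == i and sum(demand[c] for c in ri + rj) <= capacity:
        merged = rj + ri
    else:
        continue
    for c in merged:
        routes[c] = merged

unique_routes = {id(r): r for r in routes.values()}.values()
print([[0] + r + [0] for r in unique_routes])   # each route starts/ends at depot
```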

Keywords: transshipment model, mixed integer programming, saving algorithm, dry freight transportation

Procedia PDF Downloads 212
1280 Profit-Based Artificial Neural Network (ANN) Trained by Migrating Birds Optimization: A Case Study in Credit Card Fraud Detection

Authors: Ashkan Zakaryazad, Ekrem Duman

Abstract:

A typical classification technique ranks the instances in a data set according to the likelihood of belonging to one (positive) class. A credit card (CC) fraud detection model ranks the transactions in terms of the probability of being fraudulent. In fact, this approach is often criticized, because firms do not care about fraud probability but about the profitability or costliness of detecting a fraudulent transaction. The key contribution of this study is to focus on profit maximization in the model-building step. The artificial neural network proposed in this study works on the basis of profit maximization instead of minimizing the error of prediction. Moreover, some studies have shown that the back-propagation algorithm, like other gradient-based algorithms, usually gets trapped in local optima, and swarm-based algorithms are more successful in this respect. In this study, we train our profit-maximization ANN using Migrating Birds Optimization (MBO), which was recently introduced to the literature.
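
A minimal sketch of a profit-based fitness of this kind is given below; the cost figures, the tiny network, and the random candidate search (standing in for a swarm such as MBO) are illustrative assumptions, not the paper's model.

```python
# Minimal sketch of a profit-based fitness replacing the sum of squared
# errors: a candidate network is scored by the amount recovered on the
# fraudulent transactions it flags minus a fixed investigation cost per
# alert. All figures and the tiny network are illustrative assumptions.
import numpy as np

def forward(weights, X):
    """One hidden layer with 4 units; weights is a flat parameter vector."""
    n_in, n_hidden = X.shape[1], 4
    W1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = weights[n_in * n_hidden: n_in * n_hidden + n_hidden]
    W2 = weights[-(n_hidden + 1): -1]
    b2 = weights[-1]
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))       # fraud score in (0, 1)

def profit(weights, X, y, amounts, cost_per_alert=10.0, threshold=0.5):
    """Total profit: recovered amount of flagged frauds minus alert costs."""
    flagged = forward(weights, X) >= threshold
    return float((amounts * (flagged & (y == 1))).sum() - cost_per_alert * flagged.sum())

# A swarm of candidate weight vectors (as MBO maintains) would be evaluated
# with `profit` as the fitness; here a plain random search illustrates the idea.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = (rng.random(500) < 0.05).astype(int)          # about 5% fraud
amounts = rng.uniform(10, 1000, 500)
n_params = 6 * 4 + 4 + 4 + 1
best = max((rng.normal(size=n_params) for _ in range(200)),
           key=lambda w: profit(w, X, y, amounts))
print("best candidate profit:", profit(best, X, y, amounts))
```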

Keywords: neural network, profit-based neural network, sum of squared errors (SSE), MBO, gradient descent

Procedia PDF Downloads 462