Search results for: method of similarity
19125 Molecular Diagnosis of Influenza Strains Was Carried Out on Patients of the Social Security Clinic in Karaj Using the RT-PCR Technique
Authors: A. Ferasat, S. Rostampour Yasouri
Abstract:
Seasonal flu is a highly contagious infection caused by influenza viruses. These viruses undergo genetic changes that result in new epidemics across the globe. Medical attention is crucial in severe cases, particularly for the elderly, the frail, and those with chronic illnesses, as their immune systems are often weaker. The purpose of this study was to rapidly detect new subtypes of the influenza A virus using a specific RT-PCR method based on the HA (hemagglutinin) gene. In the winter and spring of 2022-2023, 120 samples suspected of seasonal influenza were cultured in embryonated eggs. RNA extraction, followed by cDNA synthesis, was performed. Finally, the PCR technique was applied using a pair of specific primers designed on the basis of the HA gene. The PCR product was identified after purification, and the nucleotide sequences of the purified PCR products were compared with sequences in GenBank. The results showed a high similarity between the sequences of the positive samples isolated from the patients and the sequences of strains isolated in recent years. The RT-PCR technique used in this study is entirely specific, enabling the detection and amplification of influenza and its subtypes from clinical samples. The RT-PCR technique based on the HA gene, along with sequencing, is a fast, specific, and sensitive diagnostic method for those infected with influenza viruses and their new subtypes. Rapid molecular diagnosis of influenza in suspected individuals is essential to control and prevent the spread of the disease to others. It also prevents the occurrence of secondary (sometimes fatal) pneumonia caused by influenza together with pathogenic bacteria. The critical role of rapid diagnosis of new influenza strains is to prepare a vaccine against the latest viruses that did not exist in the community the previous year and are entirely new.
Keywords: influenza, molecular diagnosis, patients, RT-PCR technique
Procedia PDF Downloads 72
19124 MHD Non-Newtonian Nanofluid Flow over a Permeable Stretching Sheet with Heat Generation and Velocity Slip
Authors: Rama Bhargava, Mania Goyal
Abstract:
The problem of magnetohydrodynamic boundary layer flow and heat transfer over a permeable stretching surface in a second-grade nanofluid under the effects of heat generation and partial slip is studied theoretically. The Brownian motion and thermophoresis effects are also considered. The governing boundary layer partial differential equations are transformed into a set of ordinary differential equations with the help of local similarity transformations. The differential equations are solved by the variational finite element method. The effects of different controlling parameters on the flow field and heat transfer characteristics are examined. The numerical results for the dimensionless velocity, temperature, and nanoparticle volume fraction, as well as the reduced Nusselt and Sherwood numbers, are presented graphically. The comparison confirmed excellent agreement. The present study is of great interest for coatings and suspensions, cooling of metallic plates, oils and grease, paper production, coal-water or coal-oil slurries, heat exchanger technology, and materials processing.
Keywords: viscoelastic nanofluid, partial slip, stretching sheet, heat generation/absorption, MHD flow, FEM
Procedia PDF Downloads 313
19123 Iris Recognition Based on the Low Order Norms of Gradient Components
Authors: Iman A. Saad, Loay E. George
Abstract:
The iris pattern is an important biometric feature of the human body, and it has become a very active topic in both research and practical applications. In this paper, an algorithm is proposed for iris recognition, and a simple, efficient, and fast method is introduced to extract a set of discriminatory features using a first-order gradient operator applied to grayscale images. The gradient-based features are robust, up to a certain extent, against variations that may occur in the contrast or brightness of iris image samples; such variations are mostly due to lighting differences and camera changes. First, the iris region is located; after that, it is remapped to a rectangular area of size 360x60 pixels. Also, a new method is proposed for detecting eyelash and eyelid points; it relies on statistical analysis of the image to mark the eyelash and eyelid points as noise. In order to cover feature localization (variation), the rectangular iris image is partitioned into N overlapped sub-images (blocks); then, from each block, a set of different average directional gradient density values is calculated to be used as a texture feature vector. The applied gradient operators are taken along the horizontal, vertical, and diagonal directions. The low-order norms of the gradient components were used to establish the feature vector. A Euclidean distance based classifier was used as the matching metric for determining the degree of similarity between the feature vector extracted from a tested iris image and the template feature vectors stored in the database. Experimental tests were performed using 2639 iris images from the CASIA V4-Interval database; the attained recognition accuracy reached 99.92%.
Keywords: iris recognition, contrast stretching, gradient features, texture features, Euclidean metric
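The feature-extraction and matching pipeline described above can be sketched in a few lines. The following Python/NumPy snippet is only an illustration under assumed choices (block size, step, random toy data), not the authors' implementation: it computes average directional gradient densities over overlapping blocks of a normalized 360x60 iris strip and matches feature vectors with the Euclidean distance.

```python
import numpy as np

def directional_gradients(strip):
    """First-order gradients along horizontal, vertical and the two diagonals."""
    f = strip.astype(float)
    gh = np.abs(f[:, 1:] - f[:, :-1])        # horizontal differences
    gv = np.abs(f[1:, :] - f[:-1, :])        # vertical differences
    gd1 = np.abs(f[1:, 1:] - f[:-1, :-1])    # main diagonal
    gd2 = np.abs(f[1:, :-1] - f[:-1, 1:])    # anti-diagonal
    return gh, gv, gd1, gd2

def block_features(strip, block=(20, 30), step=(10, 15)):
    """Average gradient density per direction over overlapping blocks (assumed sizes)."""
    feats = []
    for g in directional_gradients(strip):
        for r in range(0, g.shape[0] - block[0] + 1, step[0]):
            for c in range(0, g.shape[1] - block[1] + 1, step[1]):
                feats.append(g[r:r + block[0], c:c + block[1]].mean())
    return np.array(feats)

def euclidean_match(query, templates):
    """Index of the template feature vector closest to the query."""
    dists = [np.linalg.norm(query - t) for t in templates]
    return int(np.argmin(dists)), min(dists)

# toy usage with random 60x360 "normalized iris strips"
rng = np.random.default_rng(0)
strip = rng.integers(0, 256, size=(60, 360))
db = [block_features(rng.integers(0, 256, size=(60, 360))) for _ in range(5)]
idx, d = euclidean_match(block_features(strip), db)
print(idx, round(d, 2))
```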
Procedia PDF Downloads 334
19122 Different Views and Evaluations of IT Artifacts
Authors: Sameh Al-Natour, Izak Benbasat
Abstract:
The introduction of a multitude of new and interactive e-commerce information technology (IT) artifacts has impacted adoption research. Rather than solely functioning as productivity tools, new IT artifacts assume the roles of interaction mediators and social actors. This paper describes the varying roles assumed by IT artifacts, and proposes and distinguishes between four distinct foci of how the artifacts are evaluated. It further proposes a theoretical model that maps the different views of IT artifacts to four distinct types of evaluations.
Keywords: IT adoption, IT artifacts, similarity, social actor
Procedia PDF Downloads 390
19121 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods
Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard
Abstract:
Non-invasive samples are an alternative to collecting genetic samples directly. Non-invasive samples are collected without handling the animal (e.g., scats, feathers, and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping. Those errors delayed the widespread use of non-invasive genetic information for some years. Genotyping errors can be limited by using analysis methods that can accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms can be highlighted as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical performance comparisons among them. A comparison of methods on datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially for endangered and rare populations. To compare the analysis methods, four different datasets obtained from the Dryad digital repository were used. Three different matching algorithms (Cervus, Colony, and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes, and two different ones for population estimation (Capwire and BayesN). The three matching algorithms showed different patterns of results. ETLM produced a smaller number of unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed. That is not a surprise, given the similarity between those methods in their pairwise likelihood and clustering algorithms. The matches from ETLM showed almost no similarity with the genotypes matched by the other methods. The different clustering algorithm and error model of ETLM seem to lead to a more rigorous selection, although the processing time and interface friendliness of ETLM were the worst among the compared methods. The population estimators performed differently depending on the dataset. There was a consensus between the different estimators for only one dataset. BayesN showed both higher and lower estimates when compared with Capwire. BayesN does not consider the total number of recaptures like Capwire, only the recapture events. This makes the estimator sensitive to data heterogeneity, where heterogeneity means different capture rates between individuals. In these examples, tolerance for homogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An expanded analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems to be more appropriate for general use, considering the balance of time, interface, and robustness. The heterogeneity of the recaptures strongly affected the BayesN estimates, leading to over- and underestimation of population numbers. Capwire is therefore advisable for general use, since it performs better in a wide range of situations.
Keywords: algorithms, genetics, matching, population
Procedia PDF Downloads 143
19120 Elvis Improved Method for Solving Simultaneous Equations in Two Variables with Some Applications
Authors: Elvis Adam Alhassan, Kaiyu Tian, Akos Konadu, Ernest Zamanah, Michael Jackson Adjabui, Ibrahim Justice Musah, Esther Agyeiwaa Owusu, Emmanuel K. A. Agyeman
Abstract:
In this paper, we show how to solve simultaneous equations using the Elvis improved method. The Elvis improved method proceeds as follows: make one variable in the first equation the subject; make the same variable in the second equation the subject; equate the results and simplify to obtain the value of the other unknown variable; then put the value of the variable found into one of the equations from the first or second steps and simplify for the remaining unknown variable. The difference between our Elvis improved method and the substitution method is that with the Elvis improved method, the same variable is made the subject in both equations and the two resulting expressions are equated, unlike the substitution method, where one variable is made the subject of only one equation and substituted into the other equation. After describing the Elvis improved method, findings from 100 secondary students and the views of 5 secondary tutors are presented to demonstrate the effectiveness of the method. The study's purpose is demonstrated with hypothetical examples.
Keywords: simultaneous equations, substitution method, elimination method, graphical method, Elvis improved method
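As a worked illustration of the steps described in the abstract, the SymPy sketch below applies the Elvis improved method to a hypothetical pair of equations (the system and variable names are assumptions, not taken from the paper): the same variable is made the subject of both equations, the two expressions are equated, and the result is back-substituted.

```python
import sympy as sp

x, y = sp.symbols('x y')

# hypothetical system: 2x + 3y = 12 and x - y = 1
eq1 = sp.Eq(2*x + 3*y, 12)
eq2 = sp.Eq(x - y, 1)

# Steps 1-2: make the SAME variable (here x) the subject of both equations
x_from_eq1 = sp.solve(eq1, x)[0]   # x = (12 - 3y)/2
x_from_eq2 = sp.solve(eq2, x)[0]   # x = y + 1

# Step 3: equate the two expressions and solve for the remaining unknown
y_val = sp.solve(sp.Eq(x_from_eq1, x_from_eq2), y)[0]

# Step 4: substitute back into either expression for the first variable
x_val = x_from_eq2.subs(y, y_val)

print(x_val, y_val)   # 3, 2
```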
Procedia PDF Downloads 134
19119 Identification of Analogues to EGCG for the Inhibition of HPV E7: A Fundamental Insights through Structural Dynamics Study
Authors: Murali Aarthy, Sanjeev Kumar Singh
Abstract:
High-risk human papillomaviruses are strongly associated with carcinoma of the cervix and other genital tumors. Cervical cancer develops through a multistep process in which increasingly severe premalignant dysplastic lesions, called cervical intraepithelial neoplasia, progress to invasive cancer. The oncoprotein E7 of human papillomavirus, expressed in the lower epithelial layers, drives cells into S-phase, creating an environment conducive to viral genome replication and cell proliferation. Replication of the virus occurs in the terminally differentiating epithelium and requires the activation of cellular DNA replication proteins. To date, no suitable drug molecule is available to treat HPV infection, so the identification of potential drug targets and the development of novel anti-HPV chemotherapies with unique modes of action are needed. Hence, our present study aimed to identify potential inhibitors analogous to EGCG, a green tea molecule which is considered safe for mammalian systems. A 3D similarity search on a natural small-molecule library from a natural product database using EGCG identified 11 potential hits based on their similarity scores. Structure-based docking strategies were applied to the potential hits, and the key residues of the protein interacting with the compounds were identified through simulation studies and binding free energy calculations. The conformational changes between the apoprotein and the complex were analyzed through simulation, and the results demonstrated that the dynamical and structural effects observed in the protein were induced by the compounds, indicating their dominance over the oncoprotein. Overall, our study provides a basis for structural insights into the identified potential hits and EGCG; hence, the analogous compounds identified can be potent inhibitors against the HPV 16 E7 oncoprotein.
Keywords: EGCG, oncoprotein, molecular dynamics simulation, analogues
Procedia PDF Downloads 127
19118 Reducing the Computational Overhead of Metaheuristics Parameterization with Exploratory Landscape Analysis
Authors: Iannick Gagnon, Alain April
Abstract:
The performance of a metaheuristic on a given problem class depends on the class itself and the choice of parameters. Parameter tuning is the most time-consuming phase of the optimization process after the main calculations and it often nullifies the speed advantage of metaheuristics over traditional optimization algorithms. Several off-the-shelf parameter tuning algorithms are available, but when the objective function is expensive to evaluate, these can be prohibitively expensive to use. This paper presents a surrogate-like method for finding adequate parameters using fitness landscape analysis on simple benchmark functions and real-world objective functions. The result is a simple compound similarity metric based on the empirical correlation coefficient and a measure of convexity. It is then used to find the best benchmark functions to serve as surrogates. The near-optimal parameter set is then found using fractional factorial design. The real-world problem of NACA airfoil lift coefficient maximization is used as a preliminary proof of concept. The overall aim of this research is to reduce the computational overhead of metaheuristics parameterization.
Keywords: metaheuristics, stochastic optimization, particle swarm optimization, exploratory landscape analysis
Procedia PDF Downloads 151
19117 Different Methods of Fe3O4 Nano Particles Synthesis
Authors: Arezoo Hakimi, Afshin Farahbakhsh
Abstract:
Herein, we compare Fe3O4 synthesized using the hydrothermal method, mechanochemical processing, and the solvent-thermal method. The hydrothermal technique has been the most popular one, gathering interest from scientists and technologists of different disciplines, particularly over the last fifteen years. The hydrothermal method yields Fe3O4 microspheres in which many nearly monodisperse spherical particles with diameters of about 400 nm are present; in the mechanochemical method, the regular morphology indicates that the particles are well crystallized; and in the solvent-thermal method, the Fe3O4 nanoparticles have the desirable properties of uniform size and good dispersion.
Keywords: Fe3O4 nanoparticles, hydrothermal method, mechanochemical processes, solvent thermal method
Procedia PDF Downloads 350
19116 The Influence of Thermal Radiation and Chemical Reaction on MHD Micropolar Fluid in The Presence of Heat Generation/Absorption
Authors: Binyam Teferi
Abstract:
A numerical and theoretical analysis of the mixed convection flow of a magnetohydrodynamic micropolar fluid with a stretching capillary in the presence of thermal radiation, chemical reaction, viscous dissipation, and heat generation/absorption has been carried out. The non-linear partial differential equations of momentum, angular velocity, energy, and concentration are converted into ordinary differential equations using similarity transformations, which can then be solved numerically. The dimensionless governing equations are solved using the fourth/fifth-order Runge-Kutta method along with the shooting technique. The effect of physical parameters, viz., the micropolar parameter, unsteadiness parameter, thermal buoyancy parameter, concentration buoyancy parameter, Hartmann number, spin gradient viscosity parameter, microinertial density parameter, thermal radiation parameter, Prandtl number, Eckert number, heat generation or absorption parameter, Schmidt number, and chemical reaction parameter on the flow variables, viz., the velocity of the micropolar fluid, microrotation, temperature, and concentration, has been analyzed and discussed graphically. MATLAB code is used for the numerical and theoretical analysis. From the simulation study, it can be concluded that an increment in the micropolar parameter, Hartmann number, unsteadiness parameter, and thermal and concentration buoyancy parameters results in a decrement of the velocity of the micropolar fluid; the microrotation of the micropolar fluid decreases with an increment in the micropolar parameter, unsteadiness parameter, microinertial density parameter, and spin gradient viscosity parameter; the temperature profile of the micropolar fluid decreases with an increment in the thermal radiation parameter, Prandtl number, micropolar parameter, unsteadiness parameter, heat absorption, and viscous dissipation parameter; and the concentration of the micropolar fluid decreases as the unsteadiness parameter, Schmidt number, and chemical reaction parameter increase. Furthermore, computational values of the local skin friction coefficient, local wall couple coefficient, local Nusselt number, and local Sherwood number for different values of the parameters have been investigated. In this paper, the following important results are obtained: an increment in the micropolar parameter and Hartmann number results in a decrement of the velocity of the micropolar fluid; microrotation decreases with an increment of the microinertial density parameter; temperature decreases with increasing values of the thermal radiation parameter and viscous dissipation parameter; concentration decreases as the values of the Schmidt number and chemical reaction parameter increase; the coefficient of local skin friction is enhanced with an increase in the values of both the unsteadiness parameter and the micropolar parameter; increasing values of the unsteadiness parameter and micropolar parameter result in an increment of the local couple stress; an increment in the values of the unsteadiness parameter and thermal radiation parameter results in an increment of the rate of heat transfer; and as the values of the Schmidt number and unsteadiness parameter increase, the Sherwood number decreases.
Keywords: thermal radiation, chemical reaction, viscous dissipation, heat absorption/generation, similarity transformation
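The abstract does not reproduce the governing equations, so the sketch below only illustrates the named numerical strategy (an adaptive fourth/fifth-order Runge-Kutta integrator combined with shooting) on the classical Blasius boundary-layer equation f''' + 0.5 f f'' = 0, used here as a stand-in for a similarity-transformed momentum equation; it is not the micropolar system solved in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def blasius_rhs(eta, y):
    # y = [f, f', f'']  ->  f''' = -0.5 * f * f''
    return [y[1], y[2], -0.5 * y[0] * y[2]]

def shooting_residual(guess, eta_max=10.0):
    """Integrate with RK45 for a guessed f''(0) and return f'(eta_max) - 1."""
    sol = solve_ivp(blasius_rhs, (0.0, eta_max), [0.0, 0.0, guess],
                    method='RK45', rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0   # residual on the far-field condition f'(inf) = 1

# Shooting: find f''(0) so that the far-field boundary condition is satisfied
fpp0 = brentq(shooting_residual, 0.1, 1.0)
print(f"f''(0) = {fpp0:.5f}")   # known value is about 0.33206
```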
Procedia PDF Downloads 126
19115 A Comparison of Smoothing Spline Method and Penalized Spline Regression Method Based on Nonparametric Regression Model
Authors: Autcha Araveeporn
Abstract:
This paper presents a study of a nonparametric regression model using a smoothing spline method and a penalized spline regression method. We also compare the techniques used for estimation and prediction with the nonparametric regression model. We applied both methods to crude oil prices in dollars per barrel and the Stock Exchange of Thailand (SET) index. According to the results, it is concluded that the smoothing spline method performs better than the penalized spline regression method.
Keywords: nonparametric regression model, penalized spline regression method, smoothing spline method, Stock Exchange of Thailand (SET)
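A minimal sketch of this kind of comparison is given below; it uses synthetic data rather than the oil-price or SET series, SciPy's UnivariateSpline as the smoothing spline, and a simple truncated-power-basis ridge-penalized spline as an assumed stand-in for the penalized spline regression estimator, so it illustrates the methodology only.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
truth = np.sin(2 * np.pi * x)
y = truth + rng.normal(scale=0.3, size=x.size)   # synthetic noisy series

# --- smoothing spline (SciPy fits for a given smoothing factor s) ---
smooth = UnivariateSpline(x, y, s=len(x) * 0.3 ** 2)

# --- penalized spline: linear truncated-power basis, ridge penalty on knot terms ---
knots = np.linspace(0.05, 0.95, 20)
B = np.column_stack([np.ones_like(x), x] + [np.clip(x - k, 0, None) for k in knots])
lam = 1.0                                    # penalty weight (assumed value)
D = np.diag([0, 0] + [1] * len(knots))       # penalize only the knot coefficients
beta = np.linalg.solve(B.T @ B + lam * D, B.T @ y)

print("smoothing spline MSE: %.4f" % np.mean((smooth(x) - truth) ** 2))
print("penalized spline MSE: %.4f" % np.mean((B @ beta - truth) ** 2))
```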
Procedia PDF Downloads 438
19114 Content-Based Mammograms Retrieval Based on Breast Density Criteria Using Bidimensional Empirical Mode Decomposition
Authors: Sourour Khouaja, Hejer Jlassi, Nadia Feddaoui, Kamel Hamrouni
Abstract:
Most medical images, and especially mammograms, are now stored in large databases. Retrieving a desired image is considered of great importance in order to find diagnoses of previous similar cases. Our method is implemented to assist radiologists in retrieving mammographic images containing breasts with a density aspect similar to that seen on the query mammogram. This is becoming a challenge given the importance of density criteria in cancer prediction and their effect on segmentation issues. We used the BEMD (Bidimensional Empirical Mode Decomposition) to characterize the content of images and the Euclidean distance to measure the similarity between images. Through experiments on the MIAS mammography image database, we confirm that the results are promising. The performance was evaluated using precision and recall curves comparing query and retrieved images. Computing recall-precision proved the effectiveness of applying CBIR to large mammographic image databases. We found a precision of 91.2% for mammography with a recall of 86.8%.
Keywords: BEMD, breast density, content-based, image retrieval, mammography
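The retrieval evaluation described (Euclidean similarity ranking plus precision and recall) can be illustrated with the toy Python sketch below; the random feature vectors and density labels are placeholders for the BEMD-derived descriptors and MIAS data.

```python
import numpy as np

def retrieve(query_feat, db_feats, k):
    """Rank database images by Euclidean distance to the query feature vector."""
    d = np.linalg.norm(db_feats - query_feat, axis=1)
    return np.argsort(d)[:k]

def precision_recall(retrieved, relevant):
    """Precision/recall of one query given the set of relevant database indices."""
    hits = len(set(retrieved) & set(relevant))
    return hits / len(retrieved), hits / len(relevant)

# toy example: 100 database images described by assumed BEMD-derived feature vectors
rng = np.random.default_rng(2)
db = rng.normal(size=(100, 16))
labels = rng.integers(0, 4, size=100)        # stand-in density classes
query, q_label = db[0], labels[0]
relevant = np.where(labels == q_label)[0]

p, r = precision_recall(retrieve(query, db, k=10), relevant)
print(f"precision={p:.2f}, recall={r:.2f}")
```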
Procedia PDF Downloads 231
19113 Brown-Spot Needle Blight: An Emerging Threat Causing Loblolly Pine Needle Defoliation in Alabama, USA
Authors: Debit Datta, Jeffrey J. Coleman, Scott A. Enebak, Lori G. Eckhardt
Abstract:
Loblolly pine (Pinus taeda) is a leading productive timber species in the southeastern USA. Over the past three years, an emerging threat has appeared, expressed as successive needle defoliation followed by stunted growth and tree mortality in loblolly pine plantations. Considering its economic significance, it has now become a rising concern among landowners, forest managers, and forest health state cooperators. However, the symptoms of the disease have been somewhat confused with root disease(s) and recurrently attributed to invasive Phytophthora species due to the similar nature and devastation of the diseases. Therefore, this study investigated the potential causal agent of the disease and characterized the fungi associated with loblolly pine needle defoliation in the southeastern USA. In addition, 70 trees were selected in seven long-term monitoring plots at Chatom, Alabama, to monitor and record the annual disease incidence and severity. Based on colony morphology and ITS-rDNA sequence data, a total of 28 species of fungi representing 17 families were recovered from diseased loblolly pine needles. The native brown-spot pathogen, Lecanosticta acicola, was the species most frequently recovered from unhealthy loblolly pine needles, in combination with some other common needle cast and rust pathogens. Identification was confirmed using morphological similarity and amplification of the translation elongation factor 1-alpha gene region of interest. Tagged trees were consistently found to be chlorotic and defoliated from 2019 to 2020. The current emergence of the brown-spot pathogen causing loblolly pine mortality necessitates investigation of the role of changing climatic conditions, which might be associated with increased pathogen pressure on loblolly pines in the southeastern USA.
Keywords: brown-spot needle blight, loblolly pine, needle defoliation, plantation forestry
Procedia PDF Downloads 152
19112 Comparative Analysis of Edge Detection Techniques for Extracting Characters
Authors: Rana Gill, Chandandeep Kaur
Abstract:
Segmentation of images can be implemented using different fundamental algorithms such as edge detection (discontinuity-based segmentation), region growing (similarity-based segmentation), and iterative thresholding. A comprehensive literature review relevant to the study gives a description of different techniques for vehicle number plate detection and of edge detection techniques widely used on different types of images. This research work is based on edge detection techniques and on calculating a threshold on the basis of five edge operators. The five operators used are Prewitt, Roberts, Sobel, LoG, and Canny. Segmentation of characters present in different types of images, such as vehicle number plates, house name plates, and characters on different sign boards, is selected as a case study in this work. The proposed methodology has seven stages. The proposed system has been implemented using MATLAB R2010a. A comparison of all five operators has been done on the basis of their performance. From the results, it is found that the Canny operator produces the best results among the operators used, and the performance of the different edge operators in decreasing order is: Canny > LoG > Sobel > Prewitt > Roberts.
Keywords: segmentation, edge detection, text, extracting characters
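The study was implemented in MATLAB; as an assumed Python approximation of the same comparison, the snippet below applies the five operators (with LoG approximated by Gaussian smoothing followed by a Laplacian) to a stock grayscale image and reports a crude edge-pixel statistic after Otsu thresholding. The test image and the comparison measure are illustrative only.

```python
import numpy as np
from skimage import data, feature, filters

img = data.camera() / 255.0            # stand-in grayscale image (not a number plate)

edges = {
    "Roberts": filters.roberts(img),
    "Prewitt": filters.prewitt(img),
    "Sobel":   filters.sobel(img),
    # LoG approximated as Gaussian smoothing followed by a Laplacian
    "LoG":     np.abs(filters.laplace(filters.gaussian(img, sigma=2.0))),
    "Canny":   feature.canny(img, sigma=2.0).astype(float),
}

# crude comparison: fraction of pixels flagged as edges after Otsu thresholding
for name, e in edges.items():
    t = filters.threshold_otsu(e)
    print(f"{name:8s} edge-pixel fraction: {(e > t).mean():.3f}")
```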
Procedia PDF Downloads 425
19111 Multimodal Direct Neural Network Positron Emission Tomography Reconstruction
Authors: William Whiteley, Jens Gregor
Abstract:
In recent developments of direct neural network based positron emission tomography (PET) reconstruction, two prominent architectures have emerged for converting measurement data into images: 1) networks that contain fully-connected layers; and 2) networks that primarily use a convolutional encoder-decoder architecture. In this paper, we present a multi-modal direct PET reconstruction method called MDPET, which is a hybrid approach that combines the advantages of both types of networks. MDPET processes raw data in the form of sinograms and histo-images in concert with attenuation maps to produce high quality multi-slice PET images (e.g., 8x440x440). MDPET is trained on a large whole-body patient data set and evaluated both quantitatively and qualitatively against target images reconstructed with the standard PET reconstruction benchmark of iterative ordered subsets expectation maximization. The results show that MDPET outperforms the best previously published direct neural network methods in measures of bias, signal-to-noise ratio, mean absolute error, and structural similarity.
Keywords: deep learning, image reconstruction, machine learning, neural network, positron emission tomography
Procedia PDF Downloads 109
19110 Influence of Optimization Method on Parameters Identification of Hyperelastic Models
Authors: Bale Baidi Blaise, Gilles Marckmann, Liman Kaoye, Talaka Dya, Moustapha Bachirou, Gambo Betchewe, Tibi Beda
Abstract:
This work highlights the capabilities of the particle swarm optimization (PSO) method to identify the parameters of hyperelastic models. The study compares this method with the genetic algorithm (GA) method, the least squares (LS) method, the pattern search algorithm (PSA) method, the Beda-Chevalier (BC) method, and the Levenberg-Marquardt (LM) method. Four classic hyperelastic models are used to test the different methods through parameter identification. The study then compares the ability of these models to reproduce the experimental Treloar data in simple tension, biaxial tension, and pure shear.
Keywords: particle swarm optimization, identification, hyperelastic, model
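A minimal sketch of PSO-based parameter identification is shown below. It fits a single-parameter neo-Hookean uniaxial stress model to synthetic data with a plain global-best PSO; the model, data, and PSO settings are assumptions for illustration and do not reproduce the four models or the Treloar data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "experimental" uniaxial data from a neo-Hookean model:
# nominal stress P = mu * (lam - lam**-2), with added noise.
lam = np.linspace(1.1, 4.0, 30)
mu_true = 0.4
P_exp = mu_true * (lam - lam ** -2) + rng.normal(scale=0.01, size=lam.size)

def residual(params):
    """Sum of squared differences between model and data."""
    mu = params[0]
    return np.sum((mu * (lam - lam ** -2) - P_exp) ** 2)

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Plain global-best particle swarm optimization."""
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest

best = pso(residual, bounds=[(0.01, 2.0)])
print(f"identified mu = {best[0]:.3f} (true value {mu_true})")
```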
Procedia PDF Downloads 170
19109 On q-Non-extensive Statistics with Non-Tsallisian Entropy
Authors: Petr Jizba, Jan Korbel
Abstract:
We combine an axiomatics of Rényi with the q-deformed version of Khinchin axioms to obtain a measure of information (i.e., entropy) which accounts both for systems with embedded self-similarity and non-extensivity. We show that the entropy thus obtained is uniquely solved in terms of a one-parameter family of information measures. The ensuing maximal-entropy distribution is phrased in terms of a special function known as the Lambert W-function. We analyze the corresponding ‘high’ and ‘low-temperature’ asymptotics and reveal a non-trivial structure of the parameter space.
Keywords: multifractals, Rényi information entropy, THC entropy, MaxEnt, heavy-tailed distributions
Procedia PDF Downloads 442
19108 Ensemble of Deep CNN Architecture for Classifying the Source and Quality of Teff Cereal
Authors: Belayneh Matebie, Michael Melese
Abstract:
The study focuses on addressing the challenges in classifying and ensuring the quality of Eragrostis tef, a small, round grain that is the smallest cereal grain. Employing a traditional classification method is challenging because of its small size and the similarity of its environmental characteristics. To overcome this, this study employs a machine learning approach to develop a source and quality classification system for Teff cereal. Data were collected from various production areas in the Amhara region, considering two grades of cereal (high and low quality) across eight classes. A total of 5,920 images were collected, with 740 images per class. Image enhancement techniques, including scaling, data augmentation, histogram equalization, and noise removal, were applied to preprocess the data. A Convolutional Neural Network (CNN) was then used to extract relevant features and reduce dimensionality. The dataset was split into 80% for training and 20% for testing. Different classifiers, including FVGG16, FINCV3, QSCTC, EMQSCTC, SVM, and RF, were employed for classification, achieving accuracy rates ranging from 86.91% to 97.72%. The ensemble of FVGG16, FINCV3, and QSCTC using the max-voting approach outperforms the individual algorithms.
Keywords: Teff, ensemble learning, max-voting, CNN, SVM, RF
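Max-voting itself is a small step that can be illustrated directly; in the sketch below, the per-model class predictions are hypothetical placeholders for the outputs of FVGG16, FINCV3, and QSCTC.

```python
import numpy as np

# Hypothetical per-model class predictions (8 Teff classes) for six test images;
# the values are illustrative only, not taken from the paper.
pred_fvgg16 = np.array([0, 3, 5, 1, 7, 2])
pred_fincv3 = np.array([0, 3, 4, 1, 7, 2])
pred_qsctc  = np.array([0, 2, 5, 1, 6, 2])

stacked = np.vstack([pred_fvgg16, pred_fincv3, pred_qsctc])

# Max-voting: each model casts one vote per sample and the majority class wins
# (ties resolve to the smallest label via np.bincount(...).argmax()).
ensemble = np.apply_along_axis(lambda votes: np.bincount(votes).argmax(), 0, stacked)
print(ensemble)   # -> [0 3 5 1 7 2]
```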
Procedia PDF Downloads 52
19107 Influence of Water Reservoir Parameters on the Climate and Coastal Areas
Authors: Lia Matchavariani
Abstract:
The construction of water reservoirs on rivers flowing into the sea complicates coast protection: the seashore starts to degrade, causing coastal erosion and disaster against the backdrop of current climate change. The instruments of a water reservoir's impact on the climate and coastal areas are its contact surface with the atmosphere and the area irrigated with its water or humidified by infiltrated water. The Black Sea coastline is characterized by the highest ecological vulnerability. The type and intensity of a water reservoir's impact are determined by its morphometry, type of regulation, level regime, and the geomorphological and geological characteristics of the adjoining area. Studies showed that the impact of a water reservoir on the climate, and on its comfort parameters, is positive if the reservoir is located in a zone of insufficient humidity and, vice versa, is negative if the reservoir is found in a zone with abundant humidity. There are many natural and anthropogenic factors determining the peculiarities of a water reservoir's impact on the climate, which can be assessed with maximum accuracy by the so-called "long series" method, which operates on the meteorological elements (temperature, wind, precipitation, etc.) using long series formed from stationary observation data. This is a time series which consists of two periods of statistically sufficient duration. The first period covers the observations up to the formation of the water reservoir, and the second covers the observations made during its operation. If no such data are available, or the series is statistically short, an "analog" method is used. The analog water reservoir is selected based on the similarity of the environmental conditions. It must be located within the zone of the designed water reservoir, under similar environmental conditions, and, in addition, a sufficient number of observations must have been made in its coastal zone.
Keywords: coast-constituent sediment, eustasy, meteorological parameters, seashore degradation, water reservoirs impact
Procedia PDF Downloads 43
19106 Mathematical Reconstruction of an Object Image Using X-Ray Interferometric Fourier Holography Method
Authors: M. K. Balyan
Abstract:
The main principles of the X-ray Fourier interferometric holography method are discussed. The object image is reconstructed by the mathematical method of Fourier transformation. Three methods are presented: the approximation method, the iteration method, and the step-by-step method. As an example, the reconstruction of the complex amplitude transmission coefficient of a beryllium wire is considered. The results reconstructed by the three presented methods are compared. The best results are obtained by means of the step-by-step method.
Keywords: dynamical diffraction, hologram, object image, X-ray holography
Procedia PDF Downloads 393
19105 Modified Approximation Methods for Finding an Optimal Solution for the Transportation Problem
Authors: N. Guruprasad
Abstract:
This paper presents a modification of approximation methods for transportation problems. The initial basic feasible solution can be computed using either Russell's or Vogel's approximation method. Russell's approximation method provides another excellent criterion that is still quick to implement on a computer (though not manually). In most cases Russell's method yields a better initial solution, though it takes longer than Vogel's method (finding the next entering variable is O(n1*n2) in Russell's method and O(n1+n2) in Vogel's method). However, Russell's method normally has a lower total running time because fewer pivots are required to reach the optimum for all but small problem sizes (n1+n2 = ~20). With this motivation, we have incorporated a variation of the same approach: what we propose uses a TMC (Total Modified Cost) to obtain fast and efficient solutions.
Keywords: computation, efficiency, modified cost, Russell's approximation method, transportation, Vogel's approximation method
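For reference, a sketch of Vogel's approximation method (one of the two starting procedures mentioned above) is given below; the cost matrix, supplies, and demands are illustrative, and the proposed TMC variation is not implemented here.

```python
import numpy as np

def vogel_initial_solution(cost, supply, demand):
    """Vogel's approximation method for a balanced transportation problem.
    Returns an initial basic feasible allocation matrix."""
    cost = np.array(cost, dtype=float)
    supply, demand = list(map(float, supply)), list(map(float, demand))
    alloc = np.zeros_like(cost)
    rows, cols = set(range(cost.shape[0])), set(range(cost.shape[1]))

    def penalty(values):
        v = sorted(values)
        return v[1] - v[0] if len(v) > 1 else v[0]

    while rows and cols:
        # penalty = difference between the two smallest remaining costs
        row_pen = {i: penalty([cost[i, j] for j in cols]) for i in rows}
        col_pen = {j: penalty([cost[i, j] for i in rows]) for j in cols}
        i_best = max(row_pen, key=row_pen.get)
        j_best = max(col_pen, key=col_pen.get)
        if row_pen[i_best] >= col_pen[j_best]:
            i = i_best
            j = min(cols, key=lambda col: cost[i, col])
        else:
            j = j_best
            i = min(rows, key=lambda row: cost[row, j])
        q = min(supply[i], demand[j])     # allocate as much as possible
        alloc[i, j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            rows.discard(i)
        if demand[j] == 0:
            cols.discard(j)
    return alloc

# small balanced example (costs, supplies and demands are illustrative only)
cost = [[19, 30, 50, 10],
        [70, 30, 40, 60],
        [40,  8, 70, 20]]
supply, demand = [7, 9, 18], [5, 8, 7, 14]
alloc = vogel_initial_solution(cost, supply, demand)
print(alloc, "total cost:", (alloc * np.array(cost)).sum())
```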
Procedia PDF Downloads 544
19104 Steepest Descent Method with New Step Sizes
Authors: Bib Paruhum Silalahi, Djihad Wungguli, Sugi Guritman
Abstract:
The steepest descent method is a simple gradient method for optimization. This method converges slowly toward the optimal solution because of the zigzag form of its steps. Barzilai and Borwein modified the algorithm so that it performs well for problems with large dimensions. The Barzilai and Borwein results have sparked a lot of research on the steepest descent method, including the alternate minimization gradient method and the Yuan method. Inspired by previous works, we modified the step size of the steepest descent method. We then compare the modified method against the Barzilai and Borwein method, the alternate minimization gradient method, and the Yuan method for quadratic functions in terms of the number of iterations and the running time. The average results indicate that the steepest descent method with the new step sizes provides good results for small dimensions and is able to compete with the results of the Barzilai and Borwein method and the alternate minimization gradient method for large dimensions. The new step sizes have faster convergence compared to the other methods, especially for cases with large dimensions.
Keywords: steepest descent, line search, iteration, running time, unconstrained optimization, convergence
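As background for the step-size modification discussed above, the sketch below implements steepest descent with the Barzilai-Borwein step length on a random quadratic test problem; it shows the baseline being modified, not the new step sizes proposed in the paper.

```python
import numpy as np

def gradient_descent_bb(grad, x0, iters=500, alpha0=1e-3):
    """Steepest descent with the Barzilai-Borwein step length
    alpha_k = (s_k . s_k) / (s_k . y_k), where s_k = x_k - x_{k-1} and
    y_k = g_k - g_{k-1}; the first step uses a small fixed alpha0."""
    x = np.array(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = s @ y
        alpha = (s @ s) / denom if denom > 1e-12 else alpha0
        x, g = x_new, g_new
        if np.linalg.norm(g) < 1e-8:
            break
    return x

# quadratic test problem f(x) = 0.5 x^T A x - b^T x with a random SPD matrix A
rng = np.random.default_rng(4)
Q = rng.normal(size=(50, 50))
A = Q.T @ Q + 0.1 * np.eye(50)
b = rng.normal(size=50)
x_star = np.linalg.solve(A, b)

x_bb = gradient_descent_bb(lambda x: A @ x - b, np.zeros(50))
print("error:", np.linalg.norm(x_bb - x_star))
```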
Procedia PDF Downloads 539
19103 Investment Projects Selection Problem under Hesitant Fuzzy Environment
Authors: Irina Khutsishvili
Abstract:
In the present research, a decision support methodology for the multi-attribute group decision-making (MAGDM) problem is developed, namely for the selection of investment projects. The objective of the investment project selection problem is to choose the best project among a set of projects seeking investment, or to rank all projects in descending order. The project selection is made considering a set of weighted attributes. To evaluate the attributes in our approach, expert assessments are used. In the proposed methodology, lingual expressions (linguistic terms) given by all experts are used as the initial attribute evaluations, since they are the most natural and convenient representation of experts' evaluations. The lingual evaluations are then converted into trapezoidal fuzzy numbers, and the aggregate trapezoidal hesitant fuzzy decision matrix is built. The case is considered in which information on the attribute weights is completely unknown. The attribute weights are identified based on the De Luca and Termini information entropy concept, defined in the context of hesitant fuzzy sets. The decisions are made using the extended Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method under a hesitant fuzzy environment. Hence, the methodology is based on a trapezoidal-valued hesitant fuzzy TOPSIS decision-making model with entropy weights. The ranking of alternatives is performed by the proximity of their distances to both the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS). For this purpose, the weighted hesitant Hamming distance is used. An example of investment decision-making is shown that clearly explains the procedure of the proposed methodology.
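A simplified crisp sketch of the entropy-weighted TOPSIS machinery is given below; it omits the trapezoidal hesitant fuzzy representation and the hesitant Hamming distance used in the paper, and the decision matrix is hypothetical.

```python
import numpy as np

def entropy_weights(X):
    """Shannon-entropy objective weights from an (alternatives x attributes) matrix."""
    P = X / X.sum(axis=0)                               # column-wise proportions
    m = X.shape[0]
    with np.errstate(divide='ignore', invalid='ignore'):
        E = -np.nansum(P * np.log(P), axis=0) / np.log(m)
    d = 1.0 - E                                         # degree of diversification
    return d / d.sum()

def topsis(X, w, benefit):
    """Rank alternatives with classical TOPSIS on crisp data.
    benefit[j] is True for a benefit attribute, False for a cost attribute."""
    R = X / np.sqrt((X ** 2).sum(axis=0))               # vector normalization
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                      # closeness coefficient

# hypothetical scores of 4 projects on 3 attributes (e.g. return, risk, duration)
X = np.array([[7.0, 3.0, 12.0],
              [9.0, 5.0, 10.0],
              [6.0, 2.0, 14.0],
              [8.0, 4.0,  9.0]])
benefit = np.array([True, False, False])                # risk and duration are costs
w = entropy_weights(X)
cc = topsis(X, w, benefit)
print("weights:", w.round(3), "ranking (best first):", np.argsort(-cc))
```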
Procedia PDF Downloads 117
19102 Heat Transfer of an Impinging Jet on a Plane Surface
Authors: Jian-Jun Shu
Abstract:
A cold, thin film of liquid impinging on an isothermal hot, horizontal surface has been investigated. An approximate solution for the velocity and temperature distributions in the flow along the horizontal surface is developed, which exploits the hydrodynamic similarity solution for thin film flow. The approximate solution may provide a valuable basis for assessing flow and heat transfer in more complex settings.
Keywords: flux, free impinging jet, solid-surface, uniform wall temperature
Procedia PDF Downloads 478
19101 MHD Stagnation Point Flow towards a Shrinking Sheet with Suction in an Upper-Convected Maxwell (UCM) Fluid
Authors: K. Jafar, R. Nazar, A. Ishak, I. Pop
Abstract:
The present analysis considers the steady stagnation point flow and heat transfer towards a permeable sheet in an upper-convected Maxwell (UCM) electrically conducting fluid, with a constant magnetic field applied in the direction transverse to the flow and a local heat generation within the boundary layer with a heat generation rate proportional to (T - T_inf)^p. Using a similarity transformation, the governing system of partial differential equations is first transformed into a system of ordinary differential equations, which is then solved numerically using a finite-difference scheme known as the Keller-box method. Numerical results are obtained for the flow and thermal fields for various values of the shrinking/stretching parameter lambda, the magnetic parameter M, the elastic parameter K, the Prandtl number Pr, the suction parameter s, the heat generation parameter Q, and the exponent p. The results indicate the existence of dual solutions for the shrinking sheet up to a critical value lambda_c whose value depends on the values of M, K, and s. In the presence of internal heat absorption (Q<0), the surface heat transfer rate decreases with increasing p but increases with the parameters Q and s, whether the sheet is stretched or shrunk.
Keywords: magnetohydrodynamic (MHD), boundary layer flow, UCM fluid, stagnation point, shrinking sheet
Procedia PDF Downloads 353
19100 Calculating Stress Intensity Factor of Cracked Axis by Using a Meshless Method
Authors: S. Shahrooi, A. Talavari
Abstract:
Numerical studies of cracks and discontinuities using element-free methods have become widespread in recent years. In this study, the stress intensity factor of a cracked axis under torsional loading is calculated using a relatively new element-free method, the MLPG method. The domain is discretized by a set of scattered nodal points, and the moving least squares (MLS) method is used to construct the shape functions from these nodal points. Then, the results of the meshless method and the finite element method (FEM) were compared. The results show that the element-free method is of good accuracy.
Keywords: stress intensity factor, crack, torsional loading, meshless method
Procedia PDF Downloads 564
19099 Sustainable Approach for Strategic Planning of Construction of Buildings using Multi-Criteria Decision Making Tools
Authors: Kishor Bhagwat, Gayatri Vyas
Abstract:
The construction industry is characterized by complex processes that depend on the nature and scope of the project. In recent years, developments in this sector have been remarkable and have resulted in both positive and negative impacts on the environment and on human beings. Sustainable construction can be regarded as one of the solutions to overcome the negative impacts. Since sustainable construction is a vast concept that includes many parameters, the use of multi-criteria decision making (MCDM) tools sometimes becomes necessary. The main objective of this study is to determine the weights of sustainable building parameters with the help of MCDM tools. A questionnaire survey was conducted to examine the respondents' perspective on the importance of the weights of the criteria; the respondents were architects, green building consultants, and civil engineers. This paper presents an overview of research related to Indian and international green building rating systems and MCDM. The results show that economy, environmental health and safety, site selection, climatic condition, etc., are important parameters in sustainable construction.
Keywords: green building, sustainability, multi-criteria decision making method [MCDM], analytical hierarchy process [AHP], technique for order preference by similarity to an ideal solution [TOPSIS], entropy
Procedia PDF Downloads 98
19098 An Efficient Approach to Optimize the Cost and Profit of a Tea Garden by Using Branch and Bound Method
Authors: Abu Hashan Md Mashud, M. Sharif Uddin, Aminur Rahman Khan
Abstract:
In this paper, we formulate a new problem as linear programming and integer programming problems and maximize profit within a limited budget and limited resources, based on the construction of a tea garden. The paper describes a new idea about how to optimize profit and focuses on the practical aspects of modeling and the challenges of providing a solution to a complex real-life problem. Finally, a comparative study is carried out among the graphical method, the simplex method, and the branch and bound method.
Keywords: integer programming, tea garden, graphical method, simplex method, branch and bound method
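A compact sketch of the branch and bound idea is shown below: the LP relaxation is solved with SciPy's linprog and fractional variables are branched on recursively. The profit coefficients and resource constraints are illustrative placeholders, not the tea garden data.

```python
import math
import numpy as np
from scipy.optimize import linprog

def branch_and_bound(c, A_ub, b_ub, bounds):
    """Maximize c @ x subject to A_ub @ x <= b_ub with integer x, by branching
    on the LP relaxation. A plain depth-first sketch, no cuts or heuristics."""
    best = {"val": -math.inf, "x": None}

    def solve(bnds):
        res = linprog(-np.array(c), A_ub=A_ub, b_ub=b_ub, bounds=bnds, method="highs")
        return (-res.fun, res.x) if res.success else (None, None)

    def recurse(bnds):
        val, x = solve(bnds)
        if val is None or val <= best["val"] + 1e-9:
            return                                    # infeasible or bound pruned
        frac = [i for i, xi in enumerate(x) if abs(xi - round(xi)) > 1e-6]
        if not frac:                                  # integral -> candidate solution
            best["val"], best["x"] = val, np.round(x)
            return
        i = frac[0]                                   # branch on first fractional var
        lo, hi = bnds[i]
        left = list(bnds)
        left[i] = (lo, math.floor(x[i]))
        right = list(bnds)
        right[i] = (math.ceil(x[i]), hi)
        recurse(left)
        recurse(right)

    recurse(list(bounds))
    return best["val"], best["x"]

# hypothetical tea-garden-style allocation: profit per unit of two activities,
# limited by a budget row and a land/labour row (all numbers illustrative)
profit = [40, 30]
A_ub = [[6, 4],     # budget usage per unit
        [1, 2]]     # land usage per unit
b_ub = [24, 6]
val, x = branch_and_bound(profit, A_ub, b_ub, bounds=[(0, None), (0, None)])
print("best integer profit:", val, "plan:", x)
```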
Procedia PDF Downloads 622
19097 Automatic Identification of Pectoral Muscle
Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina
Abstract:
Mammography is an imaging modality used worldwide to diagnose breast cancer, even in asymptomatic women. Due to their wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to sixfold increase in their risk of developing breast cancer. Therefore, studies have been made to accurately quantify mammographic breast density. In clinical routine, radiologists perform image evaluations through the BIRADS (Breast Imaging Reporting and Data System) assessment. However, this method has inter- and intra-individual variability. An automatic, objective method to measure breast density could relieve the radiologists' workload by providing a first-aid opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to those of fibroglandular tissue. It is consequently hard to automatically quantify mammographic breast density. Therefore, a pre-processing step is needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty medio-lateral oblique digital mammograms from the São Paulo Medical School. This study was developed with ethical approval from the authors' institutions and national review panels under protocol number 3720-2010. An algorithm was developed, on the Matlab platform, for the pre-processing of the images. The algorithm uses image processing tools to automatically segment and extract the pectoral muscle from mammograms. First, a thresholding technique was applied to remove non-biological information from the image. Then the Hough transform was applied to find the boundary of the pectoral muscle, followed by the active contour method; the seed of the active contour is placed on the pectoral muscle boundary found by the Hough transform. An experienced radiologist also manually performed the pectoral muscle segmentation. Both methods, manual and automatic, were compared using the Jaccard index and Bland-Altman statistics. The comparison between the manual and the developed automatic method presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of the proposed segmentation method. The Bland-Altman statistics compared both methods in relation to the area (mm²) of the segmented pectoral muscle. The statistics showed data within the 95% confidence interval, supporting the accuracy of the segmentation compared to the manual method. Thus, the method proved to be accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. The segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast.
Keywords: active contour, fibroglandular tissue, hough transform, pectoral muscle
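The two comparison statistics used above are easy to reproduce; the sketch below computes the Jaccard index between two binary masks and the Bland-Altman bias and limits of agreement between paired area measurements, using toy masks and hypothetical areas in place of the actual segmentations.

```python
import numpy as np

def jaccard_index(mask_a, mask_b):
    """Jaccard similarity |A intersect B| / |A union B| between two binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 1.0

def bland_altman(areas_auto, areas_manual):
    """Mean difference (bias) and 95% limits of agreement for paired measurements."""
    diff = np.asarray(areas_auto, float) - np.asarray(areas_manual, float)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# toy masks standing in for pectoral-muscle segmentations (triangular regions)
auto = np.tri(200, 200, k=5, dtype=bool)
manual = np.tri(200, 200, k=-3, dtype=bool)
print("Jaccard:", round(jaccard_index(auto, manual), 3))

# hypothetical areas (mm^2) for 10 mammograms
rng = np.random.default_rng(5)
manual_areas = rng.uniform(1500, 4000, size=10)
auto_areas = manual_areas + rng.normal(0, 60, size=10)
bias, loa = bland_altman(auto_areas, manual_areas)
print(f"bias={bias:.1f} mm^2, 95% limits of agreement=({loa[0]:.1f}, {loa[1]:.1f})")
```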
Procedia PDF Downloads 350
19096 Ceratocystis manginecans Causal Agent of a Destructive Mangoes in Pakistan
Authors: Asma Rashid, Shazia Iram, Iftikhar Ahmad
Abstract:
Mango sudden death is an emerging problem in Pakistan. Its prevalence is observed in almost all mango-growing areas, and its severity varies from 2-5% in Punjab to 5-10% in Sindh. Symptoms on affected trees include bark splitting, discoloration of the vascular tissue, wilting, gummosis and, at the end, rapid death. A total of n = 45 isolates were obtained from different mango-growing areas of Punjab and Sindh. The pathogenicity of these fungal isolates was tested through artificial inoculation on different hosts (potato tubers, detached mango leaves, detached mango twigs, and mango plants) under controlled conditions, and all proved pathogenic with varying degrees of aggressiveness relative to the control. The findings of the present study showed that, of these four methods, the potato tuber inoculation method was the most suitable, as it fixes the inoculum at the target site. The increased fungal growth and spore numbers may be due to the soft tissue of potato tubers, through which Ceratocystis isolates can easily pass. The lesion area on potato tubers was in the range of 0.14-7.09 cm², followed by detached mango twigs, where it ranged from 0.09-0.48 cm². All pathological results proved highly significant at P < 0.05 through ANOVA, although isolate-to-isolate comparisons showed non-significant behaviour despite a positive effect on lesion area. Re-isolation of the respective fungi was achieved with 100 percent success, verifying Koch's postulates. DNA of the fungal pathogens was successfully extracted through the phenol-chloroform method. Amplification was carried out using primers for the ITS region, the beta-tubulin gene, and the translation elongation factor (EF1-alpha) gene; the amplicons were sequenced and compared against sequences in NCBI, which showed 99-100% similarity with Ceratocystis manginecans. The fungus Ceratocystis manginecans formed one of the strongly supported sub-clades in the phylogenetic tree. The results obtained through this work will be supportive in establishing the relationship of isolates with their regions and will give information about the pathogenicity level of the isolates, which will be useful in developing management policies to reduce the damage in orchards caused by mango sudden death.
Keywords: artificial inoculation, mango, Ceratocystis manginecans, phylogenetic, screening
Procedia PDF Downloads 245