Search results for: approximation algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2520

1530 Maximum Power Point Tracking Using FLC Tuned with GA

Authors: Mohamed Amine Haraoubia, Abdelaziz Hamzaoui, Najib Essounbouli

Abstract:

The pursuit of the maximum power point (MPP) has led to the development of many kinds of controllers, one of which is the fuzzy logic controller, which has proven its worth. To tune this controller further, this paper discusses and analyzes the use of genetic algorithms to tune the fuzzy logic controller. It provides an introduction to both systems and tests their compatibility and performance.
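
As a minimal sketch of the tuning loop described above, the fragment below uses a toy genetic algorithm to adjust a handful of membership-function parameters of a fuzzy controller. The fitness function is a synthetic placeholder: a real MPPT study would evaluate each candidate controller against a photovoltaic/converter simulation and score it by tracking error, so everything here apart from the GA skeleton is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(params):
    # Placeholder objective. In an MPPT study this would run a PV/converter
    # simulation with an FLC whose membership-function centres/widths are
    # `params` and return the negative tracking error; here a smooth
    # synthetic surface stands in so the sketch is self-contained.
    target = np.array([0.2, 0.5, 0.8, 1.2, 1.6])
    return -np.sum((params - target) ** 2)

POP, GENS, N_PARAMS = 40, 60, 5
pop = rng.uniform(0.0, 2.0, size=(POP, N_PARAMS))

for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    # Tournament selection: the better of two random individuals becomes a parent.
    idx = rng.integers(0, POP, size=(POP, 2))
    parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Single-point crossover on consecutive parent pairs.
    children = parents.copy()
    for i in range(POP // 2):
        a, b, cut = 2 * i, 2 * i + 1, rng.integers(1, N_PARAMS)
        children[a, cut:], children[b, cut:] = parents[b, cut:], parents[a, cut:]
    # Sparse Gaussian mutation keeps diversity in the population.
    mask = rng.random(children.shape) < 0.1
    children = children + mask * rng.normal(0.0, 0.05, children.shape)
    pop = children

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("Tuned membership-function parameters:", np.round(best, 3))
```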

Keywords: fuzzy logic controller, fuzzy logic, genetic algorithm, maximum power point, maximum power point tracking

Procedia PDF Downloads 373
1529 Red Blood Cells Deformability: A Chaotic Process

Authors: Ana M. Korol, Bibiana Riquelme, Osvaldo A. Rosso

Abstract:

Since erythrocyte deformability analysis is mostly qualitative, the development of quantitative nonlinear methods is crucial for restricting subjectivity in the study of cell behaviour. An electro-optic mechanic system called an erythrodeformeter has been developed and constructed in our laboratory in order to evaluate the erythrocytes' viscoelasticity. A numerical method formulated on the basis of fractal approximation for ordinary (OBM) and fractional Brownian motion (FBM), as well as wavelet transform analysis, is proposed to distinguish chaos from noise, based on the assumption that diffractometric data involve both deterministic and stochastic components and can therefore be modelled as a system of bounded correlated random walks. Here we report studies on 25 donors: 4 alpha-thalassaemic patients, 11 beta-thalassaemic patients, and 10 healthy non-alcoholic, non-smoking control individuals. The correlation coefficient, a nonlinear parameter, showed evidence of the changes in erythrocyte deformability; the wavelet entropy could quantify the differences detected in the light diffraction patterns. Such quantifiers show a good deal of promise and offer the possibility of a better understanding of the rheological aspects of erythrocytes, and they could also help in clinical diagnosis.
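
The wavelet-entropy quantifier mentioned above can be illustrated with a short sketch: decompose the signal into detail bands, treat each band's relative energy as a probability, and report the (normalised) Shannon entropy of those probabilities. The snippet below runs on a synthetic series rather than diffractometric patient data and assumes the PyWavelets package; the wavelet family and decomposition level are illustrative choices, not the ones used in the study.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_entropy(signal, wavelet="db4", level=6):
    """Normalised Shannon entropy of the relative wavelet energies
    of the detail bands (a common definition of wavelet entropy)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs[1:]])  # skip approximation band
    p = energies / energies.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(len(energies)))

# Synthetic stand-in for a diffractometric time series (not real patient data).
t = np.linspace(0.0, 10.0, 4096)
series = np.sin(2 * np.pi * 1.3 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)
print(f"Wavelet entropy: {wavelet_entropy(series):.3f}")
```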

Keywords: red blood cells, deformability, nonlinear dynamics, chaos theory, wavelet transform

Procedia PDF Downloads 59
1528 Transforming Data Science Curriculum Through Design Thinking

Authors: Samar Swaid

Abstract:

Today, corporations are moving toward the adoption of Design-Thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in Design-Thinking, IDEO (Innovation, Design, Engineering Organization), defines Design-Thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on the users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has been witnessed to be the road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure creating the "wow" effect on consumers. The Association for Computing Machinery task force on Data Science programs states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection and model interpretation. Thus, the Data Science program includes design thinking to ensure meeting user demands, generating more usable machine learning tools, and developing ways of framing computational thinking. Here, we describe the fundamentals of Design-Thinking and teaching modules for data science programs.

Keywords: data science, design thinking, AI, curriculum, transformation

Procedia PDF Downloads 81
1527 The Pressure Effect and First-Principles Study of Strontium Chalcogenides SrS

Authors: Benallou Yassine, Amara Kadda, Bouazza Boubakar, Soudini Belabbes, Arbouche Omar, M. Zemouli

Abstract:

The study of the pressure effect on materials, their functionality and their properties is very important, insofar as it provides the opportunity to identify other applications, such as the optical properties of the alkaline earth chalcogenides like SrS. Here we present first-principles calculations performed using the full-potential linearized augmented plane wave method (FP-LAPW) within the Generalized Gradient Approximation developed by Perdew–Burke–Ernzerhof for solids (PBEsol). The calculated structural parameters, like the lattice parameters, the bulk modulus B and its pressure derivative B', are in reasonable agreement with the available experimental and theoretical data. In addition, the elastic properties such as the elastic constants (C11, C12, and C44), the shear modulus G, the Young modulus E, the Poisson's ratio ν and the B/G ratio are also given. Exchange and correlation effects for the electronic properties were treated with the Tran-Blaha modified Becke-Johnson (TB-mBJ) potential. The pressure effect on the electronic properties was visualized by calculating the variation of the gap as a function of pressure. The obtained results are compared to available experimental data and to other theoretical calculations.

Keywords: SrS, GGA-PBEsol+TB-MBJ, density functional, Perdew–Burke–Ernzerhof, FP-LAPW, pressure effect

Procedia PDF Downloads 569
1526 Systematic and Meta-Analysis of Navigation in Oral and Maxillofacial Trauma and Impact of Machine Learning and AI in Management

Authors: Shohreh Ghasemi

Abstract:

Introduction: Managing oral and maxillofacial trauma is a multifaceted challenge, as it can have life-threatening consequences and significant functional and aesthetic impact. Navigation techniques have been introduced to improve surgical precision to meet this challenge. A machine learning algorithm was also developed to support clinical decision-making regarding the treatment of oral and maxillofacial trauma. Given these advances, this systematic meta-analysis aims to assess the efficacy of navigational techniques in treating oral and maxillofacial trauma and explore the impact of machine learning on their management. Methods: A detailed and comprehensive analysis of studies published between January 2010 and September 2021 was conducted through a systematic meta-analysis. This included performing a thorough search of the Web of Science, Embase, and PubMed databases to identify studies evaluating the efficacy of navigational techniques and the impact of machine learning in managing oral and maxillofacial trauma. Studies that did not meet the established entry criteria were excluded. In addition, the overall quality of the included studies was evaluated using the Cochrane risk-of-bias tool and the Newcastle-Ottawa scale. Results: A total of 12 studies, including 869 patients with oral and maxillofacial trauma, met the inclusion criteria. An analysis of the studies revealed that navigation techniques effectively improve surgical accuracy and minimize the risk of complications. Additionally, machine learning algorithms have proven effective in predicting treatment outcomes and identifying patients at high risk for complications. Conclusion: The introduction of navigational technology has great potential to improve surgical precision in oral and maxillofacial trauma treatment. Furthermore, developing machine learning algorithms offers opportunities to improve clinical decision-making and patient outcomes. Still, further studies are necessary to corroborate these results and establish the optimal use of these technologies in managing oral and maxillofacial trauma.

Keywords: trauma, machine learning, navigation, maxillofacial, management

Procedia PDF Downloads 58
1525 Machine Learning in Agriculture: A Brief Review

Authors: Aishi Kundu, Elhan Raza

Abstract:

"Necessity is the mother of invention" - the rapid increase in the global human population has directed the agricultural domain toward machine learning. The basic need of human beings is considered to be food, which can be satisfied through farming. Farming is one of the major revenue generators for the Indian economy. Agriculture is not only considered a source of employment but also fulfils humans' basic needs; it is therefore regarded as a source of employment and a pillar of the economy in developing countries like India. This paper provides a brief review of the progress made in implementing machine learning in the agricultural sector. Accurate predictions are necessary at the right time to boost production and to aid the timely and systematic distribution of agricultural commodities, making their availability in the market faster and more effective. The paper includes a thorough analysis of various machine learning algorithms applied in different aspects of agriculture (crop management, soil management, water management, yield tracking, livestock management, etc.). Crop production is affected by climate change; machine learning can analyse the changing patterns and come up with a suitable approach to minimize loss and maximize yield. Machine learning algorithms and models (regression, support vector machines, Bayesian models, artificial neural networks, decision trees, etc.) are used in smart agriculture to analyze and predict specific outcomes, which can be vital in increasing the productivity of the agricultural food industry, and the review illustrates how these methods are applied to sensor data in agricultural work, as sketched below. Machine learning is an ongoing technology helping farmers improve gains in agriculture and minimize losses. The paper also discusses how irrigation and farm management systems evolve efficiently in real time, and how Artificial Intelligence (AI)-enabled programs have emerged to support farmers through an extensive examination of data.

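As a small illustration of the kind of model comparison such a review covers, the sketch below scores a few regressors on synthetic agronomic features (rainfall, temperature, a soil index, irrigation hours) against a made-up yield; the feature set, the data-generating rule and the model settings are all assumptions for demonstration, not results from any surveyed paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for field records: rainfall (mm), mean temperature (°C),
# soil nitrogen index and irrigation hours -> yield (t/ha).
rng = np.random.default_rng(42)
X = rng.uniform([300, 15, 0.2, 0], [1200, 35, 1.0, 200], size=(500, 4))
y = (0.004 * X[:, 0] - 0.08 * (X[:, 1] - 25) ** 2 + 2.5 * X[:, 2]
     + 0.01 * X[:, 3] + rng.normal(0, 0.4, 500))

models = {
    "linear regression": LinearRegression(),
    "support vector regression": SVR(C=10.0),
    "decision tree": DecisionTreeRegressor(max_depth=6, random_state=0),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name:>26s}: mean CV R^2 = {r2:.3f}")
```
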
Keywords: machine learning, artificial intelligence, crop management, precision farming, smart farming, pre-harvesting, harvesting, post-harvesting

Procedia PDF Downloads 105
1524 Evolutional Substitution Cipher on Chaotic Attractor

Authors: Adda Ali-Pacha, Naima Hadj-Said

Abstract:

Nowadays, the security of information is primarily founded on cryptographic algorithms whose confidentiality depends on the number of bits necessary to define a cryptographic key. In this work, we introduce a new chaotic cryptosystem that we call an evolutional substitution cipher on a chaotic attractor; in this paper, we take the Henon attractor. The evolutional substitution cipher on the Henon attractor is based on the principle of the monoalphabetic cipher, and it associates the plaintext with a succession of real numbers calculated from the attractor equations.
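
A compact sketch of the idea is given below: the Henon map (with the classical parameters a = 1.4, b = 0.3) is iterated from a secret initial condition, and each plaintext character is substituted using an offset quantised from the current trajectory point, so the substitution rule evolves along the message. The quantisation rule and key layout here are hypothetical, chosen only to make the sketch runnable; the paper's own association between plaintext and attractor values will differ in detail.

```python
def henon_stream(n, x=0.1, y=0.3, a=1.4, b=0.3):
    """Yield n successive x-values of the Henon attractor (a=1.4, b=0.3)."""
    for _ in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        yield x

def encrypt(plaintext, x0=0.1, y0=0.3):
    # Each trajectory value is quantised to an offset in [0, 255]; because the
    # offset changes at every position, the substitution "evolves" along the text.
    stream = henon_stream(len(plaintext), x0, y0)
    return bytes((ord(c) + int(abs(x) * 1000) % 256) % 256
                 for c, x in zip(plaintext, stream))

def decrypt(ciphertext, x0=0.1, y0=0.3):
    stream = henon_stream(len(ciphertext), x0, y0)
    return "".join(chr((b - int(abs(x) * 1000) % 256) % 256)
                   for b, x in zip(ciphertext, stream))

msg = "chaotic substitution"
ct = encrypt(msg)
assert decrypt(ct) == msg   # the same initial condition (the key) recovers the text
print(ct.hex())
```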

Keywords: cryptography, substitution cipher, chaos theory, Henon attractor, evolutional substitution cipher

Procedia PDF Downloads 430
1523 Principal Component Analysis Combined Machine Learning Techniques on Pharmaceutical Samples by Laser Induced Breakdown Spectroscopy

Authors: Kemal Efe Eseller, Göktuğ Yazici

Abstract:

Laser-induced breakdown spectroscopy (LIBS) is a rapid optical atomic emission spectroscopy which is used for material identification and analysis, with the advantages of in-situ analysis, elimination of intensive sample preparation, and micro-destructive properties for the material to be tested. LIBS delivers short laser pulses onto the material in order to create plasma by excitation of the material above a certain threshold. The plasma characteristics, which consist of wavelength values and intensity amplitudes, depend on the material and the experiment's environment. In the present work, the spectrum profiles of medicine samples were obtained via LIBS. The medicine datasets include two different concentrations of both paracetamol-based medicines, namely Aferin and Parafon. The spectrum data of the samples were preprocessed by filling outliers based on quartiles, smoothing the spectra to eliminate noise, and normalizing both the wavelength and intensity axes. Statistical information was obtained, and principal component analysis (PCA) was applied to both the preprocessed and raw datasets. The machine learning models were set up with two different train-test splits, 70% training - 30% test and 80% training - 20% test. Cross-validation was preferred to protect the models against overfitting, since the sample amount is small. The machine learning results on the preprocessed and raw datasets were compared for both splits. This is the first time that all supervised machine learning classification algorithms (decision trees, discriminant analysis, naïve Bayes, support vector machines (SVM), k-nearest neighbor (k-NN), ensemble learning, and neural network algorithms) have been applied to LIBS data of paracetamol-based pharmaceutical samples and their different concentrations, on both preprocessed and raw datasets, in order to observe the effect of preprocessing.
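
The preprocessing-plus-PCA pipeline and the cross-validated classifier comparison can be sketched as below. The spectra here are synthetic stand-ins (real inputs would be the LIBS wavelength/intensity profiles of Aferin and Parafon at their two concentrations), and the number of principal components, folds and classifier settings are illustrative assumptions rather than the study's actual configuration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for LIBS spectra: 2 medicines x 2 concentrations = 4 labels,
# 2048 wavelength channels per spectrum.
rng = np.random.default_rng(7)
n_per_class, n_channels = 30, 2048
X, y = [], []
for label in range(4):
    pattern = rng.uniform(0.5, 1.5, n_channels)        # class-specific line pattern
    X.append(pattern + rng.normal(0, 0.3, (n_per_class, n_channels)))
    y += [label] * n_per_class
X, y = np.vstack(X), np.array(y)

classifiers = {
    "SVM": SVC(kernel="rbf", C=10),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "naive Bayes": GaussianNB(),
    "decision tree": DecisionTreeClassifier(max_depth=8, random_state=0),
}
for name, clf in classifiers.items():
    # Scaling + PCA compresses the spectra; cross-validation guards against
    # overfitting, which matters because the sample count is small.
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name:>14s}: mean CV accuracy = {acc:.3f}")
```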

Keywords: machine learning, laser-induced breakdown spectroscopy, medicines, principal component analysis, preprocessing

Procedia PDF Downloads 87
1522 Analysis of Three-Dimensional Longitudinal Rolls Induced by Double Diffusive Poiseuille-Rayleigh-Benard Flows in Rectangular Channels

Authors: O. Rahli, N. Mimouni, R. Bennacer, K. Bouhadef

Abstract:

This numerical study investigates the appearance of travelling waves and the behavior of Poiseuille-Rayleigh-Benard (PRB) flow induced in 3D thermosolutal mixed convection (TSMC) in horizontal rectangular channels. The governing equations are discretized using a control volume method with the third-order QUICK scheme to approximate the advection terms. The SIMPLER algorithm is used to handle the coupling between the momentum and continuity equations. To avoid excessively high computation time, the full approximation storage (FAS) scheme with the full multigrid (FMG) method is used to solve the problem. For a broad range of dimensionless controlling parameters, the contribution of this work is to analyze the flow regimes of the steady longitudinal thermoconvective rolls (noted R//) for both thermal and mass transfer (TSMC). The transition from opposed volume forces to cooperating ones considerably affects the birth and the development of the longitudinal rolls. The heat and mass transfer distributions are also examined.

Keywords: heat and mass transfer, mixed convection, poiseuille-rayleigh-benard flow, rectangular duct

Procedia PDF Downloads 299
1521 Iterative Solver for Solving Large-Scale Frictional Contact Problems

Authors: Thierno Diop, Michel Fortin, Jean Deteix

Abstract:

Since the precise formulation of the elastic part is irrelevant for the description of the algorithm, we shall consider a generic case. In practice, however, we will have to deal with a nonlinear material (for instance a Mooney-Rivlin model). We are interested in solving a finite element approximation of the problem, leading to large-scale nonlinear discrete problems and, after linearization, to large linear systems and ultimately to calculations needing iterative methods. This also implies that the penalty method, and therefore the augmented Lagrangian method, are to be banned because of their negative effect on the condition number of the underlying discrete systems and thus on the convergence of iterative methods. This is a break from the mainstream of methods for contact, in which the augmented Lagrangian is the principal tool. We shall first present the problem and its discretization; this will lead us to describe a general solution algorithm relying on a preconditioner for saddle-point problems, which we shall describe in some detail as it is not entirely standard. We will propose an iterative approach for solving three-dimensional frictional contact problems between elastic bodies, including contact with a rigid body, contact between two or more bodies and also self-contact.
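
To make the saddle-point structure concrete, the toy sketch below assembles a small symmetric indefinite system [A Bᵀ; B 0], as arises after linearization with the contact constraints, and solves it with MINRES under a block-diagonal positive-definite preconditioner (diag(A) for the primal block and a cheap diagonal Schur-complement estimate for the multipliers). The matrices, the preconditioner choice and the sizes are illustrative assumptions only; the paper's preconditioner is more elaborate than this diagonal variant.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import minres, LinearOperator

# Toy saddle-point system  [A  B'; B  0] [u; p] = [f; 0]
# A: SPD "stiffness-like" block, B: constraint (contact) block.
n, m = 200, 40
rng = np.random.default_rng(3)
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
B = sp.random(m, n, density=0.1, random_state=3, format="csr")
K = sp.bmat([[A, B.T], [B, None]], format="csr")
rhs = np.concatenate([rng.normal(size=n), np.zeros(m)])

# Block-diagonal SPD preconditioner: diag(A) for the primal block and a diagonal
# approximation of the Schur complement B diag(A)^-1 B' for the multiplier block.
dA = A.diagonal()
S_diag = np.maximum(B.multiply(B) @ (1.0 / dA), 1e-8)

def apply_prec(v):
    out = np.empty_like(v)
    out[:n] = v[:n] / dA
    out[n:] = v[n:] / S_diag
    return out

M = LinearOperator(K.shape, matvec=apply_prec)
x, info = minres(K, rhs, M=M, maxiter=2000)
print("MINRES info:", info, "| residual:", np.linalg.norm(K @ x - rhs))
```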

Keywords: frictional contact, three-dimensional, large-scale, iterative method

Procedia PDF Downloads 211
1520 Body Image Dissatisfaction and Personal Behavioral Control in Obese Patients Attending Treatment

Authors: Mariela Gonzalez, Zoraide Lugli, Eleonora Vivas, Rosana Guzmán

Abstract:

The objective was to determine the predictive capacity of perceived self-efficacy for weight control, locus of weight control, and weight self-management skills on dissatisfaction with body image in obese people attending treatment. A cross-sectional study was conducted in the city of Maracay, Venezuela, with 243 obese patients attending treatment, 173 female and 70 male, aged between 18 and 57 years. The body mass index of the sample ranged between 29.39 and 44.14. The following instruments were used: the Body Shape Questionnaire (BSQ), the inventory of body weight self-regulation, the inventory of self-efficacy in the regulation of body weight, and the inventory of the locus of weight control. After computing descriptive and central tendency statistics, correlation coefficients and multiple regressions, it was found that low perceived self-efficacy in weight control and a high external locus of control predict dissatisfaction with body image in obese patients attending treatment. The findings are a first approximation accounting for the importance of personal control variables in the study of the psychological distress of the overweight individual.

Keywords: dissatisfaction with body image, obese people, personal control, psychological variables

Procedia PDF Downloads 433
1519 Estimation of Population Mean Using Characteristics of Poisson Distribution: An Application to Earthquake Data

Authors: Prayas Sharma

Abstract:

This paper proposes a generalized class of estimators, an exponential class of estimators based on the adaptation of Sharma and Singh (2015) and Solanki and Singh (2013), and a simple difference estimator for estimating the unknown population mean in the case of a Poisson-distributed population under simple random sampling without replacement. The expressions for the mean square errors of the proposed classes of estimators are derived up to the first order of approximation. It is shown that the adapted version of Solanki and Singh (2013), the exponential class of estimators, is always more efficient than the usual estimator and the ratio, product, exponential ratio, and exponential product type estimators, and is equally efficient to the simple difference estimator. Moreover, the adapted version of Sharma and Singh's (2015) estimator is always more efficient than all the estimators available in the literature. In addition, the theoretical findings are supported by an empirical study to show the superiority of the constructed estimators over others, with an application to earthquake data from Turkey.

Keywords: auxiliary attribute, point bi-serial, mean square error, simple random sampling, Poisson distribution

Procedia PDF Downloads 156
1518 Financial Ethics: A Review of 2010 Flash Crash

Authors: Omer Farooq, Salman Ahmed Khan, Sadaf Khalid

Abstract:

Modern-day stock markets have become almost entirely automated. Even though this means increased profits for investors, with algorithms acting upon the slightest price change within microseconds, it has also given birth to many ethical dilemmas, in the sense that the slightest mistake can cause people to lose their livelihoods. This paper reviews one such event, which happened on May 06, 2010, in which $1 trillion disappeared from the Dow Jones Industrial Average. We discuss its various aspects and the ethical dilemmas that have arisen because of it.

Keywords: flash crash, market crash, stock market, stock market crash

Procedia PDF Downloads 520
1517 FracXpert: Ensemble Machine Learning Approach for Localization and Classification of Bone Fractures in Cricket Athletes

Authors: Madushani Rodrigo, Banuka Athuraliya

Abstract:

In today's world of medical diagnosis and prediction, machine learning stands out as a strong tool, transforming old ways of caring for health. This study analyzes the use of machine learning in the specialized domain of sports medicine, with a focus on the timely and accurate detection of bone fractures in cricket athletes. Failure to identify bone fractures in real time can result in malunion or non-union conditions. To ensure proper treatment and enhance the bone healing process, accurately identifying fracture locations and types is necessary. Interpretation of X-ray images relies on the expertise and experience of medical professionals, and radiographic images are sometimes of low quality, leading to potential issues. Therefore, a proper approach is needed to accurately localize and classify fractures in real time. The research has revealed that the optimal approach needs to address the stated problem by employing appropriate radiographic image processing techniques and object detection algorithms; these algorithms should effectively localize and accurately classify all types of fractures with high precision and in a timely manner. In order to overcome the challenges of misidentifying fractures, distinct models for fracture localization and classification have been implemented. The research also incorporates radiographic image enhancement and preprocessing techniques to overcome the limitations posed by low-quality images. A classification ensemble model has been implemented using ResNet18 and VGG16. In parallel, a fracture segmentation model has been implemented using the enhanced U-Net architecture. Combining the results of these two models, the FracXpert system can accurately localize exact fracture locations along with fracture types from the 12 available fracture patterns, which include avulsion, comminuted, compressed, dislocation, greenstick, hairline, impacted, intraarticular, longitudinal, oblique, pathological, and spiral. The system generates a confidence score indicating the degree of confidence in the predicted result. The fracture segmentation model, based on the enhanced U-Net architecture, achieved a high accuracy level of 99.94%, demonstrating its precision in identifying fracture locations, while the classification ensemble model built on ResNet18 and VGG16 achieved an accuracy of 81.0%, showcasing its ability to categorize various fracture patterns, which is instrumental in the fracture treatment process. In conclusion, FracXpert has become a promising ML application in sports medicine, demonstrating its potential to revolutionize fracture detection processes. By leveraging the power of ML algorithms, this study contributes to the advancement of diagnostic capabilities in cricket athlete healthcare, ensuring timely and accurate identification of bone fractures for the best treatment outcomes.
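
The classification side of the system can be sketched as a soft-voting ensemble of the two backbones named above. The snippet below replaces the final layers of ResNet18 and VGG16 with 12-way heads and averages their softmax outputs to obtain a class and a confidence score; the weights are untrained placeholders and the input is a dummy tensor, since training on the radiograph dataset is outside the scope of this illustration.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 12  # avulsion, comminuted, ..., spiral

# Two backbones with their classification heads replaced for 12 fracture patterns.
resnet = models.resnet18(weights=None)
resnet.fc = nn.Linear(resnet.fc.in_features, NUM_CLASSES)
vgg = models.vgg16(weights=None)
vgg.classifier[6] = nn.Linear(vgg.classifier[6].in_features, NUM_CLASSES)
resnet.eval()
vgg.eval()

def ensemble_predict(x):
    """Soft-voting ensemble: average the two softmax outputs and report the
    winning class together with its probability as a confidence score."""
    with torch.no_grad():
        p = (torch.softmax(resnet(x), dim=1) + torch.softmax(vgg(x), dim=1)) / 2
    conf, cls = p.max(dim=1)
    return cls, conf

# One dummy 224x224 "radiograph" batch; real inputs would be preprocessed X-rays.
dummy = torch.randn(1, 3, 224, 224)
cls, conf = ensemble_predict(dummy)
print(f"predicted class {cls.item()} with confidence {conf.item():.2f}")
```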

Keywords: multiclass classification, object detection, ResNet18, U-Net, VGG16

Procedia PDF Downloads 120
1516 Vibration of Nanobeam Subjected to Constant Magnetic Field and Ramp-Type Thermal Loading under Non-Fourier Heat Conduction Law of Lord-Shulman

Authors: Hamdy M. Youssef

Abstract:

In this work, the usual Euler–Bernoulli nanobeam has been modeled in the context of the Lord-Shulman thermoelastic theory, which contains a non-Fourier heat conduction law. The nanobeam has been subjected to a constant magnetic field and ramp-type thermal loading. The Laplace transform has been applied to the governing equations, and the solutions have been obtained by using a direct approach. The inversions of the Laplace transform have been calculated numerically by using the Tzou approximation method. The solutions have been applied to a nanobeam made of silicon nitride. The distributions of the temperature increment, lateral deflection, strain, stress, and strain-energy density have been represented in figures for different values of the magnetic field intensity and the ramp-time heat parameter. The values of the magnetic field intensity and the ramp-time heat parameter have significant effects on all the studied functions, and they could be used as tuners to control the energy generated in the nanobeam.
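
For readers unfamiliar with the inversion step, the sketch below implements the Riemann-sum approximation commonly attributed to Tzou and checks it against a known transform pair. The constant c ≈ 4.7 and the number of terms are the values usually quoted for fast convergence and should be treated as tunable assumptions, not as the exact settings used in the study.

```python
import numpy as np

def tzou_invert(F, t, N=1000, c=4.7):
    """Riemann-sum approximation of the inverse Laplace transform:
        f(t) ≈ (e^c / t) [ F(c/t)/2 + Re Σ_{n=1}^{N} (-1)^n F((c + i n π)/t) ]
    where c = κ t ≈ 4.7 is the value usually recommended for fast convergence."""
    total = 0.5 * F(c / t)
    n = np.arange(1, N + 1)
    total += np.sum(((-1.0) ** n) * F((c + 1j * n * np.pi) / t)).real
    return np.exp(c) / t * total

# Sanity check on a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t)
for t in (0.5, 1.0, 2.0):
    approx = tzou_invert(lambda s: 1.0 / (s + 1.0), t)
    print(f"t={t}: numerical {approx:.5f}   exact {np.exp(-t):.5f}")
```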

Keywords: nanobeam, vibration, constant magnetic field, ramp-type thermal loading, non-Fourier heat conduction law

Procedia PDF Downloads 138
1515 First Principles Study of Structural, Electronic, Magnetic and Optical Properties of SiNi₂O₄ Spinel Oxide

Authors: Karkour Selma

Abstract:

We conducted first-principles full-potential calculations using the Wien2k code to explore the structural, electronic, magnetic, and optical properties of SiNi₂O₄, a cubic normal spinel oxide. Our calculations, based on the GGA-PBEsol form of the generalized gradient approximation, revealed several key findings. The spinel oxide exhibits a stable cubic structure in the ferromagnetic phase and shows 100% spin polarization. We determined the equilibrium lattice constant and internal parameter values. In terms of the electronic properties, we observed a direct bandgap of 2.68 eV for the spin-up configuration, while the spin-down configuration exhibited an indirect bandgap of 0.82 eV. Additionally, we calculated the total density of states and the partial densities for each atom, finding a total spin magnetic moment of 8.0 μB per formula unit. The optical properties have also been calculated: the real ε₁(ω) and imaginary ε₂(ω) parts of the complex dielectric constant, the refractive index, the reflectivity, and the energy loss for light scattered from the material. The absorption region spans from 1.5 eV to 14 eV with significant intensity. The calculated results confirm the suitability of this material for optical and spintronic device applications.

Keywords: DFT, spintronic, GGA, spinel

Procedia PDF Downloads 76
1514 Performance Evaluation of Packet Scheduling with Channel Conditioning Aware Based on Wimax Networks

Authors: Elmabruk Laias, Abdalla M. Hanashi, Mohammed Alnas

Abstract:

In Worldwide Interoperability for Microwave Access (WiMAX) networks, resource scheduling has become one of the most challenging issues, since the scheduler is responsible for distributing the available network resources among all users. This has led to the demand for designing highly efficient scheduling algorithms in order to improve network utilization, increase network throughput, and minimize end-to-end delay. In this study, the proposed algorithm focuses on an efficient mechanism to serve non-real-time traffic in congested networks by considering channel status.

Keywords: WiMAX, quality of service (QoS), OPNET, Diff-Serv (DS)

Procedia PDF Downloads 286
1513 Two-Photon-Exchange Effects in the Electromagnetic Production of Pions

Authors: Hui-Yun Cao, Hai-Qing Zhou

Abstract:

High-precision measurements and experiments play increasingly important roles in particle physics and atomic physics. To analyse the precise experimental data sets, correspondingly precise and reliable theoretical calculations are necessary. Until now, the form factors of elementary constituents such as the pion and the proton remain attractive issues in current Quantum Chromodynamics (QCD). In this work, the two-photon-exchange (TPE) effects in ep→enπ⁺ at small -t are discussed within a hadronic model. Under the pion dominance approximation and the limit mₑ→0, the TPE contribution to the amplitude can be described by a scalar function. We calculate the TPE contributions to the amplitude and to the unpolarized differential cross section, with only the elastic intermediate state considered. The results show that the TPE corrections to the unpolarized differential cross section are about -4% to -20% at Q²=1-1.6 GeV². After considering the TPE corrections to the experimental data sets of the unpolarized differential cross section, we analyze the TPE corrections to the separated cross sections σ(L,T,LT,TT). We find that the TPE corrections (at Q²=1-1.6 GeV²) to σL are about -10% to -30%, to σT are about 20%, and to σ(LT,TT) are much larger. By these analyses, we conclude that the TPE contributions in ep→enπ⁺ at small -t are important for extracting the separated cross sections σ(L,T,LT,TT) and the electromagnetic form factor of π⁺ in the experimental analysis.

Keywords: differential cross section, form factor, hadronic, two-photon

Procedia PDF Downloads 133
1512 Second Harmonic Generation of Higher-Order Gaussian Laser Beam in Density Rippled Plasma

Authors: Jyoti Wadhwa, Arvinder Singh

Abstract:

This work presents the theoretical investigation of an enhanced second-harmonic generation of higher-order Gaussian laser beam in plasma having a density ramp. The mechanism responsible for the self-focusing of a laser beam in plasma is considered to be the relativistic mass variation of plasma electrons under the effect of a highly intense laser beam. Using the moment theory approach and considering the Wentzel-Kramers-Brillouin approximation for the non-linear Schrodinger wave equation, the differential equation is derived, which governs the spot size of the higher-order Gaussian laser beam in plasma. The nonlinearity induced by the laser beam creates the density gradient in the background plasma electrons, which is responsible for the excitation of the electron plasma wave. The large amplitude electron plasma wave interacts with the fundamental beam, which further produces the coherent radiations with double the frequency of the incident beam. The analysis shows the important role of the different modes of higher-order Gaussian laser beam and density ramp on the efficiency of generated harmonics.

Keywords: density rippled plasma, higher order Gaussian laser beam, moment theory approach, second harmonic generation

Procedia PDF Downloads 180
1511 Study and Analysis of the Factors Affecting Road Safety Using Decision Tree Algorithms

Authors: Naina Mahajan, Bikram Pal Kaur

Abstract:

The purpose of traffic accident analysis is to find the possible causes of an accident. Road accidents cannot be totally prevented, but through suitable traffic engineering and management the accident rate can be reduced to a certain extent. This paper discusses the classification techniques C4.5 and ID3 using the WEKA data mining tool, applied to the NH (National Highway) dataset. The C4.5 and ID3 techniques give the best results and high accuracy, with low computation time and error rate.
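
The same idea can be reproduced outside WEKA with a few lines: an entropy-criterion decision tree (the information-gain splitting used by ID3 and refined by C4.5, which WEKA implements as J48) is cross-validated on a synthetic accident table. The features, the severity rule and the tree depth below are illustrative assumptions, not the NH dataset.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for highway accident records.
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 24, n),             # hour of day
    rng.choice([40, 60, 80, 100], n),   # speed limit (km/h)
    rng.integers(1, 5, n),              # number of lanes
    rng.integers(0, 2, n),              # wet road surface (0/1)
])
# Illustrative rule: severe accidents cluster around high speeds at night or on wet roads.
y = ((X[:, 1] >= 80) & ((X[:, 0] < 6) | (X[:, 3] == 1))).astype(int)

tree = DecisionTreeClassifier(criterion="entropy", max_depth=5, random_state=0)
print("mean 10-fold CV accuracy:", cross_val_score(tree, X, y, cv=10).mean())
```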

Keywords: C4.5, ID3, NH (National Highway), WEKA data mining tool

Procedia PDF Downloads 338
1510 GGA-PBEsol+TB-MBJ Studies of SrxPb1-xS Ternary Semiconductor Alloys

Authors: Y. Benallou, K. Amara, O. Arbouche

Abstract:

In this paper, we report a density functional study of the structural, electronic and elastic properties of the ordered phases of SrxPb1-xS ternary semiconductor alloys, namely the rocksalt compounds PbS and SrS and the rocksalt-based compounds SrPb3S4, SrPbS2, and Sr3PbS4. These first-principles calculations have been performed using the full-potential linearized augmented plane wave method (FP-LAPW) within the Generalized Gradient Approximation developed by Perdew–Burke–Ernzerhof for solids (PBEsol). The calculated structural parameters, like the lattice parameters, the bulk modulus B and their pressure derivative B', are in reasonable agreement with the available experimental and theoretical data. In addition, the elastic properties such as the elastic constants (C11, C12, and C44), the shear modulus G, the Young modulus E, the Poisson's ratio ν and the B/G ratio are also given. For the electronic property calculations, the exchange and correlation effects were treated by the Tran-Blaha modified Becke-Johnson (TB-mBJ) potential to overcome the shortcoming of the underestimation of the energy gaps in both the LDA and GGA approximations. The obtained results are compared to available experimental data and to other theoretical calculations.

Keywords: SrxPb1-xS, GGA-PBEsol+TB-MBJ, density functional, Perdew–Burke–Ernzerhof, FP-LAPW

Procedia PDF Downloads 398
1509 Key Transfer Protocol Based on Non-invertible Numbers

Authors: Luis A. Lizama-Perez, Manuel J. Linares, Mauricio Lopez

Abstract:

We introduce a method to perform remote user authentication based on what we call non-invertible cryptography. It exploits the fact that the multiplication of an invertible integer and a non-invertible integer in a ring Zn produces a non-invertible integer, making it infeasible to compute the factorization. The protocol requires the smallest key size when compared with the main public-key algorithms such as Diffie-Hellman, Rivest-Shamir-Adleman, or elliptic curve cryptography. Since we found that the only opportunity for the eavesdropper is to mount an exhaustive search on the keys, the protocol appears to be post-quantum.
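
The ring-arithmetic fact the scheme rests on is easy to demonstrate: an element of Zn is invertible exactly when it is coprime to n, and the product of an invertible and a non-invertible element stays non-invertible. The toy moduli below are far too small for security and are used only to make the sketch runnable; the protocol itself (key layout, exchange messages) is not reproduced here.

```python
from math import gcd
import random

p, q = 7919, 104729          # toy primes; a real instance would use far larger numbers
n = p * q
rng = random.Random(5)

def is_invertible(a, n):
    return gcd(a, n) == 1    # a has an inverse in the ring Zn iff gcd(a, n) = 1

# An invertible element x ...
x = rng.randrange(2, n)
while not is_invertible(x, n):
    x = rng.randrange(2, n)
# ... and a non-invertible element y (it shares the factor p with n).
y = p * rng.randrange(2, q)

z = (x * y) % n
print("x invertible:", is_invertible(x, n))               # True
print("y invertible:", is_invertible(y, n))               # False
print("z = x*y mod n invertible:", is_invertible(z, n))   # False

# The invertible factor has a modular inverse,
assert (x * pow(x, -1, n)) % n == 1
# but splitting z back into its invertible and non-invertible parts without extra
# information is the task the authors argue forces an exhaustive key search.
```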

Keywords: invertible, non-invertible, ring, key transfer

Procedia PDF Downloads 179
1508 Automatic Assignment of Geminate and Epenthetic Vowel for Amharic Text-to-Speech System

Authors: Tadesse Anberbir, Felix Bankole, Tomio Takara, Girma Mamo

Abstract:

In the development of a text-to-speech synthesizer, automatic derivation of the correct pronunciation from the grapheme form of a text is a central problem. Deriving phonological features which are not shown in the orthography is particularly challenging. In the Amharic language, geminates and epenthetic vowels are crucial for proper pronunciation, but neither is shown in the orthography. In this paper, we propose and integrate a morphological analyzer into an Amharic text-to-speech system, mainly to predict geminate and epenthetic vowel positions, and we prepare a duration modeling method. The Amharic Text-to-Speech system (AmhTTS) is a parametric and rule-based system that adopts a cepstral method and uses a source-filter model for speech production and a Log Magnitude Approximation (LMA) filter as the vocal tract filter. The naturalness of the system after employing the duration modeling was evaluated by a sentence listening test, and we achieved an average Mean Opinion Score (MOS) of 3.4 (68%), which is moderate. By modeling the duration of geminates and controlling the locations of epenthetic vowels, we are able to synthesize good quality speech. Our system is mainly suitable to be customized for other Ethiopian languages with limited resources.

Keywords: Amharic, gemination, speech synthesis, morphology, epenthesis

Procedia PDF Downloads 87
1507 Variation of Base Width of a Typical Concrete Gravity Dam under Different Seismic Conditions Using Static Seismic Loading

Authors: Prasanna Kumar Khaund, Sukanya Talukdar

Abstract:

A concrete gravity dam is a major hydraulic structure, and it is essential to consider earthquake forces in order to obtain a proper design base width, so that the entire weight of the dam resists the overturning moment due to earthquake and other forces. The main objective of this study is to obtain the design base width of a dam for different seismic conditions by varying the earthquake coefficients in both the vertical and horizontal directions. This is done by equating the factor of safety against overturning, the factor of safety against sliding, and the shear friction factor of safety for the dam with their limiting values, under both tail water and no tail water conditions. The shape of the Mettur dam in India is considered for the study. The study assumes a constant head of water in the reservoir, which is the maximum reservoir water level, and a constant height of tail water. Using the Newton–Raphson linear approximation method, the equations obtained for the different factors of safety under different earthquake conditions are solved with a C++ program to obtain the design base width of the dam for varying earthquake conditions, as sketched below.
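
The root-finding step can be sketched as follows. The authors solve their stability equations with a C++ program; the fragment below shows the same Newton–Raphson iteration in Python, applied to a placeholder overturning-stability function, since the actual factor-of-safety expressions depend on the dam geometry, water levels and seismic coefficients of the study.

```python
def newton_raphson(f, x0, tol=1e-8, max_iter=100, h=1e-6):
    """Newton-Raphson iteration with a central-difference derivative."""
    x = x0
    for _ in range(max_iter):
        dfdx = (f(x + h) - f(x - h)) / (2.0 * h)
        step = f(x) / dfdx
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

def overturning_margin(b, fs_limit=1.5):
    # Placeholder: in the study this would be the factor of safety against
    # overturning (or sliding / shear friction) expressed as a function of the
    # base width b; the design width solves FS(b) - FS_limit = 0.
    resisting = 12.0 * b ** 2          # stabilising moment, grows roughly with b^2 (illustrative)
    overturning = 900.0 + 40.0 * b     # destabilising moment (illustrative)
    return resisting / overturning - fs_limit

b_design = newton_raphson(overturning_margin, x0=20.0)
print(f"design base width ≈ {b_design:.2f} m")
```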

Keywords: design base width, horizontal earthquake coefficient, tail water, vertical earthquake coefficient

Procedia PDF Downloads 282
1506 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications

Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso

Abstract:

The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO RADAR sensor, an optical camera and a dedicated set of software algorithms encompassing interferometry, tomography and photogrammetry. The MIMO radar sensor proposed in this work provides extremely high sensitivity to displacements, making the system able to react to tiny deformations (down to tens of microns) with a time scale which spans from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped onto the azimuth-range directions with notable resolution in both dimensions and an outstanding repetition rate. The back-scattered energy, which is distributed in 3D space, is projected onto a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performing processing unit allows the observed scene to be sensed with remarkable refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing for easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user of the system with high-frequency three-dimensional motion/vibration estimation of each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.

Keywords: interferometry, MIMO RADAR, SAR, tomography

Procedia PDF Downloads 195
1505 Maximum Deformation Estimation for Reinforced Concrete Buildings Using Equivalent Linearization Method

Authors: Chien-Kuo Chiu

Abstract:

In displacement-based seismic design and evaluation, the equivalent linearization method is one of the approximation methods used to estimate the maximum inelastic displacement response of a system. In this study, the accuracy of two equivalent linearization methods is investigated. The investigation covers three soil conditions in Taiwan (Taipei Basin 1, 2, and 3) and five different building heights (H_r = 10, 20, 30, 40, and 50 m). The first method is the Taiwan equivalent linearization method (TELM), which was proposed based on the Japanese equivalent linearization method considering the modification factor α_T = 0.85. On the basis of the Lin and Miranda study, the second method is proposed with some modification considering Taiwan soil conditions. From this study, it is shown that the Taiwanese equivalent linearization method gives a better estimation compared to the modified Lin and Miranda method (MLM). The error indices for the Taiwanese equivalent linearization method are 16%, 13%, and 12% for Taipei Basin 1, 2, and 3, respectively. Furthermore, a ductility demand spectrum of a single-degree-of-freedom (SDOF) system is presented in this study as a guide for engineers to estimate the ductility demand of a structure.

Keywords: displacement-based design, ductility demand spectrum, equivalent linearization method, RC buildings, single-degree-of-freedom

Procedia PDF Downloads 162
1504 Analytical Solution for Multi-Segmented Toroidal Shells under Uniform Pressure

Authors: Nosakhare Enoma, Alphose Zingoni

Abstract:

The requirements for various toroidal shell forms are increasing due to new applications, available storage space and the consideration of appearance. Because of the complexity of some of these structural forms, the finite element method is nowadays mainly used for their analysis, even for simple static studies. This paper presents an easy-to-use analytical algorithm for pressurized multi-segmented toroidal shells of revolution. The membrane solution, which acts as a particular solution of the bending-theory equations, is developed based on the membrane theory of shells, and a general approach is formulated for quantifying discontinuity effects at the shell junctions using the well-known Geckeler approximation. On superimposing these effects, and applying the ensuing solution to the problem of the pressurized toroid with four segments, closed-form stress results are obtained for the entire toroid. A numerical example is carried out using the developed method. The analytical results obtained show excellent agreement with those from the finite element method, indicating that the proposed method can also be used for complementing and verifying FEM results, and for providing insights into other related problems.

Keywords: bending theory of shells, membrane hypothesis, pressurized toroid, segmented toroidal vessel, shell analysis

Procedia PDF Downloads 320
1503 Unsupervised Part-of-Speech Tagging for Amharic Using K-Means Clustering

Authors: Zelalem Fantahun

Abstract:

Part-of-speech tagging is the process of assigning a part-of-speech or other lexical class marker to each word in naturally occurring text. Part-of-speech tagging is a fundamental and basic task in almost all natural language processing. In natural language processing, the problem of providing large amounts of manually annotated data is a knowledge acquisition bottleneck. Since Amharic is an under-resourced language, the availability of a tagged corpus is the bottleneck problem for natural language processing, especially for POS tagging. A promising direction to tackle this problem is to provide a system that does not require manually tagged data. In unsupervised learning, the learner is not provided with classifications; unsupervised algorithms seek out similarity between pieces of data in order to determine whether they can be characterized as forming a group. This paper explicates the development of an unsupervised part-of-speech tagger for Amharic using K-means clustering, since a large amount of raw text is produced in day-to-day activities. In the development of the tagger, the following procedures are followed. First, the unlabeled data (raw text) is divided into 10 folds and the tokenization phase takes place; at this level, the raw text is chunked at the sentence level and then into words. The second phase is feature extraction, which includes word frequency and the syntactic and morphological features of a word. The third phase is clustering: among different clustering algorithms, K-means is selected and implemented in this study to bring groups of similar words together. The fourth phase is mapping, where each cluster is inspected carefully and the most common tag is assigned to the group. This study finds two features that are capable of distinguishing one part of speech from others, morphological features and positional information, and shows that it is possible to use unsupervised learning for Amharic POS tagging. In order to increase the performance of the unsupervised part-of-speech tagger, there is a need to incorporate other features that are not included in this study, such as semantically related information. Finally, based on the experimental results, the performance of the system achieves a maximum of 81% accuracy.
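
The clustering and mapping phases translate directly into code. The sketch below extracts frequency, crude morphological cues and positional information for each word type of a tiny toy corpus (English stand-in text, since Amharic resources are not reproduced here), clusters the feature vectors with K-means, and prints the members of each cluster as the mapping phase would inspect them; the feature definitions and cluster count are illustrative assumptions.

```python
import numpy as np
from collections import Counter
from sklearn.cluster import KMeans

# Toy corpus standing in for the raw (unlabelled) text of the 10 folds.
corpus = [
    "the cat sat on the mat",
    "a dog chased the small cat",
    "the small dog sat quietly",
    "cats and dogs ran quickly home",
]
sentences = [s.split() for s in corpus]
freq = Counter(w for s in sentences for w in s)
vocab = sorted(freq)

def features(word):
    # Word frequency, crude morphological cues and positional information,
    # mirroring the feature groups described in the abstract.
    positions = [i / (len(s) - 1) for s in sentences for i, w in enumerate(s) if w == word]
    return [
        np.log1p(freq[word]),        # frequency
        len(word),                   # length as a rough morphological cue
        int(word.endswith("s")),     # suffix cues
        int(word.endswith("ly")),
        float(np.mean(positions)),   # average relative position in the sentence
    ]

X = np.array([features(w) for w in vocab])
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Mapping phase: each cluster is inspected and assigned its most common tag.
for c in range(4):
    print(f"cluster {c}:", [w for w, lab in zip(vocab, km.labels_) if lab == c])
```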

Keywords: POS tagging, Amharic, unsupervised learning, k-means

Procedia PDF Downloads 451
1502 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia

Authors: Rohan Bhasin

Abstract:

Agrammatism in non-fluent aphasia cases can be defined as a language disorder wherein a patient can only use content words (nouns, verbs and adjectives) for communication, and their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles, conjunctions) around content words (nouns, verbs and adjectives) using a combination of natural language processing and deep learning algorithms. This approach can be applied to assist communication. The approach the paper investigates is a sequence-to-sequence (seq2seq) or LSTM model, which takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example containing a pair of the form (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing functional words to get just the content words; however, this approach would require a lot of training data to produce coherent output. The assumption of this approach is that the content words received in the input are preserved, i.e., they are not altered after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where such word order might not be inherently correct. The approach can be used to assist communication for mild agrammatism in non-fluent aphasia cases. By generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversation. Our project thus translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
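
The data-generation step described above (complete sentence in, content words out) is simple enough to sketch directly. The short function-word list below is a hypothetical stand-in for a proper stop-word or POS-based resource, and the corpus is illustrative; the seq2seq model itself is not shown.

```python
# Hypothetical function-word list; a production system would use a proper
# stop-word/POS resource rather than this short illustrative set.
FUNCTION_WORDS = {
    "a", "an", "the", "in", "on", "at", "to", "of", "and", "but", "or",
    "is", "are", "was", "were", "with", "for", "by",
}

def make_training_pair(sentence):
    """Strip function words from a complete sentence to build the
    (content words, complete sentence) pairs the seq2seq model trains on."""
    content = [w for w in sentence.lower().split() if w not in FUNCTION_WORDS]
    return " ".join(content), sentence

corpus = [
    "The nurse is waiting in the room",
    "I want a glass of water",
    "We walked to the park with the dog",
]
for src, tgt in (make_training_pair(s) for s in corpus):
    print(f"{src!r:35s} -> {tgt!r}")
```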

Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM

Procedia PDF Downloads 164
1501 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market

Authors: Cristian Păuna

Abstract:

In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investments became the primary tool for making a profit by speculation in financial markets. A significant number of traders, private and institutional investors, participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed to build a reliable trend line, which is the basis for limit conditions and automated investment signals, the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals, limit conditions to build a mathematical filter for investment opportunities, and the methodology to integrate all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a risk-to-reward ratio of 1:6.12 was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the Price Prediction Line model presented is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
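
A heavily simplified stand-in for the trend-line-plus-limit-condition idea is sketched below: a least-squares line is fitted over a rolling window of closes, a deviation band plays the role of the limit condition, and a signal fires only when the last price sits outside the band on the side of the trend. The band width, window and synthetic price series are assumptions for illustration; the Price Prediction Line and its limit conditions follow the paper's own formulas, not this fragment.

```python
import numpy as np

def prediction_line(prices, window=50, band=1.5):
    """Fit a least-squares trend line over the last `window` closes and emit a
    signal when the last price breaches a deviation band around the line."""
    y = np.asarray(prices[-window:], dtype=float)
    x = np.arange(window)
    slope, intercept = np.polyfit(x, y, 1)
    line = slope * x + intercept
    sigma = np.std(y - line)                 # dispersion around the trend line
    last, pred = y[-1], line[-1]
    if last < pred - band * sigma and slope > 0:
        signal = "BUY"      # price well below a rising trend line
    elif last > pred + band * sigma and slope < 0:
        signal = "SELL"     # price well above a falling trend line
    else:
        signal = "HOLD"     # limit condition not met
    return pred, sigma, signal

rng = np.random.default_rng(1)
closes = 12000 + np.cumsum(rng.normal(3, 40, 300))   # synthetic index-like series
print(prediction_line(closes))
```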

Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex

Procedia PDF Downloads 130