Search results for: random parameters
10512 A Statistical Approach to Predict and Classify the Commercial Hatchability of Chickens Using Extrinsic Parameters of Breeders and Eggs
Authors: M. S. Wickramarachchi, L. S. Nawarathna, C. M. B. Dematawewa
Abstract:
Hatchery performance is critical for the profitability of poultry breeder operations. Some extrinsic parameters of eggs and breeders increase or decrease hatchability. This study aims to identify the extrinsic parameters affecting the commercial hatchability of local chickens' eggs and to determine the most efficient classification model for a hatchability rate greater than 90%. Seven extrinsic parameters were considered: egg weight, moisture loss, breeders' age, number of fertilised eggs, shell width, shell length, and shell thickness. Multiple linear regression was performed to determine the most influential variables on hatchability. First, the correlation between each parameter and hatchability was checked. Then a multiple regression model was developed, and the accuracy of the fitted model was evaluated. Linear Discriminant Analysis (LDA), Classification and Regression Trees (CART), k-Nearest Neighbors (kNN), Support Vector Machines (SVM) with a linear kernel, and Random Forest (RF) algorithms were applied to classify hatchability. This grouping was conducted using binary classification techniques. Hatchability was negatively correlated with egg weight, breeders' age, shell width, and shell length, while positive correlations were identified with moisture loss, number of fertilised eggs, and shell thickness. Multiple linear regression models were more accurate than single linear models, with the highest coefficient of determination (R²) of 94% and minimum AIC and BIC values. According to the classification results, RF, CART, and kNN achieved the highest accuracy values of 0.99, 0.975, and 0.972, respectively, for the commercial hatchery process. Therefore, RF is the most appropriate machine learning algorithm for classifying breeder outcomes as economically profitable or not in a commercial hatchery.
Keywords: classification models, egg weight, fertilised eggs, multiple linear regression
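As an illustration of the classifier comparison described above, a minimal scikit-learn sketch; the feature matrix, labels, and the 90% threshold encoding are placeholders, not the study's data:

```python
# Sketch: comparing the five classifiers named in the abstract with 10-fold
# cross-validation. Feature values and the label encoding are assumed.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 7))            # stand-in for the 7 extrinsic parameters
y = (rng.random(200) > 0.5).astype(int)  # 1 = hatchability > 90%, 0 otherwise

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "kNN": KNeighborsClassifier(),
    "SVM (linear)": SVC(kernel="linear"),
    "RF": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name:12s} mean accuracy = {scores.mean():.3f}")
```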
Procedia PDF Downloads 88
10511 Using Predictive Analytics to Identify First-Year Engineering Students at Risk of Failing
Authors: Beng Yew Low, Cher Liang Cha, Cheng Yong Teoh
Abstract:
Due to a lack of continual assessment or grade-related data, identifying first-year engineering students in a polytechnic education at risk of failing is challenging. Our experience over the years tells us that there is no strong correlation between having good entry grades in Mathematics and the Sciences and excelling in hardcore engineering subjects. Hence, identifying students at risk of failure cannot be done on the basis of entry grades in Mathematics and the Sciences alone. These factors compound the difficulty of early identification and intervention. This paper describes the development of a predictive analytics model for the early detection of students at risk of failing and evaluates its effectiveness. Data from continual assessments conducted in term one, supplemented by data on student psychological profiles such as interests and study habits, were used. Three classification techniques, namely Logistic Regression, K Nearest Neighbour, and Random Forest, were used in our predictive model. Based on our findings, Random Forest was determined to be the strongest predictor, with an Area Under the Curve (AUC) value of 0.994. Correspondingly, its Accuracy, Precision, Recall, and F-Score were also the highest among the three classifiers. Using the Random Forest classification technique, students at risk of failure can be identified at the end of term one and assigned to a Learning Support Programme at the beginning of term two. This paper presents our findings and proposes further improvements that can be made to the model.
Keywords: continual assessment, predictive analytics, random forest, student psychological profile
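A minimal sketch of the AUC-based comparison the abstract reports, assuming stand-in features for the term-one assessments and psychological profiles:

```python
# Sketch: evaluating the three classifiers by AUC on a held-out split.
# Feature values and the at-risk rate are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))           # continual assessment + profile features
y = (rng.random(500) < 0.2).astype(int)  # 1 = at risk of failing

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)
for name, clf in [("LogReg", LogisticRegression(max_iter=1000)),
                  ("kNN", KNeighborsClassifier()),
                  ("RF", RandomForestClassifier(random_state=1))]:
    clf.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```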
Procedia PDF Downloads 136
10510 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods
Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun
Abstract:
Traditionally, the three important manufacturing functions of process planning, scheduling, and due-date assignment are performed separately and sequentially. For a couple of decades, hundreds of studies have been conducted on integrated process planning and scheduling problems, and numerous works have addressed scheduling with due-date assignment, but the integration of all three functions has not been adequately addressed. Here, the integration of these three functions is studied using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid, and random search techniques. The importance of integrating these three functions and the power of meta-heuristics and hybrid heuristics are also studied.
Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics
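A minimal sketch of the simulated annealing component under an assumed placeholder cost function (the actual integrated cost combines process plan, schedule, and due-date penalties):

```python
# Sketch: simulated annealing over a job sequence with random pairwise swaps
# and geometric cooling. The cost function is a stand-in, not the paper's.
import math, random

def cost(seq):
    # placeholder for the weighted scheduling + due-date assignment cost
    return sum(w * abs(pos - j) for pos, (j, w) in enumerate(seq))

def simulated_annealing(seq, T=100.0, alpha=0.95, iters=2000):
    best = cur = seq[:]
    for _ in range(iters):
        cand = cur[:]
        i, j = random.sample(range(len(cand)), 2)   # random pairwise swap
        cand[i], cand[j] = cand[j], cand[i]
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / T):
            cur = cand
            if cost(cur) < cost(best):
                best = cur[:]
        T *= alpha                                  # geometric cooling
    return best

jobs = [(j, random.randint(1, 5)) for j in range(10)]  # (job id, weight)
random.shuffle(jobs)
print(cost(simulated_annealing(jobs)))
```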
Procedia PDF Downloads 469
10509 Approximation of the Time Series by Fractal Brownian Motion
Authors: Valeria Bondarenko
Abstract:
In this paper, we propose two problems related to fractal Brownian motion. The first problem is the simultaneous estimation of the two parameters, the Hurst exponent and the volatility, that describe this random process; numerical tests on simulated fBm show the method to be efficient. The second problem is the approximation of the increments of an observed time series, via a power function, by increments of fractional Brownian motion. Approximation and estimation are demonstrated on real data, daily deposit interest rates.
Keywords: fractional Brownian motion, Gaussian processes, approximation, time series, estimation of properties of the model
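A minimal sketch of joint Hurst/volatility estimation from increments, using the fBm scaling relation E|X(t+k) - X(t)|^2 = sigma^2 * k^(2H); the lag range is an assumption:

```python
# Sketch: estimate H and sigma by regressing the log second moment of
# increments against log lag.
import numpy as np

def estimate_fbm(x, max_lag=20):
    lags = np.arange(1, max_lag + 1)
    # second moment of increments at each lag
    m2 = np.array([np.mean((x[k:] - x[:-k]) ** 2) for k in lags])
    slope, intercept = np.polyfit(np.log(lags), np.log(m2), 1)
    H = slope / 2.0
    sigma = np.exp(intercept / 2.0)
    return H, sigma

# toy check on ordinary Brownian motion (H should come out near 0.5)
x = np.cumsum(np.random.default_rng(2).normal(size=10000))
print(estimate_fbm(x))
```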
Procedia PDF Downloads 377
10508 Experimental Determination of Aluminum 7075-T6 Parameters Using Stabilized Cycle Tests to Predict Thermal Ratcheting
Authors: Armin Rahmatfam, Mohammad Zehsaz, Farid Vakili Tahami, Nasser Ghassembaglou
Abstract:
In this paper, the thermal ratcheting behavior, the kinematic hardening parameters C and γ, the isotropic hardening parameters, and the combined isotropic/kinematic hardening parameters k, b, and Q have been obtained experimentally from monotonic and strain-controlled cyclic tests at room and elevated temperatures of 20°C, 100°C, and 400°C. These parameters are used in a nonlinear combined isotropic/kinematic hardening model to better describe the loading and reloading cycles in cyclic indentation as well as thermal ratcheting. For this purpose, three groups of specimens made of Aluminum 7075-T6 have been investigated. After each test, material parameters were extracted from the stable hysteresis cycles for use in combined nonlinear isotropic/kinematic hardening models. The methodology for obtaining the correct kinematic/isotropic hardening parameters is also presented.
Keywords: combined hardening model, kinematic hardening, isotropic hardening, cyclic tests
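The parameter set named above matches the standard Chaboche-type combined hardening laws; for reference (the abstract does not write them out, so the exact forms are an assumption), the usual evolution equations read:

```latex
% Armstrong-Frederick kinematic hardening (parameters C, \gamma):
\dot{\boldsymbol{\alpha}} = \tfrac{2}{3}\, C\, \dot{\boldsymbol{\varepsilon}}^{p} - \gamma\, \boldsymbol{\alpha}\, \dot{p}
% Voce isotropic hardening (parameters Q, b; k is the initial yield stress):
R(p) = Q\left(1 - e^{-b\,p}\right), \qquad \sigma_{y} = k + R(p)
```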
Procedia PDF Downloads 480
10507 Quantum Inspired Security on a Mobile Phone
Authors: Yu Qin, Wanjiaman Li
Abstract:
The widespread use of mobile electronic devices increases the complexity of mobile security. This thesis aims to provide a secure communication environment for smartphone users. Research has shown that the one-time pad is one of the most secure encryption methods and that the key distribution problem can be solved by using QKD (quantum key distribution). The objective of this project is to design an Android APP (application) to exchange random keys between mobile phones. Inspired by QKD, the developed APP uses the quick response (QR) code as a carrier to dispatch large amounts of one-time keys. Performance evaluation shows the APP allows a mobile phone to capture and decode 1800 bytes of random data in 600 ms. A continuous scanning mode was designed to improve overall transmission performance and user experience; the maximum transmission rate of this mode is around 2200 bytes/s. The omnidirectional readability and error correction capability of the QR code suit it to real-life application, and its adequate storage capacity and quick response optimize overall transmission efficiency. The security of this APP is guaranteed since the QR code is exchanged face-to-face, eliminating the risk of eavesdropping; moreover, the ID of the QR code is the only message transmitted during the whole communication. The experimental results show that this project achieves superior transmission performance, and the correlation between the transmission rate of the system and several parameters, such as QR code size, has been analyzed. In addition, some existing technologies and the main findings in the context of the project are summarized and critically compared in detail.
Keywords: one-time pad, QKD (quantum key distribution), QR code, application
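A minimal sketch of the one-time-pad step with a QR-code key carrier; the qrcode package call and the base64 wrapping of the 1800-byte key are assumptions, not the APP's actual protocol:

```python
# Sketch: one-time-pad encryption with a random key that could be carried in a
# QR code for face-to-face exchange, as in the abstract's scheme.
import os, base64

def otp(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))  # XOR is its own inverse

key = os.urandom(1800)                    # ~1800 bytes, matching the abstract
msg = b"hello over an untrusted channel"
ct = otp(msg, key)
assert otp(ct, key) == msg

# carrier: encode the key into a QR image (assumes the 'qrcode' package)
import qrcode
img = qrcode.make(base64.b64encode(key).decode("ascii"))
img.save("otp_key.png")
```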
Procedia PDF Downloads 146
10506 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach
Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta
Abstract:
Across the globe, there are a lot of chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach towards this problem involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics like accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer with an accuracy rate of 96.04% in our study.
Keywords: support vector machines, decision tree, random forest
Procedia PDF Downloads 42
10505 An Experimental Analysis of Squeeze Casting Parameters for 2017A Wrought Al Alloy
Authors: Mohamed Ben Amar, Najib Souissi, Chedly Bradai
Abstract:
A Taguchi design investigation has been made into the relationship between the ductility and process variables in a squeeze cast 2017A wrought aluminium alloy. The considered process parameters were: squeeze pressure, melt temperature and die preheating temperature. An orthogonal array (OA), main effect, signal-to-noise (S/N) ratio, and the analysis of variance (ANOVA) are employed to analyze the effect of casting parameters. The results have shown that the selected parameters significantly affect the ductility of 2017A wrought Al alloy castings. Optimal squeeze cast process parameters were provided to illustrate the proposed approach and the results were proven to be trustworthy through practical experiments.
Keywords: Taguchi method, squeeze casting, process parameters, ductility, microstructure
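A minimal sketch of the Taguchi larger-is-better S/N ratio applied to ductility replicates; the level labels and values are illustrative, not the study's measurements:

```python
# Sketch: larger-is-better signal-to-noise ratio, SN = -10 log10(mean(1/y^2)),
# computed per factor level as in a Taguchi main-effects analysis.
import numpy as np

def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# ductility replicates at three squeeze-pressure levels (illustrative values)
levels = {"P1": [8.1, 8.4, 7.9], "P2": [9.0, 9.3, 9.1], "P3": [8.6, 8.8, 8.5]}
for level, reps in levels.items():
    print(level, round(sn_larger_is_better(reps), 2), "dB")
```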
Procedia PDF Downloads 400
10504 Performance Comparison of Cooperative Banks in the EU, USA and Canada
Authors: Matěj Kuc
Abstract:
This paper compares different profitability measures of cooperative banks from two developed regions: the European Union, and the United States of America together with Canada. We created a balanced dataset of more than 200 cooperative banks covering the 2011-2016 period, performed a series of tests, and ran Random Effects estimation on the panel data. We found that American and Canadian cooperatives are more profitable in terms of return on assets (ROA) and return on equity (ROE), while there is no significant difference in net interest margin (NIM). Our results show that the North American cooperative banks have accommodated better to the current market environment.
Keywords: cooperative banking, panel data, profitability measures, random effects
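A minimal sketch of a random-intercept (random effects) panel regression, here via statsmodels' MixedLM; the covariates and simulated data are placeholders for the paper's specification:

```python
# Sketch: ROA regressed on bank-level covariates with a bank random intercept.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_banks, n_years = 200, 6
df = pd.DataFrame({
    "bank": np.repeat(np.arange(n_banks), n_years),
    "size": rng.normal(size=n_banks * n_years),
    "equity_ratio": rng.normal(size=n_banks * n_years),
})
# synthetic ROA with a bank-specific random intercept plus noise
df["roa"] = (0.5 * df["size"] - 0.3 * df["equity_ratio"]
             + np.repeat(rng.normal(scale=0.5, size=n_banks), n_years)
             + rng.normal(scale=0.2, size=len(df)))

model = sm.MixedLM.from_formula("roa ~ size + equity_ratio",
                                groups="bank", data=df)
print(model.fit().summary())
```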
Procedia PDF Downloads 113
10503 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
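A minimal sketch of the text-image fusion idea, assuming TF-IDF stands in for the NLP/LLM-derived report features and a tiny placeholder dataset:

```python
# Sketch: fuse TF-IDF report features with image-derived features, then
# classify with a Random Forest. Reports, features, and labels are toy data.
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier

reports = ["no acute cardiopulmonary abnormality",
           "right lower lobe opacity concerning for pneumonia"]
image_feats = np.array([[0.12, 0.80], [0.75, 0.31]])  # stand-in CNN features
labels = [0, 1]                                       # 0 = normal, 1 = finding

text_feats = TfidfVectorizer().fit_transform(reports)
X = hstack([text_feats, csr_matrix(image_feats)])     # fused feature matrix
clf = RandomForestClassifier(random_state=0).fit(X, labels)
print(clf.predict(X))
```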
Procedia PDF Downloads 47
10502 A Sequential Approach for Random-Effects Meta-Analysis
Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya
Abstract:
The objective of meta-analysis is to combine results from several independent studies in order to generalize and provide an evidence base for decision making. Recent studies show, however, that the magnitude of effect-size estimates reported in many areas of research has changed with publication year, which can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring effect-size estimates in meta-analysis, but they are based on statistical theory applicable to the fixed-effect model (FEM). For the random-effects model (REM), the analysis incorporates the heterogeneity variance, tau-squared, whose estimation creates complications. This paper proposes the use of the Gombay and Serban (2005) truncated CUSUM-type test with asymptotically valid critical values for sequential monitoring under the REM. Simulation results show that the test does not control the Type I error well and is not recommended; further work is required to derive an appropriate test in this important area of application.
Keywords: meta-analysis, random-effects model, sequential test, temporal changes in effect sizes
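A simplified retrospective CUSUM over standardized effect estimates, for illustration only; the truncation and the REM-adjusted critical values discussed above are not reproduced:

```python
# Sketch: plain CUSUM of standardized effect sizes ordered by publication
# year, illustrating the drift-monitoring idea on toy estimates.
import numpy as np

effects = np.array([0.42, 0.40, 0.35, 0.33, 0.25, 0.22, 0.18])  # toy estimates
z = (effects - effects.mean()) / effects.std(ddof=1)
cusum = np.cumsum(z)
# compare the scaled maximum excursion against a chosen critical value
print(np.max(np.abs(cusum)) / np.sqrt(len(z)))
```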
Procedia PDF Downloads 469
10501 Discontinuous Spacetime with Vacuum Holes as Explanation for Gravitation, Quantum Mechanics and Teleportation
Authors: Constantin Z. Leshan
Abstract:
Hole Vacuum theory is based on a discontinuous spacetime that contains vacuum holes. Vacuum holes can explain gravitation and some laws of quantum mechanics and allow teleportation of matter. All massive bodies emit a flux of holes which curves the spacetime; increasing the concentration of holes leads to length contraction and time dilation, because the holes do not have the properties of extension and duration. In the limiting case where space consists of holes only, the distance between any two points is equal to zero and time stops - outside of the Universe, the extension and duration properties do not exist. For this reason, the vacuum hole is the only particle in physics capable of describing gravitation using its own properties alone. All microscopic particles must 'jump' continually and 'vibrate' due to the appearance of holes (impassable microscopic 'walls' in space), and this is the cause of quantum behavior. Vacuum holes can explain entanglement, non-locality, the wave properties of matter, tunneling, the uncertainty principle, and so on. Particles do not have trajectories because spacetime is discontinuous and contains impassable microscopic 'walls': simple mechanical motion is impossible at small-scale distances, and it is impossible to 'trace' a straight line in a discontinuous spacetime because it contains impassable holes. Spacetime 'boils' continually due to the appearance of vacuum holes. For teleportation to be possible, we must send a body outside of the Universe by enveloping it with a closed surface consisting of vacuum holes. Since a material body cannot exist outside of the Universe, it reappears instantaneously at a random point of the Universe. Since a body disappears in one volume and reappears in another random volume without traversing the physical space between them, such a transportation method can be called teleportation (or Hole Teleportation). It is shown that Hole Teleportation does not violate causality or special relativity due to its random nature and other properties. Although Hole Teleportation has a random nature, it can be used for the colonization of extrasolar planets with the help of a method called 'random jumps': after a large number of random teleportation jumps, there is a probability that the spaceship may appear near a habitable planet. We can create vacuum holes experimentally using the method proposed by Descartes: we must remove a body from a vessel without permitting another body to occupy this volume.
Keywords: border of the Universe, causality violation, perfect isolation, quantum jumps
Procedia PDF Downloads 427
10500 Number of Necessary Parameters for Parametrization of Stabilizing Controllers for two times two RHinf Systems
Authors: Kazuyoshi Mori
Abstract:
In this paper, we consider the number of parameters needed to parametrize the stabilizing controllers for RHinf systems of size 2 × 2. Fortunately, any plant of this model admits a doubly coprime factorization, so we can use the Youla parametrization to parametrize the stabilizing controllers. However, the Youla parametrization does not by itself give the minimal number of parameters. This paper shows that the minimal number of parameters is four and that, as a result, the Youla parametrization naturally gives a parametrization of the stabilizing controllers with the minimal number of parameters.
Keywords: RHinf, parametrization, number of parameters, multi-input, multi-output systems
Procedia PDF Downloads 410
10499 Intelligent Production Machine
Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan
Abstract:
This study aims to enable production machines to automatically perceive cutting data and alter cutting parameters. The two most important parameters to be controlled in the machine control unit are the feed rate and the speeds, and here they are controlled via the sounds of the machine. The features of the optimum sound are introduced to the computer. During the process, real-time data is received and converted by Matlab software into numerical values; according to these, the feed and speeds are decreased or increased at a certain rate until the optimum sound is acquired, so that cutting proceeds with the optimum cutting parameters. During chip removal, the features of the cutting tools, the kind of cut material, the cutting parameters, and the machine used all affect various quantities. Instead of measuring parameters such as temperature, vibration, and tool wear that emerge during the cutting process, a detailed analysis of the sound emitted during cutting provides detection of the various data involved in the cutting process in a much easier and more economic way. The relation between cutting parameters and sound is thus identified.
Keywords: cutting process, sound processing, intelligent lathe, sound analysis
Procedia PDF Downloads 334
10498 Parameter Estimation of Additive Genetic and Unique Environment (AE) Model on Diabetes Mellitus Type 2 Using Bayesian Method
Authors: Andi Darmawan, Dewi Retno Sari Saputro, Purnami Widyaningsih
Abstract:
Diabetes mellitus (DM) is a chronic disease in humans that occurs if the pancreas cannot produce enough insulin hormone or the body uses insulin ineffectively, which causes an increased level of glucose in the blood, called hyperglycemia. In Indonesia, DM is a serious health concern because it can cause blindness, kidney disease, diabetic feet (gangrene), and stroke. DM can be divided into types based on the main causes: type 1, type 2, and gestational. Diabetes type 1, previously known as insulin-dependent diabetes, is due to a lack of production of insulin hormone. Diabetes type 2, previously known as non-insulin-dependent diabetes, is due to ineffective use of insulin, while gestational diabetes is hyperglycemia found during pregnancy. The type most commonly found in patients is DM type 2. The main factors in this disease are genetics (A) and lifestyle (E), and a disease with these two factors can be modeled with the additive genetic and unique environment (AE) model. This article discusses the parameter estimation of the AE model using the Bayesian method and a simulation of the inheritance of the character from parent to offspring. In the AE model, there are a response variable, predictor variables, and parameters representing the population under study. The population can be measured through a random sample; the response and predictor variables can be determined from the sample, while the parameters are unknown and must be estimated from the sample. Estimates of the AE model parameters were obtained from a joint posterior distribution. A simulation was conducted to obtain the genetic variance and lifestyle variance; the results are 0.3600 for the genetic variance and 0.0899 for the lifestyle variance. Therefore, the variance of the genetic factor in DM type 2 is greater than that of lifestyle.
Keywords: AE model, Bayesian method, diabetes mellitus type 2, genetic, life style
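A worked example with the variance components reported above: in an AE model, the share of total variance attributable to the genetic factor is sigma_A^2 / (sigma_A^2 + sigma_E^2):

```python
# Worked example: genetic share of variance from the abstract's simulation.
sigma_A2 = 0.3600   # genetic variance (A)
sigma_E2 = 0.0899   # lifestyle / unique-environment variance (E)
h2 = sigma_A2 / (sigma_A2 + sigma_E2)
print(round(h2, 3))   # ~0.800, i.e., genetics dominates in this AE model
```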
Procedia PDF Downloads 285
10497 Analysis of Overall Thermo-Elastic Properties of Random Particulate Nanocomposites with Various Interphase Models
Authors: Lidiia Nazarenko, Henryk Stolarski, Holm Altenbach
Abstract:
In the paper, a (hierarchical) approach to the analysis of the thermo-elastic properties of random composites with interphases is outlined and illustrated. It is based on a statistical homogenization method - the method of conditional moments - combined with the recently introduced notion of the energy-equivalent inhomogeneity, which is extended here to include thermal effects. After exposition of the general principles, the approach is applied to the investigation of the effective thermo-elastic properties of a material with randomly distributed nanoparticles. The basic idea of the equivalent inhomogeneity is to replace the inhomogeneity and its surrounding interphase by a single equivalent inhomogeneity of constant stiffness tensor and coefficient of thermal expansion, combining the thermal and elastic properties of both. The equivalent inhomogeneity is then perfectly bonded to the matrix, which allows composites with interphases to be analyzed using techniques devised for problems without interphases. From the mechanical viewpoint, the definition of the equivalent inhomogeneity is based on Hill's energy equivalence principle, applied to the problem consisting only of the original inhomogeneity and its interphase. It is more general than definitions proposed in the past in that, conceptually and practically, it allows inhomogeneities of various shapes and various models of interphases to be considered. This is illustrated for spherical particles with two models of interphases, the Gurtin-Murdoch material surface model and the spring layer model. The resulting equivalent inhomogeneities are subsequently used to determine the effective thermo-elastic properties of randomly distributed particulate composites. The effective stiffness tensor and coefficient of thermal expansion of the material with the so-defined equivalent inhomogeneities are determined by the method of conditional moments. Closed-form expressions for the effective thermo-elastic parameters of a composite consisting of a matrix and randomly distributed spherical inhomogeneities are derived for the bulk and shear moduli as well as for the coefficient of thermal expansion. The dependence of the effective parameters on the interphase properties is included in the resulting expressions, exhibiting analytically the nature of the size effects in nanomaterials. As a numerical example, an epoxy matrix with randomly distributed spherical glass particles is investigated. The dependence of the effective bulk and shear moduli, as well as of the effective thermal expansion coefficient, on the particle volume fraction (for different radii of nanoparticles) and on the radius of the nanoparticles (for fixed volume fraction) is, for the different interphase models, compared to and discussed in the context of other theoretical predictions. Possible applications of the proposed approach to short-fiber composites with various types of interphases are discussed.
Keywords: effective properties, energy equivalence, Gurtin-Murdoch surface model, interphase, random composites, spherical equivalent inhomogeneity, spring layer model
Procedia PDF Downloads 186
10496 Taguchi Method for Analyzing a Flexible Integrated Logistics Network
Authors: E. Behmanesh, J. Pannek
Abstract:
Logistics network design is known as one of the strategic decision problems. As such problems belong to the category of NP-hard problems, traditional approaches fail to find an optimal solution in a short time. In this study, we incorporate reverse flow through an integrated design of a forward/reverse supply chain network, formulated as a mixed integer linear program. This integrated, multi-stage model is enriched by three different delivery paths, which makes the problem more complex. To tackle such an NP-hard problem, a memetic algorithm based on a revised random-path direct encoding method is adopted as the solution methodology. The algorithm has several parameters whose values need to be investigated to reveal the best performance. In this regard, the Taguchi method is adapted to identify the optimum operating condition of the proposed memetic algorithm and improve the results. Four factors are considered, namely population size, crossover rate, local search iterations, and number of iterations. Analyzing these parameters and the resulting improvements is the focus of this research.
Keywords: integrated logistics network, flexible path, memetic algorithm, Taguchi method
Procedia PDF Downloads 191
10495 Secure Watermarking not at the Cost of Low Robustness
Authors: Jian Cao
Abstract:
This paper describes a novel watermarking technique which we call random direction embedding (RDE) watermarking. Unlike traditional watermarking techniques, the watermark energy after RDE embedding is not concentrated along a fixed direction, providing security against the traditional unauthorized watermark removal attack. In addition, the experimental results show that, compared with an existing secure watermarking scheme, namely natural watermarking (NW), RDE watermarking gains a significant improvement in robustness. The security of RDE watermarking is thus not obtained at the cost of low robustness; it can even be more robust than traditional spread spectrum watermarking, which has been shown to be very insecure.
Keywords: robustness, spread spectrum watermarking, watermarking security, random direction embedding (RDE)
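A minimal sketch of the random-direction idea, assuming additive embedding along a secret random unit vector with strength alpha; the paper's exact embedding and detection rules are not reproduced:

```python
# Sketch: spread watermark energy along a fresh random direction per embedding,
# in contrast to classic spread spectrum's fixed carrier.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=1024)            # host signal (e.g., transform coefficients)
d = rng.normal(size=1024)
d /= np.linalg.norm(d)               # secret random unit direction
alpha = 2.0                          # assumed embedding strength
y = x + alpha * d                    # watermarked signal

# informed detection: correlate with the secret direction
print(float(y @ d), float(x @ d))    # watermarked score sits ~alpha above host
```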
Procedia PDF Downloads 385
10494 A Study of Non Linear Partial Differential Equation with Random Initial Condition
Authors: Ayaz Ahmad
Abstract:
In this work, we present the effect of noise on the solution of a partial differential equation (PDE) in three different settings. We first consider random initial conditions for two nonlinear dispersive PDEs, the nonlinear Schrodinger equation and the Korteweg-de Vries equation, and analyse their effect on some special solutions, the soliton solutions. The second case concerns a linear PDE, the wave equation, where random initial conditions allow a substantial decrease in the computational and data storage costs of an algorithm for solving the inverse problem based on boundary measurements of the solution. The third example is the linear transport equation with a singular drift term, for which we show that the addition of a multiplicative noise term forbids the blow-up of solutions under a very weak hypothesis, whereas in the deterministic case a solution blows up in finite time. Here we consider the problem of wave propagation, which is modelled by a nonlinear dispersive equation with a noisy initial condition; as observed, noise can also be introduced directly into the equations.
Keywords: drift term, finite time blow up, inverse problem, soliton solution
Procedia PDF Downloads 216
10493 Multiscale Modelization of Multilayered Bi-Dimensional Soils
Authors: I. Hosni, L. Bennaceur Farah, N. Saber, R Bennaceur
Abstract:
Soil moisture content is a key variable in many environmental sciences. Even though it represents a small proportion of the liquid freshwater on Earth, it modulates interactions between the land surface and the atmosphere, thereby influencing climate and weather. Accurate modeling of the above processes depends on the ability to provide a proper spatial characterization of soil moisture. Measuring soil moisture content allows assessment of soil water resources in hydrology and agronomy. The second parameter interacting with the radar signal is the geometric structure of the soil. Most traditional electromagnetic models consider natural surfaces as single-scale, zero-mean, stationary Gaussian random processes, with roughness behavior characterized by statistical parameters like the root mean square (RMS) height and the correlation length. The main problem is that the agreement between experimental measurements and theoretical values is usually poor due to the large variability of the correlation function; as a consequence, backscattering models have often failed to predict backscattering correctly. In this study, surfaces are considered as band-limited fractal random processes corresponding to a superposition of a finite number of one-dimensional Gaussian processes, each having its own spatial scale. Multiscale roughness is characterized by two parameters: one proportional to the RMS height, the other related to the fractal dimension. Soil moisture is related to the complex dielectric constant. This multiscale description has been adapted to two-dimensional profiles using the bi-dimensional wavelet transform and the Mallat algorithm to describe natural surfaces more correctly. We characterize the soil surfaces and sub-surfaces by a three-layer geo-electrical model. The upper layer is described by its dielectric constant, thickness, a multiscale bi-dimensional surface roughness model (using the wavelet transform and the Mallat algorithm), and volume scattering parameters. The lower layer is divided into three fictive layers separated by an assumed plane interface; these three layers are modeled by an effective medium characterized by an apparent effective dielectric constant that takes into account the presence of air pockets in the soil. We have adopted a 2D multiscale three-layer small perturbation model (SPM) including, first, air pockets in the soil sub-structure and, then, a vegetation canopy in the soil surface structure, in order to simulate the radar backscattering. A sensitivity analysis of the dependence of the backscattering coefficient on the multiscale roughness and soil moisture has been performed. We then propose to modify the dielectric constant of the multilayer medium so as to account for the different moisture values of each layer in the soil. A sensitivity analysis of the backscattering coefficient, including the air pockets in the volume structure, with respect to the multiscale roughness parameters and the apparent dielectric constant was carried out. Finally, we propose to study the behavior of the radar backscattering coefficient over a soil having a vegetation layer in its surface structure.
Keywords: multiscale, bidimensional, wavelets, backscattering, multilayer, SPM, air pockets
Procedia PDF Downloads 125
10492 Machine Learning Techniques for Estimating Ground Motion Parameters
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially in subsequent risk assessment of different types of structures. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques in ground motion prediction, namely the Artificial Neural Network, Random Forest, and Support Vector Machine. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events of magnitude 3 to 5.8, recorded over a hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The choice of database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for them. Accuracy of the models in predicting intensity measures, generalization capability for future data, and usability of the models are discussed in the evaluation process. The results indicate the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, with Random Forest in particular outperforming the other algorithms. However, the conventional method is a better tool when only limited data are available.
Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine
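A minimal sketch of the Random Forest variant on synthetic records standing in for the Oklahoma/Kansas/Texas database; the random-effects adjustment described above is omitted:

```python
# Sketch: Random Forest ground-motion model predicting ln(PGA) from magnitude,
# hypocentral distance, and a site term. The synthetic target is illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 4528
M = rng.uniform(3.0, 5.8, n)           # magnitude
R = rng.uniform(4.0, 500.0, n)         # hypocentral distance, km
vs30 = rng.uniform(200.0, 900.0, n)    # site stiffness proxy
ln_pga = 1.2 * M - 1.6 * np.log(R) - 0.002 * vs30 + rng.normal(0, 0.6, n)

X = np.column_stack([M, np.log(R), vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, ln_pga, random_state=5)
rf = RandomForestRegressor(n_estimators=300, random_state=5).fit(X_tr, y_tr)
print("R^2 on held-out records:", round(rf.score(X_te, y_te), 3))
```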
Procedia PDF Downloads 123
10491 Efficient Signcryption Scheme with Provable Security for Smart Card
Authors: Jayaprakash Kar, Daniyal M. Alghazzawi
Abstract:
The article proposes a novel construction of a signcryption scheme with provable security that is well suited to implementation on smart cards. It is secure in the random oracle model, and its security relies on the Decisional Bilinear Diffie-Hellman Problem. The proposed scheme is secure against adaptive chosen ciphertext attack (indistinguishability) and adaptive chosen message attack (unforgeability), and it is inspired by zero-knowledge proofs. The two most important security goals for smart cards are confidentiality and authenticity; in this scheme, these functions are performed in one logical step at low computational cost.
Keywords: random oracle, provable security, unforgeability, smart card
Procedia PDF Downloads 593
10490 Seismic Hazard Assessment of Tehran
Authors: Dorna Kargar, Mehrasa Masih
Abstract:
Due to its special geological and geographical conditions, Iran has always been exposed to various natural hazards. The earthquake is a natural hazard of random nature that can cause significant financial damage and casualties; it is a serious threat, especially in areas with active faults. Therefore, considering the population density in some parts of the country, locating and zoning high-risk areas is necessary and significant. In the present study, a seismic hazard assessment via probabilistic and deterministic methods has been carried out for Tehran, the capital of Iran, which is located in the Alborz-Azerbaijan province. The seismicity study covers a range of 200 km from the north of Tehran (X = 35.74° and Y = 51.37° in the LAT-LONG coordinate system) to identify the seismic sources and seismicity parameters of the study region. Geological maps at the scale of 1:250,000 are used to identify the seismic sources. We use Kijko-Sellevoll's method (1992) to estimate the seismicity parameters: the maximum likelihood estimation of the earthquake hazard parameters (maximum regional magnitude Mmax, activity rate λ, and the Gutenberg-Richter parameter b) from incomplete data files is extended to the case of uncertain magnitude values. By combining the seismicity and seismotectonic studies of the site, the acceleration that may occur with a specified probability during the useful life of a structure is calculated with probabilistic and deterministic methods. Applying the results of the seismicity and seismotectonic studies and assigning proper weights to the attenuation relationships used, the maximum horizontal and vertical accelerations for return periods of 50, 475, 950, and 2475 years are calculated. The horizontal peak ground accelerations on the seismic bedrock for the 50, 475, 950, and 2475-year return periods are 0.12g, 0.30g, 0.37g, and 0.50g, and the corresponding vertical peak ground accelerations are 0.08g, 0.21g, 0.27g, and 0.36g.
Keywords: peak ground acceleration, probabilistic and deterministic, seismic hazard assessment, seismicity parameters
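A minimal sketch of seismicity-parameter estimation on a toy catalogue, using Aki's classical maximum-likelihood b-value rather than the full Kijko-Sellevoll treatment of incomplete catalogues with uncertain magnitudes:

```python
# Sketch: Gutenberg-Richter b-value (Aki 1965 MLE) and activity rate lambda
# from a complete catalogue above Mmin. Magnitudes and duration are toy data.
import numpy as np

mags = np.array([3.1, 3.4, 3.2, 4.0, 3.6, 3.3, 4.5, 3.8, 3.2, 5.1])
m_min = 3.0
b = np.log10(np.e) / (mags.mean() - m_min)   # Aki's MLE of the G-R b-value
years = 25.0
lam = len(mags) / years                      # activity rate above Mmin
print(f"b = {b:.2f}, lambda = {lam:.2f} events/yr")
```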
Procedia PDF Downloads 71
10489 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation
Authors: Sopheak Sorn, Kwok Yip Szeto
Abstract:
Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), the reliability of the system becomes a network design problem, an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction, with each cell containing exactly one point of the pattern, and compute the reliability of the Voronoi diagram's dual, i.e., the Delaunay graph. We further investigate the communicability of the Delaunay network. We find a positive correlation between the homogeneity of a Delaunay network's degree distribution and its reliability, and a negative correlation with its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is then used to optimize a Delaunay network toward the optimum geometric mean of communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi
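A minimal sketch of building the Delaunay graph of a random planar point set and scoring its communicability with networkx; all-terminal reliability itself is NP-hard and is not computed here:

```python
# Sketch: Delaunay graph of a random 2D point set plus a communicability score.
import numpy as np
import networkx as nx
from scipy.spatial import Delaunay

pts = np.random.default_rng(6).random((50, 2))
tri = Delaunay(pts)
G = nx.Graph()
for simplex in tri.simplices:                # each triangle contributes 3 edges
    for i in range(3):
        G.add_edge(int(simplex[i]), int(simplex[(i + 1) % 3]))

comm = nx.communicability(G)                 # pairwise communicability (exp(A))
total_comm = sum(sum(row.values()) for row in comm.values())
print(G.number_of_nodes(), G.number_of_edges(), round(total_comm, 1))
```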
Procedia PDF Downloads 420
10488 Experimental Study on Improving the Engineering Properties of Sand Dunes Using Random Fibers-Geogrid Reinforcement
Authors: Adel M. Belal, Sameh Abu El-Soud, Mariam Farid
Abstract:
This study presents the effect of reinforcement inclusions (fibers-geogrids) on the bearing capacity of fine sand under strip footings. Experimental model tests were carried out using rectangular plates [(10 cm x 38 cm), (7.5 cm x 38 cm), and (12.5 cm x 38 cm)] with geogrids and randomly distributed reinforcing fibers. The width and depth of the geogrid were varied to determine their effects on the engineering properties of treated, poorly graded fine sand. Laboratory model test results for the ultimate stresses and settlement of a rigid strip foundation supported by single- and multi-layered fiber-geogrid-reinforced sand are presented. The number of geogrid layers was varied between 1 and 4. The effects of the depth of the first geogrid reinforcement, the spacing between reinforcements, and the reinforcement length on the bearing capacity are investigated through the experimental program. Results show that the use of flexible random fibers at a content of 0.125% by weight of the treated sand dunes, with 3 geogrid reinforcement layers, u/B = 0.25, and L/B = 7.5, yields a significant increase in the bearing capacity of the proposed system.
Keywords: earth reinforcement, geogrid, random fiber, reinforced soil
Procedia PDF Downloads 312
10487 Comparison of Multivariate Adaptive Regression Splines and Random Forest Regression in Predicting Forced Expiratory Volume in One Second
Authors: P. V. Pramila , V. Mahesh
Abstract:
Pulmonary function tests are important non-invasive diagnostic tests for assessing respiratory impairments, providing quantifiable measures of lung function. Spirometry is the most frequently used measure of lung function and plays an essential role in the diagnosis and management of pulmonary diseases. However, the test requires considerable patient effort and cooperation, markedly related to the age of patients, resulting in incomplete data sets. This paper presents a nonlinear model built using multivariate adaptive regression splines (MARS) and a Random Forest regression model to predict the missing spirometric features. Random Forest-based feature selection is used to enhance both the generalization capability and the interpretability of the model. In the present study, flow-volume data were recorded for N = 198 subjects. The ranked feature importance indices calculated by the Random Forest model show that the spirometric features FVC, FEF 25, PEF, FEF 25-75, and FEF 50, and the demographic parameter height, are the important descriptors. A comparison of the performance of both models shows that the prediction ability of MARS with the top two ranked features, namely FVC and FEF 25, is higher, yielding a model fit of R² = 0.96 and R² = 0.99 for normal and abnormal subjects, respectively. Root mean square error analysis of the RF model and the MARS model also shows that the latter is capable of predicting the missing values of FEV1 with a notably lower error value of 0.0191 (normal subjects) and 0.0106 (abnormal subjects). It is concluded that combining feature selection with a prediction model provides a minimal subset of predominant features to train the model, yielding better prediction performance. This analysis can assist clinicians with an intelligent support system in medical diagnosis and the improvement of clinical care.
Keywords: FEV, multivariate adaptive regression splines, pulmonary function test, random forest
Procedia PDF Downloads 311
10486 Interpretation and Prediction of Geotechnical Soil Parameters Using Ensemble Machine Learning
Authors: Goudjil kamel, Boukhatem Ghania, Jlailia Djihene
Abstract:
This paper delves into the development of a desktop application designed to calculate soil bearing capacity and predict limit pressure. Drawing from an extensive review of existing methodologies, the study examines the various approaches employed in soil bearing capacity calculations, elucidating their theoretical foundations and practical applications. It also explores the burgeoning intersection of artificial intelligence (AI) and geotechnical engineering, underscoring the transformative potential of AI-driven solutions in enhancing predictive accuracy and efficiency. Central to the research is the utilization of machine learning techniques, including Artificial Neural Networks (ANN), XGBoost, and Random Forest, for predictive modeling. Through comprehensive experimentation and rigorous analysis, the efficacy and performance of each method are evaluated, with XGBoost emerging as the preeminent algorithm, showcasing predictive capabilities superior to its counterparts. The study culminates in a nuanced understanding of the intricate dynamics at play in geotechnical analysis, offering valuable insights into optimizing soil bearing capacity calculations and limit pressure predictions. By harnessing advanced computational techniques and AI-driven algorithms, the paper promises enhanced precision and reliability in civil engineering projects.
Keywords: limit pressure of soil, xgboost, random forest, bearing capacity
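A minimal sketch of the XGBoost regressor the study favours, with illustrative geotechnical inputs standing in for the real dataset:

```python
# Sketch: XGBoost regression for limit pressure on synthetic placeholder data.
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
X = rng.normal(size=(400, 5))   # e.g., depth, density, water content, N-SPT, fines %
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.3, 400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=7)
model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("R^2:", round(model.score(X_te, y_te), 3))
```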
Procedia PDF Downloads 25
10485 Manufacturing Anomaly Detection Using a Combination of Gated Recurrent Unit Network and Random Forest Algorithm
Authors: Atinkut Atinafu Yilma, Eyob Messele Sefene
Abstract:
Anomaly detection is one of the essential mechanisms for controlling and reducing production loss, especially in today's smart manufacturing. Quick anomaly detection aids in reducing the cost of production by minimizing the possibility of producing defective products. However, developing an anomaly detection model that can rapidly detect a production change is challenging. This paper proposes a Gated Recurrent Unit (GRU) network combined with a Random Forest (RF) to quickly detect anomalies in the production process in real time. The GRU is used as a feature detector and the RF as a classifier on the features produced by the GRU. The model was tested on various synthetic and real-world datasets against benchmark methods. The results show that the proposed GRU-RF outperforms the benchmark methods with the shortest time taken to detect anomalies in the production process. Based on this investigation, the proposed model can eliminate or reduce unnecessary production costs and bring a competitive advantage to manufacturing industries.
Keywords: anomaly detection, multivariate time series data, smart manufacturing, gated recurrent unit network, random forest
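A minimal sketch of the GRU-RF pipeline: a GRU encoder turns each multivariate window into a fixed-length feature vector for a Random Forest classifier. Shapes and the untrained encoder are placeholders; in the paper, the GRU is trained as a feature detector first:

```python
# Sketch: GRU feature extraction feeding a Random Forest classifier.
import numpy as np
import tensorflow as tf
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(8)
X = rng.normal(size=(300, 50, 4)).astype("float32")  # 300 windows, 50 steps, 4 sensors
y = (rng.random(300) < 0.1).astype(int)              # 1 = anomalous window

encoder = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(50, 4)),
    tf.keras.layers.GRU(16),                         # 16-dim feature per window
])
features = encoder.predict(X, verbose=0)

clf = RandomForestClassifier(random_state=8).fit(features, y)
print("train accuracy:", clf.score(features, y))
```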
Procedia PDF Downloads 122
10484 Analysis of Performance of 3T1D Dynamic Random-Access Memory Cell
Authors: Nawang Chhunid, Gagnesh Kumar
Abstract:
On-chip memories consume a significant portion of the overall die area and power in modern microprocessors. On-chip caches depend on Static Random-Access Memory (SRAM) cells, and technology scaling continues as per Moore's law. Unfortunately, scaling affects stability, performance, and leakage power, which will become major problems for future SRAMs in aggressive nanoscale technologies due to increasing device mismatch and variations. The 3T1D Dynamic Random-Access Memory (DRAM) cell is a non-destructive-read DRAM cell with three transistors and a gated diode. In the 3T1D DRAM cell, the gated diode (D1) acts both as a storage device and as an amplifier, which leads to fast read access. Due to its high tolerance to process variation, high density, and low memory cost compared to the 6T SRAM cell, it is widely used in advanced microprocessors for on-chip data and program memory. In the present paper, it is shown that the 3T1D DRAM cell can perform better in terms of read access time than the 6T, 4T, and 3T SRAM cells, respectively.
Keywords: DRAM cell, read access time, retention time, average power dissipation
Procedia PDF Downloads 313
10483 Manufacture and Characterization of Poly (Tri Methylene Terephthalate) Nanofibers by Electrospinning
Authors: Omid Saligheh
Abstract:
Poly (tri methylene terephthalate) (PTT) nanofibers were prepared by electrospinning and directly deposited in the form of a random fiber web. The effect of changing processing parameters, such as solution concentration and electrospinning voltage, on the morphology of the electrospun PTT nanofibers was investigated with scanning electron microscopy (SEM). The electrospun fiber diameter increased with rising concentration and decreased with increasing electrospinning voltage. The thermal and mechanical properties of the electrospun fibers were characterized by DSC and tensile testing, respectively.
Keywords: poly tri methylene terephthalate, electrospinning, morphology, thermal behavior, mechanical properties
Procedia PDF Downloads 87