Search results for: k0-based method
16078 Physics-Informed Neural Network for Predicting Strain Demand in Inelastic Pipes under Ground Movement with Geometric and Soil Resistance Nonlinearities
Authors: Pouya Taraghi, Yong Li, Nader Yoosef-Ghodsi, Muntaseer Kainat, Samer Adeeb
Abstract:
Buried pipelines play a crucial role in the transportation of energy products such as oil, gas, and various chemical fluids, ensuring their efficient and safe distribution. However, these pipelines are often susceptible to ground movements caused by geohazards like landslides, fault movements, lateral spreading, and more. Such ground movements can lead to strain-induced failures in pipes, resulting in leaks or explosions, leading to fires, financial losses, environmental contamination, and even loss of human life. Therefore, it is essential to study how buried pipelines respond when traversing geohazard-prone areas to assess the potential impact of ground movement on pipeline design. As such, this study introduces an approach called the Physics-Informed Neural Network (PINN) to predict the strain demand in inelastic pipes subjected to permanent ground displacement (PGD). This method uses a deep learning framework that does not require training data and makes it feasible to consider more realistic assumptions regarding existing nonlinearities. It leverages the underlying physics described by differential equations to approximate the solution. The study analyzes various scenarios involving different geohazard types, PGD values, and crossing angles, comparing the predictions with results obtained from finite element methods. The findings demonstrate a good agreement between the results of the proposed method and the finite element method, highlighting its potential as a simulation-free, data-free, and meshless alternative. This study paves the way for further advancements, such as the simulation-free reliability assessment of pipes subjected to PGD, as part of ongoing research that leverages the proposed method.
Keywords: strain demand, inelastic pipe, permanent ground displacement, machine learning, physics-informed neural network
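As a rough illustration of the physics-informed training loop (not the authors' inelastic, nonlinear-soil model), the sketch below trains a small network against the residual of a linear beam-on-elastic-foundation surrogate, EI·u'''' + k·u = q, with simply supported ends; the constants, network size, and the linear ODE itself are illustrative assumptions only.

```python
# Minimal PINN sketch: linear beam-on-elastic-foundation surrogate
# EI * u''''(x) + k * u(x) = q   on [0, 1], simply supported ends.
# Illustrative only; the paper's inelastic pipe / nonlinear-soil model is far richer.
import torch

torch.manual_seed(0)
EI, k, q = 1.0, 10.0, 1.0          # assumed non-dimensional constants

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def derivatives(x):
    """Return u and its first four derivatives via autograd."""
    grads = [net(x)]
    for _ in range(4):
        g = torch.autograd.grad(grads[-1].sum(), x, create_graph=True)[0]
        grads.append(g)
    return grads  # [u, u', u'', u''', u'''']

x_col = torch.rand(200, 1, requires_grad=True)                 # collocation points
x_bc = torch.tensor([[0.0], [1.0]], requires_grad=True)        # boundary points

for step in range(5000):
    opt.zero_grad()
    u, _, _, _, u4 = derivatives(x_col)
    residual = EI * u4 + k * u - q                             # physics residual
    ub, _, ub2, _, _ = derivatives(x_bc)                       # u and u'' at the ends
    loss = (residual**2).mean() + (ub**2).mean() + (ub2**2).mean()
    loss.backward()
    opt.step()
```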
Procedia PDF Downloads 61
16077 Particle Swarm Optimization Based Method for Minimum Initial Marking in Labeled Petri Nets
Authors: Hichem Kmimech, Achref Jabeur Telmoudi, Lotfi Nabli
Abstract:
The estimation of the minimum initial marking (MIM) is a crucial problem in labeled Petri nets. When multiple choices exist, the search for the initial marking becomes a resource-minimization problem with two constraints. The first requires that the firing sequence be legal on the initial marking with respect to the firing vector; the second requires that the total number of tokens be minimal. In this article, the MIM problem is solved by the meta-heuristic particle swarm optimization (PSO). The proposed approach exploits the strengths of PSO to satisfy the two constraints and to find all possible combinations of minimum initial markings with the best computing time. This method, more efficient than conventional ones, has an excellent impact on the resolution of the MIM problem. We demonstrate the effectiveness of our approach through a set of definitions, lemmas, and examples.
Keywords: marking, production system, labeled Petri nets, particle swarm optimization
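A minimal, generic PSO loop is sketched below; the encoding of candidate initial markings and the legality/token-count fitness terms the paper actually uses are problem-specific and only represented by a toy placeholder here.

```python
# Generic particle swarm optimization sketch (minimization).
# The mapping from particles to candidate initial markings, and the penalty for
# illegal firing sequences, are problem-specific and not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

def pso(fitness, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(0, 10)):
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # positions (candidate markings)
    v = np.zeros_like(x)                               # velocities
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy fitness: total token count plus a (zero) placeholder penalty for infeasibility.
toy_fitness = lambda m: np.round(m).sum() + 0.0
best_marking, best_value = pso(toy_fitness, dim=5)
```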
Procedia PDF Downloads 180
16076 User-Awareness from Eye Line Tracing During Specification Writing to Improve Specification Quality
Authors: Yoshinori Wakatake
Abstract:
Many defects discovered after the release of software packages are caused by the omission of necessary test items from test specifications. Poor test specifications are currently detected by manual review, which imposes a high human load. Preventing such omissions depends on the end-user awareness of the test specification writers: if test specifications were written while envisioning the behavior of end-users, the number of omitted test items would be greatly reduced. This paper focuses on the observation that writers who achieve this differ from those who do not, not only in the richness of their descriptions but also in their gaze information. It proposes a method to estimate the degree of user-awareness of writers by analyzing their gaze information while they write test specifications. We conduct an experiment to obtain the gaze information of writers of test specifications, and the specifications are then automatically classified using this gaze information. A Random Forest model is constructed for the classification, and the classification is highly accurate. The explanatory variables that turn out to be important reveal the behavioral features that distinguish high-quality test specifications from the others; they are confirmed to be pupil diameter and the number and duration of blinks. The paper also investigates the automatically classified test specifications to discuss the writing characteristics of each quality level. The proposed method enables the automatic classification of test specifications and helps prevent test item omissions by revealing the writing features that high-quality test specifications should satisfy.
Keywords: blink, eye tracking, gaze information, pupil diameter, quality improvement, specification document, user-awareness
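A small sketch of the classification stage, assuming scikit-learn and synthetic gaze features (pupil diameter, blink count, blink duration) in place of the real eye-tracking data:

```python
# Sketch: classifying specification quality from gaze features with a Random Forest.
# The feature values and labels below are synthetic placeholders; the paper's real
# features are pupil diameter and the number/duration of blinks from an eye tracker.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.normal(3.5, 0.5, n),    # pupil diameter (mm)
    rng.poisson(15, n),         # number of blinks
    rng.normal(0.3, 0.1, n),    # mean blink duration (s)
])
y = (X[:, 0] + 0.02 * X[:, 1] + rng.normal(0, 0.3, n) > 3.8).astype(int)  # synthetic labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("feature importances:", clf.feature_importances_)
```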
Procedia PDF Downloads 65
16075 Stabilization of Transition Metal Chromite Nanoparticles in Silica Matrix
Authors: J. Plocek, P. Holec, S. Kubickova, B. Pacakova, I. Matulkova, A. Mantlikova, I. Němec, D. Niznansky, J. Vejpravova
Abstract:
This article presents a summary of the preparation and characterization of zinc, copper, cadmium and cobalt chromite nanocrystals embedded in an amorphous silica matrix. The ZnCr2O4/SiO2, CuCr2O4/SiO2, CdCr2O4/SiO2 and CoCr2O4/SiO2 nanocomposites were prepared by a conventional sol-gel method under acid catalysis. The final heat treatment of the samples was carried out at temperatures in the range of 900–1200 °C to adjust the phase composition and the crystallite size, respectively. The resulting samples were characterized by powder X-ray diffraction (PXRD), high-resolution transmission electron microscopy (HRTEM), Raman/FTIR spectroscopy and magnetic measurements. Formation of the spinel phase was confirmed in all samples. The average size of the nanocrystals was determined from the PXRD data and by direct particle size observation on HRTEM, and both results were correlated. The mean particle size (reviewed by HRTEM) was in the range from ~4 to 46 nm. The results showed that the sol-gel method can be effectively used for the preparation of spinel chromite nanoparticles embedded in a silica matrix, and that the particle size is driven by the type of the A2+ cation in the spinel structure and the temperature of the final heat treatment. The magnetic properties of the nanocrystals were found to be only moderately modified in comparison to the bulk phases.
Keywords: sol-gel method, nanocomposites, Rietveld refinement, Raman spectroscopy, Fourier transform infrared spectroscopy, magnetic properties, spinel, chromite
Procedia PDF Downloads 216
16074 Approximation of Geodesics on Meshes with Implementation in Rhinoceros Software
Authors: Marian Sagat, Mariana Remesikova
Abstract:
In civil engineering, there is the problem of how to industrially produce tensile membrane structures that are non-developable surfaces. Non-developable surfaces can only be developed with a certain error, and we want to minimize this error. To that end, the non-developable surfaces are cut into plates along geodesic curves. We propose a numerical algorithm for finding approximations of open geodesics on meshes and surfaces based on geodesic curvature flow. For practical reasons, it is important to automate the choice of the time step. We propose a method for the automatic setting of the time step based on the diagonal dominance criterion for the matrix of the linear system obtained by discretization of our partial differential equation model; practical experiments show the reliability of this method. Because the model is approximated by a numerical method based on classical derivatives, it is necessary to overcome obstacles that occur for meshes with sharp corners. We solve this problem for a large family of meshes with sharp corners via special rotations, which can be seen as a partial unfolding of the mesh. In practical applications, it is required that the approximation of the geodesic has its vertices only on the edges of the mesh; this problem is solved by a specially designed point tracking algorithm. We also partially solve the problem of finding geodesics on meshes with holes. We implemented the whole algorithm in Rhinoceros (commercial 3D computer graphics and computer-aided design software), as a C# assembly library for Grasshopper, a plugin for Rhinoceros.
Keywords: geodesic, geodesic curvature flow, mesh, Rhinoceros software
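The automatic time-step idea can be sketched as follows, assuming a hypothetical assemble_matrix(dt) callback that returns the dense system matrix of the discretized flow for a given step size (the actual assembly from the mesh and the PDE is not shown):

```python
# Sketch of the automatic time-step selection: shrink dt until the system matrix of
# the discretized curvature-flow step is strictly diagonally dominant.
import numpy as np

def is_diagonally_dominant(A):
    d = np.abs(np.diag(A))
    off = np.abs(A).sum(axis=1) - d
    return np.all(d > off)

def choose_time_step(assemble_matrix, dt=1.0, shrink=0.5, min_dt=1e-8):
    """assemble_matrix(dt) is an assumed callback returning the dense system matrix."""
    while dt > min_dt and not is_diagonally_dominant(assemble_matrix(dt)):
        dt *= shrink
    return dt
```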
Procedia PDF Downloads 153
16073 Phytoremediation Aeration System by Using Water Lettuce (Pistia Stratiotes I) Based on Zero Waste to Reduce the Impact of Industrial Liquid Waste in Jember, Indonesia
Authors: Wahyu Eko Diyanto, Amalia Dyah Arumsari, Ulfatu Layinatinnahdiyah Arrosyadi
Abstract:
The tofu industry is a local food industry that can be competitive in the ASEAN Economic Community (AEC). However, many tofu producers focus only on making a good-quality product without considering the environmental impact of the production process. Daily tofu production requires 15 kg of input and generates 652.5 liters of liquid waste. This liquid waste is discharged directly into waterways, even though tofu liquid waste contains organic compounds that decompose quickly and can therefore pollute the waterways. In addition, tofu liquid waste is high in biological oxygen demand (BOD), chemical oxygen demand (COD), total suspended solids (TSS), nitrogen and phosphorus. This research aims to create an effective and efficient method of handling the liquid waste using water lettuce. The work combines observation and experiment, applying the phytoremediation method to tofu liquid waste using water lettuce with added aeration to reduce the concentration of contaminants. The results were analyzed against the waste quality standard parameters of SNI (National Standardization Agency of Indonesia). The removal efficiencies and average parameter values of the tofu liquid waste were: pH 3.42% (from 4.0 to 3.3), COD 76.13% (from 3579 ppm to 854 ppm), BOD 55% (from 11600 ppm to 5242 ppm), TSS 93.6% (from 3174 ppm to 203 ppm), turbidity 64.8% (from 977 NTU to 1013 NTU), and temperature 36 °C (from 45 °C to 40 °C). These values indicate that the effluent is safe to be discharged into waterways. The water lettuce and tofu liquid waste resulting from the phytoremediation will be used for biogas as renewable energy.
Keywords: aeration, phytoremediation, water lettuce, tofu liquid waste
Procedia PDF Downloads 382
16072 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images
Authors: U. Datta
Abstract:
The main objective of this study is to find a suitable approach to monitor land infrastructure growth over a period of time using multispectral satellite images. Bi-temporal change detection methods are unable to indicate continuous change occurring over a long period of time. To achieve this objective, the approach used here estimates a statistical model from a series of multispectral images acquired over a long period of time, assuming there is no considerable change during that period, and then compares it with multispectral image data obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis technique is used for pixel-based change detection in a defined region: the generalized likelihood ratio test (GLRT) is used to detect changed pixels from the probabilistic model estimated for the corresponding pixel. Changed pixels are detected assuming that the images have been co-registered prior to estimation; to minimize error due to co-registration, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. There are several challenges in this method. The first and foremost is to obtain a sufficiently large number of datasets for multivariate distribution modelling, since a large number of images are always discarded due to cloud coverage. Imperfect modelling also leads to a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection
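As an illustration of a pixel-wise statistical test (a simplified stand-in, not the paper's exact GLRT, and without the 8-neighbourhood refinement), the sketch below fits a per-pixel multivariate Gaussian to a synthetic multitemporal stack and thresholds the Mahalanobis distance of a later image with a chi-square cutoff:

```python
# Simplified pixel-wise change test: fit a multivariate Gaussian per pixel from a
# multitemporal stack, then flag pixels of a later image whose Mahalanobis distance
# exceeds a chi-square threshold. Co-registration is assumed to have been done.
import numpy as np
from scipy.stats import chi2

def change_map(stack, new_img, alpha=0.01):
    # stack: (T, H, W, B) reference images; new_img: (H, W, B)
    T, H, W, B = stack.shape
    mu = stack.mean(axis=0)                                   # (H, W, B)
    diff = stack - mu
    cov = np.einsum('thwi,thwj->hwij', diff, diff) / (T - 1)  # per-pixel covariance
    cov += 1e-6 * np.eye(B)                                   # regularize
    d = new_img - mu
    m2 = np.einsum('hwi,hwij,hwj->hw', d, np.linalg.inv(cov), d)   # Mahalanobis^2
    return m2 > chi2.ppf(1 - alpha, df=B)                     # boolean change map

rng = np.random.default_rng(0)
ref = rng.normal(0.2, 0.05, (12, 64, 64, 4))                  # synthetic 4-band stack
new = ref.mean(axis=0).copy()
new[20:30, 20:30] += 0.5                                      # simulated new construction
print(change_map(ref, new).sum(), "changed pixels")
```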
Procedia PDF Downloads 136
16071 A Two-Stage Bayesian Variable Selection Method with the Extension of Lasso for Geo-Referenced Data
Authors: Georgiana Onicescu, Yuqian Shen
Abstract:
Due to the complex nature of geo-referenced data, multicollinearity of the risk factors in public health spatial studies is a commonly encountered issue, which leads to low parameter estimation accuracy because it inflates the variance in the regression analysis. To address this issue, we propose a two-stage variable selection method that extends the least absolute shrinkage and selection operator (Lasso) to the Bayesian spatial setting, investigating the impact of risk factors on health outcomes. Specifically, in stage I, we perform variable selection using Bayesian Lasso and several other variable selection approaches. Then, in stage II, we perform model selection with only the variables selected in stage I and compare the methods again. To evaluate the performance of the two-stage variable selection methods, we conducted a simulation study with different distributions for the risk factors, using geo-referenced count data as the outcome and Michigan as the research region. We considered the cases when all candidate risk factors are independently normally distributed or follow a multivariate normal distribution with different correlation levels. Two other Bayesian variable selection methods, the binary indicator and the combination of binary indicator and Lasso, were considered and compared as alternatives. The simulation results indicate that the proposed two-stage Bayesian Lasso variable selection method has the best performance for both the independent and dependent cases considered. Compared with the one-stage approach and the two alternative methods, the two-stage Bayesian Lasso approach provides the highest estimation accuracy in all scenarios considered.
Keywords: Lasso, Bayesian analysis, spatial analysis, variable selection
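A frequentist sketch of the two-stage idea is given below, assuming scikit-learn; it substitutes a cross-validated Lasso for the Bayesian spatial Lasso and a plain Poisson refit for the stage-II model, so it only illustrates the selection-then-refit structure, not the paper's Bayesian spatial formulation.

```python
# Stage I: select variables with a Lasso path; Stage II: refit a count-data model
# using only the selected variables. All data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LassoCV, PoissonRegressor

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.normal(size=(n, p))
beta = np.array([0.8, -0.5, 0.3] + [0.0] * (p - 3))
y = rng.poisson(np.exp(0.5 + X @ beta))            # geo-referenced counts (synthetic)

# Stage I: variable selection on log(1 + y) as a rough working response.
sel = LassoCV(cv=5, random_state=0).fit(X, np.log1p(y))
keep = np.flatnonzero(np.abs(sel.coef_) > 1e-6)

# Stage II: refit a Poisson model with the selected risk factors only.
final = PoissonRegressor(alpha=0.0).fit(X[:, keep], y)
print("selected columns:", keep, "coefficients:", final.coef_)
```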
Procedia PDF Downloads 146
16070 Prevention of Heart Failure Progression in Patients with Post-Infarction Cardiosclerosis After Coronavirus Infection
Authors: Sujayeva V. A., Karpova I. S., Koslataya O. V., Kolyadko M. G., Russkikh I. I., Vankovich E. A.
Abstract:
Objective: The goal of this study is to develop a method for preventing the progression of heart failure (HF) in patients with post-infarction cardiosclerosis who have had a coronavirus infection. Methods: 135 patients with post-infarction cardiosclerosis were divided into two groups: Group I, patients who had had COVID-19 (85 people), and Group II, patients who had not had COVID-19 (50 people). Patients in Group I were further divided, according to the level of the N-terminal fragment of natriuretic peptide (NTproBNP), into subgroup A, with HF (40 people), and subgroup B, without HF (45 people). All patients underwent clinical examination, echocardiography, electrocardiotopography in 60 leads, computed angiography of the coronary arteries, cardiac magnetic resonance imaging, and NTproBNP measurement. Results: In the post-COVID period, patients with post-infarction cardiosclerosis showed remodeling of the left ventricle and the right heart, deterioration of the systolic and diastolic function of both ventricles, increased pulmonary artery pressure, progression of coronary artery atherosclerosis, and an increase in the extent of myocardial fibrosis. The consequence of these changes was the progression of heart failure. The developed method of medical prevention made it possible to improve the clinical course of coronary artery disease and to prevent the progression of chronic heart failure in patients with post-infarction cardiosclerosis. Conclusions: In patients with post-infarction cardiosclerosis who initially had HF, a slight decrease in its severity was revealed after 1 year according to laboratory and instrumental data. In patients with post-infarction cardiosclerosis who did not have HF before COVID-19, HF developed 1 year after the coronavirus disease, which may be due to the identified process of myocardial fibrosis; this dictates the need to prevent the development of HF in patients with post-infarction cardiosclerosis, even those who did not initially have HF. The proposed method of medical prevention improved the clinical course of coronary artery disease in patients with post-infarction cardiosclerosis after COVID-19, both in persons with and without HF at inclusion in the study. The method, which includes spironolactone, loop diuretics, empagliflozin, and sacubitril/valsartan, helped prevent the progression of HF in people with post-infarction cardiosclerosis after COVID-19 infection.
Keywords: elderly, myocardial infarction, COVID-19, prevention
Procedia PDF Downloads 25
16069 Numerical Methods versus Bjerksund and Stensland Approximations for American Options Pricing
Authors: Marasovic Branka, Aljinovic Zdravka, Poklepovic Tea
Abstract:
Numerical methods such as binomial and trinomial trees and finite difference methods can be used to price a wide range of option contracts for which no analytical solutions are known. American options are the best-known options of that kind. Besides numerical methods, American options can be valued with approximation formulas such as the Bjerksund-Stensland formulas from 1993 and 2002. When the value of an American option is approximated by the Bjerksund-Stensland formulas, the computer time spent on the calculation is very short, whereas the computer time spent using numerical methods can vary from less than one second to several minutes or even hours. However, to be able to conduct a comparative analysis of numerical methods and the Bjerksund-Stensland formulas, we limit the computation time of the numerical methods to less than one second. Therefore, we ask the question: which method is most accurate at nearly the same computation time?
Keywords: Bjerksund and Stensland approximations, computational analysis, finance, options pricing, numerical methods
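For reference, a minimal Cox-Ross-Rubinstein binomial tree for an American put (one of the numerical methods compared) might look as follows; the parameters are illustrative, and the Bjerksund-Stensland closed forms are not reproduced here.

```python
# Cox-Ross-Rubinstein binomial tree for an American put with early-exercise check.
import numpy as np

def american_put_crr(S0, K, r, sigma, T, steps=500):
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)              # risk-neutral up probability
    disc = np.exp(-r * dt)
    j = np.arange(steps + 1)
    S = S0 * u**j * d**(steps - j)                  # terminal asset prices
    V = np.maximum(K - S, 0.0)                      # terminal payoffs
    for n in range(steps - 1, -1, -1):              # backward induction
        j = np.arange(n + 1)
        S = S0 * u**j * d**(n - j)
        V = np.maximum(disc * (p * V[1:] + (1 - p) * V[:-1]), K - S)
    return V[0]

print(american_put_crr(S0=100, K=100, r=0.05, sigma=0.2, T=1.0))
```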
Procedia PDF Downloads 457
16068 Evaluation and Selection of SaaS Product Based on User Preferences
Authors: Boussoualim Nacira, Aklouf Youcef
Abstract:
Software as a Service (SaaS) is a software delivery paradigm in which the product is not installed on-premise but is available over the Internet and the Web. Customers do not pay to own the software itself but rather to use it. This pay-per-use concept is very attractive, and hence we see an increasing number of organizations adopting SaaS. However, each customer is unique, which leads to very large variation in the requirements of the software. As several suppliers propose SaaS products, the choice among them becomes a major issue. When multiple criteria are involved in decision making, we speak of a Multi-Criteria Decision-Making (MCDM) problem. Therefore, this paper presents a method to help customers choose the SaaS product that best satisfies their conditions and alternatives. Since a good adaptive selection method should be based on a correct definition of the different choice parameters, we started by extracting and analyzing the various parameters involved in the selection of a SaaS application.
Keywords: cloud computing, business operation, Multi-Criteria Decision-Making (MCDM), Software as a Service (SaaS)
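A minimal weighted-sum scoring sketch is shown below, with placeholder criteria, weights and scores; the paper's extracted parameters and exact MCDM technique may differ.

```python
# Minimal weighted-sum scoring of SaaS alternatives over user-weighted criteria.
import numpy as np

criteria = ["price", "security", "availability", "support"]   # assumed criteria
weights = np.array([0.4, 0.3, 0.2, 0.1])                      # user preferences, sum to 1
# rows = SaaS products, columns = criteria, scores normalized to [0, 1]
scores = np.array([
    [0.9, 0.6, 0.8, 0.5],   # product A
    [0.7, 0.9, 0.7, 0.8],   # product B
    [0.5, 0.8, 0.9, 0.9],   # product C
])
overall = scores @ weights
ranking = np.argsort(overall)[::-1]
print("best product index:", ranking[0], "overall scores:", overall)
```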
Procedia PDF Downloads 483
16067 Attention-Based Spatio-Temporal Approach for Fire and Smoke Detection
Authors: Alireza Mirrashid, Mohammad Khoshbin, Ali Atghaei, Hassan Shahbazi
Abstract:
In various industries, smoke and fire are two of the most important threats in the workplace. A common method for detecting smoke and fire is the use of infrared thermal and smoke sensors, which cannot be used in outdoor applications; the use of vision-based methods therefore seems necessary. The problem of smoke and fire detection is spatio-temporal and requires spatio-temporal solutions. This paper presents a method that uses spatial features along with temporal features to detect smoke and fire in the scene. It consists of three main parts, where the task of each part is to reduce the error of the previous part so that the final model has robust performance. The method also uses transformer modules to increase the accuracy of the model. The results show the proper performance of the proposed approach in solving the problem of smoke and fire detection, and it can be used to increase workplace safety.
Keywords: attention, fire detection, smoke detection, spatio-temporal
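A toy spatio-temporal classifier in the same spirit, assuming PyTorch, with per-frame convolutional features followed by transformer attention over time; the sizes and the single-stage structure are illustrative and do not reproduce the authors' three-part model.

```python
# Toy clip classifier: spatial features per frame, then attention over the time axis.
import torch
import torch.nn as nn

class FireSmokeNet(nn.Module):
    def __init__(self, d_model=64, n_classes=3):       # e.g. none / smoke / fire
        super().__init__()
        self.frame_encoder = nn.Sequential(            # spatial features per frame
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, d_model, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.temporal = nn.TransformerEncoder(layer, num_layers=2)   # attention over time
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, clips):                           # clips: (B, T, 3, H, W)
        B, T = clips.shape[:2]
        f = self.frame_encoder(clips.flatten(0, 1)).view(B, T, -1)
        return self.head(self.temporal(f).mean(dim=1))  # pool over time, classify

logits = FireSmokeNet()(torch.randn(2, 8, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 3])
```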
Procedia PDF Downloads 203
16066 Reinforced Concrete Bridge Deck Condition Assessment Methods Using Ground Penetrating Radar and Infrared Thermography
Authors: Nicole M. Martino
Abstract:
Reinforced concrete bridge deck condition assessments primarily use visual inspection methods, where an inspector looks for and records the locations of cracks, potholes, efflorescence and other signs of probable deterioration. Sounding is another technique used to diagnose the condition of a bridge deck; this method listens for damage within the subsurface as the surface is struck with a hammer or chain. Even though extensive procedures are in place for using these inspection techniques, neither one provides the inspector with a comprehensive understanding of the internal condition of a bridge deck, which is where damage originates. In order to make accurate estimates of repair locations and quantities, in addition to allocating the necessary funding, a total understanding of the deck's deteriorated state is key. The research presented in this paper collected infrared thermography and ground penetrating radar data from reinforced concrete bridge decks without an asphalt overlay. These decks were of various ages, and their condition varied from brand new to in need of replacement. The goals of this work were first to verify that these nondestructive evaluation methods could identify similar areas of healthy and damaged concrete, and then to see whether combining the results of both methods would provide higher confidence than a condition assessment completed using only one method. The results from each method were presented as plan-view color contour plots; the results from one of the decks assessed as part of this research, including these plots, are presented in this paper. Furthermore, to address the interest of transportation agencies throughout the United States, this research developed a step-by-step guide which demonstrates how to collect and assess a bridge deck using these nondestructive evaluation methods. The guide addresses setup procedures on the deck on the day of data collection, system setups and settings for different bridge decks, data post-processing for each method, and data visualization and quantification.
Keywords: bridge deck deterioration, ground penetrating radar, infrared thermography, NDT of bridge decks
Procedia PDF Downloads 155
16065 Objects Tracking in Catadioptric Images Using Spherical Snake
Authors: Khald Anisse, Amina Radgui, Mohammed Rziza
Abstract:
Tracking objects in video sequences is a very challenging task in many computer vision applications, yet no previous article treats this topic in catadioptric vision. This paper describes a new approach to omnidirectional image processing based on inverse stereographic projection onto the half-sphere, using the spherical model proposed by Gayer et al. For object tracking, our work is based on the snake method, optimized with the greedy algorithm by adapting its different operators. The algorithm respects the deformed geometry of omnidirectional images through a spherical neighborhood, a spherical gradient, and a reformulation of the optimization algorithm on the spherical domain. This tracking method, which we call the "spherical snake", makes it possible to follow the changes in the shape and size of the object across its displacements in the spherical image.
Keywords: computer vision, spherical snake, omnidirectional image, object tracking, inverse stereographic projection
Procedia PDF Downloads 404
16064 A Study on Reinforced Concrete Beams Enlarged with Polymer Mortar and UHPFRC
Authors: Ga Ye Kim, Hee Sun Kim, Yeong Soo Shin
Abstract:
Many studies have been carried out so far on methods for repairing and strengthening concrete structures. The traditional retrofit method is to attach fiber sheets such as CFRP (Carbon Fiber Reinforced Polymer), GFRP (Glass Fiber Reinforced Polymer) or AFRP (Aramid Fiber Reinforced Polymer) to the concrete structure. However, this method has significant downsides: a risk of debonding and an increase in displacement due to the insufficient structural section. Therefore, enlarging the structural member with polymer mortar or Ultra-High Performance Fiber Reinforced Concrete (UHPFRC) is an effective way of strengthening a concrete structure. This paper investigates the structural performance of reinforced concrete (RC) beams enlarged with polymer mortar and compares the experimental results with analytical results. Nonlinear finite element analyses were conducted to reproduce the experimental results and to predict the structural behavior of retrofitted RC beams accurately without a costly experimental process. In addition, this study compares the commonly used retrofit material (polymer mortar) with a more recently used one (UHPFRC) by means of nonlinear finite element analyses. In the first part of the study, RC beams with different cover types were fabricated for the experiment; the beams were 250 millimeters in depth, 150 millimeters in width and 2800 millimeters in length. To verify the experiment, nonlinear finite element models were generated using the commercial software ABAQUS 6.10-3. Both the experimental and analytical results demonstrated a good strengthening effect on the RC beams and showed similar tendencies, so the proposed analytical method can be used to predict the effect of strengthened RC beams. In the second part of the study, the main parameter was the type of retrofit material: the same nonlinear finite element models were generated to compare polymer mortar with UHPFRC, the two retrofit materials were evaluated, and the retrofit effect was verified by the analytical results.
Keywords: retrofit material, polymer mortar, UHPFRC, nonlinear finite element analysis
Procedia PDF Downloads 419
16063 Evaluation of Methodologies for Measuring Harmonics and Inter-Harmonics in Photovoltaic Facilities
Authors: Anésio de Leles Ferreira Filho, Wesley Rodrigues de Oliveira, Jéssica Santoro Gonçalves, Jorge Andrés Cormane Angarita
Abstract:
The increase in electric power demand in the face of environmental issues has intensified the participation of renewable energy sources, such as photovoltaics, in the energy matrix of various countries. Due to their operational characteristics, they can generate time-varying harmonic and inter-harmonic distortions. For this reason, the application of measurement methods based on traditional Fourier analysis, as proposed by IEC 61000-4-7, can provide inaccurate results. Considering these aspects, this work presents the results of a comparative evaluation between a methodology combining the Prony method with the Kalman filter and a method based on the IEC 61000-4-30 and IEC 61000-4-7 standards. Synthetic signals and data acquired through measurements in a 50 kWp photovoltaic installation were employed in this study.
Keywords: harmonics, inter-harmonics, IEC 61000-4-7, parametric estimators, photovoltaic generation
Procedia PDF Downloads 487
16062 Investigation on Morphologies, Forming Mechanism, Photocatalytic and Electronic Properties of Co-Zn Ferrite Nanostructure Grown on the Reduced Graphene Oxide Support
Authors: Qinglei Liu, Ali Charkhesht, Tiva Sharifi, Ashkan Bahadoran
Abstract:
Graphene sheets are promising nanoscale building blocks as a support material for the dispersion of nanoparticles. In this work, a solvothermal method was employed to directly grow Co1-xZnxFe2O4 ferrite nanospheres on a graphene oxide support that was subsequently reduced to graphene. The morphology, structure and crystallography of the samples were investigated using field-emission scanning electron microscopy (FE-SEM) and powder X-ray diffraction (XRD). The influence of the Zn2+ content on the photocatalytic activity, electrical conductivity and magnetic properties of the samples was also investigated. The results showed that the Co1-xZnxFe2O4 nanoparticles are dispersed on the graphene sheets and that the obtained nanocomposites are soft magnetic materials. In addition, the samples showed excellent photocatalytic activity under visible light irradiation.
Keywords: reduced graphene oxide, ferrite, magnetic nanocomposite, photocatalytic activity, solvothermal method
Procedia PDF Downloads 250
16061 Cycle Number Estimation Method on Fatigue Crack Initiation Using Voronoi Tessellation and the Tanaka Mura Model
Authors: Mohammad Ridzwan Bin Abd Rahim, Siegfried Schmauder, Yupiter HP Manurung, Peter Binkele, Meor Iqram B. Meor Ahmad, Kiarash Dogahe
Abstract:
This paper deals with the short crack initiation of the material P91 under cyclic loading at two different temperatures, concluding with the estimation of the short-crack-initiation Wöhler (S/N) curve. An artificial but representative model microstructure was generated using Voronoi tessellation, and the resulting non-uniform stress distribution was calculated with the finite element method. The number of cycles needed for crack initiation is estimated on the basis of the stress distribution in the model by applying the physically based Tanaka-Mura model. Initial results show that the number of cycles to crack initiation is strongly correlated with temperature.
Keywords: short crack initiation, P91, Wöhler curve, Voronoi tessellation, Tanaka-Mura model
Procedia PDF Downloads 101
16060 The Principle of Methodological Rationality and Security of Organisations
Authors: Jan Franciszek Jacko
Abstract:
This investigation presents the principle of methodological rationality of decision making and discusses the impact of the methodologically rational or irrational decisions of an organisation's members on its security. The study formulates and partially justifies some research hypotheses regarding this impact. A thought experiment is used according to Max Weber's method of ideal types. Two idealised situations ("models") are compared: Model A, in which all decision-makers follow methodologically rational decision-making procedures, and Model B, in which these agents follow methodologically irrational decision-making practices. Analysing and comparing the two models allows the formulation of some research hypotheses regarding the impact of the methodologically rational and irrational attitudes of members of an organisation on its security. In addition to this method, phenomenological analyses of rationality and irrationality are applied.
Keywords: methodological rationality, rational decisions, security of organisations, philosophy of economics
Procedia PDF Downloads 139
16059 Electronic Structure and Optical Properties of YNi₄Si-Type GdNi₅: A Coulomb Corrected Local-Spin Density Approximation Study
Authors: Sapan Mohan Saini
Abstract:
In this work, we report calculations of the electronic and optical properties of the YNi₄Si-type GdNi₅ compound. The calculations are performed using the full-potential augmented plane wave (FPLAPW) method in the framework of density functional theory (DFT). The Coulomb-corrected local-spin density approximation (LSDA+U) with self-interaction correction (SIC) has been used for the exchange-correlation potential. Spin-polarised band structure calculations show that several bands cross the Fermi level (EF), reflecting the metallic character. Analysis of the density of states (DOS) demonstrates that the spin-up Gd-f states lie around 7.5 eV below EF and the spin-down Gd-f states lie around 4.5 eV above EF. We find that the Ni-3d states contribute to the DOS mainly from -5.0 eV up to EF. Our calculated optical conductivity agrees well with the experimental data.
Keywords: electronic structure, optical properties, FPLAPW method, YNi₄Si-type GdNi₅
Procedia PDF Downloads 173
16058 A Study of Basic and Reactive Dyes Removal from Synthetic and Industrial Wastewater by Electrocoagulation Process
Authors: Almaz Negash, Dessie Tibebe, Marye Mulugeta, Yezbie Kassa
Abstract:
Large-scale textile industries use large amounts of toxic chemicals, which are very hazardous to human health and environmental sustainability. In this study, the removal of various dyes from the effluents of textile industries using the electrocoagulation process was investigated. The studied dyes were Reactive Red 120 (RR-120), Basic Blue 3 (BB-3), and Basic Red 46 (BR-46), which were found in samples collected from the effluents of three major textile factories in the Amhara region, Ethiopia. For maximum removal, the dye BB-3 required an acidic pH of 3, RR-120 a basic pH of 11, and BR-46 neutral conditions at pH 7. BB-3 required a longer treatment time of 80 min than BR-46 and RR-120, which required 30 and 40 min, respectively. The best removal efficiencies of 99.5%, 93.5%, and 96.3% were achieved for BR-46, BB-3, and RR-120, respectively, from synthetic wastewater containing 10 mg L⁻¹ of each dye at an applied potential of 10 V. The method was applied to real textile wastewaters, and 73.0 to 99.5% removal of the dyes was achieved, indicating that electrocoagulation can be used as a simple and reliable method for the treatment of real wastewater from textile industries and as a potentially viable and inexpensive tool for the treatment of textile dyes. Analysis of the electrochemically generated sludge by X-ray diffraction, scanning electron microscopy, and Fourier transform infrared spectroscopy revealed the expected crystalline aluminum oxides (bayerite, Al(OH)₃, and diaspore, AlO(OH)) in the sludge; an amorphous phase was also found in the floc. Textile industry owners should be aware of the impact of discharging effluents on the ecosystem and should use the investigated electrocoagulation method for effluent treatment before discharging into the environment.
Keywords: electrocoagulation, aluminum electrodes, Basic Blue 3, Basic Red 46, Reactive Red 120, textile industry, wastewater
Procedia PDF Downloads 55
16057 Graded Orientation of the Linear Polymers
Authors: Levan Nadareishvili, Roland Bakuradze, Barbara Kilosanidze, Nona Topuridze, Liana Sharashidze, Ineza Pavlenishvili
Abstract:
Some regularities of the formation of a new structural state of thermoplastic polymers, the gradually oriented (stretched) state (GOS), are discussed. The transition into the GOS is realized by graded oriented stretching, either by the action of an inhomogeneous mechanical field on the isotropic linear polymer or by zonal stretching, which is implemented on a standard tensile-testing machine using a specially designed zone stretching device (ZSD). Both technical approaches (especially the zonal stretching method) allow control of such quantitative parameters of gradually oriented polymers as the range of change in relative elongation/orientation degree, the length over which this change occurs, and its profile (linear, hyperbolic, parabolic, logarithmic, etc.). The uniaxial graded stretching method should be considered an effective technological solution for creating polymer materials with a predetermined gradient of physical properties.
Keywords: controlled graded stretching, gradually oriented state, linear polymers, zone stretching device
Procedia PDF Downloads 436
16056 Residual Lifetime Estimation for Weibull Distribution by Fusing Expert Judgements and Censored Data
Authors: Xiang Jia, Zhijun Cheng
Abstract:
The residual lifetime of a product is the operating time between the current time and the time point when failure happens, and its estimation is important in reliability analysis. To predict the residual lifetime, it is necessary to assume or verify a particular distribution that the lifetime of the product follows, and the two-parameter Weibull distribution is frequently adopted to describe lifetimes in reliability engineering. Due to time constraints and cost reduction, a life testing experiment is usually terminated before all units have failed, so censored data are usually collected. In addition, other information can be used for reliability analysis: expert judgements are considered, as it is common that experts can provide useful information concerning reliability. Therefore, in this paper the residual lifetime is estimated for the Weibull distribution by fusing censored data and expert judgements. First, closed-form expressions for the point estimate and confidence interval of the residual lifetime under the Weibull distribution are presented. Next, the expert judgements are regarded as prior information, and a way to determine the prior distribution of the Weibull parameters is developed; for completeness, both the case of a single expert judgement and the case of more than two expert judgements are addressed. Further, the posterior distribution of the Weibull parameters is derived. Since it is difficult to derive the posterior distribution of the residual lifetime directly, a sample-based method is proposed to generate posterior samples of the Weibull parameters using the Markov chain Monte Carlo (MCMC) method, and these samples are used to obtain the Bayes estimate and credible interval for the residual lifetime. Finally, an illustrative example shows the application and demonstrates that the proposed method is simple, satisfactory, and robust.
Keywords: expert judgements, information fusion, residual lifetime, Weibull distribution
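For orientation, the conditional (residual-life) quantities for a two-parameter Weibull with known parameters can be written in closed form, as sketched below; this only illustrates the target quantity, not the paper's Bayesian fusion of censored data and expert priors.

```python
# Residual-life quantities for Weibull(shape=beta, scale=eta), given survival to age t.
import numpy as np

def residual_survival(x, t, beta, eta):
    """P(T > t + x | T > t) for a Weibull(beta, eta) lifetime."""
    return np.exp(-(((t + x) / eta) ** beta) + ((t / eta) ** beta))

def residual_life_quantile(q, t, beta, eta):
    """x such that P(T <= t + x | T > t) = q (the q-quantile of the residual life)."""
    return eta * ((t / eta) ** beta - np.log(1 - q)) ** (1 / beta) - t

beta_hat, eta_hat, t_now = 2.0, 1000.0, 400.0      # illustrative estimates and current age
print(residual_survival(200.0, t_now, beta_hat, eta_hat))
print(residual_life_quantile(0.5, t_now, beta_hat, eta_hat))   # median residual life
```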
Procedia PDF Downloads 142
16055 Numerical Computation of Sturm-Liouville Problem with Robin Boundary Condition
Authors: Theddeus T. Akano, Omotayo A. Fakinlede
Abstract:
The modelling of physical phenomena such as the earth's free oscillations, the vibration of strings, the interaction of atomic particles, or the steady-state flow in a bar gives rise to Sturm-Liouville (SL) eigenvalue problems. The boundary conditions of some systems, such as the convection-diffusion equation and electromagnetic and heat transfer problems, require the combination of Dirichlet and Neumann boundary conditions; hence the incorporation of the Robin boundary condition in the analysis of the Sturm-Liouville problem. This paper deals with the computation of the eigenvalues and eigenfunctions of generalized Sturm-Liouville problems with Robin boundary conditions using the finite element method. Numerical solutions of classical Sturm-Liouville problems are presented. The results show agreement with the exact solution, and higher precision is achieved with a larger number of elements.
Keywords: Sturm-Liouville problem, Robin boundary condition, finite element method, eigenvalue problems
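A minimal P1 finite element sketch for the model problem -u'' = λu with Robin ends is shown below; the coefficients (p = w = 1, q = 0) and the Robin constants are illustrative choices, not those of the paper.

```python
# Linear (P1) finite elements for  -u'' = lambda*u  on (0, 1),
# with Robin ends  u'(0) = s0*u(0)  and  u'(1) = -s1*u(1).
# The Robin terms enter the weak form as boundary contributions to the stiffness matrix.
import numpy as np
from scipy.linalg import eigh

def sl_robin_eigs(n_el=200, s0=1.0, s1=1.0, n_eigs=4):
    h = 1.0 / n_el
    n = n_el + 1
    K = np.zeros((n, n))
    M = np.zeros((n, n))
    ke = (1.0 / h) * np.array([[1, -1], [-1, 1]])       # element stiffness
    me = (h / 6.0) * np.array([[2, 1], [1, 2]])         # element mass
    for e in range(n_el):
        idx = np.ix_([e, e + 1], [e, e + 1])
        K[idx] += ke
        M[idx] += me
    K[0, 0] += s0                                       # Robin term at x = 0
    K[-1, -1] += s1                                     # Robin term at x = 1
    vals, vecs = eigh(K, M)                             # generalized eigenproblem K v = lambda M v
    return vals[:n_eigs], vecs[:, :n_eigs]

print(sl_robin_eigs()[0])   # first few eigenvalues
```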
Procedia PDF Downloads 362
16054 Bipolar Impulse Noise Removal and Edge Preservation in Color Images and Video Using Improved Kuwahara Filter
Authors: Reji Thankachan, Varsha PS
Abstract:
Both image capturing devices and human visual systems are nonlinear, and hence nonlinear filtering methods outperform their linear counterparts in many applications. Linear methods are unable to remove impulsive noise from images while preserving edges and fine details; in addition, linear algorithms are unable to remove signal-dependent or multiplicative noise. This paper presents an approach to denoise and smooth images and videos corrupted by bipolar impulse noise using an improved Kuwahara filter. It involves a two-stage algorithm consisting of noise detection followed by filtering. Numerous simulations demonstrate that the proposed method outperforms the existing method by eliminating the painting-like flattening effect along the local feature direction while preserving edges, with improvements in PSNR and MSE.
Keywords: bipolar impulse noise, Kuwahara, PSNR, MSE, PDF
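A plain (unoptimized) classical Kuwahara filter is sketched below for reference; the paper's improved variant and its impulse-noise detection stage are not reproduced.

```python
# Classical Kuwahara filter on a grayscale image: for each pixel, output the mean of
# whichever of the four overlapping quadrants of the window has the smallest variance.
import numpy as np

def kuwahara(img, r=2):
    padded = np.pad(img.astype(float), r, mode='reflect')
    out = np.empty_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            y, x = i + r, j + r                          # position in padded image
            quads = [padded[y - r:y + 1, x - r:x + 1],   # NW
                     padded[y - r:y + 1, x:x + r + 1],   # NE
                     padded[y:y + r + 1, x - r:x + 1],   # SW
                     padded[y:y + r + 1, x:x + r + 1]]   # SE
            variances = [q.var() for q in quads]
            out[i, j] = quads[int(np.argmin(variances))].mean()
    return out

rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[:, 32:] = 1.0                                        # a step edge
noisy = img + rng.normal(0, 0.1, img.shape)
smoothed = kuwahara(noisy, r=2)
```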
Procedia PDF Downloads 499
16053 A Simple Fluid Dynamic Model for Slippery Pulse Pattern in Traditional Chinese Pulse Diagnosis
Authors: Yifang Gong
Abstract:
Pulse diagnosis is one of the most important diagnostic methods in traditional Chinese medicine, and also the trickiest to learn: it is said that it can only be sensed, not explained. This has become a serious threat to the survival of this diagnostic method. However, a large amount of experience has been accumulated by Chinese doctors over several thousand years of practice. A pulse pattern called the 'slippery pulse' is one of the indications of pregnancy. A simple fluid dynamic model is proposed to simulate the effects of the existence of a placenta. The placenta is modeled as an extra plenum in an extremely simplified fluid network model. It is found that, because of the existence of the extra plenum, the pulse pattern indeed shows a secondary peak within one pulse period. To the author's knowledge, this work is the first to show the link between pulse diagnosis and basic physical principles. Key parameters which might affect the pattern are also investigated.
Keywords: Chinese medicine, flow network, pregnancy, pulse
Procedia PDF Downloads 386
16052 A Preliminary Study for Building an Arabic Corpus of Pair Questions-Texts from the Web: Aqa-Webcorp
Authors: Wided Bakari, Patrce Bellot, Mahmoud Neji
Abstract:
With the development of electronic media and the heterogeneity of Arabic data on the Web, the idea of building a clean corpus for certain applications of natural language processing, including machine translation, information retrieval and question answering, becomes more and more pressing. In this manuscript, we seek to create and develop our own corpus of question-text pairs, which will then provide a better basis for our experimentation. We model the corpus construction by a method for Arabic that retrieves texts from the web that could prove to be answers to our factual questions. To do this, we developed a Java script that extracts, from a given query, a list of HTML pages and then cleans these pages so as to obtain a database of texts and a corpus of question-text pairs. In addition, we give preliminary results of our proposed method. Some related investigations on the construction of Arabic corpora are also presented in this document.
Keywords: Arabic, web, corpus, search engine, URL, question, corpus building, script, Google, html, txt
Procedia PDF Downloads 324
16051 Liver and Liver Lesion Segmentation From Abdominal CT Scans
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications, and the segmentation of the liver and liver lesions is regarded as a major primary step in the computer-aided diagnosis of liver diseases. Precise liver segmentation in abdominal CT images is one of the most important steps for the computer-aided diagnosis of liver pathology. In this paper, a semi-automated method based on mathematical morphology is presented for liver and liver lesion segmentation. Our algorithm runs in two parts: in the first, we determine the region of interest by applying morphological filters to extract the liver; the second step detects the liver lesions. The proposed method is based on anatomical information and on the mathematical morphology tools used in the image processing field. First, we improve the quality of the original image and of the image gradient by applying a spatial filter followed by morphological filters. The second step computes the internal and external markers of the liver and of the hepatic lesions. We then segment the liver and the hepatic lesions with the marker-controlled watershed transform. The developed algorithm was validated on several images, and the obtained results show its good performance.
Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm
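A marker-controlled watershed sketch using scikit-image is given below, with synthetic data and hand-placed markers standing in for the anatomy-driven marker extraction described above.

```python
# Marker-controlled watershed: smooth, compute a gradient, place internal/external
# markers, and flood. The image and markers here are synthetic placeholders.
import numpy as np
from skimage import filters, segmentation, morphology

rng = np.random.default_rng(0)
img = np.zeros((128, 128))
rr, cc = np.ogrid[:128, :128]
img[(rr - 64) ** 2 + (cc - 64) ** 2 < 30 ** 2] = 1.0          # bright "organ"
img += rng.normal(0, 0.1, img.shape)                           # CT-like noise

smoothed = filters.gaussian(img, sigma=2)                      # denoise
gradient = filters.sobel(smoothed)                             # edge strength

markers = np.zeros(img.shape, dtype=int)
markers[64, 64] = 1                                            # internal marker (organ)
markers[5, 5] = 2                                              # external marker (background)

labels = segmentation.watershed(gradient, markers)             # flood from markers
organ_mask = morphology.remove_small_holes(labels == 1)
print("segmented pixels:", organ_mask.sum())
```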
Procedia PDF Downloads 451
16050 Robot Movement Using the Trust Region Policy Optimization
Authors: Romisaa Ali
Abstract:
The policy gradient approach is one of the families of deep reinforcement learning that combines deep neural networks (DNN) with reinforcement learning (RL) to discover the optimum of a control problem through the experience gained from the interaction between the robot and its surroundings. Earlier policy gradient algorithms were unable to handle the errors caused by the over- or under-estimation introduced by the deep neural network model. This article therefore discusses a state-of-the-art (SOTA) policy gradient technique, trust region policy optimization (TRPO), applying it in various environments and comparing it with another policy gradient method, proximal policy optimization (PPO), to explain their robust optimization. The methods are used to gather experience data during various training phases, after observing the impact of the hyper-parameters on neural network performance.
Keywords: deep neural networks, deep reinforcement learning, proximal policy optimization, state-of-the-art, trust region policy optimization
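A minimal comparison script, assuming the stable-baselines3 and sb3-contrib packages (which provide PPO and TRPO implementations) and a standard Gymnasium task in place of a robot simulator:

```python
# Train PPO and TRPO on the same environment and compare their mean returns.
import gymnasium as gym
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy
from sb3_contrib import TRPO

env_id = "Pendulum-v1"   # placeholder for the robot environment

for Algo in (PPO, TRPO):
    model = Algo("MlpPolicy", gym.make(env_id), verbose=0)
    model.learn(total_timesteps=50_000)
    mean_r, std_r = evaluate_policy(model, gym.make(env_id), n_eval_episodes=10)
    print(Algo.__name__, "mean return:", mean_r, "+/-", std_r)
```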
Procedia PDF Downloads 170
16049 A Modified Nonlinear Conjugate Gradient Algorithm for Large Scale Unconstrained Optimization Problems
Authors: Tsegay Giday Woldu, Haibin Zhang, Xin Zhang, Yemane Hailu Fissuh
Abstract:
It is well known that the nonlinear conjugate gradient method is one of the most widely used first-order methods for solving large-scale unconstrained smooth optimization problems. Because of their low memory requirements, attractive theoretical features, practical computational efficiency and nice convergence properties, nonlinear conjugate gradient methods play a special role in solving large-scale unconstrained optimization problems, which have important applications in the practical and scientific world. However, nonlinear conjugate gradient methods carry only limited information about the curvature of the objective function and are likely to be less efficient and robust than some second-order algorithms. To overcome these drawbacks, a new modified nonlinear conjugate gradient method is presented. The noticeable features of our work are that the new search direction possesses the sufficient descent property independent of any line search and that it belongs to a trust region. Under mild assumptions and the standard Wolfe line search technique, the global convergence of the proposed algorithm is established. Furthermore, to test the practical computational performance of the new algorithm, numerical experiments are implemented on a set of large-dimensional unconstrained problems. The numerical results show that the proposed algorithm is efficient and robust compared with other similar algorithms.
Keywords: conjugate gradient method, global convergence, large scale optimization, sufficient descent property
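For context, a baseline Polak-Ribiere+ nonlinear conjugate gradient with a Wolfe line search is sketched below; the paper's modified direction with its guaranteed sufficient-descent/trust-region property is not reproduced.

```python
# Baseline PR+ nonlinear conjugate gradient with scipy's Wolfe line search,
# demonstrated on the Rosenbrock function.
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def nonlinear_cg(f, grad, x0, max_iter=2000, tol=1e-6):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]            # step satisfying Wolfe conditions
        if alpha is None:                                 # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)    # PR+ formula
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

x_star = nonlinear_cg(rosen, rosen_der, np.full(10, -1.2))
print("f(x*) =", rosen(x_star))
```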
Procedia PDF Downloads 208