Search results for: Fiber Bragg Grating (FBG) sensing method
18233 Laboratory Calibration of Soil Pressure Transducer for a Specified Field Application
Authors: Mohammad Zahidul Islam Bhuiyan, Shanyong Wang, Scott William Sloan, Daichao Sheng
Abstract:
Nowadays, soil pressure transducers are widely used to measure soil stress states in laboratory and field experiments. The soil pressure transducers investigated here are traditional diaphragm-type earth pressure cells (DEPC) based on the strain gauge principle. It is found that the output of these sensors varies with the soil conditions as well as the position of the sensor. Therefore, it is highly recommended to calibrate pressure sensors under conditions similar to those of their intended applications. The factory calibration coefficients of the EPCs are not reliable to use, since they are normally obtained by applying fluid (a special type of oil) pressure only over the load sensing zone, which does not represent the actual field conditions. Thus, the calibration of these sensors is of utmost importance and plays a pivotal role in assessing earth pressures precisely. In the present study, a TML soil pressure sensor is used to compare its sensitivity under different calibration systems, for example, fluid calibration and static load calibration with or without soil. The results show that the sensor provides higher sensitivity (more accurate results) under the soil calibration system.
Keywords: calibration, soil pressure, earth pressure cell, sensitivity
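As a minimal sketch of how a calibration factor of this kind can be extracted, the slope of cell output versus applied pressure can be fitted for each calibration system; the pressure steps and output readings below are hypothetical, not the TML sensor's data.

```python
import numpy as np

# hypothetical calibration data: applied pressure (kPa) vs. cell output (mV/V)
pressure = np.array([0, 50, 100, 150, 200, 250], dtype=float)
out_fluid = np.array([0.000, 0.098, 0.195, 0.291, 0.388, 0.486])  # fluid calibration
out_soil = np.array([0.000, 0.110, 0.221, 0.330, 0.441, 0.550])   # calibration in soil

for label, out in (("fluid", out_fluid), ("soil", out_soil)):
    slope, _ = np.polyfit(pressure, out, 1)   # sensitivity = slope of the linear fit
    print(f"{label}: sensitivity {slope:.6f} (mV/V)/kPa, "
          f"calibration factor {1/slope:.1f} kPa per (mV/V)")
```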
Procedia PDF Downloads 240
18232 Strength Analysis of RCC Dams Subject to the Layer-by-Layer Construction Method
Authors: Archil Motsonelidze, Vitaly Dvalishvili
Abstract:
Existing roller compacted concrete (RCC) dams indicate that the layer-by-layer construction method gives considerable economies compared with conventional methods. RCC dams have also gained acceptance in regions of high seismic activity. An earthquake resistance analysis of RCC gravity dams based on a nonlinear finite element technique is presented. An elastic-plastic approach is used to describe the material of a dam while it is under static conditions (the period of construction). Seismic force, as an acceleration equivalent to that produced by a real earthquake, is assumed to act when the dam is completed. The materials of the dam and foundation may be nonhomogeneous and anisotropic. The “dam-foundation” system is idealized as a plane strain problem.
Keywords: finite element method, layer-by-layer construction, RCC dams, strength analysis
Procedia PDF Downloads 549
18231 Investigation on the Properties of Particulate Reinforced AA2014 Metal Matrix Composite Materials Produced by Vacuum Infiltration Method
Authors: Isil Kerti, Onur Okur, Sibel Daglilar, Recep Calin
Abstract:
Particulate reinforced aluminium matrix composites have gained importance in the automotive, aeronautical and defense industries due to their specific properties, such as low density, high strength and stiffness, good fatigue strength, dimensional stability at high temperature and acceptable tribological properties. In this study, 2014 aluminium alloy was used as the matrix material, and B₄C and SiC were selected as reinforcement components. The vacuum infiltration method was used for the production of the composite materials. In the experimental studies, the reinforcement volume ratio was defined by mixing a total of 10% B₄C and SiC. An aging treatment (T6) was applied to the specimens. The effect of the T6 treatment on hardness was determined using the Brinell hardness test method. The effects of the aging treatment on microstructure and chemical structure were analysed by XRD, SEM and EDS analyses of the specimens.
Keywords: metal matrix composite, vacuum infiltration method, aluminum metal matrix, mechanical features
Procedia PDF Downloads 316
18230 Sonochemically Prepared Non-Noble Metal Oxide Catalysts for Methane Catalytic Combustion
Authors: Przemyslaw J. Jodlowski, Roman J. Jedrzejczyk, Damian K. Chlebda, Anna Dziedzicka, Lukasz Kuterasinski, Anna Gancarczyk, Maciej Sitarz
Abstract:
The aim of this study was to obtain highly active catalysts based on non-noble metal oxides supported on zirconia, prepared via a sonochemical method. In this study, the influence of stabilizer addition during the preparation step was checked. The final catalysts were characterized using X-ray diffraction (XRD), nitrogen adsorption, X-ray fluorescence (XRF), scanning electron microscopy (SEM) equipped with an energy dispersive X-ray spectrometer (EDS), transmission electron microscopy (TEM) and µRaman spectroscopy. The proposed preparation method yielded uniformly dispersed metal-oxide nanoparticles on the support’s surface. The catalytic activity of the prepared catalyst samples was measured in a methane combustion reaction. The activity of the catalysts prepared by the sonochemical method was considerably higher than that of their counterparts prepared by the incipient wetness method.
Keywords: methane catalytic combustion, nanoparticles, non-noble metals, sonochemistry
Procedia PDF Downloads 218
18229 Conceptual Synthesis as a Platform for Psychotherapy Integration: The Case of Transference and Overgeneralization
Authors: Merav Rabinovich
Abstract:
Background: Psychoanalytic and cognitive therapy approach problems from different points of view. In the recent decade, the integrative movement has been gaining momentum. However, little has been studied regarding the theoretical interrelationship among these therapy approaches. Method: 33 transference case studies published in peer-reviewed academic journals were coded with Luborsky's Core Conflictual Relationship Theme (CCRT) method (components of wish, response from other – real or imagined – and response of self). CCRT analysis was conducted with the tailor-made method, a valid tool for identifying transference patterns. Rabinovich and Kacen's (2010, 2013) Relationship Between Categories (RBC) method was used to analyze the relationship between these transference patterns and the cognitive and behavioral components appearing in those psychoanalytic case studies. Result: 30 of 33 cases (90%) were found to connect the transference themes with cognitive overgeneralization. In these cases, overgeneralizations were organized around Luborsky's transference themes of response from other and response of self. Additionally, overgeneralization was found to be an antithesis of the wish component, and the tension between them was found to be linked with powerful behavioral and emotional reactions. Conclusion: The findings indicate that thinking distortions of overgeneralization (cognitive therapy) are actual expressions of transference patterns. These findings point to a theoretical junction, a platform for clinical integration. Awareness of this junction can help therapists promote good psychotherapy outcomes, relying on the accumulated wisdom of the different therapies.
Keywords: transference, overgeneralization, theoretical integration, case-study metasynthesis, CCRT method, RBC method
Procedia PDF Downloads 142
18228 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data
Authors: Huinan Zhang, Wenjie Jiang
Abstract:
Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery data, and the research and utilization of the inner thermal core structure characteristics of tropical cyclones still pose challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse and fine-grained features from microwave brightness temperature data. The data used in this network are obtained from the thermal core structure of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. Firstly, the thermal core information in the pressure direction is comprehensively expressed through the maximum intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone. These images provide a coarse-grained wind speed estimate in the first stage. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and fused with the coarse-grained features from the first stage to obtain the final two-stage network wind speed estimate. Furthermore, to better capture the long-tail distribution characteristics of tropical cyclones, focal loss is used in the coarse-grained loss function of the first stage, and an ordinal regression loss is adopted in the second stage to replace traditional single-value regression. The selected tropical cyclones span 2012 to 2021 and are distributed in the North Atlantic (NA) region. The training set covers 2012 to 2017, the validation set 2018 to 2019, and the test set 2020 to 2021. Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclones into three major categories: pre-hurricane, minor hurricane, and major hurricane, with a classification accuracy of 86.18% and, at this accuracy, an intensity estimation error of 4.01 m/s for the NA region. The results indicate that thermal core data can effectively represent the level and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes with these data.
Keywords: artificial intelligence, deep learning, data mining, remote sensing
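To illustrate the loss the abstract cites for the long-tailed first stage, a minimal NumPy sketch of the standard focal loss, FL(p_t) = -α(1-p_t)^γ log(p_t), is shown below; the α and γ values and the toy probabilities are assumptions, not the paper's settings.

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, alpha=0.25):
    """Mean focal loss for integer class labels given softmax probabilities (N, C)."""
    eps = 1e-12
    p_t = np.clip(probs[np.arange(len(labels)), labels], eps, 1.0)  # prob of true class
    return float(np.mean(-alpha * (1.0 - p_t) ** gamma * np.log(p_t)))

# toy example: the confident (easy) sample contributes far less than the hard one
probs = np.array([[0.90, 0.05, 0.05],   # easy pre-hurricane example
                  [0.40, 0.35, 0.25]])  # hard example near a class boundary
labels = np.array([0, 0])
print(focal_loss(probs, labels))
```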
Procedia PDF Downloads 63
18227 Estimating Destinations of Bus Passengers Using Smart Card Data
Authors: Hasik Lee, Seung-Young Kho
Abstract:
Nowadays, the automatic fare collection (AFC) system is widely used in many countries. However, smart card data from many cities do not contain alighting information, which is necessary to build OD matrices. Therefore, in order to utilize smart card data, the destinations of passengers should be estimated. In this paper, kernel density estimation was used to forecast the probabilities of alighting stations of bus passengers and applied to smart card data from Seoul, Korea, which contain both boarding and alighting information. The method was also validated with actual data. In some cases, the stochastic method was more accurate than the deterministic method. Therefore, it is sufficiently accurate to be used to build OD matrices.
Keywords: destination estimation, kernel density estimation, smart card data, validation
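A minimal sketch of the kernel-density idea described above, using SciPy's gaussian_kde to turn historical alighting positions into alighting probabilities for candidate stops; the route positions and observations are invented for illustration, not Seoul data.

```python
import numpy as np
from scipy.stats import gaussian_kde

# hypothetical historical alighting positions (km along the route) for one boarding stop
alighting_km = np.array([2.1, 2.4, 2.2, 5.8, 6.1, 6.0, 9.5, 2.3, 6.2, 9.4])

kde = gaussian_kde(alighting_km)                  # kernel density over alighting positions
candidate_stops_km = np.array([2.2, 6.0, 9.5, 12.0])

density = kde(candidate_stops_km)
prob = density / density.sum()                    # normalise over the candidate stops
print(dict(zip(candidate_stops_km.tolist(), prob.round(3))))
print("most likely alighting stop:", candidate_stops_km[np.argmax(prob)], "km")
```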
Procedia PDF Downloads 352
18226 Chemometric Estimation of Phytochemicals Affecting the Antioxidant Potential of Lettuce
Authors: Milica Karadzic, Lidija Jevric, Sanja Podunavac-Kuzmanovic, Strahinja Kovacevic, Aleksandra Tepic-Horecki, Zdravko Sumic
Abstract:
In this paper, the influence of the content of six different phytochemicals (phenols, carotenoids, chlorophyll a, chlorophyll b, chlorophyll a + b and vitamin C) on the antioxidant potential of the Murai and Levistro lettuce varieties was evaluated. Variable selection was performed by the generalized pair correlation method (GPCM) as a novel ranking method. This method is used for discrimination between two variables that correlate almost equally with a dependent variable. Fisher’s conditional exact test and McNemar’s test were carried out. The established multiple linear regression (MLR) models were statistically evaluated. Chlorophyll a, chlorophyll a + b and total carotenoid content stand out as the best phytochemicals for antioxidant potential prediction. This was confirmed through both GPCM and MLR; the predictive ability of the obtained MLR models can be used for antioxidant potential estimation of similar lettuce samples. This article is based upon work from the project of the Provincial Secretariat for Science and Technological Development of Vojvodina (No. 114-451-347/2015-02).
Keywords: antioxidant activity, generalized pair correlation method, lettuce, regression analysis
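A minimal sketch of the kind of MLR model described above, fitting antioxidant potential against chlorophyll a, chlorophyll a + b and total carotenoid content by ordinary least squares; all numbers are hypothetical, not the Murai/Levistro measurements.

```python
import numpy as np

# hypothetical predictors: chlorophyll a, chlorophyll a+b, total carotenoids (mg/100 g)
X = np.array([[1.2, 1.9, 0.45],
              [0.9, 1.5, 0.38],
              [1.5, 2.3, 0.52],
              [1.1, 1.8, 0.41],
              [1.4, 2.2, 0.50]])
y = np.array([62.0, 55.0, 71.0, 60.0, 69.0])      # hypothetical antioxidant potential

X1 = np.column_stack([np.ones(len(X)), X])        # add intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)     # ordinary least squares fit
y_hat = X1 @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", coef.round(3), " R^2:", round(r2, 3))
```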
Procedia PDF Downloads 388
18225 Digital Watermarking Using Fractional Transform and (k,n) Halftone Visual Cryptography (HVC)
Authors: R. Rama Kishore, Sunesh Malik
Abstract:
The growth in the use of the internet for different purposes in recent times creates a great threat to the copyright protection of digital images. Digital watermarking is one of the best ways to address this problem. This paper presents a detailed review of the different watermarking techniques and the latest trends in the field, categorized into spatial and transform domain, blind and non-blind methods, visible and non-visible techniques, etc. It also discusses the different optimization techniques used in the field of watermarking in order to improve the robustness and imperceptibility of the methods. Different measures for evaluating the performance of a watermarking algorithm are discussed. At the end, this paper proposes a watermarking algorithm using (k,n) shares of halftone visual cryptography (HVC) instead of (2,2) share cryptography. (k,n) share visual cryptography improves the security of the watermark. As halftoning is a reprographic method, it helps improve the visual quality of the watermark image. The proposed method uses a fractional transformation to improve the robustness of the copyright protection scheme.
Keywords: digital watermarking, fractional transform, halftone, visual cryptography
Procedia PDF Downloads 355
18224 Holy Quran’s Hermeneutics from Self-Referentiality to the Quran by Quran’s Interpretation
Authors: Mohammad Ba’azm
Abstract:
The self-referentiality method, as the missing link of the Qur’an by Qur’an’s interpretation, applies precisely at the level of Quranic vocabulary, but after entering the domain of verses, chapters and the whole Qur’an, it reveals its defects. Self-referentiality cannot show a clear concept of the Quranic scriptures, unlike the Qur’an by Qur’an’s interpretation method, which guides us to comprehension and exact hermeneutics. The Qur’an by Qur’an’s interpretation is a solid way of comprehending the verses of the Qur'an and does not use external resources to provide implications and meanings with different theoretical and practical supports. In this method, the theoretical supports are based on the basics and modalities that support and validate the legitimacy and validity of the interpretive method discussed, and the practical supports relate to the practitioners of the religious elite. The combination of these two methods illustrates the exact understanding of the Qur'an at the level of Quranic verses, chapters, and the whole Qur’an. This study, by examining the word 'book' in the Qur'an, shows the difference between the two methods and the necessity of attaching them in order to attain a desirable level of comprehension of the meaning of the Qur'an. In this article, we have demonstrated that, given the aspects of meaning of the Quranic words, we cannot say that any word has a single exact meaning.
Keywords: Qur’an’s hermeneutic, self-referentiality, the Qur’an by Qur’an’s interpretation, polysemy
Procedia PDF Downloads 188
18223 A Comparative Study between FEM and Meshless Methods
Authors: Jay N. Vyas, Sachin Daxini
Abstract:
Numerical simulation techniques are now widely used in product development and testing instead of expensive, time-consuming and sometimes dangerous laboratory experiments. Numerous numerical methods are available for simulating physical problems in different engineering fields. Grid-based methods, like the Finite Element Method, are extensively used for various kinds of static, dynamic, structural and non-structural analyses during the product development phase. Drawbacks of grid-based methods, in terms of discontinuous secondary field variables and difficulties in dealing with fracture mechanics and large deformation problems, led to the development of a relatively new class of numerical simulation techniques in the last few years, popularly known as Meshless or Meshfree Methods. Meshless Methods are expected to be more adaptive and flexible than the Finite Element Method because domain discretization in a Meshless Method requires only nodes. The present paper introduces Meshless Methods and differentiates them from the Finite Element Method in terms of the following aspects: shape functions used, role of the weight function, techniques to impose essential boundary conditions, integration techniques for discrete system equations, convergence rate, accuracy of solution and computational effort. Capabilities, benefits and limitations of Meshless Methods are discussed and summarized at the end of the paper.
Keywords: numerical simulation, grid-based methods, Finite Element Method, Meshless Methods
Procedia PDF Downloads 389
18222 Critical Appraisal of Different Drought Indices of Drought Prediction and Their Application in KBK Districts of Odisha
Authors: Bibhuti Bhusan Sahoo, Ramakar Jha
Abstract:
Mapping of extreme events (droughts) is one of the adaptation strategies to the consequences of increasing climatic variability and climate change. There is currently no operational practice for forecasting drought. One suggestion is to update the mapping of drought-prone areas for developmental planning. Drought indices play a significant role in drought mitigation. Many scientists have worked on different statistical analyses of droughts and other climatological hazards, and many researchers have studied droughts individually for different sub-divisions or for India as a whole. Very few workers have studied district-wise probabilities on a large scale. In the present study, district-wise drought probabilities over the KBK (Kalahandi-Balangir-Koraput) districts of Odisha, India, which are seriously prone to droughts, have been established using a hydrological drought index and a meteorological drought index along with remote sensing drought indices, to develop a multidirectional approach in the field of drought mitigation. Mapping of moderate and severe drought probabilities for the KBK districts has been done, and regions belonging to different class intervals of drought probability have been demarcated. Such information would be a good tool for planning purposes and as input for modelling, through which more promising results can be achieved.
Keywords: drought indices, KBK districts, proposed drought severity index, SPI
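Since SPI appears among the keywords, a minimal sketch of the usual Standardized Precipitation Index calculation is given below (fit a gamma distribution to the precipitation record and map its CDF onto a standard normal); the precipitation totals are hypothetical, and the handling of zero-rainfall records used in operational SPI is omitted.

```python
import numpy as np
from scipy import stats

# hypothetical seasonal precipitation totals (mm) for one district, one value per year
precip = np.array([812., 640., 955., 700., 520., 880., 610., 760., 430., 690.])

shape, loc, scale = stats.gamma.fit(precip, floc=0)    # gamma fit, location fixed at 0
cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)

spi = stats.norm.ppf(cdf)              # equiprobability transform to a standard normal
print(np.round(spi, 2))                # SPI <= -1.0 ~ moderate, <= -1.5 ~ severe drought
```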
Procedia PDF Downloads 451
18221 Extended Kalman Filter and Markov Chain Monte Carlo Method for Uncertainty Estimation: Application to X-Ray Fluorescence Machine Calibration and Metal Testing
Authors: S. Bouhouche, R. Drai, J. Bast
Abstract:
This paper is concerned with a method for uncertainty evaluation of steel sample content using the X-ray fluorescence method. The considered method of analysis is a comparative technique based on X-ray fluorescence; the calibration step assumes an adequate chemical composition of the analyzed metallic sample. This work proposes a new combined approach using the Kalman filter and Markov chain Monte Carlo (MCMC) for uncertainty estimation of steel content analysis. The Kalman filter algorithm is extended to the model identification of the chemical analysis process using the main factors affecting the analysis results; in this case, the estimated states are reduced to the model parameters. MCMC is a stochastic method that computes the statistical properties of the considered states, such as the probability distribution function (PDF), according to the initial state and the target distribution, using a Monte Carlo simulation algorithm. The conventional approach is based on linear correlation; the uncertainty budget is established for the steel Mn (wt%), Cr (wt%), Ni (wt%) and Mo (wt%) contents, respectively. A comparative study between the conventional procedure and the proposed method is given. This kind of approach is applied to construct an accurate computing procedure for uncertainty measurement.
Keywords: Kalman filter, Markov chain Monte Carlo, X-ray fluorescence calibration and testing, steel content measurement, uncertainty measurement
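A minimal sketch of the MCMC ingredient described above: a random-walk Metropolis sampler for the posterior of a single calibration slope (line intensity → Mn wt%), from which a standard uncertainty can be read off as the posterior spread. The data, noise level and flat prior are assumptions for illustration, not the paper's calibration model.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical calibration points: XRF line intensity vs. certified Mn content (wt%)
intensity = np.array([1.0, 2.1, 2.9, 4.2, 5.1])
mn_wt = np.array([0.21, 0.42, 0.58, 0.83, 1.01])
sigma = 0.02                                    # assumed measurement noise (wt%)

def log_post(slope):                            # flat prior + Gaussian likelihood
    resid = mn_wt - slope * intensity
    return -0.5 * np.sum((resid / sigma) ** 2)

slope, chain = 0.2, []
for _ in range(20000):                          # random-walk Metropolis updates
    prop = slope + rng.normal(0.0, 0.005)
    if np.log(rng.uniform()) < log_post(prop) - log_post(slope):
        slope = prop
    chain.append(slope)

chain = np.array(chain[5000:])                  # discard burn-in
print(f"slope = {chain.mean():.4f} +/- {chain.std():.4f} (posterior mean +/- std)")
```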
Procedia PDF Downloads 283
18220 The Stability of Vegetable-Based Synbiotic Drink during Storage
Authors: Camelia Vizireanu, Daniela Istrati, Alina Georgiana Profir, Rodica Mihaela Dinica
Abstract:
Globally, there is great interest in promoting the consumption of fruit and vegetables to improve health. Due to their content of essential compounds such as antioxidants, important amounts of fruits and vegetables should be included in the daily diet. Juices are good sources of vitamins and can also help increase overall fruit and vegetable consumption. Starting from this trend (the introduction of vegetables and fruits into the daily diet) as well as the desire to diversify the range of functional products for both adults and children, a fermented juice based on root vegetables was made using probiotic microorganisms, with potential beneficial effects in the diet of children, vegetarians and people with lactose intolerance. The three vegetables selected for this study, red beet, carrot and celery, bring a significant contribution of functional compounds such as carotenoids, flavonoids, betalain, vitamins B and C, minerals and fiber. Through fermentation, the functional value of the vegetable juice increases due to the improved stability of these compounds. The combination of probiotic microorganisms and vegetable fibers resulted in a nutrient-rich synbiotic product. The stability of the nutritional and sensory qualities of the obtained synbiotic product was tested throughout its shelf life. The evaluation of the physico-chemical changes of the synbiotic drink during storage confirmed that: (i) vegetable juice enriched with honey and vegetable pulp is an important source of nutritional compounds, especially carbohydrates and fiber; (ii) the microwave treatment used to inhibit pathogenic microflora did not significantly affect the nutritional compounds in the vegetable juice; the vitamin C concentration remained at baseline, and the beta-carotene concentration increased due to increased bioavailability; (iii) fermentation improved the nutritional quality of the vegetable juice by increasing the content of B vitamins, polyphenols and flavonoids, and the product has a good antioxidant capacity throughout its shelf life; (iv) the FTIR and Raman spectra highlighted the results obtained using physico-chemical methods. Based on the analysis of IR absorption frequencies, the most striking bands belong to the frequencies 3330 cm⁻¹, 1636 cm⁻¹ and 1050 cm⁻¹, specific to groups of compounds such as polyphenols, carbohydrates, fatty acids and proteins. Statistical data processing revealed a good correlation between the content of flavonoids, betalain, β-carotene, ascorbic acid and polyphenols, the fermented juice having a stable antioxidant activity. Principal component analysis also showed a negative correlation between the evolution of the concentration of B vitamins and the antioxidant activity. Acknowledgment: This study has been funded by the Francophone University Agency, Project Réseau régional dans le domaine de la santé, la nutrition et la sécurité alimentaire (SaIN), No. at Dunarea de Jos University of Galati 21899/06.09.2017, and by the Sectorial Operational Programme Human Resources Development of the Romanian Ministry of Education, Research, Youth and Sports through Financial Agreement POSDRU/159/1.5/S/132397 ExcelDOC.
Keywords: bioactive compounds, fermentation, synbiotic drink from vegetables, stability during storage
Procedia PDF Downloads 150
18219 The Solution of Nonlinear Partial Differential Equation for The Phenomenon of Instability in Homogeneous Porous Media by Homotopy Analysis Method
Authors: Kajal K. Patel, M. N. Mehta, T. R. Singh
Abstract:
When water is injected into an oil formation area in the secondary oil recovery process, instability occurs near the common interface due to the viscosity difference between the injected water and the native oil. The governing equation gives rise to a nonlinear partial differential equation, and its solution has been obtained by the homotopy analysis method with an appropriate guess value of the solution together with some conditions and standard relations. The solution gives the average cross-sectional area occupied by the schematic fingers during the occurrence of the instability phenomenon. The numerical and graphical presentation has been developed using Maple software.
Keywords: capillary pressure, homotopy analysis method, instability phenomenon, viscosity
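For context, the homotopy analysis method can be summarized by Liao's zeroth-order deformation equation and the resulting homotopy series; this is the generic framework (with auxiliary linear operator L, nonlinear operator N of the governing equation, initial guess u₀ and convergence-control parameter ħ), not the paper's specific fingering equations.

```latex
(1-q)\,\mathcal{L}\!\left[\phi(x,t;q)-u_0(x,t)\right]
  = q\,\hbar\,\mathcal{N}\!\left[\phi(x,t;q)\right], \qquad q\in[0,1],

u(x,t) = u_0(x,t) + \sum_{m=1}^{\infty}
  \frac{1}{m!}\left.\frac{\partial^{m}\phi(x,t;q)}{\partial q^{m}}\right|_{q=0}.
```

At q = 0 the homotopy φ reduces to the initial guess u₀, and at q = 1 it recovers the sought solution u, so the series above delivers successive approximations once ħ is chosen for convergence.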
Procedia PDF Downloads 496
18218 Numerical Solutions of an Option Pricing Rainfall Derivatives Model
Authors: Clarinda Vitorino Nhangumbe, Ercília Sousa
Abstract:
Weather derivatives are financial products used to cover non-catastrophic weather events, with a weather index as the underlying asset. The rainfall weather derivative pricing model is built on the assumption that the rainfall dynamics follow an Ornstein-Uhlenbeck process, and the partial differential equation approach is used to derive a convection-diffusion, two-dimensional, time-dependent partial differential equation whose spatial variables are the rainfall index and rainfall depth. To compute approximate solutions of the partial differential equation, appropriate boundary conditions are suggested, and an explicit numerical method is proposed in order to deal efficiently with the different choices of the coefficients involved in the equation. Being an explicit numerical method, it is conditionally stable, so the stability region of the numerical method and the order of convergence are discussed. The model is tested on real precipitation data.
Keywords: finite differences method, Ornstein-Uhlenbeck process, partial differential equations approach, rainfall derivatives
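A minimal sketch of the modelling assumption stated above: simulating a mean-reverting Ornstein-Uhlenbeck rainfall index with an Euler-Maruyama scheme. The parameters θ, μ, σ and the initial level are illustrative, not calibrated values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

theta, mu, sigma = 0.8, 5.0, 1.5   # assumed mean-reversion speed, long-run mean, volatility
T, n = 1.0, 365                    # one year of daily steps
dt = T / n

x = np.empty(n + 1)
x[0] = 4.0                         # assumed initial rainfall index level
for k in range(n):                 # Euler-Maruyama: dX = theta*(mu - X) dt + sigma dW
    x[k + 1] = x[k] + theta * (mu - x[k]) * dt + sigma * np.sqrt(dt) * rng.normal()

print(f"final level {x[-1]:.2f}, path mean {x.mean():.2f}")
```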
Procedia PDF Downloads 107
18217 Error Amount in Viscoelasticity Analysis Depending on Time Step Size and Method used in ANSYS
Authors: A. Fettahoglu
Abstract:
The theory of viscoelasticity is used by many researchers to represent the behavior of many materials, such as pavements on roads or bridges. Several studies have used analytical methods and rheology to predict the material behavior of simple models. Today, more complex engineering structures are analyzed using the Finite Element Method, in which the material behavior is embedded by means of three-dimensional viscoelastic material laws. As a result, structures of unusual geometry and domain, like the pavements of bridges, can be analyzed by means of the Finite Element Method and three-dimensional viscoelastic equations. In the scope of this study, the rheological models embedded in ANSYS, namely generalized Maxwell elements and Prony series, the two approaches used by ANSYS to represent viscoelastic material behavior, are presented explicitly. Subsequently, a practical problem, which has an analytical solution given in the literature, is used to verify the applicability of the viscoelasticity tool embedded in ANSYS. Finally, the amount of error in the ANSYS results is compared with the analytical results to indicate the influence of the method used and the time step size.
Keywords: generalized Maxwell model, finite element method, Prony series, time step size, viscoelasticity
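For reference, the Prony-series form of the relaxation modulus that a generalized Maxwell model represents is the standard textbook expression below (each (Gᵢ, τᵢ) pair corresponds to one spring-dashpot branch); the specific coefficients used in the study are not reproduced here.

```latex
G(t) \;=\; G_\infty \;+\; \sum_{i=1}^{N} G_i \, e^{-t/\tau_i},
\qquad
G_0 \;=\; G_\infty + \sum_{i=1}^{N} G_i ,
```

where G₀ is the instantaneous modulus, G_∞ the long-term (fully relaxed) modulus, and τᵢ the relaxation times of the individual branches.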
Procedia PDF Downloads 369
18216 Numerical Analysis on the Effect of Abrasive Parameters on Wall Shear Stress and Jet Exit Kinetic Energy
Authors: D. Deepak, N. Yagnesh Sharma
Abstract:
Abrasive water jet (AWJ) machining is a relatively new non-traditional machining process used in the machining of fiber-reinforced composites. The quality of the machined surface depends on the jet exit kinetic energy, which in turn depends on various operating and material parameters. In the present work, the effects of abrasive parameters such as size, concentration and type on the jet kinetic energy are investigated using computational fluid dynamics (CFD). In addition, the effect of these parameters on the wall shear stress developed inside the nozzle is also investigated. It is found that, for the same operating parameters, an increase in the abrasive volume fraction (concentration) results in a significant decrease in the wall shear stress as well as the jet exit kinetic energy. An increase in the abrasive particle size results in a marginal decrease in the jet exit kinetic energy. The numerical simulation also indicates that garnet abrasives produce better jet exit kinetic energy than aluminium oxide and silicon carbide.
Keywords: abrasive water jet machining, jet kinetic energy, operating pressure, wall shear stress, garnet abrasive
Procedia PDF Downloads 377
18215 Hyperspectral Mapping Methods for Differentiating Mangrove Species along Karachi Coast
Authors: Sher Muhammad, Mirza Muhammad Waqar
Abstract:
It is necessary to monitor and identify mangrove types and their spatial extent near coastal areas because they play an important role in the coastal ecosystem and environmental protection. This research aims at identifying and mapping mangrove types along the Karachi coast, ranging from 24.79 to 24.85 degrees in latitude and 66.91 to 66.97 degrees in longitude, using hyperspectral remote sensing data and techniques. An image acquired in February 2012 through the Hyperion sensor has been used for this research. Image preprocessing includes geometric and radiometric correction, followed by Minimum Noise Fraction (MNF) and Pixel Purity Index (PPI). The output of MNF and PPI has been analyzed by visualizing it in n dimensions for end-member extraction. Well-distributed clusters on the n-dimensional scatter plot have been selected with the region of interest (ROI) tool as end members. These end members have been used as input for the classification techniques applied to identify and map mangrove species, including Spectral Angle Mapper (SAM), Spectral Feature Fitting (SFF) and Spectral Information Divergence (SID). Only two types of mangroves, namely Avicennia marina (white mangrove) and Avicennia germinans (black mangrove), have been observed throughout the study area.
Keywords: mangrove, hyperspectral, Hyperion, SAM, SFF, SID
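A minimal sketch of the Spectral Angle Mapper rule used above: each pixel is assigned to the endmember with the smallest spectral angle θ = arccos(x·r / (‖x‖‖r‖)). The five-band reflectance spectra below are invented for illustration; real Hyperion pixels have over two hundred bands.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and an endmember spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# hypothetical reflectance spectra (5 bands) for illustration only
pixel = np.array([0.12, 0.18, 0.25, 0.40, 0.38])
endmembers = {"Avicennia marina": np.array([0.11, 0.17, 0.26, 0.42, 0.37]),
              "Avicennia germinans": np.array([0.15, 0.14, 0.20, 0.30, 0.45])}

angles = {name: spectral_angle(pixel, ref) for name, ref in endmembers.items()}
print(angles, "->", min(angles, key=angles.get))   # assign pixel to the smallest angle
```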
Procedia PDF Downloads 362
18214 The Forensic Analysis of Engravers' Handwriting
Authors: Olivia Rybak-Karkosz
Abstract:
The purpose of this paper is to present the results of scientific research using forensic handwriting analysis. It was conducted to verify the stability and lability of the handwriting of engravers and to check whether engravers transfer their traits from handwriting to plates and other surfaces they rework. The research methodology consisted of compiling representative samples of engravers' signatures written on paper with a ballpoint pen and signatures engraved on other surfaces. The forensic handwriting analysis was conducted using the graphic-comparative method (graphic method), and all traits were analysed. The paper contains a concluding statement on the similarities and differences between the samples.
Keywords: artist’s signatures, engraving, forensic handwriting analysis, graphic-comparative method
Procedia PDF Downloads 102
18213 Visualization of Flow Behaviour in Micro-Cavities during Micro Injection Moulding
Authors: Reza Gheisari, Paulo J. Bartolo, Nicholas Goddard
Abstract:
Polymeric micro-cantilevers are rapidly becoming popular for MEMS applications such as chemo- and bio-sensing, as well as purely electromechanical applications such as micro-relays. Polymer materials present suitable physical and chemical properties combined with low-cost mass production. Hence, micro-cantilevers made of polymers offer much greater biocompatibility and adaptability to rapid prototyping, along with adequate mechanical properties. This research studies the effects of three process factors and one size factor on the filling behaviour in a micro cavity, and the role of each in the replication of micro parts using different polymer materials, i.e., polypropylene (PP) SABIC 56M10 and acrylonitrile butadiene styrene (ABS) Magnum 8434. In particular, the following factors are considered: barrel temperature, mould temperature, injection speed and the thickness of the micro features. The study revealed that the barrel temperature and the injection speed are the key factors affecting the flow length of micro features replicated in PP and ABS. For both materials, an increase in feature size improves the melt flow. However, the melt fill of micro features does not increase linearly with their thickness.
Keywords: flow length, micro cantilevers, micro injection moulding, microfabrication
Procedia PDF Downloads 395
18212 Impact of Depreciation Technique on Taxable Income and Financial Performance of Quoted Consumer Goods Company in Nigeria
Authors: Ibrahim Ali, Adamu Danlami Ahmed
Abstract:
This study examines the impact of depreciation on the taxable income and financial performance of consumer goods companies quoted on the Nigerian Stock Exchange. The study adopts an ex-post facto research design. Data were collected from a secondary source. The findings of the study suggest that the method of depreciation adopted in an organization influences the taxable profit. Depreciation techniques can be depressive, accelerative or linear. It is also recommended that consumer goods companies adjust their method of depreciation to make sure an appropriate method is adopted. This will go a long way towards revitalizing their taxable profit.
Keywords: accelerated, linear, depressive, depreciation
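A minimal sketch of how the choice of depreciation technique shifts taxable income, comparing a linear (straight-line) schedule with an accelerated (double-declining-balance) one; the asset cost, residual value, life, earnings and tax rate are hypothetical, not drawn from the sampled companies.

```python
# hypothetical asset and earnings figures (e.g. in naira)
cost, residual, life, tax_rate = 1_000_000, 100_000, 5, 0.30
ebitda = 400_000                                    # earnings before depreciation and tax

straight_line = [(cost - residual) / life] * life   # linear method: equal annual charge

declining, book = [], cost                          # accelerated: double-declining balance
rate = 2 / life
for _ in range(life):
    dep = min(book * rate, book - residual)         # never depreciate below residual value
    declining.append(dep)
    book -= dep

for name, schedule in (("linear", straight_line), ("accelerated", declining)):
    taxable_y1 = ebitda - schedule[0]
    print(f"{name:11s} year-1 depreciation {schedule[0]:>9,.0f}  "
          f"taxable income {taxable_y1:>9,.0f}  tax {taxable_y1 * tax_rate:>9,.0f}")
```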
Procedia PDF Downloads 285
18211 Stability-Indicating High-Performance Thin-Layer Chromatography Method for Estimation of Naftopidil
Authors: P. S. Jain, K. D. Bobade, S. J. Surana
Abstract:
A simple, selective, precise and stability-indicating high-performance thin-layer chromatographic method for the analysis of naftopidil, both in bulk and in pharmaceutical formulation, has been developed and validated. The method employed HPTLC aluminium plates precoated with silica gel as the stationary phase. The solvent system consisted of hexane: ethyl acetate: glacial acetic acid (4:4:2 v/v). The system was found to give a compact spot for naftopidil (Rf value of 0.43±0.02). Densitometric analysis of naftopidil was carried out in the absorbance mode at 253 nm. The linear regression analysis data for the calibration plots showed a good linear relationship, with r² = 0.999±0.0001 with respect to peak area, in the concentration range of 200-1200 ng per spot. The method was validated for precision, recovery and robustness. The limits of detection and quantification were 20.35 and 61.68 ng per spot, respectively. Naftopidil was subjected to acid and alkali hydrolysis, oxidation and thermal degradation. The drug undergoes degradation under acidic, basic, oxidative and thermal conditions, which indicates that it is susceptible to acid, base, oxidation and heat. The degraded product was well resolved from the pure drug, with a significantly different Rf value. Statistical analysis proves that the method is repeatable, selective and accurate for the estimation of the investigated drug. The proposed HPTLC method can be applied for the identification and quantitative determination of naftopidil in bulk drug and pharmaceutical formulation.
Keywords: naftopidil, HPTLC, validation, stability, degradation
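A minimal sketch of how detection and quantification limits of the kind reported above are commonly derived from a linear calibration (ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S, with σ the residual standard deviation and S the slope); the peak-area values below are hypothetical, and the paper's exact procedure may differ.

```python
import numpy as np

# hypothetical HPTLC calibration: amount per spot (ng) vs. peak area (arbitrary units)
amount = np.array([200, 400, 600, 800, 1000, 1200], dtype=float)
area = np.array([1510, 2980, 4490, 6020, 7480, 9010], dtype=float)

slope, intercept = np.polyfit(amount, area, 1)     # linear calibration fit
residuals = area - (slope * amount + intercept)
sigma = residuals.std(ddof=2)                      # SD of the regression residuals

lod = 3.3 * sigma / slope                          # ICH Q2(R1)-style estimates
loq = 10.0 * sigma / slope
print(f"slope = {slope:.3f}, LOD = {lod:.1f} ng/spot, LOQ = {loq:.1f} ng/spot")
```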
Procedia PDF Downloads 400
18210 Calibration of Discrete Element Method Parameters for Modelling DRI Pellets Flow
Authors: A. Hossein Madadi-Najafabadi, Masoud Nasiri
Abstract:
The discrete element method (DEM) is a powerful technique for numerically modeling the flow of granular materials such as direct reduced iron (DRI). It enables the study of processes and equipment related to the production and handling of the material. However, the characteristics and properties of the granules have to be adjusted precisely to achieve reliable results in a DEM simulation. The main properties for a DEM simulation are size distribution, density, Young's modulus, Poisson's ratio and the contact coefficients of restitution, rolling friction and sliding friction. In the present paper, these properties are determined for the DEM simulation of DRI pellets. A reliable DEM simulation would contribute to optimizing the handling system of DRI in an iron-making plant. Among the mentioned properties, Young's modulus is the most important parameter, which is usually hard to obtain for particulate solids. Here, a special method is utilized to precisely determine this parameter for DRI.
Keywords: discrete element method, direct reduced iron, simulation parameters, granular material
Procedia PDF Downloads 180
18209 Developing Digital Twins of Steel Hull Processes
Authors: V. Ložar, N. Hadžić, T. Opetuk, R. Keser
Abstract:
The development of digital twins strongly depends on efficient algorithms and their capability to mirror real-life processes. Nowadays, such efforts are required to establish the factories of the future, which face the new demands of custom-made production. Ship hull processes face these challenges too. Therefore, it is important to implement design and evaluation approaches based on production system engineering. In this study, the recently developed finite state method is employed to describe the steel hull process as a platform for the implementation of digital twinning technology. The application is justified by comparing the finite state method with the analytical approach. The method is employed to rebuild a model of a real shipyard ship hull process using a combination of serial and splitting lines. Key performance indicators such as the production rate, work in process, and the probabilities of starvation and blockage are calculated and compared with the corresponding results obtained through a simulation approach using the software tool Enterprise Dynamics. This study confirms that the finite state method is a suitable tool for digital twinning applications. The conclusion highlights the advantages and disadvantages of the methods employed in this context.
Keywords: digital twin, finite state method, production system engineering, shipyard
Procedia PDF Downloads 99
18208 Development and Validation Method for Quantitative Determination of Rifampicin in Human Plasma and Its Application in Bioequivalence Test
Authors: Endang Lukitaningsih, Fathul Jannah, Arief R. Hakim, Ratna D. Puspita, Zullies Ikawati
Abstract:
Rifampicin (RIF) is a semisynthetic antibiotic derivative of rifamycin B produced by Streptomyces mediterranei. RIF has been used worldwide as a first-line drug prescribed throughout tuberculosis therapy. This study aims to develop and validate an HPLC method coupled with UV detection for the determination of rifampicin in spiked human plasma and its application in a bioequivalence study. The chromatographic separation was achieved on an RP-C18 column (LachromHitachi, 250 x 4.6 mm, 5 μm), utilizing a mobile phase of phosphate buffer/acetonitrile (55:45, v/v, pH 6.8 ± 0.1) at a flow of 1.5 mL/min. Detection was carried out at 337 nm using a spectrophotometer. The developed method was statistically validated for linearity, accuracy, limit of detection, limit of quantitation, precision and specificity. The specificity of the method was ascertained by comparing chromatograms of blank plasma and plasma containing rifampicin; the matrix and rifampicin were well separated. The limit of detection and limit of quantification were 0.7 µg/mL and 2.3 µg/mL, respectively. The regression curve of the standard was linear (r > 0.999) over a concentration range of 20.0-100.0 µg/mL. The mean recovery of the method was 96.68 ± 8.06%. Both intraday and interday precision data showed reproducibility (R.S.D. 2.98% and 1.13%, respectively). Therefore, the method can be used for routine analysis of rifampicin in human plasma and in bioequivalence studies. The validated method was successfully applied in a pharmacokinetic and bioequivalence study of a rifampicin tablet in a limited number of subjects (under Ethical Clearance No. KE/FK/6201/EC/2015). The mean values of Cmax, Tmax, AUC(0-24) and AUC(0-∞) for the test formulation of rifampicin were 5.81 ± 0.88 µg/mL, 1.25 hours, 29.16 ± 4.05 µg/mL·h and 29.41 ± 4.07 µg/mL·h, respectively. For the reference formulation, the values were 5.04 ± 0.54 µg/mL, 1.31 hours, 27.20 ± 3.98 µg/mL·h and 27.49 ± 4.01 µg/mL·h. From the bioequivalence study, the 90% CIs for the test/reference formulation ratio of the logarithmic transformations of Cmax and AUC(0-24) were 97.96-129.48% and 99.13-120.02%, respectively. According to the bioequivalence test guidelines of the European Commission-European Medicines Agency, it can be concluded that the test formulation of rifampicin is bioequivalent to the reference formulation.
Keywords: validation, HPLC, plasma, bioequivalence
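A minimal sketch of the 90% confidence-interval test applied above, on log-transformed Cmax with the usual 80-125% acceptance range. A real crossover analysis uses an ANOVA with sequence and period effects, so this paired-difference version is a simplification, and the per-subject values are hypothetical.

```python
import numpy as np
from scipy import stats

# hypothetical per-subject Cmax (µg/mL) for test (T) and reference (R) formulations
cmax_T = np.array([5.2, 6.1, 5.7, 6.4, 5.3, 6.0, 5.9, 6.3])
cmax_R = np.array([4.8, 5.5, 5.1, 5.6, 4.7, 5.2, 5.0, 5.4])

diff = np.log(cmax_T) - np.log(cmax_R)            # within-subject log differences
n = len(diff)
se = diff.std(ddof=1) / np.sqrt(n)
t_crit = stats.t.ppf(0.95, n - 1)                 # 90% CI <=> two one-sided tests at 5%

ratio = np.exp(diff.mean()) * 100
lo = np.exp(diff.mean() - t_crit * se) * 100
hi = np.exp(diff.mean() + t_crit * se) * 100
print(f"T/R ratio {ratio:.1f}%, 90% CI [{lo:.1f}%, {hi:.1f}%]")
print("within 80-125% acceptance range:", lo >= 80 and hi <= 125)
```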
Procedia PDF Downloads 291
18207 Use of Cobalt Graphene in Place of Platinum in Catalytic Converter
Authors: V. Srinivasan, S. M. Sriram Nandan
Abstract:
Today, the most important problem faced by mankind in the modern world is pollution, which is increasing at a very high rate. It affects the ecosystem of the environment and also contributes to the greenhouse effect. Exhaust gases from automobiles are a major cause of pollution. Automobiles have increased to a large number, which has raised the pollution of our world to an alarming rate. There are two methods of controlling pollution, namely the pre-pollution control method and the post-pollution control method. This paper is based on controlling emissions by the post-pollution control method. The ratio of the surface area of nanoparticles to their volume is inversely proportional to the radius of the nanoparticles. So, decreasing the radius increases this ratio, resulting in an increased rate of reaction, and thus the concentration of the pollutants is decreased. To achieve this objective, the use of a cobalt-graphene element is proposed. The proposed method mainly aims to avoid the cost of platinum, as it is expensive. The cobalt-graphene catalyst also has a longer life than platinum-based catalysts.
Keywords: automobile emissions, catalytic converter, cobalt-graphene, replacement of platinum
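The surface-to-volume argument made above follows directly from the geometry of a spherical particle of radius r:

```latex
\frac{S}{V} \;=\; \frac{4\pi r^{2}}{\tfrac{4}{3}\pi r^{3}} \;=\; \frac{3}{r},
```

so halving the particle radius doubles the catalytically available surface area per unit volume of catalyst material.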
Procedia PDF Downloads 390
18206 Design of a Vehicle Door Structure Based on Finite Element Method
Authors: Tawanda Mushiri, Charles Mbohwa
Abstract:
The performance of the door assembly is very significant for vehicle design. In the present paper, the finite element method is used in the development process of the door assembly. The stiffness, strength, modal characteristics and anti-extrusion performance of a newly developed passenger vehicle door assembly are calculated and evaluated with several commercial finite element analysis software packages. The structural problems discovered by the FE analysis have been corrected, and the expected door structure performance targets of this new vehicle were finally achieved. The issue in focus is to predict the performance of the door assembly with powerful finite element analysis software and to optimize the structure to meet the design targets. It is observed that this method can be used to forecast the performance of a vehicle door efficiently at the design stage. In order to reduce lead time and cost in vehicle product development, more of the development will be carried out virtually.
Keywords: vehicle door, structure, strength, stiffness, modal characteristic, anti-extrusion, finite element method
Procedia PDF Downloads 429
18205 A Review of Different Studies on Hidden Markov Models for Multi-Temporal Satellite Images: Stationarity and Non-Stationarity Issues
Authors: Ali Ben Abbes, Imed Riadh Farah
Abstract:
Due to considerable advances in Multi-Temporal Satellite Images (MTSI), remote sensing applications have become more accurate. Recently, many advances in modeling MTSI have been developed using various models. The purpose of this article is to present an overview of studies using the Hidden Markov Model (HMM). First of all, we provide background on the use of HMMs and their applications in this context. A comparison of the different works is discussed, and possible areas and challenges are highlighted. Secondly, we discuss the differences between vegetation monitoring and urban growth applications. Nevertheless, most research efforts have used only stationary data. From another point of view, in this paper we describe a new non-stationary HMM that is defined with a set of components of the time series, e.g., seasonal, trend and random. This new approach gives more accurate results and improves the applicability of the HMM in modeling non-stationary data series. In order to assess the performance of the HMM, different experiments are carried out using Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI time series of the northwestern region of Tunisia and Landsat time series of Tres Cantos, Madrid, in Spain.
Keywords: multi-temporal satellite image, HMM, non-stationarity, vegetation, urban
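A minimal sketch of the HMM machinery the review covers: the forward algorithm computing the likelihood of a discretised NDVI sequence under a two-state model. The states, transition and emission probabilities are invented for illustration; the non-stationary variant described above would additionally decompose the series into seasonal, trend and random parts before fitting.

```python
import numpy as np

# hypothetical 2-state HMM (e.g. "vegetated" / "bare soil") with 3 discretised NDVI levels
pi = np.array([0.6, 0.4])            # initial state distribution
A = np.array([[0.8, 0.2],            # state transition matrix
              [0.3, 0.7]])
B = np.array([[0.1, 0.3, 0.6],       # emission probabilities: P(NDVI level | state)
              [0.6, 0.3, 0.1]])

obs = [2, 2, 1, 0, 0]                # observed (discretised) NDVI sequence

alpha = pi * B[:, obs[0]]            # forward algorithm: alpha_t(i) = P(o_1..o_t, s_t = i)
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
print("sequence likelihood:", alpha.sum())
```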
Procedia PDF Downloads 354
18204 Comparative Analysis of Predictive Models for Customer Churn Prediction in the Telecommunication Industry
Authors: Deepika Christopher, Garima Anand
Abstract:
To determine the best model for churn prediction in the telecom industry, this paper compares 11 machine learning algorithms, namely Logistic Regression, Support Vector Machine, Random Forest, Decision Tree, XGBoost, LightGBM, CatBoost, AdaBoost, Extra Trees, Deep Neural Network, and a Hybrid Model (MLPClassifier). It also aims to pinpoint the top three factors that lead to customer churn and conducts customer segmentation to identify vulnerable groups. According to the data, the Logistic Regression model performs the best, with an F1 score of 0.6215, 81.76% accuracy, 68.95% precision, and 56.57% recall. The top three attributes that cause churn are found to be tenure, Internet Service Fiber optic, and Internet Service DSL, while the top three models that perform best in this article are Logistic Regression, Deep Neural Network, and AdaBoost. The K-means algorithm is applied to establish and analyze four different customer clusters. This study has effectively identified customers at risk of churn, and its results may be utilized to develop and execute strategies that lower customer attrition.
Keywords: attrition, retention, predictive modeling, customer segmentation, telecommunications
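A minimal sketch of the winning pipeline described above, a scikit-learn logistic regression scored with accuracy, precision, recall and F1, run on synthetic data built from the three churn drivers the paper identifies (tenure and internet-service type); the generating coefficients are invented, so the printed metrics will not match the paper's figures.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000

# synthetic features: tenure (months), fiber-optic flag, DSL flag -- illustrative only
X = np.column_stack([rng.integers(1, 72, n),
                     rng.integers(0, 2, n),
                     rng.integers(0, 2, n)]).astype(float)
logit = -0.04 * X[:, 0] + 1.5 * X[:, 1] + 0.5 * X[:, 2]   # assumed churn log-odds
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"accuracy {accuracy_score(y_te, pred):.3f}  precision {precision_score(y_te, pred):.3f}  "
      f"recall {recall_score(y_te, pred):.3f}  F1 {f1_score(y_te, pred):.3f}")
```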
Procedia PDF Downloads 57