Search results for: the probability integral transform
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3486

2886 A Combination of Anisotropic Diffusion and Sobel Operator to Enhance the Performance of the Morphological Component Analysis for Automatic Crack Detection

Authors: Ankur Dixit, Hiroaki Wagatsuma

Abstract:

The detection of cracks on concrete bridges is an important and recurring task in civil engineering. Conventionally, bridges are inspected visually by humans to maintain their quality and reliability, but this process is slow and costly. To overcome these limitations, we used a drone with a digital camera to capture images of a bridge deck, and these images were processed by morphological component analysis (MCA). MCA is a powerful application of sparse coding that explores the separation of an image into components. In this paper, MCA is used to decompose the image into coarse and fine components with two dictionaries, namely anisotropic diffusion and the wavelet transform. Anisotropic diffusion is an adaptive smoothing process that adjusts the diffusion coefficient using gray level and gradient as features. Cracks in the image are enhanced by subtracting the diffused coarse image from the original image, and the result is treated with a Sobel edge detector and binary filtering to exhibit the cracks clearly. Our results demonstrate that the proposed MCA framework, using anisotropic diffusion followed by the Sobel operator and binary filtering, may contribute to the automation of crack detection even under severe open-field conditions such as bridge decks.
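The enhancement chain described above can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' implementation: a Perona-Malik-style diffusion stands in for the coarse dictionary, and the synthetic image, iteration count, kappa, and threshold are all assumptions chosen for the demo.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=100.0, lam=0.2):
    """Perona-Malik diffusion: smooths homogeneous regions while the
    edge-stopping function lowers diffusion across strong gradients."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four neighbours (wrap-around borders)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        g = lambda d: np.exp(-((d / kappa) ** 2))  # edge-stopping function
        u = u + lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

def _shift(a, dy, dx):
    return np.roll(np.roll(a, dy, axis=0), dx, axis=1)

def sobel_magnitude(u):
    gx = (_shift(u, -1, -1) + 2 * _shift(u, 0, -1) + _shift(u, 1, -1)
          - _shift(u, -1, 1) - 2 * _shift(u, 0, 1) - _shift(u, 1, 1))
    gy = (_shift(u, -1, -1) + 2 * _shift(u, -1, 0) + _shift(u, -1, 1)
          - _shift(u, 1, -1) - 2 * _shift(u, 1, 0) - _shift(u, 1, 1))
    return np.hypot(gx, gy)

# synthetic 'bridge deck': bright background with one dark crack column
img = np.full((32, 32), 200.0)
img[:, 16] = 50.0
coarse = anisotropic_diffusion(img)            # diffused coarse component
detail = img - coarse                          # fine component carrying the crack
edges = sobel_magnitude(detail)
mask = edges > edges.mean() + 2 * edges.std()  # binary filtering
```

The binary mask concentrates around the crack column while the flat background stays empty, which is the behaviour the abstract relies on.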

Keywords: anisotropic diffusion, coarse component, fine component, MCA, Sobel edge detector and wavelet transform

Procedia PDF Downloads 170
2885 Rapid Discrimination of Porcine and Tilapia Fish Gelatin by Fourier Transform Infrared-Attenuated Total Reflection Combined with Two-Dimensional Infrared Correlation Analysis

Authors: Norhidayu Muhamad Zain

Abstract:

Gelatin, a purified protein derived mostly from porcine and bovine sources, is used widely in the food manufacturing, pharmaceutical, and cosmetic industries. However, the presence of any porcine-related product is strictly forbidden for Muslim and Jewish consumption. Therefore, analytical methods offering reliable results to differentiate the sources of gelatin are needed. The aim of this study was to differentiate the sources of gelatin (porcine and tilapia fish) using Fourier transform infrared-attenuated total reflection (FTIR-ATR) spectroscopy combined with two-dimensional infrared (2DIR) correlation analysis. Porcine gelatin (PG) and tilapia fish gelatin (FG) samples were diluted in distilled water at concentrations ranging from 4-20% (w/v). The samples were then analysed using FTIR-ATR and 2DIR correlation software. The results showed a significant difference in the pattern map of the synchronous spectra in the region of 1000 cm⁻¹ to 1100 cm⁻¹ between PG and FG samples. The auto peak at 1080 cm⁻¹, attributed to the C-O functional group, was observed at high intensity in PG samples compared to FG samples. Meanwhile, two auto peaks (1080 cm⁻¹ and 1030 cm⁻¹) at lower intensity were identified in FG samples. In addition, using 2D correlation analysis, the originally broad water OH bands in the 1D IR spectra could be effectively differentiated into six auto peaks located at 3630, 3340, 3230, 3065, 2950, and 2885 cm⁻¹ for PG samples and five auto peaks at 3630, 3330, 3230, 3060, and 2940 cm⁻¹ for FG samples. Based on the rule proposed by Noda, the sequence of the spectral changes in PG samples is as follows: NH₃⁺ amino acid > CH₂ and CH₃ aliphatic > OH stretch > carboxylic acid OH stretch > NH in secondary amide > NH in primary amide. In contrast, the sequence was in exactly the opposite direction for FG samples, and thus the two samples give different 2D correlation spectra in the range of 2800 cm⁻¹ to 3700 cm⁻¹.
This method may provide a rapid determination of gelatin source for application in food, pharmaceutical, and cosmetic products.
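The synchronous 2D correlation map that produces the auto peaks above follows Noda's standard construction: mean-center the concentration series of spectra and take the covariance across wavenumber channels. A small numpy sketch (the two-band synthetic spectra and band positions are invented for illustration):

```python
import numpy as np

def synchronous_spectrum(spectra):
    """Noda's synchronous 2D correlation spectrum.

    spectra: shape (m, n) -- m perturbation steps (here, gelatin
    concentrations) by n wavenumber channels. Auto peaks appear on the
    diagonal at channels whose intensity varies with the perturbation."""
    dynamic = spectra - spectra.mean(axis=0)          # dynamic spectra
    return dynamic.T @ dynamic / (spectra.shape[0] - 1)

# synthetic series: a strong band at channel 30 and a weak one at 70,
# both scaling with concentration, mimicking the 4-20% (w/v) dilutions
x = np.arange(100)
band = lambda c: np.exp(-((x - c) / 3.0) ** 2)
concentrations = np.array([4.0, 8.0, 12.0, 16.0, 20.0])
spectra = np.outer(concentrations, band(30) + 0.5 * band(70)) + 1.0  # + baseline
phi = synchronous_spectrum(spectra)
auto_peaks = np.diag(phi)   # the diagonal trace inspected for auto peaks
```

Bands that change in phase with each other produce positive cross peaks off the diagonal, which is how correlated band pairs like 1080/1030 cm⁻¹ are read off the map.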

Keywords: two-dimensional infrared (2DIR) correlation analysis, Fourier transform infrared-attenuated total reflection (FTIR-ATR), porcine gelatin, tilapia fish gelatin

Procedia PDF Downloads 243
2884 Selecting the Best Risk Exposure to Assess Collision Risks in Container Terminals

Authors: Mohammad Ali Hasanzadeh, Thierry Van Elslander, Eddy Van De Voorde

Abstract:

About 90 percent of world merchandise trade by volume is carried by sea. Maritime transport remains the backbone of international trade and globalization, and all seaborne goods pass through at least two ports, as origin and destination. Among seaborne cargoes, container traffic is a prosperous market, accounting for about 16% of volume. Although containerized cargo represents less tonnage, containers carry the highest-value cargo of all. That is why efficient handling of containers in ports is very important. Accidents are a foremost cause of port inefficiency and of surges in total transport cost. Although different port safety management systems (PSMS) are in place, statistics on port accidents show that numerous accidents occur in ports. Some of them claim people's lives; others damage goods, vessels, port equipment, and/or the environment. Several accident investigations show that the most common accidents take place during transport operations, accounting for as much as 68.6% of all events; therefore, providing a safer workplace depends on reducing collision risk. In order to quantify risks in the port area, different variables can be used as exposure measurements. One of the main motives for defining and using exposure in studies related to infrastructure is to account for differences in intensity of use, so as to make comparisons meaningful. In various studies of container handling in ports and intermodal terminals, different risk exposures, and the likelihood of each event, have been selected. Vehicle collision within the port area (10⁻⁷ per kilometer of vehicle distance travelled) and dropping containers from cranes, forklift trucks, or rail-mounted gantries (1 × 10⁻⁵ per lift) are some examples.
In line with the objective of the current research, three categories of accidents were selected for collision risk assessment: fall of a container during ship-to-shore operation, dropping of a container during transfer operation, and collision between vehicles and objects within the terminal area. The consequences, exposure, and probability were then identified for each accident. Reducing collision risks therefore relies profoundly on selecting the right risk exposures and probabilities for the chosen accidents; to prevent collision accidents in container terminals, and within the framework of risk calculations, such risk exposures and probabilities can also be useful in assessing the effectiveness of safety programs in ports.
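The role of the exposure measure can be made concrete by multiplying each per-unit likelihood by an annual exposure volume. The per-unit rates below are the ones quoted in the abstract; the annual exposure figures are invented purely for illustration:

```python
# (rate per exposure unit, hypothetical annual exposure, unit)
accident_rates = {
    "vehicle collision in port area": (1e-7, 2_000_000, "vehicle-km/year"),
    "container dropped during lift": (1e-5, 300_000, "lifts/year"),
}

def expected_annual_events(rate_per_unit, annual_exposure):
    """Expected accident frequency = likelihood per exposure unit x exposure."""
    return rate_per_unit * annual_exposure

for name, (rate, exposure, unit) in accident_rates.items():
    freq = expected_annual_events(rate, exposure)
    print(f"{name}: {freq:.2f} expected events/year ({exposure:,} {unit})")
```

Even with these made-up volumes, the comparison shows why the choice of exposure matters: a much smaller rate applied to a much larger exposure can dominate the expected accident count.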

Keywords: container terminal, collision, seaborne trade, risk exposure, risk probability

Procedia PDF Downloads 370
2883 Teleconnection between El Nino-Southern Oscillation and Seasonal Flow of the Surma River and Possibilities of Long Range Flood Forecasting

Authors: Monika Saha, A. T. M. Hasan Zobeyer, Nasreen Jahan

Abstract:

El Nino-Southern Oscillation (ENSO) is the interaction between the atmosphere and the ocean in the tropical Pacific, which causes alternating warm and cold conditions in the tropical central and eastern Pacific Ocean. Due to the impact of climate change, ENSO events have become stronger in recent times, and it is therefore very important to study the influence of ENSO in climate studies. Bangladesh, lying in a low deltaic floodplain, experiences the worst consequences of flooding every year. To reduce the catastrophe of severe flooding events, non-structural measures such as flood forecasting can help in taking adequate precautions and steps. Forecasting seasonal floods with a lead time of several months is a key component of flood damage control and water management. The objective of this research is to identify the strength of the teleconnection between ENSO and the river flow of the Surma and to examine the potential for long-lead flood forecasting in the wet season. The Surma is one of the major rivers of Bangladesh and part of the Surma-Meghna river system. In this research, sea surface temperature (SST) is used as the ENSO index, and the lead time is at least a few months, which is greater than the basin response time. The teleconnection was assessed through correlation analysis between the July-August-September (JAS) flow of the Surma and the SST of the Nino 4 region for the corresponding months. The cumulative frequency distribution of the standardized JAS flow of the Surma was also determined as part of assessing the teleconnection. Discharge data of the Surma river from 1975 to 2015 were used in this analysis, and a marked increase in the correlation coefficient between flow and ENSO has been observed from 1985 onward. From the cumulative frequency distribution of the standardized JAS flow, it was found that in any year the JAS flow has approximately a 50% probability of exceeding the long-term average JAS flow.
During El Nino years (the warm episode of ENSO) this probability of exceedance drops to 23%, while in La Nina years (the cold episode of ENSO) it increases to 78%. Discriminant analysis, known as 'categoric prediction', was performed to assess the possibility of long-lead flood forecasting. It categorizes the flow data (high, average, and low) based on the classification of predicted SST (warm, normal, and cold). From the discriminant analysis, it was found that for the Surma river the probability of a high flood in the cold period is 75% and the probability of a low flood in the warm period is 33%. A synoptic parameter, the forecasting index (FI), was also calculated to judge forecast skill and to compare different forecasts. This study will help the concerned authorities and stakeholders take long-term water resources decisions and formulate river basin management policies that reduce possible damage to life, agriculture, and property.
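The core of the teleconnection assessment - a correlation coefficient plus exceedance probabilities conditioned on ENSO phase - can be sketched directly. The synthetic 41-year record below only reproduces the sign of the reported relationship (warm episodes, low flow); the SST thresholds for the warm/cold phases are an assumption:

```python
import numpy as np

def enso_flow_teleconnection(jas_flow, nino4_sst, warm=0.5, cold=-0.5):
    """Correlation between seasonal flow and SST, plus probabilities of
    exceeding the long-term mean flow conditioned on ENSO phase.
    The +/-0.5 degC anomaly thresholds are illustrative assumptions."""
    above = jas_flow > jas_flow.mean()
    r = np.corrcoef(jas_flow, nino4_sst)[0, 1]
    p_all = above.mean()
    p_el_nino = above[nino4_sst >= warm].mean()   # warm episodes
    p_la_nina = above[nino4_sst <= cold].mean()   # cold episodes
    return r, p_all, p_el_nino, p_la_nina

# synthetic record mimicking the direction of the reported link
rng = np.random.default_rng(0)
sst = np.linspace(-1.5, 1.5, 41)                  # Nino 4 anomaly (invented)
flow = 1000.0 - 200.0 * sst + rng.normal(0, 20, 41)
r, p_all, p_en, p_ln = enso_flow_teleconnection(flow, sst)
```

With a negative teleconnection, the exceedance probability drops in warm years and rises in cold years, matching the 23%/78% pattern reported for the Surma.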

Keywords: El Nino-Southern Oscillation, sea surface temperature, surma river, teleconnection, cumulative frequency distribution, discriminant analysis, forecasting index

Procedia PDF Downloads 144
2882 Fault Detection and Diagnosis of Broken Bar Problem in Induction Motors Based on Wavelet Analysis and EMD Method: Case Study of Mobarakeh Steel Company in Iran

Authors: M. Ahmadi, M. Kafil, H. Ebrahimi

Abstract:

Induction motors play a significant role in industry. Condition monitoring (CM) of this equipment has gained remarkable importance in recent years due to huge production losses, substantial imposed costs, and increased levels of vulnerability, risk, and uncertainty. Motor current signature analysis (MCSA) is one of the most important techniques in CM and can be used to detect broken rotor bars. Signal processing methods such as the fast Fourier transform (FFT), the wavelet transform, and empirical mode decomposition (EMD) are used to analyze MCSA output data. In this study, these signal processing methods are applied to broken bar detection in induction motors at the Mobarakeh Steel Company. Based on the wavelet transform method, a fault detection index, CF, is introduced as the ratio of the maximum to the mean of the wavelet transform coefficients. We find that in the broken bar condition the CF factor is greater than in the healthy condition. Based on the EMD method, the energy of the intrinsic mode functions (IMFs) is calculated, and we find that when motor bars break, the energy of the IMFs increases.
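A minimal sketch of the wavelet-based index (ours, not the authors' code): CF is read here as the ratio of the maximum to the mean absolute detail coefficient of a single-level Haar transform, which is one plausible interpretation of the abstract's definition. The simulated currents and the low-frequency amplitude modulation used to mimic a broken bar are illustrative assumptions; the EMD branch is omitted for brevity.

```python
import numpy as np

def haar_detail(x):
    """Detail coefficients of a single-level Haar wavelet transform."""
    x = x[: len(x) // 2 * 2]
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def cf_index(current):
    """CF interpreted as max/mean of the absolute detail coefficients."""
    d = np.abs(haar_detail(current))
    return d.max() / d.mean()

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
healthy = np.sin(2 * np.pi * 50 * t)  # pure supply-frequency current
# a broken bar modulates the stator-current amplitude at low frequency
faulty = (1 + 0.5 * np.sin(2 * np.pi * 6 * t)) * np.sin(2 * np.pi * 50 * t)

cf_healthy, cf_faulty = cf_index(healthy), cf_index(faulty)
```

The modulated (faulty) current produces a larger CF than the clean sinusoid, reproducing the direction of the study's finding.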

Keywords: broken bar, condition monitoring, diagnostics, empirical mode decomposition, fourier transform, wavelet transform

Procedia PDF Downloads 147
2881 Vulnerability Assessment of Reinforced Concrete Frames Based on Inelastic Spectral Displacement

Authors: Chao Xu

Abstract:

Selecting ground motion intensity measures reasonably is one of the most important issues affecting the selection of input ground motions and the reliability of vulnerability analysis results. In this paper, inelastic spectral displacement is used as an alternative intensity measure to characterize ground motion damage potential. The inelastic spectral displacement is calculated based on modal pushover analysis, and an incremental dynamic analysis based on inelastic spectral displacement is developed. Probabilistic seismic demand analyses of a six-story and an eleven-story RC frame are carried out through cloud analysis and advanced incremental dynamic analysis. The sufficiency and efficiency of inelastic spectral displacement are investigated by means of regression and residual analysis and compared with elastic spectral displacement. Vulnerability curves are developed based on inelastic spectral displacement. The study shows that inelastic spectral displacement reflects the impact of frequency components with periods larger than the fundamental period on inelastic structural response. The damage potential of ground motions on structures whose fundamental period elongates due to structural softening can be captured by inelastic spectral displacement. Compared with elastic spectral displacement, inelastic spectral displacement is a more sufficient and efficient intensity measure, which reduces the uncertainty of vulnerability analysis and the impact of input ground motion selection on the result.
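The efficiency comparison rests on the standard cloud-analysis regression: fit ln(EDP) = a + b ln(IM) and compare residual dispersions; the more efficient intensity measure leaves the smaller dispersion. A sketch under invented data (the "inelastic Sd" / "elastic Sd" clouds below are synthetic, not the paper's results):

```python
import numpy as np

def im_dispersion(im, edp):
    """Cloud analysis: fit ln(EDP) = a + b ln(IM); return the residual
    standard deviation. Smaller dispersion = more efficient IM."""
    b, a = np.polyfit(np.log(im), np.log(edp), 1)
    residuals = np.log(edp) - (a + b * np.log(im))
    return residuals.std(ddof=2)

# synthetic cloud: drift tightly tied to an 'inelastic Sd' measure,
# and a noisier 'elastic Sd' proxy for the same records (values invented)
rng = np.random.default_rng(1)
sd_inelastic = rng.uniform(0.05, 0.50, 50)                     # metres
drift = 0.02 * sd_inelastic * np.exp(rng.normal(0, 0.10, 50))  # peak storey drift
sd_elastic = sd_inelastic * np.exp(rng.normal(0, 0.40, 50))

beta_inelastic = im_dispersion(sd_inelastic, drift)
beta_elastic = im_dispersion(sd_elastic, drift)
```

A lower residual dispersion also means fewer records are needed for a given confidence in the demand model, which is the practical payoff of a more efficient IM.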

Keywords: vulnerability, probabilistic seismic demand analysis, ground motion intensity measure, sufficiency, efficiency, inelastic time history analysis

Procedia PDF Downloads 349
2880 Enhancing the Pricing Expertise of an Online Distribution Channel

Authors: Luis N. Pereira, Marco P. Carrasco

Abstract:

Dynamic pricing is a revenue management strategy in which hotel suppliers define, over time, flexible and differentiated prices for their services for different potential customers, considering the profile of e-consumers and market demand and supply. The fundamentals of dynamic pricing are thus based on economic theory (price elasticity of demand) and market segmentation. This study aims to define a dynamic pricing strategy and an offer contextualized to the e-consumer profile in order to increase the number of reservations of an online distribution channel. Segmentation methods (hierarchical and non-hierarchical) were used to identify and validate an optimal number of market segments. A profile of each market segment was studied, considering the characteristics of the e-consumers and the probability of reserving a room. In addition, the price elasticity of demand was estimated for each segment using econometric models. Finally, predictive models were used to define rules for classifying new e-consumers into the pre-defined segments. The empirical study illustrates how it is possible to improve the intelligence of an online distribution channel system through an optimal dynamic pricing strategy and an offer contextualized to the profile of each new e-consumer. A database of 11 million e-consumers of an online distribution channel was used in this study. The results suggest that an appropriate market segmentation policy in online reservation systems benefits service suppliers because it yields a higher probability of reservation and generates more profit than fixed pricing.
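The per-segment elasticity estimation can be illustrated with the usual log-log regression, in which the slope is the price elasticity of demand. The booking curve below is synthetic (a constant-elasticity demand Q = k P^e with e = -1.8), not data from the study:

```python
import numpy as np

def price_elasticity(prices, bookings):
    """Log-log regression: the slope is the price elasticity of demand."""
    slope, _ = np.polyfit(np.log(prices), np.log(bookings), 1)
    return slope

# synthetic booking curve for one e-consumer segment (invented numbers)
prices = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
bookings = 1e6 * prices ** -1.8
e = price_elasticity(prices, bookings)
# |e| > 1: demand is elastic, so a small price cut raises this
# segment's revenue -- the kind of rule a dynamic pricer acts on
```

Estimating e separately per segment is what lets the channel quote different prices to differently price-sensitive e-consumer profiles.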

Keywords: dynamic pricing, e-consumers segmentation, online reservation systems, predictive analytics

Procedia PDF Downloads 233
2879 Effective Supply Chain Coordination with Hybrid Demand Forecasting Techniques

Authors: Gurmail Singh

Abstract:

An effective supply chain is a main priority of every organization and is the outcome of strategic corporate investment combined with deliberate management action. A value-driven supply chain is defined through development, procurement, and the configuration of appropriate resources, metrics, and processes. The responsiveness of the supply chain can be improved by proper coordination. The bullwhip effect (BWE) and net stock amplification (NSAmp) values were therefore predicted and used for inventory control in organizations using both a discrete wavelet transform-artificial neural network (DWT-ANN) model and an adaptive network-based fuzzy inference system (ANFIS). This work presents a comparative forecasting methodology for customer demand, which is nonlinear in nature, in a multilevel supply chain structure using hybrid techniques: artificial intelligence techniques, including artificial neural networks (ANN) and the adaptive network-based fuzzy inference system (ANFIS), and discrete wavelet theory (DWT). The effectiveness of these forecasting models is demonstrated by computing the bullwhip effect and net stock amplification from real-world data. The results show that these parameters are comparatively lower for the DWT-ANN model and for ANFIS.
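The two coordination metrics are simple variance ratios, which makes them easy to compute once demand, orders, and net stock series are available. A sketch with a synthetic single-echelon example (the over-reactive ordering rule is invented to show amplification; it is not the paper's model):

```python
import numpy as np

def bullwhip_effect(orders, demand):
    """BWE = variance of orders placed upstream / variance of demand."""
    return np.var(orders) / np.var(demand)

def net_stock_amplification(net_stock, demand):
    """NSAmp = variance of net stock / variance of demand."""
    return np.var(net_stock) / np.var(demand)

# synthetic example: an ordering rule that chases every demand change
rng = np.random.default_rng(7)
demand = rng.normal(100, 10, 500)
orders = demand + 2.0 * np.diff(demand, prepend=demand[0])
net_stock = np.cumsum(orders - demand)

bwe = bullwhip_effect(orders, demand)
nsamp = net_stock_amplification(net_stock, demand)
```

Values above 1 mean variability is amplified upstream; a better forecast (DWT-ANN or ANFIS in the study) lowers both ratios.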

Keywords: bullwhip effect, hybrid techniques, net stock amplification, supply chain flexibility

Procedia PDF Downloads 124
2878 Risk Analysis of Leaks from a Subsea Oil Facility Based on Fuzzy Logic Techniques

Authors: Belén Vinaixa Kinnear, Arturo Hidalgo López, Bernardo Elembo Wilasi, Pablo Fernández Pérez, Cecilia Hernández Fuentealba

Abstract:

The expanded use of risk assessment in legislative and corporate decision-making has increased the role of expert judgement in providing data for safety-related decisions. Expert judgements are required in most steps of risk assessment: hazard identification, risk estimation, risk evaluation, and the examination of options. This paper presents a fault tree analysis (FTA), a probabilistic failure analysis, applied to leakage of oil in a subsea production system. In standard FTA, the failure probabilities of the items of a system are treated as exact values when evaluating the failure probability of the top event, but within the drilling industry there is a persistent lack of data for estimating component failures. Fuzzy set theory can therefore be used to address this issue. The aim of this paper is to examine leaks from the Zafiro West subsea oil facility using fuzzy fault tree analysis (FFTA). The research thus makes theoretical and practical contributions to maritime safety and environmental protection. FTA has also been an effective strategy traditionally used to identify hazards in nuclear installations and the power industry.
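The fuzzy extension of the gate algebra can be sketched with triangular fuzzy numbers (l, m, u) propagated bound-by-bound; since AND (product) and OR (1 minus product of complements) are monotone in each probability, interval arithmetic applies directly. The tree structure and basic-event numbers below are invented for illustration, not the Zafiro West model:

```python
def tfn_and(*events):
    """AND gate on triangular fuzzy probabilities (l, m, u): product."""
    l = m = u = 1.0
    for (el, em, eu) in events:
        l, m, u = l * el, m * em, u * eu
    return (l, m, u)

def tfn_or(*events):
    """OR gate: 1 - product of complements, applied bound-by-bound."""
    l = m = u = 1.0
    for (el, em, eu) in events:
        l, m, u = l * (1 - el), m * (1 - em), u * (1 - eu)
    return (1 - l, 1 - m, 1 - u)

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number -> crisp probability."""
    return sum(tfn) / 3.0

# hypothetical leak tree: seal failure OR (corrosion AND coating failure)
seal = (0.010, 0.020, 0.040)
corrosion = (0.050, 0.100, 0.200)
coating = (0.100, 0.200, 0.300)
top = tfn_or(seal, tfn_and(corrosion, coating))
crisp_pof = defuzzify(top)
```

The (l, m, u) triple carries the experts' uncertainty through to the top event instead of forcing a single exact value on each basic event.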

Keywords: expert judgment, probability assessment, fault tree analysis, risk analysis, oil pipelines, subsea production system, drilling, quantitative risk analysis, leakage failure, top event, off-shore industry

Procedia PDF Downloads 188
2877 Subjective Well-being, Beliefs, and Lifestyles of First Year University Students in the UK

Authors: Kaili C. Zhang

Abstract:

Mental well-being is an integral part of university students' overall well-being and has been a matter of increasing concern in the UK. This study addressed the impact of the university experience on students by investigating the changes students undergo in their beliefs, lifestyles, and well-being during their first year of study, as well as the factors contributing to such changes. Using a longitudinal two-wave mixed-method design, this project identified important factors that contribute to or inhibit these changes. Implications for universities across the UK are discussed.

Keywords: subjective well-being, beliefs, lifestyles, university students

Procedia PDF Downloads 195
2876 Assessment of DNA Sequence Encoding Techniques for Machine Learning Algorithms Using a Universal Bacterial Marker

Authors: Diego Santibañez Oyarce, Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán

Abstract:

The advent of high-throughput sequencing technologies has revolutionized genomics, generating vast amounts of genetic data that challenge traditional bioinformatics methods. Machine learning addresses these challenges by leveraging computational power to identify patterns and extract information from large datasets. However, biological sequence data, being symbolic and non-numeric, must be converted into numerical formats for machine learning algorithms to process effectively. So far, encoding methods such as one-hot encoding and k-mers have been explored. This work proposes additional approaches for encoding DNA sequences in order to compare them with existing techniques and determine whether they can provide improvements or whether current methods offer superior results. Data from the 16S rRNA gene, a universal marker, were used to analyze eight bacterial groups that are significant in the pulmonary environment and have clinical implications. The bacterial genera included in this analysis are Prevotella, Abiotrophia, Acidovorax, Streptococcus, Neisseria, Veillonella, Mycobacterium, and Megasphaera. These data were downloaded from the NCBI database in GenBank file format, followed by a syntactic analysis to selectively extract relevant information from each file. For data encoding, a sequence normalization process was carried out as the first step. From approximately 22,000 initial data points, a subset was generated for testing purposes. Specifically, 55 sequences from each bacterial group met the length criteria, resulting in an initial sample of approximately 440 sequences. The sequences were encoded using different methods, including one-hot encoding, k-mers, the Fourier transform, and the wavelet transform. Various machine learning algorithms, such as support vector machines, random forests, and neural networks, were trained to evaluate these encoding methods.
The performance of these models was assessed using multiple metrics, including the confusion matrix, ROC curve, and F1 Score, providing a comprehensive evaluation of their classification capabilities. The results show that accuracies between encoding methods vary by up to approximately 15%, with the Fourier transform obtaining the best results for the evaluated machine learning algorithms. These findings, supported by the detailed analysis using the confusion matrix, ROC curve, and F1 Score, provide valuable insights into the effectiveness of different encoding methods and machine learning algorithms for genomic data analysis, potentially improving the accuracy and efficiency of bacterial classification and related genomic studies.
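Three of the encodings compared above can be sketched compactly. The 1-4 numeric mapping used before the Fourier transform is one simple choice among several in the literature, not necessarily the authors' exact scheme:

```python
import numpy as np
from collections import Counter
from itertools import product

BASES = "ACGT"

def one_hot(seq):
    """len(seq) x 4 binary matrix; rows for ambiguous bases stay zero."""
    idx = {b: i for i, b in enumerate(BASES)}
    m = np.zeros((len(seq), 4))
    for i, b in enumerate(seq):
        if b in idx:
            m[i, idx[b]] = 1.0
    return m

def kmer_vector(seq, k=3):
    """Counts of all 4^k k-mers, in fixed lexicographic order."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return np.array([counts["".join(p)] for p in product(BASES, repeat=k)])

def fourier_features(seq, n=64):
    """Map bases to integers, then take a fixed-length magnitude spectrum."""
    mapping = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}
    x = np.array([mapping.get(b, 0.0) for b in seq])
    return np.abs(np.fft.rfft(x, n))
```

Note the practical differences: one-hot grows with sequence length, while the k-mer and Fourier encodings yield fixed-length vectors that classifiers can consume directly regardless of sequence length.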

Keywords: DNA encoding, machine learning, Fourier transform

Procedia PDF Downloads 15
2875 Effects of Family Order and Informal Social Control on Protecting against Child Maltreatment: A Comparative Study of Seoul and Kathmandu

Authors: Thapa Sirjana, Clifton R. Emery

Abstract:

This paper examines family order and informal social control (ISC) by extended families as protective factors against child maltreatment. The findings are discussed using the main effects and the interaction effects of family order and informal social control by extended families. The findings suggest that mothers experiencing intimate partner violence (IPV) are associated with child abuse and child neglect: children are neglected more at home, and physical abuse occurs, when mothers are abused by their husbands. The difficulties mothers face from being abused may lead them to neglect their children. The findings also suggest that 'family order' is a significant protective factor against child maltreatment: if family order is neither too high nor too low, it can act as a protective factor. The soft type of ISC is significantly associated with child maltreatment, and this study suggests that soft ISC by extended families is a helpful approach for developing child protection in both countries. The study analyzed data collected from the Seoul and Kathmandu Families and Neighborhood Study (SKFNS). A random probability cluster sample of married or partnered women in 20 Kathmandu wards and 34 Seoul dongs was selected using probability proportional to size (PPS) sampling. Overall, the study compares Korea and Nepal and examines how cultural differences and similarities are associated with child maltreatment.
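The PPS cluster selection used for the sample can be sketched as follows. Weighted sampling without replacement via numpy's generator is a simple approximation of formal PPS-without-replacement schemes, and the ward/dong household counts below are invented for illustration:

```python
import numpy as np

def pps_select(cluster_sizes, n_select, seed=0):
    """Draw clusters with probability proportional to size (PPS).
    Generator.choice with p-weights approximates without-replacement
    PPS selection; exact schemes (e.g. systematic PPS) differ slightly."""
    sizes = np.asarray(cluster_sizes, dtype=float)
    p = sizes / sizes.sum()
    rng = np.random.default_rng(seed)
    return rng.choice(len(sizes), size=n_select, replace=False, p=p)

# illustrative ward/dong household counts (invented numbers)
households = [1200, 300, 450, 800, 150, 600, 900, 250]
chosen = pps_select(households, n_select=3)
```

Selecting clusters in proportion to size keeps the overall design approximately self-weighting when a fixed number of women is then sampled within each selected ward or dong.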

Keywords: child maltreatment, intimate partner violence, informal social control and family order Seoul, Kathmandu

Procedia PDF Downloads 240
2874 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals

Authors: Christine F. Boos, Fernando M. Azevedo

Abstract:

Electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma, and brain death; locating damaged areas of the brain after head injury, stroke, and tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases of high prevalence, still poorly explained by science, and its diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of an epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptogenic zone, assist in the planning of drug treatment, and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis uses long-term EEG recordings of at least 24 hours acquired by a minimum of 24 electrodes, in which neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that EEG screens usually display 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex, and exhaustive task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for pattern classification.
One of the differences between all of these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced in the network. Five types of input stimuli have been commonly found in literature: raw EEG signal, morphological descriptors (i.e. parameters related to the signal’s morphology), Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks that were implemented using each of these inputs. The performance of using raw signal varied between 43 and 84% efficiency. The results of FFT spectrum and STFT spectrograms were quite similar with average efficiency being 73 and 77%, respectively. The efficiency of Wavelet Transform features varied between 57 and 81% while the descriptors presented efficiency values between 62 and 93%. After simulations we could observe that the best results were achieved when either morphological descriptors or Wavelet features were used as input stimuli.
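Two of the five input-stimulus types can be sketched to show what actually reaches the network. The particular trio of morphological descriptors below is illustrative, not the exact parameter set of the surveyed studies, and the synthetic epoch is invented:

```python
import numpy as np

def fft_spectrum(epoch, fs):
    """Magnitude spectrum of one EEG epoch (Hann-windowed)."""
    windowed = epoch * np.hanning(len(epoch))
    spec = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    return freqs, spec

def morphological_descriptors(epoch):
    """A few simple shape parameters of the kind fed to classifiers."""
    return {
        "peak_amplitude": float(np.max(np.abs(epoch))),
        "energy": float(np.sum(epoch ** 2)),
        "line_length": float(np.sum(np.abs(np.diff(epoch)))),
    }

fs = 256.0                                   # a common EEG sampling rate
t = np.arange(0, 1, 1 / fs)
epoch = 50.0 * np.sin(2 * np.pi * 10 * t)    # 10 Hz alpha-like rhythm, in uV
freqs, spec = fft_spectrum(epoch, fs)
descriptors = morphological_descriptors(epoch)
```

The contrast between these inputs is exactly what the comparison above measures: the raw signal leaves all feature extraction to the network, whereas descriptors and wavelet features hand it a compact, physiologically motivated summary.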

Keywords: Artificial neural network, electroencephalogram signal, pattern recognition, signal processing

Procedia PDF Downloads 526
2873 Sequence Polymorphism and Haplogroup Distribution of Mitochondrial DNA Control Regions HVS1 and HVS2 in a Southwestern Nigerian Population

Authors: Ogbonnaya O. Iroanya, Samson T. Fakorede, Osamudiamen J. Edosa, Hadiat A. Azeez

Abstract:

The human mitochondrial DNA (mtDNA) is a circular DNA molecule of about 17 kbp found within the mitochondria; it contains a smaller segment of about 1,200 bp known as the control region. Knowledge of variation within populations has been employed in forensic and molecular anthropology studies. This study investigated the polymorphic nature of the two hypervariable segments (HVS) of the mtDNA control region, HVS1 and HVS2, and determined the haplogroup distribution among individuals resident in Lagos, Southwestern Nigeria. Peripheral blood samples were obtained from sixty maternally unrelated individuals, followed by DNA extraction and amplification of the extracted DNA using primers specific for the regions under investigation. DNA amplicons were sequenced, and the sequence data were aligned and compared to the revised Cambridge Reference Sequence (rCRS; GenBank accession number NC_012920.1) using BioEdit software. The results showed 61 and 52 polymorphic nucleotide positions for HVS1 and HVS2, respectively. While a total of three indel mutations were recorded for HVS1, there were seven for HVS2. Transition mutations predominated among the nucleotide changes observed in the study. Genetic diversity (GD) values for HVS1 and HVS2 were estimated to be 84.21% and 90.4%, respectively, while the random match probability was 0.17% for HVS1 and 0.89% for HVS2. The study also revealed mixed haplogroups specific to the African (L1-L3) and Eurasian (U and H) lineages. The new polymorphic sites obtained from the study are promising for human identification purposes.
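The two summary statistics reported above are simple functions of the haplotype frequencies. A sketch using the standard definitions (Nei's unbiased gene diversity, and random match probability as the chance that two randomly drawn individuals share a haplotype); the six-haplotype toy sample is invented:

```python
from collections import Counter

def genetic_diversity(haplotypes):
    """Nei's unbiased gene diversity: (n/(n-1)) * (1 - sum p_i^2)."""
    n = len(haplotypes)
    sum_p2 = sum((c / n) ** 2 for c in Counter(haplotypes).values())
    return n / (n - 1) * (1 - sum_p2)

def random_match_probability(haplotypes):
    """Probability that two randomly drawn individuals match."""
    n = len(haplotypes)
    return sum((c / n) ** 2 for c in Counter(haplotypes).values())

# toy sample of six mtDNA haplotypes (labels are placeholders)
sample = ["H1", "H1", "H2", "H2", "H3", "H4"]
gd = genetic_diversity(sample)
rmp = random_match_probability(sample)
```

High diversity and a correspondingly low match probability, as reported for HVS1 and HVS2 here, are what make a marker region useful for human identification.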

Keywords: hypervariable region, indels, mitochondrial DNA, polymorphism, random match probability

Procedia PDF Downloads 109
2872 Green Synthesis of Magnetic, Silica Nanocomposite and Its Adsorptive Performance against Organochlorine Pesticides

Authors: Waleed A. El-Said, Dina M. Fouad, Mohamed H. Aly, Mohamed A. El-Gahami

Abstract:

Green synthesis of nanomaterials has received increasing attention as an eco-friendly technology in materials science. Here, we have used two types of extracts from green tea leaves (i.e., a total extract and a tannin extract) as reducing agents in a rapid, simple, one-step synthesis of a mesoporous silica nanoparticle (MSNP)/iron oxide (Fe3O4) nanocomposite based on the deposition of Fe3O4 onto MSNPs. The MSNP/Fe3O4 nanocomposite was characterized by X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy, energy-dispersive X-ray spectroscopy, vibrating sample magnetometry, N2 adsorption, and high-resolution transmission electron microscopy. The average mesoporous silica particle diameter was found to be around 30 nm with a high surface area (818 m²/g). The MSNP/Fe3O4 nanocomposite was used to remove the pesticide lindane (an environmental hazard) from aqueous solutions. Fourier transform infrared spectroscopy, UV-vis spectroscopy, high-performance liquid chromatography, and gas chromatography were used to confirm the high capability of the MSNP/Fe3O4 nanocomposite for sensing and capturing lindane molecules, with a high sorption capacity (more than 89%), which could provide a new eco-friendly strategy for detecting and removing pesticides and a promising material for water treatment applications.

Keywords: green synthesis, mesoporous silica, magnetic iron oxide NPs, adsorption Lindane

Procedia PDF Downloads 433
2871 Globalization as Instrument for Multi-National Corporation in Transforming Asian’s Perspective towards Clean Water Consumption

Authors: Atanta Gian

Abstract:

It is inevitable that globalization has transformed the world today. Its influence has emerged in almost every aspect of life, especially in shaping people's perceptions, as can be seen in how easily people are affected by the information surrounding them. Due to globalization, the flow of information has become more rapid along with the development of technology. People tend to believe information they obtain themselves; when most people believe a piece of information is true, it comes to be treated as factual and relevant. Therefore, if people receive information about what is best for them in terms of daily consumption, this information can transform their perspective and becomes a consideration in selecting their daily needs. Looking at this trend, the author argues that globalization can be used by multi-national corporations (MNCs) to enhance the promotion of their products, by shaping perceptions of what is best for consumers. MNCs with better technology for external promotion can utilize this opportunity to shape people's perspectives as they wish. In this paper, the author elaborates how globalization is applied by MNCs to shape people's perspectives regarding what is best for them, using a case study of how MNCs have transformed the perspectives of Asian people regarding the need for better-quality drinking water; in this case, MNCs have shaped the perspective of Asian consumers by promoting bottled water as the best choice for them. The paper concludes that MNCs are able to shape the world's perspective regarding the need for their products, supported by the globalization now taking place.

Keywords: consumption, globalisation, influence, information technology, multi-national corporations

Procedia PDF Downloads 205
2870 Corrosion Risk Assessment/Risk Based Inspection (RBI)

Authors: Lutfi Abosrra, Alseddeq Alabaoub, Nuri Elhaloudi

Abstract:

Corrosion processes in the oil and gas industry can lead to failures that are usually costly to repair: costly in terms of lost or contaminated product, environmental damage, and possibly human safety. This article describes the results of the corrosion review and criticality assessment performed at Mellitah Gas (SRU unit) for pressure equipment and piping systems. The information gathered through the review was used to develop a qualitative RBI study. The corrosion criticality assessment was carried out by applying company procedures and recommended industrial practices such as API 571, API 580/581, and ASME PCC-3, which provide guidelines for establishing corrosion integrity assessments. The corrosion review is intimately related to the probability of failure (POF). During the corrosion study, the process units were reviewed by following the applicable process flow diagrams (PFDs) in the presence of Mellitah's personnel from process engineering, inspection, corrosion/materials, and reliability engineering. The expected corrosion damage mechanisms (internal and external) were identified, and the corrosion rate was estimated for every piece of equipment and corrosion loop in the process units. A combination of consequence and likelihood of failure was used to determine the corrosion risk. A qualitative consequence of failure (COF) was assigned to each item based on the characteristics of the fluid (flammability, toxicity, and pollution potential) at three levels (High, Medium, and Low). A qualitative probability of failure (POF) was applied to evaluate the internal and external degradation mechanisms, using a point-based scale (0 to 10) for risk prioritization into Low, Medium, and High.
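As a minimal illustration of the qualitative COF-by-POF screening described above (not Mellitah's actual procedure; the band boundaries and matrix entries here are assumptions in the spirit of API 580/581 screening):

```python
def pof_band(score):
    """Bin a 0-10 probability-of-failure score into Low / Medium / High."""
    if score <= 3:
        return "Low"
    return "Medium" if score <= 6 else "High"

# Hypothetical 3x3 qualitative risk matrix: (POF band, COF level) -> risk rank.
RISK_MATRIX = {
    ("Low", "Low"): "Low", ("Low", "Medium"): "Low", ("Low", "High"): "Medium",
    ("Medium", "Low"): "Low", ("Medium", "Medium"): "Medium", ("Medium", "High"): "High",
    ("High", "Low"): "Medium", ("High", "Medium"): "High", ("High", "High"): "High",
}

def risk(pof_score, cof_level):
    """Combine a point-based POF score with a qualitative COF level."""
    return RISK_MATRIX[(pof_band(pof_score), cof_level)]
```

In a real RBI study the matrix shape and band boundaries come from the company's acceptance criteria, not from code defaults.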

Keywords: corrosion, criticality assessment, RBI, POF, COF

Procedia PDF Downloads 70
2869 The Moment of the Optimal Average Length of the Multivariate Exponentially Weighted Moving Average Control Chart for Equally Correlated Variables

Authors: Edokpa Idemudia Waziri, Salisu S. Umar

Abstract:

Hotelling's T² is a well-known statistic for detecting a shift in the mean vector of a multivariate normal distribution, and control charts based on T² have been widely used in statistical process control for monitoring multivariate processes. Although it is a powerful tool, the T² statistic is deficient when the shift to be detected in the mean vector is small and consistent. The Multivariate Exponentially Weighted Moving Average (MEWMA) control chart is one of the control statistics used to overcome this drawback. In this paper, the probability distribution of the run length of the MEWMA control chart, when the quality characteristics exhibit substantial cross-correlation and when the process is in-control and out-of-control, was derived using the Markov chain algorithm. The probability functions and the moments of the run length distribution were also obtained, and they were consistent with existing results for the in-control and out-of-control situations. Through simulation, the procedure identified a class of average run lengths (ARLs) for the MEWMA chart when the process is in-control and out-of-control. From our study, it was observed that the MEWMA scheme is quite adequate for detecting a small shift and is a good way to improve the quality of goods and services in a multivariate situation. It was also observed that as the in-control average run length ARL0 or the number of variables p increases, the optimal value ARLopt increases asymptotically, and as the magnitude of the shift σ increases, the optimal ARLopt decreases. Finally, we use an example from the literature to illustrate our method and demonstrate its efficiency.
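A simulation sketch of the underlying chart (not the paper's Markov-chain derivation): the MEWMA statistic for equally correlated variables, with ARLs estimated by Monte Carlo. The smoothing parameter, control limit h, correlation, and shift size below are illustrative assumptions:

```python
import numpy as np

def mewma_run_length(mu, sigma, lam=0.1, h=10.0, rng=None, max_n=10_000):
    """Simulate one MEWMA run length: signal when T^2 first exceeds the limit h."""
    rng = np.random.default_rng() if rng is None else rng
    p = len(mu)
    chol = np.linalg.cholesky(sigma)
    sigma_inv = np.linalg.inv(sigma)
    z = np.zeros(p)
    for i in range(1, max_n + 1):
        x = mu + chol @ rng.standard_normal(p)   # one multivariate observation
        z = lam * x + (1 - lam) * z              # exponentially weighted average
        # exact covariance factor of Z_i; tends to lam / (2 - lam) as i grows
        factor = lam / (2 - lam) * (1 - (1 - lam) ** (2 * i))
        t2 = z @ sigma_inv @ z / factor
        if t2 > h:
            return i
    return max_n

# Equally correlated quality characteristics: unit variances, correlation rho
p, rho = 3, 0.5
sigma = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
rng = np.random.default_rng(0)
arl0 = np.mean([mewma_run_length(np.zeros(p), sigma, rng=rng) for _ in range(100)])
arl1 = np.mean([mewma_run_length(np.full(p, 0.5), sigma, rng=rng) for _ in range(100)])
```

As the abstract notes, the out-of-control ARL drops sharply even for a modest shift, which is the point of preferring MEWMA over T² for small, consistent shifts.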

Keywords: average run length, markov chain, multivariate exponentially weighted moving average, optimal smoothing parameter

Procedia PDF Downloads 418
2868 Impact of Zeolite NaY Synthesized from Kaolin on the Properties of Pyrolytic Oil Derived from Used Tire

Authors: Julius Ilawe Osayi, Peter Osifo

Abstract:

The disposal of solid waste such as used tires is a global challenge, as is the energy crisis driven by rising demand amid price uncertainty and depleting fossil fuel reserves. The effectiveness of pyrolysis as a disposal method that can transform used tires into liquid fuel and other end products has therefore made the process attractive to researchers. Although used tires have been converted to liquid fuel by pyrolysis, the properties of the resulting liquid still need improvement. Hence, this paper reports the investigation of zeolite NaY synthesized from kaolin, a locally abundant soil material in the Benin metropolis, as a catalyst, and its effect on the properties of pyrolytic oil produced from used tires. The pyrolysis process was conducted at catalyst-to-used-tire concentrations of 1 to 10 wt.%, a temperature of 600 °C, a heating rate of 15 °C/min, and a particle size of 6 mm. Although no significant increase in pyrolytic oil yield was observed compared with the previously investigated non-catalytic pyrolysis of used tires, Fourier transform infrared (FTIR), nuclear magnetic resonance (NMR), and gas chromatography-mass spectrometry (GC-MS) characterization revealed that the pyrolytic oil possesses improved physicochemical and fuel properties alongside valuable industrial chemical species. This confirms the possibility of transforming kaolin into a catalyst suitable for improving the fuel properties of the liquid fraction obtainable from thermal cracking of hydrocarbon materials.

Keywords: catalytic pyrolysis, fossil fuel, kaolin, pyrolytic oil, used tyres, Zeolite NaY

Procedia PDF Downloads 175
2867 Computer-Aided Detection of Simultaneous Abdominal Organ CT Images by Iterative Watershed Transform

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

Interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver, spleen and kidneys is regarded as a major primary step in the computer-aided diagnosis of abdominal organ diseases. In this paper, a semi-automated method for abdominal organ segmentation using mathematical morphology is presented. Our proposed method is based on hierarchical segmentation and the watershed algorithm. In our approach, a powerful technique has been designed to suppress over-segmentation, based on the mosaic image and on the computation of the watershed transform. Our algorithm proceeds in two parts. In the first, we seek to improve the quality of the gradient-mosaic image: we propose a method that applies an anisotropic diffusion filter followed by morphological filters. Thereafter, we proceed to the hierarchical segmentation of the liver, spleen and kidneys. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with the manual segmentation performed by an expert. The experimental results are described in the last part of this work.
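The coarse pipeline described above (anisotropic diffusion, then a morphological gradient, then marker-based watershed) can be sketched as follows. This is a generic reconstruction, not the authors' code; the Perona-Malik parameters and the 3x3 structuring element are assumptions:

```python
import numpy as np
from scipy import ndimage as ndi

def perona_malik(img, n_iter=20, kappa=30.0, gamma=0.2):
    """Minimal anisotropic (Perona-Malik) diffusion: smooths while keeping edges."""
    u = img.astype(float)
    for _ in range(n_iter):
        # finite-difference gradients toward the four neighbours (wraps at edges)
        dn = np.roll(u, -1, 0) - u
        ds = np.roll(u, 1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        # diffusion coefficient decays with gradient magnitude (edge-stopping)
        c = lambda d: np.exp(-(d / kappa) ** 2)
        u = u + gamma * (c(dn) * dn + c(ds) * ds + c(de) * de + c(dw) * dw)
    return u

def watershed_segment(img, seed_mask):
    """Diffuse, take a morphological gradient, and flood from labelled seeds."""
    smoothed = perona_malik(img)
    # morphological gradient: dilation minus erosion highlights region boundaries
    grad = ndi.grey_dilation(smoothed, size=(3, 3)) - ndi.grey_erosion(smoothed, size=(3, 3))
    grad8 = np.clip(grad, 0, 255).astype(np.uint8)
    markers, _ = ndi.label(seed_mask)
    return ndi.watershed_ift(grad8, markers.astype(np.int16))
```

A hierarchical variant would re-run the watershed on the mosaic image built from these first-pass regions, which is where the over-segmentation suppression the abstract mentions comes in.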

Keywords: anisotropic diffusion filter, CT images, morphological filter, mosaic image, simultaneous organ segmentation, watershed algorithm

Procedia PDF Downloads 432
2866 Degree of Approximation of Functions by Product Means

Authors: Hare Krishna Nigam

Abstract:

In this paper, for the first time, the (E,q)(C,2) product summability method is introduced, and two new results on the degree of approximation of a function f belonging to the Lip(alpha, r) class and the W(L(r), xi(t)) class by (E,q)(C,2) product means of its Fourier series have been obtained.
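As background for readers unfamiliar with the notation (standard textbook definitions, not taken from the paper): writing s_n for the partial sums of the Fourier series of f, the means in question can be stated as

```latex
% Cesaro mean of order 2 of the partial sums s_k:
t_n^{(C,2)} = \frac{2}{(n+1)(n+2)} \sum_{k=0}^{n} (n-k+1)\, s_k ,
% Euler (E,q) mean of a sequence u_k, q > 0:
E_n^{q}(u) = \frac{1}{(1+q)^{n}} \sum_{k=0}^{n} \binom{n}{k} q^{\,n-k} u_k ,
% and the (E,q)(C,2) product mean applies the Euler transform to the (C,2) means:
T_n = \frac{1}{(1+q)^{n}} \sum_{k=0}^{n} \binom{n}{k} q^{\,n-k}\, t_k^{(C,2)} .
```

The degree-of-approximation results then bound the norm of f - T_n in terms of n for the two function classes named above.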

Keywords: Degree of approximation, (E, q)(C, 2) means, Fourier series, Lebesgue integral, Lip (alpha, r) class, W(L(r), xi(t)) class of functions

Procedia PDF Downloads 515
2865 From Responses of Macroinvertebrate Metrics to the Definition of Reference Thresholds

Authors: Hounyèmè Romuald, Mama Daouda, Argillier Christine

Abstract:

The present study focused on the use of benthic macrofauna to define the reference state of an anthropized lagoon (Nokoué, Benin) from the responses of relevant metrics to pressure proxies. The approach used is a combination of a joint species distribution model and Bayesian networks. The joint species distribution model was used to select the relevant metrics and generate posterior probabilities, which were then converted into posterior response probabilities for each of the quality classes (pressure levels); these constitute the conditional probability tables establishing the probabilistic graph that represents the causal relationships between metrics and pressure proxies. For the definition of the reference thresholds, the predicted responses for low pressure levels were read via probability density diagrams. Observations collected during high- and low-water periods spanning three consecutive years (2004-2006), samples of 33 macroinvertebrate taxa present in all seasons and at all sampling points, and measurements of 14 environmental parameters were used as application data. The study demonstrated reliable inferences, the selection of seven relevant metrics, and the definition of quality thresholds for each environmental parameter. The relevance of the metrics, and of the reference thresholds for ecological assessment despite the small sample size, suggests the potential for wider applicability of the approach in aquatic ecosystem monitoring and assessment programs in developing countries, which are generally characterized by a lack of monitoring data.
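A toy discretised version of the core inference step, stated under assumed numbers: posterior response probabilities per pressure level act as a conditional probability table, and Bayes' rule inverts it to give the probability of each pressure level given an observed metric response:

```python
import numpy as np

# Hypothetical CPT: P(metric response band | pressure level); rows sum to 1.
# Response bands 0/1/2 stand for good/fair/poor metric responses.
cpt = {
    "low":    np.array([0.70, 0.25, 0.05]),
    "medium": np.array([0.30, 0.50, 0.20]),
    "high":   np.array([0.05, 0.30, 0.65]),
}
prior = {lvl: 1 / 3 for lvl in cpt}   # uniform prior over pressure levels

def posterior(band_index):
    """P(pressure level | observed response band) by Bayes' rule."""
    joint = {lvl: prior[lvl] * cpt[lvl][band_index] for lvl in cpt}
    z = sum(joint.values())
    return {lvl: p / z for lvl, p in joint.items()}
```

In the study the CPT entries come from the fitted joint species distribution model rather than being assumed, and the reference threshold is read off where the low-pressure response density concentrates.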

Keywords: pressure proxies, Bayesian inference, bioindicators, acadjas, functional traits

Procedia PDF Downloads 78
2864 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses

Authors: Neil Bar, Andrew Heweston

Abstract:

Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, the design acceptance criteria generally provide no guidance relating to how PF should be calculated for homogeneous and heterogeneous rock masses, or what qualifies a ‘reasonable’ PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offer statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, shear strength of geologic structure and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation and data interpolation is usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated ‘approximately’ or with allowances for some variability rather than ‘exactly’. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user’s discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permits. 
A case study from a large open-pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF calculated using different methods can yield markedly different results. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results.
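A minimal sketch of Monte-Carlo PF for a hypothetical planar-slide limit-equilibrium model. The model, geometry, and parameter distributions below are illustrative assumptions, not the case-study data; the point is only that PF is the fraction of sampled parameter sets with FS below 1:

```python
import numpy as np

def pf_monte_carlo(fs_model, dists, n=100_000, seed=0):
    """PF = fraction of sampled parameter sets with factor of safety < 1."""
    rng = np.random.default_rng(seed)
    samples = {k: rng.normal(mu, sd, n) for k, (mu, sd) in dists.items()}
    fs = fs_model(**samples)
    return float(np.mean(fs < 1.0))

# Hypothetical planar slide: FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi)),
# with cohesion c (kPa), friction angle phi (deg), block weight W, plane area A.
def fs_planar(c, phi_deg, W=2000.0, A=25.0, psi=np.radians(35)):
    phi = np.radians(phi_deg)
    return (c * A + W * np.cos(psi) * np.tan(phi)) / (W * np.sin(psi))

# Treat c and phi probabilistically; everything else deterministically.
pf = pf_monte_carlo(fs_planar, {"c": (30.0, 8.0), "phi_deg": (35.0, 4.0)})
```

Widening the input distributions (i.e. admitting more uncertainty in the geotechnical model) raises PF even though the mean FS is unchanged, which is exactly why the paper stresses how input choices drive markedly different PF results.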

Keywords: probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability

Procedia PDF Downloads 206
2863 Normative Reflections on the International Court of Justice's Jurisprudence on the Protection of Human Rights in Times of War

Authors: Roger-Claude Liwanga

Abstract:

This article reflects on the normative aspects of the jurisprudence on the protection of human rights in times of war that the International Court of Justice (ICJ) developed in 2005 in the Case Concerning Armed Activities on the Territory of the Congo (Democratic Republic of Congo v. Uganda). The article focuses on theories raised in connection with the Democratic Republic of Congo (DRC)'s claim of the violation of human rights of its populations by Uganda as opposed to the violation of its territorial integrity claims. The article begins with a re-visitation of the doctrine of state extraterritorial responsibility for violations of human rights by suggesting that a state's accountability for the breach of its international obligations is not territorially confined but rather transcends the State's national borders. The article highlights the criteria of assessing the State's extraterritorial responsibility, including the circumstances: (1) where the concerned State has effective control over the territory of another State in the context of belligerent occupation, and (2) when the unlawful actions committed by the State's organs on the occupied territory can be attributable to that State. The article also analyzes the ICJ's opinions articulated in DRC v. Uganda with reference to the relationship between human rights law and humanitarian law, and it contends that the ICJ had revised the traditional interaction between these two bodies of law to the extent that human rights law can no longer be excluded from applying in times of war as both branches are complementary rather than exclusive. The article correspondingly looks at the issue of reparations for victims of human rights violations. It posits that reparations for victims of human rights violations should be integral (including restitution, compensation, rehabilitation, satisfaction, and guarantees of non-repetition). 
Yet, the article concludes by emphasizing that reparations for victims were not integral in DRC v. Uganda because: (1) the ICJ failed to set a reasonable timeframe for the negotiations between the DRC and Uganda on the amount of compensation, resulting in Uganda paying no financial reparation to the DRC since 2005; and (2) the ICJ did not request Uganda to domestically prosecute the perpetrators of human rights abuses.

Keywords: human rights law, humanitarian law, civilian protection, extraterritorial responsibility

Procedia PDF Downloads 131
2862 Estimation of Effective Radiation Dose Following Computed Tomography Urography at Aminu Kano Teaching Hospital, Kano Nigeria

Authors: Idris Garba, Aisha Rabiu Abdullahi, Mansur Yahuza, Akintade Dare

Abstract:

Background: CT urography (CTU) is an efficient radiological examination for the evaluation of urinary system disorders. However, patients are exposed to a significant radiation dose, which is associated with increased cancer risk. Objectives: To determine the computed tomography dose index following CTU and to evaluate organ equivalent doses. Materials and Methods: A prospective cohort study was carried out at a tertiary institution in Kano, northwestern Nigeria. Ethical clearance was sought and obtained from the research ethics board of the institution. Demographic, scan parameter and CT radiation dose data were obtained from patients who underwent the CTU procedure. Effective dose, organ equivalent doses and cancer risks were estimated using SPSS statistical software version 16 and CT dose calculator software. Results: A total of 56 patients were included in the study, consisting of 29 males and 27 females. The most common indication for CTU was renal cyst, seen mostly among young adults (15-44 years). The CT radiation dose values for CTU were a DLP of 2320 mGy·cm, a CTDIw of 9.67 mGy and an effective dose of 35.04 mSv. The probability of cancer risk was estimated at 600 per million CTU examinations. Conclusion: In this study, the radiation dose for CTU is considered significantly high, with an increased probability of cancer risk. Wide variations between patient doses suggest that optimization has not yet been achieved. Patient radiation dose estimates should be taken into consideration when imaging protocols are established for CT urography.
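For context, the effective dose reported above is consistent with the standard DLP-based estimate E = DLP × k. The conversion coefficient k below is a typical literature value for abdomen/pelvis CT and is an assumption, not a value stated in the study:

```python
def effective_dose(dlp_mgy_cm, k=0.0151):
    """DLP-based effective dose in mSv; k in mSv/(mGy*cm) is region-specific."""
    return dlp_mgy_cm * k

# The study's reported DLP of 2320 mGy*cm gives roughly the reported 35 mSv.
e = effective_dose(2320)
```

Different anatomical regions use different k values, so the same DLP maps to different effective doses for, say, head versus abdomen protocols.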

Keywords: CT urography, cancer risks, effective dose, radiation exposure

Procedia PDF Downloads 341
2861 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates at different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
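The two denoising ingredients named above, signal averaging and wavelet thresholding, can be sketched as follows. This is a generic one-level Haar illustration in plain NumPy, not the FASTSNAP processing chain; the threshold value is an assumption:

```python
import numpy as np

def stack_average(traces):
    """Average repeated soundings: SNR grows roughly as sqrt(N) for incoherent noise."""
    return np.mean(traces, axis=0)

def soft_threshold_haar(signal, thresh):
    """One-level Haar wavelet denoising with soft thresholding of the detail band."""
    s = signal[: len(signal) // 2 * 2]            # even length for pairing
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)     # low-pass: trend/decay
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)     # high-pass: noise-dominated
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thresh, 0.0)
    out = np.empty_like(s)                        # inverse Haar transform
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out
```

A production workflow would use a multi-level wavelet decomposition and pick the threshold from the estimated noise level of each acquisition mode, but the mechanics are the same: average first, then shrink the noise-dominated coefficients while preserving the decay curve.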

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 80
2860 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as A 'Quantum Leap'

Authors: Anthony Coogan

Abstract:

Born's probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author's suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his particle-in-a-box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of basic ideas: part of Schrödinger's particle-in-a-box theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived from this theory; may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that such waveforms did not exist. Complex waveforms representing a particle are usually assumed to be continuous. The actual observations made were of x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born's perspective, doing similar work in the years in question, 1926-27, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events. 
Born's interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g., collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moved in the observer's direction after the electron had moved away. Astronomers may say that they 'look out into the universe', but this logic is opposed to the views of Newton, Hooke, and many observers such as Rømer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications for the complex numbers used in applications in science and engineering, some time may pass before any consensus is reached.

Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle

Procedia PDF Downloads 198
2859 Biodiesel Production from Edible Oil Wastewater Sludge with Bioethanol Using Nano-Magnetic Catalysis

Authors: Wighens Ngoie Ilunga, Pamela J. Welz, Olewaseun O. Oyekola, Daniel Ikhu-Omoregbe

Abstract:

Currently, most sludge from the wastewater treatment plants of edible oil factories is disposed of in landfills, but landfill sites are finite and potential sources of environmental pollution. Production of biodiesel from wastewater sludge can contribute to energy production and waste minimization; however, conventional biodiesel production is energy and waste intensive. Generally, biodiesel is produced by the transesterification reaction of oils with an alcohol (i.e., methanol or ethanol) in the presence of a catalyst. Homogeneously catalysed transesterification is the conventional approach for large-scale production of biodiesel, as reaction times are relatively short. Nevertheless, homogeneous catalysis presents several challenges, such as a high probability of soap formation. The current study aimed to reuse wastewater sludge from the edible oil industry as a novel feedstock for both monounsaturated fats and bioethanol for the production of biodiesel. Preliminary results have shown that the fatty acid profile of the oilseed wastewater sludge is favourable for biodiesel production, with 48% (w/w) monounsaturated fats, and that the residue left after the extraction of fats from the sludge contains sufficient fermentable sugars, after steam explosion followed by enzymatic hydrolysis, for the successful production of bioethanol [29% (w/w)] using a commercial strain of Saccharomyces cerevisiae. A novel nano-magnetic catalyst was synthesised from alkaline mineral-processing tailings, mainly containing dolomite originating from cupriferous ores, using a modified sol-gel method. The catalyst's elemental composition and structural properties were characterised by X-ray diffraction (XRD), scanning electron microscopy (SEM), Fourier transform infrared spectroscopy (FTIR), and BET analysis, which gave a surface area of 14.3 m²/g and an average pore diameter of 34.1 nm. The mass magnetization of the nano-magnetic catalyst was 170 emu/g. Both the catalytic properties and the reusability of the catalyst were investigated. 
A maximum biodiesel yield of 78% was obtained, which dropped to 52% after the fourth transesterification reaction cycle. The proposed approach has the potential to reduce material costs, energy consumption and water usage associated with conventional biodiesel production technologies. It may also mitigate the impact of conventional biodiesel production on food and land security, while simultaneously reducing waste.

Keywords: biodiesel, bioethanol, edible oil wastewater sludge, nano-magnetism

Procedia PDF Downloads 140
2858 Harmonic Mitigation and Total Harmonic Distortion Reduction in Grid-Connected PV Systems: A Case Study Using Real-Time Data and Filtering Techniques

Authors: Atena Tazikeh Lemeski, Ismail Ozdamar

Abstract:

This study presents a detailed analysis of harmonic distortion in a grid-connected photovoltaic (PV) system using real-time data captured from a solar power plant. Harmonics introduced by inverters in PV systems can degrade power quality and lead to increased Total Harmonic Distortion (THD), which poses challenges such as transformer overheating, increased power losses, and potential grid instability. This research addresses these issues by applying Fast Fourier Transform (FFT) to identify significant harmonic components and employing notch filters to target specific frequencies, particularly the 3rd harmonic (150 Hz), which was identified as the largest contributor to THD. Initial analysis of the unfiltered voltage signal revealed a THD of 21.15%, with prominent harmonic peaks at 150 Hz, 250 Hz and 350 Hz, corresponding to the 3rd, 5th, and 7th harmonics, respectively. After implementing the notch filters, the THD was reduced to 5.72%, demonstrating the effectiveness of this approach in mitigating harmonic distortion without affecting the fundamental frequency. This paper provides practical insights into the application of real-time filtering techniques in PV systems and their role in improving overall grid stability and power quality. The results indicate that targeted harmonic mitigation is crucial for the sustainable integration of renewable energy sources into modern electrical grids.
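A small reproduction of the workflow on synthetic data (an assumed 50 Hz fundamental and assumed harmonic amplitudes, not the plant's measurements): compute THD from the FFT magnitudes, then notch out the dominant 3rd harmonic at 150 Hz:

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 5000.0
t = np.arange(0, 1.0, 1 / fs)
# Synthetic grid voltage: 50 Hz fundamental plus 3rd, 5th, 7th harmonics
v = (np.sin(2 * np.pi * 50 * t) + 0.18 * np.sin(2 * np.pi * 150 * t)
     + 0.08 * np.sin(2 * np.pi * 250 * t) + 0.05 * np.sin(2 * np.pi * 350 * t))

def thd(signal, fs, f0=50.0, n_harm=10):
    """THD from FFT magnitudes: RMS of harmonics 2..n_harm over the fundamental."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    mag = lambda f: spec[np.argmin(np.abs(freqs - f))]
    harmonics = [mag(k * f0) for k in range(2, n_harm + 1)]
    return np.sqrt(sum(h ** 2 for h in harmonics)) / mag(f0)

thd_before = thd(v, fs)
# Notch the 3rd harmonic at 150 Hz; Q sets the notch width (150/Q ~ 5 Hz here)
b, a = iirnotch(150.0, Q=30.0, fs=fs)
v_filt = filtfilt(b, a, v)
thd_after = thd(v_filt, fs)
```

Removing only the largest harmonic leaves the 5th and 7th untouched, so in practice one notch per significant harmonic (or a harmonic-rejection filter bank) is cascaded, exactly as the study targets specific frequencies identified by the FFT.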

Keywords: grid-connected photovoltaic systems, fast Fourier transform, harmonic filtering, inverter-induced harmonics

Procedia PDF Downloads 6
2857 Reconstruction and Rejection of External Disturbances in a Dynamical System

Authors: Iftikhar Ahmad, A. Benallegue, A. El Hadri

Abstract:

In this paper, we propose an observer for the reconstruction, and a control law for the rejection, of an unknown bounded external disturbance in a dynamical system. Both the observer and the controller are designed as second-order sliding modes with a proportional-integral (PI) term. Lyapunov theory is used to prove exponential convergence and stability. Simulation results are given to show the performance of this method.
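A minimal sketch in the spirit of the approach, not the authors' exact design: a super-twisting-style (second-order sliding mode) observer for a scalar plant x_dot = u + d, whose integral term reconstructs a constant bounded disturbance d. The gains, step size, and toy plant are assumptions:

```python
import numpy as np

def simulate(d=1.0, k1=2.0, k2=3.0, dt=1e-3, steps=5000):
    """Euler simulation: the observer state v converges to the disturbance d."""
    x, x_hat, v = 0.0, 0.0, 0.0
    u = 0.0                       # open-loop input, kept at zero for clarity
    d_hist = []
    for _ in range(steps):
        e = x - x_hat             # output estimation error (sliding variable)
        # correction: continuous sqrt term plus the integral of sign(e)
        x_hat += dt * (u + k1 * np.sqrt(abs(e)) * np.sign(e) + v)
        v += dt * k2 * np.sign(e)
        x += dt * (u + d)         # true plant driven by the unknown disturbance
        d_hist.append(v)          # v is the running disturbance estimate
    return np.array(d_hist)

d_est = simulate()
```

Once the estimate converges, feeding -v back into u rejects the disturbance, which mirrors the observer/controller pairing the abstract describes; the PI term in the paper plays a similar smoothing role to the integral branch here.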

Keywords: non-linear systems, sliding mode observer, disturbance rejection, nonlinear control

Procedia PDF Downloads 330