Search results for: loss distribution approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20089

19639 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement

Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova

Abstract:

One of the essential steps of avian song processing is signal filtering. The current standard methods are the Mel Bank Filter or a linear filter distribution. In this article, a new type of filter bank called the Bird-Adapted Filter is introduced, in which the signal filtering is modifiable based upon a new mathematical description of audiograms for particular bird species or orders, named the Avian Audiogram Unified Equation. With this method, filters may be deliberately distributed by frequency: they are concentrated in bands of higher hearing sensitivity, where more information is expected to be transmitted, and sparser elsewhere. Further, a comparison of various filters for automatic individual recognition of the chiffchaff (Phylloscopus collybita) is demonstrated. The average Equal Error Rate (EER) was 16.23% for the linear filter bank, 18.71% for the Mel Bank Filter, 14.29% for the Bird-Adapted Filter, and 12.95% for the Bird-Adapted Filter with the 1/3 modification. This approach would be useful in automatic systems for avian species and individual identification. Since Bird-Adapted Filter filtration is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
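As a concrete illustration of the metric used above (not the authors' pipeline, and with hypothetical match scores), the Equal Error Rate can be found by sweeping a decision threshold until the false-acceptance and false-rejection rates coincide:

```python
# Sketch: computing the Equal Error Rate (EER) used to compare filter
# banks above. Scores are hypothetical; higher = more likely genuine.

def equal_error_rate(genuine, impostor):
    """Threshold-swept EER for two score lists."""
    thresholds = sorted(set(genuine) | set(impostor))
    best = (2.0, None)
    for t in thresholds:
        frr = sum(g < t for g in genuine) / len(genuine)     # false rejections
        far = sum(i >= t for i in impostor) / len(impostor)  # false acceptances
        gap = abs(far - frr)
        if gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

genuine = [0.9, 0.8, 0.75, 0.6, 0.55]
impostor = [0.5, 0.45, 0.4, 0.65, 0.3]
print(equal_error_rate(genuine, impostor))
```

A lower EER, as reported for the Bird-Adapted Filter, means the FAR/FRR crossover happens at a smaller error level.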

Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank

Procedia PDF Downloads 361
19638 Tumor Boundary Extraction Using Intensity and Texture-Based on Gradient Vector

Authors: Namita Mittal, Himakshi Shekhawat, Ankit Vidyarthi

Abstract:

In medical research, doctors and radiologists face many complexities in analysing brain tumors in Magnetic Resonance (MR) images. Brain tumor detection is difficult due to amorphous tumor shape and the overlapping of similar tissues in nearby regions, so radiologists require a clinically viable solution that helps in the automatic segmentation of tumors inside brain MR images. Early segmentation methods detected tumors by dividing the image into segments, but this causes a loss of information. In this paper, a hybrid method is proposed that detects the Region of Interest (ROI) on the basis of differences in intensity and texture values between the tumor region and nearby tissues, using the Gradient Vector Flow (GVF) technique in the identification of the ROI. The proposed approach uses both intensity and texture values to identify the abnormal section of brain MR images. Experimental results show that the proposed method outperforms the GVF method without any loss of information.

Keywords: brain tumor, GVF, intensity, MR images, segmentation, texture

Procedia PDF Downloads 409
19637 Modeling of System Availability and Bayesian Analysis of Bivariate Distribution

Authors: Muhammad Farooq, Ahtasham Gul

Abstract:

To meet the desired standard, it is important to monitor and analyze different engineering processes to obtain the desired output. Bivariate distributions have received considerable attention in recent years for describing the randomness of natural as well as artificial mechanisms. In this article, a bivariate model is constructed from two independent models developed by the nesting approach, to study the effect of each component on reliability for better understanding. Further, a Bayes analysis of system availability is carried out by considering prior parametric variations in the failure-time and repair-time distributions. Basic statistical characteristics of the marginal distributions, such as the mean, median, and quantile function, are discussed. We use an inverse Gamma prior and study its frequentist properties by conducting a Markov Chain Monte Carlo (MCMC) sampling scheme.
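To make the kind of MCMC scheme mentioned above concrete, here is a minimal random-walk Metropolis sketch for a failure-time scale parameter under an inverse-Gamma prior. The exponential-lifetime likelihood is an illustrative assumption (not the paper's Weibull/inverse Lomax model), chosen because the posterior is then known in closed form and the chain can be checked against it:

```python
import math, random

# Sketch: random-walk Metropolis for scale theta of exponential lifetimes
# with an inverse-Gamma(a, b) prior (conjugate, so the posterior mean is
# known exactly). Illustrative only; not the paper's bivariate model.

def log_post(theta, data, a, b):
    if theta <= 0:
        return -math.inf
    n, s = len(data), sum(data)
    # exponential(mean theta) likelihood x inverse-Gamma(a, b) prior
    return -(n + a + 1) * math.log(theta) - (s + b) / theta

def metropolis(data, a=3.0, b=2.0, iters=20000, step=0.5, seed=7):
    rng = random.Random(seed)
    theta, chain = 1.0, []
    lp = log_post(theta, data, a, b)
    for _ in range(iters):
        prop = theta + rng.gauss(0, step)
        lp_prop = log_post(prop, data, a, b)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta)
    return chain[iters // 2:]                       # discard burn-in

random.seed(0)
data = [random.expovariate(1 / 2.0) for _ in range(50)]   # true scale 2.0
chain = metropolis(data)
post_mean = sum(chain) / len(chain)
exact = (sum(data) + 2.0) / (50 + 3.0 - 1.0)  # conjugate IG posterior mean
print(round(post_mean, 2), round(exact, 2))
```

The sampled posterior mean should agree closely with the analytic conjugate answer, which is the basic sanity check one would run before applying the same sampler to a non-conjugate availability model.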

Keywords: reliability, system availability, Weibull, inverse Lomax, Markov Chain Monte Carlo, Bayesian

Procedia PDF Downloads 51
19636 Remote Sensing Approach to Predict the Impacts of Land Use/Land Cover Change on Urban Thermal Comfort Using Machine Learning Algorithms

Authors: Ahmad E. Aldousaria, Abdulla Al Kafy

Abstract:

Urbanization is an incessant process that involves the transformation of land use/land cover (LULC), resulting in a reduction of cool land covers and thermal comfort zones (TCZs). This study explores the directional shrinkage of TCZs in Kuwait using Landsat satellite data from 1991–2021, and predicts the future LULC and TCZ distribution for 2026 and 2031 using cellular automata (CA) and artificial neural network (ANN) algorithms. The analysis revealed rapid urban expansion (40%) in the SE, NE, and NW directions and TCZ shrinkage in the N–NW and SW directions, with 25% of the area classified as very uncomfortable. The predictions showed the urban area increasing from 44% in 2021 to 47% and 52% in 2026 and 2031, respectively, with uncomfortable zones concentrated around urban areas and bare lands in the N–NE and N–NW directions. This study proposes an effective and sustainable framework to control TCZ shrinkage, including zero-soil policies, planned landscape design, man-made water bodies, and rooftop gardens. It will help urban planners and policymakers to make Kuwait an eco-friendly, functional, and sustainable country.

Keywords: land cover change, thermal environment, green cover loss, machine learning, remote sensing

Procedia PDF Downloads 203
19635 Analytical Formulae for the Approach Velocity Head Coefficient

Authors: Abdulrahman Abdulrahman

Abstract:

Critical-depth meters, such as the broad-crested weir, the Venturi flume, and the combined control flume, are standard devices for measuring flow in open channels. The discharge relation for these devices cannot be solved directly; it requires an iterative process to account for the approach velocity head. In this paper, an analytical solution is developed to calculate the discharge in a combined critical-depth meter, namely a hump combined with a lateral contraction in a rectangular channel with subcritical approach flow, including energy losses. Analytical formulae are also derived for the approach velocity head coefficient for different types of critical-depth meters. The solution is obtained by solving a standard cubic equation, accounting for energy loss, on the basis of a trigonometric identity. The advantage of this technique is that it avoids the iterative process otherwise required when measuring flow with these devices. Numerical examples are given to demonstrate the proposed solution.
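The trigonometric device mentioned above can be sketched on a simpler open-channel relation. Assuming the specific-energy equation y³ − E·y² + q²/(2g) = 0 as a stand-in (the paper's combined-control-meter equation is more involved), the three real roots follow in closed form from the trigonometric solution of the cubic, with no iteration:

```python
import math

# Sketch: trigonometric (non-iterative) solution of a cubic, applied to
# the specific-energy relation y^3 - E*y^2 + q^2/(2g) = 0.
# Illustrative numbers; not the paper's combined-meter formula.

def cubic_roots_trig(b, c, d):
    """Real roots of x^3 + b x^2 + c x + d = 0 (three-real-root case)."""
    p = c - b * b / 3.0                       # depressed cubic t^3 + p t + q
    q = 2.0 * b**3 / 27.0 - b * c / 3.0 + d
    m = 2.0 * math.sqrt(-p / 3.0)
    theta = math.acos(3.0 * q / (p * m)) / 3.0
    return [m * math.cos(theta - 2.0 * math.pi * k / 3.0) - b / 3.0
            for k in range(3)]

g, q_unit, E = 9.81, 1.0, 1.2                 # m/s^2, m^2/s, m of energy
roots = cubic_roots_trig(-E, 0.0, q_unit**2 / (2 * g))
depths = sorted(r for r in roots if r > 0)    # alternate depths for this E
for y in depths:
    # each positive root satisfies the specific-energy equation exactly
    assert abs(y + q_unit**2 / (2 * g * y**2) - E) < 1e-9
print([round(y, 4) for y in depths])
```

The two positive roots are the supercritical and subcritical alternate depths; the closed form is what lets the discharge relation be evaluated without the iteration the abstract refers to.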

Keywords: broad crested weir, combined control meter, control structures, critical flow, discharge measurement, flow control, hydraulic engineering, hydraulic structures, open channel flow

Procedia PDF Downloads 250
19634 Velocity Distribution in Density Currents Flowing over Rough Beds

Authors: Reza Nasrollahpour, Mohamad Hidayat Bin Jamal, Zulhilmi Bin Ismail

Abstract:

Density currents are generated when the fluid of one density is released into another fluid with a different density. These currents occur in a variety of natural and man-made environments, and this emphasises the importance of studying them. In most practical cases, the density currents flow over the surfaces which are not plane; however, there have been limited investigations in this regard. This study uses laboratory experiments to analyse the influence of bottom roughness on the velocity distribution within these dense underflows. The currents are analysed over a plane surface and three different configurations of beam-roughened beds. The velocity profiles are collected using Acoustic Doppler Velocimetry technique, and the distribution of velocity within these currents is formulated for the tested beds. The results indicate that the empirical power and Gaussian relations can describe the velocity distribution in the inner and outer regions of the profiles, respectively. Moreover, it is found that the bottom roughness is the primary controlling parameter in the inner region.

Keywords: density currents, velocity profiles, Acoustic Doppler Velocimeter, bed roughness

Procedia PDF Downloads 158
19633 The Modality of Multivariate Skew Normal Mixture

Authors: Bader Alruwaili, Surajit Ray

Abstract:

Finite mixtures are a flexible and powerful tool that can be used for univariate and multivariate distributions, and a wide range of research has been conducted on the multivariate normal mixture and the multivariate t-mixture. Determining the number of modes is an important activity that, in turn, allows one to determine the number of homogeneous groups in a population. Our current work relates to the modality of the skew normal distribution in the univariate and multivariate cases. The aims are to study the modality of the skew normal distribution and to provide the ridgeline, the ridgeline elevation function, the $\Pi$ function, and the curvature function; this is conducive to exploring the number and location of modes when mixing two skew normal components. The subsequent objective is to apply these results to real-world data sets, such as flow cytometry data.

Keywords: mode, modality, multivariate skew normal, finite mixture, number of modes

Procedia PDF Downloads 463
19632 Synchronized Vehicle Routing for Equitable Resource Allocation in Food Banks

Authors: Rabiatu Bonku, Faisal Alkaabneh

Abstract:

Inspired by the distribution operations of a non-profit food bank, we study a variant of the synchronized vehicle routing problem for equitable resource allocation. This paper introduces a Mixed Integer Programming (MIP) model aimed at addressing the complex challenge of efficiently distributing vital resources, particularly for food banks serving vulnerable populations in urban areas. Our optimization approach places a strong emphasis on social equity, ensuring a fair allocation of food to partner agencies while minimizing wastage. The primary objective is to enhance operational efficiency while guaranteeing fair distribution and timely deliveries to prevent food spoilage. Furthermore, we assess four distinct models that consider various aspects of sustainability, including social and economic factors. We conduct a comprehensive numerical analysis using real-world data to gain insight into the trade-offs that arise and to demonstrate the models' performance in terms of fairness, effectiveness, and the percentage of food waste, providing valuable managerial insights for food bank managers. We show that our proposed approach makes a significant contribution to the field of logistics optimization and social responsibility, offering valuable insights for improving the operations of food banks.

Keywords: food banks, humanitarian logistics, equitable resource allocation, synchronized vehicle routing

Procedia PDF Downloads 39
19631 Point Estimation for the Type II Generalized Logistic Distribution Based on Progressively Censored Data

Authors: Rana Rimawi, Ayman Baklizi

Abstract:

Skewed distributions are important models that are frequently used in applications. Generalized distributions form a class of skewed distributions and have gained widespread use in applications because of their flexibility in data analysis. More specifically, the Generalized Logistic Distribution, with its different types, has received considerable attention recently. In this study, based on progressively Type-II censored data, we consider point estimation for the Type II Generalized Logistic Distribution (Type II GLD). We develop several estimators for its unknown parameters, including maximum likelihood estimators (MLE), Bayes estimators, and best linear unbiased estimators (BLUE). The estimators are compared by simulation using the criteria of bias and mean squared error (MSE). An illustrative example with a real data set is given.

Keywords: point estimation, type II generalized logistic distribution, progressive censoring, maximum likelihood estimation

Procedia PDF Downloads 170
19630 An Investigation on Electric Field Distribution around 380 kV Transmission Line for Various Pylon Models

Authors: C. F. Kumru, C. Kocatepe, O. Arikan

Abstract:

In this study, electric field distribution analyses for three pylon models are carried out with Finite Element Method (FEM) based software. Analyses are performed in both the stationary and time domains to observe instantaneous values along with the effective ones. The results show that different line geometries considerably affect the magnitude and distribution of the electric field even though the line voltages are the same. Furthermore, the maximum instantaneous electric field values obtained in the time-domain analysis are considerably higher than the effective values in the stationary mode. Consequently, electric field distribution analyses should be made individually for each line model, and the exposure limit values or distances to residential buildings should be defined according to the results obtained.

Keywords: electric field, energy transmission line, finite element method, pylon

Procedia PDF Downloads 706
19629 Analysis of Operating Speed on Four-Lane Divided Highways under Mixed Traffic Conditions

Authors: Chaitanya Varma, Arpan Mehar

Abstract:

The present study demonstrates a procedure for analysing speed data collected on various four-lane divided sections in India. Field data were collected at different straight and curved sections on rural highways with the help of a radar speed gun and a video camera. The data were analysed and parameters pertaining to the speed distributions were estimated. Different statistical distributions were fitted to the vehicle-type speed data and to the mixed traffic speed data. It was found that vehicle-type speed data follow either the normal or the log-normal distribution, whereas mixed traffic speed data follow more than one type of statistical distribution, most commonly the Beta and Weibull distributions. Separate operating speed models based on traffic and roadway geometric parameters are proposed in the present study, including models with traffic parameters and curve geometry parameters. Two operating speed models, with the variables 1/R and Ln(R), were found to be realistic over different ranges of curve radius. The models developed in the present study are simple and realistic and can be used for forecasting operating speed on four-lane highways.
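The normal-versus-log-normal comparison described above can be sketched with a simple moment fit and log-likelihood comparison. The speeds below are synthetic (not the field data), and the moment-based fit is an assumption standing in for whatever estimation the authors used:

```python
import math, random

# Sketch: fitting normal vs. log-normal models to spot-speed data and
# picking the better fit by log-likelihood. Synthetic speeds in km/h.

def normal_loglik(x, mu, sd):
    return sum(-0.5 * math.log(2 * math.pi * sd * sd)
               - (v - mu) ** 2 / (2 * sd * sd) for v in x)

def fit_and_compare(speeds):
    n = len(speeds)
    mu = sum(speeds) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in speeds) / n)
    logs = [math.log(v) for v in speeds]
    lmu = sum(logs) / n
    lsd = math.sqrt(sum((u - lmu) ** 2 for u in logs) / n)
    ll_norm = normal_loglik(speeds, mu, sd)
    # log-normal loglik = normal loglik of log-speeds minus sum(log x)
    ll_lnorm = normal_loglik(logs, lmu, lsd) - sum(logs)
    return ('log-normal', ll_lnorm) if ll_lnorm > ll_norm else ('normal', ll_norm)

random.seed(1)
speeds = [random.lognormvariate(4.0, 0.25) for _ in range(500)]
print(fit_and_compare(speeds)[0])
```

The same comparison, extended to Beta and Weibull candidates, is the kind of goodness-of-fit screening the abstract reports for mixed traffic data.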

Keywords: highway, mixed traffic flow, modeling, operating speed

Procedia PDF Downloads 439
19628 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index

Authors: Todd Zhou, Mikhail Yurochkin

Abstract:

Out-of-distribution (OOD) detection is receiving increasing attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This burgeoning field calls for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. Exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, "CombinedScore," as the evaluation criterion. This score is formed by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias was explored through an equality-of-opportunity violation measurement, and model performance was further improved through calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.

Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index

Procedia PDF Downloads 102
19627 Climate Change Impacts on Future Wheat Growing Areas

Authors: Rasha Aljaryian, Lalit Kumar

Abstract:

Climate is undergoing continuous change, and this trend will affect the cultivation areas of most crops, including wheat (Triticum aestivum L.). The currently suitable cultivation areas may become climatically unsuitable. Countries that depend on wheat cultivation and export may suffer economic losses because of production decline; on the other hand, some regions of the world could gain economically from increased cultivation areas. This study models the potential future climatic suitability of wheat using CLIMEX software. Two global climate models (GCMs) were used, CSIRO-Mk3.0 (CS) and MIROC-H (MR), with two emission scenarios (A2, A1B). The results indicate that the climatically suitable areas for wheat in the southern hemisphere, such as Australia, are expected to contract by the end of this century, while some unsuitable or marginal areas will become climatically suitable under future climate scenarios. In North America and Europe, further expansion inland could occur. The results also illustrate that heat and dry stresses, as abiotic climatic factors, will play an important role in future wheat distribution. Sufficient information about future wheat distribution will be useful for agricultural ministries and organizations managing the shift in production areas; they can minimize the expected harmful economic consequences by preparing strategic plans and identifying new areas for wheat cultivation.

Keywords: climate change, climate modelling, CLIMEX, Triticum aestivum, wheat

Procedia PDF Downloads 223
19626 Challenges in Early Diagnosis of Enlarged Vestibular Aqueduct (EVA) in Pediatric Population: A Single Case Report

Authors: Asha Manoharan, Sooraj A. O, Anju K. G

Abstract:

Enlarged vestibular aqueduct (EVA) refers to the presence of congenital sensorineural hearing loss with an enlarged vestibular aqueduct. The audiological symptoms of EVA are fluctuating and progressive in nature, and the diagnosis of EVA can be confirmed only with radiological evaluation. Hence, it is difficult to differentiate EVA from conditions such as Meniere's disease and semicircular canal dehiscence based on audiological findings alone. EVA in adults is easy to identify due to distinct vestibular symptoms; in children, EVA can remain unidentified or misdiagnosed until the vestibular symptoms become evident. Motor developmental delay, especially involving changes of body alignment, has been reported in the pediatric population with EVA, so radiological evaluation should be mandatory for young children with fluctuating hearing loss who present with motor developmental delay. This single case study of a baby with EVA primarily aimed to address the following: a) challenges in diagnosing young patients with EVA and fluctuating hearing loss; b) the importance of radiological evaluation in audiological diagnosis in the pediatric population; c) the need to closely monitor hearing, hearing aid performance, and cochlear implant mapping for potential fluctuations in such populations; d) the importance of reviewing developmental and language milestones in very young children with fluctuating hearing loss.

Keywords: enlarged vestibular aqueduct (EVA), motor delay, radiological evaluation, fluctuating hearing loss, cochlear implant

Procedia PDF Downloads 135
19625 Force Distribution and Muscles Activation for Ankle Instability Patients with Rigid and Kinesiotape while Standing

Authors: Norazlin Mohamad, Saiful Adli Bukry, Zarina Zahari, Haidzir Manaf, Hanafi Sawalludin

Abstract:

Background: Deficits in neuromuscular recruitment and decreased force distribution are common problems among ankle instability patients, due to altered joint kinematics that lead to recurrent ankle injuries. Rigid tape and KT tape have been widely used as therapeutic and performance-enhancement tools for ankle stability; however, the difference in effect between these two tapes is still controversial. Objective: To investigate the differing effects of rigid tape and KT tape on force distribution and muscle activation among ankle instability patients while standing. Study design: Crossover trial. Participants: 27 patients, aged 18 to 30 years, participated in this study. All subjects had KT tape and rigid tape applied to their affected ankle, with an interval of 3 days between interventions. The subjects were first tested barefoot (without tape) to establish a baseline, then with KT tape, and finally with rigid tape. Results: There was no significant difference in force distribution at the forefoot and back-foot for either tape while standing. However, the mean data show that rigid tape produced the highest force distribution at the back-foot, whereas KT tape produced more force distribution at the forefoot. Regarding muscle activation of the peroneus longus, results showed a significant difference between rigid tape and KT tape (p = 0.048); there was no significant difference in tibialis anterior muscle activation between the tapes. Conclusion: The results indicate that the peroneus longus muscle was more active with rigid tape than with KT tape in ankle instability patients while standing.

Keywords: ankle instability, kinematic, muscle activation, force distribution, Rigid Tape, KT tape

Procedia PDF Downloads 384
19624 Conservativeness of Probabilistic Constrained Optimal Control Method for Unknown Probability Distribution

Authors: Tomoaki Hashimoto

Abstract:

In recent decades, probabilistic constrained optimal control problems have attracted much attention in many research fields. Although probabilistic constraints are generally intractable in an optimization problem, several tractable methods have been proposed to handle them. In most methods, probabilistic constraints are reduced to deterministic constraints that are tractable in an optimization problem. However, there is a gap between the transformed deterministic constraints in the cases of known and unknown probability distributions. This paper examines the conservativeness of a probabilistic constrained optimization method under an unknown probability distribution. The objective of this paper is to provide a quantitative assessment of the conservatism of tractable constraints in probabilistic constrained optimization with an unknown probability distribution.
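The known-versus-unknown-distribution gap described above can be made concrete for a scalar chance constraint P(x ≤ x_max) ≥ 1 − ε. Assuming a Gaussian disturbance when the distribution is known, and only mean/variance information otherwise (via the Cantelli inequality), the deterministic margins differ markedly; this is illustrative, not the paper's system:

```python
import math

# Sketch: constraint-tightening margins for P(violation) <= eps.
# Known Gaussian case: margin = z_{1-eps} * sigma.
# Unknown distribution (mean/variance only): Cantelli gives
# margin = sigma * sqrt((1-eps)/eps), which is more conservative.

def gauss_quantile(p):
    """Standard normal quantile via bisection on math.erf."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def margins(sigma, eps):
    known = gauss_quantile(1 - eps) * sigma        # Gaussian case
    unknown = sigma * math.sqrt((1 - eps) / eps)   # distribution-free bound
    return known, unknown

known, unknown = margins(sigma=1.0, eps=0.05)
print(round(known, 3), round(unknown, 3), round(unknown / known, 2))
```

At ε = 0.05 the distribution-free margin is roughly 2.6 times the Gaussian one, which is exactly the kind of conservatism gap the abstract sets out to quantify.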

Keywords: optimal control, stochastic systems, discrete time systems, probabilistic constraints

Procedia PDF Downloads 554
19623 An Extended Inverse Pareto Distribution, with Applications

Authors: Abdel Hadi Ebraheim

Abstract:

This paper introduces a new extension of the inverse Pareto distribution in the framework of the Marshall-Olkin (1997) family of distributions. The model is capable of describing various shapes of aging and failure data. The statistical properties of the new model are discussed, and several methods are used to estimate the parameters involved. Explicit expressions are derived for different types of moments that are of value in reliability analysis. The order statistics of samples from the new model are also studied. Finally, the usefulness of the new model for modeling reliability data is illustrated using two real data sets together with a simulation study.
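The Marshall-Olkin construction named above can be sketched directly on a baseline survival function. The inverse Pareto parametrization F(x) = (x/(x+θ))^β used here is one common form and an assumption on our part; the paper's exact model may differ:

```python
# Sketch of the Marshall-Olkin tilt of a baseline survival function S(x):
#   G(x) = alpha * S(x) / (1 - (1 - alpha) * S(x)),  alpha > 0,
# applied to an assumed inverse Pareto baseline F(x) = (x/(x+theta))**beta.

def inv_pareto_sf(x, beta, theta):
    return 1.0 - (x / (x + theta)) ** beta          # baseline survival

def mo_sf(x, alpha, beta, theta):
    s = inv_pareto_sf(x, beta, theta)
    return alpha * s / (1.0 - (1.0 - alpha) * s)    # Marshall-Olkin survival

# alpha = 1 must recover the baseline exactly
for x in [0.5, 1.0, 5.0, 50.0]:
    assert abs(mo_sf(x, 1.0, 2.0, 1.5) - inv_pareto_sf(x, 2.0, 1.5)) < 1e-12
# the tilted family stays a valid survival function: monotone decreasing
vals = [mo_sf(x, 0.4, 2.0, 1.5) for x in [0.1, 1.0, 10.0, 100.0]]
assert all(a > b for a, b in zip(vals, vals[1:]))
print([round(v, 4) for v in vals])
```

The extra parameter α is what gives the extended family its additional flexibility in hazard shape relative to the baseline.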

Keywords: Pareto distribution, Marshall-Olkin, reliability, hazard functions, moments, estimation

Procedia PDF Downloads 57
19622 Food Foam Characterization: Rheology, Texture and Microstructure Studies

Authors: Rutuja Upadhyay, Anurag Mehra

Abstract:

Solid food foams/cellular foods are colloidal systems that impart structure, texture, and mouthfeel to many food products such as bread, cakes, ice cream, and meringues. Their heterogeneous morphology makes quantifying structure/mechanical relationships complex. The porous structure of solid food foams is highly influenced by processing conditions, ingredient composition, and their interactions. Sensory perception of food foams depends on bubble size, shape, orientation, quantity, and distribution, which determine the texture of foamed foods. The state and structure of the solid matrix control the deformation behavior of the food, such as elasticity/plasticity or fracture, which in turn affects the force-deformation curves. The obvious step in relating the mechanical properties to the porous structure is to quantify them simultaneously. Here, we study food foams such as bread dough, baked bread, and steamed rice cakes to determine the link between ingredients and their effects on the rheology, microstructure, bubble size, and texture of the final product. Dynamic rheometry (SAOS), confocal laser scanning microscopy, flatbed scanning, image analysis, and texture profile analysis (TPA) have been used to characterize the foods studied. In all the above systems, a common observation was that when the mean bubble diameter is smaller, the product is harder, as evidenced by an increase in the storage and loss moduli (G′, G″), whereas when the mean bubble diameter is larger, the product is softer, with lower moduli values. The bubble size distribution also affects food texture: bread doughs with hydrocolloids (xanthan gum, alginate) yield a more uniform bubble size distribution. Bread baking experiments were done to study the rheological changes and mechanisms involved in the structural transition of dough to crumb. Steamed rice cakes with xanthan gum (XG) added at 0.1% concentration showed lower hardness, with a narrower pore size distribution and larger mean pore diameter. Thus, control of bubble size could be an important parameter defining final food texture.

Keywords: food foams, rheology, microstructure, texture

Procedia PDF Downloads 312
19621 A Comparative Study of Generalized Autoregressive Conditional Heteroskedasticity (GARCH) and Extreme Value Theory (EVT) Model in Modeling Value-at-Risk (VaR)

Authors: Longqing Li

Abstract:

This paper addresses the inefficiency of the classical model in measuring Value-at-Risk (VaR) using a normal or a Student's t distribution. Specifically, it focuses on the one-day-ahead VaR of major stock markets' daily returns in the US, UK, China, and Hong Kong over the most recent ten years, at the 95% confidence level. To improve predictive power and find the best-performing model, the paper proposes two leading alternatives, Extreme Value Theory (EVT) and a family of GARCH models, and compares their relative performance. The contribution can be summarized in two aspects. First, the paper extends the GARCH family by incorporating EGARCH and TGARCH to shed light on the differences between them in estimating one-day-ahead VaR. Second, to account for the non-normality of financial market returns, the paper applies the Generalized Error Distribution (GED), instead of the normal distribution, to govern the innovation term. A dynamic back-testing procedure is employed to assess the performance of each model in the GARCH family and of the conditional EVT. The conclusion is that the exponential GARCH yields the best out-of-sample one-day-ahead VaR forecasts, while the performance gap between GARCH and the conditional EVT is indistinguishable.
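The variance recursion behind the one-day-ahead parametric VaR compared above can be sketched as follows. The GARCH(1,1) parameters and the normal innovation are illustrative assumptions; the paper's EGARCH/TGARCH/GED refinements and parameter estimation are omitted:

```python
import math

# Sketch: GARCH(1,1) filter sigma_{t+1}^2 = omega + alpha*r_t^2 + beta*sigma_t^2
# feeding a one-day-ahead 95% parametric VaR with a normal innovation.
# Parameters are fixed illustrative values, not fitted.

def garch_var_forecast(returns, omega=1e-6, alpha=0.08, beta=0.9,
                       z95=1.6449):
    """One-day-ahead 95% VaR, returned as a positive loss fraction."""
    var = sum(r * r for r in returns) / len(returns)  # init at sample variance
    for r in returns:
        var = omega + alpha * r * r + beta * var      # next-day variance
    return z95 * math.sqrt(var)

calm = [0.001 * (-1) ** i for i in range(250)]        # quiet market
stressed = calm[:-5] + [0.04, -0.05, 0.03, -0.04, 0.05]  # recent shocks
assert garch_var_forecast(stressed) > garch_var_forecast(calm)
print(round(garch_var_forecast(calm), 4), round(garch_var_forecast(stressed), 4))
```

The point of the recursion is visible in the comparison: recent large shocks inflate the forecast variance, so the VaR reacts, which is what the dynamic back-testing procedure in the abstract evaluates.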

Keywords: Value-at-Risk, Extreme Value Theory, conditional EVT, backtesting

Procedia PDF Downloads 300
19620 Li2O Loss of Lithium Niobate Nanocrystals during High-Energy Ball-Milling

Authors: Laura Kocsor, Laszlo Peter, Laszlo Kovacs, Zsolt Kis

Abstract:

The aim of our research is to prepare rare-earth-doped lithium niobate (LiNbO3) nanocrystals having only a few dopant ions in the focal point of an exciting laser beam. These samples will be used to achieve individual addressing of the dopant ions by light beams in a confocal microscope setup. One method for preparing nanocrystalline materials is to reduce the particle size by mechanical grinding, and high-energy ball-milling has been used in several works to produce nano lithium niobate. It was previously reported that dry high-energy ball-milling of lithium niobate in a shaker mill results in partial reduction of the material, leading to a balanced formation of bipolarons and polarons yielding a gray color, together with oxygen release and Li2O segregation on the open surfaces. In the present work we focus on preparing LiNbO3 nanocrystals by high-energy ball-milling using a Fritsch Pulverisette 7 planetary mill. Every ball-milling process was carried out in a zirconia vial with zirconia balls of different sizes (from 3 mm down to 0.1 mm), wet grinding with water, with grinding times of less than an hour. By gradually decreasing the ball size to 0.1 mm, an average particle size of about 10 nm could be obtained, as determined by dynamic light scattering and verified by scanning electron microscopy. High-energy ball-milling resulted in sample darkening, evidenced by optical absorption spectroscopy measurements indicating that the material underwent partial reduction. The unwanted lithium oxide loss decreases the Li/Nb ratio in the crystal, strongly influencing the spectroscopic properties of lithium niobate. Zirconia contamination was found in the ground samples, proved by energy-dispersive X-ray spectroscopy measurements; however, it cannot be explained by the hardness properties of the materials involved in the ball-milling process alone. It can be understood by taking into account the presence of lithium hydroxide, formed from the segregated lithium oxide and water during the ball-milling process, through chemically induced abrasion. The quantity of the segregated Li2O was measured by coulometric titration. During wet milling in the planetary mill, the lithium oxide loss was found to increase linearly in the early phase of the milling process, after which the Li2O loss saturates. This change accompanies the disappearance of the relatively large particles until a relatively narrow size distribution is achieved, in accord with the dynamic light scattering measurements. With a 3 mm ball size and 1100 rpm rotation rate, the mean particle size achieved is 100 nm, and the total Li2O loss is about 1.2 wt.% of the original LiNbO3. Further investigations have been carried out to minimize Li2O segregation during the ball-milling process; since the Li2O loss was observed to increase with the growing total surface of the particles, the influence of the ball-milling parameters on its quantity has also been studied.

Keywords: high-energy ball-milling, lithium niobate, mechanochemical reaction, nanocrystals

Procedia PDF Downloads 105
19619 DCDNet: Lightweight Document Corner Detection Network Based on Attention Mechanism

Authors: Kun Xu, Yuan Xu, Jia Qiao

Abstract:

Document detection plays an important role in optical character recognition and text analysis. Because traditional detection methods have weak generalization ability, while deep neural networks have complex structures and large numbers of parameters that make them hard to apply on mobile devices, this paper proposes a lightweight Document Corner Detection Network (DCDNet). DCDNet is a two-stage architecture. The first stage, with an Encoder-Decoder structure, adopts depthwise separable convolution to greatly reduce the network parameters. After introducing the Feature Attention Union (FAU) module, the second stage enhances the feature information along the spatial and channel dimensions and adaptively adjusts the size of the receptive field to enhance the feature expression ability of the model. To address the large imbalance in pixel counts between corner and non-corner regions, a Weighted Binary Cross Entropy Loss (WBCE Loss) is proposed, which casts corner detection as a classification problem and makes the training process more efficient. To make up for the lack of datasets for document corner detection, a dataset containing 6,620 images, named the Document Corner Detection Dataset (DCDD), was created. Experimental results show that the proposed method obtains fast, stable, and accurate detection results on DCDD.
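The abstract does not spell out the exact form of the WBCE Loss; the following is a minimal NumPy sketch of one standard weighting scheme, where `pos_weight` (an assumed name, not from the paper) up-weights the rare positive (corner) pixels:

```python
import numpy as np

def weighted_bce(y_true, y_pred, pos_weight=1.0, eps=1e-7):
    """Weighted binary cross entropy for per-pixel corner classification.

    pos_weight > 1 up-weights the rare positive (corner) pixels to counter
    the corner/non-corner class imbalance. The exact weighting DCDNet uses
    may differ; this is a common formulation.
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid log(0)
    loss = -(pos_weight * y_true * np.log(y_pred)
             + (1.0 - y_true) * np.log(1.0 - y_pred))
    return loss.mean()

# With pos_weight = 1 this reduces to ordinary binary cross entropy.
y_true = np.array([1.0, 0.0])
y_pred = np.array([0.9, 0.1])
bce = weighted_bce(y_true, y_pred)
wbce = weighted_bce(y_true, y_pred, pos_weight=5.0)  # positives count 5x
```

With a heavily imbalanced corner map, `pos_weight` is typically set near the ratio of non-corner to corner pixels so both classes contribute comparably to the gradient.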

Keywords: document detection, corner detection, attention mechanism, lightweight

Procedia PDF Downloads 331
19618 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios

Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu

Abstract:

Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is in fact more efficient to calculate the transform of the distribution function in the Fourier domain; inverting back to the real domain can then be done in just one step and semi-analytically, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills a gap in the literature, to the best of our knowledge, for accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are tested to be significantly superior to Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, primarily driven by the number of factors instead of the number of obligors, as is the case with Monte Carlo simulation.
The limitation of this method lies in the "curse of dimensionality" intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential applications of this method cover a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even to risk types other than credit risk.
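To illustrate the core mechanism, here is a small, hypothetical Python sketch of the plain COS method recovering a density from its characteristic function; the paper's actual implementation, with its adjustments for the loss distribution and risk allocation, is more involved:

```python
import numpy as np

def cos_density(cf, x, a, b, N):
    """Recover a density on the truncated support [a, b] from its
    characteristic function cf via the COS (Fourier-cosine) expansion."""
    k = np.arange(N)
    u = k * np.pi / (b - a)
    # Cosine coefficients approximated from the characteristic function
    F = (2.0 / (b - a)) * np.real(cf(u) * np.exp(-1j * u * a))
    F[0] *= 0.5  # the k = 0 term is halved in the expansion
    return F @ np.cos(np.outer(u, x - a))

# Sanity check on a standard normal, whose cf is exp(-u^2 / 2)
x = np.linspace(-3.0, 3.0, 7)
f = cos_density(lambda u: np.exp(-0.5 * u**2), x, a=-10.0, b=10.0, N=128)
exact = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
```

For smooth densities the approximation error decays exponentially in N, which is the fast error convergence the abstract refers to.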

Keywords: credit portfolio, risk allocation, factor-copula model, COS method, Fourier method

Procedia PDF Downloads 129
19617 Evolution of Nettlespurge Oil Mud for Drilling Mud System: A Comparative Study of Diesel Oil and Nettlespurge Oil as Oil-Based Drilling Mud

Authors: Harsh Agarwal, Pratikkumar Patel, Maharshi Pathak

Abstract:

Recently, low crude oil prices and increasingly strict environmental regulations limit the use of diesel-based muds, as these muds are relatively costly and toxic; as a result, disposal of cuttings into the ecosystem is a major issue faced by the drilling industry. To overcome these issues, an attempt has been made to develop an oil-in-water emulsion mud system using nettlespurge oil. Nettlespurge oil is easily available and costs around ₹30/litre, about half the price of diesel in India. Oil-based mud (OBM) was formulated with nettlespurge oil extracted from nettlespurge seeds using the Soxhlet extraction method. The formulated nettlespurge oil mud properties were analysed against those of diesel oil mud. The compared properties were rheological properties (yield point and gel strength), mud density, and filtration loss properties (fluid loss and filter cake). The mud density measurement showed that the nettlespurge OBM was slightly denser than the diesel OBM, with mud density values of 9.175 lb/gal and 8.5 lb/gal, respectively, at a barite content of 70 g; it also has a higher lubricating property. Additionally, the filtration loss test showed a nettlespurge mud fluid loss volume of 11 ml, compared to 15 ml for the diesel oil mud. The nettlespurge oil mud produced a filter cake 2.2 mm thick, with a thin and squashy character, while the diesel oil mud produced a filter cake 2.7 mm thick that was tenacious, rubbery, and resilient. Thus, both the fluid loss volume and the filter cake thickness of the nettlespurge oil mud were lower than those of the diesel-based oil mud, indicating low formation damage; the emulsion stability effect was also analysed in this experiment.
The nettlespurge oil-in-water mud system had a lower coefficient of friction than the diesel oil-based mud system, and all the rheological properties showed better results relative to the diesel-based oil mud. Therefore, considering all the above-mentioned factors and the data from the conducted experiments, we conclude that nettlespurge oil-based mud is both economically and ecologically much more feasible than conventional diesel-based oil mud in the drilling industry.

Keywords: economic feasibility, ecological feasibility, emulsion stability, nettlespurge oil, rheological properties, Soxhlet extraction method

Procedia PDF Downloads 180
19616 50+ Customers' Behavior in the Financial Market of the Czech Republic

Authors: K. Matušínská, H. Starzyczná, M. Stoklasa

Abstract:

The paper deals with the behaviour of the 50+ segment in the financial market of the Czech Republic. This segment represents strong market power and can be a crucial business potential for financial business units. The main objective of this paper is to analyse the behaviour of customers aged 50-60 years in the Czech financial market and to propose a suitable marketing approach to satisfy their demands in the areas of product, price, distribution, and marketing communication policy. The paper is based on data from one part of a primary marketing research study. It determines the basic problem areas, defines financial services marketing, and states the primary research problem, hypotheses, and research methodology. Finally, a suitable marketing approach for the selected sub-segment aged 50-60 years is proposed according to the marketing research findings.

Keywords: population aging in the Czech Republic, segment 50-60 years, financial services marketing, marketing research, marketing approach

Procedia PDF Downloads 354
19615 Impact of the Photovoltaic Integration in Power Distribution Network: Case Study in Badak Liquefied Natural Gas (LNG)

Authors: David Hasurungan

Abstract:

The objective of this paper is to analyse the impact of photovoltaic system integration on a power distribution network. A case study from the Badak Liquefied Natural Gas (LNG) plant is presented. The Badak LNG electricity network is operated in islanded mode, and the total power generation in the plant is significantly affected by the feed gas supply. Meanwhile, to support government regulation, Badak LNG has continuously implemented grid-connected photovoltaic systems in the existing power distribution network. The interaction between train operational mode changes in the Badak LNG plant and the growth of the photovoltaic system is also encompassed in the analysis. The analysis and calculations are performed using the software PowerFactory 15.1.

Keywords: power quality, distribution network, grid-connected photovoltaic system, power management system

Procedia PDF Downloads 339
19614 Optimization Analysis of Controlled Cooling Process for H-Shape Steel Beams

Authors: Jiin-Yuh Jang, Yu-Feng Gan

Abstract:

In order to improve the comprehensive mechanical properties of steel, the cooling rate and the temperature distribution must be controlled during the cooling process. A three-dimensional numerical model for predicting the heat transfer coefficient distribution of an H-beam in the controlled cooling process was developed in order to obtain a uniform temperature distribution and minimize the maximum stress and maximum deformation after controlled cooling. An algorithm based on a simplified conjugate-gradient method was used as an optimizer to optimize the heat transfer coefficient distribution. The numerical results showed that, for the case of air cooling for 5 seconds followed by water cooling for 6 seconds with a uniform heat transfer coefficient, the cooling rate is 15.5 ℃/s, the maximum temperature difference is 85 ℃, the maximum stress is 125 MPa, and the maximum deformation is 1.280 mm. After optimizing the heat transfer coefficient distribution in the controlled cooling process with the same cooling time, the cooling rate increased to 20.5 ℃/s, the maximum temperature difference decreased to 52 ℃, the maximum stress decreased to 82 MPa, and the maximum deformation decreased to 1.167 mm.
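As a toy illustration of the optimizer family named in the abstract, a plain conjugate-gradient routine for a quadratic objective might look like the sketch below; the paper's actual simplified conjugate-gradient scheme for the inverse heat-transfer problem is, of course, problem-specific, and all names here are illustrative:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize 0.5*x^T A x - b^T x (A symmetric positive definite),
    i.e. solve A x = b, by the conjugate-gradient method."""
    x = x0.astype(float).copy()
    r = b - A @ x          # residual = negative gradient
    p = r.copy()           # initial search direction
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)      # exact line-search step
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        # New direction made A-conjugate to the previous ones
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))  # -> [1/11, 7/11]
```

In the inverse problem of the abstract, the unknowns would be the heat transfer coefficients and the objective a measure of temperature non-uniformity, with gradients supplied by the thermal model.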

Keywords: controlled cooling, H-Beam, optimization, thermal stress

Procedia PDF Downloads 344
19613 A Hybrid Distributed Algorithm for Multi-Objective Dynamic Flexible Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a hybrid distributed algorithm is suggested for the multi-objective dynamic flexible job shop scheduling problem. The proposed algorithm is high-level, in the sense that several algorithms search the space on different machines simultaneously; it is also a hybrid algorithm that takes advantage of artificial intelligence, evolutionary, and optimization methods. Distribution is done at different levels, and new approaches are used in the design of the algorithm. The Apache Spark and Hadoop frameworks have been used for the distribution of the algorithm. The Pareto optimality approach is used for solving the multi-objective benchmarks. The suggested algorithm, which is able to solve large-size problems in short times, has been compared with successful algorithms from the literature. The results prove the high speed and efficiency of the algorithm.
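The Pareto optimality criterion the authors rely on can be sketched in a few lines; the dominance test below is the standard definition for minimization (function names are illustrative, not the paper's code):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated (Pareto-optimal) solutions."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Two objectives, e.g. (makespan, total tardiness):
front = pareto_front([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)])
# (3, 3) and (4, 4) are dominated by (2, 2), so the front is the other three
```

In a distributed setting, each worker can maintain a local front like this and the driver merges them with one more `pareto_front` pass.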

Keywords: distributed algorithms, Apache Spark, Hadoop, flexible dynamic job shop scheduling, multi-objective optimization

Procedia PDF Downloads 323
19612 Shades of Memory, Echoes of Despair: Exploring Melancholy in Modern Amharic Novels

Authors: Dawit Dibekulu, Tesfaye Dagnew, Tesfamaryam G. Meskel

Abstract:

Echoing with memories of loss and whispers of despair, this study delves into the poignant world of melancholy in Sisay Nigusu's contemporary Amharic novel, ‘Yäqənat Zār’ (‘Zār of Jealousy’). Employing a psychoanalytic lens focused on Freud and Klein's theories of mourning and melancholia, we explore the psychological depths of characters ravaged by grief. Through an interpretive paradigm and descriptive research design, we unpack the intricate tapestry of the novel, revealing how love's ashes morph into melancholic despair. The loss of loved ones, be it sudden death or betrayal, casts long shadows on the characters' souls, distorting their behavior and twisting their narratives. Altered thoughts, self-blame, and paralyzing yearning become their companions, weaving a tragic dance of longing and despair. ‘Yäqənat Zār’ serves as a powerful testament to the transformative power of storytelling, allowing us to navigate the labyrinthine paths of melancholia and gain a glimpse into the Ethiopian soul grappling with loss. This study not only sheds light on the individual's struggle with sadness but also illuminates the cultural fabric of grief and melancholia intricately woven into Ethiopian society.

Keywords: melancholy, loss, psychoanalysis, grief, identity

Procedia PDF Downloads 36
19611 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone Generalized Exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS codes have been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modelling of the data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among various optimization algorithms, the trust region method is found to be the best. The TPGE model is also used to analyse the lifetime data in the Bayesian paradigm, with results evaluated on the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using several software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.
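As a hedged illustration of the simulation side (not the paper's R/JAGS code), a random-walk Metropolis sampler for a simple lifetime model can be written in a few lines of Python; a full TPGE likelihood with censoring would replace the toy exponential log-posterior used here:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=200)  # synthetic lifetimes, true rate 0.5

def log_post(lam):
    """Log-posterior for an exponential rate lam under a flat prior on lam > 0.
    A toy stand-in for the TPGE likelihood analysed in the paper."""
    if lam <= 0.0:
        return -np.inf
    return len(data) * np.log(lam) - lam * data.sum()

# Random-walk Metropolis: propose, accept with probability min(1, ratio)
lam, chain = 1.0, []
for _ in range(5000):
    prop = lam + rng.normal(scale=0.1)
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop
    chain.append(lam)

posterior_mean = np.mean(chain[1000:])  # discard burn-in
```

The posterior mean of the chain settles near the true rate of 0.5; in practice one would also check convergence diagnostics, as tools like JAGS and LaplacesDemon do.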

Keywords: Bayesian Inference, JAGS, Laplace Approximation, LaplacesDemon, posterior, R Software, simulation

Procedia PDF Downloads 504
19610 Electrical Tortuosity across Electrokinetically Remediated Soils

Authors: Waddah S. Abdullah, Khaled F. Al-Omari

Abstract:

Electrokinetic remediation is one of the most influential and effective methods for decontaminating polluted soils. Electroosmosis and electromigration are the processes of electrochemical extraction of contaminants from soils, and the driving force behind both is the voltage gradient. Therefore, the electric field distribution throughout the soil domain is extremely important to investigate, as is determining the factors that help establish a uniform electric field distribution, in order to make the clean-up process work properly and efficiently. In this study, small passive electrodes (made of graphite) were placed at predetermined locations within the soil specimen, and the voltage drop between these passive electrodes was measured in order to observe the electrical distribution throughout the tested soil specimens. The electrokinetic test was conducted on two types of soil: a sandy soil and a clayey soil. The electrical distribution throughout the soil domain was measured under different test conditions, and the electric field distribution was mapped in three dimensions to establish the electrical distribution within the soil domain. The effects of density, applied voltage, and degree of saturation on the electrical distribution within the remediated soil were investigated. The distributions of the moisture content and of the sodium and calcium ion concentrations were determined in a three-dimensional scheme. The study has shown that the electrical conductivity within the soil domain depends on the moisture content and the concentration of electrolytes present in the pore fluid. The distribution of the electric field in saturated soil was found not to be affected by its density. The study has also shown that a high voltage gradient leads to a non-uniform electric field distribution within the electroremediated soil.
Very importantly, it was found that even when the electric field distribution is uniform globally (i.e., between the passive electrodes), local non-uniformity can be established within the remediated soil mass. Cracks or air gaps formed due to temperature rise (caused by electric flow in low-conductivity regions) promote electrical tortuosity. Thus, fracturing or cracking formed in the remediated soil mass disconnects the electric current, and hence no contaminant removal occurs within these areas.

Keywords: contaminant removal, electrical tortuosity, electromigration, electroosmosis, voltage distribution

Procedia PDF Downloads 404