Search results for: queue size distribution at a random epoch
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11834

11084 Monte Carlo Methods and Statistical Inference of Multitype Branching Processes

Authors: Ana Staneva, Vessela Stoimenova

Abstract:

Parametric estimation for multitype branching processes (MBPs) with a power series offspring distribution family is considered in this paper. The MLE for the parameters is obtained in the case when the observable data are incomplete and consist only of the generation sizes of the family tree of the MBP. The parameter estimation is carried out using the Monte Carlo EM algorithm. Estimates of the posterior distribution and of the offspring distribution parameters are obtained using the Bayesian approach and the Gibbs sampler. The article presents various examples with bivariate branching processes together with computational results, simulations, and an implementation in R.
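The conjugate update underlying such a Bayesian analysis can be sketched for the simplest single-type case: with a Poisson(m) offspring law and a Gamma(a, b) prior on the mean m, observing only the generation sizes yields a gamma posterior. This is an illustrative sketch, not the authors' R implementation; the function name and hyperparameters are hypothetical.

```python
def posterior_offspring_mean(generation_sizes, a=1.0, b=1.0):
    """Gamma(a, b) prior on the Poisson offspring mean m of a single-type
    Galton-Watson process. Observing generation sizes Z_0, ..., Z_n gives,
    by conjugacy, the posterior Gamma(a + Z_1+...+Z_n, b + Z_0+...+Z_{n-1})."""
    offspring = sum(generation_sizes[1:])   # total children observed
    parents = sum(generation_sizes[:-1])    # total parents observed
    shape, rate = a + offspring, b + parents
    return shape, rate, shape / rate        # posterior mean of m

# Example: generations 1, 2, 4, 8 suggest an offspring mean near 2
shape, rate, mean = posterior_offspring_mean([1, 2, 4, 8])
```

The same conjugate structure is what makes the Gibbs sampler tractable in the multitype, power-series case treated in the paper.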

Keywords: Bayesian, branching processes, EM algorithm, Gibbs sampler, Monte Carlo methods, statistical estimation

Procedia PDF Downloads 422
11083 Influence of Optical Fluence Distribution on Photoacoustic Imaging

Authors: Mohamed K. Metwally, Sherif H. El-Gohary, Kyung Min Byun, Seung Moo Han, Soo Yeol Lee, Min Hyoung Cho, Gon Khang, Jinsung Cho, Tae-Seong Kim

Abstract:

Photoacoustic imaging (PAI) is a non-invasive and non-ionizing imaging modality that combines the absorption contrast of light with ultrasound resolution. A laser is used to deposit optical energy into a target (i.e., optical fluence). Consequently, the target temperature rises, and the resulting thermal expansion generates a PA signal. In general, most image reconstruction algorithms for PAI assume uniform fluence within the imaged object. However, it is known that the optical fluence distribution within the object is non-uniform, which can affect the reconstruction of PA images. In this study, we investigated the influence of the optical fluence distribution on PA back-propagation imaging using the finite element method. The uniform fluence was simulated as a triangular waveform within the object of interest. The non-uniform fluence distribution was estimated by solving light propagation within a tissue model via the Monte Carlo method. The results show that the PA signal in the non-uniform case is 23% wider than in the uniform case. The frequency spectrum of the PA signal due to the non-uniform fluence lacks some of the high-frequency components present in the uniform case. Consequently, the image reconstructed with the non-uniform fluence exhibits a strong smoothing effect.

Keywords: finite element method, fluence distribution, Monte Carlo method, photoacoustic imaging

Procedia PDF Downloads 378
11082 Bayesian Analysis of Change Point Problems Using Conditionally Specified Priors

Authors: Golnaz Shahtahmassebi, Jose Maria Sarabia

Abstract:

In this talk, we introduce a new class of conjugate prior distributions obtained from the conditional specification methodology. We illustrate the application of such distributions to Bayesian change point detection in Poisson processes. We obtain the posterior distribution of the model parameters using a general bivariate distribution with gamma conditionals. Simulation from the posterior is readily implemented using a Gibbs sampling algorithm. Gibbs sampling can be implemented even with conditional densities that are incompatible or compatible only with an improper joint density. The application of these methods is demonstrated using examples of simulated and real data.
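A minimal sketch of a Gibbs sampler for a single change point in Poisson counts follows; it uses independent gamma priors rather than the gamma-conditionals family developed in the paper, and all names and hyperparameters are hypothetical.

```python
import math
import random

def gibbs_poisson_changepoint(counts, iters=300, a=2.0, b=1.0, seed=1):
    """Gibbs sampler for one change point tau in Poisson counts:
    rate l1 before tau, l2 after, each with a Gamma(a, b) prior."""
    rng = random.Random(seed)
    n = len(counts)
    tau = n // 2
    taus = []
    for _ in range(iters):
        # rate full conditionals are gamma by conjugacy (gammavariate takes scale)
        l1 = rng.gammavariate(a + sum(counts[:tau]), 1.0 / (b + tau))
        l2 = rng.gammavariate(a + sum(counts[tau:]), 1.0 / (b + n - tau))
        # tau full conditional: discrete distribution over 1..n-1
        logw = []
        for t in range(1, n):
            s1, s2 = sum(counts[:t]), sum(counts[t:])
            logw.append(s1 * math.log(l1) - t * l1
                        + s2 * math.log(l2) - (n - t) * l2)
        m = max(logw)
        w = [math.exp(x - m) for x in logw]
        u = rng.random() * sum(w)
        acc = 0.0
        for i, wi in enumerate(w):
            acc += wi
            if u <= acc:
                tau = i + 1
                break
        taus.append(tau)
    return taus
```

With a clear shift in the counts, the sampled change points concentrate sharply on the true location.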

Keywords: change point, Bayesian inference, Gibbs sampler, conditional specification, gamma conditional distributions

Procedia PDF Downloads 189
11081 Different Sampling Schemes for Semi-Parametric Frailty Model

Authors: Nursel Koyuncu, Nihal Ata Tutkun

Abstract:

The frailty model is a survival model that accounts for unobserved heterogeneity when exploring the relationship between an individual's survival and several covariates. In recent years, proposed survival models have become more complex, which causes convergence problems, especially in large data sets. Selecting a sample from such big data sets is therefore very important for parameter estimation. In the sampling literature, some authors have defined new sampling schemes to estimate parameters more accurately. To this end, we examine the effect of sampling design on the semi-parametric frailty model. We conducted a simulation study in R to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. As the population, we used a data set recording 17260 male civil servants aged 40–64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that ranked set sampling performs better than simple random sampling in every scenario.
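The two sampling schemes compared in the study can be sketched as follows. A known property of ranked set sampling is that its sample mean is more efficient than the simple random sampling mean for the same number of measured units; the uniform population and function names below are illustrative, not the civil servants data.

```python
import random
import statistics

def srs_mean(pop, n, rng):
    """Simple random sampling: measure n units drawn at random."""
    return statistics.fmean(rng.sample(pop, n))

def rss_mean(pop, k, rng):
    """One ranked set sampling cycle of set size k: draw k sets of k units,
    rank each set, and measure only the i-th order statistic of the i-th
    set, so k units are measured in total."""
    picks = []
    for i in range(k):
        ranked = sorted(rng.sample(pop, k))
        picks.append(ranked[i])
    return statistics.fmean(picks)
```

Repeating both estimators over many replications shows the variance reduction of RSS, which mirrors the efficiency gain reported in the simulation study.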

Keywords: frailty model, ranked set sampling, efficiency, simple random sampling

Procedia PDF Downloads 212
11080 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water

Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer

Abstract:

The widespread use of sulfamethoxazole (SUX) has become a global threat to human health owing to water contamination from diverse sources. This study addressed the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globular size, size distribution, zeta potential, and removal efficiency. Design Expert (DoE) software further helped to identify the critical factors with a direct impact on percent removal efficiency, size, and viscosity. Morphology was visualized under transmission electron microscopy (TEM). Finally, the treated water was analyzed to confirm the absence of the tested drug, employing ICP-OES (inductively coupled plasma optical emission spectroscopy) and HPLC (high-performance liquid chromatography). Results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG). The experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the predictions. Various green nanoemulsions were fabricated and evaluated in vitro. Globular size (100-300 nm), PDI (0.1-0.5), zeta potential (~25 mV), and removal efficiency (%RE = 70-98%) were found to be in acceptable ranges for deciding the input factors and levels in the DoE. The experimental design tool helped identify the most critical variables controlling %RE and the optimized nanoemulsion composition under the set constraints. Dispersion time was varied from 5-30 min. Finally, ICP-OES and HPLC corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economic, selective, and efficient.

Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, ICP-OES, Hansen program (HSPiP software)

Procedia PDF Downloads 84
11079 The Pressure Distribution on the Rectangular and Trapezoidal Storage Tanks' Perimeters Due to Liquid Sloshing Impact

Authors: Hassan Saghi, Gholam Reza Askarzadeh Garmroud, Seyyed Ali Reza Emamian

Abstract:

Sloshing is a complicated free-surface flow problem that increases the dynamic pressure on the sidewalls and bottom of storage tanks. When storage tanks are partially filled, it is essential to be able to evaluate the fluid dynamic loads on the tank's perimeter. In this paper, a numerical code was developed to determine the pressure distribution on the perimeters of rectangular and trapezoidal storage tanks due to liquid sloshing impact. Assuming the fluid to be inviscid, the Laplace equation and the nonlinear free surface boundary conditions are solved using a coupled BEM-FEM. The code's performance for sloshing modeling is validated against available data. Finally, the code is applied to partially filled rectangular and trapezoidal storage tanks, and the pressure distribution on the tanks' perimeters due to liquid sloshing impact is estimated. The results show that the maximum pressure on the perimeter of both rectangular and trapezoidal storage tanks decreases along the sidewalls from top to bottom. Furthermore, the period of the pressure distribution differs between points on the tank's perimeter and is larger in trapezoidal tanks than in rectangular ones.

Keywords: pressure distribution, liquid sloshing impact, sway motion, trapezoidal storage tank, coupled BEM-FEM

Procedia PDF Downloads 552
11078 Advanced Model for Calculation of the Neutral Axis Shifting and the Wall Thickness Distribution in Rotary Draw Bending Processes

Authors: B. Engel, H. Hassan

Abstract:

Rotary draw bending is a method used in tube forming. In the tube bending process, the neutral axis moves towards the inner arc, and the wall thickness distribution changes over the tube's cross section. Thinning takes place in the outer arc of the tube (extrados) due to stretching of the material, whereas thickening occurs in the inner arc (intrados) due to compression of the material. Calculations of the wall thickness distribution, neutral axis shifting, and strain distribution have not been accurate enough so far. The previous (geometrical) model describes the neutral axis shifting and wall thickness distribution. The geometry of the tube, the bending radius, and the bending angle are considered in the geometrical model, while the influence of the material properties of the tube is ignored. The advanced model modifies the previous model using material properties through a correction factor, which is determined purely empirically. The advanced model was compared with finite element (FE) simulation for different bending factors (Bf = bending radius / tube diameter), wall thickness factors (Wf = tube diameter / wall thickness), and material properties (strain hardening exponent). The finite element model of rotary draw bending was built in the PAM-TUBE program (version 2012). Results from the advanced model agree with both the FE simulation and the experimental tests.

Keywords: rotary draw bending, material properties, neutral axis shifting, wall thickness distribution

Procedia PDF Downloads 397
11077 Effects of Heat Treatment on the Elastic Constants of Cedar Wood

Authors: Tugba Yilmaz Aydin, Ergun Guntekin, Murat Aydin

Abstract:

The effects of heat treatment on the elastic constants of cedar wood (Cedrus libani) were investigated. Specimens were exposed to heat under atmospheric pressure at four temperatures (120, 150, 180, 210 °C) and three durations (2, 5, 8 hours). Three Young's moduli (EL, ER, ET) and six Poisson's ratios (μLR, μLT, μRL, μRT, μTL, μTR) were determined from compression tests using a bi-axial extensometer at constant moisture content (12%). Three shear moduli were determined using ultrasound: six shear wave velocities propagating along the principal axes of anisotropy were measured using an EPOCH 650 ultrasonic flaw detector with 1 MHz transverse transducers. The properties of the tested samples were affected by heat treatment to different degrees. Softer treatments yielded some increase in the Young's moduli and shear moduli, but longer times and higher temperatures resulted in significant decreases in both. Poisson's ratios appeared insensitive to heat treatment.
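The ultrasonic determination of a shear modulus rests on the standard relation G = ρv², where ρ is the density and v the shear wave velocity along the relevant pair of anisotropy axes. A one-line sketch (the numbers are illustrative, not the cedar data):

```python
def shear_modulus(density_kg_m3, shear_velocity_m_s):
    """G = rho * v^2; returns the shear modulus in Pa for SI inputs."""
    return density_kg_m3 * shear_velocity_m_s ** 2

# e.g. a density of 500 kg/m^3 and a shear wave at 1000 m/s give 0.5 GPa
g_pa = shear_modulus(500.0, 1000.0)
```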

Keywords: cedar wood, elastic constants, heat treatment, ultrasound

Procedia PDF Downloads 385
11076 Reconstruction of a Genome-Scale Metabolic Model to Simulate Uncoupled Growth of Zymomonas mobilis

Authors: Maryam Saeidi, Ehsan Motamedian, Seyed Abbas Shojaosadati

Abstract:

Zymomonas mobilis is a well-known example of the uncoupled growth phenomenon. This microorganism also has a unique metabolism that degrades glucose via the Entner–Doudoroff (ED) pathway. In this paper, a genome-scale metabolic model comprising 434 genes, 757 reactions, and 691 metabolites was reconstructed to simulate uncoupled growth and study its effect on flux distribution in central metabolism. The model correctly predicted that ATPase is activated at the experimental growth yields of Z. mobilis. The flux distribution obtained from the model indicates that the major carbon flux passes through the ED pathway, resulting in the production of ethanol. Small amounts of the carbon source enter the pentose phosphate pathway and the TCA cycle to produce biomass precursors. The predicted flux distribution was in good agreement with experimental data. The model also indicated that Z. mobilis metabolism can produce biomass with a maximum growth yield of 123.7 g (mol glucose)-1 if ATP synthase is coupled with growth and produces 82 mmol ATP gDCW-1h-1. Coupling growth and energy reduced ethanol secretion and changed the flux distribution toward producing biomass precursors.

Keywords: genome-scale metabolic model, Zymomonas mobilis, uncoupled growth, flux distribution, ATP dissipation

Procedia PDF Downloads 488
11075 Sensor Registration in Multi-Static Sonar Fusion Detection

Authors: Longxiang Guo, Haoyan Hao, Xueli Sheng, Hanjun Yu, Jingwei Yin

Abstract:

To prevent target splitting and ensure fusion accuracy, system error registration is an important step in a multi-static sonar fusion detection system. To eliminate the inherent system errors of each sonar, including distance error and angle error, this paper uses an offline estimation method for error registration. Suppose several sonars on different platforms work together to detect a target; the target position detected by each sonar is expressed in that sonar's own reference coordinate system. Based on the two-dimensional stereo projection method, this paper uses the real-time quality control (RTQC) method and the least squares (LS) method to estimate sensor biases. The RTQC method takes the average of each sonar's data as the observation value, while the LS method applies least squares processing to each sonar's data to obtain the observation value. MATLAB simulations in an underwater acoustic environment show that both algorithms can estimate the distance and angle errors of the sonar system. The performance of the two algorithms is compared via the root mean square error, and the influence of measurement noise on registration accuracy is explored by simulation. The RTQC method converges rapidly, but the distribution of targets seriously affects its performance. The LS method is unaffected by the target distribution, but increasing random noise slows its convergence. The LS method, an improvement on the RTQC method, is widely used in two-dimensional registration, and the improved method can be applied to underwater multi-target detection registration.
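For a constant-bias sensor model, the least-squares estimate of a sonar's distance and angle offsets against a common reference reduces to the mean residual per channel. A simplified sketch (the two-dimensional stereo projection step is omitted, and all names are hypothetical):

```python
def estimate_biases(measured, reference):
    """Least-squares fit of constant range/bearing biases under the model
    measured[i] = reference[i] + bias + noise; for a constant bias the
    LS solution is simply the mean residual in each channel."""
    n = len(measured)
    range_bias = sum(m[0] - r[0] for m, r in zip(measured, reference)) / n
    angle_bias = sum(m[1] - r[1] for m, r in zip(measured, reference)) / n
    return range_bias, angle_bias
```

Subtracting the estimated biases from each sonar's reports aligns the sensors before track fusion, which is what prevents a single target from splitting into several fused tracks.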

Keywords: data fusion, multi-static sonar detection, offline estimation, sensor registration problem

Procedia PDF Downloads 169
11074 Temperature Gradient In Weld Zones During Friction Stir Process Using Finite Element Method

Authors: Armansyah, I. P. Almanar, M. Saiful Bahari Shaari, M. Shamil Jaffarullah

Abstract:

A finite element approach has been used, via three-dimensional models in Altair HyperWorks, a commercially available software package, to describe heat gradients along the welding zones (axially and coronally) in friction stir welding (FSW). Transient thermal finite element analyses are performed on AA 6061-T6 aluminum alloy to obtain the temperature distribution in the welded aluminum plates during the welding operation. Heat inputs from the tool shoulder and tool pin are considered in the model. A moving heat source with a distribution simulating the heat generated by friction between the tool shoulder and the workpiece is used in the analysis. The developed model was then used to show the effect of input parameters such as welding speed and rotational speed on the temperature distribution in the workpiece.

Keywords: friction stir welding (FSW), temperature distribution, finite element method (FEM), Altair HyperWorks

Procedia PDF Downloads 548
11073 Distribution and Ecological Risk Assessment of Trace Elements in Sediments along the Ganges River Estuary, India

Authors: Priyanka Mondal, Santosh K. Sarkar

Abstract:

The present study investigated the spatiotemporal distribution and ecological risk of trace elements in surface sediments (top 0-5 cm; grain size ≤ 0.63 µm) in relation to sediment quality characteristics along the Ganges River Estuary, India. Sediment samples were collected during ebb tide from intertidal regions covering seven sampling sites under diverse environmental stresses. The elements were analyzed by ICP-AES. This positive, mixohaline, macro-tidal estuary has global significance, contributing ecological and economic services. The presence of fine clayey particles (47.03%) enhances the adsorption as well as the transportation of trace elements. There is a remarkable inter-metallic variation (mg kg-1 dry weight) in the distribution pattern, in the following order: Al (31801 ± 15943) > Fe (23337 ± 7584) > Mn (461 ± 147) > S (381 ± 235) > Zn (54 ± 18) > V (43 ± 14) > Cr (39 ± 15) > As (34 ± 15) > Cu (27 ± 11) > Ni (24 ± 9) > Se (17 ± 8) > Co (11 ± 3) > Mo (10 ± 2) > Hg (0.02 ± 0.01). An overall enrichment of the majority of trace elements was most pronounced at the site Lot 8, ~35 km upstream of the estuarine mouth. In contrast, the minimum concentrations were recorded at Gangasagar, the mouth of the estuary, a site with a high energy profile. The prevalent variations in trace element distribution are attributable to a set of cumulative factors such as hydrodynamic conditions, the sediment dispersion pattern, and textural variations, as well as non-homogeneous input of contaminants from point and non-point sources. To gain insight into the distribution, accumulation, and pollution status of the trace elements, the geoaccumulation index (Igeo) and enrichment factor (EF) were used. The Igeo indicated that surface sediments were moderately polluted with As (0.60) and Mo (1.30) and strongly contaminated with Se (4.0). The EF indicated severe pollution by Se (53.82) and significant pollution by As (4.05) and Mo (6.0), pointing to an influx of As, Mo, and Se into the sediments from anthropogenic sources (such as industrial and municipal sewage, atmospheric deposition, and agricultural run-off). The significant role of the megacity Calcutta, through untreated sewage discharge, atmospheric inputs, and other anthropogenic activities, is worth mentioning. The ecological risk posed by the trace elements was evaluated using the sediment quality guidelines effects range low (ERL) and effects range median (ERM). The concentrations of As, Cu, and Ni exceeded the ERL at 100%, 43%, and 86% of the sampling sites, respectively, while no element exceeded the ERM. The potential ecological risk index revealed that As would pose a relatively moderate risk to benthic organisms at 14.3% of the sampling sites. Multivariate analysis revealed the effective role of finer clay particles in trace element distribution. The authors strongly recommend regular monitoring, with emphasis on accurate appraisal of the potential risk of trace elements, for effective and sustainable management of this estuarine environment.
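The two indices used in the abstract above have standard definitions: the geoaccumulation index Igeo = log2(Cn / (1.5 Bn)), where Cn is the measured concentration and Bn the geochemical background, and the enrichment factor EF as the sample's element-to-reference ratio divided by the background ratio (a conservative element such as Al or Fe serves as the reference). A sketch with illustrative numbers, not the Ganges data:

```python
import math

def igeo(conc, background):
    """Geoaccumulation index: log2(Cn / (1.5 * Bn)); the factor 1.5
    compensates for natural background fluctuations."""
    return math.log2(conc / (1.5 * background))

def enrichment_factor(conc, ref, bg_conc, bg_ref):
    """EF = (Cn / Cref)_sample / (Cn / Cref)_background, with a
    conservative reference element such as Al or Fe."""
    return (conc / ref) / (bg_conc / bg_ref)

# A concentration twice 1.5x background gives Igeo = 1 (moderately polluted)
example_igeo = igeo(3.0, 1.0)
```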

Keywords: pollution assessment, sediment contamination, sediment quality, trace elements

Procedia PDF Downloads 257
11072 Spatial Distribution of Certified Mental Disabilities in China

Authors: Jiayue Yang

Abstract:

Based on an analysis of China's database of certified disabled persons in 2021, this study reveals several key findings. Firstly, the proportion of certified mentally disabled persons among China's certified disabled population (certification rate 1) decreases from east to west and from south to north. Secondly, the spatial distribution of the number of certified mentally disabled persons per 1,000 people (certification rate 2) is relatively scattered, with significant variation between cities in the eastern region. On the overall scale, however, a south-north gradient can still be observed, with higher rates in the north and lower rates in the west, while the central region shows higher rates than the western region. The variation in mental disability certification rates among regions is influenced not only by traditional culture and welfare levels but also exhibits a certain correlation with the level of economic development.

Keywords: certified disabled persons, mentally disabled persons, spatial distribution, China

Procedia PDF Downloads 107
11071 New Method for Determining the Distribution of Birefringence and Linear Dichroism in Polymer Materials Based on Polarization-Holographic Grating

Authors: Barbara Kilosanidze, George Kakauridze, Levan Nadareishvili, Yuri Mshvenieradze

Abstract:

A new method for determining the distribution of birefringence and linear dichroism in optical polymer materials is presented. The method is based on a polarization-holographic diffraction grating that forms an orthogonal circular basis during diffraction of a probing laser beam on the grating. The intensity ratio of the diffraction orders of this grating enables the birefringence and linear dichroism in the sample to be determined. The distribution of birefringence in the sample is determined by scanning with a circularly polarized beam at a wavelength far from the absorption band of the material. If the scanning is carried out with a probing beam at a wavelength near the maximum of the absorption band of the chromophore, the distribution of linear dichroism can be determined. An appropriate theoretical model of the method is presented. A laboratory setup was created for the proposed method, and its optical scheme is presented. Measurement results for polymer films with two-dimensional gradient distributions of birefringence and linear dichroism are discussed.

Keywords: birefringence, linear dichroism, graded oriented polymers, optical polymers, optical anisotropy, polarization-holographic grating

Procedia PDF Downloads 435
11070 Normalizing Scientometric Indicators of Individual Publications Using Local Cluster Detection Methods on Citation Networks

Authors: Levente Varga, Dávid Deritei, Mária Ercsey-Ravasz, Răzvan Florian, Zsolt I. Lázár, István Papp, Ferenc Járai-Szabó

Abstract:

One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for caution. Here we present a new tool that relies on the structural properties of citation networks to perform cross-field normalization of scientometric indicators of individual publications. Due to the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested on different benchmark and real-world networks. Then, using this algorithm, the normalization process is demonstrated for a few indicators, such as the citation count, the P-index, and a local version of the PageRank indicator. The fat-tailed distribution of the article indicators enables the normalization process to be performed successfully.
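Once publications have been assigned to clusters (scientific domains), the normalization step itself is simple: divide each article's indicator by the mean of its cluster, so values become comparable across fields. A sketch in which the local community detection step, the hard part of the work, is assumed given; the names are hypothetical:

```python
from collections import defaultdict

def normalize_by_cluster(indicator, cluster_of):
    """indicator: {paper: value, ...}; cluster_of: {paper: cluster id}.
    Returns each paper's indicator divided by its cluster mean, so a
    value of 1.0 means 'average for its field'."""
    totals = defaultdict(float)
    sizes = defaultdict(int)
    for paper, value in indicator.items():
        totals[cluster_of[paper]] += value
        sizes[cluster_of[paper]] += 1
    return {paper: value / (totals[cluster_of[paper]] / sizes[cluster_of[paper]])
            for paper, value in indicator.items()}
```

The same division can be applied to citation counts, P-index values, or local PageRank scores once the clusters are fixed.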

Keywords: citation networks, cross-field normalization, local cluster detection, scientometric indicators

Procedia PDF Downloads 205
11069 Integrating Process Planning, WMS Dispatching, and WPPW Weighted Due Date Assignment Using a Genetic Algorithm

Authors: Halil Ibrahim Demir, Tarık Cakar, Ibrahim Cil, Muharrem Dugenci, Caner Erden

Abstract:

Conventionally, process planning, scheduling, and due-date assignment functions are performed separately and sequentially. The interdependence of these functions requires integration. Although integrated process planning and scheduling, and scheduling with due date assignment problems are popular research topics, only a few works address the integration of these three functions. This work focuses on the integration of process planning, WMS scheduling, and WPPW due date assignment. Another novelty of this work is the use of a weighted due date assignment. In the literature, due dates are generally assigned without considering the importance of customers. However, in this study, more important customers get closer due dates. Typically, only tardiness is punished, but the JIT philosophy punishes both earliness and tardiness. In this study, all weighted earliness, tardiness, and due date related costs are penalized. As no customer desires distant due dates, such distant due dates should be penalized. In this study, various levels of integration of these three functions are tested and genetic search and random search are compared both with each other and with ordinary solutions. Higher integration levels are superior, while search is always useful. Genetic searches outperformed random searches.
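The weighted penalty structure described above can be sketched as a single objective in which each job's customer weight scales its earliness, tardiness, and due-date terms, so important customers receive closer due dates. The coefficients and names below are illustrative, not the authors' exact WPPW formulation:

```python
def weighted_cost(jobs, pe=1.0, pt=1.0, pd=0.1):
    """jobs: (customer_weight, completion_time, due_date) triples.
    Penalizes weighted earliness, weighted tardiness, and the due date
    itself, so distant due dates are discouraged (a JIT-style objective)."""
    total = 0.0
    for w, c, d in jobs:
        total += w * pe * max(0.0, d - c)   # earliness penalty
        total += w * pt * max(0.0, c - d)   # tardiness penalty
        total += w * pd * d                 # distant due dates penalized
    return total
```

A genetic or random search over process plans, dispatching rules, and due-date assignments would evaluate candidate solutions with a fitness function of this shape.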

Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic algorithm, random search

Procedia PDF Downloads 394
11068 Fusion Models for Cyber Threat Defense: Integrating Clustering, Random Forests, and Support Vector Machines to Defend Against Windows Malware

Authors: Azita Ramezani, Atousa Ramezani

Abstract:

In the ever-escalating landscape of Windows malware, the necessity for pioneering defense strategies becomes undeniable. This study introduces an avant-garde approach fusing the capabilities of clustering, random forests, and support vector machines (SVM) to combat the intricate web of cyber threats. Our fusion model triumphs with a staggering accuracy of 98.67% and an equally formidable F1 score of 98.68%, a testament to its effectiveness in the realm of Windows malware defense. By deciphering the intricate patterns within malicious code, our model not only raises the bar for detection precision but also redefines the paradigm of cybersecurity preparedness. This breakthrough underscores the potential embedded in the fusion of diverse analytical methodologies and signals a paradigm shift in fortifying against the relentless evolution of malicious Windows threats. As we traverse the dynamic cybersecurity terrain, this research serves as a beacon illuminating the path toward a resilient future where innovative fusion models stand at the forefront of cyber threat defense.

Keywords: fusion models, cyber threat defense, windows malware, clustering, random forests, support vector machines (SVM), accuracy, f1-score, cybersecurity, malicious code detection

Procedia PDF Downloads 72
11067 Study on Concentration and Temperature Measurement with 760 nm Diode Laser in Combustion System Using Tunable Diode Laser Absorption Spectroscopy

Authors: Miyeon Yoo, Sewon Kim, Changyeop Lee

Abstract:

It is important to measure the internal temperature or temperature distribution precisely in combustion systems to increase energy efficiency and reduce pollutants. In large combustion systems in particular, such as power plant boilers and the reheating furnaces of steel-making processes, it is very difficult to measure these physical properties in detail. Tunable diode laser absorption spectroscopy (TDLAS) measurement and analysis is an attractive method for overcoming this difficulty. In this paper, TDLAS methods are used to measure the oxygen concentration and temperature distribution under various experimental conditions.
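TDLAS concentration measurements rest on the Beer-Lambert law, I/I0 = exp(-αCL): scanning the diode laser across an absorption line and fitting the measured transmittance recovers the concentration (and, from the ratio of line strengths at two lines, the temperature). A sketch of the basic relation with illustrative values:

```python
import math

def transmittance(alpha, conc, path_m):
    """Beer-Lambert law: I/I0 = exp(-alpha * C * L)."""
    return math.exp(-alpha * conc * path_m)

def concentration(i_over_i0, alpha, path_m):
    """Invert Beer-Lambert to recover concentration from transmittance."""
    return -math.log(i_over_i0) / (alpha * path_m)
```

Inverting the measured transmittance at a known path length and absorption coefficient returns the gas concentration along the line of sight.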

Keywords: tunable diode laser absorption spectroscopy, temperature distribution, gas concentration

Procedia PDF Downloads 387
11066 Analysis of Exponential Distribution under Step Stress Partially Accelerated Life Testing Plan Using Adaptive Type-I Hybrid Progressive Censoring Schemes with Competing Risks Data

Authors: Ahmadur Rahman, Showkat Ahmad Lone, Ariful Islam

Abstract:

In this article, we estimate the parameters of the failure-time distribution of units based on adaptive type-I progressive hybrid censoring under step-stress partially accelerated life tests with competing risks. The failure times of the units are assumed to follow an exponential distribution. The maximum likelihood estimation technique is used to estimate the unknown parameters of the distribution and the tampered coefficient. Confidence intervals are also obtained for the parameters. A simulation study is performed using the Monte Carlo simulation method to check the validity of the model and its assumptions.
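For the exponential baseline, the MLE of the failure rate under any right-censoring scheme has a familiar closed form: the number of observed failures divided by the total time on test. The tampered coefficient and competing-risk structure of the paper are omitted from this sketch, and the names are illustrative:

```python
def exp_rate_mle(times, failed):
    """times: observed or censored durations; failed: 1 if the unit
    failed, 0 if it was censored. The exponential-rate MLE is
    (number of failures) / (total time on test)."""
    failures = sum(failed)
    total_time = sum(times)
    return failures / total_time
```

For instance, two failures at 2 and 3 hours plus one unit censored at 5 hours give a rate estimate of 2/10 = 0.2 per hour.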

Keywords: adaptive type-I hybrid progressive censoring, competing risks, exponential distribution, simulation, step-stress partially accelerated life tests

Procedia PDF Downloads 344
11065 Production of Low-Density Nanocellular Foam Based on PMMA/PEBAX Blends

Authors: Nigus Maregu Demewoz, Shu-Kai Yeh

Abstract:

Low-density nanocellular foam is a fascinating new-generation advanced material due to its mechanical strength and thermal insulation properties. In nanocellular foam, reducing the density increases the insulation ability. However, producing a nanocellular foam with a relative density below 0.3 and a cell size below 100 nm is very challenging. In this study, poly(methyl methacrylate) (PMMA) was blended with polyether block amide (PEBAX) to study the effects of PEBAX on the nanocellular foam structure of the PMMA matrix. We added 2 wt% PEBAX to the PMMA matrix, and PEBAX nanostructured domains of 45 nm were well dispersed in it. Foaming produced a distinctive new-generation bouquet-like nanocellular foam with a cell size below 50 nm and a relative density of 0.24. We were also able to produce a nanocellular foam with a relative density of about 0.17. In addition to thermal insulation applications, bouquet-like nanocellular foam may be suitable for filtration applications.

Keywords: nanocellular foam, low-density, cell size, relative density, PMMA/PEBAX

Procedia PDF Downloads 80
11064 Formulation and Invivo Evaluation of Salmeterol Xinafoate Loaded MDI for Asthma Using Response Surface Methodology

Authors: Paresh Patel, Priya Patel, Vaidehi Sorathiya, Navin Sheth

Abstract:

The aim of the present work was to fabricate a Salmeterol Xinafoate (SX) metered dose inhaler (MDI) for asthma and to evaluate SX-loaded solid lipid nanoparticles (SLNs) for pulmonary delivery. Solid lipid nanoparticles can be used to deliver particles to the lungs via an MDI. A modified solvent emulsification diffusion technique was used to prepare the SX-loaded solid lipid nanoparticles, using Compritol 888 ATO as the lipid, Tween 80 as the surfactant, D-mannitol as the cryoprotectant, and L-leucine to improve aerosolization behaviour. A Box-Behnken design with 17 runs was applied. 3-D response surface plots and contour plots were drawn, and the optimized formulation was selected on the basis of minimum particle size and maximum entrapment efficiency (% EE). The formulation was also characterized by % yield, in vitro diffusion study, scanning electron microscopy, X-ray diffraction, DSC, and FTIR. Particle size and zeta potential were analyzed with a Zetatrac particle size analyzer, and the aerodynamic properties were determined with a cascade impactor. The preconvulsion time was examined for the control and treatment groups and compared with a marketed product group. The MDI was evaluated by leakage test, flammability test, spray test, and content per puff. By experimental design, the particle size and % EE were found to range from 119 to 337 nm and from 62.04% to 76.77%, respectively, for the solvent emulsification diffusion technique. Morphologically, the particles had a spherical shape and uniform distribution. The DSC and FTIR studies showed no interaction between the drug and the excipients. The zeta potential indicated good stability of the SLNs. The respirable fraction was found to be 52.78%, indicating delivery to the deep parts of the lung such as the alveoli. The animal study showed that the fabricated MDI protected the lungs against histamine-induced bronchospasm in guinea pigs. The MDI showed spherical particles in the spray pattern, 96.34% content per puff, and was non-flammable. The SLNs prepared by the solvent emulsification diffusion technique provided a desirable size for deposition in the alveoli. This delivery platform opens up a wide range of treatment applications for pulmonary diseases such as asthma via solid lipid nanoparticles.
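For reference, the 17-run figure corresponds to the standard three-factor Box-Behnken layout: 12 edge midpoints plus 5 center replicates. A minimal sketch in coded units, assuming three factors (the abstract does not name which three process variables were coded):

```python
import itertools
import numpy as np

def box_behnken_3factor(n_center=5):
    """Generate a 3-factor Box-Behnken design in coded units (-1, 0, +1).

    Each pair of factors is varied at +/-1 while the third is held at 0,
    giving 12 edge-midpoint runs; n_center replicates of the center point
    are appended (12 + 5 = 17 runs, as in the study).
    """
    runs = []
    for i, j in itertools.combinations(range(3), 2):   # the 3 factor pairs
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0, 0, 0]] * n_center)                # center replicates
    return np.array(runs, dtype=float)

design = box_behnken_3factor()
print(design.shape)  # (17, 3)
```

Each row is one formulation run; the coded levels would be mapped back to the actual factor ranges chosen in the study.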

Keywords: salmeterol xinafoate, solid lipid nanoparticles, box-behnken design, solvent emulsification diffusion technique, pulmonary delivery

Procedia PDF Downloads 451
11063 Distribution-Free Exponentially Weighted Moving Average Control Charts for Monitoring Process Variability

Authors: Chen-Fang Tsai, Shin-Li Lu

Abstract:

Distribution-free control charts have become an active research area within statistical process control in recent years. Several researchers have developed nonparametric control charts and investigated their detection capability. The major advantage of nonparametric control charts is that the underlying process is not assumed to follow a normal or any other parametric distribution. In this paper, two nonparametric exponentially weighted moving average (EWMA) control charts based on nonparametric tests, namely the NE-S and NE-M control charts, are proposed for monitoring process variability. They are further extended to generally weighted moving average (GWMA) control charts by utilizing design and adjustment parameters for monitoring changes in process variability, namely the NG-S and NG-M control charts. The statistical performance of the NG-S and NG-M control charts with run rules is also investigated. Moreover, a sensitivity analysis is performed to show the effects of the design parameters on the nonparametric NG-S and NG-M control charts.
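The charts above all build on the same EWMA recursion with time-varying control limits; a generic sketch follows, with the plotting statistic left abstract (the specific rank-based NE-S/NE-M statistics are not detailed in the abstract, so any subgroup statistic with known in-control mean and standard deviation can be substituted):

```python
import numpy as np

def ewma_chart(stats, target, sigma, lam=0.1, L=2.7):
    """EWMA recursion with exact time-varying control limits.

    stats  : sequence of plotting statistics, one per subgroup
             (here a generic placeholder for a nonparametric
             dispersion statistic)
    target : in-control mean of the statistic
    sigma  : in-control standard deviation of the statistic
    lam, L : smoothing constant and limit width (design parameters)
    Returns (z, ucl, lcl, signal) where signal is the index of the
    first out-of-control subgroup, or -1 if none.
    """
    z = np.empty(len(stats))
    ucl = np.empty(len(stats))
    lcl = np.empty(len(stats))
    prev = target
    for t, x in enumerate(stats):
        prev = lam * x + (1 - lam) * prev          # EWMA recursion
        width = L * sigma * np.sqrt(lam / (2 - lam)
                                    * (1 - (1 - lam) ** (2 * (t + 1))))
        z[t], ucl[t], lcl[t] = prev, target + width, target - width
    out = np.where((z > ucl) | (z < lcl))[0]
    return z, ucl, lcl, (out[0] if out.size else -1)
```

GWMA charts generalize this by replacing the geometric weights `lam*(1-lam)**j` with weights controlled by the extra design and adjustment parameters.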

Keywords: distribution-free control chart, EWMA control charts, GWMA control charts

Procedia PDF Downloads 275
11062 Determinants of Profit Efficiency among Poultry Egg Farmers in Ondo State, Nigeria: A Stochastic Profit Function Approach

Authors: Olufunke Olufunmilayo Ilemobayo, Barakat O. Abdulazeez

Abstract:

Profit making among poultry egg farmers has been a challenge to the efficient allocation of scarce farm resources over the years, due mainly to a low capital base, inefficient management, and technical and economic inefficiency; consequently, poultry egg production has slipped into an underperforming situation characterised by low profit margins. Previous studies have focused mainly on broiler production and its efficiency, and there is a paucity of information on profit efficiency in the study area. Hence, the determinants of profit efficiency among poultry egg farmers in Ondo State, Nigeria were investigated. A purposive sampling technique was used to obtain primary data from poultry egg farmers in the Owo and Akure local government areas of Ondo State through a well-structured questionnaire. Socio-economic characteristics such as age, gender, educational level, marital status, household size, access to credit and extension contact, together with input and output data such as flock size, cost of feeders and drinkers, cost of feed, cost of labour, cost of drugs and medication, cost of energy, price of a crate of table eggs and price of spent layers, were the variables used in the study. Data were analysed using descriptive statistics, budgeting analysis, and a stochastic profit function/inefficiency model. The descriptive statistics show that 52 per cent of the poultry farmers were aged 31-40 years, 62 per cent were male, 90 per cent had tertiary education, 66 per cent were primarily poultry farmers, 78 per cent were original poultry farm owners, and 55 per cent had more than 5 years' work experience. The cost-and-returns analysis indicated that 64 per cent of returns came from sales of eggs, while the remaining 36 per cent came from sales of spent layers. Feed took the highest proportion of production costs (69 per cent) and medication the lowest (7 per cent). A positive gross margin of ₦5,518,869.76, a net farm income of ₦5,500,446.82 and a net return on investment of 0.28 indicated that poultry egg production is profitable. Equipment cost (22.757), feed cost (18.3437), labour cost (136.698), flock size (16.209), and drug and medication cost (4.509) were factors affecting profit, while education (-2.3143), household size (-18.4291), access to credit (-16.027), and experience (-7.277) were determinants of profit efficiency. Education, household size, access to credit and experience in poultry production were the main determinants of profit efficiency of poultry egg production in Ondo State. Other factors affecting profit efficiency were the cost of feed, cost of labour, flock size, and cost of drugs and medication; these positively and significantly influenced profit efficiency in Ondo State, Nigeria.
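The budgeting identities behind the gross margin, net farm income, and net return on investment figures can be sketched as follows (the item names and amounts below are hypothetical placeholders, not the study's data):

```python
def farm_budget(revenue, variable_costs, fixed_costs):
    """Standard farm budgeting identities.

    gross margin             = total revenue - total variable costs
    net farm income          = gross margin - total fixed costs
    net return on investment = net farm income / total costs
    Inputs are dicts of item -> amount in currency units.
    """
    tr = sum(revenue.values())
    tvc = sum(variable_costs.values())
    tfc = sum(fixed_costs.values())
    gm = tr - tvc
    nfi = gm - tfc
    roi = nfi / (tvc + tfc)
    return gm, nfi, roi

# Hypothetical illustrative figures only:
gm, nfi, roi = farm_budget(
    revenue={"eggs": 640_000, "spent_layers": 360_000},
    variable_costs={"feed": 400_000, "labour": 100_000, "drugs": 40_000},
    fixed_costs={"depreciation": 60_000},
)
print(gm, nfi, round(roi, 3))  # 460000 400000 0.667
```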

Keywords: cost and returns, economic inefficiency, profit margin, technical inefficiency

Procedia PDF Downloads 130
11061 Production of Low-Density Nanocellular Foam Based on PMMA/PEBAX Blends

Authors: Nigus Maregu Demewoz, Shu-Kai Yeh

Abstract:

Low-density nanocellular foam is a fascinating new-generation advanced material owing to its mechanical strength and thermal insulation properties. In nanocellular foam, reducing the density increases the insulation ability. However, producing a nanocellular foam with a relative density below 0.3 and a cell size below 100 nm is very challenging. In this study, poly(methyl methacrylate) (PMMA) was blended with polyether block amide (PEBAX) to study the effects of PEBAX on the nanocellular foam structure of the PMMA matrix. We added 2 wt% of PEBAX to the PMMA matrix, and PEBAX nanostructured domains with a size of 45 nm were well dispersed in it. Foaming produced a new-generation, bouquet-like nanocellular foam with a cell size of less than 50 nm and a relative density of 0.24; we were also able to produce a nanocellular foam with a relative density of about 0.17. In addition to thermal insulation applications, bouquet-like nanocellular foam may be expected to find use in filtration.

Keywords: nanocellular foam, low-density, cell size, relative density, PMMA/PEBAX blend

Procedia PDF Downloads 95
11060 Extreme Value Modelling of Ghana Stock Exchange Indices

Authors: Kwabena Asare, Ezekiel N. N. Nortey, Felix O. Mettle

Abstract:

Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. After the recent global financial crises, however, appropriate models for the rare events leading to such crises have become essential in finance and risk management. This paper models the extreme values of the Ghana Stock Exchange All-Share index (2000-2010) by applying Extreme Value Theory (EVT) to fit a model to the tails of the daily stock returns. A conditional approach of the EVT was preferred, and hence an ARMA-GARCH model was fitted to the data to correct for autocorrelation and conditional heteroscedasticity in the returns series before the EVT method was applied. The Peak Over Threshold (POT) approach, which fits a Generalized Pareto Distribution (GPD) to excesses above a selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained, and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the excess data. The sizes of extreme daily Ghanaian stock market movements were then computed using the Value at Risk (VaR) and Expected Shortfall (ES) risk measures at high quantiles, based on the fitted GPD model.
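A minimal POT sketch of the VaR/ES computation described above, using `scipy.stats.genpareto` (the threshold and data below are illustrative; the paper additionally pre-filters the returns with an ARMA-GARCH model, which is omitted here):

```python
import numpy as np
from scipy.stats import genpareto

def pot_var_es(losses, u, q=0.99):
    """Peak-over-threshold VaR and ES from a fitted GPD.

    losses : 1-D array of (positive) loss observations
    u      : threshold; excesses (losses - u for losses > u) are fitted
             by a GPD with location fixed at 0
    q      : quantile of interest, e.g. 0.99
    Uses the standard POT formulas
      VaR_q = u + (beta/xi) * (((1-q)/zeta)^(-xi) - 1)
      ES_q  = (VaR_q + beta - xi*u) / (1 - xi),  valid for xi < 1,
    where zeta = N_u / n is the fraction of exceedances.
    """
    losses = np.asarray(losses, dtype=float)
    exc = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0)    # shape, loc, scale (MLE)
    zeta = exc.size / losses.size               # tail fraction N_u / n
    var_q = u + beta / xi * (((1 - q) / zeta) ** (-xi) - 1)
    es_q = (var_q + beta - xi * u) / (1 - xi)
    return var_q, es_q

rng = np.random.default_rng(0)
losses = rng.standard_exponential(5000)         # stand-in for loss returns
v, e = pot_var_es(losses, u=2.0, q=0.99)
```

For exponential data the true 99% quantile is ln(100) ≈ 4.6, so the estimate can be sanity-checked against that.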

Keywords: extreme value theory, expected shortfall, generalized pareto distribution, peak over threshold, value at risk

Procedia PDF Downloads 559
11059 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble

Authors: Jaehong Yu, Seoung Bum Kim

Abstract:

Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. Unsupervised feature selection can be categorized into feature subset selection and feature ranking methods; we focused on unsupervised feature ranking methods, which evaluate features based on importance scores. Recently, several unsupervised feature ranking methods have been developed based on ensemble approaches to achieve higher accuracy and stability. However, most ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate feature importance depending on the ensemble clustering solution, and they produce undesirable results if the clustering solutions are inaccurate. To address these limitations, we propose an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined into ensemble importance scores. Moreover, FRRM does not require the true number of clusters to be determined in advance, thanks to the multiple-k ensemble idea. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods. The experimental results demonstrate that the proposed FRRM outperformed its competitors.
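The random-subspace, multiple-k idea can be sketched as below. The per-feature score used here (between-cluster variance share) is a stand-in chosen for illustration, not the paper's exact importance measure:

```python
import numpy as np
from sklearn.cluster import KMeans

def subspace_feature_ranking(X, n_subspaces=20, subspace_size=5,
                             k_values=(2, 3, 4), seed=0):
    """Random-subspace, multiple-k feature ranking (illustrative sketch).

    For each random feature subspace and each candidate k, cluster the
    projected data and credit every participating feature with the
    between-cluster variance share along that feature. Averaging over
    subspaces and over several k values means no single 'true' number
    of clusters has to be supplied.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    scores = np.zeros(p)
    counts = np.zeros(p)
    for _ in range(n_subspaces):
        feats = rng.choice(p, size=min(subspace_size, p), replace=False)
        for k in k_values:
            labels = KMeans(n_clusters=k, n_init=10,
                            random_state=seed).fit_predict(X[:, feats])
            for f in feats:
                col = X[:, f]
                total = col.var()
                if total == 0:
                    continue
                within = sum(col[labels == c].var() * (labels == c).mean()
                             for c in range(k))
                scores[f] += 1 - within / total   # between-cluster share
                counts[f] += 1
    return scores / np.maximum(counts, 1)         # mean importance score
```

Features that drive the cluster structure accumulate high average scores; pure-noise features do not, regardless of which k happens to be right.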

Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking

Procedia PDF Downloads 339
11058 A New Method Presentation for Locating Fault in Power Distribution Feeders Considering DG

Authors: Rahman Dashti, Ehsan Gord

Abstract:

In this paper, an improved impedance-based fault location method is proposed. In this method, online fault location is performed using voltage and current information at the beginning of the feeder. Determining the precise fault location in a short time increases the reliability and efficiency of the system. The proposed method utilizes the fundamental component of the voltage and current at the beginning of the feeder, together with information from the distributed generation unit (DGU), in order to locate different faults precisely and in acceptable time. To evaluate the precision and accuracy of the proposed method, a 13-node feeder is simulated and tested using MATLAB.
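For context, the classic single-ended baseline that impedance-based methods improve upon estimates the fault distance from the reactive part of the apparent impedance at the feeder head. A sketch (the paper's method additionally incorporates DGU measurements, which this baseline ignores; all values below are illustrative):

```python
def simple_reactance_distance(v_head, i_head, x_per_km):
    """Classic simple-reactance fault location from feeder-head phasors.

    v_head, i_head : fundamental-frequency voltage/current phasors at
                     the beginning of the feeder (complex, V and A)
    x_per_km       : positive-sequence line reactance (ohm/km)
    Returns the estimated fault distance in km. The imaginary part of
    V/I isolates the distance because a (resistive) fault resistance
    only perturbs the real part when the fault current comes mainly
    from the feeder head.
    """
    z_apparent = v_head / i_head
    return z_apparent.imag / x_per_km

# A fault through resistance R_f at distance d sees Z = d*(r + jx) + R_f:
d_true, r, x, r_f = 4.0, 0.2, 0.35, 1.0   # km, ohm/km, ohm/km, ohm
i = 100 + 0j
v = i * complex(d_true * r + r_f, d_true * x)
print(simple_reactance_distance(v, i, x))  # 4.0
```

With in-feed from a DGU, the head current no longer equals the fault current and this estimate becomes biased, which is precisely what methods of the proposed kind correct for.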

Keywords: distribution network, fault section determination, distributed generation units, distribution protection equipment

Procedia PDF Downloads 404
11057 Spatial Distribution of Heavy Metals in Khark Island-Iran Using Geographic Information System

Authors: Abbas Hani, Maryam Jassasizadeh

Abstract:

The concentrations of Cd, Pb, and Ni were determined in 40 soil samples collected from the surface soils of Khark Island. Geostatistical methods and GIS were used to identify heavy metal sources and their spatial patterns. Principal component analysis, coupled with the correlations between the heavy metals, showed that the levels of these heavy metals were lower than the standard levels. The soil analysis data were then tested for normality of distribution. The best interpolation method for cadmium and nickel was ordinary kriging, while the best interpolation method for lead was inverse distance weighting. The results of this study help us to understand the heavy metal distribution and to make decisions on the remediation of soil pollution.
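Of the two interpolators compared, inverse distance weighting is simple enough to sketch directly (the coordinates and power parameter below are generic; the study's GIS settings are not given in the abstract):

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse distance weighted (IDW) interpolation.

    xy_known : (n, 2) sample coordinates; values: (n,) concentrations
    xy_query : (m, 2) prediction locations; power: distance exponent
    Returns (m,) interpolated values; exact at sampled locations.
    """
    xy_known = np.asarray(xy_known, float)
    xy_query = np.asarray(xy_query, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    out = np.empty(len(xy_query))
    for i, row in enumerate(d):
        hit = row < 1e-12
        if hit.any():                       # query coincides with a sample
            out[i] = values[hit][0]
        else:
            w = 1.0 / row ** power          # inverse-distance weights
            out[i] = (w @ values) / w.sum()
    return out

# Midpoint of two samples gets their mean; a sampled point is exact:
print(idw([[0, 0], [1, 0]], [0.0, 10.0], [[0.5, 0], [0, 0]]))  # [5. 0.]
```

Unlike ordinary kriging, IDW uses no variogram model, which is why the two methods can rank differently per element.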

Keywords: geostatistics, ordinary kriging, heavy metals, GIS, Khark

Procedia PDF Downloads 168
11056 Influence of Glass Plates Different Boundary Conditions on Human Impact Resistance

Authors: Alberto Sanchidrián, José A. Parra, Jesús Alonso, Julián Pecharromán, Antonia Pacios, Consuelo Huerta

Abstract:

Glass is a commonly used material in building; there is no unique design solution, as plates with different numbers of layers and interlayers may be used. In most façades, security glazing has to be used according to its performance in the impact pendulum test. The European Standard EN 12600 establishes an impact test procedure for classifying flat plates of different thicknesses from the point of view of human safety, using a pendulum of two tires and 50 kg mass that impacts the plate from different heights. However, this test does not replicate the actual dimensions and boundary conditions used in building configurations, and so the real stress distribution is not determined by it. The influence of different boundary conditions, such as those employed on construction sites, is not well taken into account when testing the behaviour of safety glazing, and there is no detailed procedure and criteria to determine the glass resistance against human impact. To reproduce the actual boundary conditions on site, when needed, the pendulum test is arranged to be used 'in situ', with no load or stiffness control and without a standard procedure. The fracture stress of small and large glass plates fits a Weibull distribution with quite a large dispersion, so conservative values are adopted for the admissible fracture stress under static loads. In fact, tests performed for human impact give a fracture strength two or three times higher, often without total fracture of the glass plate. Newer standards, such as DIN 18008-4, state an admissible fracture stress 2.5 times higher than the one used for static and wind loads. Two working areas are now open: a) to define a standard for the 'in situ' test; b) to prepare a laboratory procedure that allows testing with a more realistic stress distribution.
To work on both research lines, a laboratory that allows testing medium-size specimens with different boundary conditions has been developed. A special steel frame allows reproducing the stiffness of the glass support substructure, including a rigid condition used as a reference. The dynamic behaviour of the glass plate and its support substructure has been characterized with finite element models updated with modal test results. In addition, a new portable impact machine is used to obtain sufficient force and direction control during the impact test. An impact energy of 100 J is used. To avoid problems with broken glass plates, the tests have been done using an aluminium plate of 1000 mm x 700 mm and 10 mm thickness supported on four sides; three different substructure stiffness conditions are used. Detailed control of the dynamic stiffness and the behaviour of the plate is achieved with modal tests. The repeatability of the test and the reproducibility of the results prove that a procedure to control both the stiffness of the plate and the impact level is necessary.
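The Weibull treatment of fracture stress mentioned above can be sketched as follows: fit a two-parameter Weibull to fracture-stress data and read off a conservative low-probability stress (the 5% failure probability below is an illustrative choice, not a value prescribed by the standards):

```python
import numpy as np
from scipy.stats import weibull_min

def weibull_design_stress(fracture_stresses, p_fail=0.05):
    """Fit a two-parameter Weibull (location fixed at 0) to
    fracture-stress data and return the stress at failure
    probability p_fail, via

        sigma_p = sigma_0 * (-ln(1 - p)) ** (1 / m)

    where m is the Weibull modulus and sigma_0 the characteristic
    strength. Returns (sigma_p, m, sigma_0).
    """
    m, _, sigma0 = weibull_min.fit(fracture_stresses, floc=0)
    sigma_p = sigma0 * (-np.log(1.0 - p_fail)) ** (1.0 / m)
    return sigma_p, m, sigma0

# Synthetic data with modulus 5 and characteristic strength 60 MPa:
data = weibull_min.rvs(5, scale=60, size=500, random_state=2)
design, m_hat, s0_hat = weibull_design_stress(data)
```

The large dispersion the abstract mentions corresponds to a low modulus m, which drags the low-probability design stress far below the mean strength.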

Keywords: glass plates, human impact test, modal test, plate boundary conditions

Procedia PDF Downloads 308
11055 Molecular Dynamics Simulation of Irradiation-Induced Damage Cascades in Graphite

Authors: Rong Li, Brian D. Wirth, Bing Liu

Abstract:

Graphite, the matrix and structural material in high-temperature gas-cooled reactors, exhibits a pronounced irradiation response. It is of significant importance to analyze defect production and evaluate the behaviour of graphite under irradiation. A vast experimental literature exists for graphite on dimensional change, mechanical properties, and thermal behavior; however, remarkably few molecular dynamics simulations have been performed to study the irradiation response of graphite from an atomistic perspective. In this paper, irradiation-induced damage cascades in graphite were investigated with molecular dynamics simulation. Statistical results on graphite defects were obtained by sampling a wide energy range (1-30 keV), with 10 runs per cascade simulation using different random number generator seeds for the velocity-scaling thermostat. The chemical bonding in carbon was described using the adaptive intermolecular reactive empirical bond-order (AIREBO) potential coupled with the standard Ziegler-Biersack-Littmark (ZBL) potential for close-range pair interactions. This study focused on analyzing the number of defects, the final cascade morphology and the distribution of defect clusters in space, length-scale cascade properties such as the cascade length and the range of the primary knock-on atom (PKA), and the variation of the graphite mechanical properties. It can be concluded that the number of surviving Frenkel pairs increases remarkably with increasing initial PKA energy, while no thermal spike was observed at the slightly lower energies considered here. The PKA range and cascade length scale approximately linearly with energy, which indicates that increasing the initial PKA energy comes at a high computational cost, e.g., for 30 keV in this study.
The cascade morphology and the distribution of defect clusters in space are mainly related to the PKA energy, while the temperature effect is relatively negligible. The simulations agree with known experimental results and with the Kinchin-Pease model, which helps in understanding graphite damage cascades and lifetime under irradiation and provides direction for the design of such structural materials in future reactors.
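For comparison with MD defect counts, the Kinchin-Pease estimate referenced above (in its modified NRT form) is a one-line formula. A sketch, assuming a displacement threshold of 30 eV for graphite (an assumed, direction-averaged value; reported thresholds vary with lattice direction):

```python
def nrt_displacements(damage_energy_ev, e_d_ev=30.0):
    """NRT (modified Kinchin-Pease) displacement estimate.

    N_d = 0.8 * T_dam / (2 * E_d)  for T_dam > 2.5 * E_d,
    with T_dam the damage energy available for elastic collisions and
    E_d the displacement threshold energy (assumed 30 eV here).
    Below E_d no atom is displaced; between E_d and 2.5*E_d exactly
    one displacement is counted.
    """
    if damage_energy_ev < e_d_ev:
        return 0.0
    if damage_energy_ev < 2.5 * e_d_ev:
        return 1.0
    return 0.8 * damage_energy_ev / (2.0 * e_d_ev)

# e.g. a 10 keV damage-energy cascade:
print(nrt_displacements(10_000))  # 133.33...
```

MD cascade simulations typically report fewer surviving Frenkel pairs than this estimate because of in-cascade recombination, which is one reason such comparisons are informative.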

Keywords: graphite damage cascade, molecular dynamics, cascade morphology, cascade distribution

Procedia PDF Downloads 155