Search results for: Key distribution
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1890

1230 A Novel Neighborhood Defined Feature Selection on Phase Congruency Images for Recognition of Faces with Extreme Variations

Authors: Satyanadh Gundimada, Vijayan K Asari

Abstract:

A novel feature selection strategy is proposed in this paper to improve recognition accuracy on faces affected by non-uniform illumination, partial occlusions and varying expressions. The technique is especially applicable in scenarios where the possibility of obtaining a reliable intra-class probability distribution is minimal due to the small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase. These features are invariant to the brightness and contrast of the image under consideration, a property that makes lighting-invariant face recognition achievable. Phase congruency maps of the training samples are generated and a novel modular feature selection strategy is implemented. Smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features. These features are arranged in order of increasing distance between the sub-regions involved in the merging. The assumption behind the proposed region merging and arrangement strategy is that local dependencies among the pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function, namely the ratio of the between-class variance to the within-class variance of the sample set in the PCA domain. The results indicate a substantial improvement in classification performance compared to baseline algorithms.
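
As an illustration of the ranking step described above, the following minimal Python sketch (not the authors' code; the function name and toy data shapes are assumptions) computes the between-class to within-class variance ratio of a candidate feature set after projection to the PCA domain.

import numpy as np
from sklearn.decomposition import PCA

def fisher_ratio(features, labels):
    """Between-class variance divided by pooled within-class variance."""
    overall_mean = features.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(labels):
        cls = features[labels == c]
        between += len(cls) * np.sum((cls.mean(axis=0) - overall_mean) ** 2)
        within += np.sum((cls - cls.mean(axis=0)) ** 2)
    return between / within

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))            # 40 samples of a merged sub-region feature set
y = np.repeat(np.arange(4), 10)           # 4 subjects, 10 samples each
X_pca = PCA(n_components=10).fit_transform(X)    # project to the PCA domain
print(fisher_ratio(X_pca, y))             # larger value = more discriminative feature set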

Keywords: Discriminant analysis, intra-class probability distribution, principal component analysis, phase congruency.

1229 Thermo-mechanical Deformation Behavior of Functionally Graded Rectangular Plates Subjected to Various Boundary Conditions and Loadings

Authors: Mohammad Talha, B. N. Singh

Abstract:

This paper deals with the thermo-mechanical deformation behavior of shear-deformable functionally graded ceramic-metal (FGM) plates. Theoretical formulations are based on a higher order shear deformation theory with a considerable amendment in the transverse displacement, using the finite element method (FEM). The mechanical properties of the plate are assumed to be temperature-dependent and graded in the thickness direction according to a power-law distribution in terms of the volume fractions of the constituents. The temperature field is assumed to be uniform over the plate surface (XY plane) and to vary in the thickness direction only. The fundamental equations for the FGM plates are obtained using a variational approach by considering traction-free boundary conditions on the top and bottom faces of the plate. A C0 continuous isoparametric Lagrangian finite element with thirteen degrees of freedom per node has been employed to obtain the results. Convergence and comparison studies have been performed to demonstrate the efficiency of the present model. The numerical results are obtained for different thickness ratios, aspect ratios, volume fraction indices and temperature rises with different loadings and boundary conditions. Numerical results for the FGM plates are provided in dimensionless tabular and graphical forms. The results show that the temperature field and the gradient in the material properties play a significant role in the thermo-mechanical deformation behavior of the FGM plates.
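
For reference, a commonly used form of the power-law grading mentioned above (the exact expression used by the authors may differ) is, for a plate of thickness h with the coordinate z measured from the mid-plane:

V_c(z) = \left(\frac{z}{h} + \frac{1}{2}\right)^{n}, \quad -\frac{h}{2} \le z \le \frac{h}{2}, \qquad P(z) = P_m + (P_c - P_m)\, V_c(z),

where n is the volume fraction index and P_c, P_m denote a temperature-dependent property (e.g. Young's modulus) of the ceramic and metal constituents, respectively.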

Keywords: Functionally graded material, higher order shear deformation theory, finite element method, independent field variables.

1228 Advanced Stochastic Models for Partially Developed Speckle

Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije

Abstract:

Speckled images arise when coherent microwave, optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object- or target-induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise are complicated by the nature of the noise and are not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series of Laguerre-weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise, as demonstrated by the central limit theorem.
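
A minimal numerical illustration of the limiting behaviour noted at the end of the abstract (an assumed random-phasor-sum model, not the authors' doubly stochastic formulation): as the mean number of scatterers per resolution cell grows, the intensity histogram approaches an exponential density, for which the mean equals the standard deviation.

import numpy as np

rng = np.random.default_rng(1)

def speckle_intensity(mean_scatterers, n_cells=20_000):
    counts = rng.poisson(mean_scatterers, size=n_cells)    # scatterers per resolution cell
    intensity = np.empty(n_cells)
    for i, k in enumerate(counts):
        phases = rng.uniform(0.0, 2.0 * np.pi, size=k)
        field = np.sum(np.exp(1j * phases))                # coherent phasor sum
        intensity[i] = np.abs(field) ** 2
    return intensity

for n in (2, 5, 50):
    I = speckle_intensity(n)
    # For an exponential density, mean == standard deviation, so this ratio tends to 1.
    print(n, I.std() / I.mean())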

Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound

1227 Apparent Temperature Distribution on Scaffoldings during Construction Works

Authors: I. Szer, J. Szer, K. Czarnocki, E. Błazik-Borowa

Abstract:

People on construction scaffoldings work in a dynamically changing, often unfavourable climate. Additionally, this kind of work is performed on low-stiffness structures at height, which increases the risk of accidents. It is therefore desirable to define the parameters of the work environment that contribute to raising the level of occupational safety of construction workers. The aim of this article is to present how changes in microclimate parameters on scaffolding can contribute to the development of dangerous situations and accidents. For this purpose, indicators based on the human thermal balance were used. However, use of this model under construction conditions is often burdened by significant errors or is even impossible to implement due to the lack of precise data. Thus, in the target model, a modified parameter was used: the apparent environmental temperature. Apparent temperature in the proposed Scaffold Use Risk Assessment Model is the perceived outdoor temperature resulting from the combined effects of air temperature, radiative temperature, relative humidity and wind speed (wind chill index, heat index). In the paper, correlations between the component factors and apparent temperature are presented for a facade scaffolding with a width of 24.5 m and a height of 42.3 m, located on the south-west side of a building. The distribution of the factors on the scaffolding has been used to evaluate the fit of the microclimate model. The results of the studies indicate that the observed ranges of apparent temperature on the scaffolds frequently result in a worker's inability to adapt. This leads to reduced concentration and increased fatigue, adversely affects health, and consequently increases the risk of dangerous situations and accidental injuries.

Keywords: Apparent temperature, health, safety work, scaffoldings.

1226 Long Wavelength Coherent Pulse of Sound Propagating in Granular Media

Authors: Rohit Kumar Shrivastava, Amalia Thomas, Nathalie Vriend, Stefan Luding

Abstract:

A mechanical wave or vibration propagating through granular media exhibits a specific signature in time. A coherent pulse or wavefront arrives first, with multiply scattered waves (coda) arriving later. The coherent pulse is micro-structure independent, i.e., it depends only on the bulk properties of the disordered granular sample, the sound wave velocity of the granular sample and hence the bulk and shear moduli. The coherent wavefront attenuates (decreases in amplitude) and broadens with distance from its source. The pulse attenuation and broadening effects are affected by disorder (polydispersity; contrast in the size of the granules) and have often been attributed to dispersion and scattering. To study the effect of disorder and of the initial amplitude (non-linearity) of the pulse imparted to the system on the coherent wavefront, numerical simulations have been carried out on one-dimensional sets of particles (granular chains). The interaction force between the particles is given by a Hertzian contact model. The sizes of the particles have been selected randomly from a Gaussian distribution, where the standard deviation of this distribution is the relevant parameter that quantifies the effect of disorder on the coherent wavefront. Since the coherent wavefront is independent of the system configuration, ensemble averaging has been used for improving the signal quality of the coherent pulse and removing the multiply scattered waves. The results concerning the width of the coherent wavefront have been formulated in terms of scaling laws. An experimental set-up of photoelastic particles constituting a granular chain is proposed to validate the numerical results.
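
A minimal sketch of the kind of simulation described (assumed material parameters and integration settings, not those of the paper): a one-dimensional chain with Gaussian polydispersity and Hertzian contacts, integrated with velocity Verlet after an initial pulse is imparted at one end.

import numpy as np

rng = np.random.default_rng(2)
N, dt, steps = 200, 2e-7, 50_000
mean_r, sigma_r = 1e-3, 0.1e-3               # radii drawn from a Gaussian (polydispersity)
radii = rng.normal(mean_r, sigma_r, N).clip(min=0.5 * mean_r)
mass = 2600.0 * (4.0 / 3.0) * np.pi * radii**3
kappa = 5e9                                   # effective Hertzian stiffness prefactor (assumed)

x = np.concatenate(([0.0], np.cumsum(radii[:-1] + radii[1:])))   # just-touching chain
v = np.zeros(N)
v[0] = 0.05                                   # impart an initial pulse at the left end

def forces(x):
    f = np.zeros(N)
    overlap = (radii[:-1] + radii[1:]) - np.diff(x)
    fc = kappa * np.where(overlap > 0, overlap, 0.0) ** 1.5      # Hertz: F ~ delta^(3/2)
    f[:-1] -= fc
    f[1:] += fc
    return f

a = forces(x) / mass
for _ in range(steps):                        # velocity Verlet integration
    x += v * dt + 0.5 * a * dt**2
    a_new = forces(x) / mass
    v += 0.5 * (a + a_new) * dt
    a = a_new
print("pulse position (particle index):", int(np.argmax(np.abs(v))))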

Keywords: Discrete elements, Hertzian Contact, polydispersity, weakly nonlinear, wave propagation.

1225 Simulation of Organic Matter Variability on a Sugarbeet Field Using Computer-Based Geostatistical Methods

Authors: M. Rüstü Karaman, Tekin Susam, Fatih Er, Servet Yaprak, Osman Karkacıer

Abstract:

Computer-based geostatistical methods can offer effective data analysis possibilities for agricultural areas by using vectorial data and their objective information. These methods help to detect spatial changes at different locations of large agricultural lands, which leads to effective fertilization for optimal yield with reduced environmental pollution. In this study, topsoil (0-20 cm) and subsoil (20-40 cm) samples were taken from a sugar beet field on a 20 x 20 m grid. Plant samples were also collected from the same plots. Some physical and chemical analyses of these samples were made by routine methods. According to the derived variation coefficients, topsoil organic matter (OM) showed greater variability than subsoil OM. The highest C.V. value of 17.79% was found for topsoil OM. The data were analyzed comparatively using kriging methods, which are widely used in geostatistics. Several interpolation methods (ordinary, simple and universal kriging) and semivariogram models (spherical, exponential and Gaussian) were tested in order to choose the suitable methods. The average standard deviations of the values estimated by the simple kriging interpolation method were less than the average standard deviations of the measured values (topsoil OM ± 0.48, N ± 0.37, subsoil OM ± 0.18). The most suitable combination was simple kriging with an exponential semivariogram model for topsoil, whereas it was simple kriging with a spherical semivariogram model for subsoil. The results also showed that these computer-based geostatistical methods should be tested and calibrated for different experimental conditions and semivariogram models.
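
The three semivariogram models tested above have standard closed forms; the sketch below uses one common parameterization (the nugget c0, partial sill c and practical range a are illustrative values, not those fitted in the study).

import numpy as np

def spherical(h, c0, c, a):
    g = c0 + c * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h <= a, g, c0 + c)

def exponential(h, c0, c, a):
    return c0 + c * (1.0 - np.exp(-3.0 * h / a))

def gaussian(h, c0, c, a):
    return c0 + c * (1.0 - np.exp(-3.0 * h ** 2 / a ** 2))

h = np.linspace(0.0, 200.0, 101)             # lag distances in metres (20 m grid spacing)
for model in (spherical, exponential, gaussian):
    print(model.__name__, model(h, c0=0.05, c=0.30, a=120.0)[-1])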

Keywords: Geostatistic, kriging, organic matter, sugarbeet.

1224 Firing Angle Range Control for Minimising Harmonics in TCR Employed in SVCs

Authors: D. R. Patil, U. Gudaru

Abstract:

Most electrical distribution systems incur large losses because the loads are widespread, reactive power compensation facilities are inadequate, and their control is improper. A typical static VAR compensator consists of a capacitor bank switched in binary sequential steps operated in conjunction with a thyristor controlled reactor (TCR) of the smallest step size. Such an SVC facilitates stepless control of reactive power, closely matching the load requirements so as to maintain the power factor near unity, and it requires an appropriately controlled TCR. This paper deals with an air-cored reactor suitable for a 3-phase, 50 Hz, Dy11, 11 kV/433 V, 125 kVA distribution transformer. Air-cored reactors were designed, built, tested and operated in conjunction with a capacitor bank in five binary sequential steps. It is established how the delta-connected TCR minimizes the harmonic components, and the operating range of various electrical quantities as a function of firing angle is investigated. In particular, the line and phase currents, D.C. components, THDs, active and reactive powers, odd and even triplen harmonics, and dominant characteristic harmonics are all investigated versus firing angle, and a range of firing angle is fixed for satisfactory operation. The harmonic spectra for phase and line quantities at specified firing angles are given. Simulation studies establish that operating the TCR within the bounds specified in this paper yields the best possible operating conditions, particularly freedom from all dominant harmonics.
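
For context, the textbook expressions for the TCR branch current harmonics as a function of firing angle (standard results, not extracted from the paper) can be evaluated as below; alpha is measured from the voltage zero crossing, with conduction for 90-180 degrees, and in a delta-connected TCR the triplen harmonics circulate inside the delta and cancel in the line currents.

import numpy as np

V_over_wL = 1.0                               # base branch current V/(omega*L), per unit

def tcr_fundamental(alpha):
    return V_over_wL * (2.0 * (np.pi - alpha) + np.sin(2.0 * alpha)) / np.pi

def tcr_harmonic(alpha, n):
    num = np.sin(alpha) * np.cos(n * alpha) - n * np.cos(alpha) * np.sin(n * alpha)
    return 4.0 * V_over_wL / np.pi * num / (n * (n * n - 1.0))

alphas = np.radians(np.arange(90, 181, 5))
print(f"I_1 at 120 deg: {tcr_fundamental(np.radians(120.0)):.4f} pu")
for n in (3, 5, 7):
    peak = np.max(np.abs(tcr_harmonic(alphas, n)))
    print(f"max |I_{n}| over 90-180 deg: {peak:.4f} pu")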

Keywords: Binary sequential switched capacitor bank, TCR, non-triplen harmonics, stepless Q control, active and reactive power, Simulink.

1223 Socio-Demographic Characteristics and Psychosocial Consequences of Sickle Cell Disease: The Case of Patients in a Public Hospital in Ghana

Authors: Vincent A. Adzika, Franklin N. Glozah, Collins S. K. Ahorlu

Abstract:

Background: Sickle Cell Disease (SCD) is of major public-health concern globally, with the majority of patients living in Africa. Despite its relevance, there is a dearth of research to determine the socio-demographic distribution and psychosocial impact of SCD in Africa. The objective of this study therefore was to examine the socio-demographic distribution and psychosocial consequences of SCD among patients in Ghana and to assess their quality of life and coping mechanisms. Methods: A cross-sectional research design was used, involving the completion of questionnaires on socio-demographic characteristics, quality of life, anxiety and depression. Participants were 387 male and female patients attending a sickle cell clinic in a public hospital. Results: Results showed no gender or marital status differences in anxiety and depression. However, there were age and level-of-education variances in depression but not in anxiety. In terms of quality of life, patients were most satisfied with the presence of love, friends and relatives, as well as with their home, community and neighbourhood environment. While pains of varied nature and severity were the major reasons for attending hospital, going to the hospital as well as having faith in God were the most frequently reported mechanisms for coping with unbearable SCD attacks. Multiple regression analysis showed that some socio-demographic and quality of life indicators had strong associations with anxiety and/or depression. Conclusion: It is recommended that a multi-dimensional intervention strategy incorporating psychosocial dimensions should be considered in the treatment and management of SCD.

Keywords: Sickle cell disease, quality of life, anxiety, depression, socio-demographic characteristics, Ghana.

1222 Improving the Exploitation of Fluid in Elastomeric Polymeric Isolator

Authors: Haithem Elderrat, Huw Davies, Emmanuel Brousseau

Abstract:

Elastomeric polymer foam has been used widely in the automotive industry, especially for isolating unwanted vibrations. Such a material is able to absorb unwanted vibration due to its combination of elastic and viscous properties. However, the 'creep effect', poor stress distribution and susceptibility to high temperatures are the main disadvantages of such a system. In this study, improvements in the performance of elastomeric foam as a vibration isolator were investigated using the concept of Foam Filled Fluid (FFFluid). In FFFluid devices, the foam takes the form of capsule shapes and is mixed with viscous fluid, while the mixture is contained in a closed vessel. When the FFFluid isolator is subjected to vibrations, energy is absorbed due to the elastic strain of the foam. As the foam is compressed, there is also movement of the fluid, which contributes to further energy absorption as the fluid shears. Also, and dependent on the design adopted, the packaging could further attenuate vibration through energy absorption via friction and/or elastic strain. The present study focuses on the advantages of the FFFluid concept over dry polymeric foam in the role of vibration isolation. This comparative study between the performance of dry foam and the FFFluid was made according to experimental procedures. The paper concludes by evaluating the performance of the FFFluid isolator in the suspension system of a light vehicle. One outcome of this research is that the FFFluid may be preferable to elastomer isolators in certain applications, as it reduces the effects of high temperatures and of 'creep effects', thereby improving reliability and load distribution. The stiffness coefficient of the system increased by about 60% when an FFFluid sample was used. The technology represented by the FFFluid is therefore considered by this research to be suitable for application in the suspension system of a light vehicle.

Keywords: Anti-vibration devices, dry foam, FFFluid.

1221 A CT-based Monte Carlo Dose Calculations for Proton Therapy Using a New Interface Program

Authors: A. Esmaili Torshabi, A. Terakawa, K. Ishii, H. Yamazaki, S. Matsuyama, Y. Kikuchi, M. Nakhostin, H. Sabet, A. Ishizaki, W. Yamashita, T. Togashi, J. Arikawa, H. Akiyama, K. Koyata

Abstract:

The purpose of this study is to introduce a new interface program for calculating dose distributions with the Monte Carlo method in complex heterogeneous systems, such as organs or tissues, in proton therapy. This interface program was developed under MATLAB and includes a friendly graphical user interface with several tools, such as image property adjustment and result display. A quadtree decomposition technique was used as the image segmentation algorithm to create optimum geometries from Computed Tomography (CT) images for dose calculations of the proton beam. The result of this technique is a number of non-overlapping squares of different sizes in every image. In this way, the resolution of the image segmentation is high enough in and near heterogeneous areas to preserve the precision of the dose calculations, and low enough in homogeneous areas to directly reduce the number of cells. Furthermore, a cell reduction algorithm can be used to combine neighboring cells of the same material. The validation of this method has been done in two ways: first, in comparison with experimental data obtained with an 80 MeV proton beam at the Cyclotron and Radioisotope Center (CYRIC) at Tohoku University, and second, in comparison with data based on the polybinary tissue calibration method, performed at CYRIC. These results are presented in this paper. This program can read the output file of the Monte Carlo code while the region of interest is selected manually, and gives a plot of the dose distribution of the proton beam superimposed onto the CT images.
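
A minimal sketch of the quadtree decomposition idea (illustrative Python, not the authors' MATLAB interface): a block is kept as a single cell when its intensity spread is below a threshold, so small cells survive only in and near heterogeneous regions.

import numpy as np

def quadtree(img, x, y, size, threshold, min_size, blocks):
    block = img[y:y + size, x:x + size]
    if size <= min_size or block.max() - block.min() <= threshold:
        blocks.append((x, y, size))           # homogeneous enough: keep as one cell
        return
    half = size // 2
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        quadtree(img, x + dx, y + dy, half, threshold, min_size, blocks)

rng = np.random.default_rng(3)
ct = rng.normal(0.0, 10.0, size=(128, 128))   # stand-in for a CT slice (HU values)
ct[40:90, 40:90] += 800.0                     # a dense heterogeneous insert
cells = []
quadtree(ct, 0, 0, 128, threshold=50.0, min_size=4, blocks=cells)
print(len(cells), "cells instead of", 128 * 128, "pixels")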

Keywords: Monte Carlo, CT images, Quadtree decomposition, Interface program, Proton beam

1220 Internal Surface Measurement of Nanoparticle with Polarization-interferometric Nonlinear Confocal Microscope

Authors: Chikara Egami, Kazuhiro Kuwahara

Abstract:

Polarization-interferometric nonlinear confocal microscopy is proposed for measuring a nano-sized particle with optical anisotropy. The anisotropy in the particle was spectroscopically imaged through the three-dimensional distribution of the photoinduced third-order nonlinear dielectric polarization.

Keywords: nanoparticle, optical storage, microscope

1219 Ordinal Regression with Fenton-Wilkinson Order Statistics: A Case Study of an Orienteering Race

Authors: Joonas Pääkkönen

Abstract:

In sports, individuals and teams are typically interested in final rankings. Final results, such as times or distances, dictate these rankings, also known as places. Places can be further associated with ordered random variables, commonly referred to as order statistics. In this work, we introduce a simple yet accurate order-statistical ordinal regression function that predicts relay race places from changeover-times. We call this function the Fenton-Wilkinson Order Statistics model. The model is built on the following educated assumption: individual leg-times follow log-normal distributions. Moreover, our key idea is to utilize Fenton-Wilkinson approximations of changeover-times alongside an estimator for the total number of teams, as in the famous German tank problem. This original place regression function is sigmoidal and thus correctly predicts the existence of a small number of elite teams that significantly outperform the rest of the teams. Our model also describes how place increases linearly with changeover-time at the inflection point of the log-normal distribution function. With real-world data from Jukola 2019, a massive orienteering relay race, the model is shown to be highly accurate even when the size of the training set is only 5% of the whole data set. Numerical results also show that our model exhibits smaller place prediction root-mean-square errors than linear regression, mord regression and Gaussian process regression.
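
One way to realize the described regression is sketched below (an illustrative reconstruction with simulated data, not the authors' exact estimator): changeover-times are treated as log-normal, the field size is estimated with the classical German tank estimator, and the predicted place is a sigmoidal function of changeover-time.

import numpy as np
from scipy.stats import norm

def fit_place_model(train_times, train_places):
    log_t = np.log(train_times)
    mu, sigma = log_t.mean(), log_t.std(ddof=1)     # log-normal parameters of changeover-time
    k = len(train_places)
    n_teams = train_places.max() * (1.0 + 1.0 / k) - 1.0   # German tank estimate of field size
    return mu, sigma, n_teams

def predict_place(t, mu, sigma, n_teams):
    # Sigmoidal in t: elite teams sit on the flat lower tail, the bulk near the inflection point.
    return 1.0 + (n_teams - 1.0) * norm.cdf((np.log(t) - mu) / sigma)

rng = np.random.default_rng(4)
true_times = np.exp(rng.normal(8.5, 0.25, size=1500))   # simulated relay changeover-times (s)
places = true_times.argsort().argsort() + 1
idx = rng.choice(1500, size=75, replace=False)           # 5% training subset
mu, sigma, n_hat = fit_place_model(true_times[idx], places[idx])
print(round(n_hat), predict_place(np.exp(8.5), mu, sigma, n_hat))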

Keywords: Fenton-Wilkinson approximation, German tank problem, log-normal distribution, order statistics, ordinal regression, orienteering, sports analytics, sports modeling.

1218 Application of Single Tuned Passive Filters in Distribution Networks at the Point of Common Coupling

Authors: M. Almutairi, S. Hadjiloucas

Abstract:

The harmonic distortion of voltage is important in relation to power quality due to the interaction between the large diffusion of non-linear and time-varying single-phase and three-phase loads and power supply systems. However, harmonic distortion levels can be reduced by improving the design of polluting loads or by applying arrangements and adding filters. The application of passive filters is an effective solution that can be used to achieve harmonic mitigation, mainly because filters offer high efficiency, simplicity, and are economical. Additionally, their different possible frequency response characteristics can be exploited to achieve certain required harmonic filtering targets. With these ideas in mind, the objective of this paper is to determine the size of single tuned passive filters that works best in distribution networks, in order to economically limit violations at a given point of common coupling (PCC). This article suggests that a single tuned passive filter could be employed in typical industrial power systems. Furthermore, constrained optimization can be used to find the optimal sizing of the passive filter in order to reduce both harmonic voltages and harmonic currents in the power system to an acceptable level, and, thus, improve the load power factor. The optimization technique works to minimize voltage total harmonic distortion (VTHD) and current total harmonic distortion (ITHD), while maintaining a given power factor within a specified range. According to the IEEE Standard 519, both indices are viewed as constraints for the optimal passive filter design problem. The performance of this technique is discussed using numerical examples taken from previous publications.
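
For context, the following sketch applies the textbook single-tuned filter design relations (not the constrained optimization of the paper; the example ratings are assumed): the capacitor is sized from the fundamental reactive power the filter must supply, and the reactor from the tuning condition.

import math

def single_tuned_filter(v_ll, q_var, h, f=50.0, quality=30.0):
    """Return (C, L, R) of a wye-equivalent single tuned filter."""
    w = 2.0 * math.pi * f
    xc = v_ll ** 2 * h ** 2 / (q_var * (h ** 2 - 1.0))   # net fundamental reactance is Xc - XL
    xl = xc / h ** 2                                      # tuning: h*w = 1/sqrt(L*C)
    C = 1.0 / (w * xc)
    L = xl / w
    R = h * xl / quality                                  # damping resistor from quality factor
    return C, L, R

C, L, R = single_tuned_filter(v_ll=11e3, q_var=600e3, h=5)   # 5th-harmonic filter, 600 kvar
print(f"C = {C*1e6:.1f} uF, L = {L*1e3:.2f} mH, R = {R:.2f} ohm")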

Keywords: Harmonics, passive filter, power factor, power quality.

1217 Forecast of the Small Wind Turbines Sales with Replacement Purchases and with or without Account of Price Changes

Authors: V. Churkin, M. Lopatin

Abstract:

The purpose of the paper is to estimate the US small wind turbines market potential and to forecast small wind turbine sales in the US. The forecasting method is based on the application of the Bass model and the generalized Bass model of innovation diffusion under replacement purchases. In this work, an exponential distribution is used for modeling replacement purchases; its single parameter is determined by the average lifetime of small wind turbines. The identification of the model parameters is based on nonlinear regression analysis of the annual sales statistics published by the American Wind Energy Association (AWEA) for 2001 through 2012. The estimate of the US average market potential of small wind turbines (for adoption purchases), without account of price changes, is 57080 (confidence interval from 49294 to 64866 at P = 0.95) under an average wind turbine lifetime of 15 years, and 62402 (confidence interval from 54154 to 70648 at P = 0.95) under an average lifetime of 20 years. In the first case the explained variance is 90.7%, while in the second it is 91.8%. The effect of wind turbine price changes on sales was estimated using the generalized Bass model. This required a price forecast; to obtain it, a polynomial regression function based on the Berkeley Lab statistics was used. The estimate of the US average market potential of small wind turbines (for adoption purchases) in that case is 42542 (confidence interval from 32863 to 52221 at P = 0.95) under an average lifetime of 15 years, and 47426 (confidence interval from 36092 to 58760 at P = 0.95) under an average lifetime of 20 years. In the first case the explained variance is 95.3%, while in the second it is 95.3%.
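
A minimal sketch of the underlying Bass diffusion calculation (standard formulas with illustrative p and q values; only the market potential m is taken from the figures above): cumulative adoptions follow N(t) = m(1 - e^{-(p+q)t}) / (1 + (q/p)e^{-(p+q)t}), and replacement purchases can be layered on top using the exponential lifetime assumption.

import numpy as np

def bass_cumulative(t, m, p, q):
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

years = np.arange(0, 13)                      # 2001..2013 horizon, illustrative
m, p, q = 57080.0, 0.003, 0.30                # m from the reported market potential; p, q assumed
cum = bass_cumulative(years, m, p, q)
adoption_sales = np.diff(cum)                 # new (adoption) purchases per year
mean_life = 15.0
repl_prob = 1.0 - np.exp(-1.0 / mean_life)    # exponential lifetime: yearly replacement fraction
print(adoption_sales.round(0))
print("approx. replacements in year 12:", round(cum[-1] * repl_prob))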

Keywords: Bass model, generalized Bass model, replacement purchases, sales forecasting of innovations, statistics of sales of small wind turbines in the United States.

1216 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing

Authors: V. Barot, S. McLeod, R. Harrison, A. A. West

Abstract:

Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. In order to stay competitive and flexible, situations also demand the global distribution of enterprises, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via a circular-buffer-based data processing mechanism. The work highlighted in this paper includes the implementation of this mechanism in the domain of the manufacturing industry. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources, such as an HMI, VRML-based 3D views and remote client instances, regardless of their distribution nature and/or their mechanisms. This approach is presented together with a set of evaluation results. The authors' main concentration is on the reliability and the performance metrics of the adopted approach. Performance evaluation is carried out in terms of the response times taken to process the data in this domain, compared with an alternative data processing implementation such as the linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved from the proposed implementation and highlight further research work to be carried out.
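
A minimal sketch of a circular-buffer broadcast mechanism of the kind described (illustrative Python, not the Broadcaster's actual implementation): one producer pushes machine-state updates while several remote clients drain the buffer independently, each at its own pace.

from threading import Lock

class CircularBuffer:
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.capacity = capacity
        self.head = 0                      # absolute index of the next write
        self.lock = Lock()

    def push(self, item):
        with self.lock:
            self.buf[self.head % self.capacity] = item
            self.head += 1

    def read_from(self, cursor):
        """Return (new_items, new_cursor) for a client that last read up to `cursor`."""
        with self.lock:
            start = max(cursor, self.head - self.capacity)   # drop items the client missed
            items = [self.buf[i % self.capacity] for i in range(start, self.head)]
            return items, self.head

buffer = CircularBuffer(capacity=8)
for update in ("spindle_on", "feed=200", "alarm_cleared"):
    buffer.push(update)
hmi_items, hmi_cursor = buffer.read_from(0)    # e.g. an HMI client catching up
print(hmi_items, hmi_cursor)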

Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.

1215 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data

Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L Duan

Abstract:

The conditional density characterizes the distribution of a response variable y given a predictor x, and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on the problem of Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation acts as a motivating starting point. In this work, we extend NF neural networks to the case where an external x is present. Specifically, we use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zP, zN]. The zP component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zN component is a high-dimensional independent Gaussian vector, which explains the variations in y not related or less related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), only requires a simple modification of the common normalizing flow framework, while significantly improving the interpretation of the latent component, since zP represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations, due to factors such as lighting condition and subject id, from the other random variations. Further, the experiments show that an unconditional NF neural network, based on an unsupervised model of z such as a Gaussian mixture, fails to generate interpretable results.

Keywords: Conditional density estimation, image generation, normalizing flow, supervised dimension reduction.

1214 On Fourier Type Integral Transform for a Class of Generalized Quotients

Authors: A. S. Issa, S. K. Q. AL-Omari

Abstract:

In this paper, we investigate certain spaces of generalized functions for the Fourier and Fourier type integral transforms. We discuss convolution theorems and establish certain spaces of distributions for the considered integrals. The new Fourier type integral is well-defined, linear, one-to-one and continuous with respect to certain types of convergences. Many properties and an inverse problem are also discussed in some detail.

Keywords: Fourier type integral, Fourier integral, generalized quotient, Boehmian, distribution.

1213 Contextual Distribution for Textual Alignment

Authors: Yuri Bizzoni, Marianne Reboul

Abstract:

Our program compares French and Italian translations of Homer's Odyssey, from the 16th to the 20th century. We focus on the third point, showing how distributional semantics systems can be used both to improve alignment between different French translations and between the Greek text and a French translation. Although we focus on French examples, the techniques we present are completely language-independent.

Keywords: Translation studies, machine translation, computational linguistics, distributional semantics.

1212 Spatial Mapping of Dengue Incidence: A Case Study in Hulu Langat District, Selangor, Malaysia

Authors: Er, A. C., Rosli, M. H., Asmahani A., Mohamad Naim M. R., Harsuzilawati M.

Abstract:

Dengue is a mosquito-borne infection whose incidence has risen to alarming rates in recent decades. It is found in tropical and sub-tropical climates. In Malaysia, dengue has been declared one of the national health threats to the public. This study aimed to map the spatial distribution of dengue cases in the district of Hulu Langat, Selangor, via a combination of Geographic Information System (GIS) and spatial statistic tools. Data related to dengue were gathered from the various government health agencies. The locations of dengue cases were geocoded using a handheld Trimble Juno SB GPS. A total of 197 dengue cases occurring in 2003 were used in this study. Those data were then aggregated to the sub-district level and converted into GIS format. The study also used population and demographic data as well as the boundary of Hulu Langat. To assess the spatial distribution of dengue cases, three spatial statistics methods (Moran's I, average nearest neighbourhood (ANN) and kernel density estimation) were applied together with spatial analysis in the GIS environment. Those three indices were used to analyze the spatial distribution and average distance of dengue incidence and to locate the hot spots of dengue cases. The results indicated that the dengue cases were clustered (p < 0.01) when analyzed using Moran's I, with a z-score of 5.03. The results from the ANN analysis showed that the average nearest neighbour ratio is 0.518755, which is less than 1 (p < 0.0001). From this result, we can conclude that the pattern of dengue cases in Hulu Langat district is clustered. The z-score for dengue incidence within the district is -13.0525 (p < 0.0001). It was also found that significant spatial autocorrelation of dengue incidence occurs at an average distance of 380.81 meters (p < 0.0001). Several locations, especially residential areas, were also identified as hot spots of dengue cases in the district.
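
For reference, the average nearest neighbour statistics quoted above follow the standard formulas used in GIS packages; the sketch below evaluates them on an illustrative clustered point pattern (not the dengue data). A ratio well below 1 with a large negative z-score indicates clustering.

import numpy as np

def average_nearest_neighbour(points, area):
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    observed = d.min(axis=1).mean()
    expected = 0.5 / np.sqrt(n / area)            # mean distance for a random pattern
    se = 0.26136 / np.sqrt(n * n / area)
    return observed / expected, (observed - expected) / se

rng = np.random.default_rng(5)
centres = rng.uniform(0, 10_000, size=(5, 2))                      # clustered toy pattern (m)
pts = np.vstack([c + rng.normal(0, 150, size=(40, 2)) for c in centres])
print(average_nearest_neighbour(pts, area=10_000.0 ** 2))          # (ratio, z-score)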

Keywords: Dengue, geographic information system (GIS), spatial analysis, spatial statistics

1211 Study of the Elastic Scattering of 16O, 14N and 12C on the Nucleus of 27Al at Different Energies near the Coulomb Barrier

Authors: N. Amangeldi, N. Burtebayev, Sh. Hamada, A. Amar

Abstract:

The angular distributions for the elastic scattering of 16O, 14N and 12C on 27Al were measured at an energy of 1.75 MeV/nucleon. The optical potential code SPIVAL was used in this work to analyze the experimental results. A good agreement between the experimental and theoretical results was obtained.

Keywords: 27Al(16O, 16O)27Al, SPIVAL, 27Al(14N, 14N)27Al, 27Al(12C, 12C)27Al, Elastic Scattering, Optical Potential Codes.

1210 Dust Acoustic Shock Waves in Coupled Dusty Plasmas with Kappa-Distributed Ions

Authors: Hamid Reza Pakzad

Abstract:

We have considered an unmagnetized dusty plasma system consisting of ions obeying a superthermal distribution and strongly coupled, negatively charged dust. We have used the reductive perturbation method and derived the Korteweg-de Vries-Burgers (KdV-Burgers) equation. The behavior of the shock waves in the plasma has been investigated.
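
For reference, the KdV-Burgers equation obtained by the reductive perturbation method typically has the generic form (the coefficients depend on the plasma parameters; the exact expressions are derived in the paper):

\frac{\partial \phi_1}{\partial \tau} + A\,\phi_1\frac{\partial \phi_1}{\partial \xi} + B\,\frac{\partial^3 \phi_1}{\partial \xi^3} - C\,\frac{\partial^2 \phi_1}{\partial \xi^2} = 0,

where \phi_1 is the first-order perturbed potential, \xi and \tau are the stretched coordinates, A is the nonlinearity coefficient, B the dispersion coefficient, and C the dissipation coefficient arising from the strong coupling of the dust; the balance between the dispersive and dissipative terms determines whether the shock structure is oscillatory or monotonic.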

Keywords: Shock, Soliton, Coupling, Superthermal ions.

1209 Distribution of Macrobenthic Polychaete Families in Relation to Environmental Parameters in North West Penang, Malaysia

Authors: Mohammad Gholizadeh, Khairun Yahya, Anita Talib, Omar Ahmad

Abstract:

The distribution of macrobenthic polychaetes along the coastal waters of Penang National Park was surveyed to estimate the effect of various environmental parameters at three stations (200 m, 600 m and 1200 m from the shoreline) during six sampling months, from June 2010 to April 2011. The use of polychaetes in descriptive ecology is reviewed in the light of a recent investigation, particularly concerning soft-bottom environments. Polychaetes, often linked to the notion of opportunistic species able to proliferate after an enrichment in organic matter, have played a significant role particularly with regard to affected soft-bottom habitats. The objective of this survey was to investigate different environmental stresses on the soft-bottom polychaete community along Teluk Ketapang and Pantai Acheh (Penang National Park) over a one-year period. Variations in the polychaete community were evaluated using univariate and multivariate methods. The results of the PCA analysis displayed a positive relation between macrobenthic community structure and environmental parameters such as sediment particle size and organic matter in the coastal water. A total of 604 individuals were examined, grouped into 23 families. The family Nereidae was the most abundant (22.68%), followed by Spionidae (22.02%), Hesionidae (12.58%), Nephtyidae (9.27%) and Orbiniidae (8.61%). It is noticeable that good results can only be obtained on the basis of good taxonomic resolution. We propose that, in monitoring surveys, operative time could be optimized not only by working at a higher taxonomic level on the entire macrobenthic data set, but also by choosing an especially indicative group and working at a lower taxonomic level.

Keywords: Polychaete families, environment parameters, Bioindicators, Pantai Acheh, Teluk Ketapang.

1208 Geostatistical Analysis and Mapping of Ground-Level Ozone in a Medium-Sized Urban Area

Authors: F. J. Moral García, P. Valiente González, F. López Rodríguez

Abstract:

Ground-level tropospheric ozone is one of the air pollutants of most concern. It is mainly produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower parts of the atmosphere. Ozone levels become particularly high in regions close to high ozone precursor emissions and during summer, when stagnant meteorological conditions with high insolation and high temperatures are common. In this work, some results of a study of urban ozone distribution patterns in the city of Badajoz, the largest and most industrialized city in the Extremadura region (southwest Spain), are shown. Fourteen sampling campaigns, at least one per month, were carried out to measure ambient air ozone concentrations with an automatic portable analyzer, during periods selected for conditions favourable to ozone production. Then, to evaluate the ozone distribution across the city, the measured ozone data were analyzed using geostatistical techniques. First, during the exploratory analysis, the data were found to be normally distributed, which is a desirable property for the subsequent stages of the geostatistical study. Second, during the structural analysis, theoretical spherical models provided the best fit for all monthly experimental variograms. The parameters of these variograms (sill, range and nugget) revealed that the maximum distance of spatial dependence is between 302 and 790 m and that the variable, air ozone concentration, is not evenly distributed over short distances. Finally, predictive ozone maps were derived for all points of the experimental study area using geostatistical algorithms (kriging). High prediction accuracy was obtained in all cases, as cross-validation showed. Useful information for hazard assessment was also provided when probability maps, based on kriging interpolation and the kriging standard deviation, were produced.

Keywords: Kriging, map, tropospheric ozone, variogram.

1207 Moment Generating Functions of Observed Gaps between Hypopnea Using Saddlepoint Approximations

Authors: Nur Zakiah Mohd Saat, Abdul Aziz Jemain

Abstract:

The saddlepoint approximation is one of the tools used to obtain expressions for densities and distribution functions. We approximate the densities of the observed gaps between hypopnea events using the Huzurbazar saddlepoint approximation. We demonstrate the density of a maximum likelihood estimator in exponential families.
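
For reference, the standard saddlepoint approximation to a density (a generic form, not specific to the hypopnea-gap model): if K(s) is the cumulant generating function of the random variable, then

\hat{f}(x) = \left(2\pi K''(\hat{s})\right)^{-1/2} \exp\!\left(K(\hat{s}) - \hat{s}\,x\right), \qquad \text{where } K'(\hat{s}) = x,

so the approximation requires only solving the saddlepoint equation K'(\hat{s}) = x and evaluating K and its second derivative at the solution.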

Keywords: Exponential, maximum likelihood estimators, observed gap, saddlepoint approximations.

1206 A Novel Deinterlacing Algorithm Based on Adaptive Polynomial Interpolation

Authors: Seung-Won Jung, Hye-Soo Kim, Le Thanh Ha, Seung-Jin Baek, Sung-Jea Ko

Abstract:

In this paper, a novel deinterlacing algorithm is proposed. The proposed algorithm approximates the distribution of the luminance by a polynomial function. Instead of using one polynomial function for all pixels, different polynomial functions are used for the uniform, texture, and directional edge regions. The function coefficients for each region are computed by matrix multiplications. Experimental results demonstrate that the proposed method performs better than conventional algorithms.

Keywords: Deinterlacing, polynomial interpolation.

1205 The Impact of Upgrades on ERP System Reliability

Authors: F. Urem, K. Fertalj, I. Livaja

Abstract:

Constant upgrading of Enterprise Resource Planning (ERP) systems is necessary, but can cause new defects. This paper attempts to model the likelihood of defects after completed upgrades with a Weibull defect probability density function (PDF). A case study is presented analyzing recorded defect data obtained for one ERP subsystem. The trends are observed for the values of the parameters of the proposed Weibull distribution over a one-year period. As a result, the ability to predict the appearance of defects after the next upgrade is described.
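
A minimal sketch of the kind of fit described (synthetic times-to-defect, not the case-study log): a two-parameter Weibull density is fitted to the days elapsed between an upgrade and each recorded defect; a shape parameter below 1 indicates a defect rate that decreases as the subsystem stabilizes.

from scipy.stats import weibull_min

days_to_defect = weibull_min.rvs(0.8, scale=30.0, size=120, random_state=6)   # synthetic sample
shape, loc, scale = weibull_min.fit(days_to_defect, floc=0.0)                  # location fixed at 0
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.1f} days")
# Probability that a defect appears within the first 14 days after the upgrade:
print(f"P(defect within 14 days) = {weibull_min.cdf(14.0, shape, loc, scale):.2f}")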

Keywords: ERP, upgrade, reliability, Weibull model

1204 Probabilistic Graphical Model for the Web

Authors: M. Nekri, A. Khelladi

Abstract:

The World Wide Web is a network with a complex topology, the main properties of which are a power-law degree distribution, a low clustering coefficient and a small average distance. Modeling the web as a graph makes it possible to locate information quickly and consequently helps in the construction of search engines. Here, we present a model based on the already existing probabilistic graphs with all the aforesaid characteristics. This work consists in studying the web in order to understand its structure, which enables us to model it more easily and to propose a possible algorithm for its exploration.
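
A minimal sketch of a preferential-attachment growth process consistent with the keywords (a generic illustration, not the authors' probabilistic graphical model): each new page links to m existing pages chosen with probability proportional to their current degree, which yields the power-law degree distribution mentioned above.

import numpy as np

def preferential_attachment(n_nodes, m=2, seed=7):
    rng = np.random.default_rng(seed)
    degree = np.zeros(n_nodes, dtype=int)
    stubs = list(range(m))                    # seed nodes, one stub each, to start the sampling
    for new in range(m, n_nodes):
        chosen = set()
        while len(chosen) < m:
            chosen.add(stubs[rng.integers(len(stubs))])   # degree-proportional pick
        for t in chosen:
            degree[t] += 1
            degree[new] += 1
            stubs.extend([t, new])            # each endpoint gains one stub per new link
    return degree

deg = preferential_attachment(5000)
print("max degree:", deg.max(), " mean degree:", round(deg.mean(), 2))   # heavy upper tail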

Keywords: Clustering coefficient, preferential attachment, small world, Web community.

1203 Seismic Fragility Assessment of Strongback Steel Braced Frames Subjected to Near-Field Earthquakes

Authors: Mohammadreza Salek Faramarzi, Touraj Taghikhany

Abstract:

In this paper, the seismic fragility of a recently developed hybrid structural system, known as the strongback system (SBS), is assessed. In this system, to mitigate the occurrence of the soft-story mechanism and improve the distribution of story drifts over the height of the structure, an elastic vertical truss is formed. The strengthened members of the braced span are designed to remain substantially elastic during levels of excitation where soft-story mechanisms are likely to occur and to impose a nearly uniform story drift distribution. Due to the distinctive characteristics of near-field ground motions, it is necessary to study the effect of these records on the seismic performance of the SBS. To this end, a set of 56 near-field ground motion records suggested by the FEMA P695 methodology is used. For fragility assessment, nonlinear dynamic analyses are carried out in OpenSees based on the procedure recommended in the HAZUS technical manual. Four damage states, including slight, moderate, extensive, and complete damage (collapse), are considered. To evaluate each damage state, the inter-story drift ratio and the floor acceleration are used as engineering demand parameters. Further, to extend the evaluation of the collapse state of the system, a different collapse criterion suggested in FEMA P695 is applied. It is concluded that the SBS can significantly increase the collapse capacity and consequently decrease the collapse risk of the structure during its lifetime. Comparing the observed mean annual frequency (MAF) of exceedance of each damage state against the allowable values presented in performance-based design methods, it is found that using the elastic vertical truss improves the structural response effectively.
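
For reference, HAZUS-style fragility assessments commonly express each damage state as a lognormal fragility curve; the sketch below uses this standard form with illustrative median capacities and dispersions (not the values computed for the strongback frames).

import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    # P(DS >= ds | IM = im) with median capacity theta and logarithmic std. deviation beta
    return norm.cdf(np.log(im / theta) / beta)

damage_states = {"slight": (0.25, 0.45), "moderate": (0.5, 0.45),
                 "extensive": (0.9, 0.5), "complete": (1.4, 0.55)}   # assumed (theta in g, beta)
for name, (theta, beta) in damage_states.items():
    print(f"{name:10s} P(exceed | IM = 1.0 g) = {fragility(1.0, theta, beta):.2f}")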

Keywords: Strongback System, Near-fault, Seismic fragility, Uncertainty, IDA, Probabilistic performance assessment.

1202 On the Reliability of Low Voltage Network with Small Scale Distributed Generators

Authors: Rade M. Ciric, Nikola Lj.Rajakovic

Abstract:

Since the 1980s, huge efforts have been made to utilize renewable energy sources to generate electric power. This paper reports some aspects of the integration of distributed generators into low voltage distribution networks. An assessment of the impact of the distributed generators on the reliability indices of a low voltage network is performed. Results obtained from a case study of a low voltage network are presented and discussed.
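
A minimal sketch of two common distribution reliability indices (standard definitions, illustrative feeder data rather than the case-study network): SAIFI and SAIDI computed from sustained-interruption records; distributed generators that allow islanded operation reduce the customer-interruptions and customer-hours entering these sums.

def saifi_saidi(interruptions, total_customers):
    """interruptions: list of (customers_affected, duration_hours) per sustained event."""
    customer_interruptions = sum(c for c, _ in interruptions)
    customer_hours = sum(c * d for c, d in interruptions)
    saifi = customer_interruptions / total_customers        # interruptions per customer per period
    saidi = customer_hours / total_customers                # outage hours per customer per period
    return saifi, saidi

events = [(120, 1.5), (40, 0.75), (300, 4.0)]               # assumed yearly events on the feeder
print(saifi_saidi(events, total_customers=850))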

Keywords: low voltage network, distributed generation, reliability indices

1201 Water Management Scheme: Panacea to Development Using Nigeria’s University of Ibadan Water Supply Scheme as a Case Study

Authors: Sunday Olufemi Adesogan

Abstract:

The supply of potable water is, at the very least, a very important index of national development. Water tariffs depend on the treatment cost, which carries the highest percentage of the total operation cost in any water supply scheme. In order to keep water tariffs as low as possible, treatment costs have to be minimized. The University of Ibadan, Nigeria, water supply scheme consists of a treatment plant with three distribution stations (Amina Way, Kurumi and Lander) and two raw water supply sources (Awba dam and Eleyele dam). An operational study of the scheme was carried out to ascertain the efficiency of the supply of potable water on the campus and to justify the need for water supply schemes in tertiary institutions. The study involved regular collection, processing and analysis of periodic operational data. Data collected include supply readings (daily water production) and consumers' metered readings for a period of 22 months (October 2013 - July 2015); the operating hours of the plants and of the operating personnel were also collected. Applying the required mathematical equations, the total loss was determined for the distribution system and translated into monetary terms. The adequacy of the operational functions was also determined. The study revealed that a water supply scheme is justified in tertiary institutions. It was also found that approximately 10.7 million Nigerian naira (N) was lost to leakages during the 22-month study period, and that the system's storage capacity is no longer adequate, especially for peak water production. The capacity of the system as a whole is insufficient for the present university population, and the existing water supply system is not being operated in an optimal manner, especially due to personnel, power and system ageing constraints.

Keywords: Operational, efficiency, production, supply, water treatment plant, water loss.
