Search results for: generalized pareto distribution
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2319


1239 Apparent Temperature Distribution on Scaffoldings during Construction Works

Authors: I. Szer, J. Szer, K. Czarnocki, E. Błazik-Borowa

Abstract:

People on construction scaffoldings work in a dynamically changing, often unfavourable climate. Additionally, this kind of work is performed on low-stiffness structures at height, which increases the risk of accidents. It is therefore desirable to define the parameters of the work environment that contribute to raising the level of construction workers' occupational safety. The aim of this article is to present how changes in microclimate parameters on scaffolding can impact the development of dangerous situations and accidents. For this purpose, indicators based on the human thermal balance were used. However, use of this model under construction conditions is often burdened by significant errors, or is even impossible to implement, due to the lack of precise data. Thus, in the target model a modified parameter was used: apparent environmental temperature. In the proposed Scaffold Use Risk Assessment Model, apparent temperature is the perceived outdoor temperature caused by the combined effects of air temperature, radiant temperature, relative humidity and wind speed (wind chill index, heat index). The paper presents correlations between the component factors and apparent temperature for a facade scaffolding 24.5 m wide and 42.3 m high, located on the south-west side of a building. The distribution of these factors over the scaffolding has been used to evaluate the fit of the microclimate model. The results of the studies indicate that the observed ranges of apparent temperature on the scaffolds frequently exceed a worker's ability to adapt. This leads to reduced concentration and increased fatigue, adversely affects health, and consequently increases the risk of dangerous situations and accidental injuries.
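
The apparent temperature described above combines air temperature, radiant temperature, relative humidity and wind speed through the wind chill and heat index. As a rough illustration of that kind of component calculation (not the authors' Scaffold Use Risk Assessment Model), a minimal Python sketch of the two standard index formulas is given below; the input values are placeholders.

```python
import math

def wind_chill_c(temp_c: float, wind_kmh: float) -> float:
    """Environment Canada / NWS wind chill (valid for T <= 10 C, wind >= 4.8 km/h)."""
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

def heat_index_c(temp_c: float, rel_humidity: float) -> float:
    """Rothfusz heat-index regression (inputs in deg C and %RH, result in deg C)."""
    t = temp_c * 9 / 5 + 32  # the regression is defined in Fahrenheit
    r = rel_humidity
    hi_f = (-42.379 + 2.04901523 * t + 10.14333127 * r - 0.22475541 * t * r
            - 6.83783e-3 * t**2 - 5.481717e-2 * r**2 + 1.22874e-3 * t**2 * r
            + 8.5282e-4 * t * r**2 - 1.99e-6 * t**2 * r**2)
    return (hi_f - 32) * 5 / 9

print(wind_chill_c(2.0, 30.0))    # cold, windy conditions at scaffold level
print(heat_index_c(32.0, 70.0))   # hot, humid conditions at scaffold level
```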

Keywords: Apparent temperature, health, safety work, scaffoldings.

1238 Software Maintenance Severity Prediction for Object Oriented Systems

Authors: Parvinder S. Sandhu, Roma Jaswal, Sandeep Khimta, Shailendra Singh

Abstract:

Since the majority of faults are found in a few modules of a system, there is a need to identify the modules that are severely affected compared to the others, so that proper maintenance can be done in time, especially for critical applications. Neural networks have already been applied in software engineering to build reliability growth models and to predict gross change or reusability metrics. Neural networks are sophisticated non-linear modeling techniques that are able to model complex functions; they are used when the exact nature of the inputs and outputs is not known, and a key feature is that they learn the relationship between input and output through training. In the present work, various neural-network-based techniques are explored and a comparative analysis is performed for predicting the level of maintenance needed, by predicting the severity level of the faults present in NASA's public-domain defect dataset. The different algorithms are compared on the basis of Mean Absolute Error, Root Mean Square Error and accuracy values. It is concluded that the Generalized Regression Neural Network is the best algorithm for classifying software components into different levels of severity of fault impact. The algorithm can be used to develop a model for identifying modules that are heavily affected by faults.
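
The comparison criteria named in the abstract (Mean Absolute Error, Root Mean Square Error and accuracy) are straightforward to compute with standard tooling. The sketch below uses scikit-learn with a handful of hypothetical severity labels standing in for the NASA defect data, which is not reproduced here; it illustrates the metrics only, not the neural network models themselves.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, accuracy_score

# Hypothetical severity levels (1-4) for a handful of modules; the real study
# uses NASA's public-domain defect dataset, not shown here.
y_true = np.array([1, 2, 4, 3, 2, 1, 4, 3])
y_pred = np.array([1, 2, 3, 3, 2, 1, 4, 2])  # e.g. output of a GRNN-style classifier

mae = mean_absolute_error(y_true, y_pred)
rmse = np.sqrt(mean_squared_error(y_true, y_pred))
acc = accuracy_score(y_true, y_pred)
print(f"MAE={mae:.3f}  RMSE={rmse:.3f}  Accuracy={acc:.3f}")
```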

Keywords: Neural Network, Software faults, Software Metric.

1237 Long Wavelength Coherent Pulse of Sound Propagating in Granular Media

Authors: Rohit Kumar Shrivastava, Amalia Thomas, Nathalie Vriend, Stefan Luding

Abstract:

A mechanical wave or vibration propagating through granular media exhibits a specific signature in time: a coherent pulse or wavefront arrives first, with multiply scattered waves (coda) arriving later. The coherent pulse is micro-structure independent, i.e., it depends only on the bulk properties of the disordered granular sample: the sound wave velocity and hence the bulk and shear moduli. The coherent wavefront attenuates (decreases in amplitude) and broadens with distance from its source. These attenuation and broadening effects are affected by disorder (polydispersity; contrast in the size of the granules) and have often been attributed to dispersion and scattering. To study the effect of disorder and of the initial amplitude (non-linearity) of the pulse imparted to the system on the coherent wavefront, numerical simulations have been carried out on one-dimensional sets of particles (granular chains). The interaction force between the particles is given by a Hertzian contact model. The particle sizes are selected randomly from a Gaussian distribution, whose standard deviation is the relevant parameter quantifying the effect of disorder on the coherent wavefront. Since the coherent wavefront is independent of the particular system configuration, ensemble averaging has been used to improve the signal quality of the coherent pulse and to remove the multiply scattered waves. The results concerning the width of the coherent wavefront are formulated in terms of scaling laws. An experimental set-up of photoelastic particles constituting a granular chain is proposed to validate the numerical results.
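
As a rough illustration of two ingredients named above, Gaussian polydispersity and the Hertzian contact law F = k δ^(3/2), the Python sketch below draws particle radii for a one-dimensional chain from a Gaussian distribution and evaluates the contact forces under a small uniform pre-compression. The stiffness prefactor and all numerical values are placeholders, not the parameters of the simulations in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_particles = 64
mean_radius = 1.0e-3           # m, placeholder value
polydispersity = 0.05          # std/mean of the Gaussian size distribution (the disorder parameter)
radii = rng.normal(mean_radius, polydispersity * mean_radius, n_particles)

def hertz_force(overlap, k=1.0e9):
    """Hertzian normal contact force F = k * delta^(3/2); zero when not in contact."""
    return k * overlap**1.5 if overlap > 0.0 else 0.0

# Place the particles so that neighbours just touch, then compress the whole chain slightly.
centres = np.cumsum(2.0 * radii) - radii
centres *= 1.0 - 1.0e-3                              # uniform static strain (placeholder)
overlaps = radii[:-1] + radii[1:] - np.diff(centres)
forces = np.array([hertz_force(d) for d in overlaps])
print(forces.min(), forces.max())                    # contact forces vary with the local particle sizes
```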

Keywords: Discrete elements, Hertzian Contact, polydispersity, weakly nonlinear, wave propagation.

1236 Simulation of Organic Matter Variability on a Sugarbeet Field Using the Computer Based Geostatistical Methods

Authors: M. Rüstü Karaman, Tekin Susam, Fatih Er, Servet Yaprak, Osman Karkacıer

Abstract:

Computer-based geostatistical methods offer effective data analysis possibilities for agricultural areas by using vectorial data and the attribute information attached to them. These methods help to detect spatial changes across different locations of large agricultural fields, which leads to effective fertilization for optimal yield with reduced environmental pollution. In this study, topsoil (0-20 cm) and subsoil (20-40 cm) samples were taken from a sugar beet field on a 20 x 20 m grid. Plant samples were also collected from the same plots, and some physical and chemical analyses of these samples were made by routine methods. According to the derived coefficients of variation, the variation in topsoil organic matter (OM) was larger than that in subsoil OM; the highest C.V. value of 17.79% was found for topsoil OM. The data were analyzed comparatively using kriging methods, which are widely used in geostatistics. Several interpolation methods (ordinary, simple and universal kriging) and semivariogram models (spherical, exponential and Gaussian) were tested in order to choose the most suitable ones. The average standard deviations of the values estimated by the simple kriging interpolation method were less than the average standard deviations of the measured values (topsoil OM ± 0.48, N ± 0.37, subsoil OM ± 0.18). The most suitable combination was simple kriging with an exponential semivariogram model for the topsoil, and simple kriging with a spherical semivariogram model for the subsoil. The results also showed that these computer-based geostatistical methods should be tested and calibrated for different experimental conditions and semivariogram models.
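
The three semivariogram models tested (spherical, exponential and Gaussian) have standard closed forms, sketched below in Python with the practical-range convention; the nugget, sill and range values are free parameters here, not the ones fitted to the soil data.

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical semivariogram: rises to the sill at range a, constant beyond."""
    h = np.asarray(h, dtype=float)
    inside = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, inside, sill)

def exponential(h, nugget, sill, a):
    """Exponential semivariogram: approaches the sill asymptotically (practical range a)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * np.asarray(h, dtype=float) / a))

def gaussian(h, nugget, sill, a):
    """Gaussian semivariogram: parabolic behaviour near the origin."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * (np.asarray(h, dtype=float) / a) ** 2))

lags = np.array([20.0, 40.0, 80.0, 160.0, 320.0])   # m, e.g. multiples of the 20 m grid spacing
print(spherical(lags, nugget=0.02, sill=0.12, a=100.0))
```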

Keywords: Geostatistic, kriging, organic matter, sugarbeet.

1235 Analysis of Delays during Initial Phase of Construction Projects and Mitigation Measures

Authors: Sunaitan Al Mutairi

Abstract:

A perfect start is a key factor for completing a project on time. This study examined the effects of delayed mobilization of resources during the initial phases of a project. The paper highlights the identification and categorization of all delays during the initial construction phase and their root cause analysis, with corrective/control measures, for Kuwait Oil Company oil and gas projects. A relatively large percentage of the delays identified during project execution (from contract award to the end of the defects liability period) were attributed to mobilization/preliminary activity delays. Data analysis demonstrated a significant increase in average project delay during the last five years compared to the previous period. Contractors that had delays/issues during the initial phase experienced slippages that progressively increased, resulting in time and cost overruns; delays/issues not mitigated on time during the initial phase had a very high impact on project completion. Data analysis of the delays for the past five years was carried out using trend charts, scatter plots, process maps, box plots, the relative importance index and Pareto charts. Construction of any project inside the Gathering Centers involves complex management of workforce, materials, plant, machinery, new technologies, etc. Delay affects the completion of projects and compromises the quality, schedule and budget of project deliverables. Where works were executed as per plan during the initial phase and the start-up period of the construction activities, only minor slippages/delays in project completion resulted; in addition, a good working environment between client and contractor led to better project execution and management, with the contractor on the front foot in projects that had minimal or no delays during the initial and construction periods. Hence, a perfect start during the initial construction phase has a positive influence on project success. This paper studies each type of delay with real examples supported by statistical results and suggests mitigation measures. Detailed analysis was carried out with all stakeholders, based on the impact and occurrence of delays, to arrive at practical and effective measures to mitigate them. The key to improvement is to have proper control measures and periodic evaluation/audit to ensure implementation of the mitigation measures. The focus of this research is to reduce the delays encountered during the initial construction phase of the project life cycle.
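
Of the analysis tools listed, the Pareto chart simply ranks delay categories by frequency (or impact) and tracks their cumulative share. A small Python sketch follows, with hypothetical category counts used purely for illustration, since the company data are not reproduced here.

```python
import numpy as np

# Hypothetical counts of initial-phase delay causes (illustrative only).
causes = {"Manpower mobilization": 34, "Material delivery": 27,
          "Permits/approvals": 18, "Subcontractor award": 12, "Site access": 6}

labels, counts = zip(*sorted(causes.items(), key=lambda kv: kv[1], reverse=True))
cumulative = np.cumsum(counts) / sum(counts) * 100  # cumulative percentage line of the Pareto chart

for label, count, cum in zip(labels, counts, cumulative):
    print(f"{label:25s} {count:3d}  cumulative {cum:5.1f}%")
# The 'vital few' are the categories needed to reach roughly 80% of all delays.
```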

Keywords: Construction activities delays, delay analysis for construction projects, mobilization delays, oil and gas projects delays.

1234 Firing Angle Range Control for Minimising Harmonics in TCRs Employed in SVCs

Authors: D. R. Patil, U. Gudaru

Abstract:

Most electrical distribution systems incur large losses because loads are widely spread and reactive power compensation facilities are inadequate or improperly controlled. A typical static VAR compensator (SVC) consists of a capacitor bank in binary sequential steps operated in conjunction with a thyristor controlled reactor (TCR) of the smallest step size. Such an SVC facilitates stepless control of reactive power, closely matching the load requirements so as to maintain the power factor near unity, and it requires an appropriately controlled TCR. This paper deals with an air-cored reactor suitable for a 3-phase, 50 Hz, Dy11, 11 kV/433 V, 125 kVA distribution transformer. Air-cored reactors were designed, built, tested and operated in conjunction with a capacitor bank in five binary sequential steps. It is established how the delta-connected TCR minimizes the harmonic components, and the operating range of various electrical quantities as a function of the firing angle is investigated. In particular, firing angle versus line and phase currents, DC components, THDs, active and reactive powers, odd and even triplen harmonics, and dominant characteristic harmonics are all investigated, and the range of firing angle is fixed for satisfactory operation. The harmonic spectra for phase and line quantities at specified firing angles are given. When the TCR is operated within the bounds specified in this paper, established through simulation studies, the best possible operating condition is obtained, in particular one free from all dominant harmonics.
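
The dependence of the harmonic content on the firing angle can be reproduced numerically: for an ideal single-phase TCR branch, the current during conduction is i(ωt) = (Vm/XL)(cos α − cos ωt), and an FFT of the sampled waveform yields the harmonic spectrum and THD. The Python sketch below uses per-unit placeholder ratings rather than the 125 kVA transformer of the study, and assumes firing angles between 90° and 180°.

```python
import numpy as np

def tcr_current(alpha_deg: float, v_peak: float = 1.0, x_l: float = 1.0, n: int = 4096):
    """Ideal single-phase TCR current over one cycle, firing angle 90..180 deg."""
    alpha = np.radians(alpha_deg)
    wt = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    i = np.zeros(n)
    pos = (wt >= alpha) & (wt <= 2.0 * np.pi - alpha)        # positive thyristor pulse
    i[pos] = (v_peak / x_l) * (np.cos(alpha) - np.cos(wt[pos]))
    mirrored = (wt - np.pi) % (2.0 * np.pi)                  # negative pulse is the mirror image
    neg = (mirrored >= alpha) & (mirrored <= 2.0 * np.pi - alpha)
    i[neg] = -(v_peak / x_l) * (np.cos(alpha) - np.cos(mirrored[neg]))
    return i

i = tcr_current(alpha_deg=120.0)
spectrum = np.abs(np.fft.rfft(i)) / (len(i) / 2.0)           # peak amplitude per harmonic order
fundamental, harmonics = spectrum[1], spectrum[2:14]         # THD taken over orders 2..13 here
thd = np.sqrt(np.sum(harmonics ** 2)) / fundamental * 100.0
print(f"fundamental = {fundamental:.3f} pu, THD = {thd:.1f} %")
# In a balanced delta-connected TCR the triplen harmonics circulate inside the delta
# and do not appear in the line currents.
```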

Keywords: Binary sequential switched capacitor bank, TCR, non-triplen harmonics, stepless Q control, active and reactive power, Simulink.

1233 BIDENS: Iterative Density Based Biclustering Algorithm With Application to Gene Expression Analysis

Authors: Mohamed A. Mahfouz, M. A. Ismail

Abstract:

Biclustering is a very useful data mining technique for identifying patterns in which different genes are correlated based on a subset of conditions in gene expression analysis. Association rule mining is an efficient approach to achieve biclustering, as in the BIMODULE algorithm, but it is sensitive to the values given to its input parameters and to the discretization procedure used in the preprocessing step; moreover, when noise is present, classical association rule miners discover multiple small fragments of the true bicluster but miss the true bicluster itself. This paper formally presents a generalized noise-tolerant bicluster model, termed μBicluster. An iterative algorithm termed BIDENS, based on the proposed model, is introduced that can discover a set of k possibly overlapping biclusters simultaneously. Our model uses a more flexible method to partition the dimensions in order to preserve meaningful and significant biclusters. The proposed algorithm allows the discovery of biclusters that are hard to discover with BIMODULE. An experimental study on yeast and human gene expression data and on several artificial datasets shows that our algorithm offers substantial improvements over several previously proposed biclustering algorithms.

Keywords: Machine learning, biclustering, bi-dimensional clustering, gene expression analysis, data mining.

1232 Socio-Demographic Characteristics and Psychosocial Consequences of Sickle Cell Disease: The Case of Patients in a Public Hospital in Ghana

Authors: Vincent A. Adzika, Franklin N. Glozah, Collins S. K. Ahorlu

Abstract:

Background: Sickle Cell Disease (SCD) is of major public-health concern globally, with the majority of patients living in Africa. Despite its relevance, there is a dearth of research determining the socio-demographic distribution and psychosocial impact of SCD in Africa. The objective of this study was therefore to examine the socio-demographic distribution and psychosocial consequences of SCD among patients in Ghana and to assess their quality of life and coping mechanisms. Methods: A cross-sectional research design was used, involving the completion of questionnaires on socio-demographic characteristics, quality of life, anxiety and depression. Participants were 387 male and female patients attending a sickle cell clinic in a public hospital. Results: Results showed no gender or marital-status differences in anxiety and depression. However, there were differences by age and level of education in depression, but not in anxiety. In terms of quality of life, patients were most satisfied with the presence of love, friends and relatives, as well as with their home, community and neighbourhood environment. While pain of varied nature and severity was the major reason for attending hospital, going to the hospital and having faith in God were the most frequently reported mechanisms for coping with unbearable SCD attacks. Multiple regression analysis showed that some socio-demographic and quality-of-life indicators had strong associations with anxiety and/or depression. Conclusion: It is recommended that a multi-dimensional intervention strategy incorporating psychosocial dimensions be considered in the treatment and management of SCD.

Keywords: Sickle cell disease, quality of life, anxiety, depression, socio-demographic characteristics, Ghana.

1231 Algorithms for Computing of Optimization Problems with a Common Minimum-Norm Fixed Point with Applications

Authors: Apirak Sombat, Teerapol Saleewong, Poom Kumam, Parin Chaipunya, Wiyada Kumam, Anantachai Padcharoen, Yeol Je Cho, Thana Sutthibutpong

Abstract:

This research aims to study a two-step iteration process defined over a finite family of σ-asymptotically quasi-nonexpansive nonself-mappings. Strong convergence is guaranteed under the framework of Banach spaces with some additional structural properties, including strict and uniform convexity, reflexivity, and smoothness assumptions. In analogy with the projection technique for nonself-mappings in Hilbert spaces, we use the generalized projection to construct a point within the corresponding domain. Moreover, we introduce the duality mapping and its inverse to overcome the unavailability of the duality representation that is exploited by Hilbert space theorists. We then apply our results for σ-asymptotically quasi-nonexpansive nonself-mappings to solve for the ideal efficiency of vector optimization problems composed of finitely many objective functions, and we show that the solution obtained from our process is the closest to the origin. Finally, we give an illustrative numerical example to support our results.

Keywords: σ-asymptotically quasi-nonexpansive nonselfmapping, strong convergence, fixed point, uniformly convex and uniformly smooth Banach space.

1230 Improving the Exploitation of Fluid in Elastomeric Polymeric Isolator

Authors: Haithem Elderrat, Huw Davies, Emmanuel Brousseau

Abstract:

Elastomeric polymer foam has been used widely in the automotive industry, especially for isolating unwanted vibrations. Such a material is able to absorb unwanted vibration due to its combination of elastic and viscous properties. However, the ‘creep effect’, poor stress distribution and susceptibility to high temperatures are the main disadvantages of such a system. In this study, improvements in the performance of elastomeric foam as a vibration isolator were investigated using the concept of Foam Filled Fluid (FFFluid). In FFFluid devices, the foam takes the form of capsule shapes and is mixed with a viscous fluid, while the mixture is contained in a closed vessel. When the FFFluid isolator is subjected to vibration, energy is absorbed due to the elastic strain of the foam; as the foam is compressed, there is also movement of the fluid, which contributes to further energy absorption as the fluid shears. Depending on the design adopted, the packaging can also attenuate vibration through energy absorption via friction and/or elastic strain. The present study focuses on the advantages of the FFFluid concept over dry polymeric foam in the role of vibration isolation. The comparison between the performance of dry foam and the FFFluid was made according to experimental procedures. The paper concludes by evaluating the performance of the FFFluid isolator in the suspension system of a light vehicle. One outcome of this research is that the FFFluid may be preferable to elastomer isolators in certain applications, as it reduces the effects of high temperatures and of ‘creep effects’, thereby increasing reliability and improving load distribution. The stiffness coefficient of the system increased by about 60% when an FFFluid sample was used. The technology represented by the FFFluid is therefore considered suitable for application in the suspension system of a light vehicle.

Keywords: Anti-vibration devices, dry foam, FFFluid.

1229 A CT-based Monte Carlo Dose Calculations for Proton Therapy Using a New Interface Program

Authors: A. Esmaili Torshabi, A. Terakawa, K. Ishii, H. Yamazaki, S. Matsuyama, Y. Kikuchi, M. Nakhostin, H. Sabet, A. Ishizaki, W. Yamashita, T. Togashi, J. Arikawa, H. Akiyama, K. Koyata

Abstract:

The purpose of this study is to introduce a new interface program for calculating dose distributions with the Monte Carlo method in complex heterogeneous systems, such as organs or tissues, in proton therapy. The interface program was developed in MATLAB and includes a friendly graphical user interface with several tools, such as image property adjustment and results display. A quadtree decomposition technique was used as the image segmentation algorithm to create optimal geometries from Computed Tomography (CT) images for dose calculations of the proton beam. The result of this technique is a number of non-overlapping squares of different sizes in each image. In this way, the resolution of the image segmentation is high enough in and near heterogeneous areas to preserve the precision of the dose calculations, and low enough in homogeneous areas to directly reduce the number of cells. Furthermore, a cell reduction algorithm can be used to combine neighbouring cells of the same material. The validation of this method has been done in two ways: first, by comparison with experimental data obtained with an 80 MeV proton beam at the Cyclotron and Radioisotope Center (CYRIC) at Tohoku University, and second, by comparison with data based on the polybinary tissue calibration method, performed at CYRIC. These results are presented in this paper. The program can read the output file of the Monte Carlo code while the region of interest is selected manually, and gives a plot of the proton beam dose distribution superimposed onto the CT images.
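
The quadtree segmentation step can be illustrated with a short recursive routine that keeps splitting a square image block until its intensity range (a proxy for CT-number heterogeneity) falls below a tolerance. This is a generic Python sketch, not the authors' MATLAB interface program, and the tolerance and the synthetic image are arbitrary.

```python
import numpy as np

def quadtree(img, x0=0, y0=0, size=None, tol=25, min_size=2, leaves=None):
    """Recursively split a square image block until max-min intensity <= tol."""
    if leaves is None:
        leaves = []
    if size is None:
        size = img.shape[0]              # assumes a square image with a power-of-two side
    block = img[y0:y0 + size, x0:x0 + size]
    if size <= min_size or block.max() - block.min() <= tol:
        leaves.append((x0, y0, size))    # homogeneous enough: keep as a single cell
        return leaves
    half = size // 2
    for dy in (0, half):
        for dx in (0, half):
            quadtree(img, x0 + dx, y0 + dy, half, tol, min_size, leaves)
    return leaves

rng = np.random.default_rng(1)
ct_like = rng.integers(0, 50, (64, 64))          # mostly homogeneous "tissue"
ct_like[16:32, 16:32] += 400                     # a dense "bone-like" insert
cells = quadtree(ct_like, tol=25)
print(len(cells), "cells; the finest cells cluster around the heterogeneity")
```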

Keywords: Monte Carlo, CT images, Quadtree decomposition, Interface program, Proton beam

1228 Internal Surface Measurement of Nanoparticle with Polarization-interferometric Nonlinear Confocal Microscope

Authors: Chikara Egami, Kazuhiro Kuwahara

Abstract:

Polarization-interferometric nonlinear confocal microscopy is proposed for measuring a nano-sized particle with optical anisotropy. The anisotropy in the particle was spectroscopically imaged through a three-dimensional distribution of the photoinduced third-order nonlinear dielectric polarization.

Keywords: nanoparticle, optical storage, microscope

1227 Ordinal Regression with Fenton-Wilkinson Order Statistics: A Case Study of an Orienteering Race

Authors: Joonas Pääkkönen

Abstract:

In sports, individuals and teams are typically interested in final rankings. Final results, such as times or distances, dictate these rankings, also known as places. Places can be further associated with ordered random variables, commonly referred to as order statistics. In this work, we introduce a simple, yet accurate order statistical ordinal regression function that predicts relay race places with changeover-times. We call this function the Fenton-Wilkinson Order Statistics model. This model is built on the following educated assumption: individual leg-times follow log-normal distributions. Moreover, our key idea is to utilize Fenton-Wilkinson approximations of changeover-times alongside an estimator for the total number of teams as in the notorious German tank problem. This original place regression function is sigmoidal and thus correctly predicts the existence of a small number of elite teams that significantly outperform the rest of the teams. Our model also describes how place increases linearly with changeover-time at the inflection point of the log-normal distribution function. With real-world data from Jukola 2019, a massive orienteering relay race, the model is shown to be highly accurate even when the size of the training set is only 5% of the whole data set. Numerical results also show that our model exhibits smaller place prediction root-mean-square-errors than linear regression, mord regression and Gaussian process regression.
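
The Fenton-Wilkinson step approximates a sum of independent log-normal leg-times by a single log-normal whose first two moments match those of the sum. A compact Python sketch of that moment matching is shown below; the leg-time parameters are illustrative, not values fitted to the Jukola data.

```python
import numpy as np

def fenton_wilkinson(mus, sigmas):
    """Log-normal (mu, sigma) approximating the sum of independent LogNormal(mu_i, sigma_i)."""
    mus, sigmas = np.asarray(mus, float), np.asarray(sigmas, float)
    means = np.exp(mus + sigmas**2 / 2)                 # E[X_i]
    variances = (np.exp(sigmas**2) - 1.0) * means**2    # Var[X_i]
    sigma2_s = np.log(variances.sum() / means.sum()**2 + 1.0)
    mu_s = np.log(means.sum()) - sigma2_s / 2.0
    return mu_s, np.sqrt(sigma2_s)

# Illustrative leg-time parameters (log-minutes) for a hypothetical 4-leg relay team.
mu_s, sigma_s = fenton_wilkinson([3.9, 4.1, 4.0, 4.3], [0.15, 0.20, 0.18, 0.25])
print(mu_s, sigma_s)

# Sanity check against Monte Carlo: the approximation matches the mean of the sum.
rng = np.random.default_rng(0)
total = rng.lognormal([3.9, 4.1, 4.0, 4.3], [0.15, 0.20, 0.18, 0.25], (100000, 4)).sum(axis=1)
print(total.mean(), np.exp(mu_s + sigma_s**2 / 2))
```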

Keywords: Fenton-Wilkinson approximation, German tank problem, log-normal distribution, order statistics, ordinal regression, orienteering, sports analytics, sports modeling.

1226 Optimal Manufacturing Scheduling for Dependent Details Processing

Authors: Ivan C. Mustakerov, Daniela I. Borissova

Abstract:

Increasing competitiveness in the manufacturing industry is forcing manufacturers to seek effective processing schedules. The paper presents an optimization approach to manufacturing scheduling for dependent details (i.e., parts) processing, with given processing sequences and times on multiple machines. By defining the decision variables as the start and end moments of details processing, it is possible to use straightforward variable restrictions to satisfy different technological requirements and to formulate easy-to-understand and easy-to-solve optimization tasks for arbitrary numbers of details and machines. A case study example is solved for seven base moldings for CNC metalworking machines processed on five different machines, with a given processing order among details and machines and known processing times. Solving the linear optimization task yields the optimal manufacturing schedule, which minimizes the overall processing time. The manufacturing schedule defines the moments of molding delivery, thus minimizing storage costs and ensuring that mounting due times are met. The proposed optimization approach is based on a real manufacturing plant problem: different processing schedule variants for different technological restrictions were defined and implemented in practice at the Bulgarian company RAIS Ltd. The proposed approach can be generalized to other job shop scheduling problems in different applications.
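
With the processing sequences fixed, the start/end-moment formulation reduces to a linear program: minimize the makespan subject to precedence constraints of the form s_j ≥ s_i + p_i. The toy Python sketch below solves such a model with SciPy's linprog for three hypothetical chained operations; it is a schematic of the formulation, not the seven-molding RAIS case study.

```python
import numpy as np
from scipy.optimize import linprog

# Three chained operations with processing times p (hypothetical, in hours).
p = np.array([4.0, 2.0, 3.0])
# Decision variables: x = [s1, s2, s3, C] (start moments and makespan C).
c = np.array([0.0, 0.0, 0.0, 1.0])        # objective: minimize the makespan only

A_ub = np.array([
    [ 1.0, -1.0,  0.0,  0.0],   # s1 + p1 <= s2   (operation 2 waits for operation 1)
    [ 0.0,  1.0, -1.0,  0.0],   # s2 + p2 <= s3   (operation 3 waits for operation 2)
    [ 1.0,  0.0,  0.0, -1.0],   # s1 + p1 <= C
    [ 0.0,  1.0,  0.0, -1.0],   # s2 + p2 <= C
    [ 0.0,  0.0,  1.0, -1.0],   # s3 + p3 <= C
])
b_ub = np.array([-p[0], -p[1], -p[0], -p[1], -p[2]])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print("start moments:", res.x[:3], "makespan:", res.x[3])   # expected makespan = 9.0
```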

Keywords: Optimal manufacturing scheduling, linear programming, metalworking machines production, dependent details processing.

1225 Application of Single Tuned Passive Filters in Distribution Networks at the Point of Common Coupling

Authors: M. Almutairi, S. Hadjiloucas

Abstract:

The harmonic distortion of voltage is important in relation to power quality, due to the interaction between the large diffusion of non-linear and time-varying single-phase and three-phase loads and power supply systems. Harmonic distortion levels can be reduced by improving the design of polluting loads or by applying corrective arrangements such as adding filters. The application of passive filters is an effective solution for harmonic mitigation, mainly because filters offer high efficiency and simplicity and are economical; in addition, different frequency response characteristics can be chosen to achieve the required harmonic filtering targets. With these ideas in mind, the objective of this paper is to determine the size of single-tuned passive filter that works best in distribution networks, in order to economically limit violations at a given point of common coupling (PCC). The article suggests that a single-tuned passive filter can be employed in typical industrial power systems. Furthermore, constrained optimization can be used to find the optimal sizing of the passive filter in order to reduce both harmonic voltages and harmonic currents in the power system to an acceptable level and, thus, improve the load power factor. The optimization technique minimizes the voltage total harmonic distortion (VTHD) and current total harmonic distortion (ITHD) while maintaining a given power factor within a specified range. According to IEEE Standard 519, both indices are viewed as constraints for the optimal passive filter design problem. The performance of this technique is discussed using numerical examples taken from previous publications.
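
One common textbook sizing of a single-tuned shunt filter starts from the reactive power the branch must supply at the fundamental, the harmonic order to which it is tuned and a chosen quality factor. The Python sketch below follows that recipe with illustrative values; the constrained optimization of VTHD/ITHD described in the paper is not reproduced.

```python
import math

def single_tuned_filter(v_ll: float, q_var: float, h: float, f1: float = 50.0, q_factor: float = 40.0):
    """Approximate R, L, C of a single-tuned shunt filter.

    v_ll: line-to-line voltage (V), q_var: fundamental reactive power to supply (var),
    h: tuned harmonic order, q_factor: filter quality factor.
    """
    w1 = 2.0 * math.pi * f1
    x_c = v_ll**2 / q_var                 # capacitive reactance at the fundamental
    x_l = x_c / h**2                      # series reactor so the branch resonates at h * f1
    c = 1.0 / (w1 * x_c)
    l = x_l / w1
    r = math.sqrt(x_l * x_c) / q_factor   # characteristic reactance divided by the quality factor
    return r, l, c

r, l, c = single_tuned_filter(v_ll=11e3, q_var=1.2e6, h=5)   # e.g. a hypothetical 5th-harmonic branch
print(f"R={r:.3f} ohm  L={l*1e3:.2f} mH  C={c*1e6:.1f} uF")
```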

Keywords: Harmonics, passive filter, power factor, power quality.

1224 Knowledge Management in Academic: A Perspective of Academic Research Contribution to Economic Development of a Nation

Authors: Hilary J. Watsilla, Narasimha R. Vajjhala

Abstract:

Information and Communication Technology (ICT) has made information access easier and more affordable. Academic research has also benefited from this, with online journals and academic resources readily available at the click of a button. However, there are limited ways of assessing and controlling the quality of academic research, especially in public institutions. Nigeria is the most populous country in Africa, with a significant number of universities and a young population. The quality of knowledge created by academic researchers, however, needs to be evaluated due to the high number of predatory journals in academia. The purpose of this qualitative study is to look at the knowledge creation, acquisition and assimilation process of academic researchers in public universities in Nigeria. Qualitative research will be carried out using in-depth interviews and observations. Academic researchers will be interviewed, and absorptive capacity theory will be used as the theoretical framework to guide the research. The findings from this study should help in understanding the impact of ICT on the knowledge creation process in academic research and how ICT can affect the quality of knowledge produced by researchers. The findings should also add value to the existing body of knowledge on the quality of academic research, especially in Africa, where the availability of quality academic research is limited. As this study is limited to Nigerian universities, the outcomes may not generalize to other developing countries.

Keywords: Knowledge creation, academic research, knowledge management, information and communication technology, research, university.

1223 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing

Authors: V. Barot, S. McLeod, R. Harrison, A. A. West

Abstract:

Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. In order to stay competitive and flexible, enterprises are also increasingly distributed globally, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via a circular-buffer-based data processing mechanism. The work highlighted in this paper covers the implementation of this mechanism in the manufacturing domain. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources, such as an HMI, VRML-based 3D views and remote client instances, regardless of their distribution or underlying mechanisms. The approach is presented together with a set of evaluation results. The authors' main concern is the reliability and performance of the adopted approach. Performance evaluation is carried out in terms of the response times taken to process the data in this domain, compared with an alternative data processing implementation such as a linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved by the proposed implementation and highlight further research work to be carried out.
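
The circular data processing mechanism at the heart of the Broadcaster is, in essence, a fixed-size ring buffer in which the writer overwrites the oldest machine-state snapshot while readers consume at their own pace. A minimal, generic Python sketch of such a buffer is shown below; it is not the actual Broadcaster code.

```python
class CircularBuffer:
    """Fixed-size ring buffer: the writer overwrites the oldest entry when full."""

    def __init__(self, capacity: int):
        self._data = [None] * capacity
        self._capacity = capacity
        self._head = 0          # next write position
        self._count = 0

    def put(self, item):
        self._data[self._head] = item
        self._head = (self._head + 1) % self._capacity
        self._count = min(self._count + 1, self._capacity)

    def snapshot(self):
        """Return items oldest-to-newest, e.g. to push to a remote HMI or VRML client."""
        start = (self._head - self._count) % self._capacity
        return [self._data[(start + i) % self._capacity] for i in range(self._count)]

buf = CircularBuffer(capacity=4)
for update in ["state-1", "state-2", "state-3", "state-4", "state-5"]:
    buf.put(update)              # "state-1" is overwritten once the buffer wraps
print(buf.snapshot())            # ['state-2', 'state-3', 'state-4', 'state-5']
```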

Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.

1222 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data

Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L Duan

Abstract:

The conditional density characterizes the distribution of a response variable y given a predictor x, and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on the problem of Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation acts as a motivating starting point. In this work, we extend NF neural networks to the case where an external x is present. Specifically, we use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zP, zN]. The zP component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zN component is a high-dimensional independent Gaussian vector, which explains the variations in y that are not, or are less, related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), only requires a simple modification of the common normalizing flow framework, while significantly improving the interpretation of the latent component, since zP represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations, due to factors such as lighting condition and subject id, from the other random variations. Further, the experiments show that an unconditional NF neural network based on an unsupervised model of z, such as a Gaussian mixture, fails to generate interpretable results.

Keywords: Conditional density estimation, image generation, normalizing flow, supervised dimension reduction.

1221 Contextual Distribution for Textual Alignment

Authors: Yuri Bizzoni, Marianne Reboul

Abstract:

Our program compares French and Italian translations of Homer’s Odyssey from the XVIth to the XXth century. We focus on showing how distributional semantics systems can be used both to improve alignment between different French translations and to improve alignment between the Greek text and a French translation. Although we focus on French examples, the techniques we present are completely language independent.

Keywords: Translation studies, machine translation, computational linguistics, distributional semantics.

1220 Spatial Mapping of Dengue Incidence: A Case Study in Hulu Langat District, Selangor, Malaysia

Authors: Er, A. C., Rosli, M. H., Asmahani A., Mohamad Naim M. R., Harsuzilawati M.

Abstract:

Dengue is a mosquito-borne infection that has risen to an alarming rate in recent decades. It is found in tropical and sub-tropical climates. In Malaysia, dengue has been declared one of the national health threats to the public. This study aimed to map the spatial distribution of dengue cases in the district of Hulu Langat, Selangor, via a combination of Geographic Information System (GIS) and spatial statistics tools. Data related to dengue were gathered from the various government health agencies. The locations of dengue cases were geocoded using a handheld Trimble Juno SB GPS. A total of 197 dengue cases occurring in 2003 were used in this study. The data were then aggregated to sub-district level and converted into GIS format. The study also used population (demographic) data as well as the boundary of Hulu Langat. To assess the spatial distribution of dengue cases, three spatial statistics methods (Moran's I, average nearest neighbour (ANN) analysis and kernel density estimation) were applied together with spatial analysis in the GIS environment. These three indices were used to analyze the spatial distribution and average distance of dengue incidence and to locate the hot spots of dengue cases. The results indicated that the dengue cases were clustered (p < 0.01) when analyzed using Moran's I, with a z-score of 5.03. The ANN analysis showed that the average nearest neighbour ratio was less than 1, at 0.518755 (p < 0.0001); hence the pattern of dengue cases in Hulu Langat district can be expected to be clustered. The z-score for dengue incidence within the district was -13.0525 (p < 0.0001). It was also found that significant spatial autocorrelation of dengue incidences occurred at an average distance of 380.81 meters (p < 0.0001). Several locations, especially residential areas, were also identified as hot spots of dengue cases in the district.
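
Of the three spatial statistics used, Moran's I and the average nearest-neighbour (ANN) ratio have simple closed forms. The Python sketch below computes both for hypothetical case locations and counts as a generic illustration; it does not reproduce the Hulu Langat analysis or its particular weighting scheme.

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
# Hypothetical case locations (km) and sub-district case counts (illustrative only).
xy = rng.uniform(0, 10, (30, 2))
cases = rng.poisson(5, 30).astype(float)

# Moran's I with inverse-distance weights (w_ii = 0).
d = cdist(xy, xy)
w = np.where(d > 0, 1.0 / d, 0.0)
z = cases - cases.mean()
moran_i = (len(cases) / w.sum()) * (z @ w @ z) / (z @ z)

# Average nearest-neighbour ratio: observed vs. expected distance under complete spatial randomness.
np.fill_diagonal(d, np.inf)
d_obs = d.min(axis=1).mean()
area = 10.0 * 10.0
d_exp = 0.5 / np.sqrt(len(xy) / area)
print(f"Moran's I = {moran_i:.3f}, ANN ratio = {d_obs / d_exp:.3f}  (a ratio < 1 suggests clustering)")
```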

Keywords: Dengue, geographic information system (GIS), spatial analysis, spatial statistics

1219 Comparison of Current Chinese and Japanese Design Specification for Bridge Pile in Liquefied Ground

Authors: Baydaa H. Maula, Ling Zhang, Tang Liang, Gao Xia, Xu Peng-Ju, Zhang Yong-Qiang, Kang Jie, Su Lei

Abstract:

This study first briefly presents the vast gap that exists between current Chinese and Japanese seismic design specifications for bridge pile foundations in liquefiable and liquefaction-induced laterally spreading ground. The Chinese and Japanese seismic design methods and technical details for bridge pile foundations in liquefying and laterally spreading ground are then described and compared systematically and comprehensively; in particular, the methods for determining the coefficient of subgrade reaction and its reduction factor, as well as the way the Japanese design specification computes the force applied to the pile foundation by liquefaction-induced laterally spreading soil, are introduced. The comparison indicates that the content of the Chinese seismic design specification for bridge pile foundations in liquefiable and laterally spreading ground, which presents only some qualitative items, is too general and lacks systematicness and practical applicability. Finally, some defects of the Chinese seismic design specification are summarized, so that improvement and revision of the specification in this field become imperative for China; some key problems of the current Chinese specifications are generalized and corresponding improvement suggestions are proposed.

Keywords: liquefying soil, laterally spreading ground, seismic design specification for bridge pile foundation.

1218 Study of the Elastic Scattering of 16O, 14N and 12C on the Nucleus of 27Al at Different Energies near the Coulomb Barrier

Authors: N. Amangeldi, N. Burtebayev, Sh. Hamada, A. Amar

Abstract:

The measurement of the angular distribution for the elastic scattering of 16O, 14N and 12C on 27Al has been performed at an energy of 1.75 MeV/nucleon. The optical potential code SPIVAL was used in this work to analyze the experimental results. Good agreement between the experimental and theoretical results was obtained.

Keywords: 27Al(16O, 16O)27Al, SPIVAL, 27Al(14N, 14N)27Al, 27Al(12C, 12C)27Al, Elastic Scattering, Optical Potential Codes.

1217 Dust Acoustic Shock Waves in Coupled Dusty Plasmas with Kappa-Distributed Ions

Authors: Hamid Reza Pakzad

Abstract:

We have considered an unmagnetized dusty plasma system consisting of ions obeying a superthermal (kappa) distribution and strongly coupled, negatively charged dust. Using the reductive perturbation method, we have derived the Korteweg-de Vries-Burgers (KdV-Burgers) equation and investigated the behavior of the shock waves in this plasma.
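
For reference, the KdV-Burgers equation obtained by the reductive perturbation method has the generic form below, written in the stretched coordinates ξ and τ usual for this technique; the coefficients A, B and C depend on the plasma parameters (kappa index, dust coupling, etc.) and are not reproduced here.

```latex
\frac{\partial \psi}{\partial \tau}
  + A\,\psi\,\frac{\partial \psi}{\partial \xi}
  + B\,\frac{\partial^{3}\psi}{\partial \xi^{3}}
  = C\,\frac{\partial^{2}\psi}{\partial \xi^{2}}
```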

Keywords: Shock, Soliton, Coupling, Superthermal ions.

1216 Distribution of Macrobenthic Polychaete Families in Relation to Environmental Parameters in North West Penang, Malaysia

Authors: Mohammad Gholizadeh, Khairun Yahya, Anita Talib, Omar Ahmad

Abstract:

The distribution of macrobenthic polychaetes along the coastal waters of Penang National Park was surveyed to estimate the effect of various environmental parameters at three stations (200 m, 600 m and 1200 m from the shoreline) during six sampling months, from June 2010 to April 2011. The use of polychaetes in descriptive ecology is reviewed in the light of this investigation, particularly concerning soft-bottom environments. Polychaetes, often connected to the notion of opportunistic species able to proliferate after an enhancement in organic matter, have played a significant role particularly with regard to impacted soft-bottom habitats. The objective of this survey was to investigate different environmental stresses on the soft-bottom polychaete community along Teluk Ketapang and Pantai Acheh (Penang National Park) over a one-year period. Variations in the polychaete community were evaluated using univariate and multivariate methods. The results of the PCA analysis displayed a positive relation between macrobenthic community structure and environmental parameters such as sediment particle size and organic matter in the coastal water. A total of 604 individuals were examined and grouped into 23 families. The family Nereidae was the most abundant (22.68%), followed by Spionidae (22.02%), Hesionidae (12.58%), Nephtyidae (9.27%) and Orbiniidae (8.61%). It is notable that good results can only be obtained on the basis of good taxonomic resolution. We propose that, in monitoring surveys, operative time could be optimized not only by working at a higher taxonomic level on the entire macrobenthic data set, but also by choosing an especially indicative group and working at a lower taxonomic level with good resolution.

Keywords: Polychaete families, environment parameters, Bioindicators, Pantai Acheh, Teluk Ketapang.

1215 Geostatistical Analysis and Mapping of Groundlevel Ozone in a Medium Sized Urban Area

Authors: F. J. Moral García, P. Valiente González, F. López Rodríguez

Abstract:

Ground-level tropospheric ozone is one of the air pollutants of greatest concern. It is mainly produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower parts of the atmosphere. Ozone levels become particularly high in regions close to high emissions of ozone precursors and during summer, when stagnant meteorological conditions with high insolation and high temperatures are common. In this work, results of a study of urban ozone distribution patterns in the city of Badajoz, the largest and most industrialized city in the Extremadura region (southwest Spain), are shown. Fourteen sampling campaigns, at least one per month, were carried out with an automatic portable analyzer to measure ambient air ozone concentrations, during periods selected for conditions favourable to ozone production. The measured ozone data were then analyzed using geostatistical techniques to evaluate the ozone distribution across the city. First, the exploratory analysis revealed that the data were normally distributed, which is a desirable property for the subsequent stages of the geostatistical study. Second, in the structural analysis, theoretical spherical models provided the best fit for all monthly experimental variograms. The parameters of these variograms (sill, range and nugget) revealed that the maximum distance of spatial dependence is between 302 and 790 m and that the variable, air ozone concentration, is not evenly distributed even over short distances. Finally, predictive ozone maps were derived for all points of the experimental study area using geostatistical algorithms (kriging). High prediction accuracy was obtained in all cases, as cross-validation showed. Useful information for hazard assessment was also provided by probability maps based on kriging interpolation and the kriging standard deviation.

Keywords: Kriging, map, tropospheric ozone, variogram.

1214 Moment Generating Functions of Observed Gaps between Hypopnea Using Saddlepoint Approximations

Authors: Nur Zakiah Mohd Saat, Abdul Aziz Jemain

Abstract:

Saddlepoint approximation is one of the tools used to obtain expressions for densities and distribution functions. We approximate the densities of the observed gaps between hypopnea events using the Huzurbazar saddlepoint approximation, and we demonstrate the density of a maximum likelihood estimator in exponential families.
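
For reference, the first-order saddlepoint approximation to a density with cumulant generating function K is given below, where the saddlepoint ŝ solves K'(ŝ) = x; the Huzurbazar construction used in the paper applies this machinery to the densities of the observed gaps.

```latex
\hat{f}(x) \;=\; \bigl(2\pi\,K''(\hat{s})\bigr)^{-1/2}
               \exp\!\bigl\{K(\hat{s}) - \hat{s}\,x\bigr\},
\qquad K'(\hat{s}) = x .
```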

Keywords: Exponential, maximum likelihood estimators, observed gap, saddlepoint approximations.

1213 Optimization of Reaction Rate Parameters in Modeling of Heavy Paraffins Dehydrogenation

Authors: Leila Vafajoo, Farhad Khorasheh, Mehrnoosh Hamzezadeh Nakhjavani, Moslem Fattahi

Abstract:

In the present study, a procedure was developed to determine the optimum reaction rate constants in generalized Arrhenius form, optimized through the Nelder-Mead method. For this purpose, a comprehensive mathematical model of a fixed bed reactor for the dehydrogenation of heavy paraffins over a Pt-Sn/Al2O3 catalyst was developed. Utilizing appropriate kinetic rate expressions for the main dehydrogenation reaction as well as for the side reactions and catalyst deactivation, a detailed model for the radial flow reactor was obtained. The reactor model is composed of a set of partial differential equations (PDE), ordinary differential equations (ODE) and algebraic equations, all of which were solved numerically to determine the variations in the components' concentrations, in terms of mole percent, as a function of time and reactor radius. It was demonstrated that the most significant variations were observed at the entrance of the bed, and the initial olefin production obtained was rather high. The method utilized a direct-search optimization algorithm along with the numerical solution of the governing differential equations. The usefulness and validity of the method were demonstrated by comparing the values of the kinetic constants predicted with the proposed method against a series of experimental values reported in the literature for different systems.
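
The optimization step can be illustrated with SciPy's Nelder-Mead implementation: Arrhenius parameters k = A exp(-Ea/RT) are adjusted until the model reproduces reference rate constants. The sketch below fits a single synthetic rate constant and is only a schematic of the procedure; in the paper the objective comes from the full PDE/ODE reactor model, not from a closed-form expression, and all numbers here are placeholders.

```python
import numpy as np
from scipy.optimize import minimize

R = 8.314e-3  # kJ/(mol K)

def arrhenius_log_k(params, temps):
    """log k for k = A * exp(-Ea / (R T)); params = [ln A, Ea in kJ/mol]."""
    ln_a, ea = params
    return ln_a - ea / (R * temps)

# Synthetic "reference" rate constants standing in for experimental data.
temps = np.array([700.0, 750.0, 800.0, 850.0])           # K
log_k_ref = np.log(5.0e6) - 120.0 / (R * temps)          # generated with A = 5e6 1/s, Ea = 120 kJ/mol

def objective(params):
    return np.sum((arrhenius_log_k(params, temps) - log_k_ref) ** 2)

res = minimize(objective, x0=[np.log(1.0e5), 90.0], method="Nelder-Mead",
               options={"maxiter": 10000, "xatol": 1e-8, "fatol": 1e-12})
ln_a, ea = res.x
print(f"A = {np.exp(ln_a):.2e} 1/s, Ea = {ea:.1f} kJ/mol")  # close to the generating values
```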

Keywords: Dehydrogenation, Pt-Sn/Al2O3 Catalyst, Modeling, Nelder-Mead, Optimization

1212 A Novel Deinterlacing Algorithm Based on Adaptive Polynomial Interpolation

Authors: Seung-Won Jung, Hye-Soo Kim, Le Thanh Ha, Seung-Jin Baek, Sung-Jea Ko

Abstract:

In this paper, a novel deinterlacing algorithm is proposed. The proposed algorithm approximates the luminance distribution with a polynomial function. Instead of using one polynomial function for all pixels, different polynomial functions are used for the uniform, texture and directional-edge regions. The polynomial coefficients for each region are computed by matrix multiplications. Experimental results demonstrate that the proposed method performs better than conventional algorithms.
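
A single-region version of the idea can be sketched in a few lines: a polynomial is fitted in the least-squares sense down each column of the known field lines and evaluated at the missing row. The paper goes further by switching the polynomial between uniform, texture and directional-edge regions; the routine below is only a generic illustration with arbitrary sample values.

```python
import numpy as np

def interpolate_missing_line(field_lines, degree=2):
    """Estimate one missing scan line from the vertically adjacent field lines.

    field_lines: array of shape (k, width) holding the k known lines around the gap;
    a polynomial of the given degree is least-squares fitted down each column and
    evaluated at the missing row position (assumed to be the centre of the stack).
    """
    k, width = field_lines.shape
    rows = np.arange(k, dtype=float)
    missing_row = (k - 1) / 2.0
    out = np.empty(width)
    for col in range(width):
        coeffs = np.polyfit(rows, field_lines[:, col], degree)
        out[col] = np.polyval(coeffs, missing_row)
    return out

known = np.array([[10., 12., 14.], [20., 22., 24.], [34., 36., 38.], [44., 46., 48.]])
print(interpolate_missing_line(known))
```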

Keywords: Deinterlacing, polynomial interpolation.

1211 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand

Authors: Chukiat Chaiboonsri, Satawat Wannapan

Abstract:

This paper applies an asymmetric-information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables analyzed statistically are Thailand's international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by the private sector, service investments by the government of Thailand, Thailand's service imports and exports, and net service income transfers. All data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated with two statistical approaches. The first was the Generalized Method of Moments (GMM) model, based on the assumption that the tourism market in Thailand has perfect information (symmetric data). The second was the Maximum Entropy Bootstrap approach (MEboot), based on a process that attempts to deal with imperfect information and to reduce uncertainty in the data observations (asymmetric data). In addition, tourism leakages were investigated with a simple model based on the injections-and-leakages concept. The parameters computed with the MEboot approach differ from those obtained with the GMM method; however, both the MEboot estimation and the GMM model suggest that Thailand's tourism sector is in a period capable of stimulating the economy.

Keywords: Thailand tourism, maximum entropy bootstrapping approach, macroeconomic model, asymmetric information.

1210 The Impact of Upgrades on ERP System Reliability

Authors: F. Urem, K. Fertalj, I. Livaja

Abstract:

Constant upgrading of Enterprise Resource Planning (ERP) systems is necessary, but can cause new defects. This paper attempts to model the likelihood of defects after completed upgrades with a Weibull defect probability density function (PDF). A case study is presented that analyzes recorded defect data obtained for one ERP subsystem. The trends in the values of the parameters of the proposed Weibull distribution are observed over a one-year period. As a result, the ability to predict the appearance of defects after the next upgrade is described.
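
The Weibull modelling step can be sketched with SciPy: shape and scale parameters are estimated from observed times-to-defect after an upgrade, and the fitted distribution then gives the probability of defects appearing within a given interval after the next one. Synthetic data stand in for the recorded ERP defect log below.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic days-to-defect after an upgrade (stand-in for the recorded defect data).
days_to_defect = rng.weibull(1.4, 200) * 30.0

# Fit a two-parameter Weibull (location fixed at 0), as is usual in reliability practice.
shape, loc, scale = stats.weibull_min.fit(days_to_defect, floc=0)
print(f"shape k = {shape:.2f}, scale lambda = {scale:.1f} days")

# Probability that a defect appears within the first two weeks after an upgrade.
print("P(defect within 14 days) =", stats.weibull_min.cdf(14.0, shape, loc=0, scale=scale))
```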

Keywords: ERP, upgrade, reliability, Weibull model
