Search results for: Conditional Probability Distribution
1065 An Experimental Investigation of Heating in Induction Motors
Authors: R. Khaldi, N. Benamrouche, M. Bouheraoua
Abstract:
The ability to predict an accurate temperature distribution requires knowledge of the losses, the thermal characteristics of the materials, and the cooling conditions, all of which are very difficult to quantify. In this paper, the effects of iron and copper losses are investigated separately, and their impact on the heating at various points of the stator of an induction motor is highlighted using two simple tests. In addition, the effect of a defect, such as an open circuit in one phase of the stator, on the heating is also obtained by a no-load test. The squirrel cage induction motor, rated at 2.2 kW, 380 V, 5.2 A, Δ-connected, 50 Hz, 1420 rpm, with class F insulation, has been thermally tested under several load conditions, with several thermocouples placed at strategic points of the stator.
Keywords: Induction motor, temperature, heating, losses.
1064 Probabilistic Crash Prediction and Prevention of Vehicle Crash
Authors: Lavanya Annadi, Fahimeh Jafari
Abstract:
Transportation brings immense benefits to society, but it also has its costs. These include the cost of infrastructure, personnel, and equipment, as well as the loss of life and property in road traffic accidents, delays due to traffic congestion, and various indirect costs such as those related to air transport. This research aims to predict the probability of vehicle crashes in the United States using machine learning, focusing on natural and structural causes and excluding spontaneous causes such as overspeeding. These factors range from meteorological elements, such as weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity, to man-made road structure components, such as bumps, roundabouts, no-exit roads, turning loops, and give-way signs. The crash probabilities are categorized into ten distinct classes, and all predictions are based on supervised multiclass classification techniques. This study considers all crashes across all states collected by the US government. The probability of a crash was determined using the multinomial expected value, and a classification label was assigned accordingly. We applied three classification models: multiclass logistic regression, random forest, and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the contribution of natural and structural factors to crashes. The paper also provides in-depth insights through exploratory data analysis.
Keywords: Road safety, crash prediction, exploratory analysis, machine learning.
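As a rough, hedged illustration of the multiclass pipeline described above, the sketch below trains two of the three models on synthetic stand-in data; the feature set, the ten-class labelling, and the train/test split are placeholders rather than the authors' dataset or code, and XGBoost's XGBClassifier would plug into the same loop.

```python
# Hedged sketch of the multiclass crash-probability classification described above.
# Synthetic data stands in for the US crash dataset; the features and ten-class
# labelling are illustrative assumptions, not the authors' actual pipeline.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in for weather/road-structure features and ten crash-probability classes.
X, y = make_classification(n_samples=5000, n_features=20, n_informative=12,
                           n_classes=10, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    stratify=y, random_state=0)

models = {
    "multinomial logistic regression": LogisticRegression(max_iter=2000),
    "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
    # xgboost.XGBClassifier(objective="multi:softprob") would slot in here as well.
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy = {accuracy_score(y_test, model.predict(X_test)):.3f}")
```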
1063 Analysis of the Elastic Scattering of 12C on 11B at Energy near Coulomb Barrier Using Different Optical Potential Codes
Authors: Sh. Hamada, N. Burtebayev, A. Amar, N. Amangieldy
Abstract:
The aim of this work is to study the proton transfer phenomenon that takes place in the elastic scattering of 12C on 11B at energies near the Coulomb barrier. The reaction was studied at four different energies: 16, 18, 22 and 24 MeV. The experimental angular distributions at these energies were compared with predictions calculated with optical potential codes such as ECIS88 and SPIVAL. The rise in the cross section at backward angles, due to the transfer process, was described using the Distorted Wave Born Approximation (DWUCK5). Our analysis showed that the SPIVAL code with an l-dependent imaginary potential could be used effectively.
Keywords: Transfer reaction, DWBA, Elastic scattering, Optical potential codes.
1062 Texture Observation of Bending by XRD and EBSD Method
Authors: Takashi Sakai, Yuri Shimomura
Abstract:
The crystal orientation is a factor that affects the microscopic material properties; it determines the anisotropy of a polycrystalline material and is closely related to its mechanical properties. In this paper, the crystal orientation of pure polycrystalline copper was analyzed by two different methods, X-ray diffraction (XRD) and electron backscatter diffraction (EBSD). In the former, the X-ray beam diameter is comparatively large, so the crystal orientation is measured relatively macroscopically. From these measurements, we investigated the changes in crystal orientation and internal microstructure of pure copper subjected to bending.
Keywords: Bending, electron backscatter diffraction, X-ray diffraction, microstructure, IPF map, orientation distribution function.
1061 RFID Logistic Management with Cold Chain Monitoring – Cold Store Case Study
Authors: Mira Trebar
Abstract:
Logistics processes of perishable food in the supply chain include the distribution activities and the real time temperature monitoring to fulfil the cold chain requirements. The paper presents the use of RFID (Radio Frequency Identification) technology as an identification tool of receiving and shipping activities in the cold store. At the same time, the use of RFID data loggers with temperature sensors is presented to observe and store the temperatures for the purpose of analyzing the processes and having the history data available for traceability purposes and efficient recall management.
Keywords: Logistics, warehouse, RFID device, cold chain.
1060 Macular Ganglion Cell Inner Plexiform Layer Thinning in Patients with Visual Field Defect that Respects the Vertical Meridian
Authors: Hye-Young Shin, Chan Kee Park
Abstract:
Background: To compare the thinning patterns of the ganglion cell-inner plexiform layer (GCIPL) and peripapillary retinal nerve fiber layer (pRNFL) as measured using Cirrus high-definition optical coherence tomography (HD-OCT) in patients with visual field (VF) defects that respect the vertical meridian. Methods: Twenty eyes of eleven patients with VF defects that respect the vertical meridian were enrolled retrospectively. The thicknesses of the macular GCIPL and pRNFL were measured using Cirrus HD-OCT. The 5% and 1% thinning area index (TAI) was calculated as the proportion of abnormally thin sectors at the 5% and 1% probability levels within the area corresponding to the affected VF. The 5% and 1% TAI were compared between the GCIPL and pRNFL measurements. Results: The color-coded GCIPL deviation map showed a characteristic vertical thinning pattern of the GCIPL, which is also seen in the VF of patients with brain lesions. The 5% and 1% TAI were significantly higher in the GCIPL measurements than in the pRNFL measurements (all P < 0.01). Conclusions: Macular GCIPL analysis clearly visualized a characteristic topographic pattern of retinal ganglion cell (RGC) loss in patients with VF defects that respect the vertical meridian, unlike pRNFL measurements. Macular GCIPL measurements provide more valuable information than pRNFL measurements for detecting the loss of RGCs in patients with retrograde degeneration of the optic nerve fibers.
Keywords: Brain lesion, Macular ganglion cell-inner plexiform layer, Spectral-domain optical coherence tomography.
1059 Evaluation of Indoor-Outdoor Particle Size Distribution in Tehran's Elementary Schools
Authors: F. Halek, A. Kavousi, F. Hassani
Abstract:
A simultaneous study of indoor and outdoor particulate matter concentrations was carried out in five elementary schools in the central parts of Tehran, Iran. Three particle size fractions, PM10, PM2.5 and PM1.0, were measured in 13 classrooms in these schools during winter (January, February and March) 2009. A laser-based portable aerosol spectrometer (Grimm Model 1.108) was used for the continuous measurement of particles. The average indoor concentrations of PM10, PM2.5 and PM1.0 in the studied schools were 274 μg/m³, 42 μg/m³ and 19 μg/m³, respectively, and the average outdoor concentrations of PM10, PM2.5 and PM1.0 were 140 μg/m³, 38 μg/m³ and 22 μg/m³, respectively.
Keywords: Elementary school, Indoor pollution, particulate matter, PM10, PM2.5, PM1.0, outdoor pollution, Tehran air pollution.
1058 Confidence Interval for the Inverse of a Normal Mean with a Known Coefficient of Variation
Authors: Arunee Wongkha, Suparat Niwitpong, Sa-aat Niwitpong
Abstract:
In this paper, we propose two new confidence intervals for the inverse of a normal mean with a known coefficient of variation. The first is constructed from the pivotal statistic Z, where Z has a standard normal distribution, and the second is based on the generalized confidence interval presented by Weerahandi. We examine the performance of these confidence intervals in terms of coverage probabilities and average lengths via Monte Carlo simulation.
Keywords: The inverse of a normal mean, confidence interval, generalized confidence intervals, known coefficient of variation.
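As a hedged illustration of how coverage probabilities and average lengths are estimated by Monte Carlo simulation, the sketch below evaluates one simple interval for the inverse of a normal mean with known coefficient of variation; the interval itself (a plug-in Z interval for the mean, then inverted) is only a stand-in and is not one of the two constructions proposed in the paper.

```python
# Hedged Monte Carlo sketch: coverage probability and average length for a
# confidence interval for 1/mu when the coefficient of variation tau is known.
# The plug-in interval below is illustrative only, not the paper's constructions.
import numpy as np

rng = np.random.default_rng(0)
mu, tau, n, z = 5.0, 0.2, 30, 1.959964   # true mean, known CV, sample size, 97.5% quantile
covered, lengths, reps = 0, [], 10000

for _ in range(reps):
    x = rng.normal(mu, tau * mu, size=n)
    xbar = x.mean()
    half = z * tau * xbar / np.sqrt(n)       # plug-in standard error tau*xbar/sqrt(n)
    lo, hi = xbar - half, xbar + half
    if lo <= 0:                              # endpoint inversion needs a positive interval
        continue
    inv_lo, inv_hi = 1.0 / hi, 1.0 / lo      # interval for 1/mu by inverting endpoints
    covered += inv_lo <= 1.0 / mu <= inv_hi
    lengths.append(inv_hi - inv_lo)

print(f"coverage ~ {covered / reps:.3f}, average length ~ {np.mean(lengths):.4f}")
```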
1057 GA based Optimal Sizing and Placement of Distributed Generation for Loss Minimization
Authors: Deependra Singh, Devender Singh, K. S. Verma
Abstract:
This paper addresses a novel technique for the placement of distributed generation (DG) in electric power systems. A GA-based approach for the sizing and placement of DG, aimed at minimizing system power loss under different loading conditions, is explained. Minimal system power loss is obtained under voltage and line-loading constraints. The proposed strategy is applied to power distribution systems, and its effectiveness is verified through simulation results on 16-bus, 37-bus and 75-bus test systems.
Keywords: Distributed generation (DG), Genetic algorithms (GA), optimal sizing and placement, Power loss.
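A compact sketch of the GA loop described above is given below; the power-loss evaluation is a hypothetical placeholder (a real study would run a load flow on the 16-, 37- or 75-bus feeders and enforce the voltage and line-loading constraints), and the GA settings are illustrative.

```python
# Hedged GA sketch for DG sizing and placement. The power-loss evaluation is a
# hypothetical placeholder standing in for a load-flow-based loss calculation
# with voltage and line-loading constraints.
import random

N_BUS, DG_MAX_MW = 37, 2.0

def system_loss(bus, size_mw):
    # Placeholder loss surface: pretend bus 18 with ~1.2 MW of DG minimises loss.
    return 0.15 + 0.002 * abs(bus - 18) + 0.05 * (size_mw - 1.2) ** 2

def random_individual():
    return (random.randint(1, N_BUS), random.uniform(0.0, DG_MAX_MW))

def mutate(ind):
    bus, size = ind
    return (max(1, min(N_BUS, bus + random.randint(-2, 2))),
            max(0.0, min(DG_MAX_MW, size + random.gauss(0, 0.1))))

random.seed(0)
pop = [random_individual() for _ in range(40)]
for _ in range(100):                       # generations
    pop.sort(key=lambda ind: system_loss(*ind))
    parents = pop[:10]                     # elitist selection
    children = []
    while len(children) < 30:
        a, b = random.sample(parents, 2)
        children.append(mutate((a[0], b[1])))   # crossover on (bus, size), then mutate
    pop = parents + children

best = min(pop, key=lambda ind: system_loss(*ind))
print(f"best placement: bus {best[0]}, size {best[1]:.2f} MW, loss {system_loss(*best):.4f} MW")
```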
1056 Geochemical Assessment of Heavy Metals Concentration in Surface Sediment of West Port, Malaysia
Authors: B. Tavakoly Sany, A. Salleh, A. H. Sulaiman, A. Mehdinia, G. H. Monazami
Abstract:
One year of sediment monitoring (November 2009 - October 2010) was carried out to evaluate the pollution status, concentration and distribution of heavy metals (As, Cu, Cd, Cr, Hg, Ni, Pb and Zn) in the West Port of Malaysia. Sediment samples were collected from nine stations every four months. The geo-accumulation index and the Pollution Load Index (PLI) were estimated to better understand the pollution level in the study area. The heavy metal concentrations (μg/g dry weight) ranged from 20.2 to 162 for As, 7.4 to 27.6 for Cu, 0.244 to 3.53 for Cd, 11.5 to 61.5 for Cr, 0.11 to 0.409 for Hg, 7.2 to 22.2 for Ni, 22.3 to 80 for Pb and 23 to 98.3 for Zn. In general, the concentrations of some metals (As, Cd, Hg and Pb) were higher than the background values, which is a serious concern for aquatic life and human health.
Keywords: Heavy metals, Sediment quality, Geo-accumulation index, Pollution Load Index.
1055 Evaluation of the Impact of Dataset Characteristics for Classification Problems in Biological Applications
Authors: Kanthida Kusonmano, Michael Netzer, Bernhard Pfeifer, Christian Baumgartner, Klaus R. Liedl, Armin Graber
Abstract:
The availability of high dimensional biological datasets, such as those from gene expression, proteomic, and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and separate between predefined classes, such as patients with a particular disease versus healthy controls. However, most of the existing research focuses on a specific dataset, and there is a lack of generic comparisons between classifiers that might provide a guideline for biologists or bioinformaticians selecting the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree, and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of real discriminating biomarkers and different effect sizes thereof. The results show that SVM performs quite stably and reaches a higher AUC than the other methods, which may be explained by the ability of SVM to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, strongly depend on the ratio of discriminators and perform better with a higher number of discriminators.
Keywords: Classification, High dimensional data, Machine learning
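A hedged sketch of this kind of mock-data benchmark, using scikit-learn and cross-validated AUC, is shown below; the number of features, the proportion of informative biomarkers and the class separation are arbitrary examples rather than the authors' simulation design.

```python
# Hedged sketch of a mock-data classifier comparison with cross-validated AUC.
# The fraction of informative features and their separation (class_sep) are
# arbitrary stand-ins for the paper's simulated discriminators and effect sizes.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# High-dimensional two-class data: 1000 features, only 20 of them informative.
X, y = make_classification(n_samples=200, n_features=1000, n_informative=20,
                           n_redundant=0, class_sep=1.0, random_state=1)

classifiers = {
    "SVM": SVC(kernel="linear"),
    "Logistic Regression": LogisticRegression(max_iter=5000),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=1),
    "Random Forest": RandomForestClassifier(n_estimators=500, random_state=1),
}
for name, clf in classifiers.items():
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:20s} mean AUC = {auc:.3f}")
```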
1054 A Discretizing Method for Reliability Computation in Complex Stress-strength Models
Authors: Alessandro Barbiero
Abstract:
This paper proposes, implements and evaluates an original discretization method for continuous random variables, in order to estimate the reliability of systems for which stress and strength are defined as complex functions and whose reliability is not derivable through analytic techniques. The method is compared to two other discretizing approaches that have appeared in the literature, also through a comparative study involving four engineering applications. The results show that the proposal is very efficient in terms of closeness of the estimates to the true (simulated) reliability. In the study we analyzed both a normal and a non-normal distribution for the random variables; the method is theoretically suitable for any parametric family.
Keywords: Approximation, asymmetry, experimental design, interference theory, Monte Carlo simulations.
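As a hedged illustration of the general interference-theory idea, not of the specific discretization rule proposed in the paper, the sketch below replaces each continuous variable with equal-probability support points and sums the joint probabilities to approximate R = P(strength > stress).

```python
# Hedged sketch of reliability computation by discretization (interference theory):
# R = P(strength > stress). Each continuous variable is replaced by equal-probability
# support points at quantile midpoints; this is a generic scheme, not the specific
# discretization rule proposed in the paper.
import numpy as np
from scipy import stats

def discretize(dist, k=200):
    """k equal-probability points at the quantile midpoints of `dist`."""
    u = (np.arange(k) + 0.5) / k
    return dist.ppf(u), np.full(k, 1.0 / k)

stress = stats.lognorm(s=0.25, scale=np.exp(4.0))   # example stress distribution
strength = stats.norm(loc=80.0, scale=8.0)          # example strength distribution

xs, ps = discretize(stress)
ys, qs = discretize(strength)
# Sum P(X = x) P(Y = y) over all pairs with y > x (independence assumed).
R = sum(p * qs[ys > x].sum() for x, p in zip(xs, ps))

# Crude Monte Carlo check of the same reliability.
mc = (strength.rvs(200000, random_state=0) > stress.rvs(200000, random_state=1)).mean()
print(f"discretized R ~ {R:.4f}, Monte Carlo R ~ {mc:.4f}")
```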
1053 Detection Characteristics of the Random and Deterministic Signals in Antenna Arrays
Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev
Abstract:
In this paper, approaches to incoherent signal detection in a multi-element antenna array are investigated and modeled. Two types of useful signals with unknown wavefronts were considered: a deterministic signal (Barker code) and a random signal (Gaussian distribution). The derivation of the sufficient statistics took into account the linearity of the antenna array. The performance characteristics and detection curves are modeled and compared for different useful-signal parameters and for different numbers of antenna array elements. Under some additional conditions, the results of this research can be applied to digital communication systems.
Keywords: Antenna array, detection curves, performance characteristics, quadrature processing, signal detection.
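A minimal Monte Carlo sketch of such detection curves for an array with an unknown wavefront is given below; the non-coherent (quadrature-style) statistic, the array size and the signal model are illustrative assumptions rather than the sufficient statistics derived in the paper.

```python
# Hedged Monte Carlo sketch of detection curves for an M-element array with an
# unknown wavefront. The non-coherent statistic (sum over elements of squared
# matched-filter outputs) is an illustrative choice, not the paper's derivation.
import numpy as np

rng = np.random.default_rng(0)
M, N, trials, pfa = 8, 13, 20000, 1e-2
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

def noise(shape):
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

def statistic(x):
    mf = x @ barker13                          # per-element matched filter output
    return (np.abs(mf) ** 2).sum(axis=1)       # non-coherent sum across the array

# Threshold from noise-only trials at the chosen false-alarm probability.
t0 = statistic(noise((trials, M, N)))
thr = np.quantile(t0, 1.0 - pfa)

for snr_db in (-10, -5, 0):
    a = 10 ** (snr_db / 20)                    # per-sample amplitude (unit noise power)
    phases = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(trials, M, 1)))  # unknown wavefront
    x = a * phases * barker13 + noise((trials, M, N))
    pd = (statistic(x) > thr).mean()
    print(f"SNR {snr_db:+} dB: Pd ~ {pd:.2f} at Pfa = {pfa}")
```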
1052 Analysis of the Shielding Effectiveness of Several Magnetic Shields
Authors: Diako Azizi, Hosein Heydari, Ahmad Gholami
Abstract:
Today, with the rapid growth of telecommunications and electronic equipment and the continual expansion of power networks, the influence of electromagnetic waves on one another has become a hot topic of discussion. This article therefore addresses this issue and presents appropriate mechanisms for EMC operation. First, a source of alternating current (50 Hz) and a victim placed at a certain distance from the source are defined. With this simple model, the effects of the electromagnetic radiation from the source on the victim are investigated, and several methods to reduce these effects are presented, using both passive and active shields. At several steps, the shielding effectiveness of the proposed shields is compared. It should be noted that the simulations were carried out by the finite element method (FEM).
Keywords: Electrical field, field distribution, finite element method, shielding effectiveness
1051 Modeling and Simulation of Dynamic Voltage Restorer for Mitigation of Voltage Sags
Authors: S. Ganesh, L. Raguraman, E. Anushya, J. Krishnasree
Abstract:
Voltage sags are the most common power quality disturbance in the distribution system. They occur due to faults in the electrical network or the starting of large induction motors, and they can be mitigated by custom power devices such as the Dynamic Voltage Restorer (DVR). In this paper, a DVR is proposed to dynamically compensate voltage sags on critical loads. The DVR consists of a VSC, injection transformers, passive filters and energy storage (a lead-acid battery). By injecting an appropriate voltage, the DVR restores the voltage waveform and ensures a constant load voltage. The simulation and experimental results of a DVR obtained using MATLAB clearly show the performance of the DVR in mitigating voltage sags.
Keywords: Dynamic voltage restorer, Voltage sags, Power quality, Injection methods.
1050 Formant Tracking Linear Prediction Model using HMMs for Noisy Speech Processing
Authors: Zaineb Ben Messaoud, Dorra Gargouri, Saida Zribi, Ahmed Ben Hamida
Abstract:
This paper presents a formant-tracking linear prediction (FTLP) model for speech processing in noise. The main focus of this work is the detection of formant trajectories based on Hidden Markov Models (HMMs), for improved formant estimation in noise. The approach proposed in this paper provides a systematic framework for modelling and utilizing a time sequence of peaks satisfying continuity constraints on the parameters; the peaks themselves are modelled by the LP parameters. The formant-tracking LP model estimation is composed of three stages: (1) a pre-cleaning multi-band spectral subtraction stage to reduce the effect of residual noise on the formants; (2) an estimation stage where an initial estimate of the LP model of speech is obtained for each frame; (3) formant classification using probability models of the formants and Viterbi decoders. The evaluation results for the estimation of the formant-tracking LP model, tested against a Gaussian white noise background, demonstrate that the proposed combination of the initial noise reduction stage with formant tracking and variable-order LPC analysis results in a significant reduction in errors and distortions. The performance was evaluated with noisy natural vowels extracted from international French and English vocabulary speech signals at an SNR of 10 dB. In each case, the estimated formants are compared to reference formants.
Keywords: Formant estimation, HMM, Multi-band spectral subtraction, Variable-order LPC coding, White Gaussian noise.
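A hedged sketch of the single-frame LP peak (formant) extraction that underlies the model is given below; it covers only the "peaks modelled by the LP parameters" step, not the multi-band spectral subtraction, HMM tracking or Viterbi decoding of the paper, and the synthetic frame is a made-up stand-in for real vowels.

```python
# Hedged sketch of single-frame LPC formant estimation (autocorrelation method +
# roots of the prediction polynomial). Spurious high-frequency poles may also appear.
import numpy as np
from scipy.linalg import solve_toeplitz

def lpc(frame, order=12):
    """Yule-Walker LPC: solve the Toeplitz system R a = r and return A(z) coefficients."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    a = solve_toeplitz((r[:order], r[:order]), r[1:order + 1])
    return np.concatenate(([1.0], -a))           # A(z) = 1 - sum_k a_k z^-k

def formants(frame, fs, order=12):
    frame = frame * np.hamming(len(frame))
    roots = np.roots(lpc(frame, order))
    roots = roots[np.imag(roots) > 0]            # keep one root per conjugate pair
    freqs = np.angle(roots) * fs / (2 * np.pi)   # pole angle -> frequency in Hz
    return np.sort(freqs[freqs > 90])            # drop near-DC poles

# Synthetic vowel-like frame: two damped resonances near 700 Hz and 1200 Hz.
rng = np.random.default_rng(0)
fs, t = 16000, np.arange(480) / 16000
frame = (np.exp(-60 * t) * (np.sin(2 * np.pi * 700 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t))
         + 1e-4 * rng.standard_normal(t.size))
print(np.round(formants(frame, fs)))
```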
1049 Investigation of the Effect of Milling Time on the Mechanochemical Synthesis of Fe3Al/Al2O3 Nanocomposite
Authors: B. Ghasemi, A. A. Najafzadeh Khoee
Abstract:
In this study, the effect of mechanical activation on the synthesis of Fe3Al/Al2O3 nanocomposite has been investigated using the mechanochemical method. For this purpose, aluminum powder and hematite were used as precursors in stoichiometric ratio, and the other parameters affecting the milling process were kept constant. Phase formation, crystallite size and lattice strain were studied by X-ray diffraction (XRD) using the Williamson-Hall method, while microstructure and morphology were examined by scanning electron microscopy (SEM). In addition, energy-dispersive X-ray spectroscopy (EDX) analysis was used to probe the particle distribution. The results showed that after 30 hours of milling, the reaction started, proceeded in a combustive manner and was completed.
Keywords: hematite, mechanochemical, milling, nanocomposite
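The Williamson-Hall analysis mentioned in the abstract amounts to a straight-line fit of β·cosθ against 4·sinθ; the sketch below shows this fit with made-up (2θ, FWHM) values that are not the measured Fe3Al/Al2O3 pattern.

```python
# Hedged Williamson-Hall sketch: beta*cos(theta) = K*lambda/D + 4*eps*sin(theta).
# The (2-theta, FWHM) pairs below are made-up illustrative numbers, not the
# measured Fe3Al/Al2O3 diffraction data.
import numpy as np

K, lam = 0.9, 0.15406            # shape factor and Cu K-alpha wavelength in nm
two_theta_deg = np.array([31.2, 44.6, 55.1, 64.8, 82.3])   # hypothetical peak positions
fwhm_deg = np.array([0.42, 0.48, 0.55, 0.61, 0.74])        # hypothetical peak widths

theta = np.radians(two_theta_deg) / 2
beta = np.radians(fwhm_deg)                  # FWHM converted to radians

x = 4 * np.sin(theta)
y = beta * np.cos(theta)
slope, intercept = np.polyfit(x, y, 1)       # slope = microstrain, intercept = K*lam/D

print(f"crystallite size D ~ {K * lam / intercept:.1f} nm, lattice strain ~ {slope:.4f}")
```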
1048 Estimating the Population Mean by Using Stratified Double Extreme Ranked Set Sample
Authors: Mahmoud I. Syam, Kamarulzaman Ibrahim, Amer I. Al-Omari
Abstract:
The stratified double extreme ranked set sampling (SDERSS) method is introduced and considered for estimating the population mean. The SDERSS is compared with simple random sampling (SRS), stratified ranked set sampling (SRSS) and stratified simple random sampling (SSRS). It is shown that the SDERSS estimator is an unbiased estimator of the population mean and is more efficient than the estimators based on SRS, SRSS and SSRS, whether the underlying distribution of the variable of interest is symmetric or asymmetric.
Keywords: Double extreme ranked set sampling, Extreme ranked set sampling, Ranked set sampling, Stratified double extreme ranked set sampling.
1047 Peer-to-Peer Epidemic Algorithms for Reliable Multicasting in Ad Hoc Networks
Authors: Zülküf Genç, Öznur Özkasap
Abstract:
The characteristics of ad hoc networks, and even their existence, depend on the nodes forming them. Thus, services and applications designed for ad hoc networks should adapt to this dynamic and distributed environment. In particular, multicast algorithms with reliability and scalability requirements should abstain from centralized approaches. We aim to define a reliable and scalable multicast protocol for ad hoc networks, and our target is to utilize epidemic techniques for this purpose. In this paper, we present a brief survey of epidemic algorithms for reliable multicasting in ad hoc networks and describe formulations and analytical results for simple epidemics. Then, the P2P anti-entropy algorithm for content distribution and our prototype simulation model are described, together with initial results demonstrating the behavior of the algorithm.
Keywords: Ad hoc networks, epidemic, peer-to-peer, reliable multicast.
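For readers unfamiliar with the anti-entropy style of epidemic dissemination discussed above, the sketch below shows the basic push-pull exchange with a random peer; the network size, number of rounds and single-seed start are illustrative and do not reproduce the paper's simulation model.

```python
# Hedged sketch of a P2P anti-entropy (push-pull) epidemic round: each node picks
# a random peer and the pair reconcile their content sets. Parameters are
# illustrative, not the paper's simulation model.
import random

random.seed(1)
N_NODES, ROUNDS = 100, 10
content = [set() for _ in range(N_NODES)]
content[0].add("update-1")                   # one node initially holds the update

for rnd in range(1, ROUNDS + 1):
    for node in range(N_NODES):
        peer = random.choice([p for p in range(N_NODES) if p != node])
        merged = content[node] | content[peer]   # push-pull: both sides reconcile
        content[node], content[peer] = merged, set(merged)
    infected = sum("update-1" in c for c in content)
    print(f"round {rnd}: {infected}/{N_NODES} nodes hold the update")
```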
1046 Characterization of HD-V2 Gafchromic Film for Measurement of Spatial Dose Distribution from Alpha Particle of 5.5 MeV
Authors: A. Aydarous, M. El Ghazaly
Abstract:
The purpose of this study was to investigate the response of the newly released Gafchromic HD-V2 film to 5.5 MeV alpha particles. Gafchromic HD-V2 film was exposed to alpha particles from ²⁴¹Am for different durations, and the films were then scanned with a flatbed scanner. A dose-response curve up to 2200 Gy was obtained, and the film's reproducibility and sensitivity were evaluated. The results show that the net optical density increases almost exponentially with increasing exposure time and becomes saturated after prolonged exposures. The red channel shows the highest sensitivity, with a value of 4 × 10⁻³ Gy⁻¹ at a netOD of 0.4. The inter-film reproducibility was measured, and the relative uncertainties found were 1.7%, 2.1% and 2.3% for the grey, red and green channels, respectively.
Keywords: Alpha dosimetry, 241Am, Gafchromic film.
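The net optical density behind the dose-response curve above is typically computed from scanner pixel values as netOD = log10(PV_unexposed / PV_exposed); the sketch below applies this per colour channel to made-up pixel values, not to the measured HD-V2 data.

```python
# Hedged sketch of net optical density (netOD) per colour channel from flatbed-scanner
# pixel values: netOD = log10(PV_unexposed / PV_exposed). The pixel values below are
# made-up examples, not the measured HD-V2 data.
import numpy as np

pv_unexposed = np.array([48000.0, 45000.0, 42000.0])   # hypothetical R, G, B means (16-bit scan)
pv_exposed   = np.array([19000.0, 30000.0, 36000.0])   # same film piece after irradiation

net_od = np.log10(pv_unexposed / pv_exposed)
for channel, od in zip(("red", "green", "blue"), net_od):
    print(f"{channel:5s} netOD = {od:.3f}")
```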
1045 Investigation on Behavior of Fixed-Ended Reinforced Concrete Deep Beams
Authors: Y. Heyrani Birak, R. Hizaji, J. Shahkarami
Abstract:
Reinforced concrete (RC) deep beams are special structural elements because of their geometry and their behavior under load; for example, the stress-strain distribution across the cross section cannot be assumed to be linear. These beams may have simple or fixed supports. A lot of research has been conducted on simply supported deep beams, but little study has been done on the behavior of fixed-ended RC deep beams, even though the use of fixed-ended deep beams in structures has recently increased widely. In this study, the behavior of fixed-ended deep beams is investigated, and the important parameters governing the capacity of this type of beam are identified.
Keywords: Deep beam, capacity, reinforced concrete, fixed-ended.
1044 Static Modeling of the Delamination of a Composite Material Laminate in Mode II
Authors: Y. Madani, H. Achache, B. Boutabout
Abstract:
The purpose of this paper is to analyze numerically, by the three-dimensional finite element method using the ABAQUS code, the mechanical behavior of unidirectional and multidirectional delaminated laminated composites under Mode II mechanical loading. The study consists of determining the energy release rate G in Mode II as well as the distribution of the equivalent von Mises stresses along the damaged zone, while varying several parameters such as the applied load and the delamination length. The analysis shows that a high energy release rate favors delamination at the free edges of a laminated plate subjected to bending.
Keywords: Delamination, energy release rate, finite element method, stratified composite.
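For context, the quantity computed in this abstract is the energy release rate, defined from the potential energy Π and the crack area A; in finite element models it is often extracted with the virtual crack closure technique, shown below as a hedged reference rather than the paper's exact procedure.

```latex
% Energy release rate and a common mode II VCCT estimate
% (crack-tip element length \Delta a, specimen width b); illustrative only.
G = -\frac{\partial \Pi}{\partial A}, \qquad
G_{II} \approx \frac{F_x \,\Delta u_x}{2\, b\, \Delta a}
```

Here F_x is the shear force at the crack-tip node pair and Δu_x the relative sliding displacement of the node pair just behind the tip.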
1043 Unsteady Stagnation-Point Flow towards a Shrinking Sheet with Radiation Effect
Authors: F. M. Ali, R. Nazar, N. M. Arifin, I. Pop
Abstract:
In this paper, the problem of unsteady stagnation-point flow and heat transfer induced by a shrinking sheet in the presence of a radiation effect is studied. The transformed boundary layer equations are solved numerically by the shooting method. The influence of the radiation, unsteadiness and shrinking parameters and of the Prandtl number on the reduced skin friction coefficient and the heat transfer coefficient, as well as on the velocity and temperature profiles, is presented and discussed in detail. It is found that dual solutions exist and that the temperature distribution becomes less significant as the radiation parameter increases.
Keywords: Heat transfer, Radiation effect, Shrinking sheet, Unsteady flow.
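The shooting approach used for the transformed boundary-layer equations can be illustrated on the classical steady stagnation-point (Hiemenz) problem f''' + f f'' + 1 - f'² = 0 with f(0) = f'(0) = 0 and f'(∞) = 1; the unsteady, shrinking-sheet and radiation terms of the paper are omitted in this hedged sketch.

```python
# Hedged sketch of the shooting method on the classical steady stagnation-point
# (Hiemenz) problem; the paper's unsteady, shrinking-sheet and radiation terms
# are not included here.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

ETA_INF = 6.0                                # numerical stand-in for eta -> infinity

def rhs(eta, y):
    f, fp, fpp = y
    return [fp, fpp, fp ** 2 - f * fpp - 1.0]   # f''' = f'^2 - f f'' - 1

def residual(s):
    # Shoot with guessed wall shear f''(0) = s and return the far-field mismatch.
    sol = solve_ivp(rhs, (0.0, ETA_INF), [0.0, 0.0, s], rtol=1e-8, atol=1e-8)
    return sol.y[1, -1] - 1.0

# Find the wall shear f''(0) that satisfies f'(inf) = 1.
s_star = brentq(residual, 1.0, 1.5)
print(f"f''(0) ~ {s_star:.4f}  (classical Hiemenz value ~ 1.2326)")
```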
1042 Unsupervised Texture Segmentation via Applying Geodesic Active Regions to Gaborian Feature Space
Authors: Yuan He, Yupin Luo, Dongcheng Hu
Abstract:
In this paper, we propose a novel variational method for unsupervised texture segmentation. We use a Gabor filter bank to extract texture features, and some of the filtered channels form a multidimensional Gaborian feature space. To avoid deforming contours directly in a vector-valued space, we use a Gaussian mixture model to describe the statistical distribution of this space and obtain the boundary and region probabilities. A framework of geodesic active regions is then applied based on them. Finally, experimental results are presented and show that this method can obtain satisfactory boundaries between different texture regions.
Keywords: Texture segmentation, Gabor filter, snakes, Geodesic active regions.
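A hedged sketch of the feature-space half of the pipeline (a small Gabor bank plus a Gaussian mixture over the responses) is shown below; the geodesic-active-region contour evolution itself is not reproduced, and the filter parameters and synthetic two-texture image are illustrative.

```python
# Hedged sketch of Gabor-bank features + Gaussian-mixture region probabilities.
# A synthetic two-texture image stands in for real data; the geodesic active
# region contour evolution of the paper is not reproduced here.
import numpy as np
from scipy.ndimage import convolve, gaussian_filter
from sklearn.mixture import GaussianMixture

def gabor_kernel(freq, theta, sigma=4.0, size=21):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * freq * xr)

# Synthetic image: vertical stripes on the left, horizontal stripes on the right.
yy, xx = np.mgrid[0:128, 0:128]
image = np.where(xx < 64, np.sin(0.6 * xx), np.sin(0.6 * yy)).astype(float)

# Feature space: smoothed magnitudes of a small Gabor bank (2 frequencies x 4 angles).
features = [gaussian_filter(np.abs(convolve(image, gabor_kernel(f, th))), 3)
            for f in (0.1, 0.2) for th in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)]
X = np.stack(features, axis=-1).reshape(-1, len(features))

# Two-component Gaussian mixture yields per-pixel region labels/probabilities.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X).reshape(image.shape)
print("left-half label counts:", np.bincount(labels[:, :64].ravel()))
print("right-half label counts:", np.bincount(labels[:, 64:].ravel()))
```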
1041 Safety Compliance of Substation Earthing Design
Authors: A. Hellany, M. Nagrial, M. Nassereddine, J. Rizk
Abstract:
As new challenges emerge in power electrical workplace safety, it is the responsibility of the system designer to seek out new approaches and solutions that address them. Design decisions made today will impact the cost, safety and serviceability of the installed systems for the 40 or 50 years of their useful life to the owner. Studies have shown that this cost is of the order of 7 to 10 times the installed cost of the power distribution equipment. This paper reviews some aspects of earthing system design for power substations surrounded by residential houses. The earth potential rise and the split factor are discussed, and a few recommendations are provided to achieve a safe voltage in the area beyond the boundary of the substation.
Keywords: EPR, Split factor, Earthing design.
1040 Analyzing Artificial Emotion in Game Characters Using Soft Computing
Authors: Musbah M. Aqel, P. K. Mahanti, Soumya Banerjee
Abstract:
This paper describes a simulation model for analyzing the artificial emotion injected into the design of game characters. Most of the game storyboard is interactive in nature, and the virtual characters of the game are equipped with an individual personality and a dynamic emotion value, similar to real-life emotion and behavior. The uncertainty in real expression, mood and behavior is also exhibited in the game paradigm, and this is addressed in the present paper through a fuzzy-logic-based agent and storyboard. Subsequently, a pheromone distribution or labeling mimicking the behavior of social insects is presented.
Keywords: Artificial Emotion, Fuzzy logic, Game character, Pheromone label
1039 Human Detection using Projected Edge Feature
Authors: Jaedo Kim, Youngjoon Han, Hernsoo Hahn
Abstract:
The purpose of this paper is to detect humans in images. The paper proposes a method for extracting human body feature descriptors consisting of projected edge component series. The feature descriptor can express the appearance and shape of a human through the local and global distribution of edges. Our method was evaluated with a linear SVM classifier on the Daimler-Chrysler pedestrian dataset and tested with various sub-region sizes. The results show that the accuracy of the proposed method is similar to that of the Histogram of Oriented Gradients (HOG) feature descriptor, while the feature extraction process is simpler and faster than existing methods.
Keywords: Human detection, Projected edge descriptor, Linear SVM, Local appearance feature.
1038 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings
Authors: G. Candel, D. Naccache
Abstract:
t-SNE is an embedding method that the data science community has widely adopted. It supports two main tasks: displaying results by coloring items according to their class or a feature value, and forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure-preservation property and its answer to the crowding problem, where all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the area of a cluster is proportional to its size in number, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from high- to low-dimensional space is described but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped at exactly the same position, making them indistinguishable, and this type of model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with the newly obtained embedding, and the successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. The method has the same complexity as t-SNE per embedding, and the memory requirement is only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing one to observe the birth, evolution and death of clusters. The proposed approach facilitates the identification of significant trends and changes, and empowers the monitoring of the dynamics of high-dimensional datasets.
Keywords: Concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning.
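One simple way to approximate the reuse idea with off-the-shelf tooling is to initialize a new t-SNE run from positions derived from the support embedding so that cluster locations stay comparable; the sketch below is a hedged stand-in for, not a reimplementation of, the two-cost optimization proposed in the paper.

```python
# Hedged sketch: re-using a support embedding by initializing a new t-SNE run from
# it, so cluster positions remain roughly comparable across datasets.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
support, _, centers = make_blobs(n_samples=500, centers=4, n_features=30,
                                 random_state=0, return_centers=True)
new_batch, _ = make_blobs(n_samples=500, centers=centers, n_features=30, random_state=1)

# 1) Build the support embedding once.
support_emb = TSNE(n_components=2, init="pca", random_state=0).fit_transform(support)

# 2) Place each new point at its nearest support neighbour's position (+ jitter)
#    and use that layout to initialize the new embedding.
nn = NearestNeighbors(n_neighbors=1).fit(support)
idx = nn.kneighbors(new_batch, return_distance=False).ravel()
init = support_emb[idx] + rng.normal(scale=1e-2, size=(len(new_batch), 2))

new_emb = TSNE(n_components=2, init=init, random_state=0).fit_transform(new_batch)
print("support embedding:", support_emb.shape, " new embedding:", new_emb.shape)
```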
1037 Application of Interferometric Techniques for Quality Control of Oils Used in the Food Industry
Authors: Andres Piña, Amy Meléndez, Pablo Cano, Tomas Cahuich
Abstract:
The purpose of this project is to propose a quick and environmentally friendly alternative for measuring the quality of oils used in the food industry. There is evidence that repeated and indiscriminate use of oils in food processing causes physicochemical changes, with the formation of potentially toxic compounds that can affect the health of consumers and cause organoleptic changes. In order to assess the quality of oils, non-destructive optical techniques such as interferometry offer a rapid alternative to the use of reagents, relying only on the interaction of light with the oil. In this project, we used interferograms of oil samples placed under different heating conditions to establish the changes in their quality. The interferograms were obtained by means of a Mach-Zehnder interferometer using a 10 mW HeNe laser beam at 632.8 nm. Each interferogram was captured and analyzed, and its full width at half maximum (FWHM) was measured using the Amcap and ImageJ software. The FWHM values were organized into three groups. It was observed that the average of the FWHMs of group A shows an almost linear behavior; therefore, the exposure time is probably not relevant when the oil is kept at a constant temperature. Group B exhibits a slightly exponential behavior when the temperature rises between 373 K and 393 K. The results of the Student's t-test show a 95% probability (significance level 0.05) of a variation in the molecular composition of both samples. Furthermore, we found a correlation between the iodine indexes (physicochemical analysis) and the interferograms (optical analysis) of group C. Based on these results, this project highlights the importance of the quality of the oils used in the food industry and shows how interferometry can be a useful tool for this purpose.
Keywords: Food industry, interferometry, oils, quality control.
1036 Assessment of Sediment Quality in the West Port Based On the Index Analysis Approach
Authors: S.B. Tavakoly Sany, A. Salleh, A.H. Sulaiman, G.H. Monazami
Abstract:
The coastal sediments of the West Port of Malaysia were monitored from November 2009 to October 2010 to assess the spatial distribution of the heavy metals As, Cu, Cd, Cr, Hg, Ni, Zn and Pb. Sediment samples were collected from 10 stations in the dry and rainy seasons in the West Port. The measured concentrations (μg/g dry weight) ranged from 23.4 to 98.3 for Zn, 22.3 to 80 for Pb, 7.4 to 27.6 for Cu, 0.244 to 3.53 for Cd, 7.2 to 22.2 for Ni, 20.2 to 162 for As, 0.11 to 0.409 for Hg and 11.5 to 61.5 for Cr. The geochemical indexes used in this study were the geo-accumulation index (Igeo), the Contamination Factor (CF) and the Pollution Load Index (PLI); these indexes were used to evaluate the levels of sediment contamination. The results show that the West Port sediments are moderately polluted by heavy metals, except for arsenic, which shows a high level of pollution.
Keywords: Heavy metals, Sediment Quality, West Port.
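The three indexes named above have compact standard definitions: Igeo = log2(C/(1.5·B)), CF = C/B and PLI = (CF1 × ... × CFn)^(1/n), with PLI > 1 indicating overall pollution. The helper below applies them; the background values B are hypothetical placeholders, not the reference values used in the study.

```python
# Hedged sketch of the three sediment-quality indexes named above:
#   Igeo = log2(C / (1.5 * B)),  CF = C / B,  PLI = (CF1 * ... * CFn)^(1/n).
# The background values B below are placeholders, not the study's reference values.
import numpy as np

background = {"As": 13.0, "Cd": 0.3, "Hg": 0.4, "Pb": 20.0}      # hypothetical B (ug/g)
measured   = {"As": 90.0, "Cd": 1.8, "Hg": 0.25, "Pb": 55.0}     # example station means

cf = {m: measured[m] / background[m] for m in measured}
igeo = {m: np.log2(measured[m] / (1.5 * background[m])) for m in measured}
pli = np.prod(list(cf.values())) ** (1.0 / len(cf))               # geometric mean of CFs

for m in measured:
    print(f"{m}: CF = {cf[m]:.2f}, Igeo = {igeo[m]:.2f}")
print(f"PLI = {pli:.2f}  (PLI > 1 indicates overall pollution)")
```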