Search results for: continuous time domain estimation
7077 Quantifying and Adjusting the Effects of Publication Bias in Continuous Meta-Analysis
Authors: N.R.N. Idris
Abstract:
This study uses simulated meta-analyses to assess the effects of publication bias on meta-analysis estimates and to evaluate the efficacy of the trim and fill method in adjusting for these biases. The estimated effect sizes and the standard errors were evaluated in terms of statistical bias and coverage probability. The results demonstrate that if publication bias is not adjusted for, it can lead to up to 40% bias in the treatment effect estimates. Use of the trim and fill method can reduce the bias in the overall estimate by more than half. The method is optimal in the presence of moderate underlying bias but has minimal effect in the presence of low or severe publication bias. Additionally, the trim and fill method improves the coverage probability by more than half when subjected to the same level of publication bias as the unadjusted data. The method, however, tends to produce false positive results and will incorrectly adjust the data for publication bias up to 45% of the time. Nonetheless, the bias introduced into the estimates by this adjustment is minimal.
Keywords: Publication bias, Trim and Fill method, percentage relative bias, coverage probability
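For readers who want to reproduce the two evaluation criteria, the sketch below simulates unbiased fixed-effect meta-analyses and computes the percentage relative bias, 100(θ̂ − θ)/θ, and the 95% coverage probability. It is a minimal illustration, not the author's simulation design; the true effect, study count, and standard-error range are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
true_effect, n_meta, n_studies = 0.5, 2000, 10

covered, rel_bias = [], []
for _ in range(n_meta):
    se = rng.uniform(0.1, 0.5, n_studies)          # per-study standard errors
    est = rng.normal(true_effect, se)              # simulated study effects
    w = 1.0 / se**2                                # fixed-effect weights
    pooled = np.sum(w * est) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    rel_bias.append(100 * (pooled - true_effect) / true_effect)
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    covered.append(lo <= true_effect <= hi)

print(f"percentage relative bias: {np.mean(rel_bias):.2f}%")
print(f"coverage probability:     {np.mean(covered):.3f}")
```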
7076 Time Series Forecasting Using Independent Component Analysis
Authors: Theodor D. Popescu
Abstract:
The paper presents a method for multivariate time series forecasting using Independent Component Analysis (ICA) as a preprocessing tool. The idea of this approach is to do the forecasting in the space of independent components (sources), and then to transform the results back to the original time series space. The forecasting can be done separately, and with a different method, for each component, depending on its time structure. The paper also gives a review of the main algorithms for independent component analysis in the case of instantaneous mixture models, using second- and high-order statistics. The method has been applied in simulation to an artificial multivariate time series with five components, generated from three sources and a randomly generated mixing matrix.
Keywords: Independent Component Analysis, second order statistics, simulation, time series forecasting
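A minimal sketch of the forecast-in-source-space idea using scikit-learn's FastICA. The persistence forecaster is a placeholder for whatever per-component model suits each source's time structure, and the random-walk input stands in for real data.

```python
import numpy as np
from sklearn.decomposition import FastICA

def ica_forecast(X, fit_and_predict, n_sources):
    """Forecast a multivariate series X (n_samples, n_series) in ICA space."""
    ica = FastICA(n_components=n_sources, random_state=0)
    S = ica.fit_transform(X)                      # estimated sources
    # forecast each independent component separately
    S_hat = np.column_stack([fit_and_predict(S[:, i]) for i in range(n_sources)])
    return S_hat @ ica.mixing_.T + ica.mean_      # back to observation space

# toy per-component forecaster: persistence of the last value
persist = lambda s: np.full(5, s[-1])

X = np.random.default_rng(0).standard_normal((200, 5)).cumsum(axis=0)
print(ica_forecast(X, persist, n_sources=3).shape)   # (5, 5): 5 steps, 5 series
```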
7075 Electromyography Pattern Classification with Laplacian Eigenmaps in Human Running
Authors: Elnaz Lashgari, Emel Demircan
Abstract:
Electromyography (EMG) is one of the most important interfaces between humans and robots for rehabilitation. Decoding this signal helps to recognize muscle activation and convert it into smooth motion for the robots. Detecting each muscle's pattern during walking and running is vital for improving the quality of a patient's life. In this study, EMG data from 10 muscles in 10 subjects at 4 different speeds were analyzed. EMG signals are nonlinear and of high dimensionality. To deal with this challenge, we extracted features in the time-frequency domain and used manifold learning with the Laplacian Eigenmaps algorithm to find the intrinsic features that represent the data in a low-dimensional space. We then used the Bayesian classifier to identify various patterns of EMG signals for different muscles across a range of running speeds. The best result, obtained for the vastus medialis muscle, was 97.87±0.69 sensitivity and 88.37±0.79 specificity, with 97.07±0.29 accuracy, using the Bayesian classifier. The results of this study provide important insight into human movement and its application to robotics research.
Keywords: Electromyography, manifold learning, Laplacian Eigenmaps, running pattern.
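Laplacian Eigenmaps is available in scikit-learn as SpectralEmbedding; the sketch below pairs it with a Gaussian naive Bayes classifier on random stand-in features, since the EMG set itself is not reproduced here. Note that SpectralEmbedding has no out-of-sample transform, so the embedding is computed once on the pooled windows before splitting.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding   # Laplacian Eigenmaps
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

# stand-in for time-frequency features extracted from EMG windows
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 40))               # 400 windows, 40 features
y = rng.integers(0, 4, 400)                      # 4 running speeds (labels)

# non-linear dimensionality reduction to a low-dimensional manifold
Z = SpectralEmbedding(n_components=3, n_neighbors=10).fit_transform(X)

Z_tr, Z_te, y_tr, y_te = train_test_split(Z, y, random_state=0)
clf = GaussianNB().fit(Z_tr, y_tr)
print("accuracy:", clf.score(Z_te, y_te))
```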
7074 Verification Process of Cylindrical Contact Force Models for Internal Contact Modeling
Authors: Cândida M. Pereira, Amílcar L. Ramalho, Jorge A. Ambrósio
Abstract:
In the numerical solution of the forward dynamics of a multibody system, the positions and velocities of the bodies in the system are obtained first. With the information on the system state variables at each time step, the internal and external forces acting on the system are obtained by appropriate contact force models if the continuous contact method is used instead of a discrete contact method. The local deformation of the bodies in contact, represented by the penetration, is used to compute the contact force. This paper discusses the ability and suitability of current cylindrical contact force models to describe the contact between bodies with cylindrical geometries, with particular focus on internal contacting geometries involving simultaneously low clearances and high loads. A comparative assessment of the performance of each model under analysis for different contact conditions, in particular for very different penetration and clearance values, is presented. It is demonstrated that some models represent only a rough approximation of the conformal contact between cylindrical geometries because the contact forces are underestimated.
Keywords: Clearance joints, contact mechanics, contact dynamics, internal cylindrical contact, multibody dynamics.
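As background to the models under comparison, a widely used continuous contact force law is the Hertz law augmented with Lankarani-Nikravesh hysteresis damping, F = Kδⁿ[1 + 3(1 − e²)δ̇/(4δ̇⁽⁻⁾)]. The sketch below is illustrative only; the paper evaluates several cylindrical-contact variants, and the constants here are arbitrary.

```python
import numpy as np

def contact_force(delta, ddelta, K, e, ddelta_impact, n=1.5):
    """Hertz-type contact force with Lankarani-Nikravesh hysteresis damping.

    delta          : penetration depth (m), > 0 while in contact
    ddelta         : penetration rate (m/s)
    K              : generalized contact stiffness (geometry/material dependent)
    e              : coefficient of restitution
    ddelta_impact  : penetration rate at the start of contact (m/s)
    """
    if delta <= 0.0:
        return 0.0                                    # bodies not in contact
    damping = 1.0 + 3.0 * (1.0 - e**2) * ddelta / (4.0 * ddelta_impact)
    return K * delta**n * damping

print(contact_force(1e-5, 0.05, K=5e9, e=0.9, ddelta_impact=0.1))
```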
7073 2D Graphical Analysis of Wastewater Influent Capacity Time Series
Authors: Monika Chuchro, Maciej Dwornik
Abstract:
The extraction of meaningful information from images could be an alternative method for time series analysis. In this paper, we propose a graphical analysis of time series grouped into a table, with a colour scale adjusted to the numerical values. The advantages of this method are also discussed. The proposed method is easy to understand and is flexible enough to implement the standard methods of pattern recognition and verification, especially for noisy environmental data.
Keywords: graphical analysis, time series, seasonality, noisy environmental data
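The grouped-table representation can be produced in a few lines: reshape the series so that rows and columns align with a suspected period, then render it with a colour scale. All values below are synthetic placeholders for an influent capacity record.

```python
import numpy as np
import matplotlib.pyplot as plt

# hypothetical daily influent series over three 364-day years
rng = np.random.default_rng(0)
days = np.arange(3 * 364)
flow = 100 + 20 * np.sin(2 * np.pi * days / 364) + rng.normal(0, 5, days.size)

# group the series into a table: one row per week, one column per weekday
table = flow.reshape(-1, 7)

plt.imshow(table, aspect="auto", cmap="viridis")   # adjusted colour scale
plt.colorbar(label="influent capacity")
plt.xlabel("day of week"); plt.ylabel("week")
plt.title("time series as a colour-coded table")
plt.show()
```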
7072 A High-Resolution Refractive Index Sensor Based on a Magnetic Photonic Crystal
Authors: Ti-An Tsai, Chun-Chih Wang, Hung-Wen Wang, I-Ling Chang, Lien-Wen Chen
Abstract:
In this study, we demonstrate a high-resolution refractive index sensor based on a Magnetic Photonic Crystal (MPC) composed of a triangular lattice array of air holes embedded in a Si matrix. A microcavity is created by changing the radius of an air hole in the middle of the photonic crystal. The cavity, filled with gyrotropic materials, can serve as a refractive index sensor. The shift of the resonant frequency of the sensor is obtained numerically using the finite-difference time-domain method under different ambient conditions with refractive indices from n = 1.0 to n = 1.1. The numerical results show that a tiny change in refractive index of Δn = 0.0001 is distinguishable. In addition, the spectral response of the MPC sensor is studied while an external magnetic field is present. The results show that the MPC sensor exhibits a dramatic improvement in resolution.
Keywords: Magnetic photonic crystal, refractive index sensor, sensitivity, high-resolution.
7071 Estimation of Crustal Thickness within the Sokoto Basin, North-Western Nigeria, Using Bouguer Gravity Anomaly Data
Authors: T. T. Olugbenga, A. I. Augie
Abstract:
This research proposes an interpretation of the Bouguer gravity anomaly data of some parts of the Sokoto basin for the estimation of crustal thickness. The study area is bounded between latitudes 11°00′0″N and 13°00′0″N, and longitudes 4°00′0″E and 6°00′0″E, and covers Koko, Jega, B/Kebbi, Argungu, Lema, Bodinga, Tamgaza, Gunmi, Daki Takwas, Dange, Sokoto, Ilella, T/Mafara, Anka, Maru, Gusau, K/Namoda, and Sabon Birni, within Sokoto, Kebbi and Zamfara states respectively. The established map of the study area was digitized in X, Y and Z format using the Excel software package, and the digitized data were processed using the Surfer version 13 software. The Moho and Conrad depths, based on a relationship between the Bouguer gravity anomaly and crustal thickness, were estimated as 35 to 37 km and 19 to 21 km, respectively. The crustal region has been categorized into a crustal thinning zone, the region with high gravity anomaly values due to its greater geothermal energy, and a crustal thickening zone, the region with low anomaly values due to its lower geothermal energy. Birnin Kebbi, Jega and Sokoto were identified as regions of hydrocarbon potential, with an estimated crustal thickness of 35 km; this is referred to as crustal thickening, the result of geothermal energy that is low but sufficient to decompose organic matter within the region to form hydrocarbons.
Keywords: Bouguer gravity anomaly, crustal thickness, geothermal energy, hydrocarbons, Moho and Conrad Depths.
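The gridding step performed in Surfer can be reproduced with open tooling; the sketch below interpolates scattered digitized X, Y, Z points onto a regular grid over the stated study area using SciPy. The synthetic anomaly values are placeholders, not the digitized data.

```python
import numpy as np
from scipy.interpolate import griddata

# hypothetical digitized Bouguer anomaly points: lon (X), lat (Y), mGal (Z)
rng = np.random.default_rng(0)
lon = rng.uniform(4.0, 6.0, 300)
lat = rng.uniform(11.0, 13.0, 300)
anom = -30 + 10 * np.sin(lon) + 5 * np.cos(lat) + rng.normal(0, 1, 300)

# regular grid over the study area (a stand-in for Surfer's gridding step)
gx, gy = np.meshgrid(np.linspace(4, 6, 100), np.linspace(11, 13, 100))
grid = griddata((lon, lat), anom, (gx, gy), method="cubic")
print(np.nanmin(grid), np.nanmax(grid))
```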
7070 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection
Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón
Abstract:
Structural inspection activities are necessary to ensure the correct functioning of infrastructures. UAV techniques have become more popular than traditional techniques; specifically, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution, and the direct processing of thermal images usually leads to errors and inaccurate results. In this paper, a methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of RGB and thermal images in parallel. Hence, the RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The mining/industrial facility representations obtained can be used for inspection activities.
Keywords: Aerial thermography, data processing, drone, low-cost, point cloud.
7069 Unsteady Transonic Aerodynamic Analysis for Oscillatory Airfoils Using Time Spectral Method
Authors: Mohamad Reza. Mohaghegh, Majid. Malek Jafarian
Abstract:
This research proposes an algorithm for the simulation of time-periodic unsteady problems via the solution of the unsteady Euler and Navier-Stokes equations. This algorithm, called the Time Spectral method, uses a Fourier representation in time and hence solves for the periodic state directly, without resolving the transients that consume most of the resources in a time-accurate scheme. The mathematical tool used here is the discrete Fourier transform. By enforcing periodicity and using a Fourier representation in time, the method achieves spectral accuracy and has shown tremendous potential for reducing the computational cost compared to conventional time-accurate methods. The accuracy and efficiency of this technique are verified by Euler and Navier-Stokes calculations for pitching airfoils. Because of the turbulent nature of the flow, the Baldwin-Lomax turbulence model has been used for the viscous flow analysis. The results obtained by the Time Spectral method are compared with experimental data, and they confirm that only a small number of time intervals per pitching cycle is required to capture the flow physics.
Keywords: Time Spectral Method, time-periodic unsteady flow, discrete Fourier transform, pitching airfoil, turbulent flow
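The core of the Time Spectral method is replacing the time derivative with a dense spectral differentiation operator that couples all time instances of one period. A minimal sketch, assuming an odd number of time instances (the standard closed form); the verification shows the operator is exact for a resolved harmonic.

```python
import numpy as np

def time_spectral_matrix(N, T):
    """Spectral time-derivative matrix for N (odd) samples over period T."""
    assert N % 2 == 1, "this closed form assumes an odd sample count"
    D = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            if i != j:
                D[i, j] = 0.5 * (-1.0) ** (i - j) / np.sin(np.pi * (i - j) / N)
    return D * (2.0 * np.pi / T)

# verify spectral accuracy on u(t) = sin(2*pi*t/T): D @ u matches u'(t) exactly
N, T = 9, 2.0
t = T * np.arange(N) / N
u = np.sin(2 * np.pi * t / T)
du_exact = (2 * np.pi / T) * np.cos(2 * np.pi * t / T)
print(np.max(np.abs(time_spectral_matrix(N, T) @ u - du_exact)))  # ~1e-15
```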
7068 Image Haze Removal Using Scene Depth Based Spatially Varying Atmospheric Light in Haar Lifting Wavelet Domain
Authors: Prabh Preet Singh, Harpreet Kaur
Abstract:
This paper presents a method for single image dehazing based on the dark channel prior (DCP). The property that the intensity of the dark channel gives an approximate thickness of the haze is used to estimate the transmission and the atmospheric light. Instead of a constant atmospheric light, the proposed method employs scene depth to estimate a spatially varying atmospheric light, as truly occurs in nature. The haze imaging model, together with the soft matting method, has been used in this work to produce high quality haze-free images. Experimental results demonstrate that the proposed approach produces better results than the classic DCP approach, as the color fidelity and contrast of the haze-free image are improved and no over-saturation is observed in the sky region. Further, the lifting Haar wavelet transform is employed to reduce the overall execution time by a factor of two to three compared to the conventional approach.
Keywords: Depth based atmospheric light, dark channel prior, lifting wavelet.
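For orientation, the classic DCP pipeline the paper builds on condenses to the sketch below, with a constant atmospheric light A; the paper's contribution replaces A with a depth-dependent, spatially varying estimate and moves the computation into the lifting Haar wavelet domain. The random image is a stand-in.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    """Per-pixel minimum over RGB, then a local minimum filter (the DCP)."""
    return minimum_filter(img.min(axis=2), size=patch)

def estimate_transmission(img, A, omega=0.95, patch=15):
    """t(x) = 1 - omega * dark_channel(I / A); A is the atmospheric light."""
    return 1.0 - omega * dark_channel(img / A, patch)

img = np.random.default_rng(0).uniform(0, 1, (120, 160, 3))  # stand-in hazy image
A = np.array([0.9, 0.9, 0.9])          # constant atmospheric light (classic DCP)
t = np.clip(estimate_transmission(img, A), 0.1, 1.0)
J = (img - A) / t[..., None] + A       # recovered scene radiance
```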
7067 The Relations between the Fractal Properties of the River Networks and the River Flow Time Series
Authors: M. H. Fattahi, H. Jahangiri
Abstract:
All geophysical phenomena, including river networks and flow time series, are inherently fractal events, and fractal patterns can be investigated through their behavior. A non-linear system like a river basin can well be analyzed by a non-linear measure such as fractal analysis. A bilateral study is held on the fractal properties of the river network and the river flow time series, and a moving window technique is utilized to scan their fractal properties. The results show that both events follow the same strategy with regard to their fractal properties: both the river network and the time series fractal dimensions tend to saturate at a distinct value.
Keywords: river flow time series, fractal, river networks
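The abstract does not fix a particular estimator; as one concrete possibility, the sketch below scans a flow series with a moving window and computes the Higuchi fractal dimension in each window. The window length, step, and random-walk series are assumptions.

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Higuchi fractal dimension of a 1-D series (one common estimator)."""
    N, L = len(x), []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            n = (N - m - 1) // k                      # segments at this offset
            if n < 1:
                continue
            d = np.abs(np.diff(x[m::k])).sum()
            Lk.append(d * (N - 1) / (n * k * k))      # normalized curve length
        L.append(np.mean(Lk))
    k = np.arange(1, kmax + 1)
    return -np.polyfit(np.log(k), np.log(L), 1)[0]    # slope of log-log fit

def moving_window_fd(x, win=365, step=30):
    return [higuchi_fd(x[i:i + win]) for i in range(0, len(x) - win + 1, step)]

flow = np.random.default_rng(0).standard_normal(3000).cumsum()  # stand-in series
print(moving_window_fd(flow)[:3])
```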
7066 Utilization of Kitchen Waste inside Green House Chamber: A Community Level Biogas Programme
Authors: Ravi P. Agrahari
Abstract:
The present study was undertaken with the objective of evaluating kitchen waste as an alternative organic material for biogas production in a community level biogas plant. The field study was carried out for one month (January 19, 2012 – February 17, 2012) at the Centre for Energy Studies, IIT Delhi, New Delhi, India.
This study involves the use of a greenhouse canopy to increase the temperature for the production of biogas in the winter period. In continuation, a semi-continuous study was conducted for one month with a retention time of 30 days under a batch system. The gas generated from the biogas plant was utilized for cooking (burner) and lighting (lamp) purposes. Gas production in the winter season was lower than in other months. It can be concluded that the solar greenhouse assisted biogas plant can be efficiently adopted in colder regions or in the winter season, because temperature plays a major role in biogas production.
Keywords: Biogas, Green house chamber, organic material, solar intensity.
7065 Investigation of the Operational Principle and Flow Analysis of a Newly Developed Dry Separator
Authors: Sung Uk Park, Young Su Kang, Sangmo Kang, Yong Kweon Suh
Abstract:
Separators are employed to separate solids such as mineral products and waste concrete (fine aggregates) and to classify them according to size in the optical, industrial, and construction fields. Various sorting machines are used industrially, operating on electrical properties, centrifugal force, wind power, vibration, or magnetic force, and studies on separators have been carried out to contribute to the environmental industry. In this study, we perform a CFD analysis to understand the basic mechanism of the separation of waste concrete (fine aggregate) particles from air with a machine built around a bladed rotor. In the CFD work, we first performed two-dimensional particle tracking for various particle sizes on models with 1°, 1.5°, and 2° angles between the blades, to verify the boundary conditions and the rotating-domain method to be used in 3D. We then developed a 3D numerical model with ANSYS CFX to calculate the air flow and track the particles. We judged the capability of separating particles of a given size by counting the number of particles escaping from the domain toward the exit among 10 particles issued at the inlet. We confirm that particles show stagnant behavior near the exit of the rotating blades, where the centrifugal force acting on the particles is in balance with the air drag force. It was also found that the minimum particle size that can be separated by the machine is determined by its capability to stay at the outlet of the rotor channels.
Keywords: Environmental industry, separator, CFD, fine aggregate.
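The stagnation condition observed here, centrifugal force balancing air drag at the rotor exit, directly yields an estimate of the minimum separable particle size. A sketch under a Stokes-drag assumption, with illustrative numbers not taken from the paper:

```python
import numpy as np

def cut_size(mu, v_r, rho_p, omega, r):
    """Equilibrium (cut) diameter where Stokes drag balances centrifugal force.

    Balance: (pi/6) d^3 rho_p omega^2 r = 3 pi mu d v_r
          -> d = sqrt(18 mu v_r / (rho_p omega^2 r))
    mu    : air dynamic viscosity (Pa s)
    v_r   : radial air velocity dragging particles outward (m/s)
    rho_p : particle density (kg/m^3)
    omega : rotor angular speed (rad/s)
    r     : radius at the rotor channel exit (m)
    """
    return np.sqrt(18.0 * mu * v_r / (rho_p * omega**2 * r))

d = cut_size(mu=1.8e-5, v_r=2.0, rho_p=2500.0, omega=2 * np.pi * 20, r=0.15)
print(f"cut size ~ {d * 1e6:.1f} um")   # ~10 um for these illustrative values
```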
7064 Dynamic Metrics for Polymorphism in Object Oriented Systems
Authors: Parvinder Singh Sandhu, Gurdev Singh
Abstract:
Measurement is the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules. Software metrics are instruments or ways of measuring all aspects of a software product. These metrics are used throughout a software project to assist in estimation, quality control, productivity assessment, and project control. Object oriented software metrics focus on measurements that are applied to classes and other characteristics. These measurements convey to the software engineer the behavior of the software and how changes can be made to reduce complexity and improve the continuing capability of the software. Object oriented software metrics can be classified into two types, static and dynamic: static metrics are concerned with measurement by static analysis of the software, while dynamic metrics are concerned with measuring the software at run time. Most earlier work focused on static metrics; some work has been done on the dynamic nature of software measurement, but research in this area demands more work. In this paper we give a set of dynamic metrics specifically for polymorphism in object oriented systems.
Keywords: Metrics, software, quality, object oriented system, polymorphism.
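The paper's specific metrics are not reproduced here, but the sketch below illustrates the run-time side of the idea: instrumenting a polymorphic call site and counting which concrete classes actually receive the dispatch during execution. Class names and the derived count are illustrative.

```python
from collections import Counter

class DispatchCounter:
    """Count which concrete classes receive calls to a polymorphic method
    at run time -- the raw data behind a dynamic polymorphism metric."""
    calls = Counter()

    @classmethod
    def record(cls, obj, method):
        cls.calls[(type(obj).__name__, method)] += 1

class Shape:
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def area(self):
        DispatchCounter.record(self, "area")
        return 3.14159

class Square(Shape):
    def area(self):
        DispatchCounter.record(self, "area")
        return 1.0

for s in [Circle(), Square(), Circle()]:   # dynamic binding picks the target
    s.area()

# e.g. one dynamic metric: number of distinct receivers of area() at run time
print(DispatchCounter.calls)
```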
7063 Conflation Methodology Applied to Flood Recovery
Authors: E. L. Suarez, D. E. Meeroff, Y. Yong
Abstract:
Current flood risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding, and its long-term effects on communities, are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (CFR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The CFR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The CFR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The CFR model is more accurate than averaging individual observations before calculating the mean and variance, or averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distributions' means, without the additional information provided by each individual distribution's variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
Keywords: Community resilience, conflation, flood risk, nuisance flooding.
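Since conflation is defined as the normalized product of the parent probability density functions, and the abstract highlights the exponential case, a short numerical sketch follows; the two recovery rates are illustrative. For exponentials the conflation is again exponential with the rates summed, which the computed mean confirms.

```python
import numpy as np

# conflation: product of the input pdfs, renormalized to a proper density
x = np.linspace(0, 20, 4001)
f = 0.5 * np.exp(-0.5 * x)            # recovery-time pdf, severe events (rate 0.5)
g = 2.0 * np.exp(-2.0 * x)            # recovery-time pdf, nuisance events (rate 2.0)

conflated = f * g
conflated /= np.trapz(conflated, x)   # normalize

# for exponentials the conflation is exponential with rate 0.5 + 2.0 = 2.5,
# so the conflated mean recovery time should be 1/2.5 = 0.4
print(np.trapz(x * conflated, x))     # ~0.4
```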
7062 Ontology-based Concept Weighting for Text Documents
Authors: Hmway Hmway Tar, Thi Thi Soe Nyaunt
Abstract:
Document clustering has become an essential technology with the popularity of the Internet, which means that fast, high-quality document clustering techniques play a core role. Text clustering, or shortly clustering, is about discovering semantically related groups in an unstructured collection of documents. Clustering has been very popular for a long time because it provides unique ways of digesting and generalizing large amounts of information. One of the issues in clustering is to extract the proper features (concepts) of a problem domain. Existing clustering technology mainly focuses on term weight calculation; to achieve more accurate document clustering, more informative features, including concept weights, are important. Feature selection is important for the clustering process because irrelevant or redundant features may misguide the clustering results. To counteract this issue, the proposed system presents concept weights for a text clustering system developed based on a k-means algorithm in accordance with the principles of ontology, so that the important words of a cluster can be identified by the weight values. To a certain extent, this has resolved the semantic problem in specific areas.
Keywords: Clustering, concept weight, document clustering, feature selection, ontology
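A minimal sketch of the concept-weighting idea: boost the tf-idf columns of terms attached to ontology concepts before running k-means. The documents, concept terms, and weights below are invented examples, not the paper's ontology.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = ["the bank approved the loan", "the river bank flooded",
        "loan interest rates rose", "flood water reached the levee"]

# hypothetical ontology-derived concept weights: terms attached to a domain
# concept get boosted relative to plain tf-idf terms
concept_weight = {"loan": 2.0, "interest": 2.0, "flood": 2.0, "levee": 2.0}

vec = TfidfVectorizer()
X = vec.fit_transform(docs).toarray()
for term, w in concept_weight.items():
    if term in vec.vocabulary_:
        X[:, vec.vocabulary_[term]] *= w      # scale concept columns

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```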
7061 An Estimation of Rice Output Supply Response in Sierra Leone: A Nerlovian Model Approach
Authors: Alhaji M. H. Conteh, Xiangbin Yan, Issa Fofana, Brima Gegbe, Tamba I. Isaac
Abstract:
Rice is Sierra Leone's staple food, and the nation imports over 120,000 metric tons annually due to a shortfall in its cultivation. The insufficient level of the crop's cultivation in Sierra Leone is caused by many problems, and this has led to an endlessly widening gap between supply of and demand for the crop within the country. Consequently, the government has been obliged to spend huge sums on importing this grain, which could otherwise have been cultivated domestically at a cheaper cost. Hence, this research explores the supply response of rice in Sierra Leone within the period 1980-2010, applying the Nerlovian adjustment model to the Sierra Leone rice data set for that period. The estimated trend equations revealed that time had a significant effect on the output, productivity (yield) and area (acreage) of rice within the period 1980-2010, generally at the 1% level of significance. The results showed that almost all of the growth in output was attributable to increases in the area cultivated to the crop. The time trend variable included for government policy intervention showed an insignificant effect on all the variables considered in this research. Both the short-run and long-run price responses were inelastic, since all their values were less than one. From the findings above, immediate actions that will lead to productivity growth in rice cultivation are required. To achieve this, the responsible agencies should provide extension service schemes to farmers as well as motivate them to adopt modern rice varieties and technology in their rice cultivation ventures.
Keywords: Nerlovian adjustment model, price elasticities, Sierra Leone, Trend equations.
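For readers unfamiliar with the Nerlovian framework: the reduced form regresses current acreage on lagged price and lagged acreage; in logs, the lagged-price coefficient is the short-run elasticity, and dividing it by one minus the lagged-acreage coefficient gives the long-run elasticity. A sketch on simulated log data (the Sierra Leone series is not reproduced here):

```python
import numpy as np

# hypothetical log-transformed series: ln(area) and ln(producer price)
rng = np.random.default_rng(0)
T = 31                                   # e.g. 1980-2010
lnP = np.cumsum(rng.normal(0, 0.1, T)) + 2.0
lnA = np.zeros(T); lnA[0] = 5.0
for t in range(1, T):                    # simulate a partial-adjustment process
    lnA[t] = 1.0 + 0.15 * lnP[t - 1] + 0.7 * lnA[t - 1] + rng.normal(0, 0.02)

# Nerlove reduced form: lnA_t = c + b*lnP_{t-1} + d*lnA_{t-1} + e_t
X = np.column_stack([np.ones(T - 1), lnP[:-1], lnA[:-1]])
c, b, d = np.linalg.lstsq(X, lnA[1:], rcond=None)[0]

print(f"short-run price elasticity: {b:.3f}")
print(f"long-run price elasticity:  {b / (1 - d):.3f}")   # both < 1: inelastic
```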
7060 Neural Network in Fixed Time for Collision Detection between Two Convex Polyhedra
Authors: M. Khouil, N. Saber, M. Mestari
Abstract:
In this paper, a different architecture of a collision detection neural network (DCNN) is developed. This network has enabled us to solve, with a new approach, the problem of collision detection between two convex polyhedra in fixed time (O(1) time). We used two types of neurons, linear and threshold logic, which simplified the actual implementation of all the networks proposed. The study of collision detection is divided into two sections: the collision between a point and a polyhedron, and then the collision between two convex polyhedra. The aim of this research is to determine, through the AMAXNET network, a minimax point in fixed time, which allows us to detect the presence of a potential collision.
Keywords: Collision identification, fixed time, convex polyhedra, neural network, AMAXNET.
7059 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot
Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan
Abstract:
With the aging of the world population and the continuous growth in technology, service robots are nowadays more and more explored as alternatives to healthcare givers or personal assistants for elderly or disabled people. Any service robot should be capable of interacting with its human companion, receiving commands, navigating through the environment, either known or unknown, and recognizing objects. This paper proposes an approach for object recognition based on the use of depth information and color images for a service robot. We present a study on two of the most used methods for object detection, where 3D data is used to detect the position of the objects to be classified that are found on horizontal surfaces. Since most of the objects of interest accessible to service robots lie on these surfaces, the proposed 3D segmentation reduces the processing time and simplifies the scene for object recognition. The first approach for object recognition is based on color histograms, while the second is based on the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
Keywords: Service robot, object recognition, 3D sensors, plane segmentation.
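Once the supporting plane is removed, the two recognition routes compared here reduce to standard OpenCV calls. A sketch on stand-in image crops (random arrays; real crops would come from the segmented point cloud), using histogram correlation and SIFT with Lowe's ratio test:

```python
import cv2
import numpy as np

rng = np.random.default_rng(0)
obj = rng.integers(0, 255, (120, 120, 3), dtype=np.uint8)     # stand-in crops
scene = rng.integers(0, 255, (120, 120, 3), dtype=np.uint8)

# approach 1: color histogram comparison
h1 = cv2.calcHist([obj], [0, 1, 2], None, [8, 8, 8], [0, 256] * 3)
h2 = cv2.calcHist([scene], [0, 1, 2], None, [8, 8, 8], [0, 256] * 3)
score = cv2.compareHist(cv2.normalize(h1, h1), cv2.normalize(h2, h2),
                        cv2.HISTCMP_CORREL)

# approach 2: SIFT keypoint matching with Lowe's ratio test
sift = cv2.SIFT_create()
_, d1 = sift.detectAndCompute(cv2.cvtColor(obj, cv2.COLOR_BGR2GRAY), None)
_, d2 = sift.detectAndCompute(cv2.cvtColor(scene, cv2.COLOR_BGR2GRAY), None)
if d1 is not None and d2 is not None and len(d2) >= 2:
    matches = cv2.BFMatcher().knnMatch(d1, d2, k=2)
    good = [m for m, n in (p for p in matches if len(p) == 2)
            if m.distance < 0.75 * n.distance]
    print(score, len(good))
```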
7058 Bridgeless Boost Power Factor Correction Rectifier with Hold-Up Time Extension Circuit
Authors: Chih-Chiang Hua, Yi-Hsiung Fang, Yuan-Jhen Siao
Abstract:
A bridgeless boost (BLB) power factor correction (PFC) rectifier with a hold-up time extension circuit is proposed in this paper. A full bridge rectifier is widely used in the front end of the ac/dc converter; owing to its shortcomings, the bridgeless rectifier has been developed. Here, a BLB rectifier topology is utilized together with the hold-up time extension circuit. Unlike traditional hold-up time extension circuits, the proposed extension scheme uses fewer active switches to achieve a longer hold-up time. Simulation results are presented to verify the converter performance.
Keywords: Bridgeless boost, boost converter, power factor correction, hold-up time.
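For context on why hold-up time drives the design, the bulk capacitor must carry the load while the line is absent, so the required capacitance follows from a simple energy balance. A sketch with illustrative numbers (not the paper's design values):

```python
def holdup_capacitance(p_out, t_hold, v_nom, v_min, eta=0.9):
    """Bulk capacitance needed to ride through t_hold seconds.

    Energy balance: (p_out / eta) * t_hold = 0.5 * C * (v_nom**2 - v_min**2)
    """
    return 2.0 * (p_out / eta) * t_hold / (v_nom**2 - v_min**2)

# illustrative: 300 W supply, one line cycle (20 ms) of hold-up,
# 400 V nominal bus allowed to droop to 300 V
C = holdup_capacitance(300.0, 0.020, 400.0, 300.0)
print(f"{C * 1e6:.0f} uF")   # ~190 uF
```

Extending the usable droop range (a deeper v_min) is exactly what lets a hold-up extension circuit shrink the bulk capacitor.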
7057 Estimation of Wind Characteristics and Energy Yield at Different Towns in Libya
Authors: Farag Ahwide, Souhel Bousheha
Abstract:
A technical assessment has been made of electricity generation at different towns in Libya, considering wind turbines ranging between the Vestas V80-2.0 MW and V112-3.0 MW and an air density of 1.225 kg/m3. Wind speed was measured every 3 hours at a height of 10 m over a 10-year period between 2000 and 2009 at towns located on the Mediterranean coast and in the desert: Derna 1, Derna 2, Shahat, Benghazi, Ajdabya, Sirte, Misurata, Tripoli-Airport, Al-Zawya, Al-Kofra, Sabha and Nalut. The work presents a long-term wind data analysis in terms of annual, seasonal, monthly and diurnal variations at these sites. Wind power density at different heights has been studied. An Excel sheet program was used to calculate the wind power density and wind speed frequency values for the stations, and their seasonal values have been estimated. The capacity factor at rated wind speed has been estimated for 10 different wind turbines; it is used to determine the required annual energy yield of a wind energy conversion system (WECS), considering wind turbines ranging between 600 kW and 3000 kW.
Keywords: Energy yield, wind turbines, wind speed, wind power density.
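The two central quantities of such an assessment, mean wind power density ½ρ⟨v³⟩ and annual energy yield from a turbine power curve, can be computed as below. The Weibull-distributed speeds and the simplified cubic power curve are assumptions standing in for the measured records and the manufacturer curves.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 1.225                                    # air density (kg/m^3)
v = rng.weibull(2.0, 10 * 365 * 8) * 7.0       # stand-in 3-hourly speeds (m/s)

# mean wind power density: 0.5 * rho * <v^3>  (W/m^2)
wpd = 0.5 * rho * np.mean(v**3)

# annual energy yield from a simplified power curve (2 MW class turbine)
cut_in, rated_v, cut_out, rated_p = 4.0, 15.0, 25.0, 2000.0   # m/s, kW
p = np.where((v >= cut_in) & (v < rated_v),
             rated_p * (v**3 - cut_in**3) / (rated_v**3 - cut_in**3), 0.0)
p = np.where((v >= rated_v) & (v <= cut_out), rated_p, p)

hours_per_sample = 3.0
aep = p.sum() * hours_per_sample / 10.0        # kWh per year over the decade
print(f"power density {wpd:.0f} W/m^2, yield {aep / 1e6:.2f} GWh/yr")
print(f"capacity factor {aep / (rated_p * 8760):.2f}")
```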
7056 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters
Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev
Abstract:
Humanity is faced more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain the earliest possible signals about events which are occurring or may occur, and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the global Internet are developed; information in Romanian is of special interest to us. Obtaining the mentioned tools requires several steps, divided into a preparatory stage and a processing stage. Throughout the first stage, we manually collected over 724 news articles, constituting more than 150 thousand words, and classified them into 10 categories of social disasters. Using this information, a controlled vocabulary of more than 300 keywords was elaborated that will help in the classification and identification of texts related to the field of social disasters. To solve the second problem, the formalism of Petri nets has been used. We deal with the problem of inhabitants' evacuation in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique will be used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE, such as Generalized Stochastic Petri Net (GSPN) analysis, simulation, state space analysis, and invariant analysis, have been used. These modules helped us to obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system's dynamics.
Keywords: Lexicon of disasters, modelling, Petri nets, text annotation, social disasters.
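At its simplest, the controlled-vocabulary step boils down to scoring texts against category term lists. The sketch below is a toy matcher in that spirit; the categories and terms are invented examples, not the 300-keyword Romanian vocabulary described above.

```python
# minimal keyword-vocabulary matcher; categories/terms are invented examples
VOCAB = {
    "flood":    ["flood", "inundation", "levee"],
    "epidemic": ["outbreak", "infection", "quarantine"],
}

def classify(text):
    text = text.lower()
    scores = {cat: sum(term in text for term in terms)
              for cat, terms in VOCAB.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(classify("Authorities ordered quarantine after the outbreak"))  # epidemic
```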
7055 Voltage Problem Location Classification Using Performance of Least Squares Support Vector Machine LS-SVM and Learning Vector Quantization LVQ
Authors: Khaled Abduesslam. M, Mohammed Ali, Basher H Alsdai, Muhammad Nizam, Inayati
Abstract:
This paper presents voltage problem location classification using the performance of the Least Squares Support Vector Machine (LS-SVM) and Learning Vector Quantization (LVQ) in an electrical power system, implemented on the IEEE 39-bus New England system. The data were collected from time domain simulation using the Power System Analysis Toolbox (PSAT). Outputs from the simulation, such as voltage, phase angle, real power and reactive power, were taken as inputs to estimate the voltage stability at particular buses based on the Power Transfer Stability Index (PTSI). The simulation was carried out on the IEEE 39-bus test system by considering increased bus loads on the system. To verify the proposed LS-SVM, its performance was compared to that of LVQ. The results showed that LS-SVM is faster and better than LVQ: LS-SVM achieved 0% misclassification, whereas LVQ had 7.69% misclassification.
Keywords: IEEE 39 bus, Least Squares Support Vector Machine, Learning Vector Quantization, Voltage Collapse.
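LS-SVM is not shipped with scikit-learn; as a close stand-in, the LS-SVM classifier can be approximated by kernel ridge regression on ±1 targets with the sign taken as the class, since both solve a regularized least-squares problem in feature space. The features and labeling rule below are synthetic, not PSAT outputs.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# stand-in features: [voltage, angle, P, Q] per sample; label 1 = unstable bus
rng = np.random.default_rng(0)
X = rng.standard_normal((260, 4))
y = np.where(X[:, 0] + 0.5 * X[:, 3] > 0, 1.0, -1.0)   # synthetic rule

# kernel ridge regression on +/-1 targets behaves very similarly to LS-SVM
model = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5).fit(X[:200], y[:200])
pred = np.sign(model.predict(X[200:]))
print("misclassification:", np.mean(pred != y[200:]))
```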
7054 Fuzzy Inference Based Modelling of Perception Reaction Time of Drivers
Authors: U. Chattaraj, K. Dhusiya, M. Raviteja
Abstract:
The perception reaction time of drivers is an outcome of the human thought process, which is vague and approximate in nature and also varies from driver to driver. In this study, a fuzzy logic based model for predicting it is therefore presented, which seems suitable. The control factors, namely the age, experience, and driving intensity of the driver, the speed of the vehicle, and the distance of the stimulus, have been considered as premise variables in the model, with the perception reaction time as the consequence variable. Results show that the model is able to properly explain the impacts of the control factors on perception reaction time.
Keywords: Driver, fuzzy logic, perception reaction time, premise variable.
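One way to realize such a Mamdani-type fuzzy inference model is with the scikit-fuzzy control API. The sketch below uses only two of the five premise variables, and all universes, membership functions, and rules are illustrative guesses rather than the paper's calibrated ones.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

speed = ctrl.Antecedent(np.arange(0, 121, 1), 'speed')       # km/h
age = ctrl.Antecedent(np.arange(18, 81, 1), 'age')           # years
prt = ctrl.Consequent(np.arange(0.5, 4.01, 0.01), 'prt')     # seconds

speed.automf(3)   # 'poor' / 'average' / 'good' ~ low / medium / high speed
age.automf(3)
prt['short'] = fuzz.trimf(prt.universe, [0.5, 0.5, 1.5])
prt['medium'] = fuzz.trimf(prt.universe, [1.0, 2.0, 3.0])
prt['long'] = fuzz.trimf(prt.universe, [2.5, 4.0, 4.0])

rules = [
    ctrl.Rule(speed['good'] & age['poor'], prt['short']),    # fast, young
    ctrl.Rule(age['good'], prt['long']),                     # older driver
    ctrl.Rule(speed['poor'] | age['average'], prt['medium']),
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['speed'] = 80; sim.input['age'] = 30
sim.compute()
print(f"predicted perception reaction time: {sim.output['prt']:.2f} s")
```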
7053 Current Density Effect on Nickel Electroplating Using Post Supercritical CO2 Mixed Watts Electrolyte
Authors: Chun-Ying Lee, Mei-Wen Wu, Van Cuong Nguyen, Hung-Wei Chuang
Abstract:
In this study, a nickel film with nano-crystalline grains, high hardness, and a smooth surface was electrodeposited using a post-supercritical carbon dioxide (CO2) mixed Watts electrolyte. Although the hardness was not as high as that of its Sc-CO2 counterpart, the thin coating contained significantly fewer nano-sized pinholes. By measuring the escape concentration of the dissolved CO2 in the post Sc-CO2 mixed electrolyte as a function of elapsed time, it is believed that the residue of dissolved CO2 bubbles is closely related to the improvement in hardness and surface roughness over the conventional plating counterpart. Therefore, shortening the duration of electroplating by raising the current density up to 0.5 A/cm2 can effectively retain more of the post Sc-CO2 mixing effect. This study not only confirms the role of dissolved CO2 bubbles in the electrolyte but also provides a potential process to overcome most issues associated with the cost of building high-pressure chambers for large products and for continuous plating using the supercritical method.
Keywords: Additive-free electrolyte, electroplating, nickel, supercritical CO2.
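A useful back-of-the-envelope check when trading current density against plating duration is Faraday's law: the deposit thickness depends on the total charge per area, so raising the current density by 10x cuts the required time by 10x for the same thickness. The numbers below are illustrative, not the paper's.

```python
# deposit thickness from Faraday's law; illustrative values only
M = 58.69       # molar mass of nickel (g/mol)
n = 2           # electrons per Ni2+ ion
F = 96485.0     # Faraday constant (C/mol)
rho = 8.908     # nickel density (g/cm^3)

def thickness_um(current_density_A_cm2, seconds, efficiency=0.95):
    """Plated thickness in micrometers for a given current density and time."""
    mass_per_area = efficiency * current_density_A_cm2 * seconds * M / (n * F)
    return mass_per_area / rho * 1e4      # cm -> um

# at 0.5 A/cm^2 the same thickness needs one tenth the time of 0.05 A/cm^2
print(thickness_um(0.05, 600), thickness_um(0.5, 60))   # ~9.7 um each
```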
7052 Empirical Modeling of Air Dried Rubberwood Drying System
Authors: S. Khamtree, T. Ratanawilai, C. Nuntadusit
Abstract:
Rubberwood is a crucial commercial timber in Southern Thailand. All processes in rubberwood production depend on the knowledge and expertise of the technicians, especially the drying process. This research aims to develop an empirical model for the drying kinetics of rubberwood. During the experiment, the temperature of the hot air and the average air flow velocity were kept at 80-100 °C and 1.75 m/s, respectively. The moisture content of the samples was required to be less than 12% on a drying basis. The drying kinetics were simulated using an empirical solver. The experimental results illustrated that the moisture content decreased as the drying temperature and time increased. The fit of the moisture ratio between the empirical and experimental models was tested with three statistical parameters, R-square (R²), Root Mean Square Error (RMSE) and Chi-square (χ²), to assess the accuracy of the parameters. The experimental moisture ratio had a good fit with the empirical model. Additionally, the results indicated that the Henderson and Pabis model revealed a suitable level of agreement for the drying of rubberwood, giving an excellent estimation (R² = 0.9963) of the moisture movement compared to the other models. Therefore, the empirical results are valid and can be implemented in future experiments.
Keywords: Empirical models, hot air, moisture ratio, rubberwood.
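The Henderson and Pabis model, MR = a·exp(−kt), and the three goodness-of-fit statistics can be reproduced with SciPy as below. The drying data here are synthetic stand-ins for the measured moisture ratios.

```python
import numpy as np
from scipy.optimize import curve_fit

def henderson_pabis(t, a, k):
    """Henderson and Pabis thin-layer model: MR = a * exp(-k * t)."""
    return a * np.exp(-k * t)

# stand-in drying data: time (h) and measured moisture ratio
t = np.linspace(0, 30, 16)
mr = 0.98 * np.exp(-0.12 * t) + np.random.default_rng(0).normal(0, 0.01, t.size)

(a, k), _ = curve_fit(henderson_pabis, t, mr, p0=(1.0, 0.1))
pred = henderson_pabis(t, a, k)

resid = mr - pred
r2 = 1 - np.sum(resid**2) / np.sum((mr - mr.mean())**2)
rmse = np.sqrt(np.mean(resid**2))
chi2 = np.sum(resid**2) / (t.size - 2)            # 2 fitted parameters
print(f"a={a:.3f} k={k:.3f} R2={r2:.4f} RMSE={rmse:.4f} chi2={chi2:.2e}")
```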
7051 Analyzing the Effect of Variable Round Time for Clustering Approach in Wireless Sensor Networks
Authors: Vipin Pal, Girdhari Singh, R P Yadav
Abstract:
As wireless sensor networks are energy-constrained networks, the energy efficiency of sensor nodes is the main design issue. Clustering of nodes is an energy-efficient approach: it prolongs the lifetime of wireless sensor networks by avoiding long distance communication. Clustering algorithms operate in rounds, and the performance of a clustering algorithm depends upon the round time. A large round time consumes more energy at the cluster heads, while a small round time causes frequent re-clustering. Existing clustering algorithms therefore apply a trade-off to the round time and calculate it from the initial parameters of the network. However, it is not appropriate to use an initial-parameter-based round time throughout the network lifetime, because wireless sensor networks are dynamic in nature (nodes can be added to the network or run out of energy). In this paper, a variable round time approach is proposed that calculates the round time depending upon the number of active nodes remaining in the field, making the clustering algorithm adaptive to network dynamics. For simulation, the approach is implemented with LEACH in NS-2, and the results show a 6% increase in network lifetime, a 7% increase in the 50% node death time, and a 5% improvement in the data units gathered at the base station.
Keywords: Wireless sensor network, clustering, energy efficiency, round time.
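A hedged sketch of the variable-round-time idea: the abstract derives the round time from the number of active nodes, but does not state the exact scaling, so the linear rule below is an assumption used purely for illustration.

```python
def round_time(t_initial, n_alive, n_total, t_min=1.0):
    """Shrink the round time as nodes die, so the re-clustering frequency
    tracks the current network rather than its initial parameters.
    Linear scaling is an assumed rule, not the paper's formula."""
    return max(t_min, t_initial * n_alive / n_total)

n_total, t_initial = 100, 20.0
for n_alive in (100, 75, 50, 25):
    print(n_alive, "alive ->", round_time(t_initial, n_alive, n_total), "s rounds")
```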
7050 Measurement of the Bipolarization Events
Authors: Stefan V. Stefanescu
Abstract:
We intend to point out the differences between the classical Gini concentration coefficient and a proposed bipolarization index defined for an arbitrary random variable with finite support. In fact, Gini's index measures only the "poverty degree" of the individuals in a given population, taking their wages into consideration. The Gini coefficient is not very sensitive to significant income variations within the "rich people" class. In practice there are multiple interdependent relations between the pauperization and socio-economic polarization phenomena. The presence of a strong pauperization aspect inside the population often induces a polarization effect in the society, but the pauperization and polarization phenomena are not identical. For this reason it is not always adequate to use a Gini-type coefficient, based on the Lorenz order, to estimate the bipolarization level of the individuals in the studied population. The present paper emphasizes these ideas by considering two families of random variables with linear or triangular type distributions. In addition, continuous variation of the chosen distributions, depending on the parameter "time", can simulate a real dynamic evolution of the population.
Keywords: Bipolarization phenomenon, Gini coefficient, income distribution, poverty measure.
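For reference, the Gini coefficient the paper contrasts against can be computed from a sample in a few lines; the lognormal incomes are illustrative. Note the limitation the abstract highlights: a bimodal ("polarized") population can share a Gini value with a unimodal one, which is what motivates a separate bipolarization index.

```python
import numpy as np

def gini(incomes):
    """Gini coefficient of a sample, via the standard sorted-index formula."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    # G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n
    return 2.0 * np.sum(i * x) / (n * x.sum()) - (n + 1.0) / n

rng = np.random.default_rng(0)
print(gini(rng.lognormal(0, 0.5, 10_000)))   # moderate inequality, ~0.28
print(gini(rng.lognormal(0, 1.5, 10_000)))   # strong inequality, ~0.71
```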
7049 The Impact of Semantic Web on E-Commerce
Authors: Karim Heidari
Abstract:
Semantic Web technologies enable machines to interpret data published on the web in a machine-interpretable form. At the present time, only human beings are able to understand the product information published online. The emerging Semantic Web technologies have the potential to deeply influence the further development of the Internet economy. In this paper we propose a scenario-based research approach to predict the effects of these new technologies on electronic markets and on the business models of traders, intermediaries and customers. Over 300 million searches are conducted every day on the Internet by people trying to find what they need. A majority of these searches are in the domain of consumer e-commerce, where a web user is looking for something to buy. This represents a huge cost in terms of person-hours and an enormous drain on resources. Agent-enabled semantic search will have a dramatic impact on the precision of these searches; it will reduce, and possibly eliminate, the information asymmetry in which a better informed buyer gets the best value. By influencing this key determinant of market prices, the Semantic Web will foster the evolution of different business and economic models. We submit that there is a need to develop these futuristic models based on our current understanding of e-commerce models and nascent Semantic Web technologies, and we believe these business models will encourage mainstream web developers and businesses to join the "semantic web revolution".
Keywords: E-Commerce, E-Business, Semantic Web, XML.
7048 Direct Power Control Strategies for Multilevel Inverter Based Custom Power Devices
Authors: S. Venkateshwarlu, B. P. Muni, A. D. Rajkumar, J. Praveen
Abstract:
Custom power is a technology-driven product and service solution which embraces a family of devices such as the Dynamic Voltage Restorer (DVR), Distribution Static Compensator (DSTATCOM), and Solid State Breaker (SSB), which provide power quality functions at distribution voltages. The rapid response of these devices enables them to operate in real time, providing continuous and dynamic control of the supply, including voltage and reactive power regulation, harmonic reduction, and elimination of voltage dips. This paper presents the benefits of multilevel inverters when they are used for DPC-based custom power devices. The power flow control mechanism, salient features, advantages and disadvantages of direct power control (DPC) using a lookup table, SVM, predictive voltage vector, and hybrid DPC strategies are discussed. Simulation results for a three-level inverter based STATCOM and a harmonic analysis of multilevel inverters are presented at the end.
Keywords: DPC, DPC-SVM, dynamic voltage restorer, DSTATCOM, multilevel inverter, PWM converter, PDPC, VF-DPC.
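The lookup-table DPC strategy mentioned above reduces to two hysteresis comparators (active and reactive power errors) plus a sector index selecting a voltage vector. The sketch below is schematic for a two-level converter: the table contents are placeholder values, not the switching table of any specific DPC scheme, and a multilevel device would have a larger vector set.

```python
import numpy as np

VECTORS = 8                    # v0..v7 for a two-level converter

def sector(theta):
    """Split the voltage-vector plane into 12 thirty-degree sectors."""
    return int(np.floor(theta / (np.pi / 6))) % 12

def dpc_select(p_ref, p, q_ref, q, theta, table):
    sp = int(p_ref - p > 0)    # hysteresis comparator for active power
    sq = int(q_ref - q > 0)    # hysteresis comparator for reactive power
    return table[sp][sq][sector(theta)]

# illustrative 2 x 2 x 12 switching table of voltage-vector indices
table = np.arange(2 * 2 * 12).reshape(2, 2, 12) % VECTORS
v = dpc_select(p_ref=1.0, p=0.8, q_ref=0.0, q=0.1, theta=0.7, table=table)
print("apply voltage vector v%d" % v)
```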