Search results for: network data envelopment analysis
12907 The Performance Improvement of the Target Position Determining System in Laser Tracking Based on 4Q Detector using Neural Network
Authors: A. Salmanpour, Sh. Mohammad Nejad
Abstract:
One of the methods for detecting the target position error in laser tracking systems is using Four Quadrant (4Q) detectors. If the coordinates of the target center are obtained through the usual relations of the detector outputs, the results will be nonlinear and dependent on the shape and size of the target and its position on the detector screen. In this paper we have designed an algorithm using a neural network in which the coordinates of the target center in laser tracking systems are calculated from detector outputs obtained from visual modeling. With this method, the results, except for the part related to the intrinsic limitation of the detector, are linear and independent of the shape and size of the target.
Keywords: four quadrant detector, laser tracking system, rangefinder, tracking sensor
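A minimal sketch of the idea described above: the conventional sum-and-difference relations of a 4Q detector versus a small neural network that learns the mapping from quadrant signals to the spot center. The Gaussian spot model, detector grid, and network size are assumptions for illustration, not the authors' model.

```python
# Hypothetical sketch: conventional 4Q position estimate vs. a neural network
# mapping quadrant signals to spot coordinates (illustrative only).
import numpy as np
from sklearn.neural_network import MLPRegressor

def quadrant_signals(x0, y0, sigma=0.3, half=1.0, n=200):
    """Integrate a Gaussian spot centred at (x0, y0) over the four quadrants
    of a square detector spanning [-half, half] in x and y (simple grid sum)."""
    g = np.linspace(-half, half, n)
    X, Y = np.meshgrid(g, g)
    spot = np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2 * sigma ** 2))
    A = spot[(Y >= 0) & (X >= 0)].sum()  # quadrant A: +x, +y
    B = spot[(Y >= 0) & (X < 0)].sum()   # quadrant B: -x, +y
    C = spot[(Y < 0) & (X < 0)].sum()    # quadrant C: -x, -y
    D = spot[(Y < 0) & (X >= 0)].sum()   # quadrant D: +x, -y
    return np.array([A, B, C, D])

def conventional_estimate(q):
    """Usual 4Q relations: sum-and-difference of quadrant signals."""
    A, B, C, D = q
    s = A + B + C + D
    return ((A + D) - (B + C)) / s, ((A + B) - (C + D)) / s

# Build a training set of (normalized quadrant signals -> true centre) pairs.
rng = np.random.default_rng(0)
centres = rng.uniform(-0.5, 0.5, size=(500, 2))
signals = np.array([quadrant_signals(x0, y0) for x0, y0 in centres])
signals = signals / signals.sum(axis=1, keepdims=True)   # normalize per sample

net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=3000, random_state=0)
net.fit(signals, centres)               # network learns the nonlinear inverse mapping

x_true, y_true = 0.25, -0.10
q = quadrant_signals(x_true, y_true)
print("conventional:", conventional_estimate(q))
print("network     :", net.predict((q / q.sum()).reshape(1, -1))[0])
```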
12906 Application New Approach with Two Networks Slow and Fast on the Asynchronous Machine
Authors: Samia Salah, M’hamed Hadj Sadok, Abderrezak Guessoum
Abstract:
In this paper, we propose a new modular approach, called neuroglial, consisting of two neural networks, slow and fast, which emulates a recently discovered biological reality. The implementation is based on complex multi-time-scale systems; validation is performed on the model of the asynchronous machine. We applied the geometric approach based on the Gerschgorin circles for the decoupling of fast and slow variables, and the method of singular perturbations for the development of reduced models.
This new architecture allows for smaller networks with less complexity and better performance in terms of mean square error and convergence than the single network model.
Keywords: Gerschgorin’s Circles, Neuroglial Network, Multi time scales systems, Singular perturbation method.
12905 Using Ferry Access Points to Improve the Performance of Message Ferrying in Delay-Tolerant Networks
Authors: Farzana Yasmeen, Md. Nurul Huda, Md. Enamul Haque, Michihiro Aoki, Shigeki Yamada
Abstract:
Delay-Tolerant Networks (DTNs) are sparse, wireless networks where disconnections are common due to host mobility and low node density. The Message Ferrying (MF) scheme is a mobility-assisted paradigm to improve connectivity in DTN-like networks. A ferry or message ferry is a special node in the network which has a pre-determined route in the deployed area and relays messages between mobile hosts (MHs) which are intermittently connected. Increased contact opportunities among mobile hosts and the ferry improve the performance of the network, both in terms of message delivery ratio and average end-to-end delay. However, due to the inherent mobility of mobile hosts and the pre-determined periodicity of the message ferry, mobile hosts may often miss contact opportunities with a ferry. In this paper, we propose the combination of stationary ferry access points (FAPs) with MF routing to increase contact opportunities between mobile hosts and the MF and consequently improve the performance of the DTN. We also propose several placement models for deploying FAPs on MF routes. We evaluate the performance of the FAP placement models through comprehensive simulation. Our findings show that FAPs do improve the performance of MF-assisted DTNs and that symmetric placement of FAPs outperforms other placement strategies.
Keywords: service infrastructure, delay-tolerant network, message ferry routing, placement models.
12904 A Reusability Evaluation Model for OO-Based Software Components
Authors: Parvinder S. Sandhu, Hardeep Singh
Abstract:
The requirement to improve software productivity has promoted research on software metric technology. There are metrics for identifying the quality of reusable components, but the function that makes use of these metrics to find the reusability of software components is still not clear. These metrics, if identified in the design phase or even in the coding phase, can help us to reduce rework by improving the quality of reuse of the component and hence improve productivity due to a probabilistic increase in the reuse level. The CK metric suite is the most widely used set of metrics for object-oriented (OO) software; we critically analyzed the CK metrics, tried to remove the inconsistencies, and devised a framework of metrics to obtain the structural analysis of OO-based software components. A neural network can learn new relationships with new input data and can be used to refine fuzzy rules to create a fuzzy adaptive system. Hence, a neuro-fuzzy inference engine can be used to evaluate the reusability of OO-based components using their structural attributes as inputs. In this paper, an algorithm has been proposed in which the inputs can be given to the neuro-fuzzy system in the form of tuned WMC, DIT, NOC, CBO, LCOM values of the OO software component and the output can be obtained in terms of reusability. The developed reusability model has produced high precision results as expected by the human experts.
Keywords: CK-Metric, ID3, Neuro-fuzzy, Reusability.
12903 Using Simulation for Prediction of Units Movements in Case of Communication Failure
Authors: J. Hodicky, P. Frantis
Abstract:
The Command and Control (C2) system and its interface, the Common Operational Picture (COP), are the main means that support the commander in the decision-making process. The COP contains information about friendly and enemy unit positions. Friendly positions are gathered via the tactical network. In the case of a tactical network failure, the information about units is not available. The tactical simulator can be used as a tool that is capable of predicting movements of units with respect to terrain features. This article deals with an experiment based on a Czech C2 system that is, in the case of lost connectivity, fed by the VR-Forces simulator. The article analyzes the maximum time interval in which the position created by the simulator is still usable and truthful for the commander in real time.
Keywords: command and control system, movement prediction, simulation
12902 The Concentration Analysis of CO2 Using ALOHA Code for Kuosheng Nuclear Power Plant
Authors: W. S. Hsu, Y. Chiang, H. C. Chen, J. R. Wang, S. W. Chen, J. H. Yang, C. Shih
Abstract:
Not only radioactive materials but also ordinary chemical materials stored in a power plant can pose a risk to nearby residents. In this research, the ALOHA code was used to perform the concentration analysis under CO2 storage burst or leakage conditions for the Kuosheng nuclear power plant (NPP). The Final Safety Analysis Report (FSAR) and related data were used in this study. Additionally, the analysis results of the ALOHA code were compared with the R.G. 1.78 failure criteria in order to confirm control room habitability. The comparison shows that the ALOHA result for the burst case was 0.923 g/m3, which was below the criteria. However, the ALOHA result for the leakage case was 11.3 g/m3.
Keywords: BWR, ALOHA, habitability, Kuosheng.
12901 CFD Analysis of Two Phase Flow in a Horizontal Pipe – Prediction of Pressure Drop
Authors: P. Bhramara, V. D. Rao, K. V. Sharma, T. K. K. Reddy
Abstract:
In the design of condensers, the prediction of pressure drop is as important as the prediction of the heat transfer coefficient. Modeling of two-phase flow, particularly liquid–vapor flow under diabatic conditions inside a horizontal tube, using CFD analysis is difficult with the available two-phase models in FLUENT due to continuously changing flow patterns. In the present analysis, CFD analysis of two-phase flow of refrigerants inside a horizontal tube of inner diameter 0.0085 m and length 1.2 m is carried out using the homogeneous model under adiabatic conditions. The refrigerants considered are R22, R134a and R407C. The analysis is performed at different saturation temperatures and at different flow rates to evaluate the local frictional pressure drop. Using the homogeneous model, average properties are obtained for each refrigerant, which is treated as a single-phase pseudo fluid. The pressure drop data so obtained are compared with the separated flow models available in the literature.
Keywords: adiabatic conditions, CFD analysis, homogeneous model, liquid–vapor flow.
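A minimal sketch of the homogeneous-model frictional pressure gradient discussed above, assuming the standard quality-weighted mixture density, the McAdams mixture viscosity, and a Blasius friction factor; the property values are placeholders, not the paper's data.

```python
# Illustrative homogeneous-model frictional pressure gradient for two-phase
# pipe flow. Property values below are placeholders, not the paper's data.
def homogeneous_dpdz(G, x, D, rho_l, rho_g, mu_l, mu_g):
    """Frictional pressure gradient (Pa/m) for mass flux G (kg/m^2 s),
    vapor quality x, tube diameter D (m), using the homogeneous model."""
    rho_h = 1.0 / (x / rho_g + (1.0 - x) / rho_l)   # mixture density
    mu_h = 1.0 / (x / mu_g + (1.0 - x) / mu_l)      # McAdams mixture viscosity
    Re = G * D / mu_h                               # mixture Reynolds number
    f = 0.079 * Re ** -0.25                         # Blasius (Fanning) friction factor
    return 2.0 * f * G ** 2 / (D * rho_h)           # dP/dz in Pa per metre

# Example with rough R134a-like properties (placeholder values only).
print(homogeneous_dpdz(G=300.0, x=0.4, D=0.0085,
                       rho_l=1147.0, rho_g=50.0,
                       mu_l=1.6e-4, mu_g=1.2e-5))
```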
12900 Nine-Level Shunt Active Power Filter Associated with a Photovoltaic Array Coupled to the Electrical Distribution Network
Authors: Zahzouh Zoubir, Bouzaouit Azzeddine, Gahgah Mounir
Abstract:
The use of more and more electronic power switches with nonlinear behavior generates non-sinusoidal currents in distribution networks, which causes damage to domestic and industrial equipment. The multi-level shunt active power filter is subsequently shown to be an adequate solution to the problem raised. Nevertheless, the difficulty of adjusting the active filter DC supply voltage requires another technology to ensure it. In this article, a photovoltaic generator is associated with the DC bus power terminals of the active filter. The proposed system consists of a field of solar panels, three multi-level voltage inverters connected to the power grid, and a nonlinear load consisting of a six-diode rectifier bridge supplying a resistive-inductive load. Current control techniques of active and reactive power are used to compensate for both harmonic currents and reactive power as well as to inject active solar power into the distribution network. A Perturb-and-Observe maximum power point tracking algorithm is applied. Simulation results of the proposed system in the MATLAB/Simulink environment show that the control commands ensure solar power injection into the network, harmonic current compensation, and power factor correction.
Keywords: MPPT, active power filter, PV array, perturb and observe algorithm, PWM-control.
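A minimal Perturb-and-Observe sketch of the MPPT step mentioned above; the toy PV curve, step size, and iteration count are assumptions, not the authors' implementation.

```python
# Minimal Perturb-and-Observe MPPT sketch (illustrative; the PV model and
# step size are assumptions, not the paper's implementation).
def pv_power(v):
    """Toy PV curve with a single maximum near 30 V (placeholder model)."""
    i = max(0.0, 8.0 * (1.0 - (v / 40.0) ** 7))   # crude current-voltage shape
    return v * i

def perturb_and_observe(v0=20.0, step=0.5, iterations=100):
    v, p_prev, direction = v0, pv_power(v0), +1.0
    for _ in range(iterations):
        v += direction * step          # perturb the operating voltage
        p = pv_power(v)                # observe the resulting power
        if p < p_prev:                 # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v, p_prev

print(perturb_and_observe())           # settles near the toy curve's maximum
```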
12899 Spatially Random Sampling for Retail Food Risk Factors Study
Authors: Guilan Huang
Abstract:
In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food restaurants and full service restaurants for tracking changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized the spatial random sampling method by considering the financial position and availability of FDA resources, and how we enriched restaurant data with location information. Location information of restaurants provides an opportunity for quantitatively determining random sampling within non-government units (e.g., 240 kilometers around each data collector). Spatial analysis could also optimize data collectors' work plans and resource allocation. A spatial analytic and processing platform helped us handle the spatial random sampling challenges. Our method fits FDA's ability to pinpoint features of foodservice establishments and reduced both the time and expense of data collection.
Keywords: Geospatial technology, restaurant, retail food risk factors study, spatial random sampling.
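A toy sketch of the sampling idea described above: restaurants within a 240 km radius of a data collector are found with the haversine great-circle distance and then sampled at random. The coordinates and sample size are made-up placeholders, not FDA data.

```python
# Illustrative spatial random sampling within a 240 km eligibility radius.
import math
import random

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

collector = (39.95, -75.16)                      # placeholder collector location
restaurants = [(40.44, -79.99), (39.29, -76.61), (42.36, -71.06), (38.90, -77.04)]

in_range = [p for p in restaurants
            if haversine_km(*collector, *p) <= 240.0]   # eligibility radius
random.seed(1)
sample = random.sample(in_range, k=min(2, len(in_range)))  # spatial random draw
print(sample)
```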
12898 Improved BEENISH Protocol for Wireless Sensor Networks Based Upon Fuzzy Inference System
Authors: Rishabh Sharma, Renu Vig, Neeraj Sharma
Abstract:
The main design parameter of a wireless sensor network (WSN) is energy consumption. To compensate for this parameter, hierarchical clustering is a technique that assists in extending the network lifetime by consuming energy efficiently. This paper focuses on WSNs and the fuzzy inference system (FIS), which are deployed to enhance the BEENISH protocol. The node energy, mobility, pause time and density are considered for the selection of the CH (cluster head). The simulation outcomes show that the proposed system outperforms the traditional system with regard to energy utilization and the number of packets transmitted to the sink.
Keywords: Wireless sensor network, sink, sensor node, routing protocol, fuzzy rule, fuzzy inference system.
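A toy sketch of fuzzy cluster-head scoring from the four inputs named above. The membership functions, rules, and rule outputs are illustrative assumptions, not the paper's FIS.

```python
# Illustrative Sugeno-style fuzzy scoring of cluster-head (CH) candidates from
# normalized energy, mobility, pause time, and density (all in [0, 1]).
def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def high(x): return tri(x, 0.5, 1.0, 1.5)    # 'high' shoulder on [0, 1]
def low(x):  return tri(x, -0.5, 0.0, 0.5)   # 'low' shoulder on [0, 1]

def ch_chance(energy, mobility, pause, density):
    """Higher chance for high energy, pause time, density and low mobility."""
    rules = [
        (high(energy),  1.0),   # (rule firing strength, rule output)
        (low(mobility), 0.8),
        (high(pause),   0.6),
        (high(density), 0.6),
    ]
    num = sum(fs * out for fs, out in rules)
    den = sum(fs for fs, _ in rules)
    return num / den if den else 0.0    # weighted-average (Sugeno) defuzzification

nodes = {"n1": (0.9, 0.1, 0.8, 0.7), "n2": (0.4, 0.6, 0.3, 0.5)}
print(max(nodes, key=lambda n: ch_chance(*nodes[n])))   # picks the better CH
```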
12897 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators
Authors: M. A. Okezue, K. L. Clase, S. R. Byrn
Abstract:
The requirement for maintaining data integrity in laboratory operations is critical for regulatory compliance. Automation of procedures reduces the incidence of human errors. Quality control laboratories located in low-income economies may face some barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as an adjunct therapy in COVID-19 regimens. Unfortunately, zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of 0.1 M sodium edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets. For each step in the process, different formulae were input into two spreadsheets to automate calculations. Further checks were created within the automated system to ensure validity of replicate analysis in titrimetric procedures. Validations were conducted using five data sets of manually computed assay results. The acceptance criteria set for the protocol were met. Significant p-values (p < 0.05, α = 0.05, at 95% Confidence Interval) were obtained from Student's t-test evaluation of the mean values for manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and principles of data integrity were enhanced by the use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets. Human errors were minimized in calculations when procedures were automated in quality control laboratories. The assay procedure for the formulation was achieved in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
Keywords: Data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets.
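A minimal sketch of the kind of statistical check mentioned above: comparing manually calculated assay results against spreadsheet output with a paired Student's t-test. The numbers are made up, not the study's data sets.

```python
# Illustrative paired t-test of manual vs. spreadsheet-calculated assay results.
from scipy import stats

manual      = [99.2, 100.4, 98.7, 101.1, 99.8]   # manually calculated assay (%)
spreadsheet = [99.3, 100.4, 98.6, 101.2, 99.8]   # validated spreadsheet output (%)

t_stat, p_value = stats.ttest_rel(manual, spreadsheet)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# A large p-value would indicate no detectable difference between the two
# calculation routes at the chosen significance level (alpha = 0.05).
```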
12896 A Low-Power Two-Stage Seismic Sensor Scheme for Earthquake Early Warning System
Authors: Arvind Srivastav, Tarun Kanti Bhattacharyya
Abstract:
The north-eastern, Himalayan, and Eastern Ghats belts of India comprise earthquake-prone, remote, and hilly terrains. Earthquakes have caused enormous damage in these regions in the past. A wireless sensor network based earthquake early warning system (EEWS) is being developed to mitigate the damage caused by earthquakes. It consists of sensor nodes, distributed over the region, that perform majority voting of the output of the seismic sensors in the vicinity, and relay a message to a base station to alert the residents when an earthquake is detected. At the heart of the EEWS is a low-power two-stage seismic sensor that continuously tracks seismic events from the incoming three-axis accelerometer signal at the first stage, and, in the presence of a seismic event, triggers the second-stage P-wave detector that detects the onset of the P-wave in an earthquake event. The parameters of the P-wave detector have been optimized for minimizing detection time and maximizing the accuracy of detection. The working of the sensor scheme has been verified with data from seven earthquakes retrieved from IRIS. In all test cases, the scheme detected the onset of the P-wave accurately. Also, it has been established that the P-wave onset detection time reduces with the sampling rate. It has been verified with test data: the detection time for data sampled at 10 Hz was around 2 seconds, which reduced to 0.3 seconds for data sampled at 100 Hz.
Keywords: Earthquake early warning system, EEWS, STA/LTA, polarization, wavelet, event detector, P-wave detector.
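A minimal STA/LTA sketch of the event/onset detection principle named in the keywords; the window lengths, threshold, and synthetic signal are assumptions, not the paper's tuning or IRIS data.

```python
# Minimal STA/LTA trigger sketch for P-wave onset detection (illustrative).
import numpy as np

def sta_lta_onset(signal, fs, sta_win=0.5, lta_win=5.0, threshold=4.0):
    """Return the first sample index where STA/LTA exceeds the threshold."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = signal ** 2
    for i in range(lta_n, len(signal) - sta_n):
        lta = energy[i - lta_n:i].mean()     # long-term average (background)
        sta = energy[i:i + sta_n].mean()     # short-term average (incoming)
        if lta > 0 and sta / lta > threshold:
            return i                          # candidate P-wave onset
    return None

fs = 100.0                                    # sampling rate in Hz
t = np.arange(0, 30, 1 / fs)
noise = 0.05 * np.random.default_rng(0).standard_normal(t.size)
quake = np.where(t > 12.0, np.sin(2 * np.pi * 5 * (t - 12.0)) * np.exp(-(t - 12.0)), 0.0)
onset = sta_lta_onset(noise + quake, fs)
print(onset, onset / fs if onset else None)   # onset index and time (s)
```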
12895 Quantifying and Adjusting the Effects of Publication Bias in Continuous Meta-Analysis
Authors: N.R.N. Idris
Abstract:
This study uses simulated meta-analysis to assess the effects of publication bias on meta-analysis estimates and to evaluate the efficacy of the trim and fill method in adjusting for these biases. The estimated effect sizes and the standard error were evaluated in terms of the statistical bias and the coverage probability. The results demonstrate that if publication bias is not adjusted for, it could lead to up to 40% bias in the treatment effect estimates. Utilization of the trim and fill method could reduce the bias in the overall estimate by more than half. The method is optimal in the presence of moderate underlying bias but has minimal effects in the presence of low and severe publication bias. Additionally, the trim and fill method improves the coverage probability by more than half when subjected to the same level of publication bias as that of the unadjusted data. The method, however, tends to produce false positive results and will incorrectly adjust the data for publication bias up to 45% of the time. Nonetheless, the bias introduced into the estimates due to this adjustment is minimal.
Keywords: Publication bias, Trim and Fill method, percentage relative bias, coverage probability
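An illustrative computation of the two evaluation criteria named above, percentage relative bias and coverage probability, on made-up simulated estimates (not the study's simulation design).

```python
# Percentage relative bias and coverage probability for simulated estimates.
import numpy as np

def percentage_relative_bias(estimates, true_value):
    """100 * (mean estimate - true value) / true value."""
    return 100.0 * (np.mean(estimates) - true_value) / true_value

def coverage_probability(estimates, std_errors, true_value, z=1.96):
    """Proportion of 95% confidence intervals that contain the true value."""
    lo = estimates - z * std_errors
    hi = estimates + z * std_errors
    return np.mean((lo <= true_value) & (true_value <= hi))

rng = np.random.default_rng(0)
true_effect = 0.5
est = true_effect + 0.1 + rng.normal(0, 0.15, size=1000)   # biased estimates
se = np.full(1000, 0.15)
print(percentage_relative_bias(est, true_effect))           # around 20% relative bias
print(coverage_probability(est, se, true_effect))           # coverage below 0.95
```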
12894 Computational Aspects of Regression Analysis of Interval Data
Authors: Michal Cerny
Abstract:
We consider linear regression models where both input data (the values of independent variables) and output data (the observations of the dependent variable) are interval-censored. We introduce a possibilistic generalization of the least squares estimator, the so-called OLS-set for the interval model. This set captures the impact of the loss of information on the OLS estimator caused by interval censoring and provides a tool for quantification of this effect. We study complexity-theoretic properties of the OLS-set. We also deal with restricted versions of the general interval linear regression model, in particular the crisp input – interval output model. We give an argument that natural descriptions of the OLS-set in the crisp input – interval output case cannot be computed in polynomial time. Then we derive easily computable approximations for the OLS-set which can be used instead of the exact description. We illustrate the approach by an example.
Keywords: Linear regression, interval-censored data, computational complexity.
12893 Net-Trainer-ST: A Swiss Army Knife for Pentesting, Based on Single Board Computer, for Cybersecurity Professionals and Hobbyists
Authors: K. Hołda, D. Śliwa, K. Daniec
Abstract:
This article was created as part of a master's thesis. It presents a developed device that supports the work of specialists dealing with broadly understood cybersecurity. The device is designed to automate security tests. In addition, it simulates potential cyberattacks in the most realistic way possible, without causing permanent damage to the network, in order to maximize the quality of the subsequent corrections to the tested network systems. The proposed solution is a fully operational prototype created from commonly available electronic components and a single board computer. The article focuses not only on the hardware part of the device but also on the theoretical and practical way in which the implemented cybersecurity tests operate, together with examples of their results.
Keywords: Raspberry Pi, ethernet, automated cybersecurity tests, ARP, DNS, backdoor, TCP, password sniffing.
12892 Modeling of PZ in Haunch Connections Systems
Authors: Peyman Shadman Heidari, Roohollah Ahmady Jazany, Mahmood Reza Mehran, Pouya Shadman Heidari, Mohammad khorasani
Abstract:
Modeling of Panel Zone (PZ) seismic behavior, because of its role in the overall ductility and lateral stiffness of steel moment frames, has been considered a challenge for years. There are some studies regarding the effects of different doubler plate thicknesses and geometric properties of the PZ on its seismic behavior. However, there is not much investigation on the effects of the number of provided continuity plates in the presence of one triangular haunch, two triangular haunches, or a rectangular haunch (T-shaped haunches) for exterior columns. In this research, detailed finite element models of 12 tested connections of the SAC joint venture were first created and analyzed; then the cyclic-behavior backbone curves obtained from these models, together with other FE models for similar tests, were used for neural network training. The seismic behavior of these data is then categorized according to continuity plate arrangements and differences in the type of haunches. PZs with one-sided haunches have little plastic rotation. As the number of continuity plates increases due to the presence of two triangular haunches (four continuity plates), there is no plastic rotation; in other words, the PZ behaves in its elastic range. In the case of the rectangular haunch, the PZ shows more plastic rotation in comparison with the one-sided triangular haunch and especially the double-sided triangular haunches. Moreover, the models presented in this study for one-sided and double-sided triangular haunches and rectangular haunches seem to provide a proper estimation of PZ seismic behavior.
Keywords: Continuity plate, FE models, Neural network, Panel zone, Plastic rotation, Rectangular haunch, Seismic behavior
12891 Particle Swarm Optimization with Interval-valued Genotypes and Its Application to Neuroevolution
Authors: Hidehiko Okada
Abstract:
The author proposes an extension of particle swarm optimization (PSO) for solving interval-valued optimization problems and applies the extended PSO to evolutionary training of neural networks (NNs) with interval weights. In the proposed PSO, values in the genotypes are not real numbers but intervals. Experimental results show that interval-valued NNs trained by the proposed method could well approximate hidden target functions despite the fact that no training data was explicitly provided.
Keywords: Evolutionary algorithms, swarm intelligence, particle swarm optimization, neural network, interval arithmetic.
12890 Reliability Analysis of Press Unit using Vague Set
Authors: S. P. Sharma, Monica Rani
Abstract:
In conventional reliability assessment, the reliability data of system components are treated as crisp values. The collected data have some uncertainties due to errors by human beings/machines or other sources. These uncertainty factors limit the understanding of system component failure because of incomplete data. In these situations, we need to generalize classical methods to the fuzzy environment for studying and analyzing the systems of interest. Fuzzy set theory has been proposed to handle such vagueness by generalizing the notion of membership in a set. Essentially, in a Fuzzy Set (FS) each element is associated with a point value selected from the unit interval [0, 1], which is termed the grade of membership in the set. A Vague Set (VS), as well as an Intuitionistic Fuzzy Set (IFS), is a further generalization of an FS. Instead of using point-based membership as in FS, interval-based membership is used in VS. The interval-based membership in VS is more expressive in capturing the vagueness of data. In the present paper, vague set theory coupled with the conventional Lambda-Tau method is presented for reliability analysis of repairable systems. The methodology uses Petri nets (PN) to model the system instead of a fault tree because it allows efficient simultaneous generation of minimal cut and path sets. The presented method is illustrated with the press unit of a paper mill.
Keywords: Lambda-Tau methodology, Petri nets, repairable system, vague fuzzy set.
12889 Finding Viable Pollution Routes in an Urban Network under a Predefined Cost
Authors: Dimitra Alexiou, Stefanos Katsavounis, Ria Kalfakakou
Abstract:
In an urban area, transportation routes should be planned so as to minimize the pollution they provoke while taking into account their cost. In the sequel, these routes are referred to as pollution routes.
The transportation network is expressed by a weighted graph G = (V, E, D, P), where every vertex represents a location to be served and E contains unordered pairs (edges) of elements of V that indicate a simple road. Each road is also assigned a distance/cost and a weight that depicts the air pollution provoked by a vehicle transition on that road; these are the items of the sets D and P, respectively.
Furthermore, the investigated pollution routes must not exceed predefined values for the route cost and the route pollution level during the vehicle transition.
In this paper we present an algorithm that generates such routes so that the decision maker can select the most appropriate one.
Keywords: bi-criteria, pollution, shortest paths.
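A toy sketch of the bi-criteria idea described above: enumerate simple paths whose total cost and total pollution both stay under predefined bounds. The graph, weights, and bounds are illustrative, not the paper's algorithm or data.

```python
# Enumerate simple routes that satisfy both a cost bound and a pollution bound.
edges = {
    ("a", "b"): (4, 2), ("b", "d"): (3, 5), ("a", "c"): (2, 4),
    ("c", "d"): (6, 1), ("b", "c"): (1, 1),
}  # edge -> (cost, pollution); undirected toy network

def neighbours(v):
    for (u, w), (c, p) in edges.items():
        if u == v:
            yield w, c, p
        elif w == v:
            yield u, c, p

def viable_routes(src, dst, max_cost, max_pollution):
    """Depth-first enumeration of simple paths within both bounds."""
    results = []
    def dfs(v, path, cost, poll):
        if cost > max_cost or poll > max_pollution:
            return                                   # prune: a bound is exceeded
        if v == dst:
            results.append((list(path), cost, poll))
            return
        for u, c, p in neighbours(v):
            if u not in path:                        # keep the path simple
                path.append(u)
                dfs(u, path, cost + c, poll + p)
                path.pop()
    dfs(src, [src], 0, 0)
    return results

for route, cost, poll in viable_routes("a", "d", max_cost=9, max_pollution=8):
    print(route, "cost:", cost, "pollution:", poll)
```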
12888 Interest of the Sequences Pseudo Noises Codes of Different Lengths for the Reduction from the Interference between Users of CDMA Network
Authors: Nerguè Kassahan Kone, Souleymane Oumtanaga
Abstract:
The third generation (3G) of cellular systems adopted spread spectrum as the solution for data transmission in the physical layer. In contrast to IS-95 or cdmaOne (spread spectrum systems of the preceding generation), the new standard, called the Universal Mobile Telecommunications System (UMTS), uses long codes in the downlink. The system is conceived for both voice communication and data transmission. The downlink is particularly important because of the asymmetrical demand for data, i.e., more downloading towards the mobiles than towards the base station. Moreover, UMTS uses orthogonal spreading with a variable spreading factor (OVSF, Orthogonal Variable Spreading Factor) in the downlink. This characteristic makes it possible to increase the data rate of one or more users by reducing their spreading factor without changing the spreading factor of other users. In the current UMTS standard, two techniques to increase downlink performance were proposed: transmit antenna diversity and space-time codes. These two techniques combat only fading. The receiver proposed for the mobile station is the RAKE, but one can imagine a more sophisticated receiver, able to reduce the interference between users and the impact of coloured noise and narrowband interference. In this context, where the users have synchronized long codes with variable spreading factors and the mobile is ignorant of the other active codes/users, the use of pseudo-noise code sequences of different lengths is presented as one of the most appropriate solutions.
Keywords: DS-CDMA, multiple access interference, signal-to-interference-plus-noise ratio.
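A minimal sketch of OVSF code generation by recursive doubling, shown only to illustrate the spreading-code structure discussed above, not the paper's interference analysis.

```python
# OVSF (Orthogonal Variable Spreading Factor) code tree by recursive doubling.
def ovsf_codes(spreading_factor):
    """Return all OVSF codes (rows) for a power-of-two spreading factor."""
    codes = [[1]]
    while len(codes[0]) < spreading_factor:
        nxt = []
        for c in codes:
            nxt.append(c + c)                      # child 1: (c, c)
            nxt.append(c + [-x for x in c])        # child 2: (c, -c)
        codes = nxt
    return codes

codes = ovsf_codes(8)
# Codes of the same spreading factor are mutually orthogonal:
print(sum(a * b for a, b in zip(codes[0], codes[3])))   # 0
print(sum(a * b for a, b in zip(codes[2], codes[2])))   # 8 (code energy)
```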
12887 Cloud Computing Support for Diagnosing Researches
Authors: A. Amirov, O. Gerget, V. Kochegurov
Abstract:
One of the main biomedical problems lies in detecting dependencies in semi-structured data. The solution includes a biomedical portal and algorithms (integral rating health criteria, multidimensional data visualization methods). The biomedical portal allows diagnostic and research data to be processed in parallel mode using Microsoft System Center 2012 and Windows HPC Server cloud technologies. The service does not allow the user to see the internal calculations; instead, it provides a practical interface. When data are sent for processing, the user may track the status of the task and will receive the results as soon as the computation is completed. The service includes its own algorithms and allows diagnosing and predicting medical cases. The approved methods are based on complex-system entropy methods, algorithms for determining the energy patterns of development and trajectory models of biological systems, and a logical–probabilistic approach with the blurring of images.
Keywords: Biomedical portal, cloud computing, diagnostic and prognostic research, mathematical data analysis.
12886 Laban Movement Analysis Using Kinect
Authors: Ran Bernstein, Tal Shafir, Rachelle Tsachor, Karen Studd, Assaf Schuster
Abstract:
Laban Movement Analysis (LMA), developed in the dance community over the past seventy years, is an effective method for observing, describing, notating, and interpreting human movement to enhance communication and expression in everyday and professional life. Many applications that use motion capture data might be significantly leveraged if the Laban qualities are recognized automatically. This paper presents an automated recognition method of Laban qualities from motion capture skeletal recordings, demonstrated on the output of Microsoft's Kinect V2 sensor.
Keywords: Laban Movement Analysis, Kinect, Machine Learning.
12885 The Quality Assessment of Seismic Reflection Survey Data Using Statistical Analysis: A Case Study of Fort Abbas Area, Cholistan Desert, Pakistan
Authors: U. Waqas, M. F. Ahmed, A. Mehmood, M. A. Rashid
Abstract:
In geophysical exploration surveys, the quality of acquired data holds significant importance before executing the data processing and interpretation phases. In this study, 2D seismic reflection survey data of the Fort Abbas area, Cholistan Desert, Pakistan were taken as a test case in order to assess their quality on a statistical basis using the normalized root mean square error (NRMSE), Cronbach's alpha test (α), and null hypothesis tests (t-test and F-test). The analysis challenged the quality of the acquired data and highlighted the significant errors in the acquired database. It is known that the study area is plain, tectonically little affected, and rich in oil and gas reserves. However, subsurface 3D modeling and contouring using the acquired database revealed high degrees of structural complexity and intense folding. The NRMSE had the highest percentage of residuals between the estimated and predicted cases. The outcomes of hypothesis testing also demonstrated the bias and erratic nature of the acquired database. The low estimated value of alpha (α) in Cronbach's alpha test confirmed the poor reliability of the acquired database. A database of very low quality needs excessive static correction or, in some cases, reacquisition of the data, which is most of the time not feasible on economic grounds. The outcomes of this study could be used to assess the quality of large databases and could further serve as a guideline for establishing database quality assessment models to make much more informed decisions in the hydrocarbon exploration field.
Keywords: Data quality, null hypothesis, seismic lines, seismic reflection survey.
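An illustrative implementation of two of the quality metrics named above, NRMSE and Cronbach's alpha, applied to made-up numbers rather than the survey data.

```python
# NRMSE and Cronbach's alpha on placeholder data.
import numpy as np

def nrmse(observed, predicted):
    """Root mean square error normalized by the range of the observations."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    return rmse / (observed.max() - observed.min())

def cronbach_alpha(items):
    """items: 2D array, rows = observations, columns = items/measurements."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return (k / (k - 1)) * (1.0 - item_var / total_var)

obs = [10.0, 12.5, 9.8, 14.2, 11.1]
pred = [10.4, 12.0, 10.5, 13.6, 11.9]
print(nrmse(obs, pred))

measurements = np.array([[2, 3, 3], [4, 4, 5], [1, 2, 2], [5, 4, 5], [3, 3, 4]])
print(cronbach_alpha(measurements))                  # closer to 1 = more reliable
```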
12884 Improving Co-integration Trading Rule Profitability with Forecasts from an Artificial Neural Network
Authors: Paul Lajbcygier, Seng Lee
Abstract:
Co-integration models the long-term, equilibrium relationship of two or more related financial variables. Even if co-integration is found, in the short run there may be deviations from the long-run equilibrium relationship. The aim of this work is to forecast these deviations using neural networks and create a trading strategy based on them. A case study is used: co-integration residuals from Australian Bank Bill futures are forecast and traded using various exogenous input variables combined with neural networks. The choice of the optimal exogenous input variables for each neural network, undertaken in previous work [1], is validated by comparing the forecasts and the corresponding profitability of each using a trading strategy.
Keywords: Artificial neural networks, co-integration, forecasting, trading rule.
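A toy sketch of the co-integration residual that such a trading rule acts on: the long-run relation is estimated by OLS and the deviation is traded with simple threshold bands. The price series are made up and the neural-network forecasting step is not shown.

```python
# Co-integration residual and a simple threshold trading rule (illustrative).
import numpy as np

rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(0, 1, 500)) + 100            # first price series
y = 0.8 * x + 5 + rng.normal(0, 0.5, 500)             # co-integrated partner

b, a = np.polyfit(x, y, 1)                             # long-run hedge ratio, intercept
residual = y - (a + b * x)                             # deviation from equilibrium

# Short the spread when the residual is high, long when low, flat otherwise.
sigma = residual.std()
position = np.where(residual > sigma, -1, np.where(residual < -sigma, 1, 0))
print(position[:20])
```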
12883 Health Monitoring of Power Transformers by Dissolved Gas Analysis using Regression Method and Study the Effect of Filtration on Oil
Authors: Anjali Chatterjee, Nirmal Kumar Roy
Abstract:
Economically, transformers constitute one of the largest investments in a power system. For this reason, transformer condition assessment and management is a high-priority task. If a transformer fails, it has a significant negative impact on revenue and service reliability. Monitoring the state of health of power transformers has traditionally been carried out using laboratory Dissolved Gas Analysis (DGA) tests performed at periodic intervals on oil samples collected from the transformers. DGA of transformer oil is the single best indicator of a transformer's overall condition and is a universal practice today, which started in the 1960s. Failure can occur in a transformer for different reasons. Some failures can be limited or prevented by maintenance. Oil filtration is one of the methods to remove the dissolved gases and prevent the deterioration of the oil. In this paper, we analyze the DGA data by the regression method and predict the future gas concentration in the oil. We carry out a comparative study of different traditional regression methods and the errors generated by their predictions. With the help of these data, we can deduce the health of the transformer by finding the type of fault, if it has occurred or will occur in the future. Additionally, the effect of filtration on transformer health is highlighted by calculating the probability of failure of a transformer with and without oil filtration.
Keywords: power transformers, dissolved gas analysis, regression method, filtration, oil.
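A minimal sketch of the regression-and-extrapolation step described above: fit a simple trend to periodic DGA readings of one fault gas and predict a future concentration. The values are made up, not real DGA data.

```python
# Linear-regression extrapolation of a dissolved fault-gas concentration.
import numpy as np

months = np.array([0, 3, 6, 9, 12, 15, 18])                     # sampling times
acetylene_ppm = np.array([2.0, 2.4, 3.1, 3.5, 4.2, 4.9, 5.6])   # measured C2H2

slope, intercept = np.polyfit(months, acetylene_ppm, 1)          # linear trend
forecast_month = 24
predicted = slope * forecast_month + intercept
print(f"predicted C2H2 at month {forecast_month}: {predicted:.1f} ppm")
# A predicted concentration above a chosen limit would flag the unit for
# maintenance (e.g. oil filtration) before an incipient fault develops.
```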
12882 Application of a New Hybrid Optimization Algorithm on Cluster Analysis
Authors: T. Niknam, M. Nayeripour, B.Bahmani Firouzi
Abstract:
Clustering techniques have received attention in many areas including engineering, medicine, biology and data mining. The purpose of clustering is to group together data points which are close to one another. The K-means algorithm is one of the most widely used techniques for clustering. However, K-means has two shortcomings: dependency on the initial state and convergence to local optima; in addition, global solutions of large problems cannot be found with a reasonable amount of computational effort. In order to overcome the local optima problem, many studies on clustering have been carried out. This paper presents an efficient hybrid evolutionary optimization algorithm based on combining Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO), called PSO-ACO, for optimally clustering N objects into K clusters. The new PSO-ACO algorithm is tested on several data sets, and its performance is compared with those of ACO, PSO and K-means clustering. The simulation results show that the proposed evolutionary optimization algorithm is robust and suitable for handling data clustering.
Keywords: Ant Colony Optimization (ACO), Data clustering, Hybrid evolutionary optimization algorithm, K-means clustering, Particle Swarm Optimization (PSO).
12881 Performance Optimization of Data Mining Application Using Radial Basis Function Classifier
Authors: M. Govindarajan, R. M.Chandrasekaran
Abstract:
Text data mining is a process of exploratory data analysis. Classification maps data into predefined groups or classes. It is often referred to as supervised learning because the classes are determined before examining the data. This paper describes a proposed radial basis function classifier that performs comparative cross-validation against the existing radial basis function classifier. The feasibility and the benefits of the proposed approach are demonstrated by means of a data mining problem: direct marketing. Direct marketing has become an important application field of data mining. Comparative cross-validation involves estimation of accuracy by either stratified k-fold cross-validation or equivalent repeated random subsampling. While the proposed method may have high bias, its performance (accuracy estimation in our case) may be poor due to high variance. Thus the accuracy with the proposed radial basis function classifier was less than with the existing radial basis function classifier. However, there is a smaller improvement in runtime and a larger improvement in precision and recall. In the proposed method, classification accuracy and prediction accuracy are determined, where the prediction accuracy is comparatively high.
Keywords: Text Data Mining, Comparative Cross-validation, Radial Basis Function, runtime, accuracy.
12880 ATC in Competitive Electricity Market Using TCSC
Authors: S. K. Gupta, Richa Bansal
Abstract:
In a deregulated power system structure, power producers and customers share a common transmission network for wheeling power from the point of generation to the point of consumption. All parties in this open access environment may try to purchase energy from the cheaper source for greater profit margins, which may lead to overloading and congestion of certain corridors of the transmission network. This may result in violation of line flow, voltage and stability limits and thereby undermine system security. Utilities therefore need to determine their available transfer capability (ATC) adequately to ensure that system reliability is maintained while serving a wide range of bilateral and multilateral transactions. This paper presents power transfer distribution factors based on AC load flow for the determination and enhancement of ATC. The study has been carried out for the IEEE 24-bus Reliability Test System.
Keywords: Available Transfer Capability, FACTS devices, Power Transfer Distribution Factors.
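A minimal sketch of the PTDF-based idea described above: for a candidate transaction, the transfer limit imposed by each line is its remaining capacity divided by its power transfer distribution factor, and ATC is the smallest such limit. The line data and PTDFs are made up, not the IEEE 24-bus study values.

```python
# PTDF-based ATC for one transaction (illustrative line data).
def atc_from_ptdf(lines):
    """lines: list of (limit_MW, base_flow_MW, ptdf) for one transaction."""
    transfer_limits = []
    for limit, flow, ptdf in lines:
        if abs(ptdf) < 1e-6:
            continue                                        # line unaffected by the transfer
        if ptdf > 0:
            transfer_limits.append((limit - flow) / ptdf)   # flow grows towards +limit
        else:
            transfer_limits.append((-limit - flow) / ptdf)  # flow grows towards -limit
    return min(transfer_limits)

lines = [
    (175.0, 120.0, 0.42),    # (thermal limit, present flow, PTDF) per line
    (200.0, 150.0, 0.10),
    (175.0, -60.0, -0.25),
]
print(f"ATC for this transaction: {atc_from_ptdf(lines):.1f} MW")
```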
12879 Research Trend Analysis – A Sample in the Field of Information Systems
Authors: Hei-Chia Wang, Wei-Pin Chiu
Abstract:
As research performance in academia is treated as one of the indices of national competency, many countries devote much attention and resources to increasing their research performance. Understanding research trends is the basic step to improving research performance. The goal of this research is to design an analysis system to evaluate research trends by analyzing data from different countries. In this paper, information systems research in Taiwan and other countries, including Asian countries and the prominent countries represented by the Group of Eight (G8), is used as an example. Our research found that the trends vary across countries. Our research suggests that Taiwan's scholars can pay more attention to interdisciplinary applications and try to increase their collaboration with other countries, in order to increase Taiwan's competency in the area of information science.
Keywords: Bibliometric analysis, research trend, scientometric analysis.
12878 CNet Module Design of IMCS
Authors: Youkyung Park, SeungYup Kang, SungHo Kim, SimKyun Yook
Abstract:
IMCS is an Integrated Monitoring and Control System for thermal power plants. This system consists mainly of two parts: controllers and the OIS (Operator Interface System). These two parts are connected by Ethernet-based communication. The controller side of the communication is managed by the CNet module and the OIS side is managed by the data server of the OIS. The CNet module sends the controller's data to the data server and receives command data from the data server. To minimize or balance the load on the data server, this module buffers the data created by the controller at every cycle and sends the buffered data to the data server on request. For multiple data servers, this module manages the connection line with each data server and responds to each request from the multiple data servers. The CNet module is included in each controller of the redundant system. When a controller fail-over happens in the redundant system, this module can provide the controller's data to the data server without loss. This paper presents three main features of the CNet module – separation of the get task, usage of a ring buffer, and monitoring of communication status – to carry out these functions.
Keywords: Ethernet communication, DCS, power plant, ring buffer, data integrity
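A minimal ring-buffer sketch of the buffering scheme described above: the producer (controller cycle) appends the newest samples, overwriting the oldest when full, and the consumer (data-server request) drains whatever has accumulated. This is an illustrative stand-in, not the IMCS implementation.

```python
# Ring buffer between a cyclic producer and an on-demand consumer.
from collections import deque

class RingBuffer:
    def __init__(self, capacity):
        self._buf = deque(maxlen=capacity)   # deque drops oldest items when full

    def push(self, sample):
        """Called every controller cycle with the latest data record."""
        self._buf.append(sample)

    def drain(self):
        """Called on a data-server request; returns and clears buffered data."""
        items = list(self._buf)
        self._buf.clear()
        return items

rb = RingBuffer(capacity=4)
for cycle in range(6):                        # 6 cycles but room for only 4
    rb.push({"cycle": cycle, "value": cycle * 1.5})
print(rb.drain())                             # the two oldest cycles were overwritten
```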