Search results for: real time control
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10306

4876 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principles in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in “weight space”, where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning implemented by a sparse autoencoder learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single dialect region. Each speaker has 10 sentences; two are used for training and eight for testing. Atomic index probabilities are created for each training sentence and for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB and 30 dB. The algorithm has a baseline classification accuracy of ~93% averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short training and test sequences as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN, remaining at ~93% at 0 dB SNR.
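The classification stage described above can be illustrated with a minimal, self-contained sketch. The Python snippet below is not the authors' implementation: the dictionary is built from randomly parameterized Gabor-like atoms rather than autoencoder-learned envelope features, and the two "speakers" are synthetic signals; it only shows matching-pursuit atomic decomposition, atom-index probability vectors, and nearest-profile classification by Euclidean distance.

```python
import numpy as np

def gabor_dictionary(n_samples, n_atoms, rng):
    """Random Gabor-like atoms: Gaussian-windowed sinusoids with unit norm."""
    t = np.arange(n_samples)
    atoms = []
    for _ in range(n_atoms):
        f = rng.uniform(0.01, 0.45)          # normalized frequency
        c = rng.uniform(0, n_samples)        # center
        s = rng.uniform(8, n_samples / 4)    # width
        g = np.exp(-0.5 * ((t - c) / s) ** 2) * np.cos(2 * np.pi * f * t)
        atoms.append(g / np.linalg.norm(g))
    return np.array(atoms)                   # shape (n_atoms, n_samples)

def matching_pursuit(x, D, n_iter=30):
    """Greedy atomic decomposition; returns selected atom indices and weights."""
    residual = x.astype(float).copy()
    idx, w = [], []
    for _ in range(n_iter):
        corr = D @ residual
        k = int(np.argmax(np.abs(corr)))
        idx.append(k)
        w.append(corr[k])
        residual -= corr[k] * D[k]
    return np.array(idx), np.array(w)

def index_probabilities(signals, D):
    """Normalized histogram of atom indices pooled over a speaker's sentences."""
    counts = np.zeros(D.shape[0])
    for x in signals:
        idx, _ = matching_pursuit(x, D)
        np.add.at(counts, idx, 1.0)
    return counts / counts.sum()

# toy demo: two "speakers" with different dominant frequencies
rng = np.random.default_rng(0)
D = gabor_dictionary(256, 200, rng)
t = np.arange(256)
make = lambda f: np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(256)
train = {"A": [make(0.05) for _ in range(2)], "B": [make(0.20) for _ in range(2)]}
profiles = {spk: index_probabilities(sigs, D) for spk, sigs in train.items()}

test = make(0.05)                             # unseen sample from speaker "A"
p_test = index_probabilities([test], D)
pred = min(profiles, key=lambda s: np.linalg.norm(profiles[s] - p_test))
print("predicted speaker:", pred)
```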

Keywords: Time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder.

4875 Hybrid Knowledge Approach for Determining Health Care Provider Specialty from Patient Diagnoses

Authors: Erin Lynne Plettenberg, Jeremy Vickery

Abstract:

In an access-control situation, the role of a user determines whether a data request is appropriate. This paper combines vetted web mining and logic modeling to build a lightweight system for determining the role of a health care provider based only on their prior authorized requests. The model identifies provider roles with 100% recall from very little data. This shows the value of vetted web mining in AI systems, and suggests the impact of the ICD classification on medical practice.

Keywords: Ontology, logic modeling, electronic medical records, information extraction, vetted web mining.

4874 Flow Modeling and Runner Design Optimization in Turgo Water Turbines

Authors: John S. Anagnostopoulos, Dimitrios E. Papantonis

Abstract:

The incorporation of computational fluid dynamics in the design of modern hydraulic turbines appears to be necessary in order to improve their efficiency and cost-effectiveness beyond the traditional design practices. A numerical optimization methodology is developed and applied in the present work to a Turgo water turbine. The fluid is simulated by a Lagrangian mesh-free approach that can provide detailed information on the energy transfer and enhance the understanding of the complex, unsteady flow field, at very small computing cost. The runner blades are initially shaped according to hydrodynamics theory, and parameterized using Bezier polynomials and interpolation techniques. The use of a limited number of free design variables allows for various modifications of the standard blade shape, while stochastic optimization using evolutionary algorithms is implemented to find the best blade that maximizes the attainable hydraulic efficiency of the runner. The obtained optimal runner design achieves considerably higher efficiency than the standard one, and its numerically predicted performance is comparable to a real Turgo turbine, verifying the reliability and the prospects of the new methodology.
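To make the optimization loop concrete, the sketch below pairs a Bezier-parameterized camber line with a simple evolution-strategy search. The efficiency function is a made-up placeholder standing in for the Lagrangian mesh-free flow solver, and the control-point layout and bounds are assumptions; only the overall parameterize-evaluate-evolve structure mirrors the methodology described above.

```python
import numpy as np
from math import comb

def bezier(ctrl, n=50):
    """Evaluate a Bezier curve defined by control points ctrl (k x 2)."""
    k = len(ctrl) - 1
    t = np.linspace(0.0, 1.0, n)[:, None]
    basis = [comb(k, i) * t**i * (1 - t)**(k - i) for i in range(k + 1)]
    return sum(b * p for b, p in zip(basis, ctrl))

def efficiency(free_vars):
    """Placeholder objective standing in for the flow solver: rewards a
    smooth camber line with a moderate maximum camber (values assumed)."""
    ctrl = np.array([[0, 0], [0.3, free_vars[0]], [0.7, free_vars[1]], [1, 0.1]])
    curve = bezier(ctrl)
    camber = curve[:, 1].max()
    roughness = np.sum(np.diff(curve[:, 1], 2) ** 2)
    return -((camber - 0.25) ** 2) - 50 * roughness      # higher is better

def evolve(n_gen=60, pop=20, sigma=0.05, rng=np.random.default_rng(1)):
    """(1+lambda)-style evolution strategy over the free design variables."""
    best = rng.uniform(0.0, 0.4, size=2)
    best_f = efficiency(best)
    for _ in range(n_gen):
        children = best + sigma * rng.standard_normal((pop, 2))
        f = np.array([efficiency(c) for c in children])
        if f.max() > best_f:
            best, best_f = children[int(np.argmax(f))], f.max()
    return best, best_f

x, fx = evolve()
print("best free design variables:", x, "objective:", fx)
```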

Keywords: Turgo turbine, Lagrangian flow modeling, Surface parameterization, Design optimization, Evolutionary algorithms.

4873 A 1H NMR-Linked PCR Modelling Strategy for Tracking the Fatty Acid Sources of Aldehydic Lipid Oxidation Products in Culinary Oils Exposed to Simulated Shallow-Frying Episodes

Authors: Martin Grootveld, Benita Percival, Sarah Moumtaz, Kerry L. Grootveld

Abstract:

Objectives/Hypotheses: The adverse health effect potential of dietary lipid oxidation products (LOPs) has evoked much clinical interest. Therefore, we employed a 1H NMR-linked Principal Component Regression (PCR) chemometrics modelling strategy to explore relationships between data matrices comprising (1) aldehydic LOP concentrations generated in culinary oils/fats when exposed to laboratory-simulated shallow-frying practices, and (2) the prior saturated (SFA), monounsaturated (MUFA) and polyunsaturated fatty acid (PUFA) contents of such frying media (FM), together with their heating time-points at a standard frying temperature (180 °C). Methods: Corn, sunflower, extra virgin olive, rapeseed, linseed, canola, coconut and MUFA-rich algae frying oils, together with butter and lard, were heated according to laboratory-simulated shallow-frying episodes at 180 °C, and FM samples were collected at time-points of 0, 5, 10, 20, 30, 60, and 90 min (n = 6 replicates per sample). Aldehydes were determined by 1H NMR analysis (Bruker AV 400 MHz spectrometer). The first (dependent output variable) PCR data matrix comprised aldehyde concentration scores vectors (PC1* and PC2*), whilst the second (predictor) one incorporated those from the fatty acid content/heating time variables (PC1-PC4) and their first-order interactions. Results: Structurally complex trans,trans- and cis,trans-alka-2,4-dienals, 4,5-epoxy-trans-2-alkenals and 4-hydroxy-/4-hydroperoxy-trans-2-alkenals (group I aldehydes predominantly arising from PUFA peroxidation) loaded strongly and positively on PC1*, whereas n-alkanals and trans-2-alkenals (group II aldehydes derived from both MUFA and PUFA hydroperoxides) loaded strongly and positively on PC2*. PCR analysis of these scores vectors (SVs) demonstrated that PCs 1 (positively-loaded linoleoylglycerols and [linoleoylglycerol]:[SFA] content ratio), 2 (positively-loaded oleoylglycerols and negatively-loaded SFAs), 3 (positively-loaded linolenoylglycerols and [PUFA]:[SFA] content ratios), and 4 (exclusively orthogonal sampling time-points) all contributed powerfully to the aldehydic PC1* SVs (p values from 10^-3 to < 10^-9), as did all PC1-3 x PC4 interactions (p values from 10^-5 to < 10^-9). PC2* was also markedly dependent on all the above PC SVs (PC2 > PC1 and PC3), and on the interactions of PC1 and PC2 with PC4 (p < 10^-9 in each case), but not on the PC3 x PC4 contribution. Conclusions: NMR-linked PCR analysis is a valuable strategy for (1) modelling the generation of aldehydic LOPs in heated cooking oils and other FM, and (2) tracking their unsaturated fatty acid (UFA) triacylglycerol sources therein.
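For readers unfamiliar with Principal Component Regression, the following generic sketch (plain NumPy, with synthetic collinear predictors standing in for the fatty acid content/heating-time matrix; it is not the authors' chemometrics pipeline) shows the two steps involved: projecting the predictors onto their leading principal components and regressing the response on the resulting scores.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal Component Regression: project predictors onto their leading
    principal components, then regress the response on the scores."""
    X_mean = X.mean(axis=0)
    Xc = X - X_mean
    # principal axes via SVD of the centered predictor matrix
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T              # loadings (p x k)
    T = Xc @ V                           # scores   (n x k)
    # ordinary least squares of y on the scores (plus an intercept)
    A = np.column_stack([np.ones(len(y)), T])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return X_mean, V, coef

def pcr_predict(X, X_mean, V, coef):
    T = (X - X_mean) @ V
    return coef[0] + T @ coef[1:]

# toy demo with collinear predictors (as with correlated fatty acid contents)
rng = np.random.default_rng(0)
n = 60
base = rng.standard_normal((n, 2))
X = np.column_stack([base, base @ [[0.9, 0.1], [0.2, 0.8]]])  # 4 correlated columns
y = 2.0 * base[:, 0] - 1.0 * base[:, 1] + 0.1 * rng.standard_normal(n)

params = pcr_fit(X, y, n_components=2)
print("fitted vs observed (first 3):", pcr_predict(X[:3], *params), y[:3])
```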

Keywords: Frying oils, frying episodes, lipid oxidation products, cytotoxic/genotoxic aldehydes, chemometrics, principal component regression, NMR Analysis.

4872 Fuzzy Processing of Uncertain Data

Authors: Petr Morávek, Miloš Šeda

Abstract:

In practice, we often come across situations where it is necessary to make decisions based on incomplete or uncertain data. In control systems this may be due to an unknown exact mathematical model, or to its excessive complexity (e.g. nonlinearity), when it is necessary to simplify it or to solve it using a rule base. In the case of databases, when searching data we compare a similarity measure between the requirements of the selection and the stored data, where both the select query and the data itself may contain vague terms, for example in the form of linguistic qualifiers. In this paper, we focus on the processing of uncertain data in databases and demonstrate it on an example of multi-criteria decision making in the selection of variants specified by a higher number of technical parameters.
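A minimal sketch of fuzzy multi-criteria selection of variants is given below. The criteria (price, power, weight), the membership function shapes and the variant values are entirely hypothetical; the snippet only illustrates how vague requirements expressed as linguistic qualifiers can be encoded as membership functions and aggregated into a degree of satisfaction used to rank variants.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function used to encode a vague requirement."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# vague requirements on three hypothetical technical parameters
requirements = {
    "price":  lambda x: trapezoid(x, 0, 0, 300, 450),        # "cheap" (EUR)
    "power":  lambda x: trapezoid(x, 500, 700, 1e9, 1e9),     # "powerful" (W)
    "weight": lambda x: trapezoid(x, 0, 0, 2.0, 3.5),         # "light" (kg)
}

variants = {
    "V1": {"price": 280, "power": 650, "weight": 2.4},
    "V2": {"price": 420, "power": 900, "weight": 3.0},
    "V3": {"price": 350, "power": 760, "weight": 1.8},
}

def satisfaction(variant):
    """Degree to which a variant satisfies all vague requirements.
    The minimum acts as a fuzzy AND; a weighted mean is also common."""
    degrees = [requirements[k](v) for k, v in variant.items()]
    return min(degrees)

ranked = sorted(variants, key=lambda name: satisfaction(variants[name]), reverse=True)
for name in ranked:
    print(name, round(satisfaction(variants[name]), 3))
```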

Keywords: fuzzy logic, linguistic variable, multicriteria decision

4871 A Model to Study the Effect of Excess Buffers and Na+ Ions on Ca2+ Diffusion in Neuron Cell

Authors: Vikas Tewari, Shivendra Tewari, K. R. Pardasani

Abstract:

Calcium is a vital second messenger used in signal transduction. Calcium controls secretion, cell movement, muscular contraction, cell differentiation, ciliary beating and so on. Two theories have been used to simplify the system of reaction-diffusion equations of calcium into a single equation. One is the excess buffer approximation (EBA), which assumes that the mobile buffer is present in excess and cannot be saturated. The other is the rapid buffer approximation (RBA), which assumes that calcium binding to buffer is rapid compared to the calcium diffusion rate. In the present work, an attempt has been made to develop a model for calcium diffusion under the excess buffer approximation in neuron cells. This model incorporates the effect of [Na+] influx on [Ca2+] diffusion, variable calcium and sodium sources, the sodium-calcium exchange protein, the sarcolemmal calcium ATPase pump, and sodium and calcium channels. The proposed mathematical model leads to a system of partial differential equations which have been solved numerically using the Forward Time Centered Space (FTCS) approach. The numerical results have been used to study the relationships among different types of parameters such as buffer concentration, association rate, and calcium permeability.
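A heavily simplified, one-dimensional FTCS sketch is shown below. The parameter values are illustrative, and the Na+ coupling, exchanger, pump and channel terms of the full model are lumped into a single linear uptake term and a fixed source, so this only demonstrates the forward-time centred-space update and its stability constraint, not the authors' model.

```python
import numpy as np

# Minimal 1-D FTCS sketch of buffered calcium diffusion (illustrative values)
D = 250.0            # Ca2+ diffusion coefficient, um^2/s (assumed)
k_buf = 40.0         # lumped EBA buffering/uptake rate, 1/s (assumed)
c_rest = 0.1         # resting [Ca2+], uM
L, nx = 10.0, 101    # domain length (um) and number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / (2 * D)   # respects the FTCS stability limit dt <= dx^2 / (2D)
nt = 2000

c = np.full(nx, c_rest)
source = np.zeros(nx)
source[1] = 50.0             # localized Ca2+ influx near x = 0 (uM/s), e.g. a channel

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    c = c + dt * (D * lap - k_buf * (c - c_rest) + source)
    c[0] = c[1]              # no-flux (reflecting) boundary at x = 0
    c[-1] = c_rest           # far boundary clamped at the resting level

print("[Ca2+] profile near the source (uM):", c[:5].round(4))
```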

Keywords: Excess buffer approximation, Na+ influx, sodium-calcium exchange protein, sarcolemmal calcium ATPase pump, forward time centred space.

4870 Combination of Standard Secondary Raw Materials and New Production Waste Materials in Green Concrete Technology

Authors: M. Tazky, R. Hela, P. Novosad, L. Osuska

Abstract:

This paper deals with the possibility of safely incorporating fluidised bed combustion fly ash (a waste material) into a cement matrix together with another commonly used secondary raw material, high-temperature fly ash. Both of these materials have a very high pozzolanic ability, and the right combination could bring important improvements in both the physico-mechanical properties and the durability of a cement composite. This paper tries to determine the correct methodology for designing green concrete by using modern methods of measuring the rheology of fresh concrete and following the hydration processes. The use of fluidised bed combustion fly ash as an admixture in cement composite production is not currently common, but there are some real possibilities for its potential. The most striking negative aspect is its chemical composition, which supports the formation of new products that influence the durability of the composite. Another disadvantage is the morphology of the grains, which has a negative effect on consistency. This raises the question of how this waste can be used in concrete production to emphasize its positive properties and eliminate the negative ones. The focal point of the experiment carried out on cement pastes was particularly the progress of hydration processes, aiming for a possible acceleration of the pozzolanic reactions of both types of fly ash.

Keywords: High-temperature fly ash, fluidised bed combustion fly ash, pozzolanic, CaO (calcium oxide), rheology.

4869 Backplane Serial Signaling and Protocol for Telecom Systems

Authors: Ali Poureslami, Hossein Borhanifar, Seyed Ali Alavian

Abstract:

In this paper, we implement a modern serial backplane platform for telecommunication inter-rack systems. To combine high reliability with low protocol cost, we applied the high-level data link control (HDLC) protocol over a low-voltage differential signaling (LVDS) bus for card-to-card communication over the backplane. HDLC is a high-performance protocol with several operation modes and is widely used in telecommunication systems. The LVDS bus offers high reliability with high immunity against electromagnetic interference (EMI) and noise.
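As a small illustration of the HDLC transparency mechanism that keeps payload data from imitating the frame delimiter, the sketch below implements bit stuffing and de-stuffing only; CRC generation, the HDLC operation modes and the LVDS physical layer are outside its scope.

```python
FLAG = [0, 1, 1, 1, 1, 1, 1, 0]   # HDLC frame delimiter 0x7E

def bit_stuff(bits):
    """Insert a 0 after every run of five consecutive 1s in the payload,
    so the payload can never imitate the 01111110 flag sequence."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:
            out.append(0)
            run = 0
    return out

def bit_unstuff(bits):
    """Remove the stuffed 0 that follows every run of five 1s
    (assumes a correctly stuffed stream, i.e. no flags or aborts inside)."""
    out, run, skip = [], 0, False
    for b in bits:
        if skip:
            skip = False
            run = 0
            continue
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:
            skip = True      # next bit is the stuffed 0 -> drop it
    return out

payload = [1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1]
frame = FLAG + bit_stuff(payload) + FLAG
assert bit_unstuff(bit_stuff(payload)) == payload
print("frame bits:", frame)
```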

Keywords: Backplane, BLVDS, HDLC, EMI, I2C, LCT, OSC, SFP, SNMP.

4868 Development of Integrated GIS Interface for Characteristics of Regional Daily Flow

Authors: Ju Young Lee, Jung-Seok Yang, Jaeyoung Choi

Abstract:

This paper primarily intends to develop a GIS interface for estimating sequences of stream-flows at ungauged stations based on known flows at gauged stations. The integrated GIS interface is composed of three major steps. The first, characterising precipitation using statistical analysis, is the procedure for building a multiple linear regression equation to obtain the long-term mean daily flow at ungauged stations. The independent variables in the regression equation are mean daily flow and drainage area. Traditionally, mean flow data are generated using the Thiessen polygon method; however, the method for obtaining mean flow data can be selected by the user, such as Kriging, IDW (Inverse Distance Weighted) and Spline methods as well as other traditional methods. In the second step, the flow duration curve (FDC) at an ungauged station is computed from the FDCs at gauged stations, and the mean annual daily flow is then computed by a spatial interpolation algorithm. The third step is to obtain watershed/topographic characteristics, which are the most important factors governing stream-flows. In summary, the simulated daily flow time series are compared with observed time series. The results obtained using the integrated GIS interface are closely similar and well fitted to each other. Also, the relationship between the topographic/watershed characteristics and the stream flow time series is highly correlated.
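Of the interpolation choices mentioned (Thiessen polygons, Kriging, IDW, Spline), IDW is the simplest to sketch. The snippet below uses made-up station coordinates and flow values and is only meant to show how a value at an ungauged location is estimated from gauged stations.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse Distance Weighted interpolation of point data (e.g. mean
    daily flow or precipitation at gauged stations) onto query locations."""
    xy_known = np.asarray(xy_known, float)
    values = np.asarray(values, float)
    out = []
    for q in np.atleast_2d(xy_query).astype(float):
        d = np.linalg.norm(xy_known - q, axis=1)
        if np.any(d < 1e-12):                 # query coincides with a station
            out.append(values[np.argmin(d)])
            continue
        w = 1.0 / d**power
        out.append(np.sum(w * values) / np.sum(w))
    return np.array(out)

# hypothetical gauged stations (x, y in km) and long-term mean daily flows (m^3/s)
stations = [(0, 0), (10, 0), (0, 10), (10, 10)]
flows = [5.2, 7.8, 4.1, 6.5]

# estimate at an ungauged location
print(idw(stations, flows, [(4, 6)]))
```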

Keywords: Integrated GIS interface, spatial interpolation algorithm, FDC.

4867 Design of MBMS Client Functions in the Mobile

Authors: Jaewook Shin, Aesoon Park

Abstract:

MBMS is a unidirectional point-to-multipoint bearer service in which data are transmitted from a single source entity to multiple recipients. For a mobile to support the MBMS, MBMS client functions as well as MBMS radio protocols should be designed and implemented. In this paper, we analyze the MBMS client functions and describe the implementation of them in our mobile test-bed. User operations and signaling flows between protocol entities to control the MBMS functions are designed in detail. Service announcement utilizing the file download MBMS service and four MBMS user services are demonstrated in the test-bed to verify the MBMS client functions.

Keywords: BM-SC, Broadcast, MBMS, Mobile, Multicast.

4866 Probabilistic Method of Wind Generation Placement for Congestion Management

Authors: S. Z. Moussavi, A. Badri, F. Rastegar Kashkooli

Abstract:

Wind farms (WFs) with a high level of penetration are being established in power systems worldwide more rapidly than other renewable resources. The Independent System Operator (ISO), as a policy maker, should propose appropriate places for WF installation in order to maximize the benefits for the investors. There is also a possibility of congestion relief using newly installed WFs, which should be taken into account by the ISO when proposing locations for WF installation. In this context, an efficient wind farm (WF) placement method is proposed in order to reduce burdens on congested lines. Since the wind speed is a random variable and load forecasts also contain uncertainties, probabilistic approaches are used for this type of study. An AC probabilistic optimal power flow (P-OPF) is formulated and solved using Monte Carlo Simulations (MCS). In order to reduce computation time, point estimate methods (PEM) are introduced as an efficient alternative to time-demanding MCS. Subsequently, the optimal WF placement is determined using generation shift distribution factors (GSDF), considering a new parameter entitled the wind availability factor (WAF). In order to obtain more realistic results, N-1 contingency analysis is employed to find the optimal size of the WF by means of line outage distribution factors (LODF). The IEEE 30-bus test system is used to show and compare the accuracy of the proposed methodology.
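The Monte Carlo treatment of wind uncertainty can be sketched as follows. The Weibull parameters and the piecewise power curve are assumed values, and the quantity computed here is a simple capacity-factor-style ratio used as a stand-in for the paper's wind availability factor (WAF); the P-OPF, GSDF and LODF computations are not reproduced.

```python
import numpy as np

def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
    """Piecewise wind-turbine power curve (MW); parameters are illustrative."""
    p = np.zeros_like(v)
    ramp = (v >= v_in) & (v < v_rated)
    p[ramp] = p_rated * (v[ramp] - v_in) / (v_rated - v_in)
    p[(v >= v_rated) & (v <= v_out)] = p_rated
    return p

def wind_availability_factor(n_samples=100_000, k=2.0, c=8.5, seed=0):
    """Monte Carlo estimate of expected output / rated output for a candidate
    site, sampling wind speed from a Weibull(k, c) distribution."""
    rng = np.random.default_rng(seed)
    v = c * rng.weibull(k, size=n_samples)    # numpy's weibull has unit scale
    p = turbine_power(v)
    return p.mean() / 2.0                     # normalize by p_rated = 2 MW

print("estimated wind availability factor:", round(wind_availability_factor(), 3))
```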

Keywords: Probabilistic optimal power flow, Wind power, Point estimate methods, Congestion management

4865 Numerical Analysis of Thermal Conductivity of Non-Charring Material Ablation (Carbon-Carbon and Graphite), Considering Chemical Reaction Effects, Mass Transfer and Surface Heat Transfer

Authors: H. Mohammadiun, A. Kianifar, A. Kargar

Abstract:

Nowadays, there is little information concerning heat shield systems, and this information is not completely reliable for use in many cases; for example, precise calculations cannot be done for various materials. In addition, real-scale tests have two disadvantages: high cost and low flexibility, and for each case a new test must be performed. Hence, a numerical modeling program that calculates the surface recession rate and interior temperature distribution is necessary. In this work, a numerical solution of the governing equation for non-charring material ablation is presented in order to anticipate the recession rate and the heat response of non-charring heat shields. The governing equation is nonlinear, and the Newton-Raphson method along with the TDMA algorithm is used to solve this nonlinear equation system. Using the Newton-Raphson method to solve the governing equation is one of the advantages of the solution method, because this method is simple and can easily be generalized to more difficult problems. The obtained results were compared with reliable sources in order to examine the accuracy of the compiled code.
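Within a Newton-Raphson outer iteration, each linearized step of a one-dimensional discretization produces a tridiagonal system, which is what the TDMA (Thomas) algorithm solves. The sketch below shows only that inner solver, verified against a dense solve on a small random diagonally dominant system; it is not the authors' ablation code.

```python
import numpy as np

def tdma(a, b, c, d):
    """Thomas algorithm for a tridiagonal system:
    a = sub-diagonal (len n-1), b = diagonal (len n),
    c = super-diagonal (len n-1), d = right-hand side (len n)."""
    n = len(b)
    cp = np.zeros(n - 1)
    dp = np.zeros(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# quick check against a dense solver on a diagonally dominant test system
n = 6
rng = np.random.default_rng(0)
a = rng.uniform(-1, 0, n - 1)
b = rng.uniform(4, 5, n)
c = rng.uniform(-1, 0, n - 1)
d = rng.uniform(-1, 1, n)
A = np.diag(b) + np.diag(a, -1) + np.diag(c, 1)
assert np.allclose(tdma(a, b, c, d), np.linalg.solve(A, d))
print("TDMA matches the dense solve")
```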

Keywords: Ablation rate, surface recession, interior temperature distribution, non-charring material ablation, Newton-Raphson method.

4864 Curing Methods Yield Multiple Refractive Index of Benzocyclobutene Polymer Film

Authors: N.A.M. Yahya, W.H. Lim, S.W. Phang, H. Ahmad, R. Zakaria, F.R. Mahamd Adikan

Abstract:

Refractive index control of benzocyclobutene (BCB 4024-40) is achieved by applying different conditions during the thermal curing of the BCB film. A refractive index (RI) change of 1.49% is obtained when the BCB film is cured using an oven, while the RI change is 0.1% when the BCB is cured using a hotplate. The two different curing methods exhibit a temperature-dependent refractive index change of the BCB photosensitive polymer. By carefully controlling the curing conditions, multiple layers of BCB with different RI can be fabricated, which can then be applied in the fabrication of optical waveguides.

Keywords: BCB 4024-40, curing method, multiple refractive index, polymers.

4863 Trajectory Planning Design Equations and Control of a 4-Axis Stationary Robotic Arm

Authors: T.C. Manjunath

Abstract:

This paper features the trajectory planning design of an indigenously developed 4-axis SCARA robot which is used for performing robotic manipulation tasks in the laboratory. Once a trajectory is designed and given as input to the robot, the robot's gripper tip moves along that specified trajectory. Trajectories have to be designed in the work space only. The main idea of this paper is to design a continuous-path trajectory model for the indigenously developed SCARA robot arm during its maneuvering from one point to another (during pick and place operations) in the workspace, avoiding all the obstacles in its path of motion.
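A common building block for such continuous-path planning is a point-to-point polynomial time scaling with zero boundary velocities. The sketch below applies a cubic profile to all four joints simultaneously; the joint values and motion time are hypothetical, and obstacle avoidance is not addressed.

```python
import numpy as np

def cubic_trajectory(q0, qf, T, n=50):
    """Point-to-point joint trajectory with zero start/end velocity:
    q(t) = q0 + 3(qf-q0)(t/T)^2 - 2(qf-q0)(t/T)^3."""
    t = np.linspace(0.0, T, n)
    s = 3 * (t / T) ** 2 - 2 * (t / T) ** 3           # normalized time scaling
    q = q0 + (qf - q0) * s[:, None]
    qdot = (qf - q0) * (6 * t / T**2 - 6 * t**2 / T**3)[:, None]
    return t, q, qdot

# hypothetical 4-axis SCARA move: [theta1, theta2, d3, theta4]
q_start = np.array([0.0, 0.0, 0.00, 0.0])
q_goal = np.array([0.8, -0.5, 0.05, 1.2])     # rad, rad, m, rad
t, q, qdot = cubic_trajectory(q_start, q_goal, T=2.0)
print("mid-point joint values:", q[len(q) // 2].round(3))
print("end velocity (should be ~0):", qdot[-1].round(6))
```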

Keywords: SCARA, Trajectory, Planning.

4862 Neuro-Fuzzy Algorithm for a Biped Robotic System

Authors: Hataitep Wongsuwarn, Djitt Laowattana

Abstract:

This paper summarizes basic principles and concepts of intelligent control implemented in humanoid robotics, as well as recent algorithms being devised for advanced control of humanoid robots. Secondly, this paper presents a new neuro-fuzzy approach. We have included some simulation results from our computational intelligence technique that will be applied to our humanoid robot. Subsequently, we determine a relationship between joint trajectories and the forces located on the robot's foot through the proposed neuro-fuzzy technique.

Keywords: Biped Robot, Computational Intelligence, Static and Dynamic Walking, Gait Synthesis, Neuro-Fuzzy System.

4861 Application of UV-C Irradiation on Quality and Textural Properties of Button Mushrooms

Authors: M. Ghasemi-Varnamkhasti, S. H. Yoosefian, A. Mohammad-Razdari

Abstract:

The effect of 1.0 kJ/m2 Ultraviolet-C (UV-C) light on the pH, weight loss, color, and firmness of button mushroom (Agaricus bisporus) tissues during 21 days of storage at 4 °C was studied. UV-C irradiation enhanced the pH, weight, color parameters, and firmness of the mushrooms during storage compared to the control treatment. However, application of the 1.0 kJ/m2 UV-C treatment induced changes in weight loss, firmness, and pH of 14.53%, 49.82%, and 10.39%, respectively. These results suggest that the application of UV-C irradiation could be an effective method to maintain the postharvest quality of mushrooms.

Keywords: Mushroom, polyethylene film, quality, UV-C irradiation.

4860 A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data

Authors: P. Kaladevi, N. Giridharan

Abstract:

The system for analyzing and eliciting public grievances serves its main purpose of receiving and processing all sorts of complaints from the public and responding to users. Due to the large number of complaints, the data becomes big data, which is difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process it. The concept of caching was applied in the system to provide immediate responses and timely action using big data analytics; cache-enabled big data improves the response time of the system. The unstructured data provided by the users are efficiently handled through the MapReduce algorithm. The processing of complaints takes place in the order of the hierarchy of authority. The drawbacks of the traditional database system used in the existing system are addressed by our system through the cache-enabled Hadoop Distributed File System. MapReduce framework code can possibly leak sensitive data through the computation process, so we propose a system that adds noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If a complaint is not processed in ample time, it is automatically forwarded to the higher authority, so processing is assured. A copy of the filed complaint is sent as a digitally signed PDF document to the user's mail id, which serves as proof. The system report serves as essential data when making important decisions based on legislation.
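The noise-adding reduce phase can be illustrated with a toy in-process map/shuffle/reduce over a handful of made-up complaint records. The Laplace perturbation below is a differential-privacy-style stand-in for the paper's idea of adding noise to the reduce output; the actual Hadoop/HDFS deployment, hierarchy-based routing and PDF receipts are not modelled.

```python
import numpy as np
from collections import defaultdict

# Hypothetical complaint records
complaints = [
    {"dept": "water", "text": "pipe burst on 5th street"},
    {"dept": "water", "text": "no supply since monday"},
    {"dept": "roads", "text": "pothole near school"},
    {"dept": "power", "text": "frequent outages at night"},
    {"dept": "roads", "text": "broken streetlight"},
]

def map_phase(record):
    """Emit (department, 1) for every complaint record."""
    yield record["dept"], 1

def reduce_phase(key, values, rng, scale=1.0):
    """Sum the counts for one key and perturb the total with Laplace noise
    so the released count does not signal the presence of a single record."""
    return key, sum(values) + rng.laplace(0.0, scale)

# "shuffle": group mapper output by key, as the MapReduce framework would
groups = defaultdict(list)
for rec in complaints:
    for k, v in map_phase(rec):
        groups[k].append(v)

rng = np.random.default_rng(42)
noisy_counts = dict(reduce_phase(k, vs, rng) for k, vs in groups.items())
print(noisy_counts)
```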

Keywords: Big Data, Hadoop, HDFS, Caching, MapReduce, web personalization, e-governance.

4859 Solubility of Water in CO2 Mixtures at Pipeline Operation Conditions

Authors: Mohammad Ahmad, Sander Gersen, Erwin Wilbers

Abstract:

Carbon capture, transport and underground storage have become a major solution to reduce CO2 emissions from power plants and other large CO2 sources. A large part of this captured CO2 stream is transported at high-pressure, dense-phase conditions and stored in offshore underground depleted oil and gas fields. CO2 is also transported in offshore pipelines to be used for enhanced oil and gas recovery. The captured CO2 stream with impurities may contain water, which causes severe corrosion problems and flow assurance failures and might damage valves and instrumentation. Thus, free water formation should be strictly prevented. The purpose of this work is to study the solubility of water in pure CO2 and in CO2 mixtures under real pipeline pressure (90-150 bar) and temperature operating conditions (5-35°C). A setup was constructed to generate experimental data. The results show that the solubility of water in CO2 mixtures increases with increasing temperature and/or pressure. A drop in water solubility in CO2 is observed in the presence of impurities. The data generated were then used to assess the capabilities of two mixture models: the GERG-2008 model and the EOS-CG model. By generating these solubility data, this study contributes to determining the maximum allowable water content in CO2 pipelines.

Keywords: Carbon capture and storage, water solubility, equation of states.

4858 Improving the Elder's Quality of Life with Smart Television-Based Services

Authors: Van-Quang Trinh, Gi-Soo Chung, Hee-Cheol Kim

Abstract:

The increasing senior population gradually creates a demand for the use of information and communication technology to support satisfactory lives. This paper presents the development of an integrated TV-based system which offers an opportunity to provide value-added services to a large number of elderly citizens, and thus helps improve their quality of life. The design philosophy underlying this paper is to fulfill both technological and human aspects. The balance between these two dimensions is currently stressed as a crucial element in the design of usable systems for real use, particularly for the elderly, who experience physical and mental decline. As the first step to achieve this, we have identified human and social factors that affect the elder's quality of life through a literature review and, based on them, built four fundamental services: information, healthcare, learning and social network services. Secondly, the system architecture, the employed technologies and the elderly-friendly system design considerations are presented. This reflects the technological and human perspectives in terms of the system design. Finally, we describe some scenarios that illustrate the potential of the proposed system to improve elderly people's quality of life.

Keywords: Elderly people, human computer interaction, quality of life, smart television, user-centered system design

4857 An Improved Transfer Logic of the Two-Path Algorithm for Acoustic Echo Cancellation

Authors: Chang Liu, Zishu He

Abstract:

Adaptive echo cancellers with the two-path algorithm are applied to avoid false adaptation during double-talk situations. In the two-path algorithm, several transfer logic solutions have been proposed to control the filter update. This paper presents an improved transfer logic solution. It improves the convergence speed of the two-path algorithm and allows a reduction in memory elements and computational complexity. Simulation results show the improved performance of the proposed solution.
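The idea of the two-path structure is easiest to see in code: a background filter adapts on every sample while a fixed foreground filter produces the output, and a transfer-logic rule decides when to copy the background coefficients into the foreground. The sketch below uses an NLMS update and a simple block-wise ERLE comparison as the transfer rule; it is a generic illustration, not the improved logic proposed in the paper.

```python
import numpy as np

def two_path_nlms(x, d, L=64, mu=0.5, eps=1e-6, block=128, margin=1.5):
    """Two-path echo canceller sketch: the background NLMS filter adapts on
    every sample, the fixed foreground filter produces the output, and the
    transfer logic copies background -> foreground when the background ERLE
    exceeds the foreground ERLE by a margin over the last block."""
    w_fg = np.zeros(L)                      # foreground (does the cancelling)
    w_bg = np.zeros(L)                      # background (always adapting)
    e_fg2, e_bg2, d2 = [], [], []
    out = np.zeros(len(x))
    for n in range(L - 1, len(x)):
        xv = x[n - L + 1:n + 1][::-1]       # most recent L far-end samples
        e_fg = d[n] - w_fg @ xv
        e_bg = d[n] - w_bg @ xv
        w_bg += mu * e_bg * xv / (xv @ xv + eps)        # NLMS update
        out[n] = e_fg
        e_fg2.append(e_fg ** 2); e_bg2.append(e_bg ** 2); d2.append(d[n] ** 2)
        if len(d2) == block:                             # transfer-logic check
            if sum(d2) / (sum(e_bg2) + eps) > margin * sum(d2) / (sum(e_fg2) + eps):
                w_fg = w_bg.copy()
            e_fg2, e_bg2, d2 = [], [], []
    return out, w_fg

# toy echo path: white far-end signal through a short decaying impulse response
rng = np.random.default_rng(0)
x = rng.standard_normal(8000)
h = rng.standard_normal(64) * np.exp(-np.arange(64) / 10.0)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
out, w = two_path_nlms(x, d)
print("residual echo power over the last 1000 samples:",
      round(float(np.mean(out[-1000:] ** 2)), 4))
```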

Keywords: Acoustic echo cancellation, Echo return loss enhancement (ERLE), Two-path algorithm, Transfer logic

4856 Corporate Credit Rating Using Multiclass Classification Models with Order Information

Authors: Hyunchul Ahn, Kyoung-Jae Kim

Abstract:

Corporate credit rating prediction using statistical and artificial intelligence (AI) techniques has been one of the attractive research topics in the literature. In recent years, multiclass classification models such as the artificial neural network (ANN) or the multiclass support vector machine (MSVM) have become very appealing machine learning approaches due to their good performance. However, most of them have focused only on classifying samples into nominal categories, thus the unique characteristic of credit ratings - ordinality - has seldom been considered in their approaches. This study proposes new types of ANN and MSVM classifiers, named OMANN and OMSVM respectively. OMANN and OMSVM are designed to extend binary ANN or SVM classifiers by applying an ordinal pairwise partitioning (OPP) strategy. These models can handle ordinal multiple classes efficiently and effectively. To validate the usefulness of these two models, we applied them to a real-world bond rating case. We compared the results of our models to those of conventional approaches. The experimental results showed that our proposed models improve classification accuracy in comparison to typical multiclass classification techniques, with reduced computational resources.
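One common way to realize an ordinal decomposition of this kind is to train K-1 binary classifiers answering "is the rating above threshold k?" and recombine their probabilities. The sketch below does exactly that with logistic regression base learners on synthetic data; it illustrates the general OPP idea rather than the specific OMANN/OMSVM formulations.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

class OrdinalBinaryEnsemble:
    """Ordinal classification via K-1 binary sub-problems ("is the rating
    greater than k?"), one way to realize an ordinal pairwise partitioning
    strategy with any binary base learner."""

    def __init__(self, n_classes):
        self.n_classes = n_classes
        self.models = []

    def fit(self, X, y):
        self.models = []
        for k in range(self.n_classes - 1):
            m = LogisticRegression(max_iter=1000)
            m.fit(X, (y > k).astype(int))    # binary target: rating above k?
            self.models.append(m)
        return self

    def predict(self, X):
        # P(y > k) for each threshold, then convert to per-class probabilities
        pg = np.column_stack([m.predict_proba(X)[:, 1] for m in self.models])
        probs = np.zeros((len(X), self.n_classes))
        probs[:, 0] = 1 - pg[:, 0]
        for k in range(1, self.n_classes - 1):
            probs[:, k] = pg[:, k - 1] - pg[:, k]
        probs[:, -1] = pg[:, -1]
        return np.clip(probs, 0, None).argmax(axis=1)

# synthetic "bond rating" data: 4 ordered classes driven by a latent score
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 5))
latent = X @ np.array([1.0, -0.5, 0.8, 0.0, 0.3]) + 0.3 * rng.standard_normal(400)
y = np.digitize(latent, np.quantile(latent, [0.25, 0.5, 0.75]))

model = OrdinalBinaryEnsemble(n_classes=4).fit(X[:300], y[:300])
print("holdout accuracy:", (model.predict(X[300:]) == y[300:]).mean())
```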

Keywords: Artificial neural network, Corporate credit rating, Support vector machines, Ordinal pairwise partitioning

4855 Application of Neural Network in User Authentication for Smart Home System

Authors: A. Joseph, D.B.L. Bong, D.A.A. Mat

Abstract:

Security has been an important issue and concern in smart home systems. Since smart home networks consist of a wide range of wired or wireless devices, there is a possibility that illegal access to some restricted data or devices may happen. Password-based authentication is widely used to identify authorized users, because this method is cheap, easy and quite accurate. In this paper, a neural network is trained to store the passwords instead of using a verification table. This method is useful in solving security problems that occur in some authentication systems. The conventional way to train the network using backpropagation (BPN) requires a long training time. Hence, a faster training algorithm, Resilient Backpropagation (RPROP), is embedded into the MLP neural network to accelerate the training process. For the data part, 200 sets of user IDs and passwords were created and encoded into binary as the input. Simulations were carried out to evaluate the performance for different numbers of hidden neurons and combinations of transfer functions. The mean square error (MSE), training time and number of epochs are used to determine the network performance. From the results obtained, using Tansig and Purelin in the hidden and output layers with 250 hidden neurons gave the best performance. As a result, a password-based user authentication system for smart homes using a neural network has been developed successfully.
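One plausible reading of "storing passwords in a network" is sketched below: the binary-encoded user ID is mapped to the binary-encoded password by an MLP with a tanh (tansig-like) hidden layer and linear (purelin-like) output. The credentials are synthetic, scikit-learn has no RPROP so its default solver stands in, and the agreement score printed at the end would need a tuned threshold before being used for acceptance decisions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def to_bits(s, n_bytes=8):
    """Encode a string as a fixed-length 64-bit binary vector."""
    raw = s.encode()[:n_bytes].ljust(n_bytes, b"\0")
    return np.unpackbits(np.frombuffer(raw, dtype=np.uint8)).astype(float)

# hypothetical credential table (the paper used 200 user ID / password pairs)
rng = np.random.default_rng(0)
users = {f"user{i:03d}": f"pw{rng.integers(10**6):06d}" for i in range(200)}

X = np.array([to_bits(u) for u in users])              # user IDs as input
Y = np.array([to_bits(p) for p in users.values()])     # passwords as target

# tanh hidden layer ~ MATLAB's tansig, identity output ~ purelin
net = MLPRegressor(hidden_layer_sizes=(250,), activation="tanh",
                   max_iter=2000, random_state=0).fit(X, Y)

def recall_score(user_id, password):
    """Fraction of password bits the network reproduces for this user ID;
    an acceptance threshold on this score would be tuned on held-out data."""
    recalled = np.round(net.predict(to_bits(user_id).reshape(1, -1))[0])
    return float(np.mean(recalled == to_bits(password)))

print("genuine attempt :", recall_score("user007", users["user007"]))
print("impostor attempt:", recall_score("user007", "wrongpass"))
```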

Keywords: Neural Network, User Authentication, Smart Home, Security

4854 Efficient Utilization of Biomass for Bioenergy in Environmental Control

Authors: Subir Kundu, Sukhendra Singh, Sumedha Ojha, Kanika Kundu

Abstract:

The continuous decline of petroleum and natural gas reserves and the nonlinear rise of oil prices have brought about a realisation of the need for a change in our perpetual dependence on fossil fuels. The day-to-day increase in the consumption of crude and petroleum products has made a considerable impact on our foreign exchange reserves. Hence, an alternate resource for the conversion of energy (both liquid and gas) is essential for the substitution of conventional fuels. Biomass is the alternate solution for the present scenario, as biomass can be converted into both liquid and gaseous fuels as well as other feedstocks for industry.

Keywords: Bioenergy, Biomass conversion, Biorefining, Efficient utilisation of night soil.

4853 The Enhancement of Training of Military Pilots Using Psychophysiological Methods

Authors: G. Kloudova, M. Stehlik

Abstract:

Optimal human performance is a key goal in the professional setting of military pilots, which is a highly challenging atmosphere. The aviation environment requires substantial cognitive effort and is rich in potential stressors. Therefore, it is important to analyze variables such as mental workload to ensure safe conditions. Pilot mental workload could be measured using several tools, but most of them are very subjective. This paper details research conducted with military pilots using psychophysiological methods such as electroencephalography (EEG) and heart rate (HR) monitoring. The data were measured in a simulator as well as under real flight conditions. All of the pilots were exposed to highly demanding flight tasks and showed large individual differences in response. On that basis, an individual pattern was created for each pilot, taking into account different EEG features and heart rate variations. Later on, it was possible to distinguish the most difficult flight tasks for each pilot, which should be more extensively trained. For training purposes, an application was developed for the instructors to decide which of the specific tasks to focus on during follow-up training. This complex system can help instructors detect the mentally demanding parts of the flight and enhance the training of military pilots to achieve optimal performance.
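Typical workload-related features of the kind combined into such individual patterns can be computed with a few lines of code. The sketch below extracts theta and alpha band power from a synthetic EEG segment (Welch PSD) plus mean heart rate and RMSSD from synthetic RR intervals; the channel choice, band limits and segment length are assumptions, not the study's protocol.

```python
import numpy as np
from scipy.signal import welch

def band_power(sig, fs, band):
    """Mean PSD of a signal within a frequency band (Welch estimate),
    a typical workload-related EEG feature (e.g. theta or alpha power)."""
    f, pxx = welch(sig, fs=fs, nperseg=2 * fs)
    lo, hi = band
    return float(np.mean(pxx[(f >= lo) & (f <= hi)]))

def pilot_features(eeg, rr_intervals_ms, fs=256):
    """Small illustrative feature vector for one flight task: theta power,
    alpha power, mean heart rate and RMSSD heart-rate variability."""
    theta = band_power(eeg, fs, (4.0, 8.0))
    alpha = band_power(eeg, fs, (8.0, 13.0))
    hr = 60_000.0 / float(np.mean(rr_intervals_ms))
    rmssd = float(np.sqrt(np.mean(np.diff(rr_intervals_ms) ** 2)))
    return np.array([theta, alpha, hr, rmssd])

# synthetic 30 s segment: theta-dominant EEG plus noise, and RR intervals (~75 bpm)
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / 256)
eeg = 20 * np.sin(2 * np.pi * 6 * t) + 5 * rng.standard_normal(len(t))
rr = 800 + 30 * rng.standard_normal(120)
print(pilot_features(eeg, rr).round(2))
```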

Keywords: Cognitive effort, human performance, military pilots, psychophysiological methods.

4852 Chaotic Properties of Hemodynamic Response in Functional Near Infrared Spectroscopic Measurement of Brain Activity

Authors: Ni Ni Soe, Masahiro Nakagawa

Abstract:

Functional near infrared spectroscopy (fNIRS) is a practical non-invasive optical technique to detect characteristics of the hemoglobin density dynamics response during functional activation of the cerebral cortex. In this paper, fNIRS measurements were made over the motor cortex at the C4 position according to the international 10-20 system. Three subjects, aged 23-30 years, participated in the experiment. The aim of this paper was to evaluate the effects of different motor activation tasks on the hemoglobin density dynamics of the fNIRS signal. The chaotic concept based on deterministic dynamics is an important feature in biological signal analysis. This paper employs chaotic properties, a novel method of nonlinear analysis, to analyze and quantify the chaotic behavior in the time series of the hemoglobin dynamics of the various motor imagery tasks in the fNIRS signal. Usually, hemoglobin density in the human brain cortex is found to change slowly in time, and an inevitable noise caused by various factors is included in the signal. So, the principal component analysis method (PCA) is utilized to remove the high frequency components. The phase space is reconstructed, and the Lyapunov spectrum and Lyapunov dimensions are evaluated. From the experimental results, it can be concluded that the signals measured by fNIRS are chaotic.
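The phase-space reconstruction step can be sketched with a time-delay embedding and a Rosenstein-style estimate of the largest Lyapunov exponent (the PCA denoising and the full Lyapunov spectrum are omitted). The demo runs on the chaotic logistic map as a stand-in for a hemoglobin time series; the embedding dimension, delay and fit range are assumed values.

```python
import numpy as np

def delay_embed(x, dim=3, tau=4):
    """Phase-space reconstruction by time-delay embedding."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def largest_lyapunov(x, dim=3, tau=4, kmax=30, w=10):
    """Rosenstein-style estimate: mean log divergence of nearest neighbours,
    with the slope of the initial region approximating the largest exponent."""
    X = delay_embed(x, dim, tau)
    n = len(X) - kmax
    logs = np.zeros(kmax)
    counts = np.zeros(kmax)
    for i in range(n):
        d = np.linalg.norm(X[:n] - X[i], axis=1)
        d[max(0, i - w):i + w + 1] = np.inf        # exclude temporal neighbours
        j = int(np.argmin(d))
        for k in range(kmax):
            dist = np.linalg.norm(X[i + k] - X[j + k])
            if dist > 0:
                logs[k] += np.log(dist)
                counts[k] += 1
    curve = logs / counts
    return np.polyfit(np.arange(10), curve[:10], 1)[0]

# demo on the chaotic logistic map (stand-in for a hemoglobin time series)
x = np.empty(2000)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])
print("largest Lyapunov estimate:", round(largest_lyapunov(x, dim=2, tau=1), 3))
```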

Keywords: Chaos, hemoglobin, Lyapunov spectrum, motor imagery, near infrared spectroscopy (NIRS), principal component analysis (PCA).

4851 An Agent Oriented Approach to Operational Profile Management

Authors: Sunitha Ramanujam, Hany El Yamany, Miriam A. M. Capretz

Abstract:

Software reliability, defined as the probability of a software system or application functioning without failure or error over a defined period of time, has been an important area of research for over three decades. Several research efforts aimed at developing models to improve reliability are currently underway. One of the most popular approaches to software reliability adopted by some of these research efforts involves the use of operational profiles to predict how software applications will be used. Operational profiles are a quantification of usage patterns for a software application. The research presented in this paper investigates an innovative multi-agent framework for automatic creation and management of operational profiles for generic distributed systems after their release into the market. The architecture of the proposed Operational Profile MAS (Multi-Agent System) is presented along with detailed descriptions of the various models arrived at following the analysis and design phases of the proposed system. The operational profile in this paper is extended to comprise seven different profiles. Further, the criticality of operations is defined using a new composite metric in order to organize the testing process as well as to decrease the time and cost involved in this process. A prototype implementation of the proposed MAS is included as proof of concept, and the framework is considered a step towards making distributed systems intelligent and self-managing.

Keywords: Software reliability, Software testing, Metrics, Distributed systems, Multi-agent systems

4850 Fail-safe Modeling of Discrete Event Systems using Petri Nets

Authors: P. Nazemzadeh, A. Dideban, M. Zareiee

Abstract:

In this paper, the effect of faults in the elements and parts of discrete event systems is investigated. When faults occur, some states of the system must be changed and some of them must be forbidden. To this end, different states of these elements are examined and a model for the fail-safe behavior of each state is introduced. Replacing the target elements in the preliminary model with the new models by a systematic method leads to a fail-safe discrete event system.
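A minimal "token game" sketch of the underlying modelling formalism is given below: a transition fires only when its input places hold enough tokens, so adding a supervisor place that is consumed by a fault transition forbids the normal operation in the faulty state. The places and transitions are hypothetical and far simpler than the element models discussed in the paper.

```python
# Minimal Petri net "token game": a transition is enabled when every input
# place holds enough tokens; firing moves tokens from inputs to outputs.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)        # place -> token count
        self.transitions = {}               # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# hypothetical machine element with a fault mode: a supervisor place "ok"
# is required by the normal transition, so the faulty state forbids it
net = PetriNet({"idle": 1, "ok": 1, "fault": 0})
net.add_transition("start_work", {"idle": 1, "ok": 1}, {"busy": 1, "ok": 1})
net.add_transition("fault_detected", {"ok": 1}, {"fault": 1})

net.fire("fault_detected")
print("start_work enabled after fault:", net.enabled("start_work"))   # False
```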

Keywords: Discrete event systems, Fail-safe, Petri nets, Supervisory control.

4849 Quantifying Landscape Connectivity: A GIS-based Approach

Authors: Siqing S. Chen

Abstract:

Landscape connectivity combines a description of the physical structure of the landscape with particular species' responses to that structure, which forms the theoretical background for applying landscape connectivity principles in the practice of landscape planning and design. In this study, a residential development project in the southern United States was used to explore the meaning of landscape connectivity and its application in town planning. The vast rural landscape of the southern United States is conspicuously characterized by hedgerow trees and groves. The patchwork landscape of fields surrounded by high hedgerows is a traditional and familiar feature of the American countryside. Hedgerows are in effect linear strips of trees, groves, or woodlands, which are often critical habitats for wildlife and important for the visual quality of the landscape. Based on a geographic information system (GIS) and statistical analysis (FRAGSTATS), this study attempts to quantify the landscape connectivity characterized by hedgerows in south Alabama, where substantial areas of authentic hedgerow landscape are being urbanized due to the ever-expanding real estate industry and high demand for new residential development. The results of this study shed light on how to balance the needs of new urban development and biodiversity conservation by maintaining a higher level of landscape connectivity, and thus will inform design interventions.

Keywords: Biodiversity, Connectivity, Landscape planning, GIS

4848 General Regression Neural Network and Back Propagation Neural Network Modeling for Predicting Radial Overcut in EDM: A Comparative Study

Authors: Raja Das, M. K. Pradhan

Abstract:

This paper presents a comparative study between two neural network models, namely the General Regression Neural Network (GRNN) and the Back Propagation Neural Network (BPNN), which are used to estimate the radial overcut produced during Electrical Discharge Machining (EDM). Four input parameters have been employed: discharge current (Ip), pulse-on time (Ton), duty fraction (Tau) and discharge voltage (V). Recently, artificial intelligence techniques have emerged as effective tools that can replace time-consuming procedures in various scientific and engineering applications, explicitly in the prediction and estimation of complex and nonlinear processes. Both networks are trained, and the prediction results are tested with the unseen validation set of the experiment and analysed. The performance of both networks is found to be in good agreement, with an average percentage error of less than 11%, and the correlation coefficient obtained for the validation data set for both GRNN and BPNN is more than 91%. However, it is much faster to train a GRNN network than a BPNN, and GRNN is often more accurate than BPNN. GRNN requires more memory space to store the model, but it features fast learning that does not require an iterative procedure and has a highly parallel structure. GRNN networks are slower than multilayer perceptron networks at classifying new cases.
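The structural difference between the two models is easy to see for GRNN, which is essentially a kernel-weighted average of the training targets and therefore needs no iterative training. The sketch below uses two synthetic, normalized inputs standing in for discharge current and pulse-on time; it is an illustration of the GRNN principle, not the study's fitted model.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """General Regression Neural Network: a Nadaraya-Watson kernel-weighted
    average of the training targets, so no iterative training is required."""
    X_train = np.asarray(X_train, float)
    y_train = np.asarray(y_train, float)
    preds = []
    for q in np.atleast_2d(X_query).astype(float):
        d2 = np.sum((X_train - q) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma**2))
        preds.append(np.sum(w * y_train) / (np.sum(w) + 1e-12))
    return np.array(preds)

# toy EDM-like data: radial overcut as an unknown function of two inputs
# (normalized discharge current and pulse-on time; values are synthetic)
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(80, 2))
y = 0.05 + 0.2 * X[:, 0] + 0.1 * X[:, 0] * X[:, 1] + 0.01 * rng.standard_normal(80)

X_test = np.array([[0.3, 0.7], [0.8, 0.2]])
print("predicted overcut:", grnn_predict(X, y, X_test, sigma=0.2).round(4))
```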

Keywords: Electrical-discharge machining, General Regression Neural Network, Back-propagation Neural Network, Radial Overcut.

4847 Designing a Framework for Network Security Protection

Authors: Eric P. Jiang

Abstract:

As the Internet continues to grow at a rapid pace as the primary medium for communications and commerce, and as telecommunication networks and systems continue to expand their global reach, digital information has become the most popular and important information resource, and our dependence upon the underlying cyber infrastructure has been increasing significantly. Unfortunately, as our dependency has grown, so has the threat to the cyber infrastructure from spammers, attackers and criminal enterprises. In this paper, we propose a new machine learning based network intrusion detection framework for cyber security. The detection process of the framework consists of two stages: model construction and intrusion detection. In the model construction stage, a semi-supervised machine learning algorithm is applied to a collected set of network audit data to generate a profile of normal network behavior, and in the intrusion detection stage, input network events are analyzed and compared with the patterns gathered in the profile, and some of them are then flagged as anomalies if these events are sufficiently far from the expected normal behavior. The proposed framework is particularly applicable to situations where only a small amount of labeled network training data is available, which is very typical in real-world network environments.
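A minimal stand-in for the model-construction and detection stages is sketched below: a profile of normal behavior is estimated from (mostly unlabeled) audit records as a mean and covariance, and events whose Mahalanobis distance exceeds a quantile-based threshold are flagged. The flow features and thresholds are synthetic assumptions, and the paper's actual semi-supervised algorithm is not reproduced.

```python
import numpy as np

class NormalTrafficProfile:
    """Builds a profile of normal network behavior from (mostly unlabeled)
    audit records and flags events that deviate too far from it, using the
    Mahalanobis distance to the profile as the anomaly score."""

    def fit(self, X, quantile=0.99):
        X = np.asarray(X, float)
        self.mean = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        self.cov_inv = np.linalg.inv(cov)
        scores = self._score(X)
        self.threshold = np.quantile(scores, quantile)   # tolerate some noise
        return self

    def _score(self, X):
        diff = np.asarray(X, float) - self.mean
        return np.einsum("ij,jk,ik->i", diff, self.cov_inv, diff)

    def predict(self, X):
        """True = flagged as anomalous (possible intrusion)."""
        return self._score(X) > self.threshold

# synthetic flow features: [duration, bytes, packets] for normal and attack traffic
rng = np.random.default_rng(0)
normal = rng.normal([1.0, 500.0, 10.0], [0.3, 120.0, 3.0], size=(2000, 3))
attack = rng.normal([8.0, 50_000.0, 400.0], [1.0, 5000.0, 50.0], size=(20, 3))

profile = NormalTrafficProfile().fit(normal)
print("flagged among attack flows  :", profile.predict(attack).mean())
print("false alarms on normal flows:", round(float(profile.predict(normal).mean()), 3))
```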

Keywords: classification, data analysis and mining, network intrusion detection, semi-supervised learning.
