Search results for: Quality frame selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4297

2407 A Signal Driven Adaptive Resolution Short-Time Fourier Transform

Authors: Saeed Mian Qaisar, Laurent Fesquet, Marc Renaudin

Abstract:

The frequency content of non-stationary signals varies with time, so a smart time-frequency representation is needed to characterize such signals properly. Classically, the short-time Fourier transform (STFT) is employed for this purpose, but its time-frequency resolution is fixed. To overcome this drawback, an enhanced STFT is devised. It is based on a signal-driven sampling scheme, the level-crossing sampling, which adapts the sampling frequency and the window function (both length and shape) to the local variations of the input signal. This adaptation gives the proposed technique its appealing features: adaptive time-frequency resolution and computational efficiency.
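
A minimal sketch of the idea (not the authors' implementation): the window used for each STFT frame is chosen from the local activity of the signal, here measured by counting level crossings inside a coarse analysis block. The level set, the block length and the activity threshold are hypothetical.

```python
import numpy as np

def level_crossing_activity(block, levels):
    """Count how often the block crosses any of the given amplitude levels."""
    count = 0
    for lv in levels:
        above = block > lv
        count += np.count_nonzero(above[1:] != above[:-1])
    return count

def adaptive_stft(x, fs, block_len=256, levels=np.linspace(-1, 1, 9)):
    """Return a list of (start_time, frequencies, spectrum) with an activity-driven window."""
    frames = []
    for start in range(0, len(x) - block_len, block_len):
        block = x[start:start + block_len]
        activity = level_crossing_activity(block, levels)
        # low activity  -> long window (better frequency resolution),
        # high activity -> short window (better time resolution); threshold is illustrative
        win_len = block_len if activity < 10 else block_len // 4
        seg = block[:win_len] * np.hanning(win_len)
        spectrum = np.fft.rfft(seg)
        freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
        frames.append((start / fs, freqs, spectrum))
    return frames
```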

Keywords: Level Crossing Sampling, Activity Selection, Adaptive Resolution Analysis, Computational Complexity.

2406 Physicochemical and Microbiological Assessment of Source and Stored Domestic Water from Three Local Governments in Ile-Ife, Nigeria

Authors: Mary A. Bisi-Johnson, Kehinde A. Adediran, Saheed A. Akinola, Hamzat A. Oyelade

Abstract:

Two of the main problems with water in Nigeria are its quantity (source and amount) and its quality. Scarcity forces water to be obtained from various sources, and microbiological contamination may occur between the point of collection and the point of use. This study therefore assesses the general and microbiological quality of domestic water sources and household stored water used in selected areas of Ile-Ife, south-western Nigeria. Physicochemical and microbiological examinations were carried out on 45 source and stored water samples collected from wells and springs in three local government areas, namely Ife East, Ife South and Ife North. Physicochemical analysis covered pH, temperature, total dissolved solids, dissolved oxygen and biochemical oxygen demand. Microbiological analysis comprised most probable number (MPN) analysis and total coliform, heterotrophic plate, faecal coliform and streptococcus counts.

The physicochemical results showed deviations from acceptable standards, with pH values of 7.20-8.60 for stored and 6.50-7.80 for source samples, total dissolved solids (TDS) of 20-70 mg/L (stored) and 352-691 mg/L (source), dissolved oxygen (DO) of 1.60-9.60 mg/L (stored) and 1.60-4.80 mg/L (source), and biochemical oxygen demand (BOD) of 0.80-3.60 mg/L (stored) and 0.60-5.40 mg/L (source). The general microbiological quality indicated that, with the exception of one sample, both stored and source samples were outside the acceptable range, with MPN/100 mL values of 290-1100 (stored) and 9-1100 (source). Apart from the high counts, most samples did not meet the World Health Organization standard for drinking water because of the presence of pathogenic bacteria and fungi such as Salmonella and Aspergillus spp. To remove these constraints, standard treatment methods should be adopted to make the water free from contaminants. The findings will help identify common and likely origins of water-related infections within the communities and thus guide the interventions required to protect the general populace from such infections.

Keywords: Domestic, microbiology, physicochemical, quality, water.

2405 Factors that Contribute to the Improvement of the Sense of Self-Efficacy of Special Educators in Inclusive Settings in Greece

Authors: Sotiria Tzivinikou, Dimitra Kagkara

Abstract:

A teacher’s sense of self-efficacy can significantly affect both teacher and student performance. More specifically, self-efficacy is associated with learning outcomes as well as with students’ motivation and self-efficacy. For example, teachers with a high sense of self-efficacy are more open to innovations and invest more effort in teaching. In addition, effective inclusive education is associated with higher levels of teacher self-efficacy. Pre-service teachers with high levels of self-efficacy handle student behavior better and assist students with special educational needs more effectively. Teacher preparation programs are also important, because efficacy beliefs are shaped early in learning; as a result, the quality of teacher education programs can affect the sense of self-efficacy of pre-service teachers. Many pre-service teachers do not consider themselves well prepared to work with students with special educational needs and lack an adequate sense of self-efficacy. This study investigates the factors that contribute to the improvement of the sense of self-efficacy of pre-service special educators through an academic practicum training program. The sample consists of 159 pre-service special educators who participated in the program. Quantitative methods were used for data collection and analysis. Teacher self-efficacy was self-assessed through a questionnaire based on the Teacher’s Sense of Efficacy Scale, with measurements taken before and after the practicum. The results are consistent with the international literature: a significant number of pre-service special educators do not hold an adequate sense of self-efficacy regarding teaching students with special educational needs, and a quality academic training program is a crucial factor both for improving the self-efficacy of pre-service special educators and for the provision of high-quality inclusive education.

Keywords: Inclusive education, pre-service, self-efficacy, training program.

2404 Biometric Technology in Securing the Internet Using Large Neural Network Technology

Authors: B. Akhmetov, A. Doszhanova, A. Ivanov, T. Kartbayev, A. Malygin

Abstract:

The article examines methods of protecting citizens' personal data on the Internet using biometric identity authentication technology. A potential danger of such methods is the threat of loss of the stored biometric templates. To eliminate the threat of compromised biometric templates, it is proposed to use large and extra-large neural networks, which on the one hand authenticate a person by his biometrics with high reliability, and on the other hand make the person's biometric data unavailable for observation and analysis. The article also describes in detail the transformation of personal biometric data into an access code. Requirements are formulated for the biometrics-to-code converter with respect to the images of the "Insider", a "Stranger" and all "Strangers". The effect of the dimension of the neural networks on the quality of the biometrics-to-code converters is analyzed.

Keywords: Biometric security technologies, Conversion of personal biometric data into an access code, Electronic signature, Large neural networks, Quality of "biometrics-to-code" converters, e-government.

2403 Decoupled Scheduling in Meta Environment

Authors: Ponsy R.K. Sathia Bhama, Thamarai Selvi Soma Sundaram, R. Sivakama Sundari, R. Bakiyalakshmi, K. Thamizharasi

Abstract:

Grid scheduling is the process of mapping grid jobs to resources across multiple administrative domains. Traditionally, application-level schedulers have been tightly integrated with the application itself and could not easily be applied to other applications. The design presented here is generic: it decouples the scheduler core (the search procedure) from the application-specific components (e.g., application performance models) and the platform-specific components (e.g., collection of resource information) used by the search procedure. In this decoupled approach the application details are not revealed completely to the broker; instead, the customer hands the application to the resource provider for execution. Apart from scheduling, resource selection can also be performed independently in order to achieve scalability.

Keywords: Meta, grid scheduling, application-level scheduler, decouple, scheduler core and performance model.

2402 Identifying Potential Partnership for Open Innovation by using Bibliographic Coupling and Keyword Vector Mapping

Authors: Inchae Park, Byungun Yoon

Abstract:

As open innovation has received increasing attention in innovation management, identifying potential partners has become more important. This paper suggests a methodology for identifying interested parties, acting as innovation intermediaries, to enable open innovation using a patent network. To implement the methodology, multi-stage patent citation analysis (bibliographic coupling) and an information visualization method (keyword vector mapping) are utilized. The contribution of this paper is that it can present meaningful collaboration keywords for the identified potential partners in the network, since not only citation information but also patent textual information is used.
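
A minimal sketch, under assumed data layouts, of the two building blocks named above: bibliographic coupling strength between patents (number of shared cited references) and a simple keyword vector per patent that a similarity-based partner search could use. The example patents, references and keywords are hypothetical.

```python
from collections import Counter
from itertools import combinations

patents = {  # hypothetical example data: patent id -> (cited references, keywords)
    "P1": ({"R1", "R2", "R3"}, ["sensor", "wireless"]),
    "P2": ({"R2", "R3", "R4"}, ["sensor", "battery"]),
    "P3": ({"R5"}, ["display"]),
}

def bibliographic_coupling(patents):
    """Coupling strength = number of references two patents cite in common."""
    strength = {}
    for a, b in combinations(patents, 2):
        shared = patents[a][0] & patents[b][0]
        if shared:
            strength[(a, b)] = len(shared)
    return strength

def keyword_vector(patent_id, vocabulary):
    """Simple term-count vector over a fixed keyword vocabulary."""
    counts = Counter(patents[patent_id][1])
    return [counts[term] for term in vocabulary]

vocab = sorted({kw for refs, kws in patents.values() for kw in kws})
print(bibliographic_coupling(patents))   # {('P1', 'P2'): 2}
print(keyword_vector("P1", vocab))
```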

Keywords: Open innovation, partner selection, bibliographic coupling, Keyword vector mapping, patent network.

2401 Computational Prediction of Complicated Atmospheric Motion for Spinning or Non-Spinning Projectiles

Authors: Dimitrios N. Gkritzapis, Elias E. Panagiotopoulos, Dionissios P. Margaris, Dimitrios G. Papanikas

Abstract:

A full six-degrees-of-freedom (6-DOF) flight dynamics model is proposed for the accurate prediction of short- and long-range trajectories of high-spin and fin-stabilized projectiles through atmospheric flight to the final impact point. The projectile is assumed to be rigid (non-flexible) and rotationally symmetric about its spin axis, and is launched at low and high pitch angles. The mathematical model is based on the full equations of motion set up in the no-roll body reference frame and is integrated numerically from given initial conditions at the firing site. The projectile's maneuvering motion depends on the most significant force and moment variations, in addition to wind and gravity. The computational flight analysis takes into account Mach number and total angle of attack effects by means of variable aerodynamic coefficients; for the purposes of the present work, linear interpolation has been applied to the tabulated database of McCoy's book. The developed computational method shows satisfactory agreement with published data from verified experiments and computational codes on atmospheric projectile trajectory analysis for various initial firing conditions.
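
The full 6-DOF no-roll-frame model is not reproduced here; the simplified point-mass sketch below only illustrates one ingredient of the method, namely linear interpolation of tabulated aerodynamic coefficients over Mach number inside a basic trajectory integration. The table values and physical parameters are illustrative, not taken from the paper or from McCoy's tables.

```python
import numpy as np

mach_table = np.array([0.5, 0.8, 1.0, 1.2, 2.0])     # hypothetical Mach grid
cd_table   = np.array([0.12, 0.15, 0.38, 0.34, 0.26])  # hypothetical drag coefficients

def drag_coefficient(mach):
    """Linear interpolation of the tabulated drag coefficient over Mach number."""
    return np.interp(mach, mach_table, cd_table)

def integrate_trajectory(v0, angle_deg, dt=0.01, rho=1.225, s_ref=1e-3, mass=10.0, a_sound=340.0):
    """Point-mass trajectory with drag and gravity; returns the range at impact."""
    g = np.array([0.0, -9.81])
    pos = np.array([0.0, 0.0])
    vel = v0 * np.array([np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))])
    while pos[1] >= 0.0:
        speed = np.linalg.norm(vel)
        cd = drag_coefficient(speed / a_sound)
        drag = -0.5 * rho * speed * s_ref * cd * vel / mass   # acceleration opposing velocity
        vel = vel + (drag + g) * dt
        pos = pos + vel * dt
    return pos[0]

print(f"range ≈ {integrate_trajectory(800.0, 30.0):.0f} m")
```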

Keywords: Constant-Variable aerodynamic coefficients, low and high pitch angles, wind.

2400 PRO-Teaching – Sharing Ideas to Develop Capabilities

Authors: Steve J. Drew, Christopher J. Klopper

Abstract:

In this paper, the action-research-driven design of a context-relevant, developmental peer review of teaching model, its implementation strategy and its impact at an Australian university are presented. PRO-Teaching realizes an innovative process that triangulates contemporaneous teaching quality data from a range of stakeholders, including students, discipline academics, learning and teaching expert academics, and teacher reflection, to create reliable evidence of teaching quality. Data collected over multiple classroom observations allow objective reporting on development differentials in constructive alignment and in peer and student evaluations. Further innovation is realized in applying this highly structured developmental process to provide summative evidence of sufficient validity to support claims for professional advancement and for learning and teaching awards. Design decision points and contextual triggers are described within the operating domain. Academics and developers seeking to introduce structured peer review of teaching into their organization will find this paper a useful reference.

Keywords: Development loop, Multiple data sources, Objective reporting, Peer review of teaching.

2399 Process Development of Safe and Ready-to-eat Raw Oyster Meat by Irradiation Technology

Authors: Pattama Ratana-Arporn, Pongtep Wilaipun

Abstract:

White scar oyster (Crassostrea belcheri) is often eaten raw and is a leading vehicle for foodborne disease, especially Salmonella Weltevreden, which has been found to be the most prominent serovar and the most resistant to radiation. Gamma irradiation at a low dose of 1 kGy was enough to eliminate S. Weltevreden contamination in oyster meat at levels up to 5 log CFU/g, while the meat still retained its raw characteristics and a sensory quality equivalent to that of the non-irradiated meat. A ready-to-eat chilled oyster product was developed by shucking the meat, packing it individually in plastic bags, subjecting it to 1 kGy gamma radiation under chilled conditions and storing it at a refrigerated temperature of 4 °C. Microbiological determination showed the absence of S. Weltevreden (initially inoculated at 5 log CFU/g) throughout the 30-day storage period. Sensory evaluation indicated decreasing sensory scores over storage time, setting the product shelf life at 18 days, compared with 15 days for the non-irradiated product. The main advantages of the developed process are that it provides safe raw oyster to consumers while retaining sensory quality and extending the shelf life by three days.

Keywords: decontamination, food safety, irradiation, oyster, Salmonella Weltevreden

2398 Analysis of the Root Causes of Transformer Bushing Failures

Authors: E. A. Feilat, I. A. Metwally, S. Al-Matri, A. S. Al-Abri

Abstract:

This paper presents the results of a comprehensive investigation of five blackouts that occurred between 28 August and 8 September 2011 due to bushing failures of the 132/33 kV, 125 MVA transformers at JBB Ali Grid station. The investigation aims to explore the root causes of the bushing failures and to provide recommendations that help rectify the problem and avoid the recurrence of similar incidents. The incident reports on the failed bushings and the SCADA reports at this grid station were examined and analyzed. Moreover, comprehensive power quality field measurements were conducted at ten 33/11 kV substations (S/Ss) in the JBB Ali area, and frequency scans were performed to check for harmonic resonance frequencies due to power factor correction capacitors. Furthermore, the daily operations of the on-load tap changers (OLTCs) of both the 125 MVA and 20 MVA transformers at JBB Ali Grid station were analyzed. The investigation showed that the five bushing failures were due to a local problem, i.e., internal degradation of the bushing insulation. This was confirmed by analyzing the time intervals between successive OLTC operations of the faulty grid transformers. It was also found that monitoring the number of OLTC operations can help in predicting bushing failure.

Keywords: Modeling and simulation, power system, transformer, bushing, OLTC, power quality, partial discharge.

2397 Two-Phase Optimization for Selecting Materialized Views in a Data Warehouse

Authors: Jiratta Phuboon-ob, Raweewan Auepanwiriyakul

Abstract:

A data warehouse (DW) is a system whose value and role lie in supporting decision-making through querying. Queries to a DW are critical with regard to their complexity and length: they often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries; however, these views have a maintenance cost, so materializing all views is not possible. An important challenge of the DW environment is therefore materialized view selection, because of the trade-off between query performance and view maintenance. In this paper, we introduce a new approach to this challenge based on Two-Phase Optimization (2PO), a combination of Simulated Annealing (SA) and Iterative Improvement (II), together with the use of a Multiple View Processing Plan (MVPP). Our experiments show that 2PO outperforms the original algorithms in terms of query processing cost and view maintenance cost.
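
A minimal sketch of two-phase optimization for view selection under a placeholder cost model of the usual form "query processing cost plus view maintenance cost". Following the classical 2PO scheme, iterative improvement is run first and a short simulated annealing pass is then seeded with its result; the candidate views, cost function and parameters are hypothetical, and the MVPP machinery is omitted.

```python
import math
import random

CANDIDATES = list(range(20))   # candidate views (hypothetical)

def total_cost(selection):
    """Placeholder: query cost drops, maintenance cost grows, as more views are materialized."""
    k = len(selection)
    return 1000.0 / (1 + k) + 15.0 * k

def neighbor(selection):
    """Flip one random candidate view in or out of the selection."""
    s = set(selection)
    s.symmetric_difference_update({random.choice(CANDIDATES)})
    return frozenset(s)

def iterative_improvement(start, tries=500):
    best = start
    for _ in range(tries):
        cand = neighbor(best)
        if total_cost(cand) < total_cost(best):
            best = cand
    return best

def simulated_annealing(start, t0=50.0, cooling=0.95, steps=2000):
    current, best, t = start, start, t0
    for _ in range(steps):
        cand = neighbor(current)
        delta = total_cost(cand) - total_cost(current)
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = cand
            if total_cost(current) < total_cost(best):
                best = current
        t *= cooling
    return best

seed = frozenset(random.sample(CANDIDATES, 5))
result = simulated_annealing(iterative_improvement(seed))
print(sorted(result), total_cost(result))
```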

Keywords: Data warehouse, materialized views, view selection problem, two-phase optimization.

2396 Sensitivity of Input Blocking Capacitor on Output Voltage and Current of a PV Inverter Employing IGBTs

Authors: Z.A. Jaffery, Vinay Kumar Chandna, Sunil Kumar Chaudhary

Abstract:

This paper presents a MATLAB/Simulink model of a single-phase 2.5 kVA, 240 V RMS controlled photovoltaic voltage source inverter (PV VSI) using insulated gate bipolar transistors (IGBTs). The behavior of the output voltage, output current and total harmonic distortion (THD) with variation of the input DC blocking capacitor (Cdc) is analyzed for linear and non-linear loads. The values of Cdc suggested by other authors are not clearly defined, which makes selecting a proper value difficult, while the DC power stored in Cdc (generally placed in parallel with the battery) is used as the input to the VSI. The simulation results show the variation in output voltage and current for different values of Cdc with linear and non-linear loads connected at the output of the PV VSI and suggest the selection of a suitable value of Cdc.

Keywords: DC Blocking capacitor, IGBTs, PV VSI, THD.

2395 Random Projections for Dimensionality Reduction in ICA

Authors: Sabrina Gaito, Andrea Greppi, Giuliano Grossi

Abstract:

In this paper we present a technique to speed up ICA based on reducing the dimensionality of the data set while preserving the quality of the results. In particular we refer to the FastICA algorithm, which maximizes the kurtosis as the statistical property of interest. By performing a particular Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original dimension d, which guarantees a narrow confidence interval for the estimator with a high confidence level. The derived dimensionality reduction rate depends on a system control parameter β that is easily computed a priori from the observations alone. Extensive simulations have been carried out on different sets of real-world signals. They show that the achievable dimensionality reduction is in fact very high, that it preserves the quality of the decomposition and that it impressively speeds up FastICA. On the other hand, a set of signals for which the estimated reduction rate is greater than 1 exhibits poor decomposition results when reduced, thus validating the reliability of the parameter β. We are confident that our method will lead to a better approach to real-time applications.
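
A minimal sketch of the overall idea: project the observations with a Johnson-Lindenstrauss-style random matrix and then run FastICA on the reduced data. The reduction rate is fixed by hand here; the data-driven choice via the control parameter β from the paper is not reproduced.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_samples, d, k = 2000, 64, 16       # original dimension d, reduced dimension k (rate = k/d)

# synthetic example: three independent sources mixed into d observed channels
sources = np.c_[np.sin(np.linspace(0, 40, n_samples)),
                np.sign(np.sin(np.linspace(0, 25, n_samples))),
                rng.laplace(size=n_samples)]
X = sources @ rng.normal(size=(3, d))            # observations, shape (n_samples, d)

R = rng.normal(size=(d, k)) / np.sqrt(k)          # JL-like random projection matrix
X_reduced = X @ R                                 # reduced data, shape (n_samples, k)

# fun='cube' is the kurtosis-like contrast in scikit-learn's FastICA
ica = FastICA(n_components=3, fun="cube", random_state=0)
recovered = ica.fit_transform(X_reduced)          # estimated independent components
print(recovered.shape)                            # (2000, 3)
```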

Keywords: Independent Component Analysis, FastICA algorithm, Higher-order statistics, Johnson-Lindenstrauss lemma.

2394 An In-Depth Analysis of Open Data Portals as an Emerging Public E-Service

Authors: Martin Lnenicka

Abstract:

Governments collect and produce large amounts of data. Increasingly, governments worldwide have started to implement open data initiatives and to launch open data portals that release these data in open and reusable formats. As a result, a large number of open data repositories, catalogues and portals have been emerging around the world. The greater availability of interoperable and linkable open government data catalyzes secondary use of such data, so they can be used to build useful applications which leverage their value, allow insight, provide access to government services, and support transparency. The efficient development of successful open data portals makes it necessary to evaluate them systematically in order to understand them better, assess the various types of value they generate, and identify the improvements required to increase this value. The attention of this paper is therefore directed to the field of open data portals. The main aim of this paper is to compare selected national open data portals using content analysis and to propose a new evaluation framework which further improves the quality of these portals. It also establishes a set of considerations for involving businesses and citizens in creating e-services and applications that leverage the datasets available from these portals.

Keywords: Big data, content analysis, criteria comparison, data quality, open data, open data portals, public sector.

2393 The Effect of Ageing Treatment of Aluminum Alloys for Fuselage Structure-Light Aircraft

Authors: Shwe Wut Hmon Aye, Kay Thi Lwin, Waing Waing Kay Khine Oo

Abstract:

Since the material used for a fuselage structure must possess low density and a high strength-to-weight ratio, the selection of appropriate materials for the fuselage is one of the most important tasks. Aluminum metal itself is soft and low in strength; it can be made stronger by a proper combination of suitable alloying additions, mechanical treatment and thermal treatment. The usual thermal treatment given to aluminum alloys is called age hardening or precipitation hardening. In this paper, studies are carried out on 7075 aluminum alloy to determine how to improve its strength level for fuselage structures. The marked effect of ageing on the strength of the ternary alloy is clearly demonstrated at several ageing times and temperatures. It is concluded that the aluminum-zinc-magnesium alloy reaches its highest strength level under natural ageing.

Keywords: Aluminum alloy, ageing, heat treatment, strength.

2392 Principal Component Analysis-Ranking as a Variable Selection Method for the Simultaneous Spectrophotometric Determination of Phenol, Resorcinol and Catechol in Real Samples

Authors: Nahid Ghasemi, Mohammad Goodarzi, Morteza Khosravi

Abstract:

The simultaneous determination of phenol, resorcinol and catechol using a chemometric technique, a PC-ranking artificial neural network (PCranking-ANN) algorithm, is reported in this study. Based on the data correlation coefficient method, 3 representative PCs are selected from the scores of the original UV spectral data (35 PCs) as the input patterns for an ANN to build a neural network model. The network was trained for 8000 iterations. The RMSEP values for phenol, resorcinol and catechol with PCranking-ANN were 0.6680, 0.0766 and 0.1033, respectively. The calibration ranges were 0.50-21.0, 0.50-15.1 and 0.50-20.0 μg ml-1 for phenol, resorcinol and catechol, respectively. The proposed method was successfully applied to the determination of phenol, resorcinol and catechol in synthetic and water samples.
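
A minimal sketch (not the authors' exact procedure) of PC ranking followed by an ANN: compute PCA scores of the UV spectra, rank the PCs by their correlation with the analyte concentration, keep the top three, and train a small neural network on them. Array shapes, the network size and the function name are hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

def pc_ranking_ann(spectra, concentrations, n_keep=3):
    """spectra: (n_samples, n_wavelengths); concentrations: (n_samples,) for one analyte."""
    n_components = min(35, min(spectra.shape))            # at most 35 PCs, as in the abstract
    scores = PCA(n_components=n_components).fit_transform(spectra)
    # rank PCs by absolute correlation with the analyte concentration
    corr = np.array([abs(np.corrcoef(scores[:, i], concentrations)[0, 1])
                     for i in range(scores.shape[1])])
    selected = scores[:, np.argsort(corr)[::-1][:n_keep]]  # keep the top-ranked PCs
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=8000, random_state=0)
    model.fit(selected, concentrations)
    return model, selected
```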

Keywords: Phenol, Resorcinol, Catechol, Principal component ranking, Artificial neural network, Chemometrics.

2391 Image Dehazing Using Dark Channel Prior and Fast Guided Filter in Daubechies Lifting Wavelet Transform Domain

Authors: Harpreet Kaur, Sudipta Majumdar

Abstract:

In this paper a method for image dehazing is proposed in the lifting wavelet transform domain. The lifting Daubechies (D4) wavelet has been used to obtain the approximation image and the detail images. As the haze is contained in the low-frequency part, only the approximation image is processed further, using a dehazing algorithm based on the dark channel prior (DCP). The dehazed approximation image is then recombined with the detail images using the inverse lifting wavelet transform. Implementing the lifting wavelet transform has the advantages of auxiliary memory saving, fast implementation and simplicity. The proposed method also deals with the near-white scene problem, the blue horizon issue and localized light sources in a way that enhances image quality and makes the algorithm robust. Simulation results show improvement in terms of visual quality and parameters such as root mean square (RMS) contrast, structural similarity index (SSIM), entropy and execution time.
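
A minimal sketch of the dark channel prior stage applied to the approximation image; the lifting Daubechies (D4) decomposition, the fast guided filter refinement and the recombination step are omitted. Function names, the patch size and the usage lines are illustrative.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(image, patch=15):
    """Dark channel: per-pixel minimum over colour channels and a local patch."""
    min_rgb = image.min(axis=2)
    return minimum_filter(min_rgb, size=patch)

def estimate_transmission(image, atmosphere, omega=0.95, patch=15):
    """Transmission map t(x) = 1 - omega * dark_channel(I / A)."""
    normalised = image / np.maximum(atmosphere, 1e-6)
    return 1.0 - omega * dark_channel(normalised, patch)

# usage on a float RGB approximation image `hazy` in [0, 1] (hypothetical input):
# atmosphere = hazy.reshape(-1, 3)[np.argsort(dark_channel(hazy).ravel())[-100:]].mean(axis=0)
# t = estimate_transmission(hazy, atmosphere)
# dehazed = (hazy - atmosphere) / np.clip(t[..., None], 0.1, 1.0) + atmosphere
```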

Keywords: Dark channel prior, image dehazing, lifting wavelet transform.

2390 Time-Domain Analysis Approaches of Soil-Structure Interaction: A Comparative Study

Authors: Abdelrahman Taha, Niloofar Malekghaini, Hamed Ebrahimian, Ramin Motamed

Abstract:

This paper compares the substructure and direct approaches for soil-structure interaction (SSI) analysis in the time domain. In the substructure approach, the soil domain is replaced by a set of springs and dashpots, also referred to as the impedance function, derived through the study of the behavior of a massless rigid foundation. The impedance function is inherently frequency dependent, i.e., it varies as a function of the frequency content of the structural response. To use the frequency-dependent impedance function for time-domain SSI analysis, the impedance function is approximated at the fundamental frequency of the coupled soil-structure system. To explore the potential limitations of the substructure modeling process, a two-dimensional (2D) reinforced concrete frame structure is modeled and analyzed using the direct and substructure approaches. The results show discrepancy between the simulated responses of the direct and substructure models. It is concluded that the main source of discrepancy is likely attributed to the way the impedance functions are calculated, i.e., assuming a massless rigid foundation without considering the presence of the superstructure. Hence, a refined impedance function, considering the presence of the superstructure, shall alternatively be developed. This refined impedance function is expected to improve the simulation accuracy of the substructure approach.

Keywords: Direct approach, impedance function, massless rigid foundation, soil-structure interaction, substructure approach.

2389 An Analysis of Institutional Audits: Basis for Teaching, Learning and Assessment Framework and Principles

Authors: Nabil El Kadhi, Minerva M. Bunagan

Abstract:

The dynamism in education, particularly in the areas of teaching, learning and assessment, has caused Higher Education Institutions (HEIs) worldwide to seek ways to continuously improve their educational processes. HEIs use the outcomes of institutional audits, assessments and accreditations for improvement. In this study, the published institutional audit reports of HEIs in the Sultanate of Oman were analyzed to derive features of good practice; to identify challenges along Teaching, Learning and Assessment (TLA); and to propose a framework that puts major emphasis on quality-assured TLA, including a set of principles that can be used as a basis for succeeding in an institutional visit. The TLA framework, which shows the TLA components, their characteristics, the related expectations, implementation tools/strategies and pitfalls, can be used by HEIs to gain an adequate understanding of the scope of an audit and to satisfy institutional audit requirements. The scope of this study can be widened by exploring the other requirements of the institutional audits in the Sultanate of Oman, particularly the areas of Governance and Management and Student Support Services.

Keywords: Accreditation, audit, quality assurance, teaching, learning and assessment.

2388 Non-negative Principal Component Analysis for Face Recognition

Authors: Zhang Yan, Yu Bin

Abstract:

Principal component analysis is often combined with state-of-the-art classification algorithms to recognize human faces. However, principal component analysis can only capture features contributing to the global characteristics of data because it is a global feature selection algorithm; it misses features contributing to the local characteristics of data, since each principal component contains only some level of the global characteristics. In this study, we present a novel face recognition approach using non-negative principal component analysis, in which a non-negativity constraint is added to improve data locality and help elucidate latent data structures. Experiments are performed on the Cambridge ORL face database. We demonstrate the strong performance of the algorithm in recognizing human faces in comparison with the PCA and NREMF approaches.

Keywords: Classification, face recognition, non-negative principal component analysis (NPCA).

2387 Multiclass Support Vector Machines for Environmental Sounds Classification Using log-Gabor Filters

Authors: S. Souli, Z. Lachiri

Abstract:

In this paper we propose a robust environmental sound classification approach based on spectrogram features derived from log-Gabor filters. The approach includes two methods. In the first method, the spectrograms are passed through an appropriate log-Gabor filter bank, and the outputs are averaged and subjected to an optimal feature selection procedure based on a mutual information criterion. The second method uses the same steps but applies them only to three patches extracted from each spectrogram.

To investigate the accuracy of the proposed methods, we conduct experiments using a large database containing 10 environmental sound classes. The classification results based on multiclass support vector machines show that the second method is the most efficient, with an average classification accuracy of 89.62%.
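
A minimal sketch of the classification pipeline named above: spectrogram-based features, mutual-information feature selection and a multiclass SVM. The log-Gabor filter bank itself is omitted; plain spectrogram statistics stand in for its averaged outputs, and the feature and parameter choices are hypothetical.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def spectrogram_features(signal, fs=22050):
    """Per-frequency mean and standard deviation of the log spectrogram."""
    _, _, sxx = spectrogram(signal, fs=fs, nperseg=512)
    log_sxx = np.log1p(sxx)
    return np.concatenate([log_sxx.mean(axis=1), log_sxx.std(axis=1)])

def build_classifier(signals, labels, k_best=64):
    """signals: list of 1-D arrays; labels: class index per signal (10 classes in the paper)."""
    X = np.vstack([spectrogram_features(s) for s in signals])
    clf = make_pipeline(StandardScaler(),
                        SelectKBest(mutual_info_classif, k=k_best),   # mutual-information selection
                        SVC(kernel="rbf", decision_function_shape="ovr"))  # multiclass SVM
    clf.fit(X, labels)
    return clf
```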

Keywords: Environmental sounds, Log-Gabor filters, Spectrogram, SVM Multiclass, Visual features.

2386 Comprehensive Nonlinearity Simulation of Different Types and Modes of HEMTs with Respect to Biasing Conditions

Authors: M. M. Karkhanehchi, A. Ammani

Abstract:

A simple analytical model has been developed to optimize biasing conditions for obtaining maximum linearity among lattice-matched, pseudomorphic and metamorphic HEMT types, as well as enhancement- and depletion-mode HEMTs. A nonlinear current-voltage model has been simulated based on extracted data to study and select the most appropriate HEMT type and mode, in terms of a given gate-source biasing voltage, so as to obtain the highest possible linear swing of the output current or voltage. The simulation results can be used as a basis for selecting the optimum gate-source biasing voltage for a given HEMT type and mode in a circuit design; they can also serve as a criterion for choosing the optimum type or mode of HEMT for a predetermined biasing condition.

Keywords: Biasing, characteristic, linearity, simulation.

2385 Efficient Implementation of Serial and Parallel Support Vector Machine Training with a Multi-Parameter Kernel for Large-Scale Data Mining

Authors: Tatjana Eitrich, Bruno Lang

Abstract:

This work deals with aspects of support vector learning for large-scale data mining tasks. Based on a decomposition algorithm that can be run in serial and parallel mode, we introduce a data transformation that allows the use of an expensive generalized kernel without additional cost. In order to speed up the decomposition algorithm, we analyze the problem of working set selection for large data sets and the influence of the working set size on the scalability of the parallel decomposition scheme. Our modifications and settings improve support vector learning performance and thus allow extensive parameter search methods to be used to optimize classification accuracy.

Keywords: Support Vector Machines, Shared Memory Parallel Computing, Large Data

2384 Automatic Microaneurysm Quantification for Diabetic Retinopathy Screening

Authors: A. Sopharak, B. Uyyanonvara, S. Barman

Abstract:

A microaneurysm is a key indicator of diabetic retinopathy that can potentially cause damage to the retina. Early detection and automatic quantification are the keys to preventing further damage. In this paper, which focuses on automatic microaneurysm detection in images acquired through non-dilated pupils, we present a series of experiments on feature selection and automatic classification of microaneurysm pixels. We found that the best feature set is a combination of 10 features: the pixel intensity of the shade-corrected image, the pixel hue, the standard deviation of the shade-corrected image, DoG4, the area of the candidate MA, the perimeter of the candidate MA, the eccentricity of the candidate MA, the circularity of the candidate MA, the mean intensity of the candidate MA on the shade-corrected image, and the ratio of the major to minor axis length of the candidate MA. The overall sensitivity, specificity, precision and accuracy are 84.82%, 99.99%, 89.01% and 99.99%, respectively.
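
A minimal sketch of the final classification stage: a (Gaussian) naive Bayes classifier over the ten per-candidate features listed above, plus the sensitivity/specificity computation. Feature extraction from the fundus images is assumed to have been done already, and the column order is illustrative.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

FEATURE_NAMES = [  # illustrative ordering of the ten features from the abstract
    "intensity_shade_corrected", "hue", "std_shade_corrected", "dog4",
    "area", "perimeter", "eccentricity", "circularity",
    "mean_intensity_shade_corrected", "axis_length_ratio",
]

def train_ma_classifier(features, labels):
    """features: (n_candidates, 10) array in FEATURE_NAMES order; labels: 1 = microaneurysm."""
    assert features.shape[1] == len(FEATURE_NAMES)
    return GaussianNB().fit(features, labels)   # Gaussian NB assumed for continuous features

def sensitivity_specificity(model, features, labels):
    pred = model.predict(features)
    tp = np.sum((pred == 1) & (labels == 1)); fn = np.sum((pred == 0) & (labels == 1))
    tn = np.sum((pred == 0) & (labels == 0)); fp = np.sum((pred == 1) & (labels == 0))
    return tp / (tp + fn), tn / (tn + fp)
```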

Keywords: Diabetic retinopathy, microaneurysm, naive Bayes classifier

2383 Simultaneous Term Structure Estimation of Hazard and Loss Given Default with a Statistical Model using Credit Rating and Financial Information

Authors: Tomohiro Ando, Satoshi Yamashita

Abstract:

The objective of this study is to propose a statistical modeling method which enables simultaneous term structure estimation of the risk-free interest rate, hazard and loss given default, incorporating characteristics of the bond-issuing company such as credit rating and financial information. A reduced-form model is used for this purpose. Statistical techniques such as spline estimation and the Bayesian information criterion are employed for parameter estimation and model selection. An empirical analysis is conducted using Japanese bond market data. The results of the empirical analysis confirm the usefulness of the proposed method.

Keywords: Empirical Bayes, Hazard term structure, Loss given default.

2382 Study of Compost Maturity during Humification Process using UV-Spectroscopy

Authors: N. Sanmanee, K. Panishkan, K. Obsuwan, S. Dharmvanij

Abstract:

The increase in aromatic structures is widely used to monitor the degree of humification. Compost derived from mixed manures combined with agricultural wastes was studied. Compost collected on days 0, 7, 14, 21, 28, 35, 49, 77, 91, 105 and 119 was divided into three stages: an initial stage at day 0, a thermophilic stage during days 1-48, and a mature stage during days 49-119. The change of the highest absorption in the wavelength range 210-235 nm during days 0-49 implied that small molecules such as nitrates and carboxylic acids formed faster than the aromatic molecules observed at wavelengths around 280 nm. The ratio of the electron-transfer band at 253 nm to the benzenoid band at 230 nm (E253/E230) also increased gradually during the composting period, indicating the presence of O-containing functional groups. This was in agreement with the shift from aliphatic to aromatic structures, as shown by the relationship with the C/N and H/C ratios (r = -0.631 and -0.717, p < 0.05), since both were decreasing. Although the amounts of humic acid (HA) did not differ much during the humification process, UV spectral deconvolution showed better qualitative characteristics for determining compost quality. From this study, the compost should be used at day 49 and should not be kept longer than 3 months; otherwise the quality of the HA would decline regardless of the amount of HA, which might be rising. This implies that other processes, such as mineralization, influenced the humification process, changing the structure and qualities of HA.
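
A minimal sketch of the E253/E230 index used above: the absorbance at 253 nm divided by the absorbance at 230 nm, read off a sampled UV spectrum by interpolation. The input arrays are hypothetical.

```python
import numpy as np

def e_ratio(wavelengths_nm, absorbance, num=253.0, den=230.0):
    """Return A(num nm) / A(den nm) from a sampled UV absorbance spectrum."""
    a_num = np.interp(num, wavelengths_nm, absorbance)
    a_den = np.interp(den, wavelengths_nm, absorbance)
    return a_num / a_den

# usage with a measured spectrum (hypothetical arrays):
# wavelengths = np.arange(210, 400)           # nm, increasing
# print(e_ratio(wavelengths, spectrum))       # rising values indicate more O-containing groups
```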

Keywords: Compost maturity, UV spectroscopy, humification, humic acid

2381 Estimating an Optimal Neighborhood Size in the Spherical Self-Organizing Feature Map

Authors: Alexandros Leontitsis, Archana P. Sangole

Abstract:

This article presents a short discussion on optimum neighborhood size selection in a spherical self-organizing feature map (SOFM). The majority of the literature on SOFMs has addressed the issue of selecting optimal learning parameters in the case of Cartesian-topology SOFMs; however, the use of a spherical SOFM suggests that the learning aspects of the Cartesian-topology SOFM do not translate directly. This article presents an approach to estimating the neighborhood size of a spherical SOFM from the data. It adopts the L-curve criterion, previously suggested for choosing the regularization parameter in problems of linear equations whose right-hand side is contaminated with noise. Simulation results are presented on two artificial 4D data sets of the coupled Hénon-Ikeda map.

Keywords: Parameter estimation, self-organizing feature maps, spherical topology.

2380 Review of Trust Models in Wireless Sensor Networks

Authors: V. Uma Rani, K. Soma Sundaram

Abstract:

The major challenge faced by wireless sensor networks is security. Because of the dynamic and collaborative nature of sensor networks, malicious or compromised sensor devices can make the network unusable. To solve this issue, a trust model is required to identify malicious, selfish and compromised insiders by evaluating the trustworthiness of sensors in the network. Such a model supports decision-making processes in wireless sensor networks such as key pre-distribution, cluster head selection, data aggregation, routing and self-reconfiguration of sensor nodes. This paper discusses the kinds of trust models and the trust metrics used to address attacks by monitoring certain behaviors of the network. It describes the major design issues in building a trust model and their countermeasures, and it also discusses existing trust models used in various decision-making processes of wireless sensor networks.

Keywords: Attacks, Security, Trust, Trust model, Wireless sensor network.

2379 Terrain Evaluation Method for Hexapod Robot

Authors: Tomas Luneckas, Dainius Udris

Abstract:

In this paper a simple terrain evaluation method for a hexapod robot is introduced. The method is based on evaluating the foot coordinates when all feet are on the ground; from the differences between the foot coordinates a local terrain evaluation is possible. Terrain evaluation is necessary for correct gait selection and/or body position correction. For terrain roughness evaluation three planes are constructed: two of them use opposite feet coordinates as their defining points, and the third coincides with the robot body plane. The leaning angle of the body plane is evaluated by measuring the gravity force with a three-axis accelerometer. The terrain roughness evaluation method is based on estimating the angles between the normal vectors of these planes. The aim of this work is to present a simple method for an embedded robot controller that allows finding the best settings for further movement.
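
A minimal sketch of the geometry described above: fit two planes through opposite foot contact points, take the body plane orientation from the accelerometer gravity vector, and score terrain roughness by the angles between the plane normals. The foot labels and the gravity reading are hypothetical.

```python
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three 3D points."""
    n = np.cross(np.asarray(p2) - np.asarray(p1), np.asarray(p3) - np.asarray(p1))
    return n / np.linalg.norm(n)

def angle_between(n1, n2):
    """Angle in degrees between two unit normals (sign of the normal ignored)."""
    return np.degrees(np.arccos(np.clip(abs(np.dot(n1, n2)), -1.0, 1.0)))

def terrain_roughness(feet, gravity):
    """feet: dict of six foot positions in the body frame; gravity: 3-axis accelerometer vector."""
    n_a = plane_normal(feet["L1"], feet["R2"], feet["L3"])   # plane through one foot triple
    n_b = plane_normal(feet["R1"], feet["L2"], feet["R3"])   # plane through the opposite triple
    n_body = np.asarray(gravity, dtype=float)
    n_body /= np.linalg.norm(n_body)                         # body-plane normal from gravity
    return {"between_foot_planes": angle_between(n_a, n_b),
            "body_lean": angle_between(n_a, n_body)}
```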

Keywords: Hexapod robot, pose estimation, terrain evaluation, terrain roughness.

2378 Research of Strong-Column-Weak-Beam Criteria of Reinforced Concrete Frames Subjected to Biaxial Seismic Excitation

Authors: Chong Zhang, Mu-Xuan Tao

Abstract:

In several earthquakes, numerous reinforced concrete (RC) frames subjected to seismic excitation demonstrated a collapse pattern characterized by column hinges, though designed according to the Strong-Column-Weak-Beam (S-C-W-B) criteria. The effect of biaxial seismic excitation on the disparity between design and actual performance is carefully investigated in this article. First, a modified load contour method is proposed to derive a closed-form equation of biaxial bending moment strength, which is verified by numerical and experimental tests. Afterwards, a group of time history analyses of a simple frame modeled by fiber beam-column elements subjected to biaxial seismic excitation are conducted to verify that the current S-C-W-B criteria are not adequate to prevent the occurrence of column hinges. A biaxial over-strength factor is developed based on the proposed equation, and the reinforcement of columns is appropriately amplified with this factor to prevent the occurrence of column hinges under biaxial excitation, which is proved to be effective by another group of time history analyses.
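
For orientation only, the generic (Bresler-type) load contour interaction surface that load contour methods start from is given below; the modified closed-form equation derived in the paper is not reproduced here.

```latex
% Classical load contour interaction surface (generic form, not the paper's
% modified closed-form equation): M_x, M_y are the applied bending moments about
% the two principal axes, M_{x0}, M_{y0} the corresponding uniaxial bending
% strengths, and \alpha the contour exponent.
\[
\left(\frac{M_x}{M_{x0}}\right)^{\alpha} + \left(\frac{M_y}{M_{y0}}\right)^{\alpha} = 1
\]
```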

Keywords: Biaxial bending moment strength, biaxial seismic excitation, fiber beam-column model, load contour method, strong-column-weak-beam.
