Search results for: high resolution array processing techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9209


7649 The Potential of Strain M Protease in Degradations of Protein in Natural Rubber Latex

Authors: Norlin Pauzi, Ahmad R.M. Yahya, Zairossani Nor, Amirul A. Abdullah

Abstract:

Strain M was isolated from the latex of Hevea brasiliensis growing in the rubber farm area of the Malaysian Rubber Board and was tentatively identified as Bacillus sp. Strain M demonstrated high protease production at pH 9, making it suitable for application in rubber processing, which is carried out under alkaline conditions. The suitable proportion for applying the supernatant to the latex was two parts latex to one part enzyme; in this proportion, the latex remained stable throughout the 72 hours of treatment. The potential of strain M to degrade protein in natural rubber latex was demonstrated by a 79.3% reduction in nitrogen after 24 hours of treatment. Centrifuging the latex before treatment further increased the protein degradation in the latex. Although centrifugation did not achieve zero nitrogen content, it improved the protein degradation performance in the natural rubber.

Keywords: Hevea brasiliensis, Bacillus sp., protease, latex.

7648 AC Signals Estimation from Irregular Samples

Authors: Predrag B. Petrović

Abstract:

The paper deals with the estimation of the amplitude and phase of an analogue multi-harmonic band-limited signal from irregularly spaced sampling values. To this end, assuming the signal fundamental frequency is known in advance (i.e., estimated at an independent stage), a complexity-reduced algorithm for signal reconstruction in the time domain is proposed. The reduction in complexity is achieved owing to completely new analytical and summarized expressions that enable a quick estimation with low numerical error. The proposed algorithm for the calculation of the unknown parameters requires O((2M+1)²) flops, while the straightforward solution of the obtained equations takes O((2M+1)³) flops (M is the number of harmonic components). It is applicable in signal reconstruction, spectral estimation, system identification, as well as in other important signal processing problems. The proposed processing method can be used for precise RMS measurements (for power and energy) of a periodic signal based on the presented signal reconstruction. The paper investigates the errors related to the signal parameter estimation and presents computer simulations that demonstrate the accuracy of the algorithms.
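For orientation, the sketch below shows the straightforward least-squares route mentioned in the abstract (the O((2M+1)³) solution), not the authors' complexity-reduced algorithm; the function name and the synthetic test signal are illustrative assumptions.

```python
import numpy as np

def estimate_harmonics(t, x, f0, M):
    """Estimate amplitude and phase of M harmonics of a band-limited periodic
    signal from irregularly spaced samples (t, x), assuming the fundamental
    frequency f0 is known. Straightforward least-squares solution, i.e. the
    O((2M+1)^3) route, not the authors' complexity-reduced algorithm."""
    cols = [np.ones_like(t)]                       # DC term
    for k in range(1, M + 1):                      # cos/sin pair per harmonic
        cols.append(np.cos(2 * np.pi * k * f0 * t))
        cols.append(np.sin(2 * np.pi * k * f0 * t))
    A = np.column_stack(cols)                      # design matrix, shape (N, 2M+1)
    coeffs, *_ = np.linalg.lstsq(A, x, rcond=None)
    dc = coeffs[0]
    a, b = coeffs[1::2], coeffs[2::2]              # cosine / sine parts
    amplitude = np.hypot(a, b)
    phase = np.arctan2(-b, a)                      # x ≈ Σ A_k cos(2πk f0 t + φ_k)
    return dc, amplitude, phase

# Example: recover a two-harmonic signal from 50 irregular samples.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 1, 50))
x = 1.0 * np.cos(2 * np.pi * 5 * t + 0.3) + 0.5 * np.cos(2 * np.pi * 10 * t - 1.0)
print(estimate_harmonics(t, x, 5.0, 2))
```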

Keywords: Band-limited signals, Fourier coefficient estimation, analytical solutions, signal reconstruction, time.

7647 Holistic Simulation-Based Impact Analysis Framework for Sustainable Manufacturing

Authors: Mijoh A. Gbededo, Kapila Liyanage, Sabuj Mallik

Abstract:

The emerging approaches to sustainable manufacturing are considered to be solution-oriented, with the aim of addressing environmental, economic and social issues holistically. However, the analysis of the interdependencies amongst the three sustainability dimensions has not been fully captured in the literature. In a recent review of approaches to sustainable manufacturing, two categories of techniques were identified: 1) Sustainable Product Development (SPD) and 2) Sustainability Performance Assessment (SPA) techniques. The challenges of these approaches relate not only to the arguments and misconceptions about the relationships between the techniques and sustainable development, but also to the inability to capture and integrate the three sustainability dimensions. This requires a clear definition of some of the approaches and a road-map to the development of a holistic approach that supports sustainability decision-making. In this context, eco-innovation, social impact assessment, and life cycle sustainability analysis play an important role. This paper deployed an integrative approach that enabled the amalgamation of sustainable manufacturing approaches and the theories of reciprocity and motivation into a holistic simulation-based impact analysis framework. The findings of this research have the potential to guide sustainability analysts in capturing the aspects of the three sustainability dimensions in an analytical model. Additionally, the findings can aid the construction of a holistic simulation model of sustainable manufacturing and support effective decision-making.

Keywords: Life cycle sustainability analysis, sustainable manufacturing, sustainability performance assessment, sustainable product development.

7646 Kurtosis, Renyi's Entropy and Independent Component Scalp Maps for the Automatic Artifact Rejection from EEG Data

Authors: Antonino Greco, Nadia Mammone, Francesco Carlo Morabito, Mario Versaci

Abstract:

The goal of this work is to improve the efficiency and reliability of automatic artifact rejection, in particular from electroencephalographic (EEG) recordings. Artifact rejection is a key topic in signal processing: artifacts are unwelcome signals that may occur during signal acquisition and that may alter the analysis of the signals themselves. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for the artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by introducing Renyi's entropy. The performance of our method was evaluated by exploiting the Independent Component scalp maps and was compared with that of the method in the literature, which it was shown to outperform.
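As a minimal illustration of the statistical screening step, the sketch below flags independent components whose kurtosis or Renyi entropy deviates from the rest; the z-score threshold and histogram-based entropy estimator are illustrative assumptions, and the paper's full method additionally uses the component scalp maps.

```python
import numpy as np

def renyi_entropy(x, alpha=2.0, bins=64):
    """Renyi entropy of order alpha estimated from a histogram of x."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def mark_artifact_components(sources, z_thresh=2.0):
    """Flag independent components whose kurtosis or Renyi entropy deviates
    strongly from the other components (illustrative thresholding only)."""
    def kurtosis(x):
        x = (x - x.mean()) / x.std()
        return np.mean(x ** 4) - 3.0
    stats = np.array([[kurtosis(s), renyi_entropy(s)] for s in sources])
    z = (stats - stats.mean(axis=0)) / stats.std(axis=0)
    return np.where(np.any(np.abs(z) > z_thresh, axis=1))[0]

# sources: array of shape (n_components, n_samples) from an ICA decomposition
```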

Keywords: Artifact, EEG, Renyi's entropy, independent component analysis, kurtosis.

7645 The Effect of Contrived Success in Calculation Tasks on the Self-efficacy of Junior High School Students

Authors: Akitoshi Uchida, Kazuo Mori

Abstract:

This study examines whether contrived success on a task closely related to school subjects would promote students' self-efficacy. In our previous study, junior high school students who experienced contrived success on anagram tasks raised their sense of self-efficacy and kept it high for a year. We tried to replicate that study, substituting calculation tasks for the anagrams. One hundred eighteen junior high school students participated in this study, 18 of whom were surreptitiously given easier tasks than their classmates. Those students with easier tasks outperformed their peers and thereby raised their sense of self-efficacy. However, the elevated self-efficacy did not persist, falling to the starting level after only three months.

Keywords: self-efficacy, contrived success, junior high school students, calculation tasks.

7644 Auspicious Meaning for Community Souvenir Products

Authors: Somsakul Jerasilp, Jong Boonpracha

Abstract:

The objective of this research was to find the relationship between auspicious meaning in eastern wisdom and its interpretation as a guideline for the design and development of community souvenirs. The sample group comprised 400 customers in Bangkok who had previously bought community souvenir products. The information was used to design souvenirs, whose appropriateness was then judged by 5 design specialists. The data were analyzed for frequency, percentage, and standard deviation, with the following results. 1) The factor that best conveys auspicious meaning is color; applying auspicious meaning can add value to the product and bring good fortune to the recipient. 2) The effectiveness of integrating auspicious meaning into the design of community souvenir products was at a high level. When considered by aspect, the interpretation aspect was at a high level, and the congruency between the auspicious meaning and the utility of the product was at a high level. The attractiveness and quality of the design were at a very high level, while the potential for value added in the product design was at a high level. The suitability of the application to the design of community souvenir products was at a high level.

Keywords: Auspicious meaning, community products, souvenirs.

7643 Methanol Concentration Sensitive SWCNT/Nafion Composites

Authors: Kyongsoo Lee, Seong-Il Kim, Byeong-Kwon Ju

Abstract:

An aqueous methanol sensor for use in direct methanol fuel cell (DMFC) applications is demonstrated; the sensor is built using single-walled carbon nanotubes (SWCNTs) dispersed in Nafion 117 solution to detect the methanol concentration in water. The study is aimed at the potential use of the carbon nanotube array as a methanol sensor for DMFCs. The concentration of methanol in the fuel circulation loop of a DMFC system is an important operating parameter, because it determines the electrical performance and efficiency of the fuel cell system. The sensor is operative even at ambient temperatures and responds quickly to changes in the methanol concentration level. Such a sensor can easily be incorporated into the methanol fuel solution flow loop of the DMFC system.

Keywords: Methanol concentration, SWCNT, Nafion composites.

7642 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection

Authors: Hamidullah Binol, Abdullah Bal

Abstract:

Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for detecting the safety status of goods. Hyperspectral Imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of sugar distribution in melons, measuring ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. The technique yields exceptional detection capability that cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, which is the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase the nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by treating fat as the target class to be separated from the remaining classes (clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for fat ratio in ground meat.
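For context, the sketch below implements the linear Fukunaga-Koontz transform on two classes of pixel spectra (target vs. clutter); the kernelized version used in the paper replaces these covariance matrices with kernel Gram matrices, and the function name, the number of retained eigenvectors and the energy-based decision rule are illustrative assumptions.

```python
import numpy as np

def fukunaga_koontz(target, clutter, n_keep=5, eps=1e-8):
    """Linear Fukunaga-Koontz transform. target/clutter: arrays of shape
    (n_samples, n_features), e.g. pixel spectra of the two classes."""
    R1 = np.cov(target, rowvar=False)
    R2 = np.cov(clutter, rowvar=False)
    # Whiten the summed covariance so that R1' + R2' = I.
    lam, phi = np.linalg.eigh(R1 + R2)
    P = phi @ np.diag(1.0 / np.sqrt(lam + eps))
    R1w = P.T @ R1 @ P
    # Eigenvectors of R1w with the largest eigenvalues best represent the
    # target class and, by construction, least represent the clutter class.
    d, V = np.linalg.eigh(R1w)
    basis = V[:, np.argsort(d)[::-1][:n_keep]]

    def score(X):
        # Energy of each (whitened) sample along the target subspace.
        Z = X @ P @ basis
        return np.sum(Z ** 2, axis=1)
    return score

# score = fukunaga_koontz(fat_spectra, background_spectra)
# fat_map = score(all_pixel_spectra) > threshold   # illustrative decision rule
```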

Keywords: Food (Ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods.

7641 Project Based Learning for IT Personnel Resources Development using TVML

Authors: Tansuriyavong Suriyon, Endo Takanobu, Boonmee Choompol

Abstract:

Using animated videos as teaching materials is an effective learning method. However, we believe an even more effective learning method is for learners to produce the teaching videos themselves: the learners who act as producers must learn and understand the material well in order to produce and present it to others. The purpose of this study is to propose a project-based learning (PBL) technique based on co-producing videos of IT (information technology) teaching materials. We used the T2V player to produce the videos based on TVML, a TV program description language. With the proposed method, we assigned the learners to produce animated videos for the National Examination for Information Processing Technicians (IPA examination) in Japan, in order to have them learn various knowledge and skills in the IT field. Experimental results showed that a learning effect occurred during the video production process, which is useful for IT personnel resource development.

Keywords: TVML, T2V Player, animation made as learning materials, National Examination for Information Processing Technicians, IT Education, Problem Based Learning.

7640 Technique for Processing and Preservation of Human Amniotic Membrane for Ocular Surface Reconstruction

Authors: Irfan Z. Qureshi, Fareeha A., Wajid A. Khan

Abstract:

Human amniotic membrane (HAM) is a useful biological material for the reconstruction of a damaged ocular surface. The processing and preservation of HAM are critical to protect patients undergoing amniotic membrane transplantation (AMT) from cross-infection. For HAM preparation, a human placenta is obtained after an elective cesarean delivery. Before collection, the donor is screened to confirm seronegativity for HCV, HBsAg, HIV and syphilis. After collection, the placenta is washed in balanced salt solution (BSS) in a sterile environment. The amniotic membrane is then separated from the placenta and the chorion while keeping the preparation in BSS. Scraping of the HAM is then carried out manually until all the debris is removed and a clear, transparent membrane is obtained. Nitrocellulose membrane filters are then placed on the stromal side of the HAM and cut around the edges, with a little membrane folded towards the other side to make it easy to separate during surgery. The HAM is finally stored in a 1:1 solution of glycerine and Dulbecco's Modified Eagle Medium (DMEM) containing antibiotics. The capped Borosil vials containing HAM are kept at -80°C until use; at the time of surgery, a vial is thawed to room temperature and opened under sterile operating theatre conditions.

Keywords: HAM, AMT, ocular transplant

7639 Geochemical Study of Natural Bitumen, Condensate and Gas Seeps from Sousse Area, Central Tunisia

Authors: A. Belhaj Mohamed, M. Saidi, N. Boucherb, N. Ourtani, A. Soltani, I. Bouazizi, M. Ben Jrad

Abstract:

Natural hydrocarbon seepage has helped petroleum exploration as a direct indicator of subsurface gas and/or oil accumulations. Surface macro-seeps are generally an indication of a fault in an active Petroleum Seepage System belonging to a Total Petroleum System. This paper describes a case study in which multiple analytical techniques were used to identify and characterize trace petroleum-related hydrocarbons and other volatile organic compounds in groundwater samples collected from the Sousse aquifer (Central Tunisia). The analytical techniques used for the water samples included gas chromatography-mass spectrometry (GC-MS), capillary GC with flame-ionization detection, compound-specific isotope analysis, and Rock-Eval pyrolysis. The objective of the study was to confirm the presence of gasoline and other petroleum products or other volatile organic pollutants in those samples, in order to assess the respective implication of each of the potentially responsible parties in the contamination of the aquifer. In addition, the degree of contamination at different depths in the aquifer was also of interest. The oil and gas seeps were investigated using biomarker and stable carbon isotope analyses to perform oil-oil and oil-source rock correlations. The seepage gases are characterized by high CH4 content, very low δ13C values of CH4 (-71.9‰), high C1/C1-5 ratios (0.95-1.0), light deuterium-hydrogen isotope ratios (-198‰) and light δ13C values of C2 and CO2 (-23.8‰ and -23.8‰, respectively), indicating a thermogenic origin with a contribution of biogenic gas. An organic geochemistry study was carried out on more than ten oil seep samples, including light hydrocarbon and biomarker analyses (hopanes, steranes, n-alkanes, acyclic isoprenoids, and aromatic steroids) using GC and GC-MS. The studied samples show at least two distinct families, suggesting two different crude oil origins: the first group of oil seeps appears to be highly mature, shows evidence of chemical and/or biological degradation, and was derived from a clay-rich source rock deposited in suboxic conditions; it was sourced mainly by the lower Fahdene (Albian) source rocks. The second group was derived from a carbonate-rich source rock deposited in anoxic conditions and correlates well with the Bahloul (Cenomanian-Turonian) source rock.

Keywords: Biomarkers, oil and gas seeps, organic geochemistry, source rock.

7638 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program

Authors: Ming Wen, Nasim Nezamoddini

Abstract:

Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires the replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework to design a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives considering the feedback received from experts and program users.
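As a minimal sketch of the ranking step, the TOPSIS implementation below ranks design alternatives from a weighted decision matrix; the example scores and the AHP-derived weights are illustrative assumptions, not values from the paper.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    decision_matrix: (n_alternatives, n_criteria) scores,
    weights: criteria weights (e.g. from AHP) summing to 1,
    benefit: boolean array, True where larger is better."""
    X = np.asarray(decision_matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    V = weights * X / np.linalg.norm(X, axis=0)
    # Ideal and anti-ideal solutions per criterion.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)   # higher = closer to the ideal
    return np.argsort(closeness)[::-1], closeness

# Example: 3 program design alternatives scored on 4 criteria (illustrative).
rank, c = topsis([[7, 9, 9, 8], [8, 7, 8, 7], [9, 6, 8, 9]],
                 weights=np.array([0.4, 0.3, 0.2, 0.1]),
                 benefit=np.array([True, True, True, True]))
print(rank, c)
```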

Keywords: FEA, random vibration fatigue, process automation, AHP, TOPSIS, multiple-criteria decision-making, MCDM.

7637 A Proposed Program for Postgraduates in Egypt to Acquire the Skills and Techniques for Producing Concept Cartoons for Kindergarten Children

Authors: Ahmed Amin Mousa, M. Abd El Salam

Abstract:

The current study presents a proposed program for acquiring the skills and techniques needed to produce concept cartoons. The proposed program has been prepared for non-specialist students who have used neither graphics nor animation software. It was presented to postgraduates at the Faculty of Education for Early Childhood, Cairo University, during the spring term of the 2014-2015 academic year. The program covers three different aspects: drawing and image editing, sound manipulation, and creating animation. In addition, the researchers prepared a questionnaire for measuring the quality of the concept cartoons produced by the students. The questionnaire was used as a pre-test and post-test, and at the end of the study a significant difference was found in favour of the post-test results.

Keywords: Cartoon, concept cartoon, kindergarten, animation.

7636 High Temperature Hydrogen Sensors Based On Pd/Ta2O5/SiC MOS Capacitor

Authors: J. H. Choi, S. J. Kim, M. S. Jung, S. J. Kim, S. J. Joo, S. C. Kim

Abstract:

There are many needs for the development of SiC-based hydrogen sensors for harsh-environment applications. We fabricated and investigated Pd/Ta2O5/SiC-based hydrogen sensors with a MOS capacitor structure for high-temperature process monitoring and leak detection applications in the automotive, chemical and petroleum industries, as well as for direct monitoring of combustion processes. In this work, we used silicon carbide (SiC) as a substrate to replace silicon, whose operating temperatures are limited to below 200°C. Tantalum oxide was investigated as the dielectric layer because it has high permeability to hydrogen gas and high dielectric permittivity compared with silicon dioxide or silicon nitride. Electrical response properties, such as I-V curves and the dependence of capacitance on hydrogen concentration, were then analyzed over the range from room temperature to 500°C to evaluate the performance of the sensor.

Keywords: High temperature, hydrogen sensor, SiC, Ta2O5 dielectric layer.

7635 Video Quality Assessment Methods: A Bird’s-Eye View

Authors: P. M. Arun Kumar, S. Chandramathi

Abstract:

The proliferation of multimedia technology and services in today's world provides ample research scope in the frontiers of visual signal processing. Widespread usage of video-based applications in heterogeneous environments requires viable methods of Video Quality Assessment (VQA). The evaluation of video quality not only depends on high QoS requirements but also emphasizes the need for the novel term 'QoE' (Quality of Experience), which treats video quality as user-centric. This paper discusses two vital video quality assessment methods, namely subjective and objective assessment methods. The evolution of various video quality metrics, their classification models and applications are reviewed in this work. The Mean Opinion Score (MOS) based subjective measurements and algorithm-based objective metrics are discussed and their challenges are outlined. Further, this paper explores the recent progress of VQA in emerging technologies such as mobile video and 3D video.
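As a small concrete example of an algorithm-based full-reference objective metric of the kind surveyed here, the function below computes frame-wise PSNR; it is a generic illustration, not a metric proposed in the paper.

```python
import numpy as np

def psnr(reference, distorted, peak=255.0):
    """Peak Signal-to-Noise Ratio, a basic full-reference objective metric
    (computed frame-wise; more advanced VQA metrics also model temporal and
    perceptual effects)."""
    mse = np.mean((reference.astype(float) - distorted.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```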

Keywords: 3D-Video, no reference metric, quality of experience, video quality assessment, video quality metrics.

7634 Flexible Arm Manipulator Control for Industrial Tasks

Authors: Mircea Ivanescu, Nirvana Popescu, Decebal Popescu, Dorin Popescu

Abstract:

This paper addresses the control problem of a class of hyper-redundant arms. In order to avoid discrepancy between the mathematical model and the actual dynamics, the dynamic model with uncertain parameters of this class of manipulators is inferred. A procedure to design a feedback controller which stabilizes the uncertain system has been proposed. A PD boundary control algorithm is used in order to control the desired position of the manipulator. This controller is easy to implement from the point of view of measuring techniques and actuation. Numerical simulations verify the effectiveness of the presented methods. In order to verify the suitability of the control algorithm, a platform with a 3D flexible manipulator has been employed for testing. Experimental tests on this platform illustrate the applications of the techniques developed in the paper.
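To make the control law concrete, the sketch below simulates a PD position controller on a lumped single-link model; the gains, inertia and time step are illustrative assumptions and do not reproduce the distributed-parameter flexible-arm dynamics or the boundary-control design of the paper.

```python
def pd_torque(q_des, q, q_dot, kp=40.0, kd=8.0):
    """PD law: actuation proportional to position error minus a velocity
    damping term. Gains are illustrative, not taken from the paper."""
    return kp * (q_des - q) - kd * q_dot

# Illustrative simulation on a lumped single-link model with inertia J.
J, dt = 0.05, 0.001
q, q_dot, q_des = 0.0, 0.0, 1.0
for _ in range(5000):
    tau = pd_torque(q_des, q, q_dot)
    q_dot += (tau / J) * dt          # semi-implicit Euler integration
    q += q_dot * dt
print(round(q, 3))                   # settles near the desired position 1.0
```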

Keywords: Distributed model, flexible manipulator, observer, robot control.

7633 Integrating Computational Intelligence Techniques and Assessment Agents in ELearning Environments

Authors: Konstantinos C. Giotopoulos, Christos E. Alexakos, Grigorios N. Beligiannis, Spiridon D. Likothanassis

Abstract:

In this contribution an innovative platform is presented that integrates intelligent agents and evolutionary computation techniques into legacy e-learning environments. It introduces the design and development of a scalable and interoperable integration platform supporting: I) various assessment agents for e-learning environments, II) a specific resource retrieval agent for the provision of additional information from Internet sources matching the needs and profile of the specific user, and III) a genetic algorithm designed to extract efficient information (classifying rules) from the students' answering input data. The agents are implemented to provide intelligent assessment services based on computational intelligence techniques such as Bayesian Networks and Genetic Algorithms. The idea of using a Genetic Algorithm (GA) to fulfil this difficult task came from the fact that GAs have been widely used in applications including the classification of unknown data. The utilization of new and emerging technologies like web services allows the provided services to be integrated into any web-based legacy e-learning environment.
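As an illustration of how a GA can evolve classifying rules from answer data, the toy sketch below evolves a "question mask plus threshold" rule; the chromosome encoding, fitness function and GA parameters are illustrative assumptions, since the abstract does not specify the rule representation.

```python
import numpy as np

rng = np.random.default_rng(42)

def evolve_rule(answers, labels, pop_size=60, generations=80, p_mut=0.05):
    """Toy GA evolving a rule of the form 'predict class 1 if the student
    answered at least k of the selected questions correctly'.
    Chromosome = binary question mask + threshold gene. Illustrative only."""
    n_q = answers.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_q + 1))
    pop[:, -1] = rng.integers(1, n_q + 1, size=pop_size)      # threshold gene

    def fitness(ch):
        mask, k = ch[:n_q].astype(bool), max(ch[-1], 1)
        pred = answers[:, mask].sum(axis=1) >= k
        return np.mean(pred == labels)

    for _ in range(generations):
        scores = np.array([fitness(c) for c in pop])
        # Binary tournament selection.
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        winners = np.where(scores[idx[:, 0]] > scores[idx[:, 1]],
                           idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # One-point crossover and bit-flip mutation on the mask genes.
        cut = rng.integers(1, n_q, size=pop_size)
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            children[i, cut[i]:], children[i + 1, cut[i]:] = \
                parents[i + 1, cut[i]:].copy(), parents[i, cut[i]:].copy()
        flips = rng.random(children[:, :n_q].shape) < p_mut
        children[:, :n_q] ^= flips.astype(children.dtype)
        pop = children
    scores = np.array([fitness(c) for c in pop])
    return pop[scores.argmax()], scores.max()

# answers: binary matrix (students x questions); labels: binary target per student.
```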

Keywords: Bayesian Networks, Computational Intelligence techniques, E-learning legacy systems, Service Oriented Integration, Intelligent Agents, Genetic Algorithms.

7632 The Mass Attenuation Coefficients, Effective Atomic Cross Sections, Effective Atomic Numbers and Electron Densities of Some Halides

Authors: Shivalinge Gowda

Abstract:

The total mass attenuation coefficients μ/ρ of some halides, such as NaCl, KCl, CuCl, NaBr, KBr, RbCl, AgCl, NaI, KI, AgBr, CsI, HgCl2, CdI2 and HgI2, were determined at photon energies of 279.2, 320.07, 514.0, 661.6, 1115.5, 1173.2 and 1332.5 keV in a well-collimated, narrow-beam, good-geometry set-up using a high-resolution, hyper-pure germanium detector. The mass attenuation coefficients and the effective atomic cross sections are found to be in good agreement with the XCOM values. From these mass attenuation coefficients, the effective atomic cross sections σa of the compounds were determined. The σa data so obtained were then used to compute the effective atomic numbers Zeff. For this, the interpolation of total attenuation cross sections of photons of energy E in elements of atomic number Z was performed using logarithmic regression analysis of the data measured by the authors and reported earlier for the above energies, along with XCOM data for standard energies. The best-fit coefficients in the photon energy ranges of 250 to 350 keV, 350 to 500 keV, 500 to 700 keV, 700 to 1000 keV and 1000 to 1500 keV, obtained by a piecewise interpolation method, were then used to find the Zeff of the compounds from the effective atomic cross section σa. Using these Zeff values, the electron densities Nel of the halides were also determined. The present Zeff and Nel values of the halides are found to be in good agreement with the values calculated from XCOM data and other available published values.
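For illustration only, the sketch below shows the basic relations involved: extracting μ/ρ from a narrow-beam transmission measurement and interpolating Zeff from elemental cross sections on a logarithmic scale; the simple interpolation stands in for the piecewise regression of the paper, and the function names and input grids are assumptions.

```python
import numpy as np

def mass_attenuation(I0, I, areal_density):
    """Mass attenuation coefficient (cm^2/g) from a narrow-beam transmission
    measurement: I = I0 * exp(-(mu/rho) * t), where t is the areal density
    (g/cm^2) of the absorber."""
    return np.log(I0 / I) / areal_density

def effective_atomic_number(sigma_a, z_grid, sigma_grid):
    """Interpolate Z_eff from the effective atomic cross section sigma_a on
    a log-log grid of elemental cross sections vs Z (illustrative stand-in
    for the piecewise regression described in the abstract; sigma_grid is
    assumed to increase monotonically with z_grid)."""
    log_z = np.interp(np.log(sigma_a), np.log(sigma_grid), np.log(z_grid))
    return np.exp(log_z)
```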

Keywords: Mass attenuation coefficient, atomic cross-section, effective atomic number, electron density.

7631 Solution of Two Dimensional Quasi-Harmonic Equations with CA Approach

Authors: F. Rezaie Moghaddam, J. Amani, T. Rezaie Moghaddam

Abstract:

Many computational techniques have been applied to the solution of the heat conduction problem, including the finite difference (FD), finite element (FE) and, more recently, meshless methods. FE is commonly used to solve the heat conduction equation based on the assembly of the element stiffness matrices and the solution of the final system of equations. Because of this assembly process, the convergence rate is decreased. Hence, in the present paper a Cellular Automata (CA) approach is presented for the solution of the heat conduction problem. Each cell is considered as a fixed point in a regular grid, so that the solution of one large system of equations is replaced by discrete systems of equations of small dimension. Results show that CA can be used for the solution of the heat conduction problem.
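A minimal sketch of such a local update rule is given below: each interior cell moves toward the mean of its four neighbours, which on a regular grid coincides with an explicit finite-difference step for 2D heat conduction; the grid size, boundary condition and coefficient are illustrative assumptions.

```python
import numpy as np

def ca_heat_step(T, alpha=0.25):
    """One cellular-automaton update: each interior cell is adjusted toward
    the mean of its four neighbours (equivalent to an explicit finite-
    difference step for 2D heat conduction; alpha <= 0.25 keeps it stable)."""
    Tn = T.copy()
    Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + alpha * (
        T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
        - 4.0 * T[1:-1, 1:-1])
    return Tn

# Square plate with one hot edge; iterate the local rule toward steady state.
T = np.zeros((50, 50))
T[0, :] = 100.0                      # fixed boundary condition
for _ in range(2000):
    T = ca_heat_step(T)
print(round(T[25, 25], 2))
```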

Keywords: Heat conduction, cellular automata, convergence rate, discrete system.

7630 GPU Implementation for Solving Incompressible Two-Phase Flows

Authors: Sheng-Hsiu Kuo, Pao-Hsiung Chiu, Reui-Kuo Lin, Yan-Ting Lin

Abstract:

A one-step conservative level set method, combined with a global mass correction method, is developed in this study to simulate incompressible two-phase flows. The present framework does not need to solve the conservative level set scheme in two separate steps, and the global mass can be exactly conserved; the present method is therefore more efficient than the two-step conservative level set scheme. Dispersion-relation-preserving schemes are utilized for the advection terms. The pressure Poisson equation solver is ported to GPU computation using the pCDR library developed by the National Center for High-Performance Computing, Taiwan. SMP parallelization is used to accelerate the rest of the calculations. Three benchmark problems were solved for the performance evaluation, and good agreement with the referenced solutions is demonstrated for all the investigated problems.
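One simple way to realize a global mass correction for a conservative level-set field is to shift it by a constant so that the enclosed mass matches its initial value; the sketch below illustrates that idea only, and the smoothed-Heaviside width, bisection bracket and function name are assumptions rather than the paper's specific formulation.

```python
import numpy as np

def global_mass_correction(phi, target_mass, eps=1.5, tol=1e-10):
    """Shift the level-set field phi by a constant c so that the integral of
    the smoothed Heaviside matches target_mass (illustrative sketch; assumes
    the required shift lies within one field range of zero)."""
    def mass(c):
        return np.sum(0.5 * (1.0 + np.tanh((phi + c) / eps)))
    lo, hi = -np.ptp(phi), np.ptp(phi)
    while hi - lo > tol:                 # bisection on the shift c
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if mass(mid) < target_mass else (lo, mid)
    return phi + 0.5 * (lo + hi)
```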

Keywords: Conservative level set method, two-phase flow, dispersion-relation-preserving, graphics processing unit (GPU), multi-threading.

7629 Equivalence Class Subset Algorithm

Authors: Jeffrey L. Duffany

Abstract:

The equivalence class subset algorithm is a powerful tool for solving a wide variety of constraint satisfaction problems and is based on the use of a decision function which has a very high, but not perfect, accuracy. Perfect accuracy is not required in the decision function, as even a suboptimal solution contains valuable information that can be used to help find an optimal solution. In the hardest problems, the decision function can break down, leading to a suboptimal solution with more equivalence classes than necessary, which can be viewed as a mixture of good and bad decisions. By choosing a subset of the decisions made in reaching a suboptimal solution, an iterative technique can lead to an optimal solution through a series of steadily improved suboptimal solutions. The goal is to reach an optimal solution as quickly as possible. Various techniques for choosing the decision subset are evaluated.

Keywords: NP-complete, complexity, algorithm.

7628 Equalization Algorithms for MIMO System

Authors: Said Elkassimi, Said Safi, B. Manaut

Abstract:

In recent years, multi-antenna techniques have been considered as a potential solution to increase the throughput of future wireless communication systems. The objective of this article is to study the MIMO (Multiple Input Multiple Output) transmission and reception system and to present the different reception decoding techniques. We first present the least complex techniques, linear receivers such as the zero-forcing (ZF) and minimum mean squared error (MMSE) equalizers, followed by a nonlinear technique, ordered successive interference cancellation (OSIC), and the optimal detector based on the maximum likelihood (ML) criterion. Finally, we simulate the associated decoding algorithms for the MIMO system (ZF, MMSE, OSIC and ML) and compare the performance of these algorithms in the MIMO context.
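For reference, the sketch below shows the two linear detectors in their textbook form applied to a 2x2 channel; the channel, symbols and noise level are illustrative assumptions, and the OSIC and ML detectors compared in the paper are not reproduced here.

```python
import numpy as np

def zf_detect(H, y):
    """Zero-forcing detection: channel inversion (amplifies noise when H is
    ill-conditioned)."""
    return np.linalg.pinv(H) @ y

def mmse_detect(H, y, noise_var):
    """Linear MMSE detection: trades residual interference against noise
    amplification."""
    n_tx = H.shape[1]
    W = np.linalg.inv(H.conj().T @ H + noise_var * np.eye(n_tx)) @ H.conj().T
    return W @ y

# 2x2 MIMO example with QPSK-like symbols (illustrative values).
rng = np.random.default_rng(1)
H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
s = np.array([1 + 1j, -1 + 1j]) / np.sqrt(2)
y = H @ s + 0.05 * (rng.normal(size=2) + 1j * rng.normal(size=2))
print(np.round(zf_detect(H, y), 2), np.round(mmse_detect(H, y, 0.005), 2))
```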

Keywords: Multiple Input Multiple Output (MIMO), ZF, MMSE, Ordered Successive Interference Cancellation (OSIC), ML, Successive Interference Cancellation (SIC).

7627 Image Restoration in Non-Linear Filtering Domain using MDB approach

Authors: S. K. Satpathy, S. Panda, K. K. Nagwanshi, C. Ardil

Abstract:

This paper proposes a new technique based on a nonlinear Minmax Detector Based (MDB) filter for image restoration. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to the original. Image degradation can be due to the addition of different types of noise to the original image; impulse noise is one of them. Impulse noise generates pixels with gray values that are not consistent with their local neighborhood, and it appears as a sprinkle of both light and dark spots, or only light spots, in the image. Filtering is a technique for enhancing the image. In linear filtering, the value of an output pixel is a linear combination of neighborhood values, which can blur the image; thus a variety of nonlinear smoothing techniques have been developed. The median filter is one of the most popular nonlinear filters: for a small neighborhood it is highly effective, but for a large window and in the case of high noise it introduces more blurring in the image. The Centre Weighted Mean (CWM) filter has a better average performance than the median filter; however, under high noise conditions corrupted original pixels may be retained, and this technique also has a blurring effect on the image. To illustrate the superiority of the proposed approach, the new scheme has been simulated along with the standard ones, and various restoration performance measures have been compared.
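As a baseline for comparison, the sketch below implements the plain sliding-window median filter discussed in the abstract and applies it to a toy salt-and-pepper-corrupted image; the proposed MDB filter itself is not reproduced, and the image, noise density and window size are illustrative assumptions.

```python
import numpy as np

def median_filter(img, k=3):
    """Plain sliding-window median filter (the classical nonlinear baseline;
    edge pixels are handled by replicate padding)."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

# Salt-and-pepper corruption and restoration on a toy image.
rng = np.random.default_rng(0)
clean = np.full((64, 64), 128, dtype=np.uint8)
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.1
noisy[mask] = rng.choice([0, 255], size=mask.sum())
restored = median_filter(noisy)
```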

Keywords: Filtering, Minmax Detector Based (MDB), noise, centre weighted mean filter, PSNR, restoration.

7626 An Evaluation on Fixed Wing and Multi-Rotor UAV Images Using Photogrammetric Image Processing

Authors: Khairul Nizam Tahar, Anuar Ahmad

Abstract:

This paper introduces slope photogrammetric mapping using unmanned aerial vehicles (UAVs). Two UAVs were used in this study, namely a fixed-wing and a multi-rotor platform, and both were used to capture images of the study area. A consumer digital camera was mounted vertically at the bottom of each UAV and captured images at a set altitude. The objectives of this study are to obtain three-dimensional coordinates of the slope area and to determine the accuracy of the photogrammetric products produced from both UAVs. Several control points and checkpoints were established using a Real Time Kinematic Global Positioning System (RTK-GPS) in the study area. All images acquired from both UAVs went through the photogrammetric processes of interior orientation, exterior orientation, aerial triangulation and bundle adjustment using photogrammetric software. Two primary results were produced in this study, namely a digital elevation model and a digital orthophoto. Based on the results, the UAV system can be used to map slope areas, especially for projects with limited budgets and time constraints.

Keywords: Slope mapping, 3D, DEM, UAV, Photogrammetry, image processing.

7625 Incremental Learning of Independent Topic Analysis

Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda

Abstract:

In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing collection of documents. The amount of document data has been increasing since the spread of the Internet, and ITA was proposed as one method to analyze such data. ITA extracts independent topics from document data by using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing number of documents, because ITA must use all the document data, so the temporal and spatial costs are very high. Therefore, we present Incremental ITA, which extracts independent topics from a growing document collection by updating the topics when new documents are added, starting from the topics extracted from the previous data. We show the results of applying Incremental ITA to benchmark datasets.
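For orientation, the sketch below shows the batch ITA baseline: ICA applied to a TF-IDF representation of the documents, reporting the top terms of each independent component; the incremental update step of the paper is not reproduced, and the vectorizer settings and topic count are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_extraction.text import TfidfVectorizer

def extract_independent_topics(documents, n_topics=5, top_k=10):
    """Batch ITA baseline: apply ICA to a TF-IDF term-document representation
    and report the highest-weighted terms per independent component (requires
    at least n_topics documents)."""
    vec = TfidfVectorizer(stop_words='english')
    X = vec.fit_transform(documents).toarray()          # documents x terms
    ica = FastICA(n_components=n_topics, random_state=0)
    ica.fit(X)
    terms = np.array(vec.get_feature_names_out())
    topics = []
    for comp in ica.components_:                        # term weights per topic
        topics.append(terms[np.argsort(np.abs(comp))[::-1][:top_k]])
    return topics
```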

Keywords: Text mining, topic extraction, independent, incremental, independent component analysis.

7624 Fast Cosine Transform to Increase Speed-up and Efficiency of Karhunen-Loève Transform for Lossy Image Compression

Authors: Mario Mastriani, Juliana Gambini

Abstract:

In this work, we present a comparison between two techniques of image compression. In the first case, the image is divided into blocks which are collected according to a zig-zag scan. In the second, we apply the Fast Cosine Transform to the image, and then the transformed image is divided into blocks which are likewise collected according to a zig-zag scan. In both cases, the Karhunen-Loève transform is then applied to these blocks. In addition, we present three new metrics based on eigenvalues for a better comparative evaluation of the techniques. Simulations show that the combined version is the best, with lower Mean Absolute Error (MAE) and Mean Squared Error (MSE), higher Peak Signal-to-Noise Ratio (PSNR) and better image quality. Finally, the new technique was far superior to JPEG and JPEG2000.
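A minimal sketch of the combined scheme is given below: a 2D DCT of the image followed by a Karhunen-Loeve transform over its 8x8 blocks, keeping only a few principal components; raster block ordering is used here instead of the zig-zag collection, and the block size and number of retained components are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

def klt_compress(img, block=8, keep=12):
    """Combined-scheme sketch: 2D DCT of the image, then a Karhunen-Loeve
    transform (PCA) over its blocks, keeping 'keep' principal components."""
    h, w = (d - d % block for d in img.shape)           # crop to block multiple
    coeffs = dctn(img[:h, :w].astype(float), norm='ortho')
    blocks = (coeffs.reshape(h // block, block, w // block, block)
                    .transpose(0, 2, 1, 3).reshape(-1, block * block))
    mean = blocks.mean(axis=0)
    _, _, Vt = np.linalg.svd(blocks - mean, full_matrices=False)
    basis = Vt[:keep]                                   # KLT basis from block statistics
    coded = (blocks - mean) @ basis.T                   # compressed representation
    rec_blocks = coded @ basis + mean
    rec = (rec_blocks.reshape(h // block, w // block, block, block)
                     .transpose(0, 2, 1, 3).reshape(h, w))
    return idctn(rec, norm='ortho')                     # reconstructed image

# reconstructed = klt_compress(gray_image)              # gray_image: 2D array
```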

Keywords: Fast Cosine Transform, image compression, JPEG, JPEG2000, Karhunen-Loève Transform, zig-zag scan.

7623 Automatic Threshold Search for Heat Map Based Feature Selection: A Cancer Dataset Analysis

Authors: Carlos Huertas, Reyes Juarez-Ramirez

Abstract:

Public health is one of the most critical issues today; therefore, there is great interest in improving technologies in the area of disease detection. With machine learning and feature selection, it has been possible to aid the diagnosis of several diseases such as cancer. In this work, we present an extension to the Heat Map Based Feature Selection algorithm; this modification allows automatic threshold parameter selection, which helps to improve generalization performance on high-dimensional data such as mass spectrometry. We have performed a comparative analysis using multiple cancer datasets against the well-known Recursive Feature Elimination algorithm and our original proposal; the results show improved classification performance that is very competitive with current techniques.

Keywords: Feature selection, mass spectrometry, biomarker discovery, cancer.

7622 Speech Enhancement Using Wavelet Coefficients Masking with Local Binary Patterns

Authors: Christian Arcos, Marley Vellasco, Abraham Alcaim

Abstract:

In this paper, we present a wavelet coefficients masking approach based on Local Binary Patterns (WLBP) to enhance the temporal spectra of the wavelet coefficients for speech enhancement. This technique exploits the wavelet denoising scheme, which splits the degraded speech into pyramidal subband components and extracts frequency information without losing temporal information. Speech enhancement in each high-frequency subband is performed through binary labels obtained by local binary pattern masking, which encodes the ratio between the original value of each coefficient and the values of the neighbouring coefficients. This approach enhances the high-frequency spectra of the wavelet transform instead of eliminating them through a threshold. A comparative analysis is carried out with conventional speech enhancement algorithms, demonstrating that the proposed technique achieves significant improvements in terms of PESQ, an internationally recommended objective measure for estimating subjective speech quality. Informal listening tests also show that, in an acoustic context, the proposed method improves the quality of speech and avoids the annoying musical noise present in other speech enhancement techniques. Experimental results obtained with a DNN-based speech recognizer in noisy environments corroborate the superiority of the proposed scheme in the robust speech recognition scenario.

Keywords: Binary labels, local binary patterns, mask, wavelet coefficients, speech enhancement, speech recognition.

7621 Development and Optimization of Automated Dry-Wafer Separation

Authors: Tim Giesen, Christian Fischmann, Fabian Böttinger, Alexander Ehm, Alexander Verl

Abstract:

In a state-of-the-art industrial production line for photovoltaic products, the handling and automation processes are of particular importance. In being processed into a fully functional crystalline solar cell, an as-cut photovoltaic wafer is subjected to numerous repeated handling steps. Given stronger requirements on productivity and the need to decrease rejections due to defects, the mechanical stress on the thin wafers has to be reduced to a minimum, as fragility increases with decreasing wafer thickness. In relation to the increasing wafer fragility, research at the Fraunhofer Institutes IPA and CSP showed a negative correlation between multiple handling processes and wafer integrity. Recent work therefore focused on the analysis and optimization of the dry wafer stack separation process with compressed air. The achievement of a wafer-sensitive process capability together with a high production throughput rate is the basic motivation of this research.

Keywords: Automation, Photovoltaic Manufacturing, Thin Wafer, Material Handling

7620 Thermo-Mechanical Processing of Armor Steel Plates

Authors: Taher El-Bitar, Maha El-Meligy, Eman El-Shenawy, Almosilhy Almosilhy, Nader Dawood

Abstract:

The steel contains 0.3% C and 0.004% B, besides Mn, Cr, Mo, and Ni. The alloy was processed using a 20-ton capacity electric arc furnace (EAF) and then refined in a ladle furnace (LF). The liquid steel was cast as rectangular ingots. A dilatation test showed the critical transformation temperatures Ac1, Ac3, Ms and Mf to be 716, 835, 356, and 218 °C, respectively. The ingots were austenitized, soaked, and then rough rolled to thin slabs of 80 mm thickness. The thin slabs were then reheated and soaked for finish rolling to 6.0 mm thick plates. During rough rolling, the roll force increases as a result of rolling at temperatures below the recrystallization temperature. During finish rolling, however, the steel initially shows continuous static recrystallization, after which it exhibits strain hardening due to the fall in temperature. It was concluded that the steel plates were successfully heat treated by quenching and tempering at 250 °C for 20 min.

Keywords: Armor steel, austenitizing, critical transformation temperatures, dilatation curve, martensite, quenching, rough and finish rolling processes, soaking, tempering, thermo-mechanical processing.
