Search results for: Protein Structure Data.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9858

8388 Large-Dimensional Shells under Mining Tremors from Various Mining Regions in Poland

Authors: Joanna M. Dulińska, Maria Fabijańska

Abstract:

This paper presents a detailed analysis of the dynamic response of a cooling tower shell to mining tremors originating from the two main regions of mining activity in Poland (the Upper Silesian Coal Basin and the Legnica-Glogow Copper District). Representative time histories registered in both regions were used as ground motion data in the calculations of the dynamic response of the structure. It was shown that the dynamic response of the shell depends strongly not only on the level of vibration amplitudes but also on the dominant frequency range of the mining shock, which is characteristic of the mining region. The vertical component of vibrations also turned out to have a considerable influence on the total dynamic response of the shell. Finally, the non-uniformity of kinematic excitation resulting from the spatial variability of ground motion proved to play a significant role in the dynamic analysis of large-dimensional shells under mining shocks.

Keywords: Cooling towers, dynamic response, mining tremors, non-uniform kinematic excitation

8387 Attitudes of Gratitude: An Analysis of 30 Cancer Narratives Published by Leading U.S. Cancer Care Centers

Authors: Maria L. McLeod

Abstract:

This study examines the ways in which cancer patient narratives are portrayed and framed on the websites of three leading U.S. cancer care centers – The University of Texas MD Anderson Cancer Center in Houston, Memorial Sloan Kettering Cancer Center in New York, and Seattle Cancer Care Alliance. Thirty patient stories, 10 from each cancer center website blog, were analyzed using qualitative and quantitative textual analysis of unstructured data, documenting common themes and other elements of story structure and content. Patient narratives were coded using grounded theory as the basis for conducting emergent qualitative research. As part of a systematic, inductive approach to collecting and analyzing data, recurrent and unique themes were examined and compared in terms of positive and negative framing, patient agency, and institutional praise. All three of these cancer care centers are teaching hospitals, with university affiliations, that emphasize an evidence-based scientific approach to treatment that utilizes the latest research and cutting-edge techniques and technology. The featured cancer stories suggest positive outcomes based on anecdotal narratives as opposed to the science-based treatment models employed by the cancer centers. An analysis of 30 sample stories found skewed representation of the “cancer experience” that emphasizes positive outcomes while minimizing or excluding more negative realities of cancer diagnosis and treatment. The stories also deemphasize patient agency, instead focusing on deference and gratitude toward the cancer care centers, which are cast in the role of savior.  

Keywords: Cancer framing, cancer narratives, survivor stories, patient narratives.

8386 GeNS: a Biological Data Integration Platform

Authors: Joel Arrais, João E. Pereira, João Fernandes, José Luís Oliveira

Abstract:

The scientific achievements of molecular biology depend greatly on the capability of computational applications to analyze laboratory results. A comprehensive analysis of an experiment typically requires studying the obtained dataset together with data available in several distinct public databases. Nevertheless, developing centralized access to these distributed databases raises a set of challenges: choosing the best integration strategy, resolving nomenclature clashes, handling data that overlaps between databases, and dealing with huge datasets. In this paper we present GeNS, a system that uses a simple yet innovative approach to address several biological data integration issues. Compared with existing systems, the main advantages of GeNS are its maintenance simplicity and its coverage and scalability in terms of the number of supported databases and data types. To support our claims we present the current use of GeNS in two concrete applications. GeNS currently contains more than 140 million biological relations and can be publicly downloaded or remotely accessed through SOAP web services.

Keywords: Data integration, biological databases

8385 A Study of RSCMAC Enhanced GPS Dynamic Positioning

Authors: Ching-Tsan Chiang, Sheng-Jie Yang, Jing-Kai Huang

Abstract:

The purpose of this research is to develop and apply the RSCMAC to enhance the dynamic accuracy of the Global Positioning System (GPS). GPS devices provide accurate positioning, speed detection, and a highly precise time standard for over 98% of the Earth's surface. The overall operation of GPS involves 24 satellites in space; signal transmission on two carrier frequencies (Link 1 and Link 2) with two sets of pseudorandom codes (the C/A code and the P code); and ground monitoring stations or client GPS receivers. With only four satellites, the client's position and elevation can be determined rapidly, and the more satellites that are receivable, the more accurately the position can be decoded. The standard positioning accuracy of simplified GPS receivers has improved greatly, but because of satellite clock errors and tropospheric and ionospheric delays, current measurement accuracy is on the order of 5-15 m. To increase dynamic GPS positioning accuracy, most researchers rely on an inertial navigation system (INS) or on additional sensors or maps for assistance. This research instead exploits the RSCMAC's advantages of fast learning, guaranteed convergence, and the ability to handle time-dependent dynamic system problems, together with a static positioning calibration structure, to improve GPS dynamic accuracy. The improvement is achieved by using the RSCMAC with GPS receivers to collect dynamic error data for error prediction and then using the predicted error to correct the GPS dynamic positioning data. The ultimate purpose of this research is to reduce the dynamic positioning error of low-cost GPS receivers, so that their economic benefit grows as their accuracy increases.
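
As an illustration of the correction pipeline described in the abstract, the sketch below trains a model on previously collected dynamic error data and subtracts the predicted error from new fixes. A plain ridge regression stands in for the RSCMAC network, and the track, noise and bias are synthetic; this is a minimal sketch of the idea, not the authors' implementation.

```python
# Illustrative pipeline: learn the GPS dynamic error from recent fixes,
# then subtract the predicted error from later fixes.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
t = np.arange(2000) * 0.1
truth = np.column_stack([10 * np.sin(0.05 * t), 10 * np.cos(0.05 * t)])     # true track (m)
gps = truth + rng.normal(0, 3, truth.shape) + 2 * np.sin(0.2 * t)[:, None]  # biased, noisy fixes
error = gps - truth                                                          # dynamic error (known offline)

def lagged(x, lags=5):
    """Stack the previous `lags` fixes as features for each sample."""
    return np.array([x[i - lags:i].ravel() for i in range(lags, len(x))])

X, y = lagged(gps), error[5:]
model = Ridge(alpha=1.0).fit(X[:1500], y[:1500])        # learn error from history
corrected = gps[5:][1500:] - model.predict(X[1500:])    # subtract predicted error
rmse = lambda a, b: np.sqrt(((a - b) ** 2).mean())
print("raw RMSE      :", rmse(gps[5:][1500:], truth[5:][1500:]))
print("corrected RMSE:", rmse(corrected, truth[5:][1500:]))
```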

Keywords: Dynamic Error, GPS, Prediction, RSCMAC.

8384 A Modified Run Length Coding Technique for Test Data Compression Based on Multi-Level Selective Huffman Coding

Authors: C. Kalamani, K. Paramasivam

Abstract:

Test data compression is an efficient method for reducing test application cost. The problem of reducing test data has been addressed by researchers from three different angles: test data compression, built-in self-test (BIST), and test set compaction. The latter two methods can enhance fault coverage at the cost of hardware overhead. The drawback of conventional methods is that they reduce test storage and test power, but when the test data contain redundant run lengths no additional compression is applied. This paper presents a modified Run Length Coding (RLC) technique combined with Multilevel Selective Huffman Coding (MLSHC) to reduce test data volume, test pattern delivery time, and power dissipation in scan test applications; whenever a redundant run length is encountered, the corresponding run symbol is replaced with a tiny codeword. Experimental results show that the presented method not only improves test data compression but also reduces the overall test data volume compared to recent schemes. Experiments on the six largest ISCAS-89 benchmarks show that our method outperforms most known techniques.
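
For readers unfamiliar with run-length coding of scan vectors, the toy sketch below shows the basic run-symbol step; the multi-level selective Huffman stage that the paper layers on top is not reproduced, and the bit vector is invented.

```python
# Minimal run-length coding of a scan test vector: runs of identical bits become
# (bit, run_length) pairs that a selective Huffman coder could then compress further.
from itertools import groupby

def rle_encode(bits):
    return [(b, len(list(g))) for b, g in groupby(bits)]

def rle_decode(pairs):
    return "".join(b * n for b, n in pairs)

vector = "0000000011110000000000110000"
pairs = rle_encode(vector)
print(pairs)          # [('0', 8), ('1', 4), ('0', 10), ('1', 2), ('0', 4)]
assert rle_decode(pairs) == vector
```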

Keywords: Modified run length coding, multilevel selective Huffman coding, built-in-self-test modified selective Huffman coding, automatic test equipment.

8383 EEIA: Energy Efficient Indexed Aggregation in Smart Wireless Sensor Networks

Authors: Mohamed Watfa, William Daher, Hisham Al Azar

Abstract:

The main idea behind in-network aggregation is that, rather than sending individual data items from sensors to sinks, multiple data items are aggregated as they are forwarded by the sensor network. Existing sensor network data aggregation techniques assume that the nodes are preprogrammed and send data to a central sink for offline querying and analysis. This approach faces two major drawbacks. First, the system behavior is preprogrammed and cannot be modified on the fly. Second, the increased energy wastage due to communication overhead shortens the overall system lifetime. Energy conservation is therefore a prime consideration in sensor network protocols in order to maximize the network's operational lifetime. In this paper, we give an energy efficient approach to query processing by implementing new optimization techniques applied to in-network aggregation. We first discuss earlier approaches to sensor data management and highlight their disadvantages. We then present our approach, "Energy Efficient Indexed Aggregation" (EEIA), and evaluate it through several simulations to prove its efficiency, competence and effectiveness.
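
The toy sketch below illustrates plain in-network aggregation: each parent merges its children's partial aggregates so that only one small record travels up each link instead of every raw reading. The tree, node names and readings are invented; the indexing and energy optimizations of EEIA itself are not modeled.

```python
# Toy in-network aggregation over a routing tree: one (count, sum, min, max)
# record per link instead of forwarding every individual reading to the sink.
tree = {"sink": ["n1", "n2"], "n1": ["n3", "n4"], "n2": [], "n3": [], "n4": []}
readings = {"n1": 21.5, "n2": 22.0, "n3": 20.8, "n4": 23.1}

def aggregate(node):
    """Return (count, total, minimum, maximum) for the subtree rooted at node."""
    parts = [aggregate(child) for child in tree[node]]
    if node in readings:
        parts.append((1, readings[node], readings[node], readings[node]))
    count = sum(p[0] for p in parts)
    total = sum(p[1] for p in parts)
    return (count, total, min(p[2] for p in parts), max(p[3] for p in parts))

count, total, lo, hi = aggregate("sink")
print(f"avg={total / count:.2f}  min={lo}  max={hi}  messages_sent={len(tree) - 1}")
```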

Keywords: Sensor Networks, Data Base, Data Fusion, Aggregation, Indexing, Energy Efficiency

8382 FengShui Paradigm as Philosophy of Sustainable Design

Authors: E. Erdogan, H. A. Erdogan

Abstract:

FengShui, an old Chinese discipline dating back more than 5,000 years, is one of the design principles that aim at creating habitable and sustainable spaces in harmony with nature by systematizing data within its own structure. Having emerged from Chinese mysticism and embodying elements of faith in its principles, FengShui argues that positive energy in the environment channels human behavior and psychology. This argument is supported by the thesis of quantum physics that 'everything is made up of energy', and thereby gains an important place. Through several principles and systematized rules applied to living and working spaces, FengShui promises a happier, more peaceful and comfortable life by influencing human psychology, actions, and soul as well as the professional and social life of the individual. Observing these design properties in houses, workplaces, offices, the environment, and daily life as a design paradigm is significant. In this study, how FengShui, a Central Asian culture emanating from Chinese mysticism, shapes design and how it is used as an element of sustainable design will be explained.

Keywords: FengShui, design principle, sustainability.

8381 Comparison of Different Methods to Produce Fuzzy Tolerance Relations for Rainfall Data Classification in the Region of Central Greece

Authors: N. Samarinas, C. Evangelides, C. Vrekos

Abstract:

The aim of this paper is the comparison of three different methods for producing fuzzy tolerance relations for rainfall data classification: the correlation coefficient, cosine amplitude, and max-min methods. The data were obtained from seven rainfall stations in the region of central Greece and refer to 20-year time series of average monthly rainfall depth. Each method was used to express these data as a fuzzy tolerance relation, which was then transformed into an equivalence relation by max-min composition. From the equivalence relation, the rainfall stations were categorized and classified according to the degree of confidence. The classification shows the similarities among the rainfall stations. Stations with high similarity can be used interchangeably in water resource management scenarios, or data from one can be used to augment data from another. Because of the complexity of the calculations, it is important to find out which of the methods is computationally simpler and needs fewer compositions to give reliable results.
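
As a concrete illustration of one of the three methods, the sketch below applies the cosine amplitude method to made-up monthly rainfall vectors, closes the resulting tolerance relation into an equivalence relation by repeated max-min composition, and groups stations with a lambda-cut. The data and the cut level are placeholders, not the paper's values.

```python
# Cosine-amplitude fuzzy tolerance relation -> max-min transitive closure -> lambda-cut.
import numpy as np

X = np.array([              # rows: stations, columns: monthly rainfall (mm), synthetic
    [90, 80, 60, 30, 20, 55],
    [95, 85, 58, 28, 22, 50],
    [40, 45, 70, 90, 85, 60],
    [42, 50, 72, 88, 80, 65],
])

def cosine_amplitude(X):
    n = len(X)
    R = np.eye(n)
    for i in range(n):
        for j in range(n):
            R[i, j] = abs(X[i] @ X[j]) / np.sqrt((X[i] @ X[i]) * (X[j] @ X[j]))
    return R

def max_min(A, B):
    return np.array([[np.max(np.minimum(A[i], B[:, j])) for j in range(B.shape[1])]
                     for i in range(A.shape[0])])

R = cosine_amplitude(X)
while True:                 # compose until the relation stops changing (transitive closure)
    R2 = max_min(R, R)
    if np.allclose(R2, R):
        break
    R = R2

lam = 0.98                  # lambda-cut: stations i, j grouped together if R[i, j] >= lam
print((R >= lam).astype(int))
```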

Keywords: Classification, fuzzy logic, tolerance relations, rainfall data.

8380 Granularity Analysis for Spatio-Temporal Web Sensors

Authors: Shun Hattori

Abstract:

In recent years, much research has been devoted to mining the exploding Web, especially User Generated Content (UGC) such as weblogs, for knowledge about various phenomena and events in the physical world, and Web services based on Web-mined knowledge have begun to be developed for the public. However, there are few detailed investigations of how accurately Web-mined data reflect physical-world data, and it would be problematic to rely on Web-mined data in public Web services without sufficiently ensuring their accuracy. This paper therefore introduces the simplest Web sensor and a spatiotemporally normalized Web sensor to extract spatiotemporal data about a target phenomenon from weblogs retrieved by keyword(s) representing that phenomenon, and tries to validate the potential and reliability of the Web-sensed spatiotemporal data through four kinds of granularity analyses of the correlation coefficient with per-day, per-region temperature, rainfall, snowfall, and earthquake statistics from the Japan Meteorological Agency as physical-world data: spatial granularity (a region's population density), temporal granularity (the time period, e.g., per day vs. per week), representation granularity (e.g., "rain" vs. "heavy rain"), and media granularity (weblogs vs. microblogs such as tweets).
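
A minimal version of such a granularity check might look like the sketch below: correlate a Web-sensed daily keyword count with the corresponding physical-world statistic, then repeat at a coarser temporal granularity. Both series here are synthetic placeholders rather than real weblog or JMA data.

```python
# Correlate a "Web sensor" time series (keyword counts) with a physical-world series,
# at daily and then weekly temporal granularity.
import numpy as np

rng = np.random.default_rng(1)
rainfall_mm = rng.gamma(shape=2.0, scale=5.0, size=365)            # physical-world data
blog_posts = (0.8 * rainfall_mm + rng.normal(0, 4, 365)).clip(0)   # Web-sensed counts ("rain")

r = np.corrcoef(blog_posts, rainfall_mm)[0, 1]
print(f"daily correlation: {r:.2f}")

weekly_rain = rainfall_mm[:364].reshape(52, 7).sum(axis=1)         # coarser granularity
weekly_blog = blog_posts[:364].reshape(52, 7).sum(axis=1)
print(f"weekly correlation: {np.corrcoef(weekly_blog, weekly_rain)[0, 1]:.2f}")
```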

Keywords: Granularity analysis, knowledge extraction, spatiotemporal data mining, Web credibility, Web mining, Web sensor.

8379 Some Biological and Molecular Characterization of Bean Common Mosaic Necrosis Virus Isolated from Soybean in Tehran Province, Iran

Authors: F. S. Abtahi, M. Koohi Hbibi, M. Khodaei Motlagh

Abstract:

Bean common mosaic necrosis virus (BCMNV) is a potyvirus with a worldwide distribution. This virus causes serious economic losses in many legume crops in Iran. During 2008, samples were collected from soybean fields in Tehran Province. The isolates (S1, S2 and S3) were inoculated onto 15 species of Cucurbitaceae, Chenopodiaceae, Solanaceae and Leguminosae. Chenopodium quinoa and C. amaranticolor did not develop any symptoms, whereas all isolates caused mosaic symptoms on Phaseolus vulgaris cv. Red Kidney and P. vulgaris cv. Bountiful. The molecular weight of the coat protein, estimated using SDS-PAGE and western blotting, was 33 kDa. Reverse transcription polymerase chain reaction (RT-PCR) was performed using a primer pair designed by L. Xu et al., and a fragment of approximately 920 bp was amplified with the specific primers.

Keywords: ELISA, RT-PCR, SDS-PAGE, BCMNV.

8378 Characteristics of Hemodynamics in a Bileaflet Mechanical Heart Valve using an Implicit FSI Method

Authors: Tae-Hyub Hong, Choeng-Ryul Choi, Chang-Nyung Kim

Abstract:

Human heart valves diseased by congenital heart defects, rheumatic fever, bacterial infection, or cancer may develop stenosis or insufficiency. Treatment may be with medication but often involves valve repair or replacement (insertion of an artificial heart valve). Bileaflet mechanical heart valves (BMHVs) are widely implanted to replace diseased heart valves, but still suffer from complications such as hemolysis, platelet activation, tissue overgrowth and device failure. These complications are closely related to both the flow characteristics through the valves and the leaflet dynamics. In this study, the physiological flow interacting with the moving leaflets in a BMHV is simulated with a strongly coupled implicit fluid-structure interaction (FSI) method newly organized on the basis of the Arbitrary Lagrangian-Eulerian (ALE) approach and the dynamic mesh (remeshing) method of FLUENT. The simulated results are in good agreement with previous experimental studies. This study shows the applicability of the present FSI model to the complicated physics of interaction between fluid flow and a moving boundary.

Keywords: Bileaflet Mechanical Heart Valve, Fluid- Structure Interaction.

8377 Non-negative Principal Component Analysis for Face Recognition

Authors: Zhang Yan, Yu Bin

Abstract:

Principal component analysis is often combined with state-of-the-art classification algorithms to recognize human faces. However, principal component analysis can only capture features contributing to the global characteristics of the data, because it is a global feature selection algorithm: it misses features contributing to the local characteristics of the data, since each principal component only contains some level of the data's global characteristics. In this study, we present a novel face recognition approach using non-negative principal component analysis, in which a non-negativity constraint is added to improve data locality and help elucidate latent data structures. Experiments are performed on the Cambridge ORL face database. We demonstrate the strong performance of the algorithm in recognizing human faces in comparison with the PCA and NREMF approaches.
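
The sketch below is a simple projected power-iteration heuristic for a single non-negative principal component: an ordinary PCA power step followed by clipping to the non-negative orthant. It illustrates the effect of the non-negativity constraint (sparse, parts-like weights) but is not the authors' exact NPCA algorithm, and the data are random stand-ins for face images.

```python
# Projected power iteration: power step on the covariance matrix, then clip the
# direction to be non-negative and renormalise.
import numpy as np

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(200, 50)))        # stand-in for vectorised face images
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / len(Xc)                       # covariance matrix

w = np.abs(rng.normal(size=C.shape[0]))
w /= np.linalg.norm(w)
for _ in range(200):
    w = np.clip(C @ w, 0, None)               # power step + non-negativity projection
    w /= np.linalg.norm(w) + 1e-12

print("explained variance of the non-negative component:", float(w @ C @ w))
print("fraction of zero weights:", float((w == 0).mean()))
```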

Keywords: classification, face recognition, non-negative principal component analysis (NPCA)

8376 Concurrent Approach to Data Parallel Model using Java

Authors: Bala Dhandayuthapani Veerasamy

Abstract:

Parallel programming models exist as abstractions of hardware and memory architectures. Several parallel programming models are in common use: the shared memory model, the thread model, the message passing model, the data parallel model, the hybrid model, Flynn's models, the embarrassingly parallel computations model, and the pipelined computations model. These models are not specific to a particular type of machine or memory architecture. This paper presents a model program for a concurrent approach to the data parallel model using Java programming.
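
The paper's model program is written in Java; the sketch below is a Python analogue of the same data-parallel pattern, in which the data set is split into chunks and identical work is applied to each chunk concurrently.

```python
# Data-parallel pattern: same operation applied concurrently to disjoint chunks.
from concurrent.futures import ThreadPoolExecutor

data = list(range(1_000_000))
workers = 4
chunk = len(data) // workers
slices = [data[i * chunk:(i + 1) * chunk] for i in range(workers - 1)] + [data[(workers - 1) * chunk:]]

def square_all(part):
    return [x * x for x in part]

with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(square_all, slices))

squared = [y for part in results for y in part]
assert len(squared) == len(data)
```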

Keywords: Concurrent, Data Parallel, JDK, Parallel, Thread

8375 Adjusted Ratio and Regression Type Estimators for Estimation of Population Mean when some Observations are missing

Authors: Nuanpan Nangsue

Abstract:

Ratio and regression type estimators have been used by previous authors to estimate a population mean for the principal variable from samples in which both auxiliary x and principal y variable data are available. However, missing data are a common problem in statistical analyses with real data, and ratio and regression type estimators have also been used for imputing missing y values. In this paper, six new ratio and regression type estimators are proposed for imputing missing y values and estimating a population mean for y from samples with missing x and/or y data. A simulation study has been conducted to compare the six ratio and regression type estimators with a previous estimator of Rueda. Two population sizes, N = 1,000 and 5,000, were considered, with sampling fractions of 10% and 30% and with correlation coefficients between the population variables X and Y of 0.5 and 0.8. In the simulations, 10 and 40 percent of sample y values and 10 and 40 percent of sample x values were randomly designated as missing. The new ratio and regression type estimators give similar mean absolute percentage errors that are smaller than those of the Rueda estimator in all cases, with a large reduction in error for the case of 40% missing y values and a sampling fraction of 30%.
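
For orientation, the sketch below shows the textbook ratio and regression imputation forms on synthetic data; the paper's six proposed estimators are variations on these forms, and their exact formulas are not reproduced here.

```python
# Ratio imputation: y_hat = (ybar_r / xbar_r) * x_i
# Regression imputation: y_hat = ybar_r + b * (x_i - xbar_r), b from the respondents.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(10, 50, 200)
y = 2.0 * x + rng.normal(0, 5, 200)
missing = rng.random(200) < 0.3                  # 30% of y randomly missing

xr, yr = x[~missing], y[~missing]                # respondents
ratio_imputed = (yr.mean() / xr.mean()) * x[missing]
b = np.cov(xr, yr)[0, 1] / xr.var(ddof=1)
regression_imputed = yr.mean() + b * (x[missing] - xr.mean())

for name, imp in [("ratio", ratio_imputed), ("regression", regression_imputed)]:
    y_full = y.copy()
    y_full[missing] = imp
    print(f"{name:10s} estimate of the mean: {y_full.mean():.2f}  (true mean {y.mean():.2f})")
```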

Keywords: Auxiliary variable, missing data, ratio and regression type estimators.

8374 Lamb Wave Wireless Communication in Healthy Plates Using Coherent Demodulation

Authors: Rudy Bahouth, Farouk Benmeddour, Emmanuel Moulin, Jamal Assaad

Abstract:

Guided ultrasonic waves are used in non-destructive testing and structural health monitoring for inspection and damage detection. Recently, wireless data transmission using ultrasonic waves in solid metallic channels has gained popularity in industrial applications such as nuclear, aerospace and smart vehicles. The idea is to find a good substitute for electromagnetic waves, since these are highly attenuated near metallic components due to Faraday shielding. The proposed solution is to use ultrasonic guided waves such as Lamb waves as the information carrier, owing to their ability to propagate over long distances; in addition, valuable information about the health of the structure can be extracted simultaneously. In this work, the frequency bandwidth reliable for communication is first extracted experimentally from dispersion curves. An experimental platform for wireless communication using Lamb waves is then described and built, and a coherent demodulation algorithm used in telecommunications is tested for the Amplitude Shift Keying, On-Off Keying and Binary Phase Shift Keying modulation techniques. Signal processing parameters such as the threshold choice, the number of cycles per bit and the bit rate are optimized, and the experimental results are compared on the basis of the average bit error percentage. The results show high sensitivity to threshold selection for the Amplitude Shift Keying and On-Off Keying techniques, resulting in a decrease in bit rate, whereas Binary Phase Shift Keying shows the highest stability and data rate among all tested modulation techniques.
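
The sketch below shows coherent demodulation in its simplest form for BPSK: multiply the received signal by a phase-aligned reference carrier, integrate over each bit period and threshold. The carrier frequency, bit length and additive-noise channel are arbitrary placeholders rather than a model of a Lamb-wave plate.

```python
# Coherent BPSK demodulation: correlate with the reference carrier per bit, threshold at zero.
import numpy as np

fs, fc, cycles_per_bit = 1_000_000, 50_000, 20
samples_per_bit = int(fs / fc * cycles_per_bit)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])

t = np.arange(samples_per_bit * len(bits)) / fs
carrier = np.sin(2 * np.pi * fc * t)
tx = np.repeat(2 * bits - 1, samples_per_bit) * carrier        # BPSK: phase 0 or pi per bit
rx = tx + np.random.default_rng(3).normal(0, 0.5, tx.size)     # noisy channel

corr = (rx * carrier).reshape(len(bits), samples_per_bit).sum(axis=1)
decoded = (corr > 0).astype(int)
print("bit errors:", int(np.sum(decoded != bits)))
```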

Keywords: Lamb Wave Communication, wireless communication, coherent demodulation, bit error percentage.

8373 Creating 3D Models Using Infrared Thermography with Remotely Piloted Aerial Systems

Authors: P. van Tonder, C. C. Kruger

Abstract:

Concrete structures deteriorate over time, and degradation escalates due to various factors. The rate of deterioration can be complex and unpredictable in nature, and such deterioration may be located beneath the surface of the concrete at high elevations. This emphasizes the need for an efficient method of finding such defects in order to assess their severity. Current methods using thermography to find defects require equipment that can reach higher elevations, which can become costly and time-consuming, not to mention the risks involved in having personnel on scaffolding or abseiling at such heights. Accordingly, combining the thermal camera needed for thermography with a remotely piloted aerial system (drone/RPAS) could alleviate some of these issues. The images can be translated into a 3D temperature model to aid concrete diagnostics; with further research these models could be related back to the mechanical properties of the structure, but that is not dealt with in this paper. Such diagnostics include finding delamination, similar to finding delamination on concrete decks, which resides beneath the surface of the concrete before spalling can occur. Delamination can be caused by corroding reinforcement expanding beneath the concrete surface, which can lead to spalling, where concrete pieces break off from the main structure.

Keywords: Concrete, diagnostic, infrared thermography, 3D thermal models.

8372 Efficient Implementation of Serial and Parallel Support Vector Machine Training with a Multi-Parameter Kernel for Large-Scale Data Mining

Authors: Tatjana Eitrich, Bruno Lang

Abstract:

This work deals with aspects of support vector learning for large-scale data mining tasks. Based on a decomposition algorithm that can be run in serial and parallel mode, we introduce a data transformation that allows the use of an expensive generalized kernel without additional cost. In order to speed up the decomposition algorithm, we analyze the problem of working set selection for large data sets and the influence of the working set size on the scalability of the parallel decomposition scheme. Our modifications and settings improve support vector learning performance and thus allow extensive parameter search methods to be used to optimize classification accuracy.
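
One common form of a multi-parameter kernel is an RBF kernel with a separate width per feature; the sketch below plugs such a kernel into a standard SVM as a callable. The data and weights are synthetic, and the weights would normally come from the kind of parameter search the abstract mentions.

```python
# Per-feature weighted RBF kernel: K(x, z) = exp(-sum_d w_d * (x_d - z_d)^2).
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=400, n_features=6, random_state=0)
w = np.array([4.0, 4.0, 1.0, 1.0, 0.25, 0.25])     # one width parameter per feature

def weighted_rbf(A, B):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2 * w).sum(axis=-1)
    return np.exp(-d2)

clf = SVC(kernel=weighted_rbf, C=1.0).fit(X[:300], y[:300])
print("test accuracy:", clf.score(X[300:], y[300:]))
```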

Keywords: Support Vector Machines, Shared Memory Parallel Computing, Large Data

8371 Modeling of Flood Mitigation Structures for Sarawak River Sub-basin Using Info Works River Simulation (RS)

Authors: Rosmina Bustami, Charles Bong, Darrien Mah, Afnie Hamzah, Marina Patrick

Abstract:

The distressing floods that have occurred in recent years in the areas surrounding the Sarawak River have damaged property and indirectly disrupted productive activities. This study reconstructs a 100-year flood event that took place in this river basin. The Sarawak River sub-basin was chosen and modeled using a one-dimensional hydrodynamic approach in InfoWorks River Simulation (RS), in combination with a Geographical Information System (GIS). This produces the hydraulic response of the river and its floodplains under extreme flooding conditions. With different parameters introduced to the model, correlations between observed and simulated data are between 79% and 87%. Using the best calibrated model, flood mitigation structures were imposed along the sub-basin and analyzed on the basis of the simulation results. The results show that the proposed retention ponds constructed along the sub-basin provide the most efficient flood reduction, at 34.18%.

Keywords: Flood, Flood mitigation structure, InfoWorks RS, Retention pond, Sarawak River sub-basin.

8370 Identification of Conserved Domains and Motifs for GRF Gene Family

Authors: Jafar Ahmadi, Nafiseh Noormohammadi, Sedigheh Fabriki Ourang

Abstract:

GRF (growth-regulating factor) genes encode a novel class of plant-specific transcription factors. GRF proteins play a role in the regulation of cell numbers in young and growing tissues and may act as transcriptional activators in plant growth and development. Identifying GRF genes and their expression is important for understanding the growth and development of various plant organs. In this study, to better understand the structural and functional differences within the GRF family, 45 GRF protein sequences from A. thaliana, Z. mays, O. sativa, B. napus, B. rapa, H. vulgare and S. bicolor were collected and analyzed through bioinformatics data mining. In the secondary structure of the GRFs, alpha helices outnumbered beta sheets, and in all of them the QLQ domain lay entirely within the largest alpha helix. In all GRFs, the QLQ and WRC domains were completely conserved except in AtGRF9. These proteins have no transmembrane domain; owing to their nuclear localization signals they act in the nucleus, and they are classified as unstable proteins in the test tube.

Keywords: Domain, Gene Family, GRF, Motif.

8369 Software Test Data Generation using Ant Colony Optimization

Authors: Huaizhong Li, C.Peng Lam

Abstract:

State-based testing is frequently used in software testing, and test data generation is one of its key issues. A properly generated test suite may not only locate the errors in a software system but also help reduce the high cost associated with software testing. It is often desirable that test data, in the form of test sequences within a test suite, can be generated automatically to achieve the required test coverage. This paper proposes an Ant Colony Optimization approach to test data generation for state-based software testing.
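
A very small ACO sketch for this task is shown below: ants walk a toy state machine, transition choices are biased by pheromone, and sequences that cover more transitions deposit more pheromone. The state machine and all parameters are invented, and the coverage criterion is simplified to the number of distinct transitions exercised.

```python
# Simplified ant colony optimization for state-based test sequence generation.
import random

transitions = {            # state -> list of (event, next_state)
    "S0": [("a", "S1"), ("b", "S2")],
    "S1": [("c", "S2"), ("d", "S0")],
    "S2": [("e", "S0"), ("f", "S1")],
}
edges = [(s, ev, nxt) for s, outs in transitions.items() for ev, nxt in outs]
pheromone = {e: 1.0 for e in edges}
rho, Q, max_len = 0.1, 1.0, 12

def walk():
    state, path = "S0", []
    for _ in range(max_len):
        options = [(state, ev, nxt) for ev, nxt in transitions[state]]
        weights = [pheromone[e] for e in options]
        edge = random.choices(options, weights=weights)[0]   # pheromone-biased choice
        path.append(edge)
        state = edge[2]
    return path

best_path, best_cov = None, -1
for _ in range(100):                          # iterations
    ants = [walk() for _ in range(10)]
    for e in pheromone:                       # evaporation
        pheromone[e] *= (1 - rho)
    for path in ants:
        coverage = len(set(path))             # distinct transitions exercised
        for e in set(path):
            pheromone[e] += Q * coverage / len(edges)
        if coverage > best_cov:
            best_cov, best_path = coverage, path
    if best_cov == len(edges):
        break

print("covered", best_cov, "of", len(edges), "transitions")
print("test sequence:", [ev for _, ev, _ in best_path])
```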

Keywords: Software testing, ant colony optimization, UML.

8368 Natural Language News Generation from Big Data

Authors: Bastian Haarmann, Lukas Sikorski

Abstract:

In this paper, we introduce an NLG application for the automatic creation of ready-to-publish texts from big data. The resulting fully automatically generated news stories closely resemble the style in which a human writer would draw up such a story. Topics include soccer games, stock exchange market reports, and weather forecasts, and each generated text is unique. Ready-to-publish stories written by a computer application can help humans quickly grasp the outcomes of big data analyses, save journalists time-consuming pre-formulation, and cater to rather small audiences by offering stories that would otherwise not exist.
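
At its core, such a pipeline maps structured records to text; the toy sketch below shows a single template-based realisation step for an invented soccer result. Real systems add document planning, lexical variation and grammatical realisation on top.

```python
# Template-based realisation of one structured match record (all data invented).
match = {"home": "Riverside FC", "away": "Northgate United", "home_goals": 3,
         "away_goals": 1, "scorer": "Okafor", "minute": 12}

def verbalise(m):
    if m["home_goals"] > m["away_goals"]:
        outcome = f'{m["home"]} beat {m["away"]} {m["home_goals"]}-{m["away_goals"]}'
    elif m["home_goals"] < m["away_goals"]:
        outcome = f'{m["away"]} won {m["away_goals"]}-{m["home_goals"]} away at {m["home"]}'
    else:
        outcome = f'{m["home"]} and {m["away"]} drew {m["home_goals"]}-{m["away_goals"]}'
    return f'{outcome}. {m["scorer"]} opened the scoring in the {m["minute"]}th minute.'

print(verbalise(match))
```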

Keywords: Big data, natural language generation, publishing, robotic journalism.

8367 Yield Prediction Using Support Vectors Based Under-Sampling in Semiconductor Process

Authors: Sae-Rom Pak, Seung Hwan Park, Jeong Ho Cho, Daewoong An, Cheong-Sool Park, Jun Seok Kim, Jun-Geol Baek

Abstract:

It is important to predict yield in the semiconductor test process in order to increase it. In this study, yield prediction means effectively finding defective dies, wafers or lots. The semiconductor test process consists of several test steps, and each test includes various test items; in other words, the test data are large and complicated. They are also disproportionately distributed, as the number of samples belonging to the FAIL class is extremely low. For yield prediction, general data mining techniques are of limited use without data preprocessing because of these inherent properties of test data. Therefore, this study proposes an under-sampling method using a support vector machine (SVM) to eliminate the class imbalance. To evaluate performance, random under-sampling is compared with the proposed method on actual semiconductor test data. As a result, the SVM-based sampling method proves effective in generating a robust model for yield prediction.
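
The sketch below follows the idea described in the abstract, though not necessarily the paper's exact recipe: fit an SVM on the imbalanced data, keep only the majority-class samples that become support vectors (those near the decision boundary) together with all minority samples, and train the final model on the reduced set.

```python
# Support-vector-based under-sampling of the majority (PASS) class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, weights=[0.97, 0.03], random_state=0)

probe = SVC(kernel="rbf").fit(X, y)
sv_idx = probe.support_                                    # indices of support vectors
keep = np.union1d(sv_idx[y[sv_idx] == 0], np.where(y == 1)[0])
X_bal, y_bal = X[keep], y[keep]
print("majority kept:", int((y_bal == 0).sum()), "of", int((y == 0).sum()),
      "| minority kept:", int((y_bal == 1).sum()))

final = SVC(kernel="rbf", class_weight="balanced").fit(X_bal, y_bal)
```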

Keywords: Yield Prediction, Semiconductor Test Process, Support Vector Machine, Under Sampling

8366 Improved Closed Set Text-Independent Speaker Identification by Combining MFCC with Evidence from Flipped Filter Banks

Authors: Sandipan Chakroborty, Anindya Roy, Goutam Saha

Abstract:

A state-of-the-art speaker identification (SI) system requires a robust feature extraction unit followed by a speaker modeling scheme for a generalized representation of these features. Over the years, Mel-Frequency Cepstral Coefficients (MFCC), modeled on the human auditory system, have been used as a standard acoustic feature set for SI applications. However, due to the structure of its filter bank, MFCC captures vocal tract characteristics more effectively in the lower frequency regions. This paper proposes a new set of features using a complementary filter bank structure which improves the distinguishability of speaker-specific cues present in the higher frequency zone. Unlike high-level features that are difficult to extract, the proposed feature set involves little computational burden during extraction. When combined with MFCC via a parallel implementation of speaker models, the proposed feature set outperforms baseline MFCC significantly. This proposition is validated by experiments conducted on two different kinds of public databases, namely YOHO (microphone speech) and POLYCOST (telephone speech), with Gaussian Mixture Models (GMM) as the classifier for various model orders.

Keywords: Complementary Information, Filter Bank, GMM, IMFCC, MFCC, Speaker Identification, Speaker Recognition.

8365 A Ring-Shaped Tri-Axial Force Sensor for Minimally Invasive Surgery

Authors: Beibei Han, Yong-Jin Yoon, Muhammad Hamidullah, Angel Tsu-Hui Lin, Woo-Tae Park

Abstract:

This paper presents the design of a ring-shaped tri-axial force sensor that can be incorporated into the tip of a guidewire for use in minimally invasive surgery (MIS). The designed sensor comprises a ring-shaped structure located at the center of four cantilever beams. The ring design allows surgical tools to be passed through easily, which greatly simplifies the integration process. Silicon nanowires (SiNWs) are used as piezoresistive sensing elements embedded on the four cantilevers of the sensor to detect the resistance change caused by the applied load. An integration scheme with a newly designed guidewire tip structure having two coils at the distal end is presented. Finite element modeling was employed in the sensor design to find the maximum stress locations, so that the SiNWs could be placed in the high-stress regions for maximum output; a maximum applicable force of 5 mN was found from the modeling. The interaction mechanism between the designed sensor and a steel wire was also modeled by FEM, and a linear relationship between the load applied to the steel wire and the stress induced in the SiNWs was observed.

Keywords: Triaxial MEMS force sensor, Ring shape, Silicon Nanowire, Minimally invasive surgery.

8364 Modelling Silica Optical Fibre Reliability: A Software Application

Authors: I. Severin, M. Caramihai, R. El Abdi, M. Poulain, A. Avadanii

Abstract:

In order to assess optical fibre reliability under different environmental and stress conditions, series of tests are performed in which controlled chemical and mechanical factors are varied and overlapped. Each test series may be compared using statistical processing, i.e., Weibull plots. Given the large amount of data to process, a software application has proved useful for interpreting selected series of experiments as a function of the factors considered. The current paper presents a software application used for the storage, modelling and interpretation of experimental data gathered from optical fibre testing, and deals strictly with the software part of the project (the modelling, storage and processing of user-supplied data).
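
A typical Weibull treatment of fibre failure data looks like the sketch below: rank the failure stresses, assign median-rank probabilities and fit ln(-ln(1-F)) against ln(stress) to obtain the Weibull modulus and scale. The failure stresses used here are synthetic.

```python
# Weibull plot fit: for F = 1 - exp(-(s/s0)^m), ln(-ln(1-F)) = m*ln(s) - m*ln(s0).
import numpy as np

rng = np.random.default_rng(4)
stress = np.sort(rng.weibull(5.0, 40) * 4.8)               # GPa, synthetic failure stresses
n = len(stress)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)                # median-rank probability estimator

x = np.log(stress)
yv = np.log(-np.log(1 - F))
m, c = np.polyfit(x, yv, 1)                                # slope = Weibull modulus
print(f"Weibull modulus m = {m:.2f}, scale = {np.exp(-c / m):.2f} GPa")
```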

Keywords: Optical fibres, computer aided analysis, data models, data processing, graphical user interfaces.

8363 Molecular Dynamics Simulation of Liquid-Vapor Interface on the Solid Surface Using Gear's Algorithm

Authors: D. Toghraie, A. R. Azimian

Abstract:

In this paper, the Lennard-Jones potential is applied to molecules of liquid argon and its vapor, with platinum as the solid surface, in order to perform a non-equilibrium molecular dynamics simulation and study the microscopic aspects of liquid-vapor-solid interactions. The channel is periodic in the x and y directions and bounded by atomic walls along the z direction. It was found that the density of the liquid near the solid walls fluctuated greatly and that its structure was more like a solid than a liquid, which indicates that the interactions between solid and liquid molecules are very strong. The resulting surface tension, liquid density and vapor density are well predicted when compared with experimental data for argon. Liquid and vapor densities were found to depend on the cutoff radius, which motivated the use of the P3M (particle-particle particle-mesh) method, implemented for the evaluation of force and surface tension.
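
For reference, the Lennard-Jones pair potential and the corresponding force used in such simulations are shown below with approximate argon parameters; the cutoff treatment and P3M machinery of the actual simulation are not included.

```python
# U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6);  F(r) = -dU/dr.
import numpy as np

eps = 1.65e-21      # J, approximate well depth for argon
sigma = 3.4e-10     # m, approximate zero-crossing distance for argon

def lj_potential(r):
    sr6 = (sigma / r) ** 6
    return 4 * eps * (sr6 ** 2 - sr6)

def lj_force(r):
    sr6 = (sigma / r) ** 6
    return 24 * eps * (2 * sr6 ** 2 - sr6) / r   # positive = repulsive

for ri in np.linspace(3.0e-10, 10e-10, 5):
    print(f"r = {ri:.2e} m   U = {lj_potential(ri):.3e} J   F = {lj_force(ri):.3e} N")
```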

Keywords: Lennard-Jones Potential, Molecular Dynamics Simulation, Periodic Boundary Conditions (PBC), Non-Equilibrium Molecular Dynamics (NEMD).

8362 A Numerical Study of Seismic Response of Shallow Square Tunnels in Two-Layered Ground

Authors: Mahmoud Hassanlourad, Mehran Naghizadehrokni, Vahid Molaei

Abstract:

In this study, the seismic behavior of a shallow tunnel with a square cross section is investigated numerically in a two-layered, elastic, heterogeneous medium using the FLAC finite difference software. The behavioral model of the ground and the tunnel structure was assumed to be linear elastic. A dynamic load was applied to the model from the bottom for 0.2 seconds in the form of a square pulse with a maximum acceleration of 1 m/s². The interface between the two layers was considered at three different levels: the crest, the middle, and the bottom of the tunnel. The stiffness of the upper and lower layers was varied from 10 MPa to 1000 MPa. The deformation of the tunnel cross section due to dynamic load propagation, as well as the axial force and bending moment created in the tunnel structure, were examined for the three cases mentioned above. The results of the analyses show that the heterogeneity of the medium, its stratification, the position of the interface between the two layers with respect to the tunnel height, and the stiffness ratio of the two layers have significant effects on the bending moment, axial force, and distortion of the tunnel cross section.

Keywords: Dynamic analysis, shallow-buried tunnel, two-layered ground.

8361 Learner Awareness Levels Questionnaire: Development and Preliminary Validation of the English and Malay Versions to Measure How and Why Students Learn

Authors: S. Chee Choy, Pauline Swee Choo Goh, Yow Lin Liew

Abstract:

The purpose of this study is to evaluate the English version and a Malay translation of the 21-item Learner Awareness Questionnaire for its application in assessing student learning in higher education. The Learner Awareness Questionnaire, originally written in English, is a quantitative measure of how and why students learn. The questionnaire gives an indication of the process of and motives for learning using four scales: survival, establishing stability, approval and loving to learn. Data in the present study came from 680 university students enrolled in various programmes in Malaysia. The Malay version of the questionnaire showed a four-factor structure and internal consistency similar to the English version, and its four factors showed moderate to strong correlations with those of the English version. The results suggest that the Malay version of the questionnaire is comparable to the English version; however, further refinement of the questions is needed to strengthen the correlations between the two questionnaires.
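
The internal-consistency part of such a validation is typically a Cronbach's alpha per scale; the sketch below computes it for a synthetic block of Likert items and is not based on the study's data.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of the total score).
import numpy as np

rng = np.random.default_rng(5)
latent = rng.normal(size=(300, 1))
items = np.clip(np.round(3 + latent + rng.normal(0, 0.8, size=(300, 5))), 1, 5)

def cronbach_alpha(X):
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
```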

Keywords: Student learning, learner awareness, instrument validation.

8360 The Role of Synthetic Data in Aerial Object Detection

Authors: Ava Dodd, Jonathan Adams

Abstract:

The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured around developing the application for the purpose of deploying the computer vision model. The findings discuss the realities of attempting to develop a computer vision model for a practical purpose and detail the processes, tools and techniques that were used to meet accuracy requirements. The research reveals that synthetic data represent another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations is provided.

Keywords: computer vision, machine learning, synthetic data, YOLOv4

8359 Unsupervised Text Mining Approach to Early Warning System

Authors: Ichihan Tai, Bill Olson, Paul Blessner

Abstract:

Traditional early warning systems that warn of crises are generally based on structured or numerical data; therefore, a system that can make predictions based on unstructured textual data, an uncorrelated data source, is a great complement to them. The Chicago Board Options Exchange (CBOE) Volatility Index (VIX), commonly referred to as the fear index, measures the cost of insurance against a market crash and spikes in the event of a crisis. In this study, news data are used to predict whether there will be a market-wide crisis by predicting the movement of the fear index, and historical references to similar events are presented in an unsupervised manner. Topic-modeling-based prediction and representation are performed on daily news data from The Wall Street Journal between 1990 and 2015 against VIX index data from the CBOE.
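
A minimal version of the described pipeline is sketched below: topic proportions extracted from news text serve as features for predicting whether the fear index rises the next day. The corpus, labels and settings are tiny placeholders, not the Wall Street Journal/VIX data.

```python
# Topic proportions from news text -> classifier for next-day fear-index direction.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

docs = [
    "bank earnings beat expectations stocks rally",
    "credit default fears spread banks tumble",
    "central bank cuts rates markets calm",
    "liquidity crisis deepens volatility spikes",
    "strong jobs report lifts equities",
    "contagion fears hit financial sector hard",
]
vix_up_next_day = [0, 1, 0, 1, 0, 1]            # synthetic labels

tf = CountVectorizer().fit_transform(docs)
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(tf)
clf = LogisticRegression().fit(topics, vix_up_next_day)
print("in-sample accuracy:", clf.score(topics, vix_up_next_day))
```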

Keywords: Early Warning System, Knowledge Management, Topic Modeling, Market Prediction.
