Search results for: Ensemble Based Threshold Accepting
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11503

11143 High Performance In0.42Ga0.58As/In0.26Ga0.74As Vertical Cavity Surface Emitting Quantum Well Laser on In0.31Ga0.69As Ternary Substrate

Authors: Md. M. Biswas, Md. M. Hossain, Shaikh Nuruddin

Abstract:

This paper reports a theoretical performance analysis of the 1.3 μm In0.42Ga0.58As/In0.26Ga0.74As multiple quantum well (MQW) vertical cavity surface emitting laser (VCSEL) on a ternary In0.31Ga0.69As substrate. An output power of 2.2 mW is obtained at room temperature for a 7.5 mA injection current. The material gain is estimated to be ~3156 cm^-1 at room temperature for an injection carrier concentration of 2×10^17 cm^-3. The modulation bandwidth of this laser is found to be 9.34 GHz at room temperature for a biasing current 2 mA above the threshold value. The outcomes reveal that the proposed InGaAs-based MQW laser is promising for optical communication systems.

Keywords: Quantum well, VCSEL, output power, material gain, modulation bandwidth.

11142 Low Dimensional Representation of Dorsal Hand Vein Features Using Principal Component Analysis (PCA)

Authors: M. Heenaye-Mamode Khan, R. K. Subramanian, N. A. Mamode Khan

Abstract:

The quest to provide more secure identification systems has led to a rise in the development of biometric systems. The dorsal hand vein pattern is an emerging biometric that has lately attracted the attention of many researchers. Different approaches have been used to extract vein patterns and match them. In this work, Principal Component Analysis (PCA), a method that has been successfully applied to human faces and hand geometry, is applied to the dorsal hand vein pattern. PCA is used to obtain eigenveins, a low-dimensional representation of vein pattern features. Low-cost CCD cameras were used to obtain the vein images. The vein pattern was extracted by applying morphological operations, and noise reduction filters were applied to enhance the vein patterns. The system has been successfully tested on a database of 200 images using a threshold value of 0.9. The results obtained are encouraging.
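
As an illustration of the eigenvein idea, the sketch below (an assumption-laden reconstruction, not the authors' code) computes a PCA basis from flattened vein images with NumPy and matches two samples by thresholding the similarity of their projections at 0.9; the use of cosine similarity as the matching metric is an assumption, since the abstract does not name the metric.

```python
# Minimal eigenvein sketch: PCA over vectorized vein images (hypothetical data).
import numpy as np

def fit_eigenveins(X, k=20):
    """X: (n_images, n_pixels) matrix of flattened, preprocessed vein images."""
    mean = X.mean(axis=0)
    # Rows of Vt are the principal axes of the centered data: the "eigenveins"
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def project(x, mean, eigenveins):
    return eigenveins @ (x - mean)

def match(x, y, mean, eigenveins, threshold=0.9):
    """Accept a match when the similarity of the PCA features exceeds 0.9."""
    a, b = project(x, mean, eigenveins), project(y, mean, eigenveins)
    cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return cos >= threshold
```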

Keywords: Biometric, Dorsal vein pattern, PCA.

11141 Shot Transition Detection with Minimal Decoding of MPEG Video Streams

Authors: Mona A. Fouad, Fatma M. Bayoumi, Hoda M. Onsi, Mohamed G. Darwish

Abstract:

Digital libraries are becoming increasingly necessary to support users with powerful, easy-to-use tools for searching, browsing, and retrieving media information. The starting point for these tasks is the segmentation of video content into shots. To segment MPEG video streams into shots, this study develops a fully automatic procedure that detects both abrupt and gradual transitions (dissolves and fade groups) with minimal decoding, in real time. Each transition type is explored through two phases: analysis of macro-block types in B-frames, and on-demand intensity information analysis. The experimental results show remarkable performance in detecting gradual transitions for some kinds of input data and comparable results for the rest of the examined video streams. Almost all abrupt transitions could be detected with very few false positive alarms.

Keywords: Adaptive threshold, abrupt transitions, gradual transitions, MPEG video streams.

11140 The Study of Increasing Environmental Temperature on the Dynamical Behaviour of a Prey-Predator System: A Model

Authors: O. P. Misra, Preety Kalra

Abstract:

It is well recognized that greenhouse gases such as chlorofluorocarbons (CFCs), CH4, and CO2 are responsible, directly or indirectly, for the increase in the average global temperature of the Earth. CFCs cause depletion of the ozone concentration in the atmosphere, so that less of the heat carried by the sun's rays is absorbed there, raising the atmospheric temperature of the Earth; gases such as CH4 and CO2 likewise raise the atmospheric temperature. This increase in temperature directly or indirectly affects the dynamics of interacting species systems. Therefore, in this paper a mathematical model is proposed and analysed using stability theory to assess the effects of increasing temperature due to greenhouse gases on the survival or extinction of populations in a prey-predator system. A threshold value in terms of a stress parameter is obtained which determines the extinction or existence of populations in the underlying system.

Keywords: Equilibria, Greenhouse gases, Model, Populations, Stability.

11139 Effect of Welding Processes on Fatigue Properties of Ti-6Al-4V Alloy Joints

Authors: T. S. Balasubramanian, V. Balasubramanian, M. A. Muthumanikkam

Abstract:

This paper reports the fatigue crack growth behaviour of gas tungsten arc, electron beam, and laser beam welded Ti-6Al-4V titanium alloy. Centre-cracked tensile specimens were prepared to evaluate the fatigue crack growth behaviour. A 100 kN servo-hydraulic fatigue testing machine was used under constant-amplitude uniaxial tensile load (stress ratio of 0.1 and frequency of 10 Hz). Crack growth curves were plotted and the crack growth parameters (exponent and intercept) were evaluated. Critical and threshold stress intensity factor ranges were also evaluated. The fatigue crack growth behaviour of the welds was correlated with their mechanical properties and microstructural characteristics. Of the three joints, the joint fabricated by laser beam welding exhibited the highest fatigue crack growth resistance, due to the presence of a fine lamellar microstructure in the weld metal.
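
The crack growth parameters named above (intercept and exponent) are those of the Paris law, da/dN = C(ΔK)^m, which is linear on log-log axes. A minimal sketch of how they can be extracted from measured crack-growth data follows; the numbers are placeholders, not the paper's measurements.

```python
# Paris-law fit sketch: recover exponent m and intercept C from (dK, da/dN) data.
import numpy as np

delta_K = np.array([12.0, 15.0, 20.0, 26.0, 33.0])   # stress intensity range, MPa*sqrt(m)
da_dN = np.array([2e-8, 6e-8, 2.4e-7, 7e-7, 2e-6])   # crack growth rate, m/cycle

m, logC = np.polyfit(np.log10(delta_K), np.log10(da_dN), 1)  # slope, intercept
print(f"Paris exponent m = {m:.2f}, intercept C = {10**logC:.3e}")
```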

Keywords: Fatigue, Non-ferrous metals and alloys, welding.

11138 Over-Height Vehicle Detection in Low Headroom Roads Using Digital Video Processing

Authors: Vahid Khorramshahi, Alireza Behrad, Neeraj K. Kanhere

Abstract:

In this paper we present a new method for over-height vehicle detection in low-headroom streets and highways using digital video processing. Its accuracy, its lower price compared to existing detectors such as laser radars, and its capability of providing extra information such as speed and height measurements make this method more reliable and efficient. In this algorithm, features are selected and tracked using the KLT algorithm. A blob extraction algorithm is also applied, using background estimation and subtraction. The world coordinates of the features inside the blobs are then estimated using a novel calibration method. Once the heights of the features are calculated, we apply a threshold to select over-height features and eliminate the others. The over-height features are segmented using association criteria, grouped using an undirected graph, and then tracked through sequential frames. The obtained groups correspond to over-height vehicles in the scene.
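
The core of the pipeline can be sketched with OpenCV's KLT tracker, as below. The calibration that maps an image feature to a world height is site-specific, so `estimate_world_height` and the clearance value are hypothetical stand-ins.

```python
# Sketch: select and track KLT features, then keep only over-height ones.
import cv2
import numpy as np

MAX_HEIGHT_M = 4.2  # hypothetical clearance of the low-headroom road

def over_height_features(prev_gray, gray, estimate_world_height):
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return np.empty((0, 2))
    # Track the features into the current frame (KLT / pyramidal Lucas-Kanade)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    tracked = nxt[status.ravel() == 1].reshape(-1, 2)
    # Threshold on calibrated world height to isolate over-height features
    heights = np.array([estimate_world_height(p) for p in tracked])
    return tracked[heights > MAX_HEIGHT_M]
```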

Keywords: Feature extraction, over-height vehicle detection, traffic monitoring, vehicle tracking.

11137 An Improved Variable Tolerance RSM with a Proportion Threshold

Authors: Chen Wu, Youquan Xu, Dandan Li, Ronghua Yang, Lijuan Wang

Abstract:

In rough set models, the tolerance, similarity, and limited tolerance relations address different situations in incomplete information systems, where missing values occur. If two objects share only a few known attributes and have many unknown attributes, these relations cannot distinguish them well. To solve this problem, we previously presented two improved limited tolerance and variable precision rough set models, one symmetric and one non-symmetric. Both use a more stringent condition to separate objects that are equivalent only with small probability into different classes, and both call for further detailed study. In the present paper, we form object classes from a different perspective than in the first suggested model, and we overcome the non-symmetry drawback of the second suggested model. We discuss the relationships among several models and also perform rule generation. The results obtained by applying the second model are more accurate and reasonable.
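
A minimal sketch of the kind of proportion-thresholded relation involved follows; the definition below is our own simplification, not the paper's exact formulation. Two objects are related only when, among the attributes known for both, the fraction of matching values reaches a threshold beta.

```python
# Variable tolerance sketch for an incomplete information system ('*' = missing).
MISSING = "*"

def variable_tolerance(x, y, beta=0.8):
    known = [(a, b) for a, b in zip(x, y) if a != MISSING and b != MISSING]
    if not known:                       # no jointly known attributes
        return False
    return sum(a == b for a, b in known) / len(known) >= beta

# Two objects with few known and many unknown attributes:
u = ["1", "*", "*", "2"]
v = ["1", "*", "*", "3"]
print(variable_tolerance(u, v))         # False: only 1 of 2 known pairs match
```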

Keywords: Incomplete information system, rough set, symmetry, variable precision.

11136 Bifurcation Analysis of a Plankton Model with Discrete Delay

Authors: Anuj Kumar Sharma, Amit Sharma, Kulbhushan Agnihotri

Abstract:

In this paper, a delayed plankton-nutrient interaction model consisting of phytoplankton, zooplankton, and dissolved nutrient is considered. It is assumed that some species of phytoplankton release toxin (known as toxin-producing phytoplankton, TPP), which is harmful to zooplankton growth, and that this toxin-releasing process follows a discrete time variation. Using the delay as a bifurcation parameter, the stability of the interior equilibrium point is investigated, and it is shown that the time delay can destabilize the otherwise stable non-zero equilibrium state by inducing a Hopf bifurcation when it crosses a certain threshold value. Explicit results are derived for the stability and direction of the bifurcating periodic solution by using normal form theory and center manifold arguments. Finally, the outcomes of the system are validated through numerical simulations.
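
The delay-induced destabilization can be reproduced numerically with a fixed-step integration that keeps a history buffer for the delayed term. The equations and parameters below are an illustrative TPP-zooplankton caricature, not the paper's model.

```python
# Schematic delayed phytoplankton (P) / zooplankton (Z) simulation.
import numpy as np

r, K, a, b, d, theta = 1.0, 2.0, 0.7, 0.5, 0.2, 0.3   # illustrative parameters
tau, dt, T = 4.0, 0.01, 200.0                          # delay, step, horizon

n, lag = int(T / dt), int(tau / dt)
P = np.full(n, 0.8)                                    # constant history on [-tau, 0]
Z = np.full(n, 0.4)

for i in range(lag, n - 1):
    P_lag = P[i - lag]                                 # delayed toxin-release term
    dP = r * P[i] * (1 - P[i] / K) - a * P[i] * Z[i]
    dZ = b * a * P[i] * Z[i] - d * Z[i] - theta * P_lag * Z[i]
    P[i + 1] = P[i] + dt * dP
    Z[i + 1] = Z[i] + dt * dZ
# Above the Hopf threshold in tau, P and Z settle into sustained oscillations.
```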

Keywords: Plankton, Time delay, Hopf-bifurcation, Normal form theory, Center manifold theorem.

11135 A Real-Time Image Change Detection System

Authors: Madina Hamiane, Amina Khunji

Abstract:

Detecting changes in multiple images of the same scene has recently seen increased interest due to many contemporary applications, including smart security systems, smart homes, remote sensing, surveillance, medical diagnosis, weather forecasting, speed and distance measurement, and post-disaster forensics. These applications differ in the scale, nature, and speed of change. This paper presents an application of image processing techniques to implement a real-time change detection system. Change is identified by comparing the RGB representations of two consecutive frames captured in real time. The detection threshold can be controlled to account for various luminance levels. The comparison result is passed through a filter before decision making to reduce false positives, especially under lower luminance conditions. The system is implemented with a MATLAB graphical user interface with several controls to manage its operation and performance.
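
An OpenCV equivalent of the detection loop is sketched below (the paper's implementation is a MATLAB GUI; this only mirrors the thresholding-then-filtering logic, and the threshold value and camera index are placeholders).

```python
# Frame-differencing change detector: threshold RGB differences, then filter.
import cv2

THRESHOLD = 30          # tune for the ambient luminance level
cap = cv2.VideoCapture(0)
ok, prev = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    diff = cv2.absdiff(frame, prev).max(axis=2)         # strongest channel change
    mask = (diff > THRESHOLD).astype("uint8") * 255
    mask = cv2.medianBlur(mask, 5)                      # filter before deciding
    if cv2.countNonZero(mask) > 0.01 * mask.size:       # >1% of pixels changed
        print("change detected")
    prev = frame
cap.release()
```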

Keywords: Image change detection, Image processing, image filtering, thresholding, B/W quantization.

11134 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining

Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride

Abstract:

In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models were applied to predict one-year mortality, over a comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) was used, and the Model for End-Stage Liver Disease (MELD) prediction of mortality served as a comparator. All of our models statistically significantly outperform the MELD-score model, showing an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, due to the sparsity of the temporal information it needs; however, FSM together with ensemble learning further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
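
As a sketch of the ensemble-versus-MELD comparison (on synthetic stand-in data, not the study's patient records), an ensemble classifier can be scored by AUC against the raw MELD score used directly as a ranker:

```python
# Ensemble vs. baseline-score AUC comparison on synthetic placeholder data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
meld = rng.uniform(6, 40, size=2322)                       # placeholder MELD scores
X = np.column_stack([meld, rng.normal(size=(2322, 39))])   # MELD + noise features
y = (meld + rng.normal(scale=8, size=2322) > 30).astype(int)  # synthetic outcome

X_tr, X_te, y_tr, y_te, meld_tr, meld_te = train_test_split(
    X, y, meld, test_size=0.25, random_state=0)

model = GradientBoostingClassifier().fit(X_tr, y_tr)
auc_model = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
auc_meld = roc_auc_score(y_te, meld_te)                    # the score itself as a ranker
print(f"ensemble AUC {auc_model:.3f} vs MELD AUC {auc_meld:.3f}")
```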

Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning

11133 GPU-Based Volume Rendering for Medical Imagery

Authors: Hadjira Bentoumi, Pascal Gautron, Kadi Bouatouch

Abstract:

We present a method for fast volume rendering using graphics hardware (GPU). To our knowledge, it is the first GPU implementation of the Shear-Warp algorithm. Our GPU-based method provides real-time frame rates and outperforms the CPU-based implementation. When the number of slices is not sufficient, we add in-between slices computed by interpolation, which improves the quality of the rendered images. We have also implemented the ray marching algorithm on the GPU. The results generated by the three algorithms (CPU-based Shear-Warp, GPU-based Shear-Warp, and GPU-based ray marching) for two test models show that the ray marching algorithm outperforms the Shear-Warp methods in terms of speed-up and image quality.

Keywords: Volume rendering, graphics processors

11132 Long Wavelength Coherent Pulse of Sound Propagating in Granular Media

Authors: Rohit Kumar Shrivastava, Amalia Thomas, Nathalie Vriend, Stefan Luding

Abstract:

A mechanical wave or vibration propagating through granular media exhibits a specific signature in time: a coherent pulse or wavefront arrives first, with multiply scattered waves (the coda) arriving later. The coherent pulse is micro-structure independent, i.e., it depends only on the bulk properties of the disordered granular sample: the sound wave velocity, and hence the bulk and shear moduli. The coherent wavefront attenuates (decreases in amplitude) and broadens with distance from its source. These attenuation and broadening effects are influenced by disorder (polydispersity; contrast in the sizes of the granules) and have often been attributed to dispersion and scattering. To study the effect of disorder and of the initial amplitude (non-linearity) of the imparted pulse on the coherent wavefront, numerical simulations have been carried out on one-dimensional sets of particles (granular chains). The interaction force between the particles is given by a Hertzian contact model. The particle sizes are selected randomly from a Gaussian distribution, whose standard deviation is the relevant parameter quantifying the effect of disorder on the coherent wavefront. Since the coherent wavefront is independent of the system configuration, ensemble averaging is used to improve the signal quality of the coherent pulse and to remove the multiply scattered waves. The results concerning the width of the coherent wavefront are formulated in terms of scaling laws. An experimental set-up of photoelastic particles constituting a granular chain is proposed to validate the numerical results.
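
A condensed sketch of this kind of numerical experiment follows (illustrative parameters and units, not the authors' settings): a one-dimensional chain of grains with Hertzian contacts, polydispersity drawn from a Gaussian, and ensemble averaging over disorder realizations to isolate the coherent pulse.

```python
# 1-D granular chain with Hertzian contacts; ensemble-average the velocity field.
import numpy as np

N, steps, dt, kn = 200, 30000, 1e-7, 5e9   # grains, steps, time step, contact constant
rng = np.random.default_rng(0)

def run_chain(sigma):
    radii = 1e-3 * (1 + sigma * rng.standard_normal(N))    # Gaussian polydispersity
    mass = 2600.0 * (4 / 3) * np.pi * radii**3
    x = np.concatenate(([0.0], np.cumsum(radii[:-1] + radii[1:])))  # just touching
    v = np.zeros(N)
    v[0] = 0.1                                             # impart the pulse
    for _ in range(steps):
        overlap = np.maximum(0.0, (radii[:-1] + radii[1:]) - np.diff(x))
        f = kn * overlap**1.5                              # Hertzian contact law
        acc = (np.concatenate(([0.0], f)) - np.concatenate((f, [0.0]))) / mass
        v += acc * dt                                      # semi-implicit Euler
        x += v * dt
    return v

# Averaging over disorder realizations suppresses the coda, leaving the coherent pulse
coherent = np.mean([run_chain(sigma=0.1) for _ in range(20)], axis=0)
```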

Keywords: Discrete elements, Hertzian Contact, polydispersity, weakly nonlinear, wave propagation.

11131 A Hybrid Approach for Quantification of Novelty in Rule Discovery

Authors: Vasudha Bhatnagar, Ahmed Sultan Al-Hegami, Naveen Kumar

Abstract:

Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although one of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets. The experimental results are quite promising.

Keywords: Knowledge Discovery in Databases (KDD), Data Mining, Rule Discovery, Interestingness, Subjective Measures, Novelty Measure.

11130 New Gate Stack Double Diffusion MOSFET Design to Improve the Electrical Performances for Power Applications

Authors: Z. Dibi, F. Djeffal, N. Lakhdar

Abstract:

In this paper, we develop an explicit analytical drain current model, comprising the surface channel potential and the threshold voltage, to explain the advantages of the proposed Gate Stack Double Diffusion (GSDD) MOSFET design over a conventional MOSFET with the same geometric specifications. The incorporation of a high-k layer between the oxide layer and the gate metal improves the immunity of the proposed design against self-heating effects. To show the efficiency of the proposed structure, we simulate a power chopper circuit. Using the proposed structure to design a power chopper circuit shows that the GSDD MOSFET can improve the operation of the circuit in terms of power dissipation and immunity to self-heating. The results obtained are in close agreement with the 2D simulation results, confirming the validity of the proposed model.

Keywords: Double-Diffusion, modeling, MOSFET, power.

11129 Optimal Compensation of Reactive Power in the Restructured Distribution Network

Authors: Atefeh Pourshafie, Mohsen Saniei, S. S. Mortazavi, A. Saeedian

Abstract:

In this paper, the optimal capacitor placement problem is formulated for a restructured distribution network. In this scenario, the distribution network operator can treat reactive energy as a service that can be sold to the transmission system. A search for the optimal location, size, and number of capacitor banks has therefore been performed, with the objectives of loss reduction, maximum income from selling reactive energy to the transmission system, and return on investment for the capacitors. The results are influenced by the economic value of reactive energy, so the problem has been solved for various values of it. The implemented optimization technique is a genetic algorithm. For each economic value of reactive power, the threshold value for selling reactive power is obtained at the point where the return-on-investment index increases from zero or negative values to positive values. Increasing this economic parameter is reasonable as long as the network losses remain lower than the losses before compensation.

Keywords: capacitor placement, deregulated electric market, distribution network optimization.

11128 An Application of Extreme Value Theory as a Risk Measurement Approach in Frontier Markets

Authors: Dany Ng Cheong Vee, Preethee Nunkoo Gonpot, Noor-Ul-Hacq Sookia

Abstract:

In this paper, we consider the application of Extreme Value Theory (EVT) as a risk measurement tool. The Value at Risk (VaR) for a set of indices from six frontier-market stock exchanges is calculated using the Peaks-over-Threshold method, and the index-wise performance of the model is evaluated using coverage tests and loss functions. Our results show that "fat-tailedness" of the data alone is not enough to justify the use of EVT as a VaR approach; the structure of the returns dynamics is also a determining factor. The approach works well in markets which have experienced extremes in the past, making the model capable of coping with upcoming extremes (the Colombo, Tunisia, and Zagreb stock exchanges). On the other hand, we find that indices with lower past than present volatility fail to adequately deal with future extremes (Mauritius and Kazakhstan). We also conclude that using EVT alone produces rather static VaR figures that do not reflect the actual dynamics of the data.
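
The Peaks-over-Threshold calculation itself is compact: fit a Generalized Pareto Distribution to the losses exceeding a high threshold and invert the standard tail estimator for the desired quantile. The sketch below uses synthetic heavy-tailed returns as a placeholder series.

```python
# POT VaR sketch: GPD fit to threshold excesses, closed-form quantile inversion.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_t(df=3, size=5000)   # placeholder fat-tailed returns

losses = -returns                                  # work with the loss tail
u = np.quantile(losses, 0.95)                      # threshold at the 95th percentile
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)      # shape xi, scale beta

def var_pot(p):
    """VaR at confidence p: u + (beta/xi) * ((n/n_u * (1 - p))**(-xi) - 1)."""
    n, n_u = len(losses), len(excesses)
    return u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)

print(f"99% VaR: {var_pot(0.99):.4f}")
```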

Keywords: Extreme Value theory, Financial Crisis 2008, Frontier Markets, Value at Risk.

11127 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data

Authors: Benjamin D. Leiby, Darryl K. Ahner

Abstract:

This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country conflict dataset motivates the search to impute missing values well over a common threshold of 20% missingness. The methodology capitalizes on correlation while using model residuals to provide the uncertainty in estimating unknown values. Examination of the methodology provides insight toward choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit the residuals to fit each data element. The evaluation of the methodology includes computation time, model fit, and the comparison of known values to values replaced through imputation. Overall, the country conflict dataset shows promise with modeling first-order interactions, while presenting a need for further refinement that mimics predictive mean matching.
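
A minimal sketch of stochastic regression imputation in the spirit described above (column names and model form are hypothetical): predict each incomplete column from complete ones, then add a resampled model residual so that imputed values carry the estimation uncertainty rather than a fixed tolerance.

```python
# Stochastic regression imputation: prediction + resampled residual noise.
import numpy as np
from sklearn.linear_model import LinearRegression

def stochastic_impute(df, target, predictors, seed=0):
    """df: pandas DataFrame; target: incomplete column; predictors: complete columns."""
    rng = np.random.default_rng(seed)
    known = df[target].notna()
    model = LinearRegression().fit(df.loc[known, predictors], df.loc[known, target])
    residuals = (df.loc[known, target]
                 - model.predict(df.loc[known, predictors])).to_numpy()
    missing = ~known
    preds = model.predict(df.loc[missing, predictors])
    # The resampled residual injects the model's own fit uncertainty
    df.loc[missing, target] = preds + rng.choice(residuals, size=int(missing.sum()))
    return df
```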

Keywords: Correlation, country conflict, imputation, stochastic regression.

11126 Culture of Oleaginous Yeasts in Dairy Industry Wastewaters to Obtain Lipids Suitable for the Production of II-Generation Biodiesel

Authors: Domenico Pirozzi, Angelo Ausiello, Gaetano Zuccaro, Filomena Sannino, Abu Yousuf

Abstract:

The oleaginous yeast Lipomyces starkeyi was grown in the presence of dairy industry wastewaters (DIW). The yeast was able to degrade the organic components of the DIW and to produce a significant fraction of its biomass as triglycerides. When using DIW from Ricotta cheese production or residual whey as the growth medium, L. starkeyi could be cultured without dilution or external organic supplements. In contrast, the yeast could only partially degrade the DIW from Mozzarella cheese production, due to the accumulation of a metabolic product beyond the threshold of toxicity. In this case, dilution of the DIW was required to obtain more efficient degradation of the carbon compounds and a higher yield of oleaginous biomass. The fatty acid distribution of the microbial oils obtained showed a prevalence of oleic acid and is compatible with the production of a second-generation biodiesel offering good resistance to oxidation as well as excellent cold performance.

Keywords: Yeasts, Lipids, Biodiesel, Dairy industry wastewaters.

11125 Evaluation of Phthalates Contents and Their Health Effects in Consumed Sachet Water Brands in Delta State, Nigeria

Authors: Edjere Oghenekohwiroro, Asibor Irabor Godwin, Uwem Bassey

Abstract:

This paper determines the presence and levels of phthalates in sachet water and its borehole sources in parts of Delta State, Nigeria. Sachet and borehole water samples were collected from seven different water packaging facilities, and phthalate levels were determined using GC-MS instrumentation. Phthalate concentrations in borehole samples varied from 0.00-0.01 (DMP), 0.06-0.20 (DEP), 0.10-0.98 (DBP), 0.21-0.36 (BEHP), and 0.01-0.03 (DnOP) µg/L, while BBP was not detectable; in sachet water they varied from 0.03-0.95 (DMP), 0.16-12.45 (DEP), 0.57-3.38 (DBP), 0.00-0.03 (BBP), 0.08-0.31 (BEHP), and 0.00-0.03 (DnOP) µg/L. Phthalate concentrations in the sachet water were higher than in the corresponding borehole sources, with a significant difference (p < 0.05) between the two. The source of these phthalate esters is the interaction between the water and the plastic storage material. Although the concentrations of all phthalate esters analyzed were lower than the threshold limit value (TLV), storage of water in this medium over time can lead to a substantial increase, with negative effects on the individuals consuming it.

Keywords: Phthalate esters, borehole, sachet water, sample extraction, gas chromatography, GC-MS.

11124 The Hyperbolic Smoothing Approach for Automatic Calibration of Rainfall-Runoff Models

Authors: Adilson Elias Xavier, Otto Corrêa Rotunno Filho, Paulo Canedo de Magalhães

Abstract:

This paper addresses the issue of automatic parameter estimation in conceptual rainfall-runoff (CRR) models. Due to the threshold structures commonly occurring in CRR models, the associated mathematical optimization problems have the significant characteristic of being strongly non-differentiable. To face this enormous task, the proposed resolution method adopts a smoothing strategy using a special class of C∞ differentiable functions. The final estimate is obtained by solving a sequence of differentiable subproblems which gradually approach the original conceptual problem. The use of this technique, called the Hyperbolic Smoothing Method (HSM), makes possible the application of the most powerful minimization algorithms and allows the main difficulties presented by the original CRR problem to be overcome. A set of computational experiments is presented to illustrate both the reliability and the efficiency of the proposed approach.
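
The core of the HSM can be illustrated in a few lines: the non-differentiable threshold term max(0, y) is replaced by the C∞ function phi(y, tau) = (y + sqrt(y^2 + tau^2))/2, and a sequence of smooth subproblems is solved while tau shrinks toward zero. The objective below is a toy stand-in for a real calibration criterion, with synthetic data; it is a sketch of the smoothing idea, not the paper's CRR model.

```python
# Hyperbolic smoothing sketch: solve smooth subproblems with shrinking tau.
import numpy as np
from scipy.optimize import minimize

storage = np.linspace(0.0, 30.0, 50)                    # placeholder storage series
observed_flow = 0.8 * np.maximum(0.0, storage - 12.0)   # synthetic "observations"

def phi(y, tau):
    """Hyperbolic smoothing of max(0, y); recovers it exactly as tau -> 0."""
    return 0.5 * (y + np.sqrt(y * y + tau * tau))

def objective(params, tau):
    k, h0 = params
    sim = k * phi(storage - h0, tau)    # toy threshold-type runoff term
    return np.sum((sim - observed_flow) ** 2)

x = np.array([0.5, 10.0])               # initial guess for (k, h0)
for tau in (10.0, 1.0, 0.1, 0.01):      # gradually approach the original problem
    x = minimize(objective, x, args=(tau,), method="BFGS").x
print(x)                                 # approaches (0.8, 12.0)
```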

Keywords: Rainfall-runoff models, optimization procedure, automatic parameter calibration, hyperbolic smoothing method.

11123 Energy Efficient Clustering Algorithm with Global and Local Re-clustering for Wireless Sensor Networks

Authors: Ashanie Guanathillake, Kithsiri Samarasinghe

Abstract:

Wireless sensor networks consist of inexpensive, low-power sensor nodes deployed to monitor the environment and collect data. Gathering information in an energy-efficient manner is critical to prolonging the network lifetime, and clustering algorithms have the advantage of enhancing it. Current clustering algorithms usually focus on global re-clustering and local re-clustering separately. This paper proposes a combination of these two re-clustering methods to reduce the energy consumption of the network. The proposed algorithm applies to homogeneous as well as heterogeneous wireless sensor networks. In addition, cluster head rotation happens only when a cluster head's energy drops below a dynamic threshold value computed by the algorithm. The simulation results show that the proposed algorithm prolongs the network lifetime compared to existing algorithms.

Keywords: Energy efficient, Global re-clustering, Local re-clustering, Wireless sensor networks.

11122 Optimization of Car Seat Considering Whiplash Injury

Authors: Wookyung Baik, Seungchan Lee, Choongmin Jeong, Siwoo Kim, Myungwon Suh

Abstract:

The development of motor car safety devices has reduced fatality rates in car accidents. Yet despite this increase in car safety, neck injuries resulting from rear impact collisions, particularly at low speed, remain a primary concern. In this study, FEA (Finite Element Analysis) of a seat was performed to evaluate neck injuries in rear impact, and the FEA results were verified by comparison with actual test results. The dummy used in both the FE model and the actual tests is the BioRID II, which is regarded as suitable for rear impact collision analysis. A threshold for the BioRID II neck injury indicators was also proposed, to upgrade seat performance in order to reduce whiplash injury. To optimize the seat for a low-speed rear impact collision, a multi-objective optimization method using DOE (Design of Experiments) results was proposed.

Keywords: Whiplash injury, Dynamic assessment, Finite element method, Optimization, DOE (Design of Experiments), WSM (Weighted Sum Method).

11121 Influence of Measurement System on Negative Bias Temperature Instability Characterization: Fast BTI vs Conventional BTI vs Fast Wafer Level Reliability

Authors: Vincent King Soon Wong, Hong Seng Ng, Florinna Sim

Abstract:

Negative Bias Temperature Instability (NBTI) is one of the critical degradation mechanisms in semiconductor device reliability, causing a shift in the threshold voltage (Vth). However, a thorough understanding of this reliability failure mechanism is still out of reach due to a recovery characteristic known as NBTI recovery. This paper demonstrates the severity of NBTI recovery as well as one of the effective methods used to mitigate it: minimizing measurement system delays. A comparison was made between two measurement systems with significantly different measurement delays, to show how NBTI recovery causes deviations in the results and how fast measurement systems can mitigate it. Another method that minimizes NBTI recovery independently of the measurement system, known as Fast Wafer Level Reliability (FWLR) NBTI, was also performed to serve as a reference.

Keywords: Fast vs slow BTI, Fast wafer level reliability, Negative bias temperature instability, NBTI measurement system, metal-oxide-semiconductor field-effect transistor, MOSFET, NBTI recovery, reliability.

11120 A GA-Based Role Assignment Approach for Web-based Cooperative Learning Environments

Authors: Yi-Chun Chang, Jian-Wei Li

Abstract:

Web-based cooperative learning focuses on (1) the interaction and collaboration of community members, and (2) the sharing and distribution of knowledge and expertise by network technology to enhance learning performance. Numerous studies of web-based cooperative learning have demonstrated that cooperative scripts have a positive impact on specifying, sequencing, and assigning cooperative learning activities. The literature has also indicated that role-play in web-based cooperative learning environments enables two or more students to work together toward the completion of a common goal. Since students generally do not know each other and lack the face-to-face contact necessary to negotiate group roles in web-based cooperative learning environments, this paper extends the application of the genetic algorithm (GA) and proposes a GA-based algorithm to tackle the problem of role assignment in web-based cooperative learning environments, which not only saves communication costs but also reduces conflict between group members in negotiating role assignments.

Keywords: genetic algorithm (GA), role assignment, role-play, web-based cooperative learning.

11119 Potential Climate Change Impacts on the Hydrological System of the Harvey River Catchment

Authors: Hashim Isam Jameel Al-Safi, P. Ranjan Sarukkalige

Abstract:

Climate change is likely to impact the Australian continent by changing rainfall trends, increasing temperatures, and affecting the availability of water in both quantity and quality. This study investigates the possible impacts of future climate change on the hydrological system of the Harvey River catchment in Western Australia using a conceptual modelling approach (the HBV model). Daily observations of rainfall and temperature and long-term monthly mean potential evapotranspiration from six weather stations were available for the period 1961-2015. Observed streamflow data at the Clifton Park gauging station for 33 years (1983-2015), together with the observed climate variables, were used to run, calibrate, and validate the HBV model prior to the simulation process. The calibrated model was then forced with downscaled future climate signals from a multi-model ensemble of fifteen CMIP3 GCMs under three emission scenarios (A2, A1B, and B1) to simulate future runoff at the catchment outlet. Two periods were selected to represent future climate conditions: the middle (2046-2065) and the end (2080-2099) of the 21st century. A control run with the reference climate period (1981-2000) was used to represent the current climate. The modelling outcomes show an evident reduction in the mean annual streamflow during the middle of this century, particularly for the A1B scenario, relative to the control run. Toward the end of the century, all scenarios show relatively large reductions in the mean annual streamflow, especially the A1B scenario, compared to the control run. The decline in the mean annual streamflow ranges between 4-15% during the middle of the current century and 9-42% by its end.

Keywords: Climate change impact, Harvey catchment, HBV model, hydrological modelling, GCMs, LARS-WG, Australia.

11118 Evaluating Factors Affecting Audiologists’ Diagnostic Performance in Auditory Brainstem Response Reading: Training and Experience

Authors: M. Zaitoun, S. Cumming, A. Purcell

Abstract:

This study aims to determine whether audiologists' experience characteristics in ABR (Auditory Brainstem Response) reading are associated with their performance in interpreting ABR results. Fifteen ABR traces with varying degrees of hearing level were presented twice, making a total of 30. Audiologists were asked to determine the hearing threshold for each of the cases after completing a brief survey regarding their experience and training in ABR administration. Sixty-one audiologists completed all tasks. Correlations between the audiologists' performance measures and experience variables indicated significant associations (p < 0.05) between the training period in ABR testing and performance in terms of both sensitivity and accuracy. In addition, the number of years conducting ABR testing correlated with specificity. No other correlations approached significance. While there are relatively few significant correlations between ABR performance and experience, accuracy in ABR reading is associated with audiologists' length of experience and period of training. To improve audiologists' performance in reading ABR results, the importance of training should be emphasized, and standardized levels and periods of audiologist training in ABR testing should be set.

Keywords: ABR, audiology, performance, training, experience.

11117 Mathematical Model of Dengue Disease with the Incubation Period of Virus

Authors: P. Pongsumpun

Abstract:

Dengue virus is transmitted from person to person through the bite of infected Aedes aegypti mosquitoes. DEN-1, DEN-2, DEN-3, and DEN-4 are the four serotypes of this virus. Infection with one of these four serotypes apparently produces permanent immunity to it, but only temporary cross-immunity to the others. This study considers the incubation periods of dengue virus in both human and mosquito. Dengue patients are classified into infected and infectious classes: infectious humans can transmit dengue virus to susceptible mosquitoes, whereas infected humans cannot. A transmission model of the disease is formulated, in which the human population is divided into susceptible, infected, infectious, and recovered classes, and the mosquito population is separated into susceptible, infected, and infectious classes. Only infectious mosquitoes can transmit dengue virus to susceptible humans. We analyze this model using dynamical analysis methods and discuss the threshold condition for reducing outbreaks of the disease.
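
The threshold condition in such models is usually stated through the basic reproductive number R0: the infection dies out when R0 < 1 and can invade when R0 > 1. The sketch below is a schematic vector-borne R0 (simplified relative to the paper's full model, with placeholder parameter values), where the extrinsic incubation period enters through the mosquito's probability of surviving that period.

```python
# Schematic R0 for a vector-borne model with extrinsic incubation (illustrative).
import numpy as np

def R0(b, beta_h, beta_v, mu_h, mu_v, r, eip):
    """b: biting rate; beta_h/beta_v: transmission probabilities per bite;
    mu_h/mu_v: human/mosquito death rates; r: human recovery rate;
    eip: extrinsic incubation period (days)."""
    survive_eip = np.exp(-mu_v * eip)          # mosquitoes surviving incubation
    return (b**2 * beta_h * beta_v * survive_eip) / (mu_v * (r + mu_h))

r0 = R0(b=0.5, beta_h=0.4, beta_v=0.4, mu_h=1 / (65 * 365), mu_v=1 / 14,
        r=1 / 7, eip=10)
print("outbreak" if r0 > 1 else "dies out", f"(R0 = {r0:.2f})")
```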

Keywords: Transmission model, intrinsic incubation period, extrinsic incubation period, basic reproductive number, equilibrium states, local stability.

11116 Applying Multiple Kinect on the Development of a Rapid 3D Mannequin Scan Platform

Authors: Shih-Wen Hsiao, Yi-Cheng Tsao

Abstract:

In the fields of reverse engineering and the creative industries, applying a 3D scanning process to obtain the geometric form of objects is a mature and common technique. For instance, organic objects such as faces and non-organic objects such as products can be scanned to acquire geometric information for further application. However, although the data resolution of 3D scanning devices is increasing and complementary applications are ever more abundant, the penetration of 3D scanning among the public is still limited by the relatively high price of the devices. On the other hand, the Kinect, released by Microsoft, is known for its powerful functions, considerably lower price, and complete technology and database support. Related studies can therefore be carried out with the Kinect at acceptable cost and data precision. Because the Kinect uses an optical mechanism to extract depth information, it is limited by the straight path of the light: various angles are required sequentially to obtain complete 3D information of an object when a single Kinect is used for 3D scanning, and an integration process that combines the 3D data from the different angles by certain algorithms is also required. This sequential scanning costs much time, and the complex integration process often encounters technical problems. Therefore, this paper applies multiple Kinects simultaneously to develop a rapid 3D mannequin scan platform and proposes suggestions on the number and angles of the Kinects. A method of establishing the coordinate system based on the relation between the mannequin and the Kinect specifications is proposed, and suggested angles and numbers of Kinects are described. An experiment applying multiple Kinects to the scanning of a 3D mannequin was constructed with the Microsoft API, and the results show that the scanning time and the technical threshold can be reduced for the fashion and garment design industries.

Keywords: 3D scan, depth sensor, fashion and garment design, mannequin, multiple Kinect sensors.

11115 Negotiation Support for Value-based Decision in Construction

Authors: Christiono Utomo, Arazi Idrus, Isnanto, Annisa Nugraheni, Farida Rahmawati

Abstract:

Negotiation support is required in value-based decision making to enable each stakeholder to evaluate and rank the solution alternatives before engaging in negotiation with the other stakeholders. This study demonstrates a negotiation support model for the selection of a building system from a value-based design perspective, where value is based on a comparison of the function and cost of a building system. Multi-criteria decision techniques were applied to determine the relative value of the alternative solutions for performing the function. A satisficing-option game theory approach is applied to the value-based decision criteria, which are LCC (life cycle cost) and FAST-based function. The results demonstrate a negotiation process for selecting the priorities of a building system. The support model can be extended to automated negotiation by combining value-based decision methods, group decision making, and negotiation support.

Keywords: NSS, value-based decision, construction.

11114 Ultrasound Therapy: Amplitude Modulation Technique for Tissue Ablation by Acoustic Cavitation

Authors: Fares A. Mayia, Mahmoud A. Yamany, Mushabbab A. Asiri

Abstract:

In recent years, non-invasive Focused Ultrasound (FU) has been utilized to generate bubbles (cavities) that ablate target tissue by mechanical fractionation. Intensities >10 kW/cm2 are required to generate these inertial cavities. Their generation, rapid growth, and collapse fractionate the tissue, a process called Histotripsy. The ability to fractionate tissue from outside the body has many clinical applications, including the destruction of tumor masses. The fractionation process leaves a void at the treated site, where all the affected tissue is liquefied into sub-micron particles that are eventually absorbed by the body. Histotripsy is thus a promising non-invasive treatment modality. This paper presents a technique (patent pending) for generating inertial cavities at lower intensities (<1 kW/cm2). The technique is based on amplitude modulation (AM), whereby a low-frequency signal modulates the amplitude of a higher-frequency FU wave. The cavitation threshold is lower at low frequencies: the intensity required to generate cavitation in water at 10 kHz is two orders of magnitude lower than at 1 MHz. The amplitude modulation technique can operate in both continuous wave (CW) and pulsed wave (PW) modes, and the percentage modulation (modulation index) can be varied from 0% (thermal effect) to 100% (cavitation effect), thus allowing a range of ablating effects from hyperthermia to Histotripsy. Furthermore, changing the frequency of the modulating signal allows the size of the generated cavities to be controlled. Results from in vitro work demonstrate the efficacy of the new technique in fractionating soft tissue and solid calcium carbonate (chalk) material. The technique, when combined with MR or ultrasound imaging, will provide a precise treatment modality for ablating diseased tissue without affecting the surrounding healthy tissue.
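
The excitation itself is straightforward to construct: a low-frequency modulator scales the envelope of the focused-ultrasound carrier, and the modulation index m sweeps the effect from purely thermal (m = 0) toward the cavitation regime (m = 1). The frequencies, duration, and sample rate below are illustrative, not the authors' settings.

```python
# AM drive signal: s(t) = (1 + m*sin(2*pi*f_m*t)) * sin(2*pi*f_c*t)
import numpy as np

f_c, f_m = 1.0e6, 10.0e3            # carrier 1 MHz, modulator 10 kHz (illustrative)
m = 1.0                             # 100% modulation -> strongest cavitation effect
fs = 20e6                           # sample rate
t = np.arange(0.0, 1e-3, 1.0 / fs)  # 1 ms burst

s = (1 + m * np.sin(2 * np.pi * f_m * t)) * np.sin(2 * np.pi * f_c * t)
s /= np.max(np.abs(s))              # normalize the drive amplitude
```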

Keywords: Focused ultrasound therapy, Histotripsy, generation of inertial cavitation, mechanical tissue ablation.
