Search results for: Statistical language model N-grams

8426 Validation of SWAT Model for Prediction of Water Yield and Water Balance: Case Study of Upstream Catchment of Jebba Dam in Nigeria

Authors: Adeniyi G. Adeogun, Bolaji F. Sule, Adebayo W. Salami, Michael O. Daramola

Abstract:

Estimation of water yield and water balance in a river catchment is critical to the sustainable management of water resources at the watershed level in any country. Therefore, in the present study, the Soil and Water Assessment Tool (SWAT) interfaced with a Geographical Information System (GIS) was applied as a tool to predict the water balance and water yield of a catchment area in Nigeria. The catchment area, which covers 12,992 km2, is located upstream of the Jebba hydropower dam in the north-central part of Nigeria. In this study, data on the observed flow were collected and compared with the flow simulated using SWAT. The correlation between the two data sets was evaluated using statistical measures such as the Nash-Sutcliffe Efficiency (NSE) and the coefficient of determination (R2). The model output shows good agreement between the observed and simulated flows, as indicated by NSE and R2 values greater than 0.7 for both the calibration and validation periods. A total of 42,733 mm of water was predicted by the calibrated model as the water yield potential of the basin for the simulation period from 1985 to 2010. This performance suggests that SWAT could be a promising tool for predicting water balance and water yield in the sustainable management of water resources, and that it could be applied to other basins in Nigeria as a decision support tool for sustainable water management.
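
The goodness-of-fit measures cited above are straightforward to reproduce. A minimal sketch of how NSE and R2 could be computed for observed versus simulated flows is given below; the flow arrays are illustrative placeholders, not the study's data:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of the observations."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def coefficient_of_determination(observed, simulated):
    """R2 as the squared Pearson correlation between observed and simulated flows."""
    return np.corrcoef(observed, simulated)[0, 1] ** 2

# Illustrative monthly flows (m3/s); replace with gauged and SWAT-simulated series.
obs = np.array([120.0, 340.0, 510.0, 280.0, 90.0, 60.0])
sim = np.array([110.0, 360.0, 480.0, 300.0, 100.0, 55.0])

print("NSE =", round(nash_sutcliffe(obs, sim), 3))
print("R2  =", round(coefficient_of_determination(obs, sim), 3))
```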

Keywords: GIS, Modeling, Sensitivity Analysis, SWAT, Water Yield, Watershed level.

8425 On Dialogue Systems Based on Deep Learning

Authors: Yifan Fan, Xudong Luo, Pingping Lin

Abstract:

Nowadays, dialogue systems are increasingly becoming the way for humans to access many computer systems, allowing humans to interact with computers in natural language. A dialogue system consists of three parts: understanding what humans say in natural language, managing the dialogue, and generating responses in natural language. In this paper, we survey deep learning based methods for dialogue management, response generation, and dialogue evaluation. Specifically, these methods are based on neural networks, long short-term memory networks, deep reinforcement learning, pre-training, and generative adversarial networks. We compare these methods and point out further research directions.

Keywords: Dialogue management, response generation, reinforcement learning, deep learning, evaluation.

8424 Service Architecture for 3rd Party Operator's Participation

Authors: F. Sarabchi, A. H. Darvishan, H. Yeganeh, H. Ahmadian

Abstract:

Next generation networks, with the idea of converging the service and control layers of existing networks (fixed, mobile and data) and with the intention of providing services in an integrated network, have opened new horizons for telecom operators. On the other hand, economic pressures have caused operators to look for new sources of income, including new services, subscribing more users, promoting greater use of network resources, and easing the participation of service providers or 3rd party operators in utilizing networks. To meet this requirement, a service-layer architecture based on next generation network objectives is necessary. In this paper, a new architecture based on the IMS model is presented that explains the participation of 3rd party operators in the creation and implementation of services on an integrated telecom network.

Keywords: Service model, IMS, API, Scripting language, JAIN, Parlay.

8423 A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks

Authors: Naghmeh Moradpoor Sheykhkanloo

Abstract:

Thousands of organisations store important and confidential information related to them, their customers, and their business partners in databases all across the world. The stored data range from less sensitive (e.g. first name, last name, date of birth) to more sensitive data (e.g. password, PIN code, and credit card information). Losing data, disclosing confidential information or even changing the value of data are the severe damages that a Structured Query Language injection (SQLi) attack can cause on a given database. It is a code injection technique where malicious SQL statements are inserted into a given SQL database by simply using a web browser. In this paper, we propose an effective pattern recognition neural network model for the detection and classification of SQLi attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator, which generates thousands of malicious and benign URLs; a URL classifier, which 1) classifies each generated URL as either benign or malicious and 2) classifies the malicious URLs into different SQLi attack categories; and a NN model, which 1) detects whether a given URL is malicious or benign and 2) identifies the type of SQLi attack for each malicious URL. The model is first trained and then evaluated by employing thousands of benign and malicious URLs. The results of the experiments are presented in order to demonstrate the effectiveness of the proposed approach.
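
As a rough illustration of the detection stage only, the sketch below trains a small neural network on character n-gram features of labelled URLs using scikit-learn; the URLs, labels, and network size are hypothetical stand-ins for the paper's generated dataset and NN model:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical labelled URLs: 1 = malicious (SQLi), 0 = benign. A real run would use
# the thousands of generated URLs described in the abstract.
urls = [
    "http://site.com/item.php?id=7",
    "http://site.com/item.php?id=7' OR '1'='1",
    "http://site.com/login?user=admin'--",
    "http://site.com/search?q=books",
]
labels = [0, 1, 1, 0]

# Character n-grams capture SQLi tokens such as quotes, comment markers and OR clauses.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
)
model.fit(urls, labels)
print(model.predict(["http://site.com/item.php?id=7 UNION SELECT password FROM users"]))
```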

Keywords: Neural Networks, pattern recognition, SQL injection attacks, SQL injection attack classification, SQL injection attack detection.

8422 A Description Logics Based Approach for Building Multi-Viewpoints Ontologies

Authors: M. Hemam, M. Djezzar, T. Djouad

Abstract:

We are interested in the problem of building an ontology in a heterogeneous organization by taking into account the different viewpoints and different terminologies of the communities in the organization. Such an ontology, which we call a multi-viewpoint ontology, provides several partial descriptions of the same universe of discourse, each relative to a particular viewpoint. In addition, these partial descriptions share, at the global level, ontological elements that constitute a consensus between the various viewpoints. In order to address this problem, we define a multi-viewpoint knowledge model based on the notions of viewpoint and ontology. The multi-viewpoint knowledge model is then used to formalize the multi-viewpoint ontology in a description logics language.

Keywords: Description logic, knowledge engineering, ontology, viewpoint.

8421 Data Mining on the Router Logs for Statistical Application Classification

Authors: M. Rahmati, S.M. Mirzababaei

Abstract:

With the advance of information technology in the new era, the use of the Internet to access data resources has steadily increased, and huge amounts of data have become accessible in various forms. Obviously, network providers and agencies seek to prevent electronic attacks that may be harmful or may be related to terrorist applications. Thus, the authorities have undertaken a variety of methods to protect special regions from harmful data. One of the most important approaches is to use a firewall in the network facilities. The main objective of a firewall is to stop the transfer of suspicious packets in several ways. However, because of its blind packet blocking, high processing power requirements and expensive price, some providers are reluctant to use firewalls. In this paper we propose a method to find a discriminant function that distinguishes between usual packets and harmful ones by statistical processing of the network router logs. By discriminating these data, an administrator may take appropriate action against the user. This method is very fast and can be deployed simply adjacent to Internet routers.
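
A hedged sketch of the idea, fitting a simple linear discriminant (here logistic regression, one possible choice of discriminant function) to numeric features extracted from router log records; the feature names and values are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-flow features derived from router logs:
# [packet count, mean packet size (bytes), distinct destination ports]
X = np.array([
    [120,  900,  2],   # usual traffic
    [ 80, 1200,  1],
    [500,   60, 90],   # suspicious, scanning-like pattern
    [450,   70, 80],
])
y = np.array([0, 0, 1, 1])  # 0 = usual, 1 = harmful

clf = LogisticRegression().fit(X, y)
print("discriminant coefficients:", clf.coef_)
print("P(harmful) for a new flow:", clf.predict_proba([[300, 100, 50]])[0, 1])
```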

Keywords: Data Mining, Firewall, Optimization, Packet classification, Statistical Pattern Recognition.

8420 Text Retrieval Relevance Feedback Techniques for Bag of Words Model in CBIR

Authors: Nhu Van Nguyen, Jean-Marc Ogier, Salvatore Tabbone, Alain Boucher

Abstract:

The state-of-the-art Bag of Words model in Content-Based Image Retrieval has been used for years, but the relevance feedback strategies for this model have not been fully investigated. Being inspired by text retrieval, the Bag of Words model is able to draw on the wealth of knowledge and practice available in text retrieval. We study relevance feedback models from text retrieval and experiment with adapting them to image retrieval. The experiments show that the techniques from text retrieval give good results for image retrieval and that further improvements are possible.
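
One classical text-retrieval technique that transfers directly to a bag-of-visual-words setting is Rocchio relevance feedback. Below is a minimal, generic sketch (not the authors' exact formulation) that moves the query histogram toward images marked relevant and away from non-relevant ones; the vocabulary size and weights are illustrative:

```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Update a bag-of-words query vector from user feedback (Rocchio formula)."""
    q = alpha * np.asarray(query, float)
    if len(relevant):
        q += beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q -= gamma * np.mean(nonrelevant, axis=0)
    return np.clip(q, 0.0, None)   # keep visual-word weights non-negative

# Illustrative 5-word visual vocabulary.
query       = np.array([0.20, 0.00, 0.50, 0.10, 0.20])
relevant    = np.array([[0.30, 0.00, 0.60, 0.00, 0.10],
                        [0.25, 0.05, 0.55, 0.05, 0.10]])
nonrelevant = np.array([[0.00, 0.60, 0.10, 0.30, 0.00]])
print(rocchio(query, relevant, nonrelevant))
```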

Keywords: Relevance feedback, bag of words model, probabilistic model, vector space model, image retrieval

8419 A Simple Deterministic Model for the Spread of Leptospirosis in Thailand

Authors: W. Triampo, D. Baowan, I.M. Tang, N. Nuttavut, J. Wong-Ekkabut, G. Doungchawee

Abstract:

In this work, we consider a deterministic model for the transmission of leptospirosis, which is currently spreading in the Thai population. The SIR model, which incorporates the features of this disease, is applied to the epidemiological data in Thailand. It is seen that the numerical solutions of the SIR equations are in good agreement with the real empirical data. Further improvements are discussed.
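
A minimal sketch of the deterministic SIR equations integrated with SciPy; the parameter values and population size below are illustrative, not the Thai epidemiological estimates:

```python
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma, N):
    S, I, R = y
    dS = -beta * S * I / N              # new infections
    dI = beta * S * I / N - gamma * I   # infections minus recoveries
    dR = gamma * I                      # recoveries
    return [dS, dI, dR]

N = 1_000_000                           # illustrative population
beta, gamma = 0.4, 0.1                  # illustrative transmission / recovery rates
y0 = [N - 10, 10, 0]                    # start with 10 infectious individuals
t = np.linspace(0, 200, 201)            # days

S, I, R = odeint(sir, y0, t, args=(beta, gamma, N)).T
print("peak infectious:", int(I.max()), "on day", int(t[I.argmax()]))
```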

Keywords: Leptospirosis, SIR Model, Deterministic model, Thailand.

8418 IOT Based Process Model for Heart Monitoring Process

Authors: Dalyah Y. Al-Jamal, Maryam H. Eshtaiwi, Liyakathunisa Syed

Abstract:

Connecting health services with technology is in high demand as people's health conditions are worsening day by day. In fact, engaging new technologies such as the Internet of Things (IoT) in medical services can enhance patient care. Specifically, patients suffering from chronic diseases, such as cardiac patients, need special care and monitoring. In reality, some efforts were previously made to automate and improve patient monitoring systems. However, the previous efforts have some limitations and lack the real-time capability needed for chronic diseases. In this paper, an improved process model for a patient monitoring system specialized for cardiac patients is presented. A survey was distributed and interviews were conducted to gather the requirements needed to improve the cardiac patient monitoring system. The Business Process Model and Notation (BPMN) language was used to model the proposed process. The proposed system uses IoT technology to assist doctors to remotely monitor and follow up with their heart patients in real time. In order to validate the effectiveness of the proposed solution, simulation analysis was performed using the Bizagi Modeler tool. The analysis results show performance improvements in the heart monitoring process. For future work, the authors suggest enhancing the proposed system to cover all chronic diseases.

Keywords: Business process model and notation, cardiac patient, cardiac monitoring, heart monitoring, healthcare, internet of things, remote patient monitoring system, process model, telemedicine, wearable sensors.

8417 Quantitative Precipitation Forecast using MM5 and WRF models for Kelantan River Basin

Authors: Wardah, T., Kamil, A.A., Sahol Hamid, A.B., Maisarah, W.W.I

Abstract:

Quantitative precipitation forecasts (QPF) from atmospheric models, used as input to hydrological models in integrated hydro-meteorological flood forecasting systems, have been operational in many countries worldwide. High-resolution numerical weather prediction (NWP) models with grid cell sizes between 2 and 14 km have great potential to contribute towards reasonably accurate QPF. In this study, the potential of two NWP models to forecast precipitation for a flood-prone area in a tropical region is examined. The precipitation forecasts produced by the Fifth-Generation Penn State/NCAR Mesoscale Model (MM5) and the Weather Research and Forecasting (WRF) model are statistically verified against the observed rain in the Kelantan River Basin, Malaysia. The statistical verification indicates that the models performed quite satisfactorily for low and moderate rainfall but less satisfactorily for heavy rainfall.
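
A small sketch of the kind of statistical verification described, comparing forecast and observed rainfall with a few common continuous and categorical scores; the rainfall values and the heavy-rain threshold are placeholders, not the Kelantan data:

```python
import numpy as np

obs  = np.array([0.0, 5.0, 22.0, 60.0, 1.0, 12.0])   # observed rainfall (mm)
fcst = np.array([1.0, 8.0, 18.0, 55.0, 0.0, 15.0])   # model QPF (mm)

bias = np.mean(fcst - obs)
rmse = np.sqrt(np.mean((fcst - obs) ** 2))
corr = np.corrcoef(obs, fcst)[0, 1]

# Categorical verification for a "heavy rain" threshold (illustrative 50 mm).
thr = 50.0
hits         = np.sum((fcst >= thr) & (obs >= thr))
misses       = np.sum((fcst <  thr) & (obs >= thr))
false_alarms = np.sum((fcst >= thr) & (obs <  thr))
pod = hits / (hits + misses) if hits + misses else float("nan")           # probability of detection
far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")  # false alarm ratio

print(f"bias={bias:.2f} mm  rmse={rmse:.2f} mm  r={corr:.2f}  POD={pod:.2f}  FAR={far:.2f}")
```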

Keywords: MM5, Numerical weather prediction (NWP), quantitative precipitation forecast (QPF), WRF

8416 Development of Perez-Du Mortier Calibration Algorithm for Ground-Based Aerosol Optical Depth Measurement with Validation using SMARTS Model

Authors: Jedol Dayou, Jackson Hian Wui Chang, Rubena Yusoff, Ag. Sufiyan Abd. Hamid, Fauziah Sulaiman, Justin Sentian

Abstract:

Aerosols are small particles suspended in air that have widely varying spatial and temporal distributions. The concentration of aerosol in the total atmospheric column is normally measured using the aerosol optical depth (AOD). In long-term monitoring stations, accurate AOD retrieval is often difficult due to the lack of frequent calibration. To overcome this problem, a near-sea-level Langley calibration algorithm is developed using the combination of a clear-sky detection model and a statistical filter. It attempts to produce a dataset that consists of only homogeneous and stable atmospheric conditions for the Langley calibration. In this paper, a radiance-based validation is performed to further investigate the feasibility and consistency of the proposed algorithm at different locations, days, and times. The algorithm is validated against the direct normal irradiance (DNI) values given by the SMARTS model. The overall results confirm that the proposed calibration algorithm is feasible and consistent for measurements taken at different sites and weather conditions.
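
The core of a Langley calibration is a linear regression of the log sun-photometer signal against air mass under stable skies. A minimal sketch, using synthetic noise-free readings and a simplified plane-parallel air mass, is shown below:

```python
import numpy as np

# Synthetic morning readings: solar zenith angles (deg) and instrument signal V.
zenith = np.array([75.0, 70.0, 65.0, 60.0, 55.0, 50.0])
true_V0, true_tau = 1.50, 0.25
m = 1.0 / np.cos(np.radians(zenith))        # simple plane-parallel relative air mass
V = true_V0 * np.exp(-true_tau * m)         # Beer-Lambert law (noise omitted here)

# Langley plot: ln(V) = ln(V0) - tau * m, i.e. a straight line in air mass m.
slope, intercept = np.polyfit(m, np.log(V), 1)
V0_est, tau_est = np.exp(intercept), -slope
print(f"extrapolated top-of-atmosphere signal V0 = {V0_est:.3f}, total optical depth = {tau_est:.3f}")
```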

Keywords: Aerosol optical depth, direct normal irradiance, Langley calibration, radiance-based validation, SMARTS.

8415 Coloured Reconfigurable Nets for Code Mobility Modeling

Authors: Kahloul Laid, Chaoui Allaoua

Abstract:

Code mobility technologies attract more and more developers and consumers. Numerous domains are concerned, many platforms are developed and interesting applications are realized. However, developing good software products requires modeling, analyzing and proving steps. The choice of models and modeling languages is critical in these steps. Formal tools are powerful for the analyzing and proving steps. However, the weakness of classical modeling languages in modeling mobility requires the proposition of new models. The objective of this paper is to provide a specific formalism, "Coloured Reconfigurable Nets", and to show how it is adequate for modeling different kinds of code mobility.

Keywords: Code mobility, modelling mobility, labelled reconfigurable nets, Coloured reconfigurable nets, mobile code design paradigms.

8414 Global Kinetics of Direct Dimethyl Ether Synthesis Process from Syngas in Slurry Reactor over a Novel Cu-Zn-Al-Zr Slurry Catalyst

Authors: Zhen Chen, Haitao Zhang, Weiyong Ying, Dingye Fang

Abstract:

The direct synthesis process of dimethyl ether (DME) from syngas in slurry reactors is considered to be promising because of its advantages in heat transfer. In this paper, the influences of operating conditions (temperature, pressure and weight hourly space velocity) on the conversion of CO and the selectivity of DME and methanol were studied in a stirred autoclave over a Cu-Zn-Al-Zr slurry catalyst, which is far more suitable for the liquid-phase dimethyl ether synthesis process than the commercial bifunctional catalyst. A Langmuir-Hinshelwood mechanism type global kinetics model for the liquid-phase direct DME synthesis, based on methanol synthesis models and a methanol dehydration model, has been obtained by fitting our experimental data. The model parameters were estimated with a MATLAB program based on a general Genetic Algorithm and the Levenberg-Marquardt method; the model suitably fits the experimental data, and its reliability was verified by statistical tests and residual error analysis.
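
As a sketch of the parameter-estimation strategy (a global evolutionary search refined by Levenberg-Marquardt), the example below fits a deliberately simple placeholder rate law with SciPy; the rate expression, the data and the parameter bounds are illustrative and much simpler than the Langmuir-Hinshelwood model used in the paper:

```python
import numpy as np
from scipy.optimize import differential_evolution, least_squares

# Placeholder rate law r = k * pCO * pH2 / (1 + K * pCO); NOT the paper's kinetic model.
def rate(params, pCO, pH2):
    k, K = params
    return k * pCO * pH2 / (1.0 + K * pCO)

# Illustrative "experimental" data (partial pressures and observed rates).
pCO  = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
pH2  = np.array([1.0, 1.0, 2.0, 2.0, 3.0])
robs = np.array([0.30, 0.48, 1.10, 1.30, 2.10])

residuals = lambda p: rate(p, pCO, pH2) - robs

# Global search (genetic-algorithm-like), then local Levenberg-Marquardt refinement.
coarse = differential_evolution(lambda p: np.sum(residuals(p) ** 2),
                                bounds=[(0.01, 10.0), (0.0, 10.0)], seed=0)
refined = least_squares(residuals, coarse.x, method="lm")
print("estimated parameters (k, K):", refined.x)
```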

Keywords: alcohol/ether fuel, Cu-Zn-Al-Zr slurry catalyst, global kinetics, slurry reactor

8413 Coalescence of Insulin and Triglyceride/High Density Lipoprotein Cholesterol Ratio for the Derivation of a Laboratory Index to Predict Metabolic Syndrome in Morbid Obese Children

Authors: Orkide Donma, Mustafa M. Donma

Abstract:

Morbid obesity is a health-threatening condition, particularly in children. Generally, it leads to the development of metabolic syndrome (MetS), characterized by central obesity, elevated fasting blood glucose (FBG), triglyceride (TRG) and blood pressure values, and suppressed high density lipoprotein cholesterol (HDL-C) levels. However, some ambiguities exist in the diagnosis of MetS in children below 10 years of age. Therefore, clinicians are in need of surrogate markers for the laboratory assessment of pediatric MetS. In this study, the aim is to develop an index that will be more helpful in the evaluation of further risks detected in morbidly obese (MO) children. A total of 235 children with normal body mass index (N-BMI), with varying degrees of obesity (overweight (OW), obese (OB), MO) as well as with MetS participated in this study. The study was approved by the Institutional Ethical Committee. Informed consent forms were obtained from the parents of the children. Obesity states of the children were classified using BMI percentiles adjusted for age and sex. For this purpose, tabulated data prepared by WHO were used. MetS criteria were defined. Systolic and diastolic blood pressure values were measured. Parameters related to glucose and lipid metabolism were determined: FBG, insulin (INS), HDL-C and TRG concentrations. The Diagnostic Obesity Notation Model Assessment Laboratory (DONMALAB) Index [ln TRG/HDL-C*INS] was introduced. Commonly used insulin resistance (IR) indices such as the Homeostatic Model Assessment for IR (HOMA-IR), as well as ratios such as TRG/HDL-C, TRG/HDL-C*INS, HDL-C/TRG*INS, TRG/HDL-C*INS/FBG, and the log and ln versions of these ratios, were calculated. Results were interpreted using the statistical package SPSS (Version 16.0) for Windows. The data were evaluated using appropriate statistical tests. The level of statistical significance was defined as 0.05. 35 N-BMI, 20 OW, 47 OB and 97 MO children as well as 36 children with MetS were investigated. Mean ± SD values of TRG/HDL-C were 1.27 ± 0.69, 1.86 ± 1.08, 2.15 ± 1.22, 2.48 ± 2.35 and 4.61 ± 3.92 for the N-BMI, OW, OB, MO and MetS children, respectively. The corresponding values for the DONMALAB index were 2.17 ± 1.07, 3.01 ± 0.94, 3.41 ± 0.93, 3.43 ± 1.08 and 4.32 ± 1.00. The TRG/HDL-C ratio differed significantly between the N-BMI and MetS groups. On the other hand, the DONMALAB index exhibited statistically significant differences between the N-BMI group and all other groups except the OW group. This index was capable of discriminating MO children from those with MetS; statistically significant elevations were detected in MO children with MetS (p < 0.05). Multiple parameters are commonly used in the assessment of MetS. Upon evaluation of the values obtained for the N-BMI, OW, OB and MO groups and for MO children with MetS, the [ln TRG/HDL-C*INS] value was unique in discriminating children with MetS.
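
The proposed index is a simple transformation of routine laboratory values. A minimal sketch of its computation for a few hypothetical children (laboratory values invented for illustration only):

```python
import numpy as np

def donmalab_index(trg, hdl_c, insulin):
    """ln(TRG / HDL-C * INS), the index proposed in the abstract."""
    return np.log(trg / hdl_c * insulin)

# Hypothetical laboratory values: TRG (mg/dL), HDL-C (mg/dL), insulin (uIU/mL).
trg     = np.array([ 80.0, 140.0, 210.0])
hdl_c   = np.array([ 55.0,  45.0,  35.0])
insulin = np.array([  6.0,  15.0,  25.0])

print(np.round(donmalab_index(trg, hdl_c, insulin), 2))
```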

Keywords: Children, index, laboratory, metabolic syndrome, obesity.

8412 Controlling of Load Elevators by the Fuzzy Logic Method

Authors: Ismail Saritas, Abdullah Adiyaman

Abstract:

In this study, a fuzzy-logic based control system was designed to ensure that time and energy are saved during the operation of the load elevators used during the construction of tall buildings. In the control system that was devised, so that the load elevators work more efficiently, the energy interval in which the motor operates was taken as the output variable, whereas the amount of load and the building height were taken as input variables. The most appropriate working intervals, depending on the characteristics of these variables, were defined with the help of an expert. The fuzzy expert system software was developed using the Delphi programming language. In this design, the Mamdani max-min inference mechanism was used and the centroid method was employed in the defuzzification procedure. In conclusion, it is observed that the system that was designed is feasible, and this is supported by statistical analyses.
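
A compact, generic sketch of Mamdani max-min inference with centroid defuzzification, written in plain Python with NumPy rather than Delphi; the membership functions, the two inputs (load, building height) and the two-rule base are invented placeholders, not the expert-defined intervals of the study:

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Universe of the output variable (illustrative motor energy level, %).
energy = np.linspace(0, 100, 501)
energy_low  = trimf(energy, 0, 25, 50)
energy_high = trimf(energy, 50, 75, 100)

def controller(load_kg, height_m):
    # Input membership degrees (illustrative ranges).
    load_small    = trimf(load_kg, 0, 250, 1000)
    load_large    = trimf(load_kg, 500, 1500, 2000)
    low_building  = trimf(height_m, 0, 15, 60)
    tall_building = trimf(height_m, 30, 120, 200)

    # Two Mamdani rules with min (AND) implication:
    #   IF load is small AND building is low  THEN energy is low
    #   IF load is large AND building is tall THEN energy is high
    r1 = min(load_small, low_building)
    r2 = min(load_large, tall_building)

    # Max aggregation of the clipped consequents, then centroid defuzzification.
    aggregated = np.maximum(np.minimum(r1, energy_low), np.minimum(r2, energy_high))
    return np.sum(energy * aggregated) / (np.sum(aggregated) + 1e-12)

print(f"suggested energy level: {controller(load_kg=1200, height_m=90):.1f} %")
```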

Keywords: Fuzzy Logic Control, DC Motor, Load Elevators, Power Control.

8411 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

It can frequently be observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood (order 1 and order 2) (MQL1, MQL2) and penalized quasi-likelihood (order 1 and order 2) (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model.

Keywords: Goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, type-I error, penalized quasi-likelihood, power, quasi-likelihood.

8410 CART Method for Modeling the Output Power of Copper Bromide Laser

Authors: Iliycho P. Iliev, Desislava S. Voynikova, Snezhana G. Gocheva-Ilieva

Abstract:

This paper examines the available experimental data for a copper bromide vapor laser (CuBr laser) emitting at two wavelengths, 510.6 and 578.2 nm. The laser output power is estimated based on 10 independent input physical parameters. A classification and regression tree (CART) model is obtained which describes 97% of the data. The resulting binary CART tree specifies which input parameters considerably influence each of the classification groups. This allows for a technical assessment indicating which of these parameters are the most significant for the manufacture and operation of the type of laser under consideration. The predicted values of the laser output power are also obtained for each classification group. This aids the design and development processes considerably.
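
A hedged sketch of fitting a CART regression tree to laser operating parameters with scikit-learn; the two input features, their values and the output powers are invented placeholders for the ten experimental parameters used in the paper:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical operating parameters: [input electrical power (kW), neon pressure (Torr)].
X = np.array([[1.0, 15], [1.5, 15], [2.0, 20], [2.5, 20], [3.0, 25], [3.5, 25]])
y = np.array([20.0, 35.0, 55.0, 70.0, 95.0, 110.0])   # hypothetical output power (W)

tree = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["P_in_kW", "p_Ne_Torr"]))  # the binary tree structure
print("predicted output power:", tree.predict([[2.2, 20]]))
```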

Keywords: Classification and regression trees (CART), Copper Bromide laser (CuBr laser), laser generation, nonparametric statistical model.

8409 Application the Statistical Conditional Entropy Function for Definition of Cause-and-Effect Relations during Primary Soil Formation

Authors: Vladimir K. Mukhomorov

Abstract:

Within the framework of information theory, a statistical and probabilistic model is offered for the definition of cause-and-effect relations in coupled multicomponent subsystems. A quantitative parameter, defined through conditional and unconditional entropy functions, is introduced. The method is applied to the analysis of experimental data on the dynamics of change in the chemical element composition of plant organs (roots, reproductive organs, leaves and stems). The experiment is directed at studying the temporal processes of primary soil formation and their connection with the dynamics of redistribution of chemical elements in plant organs. This statistical and probabilistic model also allows the directions of the information streams between plant organs to be specified quantitatively and unambiguously.
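
A small sketch of the statistical conditional entropy H(Y|X) computed from a joint frequency table; the 3x3 contingency counts are illustrative, not the plant-composition data:

```python
import numpy as np

def conditional_entropy(joint_counts):
    """H(Y|X) = H(X,Y) - H(X) in bits, computed from a contingency table of counts (rows = X)."""
    p = joint_counts / joint_counts.sum()
    px = p.sum(axis=1)
    h_xy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    h_x = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    return h_xy - h_x

# Illustrative joint counts of element-content classes in two coupled subsystems.
counts = np.array([[30,  5,  1],
                   [ 4, 25,  6],
                   [ 1,  7, 21]], dtype=float)
print("H(Y|X) =", round(conditional_entropy(counts), 3), "bits")
```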

Keywords: Chemical elements, entropy function, information, plants.

8408 Monitoring Blood Pressure Using Regression Techniques

Authors: Qasem Qananwah, Ahmad Dagamseh, Hiam AlQuran, Khalid Shaker Ibrahim

Abstract:

Blood pressure gives physicians deep insight into the cardiovascular system. The determination of individual blood pressure is a standard clinical procedure for assessing cardiovascular system problems. The conventional techniques to measure blood pressure (e.g. the cuff method) allow only a limited number of readings for a certain period (e.g. every 5-10 minutes). Additionally, these systems cause turbulence in the blood flow, impeding continuous blood pressure monitoring, especially in emergency cases or critically ill persons. In this paper, the most important statistical features of the photoplethysmogram (PPG) signal were extracted to estimate blood pressure noninvasively. PPG signals from more than 40 subjects were measured and analyzed, and 12 features were extracted. The features were fed to principal component analysis (PCA) to find the most important independent features that have the highest correlation with blood pressure. The results show that the mean of the stiffness index and the standard deviation of the beat-to-beat heart rate were the most important features. A model representing both features for Systolic Blood Pressure (SBP) and Diastolic Blood Pressure (DBP) was obtained using a statistical regression technique. Surface fitting is used to best fit the series of data, and the results show that the error value in estimating the SBP is 4.95% and in estimating the DBP is 3.99%.
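
A rough sketch of the PCA-plus-regression pipeline described, using scikit-learn on synthetic PPG-derived features; the feature matrix, the number of retained components and the SBP targets are all illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 12))                       # 40 subjects x 12 synthetic PPG features
sbp = 110 + 8 * X[:, 0] - 5 * X[:, 3] + rng.normal(scale=3, size=40)   # synthetic SBP (mmHg)

model = make_pipeline(PCA(n_components=3), LinearRegression()).fit(X, sbp)
pred = model.predict(X)
mape = np.mean(np.abs(pred - sbp) / sbp) * 100      # percentage error, as reported in the abstract
print(f"mean absolute percentage error: {mape:.2f}%")
```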

Keywords: Blood pressure, noninvasive optical system, PCA, continuous monitoring.

8407 Analysis of Event-related Response in Human Visual Cortex with fMRI

Authors: Ayesha Zaman, Tanvir Atahary, Shahida Rafiq

Abstract:

Functional Magnetic Resonance Imaging (fMRI) is a noninvasive imaging technique that measures the hemodynamic response related to neural activity in the human brain. Event-related functional magnetic resonance imaging (efMRI) is a form of fMRI in which a series of fMRI images are time-locked to a stimulus presentation and averaged together over many trials. An event-related potential (ERP) is a measured brain response that is directly the result of a thought or perception. Here, the neuronal response of the human visual cortex in normal healthy patients has been studied. The patients were asked to perform a visual three-choice reaction task; from the relative response of each patient, the corresponding neuronal activity in the visual cortex was imaged. The average number of neurons in the adult human primary visual cortex, in each hemisphere, has been estimated at around 140 million. Statistical analysis of this experiment was done with the SPM5 (Statistical Parametric Mapping version 5) software. The result shows a robust design for imaging the neuronal activity of the human visual cortex.
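
A highly simplified sketch of the event-related GLM idea underlying SPM: regress each voxel's time series on a stimulus regressor (here a crude sustained response rather than a proper HRF convolution) using ordinary least squares; all signals below are synthetic:

```python
import numpy as np

n_scans = 120
stimulus = np.zeros(n_scans)
stimulus[10::30] = 1.0                      # synthetic event onsets every 30 scans
regressor = np.convolve(stimulus, np.ones(5), mode="full")[:n_scans]   # crude sustained response

rng = np.random.default_rng(1)
voxel = 2.0 * regressor + rng.normal(scale=1.0, size=n_scans)          # synthetic voxel signal

# GLM: y = X @ beta + error, with a constant column modelling the baseline.
X = np.column_stack([regressor, np.ones(n_scans)])
beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
print("estimated activation (beta for the stimulus regressor):", round(beta[0], 2))
```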

Keywords: Echo Planar Imaging, Event related Response, General Linear Model, Visual Neuronal Response.

8406 Evaluating the Capability of the Flux-Limiter Schemes in Capturing the Turbulence Structures in a Fully Developed Channel Flow

Authors: Mohamed Elghorab, Vendra C. Madhav Rao, Jennifer X. Wen

Abstract:

Turbulence modelling is still evolving, and efforts are ongoing to improve and develop numerical methods to simulate the real turbulence structures by using empirical and experimental information. The monotonically integrated large eddy simulation (MILES) is an attractive approach for modelling turbulence in high Re flows, which is based on solving the unfiltered flow equations with no explicit sub-grid scale (SGS) model. In the current work, this approach has been used, and the action of the SGS model has been included implicitly by the intrinsic nonlinear high-frequency filters built into the convection discretization schemes. The MILES solver is developed using the open-source CFD OpenFOAM libraries. The role of the flux limiter schemes, namely Gamma, superBee, van-Albada and van-Leer, is studied in predicting turbulent statistical quantities for a fully developed channel flow with a friction Reynolds number Reτ = 180, and the numerical predictions are compared with the well-established Direct Numerical Simulation (DNS) results for studying the wall-generated turbulence. It is inferred from the numerical predictions that the Gamma, van-Leer and van-Albada limiters produce more diffusion and overpredict the velocity profiles, while the superBee scheme reproduces the velocity profiles and turbulence statistical quantities in good agreement with the reference DNS data in the streamwise direction, although it deviates slightly in the spanwise and wall-normal directions. The simulation results are further discussed in terms of the turbulence intensities and Reynolds stresses averaged in time and space to draw conclusions on the performance of the flux limiter schemes in the OpenFOAM context.
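
For reference, most of the limiters compared here have simple closed forms in terms of the ratio r of consecutive solution gradients; a sketch of the standard superbee, van Leer and van Albada limiter functions is given below (the Gamma scheme is NVD-based and is omitted, and the van Albada form is clipped to zero for non-positive r, as in its TVD variant):

```python
import numpy as np

def superbee(r):
    # psi(r) = max(0, min(2r, 1), min(r, 2))
    return np.maximum.reduce([np.zeros_like(r), np.minimum(2 * r, 1.0), np.minimum(r, 2.0)])

def van_leer(r):
    # psi(r) = (r + |r|) / (1 + |r|)
    return (r + np.abs(r)) / (1.0 + np.abs(r))

def van_albada(r):
    # psi(r) = (r^2 + r) / (r^2 + 1), clipped to 0 for r <= 0 (TVD form)
    return np.where(r > 0, (r ** 2 + r) / (r ** 2 + 1.0), 0.0)

r = np.array([-0.5, 0.0, 0.5, 1.0, 2.0, 4.0])   # ratio of successive gradients
for name, limiter in [("superbee", superbee), ("vanLeer", van_leer), ("vanAlbada", van_albada)]:
    print(name, np.round(limiter(r), 3))
```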

Keywords: Flux limiters, MILES, OpenFOAM, turbulence structures, TVD schemes.

8405 Combining ASTER Thermal Data and Spatial-Based Insolation Model for Identification of Geothermal Active Areas

Authors: Khalid Hussein, Waleed Abdalati, Pakorn Petchprayoon, Khaula Alkaabi

Abstract:

In this study, we integrated ASTER thermal data with an area-based spatial insolation model to identify and delineate geothermally active areas in Yellowstone National Park (YNP). Two pairs of L1B ASTER day- and nighttime scenes were used to calculate land surface temperature. We employed the Emissivity Normalization Algorithm, which separates temperature from emissivity, to calculate surface temperature. We calculated the incoming solar radiation for the area covered by each of the four ASTER scenes using an insolation model and used this information to compute the temperature due to solar radiation. We then identified the statistical thermal anomalies using land surface temperature and the residuals calculated from the modeled temperatures and the ASTER-derived surface temperatures. Areas that had temperatures or temperature residuals greater than 2σ, or between 1σ and 2σ, were considered ASTER-modeled thermal anomalies. The areas identified as thermal anomalies were in strong agreement with the thermal areas obtained from the YNP GIS database. Also, the YNP hot springs and geysers were located within the areas identified as anomalous thermal areas. The consistency between our results and known geothermally active areas indicates that thermal remote sensing data, integrated with a spatial-based insolation model, provide an effective means for identifying and locating areas of geothermal activity over large areas and rough terrain.
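
The anomaly-flagging step can be illustrated with a few lines of array arithmetic: compute residuals between satellite-derived and insolation-modelled temperatures and threshold them at 1σ and 2σ. The grid values below are synthetic, not ASTER data:

```python
import numpy as np

rng = np.random.default_rng(2)
lst_aster   = 270.0 + rng.normal(scale=2.0, size=(100, 100))   # synthetic ASTER-derived LST (K)
lst_modeled = 270.0 + rng.normal(scale=0.5, size=(100, 100))   # synthetic insolation-model LST (K)
lst_aster[40:45, 40:45] += 12.0                                 # implant a geothermal "hot spot"

residual = lst_aster - lst_modeled
mu, sigma = residual.mean(), residual.std()

strong_anomaly = residual > mu + 2 * sigma                      # greater than 2 sigma
weak_anomaly = (residual > mu + sigma) & ~strong_anomaly        # between 1 and 2 sigma
print("strong anomaly pixels:", int(strong_anomaly.sum()),
      "| weak anomaly pixels:", int(weak_anomaly.sum()))
```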

Keywords: Thermal remote sensing, insolation model, land surface temperature, geothermal anomalies.

8404 Video Quality Assessment using Visual Attention Approach for Sign Language

Authors: Julia Kucerova, Jaroslav Polec, Darina Tarcsiova

Abstract:

Visual information is very important in human perception of the surrounding world. Video is one of the most common ways to capture visual information. Video capability has many benefits and can be used in various applications. For the most part, video information is used to bring entertainment and help people relax; moreover, it can improve the quality of life of deaf people. Visual information is crucial for hearing impaired people: it allows them to communicate personally using sign language, and some parts of the person being spoken to are more important than others (e.g. hands, face). Therefore, information about the visually relevant parts of the image allows us to design an objective metric for this specific case. In this paper, we present an example of an objective metric based on human visual attention and the detection of salient objects in the observed scene.

Keywords: sign language, objective video quality, visual attention, saliency

8403 Dynamic Model of a Buck Converter with a Sliding Mode Control

Authors: S. Chonsatidjamroen, K-N. Areerak, K-L. Areerak

Abstract:

This paper presents the averaging model of a buck converter derived from the generalized state-space averaging method. Sliding mode control is used to regulate the output voltage of the converter and is taken into account in the model. The proposed model requires less computational time than the full topology model. Intensive time-domain simulations via the exact topology model are used as the reference for comparison. The results show that good agreement between the proposed model and the switching model is achieved in both the transient and steady-state responses. The reported model is suitable for optimal controller design using artificial intelligence techniques.
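
A bare-bones sketch of an averaged buck converter model integrated with SciPy; the circuit values are illustrative, and the sliding-mode regulation loop of the paper is omitted here, a constant duty cycle being used instead so that only the averaged state equations are shown:

```python
import numpy as np
from scipy.integrate import odeint

# Illustrative buck converter values (not the paper's).
Vin, L, C, R = 24.0, 1e-3, 100e-6, 10.0
d = 0.5                                    # constant duty cycle; the SMC loop is omitted here

def averaged_buck(x, t):
    iL, vC = x
    diL = (d * Vin - vC) / L               # averaged inductor-current equation
    dvC = (iL - vC / R) / C                # averaged capacitor-voltage equation
    return [diL, dvC]

t = np.linspace(0.0, 0.02, 2001)           # 20 ms transient
iL, vC = odeint(averaged_buck, [0.0, 0.0], t).T
print(f"output voltage settles near d*Vin = {d * Vin} V; simulated v_C(20 ms) = {vC[-1]:.2f} V")
```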

Keywords: Generalized state-space averaging method, buck converter, sliding mode control, modeling, simulation.

8402 The Main Principles of Text-to-Speech Synthesis System

Authors: K.R. Aida–Zade, C. Ardil, A.M. Sharifova

Abstract:

In this paper, the main principles of a text-to-speech synthesis system are presented. The associated problems which arise when developing a speech synthesis system are described. The approaches used and their application in speech synthesis systems for the Azerbaijani language are shown.

Keywords: synthesis of Azerbaijani language, morphemes, phonemes, sounds, sentence, speech synthesizer, intonation, accent, pronunciation.

8401 A File Splitting Technique for Reducing the Entropy of Text Files

Authors: Abdel-Rahman M. Jaradat, Mansour I. Irshid, Talha T. Nassar

Abstract:

A novel file splitting technique for the reduction of the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each of which contains one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as the mapper. The nth-order entropies of these subfiles are determined, and it is found that the sum of their entropies is less than that of the original text file for the same extension orders. These interesting statistical properties of the resulting subfiles can be used to achieve better compression ratios when conventional compression techniques are applied to these subfiles individually, on a bit-wise rather than a character-wise basis.
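
A toy sketch of the splitting idea: map each character to a fixed-length codeword, distribute the codeword bits into subfiles, and compare first-order (single-symbol) entropies; the naive 8-bit mapping and 2-way nibble split are illustrative simplifications of the paper's codeword assignment scheme:

```python
from collections import Counter
from math import log2

def first_order_entropy(symbols):
    """Shannon entropy (bits/symbol) of a symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * log2(c / n) for c in counts.values())

text = "this is a simple illustrative text file for the splitting example"
codewords = [format(ord(ch), "08b") for ch in text]   # naive 8-bit codeword per character

# Split each codeword's bits into two subfiles: high nibble and low nibble.
subfile_hi = [cw[:4] for cw in codewords]
subfile_lo = [cw[4:] for cw in codewords]

print("original characters:", round(first_order_entropy(list(text)), 3), "bits/symbol")
print("high-nibble subfile:", round(first_order_entropy(subfile_hi), 3), "bits/symbol")
print("low-nibble subfile: ", round(first_order_entropy(subfile_lo), 3), "bits/symbol")
```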

Keywords: Bit-wise compression, entropy, file splitting, source mapping.

8400 UEMSD Risk Identification – Case Study

Authors: K. Sekulová, M. Šimon

Abstract:

The article demonstrates through a case study how it is possible to identify MSD risk. It is based on the dissertation "Risk identification model of occupational diseases formation in relation to the work activity", which determines which risks can endanger workers who are exposed to specific risk factors. The risk is evaluated based on statistical calculations. These risk factors are the main cause of upper-extremity musculoskeletal disorders.

Keywords: Case study, upper-extremity musculoskeletal disorders, ergonomics.

8399 Comparative Study of Filter Characteristics as Statistical Vocal Correlates of Clinical Psychiatric State in Human

Authors: Thaweesak Yingthawornsuk, Chusak Thanawattano

Abstract:

Acoustical properties of speech have been shown to be related to the mental state of speakers with symptoms of depression and remission. This paper describes a way to address the issue of distinguishing depressed patients from remitted subjects based on measurable acoustic changes in their spoken sound. The vocal-tract related frequency characteristics of speech samples from female remitted and depressed patients were analyzed via speech processing techniques and consequently evaluated statistically by cross-validation with a Support Vector Machine. Our results show the classifier's performance, with effectively correct separation of 93% determined from testing with the subject-based feature model and 88% from the frame-based model, based on the same speech samples collected from hospital visiting interview sessions between patients and psychiatrists.
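
A generic sketch of the classification and cross-validation step with scikit-learn; the acoustic feature matrix and class labels are random placeholders for the vocal-tract features extracted in the study:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))          # 60 speech samples x 20 acoustic features (placeholder)
y = rng.integers(0, 2, size=60)        # 0 = remitted, 1 = depressed (placeholder labels)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```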

Keywords: Depression, SVM, Vocal Extract, Vocal Tract

8398 A Dynamic Hybrid Option Pricing Model by Genetic Algorithm and Black- Scholes Model

Authors: Yi-Chang Chen, Shan-Lin Chang, Chia-Chun Wu

Abstract:

Unlike this study, which focuses extensively on the trading behavior of the option market, previous research has paid attention mainly to model-driven option pricing. For example, the Black-Scholes (B-S) model is one of the most famous option pricing models; however, the arguments against the B-S model have previously been mentioned in reviews of pricing models. This paper therefore suggests the importance of the dynamic character of option pricing, which is also the reason for using the genetic algorithm (GA). Building on its mechanisms of natural selection and species evolution, this study proposes a hybrid model, the Genetic-BS model, which combines the GA and B-S to estimate the price more accurately. In the final experiments, the results show that the estimated price has a lower MAE value than the price calculated by either the B-S model or its enhanced version, the Gram-Charlier GARCH (G-C GARCH) model. Finally, this work concludes that the Genetic-BS pricing model is practical.
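
For reference, the B-S component of such a hybrid is the closed-form European call price; a minimal sketch with illustrative contract inputs is given below. In a Genetic-BS-style hybrid, a GA would tune inputs such as the volatility against observed market prices rather than fixing them as done here:

```python
from math import exp, log, sqrt
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option (no dividends)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

# Illustrative contract: spot 100, strike 105, 6 months to expiry, 2% rate, 25% volatility.
print(round(black_scholes_call(S=100, K=105, T=0.5, r=0.02, sigma=0.25), 4))
```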

Keywords: genetic algorithm, Genetic-BS, option pricing model.

8397 Prediction of Time to Crack Reinforced Concrete by Chloride Induced Corrosion

Authors: Anuruddha Jayasuriya, Thanakorn Pheeraphan

Abstract:

In this paper, different mathematical models which can be used as prediction tools to assess the time to crack reinforced concrete (RC) due to corrosion are reviewed. This investigation leads to an experimental study to validate a selected prediction model. Most of these mathematical models depend upon the mechanical behaviors, chemical behaviors, electrochemical behaviors or geometric aspects of the RC members during a corrosion process. The experimental program is designed to verify the accuracy of a model selected from a rigorous literature study. Fundamentally, the experimental program covers both one-dimensional chloride diffusion, using square RC slab elements of 500 mm by 500 mm, and two-dimensional chloride diffusion, using square RC column elements of 225 mm by 225 mm by 500 mm. Each set consists of three water-to-cement ratios (w/c), 0.4, 0.5 and 0.6, and two cover depths, 25 mm and 50 mm. 12 mm bars are used for the column elements and 16 mm bars are used for the slab elements. All the samples are subjected to accelerated chloride corrosion in a chloride bath of 5% (w/w) sodium chloride (NaCl) solution. Based on a pre-screening of different models, it is clear that the selected mathematical model includes the mechanical properties, the chemical and electrochemical properties, the nature of the corrosion (whether it is accelerated or natural), and the amount of porous area that can accommodate rust products before expansive pressure is exerted on the surrounding concrete. The experimental results have shown that the selected model had accuracies of ±20% and ±10% for one-dimensional and two-dimensional chloride diffusion, respectively, compared to the experimental output. The half-cell potential readings are also used to assess the corrosion probability, and the experimental results have shown that the mass loss is proportional to the negative half-cell potential readings obtained. Additionally, a statistical analysis is carried out in order to determine the most influential factor affecting the time to corrosion of the reinforcement in the concrete due to chloride diffusion. The factors considered for this analysis are w/c, bar diameter, and cover depth. The analysis is accomplished using Minitab statistical software, and it shows that cover depth has a more significant effect on the time to crack the concrete from chloride-induced corrosion than the other factors considered. Thus, time predictions can be made with the selected mathematical model, as it covers a wide range of factors affecting the corrosion process, and it can be used to predetermine the durability of RC structures that are vulnerable to chloride exposure. Eventually, it is further concluded that cover thickness plays a vital role in durability in terms of chloride diffusion.
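
One widely used building block for such predictions (not necessarily the model selected in this paper) is the error-function solution of Fick's second law for one-dimensional chloride ingress; the sketch below estimates the time for the chloride content at the bar depth to reach a critical threshold, with all input values invented for illustration:

```python
from scipy.special import erfinv

def time_to_threshold(cover_m, D_m2_per_s, C_s, C_crit):
    """Years for the chloride content at the rebar depth to reach C_crit, from the erf
    solution of Fick's second law: C(x, t) = C_s * (1 - erf(x / (2*sqrt(D*t))))."""
    z = erfinv(1.0 - C_crit / C_s)
    t_seconds = cover_m ** 2 / (4.0 * D_m2_per_s * z ** 2)
    return t_seconds / (365.25 * 24 * 3600)

# Illustrative values: 50 mm cover, D = 1e-12 m^2/s, surface 0.5% and critical 0.05% chloride by weight.
print(f"{time_to_threshold(0.050, 1e-12, 0.5, 0.05):.1f} years to reach the critical chloride content")
```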

Keywords: Accelerated corrosion, chloride diffusion, corrosion cracks, passivation layer, reinforcement corrosion.
