Search results for: Field data based model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19698

17148 The Adsorption of SDS on Ferro-Precipitates

Authors: R. Marsalek

Abstract:

This paper presents a new way to find the aerodynamic characteristic equation of a missile, making numerical trajectory prediction more accurate. The goal is to obtain a polynomial equation based on two missile characteristic parameters: angle of attack (α) and flight speed (ν). First, the missile under study is modelled and used in a computational flow model to compute the aerodynamic force and moment. The performance range of the missile is assumed to be -10 < α < 10 and 0 < ν < 200. After the results of all cases are obtained, the data are fitted by polynomial interpolation to create an equation for each case; all equations are then combined to form the aerodynamic characteristic equation, which is used for trajectory simulation.
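
As a rough illustration of the fitting step described in the abstract, the sketch below fits a bivariate polynomial to force values sampled over a grid of α and ν using least squares; the data, the polynomial degree, and the force expression are placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical CFD results: aerodynamic force sampled on a grid of
# angle of attack (alpha, degrees) and flight speed (nu).
alpha = np.linspace(-10, 10, 11)
nu = np.linspace(0, 200, 21)
A, V = np.meshgrid(alpha, nu)
F = 0.05 * A * V + 1e-4 * V**2  # placeholder force data, not from the paper

# Design matrix for a bivariate polynomial (degree 2 in each variable).
terms = [A**i * V**j for i in range(3) for j in range(3)]
X = np.column_stack([t.ravel() for t in terms])
coeffs, *_ = np.linalg.lstsq(X, F.ravel(), rcond=None)

# The fitted coefficients define a "characteristic equation"
# F(alpha, nu) ~ sum_ij c_ij * alpha^i * nu^j for trajectory simulation.
print(coeffs.reshape(3, 3))
```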

Keywords: ferro-precipitate, adsorption, SDS, zeta potential

Downloads: 1886

17147 The Investigation of Motor Cooling Performance

Authors: Chih-Chung Chang, Sy-Chi Kuo, Chen-Kang Huang, Sih-Li Chen

Abstract:

This study experimentally and numerically investigates motor cooling performance. The motor consists of a centrifugal fan, two axial fans, a shaft, a stator, a rotor and a heat exchanger with 637 cooling tubes. The pressure rise-flow rate (P-Q) performance curves of the cooling fans at 1800 rpm are tested using a test apparatus complying with Chinese National Standard (CNS) 2726. Compared with the experimental measurements, the numerical analysis shows that the P-Q performance curves of the axial fan and centrifugal fan can be estimated to within about 2% and 6%, respectively. Using a simplified model in which the heat exchanger and stator are treated as porous media, the flow field in the motor is calculated. Using the flow-field results near the rotor and stator, and applying the heat generation rate as a boundary condition, the temperature distributions of the stator and rotor are also calculated. The simulation results show that the calculated temperature of the stator winding near the axial fans is about 5% lower than the measured value, and the calculated temperature of the stator core at the center of the stator is about 1% higher than the measured value. Finally, ways to improve the motor cooling performance are discussed.

Keywords: Motor cooling, P-Q performance curves, CNS, porous media.

Downloads: 1615

17146 Ultrasound Mechanical Index as a Parameter Affecting the Proliferation Ability of Cells

Authors: Z. Hormozi Moghaddam, M. Mokhtari-Dizaji, M. Movahedin, M. E. Ravari

Abstract:

The mechanical index (MI) is used to quantify acoustic cavitation and relates acoustic pressure to frequency. In this study, modelling of the MI was applied to provide a treatment protocol and to understand the physical processes affecting the reproducibility of stem cells. The acoustic pressure and MI equations are modelled and solved to estimate the optimal MI for frequencies of 28, 40 and 150 kHz and 1 MHz. Radial and axial acoustic pressure distributions were extracted. To validate the modelling results, the acoustic pressure in water at the near-field depth was measured by a piston hydrophone. The modelling and experimental results agree well, with correlation coefficients of 0.91 and 0.90 (p<0.05) for 1 MHz and 40 kHz, respectively. Low-intensity ultrasound with an MI of 0.40 is more effective on the proliferation rate of spermatogonial stem cells during the seven days of culture; in contrast, a high MI has a harmful effect on the spermatogonial stem cells. This model supports proper treatment planning in vitro and in vivo by estimating the cavitation phenomenon.
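
The abstract refers to the MI equation without reproducing it; for orientation, the standard diagnostic-ultrasound definition of the mechanical index is given below (the exact form modelled in the paper may differ).

```latex
% Standard definition of the mechanical index in diagnostic ultrasound,
% shown for orientation; the form modelled in the paper may differ.
% p_r : peak rarefactional (negative) acoustic pressure, in MPa
% f_c : centre frequency, in MHz
\[ \mathrm{MI} = \frac{p_{r}}{\sqrt{f_{c}}} \]
```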

Keywords: Ultrasound, mechanical index, modeling, stem cell.

Downloads: 939

17145 Education Quality Development for Excellence Performance with Higher Education by Using COBIT 5

Authors: Kemkanit Sanyanunthana

Abstract:

The purpose of this research is to study the information technology management systems that support education at five private universities in Thailand, based on case studies in which those universities have been developing their management and education quality and standards through the provision of information technology services to support performance excellence. The concept of connecting information technology to a suitable system has been created by IT administrators so that a single system can be used throughout an organization to gain the greatest benefit from all resources. As a practitioner in higher education, the researcher conducted this study by selecting the Control Objectives for Information and Related Technology 5 (COBIT 5) framework together with the Malcolm Baldrige National Quality Award (MBNQA) of the United States, a national award that applies the concept of Total Quality Management (TQM) to organizational evaluation. The resulting evaluation, called the Education Criteria for Performance Excellence (EdPEx), is used to study and compare education quality development for performance excellence using COBIT 5, in terms of information technology, and to examine the problems and obstacles in assessing an information technology system. Such a system is regarded as an instrument to drive organizations toward performance excellence in information technology, and as a model for evaluating and analysing processes so that they accord with the universities' IT strategic plans. The research takes the form of descriptive and survey research based on the case studies. Data were collected through questionnaires administered to administrators working in the information technology field, with documents related to change management as the main supporting material. The research concludes that performance on the APO domain (Align, Plan and Organise) of the COBIT 5 framework, which emphasizes consistent governance and management of the organizations' strategic plans, reached only 95%. This may be due to restrictions such as organizational culture. The researcher therefore analysed the management of information technology in the universities as a whole, within their organizational structures, so that performance across the APO domain can support the strategic plans needed to develop IT performance excellence, and so that an organization-level risk management system can be applied to every process, improving the effectiveness of IT resource management and yielding the greatest benefit.

Keywords: COBIT 5, APO, EdPEx Criteria, MBNQA.

Downloads: 1468

17144 Flexible Sensor Array with Programmable Measurement System

Authors: Jung-Chuan Chou, Wei-Chuan Chen, Chien-Cheng Chen

Abstract:

This study concerns pH detection in solution using a 2 × 4 flexible sensor array based on a plastic polyethylene terephthalate (PET) substrate coated with a conductive layer and a ruthenium dioxide (RuO2) sensitive membrane, fabricated by screen printing and RF sputtering. For data analysis, we also prepared a dynamic measurement system for acquiring the response voltage and analysing the characteristics of the working electrodes (WEs), such as sensitivity and linearity. An array measurement system based on digital signal processing (DSP) was designed to acquire the raw signal from the sensor array; the DSP stage converts the unstable acquired data into a direct current (DC) output using a digital filter. With the measurement and analysis system designed in our laboratory, the sensor array achieved a satisfactory yield of 62.5%.
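
The abstract does not specify the digital filter design, so the sketch below shows one generic possibility: a first-order IIR low-pass (exponential moving average) that settles a noisy electrode voltage to a stable DC reading.

```python
import numpy as np

def smooth_to_dc(samples, alpha=0.05):
    """First-order IIR low-pass (exponential moving average).

    A generic digital filter of the kind described above; the paper
    does not specify its filter design, so this is only an illustration.
    """
    y = np.empty_like(samples, dtype=float)
    acc = samples[0]
    for i, x in enumerate(samples):
        acc += alpha * (x - acc)   # step a little toward each new sample
        y[i] = acc
    return y

# Noisy electrode voltage settling around 0.42 V (made-up numbers).
rng = np.random.default_rng(0)
v = 0.42 + 0.01 * rng.standard_normal(2000)
print(smooth_to_dc(v)[-1])  # ~0.42, the stabilized DC reading
```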

Keywords: Flexible sensor array, PET, RuO2, dynamic measurement, data analysis.

Downloads: 1471

17143 Realization of Fractional-Order Capacitors with Field-Effect Transistors

Authors: Steve Hung-Lung Tu, Yu-Hsuan Cheng

Abstract:

A novel and efficient approach to realizing fractional-order capacitors is investigated in this paper. The proposed approach is well suited to semiconductor implementation of fractional-order capacitors, and its feasibility has been verified with preliminary measured results.
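
For orientation, the sketch below evaluates the textbook impedance of an ideal fractional-order capacitor (constant phase element); the paper's FET-based realization would only approximate this behaviour over a frequency band, and the component values here are arbitrary.

```python
import numpy as np

# Impedance of an ideal fractional-order capacitor (constant-phase element):
#   Z(jw) = 1 / (C * (jw)**alpha),  0 < alpha < 1
# Textbook definition, shown for orientation only.
def z_fractional_cap(freq_hz, C=1e-6, alpha=0.5):
    jw = 1j * 2 * np.pi * freq_hz
    return 1.0 / (C * jw**alpha)

f = np.logspace(1, 6, 6)
Z = z_fractional_cap(f)
# The phase is constant at -alpha * 90 degrees, the CPE signature.
print(np.degrees(np.angle(Z)))
```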

Keywords: Fractional-order, field-effect transistors, RC transmission lines.

Downloads: 3135

17142 Seamless Flow of Voluminous Data in High Speed Network without Congestion Using Feedback Mechanism

Authors: T. Sheela, J. Raja

Abstract:

The continuously growing needs of Internet applications that transmit massive amounts of data have led to the emergence of high speed networks. Data transfer must take place without congestion, so feedback parameters must be sent from the receiver to the sender to restrict the sending rate. Although TCP tries to avoid congestion by restricting the sending rate and window size, it never informs the sender of the capacity available for the data to be sent, and it halves the window size at the time of congestion, resulting in decreased throughput, low bandwidth utilization and maximum delay. In this paper, the XCP protocol is used: feedback parameters are calculated based on the arrival rate, service rate, traffic rate and queue size, and the receiver informs the sender about the achievable throughput, the capacity of the data to be sent and the window size adjustment. This avoids drastic decreases in window size and allows a better increase in sending rate, so that data flow continuously without congestion. As a result, there is a maximum increase in throughput, high utilization of the bandwidth and minimum delay. The results of the proposed work are presented as graphs of throughput, delay and window size. The XCP protocol is thus well illustrated, and the various parameters are thoroughly analysed and adequately presented.
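
The feedback computation described above builds on XCP's efficiency controller; a minimal sketch of the original aggregate-feedback rule (Katabi et al., SIGCOMM 2002) follows, with the caveat that the paper's exact parameter calculations are not reproduced here.

```python
# Sketch of the aggregate feedback used by XCP's efficiency controller
# (Katabi et al., SIGCOMM 2002), on which this paper builds; the authors'
# exact arrival/service/traffic-rate calculations are not reproduced.
ALPHA, BETA = 0.4, 0.226        # stability constants from the XCP design

def aggregate_feedback(capacity_Bps, input_rate_Bps, mean_rtt_s, queue_B):
    """Bytes of rate feedback to distribute over the next control interval."""
    spare = capacity_Bps - input_rate_Bps       # unused bandwidth, bytes/s
    return ALPHA * mean_rtt_s * spare - BETA * queue_B

# Example: 12.5 MB/s (100 Mbit/s) link, 80% loaded, 50 ms RTT, 20 kB queue.
phi = aggregate_feedback(12.5e6, 10e6, 0.05, 20_000)
print(phi)   # positive -> senders may increase their congestion windows
```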

Keywords: Bandwidth-Delay Product, Congestion Control, Congestion Window, TCP/IP

Downloads: 1469

17141 A Decision Support Model for Bank Branch Location Selection

Authors: Nihan Cinar

Abstract:

Location selection is one of the most important decision making processes, requiring several criteria to be considered based on the mission and the strategy. This study's object is to provide a decision support model to help a bank select the most appropriate location for a branch, considering a case study in Turkey. The bank's objective is to select the most appropriate city for opening a branch among six alternatives in south-eastern Turkey. The model in this study consists of five main criteria, namely Demographic, Socio-Economic, Sectoral Employment, Banking and Trade Potential, and twenty-one sub-criteria that represent the bank's mission and strategy. Because of the multi-criteria structure of the problem and the fuzziness in the comparisons of the criteria, fuzzy AHP is used, and the alternatives are ranked with the TOPSIS method.
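
For reference, a generic TOPSIS ranking step of the kind used here for ranking the candidate cities can be sketched as follows; the criteria scores and weights are invented, and the fuzzy AHP stage that would supply the weights is not reproduced.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix : (n_alternatives, n_criteria) scores
    weights: criterion weights summing to 1 (here they would come from fuzzy AHP)
    benefit: True for benefit criteria, False for cost criteria
    Generic implementation for illustration; the paper's criteria values
    and fuzzy-AHP-derived weights are not reproduced.
    """
    M = matrix / np.linalg.norm(matrix, axis=0)      # vector normalization
    V = M * weights                                  # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)        # distance to ideal point
    d_neg = np.linalg.norm(V - anti, axis=1)         # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                   # closeness: higher is better

# Toy example: 3 candidate cities, 4 benefit-type criteria, made-up data.
scores = np.array([[7, 9, 9, 8], [8, 7, 8, 7], [9, 6, 8, 9]], dtype=float)
w = np.array([0.4, 0.3, 0.2, 0.1])
print(topsis(scores, w, np.array([True, True, True, True])))
```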

Keywords: MCDM, bank branch location, fuzzy AHP, TOPSIS.

Downloads: 4939

17140 Hidden State Probabilistic Modeling for Complex Wavelet Based Image Registration

Authors: F. C. Calnegru

Abstract:

This article presents a computationally tractable probabilistic model for the relation between the complex wavelet coefficients of two images of the same scene. The two images are acquired at distinct moments in time, from distinct viewpoints, or by distinct sensors. By means of the introduced probabilistic model, we argue that the similarity between the two images is controlled not by the values of the wavelet coefficients, which can be altered by many factors, but by the nature of the wavelet coefficients, which we model with the help of hidden state variables. We integrate this probabilistic framework into the construction of a new image registration algorithm. The algorithm has sub-pixel accuracy and is robust to noise and to other variations such as local illumination changes. We present the performance of our algorithm on various image types.

Keywords: Complex wavelet transform, image registration, modeling using hidden state variables, probabilistic similarity measure.

Downloads: 1361

17139 Accurate HLA Typing at High-Digit Resolution from NGS Data

Authors: Yazhi Huang, Jing Yang, Dingge Ying, Yan Zhang, Vorasuk Shotelersuk, Nattiya Hirankarn, Pak Chung Sham, Yu Lung Lau, Wanling Yang

Abstract:

Human leukocyte antigen (HLA) typing from next generation sequencing (NGS) data has potential applications in clinical laboratories and population genetic studies. Here we introduce a novel technique for HLA typing from NGS data based on read mapping, using a comprehensive reference panel containing all known HLA alleles, and de novo assembly of the gene-specific short reads. Accurate HLA typing at high-digit resolution was achieved when the method was tested on publicly available NGS data, where it outperformed other newly developed tools such as HLAminer and PHLAT.

Keywords: Human leukocyte antigens, next generation sequencing, whole exome sequencing, HLA typing.

Downloads: 2596

17138 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of a high volume and a broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods to develop their business. Today's decentralized data-management environments therefore rely on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions for managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business-intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: Agent-oriented modeling, business intelligence management, distributed data mining, multi-agent system.

Downloads: 1354

17137 A Mixed Integer Linear Programming Model for Flexible Job Shop Scheduling Problem

Authors: Mohsen Ziaee

Abstract:

In this paper, a mixed integer linear programming (MILP) model is presented to solve the flexible job shop scheduling problem (FJSP), one of the hardest combinatorial problems. The objective considered is the minimization of the makespan. The computational results of the proposed MILP model were compared with those of the best-known mathematical model in the literature in terms of computational time. The results show that our model performs better with respect to all the considered performance measures, including the relative percentage deviation (RPD) value, the number of constraints, and the total number of variables. With this improved mathematical model, larger FJSP instances can be optimally solved in reasonable time, and the model is therefore a better tool for the performance evaluation of approximation algorithms developed for the problem.
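
The paper's full formulation is not reproduced in the abstract; the sketch below shows one plausible shape of a minimal big-M MILP for a toy FJSP instance in PuLP (machine-assignment binaries, in-job precedence, disjunctive no-overlap constraints, and a makespan variable). It is an illustration, not the authors' model.

```python
from itertools import combinations
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

# Toy flexible job shop, for illustration only: jobs[j][k] maps each
# eligible machine to its processing time for operation k of job j.
jobs = {0: [{0: 3, 1: 5}, {1: 2}], 1: [{0: 4, 1: 4}, {0: 3}]}
ops = [(j, k) for j in jobs for k in range(len(jobs[j]))]
M = 100  # big-M, larger than any feasible makespan

prob = LpProblem("FJSP", LpMinimize)
x = {(o, m): LpVariable(f"x_{o[0]}_{o[1]}_{m}", cat="Binary")
     for o in ops for m in jobs[o[0]][o[1]]}
s = {o: LpVariable(f"s_{o[0]}_{o[1]}", lowBound=0) for o in ops}
cmax = LpVariable("Cmax", lowBound=0)
prob += cmax                                   # objective: minimize makespan

for (j, k) in ops:
    elig = jobs[j][k]
    prob += lpSum(x[(j, k), m] for m in elig) == 1      # exactly one machine
    dur = lpSum(elig[m] * x[(j, k), m] for m in elig)
    if k > 0:                                           # in-job precedence
        prev = jobs[j][k - 1]
        prob += s[(j, k)] >= s[(j, k - 1)] + lpSum(
            prev[m] * x[(j, k - 1), m] for m in prev)
    prob += cmax >= s[(j, k)] + dur

for a, b in combinations(ops, 2):              # no overlap on a shared machine
    for m in set(jobs[a[0]][a[1]]) & set(jobs[b[0]][b[1]]):
        y = LpVariable(f"y_{a[0]}{a[1]}_{b[0]}{b[1]}_{m}", cat="Binary")
        off = M * (2 - x[a, m] - x[b, m])      # relaxed unless both use m
        prob += s[b] >= s[a] + jobs[a[0]][a[1]][m] - M * (1 - y) - off
        prob += s[a] >= s[b] + jobs[b[0]][b[1]][m] - M * y - off

prob.solve()
print("optimal makespan:", value(cmax))
```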

Keywords: Scheduling, flexible job shop, makespan, mixed integer linear programming.

Downloads: 1649

17136 Autonomously Determining the Parameters for SVDD with RBF Kernel from a One-Class Training Set

Authors: Andreas Theissler, Ian Dear

Abstract:

The one-class support vector machine "support vector data description" (SVDD) is an ideal approach for anomaly or outlier detection. However, for SVDD to be applicable in real-world settings, ease of use is crucial. The results of SVDD are strongly determined by the choice of the regularisation parameter C and the kernel parameter of the widely used RBF kernel. While for two-class SVMs the parameters can be tuned using cross-validation based on the confusion matrix, this is not possible for a one-class SVM, because only true positives and false negatives can occur during training. This paper proposes an approach to find the optimal set of parameters for SVDD based solely on a training set from one class and without any user parameterisation. Results on artificial and real data sets are presented, underpinning the usefulness of the approach.
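
For context, the snippet below fits an RBF-kernel one-class SVM (closely related to SVDD) on one-class training data with scikit-learn; the gamma and nu values are set by hand here, since the paper's automatic tuning procedure is not reproduced.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# One-class training data: "normal" points only (synthetic stand-in).
rng = np.random.default_rng(1)
X_train = rng.normal(0, 1, size=(500, 2))

# SVDD with an RBF kernel is closely related to the nu-parameterised
# one-class SVM below; gamma and nu play the roles of the kernel and
# regularisation parameters discussed above. The paper's automatic
# tuning procedure is not reproduced; these values are manual.
clf = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.05).fit(X_train)

X_test = np.array([[0.1, -0.2], [6.0, 6.0]])  # one inlier, one outlier
print(clf.predict(X_test))  # +1 = inside the description, -1 = anomaly
```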

Keywords: Support vector data description, anomaly detection, one-class classification, parameter tuning.

Downloads: 2911

17135 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to analysis. Such data contain millions of short sequences read from fragmented DNA and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time consuming, and they rely on a large number of parameters that often introduce variability and affect the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use the concept of word and sentence embeddings to create a meaningful numerical representation of DNA sequences, extracting features and reducing the dimensionality of the data. In this paper we present metagenome2vec, an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads. The approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which a sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable to state-of-the-art methods applied to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
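
Step (i) of the pipeline, building a k-mer vocabulary from reads, can be sketched as follows; the reads and the value of k are toy values, and the embedding training itself is not shown.

```python
from collections import Counter

def kmers(read, k=6):
    """Split a DNA read into overlapping k-mers (the 'words' of step (i))."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

# Toy reads; real input would be millions of sequences from fastq files.
reads = ["ATGCGTACGTTAGC", "GCGTACGTTAGCAA"]
vocab = Counter(kmer for r in reads for kmer in kmers(r))
print(vocab.most_common(3))
# Embeddings for these k-mers would then be learned with a word-embedding
# style model, which is not reproduced here.
```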

Keywords: Metagenomics, phenotype prediction, deep learning, embeddings, multiple instance learning.

Downloads: 876

17134 Validation and Selection between Machine Learning Technique and Traditional Methods to Reduce Bullwhip Effects: a Data Mining Approach

Authors: Hamid R. S. Mojaveri, Seyed S. Mousavi, Mojtaba Heydar, Ahmad Aminian

Abstract:

The aim of this paper is to present a three-step methodology to forecast supply chain demand. In the first step, various data mining techniques are applied to prepare the data for the forecasting models. In the second step, the modeling step, an artificial neural network and a support vector machine are presented, after the Mean Absolute Percentage Error (MAPE) index is defined for measuring error. The structure of the artificial neural network is selected based on previous researchers' results, and in this article the accuracy of the network is increased by using sensitivity analysis. The best forecast among the classical forecasting methods (Moving Average, Exponential Smoothing, and Exponential Smoothing with Trend) is obtained from the prepared data and compared with the results of the support vector machine and the proposed artificial neural network. The results show that the artificial neural network forecasts more precisely than the other methods. Finally, the stability of the forecasting methods is analysed using raw data, and the effectiveness of clustering analysis is measured.
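
For reference, the MAPE index and one of the classical baselines (simple exponential smoothing) can be sketched as below; the demand series and smoothing constant are invented.

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percentage Error, the accuracy index used above."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100 * np.mean(np.abs((actual - forecast) / actual))

def exp_smoothing(series, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecasts."""
    f = [series[0]]
    for x in series[:-1]:
        f.append(alpha * x + (1 - alpha) * f[-1])
    return np.array(f)

demand = [100, 110, 104, 120, 115, 130]        # made-up demand history
print(mape(demand, exp_smoothing(demand)))
```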

Keywords: Artificial Neural Networks (ANN), bullwhip effect, demand forecasting, Support Vector Machine (SVM).

Downloads: 1988

17133 Nonlinear Thermal Expansion Model for SiC/Al

Authors: T.R. Sahroni, S. Sulaiman, I. Romli, M.R. Salleh, H.A. Ariff

Abstract:

The thermal expansion behaviour of a silicon carbide (SCS-2) fibre reinforced 6061 aluminium matrix composite subjected to thermal mechanical cycling (TMC) was investigated. Thermal stress has an important effect on the longitudinal thermal expansion coefficient of such composites. The present paper uses experimental data on the thermal expansion behaviour of a SiC/Al composite for temperatures up to 370°C, and these data were used to carry out the modelling of theoretical predictions.

Keywords: Nonlinear, thermal, fibre reinforced, metal matrix composites

Downloads: 2677

17132 An Application of Extreme Value Theory as a Risk Measurement Approach in Frontier Markets

Authors: Dany Ng Cheong Vee, Preethee Nunkoo Gonpot, Noor-Ul-Hacq Sookia

Abstract:

In this paper, we consider the application of Extreme Value Theory (EVT) as a risk measurement tool. The Value at Risk (VaR) for a set of indices from six stock exchanges of frontier markets is calculated using the Peaks over Threshold method, and the performance of the model is evaluated index-wise using coverage tests and loss functions. Our results show that the "fat-tailedness" of the data alone is not enough to justify the use of EVT as a VaR approach; the structure of the returns dynamics is also a determining factor. The approach works well in markets which have had extremes occur in the past, making the model capable of coping with extremes to come (the Colombo, Tunisia and Zagreb stock exchanges). On the other hand, we find that indices with lower past than present volatility fail to adequately deal with future extremes (Mauritius and Kazakhstan). We also conclude that using EVT alone produces quite static VaR figures that do not reflect the actual dynamics of the data.
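
A standard Peaks-over-Threshold VaR computation of the kind described above can be sketched as follows; the simulated losses, threshold choice and confidence level are illustrative, not the paper's.

```python
import numpy as np
from scipy.stats import genpareto

# Peaks-over-Threshold VaR: fit a Generalized Pareto Distribution to losses
# exceeding a high threshold u, then invert the tail estimator. Standard EVT
# recipe; the data and thresholds here are illustrative only.
rng = np.random.default_rng(7)
losses = rng.standard_t(df=3, size=5000)        # heavy-tailed stand-in returns

u = np.quantile(losses, 0.95)                   # threshold: a judgment call
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)   # shape xi and scale beta

def var_pot(q, n=len(losses), n_u=len(excesses)):
    """VaR at confidence level q from the fitted GPD tail."""
    return u + (beta / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1)

print(var_pot(0.99))
```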

Keywords: Extreme Value Theory, Financial Crisis 2008, Frontier Markets, Value at Risk.

Downloads: 2358

17131 Experiments on Element and Document Statistics for XML Retrieval

Authors: Mohamed Ben Aouicha, Mohamed Tmar, Mohand Boughanem, Mohamed Abid

Abstract:

This paper presents an information retrieval model for XML documents based on tree matching. Queries and documents are represented by extended trees. An extended tree is built from the original tree, with additional weighted virtual links between each node and its indirect descendants, allowing each descendant to be reached directly. Therefore only one level separates each node from its indirect descendants. This makes it possible to compare the user query and the document flexibly and with respect to the structural constraints of the query. The content of each node is very important in deciding whether a document element is relevant or not, so content should be taken into account in the retrieval process. We separate the structure-based and the content-based retrieval processes. The content-based score of each node is commonly based on the well-known Tf × Idf criterion. In this paper, we compare this criterion with another one we call Tf × Ief. The comparison is based on experiments on a dataset provided by INEX, showing the effectiveness of our approach on the one hand and of both weighting functions on the other.

Keywords: XML retrieval, INEX, Tf × Idf, Tf × Ief

Downloads: 1309

17130 The Conceptualization of Integrated Consumer Health Informatics Utilization Framework

Authors: Norfadzila, S.W.A., Balakrishnan, V., A. Abrizah

Abstract:

The purpose of this paper is to propose an integrated consumer health informatics utilization framework that can be used to gauge online health information needs and usage patterns among Malaysian women. The proposed framework was developed based on four theories/models: the Uses and Gratifications Theory, the Technology Acceptance Model 3, the Health Belief Model, and the Multi-level Model of Information Seeking. The relevant constructs and research hypotheses are also presented in this paper. The framework will be tested so that it can be used to identify Malaysian women's preferences for online health information resources and their health information seeking activities.

Keywords: Consumer Health Informatics, Consumer Preferences, Information Needs and Usage Patterns, Online Health Information, Women Studies

Downloads: 1699

17129 A Novel Fuzzy-Neural Based Medical Diagnosis System

Authors: S. Moein, S. A. Monadjemi, P. Moallem

Abstract:

In this paper, the application of artificial neural networks to typical disease diagnosis is investigated. The actual procedure of medical diagnosis usually employed by physicians was analyzed and converted to a machine-implementable format. After selecting some symptoms of eight different diseases, a data set containing the information of a few hundred cases was configured and applied to an MLP neural network. The results of the experiments, as well as the advantages of using a fuzzy approach, are discussed. The outcomes suggest the role of effective symptom selection and the advantages of data fuzzification in a neural-network-based automatic medical diagnosis system.

Keywords: Artificial Neural Networks, Fuzzy Logic, Medical Diagnosis, Symptoms, Fuzzification.

Downloads: 2237

17128 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data

Authors: Salam Khalifa, Naveed Ahmed

Abstract:

We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames, independent of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results. We show that despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to exploit temporal coherence to faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.

Keywords: 3D video, 3D animation, RGB-D video, Temporally Coherent 3D Animation.

Downloads: 2052

17127 Analytical Model for Brine Discharges from a Sea Outfall with Multiport Diffusers

Authors: Anton Purnama

Abstract:

Multiport diffusers are effective engineering devices installed at modern marine outfalls for the steady discharge of effluent streams from coastal plants, such as municipal sewage treatment, thermal power generation and seawater desalination. A mathematical model using a two-dimensional advection-diffusion equation, based on a flat seabed and incorporating the effect of a coastal tidal current, is developed to calculate the compounded concentration following discharges of desalination brine from a sea outfall with multiport diffusers. The analytical solutions are plotted to illustrate the merging of multiple brine plumes in shallow coastal waters, and an approximation of the maximum shoreline concentration is then made to formulate the dilution of a multiport diffuser discharge.
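
For orientation, a steady two-dimensional advection-diffusion balance of the general kind underlying such models is shown below; the paper's exact equation, source representation and boundary conditions are not reproduced here.

```latex
% Generic steady 2D advection-diffusion balance, for orientation only;
% the paper's exact form, sources and boundary conditions may differ.
% U : longshore (tidal) current speed
% D : turbulent diffusion coefficient
% c : brine concentration
\[
  U \frac{\partial c}{\partial x}
    = D \left( \frac{\partial^{2} c}{\partial x^{2}}
             + \frac{\partial^{2} c}{\partial y^{2}} \right)
\]
```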

Keywords: Desalination brine discharge, mathematical model, multiport diffuser, two sea outfalls.

Downloads: 2975

17126 Gas Flow Rate Identification in Biomass Power Plants by Response Surface Method

Authors: J. Satonsaowapak, M. Krapeedang, R. Oonsivilai, A. Oonsivilai

Abstract:

The use of renewable energy sources is becoming more crucial, and the wider application of renewable energy devices at domestic, commercial and industrial levels not only raises awareness but also significantly increases installed capacity. Biomass, principally in the form of wood, has been converted into energy by humans for a long time. Gasification is the conversion of solid carbonaceous fuel into combustible gas by partial combustion. Gasification models cover various operating conditions because the parameters kept in each model differ. This study used experimental data with three input variables, namely biomass consumption, temperature at the combustion zone, and ash discharge rate, and gas flow rate as the single output variable. In this paper, response surface methods were applied to identify the gasifier system equation that best fits the experimental data. The results showed that the linear model gave the best fit.
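
The identification step amounts to fitting a response surface to the three inputs and one output; a first-order (linear) fit, the form the study found best, can be sketched as follows with placeholder data.

```python
import numpy as np

# First-order response-surface fit: gas flow rate as a linear function of
# the three inputs named above. Data values are placeholders, not the
# study's measurements.
X = np.array([[10, 650, 0.5],    # biomass consumption, temperature, ash rate
              [12, 700, 0.6],
              [14, 720, 0.7],
              [16, 760, 0.8],
              [18, 800, 0.9]], dtype=float)
y = np.array([3.1, 3.6, 4.0, 4.5, 5.0])        # gas flow rate (output)

A = np.column_stack([np.ones(len(X)), X])      # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares fit
print(coef)  # intercept and one slope per input variable
```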

Keywords: Gasified System, Identification, Response Surface Method

Downloads: 1224

17125 BTG-BIBA: A Flexibility-Enhanced Biba Model Using BTG Strategies for Operating System

Authors: Gang Liu, Can Wang, Runnan Zhang, Quan Wang, Huimin Song, Shaomin Ji

Abstract:

The Biba model can protect information integrity but may deny various non-malicious access requests of the subjects, thereby decreasing availability in the system. Therefore, a mechanism that allows exceptional access control is needed. Break-the-Glass (BTG) strategies provide an efficient means of extending users' access rights in exceptional cases; these strategies help to prevent a system from stagnating. This work presents an approach for integrating Break-the-Glass strategies into the Biba model. We propose BTG-Biba, a model that provides both the original Biba model for use in normal situations and a mechanism for use in emergency situations. The proposed model is context-aware, can implement a fine-grained type of access control, and primarily solves cross-domain access problems. Finally, the flexibility and availability improvements from using the proposed model are illustrated.

Keywords: Biba model, break the glass, context, cross-domain, fine-grained.

Downloads: 1128

17124 A Comparative Analysis of Different Web Content Mining Tools

Authors: T. Suresh Kumar, M. Arthanari, N. Shanthi

Abstract:

Nowadays, the Web has become one of the most pervasive platforms for information exchange and retrieval, collecting the suitable and well-fitting information one requires from websites. Data mining is the practice of extracting the data available on the Internet. Web mining is one element of data mining and relates to various research communities such as information retrieval, database management systems and artificial intelligence. In this paper we discuss the concepts of Web mining. We focus on one of the categories of Web mining, specifically Web content mining, and its various tasks. Mining tools are essential for scanning the many images, text and HTML documents, and their results are then used by the various search engines. We conclude by presenting a comparative table of these tools based on some pertinent criteria.

Keywords: Data Mining, Web Mining, Web Content Mining, Mining Tools, Information retrieval.

Downloads: 3528

17123 Air Pollution and Respiratory-Related Restricted Activity Days in Tunisia

Authors: Mokhtar Kouki, Inès Rekik

Abstract:

This paper focuses on assessing the relationship between air pollution and morbidity in Tunisia. Air pollution is measured by the ozone concentration in the air, and morbidity is measured by the number of respiratory-related restricted activity days during the two-week period prior to the interview. Socioeconomic data are also collected in order to adjust for any confounding covariates. Our sample is composed of 407 Tunisian respondents; 44.7% are women, the average age is 35.2, nearly 69% live in a house built after 1980, and 27.8% reported at least one day of respiratory-related restricted activity. The model consists of regressing the number of respiratory-related restricted activity days on the air quality measure and the socioeconomic covariates. In order to correct for zero inflation and heterogeneity, we estimate several models (Poisson, negative binomial, zero-inflated Poisson, Poisson hurdle, negative binomial hurdle and finite mixture Poisson models). Bootstrapping and post-stratification techniques are used in order to correct for any sample bias. According to the Akaike information criterion, the negative binomial hurdle model has the greatest goodness of fit. The main result indicates that, after adjusting for socioeconomic data, the ozone concentration increases the probability of a positive number of restricted activity days.
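
The model-comparison step can be illustrated with a Poisson versus negative binomial fit ranked by AIC, as sketched below on synthetic overdispersed counts; the survey data are not public, and the hurdle extension is omitted.

```python
import numpy as np
import statsmodels.api as sm

# Compare Poisson and negative binomial fits by AIC, echoing the model
# selection step above. Synthetic overdispersed counts stand in for the
# restricted-activity-days data.
rng = np.random.default_rng(3)
n = 400
ozone = rng.uniform(20, 120, n)                       # made-up exposure values
X = sm.add_constant(ozone)
mu = np.exp(-1.0 + 0.01 * ozone)
days = rng.negative_binomial(n=1, p=1 / (1 + mu))     # overdispersed outcome

poisson = sm.GLM(days, X, family=sm.families.Poisson()).fit()
negbin = sm.GLM(days, X, family=sm.families.NegativeBinomial()).fit()
print("Poisson AIC:", poisson.aic, " NegBin AIC:", negbin.aic)
# A hurdle model would additionally fit a binary part for zero vs. positive
# counts; that extension is not shown here.
```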

Keywords: Bootstrapping, hurdle negbin model, overdispersion, ozone concentration, respiratory-related restricted activity days.

Downloads: 2111

17122 Groundwater Level Prediction at a Pilot Area in Southeastern Part of the UAE using Shallow Seismic Method

Authors: Murad A, Baker H, Mahmoud S, Gabr A

Abstract:

Groundwater is one of the main sources of sustainability in the United Arab Emirates (UAE). Intensive development in the Al-Ain area has increased water demand, which has consequently reduced the overall groundwater quantity in major aquifers. However, in certain residential areas within Al-Ain, such as the Sha-ab Al Askher area, it has been noticed that the groundwater level is rising. The reasons for this rising groundwater phenomenon are yet to be investigated. In this work, twenty-four seismic refraction profiles were carried out across the pilot study area, together with field measurements of the groundwater level in a number of available water wells. The processed seismic data indicated that the deepest and shallowest groundwater levels are 15 m and 2.3 m, respectively. This result is highly consistent with the direct field measurements of the groundwater level. The minimum detected value may reflect perched subsurface water associated with infiltration from surrounding water bodies such as lakes and elevated farms. The maximum values indicate the actual groundwater level within the study area. The findings of this work may serve as preliminary guidance for decision makers.
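
For orientation, the textbook two-layer depth estimate from a seismic refraction survey's crossover distance is sketched below; the velocities and crossover value are hypothetical, and the study's actual interpretation workflow is not reproduced.

```python
import math

def depth_from_crossover(x_cross, v1, v2):
    """Depth to a refractor from the crossover distance of a two-layer
    seismic refraction survey: h = (x_c / 2) * sqrt((v2 - v1) / (v2 + v1)).
    Textbook relation, shown for orientation only."""
    return (x_cross / 2) * math.sqrt((v2 - v1) / (v2 + v1))

# Hypothetical picks: crossover at 40 m, an 800 m/s dry layer over a
# 1600 m/s saturated layer (values illustrative, not from the study).
print(depth_from_crossover(40.0, 800.0, 1600.0))  # ~11.5 m to the water table
```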

Keywords: groundwater, shallow seismic method, United Arab Emirates

Downloads: 1480

17121 Serviceability of Fabric-Formed Concrete Structures

Authors: Yadgar Tayfur, Antony Darby, Tim Ibell, Mark Evernden, John Orr

Abstract:

Fabric formwork is a technique for casting concrete structures, with the great advantage of saving up to 40% of concrete material. The technique is particularly associated with optimized concrete structures, whose members usually have smaller cross-section dimensions than equivalent prismatic members. However, this can leave the resulting structural system with smaller serviceability safety margins. It is therefore very important to understand the serviceability of non-prismatic concrete structures. In this paper, an analytical computer-based model for optimizing concrete beams and predicting the load-deflection behaviour of both prismatic and non-prismatic concrete beams is presented. The model was developed based on the method of sectional analysis and integration of curvatures. Results from the analytical model were compared with the load-deflection behaviour of a number of beams, with different geometric and material properties, reported by other researchers. The comparison shows that the analytical program can accurately predict the load-deflection response of concrete beams with medium reinforcement ratios, although it over-estimates deflections for lightly reinforced specimens. Finally, the analytical program acceptably predicted the load-deflection behaviour of non-prismatic concrete beams.
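
The "integration of curvatures" step can be illustrated numerically: given a curvature distribution κ(x) from sectional analysis, integrating twice with simply supported boundary conditions yields the deflected shape. The sketch below checks a uniform-EI case against the closed-form midspan deflection; the sectional analysis that would produce κ(x) for cracked, non-prismatic sections is assumed done and is not reproduced.

```python
import numpy as np

def deflection_from_curvature(x, kappa):
    """Integrate curvature twice for a simply supported beam (v = 0 at
    both ends). A generic numerical take on 'integration of curvatures';
    the paper's sectional analysis supplying kappa(x) is assumed done."""
    dx = np.diff(x)
    theta = np.concatenate(
        [[0.0], np.cumsum(0.5 * (kappa[1:] + kappa[:-1]) * dx)])  # slope
    v = np.concatenate(
        [[0.0], np.cumsum(0.5 * (theta[1:] + theta[:-1]) * dx)])  # deflection
    v -= x * v[-1] / x[-1]      # enforce v(L) = 0 by removing rigid rotation
    return v

# Example: uniform-EI beam under uniform load -> parabolic moment diagram.
L, EI, w = 6.0, 30e3, 10.0      # m, kN*m^2, kN/m (illustrative values)
x = np.linspace(0, L, 201)
M = w * x * (L - x) / 2
v = deflection_from_curvature(x, M / EI)
print(v.min(), -5 * w * L**4 / (384 * EI))  # numeric vs closed-form midspan
```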

Keywords: Concrete beams, deflections, fabric formwork, optimisation, serviceability.

Downloads: 1554

17120 Supplier Selection in a Scenario Based Stochastic Model with Uncertain Defectiveness and Delivery Lateness Rates

Authors: Abeer Amayri, Akif A. Bulgak

Abstract:

Due to today's globalization and companies' outsourcing practices, supply chain (SC) performance has become more dependent on the efficient movement of material among geographically dispersed places, where there is more chance of disruption. One such disruption comes from the quality and delivery uncertainties of outsourcing. These uncertainties can render products unsafe and, as a number of recent examples show, companies may end up having to recall their products. As a result of these problems, a methodology is needed for selecting suppliers globally in view of the risks associated with low quality and late delivery. Accordingly, we developed a two-stage stochastic model that captures the risks associated with uncertainty in quality and delivery, along with a solution procedure for the model. The stochastic model simultaneously optimizes supplier selection and purchase quantities under price discounts over a time horizon. In particular, our target is the study of global organizations with multiple sites and multiple overseas suppliers, where pricing is offered in the suppliers' local currencies. The proposed methodology is applied to a case study of a US automotive company with two assembly plants and four potential global suppliers, to illustrate how the proposed model works in practice.

Keywords: Global supply chains, quality, stochastic programming, supplier selection.

Downloads: 1540

17119 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data

Authors: Benjamin D. Leiby, Darryl K. Ahner

Abstract:

This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country-conflict dataset motivates the search to impute missing values well beyond the common threshold of 20% missingness. The methodology capitalizes on correlation while using model residuals to quantify the uncertainty in estimating unknown values. Examination of the methodology provides insight into choosing linear or nonlinear modeling terms. The static tolerances common in most packages are replaced with tailorable tolerances that exploit residuals to fit each data element. The evaluation of the methodology includes computation time, model fit, and a comparison of known values with the replaced values created through imputation. Overall, the country-conflict dataset shows promise when first-order interactions are modeled, while indicating a need for further refinement that mimics predictive mean matching.
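
A miniature version of stochastic regression imputation, prediction plus a resampled residual to carry the model's uncertainty, is sketched below; the synthetic columns stand in for the country-conflict features, and the paper's tailorable tolerances are not reproduced.

```python
import numpy as np

# Stochastic regression imputation in miniature: predict the missing entries
# of one column from another, then add a resampled residual so imputed values
# carry the model's uncertainty. Synthetic data, not the study's features.
rng = np.random.default_rng(5)
x = rng.normal(size=300)
y = 2.0 * x + rng.normal(scale=0.5, size=300)
y[rng.random(300) < 0.3] = np.nan                 # ~30% missing, above 20%

obs = ~np.isnan(y)
slope, intercept = np.polyfit(x[obs], y[obs], 1)  # fit on observed rows only
resid = y[obs] - (slope * x[obs] + intercept)

miss = ~obs
y[miss] = (slope * x[miss] + intercept
           + rng.choice(resid, size=miss.sum()))  # prediction + drawn residual
print(f"imputed {miss.sum()} values")
```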

Keywords: Correlation, country conflict, imputation, stochastic regression.

Downloads: 389