Search results for: predicting models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2800

2470 Linear Prediction System in Measuring Glucose Level in Blood

Authors: Intan Maisarah Abd Rahim, Herlina Abdul Rahim, Rashidah Ghazali

Abstract:

Diabetes is a medical condition that can lead to various diseases such as stroke, heart disease, blindness and obesity. In clinical practice, the reluctance of diabetic patients towards blood glucose examination is rather alarming, as some individuals describe the pinprick as painful. For patients with high glucose levels, pricking the fingers multiple times a day with a conventional glucose meter for close monitoring can be tiresome, time consuming and painful. With these concerns, several non-invasive techniques have been used by researchers to measure the blood glucose level, including ultrasonic sensors, multisensory systems, absorbance of transmittance, bio-impedance, voltage intensity, and thermography. This paper discusses the application of near-infrared (NIR) spectroscopy as a non-invasive method of measuring the glucose level and the implementation of a linear system identification model for predicting the output data of the NIR measurement. In this study, the wavelengths considered are 1450 nm and 1950 nm, as both showed the most reliable information on the presence of glucose in blood. The linear Autoregressive Moving Average with Exogenous input (ARMAX) model, in both un-regularized and regularized forms, was then implemented to predict the output of the NIR measurement in order to investigate the practicality of a linear system for this task. However, the system achieved only 50.11% accuracy, which is far from satisfactory.
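
As a rough illustration of the kind of linear system identification described here, the sketch below fits a simplified ARX-type predictor (a close relative of the ARMAX structure named in the abstract) by least squares, with an optional ridge penalty standing in for the "regularized" variant; the data and parameter names are hypothetical, not the authors' NIR measurements.

```python
# Illustrative sketch (not the authors' code): fit an ARX-type linear predictor
# y[t] = sum_i a_i*y[t-i] + sum_j b_j*u[t-j] by least squares, with an optional
# ridge (L2) penalty as a stand-in for the regularized variant.
import numpy as np

def fit_arx(y, u, na=2, nb=2, ridge=0.0):
    """Return ARX coefficients estimated from output y and input u."""
    n = max(na, nb)
    rows, targets = [], []
    for t in range(n, len(y)):
        past_y = y[t - na:t][::-1]          # y[t-1], ..., y[t-na]
        past_u = u[t - nb:t][::-1]          # u[t-1], ..., u[t-nb]
        rows.append(np.concatenate([past_y, past_u]))
        targets.append(y[t])
    X, d = np.asarray(rows), np.asarray(targets)
    # Regularized normal equations: (X'X + ridge*I) theta = X'd
    theta = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ d)
    return theta

# Hypothetical data: u = NIR detector signal, y = reference glucose reading.
rng = np.random.default_rng(0)
u = rng.normal(size=200)
y = np.zeros(200)
for t in range(2, 200):
    y[t] = 0.4 * y[t - 1] + 0.5 * u[t - 1] + 0.3 * u[t - 2]
y += 0.05 * rng.normal(size=200)
print(fit_arx(y, u, ridge=0.1))
```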

Keywords: Diabetes, glucose level, linear, near-infrared (NIR), non-invasive, prediction system.

2469 Operating System Based Virtualization Models in Cloud Computing

Authors: Dev Ras Pandey, Bharat Mishra, S. K. Tripathi

Abstract:

Cloud computing is ready to transform the structure of businesses and learning by supplying real-time applications and providing immediate help for small to medium sized businesses. The ability to run a hypervisor inside a virtual machine, called nested virtualization, is an important feature of virtualization. In today's growing field of information technology, many virtualization models are available and convenient to implement, but selecting a single model is difficult. This paper examines the application of operating system based virtualization in cloud computing and the choice of an appropriate model given different specifications and users' requirements. The most popular models, both container based and hypervisor based, were selected and compared against a wide range of users' requirements, such as the number of CPUs, memory size, nested virtualization support, live migration and commercial support, and the most suitable virtualization model was identified.

Keywords: Virtualization, OS based virtualization, container and hypervisor based virtualization.

2468 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution, and not all numerical methods solve them efficiently because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, we price American options under jump diffusion models using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to rational approximation schemes to avoid inverting matrix polynomials. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
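
The O(M log M) matrix-vector product mentioned above rests on a standard trick: a Toeplitz matrix (such as the one produced by discretizing the jump integral) can be embedded in a circulant matrix, whose action is a pointwise product in Fourier space. The sketch below shows that idea in isolation; it is not the authors' solver.

```python
# Sketch of the FFT idea behind the O(M log M) matrix-vector product: embed a
# Toeplitz matrix in a circulant one and apply it via FFTs.
import numpy as np

def toeplitz_matvec_fft(first_col, first_row, x):
    m = len(x)
    # First column of a 2m-point circulant embedding of the Toeplitz matrix.
    c = np.concatenate([first_col, [0.0], first_row[1:][::-1]])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(np.concatenate([x, np.zeros(m)])))
    return y[:m].real

# Check against the dense O(M^2) product on random data.
m = 256
col = np.random.rand(m); row = np.random.rand(m); row[0] = col[0]
T = np.array([[col[i - j] if i >= j else row[j - i] for j in range(m)] for i in range(m)])
x = np.random.rand(m)
print(np.allclose(T @ x, toeplitz_matvec_fft(col, row, x)))
```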

Keywords: Integral differential equations, American options, jump–diffusion model, rational approximation.

2467 The Use of Performance Indicators for Evaluating Models of Drying Jackfruit (Artocarpus heterophyllus L.): Page, Midilli, and Lewis

Authors: D. S. C. Soares, D. G. Costa, J. T. S., A. K. S. Abud, T. P. Nunes, A. M. Oliveira Júnior

Abstract:

Mathematical drying models are used to understand the drying process and to determine important parameters for the design and operation of the dryer. Jackfruit is a fruit with high consumption in the Northeast of Brazil and high perishability, so techniques are needed to extend its conservation and allow it to reach regions with low consumption. This study analyzed several mathematical models (Page, Lewis, and Midilli) to indicate the one that best fits the convective drying conditions, using the performance indicators associated with each model: accuracy factor (Af), bias factor (Bf), root mean square error (RMSE) and standard error of prediction (%SEP). Jackfruit drying was carried out in a convective tray dryer at a temperature of 50°C for 9 hours. The Midilli model was the most accurate, with Af: 1.39, Bf: 1.33, RMSE: 0.01%, and SEP: 5.34. However, the Midilli model is not appropriate for process control purposes because it requires four tuning parameters, whereas the Page model showed similar results for these performance indicators with only two parameters. It is concluded that the best correlation between the experimental and estimated data is given by Page's model.
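
As a hedged illustration of the fitting and evaluation step, the sketch below fits the thin-layer Page model MR = exp(-k·t^n) to made-up moisture-ratio data and computes the indicators in their commonly used forms; the paper's exact definitions and data are not reproduced.

```python
# Sketch (assumed standard forms, not the authors' code): fit the Page model
# MR = exp(-k * t**n) to drying data and compute the performance indicators.
import numpy as np
from scipy.optimize import curve_fit

def page(t, k, n):
    return np.exp(-k * t**n)

t = np.array([0.0, 1, 2, 3, 4, 5, 6, 7, 8, 9])          # drying time, h (illustrative)
mr = np.array([1.0, 0.78, 0.60, 0.47, 0.36, 0.28, 0.22, 0.17, 0.13, 0.10])

(k, n), _ = curve_fit(page, t, mr, p0=(0.2, 1.0))
pred = page(t, k, n)

rmse = np.sqrt(np.mean((mr - pred) ** 2))
sep = 100.0 * rmse / np.mean(mr)                  # % standard error of prediction
ratio = np.log10(pred[1:] / mr[1:])               # skip the initial point where MR is fixed at 1
af = 10 ** np.mean(np.abs(ratio))                 # accuracy factor
bf = 10 ** np.mean(ratio)                         # bias factor
print(f"k={k:.3f} n={n:.3f} RMSE={rmse:.4f} %SEP={sep:.2f} Af={af:.3f} Bf={bf:.3f}")
```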

Keywords: Drying, models, jackfruit.

2466 Analysis of Textual Data Based On Multiple 2-Class Classification Models

Authors: Shigeaki Sakurai, Ryohei Orihara

Abstract:

This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. It acquires 2-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints, and uses the models to infer whether the viewpoints should be assigned to new items. The method then extracts expressions from the new items classified into each viewpoint and identifies characteristic expressions corresponding to the viewpoints by comparing the frequency of expressions among them. The paper also applies the method to questionnaire data given by guests at a hotel and verifies its effect through numerical experiments.
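
The sketch below mirrors the overall flow with scikit-learn, which is our choice of toolkit rather than the authors': one 2-class model per viewpoint, followed by a crude frequency comparison of expressions in the items assigned to each viewpoint. The viewpoints and example texts are hypothetical.

```python
# Sketch of the flow: one 2-class model per viewpoint, then a frequency-based
# look at characteristic expressions (toolkit and data are assumptions).
from collections import Counter
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_items = ["room was clean and quiet", "staff were rude at check-in",
               "breakfast buffet was great", "long wait for the front desk"]
labels = {"service": [0, 1, 0, 1], "facility": [1, 0, 1, 0]}   # hypothetical viewpoints

models = {}
for viewpoint, y in labels.items():
    clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
    models[viewpoint] = clf.fit(train_items, y)

new_items = ["front desk staff were helpful", "the pool area was clean"]
for viewpoint, clf in models.items():
    assigned = [doc for doc, hit in zip(new_items, clf.predict(new_items)) if hit]
    counts = Counter(word for doc in assigned for word in doc.split())
    print(viewpoint, counts.most_common(3))   # crude proxy for characteristic expressions
```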

Keywords: Text mining, Multiple viewpoints, Differential analysis, Questionnaire data

2464 Comparison of Different Channel Modeling Techniques Used in BPLC Systems

Authors: Justinian Anatory, Nelson Theethayi

Abstract:

The paper compares different channel models used for modeling Broadband Power-Line Communication (BPLC) systems: the Zimmermann and Dostert model, the Philipps model, the Anatory et al. model and the Anatory et al. generalized Transmission Line (TL) model. The validity of each model was checked in the time domain against ATP-EMTP software, which uses a transmission line approach. It is found that, for a power-line network with a minimum number of branches, all the models give similar signal/pulse time responses compared with ATP-EMTP, although the Zimmermann and Dostert model indicates the same amplitude but a different time delay. When the number of branches is increased, only the generalized TL theory results remain comparable with the ATP-EMTP results. A Multi-Carrier Spread Spectrum (MC-SS) system was also applied to check the implication of such behavior on modulation schemes. Using the Philipps model on an underground cable can predict a performance up to 25 dB better than the other channel models, which can misrepresent the actual performance of the system, and the modified Zimmermann and Dostert model under multipath can predict a performance about 5 dB better than that predicted by the generalized TL theory. It is therefore suggested that, for realistic BPLC system design and analysis, the model based on generalized TL theory be used.
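
For readers unfamiliar with the first model being compared, the sketch below evaluates the widely cited Zimmermann-Dostert multipath frequency response, H(f) = Σ g_i·exp(-(a0 + a1·f^k)·d_i)·exp(-j2πf·d_i/v_p); the path gains, lengths and attenuation constants used here are purely illustrative, and the other compared models are not reproduced.

```python
# Sketch of the Zimmermann-Dostert multipath channel model (one of the models
# compared in the paper); parameter values below are purely illustrative.
import numpy as np

def zimmermann_dostert(f, g, d, a0=1e-3, a1=1e-10, k=1.0, v_p=1.5e8):
    """H(f) = sum_i g_i * exp(-(a0 + a1*f**k)*d_i) * exp(-2j*pi*f*d_i/v_p)."""
    f = np.asarray(f, dtype=float)[:, None]          # shape (F, 1)
    g = np.asarray(g)[None, :]                       # path gains, shape (1, N)
    d = np.asarray(d)[None, :]                       # path lengths in metres
    att = np.exp(-(a0 + a1 * f**k) * d)              # frequency-dependent attenuation
    phase = np.exp(-2j * np.pi * f * d / v_p)        # propagation delay term
    return np.sum(g * att * phase, axis=1)

f = np.linspace(1e6, 30e6, 500)                      # 1-30 MHz band
H = zimmermann_dostert(f, g=[0.64, 0.38, -0.15], d=[200.0, 222.4, 244.8])
print(20 * np.log10(np.abs(H[:3])))                  # magnitude response in dB
```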

Keywords: Broadband Power Line Channel Models, load impedance, Branched network.

2464 Expectation-Confirmation Model of Information System Continuance: A Meta-Analysis

Authors: Hui-Min Lai, Chin-Pin Chen, Yung-Fu Chang

Abstract:

The expectation-confirmation model (ECM) is one of the most widely used models for evaluating information system continuance, and it has been extended to other study contexts or expanded with other theoretical perspectives. However, combining the ECM with other theories, or differences in study context, may produce disparities and thus inaccurate conclusions. Habit is considered an important factor that influences a user's continuance behavior. This paper therefore critically examines seven pairs of relationships from the original ECM together with the habit variable. A meta-analysis was used to trace the development of ECM research over the last 10 years, drawing on journal and conference papers published in 2005–2014; forty-six journal articles and 19 conference papers were selected for analysis. The results confirm our prediction that high effect sizes were obtained for the seven pairs of relationships (ranging from r=0.386 to r=0.588). Furthermore, meta-analytic structural equation modeling was performed to test all relationships simultaneously. The results show that habit had a significant positive effect on continuance intention at p<=0.05 and that the six other pairs of relationships were significant at p<0.10. Based on the findings, we refined our original research model and proposed an alternative model for understanding and predicting information system continuance. Some theoretical implications are also discussed.
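
To make the pooling of effect sizes concrete, the sketch below shows a standard fixed-effect meta-analysis of correlations via Fisher's z transform; the study correlations and sample sizes here are hypothetical, not the values reported for any of the seven relationships.

```python
# Sketch of pooling correlations across studies with Fisher's z transform
# (a standard fixed-effect approach; the numbers are hypothetical).
import numpy as np

def pooled_correlation(r, n):
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)                 # Fisher z transform of each study's r
    w = n - 3                         # inverse-variance weights for Fisher z
    z_bar = np.sum(w * z) / np.sum(w)
    return np.tanh(z_bar)             # back-transform to a pooled r

r = [0.45, 0.52, 0.38, 0.61]          # correlations reported by individual studies
n = [210, 180, 305, 150]              # their sample sizes
print(round(pooled_correlation(r, n), 3))
```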

Keywords: Expectation-confirmation theory, expectation-confirmation model, meta-analysis, meta-analytic structural equation modeling.

2463 Experimental Study on the Variation of Young's Modulus of Hollow Clay Brick Obtained from Static and Dynamic Tests

Authors: M. Aboudalle, Le Btth, M. Sari, F. Meftah

Abstract:

In parallel with the appearance of new materials, brick masonry has held and still holds an essential share of the construction market, with new technical challenges in designing bricks to meet additional requirements. Since clay brick masonry is used in structural applications, predicting its performance allows a significant reduction in the cost of practical experimentation. The behavior of masonry walls depends on the behavior of their elementary components, such as bricks, joints, and coatings. It is therefore necessary to consider masonry at different scales (from the scale of the intrinsic material to the real scale of the wall) and then to develop appropriate models using numerical simulations. The work presented in this paper focuses on the mechanical characterization of the terracotta material at ambient temperature. The static Young's modulus obtained from the flexural test differs from that obtained from the compression test, as well as from the dynamic Young's modulus obtained from the impulse excitation of vibration test. Moreover, the Young's modulus varies according to the direction in which the samples are extracted: the values in the extrusion direction diverge from those in the orthogonal directions. Based on these results, hollow bricks can be considered a transversely isotropic bimodulus material.
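
As a hedged worked example of one of the static tests mentioned, the snippet below evaluates the flexural Young's modulus from simple beam theory for a three-point bending setup, E = F·L³/(48·I·δ) with I = b·h³/12; the geometry and load values are illustrative and the paper's own test standards (and the ASTM E1876-type impulse excitation evaluation of the dynamic modulus) may differ.

```python
# Worked example of a static (flexural) Young's modulus from beam theory,
# E = F * L**3 / (48 * I * delta), three-point bending of a rectangular sample;
# geometry and load values are illustrative only.
span = 0.20            # support span L, m
width = 0.04           # sample width b, m
thickness = 0.02       # sample thickness h, m
force = 1500.0         # applied central load F, N
deflection = 0.5e-3    # measured mid-span deflection delta, m

inertia = width * thickness**3 / 12.0            # second moment of area, m^4
E_static = force * span**3 / (48.0 * inertia * deflection)
print(f"Static flexural Young's modulus: {E_static / 1e9:.1f} GPa")
```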

Keywords: Bimodulus material, hollow clay brick, impulse excitation of vibration, transversely isotropic material, Young’s modulus.

2462 Vision Based Hand Gesture Recognition Using Generative and Discriminative Stochastic Models

Authors: Mahmoud Elmezain, Samar El-shinawy

Abstract:

Many approaches to pattern recognition are founded on probability theory and can be broadly characterized as either generative or discriminative, according to whether or not they model the distribution of the image features. Generative and discriminative models have very different characteristics, as well as complementary strengths and weaknesses. In this paper, we study these models to recognize the patterns of alphabet characters (A-Z) and numbers (0-9). To handle isolated patterns, a generative model, the Hidden Markov Model (HMM), and discriminative models, namely the Conditional Random Field (CRF), Hidden Conditional Random Field (HCRF) and Latent-Dynamic Conditional Random Field (LDCRF), are applied to the extracted pattern features with different window sizes. The gesture recognition rate improves initially as the window size increases, but degrades as the window size increases further. Experimental results show that the LDCRF outperforms the CRF, HCRF and HMM at a window size of 4, with overall recognition rates of 91.52%, 95.28%, 96.94% and 98.05% for CRF, HCRF, HMM and LDCRF, respectively.
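
The sketch below illustrates only the generative side of this comparison: one Gaussian HMM per class, with a new sequence classified by maximum log-likelihood. The hmmlearn library, the synthetic 2-D feature trajectories and the class names are our assumptions; the CRF-family discriminative models and the window-size feature construction are not shown.

```python
# Sketch of the generative side only: one Gaussian HMM per class, classification
# by maximum log-likelihood (hmmlearn and the toy data are assumptions).
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(1)

def make_sequences(offset, n_seq=20, length=30):
    """Hypothetical 2-D feature trajectories (e.g. hand location/orientation)."""
    return [offset + np.cumsum(rng.normal(size=(length, 2)), axis=0) for _ in range(n_seq)]

train = {"A": make_sequences(0.0), "B": make_sequences(5.0)}
models = {}
for label, seqs in train.items():
    X = np.vstack(seqs)                      # stacked observations
    lengths = [len(s) for s in seqs]         # per-sequence lengths
    m = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=20)
    models[label] = m.fit(X, lengths)

test_seq = make_sequences(5.0, n_seq=1)[0]
scores = {label: m.score(test_seq) for label, m in models.items()}
print(max(scores, key=scores.get))           # expected to pick "B"
```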

Keywords: Statistical Pattern Recognition, Generative Model, Discriminative Model, Human Computer Interaction.

2461 Dynamic Load Modeling for Khuzestan Power System Voltage Stability Studies

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

Based on the component approach, three kinds of dynamic load models, namely a single-motor model, a two-motor model and a composite load model, have been developed for the stability studies of the Khuzestan power system, and the study results are presented in this paper. Voltage instability is a dynamic phenomenon and therefore requires a dynamic representation of the power system components. Industrial loads contain a large fraction of induction machines, and several models of different complexity are available for describing them in such investigations. This study evaluates the dynamic performance of several dynamic load models in combination with the dynamics of a load tap-changing transformer. The case study is a steel industry substation in the Khuzestan power system.
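
As a small illustration of the static portion that is commonly combined with an induction-motor model in a composite load representation, the sketch below evaluates a polynomial (ZIP) load at several voltages; the coefficients are illustrative and the actual Khuzestan load models are not reproduced here.

```python
# Sketch of the polynomial (ZIP) part of a composite load model: constant
# impedance, constant current and constant power fractions. The induction-motor
# dynamic part used in the study is not reproduced; coefficients are illustrative.
def zip_load(v_pu, p0, z_frac, i_frac, p_frac):
    """Active power drawn at per-unit voltage v_pu (fractions should sum to 1)."""
    return p0 * (z_frac * v_pu**2 + i_frac * v_pu + p_frac)

for v in (1.00, 0.95, 0.90, 0.80):
    p = zip_load(v, p0=100.0, z_frac=0.4, i_frac=0.3, p_frac=0.3)
    print(f"V = {v:.2f} pu -> P = {p:.1f} MW")
```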

Keywords: Dynamic load, modeling, Voltage Stability.

2460 Stochastic Learning Algorithms for Modeling Human Category Learning

Authors: Toshihiko Matsuka, James E. Corter

Abstract:

Most neural network (NN) models of human category learning use a gradient-based learning method, which assumes that locally-optimal changes are made to model parameters on each learning trial. This method tends to underpredict variability in individual-level cognitive processes. In addition, many recent models of human category learning have been criticized for not being able to replicate the rapid changes in categorization accuracy and attention processes observed in empirical studies. In this paper we introduce stochastic learning algorithms for NN models of human category learning and show that their use can result in (a) rapid changes in accuracy and attention allocation, and (b) different learning trajectories and more realistic variability at the individual level.
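
The general idea, sketched below on a toy linear "category" model rather than the authors' specific algorithm, is to perturb gradient-based weight updates with noise so that each simulated learner follows its own trajectory.

```python
# Sketch of the general idea only (not the authors' specific algorithm): add
# noise to gradient-based updates so each simulated learner follows its own
# trajectory, here for a toy linear model with squared-error loss.
import numpy as np

def train_learner(X, y, lr=0.05, noise=0.1, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            grad = (w @ xi - yi) * xi                              # gradient of 0.5*(w.x - y)^2
            w -= lr * (grad + noise * rng.normal(size=w.shape))    # stochastic step
    return w

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)   # binary category labels
learners = [train_learner(X, y, seed=s) for s in range(5)]
print(np.round(learners, 2))                              # individual-level variability
```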

Keywords: category learning, cognitive modeling, radial basis function, stochastic optimization.

2459 Personal Information Classification Based on Deep Learning in Automatic Form Filling System

Authors: Shunzuo Wu, Xudong Luo, Yuanxiu Liao

Abstract:

Recently, the rapid development of deep learning has allowed artificial intelligence (AI) to penetrate many fields and replace manual work there. In particular, AI systems have become a research focus in the field of office automation. To meet real needs in office automation, in this paper we develop an automatic form filling system. Specifically, it uses two classical neural network models and several word embedding models to classify various relevant information elicited from the Internet. When training the neural network models, we use less noisy and balanced data. We conduct a series of experiments to test our system, and the results show that it can achieve better classification results.

Keywords: Personal information, deep learning, auto fill, NLP, document analysis.

2458 Elastic and Plastic Collision Comparison Using Finite Element Method

Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier

Abstract:

The prediction of post-impact conditions and of the behavior of bodies during impact has been the object of several collision models. The generally used formulation dates from Hertz's theory of the 19th century. These models consider the repulsive force to be proportional to the deformation of the bodies in contact and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of the bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the time of contact and the deformation of the bodies. An advantage of the FEM approach is the possibility of applying a plastic material definition to the model: the Johnson-Cook plasticity model is used, whose parameters are obtained through empirical tests of real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-theory-based model.
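
For reference, the Johnson-Cook flow stress has the form σ = (A + Bεⁿ)(1 + C·ln ε̇*)(1 − T*ᵐ); the worked example below evaluates it with parameters chosen to be roughly in the range published for a mild steel, since the abstract does not give the calibrated values actually used.

```python
# Worked example of the Johnson-Cook flow stress
#   sigma = (A + B*eps**n) * (1 + C*ln(eps_dot_star)) * (1 - T_star**m)
# Parameter values are illustrative (roughly mild-steel range), not the paper's.
import math

def johnson_cook(eps, eps_dot, T, A=350e6, B=275e6, n=0.36, C=0.022, m=1.0,
                 eps_dot_ref=1.0, T_room=293.0, T_melt=1793.0):
    t_star = (T - T_room) / (T_melt - T_room)     # homologous temperature
    return (A + B * eps**n) * (1 + C * math.log(eps_dot / eps_dot_ref)) * (1 - t_star**m)

# Flow stress at 10% plastic strain, strain rate 100 1/s, 400 K:
print(f"{johnson_cook(0.10, 100.0, 400.0) / 1e6:.0f} MPa")
```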

Keywords: Collision, finite element method, Hertz’s Theory, impact models.

2457 General Regression Neural Network and Back Propagation Neural Network Modeling for Predicting Radial Overcut in EDM: A Comparative Study

Authors: Raja Das, M. K. Pradhan

Abstract:

This paper presents a comparative study between two neural network models, the General Regression Neural Network (GRNN) and the Back Propagation Neural Network (BPNN), used to estimate the radial overcut produced during Electrical Discharge Machining (EDM). Four input parameters have been employed: discharge current (Ip), pulse on time (Ton), duty fraction (Tau) and discharge voltage (V). Artificial intelligence techniques have recently emerged as effective tools to replace time-consuming procedures in various scientific and engineering applications, particularly in the prediction and estimation of complex and nonlinear processes. Both networks are trained, and the prediction results are tested on the unseen validation set of the experiment and analysed. The performance of both networks is found to be in good agreement, with an average percentage error of less than 11%, and the correlation coefficient obtained on the validation data set for both GRNN and BPNN is more than 91%. However, the GRNN is much faster to train than the BPNN and is often more accurate, featuring fast learning that does not require an iterative procedure and a highly parallel structure, although it requires more memory space to store the model and is slower than multilayer perceptron networks at classifying new cases.
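
The non-iterative character of the GRNN comes from its prediction rule, essentially a kernel-weighted average of the training targets. The sketch below shows that prediction step with made-up EDM-style inputs; it is not the authors' trained model.

```python
# Sketch of the GRNN prediction step: a kernel-weighted average of the training
# targets (no iterative training, only the spread sigma to choose). The sample
# EDM data below are made up for illustration.
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    d2 = np.sum((X_train - x) ** 2, axis=1)          # squared distances to x
    w = np.exp(-d2 / (2.0 * sigma**2))               # Gaussian kernel weights
    return np.sum(w * y_train) / np.sum(w)

# Columns: discharge current Ip, pulse-on time Ton, duty fraction Tau, voltage V
X_train = np.array([[4, 50, 0.4, 40], [8, 100, 0.5, 45], [12, 150, 0.6, 50]], float)
y_train = np.array([0.08, 0.14, 0.21])               # radial overcut, mm (illustrative)
X_scaled = (X_train - X_train.mean(0)) / X_train.std(0)

query = (np.array([10, 120, 0.55, 48.0]) - X_train.mean(0)) / X_train.std(0)
print(f"Predicted overcut: {grnn_predict(X_scaled, y_train, query):.3f} mm")
```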

Keywords: Electrical-discharge machining, General Regression Neural Network, Back-propagation Neural Network, Radial Overcut.

2456 Analytical and Experimental Study on the Effect of Air-Core Coil Parameters on Magnetic Force Used in a Linear Optical Scanner

Authors: Loke Kean Koay, Horizon Gitano-Briggs, Mani Maran Ratnam

Abstract:

Today air-core coils (ACC) are a viable alternative to ferrite-core coils in a range of applications due to their low induction effect. An analytical study was carried out and the results were used as a guide to understand the relationship between the magnet-coil distance and the resulting attractive magnetic force. Four different ACC models were fabricated for experimental study. The variation in the models included the dimensions, the number of coil turns and the current supply to the coil. Comparison between the analytical and experimental results for all the models shows an average discrepancy of less than 10%. An optimized ACC design was selected for the scanner which can provide maximum magnetic force.
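
The sketch below gives a flavor of the kind of analytical estimate involved: the on-axis flux density of an N-turn circular air-core coil, B(z) = μ0·N·I·R²/(2(R² + z²)^(3/2)), with the force on a small magnet approximated as a dipole, F ≈ m·dB/dz. The coil geometry, current and magnet moment are illustrative assumptions, not the paper's models.

```python
# Sketch of an analytical estimate (not the paper's exact model): on-axis field
# of an N-turn circular air-core coil and the dipole-approximation force on a
# small magnet, F = m * dB/dz. Values are illustrative only.
import numpy as np

MU0 = 4e-7 * np.pi

def b_axis(z, n_turns=200, current=0.5, radius=0.01):
    """On-axis flux density B(z) = mu0*N*I*R^2 / (2*(R^2 + z^2)**1.5)."""
    return MU0 * n_turns * current * radius**2 / (2.0 * (radius**2 + z**2) ** 1.5)

z = np.linspace(0.002, 0.02, 10)                 # magnet-coil distances, m
dBdz = np.gradient(b_axis(z), z)                 # numerical field gradient
m_dip = 0.1                                      # magnet dipole moment, A*m^2 (assumed)
force = m_dip * dBdz
for zi, fi in zip(z, force):
    print(f"z = {zi*1000:4.1f} mm  F = {fi*1000:7.3f} mN")
```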

Keywords: Air-Core Coils, Electromagnetic, Linear Optical Scanner

2455 Performance Analysis of Software Reliability Models using Matrix Method

Authors: RajPal Garg, Kapil Sharma, Rajive Kumar, R. K. Garg

Abstract:

This paper presents a computational methodology based on matrix operations for a computer-based solution to the problem of performance analysis of software reliability models (SRMs). A set of seven comparison criteria has been formulated to rank the various non-homogeneous Poisson process software reliability models proposed during the past 30 years for estimating software reliability measures such as the number of remaining faults, the software failure rate, and software reliability. Selection of the optimal SRM for a particular case has been an area of interest for researchers in the field of software reliability, and the model selection tools and techniques found in the literature cannot be used with a high level of confidence as they use a limited number of selection criteria. A real data set of a medium-size software project, taken from published papers, is used to demonstrate the matrix method. The result of this study is a ranking of SRMs based on the permanent of the criteria matrix formed for each model from the comparison criteria; the software reliability model with the highest value of the permanent is ranked first, and so on.
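
To make the ranking step concrete, the sketch below computes the permanent of each model's criteria matrix by direct expansion over permutations, which is adequate for the 7×7 size implied by seven criteria, and sorts models by that value; the matrices themselves are randomly generated placeholders.

```python
# Sketch of the ranking step: compute the permanent of each model's criteria
# matrix by direct expansion over permutations (fine for a 7x7 matrix) and rank
# the models by that value. The matrices here are illustrative placeholders.
from itertools import permutations
import numpy as np

def permanent(a):
    n = a.shape[0]
    return sum(np.prod([a[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

rng = np.random.default_rng(7)
models = {f"SRM-{i}": rng.random((7, 7)) for i in range(1, 4)}   # hypothetical criteria matrices
ranking = sorted(models, key=lambda name: permanent(models[name]), reverse=True)
print(ranking)   # model with the highest permanent is ranked first
```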

Keywords: Matrix method, Model ranking, Model selection, Model selection criteria, Software reliability models.

2454 Microscopic Emission and Fuel Consumption Modeling for Light-duty Vehicles Using Portable Emission Measurement System Data

Authors: Wei Lei, Hui Chen, Lin Lu

Abstract:

Microscopic emission and fuel consumption models have been widely recognized as an effective way to quantify real traffic emissions and energy consumption when applied together with microscopic traffic simulation models. This paper presents a framework for developing microscopic emission (HC, CO, NOx, and CO2) and fuel consumption (MEF) models for light-duty vehicles. A composite acceleration variable is introduced into the MEF model to capture the effect of historical accelerations, interacting with the current speed, on emissions and fuel consumption. The MEF model is calibrated by the multivariate least-squares method for two types of light-duty vehicle using on-board data collected in Beijing, China, with a Portable Emission Measurement System (PEMS). The instantaneous validation results show that the MEF model performs better, with a lower Mean Absolute Percentage Error (MAPE), than two other models. Moreover, the aggregate validation results show that the MEF model produces reasonable estimates compared to actual measurements, with prediction errors within 12%, 10%, 19%, and 9% for HC, CO, NOx emissions and fuel consumption, respectively.
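
As a hedged sketch of the calibration idea only, the snippet below fits an instantaneous fuel-rate model in speed, acceleration and an exponentially weighted "composite" acceleration term by least squares and reports MAPE; the functional form, weights and synthetic data are assumptions, not the paper's specification or PEMS measurements.

```python
# Sketch of the calibration idea (not the paper's exact specification): fit an
# instantaneous fuel-rate model in speed, acceleration and a history-weighted
# composite acceleration term by least squares, then report MAPE.
import numpy as np

rng = np.random.default_rng(3)
speed = rng.uniform(0, 20, 500)                       # m/s
accel = rng.normal(0, 1, 500)                         # m/s^2
# Composite acceleration: exponentially weighted history of past accelerations.
comp = np.zeros_like(accel)
for t in range(1, len(accel)):
    comp[t] = 0.7 * comp[t - 1] + 0.3 * accel[t]

fuel = 0.4 + 0.05 * speed + 0.3 * np.clip(accel, 0, None) + 0.2 * comp \
       + rng.normal(0, 0.05, 500)                     # synthetic "measured" fuel rate

X = np.column_stack([np.ones_like(speed), speed, np.clip(accel, 0, None), comp])
coef, *_ = np.linalg.lstsq(X, fuel, rcond=None)
pred = X @ coef
mape = 100 * np.mean(np.abs((fuel - pred) / fuel))
print(np.round(coef, 3), f"MAPE = {mape:.1f}%")
```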

Keywords: Emission, Fuel consumption, Light-duty vehicle, Microscopic, Modeling.

2453 The Reconstruction of New Agegraphic and Gauss-Bonnet Dark Energy Models with a Special Power Law Expansion

Authors: V. Fayaz, F. Felegary

Abstract:

In this work, we study the correspondence between the energy density of the new agegraphic model and the energy density of the Gauss-Bonnet model in a flat universe. We reconstruct Λ and ω_Λ for them with the power-law expansion a(t) = a_0 t^(h_0).

Keywords: Dark energy, new agegraphic, Gauss-Bonnet, late-time universe.

2452 Prediction of Product Size Distribution of a Vertical Stirred Mill Based on Breakage Kinetics

Authors: C. R. Danielle, S. Erik, T. Patrick, M. Hugh

Abstract:

In the last decade there has been an increase in demand for fine grinding due to the depletion of coarse-grained orebodies and an increase in the processing of finely disseminated minerals and complex orebodies. These ores have provided new challenges in concentrator design because fine and ultra-fine grinding is required to achieve acceptable recovery rates. The correct design of a grinding circuit is therefore important for minimizing unit costs and increasing product quality. The use of ball mills for grinding in fine size ranges is inefficient, and vertical stirred grinding mills are consequently becoming increasingly popular in the mineral processing industry due to their well-known high energy efficiency. This work presents a methodology to predict the product size distribution of a vertical stirred mill using a Bond ball mill. The Population Balance Model (PBM) was used to empirically analyze the performance of a vertical mill and a Bond ball mill, and the breakage parameters obtained for both grinding mills are compared to determine the possibility of predicting the product size distribution of a vertical mill from the Bond ball mill results. The biggest advantage of this methodology is that most mineral processing laboratories already have a Bond ball mill to perform the tests suggested in this study. Preliminary results show the possibility of predicting the performance of a laboratory vertical stirred mill using a Bond ball mill.
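
The breakage-kinetics description behind the PBM is the batch-grinding balance dw_i/dt = −S_i·w_i + Σ_{j<i} b_ij·S_j·w_j, where S is the selection function and b the breakage distribution. The sketch below integrates that system for a few size classes; the S and b values are illustrative, not those measured for either mill in the study.

```python
# Sketch of the batch-grinding population balance model behind the
# breakage-kinetics approach: dw_i/dt = -S_i*w_i + sum_{j<i} b_ij*S_j*w_j.
# Selection (S) and breakage (b) values below are illustrative, not measured.
import numpy as np
from scipy.integrate import solve_ivp

S = np.array([0.60, 0.35, 0.15, 0.0])          # selection function, 1/min, per size class
b = np.array([[0.0, 0.0, 0.0, 0.0],            # b[i, j]: fraction of broken class j
              [0.5, 0.0, 0.0, 0.0],            # material reporting to class i (i > j)
              [0.3, 0.6, 0.0, 0.0],
              [0.2, 0.4, 1.0, 0.0]])

def pbm(t, w):
    return -S * w + b @ (S * w)

w0 = np.array([1.0, 0.0, 0.0, 0.0])            # all mass initially in the coarsest class
sol = solve_ivp(pbm, (0.0, 10.0), w0, t_eval=[0, 2, 5, 10])
print(np.round(sol.y.T, 3))                     # size distribution vs grinding time
```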

Keywords: Bond ball mill, population balance model, product size distribution, vertical stirred mill.

2451 Modelling of Soil Structure Interaction of Integral Abutment Bridges

Authors: Thevaneyan K. David, John P. Forth

Abstract:

Integral Abutment Bridges (IAB) are simple or multiple span bridges in which the bridge deck is cast monolithically with the abutment walls. This kind of bridge is becoming very popular due to aspects such as good response under seismic loading, low initial costs, elimination of bearings, and reduced maintenance. However, the main issue in the analysis of this type of structure is dealing with the soil-structure interaction of the abutment walls and the supporting piles. Researchers have used various soil constitutive models in studies of soil-structure interaction for this kind of structure. This paper reviews the implementation of various finite element models that explicitly incorporate the nonlinear soil and linear structural response, considering various soil constitutive models and finite element meshes.

Keywords: Constitutive Models, FEM, Integral Abutment Bridges, Soil-structure Interactions.

2450 Validation and Projections for Solar Radiation up to 2100: HadGEM2-AO Global Circulation Model

Authors: Elison Eduardo Jardim Bierhals, Claudineia Brazil, Deivid Pires, Rafael Haag, Elton Gimenez Rossini

Abstract:

The objective of this work is to evaluate solar radiation projections between 2006 and 2013 for the state of Rio Grande do Sul, Brazil. The projections are provided by General Circulation Models (GCMs) belonging to the Coupled Model Intercomparison Project Phase 5 (CMIP5). In all, the simulation results of six models are evaluated against monthly data measured by a network of thirteen meteorological stations of the National Meteorological Institute (INMET). The performance of the models is evaluated by the Nash coefficient and the bias, and the results are presented in the form of tables, graphs and spatialization maps. The ACCESS1-0 RCP 4.5 model presented the best results for the solar radiation simulations under the most optimistic scenario in much of the state, with efficiency coefficients (CEF) between 0.95 and 0.98. In the most pessimistic scenario, HadGEM2-AO RCP 8.5 had the best accuracy among the analyzed models, presenting efficiency coefficients between 0.94 and 0.98. From this validation, solar radiation projection maps were produced, indicating a seasonal increase of this climatic variable in some regions of the Brazilian territory, mainly in the spring.
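
The two verification metrics, in their commonly used forms (the abstract does not give the exact formulations used), are sketched below for a hypothetical year of monthly station and model values: the Nash-Sutcliffe efficiency NSE = 1 − Σ(obs − sim)²/Σ(obs − ō)² and the mean bias.

```python
# Sketch of the verification metrics as commonly defined: Nash-Sutcliffe
# efficiency and bias between simulated and station-measured monthly radiation.
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def bias(obs, sim):
    return np.mean(np.asarray(sim, float) - np.asarray(obs, float))

obs = [22.1, 19.4, 16.0, 12.3, 9.8, 8.9, 9.6, 12.0, 15.2, 18.7, 21.3, 22.8]  # MJ/m^2/day (illustrative)
sim = [21.5, 19.9, 15.4, 12.9, 10.2, 8.5, 9.9, 11.5, 15.8, 18.1, 21.9, 23.4]
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}, bias = {bias(obs, sim):.2f}")
```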

Keywords: climate change, projections, solar radiation, validation.

2449 Importance of Hardware Systems and Circuits in Secure Software Development Life Cycle

Authors: Mir Shahriar Emami

Abstract:

Although it is impossible to ensure that a software system is completely secure, developing an acceptably secure software system on a suitable platform is not unattainable. In this paper, we analyze software development life cycle (SDLC) models from the point of view of hardware systems and circuits. To date, SDLC models have paid attention to software security merely from the software perspective. In this paper, we present new features for the SDLC stages to emphasize the role of systems and circuits in developing a secure software system throughout the software development stages, a point that has not previously been considered in SDLC models.

Keywords: Systems and circuits security, software security, software process engineering, SDLC, SSDLC.

2448 A Review on Stormwater Harvesting and Reuse

Authors: Fatema Akram, Mohammad G. Rasul, M. Masud K. Khan, M. Sharif I. I. Amir

Abstract:

Australia is a country of some 7.7 million square kilometers with a population of about 22.6 million. At present, water security is a major challenge for Australia: in some areas the use of water resources is approaching, and in some parts exceeding, the limits of sustainability. A focal point of proposed national water conservation programs is the recycling of both urban stormwater and treated wastewater, but this is not yet widely practiced in Australia, and stormwater in particular is neglected: only 4% of stormwater and rainwater is recycled, whereas less than 1% of reclaimed wastewater is reused within urban areas. Accurate monitoring, assessment and prediction of the availability, quality and use of this precious resource are therefore required for better management. As stormwater is usually of better quality than untreated sewage or industrial discharge, it has better public acceptance for recycling and reuse, particularly for non-potable uses such as irrigation and watering lawns and gardens. Existing stormwater recycling practice lags far behind research, and no robust technologies have been developed for this purpose, so there is a clear need for modern technologies to assess the feasibility of stormwater harvesting and reuse. Numerical modeling has recently become a popular tool for this task; it covers the complex hydrological and hydraulic processes of the study area. The hydrologic model computes the stormwater quantity needed to design the system components, and the hydraulic model helps to route the flow through the stormwater infrastructure; nowadays a water quality module is also incorporated into these models. Integration of a Geographic Information System (GIS) with these models provides the extra advantage of managing spatial information, while for the overall management of a stormwater harvesting project a Decision Support System (DSS), which combines the database, the model and the GIS, plays an important role in the proper management of temporal information; a DSS additionally includes evaluation tools and a graphical user interface. This research aims to critically review and discuss all aspects of stormwater harvesting and reuse, such as the available guidelines for stormwater harvesting and reuse and the public acceptance of water reuse, together with the scope of and recommendations for future studies. In addition, this paper identifies, explains and addresses the importance of modern technologies capable of properly managing stormwater harvesting and reuse.

Keywords: Stormwater Management, Stormwater Harvesting and Reuse, Numerical Modeling, Geographic Information System (GIS), Decision Support System (DSS), Database.

2447 Energy Performance of Buildings Due to Downscaled Seasonal Models

Authors: Anastasia K. Eleftheriadou, Athanasios Sfetsos, Nikolaos Gounaris

Abstract:

The current paper presents an extensive bottom-up framework for assessing building sector-specific vulnerability to climate change in terms of energy supply and demand. The research focuses on the application of downscaled seasonal models for estimating the energy performance of buildings in Greece. The ARW-WRF model has been set up and suitably parameterized to produce downscaled climatological fields for Greece, forced by the output of the CFSv2 model. The outer domain, D01/Europe, included 345 x 345 cells with a horizontal resolution of 20 x 20 km², and the inner domain, D02/Greece, comprised 180 x 180 cells with a horizontal resolution of 5 x 5 km². The model run has been set up for a forecast horizon of 6 months, storing outputs on a six-hourly basis.

Keywords: Urban environment, vulnerability, climate change, energy performance, seasonal forecast models.

2446 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models

Authors: Viriyavudh Sim, WooYoung Jung

Abstract:

Damage due to high wind is not limited to load-resisting components such as beams and columns; the majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment was determined in order to compare two window configuration models. The Monte Carlo simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had the higher probability of failure, due to the number of panels in this configuration.
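
The generic Monte Carlo fragility procedure is sketched below: at each wind speed, sample the wind-pressure demand and the glass resistance, estimate the failure probability, and fit a lognormal fragility curve. All distributions and parameter values here are illustrative assumptions, not those of the paper.

```python
# Sketch of a generic Monte Carlo fragility procedure (distributions and
# parameters are illustrative, not the paper's): estimate the failure
# probability of a window panel at each wind speed and fit a lognormal CDF.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
speeds = np.arange(20, 71, 5)                      # 3-s gust wind speeds, m/s
rho, cp = 1.225, 1.2                               # air density, pressure coefficient
p_fail = []
for v in speeds:
    demand = 0.5 * rho * cp * (v * rng.lognormal(0.0, 0.05, 20000)) ** 2   # wind pressure, Pa
    capacity = rng.lognormal(np.log(2500.0), 0.2, 20000)                   # glass resistance, Pa
    p_fail.append(np.mean(demand > capacity))

# Fit lognormal fragility P_f(v) = Phi((ln v - ln theta)/beta) by least squares on probits.
mask = (np.array(p_fail) > 0) & (np.array(p_fail) < 1)
probits = norm.ppf(np.array(p_fail)[mask])
A = np.column_stack([np.log(speeds[mask]), np.ones(mask.sum())])
slope, intercept = np.linalg.lstsq(A, probits, rcond=None)[0]
beta, theta = 1.0 / slope, np.exp(-intercept / slope)
print(f"median capacity ~ {theta:.1f} m/s, dispersion beta ~ {beta:.2f}")
```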

Keywords: Wind fragility, glass window, high rise apartment, Monte Carlo Simulation method.

2445 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then tries to find the most important features among the four groups, i.e., those that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER selected by the feature selection module; on top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period of 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score rises to 92.4% on the held-out unseen test set.
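
As a hedged sketch of the stacked-ensemble idea using scikit-learn (our choice of library), the snippet below combines a decision tree and naive Bayes base learners under a naive Bayes meta-learner on synthetic data; a Bayesian-network base learner and the SEER feature-grouping step are not included.

```python
# Sketch of the stacked-ensemble idea with scikit-learn (synthetic data, no
# Bayesian-network learner and no SEER feature grouping).
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, weights=[0.7], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

ensemble = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=6)),
                ("nb", GaussianNB())],
    final_estimator=GaussianNB(),            # meta-classifier combining base outputs
)
ensemble.fit(X_tr, y_tr)
print(f"weighted F-score: {f1_score(y_te, ensemble.predict(X_te), average='weighted'):.3f}")
```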

Keywords: Classifier ensemble, breast cancer survivability, data mining, SEER.

2444 The Data Mining usage in Production System Management

Authors: Pavel Vazan, Pavol Tanuska, Michal Kebisek

Abstract:

The paper gives the pilot results of a project oriented toward the use of data mining techniques and the knowledge discovered from production systems through them, which have been used in the management of these systems. Simulation models of manufacturing systems have been developed to obtain the necessary data about production, and the authors have developed a way of storing the data obtained from the simulation models in a data warehouse. A data mining model has been created using specific methods and selected techniques for defined problems of production system management. The new knowledge has been applied to the production management system and tested on simulation models of the production system. An important benefit of the project is the proposal of a new methodology, focused on data mining from the databases that store operational data about the production process.

Keywords: data mining, data warehousing, management of production system, simulation

2443 Optimal Calculation of Partial Transmission Ratios of Four-Step Helical Gearboxes for Getting Minimal Gearbox Length

Authors: Vu Ngoc Pi

Abstract:

This paper presents a new study on the application of optimization and regression analysis techniques to the optimal calculation of the partial ratios of four-step helical gearboxes in order to obtain minimal gearbox length. Based on the moment equilibrium condition of a mechanical system including four gear units and their regular resistance condition, models for determining the partial ratios of the gearboxes are proposed. In particular, explicit models for calculating the partial ratios are derived using regression analysis. Using these models, the determination of the partial ratios is accurate and simple.

Keywords: Gearbox design, optimal design, helical gearbox, transmission ratio.

2442 Continuous Threshold Prey Harvesting in Predator-Prey Models

Authors: Jonathan Bohn, Jorge Rebaza, Kaitlin Speer

Abstract:

The dynamics of a predator-prey model with continuous threshold policy harvesting functions on the prey is studied. Theoretical and numerical methods are used to investigate boundedness of solutions, existence of bionomic equilibria, and the stability properties of coexistence equilibrium points and periodic orbits. Several bifurcations as well as some heteroclinic orbits are computed.
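
The sketch below integrates a predator-prey system with a continuous threshold harvesting term on the prey; the specific functional forms chosen here (logistic prey growth, a Holling type II response, and a saturating harvest switched on above the threshold T) are a common choice in this literature and are assumptions for illustration, not necessarily the exact model analyzed in the paper.

```python
# Sketch of a predator-prey system with continuous threshold prey harvesting.
# Functional forms and parameter values are assumed for illustration, not
# copied from the paper.
import numpy as np
from scipy.integrate import solve_ivp

def harvest(x, T=0.6, h=0.4):
    """Continuous threshold policy: zero below T, saturating toward h above it."""
    return 0.0 if x <= T else h * (x - T) / (h + (x - T))

def model(t, state, r=1.0, K=2.0, a=1.2, b=1.0, c=0.8, d=0.5):
    x, y = state
    dx = r * x * (1 - x / K) - a * x * y / (1 + b * x) - harvest(x)
    dy = c * a * x * y / (1 + b * x) - d * y
    return [dx, dy]

sol = solve_ivp(model, (0, 200), [1.0, 0.5], t_eval=np.linspace(0, 200, 5))
print(np.round(sol.y.T, 3))     # prey and predator densities over time
```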

Keywords: Predator-prey models, threshold harvesting, dynamical systems.

2441 A Systemic Maturity Model

Authors: Emir H. Pernet, Jeimy J. Cano

Abstract:

Maturity models, used descriptively to explain changes in reality or normatively to guide managers in making interventions that render organizations more effective and efficient, are based on the principles of statistical quality control and PDCA continuous improvement (Plan, Do, Check, Act). Frameworks developed around the concept of maturity models include COBIT, CMM, and ITIL. This paper presents some limitations of traditional maturity models, most of them related to the mechanistic and reductionist principles on which those models are built. As systems theory helps the understanding of the dynamics of organizations and organizational change, the development of a systemic maturity model can help to overcome some of those limitations. This document proposes a systemic maturity model, based on a systemic conceptualization of organizations, focused on the study of the functioning of the parts, the relationships among them, and their behavior as a whole. From the systems theory perspective, maturity is conceptually defined as an emergent property of the organization, which arises as a result of the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of organizations and is finally validated by measuring the maturity of some organizations. For its operationalization and validation, the model was applied to measure the maturity of organizational Governance, Risk and Compliance (GRC) processes.

Keywords: GRC, Maturity Model, Systems Theory, Viable System Model.
