Search results for: Information Model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10332

5682 Software Engineering Inspired Cost Estimation for Process Modelling

Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller

Abstract:

Up to this point, business process management projects in general, and business process modelling projects in particular, have not been able to rely on a practical and scientifically validated method for estimating cost and effort. The model development phase in particular is not covered by any cost estimation method or model, while later phases of business process modelling, starting with implementation, are covered by initial solutions discussed in the literature. This article proposes to fill this gap by deriving a cost estimation method from established methods in a similar domain, namely software development or software engineering. As we show, software development is closely similar to process modelling. The method is derived from COCOMO II and Function Point analysis, two established effort estimation methods in the software development domain. To this end, we lay out the similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle. After presenting the method, we propose several directions for its further analysis and validation.
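
A minimal sketch of the COCOMO II post-architecture effort equation from which the proposed method is derived may help make the adaptation concrete. The coefficients A = 2.94 and B = 0.91 are the commonly cited COCOMO II.2000 calibration; the scale-factor and effort-multiplier values below are illustrative placeholders, not the authors' process-modelling calibration, and the Function Point side of the derivation is not shown.

```python
# Sketch of the COCOMO II post-architecture effort equation:
#   PM = A * Size^E * prod(EM),  with  E = B + 0.01 * sum(SF)
# A and B follow the published COCOMO II.2000 calibration; the scale factors
# and effort multipliers below are illustrative placeholders.

def cocomo_ii_effort(size_ksloc, scale_factors, effort_multipliers, a=2.94, b=0.91):
    """Return estimated effort in person-months."""
    exponent = b + 0.01 * sum(scale_factors)
    em_product = 1.0
    for em in effort_multipliers:
        em_product *= em
    return a * size_ksloc ** exponent * em_product

# Hypothetical example: a 12 kSLOC-equivalent model with near-nominal drivers.
scale_factors = [3.72, 3.04, 4.24, 3.29, 4.68]   # five COCOMO II scale factors
effort_multipliers = [1.00, 1.10, 0.90]          # illustrative subset of cost drivers
print(round(cocomo_ii_effort(12, scale_factors, effort_multipliers), 1), "person-months")
```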

Keywords: Cost Estimation, Effort Estimation, Process Modelling, Business Process Management, COCOMO.

5681 Mining Association Rules from Unstructured Documents

Authors: Hany Mahgoub

Abstract:

This paper presents EART (Extract Association Rules from Text), a system for discovering association rules from collections of unstructured documents. The EART system treats text only, not images or figures. EART discovers association rules among the keywords labeling the collection of textual documents. The main characteristic of EART is that it integrates XML technology (to transform unstructured documents into structured documents) with an Information Retrieval scheme (TF-IDF) and a Data Mining technique for association rule extraction. EART relies on word features to extract association rules. It consists of four phases: a structure phase, an index phase, a text mining phase and a visualization phase. Our work is based on the analysis of the keywords in the extracted association rules, distinguishing between keywords that co-occur in one sentence of the original text and keywords that appear in the text without such co-occurrence. Experiments were applied to a collection of scientific documents selected from MEDLINE that are related to the outbreak of the H5N1 avian influenza virus.
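
As a rough illustration of the kind of pipeline EART describes, the sketch below weights candidate keywords with TF-IDF and then mines pairwise association rules from keyword co-occurrence. The helper names, thresholds and toy documents are hypothetical and are not taken from the EART implementation.

```python
# Illustrative keyword weighting (TF-IDF) plus pairwise association-rule mining
# over documents represented as keyword lists. Thresholds are hypothetical.
import math
from itertools import combinations

def tfidf(term, doc, docs):
    tf = doc.count(term) / len(doc)
    df = sum(1 for d in docs if term in d)
    return tf * math.log(len(docs) / df)

def mine_rules(docs, min_support=0.3, min_confidence=0.6):
    """Return rules (x -> y, support, confidence) over keyword lists."""
    n = len(docs)
    vocab = {t for d in docs for t in d}
    support = {t: sum(1 for d in docs if t in d) / n for t in vocab}
    rules = []
    for a, b in combinations(sorted(vocab), 2):
        pair_support = sum(1 for d in docs if a in d and b in d) / n
        if pair_support < min_support:
            continue
        for x, y in ((a, b), (b, a)):
            confidence = pair_support / support[x]
            if confidence >= min_confidence:
                rules.append((x, y, pair_support, confidence))
    return rules

docs = [["h5n1", "influenza", "outbreak"], ["h5n1", "influenza"], ["vaccine", "influenza"]]
print(round(tfidf("h5n1", docs[0], docs), 3))   # TF-IDF weight of one candidate keyword
print(mine_rules(docs))
```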

Keywords: Association rules, information retrieval, knowledge discovery in text, text mining.

5680 Depth-Averaged Modelling of Erosion and Sediment Transport in Free-Surface Flows

Authors: Thomas Rowan, Mohammed Seaid

Abstract:

A fast finite volume solver for multi-layered shallow water flows with mass exchange and an erodible bed is developed. This enables the user to solve a number of complex sediment-based problems including (but not limited to) dam-break over an erodible bed, recirculation currents and bed evolution, as well as levee and dyke failure. This research develops methodologies crucial to the understanding of multi-sediment fluvial mechanics and waterway design. In this model mass exchange between the layers is allowed and, in contrast to previous models, sediment and fluid are able to transfer between layers. In the current study we use a two-step finite volume method to avoid the solution of the Riemann problem. Entrainment and deposition rates are calculated for the first time in a model of this nature. In the first step the governing equations are rewritten in a non-conservative form and the intermediate solutions are calculated using the method of characteristics. In the second stage, the numerical fluxes are reconstructed in conservative form and are used to calculate a solution that satisfies the conservation property. This method is found to be considerably faster than comparable finite volume methods, and it also exhibits good shock capturing. Most entrainment and deposition equations use a bed-level concentration factor, which leads to inaccuracies in both near-bed concentration and total scour. To account for diffusion, as no vertical velocities are calculated, a capacity-limited diffusion coefficient is used. An additional advantage of this multilayer approach is that the bottom-layer fluid velocity differs from that of single-layer models; this dramatically reduces erosion, which is often overestimated in simulations of this nature using single-layer flows. The model is used to simulate a standard dam break. In the dam-break simulation, as expected, the number of fluid layers used creates variation in the resultant bed profile, with more layers offering a higher deviation in fluid velocity. These results showed a marked variation in erosion profiles from standard models. Overall, the model provides new insight into the problems presented, at minimal computational cost.

Keywords: Erosion, finite volume method, sediment transport, shallow water equations.

5679 Preparation and Investigation of Photocatalytic Properties of ZnO Nanocrystals: Effect of Operational Parameters and Kinetic Study

Authors: N. Daneshvar, S. Aber, M. S. Seyed Dorraji, A. R. Khataee, M. H. Rasoulifard

Abstract:

ZnO nanocrystals with a mean diameter of 14 nm were prepared by a precipitation method and examined as a photocatalyst for the UV-induced degradation of the insecticide diazinon, taken as a representative organic pollutant in aqueous solution. The effects of various parameters, such as illumination time, amount of photocatalyst, initial pH and initial insecticide concentration, on the photocatalytic degradation of diazinon were investigated to find the desired conditions. The desired parameters were then also tested for the treatment of real water containing the insecticide. The photodegradation efficiency of diazinon was compared between commercial and prepared ZnO nanocrystals. The results indicated that the UV/ZnO process using the prepared nanocrystalline ZnO offered better electrical energy efficiency and quantum yield than commercial ZnO. On the basis of the Langmuir-Hinshelwood mechanism, the study yielded a pseudo first-order kinetic model with a surface reaction rate constant of 0.209 mg l-1 min-1 and an adsorption equilibrium constant of 0.124 l mg-1.
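
The reported constants fit the Langmuir-Hinshelwood rate law r = k_r K C / (1 + K C), which reduces to pseudo first-order behaviour at low concentration. The short sketch below simply evaluates that rate law with the constants quoted in the abstract; the explicit Euler time-stepping is only an illustrative way to trace the concentration decay and is not part of the paper.

```python
# Langmuir-Hinshelwood rate law with the constants quoted in the abstract
# (k_r = 0.209 mg L^-1 min^-1, K = 0.124 L mg^-1). The Euler integration is
# purely illustrative.
K_R = 0.209   # surface reaction rate constant, mg L^-1 min^-1
K_AD = 0.124  # adsorption equilibrium constant, L mg^-1

def lh_rate(c):
    """Degradation rate -dC/dt (mg L^-1 min^-1) at concentration c (mg/L)."""
    return K_R * K_AD * c / (1.0 + K_AD * c)

def simulate(c0, minutes, dt=0.5):
    c, t = c0, 0.0
    while t < minutes:
        c = max(c - lh_rate(c) * dt, 0.0)
        t += dt
    return c

# Hypothetical example: 20 mg/L diazinon after 60 min of UV/ZnO treatment.
print(round(simulate(20.0, 60), 2), "mg/L remaining")
```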

Keywords: Zinc oxide nanopowder, Electricity consumption, Quantum yield, Nanoparticles, Photodegradation, Kinetic model, Insecticide.

5678 Thermohydraulic Performance of Double Flow Solar Air Heater with Corrugated Absorber

Authors: S. P. Sharma, Som Nath Saha

Abstract:

This paper deals with the analytical investigation of the thermal and thermohydraulic performance of double flow solar air heaters with corrugated and flat plate absorbers. A mathematical model of the double flow solar air heater is presented, and a computer program in C++ is developed to estimate the outlet air temperature for the evaluation of thermal and thermohydraulic efficiency, solving the governing equations numerically using relevant correlations for the heat transfer coefficients. The results obtained from the mathematical model are compared with available experimental results, and the agreement is found to be reasonably good. The results show that double flow solar air heaters have higher efficiency than the conventional solar air heater, and that the double flow corrugated absorber is superior to the double flow flat plate absorber. It is also observed that thermal efficiency increases with increasing mass flow rate, whereas thermohydraulic efficiency increases with mass flow rate only up to a certain limit, attains a maximum value, and thereafter decreases sharply.
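
For orientation, the sketch below shows the efficiency definitions commonly used in this kind of analysis: thermal efficiency as useful heat gain over incident radiation, and thermohydraulic (effective) efficiency with the pumping power subtracted after conversion to equivalent thermal energy. The correlations, geometry and numbers of the authors' C++ model are not reproduced; all values here are illustrative.

```python
# Common efficiency definitions for a solar air heater; illustrative numbers only.
CP_AIR = 1005.0  # specific heat of air, J/(kg K)

def thermal_efficiency(m_dot, t_in, t_out, irradiance, area):
    """Useful heat gain divided by incident solar radiation."""
    q_useful = m_dot * CP_AIR * (t_out - t_in)
    return q_useful / (irradiance * area)

def thermohydraulic_efficiency(m_dot, t_in, t_out, irradiance, area,
                               pressure_drop, rho=1.1, conversion=0.2):
    """Effective efficiency: subtract pumping power converted to primary energy."""
    q_useful = m_dot * CP_AIR * (t_out - t_in)
    pump_power = m_dot * pressure_drop / rho       # mechanical pumping power, W
    return (q_useful - pump_power / conversion) / (irradiance * area)

# Hypothetical operating point: 0.03 kg/s heated from 300 K to 325 K,
# 900 W/m^2 on a 2 m^2 absorber, 80 Pa pressure drop.
print(round(thermal_efficiency(0.03, 300, 325, 900, 2.0), 3))
print(round(thermohydraulic_efficiency(0.03, 300, 325, 900, 2.0, 80.0), 3))
```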

Keywords: Corrugated absorber, double flow, solar air heater, thermohydraulic efficiency.

5677 Mathematical Model of Smoking Time Temperature Effect on Ribbed Smoked Sheets Quality

Authors: Rifah Ediati, Jajang

Abstract:

The quality of Ribbed Smoked Sheets (RSS) is based primarily on color, dryness, and the presence or absence of fungus and bubbles. This quality is strongly influenced by the drying and fumigation process, namely the smoking process. Smoking at high temperature for a long time results in scorched, dark brown sheets, whereas a temperature that is too low or a drying rate that is too slow results in less mature sheets and fungal growth. It is therefore necessary to find the smoking time and temperature that give optimum sheet quality. Furthermore, unmonitored heat and mass transfer during the smoking process leads to large losses in the energy balance. This research aims to generate a simple empirical mathematical model describing the effect of smoking time and temperature on the RSS quality attributes of color, water content, fungus and bubbles. The second goal of the study was to analyze the energy balance during the smoking process. An experimental study was conducted by measuring temperature, residence time and quality parameters of 16 sheet samples in smoking rooms. Data for the energy balance, such as mass of fuel wood, mass of sheets being smoked, construction temperature, ambient temperature and relative humidity, were taken directly throughout the smoking process. The mathematical model correlating smoking temperature and time with color was found to be Color = -169 - 0.184 T4 - 0.193 T3 - 0.160 T2 + 0.405 T1 + 0.388 t1 + 3.11 t2 + 3.92 t3 + 0.215 t4 with an R-squared of 50.8%, and with moisture, Moisture = -1.40 - 0.00123 T4 + 0.00032 T3 + 0.00260 T2 - 0.00292 T1 - 0.0105 t1 + 0.0290 t2 + 0.0452 t3 + 0.00061 t4 with an R-squared of 49.9%. The smoking-room energy analysis found that useful energy accounted for 27.8% and the energy stored in the construction material for 7.3%. Energy lost in wood combustion conversion, ventilation and other processes amounted to 16.6%. The energy flowing out through contact of the construction material with the ambient air was the largest contribution to the losses, reaching 48.3%.
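
The reported regressions can be evaluated directly, as in the sketch below. Note that the grouping of the T1 and T2 terms in the colour equation was reconstructed from a garbled source line, so those coefficients should be treated as indicative; the temperatures and residence times in the example are hypothetical.

```python
# Evaluate the empirical regressions reported in the abstract. The T1/T2 terms
# of the colour equation were reconstructed from a garbled source line.

def rss_color(T1, T2, T3, T4, t1, t2, t3, t4):
    return (-169 - 0.184 * T4 - 0.193 * T3 - 0.160 * T2 + 0.405 * T1
            + 0.388 * t1 + 3.11 * t2 + 3.92 * t3 + 0.215 * t4)

def rss_moisture(T1, T2, T3, T4, t1, t2, t3, t4):
    return (-1.40 - 0.00123 * T4 + 0.00032 * T3 + 0.00260 * T2 - 0.00292 * T1
            - 0.0105 * t1 + 0.0290 * t2 + 0.0452 * t3 + 0.00061 * t4)

# Hypothetical smoking temperatures (deg C) and residence times (h) per stage.
temps = dict(T1=55, T2=58, T3=60, T4=54)
times = dict(t1=20, t2=22, t3=24, t4=20)
print(round(rss_color(**temps, **times), 1), round(rss_moisture(**temps, **times), 3))
```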

Keywords: RSS quality, temperature, time, smoking room, energy.

5676 The Effect of Failure Rate on Repair and Maintenance Costs of Four Agricultural Tractor Models

Authors: Fatemeh Afsharnia, Mohammad Amin Asoodar, Abbas Abdeshahi

Abstract:

Although the combination of variables such as repair and maintenance costs and accumulated hours of use has been widely considered in the economic evaluation literature for determining the optimum life of a tractor, no investigation has examined the influence of failure rate on repair and maintenance costs. In this study, the owners of three hundred tractors, comprising Massey Ferguson, John Deere and Universal models, were interviewed across five regions of Khouzestan Province. A regression model was used to predict the tractors' annual repair and maintenance costs based on failure rate. Results showed that the largest share of annual repair and maintenance costs occurred in engine parts for the MF285, JD3140 and U650 tractors, while for the MF399 tractor the costs for tires, rings, ball bearings and the operator seat were higher than for its other systems. According to the regression results, an increase in failure rate leads to an increase in annual repair and maintenance costs for all tractors; however, among all the tractors, the repair and maintenance costs of the JD3140 were the most strongly affected by an increase in failure rate.

Keywords: Failure rate, tractor, annual repair and maintenance costs, regression model, Khouzestan.

5675 Main Cause of Children's Deaths in Indigenous Wayuu Community from Department of La Guajira: A Research Developed through Data Mining Use

Authors: Isaura Esther Solano Núñez, David Suarez

Abstract:

The main purpose of this research is to discover what causes death in children of the Wayuu community and to analyze those results in depth in order to take corrective measures to properly control infant mortality. We consider it important to determine the reasons producing early death in this specific population, since they are the most vulnerable to high-risk environmental conditions. In this way, the government, through the competent authorities, may develop prevention policies and the right measures to avoid an increase in these deaths. The methodology used in this investigation is data mining, which consists of collecting and examining large amounts of data to produce new and valuable information. Through this technique it has been possible to determine that the child population is dying mostly from malnutrition. In short, this technique has been very useful for this study; it has allowed us to transform large amounts of information into a conclusive and important statement, which has made it easier to take appropriate steps to resolve this particular situation.

Keywords: Malnutrition, data mining, analytical, descriptive, population, Wayuu, indigenous.

5674 Iraqi Short Term Electrical Load Forecasting Based On Interval Type-2 Fuzzy Logic

Authors: Firas M. Tuaimah, Huda M. Abdul Abbas

Abstract:

Accurate Short Term Load Forecasting (STLF) is essential for a variety of decision making processes. However, forecasting accuracy can drop due to the presence of uncertainty in the operation of energy systems or unexpected behavior of exogenous variables. The Interval Type 2 Fuzzy Logic System (IT2 FLS), with its additional degrees of freedom, provides an excellent tool for handling uncertainties and improves prediction accuracy. The training data used in this study cover the period from January 1, 2012 to February 1, 2012 for the winter season and from July 1, 2012 to August 1, 2012 for the summer season. The actual load forecasting period runs from January 22 to 28, 2012 for the winter model and from July 22 to 28, 2012 for the summer model. The real data are for the Iraqi power system and belong to the Ministry of Electricity.
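
The extra degrees of freedom of an IT2 FLS come from its footprint of uncertainty. The sketch below shows one standard construction, a Gaussian membership function with an uncertain mean, returning the lower and upper membership grades; the parameters are illustrative and the paper's actual rule base and type-reduction step are not shown.

```python
# Interval type-2 Gaussian membership function with uncertain mean m in [m1, m2].
# Returns (lower, upper) membership grades; parameters are illustrative.
import math

def gauss(x, m, sigma):
    return math.exp(-((x - m) ** 2) / (2.0 * sigma ** 2))

def it2_membership(x, m1, m2, sigma):
    if x < m1:
        upper = gauss(x, m1, sigma)
    elif x > m2:
        upper = gauss(x, m2, sigma)
    else:
        upper = 1.0
    lower = min(gauss(x, m1, sigma), gauss(x, m2, sigma))  # farther boundary Gaussian
    return lower, upper

# Hypothetical "high load" fuzzy set evaluated at a normalized load of 0.72.
print(it2_membership(0.72, m1=0.6, m2=0.8, sigma=0.1))
```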

Keywords: Short term load forecasting, prediction interval, type 2 fuzzy logic systems.

5673 Activation Parameters of the Low Temperature Creep Controlling Mechanism in Martensitic Steels

Authors: M. Münch, R. Brandt

Abstract:

Martensitic steels with an ultimate tensile strength beyond 2000 MPa are applied in vehicle powertrains due to their excellent fatigue strength and high creep resistance. However, the creep controlling mechanism in martensitic steels at ambient temperatures up to 423 K is not evident. The purpose of this study is to review the low temperature creep (LTC) behavior of martensitic steels at temperatures from 363 K to 523 K. The validity of a logarithmic creep law is reviewed, and the stress and temperature dependence of the creep parameters α and β are revealed. Furthermore, creep tests are carried out that include stepped changes in temperature or stress, respectively. On the one hand, the change in creep rate due to a temperature step provides information on the magnitude of the activation energy of the LTC controlling mechanism; on the other hand, the stress step approach provides information on the magnitude of the activation volume. The magnitude, temperature dependency and stress dependency of both material-specific activation parameters may deliver a significant contribution to disclosing the nature of the LTC rate controlling mechanism.
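
Assuming a thermally activated rate law of the form creep rate ∝ exp(-(Q - σV)/kT), the stepped tests translate into activation parameters as sketched below. These are the standard textbook relations, with hypothetical input numbers; they are not measurements or results from this study.

```python
# Activation parameters from stepped creep tests, assuming
#   eps_dot ~ exp(-(Q - sigma*V) / (k_B * T)).
# Input rates, temperatures and stresses are hypothetical.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def activation_energy(rate1, rate2, T1, T2):
    """Apparent activation energy (J) from a temperature step at constant stress."""
    return K_B * math.log(rate2 / rate1) / (1.0 / T1 - 1.0 / T2)

def activation_volume(rate1, rate2, stress1, stress2, T):
    """Apparent activation volume (m^3) from a stress step at constant temperature."""
    return K_B * T * math.log(rate2 / rate1) / (stress2 - stress1)

# Hypothetical steps: rate triples for 363 K -> 383 K, doubles for 1800 -> 1850 MPa.
q = activation_energy(1.0e-9, 3.0e-9, 363.0, 383.0)
v = activation_volume(1.0e-9, 2.0e-9, 1.80e9, 1.85e9, 363.0)
print(round(q / 1.602e-19, 2), "eV", round(v / (2.48e-10) ** 3, 1), "b^3 (b for alpha-iron)")
```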

Keywords: Activation parameters, creep mechanisms, high strength steels, low temperature creep.

5672 Numerical Investigation for External Strengthening of Dapped-End Beams

Authors: A. Abdel-Moniem, H. Madkour, K. Farah, A. Abdullah

Abstract:

The reduction in depth of dapped-end beams near the supports tends to produce stress concentrations and hence shear cracks if adequate reinforcement detailing is not provided. This study numerically investigates the efficiency of applying different external strengthening techniques to the dapped ends of such beams. A two-dimensional finite element model was built to predict the structural behavior of dapped ends strengthened with different techniques. The techniques included external bonding of a steel angle at the re-entrant corner, unbonded bolt anchoring, external steel plate jacketing, exterior carbon fiber wrapping and/or stripping, and external inclined steel plates. The FE analysis results are presented in terms of ultimate load capacities, load-deflection behavior and crack pattern at failure. The FE model results at various stages were found to be comparable to the available test data. Moreover, the model enabled the capture of the failure progress, with acceptable accuracy, which is very difficult to achieve in a laboratory test.

Keywords: Dapped-end beams, finite element, shear failure, strengthening techniques, reinforced concrete, numerical investigation.

5671 The Public Law Studies: Relationship between Accountability, Environmental Education and Smart Cities

Authors: Aline Alves Bandeira, Luís Pedro Lima, Maria Cecília de Paula Silva, Paulo Henrique de Viveiros Tavares

Abstract:

Nowadays, the study of public policies with regard to management efficiency is essential. Public policies concern what governments do or do not do; the field has grown worldwide, contributing knowledge of technologies and methodologies that monitor and evaluate the performance of public administrators. The information published on official government websites needs to ensure the transparency and responsiveness of managers. Transparency is thus a primordial factor for accountability, providing services to citizens through the expansion of transparent, efficient and democratic information and valuing administrative eco-efficiency. The ecologically balanced management of a Smart City must optimize environmental education, building a fairer society that brings about equality in the use of quality environmental resources. Smart Cities add value to public management by enabling interaction between people, enhancing environmental education and the practical applicability of administrative eco-efficiency, fostering economic development and improving the quality of life.

Keywords: Accountability, environmental education, new public administration, smart cities.

5670 Regional Aircraft Selection Using Preference Analysis for Reference Ideal Solution (PARIS)

Authors: C. Ardil

Abstract:

The paper presents a multiple criteria decision making analysis process to determine the most suitable regional aircraft type according to a set of evaluation criteria. The main purpose of this study is to use different decision making methods to determine the most suitable regional aircraft for aviation operators. In this context, nine regional aircraft types were analyzed using multiple criteria decision making analysis methods, and preference analysis for reference ideal solution (PARIS) was used in the regional aircraft selection process. The findings of the proposed model show that the ranking results of the multiple criteria decision making models are consistent with each other, that the proposed method is efficient, and that the results are valid. Finally, the Embraer E195-E2 regional aircraft is chosen as the most suitable aircraft type.

Keywords: Aircraft, regional aircraft selection, multiple criteria decision making, multiple criteria decision making analysis, mean weight, entropy weight, MCDMA, PARIS.

5669 Level of Service Based Methodology for Municipal Infrastructure Management

Authors: Z. Khan, O. Moselhi, T. Zayed

Abstract:

The development of levels of service in a municipal context is a flexible vehicle for performing quality-cost trade-off analysis for municipal services. This trade-off depends on the willingness of a community to pay as well as on the condition of the assets. The community's perspective on the performance of an asset, from a service point of view, may be quite different from the municipality's perspective on the performance of the same asset from a condition point of view. This paper presents a three-phase level-of-service based methodology for water mains that consists of: 1) development of an Analytical Hierarchy model of level of service, 2) development of a Fuzzy Weighted Sum model of the water main condition index, and 3) derivation of a Fuzzy logic based function that maps level of service to the asset condition index. This mapping will assist asset managers in quantifying the condition improvement required to meet service goals and in making more informed decisions on interventions and related priorities.
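
A minimal sketch of phases 2 and 3 of such a methodology is given below: a weighted-sum condition index over normalized condition factors, and a simple mapping from a level-of-service grade to a target condition index. The weights, factor scores and breakpoints are hypothetical illustrations, not the paper's AHP weights or calibrated fuzzy model.

```python
# Weighted-sum condition index (phase 2) and a simple level-of-service mapping
# (phase 3). All weights, scores and breakpoints are hypothetical.

def condition_index(factor_scores, weights):
    """Weighted sum of normalized condition factors (0 = worst, 100 = best)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * factor_scores[k] for k in weights)

def target_condition_for_los(los):
    """Map a level-of-service grade (A best .. E worst) to a minimum condition index."""
    return {"A": 85, "B": 70, "C": 55, "D": 40, "E": 25}[los]

scores = {"structural": 62, "hydraulic": 75, "water_quality": 80}
weights = {"structural": 0.5, "hydraulic": 0.3, "water_quality": 0.2}
ci = condition_index(scores, weights)
print(ci, "- improvement needed" if ci < target_condition_for_los("B") else "- meets LOS B")
```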

Keywords: Asset Management, Level of Service, Condition Index, Analytical Hierarchy, Fuzzy Logic.

5668 Design and Implementation of Reed Solomon Encoder on FPGA

Authors: Amandeep Singh, Mandeep Kaur

Abstract:

Error correcting codes are used for the detection and correction of errors in digital communication systems. Error correcting coding is based on appending redundancy to the information message according to a prescribed algorithm. Reed Solomon codes are part of channel coding and withstand the effects of noise, interference and fading. Galois field arithmetic is used for encoding and decoding Reed Solomon codes, with Galois field multipliers and linear feedback shift registers (LFSR) used to encode the information data block. The design of a Reed Solomon encoder is complex because of the use of an LFSR and Galois field arithmetic. The purpose of this paper is to design and implement a Reed Solomon (255, 239) encoder with an optimized, smaller number of Galois field multipliers; a symmetric generator polynomial is used to reduce the number of GF multipliers. To increase the error correction capability, convolutional interleaving will be used with the RS encoder. The design will be implemented on a Xilinx Spartan II FPGA.
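
For reference, a software sketch of a systematic RS(255, 239) encoder over GF(2^8) is shown below; the polynomial division loop mirrors the LFSR structure used in hardware. It assumes the field polynomial 0x11d and generator roots α^0 through α^15, which may differ in detail from the paper's design, and it does not reproduce the symmetric-generator-polynomial optimization or the convolutional interleaver.

```python
# Systematic RS(255, 239) encoding over GF(2^8), field polynomial 0x11d.
# The division loop is a software mirror of the hardware LFSR; generator roots
# are assumed to be alpha^0 .. alpha^15.

def gf_mul(a, b, prim=0x11d):
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= prim
        b >>= 1
    return r

def gf_poly_mul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] ^= gf_mul(pi, qj)
    return r

def rs_generator_poly(nsym=16):
    """g(x) = (x + a^0)(x + a^1)...(x + a^(nsym-1)), a = 2 in GF(2^8)."""
    g, root = [1], 1
    for _ in range(nsym):
        g = gf_poly_mul(g, [1, root])
        root = gf_mul(root, 2)
    return g

def rs_encode(msg, nsym=16):
    """Return the systematic codeword: message followed by nsym parity symbols."""
    gen = rs_generator_poly(nsym)
    buf = list(msg) + [0] * nsym
    for i in range(len(msg)):
        coef = buf[i]
        if coef:
            for j in range(1, len(gen)):
                buf[i + j] ^= gf_mul(gen[j], coef)
    return list(msg) + buf[len(msg):]

codeword = rs_encode(list(range(239)))      # any 239 data bytes
print(len(codeword), codeword[-16:])        # 255 symbols, last 16 are parity
```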

Keywords: Galois Field, Generator polynomial, LFSR, Reed Solomon.

5667 A Spatial Hypergraph Based Semi-Supervised Band Selection Method for Hyperspectral Imagery Semantic Interpretation

Authors: Akrem Sellami, Imed Riadh Farah

Abstract:

Hyperspectral imagery (HSI) typically provides a wealth of information captured over a wide range of the electromagnetic spectrum for each pixel in the image; hence, a pixel in HSI is a high-dimensional vector of intensities with a large spectral range and a high spectral resolution. Semantic interpretation is therefore a challenging task in HSI analysis. In this paper, we focus on object classification as HSI semantic interpretation. However, HSI classification still faces several issues, among them the spatial variability of spectral signatures, the high number of spectral bands, and the high cost of true sample labeling. The high number of spectral bands combined with the low number of training samples poses the problem of the curse of dimensionality. To resolve this problem, we introduce a dimensionality reduction process aimed at improving HSI classification. The presented approach is a semi-supervised band selection method based on a spatial hypergraph embedding model that represents higher-order relationships with different weights for the spatial neighbors of the centroid pixel. This semi-supervised band selection has been developed to select useful bands for object classification. The approach is evaluated on AVIRIS and ROSIS HSIs and compared to other dimensionality reduction methods. The experimental results demonstrate the efficacy of our approach compared to many existing dimensionality reduction methods for HSI classification.

Keywords: Hyperspectral image, spatial hypergraph, dimensionality reduction, semantic interpretation, band selection, feature extraction.

5666 Simulated Annealing Algorithm for Data Aggregation Trees in Wireless Sensor Networks and Comparison with Genetic Algorithm

Authors: Ladan Darougaran, Hossein Shahinzadeh, Hajar Ghotb, Leila Ramezanpour

Abstract:

In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. Protocols that minimize sensor power consumption therefore receive the most attention in wireless sensor networks. One approach to reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, which combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Because information processing consumes less power than information transmission, data aggregation is of great importance and is used in many protocols [5]. One data aggregation technique is the data aggregation tree, but finding an optimum data aggregation tree for collecting data in networks with one sink is an NP-hard problem. In the data aggregation technique, related information packets are combined in intermediate nodes to form one packet, so the number of packets transmitted in the network is reduced, less energy is consumed, and network lifetime ultimately improves. Heuristic methods are used to solve this NP-hard problem, and simulated annealing is one such optimization method. In this article, we propose a new method for building the data aggregation tree in wireless sensor networks using the simulated annealing algorithm and evaluate its efficiency against a genetic algorithm.
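
A generic simulated annealing skeleton of the kind used to search over candidate aggregation trees is sketched below. The neighbour move and the cost function are placeholders (the paper's cost would model per-node transmission energy), and the toy topology is hypothetical.

```python
# Generic simulated-annealing search; the neighbour move and cost function are
# placeholders for an energy-aware aggregation-tree objective.
import math
import random

def simulated_annealing(initial, cost, neighbour, t0=100.0, cooling=0.95, steps=2000):
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    temperature = t0
    for _ in range(steps):
        candidate = neighbour(current)
        candidate_cost = cost(candidate)
        delta = candidate_cost - current_cost
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        temperature *= cooling
    return best, best_cost

# Toy example: choose each sensor's parent so that the total link cost is minimal.
parents = {1: 0, 2: 0, 3: 1, 4: 2}                 # node -> parent, node 0 is the sink
link_cost = {(1, 0): 2, (2, 0): 3, (3, 1): 1, (3, 2): 4, (4, 1): 5, (4, 2): 2}

def tree_cost(tree):
    return sum(link_cost[(node, parent)] for node, parent in tree.items())

def random_reparent(tree):
    new = dict(tree)
    node = random.choice([3, 4])                   # nodes that have an alternative parent
    new[node] = 1 if new[node] == 2 else 2
    return new

print(simulated_annealing(parents, tree_cost, random_reparent))
```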

Keywords: Data aggregation, wireless sensor networks, energy efficiency, simulated annealing algorithm, genetic algorithm.

5665 One Hour Ahead Load Forecasting Using Artificial Neural Network for the Western Area of Saudi Arabia

Authors: A. J. Al-Shareef, E. A. Mohamed, E. Al-Judaibi

Abstract:

In recent years, load forecasting has become one of the major areas of research in electrical engineering; most traditional forecasting models and artificial intelligence neural network techniques have been tried on this task. Artificial neural networks (ANN) have lately received much attention, and a great number of papers have reported successful experiments and practical tests. This article presents the development of an ANN-based short-term load forecasting model with an improved generalization technique for the Regional Power Control Center of the Saudi Electricity Company, Western Operation Area (SEC-WOA). The proposed ANN is trained with weather-related data and historical electric load data from the calendar years 2001, 2002, 2003 and 2004. The model was tested for one week in each of five different seasons, namely winter, spring, summer, Ramadan and fall, and the mean absolute average error for one-hour-ahead load forecasting was found to be 1.12%.
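
A minimal sketch of the forecasting setup, using a small multilayer perceptron on synthetic data, is shown below. The inputs (hour of day, temperature, previous-hour load), network size and data are illustrative; the authors' actual model, inputs and the SEC-WOA data are not reproduced.

```python
# One-hour-ahead load forecasting sketch with a small MLP on synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
hours = rng.integers(0, 24, 500)
temps = 25 + 10 * rng.random(500)
prev_load = 4000 + 800 * np.sin(2 * np.pi * hours / 24) + 50 * rng.standard_normal(500)
load = prev_load + 20 * (temps - 30) + 30 * rng.standard_normal(500)   # next-hour load

X = np.column_stack([hours, temps, prev_load])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(24,), max_iter=3000, random_state=0))
model.fit(X[:400], load[:400])

pred = model.predict(X[400:])
mape = np.mean(np.abs((load[400:] - pred) / load[400:])) * 100
print(f"hold-out MAPE on synthetic data: {mape:.2f}%")
```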

Keywords: Artificial neural networks, short-term load forecasting, back propagation.

5664 Neural Network Based Approach of Software Maintenance Prediction for Laboratory Information System

Authors: Vuk M. Popovic, Dunja D. Popovic

Abstract:

The software maintenance phase starts once a software project has been developed and delivered; after that, any modification to it corresponds to maintenance. Software maintenance involves modifications to keep a software project usable in a changed or changing environment, to correct discovered faults, and to improve performance or maintainability. Software maintenance and the management of software maintenance are recognized as two of the most important and most expensive processes in the life of a software product. This research bases the prediction of maintenance on risk and working time evaluation, using them as data sets for neural networks. The aim of this paper is to provide support to project maintenance managers: they will be able to pass the issues planned for the next software-service-patch to experts for risk and working time evaluation, and afterwards feed all the data into neural networks in order to obtain a software maintenance prediction. This process leads to a more accurate prediction of the working hours needed for the software-service-patch, which eventually leads to better budget planning for software maintenance projects.

Keywords: Laboratory information system, maintenance engineering, neural networks, software maintenance, software maintenance costs.

5663 Optimization of Enzymatic Hydrolysis of Manihot Esculenta Root Starch by Immobilized α-Amylase Using Response Surface Methodology

Authors: G. Baskar, C. Muthukumaran, S. Renganathan

Abstract:

Enzymatic hydrolysis of starch from natural sources finds potential application in the commercial production of alcoholic beverages and bioethanol. In this study, the effects of starch concentration, temperature, time and enzyme concentration on the hydrolysis of cassava (Manihot esculenta) starch powder (of mesh 80/120) into glucose syrup by α-amylase immobilized in polyacrylamide gel were studied and optimized using a central composite design. The experimental results on the enzymatic hydrolysis of cassava starch were subjected to multiple linear regression analysis using MINITAB 14 software. Positive linear effects of starch concentration, enzyme concentration and time on the hydrolysis of cassava starch by α-amylase were observed. The statistical significance of the model was validated by the F-test for analysis of variance (p < 0.01). The optimum values of starch concentration, temperature, time and enzyme concentration were found to be 4.5% (w/v), 45°C, 150 min and 1% (w/v), respectively. The maximum glucose yield at the optimum conditions was 5.17 mg/mL.
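
As a rough illustration of the response-surface step, the sketch below fits a full second-order model by ordinary least squares, using only two of the four factors (starch and enzyme concentration) and hypothetical run data; it is not the paper's MINITAB analysis.

```python
# Second-order response-surface fit by least squares; hypothetical runs in only
# two of the four factors (starch %, enzyme %) -> glucose yield (mg/mL).
import numpy as np

runs = np.array([
    [3.0, 0.5, 3.1], [6.0, 0.5, 3.6], [3.0, 1.5, 3.9], [6.0, 1.5, 4.1],
    [2.4, 1.0, 3.5], [6.6, 1.0, 3.8], [4.5, 0.3, 3.4], [4.5, 1.7, 4.4],
    [4.5, 1.0, 5.1], [4.5, 1.0, 5.0],
])
x1, x2, y = runs[:, 0], runs[:, 1], runs[:, 2]

# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
design = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
print("coefficients:", np.round(coeffs, 3))

# Predicted yield near the reported optimum (4.5% starch, 1% enzyme).
point = np.array([1.0, 4.5, 1.0, 4.5 ** 2, 1.0 ** 2, 4.5 * 1.0])
print("predicted yield:", round(float(point @ coeffs), 2))
```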

Keywords: Enzymatic hydrolysis, alcoholic beverage, central composite design, polynomial model, glucose yield.

5662 Designing Software Quality Measurement System for Telecommunication Industry Using Object-Oriented Technique

Authors: Nor Fazlina Iryani Abdul Hamid, Mohamad Khatim Hasan

Abstract:

A number of software quality measurement systems have been implemented over the past few years, but none of them focuses on the telecommunication industry. A software quality measurement system for the telecommunication industry is a system that calculates the quality value of measured software with a focus entirely on that industry. Before the system was designed, quality factors, quality attributes and quality metrics were identified based on a literature review and a survey. Then, using the identified quality factors, quality attributes and quality metrics, a quality model for the telecommunication industry was constructed. Each identified quality metric has its own formula. The quality value of the system is measured based on the quality metrics and aggregated by referring to the quality model, and the quality level of the software is classified based on the Net Satisfaction Index (NSI). The system was designed using an object-oriented approach in a web-based environment. The existence of such a software quality measurement system is important to both developers and users in order to produce high-quality software products for the telecommunication industry.

Keywords: Software Quality, Quality Measurement, Object-oriented Approach, Net Satisfaction Index.

5661 Lexical Based Method for Opinion Detection on Tripadvisor Collection

Authors: Faiza Belbachir, Thibault Schienhinski

Abstract:

The massive development of online social networks allows users to post and share their opinions on various topics. With this huge volume of opinion, it is interesting to extract and interpret this information for different domains, e.g., product and service benchmarking, politics, and recommendation systems. This is why opinion detection is one of the most important research tasks. It consists of differentiating between opinion data and factual data, and the difficulty lies in determining an approach that returns opinionated documents. Generally, two approaches are used for opinion detection: lexical based approaches and machine learning based approaches. In lexical based approaches, a dictionary of sentiment words is used and words are associated with weights; the opinion score of a document is derived from the occurrence of words from this dictionary. In machine learning approaches, a classifier is usually trained on a set of annotated documents containing sentiment, with features such as word n-grams, part-of-speech tags, and logical forms. The majority of these works determine the opinion score from the document text alone but do not take into account whether these texts are really reliable. It is therefore interesting to exploit other information to improve opinion detection. In our work, we develop a new way of computing the opinion score by introducing the notion of a trust score. We determine opinionated documents but also whether these opinions are really trustworthy information in relation to the topics. For that, we use the SentiWordNet lexicon to calculate opinion and trust scores, and we compute different user features (number of comments, number of useful comments, average useful reviews). We then combine the opinion score and the trust score to obtain a final score. We applied our method to detect trusted opinions in the TripAdvisor collection. Our experimental results show that the combination of opinion score and trust score improves opinion detection.
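
The scoring idea can be sketched as below: a lexicon-based opinion score per review combined with a reviewer trust score built from profile features. The tiny stand-in lexicon, the feature weighting and the linear combination rule are all illustrative assumptions, not the paper's SentiWordNet-based calibration.

```python
# Lexicon-based opinion score combined with a user trust score; the lexicon,
# weights and combination rule are illustrative.

# Tiny stand-in for SentiWordNet-style (positive, negative) scores per word.
LEXICON = {"great": (0.8, 0.0), "terrible": (0.0, 0.9), "clean": (0.6, 0.1),
           "noisy": (0.1, 0.7), "hotel": (0.0, 0.0)}

def opinion_score(text):
    words = [w.strip(".,!").lower() for w in text.split()]
    scored = [LEXICON[w] for w in words if w in LEXICON]
    if not scored:
        return 0.0
    return sum(pos + neg for pos, neg in scored) / len(scored)   # opinionatedness

def trust_score(n_comments, n_useful, avg_useful_votes, max_votes=50):
    useful_ratio = n_useful / n_comments if n_comments else 0.0
    return 0.5 * useful_ratio + 0.5 * min(avg_useful_votes / max_votes, 1.0)

def final_score(text, user, alpha=0.6):
    return alpha * opinion_score(text) + (1 - alpha) * trust_score(**user)

review = "Great hotel but terrible and noisy street."
user = dict(n_comments=40, n_useful=25, avg_useful_votes=12)
print(round(final_score(review, user), 3))
```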

Keywords: Tripadvisor, Opinion detection, SentiWordNet, trust score.

5660 Noise Performance of Magnetic Field Tunable Avalanche Transit Time Source

Authors: Partha Banerjee, Aritra Acharyya, Arindam Biswas, A. K. Bhattacharjee, Amit Banerjee, Hiroshi Inokawa

Abstract:

The effect of a magnetic field on the noise performance of the magnetic field tunable avalanche transit time (MAGTATT) device based on Si, designed to operate at W-band (75 – 110 GHz), has been studied in this paper. A comprehensive two-dimensional (2D) model has been developed. The simulation results show that, due to the presence of an applied external transverse magnetic field, both the noise spectral density and the noise measure of the MAGTATT device increase significantly, and the noise performance deteriorates further as the magnetic field strength is increased. Hence, in order to achieve magnetic field tuning of the radio frequency (RF) properties of an impact avalanche transit time (IMPATT) source, its noise performance has to be sacrificed to a fair extent. Moreover, this clearly indicates that an IMPATT source must be covered with appropriate magnetic shielding material to avoid undesirable shifts in operating frequency and output power and an objectionable deterioration in noise performance due to the presence of an external magnetic field.

Keywords: 2-D model, IMPATT, MAGTATT, mm-wave, noise performance.

5659 Validation of Reverse Engineered Web Application Models

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. Traditional static source code analysis may be very difficult because of the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.

Keywords: Validation, Dynamic Analysis, Mutation Analysis, Reverse Engineering, Web Applications

5658 A Model for Test Case Selection in the Software-Development Life Cycle

Authors: Adtha Lawanna

Abstract:

Software maintenance is one of the essential processes of the Software-Development Life Cycle. The main purposes of maintaining software are to correct errors, revise code, prevent future errors, and improve performance and capacity. While modifications are being made, the software has to be retested to increase the level of assurance that it still meets the requirements. Accordingly, test cases must be selected for exercising the revised modules and the whole software. This problem is commonly addressed by regression test selection techniques such as retest-all selection, random/ad-hoc selection and safe regression test selection. In particular, traditional techniques rely on a mapping between the test cases in a test suite and the lines of code they execute. However, lines of code are not the only requirement that can affect the size of the test suite; the number of functions and of faulty versions matters as well. Therefore, a model for test case selection is developed to cover those three requirements through an integrated technique that produces a smaller set of test cases compared with traditional regression selection techniques.
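
One simple way to realize an integrated selection over the three requirements is a greedy cover, sketched below: test cases are picked until the modified lines, affected functions and known faulty versions are all covered. The coverage data are hypothetical and the sketch is not the paper's actual model.

```python
# Greedy selection of test cases covering a mixed requirement set
# (modified lines, affected functions, faulty versions). Data are hypothetical.

def select_test_cases(tests, required):
    remaining = set(required)
    selected = []
    while remaining:
        best = max(tests, key=lambda t: len(tests[t] & remaining))
        if not tests[best] & remaining:
            break                                # some requirements cannot be covered
        selected.append(best)
        remaining -= tests[best]
    return selected, remaining

tests = {
    "TC1": {"line:42", "line:43", "func:login"},
    "TC2": {"line:43", "func:checkout", "fault:v1.2"},
    "TC3": {"line:88", "func:login", "fault:v1.3"},
    "TC4": {"line:42", "line:88", "func:checkout"},
}
required = {"line:42", "line:43", "line:88", "func:login", "func:checkout",
            "fault:v1.2", "fault:v1.3"}
picked, uncovered = select_test_cases(tests, required)
print(picked, "uncovered:", uncovered)
```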

Keywords: Software maintenance, regression test selection, test case.

5656 Lactic Acid-Chitosan Films’ Properties and Their in vivo Wound Healing Activity

Authors: T. S. Moe, T. A. Khaing

Abstract:

Chitosan is a derivative of chitin, a compound usually isolated from the shells of crustaceans such as crab, lobster and shrimp. It has biocompatible, biodegradable, and antimicrobial properties. To exploit these properties of chitosan in biomedical fields, chitosan films (1%, 2%, 3% and 4%) were prepared using 1% lactic acid as the solvent. The films were evaluated for tensile strength, elongation at break, degree of swelling, thickness, morphology, allergic and irritation reactions, and antibacterial activity. Staphylococcus aureus and Escherichia coli were used as the test microorganisms. The in vivo wound healing activities of the chitosan films were investigated in a mouse model. As a result, the chitosan films had a similar appearance and good swelling properties, with the 4% chitosan film showing the best swelling activity and the greatest elongation ratio among the films. The films also showed good wound healing activity in the mouse model. Moreover, the results showed that the films did not produce any unwanted reactions (allergy or irritation). In conclusion, it is evident that chitosan film has the potential to be used as a wound healing biofilm in biomedical fields.

Keywords: Chitosan, wound healing, antibacterial activity.

5655 Accuracy of Displacement Estimation and Selection of Capacitors for a Four Degrees of Freedom Capacitive Force Sensor

Authors: Chisato Murakami, Makoto Takahashi

Abstract:

Force sensors have been used as a prerequisite for obtaining information on the magnitude and direction of forces on the skin surface. We have developed a four-degrees-of-freedom capacitive force sensor (approximately 20×20×5 mm3) that has a flexible structure and sixteen parallel plate capacitors. An iterative algorithm was developed for estimating the four displacements from the sixteen capacitances, using a fourth-order polynomial approximation of the characteristics between capacitance and displacement. The estimation results from measured capacitances had large errors caused by deterioration of the characteristics. In this study, the effective capacitors carrying the major information were selected on the basis of the capacitance change range and the characteristic shape. The maximum errors at calibration and non-calibration points were 25% and 6.8%. Although the maximum error at the calibration points was larger than the desired value, the small averaged value indicated that only a few points had large errors; the error at the non-calibration points, on the other hand, was within the desired value.
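
The estimation step can be sketched as follows: each selected capacitor's characteristic is approximated by a fourth-order polynomial in displacement, and the displacement is recovered iteratively (Newton's method is used here for a single channel). The polynomial coefficients are illustrative, not the sensor's calibration, and the full four-degrees-of-freedom coupling is not shown.

```python
# Recover displacement from a fourth-order capacitance-displacement polynomial
# with Newton iteration; single channel, illustrative coefficients.

def poly4(coeffs, d):
    """c(d) = a0 + a1*d + a2*d^2 + a3*d^3 + a4*d^4."""
    return sum(a * d ** i for i, a in enumerate(coeffs))

def dpoly4(coeffs, d):
    return sum(i * a * d ** (i - 1) for i, a in enumerate(coeffs) if i > 0)

def estimate_displacement(coeffs, measured_c, d0=0.0, iterations=20):
    d = d0
    for _ in range(iterations):
        slope = dpoly4(coeffs, d)
        if abs(slope) < 1e-12:
            break
        d -= (poly4(coeffs, d) - measured_c) / slope
    return d

# Hypothetical calibration: capacitance (pF) as a polynomial of displacement (mm).
coeffs = [2.0, 1.5, -0.4, 0.05, -0.002]
measured = poly4(coeffs, 0.8)                 # pretend this came from the sensor
print(round(estimate_displacement(coeffs, measured), 4))   # recovers ~0.8 mm
```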

Keywords: Force sensors, capacitive sensors, estimation, iterative algorithms.

5654 Volatility Switching between Two Regimes

Authors: Josip Visković, Josip Arnerić, Ante Rozga

Abstract:

Based on the fact that volatility is time-varying in high frequency data and that periods of high volatility tend to cluster, the most successful and popular models for modeling time-varying volatility are GARCH-type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e. integrated behavior of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis in the stock markets of six selected central and east European countries. The empirical analysis demonstrates that the Markov regime switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.
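
For reference, the conditional-variance recursion underlying both the uni-regime and the regime-switching specifications is the GARCH(1,1) equation sketched below; in the Markov-switching case each regime carries its own (omega, alpha, beta) and the regimes are mixed via the transition probabilities. The parameter values and return series here are illustrative only.

```python
# GARCH(1,1) conditional-variance recursion:
#   sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
# Two illustrative parameter sets stand in for a "calm" and a "crisis" regime.

def garch11_variance(returns, omega, alpha, beta, initial_var=None):
    sigma2 = initial_var if initial_var is not None else omega / (1 - alpha - beta)
    path = [sigma2]
    for r in returns[:-1]:
        sigma2 = omega + alpha * r ** 2 + beta * sigma2
        path.append(sigma2)
    return path

returns = [0.001, -0.012, 0.025, -0.031, 0.004, 0.002]   # illustrative daily log-returns
calm = garch11_variance(returns, omega=1e-6, alpha=0.05, beta=0.90)
crisis = garch11_variance(returns, omega=5e-6, alpha=0.15, beta=0.80)
print([round(v, 7) for v in calm])
print([round(v, 7) for v in crisis])
```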

Keywords: Central and east European countries, financial crisis, Markov switching GARCH model, transition probabilities.

5653 Attacks Classification in Adaptive Intrusion Detection using Decision Tree

Authors: Dewan Md. Farid, Nouria Harbi, Emna Bahri, Mohammad Zahidur Rahman, Chowdhury Mofizur Rahman

Abstract:

Recently, information security has become a key issue in information technology, as computer systems are exposed to an increasing number of security threats and breaches. A variety of intrusion detection systems (IDS) have been employed over the last decades to protect computers and networks from malicious network-based or host-based attacks, using techniques ranging from traditional statistical methods to new data mining approaches. However, today's commercially available intrusion detection systems are signature-based and are not capable of detecting unknown attacks. In this paper, we present a new learning algorithm for an anomaly-based network intrusion detection system using a decision tree algorithm that distinguishes attacks from normal behaviors and identifies different types of intrusions. Experimental results on the KDD99 benchmark network intrusion detection dataset demonstrate that the proposed learning algorithm achieved a 98% detection rate (DR) in comparison with other existing methods.
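
A minimal sketch of the classification step, training a decision tree on KDD99-style connection features, is given below. The synthetic features, labels and tree settings are stand-ins; the paper's actual learning algorithm and the benchmark data are not reproduced, and the score printed is hold-out accuracy rather than the paper's detection-rate metric.

```python
# Decision-tree classifier on synthetic KDD99-style connection features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000
duration = rng.exponential(5.0, n)
src_bytes = rng.exponential(300.0, n)
failed_logins = rng.integers(0, 4, n)
# Synthetic rule: many failed logins or very large transfers look like attacks.
labels = ((failed_logins >= 2) | (src_bytes > 900)).astype(int)   # 1 = attack

X = np.column_stack([duration, src_bytes, failed_logins])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=1)

clf = DecisionTreeClassifier(max_depth=5, random_state=1)
clf.fit(X_train, y_train)
print(f"hold-out accuracy on synthetic data: {clf.score(X_test, y_test):.2%}")
```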

Keywords: Detection rate, decision tree, intrusion detection system, network security.
