Search results for: Universal Approximation function
602 Optimized Fuzzy Control by Particle Swarm Optimization Technique for Control of CSTR
Authors: Saeed Vaneshani, Hooshang Jazayeri-Rad
Abstract:
Fuzzy logic control (FLC) systems have been tested in many technical and industrial applications as a useful modeling tool that can handle the uncertainties and nonlinearities of modern control systems. The main drawback of FLC methodologies in the industrial environment is the challenge of selecting the optimum tuning parameters. In this paper, a method is proposed for finding the optimum membership functions of a fuzzy system using the particle swarm optimization (PSO) algorithm. A synthetic algorithm combining fuzzy logic control and the PSO algorithm is used to design a controller for a continuous stirred tank reactor (CSTR) with the aim of achieving accurate and acceptable results. To demonstrate the effectiveness of the proposed algorithm, it is used to optimize the Gaussian membership functions of the fuzzy model of a nonlinear CSTR system as a case study. The optimized membership functions (MFs) are shown to provide better performance than a fuzzy model of the same system whose MFs were heuristically defined.
Keywords: continuous stirred tank reactor (CSTR), fuzzy logic control (FLC), membership function (MF), particle swarm optimization (PSO)
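The abstract does not give the swarm settings, rule base, or performance index used; the sketch below is only a minimal Python illustration of tuning Gaussian membership function centers and widths with a textbook PSO loop against a placeholder cost function. The CSTR simulation and the cost used in the paper are assumptions left as stubs.

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership value of x for center c and width sigma."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def cost(params):
    """Placeholder for the closed-loop performance index (e.g., ISE).
    In the paper this would come from simulating the fuzzy-controlled CSTR."""
    centers, sigmas = params[:3], params[3:]
    # Dummy objective: prefer evenly spread centers and moderate widths.
    spread_penalty = np.var(np.diff(np.sort(centers)))
    width_penalty = np.sum((sigmas - 0.2) ** 2)
    return spread_penalty + width_penalty

def pso(dim=6, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = np.random.uniform(0.0, 1.0, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = np.random.rand(n_particles, dim), np.random.rand(n_particles, dim)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        vals = np.array([cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

best_params, best_cost = pso()
print("optimized MF parameters:", best_params, "cost:", best_cost)
```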
601 Media Regulation and Public Sphere in the Digital Age: An Analysis in the Light of Constructive Democracy
Abstract:
The article analyzes the possibility (and conditions) of a media regulation law in a democratic rule of law in the twenty-first century. To do so, the idea of the public sphere (from Jürgen Habermas) is presented first, showing how it serves as an interface between the citizen and the state (or the private and the public) and why it is important in a deliberative democracy. Based on this paradigm, the traditional perception of the role of public information (as a functional element of the system) and the possibility of media regulation, owing to the public nature of its activity, are presented. A critical argument is then developed from two different perspectives: a) the formal function of current media information, considering that the digital age has fragmented access to information; b) the concept of a constructive democracy, which reduces the need for representation, changing the strategic importance of the public sphere. The question to be addressed (based on comparative law) is whether regulation is justified in a polycentric democracy, especially when it operates in the digital age (with immediate and virtual communication). The argument presented is that, even in the twenty-first century, the media in a democratic rule of law still has an extremely important role and may be subject to regulation, but on terms very different from (and narrower than) those usually defended.
Keywords: Media regulation, public sphere, digital age, constructive democracy.
600 Experimental and Analytical Study of Scrap Tire Rubber Pad for Seismic Isolation
Authors: Huma Kanta Mishra, Akira Igarashi
Abstract:
A seismic isolation pad produced from scrap tire rubber containing interleaved steel reinforcing cords has been proposed. The steel cords are expected to function similarly to the steel plates used in conventional laminated rubber bearings. The scrap tire rubber pad (STRP) isolator is intended to be used in low-rise residential buildings in highly seismic areas of developing countries. An experimental investigation was conducted on unbonded STRP isolators, and the test results provided useful information including stiffness, damping values, and the eventual instability of the isolation unit. Finite element (FE) analysis of the STRP isolator was carried out on properly bonded samples. These types of isolators provide positive incremental force-resisting capacity up to a shear strain level of 155%. This paper briefly discusses the force-deformation behavior of bonded STRP isolators, including the stability of the isolation unit.
Keywords: base isolation, buckling load, finite element analysis, STRP isolators.
599 Software Engineering Inspired Cost Estimation for Process Modelling
Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller
Abstract:
Up to this point, business process management projects in general, and business process modelling projects in particular, could not rely on a practical and scientifically validated method to estimate cost and effort. The model development phase in particular is not covered by any cost estimation method or model. Later phases of business process modelling, starting with implementation, are covered by initial solutions discussed in the literature. This article proposes to fill this gap by deriving a cost estimation method from available methods in a similar domain, namely software development and software engineering. As we show, software development is closely similar to process modelling. After the method is proposed, different ideas for its further analysis and validation are presented. We derive this method from COCOMO II and Function Point analysis, which are established methods of effort estimation in the domain of software development. To this end, we lay out the similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle.
Keywords: Cost Estimation, Effort Estimation, Process Modelling, Business Process Management, COCOMO.
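As a point of reference for the derivation described above, the sketch below evaluates the COCOMO II post-architecture effort equation, PM = A · Size^E · ∏EM with E = B + 0.01 · ΣSF, using the commonly cited COCOMO II.2000 calibration A = 2.94 and B = 0.91. The scale-factor and effort-multiplier values in the example are illustrative assumptions, not figures from the article.

```python
# Hedged sketch of the COCOMO II post-architecture effort equation that the
# article adapts; the drivers below are illustrative, not data from the article.

def cocomo2_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    """Effort in person-months: PM = A * Size^E * prod(EM), E = B + 0.01 * sum(SF)."""
    E = B + 0.01 * sum(scale_factors)
    em_product = 1.0
    for em in effort_multipliers:
        em_product *= em
    return A * (ksloc ** E) * em_product

# Example: a 10 KSLOC-equivalent model with nominal-looking drivers
# (all effort multipliers 1.0, five scale factors near their nominal ratings).
print(cocomo2_effort(10, [3.72, 3.04, 4.24, 4.68, 3.29], [1.0] * 7))
```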
598 An Examination and Validation of the Theoretical Resistivity-Temperature Relationship for Conductors
Authors: Fred Lacy
Abstract:
Electrical resistivity is a fundamental parameter of metals or electrical conductors. Since resistivity is a function of temperature, a temperature-dependent theoretical model is needed to completely understand the behavior of metals. A model based on physics principles has recently been developed to obtain an equation that relates electrical resistivity to temperature. This equation depends upon a parameter associated with the electron travel time before being scattered and a parameter that relates the energy of the atoms to their separation distance. Analysis of the energy parameter reveals that the equation is optimized if the proportionality term in the equation is not constant but varies over the temperature range. Additional analysis reveals that the theoretical equation can be used to determine the mean free path of conduction electrons, the number of defects in the atomic lattice, and the ‘equivalent’ charge associated with the metallic bonding of the atoms. All of this analysis provides validation for the theoretical model and provides insight into the behavior of metals whose performance is affected by temperature (e.g., integrated circuits and temperature sensors).
Keywords: Callendar–van Dusen, conductivity, mean free path, resistance temperature detector, temperature sensor.
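The paper's own physics-based equation is not reproduced in the abstract; for orientation, the sketch below evaluates the standard Callendar–van Dusen relation named in the keywords for a Pt100 RTD, using the IEC 60751 coefficients, as the empirical baseline against which such theoretical models are usually compared. It is not the author's model.

```python
# Callendar-van Dusen relation for a Pt100 RTD with IEC 60751 coefficients.
A = 3.9083e-3   # 1/degC
B = -5.775e-7   # 1/degC^2
C = -4.183e-12  # 1/degC^4 (applied only below 0 degC)
R0 = 100.0      # resistance in ohms at 0 degC for a Pt100

def rtd_resistance(t_c):
    """Resistance of a Pt100 as a function of temperature in degrees Celsius."""
    if t_c >= 0:
        return R0 * (1 + A * t_c + B * t_c**2)
    return R0 * (1 + A * t_c + B * t_c**2 + C * (t_c - 100) * t_c**3)

for t in (-40, 0, 25, 100, 200):
    print(f"{t:5d} degC -> {rtd_resistance(t):8.3f} ohm")
```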
597 Performance Improvement in the Bivariate Models by using Modified Marginal Variance of Noisy Observations for Image-Denoising Applications
Authors: R. Senthilkumar
Abstract:
Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, the wavelet coefficients of natural images have significant dependencies. This paper attempts to give a recipe for selecting one of the popular image-denoising algorithms based on VisuShrink, SureShrink, OracleShrink, BayesShrink and BiShrink, and it also compares different bivariate models used for image-denoising applications. The first part of the paper compares different shrinkage functions used for image denoising. The second part compares different bivariate models, and the third part uses the bivariate model with modified marginal variance, which is based on a Laplacian assumption. The paper gives an experimental comparison on six commonly used 512x512 images: Lenna, Barbara, Goldhill, Clown, Boat and Stonehenge. Noise powers of 25 dB, 26 dB, 27 dB, 28 dB and 29 dB are added to the six standard images, and the corresponding Peak Signal to Noise Ratio (PSNR) values are calculated for each noise level.
Keywords: BiShrink, Image-Denoising, PSNR, Shrinkage function
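For reference, a minimal sketch of the PSNR figure of merit used in the comparison; the shrinkage rules themselves (VisuShrink, BayesShrink, BiShrink, ...) are not reimplemented here, and the synthetic arrays stand in for the 512x512 test images.

```python
import numpy as np

def psnr(original, restored, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB between two images of equal shape."""
    mse = np.mean((original.astype(np.float64) - restored.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# Example with synthetic data in place of the 512x512 test images.
clean = np.random.randint(0, 256, (512, 512))
noisy = np.clip(clean + np.random.normal(0, 10, clean.shape), 0, 255)
print(f"PSNR = {psnr(clean, noisy):.2f} dB")
```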
596 A Family of Entropies on Interval-valued Intuitionistic Fuzzy Sets and Their Applications in Multiple Attribute Decision Making
Abstract:
The entropy of an intuitionistic fuzzy set is used to indicate the degree of fuzziness of an interval-valued intuitionistic fuzzy set (IvIFS). In this paper, we deal with the entropies of IvIFSs. Firstly, we propose a family of entropies on IvIFSs with a parameter λ ∈ [0, 1], which generalizes two entropy measures defined independently by Zhang and Wei for IvIFSs, and we then prove that the new entropy is an increasing function with respect to the parameter λ. Furthermore, a new multiple attribute decision making (MADM) method using entropy-based attribute weights is proposed to deal with decision-making situations where the alternatives on attributes are expressed by IvIFSs and the attribute weight information is unknown. Finally, a numerical example is given to illustrate the applications of the proposed method.
Keywords: Interval-valued intuitionistic fuzzy sets, interval-valued intuitionistic fuzzy entropy, multiple attribute decision making.
595 Degradation of EE2 by Different Consortium of Enriched Nitrifying Activated Sludge
Authors: Pantip Kayee
Abstract:
17α-ethinylestradiol (EE2) is a recalcitrant micropollutant which is found in small amounts in municipal wastewater, but these small amounts still adversely affect the reproductive function of aquatic organisms. Past evidence suggested that full-scale WWTPs equipped with a nitrification process enhanced the removal of EE2 from municipal wastewater. EE2 has been proven to be transformable by ammonia oxidizing bacteria (AOB) via co-metabolism. This research aims to clarify the EE2 degradation pattern of different consortia of ammonia oxidizing microorganisms (AOM), including ammonia oxidizing archaea (AOA), and to investigate the contributions of the existing ammonia monooxygenase (AMO) and newly synthesized AMO. The results showed that AOA or AOB of the N. oligotropha cluster in enriched nitrifying activated sludge (NAS) from 2 mM and 5 mM enrichments, commonly found in municipal WWTPs, could degrade EE2 in wastewater via co-metabolism. Moreover, the investigation of the contributions of the existing AMO and newly synthesized AMO demonstrated that the newly synthesized AMO enzyme may perform ammonia oxidation rather than the existing AMO enzyme, or that the existing AMO enzyme may contribute only a small amount to ammonia oxidation.
Keywords: 17α-ethinylestradiol, nitrification, ammonia oxidizing bacteria, ammonia oxidizing archaea.
594 Effect of Eccentricity on Conjugate Natural Convection in Vertical Eccentric Annuli
Authors: A. Jamal, M. A. I. El-Shaarawi, E. M. A. Mokheimer
Abstract:
Combined conduction-free convection heat transfer in vertical eccentric annuli is numerically investigated using a finite-difference technique. Numerical results, representing heat transfer parameters such as annulus wall temperature, heat flux, and heat absorbed in the developing region of the annulus, are presented for a Newtonian fluid of Prandtl number 0.7, fluid-annulus radius ratio 0.5, solid-fluid thermal conductivity ratio 10, inner and outer wall dimensionless thicknesses 0.1 and 0.2, respectively, and dimensionless eccentricities 0.1, 0.3, 0.5, and 0.7. The annulus walls are subjected to thermal boundary conditions obtained by heating one wall isothermally while keeping the other wall at the inlet fluid temperature. In the present paper, the annulus heights required to achieve thermal full development for the prescribed eccentricities are obtained. Furthermore, the variation in the height of thermal full development as a function of the geometric parameter, i.e., eccentricity, is also investigated.
Keywords: Conjugate natural convection, eccentricity, heat transfer, vertical eccentric annuli.
593 Level of Service Based Methodology for Municipal Infrastructure Management
Authors: Z. Khan, O. Moselhi, T. Zayed
Abstract:
The development of levels of service in a municipal context is a flexible vehicle for performing quality-cost trade-off analysis for municipal services. This trade-off depends on the willingness of a community to pay as well as on the condition of the assets. The community's perspective on the performance of an asset from a service point of view may be quite different from the municipality's perspective on the performance of the same asset from a condition point of view. This paper presents a three-phased level of service based methodology for water mains that consists of: 1) development of an Analytical Hierarchy model of level of service, 2) development of a Fuzzy Weighted Sum model of the water main condition index, and 3) derivation of a fuzzy logic based function that maps level of service to the asset condition index. This mapping will assist asset managers in quantifying the condition improvement required to meet service goals and in making more informed decisions on interventions and related priorities.
Keywords: Asset Management, Level of Service, Condition Index, Analytical Hierarchy, Fuzzy Logic.
592 Effect of Transglutaminase Cross Linking on the Functional Properties as a Function of NaCl Concentration of Legumes Protein Isolate
Authors: Nahid A. Ali, Salma H. Ahmed, ElShazali A. Mohamed, Isam A. Mohamed Ahmed, Elfadil E. Babiker
Abstract:
The effect of cross-linking the protein isolates of three legumes with the microbial enzyme transglutaminase (EC 2.3.2.13) on their functional properties at different NaCl concentrations was studied. The reduction in the total free amino groups (OD340) of the polymerized protein showed that TGase treatment cross-linked the protein subunits of each legume. The solubility of the protein polymer of each legume was greatly improved at high NaCl concentrations. At 1.2 M NaCl the solubility of the native legume proteins was significantly decreased but slightly improved after polymerization. Cross-linked proteins were less turbid on heating to higher temperatures compared to native proteins, and the temperature at which the protein turns turbid also increased for the polymerized proteins. The emulsifying and foaming properties of the protein polymer were greatly improved at all NaCl concentrations for all legumes.
Keywords: Functional properties, Legumes, Protein isolate, NaCl, Transglutaminase.
591 Influence of Machining Process on Surface Integrity of Plasma Coating
Authors: T. Zlámal, J. Petrů, M. Pagáč, P. Krajkovič
Abstract:
For the required function of components with a thermal spray coating, it is necessary to perform additional machining of the coated surface. The paper deals with assessing the surface integrity of Metco 2042, a plasma sprayed coating, after its machining. The selected plasma sprayed coating serves as an abradable sealing coating in a jet engine. Therefore, the spray and its surface must meet high quality and functional requirements. Plasma sprayed coatings are characterized by a lamellar structure, which requires a special approach to their machining. Therefore, the experimental part involves the set-up of special cutting tools and cutting parameters under which the applied coating was machined. For the assessment of suitably set machining parameters, selected parameters of surface integrity were measured and evaluated during the experiment. To determine the size of surface irregularities and the effect of the selected machining technology on the sprayed coating surface, the surface roughness parameters Ra and Rz were measured. Furthermore, the measurement of sprayed coating surface hardness by the HR 15 Y method before and after the machining process was used to determine surface strengthening. Changes in strengthening were detected after machining. The impact of the chosen cutting parameters on the surface roughness after machining was not proven.
Keywords: Machining, plasma sprayed coating, surface integrity, strengthening.
590 Spectral Entropy Employment in Speech Enhancement based on Wavelet Packet
Authors: Talbi Mourad, Salhi Lotfi, Chérif Adnen
Abstract:
In this work, we are interested in developing a speech denoising tool using the discrete wavelet packet transform (DWPT). This speech denoising tool will be employed in recognition, coding, and synthesis applications. For noise reduction, instead of applying the classical thresholding technique, some wavelet packet nodes are set to zero and the others are thresholded. To estimate the non-stationary noise level, we employ the spectral entropy. A comparison of our proposed technique to classical denoising methods based on thresholding and spectral subtraction is made in order to evaluate our approach. The experimental implementation uses speech signals corrupted by two sorts of noise, white and Volvo noise. The results obtained from listening tests show that our proposed technique is better than spectral subtraction. The results obtained from SNR computation show the superiority of our technique over the classical thresholding method using the modified hard thresholding function based on the μ-law algorithm.
Keywords: Enhancement, spectral subtraction, SNR, discrete wavelet packet transform, spectral entropy, histogram.
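A hedged sketch of a frame-wise spectral entropy measure of the kind used above to track non-stationary noise; the exact estimator and the wavelet packet node selection rule of the paper are not reproduced, so the FFT-based normalization below is an assumption.

```python
import numpy as np

def spectral_entropy(frame, n_fft=512, eps=1e-12):
    """Normalized spectral entropy of one signal frame (low = tonal, high = noise-like)."""
    spectrum = np.abs(np.fft.rfft(frame, n_fft)) ** 2
    p = spectrum / (spectrum.sum() + eps)          # normalized power distribution
    h = -np.sum(p * np.log2(p + eps))              # Shannon entropy in bits
    return h / np.log2(len(p))                     # divide by maximum possible entropy

# Example: a sine frame gives low entropy, a white-noise frame gives high entropy.
t = np.arange(512) / 8000.0
print(spectral_entropy(np.sin(2 * np.pi * 440 * t)))   # low (tonal)
print(spectral_entropy(np.random.randn(512)))          # high (noise-like)
```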
589 Reform-Oriented Teaching of Introductory Statistics in the Health, Social and Behavioral Sciences – Historical Context and Rationale
Authors: Rossi A. Hassad
Abstract:
There is widespread emphasis on reform in the teaching of introductory statistics at the college level. Underpinning this reform is a consensus among educators and practitioners that traditional curricular materials and pedagogical strategies have not been effective in promoting statistical literacy, a competency that is becoming increasingly necessary for effective decision-making and evidence-based practice. This paper explains the historical context of, and rationale for, reform-oriented teaching of introductory statistics (at the college level) in the health, social and behavioral sciences (evidence-based disciplines). A firm understanding and appreciation of the basis for the change in pedagogical approach is important in order to facilitate commitment to reform, consensus building on appropriate strategies, and adoption and maintenance of best practices. In essence, reform-oriented pedagogy, in this context, is a function of the interaction among content, pedagogy, technology, and assessment. The challenge is to create an appropriate balance among these domains.
Keywords: Reform-oriented, reform, introductory statistics, health, behavioral sciences, evidence-based, psychology, teaching, learning.
588 Influence of Confined Acoustic Phonons on the Shubnikov–de Haas Magnetoresistance Oscillations in a Doped Semiconductor Superlattice
Authors: Pham Ngoc Thang, Le Thai Hung, Nguyen Quang Bau
Abstract:
The influence of confined acoustic phonons on the Shubnikov–de Haas magnetoresistance oscillations in a doped semiconductor superlattice (DSSL), subjected to a magnetic field, a DC electric field, and laser radiation, has been theoretically studied based on the quantum kinetic equation method. The analytical expression for the magnetoresistance in a DSSL has been obtained as a function of the external fields, the DSSL parameters, and especially the quantum number m characterizing the effect of confined acoustic phonons. When m goes to zero, the results for bulk phonons in a DSSL are recovered. Numerical calculations are also carried out for the GaAs:Si/GaAs:Be DSSL and compared with other studies. The results show that the amplitude of the Shubnikov–de Haas magnetoresistance oscillations decreases as the phonon confinement effect increases.
Keywords: Shubnikov–de Haas magnetoresistance oscillations, quantum kinetic equation, confined acoustic phonons, laser radiation, doped semiconductor superlattices.
587 Power Quality Improvement Using PI and Fuzzy Logic Controllers Based Shunt Active Filter
Authors: Dipen A. Mistry, Bhupelly Dheeraj, Ravit Gautam, Manmohan Singh Meena, Suresh Mikkili
Abstract:
In recent years, the large-scale use of power electronic equipment has led to an increase of harmonics in the power system. The harmonics result in poor power quality and have a great adverse economic impact on utilities and customers. Current harmonics are one of the most common power quality problems and are usually resolved by using a shunt active filter (SHAF). The main objective of this work is to develop PI and fuzzy logic controllers (FLC) to analyze the performance of the shunt active filter for mitigating current harmonics under balanced and unbalanced sinusoidal source voltage conditions for normal load and increased load. When the supply voltages are ideal (balanced), both PI and FLC converge to the same compensation characteristics. However, when the supply voltages are non-ideal (unbalanced), FLC offers outstanding results. Simulation results validate the superiority of the FLC with triangular membership function over the PI controller.
Keywords: DC link voltage, Fuzzy logic controller, Harmonics, PI controller, Shunt Active Filter.
586 Using Genetic Algorithm for Distributed Generation Allocation to Reduce Losses and Improve Voltage Profile
Authors: M. Sedighizadeh, A. Rezazadeh
Abstract:
This paper presents a method for the optimal allocation of distributed generation in distribution systems. Our aim is optimal distributed generation allocation for voltage profile improvement and loss reduction in a distribution network. A genetic algorithm (GA) was used as the solving tool; with reference to the two stated aims, the problem is defined and the objective function is introduced. Considering the sensitivity of the fitness values in the genetic algorithm process, a load flow calculation is needed for decision-making. The load flow algorithm is combined appropriately with the GA until acceptable results are obtained. We used the MATPOWER package for the load flow algorithm and combined it with our genetic algorithm. The suggested method is programmed in MATLAB, and ETAP software is applied to evaluate the correctness of the results. It was implemented on part of the Tehran electricity distribution grid. The results of applying this method to the test system show improvement in the voltage profile and loss reduction indexes.
Keywords: Distributed Generation, Allocation, Voltage Profile, Losses, Genetic Algorithm.
585 An Energy-Efficient Distributed Unequal Clustering Protocol for Wireless Sensor Networks
Authors: Sungju Lee, Jangsoo Lee, Hongjoong Sin, Seunghwan Yoo, Sanghyuck Lee, Jaesik Lee, Yongjun Lee, Sungchun Kim
Abstract:
Wireless sensor networks have been extensively deployed and researched. One of the major issues in wireless sensor networks is developing an energy-efficient clustering protocol. A clustering algorithm provides an effective way to prolong the lifetime of a wireless sensor network. In this paper, we compare several clustering protocols which significantly affect the balancing of energy consumption. We then propose an Energy-Efficient Distributed Unequal Clustering (EEDUC) algorithm which provides a new way of creating distributed clusters. In EEDUC, each sensor node sets a waiting time. This waiting time is a function of residual energy and the number of neighboring nodes. EEDUC uses the waiting time to distribute cluster heads. We also propose an unequal clustering mechanism to solve the hot-spot problem. Simulation results show that EEDUC distributes the cluster heads well, balances the energy consumption among the cluster heads, and increases the network lifetime.
Keywords: Wireless Sensor Network, Distributed Unequal Clustering, Multi-hop, Lifetime.
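The abstract does not give the waiting-time formula, so the sketch below is only an assumed illustration of the idea: a timer that shortens with higher residual energy and higher neighbor density, so that better-placed nodes announce themselves as cluster heads first. The weights alpha and beta, the jitter, and T_MAX are invented for the example and are not EEDUC's actual parameters.

```python
import random

T_MAX = 1.0  # maximum waiting time (s), assumed for illustration

def waiting_time(residual_energy, initial_energy, n_neighbors,
                 alpha=0.7, beta=0.3, max_neighbors=20):
    energy_term = 1.0 - residual_energy / initial_energy        # more energy -> shorter wait
    density_term = 1.0 - min(n_neighbors, max_neighbors) / max_neighbors
    jitter = random.uniform(0.0, 0.05)                          # break ties between nodes
    return T_MAX * (alpha * energy_term + beta * density_term) + jitter

# Nodes whose timers expire first announce themselves as cluster heads;
# neighbors that hear the announcement cancel their own timers.
nodes = [{"id": i, "E": random.uniform(0.2, 1.0), "nbrs": random.randint(3, 15)}
         for i in range(10)]
for n in nodes:
    n["wait"] = waiting_time(n["E"], 1.0, n["nbrs"])
print(sorted(nodes, key=lambda n: n["wait"])[0])  # most likely cluster head
```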
584 Establishing a Probabilistic Model of Extrapolated Wind Speed Data for Wind Energy Prediction
Authors: Mussa I. Mgwatu, Reuben R. M. Kainkwa
Abstract:
Wind is among the potential energy resources which can be harnessed to generate wind energy for conversion into electrical power. Due to the variability of wind speed with time and height, it becomes difficult to predict the generated wind energy optimally. In this paper, an attempt is made to establish a probabilistic model fitting the wind speed data recorded at the Makambako site in Tanzania. Wind speed and direction were measured using an anemometer (type AN1) and a wind vane (type WD1), respectively, both supplied by Delta-T Devices, at a measurement height of 2 m. Wind speeds were then extrapolated to a height of 10 m using the power law equation with an exponent of 0.47. Data were analysed using MINITAB statistical software to show the variability of wind speeds with time and height, and to determine the underlying probability model of the extrapolated wind speed data. The results show that wind speeds at the Makambako site vary cyclically over time and conform to the Weibull probability distribution. From these results, the Weibull probability density function can be used to predict the wind energy.
Keywords: Probabilistic models, wind speed, wind energy
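A short sketch of the two steps described above, power-law extrapolation from 2 m to 10 m with the stated exponent 0.47 followed by a Weibull fit, using synthetic data in place of the Makambako measurements.

```python
import numpy as np
from scipy import stats

v2m = np.random.weibull(2.0, 1000) * 4.0      # stand-in for measured 2 m speeds (m/s)
alpha = 0.47                                  # power-law exponent from the abstract
v10m = v2m * (10.0 / 2.0) ** alpha            # extrapolated 10 m speeds

# Fit a two-parameter Weibull distribution (location fixed at zero).
shape_k, loc, scale_c = stats.weibull_min.fit(v10m, floc=0)
print(f"Weibull shape k = {shape_k:.2f}, scale c = {scale_c:.2f} m/s")

# The fitted density f(v) = (k/c)(v/c)^(k-1) exp(-(v/c)^k) can then be used
# to estimate the available wind energy, e.g. via numerical integration.
```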
583 Performance Analysis of a Discrete-time GeoX/G/1 Queue with Single Working Vacation
Authors: Shan Gao, Zaiming Liu
Abstract:
This paper treats a discrete-time batch arrival queue with a single working vacation. The main purpose of this paper is to present a performance analysis of this system by using the supplementary variable technique. For this purpose, we first analyze the Markov chain underlying the queueing system and obtain its ergodicity condition. Next, we present the stationary distributions of the system length as well as some performance measures at random epochs by using the supplementary variable method. Thirdly, still based on the supplementary variable method, we give the probability generating function (PGF) of the number of customers at the beginning of a busy period and give a stochastic decomposition formula for the PGF of the stationary system length at the departure epochs. Additionally, we investigate the relation between our discrete-time system and its continuous-time counterpart. Finally, some numerical examples show the influence of the parameters on some crucial performance characteristics of the system.
Keywords: Discrete-time queue, batch arrival, working vacation, supplementary variable technique, stochastic decomposition.
582 A New Source Code Auditing Algorithm for Detecting LFI and RFI in PHP Programs
Authors: Seyed Ali Mir Heydari, Mohsen Sayadiharikandeh
Abstract:
Static analysis of source code is used for auditing web applications to detect vulnerabilities. In this paper, we propose a new algorithm to analyze PHP source code for detecting potential LFI and RFI vulnerabilities. In our approach, we first define some patterns for finding functions which have the potential to be abused because of unhandled user input. More precisely, we use regular expressions as a fast and simple method to define patterns for the detection of vulnerabilities. As inclusion functions can also be used in a safe way, many false positives (FPs) can occur. The first cause of these FPs could be that the function does not use a user-supplied variable as an argument. So, we extract a list of user-supplied variables to be used for detecting vulnerable lines of code. On the other hand, as a vulnerability could spread among variables, for example by multi-level assignment, we also try to extract the hidden user-supplied variables. We use the resulting list to decrease the false positives of our method. Finally, as there exist some ways to prevent the vulnerability of inclusion functions, we also define some patterns to detect them and decrease our false positives.
Keywords: User-supplied variables, hidden user-supplied variables, PHP vulnerabilities.
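A hedged sketch of the pattern-based first pass described above: regular expressions flag PHP include/require calls whose argument involves a variable, and a simple taint list of user-supplied variables is used to cut false positives. The patterns and the taint rules below are simplified assumptions, not the authors' full rule set.

```python
import re

INCLUDE_CALL = re.compile(
    r'\b(include|include_once|require|require_once)\s*\(?\s*(.+?)\s*\)?\s*;',
    re.IGNORECASE)
USER_SOURCES = re.compile(r'\$_(GET|POST|REQUEST|COOKIE)\b')
PHP_VARIABLE = re.compile(r'\$\w+')

def audit(php_source):
    findings = []
    tainted = set()
    for lineno, line in enumerate(php_source.splitlines(), start=1):
        # Track direct assignments from user-supplied superglobals.
        m = re.match(r'\s*(\$\w+)\s*=\s*.*\$_(GET|POST|REQUEST|COOKIE)', line)
        if m:
            tainted.add(m.group(1))
        call = INCLUDE_CALL.search(line)
        if call:
            arg = call.group(2)
            if USER_SOURCES.search(arg) or any(v in tainted for v in PHP_VARIABLE.findall(arg)):
                findings.append((lineno, line.strip()))
    return findings

sample = '$page = $_GET["page"];\ninclude($page . ".php");\n'
print(audit(sample))   # flags line 2 as a potential LFI/RFI sink
```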
581 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created by using the source code, metrics processed from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts which are required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data are available. We analyze seven datasets from the NASA Metrics Data Program which offer design as well as code metrics. Overall, the results of cross-project learning are comparable to within-company data learning.
Keywords: Software Metrics, Fault prediction, Cross project, Within project.
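A minimal sketch of the cross-project setting described above: train Naïve Bayes on design-phase metrics from one project and test on another. The arrays are synthetic stand-ins for the NASA MDP design metrics, and scikit-learn's GaussianNB is used here as the Naïve Bayes implementation.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
# Columns stand in for design metrics such as node count, edge count, branch count.
X_train = rng.normal(size=(300, 4))
y_train = (X_train[:, 0] + rng.normal(size=300) > 1).astype(int)   # source project faults
X_test = rng.normal(size=(150, 4))
y_test = (X_test[:, 0] + rng.normal(size=150) > 1).astype(int)     # target project faults

model = GaussianNB().fit(X_train, y_train)   # learn from the source project
pred = model.predict(X_test)                 # predict faults in the target project
print("recall:", recall_score(y_test, pred), "precision:", precision_score(y_test, pred))
```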
580 Robust Statistics Based Algorithm to Remove Salt and Pepper Noise in Images
Authors: V. R. Vijaykumar, P. T. Vanathi, P. Kanagasabapathy, D. Ebenezer
Abstract:
In this paper, a robust statistics based filter to remove salt and pepper noise in digital images is presented. The function of the algorithm is to detect the corrupted pixels first, since impulse noise affects only certain pixels in the image and the remaining pixels are uncorrupted. The corrupted pixels are replaced by an estimated value using the proposed robust statistics based filter. The proposed method performs well in removing low to medium density impulse noise with detail preservation up to a noise density of 70%, compared to the standard median filter, weighted median filter, recursive weighted median filter, progressive switching median filter, signal dependent rank ordered mean filter, adaptive median filter, and a recently proposed decision based algorithm. The visual and quantitative results show that the proposed algorithm outperforms these methods in restoring the original image, with superior preservation of edges and better suppression of impulse noise.
Keywords: Image denoising, Nonlinear filter, Robust Statistics, and Salt and Pepper Noise.
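A simplified sketch of the detect-then-replace idea described above, treating pixels at the extreme values (0 or 255) as corrupted and replacing them with the median of the uncorrupted neighbors; the authors' robust-statistics estimator itself is not reproduced here.

```python
import numpy as np

def remove_salt_pepper(img):
    out = img.astype(np.float64).copy()
    padded = np.pad(out, 1, mode="edge")
    corrupted = (img == 0) | (img == 255)              # impulse noise detection
    for r, c in zip(*np.nonzero(corrupted)):
        window = padded[r:r + 3, c:c + 3].ravel()      # 3x3 neighborhood
        good = window[(window != 0) & (window != 255)] # keep uncorrupted neighbors
        out[r, c] = np.median(good) if good.size else np.median(window)
    return out.astype(img.dtype)

# Example: corrupt 20% of a synthetic image and restore it.
clean = np.full((64, 64), 128, dtype=np.uint8)
noisy = clean.copy()
mask = np.random.rand(64, 64) < 0.2
noisy[mask] = np.random.choice([0, 255], size=mask.sum())
print(np.abs(remove_salt_pepper(noisy).astype(int) - clean.astype(int)).mean())
```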
579 Join and Meet Block Based Default Definite Decision Rule Mining from IDT and an Incremental Algorithm
Authors: Chen Wu, Jingyu Yang
Abstract:
Using maximal consistent blocks of the tolerance relation on the universe of an incomplete decision table, the concepts of join block and meet block are introduced and studied. In addition to the tolerance class, other blocks such as the tolerant kernel and compatible kernel of an object are also discussed. Upper and lower approximations based on those blocks are also defined. Default definite decision rules acquired from an incomplete decision table are proposed in the paper. An incremental algorithm to update default definite decision rules is suggested for effective mining tasks on an incomplete decision table into which data are appended. Through an example, we demonstrate how default definite decision rules based on maximal consistent blocks, join blocks and meet blocks are acquired, and how optimization is done with the support of the discernibility matrix and discernibility function in the incomplete decision table.
Keywords: rough set, incomplete decision table, maximal consistent block, default definite decision rule, join and meet block.
578 A Semi-Fragile Signature based Scheme for Ownership Identification and Color Image Authentication
Authors: M. Hamad Hassan, S.A.M. Gilani
Abstract:
In this paper, a novel scheme is proposed for ownership identification and authentication using color images by deploying cryptography and digital watermarking as the underlying technologies. The former is used to compute the content-based hash and the latter to embed the watermark. The host image, used to claim rightful ownership, is first transformed from RGB to the YST color space, which is exclusively designed for watermarking-based applications. Geometrically, YS ⊥ T, and the T channel corresponds to the chrominance component of the color image and is therefore suitable for embedding the watermark. The T channel is divided into 4×4 non-overlapping blocks. The block size is important for enhanced localization, security, and low computation. Each block, along with the ownership information, is then processed by SHA-160, a one-way hash function, to compute the content-based hash, which is always unique and resistant against the birthday attack, instead of using MD5, which may raise the collision condition H(m) = H(m'). The watermark payload varies from block to block and is computed by the variance factor α. The quality of the watermarked images is quite high both subjectively and objectively. Our scheme is blind, computationally fast, and exactly locates the tampered region.
Keywords: Hash Collision, LSB, MD5, PSNR, SHA160.
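A sketch of the per-block hashing step described above: the T channel is split into 4x4 non-overlapping blocks and each block, concatenated with the owner information, is hashed with a 160-bit digest (SHA-1 here, on the assumption that SHA-160 denotes the 160-bit SHA). The RGB-to-YST transform and the watermark embedding itself are not reproduced.

```python
import hashlib
import numpy as np

def block_hashes(t_channel, owner_id):
    h, w = t_channel.shape
    digests = {}
    for r in range(0, h - h % 4, 4):
        for c in range(0, w - w % 4, 4):
            block = t_channel[r:r + 4, c:c + 4]
            payload = block.tobytes() + owner_id.encode("utf-8")
            digests[(r, c)] = hashlib.sha1(payload).hexdigest()  # 160-bit digest per block
    return digests

t = np.random.randint(0, 256, (16, 16), dtype=np.uint8)   # stand-in for the T channel
print(list(block_hashes(t, "owner-0001").items())[:2])
```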
577 Analysis of Formyl Peptide Receptor 1 Protein Value as an Indicator of Neutrophil Chemotaxis Dysfunction in Aggressive Periodontitis
Authors: Prajna Metta, Yanti Rusyanti, Nunung Rusminah, Bremmy Laksono
Abstract:
A decrease in neutrophil chemotaxis function may cause increased susceptibility to aggressive periodontitis (AP). Neutrophil chemotaxis is affected by formyl peptide receptor 1 (FPR1), which, when activated, responds to the bacterial chemotactic peptide formyl-methionyl-leucyl-phenylalanine (FMLP). The FPR1 protein value is decreased in response to a wide number of inflammatory stimuli in AP patients. This study aimed to assess the alteration of the FPR1 protein value in AP patients and whether the FPR1 protein value could be used as an indicator of neutrophil chemotaxis dysfunction in AP. This is a case-control study with 20 AP patients and 20 control subjects. Three milliliters of peripheral blood were drawn and analyzed for FPR1 protein value with ELISA. The data were statistically analyzed with the Mann-Whitney test (p > 0.05). The results showed that the mean FPR1 protein value in the AP group is 0.353 pg/mL (0.11 to 1.18 pg/mL) and the mean FPR1 protein value in the control group is 0.296 pg/mL (0.05 to 0.88 pg/mL). A p-value of 0.787 > 0.05 indicates that there is no significant difference in FPR1 protein value between the two groups. The present study suggests that the FPR1 protein value shows no significant alteration in AP patients and could not be used as an indicator of neutrophil chemotaxis dysfunction.
Keywords: Aggressive periodontitis, chemotaxis dysfunction, FPR1 protein value, neutrophil.
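For illustration, the comparison reported above can be run with SciPy's Mann-Whitney U test; the per-patient values below are placeholders loosely spanning the ranges quoted in the abstract, not the study data.

```python
from scipy.stats import mannwhitneyu

# Placeholder FPR1 values (pg/mL), not the actual measurements from the study.
ap_group      = [0.11, 0.20, 0.31, 0.35, 0.47, 0.60, 0.82, 1.18]
control_group = [0.05, 0.12, 0.22, 0.30, 0.33, 0.41, 0.55, 0.88]

stat, p_value = mannwhitneyu(ap_group, control_group, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
# p > 0.05 would be read, as in the study, as no significant group difference.
```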
576 Influence of an External Magnetic Field on the Acoustomagnetoelectric Field in a Rectangular Quantum Wire with an Infinite Potential by Using a Quantum Kinetic Equation
Authors: N. Q. Bau, N. V. Nghia
Abstract:
The acoustomagnetoelectric (AME) field in a rectangular quantum wire with an infinite potential (RQWIP) is calculated in the presence of an external magnetic field (EMF) by using the quantum kinetic equation for the distribution function of the electron system interacting with external phonons and with internal acoustic phonon scattering in the RQWIP. We obtained an analytic expression for the AME field in the RQWIP in the presence of the EMF. The dependence of the AME field on the frequency of the external acoustic wave, the temperature T of the system, the cyclotron frequency, and the intensity of the EMF is obtained. Theoretical results for the AME field are numerically evaluated, plotted, and discussed for a specific RQWIP, GaAs/GaAsAl. The results show that the dependence of the AME field on the intensity of the EMF is nonlinear and exhibits many distinct maxima in the quantized magnetic region. We also compared the obtained fields with those for normal bulk semiconductors, quantum wells, and quantum wires to show the difference. The influence of an EMF on the AME field in a RQWIP is studied here for the first time.
Keywords: Rectangular quantum wire, acoustomagnetoelectric field, electron-phonon interaction, kinetic equation method.
575 Structural Reliability of Existing Structures: A Case Study
Authors: Z. Sakka, I. Assakkaf, T. Al-Yaqoub, J. Parol
Abstract:
A reliability-based methodology for the assessment and evaluation of reinforced concrete (R/C) structural elements of concrete structures is presented herein. The results of the reliability analysis and assessment for R/C structural elements were verified by the results obtained through deterministic methods. The outcomes of the reliability-based analysis were compared against currently adopted safety limits, incorporated in the reliability indices β, according to international standards and codes. The methodology is based on probabilistic analysis using reliability concepts and the statistics of the main random variables that are relevant to the subject matter and that are used in the performance-function equation(s) associated with the structural elements under study. These techniques yield the reliability index β, also known as the reliability measure, which can be utilized to assess and evaluate the safety, human risk, and functionality of the structural component. Also, these methods can produce revised partial safety factor values for certain target reliability indices, which can be used for redesigning the R/C elements of the building and can assist in considering other remedial actions to improve the safety and functionality of the member.
Keywords: Concrete Structures, FORM, Monte Carlo Simulation, Structural Reliability.
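A minimal Monte Carlo sketch of the reliability concept used above: sample the random variables of a performance function g = R − S (resistance minus load effect), estimate the probability of failure, and convert it to a reliability index β = −Φ⁻¹(Pf). The distributions are illustrative assumptions, not the case-study data, and FORM itself is not implemented here.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 1_000_000
R = rng.lognormal(mean=np.log(60.0), sigma=0.10, size=n)   # resistance (illustrative units)
S = rng.normal(loc=40.0, scale=6.0, size=n)                # load effect (illustrative units)

g = R - S                              # performance (limit-state) function
pf = np.mean(g < 0)                    # Monte Carlo estimate of failure probability
beta = -norm.ppf(pf) if pf > 0 else np.inf
print(f"Pf = {pf:.2e}, beta = {beta:.2f}")
```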
574 An Information Theoretic Approach to Rescoring Peptides Produced by De Novo Peptide Sequencing
Authors: John R. Rose, James P. Cleveland, Alvin Fox
Abstract:
Tandem mass spectrometry (MS/MS) is the engine driving high-throughput protein identification. Protein mixtures possibly representing thousands of proteins from multiple species are treated with proteolytic enzymes, cutting the proteins into smaller peptides that are then analyzed, generating MS/MS spectra. The task of determining the identity of the peptide from its spectrum is currently the weak point in the process. Current approaches to de novo sequencing are able to compute candidate peptides efficiently. The problem lies in the limitations of current scoring functions. In this paper we introduce the concept of the proteome signature. By examining proteins and compiling proteome signatures (amino acid usage), it is possible to characterize likely combinations of amino acids and better distinguish between candidate peptides. Our results strongly support the hypothesis that a scoring function that considers amino acid usage patterns is better able to distinguish between candidate peptides. This in turn leads to higher accuracy in peptide prediction.
Keywords: Tandem mass spectrometry, proteomics, scoring, peptide, de novo, mutual information
573 Improvement of Central Composite Design in Modeling and Optimization of Simulation Experiments
Authors: A. Nuchitprasittichai, N. Lerdritsirikoon, T. Khamsing
Abstract:
Simulation modeling can be used to solve real world problems and provides an understanding of a complex system. To develop a simplified model of a process simulation, a suitable experimental design is required to be able to capture surface characteristics. This paper presents the experimental design and algorithm used to model the process simulation for an optimization problem. CO2 liquefaction based on external refrigeration with two refrigeration circuits was used as the simulation case study. Latin Hypercube Sampling (LHS) was proposed to be combined with existing Central Composite Design (CCD) samples to improve the performance of CCD in generating the second-order model of the system. The second-order model was then used as the objective function of the optimization problem. The results showed that adding LHS samples to CCD samples can help capture surface curvature characteristics. A suitable number of LHS sample points should be considered in order to obtain an accurate nonlinear model with a minimum number of simulation experiments.
Keywords: Central composite design, CO2 liquefaction, Latin Hypercube Sampling, simulation-based optimization.
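A sketch of the augmentation idea described above: Latin Hypercube samples are added to a small face-centered CCD and a second-order model is fitted by least squares. The placeholder response function below stands in for the CO2 liquefaction simulation, and the design sizes are illustrative assumptions.

```python
import numpy as np
from scipy.stats import qmc

def simulate(x1, x2):
    """Placeholder for the process simulator's response (e.g., specific power)."""
    return 3.0 + 1.5 * x1 - 2.0 * x2 + 0.8 * x1 * x2 + 1.2 * x1**2 + 0.5 * x2**2

# Face-centered CCD points for two factors in coded units [-1, 1].
ccd = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)

# Additional LHS points mapped to the same coded space.
lhs = qmc.LatinHypercube(d=2, seed=0).random(n=8) * 2.0 - 1.0
X = np.vstack([ccd, lhs])
y = simulate(X[:, 0], X[:, 1])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 3))   # recovers the placeholder coefficients
```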