Search results for: fuzzy logic estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2166


816 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis

Authors: Petr Gurný

Abstract:

One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. This paper discusses the determination of a financial institution's PD by means of credit-scoring models. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. These models are then compared and verified on a control sample with a view to choosing the best one. The second part of the paper applies the chosen model to a portfolio of three key Czech banks in order to estimate their present financial stability. It is no less important, however, to be able to estimate the evolution of PD in the future. The second task of this paper is therefore to estimate the probability distribution of the future PD of the Czech banks. To this end, the values of the particular indicators are sampled randomly and the distribution of the PDs is estimated, under the assumption that the indicators follow a multidimensional subordinated Lévy model (specifically, the Variance Gamma and Normal Inverse Gaussian models). Although the obtained results show that all of the banks are relatively healthy, there is still a non-negligible probability that a "financial crisis" will occur; this is indicated by the quantiles of the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
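
As a minimal illustration of the credit-scoring step, a logit model of PD can be fitted and applied as follows. This is only a sketch: the indicator matrix, default flags and the new bank's indicator vector are hypothetical, not the paper's US-bank sample.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training sample: rows are banks, columns are financial indicators
# (e.g. capital adequacy, ROA, NPL ratio); y marks observed defaults.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 3))
y_train = (rng.random(300) < 0.1).astype(int)   # roughly 10% defaults, illustrative

logit = LogisticRegression().fit(X_train, y_train)

# PD of a new bank described by its indicator vector.
x_new = np.array([[0.2, -0.5, 1.1]])
pd_estimate = logit.predict_proba(x_new)[0, 1]
print(f"Estimated probability of default: {pd_estimate:.3f}")
```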

Keywords: Credit-scoring Models, Multidimensional Subordinated Lévy Model, Probability of Default.

815 Is Curcumine Effect Comparable to 5-Aminosalicylic Acid or Budesonide on a Rat Model of Ulcerative Colitis Induced by Trinitrobenzene Sulfonic Acid?

Authors: Inas E. Darwish, Alia M. Arab, Tarek A. Azeim, Teshreen M. Zeitoun, Wafaa A. Hewedy, Moemen A. Heiba, Iman S. Emara

Abstract:

Inflammatory bowel disease (IBD) is a chronic relapsing-remitting condition that afflicts millions of people throughout the world and impairs their daily functions and quality of life. Treatment of IBD depends largely on 5-aminosalicylic acid (5-ASA) and corticosteroids. The present study aimed to clarify the effects of 5-aminosalicylic acid, budesonide and curcumin in 90 male albino rats with trinitrobenzene sulfonic acid (TNBS)-induced colitis. TNBS was injected intrarectally into 50 rats; the other 40 rats served as control groups. Both 5-ASA (in a dose of 120 mg/kg) and budesonide (in a dose of 0.1 mg/kg) were administered daily for one week, whereas curcumin was injected intraperitoneally (in a dose of 30 mg/kg daily) for 14 days after injection of either TNBS in the colitis rats (group B) or saline in the control groups (group A). The study included estimation of a macroscopic score index, histological examination of H&E-stained sections of the colonic tissue, and biochemical estimation of myeloperoxidase (MPO), nitric oxide (NO) and caspase-3 levels, in addition to studying the effect of the tested drugs on colonic motility. It was found that budesonide and curcumin improved mucosal healing and reduced both NO production and caspase-3 levels. They had the best impact on the disturbed colonic motility in the TNBS model of colitis.

Keywords: Colitis, curcumin, nitric oxide.

814 Estimation of Attenuation and Phase Delay in Driving Voltage Waveform of a Digital-Noiseless, Ultra-High-Speed Image Sensor

Authors: V. T. S. Dao, T. G. Etoh, C. Vo Le, H. D. Nguyen, K. Takehara, T. Akino, K. Nishi

Abstract:

Since 2004, we have been developing an in-situ storage image sensor (ISIS) that captures more than 100 consecutive images at a frame rate of 10 Mfps with ultra-high sensitivity, as well as the video camera for use with this ISIS. Currently, basic research is continuing in an attempt to increase the frame rate up to 100 Mfps and above. In order to suppress electromagnetic noise at such high frequencies, a digital-noiseless imaging transfer scheme has been developed that utilizes solely sinusoidal driving voltages. This paper presents efficient yet accurate expressions to estimate the attenuation as well as the phase delay of the driving voltages through the RC networks of an ultra-high-speed image sensor. The Elmore metric for a fundamental RC chain is employed as the first-order approximation. By applying dimensional analysis to SPICE data, we found a simple expression that significantly improves the accuracy of the approximation. Similarly, another simple closed-form model to estimate the phase delay through fundamental RC networks is obtained. The estimation error of both expressions is much smaller than in previous works, below 2% in most cases. The framework of this analysis can be extended to address similar issues in other VLSI structures.
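
For reference, the first-order approximation named above, the Elmore delay of an RC chain, can be computed directly. The sketch below uses arbitrary per-stage element values and only illustrates the classical metric; the improved closed-form expressions derived from dimensional analysis of SPICE data are not reproduced here.

```python
import numpy as np

def elmore_delay(R, C, k):
    """Elmore delay from the driver to node k (1-indexed) of an RC chain:
    tau_k = sum_i C_i * (resistance shared by the paths to node i and node k)."""
    R, C = np.asarray(R, float), np.asarray(C, float)
    cum_R = np.cumsum(R)                      # resistance from the driver to node j
    return sum(C[i] * cum_R[min(i, k - 1)] for i in range(len(C)))

# Illustrative 10-stage chain with arbitrary values.
R = [120.0] * 10     # ohms per stage
C = [15e-15] * 10    # farads per stage
print(f"Elmore delay to the last node: {elmore_delay(R, C, 10) * 1e12:.2f} ps")
```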

Keywords: Dimensional Analysis, ISIS, Digital-noiseless, RC network, Attenuation, Phase Delay, Elmore model

813 Designing of Full Adder Using Low Power Techniques

Authors: Shashank Gautam

Abstract:

This paper applies low-power techniques such as MTCMOS, power gating, dual stack, GALEOR and LECTOR to reduce leakage power. A full adder has been designed using these techniques, and its power dissipation is calculated and compared with that of a full adder in conventional CMOS logic. Simulation results show that the proposed techniques effectively reduce power dissipation and increase the operating speed of the circuits to a large extent.
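
For context, the logic function realized by the 1-bit full adder cell is the same regardless of the leakage-reduction technique used to implement it: sum = a XOR b XOR cin and carry = a·b + cin·(a XOR b). A minimal behavioral sketch (not a transistor-level model of any of the listed techniques):

```python
def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
    """Behavioral model of a 1-bit full adder: returns (sum, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

# Exhaustive check of the truth table.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            print(a, b, cin, "->", full_adder(a, b, cin))
```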

Keywords: Low Power, MT CMOS, Galeor, Lector, Power Gating, Dual Stack, Full Adder.

812 Estimation of Crustal Thickness within the Sokoto Basin North-Western Nigeria Using Bouguer Gravity Anomaly Data

Authors: T. T. Olugbenga, A. I. Augie

Abstract:

This research presents an interpretation of the Bouguer gravity anomaly data of parts of the Sokoto basin for the estimation of crustal thickness. The study area is bounded by latitudes 11°00′0″N and 13°00′0″N and longitudes 4°00′0″E and 6°00′0″E, covering Koko, Jega, B/Kebbi, Argungu, Lema, Bodinga, Tamgaza, Gunmi, Daki Takwas, Dange, Sokoto, Ilella, T/Mafara, Anka, Maru, Gusau, K/Namoda and Sabon Birni within Sokoto, Kebbi and Zamfara states, respectively. The established map of the study area was digitized in X, Y and Z format using the Excel software package, and the digitized data were processed using Surfer version 13 software. The Moho and Conrad depths, and hence the crustal thickness, were estimated from the Bouguer gravity anomaly as 35 to 37 km and 19 to 21 km, respectively. The crustal region has been categorized into a crustal thinning zone, the region with high gravity anomaly values due to its greater geothermal energy, and a crustal thickening zone, the region with low anomaly values due to its lower geothermal energy. Birnin Kebbi, Jega and Sokoto were identified as regions of hydrocarbon potential, with an estimated 35 km crustal thickness; this corresponds to crustal thickening, a result of geothermal energy that is low but still sufficient to decompose organic matter within the region to form hydrocarbons.
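
The gridding step described above (digitized X, Y, Z values interpolated onto a regular grid, done in the study with Surfer 13) can be reproduced with an open-source sketch; the file name, column layout and grid spacing below are assumptions, not the study's actual data.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical digitized Bouguer anomaly data: longitude, latitude, anomaly (mGal).
xyz = np.loadtxt("bouguer_digitized.csv", delimiter=",", skiprows=1)
lon, lat, anomaly = xyz[:, 0], xyz[:, 1], xyz[:, 2]

# Regular grid covering roughly 4-6 deg E and 11-13 deg N (the study area).
grid_lon, grid_lat = np.meshgrid(np.linspace(4, 6, 200), np.linspace(11, 13, 200))
grid_anomaly = griddata((lon, lat), anomaly, (grid_lon, grid_lat), method="cubic")

# Low anomaly values suggest crustal thickening, high values crustal thinning.
print("Gridded anomaly range:", np.nanmin(grid_anomaly), np.nanmax(grid_anomaly))
```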

Keywords: Bouguer gravity anomaly, crustal thickness, geothermal energy, hydrocarbons, Moho and Conrad Depths.

811 Evolutionary Design of Polynomial Controller

Authors: R. Matousek, S. Lang, P. Minar, P. Pivonka

Abstract:

In control theory, one attempts to find a controller that provides the best possible performance with respect to some given measures of performance. There are many kinds of controllers, e.g. the typical PID controller, the LQR controller, fuzzy controllers, etc. This paper introduces a polynomial controller with a novel tuning method based on a special pole-placement encoding scheme and optimization by Genetic Algorithms (GA). Examples show the performance of the designed polynomial controller in comparison with a common PID controller.
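
A minimal sketch of GA-based tuning under simplified assumptions: here each chromosome encodes the controller polynomial coefficients directly (the paper uses a special pole-placement encoding), and fitness measures how closely the closed-loop characteristic polynomial A·R + B·S matches a desired one. The plant polynomials and desired poles are illustrative, not taken from the paper.

```python
import numpy as np

A = np.array([1.0, 3.0, 2.0])        # plant denominator (illustrative)
B = np.array([1.0, 0.5])             # plant numerator (illustrative)
D = np.poly([-2, -3, -4, -5])        # desired closed-loop poles -> characteristic polynomial

def char_poly(theta):
    """Closed-loop characteristic polynomial A*R + B*S for controller S/R,
    with chromosome theta = [r1, r2, s0, s1, s2] and R = [1, r1, r2]."""
    R = np.concatenate(([1.0], theta[:2]))
    S = theta[2:]
    return np.polyadd(np.polymul(A, R), np.polymul(B, S))

def fitness(theta):
    return -np.sum((char_poly(theta) - D) ** 2)

rng = np.random.default_rng(1)
pop = rng.normal(scale=5.0, size=(60, 5))
for _ in range(300):                               # simple GA loop
    f = np.array([fitness(t) for t in pop])
    parents = pop[np.argsort(f)[::-1][:20]]        # truncation selection
    kids = []
    while len(kids) < len(pop) - len(parents):
        p1, p2 = parents[rng.integers(20, size=2)]
        alpha = rng.random()
        child = alpha * p1 + (1 - alpha) * p2      # arithmetic crossover
        child += rng.normal(scale=0.1, size=5)     # Gaussian mutation
        kids.append(child)
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(t) for t in pop])]
print("Tuned controller parameters:", np.round(best, 3))
```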

Keywords: Evolutionary design, Genetic algorithms, PID controller, Pole placement, Polynomial controller

810 Optimization by Means of Genetic Algorithm of the Equivalent Electrical Circuit Model of Different Order for Li-ion Battery Pack

Authors: V. Pizarro-Carmona, S. Castano-Solis, M. Cortés-Carmona, J. Fraile-Ardanuy, D. Jimenez-Bermejo

Abstract:

The purpose of this article is to optimize the Equivalent Electrical Circuit Model (EECM) of different orders to obtain greater precision in the modeling of Li-ion battery packs. The optimization considers circuits based on 1RC, 2RC and 3RC networks, with a dependent voltage source and a series resistor. The parameters are obtained experimentally using tests in the time domain and in the frequency domain. Due to the highly non-linear behavior of the battery pack, a Genetic Algorithm (GA) was used to solve for and optimize the parameters of each EECM considered (1RC, 2RC and 3RC). The objective of the estimation is to minimize the mean square error between the impedance measured on the real battery pack and that generated by the simulation of the different proposed circuit models. The results have been verified by comparing the Nyquist plots of the estimated complex impedance of the pack. As a result of the optimization, the 2RC and 3RC circuit alternatives are considered viable for representing the battery behavior. These battery pack models are experimentally validated using a hardware-in-the-loop (HIL) simulation platform that reproduces the well-known New York City Cycle (NYCC) and Federal Test Procedure (FTP) driving cycles for electric vehicles. The results show that GA optimization yields EECMs with 2RC or 3RC networks that represent the dynamic behavior of a battery pack in vehicular applications with high precision.
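
A sketch of the parameter-estimation idea for the 2RC case, under the stated objective of minimizing the squared error between measured and modelled impedance. SciPy's differential evolution is used here as a stand-in for the GA, the "measured" impedance is synthetic, and the parameter values and bounds are illustrative.

```python
import numpy as np
from scipy.optimize import differential_evolution

def z_2rc(params, w):
    """Complex impedance of a series resistor plus two parallel RC networks."""
    r0, r1, c1, r2, c2 = params
    return r0 + r1 / (1 + 1j * w * r1 * c1) + r2 / (1 + 1j * w * r2 * c2)

# Synthetic "measured" impedance of a pack (true parameters are illustrative).
w = 2 * np.pi * np.logspace(-2, 3, 100)
true = [0.010, 0.005, 800.0, 0.008, 5000.0]       # ohms and farads
z_meas = z_2rc(true, w) + 1e-5 * np.random.default_rng(0).normal(size=w.size)

def mse(params):                                   # objective: mean squared error
    return np.mean(np.abs(z_2rc(params, w) - z_meas) ** 2)

bounds = [(1e-4, 0.1), (1e-4, 0.1), (1, 1e4), (1e-4, 0.1), (1, 1e4)]
result = differential_evolution(mse, bounds, seed=1, tol=1e-12)
print("Estimated 2RC parameters:", result.x)
```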

Keywords: Li-ion battery pack modeling, optimization, EECM, GA, electric vehicle applications.

809 Analysis of Security Vulnerabilities for Mobile Health Applications

Authors: Y. Cifuentes, L. Beltrán, L. Ramírez

Abstract:

The number of mobile applications for health care available through the different mobile app stores is increasing daily. Alongside this growth, the number of hacking attacks has also increased, in particular against medical mobile applications. Security vulnerabilities in medical mobile apps can be caused by errors in code, incorrect logic and poor design, among other factors, and are usually exploited by malicious attackers to steal or modify the users' information. The aim of this research is to analyze the vulnerabilities detected in mobile medical apps according to the risk factor standards defined by OWASP in 2014.

Keywords: mHealth apps, OWASP, protocols, security vulnerabilities, risk factors.

808 Pattern Recognition of Biological Signals

Authors: Paulo S. Caparelli, Eduardo Costa, Alexsandro S. Soares, Hipolito Barbosa

Abstract:

This paper presents an evolutionary method for designing electronic circuits and numerical methods associated with monitoring systems. The instruments described here have been used in studies of weather and climate changes due to global warming, and also in medical patient supervision. Genetic Programming systems have been used both for designing circuits and sensors, and also for determining sensor parameters. The authors advance the thesis that the software side of such a system should be written in computer languages with a strong mathematical and logic background in order to prevent software obsolescence, and achieve program correctness.

Keywords: Pattern recognition, evolutionary computation, biological signal, functional programming.

807 Exploring Socio-Economic Barriers of Green Entrepreneurship in Iran and Their Interactions Using Interpretive Structural Modeling

Authors: Younis Jabarzadeh, Rahim Sarvari, Negar Ahmadi Alghalandis

Abstract:

Entrepreneurship at both the individual and the organizational level is one of the main driving forces in economic development; it leads to growth and competition, job generation and social development. Especially in developing countries, the role of entrepreneurship in economic and social prosperity is strongly emphasized. But the effect of global economic development on the environment is undeniable, especially in negative ways, and there is a need to rethink current business models and the way entrepreneurs act when introducing new businesses, so that environmental issues are addressed and embedded in order to achieve sustainable development. In this paper, green or sustainable entrepreneurship is addressed in Iran to identify the challenges and barriers entrepreneurs in the economic and social sectors face in developing green business solutions. Sustainable or green entrepreneurship has been gaining interest among scholars in recent years, and addressing its challenges and barriers needs much more attention to fill the gap in the literature and facilitate the path those entrepreneurs are pursuing. This research comprises two main phases: qualitative and quantitative. In the qualitative phase, after a thorough literature review, the fuzzy Delphi method is utilized to verify those challenges and barriers by gathering a panel of experts and surveying them. In this phase, several other contextually related factors were added to the list of barriers and challenges identified in the literature. Then, in the quantitative phase, Interpretive Structural Modeling is applied to construct a network of interactions among the barriers identified in the previous phase. Again, a panel of subject matter experts composed of academic and industry specialists was surveyed. The results of this study can be used by policymakers in both the public sector and industry to introduce more systematic solutions to eliminate those barriers and to help entrepreneurs overcome the challenges of sustainable entrepreneurship. It also contributes to the literature as the first study of this type that deals with the barriers to sustainable entrepreneurship and explores their interaction.
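
The core computational step of Interpretive Structural Modeling, deriving the final reachability matrix from the expert-assessed initial matrix by enforcing transitivity, can be sketched as follows. The 5x5 matrix is purely illustrative and does not represent the study's barrier data.

```python
import numpy as np

def final_reachability(initial):
    """Warshall-style transitive closure of the initial reachability matrix
    (entry [i, j] = 1 means barrier i influences barrier j), as used in ISM."""
    m = np.array(initial, dtype=bool)
    n = len(m)
    for k in range(n):
        m = m | (m[:, k:k + 1] & m[k:k + 1, :])   # i->k and k->j implies i->j
    return m.astype(int)

# Illustrative initial reachability matrix for five hypothetical barriers.
initial = [[1, 1, 0, 0, 0],
           [0, 1, 1, 0, 0],
           [0, 0, 1, 1, 0],
           [0, 0, 0, 1, 1],
           [0, 0, 0, 0, 1]]
final = final_reachability(initial)
print(final)
print("Driving power per barrier:", final.sum(axis=1))
```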

Keywords: Green entrepreneurship, barriers, Fuzzy Delphi Method, interpretive structural modeling.

806 Some Considerations on UML Class Diagram Formalisation Approaches

Authors: Abdullah A. H. Alzahrani, Majd Zohri Yafi, Fawaz K. Alarfaj

Abstract:

The Unified Modelling Language (UML) is a software modelling language that is widely used and accepted. One significant drawback, however, is that the language lacks formality. This makes carrying out any type of rigorous analysis a difficult process. Many researchers have introduced approaches to formalise UML diagrams. However, it is always hard to decide which language and/or approach to use. Therefore, in this paper, we highlight some of the advantages and disadvantages of a number of those approaches. We also compare counterpart approaches. In addition, we draw some guidelines to help in choosing a suitable approach. Special attention is given to the formalisation of the static aspects of UML shown in class diagrams.

Keywords: UML formalisation, Object Constraints Language (OCL), Description Logic (DL), Z language.

805 Extended Deductive Databases with Uncertain Information

Authors: Daniel Stamate

Abstract:

The paper presents an approach for handling uncertain information in deductive databases using multivalued logics. Uncertainty means that database facts may be assigned logical values other than the conventional ones, true and false. The logical values represent various degrees of truth, which may be combined and propagated by applying the database rules. A corresponding multivalued database semantics is defined. We show that it extends successful conventional semantics such as the well-founded semantics, and that it has polynomial time data complexity.
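
As an informal illustration only (the paper defines its own multivalued semantics; the min/max combination used below is just one common convention for degrees of truth), logical values can be propagated through rules until a fixpoint is reached:

```python
# Facts carry degrees of truth in [0, 1]; each rule derives a head atom from body atoms.
facts = {"reliable_source(a)": 0.8, "reports(a, event)": 0.6}
rules = [("likely(event)", ["reliable_source(a)", "reports(a, event)"])]

def fixpoint(facts, rules):
    """Iterate rule application: body degree = min of the body atoms (conjunction),
    head degree = max over derivations (disjunction), until nothing changes."""
    truth = dict(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            degree = min(truth.get(atom, 0.0) for atom in body)
            if degree > truth.get(head, 0.0):
                truth[head] = degree
                changed = True
    return truth

print(fixpoint(facts, rules))   # 'likely(event)' receives degree min(0.8, 0.6) = 0.6
```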

Keywords: Reasoning under uncertainty, multivalued logics, deductive databases, logic programs, multivalued semantics.

804 Engineering Photodynamic with Radioactive Therapeutic Systems for Sustainable Molecular Polarity: Autopoiesis Systems

Authors: Moustafa Osman Mohammed

Abstract:

This paper introduces Luhmann's autopoietic social systems, starting with the original concept of autopoiesis developed by biologists and scientists, including the modification of general systems based on socialized medicine. A specific type of autopoietic system is explained for the three existing groups of ecological phenomena: interaction, social and medical sciences. This hypothesis model, nevertheless, has a nonlinear interaction with its natural environment, an 'interactional cycle' for the exchange of photon energy with molecules without any change in topology. The external forces in the system's environment might be concomitant with the influence of natural fluctuations (e.g. radioactive radiation, electromagnetic waves). The cantilever sensor provides insights for a future chip processor for the prevention of social metabolic systems. Thus, circuits with resonant electric and optical properties are prototyped on board as an intra-chip/inter-chip transmission, operating at approximately 1.7 mA at 3.3 V, to serve detection in locomotion with the least significant power losses. Nowadays, therapeutic systems assimilate materials from embryonic stem cells to aggregate multiple functions of the vessels' natural de-cellular structure for replenishment. The interior actuators deploy base-pair complementarity of nucleotides for the symmetric arrangement, in particular bacterial nanonetworks of the sequence cycle, creating double-stranded DNA strings. The DNA strands must be sequenced, assembled and decoded in order to reconstruct the original source reliably. The exterior actuators have the ability to sense different variations in the corresponding patterns regarding beat-to-beat heart rate variability (HRV) for spatial autocorrelation of molecular communication, which consists of human electromagnetic, piezoelectric, electrostatic and electrothermal energy, in order to monitor and transfer the dynamic changes of all the cantilevers simultaneously in a real-time workspace with high precision. A prototype-enabled dynamic energy sensor has been investigated in the laboratory for the inclusion of nanoscale devices in the architecture, with a fuzzy logic control for the detection of thermal and electrostatic changes and with optoelectronic devices to interpret the uncertainty associated with signal interference. Ultimately, the controversial aspect of molecular frictional properties is adjusted so that the molecules form their unique spatial structure modules, providing the environment's mutual contribution to the investigation of mass temperature changes due to the pathogenic archival architecture of clusters.

Keywords: Autopoiesis, quantum photonics, portable energy, photonic structure, photodynamic therapeutic system.

803 Social Media as a ‘Service’ for Value Co-Creation by Integrating Sponsoring Companies, Sports Entities and Fans

Authors: Harri Jalonen

Abstract:

Social media has changed the ways we communicate, collaborate and connect with each other. It has also influenced our habits of consuming sports. Social media has allowed direct interaction between sponsoring companies, athletes/players and fans. Drawing on the service-dominant logic of value co-creation, this conceptual paper identifies three operant resources which are beneficial for value co-creation: i) social identity and sense of community, ii) congruence and brand personality, and iii) participatory culture and fan activation. The paper contributes to the theoretical discussion on how social media can be used for value co-creation purposes in the sports industry.

Keywords: Sport, value co-creation, social media, service.

802 Array Data Transformation for Source Code Obfuscation

Authors: S. Praveen, P. Sojan Lal

Abstract:

Obfuscation is a low-cost software protection methodology to avoid reverse engineering and re-engineering of applications. Source code obfuscation aims at obscuring the source code to hide the functionality of the code. This paper proposes an array data transformation in order to obfuscate source code which uses arrays. Applications using the proposed data structures force the programmer to obscure the logic manually. This makes the resulting obscured code hard to reverse engineer and also protects the functionality of the code.
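
One classical array data transformation of this kind is index remapping: elements are stored at permuted positions so the access pattern no longer reveals the original logic. The affine map below is an illustrative stand-in, not the specific transformation proposed in the paper.

```python
N = 16                  # array length; must be coprime with the multiplier below
A_MUL, A_ADD = 5, 7     # affine index-remapping constants (illustrative)

def obf_index(i: int) -> int:
    """Map a logical index to its obfuscated storage position."""
    return (A_MUL * i + A_ADD) % N

# Writing and reading through the transformed index hides the real layout.
storage = [0] * N
for i in range(N):
    storage[obf_index(i)] = i * i            # logically: data[i] = i*i

print([storage[obf_index(i)] for i in range(N)])   # recovers 0, 1, 4, 9, ...
```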

Keywords: Reverse Engineering, Source Code Obfuscation.

801 The Countabilities of Soft Topological Spaces

Authors: Weijian Rong

Abstract:

Soft topological spaces are considered as mathematical tools for dealing with uncertainties, and a fuzzy topological space is a special case of the soft topological space. The purpose of this paper is to study soft topological spaces. We introduce some new concepts in soft topological spaces such as soft first-countable spaces, soft second-countable spaces and soft separable spaces, and some basic properties of these concepts are explored.

Keywords: soft sets, soft first-countable spaces, soft second-countable spaces, soft separable spaces, soft Lindelöf.

800 Adaptive Kalman Filter for Noise Estimation and Identification with Bayesian Approach

Authors: Farhad Asadi, S. Hossein Sadati

Abstract:

The Bayesian approach can be used for parameter identification and extraction in state space models, and its ability to analyze sequences of data in dynamical systems has been demonstrated in the literature. In this paper, an adaptive Kalman filter with a Bayesian approach for identification of the measurement noise variance is developed. It is then applied to estimation of the dynamical state and measurement data in a discrete linear dynamical system. At each time step, the algorithm estimates the measurement noise variance and the state of the system with a Kalman filter. An approximation is then designed at each step separately, and consequently the sufficient statistics of the state and the noise variances are computed with a fixed-point iteration of an adaptive Kalman filter. Several simulations are presented to show the influence of the measurement noise variance on the algorithm. First, the effect of the noise variance and its distribution on detection and identification performance is simulated in a Kalman filter without the Bayesian formulation. Then, the simulation is applied to the adaptive Kalman filter with the ability to track the noise variance in the measurement data. In these simulations, the influence of the noise distribution of the measurement data at each step is estimated, and the true variance of the data obtained by the algorithm is compared across different scenarios. Afterwards, a typical nonlinear state space model with induced measurement noise is simulated with this approach. Finally, the performance and the important limitations of this algorithm in these simulations are discussed.
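
A simplified scalar sketch of the idea: a Kalman filter whose measurement-noise variance R is re-estimated online from the innovations. The adaptation rule below is plain innovation-based exponential smoothing, used here as a stand-in for the paper's Bayesian fixed-point iteration; all model values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_R, Q = 400, 0.5, 1e-3
x_true = np.cumsum(rng.normal(scale=np.sqrt(Q), size=n))     # random-walk state
z = x_true + rng.normal(scale=np.sqrt(true_R), size=n)       # noisy measurements

x, P, R_hat, alpha = 0.0, 1.0, 1.0, 0.02   # state, covariance, noise guess, smoothing
for k in range(n):
    P_pred = P + Q                          # predict (random-walk state model)
    innov = z[k] - x                        # innovation
    K = P_pred / (P_pred + R_hat)           # Kalman gain
    x += K * innov
    P = (1 - K) * P_pred
    # Innovation-based adaptation: E[innov^2] = P_pred + R, so track R by
    # smoothing innov^2 - P_pred (the paper uses a Bayesian fixed point instead).
    R_hat = (1 - alpha) * R_hat + alpha * max(innov**2 - P_pred, 1e-6)

print(f"Estimated measurement-noise variance: {R_hat:.3f} (true value {true_R})")
```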

Keywords: adaptive filtering, Bayesian approach, Kalman filtering, variance tracking.

799 Semantic Web Agent Communication Capable of Reasoning with Ontology and Agent Locations

Authors: Visit Hirankitti, Vuong Tran Xuan

Abstract:

Multi-agent communication of Semantic Web information cannot be realized without the ability to reason with ontology and agent locations. This is because for an agent to be able to reason with an external Semantic Web ontology, it must know where and how to access that ontology. Similarly, for an agent to be able to communicate with another agent, it must know where and how to send a message to that agent. In this paper we propose a framework for an agent which can reason with ontology and agent locations in order to perform reasoning with multiple distributed ontologies and to communicate with other agents on the Semantic Web. The agent framework and its communication mechanism are formulated entirely in meta-logic.

Keywords: Semantic Web, agent communication, ontologies.

798 Estimation of Relative Permeabilities and Capillary Pressures in Shale Using Simulation Method

Authors: F. C. Amadi, G. C. Enyi, G. Nasr

Abstract:

Relative permeabilities are practical factors that are used to correct the single-phase Darcy's law for application to multiphase flow. For effective characterisation of large-scale multiphase flow in hydrocarbon recovery, relative permeabilities and capillary pressures are used. These parameters are acquired via special core flooding experiments, and the special core analysis (SCAL) module of reservoir simulation is applied by engineers for their evaluation. However, core flooding experiments on shale core samples are expensive and time consuming before the various flow assumptions, for instance Darcy's law, are satisfied. This motivates the application of core flooding simulations, in which analyses of the relative permeabilities and capillary pressures of multiphase flow can be carried out efficiently and effectively at a reasonable pace. This paper presents a Sendra software simulation of core flooding to obtain relative permeabilities and capillary pressures using different correlations. The approach used in this study comprised three steps. First, the basic petrophysical parameters of the Marcellus shale sample, such as porosity, were determined using laboratory techniques. Second, core flooding was simulated for a particular injection scenario using different correlations. Third, the best-fit correlations for the estimation of relative permeability and capillary pressure were obtained. This approach saves cost and time and is very reliable for the computation of relative permeabilities and capillary pressures for steady- or unsteady-state, drainage or imbibition processes in the oil and gas industry when compared to other methods.
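
For reference, one of the standard correlations that such simulators fit, the Brooks-Corey model, relates effective saturation to capillary pressure and relative permeabilities as sketched below. The parameter values are illustrative, not fitted Marcellus data, and the paper does not necessarily use these exact exponent forms.

```python
import numpy as np

def brooks_corey(sw, swi=0.2, sor=0.15, pe=15.0, lam=2.0):
    """Brooks-Corey capillary pressure and relative permeabilities.
    sw: water saturation, pe: entry pressure (psi), lam: pore-size index."""
    se = np.clip((sw - swi) / (1 - swi - sor), 1e-6, 1.0)   # effective saturation
    pc = pe * se ** (-1.0 / lam)                             # capillary pressure
    krw = se ** ((2 + 3 * lam) / lam)                        # wetting phase
    krnw = (1 - se) ** 2 * (1 - se ** ((2 + lam) / lam))     # non-wetting phase
    return pc, krw, krnw

sw = np.linspace(0.25, 0.8, 6)
for s, (pc, krw, krnw) in zip(sw, zip(*brooks_corey(sw))):
    print(f"Sw={s:.2f}  Pc={pc:6.2f}  krw={krw:.3f}  krnw={krnw:.3f}")
```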

Keywords: Special core analysis (SCAL), relative permeability, capillary pressures, drainage, imbibition.

797 Improved Estimation of Evolutionary Spectrum based on Short Time Fourier Transforms and Modified Magnitude Group Delay by Signal Decomposition

Authors: H K Lakshminarayana, J S Bhat, H M Mahesh

Abstract:

A new estimator for the evolutionary spectrum (ES) based on the short time Fourier transform (STFT) and the modified group delay function (MGDF) with signal decomposition (SD) is proposed. The STFT, due to its built-in averaging, suppresses the cross terms, and the MGDF preserves the frequency resolution of the rectangular window with a reduction in the Gibbs ripple. The present work overcomes the magnitude distortion observed in multi-component non-stationary signals when estimating the ES with the STFT and MGDF, by using SD. The SD is achieved either through the discrete cosine transform based harmonic wavelet transform (DCTHWT) or through perfect reconstruction filter banks (PRFB). The MGDF also improves the signal-to-noise ratio by removing associated noise. The performance of the present method is illustrated for cross chirp and frequency shift keying (FSK) signals, and indicates that its performance is better than that of STFT-MGDF (STFT-GD) alone. Further, its noise immunity is better than that of the STFT. The SD-based methods, however, cannot bring out the frequency transition path from band to band clearly, as there will be a gap in the contour plot at the transition. The PRFB-based STFT-SD shows better performance than the DCTHWT decomposition method for STFT-GD.
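
The STFT building block of the estimator can be illustrated on a crossing-chirp test signal of the kind mentioned above; the MGDF and signal-decomposition stages of the proposed estimator are not reproduced here, and the signal parameters are arbitrary.

```python
import numpy as np
from scipy.signal import stft

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
# Two crossing chirps, a typical multi-component non-stationary test signal.
x = (np.cos(2 * np.pi * (50 * t + 50 * t**2))
     + np.cos(2 * np.pi * (250 * t - 50 * t**2)))

f, tt, Z = stft(x, fs=fs, nperseg=128, noverlap=96)
spectrogram = np.abs(Z) ** 2       # time-varying (evolutionary) spectral estimate
print(spectrogram.shape)           # (frequency bins, time frames)
```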

Keywords: Evolutionary Spectrum, Modified Group Delay, Discrete Cosine Transform, Harmonic Wavelet Transform, Perfect Reconstruction Filter Banks, Short Time Fourier Transform.

796 Statistical Assessment of Models for Determination of Soil – Water Characteristic Curves of Sand Soils

Authors: S. J. Matlan, M. Mukhlisin, M. R. Taha

Abstract:

Characterization of the engineering behavior of unsaturated soil depends on the soil-water characteristic curve (SWCC), a graphical representation of the relationship between water content or degree of saturation and soil suction. A reasonable description of the SWCC is thus important for the accurate prediction of unsaturated soil parameters. The measurement procedures for determining the SWCC, however, are difficult, expensive, and time-consuming. During the past few decades, researchers have laid a major focus on developing empirical equations for predicting the SWCC, and a large number of empirical models have been suggested. One of the most crucial questions is how precisely the existing equations can represent the SWCC. As different models have different ranges of capability, it is essential to evaluate the precision of the SWCC models used for each particular soil type for better SWCC estimation. It is expected that better estimation of the SWCC would be achieved via a thorough statistical analysis of its distribution within a particular soil class. With this in view, a statistical analysis was conducted in order to evaluate the reliability of the SWCC prediction models against laboratory measurements. Optimization techniques were used to obtain the best fit of the model parameters in four forms of the SWCC equation, using laboratory data for a relatively coarse-textured (i.e., sandy) soil. The four most prominent SWCC models were evaluated and computed for each sample. The results show that the Brooks and Corey model is the most consistent in describing the SWCC for the sand soil type. The Brooks and Corey model predictions were also compatible with the samples evaluated in this study, which ranged from low to high soil water content.
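
A sketch of the curve-fitting step for one of the models, the Brooks and Corey SWCC, using nonlinear least squares; the suction/water-content data points, initial guesses and bounds are hypothetical, not the laboratory data of this study.

```python
import numpy as np
from scipy.optimize import curve_fit

def brooks_corey_swcc(psi, theta_r, theta_s, psi_b, lam):
    """Brooks-Corey SWCC: theta = theta_r + (theta_s - theta_r)*(psi_b/psi)**lam
    for psi > psi_b, and theta = theta_s below the air-entry suction psi_b."""
    psi = np.asarray(psi, float)
    se = np.where(psi > psi_b, (psi_b / psi) ** lam, 1.0)
    return theta_r + (theta_s - theta_r) * se

# Hypothetical laboratory points for a sandy soil: suction (kPa) vs. water content.
psi = np.array([1, 2, 4, 8, 16, 32, 64, 128])
theta = np.array([0.38, 0.37, 0.33, 0.25, 0.17, 0.11, 0.08, 0.06])

p0 = [0.05, 0.38, 2.0, 0.7]                              # initial guesses
bounds = ([0.0, 0.2, 0.1, 0.1], [0.2, 0.5, 10.0, 5.0])   # keep parameters physical
params, _ = curve_fit(brooks_corey_swcc, psi, theta, p0=p0, bounds=bounds)
print("theta_r, theta_s, psi_b, lambda =", np.round(params, 3))
```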

Keywords: Soil-water characteristic curve (SWCC), statistical analysis, unsaturated soil.

795 Modified Montgomery for RSA Cryptosystem

Authors: Rupali Verma, Maitreyee Dutta, Renu Vig

Abstract:

Encryption and decryption in RSA are done by modular exponentiation, which is achieved by repeated modular multiplication. Hence, the efficiency of modular multiplication directly determines the efficiency of the RSA cryptosystem. This paper designs a modified Montgomery modular multiplication in which the addition of operands is computed by a 4:2 compressor. The basic logic operations in the addition are partitioned over two iterations such that parallel computations are performed. This reduces the critical path delay of the proposed Montgomery design. The proposed design and RSA are implemented on Virtex 2 and Virtex 5 FPGAs. The two factors, partitioning and parallelism, have improved the frequency and throughput of the proposed design.
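
For readers unfamiliar with the underlying operation, a plain-Python sketch of Montgomery reduction and multiplication is given below. The hardware design in the paper additionally restructures the operand additions with a 4:2 compressor and partitions them over two iterations, which is not modelled here; the tiny modulus is purely illustrative.

```python
def montgomery_setup(n, k):
    """Precompute constants for an odd modulus n and R = 2**k > n."""
    r = 1 << k
    n_prime = (-pow(n, -1, r)) % r        # n * n_prime ≡ -1 (mod R)
    return r, n_prime

def redc(t, n, r, k, n_prime):
    """Montgomery reduction: returns t * R^{-1} mod n for 0 <= t < n*R."""
    m = ((t & (r - 1)) * n_prime) & (r - 1)
    u = (t + m * n) >> k
    return u - n if u >= n else u

def mont_mul(a_bar, b_bar, n, r, k, n_prime):
    """Product of two operands already in Montgomery form (x_bar = x*R mod n)."""
    return redc(a_bar * b_bar, n, r, k, n_prime)

# Tiny illustrative example (real RSA moduli are 2048 bits or more).
n, k = 101, 8
r, n_prime = montgomery_setup(n, k)
a, b = 45, 76
a_bar, b_bar = (a * r) % n, (b * r) % n
product = redc(mont_mul(a_bar, b_bar, n, r, k, n_prime), n, r, k, n_prime)
print(product, (a * b) % n)               # both print 45*76 mod 101
```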

Keywords: RSA, Montgomery modular multiplication, 4:2 compressor, FPGA.

794 Estimation and Removal of Chlorophenolic Compounds from Paper Mill Waste Water by Electrochemical Treatment

Authors: R. Sharma, S. Kumar, C. Sharma

Abstract:

A number of toxic chlorophenolic compounds are formed during pulp bleaching. The nature and concentration of these chlorophenolic compounds largely depend upon the amount and nature of the bleaching chemicals used. These compounds are highly recalcitrant and difficult to remove, and are only partially removed by the biochemical treatment processes adopted by the paper industry. Identification and estimation of these chlorophenolic compounds has been carried out in the primary and secondary clarified effluents from the paper mill by GC-MS. Twenty-six chlorophenolic compounds have been identified and estimated in the paper mill waste waters. Electrochemical treatment is an efficient method for the oxidation of pollutants and has successfully been used to treat textile and oil waste water. Electrochemical treatment using a less expensive anode material, stainless steel electrodes, has been tried to study their removal. The electrochemical assembly comprised a DC power supply, a magnetic stirrer and stainless steel (316 L) electrodes. The operating conditions were optimized and the treatment was performed under the optimized conditions. Results indicate that 68.7% and 83.8% of the chlorophenolic compounds are removed during 2 h of electrochemical treatment from the primary and secondary clarified effluents, respectively. Further, there is a reduction of 65.1%, 60% and 92.6% in COD, AOX and color, respectively, for the primary clarified effluent, and of 83.8%, 75.9% and 96.8% in COD, AOX and color, respectively, for the secondary clarified effluent. Electrochemical treatment has also been found to significantly increase the biodegradability index of the wastewater because of the conversion of the non-biodegradable fraction into a biodegradable fraction. Thus, electrochemical treatment is an efficient method for the degradation of chlorophenolic compounds and for the removal of color, AOX and other recalcitrant organic matter present in paper mill waste water.

Keywords: Chlorophenolics, effluent, electrochemical treatment, wastewater.

793 Stature Estimation Using Foot and Shoeprint Length of Malaysian Population

Authors: M. Khairulmazidah, A. B. Nurul Nadiah, A. R. Rumiza

Abstract:

Formulation of the biological profile is one of the modern roles of the forensic anthropologist. The present study was conducted to estimate height using foot and shoeprint length in a Malaysian population. The present work can provide very useful information for the identification of individuals in forensic cases based on shoeprint evidence; it can help to narrow down suspects and ease police investigation. Besides, stature is an important parameter in determining the partial identity of unidentified and mutilated bodies. Thus, this study can help with the problems encountered in cases of mass disasters, massacres, explosions and assaults, where it is very hard to identify body parts because the bodies are dismembered and unrecognizable. Samples in this research were collected from 200 Malaysian adults (100 males and 100 females) aged from 20 to 45 years. Shoeprint lengths were measured from prints made with flat shoes. Other information such as gender, foot length and height of the subjects was also recorded. The data were analyzed using the IBM® SPSS Statistics 19 software. Results indicated that foot length has a stronger correlation with stature than shoeprint length for both sides of the feet. However, the pooled sample, in which gender was treated as undetermined, showed a better correlation for both the foot length and the shoeprint length parameters than males and females analyzed separately. In addition, prediction equations were developed to estimate stature using linear regression analysis of foot length and shoeprint length; foot length gives a better prediction than shoeprint length.
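
The prediction-equation step amounts to simple linear regression of stature on foot (or shoeprint) length. The sketch below uses hypothetical measurements; the actual Malaysian regression coefficients are those reported in the paper, not these.

```python
import numpy as np

# Hypothetical (foot length, stature) pairs in centimetres.
foot = np.array([23.1, 24.0, 24.8, 25.5, 26.3, 27.0, 27.8])
stature = np.array([155.0, 159.5, 163.0, 167.2, 170.1, 174.0, 178.3])

slope, intercept = np.polyfit(foot, stature, 1)      # least-squares regression
r = np.corrcoef(foot, stature)[0, 1]                 # correlation with stature

print(f"stature ≈ {slope:.2f} * foot_length + {intercept:.2f}  (r = {r:.3f})")
print("Predicted stature for a 26.0 cm foot:", slope * 26.0 + intercept)
```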

Keywords: Forensic anthropology, foot length, shoeprints, stature estimation.

792 An Efficient Method of Shot Cut Detection

Authors: Lenka Krulikovská, Jaroslav Polec

Abstract:

In this paper we present a method of abrupt cut detection with a novel logic of frame comparison. The actual frame is compared with its motion-estimated prediction instead of with the successive frame. Four different similarity metrics were employed to estimate the resemblance of the compared frames. The obtained results were evaluated by standard measures of test accuracy and compared with an existing approach. Based on the results, we claim the proposed method is more effective, and the Pearson correlation coefficient obtained the best results among the chosen similarity metrics.
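
The decision logic reduces to thresholding a similarity score between the actual frame and its prediction; a sketch using the Pearson correlation coefficient, with an illustrative threshold and synthetic frames in place of real video and motion estimation:

```python
import numpy as np

def is_abrupt_cut(frame, predicted, threshold=0.5):
    """Declare a shot cut when the actual frame correlates poorly with its
    motion-estimated prediction (Pearson correlation below a threshold)."""
    r = np.corrcoef(frame.ravel(), predicted.ravel())[0, 1]
    return r < threshold, r

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(120, 160)).astype(float)
good_prediction = frame + rng.normal(scale=5.0, size=frame.shape)      # same shot
unrelated_frame = rng.integers(0, 256, size=(120, 160)).astype(float)  # new shot

print(is_abrupt_cut(frame, good_prediction))   # (False, r close to 1)
print(is_abrupt_cut(frame, unrelated_frame))   # (True, r close to 0)
```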

Keywords: Abrupt cut, mutual information, shot cut detection, Pearson correlation coefficient.

791 A Computer Proven Application of the Discrete Logarithm Problem

Authors: Sebastian Kusch, Markus Kaiser

Abstract:

In this paper we analyze the application of a formal proof system to the discrete logarithm problem used in public-key cryptography. That is, we explore a computer verification of the ElGamal encryption scheme with the formal proof system Isabelle/HOL. More precisely, the functional correctness of this algorithm is formally verified with computer support. Besides, we present a formalization of the DSA signature scheme in the Isabelle/HOL system. We show that this scheme is correct, which is a necessary condition for the usefulness of any cryptographic signature scheme.
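
The scheme whose functional correctness is verified, ElGamal encryption, can be summarised executably. The toy parameters below are only for illustration (real deployments use large groups), and the formal Isabelle/HOL development is of course not reproduced here.

```python
import random

# Toy ElGamal over Z_p* (illustrative prime modulus and base element).
p, g = 467, 2
x = random.randrange(2, p - 1)          # private key
y = pow(g, x, p)                        # public key

def encrypt(m, y):
    k = random.randrange(2, p - 1)      # fresh ephemeral key per message
    return pow(g, k, p), (m * pow(y, k, p)) % p

def decrypt(c1, c2, x):
    s = pow(c1, x, p)                   # shared secret g^(k*x)
    return (c2 * pow(s, p - 2, p)) % p  # multiply by s^{-1} (Fermat's little theorem)

m = 123
c1, c2 = encrypt(m, y)
assert decrypt(c1, c2, x) == m          # functional correctness: Dec(Enc(m)) = m
```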

Keywords: Formal proof system, higher-order logic, formal verification, cryptographic signature scheme.

790 Novel Adaptive Channel Equalization Algorithms by Statistical Sampling

Authors: János Levendovszky, András Oláh

Abstract:

In this paper, novel statistical-sampling-based equalization techniques and CNN-based detection are proposed to increase the spectral efficiency of multiuser communication systems over fading channels. Multiuser communication combined with selective fading can result in interference which severely deteriorates the quality of service in wireless data transmission (e.g. CDMA in mobile communication). The paper introduces new equalization methods to combat interference by minimizing the Bit Error Rate (BER) as a function of the equalizer coefficients. This provides higher performance than traditional Minimum Mean Square Error equalization. Since the calculation of the BER as a function of the equalizer coefficients is of exponential complexity, statistical sampling methods are proposed to approximate the gradient, which yields fast equalization and superior performance compared to the traditional algorithms. Efficient estimation of the gradient is achieved by using stratified sampling and the Li-Silvester bounds. A simple mechanism is derived to identify the dominant samples in real time, for the sake of efficient estimation. The equalizer weights are adapted recursively by minimizing the estimated BER. The near-optimal performance of the new algorithms is also demonstrated by extensive simulations. The paper has also developed a Cellular Neural Network (CNN) based approach to detection. In this case fast quadratic optimization is carried out by the CNN, whereas the task of the equalizer is to ensure the required template structure (sparseness) for the CNN. The performance of the method has also been analyzed by simulations.
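
A minimal Monte Carlo sketch of the quantity being minimised, the BER as a function of the linear equalizer coefficients. Plain sampling is used; the stratified-sampling, Li-Silvester bounding and CNN detection machinery of the paper is not reproduced, and the channel, SNR and equalizer taps are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
channel = np.array([0.9, 0.4, 0.2])        # illustrative dispersive channel taps

def estimate_ber(eq, n_bits=20000, snr_db=12.0):
    """Monte Carlo estimate of the BER for BPSK sent through `channel` and then
    a linear equalizer with coefficients `eq` (plain sampling, not stratified)."""
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0                          # BPSK mapping
    rx = np.convolve(symbols, channel, mode="full")[:n_bits]
    rx += rng.normal(scale=10 ** (-snr_db / 20.0), size=n_bits)
    eq_out = np.convolve(rx, eq, mode="full")[:n_bits]
    decisions = (eq_out > 0).astype(int)
    return np.mean(decisions != bits)

print("BER, no equalizer   :", estimate_ber(np.array([1.0])))
print("BER, 3-tap equalizer:", estimate_ber(np.array([1.2, -0.5, 0.1])))
```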

Keywords: Cellular Neural Network, channel equalization, communication over fading channels, multiuser communication, spectral efficiency, statistical sampling.

789 Likelihood Estimation for Stochastic Epidemics with Heterogeneous Mixing Populations

Authors: Yilun Shang

Abstract:

We consider a heterogeneously mixing SIR stochastic epidemic process in populations described by a general graph. Likelihood theory is developed to facilitate statistical inference for the parameters of the model under complete observation. We show, by using a martingale central limit theorem, that these estimators are asymptotically unbiased and Gaussian.
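
A simulation sketch of the heterogeneously mixing SIR process on a general graph (the likelihood inference itself is not shown); the contact graph, rates and time step are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, gamma, dt = 200, 0.3, 0.1, 0.1        # nodes, infection/recovery rates, step
adj = rng.random((n, n)) < 0.05                # illustrative random contact graph
adj = np.triu(adj, 1)
adj = adj | adj.T                              # symmetric, no self-loops

# States: 0 = susceptible, 1 = infective, 2 = removed.
state = np.zeros(n, dtype=int)
state[rng.choice(n, 3, replace=False)] = 1     # three initial infectives

while (state == 1).any():
    infective = state == 1
    pressure = adj[:, infective].sum(axis=1)   # infective neighbours of each node
    p_inf = 1 - np.exp(-beta * pressure * dt)
    new_inf = (state == 0) & (rng.random(n) < p_inf)
    new_rec = infective & (rng.random(n) < 1 - np.exp(-gamma * dt))
    state[new_inf] = 1
    state[new_rec] = 2

print("Final size of the epidemic:", int((state == 2).sum()))
```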

Keywords: statistical inference, maximum likelihood, epidemic model, heterogeneous mixing.

788 Subpixel Detection of Circular Objects Using Geometric Property

Authors: Wen-Yen Wu, Wen-Bin Yu

Abstract:

In this paper, we propose a method for detecting circular shapes with subpixel accuracy. First, the geometric properties of circles have been used to find the diameters as well as the circumference pixels. The center and radius are then estimated by the circumference pixels. Both synthetic and real images have been tested by the proposed method. The experimental results show that the new method is efficient.
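
The centre/radius estimation from circumference pixels is commonly done with a linear least-squares (Kåsa) circle fit, sketched below on synthetic edge points with subpixel noise; the diameter-based detection of the circumference pixels themselves is not reproduced.

```python
import numpy as np

def fit_circle(x, y):
    """Least-squares (Kasa) circle fit: solve x^2 + y^2 = 2ax + 2by + c,
    giving centre (a, b) and radius sqrt(c + a^2 + b^2)."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a**2 + b**2)

# Synthetic circumference pixels with subpixel noise.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
x = 50.3 + 20.7 * np.cos(t) + rng.normal(scale=0.3, size=t.size)
y = 64.8 + 20.7 * np.sin(t) + rng.normal(scale=0.3, size=t.size)

print("Estimated centre and radius:", fit_circle(x, y))   # ~ (50.3, 64.8, 20.7)
```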

Keywords: Subpixel, least squares estimation, circle detection, Hough transformation.

787 Advanced Stochastic Models for Partially Developed Speckle

Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije

Abstract:

Speckled images arise when coherent microwave, optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise is complicated by the nature of the noise and is not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves partially developed speckle model where an underlying Poisson point process modulates a Gram-Charlier series of Laguerre weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise as demonstrated by the Central Limit theorem.
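
The limiting behaviour described at the end of the abstract can be checked numerically: summing a Poisson-distributed number of unit-amplitude random phasors per resolution cell and observing that the intensity contrast approaches that of an exponential law as the mean scatterer count grows. This is a purely illustrative sketch, not the paper's Gram-Charlier/Laguerre derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle_intensity(mean_scatterers, n_cells=20000):
    """Partially developed speckle: each resolution cell sums N ~ Poisson(mu)
    unit-amplitude scatterers with uniform random phases; returns intensities."""
    counts = rng.poisson(mean_scatterers, n_cells)
    intensity = np.empty(n_cells)
    for i, n in enumerate(counts):
        phases = rng.uniform(0, 2 * np.pi, n)
        intensity[i] = np.abs(np.sum(np.exp(1j * phases))) ** 2
    return intensity

for mu in (2, 10, 50):     # increasing mean number of scatterers per cell
    I = speckle_intensity(mu)
    # For fully developed speckle (exponential intensity law), std/mean -> 1.
    print(f"mu={mu:3d}  intensity std/mean = {I.std() / I.mean():.2f}")
```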

Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound
