Search results for: interval reduction.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1780

1720 Model Order Reduction for Frequency Response and Effect of Order of Method for Matching Condition

Authors: Aref Ghafouri, Mohammad Javad Mollakazemi, Farhad Asadi

Abstract:

In this paper, a model order reduction method is used to approximate linear and nonlinear aspects of experimental data. The method yields an offline reduced-order model that follows the data and the order of the system and matches the experimental data at certain frequency ratios. The method is compared across different experimental data sets, and the influence of the chosen reduction order on obtaining a sufficient matching condition is investigated in terms of the imaginary and real parts of the frequency response curve. Finally, the effect of the reduction order, an important parameter for nonlinear experimental data, is discussed.
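
As a rough illustration of the workflow described above, the sketch below reduces an invented state-space model with square-root balanced truncation (one common order reduction technique; the abstract does not name the authors' specific method) and compares the real and imaginary parts of the full and reduced frequency responses.

import numpy as np
from scipy import linalg, signal

def balanced_truncation(A, B, C, r):
    """Reduce a stable state-space model (A, B, C) to order r (square-root method)."""
    P = linalg.solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
    Q = linalg.solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
    Lc = linalg.cholesky(P, lower=True)
    Lo = linalg.cholesky(Q, lower=True)
    U, s, Vt = linalg.svd(Lo.T @ Lc)
    S = np.diag(1.0 / np.sqrt(s[:r]))
    T = Lc @ Vt[:r].T @ S                                # projection matrices
    Ti = S @ U[:, :r].T @ Lo.T
    return Ti @ A @ T, Ti @ B, C @ T

# invented stable 6th-order SISO system, reduced to order 2
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)) - 6.0 * np.eye(6)        # shift eigenvalues into the left half-plane
B = rng.standard_normal((6, 1))
C = rng.standard_normal((1, 6))
Ar, Br, Cr = balanced_truncation(A, B, C, r=2)

w = np.logspace(-2, 2, 200)
_, H_full = signal.freqresp((A, B, C, np.zeros((1, 1))), w)
_, H_red = signal.freqresp((Ar, Br, Cr, np.zeros((1, 1))), w)
print("max |real-part error|:", np.max(np.abs(H_full.real - H_red.real)))
print("max |imag-part error|:", np.max(np.abs(H_full.imag - H_red.imag)))

Choosing a larger reduction order improves the match at the expense of model size, which is the trade-off the abstract discusses.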

Keywords: Frequency response, Order of model reduction, frequency matching condition.

1719 Efficient Dimensionality Reduction of Directional Overcurrent Relays Optimal Coordination Problem

Authors: Fouad Salha, X. Guillaud

Abstract:

Directional overcurrent relays (DOCR) are commonly used in power system protection, as primary protection in distribution and sub-transmission systems and as secondary protection in transmission systems. Coordination of protective relays is necessary to obtain selective tripping. In this paper, an approach for reducing the dimensionality of the DOCRs' nonlinear optimal coordination (OC) problem is proposed. This was achieved by modifying the objective function and relaxing several constraints, based on a classification of the constraints into four types: non-valid, redundant, pre-obtained and valid. Using this classification, the effect of the far-end fault on the objective function and constraints, and consequently on relay operating time, was studied. The study was carried out first by taking both near-end and far-end faults into account in the DOCR coordination problem formulation, and then by considering only faults very close to the primary relays (near-end faults). The optimal coordination was achieved by simultaneously optimizing all variables (TDS and Ip) in a nonlinear setting using genetic-algorithm-based nonlinear programming. The results of applying the two approaches to 6-bus and 26-bus systems verify that including far-end faults in the OC problem formulation does not compromise optimality.
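
The coordination quantities involved can be made concrete with the IEC standard-inverse relay characteristic; the fault currents, settings and CTI value below are invented for illustration and are not taken from the paper's 6-bus or 26-bus test systems, and the paper may use a different relay curve.

# operating time of an inverse-time overcurrent relay (IEC standard inverse curve)
def op_time(tds, i_pickup, i_fault):
    return tds * 0.14 / ((i_fault / i_pickup) ** 0.02 - 1.0)

CTI = 0.3   # coordination time interval in seconds (assumed)

# one primary/backup relay pair seen by the same near-end fault (invented data)
primary = {"TDS": 0.10, "Ip": 250.0, "I_fault": 4000.0}
backup  = {"TDS": 0.25, "Ip": 400.0, "I_fault": 2500.0}

t_p = op_time(primary["TDS"], primary["Ip"], primary["I_fault"])
t_b = op_time(backup["TDS"], backup["Ip"], backup["I_fault"])

# objective contribution: primary operating time; constraint: t_backup - t_primary >= CTI
print(f"t_primary = {t_p:.3f} s, t_backup = {t_b:.3f} s")
print("coordination constraint satisfied:", t_b - t_p >= CTI)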

Keywords: Backup/Primary relay, Coordination time interval (CTI), directional over current relays, Genetic algorithm, time dial setting (TDS), pickup current setting (Ip), nonlinear programming.

1718 The Role of pH on Cr(VI) Reduction and Removal by Arthrobacter viscosus

Authors: B. Silva, H. Figueiredo, I. C. Neves, T. Tavares

Abstract:

Arthrobacter viscosus biomass was used for Cr(VI) biosorption. The effect of pH on Cr(VI) reduction and removal from aqueous solution was studied over the pH range 1-4. Cr(VI) removal involves both a redox reaction and adsorption of metal ions on the biomass surface. The removal rate of Cr(VI) was enhanced by strongly acidic conditions, while higher solution pH values favored the removal of total chromium. The best removal efficiency and uptake, 72.5% and 12.6 mg Cr/g biomass, respectively, were reached at pH 4.

Keywords: Biosorption, chromium, pH, reduction.

1717 Recent Trends in Nonlinear Methods of HRV Analysis: A Review

Authors: Ramesh K. Sunkaria

Abstract:

Linear methods of heart rate variability (HRV) analysis, both non-parametric (e.g. fast Fourier transform analysis) and parametric (e.g. autoregressive modeling), have become established non-invasive tools for assessing cardiac health, but their sensitivity and specificity have been found to be lower than expected, with a positive predictive value below 30%. This may be because the RR-interval series is treated as stationary and re-sampled prior to analysis, whereas in reality it is not. This paper reviews the nonlinear methods of HRV analysis, such as correlation dimension, largest Lyapunov exponent, power-law slope, fractal analysis, detrended fluctuation analysis and complexity measures, which are becoming popular because they use the actual RR-interval series. These methods are expected to provide highly accurate cardiac health prognosis.
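
Of the listed measures, detrended fluctuation analysis (DFA) is perhaps the simplest to state concretely; the sketch below implements the standard DFA procedure on an invented RR-interval series (the series and box sizes are illustrative, not taken from the paper).

import numpy as np

def dfa(rr, box_sizes):
    y = np.cumsum(rr - np.mean(rr))                        # integrated series
    F = []
    for n in box_sizes:
        n_boxes = len(y) // n
        sq = []
        for k in range(n_boxes):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            sq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(sq)))                     # fluctuation at this box size
    # scaling exponent alpha = slope of log F(n) versus log n
    alpha = np.polyfit(np.log(box_sizes), np.log(F), 1)[0]
    return alpha, np.array(F)

rng = np.random.default_rng(1)
rr = 0.8 + 0.03 * rng.standard_normal(2000) + 0.001 * np.cumsum(rng.standard_normal(2000))
alpha, _ = dfa(rr, box_sizes=[4, 8, 16, 32, 64, 128])
print("DFA scaling exponent alpha =", round(alpha, 3))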

Keywords: chaos, nonlinear dynamics, sample entropy, approximate entropy, detrended fluctuation analysis.

1716 The Multi-scenario Knapsack Problem: An Adaptive Search Algorithm

Authors: Mhand Hifi, Hedi Mhalla, Mustapha Michaphy

Abstract:

In this paper, we study the multi-scenario knapsack problem, a variant of the well-known NP-hard single knapsack problem. We investigate the use of an adaptive algorithm for solving the problem heuristically. The method combines two complementary phases: a size reduction phase and a dynamic 2-opt procedure. First, the reduction phase applies a polynomial reduction strategy used to reduce the problem size. Second, the adaptive search procedure is applied in order to attain a feasible solution. Finally, the performance of two versions of the proposed algorithm is evaluated on a set of randomly generated instances.
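
The two-phase idea (reduce the instance, then improve locally) can be illustrated on a toy max-min knapsack instance; the greedy construction and swap rules below are invented simplifications, not the authors' polynomial reduction strategy or their dynamic 2-opt procedure.

import numpy as np

rng = np.random.default_rng(2)
n, S, capacity = 12, 3, 30
profit = rng.integers(5, 30, size=(S, n))   # one profit vector per scenario
weight = rng.integers(1, 10, size=n)

def worst_profit(sel):
    # objective of the max-min knapsack: profit in the worst scenario
    return profit[:, sel].sum(axis=1).min() if sel else 0

# Phase 1: greedy construction using the worst-case profit/weight ratio
order = np.argsort(-(profit.min(axis=0) / weight))
sel, w = [], 0
for j in order:
    if w + weight[j] <= capacity:
        sel.append(int(j)); w += weight[j]

# Phase 2: 2-opt-style exchange (swap one item out, one item in, if it helps)
improved = True
while improved:
    improved = False
    for i in list(sel):
        for j in range(n):
            if j in sel:
                continue
            if w - weight[i] + weight[j] <= capacity:
                cand = [x for x in sel if x != i] + [j]
                if worst_profit(cand) > worst_profit(sel):
                    sel, w = cand, w - weight[i] + weight[j]
                    improved = True
                    break
        if improved:
            break

print("selected items:", sorted(sel), "worst-scenario profit:", worst_profit(sel))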

Keywords: Combinatorial optimization, max-min optimization, knapsack, heuristics, problem reduction.

1715 MIMO System Order Reduction Using Real-Coded Genetic Algorithm

Authors: Swadhin Ku. Mishra, Sidhartha Panda, Simanchala Padhy, C. Ardil

Abstract:

In this paper, a real-coded genetic algorithm (RCGA) optimization technique is applied to a large-scale linear dynamic multi-input-multi-output (MIMO) system. The method is based on an error minimization technique in which the integral squared error between the transient responses of the original and reduced-order models is minimized by the RCGA. The reduction procedure is simple and computer oriented, and the approach is comparable in quality with other well-known reduction techniques. Moreover, the proposed method guarantees stability of the reduced model if the original high-order MIMO system is stable. The proposed approach to MIMO system order reduction is illustrated with the help of an example, and the results are compared with recently published well-known reduction techniques to show its superiority.
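
The fitness evaluation at the heart of such a scheme is easy to make concrete: the sketch below computes the integral squared error (ISE) between step responses of an invented full model and an invented reduced candidate, which is the quantity a real-coded GA would minimize (in practice a stability check or penalty on the candidate would be added).

import numpy as np
from scipy import signal, integrate

t = np.linspace(0, 20, 2000)

def ise(num_full, den_full, num_red, den_red):
    _, y_full = signal.step((num_full, den_full), T=t)
    _, y_red = signal.step((num_red, den_red), T=t)
    return integrate.trapezoid((y_full - y_red) ** 2, t)

# invented 4th-order plant and a 2nd-order candidate produced by some search step
G_num, G_den = [8.0, 24.0], [1.0, 6.0, 11.0, 6.0, 4.0]
R_num, R_den = [5.8], [1.0, 1.3, 1.0]

print("ISE of candidate reduced model:", ise(G_num, G_den, R_num, R_den))
# Inside an RCGA, the coefficients of R_num/R_den would form the chromosome
# and this ISE value would be its fitness.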

Keywords: Multi-input-multi-output (MIMO) system, model order reduction, integral squared error (ISE), real-coded genetic algorithm.

1714 Removal of Hexavalent Chromium from Wastewater by Use of Scrap Iron

Authors: Marius Gheju, Rodica Pode

Abstract:

Hexavalent chromium is highly toxic to most living organisms and a known human carcinogen by the inhalation route of exposure. Therefore, treatment of Cr(VI)-contaminated wastewater is essential before its discharge to natural water bodies. Reduction of Cr(VI) to Cr(III) is beneficial because a more mobile and more toxic chromium species is converted to a less mobile and less toxic form. Zero-valent metals, such as scrap iron, can serve as electron donors for reducing Cr(VI) to Cr(III). The influence of pH on the capacity of scrap iron to reduce Cr(VI) was investigated in this study. The maximum reduction capacity of scrap iron was observed at the beginning of the column experiments; the lower the pH, the longer the period over which the scrap iron maintained its maximum reduction capacity. The experimental results showed that the highest maximum reduction capacity of scrap iron was 12.5 mg Cr(VI)/g scrap iron at pH 2.0, decreasing with increasing pH down to 1.9 mg Cr(VI)/g scrap iron at pH 7.3.

Keywords: hexavalent chromium, heavy metals, scrap iron, reduction capacity, wastewater treatment.

1713 Dicotyledon Weed Quantification Algorithm for Selective Herbicide Application in Maize Crops: Statistical Evaluation of the Potential Herbicide Savings

Authors: Morten Stigaard Laursen, Rasmus Nyholm Jørgensen, Henrik Skov Midtiby, Anders Krogh Mortensen, Sanmohan Baby

Abstract:

This work contributes a statistical model and simulation framework yielding the best possible estimate of the potential herbicide reduction when using the MoDiCoVi algorithm, while requiring an efficacy comparable to conventional spraying. In June 2013 a maize field located in Denmark was seeded. The field was divided into parcels, each assigned to one of two main groups: 1) control, consisting of subgroups with no spray and full-dose spray; 2) the MoDiCoVi algorithm, subdivided into five different leaf-cover thresholds for spray activation. In addition, approximately 25% of the parcels were seeded with additional weeds perpendicular to the maize rows. In total, 299 parcels were randomly assigned to the 28 different treatment combinations. In the statistical analysis, bootstrapping was used to balance the number of replicates. The achievable potential herbicide savings were found to be 70% to 95%, depending on the initial weed coverage. However, additional field trials covering more seasons and locations are needed to verify the generalization of these results. There is potential for further herbicide savings, as the time interval between the first and second spraying session was not long enough for the weeds to turn yellow; instead, they only stagnated in growth.
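
As a minimal illustration of the resampling idea mentioned above, the sketch below computes a percentile bootstrap confidence interval for a mean herbicide saving from invented parcel-level values; it is not the authors' full statistical model.

import numpy as np

rng = np.random.default_rng(3)
savings = rng.normal(loc=0.85, scale=0.08, size=40)   # invented parcel-level savings

boot_means = np.array([
    rng.choice(savings, size=savings.size, replace=True).mean()
    for _ in range(5000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean saving {savings.mean():.2%}, 95% bootstrap CI [{lo:.2%}, {hi:.2%}]")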

Keywords: Weed crop discrimination, macrosprayer, herbicide reduction, site-specific, sprayer-boom.

1712 Vibration Reduction Module with Flexure Springs for Personal Tools

Authors: Donghyun Hwang, Soo-Hun Lee, Moon G. Lee

Abstract:

In various working environments, vibration may cause injury to the human body, especially when the vibration is constantly and repeatedly transmitted to the operator; this can lead to a serious physical problem known as Raynaud's phenomenon. In this paper, we propose a vibration-transmissibility reduction module with a flexure mechanism for personal tools. First, we select a target personal tool, a grass cutter, and measure the level of vibration transmissibility at the hand. We then develop a concept design of a module whose stiffness reduces the vibration transmissibility by more than 20%, where the transmissibility is measured with an accelerometer. In addition, the vibration reduction can be enhanced when the gap between the inner and outer body is filled with silicone gel. This will be verified in further experiments.
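
A standard single degree-of-freedom isolation model helps to see why a flexure-based module can cut transmissibility; the formula below is the textbook transmissibility expression and the numbers are invented, not the authors' specific design model.

import numpy as np

def transmissibility(r, zeta):
    # T(r, zeta) for base isolation, r = excitation frequency / natural frequency
    return np.sqrt((1 + (2 * zeta * r) ** 2) / ((1 - r ** 2) ** 2 + (2 * zeta * r) ** 2))

f_excitation = 60.0                      # Hz, assumed dominant tool frequency
# a softer flexure (lower f_n) pushes r further above resonance and lowers T,
# while added damping (e.g. from the silicone gel) mainly limits amplification near r = 1
for f_n, zeta in [(55.0, 0.05), (55.0, 0.20), (25.0, 0.05), (25.0, 0.20)]:
    r = f_excitation / f_n
    print(f"f_n = {f_n:4.0f} Hz, zeta = {zeta:.2f} -> T = {transmissibility(r, zeta):.2f}")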

Keywords: Flexure spring, tool engineering, vibration damping.

1711 Reliability Analysis of k-out-of-n : G System Using Triangular Intuitionistic Fuzzy Numbers

Authors: Tanuj Kumar, Rakesh Kumar Bajaj

Abstract:

In the present paper, we analyze the vague reliability of a k-out-of-n : G system (in particular, series and parallel systems) with independent and non-identically distributed components, where the reliability of the components is unknown. The reliability of each component has been estimated using a statistical confidence interval approach. These statistical confidence intervals are then converted into triangular intuitionistic fuzzy numbers, based on which the reliability of the k-out-of-n : G system is calculated. Further, in order to implement the proposed methodology and to analyze the results for the k-out-of-n : G system, a numerical example has been provided.
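
The pipeline can be sketched with ordinary triangular fuzzy numbers: each component confidence interval becomes a triangle (lower bound, point estimate, upper bound), and series/parallel reliabilities are combined with the usual approximate arithmetic for positive triangular fuzzy numbers. The intuitionistic (non-membership) part of the paper is omitted and the data are invented.

import numpy as np

def series(tfns):
    # product of positive triangular fuzzy numbers, component-wise approximation
    return tuple(np.prod([t[i] for t in tfns]) for i in range(3))

def parallel(tfns):
    # R_sys = 1 - prod(1 - R_i); monotone increasing in each R_i, so bounds map directly
    return tuple(1 - np.prod([1 - t[i] for t in tfns]) for i in range(3))

# invented 95% confidence intervals (lower, estimate, upper) for three components
components = [(0.90, 0.95, 0.98), (0.85, 0.92, 0.96), (0.88, 0.93, 0.97)]

print("series system reliability (TFN):  ", tuple(round(x, 4) for x in series(components)))
print("parallel system reliability (TFN):", tuple(round(x, 4) for x in parallel(components)))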

Keywords: Vague set, vague reliability, triangular intuitionistic fuzzy number, k-out-of-n : G system, series and parallel system.

1710 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison and evaluation decisions made by forensic document examiners. When biometric technology is used in forensic applications, it is necessary to compute the Likelihood Ratio (LR) for quantifying the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, for which a set of assumptions and methods is adopted for a given data set. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluates the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented so as not to obtain an incorrect estimate that could lead to a wrong judgment in a court of law. The estimation of LR is fundamentally a Bayesian concept, and we used two LR estimators in this paper, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Although LoR performed better than KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
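
The two estimators named in the abstract can be sketched on invented comparison scores: logistic regression yields a calibrated posterior that is converted to an LR through the prior odds, and kernel density estimation forms the ratio of same-writer to different-writer score densities. Data, features and the evaluated score are all invented.

import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
same = rng.normal(2.0, 1.0, 300)        # scores for same-writer pairs (invented)
diff = rng.normal(-1.0, 1.2, 300)       # scores for different-writer pairs (invented)

# --- KDE estimator: LR(s) = f_same(s) / f_diff(s)
f_same, f_diff = gaussian_kde(same), gaussian_kde(diff)
def lr_kde(score):
    return f_same(score)[0] / f_diff(score)[0]

# --- Logistic regression estimator: posterior odds divided by prior odds
X = np.concatenate([same, diff]).reshape(-1, 1)
y = np.concatenate([np.ones_like(same), np.zeros_like(diff)])
clf = LogisticRegression().fit(X, y)
prior_odds = same.size / diff.size
def lr_logreg(score):
    p = clf.predict_proba([[score]])[0, 1]
    return (p / (1 - p)) / prior_odds

s = 1.0
print(f"score {s}: LR_KDE = {lr_kde(s):.2f}, LR_LogReg = {lr_logreg(s):.2f}")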

Keywords: Logistic Regression (LoR), Kernel Density Estimator (KDE), handwriting, confidence interval, repeatability, reproducibility.

1709 The Study of Groundcover for Heat Reduction

Authors: Winai Mankhatitham

Abstract:

This research investigated groundcover on roofs (green roofs), which can reduce temperature and carbon monoxide. The study covers three main aspects: 1) the types of groundcover affecting heat reduction; 2) the heat-reduction efficiency of three types of groundcover, i.e. lawn, Arachis pintoi, and purslane; and 3) a database for designing green roofs. The study was designed as experimental research by setting up the three types of groundcover in three trays placed in a greenhouse and recording the temperature change over 24 hours. The results showed that the groundcover with the highest heat-reduction efficiency was lawn, as the density of the lawn can limit heat transfer to the soil. Further work should include a comparative study of soil thickness and soil type to provide more information on suitable groundcovers and soils for designing energy-saving green roofs.

Keywords: Groundcover, Green Roof, Heat Reduction, Energy Saving.

1708 Optimum Design of Heat Exchanger in Diesel Engine Cold EGR for Pollutants Reduction

Authors: Nasser Ghassembaglou, Armin Rahmatfam, Faramarz Ranjbar

Abstract:

Using the cold EGR method with a variable venturi and a turbocharger has a very significant effect on the simultaneous reduction of NOx and soot. The EGR cooler is one of the most important parts of the cold EGR circuit. In this paper, the optimum design of the cooler for operation at different EGR percentages is investigated, with the aims of determining the optimum temperature of the exhaust gases and achieving higher efficiency, lower weight, smaller dimensions, reduced cost and less deposit formation, together with optimum performance when using gas oil containing significant amounts of sulfur.

Keywords: Cold EGR, NOx, cooler.

1707 Modeling Language for Constructing Solvers in Machine Learning: Reductionist Perspectives

Authors: Tsuyoshi Okita

Abstract:

For a given specific problem, the design of an efficient algorithm has been the usual object of study. However, there is an alternative approach, orthogonal to this one, called reduction: for a given problem, it studies how to convert the original problem into subproblems. This paper proposes a formal modeling language to support this reduction approach in order to build solvers quickly. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a given new problem. Note that our formal modeling language is not intended to provide an efficient notation for data mining applications, but to facilitate the work of a designer who develops solvers in machine learning.

Keywords: Formal language, statistical inference problem, reduction.

1706 Hexavalent Chromium Pollution Abatement by use of Scrap Iron

Authors: Marius Gheju, Laura Cocheci

Abstract:

In this study, the reduction of Cr(VI) by scrap iron, a cheap and locally available industrial waste, was investigated in a continuous system. The greater scrap iron efficiency observed for the first two sections of the column filling indicates that most of the reduction process was carried out in the bottom half of the filling. This was ascribed to the steady decrease of the Cr(VI) concentration inside the filling as the water front passes from the bottom to the top of the column. While the bottom section of the column filling was heavily passivated by secondary mineral phases, the top section was less affected by the passivation process; therefore, the column filling would likely ensure the reduction of Cr(VI) for periods longer than 216 hours. The experimental results indicate that fixed-bed columns packed with scrap iron could be successfully used for the first step of treating Cr(VI)-polluted wastewater. However, the mass of the scrap iron filling should be carefully estimated, since it significantly affects the Cr(VI) reduction efficiency.

Keywords: Hexavalent chromium, heavy metals, scrap iron, reduction capacity, wastewater treatment.

1705 Reduction Conditions of Briquetted Solid Wastes Generated by the Integrated Iron and Steel Plant

Authors: Gökhan Polat, Dicle Kocaoğlu Yılmazer, Muhlis Nezihi Sarıdede

Abstract:

Iron oxides are the main input for producing iron in integrated iron and steel plants. During the production of iron from iron oxides, some wastes with high iron content arise. These main wastes can be classified as basic oxygen furnace (BOF) sludge, flue dust and rolling scale. Recycling of these wastes is of great importance both for environmental reasons and for reducing production costs. In this study, recycling experiments were performed on basic oxygen furnace sludge, flue dust and rolling scale, which contain 53.8%, 54.3% and 70.2% iron, respectively. These wastes were mixed with coke as a reducer, and the mixtures were pressed into cylindrical briquettes under compacting forces from 1 ton to 6 tons. Both the stoichiometric amount of coke and twice that amount were added in order to investigate the effect of the coke amount on the reduction properties of the waste mixtures. The briquettes were then reduced at 1000°C and 1100°C for 30, 60, 90, 120 and 150 min in a muffle furnace. From the reduction experiments, the effects of compacting force, temperature and time on the reduction ratio of the wastes were determined. It was found that a compacting force of 1 ton, a reduction time of 150 min and a temperature of 1100°C are the optimum conditions for obtaining a reduction ratio higher than 75%.

Keywords: Iron oxide wastes, reduction, coke, recycling.

1704 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis

Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen

Abstract:

Hepatitis is one of the most common and dangerous diseases affecting humankind, exposing millions of people to serious health risks every year. Diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for the diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system includes three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an “indirect approach” is used for fuzzy system modeling, implementing the exponential compactness and separation index to determine the number of rules in the fuzzy clustering approach. We first propose a Type-I fuzzy system, which achieves an accuracy of approximately 90.9%. In the proposed system, the process of diagnosis faces vagueness and uncertainty in the final decision; this imprecise knowledge is managed by using interval Type-II fuzzy logic. The results obtained show that interval Type-II fuzzy logic can diagnose hepatitis with an average accuracy of 93.94%, the highest classification accuracy reached thus far. This rate of accuracy demonstrates that the Type-II fuzzy system performs better than the Type-I system and indicates a higher capability of the Type-II fuzzy system for modeling uncertainty.

Keywords: Hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection.

1703 Using Interval Constrained Petri Nets for the Fuzzy Regulation of Quality: Case of Assembly Process Mechanics

Authors: Nabli L., Dhouibi H., Collart Dutilleul S., Craye E.

Abstract:

Because of the inherent variability of manufacturing processes, parts cannot be produced to their dimensional specifications with absolute exactness. It is thus necessary to require that the actually produced part strictly belongs to intervals compatible with the correct functioning of the assembly. In this paper, we present an approach that combines two different theories, fuzzy systems and Petri nets. The resulting tool is proposed for modeling and controlling quality in an assembly system. A robust control of a mechanical assembly process is presented as an application; this control has to keep the parts within their specification intervals in the presence of variations. It also illustrates how the technique reacts when the product quality is high, medium, or low.

Keywords: Petri nets, production rate, performance evaluation, tolerant system, fuzzy sets.

1702 Products in Early Development Phases: Ecological Classification and Evaluation Using an Interval Arithmetic Based Calculation Approach

Authors: Helen L. Hein, Joachim Schwarte

Abstract:

As a pillar of sustainable development, ecology has become an important topic in the research community, especially due to global challenges like climate change. The ecological performance of products can be assessed scientifically with life cycle assessments. In the construction sector, significant amounts of CO2 emissions are attributed to the energy used for building heating. Sustainable insulating construction materials are therefore essential, and aerogels have been explored intensively in recent years due to their low thermal conductivity. The WALL-ACE project accordingly aims to develop an aerogel-based thermal insulating plaster with very low thermal conductivity. But since a lot of information is still missing or not yet accessible in early development phases, the ecological assessment of innovative products is increasingly based on uncertain data, which can lead to significant deviations in the results. To predict realistically how meaningful the results are, and how viable the developed products may be with regard to their respective markets, these deviations have to be taken into account. A classification method is therefore presented in this study that allows the ecological performance of modern products to be compared with already established, competitive materials. To achieve this, an alternative calculation method was used that computes with lower and upper bounds, so that all possible values are covered without requiring precise data. The life cycle analysis of the considered products was carried out with an interval-arithmetic-based calculation method. The results lead to the conclusion that the interval solutions describing the possible environmental impacts are so wide that the usability of the results is limited. Nevertheless, a further reduction of the environmental impacts of aerogels appears to be needed for them to become more competitive in the future.
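
The interval-arithmetic idea itself is compact: every uncertain inventory entry is carried as a lower/upper pair, and the computed impact is again an interval. The sketch below uses invented inventory numbers purely for illustration; it is not the project's actual life cycle model.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = min(lo, hi), max(lo, hi)
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))
    def __repr__(self):
        return f"[{self.lo:.3g}, {self.hi:.3g}]"

# invented inventory: amount of material per m2 of plaster times its emission factor
amounts = {"aerogel": Interval(1.8, 2.4), "binder": Interval(6.0, 7.5)}   # kg/m2
factors = {"aerogel": Interval(4.0, 9.0), "binder": Interval(0.6, 0.9)}   # kg CO2-eq/kg

impact = Interval(0.0, 0.0)
for material in amounts:
    impact = impact + amounts[material] * factors[material]
print("GWP per m2 of plaster:", impact, "kg CO2-eq")

The width of the resulting interval is exactly the kind of output whose usability the abstract discusses.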

Keywords: Aerogel-based, insulating material, early development phase, interval arithmetic.

1701 Genetic Algorithm Approach for Solving the Falkner–Skan Equation

Authors: Indu Saini, Phool Singh, Vikas Malik

Abstract:

A novel method based on a genetic algorithm for solving boundary value problems (BVPs) of the Falkner–Skan equation over a semi-infinite interval is presented. In our approach, we use the free boundary formulation to truncate the semi-infinite interval into a finite one. We then use a shooting method based on the genetic algorithm to transform the BVP into initial value problems (IVPs), with the genetic algorithm used to calculate the shooting angle. The initial value problems arising during shooting are computed by the Runge-Kutta-Fehlberg method. The numerical solutions obtained by the present method are in agreement with those obtained by previous authors.
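
A compact sketch of this kind of computation is shown below: the domain is truncated at a fixed eta_max (simpler than the paper's free boundary formulation), a tiny real-coded genetic/evolutionary loop searches the shooting angle f''(0), and each trial IVP is integrated with SciPy's adaptive RK45 solver standing in for Runge-Kutta-Fehlberg. The value of beta, eta_max and the GA settings are illustrative assumptions.

import numpy as np
from scipy.integrate import solve_ivp

beta, eta_max = 0.5, 8.0

def falkner_skan(eta, y):
    # y = [f, f', f''];  f''' + f f'' + beta (1 - f'^2) = 0
    f, fp, fpp = y
    return [fp, fpp, -f * fpp - beta * (1.0 - fp ** 2)]

def residual(shoot):
    sol = solve_ivp(falkner_skan, (0.0, eta_max), [0.0, 0.0, shoot],
                    method="RK45", rtol=1e-6, atol=1e-8)
    val = sol.y[1, -1]
    return abs(val - 1.0) if np.isfinite(val) else 1e6   # miss distance for f'(eta_max) = 1

rng = np.random.default_rng(5)
pop = rng.uniform(0.0, 2.0, size=20)      # candidate shooting angles f''(0)
for _ in range(40):                       # selection plus Gaussian mutation
    fitness = np.array([residual(s) for s in pop])
    parents = pop[np.argsort(fitness)[:10]]
    children = parents + rng.normal(0.0, 0.05, size=parents.size)
    pop = np.concatenate([parents, children])

best = pop[np.argmin([residual(s) for s in pop])]
print(f"estimated f''(0) for beta = {beta}: {best:.5f}")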

Keywords: Boundary Layer Flow, Falkner–Skan equation, Genetic Algorithm, Shooting method.

1700 Amino Acid Coated Silver Nanoparticles: A Green Catalyst for Methylene Blue Reduction

Authors: Abhishek Chandra, Man Singh

Abstract:

Highly stable and homogeneously dispersed amino-acid-coated silver nanoparticles (ANP) of ≈10 nm diameter, with absorption bands ranging from 420 to 430 nm, were prepared by adding AgNO3 solution to a solution of Azadirachta indica gum at 373.15 K. The amino acids were selected based on their polarity. The synthesized nanoparticles were characterized by UV-Vis and FTIR spectroscopy, HR-TEM, XRD, SEM and 1H-NMR. The coated nanoparticles were used as a catalyst for the reduction of methylene blue dye in the presence of Sn(II) in aqueous, anionic and cationic micellar media. The rate of reduction of the dye was determined spectrophotometrically by measuring the absorbance at 660 nm, and followed the order Kcationic > Kanionic > Kwater. After 12 min in the absence of the ANP, only 2%, 3% and 6% of the dye reduction was completed in aqueous, anionic and cationic micellar media, respectively, while in the presence of ANP coated with a polar neutral amino acid having a non-polar -R group, the reduction reached 84%, 95% and 98%, respectively. The ANP coated with a polar neutral amino acid having a non-polar -R group increased the rate of reduction of the dye by 94, 3205 and 6370 fold in aqueous, anionic and cationic micellar media, respectively. The rate of reduction of the dye also increased by a factor of three when the micellar medium was changed from anionic to cationic with ANP coated by a polar neutral amino acid having a non-polar -R group.
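
As a worked example of how a rate constant can be extracted from absorbance-versus-time readings at 660 nm, the sketch below assumes pseudo-first-order kinetics (an assumption made purely for illustration; the abstract reports only relative rate enhancements) and fits ln(A0/At) = k t. The absorbance values are invented.

import numpy as np

t = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)          # minutes
A = np.array([1.00, 0.62, 0.39, 0.24, 0.15, 0.095, 0.060])  # absorbance at 660 nm (invented)

# slope of ln(A0/At) versus t gives the pseudo-first-order rate constant
k = np.polyfit(t, np.log(A[0] / A), 1)[0]
print(f"pseudo-first-order rate constant k = {k:.3f} per minute")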

Keywords: Silver nanoparticle, surfactant, methylene blue, amino acid.

1699 Reduction of Peak Input Currents during Charge Pump Boosting in Monolithically Integrated High-Voltage Generators

Authors: Jan Doutreloigne

Abstract:

This paper describes two methods for the reduction of the peak input current during the boosting of Dickson charge pumps. Both methods are implemented in the fully integrated Dickson charge pumps of a high-voltage display driver chip for smart-card applications. Experimental results reveal good correspondence with Spice simulations and show a reduction of the peak input current by a factor of 6 during boosting.

Keywords: Bi-stable display driver, Dickson charge pump, high-voltage generator, peak current reduction, sub-pump boosting, variable frequency boosting.

1698 Multivariable System Reduction Using Stability Equation Method and SRAM

Authors: D. Bala Bhaskar

Abstract:

An algorithm is proposed for the order reduction of large-scale linear dynamic multivariable systems, in which the denominator of the reduced-order model is obtained by the stability equation method and the numerator coefficients are obtained by SRAM. The proposed algorithm produces a lower-order model for an original stable high-order multivariable system. The reduction procedure is easy to understand, efficient and computer oriented. To highlight the advantages of the approach, the algorithm is illustrated with the help of a numerical example, and the results are compared with other existing techniques in the literature.

Keywords: Multi variable systems, order reduction, stability equation method, SRAM, time domain characteristics, ISE.

1697 Mammogram Image Size Reduction Using 16-8 bit Conversion Technique

Authors: Ayman A. AbuBaker, Rami S. Qahwaji, Musbah J. Aqel, Mohmmad H. Saleh

Abstract:

Two algorithms are proposed to reduce the storage requirements of mammogram images. The input image goes through a shrinking process that converts 16-bit images to 8-bit ones using a pixel-depth conversion algorithm, followed by an enhancement process. The performance of the algorithms is evaluated objectively and subjectively. A 50% reduction in size is obtained with no loss of significant data in the breast region.
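
One simple form of pixel-depth conversion is a linear min-max rescaling; the sketch below applies it to a synthetic 16-bit image and shows the halving of storage. It is only an illustration, not necessarily the authors' conversion or enhancement algorithm.

import numpy as np

def to_8bit(img16):
    img = img16.astype(np.float64)
    lo, hi = img.min(), img.max()
    scaled = (img - lo) / (hi - lo) * 255.0           # map the full dynamic range to 0..255
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

rng = np.random.default_rng(6)
mammo16 = rng.integers(0, 2 ** 16, size=(256, 256), dtype=np.uint16)   # synthetic stand-in image
mammo8 = to_8bit(mammo16)
print(mammo16.dtype, mammo16.nbytes, "bytes ->", mammo8.dtype, mammo8.nbytes, "bytes")
# The byte count halves, matching the roughly 50% size reduction reported above.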

Keywords: Breast cancer, Image processing, Image reduction, Mammograms, Image enhancement

1696 Optimal Image Compression Based on Sign and Magnitude Coding of Wavelet Coefficients

Authors: Mbainaibeye Jérôme, Noureddine Ellouze

Abstract:

The wavelet transform is a very powerful tool for image compression. One of its advantages is the provision of both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding their sign. It is generally assumed that there is no compression gain to be obtained from coding the sign. Only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of the wavelet image coefficients are examined to obtain their online probabilities. We use scalar quantization, in which the information about whether the wavelet coefficient belongs to the lower or to the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are entropy encoded separately: the sign map and the magnitude map. The refinement information indicating whether the wavelet coefficient belongs to the lower or to the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard grey-scale images: Lena, Barbara and Cameraman. Five decomposition scales are computed using the biorthogonal 9/7 wavelet filter bank. The obtained results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality. It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature and is shown to be very successful in terms of PSNR.
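
The separation into sign and magnitude maps can be sketched with PyWavelets, using 'bior4.4' as a stand-in for the 9/7 biorthogonal filter bank (an assumption) and a synthetic image instead of the standard test images. The empirical sign probabilities at successive bit planes echo the observation above that the 0.5 assumption only becomes reasonable after several planes.

import numpy as np
import pywt

rng = np.random.default_rng(7)
image = np.cumsum(np.cumsum(rng.standard_normal((256, 256)), axis=0), axis=1)  # smooth-ish field

coeffs = pywt.wavedec2(image, "bior4.4", level=5)
details = np.concatenate([band.ravel() for level in coeffs[1:] for band in level])

signs, mags = np.sign(details), np.abs(details)
top = mags.max()
for plane in range(5):
    threshold = top / (2 ** (plane + 1))              # successive bit-plane thresholds
    significant = mags >= threshold
    if significant.any():
        p_positive = np.mean(signs[significant] > 0)
        print(f"bit plane {plane}: {significant.sum():6d} significant coeffs, "
              f"P(sign = +) = {p_positive:.2f}")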

Keywords: Image compression, wavelet transform, sign coding, magnitude coding.

1695 Heterogeneous Attribute Reduction in Noisy System based on a Generalized Neighborhood Rough Sets Model

Authors: Siyuan Jing, Kun She

Abstract:

Neighborhood Rough Sets (NRS) have been proven to be an efficient tool for heterogeneous attribute reduction. However, most research has focused on complete and noiseless data, whereas in practice most information systems are noisy, i.e. filled with incomplete and inconsistent data. In this paper, we introduce a generalized neighborhood rough sets model, called VPTNRS, to deal with the problem of heterogeneous attribute reduction in noisy systems. We generalize the classical NRS model with a tolerance neighborhood relation and probabilistic theory. Furthermore, we use the neighborhood dependency to evaluate the significance of a subset of heterogeneous attributes and construct a forward greedy algorithm for attribute reduction based on it. Experimental results show that the model deals with noisy data efficiently.
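
The neighborhood dependency measure and the forward greedy selection can be sketched for complete numeric data as below; the tolerance relation for incomplete data and the probabilistic (VPTNRS) extension of the paper are omitted, and the data set and neighborhood radius are invented.

import numpy as np

rng = np.random.default_rng(8)
X = rng.random((60, 5))                                  # 60 samples, 5 numeric attributes
y = (X[:, 0] + 0.3 * X[:, 2] > 0.8).astype(int)          # decision depends on attributes 0 and 2
delta = 0.15                                             # neighborhood radius (assumed)

def dependency(attrs):
    if not attrs:
        return 0.0
    sub = X[:, attrs]
    dist = np.linalg.norm(sub[:, None, :] - sub[None, :, :], axis=2)
    lower = 0
    for i in range(len(X)):
        neigh = np.where(dist[i] <= delta)[0]
        if np.all(y[neigh] == y[i]):                     # neighborhood consistent with the decision
            lower += 1
    return lower / len(X)                                # size of lower approximation / |U|

selected, remaining = [], list(range(X.shape[1]))
while remaining:
    gains = {a: dependency(selected + [a]) for a in remaining}
    best = max(gains, key=gains.get)
    if gains[best] <= dependency(selected) + 1e-6:       # no significant gain: stop
        break
    selected.append(best); remaining.remove(best)

print("selected attribute subset:", selected, "dependency:", round(dependency(selected), 3))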

Keywords: Attribute reduction, incomplete data, inconsistent data, tolerance neighborhood relation, rough sets.

1694 Investigating Climate Change Trend Based on Data Simulation and IPCC Scenario during 2010-2030 AD: Case Study of Fars Province

Authors: Leila Rashidian, Abbas Ebrahimi

Abstract:

The development of industrial activities, increased fossil fuel consumption, more vehicles, the destruction of forests and grasslands, changes in land use, and population growth have increased the amount of greenhouse gases, especially CO2, in the atmosphere in recent decades. This has led to global warming and climate change. In the present paper, we investigate the trend of climate change in Fars province based on data simulation for the 2010-2030 period. Daily climatic parameters such as maximum and minimum temperature, precipitation and number of sunny hours were used for the synoptic stations of Shiraz and Abadeh over 1977-2008 and for the Lar station over 1995-2008, together with the output of the HADCM3 model for 2010-2030 under the A2 emission scenario. The model results show that the average temperature will increase by about 1 degree centigrade and the amount of precipitation will increase by 23.9% compared to the observational data. In conclusion, given the temperature increase in this province, the amount of precipitation falling as snow will be reduced and precipitation will more often occur as rain. This 1-degree centigrade increase during the season would reduce wheat production by 6 to 10% by shortening the growing period.

Keywords: Climate change, LARS-WG, HADCM3 model, Fars province, climatic parameters, A2 scenario.

1693 Krylov Model Order Reduction of a Thermal Subsea Model

Authors: J. Šindler, A. Suleng, T. Jelstad Olsen, P. Bárta

Abstract:

A subsea hydrocarbon production system can undergo planned and unplanned shutdowns during the life of the field. Thermal FEA is used to simulate the cool-down in order to verify the insulation design of the subsea equipment, and also to derive an acceptable insulation design for the cold spots. The driving factors of subsea analyses require fast-responding and accurate models of the equipment cool-down. This paper presents a cool-down analysis carried out by a Krylov subspace reduction method and compares this approach to commonly used FEA solvers. The model considered represents a typical component of a subsea production system: a closed valve on a dead leg. The Krylov reduction method exhibits the smallest error and requires the shortest computational time to reach the solution. These findings make Krylov model order reduction very suitable for the above-mentioned subsea applications.
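
One simple Krylov variant, one-sided Arnoldi projection with moment matching at s = 0, can be sketched on an invented heat-conduction chain; the paper does not state which Krylov algorithm was used, so this is only illustrative, not the authors' implementation.

import numpy as np

def arnoldi_reduction(A, b, c, r):
    """Orthonormal basis V of K_r(A^-1, A^-1 b), then Galerkin projection of the model."""
    Ainv = np.linalg.inv(A)               # fine for a sketch; use a sparse factorization in practice
    V = np.zeros((A.shape[0], r))
    v = Ainv @ b
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(1, r):
        w = Ainv @ V[:, j - 1]
        w -= V[:, :j] @ (V[:, :j].T @ w)  # Gram-Schmidt orthogonalization
        V[:, j] = w / np.linalg.norm(w)
    return V.T @ A @ V, V.T @ b, c @ V

n = 200                                   # invented 1-D conduction chain, dx/dt = A x + b u, y = c x
A = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
b = np.zeros(n); b[0] = 1.0               # heat input at one end
c = np.zeros(n); c[-1] = 1.0              # temperature sensed at the other end

Ar, br, cr = arnoldi_reduction(A, b, c, r=10)
# compare steady-state gains -c A^-1 b of the full and reduced models
g_full = -c @ np.linalg.solve(A, b)
g_red = -cr @ np.linalg.solve(Ar, br)
print(f"steady-state gain: full {g_full:.6f}, reduced (order 10) {g_red:.6f}")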

Keywords: Model order reduction, Krylov subspace, subsea production system, finite element.

1692 The Effects on Yield and Yield Components of Different Level Cluster Tip Reduction and Foliar Boric Acid Applications on Alphonse Lavallee Grape Cultivar

Authors: A. Akın, H. Çoban

Abstract:

This study was carried out to determine the effects of the following treatments on the yield and yield components of the four-year-old Alphonse Lavallee grape variety (Vitis vinifera L.), grown on grafted 110 Paulsen rootstock in Konya province, Turkey, in the 2015 growing season: Control (C), 1/3 Cluster Tip Reduction (1/3 CTR), 1/6 Cluster Tip Reduction (1/6 CTR), 1/9 Cluster Tip Reduction (1/9 CTR), 1/3 CTR + Boric Acid (BA), 1/6 CTR + BA, and 1/9 CTR + BA. According to the results, the highest maturity index (21.46), the highest L* color value (32.07), the highest a* color value (1.74) and the highest b* color value (3.72) were obtained with the 1/9 CTR application, and the highest grape juice yield (736.67 ml) with the 1/3 CTR + BA application. The effects of the applications on fresh grape yield, cluster weight and berry weight were not statistically significant.

Keywords: Alphonse Lavallee grape cultivar, different cluster tip reduction (1/3, 1/6, 1/9), foliar boric acid application, yield, quality.

1691 An Efficient Activity Network Reduction Algorithm Based on the Label Correcting Tracing Algorithm

Authors: Weng Ming Chu

Abstract:

When faced with stochastic networks in which activity durations are uncertain, securing the network completion time becomes problematic, not only because of the non-identical duration pdf of each node, but also because of the interdependence of network paths. As evidenced by Adlakha & Kulkarni [1], many methods and algorithms have been put forward in an attempt to resolve this issue, but most have encountered the same large-network-size problem. In this research, we therefore focus on network reduction through a combined series/parallel mechanism. Our suggested algorithm, named the Activity Network Reduction Algorithm (ANRA), can efficiently transform a large network into a Series/Parallel Irreducible Network (SPIN). The SPIN can enhance stochastic network analysis, as well as serve as a judgment of symmetry in graph theory.
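
The series/parallel mechanism itself is easy to sketch for deterministic durations: an intermediate node with a single incoming and a single outgoing arc is collapsed (durations add), and parallel arcs between the same node pair are merged (the maximum survives). For stochastic networks the same steps would operate on distributions (convolution and maximum of random variables); the example activity-on-arc network below is invented.

edges = {}                                            # (tail, head) -> duration
def add(u, v, d):
    edges[(u, v)] = max(edges.get((u, v), 0), d)      # parallel merge on insertion

for u, v, d in [("s", "a", 3), ("a", "b", 2), ("b", "t", 4),
                ("s", "c", 5), ("c", "t", 1), ("s", "t", 6)]:
    add(u, v, d)

def reduce_sp(edges, source="s", sink="t"):
    changed = True
    while changed:
        changed = False
        nodes = {n for e in edges for n in e} - {source, sink}
        for n in nodes:
            incoming = [e for e in edges if e[1] == n]
            outgoing = [e for e in edges if e[0] == n]
            if len(incoming) == 1 and len(outgoing) == 1:        # series reduction
                (u, _), (_, w) = incoming[0], outgoing[0]
                d = edges.pop(incoming[0]) + edges.pop(outgoing[0])
                edges[(u, w)] = max(edges.get((u, w), 0), d)     # merge any resulting parallel arc
                changed = True
                break
    return edges                                      # irreducible remainder (the SPIN)

print(reduce_sp(edges))   # this example is fully reducible and collapses to a single (s, t) arc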

Keywords: Series/Parallel network, Stochastic network, Network reduction, Interdictive Graph, Complexity Index.
