Search results for: Complexity reduction approach
6708 The Multi-scenario Knapsack Problem: An Adaptive Search Algorithm
Authors: Mhand Hifi, Hedi Mhalla, Mustapha Michaphy
Abstract:
In this paper, we study the multi-scenario knapsack problem, a variant of the well-known NP-hard single knapsack problem. We investigate the use of an adaptive algorithm for solving the problem heuristically. The method combines two complementary phases: a size-reduction phase and a dynamic 2-opt procedure. First, the reduction phase applies a polynomial reduction strategy that is used to reduce the size of the problem. Second, the adaptive search procedure is applied in order to attain a feasible solution. Finally, the performances of two versions of the proposed algorithm are evaluated on a set of randomly generated instances.
Keywords: combinatorial optimization, max-min optimization, knapsack, heuristics, problem reduction
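For illustration only, here is a minimal Python sketch of a greedy max-min knapsack heuristic followed by a 2-opt-style swap pass; the data, the ratio rule and the swap routine are assumptions for demonstration, not the reduction strategy or adaptive search described in the abstract.

```python
# Hypothetical sketch of a max-min (multi-scenario) knapsack heuristic:
# greedy construction followed by a simple pairwise-swap improvement pass.
# profits[s][i] is the profit of item i under scenario s; the objective is
# the worst-case (minimum over scenarios) total profit.

def worst_case_profit(selected, profits):
    return min(sum(p[i] for i in selected) for p in profits)

def greedy_maxmin_knapsack(weights, profits, capacity):
    n = len(weights)
    ratio = lambda i: min(p[i] for p in profits) / weights[i]  # worst-case profit per unit weight
    selected, load = [], 0
    for i in sorted(range(n), key=ratio, reverse=True):
        if load + weights[i] <= capacity:
            selected.append(i)
            load += weights[i]
    return selected, load

def swap_improve(selected, weights, profits, capacity, load):
    # 2-opt-style pass: try swapping one chosen item for one unchosen item
    best = worst_case_profit(selected, profits)
    outside = [i for i in range(len(weights)) if i not in selected]
    improved = True
    while improved:
        improved = False
        for i in list(selected):
            for j in outside:
                if load - weights[i] + weights[j] <= capacity:
                    cand = [k for k in selected if k != i] + [j]
                    val = worst_case_profit(cand, profits)
                    if val > best:
                        selected, best = cand, val
                        load += weights[j] - weights[i]
                        outside.remove(j); outside.append(i)
                        improved = True
                        break
            if improved:
                break
    return selected, best

weights = [4, 3, 5, 2, 6]
profits = [[10, 7, 12, 4, 9],   # scenario 1
           [8, 9, 6, 5, 11]]    # scenario 2
sel, load = greedy_maxmin_knapsack(weights, profits, capacity=10)
sel, val = swap_improve(sel, weights, profits, 10, load)
print("selected items:", sel, "worst-case profit:", val)
```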
6707 Combining Diverse Neural Classifiers for Complex Problem Solving: An ECOC Approach
Authors: R. Ebrahimpour, M. Abbasnezhad Arabi, H. Babamiri Moghaddam
Abstract:
Combining classifiers is a useful method for solving complex problems in machine learning. The ECOC (Error Correcting Output Codes) method has been widely used for designing combining classifiers with an emphasis on the diversity of classifiers. In this paper, in contrast to the standard ECOC approach in which individual classifiers are chosen homogeneously, classifiers are selected according to the complexity of the corresponding binary problem. We use the SATIMAGE database (containing 6 classes) for our experiments. The recognition error rate of our proposed method is 10.37%, which indicates a considerable improvement in comparison with the conventional ECOC and stacked generalization methods.
Keywords: Error correcting output code, combining classifiers, neural networks.
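As a hedged illustration of the general ECOC idea (not the paper's complexity-based classifier selection), the following sketch trains scikit-learn's OutputCodeClassifier with a neural-network base learner on synthetic 6-class data standing in for SATIMAGE; the hyperparameters are placeholders.

```python
# Minimal ECOC sketch with scikit-learn: one neural-network base learner is
# reused for every binary column of the code matrix. Synthetic 6-class data
# stands in for the SATIMAGE database.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OutputCodeClassifier
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=36, n_informative=12,
                           n_classes=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

base = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
ecoc = OutputCodeClassifier(base, code_size=2.0, random_state=0)  # 12 binary dichotomies
ecoc.fit(X_tr, y_tr)
print("ECOC error rate: %.2f%%" % (100 * (1 - ecoc.score(X_te, y_te))))
```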
6706 Removal of Hexavalent Chromium from Wastewater by Use of Scrap Iron
Authors: Marius Gheju, Rodica Pode
Abstract:
Hexavalent chromium is highly toxic to most living organisms and a known human carcinogen by the inhalation route of exposure. Therefore, treatment of Cr(VI)-contaminated wastewater is essential before its discharge to natural water bodies. Cr(VI) reduction to Cr(III) can be beneficial because a more mobile and more toxic chromium species is converted to a less mobile and less toxic form. Zero-valence-state metals, such as scrap iron, can serve as electron donors for reducing Cr(VI) to Cr(III). The influence of pH on the capacity of scrap iron to reduce Cr(VI) was investigated in this study. The maximum reduction capacity of scrap iron was observed at the beginning of the column experiments; the lower the pH, the longer the experiment duration with maximum scrap iron reduction capacity. The experimental results showed that the highest maximum reduction capacity of scrap iron was 12.5 mg Cr(VI)/g scrap iron at pH 2.0, and decreased with increasing pH down to 1.9 mg Cr(VI)/g scrap iron at pH 7.3.
Keywords: hexavalent chromium, heavy metals, scrap iron, reduction capacity, wastewater treatment.
6705 Vibration Reduction Module with Flexure Springs for Personal Tools
Authors: Donghyun Hwang, Soo-Hun Lee, Moon G. Lee
Abstract:
In various working fields, vibration may cause injury to the human body, especially when the vibration is constantly and repeatedly transferred to the operator. This gives rise to a serious physical problem, the so-called Raynaud phenomenon. In this paper, we propose a vibration transmissibility reduction module with a flexure mechanism for personal tools. First, we select a target personal tool, a grass cutter, and measure the level of vibration transmissibility at the hand. Then, we develop the concept design of a module whose stiffness reduces the vibration transmissibility by more than 20%, where the vibration transmissibility is measured with an accelerometer. In addition, the vibration reduction can be enhanced when the interior gap between the inner and outer body is filled with silicone gel. This will be verified by further experiments.
Keywords: Flexure spring, tool engineering, vibration damping.
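As a rough, hedged aside on the 20% target, the standard single-degree-of-freedom transmissibility formula T = sqrt((1 + (2*z*r)^2) / ((1 - r^2)^2 + (2*z*r)^2)) can be evaluated for a few frequency ratios; the damping ratio and frequency ratios below are illustrative assumptions, not values from the grass-cutter module.

```python
# Back-of-the-envelope transmissibility of a single-degree-of-freedom isolator;
# r is the excitation-to-natural-frequency ratio, zeta the damping ratio.
# Numbers are illustrative only, not measurements from the paper.
import math

def transmissibility(r, zeta):
    num = 1 + (2 * zeta * r) ** 2
    den = (1 - r ** 2) ** 2 + (2 * zeta * r) ** 2
    return math.sqrt(num / den)

for r in (1.5, 2.0, 3.0):
    T = transmissibility(r, zeta=0.1)
    print("r = %.1f  ->  T = %.2f  (reduction %.0f%%)" % (r, T, 100 * (1 - T)))
```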
6704 Two-Stage Approach for Solving the Multi-Objective Optimization Problem on Combinatorial Configurations
Authors: Liudmyla Koliechkina, Olena Dvirna
Abstract:
The statement of the multi-objective optimization problem on combinatorial configurations is formulated, and an approach to its solution is proposed. The problem is of interest as a combinatorial optimization problem with many criteria, which serves as a model of many applied tasks. The approach consists of two stages: the first reduces the multi-objective problem to a single criterion based on existing multi-objective optimization methods; the second solves the resulting single-criterion combinatorial optimization problem by the horizontal combinatorial method. This approach provides the optimal solution to the multi-objective optimization problem on combinatorial configurations, taking into account additional restrictions, in a finite number of steps.
Keywords: Discrete set, linear combinatorial optimization, multi-objective optimization, multipermutation, Pareto solutions, partial permutation set, permutation, structural graph.
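A minimal sketch of the two-stage idea, assuming a simple weighted-sum scalarization for stage one and brute-force enumeration of permutations in place of the horizontal combinatorial method for stage two; the cost and risk matrices and the weights are invented for illustration.

```python
# Illustrative two-stage sketch on a tiny permutation (assignment) problem:
# stage 1 scalarizes two criteria into one weighted objective, stage 2 searches
# the combinatorial configuration by brute-force enumeration of permutations.
from itertools import permutations

cost = [[4, 2, 7], [3, 8, 1], [6, 5, 9]]   # criterion 1: assignment cost
risk = [[1, 3, 2], [4, 1, 5], [2, 6, 1]]   # criterion 2: assignment risk
w1, w2 = 0.6, 0.4                          # scalarization weights (stage 1)

def scalarized(perm):
    return sum(w1 * cost[i][j] + w2 * risk[i][j] for i, j in enumerate(perm))

best = min(permutations(range(3)), key=scalarized)     # stage 2: enumerate
print("best assignment:", best, "value:", round(scalarized(best), 2))
```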
6703 The Study of Groundcover for Heat Reduction
Authors: Winai Mankhatitham
Abstract:
This research investigated groundcover on the roof (green roof), which can reduce the temperature and carbon monoxide. The study is divided into 3 main aspects: 1. Types of groundcover affecting heat reduction; 2. The heat reduction efficiency of 3 types of groundcover, i.e. lawn, arachis pintoi, and purslane; 3. A database for designing green roofs. The study has been designed as experimental research by simulating the 3 types of groundcover in 3 trays placed in a greenhouse and recording the temperature change for 24 hours. The results showed that the groundcover with the highest heat reduction efficiency was lawn; the density of the lawn can limit heat transfer to the soil. For further study, there should be a comparative study of the thickness and types of soil to provide more information on the suitable types of groundcover and soil for designing an energy-saving green roof.
Keywords: Groundcover, Green Roof, Heat Reduction, Energy Saving.
6702 Digital Transformation of Lean Production: Systematic Approach for the Determination of Digitally Pervasive Value Chains
Authors: Peter Burggräf, Matthias Dannapfel, Hanno Voet, Patrick-Benjamin Bök, Jérôme Uelpenich, Julian Hoppe
Abstract:
The increasing digitalization of value chains can help companies to handle rising complexity in their processes and thereby reduce the steadily increasing planning and control effort in order to raise performance limits. Due to technological advances, companies face the challenge of smart value chains for the purpose of improving productivity, handling the increasing time and cost pressure and meeting the need for individualized production. Therefore, companies need to ensure quick and flexible decisions to create self-optimizing processes and, consequently, to make their production more efficient. Lean production, as the most commonly used paradigm for complexity reduction, reaches its limits when it comes to variant-flexible production and constantly changing market and environmental conditions. To lift the performance limits inbuilt in current value chains, new methods and tools must be applied. Digitalization provides the potential to derive these new methods and tools. However, companies lack the experience to harmonize different digital technologies, and there is no practicable framework which instructs the transformation of current value chains into digitally pervasive value chains. Current research shows that a connection between lean production and digitalization exists. This link is based on factors such as people, technology and organization. In this paper, the introduced method for the determination of digitally pervasive value chains takes the factors people, technology and organization into account and extends existing approaches by a new dimension. It is the first systematic approach for the digital transformation of lean production and consists of four steps: The first step of ‘target definition’ describes the target situation and defines the depth of the analysis with regard to the inspection area and the level of detail. The second step of ‘analysis of the value chain’ verifies the lean-ability of processes and places a special focus on the integration capacity of digital technologies in order to raise the limits of lean production. Furthermore, the ‘digital evaluation process’ ensures the usefulness of digital adaptations regarding their practicability and their integrability into the existing production system. Finally, the method defines actions to be performed based on the evaluation process and in accordance with the target situation. As a result, the validation and optimization of the proposed method in a German company from the electronics industry show that the digital transformation of current value chains based on lean production raises their inbuilt performance limits.
Keywords: Digitalization, digital transformation, lean production, Industrie 4.0, value chain.
6701 Functional Decomposition Based Effort Estimation Model for Software-Intensive Systems
Authors: Nermin Sökmen
Abstract:
An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software or some combination of the two, as well as high-level software solutions. This paper first focuses on functional decomposition techniques to measure the functional complexity of a computer system and investigates its impact on system development effort. Later, it examines the effects of technical difficulty and design team capability factors in order to construct the best effort estimation model. Using the traditional regression analysis technique, the study develops a system development effort estimation model which takes functional complexity, technical difficulty and design team capability factors as input parameters. Finally, the assumptions of the model are tested.
Keywords: Functional complexity, functional decomposition, development effort, technical difficulty, design team capability, regression analysis.
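A minimal sketch of the kind of multiple linear regression the abstract describes, with synthetic data; the feature values, effort figures and resulting coefficients are invented and do not come from the paper.

```python
# Sketch: effort ~ functional complexity + technical difficulty + team capability,
# fitted by ordinary least squares. All data below is synthetic.
import numpy as np

# columns: functional complexity, technical difficulty, design team capability
X = np.array([[120, 3, 7], [250, 5, 6], [90, 2, 8], [310, 6, 5],
              [180, 4, 7], [400, 7, 4], [150, 3, 6], [275, 5, 5]], float)
effort = np.array([14, 38, 9, 55, 24, 78, 18, 44], float)  # person-months

A = np.column_stack([np.ones(len(X)), X])          # add intercept column
coef, *_ = np.linalg.lstsq(A, effort, rcond=None)  # ordinary least squares fit
print("intercept and coefficients:", np.round(coef, 3))
print("predicted effort for [200, 4, 6]:",
      round(float(coef @ [1, 200, 4, 6]), 1), "person-months")
```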
6700 An Improved Transfer Logic of the Two-Path Algorithm for Acoustic Echo Cancellation
Abstract:
Adaptive echo cancellers with the two-path algorithm are applied to avoid false adaptation during double-talk situations. In the two-path algorithm, several transfer logic solutions have been proposed to control the filter update. This paper presents an improved transfer logic solution. It improves the convergence speed of the two-path algorithm and allows a reduction of the memory elements and computational complexity. Simulation results show the improved performance of the proposed solution.
Keywords: Acoustic echo cancellation, Echo return loss enhancement (ERLE), Two-path algorithm, Transfer logic.
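For orientation, the sketch below implements the generic two-path scheme (background NLMS filter, foreground fixed filter, coefficient transfer when the background error power is clearly lower), not the improved transfer logic proposed in the paper; the signals, filter length and thresholds are assumptions.

```python
# Hedged sketch of a two-path echo canceller: the background NLMS filter adapts
# continuously and its coefficients are copied to the fixed foreground filter
# only when the background filter cancels the echo noticeably better.
import numpy as np

rng = np.random.default_rng(0)
N, L = 4000, 32
x = rng.standard_normal(N)                                  # far-end signal
h = rng.standard_normal(L) * np.exp(-0.2 * np.arange(L))    # echo path
d = np.convolve(x, h)[:N] + 0.01 * rng.standard_normal(N)   # microphone signal

w_bg = np.zeros(L)       # background (adaptive) filter
w_fg = np.zeros(L)       # foreground (fixed) filter
mu, eps = 0.5, 1e-6
e_bg_pow = e_fg_pow = 1e-6

for n in range(L, N):
    xv = x[n - L + 1:n + 1][::-1]
    e_bg = d[n] - w_bg @ xv
    e_fg = d[n] - w_fg @ xv
    w_bg += mu * e_bg * xv / (xv @ xv + eps)     # NLMS update
    e_bg_pow = 0.99 * e_bg_pow + 0.01 * e_bg**2  # smoothed error powers
    e_fg_pow = 0.99 * e_fg_pow + 0.01 * e_fg**2
    if e_bg_pow < 0.5 * e_fg_pow:                # background clearly better
        w_fg = w_bg.copy()                       # transfer coefficients

residual = d - np.convolve(x, w_fg)[:N]
erle = 10 * np.log10(np.mean(d[L:]**2) / np.mean(residual[L:]**2))
print("ERLE with foreground filter: %.1f dB" % erle)
```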
6699 A Fuzzy Approach for Delay Proportion Differentiated Service
Authors: Mehran Garmehi, Yasser Mansouri
Abstract:
There are two paradigms proposed to provide QoS for Internet applications: Integrated Services (IntServ) and Differentiated Services (DiffServ). IntServ is not appropriate for a large network like the Internet because it is very complex. Therefore, to reduce the complexity of QoS management, DiffServ was introduced to provide QoS within a domain using aggregation of flows and per-class service. In these networks, the QoS between classes is constant, which allows low-priority traffic to be affected by high-priority traffic; this is not suitable. In this paper, we propose a fuzzy controller which reduces the effect of the low-priority class on higher-priority ones. Our simulations show that our approach reduces the latency dependency of the low-priority class on higher-priority ones in an effective manner.
Keywords: QoS, Differentiated Service (DiffServ), Fuzzy Controller, Delay.
6698 Face Recognition Using Double Dimension Reduction
Authors: M. A Anjum, M. Y. Javed, A. Basit
Abstract:
In this paper a new approach to face recognition is presented that achieves double dimension reduction, making the system computationally efficient with better recognition results. In pattern recognition techniques, the discriminative information of an image increases with resolution up to a certain extent; consequently, face recognition results improve with increasing face image resolution and level off at a certain resolution. In the proposed model of face recognition, an image decimation algorithm is first applied to the face image for dimension reduction to the resolution level that provides the best recognition results. Due to its better computational speed and feature extraction potential, the Discrete Cosine Transform (DCT) is then applied to the face image. A subset of DCT coefficients from low to mid frequencies, which represents the face adequately and provides the best recognition results, is retained. A trade-off between the decimation factor, the number of DCT coefficients retained and the recognition rate with minimum computation is obtained. Preprocessing of the image is carried out to increase its robustness against variations in pose and illumination level. This new model has been tested on different databases, which include the ORL database, the Yale database and a color database. The proposed technique has performed much better compared to other techniques. The significance of the model is twofold: (1) dimension reduction up to an effective and suitable face image resolution, and (2) appropriate DCT coefficients are retained to achieve the best recognition results with varying image poses, intensity and illumination levels.
Keywords: Biometrics, DCT, Face Recognition, Feature extraction.
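A hedged sketch of the pipeline the abstract outlines (decimation, 2-D DCT, retention of low/mid-frequency coefficients, nearest-neighbour matching); the decimation factor, coefficient count and random stand-in images are placeholders rather than the paper's tuned values.

```python
# Rough double-dimension-reduction sketch: decimate the face image, take a 2-D
# DCT, keep a low/mid-frequency block of coefficients, classify by nearest match.
import numpy as np
from scipy.fft import dctn

def dct_features(img, decimation=2, keep=12):
    small = img[::decimation, ::decimation]     # stage 1: image decimation
    coeffs = dctn(small, norm="ortho")          # stage 2: 2-D DCT
    return coeffs[:keep, :keep].ravel()         # retain low/mid frequencies

def nearest_neighbour(query, gallery_feats, gallery_labels):
    d = [np.linalg.norm(query - g) for g in gallery_feats]
    return gallery_labels[int(np.argmin(d))]

rng = np.random.default_rng(1)
gallery = [rng.random((112, 92)) for _ in range(4)]   # stand-in face images
labels = ["A", "B", "C", "D"]
feats = [dct_features(g) for g in gallery]
probe = gallery[2] + 0.05 * rng.random((112, 92))     # noisy probe of subject C
print("identified as:", nearest_neighbour(dct_features(probe), feats, labels))
```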
6697 An Agent-Based Approach to Immune Modelling: Priming Individual Response
Authors: Dimitri Perrin, Heather J. Ruskin, Martin Crane
Abstract:
This study focuses on examining why the range of experience with respect to HIV infection is so diverse, especially in regard to the latency period. An agent-based approach in modelling the infection is used to extract high-level behaviour which cannot be obtained analytically from the set of interaction rules at the cellular level. A prototype model encompasses local variation in baseline properties, contributing to the individual disease experience, and is included in a network which mimics the chain of lymph nodes. The model also accounts for stochastic events such as viral mutations. The size and complexity of the model require major computational effort, and parallelisation methods are used.
Keywords: HIV, Immune modelling, Agent-based system, individual response.
6696 Optimum Design of Heat Exchanger in Diesel Engine Cold EGR for Pollutants Reduction
Authors: Nasser Ghassembaglou, Armin Rahmatfam, Faramarz Ranjbar
Abstract:
Using the cold EGR method with a variable venturi and turbocharger has a very significant effect on the simultaneous reduction of NOx and grime. The EGR cooler is one of the most important parts in the cold EGR circuit. In this paper, the optimum design of the cooler for operation at different EGR percentages is investigated and optimized with respect to determining the optimum temperature of the exhaust gases, efficiency growth, reduction of weight, dimensions, cost and sediment, and also optimum performance when using gas oil containing significant amounts of sulfur.
Keywords: Cold EGR, NOX, Cooler.
6695 Hexavalent Chromium Pollution Abatement by use of Scrap Iron
Authors: Marius Gheju, Laura Cocheci
Abstract:
In this study, the reduction of Cr(VI) by scrap iron, a cheap and locally available industrial waste, was investigated in a continuous system. The greater scrap iron efficiency observed for the first two sections of the column filling indicates that most of the reduction process was carried out in the bottom half of the column filling. This was ascribed to a constant decrease of the Cr(VI) concentration inside the filling as the water front passes from the bottom to the top end of the column. While the bottom section of the column filling was heavily passivated with secondary mineral phases, the top section was less affected by the passivation process; therefore, the column filling would likely ensure the reduction of Cr(VI) for time periods longer than 216 hours. The experimental results indicate that fixed-bed columns packed with scrap iron could be successfully used for the first step of Cr(VI)-polluted wastewater treatment. However, the mass of the scrap iron filling should be carefully estimated since it significantly affects the Cr(VI) reduction efficiency.
Keywords: hexavalent chromium, heavy metals, scrap iron, reduction capacity, wastewater treatment.
6694 Reduction Conditions of Briquetted Solid Wastes Generated by the Integrated Iron and Steel Plant
Authors: Gökhan Polat, Dicle Kocaoğlu Yılmazer, Muhlis Nezihi Sarıdede
Abstract:
Iron oxides are the main input for producing iron in integrated iron and steel plants. During the production of iron from iron oxides, some wastes with high iron content occur. These main wastes can be classified as basic oxygen furnace (BOF) sludge, flue dust and rolling scale. Recycling of these wastes is of great importance for both environmental reasons and the reduction of production costs. In this study, recycling experiments were performed on basic oxygen furnace sludge, flue dust and rolling scale, which contain 53.8%, 54.3% and 70.2% iron, respectively. These wastes were mixed together with coke as a reducer, and the mixtures were pressed to obtain cylindrical briquettes. The briquettes were pressed under various compacting forces from 1 ton to 6 tons. Also, both the stoichiometric amount of coke and twice the stoichiometric amount were added to investigate the effect of the coke amount on the reduction properties of the waste mixtures. Then, the briquettes were reduced at 1000°C and 1100°C for 30, 60, 90, 120 and 150 min in a muffle furnace. According to the results of the reduction experiments, the effects of compacting force, temperature and time on the reduction ratio of the wastes were determined. It is found that a compacting force of 1 ton, a reduction time of 150 min and a temperature of 1100°C are the optimum conditions to obtain a reduction ratio higher than 75%.
Keywords: Iron oxide wastes, reduction, coke, recycling.
6693 Performance Analysis of HSDPA Systems using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding
Authors: K. Anitha Sheela, J. Tarun Kumar
Abstract:
HSDPA is a new feature introduced in the Release-5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay as compared to the Release-99 DSCH. Until now, the HSDPA system has used turbo coding, which is the best coding technique for approaching the Shannon limit. However, the main drawbacks of turbo coding are its high decoding complexity and high latency, which make it unsuitable for some applications like satellite communications, since the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity. However, LDPC coding increases the encoding complexity. Though the complexity of the transmitter increases at the NodeB, the end user is at an advantage in terms of receiver complexity and bit-error rate. In this paper, the LDPC encoder is implemented using a sparse parity-check matrix H to generate a codeword, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that in LDPC coding the BER suddenly drops as the number of iterations increases with a small increase in Eb/No, which is not possible in turbo coding. Also, the same BER was achieved using fewer iterations; hence, the latency and receiver complexity are decreased for LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable and more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
Keywords: AMC, HSDPA, LDPC, WCDMA, 3GPP.
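As a loose illustration of the sparse parity-check and iterative-decoding idea only, the toy sketch below uses a tiny systematic code with hard-decision bit flipping; a real LDPC code for HSDPA would use a much larger sparse H and belief propagation, and everything here is an assumption for demonstration.

```python
# Tiny hard-decision (bit-flipping) decoding demo with a systematic parity-check
# matrix, just to show the H-matrix / iterative-decoding mechanics.
import numpy as np

P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])     # systematic generator, G = [I | P]
H = np.hstack([P.T, np.eye(3, dtype=int)])   # parity-check matrix, H = [P^T | I]

def bit_flip_decode(r, max_iter=10):
    r = r.copy()
    for _ in range(max_iter):
        syndrome = H @ r % 2
        if not syndrome.any():
            return r
        unsatisfied = H.T @ syndrome         # failed checks each bit is part of
        r[np.argmax(unsatisfied)] ^= 1       # flip the most suspicious bit
    return r

msg = np.array([1, 0, 1, 1])
codeword = msg @ G % 2
received = codeword.copy(); received[3] ^= 1  # single-bit error in a data position
decoded = bit_flip_decode(received)
print("corrected:", np.array_equal(decoded, codeword), "message:", decoded[:4])
```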
6692 Evaluation of NH3-Slip from Diesel Vehicles Equipped with Selective Catalytic Reduction Systems by Neural Networks Approach
Authors: Mona Lisa M. Oliveira, Nara A. Policarpo, Ana Luiza B. P. Barros, Carla A. Silva
Abstract:
Selective catalytic reduction systems for nitrogen oxide reduction by ammonia have been the technology chosen by most diesel vehicle (i.e., bus and truck) manufacturers in Brazil, as well as in Europe. Furthermore, under some conditions, over-stoichiometric ammonia availability is also needed, which increases the NH3 slip even more. Ammonia (NH3) in this vehicle exhaust aftertreatment system provides maximum NOx removal efficiency if a significant amount of NH3 is stored on its catalyst surface. In other words, practice shows that slightly less than 100% NOx conversion is usually targeted, so that the aqueous urea solution hydrolyzes to NH3 via the formation of other species, under relatively low temperatures. This paper presents a model based on neural networks integrated with a road vehicle simulator that allows NH3-slip emission factors to be estimated for different driving conditions and patterns. The proposed model generates high NH3 slips, which are also not limited in Brazil, but more efforts need to be made to elucidate the contribution of vehicle-emitted NH3 to the urban atmosphere.
Keywords: Ammonia slip, neural network, vehicle emissions, SCR-NOx.
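Purely as an illustrative sketch of a neural-network emission-factor regressor, the code below fits a small MLP on made-up driving variables; the features, units, data and prediction are assumptions and not related to the paper's simulator or measurements.

```python
# Toy neural-network regressor mapping driving variables to an NH3-slip factor.
# The features and the synthetic target relation are invented for demonstration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# columns: mean speed [km/h], engine load [-], SCR temperature [C], urea dosing [-]
X = rng.uniform([10, 0.1, 180, 0.8], [90, 0.9, 350, 1.2], size=(300, 4))
# synthetic target: slip grows with dosing and load, falls with temperature
y = 5 * X[:, 3] + 2 * X[:, 1] - 0.01 * X[:, 2] + 0.1 * rng.standard_normal(300)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                   random_state=0))
model.fit(X, y)
pred = model.predict([[60, 0.5, 250, 1.0]])[0]
print("predicted NH3-slip factor:", round(float(pred), 3))
```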
6691 E-Business Security: Methodological Considerations
Authors: Ja'far Alqatawna, Jawed Siddiqi, Babak Akhgar, Mohammad Hjouj Btoush
Abstract:
A great deal of research work in the field of information systems security has been based on a positivist paradigm. Applying the reductionism concept of the positivist paradigm to information security means missing the bigger picture, and this lack of holism could be one of the reasons why security is still overlooked, comes as an afterthought or is perceived from a purely technical dimension. We need to reshape our thinking and attitudes towards security, especially in a complex and dynamic environment such as e-Business, to develop a holistic understanding of e-Business security in relation to its context as well as considering all the stakeholders in the problem area. In this paper we argue the suitability of, and need for, a more inductive interpretive approach and qualitative research methods to investigate e-Business security. Our discussion is based on a holistic framework of enquiry, the nature of the research problem, the underlying theoretical lens and the complexity of the e-Business environment. At the end we present a research strategy for developing a holistic framework for the understanding of e-Business security problems in the context of developing countries, based on an interdisciplinary inquiry which considers their needs and requirements.
Keywords: e-Business security, complexity, methodological considerations, interpretive qualitative research, case study method.
6690 Theoretical Considerations for Software Component Metrics
Authors: V. Lakshmi Narasimhan, Bayu Hendradjaya
Abstract:
We have defined two suites of metrics, which cover static and dynamic aspects of component assembly. The static metrics measure complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. Dynamic metrics are collected during the runtime of a complete application. Dynamic metrics are useful to identify super-components and to evaluate the degree of utilisation of various components. In this paper both static and dynamic metrics are evaluated using Weyuker's set of properties. The result shows that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite with McCall's Quality Model and illustrate their impact on product quality and on the management of component-based product development.
Keywords: Component Assembly, Component Based Software Engineering, CORBA Component Model, Software Component Metrics.
6689 Climate Change and Environmental Education: The Application of Concept Map for Representing the Knowledge Complexity of Climate Change
Authors: Hsueh-Chih Chen, Yau-Ting Sung, Tsai-Wen Lin, Hung-Teng Chou
Abstract:
Climate Change, which involves high knowledge complexity, has become an essential issue due to its significant impact on human existence. Therefore, specific national policies, some of which address the educational aspects, have been published for overcoming this imperative problem. Accordingly, this study aims to analyze and integrate the relationship between Climate Change and environmental education, and to apply the perspective of the concept map to represent the knowledge contents and structures of Climate Change; by doing so, the knowledge contents of Climate Change can be represented in an even more comprehensive way and used as a tool for environmental education. The method adopted for this study is a knowledge conversion model built on a platform for experts and teachers, who were the participants of this study, to cooperate and combine each participant's standpoint into a complete knowledge framework that is the foundation for structuring the concept map. The result of this research contains the important concepts, the precise propositions and the entire concept map representing the robust concepts of Climate Change.
Keywords: Climate Change, knowledge complexity, concept map.
6688 Amino Acid Coated Silver Nanoparticles: A Green Catalyst for Methylene Blue Reduction
Authors: Abhishek Chandra, Man Singh
Abstract:
Highly stable and homogeneously dispersed amino acid coated silver nanoparticles (ANP) of ≈ 10 nm diameter, ranging from 420 to 430 nm, are prepared by adding AgNO3 solution to a solution of the gum of Azadirachta indica at 373.15 K. The amino acids were selected based on their polarity. The synthesized nanoparticles were characterized by UV-Vis and FTIR spectroscopy, HR-TEM, XRD, SEM and 1H-NMR. The coated nanoparticles were used as a catalyst for the reduction of methylene blue dye in the presence of Sn(II) in aqueous, anionic and cationic micellar media. The rate of reduction of the dye was determined spectrophotometrically by measuring the absorbance at 660 nm, and followed the order Kcationic > Kanionic > Kwater. After 12 min and in the absence of the ANP, only 2%, 3% and 6% of the dye reduction was completed in aqueous, anionic and cationic micellar media, respectively, while in the presence of ANP coated by a polar neutral amino acid with a non-polar -R group, the reduction was 84%, 95% and 98% complete, respectively. The ANP coated with a polar neutral amino acid having a non-polar -R group increased the rate of reduction of the dye by 94, 3205 and 6370 folds in aqueous, anionic and cationic micellar media, respectively. Also, the rate of reduction of the dye increased by three folds when the micellar medium was changed from anionic to cationic with the ANP coated by a polar neutral amino acid having a non-polar -R group.
Keywords: Silver nanoparticle, surfactant, methylene blue, amino acid.
6687 Reduction of Peak Input Currents during Charge Pump Boosting in Monolithically Integrated High-Voltage Generators
Authors: Jan Doutreloigne
Abstract:
This paper describes two methods for the reduction of the peak input current during the boosting of Dickson charge pumps. Both methods are implemented in the fully integrated Dickson charge pumps of a high-voltage display driver chip for smart-card applications. Experimental results reveal good correspondence with Spice simulations and show a reduction of the peak input current by a factor of 6 during boosting.
Keywords: Bi-stable display driver, Dickson charge pump, high-voltage generator, peak current reduction, sub-pump boosting, variable frequency boosting.
6686 Development of a Quantitative Material Wastage Management Plan for Effective Waste Reduction in the Building Construction Industry
Authors: Kwok Tak Kit
Abstract:
Combating climate change is becoming a hot topic in various sectors. The building construction and infrastructure sectors contribute a significant proportion of waste and greenhouse gas (GHG) emissions to the environment in different countries and cities. However, there is little research on the micro-level of waste management, “building construction material wastage management,” and fewer reviews about regulatory control in the building construction sector. This paper focuses on the potential and importance of material wastage management and reviews the deficiencies of the current standard in taking into account the reduction of material wastage in a systematic and quantitative approach.
Keywords: Quantitative measurement, material wastage management plan, waste management, uncalculated waste, circular economy.
6685 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images
Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge
Abstract:
Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored and transferred. Dimensionality reduction techniques can be used to reduce the volume of data. In this paper, an approach to band selection based on clustering algorithms is presented. This approach allows the volume of data to be reduced. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. New attributes in relation to other studies in the literature, such as kurtosis and low correlation, are also considered. A comparison of the results of the approach using Fuzzy C-Means and K-Means with different attributes is performed. Both algorithms show similarly good results, particularly when the attributes variance and kurtosis are used in the clustering process, and are applicable to hyperspectral images.
Keywords: Band selection, fuzzy C-means, K-means, hyperspectral image.
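A minimal band-selection sketch under the assumption that K-Means (one of the two clustering algorithms compared in the abstract) clusters the bands by their pixel profiles and the band nearest each centroid is retained; Fuzzy C-Means, the NWHFC step and the kurtosis/correlation attributes are not reproduced here.

```python
# Band selection by clustering: treat each spectral band as a vector of pixel
# values, cluster the bands, and keep the band closest to each cluster centre.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_pixels, n_bands, n_selected = 500, 60, 8
cube = rng.random((n_pixels, n_bands))            # stand-in hyperspectral data

band_profiles = cube.T                            # one row per spectral band
km = KMeans(n_clusters=n_selected, n_init=10, random_state=0).fit(band_profiles)

selected = []
for c in range(n_selected):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(band_profiles[members] - km.cluster_centers_[c], axis=1)
    selected.append(int(members[np.argmin(dists)]))   # representative band
print("selected band indices:", sorted(selected))
```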
6684 An Analysis of Dynamic Economic Dispatch Using Search Space Reduction Based Gravitational Search Algorithm
Authors: K. C. Meher, R. K. Swain, C. K. Chanda
Abstract:
This paper presents a performance analysis of the dynamic search space reduction (DSR) based gravitational search algorithm (GSA) for solving the dynamic economic dispatch of thermal generating units with valve-point effects. Dynamic economic dispatch basically dictates the best setting of generator units for the anticipated load demand over a definite period of time. The presented technique incorporates an inequality-constraint treatment mechanism, known as the DSR strategy, to accelerate the optimization process. The method is demonstrated on five-unit test systems to verify its effectiveness and robustness. The simulation results are compared with other existing evolutionary methods reported in the literature. The comparison shows that the fuel cost and other performance measures of the presented approach yield fruitful results with a marginal simulation time.
Keywords: Dynamic economic dispatch, dynamic search space reduction strategy, gravitational search algorithm, ramp rate limits, valve-point effects.
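For context, a worked sketch of the standard valve-point fuel-cost model, F(P) = a + b*P + c*P^2 + |e*sin(f*(Pmin - P))|, evaluated for one hour's dispatch; the unit coefficients and schedule are illustrative, not the five-unit test system from the paper.

```python
# Fuel cost with valve-point effects for a small set of thermal units.
# Unit data and the dispatch schedule below are illustrative only.
import numpy as np

# columns: a, b, c, e, f, Pmin, Pmax  (one row per generating unit)
units = np.array([
    [500, 10.1, 0.0024, 300, 0.035, 100, 500],
    [400,  9.5, 0.0030, 250, 0.042,  80, 400],
    [300, 11.2, 0.0050, 150, 0.063,  50, 300],
])

def total_cost(P):
    a, b, c, e, f, Pmin, _ = units.T
    return float(np.sum(a + b * P + c * P**2 + np.abs(e * np.sin(f * (Pmin - P)))))

dispatch = np.array([350.0, 300.0, 150.0])   # one hour's generation schedule, MW
print("load served: %.0f MW, fuel cost: %.1f $/h"
      % (dispatch.sum(), total_cost(dispatch)))
```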
6683 Mammogram Image Size Reduction Using 16-8 bit Conversion Technique
Authors: Ayman A. AbuBaker, Rami S. Qahwaji, Musbah J. Aqel, Mohmmad H. Saleh
Abstract:
Two algorithms are proposed to reduce the storage requirements for mammogram images. The input image goes through a shrinking process that converts the 16-bit image to 8 bits by using a pixel-depth conversion algorithm, followed by an enhancement process. The performance of the algorithms is evaluated objectively and subjectively. A 50% reduction in size is obtained with no loss of significant data in the breast region.
Keywords: Breast cancer, Image processing, Image reduction, Mammograms, Image enhancement.
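A simple, hedged illustration of a 16-bit to 8-bit pixel-depth conversion by linear rescaling over the occupied intensity range; the paper's exact shrinking and enhancement steps are not reproduced, and the random image is a stand-in for a mammogram.

```python
# Pixel-depth conversion: map the occupied 16-bit intensity range linearly
# onto 8 bits, halving the storage requirement.
import numpy as np

def to_8bit(img16):
    lo, hi = int(img16.min()), int(img16.max())
    scaled = (img16.astype(np.float64) - lo) / max(hi - lo, 1)   # map to [0, 1]
    return (scaled * 255).round().astype(np.uint8)               # 8-bit output

mammogram = np.random.default_rng(0).integers(0, 4096, (256, 256), dtype=np.uint16)
img8 = to_8bit(mammogram)
print("storage: %d -> %d bytes" % (mammogram.nbytes, img8.nbytes))  # 50% reduction
```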
6682 Heterogeneous Attribute Reduction in Noisy System based on a Generalized Neighborhood Rough Sets Model
Authors: Siyuan Jing, Kun She
Abstract:
Neighborhood Rough Sets (NRS) has been proven to be an efficient tool for heterogeneous attribute reduction. However, most research has focused on dealing with complete and noiseless data. In fact, most information systems are noisy, namely, filled with incomplete and inconsistent data. In this paper, we introduce a generalized neighborhood rough sets model, called VPTNRS, to deal with the problem of heterogeneous attribute reduction in noisy systems. We generalize the classical NRS model with a tolerance neighborhood relation and probabilistic theory. Furthermore, we use the neighborhood dependency to evaluate the significance of a subset of heterogeneous attributes and construct a forward greedy algorithm for attribute reduction based on it. Experimental results show that the model is efficient for dealing with noisy data.
Keywords: attribute reduction, incomplete data, inconsistent data, tolerance neighborhood relation, rough sets.
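A compact sketch of forward greedy attribute reduction driven by a neighborhood dependency measure, written for the classical NRS setting rather than the VPTNRS generalization; the delta threshold, stopping rule and synthetic data are assumptions.

```python
# Classical neighborhood-rough-sets style reduction: a sample is in the positive
# region if every neighbour within delta (under the selected attributes) shares
# its class label; attributes are added greedily while dependency improves.
import numpy as np

def dependency(X, y, attrs, delta=0.15):
    if not attrs:
        return 0.0
    sub = X[:, attrs]
    pos = 0
    for i in range(len(X)):
        dist = np.linalg.norm(sub - sub[i], axis=1)
        neighbours = y[dist <= delta]
        pos += int(np.all(neighbours == y[i]))
    return pos / len(X)

def greedy_reduct(X, y, delta=0.15):
    remaining, reduct, best = list(range(X.shape[1])), [], 0.0
    while remaining:
        gain, attr = max((dependency(X, y, reduct + [a], delta), a) for a in remaining)
        if gain <= best:                      # no further improvement
            break
        reduct.append(attr); remaining.remove(attr); best = gain
    return reduct, best

rng = np.random.default_rng(0)
X = rng.random((200, 6))                      # normalised numeric attributes
y = (X[:, 0] + X[:, 3] > 1).astype(int)       # label depends on attributes 0 and 3
print("selected attributes and dependency:", greedy_reduct(X, y))
```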
6681 Reduction of Linear Time-Invariant Systems Using Routh-Approximation and PSO
Authors: S. Panda, S. K. Tomar, R. Prasad, C. Ardil
Abstract:
Order reduction of linear time-invariant systems employing two methods, one using the advantages of Routh approximation and the other an evolutionary technique, is presented in this paper. In the Routh approximation method, the denominator of the reduced-order model is obtained using Routh approximation, while the numerator of the reduced-order model is determined using the indirect approach of retaining the time moments and/or Markov parameters of the original system. With this method, the reduced-order model is guaranteed to be stable if the original high-order model is stable. In the second method, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model. The PSO method is based on the minimization of the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model pertaining to a unit step input. Both methods are illustrated through numerical examples.
Keywords: Model Order Reduction, Markov Parameters, Routh Approximation, Particle Swarm Optimization, Integral Squared Error, Steady State Stability.
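A hedged sketch of the second method's objective: a very small PSO minimizes the discretized ISE between the unit-step responses of a third-order plant and a candidate second-order model; the plant, swarm settings and stability guard are illustrative choices, not the paper's.

```python
# PSO minimizing the ISE between step responses of an original third-order
# system and a candidate second-order reduced model b0 / (s^2 + a1*s + a0).
import numpy as np
from scipy import signal

t = np.linspace(0, 10, 500)
plant = signal.TransferFunction([8], [1, 6, 11, 6])      # third-order original
_, y_full = signal.step(plant, T=t)

def ise(params):
    """Discretized integral squared error between step responses."""
    b0, a1, a0 = params
    if a1 <= 0 or a0 <= 0:                               # keep candidate stable
        return 1e9
    _, y_red = signal.step(signal.TransferFunction([b0], [1, a1, a0]), T=t)
    return float(np.sum((y_full - y_red) ** 2) * (t[1] - t[0]))

rng = np.random.default_rng(0)
pos = rng.uniform(0.1, 10, (20, 3))                      # 20 particles, 3 parameters
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([ise(p) for p in pos])
for _ in range(60):                                      # basic global-best PSO loop
    gbest = pbest[np.argmin(pbest_val)]
    vel = (0.7 * vel
           + 1.5 * rng.random(pos.shape) * (pbest - pos)
           + 1.5 * rng.random(pos.shape) * (gbest - pos))
    pos = pos + vel
    vals = np.array([ise(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]

best = pbest[np.argmin(pbest_val)]
print("reduced model: %.3f / (s^2 + %.3fs + %.3f), ISE = %.5f"
      % (best[0], best[1], best[2], pbest_val.min()))
```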
6680 Development of a RAM Simulation Model for Acid Gas Removal System
Authors: Ainul Akmar Mokhtar, Masdi Muhammad, Hilmi Hussin, Mohd Amin Abdul Majid
Abstract:
A reliability, availability and maintainability (RAM) model has been built for an acid gas removal plant for system analysis; it will play an important role in any process modifications required for achieving optimum performance. Due to the complexity of the plant, the model was based on a Reliability Block Diagram (RBD) with a Monte Carlo simulation engine. The model has been validated against actual plant data as well as local expert opinions, resulting in an acceptable simulation model. The results from the model showed that the operation and maintenance can be further improved, resulting in a reduction of the annual production loss.
Keywords: Acid gas removal plant, RAM model, Reliability block diagram.
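As a toy illustration of an RBD-plus-Monte-Carlo availability model (two redundant pumps in series with an absorber), the sketch below samples exponential up/down cycles; the structure, MTBF and MTTR values are invented and do not describe the actual acid gas removal plant.

```python
# Toy reliability-block-diagram Monte Carlo: up/down times are drawn from
# exponential distributions and block availabilities are combined in the usual
# series/parallel fashion. All parameters are invented.
import numpy as np

rng = np.random.default_rng(0)
HORIZON = 8760.0  # one year of operation, hours

def uptime_fraction(mtbf, mttr):
    """Simulate alternating up/down cycles and return the fraction of time up."""
    t, up = 0.0, 0.0
    while t < HORIZON:
        ttf = rng.exponential(mtbf)            # time to failure
        up += min(ttf, HORIZON - t)
        t += ttf + rng.exponential(mttr)       # add repair time
    return up / HORIZON

def system_availability(n_runs=2000):
    avail = []
    for _ in range(n_runs):
        pump_a = uptime_fraction(mtbf=1000, mttr=24)
        pump_b = uptime_fraction(mtbf=1000, mttr=24)
        absorber = uptime_fraction(mtbf=4000, mttr=72)
        parallel = 1 - (1 - pump_a) * (1 - pump_b)   # redundant pumps
        avail.append(parallel * absorber)            # series with absorber
    return float(np.mean(avail))

print("estimated system availability: %.4f" % system_availability())
```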
6679 The Effects on Yield and Yield Components of Different Level Cluster Tip Reduction and Foliar Boric Acid Applications on Alphonse Lavallee Grape Cultivar
Abstract:
This study was carried out to determine the effects of Control (C), 1/3 Cluster Tip Reduction (1/3 CTR), 1/6 Cluster Tip Reduction (1/6 CTR), 1/9 Cluster Tip Reduction (1/9 CTR), 1/3 CTR + Boric Acid (BA), 1/6 CTR + BA and 1/9 CTR + BA applications on the yield and yield components of the four-year-old Alphonse Lavallee grape variety (Vitis vinifera L.) grown on grafted 110 Paulsen rootstock in Konya province, Turkey, during the 2015 vegetation period. According to the results, the highest maturity index, 21.46, was obtained with the 1/9 CTR application; the highest grape juice yield, 736.67 ml, with the 1/3 CTR + BA application; the highest L* color value, 32.07, with the 1/9 CTR application; the highest a* color value, 1.74, with the 1/9 CTR application; and the highest b* color value, 3.72, with the 1/9 CTR application. The effects of the applications on grape fresh yield, cluster weight and berry weight were not found to be statistically significant.
Keywords: Alphonse Lavallee grape cultivar, different cluster tip reduction (1/3, 1/6, 1/9), foliar boric acid application, yield, quality.