Search results for: design process
7972 A Case Study on Product Development Performance Measurement
Authors: Liv Gingnell, Evelina Ericsson, Joakim Lilliesköld, Robert Langerström
Abstract:
In recent years, increased competition and lower profit margins have necessitated a focus on improving the performance of the product development process, an area that has traditionally been excluded from detailed steering and evaluation. Systematic improvement requires a good understanding of current performance, which is why interest in product development performance measurement has increased dramatically. This paper presents a case study that evaluates the product development performance measurement system used in a Swedish company that is part of a global corporate group. The study is based on internal documentation and eighteen in-depth interviews with stakeholders involved in the product development process. The results of the case study include a description of which metrics are in use, how they are employed, and their effect on the quality of the performance measurement system. In particular, having a well-defined process proved to have a major impact on the quality of the performance measurement system in this case.
Keywords: Outcome metric, Performance driver, Performance measurement, Product development process.
7971 Coordinated Design of TCSC Controller and PSS Employing Particle Swarm Optimization Technique
Authors: Sidhartha Panda, N. P. Padhy
Abstract:
This paper investigates the application of the Particle Swarm Optimization (PSO) technique for the coordinated design of a Power System Stabilizer (PSS) and a Thyristor Controlled Series Compensator (TCSC)-based controller to enhance power system stability. The design problem of the PSS and TCSC-based controllers is formulated as a time-domain optimization problem. The PSO algorithm is employed to search for optimal controller parameters. By minimizing a time-domain objective function that involves the deviation in the oscillatory rotor speed of the generator, the stability performance of the system is improved. To compare the capabilities of the PSS and the TCSC-based controller, both are first designed independently and then in a coordinated manner. The proposed controllers are tested on a weakly connected power system. Eigenvalue analysis and non-linear simulation results are presented to show the effectiveness of the coordinated design approach over individual designs. The simulation results show that the proposed controllers are effective in damping low-frequency oscillations resulting from various small disturbances such as changes in mechanical power input and reference voltage setting.
Keywords: Particle swarm optimization, Phillips-Heffron model, power system stability, PSS, TCSC.
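To make the search step concrete, here is a minimal particle swarm optimization sketch. It only illustrates the PSO loop under stated assumptions: the objective is a placeholder stand-in for the paper's time-domain rotor-speed-deviation criterion, and the bounds and number of controller parameters are hypothetical.

```python
# A minimal PSO sketch, assuming a generic objective as a stand-in for the
# time-domain rotor-speed-deviation criterion described in the abstract.
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, dim))      # positions = candidate controller parameters
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()                   # global best position

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                         # keep parameters inside their bounds
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# Hypothetical stand-in objective: in the paper this would come from a time-domain
# simulation of the rotor speed deviation; here a smooth test function is used instead.
objective = lambda p: np.sum((p - 0.5) ** 2)
bounds = np.array([[0.0, 1.0]] * 4)                        # e.g. four controller gains/time constants
best_params, best_cost = pso(objective, bounds)
print(best_params, best_cost)
```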
7970 The Impact of Local Decision-Making in Regional Development Schemes on the Achievement of Efficiency in EU Funds
Authors: Kuyucu Helvacioglu Asli Deniz, Tektas Arzu
Abstract:
European Union candidate status provides a strong motivation for decision-making in candidate countries in shaping regional development policy, where a transfer of power from the center to the periphery is envisioned. The process of Europeanization anticipates that candidate countries will configure their regional institutional templates according to the requirements of European Union policies, and it introduces new incentive instruments of the enlargement framework to be employed in regional development schemes. It is observed that the contribution of local actors to decision-making in the design of allocation architectures enhances the efficiency of the funds and increases the positive effects of the projects funded under regional development objectives. This study explores the performance of three regional development grant schemes in Turkey, established and allocated under the pre-accession process, with special emphasis on the roles of national and local actors in decision-making for regional development. Efficiency analyses have been conducted using the DEA methodology, which has proved to be a superior method for comparative efficiency and benchmarking measurements. The findings of this study, in parallel with similar international studies, show that the participation of local actors in decision-making on funding contributes to both the quality and the efficiency of the projects funded under the EU schemes.
Keywords: Efficiency, European Union Funds, Regional Development, Turkey
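For orientation, the following is a minimal sketch of the input-oriented CCR envelopment model that underlies DEA, solved as a linear program. The input/output matrices are hypothetical placeholders; the abstract does not publish the grant-scheme data.

```python
# Minimal input-oriented CCR DEA sketch using scipy's linear programming solver.
# The inputs/outputs below are hypothetical; the abstract does not publish the scheme data.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 2.0], [6.0, 3.0], [8.0, 5.0]])   # inputs,  one row per DMU (e.g. grant scheme)
Y = np.array([[60.0], [70.0], [80.0]])               # outputs, one row per DMU

def ccr_efficiency(k, X, Y):
    """Efficiency of DMU k: min theta s.t. X'lam <= theta*x_k, Y'lam >= y_k, lam >= 0."""
    n, m = X.shape            # n DMUs, m inputs
    s = Y.shape[1]            # s outputs
    c = np.r_[1.0, np.zeros(n)]                      # variables: [theta, lam_1..lam_n]
    # input constraints:  sum_j lam_j * x_ij - theta * x_ik <= 0
    A_in = np.c_[-X[k], X.T]
    b_in = np.zeros(m)
    # output constraints: -sum_j lam_j * y_rj <= -y_rk
    A_out = np.c_[np.zeros(s), -Y.T]
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k, X, Y):.3f}")
```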
7969 A Hypercube Social Feature Extraction and Multipath Routing in Delay Tolerant Networks
Authors: S. Balaji, M. Rajaram, Y. Harold Robinson, E. Golden Julie
Abstract:
Delay Tolerant Networks (DTNs) with sufficient state information, including trajectory and contact information, can maintain routing efficiency. However, state information is dynamic and hard to obtain without a global and/or long-term collection process. To deal with these problems, the internal social features of each node are used to perform the routing process. This type of application is motivated by human contact networks, in which people contact each other more frequently when they have more social features in common. Two processes were developed for this purpose: social feature extraction and multipath routing. The routing method then becomes a hypercube-based feature-matching process. Furthermore, the effectiveness of multipath routing is evaluated and compared with that of single-path routing.
Keywords: Delay tolerant networks, entropy, human contact networks, hypercubes, multipath routing, social features.
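To illustrate the hypercube-based feature-matching idea at the heart of the routing method, here is a small sketch assuming binary-coded social features and greedy single-path forwarding. The paper's actual feature extraction and multipath rules are not detailed in the abstract, so the encoding and contact graph below are hypothetical.

```python
# Minimal sketch of hypercube-style feature matching: each node sits at the corner of a
# hypercube given by its binary-coded social features, and a message is greedily forwarded to
# the neighbor whose features are closest (in Hamming distance) to the destination's features.

def hamming(a, b):
    """Number of differing feature bits between two nodes."""
    return sum(x != y for x, y in zip(a, b))

def greedy_route(src, dst, features, neighbors):
    """Follow neighbors that strictly reduce the feature distance to the destination."""
    path, current = [src], src
    while current != dst:
        best = min(neighbors[current], key=lambda n: hamming(features[n], features[dst]))
        if hamming(features[best], features[dst]) >= hamming(features[current], features[dst]):
            break  # no neighbor is socially closer; a multipath variant would try several branches
        path.append(best)
        current = best
    return path

# Hypothetical 3-bit social features (e.g. city, affiliation, age group) and contact graph.
features = {"A": (0, 0, 0), "B": (0, 1, 0), "C": (0, 1, 1), "D": (1, 1, 1)}
neighbors = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
print(greedy_route("A", "D", features, neighbors))   # ['A', 'B', 'C', 'D']
```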
7968 The Ethics of Dissent: The Case of David Kelly
Authors: A. Kayes, D. Christopher Kayes
Abstract:
In this paper, we rely on the story of the late British weapons inspector David Kelly to illustrate how sensemaking can inform the study of the ethics of the suppression of dissent. Using archival data, we reconstruct Dr. Kelly's key responsibilities as a weapons inspector and government employee. We begin by clarifying the concept of dissent and how it is a useful organizational process. We identify the various ways that dissent has been discussed in the organizational literature and reconsider the process of sensemaking. We conclude that the suppression of opinions that deviate from the majority is part of the identity maintenance of the sensemaking process. We illustrate that the prevention of dissent in organizations consists of a set of unsatisfactory trade-offs.
Keywords: ethics, dissent, suppression, sensemaking
7967 Elaboration and Optimization of Pellets Used for Precise Glass Grinding
Authors: N. Belkhir, A. Chorfa, D. Bouzid
Abstract:
In this work, grinding or microcutting tools in the form of pellets were manufactured using bonded alumina abrasive grains. The bond used is a vitreous material containing quartz, feldspars, kaolinite and a quantity of hematite. The pellets were used in the glass grinding process to replace the free-abrasive-grain lapping process. The elaborated pellets were studied to determine their effectiveness in the grinding process and to optimize the influence of the pellet elaboration parameters. The obtained results show the existence of an optimal combination of the pellet elaboration parameters for each glass grinding phase (coarse to fine grinding). The final roughness (rms) reached by the elaborated pellets on a BK7 glass surface was about 0.392 μm.
Keywords: Abrasive grain, glass, grinding, pellet.
7966 Managing Iterations in Product Design and Development
Authors: K. Aravindhan, Trishit Bandyopadhyay, Mahesh Mehendale, Supriya Kumar De
Abstract:
The inherently iterative nature of product design and development poses a significant challenge to reducing product design and development (PD) time. In order to shorten the time to market, organizations have adopted concurrent development, where multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable time to completion and significant rework. Many products have missed the time-to-market window due to unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater ambiguity into design and development, where the traditional methods and tools of project management provide less value. In this context, identifying critical metrics to understand iteration probability is an open research area where a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and, using this concept, intends to develop metrics that can provide managerial insights into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and attempts to provide more clarity.
Keywords: Decision Points, Iteration, Product Design, Rework.
7965 Mathematical Modeling of Switching Processes in Magnetically Controlled MEMS Switches
Authors: Sergey M. Karabanov, Dmitry V. Suvorov, Dmitry Yu. Tarabrin
Abstract:
The operating principle of magnetically controlled microelectromechanical system (MEMS) switches is based on controlling the beam movement under the influence of a magnetic field. Currently, there is a MEMS switch design with a flexible ferromagnetic electrode in the form of a fixed-terminal beam, with the electrode fastened to a straight or cranked anchor. The basic performance characteristics of magnetically controlled MEMS switches (service life, sensitivity, contact resistance, fast response) are largely determined by the flexible electrode design. Ensuring stable and controlled motion of the flexible electrode therefore requires an optimal flexible electrode design.
Keywords: MEMS switch, magnetic sensitivity, magnetic concentrator.
7964 Improving the Decision-Making Process and Transparency of Corporate Governance Using XBRL
Authors: Claudiu Brandas
Abstract:
Several recent studies have shown that the transparency of financial reporting has a significant influence on investors' decisions. Thus, regulatory authorities and professional organizations (IFAC) have emphasized the role of XBRL (eXtensible Business Reporting Language) and interactive data as a means of promoting transparency and monitoring corporate reporting. In this context, the objective of this paper is the analysis of interactive reporting through XBRL and its use as support for decision-making in corporate governance, namely the potential of interactive reports in XBRL to increase the transparency and monitoring of corporate governance processes.
Keywords: Corporate Governance, decision, financial reporting, transparency, XBRL.
7963 Latent Topic Based Medical Data Classification
Authors: Jian-hua Yeh, Shi-yi Kuo
Abstract:
This paper discusses the classification process for medical data. We use the data from the ACM KDD Cup 2008 to demonstrate our classification process based on latent topic discovery. In this data set, the target set and the outliers are quite different in nature: the target set makes up only 0.6% of the total, while the outliers constitute 99.4% of the data set. We use this data set as an example to show how we dealt with such an extremely biased data set using latent topic discovery and noise reduction techniques. Our experiment faces two major challenges: (1) extremely distributed outliers, and (2) far fewer positive samples than negative ones. We propose a suitable process flow to deal with these issues and obtain a best AUC result of 0.98.
Keywords: classification, latent topics, outlier adjustment, feature scaling
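The general recipe described above (a latent-topic-style projection followed by an imbalance-aware classifier evaluated by AUC) can be sketched as follows. This is not the authors' pipeline: TruncatedSVD and class weighting are generic stand-ins for their latent topic discovery and noise reduction steps, and the data are synthetic.

```python
# Minimal sketch of latent-topic-style dimensionality reduction plus an imbalance-aware
# classifier scored by AUC, on a synthetic stand-in for the ~0.6% positive KDD Cup 2008 data.
from sklearn.datasets import make_classification
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Synthetic, heavily imbalanced data set: ~99.4% negatives, ~0.6% positives.
X, y = make_classification(n_samples=20000, n_features=100, n_informative=20,
                           weights=[0.994], flip_y=0.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

model = make_pipeline(
    TruncatedSVD(n_components=20, random_state=0),               # "latent topic" style projection
    LogisticRegression(class_weight="balanced", max_iter=1000),  # compensates for class imbalance
)
model.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.3f}")
```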
7962 A Weighted Least Square Algorithm for Low-Delay FIR Filters with Piecewise Variable Stopbands
Authors: Yasunori Sugita, Toshinori Yoshikawa, Naoyuki Aikawa
Abstract:
Variable digital filters are useful for various signal processing and communication applications in which frequency characteristics, such as fractional delays and cutoff frequencies, can be varied. In this paper, we propose a design method for variable FIR digital filters with an approximately linear phase characteristic in the passband. The proposed variable FIR filters have large attenuation in the stopband, and this attenuation can be varied by spectrum parameters. In the proposed design method, a quasi-equiripple characteristic can be obtained by using an iterative weighted least squares method. The usefulness of the proposed design method is verified through examples.
Keywords: Weighted Least Squares Approximation, Variable FIR Filters, Low-Delay, Quasi-Equiripple
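For orientation, a minimal, non-iterative weighted least squares FIR design on a frequency grid is sketched below. The filter specification, weights and group delay are illustrative assumptions, and the iterative re-weighting and variable spectrum parameters that produce the quasi-equiripple, piecewise-variable response in the paper are not reproduced.

```python
# Minimal weighted least squares FIR design sketch on a dense frequency grid, assuming a
# fixed lowpass specification with a reduced (low-delay) group delay.
import numpy as np

N = 31            # filter length
tau = 10.0        # desired group delay (< (N-1)/2, i.e. a low-delay design)
wp, ws = 0.3 * np.pi, 0.4 * np.pi     # passband / stopband edges
w = np.linspace(0, np.pi, 512)        # frequency grid

# Desired complex response: e^{-j w tau} in the passband, 0 in the stopband.
passband, stopband = w <= wp, w >= ws
D = np.where(passband, np.exp(-1j * w * tau), 0.0)

# Weights: emphasize the stopband to push its attenuation down, ignore the transition band.
W = np.where(passband, 1.0, 0.0) + np.where(stopband, 50.0, 0.0)

# Least squares system: H(w_k) = sum_n h[n] e^{-j w_k n}; stack real and imaginary parts.
E = np.exp(-1j * np.outer(w, np.arange(N)))
sqrtW = np.sqrt(W)[:, None]
A = np.vstack([np.real(sqrtW * E), np.imag(sqrtW * E)])
b = np.concatenate([np.real(np.sqrt(W) * D), np.imag(np.sqrt(W) * D)])
h, *_ = np.linalg.lstsq(A, b, rcond=None)

Hmag = np.abs(E @ h)
print("max passband ripple:", np.max(np.abs(Hmag[passband] - 1)))
print("max stopband gain  :", np.max(Hmag[stopband]))
```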
7961 The Composting Process from a Waste Management Method to a Remediation Procedure
Authors: G. Petruzzelli, F. Pedron, M. Grifoni, F. Gorini, I. Rosellini, B. Pezzarossa
Abstract:
Composting is a controlled technology that enhances the natural aerobic degradation of organic wastes. The resulting product is a humified material that is principally recyclable for agricultural purposes. According to European Community legislation, the composting process is one of the most important tools for waste management. In recent years, composting has been increasingly used as a remediation technology to remove biodegradable contaminants from soil and to modulate heavy metal bioavailability in phytoremediation strategies. Optimizing the recovery of resources from wastes through composting could enhance soil fertility and promote its use in biotechnologies for the remediation of contaminated soils.
Keywords: Agriculture, biopile, compost, soil clean-up, waste recycling.
7960 Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring
Authors: Youngji Yoo, Cheong-Sool Park, Jun Seok Kim, Young-Hak Lee, Sung-Shick Kim, Jun-Geol Baek
Abstract:
In the semiconductor manufacturing process, large amounts of data are collected from various sensors in multiple facilities. The data collected from the sensors have several different characteristics due to variables such as product type, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume the normality of the data to detect out-of-control states of processes. Although the collected data have different characteristics, using them directly as inputs to SQC increases the variation of the data, requires wide control limits, and decreases the ability to detect out-of-control states. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree using a split algorithm based on the Pearson distribution to handle non-normal distributions within a parametric method. The regression tree finds similar properties of data across different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.
Keywords: Semiconductor, non-normal mixed process data, clustering, Statistical Quality Control (SQC), regression tree, Pearson distribution system.
7959 Design and Implementation of Reed Solomon Encoder on FPGA
Authors: Amandeep Singh, Mandeep Kaur
Abstract:
Error correcting codes are used for the detection and correction of errors in digital communication systems. Error correcting coding is based on appending redundancy to the information message according to a prescribed algorithm. Reed Solomon codes are part of channel coding and withstand the effects of noise, interference and fading. Galois field arithmetic is used for encoding and decoding Reed Solomon codes. Galois field multipliers and linear feedback shift registers (LFSRs) are used for encoding the information data block. The design of a Reed Solomon encoder is complex because of the use of LFSRs and Galois field arithmetic. The purpose of this paper is to design and implement a Reed Solomon (255, 239) encoder with an optimized, smaller number of Galois field multipliers. A symmetric generator polynomial is used to reduce the number of GF multipliers. To increase the error correction capability, convolutional interleaving will be used with the RS encoder. The design will be implemented on a Xilinx Spartan II FPGA.
Keywords: Galois Field, Generator polynomial, LFSR, Reed Solomon.
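For readers unfamiliar with the encoding itself, the following software sketch shows LFSR-style systematic RS(255, 239) encoding over GF(2^8). It assumes the primitive polynomial 0x11d and generator roots alpha^0 through alpha^15; the paper's hardware optimizations (symmetric generator polynomial, shared multipliers, convolutional interleaving) are not modeled here.

```python
# Minimal software sketch of systematic Reed Solomon (255, 239) encoding over GF(2^8).

# Log/antilog tables for GF(256) with primitive polynomial 0x11d.
EXP, LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):
    EXP[i], LOG[x] = x, i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def poly_mul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] ^= gf_mul(a, b)
    return r

def gen_poly(nparity):
    """g(x) = (x + alpha^0)(x + alpha^1)...(x + alpha^(nparity-1)), highest degree first."""
    g = [1]
    for i in range(nparity):
        g = poly_mul(g, [1, EXP[i]])
    return g

def rs_encode(msg, nparity=16):
    """LFSR-style division: append the remainder of msg(x) * x^nparity divided by g(x)."""
    g = gen_poly(nparity)
    rem = [0] * nparity
    for byte in msg:
        coef = byte ^ rem[0]          # feedback symbol
        rem = rem[1:] + [0]           # shift the register
        if coef:
            for j in range(nparity):
                rem[j] ^= gf_mul(g[j + 1], coef)
    return list(msg) + rem            # systematic codeword: 239 data + 16 parity symbols

codeword = rs_encode(bytes(range(239)))
print(len(codeword), codeword[-16:])  # 255 symbols; the last 16 are the parity bytes
```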
7958 A Framework for Product Development Process including HW and SW Components
Authors: Namchul Do, Gyeongseok Chae
Abstract:
This paper proposes a framework for product development including hardware and software components. It provides separation of hardware-dependent software, modifications to the current product development process, and integration of software modules with existing product configuration models and assembly product structures. In order to identify the dependent software, the framework considers product configuration modules and engineering changes of the associated software and hardware components. In order to support efficient integration of the two different hardware and software development processes, a modified product development process is proposed. The process integrates the dependent software development into product development through the interchange of specific product information. By using existing product data models in Product Data Management (PDM), the framework represents software as modules for product configurations and as software parts in the product structure. The framework is applied to the development of a robot system in order to show its effectiveness.
Keywords: HW and SW Development Integration, Product Development with Software.
7957 A Mobile Agent-based Clustering Data Fusion Algorithm in WSN
Authors: Xiangbin Zhu, Wenjuan Zhang
Abstract:
In wireless sensor networks, mobile agent technology is used for data fusion. We design a node clustering algorithm based on node residual energy and the results of partial integration. The routing of the mobile agent within each cluster is then optimized to further reduce the amount of data transferred. The experiments show that using mobile agents in the integration process within a cluster can reduce the path loss to some extent.
Keywords: wireless sensor networks, data fusion, mobile agent
7956 Modeling of Dielectric Heating in Radio-Frequency Applicator Optimized for Uniform Temperature by Means of Genetic Algorithms
Authors: Camelia Petrescu, Lavinia Ferariu
Abstract:
The paper presents an optimization study based on genetic algorithms (GAs) for a radio-frequency applicator used in heating dielectric band products. The weakly coupled electro-thermal problem is analyzed using 2D FEM. The design variables in the optimization process are the voltage of a supplementary "guard" electrode and six geometric parameters of the applicator. Two objective functions are used: temperature uniformity and the total active power absorbed by the dielectric. Both mono-objective and multi-objective formulations are implemented in the GA optimization.
Keywords: Dielectric heating, genetic algorithms, optimization, RF applicators.
7955 Impact of Process Parameters on Tensile Strength of Fused Deposition Modeling Printed Crisscross Polylactic Acid
Authors: Shilpesh R. Rajpurohit, Harshit K. Dave
Abstract:
Additive manufacturing has gained popularity in recent times due to its capability to create prototypes as well as functional end-use products directly from CAD data without any specific tooling requirements. Fused deposition modeling (FDM) is one of the widely used additive manufacturing techniques, used to create functional end-use polymer parts that are comparable with injection-molded parts. FDM printed parts have applications in various fields such as automotive, aerospace, medical and electronics. However, the application of FDM parts is greatly limited by their poor mechanical properties. Proper selection of the process parameters could enhance the mechanical performance of the printed part. In the present study, an experimental investigation has been carried out to study the mechanical performance of the printed part with respect to process variables. Three process variables, viz. raster angle, raster width and layer height, have been varied to understand their effect on tensile strength. Further, the effect of the process variables on the fractured surface has also been investigated.
Keywords: 3D printing, fused deposition modeling, layer height, raster angle, raster width, tensile strength.
7954 Design and Construction of the Semi-Automatic Sliced Ginger Machine
Authors: J. Chatthong, W. Boonchouytan, R. Burapa
Abstract:
The purpose of this study was to design and construct a semi-automatic sliced ginger machine to reduce production time in the sheet and slice ginger procedure and to reduce the amount of labor required for slicing and cutting, taking into consideration the cleanliness and safety of workers and consumers. The machine uses a 1 horsepower motor; the slicing blade rotates at 967 rpm, the slicing dish has a diameter of 310 mm and carries 2 blades for cutting ginger into sheets, and the power from the motor is transferred to rotate the slice cutter roller at 440 rpm. The slice cutter roller cuts the sheet ginger into line ginger. The conveyor level can be adjusted by motors and is used at the start, where sheet ginger is transferred to the roller for sheet and slice cutting in the next step. The cover of the slicing cutter has a channel for one tuber of ginger. The semi-automatic sliced ginger machine produced 81.8 kg/h of sheet ginger (6.2 times the labor rate) and 17.9 kg/h of line ginger (2.5 times the labor rate), compared with manual work, which produced 13.2 kg/h of sheet ginger and 7.1 kg/h of line ginger. Over the total timed process, the semi-automatic machine achieved 30.86 kg/h versus 4.6 kg/h for labor, i.e. 6.7 times the labor rate. The semi-automatic sliced ginger machine is convenient and easy to use and maintain, reduces bodily fatigue and the risk of injury in the slicing procedure, which otherwise requires high skill, and can also be used with other vegetables, for example potato and carrot.
Keywords: Sliced Machine, Sliced Ginger, Line Ginger
7953 Comparison of Automated Zone Design Census Output Areas with Existing Output Areas in South Africa
Authors: T. Mokhele, O. Mutanga, F. Ahmed
Abstract:
South Africa is one of the few countries that have stopped using the same Enumeration Areas (EAs) for census enumeration and dissemination. The advantage of this change is that confidentiality issues can be addressed for census dissemination, since the geographic unit for collection is designed mainly to ensure that it can be covered by one enumerator. The objective of this paper was to evaluate the performance of automated zone design output areas against non-zone-design geographies, using the 2001 census data, and to some extent the 2011 census, as the main input. The Automated Zone-design Tool (AZTool) census output areas were compared with the Small Area Layers (SALs) and SubPlaces with respect to confidentiality limits, population distribution, degree of homogeneity, and shape compactness. Further, SPSS was employed for validation of the AZTool output results. The results showed that the AZTool-developed output areas outperform the existing official SALs and SubPlaces with regard to minimum population threshold, population distribution and, to some extent, homogeneity. It was therefore concluded that the AZTool program provides a new alternative for the creation of optimised census output areas for the dissemination of population census data in South Africa.
Keywords: AZTool, enumeration areas, small area layers, South Africa.
7952 Designing Information Systems in Education as Prerequisite for Successful Management Results
Authors: Vladimir Simovic, Matija Varga, Tonco Marusic
Abstract:
This research paper presents matrix technology models and examples of information systems in education (in the Republic of Croatia and in Germany) in support of business, education (learning and teaching) and e-learning. We researched and described the aims and objectives of the main processes in education and technology, together with the main matrix classes of data. The paper gives an example of matrix technology with a detailed description of the processes related to specific data classes in education, and an example module that supports the processes 'Filling in the directory and the diary of work' and 'Evaluation'. At the lower level of the processes, we also researched and described all activities which take place within the lower processes in education, as well as the characteristics and functioning of the modules 'Fill the directory and the diary of work' and 'Evaluation'. For the analysis of the affinity between the aforementioned processes and/or sub-processes, we used our application model created in Visual Basic, which was based on an algorithm for analyzing the affinity between the observed processes and/or sub-processes.
Keywords: Designing, education management, information systems, matrix technology, process affinity.
7951 Value Co-Creation in Used-Car Auctions: A Service Scientific Perspective
Authors: Safdar Muhammad Usman, Youji Kohda, Katsuhiro Umemoto
Abstract:
Electronic marketplaces play an important intermediary role in connecting dealers and retail customers. The main aim of this paper is to design a value co-creation model for used-car auctions. More specifically, the study describes the process of value co-creation in used-car auctions, explores the values that are co-created, and concludes by indicating future research directions. Our analysis shows that non-economic values as well as economic values are co-created in used-car auctions. In addition, this paper contributes to the academic community by broadening the view of value co-creation in service science.
Keywords: Value co-creation, Used-car auctions, Non-economic values, Service science.
7950 Solving Single Machine Total Weighted Tardiness Problem Using Gaussian Process Regression
Authors: Wanatchapong Kongkaew
Abstract:
This paper proposes an application of a probabilistic technique, namely Gaussian process regression, for estimating an optimal sequence for the single machine total weighted tardiness (SMTWT) scheduling problem. In this work, the Gaussian process regression (GPR) model is utilized to predict an optimal sequence for the SMTWT problem, and its solution is improved by using an iterated local search based on a simulated annealing scheme, called the GPRISA algorithm. The results show that the proposed GPRISA method achieves very good performance and a reasonable trade-off between solution quality and time consumption. Moreover, in terms of deviation from the best-known solution, the proposed mechanism noticeably outperforms recently existing approaches.
Keywords: Gaussian process regression, iterated local search, simulated annealing, single machine total weighted tardiness.
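The improvement half of the approach can be sketched as simulated annealing with swap moves on the total weighted tardiness objective. The GPR model that predicts the starting sequence is not reproduced; an earliest-due-date start and a small hypothetical instance stand in for it.

```python
# Minimal sketch of simulated annealing with swap moves for single machine total weighted tardiness.
import math
import random

def twt(seq, p, d, w):
    """Total weighted tardiness of a job sequence (processing times p, due dates d, weights w)."""
    t, total = 0, 0
    for j in seq:
        t += p[j]
        total += w[j] * max(0, t - d[j])
    return total

def anneal(p, d, w, T0=100.0, cooling=0.995, iters=20000, seed=0):
    rng = random.Random(seed)
    n = len(p)
    seq = sorted(range(n), key=lambda j: d[j])     # EDD start (stand-in for the GPR prediction)
    best, best_cost = seq[:], twt(seq, p, d, w)
    cost, T = best_cost, T0
    for _ in range(iters):
        i, k = rng.sample(range(n), 2)
        seq[i], seq[k] = seq[k], seq[i]            # swap two jobs
        new_cost = twt(seq, p, d, w)
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / T):
            cost = new_cost
            if cost < best_cost:
                best, best_cost = seq[:], cost
        else:
            seq[i], seq[k] = seq[k], seq[i]        # undo the rejected swap
        T *= cooling
    return best, best_cost

# Hypothetical 8-job instance.
p = [5, 3, 8, 2, 7, 4, 6, 1]
d = [10, 6, 20, 4, 18, 9, 15, 3]
w = [2, 1, 3, 2, 1, 2, 1, 3]
print(anneal(p, d, w))
```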
7949 An Evaluation of Average Run Length of MaxEWMA and MaxGWMA Control Charts
Authors: S. Phanyaem
Abstract:
The exponentially weighted moving average (EWMA) control chart is a popular chart for detecting shifts in the mean of a distribution parameter in quality control. The objective of this paper is to compare the efficiency of control charts in detecting an increase in the process mean. In particular, we compared the Maximum Exponentially Weighted Moving Average (MaxEWMA) and Maximum Generally Weighted Moving Average (MaxGWMA) control charts when the observations follow an exponential distribution. The criterion for evaluating the performance of a control chart is the Average Run Length (ARL). The results of the comparison show that, for a small sample size, the MaxEWMA control chart is more efficient at detecting shifts in the process mean than the MaxGWMA control chart. For a large sample size, the MaxEWMA control chart is more sensitive to small shifts in the process mean than the MaxGWMA control chart, while for a large shift in the mean, the MaxGWMA control chart is more sensitive than the MaxEWMA control chart.
Keywords: Maximum Exponentially Weighted Moving Average, Maximum General Weighted Moving Average, Average Run Length.
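ARL values of this kind are commonly estimated by Monte Carlo simulation; the sketch below does this for a plain one-sided EWMA chart on exponential observations. The MaxEWMA and MaxGWMA statistics compared in the paper are more elaborate, and the control-limit constant L used here is an arbitrary illustrative value.

```python
# Minimal Monte Carlo sketch of Average Run Length (ARL) estimation for a plain EWMA chart
# monitoring exponentially distributed observations.
import numpy as np

def ewma_run_length(lam, L, mean_in=1.0, shift=1.0, rng=None, max_n=100000):
    """Number of samples until the EWMA statistic exceeds its upper control limit."""
    rng = rng or np.random.default_rng()
    sigma = mean_in                                   # exponential: std dev equals the mean
    ucl = mean_in + L * sigma * np.sqrt(lam / (2 - lam))
    z = mean_in
    for n in range(1, max_n + 1):
        x = rng.exponential(mean_in * shift)          # shift=1 -> in control, >1 -> shifted mean
        z = lam * x + (1 - lam) * z
        if z > ucl:
            return n
    return max_n

def arl(lam=0.2, L=2.7, shift=1.0, reps=2000, seed=0):
    rng = np.random.default_rng(seed)
    return np.mean([ewma_run_length(lam, L, shift=shift, rng=rng) for _ in range(reps)])

print("in-control ARL0:", arl(shift=1.0))
print("shifted    ARL1:", arl(shift=1.5))
```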
7948 An Optimization Tool-Based Design Strategy Applied to Divide-by-2 Circuits with Unbalanced Loads
Authors: Agord M. Pinto Jr., Yuzo Iano, Leandro T. Manera, Raphael R. N. Souza
Abstract:
This paper describes an optimization tool-based design strategy for a Current Mode Logic (CML) divide-by-2 circuit. Representing a building block for output frequency generation in an RFID protocol-based frequency synthesizer, the circuit was designed to minimize the power consumption for driving multiple unbalanced loads (at the transceiver level). Implemented in XFAB XC08 180 nm technology, the circuit was optimized with the MunEDA WiCkeD tool in the Cadence Virtuoso Analog Design Environment (ADE).
Keywords: Divide-by-2 circuit, CMOS technology, PLL phase-locked loop, optimization tool, CML current mode logic, RF transceiver.
7947 Optimization of Molasses Desugarization Process Using Steffen Method in Sugar Beet Factories
Authors: Simin Asadollahi, Mohammad Hossein Haddad Khodaparast
Abstract:
Molasses is one of the most important by-products of the sugar industry and contains a large amount of sucrose. The usual way to separate the sucrose from molasses is the Steffen method. Since this method is widely used in sugar factories, the aim of this research is its optimization. This optimization depends on three factors: reactor alkalinity, reactor temperature and diluted molasses Brix. Accordingly, three stages had to be carried out:
- Construction of a pilot plant similar to the actual Steffen system used in sugar factories
- Experimenting using the pilot plant
- Laboratory analysis
These experiments included 27 treatments with three replications. In each replication, the Brix, polarization and purity of the saccharate syrup and of the hot and cold waste were measured. The results showed that diluted molasses Brix, reactor alkalinity and reactor temperature have significant effects on saccharate purity and on the efficiency of molasses desugarization. The research was performed as a randomized complete design and analyzed with Duncan's multiple range test. Significant differences at the α = 5% level were observed between the treatments. The results indicated that the optimal conditions for molasses desugarization by the Steffen method are: diluted molasses Brix = 10, reactor alkalinity = 10 and reactor temperature = 8 °C.
Keywords: Molasses desugarization, Saccharate purity, Steffen process.
7946 Approximate Frequent Pattern Discovery Over Data Stream
Authors: Kittisak Kerdprasop, Nittaya Kerdprasop
Abstract:
Frequent pattern discovery over a data stream is a hard problem because the continuously generated nature of the stream does not allow revisiting each data element. Furthermore, the pattern discovery process must be fast to produce timely results. Based on these requirements, we propose an approximate approach to tackle the problem of discovering frequent patterns over a continuous stream. Our approximation algorithm is intended to be applied to preprocess the stream prior to the pattern discovery process. The results of approximate frequent pattern discovery are reported in the paper.
Keywords: Frequent pattern discovery, Approximate algorithm, Data stream analysis.
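The abstract does not specify the approximation algorithm, so as a generic illustration of the one-pass, bounded-memory style such methods share, here is a sketch of classic lossy counting for approximate frequent items over a stream.

```python
# Minimal sketch of approximate frequent item discovery over a stream using lossy counting.
def lossy_counting(stream, epsilon=0.01):
    """One pass over the stream; counts are under-estimated by at most epsilon * N."""
    counts, deltas = {}, {}
    bucket_width = int(1 / epsilon)
    n = 0
    for item in stream:
        n += 1
        if item in counts:
            counts[item] += 1
        else:
            counts[item] = 1
            deltas[item] = (n - 1) // bucket_width   # maximum possible undercount so far
        if n % bucket_width == 0:                    # end of bucket: prune infrequent entries
            bucket = n // bucket_width
            for key in [k for k in counts if counts[k] + deltas[k] <= bucket]:
                del counts[key], deltas[key]
    return counts, n

def frequent_items(counts, n, support=0.05, epsilon=0.01):
    """Items whose true frequency may reach the support threshold."""
    return {k: v for k, v in counts.items() if v >= (support - epsilon) * n}

# Hypothetical stream: a few heavy hitters mixed with noise.
stream = ["a"] * 500 + ["b"] * 300 + ["c"] * 50 + [f"x{i}" for i in range(150)]
counts, n = lossy_counting(stream, epsilon=0.01)
print(frequent_items(counts, n, support=0.2))
```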
7945 Evaluation of Risks in New Product Innovation
Authors: Emre Alptekin, Damla Yalçınyiğit, Gülfem Alptekin
Abstract:
In highly competitive environments, a growing number of companies must regularly launch new products speedily and successfully. A company's success is based on a systematic, conscious product design method which meets market requirements and takes risks as well as resources into consideration. Research has found that developing and launching new products are inherently risky endeavors. Hence, in this research we introduce a risk evaluation framework for the new product innovation process. Our framework is based on the fuzzy analytic hierarchy process (FAHP) methodology. We have applied all stages of the framework to the risk evaluation process of a pharmaceutical company.
Keywords: Evaluation, risks, product innovation.
7944 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations have structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level these organizations show some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as in the absence of the operational capabilities to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision-making and research. This paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP and AUP. It also draws on object-relational and spatial data models and on the baseline of data modeling under UML and Big Data, and in this way seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for pattern generation and for models derived from the structured fact objects.
Keywords: Data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse.
7943 Fractal Dimension: An Index to Quantify Parameters in Genetic Algorithms
Authors: Mahmoud R. Shaghaghian
Abstract:
Genetic Algorithms (GAs) are direct search methods which require little information about the design space. This characteristic, besides the robustness of these algorithms, has made them very popular in recent decades. On the other hand, when this method is employed, there is no guarantee of achieving optimum results, which obliges the designer to run such algorithms more than once to obtain more reliable results. There have been many attempts to modify the algorithms to make them more efficient. In this paper, by applying the fractal dimension (in particular, the box counting method), the complexity of the design space is established in order to determine the mutation and crossover probabilities (Pm and Pc). The methodology is followed by a numerical example for clarification. It is concluded that this modification will improve the efficiency of GAs and lead to more reliable results, especially for design spaces with higher fractal dimensions.
Keywords: Genetic Algorithm, Fractal Dimension, Box Counting Method, Weierstrass-Mandelbrot function.
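The box counting estimate itself is straightforward to sketch: count occupied boxes at several scales and fit the slope of log N(eps) against log(1/eps). The snippet below does this for a 2-D point set; the paper's mapping from the resulting dimension to Pm and Pc is not reproduced, and the test sets are illustrative.

```python
# Minimal box counting sketch: estimate the fractal dimension of a 2-D point set.
import numpy as np

def box_counting_dimension(points, n_scales=6):
    pts = (points - points.min(axis=0)) / np.ptp(points, axis=0)   # normalize to the unit square
    sizes = 1.0 / 2 ** np.arange(1, n_scales + 1)                  # box sizes 1/2, 1/4, ...
    counts = []
    for eps in sizes:
        boxes = np.floor(pts / eps).astype(int)
        counts.append(len(np.unique(boxes, axis=0)))               # number of occupied boxes
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)  # slope of the log-log fit
    return slope

# Sanity checks on simple sets: a filled square should be close to 2, a straight line close to 1.
rng = np.random.default_rng(0)
square = rng.random((20000, 2))
t = rng.random((20000, 1))
line = np.hstack([t, t])
print("square:", round(box_counting_dimension(square), 2))
print("line  :", round(box_counting_dimension(line), 2))
```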