Search results for: product optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6773

6173 Robotic Arm-Automated Spray Painting with One-Shot Object Detection and Region-Based Path Optimization

Authors: Iqraq Kamal, Akmal Razif, Sivadas Chandra Sekaran, Ahmad Syazwan Hisaburi

Abstract:

Painting plays a crucial role in the aerospace manufacturing industry, serving both protective and cosmetic purposes for components. However, the traditional manual painting method is time-consuming and labor-intensive, posing challenges for the sector in achieving higher efficiency. Additionally, the current automated robot path planning has been a bottleneck for spray painting processes, as typical manual teaching methods are time-consuming, error-prone, and skill-dependent. Therefore, it is essential to develop automated tool path planning methods to replace manual ones, reducing costs and improving product quality. Focusing on flat panel painting in aerospace manufacturing, this study aims to address issues related to unreliable part identification techniques caused by the high-mixture, low-volume nature of the industry. The proposed solution involves using a spray gun and a UR10 robotic arm with a vision system that utilizes one-shot object detection (OS2D) to identify parts accurately. Additionally, the research optimizes path planning by concentrating on the region of interest—specifically, the identified part, rather than uniformly covering the entire painting tray.
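The region-based idea described above can be pictured with a toy sketch: given a bounding box returned by a one-shot detector, generate a raster (boustrophedon) spray path confined to that box instead of covering the whole tray. This is not the authors' implementation; the function name, dimensions, and parameters below are illustrative assumptions.

```python
# Toy sketch (not the authors' code): raster spray path restricted to a detected part.
# The bounding box is assumed to come from a one-shot object detector such as OS2D.

def raster_path(bbox, overlap_step=0.05, standoff=0.20):
    """Generate (x, y, z) waypoints covering only the detected region.

    bbox: (x_min, y_min, x_max, y_max) of the detected part in metres.
    overlap_step: spacing between passes, chosen from the spray-fan width.
    standoff: spray-gun distance above the panel.
    """
    x_min, y_min, x_max, y_max = bbox
    waypoints, y, direction = [], y_min, 1
    while y <= y_max:
        xs = (x_min, x_max) if direction == 1 else (x_max, x_min)
        waypoints.append((xs[0], y, standoff))
        waypoints.append((xs[1], y, standoff))
        y += overlap_step
        direction *= -1          # alternate pass direction (boustrophedon)
    return waypoints

# Example: a 0.4 m x 0.3 m part detected on the painting tray.
path = raster_path((0.10, 0.10, 0.50, 0.40))
print(len(path), "waypoints; first three:", path[:3])
```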

Keywords: aerospace manufacturing, one-shot object detection, automated spray painting, vision-based path optimization, deep learning, automation, robotic arm

Procedia PDF Downloads 82
6172 Solving Flowshop Scheduling Problems with Ant Colony Optimization Heuristic

Authors: Arshad Mehmood Ch, Riaz Ahmad, Imran Ali Ch, Waqas Durrani

Abstract:

This study deals with the application of the Ant Colony Optimization (ACO) approach to solve the no-wait flowshop scheduling problem (NW-FSSP). The developed ACO algorithm has been coded in MATLAB. The paper covers detailed steps to apply ACO and focuses on judging the strength of ACO in relation to other solution techniques previously applied to the no-wait flowshop problem. The general-purpose approach was able to find reasonably accurate solutions for almost all of the problems under consideration and was able to handle a fairly large spectrum of problems with far less CPU effort. Careful scrutiny of the results reveals that the presented algorithm produces better results than other approaches, such as genetic algorithms and tabu search heuristics, previously applied to NW-FSSP data sets.
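As a rough illustration of how an ACO construction step and pheromone update can look for flowshop scheduling (the abstract's MATLAB implementation is not reproduced here), the sketch below builds job sequences probabilistically from a position-based pheromone matrix and reinforces the best makespan found. Note that it uses the ordinary permutation flowshop makespan rather than the no-wait variant, and all parameter values are assumptions.

```python
# Minimal ACO sketch for permutation flowshop scheduling (illustrative only; the paper
# addresses the no-wait variant, while the ordinary flowshop makespan is used here).
import random

def makespan(seq, p):                       # p[job][machine] = processing time
    machines = len(p[0])
    c = [0.0] * machines                    # running completion time per machine
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, machines):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def aco_flowshop(p, ants=20, iters=200, rho=0.1, seed=0):
    random.seed(seed)
    n = len(p)
    tau = [[1.0] * n for _ in range(n)]     # pheromone: desirability of job j at position pos
    best_seq, best_ms = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            unvisited, seq = set(range(n)), []
            for pos in range(n):
                cand = sorted(unvisited)
                weights = [tau[pos][j] for j in cand]
                j = random.choices(cand, weights=weights)[0]
                seq.append(j)
                unvisited.remove(j)
            ms = makespan(seq, p)
            if ms < best_ms:
                best_seq, best_ms = seq, ms
        for pos in range(n):                # evaporate, then reinforce the best sequence
            for j in range(n):
                tau[pos][j] *= (1.0 - rho)
            tau[pos][best_seq[pos]] += 1.0 / best_ms
    return best_seq, best_ms

# Example: 5 jobs on 3 machines with random processing times.
jobs = [[random.randint(1, 10) for _ in range(3)] for _ in range(5)]
print(aco_flowshop(jobs))
```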

Keywords: no-wait, flowshop, scheduling, ant colony optimization (ACO), makespan

Procedia PDF Downloads 435
6171 Using Serious Games to Integrate the Potential of Mass Customization into the Fuzzy Front-End of New Product Development

Authors: Michael N. O'Sullivan, Con Sheahan

Abstract:

Mass customization is the idea of offering custom products or services to satisfy the needs of each individual customer while maintaining the efficiency of mass production. Technologies like 3D printing and artificial intelligence have many start-ups hoping to capitalize on this dream of creating personalized products at an affordable price, and well-established companies scrambling to innovate and maintain their market share. However, the majority of them are failing as they struggle to understand one key question: where does customization make sense? Customization and personalization only make sense where the value of the perceived benefit outweighs the cost to implement it. In other words, will people pay for it? Looking at the Kano Model makes it clear that it depends on the product. In products where customization is an inherent need, like prosthetics, mass customization technologies can be highly beneficial. However, for products that already sell as a standard, like headphones, offering customization is likely only an added bonus, and so the product development team must figure out whether the customers’ perception of the added value of this feature will outweigh its premium price tag. This can be done through the use of a ‘serious game,’ whereby potential customers are given a limited budget to collaboratively buy and bid on potential features of the product before it is developed. If the group chooses to buy customization over other features, then the product development team should implement it into their design. If not, the team should prioritize the features on which the customers have spent their budget. The level of customization purchased can also be translated to an appropriate production method; for example, the most expensive type of customization would likely be free-form design and could be achieved through digital fabrication, while a lower level could be achieved through short batch production. Twenty-five teams of final-year students from design, engineering, construction and technology tested this methodology when bringing a product from concept through to production specification, and found that it allowed them to confidently decide what level of customization, if any, would be worth offering for their product, and what would be the best method of producing it. They also found that the discussion and negotiations between players during the game led to invaluable insights, and many teams chose to play a second game in which they offered customers the option to buy the various customization ideas that had been discussed during the first game.

Keywords: Kano model, mass customization, new product development, serious game

Procedia PDF Downloads 136
6170 TRIZ-Based Conflicts-Solving Applications in New Product Development (NPD) Process and Knowledge Management (KM) System

Authors: Chi-Hao Yeh

Abstract:

The aim of this paper is to show how TRIZ can be applied to resolve conflicts in the management area, which can be readily applied to the new product development (NPD) process and the knowledge management (KM) system in the designing and manufacturing stages. TRIZ has been well known as a creative and innovative thinking theory for solving engineering and technology contradictions over the last two decades. However, few studies and practical applications have been proposed in the management area. Conflicts occurring in schedule, budget, and risk planning during the smartphone R&D process are discussed to demonstrate the ideas guided by the 39 TRIZ management parameters, the 40 TRIZ innovative principles, and the contradiction matrix. The results show that TRIZ is able to provide direct, quick and effective alternatives to resolve the management conflicts. In this manner, considerable effort and cost can be saved, and practical experience can be stored in the KM system. In this paper, an innovative 3C consumer product, the smartphone, is utilized as a case study to describe the proposed TRIZ-based conflict-solving approaches in the NPD process and the KM system.

Keywords: TRIZ, conflicts-solving in management area, new product development (NPD), knowledge management (KM), smart-phone

Procedia PDF Downloads 522
6169 Supersized Pricing and Anticipated Consumption Guilt: The Moderating Role of Product Type and Health Claims

Authors: Asim Shabir, Ruqia Shaikh

Abstract:

Supersized pricing is an effective strategy often used by marketers to make consumers buy more. However, such a strategy also results in more purchases and consumption, especially of hedonic food products. This study brings interesting insights about supersized pricing: it provides a value-based justification to consumers, and as a result, the guilt associated with the purchase and consumption of hedonic products diminishes, which mediates the effect of supersized pricing on size choice. Interestingly, there is a three-way interaction between pricing, product type, and health goal priming. A health prime diminishes the impact of supersized pricing for more hedonic (unhealthy) products compared to less hedonic products (perceived as healthy).

Keywords: supersized pricing, anticipated consumption guilt, health claim, product type

Procedia PDF Downloads 110
6168 Multi-Response Optimization of CNC Milling Parameters Using Taguchi Based Grey Relational Analysis for AA6061 T6 Aluminium Alloy

Authors: Varsha Singh, Kishan Fuse

Abstract:

This paper presents a study of the grey-Taguchi method to optimize CNC milling parameters of AA6061 T6 aluminium alloy. The grey-Taguchi method combines Taguchi-based design of experiments (DOE) with grey relational analysis (GRA). Multi-response optimization of quality characteristics such as surface roughness, material removal rate, and cutting forces is performed using grey relational analysis (GRA). The milling parameters considered for the experiments are cutting speed, feed per tooth, and depth of cut, each at three levels. A grey relational grade is used to estimate the overall quality-characteristic performance. Taguchi's L9 orthogonal array is used for the design of experiments. MINITAB 17 software is used for the optimization. Analysis of variance (ANOVA) is used to identify the most influential parameter. The experimental results show that grey relational analysis is an effective method for optimizing multi-response characteristics. The optimum results are finally validated by performing a confirmation test.
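A compact sketch of the grey relational computation described here (normalization of each response, grey relational coefficients, and an equally weighted grey relational grade per L9 run) is given below; the response values and weighting are made-up placeholders, not the paper's measurements.

```python
# Illustrative grey relational analysis for a Taguchi L9 experiment.
# Response values below are placeholders, not the paper's measurements.
import numpy as np

# Rows = 9 experimental runs; columns = surface roughness, MRR, cutting force.
responses = np.array([
    [1.8, 120, 210], [1.6, 150, 230], [1.9, 170, 260],
    [1.4, 140, 200], [1.7, 180, 240], [1.5, 160, 220],
    [1.3, 190, 250], [1.6, 130, 190], [1.2, 200, 270],
], dtype=float)
larger_is_better = np.array([False, True, False])   # roughness/force: smaller-the-better

# Step 1: normalize each response to [0, 1] according to its quality criterion.
lo, hi = responses.min(axis=0), responses.max(axis=0)
norm = np.where(larger_is_better, (responses - lo) / (hi - lo),
                                  (hi - responses) / (hi - lo))

# Step 2: grey relational coefficients (distinguishing coefficient zeta = 0.5).
delta = 1.0 - norm
zeta = 0.5
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Step 3: equally weighted grey relational grade per run; the best run has the highest grade.
grg = grc.mean(axis=1)
print("Best L9 run:", int(grg.argmax()) + 1, "with grade", round(float(grg.max()), 3))
```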

Keywords: ANOVA, CNC milling, grey relational analysis, multi-response optimization

Procedia PDF Downloads 309
6167 Optimal Power Distribution and Power Trading Control among Loads in a Smart Grid Operated Industry

Authors: Vivek Upadhayay, Siddharth Deshmukh

Abstract:

In recent years, the utilization of renewable energy sources has increased considerably because of growing global warming concerns. Organizations these days generally operate micro grids or smart grids on a small scale. Power optimization and optimal load tripping are possible in a smart grid based industry. In any plant or industry, loads can be divided into different categories based on their importance to the plant and their power requirement pattern over the working days. Dividing loads into such categories and providing a different power management algorithm for each category of load can reduce the power cost and help in balancing the stability and reliability of power. An objective function is defined over the variable that is to be minimized. Constraint equations are formed by taking the difference between the power usage pattern of the present day and the same day of the previous week. Considering the objectives of minimal load tripping and optimal power distribution, the proposed formulation is a multi-objective optimization problem. Through normalization of each objective function, the multi-objective optimization is transformed into a single-objective optimization. As a result, the optimized values of power required by each load for the present day are obtained from the past values of the required power for the same day of the previous week. This is essentially demand-response scheduling of power. These minimized values are then distributed to each load through an algorithm that optimizes the power distribution at a greater depth. In case the stored power exceeds the power requirement, a profit can be made by selling the excess power to the main grid.
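The transformation from multi-objective to single-objective described above is essentially a normalized weighted sum. A minimal sketch of that scalarization follows; the load categories, priorities, historical data, and weights are invented for illustration and do not come from the paper.

```python
# Minimal sketch of normalizing two objectives (load tripping, deviation from last
# week's usage pattern) into a single objective. All numbers are invented.
import numpy as np
from scipy.optimize import minimize

last_week = np.array([40.0, 25.0, 15.0, 10.0])   # kW used by 4 load categories, same day last week
available = 80.0                                  # kW available today
priority = np.array([1.0, 0.8, 0.5, 0.2])         # importance of each load category

def objective(p, w=(0.5, 0.5)):
    tripping = np.sum(priority * np.maximum(last_week - p, 0))  # unmet demand, weighted by priority
    deviation = np.sum((p - last_week) ** 2)                    # departure from historical pattern
    # Normalize each term by its worst case before taking the weighted sum.
    f1 = tripping / np.sum(priority * last_week)
    f2 = deviation / np.sum(last_week ** 2)
    return w[0] * f1 + w[1] * f2

cons = ({"type": "ineq", "fun": lambda p: available - p.sum()},)  # do not exceed supply
res = minimize(objective, x0=last_week * available / last_week.sum(),
               bounds=[(0, None)] * 4, constraints=cons)
print("Optimized allocation (kW):", np.round(res.x, 2))
```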

Keywords: power flow optimization, power trading enhancement, smart grid, multi-objective optimization

Procedia PDF Downloads 525
6166 Optimal Driving Strategies for a Hybrid Street Type Motorcycle: Modelling and Control

Authors: Jhon Vargas, Gilberto Osorio-Gomez, Tatiana Manrique

Abstract:

This work presents an optimal driving strategy proposal for a 125 c.c. street-type hybrid electric motorcycle with a parallel configuration. The results presented in this article complement the control proposal for the hybrid motorcycle. To carry out these developments, a representative dynamic model of the motorcycle is used, in which different optimization functionalities for predetermined driving modes are also described. The purpose is to implement an off-line optimal driving strategy that distributes energy to both power units by minimizing an objective torque requirement function. An optimal dynamic contribution is found from the optimization routine, and the optimal percentage contribution for the vehicle cruise speed is implemented in the proposed online PID controller.
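One way to picture the final step, applying an off-line optimized contribution percentage through an online PID cruise-speed controller, is the discrete PID sketch below. The split ratio, gains, and longitudinal plant model are illustrative assumptions, not values from the study.

```python
# Illustrative discrete PID controller that tracks a cruise speed and splits the
# requested torque between the combustion engine and the electric motor using an
# off-line optimized percentage. All numbers are assumptions made for this sketch.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

dt, mass, radius = 0.1, 180.0, 0.28        # s, kg (bike + rider), wheel radius in m
split_electric = 0.35                      # assumed off-line optimized electric contribution
pid = PID(kp=60.0, ki=8.0, kd=2.0, dt=dt)

speed, target = 0.0, 15.0                  # m/s, cruise set point
for _ in range(300):                       # 30 s simulation of a crude longitudinal model
    torque = max(0.0, pid.step(target - speed))
    t_ice, t_em = (1 - split_electric) * torque, split_electric * torque
    force = (t_ice + t_em) / radius - 0.4 * speed**2   # drive force minus simple drag
    speed += (force / mass) * dt
print(f"speed after 30 s: {speed:.1f} m/s (ICE/EM torque split = "
      f"{1-split_electric:.0%}/{split_electric:.0%})")
```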

Keywords: dynamic model, driving strategies, parallel hybrid motorcycle, PID controller, optimization

Procedia PDF Downloads 191
6165 An Informative Marketing Platform: Methodology and Architecture

Authors: Martina Marinelli, Samanta Vellante, Francesco Pilotti, Daniele Di Valerio, Gaetanino Paolone

Abstract:

Any development in web marketing technology requires changes in information engineering to identify instruments and techniques suitable for the production of software applications for informative marketing. Moreover, for large web solutions, designing an interface that enables human interactions is a complex process that must bridge between informative marketing requirements and the developed solution. A user-friendly interface in web marketing applications is crucial for a successful business. The paper introduces mkInfo - a software platform that implements informative marketing. Informative marketing is a new interpretation of marketing which places the information at the center of every marketing action. The creative team includes software engineering researchers who have recently authored an article on automatic code generation. The authors have created the mkInfo software platform to generate informative marketing web applications. For each web application, it is possible to automatically implement an opt in page, a landing page, a sales page, and a thank you page: one only needs to insert the content. mkInfo implements an autoresponder to send mail according to a predetermined schedule. The mkInfo platform also includes e-commerce for a product or service. The stakeholder can access any opt-in page and get basic information about a product or service. If he wants to know more, he will need to provide an e-mail address to access a landing page that will generate an e-mail sequence. It will provide him with complete information about the product or the service. From this point on, the stakeholder becomes a user and is now able to purchase the product or related services through the mkInfo platform. This paper suggests a possible definition for Informative Marketing, illustrates its basic principles, and finally details the mkInfo platform that implements it. This paper also offers some Informative Marketing models, which are implemented in the mkInfo platform. Informative marketing can be applied to products or services. It is necessary to realize a web application for each product or service. The mkInfo platform enables the product or the service producer to send information concerning a specific product or service to all stakeholders. In conclusion, the technical contributions of this paper are: a different interpretation of marketing based on information; a modular architecture for web applications, particularly for one with standard features such as information storage, exchange, and delivery; multiple models to implement informative marketing; a software platform enabling the implementation of such models in a web application. Future research aims to enable stakeholders to provide information about a product or a service so that the information gathered about a product or a service includes both the producer’s and the stakeholders' point of view. The purpose is to create an all-inclusive management system of the knowledge regarding a specific product or service: a system that includes everything about the product or service and is able to address even unexpected questions.

Keywords: informative marketing, opt in page, software platform, web application

Procedia PDF Downloads 127
6164 Optimization and Coordination of Organic Product Supply Chains under Competition: An Analytical Modeling Perspective

Authors: Mohammadreza Nematollahi, Bahareh Mosadegh Sedghy, Alireza Tajbakhsh

Abstract:

The last two decades have witnessed substantial attention to organic and sustainable agricultural supply chains. Motivated by real-world practices, this paper aims to address two main challenges observed in organic product supply chains: decentralized decision-making process between farmers and their retailers, and competition between organic products and their conventional counterparts. To this aim, an agricultural supply chain consisting of two farmers, a conventional farmer and an organic farmer who offers an organic version of the same product, is considered. Both farmers distribute their products through a single retailer, where there exists competition between the organic and the conventional product. The retailer, as the market leader, sets the wholesale price, and afterward, the farmers set their production quantity decisions. This paper first models the demand functions of the conventional and organic products by incorporating the effect of asymmetric brand equity, which captures the fact that consumers usually pay a premium for organic due to positive perceptions regarding their health and environmental benefits. Then, profit functions with consideration of some characteristics of organic farming, including crop yield gap and organic cost factor, are modeled. Our research also considers both economies and diseconomies of scale in farming production as well as the effects of organic subsidy paid by the government to support organic farming. This paper explores the investigated supply chain in three scenarios: decentralized, centralized, and coordinated decision-making structures. In the decentralized scenario, the conventional and organic farmers and the retailer maximize their own profits individually. In this case, the interaction between the farmers is modeled under the Bertrand competition, while analyzing the interaction between the retailer and farmers under the Stackelberg game structure. In the centralized model, the optimal production strategies are obtained from the entire supply chain perspective. Analytical models are developed to derive closed-form optimal solutions. Moreover, analytical sensitivity analyses are conducted to explore the effects of main parameters like the crop yield gap, organic cost factor, organic subsidy, and percent price premium of the organic product on the farmers’ and retailer’s optimal strategies. Afterward, a coordination scenario is proposed to convince the three supply chain members to shift from the decentralized to centralized decision-making structure. The results indicate that the proposed coordination scenario provides a win-win-win situation for all three members compared to the decentralized model. Moreover, our paper demonstrates that the coordinated model respectively increases and decreases the production and price of organic produce, which in turn motivates the consumption of organic products in the market. Moreover, the proposed coordination model helps the organic farmer better handle the challenges of organic farming, including the additional cost and crop yield gap. Last but not least, our results highlight the active role of the organic subsidy paid by the government as a means of promoting sustainable organic product supply chains. Our paper shows that although the amount of organic subsidy plays a significant role in the production and sales price of organic products, the allocation method of subsidy between the organic farmer and retailer is not of that importance.
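The decentralized structure described here (a retailer acting as Stackelberg leader over a conventional and an organic farmer) can be explored numerically with a toy model: fix simple demand and cost functions, compute each farmer's quantity response to a candidate wholesale price, and let the retailer search over wholesale prices while anticipating those responses. Every functional form and number below is an illustrative assumption, not the paper's analytical model, and the farmers' strategic interaction is reduced to substitution in the retailer's demand.

```python
# Toy numerical Stackelberg sketch: a retailer (leader) chooses wholesale prices for a
# conventional and an organic version of the same product; each farmer (follower) then
# chooses its quantity. Demand, cost, premium, and subsidy values are all assumptions
# made for illustration, not the paper's closed-form model.
import numpy as np

a_c, a_o = 10.0, 13.0          # demand intercepts; the organic premium raises a_o
b, g = 1.0, 0.4                # own- and cross-quantity slopes (product substitution)
c_c, c_o = 0.5, 0.9            # quadratic cost factors (organic cost factor is higher)
subsidy = 0.5                  # assumed per-unit organic subsidy paid to the organic farmer

def farmer_quantity(w, c, s=0.0):
    # Follower's best response: maximize (w + s) * q - c * q^2  ->  q = (w + s) / (2c).
    return max(0.0, (w + s) / (2.0 * c))

best = (-np.inf, None)
for w_c in np.linspace(0.1, 8.0, 80):          # leader's grid search over wholesale prices
    for w_o in np.linspace(0.1, 8.0, 80):
        q_c = farmer_quantity(w_c, c_c)
        q_o = farmer_quantity(w_o, c_o, subsidy)
        p_c = a_c - b * q_c - g * q_o          # linear inverse demand with substitution
        p_o = a_o - b * q_o - g * q_c
        profit = (p_c - w_c) * q_c + (p_o - w_o) * q_o
        if profit > best[0]:
            best = (profit, (round(w_c, 2), round(w_o, 2), round(q_c, 2), round(q_o, 2)))

print("Retailer profit and (w_c, w_o, q_c, q_o):", round(best[0], 2), best[1])
```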

Keywords: analytical game-theoretic model, product competition, supply chain coordination, sustainable organic supply chain

Procedia PDF Downloads 112
6163 Block Mining: Block Chain Enabled Process Mining Database

Authors: James Newman

Abstract:

Process mining is an emerging technology that looks to serialize enterprise data into time series data. It has been used by many companies and has been the subject of a variety of research papers. However, the majority of current efforts have looked at how best to build process mining on top of standard relational databases. This paper is a first pass at outlining a database custom-built for a minimum viable product of process mining. We present Block Miner, a blockchain protocol to store process mining data across a distributed network. We demonstrate the feasibility of storing process mining data on the blockchain. We present a proof of concept and show how the intersection of these two technologies helps to solve a variety of issues, including but not limited to ransomware attacks, tax documentation, and conflict resolution.

Keywords: blockchain, process mining, memory optimization, protocol

Procedia PDF Downloads 104
6162 A Novel Algorithm for Production Scheduling

Authors: Ali Mohammadi Bolban Abad, Fariborz Ahmadi

Abstract:

Optimization in manufacturing is a means of using limited resources to obtain the best performance and reduce waste. In this paper, a new algorithm based on the eurygaster life cycle is introduced to obtain a plan in which the task order and completion time of resources are defined. Evaluation results show that our approach yields a shorter makespan, when resources are allocated among products, in comparison with a genetic algorithm.

Keywords: evolutionary computation, genetic algorithm, particle swarm optimization, NP-Hard problems, production scheduling

Procedia PDF Downloads 380
6161 Simulation and Optimization of an Annular Methanol Reformer

Authors: Shu-Bo Yang, Wei Wu, Yuan-Heng Liu

Abstract:

This research aims to design a heat-exchanger type methanol reformer coupled with a preheating design in the gPROMS® environment. The endothermic methanol steam reforming reaction (MSR) and the exothermic preferential oxidation reaction (PROX) occur in the inner tube and the outer tube of the reformer, respectively. The effective heat transfer manner between the inner and outer tubes is investigated. It is verified that the countercurrent-flow type reformer provides a higher hydrogen yield than the cocurrent-flow type. Since the hot spot temperature appears in the outer tube, an improved scheme is proposed to suppress the hot spot temperature by splitting the excess air flow into two sites. Finally, an optimization algorithm for maximizing the hydrogen yield is employed to determine the optimal operating conditions.

Keywords: methanol reformer, methanol steam reforming, optimization, simulation

Procedia PDF Downloads 333
6160 Study of the Effect of Inclusion of TiO2 in Active Flux on Submerged Arc Welding of Low Carbon Mild Steel Plate and Parametric Optimization of the Process by Using DEA Based Bat Algorithm

Authors: Sheetal Kumar Parwar, J. Deb Barma, A. Majumder

Abstract:

Submerged arc welding is a very complex yet highly efficient, high-performance welding process. In the present study, an attempt has been made to reduce welding distortion through an increased amount of oxide flux, added as TiO2, in the submerged arc welding process. Care has been taken to avoid an excessive amount of the additive so that significant results can be attained. A Data Envelopment Analysis (DEA) based bat algorithm is used for parametric optimization, in which DEA converts the multi-response parameters into a single response parameter. The present study also helps to establish the effectiveness of TiO2 addition to the active flux during the submerged arc welding process.

Keywords: BAT algorithm, design of experiment, optimization, submerged arc welding

Procedia PDF Downloads 640
6159 Fructooligosaccharide Prebiotics: Optimization of Different Cultivation Parameters on Their Microbial Production

Authors: Elsayed Ahmed Elsayed, Azza Noor El-Deen, Mohamed A. Farid, Mohamed A. Wadaan

Abstract:

Recently, great attention has been paid to the use of dietary carbohydrates as prebiotic functional foods. Among the new commercially available products, fructooligosaccharides (FOS), which are microbially produced from sucrose, have attracted special interest due to their valuable properties and thus have great economic potential for the sugar industry. They are non-cariogenic sweeteners of low caloric value, as they are not hydrolyzed by gastro-intestinal enzymes; they selectively promote the growth of bifidobacteria in the colon, helping to eliminate microbial species harmful to human and animal health and preventing colon cancer. FOS has also been found to reduce cholesterol, phospholipid and triglyceride levels in blood. FOS is mainly produced by microbial fructosyltransferase (FTase) enzymes. The present work outlines bioprocess optimization of the different cultivation parameters affecting the production of FTase by Penicillium aurantiogriseum AUMC 5605. The optimization involves both traditional and fractional factorial design approaches. Additionally, the production process will be compared under batch and fed-batch conditions. Finally, the optimized process conditions will be applied to 5-L stirred tank bioreactor cultivations.

Keywords: prebiotics, fructooligosaccharides, optimization, cultivation

Procedia PDF Downloads 387
6158 Open Innovation for Crowdsourced Product Development: The Case Study of Quirky.com

Authors: Ana Bilandzic, Marcus Foth, Greg Hearn

Abstract:

In a narrow sense, innovation is the invention and commercialisation of a new product or service in the marketplace. The literature suggests that places that support knowledge exchange and social interaction, e.g. coffee shops, nurture innovative ideas. With the widespread success of the Internet, interpersonal communication and interaction have changed. Online platforms complement physical places for idea exchange and innovation – the rise of hybrid ‘net localities.’ Further, since its introduction in 2003 by Chesbrough, the concept of open innovation has received increased attention as a topic in academic research as well as an innovation strategy applied by companies. Open innovation allows companies to seek and release intellectual property and new ideas from outside of their own company. As a consequence, the innovation process is no longer managed only within the company; it is pursued in a co-creation process with customers, suppliers, and other stakeholders. Quirky.com (Quirky), a company founded by Ben Kaufman in 2009, recognised the opportunity offered by the Internet for knowledge exchange and open innovation. Quirky developed an online platform that makes innovation available to everyone. This paper reports on a study that analysed Quirky’s business process in an extended event-driven process chain (eEPC). The aim was to determine how the platform enabled crowdsourced innovation for physical products on the Internet. The analysis reveals that key elements of the business model are based on open innovation. Quirky is an example of how open innovation can support crowdsourced and crowdfunded product ideation, development and selling. The company opened up various stages in the innovation process to its members to contribute to product development, e.g. product ideation, design, and market research. Throughout the process, members earn influence by participating in the product development and, based on the influence they earn, receive shares of the product’s turnover. The outcomes of the study’s analysis highlighted certain benefits of open innovation for product development. The paper concludes with recommendations for future research to look into opportunities for open innovation approaches to be adopted by tertiary institutions as a novel way to commercialise research intellectual property.

Keywords: business process, crowdsourced innovation, open innovation, Quirky

Procedia PDF Downloads 233
6157 Effect of Impurities in the Chlorination Process of TiO2

Authors: Seok Hong Min, Tae Kwon Ha

Abstract:

With the increasing interest in Ti alloys, the extraction of Ti from its typical ore, TiO2, has long been, and will remain, an important issue. As an intermediate product for the production of pigment or titanium metal sponge, titanium tetrachloride (TiCl4) is produced in a fluidized bed using a high-grade TiO2 feedstock. The purity of the TiCl4 after chlorination depends on the quality of the titanium feedstock. Since impurities in the TiCl4 product are carried over to the final products, purification of the crude TiCl4 is required. The purification process includes fractional distillation and chemical treatment, which depend on the nature of the impurities present and the required quality of the final product. In this study, a thermodynamic analysis of the impurity effect in the chlorination process, which is the first step in the extraction of Ti from TiO2, has been conducted. All thermodynamic calculations were performed using the FactSage thermochemical software.

Keywords: rutile, titanium, chlorination process, impurities, thermodynamic calculation, FactSage

Procedia PDF Downloads 309
6156 Review of Suitable Advanced Oxidation Processes for Degradation of Organic Compounds in Produced Water during Enhanced Oil Recovery

Authors: Smita Krishnan, Krittika Chandran, Chandra Mohan Sinnathambi

Abstract:

Produced water and its treatment and management are growing challenges in all producing regions. This water is generally considered a non-revenue product, but it can have significant value in enhanced oil recovery techniques if it meets the required quality standards. There is also interest in the beneficial use of produced water for agricultural and industrial applications. Advanced oxidation processes (AOPs) are a chemical technology that has recently been growing in the wastewater treatment industry, and they are highly recommended for organic compounds that are not easily removed. The efficiency of AOPs is compound-specific; therefore, the optimization of each process should be carried out based on its particular aspects.

Keywords: advanced oxidation process, photochemical processes, degradation, organic contaminants

Procedia PDF Downloads 505
6155 Extractive Fermentation of Ethanol Using Vacuum Fractionation Technique

Authors: Weeraya Samnuknit, Apichat Boontawan

Abstract:

A vacuum fractionation technique was introduced to remove ethanol from the fermentation broth. The effect of the initial glucose and ethanol concentrations on the specific productivity was investigated. The inhibitory ethanol concentration was observed at 100 g/L. In order to increase the fermentation performance, the ethanol product was removed as soon as it was produced. The broth was boiled at 35°C by reducing the pressure to 65 mbar. The ethanol/water vapor was fractionated to up to 90 wt% before leaving the column. The ethanol concentration in the broth was kept below 25 g/L, thus minimizing the product inhibition effect on the yeast cells. For batch extractive fermentation, a high substrate utilization rate of 26.6 g/L·h was obtained, and most of the glucose was consumed within 21 h. For repeated-batch extractive fermentation, glucose was added up to 9 times, and ethanol production was more than 8-fold higher than in batch fermentation.

Keywords: ethanol, extractive fermentation, product inhibition, vacuum fractionation

Procedia PDF Downloads 254
6154 Association Rules Mining Task Using Metaheuristics: Review

Authors: Abir Derouiche, Abdesslem Layeb

Abstract:

Association Rule Mining (ARM) is one of the most popular data mining tasks, and it is widely used in various areas. The search for association rules is an NP-complete problem, which is why metaheuristics have been widely used to solve it. This paper presents ARM as an optimization problem and surveys the metaheuristic-based approaches proposed in the literature.

Keywords: optimization, metaheuristics, data mining, association rules mining

Procedia PDF Downloads 161
6153 Investigating the Effective Factors on Product Performance and Prioritizing Them: Case Study of Pars-Khazar Company

Authors: Ebrahim Sabermaash Eshghi, Donna Sandsmark

Abstract:

Nowadays, successful companies try to create a reliable and unique competitive position in the market. It is important to consider that merely choosing and codifying a competitive strategy appropriate to the market conditions does not, by itself, influence the final performance of the company; rather, it is the connection and interaction between upstream-level strategies and functional-level strategies that develops company performance in its operating environment. Given the importance of the subject, this study investigates the factors affecting product performance and prioritizes them. The study was conducted with a quantitative-qualitative approach (interview and questionnaire). In total, 103 informed managers and experts of the Pars-Khazar Company were surveyed in a census. The validity of the measurement tools was confirmed through expert judgment. Reliability was established with a Cronbach's alpha coefficient of 0.930; overall, the validity and reliability of the tools were confirmed. The collected data were analyzed using the Spearman correlation test and the Friedman test in SPSS. The results showed that management of the distribution and demand process (0.675), management of product pre-testing (0.636), and manufacturing and inventory management (0.628) had the highest correlations with product performance. Prioritization of the factors in the structure of launching new products, based on the mean ranks, showed that management of the volume of launched products and manufacturing and inventory management were the most important.
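The statistical workflow mentioned here (Spearman correlations between each factor and product performance, followed by a Friedman test to rank the factors) can be reproduced in outline with SciPy; the data below are random placeholders, not the Pars-Khazar survey responses.

```python
# Outline of the reported analysis with placeholder data (not the survey responses):
# Spearman correlation of each factor with product performance, then a Friedman test
# to check whether the factors' ratings differ enough to be ranked.
import numpy as np
from scipy.stats import spearmanr, friedmanchisquare

rng = np.random.default_rng(1)
n = 103                                             # respondents, as in the census
performance = rng.integers(1, 6, size=n)            # 5-point Likert-style ratings
factors = {
    "distribution and demand process": rng.integers(1, 6, size=n),
    "product pre-test management": rng.integers(1, 6, size=n),
    "manufacturing and inventory management": rng.integers(1, 6, size=n),
}

for name, ratings in factors.items():
    rho, p = spearmanr(ratings, performance)
    print(f"{name}: Spearman rho = {rho:.3f} (p = {p:.3f})")

stat, p = friedmanchisquare(*factors.values())
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")
```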

Keywords: product performance, home appliances, market, case study

Procedia PDF Downloads 224
6152 Real World Evidence: A Tool to Overcome the Lack of a Comparative Arm in Drug Evaluation in the Context of Rare Diseases

Authors: Mohamed Wahba

Abstract:

Objective: To build a comparative arm for product (X) in a specific gene-mutated advanced gastrointestinal cancer using real-world evidence to fulfill HTA requirements in drug evaluation. Methods: Data for product (X) were collected from a phase II clinical trial, while real-world data for (Y) and (Z) were collected from a US database. Real-world (RW) cohorts were matched to the clinical trial baseline characteristics using the weighting-by-odds method. Outcomes included progression-free survival (PFS) and overall survival (OS) rates. Study location and participants: International (product X, n=80) and from the USA (products Y and Z, n=73). Results: Two comparisons were made: trial cohort 1 (X) versus real-world cohort 1 (Z), and trial cohort 2 (X) versus real-world cohort 2 (Y). For first line, the median OS was 9.7 months (95% CI 8.6-11.5) and the median PFS was 5.2 months (95% CI 4.7-not reached) for real-world cohort 1. For second line, the median OS was 10.6 months (95% CI 4.7-27.3) for real-world cohort 2, and the median PFS was 5.0 months (95% CI 2.1-29.3). Results were statistically significant for the OS analysis but not for the PFS analysis. Conclusion: This study provided the clinical comparative outcomes needed for HTA evaluation.

Keywords: real world evidence, pharmacoeconomics, HTA agencies, oncology

Procedia PDF Downloads 90
6151 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of an optimization, and this optimization is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. First, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging. The quantum probabilistic technique is used to motivate the construction of our QTS model. The QTS model resembles the quantum dynamic model that has been applied to financial data. Second, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo have been used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos has been to find a model that explains chaotic behaviour. This model may reveal further insight into quantum chaos.
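As a concrete anchor for the classical side of this transition, the scalar Kalman filter below estimates the latent state of a noisy AR(1) series, the kind of autoregressive building block that the QTS model generalizes; the parameters are illustrative and unrelated to the paper's data.

```python
# Scalar Kalman filter for a latent AR(1) state observed with noise -- the classical
# time-series building block referred to in the abstract. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
phi, q_var, r_var, T = 0.8, 0.5, 1.0, 200      # AR coefficient, process/observation noise, length

# Simulate x_t = phi * x_{t-1} + w_t and y_t = x_t + v_t.
x = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q_var))
    y[t] = x[t] + rng.normal(0, np.sqrt(r_var))

# Kalman filter recursions for the same model.
x_hat, P = 0.0, 1.0
estimates = []
for t in range(T):
    x_pred, P_pred = phi * x_hat, phi**2 * P + q_var      # predict
    K = P_pred / (P_pred + r_var)                         # Kalman gain
    x_hat = x_pred + K * (y[t] - x_pred)                  # update with the new observation
    P = (1 - K) * P_pred
    estimates.append(x_hat)

rmse = np.sqrt(np.mean((np.array(estimates) - x) ** 2))
print(f"filter RMSE vs. latent state: {rmse:.3f}")
```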

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 469
6150 ACO-TS: an ACO-based Algorithm for Optimizing Cloud Task Scheduling

Authors: Fahad Y. Al-dawish

Abstract:

There is a current trend among a large number of organizations and individuals to use cloud computing, and many consider it a significant shift in the field of computing. Cloud computing systems are distributed and parallel systems consisting of a collection of interconnected physical and virtual machines. With the increasing demand for and benefit of cloud computing infrastructure, diverse computing processes can be executed in the cloud environment. Many organizations and individuals around the world depend on cloud computing infrastructure to carry their applications, platforms, and infrastructure. One of the major and essential issues in this environment is allocating incoming tasks to suitable virtual machines (cloud task scheduling). Cloud task scheduling is classified as an optimization problem, and several meta-heuristic algorithms have been proposed to solve and optimize it. A good task scheduler should adapt its scheduling technique to the changing environment and the types of incoming task sets. In this research project, a cloud task scheduling methodology based on the ant colony optimization (ACO) algorithm, which we call ACO-TS (Ant Colony Optimization for Task Scheduling), has been proposed and compared with different scheduling algorithms (random, first come first serve (FCFS), and fastest processor to the largest task first (FPLTF)). Ant colony optimization is a randomized optimization search method that is used here for assigning incoming tasks to available virtual machines (VMs). The main role of the proposed algorithm is to minimize the makespan of a given task set and to maximize resource utilization by balancing the load among virtual machines. The proposed scheduling algorithm was evaluated using the CloudSim toolkit framework. Finally, after analyzing and evaluating the performance in the experimental results, we find that the proposed ACO-TS algorithm performs better than the random, FCFS, and FPLTF algorithms in terms of both makespan and resource utilization.

Keywords: cloud task scheduling, ant colony optimization (ACO), CloudSim, cloud computing

Procedia PDF Downloads 422
6149 Biogeography Based CO2 and Cost Optimization of RC Cantilever Retaining Walls

Authors: Ibrahim Aydogdu, Alper Akin

Abstract:

In this study, the minimization of the cost and the CO2 emission of RC retaining wall designs has been performed with the Biogeography-Based Optimization (BBO) algorithm. This has been achieved by developing computer programs utilizing the BBO algorithm that minimize the cost and the CO2 emission of the RC retaining walls. The objective functions of the optimization problem are defined as the minimized cost, the CO2 emission, and a weighted aggregate of the cost and CO2 functions of the RC retaining walls. In the formulation of the optimum design problem, the height and thickness of the stem, the length of the toe projection, the thickness of the stem at base level, the length and thickness of the base, the depth and thickness of the key, the distance from the toe to the key, and the number and diameter of the reinforcement bars are treated as design variables. In the formulation of the optimization problem, flexural and shear strength constraints and minimum/maximum limitations for the reinforcement bar areas are derived from the American Concrete Institute (ACI 318-14) design code. Moreover, the development length conditions for suitable detailing of the reinforcement are treated as a constraint. The obtained optimum designs must satisfy the factors of safety for the failure modes (overturning, sliding and bearing), strength, serviceability and other required limitations to attain practically acceptable shapes. To demonstrate the efficiency and robustness of the presented BBO algorithm, an optimum design example for retaining walls is presented, and the results are compared with previously obtained results available in the literature.
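A stripped-down version of the BBO migration and mutation mechanics used here, applied to a generic box-constrained test objective rather than the retaining-wall cost/CO2 model (whose design variables and ACI 318-14 constraint checks are not reproduced), might look like the following sketch.

```python
# Stripped-down biogeography-based optimization (migration + mutation) on a generic
# box-constrained test objective; the retaining-wall cost/CO2 objective and code-based
# constraint checks from the paper are not reproduced here.
import numpy as np

def bbo(objective, bounds, pop_size=30, iters=200, p_mutate=0.05, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    for _ in range(iters):
        fitness = np.array([objective(h) for h in pop])
        pop = pop[np.argsort(fitness)]               # best habitats first (minimization)
        mu = np.linspace(1.0, 0.0, pop_size)         # emigration rate falls with rank
        lam = 1.0 - mu                               # immigration rate rises with rank
        new_pop = pop.copy()
        for i in range(pop_size):
            for d in range(dim):
                if rng.random() < lam[i]:            # immigrate a feature from a donor habitat
                    donor = rng.choice(pop_size, p=mu / mu.sum())
                    new_pop[i, d] = pop[donor, d]
                if rng.random() < p_mutate:          # occasional random mutation
                    new_pop[i, d] = rng.uniform(lo[d], hi[d])
        pop = np.clip(new_pop, lo, hi)
    fitness = np.array([objective(h) for h in pop])
    best = fitness.argmin()
    return pop[best], fitness[best]

# Example on a simple quadratic stand-in for a weighted cost/CO2 objective.
sol, val = bbo(lambda v: np.sum((v - 2.0) ** 2), bounds=[(-5, 5)] * 4)
print(np.round(sol, 3), round(float(val), 6))
```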

Keywords: biogeography, meta-heuristic search, optimization, retaining wall

Procedia PDF Downloads 401
6148 Design and Optimization of a Small Hydraulic Propeller Turbine

Authors: Dario Barsi, Marina Ubaldi, Pietro Zunino, Robert Fink

Abstract:

A design and optimization procedure is proposed and developed to provide the geometry of a high-efficiency, compact hydraulic propeller turbine for low heads. For the preliminary design of the machine, classic design criteria are used, based on statistical correlations for the definition of the fundamental geometric parameters and the blade shapes. These relationships are based on the fundamental design parameters (i.e., specific speed, flow coefficient, work coefficient) in order to provide a simple yet reliable procedure. Particular attention is paid, from the initial steps, to the correct conformation of the meridional channel and the correct arrangement of the blade rows. The preliminary geometry thus obtained is used as a starting point for the hydrodynamic optimization procedure, carried out using a CFD calculation software coupled with a genetic algorithm that generates and updates a large database of turbine geometries. The optimization process is performed using a commercial approach that solves the Reynolds-averaged Navier-Stokes (RANS) equations by exploiting the axial-symmetric geometry of the machine. The geometries generated within the database are therefore calculated in order to determine the corresponding overall performance. In order to speed up the optimization calculation, an artificial neural network (ANN) based on the use of an objective function is employed. The procedure was applied to the specific case of a propeller turbine with an innovative modular design, intended for applications characterized by very low heads. The procedure is tested in order to verify its validity and its ability to automatically obtain the targeted net head and the maximum total-to-total internal efficiency.

Keywords: renewable energy conversion, hydraulic turbines, low head hydraulic energy, optimization design

Procedia PDF Downloads 150
6147 Cylindrical Spacer Shape Optimization for Enhanced Inhalation Therapy

Authors: Shahab Azimi, Siamak Arzanpour, Anahita Sayyar

Abstract:

Asthma and chronic obstructive pulmonary disease (COPD) are common lung diseases that have a significant global impact. Pressurized metered dose inhalers (pMDIs) are widely used for treatment, but they have limitations such as a high medication release speed, which results in drug deposition in the mouth or oral cavity, and difficulty achieving proper synchronization with inhalation by users. Spacers are add-on devices that improve the efficiency of pMDIs by reducing the release speed and providing space for aerosol particles to break up into finer, medically effective droplets. The aim of this study is to optimize the size and cylindrical shape of spacers to enhance their drug delivery performance. The study was based on fluid dynamics theory and employed Ansys software for simulation and optimization. The results showed that optimizing the spacer's geometry greatly influenced its performance and improved drug delivery. This study provides a foundation for future research on enhancing the efficiency of inhalation therapy for lung diseases.

Keywords: asthma, COPD, pressurized metered dose inhalers, spacers, CFD, shape optimization

Procedia PDF Downloads 99
6146 Structural Optimization of Shell and Arched Structures

Authors: Mitchell Gohnert, Ryan Bradley

Abstract:

This paper reviews some fundamental concepts of structural optimization, which are based on the type of materials used in construction and the shape of the structure. The first step in structural optimization is to break down all internal forces in a structure into fundamental stresses, which are tensions and compressions. Knowing the stress patterns directs our selection of structural shapes and the most appropriate type of construction material. In selecting materials, it is essential to understand that all construction materials have flaws, or micro-cracks, which reduce the capacity of the material, especially when it is subjected to tension. Because of these material defects, many construction materials perform significantly better when subjected to compressive forces. Structures are also more efficient if bending moments are eliminated. Bending stresses produce high peak stresses at each face of the member, and therefore substantially more material is required to resist bending. The shape of the structure also has a profound effect on stress levels; stress may be reduced dramatically by simply changing the shape. Catenary, triangular and linear shapes are the fundamental structural forms for achieving optimal stress flow. If the natural flow of stress matches the shape of the structure, the optimal shape has been found.

Keywords: arches, economy of stresses, material strength, optimization, shells

Procedia PDF Downloads 118
6145 A Comparative Study of Optimization Techniques and Models to Forecasting Dengue Fever

Authors: Sudha T., Naveen C.

Abstract:

Dengue is a serious public health issue that causes significant annual economic and welfare burdens on nations. However, enhanced optimization techniques and quantitative modeling approaches can predict the incidence of dengue. By advocating for a data-driven approach, public health officials can make informed decisions, thereby improving the overall effectiveness of sudden disease outbreak control efforts. The National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention are two of the U.S. federal government agencies from which this study uses environmental data. Based on environmental data that describe changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step involves preparing the data, which includes handling outliers and missing values so that the data are ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. In the third phase of the research, machine learning models such as the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared with several optimization techniques for feature selection, such as harmony search and the genetic algorithm. In the fourth stage, model performance is evaluated using the Mean Square Error (MSE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE). The goal is to select an optimization strategy with the fewest errors, lowest cost, greatest productivity, or maximum potential results. Optimization is widely employed in a variety of fields, including engineering, science, management, mathematics, finance, and medicine. An effective optimization method based on harmony search and an integrated genetic algorithm is introduced for input feature selection, and it shows an important improvement in the model's predictive accuracy. The predictive models built on the Huber Regressor perform best for both optimization and prediction.
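The model-comparison stage described here can be outlined in a few lines of scikit-learn: fit the Huber, gradient boosting, and support vector regressors on the same feature matrix and compare MSE, MAE, and RMSE on a held-out split. The synthetic features below stand in for the NOAA/CDC climate data, and the harmony search / genetic algorithm feature-selection step is omitted from the sketch.

```python
# Outline of the model-comparison stage with synthetic stand-ins for the NOAA/CDC
# climate features; the harmony search / genetic algorithm feature-selection step
# described in the abstract is omitted here.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import HuberRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

# Synthetic weekly records: columns mimic temperature, precipitation, vegetation, etc.
X, y = make_regression(n_samples=500, n_features=8, noise=15.0, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)

models = {
    "HuberRegressor": HuberRegressor(max_iter=500),
    "GradientBoostingRegressor": GradientBoostingRegressor(random_state=42),
    "SVR": SVR(C=10.0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    mse = mean_squared_error(y_te, pred)
    mae = mean_absolute_error(y_te, pred)
    print(f"{name}: MSE={mse:.1f}  MAE={mae:.1f}  RMSE={np.sqrt(mse):.1f}")
```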

Keywords: deep learning model, dengue fever, prediction, optimization

Procedia PDF Downloads 66
6144 Integrated Optimization of Vehicle Microscopic Behavior and Signal Control for Mixed Traffic Based on a Distributed Strategy

Authors: Siliang Luan

Abstract:

In this paper, an integrated-decentralized bi-level optimization framework is developed to coordinate intersection signal operations and vehicle driving behavior at an isolated signalized intersection in a mixed traffic environment. The framework takes advantage of both signal control and conflict elimination by incorporating an integrated level and a decentralized level. Two distinct signal control methods are introduced: the classical green phase control strategy and the white phase control strategy. The latter allows certain vehicles to pass through the intersection during a red phase, thereby reducing idle time. Besides, various vehicle trajectory optimization strategies are tailored to different vehicle-following types, leveraging the capabilities of CAV technology. Enhanced microscopic behavior control strategies, such as car-following and lane-changing controls, are also developed for CAVs to improve their performance in mixed traffic. These strategies are integrated into the proposed framework. The effectiveness of the framework is validated through numerical experiments and sensitivity analysis, demonstrating its advantages in terms of traffic effectiveness, stability, and energy economy.

Keywords: traffic signal optimization, connected and automated vehicles, vehicle microscopic control, traffic control and information technology

Procedia PDF Downloads 7