Search results for: performance optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14824

14104 Pavement Maintenance and Rehabilitation Scheduling Using Genetic Algorithm Based Multi Objective Optimization Technique

Authors: Ashwini Gowda K. S, Archana M. R, Anjaneyappa V

Abstract:

This paper presents a pavement maintenance and management system (PMMS) to obtain optimum pavement maintenance and rehabilitation strategies and maintenance scheduling for a network using a multi-objective genetic algorithm (MOGA). The optimal maintenance and rehabilitation strategy maximizes the pavement condition index of the road sections in the network at minimum maintenance and rehabilitation cost over the planning period. In this paper, NSGA-II is applied to perform the maintenance optimization; this maintenance approach was expected to preserve and improve the existing condition of the highway network in a cost-effective way. The proposed PMMS is applied to a network whose pavements are assessed using the pavement condition index (PCI). The minimum and maximum maintenance costs for a planning period of 20 years obtained from the non-dominated solutions were found to be 5.190x10¹⁰ ₹ and 4.81x10¹⁰ ₹, respectively.
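
For readers unfamiliar with the multi-objective formulation, the sketch below illustrates the cost-versus-PCI trade-off that NSGA-II searches. The toy network, treatment costs, PCI gains and deterioration rate are illustrative assumptions, and the brute-force Pareto enumeration merely stands in for NSGA-II, which would be needed at realistic network sizes and horizons.

```python
# Illustrative sketch of the cost-vs-PCI trade-off behind an NSGA-II maintenance
# formulation. All numbers (sections, costs, PCI gains, deterioration) are
# hypothetical; a real study would run NSGA-II on measured network data.
from itertools import product

SECTIONS, YEARS = 3, 2           # toy network and horizon (the paper plans 20 years)
TREATMENTS = {                   # name: (cost per section, PCI gain)
    "none": (0.0, 0.0),
    "minor": (1.0, 15.0),
    "major": (4.0, 40.0),
}
DETERIORATION, PCI0 = 8.0, 70.0  # assumed annual PCI loss and initial PCI

def evaluate(plan):
    """plan: one treatment name per (year, section). Returns (total cost, mean final PCI)."""
    cost, pci = 0.0, [PCI0] * SECTIONS
    for year in range(YEARS):
        for s in range(SECTIONS):
            c, gain = TREATMENTS[plan[year * SECTIONS + s]]
            cost += c
            pci[s] = min(100.0, pci[s] - DETERIORATION + gain)
    return cost, sum(pci) / SECTIONS

def pareto_front(points):
    """Keep the solutions not dominated in (minimize cost, maximize PCI)."""
    return sorted(p for p in points
                  if not any(q[0] <= p[0] and q[1] >= p[1] and q != p for q in points))

objectives = {evaluate(p) for p in product(TREATMENTS, repeat=SECTIONS * YEARS)}
for cost, pci in pareto_front(objectives):
    print(f"cost={cost:5.1f}  mean final PCI={pci:5.1f}")
```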

Keywords: genetic algorithm, maintenance and rehabilitation, optimization technique, pavement condition index

Procedia PDF Downloads 137
14103 Failure Inference and Optimization for Step Stress Model Based on Bivariate Wiener Model

Authors: Soudabeh Shemehsavar

Abstract:

In this paper, we consider the situation under a life test in which the failure times of the test units are not related deterministically to an observable, stochastic, time-varying covariate. In such a case, the joint distribution of the failure time and a marker value is useful for modeling the step-stress life test. The problem of accelerating such an experiment is the main aim of this paper. We present a step-stress accelerated model based on a bivariate Wiener process with one component as the latent (unobservable) degradation process, which determines the failure times, and the other as a marker process, whose degradation values are recorded at the times of failure. Parametric inference based on the proposed model is discussed, and the optimization procedure for obtaining the optimal time for changing the stress level is presented. The optimization criterion is to minimize the approximate variance of the maximum likelihood estimator of a percentile of the products’ lifetime distribution.
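
A minimal simulation sketch of the bivariate Wiener step-stress setup described above follows; the drifts, covariance, failure threshold and stress-change time are illustrative assumptions, not the paper's values.

```python
# Sketch: simulate the bivariate Wiener (degradation, marker) process under a
# simple step-stress test. Drifts, volatilities, correlation, threshold and the
# stress-change time tau are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
dt, horizon, tau = 0.01, 50.0, 10.0       # time step, test length, stress-change time
mu_low, mu_high = (0.4, 0.3), (1.2, 0.9)  # drifts (degradation, marker) per stress level
sigma = np.array([[1.0, 0.6],             # covariance of the increments
                  [0.6, 1.0]])
threshold = 15.0                          # latent degradation level that defines failure
L = np.linalg.cholesky(sigma)

def one_unit():
    x = np.zeros(2)                       # (latent degradation, marker)
    t = 0.0
    while t < horizon:
        mu = mu_low if t < tau else mu_high
        x += np.array(mu) * dt + np.sqrt(dt) * L @ rng.standard_normal(2)
        t += dt
        if x[0] >= threshold:             # failure: record time and observed marker value
            return t, x[1]
    return None                           # censored unit

samples = [one_unit() for _ in range(200)]
failures = [s for s in samples if s is not None]
times = np.array([t for t, _ in failures])
print(f"{len(failures)} failures, mean failure time {times.mean():.2f}")
```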

Keywords: bivariate normal, Fisher information matrix, inverse Gaussian distribution, Wiener process

Procedia PDF Downloads 311
14102 Resource Leveling Optimization in Construction Projects of High Voltage Substations Using Nature-Inspired Intelligent Evolutionary Algorithms

Authors: Dimitrios Ntardas, Alexandros Tzanetos, Georgios Dounias

Abstract:

High Voltage Substations (HVS) are the intermediate step between the production of power and successfully transmitting it to clients, making them one of the most important checkpoints in power grids. Nowadays, as renewable resources and consequently distributed generation are growing fast, the construction of HVS is of high importance both in terms of quality and completion time, so that new energy producers can quickly and safely integrate into power grids. The resources needed, such as machines and workers, should be carefully allocated so that the construction of an HVS is completed on time, at the lowest possible cost (e.g., without additional costs caused by project delays that were not taken into consideration) and at the highest quality. In addition, there are milestones and several checkpoints to be precisely achieved during construction to ensure cost and timeline control and to ensure that the percentage of governmental funding will be granted. The management of such a demanding project is an NP-hard problem that consists of prerequisite constraints and resource limits for each task of the project. In this work, a hybrid meta-heuristic method is implemented to solve this problem. Meta-heuristics have proven to be quite useful when dealing with high-dimensional constrained optimization problems, and hybridizing them boosts their performance.
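
The sketch below illustrates the kind of resource-constrained objective such a meta-heuristic optimizes, using a sum-of-squared-resource-usage leveling criterion and plain random sampling of feasible start times as a stand-in for the hybrid meta-heuristic; all task data are hypothetical.

```python
# Toy resource-leveling sketch: tasks with durations, resource demand and
# precedence; the objective is the sum of squared daily resource usage.
# Random sampling of feasible start times stands in for the hybrid
# meta-heuristic; all task data are hypothetical.
import random

TASKS = {           # name: (duration, resource demand, predecessors)
    "A": (3, 4, []),
    "B": (2, 3, ["A"]),
    "C": (4, 2, ["A"]),
    "D": (2, 5, ["B", "C"]),
}
DEADLINE = 12

def random_schedule():
    """Sample start times respecting precedence (earliest start plus random slack)."""
    start = {}
    for name in ["A", "B", "C", "D"]:                  # topological order
        dur, _, preds = TASKS[name]
        earliest = max((start[p] + TASKS[p][0] for p in preds), default=0)
        latest = DEADLINE - dur
        if earliest > latest:
            return None                                # infeasible sample
        start[name] = random.randint(earliest, latest)
    return start

def leveling_cost(start):
    usage = [0] * DEADLINE
    for name, (dur, res, _) in TASKS.items():
        for d in range(start[name], start[name] + dur):
            usage[d] += res
    return sum(u * u for u in usage)

best = None
for _ in range(5000):
    s = random_schedule()
    if s and (best is None or leveling_cost(s) < leveling_cost(best)):
        best = s
print(best, leveling_cost(best))
```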

Keywords: hybrid meta-heuristic methods, substation construction, resource allocation, time-cost efficiency

Procedia PDF Downloads 141
14101 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals

Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor

Abstract:

This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting their advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification.
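
A minimal sketch of the two approaches described above is given below, using scikit-learn with synthetic data standing in for extracted brain-signal features: univariate feature selection followed by a stacked ensemble with a meta-classifier. This is not the authors' pipeline, only an illustration of the idea.

```python
# Sketch of the two approaches: (1) select influential features, (2) combine base
# classifiers with a meta-classifier (stacking). Synthetic data stands in for
# EEG/brain-signal features; the selector and classifiers are illustrative choices.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# stand-in for features extracted from brain signals
X, y = make_classification(n_samples=400, n_features=64, n_informative=10,
                           random_state=0)

ensemble = StackingClassifier(
    estimators=[
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    final_estimator=LogisticRegression(),    # the meta-classifier
)
pipeline = make_pipeline(SelectKBest(f_classif, k=20), ensemble)
print("CV accuracy:", cross_val_score(pipeline, X, y, cv=5).mean())
```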

Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers

Procedia PDF Downloads 65
14100 Design an Intelligent Fire Detection System Based on Neural Network and Particle Swarm Optimization

Authors: Majid Arvan, Peyman Beygi, Sina Rokhsati

Abstract:

Timely detection of fire in buildings is of great importance. Employing intelligent methods for data processing in fire detection systems leads to a significant reduction of fire damage at the lowest cost. In this paper, the raw data obtained from fire detection sensor networks in buildings are processed using intelligent methods based on neural networks, and the likelihood of a fire occurring is predicted. To enhance the quality of the system, the noise in the sensor data is reduced by wavelet analysis and the SVD technique. Meanwhile, the proposed neural network is trained using particle swarm optimization (PSO). In the simulation work, the data are collected from a sensor network inside the room and applied to the proposed network. The outputs are then compared with a conventional MLP network. The simulation results demonstrate the superiority of the proposed method over the conventional one.
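
The sketch below shows the core idea of training a small network's weights with PSO rather than backpropagation; the layer sizes, PSO hyper-parameters and synthetic "sensor" data are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch: train a tiny one-hidden-layer network with particle swarm
# optimization instead of backpropagation. Layer sizes, PSO hyper-parameters
# and the synthetic sensor data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))                       # stand-in for denoised sensor features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)     # stand-in for fire / no-fire label

N_IN, N_HID = 4, 6
DIM = N_IN * N_HID + N_HID + N_HID + 1              # all weights and biases, flattened

def forward(w, X):
    W1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    b1 = w[N_IN * N_HID:N_IN * N_HID + N_HID]
    W2, b2 = w[-N_HID - 1:-1], w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def loss(w):
    return np.mean((forward(w, X) - y) ** 2)

# Plain global-best PSO over the flattened weight vector.
n_particles, iters, w_in, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.normal(size=(n_particles, DIM))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, DIM)), rng.random((n_particles, DIM))
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

acc = np.mean((forward(gbest, X) > 0.5) == y)
print(f"training accuracy of PSO-trained network: {acc:.2f}")
```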

Keywords: intelligent fire detection, neural network, particle swarm optimization, fire sensor network

Procedia PDF Downloads 371
14099 Sizing of Drying Processes to Optimize Conservation of the Nuclear Power Plants on Stationary

Authors: Assabo Mohamed, Bile Mohamed, Ali Farah, Isman Souleiman, Olga Alos Ramos, Marie Cadet

Abstract:

The life of a nuclear power plant is regularly punctuated by short or long outages to carry out maintenance operations and/or nuclear fuel reloading. During these outage periods, it is essential to conserve all the secondary circuit equipment to avoid the onset of corrosion. This circuit is one of the main components of a nuclear reactor. Indeed, the conservation of materials during shutdown of a nuclear unit improves circuit performance and considerably reduces maintenance costs. This study is part of the optimization of the dry conservation of the equipment of the water station of the nuclear reactor. The main objective is to provide tools to guide the Electricity Production Nuclear Centre (EPNC) in achieving the criteria required by the chemical specifications for the conservation of materials. A theoretical model of the drying of the water station exchangers is developed with the software Engineering Equation Solver (EES). It is used to size the air flow and air quality needed for dry conservation of the equipment. This model is based on the heat and mass transfer governing the drying operation. A parametric study is conducted to determine the influence of the aerothermal factors involved in the drying operation. The results show that the success of dry conservation of the secondary circuit equipment of a nuclear reactor depends strongly on the draining, the quality of the drying air, and the flow of air injected into the secondary circuit. Finally, the theoretical case study performed in EES highlights the importance of mastering the entire system: the air supply must be balanced so that each exchanger receives the optimum flow according to its characteristics. From these results, recommendations can be formulated for nuclear power plants to optimize drying practices and achieve good performance in conserving the water station equipment during outages.

Keywords: dry conservation, optimization, sizing, water station

Procedia PDF Downloads 255
14098 A Novel Approach of NPSO on Flexible Logistic (S-Shaped) Model for Software Reliability Prediction

Authors: Pooja Rani, G. S. Mahapatra, S. K. Pandey

Abstract:

In this paper, we propose a novel approach combining neural networks and particle swarm optimization for software reliability prediction. We first explain how to apply a compound function in a neural network so that we can derive a Flexible Logistic (S-shaped) Growth Curve (FLGC) model. This model mathematically represents software failure as a random process and can be used to evaluate software development status during testing. To avoid being trapped in local minima, we apply the particle swarm optimization method to train the proposed model using failure test data sets. We derive our proposed model using computational-intelligence-based modeling; the proposed model thus becomes a Neuro-Particle Swarm Optimization (NPSO) model. We test the model with different inertia weights for updating particle positions and velocities. Results based on the best inertia weight are compared with personal-best-oriented PSO (pPSO), which helps to choose the local best in the network neighborhood. The applicability of the proposed model is demonstrated on a real failure test data set. The results obtained from the experiments show that the proposed model has fairly accurate prediction capability for software reliability.

Keywords: software reliability, flexible logistic growth curve model, software cumulative failure prediction, neural network, particle swarm optimization

Procedia PDF Downloads 335
14097 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit

Authors: Ahmed Elrewainy

Abstract:

Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials ("endmembers") present in the scene share each pixel's spectrum in different proportions called "abundances". Unmixing of the data cube is an important task for determining the endmembers present in the cube for the analysis of these images. Unsupervised unmixing is done with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem "basis pursuit" can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. This optimization problem is solved using the proximal method of iterative thresholding. The l1-norm basis pursuit optimization problem, as a sparsity-based unmixing technique, was used to unmix real and synthetic hyperspectral data cubes.
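
A minimal sketch of the sparsity-based unmixing idea follows, solving the l1-regularized (basis pursuit denoising) form with iterative soft-thresholding; the dictionary, abundances and noise level are synthetic.

```python
# Sketch: recover sparse abundances of a mixed pixel over an endmember dictionary
# via iterative soft-thresholding (ISTA), a proximal method for the l1-regularized
# basis pursuit denoising problem. Dictionary, abundances and noise are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_endmembers = 100, 20
D = rng.random((n_bands, n_endmembers))          # dictionary of endmember spectra
x_true = np.zeros(n_endmembers)
x_true[[3, 11]] = [0.7, 0.3]                     # pixel is a sparse mixture of two endmembers
y = D @ x_true + 0.01 * rng.standard_normal(n_bands)

lam = 0.05                                       # sparsity weight
step = 1.0 / np.linalg.norm(D, 2) ** 2           # 1 / Lipschitz constant of the gradient

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n_endmembers)
for _ in range(500):                             # ISTA iterations
    grad = D.T @ (D @ x - y)
    x = soft_threshold(x - step * grad, step * lam)

top = np.argsort(x)[-2:]
print("largest recovered abundances:", top, x[top])
```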

Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets

Procedia PDF Downloads 189
14096 Interaction between Mutual Fund Performance and Portfolio Turnover

Authors: Sheng-Ching Wu

Abstract:

This paper examines the interaction between mutual fund performance and portfolio turnover. Active trading could affect fund performance, but underperforming funds could also be traded actively at the same time in an attempt to perform well. Therefore, we used two-stage least squares to address simultaneity. The results indicate that funds with higher portfolio turnover exhibit inferior performance compared with funds having lower turnover. Moreover, funds with poor performance exhibit higher portfolio turnover. The findings support the assumptions that active trading erodes performance and that fund managers with poor performance attempt to trade actively to retain employment.
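
The sketch below illustrates the two-stage least squares idea on synthetic data; the instrument and the data-generating process are illustrative assumptions, not the paper's specification.

```python
# Sketch of two-stage least squares (2SLS) for simultaneity between fund
# performance and portfolio turnover: regress turnover on an instrument first,
# then regress performance on the fitted turnover. Data and the choice of
# instrument are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)                   # instrument (e.g. a fund characteristic)
u = rng.normal(size=n)                   # common shock creating simultaneity
turnover = 1.0 + 0.8 * z + u + rng.normal(scale=0.5, size=n)
performance = 2.0 - 0.6 * turnover + u + rng.normal(scale=0.5, size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: project the endogenous regressor (turnover) on the instrument.
Z = np.column_stack([np.ones(n), z])
turnover_hat = Z @ ols(Z, turnover)

# Stage 2: regress performance on the fitted values.
X2 = np.column_stack([np.ones(n), turnover_hat])
beta_2sls = ols(X2, performance)

X_naive = np.column_stack([np.ones(n), turnover])
beta_ols = ols(X_naive, performance)
print("naive OLS slope :", round(beta_ols[1], 3))   # biased upward by the common shock
print("2SLS slope      :", round(beta_2sls[1], 3))  # closer to the true -0.6
```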

Keywords: mutual funds, portfolio turnover, simultaneity, two-stage least squares

Procedia PDF Downloads 425
14095 Board Structure, Composition, and Firm Performance: A Theoretical and Empirical Review

Authors: Suleiman Ahmed Badayi

Abstract:

Corporate governance literature is very wide and includes several empirical studies on the relationship between board structure, composition and firm performance. The separation of ownership and control in organizations was aimed at reducing the losses suffered by investors in the event of financial scandals. This paper reviews the theoretical and empirical literature on the relationship between board composition and its impact on firm performance. The findings of these studies are mixed: while some are of the view that board structure is related to firm performance, many empirical studies indicate no relationship, and others find a U-shaped relationship between firm performance and board structure. Therefore, this study argues that board structure is not a significant determinant of the financial performance of a firm.

Keywords: board structure, composition, firm performance, corporate governance

Procedia PDF Downloads 548
14094 Maximum Power Point Tracking Using Fuzzy Logic Control for a Stand-Alone PV System with PI Controller for Battery Charging Based on Evolutionary Technique

Authors: Mohamed A. Moustafa Hassan, Omnia S. S. Hussian, Hany M. Elsaved

Abstract:

This paper introduces the application of a Fuzzy Logic Controller (FLC) to extract the Maximum Power Point Tracking (MPPT) from the PV panel. In addition, a proportional-integral (PI) controller is used as the strategy for battery charge control according to acceptable performance criteria. The parameters of the PI controller have been tuned via the Modified Adaptive Accelerated Coefficient Particle Swarm Optimization (MAACPSO) technique. The simulation results, using MATLAB/Simulink tools, show that the FLC technique has advantages for the MPPT problem, as it provides a fast response under changes in environmental conditions such as radiation and temperature. In addition, the use of the PI controller based on MAACPSO results in good performance in controlling battery charging with constant voltage and current to execute rapid charging.

Keywords: battery charging, fuzzy logic control, maximum power point tracking, PV system, PI controller, evolutionary technique

Procedia PDF Downloads 153
14093 Solar Building Design Using GaAs PV Cells for Optimum Energy Consumption

Authors: Hadis Pouyafar, D. Matin Alaghmandan

Abstract:

Gallium arsenide (GaAs) solar cells are widely used in applications like spacecraft and satellites because they have a high absorption coefficient and efficiency and can withstand high-energy particles such as electrons and protons. With the energy crisis, there is a growing need for efficient and cost-effective solar cells. GaAs cells, with their 46% efficiency compared to silicon cells' 23%, can be utilized in buildings to achieve nearly zero emissions; in this way, more of the incident solar irradiation can be converted into electricity. The III-V semiconductors used in these cells offer superior performance compared to other available technologies. However, despite these advantages, Si cells dominate the market due to their lower prices. In our study, we took a software-driven approach from the start to gather all the necessary information and thereby design an optimal building that harnesses the full potential of solar energy. Our modeling results reveal a promising future for GaAs cells. We utilized the Grasshopper plugin for modeling and optimization purposes, and relied on the Ladybug and Honeybee plugins to assess radiation, weather data, solar energy levels and other factors. We have shown that silicon solar cells may not always be the best choice for meeting electricity demands, particularly when higher power output is required. Therefore, when it comes to power consumption and the available surface area for photovoltaic (PV) installation, it may be necessary to consider more efficient solar cell options, like GaAs solar cells. By considering the building requirements and utilizing GaAs technology, we were able to optimize the PV surface area.

Keywords: gallium arsenide (GaAs), optimization, sustainable building, GaAs solar cells

Procedia PDF Downloads 74
14092 A Critical Study of the Performance of Self Compacting Concrete (SCC) Using Locally Supplied Materials in Bahrain

Authors: A. Umar, A. Tamimi

Abstract:

Development of new types of concrete with improved performance is a very important issue for the whole building industry. The development is based on the optimization of the concrete mix design, with an emphasis not only on workability and mechanical properties but also on the durability and reliability of the concrete structure in general. Self-compacting concrete (SCC) is a high-performance material designed to flow into formwork under its own weight and without the aid of mechanical vibration. At the same time, it is cohesive enough to fill spaces of almost any size and shape without segregation or bleeding. Construction time is shorter, and production of SCC is environmentally friendly (no noise, no vibration). Furthermore, SCC produces a good surface finish. Despite these advantages, SCC has not gained much local acceptance, though it has been promoted in the Middle East for the last ten to twelve years. The reluctance to utilize the advantages of SCC in Bahrain may be due to a lack of research or published data pertaining to locally produced SCC. Therefore, there is a need to conduct studies on SCC using locally available material supplies. From the literature, it has been observed that the use of viscosity modifying admixtures (VMA), micro silica and glass fibers is very effective in stabilizing the rheological properties and the strength of fresh and hardened self-compacting concrete (SCC). Therefore, in the present study, it is proposed to carry out investigations of SCC with combinations of various dosages of VMAs, with and without micro silica and glass fibers, and to study their influence on the properties of fresh and hardened concrete.

Keywords: self-compacting concrete, viscosity modifying admixture, micro silica, glass fibers

Procedia PDF Downloads 642
14091 Impact of Urbanization on the Performance of Higher Education Institutions

Authors: Chandan Jha, Amit Sachan, Arnab Adhikari, Sayantan Kundu

Abstract:

The purpose of this study is to evaluate the performance of Higher Education Institutions (HEIs) in India and to examine the impact of urbanization on their performance. Data Envelopment Analysis (DEA) has been used, and the authors have collected the required data related to the performance measures from the National Institutional Ranking Framework web portal. The performance of HEIs has been evaluated using two different DEA models. In the first model, the geographic locations of the institutes have been categorized into two categories, i.e., Urban vs. Non-Urban, whereas in the second model these locations have been classified into three categories, i.e., Urban, Semi-Urban and Non-Urban. The findings of this study provide several insights into the relationship between the degree of urbanization and the performance of HEIs.
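
For illustration, the sketch below solves the input-oriented CCR envelopment model, the basic DEA formulation, with scipy's linear programming; the institution inputs and outputs are hypothetical, and the paper's exact DEA variants are not reproduced here.

```python
# Minimal input-oriented CCR DEA sketch (envelopment form) using linear
# programming, in the spirit of a DEA evaluation of HEIs. The input/output
# data for the institutions are hypothetical.
import numpy as np
from scipy.optimize import linprog

# rows = institutions; inputs: faculty, budget; outputs: graduates, publications
X = np.array([[50.0, 10.0], [80.0, 20.0], [60.0, 12.0], [40.0, 15.0]])
Y = np.array([[500.0, 40.0], [700.0, 90.0], [650.0, 60.0], [300.0, 35.0]])
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_efficiency(o):
    """Efficiency of institution o: min theta s.t. X^T lam <= theta*x_o, Y^T lam >= y_o."""
    c = np.concatenate(([1.0], np.zeros(n)))          # variables: theta, lambda_1..n
    A_in = np.column_stack((-X[o], X.T))              # sum_j lam_j x_ji - theta x_oi <= 0
    A_out = np.column_stack((np.zeros(s), -Y.T))      # -sum_j lam_j y_jr <= -y_or
    A_ub = np.vstack((A_in, A_out))
    b_ub = np.concatenate((np.zeros(m), -Y[o]))
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for o in range(n):
    print(f"institution {o}: efficiency = {ccr_efficiency(o):.3f}")
```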

Keywords: DEA, higher education, performance evaluation, urbanization

Procedia PDF Downloads 200
14090 A Resource Optimization Strategy for CPU (Central Processing Unit) Intensive Applications

Authors: Junjie Peng, Jinbao Chen, Shuai Kong, Danxu Liu

Abstract:

Under traditional resource allocation strategies, the usage of resources on physical servers in a cloud data center is highly uncertain: assigning too few tasks wastes resources, whereas assigning too many causes overload. This is especially obvious when the applications are of the same type, because of their similar resource preferences. Considering that CPU-intensive applications are among the most common types of application in the cloud, we studied an optimization strategy for CPU-intensive applications running on the same server. We used resource preferences to analyze the case in which multiple CPU-intensive applications run simultaneously, and put forward a model that predicts the execution time of CPU-intensive applications running simultaneously. Based on the prediction model, we propose a method to select the appropriate number of applications for a machine. Experiments show that the model predicts the execution time accurately for CPU-intensive applications. To improve the execution efficiency of applications, we further propose a priority-based scheduling model for CPU-intensive applications. Extensive experiments verify the validity of the scheduling model.
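
The sketch below illustrates the idea of predicting execution time as a function of the number of co-located CPU-intensive applications and then selecting the largest count that still meets a slowdown target; the measurements and the quadratic fit are hypothetical stand-ins for the paper's resource-preference-based prediction model.

```python
# Sketch: fit a simple model of execution time vs. the number of co-located
# CPU-intensive applications, then pick the largest count that still meets a
# slowdown target. Measurements and the quadratic form are hypothetical.
import numpy as np

# hypothetical measurements: (apps running concurrently, observed execution time in s)
apps = np.array([1, 2, 3, 4, 5, 6, 7, 8])
exec_time = np.array([10.1, 10.3, 10.4, 11.8, 13.9, 16.2, 18.8, 21.5])

coeffs = np.polyfit(apps, exec_time, 2)          # quadratic captures contention growth
predict = np.poly1d(coeffs)

baseline = exec_time[0]
max_slowdown = 1.5                               # allow at most 50% slowdown per app
candidates = [k for k in range(1, 9) if predict(k) <= max_slowdown * baseline]
print("predicted times:", [round(float(predict(k)), 1) for k in range(1, 9)])
print("max apps per server under the slowdown target:", max(candidates))
```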

Keywords: cloud computing, CPU intensive applications, resource optimization, strategy

Procedia PDF Downloads 268
14089 Sensitivity Analysis of Prestressed Post-Tensioned I-Girder and Deck System

Authors: Tahsin A. H. Nishat, Raquib Ahsan

Abstract:

Sensitivity analysis of the design parameters used in an optimization procedure can be a significant factor when designing any structural system. The objectives of this study are to analyze the sensitivity of the deck slab thickness parameter obtained from both the conventional and the optimum design methodology of a pre-stressed post-tensioned I-girder and deck system, and to compare the relative significance of the slab thickness. For the analysis of the conventional method, the values of 14 design parameters obtained by the conventional iterative design method for a real-life I-girder bridge project have been considered. For the analysis of the optimization method, cost optimization of this system has been carried out using the global optimization methodology 'Evolutionary Operation (EVOP)'. The problem, from which the optimum values of the 14 design parameters have been obtained, contains 14 explicit constraints and 46 implicit constraints. For both sets of design parameters, a sensitivity analysis has been conducted on the deck slab thickness parameter, which can become highly sensitive around the obtained optimum solution. Deviations of the slab thickness on both the upper and lower side of its optimum value have been considered, reflecting its realistic range of variation during construction, while the remaining parameters have been kept unchanged. For small deviations from the optimum value, compliance with the explicit and implicit constraints has been examined, and the variations in cost have been estimated. It is found that, without violating any constraint, the deck slab thickness obtained by the conventional method can be increased by up to 25 mm, whereas the slab thickness obtained by cost optimization can be increased by only 0.3 mm. This result suggests that the slab thickness is less sensitive in the conventional design method. Therefore, for realistic design purposes, a sensitivity analysis should be conducted for either design procedure of the girder and deck system.

Keywords: sensitivity analysis, optimum design, evolutionary operations, PC I-girder, deck system

Procedia PDF Downloads 123
14088 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 53
14087 The Bernstein Expansion for Exponentials in Taylor Functions: Approximation of Fixed Points

Authors: Tareq Hamadneh, Jochen Merker, Hassan Al-Zoubi

Abstract:

Bernstein's expansion of the Taylor polynomials of exponential functions provides lower and upper bounds for the range of the original function. These bounds converge to the original function when the degree is elevated or the domain is subdivided. A Taylor polynomial can be applied so that the exponential is represented by a polynomial of finite degree over a given domain. The Bernstein basis has two main properties: its elements sum to 1, and they are positive for all x in (0, 1). In this work, we prove the existence of fixed points of exponential functions in a given domain using the Bernstein bounds. The Bernstein basis of finite degree T over a domain D is non-negative. Any polynomial p of degree t can be expanded into the Bernstein form of maximum degree t ≤ T, where we only need to compute the Bernstein coefficients in order to bound the original polynomial. The main property is that p(x) is enclosed by the minimum and maximum Bernstein coefficients (the Bernstein bound). If this bound is contained in the given domain, then we say that p(x) has fixed points in the same domain.
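
The sketch below computes the Bernstein coefficients of a polynomial on [0, 1] and uses their minimum and maximum as the range enclosure described above; the example polynomial (a truncated Taylor series of the exponential) is illustrative.

```python
# Sketch of the Bernstein range enclosure: convert a polynomial from the power
# basis to the Bernstein basis on [0, 1]; the minimum and maximum Bernstein
# coefficients then bound the range of the polynomial on [0, 1]. The example
# polynomial (a degree-4 Taylor polynomial of exp) is illustrative.
from math import comb, factorial

def bernstein_coefficients(a):
    """Power-basis coefficients a[k], degree n -> Bernstein coefficients on [0, 1]."""
    n = len(a) - 1
    return [sum(comb(i, k) / comb(n, k) * a[k] for k in range(i + 1))
            for i in range(n + 1)]

# degree-4 Taylor polynomial of exp(x) around 0: 1 + x + x^2/2 + x^3/6 + x^4/24
a = [1 / factorial(k) for k in range(5)]
b = bernstein_coefficients(a)

lower, upper = min(b), max(b)
print("Bernstein coefficients:", [round(c, 4) for c in b])
print(f"range enclosure on [0, 1]: [{lower:.4f}, {upper:.4f}]")
# The enclosure contains the true range of the Taylor polynomial on [0, 1] and
# tightens under degree elevation or domain subdivision, as stated above.
```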

Keywords: Bernstein polynomials, stability of control functions, numerical optimization, Taylor function

Procedia PDF Downloads 123
14086 A Case Study of Conceptual Framework for Process Performance

Authors: Ljubica Milanović Glavan, Vesna Bosilj Vukšić, Dalia Suša

Abstract:

In order to gain a competitive advantage, many companies are focusing on the reorganization of their business processes and implementing process-based management. In this context, assessing process performance is essential because it enables individuals and groups to assess where they stand in comparison to their competitors. In this paper, it is argued that process performance measurement is a necessity for a modern process-oriented company and that it should be supported by a holistic process performance measurement system. It seems very unlikely that a universal set of performance indicators can be applied successfully to all business processes; thus, performance indicators must be process-specific and have to be derived from both the strategic enterprise-wide goals and the process goals. Based on an extensive literature review and interviews conducted in a Croatian company, a conceptual framework for a process performance measurement system was developed. The main objective of such a system is to help process managers by providing comprehensive and timely information on the performance of business processes. This information can be used to communicate goals and the current performance of a business process directly to the process team, to improve resource allocation and process output in terms of quantity and quality, to give early warning signals, to diagnose the weaknesses of a business process, to decide whether corrective actions are needed, and to assess the impact of actions taken.

Keywords: Croatia, key performance indicators, performance measurement, process performance

Procedia PDF Downloads 658
14085 Statistical Optimization of Vanillin Production by Pycnoporus Cinnabarinus 1181

Authors: Swarali Hingse, Shraddha Digole, Uday Annapure

Abstract:

The present study investigates the biotransformation of ferulic acid to vanillin by Pycnoporus cinnabarinus and its optimization using the one-factor-at-a-time method as well as a statistical approach. The effect of various physicochemical parameters and medium components was studied using the one-factor-at-a-time method. Screening of the significant factors was carried out using an L25 Taguchi orthogonal array, and the selected significant factors were then optimized using response surface methodology (RSM). The significant media components obtained using the Taguchi L25 orthogonal array were glucose, KH2PO4 and yeast extract. A Box-Behnken design was then used to investigate the interactive effects of the three most significant media components. The final medium obtained after optimization using RSM, containing glucose (34.89 g/L), diammonium tartrate (1 g/L), yeast extract (1.47 g/L), MgSO4•7H2O (0.5 g/L), KH2PO4 (0.15 g/L), and CaCl2•2H2O (20 mg/L), resulted in an increase in vanillin production from 30.88 mg/L to 187.63 mg/L.

Keywords: ferulic acid, pycnoporus cinnabarinus, response surface methodology, vanillin

Procedia PDF Downloads 371
14084 Effect of the Initial Billet Shape Parameters on the Final Product in a Backward Extrusion Process for Pressure Vessels

Authors: Archana Thangavelu, Han-Ik Park, Young-Chul Park, Joon-Hong Park

Abstract:

In this numerical study, we propose a method for evaluating the backward extrusion process of a steel pressure vessel. Demand for lighter and stiffer products has been increasing in recent years, especially in automobile engineering. Through detailed finite element analysis, effective stress, strain and velocity profiles have been obtained within the optimal range. The process design of a forward and backward extrusion of an axisymmetric part has been studied. Forging is used mainly because forged products are highly reliable and possess superior mechanical properties compared to conventionally produced parts. Computational simulations of 3D hot forging with various billet dimensions and weight optimization are carried out using the Taguchi orthogonal array (OA) optimization technique. The technique used in this study can be applied to newly developed materials to investigate their forgeability for more complicated shapes in the closed hot die forging process.

Keywords: backward extrusion, hot forging, optimization, finite element analysis, Taguchi method

Procedia PDF Downloads 299
14083 Research on the Development and Space Optimization of Rental-Type Public Housing in Hangzhou

Authors: Xuran Zhang, Huiru Chen

Abstract:

In recent years, China has made great efforts to cultivate and develop the housing rental market, especially the rental-type public housing, which has been paid attention to by all sectors of the society. This paper takes Hangzhou rental-type public housing as the research object, and divides it into three development stages according to the different supply modes of rental-type public housing. Through data collection and field research, the paper summarizes the spatial characteristics of rental-type public housing from the five perspectives of spatial planning, spatial layout, spatial integration, spatial organization and spatial configuration. On this basis, the paper proposes the optimization of the spatial layout. The study concludes that the spatial layout of rental-type public housing should be coordinated with the development of urban planning. When planning and constructing, it is necessary to select more mixed construction modes, to be properly centralized, and to improve the surrounding transportation service facilities.  It is hoped that the recommendations in this paper will provide a reference for the further development of rental-type public housing in Hangzhou.

Keywords: Hangzhou, rental-type public housing, spatial distribution, spatial optimization

Procedia PDF Downloads 309
14082 Exploring Non-Governmental Organizations’ Performance Management: Bahrain Athletics Association as a Case Study

Authors: Nooralhuda Aljlas

Abstract:

In the ever-growing field of non-governmental organizations, the enhancement of performance management and measurement systems has been increasingly acknowledged by political, economic, social, legal, technological and environmental factors. Within Bahrain Athletics Association, such enhancement results from the key factors leading performance management including collaboration, feedback, human resource management, leadership and participative management. The exploratory, qualitative research conducted reviewed performance management theory. As reviewed, the key factors leading performance management were identified. Drawing on a non-governmental organization case study, the key factors leading Bahrain Athletics Association’s performance management were explored. By exploring the key factors leading Bahrain Athletics Association’s performance management, the research study proposed a theoretical framework of the key factors leading performance management in non-governmental organizations in general. The research study recommended further investigation of the role of the two key factors of command and control and leadership, combining military and civilian approaches to enhancing non-governmental organizations’ performance management.

Keywords: Bahrain athletics association, exploratory, key factor, performance management

Procedia PDF Downloads 351
14081 Self-Healing Performance of Heavyweight Concrete with Steam Curing

Authors: Hideki Igawa, Yoshinori Kitsutaka, Takashi Yokomuro, Hideo Eguchi

Abstract:

In this study, the crack self-healing performance of the heavyweight concrete used in the walls of containers and structures designed to shield radioactive materials was investigated. A steam curing temperature that preserves the self-healing properties and the demolding strength was identified. The presented method of simultaneously mixing the expanding material and the fly ash during admixture maximizes the self-healing performance. Adding synthetic fibers to the heavyweight concrete also improved the self-healing performance.

Keywords: expanding material, heavyweight concrete, self-healing performance, synthetic fiber

Procedia PDF Downloads 325
14080 When Does Technology Alignment Influence Supply Chain Performance

Authors: Joseph Akyeh, Abdul Samed Muntaka, Emmanuel Anin, Dorcas Nuertey

Abstract:

Purpose: This study develops and tests the argument that the relationship between technology alignment and supply chain performance is conditional upon the level of technology championing. Methodology: The proposed relationships are tested on a sample of 217 hospitals in a major sub-Saharan African economy. Findings: Findings from the study indicate that technology alignment has a positive and significant effect on supply chain performance. The study further finds that technology championing strengthens the direct effect of technology alignment on supply chain performance. Theoretical Contributions: A theoretical contribution of this study is the finding that when technology alignment drives supply chain performance is more complex than previously thought: it depends on whether technology alignment is first championed by top management. Originality: Though some studies have been conducted on technology alignment and health supply chain performance, to the best of the researchers’ knowledge, no previous study has examined the moderating role of technology championing in the link between technology alignment and supply chain performance.

Keywords: technology alignment, supply chain performance, technology championing, structural equation modelling

Procedia PDF Downloads 25
14079 Computational Fluid Dynamics Analysis of Cyclone Separator Performance Using Discrete Phase Model

Authors: Sandeep Mohan Ahuja, Gulshan Kumar Jawa

Abstract:

Cyclone separators are crucial components in various industries tasked with efficiently separating particulate matter from gas streams. Achieving optimal performance hinges on a deep understanding of flow dynamics and particle behaviour within these separators. In this investigation, Computational Fluid Dynamics (CFD) simulations are conducted utilizing the Discrete Phase Model (DPM) to dissect the intricate flow patterns, particle trajectories, and separation efficiency within cyclone separators. The study delves into the influence of pivotal parameters like inlet velocity, particle size distribution, and cyclone geometry on separation efficiency. Through numerical simulations, a comprehensive comprehension of fluid-particle interaction phenomena within cyclone separators is attained, allowing for the assessment of solid collection efficiency across diverse operational conditions and geometrical setups. The insights gleaned from this study promise to advance our understanding of the complex interplay between fluid and particle within cyclone separators, thereby enabling optimization across a wide array of industrial applications. By harnessing the power of CFD simulations and the DPM, this research endeavours to furnish valuable insights for designing, operating, and evaluating the performance of cyclone separators, ultimately fostering greater efficiency and environmental sustainability within industrial processes.
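
As an illustration of the Lagrangian tracking that underlies the discrete phase model, the sketch below integrates a single particle's motion under Stokes drag in an idealized swirling flow; the flow field, particle properties and time step are assumptions, not a CFD solution of a cyclone.

```python
# Minimal sketch of the Lagrangian tracking idea behind the Discrete Phase
# Model: integrate one particle's motion under Stokes drag in a prescribed
# swirling flow field. The analytic flow field, particle properties and time
# step are illustrative; a real DPM run uses the CFD-resolved gas velocity.
import numpy as np

rho_p, d_p = 2500.0, 10e-6          # particle density (kg/m^3) and diameter (m)
mu = 1.8e-5                         # gas dynamic viscosity (Pa s)
tau_p = rho_p * d_p**2 / (18 * mu)  # particle relaxation time (Stokes regime)

def gas_velocity(x):
    """Idealized swirling flow: solid-body rotation plus a downward axial drift."""
    omega, w_axial = 50.0, -1.0
    return np.array([-omega * x[1], omega * x[0], w_axial])

x = np.array([0.05, 0.0, 0.0])      # initial particle position (m)
v = np.zeros(3)                     # initial particle velocity
dt, steps = 1e-4, 5000

for _ in range(steps):
    drag_accel = (gas_velocity(x) - v) / tau_p   # Stokes drag per unit mass
    v = v + drag_accel * dt
    x = x + v * dt

r = np.hypot(x[0], x[1])
print(f"relaxation time: {tau_p:.2e} s, final radius: {r:.3f} m, z: {x[2]:.3f} m")
```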

Keywords: cyclone separator, computational fluid dynamics, enhancing efficiency, discrete phase model

Procedia PDF Downloads 26
14078 Optimization of Batch to Up-Scaling of Soy-Based Prepolymer Polyurethane

Authors: Flora Elvistia Firdaus

Abstract:

The chemical structure of soybean oil has to be chemically modified through its triglycerides to attain properties resembling those of petrochemical feedstocks. A sulfuric acid catalyst with a peracetic acid co-reagent performs well in modifying soybean oil structures, converting the unsaturated fatty acid moieties to the desired hydroxyl functional groups. A series of screening reactions indicated that a ratio of acetic to peroxide acid of 1:7.25 (mol/mol) at a temperature of 600°C for soy-epoxide synthesis is suitable for up-scaling the bodied soybean oil to 10 and 20 times the initial batch size. A two-step process was conducted for the preparation of the soy-polyol at the designated temperatures.

Keywords: soybean, polyol, up-scaling, polyurethane

Procedia PDF Downloads 344
14077 Psychological Capital and Work Engagement as Predictors of Employee Performance in a Technology Industry During COVID-19 Pandemic: Basis for Performance Management

Authors: Marion Francisco

Abstract:

The study sought to investigate psychological capital and work engagement of employees as predictors of employee performance in the technology industry in Makati City. It made use of a descriptive correlational research method and utilized standardized tests, such as the Psychological Capital Scale, the Utrecht Work Engagement Scale, and the Employee Performance Scale. A convenience sampling technique was used to gather data from 100 respondents, with the sample size determined with the help of the Roscoe concept approach. The study revealed that both psychological capital and work engagement have a significant relationship with employee performance and can predict the employee performance of the respondents. Given these results, the study suggests: (1) to focus on maintaining a high level of psychological capital and work engagement, on achieving a very high level of psychological capital and work engagement, and on improving any low level of psychological capital or work engagement, especially during the COVID-19 pandemic, using the proposed employee performance management plan; and (2) to create a proposed employee performance management plan, tailored as necessary to employees’ needs, to enhance their performance and help meet company and client needs.

Keywords: employee performance, performance management, psychological capital, technology industry, work engagement

Procedia PDF Downloads 96
14076 Optimization of the Enzymatic Synthesis of the Silver Core-Shell Nanoparticles

Authors: Lela Pintarić, Iva Rezić, Ana Vrsalović Presečki

Abstract:

Considering the enormous increase in the use of metal nanoparticles with exactly defined characteristics, the main goal of this research was to find an optimal and environmentally friendly method for their synthesis. The synthesis of inorganic core-shell nanoparticles was optimized as a model. Core-shell nanoparticles are composed of an enzyme core belted with metal ions, oxides or salts as a shell. In this research, the enzyme urease was the core catalyst and the shell of the nanoparticle was made of silver. Silver nanoparticles are widely utilized; some of their common uses are: as an addition to disinfectants to ensure an aseptic environment for patients, as a surface coating for neurosurgical shunts and venous catheters, as an addition to implants, and in the production of socks for diabetics and athletic clothing, where they improve antibacterial characteristics. The characteristics of the synthesized nanoparticles depend directly on their size, so special care during this optimization was given to determining the size of the synthesized nanoparticles. For the purpose of the above-mentioned optimization, sixteen experiments were generated by the Design of Experiments (DoE) method and conducted under various temperatures, with different initial concentrations of silver nitrate and a constant concentration of urease from two separate manufacturers. The synthesized nanoparticles were analyzed by the Nanoparticle Tracking Analysis (NTA) method on a Malvern NanoSight NS300. The results showed that the initial concentration of silver ions affects neither the concentration of the synthesized silver nanoparticles nor their size distribution. On the other hand, the temperature of the experiments affected both of these values.

Keywords: core-shell nanoparticles, optimization, silver, urease

Procedia PDF Downloads 300
14075 Patient Scheduling Improvement in a Cancer Treatment Clinic Using Optimization Techniques

Authors: Maryam Haghi, Ivan Contreras, Nadia Bhuiyan

Abstract:

Chemotherapy is one of the most popular and effective cancer treatments offered to patients in outpatient oncology centers. In such clinics, patients first consult with an oncologist, and the oncologist may prescribe a chemotherapy treatment plan for the patient based on the blood test results and an examination of the patient's health status. Once the plan is determined, a set of chemotherapy and consultation appointments must be scheduled for the patient. In this work, a comprehensive mathematical formulation for planning and scheduling different types of chemotherapy patients over a planning horizon, considering blood test, consultation, pharmacy and treatment stages, is proposed. To be more realistic and to provide an applicable model, this study focuses on a case study of a major outpatient cancer treatment clinic in Montreal, Canada. Comparing the results of the proposed model with the current practice of the clinic under study shows significant improvements in different performance measures. These major improvements in the patients’ schedules reveal that using optimization techniques for planning and scheduling patients in such highly demanded cancer treatment clinics is an essential step toward good coordination between the different stages involved, which ultimately increases the efficiency of the entire system and promotes staff and patient satisfaction.
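
The sketch below shows, in miniature, the integer-programming flavor of such a scheduling formulation: assigning treatment appointments to chair time slots under capacity, modeled with PuLP. The patients, durations, slot count and chair capacity are hypothetical, and the paper's full model also covers blood test, consultation and pharmacy stages over a multi-day horizon.

```python
# Small integer-programming sketch of one piece of the scheduling problem:
# assign chemotherapy appointments to time slots subject to chair capacity,
# minimizing the sum of start slots (an earliness surrogate). All data are
# hypothetical illustrations.
import pulp

patients = {"p1": 2, "p2": 1, "p3": 3, "p4": 2}   # treatment length in slots
slots = list(range(8))                            # slots in the clinic day
chairs = 2                                        # chairs available per slot

prob = pulp.LpProblem("chemo_slot_assignment", pulp.LpMinimize)
start = pulp.LpVariable.dicts("start", (list(patients), slots), cat=pulp.LpBinary)

# objective: start everyone as early as possible
prob += pulp.lpSum(t * start[p][t] for p in patients for t in slots)

# each patient starts exactly once, early enough to finish within the day
for p, dur in patients.items():
    prob += pulp.lpSum(start[p][t] for t in slots if t + dur <= len(slots)) == 1
    for t in slots:
        if t + dur > len(slots):
            prob += start[p][t] == 0              # cannot start too late to finish

# chair capacity: a patient occupies a chair from its start slot for `dur` slots
for t in slots:
    prob += pulp.lpSum(start[p][s] for p, dur in patients.items()
                       for s in slots if s <= t < s + dur) <= chairs

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for p, dur in patients.items():
    t0 = next(t for t in slots if pulp.value(start[p][t]) > 0.5)
    print(f"{p}: slots {t0}..{t0 + dur - 1}")
```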

Keywords: chemotherapy patients scheduling, integer programming, integrated scheduling, staff balancing

Procedia PDF Downloads 168