Search results for: block layout problem
6674 Improvement of Central Composite Design in Modeling and Optimization of Simulation Experiments
Authors: A. Nuchitprasittichai, N. Lerdritsirikoon, T. Khamsing
Abstract:
Simulation modeling can be used to solve real-world problems and provides an understanding of a complex system. To develop a simplified model of a process simulation, a suitable experimental design is required to capture the surface characteristics. This paper presents the experimental design and algorithm used to model a process simulation for an optimization problem. CO2 liquefaction based on external refrigeration with two refrigeration circuits was used as a simulation case study. Latin Hypercube Sampling (LHS) was proposed to be combined with existing Central Composite Design (CCD) samples to improve the performance of CCD in generating the second-order model of the system. The second-order model was then used as the objective function of the optimization problem. The results showed that adding LHS samples to CCD samples can help capture surface curvature characteristics. A suitable number of LHS sample points should be chosen in order to obtain an accurate nonlinear model with a minimum number of simulation experiments.
Keywords: central composite design, CO2 liquefaction, latin hypercube sampling, simulation-based optimization
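To make the sampling idea concrete, here is a minimal sketch (not the authors' code) of augmenting a two-factor CCD with LHS points and fitting a second-order response surface by least squares. The quadratic test function stands in for the CO2 liquefaction simulator; all names and the number of LHS runs are illustrative.

```python
import numpy as np

def ccd_points(alpha=np.sqrt(2)):
    """Two-factor central composite design: factorial, axial, and center points."""
    factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
    center = [(0.0, 0.0)]
    return np.array(factorial + axial + center)

def lhs_points(n, rng):
    """Latin hypercube sample of n points in [-1, 1]^2: one point per stratum per axis."""
    u = (rng.permutation(n) + rng.random(n)) / n   # stratified draws in [0, 1)
    v = (rng.permutation(n) + rng.random(n)) / n
    return np.column_stack([u, v]) * 2 - 1

def fit_quadratic(X, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

rng = np.random.default_rng(0)
X = np.vstack([ccd_points(), lhs_points(6, rng)])   # CCD augmented with 6 LHS runs
y = 1 + 2 * X[:, 0] - 3 * X[:, 1] + 0.5 * X[:, 0]**2  # stand-in for simulator output
coef = fit_quadratic(X, y)
```

Because the stand-in response is exactly quadratic, the fit recovers its coefficients; in practice the fitted surface would then serve as the objective of the downstream optimization.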
Procedia PDF Downloads 166
6673 Using the Simple Fixed Rate Approach to Solve Economic Lot Scheduling Problem under the Basic Period Approach
Authors: Yu-Jen Chang, Yun Chen, Hei-Lam Wong
Abstract:
The Economic Lot Scheduling Problem (ELSP) is a valuable mathematical model that can support decision-makers in making scheduling decisions. The basic period approach is effective for solving the ELSP. The assumption behind the basic period approach is that a product must be produced at its maximum production rate. However, a product can lower its production rate to reduce the average total cost when a facility has extra idle time. Past research has discussed how a product adjusts its production rate under the common cycle approach. To the best of our knowledge, no studies have addressed how a product lowers its production rate under the basic period approach; this is the first paper to discuss the topic. The research develops a simple fixed rate approach that adjusts the production rate of a product under the basic period approach to solve the ELSP. Our numerical example shows that our approach can find a better solution than the traditional basic period approach. Our mathematical model applying the fixed rate approach under the basic period approach can serve as a reference for related research.
Keywords: economic lot, basic period, genetic algorithm, fixed rate
Procedia PDF Downloads 563
6672 Reduction of Multiple User Interference for Optical CDMA Systems Using Successive Interference Cancellation Scheme
Authors: Tawfig Eltaif, Hesham A. Bakarman, N. Alsowaidi, M. R. Mokhtar, Malek Harbawi
Abstract:
A primary problem in optical code-division multiple access (OCDMA) systems is multiple user interference (MUI) noise resulting from the overlap among users. In this article, we aim to mitigate this problem by studying an interference cancellation scheme called successive interference cancellation (SIC). The scheme is tested on two different detection schemes, spectral amplitude coding (SAC) and direct detection (DS), using partial modified prime (PMP) codes as the signatures. It was found that the SIC scheme based on both SAC and DS methods has the potential to suppress the intensity noise, that is to say, it can mitigate MUI noise. Furthermore, the SIC/DS scheme showed a much lower bit error rate (BER) than the SIC/SAC scheme for different magnitudes of effective power. Hence, many more users can be supported by the SIC/DS receiver system.
Keywords: optical code-division multiple access (OCDMA), successive interference cancellation (SIC), multiple user interference (MUI), spectral amplitude coding (SAC), partial modified prime code (PMP)
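The core SIC idea, stripped of the optical detection details, can be sketched as follows: correlate the received signal with each remaining signature, reconstruct and subtract the strongest user first, and repeat on the residual. The three signature sequences below are toy codes for illustration only, not actual PMP codes, and the detection here is a simple amplitude estimate rather than a SAC or DS receiver.

```python
import numpy as np

# Toy signature sequences (rows) for three users; real PMP codes are assumed elsewhere.
CODES = np.array([
    [1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 1, 0],
    [1, 1, 0, 0, 0, 1],
], dtype=float)

def sic_detect(received, codes, n_users):
    """Successively detect, reconstruct, and cancel the strongest user first."""
    residual = received.astype(float).copy()
    remaining = list(range(len(codes)))
    decisions = {}
    for _ in range(n_users):
        # correlate the residual with each remaining signature
        corr = [residual @ codes[u] / (codes[u] @ codes[u]) for u in remaining]
        k = int(np.argmax(np.abs(corr)))
        user, amp = remaining.pop(k), corr[k]
        decisions[user] = amp
        residual -= amp * codes[user]          # cancel the reconstructed signal
    return decisions, residual

amps = {0: 3.0, 1: 1.0, 2: 2.0}               # transmitted amplitudes (hypothetical)
rx = sum(a * CODES[u] for u, a in amps.items())
est, resid = sic_detect(rx, CODES, 3)
```

With non-orthogonal codes the amplitude estimates are biased by residual MUI, but the detection ordering still follows the true power ordering and the interference left after cancellation is much smaller than the raw received signal.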
Procedia PDF Downloads 521
6671 Development of Electronic Waste Management Framework at College of Engineering, Design, Art and Technology
Authors: Wafula Simon Peter, Kimuli Nabayego Ibtihal, Nabaggala Kimuli Nashua
Abstract:
The worldwide use of information and communications technology (ICT) equipment and other electronic equipment is growing, and consequently there is a growing amount of equipment that becomes waste after its time in use. This growth is expected to accelerate since equipment lifetimes decrease over time while consumption grows. As a result, e-waste is one of the fastest-growing waste streams globally. The United Nations University (UNU) calculates in its second Global E-waste Monitor that 44.7 million metric tonnes (Mt) of e-waste were generated globally in 2016. This research was carried out to investigate the problem of e-waste and develop a framework to improve e-waste management. The study population was 80 respondents, from which a sample of 69 respondents was selected using simple and purposive sampling techniques. The objective of the study was to develop a framework for improving e-waste management at the College of Engineering, Design, Art and Technology (CEDAT). This was broken down into specific objectives: establishing the policy and other regulatory frameworks used in e-waste management at CEDAT, determining the effectiveness of the e-waste management practices at CEDAT, establishing the critical challenges constraining e-waste management at the College, and developing a framework for e-waste management. The study reviewed the e-waste regulatory framework used at the college and then collected data, which was used to develop the framework. The study also established that a weak policy and regulatory framework, lack of proper infrastructure, improper disposal of e-waste, and a general lack of awareness of e-waste and the magnitude of the problem are the critical challenges of e-waste management. In conclusion, the policy and regulatory framework should be revised, localized and strengthened to contextually address the problem.
Awareness campaigns, the development of proper infrastructure and extensive research to establish the volumes and magnitude of the problem will come in handy. The study recommends a framework for the improvement of e-waste management.
Keywords: e-waste, treatment, disposal, computers, model, management policy and guidelines
Procedia PDF Downloads 79
6670 Multi-source Question Answering Framework Using Transformers for Attribute Extraction
Authors: Prashanth Pillai, Purnaprajna Mangsuli
Abstract:
Oil exploration and production companies invest considerable time and effort to extract essential well attributes (like well status, surface and target coordinates, wellbore depths, event timelines, etc.) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific by nature. It is also important to consider the context when extracting attribute values from reports that contain information on multiple wells/wellbores. Moreover, semantically similar information may often be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture was used to rank relevant pages in a document source utilizing the page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer was used to extract attribute-value pairs, incorporating the text semantics and layout information from the top relevant pages in a document. To better handle context while dealing with multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The extracted attribute information from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and the related performance were studied on several real-life well technical reports.
Keywords: natural language processing, deep learning, transformers, information retrieval
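The page-ranking step can be illustrated with a small sketch: score each page by a weighted combination of cosine similarities between a query embedding and the page's text and image embeddings, then sort. This is not the authors' implementation; it assumes (hypothetically) that the query and page embeddings already live in a shared vector space, and the weights are illustrative.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_pages(query_vec, page_text_vecs, page_image_vecs, w_text=0.7, w_image=0.3):
    """Score each page by a weighted mix of text and image similarity; best first."""
    scores = [
        w_text * cosine(query_vec, t) + w_image * cosine(query_vec, im)
        for t, im in zip(page_text_vecs, page_image_vecs)
    ]
    return sorted(range(len(scores)), key=lambda i: -scores[i])
```

In the full pipeline the top-ranked pages would then be passed to the LayoutLM question-answering stage for attribute-value extraction.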
Procedia PDF Downloads 193
6669 Modern Trends in Foreign Direct Investments in Georgia
Authors: Rusudan Kinkladze, Guguli Kurashvili, Ketevan Chitaladze
Abstract:
Foreign direct investment is a driving force in the development of interdependent national economies, and the study and analysis of investments is an urgent problem. It is particularly important for transitional economies such as Georgia. Consequently, the goal of the research is the study and analysis of foreign direct investments in Georgia and the identification and forecasting of modern trends; it covers the period 2006-2015. The study uses the methods of statistical observation, grouping and analysis, and the methods of analytical indicators of time series; trends are identified and predicted values are calculated. Various literary and Internet sources relevant to the research are also used. The findings showed that modern investment policy in Georgia is favorable for domestic as well as foreign investors. Georgia is still a net importer of investments. In 2015, the top 10 investing countries were led by Azerbaijan, the United Kingdom and the Netherlands, and the largest share of FDI was allocated to the transport and communication sector; the financial sector was second, followed by the health and social work sector. The same trend will continue in the future.
Keywords: foreign direct investments, methods, statistics, analysis
Procedia PDF Downloads 331
6668 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information
Authors: A. Preetha Priyadharshini, S. B. M. Priya
Abstract:
In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is to keep each user's rate outage probability below a given threshold level. Such rate outage constraints present significant analytical challenges. Many probabilistic methods are used to solve the transmit optimization problem under imperfect CSI. Here, a decomposition-based large deviation inequality and a Bernstein-type inequality are used as convex restriction methods to solve the optimization problem under imperfect CSI. These methods achieve improved output quality and lower complexity: they provide a safe tractable approximation of the original rate outage constraints. Based on these implementations, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information
Procedia PDF Downloads 813
6667 Modal Density Influence on Modal Complexity Quantification in Dynamic Systems
Authors: Fabrizio Iezzi, Claudio Valente
Abstract:
The viscous damping in dynamic systems can be proportional or non-proportional. In the first case the mode shapes are real, whereas in the second case they are complex. From an engineering point of view, the complexity of the mode shapes is important in order to quantify the non-proportional damping. Different indices exist to provide estimates of the modal complexity; these indices are zero or nonzero depending on whether the mode shapes are real or complex. The modal density problem arises in experimental identification when the dynamic system has close modal frequencies. Depending on the degree of this closeness, the mode shapes can hold fictitious imaginary quantities that affect the values of the modal complexity indices. The result is a failure to identify the real or complex mode shapes and hence the proportional or non-proportional damping. The paper aims to show the influence of the modal density on the values of these indices in cases of both proportional and non-proportional damping. Theoretical and pseudo-experimental solutions are compared to analyze the problem using an appropriate mechanical system.
Keywords: complex mode shapes, dynamic systems identification, modal density, non-proportional damping
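One widely used complexity index of this kind is the modal phase collinearity (MPC), which measures how well the real and imaginary parts of a mode shape line up. The sketch below implements one common variant (the specific indices studied in the abstract may be defined differently): it forms the 2x2 scatter matrix of the real and imaginary parts and compares its eigenvalues.

```python
import numpy as np

def mpc(phi):
    """Modal Phase Collinearity: ~1 for a real (proportionally damped) mode shape,
    ~0 for a maximally complex one. One common variant; definitions differ slightly."""
    re, im = np.real(phi), np.imag(phi)
    s = np.array([[re @ re, re @ im],
                  [re @ im, im @ im]])       # scatter of real/imag components
    lam = np.linalg.eigvalsh(s)              # ascending eigenvalues
    return ((lam[1] - lam[0]) / (lam[1] + lam[0])) ** 2
```

A mode whose components share a single phase (e.g. all real, or all rotated by the same angle in the complex plane) gives MPC = 1; a mode whose real and imaginary parts are independent gives MPC near 0.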
Procedia PDF Downloads 387
6666 An Improved Particle Swarm Optimization Technique for Combined Economic and Environmental Power Dispatch Including Valve Point Loading Effects
Authors: Badr M. Alshammari, T. Guesmi
Abstract:
In recent years, combined economic and emission power dispatch has been one of the main problems in electrical power systems. It aims to schedule the power generation of generators in order to minimize production cost and the emission of harmful gases caused by fossil-fueled thermal units, such as CO, CO2, NOx, and SO2. To solve this complicated multi-objective problem, an improved version of the particle swarm optimization technique that includes the non-dominated sorting concept has been proposed. Valve point loading effects and system losses have been considered. The three-unit and ten-unit benchmark systems have been used to show the effectiveness of the suggested optimization technique for solving this kind of nonconvex problem. The simulation results have been compared with those obtained using a genetic algorithm based method. The comparison shows that the proposed approach can provide a higher quality solution with better performance.
Keywords: power dispatch, valve point loading effects, multiobjective optimization, Pareto solutions
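For intuition, here is a plain single-objective PSO sketch minimizing fuel cost with the classic valve-point term |e*sin(f*(Pmin - P))| and a penalty enforcing the demand balance. It omits the paper's non-dominated sorting multi-objective extension and system losses; the three-unit coefficient data are hypothetical, not the benchmark systems used in the paper.

```python
import numpy as np

# Hypothetical 3-unit data: cost coefficients a, b, c, valve-point e, f, and P limits (MW).
A = np.array([500.0, 400.0, 200.0])
B = np.array([5.3, 5.5, 5.8])
C = np.array([0.004, 0.006, 0.009])
E = np.array([100.0, 80.0, 50.0])
F = np.array([0.063, 0.084, 0.084])
PMIN = np.array([100.0, 50.0, 50.0])
PMAX = np.array([500.0, 300.0, 200.0])
DEMAND = 800.0

def cost(P):
    """Fuel cost with valve-point loading plus a power-balance penalty."""
    fuel = np.sum(A + B * P + C * P**2 + np.abs(E * np.sin(F * (PMIN - P))))
    return fuel + 1e4 * abs(np.sum(P) - DEMAND)

def pso(n_particles=40, iters=300, seed=1):
    rng = np.random.default_rng(seed)
    X = rng.uniform(PMIN, PMAX, (n_particles, 3))      # positions = unit outputs
    V = np.zeros_like(X)
    pbest, pbest_f = X.copy(), np.array([cost(x) for x in X])
    g = pbest[np.argmin(pbest_f)].copy()               # global best
    for _ in range(iters):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (g - X)
        X = np.clip(X + V, PMIN, PMAX)                 # respect generation limits
        f = np.array([cost(x) for x in X])
        better = f < pbest_f
        pbest[better], pbest_f[better] = X[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, cost(g)

best_P, best_cost = pso()
```

The valve-point sine term makes the cost surface nonconvex and multimodal, which is exactly why population-based methods like PSO are favored over gradient-based dispatch here.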
Procedia PDF Downloads 273
6665 Determination of Agricultural Characteristics of Smooth Bromegrass (Bromus inermis Leyss) Lines under Konya Regional Conditions
Authors: Abdullah Özköse, Ahmet Tamkoç
Abstract:
The present study was conducted to determine the yield and yield components of smooth bromegrass lines under the environmental conditions of the Konya region during the growing seasons between 2011 and 2013. The experiment was performed in a randomized complete block design (RCBD) with four replications. It was found that the selected lines had a statistically significant effect on all the investigated traits except the main stem length and the number of nodes in the main stem. According to the two-year averages of the parameters measured in the smooth bromegrass lines, the main stem length ranged from 71.6 cm to 79.1 cm, the main stem diameter from 2.12 mm to 2.70 mm, the number of nodes in the main stem from 3.2 to 3.7, the internode length from 11.6 cm to 18.9 cm, flag leaf length from 9.7 cm to 12.7 cm, flag leaf width from 3.58 mm to 6.04 mm, herbage yield from 221.3 kg da–1 to 354.7 kg da–1, and hay yield from 100.4 kg da–1 to 190.1 kg da–1. The study concluded that the smooth bromegrass lines differ in terms of yield and yield components. Therefore, it is crucial to select suitable varieties of smooth bromegrass to obtain optimum yield.
Keywords: semiarid region, smooth bromegrass, yield, yield components
Procedia PDF Downloads 275
6664 Urban Ecological Interaction: Air, Water, Light and New Transit at the Human Scale of Barcelona’s Superilles
Authors: Philip Speranza
Abstract:
As everyday transit options shift from autocentric to pedestrian- and bicycle-oriented modes for healthy living, downtown streets are becoming more attractive places to live. However, tools and methods to measure the natural environment at the small scale of streets do not exist. Fortunately, a combination of mobile data collection technology and parametric urban design software now allows an interface to relate urban ecological conditions. This paper describes the creation of an interactive tool to measure the urban phenomena of air, water, and heat/light at the scale of the new three-by-three-block pedestrianized areas in Barcelona called Superilles. Each Superilla limits transit to the exterior of the blocks to create more walkable and bikeable interior streets for healthy living. The research describes the integration of data collection, analysis, and design output via a live interface using the parametric software Rhino Grasshopper and the Human User Interface (UI) plugin.
Keywords: transit, urban design, GIS, parametric design, Superilles, Barcelona, urban ecology
Procedia PDF Downloads 247
6663 Effect of Time and Rate of Nitrogen Application on the Malting Quality of Barley Yield in Sandy Soil
Authors: A. S. Talaab, Safaa, A. Mahmoud, Hanan S. Siam
Abstract:
A field experiment was conducted during the winter season of 2013/2014 in the barley production area of Dakhala, New Valley Governorate, Egypt, to assess the effect of nitrogen rate and time of N fertilizer application on barley grain yield, yield components, and N use efficiency of barley, and their association with grain yield. The treatments consisted of three levels of nitrogen (0, 70 and 100 kg N/acre) and five application times. The experiment was laid out as a randomized complete block design with three replications. Results revealed that barley grain yield and yield components increased significantly in response to N rate. Splitting the N fertilizer amount over several applications had a significant effect on grain yield, yield components, protein content and N uptake efficiency compared with applying the entire N amount at once. Application of N at a rate of 100 kg N/acre resulted in accumulation of nitrate in the subsurface soil (> 30 cm). When N application timing was considered, less NO3 was found in the soil profile with split N application compared with all-preplant application.
Keywords: nitrogen use efficiency, splitting N fertilizer, barley, NO3
Procedia PDF Downloads 313
6662 Cost Overrun in Construction Projects
Authors: Hailu Kebede Bekele
Abstract:
Construction delays occur when project events do not take place at the expected time, due to causes related to the client, consultant, and contractor. Delay is the major cause of cost overrun, which leads to poor project efficiency. The difference between the final cost at completion and the originally estimated cost is known as cost overrun. Cost overruns are not simple issues that can be neglected; more attention should be given to preventing the organization from failing and financial expenses from expanding. The reasons raised in different studies show that the problem may arise in construction projects due to errors in budgeting, unfavorable weather conditions, inefficient machinery, and extravagance. The study is focused on mega projects, whose pace can significantly change the cost overrun calculation. Fifteen mega projects were identified to study the problem of cost overrun on site. The contractor, consultant, and client are the principal stakeholders in the mega projects. Twenty people from each sector were selected to participate in the investigation of the current mega construction projects. The main objective of the study on construction cost overrun is to prioritize the major causes of the cost overrun problem. The methodology employed is qualitative, rating the causes of construction project cost overrun; interviews, open-ended and closed-ended questions, group discussions, and rating methods are the best-suited approaches to studying construction project overruns.
The results show that design mistakes, labor shortages, payment delays, old equipment, poor scheduling, weather conditions, lack of skilled labor, transportation, inflation, order variations, market price fluctuations, and people's attitudes and philosophies are the principal causes of cost overrun that undermine project performance. The institution should follow the scheduled activities to move the project life forward positively.
Keywords: cost overrun, delay, mega projects, design
Procedia PDF Downloads 62
6661 Root Biomass Growth in Different Growth Stages of Wheat and Barley Cultivars
Abstract:
This work was conducted under greenhouse conditions in order to investigate the root biomass growth of two bread wheat, two durum wheat and two barley cultivars grown in irrigated and dry lands, respectively. The work was planned with four replications in a completely randomized block design in the 2011-2012 growing season. In the study, root biomass growth was evaluated at the stages of stem elongation, completion of anthesis, and full grain maturity. Results showed that there were significant differences between cultivars grown in dry and irrigated lands at all growth stages in terms of root biomass (P < 0.01). According to the research results, at all growth stages the dry-type bread and durum wheats generally had higher root biomass than the irrigated-type cultivars; furthermore, the dry-type barley cultivar had higher root biomass at GS 31 and GS 69 but lower root biomass at GS 92 than Larende. In all cultivars, root biomass increased between GS 31 and GS 69, with dry-type cultivars showing a greater increase than irrigated-type cultivars. Root biomass of bread wheat increased between GS 69 and GS 92, whereas root biomass of barley and durum wheat decreased.
Keywords: bread and durum wheat, barley, root biomass, different growth stages
Procedia PDF Downloads 606
6660 An Image Segmentation Algorithm for Gradient Target Based on Mean-Shift and Dictionary Learning
Authors: Yanwen Li, Shuguo Xie
Abstract:
In electromagnetic imaging, because the system is diffraction-limited, pixel values change slowly near the edges of image targets, and they also change with location within the same target. Using traditional digital image segmentation methods to segment electromagnetic gradient images can produce many errors because of this change in pixel values. To address this issue, this paper proposes a novel image segmentation and extraction algorithm based on Mean-Shift and dictionary learning. First, the preliminary segmentation results from an adaptive-bandwidth Mean-Shift algorithm are expanded, merged and extracted. Then the overlap rate of the extracted image blocks is checked before determining a segmentation region with a single complete target. Last, the gradient edge of the extracted targets is recovered and reconstructed using a dictionary-learning algorithm, yielding final segmentation results that are very close to the gradient target in the original image. Both the experimental and simulated results show that the segmentation results are very accurate; the Dice coefficients are improved by 70% to 80% compared with the Mean-Shift-only method.
Keywords: gradient image, segmentation and extraction, mean-shift algorithm, dictionary learning
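The Mean-Shift stage can be illustrated with a minimal fixed-bandwidth, flat-kernel sketch on 2D points: each point is shifted toward the mean of its neighborhood until it converges on a density mode, and points that land on the same mode form one segment. The paper's adaptive-bandwidth variant and the dictionary-learning edge reconstruction are omitted here.

```python
import numpy as np

def mean_shift(points, bandwidth, iters=50, tol=1e-5):
    """Shift each point toward the mean of its bandwidth neighborhood until it
    converges on a density mode; points that land together share a cluster."""
    modes = points.astype(float).copy()
    for _ in range(iters):
        moved = 0.0
        for i, m in enumerate(modes):
            # flat kernel: average all original points within the bandwidth
            nbrs = points[np.linalg.norm(points - m, axis=1) <= bandwidth]
            new = nbrs.mean(axis=0)
            moved = max(moved, np.linalg.norm(new - m))
            modes[i] = new
        if moved < tol:
            break
    return modes

data = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, -0.1],
                 [5.0, 5.0], [5.2, 4.9], [4.9, 5.1]])
modes = mean_shift(data, bandwidth=1.0)
```

On this toy data the two well-separated groups collapse onto their respective cluster means, which is the behavior the segmentation stage exploits on pixel feature vectors.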
Procedia PDF Downloads 266
6659 Fast and Scale-Adaptive Target Tracking via PCA-SIFT
Authors: Yawen Wang, Hongchang Chen, Shaomei Li, Chao Gao, Jiangpeng Zhang
Abstract:
As the main challenges for target tracking are accounting for target scale change and operating in real time, we combine the Mean-Shift and PCA-SIFT algorithms to solve the problem. We introduce a similarity comparison method to determine how the target scale changes, and take different strategies in different situations. Because a growing target scale causes location error, we employ backward tracking to reduce the error. The Mean-Shift algorithm performs poorly when tracking a scale-changing target due to the fixed bandwidth of its kernel function; to overcome this problem, we introduce PCA-SIFT matching. Through keypoint matching between target and template, the scale of the tracking window can be adjusted adaptively. Because this algorithm is sensitive to wrong matches, we introduce RANSAC to reduce mismatches as far as possible, and target relocation is triggered when the number of matches is too small. In addition, we take comprehensive account of target deformation and error accumulation to put forward a new template update method. Experiments on five image sequences and comparison with six other algorithms demonstrate the favorable performance of the proposed tracking algorithm.
Keywords: target tracking, PCA-SIFT, mean-shift, scale-adaptive
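One simple way to turn matched keypoints into a window-scale update, in the spirit described above, is to take the median ratio of pairwise distances between corresponding keypoints in the current frame versus the template. This sketch is illustrative, not the paper's method, and assumes the matches have already been filtered (e.g. by RANSAC).

```python
import numpy as np

def scale_change(template_pts, target_pts):
    """Estimate scale as the median ratio of pairwise keypoint distances
    in the target frame versus the template (matched by index)."""
    ratios = []
    n = len(template_pts)
    for i in range(n):
        for j in range(i + 1, n):
            d_tpl = np.linalg.norm(template_pts[i] - template_pts[j])
            d_tgt = np.linalg.norm(target_pts[i] - target_pts[j])
            if d_tpl > 1e-9:                 # skip degenerate pairs
                ratios.append(d_tgt / d_tpl)
    return float(np.median(ratios))
```

The median makes the estimate robust to a few remaining bad matches, and the resulting factor can be applied directly to the Mean-Shift tracking window size.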
Procedia PDF Downloads 433
6658 The Pangs of Unemployment and Its Impediment to Nation Building
Authors: Vitalis Okwuchukwu Opara
Abstract:
The task of nation building consists primarily in welding diverse cultural groups into a united nation state that develops a centripetal political culture, one that makes its people see themselves as members of one nation linked together by more reliable ties than the coercion offered by the state. By contrast, most countries in the world today comprise diverse nationalities, each with its unique set of norms and values, which often come into conflict with others. As such, the task of nation building lies in uniting these diverse cultural groups, and the various human elements that make up a geopolitical zone, into a united nation state. The most outstanding impediment to achieving this task is unemployment. Unemployment is a peril to nation building and an obstacle to the growth of a nation. It is often said that the wise see obstacles as stepping-stones to advance further, yet the pangs of unemployment impede nation building such that it sometimes takes a very long time to do away with the problem. In recent times, there has been a revolutionary wind blowing across the world, a wind bound to wake nations' leaders up to their responsibility. Unemployment causes youth restiveness and brings leaders to their knees; it breeds problems. This work is intended to expose the pangs of unemployment and its peril to nation building.
Keywords: pangs, unemployment, obstacles, nation-building
Procedia PDF Downloads 355
6657 Selection of Qualitative Research Strategy for Bullying and Harassment in Sport
Authors: J. Vveinhardt, V. B. Fominiene, L. Jeseviciute-Ufartiene
Abstract:
Relevance of Research: Qualitative research is still regarded as highly subjective and not sufficiently scientific to achieve objective research results. However, it is agreed that a qualitative study allows revealing the hidden motives of research participants, creating new theories, and highlighting the problem field. Enough research has been done to reveal these aspects of qualitative research. However, each research area has its own specificity, and sport is unique due to the image of its participants, who are understood to be strong and invincible; a sport participant might therefore have personal issues with recognizing himself as a victim in the context of bullying and harassment. Accordingly, the researcher faces a dilemma in getting a victim in sport to speak at all. Thus, the ethical aspects of qualitative research become relevant. In addition, the many fields of sport make determining the sample size of the research a problem. The corresponding research problem is therefore which qualitative research strategies are the most suitable for revealing the phenomenon of bullying and harassment in sport, and why. Object of research: qualitative research strategy for bullying and harassment in sport. Purpose of the research: to analyze strategies of qualitative research, selecting a suitable one for bullying and harassment in sport. Methods of research: analyses of scientific research applying qualitative methods to bullying and harassment. Research Results: Four main strategies are applied in qualitative research: inductive, deductive, retroductive, and abductive. Inductive and deductive strategies are commonly used in researching bullying and harassment in sport. The inductive strategy is applied as quantitative research in order to reveal and describe the prevalence of bullying and harassment in sport.
The deductive strategy is used through qualitative methods to explain the causes of bullying and harassment and to predict the actions of the participants in bullying and harassment in sport and the possible consequences of those actions. The most commonly used qualitative method for researching bullying and harassment in sport is the semi-structured interview, spoken or written. However, these methods may restrict the openness of study participants when recording on a dictaphone, or yield incomplete answers when participants respond in writing, because the answers cannot be refined. Qualitative research is also increasingly conducted with technology-mediated research data. For example, a focus group study in a closed forum allows participants to interact freely with each other because of the confidentiality afforded to the selected participants. The moderator can purposefully formulate and submit problem-solving questions to the participants. Hence, the application of intelligent technology through in-depth qualitative research can help discover new and specific information on bullying and harassment in sport. Acknowledgement: This research is funded by the European Social Fund according to the activity ‘Improvement of researchers’ qualification by implementing world-class R&D projects’ of Measure No. 09.3.3-LMT-K-712.
Keywords: bullying, focus group, harassment, narrative, sport, qualitative research
Procedia PDF Downloads 180
6656 Visual Working Memory, Reading Abilities, and Vocabulary in Mexican Deaf Signers
Authors: A. Mondaca, E. Mendoza, D. Jackson-Maldonado, A. García-Obregón
Abstract:
Deaf signers usually show lower scores on Auditory Working Memory (AWM) tasks and higher scores on Visual Working Memory (VWM) tasks than their hearing peers. Further, working memory has been correlated with reading abilities and vocabulary in deaf and hearing individuals. The aim of the present study is to compare the performance of Mexican Deaf signers and hearing adults on VWM, reading, and vocabulary tasks, and to observe whether the latter two are correlated with the former. Fifteen Mexican Deaf signers were assessed using the Corsi block test for VWM, four subtests of the PROLEC (Batería de Evaluación de los Procesos Lectores) for reading abilities, and the Spanish version of the LexTale for vocabulary. T-tests show significant differences between groups for VWM and vocabulary but not for all the PROLEC subtests. A significant Pearson correlation was found between VWM and vocabulary but not between VWM and reading abilities. This work is part of a larger research study and the results are not yet conclusive. A discussion of the use of the PROLEC as a tool to explore reading abilities in a Deaf population is included.
Keywords: deaf signers, visual working memory, reading, Mexican sign language
Procedia PDF Downloads 168
6655 Prioritization of Customer Order Selection Factors by Utilizing Conjoint Analysis: A Case Study for a Structural Steel Firm
Authors: Burcu Akyildiz, Cigdem Kadaifci, Y. Ilker Topcu, Burc Ulengin
Abstract:
In today’s business environment, companies must make strategic decisions to gain sustainable competitive advantage. Order selection is a crucial issue among these decisions, especially for the steel production industry. When companies allocate a high proportion of their design and production capacities to ongoing projects, determining which customer order should be chosen among the potential orders without exceeding the remaining capacity is the major critical problem. In this study, the aim is to identify and prioritize the evaluation factors for the customer order selection problem. Conjoint analysis is used to examine the importance level of each factor, determined as: the potential profit rate per unit of time, the compatibility of a potential order with available capacity, the level of potential future orders with higher profit, customer credit as a future business opportunity, and the negotiability of the production schedule for the order.
Keywords: conjoint analysis, order prioritization, profit management, structural steel firm
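The mechanics of conjoint analysis can be sketched in a few lines: respondents rate hypothetical order profiles described by attribute levels, part-worth utilities are estimated by regressing the ratings on dummy-coded attributes, and each attribute's relative importance is the share of its part-worth range. The two attributes, profiles, and ratings below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical profiles: dummy codes for two attributes
# (profit_rate: high=1/low=0, capacity_fit: good=1/poor=0).
profiles = np.array([
    [0, 0], [0, 1], [1, 0], [1, 1],
])
ratings = np.array([2.0, 4.0, 7.0, 9.0])   # stated preference scores

# Part-worths via ordinary least squares on the dummy-coded design.
A = np.column_stack([np.ones(len(profiles)), profiles])
coef, *_ = np.linalg.lstsq(A, ratings, rcond=None)
partworths = dict(zip(["intercept", "profit_rate", "capacity_fit"], coef))

# Relative importance = each attribute's utility range over the total range.
ranges = {k: abs(v) for k, v in partworths.items() if k != "intercept"}
total = sum(ranges.values())
importance = {k: v / total for k, v in ranges.items()}
```

With these toy ratings, moving profit_rate from low to high adds 5 points of utility versus 2 for capacity_fit, so profit rate carries about 71% of the importance, which is the kind of prioritization the study derives for its five factors.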
Procedia PDF Downloads 384
6654 Design of PSS and SVC to Improve Power System Stability
Authors: Mahmoud Samkan
Abstract:
In this paper, the design and assessment of a new coordination between Power System Stabilizers (PSSs) and a Static Var Compensator (SVC) in a multimachine power system via a statistical method are proposed. The coordinated design problem of PSSs and SVC over a wide range of loading conditions is handled as an optimization problem. The Bacterial Swarming Optimization (BSO), which synergistically couples Bacterial Foraging (BF) with Particle Swarm Optimization (PSO), is employed to seek optimal controller parameters. By minimizing the proposed objective function, which involves the speed deviations between generators, the stability performance of the system is enhanced. To compare the capabilities of PSS and SVC, both are designed independently, and then in a coordinated manner. Simultaneous tuning of the BSO-based coordinated controller gives robust damping performance over a wide range of operating conditions and large disturbances, compared to an optimized PSS controller based on BSO (BSOPSS) and an optimized SVC controller based on BSO (BSOSVC). Moreover, a statistical t-test is executed to validate the robustness of the coordinated controller versus the uncoordinated ones. Keywords: SVC, PSSs, multimachine power system, coordinated design, bacteria swarm optimization, statistical assessment
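The BF/BSO coupling is specific to the paper, but the PSO half of the search can be sketched generically. Below, a toy quadratic objective stands in for the sum of squared generator speed deviations, and two decision variables stand in for controller gains; in the real design the objective value would come from time-domain simulation of the multimachine system:

```python
import random

def objective(gains):
    """Toy stand-in for the integral of squared speed deviations."""
    k1, k2 = gains
    return (k1 - 1.2) ** 2 + (k2 - 0.4) ** 2

def pso(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard global-best particle swarm optimization with box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

best, best_f = pso(objective, bounds=[(0.0, 5.0), (0.0, 5.0)])
```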
Procedia PDF Downloads 376
6653 Machine Learning Methods for Flood Hazard Mapping
Authors: Stefano Zappacosta, Cristiano Bove, Maria Carmela Marinelli, Paola di Lauro, Katarina Spasenovic, Lorenzo Ostano, Giuseppe Aiello, Marco Pietrosanto
Abstract:
This paper proposes a novel neural network approach for flood hazard mapping. The core of the model is a machine learning component fed by frequency ratios, namely statistical correlations between flood event occurrences and a selected number of topographic properties. The proposed hybrid model can be used to classify four increasing levels of hazard. Its classification capability was compared with the flood hazard maps of the River Basin Plans (PAI) designed by ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), the Italian Institute for Environmental Protection and Research. The study area of Piemonte, an Italian region, has been considered without loss of generality. The frequency ratios may be used as a standalone block to model flood hazard. Nevertheless, combining them with a neural network improves the classification power by several percentage points, and the hybrid may be proposed as a basic tool to model flood hazard maps in a wider scope. Keywords: flood modeling, hazard map, neural networks, hydrogeological risk, flood risk assessment
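A frequency ratio in this sense compares how often floods occur within one class of a topographic property with how common that class is overall; values above 1 mark flood-prone classes. A minimal sketch on hypothetical raster cells (the slope classes and flood flags are invented for illustration):

```python
from collections import Counter

def frequency_ratios(classes, flooded):
    """FR(c) = (flooded cells in c / all flooded cells)
             / (cells in c / all cells)."""
    total = len(classes)
    total_flooded = sum(flooded)
    per_class = Counter(classes)
    flooded_per_class = Counter(c for c, f in zip(classes, flooded) if f)
    return {c: (flooded_per_class[c] / total_flooded) / (per_class[c] / total)
            for c in per_class}

# Hypothetical slope classes for 10 raster cells; 1 = historical flood cell
slope_class = ["low", "low", "low", "low", "high", "high",
               "high", "high", "high", "high"]
flood_flag = [1, 1, 0, 0, 0, 0, 0, 1, 0, 0]
fr = frequency_ratios(slope_class, flood_flag)
```

In the hybrid model, a vector of such ratios (one per property class) becomes the input to the neural classifier.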
Procedia PDF Downloads 178
6652 Optimal Planning of Transmission Line Charging Mode During Black Start of a Hydroelectric Unit
Authors: Mohammad Reza Esmaili
Abstract:
After the occurrence of a blackout, the most important question is how fast electric service can be restored. Power system restoration is an immensely complex issue, and a plan should be ready to be executed within the shortest possible time. This plan has three main stages: black start, network reconfiguration and load restoration. In the black start stage, operators and experts may face several problems, for instance, the unsuccessful connection of a long high-voltage transmission line to the electrical source. In this situation, the generator may be tripped because of an unsuitable setting of its line charging mode or high absorbed reactive power. In order to solve this problem, the line charging process is defined as a nonlinear programming problem, and it is optimized using GAMS software in this paper. The optimized process is performed on a grid that includes a 250 MW hydroelectric unit and a 400 kV transmission system. Simulations and field test results show the effectiveness of the optimal planning. Keywords: power system restoration, black start, line charging mode, nonlinear programming
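The paper's nonlinear program is solved in GAMS, but the core trade-off can be sketched as a toy one-variable problem: an energized open-ended line injects charging reactive power roughly B·V², and a shunt reactor of susceptance Br is sized as small (i.e., as cheap) as possible while keeping the generator's absorbed reactive power within its limit. All quantities below are hypothetical per-unit values, and the grid scan stands in for a proper NLP solver:

```python
def smallest_feasible_reactor(b_line, v, q_max, step=0.01, b_cap=5.0):
    """Scan candidate reactor susceptances (per unit) and return the
    smallest one that keeps absorbed charging power within q_max."""
    n_steps = int(b_cap / step) + 1
    for i in range(n_steps):
        b_r = i * step
        q_absorbed = v * v * (b_line - b_r)  # line charging minus compensation
        if q_absorbed <= q_max:
            return b_r
    return None  # infeasible within the scanned range

# Hypothetical per-unit data: line charging susceptance 1.2, voltage 1.0,
# generator reactive absorption limit 0.5
b_reactor = smallest_feasible_reactor(b_line=1.2, v=1.0, q_max=0.5)
```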
Procedia PDF Downloads 80
6651 Parameter Tuning of Complex Systems Modeled in Agent Based Modeling and Simulation
Authors: Rabia Korkmaz Tan, Şebnem Bora
Abstract:
The major problem encountered when modeling complex systems with agent-based modeling and simulation techniques is the existence of large parameter spaces. A complex system model cannot be expected to reflect the whole of the real system, but by specifying the most appropriate parameters, the actual system can be represented by the model under certain conditions. A review of studies conducted in recent years shows that there are few studies on the parameter tuning problem in agent-based simulations, and these have focused on tuning the parameters of a single model. In this study, an approach to parameter tuning is proposed using metaheuristic algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), and Firefly Algorithm (FA). With this hybrid-structured study, the parameter tuning problems of models from different fields were solved. The proposed approach was tested on two different models, and its performance on the different problems was compared. The simulations and the results reveal that the proposed approach outperforms existing parameter tuning studies. Keywords: parameter tuning, agent based modeling and simulation, metaheuristic algorithms, complex systems
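As a sketch of the tuning loop, a metaheuristic searches the parameter space so that the model's output matches observed reference data. Below, a minimal elitist genetic algorithm (blend crossover plus Gaussian mutation) tunes two parameters of a stand-in "model" function; in the paper, fitness would come from running the agent-based simulation itself:

```python
import random

def model_output(params):
    """Stand-in for an agent-based simulation run."""
    a, b = params
    return a * 2.0 + b  # toy response surface

TARGET = model_output((0.3, 0.8))  # pretend this is the observed data

def fitness(params):
    return abs(model_output(params) - TARGET)  # lower is better

def genetic_tune(pop_size=20, generations=40, mut_rate=0.3, seed=1):
    rng = random.Random(seed)
    pop = [(rng.random(), rng.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 4]  # elitism: keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(p1, p2)]  # blend crossover
            if rng.random() < mut_rate:                    # Gaussian mutation
                i = rng.randrange(len(child))
                child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0.0, 0.1)))
            children.append(tuple(child))
        pop = elite + children
    return min(pop, key=fitness)

best = genetic_tune()
```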
Procedia PDF Downloads 226
6650 A Development of Science Instructional Model Based on Stem Education Approach to Enhance Scientific Mind and Problem Solving Skills for Primary Students
Authors: Prasita Sooksamran, Wareerat Kaewurai
Abstract:
STEM is an integrated teaching approach promoted by the Ministry of Education in Thailand. STEM Education is an integrated approach to teaching Science, Technology, Engineering, and Mathematics. It has been questioned by Thai teachers on the grounds of how to integrate STEM into the classroom. Therefore, the main objective of this study is to develop a science instructional model based on the STEM approach to enhance the scientific mind and problem-solving skills of primary students. This study is participatory action research and follows two steps: 1) developing the instructional model, which began with the collection and synthesis of information from relevant documents, related research and other sources in order to create a prototype instructional model; and 2) examining the validity and relevance of the instructional model with a panel of nine experts. The findings were as follows: 1. The developed instructional model comprised principles, an objective, content, operational procedures and learning evaluation. There were five principles: 1) learning based on the natural curiosity of primary school level children, leading to knowledge inquiry, understanding and knowledge construction; 2) learning based on the interrelation between people and environment; 3) learning based on concrete learning experiences, exploration and the seeking of knowledge; 4) learning based on the self-construction of knowledge, creativity and innovation; and 5) relating findings to real life and the solving of real-life problems. The objective of this instructional model is to enhance the scientific mind and problem-solving skills. Children are evaluated according to their achievements. Lesson content is based on science as a core subject, integrated with technology and mathematics at the grade 6 level according to The Basic Education Core Curriculum 2008 guidelines.
The operational procedures consisted of 6 steps: 1) Curiosity, 2) Collection of data, 3) Collaborative planning, 4) Creativity and Innovation, 5) Criticism, and 6) Communication and Service. The learning evaluation is an authentic assessment based on continuous evaluation of all the material taught. 2. The experts agreed that the Science Instructional Model based on the STEM Education Approach had an excellent level of validity and relevance (mean = 4.67, S.D. = 0.50). Keywords: instructional model, STEM education, scientific mind, problem solving
Procedia PDF Downloads 192
6649 The Environmental Benefits of the Adoption of Emission Control for Locomotives in Brazil
Authors: Rui de Abrantes, André Luiz Silva Forcetto
Abstract:
Air pollution is a big problem in many cities around the world. Brazil's big cities also have this problem, with millions of people exposed daily to pollutant levels above those recommended by the WHO. Brazil has taken several actions to reduce air pollution, among others, controlling the atmospheric emissions from vehicles, non-road mobile machinery, and motorcycles; on the other hand, there are no emission controls for locomotives, which expose the population to tons of pollutants annually. The rail network is not homogeneously distributed over the national territory; it is denser near the big cities, where the population is therefore more exposed to pollutants. Apart from that, the government intends to expand the rail network as one of its strategies for greenhouse gas mitigation, complying with international agreements against climate change. This paper initially presents the estimated emissions of the locomotive fleet with no emission control and with emission control equivalent to US Tier 3 from 2028 onwards and for the next 20 years. However, we realized that a program equivalent to Tier 3 would not be effective, so we proposed a program in two steps that will avoid the release of more than 2.4 million tons of CO, 531,000 tons of hydrocarbons, 3.7 million tons of nitrogen oxides, and 102,000 tons of particulate matter over 20 years. Keywords: locomotives, emission control, air pollution, pollutants emission
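The avoided-emission totals above come from fleet inventory arithmetic of the general form avoided = activity × (uncontrolled factor − controlled factor), summed over pollutants and years. The sketch below illustrates that structure only; the fuel use and emission factors are hypothetical placeholders, not the paper's inventory data:

```python
# Hypothetical annual fleet fuel use (tonnes of diesel) and emission
# factors (kg pollutant per tonne of fuel) before and after control.
ANNUAL_FUEL_T = 1_000_000
EF_UNCONTROLLED = {"CO": 8.0, "HC": 3.0, "NOx": 55.0, "PM": 1.5}
EF_CONTROLLED = {"CO": 2.0, "HC": 0.8, "NOx": 18.0, "PM": 0.4}

def avoided_emissions(fuel_t_per_year, ef_before, ef_after, years):
    """Tonnes of each pollutant avoided over the programme horizon."""
    return {p: fuel_t_per_year * (ef_before[p] - ef_after[p]) * years / 1000.0
            for p in ef_before}

avoided = avoided_emissions(ANNUAL_FUEL_T, EF_UNCONTROLLED, EF_CONTROLLED,
                            years=20)
```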
Procedia PDF Downloads 49
6648 Mainstreaming Environmentally-Friendly Household Management Practice through Indonesian Women Social Gathering
Authors: Erinetta P. Anjani, Karina Mariz, Rifqi K. Fathianto
Abstract:
While Islam teaches its followers to be mindful of God’s creation, including the environment, Indonesia, one of the world’s largest Muslim countries, is now also the world’s second-largest plastic waste contributor. The problem of waste is a complicated matter in Indonesia and is worsening because many landfills are now on the verge of overcapacity. This problem has at least two causes. The first is Indonesia’s poor waste management. The second is people’s low eco-literacy, as can be seen in the massive use of non-degradable materials and the low rates of waste separation, recycling and upcycling, even though households are the largest source of waste in Indonesia. Mostly dealing with a patriarchal culture, women in Indonesia play a big and important role in their households, from family matters to household management (including waste management) to economic matters. Uniquely, the majority of Muslim women in Indonesia are engaged in -arisan- women’s social gatherings or in -majelis ta’lim- women’s communities in Islamic prayer, which serve as a social mechanism. As many NGOs are working on tackling environmental issues by raising awareness so that people adopt more environmentally-friendly household management practices, the problem of waste in Indonesia is meeting a bright light. Using qualitative data and descriptive analysis, the following is a proposal for a program intended to spread eco-literacy in waste management to women in Indonesia through their social gatherings, so that they gain awareness and start implementing eco-actions in their households. We propose that Waste4Change, a social company which provides environmentally-friendly waste management services, reach these women with modules that consist of environmental education, trainings, and workshops. We will then monitor and counsel the women to make sure the lessons are fully applied in their houses.
The program will take place near the University of Indonesia, Depok, West Java. Keywords: eco-literacy, environmental education, household waste management, Muslim women social gathering, Waste4Change
Procedia PDF Downloads 157
6647 Optimisation of Intermodal Transport Chain of Supermarkets on Isle of Wight, UK
Authors: Jingya Liu, Yue Wu, Jiabin Luo
Abstract:
This work investigates an intermodal transportation system for delivering goods from a Regional Distribution Centre to supermarkets on the Isle of Wight (IOW) via the port of Southampton or Portsmouth in the UK. We consider this integrated logistics chain as a 3-echelon transportation system. In such a system, there are two types of transport methods used to deliver goods across the Solent Channel: one is accompanied transport, which is used by most supermarkets on the IOW, such as Spar, Lidl and Co-operative Food; the other is unaccompanied transport, which is used by Aldi. Five transport scenarios are studied based on different transport modes and ferry routes. The aim is to determine an optimal delivery plan for supermarkets of different business scales on the IOW, in order to minimise the total running cost, fuel consumption and carbon emissions. The problem is modelled as a vehicle routing problem with time windows and solved by a genetic algorithm. The computational results suggest that accompanied transport is more cost-efficient for small and medium business-scale supermarket chains on the IOW, while unaccompanied transport has the potential to improve the efficiency and effectiveness of large business-scale supermarket chains. Keywords: genetic algorithm, intermodal transport system, Isle of Wight, optimization, supermarket
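A sketch of the optimisation core: candidate routes are permutations of store visits, a cost function adds travel time plus penalties for missed time windows, and an evolutionary loop searches over permutations (a mutation-only stand-in for the paper's genetic algorithm, which would also use crossover). The travel times, windows and service times below are invented for illustration:

```python
import random

# Hypothetical travel times (hours) between depot (node 0) and 4 stores
T = [[0, 2, 3, 4, 3],
     [2, 0, 1, 3, 2],
     [3, 1, 0, 2, 3],
     [4, 3, 2, 0, 1],
     [3, 2, 3, 1, 0]]
WINDOWS = {1: (0, 5), 2: (2, 6), 3: (4, 9), 4: (6, 12)}  # store -> (open, close)
SERVICE = 0.5        # hours spent unloading at each store
LATE_PENALTY = 10.0  # cost per hour of time-window violation

def route_cost(route):
    t, cost, prev = 0.0, 0.0, 0
    for store in route:
        t += T[prev][store]
        lo, hi = WINDOWS[store]
        t = max(t, lo)            # wait if the vehicle arrives early
        if t > hi:
            cost += LATE_PENALTY * (t - hi)
        t += SERVICE
        cost += T[prev][store]
        prev = store
    return cost + T[prev][0]      # return to depot

def evolve(iters=500, seed=3):
    rng = random.Random(seed)
    best = [1, 2, 3, 4]
    rng.shuffle(best)
    best_c = route_cost(best)
    for _ in range(iters):
        cand = best[:]
        i, j = rng.sample(range(len(cand)), 2)  # swap mutation
        cand[i], cand[j] = cand[j], cand[i]
        c = route_cost(cand)
        if c <= best_c:
            best, best_c = cand, c
    return best, best_c

best_route, best_cost = evolve()
```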
Procedia PDF Downloads 369
6646 Contextual Toxicity Detection with Data Augmentation
Authors: Julia Ive, Lucia Specia
Abstract:
Understanding and detecting toxicity is an important problem in supporting safer human interactions online. Our work focuses on contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use “toxicity” as an umbrella term for a number of variants commonly named in the literature, including hate, abuse and offence, among others. Detecting toxicity in context is a non-trivial problem that has been addressed by very few previous studies. These studies analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing the human perception of toxicity in conversational data (i.e., tweets), in the absence versus presence of context, in this case, previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the contextual data available does not provide sufficient evidence that context is indeed important (even for humans). The data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity with swear, racist, etc.
words), and thus context is not needed for a decision, or are ambiguous, vague or unclear even in the presence of context; in addition, the data contains labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious (i.e., covert cases) without context, or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). On the contextual detection models, we posit that their poor performance is due to limitations of both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on that, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking ours against previous models on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual, or contextual but agnostic of the conversation structure. Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing
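As a sketch of the architectural idea, the snippet below encodes the target utterance and its conversational context separately (bag-of-words stands in for a neural encoder), concatenates the two representations, and trains a logistic classifier by gradient descent. The tiny dataset is invented: the same target text receives opposite labels depending on context, which a context-agnostic classifier cannot fit but this context-aware one can:

```python
import math

def tokenize(text):
    return text.lower().split()

# Tiny invented dataset: (target utterance, conversational context, toxic?)
DATA = [
    ("you are trash", "i hate you so much", 1),
    ("you are trash", "good game bro", 0),
    ("nice play mate", "good game bro", 0),
    ("get lost loser", "i hate you so much", 1),
]

vocab = sorted({w for tgt, ctx, _ in DATA for w in tokenize(tgt) + tokenize(ctx)})
V = len(vocab)

def bow(text):
    vec = [0.0] * V
    for w in tokenize(text):
        vec[vocab.index(w)] += 1.0
    return vec

def features(target, context):
    # Hierarchical idea in miniature: encode separately, then combine.
    return bow(target) + bow(context)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.5, epochs=2000):
    w, b = [0.0] * (2 * V), 0.0
    for _ in range(epochs):
        for tgt, ctx, y in data:
            x = features(tgt, ctx)
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of the log-loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

w, b = train(DATA)

def predict(tgt, ctx):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, features(tgt, ctx))) + b) > 0.5
```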
Procedia PDF Downloads 170
6645 Feasibility of Deployable Encasing for a CVDR (Cockpit Voice and Data Recorder) in Commercial Aircraft
Authors: Vishnu Nair, Rohan Kapoor
Abstract:
Recent commercial aircraft crashes demand a paradigm shift in how CVDRs are located and recovered, particularly if the aircraft crashes into the sea. The CVDR (Cockpit Voice and Data Recorder) is the most vital component of the entire wreckage for investigating the sequence of events leading to a crash. Locating and retrieving it from massive water bodies has been a taxing and exorbitantly expensive process, as seen in recent air crashes, the unfortunate Malaysia Airlines MH-370 crash among them. This study aims to provide an aid to this persisting problem by improving the buoyant as well as the aerodynamic properties of the proposed CVDR encasing. Alongside this, the placement of the deployable CVDR on the surface of the aircraft and its floatability in case of water submersion are key factors taken into consideration for a better resolution of the problem. All of this results in the deployable CVDR emerging to the surface of the water body. The whole system is designed such that it can be seamlessly integrated with the current crop of commercial aircraft. The work is supported by a computational study carried out with an Ansys-Fluent combination. Keywords: encasing, buoyancy, deployable CVDR, floatability, water submersion
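Floatability in the sense above reduces to Archimedes' principle: the encasing floats if its average density is below that of seawater, and at equilibrium the submerged fraction of its volume equals that density ratio. The mass and volume figures below are invented for illustration, not design values from the study:

```python
SEAWATER_DENSITY = 1025.0  # kg/m^3, typical surface seawater

def floatability(mass_kg, volume_m3, fluid_density=SEAWATER_DENSITY):
    """Return (floats?, fraction of volume submerged at equilibrium)."""
    avg_density = mass_kg / volume_m3
    if avg_density >= fluid_density:
        return False, 1.0  # sinks: fully submerged
    return True, avg_density / fluid_density

# Hypothetical deployable-CVDR encasing: 18 kg packed into 0.030 m^3
floats, submerged = floatability(18.0, 0.030)
```

The design lever is clear from the ratio: enlarging the encasing volume (or shedding mass on deployment) lowers the average density and raises the freeboard available for location beacons.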
Procedia PDF Downloads 299