Search results for: semidefinite programming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 907

637 Assessment of Water Quality Network in Karoon River by Dynamic Programming Approach (DPA)

Authors: M. Nasri Nasrabadi, A. A. Hassani

Abstract:

Karoon is one of the largest and longest rivers of Iran. Because of its numerous industrial and agricultural centers and its use for drinking water, it holds a strategic position in the west and southwest of the country, and optimal monitoring of its water quality is an essential national issue. Given financial constraints, careful design of the water quality monitoring network is an efficient way to manage water quality, and the most crucial part is finding appropriate locations for monitoring stations. Considering the objectives of water usage, we evaluate the existing water quality sampling stations of this river. Several methods exist for assessing existing monitoring stations, such as the Sanders method, multiple-criteria decision making, and the dynamic programming approach (DPA); DPA was adopted in this study. The results showed that, based on the drinking water quality index, nine of the 20 existing monitoring stations should be retained on the river: Gorgor-Band-Ghir in zone A; Dez-Band-Ghir in zone B; Teir, Pole Panjom, and Zargan in zone C; and Darkhoein, Hafar, Chobade, and Sabonsazi in zone D. In addition, the stations on the Dez River are in the best condition.

Keywords: DPA, karoon river, network monitoring, water quality, sampling site

Procedia PDF Downloads 337
636 Analyzing the Market Growth in Application Programming Interface Economy Using Time-Evolving Model

Authors: Hiroki Yoshikai, Shin’ichi Arakawa, Tetsuya Takine, Masayuki Murata

Abstract:

The API (Application Programming Interface) economy is expected to create new value by converting corporate services such as information processing and data provision into APIs and using these APIs to connect services. Understanding the dynamics of an API economy market under the strategies of its participants is crucial to fully realizing its value. To capture the behavior of a market in which the number of participants changes over time, we present a time-evolving market model for a platform in which API providers, who supply APIs to service providers, participate alongside service providers and consumers. We then use the market model to clarify the role API providers play in expanding market participation and forming ecosystems. The results show that the platform with API providers increased the number of market participants by 67% and decreased the cost of developing services by 25% compared to the platform without API providers. Furthermore, during the expansion phase of the market, the profits of participants are found to be roughly equal when 70% of the revenue from consumers is distributed to service providers and API providers. When the market is mature, the profits of the service providers and API providers decrease significantly due to competition between them, while the profit of the platform increases.

Keywords: API economy, ecosystem, platform, API providers

Procedia PDF Downloads 58
635 Approach to Formulate Intuitionistic Fuzzy Regression Models

Authors: Liang-Hsuan Chen, Sheng-Shing Nien

Abstract:

This study develops approaches to formulating intuitionistic fuzzy regression (IFR) models for decision-making applications in fuzzy environments using intuitionistic fuzzy observations. Intuitionistic fuzzy numbers (IFNs) are used to characterize the fuzzy input and output variables in the IFR formulation process. A mathematical programming problem (MPP) is built to optimally determine the IFR parameters. Each parameter in the MPP is defined as a pair of alternative numerical variables with opposite signs, and an intuitionistic fuzzy error term is added to the MPP to characterize the uncertainty of the model. The IFR model is formulated based on a distance measure, minimizing the total distance error between the estimated and observed intuitionistic fuzzy responses when resolving the MPP. The proposed approaches are simple and efficient in both formulation and resolution; the signs of the parameters are determined within the model, so the need to predetermine them is avoided. Furthermore, the approach has the advantage that the spread of the predicted IFN response is not over-inflated, since the parameters in the established IFR model are crisp. The performance of the obtained models is evaluated and compared with existing approaches.

Keywords: fuzzy sets, intuitionistic fuzzy number, intuitionistic fuzzy regression, mathematical programming method

Procedia PDF Downloads 109
634 The Reduction of CO2 Emissions Level in Malaysian Transportation Sector: An Optimization Approach

Authors: Siti Indati Mustapa, Hussain Ali Bekhet

Abstract:

The transportation sector accounts for more than 40% of total energy consumption in Malaysia. It is a major user of fossil-based fuels and is increasingly highlighted as the sector contributing least to CO2 emission reduction targets. Considering this, the paper investigates the problem of reducing CO2 emissions using a linear programming approach. An optimization model is presented to investigate the optimal level of CO2 emission reduction in the road transport sector. Five scenarios are used to demonstrate the emission reduction model: (1) utilising alternative fuels, (2) improving fuel efficiency, (3) removing the fuel subsidy, (4) reducing travel demand, and (5) an optimal combined scenario. The study finds that fuel balancing can reduce CO2 emissions by up to 3%. Beyond 3%, more stringent measures that include fuel switching, fuel efficiency improvement, travel demand reduction, and combinations of mitigation measures have to be employed. The model shows that CO2 emissions from road transportation can be reduced by 38.3% in the optimal scenario.
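
A minimal sketch of the kind of fuel-mix linear program described above, written with scipy; the emission factors, fuel costs, demand, and caps are illustrative placeholders, not the authors' Malaysian data:

```python
# Illustrative sketch (not the authors' model): choose a road-transport fuel mix
# that minimizes total CO2 while meeting travel demand within a fuel budget.
from scipy.optimize import linprog

# Decision variables: energy (PJ) supplied by petrol, diesel, natural gas, biofuel.
emission_factor = [69.3, 74.1, 56.1, 39.0]   # kt CO2 per PJ (illustrative values)
cost_per_pj     = [11.0, 10.0, 8.0, 14.0]    # million USD per PJ (illustrative)

total_demand = 400.0     # PJ of transport energy that must be supplied
budget       = 4600.0    # million USD available for fuel

# Objective: minimize total CO2 emissions.
c = emission_factor

# linprog uses "<=" rows: stay within budget, and cap each alternative fuel's share.
A_ub = [cost_per_pj,
        [1, 0, 0, 0],          # petrol limited by the existing vehicle fleet
        [0, 0, 1, 0],          # natural gas limited by refuelling infrastructure
        [0, 0, 0, 1]]          # biofuel limited by supply
b_ub = [budget, 300.0, 60.0, 40.0]

# Equality constraint: the fuel mix must cover the demand exactly.
A_eq = [[1, 1, 1, 1]]
b_eq = [total_demand]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
print("fuel mix (PJ):", res.x)
print("total CO2 (kt):", res.fun)
```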

Keywords: CO2 emission, fuel consumption, optimization, linear programming, transportation sector, Malaysia

Procedia PDF Downloads 383
633 An Investigation of Community Radio Broadcasting in Phutthamonthon District, Nakhon Pathom, Thailand

Authors: Anchana Sooksomchitra

Abstract:

This study explores and compares the current condition of community radio stations in Phutthamonthon district, Nakhon Pathom province, Thailand, as well as the challenges they face. Qualitative research tools, including in-depth interviews, documentary analysis, focus group interviews, and observation, are used to examine the content, programming, and management structure of the three community radio stations currently operating in the district. The findings indicate that the management and operational approaches adopted by the two non-profit stations in the study, Salaya Pattana and Voice of Dhamma, are more structured and effective than those of the for-profit Tune Radio. Salaya Pattana, backed by the Faculty of Engineering at Mahidol University, and the charity-funded Voice of Dhamma are comparatively free from political and commercial influence and able to provide more relevant and consistent community-oriented content that meets the real demands of their audiences. Tune Radio, on the other hand, relies solely on financial support from political factions and business groups, which heavily influence its content.

Keywords: radio broadcasting, programming, management, community radio, Thailand

Procedia PDF Downloads 369
632 Optimal Delivery of Two Similar Products to N Ordered Customers

Authors: Epaminondas G. Kyriakidis, Theodosis D. Dimitrakos, Constantinos C. Karamatsoukis

Abstract:

The vehicle routing problem (VRP) is a well-known problem in operations research and has been widely studied during the last fifty-five years. The context of the VRP is that of delivering products located at a central depot to customers who are scattered over a geographical area and have placed orders for these products. A vehicle or a fleet of vehicles starts from the depot and visits the customers in order to satisfy their demands. Special attention has been given to the capacitated VRP, in which the vehicles have limited carrying capacity for the goods that must be delivered. In the present work, we present a specific capacitated stochastic vehicle routing problem with realistic applications to the distribution of materials to shops, healthcare facilities, or military units. A vehicle starts its route from a depot loaded with items of two similar but not identical products, which we call product 1 and product 2. The vehicle must deliver the products to N customers according to a predefined sequence: first customer 1 is serviced, then customer 2, then customer 3, and so on. The vehicle has a finite capacity, and after servicing all customers it returns to the depot. It is assumed that each customer prefers either product 1 or product 2 with known probabilities; the actual preference of each customer becomes known only when the vehicle visits the customer. It is also assumed that the quantity each customer demands is a random variable with a known distribution, and the actual demand is revealed upon the vehicle's arrival at the customer's site. The demand of each customer cannot exceed the vehicle capacity, and the vehicle is allowed during its route to return to the depot to restock with quantities of both products. The travel costs between consecutive customers and between the customers and the depot are known. If there is a shortage of the desired product, it is permitted to deliver the other product at a reduced price. The objective is to find the optimal routing strategy, i.e., the routing strategy that minimizes the expected total cost among all possible strategies. The optimal routing strategy can be found using a suitable stochastic dynamic programming algorithm. It can also be proved that the optimal routing strategy has a specific threshold-type structure, i.e., it is characterized by critical numbers. This structural result enables us to construct an efficient special-purpose dynamic programming algorithm that operates only over routing strategies having this structure. The findings of the present study lead us to the conclusion that the dynamic programming method can be a very useful tool for the solution of specific vehicle routing problems. A problem for future research could be the study of a similar stochastic vehicle routing problem in which the vehicle, instead of delivering products, collects them from the ordered customers.
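
A hedged sketch of such a stochastic dynamic program, simplified to a single product (the two-product preferences and reduced-price substitution from the abstract are omitted); the capacity, travel costs, and demand distribution below are illustrative:

```python
# Hedged single-product simplification of the DP described above.
# State: (next customer j, remaining load). Decision after serving a customer:
# go directly to the next customer, or detour via the depot to restock to capacity.
from functools import lru_cache

Q = 10                                         # vehicle capacity
N = 5                                          # customers served in the fixed order 1..N
demand_pmf = {1: 0.2, 2: 0.3, 3: 0.3, 4: 0.2}  # same demand distribution per customer
c_next = [4, 4, 5, 3, 6]                       # cost from depot (index 0) or customer j to customer j+1
c_depot = [0, 7, 6, 5, 6, 8]                   # cost between customer j and the depot

def serve(j, load):
    """Expected cost of serving customer j arriving with 'load', plus cost-to-go."""
    exp = 0.0
    for d, p in demand_pmf.items():
        if d <= load:
            exp += p * after(j, load - d)
        else:
            # route failure: round trip to the depot, refill, finish serving customer j
            exp += p * (2 * c_depot[j] + after(j, Q - (d - load)))
    return exp

@lru_cache(maxsize=None)
def after(j, load):
    """Optimal expected cost-to-go after serving customer j with 'load' left."""
    if j == N:
        return c_depot[N]                      # return to the depot and stop
    go_direct = c_next[j] + serve(j + 1, load)
    restock = c_depot[j] + c_depot[j + 1] + serve(j + 1, Q)
    return min(go_direct, restock)

total = c_next[0] + serve(1, Q)                # leave the depot fully loaded
print(f"expected optimal cost: {total:.2f}")
```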

Keywords: collection of similar products, dynamic programming, stochastic demands, stochastic preferences, vehicle routing problem

Procedia PDF Downloads 234
631 Use of Transportation Networks to Optimize the Profit Dynamics of Product Distribution

Authors: S. Jayasinghe, R. B. N. Dissanayake

Abstract:

Optimization modelling, together with network models and linear programming techniques, is a powerful tool for problem solving and decision making in real-world applications. This study developed a mathematical model to optimize net profit by minimizing transportation cost. The model covers transportation from decentralized production plants to a centralized distribution centre and then distribution to island-wide agencies, with customer satisfaction as a requirement. The company produces 9 types of food items in 82 different varieties and 4 types of non-food items in 34 different varieties. Of the 6 production plants, 4 are located near the city of Mawanella and the other 2 in Galewala and Anuradhapura, which are 80 km and 150 km from Mawanella, respectively. The warehouse located in Mawanella serves as the main production plant and the only distribution centre, and it distributes manufactured products to 39 agencies island-wide. Average values and quantities of goods for 6 consecutive months, from May 2013 to October 2013, were collected, and average demand values were calculated. The following constraints are used to satisfy the optimality conditions of the model: there is one source, 39 destinations, and supply and demand across all agencies are equal. Using the transport cost per kilometre, the total transport cost was calculated, and the model was then formulated using the distances and the distribution flows. Network optimization and linear programming techniques were used to formulate the model, and Excel Solver was used to solve it. Results showed that the company requires a total transport cost of Rs. 146,943,034.50 to fulfil customer requirements for a month, which is considerably lower than the cost incurred without the model. The model also showed that the company can reduce its transportation cost by 6% when distributing to island-wide customers. The company currently satisfies customer requirements at about 85%; this can be increased to 97% by using the model. Similar companies can therefore use this model to reduce their transportation costs.
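
A minimal sketch of the network side of such a model, posed as a minimum-cost flow in Python/networkx rather than Excel Solver; the plant supplies, agency demands, distances, and cost per kilometre are illustrative placeholders, not the company's data:

```python
# Illustrative minimum-cost flow from production plants through the Mawanella
# distribution centre to agencies. Quantities and distances are made up.
import networkx as nx

G = nx.DiGraph()

# Supplies are negative demands; agencies consume what the plants produce.
G.add_node("Mawanella_plant", demand=-60)
G.add_node("Galewala_plant", demand=-25)
G.add_node("Anuradhapura_plant", demand=-15)
G.add_node("distribution_centre", demand=0)
for name, qty in [("agency_A", 40), ("agency_B", 35), ("agency_C", 25)]:
    G.add_node(name, demand=qty)

cost_per_km = 0.5
# Plant -> distribution centre legs (distance in km drives the edge weight).
G.add_edge("Mawanella_plant", "distribution_centre", weight=int(5 * cost_per_km), capacity=60)
G.add_edge("Galewala_plant", "distribution_centre", weight=int(80 * cost_per_km), capacity=25)
G.add_edge("Anuradhapura_plant", "distribution_centre", weight=int(150 * cost_per_km), capacity=15)
# Distribution centre -> agency legs.
for name, km in [("agency_A", 60), ("agency_B", 120), ("agency_C", 200)]:
    G.add_edge("distribution_centre", name, weight=int(km * cost_per_km), capacity=100)

flow = nx.min_cost_flow(G)                 # dict of dicts: flow[u][v] = units shipped
total_cost = nx.cost_of_flow(G, flow)
print(flow)
print("total transport cost:", total_cost)
```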

Keywords: mathematical model, network optimization, linear programming

Procedia PDF Downloads 312
630 A Bibliometric Assessment of the Nexus Between Corporate Social Responsibility and Sustainable Development

Authors: Trilochana Dash, Chandan Kumar Sahoo

Abstract:

In today's environment of intensive industrialization, the role of business in societal modernization is critical. The concept of corporate social responsibility (CSR) arose from rising societal awareness of company conduct. Corporations that practice CSR devote a portion of their profits to society's sustainable development (SD), and together the concepts of CSR and SD have increased the impact of industries on society. In this study, a bibliometric analysis was conducted using the "R" programming language to determine the comprehensiveness of CSR and SD research. Bibliometric data from 2003 to 2022 were collected from two databases: Scopus and Web of Science (WOS). According to the findings, CSR and SD research has risen exponentially in the past two decades, and "Corporate Social Responsibility and Environmental Management" emerged as the most influential journal in this field. The findings also show that relatively few researchers have collaborated on CSR and SD research over the last twenty years. Most CSR and SD research is conducted in developed countries and in developing countries undergoing rapid industrialization. Thematic evolution and cluster analysis clearly show that the notion of CSR and SD has been popular among scholars over the last two decades. Finally, limitations and future directions are discussed.

Keywords: corporate social responsibility, sustainable development, bibliometric analysis, “R” programming language, visualization, holistic picture

Procedia PDF Downloads 52
629 Two Efficient Heuristic Algorithms for the Integrated Production Planning and Warehouse Layout Problem

Authors: Mohammad Pourmohammadi Fallah, Maziar Salahi

Abstract:

In the literature, a mixed-integer linear programming model has been proposed for the integrated production planning and warehouse layout problem. To solve the model, its authors proposed a Lagrangian relax-and-fix heuristic that takes a significant amount of time to terminate, with gaps above 5% for large-scale instances. Here, we present two heuristic algorithms to solve the problem. In the first one, we use a greedy approach: warehouse locations with lower reservation costs and lower transportation costs (from the production area to the locations and from the locations to the output point) are allocated to items with higher demands, and a smaller model is then solved. In the second heuristic, we first sort items in descending order according to the ratio of the sum of that item's demands over the time horizon plus its maximum demand over the horizon to the sum of all its demands over the horizon. We then categorize the sorted items into groups of 3, 4, or 5 and solve a small-scale optimization problem for each group, hoping to improve the solution of the first heuristic. Our preliminary numerical results show the effectiveness of the proposed heuristics.
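
A toy sketch of the greedy idea in the first heuristic, with illustrative costs and demands (the subsequent reduced lot-sizing model is not shown):

```python
# Give the cheapest warehouse locations to the items with the highest demand.
# (reservation cost, transport cost production->location, transport cost location->output)
locations = {
    "L1": (10.0, 2.0, 1.0),
    "L2": (8.0, 4.0, 3.0),
    "L3": (12.0, 1.0, 2.0),
    "L4": (6.0, 5.0, 5.0),
}
item_demand = {"item_A": 120, "item_B": 300, "item_C": 80, "item_D": 150}

def location_cost(costs):
    reservation, inbound, outbound = costs
    return reservation + inbound + outbound

cheap_locations = sorted(locations, key=lambda name: location_cost(locations[name]))
busy_items = sorted(item_demand, key=item_demand.get, reverse=True)

# Pair the busiest item with the cheapest location, and so on down both lists;
# a smaller planning model would then be solved with these assignments fixed.
assignment = dict(zip(busy_items, cheap_locations))
for item, loc in assignment.items():
    print(f"{item} (demand {item_demand[item]}) -> {loc} "
          f"(combined cost {location_cost(locations[loc]):.1f})")
```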

Keywords: capacitated lot-sizing, warehouse layout, mixed-integer linear programming, heuristic algorithms

Procedia PDF Downloads 152
628 Left to Right-Right Most Parsing Algorithm with Lookahead

Authors: Jamil Ahmed

Abstract:

The Left-to-Right, Rightmost-derivation (LR) parsing algorithm is a widely used algorithm for syntax analysis. It relies on a parsing table, which is extracted from the grammar and specifies the actions to be taken during parsing. The algorithm requires that the parsing table contain no action conflicts for the same input symbol, a requirement that restricts the class of grammars over which LR algorithms work. However, there are grammars whose parsing tables do hold action conflicts. In such cases, the algorithm needs the capability of scanning (looking ahead at) input symbols beyond the current input symbol. In this paper, a 'Left to Right'-'Right Most' parsing algorithm with lookahead capability is introduced; this 'look-ahead' capability is the major contribution of the paper. The practicality of the proposed algorithm is substantiated by a parser implementation for the context-free grammar (CFG) of a previously proposed programming language, 'State Controlled Object Oriented Programming' (SCOOP). SCOOP's context-free grammar has 125 productions and 192 item sets. The proposed algorithm parses SCOOP even though the grammar requires looking ahead at input symbols due to action conflicts in its parsing table. The proposed LR parsing algorithm with lookahead capability can be viewed as an optimization of the 'Simple Left to Right'-'Right Most' (SLR) parsing algorithm.

Keywords: left to right-right most parsing, syntax analysis, bottom-up parsing algorithm

Procedia PDF Downloads 85
627 Resource-Constrained Assembly Line Balancing Problems with Multi-Manned Workstations

Authors: Yin-Yann Chen, Jia-Ying Li

Abstract:

Assembly line balancing problems can be categorized as one-sided, two-sided, or multi-manned according to the number of operators deployed at each workstation. This study explores the balancing problem of a resource-constrained assembly line with multi-manned workstations, where resources include machines or tools such as jigs, fixtures, and hand tools. A mathematical programming model was developed to support decision-making and planning, minimizing the numbers of workstations, resources, and operators in order to achieve optimal production efficiency. To improve solution-finding efficiency, a genetic algorithm (GA) and a simulated annealing algorithm (SA) were designed and applied to a practical case in car manufacturing. The results of the GA/SA and of the mathematical programming model were compared to verify their validity. Finally, the target values, production efficiency, and deployment combinations provided by the algorithms were analyzed and compared, so that the results of this study can serve as a reference for decision-making on production deployment.
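
A toy sketch of the simulated-annealing idea only, under strong simplifying assumptions (no precedence relations, single-operator stations, and no tool or jig resources); the task times and cycle time are illustrative:

```python
# Search over task orderings, pack tasks into stations up to the cycle time,
# and minimize the number of stations opened.
import math
import random

task_times = [6, 3, 7, 2, 5, 4, 6, 3, 5, 4]   # illustrative processing times
cycle_time = 12

def stations_needed(order):
    stations, load = 1, 0
    for t in order:
        if load + task_times[t] > cycle_time:
            stations += 1
            load = 0
        load += task_times[t]
    return stations

def simulated_annealing(iterations=5000, temp=5.0, cooling=0.999):
    current = list(range(len(task_times)))
    random.shuffle(current)
    best = current[:]
    for _ in range(iterations):
        i, j = random.sample(range(len(current)), 2)
        candidate = current[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]
        delta = stations_needed(candidate) - stations_needed(current)
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if stations_needed(current) < stations_needed(best):
                best = current[:]
        temp *= cooling
    return best

best_order = simulated_annealing()
print("best order:", best_order, "stations:", stations_needed(best_order))
```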

Keywords: heuristic algorithms, line balancing, multi-manned workstation, resource-constrained

Procedia PDF Downloads 171
626 Load Forecasting Using Neural Network Integrated with Economic Dispatch Problem

Authors: Mariyam Arif, Ye Liu, Israr Ul Haq, Ahsan Ashfaq

Abstract:

The high cost of fossil fuels and the intensifying installation of alternative energy generation sources are among the main challenges in power systems. This makes accurate load forecasting an important and challenging task for optimal energy planning and management on both the distribution and generation sides. There are many techniques to forecast load, but each comes with its own limitations and requires suitable data to predict the load accurately. The Artificial Neural Network (ANN) is one such technique for efficient load forecasting. Two different ranges of input data were compared using a dynamic ANN technique in the MATLAB Neural Network Toolbox. It was observed that the selection of input data for training a network has a significant effect on the forecast results: day-wise input data forecast the load accurately compared with year-wise input data. The forecast load is then distributed among six generators using linear programming to find the optimal point of generation. The algorithm is verified by comparing the results for each generator against its respective generation limits.
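
A hedged sketch combining the two stages described above in Python (scikit-learn and scipy stand in for the MATLAB toolboxes); the load history, generator costs, and limits are synthetic placeholders:

```python
# Small neural-network load forecast followed by a linear-programming dispatch
# among six generators. Data are synthetic and illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import linprog

# Synthetic day-wise history: hour of day -> load (MW), with some noise.
rng = np.random.default_rng(0)
hours = np.tile(np.arange(24), 30).reshape(-1, 1)
load = 400 + 150 * np.sin((hours.ravel() - 6) * np.pi / 12) + rng.normal(0, 10, hours.size)

model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
model.fit(hours, load)

next_day = np.arange(24).reshape(-1, 1)
forecast = model.predict(next_day)

# Economic dispatch for one forecast hour: minimize generation cost subject to
# meeting the forecast load within each generator's limits.
cost = [12.0, 15.0, 20.0, 25.0, 32.0, 40.0]        # $/MWh, illustrative
p_min = [50, 40, 30, 20, 10, 0]
p_max = [200, 180, 150, 120, 100, 80]

demand = float(forecast[18])                        # dispatch the 18:00 forecast
res = linprog(c=cost,
              A_eq=[[1] * 6], b_eq=[demand],
              bounds=list(zip(p_min, p_max)))
print("forecast load at 18:00: %.1f MW" % demand)
print("dispatch (MW):", np.round(res.x, 1))
```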

Keywords: artificial neural networks, demand-side management, economic dispatch, linear programming, power generation dispatch

Procedia PDF Downloads 160
625 Customer Experience Management in Food and Beverage Outlet at Indian School of Business: Methodology and Recommendations

Authors: Anupam Purwar

Abstract:

In the conventional consumer products industry, stockouts are handled by carrying buffer stock to guard against under-serving caused by changes in customer demand, incorrect forecasts, or variability in lead times. For food outlets, however, carrying buffer stock is not viable because of the need to serve freshly cooked meals. Moreover, a food outlet that is the sole provider has little incentive to reduce stockouts, as it does not fear losing revenue, gross profit, customers, or market share. Hence, innovative, practical, and easy-to-implement ways of addressing the twin problems of long queues and poor customer experience need to be investigated. The current work analyses the demand pattern of 11 different food items across a routine day. Based on this, optimal resource allocation across all food items is carried out by solving a linear programming problem with cost minimization as the objective. Concurrently, recommendations have been devised to address this demand- and supply-side problem, keeping their practicability in mind. The recommendations are currently being discussed and implemented at the ISB (Indian School of Business) Hyderabad campus.

Keywords: F&B industry, resource allocation, demand management, linear programming, LP, queuing analysis

Procedia PDF Downloads 107
624 A Parallel Computation Based on GPU Programming for a 3D Compressible Fluid Flow Simulation

Authors: Sugeng Rianto, P.W. Arinto Yudi, Soemarno Muhammad Nurhuda

Abstract:

Computing a 3D compressible fluid flow for a virtual environment with haptic interaction is a non-trivial issue, particularly in achieving good performance and balance between visualization, tactile feedback interaction, and computation. In this paper, we describe our approach based on parallel programming on a GPU. The 3D fluid flow solvers were developed for smoke dispersion simulation by combining cubic interpolated propagation (CIP) based fluid flow solvers with the parallelism and programmability of the GPU. The fluid flow solver operates in a GPU-CPU message-passing scheme to allow rapid development of haptic feedback modes for fluid dynamic data. Applying the CIP scheme yields a rapid fluid flow solution in which the multiphase fluid flow equations can be solved simultaneously. To accelerate the computation further, the Navier-Stokes equations (NSEs) are packed into texel channels, so that the computation is performed on pixels that can be regarded as a grid of cells. Therefore, despite the complexity of the obstacle geometry, multiple vertices and pixels can be processed simultaneously in parallel. The data are also shared in global memory so that the CPU can control the haptics and provide kinaesthetic interaction and feeling. The results show that the GPU-based parallel computation approach provides an effective simulation of a compressible fluid flow model for real-time interaction in 3D computer graphics on a PC platform. This report demonstrates the feasibility of a new approach to solving the compressible fluid flow equations on the GPU. The experimental tests showed that compressible flow around several model obstacles, with haptic interaction, can be simulated effectively and efficiently at a reasonable frame rate with realistic visualization. These results confirm that good performance and balance between visualization, tactile feedback interaction, and computation can be achieved.

Keywords: CIP, compressible fluid, GPU programming, parallel computation, real-time visualisation

Procedia PDF Downloads 401
623 A Mixed Integer Programming Model for Optimizing the Layout of an Emergency Department

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

In recent years, demand for healthcare services has increased dramatically. As demand for healthcare services grows, so does the need to construct new healthcare buildings and to redesign and renovate existing ones. Increasing demand necessitates the use of optimization techniques to improve overall service efficiency in healthcare settings; however, the high complexity of care processes remains the major challenge to accomplishing this goal. This study proposes a method based on process mining results to address the high complexity of care processes and to find the optimal layout of the various medical centers in an emergency department. The ProM framework is used to discover clinical pathway patterns and relationships between activities. The sequence clustering plug-in is used to remove infrequent events and to derive the process model in the form of a Markov chain. The process mining results serve as input for the next phase, the development of the optimization model. Comparing the current ED design with the one obtained from the proposed method indicates that a carefully designed layout can significantly decrease the distances that patients must travel.

Keywords: mixed integer programming, facility layout problem, process mining, healthcare operations management

Procedia PDF Downloads 307
622 A Hybrid Expert System for Generating Stock Trading Signals

Authors: Hosein Hamisheh Bahar, Mohammad Hossein Fazel Zarandi, Akbar Esfahanipour

Abstract:

In this paper, a hybrid expert system is developed using fuzzy genetic network programming with reinforcement learning (GNP-RL). The frame-based structure of the system uses trading rules extracted by GNP; these rules are extracted using technical indices of the stock prices in the training period. In developing this system, we applied fuzzy node transition and decision making in both the processing and judgment nodes of GNP-RL. Consequently, these methods not only increased the accuracy of node transition and decision making in GNP's nodes but also extended GNP's binary signals to ternary trading signals. In other words, in the proposed fuzzy GNP-RL model, a No Trade signal is added to the conventional Buy and Sell signals. Finally, the obtained rules are used in a frame-based system implemented in Kappa-PC software. The developed trading system has been used to generate trading signals for ten companies listed on the Tehran Stock Exchange (TSE). The simulation results over the testing period show that the developed system performs more favorably than the Buy and Hold strategy.

Keywords: fuzzy genetic network programming, hybrid expert system, technical trading signal, Tehran stock exchange

Procedia PDF Downloads 294
621 Blockchain-Resilient Framework for Cloud-Based Network Devices within the Architecture of Self-Driving Cars

Authors: Mirza Mujtaba Baig

Abstract:

Artificial Intelligence (AI) is evolving rapidly, and one of the areas it has influenced is automation. The automobile, healthcare, education, and robotics industries deploy AI technologies constantly, and the automation of tasks frees time for knowledge-based work and introduces convenience to everyday human endeavors. The paper reviews the challenges faced by current implementations of autonomous self-driving cars by exploring the machine learning, robotics, and artificial intelligence techniques employed in the development of this innovation. The controversy surrounding the development and deployment of autonomous machines, e.g., vehicles, calls for an exploration of how their programming modules are configured. This paper seeks to add to the body of knowledge and to assist researchers in decreasing the inconsistencies in current programming modules. Blockchain is a technology whose applications are mostly found in the financial, pharmaceutical, manufacturing, and artificial intelligence domains. Registering events in a secure manner, as well as applying the external algorithms required for data analytics, is especially helpful for integrating, adapting, maintaining, and extending to new domains, particularly predictive analytics applications.

Keywords: artificial intelligence, automation, big data, self-driving cars, machine learning, neural networking algorithm, blockchain, business intelligence

Procedia PDF Downloads 86
620 The Mathematics of Fractal Art: Using a Derived Cubic Method and the Julia Programming Language to Make Fractal Zoom Videos

Authors: Darsh N. Patel, Eric Olson

Abstract:

Fractals can be found everywhere, whether in the shape of a leaf or a system of blood vessels. They are used to help study and understand different physical and mathematical processes; however, their artistic nature is also beautiful simply to explore. This project explores fractals generated by a cubically convergent extension of Newton's method. With this iteration as a starting point, a complex plane spanning from -2 to 2 is created with a color wheel mapped onto it. Next, the polynomial from whose roots the fractal is generated is established. By the Fundamental Theorem of Algebra, any polynomial has as many roots (counted by multiplicity) as its degree. When generating the fractals, each root receives its own color, and the complex plane can then be colored to indicate the basins of attraction that converge to each root. From a computational point of view, the project's code identifies which points converge to which roots and then produces fractal images. A zoom path into the fractal was implemented to make the self-similar structure easy to visualize; the path was obtained by selecting keyframes at different magnifications and interpolating a path through them. Using parallel processing, many images were generated and assembled into a video. This project illustrates how practical techniques used for scientific visualization can also have an artistic side.
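
A hedged sketch in Python rather than Julia: the paper's derived cubic method is not reproduced here, so Halley's iteration (a standard cubically convergent extension of Newton's method) is used as a stand-in on p(z) = z^3 - 1, coloring each pixel by the root its orbit approaches:

```python
# Basins of attraction of z^3 - 1 under Halley's cubically convergent iteration.
import numpy as np
import matplotlib.pyplot as plt

def p(z):   return z**3 - 1
def dp(z):  return 3 * z**2
def d2p(z): return 6 * z

roots = np.array([1, -0.5 + 0.8660254j, -0.5 - 0.8660254j])

n = 800
re, im = np.meshgrid(np.linspace(-2, 2, n), np.linspace(-2, 2, n))
z = re + 1j * im

for _ in range(30):
    f, f1, f2 = p(z), dp(z), d2p(z)
    denom = 2 * f1**2 - f * f2
    denom[denom == 0] = 1e-20                  # avoid division by zero
    z = z - 2 * f * f1 / denom                 # Halley's step

# The index of the nearest root decides the pixel color.
basin = np.argmin(np.abs(z[..., None] - roots), axis=-1)
plt.imshow(basin, extent=(-2, 2, -2, 2), cmap="viridis")
plt.title("Basins of attraction of z^3 - 1 under Halley's method")
plt.savefig("halley_basins.png", dpi=150)
```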

Keywords: fractals, cubic method, Julia programming language, basin of attraction

Procedia PDF Downloads 218
619 The Development of Online Lessons in Integration Model

Authors: Chalermpol Tapsai

Abstract:

The objectives of this research were to develop integrated online lessons, determine their efficiency, investigate how the lessons were used, and examine the relationship between learners' background knowledge and achievement after learning with the lessons. The sample group consisted of 97 students randomly selected from the 121 students registered in 1/2012 at Trimitwittayaram Learning Center. Stratified sampling was employed to form 4 groups according to proficiency: high, moderate, low, and no prior knowledge. The research instruments included online lessons in the integration model on the topic of Java programming, a test after each lesson, an achievement test at the end of the course, and questionnaires measuring learners' satisfaction. The results showed that the efficiency of the online lessons was 90.20/89.18, with achievement after learning with the lessons significantly higher than before at the 0.05 level. Moreover, learners' background knowledge of programming showed a positive relationship with learning achievement, significant at the 0.05 level. Learners with high background knowledge used fewer exercises and examples than those with lower background knowledge, while learners in the moderate and low groups did not differ significantly in their use of examples and exercises.

Keywords: integration model, online lessons, learners’ background knowledge, efficiency

Procedia PDF Downloads 336
618 Assignment of Airlines Technical Members under Disruption

Authors: Walid Moudani

Abstract:

The Crew Reserve Assignment Problem (CRAP) considers the assignment of crew members to a set of reserve activities covering all scheduled flights, so as to ensure a continuous plan in which operating costs are minimized while the solution meets hard constraints resulting from civil aviation safety regulations as well as from the airlines' internal agreements. The problem considered in this study is of great interest to airlines and may have important consequences for service quality and for the economic return of operations. In this communication, a new mathematical formulation of the CRAP is proposed that takes the regulations and internal agreements into account. While current solutions make use of artificial intelligence techniques run on mainframe computers, a low-cost approach is proposed to provide efficient on-line solutions under perturbed operating conditions. The proposed solution method uses a dynamic programming approach for the duty scheduling problem; when applied to the case of a medium-sized airline it provides efficient solutions and shows good potential acceptability by the operations staff. This optimization scheme can therefore be considered the core of an on-line decision support system for crew reserve assignment operations management.

Keywords: airlines operations management, combinatorial optimization, dynamic programming, crew scheduling

Procedia PDF Downloads 332
617 JaCoText: A Pretrained Model for Java Code-Text Generation

Authors: Jessica Lopez Espejel, Mahaman Sanoussi Yahaya Alassan, Walid Dahhane, El Hassane Ettifouri

Abstract:

Pretrained transformer-based models have shown high performance in natural language generation tasks. A new wave of interest has now surged around automatic programming language code generation, the task of translating natural language instructions into source code. Although well-known pretrained language generation models have achieved good performance in learning programming languages, further effort is still needed in automatic code generation. In this paper, we introduce JaCoText, a model based on the Transformer neural network that aims to generate Java source code from natural language text. JaCoText leverages the advantages of both natural language and code generation models. More specifically, we study findings from the state of the art and use them to (1) initialize our model from powerful pretrained models, (2) explore additional pretraining on our Java dataset, (3) conduct experiments combining unimodal and bimodal data in training, and (4) scale the input and output lengths during fine-tuning of the model. Experiments conducted on the CONCODE dataset show that JaCoText achieves new state-of-the-art results.
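
A hedged sketch of the general workflow in Python with Hugging Face Transformers; "Salesforce/codet5-base" is an assumed stand-in checkpoint (not the JaCoText weights), and without fine-tuning on CONCODE its raw generations will not match the results reported above:

```python
# Load a generic pretrained sequence-to-sequence code model and generate Java
# from a natural-language prompt. The checkpoint name is a stand-in assumption.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "Salesforce/codet5-base"          # assumed publicly available stand-in
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

prompt = "Generate Java: return the maximum value stored in an int array"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(**inputs, max_length=96, num_beams=4, early_stopping=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```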

Keywords: java code generation, natural language processing, sequence-to-sequence models, transformer neural networks

Procedia PDF Downloads 234
616 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming

Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero

Abstract:

Tumor growth, from a transformed cancer cell up to a clinically apparent mass, spans a range of spatial and temporal magnitudes. Through computer simulations, cellular automata (CA) can accurately describe the complexity of tumor development. With appropriate CA-based software tools, tumor development prognoses could be made without subjecting patients to burdensome medical examinations or painful invasive procedures. In silico testing refers mainly to computational biology studies applied to clinical actions in medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time compared with carrying out experiments in vitro in labs or in vivo with living cells and organisms, and it aims to produce scientifically relevant results relative to traditional in vitro testing, which is slow, expensive, and generally lacks acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic or stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is only beginning to be developed by the research community. Stochastic cellular automata (SCA), whose parallel programming implementations can yield high computational performance, are of great interest and worth exploring up to their computational limits. There have been approaches based on optimizations to advance multiparadigm models of tumor growth, which mainly pursue improved performance through guaranteed efficient memory accesses or by considering the dynamic evolution of the memory space (grids, trees, etc.) that holds crucial simulation data. In our opinion, these optimizations are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework for developing new programming techniques to speed up simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up obtained from specific parallel constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is tested using implementations in Java and C++ on two different platforms: an Intel Core i-X chipset and an HPC cluster at our university. The parallelization of the Polesczuk and Enderling model (commonly used by researchers in mathematical oncology) proposed here is analyzed with respect to the performance gain. We intend to apply the model and the overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, the growth inhibition induced by chemotaxis, and the effect of therapies based on the presence of cytotoxic/cytostatic drugs.

Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up

Procedia PDF Downloads 209
615 Prediction of Temperature Distribution during Drilling Process Using Artificial Neural Network

Authors: Ali Reza Tahavvor, Saeed Hosseini, Nazli Jowkar, Afshin Karimzadeh Fard

Abstract:

Experimental and numerical study of the temperature distribution during the milling process is important for milling quality and tool life. In the present study, the milling cross-section temperature is determined using Artificial Neural Networks (ANN), based on the temperatures of certain points of the workpiece, the specifications of those points, and the rotational speed of the milling blade. First, a three-dimensional model of the workpiece is constructed; then, using Computational Heat Transfer (CHT) simulations, the temperatures at different nodes of the workpiece are obtained under steady-state conditions. The results obtained from CHT are used for training and testing the ANN. Using reverse engineering and setting the desired x, y, z coordinates and the blade rotational speed as input data to the network, the milling surface temperature determined by the neural network is produced as output. The temperatures at the desired points for different blade rotational speeds are obtained experimentally, and the milling surface temperature is estimated by extrapolation. A comparison among the ANN results, the CHT results, and the experimental data shows that the ANN code can be used efficiently to determine the temperature in a milling process.
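
A hedged sketch of the surrogate-model step in Python with scikit-learn; the training data below are synthetic stand-ins for the CHT simulation results, and the coordinate ranges and speeds are illustrative:

```python
# Train a small neural network to map (x, y, z, rotational speed) to temperature.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_samples = 2000
X = np.column_stack([
    rng.uniform(0, 50, n_samples),      # x coordinate (mm)
    rng.uniform(0, 50, n_samples),      # y coordinate (mm)
    rng.uniform(0, 10, n_samples),      # z coordinate (mm)
    rng.uniform(500, 3000, n_samples),  # blade rotational speed (rpm)
])
# Synthetic stand-in for CHT temperatures: hotter near the cutting zone and at higher rpm.
distance = np.sqrt((X[:, 0] - 25) ** 2 + (X[:, 1] - 25) ** 2 + X[:, 2] ** 2)
T = 25 + 0.05 * X[:, 3] * np.exp(-distance / 15) + rng.normal(0, 2, n_samples)

X_train, X_test, T_train, T_test = train_test_split(X, T, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                                   random_state=0))
model.fit(X_train, T_train)
print("R^2 on held-out points:", round(model.score(X_test, T_test), 3))
print("T at (x=25, y=25, z=1, 2000 rpm):",
      round(float(model.predict([[25, 25, 1, 2000]])[0]), 1), "°C")
```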

Keywords: artificial neural networks, milling process, rotational speed, temperature

Procedia PDF Downloads 370
614 Evolving Credit Scoring Models using Genetic Programming and Language Integrated Query Expression Trees

Authors: Alexandru-Ion Marinescu

Abstract:

There exists a plethora of methods in the scientific literature that tackle the well-established task of credit score evaluation. In its most abstract form, a credit scoring algorithm takes as input several credit applicant properties, such as age, marital status, employment status, loan duration, etc., and must output a binary response variable (i.e., "GOOD" or "BAD") stating whether the client is susceptible to payment return delays. Data imbalance is a common occurrence in financial institution databases, with the majority classified as "GOOD" clients (clients who respect the loan return calendar) alongside a small percentage of "BAD" clients. But it is the "BAD" clients we are interested in, since accurately predicting their behavior is crucial in preventing unwanted losses for loan providers. We add to this context the constraint that the algorithm must yield an actual, tractable mathematical formula, which is friendlier towards financial analysts. To this end, we have turned to genetic algorithms and genetic programming, aiming to evolve actual mathematical expressions using specially tailored mutation and crossover operators. As far as data representation is concerned, we employ a very flexible mechanism, LINQ expression trees, readily available in the C# programming language, which enable us to construct executable pieces of code at runtime. As the title implies, they model trees, with intermediate nodes being operators (addition, subtraction, multiplication, division) or mathematical functions (sin, cos, abs, round, etc.) and leaf nodes storing either constants or variables. There is a one-to-one correspondence between the client properties and the formula variables. The mutation and crossover operators work on a flattened version of the tree, obtained via a pre-order traversal. A consequence of our chosen technique is that we can identify and discard client properties that do not take part in the final score evaluation, effectively acting as a dimensionality reduction scheme. We compare ourselves with state-of-the-art approaches, such as support vector machines, Bayesian networks, and extreme learning machines, to name a few. The data sets we benchmark against amount to a total of 8, among them the well-known Australian and German credit data sets, and the performance indicators are the following: percentage correctly classified, area under curve, partial Gini index, H-measure, Brier score, and Kolmogorov-Smirnov statistic. Finally, we obtain encouraging results which, although placing us in the lower half of the hierarchy, drive us to further refine the algorithm.
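
A toy sketch of the evolutionary idea in Python (nested tuples stand in for the C# LINQ expression trees, and crossover is omitted for brevity); the applicant data and ground-truth rule are synthetic:

```python
# Evolve an arithmetic formula over applicant features; classify "BAD" when the
# formula's value exceeds zero. Mutation-only genetic programming sketch.
import random

random.seed(0)
FEATURES = ["age", "income", "loan_duration", "num_late_payments"]
OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b, "/": lambda a, b: a / b if abs(b) > 1e-9 else 1.0}

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(FEATURES + [round(random.uniform(-2, 2), 2)])
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, row):
    if isinstance(tree, tuple):
        op, left, right = tree
        return OPS[op](evaluate(left, row), evaluate(right, row))
    return row[tree] if isinstance(tree, str) else tree

def mutate(tree, depth=3):
    if random.random() < 0.2 or not isinstance(tree, tuple):
        return random_tree(depth)
    op, left, right = tree
    return (op, mutate(left, depth - 1), mutate(right, depth - 1))

def accuracy(tree, data):
    return sum((evaluate(tree, row) > 0) == row["bad"] for row in data) / len(data)

# Synthetic applicants: more late payments and longer loans make a "BAD" client likelier.
data = []
for _ in range(300):
    row = {"age": random.uniform(20, 70), "income": random.uniform(1, 10),
           "loan_duration": random.uniform(6, 60),
           "num_late_payments": random.randint(0, 8)}
    row["bad"] = row["num_late_payments"] * 6 + row["loan_duration"] * 0.2 - row["income"] > 20
    data.append(row)

population = [random_tree() for _ in range(60)]
for generation in range(30):
    population.sort(key=lambda t: accuracy(t, data), reverse=True)
    survivors = population[:20]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

best = max(population, key=lambda t: accuracy(t, data))
print("best formula:", best)
print("training accuracy:", round(accuracy(best, data), 3))
```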

Keywords: expression trees, financial credit scoring, genetic algorithm, genetic programming, symbolic evolution

Procedia PDF Downloads 89
613 The Impact of Distributed Epistemologies on Software Engineering

Authors: Thomas Smith

Abstract:

Many hackers worldwide would agree that, had it not been for linear-time theory, the refinement of Byzantine fault tolerance might never have occurred. After years of significant research into extreme programming, we validate the refinement of simulated annealing. Maw, our new framework for unstable theory, is the solution to all of these issues.

Keywords: distributed, software engineering, DNS, DHCP

Procedia PDF Downloads 319
612 A Heuristic Based Decomposition Approach for a Hierarchical Production Planning Problem

Authors: Nusrat T. Chowdhury, M. F. Baki, A. Azab

Abstract:

The production planning problem is concerned with specifying the optimal quantities to produce in order to meet demand over a prespecified planning horizon with the least possible expenditure. Making the right decisions in production planning directly affects the performance and productivity of a manufacturing firm, which is important for its ability to compete in the market. Therefore, developing and improving solution procedures for production planning problems is very significant. In this paper, we develop a Dantzig-Wolfe decomposition of a multi-item hierarchical production planning problem with a capacity constraint and present a column generation approach to solve it. The original mixed-integer linear programming model is decomposed, item by item, into a master problem and a number of subproblems, with the capacity constraint acting as the linking constraint between them. The subproblems are solved using a dynamic programming approach. We also propose a multi-step iterative capacity allocation heuristic to handle any infeasibility that arises while solving the problem. We compare the computational performance of the developed solution approach against the state-of-the-art heuristic procedure available in the literature. The results show that the proposed heuristic-based decomposition approach improves solution quality by 20% compared with the literature.
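
A hedged sketch of the kind of single-item dynamic program such a decomposition typically solves as its subproblem: an uncapacitated lot-sizing recursion (Wagner-Whitin style), with the linking capacity constraint left to the master problem; the costs and demands are illustrative:

```python
# Single-item uncapacitated lot-sizing solved by dynamic programming.
def single_item_lot_sizing(demand, setup_cost, holding_cost):
    """Return the minimum cost and the production plan for one item."""
    T = len(demand)
    best = [float("inf")] * (T + 1)      # best[t] = min cost to cover periods 0..t-1
    best[0] = 0.0
    choice = [0] * (T + 1)
    for t in range(1, T + 1):
        for s in range(t):               # produce in period s for periods s..t-1
            holding = sum(holding_cost * (k - s) * demand[k] for k in range(s, t))
            cost = best[s] + setup_cost + holding
            if cost < best[t]:
                best[t], choice[t] = cost, s
    # Recover the production plan by walking back through the choices.
    plan, t = [], T
    while t > 0:
        s = choice[t]
        plan.append((s, sum(demand[s:t])))   # (production period, lot size)
        t = s
    return best[T], sorted(plan)

demand = [20, 50, 10, 50, 50, 10]
cost, plan = single_item_lot_sizing(demand, setup_cost=100, holding_cost=1)
print("minimum cost:", cost)
print("produce (period, quantity):", plan)
```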

Keywords: inventory, multi-level capacitated lot-sizing, emission control, setup carryover

Procedia PDF Downloads 102
611 Multi-Stage Multi-Period Production Planning in Wire and Cable Industry

Authors: Mahnaz Hosseinzadeh, Shaghayegh Rezaee Amiri

Abstract:

This paper presents a methodology for the serial production planning problem in the wire and cable manufacturing process, addressing the problem of input-output imbalance between consecutive stations with the aim of minimizing machine idle time at each stage. To this end, a linear goal programming (GP) model is developed in which four main categories of constraints are considered: the number of runs per machine, machine sequences, acceptable end-of-period inventories for each machine, and the fulfillment of customers' orders. The model is formulated based on real data obtained from the IKO TAK Company, an important supplier of wire and cable for the oil and gas and automotive industries in Iran. By solving the model in GAMS, the optimal number of runs, the end-of-period inventories, and the minimum achievable idle time for each machine are calculated. Applying the numerical results in the target company reduced the lead time of end-product delivery to customers by 20%, demonstrating the efficiency of the proposed model and solution. Accordingly, the developed model can easily be applied in wire and cable companies for optimal production planning to reduce machine idle time across the manufacturing stages.

Keywords: goal programming approach, GP, production planning, serial manufacturing process, wire and cable industry

Procedia PDF Downloads 127
610 Supercomputer Simulation of Magnetic Multilayers Films

Authors: Vitalii Yu. Kapitan, Aleksandr V. Perzhu, Konstantin V. Nefedev

Abstract:

The necessity of studying magnetic multilayer structures is explained by the prospects of their practical application as a technological base for new storage media. Magnetic multilayer films have many unique features that contribute to increasing the density of information recording and the speed of storage devices. Multilayer structures consist of alternating magnetic and nonmagnetic layers. Within the framework of the classical Heisenberg model, lattice spin systems with direct short- and long-range exchange interactions were investigated by Monte Carlo methods. The thermodynamic characteristics of multilayer structures, such as the temperature behavior of the magnetization, energy, and heat capacity, were investigated, as were the magnetization reversal processes of multilayer structures in external magnetic fields. The developed software is based on the new, promising programming language Rust, developed by Mozilla and positioned as an alternative to C and C++. For the Monte Carlo simulation, the Metropolis algorithm, its parallel implementation using MPI, and the Wang-Landau algorithm were used. We plan to study magnetic multilayer films with the antisymmetric Dzyaloshinskii-Moriya (DM) interaction, interface effects, and skyrmion textures. This work was supported by the state task of the Ministry of Education and Science of Russia, project # 3.7383.2017/8.9.
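
A hedged sketch in Python (the authors' software is written in Rust): a Metropolis Monte Carlo update for the classical Heisenberg model on a small two-dimensional lattice with nearest-neighbour exchange; the layered geometry, long-range exchange, MPI parallelism, and Wang-Landau sampling are omitted, and all parameters are illustrative.

```python
# Metropolis sweep for classical Heisenberg spins (unit 3-vectors) on an L x L lattice.
import numpy as np

rng = np.random.default_rng(42)
L, J, T = 16, 1.0, 1.5                     # lattice size, exchange constant, temperature

def random_unit_vectors(shape):
    v = rng.normal(size=shape + (3,))
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

spins = random_unit_vectors((L, L))

def site_energy(spins, i, j):
    """Exchange energy of site (i, j) with its four nearest neighbours."""
    s = spins[i, j]
    neighbours = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                  spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
    return -J * np.dot(s, neighbours)

def metropolis_sweep(spins, T):
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        old = spins[i, j].copy()
        e_old = site_energy(spins, i, j)
        spins[i, j] = random_unit_vectors(())       # propose a fresh random direction
        delta = site_energy(spins, i, j) - e_old
        if delta > 0 and rng.random() >= np.exp(-delta / T):
            spins[i, j] = old                        # reject the move
    return spins

for sweep in range(200):
    metropolis_sweep(spins, T)

magnetization = np.linalg.norm(spins.mean(axis=(0, 1)))
print(f"|M| per spin after 200 sweeps at T={T}: {magnetization:.3f}")
```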

Keywords: The Monte Carlo methods, Heisenberg model, multilayer structures, magnetic skyrmion

Procedia PDF Downloads 135
609 Static Analysis of Security Issues of the Python Packages Ecosystem

Authors: Adam Gorine, Faten Spondon

Abstract:

Python is considered the most popular programming language and offers its own ecosystem for archiving and maintaining open-source software packages, the Python Package Index (PyPI). Unfortunately, one-third of these software packages have vulnerabilities that allow attackers to execute code automatically when a vulnerable or malicious package is installed. This paper contributes to large-scale empirical studies investigating security issues in the Python ecosystem by evaluating package vulnerabilities. The results provide a series of implications that can help the security of software ecosystems by improving the processes for discovering, fixing, and managing package vulnerabilities. The vulnerability dataset is generated using the NVD (National Vulnerability Database) and the Snyk vulnerability dataset. We evaluated 807 vulnerability reports in the NVD and 3,900 publicly known security vulnerabilities in packages installed via the Python package manager (pip) from the Snyk database, covering 2002 to 2022. The results show that many Python vulnerabilities are of high severity, followed by medium severity, and that the most problematic areas are improper input validation and denial-of-service attacks. A hybrid scanning tool that combines the three scanners Bandit, Snyk, and Dlint to provide a clear report of code vulnerabilities is also described.
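
A hedged sketch of one component of such a hybrid scanner, invoking Bandit from Python and summarising its JSON report by severity; the target directory is a placeholder, and the Snyk and Dlint integrations are not shown:

```python
# Run Bandit recursively over a package's source tree and tally findings by severity.
import json
import subprocess
from collections import Counter

def bandit_scan(source_dir):
    """Run Bandit on source_dir and return its parsed JSON report."""
    result = subprocess.run(
        ["bandit", "-r", source_dir, "-f", "json"],
        capture_output=True, text=True,
    )
    return json.loads(result.stdout)

report = bandit_scan("./some_pypi_package")          # placeholder path
severities = Counter(issue["issue_severity"] for issue in report.get("results", []))

print("issues by severity:", dict(severities))
for issue in report.get("results", []):
    print(f'{issue["filename"]}:{issue["line_number"]} '
          f'[{issue["issue_severity"]}] {issue["issue_text"]}')
```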

Keywords: Python vulnerabilities, bandit, Snyk, Dlint, Python package index, ecosystem, static analysis, malicious attacks

Procedia PDF Downloads 93
608 The Importance of Downstream Supply Chain in Supply Chain Risk Management: Multi-Objective Optimization

Authors: Zohreh Khojasteh-Ghamari, Takashi Irohara

Abstract:

One efficient approach in supply chain risk management is to avoid interruptions in the supply chain (SC) before they occur. Although the majority of organizations focus on their first-tier suppliers to avoid SC risk, studies show that first-tier suppliers are the cause in only 60 percent of disruption cases; in the remaining 40 percent, the cause lies in the downstream SC, i.e., the second tier and below. Due to the increasing complexity and interrelation of modern supply chains, SC elements have become difficult to trace. Moreover, studies show a vital need to better understand the integration of risk and visibility, especially in the context of multiple objectives. In this study, we propose a multi-objective programming model to avoid disruption in the SC, with the objective of evaluating the effect of downstream supply chain visibility (SCV) on managing supply chain risk. We propose a multi-objective mathematical programming model with the objective functions of minimizing total cost and maximizing downstream SCV, with supplier selection as the decision variable. We assume there are several manufacturers and several candidate suppliers; for each manufacturer, the model proposes the best suppliers with the lowest cost and the maximum visibility in the downstream supply chain. We examine the applicability of the model through numerical examples, define several scenarios for the datasets, and observe the resulting trends. The results show that a minimum level of visibility in the downstream SC is needed to have a safe SC network.
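
A hedged sketch of one way to scalarize the two objectives described above: a weighted-sum binary selection model in Python/PuLP, with illustrative costs and 0-1 visibility scores (not the paper's formulation or data):

```python
# Pick one supplier per manufacturer, trading off cost against downstream visibility.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

manufacturers = ["M1", "M2"]
suppliers = ["S1", "S2", "S3"]
cost = {("M1", "S1"): 100, ("M1", "S2"): 80, ("M1", "S3"): 120,
        ("M2", "S1"): 90,  ("M2", "S2"): 110, ("M2", "S3"): 70}
visibility = {"S1": 0.9, "S2": 0.4, "S3": 0.7}     # downstream SCV score in [0, 1]

w = 0.6                                            # weight on cost vs. visibility
x = {(m, s): LpVariable(f"x_{m}_{s}", cat=LpBinary)
     for m in manufacturers for s in suppliers}

prob = LpProblem("supplier_selection", LpMinimize)
# Weighted sum: minimize normalized cost and maximize visibility (via its negative).
max_cost = max(cost.values())
prob += lpSum(w * (cost[m, s] / max_cost) * x[m, s]
              - (1 - w) * visibility[s] * x[m, s]
              for m in manufacturers for s in suppliers)
# Exactly one supplier per manufacturer.
for m in manufacturers:
    prob += lpSum(x[m, s] for s in suppliers) == 1

prob.solve()
for m in manufacturers:
    chosen = [s for s in suppliers if value(x[m, s]) > 0.5][0]
    print(m, "->", chosen, "cost:", cost[m, chosen], "visibility:", visibility[chosen])
```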

Keywords: downstream supply chain, optimization, supply chain risk, supply chain visibility

Procedia PDF Downloads 211