Search results for: bilevel programming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 938

608 Revenue Management of Perishable Products Considering Freshness and Price Sensitive Customers

Authors: Onur Kaya, Halit Bayer

Abstract:

Global grocery and supermarket sales are among the largest markets in the world, and perishable products such as fresh produce, dairy and meat constitute the biggest section of these markets. Due to their deterioration over time, the demand for these products depends highly on their freshness. They become totally obsolete after a certain amount of time, causing a high amount of wastage and decreases in grocery profits. In addition, customers are asking for higher product variety in perishable product categories, leading to less predictable demand per product and to more out-dating. Effective management of these perishable products is an important issue, since it is observed that billions of dollars’ worth of food expires and is wasted every month. We consider coordinated inventory and pricing decisions for perishable products with a time- and price-dependent random demand function. We use stochastic dynamic programming to model this system for both periodically-reviewed and continuously-reviewed inventory systems and prove certain structural characteristics of the optimal solution. We prove that the optimal ordering policy has a monotone structure and that the optimal price decreases over time. However, the optimal price changes non-monotonically with respect to inventory size. We also analyze the effect of different parameters on the optimal solution through numerical experiments. In addition, we analyze simple-to-implement heuristics, investigate their effectiveness and extract managerial insights. This study gives valuable insights about the management of perishable products in order to decrease wastage and increase profits.
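
As a rough illustration of the kind of finite-horizon stochastic dynamic program described above (not the authors' model), the sketch below prices a single perishable item over a short horizon with a toy price- and freshness-dependent demand distribution; the horizon, prices, demand model and salvage value are all illustrative assumptions.

```python
# Minimal finite-horizon stochastic DP sketch for pricing a perishable item.
# All parameters (horizon, prices, demand model) are illustrative assumptions.
T = 5                 # periods until the product becomes obsolete
MAX_INV = 6           # inventory levels 0..MAX_INV
PRICES = [4.0, 6.0, 8.0]
SALVAGE = 0.5         # value per leftover unit at the end of the horizon

def demand_pmf(price, period):
    """Toy demand model: fresher product (early period) and lower price -> more demand."""
    freshness = (T - period) / T
    mean = max(0.5, 3.0 * freshness - 0.3 * price)
    pmf = {}
    for d, prob in ((max(0, int(mean) - 1), 0.25), (int(mean), 0.5), (int(mean) + 1, 0.25)):
        pmf[d] = pmf.get(d, 0.0) + prob
    return pmf

# V[t][x] = maximal expected revenue from period t onward with x units on hand
V = [[0.0] * (MAX_INV + 1) for _ in range(T + 1)]
V[T] = [SALVAGE * x for x in range(MAX_INV + 1)]
policy = {}

for t in range(T - 1, -1, -1):
    for x in range(MAX_INV + 1):
        best_val, best_p = float("-inf"), None
        for p in PRICES:
            val = 0.0
            for d, prob in demand_pmf(p, t).items():
                sold = min(d, x)
                val += prob * (p * sold + V[t + 1][x - sold])
            if val > best_val:
                best_val, best_p = val, p
        V[t][x] = best_val
        policy[(t, x)] = best_p

print("price at t=0 by inventory level:", [policy[(0, x)] for x in range(MAX_INV + 1)])
```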

Keywords: age-dependent demand, dynamic programming, perishable inventory, pricing

Procedia PDF Downloads 247
607 Finding a Set of Long Common Substrings with Repeats from m Input Strings

Authors: Tiantian Li, Lusheng Wang, Zhaohui Zhan, Daming Zhu

Abstract:

In this paper, we propose two string problems and study algorithms and the complexity of various versions of those problems. Let S = {s₁, s₂, . . . , sₘ} be a set of m strings. A common substring of S is a substring appearing in every string in S. Given a set of m strings S = {s₁, s₂, . . . , sₘ} and a positive integer k, we want to find a set C of k common substrings of S such that the k common substrings in C appear in the same order and have no overlap among the m input strings in S, and the total length of the k common substrings in C is maximized. This problem is referred to as the longest total length of k common substrings from m input strings (LCSS(k, m) for short). The other problem we study here is called the longest total length of a set of common substrings with length more than l from m input strings (LSCSS(l, m) for short). Given a set of m strings S = {s₁, s₂, . . . , sₘ} and a positive integer l, for LSCSS(l, m), we want to find a set of common substrings of S, each of length more than l, such that the total length of all the common substrings is maximized. We show that both problems are NP-hard when k and m are variables. We propose dynamic programming algorithms with time complexity O(k n₁n₂) and O(n₁n₂) to solve LCSS(k, 2) and LSCSS(l, 2), respectively, where n₁ and n₂ are the lengths of the two input strings. We then design an algorithm for LSCSS(l, m) when every length > l common substring appears once in each of the m − 1 input strings. The running time is O(n₁²m), where n₁ is the length of the input string with no restriction on length > l common substrings. Finally, we propose a fixed-parameter algorithm for LSCSS(l, m), where each length > l common substring appears m − 1 + c times among the m − 1 input strings (other than s₁). In other words, each length > l common substring may repeatedly appear at most c times among the m − 1 input strings {s₂, s₃, . . . , sₘ}. The running time of the proposed algorithm is O((n₁2ᶜ)²m), where n₁ is the length of the input string with no restriction on repeats. The LSCSS(l, m) problem is proposed to handle whole chromosome sequence alignment for different strains of the same species, where more than 98% of letters in core regions are identical.
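
The O(n₁n₂) dynamic programs above are built on the classic two-string common-substring table. The sketch below shows only that basic building block (a single longest common substring of two strings); it is not the LCSS(k, 2) or LSCSS(l, 2) algorithm itself, which adds order, non-overlap and length constraints on top of such tables.

```python
# Classic O(n1*n2) DP table for common substrings of two strings:
# d[i][j] = length of the longest common suffix of s1[:i] and s2[:j].
# Shown only as the basic building block behind the LCSS/LSCSS algorithms above.
def longest_common_substring(s1: str, s2: str) -> str:
    n1, n2 = len(s1), len(s2)
    d = [[0] * (n2 + 1) for _ in range(n1 + 1)]
    best_len, best_end = 0, 0
    for i in range(1, n1 + 1):
        for j in range(1, n2 + 1):
            if s1[i - 1] == s2[j - 1]:
                d[i][j] = d[i - 1][j - 1] + 1
                if d[i][j] > best_len:
                    best_len, best_end = d[i][j], i
    return s1[best_end - best_len:best_end]

print(longest_common_substring("ACGTACGTGA", "TTACGTGACG"))  # -> "ACGTGA"
```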

Keywords: dynamic programming, algorithm, common substrings, string

Procedia PDF Downloads 18
606 Multiscale Hub: An Open-Source Framework for Practical Atomistic-To-Continuum Coupling

Authors: Masoud Safdari, Jacob Fish

Abstract:

Despite the vast amount of existing theoretical knowledge, the implementation of a universal multiscale modeling, analysis, and simulation software framework remains challenging. Existing multiscale software and solutions are often domain-specific, closed-source, and demand a high level of experience and skills in both multiscale analysis and programming. Furthermore, tools currently available for Atomistic-to-Continuum (AtC) multiscaling are developed with assumptions such as users' access to high-performance computing facilities. These issues, plus many other challenges, have reduced the adoption of multiscale modeling in academia and especially in industry. In the current work, we introduce Multiscale Hub (MsHub), an effort towards making AtC more accessible through cloud services. As a joint effort between academia and industry, MsHub provides a universal web-enabled framework for practical multiscaling. Developed on top of the widely used scientific programming language Python, the package currently provides an open-source, comprehensive, easy-to-use framework for AtC coupling. MsHub offers an easy-to-use interface to prominent molecular dynamics and multiphysics continuum mechanics packages such as LAMMPS and MFEM (a free, lightweight, scalable C++ library for finite element methods). In this work, we first report on the design philosophy of MsHub and on the challenges and issues faced regarding its implementation. MsHub takes advantage of a comprehensive set of tools and algorithms developed for AtC that can be used for a variety of governing physics. We then briefly report key AtC algorithms implemented in MsHub. Finally, we conclude with a few examples illustrating the capabilities of the package and its future directions.

Keywords: atomistic, continuum, coupling, multiscale

Procedia PDF Downloads 177
605 Improving Patient-Care Services at an Oncology Center with a Flexible Adaptive Scheduling Procedure

Authors: P. Hooshangitabrizi, I. Contreras, N. Bhuiyan

Abstract:

This work presents an online scheduling problem which accommodates multiple requests of patients for chemotherapy treatments in a cancer center of a major metropolitan hospital in Canada. To solve the problem, an adaptive flexible approach is proposed which systematically combines two optimization models. The first model is intended to dynamically schedule arriving requests in the form of waiting lists, whereas the second model is used to reschedule already booked patients with the goal of finding better resource allocations when new information becomes available. Both models are formulated as mixed integer programs. Various controllable and flexible parameters, such as allowing the prescribed target dates to deviate by a pre-determined threshold, changing the start times of already booked appointments, and limiting the maximum number of appointments that can be moved in the schedule, are included in the proposed approach to provide sufficient degrees of flexibility in handling arriving requests and unexpected changes. Several computational experiments are conducted to evaluate the performance of the proposed approach using historical data provided by the oncology clinic. Our approach achieves substantially better results than the scheduling system currently used in practice. Moreover, several analyses are conducted to evaluate the effect of considering different levels of flexibility on the obtained results and to assess the performance of the proposed approach in dealing with last-minute changes. We strongly believe that the proposed flexible adaptive approach is well-suited for implementation at the clinic to provide better patient-care services and to utilize available resources more efficiently.
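
A minimal sketch of a booking model in this spirit, written with the PuLP library, is shown below: requests are assigned to days close to their prescribed target dates subject to a daily chair capacity. The request data, capacity, deviation threshold and objective are illustrative assumptions, not the clinic's actual formulation.

```python
# Toy mixed-integer booking sketch: assign chemotherapy requests to days close to
# their target dates, subject to daily chair capacity. All data are assumptions.
import pulp

days = list(range(1, 8))                          # one-week horizon
requests = {"r1": 2, "r2": 3, "r3": 5, "r4": 5}   # request -> prescribed target day
capacity = 2                                      # appointments per day
max_dev = 2                                       # allowed deviation from target

prob = pulp.LpProblem("chemo_booking", pulp.LpMinimize)
x = {(r, d): pulp.LpVariable(f"x_{r}_{d}", cat="Binary")
     for r in requests for d in days if abs(d - requests[r]) <= max_dev}

# every request gets exactly one day within the allowed deviation window
for r in requests:
    prob += pulp.lpSum(x[r, d] for d in days if (r, d) in x) == 1
# daily chair capacity
for d in days:
    prob += pulp.lpSum(x[r, d] for r in requests if (r, d) in x) <= capacity
# minimize total deviation from the prescribed target dates
prob += pulp.lpSum(abs(d - requests[r]) * x[r, d] for (r, d) in x)

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({r: d for (r, d) in x if x[r, d].value() == 1})
```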

Keywords: chemotherapy scheduling, multi-appointment modeling, optimization of resources, satisfaction of patients, mixed integer programming

Procedia PDF Downloads 169
604 Regret-Regression for Multi-Armed Bandit Problem

Authors: Deyadeen Ali Alshibani

Abstract:

In the literature, the multi-armed bandit problem is described as a statistical decision model of an agent trying to optimize his decisions while improving his information at the same time. There are several different algorithmic models and applications for this problem. In this paper, we evaluate the regret-regression approach by comparing it with the Q-learning method. A simulation on the determination of an optimal treatment regime is presented in detail.
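
The short simulation below illustrates how cumulative (pseudo-)regret is measured on a Bernoulli bandit for a plain epsilon-greedy learner; it is only a stand-in baseline to show the regret metric, not the regret-regression or Q-learning methods compared in the paper, and the arm probabilities are arbitrary.

```python
# Minimal cumulative-regret simulation on a Bernoulli multi-armed bandit with an
# epsilon-greedy policy; illustrates the regret measure only.
import random

def run_epsilon_greedy(arm_probs, horizon=5000, eps=0.1, seed=0):
    rng = random.Random(seed)
    n_arms = len(arm_probs)
    counts = [0] * n_arms
    values = [0.0] * n_arms          # running mean reward per arm
    best = max(arm_probs)
    regret = 0.0
    for _ in range(horizon):
        if rng.random() < eps:
            arm = rng.randrange(n_arms)
        else:
            arm = max(range(n_arms), key=lambda a: values[a])
        reward = 1.0 if rng.random() < arm_probs[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
        regret += best - arm_probs[arm]   # expected (pseudo-)regret per step
    return regret

print("cumulative regret:", run_epsilon_greedy([0.3, 0.5, 0.7]))
```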

Keywords: optimal, bandit problem, optimization, dynamic programming

Procedia PDF Downloads 453
603 Drawing Building Blocks in Existing Neighborhoods: An Automated Pilot Tool for an Initial Approach Using GIS and Python

Authors: Konstantinos Pikos, Dimitrios Kaimaris

Abstract:

Although designing building blocks is a procedure used by many planners around the world, there isn’t an automated tool that will help planners and designers achieve their goals with less effort. The difficulty of the subject lies in the repetitive process of manually drawing lines while maintaining the desired offset and keeping the impact on the existing building stock as small as possible. In this paper, using Geographical Information Systems (GIS) and the Python programming language, an automated tool integrated into ArcGIS Pro is presented. Despite its simple environment and the lack of specialized building legislation due to the complex state of the field, a planner who is aware of such technical information can use the tool to draw an initial approach to the final building blocks in an area with pre-existing buildings, in an attempt to organize the usually sprawling suburbs of a city or any continuously developing area. The tool uses ESRI’s ArcPy library to handle the spatial data, while interaction with the user is handled through Tkinter. The main process consists of a modification of building edge coordinates, using the NumPy library, in an effort to draw the line of best fit, so the user can get the optimal result for each side of the block. Finally, after the tool runs successfully, a table of primary planning information is shown, such as the area of the building block and its coverage rate. Although the tool is at an early stage of development, it is a solid base in which planners with programming skills could invest to adapt it to their individual needs. An example of the entire procedure in a test area is provided, highlighting both the strengths and weaknesses of the final results.
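
A tiny sketch of the "line of best fit" step described above is given below: given the (x, y) coordinates of building edge points along one side of a block, a line is fitted with NumPy. The coordinates are made up; the actual tool reads geometries through ArcPy and draws the offset block edges inside ArcGIS Pro.

```python
# Fit a least-squares line through building edge points along one block side.
# Coordinates are illustrative; the real tool works on ArcPy geometries.
import numpy as np

edge_points = np.array([
    [0.0, 0.2], [5.1, 1.1], [10.3, 2.2], [15.0, 2.9], [20.2, 4.1],
])
x, y = edge_points[:, 0], edge_points[:, 1]

slope, intercept = np.polyfit(x, y, deg=1)   # best-fit line y = slope*x + intercept
print(f"best-fit edge line: y = {slope:.3f} x + {intercept:.3f}")

# end points of the fitted segment spanning the block side
x_ends = np.array([x.min(), x.max()])
print(list(zip(x_ends, slope * x_ends + intercept)))
```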

Keywords: ArcPy, GIS, Python, building blocks

Procedia PDF Downloads 181
602 Electronic Payment Recording with Payment History Retrieval Module: A System Software

Authors: Adrian Forca, Simeon Cainday III

Abstract:

The Electronic Payment Recording with Payment History Retrieval Module is developed specifically for the College of Science and Technology. This software replaces the department’s manual process of recording payments, shifting from a slow and time-consuming procedure to a quick yet reliable and accurate way of recording payments, because it immediately generates receipts for every transaction. As an added feature, the generation of recorded-payment reports is integrated, replacing manual reporting with an easier, consolidated report. In addition, all recorded student payments can be retrieved immediately, making the system a transparent and reliable payment-recording tool. Viewed as a whole, the process shifts from manual handling to organized software technology, because the information is stored in a logically correct and normalized database. Further, the software will be developed using a modern programming language and will implement strict programming methods to validate all users accessing the system and to check all data passed into the system and information retrieved, ensuring data accuracy and reliability. In addition, the system will identify each user and limit access privileges, establishing boundaries on who is allowed to store, modify, and update information and securing it against unauthorized data manipulation. As a result, the system eliminates the manual procedure and replaces it with modern information technology, making the whole payment-recording process fast, secure, accurate and reliable.

Keywords: collection, information system, manual procedure, payment

Procedia PDF Downloads 167
601 Integer Programming: Domain Transformation in Nurse Scheduling Problem

Authors: Geetha Baskaran, Andrzej Barjiela, Rong Qu

Abstract:

Motivation: Nurse scheduling is a complex combinatorial optimization problem that is known to be NP-hard. It requires efficient re-scheduling that minimizes a trade-off among the measures of violation obtained by relaxing selected constraints into soft constraints with measurements of their violations. Problem Statement: In this paper, we extend our novel approach to solving the nurse scheduling problem by transforming it through information granulation. Approach: This approach satisfies the rules of a typical hospital environment based on a standard benchmark problem. Generating good work schedules has a great influence on nurses' working conditions, which are strongly related to the level of quality of health care. Domain transformation, which combines the strengths of operations research and artificial intelligence, was proposed for the solution of the problem. Compared to conventional methods, our approach involves judicious grouping (information granulation) of shift types that transforms the original problem into a smaller solution domain. These schedules from the smaller problem domain are later converted back into the original problem domain by taking into account the constraints that could not be represented in the smaller domain. An Integer Programming (IP) package is used to solve the transformed scheduling problem by applying the branch-and-bound algorithm. We have used GNU Octave for Windows to solve this problem. Results: The scheduling problem has been solved in the proposed formalism, resulting in a high-quality schedule. Conclusion: Domain transformation represents a departure from a conventional one-shift-at-a-time scheduling approach. It offers the advantage of efficient and easily understandable solutions, as well as deterministic reproducibility of the results. We note, however, that it does not guarantee the global optimum.
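
A toy illustration of the granulation idea is sketched below: individual shift types are first grouped into coarser granules, a small integer program is solved in the granulated domain with PuLP, and the result would later be expanded back to concrete shifts. The shift names, demands, costs and grouping are assumptions, not the benchmark problem used in the paper.

```python
# Toy "information granulation" sketch: group shift types, then solve a small
# coverage IP in the granulated domain. All data are illustrative assumptions.
import pulp

granules = {"day": ["early", "late"], "night": ["night"]}   # grouping step
demand = {"day": 5, "night": 2}        # nurses required per granule per day
cost = {"day": 1.0, "night": 1.3}      # relative cost of an assignment
nurses = [f"n{i}" for i in range(8)]

prob = pulp.LpProblem("granulated_roster", pulp.LpMinimize)
x = {(n, g): pulp.LpVariable(f"x_{n}_{g}", cat="Binary") for n in nurses for g in granules}

for n in nurses:                       # at most one granule per nurse per day
    prob += pulp.lpSum(x[n, g] for g in granules) <= 1
for g in granules:                     # cover the granulated demand
    prob += pulp.lpSum(x[n, g] for n in nurses) >= demand[g]
prob += pulp.lpSum(cost[g] * x[n, g] for (n, g) in x)

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({g: [n for n in nurses if x[n, g].value() == 1] for g in granules})
```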

Keywords: domain transformation, nurse scheduling, information granulation, artificial intelligence, simulation

Procedia PDF Downloads 397
600 Multi-Objective Multi-Period Allocation of Temporary Earthquake Disaster Response Facilities with Multi-Commodities

Authors: Abolghasem Yousefi-Babadi, Ali Bozorgi-Amiri, Aida Kazempour, Reza Tavakkoli-Moghaddam, Maryam Irani

Abstract:

All over the world, natural disasters (e.g., earthquakes, floods, volcanoes and hurricanes) cause many deaths. Earthquakes in particular are catastrophic events, caused by unusual phenomena and leading to great losses around the world. Such disasters strongly demand long-term help and relief, which can be hard to manage. Supplies and facilities are a major challenge after any earthquake and should be prepared for the disaster regions to satisfy the demands of the people suffering from the earthquake. This paper proposes a disaster response facility allocation problem for disaster relief operations, formulated as a mathematical programming model. Earthquake victims need not only consumable commodities (e.g., food and water) but also non-consumable commodities (e.g., clothes) to protect themselves. Therefore, paying attention to disaster points and people's demands is very necessary. To address this, both consumable and non-consumable commodities are considered in the presented model. The paper presents a multi-objective multi-period mathematical programming model that simultaneously minimizes the average of the weighted response times and the total operational cost, including penalty costs for unmet demand and unused commodities. Furthermore, a Chebycheff multi-objective solution procedure, as a powerful solution algorithm, is applied to solve the proposed model. Finally, to illustrate the model's applicability, a case study of the Tehran earthquake is studied; to show model validation, a sensitivity analysis is also carried out.
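
The snippet below sketches the weighted Tchebycheff (Chebycheff) scalarization idea used to trade off two objectives such as weighted response time and total cost, evaluated over a small set of candidate plans. The candidate solutions, objective values and weights are purely illustrative and are not the case-study data or the full multi-period model.

```python
# Sketch of weighted Tchebycheff scalarization over candidate relief plans.
def chebycheff_score(objs, ideal, weights):
    """max_i w_i * (f_i - z_i*); smaller is better."""
    return max(w * (f - z) for f, z, w in zip(objs, ideal, weights))

candidates = {                      # name -> (avg weighted response time, total cost)
    "plan_A": (4.2, 130.0),
    "plan_B": (3.1, 180.0),
    "plan_C": (5.0, 110.0),
}
ideal = (min(v[0] for v in candidates.values()),
         min(v[1] for v in candidates.values()))
weights = (0.6, 0.4 / 50.0)         # crude rescaling so the cost units are comparable

best = min(candidates, key=lambda k: chebycheff_score(candidates[k], ideal, weights))
print("preferred compromise plan:", best)
```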

Keywords: facility location, multi-objective model, disaster response, commodity

Procedia PDF Downloads 258
599 Connecting the Dots: Bridging Academia and National Community Partnerships When Delivering Healthy Relationships Programming

Authors: Nicole Vlasman, Karamjeet Dhillon

Abstract:

Over the past four years, the Healthy Relationships Program has been delivered in community organizations and schools across Canada. More than 240 groups have been facilitated in collaboration with 33 organizations. As a result, 2157 youth have been engaged in the programming. The purpose and scope of the Healthy Relationships Program are to offer sustainable, evidence-based skills through small group implementation to prevent violence and promote positive, healthy relationships in youth. The program development has included extensive networking at regional and national levels. The Healthy Relationships Program is currently being implemented, adapted, and researched within the Resilience and Inclusion through Strengthening and Enhancing Relationships (RISE-R) project. Alongside the project’s research objectives, the RISE-R team has worked to virtually share the ongoing findings of the project through a slow ontology approach. Slow ontology is a practice integrated into project systems and structures whereby slowing the pace and volume of outputs offers creative opportunities. Creative production reveals different layers of success and complements the project, providing building blocks for sustainability. As a result of integrating a slow ontology approach, the RISE-R team has developed a Geographic Information System (GIS) that documents local landscapes through a Story Map feature and, more specifically, video installations. Video installations capture the cartography of space and place within the context of singular diverse community spaces (case studies). By documenting spaces via human connections, the project captures narratives, which further enhance the voices and faces of the community within the larger project scope. This GIS project aims to create a visual and interactive flow of information that complements the project's mixed-method research approach. In conclusion, creative project development in the form of a geographic information system can provide learning and engagement opportunities at many levels (i.e., within community organizations and educational spaces or with the general public). In each of these disconnected spaces, fragmented stories are connected through a visual display of project outputs. A slow ontology practice within the context of the RISE-R project documents activities on the fringes and within internal structures, primarily by documenting project successes as further contributions to the Centre for School Mental Health framework (philosophy, recruitment techniques, allocation of resources and time, and a shared commitment to evidence-based products).

Keywords: community programming, geographic information system, project development, project management, qualitative, slow ontology

Procedia PDF Downloads 156
598 Framework for Incorporating Environmental Performance in Network-Level Pavement Maintenance Program

Authors: Jessica Achebe, Susan Tighe

Abstract:

The reduction of material consumption and greenhouse gas emissions when maintaining and rehabilitating road networks can achieve added benefits, including improved life cycle performance of pavements, reduced climate change impacts and human health effects due to less air pollution, improved productivity due to an optimal allocation of resources, and reduced road user cost. This is the essence of incorporating environmental sustainability into pavement management. The functionality of the performance measurement approach has made it one of the most valuable tools in Pavement Management Systems (PMSs) for accounting for different criteria in the decision-making process. However, measuring the environmental performance of a road network is still a far-fetched practice in road network management; moreover, explicit agency-wide environmental sustainability or sustainable maintenance specifications are missing. To address this challenge, the present research focuses on the environmental sustainability performance of network-level pavement management. The ultimate goal is to develop a framework to incorporate environmental sustainability in pavement management systems for network-level maintenance programming. As a first step toward this goal, this paper reviews previous studies that employed environmental performance measures, as well as the suitability of environmental performance indicators for the evaluation of the sustainability of network-level pavement maintenance strategies. Through an industry practice survey, this paper provides a brief overview of pavement managers' motivations and barriers to making more sustainable decisions, and of the data needed to support network-level environmental sustainability. The trends in network-level sustainable pavement management are also presented, existing gaps are highlighted, and ideas are proposed for network-level sustainable maintenance and rehabilitation programming.

Keywords: pavement management, environmental sustainability, network-level evaluation, performance measures

Procedia PDF Downloads 309
597 Optimisation Model for Maximising Social Sustainability in Construction Scheduling

Authors: Laura Florez

Abstract:

The construction industry is labour intensive, and the behaviour and management of workers have a direct impact on the performance of construction projects. One of the issues the industry currently faces is how to recruit and retain its workers. Construction is known as an industry where workers face short employment durations, frequent layoffs, and periods of unemployment between jobs. These challenges not only create pressure on the workers, but project managers also have to constantly train new workers and face skills shortages and uncertainty about the quality of the workers they will attract. To consider workers' needs and project managers' expectations, one practice that can be implemented is to schedule construction projects so as to maintain a stable workforce. This paper proposes a mixed integer programming (MIP) model to schedule projects with the objective of maximising the social sustainability of construction projects, that is, maximising labour stability. Aside from the social objective, the model accounts for the equipment and financial resources required by the projects during the construction phase. To illustrate how the solution strategy works, a construction programme comprised of ten projects is considered. The projects are scheduled to maximise labour stability while simultaneously minimising time and minimising cost. The trade-off between the values in terms of time, cost, and labour stability allows project managers to consider their preferences and identify which solution best suits their needs. Additionally, the model determines the optimal starting times for each of the projects, working patterns for the workers, and labour costs. This model shows that construction projects can be scheduled to not only benefit the project manager but also benefit current workers and help attract new workers to the industry. Due to its practicality, it can be a valuable tool to support decision making and assist construction stakeholders when developing schedules that include social sustainability factors.
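
One simple way a labour-stability criterion can be evaluated for candidate programme schedules is sketched below: the smaller the period-to-period fluctuation in crew size, the more stable the workforce. The profiles, costs, deadline and the particular stability measure are assumptions used only for illustration; they are not the paper's MIP formulation.

```python
# Evaluate a simple labour-stability measure for candidate programme schedules.
def workforce_fluctuation(profile):
    """Sum of absolute month-to-month changes in crew size (lower = more stable)."""
    return sum(abs(a - b) for a, b in zip(profile[1:], profile[:-1]))

candidate_schedules = {   # schedule -> (monthly crew sizes, cost in M, duration in months)
    "S1": ([20, 35, 10, 40, 15], 9.2, 5),
    "S2": ([24, 26, 25, 25, 24], 9.6, 5),
    "S3": ([30, 30, 30, 30],     9.9, 4),
}

budget, deadline = 9.8, 5
feasible = {k: v for k, v in candidate_schedules.items() if v[1] <= budget and v[2] <= deadline}
best = min(feasible, key=lambda k: workforce_fluctuation(feasible[k][0]))
print("most labour-stable feasible schedule:", best)   # -> "S2"
```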

Keywords: labour stability, mixed-integer programming (MIP), scheduling, workforce management

Procedia PDF Downloads 253
596 Budgetary Performance Model for Managing Pavement Maintenance

Authors: Vivek Hokam, Vishrut Landge

Abstract:

An ideal maintenance program for an industrial road network is one that would maintain all sections at a sufficiently high level of functional and structural condition. However, due to various constraints such as budget, manpower and equipment, it is not possible to carry out maintenance on all the needy industrial road sections within a given planning period. A rational and systematic priority scheme needs to be employed to select and schedule industrial road sections for maintenance. Priority analysis is a multi-criteria process that determines the best ranking list of sections for maintenance based on several factors. In priority setting, difficult decisions are required for the selection of sections for maintenance: is it more important to repair a section with poor functional condition, which includes an uncomfortable ride, or one with poor structural condition, i.e., a section in danger of becoming structurally unsound? It would seem, therefore, that any rational priority setting approach must consider the relative importance of the functional and structural condition of the section. Maintenance priority indices and pavement performance models tend to focus mainly on pavement condition, traffic criteria, etc. There is a need to develop a model that is suited to the limited budget provisions for pavement maintenance. Linear programming is one of the most popular and widely used quantitative techniques. A linear programming model provides an efficient method for determining an optimal decision chosen from a large number of possible decisions. The optimum decision is one that meets a specified objective of management, subject to various constraints and restrictions. The objective here is mainly the minimization of the maintenance cost of roads in an industrial area. In order to determine the objective function for the analysis of the distress model, it is necessary to put realistic data into the formulation. Each type of repair is quantified over a number of stretches, with 1000 m considered as one stretch. The section considered in this study is 3750 m long. These quantities are put into an objective function for maximizing the number of repairs in a stretch related to quantity. The distresses observed in this section are potholes, surface cracks, rutting and ravelling. The distress data is measured manually by observing each distress level on a stretch of 1000 m. The maintenance and rehabilitation measures currently followed are based on subjective judgments. Hence, there is a need to adopt a scientific approach in order to effectively use the limited resources. It is also necessary to determine the pavement performance and deterioration prediction relationships more accurately, along with the economic benefits of road networks with respect to vehicle operating cost. The road network infrastructure should deliver the best results expected from the available funds. In this paper, the objective function for the distress model is determined by linear programming, and a deterioration model considering overloading is discussed.
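
A toy linear program in this spirit is sketched below: choose how many units of each repair (potholes, cracks, rutting, ravelling) to carry out on the section so that a weighted amount of repair is maximized within a maintenance budget. The unit costs, importance weights, observed quantities and budget are illustrative assumptions, not the study's data or formulation.

```python
# Toy LP: maximize weighted repairs within a budget, solved with SciPy's linprog.
from scipy.optimize import linprog

weights = [4.0, 2.0, 3.0, 1.5]           # relative importance of repairing one unit
unit_cost = [500.0, 120.0, 300.0, 80.0]  # cost per unit repaired
observed = [12, 40, 20, 60]              # observed quantities on the 3750 m section
budget = 12000.0

res = linprog(
    c=[-w for w in weights],             # linprog minimizes, so negate to maximize
    A_ub=[unit_cost], b_ub=[budget],     # stay within the maintenance budget
    bounds=[(0, q) for q in observed],   # cannot repair more than observed
    method="highs",
)
print("repaired quantities:", [round(v, 1) for v in res.x])
print("weighted repair achieved:", round(-res.fun, 1))
```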

Keywords: budget, maintenance, deterioration, priority

Procedia PDF Downloads 208
595 Recycling Service Strategy by Considering Demand-Supply Interaction

Authors: Hui-Chieh Li

Abstract:

A circular economy promotes greater resource productivity and avoids pollution through greater recycling and re-use, which brings benefits for both the environment and the economy. The concept contrasts with a linear economy, which follows a 'take, make, dispose' model of production. A well-designed reverse logistics service strategy could enhance users' willingness to recycle and reduce the related logistics cost as well as carbon emissions. Moreover, recycling brings manufacturers the most advantages when it targets components for closed-loop reuse, essentially converting materials and components from worn-out products into inputs for new ones at the right time and the right place. This study considers demand-supply interaction, time-dependent recycling demand and the time-dependent surplus value of recycled products, and constructs models of the recycling service strategy for the recyclable waste collector. A crucial factor in optimizing a recycling service strategy is consumer demand. The study considers the relationships between consumer demand for recycling and product characteristics, surplus value and user behavior. The study proposes a recycling service strategy which differs significantly from the conventional and typical uniform service strategy. Periods with considerable demand and large surplus product value suggest frequent and short service cycles. The study explores how to determine a recycling service strategy for the recyclable waste collector in terms of service cycle frequency and duration and the vehicle type for all service cycles, by considering the surplus value of the recycled product, time-dependent demand, transportation economies and demand-supply interaction. The recyclable waste collector is responsible for the collection of waste products for the manufacturer. The study also examines the impacts of the utilization rate on cost and profit in the context of different sizes of vehicles. The study applies the binary logit model, analytical models and mathematical programming methods to the problem. The model attempts to minimize the total logistics cost of the recycler and maximize the recycling benefits of the manufacturer during the study period, while the total profit of the distributor is maximized. The study relaxes the constant demand assumption and examines how the service strategy affects consumer demand for waste recycling. Results of the study not only help in understanding how user demand for recycling service and product surplus value affect the logistics cost and the manufacturer's benefits, but also provide guidance for the government, such as on award bonuses and carbon emission regulations.
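
A minimal sketch of the demand side is shown below: a binary logit choice probability for whether a user brings a worn-out product to a collection cycle, as a function of the product's remaining surplus value and the service interval. The utility specification and coefficients are illustrative assumptions, not estimates from the study.

```python
# Binary logit sketch of recycling participation probability.
import math

def recycle_probability(surplus_value, days_since_last_service,
                        b0=-1.0, b_value=0.08, b_wait=-0.02):
    utility = b0 + b_value * surplus_value + b_wait * days_since_last_service
    return 1.0 / (1.0 + math.exp(-utility))     # logistic (binary logit) form

for value, wait in [(40, 7), (40, 30), (10, 7)]:
    print(f"surplus={value:>3}, wait={wait:>2} days -> "
          f"P(recycle) = {recycle_probability(value, wait):.2f}")
```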

Keywords: circular economy, consumer demand, product surplus value, recycle service strategy

Procedia PDF Downloads 392
594 Using Google Distance Matrix Application Programming Interface to Reveal and Handle Urban Road Congestion Hot Spots: A Case Study from Budapest

Authors: Peter Baji

Abstract:

In recent years, a growing body of literature has emphasized the increasingly negative impacts of urban road congestion on the everyday life of citizens. Although there are different responses from the public sector to decrease traffic congestion in urban regions, the most effective public intervention is the use of congestion charges. Because travel is an economic asset, its consumption can be controlled effectively by extra taxes or prices, but this demand-side intervention is often unpopular. Measuring traffic flows with the help of different methods has a long history in the transport sciences, but until recently, there was not sufficient data for evaluating road traffic flow patterns on the scale of the entire road system of a larger urban area. European cities (e.g., London, Stockholm, Milan) in which congestion charges have already been introduced designated a particular downtown zone for paying, but this protects only the users and inhabitants of the CBD (Central Business District) area. Through the use of Google Maps data as a resource for revealing urban road traffic flow patterns, this paper aims to provide a solution for a fairer and smarter congestion pricing method in cities. The case study area of the research contains three bordering districts of Budapest which are linked by one main road. The first district (5th) is the original downtown that is affected by the congestion charge plans of the city. The second district (13th) lies in the transition zone and has recently been transformed into a new CBD containing the biggest office zone in Budapest. The third district (4th) is a mainly residential area on the outskirts of the city. The raw data of the research was collected with the help of Google’s Distance Matrix API (Application Programming Interface), which provides estimated future traffic data in the form of travel times between freely fixed coordinate pairs. From the difference between free-flow and congested travel time data, the daily congestion patterns and hot spots are detectable on all measured roads within the area. The results suggest that the distribution of congestion peak times and hot spots is uneven in the examined area; however, there are frequently congested areas which lie outside the downtown, and their inhabitants also need some protection. The conclusion of this case study is that cities can develop a real-time and place-based congestion charge system that forces car users to avoid frequently congested roads by changing their routes or travel modes. This would be a fairer solution for decreasing the negative environmental effects of urban road transportation than protecting a very limited downtown area.
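
A minimal sketch of the data-collection step is given below: a travel time between two fixed coordinate pairs is requested from the Google Distance Matrix API with a departure time, and the free-flow and traffic-adjusted estimates are compared. A valid, billing-enabled API key is required; the coordinates and key below are placeholders, and the response field names should be checked against the current API documentation.

```python
# Query free-flow vs. congested travel time from the Google Distance Matrix API.
import requests

API_KEY = "YOUR_API_KEY"                      # placeholder
origin = "47.5636,19.0947"                    # example coordinates in Budapest
destination = "47.5081,19.0560"

resp = requests.get(
    "https://maps.googleapis.com/maps/api/distancematrix/json",
    params={
        "origins": origin,
        "destinations": destination,
        "departure_time": "now",              # needed to obtain traffic estimates
        "key": API_KEY,
    },
    timeout=30,
)
element = resp.json()["rows"][0]["elements"][0]
free_flow = element["duration"]["value"]                          # seconds
congested = element.get("duration_in_traffic", {}).get("value", free_flow)
print(f"delay ratio: {congested / free_flow:.2f}")
```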

Keywords: Budapest, congestion charge, distance matrix API, application programming interface, pilot study

Procedia PDF Downloads 200
593 Building Tutor and Tutee Pedagogical Agents to Enhance Learning in Adaptive Educational Games

Authors: Ogar Ofut Tumenayu, Olga Shabalina

Abstract:

This paper describes the application of two types of pedagogical agent technology with different functions in an adaptive educational game, with the sole aim of improving learning and enhancing interactivity in Digital Educational Games (DEG). This idea could help eliminate some problems of DEG, like isolation in game-based learning, by introducing tutor and tutee pedagogical agents. We present an analysis of a learning companion interacting in a peer tutoring environment as a step toward improving social interactions in the educational game environment. We show that the tutor and tutee agents use different interventions and interactive approaches: the tutor agent is engaged in tracking the learner’s activities and inferring the learning state, while the tutee agent initiates interactions with the learner at appropriate times and in appropriate manners. In order to provide motivation, prevent mistakes and clarify a game task, the tutor agent uses a help dialog tool to provide assistance, while the tutee agent provides collaboration assistance by using a hint tool. We present our idea in a prototype game called “Pyramid Programming Game,” a 2D game that was developed using Libgdx. The game's Pyramid component symbolizes a programming task that is presented to the player in the form of a puzzle. During gameplay, the agents can instruct, direct, inspire, and communicate emotions. They can also rapidly alter the instructional pattern in response to the learner's performance and knowledge. The pyramid must be effectively destroyed in order to win the game. The game also teaches and illustrates the advantages of utilizing educational agents such as TrA and TeA to assist and motivate students. Our findings support the idea that the functionality of a pedagogical agent should be split into an instructional agent and a learner’s companion agent in order to enhance interactivity in a game-based environment.

Keywords: tutor agent, tutee agent, learner’s companion interaction, agent collaboration

Procedia PDF Downloads 68
592 Preventing Violent Extremism in Mozambique and Tanzania: A Survey to Measure Community Resilience

Authors: L. Freeman, D. Bax, V. K. Sapong

Abstract:

Community-based, preventative approaches to violent extremism may be effective and yet remain an underutilised method. In a realm where security approaches dominate, with the focus on countering violent extremism and combatting radicalisation, community resilience programming remains sparse. This paper presents a survey tool that aims to measure the risk and protective factors that can lead to violent extremism in Mozambique and Tanzania. Conducted in four districts in the Cabo Delgado region of Mozambique and one district in Pwani, Tanzania, the survey uses a combination of BRAVE-14, Afrocentric and context-specific questions in order to more fully understand community resilience opportunities and challenges in preventing and countering violent extremism. Developed in Australia and Canada to measure radicalisation risks in individuals and communities, BRAVE-14 is a tool not yet applied on the African continent. Given the emerging threat of Islamic extremism in Northern Mozambique and Eastern Tanzania, which both experience a combination of socio-political exclusion, resource marginalisation and religious/ideological motivations, the development of the survey is timely and fills a much-needed information gap in these regions. Not only have these Islamist groups succeeded in tapping into the grievances of communities by radicalising and recruiting individuals, but their presence in these regions has been characterised by extreme forms of violence, leaving isolated communities vulnerable to attack. The findings are expected to facilitate the contextualisation and comparison of the protective and risk factors that inhibit or promote the radicalisation of the youth in these communities. In identifying sources of resilience and vulnerability, this study emphasises the implementation of context-specific intervention programming and provides a strong research tool for understanding youth and community resilience to violent extremism.

Keywords: community resilience, Mozambique, preventing violent extremism, radicalisation, Tanzania

Procedia PDF Downloads 133
591 A Survey on Compression Methods for Table Constraints

Authors: N. Gharbi

Abstract:

Constraint satisfaction problems are mathematical problems that are often used to model many real-world problems for which we look for a solution satisfying all the constraints. Table constraints are important for modeling parts of many problems since they list all allowed or forbidden combinations of values. However, they have practical limitations because they are sometimes too large to be represented in a direct way. In this paper, we present a survey of the different categories of approaches proposed to compress table constraints in order to reduce both space and time complexity.

Keywords: constraint programming, compression, data mining, table constraints

Procedia PDF Downloads 326
590 A Mathematical Model to Select Shipbrokers

Authors: Y. Smirlis, G. Koronakos, S. Plitsos

Abstract:

Shipbrokers assist shipping companies in chartering or selling and buying vessels, acting as intermediaries between them and the market. They facilitate deals, providing their expertise, negotiating skills, and knowledge about ship market bargains. Their role is very important, as it affects the profitability and market position of a shipping company. Due to their significant contribution, shipping companies have to employ systematic procedures to evaluate shipbrokers’ services in order to select the best and, consequently, to achieve the best deals. Towards this, in this paper, we consider shipbrokers as financial service providers, and we formulate the problem of evaluating and selecting shipbrokers’ services as a multi-criteria decision making (MCDM) procedure. The proposed methodology comprises a first normalization step to adjust the different scales and orientations of the criteria and a second step that includes the mathematical model to evaluate the performance of the shipbrokers’ services involved in the assessment. The criteria along which the shipbrokers are assessed may refer to their size and reputation, the potential efficiency of the services, the terms and conditions imposed, the expenses (e.g., commission – brokerage), the expected time to accomplish a chartering or selling/buying task, etc.; according to our modelling approach, these criteria may be assigned different importance. The mathematical programming model performs a comparative assessment and estimates, for the shipbrokers involved in the evaluation, a relative score that ranks the shipbrokers in terms of their potential performance. To illustrate the proposed methodology, we present a case study in which a shipping company evaluates and selects the most suitable among a number of sale and purchase (S&P) brokers. Acknowledgment: This study is supported by the OptiShip project, implemented within the framework of the National Recovery Plan and Resilience “Greece 2.0” and funded by the European Union – NextGenerationEU programme.
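
A compact sketch of the two steps named above is shown below: criteria with different scales and orientations (benefit vs. cost) are min-max normalized and then combined into a weighted relative score per broker. The criteria, weights and broker data are assumptions for illustration; the paper's actual mathematical programming model may differ.

```python
# Normalize mixed-orientation criteria and compute a weighted relative score.
criteria = {          # name -> (weight, "benefit" or "cost")
    "reputation": (0.35, "benefit"),
    "commission": (0.30, "cost"),     # brokerage %, lower is better
    "lead_time":  (0.20, "cost"),     # expected days to close a deal
    "efficiency": (0.15, "benefit"),
}
brokers = {
    "broker_A": {"reputation": 8, "commission": 1.25, "lead_time": 45, "efficiency": 7},
    "broker_B": {"reputation": 6, "commission": 1.00, "lead_time": 30, "efficiency": 8},
    "broker_C": {"reputation": 9, "commission": 1.50, "lead_time": 60, "efficiency": 6},
}

def normalize(values, orientation):
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    if orientation == "benefit":
        return {b: (v - lo) / span for b, v in values.items()}
    return {b: (hi - v) / span for b, v in values.items()}

scores = {b: 0.0 for b in brokers}
for crit, (weight, orientation) in criteria.items():
    norm = normalize({b: brokers[b][crit] for b in brokers}, orientation)
    for b in brokers:
        scores[b] += weight * norm[b]

for b, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{b}: {s:.3f}")
```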

Keywords: shipbrokers, multi-criteria decision making, mathematical programming, service-provider selection

Procedia PDF Downloads 89
589 Using Scilab® as a New Introductory Method in Numerical Calculations and Programming for Computational Fluid Dynamics (CFD)

Authors: Nicoly Coelho, Eduardo Vieira Vilas Boas, Paulo Orestes Formigoni

Abstract:

Faced with the remarkable developments in the various segments of modern engineering brought about by increasing technological development, professionals in all educational areas need to help overcome the difficulties faced by those who are starting their academic journey. Aiming to overcome these difficulties, this article provides an introduction to the basic study of numerical methods applied to fluid mechanics and thermodynamics, demonstrating the modeling and simulations involved, together with a detailed explanation of the fundamental numerical solution using the finite difference method in Scilab, a free and easily accessible software package that can be used by any research center or university, anywhere, in both developed and developing countries. Computational Fluid Dynamics (CFD) is a necessary tool for engineers and professionals who study fluid mechanics; however, the teaching of this area of knowledge in undergraduate programs faces difficulties due to software costs and the degree of difficulty of the mathematical problems involved, so the subject is often treated only in postgraduate courses. This work aims to bring low-cost CFD into the teaching of Transport Phenomena at the undergraduate level by analyzing a small classic case of fundamental thermodynamics with the Scilab® program. The study starts from the basic theory of the partial differential equation governing the heat transfer problem, which requires students to master discretization processes that include the basic principles of Taylor series expansion, responsible for generating a system of equations whose convergence is checked using the Sassenfeld criterion and which is finally solved by the Gauss-Seidel method. In this work, we demonstrate both simple problems solved manually and more complex problems that required computer implementation, for which we use a small algorithm with fewer than 200 lines in Scilab® in a heat transfer study of a rectangular plate heated on four sides with different temperatures on each side, producing a two-dimensional transport solution with a colored graphical simulation. With the spread of computer technology, numerous programs have emerged that require strong programming skills from researchers. Considering that the ability to program CFD is the main problem to be overcome, both by students and by researchers, we present in this article the use of programs with a less complex interface, thus making it easier to produce graphical modeling and simulation for CFD while extending the programming experience of undergraduates.
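
A Python/NumPy analogue of the classroom exercise described above is sketched below: steady-state heat conduction on a rectangular plate with a different fixed temperature on each side, discretized by central finite differences and solved with Gauss-Seidel sweeps. The grid size, boundary temperatures and tolerance are arbitrary, and the original exercise is written in Scilab rather than Python.

```python
# 2-D steady-state heat conduction (Laplace equation) solved by Gauss-Seidel.
import numpy as np

nx, ny = 30, 20
T = np.zeros((ny, nx))
T[0, :], T[-1, :] = 100.0, 25.0      # top and bottom boundary temperatures
T[:, 0], T[:, -1] = 75.0, 50.0       # left and right boundary temperatures

tol, max_sweeps = 1e-4, 10_000
for sweep in range(max_sweeps):
    max_change = 0.0
    for i in range(1, ny - 1):        # Gauss-Seidel: update in place, reuse new values
        for j in range(1, nx - 1):
            new = 0.25 * (T[i + 1, j] + T[i - 1, j] + T[i, j + 1] + T[i, j - 1])
            max_change = max(max_change, abs(new - T[i, j]))
            T[i, j] = new
    if max_change < tol:
        break

print(f"converged after {sweep + 1} sweeps; centre temperature = {T[ny // 2, nx // 2]:.1f}")
```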

Keywords: numerical methods, finite difference method, heat transfer, Scilab

Procedia PDF Downloads 388
588 Optimal Tamping for Railway Tracks, Reducing Railway Maintenance Expenditures by the Use of Integer Programming

Authors: Rui Li, Min Wen, Kim Bang Salling

Abstract:

For modern railways, maintenance is critical for ensuring safety, train punctuality and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 and 100,000 Euros per kilometer per year. In order to reduce such maintenance expenditures, this paper presents a mixed 0-1 linear mathematical model designed to optimize predictive railway tamping activities for ballasted track over a planning horizon of three to four years. The objective function minimizes the actual costs of the tamping machine. The approach uses a simple dynamic model of the condition-based tamping process and a solution method for finding the optimal condition-based tamping schedule. Seven technical and practical aspects are taken into account to schedule tamping: (1) track degradation of the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on the train speed limits; (4) the dependency of the track quality recovery on the track quality after the tamping operation; (5) tamping machine operation practices; (6) tamping budgets; and (7) differentiating the open track from the station sections. The proposed maintenance model is applied to a 42.6 km Danish railway track between Odense and Fredericia for time periods of three and four years. The generated tamping schedule is reasonable and robust. Based on the results from the Danish railway corridor, the total costs can be reduced significantly (by 50%) compared with a previous model based on optimizing the number of tamping operations. Different maintenance strategies are discussed in the paper. The analysis of the results obtained from the model also shows that a longer period of predictive tamping planning gives a more optimal scheduling of maintenance actions than continuous short-term preventive maintenance, namely yearly condition-based planning.
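
A toy 0-1 model in this spirit is sketched below with PuLP: for each track section and year, decide whether to tamp so that a linearly growing standard deviation of the longitudinal level never exceeds a threshold, while the number of interventions (a crude proxy for machine cost) is minimized. The degradation rates, recovery and threshold are illustrative assumptions, not the Danish corridor data or the paper's full model.

```python
# Toy 0-1 condition-based tamping schedule.
import pulp

sections = ["s1", "s2", "s3"]
years = [1, 2, 3, 4]
initial = {"s1": 1.2, "s2": 1.6, "s3": 0.9}   # initial std dev (mm)
rate = {"s1": 0.40, "s2": 0.55, "s3": 0.30}   # degradation per year (mm)
recovery = 0.8                                 # std-dev reduction per tamping (mm)
threshold = 2.2                                # speed-dependent quality limit (mm)

prob = pulp.LpProblem("tamping_plan", pulp.LpMinimize)
x = {(s, t): pulp.LpVariable(f"x_{s}_{t}", cat="Binary") for s in sections for t in years}

for s in sections:
    for t in years:
        # predicted std dev at the end of year t must respect the threshold
        prob += (initial[s] + rate[s] * t
                 - recovery * pulp.lpSum(x[s, u] for u in years if u <= t)) <= threshold

prob += pulp.lpSum(x.values())     # minimize the number of tamping interventions
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(sorted((s, t) for (s, t) in x if x[s, t].value() == 1))
```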

Keywords: integer programming, railway tamping, predictive maintenance model, preventive condition-based maintenance

Procedia PDF Downloads 446
587 Procedure to Optimize the Performance of a Chemical Laser Using Genetic Algorithm Optimization

Authors: Mohammedi Ferhate

Abstract:

This work presents details of the study of the entire flow inside the facility, where the exothermic chemical reaction process in the chemical laser cavity is analyzed. In our paper, we describe the principles of chemical lasers, in which the population inversion is produced by chemical reactions. We explain the device for converting chemical potential energy into laser energy. We see that the phenomenon thus has an explosive trend. Finally, the feasibility and effectiveness of the proposed method are demonstrated by computer simulation.
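
A minimal, generic genetic-algorithm loop of the kind referred to above is sketched below. The fitness function here is a simple stand-in; in the study it would be replaced by the laser performance predicted by the flow and cavity model for a given set of nozzle and operating parameters, which are not reproduced here.

```python
# Minimal genetic algorithm: selection, crossover, mutation over normalized parameters.
import random

rng = random.Random(1)
N_GENES, POP, GENERATIONS = 4, 30, 60
BOUNDS = [(0.0, 1.0)] * N_GENES          # normalized design parameters

def fitness(genes):                      # stand-in objective (assumption)
    return -sum((g - 0.7) ** 2 for g in genes)

def crossover(a, b):
    cut = rng.randrange(1, N_GENES)
    return a[:cut] + b[cut:]

def mutate(genes, p=0.1):
    return [min(hi, max(lo, g + rng.gauss(0, 0.1))) if rng.random() < p else g
            for g, (lo, hi) in zip(genes, BOUNDS)]

population = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]                 # truncation selection
    children = [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("best parameters:", [round(g, 3) for g in best])
```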

Keywords: genetic, lasers, nozzle, programming

Procedia PDF Downloads 95
586 Measurement of CES Production Functions Considering Energy as an Input

Authors: Donglan Zha, Jiansong Si

Abstract:

Because of its flexibility, the CES function attracts much interest for economic growth and programming models, as well as for macroeconomic and micro-macro models. This paper focuses on the development of, and estimation methods for, CES production functions that consider energy as an input. We leave for future research the relaxation of the assumption of constant returns to scale, the introduction of potential input factors, and the generalization of the optimal nested form of multi-factor production functions.
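
For reference, one common nested CES specification with capital, labour and energy under constant returns to scale is written out below; the nesting structure and parameter names are a representative choice, not necessarily the form adopted in the paper.

```latex
% Representative (KE)L-nested CES production function with energy as an input.
\begin{equation}
  Y = A \left[ \delta \left( \alpha K^{-\rho_1} + (1-\alpha) E^{-\rho_1} \right)^{\rho/\rho_1}
      + (1-\delta) L^{-\rho} \right]^{-1/\rho},
  \qquad \sigma = \frac{1}{1+\rho},
\end{equation}
% where sigma is the (outer) elasticity of substitution and A is a technology term.
```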

Keywords: bias of technical change, CES production function, elasticity of substitution, energy input

Procedia PDF Downloads 282
585 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is not far-fetched, yet proper classification of this textual information in a given context has been very difficult. As a result, we conducted a systematic review of previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier that can correctly classify social media textual information of a given context as hate speech or inverted compliments with a high level of accuracy, by assessing different artificial intelligence techniques. We evaluated over 250 articles from digital sources like ScienceDirect, ACM, Google Scholar, and IEEE Xplore and whittled the number of studies down to 31. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of performance accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources like Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI library functionality. Based on some of the important findings from this study, we make recommendations for future research.
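
A sketch of the kind of hybrid CNN+LSTM binary sentiment classifier the review found to perform well is given below, written with TensorFlow/Keras. The vocabulary size, sequence length and hyperparameters are placeholders, and tokenization plus a labelled dataset (e.g., tweets) must be prepared separately; this is not a model from any of the reviewed papers.

```python
# Hybrid CNN+LSTM binary sentiment classifier sketch (TensorFlow/Keras).
from tensorflow.keras import layers, models

VOCAB_SIZE, MAX_LEN = 20_000, 100

model = models.Sequential([
    layers.Input(shape=(MAX_LEN,)),                         # tokenized, padded sequences
    layers.Embedding(VOCAB_SIZE, 128),
    layers.Conv1D(64, kernel_size=5, activation="relu"),    # local n-gram features
    layers.MaxPooling1D(pool_size=2),
    layers.LSTM(64),                                         # longer-range context
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),                   # e.g., hate speech vs. not
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()

# model.fit(x_train, y_train, validation_split=0.1, epochs=5, batch_size=64)
```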

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 116
584 Tool for Determining the Similarity between Two Web Applications

Authors: Doru Anastasiu Popescu, Raducanu Dragos Ionut

Abstract:

This paper presents a tool which measures the similarity between two websites. The websites consist only of webpages created with HTML. The tool uses three ways of calculating the similarity between two websites, based on certain results already published. The first way compares all the webpages within a website, the second way compares a webpage with all the pages within the second website, and the third way compares two webpages. The Java programming language and technologies such as Spring, Jsoup, and log4j were used for the implementation of the tool.

Keywords: Java, Jsoup, HTML, Spring

Procedia PDF Downloads 385
583 Encephalon: An Implementation of a Handwritten Mathematical Expression Solver

Authors: Shreeyam, Ranjan Kumar Sah, Shivangi

Abstract:

Recognizing and solving handwritten mathematical expressions can be a challenging task, particularly when characters must be segmented and classified. This project proposes a solution that uses a Convolutional Neural Network (CNN) and image processing techniques to accurately solve various types of equations, including arithmetic, quadratic, and trigonometric equations, as well as logical operations like logical AND, OR, NOT, NAND, XOR, and NOR. The proposed solution also provides a graphical solution, allowing users to visualize equations and their solutions. In addition to equation solving, the platform, called CNNCalc, offers a comprehensive learning experience for students. It provides educational content, a quiz platform, and a coding platform for practicing programming skills in different languages like C, Python, and Java, and users can track their progress and work towards improving their skills. This all-in-one solution makes the learning process engaging and enjoyable for students. The proposed methodology includes horizontal compact projection analysis for segmentation and binarization, as well as connected component analysis and integrated connected component analysis for character classification. The compact projection algorithm compresses the horizontal projections to remove noise and obtain a clearer image, contributing to the accuracy of character segmentation. The platform features a user-friendly interface with a graphical representation of the equations being solved, making it an interactive and engaging learning experience for users. Experimental results demonstrate the accuracy and effectiveness of the proposed solution in solving a wide range of equations, including arithmetic, quadratic, trigonometric, and logical operations. With its comprehensive features and accurate results, CNNCalc is poised to revolutionize the way students learn and solve mathematical equations.
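
A minimal sketch of projection-based segmentation is shown below: foreground pixels of a binarized image are summed along one axis, and the expression is cut wherever the projection drops to zero. The real CNNCalc pipeline described above additionally compacts the projections, applies connected-component analysis, and classifies the resulting symbols with a CNN; the synthetic image here is only for illustration.

```python
# Projection-based segmentation of a binarized image into symbol regions.
import numpy as np

def segment_by_projection(binary_img, axis=0):
    """binary_img: 2-D array of 0/1 with foreground = 1; axis=0 gives vertical cuts."""
    projection = binary_img.sum(axis=axis)
    segments, start = [], None
    for idx, value in enumerate(projection):
        if value > 0 and start is None:
            start = idx
        elif value == 0 and start is not None:
            segments.append((start, idx))
            start = None
    if start is not None:
        segments.append((start, len(projection)))
    return segments

# tiny synthetic "image": two blobs separated by blank columns
img = np.zeros((5, 12), dtype=int)
img[1:4, 1:4] = 1
img[1:4, 7:11] = 1
print(segment_by_projection(img, axis=0))   # -> [(1, 4), (7, 11)]
```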

Keywords: AI, ML, handwritten equation solver, maths, computer, CNNCalc, convolutional neural networks

Procedia PDF Downloads 124
582 Producing Graphical User Interface from Activity Diagrams

Authors: Ebitisam K. Elberkawi, Mohamed M. Elammari

Abstract:

A Graphical User Interface (GUI) is as essential to programming as any other characteristic or feature, because GUI components provide the fundamental interaction between the user and the program. Thus, we must pay more attention to the GUI during the building and development of systems, and we must also give greater attention to the user, who is the cornerstone of any interaction with the GUI. This paper introduces an approach for designing GUIs from one of the models of business workflows that describe the workflow behavior of a system, specifically activity diagrams (AD).

Keywords: activity diagram, graphical user interface, GUI components, program

Procedia PDF Downloads 464
581 Pavement Management for a Metropolitan Area: A Case Study of Montreal

Authors: Luis Amador Jimenez, Md. Shohel Amin

Abstract:

Pavement performance models are based on projections of observed traffic loads, which makes it uncertain to study funding strategies in the long run if history does not repeat itself. Neural networks can be used to estimate deterioration rates, but the learning rate and momentum have not been properly investigated; in addition, economic developments could change traffic flows. This study addresses both issues through a case study of the roads of Montreal that simulates traffic for a period of 50 years and deals with the measurement error of the pavement deterioration model. Travel demand models are applied to simulate annual average daily traffic (AADT) every 5 years. Accumulated equivalent single axle loads (ESALs) are calculated from the predicted AADT and locally observed truck distributions combined with truck factors. A back-propagation neural network (BPN) with a Generalized Delta Rule (GDR) learning algorithm is applied to estimate pavement deterioration models capable of overcoming measurement errors. Linear programming for lifecycle optimization is applied to identify M&R strategies that ensure good pavement condition while minimizing the budget. It was found that CAD 150 million is the minimum annual budget required to keep arterial and local roads in Montreal in good condition. Montreal drivers prefer the use of public transportation for work and education purposes. Vehicle traffic is expected to double within 50 years, while ESALs are expected to double every 15 years. Roads on the island of Montreal need to undergo a stabilization period of about 25 years, after which a steady state seems to be reached.
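
The sketch below shows how the two training parameters discussed above, learning rate and momentum, are exposed when fitting a back-propagation network for a deterioration model, here with scikit-learn's MLPRegressor. The features (age, cumulative ESALs) and condition index are synthetic placeholders, not the Montreal data, and the paper's own GDR implementation may differ.

```python
# Back-propagation deterioration model with explicit learning rate and momentum.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
age = rng.uniform(0, 30, 400)                       # years in service
esals = rng.uniform(0.1, 20, 400)                   # cumulative ESALs (millions)
condition = 100 - 1.5 * age - 2.0 * esals + rng.normal(0, 3, 400)  # noisy condition index

X = np.column_stack([age, esals])
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(
        hidden_layer_sizes=(8,),
        solver="sgd",              # stochastic gradient descent back-propagation
        learning_rate_init=0.01,   # learning rate under study
        momentum=0.9,              # momentum under study
        max_iter=2000,
        random_state=0,
    ),
)
model.fit(X, condition)
print("R^2 on training data:", round(model.score(X, condition), 3))
```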

Keywords: pavement management system, traffic simulation, backpropagation neural network, performance modeling, measurement errors, linear programming, lifecycle optimization

Procedia PDF Downloads 461
580 A Sustainable Supplier Selection and Order Allocation Based on Manufacturing Processes and Product Tolerances: A Multi-Criteria Decision Making and Multi-Objective Optimization Approach

Authors: Ravi Patel, Krishna K. Krishnan

Abstract:

In global supply chains, appropriate and sustainable suppliers play a vital role in supply chain development and feasibility. In a large organization with a huge number of suppliers, it is necessary to categorize suppliers based on their past history of quality and delivery for each product category. Since the performance of any organization depends widely on its suppliers, well-evaluated selection criteria and decision-making models lead to improved supplier assessment and development. In this paper, the SCOR® performance evaluation approach and ISO standards are used to determine selection criteria for better utilization of supplier assessment, using a hybrid model of the Analytic Hierarchy Process (AHP) and the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS). AHP is used to determine the global weights of the criteria, which help TOPSIS obtain supplier scores using triangular fuzzy set theory. Both qualitative and quantitative criteria are taken into consideration in the proposed model. In addition, a multi-product and multi-time-period model is selected for order allocation. The optimization model integrates multi-objective integer linear programming (MOILP) for order allocation and a hybrid approach for supplier selection. The proposed MOILP model optimizes order allocation based on the manufacturing process and product tolerances as per the manufacturer’s requirements for a quality product. The integrated model and solution approach are tested to find optimized solutions for different scenarios. The detailed analysis shows the superiority of the proposed model over other solutions that consider individual decision-making models.
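
The snippet below sketches the TOPSIS scoring step with externally supplied (e.g., AHP-derived) criterion weights. The paper uses triangular fuzzy numbers (FTOPSIS); this crisp version only illustrates the mechanics, and the supplier data, criteria and weights are assumptions.

```python
# Crisp TOPSIS: normalize, weight, measure distance to ideal and anti-ideal solutions.
import numpy as np

# rows = suppliers, columns = criteria (quality, delivery, cost, tolerance deviation)
X = np.array([
    [0.90, 0.85, 120.0, 0.02],
    [0.80, 0.95, 100.0, 0.05],
    [0.95, 0.70, 140.0, 0.01],
])
weights = np.array([0.35, 0.25, 0.25, 0.15])      # e.g., global AHP weights
benefit = np.array([True, True, False, False])     # cost and deviation: lower is better

R = X / np.linalg.norm(X, axis=0)                  # vector normalization
V = R * weights                                    # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print("closeness coefficients:", np.round(closeness, 3))   # higher = better supplier
```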

Keywords: AHP, fuzzy set theory, multi-criteria decision making, multi-objective integer linear programming, TOPSIS

Procedia PDF Downloads 172
579 An Adiabatic Quantum Optimization Approach for the Mixed Integer Nonlinear Programming Problem

Authors: Maxwell Henderson, Tristan Cook, Justin Chan Jin Le, Mark Hodson, YoungJung Chang, John Novak, Daniel Padilha, Nishan Kulatilaka, Ansu Bagchi, Sanjoy Ray, John Kelly

Abstract:

We present a method of using adiabatic quantum optimization (AQO) to solve a mixed integer nonlinear programming (MINLP) problem instance. The MINLP problem is a general form of a set of NP-hard optimization problems that are critical to many business applications. It requires optimizing a set of discrete and continuous variables with nonlinear and potentially nonconvex constraints. Obtaining an exact, optimal solution for MINLP problem instances of non-trivial size using classical computation methods is currently intractable. Current leading algorithms leverage heuristic and divide-and-conquer methods to determine approximate solutions. Creating more accurate and efficient algorithms is an active area of research. Quantum computing (QC) has several theoretical benefits compared to classical computing, through which QC algorithms could obtain MINLP solutions that are superior to current algorithms. AQO is a particular form of QC that could offer more near-term benefits compared to other forms of QC, as hardware development is in a more mature state and devices are currently commercially available from D-Wave Systems Inc. It is also designed for optimization problems: it uses an effect called quantum tunneling to explore all lowest points of an energy landscape where classical approaches could become stuck in local minima. Our work used a novel algorithm formulated for AQO to solve a special type of MINLP problem. The research focused on determining: 1) if the problem is possible to solve using AQO, 2) if it can be solved by current hardware, 3) what the currently achievable performance is, 4) what the performance will be on projected future hardware, and 5) when AQO is likely to provide a benefit over classical computing methods. Two different methods, integer range and 1-hot encoding, were investigated for transforming the MINLP problem instance constraints into a mathematical structure that can be embedded directly onto the current D-Wave architecture. For testing and validation a D-Wave 2X device was used, as well as QxBranch’s QxLib software library, which includes a QC simulator based on simulated annealing. Our results indicate that it is mathematically possible to formulate the MINLP problem for AQO, but that currently available hardware is unable to solve problems of useful size. Classical general-purpose simulated annealing is currently able to solve larger problem sizes, but does not scale well and such methods would likely be outperformed in the future by improved AQO hardware with higher qubit connectivity and lower temperatures. If larger AQO devices are able to show improvements that trend in this direction, commercially viable solutions to the MINLP for particular applications could be implemented on hardware projected to be available in 5-10 years. Continued investigation into optimal AQO hardware architectures and novel methods for embedding MINLP problem constraints on to those architectures is needed to realize those commercial benefits.
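
The snippet below sketches the "1-hot encoding" idea mentioned above in plain Python: a discrete variable that must take exactly one of K admissible values is represented by K binary variables, and the constraint is enforced by adding the quadratic penalty P·(Σᵢ xᵢ − 1)² to the QUBO. Variable labels and the penalty weight are illustrative, and no quantum SDK is involved; the embedding onto D-Wave hardware is a separate step not shown here.

```python
# 1-hot constraint as a QUBO penalty: P*(sum_i x_i - 1)^2, expanded for binary x_i.
from itertools import combinations

def one_hot_penalty_qubo(var_labels, penalty=2.0):
    """QUBO coefficients {(i, j): coeff} for P*(sum x - 1)^2, up to the constant +P."""
    qubo = {}
    for v in var_labels:                 # x^2 = x for binaries: linear coefficient is -P
        qubo[(v, v)] = -penalty
    for u, v in combinations(var_labels, 2):
        qubo[(u, v)] = 2.0 * penalty     # cross terms from squaring the sum
    return qubo

def penalty_energy(assignment, qubo, penalty=2.0):
    """Evaluate the penalty (including the constant +P) for a binary assignment."""
    e = penalty
    for (u, v), c in qubo.items():
        e += c * assignment[u] * assignment[v]
    return e

labels = ["y_val0", "y_val1", "y_val2"]     # a discrete variable with 3 admissible values
qubo = one_hot_penalty_qubo(labels)
print(qubo)
print(penalty_energy({"y_val0": 1, "y_val1": 0, "y_val2": 0}, qubo),
      penalty_energy({"y_val0": 1, "y_val1": 1, "y_val2": 0}, qubo))   # 0.0 vs 2.0
```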

Keywords: adiabatic quantum optimization, mixed integer nonlinear programming, quantum computing, NP-hard

Procedia PDF Downloads 527