Search results for: fuzzy multi-objective combinatorial programming problem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8278

7048 Cryptanalysis of ID-Based Deniable Authentication Protocol Based On Diffie-Hellman Problem on Elliptic Curve

Authors: Eun-Jun Yoon

Abstract:

Deniable authentication protocol is a new security authentication mechanism which can enable a receiver to identify the true source of a given message, but not to prove the identity of the sender to a third party. In 2013, Kar proposed a secure ID-based deniable authentication protocol whose security is based on the computational infeasibility of solving the Elliptic Curve Diffie-Hellman Problem (ECDHP). Kar claimed that the proposed protocol achieves the properties of deniable authentication, mutual authentication, and message confidentiality. However, this paper points out that, contrary to its claims, Kar's protocol still suffers from sender spoofing and message modification attacks.
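
For context, the ECDHP assumption underlying such protocols rests on the ordinary elliptic-curve Diffie-Hellman exchange. The sketch below, using Python's `cryptography` package, shows the shared-secret computation whose hardness the security argument relies on; the curve choice, key names, and the HKDF key-derivation step are illustrative conventions, not taken from Kar's protocol.

```python
# Minimal ECDH sketch; curve and names are illustrative, not Kar's protocol.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral key pair on the same curve.
sender_key = ec.generate_private_key(ec.SECP256R1())
receiver_key = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the peer's public key; both
# arrive at the same shared secret, which an eavesdropper cannot
# recover without solving ECDHP.
shared_s = sender_key.exchange(ec.ECDH(), receiver_key.public_key())
shared_r = receiver_key.exchange(ec.ECDH(), sender_key.public_key())
assert shared_s == shared_r

# Derive a symmetric session key from the shared secret (generic practice).
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"deniable-auth").derive(shared_s)
```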

Keywords: deniable authentication, elliptic curve cryptography, Diffie-Hellman problem, cryptanalysis

Procedia PDF Downloads 318
7047 A Semi-Analytical Method for Analysis of the Axially Symmetric Problem on Indentation of a Hot Circular Punch into an Arbitrarily Nonhomogeneous Halfspace

Authors: S. Aizikovich, L. Krenev, Y. Tokovyy, Y. C. Wang

Abstract:

An approximate analytical-numerical solution to the axisymmetric problem of thermo-mechanical indentation of a flat cylindrical punch into an arbitrarily non-homogeneous elastic half-space is constructed by making use of the bilateral asymptotic method. The key point of this method lies in the evaluation of the kernels of the obtained integral equations by means of a numerical technique. Once the structure of the kernel is defined, it is then approximated by an analytical expression of a special kind so that the solution of the integral equation can be achieved analytically. This allows for construction of the solution in an analytical form, which is convenient for analysis of the mechanical effects associated with the arbitrarily presumed non-homogeneity of the material.
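
The abstract does not reproduce the equations, but the reduction it describes typically takes the following generic form: a Fredholm integral equation of the second kind whose kernel transform is evaluated numerically and then fitted by a ratio of quadratics so the equation admits a closed-form solution. The notation below is generic, not the paper's:

```latex
% Generic form only; the paper's actual kernels are not reproduced here.
\varphi(r) + \int_{0}^{1} K(r,\rho)\,\varphi(\rho)\,\mathrm{d}\rho = f(r),
\qquad
L(u) \approx \prod_{i=1}^{N} \frac{u^{2} + A_{i}^{2}}{u^{2} + B_{i}^{2}},
```

where $L(u)$ is the transform of the kernel and the constants $A_i$, $B_i$ are fitted numerically.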

Keywords: contact problem, circular punch, arbitrarily-nonhomogeneous halfspace

Procedia PDF Downloads 501
7046 Synchronized Vehicle Routing for Equitable Resource Allocation in Food Banks

Authors: Rabiatu Bonku, Faisal Alkaabneh

Abstract:

Inspired by the distribution operations of a non-profit food bank, we study a variant of the synchronized vehicle routing problem for equitable resource allocation. This paper introduces a Mixed Integer Programming (MIP) model aimed at addressing the complex challenge of efficiently distributing vital resources, particularly for food banks serving vulnerable populations in urban areas. Our optimization approach places a strong emphasis on social equity, ensuring a fair allocation of food to partner agencies while minimizing wastage. The primary objective is to enhance operational efficiency while guaranteeing fair distribution and timely deliveries to prevent food spoilage. Furthermore, we assess four distinct models that consider various aspects of sustainability, including social and economic factors. We conduct a comprehensive numerical analysis using real-world data to gain insights into the trade-offs that arise, and we demonstrate the models' performance in terms of fairness, effectiveness, and the percentage of food waste, providing valuable managerial insights for food bank managers. We show that our proposed approach contributes to the field of logistics optimization and social responsibility, offering guidance for improving the operations of food banks.
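
The paper's MIP is not given in the abstract; as a rough illustration of the equity objective it describes, the toy model below maximizes the minimum fill rate across partner agencies subject to a supply limit, using the PuLP library. All data and names are hypothetical.

```python
# Toy equitable-allocation core: maximize the minimum fill rate across
# agencies subject to total supply. Data and names are hypothetical,
# not the paper's model.
import pulp

agencies = ["A", "B", "C"]
demand = {"A": 120, "B": 80, "C": 100}   # boxes of food requested
supply = 240                              # boxes available at the food bank

model = pulp.LpProblem("equitable_allocation", pulp.LpMaximize)
x = pulp.LpVariable.dicts("alloc", agencies, lowBound=0)
z = pulp.LpVariable("min_fill_rate", lowBound=0, upBound=1)

model += z  # objective: max-min fairness
model += pulp.lpSum(x[a] for a in agencies) <= supply
for a in agencies:
    model += x[a] <= demand[a]
    model += x[a] >= z * demand[a]  # every agency gets at least z of its demand

model.solve(pulp.PULP_CBC_CMD(msg=False))
print({a: x[a].value() for a in agencies}, "min fill rate:", z.value())
```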

Keywords: food banks, humanitarian logistics, equitable resource allocation, synchronized vehicle routing

Procedia PDF Downloads 46
7045 Optimization of the Jatropha curcas Supply Chain as a Criterion for the Implementation of Future Collection Points in Rural Areas of Manabi-Ecuador

Authors: Boris G. German, Edward Jiménez, Sebastián Espinoza, Andrés G. Chico, Ricardo A. Narváez

Abstract:

The unique flora and fauna of the Galapagos Islands have driven tourism-led growth in the islands. Nonetheless, such development is energy-intensive and requires thousands of gallons of diesel each year for thermoelectric electricity generation. The transport of fossil fuels from the continent has caused oil spillages and damage to the fragile ecosystem of the islands. The Zero Fossil Fuels initiative for the Galapagos, proposed by the Ecuadorian government as an alternative to reduce the use of fossil fuels in the islands, considers the replacement of diesel in thermoelectric generators by Jatropha curcas vegetable oil. However, the Jatropha oil supply cannot yet entirely cover the demand for electricity generation in Galapagos. Within this context, the present work aims to provide an optimization model that can be used as a selection criterion for approving new Jatropha curcas collection points in rural areas of Manabi-Ecuador. For this purpose, existing Jatropha collection points in Manabi were grouped into three regions: north (7 collection points), center (4 collection points), and south (9 collection points). Field work was carried out in every region in order to characterize the collection points, to establish the local Jatropha supply, and to determine transportation costs. Data collection was complemented using GIS software, and an objective function was defined to determine the profit associated with Jatropha oil production. The market prices of both Jatropha oil and residual cake were considered for the total revenue, whereas the Jatropha price, transportation, and oil extraction costs were considered for the total cost. The tonnes of Jatropha fruit and seed transported from collection points to the extraction plant were taken as the variables. The maximum and minimum amounts of Jatropha collected from each region constrained the optimization problem. The supply chain was optimized using linear programming in order to maximize profits. Finally, a sensitivity analysis was performed in order to find a profit-based criterion for the acceptance of future collection points in Manabi. The maximum profit reached $4,616.93 per year, corresponding to a total collection of 62.3 tonnes of Jatropha per year. The northern region of Manabi had the biggest collection share (69%), followed by the southern region (17%). The criteria for accepting new Jatropha collection points in the rural areas of Manabi can be defined by the current maximum profit of the zone and by the variation in profit when collection points are removed one at a time. The definition of new feasible collection points plays a key role in the supply chain associated with Jatropha oil production. Therefore, a mathematical model that assists decision makers in establishing new collection points while assuring profitability contributes to guaranteeing a continued Jatropha oil supply for Galapagos and sustained economic growth in the rural areas of Ecuador.
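
The abstract specifies the LP's ingredients (profit objective, regional collection bounds, plant intake) but not its data; a minimal sketch of that structure with invented placeholder numbers, using scipy, might look like this:

```python
# Sketch of the profit-maximizing LP structure described above; prices,
# costs, and regional bounds are invented placeholders, not the study's data.
import numpy as np
from scipy.optimize import linprog

# Decision variables: tonnes collected per year from north, center, south.
profit_per_tonne = np.array([80.0, 65.0, 55.0])  # revenue minus transport/extraction cost

# linprog minimizes, so negate the profit vector.
c = -profit_per_tonne

# Regional collection bounds (min, max) in tonnes per year.
bounds = [(10.0, 45.0), (5.0, 15.0), (5.0, 20.0)]

# Extraction plant intake capacity: total tonnes per year.
A_ub = np.ones((1, 3))
b_ub = np.array([62.3])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("tonnes per region:", res.x, "profit:", -res.fun)
```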

Keywords: collection points, Jatropha curcas, linear programming, supply chain

Procedia PDF Downloads 414
7044 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation

Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski

Abstract:

In problems of portfolio selection, the reward-risk ratio criterion is optimized to search for a risky portfolio offering the maximum increase of the mean return in proportion to the increase of the risk measure when compared to the risk-free investments. In the classical model, following Markowitz, the risk is measured by the variance, thus representing the Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization. In particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints in the case of real-life financial decisions based on several thousands of scenarios, thus decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization and by taking advantage of the LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments, so the number of scenarios does not seriously affect the efficiency of the simplex method, thereby guaranteeing easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with the second-order stochastic dominance rules.
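
For reference, the standard scenario-based definitions behind this formulation (general-literature notation, not necessarily the paper's) are:

```latex
% Mean return and MAD for scenario returns r_{jt} with probabilities p_t
% and portfolio weights x_j.
\mu(x) = \sum_{t} p_t \sum_{j} r_{jt}\, x_j,
\qquad
\mathrm{MAD}(x) = \sum_{t} p_t \Bigl|\sum_{j} r_{jt}\, x_j - \mu(x)\Bigr|,
```

and the ratio criterion is to maximize $(\mu(x) - r_0)/\mathrm{MAD}(x)$ for a risk-free rate $r_0$, which the authors replace by minimizing the inverse ratio $\mathrm{MAD}(x)/(\mu(x) - r_0)$.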

Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming

Procedia PDF Downloads 389
7043 Multi-Objective Simultaneous Assembly Line Balancing and Buffer Sizing

Authors: Saif Ullah, Guan Zailin, Xu Xianhao, He Zongdong, Wang Baoxi

Abstract:

The assembly line balancing problem aims to divide the tasks among the stations in assembly lines and to optimize some objectives. In assembly lines, the workloads of stations differ from one another due to differing task times, and the difference in workloads between stations can cause blockage or starvation in some stations. Buffers are used to store the semi-finished parts between the stations and can help to smooth assembly production. Both line balancing and buffer sizing affect the throughput of assembly lines. Assembly line balancing and buffer sizing problems have been studied separately in the literature; owing to their collective contribution to the throughput rate of assembly lines, they are considered concurrently in the current research. This research aims to simultaneously maximize throughput, minimize the total size of buffers, and minimize workload variations in the assembly line. A multi-objective optimization model is designed which can give better Pareto solutions from the Pareto front, and a simple example problem is solved for assembly line balancing and buffer sizing simultaneously. This work is significant for assembly line balancing research, and future work can introduce optimization approaches for the present multi-objective problem.
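
To make the non-dominance notion concrete, a generic Pareto filter for the three objectives named above (throughput maximized; total buffer size and workload variation minimized) can be written as follows; the candidate tuples are hypothetical, not the paper's data.

```python
# Generic Pareto filter for (throughput, total_buffer, workload_variation);
# candidate solutions below are hypothetical.

def dominates(a, b):
    """a dominates b: no worse in all objectives, strictly better in one.
    Throughput is maximized, the other two objectives are minimized."""
    no_worse = a[0] >= b[0] and a[1] <= b[1] and a[2] <= b[2]
    strictly = a[0] > b[0] or a[1] < b[1] or a[2] < b[2]
    return no_worse and strictly

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

candidates = [(95, 12, 0.30), (90, 8, 0.25), (95, 15, 0.28), (85, 8, 0.35)]
print(pareto_front(candidates))  # the last tuple is dominated and dropped
```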

Keywords: assembly line balancing, buffer sizing, Pareto solutions

Procedia PDF Downloads 475
7042 Determining the Most Efficient Test Available in Software Testing

Authors: Qasim Zafar, Matthew Anderson, Esteban Garcia, Steven Drager

Abstract:

Software failures can present an enormous detriment to people's lives and cost millions of dollars to repair when they are unexpectedly encountered in the wild. Although a significant portion of the software development lifecycle and its resources is dedicated to testing, software failures are a relatively frequent occurrence. Nevertheless, the evaluation of testing effectiveness remains at the forefront of ensuring high-quality software, and software metrics play a critical role in providing valuable insights into quantifiable objectives to assess the level of assurance and confidence in the system. As the selection of appropriate metrics can be an arduous process, the goal of this paper is to shed light on the significance of software metrics by examining a range of testing techniques and metrics as well as identifying key areas for improvement. Additionally, through this investigation, readers will gain a deeper understanding of how metrics can help to drive informed decision-making on delivering high-quality software and facilitate continuous improvement in testing practices.
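
One of the techniques named in the keywords, adaptive random testing, shows how such a strategy can be made concrete. Below is a minimal fixed-size-candidate-set variant for a one-dimensional numeric input domain; the domain bounds and parameters are placeholders, not drawn from the paper.

```python
# Minimal fixed-size-candidate-set adaptive random testing: each new test
# is the random candidate farthest from all previously executed tests,
# spreading tests evenly to reach failure regions faster. Parameters
# and the input domain are illustrative.
import random

def adaptive_random_tests(n_tests, candidates_per_round=10, lo=0.0, hi=1.0):
    executed = [random.uniform(lo, hi)]  # first test is purely random
    while len(executed) < n_tests:
        pool = [random.uniform(lo, hi) for _ in range(candidates_per_round)]
        best = max(pool, key=lambda c: min(abs(c - e) for e in executed))
        executed.append(best)
    return executed

print(adaptive_random_tests(5))
```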

Keywords: software testing, software metrics, testing effectiveness, black box testing, random testing, adaptive random testing, combinatorial testing, fuzz testing, equivalence partition, boundary value analysis, white box testing

Procedia PDF Downloads 62
7041 Adapting the Chemical Reaction Optimization Algorithm to the Printed Circuit Board Drilling Problem

Authors: Taisir Eldos, Aws Kanan, Waleed Nazih, Ahmad Khatatbih

Abstract:

Chemical Reaction Optimization (CRO) is an optimization metaheuristic inspired by the nature of chemical reactions as a natural process of transforming substances from unstable to stable states. Starting with some unstable molecules with excessive energy, a sequence of interactions takes the set to a state of minimum energy. Researchers have reported successful application of the algorithm to some engineering problems, like the quadratic assignment problem, with superior performance compared with other optimization algorithms. We adapted this optimization algorithm to the Printed Circuit Board Drilling Problem (PCBDP) towards reducing the drilling time and hence improving PCB manufacturing throughput. Although the PCBDP can be viewed as an instance of the popular Traveling Salesman Problem (TSP), it has some characteristics that require special attention to the reactions that explore the solution landscape. Experimental test results using the standard CROToolBox are not promising for practically sized problems, while the algorithm could find optimal solutions for artificial problems and small benchmarks as a proof of concept.
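
For readers unfamiliar with CRO's mechanics, the sketch below shows one of its elementary reactions (an on-wall ineffective collision) as it might be adapted to a drilling tour, with tour length playing the role of potential energy. This is a simplified illustration, not the authors' CROToolBox implementation.

```python
# Simplified sketch of a CRO on-wall ineffective collision for a tour;
# tour length acts as potential energy. Illustrative only, not the
# authors' implementation.
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def on_wall_collision(tour, dist, buffer_energy):
    """2-opt perturbation, accepted if the energy buffer can absorb any
    increase in potential energy (a simplified acceptance rule)."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    new_tour = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
    delta = tour_length(new_tour, dist) - tour_length(tour, dist)
    if delta <= buffer_energy:
        return new_tour, buffer_energy - max(delta, 0.0)
    return tour, buffer_energy
```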

Keywords: evolutionary algorithms, chemical reaction optimization, traveling salesman, board drilling

Procedia PDF Downloads 499
7040 Harmonic Assessment and Mitigation in Medical Diagnosis Equipment

Authors: S. S. Adamu, H. S. Muhammad, D. S. Shuaibu

Abstract:

Poor power quality in electrical power systems can cause medical equipment at healthcare centres to malfunction and present wrong medical diagnoses. Equipment such as X-ray machines, computerized axial tomography scanners, etc. can pollute the system due to their high level of harmonics production, which may cause a number of undesirable effects like heating, equipment damage and electromagnetic interference. The conventional approach to mitigation uses passive inductor/capacitor (LC) filters, which have some drawbacks such as large sizes, resonance problems and fixed compensation behaviour. Current solutions generally employ active power filters using suitable control algorithms. This work focuses on assessing the level of Total Harmonic Distortion (THD) in medical facilities and various ways of mitigating it, using the radiology unit of an existing hospital as a case study. The measurement of the harmonics is conducted with a power quality analyzer at the point of common coupling (PCC). The levels of measured THD are found to be higher than the IEEE 519-1992 standard limits. The system is then modelled as a harmonic current source using MATLAB/SIMULINK. To mitigate the unwanted harmonic currents, a shunt active filter is developed using a synchronous detection algorithm to extract the fundamental component of the source currents. A fuzzy logic controller is then developed to control the filter. The THD values without the active power filter are validated using the measured values. The THD values with the developed filter show that the harmonics are now within the recommended limits.
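
The THD figure used throughout is the RMS of the harmonic components relative to the fundamental; a minimal numpy computation from a sampled waveform (sampling settings illustrative) might look like this:

```python
# THD of a sampled current waveform: RMS of harmonics 2..H relative to
# the fundamental. Sampling settings and the test wave are illustrative.
import numpy as np

def thd(signal, fs, f1, n_harmonics=40):
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    def mag(f):  # magnitude of the bin nearest frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = mag(f1)
    harmonics = [mag(h * f1) for h in range(2, n_harmonics + 1)]
    return np.sqrt(sum(m**2 for m in harmonics)) / fundamental

fs, f1 = 10_000, 50.0
t = np.arange(0, 0.2, 1 / fs)
i_load = (np.sin(2*np.pi*f1*t) + 0.2*np.sin(2*np.pi*5*f1*t)
          + 0.14*np.sin(2*np.pi*7*f1*t))
print(f"THD = {100 * thd(i_load, fs, f1):.1f} %")  # ~24% for this test wave
```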

Keywords: power quality, total harmonics distortion, shunt active filters, fuzzy logic

Procedia PDF Downloads 463
7039 Industrial-Waste Management in Developing Countries: The Case of Algeria

Authors: L. Sefouhi, M. Djebabra

Abstract:

Industrial operations have long been accompanied by a problem: industrial waste, which may be toxic, ignitable, corrosive or reactive. If improperly managed, this waste can pose dangerous health and environmental consequences, and its management has become a real problem for industry. The oil industry is an important sector in Algeria, from exploration to the development and marketing of hydrocarbons, and for this sector industrial wastes pose a big problem. The aim of the present study is to present in a systematic way the subject of industrial waste from the point of view of definitions in engineering and legislation. This analysis is necessary because many different approaches exist. We attempt to diagnose the current management of industrial waste, namely an inventory of deposits and the methods of sorting, packing and storage, together with a description of the different disposal routes. We then propose a reasoned and responsible management of waste that avoids a shift towards future expenses related to the disposal of such waste and prevents the pollution it causes to the environment.

Keywords: industrial waste, environment, management, pollution, risks

Procedia PDF Downloads 316
7038 An Automated R-Peak Detection Method Using Common Vector Approach

Authors: Ali Kirkbas

Abstract:

R peaks in an electrocardiogram (ECG) are signs of cardiac activity in individuals that reveal valuable information about cardiac abnormalities, which can lead to mortalities in some cases. This paper examines the problem of detecting R-peaks in ECG signals, which is, in fact, a two-class pattern classification problem. To handle this problem with reliably high accuracy, we propose to use the common vector approach, which is a successful machine learning algorithm. The dataset used in the proposed method is obtained from the publicly available MIT-BIH database. The results are compared with those of other popular methods under standard performance metrics. The obtained results show that the proposed method performs better than the other methods compared in terms of diagnostic accuracy and simplicity, and it can be operated on wearable devices.
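
For context, a common baseline against which such classifiers are compared is simple amplitude/refractory-period peak picking. The sketch below (scipy; thresholds illustrative; MIT-BIH records are sampled at 360 Hz) is such a baseline, not the paper's common vector approach.

```python
# Baseline R-peak detector for context: adaptive amplitude threshold plus
# a 200 ms refractory period (no physiological beat is faster). This is a
# heuristic baseline, not the paper's common vector classifier.
import numpy as np
from scipy.signal import find_peaks

def detect_r_peaks(ecg, fs=360):
    height = np.mean(ecg) + 1.5 * np.std(ecg)  # illustrative threshold
    min_distance = int(0.2 * fs)               # 200 ms refractory period
    peaks, _ = find_peaks(ecg, height=height, distance=min_distance)
    return peaks
```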

Keywords: ECG, R-peak classification, common vector approach, machine learning

Procedia PDF Downloads 43
7037 Adding a Few Language-Level Constructs to Improve OOP Verifiability of Semantic Correctness

Authors: Lian Yang

Abstract:

Object-oriented programming (OOP) is the dominant programming paradigm in today's software industry, and it has enabled average software developers to develop millions of commercial-strength software applications in the era of the INTERNET revolution over the past three decades. On the other hand, the lack of a strict mathematical model and of domain constraint features at the language level has long perplexed the computer science academia and the OOP engineering community. This situation has resulted in inconsistent system qualities and hard-to-understand designs in some OOP projects. The difficulties of fixing the current situation are also well known. Although the power of OOP lies in its unbridled flexibility and enormously rich data modeling capability, we argue that the ambiguity and the implicit facade surrounding the conceptual model of a class and an object should be eliminated as much as possible. We list the five major usages of a class and propose new language constructs to separate them. Using the well-established theories of sets and finite state machines (FSM), we propose to apply certain simple, generic, and yet effective constraints at the OOP language level in an attempt to find a possible solution to the above-mentioned issues. The goal is to make OOP more theoretically sound as well as to help programmers uncover warning signs of irregularities and domain-specific issues in applications early in the development stage and catch semantic mistakes at runtime, improving the correctness verifiability of software programs. That said, the aim of this paper is more practical than theoretical.

Keywords: new language constructs, set theory, FSM theory, user defined value type, function groups, membership qualification attribute (MQA), check-constraint (CC)

Procedia PDF Downloads 224
7036 A Problem-Based Learning Approach in a Writing Classroom: Tutors’ Experiences and Perceptions

Authors: Muhammad Mukhtar Aliyu

Abstract:

This study investigated tutors' experiences and perceptions of a problem-based learning (PBL) approach in a writing classroom. The study involved two Nigerian lecturers who facilitated an intact class of second-year students in an English composition course for a period of 12 weeks. Semi-structured interviews were employed to collect the data for the study. The lecturers were interviewed before and after the implementation of the PBL process. The overall findings of the study show that the lecturers had positive perceptions of the use of PBL in a writing classroom. Specifically, the findings reveal the lecturers' positive experiences and perceptions of the group activities. Finally, the paper gives some pedagogical implications which would give insight for better implementation of the PBL approach.

Keywords: experiences and perception, Nigeria, problem-based learning approach, writing classroom

Procedia PDF Downloads 148
7035 Reducing Hazardous Materials Releases from Railroad Freights through Dynamic Trip Plan Policy

Authors: Omar A. Abuobidalla, Mingyuan Chen, Satyaveer S. Chauhan

Abstract:

Railroad transportation of hazardous materials freights is important to the North American economy, supporting national supply chains. This paper introduces various extensions of the dynamic hazardous materials trip plan problem. The problem captures most of the operational features of real-world railroad transportation systems that dynamically initiate a set of blocks and assign each shipment to a single block path or multiple block paths. The dynamic hazardous materials trip plan policies have the distinguishing feature of integrating the blocking plan and the block activation decisions. We also present a non-linear mixed integer programming formulation for each variant and present managerial insights based on a hypothetical railroad network. The computational results reveal that the dynamic car scheduling policies are not only able to take advantage of the capacity of the network but are also capable of diminishing population and environmental risks by rerouting the active blocks along the least risky train services without sacrificing the cost advantage of the railroad. The empirical results of this research illustrate that the issue of integrating the blocking plan and the train makeup of hazardous materials freights must receive closer attention.
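
The keywords cite the Gaussian plume model for airborne releases; for reference, its standard form for a continuous point source is reproduced below (the paper's exact usage is not given in the abstract):

```latex
% Standard Gaussian plume concentration at receptor (x, y, z) for a
% continuous point source of strength Q at effective height H, wind
% speed u, and dispersion coefficients \sigma_y(x), \sigma_z(x).
C(x,y,z) = \frac{Q}{2\pi u \sigma_y \sigma_z}
\exp\!\Bigl(-\frac{y^{2}}{2\sigma_y^{2}}\Bigr)
\Bigl[\exp\!\Bigl(-\frac{(z-H)^{2}}{2\sigma_z^{2}}\Bigr)
    + \exp\!\Bigl(-\frac{(z+H)^{2}}{2\sigma_z^{2}}\Bigr)\Bigr]
```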

Keywords: dynamic car scheduling, planning and scheduling hazardous materials freights, airborne hazardous materials, Gaussian plume model, integrated blocking and routing plans, box model

Procedia PDF Downloads 195
7034 Prediction of Damage to Cutting Tools in an Earth Pressure Balance Tunnel Boring Machine (EPB TBM): A Case Study of the L3 Guadalajara Metro Line (Mexico)

Authors: Silvia Arrate, Waldo Salud, Eloy París

Abstract:

The wear of cutting tools is one of the most decisive elements when planning tunneling works, scheduling maintenance stops, and keeping an optimal stock of spare parts during the evolution of the excavation. Being able to predict the behavior of cutting tools can give a very competitive advantage in terms of costs and excavation performance, optimized to the needs of the TBM itself. The rapid evolution of data science in recent years makes it possible to analyze the key and most critical machine parameters with the purpose of knowing how the cutting head is performing against the excavated ground. Taking Metro Line 3 of Guadalajara in Mexico as a case study, this work develops the feasibility of using Specific Energy versus data science applied to parameters such as torque, penetration, and contact force, among others, to predict the behavior and status of the cutting tools. The results obtained through both techniques are analyzed and verified as a function of the wear and the field situations observed in the excavation, in order to determine their effectiveness regarding predictive capacity. In conclusion, the possibilities and improvements offered by the application of digital tools and the programming of calculation algorithms for the analysis of cutting head element wear, compared to purely empirical methods, allow early detection of possible damage to cutting tools, which is reflected in optimized excavation performance and a significant improvement in costs and deadlines.
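
The abstract does not define Specific Energy explicitly; in excavation practice it is commonly computed via Teale's relation, reproduced here for reference (this may differ from the paper's exact definition):

```latex
% Teale's specific energy for rotary excavation: thrust term plus rotary
% term. F = thrust, A = excavated cross-sectional area, N = rotation
% speed, T = torque, u = penetration rate.
SE = \frac{F}{A} + \frac{2\pi N T}{A\,u}
```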

Keywords: cutting tools, data science, prediction, TBM, wear

Procedia PDF Downloads 32
7033 Pruning Algorithm for the Minimum Rule Reduct Generation

Authors: Sahin Emrah Amrahov, Fatih Aybar, Serhat Dogan

Abstract:

In this paper, we consider the rule reduct generation problem. The Rule Reduct Generation (RG) and Modified Rule Generation (MRG) algorithms, which are used to solve this problem, are well known. As an alternative to these algorithms, we develop the Pruning Rule Generation (PRG) algorithm and compare it with RG and MRG.

Keywords: rough sets, decision rules, rule induction, classification

Procedia PDF Downloads 509
7032 Proximal Method for Solving a Split System of Minimization Problems

Authors: Anteneh Getachew Gebrie, Rabian Wangkeeree

Abstract:

The purpose of this paper is to introduce an iterative algorithm for solving a split system of minimization problems, given as the task of finding a common minimizer point of a finite family of proper, lower semicontinuous convex functions whose image under a bounded linear operator is also a common minimizer point of another finite family of proper, lower semicontinuous convex functions. We obtain strong convergence of the sequence generated by our algorithm under some suitable conditions on the parameters. The iterative schemes are developed with a way of selecting the step sizes such that knowledge of the operator norm is not necessary. Some applications and a numerical experiment are given to analyse the efficiency of our algorithm.
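
For reference, the Moreau-Yosida approximate named in the keywords is the following standard regularization, and the associated proximal operator is the building block of such proximal schemes:

```latex
% Moreau-Yosida regularization of a proper, lower semicontinuous convex
% function f on a Hilbert space, and the associated proximal operator.
f_{\lambda}(x) = \min_{u}\Bigl\{ f(u) + \tfrac{1}{2\lambda}\|u - x\|^{2} \Bigr\},
\qquad
\mathrm{prox}_{\lambda f}(x) = \arg\min_{u}\Bigl\{ f(u) + \tfrac{1}{2\lambda}\|u - x\|^{2} \Bigr\}
```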

Keywords: Hilbert space, minimization problems, Moreau-Yosida approximate, split feasibility problem

Procedia PDF Downloads 122
7031 Developing a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges in geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection, collation, and collaboration of existing data, incorporated in analysis and design for a given prospect evaluation, would be a reliable, practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data in order to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) has been proposed in this model. Subsequently, the stages of the model workflow methodology have been described. In order to train the data and deploy the LoK models, an ML platform has been implemented. IBM Watson Studio, as a leading data science tool and data-driven cloud-integration ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results have been outlined accordingly.

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 253
7030 Market Solvency Capital Requirement Minimization: How Non-linear Solvers Provide Portfolios Complying with Solvency II Regulation

Authors: Abraham Castellanos, Christophe Durville, Sophie Echenim

Abstract:

In this article, a portfolio optimization problem is solved in a Solvency II context: it illustrates how advanced optimization techniques can help to tackle complex operational pain points around the monitoring, control, and stability of the Solvency Capital Requirement (SCR). The market SCR of a portfolio is calculated as a combination of SCR sub-modules. These sub-modules are the results of stress-tests on interest rate, equity, property, credit and FX factors, as well as concentration on counter-parties. The market SCR is non-convex and non-differentiable, which does not make it a natural optimization criterion. In the SCR formulation, correlations between sub-modules are fixed, whereas risk-driven portfolio allocation is usually driven by the dynamics of the actual correlations. Implementing a portfolio construction approach that is efficient from both a regulatory and an economic standpoint is not straightforward. Moreover, the challenge for insurance portfolio managers is not only to achieve a minimal SCR to reduce non-invested capital but also to ensure the stability of the SCR. Some optimizations have already been performed in the literature by simplifying the standard formula into a quadratic function, but to our knowledge, this is the first time that the standard formula of the market SCR is used in an optimization problem. Two solvers are combined: a bundle algorithm for convex non-differentiable problems, and a BFGS (Broyden-Fletcher-Goldfarb-Shanno)-SQP (Sequential Quadratic Programming) algorithm to cope with non-convex cases. A market SCR minimization is then performed with historical data. This approach results in a significant reduction of the capital requirement compared to a classical Markowitz approach based on the historical volatility. A comparative analysis of different optimization models (equi-risk-contribution portfolio, minimum volatility portfolio and minimum value-at-risk portfolio) is performed, and the impact of these strategies on risk measures, including the market SCR and its sub-modules, is evaluated. A lack of diversification of the market SCR is observed, especially for equities. This was expected, since the market SCR strongly penalizes this type of financial instrument. It was shown that this direct effect of the regulation can be attenuated by implementing constraints in the optimization process or by minimizing the market SCR together with the historical volatility, proving the interest of a portfolio construction approach that can incorporate such features. The present results are further explained by the market SCR modelling.
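
For reference, under the Solvency II standard formula the market SCR aggregates the sub-module charges with a fixed regulatory correlation matrix:

```latex
% Standard-formula aggregation of market sub-module charges (interest
% rate, equity, property, spread/credit, FX, concentration) with the
% regulation's fixed correlation matrix Corr.
SCR_{\mathrm{mkt}} = \sqrt{\sum_{i,j} \mathrm{Corr}_{i,j}\, SCR_i\, SCR_j}
```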

Keywords: financial risk, numerical optimization, portfolio management, solvency capital requirement

Procedia PDF Downloads 100
7029 Rewriting, Reframing, and Restructuring the Story: A Narrative and Solution Focused Therapy Approach to Family Therapy

Authors: Eman Tadros

Abstract:

Solution Focused Therapy sheds a positive light on a client's problem(s) by instilling hope, focusing on the connection with the client, and describing the problem in a way that displays change as possible. Solution focused therapists highlight clients' positive strengths and reframe what clients say, do, or believe into a positive statement, action, or belief. Narrative Therapy focuses on the stories individuals tell about their past, which shape their current and future lives. Changing the language used aids clients in reevaluating their values and views of themselves; this then constructs a more positive way of thinking about their story. Both therapies are based on treating each client as an individual with a problem, rather than as a problem themselves, and on giving power back to the client. The purpose of these ideologies is to open a client to alternative understandings. This paper displays how clinicians can empower their clients and identify their positive strengths and resiliency factors. Narrative and Solution-Focused techniques are integrated to instill positivity and empowerment in clients. Techniques such as deconstruction, collaboration, complimenting, and miracle/exception/scaling questioning are analyzed and modeled. Furthermore, bridging Solution Focused Therapy and Narrative Therapy gives a voice to unheard clients.

Keywords: solution focused therapy, narrative therapy, empowerment, resilience

Procedia PDF Downloads 228
7028 Pricing, Production and Inventory Policies in Manufacturing under Stochastic Demand and Continuous Prices

Authors: Masoud Rabbani, Majede Smizadeh, Hamed Farrokhi-Asl

Abstract:

We study the joint determination of prices and production over a multiple-period horizon under general non-stationary stochastic demand with continuous prices. In some periods, production capacity must be increased to satisfy demand. This paper presents a model to aid multi-period production capacity planning by quantifying the trade-off between product quality and production cost. Product quality is estimated as the statistical variation from the target performances obtained from the output tolerances of the production machines that manufacture the components; different tolerances are considered for the different machines used to increase capacity. Production cost is estimated as the total cost of owning and operating a production facility during the planning horizon, so capacity planning has a cost that impacts the price. Pricing products is often difficult because customers have a reservation price they are willing to pay, which impacts both price and demand. We determine prices and production for the periods after capacity is enhanced, taking the reservation price into account. First, we use an algorithm based on fuzzy sets of the optimal objective function values to determine the capacity plan, maximizing the interval from the upper bound of the minimized objectives and defining weights for the objectives. Then we determine the inventory and pricing policies. A lemma allows the problem to be solved in MATLAB and an exact answer to be found.

Keywords: price policy, inventory policy, capacity planning, product quality, epsilon-constraint

Procedia PDF Downloads 556
7027 A Tool for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation of an institutional risk profile for the endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the setup of risk factors with just the values that are most important for a particular organisation. Subsequently, the risk profile employs fuzzy models and associated configurations for the file format metadata aggregator to support digital preservation experts with a semi-automatic estimation of the endangerment level of file formats. Our goal is to make use of a domain expert knowledge base aggregated from a digital preservation survey in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation and analysis of risk factors for a required dimension. The proposed methods improve the visibility of risk factor information and the quality of the digital preservation process. The presented approach is meant to facilitate decision-making for the preservation of digital content in libraries and archives using domain expert knowledge and automatically aggregated file format metadata from linked open data sources. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector. The goal is to visualise particular dimensions of this vector for analysis by an expert. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.

Keywords: digital information management, file format, endangerment analysis, fuzzy models

Procedia PDF Downloads 390
7026 Creating a Renewable Energy Investment Portfolio in Turkey for 2018-2023: An Approach Based on the Multi-Objective Linear Programming Method

Authors: Berker Bayazit, Gulgun Kayakutlu

Abstract:

The World Energy Outlook shows that energy markets will change substantially within the next few decades. First, the action plans determined under COP21 and the aim of CO₂ emission reduction already have an impact on the policies of countries. Second, rapid technological developments in the field of renewable energy will influence the medium- and long-term energy generation and consumption behaviors of countries. Furthermore, the share of electricity in global energy consumption is expected to be as high as 40 percent in 2040. Electric vehicles, heat pumps, new electronic devices and digital improvements will be the outstanding technologies, and innovations will be the testimony of these market modifications. In order to meet the sharply increasing electricity demand caused by these technologies, countries have to make new investments in electricity production, transmission and distribution. Specifically, the electricity generation mix becomes vital both for the prevention of CO₂ emissions and for the reduction of power prices. The majority of research and development investments are made in the field of electricity generation. Hence, primary source diversity and source planning of electricity generation are crucial for improving the well-being of citizens. Approaches considering only the CO₂ emissions and the total cost of generation are necessary but not sufficient to evaluate and construct the generation mix. On the other hand, employment and a positive contribution to macroeconomic values are important factors that have to be taken into consideration. This study aims to constitute new investments in renewable energies (solar, wind, geothermal, biogas and hydropower) between 2018 and 2023 under four different goals. Therefore, a multi-objective programming model is proposed to optimize the goals of minimizing the CO₂ emission, the investment amount and the electricity sales price while maximizing total employment and the positive contribution to the current deficit. In order to avoid user preference among the goals, Dinkelbach's algorithm and Guzel's approach have been combined. The achievements are discussed in comparison with current policies. Our study shows that new policies such as huge capacity allotments may be questionable, although the obligation for local production is positive. Improvements in grid infrastructure and re-designed support for biogas and geothermal can be recommended.
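
Dinkelbach's algorithm, mentioned above, solves fractional objectives by a parametric iteration; a minimal single-ratio sketch is shown below (the combination with Guzel's approach for multiple objectives is not reproduced here, and all names are generic).

```python
# Dinkelbach's parametric scheme for maximizing a single ratio N(x)/D(x),
# with D(x) > 0 on the feasible set. solve_parametric(lam) must return
# argmax_x N(x) - lam * D(x); here it is an abstract callback.
def dinkelbach(solve_parametric, N, D, x0, tol=1e-8, max_iter=100):
    x = x0
    for _ in range(max_iter):
        lam = N(x) / D(x)
        x = solve_parametric(lam)
        if N(x) - lam * D(x) < tol:  # F(lam) = 0 at the optimal ratio
            return x, lam
    return x, N(x) / D(x)
```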

Keywords: energy generation policies, multi-objective linear programming, portfolio planning, renewable energy

Procedia PDF Downloads 227
7025 Research on Pilot Sequence Design Method of Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing System Based on High Power Joint Criterion

Authors: Linyu Wang, Jiahui Ma, Jianhong Xiang, Hanyu Jiang

Abstract:

For the pilot design of the sparse channel estimation model in Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing (MIMO-OFDM) systems, the observation matrices constructed according to the matrix cross-correlation criterion, the total correlation criterion and other optimization criteria are not optimal, resulting in inaccurate channel estimation and a high bit error rate at the receiver. This paper proposes a pilot design method combining high-power-sum and high-power-variance criteria, which can estimate the channel more accurately. First, the pilot insertion positions are designed according to the high-power-variance criterion under the condition of equal power. Then, according to the high-power-sum criterion, the pilot power allocation is converted into a cone programming problem, and the power allocation is carried out. Finally, the optimal pilot is determined by calculating the weighted sum of the high power sum and the high power variance. Compared with a traditional pilot under the same conditions, the constructed MIMO-OFDM system using the optimal pilot for channel estimation obtains a gain of 6-7 dB in bit error rate performance.

Keywords: MIMO-OFDM, pilot optimization, compressed sensing, channel estimation

Procedia PDF Downloads 130
7024 Assessment of the Root Causes of the Marine Debris Problem in Lagos State

Authors: Chibuzo Okoye Daniels, Gillian Glegg, Lynda Rodwell

Abstract:

The continuously growing quantity of very slowly degrading litter deliberately discarded into the coastal waters around Lagos as marine debris is obvious. What is not known is how to tackle this problem so as to reduce its prevalence and its impact on the environment, economy and community. To identify ways of tackling the marine debris problem, two case study areas (Ikoyi and Victoria Islands of Lagos State) were used to assess the root causes, the threat posed by marine debris in the coastal waters around Lagos, and the efficacy of current instruments, programmes and initiatives that address marine debris in the study areas. The following methods were used: (1) self-completed questionnaires for households and businesses within the study areas; (2) semi-structured interviews with key stakeholders; (3) observational studies of waste management from collection to disposal and of waste management facilities for waste originating from land and maritime sources; (4) beach surveys and marine debris surveys on shorelines and in ports; and (5) fishing for marine debris. The results of this study identified the following root causes: (1) indiscriminate human activities and behaviors, and a lack of awareness on the part of the main stakeholders and the public of the potential consequences of their actions; (2) poor solid waste management practices; (3) a lack of strict legal frameworks addressing the waste and marine debris problem; and (4) disposal of non-degradable wastes into the domestic sewer system and open street drains. To effectively tackle the marine debris problem in the study areas, adequate, appropriate and cost-effective solutions to the above-mentioned root causes need to be identified and effectively transferred for implementation in the study areas.

Keywords: marine debris problem, Lagos state, litter, coastal waters

Procedia PDF Downloads 360
7023 Inverse Mode Shape Problem of Hand-Arm Vibration (Humerus Bone) for Bio-Dynamic Response Using Varying Boundary Conditions

Authors: Ajay R, Rammohan B, Sridhar K S S, Gurusharan N

Abstract:

The objective of this work is to develop a numerical method to solve the inverse mode shape problem by determining the cross-sectional area of a structure for a desired mode shape, via a vibration response study of the humerus bone, which is modeled as a cantilever beam with anisotropic material properties. The humerus is the long bone in the arm that connects the shoulder to the elbow. The mode shape is assumed to be a higher-order polynomial satisfying a prescribed set of boundary conditions, to ensure convergence of the numerical algorithm. The natural frequencies and the mode shapes are calculated for different boundary conditions in order to find the cross-sectional area of the humerus bone from the eigenmode shape with the aid of the inverse mode shape algorithm. The computed cross-sectional area of the humerus bone validates the mode shapes for the specific boundary conditions. The numerical method for solving the inverse mode shape problem is validated in a biomedical application by finding the cross-sectional area of a humerus bone in the human arm.

Keywords: cross-sectional area, humerus bone, inverse mode shape problem, mode shape

Procedia PDF Downloads 109
7022 Comparison of the Boundary Element Method and the Method of Fundamental Solutions for Analysis of Potential and Elasticity

Authors: S. Zenhari, M. R. Hematiyan, A. Khosravifard, M. R. Feizi

Abstract:

The boundary element method (BEM) and the method of fundamental solutions (MFS) are well-known fundamental-solution-based methods for solving a variety of problems. Both methods are boundary-type techniques and can provide accurate results. In comparison to the finite element method (FEM), which is a domain-type method, the BEM and the MFS need less manual effort to solve a problem. The aim of this study is to compare the accuracy and reliability of the BEM and the MFS. This comparison is made for 2D potential and elasticity problems with different boundary and loading conditions. In the comparisons, both convex and concave domains are considered, and both linear and quadratic elements are employed for the boundary element analysis of the examples. The discretization of the problem domain in the BEM, i.e., converting the boundary of the problem into boundary elements, is relatively simple; however, in the MFS, obtaining appropriate locations of the collocation and source points requires more care to obtain reliable solutions. The results obtained from the presented examples show that both methods lead to accurate solutions for convex domains, whereas the BEM is more suitable than the MFS for concave domains.

Keywords: boundary element method, method of fundamental solutions, elasticity, potential problem, convex domain, concave domain

Procedia PDF Downloads 73
7021 A Collaborative Problem-Driven Approach to Design an HR Analytics Application

Authors: L. Atif, C. Rosenthal-Sabroux, M. Grundstein

Abstract:

The requirements engineering process is a crucial phase in the design of complex systems. The purpose of our research is to present a collaborative, problem-driven requirements engineering approach that aims at improving the design of a Decision Support System (DSS) as an analytics application. This approach has been adopted to design a Human Resource management DSS. The requirements engineering process is presented as a series of guidelines for activities that must be implemented to ensure that the final product satisfies end-user requirements and takes into account the limitations identified. We know that a well-posed statement of the problem is "a problem whose crucial character arises from collectively produced estimation and a formulation found to be acceptable by all the parties". Moreover, we know that DSSs were developed to help decision-makers solve their unstructured problems. We thus base our research on the assumption that developing a DSS, particularly for helping with poorly structured or unstructured decisions, cannot be done without considering end-user decision problems, how to represent them collectively, the content of decisions, their meaning, and the decision-making process; the field issues thus arise in a multidisciplinary perspective. We address a problem-driven and collaborative approach to designing DSS technologies: common end-user problems are reflected in the upstream design phase, and in the downstream phase these problems determine the design choices and the potential technical solution. We therefore rely on a categorization of HR problems for a development mirroring the analytics solution. This brings out a new data-driven DSS typology: descriptive analytics, explicative or diagnostic analytics, predictive analytics, and prescriptive analytics. In our research, identifying the problem takes place together with the design of the solution, so we resort to significant transformations of the representations associated with the HR analytics application to build an increasingly detailed representation of the goal to be achieved. Here, collective cognition is reflected in the establishment of transfer functions of representations during the whole design process.

Keywords: DSS, collaborative design, problem-driven requirements, analytics application, HR decision making

Procedia PDF Downloads 279
7020 A Genetic Algorithm Approach to Solve a Weaving Job Scheduling Problem, Aiming Tardiness Minimization

Authors: Carolina Silva, João Nuno Oliveira, Rui Sousa, João Paulo Silva

Abstract:

This study uses genetic algorithms to solve a job scheduling problem in a weaving factory. The underlying problem is NP-hard, concerning unrelated parallel machines with sequence-dependent setup times. This research uses real data from a weaving plant located in the North of Portugal, with a capacity of 96 looms and a production, on average, of 440,000 meters of fabric per month. Moreover, this study involves a high level of complexity, since most of the real production constraints are applied and several real data instances are tested. Topics such as data analysis and algorithm performance are addressed and tested, in order to offer a solution that can generate reliable, due-date-compliant plans. All the approaches will be tested in the operational environment and the KPIs monitored, to understand the solution's impact on production, with a particular focus on the total number of weeks of late deliveries to clients. Thus, the main goal of this research is to develop a solution that allows the automatic production of optimized production plans, aiming at tardiness minimization.
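
As a rough illustration of the approach (not the paper's implementation), the minimal GA skeleton below evolves a permutation-encoded schedule minimizing total tardiness on a single machine; the paper's setting with unrelated parallel machines and sequence-dependent setups is considerably richer. All data are toy values.

```python
# Minimal GA for a permutation-encoded single-machine schedule minimizing
# total tardiness. Toy data; a simplified stand-in for the richer problem.
import random

jobs = {0: (4, 6), 1: (2, 3), 2: (5, 12), 3: (3, 5)}  # job: (duration, due date)

def tardiness(perm):
    t, total = 0, 0
    for j in perm:
        dur, due = jobs[j]
        t += dur
        total += max(0, t - due)
    return total

def order_crossover(p1, p2):
    # Copy a slice from parent 1, fill the rest in parent 2's order.
    i, j = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[i:j] = p1[i:j]
    fill = [g for g in p2 if g not in child]
    for k in range(len(child)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(perm, rate=0.2):
    if random.random() < rate:  # swap two positions
        a, b = random.sample(range(len(perm)), 2)
        perm[a], perm[b] = perm[b], perm[a]
    return perm

pop = [random.sample(list(jobs), len(jobs)) for _ in range(20)]
for _ in range(50):
    pop.sort(key=tardiness)
    parents = pop[:10]  # truncation selection
    pop = parents + [mutate(order_crossover(*random.sample(parents, 2)))
                     for _ in range(10)]
best = min(pop, key=tardiness)
print(best, tardiness(best))
```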

Keywords: genetic algorithms, textile industry, job scheduling, optimization

Procedia PDF Downloads 139
7019 Problem Solving in Chilean Higher Education: Prior Figurations in Interpretations of Cartesian Graphs

Authors: Verónica Díaz

Abstract:

A Cartesian graph, as a mathematical object, becomes a tool for the configuration of change. It is best understood through everyday problem solving associated with its representation. Despite this, the current educational framework favors general graphs, without consideration of their argumentation. Students are required to find the mathematical function without associating it with the development of graphical language. This research describes the use students make of figurations made prior to Cartesian graphs with regard to an everyday problem related to a time and distance variation phenomenon. The theoretical framework describes the conditions of the study of the function and its modeling. This is a qualitative, descriptive study involving six undergraduate case studies carried out during the first term of 2016 at the University of Los Lagos. The research problem concerned the graphic modeling of a real person's movement, and two levels of analysis were identified. The first level aims to identify local and global graph interpretations; the second level describes the degree of iconicity and referentiality of an image. According to the results, students drew figures prior to the Cartesian graph, highlighting the need for students to represent the context and the movement that causes the change in the phenomenon. From this, they managed Cartesian graphs representing changes in position, and therefore achieved a global view of the graph. However, the local view only indicates specific events in the problem situation, using graphic and verbal expressions to represent movement. This view does not enable students to identify what happens on the graph when the movement characteristics change, based on possible paths in the person's walking speed.

Keywords: cartesian graphs, higher education, movement modeling, problem solving

Procedia PDF Downloads 204