Search results for: fuzzy goal programming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4557

3627 Urban Planning and Sustainable Cities: Issues and Viewpoints

Authors: Prince Amoako

Abstract:

This article provides an overview of academic research on future urban planning, with a focus on sustainable cities. Its goal is to give a global update on the issues and viewpoints currently surrounding urban planning, sustainability, and development. Based on scholarly and scientific research, the review presents potential avenues of investigation and development for ensuring a sustainable urban future. Recent scholarly work on sustainable cities has focused on the conceptualization and knowledge generation involved in building them. Although its breadth is not all-inclusive, the study describes the present state of research on concepts and terminologies related to sustainable cities, planning, and techniques for developing and evaluating urban sustainability. The objective is to offer local governments, urban and development practitioners, and other stakeholders perspective and guidance in striving towards urban sustainability in the future.

Keywords: urban sustainability, sustainable urban development, sustainability assessment, sustainable development, sustainable cities

Procedia PDF Downloads 43
3626 Parallel Fuzzy Rough Support Vector Machine for Data Classification in Cloud Environment

Authors: Arindam Chaudhuri

Abstract:

Classification of data has been actively used as one of the most effective and efficient means of conveying knowledge and information to users. The primary focus has always been on techniques for extracting useful knowledge from data such that returns are maximized. With the emergence of huge datasets, existing classification techniques often fail to produce desirable results. The challenge lies in analyzing and understanding the characteristics of massive data sets by retrieving useful geometric and statistical patterns. We propose a supervised parallel fuzzy rough support vector machine (PFRSVM) for data classification in a cloud environment. The classification is performed by PFRSVM using a hyperbolic tangent kernel. The fuzzy rough set model takes care of the sensitiveness of noisy samples and handles impreciseness in training samples, bringing robustness to the results. The membership function is a function of the center and radius of each class in feature space and is represented with the kernel. It plays an important role in sampling the decision surface. The success of PFRSVM is governed by choosing appropriate parameter values. The training samples may be linearly or nonlinearly separable, and different input points make unique contributions to the decision surface. The algorithm is parallelized with a view to reducing training times. The system is built on a support vector machine library using the Hadoop implementation of MapReduce. The algorithm is tested on large data sets to check its feasibility and convergence, and the performance of the classifier is also assessed in terms of the number of support vectors. The challenges encountered in implementing big data classification in machine learning frameworks are also discussed. The experiments are run on the cloud environment available at the University of Technology and Management, India. The results are illustrated for Gaussian RBF and Bayesian kernels. The effect of variability in prediction and generalization of PFRSVM is examined with respect to values of the parameter C. The method effectively resolves the effects of outliers, imbalance, and overlapping class problems, generalizes to unseen data, and relaxes the dependency between features and labels. The average classification accuracy of PFRSVM is better than that of other classifiers for both Gaussian RBF and Bayesian kernels. The experimental results on both synthetic and real data sets clearly demonstrate the superiority of the proposed technique.
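The fuzzy membership idea described above can be illustrated with a short sketch. This is not the authors' parallel Hadoop/MapReduce implementation; it only shows, under assumed synthetic data and an assumed membership formula, how a membership value derived from each class's center and radius can down-weight noisy samples when training a kernel SVM (scikit-learn's SVC with its sigmoid, i.e. hyperbolic tangent, kernel is used as a stand-in):

```python
# Minimal sketch (not the authors' code): fuzzy membership weights for an SVM,
# computed from each class's center and radius in feature space, then passed to
# scikit-learn's SVC as per-sample weights. Data and the membership formula are
# illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def fuzzy_membership(X, y, delta=1e-3):
    """Membership decreases with distance from the class center in feature space."""
    m = np.empty(len(y))
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        center = X[idx].mean(axis=0)
        dist = np.linalg.norm(X[idx] - center, axis=1)
        radius = dist.max() + delta             # class radius
        m[idx] = 1.0 - dist / (radius + delta)  # noisy/outlying points get low weight
    return m

weights = fuzzy_membership(X, y)
clf = SVC(kernel="sigmoid", C=10.0)         # "sigmoid" is sklearn's tanh kernel
clf.fit(X, y, sample_weight=weights)        # memberships act as sample weights
print("support vectors:", clf.n_support_, "train acc:", clf.score(X, y))
```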

Keywords: FRSVM, Hadoop, MapReduce, PFRSVM

Procedia PDF Downloads 490
3625 Perceptions on Community Media for Effective Acculturation in Nigerian Indigenous Languages

Authors: Chima Onwukwe

Abstract:

This study examined perceptions of the effectiveness, attendant challenges, and remedies of community media for effective acculturation in Nigerian languages. A qualitative survey design was adopted, with Focus Group Discussions (FGD) and Key Informant Interviews (KII) of 50 purposively chosen informants. It was perceived that community media could serve as a veritable platform for effective acculturation in Nigerian languages, since they would establish acculturation in Nigerian languages as a national objective or goal. It was further held that the strengths of community media for acculturation lie in being goal-defined and in ensuring local content and diversification. The study identified that, as palatable as the proposal for community media for effective acculturation in Nigerian languages is, it would be fraught with some setbacks or challenges that are very much surmountable. Perceptions pointed towards the transient nature of community media and funding as challenges, and towards multi-based funding as one remedy. The immediate establishment of community media for the purpose of acculturation in Nigerian languages was recommended.

Keywords: perception, community media, acculturation, indigenous language

Procedia PDF Downloads 269
3624 IOT Based Automated Production and Control System for Clean Water Filtration Through Solar Energy Operated by Submersible Water Pump

Authors: Musse Mohamud Ahmed, Tina Linda Achilles, Mohammad Kamrul Hasan

Abstract:

The deterioration of mother nature is evident these days, with a clear danger of human catastrophe emanating from greenhouse gases (GHG) and increasing CO2 emissions to the environment. PV technology can help to reduce the dependency on fossil fuels, decreasing air pollution and slowing down the rate of global warming. The objective of this paper is to propose, design, and develop the production of a clean water supply for rural communities using an appropriate technology such as the Internet of Things (IoT) that does not create any CO2 emissions. Additionally, maximizing solar energy power output while minimizing the effect of the natural intermittency of solar sources during periods of low sunshine is another goal of this work. The paper presents the development of a critical automated control system for solar energy power output optimization using several new techniques. A water pumping system is developed to supply clean water with the application of IoT and renewable energy. This system is effective in providing a clean water supply to remote and off-grid areas using photovoltaic (PV) technology that collects energy generated from sunlight. The focus of this work is to design and develop a submersible solar water pumping system that applies an IoT implementation. The system has been built and programmed using the Arduino Software (IDE), Proteus, MATLAB, and the C++ programming language. The mechanism of the system is that it pumps water from a water reservoir, powered by solar energy, and clean water production is incorporated through a filtration system attached to the submersible solar water pump. The filtering system is an additional application platform intended to provide a clean water supply to households in Sarawak State, Malaysia.
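As an illustration of the pumping logic only (the actual firmware runs on Arduino in C++), the following minimal Python sketch shows a control step that runs the submersible pump when the tank is low and the PV supply is healthy. The sensor and actuator functions and all thresholds are hypothetical placeholders, not the authors' implementation.

```python
# Illustrative control-loop sketch only, not the authors' Arduino/C++ firmware.
# read_tank_level, read_pv_voltage and set_pump are hypothetical placeholders for
# the IoT hardware interface; the thresholds are assumptions.
import random, time

TANK_FULL_CM = 90          # stop pumping above this water level
TANK_LOW_CM = 20           # start pumping below this level
MIN_PV_VOLTS = 11.0        # only run the pump when the PV/battery bus is healthy

def read_tank_level():     # placeholder: ultrasonic level sensor
    return random.uniform(0, 100)

def read_pv_voltage():     # placeholder: PV/battery bus voltage divider
    return random.uniform(9.0, 14.0)

def set_pump(on):          # placeholder: relay driving the submersible pump
    print("pump", "ON" if on else "OFF")

def control_step(pump_on):
    level, volts = read_tank_level(), read_pv_voltage()
    if volts < MIN_PV_VOLTS or level > TANK_FULL_CM:
        pump_on = False                    # protect the pump and avoid overflow
    elif level < TANK_LOW_CM:
        pump_on = True                     # refill through the filtration stage
    set_pump(pump_on)
    return pump_on

pump_on = False
for _ in range(5):                         # in firmware this would be an endless loop
    pump_on = control_step(pump_on)
    time.sleep(0.1)
```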

Keywords: IOT, automated production and control system, water filtration, automated submersible water pump, solar energy

Procedia PDF Downloads 88
3623 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules

Authors: Mohsen Maraoui

Abstract:

With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing for multilingual text documents based on a conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weighting; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through the conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model (in an attempt to discover the non-taxonomic, or contextual, relations between the concepts of a document). These relations are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. We then apply the same statistical process, regardless of the language, in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
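A minimal sketch of the term-to-concept step is given below. It assumes Princeton WordNet accessed through NLTK as a stand-in for EuroWordNet, and a simplified weighting rule (term frequency propagated to the first synset and one hypernym level); the paper's association-rule classification step is not reproduced here.

```python
# Sketch of the index-term -> index-concept step, assuming Princeton WordNet via NLTK
# as a stand-in for EuroWordNet (EWN). The weighting scheme is a simplified illustration,
# not the paper's model. Requires: nltk.download('wordnet')
from collections import Counter
from nltk.corpus import wordnet as wn

def index_concepts(tokens):
    term_freq = Counter(t.lower() for t in tokens)
    weights = Counter()
    for term, freq in term_freq.items():
        synsets = wn.synsets(term)
        if not synsets:
            continue
        concept = synsets[0]                      # most common sense as the concept node
        weights[concept.name()] += freq
        for hyper in concept.hypernyms():         # propagate part of the weight upward
            weights[hyper.name()] += 0.5 * freq   # in the conceptual network
    return weights

doc = "the bank approved the loan and the customer opened a new account".split()
for concept, w in index_concepts(doc).most_common(5):
    print(f"{concept}: {w:.1f}")
```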

Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing

Procedia PDF Downloads 141
3622 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions

Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib

Abstract:

Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Encrypting large volumes of messages in full results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We implemented four classification algorithms to determine the importance level value within each XML document. Classified content is processed using element-wise encryption for selected parts with "High", "Medium" or "Low" importance level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm to overcome the problem of computational overhead, in which the SubBytes and ShiftRows steps remain as in the original AES, while the MixColumns operation is replaced by a 128-bit permutation operation followed by the AddRoundKey operation. An implementation has been conducted using a data set fetched from an e-banking service to demonstrate system functionality and efficiency. Results from our implementation showed a clear improvement in the processing time for encrypting XML documents.
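The element-wise step can be sketched as follows. The snippet uses Python's ElementTree together with Fernet from the `cryptography` package (standard AES-128 in CBC mode with HMAC) as a stand-in for the modified AES described above; the element names and importance labels are illustrative assumptions.

```python
# Minimal sketch of element-wise encryption of a classified XML transaction.
# Fernet stands in for the paper's modified AES; the XML structure is illustrative.
import xml.etree.ElementTree as ET
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

xml_doc = """<transaction>
  <account importance="High">123456789</account>
  <amount importance="Medium">2500.00</amount>
  <memo importance="Low">monthly transfer</memo>
</transaction>"""

root = ET.fromstring(xml_doc)
for element in root.iter():
    # Encrypt only elements the classifier marked as important enough.
    if element.get("importance") in {"High", "Medium"} and element.text:
        element.text = cipher.encrypt(element.text.encode()).decode()

print(ET.tostring(root, encoding="unicode"))
```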

Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption

Procedia PDF Downloads 411
3621 Simulating Lean and Green Correlation in Supply Chain Context

Authors: Rachid Benmoussa, Fatima Ezzahra Essaber, Roland De Guio, Fatima Zahra Ben Moussa

Abstract:

Implementing green practices in supply chain management is a complex task, mainly because ecological, economic, and operational goals are usually in conflict. Green practices may thus face companies' reluctance, because managers can perceive their implementation as a degradation of lean performance. To implement lean and green practices successfully, companies need relevant decision-making tools that highlight the correlation between them. To contribute to this issue, this work tries to answer the following research question: how can simulation be used to assess the correlation (antagonism or convergence) between lean and green goals? To answer this question, we propose in this paper a simulation-based process that measures the correlation between two variables in general. To demonstrate its relevance, an academic logistics case study is used to illustrate all of its stages. It shows, for example, that the lean goal 'stock' and the green goal 'CO₂ emission' are not linearly correlated.
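A minimal sketch of the correlation check is shown below, assuming the simulation has already produced paired output series for one lean indicator and one green indicator; the random data and the 0.3 threshold for "no meaningful correlation" are illustrative assumptions, not values from the case study.

```python
# Sketch of the correlation check on simulated output series; the data are random
# stand-ins for simulation runs of a lean indicator (stock level) and a green
# indicator (CO2 emission), not results from the paper's case study.
import numpy as np

rng = np.random.default_rng(1)
stock_level = rng.uniform(50, 200, size=30)                 # lean goal per run
co2_emission = 0.2 * stock_level + rng.normal(0, 40, 30)    # green goal per run

r = np.corrcoef(stock_level, co2_emission)[0, 1]            # Pearson coefficient
if abs(r) < 0.3:
    verdict = "no meaningful linear correlation (goals neither converge nor conflict)"
elif r > 0:
    verdict = "convergent goals (reducing one tends to reduce the other)"
else:
    verdict = "antagonistic goals"
print(f"r = {r:.2f}: {verdict}")
```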

Keywords: simulation, lean, green, supply chain

Procedia PDF Downloads 502
3620 Timetabling for Interconnected LRT Lines: A Package Solution Based on a Real-world Case

Authors: Huazhen Lin, Ruihua Xu, Zhibin Jiang

Abstract:

In this real-world case, timetabling the LRT network as a whole is rather challenging for the operator: they are supposed to create a timetable manually that avoids various route conflicts while satisfying a given interval and number of rolling stock, but the outcome is not satisfying. Therefore, the operator adopts a computerised timetabling tool, the Train Plan Maker (TPM), to cope with this problem. However, with various constraints in the dual-line network, it is still difficult to find an adequate pairing of turnback time, interval, and rolling stock number, which requires extra manual intervention. Aiming at these problems, a one-off timetabling model is presented in this paper to simplify the procedure. Before the timetabling procedure starts, the paper shows how the dual-line system, with a ring and several branches, is turned into a simpler structure. Then, a non-linear programming model is presented in two stages. In the first stage, the model sets a series of constraints to calculate a proper timing for coordinating the two lines by adjusting the turnback time at the termini. Then, based on the result of the first stage, the model introduces a series of inequality constraints to avoid various route conflicts. With this model, an analysis is conducted to reveal the relation between the ratio of trains in different directions and the possible minimum interval, observing that the more imbalanced the ratio is, the harder it is to provide frequent service under such strict constraints.
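The kind of first-stage pairing the model searches for can be illustrated with a back-of-the-envelope check relating interval, turnback time, and fleet size; the numbers below are assumptions for illustration, not data from the case-study network.

```python
# Back-of-the-envelope feasibility check behind the first-stage constraints: for a given
# service interval, the cycle time (round trip plus turnbacks at both termini) must be
# covered by the available rolling stock. All numbers are illustrative assumptions.
import math

def required_trains(round_trip_min, turnback_a_min, turnback_b_min, interval_min):
    cycle = round_trip_min + turnback_a_min + turnback_b_min
    return math.ceil(cycle / interval_min)

round_trip = 62          # minutes, both directions on one line
interval = 6             # target headway in minutes
fleet = 12               # rolling stock available to this line

for turnback in range(2, 11):       # search a pairing of turnback time and fleet size
    need = required_trains(round_trip, turnback, turnback, interval)
    feasible = "feasible" if need <= fleet else "infeasible"
    print(f"turnback {turnback:2d} min -> {need} trains needed ({feasible})")
```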

Keywords: light rail transit (LRT), non-linear programming, railway timetabling, timetable coordination

Procedia PDF Downloads 88
3619 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem

Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee

Abstract:

Weapon-target assignment (WTA) is a problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over past years for both static and dynamic environments (denoted SWTA and DWTA, respectively). Because the problem must be solved within a relevant computational time, WTA has suffered from limited solution efficiency; as a result, SWTA and DWTA problems have been solved only for restricted battlefield situations. In this paper, the general situation under continuous time is considered as the Time-based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: a decomposed opt-opt algorithm, a decomposed opt-greedy algorithm, and a greedy algorithm. Although the TWTA optimization model works inefficiently for large problem sizes, the decomposed opt-opt algorithm, based on linearization and decomposition, extracts efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long for the optimization model, several greedy-based algorithms are proposed; they yield lower objective values than the decomposed opt-opt algorithm but require very short computation times. Hence, this paper proposes an improved method by applying decomposition to TWTA, and more practical and effective methods can be developed for using TWTA on the battlefield.
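The greedy idea behind the faster heuristics can be sketched as follows; the kill-probability matrix is synthetic and the time/scheduling dimension of TWTA is deliberately omitted, so this is only the core assignment step, not the authors' algorithms.

```python
# Sketch of a greedy WTA step: repeatedly assign each launcher to the target with the
# highest marginal gain in expected kills. Probabilities are synthetic; the TWTA time
# dimension is omitted.
import numpy as np

rng = np.random.default_rng(2)
n_launchers, n_targets = 5, 3
p_kill = rng.uniform(0.3, 0.9, size=(n_launchers, n_targets))  # P(launcher i kills target j)
survival = np.ones(n_targets)          # probability each target survives so far
assignment = {}

for launcher in np.argsort(-p_kill.max(axis=1)):      # most capable launchers first
    # marginal reduction in expected surviving targets for each candidate target
    gain = survival * p_kill[launcher]
    target = int(np.argmax(gain))
    assignment[int(launcher)] = target
    survival[target] *= (1.0 - p_kill[launcher, target])

print("assignment (launcher -> target):", assignment)
print("expected targets destroyed:", float(np.sum(1.0 - survival)))
```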

Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research

Procedia PDF Downloads 336
3618 Optimum Performance of the Gas Turbine Power Plant Using Adaptive Neuro-Fuzzy Inference System and Statistical Analysis

Authors: Thamir K. Ibrahim, M. M. Rahman, Marwah Noori Mohammed

Abstract:

This study deals with modeling and performance enhancement of a gas-turbine combined cycle power plant. Clean and safe energy is one of the greatest challenges in meeting the requirements of a green environment. These requirements have ended the long-standing dominance of the steam turbine (ST) in world power generation, and the gas turbine (GT) is replacing it. Therefore, it is necessary to predict the characteristics of the GT system and optimize its operating strategy by developing a simulation system. An integrated model and simulation code for exploiting the performance of gas turbine power plants is developed using MATLAB. The performance code for heavy-duty GT and CCGT power plants is validated against the real Baiji GT and MARAFIQ CCGT plants, and the results are satisfactory. A new correlation technique was considered for all types of simulation data, with a coefficient of determination (R²) of 0.9825. Some of the latest published correlations were checked on the Baiji GT plant, and error analysis was applied. GT performance was judged by particular parameters selected from the simulation model, and the Adaptive Neuro-Fuzzy Inference System (ANFIS), an advanced optimization technique, was also utilized. The best thermal efficiency and power output attained were about 56% and 345 MW, respectively. Thus, the operating conditions and ambient temperature strongly influence the overall performance of the GT, and the optimum efficiency and power are found at higher turbine inlet temperatures. It can be concluded that the developed models are powerful tools for estimating the overall performance of GT plants.
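For reference, the kind of goodness-of-fit figure quoted above (R² = 0.9825) is computed as shown in this short sketch; the measured and predicted values are placeholders, not Baiji or MARAFIQ data.

```python
# Sketch of a coefficient-of-determination check between measured plant data and the
# output of a correlation or ANFIS model. The arrays below are synthetic placeholders.
import numpy as np

measured = np.array([52.1, 53.4, 54.0, 55.2, 55.9, 56.3])      # e.g. thermal efficiency, %
predicted = np.array([52.4, 53.1, 54.2, 55.0, 56.1, 56.2])     # correlation / ANFIS output

ss_res = np.sum((measured - predicted) ** 2)          # residual sum of squares
ss_tot = np.sum((measured - measured.mean()) ** 2)    # total sum of squares
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.4f}")
```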

Keywords: gas turbine, optimization, ANFIS, performance, operating conditions

Procedia PDF Downloads 425
3617 Effects of Exhibition Firms' Resource Investment Behavior on Their Booth Staffs' Role Perceptions, Goal Acceptance and Work Effort during the Exhibition Period

Authors: Po-Chien Li

Abstract:

Although the extant literature hosts a wide range of knowledge about trade shows, this knowledge base deserves to be further expanded and extended, because many issues remain unclear and many topics overlooked. One area that needs research attention is the behavior and performance of booth workers at the exhibition site. Booth staff play many key roles in interacting with booth visitors, and their exhibiting-related attitudes and motivations might have significant consequences for a firm's exhibition results. However, to date, little research, if any, has studied how booth workers are affected and behave in the context of a trade fair. The primary purpose of the current study is to develop and test a research model, derived from role theory and the resource-based view, that depicts the effects of a firm's pre-exhibition resource investment behavior on booth staff's role perceptions and work behavior during the exhibition period. The author collected data with two survey questionnaires at two trade shows in 2016. One questionnaire was given to the booth head of an exhibiting company, asking about the firm's resource commitment behavior prior to the exhibition period; the other was provided to a booth worker of the same firm, asking the individual staff member to report his or her own role perceptions, degree of exhibition goal acceptance, and level of work effort during the exhibition period. The study utilized descriptive statistics, exploratory factor analysis, reliability analysis, and regression analysis. The results of a set of regression analyses show that a firm's pre-exhibition resource investment behavior has significant effects on a booth staff member's exhibiting perceptions and attitudes. Specifically, an exhibitor's resource investment behavior affects the booth staff member's role clarity and role conflict. In addition, a booth worker's role clarity is related to the degree of exhibition goal acceptance, but his or her role conflict is not. Finally, a booth worker's exhibiting effort is significantly related to the individual's role clarity, role conflict, and goal acceptance. The major contribution of the current research is that it offers insight into, and early evidence on, the links between an exhibiting firm's resource commitment behavior and the work perceptions and attitudes of booth staff during the exhibition period. The results can benefit the extant literature on exhibition marketing.

Keywords: exhibition resource investment, role perceptions, goal acceptance, work effort

Procedia PDF Downloads 217
3616 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation

Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski

Abstract:

In problems of portfolio selection, the reward-risk ratio criterion is optimized to search for a risky portfolio offering the maximum increase of the mean return in proportion to the increase of the risk measure when compared to risk-free investments. In the classical model, following Markowitz, the risk is measured by the variance, thus representing Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization; in particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints in the case of real-life financial decisions based on several thousand scenarios, thus decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization and by taking advantage of LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments; it is thus not seriously affected by the number of scenarios, so the simplex method remains efficient and easy solvability is guaranteed. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with second order stochastic dominance rules.
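As background, the scenario-based MAD building block referred to above can be written as a small LP; the sketch below (using PuLP with the bundled CBC solver) minimizes MAD for a required mean return and is not the inverse-ratio/dual reformulation developed in the paper. Returns and the target value are illustrative assumptions.

```python
# Sketch of the scenario-based MAD portfolio LP underlying the paper's ratio model:
# plain MAD minimization subject to a required mean return. Data are illustrative.
import pulp

returns = [                      # scenario returns for 3 instruments, equally likely
    [0.02, 0.01, 0.03],
    [-0.01, 0.02, 0.00],
    [0.03, -0.02, 0.01],
    [0.00, 0.01, 0.02],
]
T, n = len(returns), len(returns[0])
mu = [sum(returns[t][j] for t in range(T)) / T for j in range(n)]
target = 0.01

prob = pulp.LpProblem("MAD_portfolio", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{j}", lowBound=0) for j in range(n)]      # portfolio weights
d = [pulp.LpVariable(f"d{t}", lowBound=0) for t in range(T)]      # absolute deviations

prob += pulp.lpSum(d) * (1.0 / T)                                 # mean absolute deviation
prob += pulp.lpSum(x) == 1
prob += pulp.lpSum(mu[j] * x[j] for j in range(n)) >= target
for t in range(T):
    dev = pulp.lpSum((returns[t][j] - mu[j]) * x[j] for j in range(n))
    prob += d[t] >= dev          # linearize |dev| with two inequalities
    prob += d[t] >= -dev

prob.solve(pulp.PULP_CBC_CMD(msg=0))
print("weights:", [round(v.value(), 3) for v in x],
      "MAD:", round(pulp.value(prob.objective), 5))
```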

Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming

Procedia PDF Downloads 407
3615 The Strategy of the International Organization for Migration in Dealing with the Phenomenon of Migration

Authors: Djehich Mohamed Yousri

Abstract:

Nowadays, migration has become a phenomenon that attracts the attention of researchers, countries, agencies, and national and international bodies. Wars, climate change, demographics, poverty, natural disasters, and epidemics are all threats that contribute daily to forcing more people to migrate. Some resort to emigration because of deteriorating political conditions in their country, others emigrate to improve their financial situation, and still others leave their country for fear of penalties and judgments issued against them. In the field of migration, the International Organization for Migration's entry into the United Nations system as a 'related organization' gives the United Nations a clear mandate on migration. Its primary goal is to facilitate the management of international migration in an orderly and humane manner, and in order to achieve this goal, the organization adopts an international policy to meet the challenges posed in the field of migration. This paper studies the structure of this international organization and its strategy in dealing with the phenomenon of international migration.

Keywords: international organization for migration, immigrants, immigrant rights, resettlement, migration organization strategy

Procedia PDF Downloads 121
3614 Deep Reinforcement Learning with Leonard-Ornstein Processes Based Recommender System

Authors: Khalil Bachiri, Ali Yahyaouy, Nicoleta Rogovschi

Abstract:

Improved user experience is a goal of contemporary recommender systems. Recommender systems are starting to incorporate reinforcement learning, since it naturally suits the goal of increasing a user's reward in every session. In this paper, we examine the most effective reinforcement learning agent tactics on the MovieLens (1M) dataset, balancing precision and variety of recommendations. The absence of variability in final predictions makes simplistic techniques, although able to optimize ranking quality criteria, worthless for consumers of the recommendation system. Utilizing the stochasticity of Ornstein-Uhlenbeck processes, our suggested strategy encourages the agent to explore its surroundings. Our experiments demonstrate that raising the NDCG (Normalized Discounted Cumulative Gain) and HR (Hit Rate) criteria without lowering the Ornstein-Uhlenbeck process drift coefficient enhances the diversity of suggestions.
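The exploration noise at the heart of the strategy is the Ornstein-Uhlenbeck process typically added to a DDPG actor's actions; a minimal sketch follows, with parameter values (theta, sigma, dt) chosen for illustration rather than taken from the MovieLens-1M experiments.

```python
# Sketch of Ornstein-Uhlenbeck exploration noise for a DDPG-style actor.
# Parameter values are illustrative, not tuned to the paper's experiments.
import numpy as np

class OUNoise:
    def __init__(self, size, mu=0.0, theta=0.15, sigma=0.2, dt=1.0, seed=0):
        self.mu, self.theta, self.sigma, self.dt = mu, theta, sigma, dt
        self.rng = np.random.default_rng(seed)
        self.state = np.full(size, mu, dtype=float)

    def sample(self):
        # dx = theta * (mu - x) * dt + sigma * sqrt(dt) * N(0, 1)
        dx = (self.theta * (self.mu - self.state) * self.dt
              + self.sigma * np.sqrt(self.dt) * self.rng.standard_normal(self.state.shape))
        self.state += dx
        return self.state

noise = OUNoise(size=4)
for step in range(3):
    action = np.zeros(4) + noise.sample()   # deterministic action + exploration noise
    print(f"step {step}: {np.round(action, 3)}")
```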

Keywords: recommender systems, reinforcement learning, deep learning, DDPG, Leonard-Ornstein process

Procedia PDF Downloads 142
3613 Prediction of Damage to Cutting Tools in an Earth Pressure Balance Tunnel Boring Machine EPB TBM: A Case Study L3 Guadalajara Metro Line (Mexico)

Authors: Silvia Arrate, Waldo Salud, Eloy París

Abstract:

The wear of cutting tools is one of the most decisive elements when planning tunneling works, programming maintenance stops, and keeping the optimum stock of spare parts over the course of the excavation. Being able to predict the behavior of cutting tools can give a very competitive advantage in terms of costs and excavation performance, optimized to the needs of the TBM itself. The rapid evolution of data science in recent years makes it possible to apply it to the key and most critical machinery parameters in order to know how the cutting head is performing against the excavated ground. Taking Metro Line 3 of Guadalajara in Mexico as a case study, this work assesses the feasibility of using specific energy versus data science applied to parameters such as torque, penetration, and contact force to predict the behavior and status of the cutting tools. The results obtained through both techniques are analyzed and verified against the wear and the field situations observed during the excavation in order to determine their effectiveness and predictive capacity. In conclusion, the possibilities and improvements offered by the application of digital tools and the programming of calculation algorithms for the analysis of cutting head wear, compared to purely empirical methods, allow early detection of possible damage to cutting tools, which is reflected in optimized excavation performance and a significant improvement in costs and deadlines.
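On the data-science side, a minimal sketch of the kind of model involved is a regression of a wear index on logged torque, penetration, and contact force; the synthetic data and the linear model below are placeholders for the pipeline actually applied to the L3 Guadalajara data.

```python
# Sketch of a regression predicting a cutter-wear index from logged TBM parameters.
# Synthetic data and a linear model stand in for the real pipeline.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 200
torque = rng.uniform(500, 2500, n)          # kN*m
penetration = rng.uniform(5, 25, n)         # mm/rev
contact_force = rng.uniform(100, 300, n)    # kN per cutter
wear = 0.002 * torque + 0.3 * contact_force / penetration + rng.normal(0, 0.5, n)

X = np.column_stack([torque, penetration, contact_force])
X_tr, X_te, y_tr, y_te = train_test_split(X, wear, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print("R^2 on held-out rings:", round(model.score(X_te, y_te), 3))
```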

Keywords: cutting tools, data science, prediction, TBM, wear

Procedia PDF Downloads 49
3612 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical size problems, while giving a robust upper bound obtained from Lagrangean relaxation of the integrality constraints. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
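For intuition only, a toy single-period simplification of the assignment core of such a MIP is sketched below with PuLP and the open-source CBC solver; it ignores the path, time, and anticipated-feedback structure of the full model, and the detection probabilities are synthetic.

```python
# Toy single-period simplification of the search-planning MIP: assign each rescue agent
# to one cell to maximize total detection probability. Not the paper's full model.
import pulp

p_detect = {            # p_detect[(agent, cell)]: prob. of detecting a target there
    (0, "A"): 0.30, (0, "B"): 0.55, (0, "C"): 0.20,
    (1, "A"): 0.40, (1, "B"): 0.35, (1, "C"): 0.60,
}
agents = sorted({a for a, _ in p_detect})
cells = sorted({c for _, c in p_detect})

prob = pulp.LpProblem("SAR_assignment", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", list(p_detect), cat="Binary")

prob += pulp.lpSum(p_detect[k] * x[k] for k in p_detect)          # expected detections
for a in agents:
    prob += pulp.lpSum(x[(a, c)] for c in cells) == 1             # each agent searches one cell
for c in cells:
    prob += pulp.lpSum(x[(a, c)] for a in agents) <= 1            # avoid duplicated effort

prob.solve(pulp.PULP_CBC_CMD(msg=0))
plan = {a: c for (a, c) in p_detect if x[(a, c)].value() > 0.5}
print("plan:", plan, "objective:", pulp.value(prob.objective))
```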

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 371
3611 Weibull Cumulative Distribution Function Analysis with Life Expectancy Endurance Test Result of Power Window Switch

Authors: Miky Lee, K. Kim, D. Lim, D. Cho

Abstract:

This paper presents the planning, rationale for test specification derivation, sampling requirements, test facilities, and result analysis used to conduct lifetime expectancy endurance tests on power window switches (PWS), considering thermally induced mechanical stress under diurnal cyclic temperatures during normal operation (power cycling). The detailed process of analysis and the test results on the selected PWS set are discussed in this paper. A statistical approach to 'lifetime expectancy' was applied to the measurement standards dealing with PWS lifetime determination through endurance tests, and the choice of approach within the framework of the task is explained. The present task was dedicated to voltage drop measurement to derive lifetime expectancy, while other standards mostly consider contact or surface resistance. The measurements to perform and the main instruments used are fully described. The failure data from the tests were analyzed to conclude lifetime expectancy through a statistical method using the Weibull cumulative distribution function. The first goal of this task is to develop a realistic worst-case lifetime endurance test specification, because the existing large number of switch test standards cannot induce the degradation mechanisms that make the switches less reliable. The second goal is to assess the quantitative reliability status of currently manufactured PWS based on the test specification newly developed through this project. The last and most important goal is to satisfy customers' requirements regarding product reliability.
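The Weibull analysis step can be sketched as follows, assuming a two-parameter Weibull fitted with SciPy to synthetic cycles-to-failure data standing in for the PWS results; the B10 and survival figures printed are purely illustrative.

```python
# Sketch of the Weibull life-data analysis step: fit a two-parameter Weibull to
# cycles-to-failure from the endurance test and read off shape/scale and B10 life.
# The failure data below are synthetic placeholders for the PWS test results.
import numpy as np
from scipy import stats

cycles_to_failure = np.array([48e3, 61e3, 70e3, 74e3, 82e3, 90e3, 96e3, 104e3, 118e3, 131e3])

# Fix the location parameter at 0 for the standard two-parameter Weibull.
shape, loc, scale = stats.weibull_min.fit(cycles_to_failure, floc=0)
b10 = stats.weibull_min.ppf(0.10, shape, loc=loc, scale=scale)   # 10% of units failed

print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} cycles")
print(f"B10 life = {b10:.0f} cycles")
print(f"P(survive 60k cycles) = {stats.weibull_min.sf(60e3, shape, loc=loc, scale=scale):.2f}")
```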

Keywords: power window switch, endurance test, Weibull function, reliability, degradation mechanism

Procedia PDF Downloads 235
3610 Evidence Theory Based Emergency Multi-Attribute Group Decision-Making: Application in Facility Location Problem

Authors: Bidzina Matsaberidze

Abstract:

It is known that, in emergency situations, multi-attribute group decision-making (MAGDM) models are characterized by insufficient objective data and a lack of time to respond to the task. Evidence theory is an effective tool for describing such incomplete information in decision-making models when the expert and his knowledge are involved in the estimation of the MAGDM parameters. We consider an emergency decision-making model, where expert assessments on humanitarian aid distribution centers (HADC) are represented by q-rung orthopair fuzzy numbers, and the data structure is described within the body-of-evidence framework. Based on focal probability construction and the experts' evaluations, an objective function, a distribution centers' selection ranking index, is constructed. Our approach for solving the constructed bicriteria partitioning problem consists of two phases. In the first phase, based on the covering matrix, we generate a matrix whose columns allow us to find all possible partitionings of the HADCs with the service centers; some constraints are also taken into consideration while generating this matrix. In the second phase, based on this matrix and using our exact algorithm, we find the partitionings (allocations of the HADCs to the centers) which correspond to the Pareto-optimal solutions. For an illustration of the obtained results, a numerical example is given for the facility location-selection problem.
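The second-phase filtering can be illustrated with a small Pareto-front sketch over candidate partitionings; the candidate solutions and their (ranking index, cost) values are assumptions for illustration only.

```python
# Sketch of the Pareto-filtering step: among candidate partitionings, keep only those
# not dominated on the two criteria (maximize ranking index, minimize cost).
def pareto_front(solutions):
    """Keep solutions not dominated by any other (maximize index, minimize cost)."""
    front = []
    for name, index, cost in solutions:
        dominated = any(i >= index and c <= cost and (i > index or c < cost)
                        for _, i, c in solutions)
        if not dominated:
            front.append((name, index, cost))
    return front

candidates = [
    ("partitioning-1", 0.82, 140),
    ("partitioning-2", 0.78, 120),
    ("partitioning-3", 0.80, 150),   # dominated by partitioning-1
    ("partitioning-4", 0.70, 100),
]
for sol in pareto_front(candidates):
    print(sol)
```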

Keywords: emergency MAGDM, q-rung orthopair fuzzy sets, evidence theory, HADC, facility location problem, multi-objective combinatorial optimization problem, Pareto-optimal solutions

Procedia PDF Downloads 92
3609 Creating Renewable Energy Investment Portfolio in Turkey between 2018-2023: An Approach on Multi-Objective Linear Programming Method

Authors: Berker Bayazit, Gulgun Kayakutlu

Abstract:

The World Energy Outlook shows that energy markets will change substantially within the next few decades. First, the action plans determined under COP21 and the aim of CO₂ emission reduction already have an impact on countries' policies. Second, swiftly changing technological developments in the field of renewable energy will influence the medium- and long-term energy generation and consumption behaviors of countries. Furthermore, the share of electricity in global energy consumption is expected to reach as much as 40 percent by 2040. Electric vehicles, heat pumps, new electronic devices, and digital improvements will be the outstanding technologies, and such innovations will drive the market modifications. In order to meet the sharply increasing electricity demand caused by these technologies, countries have to make new investments in electricity production, transmission, and distribution. In particular, the electricity generation mix becomes vital both for the prevention of CO₂ emissions and for the reduction of power prices. The majority of research and development investments are made in the field of electricity generation. Hence, the diversity and planning of primary sources for electricity generation are crucial for improving citizens' quality of life. Approaches considering only the CO₂ emission and total cost of generation are necessary but not sufficient to evaluate and construct the generation mix; on the other hand, employment and positive contributions to macroeconomic values are important factors that have to be taken into consideration. This study aims to plan new investments in renewable energies (solar, wind, geothermal, biogas, and hydropower) between 2018 and 2023 under four different goals. Therefore, a multi-objective programming model is proposed to optimize the goals of minimizing CO₂ emissions, the investment amount, and the electricity sales price, while maximizing total employment and the positive contribution to the current account deficit. In order to avoid user preference among the goals, Dinkelbach's algorithm and Guzel's approach have been combined. The achievements are discussed in comparison with current policies. Our study shows that new policies such as huge capacity allotments might be questionable, although the obligation for local production is positive. Improvements in grid infrastructure and redesigned support for biogas and geothermal can be recommended.
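Dinkelbach's algorithm, mentioned above, can be illustrated in isolation for a single fractional goal; in the sketch below a tiny discrete candidate set replaces the multi-objective LP subproblem, and the CO₂/energy figures are invented for illustration.

```python
# Sketch of Dinkelbach's algorithm for one fractional goal (CO2 emission per unit of
# energy) over a tiny set of candidate plans; in the paper the parametric subproblem
# is an LP, combined with Guzel's multi-objective approach. All numbers are illustrative.
candidates = [                     # (plan, CO2 in kt/yr, energy in GWh/yr)
    ("solar-heavy", 120.0, 900.0),
    ("wind-heavy", 100.0, 800.0),
    ("geothermal-mix", 150.0, 1300.0),
]

def best_for_lambda(lmbda):
    """Solve the parametric subproblem: minimize numerator - lambda * denominator."""
    return min(candidates, key=lambda c: c[1] - lmbda * c[2])

lmbda, tol = 0.0, 1e-9
for iteration in range(50):
    plan, num, den = best_for_lambda(lmbda)
    gap = num - lmbda * den
    print(f"iter {iteration}: lambda={lmbda:.5f}, plan={plan}, gap={gap:.5f}")
    if abs(gap) < tol:             # optimal ratio reached
        break
    lmbda = num / den              # Dinkelbach update

print("optimal ratio (CO2 per GWh):", round(lmbda, 5), "with plan", plan)
```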

Keywords: energy generation policies, multi-objective linear programming, portfolio planning, renewable energy

Procedia PDF Downloads 244
3608 Creation of a Realistic Railway Simulator Developed on a 3D Graphic Game Engine Using a Numerical Computing Programming Environment

Authors: Kshitij Ansingkar, Yohei Hoshino, Liangliang Yang

Abstract:

Advances in algorithms for autonomous systems have made it possible to research improving the accuracy of a train's location. This has the potential to increase the throughput of a railway network without the need to build additional infrastructure. To develop such a system, the railway industry requires data to test sensor fusion theories or to implement simultaneous localization and mapping (SLAM) algorithms. Although such simulation data and ground truth datasets are available for testing the automation algorithms of vehicles, due to regulations and economic considerations there is a dearth of such datasets in the railway industry. Thus, there is a need for a simulation environment that can generate realistic synthetic datasets. This paper proposes (1) to leverage the capabilities of open-source 3D graphic rendering software to create a visualization of the environment, (2) to utilize open-source 3D geospatial data for accurate visualization, and (3) to integrate the graphic rendering software with a programming language and numerical computing platform. To develop such an integrated platform, this paper utilizes the computing platform's advanced sensor models, such as LIDAR, camera, IMU, or GPS, and merges them with the 3D rendering of the game engine to generate high-quality synthetic data. These datasets can then be used to train railway models and improve the accuracy of a train's location.

Keywords: 3D game engine, 3D geospatial data, dataset generation, railway simulator, sensor fusion, SLAM

Procedia PDF Downloads 1
3607 Harmonic Assessment and Mitigation in Medical Diagnosis Equipment

Authors: S. S. Adamu, H. S. Muhammad, D. S. Shuaibu

Abstract:

Poor power quality in electrical power systems can cause medical equipment at healthcare centres to malfunction and present wrong medical diagnoses. Equipment such as X-ray and computerized axial tomography machines can pollute the system due to their high level of harmonics production, which may cause a number of undesirable effects such as heating, equipment damage, and electromagnetic interference. The conventional approach to mitigation uses passive inductor/capacitor (LC) filters, which have drawbacks such as large size, resonance problems, and fixed compensation behaviour. Current solutions generally employ active power filters using suitable control algorithms. This work focuses on assessing the level of Total Harmonic Distortion (THD) in medical facilities and various ways of mitigating it, using the radiology unit of an existing hospital as a case study. The measurement of the harmonics is conducted with a power quality analyzer at the point of common coupling (PCC). The measured THD levels are found to be higher than the IEEE 519-1992 standard limits. The system is then modelled as a harmonic current source using MATLAB/Simulink. To mitigate the unwanted harmonic currents, a shunt active filter is developed using a synchronous detection algorithm to extract the fundamental component of the source currents. A fuzzy logic controller is then developed to control the filter. The THD values without the active power filter are validated against the measured values, and the THD with the developed filter shows that the harmonics are now within the recommended limits.
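The THD figure measured at the PCC is computed essentially as in this short sketch, here on a synthetic 50 Hz current with assumed 5th and 7th harmonic content rather than the hospital measurements.

```python
# Sketch of the THD calculation at the point of common coupling: build a distorted
# 50 Hz current (20% 5th, 10% 7th harmonic assumed), take its FFT and compare the
# harmonic magnitudes with the fundamental.
import numpy as np

f0, fs, cycles = 50.0, 10_000.0, 10
t = np.arange(0, cycles / f0, 1 / fs)
current = (np.sin(2 * np.pi * f0 * t)
           + 0.20 * np.sin(2 * np.pi * 5 * f0 * t)
           + 0.10 * np.sin(2 * np.pi * 7 * f0 * t))

spectrum = np.abs(np.fft.rfft(current)) / len(t) * 2
freqs = np.fft.rfftfreq(len(t), 1 / fs)
fundamental = spectrum[np.argmin(np.abs(freqs - f0))]
harmonics = [spectrum[np.argmin(np.abs(freqs - h * f0))] for h in range(2, 26)]

thd = np.sqrt(np.sum(np.square(harmonics))) / fundamental * 100
print(f"THD = {thd:.1f} %")   # IEEE 519 current limits are typically 5-20 %, depending on Isc/IL
```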

Keywords: power quality, total harmonics distortion, shunt active filters, fuzzy logic

Procedia PDF Downloads 479
3606 Positive Energy Districts in the Swedish Energy System

Authors: Vartan Ahrens Kayayan, Mattias Gustafsson, Erik Dotzauer

Abstract:

The European Union is introducing the positive energy district concept, which has the goal of reducing overall carbon dioxide emissions. Other studies have already mapped the make-up of such districts and reviewed their definitions and where they are positioned. The Swedish energy system is unique compared to others in Europe, due to the implementation of low-carbon electricity and heat sources and the high uptake of district heating. The goal of this paper is to start a discussion about how the concept of positive energy districts can best be applied in the Swedish context and meet its mitigation goals. To explore how these differences affect the formation of positive energy districts, two cases were analyzed for their methods and how they integrate into the Swedish energy system: a district in Uppsala with a focus on energy and another in Helsingborg with a focus on climate. The Uppsala case uses primary energy calculations, which can be criticised, but it draws a virtual boundary that allows the surrounding system to be considered. The Helsingborg district has a complex methodology for considering the life cycle emissions of the neighborhood; it succeeds in considering the energy balance on a monthly basis, but it can be problematized for creating sub-optimized systems due to its tight geographical constraints. The discussion on shaping the definitions and methodologies for positive energy districts is taking place in Europe and in Sweden. We identify three pitfalls that must be avoided so that positive energy districts meet their mitigation goals in the Swedish context: the goal of pushing out fossil fuels is not relevant in the current energy system, the mismatch between summer electricity production and winter energy demand should be addressed, and further implementations should consider collaboration with the established district heating grid.

Keywords: positive energy districts, energy system, renewable energy, European Union

Procedia PDF Downloads 78
3605 Enhancing Goal Achievement through Improved Communication Skills

Authors: Lin Xie, Yang Wang

Abstract:

An extensive body of research suggests that students, teachers, and supervisors can enhance the likelihood of reaching their goals by improving their communication skills. It is highly important to learn how and when to provide different kinds of feedback (e.g., anticipatory, corrective, and positive) in order to gain better results and higher morale. The purpose of this mixed methods research is twofold: 1) to find out what factors affect effective communication among different stakeholders and how these factors affect student learning, and 2) to identify good practices for improving communication among different stakeholders and improving student achievement. This presentation begins with an introduction to recent research on Marshall Rosenberg's Nonviolent Communication (NVC) techniques, including their four important components: observations, feelings, needs, and requests. These techniques can be effectively applied at all levels of communication. To develop an in-depth understanding of the relationships among these techniques, this research collected, compared, and combined qualitative and quantitative data to better improve communication and support student learning.

Keywords: communication, education, language learning, goal achievement, academic success

Procedia PDF Downloads 72
3604 The Effects of an Online Career Intervention on University Students’ Levels of Career Adaptability

Authors: Anna Veres

Abstract:

People's ability to adapt to a constantly changing environment is essential. Career adaptability is central to Career Construction Theory, where proper adaptation to new situations, changing environments, and jobs requires adequate career development. Based on current career theories and the possibilities offered by digital technology, the primary goal of this study is to develop career adaptability through an online tool; its secondary goal is to apply an online career intervention program and explore its developmental possibilities. A total of 132 university students from a bachelor program took part in the study, of whom 65 received a four-week online career intervention, while the other 67 participants formed the control group. Based on the results, it can be stated that career adaptability can be developed, and there is great demand and interest from university students in using career-related programs on online platforms. Career interventions should also be performed online if suitable software and a well-constructed program are available. Limitations and further implications are discussed.

Keywords: career adaptability, career development, online career intervention, university students

Procedia PDF Downloads 140
3603 Development of a Web-Based Application for Intelligent Fertilizer Management in Rice Cultivation

Authors: Hao-Wei Fu, Chung-Feng Kao

Abstract:

In the era of rapid technological advancement, information technology (IT) has become integral to modern life, exerting significant influence across diverse sectors and serving as a catalyst for development in various industries. Within agriculture, the integration of IT offers substantial benefits, notably enhancing operational efficiency. Real-time monitoring systems, for instance, have been widely embraced in agriculture, effectively improving crop management practices. This study specifically addresses the management of rice panicle fertilizer, presenting the development of a web application tailored to handle data associated with rice panicle fertilizer management. Leveraging the normalized difference red edge index, this application optimizes the quantity of rice panicle fertilizer used, providing recommendations to agricultural stakeholders and service providers in the agricultural information sector. The overarching objective is to minimize costs while maximizing yields. Furthermore, a robust database system has been established to store and manage relevant data for future reference in rice cultivation management. Additionally, the study utilizes the Representational State Transfer software architectural style to construct an application programming interface (API), facilitating data creation, retrieval, updating, and deletion for users via the HyperText Transfer Protocol methods. Future plans involve integrating this API with third-party services to incorporate it into larger frameworks, thus catering to the diverse requirements of various third-party services.
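A minimal sketch of such a REST-style API is shown below, assuming Flask as the web framework; the routes, field names, in-memory store, and the toy NDRE-to-nitrogen rule are illustrative assumptions, not the actual service (an update route would follow the same pattern as the others).

```python
# Minimal sketch of a REST-style API for fertilizer records; Flask is an assumed
# framework, and the recommendation rule and data model are illustrative only.
from flask import Flask, jsonify, request

app = Flask(__name__)
records = {}          # field_id -> fertilizer record (stands in for the database)

@app.post("/fields/<field_id>/fertilizer")
def create_record(field_id):
    payload = request.get_json(force=True)          # e.g. {"ndre": 0.31, "area_ha": 1.2}
    # toy recommendation: lower NDRE -> more panicle fertilizer, capped at 60 kg N/ha
    payload["recommended_kg_n_per_ha"] = round(
        max(0.0, min(60.0, 90.0 * (0.5 - float(payload["ndre"])))), 1)
    records[field_id] = payload
    return jsonify(payload), 201

@app.get("/fields/<field_id>/fertilizer")
def read_record(field_id):
    return jsonify(records.get(field_id, {})), 200

@app.delete("/fields/<field_id>/fertilizer")
def delete_record(field_id):
    records.pop(field_id, None)
    return "", 204

if __name__ == "__main__":
    app.run(debug=True)   # POST/GET/DELETE cover create, retrieve and delete over HTTP
```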

Keywords: application programming interface, HyperText Transfer Protocol, nitrogen fertilizer intelligent management, web-based application

Procedia PDF Downloads 61
3602 Revenue Management of Perishable Products Considering Freshness and Price Sensitive Customers

Authors: Onur Kaya, Halit Bayer

Abstract:

Global grocery and supermarket sales are among the largest markets in the world, and perishable products such as fresh produce, dairy, and meat constitute the biggest section of these markets. Due to their deterioration over time, the demand for these products depends highly on their freshness; they become totally obsolete after a certain amount of time, causing a high amount of wastage and decreasing grocery profits. In addition, customers are asking for higher product variety in perishable product categories, leading to less predictable demand per product and to more out-dating. Effective management of these perishable products is an important issue, since billions of dollars' worth of food expires and is wasted every month. We consider coordinated inventory and pricing decisions for perishable products with a time- and price-dependent random demand function. We use stochastic dynamic programming to model this system for both periodically reviewed and continuously reviewed inventory systems and prove certain structural characteristics of the optimal solution. We prove that the optimal ordering decision has a monotone structure and that the optimal price decreases over time; however, the optimal price changes non-monotonically with respect to inventory size. We also analyze the effect of different parameters on the optimal solution through numerical experiments. In addition, we analyze simple-to-implement heuristics, investigate their effectiveness, and extract managerial insights. This study gives valuable insights into the management of perishable products in order to decrease wastage and increase profits.
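A tiny finite-horizon dynamic program in the spirit of the model is sketched below; the demand model, price set, and salvage value are illustrative assumptions, and the state space is kept deliberately small.

```python
# Tiny finite-horizon DP sketch: each day a price is chosen for a batch that expires
# after T days; expected demand falls with price and with product age. All parameters
# are illustrative assumptions, not the paper's model.
from functools import lru_cache
import math

PRICES = [4.0, 5.0, 6.0]
T, START_INV, SALVAGE = 3, 6, 0.5    # shelf life (days), units on hand, per-unit salvage

def demand_pmf(price, age, max_d=4):
    """Poisson-like demand, decreasing in price and in product age (lost freshness)."""
    lam = max(0.1, 3.0 - 0.4 * price + 1.2 * (T - age) / T)
    probs = [math.exp(-lam) * lam ** d / math.factorial(d) for d in range(max_d)]
    probs.append(1.0 - sum(probs))    # lump the tail into max_d
    return probs

@lru_cache(maxsize=None)
def value(age, inv):
    if inv == 0:
        return 0.0
    if age == T:                      # product expires: salvage whatever is left
        return SALVAGE * inv
    best = -1.0
    for p in PRICES:
        exp_rev = sum(prob * (p * min(d, inv) + value(age + 1, max(inv - d, 0)))
                      for d, prob in enumerate(demand_pmf(p, age)))
        best = max(best, exp_rev)
    return best

print("expected profit-to-go:", round(value(0, START_INV), 2))
```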

Keywords: age-dependent demand, dynamic programming, perishable inventory, pricing

Procedia PDF Downloads 247
3601 Important Factors Affecting the Effectiveness of Quality Control Circles

Authors: Sogol Zarafshan

Abstract:

The present study aimed to identify important factors affecting the effectiveness of quality control circles in a hospital, as well as to rank them using a combination of fuzzy VIKOR and Grey Relational Analysis (GRA). The study population consisted of five academic members and five experts in the field of nursing working in a hospital, who were selected using a purposive sampling method. In addition, a sample of 107 nurses was selected through simple random sampling using their employee codes and a random-number table. The required data were collected using a researcher-made questionnaire consisting of 12 factors. The validity of this questionnaire was confirmed by the opinions of the experts and academic members who participated in the study, as well as by confirmatory factor analysis, and its reliability was also verified (α = 0.796). The collected data were analyzed using SPSS 22.0 and LISREL 8.8, as well as the VIKOR-GRA and Importance-Performance Analysis (IPA) methods. The ranking of the factors affecting the effectiveness of quality control circles showed that the highest and lowest ranks were related to 'managers' and supervisors' support' and 'group leadership', respectively. The highest hospital performance was for factors such as 'clear goals and objectives' and 'group cohesiveness and homogeneity', and the lowest for 'reward system' and 'feedback system'. The results showed that although 'training the members', 'using the right tools', and 'reward system' were factors of great importance, the organization's performance on these factors was poor. Therefore, these factors should receive more attention from the hospital's managers and should be improved as soon as possible.
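The GRA step used in the ranking can be sketched as follows, on a small illustrative decision matrix rather than the survey data; equal criterion weights and the usual distinguishing coefficient of 0.5 are assumed.

```python
# Sketch of the Grey Relational Analysis (GRA) step: normalize each criterion, measure
# the distance to the ideal reference series and convert it to a grey relational grade.
# The decision matrix is illustrative, not the survey data.
import numpy as np

# rows: factors, columns: criteria (e.g. importance, performance), larger-is-better
X = np.array([
    [4.6, 3.1],    # Managers' and supervisors' support
    [4.2, 3.8],    # Clear goals and objectives
    [3.9, 2.5],    # Reward system
    [3.5, 3.6],    # Group leadership
])
rho = 0.5                                             # distinguishing coefficient

norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))   # benefit-type normalization
delta = np.abs(1.0 - norm)                            # distance to the ideal series (all ones)
coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grade = coeff.mean(axis=1)                            # equal criterion weights assumed

for i in np.argsort(-grade):
    print(f"factor {i}: grey relational grade = {grade[i]:.3f}")
```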

Keywords: Quality control circles, Fuzzy VIKOR, Grey Relational Analysis, Importance–Performance Analysis

Procedia PDF Downloads 135
3600 Guidelines for Cooperation between Police and the Media with an Approach to Prevent Juvenile Delinquency

Authors: Akbar Salimi, Mehdi Moghimi

Abstract:

Goal: Today, cooperative and systematic work is important and guarantees higher efficiency. This research was conducted with the aim of understanding guidelines for cooperation between the police and the national media in order to reduce juvenile delinquency. Method: This research is applied in terms of its goal and mixed in type, and it was conducted through a descriptive-analytical methodology. The data were collected through field surveys and documents. The statistical population included the professors of a higher education center in the area of education affairs, of whom 36 people were randomly selected. The data collection procedure consisted of interviews and a researcher-made questionnaire. Findings and results: Problems caused by the national media in relation to adolescents are categorized into three levels: production, broadcasting, and consumption. Eliminating or reducing these problems entails a set of estimations and predictions, as well as some education, which the police force has the capability to operationalize. Thus, three hypotheses were defined, and by conducting t-tests and Friedman tests, all three hypotheses were confirmed and their ranking was identified.

Keywords: management, media, TV, adolescents, delinquency

Procedia PDF Downloads 255
3599 Polarization of Lithuanian Society on Issues Related to Language Politics

Authors: Eglė Žurauskaitė, Eglė Gudavičienė

Abstract:

The goal of this paper is to reveal how polarization is constructed through the use of impoliteness strategies. In general, media help to spread various ideas very quickly, which means that processes of polarization are best revealed in computer-mediated communication (CMC) contexts. For this reason, the data for the research were collected from online texts about a current, highly divisive topic in Lithuania, Lithuanian language policy and regulations, because this topic is causing a lot of tension in Lithuanian society. Computer-mediated communication allows users to edit their message before they send it, meaning that addressers carefully select the verbal expressions used to convey their message; in other words, each impoliteness strategy and its verbal expression were created intentionally. Impoliteness strategies in this research are understood as various ways to reach a communicative goal: to belittle the other. To reach this goal, the public opinions of various Lithuanian public figures (e.g., cultural figures, politicians, officials) were collected from news portals covering 2019-2023 and analyzed using both quantitative and qualitative approaches. First, the problematic aspects of the language policy about which public figures complain were identified. Then, instances in which public figures take a defensive position were analyzed: how they express this position and what it reveals about Lithuanian culture. The findings of this research demonstrate how concepts of impoliteness theory can be applied in analyzing the process of polarization in Lithuanian society on issues related to the state language policy. To reveal how polarization is constructed, these tasks were also set: a) determine which impoliteness strategies are used throughout the process of creating polarization, and b) analyze how they were expressed verbally (e.g., as advice, an offer, etc.).

Keywords: impoliteness, Lithuanian language policy, polarization, impoliteness strategies

Procedia PDF Downloads 57
3598 Finding a Set of Long Common Substrings with Repeats from m Input Strings

Authors: Tiantian Li, Lusheng Wang, Zhaohui Zhan, Daming Zhu

Abstract:

In this paper, we propose two string problems and study algorithms and complexity for various versions of those problems. Let S = {s₁, s₂, . . . , sₘ} be a set of m strings. A common substring of S is a substring appearing in every string in S. Given a set of m strings S = {s₁, s₂, . . . , sₘ} and a positive integer k, we want to find a set C of k common substrings of S such that the k common substrings in C appear in the same order and have no overlap among the m input strings in S, and the total length of the k common substrings in C is maximized. This problem is referred to as the longest total length of k common substrings from m input strings (LCSS(k, m) for short). The other problem we study is called the longest total length of a set of common substrings with length more than l from m input strings (LSCSS(l, m) for short). Given a set of m strings S = {s₁, s₂, . . . , sₘ} and a positive integer l, for LSCSS(l, m) we want to find a set of common substrings of S, each of length more than l, such that the total length of all the common substrings is maximized. We show that both problems are NP-hard when k and m are variables. We propose dynamic programming algorithms with time complexity O(k n₁n₂) and O(n₁n₂) to solve LCSS(k, 2) and LSCSS(l, 2), respectively, where n₁ and n₂ are the lengths of the two input strings. We then design an algorithm for LSCSS(l, m) for the case where every length > l common substring appears once in each of the m − 1 input strings. The running time is O(n₁²m), where n₁ is the length of the input string with no restriction on length > l common substrings. Finally, we propose a fixed parameter algorithm for LSCSS(l, m), where each length > l common substring appears m − 1 + c times among the m − 1 input strings (other than s₁); in other words, each length > l common substring may repeatedly appear at most c times among the m − 1 input strings {s₂, s₃, . . . , sₘ}. The running time of the proposed algorithm is O((n₁2ᶜ)²m), where n₁ is the length of the input string with no restriction on repeats. LSCSS(l, m) is proposed to handle whole chromosome sequence alignment for different strains of the same species, where more than 98% of letters in core regions are identical.
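As a building block for the two-string case, the classic O(n₁n₂) dynamic program for a single longest common substring is sketched below; LCSS(k, 2) and LSCSS(l, 2) extend this table with the order, non-overlap, and length > l conditions described above.

```python
# Classic O(n1*n2) dynamic program for one longest common substring of two strings;
# the paper's LCSS(k, 2) and LSCSS(l, 2) algorithms build on this kind of table.
def longest_common_substring(s1: str, s2: str) -> str:
    n1, n2 = len(s1), len(s2)
    # dp[i][j] = length of the longest common suffix of s1[:i] and s2[:j]
    dp = [[0] * (n2 + 1) for _ in range(n1 + 1)]
    best_len, best_end = 0, 0
    for i in range(1, n1 + 1):
        for j in range(1, n2 + 1):
            if s1[i - 1] == s2[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
                if dp[i][j] > best_len:
                    best_len, best_end = dp[i][j], i
    return s1[best_end - best_len:best_end]

print(longest_common_substring("GATTACA", "TACAGATT"))   # -> "GATT" (ties broken by first occurrence)
```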

Keywords: dynamic programming, algorithm, common substrings, string

Procedia PDF Downloads 14