Search results for: weighted fuzzy goal programming
4022 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements
Authors: Agnieszka A. Malinowska, R. Hejmanowski
Abstract:
A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis. These include mainly water pumping, ongoing mining operations, the collapse of post-mining underground voids, and mining-induced earthquakes. These activities lead to large- and small-scale ground displacements and ground ruptures. The ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructure. The complexity of the ground deformation phenomenon in relation to the vulnerability of structures and infrastructure leads to considerable constraints in assessing the threat to those objects. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographical information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods based on fuzzy logic and expert knowledge for assessing the damage risk of buildings and infrastructure can be integrated into OS GIS. Those methods were verified by back analysis, proving their accuracy. Moreover, they can be supported by ground displacement observation: based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) for integrating the developed models and assessing the threat to urban areas. These approaches are reinforced by analysis of ground movement based on free satellite data, which supports the verification of ground movement prediction models and enables mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in an urban area threatened by underground mining activity. Vulnerability maps supported by satellite ground movement observation can mitigate the hazards of land displacement in urban areas close to mines.
Keywords: fuzzy logic, open source geographic information science (OS GIS), risk assessment on urbanized areas, satellite interferometry (InSAR)
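A minimal sketch of how one fuzzy expert rule of the kind described above could be evaluated, assuming triangular membership functions and a min (AND) operator; the variables, breakpoints, and the rule itself are illustrative assumptions, not the paper's calibrated expert system.

```python
# Toy fuzzy expert rule for building damage risk (all numbers assumed).
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

strain = 2.4            # ground strain, mm/m (assumed observation)
vulnerability = 0.7     # building vulnerability score in [0, 1] (assumed)

mu_strain_high = trimf(strain, 1.5, 3.0, 4.5)
mu_vuln_high = trimf(vulnerability, 0.5, 0.8, 1.1)

# Rule: IF strain is high AND vulnerability is high THEN risk is high
mu_risk_high = min(mu_strain_high, mu_vuln_high)
print(round(float(mu_risk_high), 2))   # degree to which "high risk" fires
```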
Procedia PDF Downloads 159
4021 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification
Authors: Hung-Sheng Lin, Cheng-Hsuan Li
Abstract:
Over the past few years, kernel-based algorithms have been widely used to extend some linear feature extraction methods, such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE), to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function. Moreover, they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and performs well in hyperspectral image classification. The DNP structure is an extension of the k-nearest neighbor technique. For each sample, there are two corresponding nearest proportions of samples: the self-class nearest proportion and the other-class nearest proportion. The term “nearest proportion” used here considers both local information and more global information. With these settings, the effect of the overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small-sample situations. Hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results. According to the experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest feature extraction
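The kernel extension pattern the abstract describes (a linear method made nonlinear by swapping dot products for a kernel function) can be illustrated with PCA versus KPCA in scikit-learn; KDNP itself is not a library method, so this sketch only shows the general mechanism on stand-in data.

```python
# PCA vs. its kernelized version on synthetic stand-in hyperspectral pixels.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

X = np.random.rand(100, 50)                      # 100 pixels, 50 spectral bands (assumed)
linear = PCA(n_components=5).fit_transform(X)
nonlinear = KernelPCA(n_components=5, kernel="rbf", gamma=0.1).fit_transform(X)
print(linear.shape, nonlinear.shape)             # both (100, 5), but different subspaces
```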
Procedia PDF Downloads 344
4020 Optimization of Air Pollution Control Model for Mining
Authors: Zunaira Asif, Zhi Chen
Abstract:
Sustainable air quality management is recognized as one of the most serious environmental concerns in mining regions. Mining operations emit various types of pollutants that have significant impacts on the environment. This study presents a stochastic control strategy, developing an air pollution control model to achieve a cost-effective solution. The optimization method is formulated to predict the cost of treatment using linear programming with an objective function and multiple constraints. The constraints mainly focus on two factors: production of metal should not exceed the available resources, and air quality should meet the standard criteria for each pollutant. The applicability of this model is explored through a case study of an open pit metal mine in Utah, USA. The method simultaneously uses meteorological data as a dispersion transfer function to reflect practical local conditions. The probabilistic analysis and the uncertainties in the meteorological conditions are handled by Monte Carlo simulation. Reasonable results have been obtained for selecting the optimized treatment technology for PM2.5, PM10, NOx, and SO2. Additional comparison analysis shows that the baghouse is the least-cost option compared with the electrostatic precipitator and wet scrubbers for particulate matter, whereas non-selective catalytic reduction and dry flue-gas desulfurization are suitable for NOx and SO2 reduction, respectively. Thus, this model can aid planners in reducing these pollutants at a marginal cost by suggesting pollution control devices, while accounting for dynamic meteorological conditions and mining activities.
Keywords: air pollution, linear programming, mining, optimization, treatment technologies
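A minimal sketch of the kind of linear program described above, using scipy; all device names, costs, removal efficiencies, and emission limits are illustrative assumptions, not the study's data.

```python
# Cost-minimizing assignment of PM10 tonnage to control devices (toy numbers).
from scipy.optimize import linprog

# Decision variables: tons of PM10 routed to each control device
# x = [baghouse, electrostatic precipitator, wet scrubber]
cost_per_ton = [12.0, 18.0, 25.0]          # $/ton treated (assumed)
removal_eff = [0.99, 0.97, 0.90]           # removal efficiency (assumed)
emitted_pm10 = 500.0                        # tons generated (assumed)
max_allowed = 20.0                          # tons that may reach the air (assumed standard)

# Constraint 1: all generated PM10 is assigned to some device: sum(x) = emitted
A_eq = [[1.0, 1.0, 1.0]]
b_eq = [emitted_pm10]
# Constraint 2: residual emissions meet the standard: sum((1 - eff_i) * x_i) <= max_allowed
A_ub = [[1.0 - e for e in removal_eff]]
b_ub = [max_allowed]

res = linprog(cost_per_ton, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3)
print(res.x, res.fun)  # tons per device and minimum total cost (baghouse wins here)
```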
Procedia PDF Downloads 208
4019 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm
Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali
Abstract:
Since optimization problems in water resources are complicated by the variety of decision-making criteria and objective functions, they are sometimes impossible to solve through regular optimization methods, or doing so is time- and money-consuming. Therefore, the use of modern tools and methods is inevitable in resolving such problems. An accurate and essential utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; a dam utilization rule curve is also provided in such studies. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural and water reservoir related data, and the geometric characteristics of the reservoir. The Dez dam water resource system was simulated using this basic information in order to determine the capability of its reservoir to meet the objectives of the plan. As a metaheuristic method, a genetic algorithm was applied in order to derive utilization rule curves (intersecting the reservoir volume). MATLAB software was used to solve the model. Rule curves were first obtained through the genetic algorithm; then the significance of using rule curves, and the resulting decrease in decision-making variables in the system, was determined through system simulation and by comparing the results with optimization results (Standard Operating Procedure). One of the most essential issues in the optimization of a complicated water resource system is the increasing number of variables: a lot of time is required to find an optimum answer, and in some cases no desirable result is obtained. In this research, intersecting the reservoir volume has been applied as a modern model to reduce the number of variables. Water reservoir programming studies have been performed based on basic information, general hypotheses and standards, applying a monthly simulation technique over a 30-year statistical period. Results indicate that applying the rule curve prevents extreme shortages and decreases monthly shortages.
Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir
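A toy version of the genetic algorithm search for a monthly rule curve, under heavy assumptions: synthetic inflows and demands, a simple shortage-based fitness, and generic GA operators, rather than the actual Dez dam simulation model.

```python
# GA evolving 12 monthly rule-curve storage targets to minimize supply shortage.
import numpy as np

rng = np.random.default_rng(0)
MONTHS, POP, GENS = 12, 40, 200
CAP = 100.0                                   # reservoir capacity (assumed units)
inflow = rng.uniform(5, 30, MONTHS)           # synthetic monthly inflow
demand = rng.uniform(10, 25, MONTHS)          # synthetic monthly demand

def shortage(rule):                           # fitness: total shortage under the rule curve
    storage, short = CAP / 2, 0.0
    for m in range(MONTHS):
        storage = min(storage + inflow[m], CAP)
        release = min(demand[m], max(storage - rule[m], 0.0))  # keep storage above rule
        short += demand[m] - release
        storage -= release
    return short

pop = rng.uniform(0, CAP, (POP, MONTHS))
for _ in range(GENS):
    fit = np.array([shortage(ind) for ind in pop])
    parents = pop[np.argsort(fit)[:POP // 2]]           # selection: keep best half
    cut = rng.integers(1, MONTHS, POP // 2)             # one-point crossover
    kids = np.array([np.r_[parents[i, :c], parents[-i - 1, c:]]
                     for i, c in enumerate(cut)])
    kids += rng.normal(0, 2.0, kids.shape) * (rng.random(kids.shape) < 0.1)  # mutation
    pop = np.vstack([parents, np.clip(kids, 0, CAP)])
best = pop[np.argmin([shortage(ind) for ind in pop])]
print("best rule curve:", np.round(best, 1))
```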
Procedia PDF Downloads 265
4018 A Generalized Weighted Loss for Support Vector Classification and Multilayer Perceptron
Authors: Filippo Portera
Abstract:
Standard algorithms usually employ a loss in which each error is the mere absolute difference between the true value and the prediction, in the case of a regression task. In the present work, we introduce several error-weighting schemes that generalize this consolidated routine. We study both a binary classification model for Support Vector Classification and a regression network for the Multilayer Perceptron. Results prove that the error is never worse than with the standard procedure and is often better.
Keywords: loss, binary-classification, MLP, weights, regression
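A minimal sketch of one possible error-weighting scheme for a regression loss, assuming a weighting that emphasizes large errors; the paper's specific schemes may differ.

```python
# Weighted regression loss: each absolute error is scaled by its own magnitude.
import numpy as np

def weighted_loss(y_true, y_pred, alpha=2.0):
    err = np.abs(y_true - y_pred)
    weights = err ** (alpha - 1.0)     # alpha=1 recovers the plain absolute error
    return np.mean(weights * err)      # i.e., mean(|err|**alpha)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.7, 4.0])
print(weighted_loss(y_true, y_pred, alpha=1.0))  # standard mean absolute error
print(weighted_loss(y_true, y_pred, alpha=2.0))  # errors weighted by their own size
```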
Procedia PDF Downloads 95
4017 Information Technology Approaches to Literature Text Analysis
Authors: Ayse Tarhan, Mustafa Ilkan, Mohammad Karimzadeh
Abstract:
Science was considered part of philosophy in ancient Greece. By the nineteenth century, it was understood that philosophy was very inclusive and that social and human sciences such as literature, history, and psychology should be separated and perceived as autonomous branches of science. The computer was also first seen as a tool of mathematical science. Over time, computer science has grown by encompassing every area in which technology exists, and its growth compelled the division of computer science into different disciplines, just as philosophy had been divided into different branches of science. Now there is almost no branch of science in which computers are not used. One of the newer autonomous disciplines of computer science is digital humanities, and one of the areas of digital humanities is literature. The material of literature is words, and thanks to software tools created using computer programming languages, analyses that would take a literature researcher months to complete can be performed quickly and objectively. In this article, three different tools that literary researchers can use in their work are introduced. These tools were created with the computer programming languages Python and R and brought to the world of literature. The purpose of introducing the aforementioned tools is to set an example for the development of special tools or programs for Ottoman language and literature in the future and to support such initiatives. The first example to be introduced is the stylometry tool developed with the R language. The other is The Metrical Tool, which is used to measure data in poems and was developed with Python. The latest literature analysis tool in this article is Voyant Tools, which is a multifunctional and easy-to-use tool.
Keywords: DH, literature, information technologies, stylometry, the metrical tool, voyant tools
Procedia PDF Downloads 151
4016 Medical and Surgical Nursing Care
Authors: Nassim Salmi
Abstract:
Postoperative mobilization is an important part of fundamental care. Increased mobilization has a positive effect on recovery, but immobilization is still a challenge in postoperative care. Aims: To report how the establishment of a national nursing database was used to measure postoperative mobilization in patients undergoing surgery for ovarian cancer. Mobilization was defined as at least 3 hours out of bed on postoperative day 1, with the goal set at achieving this in 60% of patients. Clinical nurses performed data entry for 4,400 patients with ovarian cancer. Findings: 46.7% of patients met the goal for mobilization on the first postoperative day, but variations in duration and type of mobilization were observed. Of those mobilized, 51.8% had been walking in the hallway. A national nursing database creates opportunities to optimize fundamental care. By comparing nursing data with oncological, surgical, and pathology data, it became possible to study mobilization in relation to cancer stage, comorbidity, treatment, and extent of surgery.
Keywords: postoperative care, gynecology, nursing documentation, database
Procedia PDF Downloads 116
4015 Modeling Metrics for Monitoring Software Project Performance Based on the GQM Model
Authors: Mariayee Doraisamy, Suhaimi bin Ibrahim, Mohd Naz’ri Mahrin
Abstract:
There are several methods for monitoring software projects, and the objective of monitoring is to ensure that software projects are developed and delivered successfully. Performance measurement is a method closely associated with monitoring, and it can be scrutinized by looking at two important attributes, efficiency and effectiveness, both of which are factors important to the success of a software project. Consequently, successful steering is achieved by monitoring and controlling a software project via performance measurement criteria and metrics. Hence, this paper aims to identify the performance measurement criteria and the metrics for monitoring the performance of a software project by using the Goal Question Metrics (GQM) approach. The GQM approach is utilized to ensure that the identified metrics are reliable and useful. These identified metrics are useful guidelines for project managers to monitor the performance of their software projects.
Keywords: software project performance, goal question metrics, performance measurement criteria, metrics
Procedia PDF Downloads 356
4014 The Difference in Basic Skills among Different Positional Players in Football
Authors: Habib Sk, Ashoke Kumar Biswas
Abstract:
Football is a team game. The eleven players of each team are arranged in different positions of play to serve specific tasks during a game. Some basic positions in a soccer game are (i) goalkeepers, (ii) defenders, (iii) midfielders, and (iv) forwards. Irrespective of position, all football players are required to learn and become skilled in basic soccer skills like passing, receiving, heading, throwing, dribbling, etc. The purpose of the study was to find out the difference, if any, in these basic soccer skills among positional players in football. A total of thirty-nine (39) teenage football players between 13 and 19 years of age were selected from Hooghly district in West Bengal, India, as subjects. Among them were seven (7) goalkeepers, twelve (12) defenders, thirteen (13) midfielders, and seven (7) forwards. Passing, dribbling, tackling, heading, and receiving were the selected basic soccer skills. The performance of the subjects of each positional group in the selected soccer skills was tested using a standard test for each. On the basis of results obtained through statistical analysis of the data, the following findings were obtained: i) there was a significant difference among the groups in passing, dribbling and heading, but not in receiving; ii) the goalkeepers and defenders were the weakest in all selected soccer skills; iii) midfielders were better in receiving than in the other three skills of passing, dribbling and heading; and iv) the forwards were better in passing, dribbling and heading but weakest in receiving compared with the other groups.
Keywords: performance, difference, skill, fundamental, soccer, position
Procedia PDF Downloads 146
4013 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine the ability of these data representations to reduce the cost for the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
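A minimal sketch of a local matching cost of the kind discussed, assuming a grayscale representation and a sum-of-absolute-differences (SAD) window cost; comparing such representations is exactly what the paper evaluates.

```python
# SAD cost volume over candidate disparities, plus winner-takes-all disparity.
import numpy as np

def sad_cost_volume(left, right, max_disp, win=3):
    """left, right: 2-D grayscale arrays; returns cost[d, y, x]."""
    h, w = left.shape
    half = win // 2
    cost = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        diff = np.abs(left - right) if d == 0 else np.abs(left[:, d:] - right[:, :-d])
        pad = np.pad(diff, half, mode="edge")
        # box filter: aggregate absolute differences over the win x win window
        agg = sum(pad[i:i + diff.shape[0], j:j + diff.shape[1]]
                  for i in range(win) for j in range(win))
        cost[d, :, d:] = agg
    return cost

left = np.random.rand(32, 48)           # stand-in image pair with a 2-pixel shift
right = np.roll(left, 2, axis=1)
disp = np.argmin(sad_cost_volume(left, right, max_disp=5), axis=0)  # disparity map
```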
Procedia PDF Downloads 370
4012 Development of Automatic Laser Scanning Measurement Instrument
Authors: Chien-Hung Liu, Yu-Fen Chen
Abstract:
This study used a triangular laser probe and a three-axis mobile platform for surface measurement, programmed the system, and applied it to real-time analytic statistics of different measured data. This structure was used to design a system integration program: using the triangular laser probe for scattering or reflection non-contact measurement, transferring the captured signals to the computer through RS-232, and using RS-485 to control the three-axis platform for wide-range measurement. The data captured by the laser probe are formed into a 3D surface. This study constructed an optical measurement application program based on the visual programming language concept. First, the signals are transmitted to the computer through RS-232/RS-485, and then the signals are stored and recorded in a graphic interface in real time. This programming concept analyzes various messages, produces appropriate presentation graphs and data processing, provides the users with friendly graphic interfaces and data processing state monitoring, and indicates graphically whether the present data are normal. The major functions of the measurement system developed in this study are thickness measurement, SPC, surface smoothness analysis, and analytical calculation of trend lines. A result report can be produced and printed promptly. This study measured different heights and surfaces successfully, performed on-line data analysis and processing effectively, and developed a man-machine interface for users to operate.
Keywords: laser probe, non-contact measurement, triangulation measurement principle, statistical process control, LabVIEW
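A minimal sketch of the acquisition side of such a system in Python, assuming pyserial, an illustrative port name and baud rate, and a simple one-value-per-line ASCII protocol; the instrument's real protocol and the LabVIEW front end are not reproduced here.

```python
# Read probe samples over RS-232 and keep simple running statistics for SPC.
import statistics
import serial  # pyserial

with serial.Serial("COM3", 9600, timeout=1) as port:   # assumed port settings
    samples = []
    for _ in range(100):
        line = port.readline().decode("ascii", errors="ignore").strip()
        if line:
            samples.append(float(line))               # assumed: one height value per line
    print("mean height:", statistics.mean(samples))
    print("stdev (for SPC control limits):", statistics.stdev(samples))
```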
Procedia PDF Downloads 360
4011 Improving School Design through Diverse Stakeholder Participation in the Programming Phase
Authors: Doris C. C. K. Kowaltowski, Marcella S. Deliberador
Abstract:
The architectural design process, in general, is becoming more complex as new technical, social, environmental, and economic requirements are imposed. For school buildings, this scenario also holds. The quality of a school building depends on known design criteria and professional knowledge, as well as feedback from building performance assessments. To attain high-performance school buildings, a design process should add a multidisciplinary team, through an integrated process, to ensure that the various specialists contribute at an early stage to design solutions. The participation of stakeholders is of special importance at the programming phase, when the search for the most appropriate design solutions is underway. The composition of a multidisciplinary team should comprise specialists in education, design professionals, and consultants in various fields such as environmental comfort and psychology, sustainability, safety and security, as well as administrators, public officials and neighbourhood representatives. Users, or potential users (teachers, parents, students, school officials, and staff), should be involved. User expectations must be guided, however, toward a proper understanding of how design responds to needs, to avoid disappointment. In this context, appropriate tools should be introduced to organize such diverse participants and ensure a rich and focused response to needs and a productive outcome of programming sessions. In this paper, the different stakeholders in a school design process are discussed in relation to their specific contributions, and a tool in the form of a card game is described to structure the design debates and ensure a comprehensive decision-making process. The game is based on design patterns for school architecture as found in the literature and is adapted to a specific reality: state-run public schools in São Paulo, Brazil. In this state, school buildings are managed by a foundation called Fundação para o Desenvolvimento da Educação (FDE). FDE supervises new designs and is responsible for the maintenance of ~5000 schools. The design process in this context was characterised, with a recommendation to improve the programming phase. Card games can create a common environment to which all participants can relate and, therefore, can contribute to briefing debates on an equal footing. The cards of the game described here represent essential school design themes as found in the literature. The tool was tested with stakeholder groups and with architecture students. In both situations, the game proved to be an efficient tool to stimulate school design discussions and to aid in the elaboration of a rich, focused and thoughtful architectural program for a given demand. The game organizes the debates, and all participants were shown to contribute spontaneously, each in their own field of expertise, to the decision-making process. Although the game was specifically based on a local school design process, it shows potential for other contexts because its content is based on known facts, needs and concepts of school design, which are global. A structured briefing phase with diverse stakeholder participation can enrich the design process and consequently improve the quality of school buildings.
Keywords: architectural program, design process, school building design, stakeholder
Procedia PDF Downloads 405
4010 Assessment of Green Infrastructure for Sustainable Urban Water Management
Authors: Suraj Sharma
Abstract:
Green infrastructure (GI) offers a contemporary approach to reducing the risk of flooding, improving water quality, and harvesting stormwater for sustainable use. GI promotes landscape planning to enhance sustainable development and urban resilience. However, the existing literature falls short of a comprehensive assessment of GI performance in terms of ecosystem functions and services for social, ecological, and economic system resilience. We propose a robust indicator set and fuzzy comprehensive evaluation (FCE) for quantitative and qualitative analysis of sustainable water management to assess urban resilience capacity. Green infrastructure in an urban resilience water management system (GIUR-WMS) supports decision-making for GI planning through scenario comparisons with an urban resilience capacity index. To demonstrate the GIUR-WMS, we develop five scenarios for five sectors of Chandigarh (12, 26, 14, 17, and 34) to test common types of GI (rain barrels, rain gardens, detention basins, porous pavements, and open spaces). The results show that open spaces achieve the highest green infrastructure urban resilience index of 4.22/5. To implement the open space scenario in urban sites, suitable vacant land can be converted to green spaces (for example, forests, low-impact recreation areas, and detention basins). GIUR-WMS is easy to replicate, customize, and apply to cities of different sizes to assess environmental, social and ecological dimensions.
Keywords: green infrastructure, assessment, urban resilience, water management system, fuzzy comprehensive evaluation
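A minimal sketch of the fuzzy comprehensive evaluation step, with assumed indicator weights and membership degrees standing in for the study's calibrated values.

```python
# FCE: weighted fuzzy composition of indicator memberships into a resilience index.
import numpy as np

# Three indicators (e.g., runoff reduction, water quality, cost); weights sum to 1
weights = np.array([0.5, 0.3, 0.2])

# Membership degrees of each indicator in five grades (very low .. very high)
membership = np.array([
    [0.0, 0.1, 0.2, 0.4, 0.3],   # indicator 1
    [0.0, 0.0, 0.3, 0.5, 0.2],   # indicator 2
    [0.1, 0.2, 0.4, 0.2, 0.1],   # indicator 3
])

evaluation = weights @ membership              # weighted fuzzy composition
grade_scores = np.array([1, 2, 3, 4, 5])       # defuzzify to a 1-5 index
index = float(evaluation @ grade_scores / evaluation.sum())
print(evaluation, round(index, 2))             # a value comparable in kind to 4.22/5
```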
Procedia PDF Downloads 143
4009 Impact of the Electricity Market Prices during the COVID-19 Pandemic on Energy Storage Operation
Authors: Marin Mandić, Elis Sutlović, Tonći Modrić, Luka Stanić
Abstract:
With the restructuring and deregulation of the power system, storage owners, generation companies and private producers can offer their services on various power markets and earn income in different types of markets, such as the day-ahead, real-time and ancillary services markets. During the COVID-19 pandemic, electricity prices, as well as ancillary services prices, increased significantly. The optimization of energy storage operation was performed using a suitable model for simulating the operation of a pumped storage hydropower plant under market conditions. The objective function maximizes the income earned through energy arbitrage, regulation-up, regulation-down and spinning reserve services. The optimization technique used for solving the objective function is mixed integer linear programming (MILP). In numerical examples, the pumped storage hydropower plant operation has been optimized using the realized hourly electricity market prices from Nord Pool for the pre-pandemic (2019) and pandemic (2020 and 2021) years. The impact of the electricity market prices during the COVID-19 pandemic on energy storage operation is shown through the analysis of income, operating hours, reserved capacity and consumed energy for each service. The results indicate the role of energy storage during significant fluctuations in electricity and services prices.
Keywords: electrical market prices, electricity market, energy storage optimization, mixed integer linear programming (MILP) optimization
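A minimal sketch of the energy-arbitrage core of such a MILP using PuLP, with assumed prices, capacities, and efficiency; the paper's full model also co-optimizes regulation-up, regulation-down, and spinning reserve.

```python
# Pumped-storage arbitrage: pump when prices are low, generate when high.
import pulp

prices = [30.0, 15.0, 80.0, 120.0]            # EUR/MWh, assumed hourly prices
T = range(len(prices))
P_MAX, E_MAX, EFF = 100.0, 400.0, 0.75        # MW, MWh, round-trip efficiency (assumed)

m = pulp.LpProblem("arbitrage", pulp.LpMaximize)
gen = pulp.LpVariable.dicts("gen", T, 0, P_MAX)        # MW generated (turbine)
pump = pulp.LpVariable.dicts("pump", T, 0, P_MAX)      # MW consumed (pump)
mode = pulp.LpVariable.dicts("mode", T, cat="Binary")  # 1 = generating, 0 = pumping
soc = pulp.LpVariable.dicts("soc", T, 0, E_MAX)        # stored energy, MWh

m += pulp.lpSum(prices[t] * (gen[t] - pump[t]) for t in T)   # income objective
for t in T:
    m += gen[t] <= P_MAX * mode[t]                    # cannot pump and generate at once
    m += pump[t] <= P_MAX * (1 - mode[t])
    prev = E_MAX / 2 if t == 0 else soc[t - 1]
    m += soc[t] == prev + EFF * pump[t] - gen[t]      # energy balance
m.solve(pulp.PULP_CBC_CMD(msg=False))
print([(pulp.value(gen[t]), pulp.value(pump[t])) for t in T])
```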
Procedia PDF Downloads 174
4008 Learning-Oriented School Education: Indicator Construction and Taiwan's Implementation Performance
Authors: Meiju Chen, Chaoyu Guo, Chia Wei Tang
Abstract:
The present study's purpose is twofold: first, to construct indicators for learning-oriented school education and, second, to conduct a survey to examine how learning-oriented education has been implemented in junior high schools after the launch of the 12-year compulsory curriculum. For indicator system construction, we compiled relevant literature to develop a preliminary indicator list model and then conducted two rounds of questionnaire surveys to gain comprehensive feedback from experts to finalize our indicator model. In the survey's first round, 12 experts were invited to evaluate the indicators' appropriateness. Based on the experts' consensus, we determined our final indicator list and used it to develop the Fuzzy Delphi questionnaire to finalize the indicator system and each indicator's relative value. For the fact-finding survey, we collected 454 valid samples to examine how the concept of learning-oriented education is adopted and implemented in the junior high school context. We also used these data in our importance-performance analysis to explore the strengths and weaknesses of school education in Taiwan. The results suggest that the indicator system for learning-oriented school education must consist of seven dimensions and 34 indicators. Among the seven dimensions, 'student learning' and 'curriculum planning and implementation' are the most important yet underperforming dimensions that need immediate improvement. We anticipate that the indicator system will be a useful tool for other countries' evaluation of schools' performance in learning-oriented education.
Keywords: learning-oriented education, school education, fuzzy Delphi method, importance-performance analysis
Procedia PDF Downloads 143
4007 A Low Cost Education Proposal Using Strain Gauges and Arduino to Develop a Balance
Authors: Thais Cavalheri Santos, Pedro Jose Gabriel Ferreira, Alexandre Daliberto Frugoli, Lucio Leonardo, Pedro Americo Frugoli
Abstract:
This paper presents a low-cost education proposal to be used in engineering courses. Engineering education in the universities of a developing country that needs an increasing number of engineers, carried out with quality and affordability, poses a difficult problem to solve. In Brazil, the political and economic scenario requires academic managers able to reduce costs without compromising the quality of education. Within this context, a method for teaching physics principles through the construction of an electronic balance is proposed. First, a method to develop and construct a load cell is proposed, through which the students can understand the physical principle of strain gauges and the bridge circuit. The load cell structure was made of aluminum 6351T6, with dimensions of 80 mm x 13 mm x 13 mm, and for its instrumentation, a complete Wheatstone bridge was assembled with strain gauges of 350 ohms. Additionally, the process involves the use of a software tool to document the prototypes (design circuits), the conditioning of the signal, a microcontroller, C language programming, as well as the development of the prototype. The project also uses an open-source I/O board (Arduino microcontroller). To design the circuit, the Fritzing software is used and, to program the controller, the open-source Arduino IDE. A load cell was chosen because strain gauges are accurate and have many applications in industry. A prototype was developed for this study, and it confirmed the affordability of this educational idea. Furthermore, the goal of this proposal is to motivate the students to understand the many possible high-technology applications of load cells and microcontrollers.
Keywords: Arduino, load cell, low-cost education, strain gauge
Procedia PDF Downloads 303
4005 Vehicle Routing Problem Considering Alternative Roads under Triple Bottom Line Accounting
Authors: Onur Kaya, Ilknur Tukenmez
Abstract:
In this study, we consider vehicle routing problems on networks with alternative direct links between nodes, and we analyze a multi-objective problem considering financial, environmental and social objectives in this context. In real life, there may exist several alternative direct roads between two nodes, and these roads may differ in their lengths and durations. For example, a road might be shorter than another but require a longer time due to traffic and speed limits. Similarly, some toll roads might be shorter or faster but require additional payment, leading to higher costs. We consider such alternative links in our problem and develop a mixed integer linear programming model that determines which alternative link to use between two nodes, in addition to determining the optimal routes for the different vehicles, depending on the model objectives and constraints. We consider minimum-cost routing as the financial objective for the company; minimizing CO2 emissions and gas usage as the environmental objectives; and optimizing driver working conditions/working hours and minimizing the risk of accidents as the social objectives. With these objective functions, we aim to determine which routes and which alternative links should be used, in addition to the speed choices on each link. We discuss the results of the developed vehicle routing models and compare their results depending on the system parameters.
Keywords: vehicle routing, alternative links between nodes, mixed integer linear programming, triple bottom line accounting
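A minimal sketch of the alternative-link choice in such a model, assuming a fixed route, illustrative per-road attributes, and assumed objective weights; the full model also optimizes the routes and speeds themselves.

```python
# Pick one alternative road per leg, minimizing a weighted multi-objective cost.
import pulp

# legs -> list of (cost, co2_kg, accident_risk) per alternative road (assumed data)
legs = {
    "A-B": [(50, 20, 0.2), (65, 14, 0.1)],                 # toll road vs. longer free road
    "B-C": [(30, 12, 0.3), (30, 16, 0.1), (45, 10, 0.1)],
}
w_cost, w_co2, w_risk = 1.0, 2.0, 100.0                    # assumed objective weights

m = pulp.LpProblem("alt_links", pulp.LpMinimize)
x = {(l, i): pulp.LpVariable(f"x_{l.replace('-', '_')}_{i}", cat="Binary")
     for l, alts in legs.items() for i in range(len(alts))}
m += pulp.lpSum((w_cost * c + w_co2 * e + w_risk * r) * x[l, i]
                for l, alts in legs.items() for i, (c, e, r) in enumerate(alts))
for l, alts in legs.items():                               # exactly one alternative per leg
    m += pulp.lpSum(x[l, i] for i in range(len(alts))) == 1
m.solve(pulp.PULP_CBC_CMD(msg=False))
print({l: [i for i in range(len(a)) if pulp.value(x[l, i]) == 1]
       for l, a in legs.items()})
```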
Procedia PDF Downloads 407
4004 Conflicts Identification Approach among Stakeholders in Goal-Oriented Requirements Analysis
Authors: Muhammad Suhaib
Abstract:
Requirements analysis is the most important part of software engineering for both system application development and project requirements. Conflicts often arise during the requirements gathering and analysis phase. This research aims to identify conflicts during the requirements gathering phase of the software development life cycle. Research, development, and technology have converted the world into a global village. During the requirements elicitation/gathering phase, it is very difficult to understand the main objectives of stakeholders. After completion of the requirements elicitation task, the final results are used for the Software Requirements Specification (SRS). The SRS is the most important outcome of the requirements analysis phase; it is the foundation between the developers and the stakeholders or customers. The proposed methodology will be helpful in identifying those conflicts easily during the initial phase of the project.
Keywords: goal oriented requirements analysis, conflicts identification model, requirements analysis, requirements engineering
Procedia PDF Downloads 134
4003 Mixed Integer Programming-Based One-Class Classification Method for Process Monitoring
Authors: Younghoon Kim, Seoung Bum Kim
Abstract:
One-class classification plays an important role in detecting outliers and abnormality among normal observations. In previous research, several attempts were made to extend the scope of application of one-class classification techniques to statistical process control problems. In most previous approaches, such as the support vector data description (SVDD) control chart, the design of the control limits is based on the assumption that the proportion of abnormal observations is approximately equal to an expected Type I error rate in the Phase I process. Because of the limitation of one-class classification techniques based on convex optimization, we cannot make the proportion of abnormal observations exactly equal to the expected Type I error rate: controlling the Type I error rate requires optimizing constraints with integer decision variables, which convex optimization cannot satisfy. This limitation is undesirable, from theoretical and practical perspectives, for constructing effective control charts. In this work, to address the limitation of previous approaches, we propose a one-class classification algorithm based on the mixed integer programming technique, which can solve problems formulated with continuous and integer decision variables. The proposed method minimizes the radius of a spherically shaped boundary subject to the number of normal data points being equal to a constant value specified by the user. By modifying this constant value, users can exactly control the proportion of normal data described by the spherically shaped boundary. Thus, the proportion of abnormal observations can be made theoretically equal to the expected Type I error rate in the Phase I process. Moreover, analogously to SVDD, the boundary can be made to describe complex structures by using kernel functions. A new multivariate control chart applying the effectiveness of the algorithm is proposed. This chart uses a monitoring statistic to characterize the degree to which a point is abnormal, as obtained through the proposed one-class classification. The control limit of the proposed chart is established by the radius of the boundary. The usefulness of the proposed method was demonstrated through experiments with simulated and real process data from a thin film transistor-liquid crystal display.
Keywords: control chart, mixed integer programming, one-class classification, support vector data description
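A minimal sketch of the mixed-integer idea with PuLP, under a simplifying assumption that the sphere's center is fixed at the data mean (the proposed method shapes the boundary more flexibly, e.g., via kernels); big-M constraints link the binary inside/outside indicators to the radius.

```python
# Smallest sphere enclosing exactly k of n points: exact Type I error control.
import numpy as np
import pulp

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 2))
center = X.mean(axis=0)                       # simplification: center fixed at the mean
d2 = np.sum((X - center) ** 2, axis=1)        # squared distances to the center
n, k = len(X), 27                             # enclose exactly 27 of 30 (10% Type I rate)
M = float(d2.max()) + 1.0                     # big-M constant

m = pulp.LpProblem("one_class_mip", pulp.LpMinimize)
r2 = pulp.LpVariable("r2", lowBound=0)                        # squared radius
z = pulp.LpVariable.dicts("inside", range(n), cat="Binary")   # 1 = point inside
m += r2
for i in range(n):
    m += d2[i] <= r2 + M * (1 - z[i])         # if z[i] = 1, point i must lie inside
m += pulp.lpSum(z[i] for i in range(n)) == k  # exactly k points described as normal
m.solve(pulp.PULP_CBC_CMD(msg=False))
print("boundary radius:", pulp.value(r2) ** 0.5)
```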
Procedia PDF Downloads 174
4002 Urban Planning and Sustainable Cities: Issues and Viewpoints
Abstract:
This article provides an overview of academic research on planning the urban future, with a focus on sustainable cities. The goal of the article is to provide a global update on the issues and viewpoints that currently surround urban planning, sustainability, and development. Based on scholarly and scientific research, the review presents potential avenues of investigation and development for ensuring a sustainable urban future. Recent scholarly research in the context of sustainable cities has focused on the conceptualization and knowledge generation involved in building sustainable cities. The aim of the study is to describe the present state of research on concepts and terminologies related to sustainable cities, planning, and techniques for developing and evaluating urban sustainability, even though its breadth may not be all-inclusive. The objective is to offer local governments, urban and development practitioners, and other stakeholders some perspective and guidance in striving towards urban sustainability in the future.
Keywords: urban sustainability, sustainable urban development, sustainability assessment, sustainable development, sustainable cities
Procedia PDF Downloads 43
4001 Parallel Fuzzy Rough Support Vector Machine for Data Classification in Cloud Environment
Authors: Arindam Chaudhuri
Abstract:
Classification of data has been actively used as one of the most effective and efficient means of conveying knowledge and information to users. The primary focus has always been on techniques for extracting useful knowledge from data such that returns are maximized. With the emergence of huge datasets, existing classification techniques often fail to produce desirable results. The challenge lies in analyzing and understanding the characteristics of massive data sets by retrieving useful geometric and statistical patterns. We propose a supervised parallel fuzzy rough support vector machine (PFRSVM) for data classification in a cloud environment. The classification is performed by PFRSVM using a hyperbolic tangent kernel. The fuzzy rough set model takes care of the sensitiveness of noisy samples and handles impreciseness in training samples, bringing robustness to the results. The membership function is a function of the center and radius of each class in feature space and is represented with a kernel. It plays an important role in sampling the decision surface. The success of PFRSVM is governed by choosing appropriate parameter values. The training samples are either linearly or nonlinearly separable. The different input points make unique contributions to the decision surface. The algorithm is parallelized with a view to reducing training times. The system is built on a support vector machine library using the Hadoop implementation of MapReduce. The algorithm is tested on large data sets to check its feasibility and convergence. The performance of the classifier is also assessed in terms of the number of support vectors. The challenges encountered in implementing big data classification in machine learning frameworks are also discussed. The experiments were done in the cloud environment available at the University of Technology and Management, India. Results are illustrated for Gaussian RBF and Bayesian kernels. The effect of variability in prediction and generalization of PFRSVM is examined with respect to values of the parameter C. It effectively resolves outlier effects and imbalanced and overlapping class problems, generalizes to unseen data, and relaxes the dependency between features and labels. The average classification accuracy of PFRSVM is better than that of other classifiers for both Gaussian RBF and Bayesian kernels. The experimental results on both synthetic and real data sets clearly demonstrate the superiority of the proposed technique.
Keywords: FRSVM, Hadoop, MapReduce, PFRSVM
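A minimal sketch of a fuzzy membership assignment of the kind described, assuming a hyperbolic tangent kernel and a distance-to-class-center membership; the exact PFRSVM membership form is richer than this illustration.

```python
# Per-sample fuzzy memberships from kernel distance to the class center.
import numpy as np

def tanh_kernel(a, b, kappa=0.01, c=0.0):
    return np.tanh(kappa * a @ b.T + c)

def memberships(X, y, delta=1e-3):
    mu = np.empty(len(X))
    for cls in np.unique(y):
        Xc = X[y == cls]
        K = tanh_kernel(Xc, Xc)
        # squared distance to the class center in kernel feature space:
        # ||phi(x) - mean(phi)||^2 = k(x,x) - 2*mean_j k(x,xj) + mean_jk k(xj,xk)
        d2 = np.diag(K) - 2 * K.mean(axis=1) + K.mean()
        radius2 = np.maximum(d2, 0).max()
        mu[y == cls] = 1.0 - np.sqrt(np.maximum(d2, 0) / (radius2 + delta))
    return mu

X = np.random.randn(20, 3)
y = np.array([0] * 10 + [1] * 10)
print(memberships(X, y))  # weights in (0, 1] scaling each sample's slack penalty
```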
Procedia PDF Downloads 490
4000 Clinical Applications of Amide Proton Transfer Magnetic Resonance Imaging: Detection of Brain Tumor Proliferative Activity
Authors: Fumihiro Imai, Shinichi Watanabe, Shingo Maeda, Haruna Imai, Hiroki Niimi
Abstract:
It is important to know the growth rate of brain tumors before surgery because it influences treatment planning, including not only the surgical resection strategy but also adjuvant therapy after surgery. Amide proton transfer (APT) imaging is an emerging molecular magnetic resonance imaging (MRI) technique based on chemical exchange saturation transfer without the administration of a contrast medium. The underlying assumption in APT imaging of tumors is that there is a close relationship between the proliferative activity of the tumor and mobile protein synthesis. We aimed to evaluate the diagnostic performance of APT imaging of pre- and post-treatment brain tumors. Ten patients with brain tumors underwent conventional and APT-weighted sequences on a 3.0 Tesla MRI before clinical intervention. The maximum and minimum APT-weighted signals (APTWmax and APTWmin) in each solid tumor region were obtained and compared before and after clinical intervention. All surgical specimens were examined for histopathological diagnosis. Eight of the ten patients underwent adjuvant therapy after surgery. The histopathological diagnosis was glioma in 7 patients (WHO grade 2 in 2 patients, WHO grade 3 in 3 patients, and WHO grade 4 in 2 patients), meningioma WHO grade 1 in 2 patients, and primary lymphoma of the brain in 1 patient. High-grade gliomas showed significantly higher APTW signals than low-grade gliomas. APTWmax in one huge parasagittal meningioma infiltrating into the skull bone was higher than that in glioma WHO grade 4. On the other hand, APTWmax in another convexity meningioma was the same as that in glioma WHO grade 3. Diagnosis of primary lymphoma of the brain was possible with APT imaging before pathological confirmation. APTW signals in residual tumors decreased dramatically within one year after adjuvant therapy in all patients. APT imaging demonstrated excellent diagnostic performance for the planning of surgery and adjuvant therapy of brain tumors.
Keywords: amides, magnetic resonance imaging, brain tumors, cell proliferation
Procedia PDF Downloads 86
3999 Geospatial Analysis for Predicting Sinkhole Susceptibility in Greene County, Missouri
Authors: Shishay Kidanu, Abdullah Alhaj
Abstract:
Sinkholes in the karst terrain of Greene County, Missouri, pose significant geohazards, imposing challenges on construction and infrastructure development, with potential threats to lives and property. To address these issues, understanding the influencing factors and modeling sinkhole susceptibility is crucial for effective mitigation through strategic changes in land use planning and practices. This study utilizes geographic information system (GIS) software to collect and process diverse data, including topographic, geologic, hydrogeologic, and anthropogenic information. Nine key sinkhole-influencing factors, ranging from slope characteristics to proximity to geological structures, were carefully analyzed. The frequency ratio method establishes relationships between the attribute classes of these factors and sinkhole events, deriving class weights to indicate their relative importance. Weighted integration of these factors is accomplished using the Analytic Hierarchy Process (AHP) and the Weighted Linear Combination (WLC) method in a GIS environment, resulting in a comprehensive sinkhole susceptibility index (SSI) model for the study area. Employing the Jenks natural breaks classifier method, the SSI values are categorized into five distinct sinkhole susceptibility zones: very low, low, moderate, high, and very high. Validation of the model, conducted through the Area Under Curve (AUC) and Sinkhole Density Index (SDI) methods, demonstrates a robust correlation with sinkhole inventory data. The prediction rate curve yields an AUC value of 74%, indicating 74% validation accuracy. The SDI result further supports the success of the sinkhole susceptibility model. This model offers reliable predictions of the future distribution of sinkholes, providing valuable insights for planners and engineers in the formulation of development plans and land-use strategies. Its application extends to enhancing preparedness and minimizing the impact of sinkhole-related geohazards on both infrastructure and the community.
Keywords: sinkhole, GIS, analytical hierarchy process, frequency ratio, susceptibility, Missouri
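A minimal sketch of the frequency ratio (FR) and weighted linear combination (WLC) computations, with assumed class counts, AHP weights, and cell scores.

```python
# FR per factor class, then WLC of factor scores into a susceptibility index.
import numpy as np

# For a "slope" factor: pixels per class and sinkholes per class (assumed counts)
class_pixels = np.array([40000, 30000, 20000, 10000])
class_sinkholes = np.array([10, 25, 40, 25])

# FR = (% of sinkholes in class) / (% of area in class); > 1 means susceptible
fr = (class_sinkholes / class_sinkholes.sum()) / (class_pixels / class_pixels.sum())

# WLC: SSI = sum over factors of AHP_weight * factor_score, per raster cell.
ahp_weights = np.array([0.45, 0.35, 0.20])        # assumed AHP-derived weights
cell_scores = np.array([[2.0, 1.4, 0.8],          # cell A: FR scores for 3 factors
                        [0.4, 0.9, 1.1]])         # cell B
ssi = cell_scores @ ahp_weights                   # sinkhole susceptibility index
print(np.round(fr, 2), np.round(ssi, 2))          # SSI then classified with Jenks breaks
```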
Procedia PDF Downloads 74
3998 Perceptions on Community Media for Effective Acculturation in Nigerian Indigenous Languages
Authors: Chima Onwukwe
Abstract:
This study examined perceptions of the effectiveness, attendant challenges, and remedies of community media for effective acculturation in Nigerian languages. A qualitative survey design was adopted, with Focus Group Discussions (FGDs) and Key Informant Interviews (KIIs) of 50 purposively chosen informants. It was perceived that community media could serve as a veritable platform for effective acculturation in Nigerian languages, since they would engender the setting of acculturation in Nigerian languages as a national objective or goal. It was further held that the strengths of community media for acculturation lie in being goal-defined and in ensuring local content and diversification. The study identified that, as palatable as the proposal for community media for effective acculturation in Nigerian languages is, it would be fraught with some setbacks or challenges, which are nevertheless very much surmountable. Perceptions pointed towards the transient nature of community media and funding as challenges, as well as multi-based funding as one remedy. The immediate establishment of community media for the purpose of acculturation in Nigerian languages was recommended.
Keywords: perception, community media, acculturation, indigenous language
Procedia PDF Downloads 269
3997 Implementation of Quality Function Development to Incorporate Customer’s Value in the Conceptual Design Stage of Construction Projects
Authors: Ayedh Alqahtani
Abstract:
Many construction firms in Saudi Arabia dedicated to building projects agree that the most important factor in the real estate market is the value that they can give to their customer. These firms understand the value of their client in different ways. Value can be defined as the size of the building project in relation to the cost, the design quality of the materials utilized in finish work, or any other features of building rooms such as the bathroom. Value can also be understood as something suitable for the money the client is investing in the new property. A quality tool is required to support companies in achieving a solution for the building project and in understanding and managing the customer’s needs. The Quality Function Development (QFD) method can play this role, since the main difference between QFD and other conventional quality management tools is that QFD is a valuable and very flexible tool for design that takes into account the voice of the customer (VOC). Currently, organizations and agencies are seeking suitable models able to deal better with uncertainty that are flexible and easy to use. The primary aim of this research project is to incorporate the customer’s requirements into the conceptual design of construction projects. Towards this goal, QFD is selected due to its capability to integrate the design requirements to meet the customer’s needs. To develop QFD, this research focused on the contribution of the different (significantly weighted) input factors that represent the main variables influencing QFD, and on subsequent analysis of the techniques used to measure them. First, this research will review the literature to determine the current practice of QFD in construction projects. Then, the researcher will review the literature to define the current customers of residential projects and gather information on customers’ requirements for the design of residential buildings. After that, qualitative survey research will be conducted to rank customers’ needs and provide the views of stakeholder practitioners about how these needs can affect their satisfaction. Moreover, a qualitative focus group with the members of the design team will be conducted to determine the improvement levels and technical details for the design of residential buildings. Finally, the QFD will be developed to establish the degree of significance of the design solutions.
Keywords: quality function development, construction projects, Saudi Arabia, quality tools
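A minimal sketch of the core QFD computation (customer weights propagated through a relationship matrix to rank design requirements), with illustrative requirements, weights, and 9/3/1 relationship scores.

```python
# House-of-quality core: customer weights x relationship matrix -> priorities.
import numpy as np

customer_weights = np.array([0.4, 0.35, 0.25])   # e.g., size, finish quality, bathroom features

# Rows: customer needs; columns: design requirements (e.g., floor area,
# material grade, fixture specification). 9 = strong, 3 = moderate, 1 = weak.
relationship = np.array([
    [9, 1, 0],
    [1, 9, 3],
    [0, 3, 9],
])

technical_priority = customer_weights @ relationship
ranking = np.argsort(technical_priority)[::-1]
print(technical_priority, ranking)   # which design requirement to emphasize first
```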
Procedia PDF Downloads 124
3996 IOT Based Automated Production and Control System for Clean Water Filtration Through Solar Energy Operated by Submersible Water Pump
Authors: Musse Mohamud Ahmed, Tina Linda Achilles, Mohammad Kamrul Hasan
Abstract:
The deterioration of mother nature is evident these days, with the clear danger of human catastrophe emanating from greenhouse gases (GHG) and increasing CO2 emissions to the environment. PV technology can help to reduce the dependency on fossil fuels, decreasing air pollution and slowing down the rate of global warming. The objective of this paper is to propose, develop and design the production of a clean water supply for rural communities using an appropriate technology, such as the Internet of Things (IoT), that does not create any CO2 emissions. Additionally, maximizing solar energy power output while countering the natural intermittency of the solar source when the sun is less present is another goal of this work. The paper presents the development of a critical automated control system for solar energy power output optimization using several new techniques. A water pumping system is developed to supply clean water with the application of IoT and renewable energy. This system is effective in providing clean water to remote and off-grid areas using photovoltaic (PV) technology that collects energy generated from sunlight. The focus of this work is to design and develop a submersible solar water pumping system that applies an IoT implementation. Thus, this system has been executed and programmed using the Arduino Software (IDE), Proteus, MATLAB and the C++ programming language. The mechanism of this system is that it pumps water from a water reservoir, powered by solar energy, and clean water production is incorporated using a filtration system through the submersible solar water pumping system. The filtering system is an additional application platform intended to provide a clean water supply to households in Sarawak State, Malaysia.
Keywords: IOT, automated production and control system, water filtration, automated submersible water pump, solar energy
Procedia PDF Downloads 88
3995 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules
Authors: Mohsen Maraoui
Abstract:
With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing for multilingual text documents based on a conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weighting; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through the conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model (in an attempt to discover the non-taxonomic, or contextual, relations between the concepts of a document). These relations are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. Next, we apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing
Procedia PDF Downloads 141
3994 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions
Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib
Abstract:
Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Fully encrypting a large volume of messages results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content, fetching the important elements within. We implemented four classification algorithms to fetch the importance-level value within each XML document. Classified content is processed using element-wise encryption for selected parts with “High”, “Medium” or “Low” importance-level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm to overcome the problem of computational overhead, in which the substitute-byte and shift-row steps remain as in the original AES, while the mix-column operation is replaced by a 128 permutation operation followed by the add-round-key operation. An implementation has been conducted using a data set fetched from an e-banking service to demonstrate system functionality and efficiency. Results from our implementation showed a clear improvement in the processing time of encrypting XML documents.
Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption
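A minimal sketch of element-wise encryption after classification, where only elements labeled with high or medium importance are encrypted; Fernet (an AES-based scheme from the cryptography package) stands in for the paper's modified AES, and the importance labels are assumed to come from the classification step.

```python
# Encrypt only the high/medium-importance XML elements, leave the rest readable.
import xml.etree.ElementTree as ET
from cryptography.fernet import Fernet

key = Fernet.generate_key()
f = Fernet(key)

doc = ET.fromstring(
    "<tx><account importance='High'>12345</account>"
    "<amount importance='Medium'>950.00</amount>"
    "<note importance='Low'>rent</note></tx>"
)

for elem in doc.iter():
    if elem.get("importance") in ("High", "Medium") and elem.text:
        elem.text = f.encrypt(elem.text.encode()).decode()   # encrypt element content only

print(ET.tostring(doc, encoding="unicode"))  # low-importance elements stay in the clear
```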
Procedia PDF Downloads 411
3993 Nature of Forest Fragmentation Owing to Human Population along Elevation Gradient in Different Countries in Hindu Kush Himalaya Mountains
Authors: Pulakesh Das, Mukunda Dev Behera, Manchiraju Sri Ramachandra Murthy
Abstract:
Large numbers of people living in and around the Hindu Kush Himalaya (HKH) region depend on this diverse mountainous region for ecosystem services. Following the global trend, this region is also experiencing rapid population growth and growing demand for timber and agricultural land. The eight countries sharing the HKH region have different forest resource utilization and conservation policies that exert varying forces on the forest ecosystem. This has created variable spatial and altitudinal gradients in the rate of deforestation and the corresponding forest patch fragmentation. The quantitative relationship between fragmentation and demography has not previously been established for the HKH along the elevation gradient. The current study was carried out to relate the overall and country-specific nature of landscape fragmentation along the altitudinal gradient to the demography of each sharing country. We used tree canopy cover data derived from Landsat imagery to analyze the deforestation and afforestation rates and the corresponding landscape fragmentation observed during 2000-2010. The area-weighted mean radius of gyration (AMN radius of gyration) was computed owing to its advantage as a spatial indicator of fragmentation over non-spatial fragmentation indices. Using the subtraction method, the change in fragmentation during 2000-2010 was computed. Using tree canopy cover data as a surrogate for forest cover, the highest forest loss was observed in Myanmar, followed by China, India, Bangladesh, Nepal, Pakistan, Bhutan, and Afghanistan. However, the sequence of fragmentation was different: after the maximum fragmentation observed in Myanmar came India, China, Bangladesh, and Bhutan, whereas an increase in fragmentation was seen in the sequence Nepal, Pakistan, and Afghanistan. Using the SRTM-derived DEM, we observed a higher rate of fragmentation up to 2400 m, which corroborated the high human population for the years 2000 and 2010. To derive the nature of fragmentation along the altitudinal gradient, Statistica software was used, in which a user-defined function was utilized for regression, applying the Gauss-Newton estimation method with 50 iterations. We observed an overall logarithmic decrease in fragmentation change (area-weighted mean radius of gyration), forest cover loss, and population growth during 2000-2010 along the elevation gradient, with very high R2 values (0.889, 0.895, and 0.944, respectively). The observed negative logarithmic function, with the major contribution in the initial elevation gradients, suggests gap-filling afforestation in the lower altitudes to enhance forest patch connectivity. Our finding on the pattern of forest fragmentation and human population across the elevation gradient in the HKH region will have policy-level implications for the different nations and would help in characterizing hotspots of change. The availability of free satellite-derived data products on forest cover and DEMs, grid data on demography, and the utility of geospatial tools helped in the quick evaluation of the forest fragmentation vis-a-vis human impact pattern along the elevation gradient in the HKH.
Keywords: area-weighted mean radius of gyration, fragmentation, human impact, tree canopy cover
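A minimal sketch of the area-weighted mean radius of gyration used above as the fragmentation indicator, with illustrative patch geometries; per-patch gyration is taken here as the mean distance of patch cells from the patch centroid, weighted by the patch's share of total forest area.

```python
# Area-weighted mean radius of gyration for a set of raster forest patches.
import numpy as np

def radius_of_gyration(cells):
    """cells: (n, 2) array of cell coordinates belonging to one forest patch."""
    centroid = cells.mean(axis=0)
    return np.mean(np.linalg.norm(cells - centroid, axis=1))

patches = [
    np.argwhere(np.ones((2, 2), dtype=bool)).astype(float),   # compact 2x2 patch
    np.argwhere(np.ones((1, 8), dtype=bool)).astype(float),   # long thin 1x8 patch
]
areas = np.array([len(p) for p in patches], dtype=float)
gyr = np.array([radius_of_gyration(p) for p in patches])
amn = np.sum((areas / areas.sum()) * gyr)      # area-weighted mean
print(round(amn, 3))   # larger values indicate more extensive, connected patches
```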
Procedia PDF Downloads 215