Search results for: uncertain concept
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1394


1034 A Model for Estimation of Efforts in Development of Software Systems

Authors: Parvinder S. Sandhu, Manisha Prashar, Pourush Bassi, Atul Bisht

Abstract:

Software effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and/or noisy input. Effort estimates may be used as input to project plans, iteration plans, and budgets. Various models, such as the Halstead, Walston-Felix, Bailey-Basili, Doty and GA-based models, have already been used to estimate software effort for projects. In this study, statistical models, a Fuzzy-GA hybrid and a Neuro-Fuzzy (NF) inference system are experimented with for estimating software effort. The performances of the developed models were tested on NASA software project datasets, and the results are compared with the Halstead, Walston-Felix, Bailey-Basili, Doty and Genetic Algorithm based models reported in the literature. The results show that the NF model has the lowest MMRE and RMSE values and thus outperforms both the Fuzzy-GA hybrid inference system and the other existing models used for effort prediction.
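
The two comparison metrics cited above, MMRE and RMSE, can be computed as in the following sketch; the sample effort values are hypothetical placeholders, not values from the NASA datasets.

```python
import numpy as np

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error: mean of |actual - predicted| / actual."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.mean(np.abs(actual - predicted) / actual)

def rmse(actual, predicted):
    """Root Mean Squared Error."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.sqrt(np.mean((actual - predicted) ** 2))

# Hypothetical effort values (person-months), not taken from the NASA project data.
actual_effort    = [115.8, 96.0, 79.0, 90.8, 39.6]
predicted_effort = [101.2, 88.5, 91.3, 84.0, 47.1]

print(f"MMRE = {mmre(actual_effort, predicted_effort):.3f}")
print(f"RMSE = {rmse(actual_effort, predicted_effort):.3f}")
```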

Keywords: Neuro-Fuzzy Model, Halstead Model, Walston-Felix Model, Bailey-Basili Model, Doty Model, GA Based Model, Genetic Algorithm.

PDF Downloads: 3193
1033 Human Digital Twin for Personal Conversation Automation Using Supervised Machine Learning Approaches

Authors: Aya Salama

Abstract:

Digital Twin has emerged as a compelling research area, capturing the attention of scholars over the past decade. It finds applications across diverse fields, including smart manufacturing and healthcare, offering significant time and cost savings. Notably, it often intersects with other cutting-edge technologies such as Data Mining, Artificial Intelligence, and Machine Learning. However, the concept of a Human Digital Twin (HDT) is still in its infancy and requires further demonstration of its practicality. HDT takes the notion of Digital Twin a step further by extending it to living entities, notably humans, who are vastly different from inanimate physical objects. The primary objective of this research was to create an HDT capable of automating real-time human responses by simulating human behavior. To achieve this, the study delved into various areas, including clustering, supervised classification, topic extraction, and sentiment analysis. The paper successfully demonstrated the feasibility of HDT for generating personalized responses in social messaging applications. Notably, the proposed approach achieved an overall accuracy of 63%, a highly promising result that could pave the way for further exploration of the HDT concept. The methodology employed Random Forest for clustering the question database and matching new questions, while K-nearest neighbor was utilized for sentiment analysis.
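
A minimal sketch of the question-matching and sentiment steps described above, using scikit-learn; the toy questions, topic labels and sentiment tags are hypothetical stand-ins for the paper's actual message database.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical question database grouped into topic labels (stand-in for the clusters).
questions = ["how are you today", "what time is the meeting",
             "are you free tonight", "did you finish the report"]
topics    = ["smalltalk", "scheduling", "scheduling", "work"]

# Hypothetical sentiment training data.
messages   = ["i love this", "this is terrible", "great job", "so disappointing"]
sentiments = ["positive", "negative", "positive", "negative"]

vec_q = TfidfVectorizer()
Xq = vec_q.fit_transform(questions)
# Random Forest to match a new incoming question to a topic group, as in the paper.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xq, topics)

vec_s = TfidfVectorizer()
Xs = vec_s.fit_transform(messages)
# K-nearest neighbours for sentiment analysis, as in the paper.
knn = KNeighborsClassifier(n_neighbors=1).fit(Xs, sentiments)

new_msg = "are you free for a quick call"
print("matched topic:", rf.predict(vec_q.transform([new_msg]))[0])
print("sentiment:", knn.predict(vec_s.transform([new_msg]))[0])
```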

Keywords: Human Digital twin, sentiment analysis, topic extraction, supervised machine learning, unsupervised machine learning, classification and clustering.

PDF Downloads: 137
1032 An Evaluation of Digital Elevation Models to Short-Term Monitoring of a High Energy Barrier Island, Northeast Brazil

Authors: Venerando E. Amaro, Francisco Gabriel F. de Lima, Marcelo S.T. Santos

Abstract:

The morphological short-term evolution of Ponta do Tubarão Island (PTI) was investigated through highly accurate surveys based on post-processed kinematic (PPK) relative positioning with Global Navigation Satellite Systems (GNSS). PTI is part of a barrier island system in a high-energy northeastern Brazilian coastal environment and is also an area of high environmental sensitivity. Surveys were carried out quarterly over a two-year period from May 2010 to May 2012. This paper statistically assesses the performance of digital elevation models (DEM) derived from different interpolation methods in representing morphologic features and quantifying volumetric changes; TIN models showed the best results for these purposes. The DEMs allowed quantifying surfaces and volumes in detail, identifying the segments of the PTI most vulnerable to erosion and/or accumulation of sediments, and relating the alterations to climate conditions. The coastal setting and geometry of PTI protect a significant mangrove ecosystem and some oil and gas facilities installed in the vicinity from the damaging effects of strong ocean waves and currents. Thus, the maintenance of PTI is essential, but the prediction of its longevity is uncertain because the results indicate an irregular sedimentary balance and a substantial decline in sediment supply to this coastal area.
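
The volumetric-change computation between repeated DEM surveys can be sketched as below; the two small elevation grids are hypothetical examples, not survey data from PTI.

```python
import numpy as np

def volume_change(dem_t1, dem_t2, cell_size):
    """Net volumetric change (m^3) between two co-registered DEM grids.
    Positive cells indicate accretion, negative cells erosion."""
    diff = np.asarray(dem_t2, float) - np.asarray(dem_t1, float)
    cell_area = cell_size ** 2                    # m^2 per grid cell
    accretion = diff[diff > 0].sum() * cell_area  # sediment gained
    erosion   = diff[diff < 0].sum() * cell_area  # sediment lost (negative)
    return accretion, erosion, accretion + erosion

# Hypothetical 3 x 3 elevation grids (metres) on 1 m cells.
dem_survey_1 = np.array([[1.2, 1.3, 1.1], [1.0, 1.4, 1.2], [0.9, 1.1, 1.0]])
dem_survey_2 = np.array([[1.1, 1.2, 1.2], [1.0, 1.3, 1.3], [0.8, 1.0, 1.1]])

acc, ero, net = volume_change(dem_survey_1, dem_survey_2, cell_size=1.0)
print(f"accretion = {acc:.2f} m^3, erosion = {ero:.2f} m^3, net = {net:.2f} m^3")
```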

Keywords: DEM, GNSS, short-term monitoring, Brazil.

PDF Downloads: 2602
1031 A Sensorless Robust Tracking Control of an Implantable Rotary Blood Pump for Heart Failure Patients

Authors: Mohsen A. Bakouri, Andrey V. Savkin, Abdul-Hakeem H. Alomari, Robert F. Salamonsen, Einly Lim, Nigel H. Lovell

Abstract:

Physiological control of a left ventricular assist device (LVAD) is generally a complicated task due to diverse operating environments and patient variability. In this work, a tracking control algorithm based on sliding mode and feed-forward control for a class of discrete-time single-input single-output (SISO) nonlinear uncertain systems is presented. The controller was developed to track the reference trajectory to a set operating point without inducing suction in the ventricle. The controller regulates the estimated mean pulsatile flow Qp and the mean pulsatility index of pump rotational speed PIω, both generated from a model of the assist device. We recall the principles of sliding mode control theory and then combine the feed-forward control design with the sliding mode control technique to follow the reference trajectory. The uncertainty is bounded by its upper and lower limits. The controller was tested in a computer simulation covering two scenarios (preload and ventricular contractility). The simulation results demonstrate the effectiveness and robustness of the proposed controller.
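
As an illustration of combining a feed-forward term with a discrete sliding-mode reaching law, the sketch below tracks a ramp reference on a first-order uncertain plant. The model, gains and reference are hypothetical stand-ins, not the pump model or controller of the paper.

```python
import numpy as np

# Nominal discrete-time SISO model x[k+1] = a*x[k] + b*u[k] + d[k]; a hypothetical
# stand-in for the pump flow/speed model, with d[k] a bounded uncertainty.
a_nom, b_nom   = 0.95, 0.10
a_true, b_true = 0.94, 0.105     # the real plant differs slightly from the nominal model
d_bound = 0.02                   # assumed bound on the disturbance

q_t   = 0.3                      # reaching-law contraction gain, 0 < q_t < 1
eps_t = 0.05                     # switching gain, chosen to dominate the lumped uncertainty

def smc_feedforward(x, r_now, r_next):
    """Feed-forward term drives x toward the next reference sample; the switching term
    enforces the reaching law s[k+1] = (1 - q_t)*s[k] - eps_t*sign(s[k]) on s = x - r."""
    s = x - r_now
    return (r_next + (1.0 - q_t) * s - eps_t * np.sign(s) - a_nom * x) / b_nom

# Hypothetical reference trajectory ramping to a set operating point.
N = 120
r = np.minimum(0.05 * np.arange(N + 1), 3.0)
x = np.zeros(N + 1)
rng = np.random.default_rng(1)
for k in range(N):
    u = smc_feedforward(x[k], r[k], r[k + 1])
    x[k + 1] = a_true * x[k] + b_true * u + rng.uniform(-d_bound, d_bound)

print("max tracking error over the last 20 samples:",
      round(float(np.max(np.abs(x[-20:] - r[-20:]))), 3))
```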

Keywords: Robust control system, discrete sliding mode, left ventricular assist device, pulsatility index.

PDF Downloads: 1842
1030 Comparative Performance of Artificial Bee Colony Based Algorithms for Wind-Thermal Unit Commitment

Authors: P. K. Singhal, R. Naresh, V. Sharma

Abstract:

This paper presents three optimization models, namely the New Binary Artificial Bee Colony (NBABC) algorithm, NBABC with Local Search (NBABC-LS), and NBABC with Genetic Crossover (NBABC-GC), for solving the Wind-Thermal Unit Commitment (WTUC) problem. The uncertain nature of wind power is incorporated using the Weibull probability density function, which is used to calculate the overestimation and underestimation costs associated with wind power fluctuation. The NBABC algorithm utilizes a mechanism based on the dissimilarity measure between binary strings for generating the binary solutions in the WTUC problem. In the NBABC algorithm, an intelligent scout bee phase is proposed that replaces the abandoned solution with the global best solution. The local search operator exploits the neighboring region of the current solutions, whereas the integration of genetic crossover with the NBABC algorithm increases the diversity in the search space and thus avoids the local trapping encountered with the NBABC algorithm. These models are then used to decide the on/off status of the units, whereas the lambda iteration method is used to dispatch the hourly load demand among the committed units. The effectiveness of the proposed models is validated on an IEEE 10-unit thermal system combined with a wind farm over a planning period of 24 hours.
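
The Weibull-based over/underestimation costs mentioned above can be estimated by Monte Carlo sampling as in the sketch below; the Weibull parameters, turbine power curve, scheduled output and penalty prices are hypothetical, not the values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical Weibull wind-speed parameters and a piecewise-linear turbine power curve.
k_shape, c_scale = 2.0, 8.0                              # Weibull shape and scale (m/s)
v_in, v_rated, v_out, p_rated = 3.0, 12.0, 25.0, 2.0     # cut-in/rated/cut-out speeds, MW

def wind_power(v):
    """Power output for wind speed v (linear ramp between cut-in and rated speed)."""
    p = np.where((v >= v_in) & (v < v_rated),
                 p_rated * (v - v_in) / (v_rated - v_in), 0.0)
    return np.where((v >= v_rated) & (v <= v_out), p_rated, p)

v = c_scale * rng.weibull(k_shape, size=200_000)   # Monte Carlo wind-speed samples
w = wind_power(v)

w_sched = 1.0                       # scheduled (committed) wind power, MW
c_under, c_over = 30.0, 50.0        # hypothetical penalty costs, $/MWh

under_cost = c_under * np.mean(np.clip(w - w_sched, 0, None))  # available exceeds schedule
over_cost  = c_over  * np.mean(np.clip(w_sched - w, 0, None))  # schedule exceeds available
print(f"expected underestimation cost = {under_cost:.2f} $/h")
print(f"expected overestimation cost  = {over_cost:.2f} $/h")
```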

Keywords: Artificial bee colony algorithm, economic dispatch, unit commitment, wind power.

PDF Downloads: 1039
1028 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs

Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao

Abstract:

A compositional reservoir simulation model (CMG-GEM) was used for the cyclic CO2 injection process in an unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production stages. The study of cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs is mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of the cyclic CO2 injection process to distinguish the parameters with the greatest effect on oil recovery and to understand the behavior of cyclic CO2 injection in tight reservoirs. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry, because plugging of the porous media reduces oil productivity. In addition to asphaltene deposition, solubility of CO2 in the aquifer is one of the safest and most permanent trapping techniques when considering CO2 storage mechanisms in geological formations. However, the effects of the above uncertain parameters on the process of CO2 enhanced oil recovery have not been understood systematically. Hence, it is absolutely necessary to study the most significant parameters which dominate the process. The main objective of this study is to improve techniques for designing the cyclic CO2 injection process while considering the effects of asphaltene deposition and solubility of CO2 in the brine, in order to prevent asphaltene precipitation, minimize CO2 emission, optimize cyclic CO2 injection, and maximize oil production.

Keywords: Tight reservoirs, cyclic CO2 injection, asphaltene, solubility, reservoir simulation.

PDF Downloads: 1755
1027 Integrated Evaluation of Green Design and Green Manufacturing Processes Using a Mathematical Model

Authors: Yuan-Jye Tseng, Shin-Han Lin

Abstract:

In this research, a mathematical model for integrated evaluation of green design and green manufacturing processes is presented. To design a product, there can be alternative options for designing the detailed components that fulfill the same product requirement. In these design alternative cases, the components of the product can be designed with different materials and detailed specifications. If several design alternative cases are proposed, the different materials and specifications can affect the manufacturing processes. In this paper, a new concept for integrating green design and green manufacturing processes is presented. A green design can be determined based on the manufacturing processes of the designed product by evaluating green criteria, including energy usage and environmental impact, in addition to the traditional criterion of manufacturing cost. With this concept, a mathematical model is developed to find the green design and the associated green manufacturing processes. In the mathematical model, the cost items include material cost, manufacturing cost, and green-related cost. The green-related cost items include energy cost and environmental cost. The objective is to find the green design and green manufacturing process decisions that minimize the total cost. In practical applications, the decision-making can be used to select a good green design case and its green manufacturing processes. An example product is illustrated, showing that the model is practical and useful for integrated evaluation of green design and green manufacturing processes.
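
A minimal sketch of the cost-minimization idea: enumerate design alternatives and their candidate manufacturing processes, and pick the pair with the lowest total of material, manufacturing, energy and environmental costs. All names and figures below are hypothetical, and the paper's actual mathematical model is richer than this enumeration.

```python
# Hypothetical design alternatives, each with candidate processes and green-related costs.
design_cases = {
    "design_A_steel": {
        "material_cost": 120,
        "processes": {"stamping":  {"manufacturing": 40, "energy": 15, "environmental": 10},
                      "machining": {"manufacturing": 55, "energy": 25, "environmental": 8}},
    },
    "design_B_aluminium": {
        "material_cost": 150,
        "processes": {"die_casting": {"manufacturing": 35, "energy": 30, "environmental": 12},
                      "machining":   {"manufacturing": 50, "energy": 20, "environmental": 6}},
    },
}

best = None
for design, data in design_cases.items():
    for process, c in data["processes"].items():
        total = data["material_cost"] + c["manufacturing"] + c["energy"] + c["environmental"]
        if best is None or total < best[0]:
            best = (total, design, process)

print(f"selected: {best[1]} via {best[2]} at total cost {best[0]}")
```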

Keywords: Supply chain management, green supply chain, green design, green manufacturing, mathematical model.

PDF Downloads: 1829
1026 Diversity for Safety and Security of Autonomous Vehicles against Accidental and Deliberate Faults

Authors: Anil Ranjitbhai Patel, Clement John Shaji, Peter Liggesmeyer

Abstract:

Safety and security of Autonomous Vehicles (AVs) is a growing concern: first, due to the increased number of safety-critical functions taken over by automotive embedded systems; second, due to the increased exposure of the software-intensive systems to potential attackers; and third, due to dynamic interaction in an uncertain and unknown environment at runtime, which changes the functional and non-functional properties of the system. Frequently occurring environmental uncertainties, random component failures, and compromised security of the AVs might result in hazardous events, sometimes even in an accident, if left undetected. Beyond these technical issues, we argue that the safety and security of AVs against accidental and deliberate faults are poorly understood and rarely implemented. One possible way to overcome this is through the well-known diversity approach. As an effective approach to increase safety and security, diversity has been widely used in the aviation, railway, and aerospace industries. Thus, this paper proposes a fault-tolerance-by-diversity model that addresses the mitigation of accidental and deliberate faults by applying structural and variant redundancy. The model can be used to design AVs with various types of diversity in hardware- and software-based multi-version systems. The paper evaluates the presented approach by employing an example from adaptive cruise control, followed by a discussion of the case study with initial findings.
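
One concrete instance of structural redundancy with design diversity is 2-out-of-3 majority voting over independently developed variants, sketched below. The "desired speed" computation is a hypothetical stand-in for an adaptive cruise control function, not the paper's case study.

```python
from collections import Counter

# Three independently developed variants of the same function; one is deliberately faulty.
def variant_a(gap_m, ego_speed): return min(ego_speed + 0.5, gap_m / 2.0)
def variant_b(gap_m, ego_speed): return min(ego_speed + 0.5, gap_m * 0.5)
def variant_c(gap_m, ego_speed): return 999.0   # faulty (or compromised) variant

def majority_vote(outputs, tolerance=0.1):
    """Group numerically close outputs and return the value agreed by a majority."""
    buckets = Counter(round(o / tolerance) * tolerance for o in outputs)
    value, votes = buckets.most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority - fail safe")
    return value

outputs = [v(gap_m=40.0, ego_speed=18.0) for v in (variant_a, variant_b, variant_c)]
print("voted desired speed:", majority_vote(outputs))   # the faulty variant is masked
```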

Keywords: Autonomous vehicles, diversity, fault-tolerance, adaptive cruise control, safety, security.

PDF Downloads: 408
1025 Building a Transformative Continuing Professional Development Experience for Educators through a Principle-Based, Technological-Driven Knowledge Building Approach: A Case Study of a Professional Learning Team in Secondary Education

Authors: Melvin Chan, Chew Lee Teo

Abstract:

There has been a growing emphasis on elevating teachers' proficiency and competencies through continuing professional development (CPD) opportunities. In this era of a Volatile, Uncertain, Complex, Ambiguous (VUCA) world, teachers are expected to be collaborative designers, critical thinkers and creative builders. However, many CPD structures still revolve around a transmission model, which stands in contradiction to the cultivation of future-ready teachers for the innovative world of emerging technologies. This article puts forward the framing of CPD through a Principle-Based, Technological-Driven Knowledge Building Approach grounded in the essence of andragogy and progressive learning theories, where growth is best exemplified through authentic immersion in a social/community experience-based setting. Putting this Knowledge Building Professional Development Model (KBPDM) into operation via a Professional Learning Team (PLT) situated in a secondary school in Singapore, the research findings reveal that the intervention has led to a fundamental change in the learning paradigm of the teachers, equipping and empowering them in their pedagogical design and practices for a 21st-century classroom experience. The article concludes with the possibility of leveraging learning analytics to deepen the CPD experiences of educators.

Keywords: Continual professional development, knowledge building, learning paradigm, andragogy.

PDF Downloads: 938
1024 Probabilistic Approach of Dealing with Uncertainties in Distributed Constraint Optimization Problems and Situation Awareness for Multi-agent Systems

Authors: Sagir M. Yusuf, Chris Baber

Abstract:

In this paper, we describe how Bayesian inferential reasoning contributes to obtaining well-satisfied predictions for Distributed Constraint Optimization Problems (DCOPs) with uncertainties. We also demonstrate how DCOPs can be merged with multi-agent knowledge understanding and prediction (i.e., Situation Awareness). The DCOP functions were merged with a Bayesian Belief Network (BBN) in the form of situation, awareness, and utility nodes. We describe how the uncertainties can be represented in the BBN and how an effective prediction can be made using the expectation-maximization algorithm or the conjugate gradient descent algorithm. The idea of variable prediction using Bayesian inference may reduce the number of variables in the agents' sampling domain and also allow the estimation of missing variables. Experimental results showed that the BBN produces compelling predictions from samples containing uncertainties compared with perfect samples. That is, Bayesian inference can help in handling the uncertainties and dynamism of DCOPs, which is a current issue in the DCOP community. We show how Bayesian inference can be formalized with Distributed Situation Awareness (DSA) using uncertain and missing agents' data. The whole framework was tested on a multi-UAV mission for forest fire search. Future work focuses on augmenting the existing architecture to deal with dynamic DCOP algorithms and multi-agent information merging.
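
As a minimal illustration of predicting an uncertain or missing agent variable with Bayesian inference, the sketch below updates a Beta prior over the probability that a grid cell contains fire using noisy UAV detections; the counts and prior are hypothetical and this is not the paper's BBN.

```python
# Conjugate Beta-Binomial update: posterior mean serves as the predicted value.
prior_alpha, prior_beta = 1.0, 1.0       # uninformative Beta(1, 1) prior
detections, misses = 7, 3                # noisy observations reported by the UAVs

post_alpha = prior_alpha + detections
post_beta = prior_beta + misses
posterior_mean = post_alpha / (post_alpha + post_beta)

print(f"predicted probability of fire in the cell: {posterior_mean:.2f}")   # 0.67
```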

Keywords: DCOP, multi-agent reasoning, Bayesian reasoning, swarm intelligence.

PDF Downloads: 951
1023 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy

Authors: Zviad Ghadua, Biswa Bhattacharya

Abstract:

The Flash Flood Guidance (FFG) provides the rainfall amount of a given duration necessary to cause flooding. The approach is based on the development of rainfall-runoff curves, which help to find the rainfall amount that would cause flooding. An alternative approach, mostly experimented with in Italian Alpine catchments, is based on determining threshold discharges from past events and on finding whether or not the magnitude of an oncoming flood exceeds critical discharge thresholds found beforehand. Both approaches suffer from large uncertainties in forecasting flash floods as, due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty leads to the question of whether a probabilistic model is preferable over a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach in flash flood forecasting. A prior probability of flooding is derived based on historical data. Additional information, such as the antecedent moisture condition (AMC) and the rainfall amount over any rainfall thresholds, is used in computing the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed using the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we conclude that the Bayesian approach to flash flood forecasting provides more realistic forecasts than the FFG.
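
The Bayesian update described above can be sketched as follows; all probabilities are hypothetical placeholders, not values derived for the Posina basin.

```python
# Prior probability of a flash flood, e.g. estimated from historical records.
p_flood = 0.10

# Likelihoods of observing "wet AMC and rainfall above threshold", given flood / no flood.
p_obs_given_flood    = 0.70
p_obs_given_no_flood = 0.15

evidence = p_obs_given_flood * p_flood + p_obs_given_no_flood * (1 - p_flood)
posterior = p_obs_given_flood * p_flood / evidence

print(f"posterior probability of flooding = {posterior:.2f}")   # approx. 0.34
```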

Keywords: Flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina.

PDF Downloads: 696
1022 An Integrated Design Evaluation and Assembly Sequence Planning Model using a Particle Swarm Optimization Approach

Authors: Feng-Yi Huang, Yuan-Jye Tseng

Abstract:

In the traditional concept of product life cycle management, the activities of design, manufacturing, and assembly are performed in a sequential way. The drawback is that the considerations in design may contradict the considerations in manufacturing and assembly. Different designs of components can lead to different assembly sequences. Therefore, in some cases, a good design may result in a high cost in the downstream assembly activities. In this research, an integrated design evaluation and assembly sequence planning model is presented. Given a product requirement, there may be several design alternative cases for designing the components of the same product. If a different design case is selected, the assembly sequence for constructing the product can be different. In this paper, the designed components are first represented using graph-based models. The graph-based models are transformed into assembly precedence constraints and assembly costs. A particle swarm optimization (PSO) approach is presented in which a particle is encoded using a position matrix defined by the design cases and the assembly sequences. The PSO algorithm simultaneously performs design evaluation and assembly sequence planning with the objective of minimizing the total assembly costs. As a result, the design cases and the assembly sequences can both be optimized. The main contribution lies in the new concept of an integrated design evaluation and assembly sequence planning model and the new PSO solution method. Test results on an example product show that the presented method is feasible and efficient for solving the integrated design evaluation and assembly planning problem.
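
For illustration, a generic PSO loop with the standard velocity and position updates is sketched below on a toy cost function; the paper's actual particle encoding (a position matrix over design cases and assembly sequences) is richer and is not reproduced here.

```python
import numpy as np

def assembly_cost(x):
    """Toy surrogate for a total assembly cost; minimum value is 5.0 at x = (3, 3, 3, 3)."""
    return np.sum((x - 3.0) ** 2) + 5.0

rng = np.random.default_rng(0)
n_particles, n_dims, iters = 20, 4, 100
w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and acceleration coefficients

pos = rng.uniform(-10, 10, (n_particles, n_dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.apply_along_axis(assembly_cost, 1, pos)
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    cost = np.apply_along_axis(assembly_cost, 1, pos)
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("best cost found:", round(float(pbest_cost.min()), 3))   # approaches 5.0
```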

Keywords: Assembly sequence planning, design evaluation, design for assembly, particle swarm optimization.

PDF Downloads: 1802
1021 A Concept of Rational Water Management at Local Utilities – The Use of RO for Water Supply and Wastewater Treatment/Reuse

Authors: N. Matveev, A. Pervov

Abstract:

Local utilities often face problems with the disposal of local industrial wastes and storm water due to existing strict regulations. For many local industries, the problem of wastewater treatment and discharge into surface reservoirs cannot be solved through the use of conventional biological treatment techniques. Current discharge standards require very strict removal of a number of impurities such as ammonia, nitrates, phosphate, etc. To reach this level of removal, expensive reagents and sorbents are used. The modern concept of rational water resources management requires the development of new efficient techniques that provide wastewater treatment and reuse. As RO membranes simultaneously reject all dissolved impurities such as BOD, TDS, ammonia, phosphates, etc., they become very attractive for the direct treatment of wastewater without a biological stage. To treat wastewater, specially designed membrane "open channel" modules are used that do not possess "dead areas" that cause fouling or require pretreatment. A solution to the RO concentrate disposal problem is presented that consists of reducing the initial wastewater volume by a factor of 100. The concentrate is withdrawn from the membrane unit as sludge moisture. The efficient use of membrane RO techniques is connected with the salt balance in the water system. Thus, to provide high ecological efficiency of the developed techniques, all components of the water supply and wastewater discharge systems should be accounted for.

Keywords: Reverse osmosis, stormwater treatment, open-channel module, wastewater reuse.

PDF Downloads: 1929
1020 Managing Iterations in Product Design and Development

Authors: K. Aravindhan, Trishit Bandyopadhyay, Mahesh Mehendale, Supriya Kumar De

Abstract:

The inherent iterative nature of product design and development poses a significant challenge to reducing the product design and development (PD) time. In order to shorten the time to market, organizations have adopted concurrent development, where multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable time to completion and significant rework. Many products have missed the time-to-market window due to unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater amounts of ambiguity in design and development, where the traditional methods and tools of project management provide less value. In this context, identifying critical metrics to understand iteration probability is an open research area where a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and, using this concept, intends to develop metrics that can provide managerial insights into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and attempts to provide more clarity.
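
A small illustration of why iteration probability drives schedule risk, assuming a simple geometric model rather than the proposal's actual metrics: if a decision point sends work back with probability p, the expected number of passes through the preceding activity is 1/(1-p).

```python
def expected_passes(p_rework: float) -> float:
    """Expected passes through an activity when each pass is rejected with probability p."""
    return 1.0 / (1.0 - p_rework)

for p in (0.1, 0.3, 0.5):
    print(f"rework probability {p:.0%} -> expected passes {expected_passes(p):.2f}")
```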

Keywords: Decision Points, Iteration, Product Design, Rework.

PDF Downloads: 2161
1019 Adapting Cities Name with ICT and Countries Interested in the Smart City

Authors: Qasim Hamakhurshid Hamamurad, Normal Mat Jusoh, Uznir Ujang

Abstract:

The concept of the city with an infrastructure of Information and Communication Technology (ICT) embraces several definitions depending on the meanings of the word "smart", which include: intelligent city, smart city, knowledge city, ubiquitous city, sustainable city, and digital city. Many definitions of the city exist, but this study explores which one has been universally acknowledged. From the literature analysis, it emerges that the term smart city is the one most used in the articles to show the smartness of a city. This paper shares an exploration of the research from the seven main digital databases and journals focusing on the smart city from January 2015 to February 2020 to: (a) time research, to examine the causes of the smart city phenomenon and other concept literature in the last five years; (b) review of words, to see how and where the smart city specification and the relations of different definitions are implemented; (c) geographical research, to consider where the greatest concentrations of smart cities are in the world and to determine whether Malaysians are interacting with the smart city; and (d) count how many papers about smart cities were published in Malaysia from 2015 to 2020. Three steps are followed to accomplish the aim of this study: (1) an analysis covering a systematic literature review search strategy to gather a representative subset of papers on the smart city and other definitions utilizing Google Scholar, Elsevier, Scopus, ScienceDirect, IEEE Xplore, Web of Science, and Springer between January 2015 and February 2020; (2) the formation of a bibliometric map based on the bibliometric evaluation using the mapping technique of VOSviewer to visualize differences; (3) use of the VOSviewer application program to build initial clusters. The bibliometric analytical findings targeted word harmony.

Keywords: Bibliometric research, smart city, ICT, VOSviewer, urban modernization.

PDF Downloads: 994
1018 Landfill Failure Mobility Analysis: A Probabilistic Approach

Authors: Ali Jahanfar, Brajesh Dubey, Bahram Gharabaghi, Saber Bayat Movahed

Abstract:

Ever-increasing population growth of major urban centers and environmental challenges in siting new landfills have resulted in a growing trend towards the design of mega-landfills, some with extraordinary heights and dangerously steep slopes. Landfill failure mobility risk analysis is one of the most uncertain types of dynamic rheology models, due to the very large inherent variability in the shear strength properties of the heterogeneous solid waste material. The waste flows of three historic dumpsite failures and two landfill failures were back-analyzed using run-out modeling with the DAN-W model. The travel distances of the waste flow during landfill failures were calculated by taking into account the variability in the material shear strength properties. The probability distribution functions for the shear strength properties of the waste material were grouped into four major classes based on waste material compaction (landfills versus dumpsites) and composition (high versus low quantity of high-shear-strength waste materials such as wood, metal, plastic, paper and cardboard in the waste). This paper presents a probabilistic method for estimating the spatial extent of waste avalanches after a potential landfill failure, to create maps of vulnerability scores to inform property owners and residents of the level of risk.

Keywords: Landfill failure, waste flow, Voellmy rheology, friction coefficient, waste compaction and type.

PDF Downloads: 2238
1017 The Urban Development Boundary as a Planning Tool for Sustainable Urban Form: The South African Situation

Authors: E. J. Cilliers

Abstract:

It is the living conditions in the cities that determine the future of our livelihood. "To change life, we must first change space" - Henri Lefebvre. Sustainable development is a utopian aspiration for South African cities (especially the case study of the Gauteng City Region), which are currently characterized by unplanned growth and increasing urban sprawl. While the reasons for poor environmental quality and living conditions are undoubtedly diverse and complex, having political, economic and social dimensions, it is argued that the prevailing approach to layout planning in South Africa is part of the problem. This article seeks a solution to the problem of sustainability from a spatial planning perspective. The spatial planning tool, the urban development boundary, is introduced as the concept that will ensure that empty talk is translated into a sustainable vision. The urban development boundary is a spatial planning tool that can be used and implemented to direct urban growth towards a more sustainable form. The urban development boundary aims to ensure planned urban areas, in contrast to the current unplanned areas characterized by urban sprawl and insufficient infrastructure. However, the success of the urban development boundary concept is subject to effective implementation measures, as well as adequate and efficient management. The concept of sustainable development can function as a driving force underlying societal change and transformation, but the interface between spatial planning and environmental management needs to be established (as these are the core aspects underlying sustainable development), and authorities need to understand and implement this interface consistently. This interface can, however, be realized through the objectives of the planning tool, the urban development boundary. The case study, the Gauteng City Region, is depicted as a site of economic growth and innovation, but there is a lack of good urban and regional governance, impacting the design (layout) and function of urban areas and land use, as current authorities make uninformed decisions on development applications, leading to unsustainable urban forms and unsustainable nodes. Place and space concepts are thus critical matters applicable to the planning of the Gauteng City Region. The urban development boundary is thus explored as a planning tool to guide decision-making and create a sustainable urban form, leading to better environmental and living conditions and continuous sustainability.

Keywords: Urban planning, sustainable urban form, urban development boundary, planning tool.

PDF Downloads: 2538
1016 Promoting Local Products through One Village One Product and Customer Satisfaction

Authors: Wardoyo, Humairoh

Abstract:

In today's global competition, the world economy heavily depends upon high-technology and capital-intensive industries that are mainly owned by well-established developed economies, such as the United States of America, the United Kingdom, Japan, and South Korea. Indonesia, as a developing country, is likewise building its economic activities towards becoming an industrial country, although a slightly different approach was implemented. For example, similar to the concept of one village one product (OVOP) implemented in Japan, Indonesia also adopted this concept by promoting local traditional products to improve the incomes of village people and to enhance local economic activities. Analyzing how the OVOP program increases local people's incomes and influences customer satisfaction was the objective of this paper. Behavioral intention to purchase and re-purchase, customer satisfaction and promotion are key factors for local products to play significant roles in improving local income and the economy of the region. The concepts of OVOP and the key factors that influence the economic activities of local people and the region are described and explained in the paper. The results of the research, in a case study based on 300 respondents who were customers of a local restaurant in Tangerang City, Banten Province, Indonesia, indicated that local product, service quality and behavioral intention individually have a significant influence on customer satisfaction, whereas simultaneous tests of the variables indicated a positive and significant influence on behavioral intention through customer satisfaction as the intervening variable.

Keywords: Behavioral intention, customer satisfaction, local products, one village one product.

PDF Downloads: 2277
1015 Concrete Mix Design Using Neural Network

Authors: Rama Shanker, Anil Kumar Sachan

Abstract:

The basic ingredients of concrete are cement, fine aggregate, coarse aggregate and water. To produce a concrete of certain specified properties, optimum proportions of these ingredients are mixed. The important factors which govern the mix design are the grade of concrete, the type of cement, and the size, shape and grading of the aggregates. The concrete mix design method is based on an experimentally evolved empirical relationship between the factors in the choice of mix design. The basic drawbacks of this method are that it does not always produce the desired strength, the calculations are cumbersome, and a number of tables have to be consulted to arrive at a trial mix proportion; moreover, the attained strength may vary uncertainly below the target strength and may even fail. To address this problem, a large number of cubes of standard grades were prepared and their attained 28-day strength was determined for different combinations of cement, fine aggregate, coarse aggregate and water. An artificial neural network (ANN) was built using these data. The inputs of the ANN were the grade of concrete, the type of cement, and the size, shape and grading of the aggregates, and the outputs were the proportions of the various ingredients. With these inputs and outputs, the ANN was trained using a feed-forward back-propagation model. Finally, the trained ANN was validated; it was seen that it gave results with a maximum error of 4 to 5%. Hence, a specific type of concrete can be prepared from given material properties, and the proportions of these materials can be quickly evaluated using the proposed ANN.
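
A minimal sketch of the mapping described above, using scikit-learn's multilayer perceptron rather than the authors' own implementation: numerically encoded mix-design inputs are regressed onto ingredient proportions. The training rows are hypothetical placeholders, not the cubes cast in the study, and a real model would need many more samples and feature scaling.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# [grade (MPa), cement type code, max aggregate size (mm), aggregate shape code, grading code]
X = np.array([
    [20, 1, 20, 0, 1],
    [25, 1, 20, 1, 1],
    [30, 2, 10, 0, 2],
    [35, 2, 20, 1, 2],
    [40, 3, 10, 0, 1],
])
# [cement : fine aggregate : coarse aggregate : water] proportions by weight
y = np.array([
    [1.00, 1.80, 3.30, 0.55],
    [1.00, 1.60, 3.00, 0.50],
    [1.00, 1.50, 2.80, 0.45],
    [1.00, 1.30, 2.60, 0.43],
    [1.00, 1.20, 2.40, 0.40],
])

# A small feed-forward network trained by back-propagation (illustrative only).
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

query = np.array([[30, 1, 20, 0, 1]])   # desired grade and material descriptors
print("predicted proportions (cement:FA:CA:water):", model.predict(query).round(2))
```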

Keywords: Aggregate Proportions, Artificial Neural Network, Concrete Grade, Concrete Mix Design.

PDF Downloads: 2594
1014 Development of High Strength Self Curing Concrete Using Super Absorbing Polymer

Authors: K. Bala Subramanian, A. Siva, S. Swaminathan, Arul. M. G. Ajin

Abstract:

Concrete is an essential building material which is widely used in the construction industry all over the world due to its compressive strength. Curing of concrete plays a vital role in durability and other performance requirements. Improper curing can easily affect the performance and durability of the concrete. In areas with scarcity of water, or where structures are not accessible to humans, external curing cannot be performed, so internal curing is chosen. Internal curing (or self curing) plays a major role in developing the concrete pore structure and microstructure. The concept of internal curing is to enhance the hydration process and maintain the temperature uniformly. The evaporation of water in the concrete is reduced by the self curing agent (Super Absorbing Polymer, SAP), thereby increasing the water retention capacity of the concrete. The research work was carried out to reduce water, which is the prime material used for concrete in the construction industry. Concrete curing plays a major role in developing the hydration process. The concept of self curing reduces the evaporation of water from concrete and increases the water retention capacity compared with conventional concrete. Proper self curing (or internal curing) increases the strength, durability and performance of concrete. Super Absorbing Polymer (SAP) was used as the internal curing agent. In this study, 0.2% to 0.4% of SAP was varied in different grades of high strength concrete. In the experiment, replacement of cement by silica fume at 5%, 10% and 15% was studied. It was found that 10% replacement by silica fume gives more strength and durability when compared with the others.

Keywords: Compressive strength, high strength concrete, rapid chloride permeability, Super Absorbing Polymer.

PDF Downloads: 3182
1013 On λ− Summable of Orlicz Space of Gai Sequences of Fuzzy Numbers

Authors: N.Subramanian, S.Krishnamoorthy, S. Balasubramanian

Abstract:

In this paper, the concepts of strongly (λM)p-Cesàro summability of a sequence of fuzzy numbers and strongly λM-statistically convergent sequences of fuzzy numbers are introduced.
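
For orientation, the usual notions behind these terms can be sketched as follows; this is a hedged sketch of the standard definitions, and the paper's precise Orlicz-function formulation may differ in detail. Let λ = (λn) be a non-decreasing sequence of positive numbers tending to infinity with λ(n+1) ≤ λn + 1 and λ1 = 1, let In = [n − λn + 1, n], let d be a metric on the space of fuzzy numbers, and let M be an Orlicz function.

```latex
% Strong \lambda_p-Ces\`aro summability of a sequence of fuzzy numbers X = (X_k) to X_0:
\lim_{n\to\infty} \frac{1}{\lambda_n} \sum_{k \in I_n} \bigl[ d(X_k, X_0) \bigr]^p = 0 .

% \lambda-statistical convergence of (X_k) to X_0:
\lim_{n\to\infty} \frac{1}{\lambda_n}
  \bigl| \{\, k \in I_n : d(X_k, X_0) \ge \varepsilon \,\} \bigr| = 0
  \quad \text{for every } \varepsilon > 0 .

% With the Orlicz function M and some \rho > 0, the first condition becomes
\lim_{n\to\infty} \frac{1}{\lambda_n}
  \sum_{k \in I_n} \Bigl[ M\!\Bigl( \frac{d(X_k, X_0)}{\rho} \Bigr) \Bigr]^p = 0 .
```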

Keywords: Fuzzy numbers, statistical convergence, Orlicz space, gai sequence.

PDF Downloads: 1921
1012 On the Parameter Optimization of Fuzzy Inference Systems

Authors: Erika Martinez Ramirez, Rene V. Mayorga

Abstract:

Nowadays, more engineering systems are using some kind of Artificial Intelligence (AI) in the development of their processes. Some well-known AI techniques include artificial neural nets, fuzzy inference systems, and neuro-fuzzy inference systems, among others. Furthermore, many decision-making applications base their intelligent processes on Fuzzy Logic, due to the capability of Fuzzy Inference Systems (FIS) to deal with problems that are based on user knowledge and experience. Also, knowing that users have a wide variety of characteristics and generally provide uncertain data, this information can be used and properly processed by a FIS. To properly consider uncertainty and inexact system input values, FIS normally use Membership Functions (MF) that represent a degree of user satisfaction with certain conditions and/or constraints. In order to define the parameters of the MFs, the knowledge of experts in the field is very important. This knowledge defines the MF shape used to process the user inputs, and through fuzzy reasoning and inference mechanisms the FIS can provide an "appropriate" output. However, an important issue immediately arises: How can it be assured that the obtained output is the optimum solution? How can it be guaranteed that each MF has an optimum shape? A viable solution to these questions is the optimization of the MF parameters. In this paper, a novel parameter optimization process is presented. The process for FIS parameter optimization consists of five simple steps that can be easily realized off-line. The proposed process of FIS parameter optimization is demonstrated by its implementation in an intelligent interface section dealing with the on-line customization/personalization of internet portals applied to e-commerce.
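
As a minimal sketch of tuning one membership function's parameters, the snippet below fits a triangular MF mu(x; a, b, c) to hypothetical expert satisfaction ratings by grid search; the paper's five-step off-line procedure is richer, and the data are invented for illustration.

```python
import numpy as np
from itertools import product

def tri_mf(x, a, b, c):
    """Triangular membership function rising from a to b and falling from b to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

x_samples = np.array([0, 2, 4, 5, 6, 8, 10], dtype=float)
expert_degree = np.array([0.0, 0.2, 0.7, 1.0, 0.7, 0.2, 0.0])   # hypothetical ratings

best = None
for a, b, c in product(np.arange(0, 11), repeat=3):
    if not a < b < c:            # keep the triangle well formed (no zero denominators)
        continue
    err = np.mean((tri_mf(x_samples, a, b, c) - expert_degree) ** 2)
    if best is None or err < best[0]:
        best = (err, (a, b, c))

print("optimised (a, b, c):", best[1], " mse:", round(float(best[0]), 4))
```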

Keywords: Artificial Intelligence, Fuzzy Logic, Fuzzy Inference Systems, Nonlinear Optimization.

PDF Downloads: 1952
1011 A Multigranular Linguistic Additive Ratio Assessment Model in Group Decision Making

Authors: Wiem Daoud Ben Amor, Luis Martínez López, Jr., Hela Moalla Frikha

Abstract:

Most multi-criteria group decision making (MCGDM) problems dealing with qualitative criteria require consideration of a large background of expert information. It is common that experts have different degrees of knowledge when giving their alternative assessments according to the criteria. So, it seems logical that they use different evaluation scales to express their judgment, i.e., multi-granular linguistic scales. In this context, we propose the extension of the classical additive ratio assessment (ARAS) method to the case of hierarchical linguistic terms for managing multi-granular linguistic scales in uncertain contexts, where uncertainty is modeled by means of linguistic information. The proposed approach is called the extended hierarchical linguistic ARAS method (ELH-ARAS). Within the ELH-ARAS approach, the decision makers (DMs) can diagnose the results (the ranking of the alternatives) in a decomposed style, i.e., not only at one level of the hierarchy but also at the intermediate ones. Also, the developed approach allows a feedback transformation, i.e., the collective final results of all experts can be transformed at any level of the extended linguistic hierarchy that each expert has previously used. Therefore, the ELH-ARAS technique makes it easier for decision makers to understand the results. Finally, an MCGDM case study is given to illustrate the proposed approach.
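
For reference, the classical numerical ARAS ranking step that the paper extends to linguistic information can be sketched as follows; the decision matrix and weights are hypothetical, and all criteria are treated as benefit criteria for simplicity.

```python
import numpy as np

X = np.array([[7.0, 6.0, 8.0],     # alternative A1
              [8.0, 5.0, 6.0],     # alternative A2
              [6.0, 9.0, 7.0]])    # alternative A3
w = np.array([0.5, 0.3, 0.2])      # criteria weights summing to 1

optimal = X.max(axis=0)            # ideal alternative (best value of each benefit criterion)
M = np.vstack([optimal, X])        # row 0 is the optimal alternative
N = M / M.sum(axis=0)              # column-wise normalisation
S = (N * w).sum(axis=1)            # overall weighted scores
K = S[1:] / S[0]                   # utility degree of each alternative relative to the ideal

for i, k in enumerate(K, start=1):
    print(f"A{i}: utility degree = {k:.3f}")
print("ranking (best first):", [f"A{i + 1}" for i in np.argsort(-K)])
```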

Keywords: Additive ratio assessment, extended hierarchical linguistics, multi-criteria group decision making problems, multi-granular linguistic contexts.

PDF Downloads: 320
1010 An Approach to Correlate the Statistical-Based Lorenz Method, as a Way of Measuring Heterogeneity, with Kozeny-Carman Equation

Authors: H. Khanfari, M. Johari Fard

Abstract:

Dealing with carbonate reservoirs can be mind-boggling for reservoir engineers due to the various diagenetic processes that cause a variety of properties throughout the reservoir. A good estimation of the reservoir heterogeneity, which is defined as the degree of variation in rock properties with location in a reservoir or formation, can help in modeling the reservoir better and thus can offer a better understanding of the behavior of that reservoir. Most reservoirs are heterogeneous formations whose mineralogy, organic content, natural fractures, and other properties vary from place to place. Over the years, reservoir engineers have tried to establish methods to describe this heterogeneity, because heterogeneity is important in modeling reservoir flow and in well testing. Geological methods are used to describe the variations in rock properties because of the similarities of the environments in which different beds have been deposited. To illustrate the heterogeneity of a reservoir vertically, two methods are generally used in petroleum work: the Dykstra-Parsons permeability variation (V) and the Lorenz coefficient (L), which are reviewed briefly in this paper. The concept of Lorenz is based on statistics and has been used in petroleum from that point of view. In this paper, we correlated the statistics-based Lorenz method to a petroleum concept, i.e., the Kozeny-Carman equation, and derived the straight-line plot of the Lorenz graph for a homogeneous system. Finally, we applied the two methods to a heterogeneous field in South Iran and discussed each separately, with numbers and figures. As expected, these methods show great departure from homogeneity. Therefore, for future investment, the reservoir needs to be treated carefully.
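
A minimal sketch of the static Lorenz coefficient computation: layers are ordered by k/phi, cumulative flow capacity is plotted against cumulative storage capacity, and L equals twice the area between that curve and the 45-degree line. The layer data below are hypothetical, not the South Iran field data.

```python
import numpy as np

k   = np.array([250.0, 80.0, 10.0, 300.0, 35.0])   # permeability, mD
phi = np.array([0.22, 0.15, 0.08, 0.25, 0.12])     # porosity, fraction
h   = np.array([2.0, 3.0, 1.5, 2.5, 4.0])          # layer thickness, m

order = np.argsort(-(k / phi))                      # best-quality layers first
F = np.concatenate(([0.0], np.cumsum((k * h)[order]) / np.sum(k * h)))      # flow capacity
C = np.concatenate(([0.0], np.cumsum((phi * h)[order]) / np.sum(phi * h)))  # storage capacity

area_under_curve = np.sum(0.5 * (F[1:] + F[:-1]) * np.diff(C))   # trapezoidal rule
lorenz = 2.0 * (area_under_curve - 0.5)
print(f"Lorenz coefficient L = {lorenz:.3f}")   # 0 = homogeneous, 1 = completely heterogeneous
```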

Keywords: Carbonate reservoirs, heterogeneity, homogeneous system, Dykstra-Parsons permeability variations (V), Lorenz coefficient (L).

PDF Downloads: 1742
1009 Urban Environment Quality Improvement Planning Case Study: Moft Abad Neighborhood, Tehran, Iran

Authors: Elham Lashkari, Mehrshad Khalaj

Abstract:

Rapid enlargement and physical development of cities have facilitated the emergence of a number of city life crises and a decrease in environmental quality. Consequently, the need to consider the concept of quality and its improvement in urban environments, besides quantitative issues, is clearly recognized. In the domain of urban ideas, the importance of taking these issues into consideration is obvious not only in accordance with sustainable development concepts and the improvement of public environment quality, but also in the enhancement of social and behavioral models. The major concern of the present article is to study the nature of urban environment quality in urban development plans, which is important not only for the concept and the aim of projects but also for their execution procedure. As a result, this paper utilizes the planning capacities offered by environmental virtues in the planning procedure of the Moft Abad neighborhood. Thus, as the first step, applying the Analytical Hierarchy Process (AHP), it has assessed quantitative environmental issues. The present conditions of Moft Abad state that "the neighborhood is generally suffering from the lack of qualitative parameters, and the previously formed planning procedures could not take the sustainable and developmental paths which are aimed at environment quality virtues." The diminution of economic and environmental virtues has resulted in the diminution of residential and social virtues. Therefore, in order to enhance the environment quality in Moft Abad, the present paper has tried to supply the subject plans in order to make a safe, healthy, and lively neighborhood.
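
A minimal sketch of the AHP weighting step mentioned above: a pairwise comparison matrix over hypothetical environment-quality criteria (not the criteria used for Moft Abad) is reduced to priority weights via the principal eigenvector, and Saaty's consistency ratio is checked.

```python
import numpy as np

criteria = ["access to services", "green space", "safety", "infrastructure"]
A = np.array([[1.0, 3.0, 2.0, 4.0],
              [1/3, 1.0, 1/2, 2.0],
              [1/2, 2.0, 1.0, 3.0],
              [1/4, 1/2, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, i].real)
weights /= weights.sum()                 # priority weights from the principal eigenvector

n = A.shape[0]
ci = (eigvals[i].real - n) / (n - 1)     # consistency index
ri = 0.90                                # Saaty's random index for n = 4
print("weights:", dict(zip(criteria, np.round(weights, 3))))
print("consistency ratio:", round(ci / ri, 3))   # usually accepted when below 0.10
```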

Keywords: Urban Environment Quality, Neighborhood Plan, Urban Development Plan, Analytical Hierarchy Process (AHP)

PDF Downloads: 2052
1008 Analysis of Cost Estimation and Payment Systems for Consultant Contracts in the US, Japan, China and the UK

Authors: Shih-Hsu Wang, Yuan-Yuan Cheng, Ming-Tsung Lee, Wei-Chih Wang

Abstract:

Determining reasonable fees is the main objective of designing the cost estimation and payment systems for consultant contracts. However, project clients utilize different cost estimation and payment systems because of their varying views on the reasonableness of consultant fees. This study reviews the cost estimation and payment systems of consultant contracts for five countries, including the US (Washington State Department of Transportation), Japan (Ministry of Land, Infrastructure, Transport and Tourism), China (Engineering Design Charging Standard) and the UK (Her Majesty's Treasury). Specifically, this work investigates the budgeting process, contractor selection method, contractual price negotiation process, cost review, and cost-control concept of the systems used in these countries. The main finding indicates that the project client's view on whether the fee is high will affect the way the client controls it. In the US, the fee is commonly considered to be high. As a result, a stringent auditing system (low flexibility given to the consultant) is applied. In the UK, the fee is viewed as low when compared with the total life-cycle project cost. Thus, a system with high flexibility in budgeting and cost review is given to the consultant. In terms of the flexibility allowed for the consultant, the systems applied in Japan and China fall between those of the US and the UK. Both the US and UK systems are helpful in determining a reasonable fee. However, in the US system, rigid auditing standards must be established and additional cost-audit manpower is required. In the UK system, sufficient historical cost data are needed to evaluate the reasonableness of the consultant's proposed fee.

Keywords: Consultant Services, Cost Estimation and Payment System, Payment Flexibility, Cost-control Concept

PDF Downloads: 1652
1007 Social Media Idea Ontology: A Concept for Semantic Search of Product Ideas in Customer Knowledge through User-Centered Metrics and Natural Language Processing

Authors: Martin Häusl, Maximilian Auch, Johannes Forster, Peter Mandl, Alexander Schill

Abstract:

In order to survive in the market, companies must constantly develop improved and new products. These products are designed to serve the needs of their customers in the best possible way. The creation of new products is also called innovation and is primarily driven by a company's internal research and development department. However, a new approach has been taking hold for some years now, involving external knowledge in the innovation process. This approach is called open innovation and identifies customer knowledge as the most important source in the innovation process. This paper presents a concept for using social media posts as an external source to support the open innovation approach in its initial phase, the ideation phase. For this purpose, the social media posts are semantically structured with the help of an ontology, and the authors are evaluated using graph-theoretical metrics such as density. For the structuring and evaluation of relevant social media posts, we also use findings from Natural Language Processing, e.g., Named Entity Recognition, specific dictionaries, Triple Tagger and Part-of-Speech Tagger. The selection and evaluation of the tools used are discussed in this paper. Using our ontology and metrics to structure social media posts enables users to semantically search these posts for new product ideas and thus gain an improved insight into external sources such as customer needs.
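
Two of the building blocks named above, Named Entity Recognition / Part-of-Speech tagging on a post and a graph-density score over authors, can be sketched with spaCy and networkx as below; this is not the paper's pipeline or ontology, and the post text and author graph are hypothetical. The spaCy model must be installed beforehand (python -m spacy download en_core_web_sm).

```python
import spacy
import networkx as nx

nlp = spacy.load("en_core_web_sm")
post = "The new Galaxy phone battery dies after two hours of streaming in Berlin."
doc = nlp(post)

print("entities:", [(ent.text, ent.label_) for ent in doc.ents])          # NER
print("nouns:", [tok.text for tok in doc if tok.pos_ == "NOUN"])          # PoS tagging

# Author-interaction graph: density as a simple user-centred metric.
G = nx.Graph()
G.add_edges_from([("alice", "bob"), ("bob", "carol"), ("carol", "alice"), ("dave", "alice")])
print("author graph density:", round(nx.density(G), 3))
```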

Keywords: Idea ontology, innovation management, open innovation, semantic search.

PDF Downloads: 755
1006 Implementation of the Quality Management System and Development of Organizational Learning: Case of Three Small and Medium-Sized Enterprises in Morocco

Authors: Abdelghani Boudiaf

Abstract:

The profusion of studies relating to the concept of organizational learning shows the importance that has been given to this concept in the management sciences. A few years ago, companies leaned towards ISO 9001 certification, which requires the implementation of a quality management system (QMS). In order for this objective to be achieved, companies must have a set of skills, which pushes them to develop learning through continuous training. The results of empirical research have shown that implementation of the QMS in a company promotes the development of learning. It should also be noted that several types of learning are developed in this sense. Given that the nature of skills development is normative in the context of the quality approach, companies are obliged to qualify and improve the skills of their human resources. Continuous training is the keystone for developing the necessary learning. To carry out continuous training, companies need to be able to identify their real needs by developing training plans based on well-defined engineering. The training process obviously goes through several stages. Initially, training has a general aspect, that is to say, it focuses on topics and actions of a general nature. Subsequently, it is done in a more targeted and precise way to accompany the evolution of the QMS and also to implement the changes decided each time (change of working method, change of practices, change of objectives, change of mentality, etc.). To answer our research question, we opted for a qualitative research method. It should be noted that the case study method crosses several data collection techniques to explain and understand a phenomenon. Three cases of companies were studied as part of this research work using different data collection techniques related to this method.

Keywords: Changing mentalities, continuous training, organizational learning, quality management system, skills development.

PDF Downloads: 676
1005 Knowledge Transfer among Cross-Functional Teams as a Continual Improvement Process

Authors: Sergio Mauricio Pérez López, Luis Rodrigo Valencia Pérez, Juan Manuel Peña Aguilar, Adelina Morita Alexander

Abstract:

The culture of continuous improvement in organizations is very important as it represents a source of competitive advantage. This article discusses the transfer of knowledge between companies which formed cross-functional teams and used a dynamic model for knowledge creation as a framework. In addition, the article discusses the structure of cognitive assets in companies and the concept of "stickiness" (which is defined as an obstacle to the transfer of knowledge). The purpose of this analysis is to show that an improvement in the attitude of individual members of an organization creates opportunities, and that an exchange of information and knowledge leads to generating continuous improvements in the company as a whole. This article also discusses the importance of creating the proper conditions for sharing tacit knowledge. By narrowing gaps between people, mutual trust can be created and thus contribute to an increase in sharing. The concept of adapting knowledge to new environments will be highlighted, as it is essential for companies to translate and modify information so that such information can fit the context of receiving organizations. Adaptation will ensure that the transfer process is carried out smoothly by preventing "stickiness". When developing the transfer process on cross-functional teams (as opposed to working groups), the team acquires the flexibility and responsiveness necessary to meet objectives. These types of cross-functional teams also generate synergy due to the array of different work backgrounds of their individuals. When synergy is established, a culture of continuous improvement is created.

Keywords: Knowledge transfer, continuous improvement, teamwork, cognitive assets.

PDF Downloads: 1654