Search results for: probabilistic decision making
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2213

1013 Nonlinear Finite Element Modeling of Deep Beam Resting on Linear and Nonlinear Random Soil

Authors: M. Seguini, D. Nedjar

Abstract:

An accurate nonlinear analysis of a deep beam resting on elastic perfectly plastic soil is carried out in this study. A nonlinear finite element model for large deflection and moderate rotation of an Euler-Bernoulli beam resting on linear and nonlinear random soil is investigated. The geometric nonlinear analysis of the beam is based on the von Kármán theory, and the Newton-Raphson incremental-iterative method is implemented in a Matlab code to solve the nonlinear equation of the soil-beam interaction system. Two analyses (deterministic and probabilistic) are proposed to verify the accuracy and efficiency of the model, where the local average theory based on the Monte Carlo approach is used to analyze the effect of the spatial variability of the soil properties on the nonlinear beam response. The effects of six main parameters are investigated: the external load, the beam length, the coefficient of subgrade reaction of the soil, the Young’s modulus of the beam, and the coefficient of variation and correlation length of the soil’s coefficient of subgrade reaction. A comparison between the beam resting on linear and nonlinear soil models is presented for different beam lengths and external loads. Numerical results have been obtained for the combination of the geometric nonlinearity of the beam and the material nonlinearity of the random soil. This comparison highlights the need to include the material nonlinearity and spatial variability of the soil in the geometric nonlinear analysis when the beam undergoes large deflections.
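
As an illustration of the solution strategy described above, the following minimal Python sketch (not the authors' Matlab code) applies a Newton-Raphson iteration to a single-degree-of-freedom soil-beam toy problem and wraps it in a Monte Carlo loop over a random coefficient of subgrade reaction; the cubic hardening reaction, load level and lognormal parameters are illustrative assumptions standing in for the elastic perfectly plastic law and local-average field of the paper.

```python
import numpy as np

BETA = 1.0e6   # cubic hardening coefficient of the toy soil law (illustrative)

def soil_reaction(w, k):
    """Nonlinear (hardening) soil reaction, a stand-in for the elastic perfectly plastic law."""
    return k * w + BETA * w**3

def newton_raphson(P, k, tol=1e-10, max_iter=50):
    """Solve P - soil_reaction(w) = 0 for the deflection w by Newton-Raphson iteration."""
    w = 0.0
    for _ in range(max_iter):
        r = P - soil_reaction(w, k)          # out-of-balance force
        if abs(r) < tol:
            break
        kt = k + 3.0 * BETA * w**2           # tangent stiffness of the cubic law
        w += r / kt
    return w

# Monte Carlo over a random (lognormal) coefficient of subgrade reaction
rng = np.random.default_rng(0)
k_samples = rng.lognormal(mean=np.log(5.0e3), sigma=0.3, size=2000)   # kN/m, illustrative
w_samples = np.array([newton_raphson(P=50.0, k=k) for k in k_samples])
print(f"mean deflection {w_samples.mean():.4f} m, c.o.v. {w_samples.std()/w_samples.mean():.2f}")
```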

Keywords: Finite element method, geometric nonlinearity, material nonlinearity, soil-structure interaction, spatial variability.

1012 A Novel Approach towards Segmentation of Breast Tumors from Screening Mammograms for Efficient Decision Support System

Authors: M. Suganthi, M. Madheswaran

Abstract:

This paper presents a novel approach to finding a priori interesting regions in mammograms. In order to delineate those regions of interest (ROIs) in mammograms which appear to be prominent, a topographic representation called the iso-level contour map, consisting of iso-level contours at multiple intensity levels, and region segmentation based on thresholding are proposed. The simulation results indicate that the computed boundary gives a detection accuracy of 99.5%.
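
A minimal sketch of the iso-level idea, not the authors' implementation: threshold the image at several intensity levels and keep the largest connected region at each level as a candidate ROI. The image, levels and the "largest blob" rule are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def iso_level_rois(image, levels):
    """For each intensity level, threshold the image and keep the largest connected region."""
    rois = {}
    for lvl in levels:
        mask = image >= lvl
        labels, n = ndimage.label(mask)
        if n == 0:
            rois[lvl] = mask
            continue
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        rois[lvl] = labels == (np.argmax(sizes) + 1)      # largest blob at this level
    return rois

rng = np.random.default_rng(1)
mammogram = rng.random((64, 64))
mammogram[20:30, 30:42] += 0.8                            # a bright, tumour-like patch
rois = iso_level_rois(mammogram, levels=[0.8, 1.0, 1.2])
for lvl, roi in rois.items():
    print(f"level {lvl}: largest region has {int(roi.sum())} pixels")
```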

Keywords: Breast Cancer, Mammogram, and Segmentation.

1011 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision making is no longer far-fetched, yet properly classifying this textual information in a given context remains very difficult. As a result, a systematic review of previous literature on sentiment classification and AI-based techniques was conducted. The study was done in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier that can correctly distinguish hate speech from inverted compliments in social media text of a given context, using the knowledge gained from evaluating the different artificial intelligence techniques reviewed. The study evaluated over 250 articles from digital sources such as the ACM Digital Library, Google Scholar, and IEEE Xplore, and whittled the number down to 52 articles. Findings revealed that deep learning approaches such as Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Bidirectional Encoder Representations from Transformers (BERT), and Long Short-Term Memory (LSTM) outperformed various machine learning techniques in terms of accuracy. A large dataset is also required to develop a robust sentiment classifier. Results also revealed that data can be obtained from sources such as Twitter, movie reviews, Kaggle, the Stanford Sentiment Treebank (SST), and SemEval Task 4, depending on the required domain. Hybrid deep learning techniques such as CNN+LSTM, CNN+Gated Recurrent Unit (GRU), and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java in terms of development simplicity and AI-based library support. Finally, the study recommends these findings for building robust sentiment classifiers in the future.
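
Since the review names hybrid CNN+LSTM models as the strongest performers, a short Keras sketch of such an architecture may help make the finding concrete; the vocabulary size, sequence length and layer sizes are illustrative assumptions, not values taken from the reviewed studies.

```python
import tensorflow as tf

VOCAB_SIZE, SEQ_LEN = 20_000, 100   # illustrative vocabulary size and padded sequence length

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN,)),
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),  # local n-gram features
    tf.keras.layers.MaxPooling1D(pool_size=2),
    tf.keras.layers.LSTM(64),                                      # longer-range context
    tf.keras.layers.Dense(1, activation="sigmoid"),                # hate speech vs. not
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```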

Keywords: Artificial Intelligence, Natural Language Processing, Sentiment Analysis, Social Network, Text.

1010 LOWL: Logic and OWL, an Extension

Authors: M. Mohsenzadeh, F. Shams, M. Teshnehlab

Abstract:

Current research on the semantic web aims at making web pages meaningful for machines, and ontology plays a primary role in this effort. We believe that logic can help ontology languages such as OWL become more fluent and efficient. In this paper, we combine logic with OWL to reduce some of the language's disadvantages: we extend OWL with logic and show how logic can satisfy our future expectations of an ontology language.

Keywords: Logical Programming, OWL, Language Extension.

1009 Nuclear Fuel Safety Threshold Determined by Logistic Regression Plus Uncertainty

Authors: D. S. Gomes, A. T. Silva

Abstract:

Analysis of the uncertainty quantification related to nuclear safety margins applied to the nuclear reactor is an important concept for preventing future radioactive accidents. Nuclear fuel performance codes may involve tolerance levels determined by traditional deterministic models, which produce acceptable results for burn cycles under 62 GWd/MTU. The behavior of nuclear fuel can be simulated by applying a series of material properties under irradiation and physics models to calculate the safety limits. In this study, theoretical predictions of nuclear fuel failure under transient conditions are investigated for extended radiation cycles at 75 GWd/MTU, considering the behavior of fuel rods in light-water reactors under reactivity accident conditions. The fuel pellet can melt due to the rapid increase of reactivity during a transient. Large power excursions in the reactor are the subject of interest, leading to a treatment known as the Fuchs-Hansen model. The point kinetics neutron equations show the characteristics of non-linear differential equations. In this investigation, multivariate logistic regression is employed for a probabilistic forecast of fuel failure. The comparison of computational simulations and experimental results was acceptable. The experiments used pre-irradiated fuel rods subjected to a rapid energy pulse, which exhibits the same behavior as a nuclear accident. The propagation of uncertainty utilizes the Wilks formulation. The variables chosen as essential to failure prediction were the fuel burnup, the applied peak power, the pulse width, the oxidation layer thickness, and the cladding type.
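
A minimal sketch of the probabilistic forecast step, using synthetic data rather than the pulse-test measurements: a multivariate logistic regression over the five predictors named above (burnup, peak power, pulse width, oxidation layer thickness, cladding type). All ranges, units and coefficients are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(20, 80, n),      # burnup, GWd/MTU
    rng.uniform(50, 200, n),     # applied peak power (illustrative scale)
    rng.uniform(5, 80, n),       # pulse width, ms
    rng.uniform(5, 120, n),      # oxidation layer thickness, micrometres
    rng.integers(0, 2, n),       # cladding type, 0/1 encoding
])
# synthetic failure labels: higher burnup and peak power raise the failure probability
logit = 0.05 * X[:, 0] + 0.03 * X[:, 1] - 0.02 * X[:, 2] - 8.0
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)
print("P(failure) for one hypothetical rod:",
      round(model.predict_proba([[75, 150, 20, 60, 1]])[0, 1], 3))
```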

Keywords: Logistic regression, reactivity-initiated accident, safety margins, uncertainty propagation.

1008 A Novel Impulse Detector for Filtering of Highly Corrupted Images

Authors: Umesh Ghanekar

Abstract:

Since the performance of a filtering system depends upon the accuracy of its noise detection scheme, we present in this paper a new scheme for impulse noise detection based on two levels of decision. In the first stage, the corrupted pixels are coarsely identified; in the second stage, it is finally decided whether the pixel under consideration is really corrupted or not. The efficacy of the proposed filter has been confirmed by extensive simulations.
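
A minimal sketch of a two-level impulse detector of the kind described: a coarse test flags pixels far from their local median, and a second stage confirms the flag by checking agreement with the neighbours. The window size and thresholds are illustrative assumptions, not the paper's values.

```python
import numpy as np

def detect_impulses(img, t_coarse=60, t_fine=40):
    padded = np.pad(img.astype(float), 1, mode="edge")
    flags = np.zeros(img.shape, dtype=bool)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + 3, j:j + 3]
            centre = padded[i + 1, j + 1]
            # stage 1: coarse decision against the local median
            if abs(centre - np.median(window)) > t_coarse:
                neighbours = np.delete(window.flatten(), 4)
                # stage 2: confirm only if the centre also disagrees with most neighbours
                flags[i, j] = np.sum(np.abs(neighbours - centre) > t_fine) >= 6
    return flags

noisy = np.full((16, 16), 128, dtype=np.uint8)
noisy[4, 7], noisy[10, 2] = 255, 0          # two injected impulses
print(detect_impulses(noisy).sum(), "pixels flagged as impulses")
```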

Keywords: Impulse detection, noise removal, image filtering.

1007 Prioritization Assessment of Housing Development Risk Factors: A Fuzzy Hierarchical Process-Based Approach

Authors: Yusuf Garba Baba

Abstract:

The construction industry and its housing subsector are fraught with risks that can negatively impact the achievement of project objectives. The success or otherwise of most construction projects depends to a large extent on how well these risks are managed. The subsector's recent paradigm shift from rules of thumb to a formal risk management approach means that risks must not only be identified but also properly assessed and responded to in a systematic manner. This study focused on identifying risks associated with housing development projects and on the prioritisation assessment of the identified risks in order to provide a basis for informed decisions. The study used a three-step identification framework: a review of literature on similar projects, expert consultation, and a questionnaire-based survey to identify potential risk factors. The Delphi survey method was employed to carry out the relative prioritisation assessment of the risk factors using computer-based Analytical Hierarchy Process (AHP) software. The results show that 19 out of the 50 risks significantly impact housing development projects. The study concludes that although a significant number of risk factors have been identified as relevant to housing construction projects, the economic risk group, and in particular 'changes in demand for houses', is prioritised by most developers as posing a threat to the achievement of their housing development objectives. Unless these risks are carefully managed, their effects will continue to impede success in these projects. The study recommends the adoption of the combined multi-technique identification framework and AHP prioritisation assessment methodology as a suitable model for the assessment of risks in housing development projects.
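
A minimal sketch of the AHP step behind the prioritisation: derive a priority vector from a pairwise comparison matrix via its principal eigenvector and check the consistency ratio. The 3x3 matrix is an illustrative set of judgements, not the study's expert data.

```python
import numpy as np

def ahp_priorities(A):
    """Return the AHP priority vector and the consistency ratio of a pairwise matrix."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                  # priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
    return w, ci / ri                             # priorities, consistency ratio

A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])                 # e.g. economic vs. technical vs. political risk
w, cr = ahp_priorities(A)
print("priorities:", w.round(3), "consistency ratio:", round(cr, 3))
```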

Keywords: Risk identification, risk assessment, analytical hierarchical process, multi-criteria decision.

1006 Quality Assessment of Hollow Sandcrete Blocks in Minna, Nigeria

Authors: M. Abdullahi, S. Sadiku, Bashar S. Mohammed, J. I. Aguwa

Abstract:

The properties of hollow sandcrete blocks produced in Minna, Nigeria are presented. Sandcrete block is made of cement, water and sand bound together in certain mix proportions. For the purpose of this work, fifty (50) commercial sandcrete block industries were visited in Minna, Nigeria to obtain block samples and the aggregates used for manufacture, and to take inventory of the mix composition and the production process. Sieve analysis tests were conducted on the sand samples from the various block industries to ascertain their suitability for block making. The mix ratios were also investigated. Five (5) nine-inch (9'' or 225 mm) blocks were obtained from each block industry and tested for dimensional compliance and compressive strength. The results of the sieve tests show that the grading falls within the limits for natural aggregate and can easily be used to obtain a workable mix. Physical examination of the block sizes shows slight deviation from the standard requirement in NIS 87:2000. Compressive strengths of the hollow sandcrete blocks in the range of 0.12 N/mm2 to 0.54 N/mm2 were obtained, which is below the recommended value of 3.45 N/mm2 for load-bearing hollow sandcrete blocks. This indicates that these blocks are below the standard for load-bearing sandcrete blocks and cannot be used as load-bearing walling units. The mix compositions also indicated low cement content, resulting in low compressive strength. Most of the commercial block industries visited do not take curing very seriously: water was only sprinkled once or twice before the blocks were stacked and made available for sale. It is recommended that a mix ratio of 1:4 to 1:6 be used for the production of sandcrete blocks and that proper curing practice be adhered to. Blocks should also be cured for 14 days before being made available to consumers.

Keywords: Compressive strength, dimensions, mix proportions, sandcrete blocks.

1005 Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite

Authors: F. Lazzeri, I. Reiter

Abstract:

Energy production optimization has traditionally been very important for utilities in order to improve resource consumption. However, load forecasting is a challenging task, as there are a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true in microgrids, where many elements have to adjust their performance depending on future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R and web services built and deployed with different components of Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that enables users to easily build, deploy, and share predictive analytics solutions; SQL Database, a Microsoft database service for app developers; and Power BI, a suite of business analytics tools to analyze data and share insights. Our results show that Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful for predicting hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity and dew point) can play a crucial role in improving the accuracy of the forecasting solution. Data cleaning and feature engineering methods performed in R and different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile and ARIMA) are presented, and results and performance metrics discussed.
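
A Python stand-in for the boosted-tree forecasting idea (the paper's experiments were built in R and Azure Machine Learning): a gradient-boosted regressor predicting hourly load from hour-of-day and the weather features named above. The synthetic data and hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 24 * 90                                              # 90 days of hourly data
hour = np.tile(np.arange(24), n // 24)
temp = 15 + 10 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 2, n)
wind = rng.gamma(2.0, 2.0, n)
humidity = rng.uniform(30, 90, n)
dew_point = temp - (100 - humidity) / 5.0                # rough approximation
load = 50 + 20 * np.sin(2 * np.pi * (hour - 18) / 24) + 0.8 * temp + rng.normal(0, 3, n)

X = np.column_stack([hour, temp, wind, humidity, dew_point])
X_tr, X_te, y_tr, y_te = train_test_split(X, load, test_size=0.2, shuffle=False)
model = GradientBoostingRegressor(n_estimators=300, max_depth=3).fit(X_tr, y_tr)
print("MAE on held-out hours:", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
```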

Keywords: Time series, feature engineering methods for forecasting, energy demand forecasting, Azure Machine Learning.

1004 A Construction Management Tool: Determining a Project Schedule Typical Behaviors Using Cluster Analysis

Authors: Natalia Rudeli, Elisabeth Viles, Adrian Santilli

Abstract:

Delays in the construction industry are a global phenomenon, and many construction projects experience extensive delays exceeding the initially estimated completion time. The main purpose of this study is to identify construction projects' typical behaviors in order to develop a prognosis and management tool. Knowing a construction project's schedule tendency will enable evidence-based decision-making, allowing resolutions to be made before delays occur. This study presents an innovative approach that uses cluster analysis to support predictions during Earned Value analyses. A clustering analysis was used to predict the future behavior of scheduling and of the principal Earned Value Management (EVM) and Earned Schedule (ES) indexes in construction projects. The analysis was made using a database of 90 different construction projects and was validated with additional data extracted from the literature and with another 15 contrasting projects. For all projects, planned and executed schedules were collected, and the principal EVM and ES indexes were calculated. A complete-linkage classification method was used; the cluster analysis therefore considers that the distance (or similarity) between two clusters must be measured by their most disparate elements, i.e. the distance is given by the maximum span among their components. Finally, through the use of the EVM and ES indexes and Tukey and Fisher pairwise comparisons, the statistical dissimilarity was verified and four clusters were obtained. Construction projects show an average delay of 35% of their planned completion time. Furthermore, four typical behaviors were found, and for each of the obtained clusters the interim milestones and the necessary rhythms of construction were identified. In general, the detected typical behaviors are: (1) projects that perform 5% of the work in the first two tenths and maintain a constant rhythm until completion (greater than 10% for each remaining tenth), being able to finish on the initially estimated time; (2) projects that start with an adequate construction rate but suffer minor delays, culminating in a total delay of almost 27% of the planned time; (3) projects that start with a performance below the planned rate and end up with an average delay of 64%; and (4) projects that begin with a poor performance, suffer great delays and end up with an average delay of 120% of the planned completion time. The obtained clusters compose a tool to identify the behavior of new construction projects by comparing their current work performance to the validated database, thus allowing the correction of initial estimations towards more accurate completion schedules.
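
A minimal sketch of the clustering step: complete-linkage hierarchical clustering of progress curves sampled at each tenth of the planned duration, cut into four clusters as in the study. The synthetic curves are illustrative, not the 90-project database.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
tenths = np.linspace(0.1, 1.0, 10)
# synthetic progress curves: on-time, mildly delayed and strongly delayed projects
on_time = np.clip(tenths ** 0.9 + rng.normal(0, 0.02, (30, 10)), 0, 1)
delayed = np.clip(tenths ** 1.8 + rng.normal(0, 0.02, (30, 10)), 0, 1)
very_late = np.clip(tenths ** 3.0 + rng.normal(0, 0.02, (30, 10)), 0, 1)
curves = np.vstack([on_time, delayed, very_late])

Z = linkage(curves, method="complete")        # inter-cluster distance = most disparate pair
labels = fcluster(Z, t=4, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```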

Keywords: Cluster analysis, construction management, earned value, schedule.

1003 Automated Ranking of Hints

Authors: Sylvia Encheva

Abstract:

The importance of hints in an intelligent tutoring system is well understood; the problems related to their delivery, however, are quite a few. In this paper we propose that the delivery of hints be based on their usefulness. By this we mean that a hint is regarded as useful to a student if the student succeeded in solving a problem after the hint was suggested to her/him. Methods from the theory of partial orderings are further applied, facilitating an automated process of offering individualized advice on how to proceed in order to solve a particular problem.

Keywords: Decision support services, uncertainty management, partial orderings.

1002 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength

Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos

Abstract:

Our ability to solve complex engineering problems is directly related to the processing capacity of computers, which allow numerical algorithms to be run quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance, and statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to produce incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable, and to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in the light of risk management.
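
A minimal sketch of the probabilistic idea: treat the friction angle as a random variable, push it through a simple infinite-slope factor of safety and estimate the failure probability by Monte Carlo, once for a symmetric model and once for a right-skewed stand-in (a lognormal is used here in place of the Dagum fit; all numbers are illustrative, not the Brasilia test data).

```python
import numpy as np

def factor_of_safety(phi_deg, slope_deg=30.0):
    """Infinite-slope FS for a dry, cohesionless soil: tan(phi) / tan(slope)."""
    return np.tan(np.radians(phi_deg)) / np.tan(np.radians(slope_deg))

rng = np.random.default_rng(0)
n = 100_000
phi_normal = rng.normal(32.0, 3.0, n)                # symmetric model of the friction angle
phi_skewed = rng.lognormal(np.log(32.0), 0.09, n)    # right-skewed stand-in for the Dagum fit

for name, phi in [("normal", phi_normal), ("skewed", phi_skewed)]:
    pf = np.mean(factor_of_safety(phi) < 1.0)        # failure when FS drops below 1
    print(f"{name:7s} friction-angle model -> P(failure) ~ {pf:.4f}")
```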

Keywords: Statistical slope stability analysis, Skew distributions, Probability of failure, Functions of random variables.

1001 Efficient Iris Recognition Method for Human Identification

Authors: A. Basit, M. Y. Javed, M. A. Anjum

Abstract:

In this paper, an efficient method for personal identification based on the pattern of the human iris is proposed. It is composed of image acquisition and image preprocessing to obtain a flattened iris, which is then converted into an eigeniris; the decision is carried out using only a one-dimensional reduction of the iris. By comparing the eigenirises it is determined whether two irises are similar. The results show that the proposed method is quite effective.
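
A minimal sketch of the eigen-decomposition idea: project flattened iris images onto a one-dimensional PCA subspace (the "eigeniris" reduction) and identify a probe by its nearest neighbour in that reduced space. The random gallery is a stand-in for real iris images.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
gallery = rng.random((20, 32 * 32))               # 20 enrolled "iris" images, flattened
labels = np.arange(20)

pca = PCA(n_components=1).fit(gallery)            # one-dimensional "eigeniris" reduction
gallery_1d = pca.transform(gallery)

probe_1d = pca.transform(gallery[7].reshape(1, -1))   # re-present subject 7's enrolled image
match = labels[np.argmin(np.abs(gallery_1d - probe_1d))]
print("identified as subject", match)             # expected: 7
```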

Keywords: Biometrics, Canny Operator, Eigeniris, Iris Recognition.

1000 Genetic Algorithms with Oracle for the Traveling Salesman Problem

Authors: Robin Gremlich, Andreas Hamfelt, Héctor de Pereda, Vladislav Valkovsky

Abstract:

By introducing the concept of an Oracle, we propose an approach for improving the performance of genetic algorithms for large-scale asymmetric Traveling Salesman Problems. The results show that the proposed approach overcomes some traditional problems in creating efficient genetic algorithms.

Keywords: Genetic algorithms, Traveling Salesman Problem, optimal decision distribution, oracle.

999 Urban Waste Water Governance in South Africa: A Case Study of Stellenbosch

Authors: R. Malisa, E. Schwella, K. I. Theletsane

Abstract:

Due to climate change, population growth and rapid urbanization, the demand for water in South Africa is inevitably surpassing supply. To address similar challenges globally, there has been a paradigm shift in urban waste water management from conventional "government" to a "governance" paradigm, from which the Integrated Urban Water Management (IUWM) principle emerged. This principle emphasizes efficient urban waste water treatment and the production of high-quality recyclable effluent, thereby mimicking natural water systems in their processes of recycling water efficiently and averting the depletion of natural water resources. The objective of this study was to investigate the drivers of shifting the current urban waste water management approach from a "government" paradigm towards "governance". The study was conducted through the Interactive Management soft-systems research methodology, which follows a qualitative research design. A case study methodology was employed, guided by a realism research philosophy. The qualitative data gathered were analyzed through interpretative structural modelling using Concept Star for Professionals Decision-Making tools (CSPDM) version 3.64. The constructed model deduced that the main drivers in shifting Stellenbosch municipal urban waste water management towards IUWM "governance" principles are mainly social elements, characterized by overambitious public expectations of municipal water service delivery, misinterpretation of the constitutional right of access to adequate clean water and sanitation, and the perceptions of different communities on recycling water. Inadequate public participation also emerged as a strong driver. However, disruptive events such as drought may play a positive role in raising awareness of the value of water, resulting in a shift in perceptions of recycled water. Once the social elements are addressed, the alignment of governance and administration elements towards IUWM is achievable. Hence, the point of departure for the desired paradigm shift is a change in the perceptions and behaviors of water service authorities and serviced communities towards shifting urban waste water management from the "government" to the "governance" paradigm.

Keywords: Integrated urban water management, urban water system, waste water governance, waste water treatment works.

998 An Investigation into the Use of an Atomistic, Hermeneutic, Holistic Approach in Education Relating to the Architectural Design Process

Authors: N. Pritchard

Abstract:

Within architectural education, students arrive forearmed with their life experience, knowledge gained from subject-based learning, their brains and, more specifically, their imaginations. The learning-by-doing that they embark on in studio-based/project-based learning calls for supervision that allows the student to proactively undertake research and experimentation with design solution possibilities. The degree to which this supervision includes direction is subject to debate and differing opinion. It can be argued that if the student is to learn by doing, then design decision-making within the design process needs to be instigated and owned by the student so that they have the ability to personally reflect on and evaluate those decisions. Within this premise lies the problem that the student's endeavours can become unstructured and unfocused as they work their way into a new and complex activity. A resultant weakness can be that the design activity is compartmented and not holistic or comprehensive, and therefore the student's reflections are impoverished in terms of providing a positive, informative feedback loop. The construct proffered in this paper is that a supportive 'armature' or 'Heuristic Framework' can be developed that facilitates a holistic approach and reflective learning. The normal explorations of architectural design comprise analysing the site and context, reviewing building precedents and assimilating the briefing information; however, the student can still be compromised by 'not knowing what they need to know'. The long-serving triad 'Firmness, Commodity and Delight' provides a broad-brush framework of considerations to explore and integrate into good design. If this were further atomised into subdivisions formed from the disparate aspects of architectural design that need to be considered within the design process, then the student could sieve through the facts more methodically and reflectively, considering their interrelationships, conflicts and alliances. The words 'facts' and 'sieve' hold the acronym of the aspects that form the Heuristic Framework: Function, Aesthetics, Context, Tectonics, Spatial, Servicing, Infrastructure, Environmental, Value and Ecological issues. The heuristic could be used as a hermeneutic model, with each aspect of design being focused on and considered in abstraction and then in its relation to the other aspects and to the design proposal as a whole. Importantly, the heuristic could be used as a method for gathering information and enhancing the design brief. The more poetic, mysterious, intuitive, unconscious processes should still be able to occur for the student. The Heuristic Framework should not be seen as comprehensive, prescriptive, formulaic or inhibiting to the wide exploration of possibilities and solutions within the architectural design process.

Keywords: Atomistic, hermeneutic, holistic approach, architectural design studio education.

997 Seismic Fragility Assessment of Strongback Steel Braced Frames Subjected to Near-Field Earthquakes

Authors: Mohammadreza Salek Faramarzi, Touraj Taghikhany

Abstract:

In this paper, the seismic fragility of a recently developed hybrid structural system, known as the strongback system (SBS), is assessed. In this system, an elastic vertical truss is formed to mitigate the occurrence of a soft-story mechanism and improve the distribution of story drifts over the height of the structure. The strengthened members of the braced span are designed to remain substantially elastic at levels of excitation where soft-story mechanisms are likely to occur, and to impose a nearly uniform story drift distribution. Due to the distinctive characteristics of near-field ground motions, it is necessary to study the effect of these records on the seismic performance of the SBS. To this end, a set of 56 near-field ground motion records suggested by the FEMA P695 methodology is used. For the fragility assessment, nonlinear dynamic analyses are carried out in OpenSEES following the procedure recommended in the HAZUS technical manual. Four damage states are considered: slight, moderate, extensive, and complete damage (collapse). To evaluate each damage state, the inter-story drift ratio and floor acceleration are used as engineering demand parameters. Further, to extend the evaluation of the collapse state of the system, a different collapse criterion suggested in FEMA P695 is applied. It is concluded that the SBS can significantly increase the collapse capacity and consequently decrease the collapse risk of the structure during its lifetime. Comparing the observed mean annual frequency (MAF) of exceedance of each damage state against the allowable values presented in performance-based design methods, it is found that using the elastic vertical truss improves the structural response effectively.
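
A minimal sketch of how IDA results are turned into a fragility curve: count, at each intensity level, the fraction of records whose peak drift exceeds a damage-state threshold, then fit a lognormal CDF. The drift data, threshold and intensity grid are illustrative assumptions, not the SBS results.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
im_levels = np.linspace(0.2, 2.0, 10)                 # spectral acceleration, g
n_records = 56                                        # size of the FEMA P695 near-field set
drift_limit = 0.02                                    # assumed extensive-damage drift threshold

# synthetic peak drifts: grow with IM, record-to-record variability is lognormal
drifts = 0.012 * im_levels[:, None] * rng.lognormal(0.0, 0.4, (10, n_records))
frac_exceeding = (drifts > drift_limit).mean(axis=1)

def lognormal_cdf(im, theta, beta):                   # fragility: P(exceed | IM)
    return stats.norm.cdf(np.log(im / theta) / beta)

(theta, beta), _ = optimize.curve_fit(lognormal_cdf, im_levels, frac_exceeding,
                                      p0=[1.0, 0.4], bounds=([0.01, 0.01], [10.0, 2.0]))
print(f"median capacity ~ {theta:.2f} g, dispersion ~ {beta:.2f}")
```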

Keywords: Strongback System, Near-fault, Seismic fragility, Uncertainty, IDA, Probabilistic performance assessment.

996 A Logic Approach to Database Dynamic Updating

Authors: Daniel Stamate

Abstract:

We introduce a logic-based framework for database updating under constraints, in which the constraints are represented as an instantiated extended logic program. When performing an update, database consistency may be violated. We provide an approach to maintaining database consistency and study the conditions under which the maintenance process is deterministic. We show that the complexity of the computations and decision problems presented in our framework is in each case polynomial time.

Keywords: Databases, knowledge bases, constraints, updates, minimal change, consistency.

995 Constructing a Classifier for Face Recognition on the Basis of Conjugation Indexes

Authors: Vladimir A. Fursov, Nikita E. Kozin

Abstract:

In this work, the possibility of constructing classifiers for face-recognition systems based on conjugation criteria is investigated. The linkage between bipartite conjugation, conjugation with a subspace and conjugation with the null-space is shown. A unified decision rule is investigated, which assigns a face to a class by considering the linkage between conjugation values. The described recognition method can be successfully applied to distributed systems of video control and video observation.

Keywords: Conjugation, Eigenfaces, Recognition.

994 Genetic Algorithm for In-Theatre Military Logistics Search-and-Delivery Path Planning

Authors: Jean Berger, Mohamed Barkaoui

Abstract:

Discrete search path planning in a time-constrained, uncertain environment relying upon imperfect sensors is known to be hard, and the problem-solving techniques proposed so far to compute near real-time efficient path plans are mainly limited to providing few-move solutions. A new information-theoretic, open-loop decision model that explicitly incorporates false-alarm sensor readings is presented to solve a single-agent military logistics search-and-delivery path planning problem with anticipated feedback. The decision model consists in minimizing the expected entropy, considering anticipated possible observation outcomes over a given time horizon. The model captures the uncertainty associated with observation events for all possible scenarios; entropy represents a measure of uncertainty about the searched target location. Feedback information resulting from possible sensor observation outcomes along the projected path plan is exploited to update anticipated unit-target occupancy beliefs. For the first time, a compact belief update formulation is generalized to explicitly include false positive observation events that may occur during plan execution. A novel genetic algorithm is then proposed to efficiently solve search path planning, providing near-optimal solutions for practical, realistic problem instances. Given the run-time performance of the algorithm, a natural extension to a closed-loop environment that progressively integrates real visit outcomes on a rolling time horizon can easily be envisioned. Computational results show the value of the approach in comparison to alternate heuristics.
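
A minimal sketch of the belief-update idea: a Bayesian update of cell-occupancy probabilities after observing one cell, explicitly allowing both missed detections and false positives. The grid size, sensor rates and prior are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def update_belief(belief, observed_cell, detection, p_d=0.8, p_fa=0.1):
    """Bayes update of P(target in cell) after looking at one cell.

    p_d  : probability of detection given the target is in the observed cell
    p_fa : false-alarm probability given the target is elsewhere
    """
    likelihood = np.where(np.arange(belief.size) == observed_cell,
                          p_d if detection else 1 - p_d,
                          p_fa if detection else 1 - p_fa)
    posterior = likelihood * belief
    return posterior / posterior.sum()

belief = np.full(25, 1 / 25)                      # uniform prior over a 5x5 search grid
belief = update_belief(belief, observed_cell=12, detection=True)    # a (possibly false) hit
belief = update_belief(belief, observed_cell=12, detection=False)   # follow-up look misses
print("posterior mass on cell 12:", round(belief[12], 3))
```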

Keywords: Search path planning, false alarm, search-and-delivery, entropy, genetic algorithm.

993 The Urban Development Boundary as a Planning Tool for Sustainable Urban Form: The South African Situation

Authors: E. J. Cilliers

Abstract:

It is the living conditions in cities that determine the future of our livelihood. "To change life, we must first change space" - Henri Lefebvre. Sustainable development is a utopian aspiration for South African cities (and especially for the case study of the Gauteng City Region), which are currently characterized by unplanned growth and increasing urban sprawl. While the reasons for poor environmental quality and living conditions are undoubtedly diverse and complex, having political, economic and social dimensions, it is argued that the prevailing approach to layout planning in South Africa is part of the problem. This article seeks a solution to the problem of sustainability from a spatial planning perspective. The spatial planning tool, the urban development boundary, is introduced as the concept that will ensure that empty talk is translated into a sustainable vision. The urban development boundary is a spatial planning tool that can be implemented to direct urban growth towards a more sustainable form. It aims to ensure planned urban areas, in contrast to the current unplanned areas characterized by urban sprawl and insufficient infrastructure. However, the success of the urban development boundary concept is subject to effective implementation measures, as well as adequate and efficient management. The concept of sustainable development can function as a driving force underlying societal change and transformation, but the interface between spatial planning and environmental management needs to be established (as these are the core aspects underlying sustainable development), and authorities need to understand and implement this interface consistently. This interface can, however, be realized in terms of the objectives of the planning tool, the urban development boundary. The case study, the Gauteng City Region, is depicted as a site of economic growth and innovation, but there is a lack of good urban and regional governance, impacting on the design (layout) and function of urban areas and land use, as current authorities make uninformed decisions on development applications, leading to unsustainable urban forms and unsustainable nodes. Place and space concepts are thus critical matters in the planning of the Gauteng City Region. The urban development boundary is therefore explored as a planning tool to guide decision-making and create a sustainable urban form, leading to better environmental and living conditions and continuous sustainability.

Keywords: Urban planning, sustainable urban form, urban development boundary, planning tool.

992 Organizational Decision Based on Business Intelligence

Authors: Pejman Hosseinioun, Rose Shayeghi, Ghasem Ghorbani Rostam

Abstract:

Nowadays, obtaining traditional statistics and reports is not adequate for the needs of organizational managers, who must analyze and transform raw data into knowledge in a world filled with information. Various processes have therefore been developed, among them artificial intelligence-based processes, and new topics such as business intelligence and knowledge discovery have emerged. The current paper studies business intelligence and its applications in organizations.

Keywords: Business intelligence, business intelligence infrastructures, business processes.

991 Designing a Pre-Assessment Tool to Support the Achievement of Green Building Certifications

Authors: Jisun Mo, Paola Boarin

Abstract:

The impact of common buildings on climate and the environment has prompted people to get involved in green building standards aimed at implementing rating tools or certifications. Green building rating systems were thus introduced to the construction industry, and the demand for certified green buildings has increased gradually, succeeding considerably in enhancing people's environmental awareness. However, the existing certification process has been unsatisfactory in attracting stakeholders and professionals who are actively engaged in adopting a rating system, because they face recurring barriers: limited information for understanding the rating process, time-consuming procedures and higher costs, all of which directly influence the decision to pursue green building rating systems. To promote the achievement of green building certifications within the building industry more successfully, this paper aims at designing a Pre-Assessment Tool (PAT) framework that can help stakeholders and professionals in the construction industry to clarify the basic knowledge, timeframe and extra costs needed to activate a green building certification. First, taking the first steps towards a rating tool seems complicated because an upfront commitment to understanding the overall rating procedure is required. This conceptual PAT framework can increase basic knowledge of the rating tool and the certification process, mainly in terms of the resources and information required for each credit requirement. Second, the assessment process of rating tools is generally known as a lengthy and time-consuming system, contributing to unenthusiastic reactions concerning green building projects. The proposed framework can predict the timeframe needed, identifying how long it will take a green project to process each credit requirement and the required documentation, from the beginning of the certification process to final approval. Finally, most people have the initial perception that pursuing green building certification costs more than constructing a non-green building, which makes it more difficult to adopt rating tools. To overcome this issue, the PAT helps users estimate extra expenses such as certification fees and third-party contributions, based on the amount of time it takes to implement the rating tool throughout all the related stages. It can also prevent unexpected or hidden costs arising during the assessment process. Therefore, the proposed PAT framework can be recommended as an effective method to support the decision-making of inexperienced users and can play an important role in promoting green building certification.

Keywords: Barriers, certification process, green building rating systems, pre-assessment tool.

990 An Intelligent Combined Method Based on Power Spectral Density, Decision Trees and Fuzzy Logic for Hydraulic Pumps Fault Diagnosis

Authors: Kaveh Mollazade, Hojat Ahmadi, Mahmoud Omid, Reza Alimardani

Abstract:

Recently, the issue of machine condition monitoring and fault diagnosis as a part of the maintenance system has become global, due to the potential advantages to be gained from reduced maintenance costs, improved productivity and increased machine availability. The aim of this work is to investigate the effectiveness of a new fault diagnosis method based on the power spectral density (PSD) of vibration signals in combination with decision trees and a fuzzy inference system (FIS). To this end, a series of studies was conducted on an external gear hydraulic pump. After a test under normal conditions, a number of different machine defect conditions were introduced at three pump speeds (1000, 1500, and 2000 rpm), corresponding to (i) journal bearing with inner face wear (BIFW), (ii) gear with tooth face wear (GTFW), and (iii) journal bearing with inner face wear plus gear with tooth face wear (B&GW). The features of the PSD values of the vibration signal were extracted using descriptive statistical parameters. The J48 algorithm was used as a feature selection procedure to select pertinent features from the data set. The output of the J48 algorithm was employed to produce the crisp if-then rules and membership function sets. The structure of the FIS classifier was then defined based on the crisp sets. In order to evaluate the proposed PSD-J48-FIS model, the data sets obtained from vibration signals of the pump were used. Results showed that the total classification accuracies for the 1000, 1500, and 2000 rpm conditions were 96.42%, 100%, and 96.42%, respectively. The results indicate that the combined PSD-J48-FIS model has the potential for fault diagnosis of hydraulic pumps.
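
A minimal sketch of the signal-processing front end: Welch PSD features summarised by descriptive statistics and fed to a decision tree (used here as a stand-in for J48; the fuzzy inference stage is omitted). The synthetic signals and fault frequencies are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch
from sklearn.tree import DecisionTreeClassifier

def psd_features(signal, fs=10_000):
    f, pxx = welch(signal, fs=fs, nperseg=1024)
    # descriptive statistics of the PSD plus the dominant frequency
    return [pxx.mean(), pxx.std(), pxx.max(), f[np.argmax(pxx)]]

rng = np.random.default_rng(0)
def make_signal(fault_freq):
    t = np.arange(0, 1.0, 1 / 10_000)
    return np.sin(2 * np.pi * fault_freq * t) + 0.5 * rng.normal(size=t.size)

X = [psd_features(make_signal(f)) for f in [50] * 20 + [120] * 20 + [300] * 20]
y = ["normal"] * 20 + ["BIFW"] * 20 + ["GTFW"] * 20
clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print("training accuracy:", clf.score(X, y))
```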

Keywords: Power Spectral Density, Machine Condition Monitoring, Hydraulic Pump, Fuzzy Logic.

989 Grade and Maximum Tumor Dimension as Determinants of Lymphadenectomy in Patients with Endometrioid Endometrial Cancer (EEC)

Authors: Ali A. Bazzi, Ameer Hamza, Riley O’Hara, Kimberly Kado, Karen H. Hagglund, Lamia Fathallah, Robert T. Morris

Abstract:

Introduction: Endometrial cancer is a common gynecologic malignancy primarily treated with complete surgical staging, which may include complete pelvic and para-aortic lymphadenectomy. The role of lymphadenectomy is controversial, especially the intraoperative indications for the procedure. Three factors are important in the decision to proceed with lymphadenectomy: myometrial invasion, maximum tumor dimension (MTD), and histology. Many institutions incorporate these criteria to varying degrees in the decision to proceed with lymphadenectomy. This investigation assesses the use of intraoperatively measured MTD with and without pre-operative histologic grade. Methods: This study retrospectively compared EEC patients with intraoperatively measured MTD ≤ 2 cm to those with MTD > 2 cm from January 1, 2002 to August 31, 2017. The assessment also compared those with MTD ≤ 2 cm and endometrial biopsy (EB) grade 1-2 to patients with MTD > 2 cm and EB grade 3. Lymph node metastasis (LNM), recurrence, and survival were compared in these groups. Results: This study reviewed 222 patient cases. In tumors > 2 cm, LNM occurred in 20% of cases, while in tumors ≤ 2 cm, LNM was found in 6% of cases (p=0.04). Recurrence and mean survival based on the last follow-up visit in these two groups were not statistically different (p=0.78 and 0.36, respectively). The data demonstrated a trend that, when combined with preoperative EB International Federation of Gynecology and Obstetrics (FIGO) grade, a higher proportion of patients with EB FIGO grade 3 and MTD > 2 cm had LNM compared to those with EB FIGO grade 1-2 and MTD ≤ 2 cm (43% vs. 11%, p=0.06). LNM was found in 15% of cases in which lymphadenectomy was performed based on current practice, whereas if the criteria of EB FIGO 3 and MTD > 2 cm had been used, the incidence of LNM would have been 44% of cases. However, using this criterion, two patients would not have had their nodal metastases detected. Compared to current practice, the sensitivity and specificity of the proposed criteria would be 60% and 81%, respectively, and the PPV and NPV would be 43% and 90%, respectively. Conclusion: The results indicate that MTD combined with EB FIGO grade can detect LNM in a higher proportion of cases when compared to current practice. MTD combined with EB FIGO grade may eliminate the need for frozen section sampling in a substantial number of cases.
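
A small worked example of how the reported diagnostic metrics follow from a 2x2 table; the counts are hypothetical, chosen only so the output lands near the reported values, and are not the study's actual cross-tabulation.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)          # probability LNM is flagged when present
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)                  # positive predictive value
    npv = tn / (tn + fn)                  # negative predictive value
    return sensitivity, specificity, ppv, npv

# hypothetical counts (not the study's data): 9 true positives, 12 false positives,
# 6 false negatives, 50 true negatives
print([round(m, 2) for m in diagnostic_metrics(tp=9, fp=12, fn=6, tn=50)])
```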

Keywords: Endometrial cancer, FIGO grade, lymphadenectomy, tumor size.

988 Identifying a Drug Addict Person Using Artificial Neural Networks

Authors: Mustafa Al Sukar, Azzam Sleit, Abdullatif Abu-Dalhoum, Bassam Al-Kasasbeh

Abstract:

Use and abuse of drugs by teens is very common and can have dangerous consequences. Drugs contribute to physical and sexual aggression such as assault or rape. Some teenagers regularly use drugs to compensate for depression, anxiety or a lack of positive social skills. Teen smoking should not be minimized, because it can be a "gateway drug" for other drugs (marijuana, cocaine, hallucinogens, inhalants, and heroin). The combination of teenagers' curiosity, risk-taking behavior, and social pressure makes it very difficult to say no. This leads most teenagers to the question: "Will it hurt to try once?" Nowadays, technological advances are changing our lives very rapidly and adding many technologies that help us to track the risk of drug abuse, such as smart phones, Wireless Sensor Networks (WSNs), and the Internet of Things (IoT). These techniques may help in the early discovery of drug abuse in order to prevent an aggravation of the influence of drugs on the abuser. In this paper, we have developed a Decision Support System (DSS) for detecting drug abuse using an Artificial Neural Network (ANN); we used a Multilayer Perceptron (MLP) feed-forward neural network in developing the system. The input layer includes 50 variables, while the output layer contains one neuron which indicates whether the person is a drug addict. An iterative process is used to determine the number of hidden layers and the number of neurons in each one. We used multiple experimental models completed with the log-sigmoid transfer function. In particular, 10-fold cross-validation schemes are used to assess the generalization of the proposed system. The experimental results obtained a classification accuracy of 98.42% for correct diagnosis in our system. The data were taken from 184 cases in Jordan according to a set of questions compiled from specialists, and the data were obtained through the families of drug abusers.
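
A minimal sketch of the described setup: a feed-forward MLP with logistic (log-sigmoid) activations evaluated by 10-fold cross-validation over 50 input variables. The data are synthetic; the hidden-layer size is an illustrative assumption rather than the value found by the authors' iterative search.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(184, 50))                          # 50 questionnaire-derived inputs
y = (X[:, :5].sum(axis=1) + rng.normal(0, 0.5, 184) > 0).astype(int)   # synthetic label

clf = MLPClassifier(hidden_layer_sizes=(20,), activation="logistic",
                    max_iter=2000, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)              # 10-fold cross-validation
print("10-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```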

Keywords: Artificial Neural Network, Decision Support System, drug abuse, drug addiction, Multilayer Perceptron.

987 An Enterprise Intelligent System Development and Solution Framework

Authors: Rajendra M. Sonar

Abstract:

The recent trend has been to use a hybrid approach rather than a single intelligent technique to solve problems. In this paper, we describe and discuss a framework to develop enterprise solutions that are backed by intelligent techniques. The framework not only uses intelligent techniques themselves, but is a complete environment that includes various interfaces and components to develop intelligent solutions. The framework is completely Web-based and uses XML extensively. It can work as a shared platform to be accessed by multiple developers, users and decision makers.

Keywords: Intelligent System Development Framework, Web-based Intelligent Systems, Retail Banking.

986 Accelerating GLA with an M-Tree

Authors: Olli Luoma, Johannes Tuikkala, Olli Nevalainen

Abstract:

In this paper, we propose a novel improvement to the generalized Lloyd algorithm (GLA). Our algorithm makes use of an M-tree index built on the codebook, which makes it possible to reduce the number of distance computations when the nearest code words are searched. Our method does not impose the use of any specific distance function but works with any metric distance, making it more general than many other fast GLA variants. Finally, we present the positive results of our performance experiments.

Keywords: Clustering, GLA, M-Tree, Vector Quantization.

985 Multi-Objective Optimization of Run-of-River Small-Hydropower Plants Considering Both Investment Cost and Annual Energy Generation

Authors: Amèdédjihundé H. J. Hounnou, Frédéric Dubas, François-Xavier Fifatin, Didier Chamagne, Antoine Vianou

Abstract:

This paper presents the techno-economic evaluation of run-of-river small-hydropower plants. In this regard, a multi-objective optimization procedure is proposed for the optimal sizing of the hydropower plants, and NSGA-II is employed as the optimization algorithm. Annual generated energy and investment cost are considered as the objective functions, while the number of generator units (n) and the nominal turbine flow rate (QT) constitute the decision variables. The site of Yeripao in Benin is considered as the case study. We have categorized the river at this site using its environmental characteristics: gross head and the first quartile, median, third quartile and mean of flow. The effects of each decision variable on the objective functions are analysed. The results give a Pareto front which represents the trade-offs between annual energy generation and the investment cost of the hydropower plants, as well as recommended optimal solutions. We note that as the annual energy generation increases, the investment cost rises; thus, maximizing energy generation is contradictory with minimizing the investment cost. Moreover, the solutions on the Pareto front are grouped according to the number of generator units (n). The results also illustrate that the costs per kWh are grouped according to n and rise with increasing nominal turbine flow rate. The lowest investment costs per kWh are obtained for n equal to one and are between 0.065 and 0.180 €/kWh. For each value of n (equal to 1, 2, 3 or 4), the investment cost and the investment cost per kWh increase almost linearly with increasing nominal turbine flow rate, while the annual generated energy increases logarithmically with increasing nominal turbine flow rate. This study, made for the Yeripao river, can be applied to other rivers with their own characteristics.
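
A minimal sketch of the Pareto-front concept behind the NSGA-II sizing: keep the (investment cost, annual energy) pairs that no other candidate design dominates. The random candidate designs stand in for evaluations of (n, QT); all values are illustrative.

```python
import numpy as np

def pareto_front(costs, energies):
    """Return indices of non-dominated designs (minimise cost, maximise energy)."""
    keep = []
    for i in range(len(costs)):
        dominated = any(costs[j] <= costs[i] and energies[j] >= energies[i]
                        and (costs[j] < costs[i] or energies[j] > energies[i])
                        for j in range(len(costs)))
        if not dominated:
            keep.append(i)
    return keep

rng = np.random.default_rng(0)
cost = rng.uniform(0.5, 5.0, 50)                          # investment cost, M-euro (illustrative)
energy = 2.0 * np.log1p(cost) + rng.normal(0, 0.2, 50)    # annual energy, GWh, rises with cost
front = pareto_front(cost, energy)
print(len(front), "non-dominated designs out of 50")
```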

Keywords: Hydropower plant, investment cost, multi-objective optimization, number of generator units.

984 Multi-Objective Cellular Manufacturing System under Machines with Different Life-Cycle using Genetic Algorithm

Authors: N. Javadian, J. Rezaeian, Y. Maali

Abstract:

In this paper, a multi-objective nonlinear programming model of a cellular manufacturing system is presented which minimizes intercell movements and maximizes the sum of the reliability of cells. We present a genetic approach for finding efficient solutions to the problem of cell formation for products having multiple routings. This method finds the non-dominated solutions and, according to the decision maker's preferences, the best solution is chosen.

Keywords: Cellular Manufacturing, Genetic Algorithm, Multi-objective, Life-Cycle.
