Search results for: real estate valuation model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20701

19741 TerraEnhance: High-Resolution Digital Elevation Model Generation Using GANs

Authors: Siddharth Sarma, Ayush Majumdar, Nidhi Sabu, Mufaddal Jiruwaala, Shilpa Paygude

Abstract:

Digital Elevation Models (DEMs) are digital representations of the Earth’s topography, which include information about the elevation, slope, aspect, and other terrain attributes. DEMs play a crucial role in various applications, including terrain analysis, urban planning, and environmental modeling. In this paper, TerraEnhance, a distinct approach for high-resolution DEM generation using Generative Adversarial Networks (GANs) combined with Real-ESRGANs, is proposed. By learning from a dataset of low-resolution DEMs, the GANs are trained to upscale the data tenfold, resulting in significantly enhanced DEMs with improved resolution and finer details. The integration of Real-ESRGANs further enhances visual quality, leading to more accurate representations of the terrain. A post-processing layer is introduced, employing high-pass filtering to refine the generated DEMs, preserving important details while reducing noise and artifacts. The results demonstrate that TerraEnhance outperforms existing methods, producing high-fidelity DEMs with intricate terrain features and exceptional accuracy. These advancements make TerraEnhance suitable for various applications, such as terrain analysis and precise environmental modeling.
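
The high-pass post-processing step lends itself to a short illustration. Below is a minimal sketch assuming a Gaussian-blur-based high-pass (unsharp-mask) filter; the abstract does not specify the filter design, so the filter type and the sigma and strength values are assumptions.

```python
# Hypothetical sketch of the post-processing stage: a high-pass filter
# (here built as an unsharp mask from a Gaussian blur) applied to an
# upscaled DEM. Filter choice and parameters are assumptions, not taken
# from the paper.
import numpy as np
from scipy.ndimage import gaussian_filter

def refine_dem(dem_upscaled: np.ndarray, sigma: float = 2.0,
               strength: float = 0.5) -> np.ndarray:
    """Sharpen fine terrain detail while suppressing smooth GAN artifacts."""
    low_pass = gaussian_filter(dem_upscaled, sigma=sigma)   # smooth background
    high_pass = dem_upscaled - low_pass                     # fine-detail layer
    return dem_upscaled + strength * high_pass              # boosted detail

# Example: refine a (10x-upscaled) 1000x1000 elevation grid.
dem = np.random.rand(1000, 1000).astype(np.float32)
refined = refine_dem(dem)
```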

Keywords: DEM, ESRGAN, image upscaling, super resolution, computer vision

Procedia PDF Downloads 11
19740 An Estimating Equation for Survival Data with Possibly Time-Varying Covariates under a Semiparametric Transformation Model

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

An estimating equation technique is an alternative to the widely used maximum likelihood methods, and it eases some of the complexity introduced by time-varying covariates. When both time-varying covariates and left-truncation are considered in the model, maximum likelihood estimation procedures become much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations, which have received considerable attention from researchers, under a semiparametric transformation model. The purpose of this article was to develop the modified estimating equation under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators were derived via the expectation-maximization (EM) algorithm. The finite-sample performance of the estimators for the proposed model was illustrated via simulation studies and the Stanford heart transplant data. To sum up, the bias for covariates was adjusted by estimating the density function of the truncation time variable, and the effect of possibly time-varying covariates was then evaluated in some special semiparametric transformation models.
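
For readers unfamiliar with the model class, a minimal statement of the linear transformation model is given below in its time-invariant covariate form; the paper's time-varying extension is more involved, so this is orientation rather than the authors' exact formulation.

```latex
% Linear transformation model (time-invariant covariate form):
%   H : unknown monotone increasing transformation,
%   Z : covariate vector, \beta : regression parameter,
%   \varepsilon : error with a completely specified distribution.
\[
  H(T) = -\beta^{\top} Z + \varepsilon
\]
% Choosing the extreme-value distribution for \varepsilon recovers the
% Cox proportional hazards model; the logistic distribution gives the
% proportional odds model.
```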

Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time-varying covariates

Procedia PDF Downloads 154
19739 QoS-CBMG: A Model for e-Commerce Customer Behavior

Authors: Hoda Ghavamipoor, S. Alireza Hashemi Golpayegani

Abstract:

An approach to model customer interaction with e-commerce websites is presented. Considering the service quality level as a predictive feature, we offer an improved method based on the Customer Behavior Model Graph (CBMG), a state-transition graph model. To derive the Quality of Service sensitive-CBMG (QoS-CBMG) model, process-mining techniques are applied to pre-processed website server logs, which are categorized as ‘buy’ or ‘visit’. Experimental results on data from an e-commerce website confirmed that the proposed method outperforms the CBMG-based method.
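
A CBMG is, at its core, a transition-probability matrix estimated from session logs. The sketch below shows that estimation step; the state names and session format are illustrative assumptions, not the authors' data.

```python
# Illustrative sketch (not the authors' code): estimating a CBMG-style
# state-transition probability matrix from pre-processed session logs.
from collections import defaultdict

def transition_probabilities(sessions):
    """Estimate P(next_state | state) from a list of state sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for src, dst in zip(session, session[1:]):
            counts[src][dst] += 1
    probs = {}
    for src, dsts in counts.items():
        total = sum(dsts.values())
        probs[src] = {dst: n / total for dst, n in dsts.items()}
    return probs

# Hypothetical sessions labeled 'buy' vs 'visit' would each get a matrix.
sessions = [["entry", "browse", "add_to_cart", "pay", "exit"],
            ["entry", "search", "browse", "exit"]]
print(transition_probabilities(sessions))
```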

Keywords: customer behavior model, electronic commerce, quality of service, customer behavior model graph, process mining

Procedia PDF Downloads 416
19738 Model-Based Simulation Approach to a 14-DOF Car Model Using MATLAB/Simulink

Authors: Ishit Sheth, Chandrasekhar Jinendran, Chinmaya Ranjan Sahu

Abstract:

A fourteen degree of freedom (DOF) ride and handling control mathematical model is developed for a car using the generalized Boltzmann-Hamel equation, which will create a basis for the design of a ride and handling controller. The mathematical model developed yields equations of motion for non-holonomic constrained systems in quasi-coordinates. The governing differential equations developed integrate ride and handling control of the car. A model-based systems engineering approach is implemented for simulation using MATLAB/Simulink; the vehicle’s response in different DOFs is examined and later validated using commercial software (ADAMS). This manuscript involves a detailed derivation of the full car vehicle model, which provides the response in longitudinal, lateral, and yaw motion, to demonstrate the advantages of the developed model over the existing dynamic model. The dynamic behaviour of the developed ride and handling model is simulated for different road conditions.
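
For orientation, the Boltzmann-Hamel equations referred to above are quoted below in a commonly used general form; notation varies across texts, so this is a sketch rather than the paper's exact derivation.

```latex
% Boltzmann-Hamel equations in quasi-velocities \omega_s, with \bar{T}
% the kinetic energy expressed in them:
\[
  \frac{d}{dt}\!\left( \frac{\partial \bar{T}}{\partial \omega_s} \right)
  - \frac{\partial \bar{T}}{\partial \pi_s}
  + \sum_{r=1}^{n} \sum_{l=1}^{n}
      \gamma^{r}_{sl}\, \omega_l\, \frac{\partial \bar{T}}{\partial \omega_r}
  = Q_s ,
\]
% where \pi_s are the quasi-coordinates, \gamma^{r}_{sl} the Hamel
% coefficients, and Q_s the generalized forces. Non-holonomic constraints
% are imposed by setting the corresponding quasi-velocities to zero.
```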

Keywords: full vehicle model, MBSE, non-holonomic constraints, Boltzmann-Hamel equation

Procedia PDF Downloads 232
19737 Comprehensive Risk Assessment Model in Agile Construction Environment

Authors: Jolanta Tamošaitienė

Abstract:

The article presents a comprehensive model for risk assessment and selection in an agile environment, based on multi-attribute methods. The model rests on a multi-attribute evaluation of risk in construction, with the optimality criterion values calculated using complex Multiple Criteria Decision-Making methods. The model may be further applied to risk assessment in an agile construction environment. The attributes of risk in a construction project are selected by applying risk assessment conditions specific to the construction sector, while construction process efficiency in the construction industry accounts for the agile environment. The paper provides a background and a description of the proposed model, together with an analysis of the comprehensive risk assessment model in an agile construction environment and its criteria.
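
A toy multi-attribute evaluation illustrates the idea. The attribute set, weights, and the simple weighted-sum aggregation below are illustrative assumptions; the paper derives its optimality criterion values with more complex MCDM methods.

```python
# Minimal multi-attribute risk evaluation sketch (weighted sum), with
# made-up data. Rows: risk alternatives; columns: normalized attribute
# scores in [0, 1] (e.g., probability, cost impact, schedule impact,
# agility impact).
import numpy as np

scores = np.array([[0.8, 0.3, 0.5, 0.2],
                   [0.4, 0.7, 0.6, 0.5],
                   [0.2, 0.2, 0.9, 0.8]])
weights = np.array([0.4, 0.3, 0.2, 0.1])   # attribute weights, sum to 1

risk_index = scores @ weights               # weighted-sum aggregation
ranking = np.argsort(-risk_index)           # highest risk first
print(risk_index, ranking)
```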

Keywords: assessment, environment, agile, model, risk

Procedia PDF Downloads 256
19736 Study of a Lean Premixed Combustor: A Thermoacoustic Analysis

Authors: Minoo Ghasemzadeh, Rouzbeh Riazi, Shidvash Vakilipour, Alireza Ramezani

Abstract:

In this study, thermoacoustic oscillations of a lean premixed combustor have been investigated, and a one-dimensional code was developed for this purpose. The linearized equations of motion are solved for perturbations with time dependence e^{iωt}. Two flame models were considered in this paper, and the effects of mean flow and boundary conditions were also investigated. After manipulation of the flame heat release equation together with the equations of flow perturbation within the main components of the combustor model (i.e., plenum, premix duct, and combustion chamber), and by considering proper boundary conditions between the components of the model, a system of eight homogeneous equations is obtained. This simplification of the main components of the combustor model is convenient, since low-frequency acoustic waves are not affected by bends; moreover, some elements in the combustor are smaller than the wavelength of the propagated acoustic perturbations. A convection time is also assumed, characterizing the time required for the acoustic velocity fluctuations to travel from the point of injection to the location of the flame front in the combustion chamber. The influence of an extended flame model on the acoustic frequencies of the combustor was also investigated, assuming the effect of flame speed, as a function of the equivalence ratio perturbation, on the rate of flame heat release. The above-mentioned system of equations has a related eigenvalue equation with complex roots. The sign of the imaginary part of these roots determines whether the disturbances grow or decay, and the real part gives the frequency of the modes. The results show reasonable agreement between the predicted dominant frequencies of the present model and those calculated in previous related studies.
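
The stability criterion implied by the chosen time dependence can be written out explicitly, as in the following sketch.

```latex
% Perturbations are assumed of the form
\[
  p'(x,t) = \hat{p}(x)\, e^{i\omega t}, \qquad
  \omega = \omega_r + i\,\omega_i ,
\]
% so the amplitude evolves as e^{-\omega_i t}: a root of the eigenvalue
% equation with \omega_i < 0 corresponds to a growing (unstable) mode,
% \omega_i > 0 to a decaying one, and \omega_r / 2\pi gives the modal
% frequency in Hz.
```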

Keywords: combustion instability, dominant frequencies, flame speed, premixed combustor

Procedia PDF Downloads 379
19735 A Combined Meta-Heuristic and Hyper-Heuristic Approach to the Single Machine Production Scheduling Problem

Authors: C. E. Nugraheni, L. Abednego

Abstract:

This paper is concerned with the minimization of mean tardiness and flow time in a real single machine production scheduling problem. Two variants of a genetic algorithm as the meta-heuristic, combined with a hyper-heuristic approach, are proposed to solve this problem. These methods are used to solve instances generated with real-world data from a company. Encouraging results are reported.
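
The two objectives being minimized are simple to compute for any candidate job sequence, as the following sketch shows; the job data are made up, and jobs are assumed released at time zero.

```python
# Illustrative computation of mean tardiness and mean flow time for a
# given job sequence on a single machine (all jobs released at t = 0).
def mean_tardiness_and_flow_time(sequence, processing, due):
    t, tard, flow = 0.0, 0.0, 0.0
    for job in sequence:
        t += processing[job]                 # completion time of this job
        flow += t                            # flow time = completion time
        tard += max(0.0, t - due[job])       # tardiness w.r.t. due date
    n = len(sequence)
    return tard / n, flow / n

processing = {"A": 3, "B": 5, "C": 2}
due = {"A": 4, "B": 6, "C": 10}
print(mean_tardiness_and_flow_time(["A", "B", "C"], processing, due))
```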

Keywords: hyper-heuristics, evolutionary algorithms, production scheduling, meta-heuristic

Procedia PDF Downloads 381
19734 Optimization of a Hybrid Off-Grid Energy Station

Authors: Yehya Abdellatif, Iyad M. Muslih, Azzah Alkhalailah, Abdallah Muslih

Abstract:

The Hybrid Optimization Model for Electric Renewables (HOMER) software was utilized to find the optimum design of a hybrid off-grid system by choosing the optimal solution based on a cost analysis of energy at different capacity shortage percentages. A complete study of the site conditions and load profile was done to optimize the design and implementation of a hybrid off-grid power station. In addition, the solution takes into consideration the effect of ambient temperature on the efficiency of power generation, and the economic aspects of selection depend on real market prices. From the analysis of the HOMER model results, the optimum hybrid power station was suggested based on wind speed and solar conditions. The optimization objective is to minimize the Net Present Cost (NPC) and the Cost of Energy (COE) with zero and 10 percent capacity shortage.
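
For reference, the two objective quantities are computed by HOMER essentially as follows (standard definitions from the HOMER documentation, restated here for the reader).

```latex
% Notation: C_{ann,tot} total annualized cost, i real interest rate,
% N project lifetime in years, E_{served} electrical load served per year.
\[
  \mathrm{NPC} = \frac{C_{ann,tot}}{\mathrm{CRF}(i, N)}, \qquad
  \mathrm{COE} = \frac{C_{ann,tot}}{E_{served}}, \qquad
  \mathrm{CRF}(i, N) = \frac{i\,(1+i)^N}{(1+i)^N - 1}.
\]
```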

Keywords: energy modeling, HOMER, off-grid system, optimization

Procedia PDF Downloads 565
19733 Automatic and Highly Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated, and parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. Such models might therefore not be applicable for the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, so the system must be adapted manually. Therefore, an approach is described that generates models overcoming the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for any system detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables, not only of single variables. This enables a far more precise representation of causal correlations. The basis and the explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. In this way, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g., time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with a precision error of less than one percent. Moreover, the automatic identification of correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model to be optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
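
The key idea, regression over product terms of the input variables rather than single variables only, can be sketched compactly. The example below uses plain least squares over second-order series-expansion terms on synthetic data; the authors' actual method and term-selection logic are not published, so this is illustration only.

```python
# Regression with product terms (second-order series expansion), sketched
# with synthetic sensor data. Nonzero fitted weights on product columns
# reveal "hidden" correlations of products of variables.
import numpy as np
from itertools import combinations_with_replacement

def design_matrix(X):
    """Constant, linear, and all pairwise product terms of the inputs."""
    cols = [np.ones(len(X))]
    cols += [X[:, j] for j in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))                  # sensor inputs
y = 2.0 + X[:, 0] * X[:, 1] - 0.5 * X[:, 2] ** 2       # hidden product terms
coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print(np.round(coef, 3))   # weights on the x0*x1 and x2^2 columns stand out
```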

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 409
19732 Numerical Verification of a Backfill-Rectangular Tank-Fluid System

Authors: Ramazan Livaoğlu, Tufan Çakır

Abstract:

The performance of rectangular tanks during earthquakes has been observed to depend significantly on the existence of water in the container and on the presence of backfill acting on the tank wall. Therefore, in the design of rectangular tanks, the topics of fluid-structure-backfill interactions and the determination of the modal characteristics of the interaction system have traditionally been subjects of great theoretical and practical controversy. Although the finite element method has been, and will continue to be, used to a significant extent in treating the response of the system, experimental verification of numerical models remains a prerequisite for their adoption and reliable application in practice. Thus, in this study, numerical and experimental investigations were performed on the backfill-exterior wall-fluid interaction system. Firstly, a three-dimensional finite element model (3D-FEM) was developed to acquire the modal frequencies and mode shapes of the system by means of ANSYS. Secondly, a series of in-situ tests were performed to define the modal characteristics of the same system and to determine the applicability of the FEM to a real physical situation under field conditions. Finally, comparing the theoretical predictions from the model with the results from experimental measurement, close agreement was found between theory and experiment. Thus, it can be stated that experimental verification provides strong support for the use of the proposed model in further investigations.

Keywords: fluid-structure interaction, modal analysis, rectangular tank, soil structure interaction

Procedia PDF Downloads 394
19731 Formal Verification of Cache System Using a Novel Cache Memory Model

Authors: Guowei Hou, Lixin Yu, Wei Zhuang, Hui Qin, Xue Yang

Abstract:

Formal verification is proposed to ensure the correctness of the design and to make functional verification more efficient. Since the cache plays a vital role in the design of a System on Chip (SoC), and a cache with a Memory Management Unit (MMU) and cache memory unit makes the state space too large to verify by simulation, a formal verification approach is presented for such system designs. In the paper, a formal model checking verification flow is suggested, and a new cache memory model, called the “exhaustive search model”, is proposed. Instead of using a large-size RAM to denote the whole cache memory, the exhaustive search model employs just two cache blocks. For a cache system containing a data cache (Dcache) and an instruction cache (Icache), the Dcache memory model and Icache memory model are established separately using the same mechanism. Finally, the novel model is employed in the verification of a cache that is a module of a custom-built SoC system that has been applied in practice, and the result shows that the cache system is verified correctly using the exhaustive search model, which makes the verification much more manageable and flexible.

Keywords: cache system, formal verification, novel model, system on chip (SoC)

Procedia PDF Downloads 497
19730 Development of Simple-To-Apply Biogas Kinetic Models for the Co-Digestion of Food Waste and Maize Husk

Authors: Owamah Hilary, O. C. Izinyon

Abstract:

Many existing biogas kinetic models are difficult to apply to substrates they were not developed for, as they are substrate-specific. A biodegradability kinetic (BIK) model and a maximum biogas production potential and stability assessment (MBPPSA) model were therefore developed in this study for the anaerobic co-digestion of food waste and maize husk. The biodegradability constant (k) was estimated as 0.11 d⁻¹ using the BIK model. The results for maximum biogas production potential (A) obtained using the MBPPSA model corresponded well with the results obtained using the popular but complex modified Gompertz model for digesters B-1, B-2, B-3, B-4, and B-5. The If value of the MBPPSA model also showed that digesters B-3, B-4, and B-5 were stable, while B-1 and B-2 were unstable. A similar stability observation was obtained using the modified Gompertz model. The MBPPSA model can therefore be used as an alternative model for anaerobic digestion feasibility studies and plant design.
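
For reference, the modified Gompertz model used as the benchmark above is conventionally written as follows.

```latex
% Modified Gompertz model for cumulative biogas production:
\[
  B(t) = A \exp\!\left\{ -\exp\!\left[ \frac{\mu_m e}{A}\,(\lambda - t) + 1 \right] \right\},
\]
% where B(t) is the cumulative biogas yield at time t, A the maximum
% biogas production potential, \mu_m the maximum production rate,
% \lambda the lag phase, and e Euler's number.
```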

Keywords: biogas, inoculum, model development, stability assessment

Procedia PDF Downloads 429
19729 Modeling Factors Affecting Fertility Transition in Africa: Case of Kenya

Authors: Dennis Okora Amima Ondieki

Abstract:

Fertility transition has been identified as being affected by numerous factors. This research aimed to investigate the factors that most strongly affect fertility transition in Kenya. These factors were first extracted from the literature and grouped into demographic features, social and economic features, socio-cultural features, reproductive features, and modernization features, giving 23 factors identified for this study. The data came from the Kenya Demographic and Health Surveys (KDHS) conducted in 1999-2003 and 2003-2008/9. The data were continuous and involved the mean birth order for the ten periods. Principal component analysis (PCA) was applied to the 23 factors and singled out religion, region, education, and marital status as the principal factors. PC scores were calculated for every point. The identified principal components were utilized as predictors in a multiple regression model, with the fertility level as the response variable. The four components were found to affect fertility transition differently: fertility is affected positively by region and marital status and negatively by religion and education. These four factors can be considered in planning policy in Kenya and in Africa at large.
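
The workflow, PCA followed by regression on the component scores, is compact enough to sketch. The data below are random placeholders; only the shapes (ten periods, 23 factors, four retained components) follow the study.

```python
# Principal component regression sketch: standardize factors, extract
# principal components via SVD, regress fertility level on PC scores.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 23))            # 10 periods x 23 candidate factors
y = rng.normal(size=10)                  # mean birth order per period

Xs = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
pc_scores = Xs @ Vt[:4].T                          # scores on first 4 PCs

A = np.column_stack([np.ones(len(y)), pc_scores])  # intercept + 4 PCs
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(beta, 3))   # coefficient signs give positive/negative effects
```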

Keywords: fertility transition, principal component analysis, Kenya demographic health survey, birth order

Procedia PDF Downloads 103
19728 Climate Safe House: A Community Housing Project Tackling Catastrophic Sea Level Rise in Coastal Communities

Authors: Chris Fersterer, Col Fay, Tobias Danielmeier, Kat Achterberg, Scott Willis

Abstract:

New Zealand, an island nation, has an extensive coastline peppered with small communities of iconic buildings known as baches. Post-WWII, these modest buildings were constructed by their owners as retreats; they were generally small and low cost, often used recycled material, and often fell below currently acceptable building standards. In the latter part of the 20th century, real estate prices in many of these communities remained low, and these areas became permanent residences for people attracted to this affordable lifestyle choice. The Blueskin Resilient Communities Trust (BRCT) is an organisation that recognises the vulnerability of communities in low-lying settlements now prone to the increased flood threat brought about by climate change and sea level rise. Some of the inhabitants of Blueskin Bay, Otago, NZ have already found their properties to be uninsurable because of the increased frequency of flood events, and property values have slumped accordingly. Territorial authorities also acknowledge this increased risk and have created additional compliance measures for new buildings that are less than 2 m above tidal peaks. Community resilience becomes an additional concern where inhabitants are attracted to a lifestyle associated with a specific location and its people, and this lifestyle cannot be met in a suburban or city context. Traditional models of social housing fail to provide the sense of community connectedness and identity enjoyed by the current residents of Blueskin Bay. BRCT has partnered with the Otago Polytechnic Design School to design a new form of community housing that can react to this environmental change. It is a longitudinal project incorporating participatory approaches as a means of getting people ‘on board’, understanding complex systems, and co-developing solutions. In the first period, they are seeking industry support and funding to develop a transportable and fully self-contained housing model that exploits current technologies. BRCT also hopes that the building will become an educational tool to highlight the climate change issues facing us today. This paper uses the Climate Safe House (CSH) as a case study for education in architectural sustainability through experiential learning offered as part of Otago Polytechnic's Bachelor of Design. Students engage with the project through research methodologies including site surveys, resident interviews, data sourced from government agencies, and physical modelling. The process involves collaboration across design disciplines, including product and interior design, but also includes connections with industry, both within the education institution and through stakeholder industries introduced by BRCT. This project offers a rich learning environment where students become engaged through project-based learning within a community of practice spanning architecture, construction, energy, and other related fields. The design outcomes are expressed in a series of public exhibitions and forums where community input is sought in a truly participatory process.

Keywords: community resilience, problem-based learning, project-based learning, case study

Procedia PDF Downloads 290
19727 Optimization of Proton Exchange Membrane Fuel Cell Parameters Based on Modified Particle Swarm Algorithms

Authors: M. Dezvarei, S. Morovati

Abstract:

In recent years, the increasing usage of electrical energy has opened a wide field for investigating new methods to produce clean electricity with high reliability and cost management. Fuel cells are a new, clean generation technology that produces electricity and thermal energy together, with high performance and no environmental pollution. Given the expansion of fuel cell usage in different industrial networks, the identification and optimization of their parameters is highly significant. This paper presents the optimization of proton exchange membrane fuel cell (PEMFC) parameters based on modified particle swarm optimization with real-valued mutation (RVM) and clonal algorithms. The mathematical equations of this type of fuel cell are presented as the main model structure in the optimization process. Parameters optimized with the clonal and RVM algorithms are compared with the desired values in the presence and absence of measurement noise. This paper shows that these methods can improve the performance of traditional optimization methods. Simulation results are employed to analyze and compare the performance of these methodologies in optimizing the proton exchange membrane fuel cell parameters.
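
The canonical PSO update that both modified algorithms build on is given below; the RVM and clonal modifications change how particles are perturbed or cloned, and their exact operators are not reproduced here.

```latex
% Canonical PSO velocity and position updates for particle i at step k:
\[
  v_i^{k+1} = w\,v_i^{k}
            + c_1 r_1 \left( p_i^{best} - x_i^{k} \right)
            + c_2 r_2 \left( g^{best} - x_i^{k} \right), \qquad
  x_i^{k+1} = x_i^{k} + v_i^{k+1},
\]
% with inertia weight w, acceleration coefficients c_1, c_2, and random
% factors r_1, r_2 \sim U(0,1).
```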

Keywords: clonal algorithm, proton exchange membrane fuel cell (PEMFC), particle swarm optimization (PSO), real-valued mutation (RVM)

Procedia PDF Downloads 353
19726 Online Monitoring and Control of Continuous Mechanosynthesis by UV-Vis Spectrophotometry

Authors: Darren A. Whitaker, Dan Palmer, Jens Wesholowski, James Flaherty, John Mack, Ahmad B. Albadarin, Gavin Walker

Abstract:

Traditional mechanosynthesis has been performed by either ball milling or manual grinding. However, neither of these techniques allows the easy application of process control: the temperature may change unpredictably due to friction in the process, and hence the amount of energy transferred to the reactants is intrinsically non-uniform. Recently, it has been shown that the use of Twin-Screw Extrusion (TSE) can overcome these limitations. Additionally, TSE provides a platform for continuous synthesis or manufacturing, as it is an open-ended process, with feedstocks at one end and product at the other. Several materials, including metal-organic frameworks (MOFs), co-crystals, and small organic molecules, have been produced mechanochemically using TSE. The described advantages of TSE are offset by drawbacks such as increased process complexity (a large number of process parameters) and variation in feedstock flow impacting product quality. To handle the above-mentioned drawbacks, this study utilizes UV-Vis spectrophotometry (InSpectroX, ColVisTec) as an online tool to gain real-time information about the quality of the product. Additionally, this is combined with real-time process information in an Advanced Process Control system (PharmaMV, Perceptive Engineering), allowing full supervision and control of the TSE process. Further, by characterizing the dynamic behavior of the TSE, a model predictive controller (MPC) can be employed to ensure the process remains under control when perturbed by external disturbances. Two reactions were studied: a Knoevenagel condensation of barbituric acid and vanillin, and the direct amidation of hydroquinone by ammonium acetate to form N-acetyl-para-aminophenol (APAP), commonly known as paracetamol. Both reactions could be carried out continuously using TSE; nuclear magnetic resonance (NMR) spectroscopy was used to confirm the percentage conversion of starting materials to product. This information was used to construct partial least squares (PLS) calibration models within the PharmaMV development system, which relate the percentage conversion to the acquired UV-Vis spectrum. Once this was complete, the model was deployed within the PharmaMV Real-Time System to carry out automated optimization experiments to maximize the percentage conversion over a set of process parameters in a design of experiments (DoE) style methodology. With the optimum set of process parameters established, a series of PRBS (pseudo-random binary sequence) process response tests around the optimum were conducted. The resulting dataset was used to build a statistical model and an associated MPC. The controller maximizes product quality while ensuring the process remains at the optimum even as disturbances, such as raw material variability, are introduced into the system. To summarize, a combination of online spectral monitoring and advanced process control was used to develop a robust system for the optimization and control of two TSE-based mechanosynthetic processes.
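
The PLS calibration step, relating spectra to NMR-confirmed conversion, can be sketched generically. The spectra below are synthetic placeholders, and the component count is an assumption; the real models were built inside the PharmaMV development system.

```python
# Illustrative PLS calibration: predict percentage conversion from
# UV-Vis spectra using partial least squares regression.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
spectra = rng.normal(size=(50, 256))          # 50 samples x 256 wavelengths
conversion = spectra[:, 40] * 30 + 60 + rng.normal(0, 1, 50)  # % conversion

pls = PLSRegression(n_components=3)           # latent-variable count assumed
pls.fit(spectra, conversion)
print(pls.predict(spectra[:3]))               # predicted % conversion
```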

Keywords: continuous synthesis, pharmaceutical, spectroscopy, advanced process control

Procedia PDF Downloads 179
19725 A Tutorial on Model Predictive Control for Spacecraft Maneuvering Problem with Theory, Experimentation and Applications

Authors: O. B. Iskender, K. V. Ling, V. Dubanchet, L. Simonini

Abstract:

This paper discusses the recent advances and future prospects of spacecraft position and attitude control using Model Predictive Control (MPC). First, the challenges of space missions are summarized, in particular the errors, uncertainties, and constraints imposed by the mission, the spacecraft, and onboard processing capabilities. The space mission errors and uncertainties are summarized in categories: initial condition errors, unmodeled disturbances, and sensor and actuator errors. The constraints are classified into two categories: physical and geometric. Real-time implementation capability is then discussed with regard to the required computation time and the impact of sensor and actuator errors, based on Hardware-In-The-Loop (HIL) experiments. The rationales behind the scenarios are also presented in the scope of space applications such as formation flying, attitude control, rendezvous and docking, rover steering, and precision landing. The objectives of these missions are explained, and the generic constrained MPC problem formulations are summarized. Three key design elements used in MPC design are discussed: the prediction model, the constraint formulation, and the objective cost function. The prediction models can be linear time-invariant or time-varying depending on the geometry of the orbit, whether circular or elliptic. The constraints can be given as linear inequalities for inputs or outputs, which can be written in the same form. Moreover, recent convexification techniques for the non-convex geometric constraints (i.e., plume impingement, Field-of-View (FOV)) are presented in detail. Next, different objectives are provided in a mathematical framework and explained accordingly. Thirdly, because MPC implementation relies on finding, in real time, the solution to constrained optimization problems, computational aspects are also examined. In particular, high-speed implementation capabilities and HIL challenges are presented with a view towards representative space avionics. This covers an analysis of future space processors as well as the requirements on sensors and actuators derived from the HIL experiment outputs. HIL tests are investigated for kinematic and dynamic tests, where robotic arms and floating robots are used, respectively. Eventually, the proposed algorithms and experimental setups are introduced and compared with the authors' previous work and future plans. The paper concludes with the conjecture that the MPC paradigm is a promising framework at the crossroads of space applications and could be further advanced based on the challenges mentioned throughout the paper and the unaddressed gaps.
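
The generic constrained MPC problem summarized above, with the three design elements made explicit, can be stated as follows; the quadratic cost and linear inequality form are the standard template, not a formulation unique to this paper.

```latex
% Finite-horizon MPC problem: prediction model, constraints, cost.
\[
  \min_{u_0,\dots,u_{N-1}}
    \sum_{k=0}^{N-1}
      \left( \| x_k - x_k^{ref} \|_Q^2 + \| u_k \|_R^2 \right)
      + \| x_N - x_N^{ref} \|_P^2
\]
\[
  \text{s.t.} \quad x_{k+1} = A_k x_k + B_k u_k, \qquad
  C x_k \le c, \qquad D u_k \le d,
\]
% where the (possibly time-varying) matrices A_k, B_k encode the orbit
% geometry (circular or elliptic), and the linear inequalities collect
% the convexified input, output, and geometric constraints.
```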

Keywords: convex optimization, model predictive control, rendezvous and docking, spacecraft autonomy

Procedia PDF Downloads 111
19724 Reflection on Using Bar Model Method in Learning and Teaching Primary Mathematics: A Hong Kong Case Study

Authors: Chui Ka Shing

Abstract:

This case study research attempts to examine the use of the Bar Model Method approach in learning and teaching mathematics in a primary school in Hong Kong. The objectives of the study are to find out to what extent (a) the Bar Model Method approach enhances the construction of students’ mathematics concepts, and (b) a school-based mathematics curriculum can be developed around the Bar Model Method approach. This case study illuminates the effectiveness of using the Bar Model Method to solve mathematics problems from Primary 1 to Primary 6. Some effective pedagogies and assessments were developed to strengthen the use of the Bar Model Method across year levels. Suggestions, including school-based curriculum development for using the Bar Model Method, and directions for further study are discussed.

Keywords: bar model method, curriculum development, mathematics education, problem solving

Procedia PDF Downloads 221
19723 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate

Authors: Angela Maria Fasnacht

Abstract:

Detection of anomalies due to the presence of contaminants, also known as early detection, in water treatment plants has become a critical topic that deserves in-depth study for its improvement and adaptation to current requirements. The design of these systems requires detailed analysis and processing of the data in real time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman's correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment or analysis can be considered a vital step in developing advanced machine learning models that allow optimized data management in real time, applied to early detection systems in water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify possible correlations between the measured parameters and the presence of the glyphosate contaminant in a single-pass system. The effect of the initial glyphosate concentration and the location of the sensors on the readings of the reported parameters was studied.
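
Two of the named statistical steps are easy to illustrate: Spearman correlation between each sensor parameter and glyphosate concentration, and k-fold cross-validation of a predictive model. The data, parameter names, and the linear model below are placeholders, not the study's measurements.

```python
# Sketch: Spearman screening of sensor parameters, then 5-fold CV of a
# simple predictive model, on synthetic data.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
params = {"pH": rng.normal(7, 0.2, 100),
          "conductivity": rng.normal(500, 30, 100),
          "ORP": rng.normal(650, 40, 100)}
glyphosate = 0.02 * params["ORP"] + rng.normal(0, 1, 100)  # synthetic link

for name, values in params.items():
    rho, p = spearmanr(values, glyphosate)
    print(f"{name}: Spearman rho = {rho:.2f} (p = {p:.3f})")

X = np.column_stack(list(params.values()))
scores = cross_val_score(LinearRegression(), X, glyphosate,
                         cv=5, scoring="r2")
print("5-fold CV R^2:", scores.round(2))
```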

Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive

Procedia PDF Downloads 124
19722 Mapping Actors in Sao Paulo's Urban Development Policies: Interests at Stake in the Challenge to Sustainability

Authors: A. G. Back

Abstract:

In the context of global climate change, extreme weather events are increasingly intense and frequent, challenging the adaptability of urban space. In this sense, urban planning is a relevant instrument for addressing, in a systemic manner, the various sectoral policies capable of linking the urban agenda to the reduction of social and environmental risks. The Master Plan of the Municipality of Sao Paulo, 2014, presents innovations capable of promoting the transition to sustainability in urban space. Among such innovations, the following stand out: i) promotion of density along the axes of mass transport, involving a mixture of commercial, residential, service, and leisure uses (principles related to the compact city); ii) vulnerability reduction based on housing policies, including regular sources of funds for social housing and land reserves in urbanized areas; iii) reservation of green areas in the city to create parks, and environmental regulations for new buildings focused on reducing heat island effects and improving urban drainage. However, long-term implementation involves distributive conflicts and may change under different political, economic, and social contexts over time. Thus, the central objective of this paper is to identify which factors limit or support the implementation of these policies, that is, to map the challenges and interests of converging and/or diverging urban actors in the sustainable urban development agenda and the resources they mobilize to support or limit these actions in the city of Sao Paulo. Recent proposals to amend the urban zoning law undermine the implementation of the Master Plan guidelines. In this context, three interest groups with different views of the city come into dispute: the real estate market, upper-middle-class neighborhood associations ('not in my backyard' movements), and social housing rights movements. This paper surveys the different interests and visions of these groups, taking into account their convergence, or lack thereof, with the principles of sustainable urban development. This approach seeks to fill a gap in the international literature on the causes that underpin or hinder the continued implementation of policies aimed at the transition to urban sustainability in the medium and long term.

Keywords: adaptation, ecosystem-based adaptation, interest groups, urban planning, urban transition to sustainability

Procedia PDF Downloads 122
19721 Augmented Reality and Its Impact on Education

Authors: Aliakbar Alijarahi, Ali Khaleghi, Azadehe Afrasiyabi

Abstract:

Augmented reality is one of the emerging technologies in the field of education that can be used to great effect: the combination of the real world and virtual images in real time produces new concepts that can facilitate learning. The paper, providing an introduction to the general concept of augmented reality, aims at surveying its capabilities in different areas, with an emphasis on education. It also seems quite necessary to compare virtual/e-learning with augmented reality and to draw out their differences as education methods. As a review article, the paper is written not to produce new concepts but to sum up and analyze accomplished works related to the subject.

Keywords: augmented reality, education, virtual learning, e-learning

Procedia PDF Downloads 341
19720 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS codes have been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modeling of the data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms, the trust-region method is found to be the best. In this paper, the TPGE model is also used to analyze lifetime data in the Bayesian paradigm. Results are evaluated on the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.

Keywords: Bayesian inference, JAGS, Laplace approximation, LaplacesDemon, posterior, R software, simulation

Procedia PDF Downloads 535
19719 Alternating Current Photovoltaic Module Model

Authors: Irtaza M. Syed, Kaamran Raahemifar

Abstract:

This paper presents the modeling of an Alternating Current (AC) Photovoltaic (PV) module using MATLAB/Simulink. The proposed AC-PV module model is simple, realistic, and application-oriented. The model is derived at the module level, as compared to the cell level, directly from the information provided by the manufacturer's datasheet. The DC-PV module, MPPT control, BC, VSI, and LC filter were all treated as a single unit. The model accounts for variations in both irradiance and temperature. The proposed AC-PV module model is simulated, and the results are compared with the datasheet's projected numbers to validate the model's accuracy and effectiveness. Implementation and results demonstrate the simplicity and accuracy, as well as the reliability, of the model.
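
The DC side of such a module model is commonly built on the single-diode equation; the paper does not print its equations, so the standard form is given below for orientation rather than as the authors' exact model.

```latex
% Single-diode PV model at module level:
\[
  I = I_{ph} - I_0 \left[ \exp\!\left( \frac{V + I R_s}{n\,N_s V_t} \right) - 1 \right]
      - \frac{V + I R_s}{R_{sh}},
\]
% with photocurrent I_{ph}, diode saturation current I_0, ideality
% factor n, N_s cells in series, thermal voltage V_t = kT/q, and series
% and shunt resistances R_s, R_{sh}. I_{ph} scales with irradiance,
% while I_0 and V_t vary with temperature, which is how the model tracks
% both operating conditions.
```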

Keywords: PV modeling, AC PV module, datasheet, V-I curves, irradiance, temperature, MPPT, MATLAB/Simulink

Procedia PDF Downloads 575
19718 A Method for Reduction of Association Rules in Data Mining

Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa

Abstract:

The use of association rule algorithms within data mining is recognized as being of great value in knowledge discovery in databases. Very often the number of rules generated is high, sometimes even in databases of small volume, so success in the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated with association algorithms. To this end, a computational algorithm was developed using the Weka Application Programming Interface, which allows the execution of the method on different types of databases. After development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where the worst case presented a gain of more than 50%, considering the concepts of support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the results of association rules generated by these algorithms.
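
The reduction idea can be illustrated with a threshold filter over the three named measures. The rule format and cut-off values below are illustrative; the authors' exact reduction logic runs through the Weka API and is not reproduced here.

```python
# Minimal rule-reduction sketch: keep only association rules whose
# support, confidence, and lift clear given thresholds.
def reduce_rules(rules, min_support=0.1, min_confidence=0.7, min_lift=1.2):
    """rules: list of dicts with 'support', 'confidence', 'lift' keys."""
    return [r for r in rules
            if r["support"] >= min_support
            and r["confidence"] >= min_confidence
            and r["lift"] >= min_lift]

rules = [
    {"rule": "bread -> butter", "support": 0.20, "confidence": 0.80, "lift": 1.6},
    {"rule": "milk -> eggs",    "support": 0.05, "confidence": 0.90, "lift": 1.1},
]
print(reduce_rules(rules))   # only the first rule survives
```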

Keywords: data mining, association rules, rules reduction, artificial intelligence

Procedia PDF Downloads 161
19717 Model of Optimal Centroids Approach for Multivariate Data Classification

Authors: Pham Van Nha, Le Cam Binh

Abstract:

Particle swarm optimization (PSO) is a population-based stochastic optimization algorithm inspired by the natural behavior of birds and fish in migration and foraging for food. PSO is considered a multidisciplinary optimization model that can be applied to various optimization problems. PSO's ideas are simple and easy to understand, but PSO has mostly been applied to simple model problems. We think that in order to expand the applicability of PSO to complex problems, PSO should be described more explicitly in the form of a mathematical model. In this paper, we represent PSO as a mathematical model and apply it to multivariate data classification. First, PSO's general mathematical model (MPSO) is analyzed as a universal optimization model. Then, a Model of Optimal Centroids (MOC) is proposed for multivariate data classification. Experiments were conducted on some benchmark data sets to prove the effectiveness of MOC compared with several proposed schemes.

Keywords: analysis of optimization, artificial intelligence based optimization, optimization for learning and data analysis, global optimization

Procedia PDF Downloads 209
19716 Using Machine Learning as an Alternative for Predicting Exchange Rates

Authors: Pedro Paulo Galindo Francisco, Eli Dhadad Junior

Abstract:

This study addresses the Meese-Rogoff puzzle by introducing the latest machine learning techniques as alternatives for predicting exchange rates. Using RMSE as a comparison metric, Meese and Rogoff discovered that economic models are unable to outperform the random walk model as short-term exchange rate predictors. Decades after this study, no statistical prediction technique has proven effective in overcoming this obstacle; although there were positive results, they did not apply to all currencies and defined periods. Recent advancements in artificial intelligence technologies have paved the way for a new approach to exchange rate prediction. Leveraging this technology, we applied five machine learning techniques in an attempt to overcome the Meese-Rogoff puzzle. We considered daily data for the Brazilian real, yen, British pound, euro, and Chinese yuan against the US dollar over a time horizon from 2010 to 2023. Our results showed that none of the presented techniques was able to produce an RMSE lower than the random walk model. However, some models, particularly LSTM and N-BEATS, were able to outperform the ARIMA model. The results also suggest that machine learning models have untapped potential and could represent an effective long-term possibility for overcoming the Meese-Rogoff puzzle.
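
The benchmark every candidate model must beat is easy to state in code: the random walk's forecast for tomorrow's rate is simply today's rate, scored by RMSE. The series below is a synthetic placeholder.

```python
# The Meese-Rogoff benchmark in code: one-step random walk forecast
# scored by RMSE, on a synthetic daily exchange rate series.
import numpy as np

def rmse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

rng = np.random.default_rng(7)
rate = np.cumsum(rng.normal(0, 0.01, 500)) + 5.0   # synthetic daily rate

rw_forecast = rate[:-1]          # predict x_{t+1} with x_t
actual = rate[1:]
print("Random walk RMSE:", rmse(actual, rw_forecast))
# Any candidate model's one-step forecasts are scored the same way and
# compared against this number.
```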

Keywords: exchange rate, prediction, machine learning, deep learning

Procedia PDF Downloads 32
19715 A Review of Permanent Downhole Monitoring Systems

Authors: Jing Hu, Dong Yang

Abstract:

With the increasingly difficult development and operating environment of exploration, there are many new challenges and difficulties in developing and exploiting oil and gas resources. These include the ability to dynamically monitor wells and to provide data and assurance for the completion and production of high-cost and complex wells. A key technology in providing these assurances and maximizing oilfield profitability is real-time permanent reservoir monitoring. Optical fiber sensing systems have gradually begun to replace traditional electronic systems. Traditional temperature sensors can only achieve single-point temperature monitoring, but fiber optic sensing systems based on the Bragg grating principle have a high level of reliability, accuracy, stability, and resolution, enabling cost-effective monitoring that can be done in real time, at any time, and without well intervention; continuous data acquisition is performed along the entire wellbore. An integrated package with a downhole pressure gauge, packer, and surface system can also realize real-time dynamic monitoring of pressure in selected downhole sections, avoiding oil well intervention and eliminating the production delays and operational risks of conventional surveys. Real-time information obtained through permanent optical fibers can also provide critical reservoir monitoring data for production and recovery optimization.
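
The Bragg grating principle mentioned above rests on one well-known relation, restated here for the reader.

```latex
% Fiber Bragg grating reflection condition:
\[
  \lambda_B = 2\, n_{eff}\, \Lambda ,
\]
% where n_{eff} is the effective refractive index of the fiber core and
% \Lambda the grating period. Temperature and strain shift \lambda_B, so
% tracking the reflected wavelength of gratings written along the fiber
% yields distributed downhole measurements without well intervention.
```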

Keywords: PDHM, optical fiber, coiled tubing, photoelectric composite cable, digital-oilfield

Procedia PDF Downloads 79
19714 Lean Impact Analysis Assessment Models: Development of a Lean Measurement Structural Model

Authors: Catherine Maware, Olufemi Adetunji

Abstract:

The paper is aimed at developing a model to measure the impact of Lean manufacturing deployment on organizational performance. The model will help industry practitioners assess the impact of implementing Lean constructs on organizational performance. It will also harmonize the measurement models of Lean performance with the house of Lean, which seems to have become the industry standard. The sheer number of measurement models for impact assessment of Lean implementation makes it difficult for new adopters to select an appropriate assessment model or deployment methodology. A literature review is conducted to classify the Lean performance models, and Pareto analysis is used to select the Lean constructs for the development of the model. The model is further formalized through the use of Structural Equation Modeling (SEM) to define the underlying latent structure of a Lean system. The impact assessment measurement model developed can be used to measure Lean performance and can be adopted by different industries.

Keywords: impact measurement model, lean bundles, lean manufacturing, organizational performance

Procedia PDF Downloads 485
19713 Towards Automatic Calibration of In-Line Machine Processes

Authors: David F. Nettleton, Elodie Bugnicourt, Christian Wasiak, Alejandro Rosales

Abstract:

In this presentation, preliminary results are given for the modeling and calibration of two different industrial winding MIMO (Multiple Input Multiple Output) processes using machine learning techniques. In contrast to previous approaches, which have typically used 'black-box' linear statistical methods together with a definition of the mechanical behavior of the process, we use non-linear machine learning algorithms together with a 'white-box' rule induction technique to create a supervised model of the fitting error between the expected and real force measures. The final objective is to build a precise model of the winding process in order to control the tension of the material being wound, in the first case, and the friction of the material passing through the die, in the second case. Case 1, tension control of a winding process: a plastic web is unwound from a first reel, goes over a traction reel, and is rewound on a third reel. The objectives are (i) to train a model to predict the web tension and (ii) calibration, i.e., to find the input values which result in a given tension. Case 2, friction force control of a micro-pullwinding process: a core plus resin passes through a first die, two winding units wind an outer layer around the core, and a final pass is made through a second die. The objectives are (i) to train a model to predict the friction on die 2 and (ii) calibration, i.e., to find the input values which result in a given friction on die 2. Different machine learning approaches are tested to build the models: Kernel Ridge Regression, Support Vector Regression (with a Radial Basis Function kernel), and MPART (rule induction with a continuous-valued output). As a preliminary step, the MPART rule induction algorithm was used to build an explicative model of the error (the difference between the expected and real friction on die 2); modeling the error behavior using explicative rules helps improve the overall process model. Once the models are built, the inputs are calibrated by generating Gaussian random numbers for each input (taking into account its mean and standard deviation) and comparing the output to a target (desired) output until the closest fit is found. The results of empirical testing show that high precision is obtained for the trained models and for the calibration process. The learning step is the slowest part of the process (max. 5 minutes for this data), but this can be done offline just once. The calibration step is much faster and, in under one minute, obtained a precision error of less than 1x10^-3 for both outputs. To summarize, in the present work two processes have been modeled and calibrated. Fast processing times and high precision have been achieved, which can be further improved by using heuristics to guide the Gaussian calibration. Error behavior has been modeled to help improve overall process understanding. This has relevance for the quick optimal set-up of many different industrial processes that use a pull-winding type process to manufacture fibre-reinforced plastic parts. Acknowledgements to the Openmind project, which is funded by Horizon 2020 European Union funding for Research & Innovation, Grant Agreement number 680820.
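
The Gaussian calibration loop described above is compact enough to sketch directly. The process model below is a stand-in for the trained regression model, and the means, standard deviations, and sample count are illustrative.

```python
# Sketch of the calibration step: sample each input from a Gaussian
# around its nominal operating point, evaluate the trained process
# model, and keep the inputs whose predicted output is closest to the
# target value.
import numpy as np

def calibrate(model, means, stds, target, n_samples=100_000, seed=0):
    rng = np.random.default_rng(seed)
    candidates = rng.normal(means, stds, size=(n_samples, len(means)))
    outputs = model(candidates)
    best = np.argmin(np.abs(outputs - target))
    return candidates[best], outputs[best]

# Stand-in process model: predicted friction from two screw parameters.
model = lambda X: 0.8 * X[:, 0] + 0.1 * X[:, 1] ** 2
inputs, out = calibrate(model, means=[2.0, 1.0], stds=[0.3, 0.2], target=2.5)
print(inputs, out)
```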

Keywords: data model, machine learning, industrial winding, calibration

Procedia PDF Downloads 241
19712 Crack Detection and Measurement Using VLP-16 LiDAR and Intel Depth Camera D435 in Real-Time

Authors: Xinwen Zhu, Xingguang Li, Sun Yi

Abstract:

Cracks are among the most common forms of damage in buildings, bridges, roads, and other structures, and they may pose safety hazards. Cracks occur frequently in structures of various materials. Traditional methods of manual detection and measurement, which are known to be subjective, time-consuming, and labor-intensive, are gradually becoming unable to meet the needs of modern development. In addition, crack detection and measurement need to be safe, considering space limitations and danger. Intelligent crack detection has therefore become a necessary line of research. In this paper, an efficient method for crack detection and quantification using a 3D sensor, a LiDAR, and a depth camera is proposed. This method works even in a dark environment, which is common in real-world applications. The LiDAR rapidly spins to scan the surrounding environment and discovers cracks through thousands of laser measurements per second, providing a rich 3D point cloud in real time. The LiDAR provides quite accurate depth information: the distance of each point can be determined to within around ±3 cm, and top-range models can see beyond 100 m. However, this accuracy is still insufficient for some high-precision structures and materials. To measure crack depth much more accurately, the depth camera is needed, and the cracks are scanned by the depth camera at the same time. Finally, all data from the LiDAR and depth camera are analyzed, and the size of the cracks can be quantified successfully. The comparison shows that the minimum and mean absolute percentage errors between the measured and calculated widths are about 2.22% and 6.27%, respectively. The experiments and results are presented in this paper.
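
For clarity, the error metric quoted above is the standard mean absolute percentage error over the n measured crack widths.

```latex
% Mean absolute percentage error between measured and calculated widths:
\[
  \mathrm{MAPE} = \frac{100\%}{n} \sum_{i=1}^{n}
    \left| \frac{w_i^{meas} - w_i^{calc}}{w_i^{meas}} \right| .
\]
```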

Keywords: LiDAR, depth camera, real-time, detection and measurement

Procedia PDF Downloads 227