Search results for: context based planning model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 42432

40242 Model Predictive Control Using Thermal Inputs for Crystal Growth Dynamics

Authors: Takashi Shimizu, Tomoaki Hashimoto

Abstract:

Recently, crystal growth technologies have progressed, driven by the demand for high-quality crystal materials. Actively controlling crystal growth dynamics by external forces is useful for reducing composition non-uniformity. In this study, a control method based on model predictive control using thermal inputs is proposed for the crystal growth dynamics of semiconductor materials. The control system of crystal growth dynamics considered here is governed by the continuity, momentum, energy, and mass transport equations. To establish a control method for such thermal fluid systems, we adopt model predictive control, a kind of optimal feedback control in which the control performance over a finite future is optimized with a performance index that has a moving initial time and terminal time. The objective of this study is to establish a model predictive control method for the crystal growth dynamics of semiconductor materials.
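
The moving-horizon idea described above can be sketched in a few lines. The following is a minimal illustration, not the paper's method: a scalar linear thermal plant x[k+1] = a·x[k] + b·u[k] with assumed coefficients, where at each step a finite-horizon tracking problem is solved and only the first input is applied.

```python
import numpy as np

# Minimal receding-horizon (model predictive) control sketch for a scalar
# thermal system x[k+1] = a*x[k] + b*u[k]; all numbers are illustrative
# assumptions, not taken from the paper.
a, b = 0.9, 0.5          # assumed plant dynamics
N = 10                   # prediction horizon
x_ref = 1.0              # target (e.g. a normalized temperature)

def mpc_step(x0):
    # Stack predictions x[j] = a^j x0 + sum_i a^(j-1-i) b u[i] and solve
    # the least-squares problem min ||x - x_ref||^2 over the input sequence.
    A = np.array([[a ** (j - 1 - i) * b if i < j else 0.0 for i in range(N)]
                  for j in range(1, N + 1)])
    r = np.array([x_ref - a ** j * x0 for j in range(1, N + 1)])
    u = np.linalg.lstsq(A, r, rcond=None)[0]
    return u[0]          # apply only the first input (moving horizon)

x = 0.0
for _ in range(30):
    x = a * x + b * mpc_step(x)
```

Because only the first input of each optimized sequence is applied before re-optimizing, the initial and terminal times of the performance index move forward with the state, which is the defining feature of model predictive control.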

Keywords: model predictive control, optimal control, process control, crystal growth

Procedia PDF Downloads 361
40241 Management Information System to Help Managers for Providing Decision Making in an Organization

Authors: Ajayi Oluwasola Felix

Abstract:

A management information system (MIS) provides information for the managerial activities in an organization. The main premise of this research is that an MIS provides the accurate and timely information necessary to facilitate the decision-making process and enables an organization's planning, control, and operational functions to be carried out effectively. An MIS is basically concerned with processing data into information, which is then communicated to the various departments in an organization for appropriate decision-making. MIS is a subset of the overall planning and control activities covering the application of people, technologies, and procedures of the organization. The information system is the mechanism that ensures information is available to the managers in the form they want it and when they need it.

Keywords: management information systems (MIS), information technology, decision-making, MIS in organizations

Procedia PDF Downloads 561
40240 Reliability Prediction of Tires Using Linear Mixed-Effects Model

Authors: Myung Hwan Na, Ho-Chun Song, EunHee Hong

Abstract:

The normal linear mixed-effects model is widely used to analyze repeated-measurement data. When heteroscedasticity and non-normality of the population distribution are detected at the same time, however, the normal linear mixed-effects model can give improper analysis results. To achieve more robust estimation, we use a heavy-tailed linear mixed-effects model, which gives more exact and reliable conclusions than the standard normal linear mixed-effects model.
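
The robustness argument can be illustrated with a toy location-estimation problem. This is a hedged sketch, not the paper's tire model: a Student-t (heavy-tailed) likelihood is fitted by iteratively reweighted least squares and compared with the normal-model estimate (the sample mean) on data containing outliers; the degrees of freedom and data are assumed for illustration.

```python
import numpy as np

# Location estimation under a heavy-tailed (Student-t) model via
# iteratively reweighted least squares, compared with the normal-model
# estimate (the mean). Data and degrees of freedom are illustrative.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(10.0, 1.0, 50), [40.0, 45.0]])  # 2 outliers

def t_location(x, nu=3.0, iters=50):
    mu = np.median(x)
    for _ in range(iters):
        w = (nu + 1.0) / (nu + (x - mu) ** 2)   # downweights outliers
        mu = np.sum(w * x) / np.sum(w)
    return mu

normal_fit = data.mean()       # pulled toward the outliers
robust_fit = t_location(data)  # stays near the bulk of the data
```

The heavy-tailed fit stays near the bulk of the data while the normal-model estimate is dragged upward by the two outliers, which is the behavior the abstract relies on.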

Keywords: reliability, tires, field data, linear mixed-effects model

Procedia PDF Downloads 565
40239 Development of a Model Based on Wavelets and Matrices for the Treatment of Weakly Singular Partial Integro-Differential Equations

Authors: Somveer Singh, Vineet Kumar Singh

Abstract:

We present a new model based on viscoelasticity for non-Newtonian fluids. We use a matrix-formulated algorithm to approximate solutions of a class of partial integro-differential equations with given initial and boundary conditions. Some numerical results are presented to simplify application of the operational matrix formulation and reduce the computational cost. Convergence analysis, error estimation, and numerical stability of the method are also investigated. Finally, some test examples are given to demonstrate the accuracy and efficiency of the proposed method.
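
The role of an operational matrix can be shown with a deliberately simple basis. The sketch below uses monomials rather than Legendre wavelets (an assumption made for brevity): integration of a function given by its coefficient vector reduces to a single matrix product, which is what keeps the computational cost low.

```python
import numpy as np

# Illustration of the operational-matrix idea (with a monomial basis
# rather than Legendre wavelets, for brevity): integrating a function
# expressed by its coefficient vector is one matrix multiplication.
n = 5
P = np.zeros((n, n))
for k in range(n - 1):
    P[k, k + 1] = 1.0 / (k + 1)   # since the integral of t^k is t^(k+1)/(k+1)

c = np.array([3.0, 2.0, 0.0, 0.0, 0.0])   # f(t) = 3 + 2t
c_int = c @ P                              # coefficients of 3t + t^2
```

In the paper's setting the same trick is played in a Legendre-wavelet basis, so differentiation, integration, and the weakly singular kernel all become constant matrices acting on coefficient vectors.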

Keywords: Legendre Wavelets, operational matrices, partial integro-differential equation, viscoelasticity

Procedia PDF Downloads 340
40238 Evaluating Radiative Feedback Mechanisms in Coastal West Africa Using Regional Climate Models

Authors: Akinnubi Rufus Temidayo

Abstract:

Coastal West Africa is highly sensitive to climate variability, driven by complex ocean-atmosphere interactions that shape temperature, precipitation, and extreme weather. Radiative feedback mechanisms—such as water vapor feedback, cloud-radiation interactions, and surface albedo—play a critical role in modulating these patterns. Yet, limited research addresses these feedbacks in climate models specific to West Africa’s coastal zones, creating challenges for accurate climate projections and adaptive planning. This study aims to evaluate the influence of radiative feedbacks on the coastal climate of West Africa by quantifying the effects of water vapor, cloud cover, and sea surface temperature (SST) on the region’s radiative balance. The study uses a regional climate model (RCM) to simulate feedbacks over a 20-year period (2005-2025) with high-resolution data from CORDEX and satellite observations. Key mechanisms investigated include (1) Water Vapor Feedback—the amplifying effect of humidity on warming, (2) Cloud-Radiation Interactions—the impact of cloud cover on radiation balance, especially during the West African Monsoon, and (3) Surface Albedo and Land-Use Changes—effects of urbanization and vegetation on the radiation budget. Preliminary results indicate that radiative feedbacks strongly influence seasonal climate variability in coastal West Africa. Water vapor feedback amplifies dry-season warming, cloud-radiation interactions moderate surface temperatures during monsoon seasons, and SST variations in the Atlantic affect the frequency and intensity of extreme rainfall events. The findings suggest that incorporating these feedbacks into climate planning can strengthen resilience to climate impacts in West African coastal communities. Further research should refine regional models to capture anthropogenic influences like greenhouse gas emissions, guiding sustainable urban and resource planning to mitigate climate risks.

Keywords: west africa, radiative feedback, climate, resilience, anthropogenic

Procedia PDF Downloads 18
40237 Planning Fore Stress II: Study on Resiliency of New Architectural Patterns in Urban Scale

Authors: Amir Shouri, Fereshteh Tabe

Abstract:

Thoughtful and sequential master planning and urban infrastructure design strategies play a major role in reducing the damage that natural disasters, war, and social or population-related conflicts inflict on cities. Defensive strategies have been revised throughout the history of mankind after damage from natural depressions, war experiences, and terrorist attacks on cities. Lessons have been learnt from earthquakes, from the casualties of the two world wars of the 20th century, and from terrorist activities of all times. Particularly after Hurricane Sandy in New York in 2012 and the September 11th attack on New York's World Trade Center (WTC), there have been a series of serious collaborations between law-making authorities, urban planners, architects, and defence-related organizations, firstly to prepare for and/or prevent such events and secondly to reduce the human loss and economic damage to a minimum. This study will work on developing a model of planning for New York City in which citizens experience minimum impact in times of threat and minimal economic damage to the city after the stress has passed. The main discussion in this proposal will focus on pre-hazard, hazard-time, and post-hazard transformative policies and strategies that will reduce life casualties and ease economic recovery in post-hazard conditions. This proposal scrutinizes whether one of the key solutions might be focusing on all overlapping possibilities on the architectural platforms of three fundamental infrastructures, transportation, power-related sources, and defensive abilities, within a dynamic-transformative framework that provides maximum safety, a high level of flexibility, and the fastest action-reaction opportunities in stressful periods of time. "Planning Fore Stress" will be carried out in an analytical, qualitative, and quantitative framework that studies cases from all over the world.
Technology, organic design, materiality, urban forms, city politics, and sustainability will be discussed for different cases at an international scale. From the modern strategies of Copenhagen for living in harmony with nature to the traditional approaches of old Indonesian urban planning patterns, from the "Iron Dome" of Israel to the "Tunnels" in Gaza, from the "ultra-high-performance quartz-infused concrete" of Iran to the peaceful and nature-friendly strategies of Switzerland, from "urban geopolitics" in cities, war, and terrorism to the "design of sustainable cities" in the world, cases will be studied with references and a detailed analysis of each in order to propose the most resourceful, practical, and realistic solutions to questions on "new city divisions", "new city planning and social activities", and "new strategic architecture for safe cities". This study is a developed version of a proposal that was announced as a winner at MoMA in 2013 in the call for ideas for Rockaway after Hurricane Sandy.

Keywords: urban scale, city safety, natural disaster, war and terrorism, city divisions, architecture for safe cities

Procedia PDF Downloads 488
40236 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Orlin Davchev

Abstract:

The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results show a substantial reduction in the time and resources required for BIM model generation while maintaining high levels of accuracy and detail.
This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
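
One building block of such a pipeline, detecting a planar element (e.g., a wall) in a point cloud, can be sketched with a plain RANSAC loop. This is an illustrative stand-in with synthetic data, not the authors' algorithm:

```python
import numpy as np

# Detect a planar element in a synthetic point cloud with basic RANSAC:
# sample 3 points, fit a plane, count points within a distance threshold.
rng = np.random.default_rng(1)
wall = np.c_[np.zeros(200), rng.uniform(0, 5, 200), rng.uniform(0, 3, 200)]
noise = rng.uniform(0, 5, (50, 3))                 # clutter points
cloud = np.vstack([wall + rng.normal(0, 0.01, wall.shape), noise])

best_inliers = 0
for _ in range(100):
    p0, p1, p2 = cloud[rng.choice(len(cloud), 3, replace=False)]
    normal = np.cross(p1 - p0, p2 - p0)
    if np.linalg.norm(normal) < 1e-9:              # degenerate sample
        continue
    normal = normal / np.linalg.norm(normal)
    d = np.abs((cloud - p0) @ normal)              # point-to-plane distances
    best_inliers = max(best_inliers, int((d < 0.05).sum()))
```

A real pipeline would then classify the detected plane (wall, floor, roof) and instantiate a parametric BIM element from its extent, but the inlier-consensus step above is the geometric core.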

Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction

Procedia PDF Downloads 70
40235 An Ontology Model for Systems Engineering Derived from ISO/IEC/IEEE 15288: 2015: Systems and Software Engineering - System Life Cycle Processes

Authors: Lan Yang, Kathryn Cormican, Ming Yu

Abstract:

ISO/IEC/IEEE 15288:2015, Systems and Software Engineering - System Life Cycle Processes, is an international standard that provides generic top-level process descriptions to support systems engineering (SE). However, the processes defined in the standard need refinement to improve their integrity and consistency. The goal of this research is to address this by building an ontology model for the SE standard to manage SE knowledge. The ontology model gives a whole picture of the SE knowledge domain by building connections between SE concepts. Moreover, it creates a hierarchical classification of the concepts to fulfil different requirements for displaying and analysing SE knowledge.

Keywords: knowledge management, model-based systems engineering, ontology modelling, systems engineering ontology

Procedia PDF Downloads 430
40234 SQL Generator Based on MVC Pattern

Authors: Chanchai Supaartagorn

Abstract:

Structured Query Language (SQL) is the de facto standard language for accessing and manipulating data in a relational database. Although SQL is simple and powerful, most novice users have trouble with SQL syntax. Thus, we present a SQL generator tool that is capable of translating user actions and displaying SQL commands and data sets simultaneously. The tool was developed based on the Model-View-Controller (MVC) pattern, a widely used software design pattern that enforces the separation between the input, processing, and output of an application. Developers take full advantage of it to reduce complexity in architectural design and to increase flexibility and reuse of code. In addition, we use white-box testing for code verification in the Model module.
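
The separation the MVC pattern enforces can be sketched in a few lines of Python. This is a hypothetical miniature, not the tool described in the abstract: the Model builds the SQL string, the View formats it for display, and the Controller translates a user action into calls on both.

```python
# Minimal sketch of an MVC-structured SQL generator (illustrative only;
# not the tool described in the paper).

class Model:                      # processing: builds the SQL string
    def select(self, table, columns, where=None):
        sql = "SELECT {} FROM {}".format(", ".join(columns), table)
        if where:
            sql += " WHERE {}".format(where)
        return sql + ";"

class View:                       # output: formats the result for display
    def render(self, sql):
        return "Generated SQL: " + sql

class Controller:                 # input: maps user actions to model calls
    def __init__(self):
        self.model, self.view = Model(), View()

    def handle(self, action):
        sql = self.model.select(action["table"], action["columns"],
                                action.get("where"))
        return self.view.render(sql)

out = Controller().handle({"table": "students", "columns": ["name", "gpa"],
                           "where": "gpa > 3.0"})
```

Because SQL construction lives only in the Model, it can be exercised directly by white-box tests without going through the user interface, which is the testing arrangement the abstract describes.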

Keywords: MVC, relational database, SQL, White-Box testing

Procedia PDF Downloads 423
40233 Low Impact Development Strategies Applied in the Water System Planning in the Coastal Eco-Green Campus

Authors: Ying Li, Zaisheng Hong, Weihong Wang

Abstract:

With the rapid growth in the size of Chinese universities, newly built campuses have been springing up everywhere in recent years. Building eco-green campuses is urgent because the role of higher education institutions in the transition to a more sustainable society has been highlighted for almost three decades. Since a new campus is usually built on an undeveloped site, where the basic infrastructure is not complete, finding proper strategies for the planning and design of the campus becomes a primary concern. Low Impact Development (LID) options have been proposed as an alternative approach to make better use of rainwater in the planning and design of an undeveloped site. On the basis of analyzing the natural circumstances, geographic conditions, and other relevant information, four main LID approaches are coordinated in this study of Hebei Union University: 'Storage', 'Retaining', 'Infiltration', and 'Purification'. 'Storage' refers to a big central lake on the campus for rainwater harvesting. 'Retaining' means rain gardens scattered across the campus, also known as bioretention areas, which mimic naturally created pools of water to decrease surface runoff. 'Infiltration' is designed as grassed swales, which also serve as floodway channels. 'Purification' uses either natural or artificial wetland to reduce pollutants such as nitrogen and phosphorus in the water body. With the above-mentioned measures for the synthetic use of rainwater in this acid-and-alkali area of the coastal district, eco-green campus construction and ecological sustainability will be realized, offering enlightenment and reference for other projects.
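
A back-of-the-envelope water balance shows how the 'Storage' measure is sized. All areas and coefficients below are illustrative assumptions, not values from the campus plan:

```python
# Rough water balance for routing a design storm to a central lake.
# Every number here is an illustrative assumption.

catchment_area_m2 = 500_000        # assumed campus catchment area
runoff_coeff = 0.6                 # assumed composite runoff coefficient
design_rain_m = 0.08               # assumed design storm depth (80 mm)

# Runoff volume generated by the storm over the catchment.
runoff_m3 = catchment_area_m2 * runoff_coeff * design_rain_m

lake_area_m2 = 30_000              # assumed lake surface area
rise_m = runoff_m3 / lake_area_m2  # water-level rise if all runoff
                                   # is routed to the lake
```

A rise of under a metre for a heavy storm suggests a lake of this size can plausibly absorb the design event; the 'Retaining' and 'Infiltration' measures would reduce the effective runoff coefficient further.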

Keywords: newly built campus, low impact development, planning design, rainwater reuse

Procedia PDF Downloads 251
40232 Subpixel Corner Detection for Monocular Camera Linear Model Research

Authors: Guorong Sui, Xingwei Jia, Fei Tong, Xiumin Gao

Abstract:

Camera calibration is a fundamental issue in high-precision non-contact measurement, and it is necessary to analyze and study the reliability and application range of the linear model that is often used in camera calibration. According to the imaging features of monocular cameras, a camera model based on image pixel coordinates and three-dimensional space coordinates is built. Using our own customized template, the image pixel coordinates are obtained by a subpixel corner detection method. Without considering the aberration of the optical system, feature extraction and linearity analysis of the line segments in the template are performed. Moreover, the experiment is repeated 11 times while varying the measuring distance. Finally, the linearity of the camera is obtained by fitting the 11 groups of data. The measurement results show that the relative error does not exceed 1%, and the repeated measurement error is no more than 0.1 mm in magnitude. Meanwhile, it is found that the model shows some measurement differences across regions and object distances. The experimental results show that this linear model is simple and practical and has good linearity within a certain object distance. These results provide a powerful basis for establishing the linear model of a camera and will have potential value for actual engineering measurement.
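
The linearity analysis can be sketched as a least-squares fit of the pixel-to-space relationship on synthetic measurements; the gain, offset, and noise level below are assumptions, not the paper's calibration data.

```python
import numpy as np

# Fit the linear pixel-to-space relationship u = k*X + c on synthetic
# measurements and check the relative fitting error, mirroring the
# paper's linearity analysis (all numbers are illustrative).
rng = np.random.default_rng(2)
X = np.linspace(0, 100, 11)                        # space coordinate (mm)
u = 4.0 * X + 120.0 + rng.normal(0, 0.2, X.size)   # measured pixel coordinate

k, c = np.polyfit(X, u, 1)            # least-squares line fit
pred = k * X + c
rel_err = np.max(np.abs(pred - u) / u)   # worst relative deviation
```

With subpixel corner noise far smaller than the pixel values themselves, the relative error of the fitted line stays well under the 1% bound the abstract reports.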

Keywords: camera linear model, geometric imaging relationship, image pixel coordinates, three dimensional space coordinates, sub-pixel corner detection

Procedia PDF Downloads 280
40231 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome

Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler

Abstract:

Deep neural network (DNN) models are being explored in the clinical domain, following their recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to the multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting 1-year mortality of patients who underwent major cardiovascular procedures (MCVPs), using a temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by the Veterans Affairs Informatics and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created based on the patient history data in the 2-year window prior to the index MCVP, and a temporal image was created from these variables for each individual patient. To generate the explanation for the DNN model, we defined a new concept called the impact score, based on the impact of a clinical condition's presence/value on the predicted outcome. Like the (log) odds ratios reported by a logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black-box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The DNN model achieved an area under the curve (AUC) of 78.5%, while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman's rho = 0.74), which helped validate our explanation.
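
The impact-score idea, measuring how the presence of a condition shifts the predicted outcome, can be sketched on a toy model. Here a hand-set logistic model stands in for the DNN (the weights are assumptions); for a linear model the score recovers the log odds ratio exactly, which is why the two quantities were expected to correlate.

```python
import numpy as np

# Toy 'impact score': flip one binary feature on/off across a cohort and
# average the change in predicted log-odds. The model here is a hand-set
# logistic model standing in for the paper's DNN.
w = np.array([1.2, -0.8, 0.3])          # assumed feature weights
b = -0.5

def predict_logit(x):
    return x @ w + b

rng = np.random.default_rng(3)
X = rng.integers(0, 2, (200, 3)).astype(float)   # synthetic binary cohort

impact = []
for j in range(3):
    X_on, X_off = X.copy(), X.copy()
    X_on[:, j], X_off[:, j] = 1.0, 0.0
    impact.append(np.mean(predict_logit(X_on) - predict_logit(X_off)))
impact = np.array(impact)
```

For a genuinely non-linear model the per-patient differences would no longer be constant, and the averaged impact score becomes a summary of the model's behavior rather than a single coefficient, which is what makes it useful for a DNN.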

Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model

Procedia PDF Downloads 155
40230 Back Stepping Sliding Mode Control of Blood Glucose for Type I Diabetes

Authors: N. Tadrisi Parsa, A. R. Vali, R. Ghasemi

Abstract:

Diabetes is a growing health problem worldwide. In particular, patients with type 1 diabetes need strict glycemic control because they have a deficiency of insulin production. This paper attempts to control blood glucose based on a mathematical model of the body. The Bergman minimal mathematical model is used to develop the nonlinear controller. A novel back-stepping based sliding mode control (B-SMC) strategy is proposed as a solution that guarantees practical tracking of a desired glucose concentration. To show the performance of the proposed design, it is compared with the conventional linear and fuzzy controllers reported in previous research. The numerical simulation results show the advantages of the sliding mode back-stepping controller over linear and fuzzy controllers.
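
The plant side of this problem, the Bergman minimal model, can be simulated with a forward-Euler loop. The sketch below uses illustrative parameter values and a naive saturated proportional insulin law in place of the proposed B-SMC controller:

```python
# Forward-Euler simulation of the Bergman minimal model; parameter values
# and the simple proportional insulin law are illustrative assumptions,
# not the paper's B-SMC design.
p1, p2, p3, n = 0.028, 0.025, 1.3e-5, 0.09
Gb, Ib = 80.0, 7.0            # basal glucose (mg/dL) and insulin (mU/L)
G, X, I = 200.0, 0.0, Ib      # hyperglycemic initial state
dt = 1.0                      # time step (min)

for _ in range(600):                     # simulate 10 hours
    u = max(0.0, 0.01 * (G - Gb))        # naive insulin infusion law
    dG = -p1 * (G - Gb) - X * G          # glucose dynamics
    dX = -p2 * X + p3 * (I - Ib)         # remote insulin action
    dI = -n * (I - Ib) + u               # plasma insulin with infusion u
    G, X, I = G + dt * dG, X + dt * dX, I + dt * dI
```

Even this crude feedback drives the glucose level back toward basal; the point of a back-stepping sliding-mode design is to shape that transient with guaranteed tracking despite model uncertainty.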

Keywords: bergman model, nonlinear control, back stepping, sliding mode control

Procedia PDF Downloads 384
40229 Statistical Data Analysis of Migration Impact on the Spread of HIV Epidemic Model Using Markov Monte Carlo Method

Authors: Ofosuhene O. Apenteng, Noor Azina Ismail

Abstract:

Over the last several years, concern has developed over how to minimize the spread of the HIV/AIDS epidemic in many countries. The AIDS epidemic has tremendously stimulated the development of mathematical models of infectious diseases, and the transmission dynamics of HIV infection that eventually develops into AIDS have played a pivotal role in building such models. Since the initial HIV and AIDS models introduced in the 1980s, various improvements have been made in how HIV/AIDS frameworks are modelled. In this paper, we present the impact of migration on the spread of HIV/AIDS. The epidemic model is a system of nonlinear differential equations that supplements the statistical approach. The model is calibrated using HIV incidence data from Malaysia between 1986 and 2011. Bayesian inference based on Markov chain Monte Carlo is used to validate the model by fitting it to the data and to estimate the unknown parameters of the model. The results suggest that migrants who stay for a long time contribute to the spread of HIV. The model also indicates that susceptible individuals become infected and move to the HIV compartment at a rate that is more significant than the removal rate from the HIV compartment to the AIDS compartment. The disease-free steady state is unstable since the basic reproduction number is 1.627309, greater than one. This is a big concern and not a good indicator from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium.
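
The calibration step can be sketched with a tiny random-walk Metropolis sampler. The sketch estimates a single growth rate from synthetic incidence data; it is a stand-in for the paper's full compartmental model and the Malaysian data set:

```python
import numpy as np

# Random-walk Metropolis sketch of the calibration step: estimate a
# growth rate r from synthetic incidence data y ~ exp(r*t) + noise.
rng = np.random.default_rng(4)
t = np.arange(10, dtype=float)
true_r = 0.3
y = np.exp(true_r * t) + rng.normal(0, 0.5, t.size)   # noisy incidence

def log_post(r):
    # Flat prior on (0, 1) with a Gaussian likelihood (sigma = 0.5).
    if not 0.0 < r < 1.0:
        return -np.inf
    return -0.5 * np.sum((y - np.exp(r * t)) ** 2) / 0.25

r, chain = 0.5, []
lp = log_post(r)
for _ in range(5000):
    prop = r + rng.normal(0, 0.02)            # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        r, lp = prop, lp_prop
    chain.append(r)
estimate = np.mean(chain[1000:])              # posterior mean after burn-in
```

In the paper the same machinery is run over the parameters of the nonlinear ODE system, with the chain's posterior summaries replacing the single-parameter estimate above.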

Keywords: epidemic model, HIV, MCMC, parameter estimation

Procedia PDF Downloads 603
40228 Reinforcement Learning for Self Driving Racing Car Games

Authors: Adam Beaunoyer, Cory Beaunoyer, Mohammed Elmorsy, Hanan Saleh

Abstract:

This research aims to create a reinforcement learning agent capable of racing in challenging simulated environments with a low collision count. We present a reinforcement learning agent that can navigate challenging tracks using both a Deep Q-Network (DQN) and a Soft Actor-Critic (SAC) method. A challenging track includes curves, jumps, and varying road widths. Using open-source code from GitHub, the environment used in this research is based on the 1995 racing game WipeOut. The proposed agent can navigate challenging tracks rapidly while maintaining low race completion times and collision counts. The results show that the SAC model outperforms the DQN model by a large margin. We also propose an alternative multiple-car model that can navigate the track without colliding with other vehicles. The SAC model is the basis for the multiple-car model, which completes laps more quickly than the single-car model but has a higher collision rate with the track wall.

Keywords: reinforcement learning, soft actor-critic, deep q-network, self-driving cars, artificial intelligence, gaming

Procedia PDF Downloads 53
40227 A New Categorization of Image Quality Metrics Based on a Model of Human Quality Perception

Authors: Maria Grazia Albanesi, Riccardo Amadeo

Abstract:

This study presents a new model of the human image quality assessment process. The aim is to highlight the foundations of the image quality metrics proposed in the literature by identifying the cognitive/physiological or mathematical principles of their development and their relation to the actual human quality assessment process. The model allows us to create a novel categorization of objective and subjective image quality metrics. Our work includes an overview of the most used or most effective objective metrics in the literature, and, for each of them, we underline its main characteristics with reference to the rationale of the proposed model and categorization. From the results of this operation, we underline a problem that affects all the presented metrics: many aspects of human biases are not taken into account at all. We then propose a possible methodology to address this issue.
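
PSNR is a convenient concrete example of the purely mathematical end of such a categorization: it scores fidelity from pixel-wise error alone, with no cognitive or physiological model of the observer.

```python
import numpy as np

# PSNR: a purely mathematical full-reference quality metric, computed
# from the mean squared error between reference and test images.
def psnr(ref, test, peak=255.0):
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((8, 8), 100, dtype=np.uint8)
noisy = ref.copy()
noisy[0, 0] = 110                      # one distorted pixel
value = psnr(ref, noisy)
```

Because every pixel error counts equally regardless of where a viewer actually looks, metrics like this sit at the opposite pole from perceptually grounded ones, which is exactly the axis the proposed categorization makes explicit.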

Keywords: eye-tracking, image quality assessment metric, MOS, quality of user experience, visual perception

Procedia PDF Downloads 416
40226 Detection Method of Federated Learning Backdoor Based on Weighted K-Medoids

Authors: Xun Li, Haojie Wang

Abstract:

Federated learning is a training paradigm that combines distributed training with centralized aggregation, and it is of great value for the protection of user privacy. To solve the problem that models are vulnerable to backdoor attacks in federated learning, a backdoor attack detection method based on a weighted k-medoids algorithm is proposed. First, this paper collates the update parameters of the clients to construct a vector group; it then uses the principal component analysis (PCA) algorithm to extract the corresponding feature information from the vector group; finally, it uses the improved k-medoids clustering algorithm to separate normal from backdoor update parameters. In the simulation experiment, the backdoor is implanted in the federated learning model through the model replacement attack method, and the update parameters from the attacker are effectively detected and removed by the proposed defense method.
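
The detection pipeline can be sketched end to end on synthetic updates: PCA (via SVD) projects the client update vectors, and a simplified two-cluster k-medoids pass isolates the malicious ones. The data, cluster count, and initialization below are illustrative assumptions, not the paper's weighted variant:

```python
import numpy as np

# Sketch: project client updates with PCA, then run a plain 2-cluster
# k-medoids pass; the cluster containing the shifted updates is flagged.
rng = np.random.default_rng(5)
honest = rng.normal(0.0, 0.1, (18, 50))       # 18 benign client updates
backdoor = rng.normal(3.0, 0.1, (2, 50))      # 2 malicious updates
U = np.vstack([honest, backdoor])

Uc = U - U.mean(axis=0)
_, _, Vt = np.linalg.svd(Uc, full_matrices=False)
Z = Uc @ Vt[:2].T                             # top-2 principal components

D = np.linalg.norm(Z[:, None] - Z[None, :], axis=2)   # pairwise distances
m = [0, 19]                                   # initial medoids (assumed)
for _ in range(5):
    labels = np.argmin(D[:, m], axis=1)
    for c in range(2):
        idx = np.where(labels == c)[0]        # new medoid minimizes the
        m[c] = int(idx[np.argmin(D[np.ix_(idx, idx)].sum(axis=1))])  # in-cluster distance sum
labels = np.argmin(D[:, m], axis=1)
flagged = set(np.where(labels == labels[19])[0])      # cluster of a known-bad update
```

In a real defense the smaller or more distant cluster is the one discarded before aggregation; the weighting in the paper's variant refines how those distances are computed.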

Keywords: federated learning, backdoor attack, PCA, k-medoids, backdoor defense

Procedia PDF Downloads 117
40225 Loading and Unloading Scheduling Problem in a Multiple-Multiple Logistics Network: Modelling and Solving

Authors: Yasin Tadayonrad

Abstract:

Most supply chain networks have many nodes, from the suppliers' side to the customers' side, and each node sends/receives raw materials/products to/from other nodes. One of the major concerns in this kind of network is finding the best schedule for loading/unloading shipments through the whole network such that all the constraints at the source and destination nodes are met and all the shipments are delivered on time. One of the main constraints in this problem is the loading/unloading capacity of each source/destination node at each time slot (e.g., per week/day/hour). Because of the different characteristics of different products or product groups, the capacity of each node might differ per product group. In most supply chain networks (especially in the fast-moving consumer goods industry), different planners/planning teams work separately at different nodes to determine the loading/unloading timeslots at source/destination nodes for sending/receiving the shipments. In this paper, a mathematical model is proposed to find the best timeslots for loading/unloading shipments, minimizing the overall delays subject to the loading/unloading capacity of each node, the required delivery date of each shipment (considering lead times), and the working days of each node. The model was implemented in Python and solved using Python-MIP on a sample data set. Finally, a heuristic algorithm is proposed as a way of improving the solution method, helping the model scale to larger data sets in real business cases with more nodes and shipments.
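
As a much-simplified stand-in for the mixed-integer model, the core assignment logic can be sketched greedily: each shipment, taken in order of due slot, uses the earliest slot at its source node with remaining capacity, and any delay past the due slot is accumulated. Capacities and shipments below are invented for illustration:

```python
# Greedy sketch of the loading-slot assignment (a simplified stand-in
# for the paper's mixed-integer model). All data are invented.
capacity = 2                                   # assumed loads per node per slot
shipments = [                                  # (source node, due slot)
    ("A", 1), ("A", 1), ("A", 1), ("B", 2), ("A", 2), ("B", 1),
]

used = {}                                      # (node, slot) -> loads booked
total_delay = 0
for node, due in sorted(shipments, key=lambda s: s[1]):   # earliest due first
    slot = 1
    while used.get((node, slot), 0) >= capacity:          # find free slot
        slot += 1
    used[(node, slot)] = used.get((node, slot), 0) + 1
    total_delay += max(0, slot - due)          # delay past the due slot
```

A MIP formulation optimizes the same objective globally over all nodes and slots instead of committing shipment by shipment, which is why the greedy pass serves only as a warm start or heuristic in larger instances.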

Keywords: supply chain management, transportation, multiple-multiple network, timeslots management, mathematical modeling, mixed integer programming

Procedia PDF Downloads 98
40224 Constitutive Model for Analysis of Long-Term Municipal Solid Waste Landfill Settlement

Authors: Irena Basaric Ikodinovic, Dragoslav Rakic, Mirjana Vukicevic, Sanja Jockovic, Jovana Jankovic Pantic

Abstract:

Large long-term settlements occur at municipal solid waste landfills over an extended period of time, which may lead to breakage of the geomembrane and damage to the cover systems, other protective systems, or facilities constructed on top of a landfill. Moreover, municipal solid waste is an extremely heterogeneous material whose properties vary over location and time within a landfill. These material characteristics require the formulation of a new constitutive model to predict the long-term settlement of municipal solid waste. The paper presents a new constitutive model formulated to describe the mechanical behavior of municipal solid waste. The model is based on the Modified Cam Clay model and the critical state soil mechanics framework, incorporating two time-dependent components: mechanical creep and biodegradation of municipal solid waste. The formulated constitutive model is optimized and defined with eight input parameters: five Modified Cam Clay parameters, one parameter for mechanical creep, and two parameters for biodegradation. Thereafter, the constitutive model is implemented in a finite element analysis suite (ABAQUS), and a numerical analysis of an experimental landfill settlement is performed. The proposed model predicts a total settlement that is in good agreement with the settlement measured in the field at the experimental landfill.
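
The two time-dependent components can be sketched as strain terms of the form commonly used for MSW: a logarithmic mechanical-creep term plus a first-order biodegradation term. The coefficients below are illustrative assumptions, not the paper's calibrated parameters:

```python
import math

# Time-dependent settlement strain of an MSW layer: a logarithmic creep
# term plus a first-order biodegradation term (coefficients assumed).
H0 = 20.0              # initial waste thickness (m)
c_alpha = 0.03         # assumed creep coefficient
E_dg = 0.15            # assumed total biodegradation strain
k = 0.001              # assumed biodegradation decay rate (1/day)
t0, t = 30.0, 3650.0   # reference time and elapsed time (days)

creep_strain = c_alpha * math.log10(t / t0)     # mechanical creep
bio_strain = E_dg * (1.0 - math.exp(-k * t))    # biodegradation
settlement = H0 * (creep_strain + bio_strain)   # metres
```

In the paper these two mechanisms are embedded inside a Modified Cam Clay framework rather than applied as uncoupled strains, but the sketch shows why the biodegradation contribution saturates while creep keeps accumulating logarithmically.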

Keywords: constitutive model, finite element analysis, municipal solid waste, settlement

Procedia PDF Downloads 237
40223 Assessment of Soil Salinity through Remote Sensing Technique in the Coastal Region of Bangladesh

Authors: B. Hossen, Y. Helmut

Abstract:

Soil salinity is a major problem for the coastal region of Bangladesh and has been increasing for the last four decades. Determination of soil salinity is essential for proper land-use planning for agricultural crop production. The aim of this research is to estimate and monitor soil salinity in the study area. Remote sensing can be an effective tool for detecting soil salinity in data-scarce conditions. Landsat 8 imagery is used, which requires atmospheric and radiometric correction, and nine soil salinity indices are applied to develop soil salinity maps. Ground soil salinity data, i.e., EC values, are collected as a printed map, which is then scanned and digitized to develop a point shapefile. Linear regression is performed between the satellite-based maps and the ground soil salinity data. The results show that the maximum R² value, 0.022, is found for the salinity index SI7 = G*R/B. This minimal R² value indicates that there is a negligible relationship between the ground EC values and the index-generated values. Hence, these indices are not appropriate for assessing soil salinity here, though many studies have used them successfully. Therefore, further research is necessary to formulate a model for determining soil salinity in the coastal region of Bangladesh.
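
The evaluation step, computing a band-ratio index and regressing it against ground EC, can be sketched as follows. The reflectance and EC values are synthetic and deliberately unrelated, so the R² comes out small, mirroring the kind of result the study reports:

```python
import numpy as np

# Compute the band-ratio salinity index SI7 = G*R/B and its R² against
# ground EC values via simple linear regression (synthetic numbers; in
# the study the bands come from a corrected Landsat 8 scene).
rng = np.random.default_rng(6)
G, R, B = (rng.uniform(0.1, 0.4, 30) for _ in range(3))   # band reflectances
si7 = G * R / B

ec = rng.uniform(2.0, 12.0, 30)     # ground EC (dS/m), unrelated here

slope, intercept = np.polyfit(si7, ec, 1)   # ordinary least squares
pred = slope * si7 + intercept
ss_res = np.sum((ec - pred) ** 2)
ss_tot = np.sum((ec - ec.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot          # coefficient of determination
```

A low R² from this regression is what justifies the study's conclusion that the band-ratio indices carry little information about ground EC in this region.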

Keywords: soil salinity, EC, Landsat 8, salinity indices, linear regression, remote sensing

Procedia PDF Downloads 352
40222 Formulating a Flexible-Spread Fuzzy Regression Model Based on Dissemblance Index

Authors: Shih-Pin Chen, Shih-Syuan You

Abstract:

This study proposes a regression model with flexible spreads for fuzzy input-output data to cope with situations in which existing measures cannot reflect the actual estimation error. The main idea is that a dissemblance index (DI) is carefully identified and defined for precisely measuring the actual estimation error. Moreover, the graded mean integration (GMI) representation is adopted for determining more representative numeric regression coefficients. Notably, to comprehensively compare the performance of the proposed model with others, three different criteria are adopted. The results from commonly used numerical test examples and an application to Taiwan's business monitoring indicator illustrate that the proposed dissemblance index method not only produces valid fuzzy regression models for fuzzy input-output data, but also shows satisfactory and stable performance in terms of the total estimation error under all three criteria.
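
For triangular fuzzy numbers, the graded mean integration representation mentioned above reduces to a weighted average of the three defining points. A minimal sketch follows; the (l + 4m + r)/6 form is the standard GMI of a triangular fuzzy number, not the paper's full flexible-spread model:

```python
def graded_mean_integration(l, m, r):
    """Graded mean integration (GMI) representation of a triangular
    fuzzy number (l, m, r): a defuzzified crisp value that weights
    the mode m four times as heavily as each endpoint."""
    return (l + 4.0 * m + r) / 6.0
```

Replacing each fuzzy observation by its GMI value is what allows ordinary numeric regression machinery to produce the representative crisp coefficients referred to above.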

Keywords: dissemblance index, forecasting, fuzzy sets, linear regression

Procedia PDF Downloads 366
40221 Performance Comparison of Microcontroller-Based Optimum Controller for Fruit Drying System

Authors: Umar Salisu

Abstract:

This research presents the development of a hot-air tomato drying system. To provide more efficient and continuous temperature control, a microcontroller-based optimal controller was developed. The system is based on a power-control principle to achieve smooth power variations depending on a feedback temperature signal from the process. An LM35 temperature sensor and an LM399 differential comparator were used to measure the temperature. The mathematical model of the system was developed, and the optimal controller was designed, simulated, and compared with the transient response of a PID controller. A controlled environment suitable for fruit drying is created within a closed chamber through a three-step process. First, infrared light is used internally to preheat the fruit so that its internal moisture is removed quickly. Second, hot air at a specified temperature is blown into the chamber to keep the humidity below a specified level and to exhaust the humid air from the chamber. Third, the microcontroller disconnects power to the chamber once the moisture content of the fruit has been reduced to a minimum. Experiments were conducted with 1 kg of fresh tomatoes at three different temperatures (40, 50 and 60 °C) at a constant relative humidity of 30% RH. The results obtained indicate that the system significantly reduces the drying time without affecting the quality of the fruit. In terms of temperature control, the response of the optimal controller shows zero overshoot, whereas the PID controller response overshoots by about 30% of the set-point. Another performance metric used is the rise time: the optimal controller settled without delay, while the PID controller was delayed by more than 50 s. The optimal controller is therefore preferable to the PID controller, since it does not overshoot and responds promptly.
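
The feedback temperature loop described above can be illustrated with a generic discrete PID controller acting on a first-order thermal plant. The gains and plant constants below are arbitrary illustrations for the structure of such a loop, not the paper's identified model:

```python
def simulate_pid(setpoint=50.0, kp=2.0, ki=0.1, kd=0.5,
                 tau=30.0, gain=1.0, dt=0.5, steps=2000):
    """Discrete PID loop regulating a first-order thermal plant
    dT/dt = (-(T - T_amb) + gain * u) / tau, starting at ambient 25 °C.
    Returns the chamber temperature after `steps` control intervals."""
    t_amb = 25.0
    temp = t_amb
    integral = 0.0
    prev_err = setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        u = max(0.0, min(u, 100.0))   # heater power saturates at 0-100 %
        temp += dt * (-(temp - t_amb) + gain * u) / tau
        prev_err = err
    return temp
```

With an integral term the steady-state error goes to zero; whether the transient overshoots (as the authors report for their PID design) or not depends on how aggressively the gains are tuned for the particular plant.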

Keywords: drying, microcontroller, optimum controller, PID controller

Procedia PDF Downloads 304
40220 Study and Simulation of a Dynamic System Using Digital Twin

Authors: J.P. Henriques, E. R. Neto, G. Almeida, G. Ribeiro, J.V. Coutinho, A.B. Lugli

Abstract:

Industry 4.0, or the Fourth Industrial Revolution, is transforming the relationship between people and machines. In this scenario, technologies such as Cloud Computing, the Internet of Things, Augmented Reality, Artificial Intelligence, and Additive Manufacturing, among others, are making industries and devices increasingly intelligent. One of the most powerful technologies of this new revolution is the Digital Twin, which allows the virtualization of a real system or process. In this context, the present paper addresses the linear and nonlinear dynamic study of a didactic level plant using a Digital Twin. In the first part of the work, the level plant is identified at a fixed operating point using the classical least-squares method. The linearized model is embedded in a Digital Twin using Automation Studio® from Famic Technologies. To validate the use of the Digital Twin in the linearized study of the plant, the dynamic response of the real system is compared to that of the Digital Twin. To develop the nonlinear model on a Digital Twin, the didactic level plant is then identified using the method proposed by Hammerstein: different steps are applied to the plant, and the Hammerstein algorithm yields a nonlinear model valid over the whole operating range. As in the linear approach, the nonlinear model is embedded in the Digital Twin, and its dynamic response is compared to the real system at different operating points. From the practical results obtained, one can conclude that using a Digital Twin to study dynamic systems is extremely useful in an industrial environment, since it makes it possible to develop and tune controllers on the virtual model of the real system.
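
The least-squares identification step at a fixed operating point can be sketched as fitting a first-order discrete ARX model to input-output data; the function and variable names here are illustrative, not the authors' implementation:

```python
import numpy as np

def identify_arx(u, y):
    """Fit a first-order ARX model y[k+1] = a*y[k] + b*u[k] by
    ordinary least squares. u and y are equal-length 1-D arrays of
    plant input (e.g., pump command) and output (e.g., level)."""
    phi = np.column_stack([y[:-1], u[:-1]])          # regressor matrix
    theta, *_ = np.linalg.lstsq(phi, y[1:], rcond=None)
    return theta                                     # [a, b]
```

The identified (a, b) pair fully defines the linearized model that is then embedded in the Digital Twin; the Hammerstein approach extends this by composing a static nonlinearity with such a linear block.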

Keywords: industry 4.0, digital twin, system identification, linear and nonlinear models

Procedia PDF Downloads 153
40219 A Model of Knowledge Management Culture Change

Authors: Reza Davoodi, Hamid Abbasi, Heidar Norouzi, Gholamabbas Alipourian

Abstract:

A dynamic model shaping the process of knowledge management (KM) culture change is suggested. It is aimed at engaging employees effectively in KM in order to obtain desired results in an organization. The essential requirements for achieving KM culture change are determined, and the proposed model realizes them. The dynamics of the model are expressed by changes in its parameters, so that it can be adjusted to the dynamic process of KM culture change. Building the model includes the elaboration and integration of interconnected components. The “Result” component is central: it determines a desired organizational goal and the possible directions of its attainment. The “Confront” component engenders constructive confrontation in the organization, prompting employees toward KM culture change so as to attain the desired result. The “Assess” component realizes complex assessments of employee proposals by management and peers; the proposals are directed towards attaining the desired result in the organization. The “Reward” component sets the order of assigning rewards to employees based on the assessments of their proposals.

Keywords: knowledge management, organizational culture change, employee, result

Procedia PDF Downloads 412
40218 Deep Learning Framework for Predicting Bus Travel Times with Multiple Bus Routes: A Single-Step Multi-Station Forecasting Approach

Authors: Muhammad Ahnaf Zahin, Yaw Adu-Gyamfi

Abstract:

Bus transit is a crucial component of transportation networks, especially in urban areas. Any intelligent transportation system must have accurate real-time information on bus travel times, since such information minimizes waiting times for passengers at different stations along a route, improves service reliability, and significantly optimizes travel patterns. Bus agencies must enhance the quality of their information service to serve their passengers better and draw in more travelers, since people waiting at bus stops are frequently anxious about when the bus will arrive at their starting point and when it will reach their destination. To address this issue, different models for predicting bus travel times have been developed recently, but most focus on smaller road networks because of their relatively poor performance on vast networks in high-density urban areas. This paper develops a deep learning-based architecture using a single-step multi-station forecasting approach to predict average bus travel times for numerous routes, stops, and trips on a large-scale network, using heterogeneous bus transit data collected from the GTFS database. Data was gathered from multiple bus routes in Saint Louis, Missouri, over one week. A Gated Recurrent Unit (GRU) neural network was used to predict mean vehicle travel times for different hours of the day for multiple stations along multiple routes. The number of historical time steps and the prediction horizon were set to 5 and 1, respectively, meaning that five hours of historical average travel time data were used to predict the average travel time for the following hour. Spatial and temporal information and the historical average travel times were taken from the dataset as model inputs: station distances and sequence numbers were used as adjacency matrices for the spatial inputs, and the time of day (hour) was considered for the temporal inputs. Other inputs, including volatility information such as the standard deviation and variance of journey durations, were also included to make the model more robust. The model's performance was evaluated with the mean absolute percentage error (MAPE). The observed prediction errors for various routes, trips, and stations remained consistent throughout the day. The results showed that the developed model predicts travel times more accurately during peak traffic hours, with a MAPE of around 14%, and less accurately during the latter part of the day. In the context of a complicated transportation network in high-density urban areas, the model showed its applicability for real-time travel time prediction in public transportation and ensured high-quality predictions.
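
The single-step setup described above (five historical hours predicting the next hour) and the MAPE metric can be sketched as follows; these helper functions are illustrative, not the authors' code:

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

def sliding_windows(series, history=5, horizon=1):
    """Split an hourly travel-time series into (input, target) pairs:
    `history` past averages predict the value `horizon` steps ahead."""
    x, y = [], []
    for i in range(len(series) - history - horizon + 1):
        x.append(series[i:i + history])
        y.append(series[i + history + horizon - 1])
    return np.array(x), np.array(y)
```

Each row of x would be one GRU input sequence and the matching entry of y its training target; a reported MAPE of 14% means predictions deviate from observed travel times by 14% on average.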

Keywords: gated recurrent unit, mean absolute percentage error, single-step forecasting, travel time prediction

Procedia PDF Downloads 77
40217 Geospatial Technologies in Support of Civic Engagement and Cultural Heritage: Lessons Learned from Three Participatory Planning Workshops for Involving Local Communities in the Development of Sustainable Tourism Practices in Latiano, Brindisi

Authors: Mark Opmeer

Abstract:

The fruitful relationship between cultural heritage and digital technology is evident. Thanks to the development of user-friendly software, an increasing number of heritage scholars use ICT in their research activities. As a result, the implementation of information technology for heritage planning has become a research objective in itself. During the last decades, we have witnessed a growing debate and literature on the importance of computer technologies for cultural heritage and ecotourism. Indeed, implementing digital technology in support of these domains can be very fruitful for one’s research practice. However, due to the rapid development of new software, scholars may find it challenging to use these innovations appropriately. This contribution therefore explores the interplay between geospatial technologies (geo-ICT), civic engagement, and cultural heritage and tourism. In this article, we discuss our findings on the use of geo-ICT in support of civic participation, cultural heritage, and sustainable tourism development in the southern Italian district of Brindisi. In the city of Latiano, three workshops were organized that involved local members of the community in identifying and discussing interesting points of interest (POIs) that represent the cultural significance and identity of the area. During the first workshop, a so-called mappa della comunità (community map) was created on a touch table with collaborative mapping software, which allowed the participants to highlight potential destinations for tourist purposes. Furthermore, two heritage-based itineraries along a selection of the identified POIs were created to make the region attractive for recreational visitors and tourists. These heritage-based itineraries reflect the community’s ideas about the cultural identity of the region. Both trails were subsequently implemented in a dedicated mobile application (app) and evaluated with members of the community using a mixed-method approach during the second workshop. In the final workshop, the findings of the collaboration, the heritage trails, and the app were evaluated with all participants. Based on our conclusions, we argue that geospatial technologies have significant potential for involving local communities in heritage planning and tourism development. The workshop participants found it highly engaging to share their ideas and knowledge using the digital map on the touch table. Secondly, the use of a mobile application to test the heritage-based itineraries in the field was broadly considered fun and beneficial for enhancing community awareness of and participation in local heritage. The app also stimulated the community’s awareness of the added value of geospatial technologies for sustainable tourism development in the area. We conclude this article with a number of recommendations intended to provide a best practice for organizing heritage workshops with similar objectives.

Keywords: civic engagement, geospatial technologies, tourism development, cultural heritage

Procedia PDF Downloads 292
40216 Context and Culture in EFL Learners' and Native Speakers' Discourses

Authors: Emad A. S. Abu-Ayyash

Abstract:

Cohesive devices, the linguistic tools usually employed to hold the different parts of a text together, have been the focus of a significant number of discourse analysis studies. These linguistic tools have grabbed the attention of researchers since the inception of the first and most comprehensive model of cohesion in 1976. However, some cohesive devices (e.g., endophoric reference, conjunctions, ellipsis, substitution, and lexical ties), being thought of as more popular than others (e.g., exophoric reference), have been over-researched. The present paper explores the usage of two cohesive devices that have been almost entirely absent from discourse analysis studies: exophoric and homophoric references, linguistic items that are interpreted in terms of the physical and cultural contexts of discourse. The significance of the current paper therefore stems from its attempt to fill a gap in the research conducted so far on cohesive devices. This study explains the concepts of the cohesive devices employed in the existing research on cohesion, elucidates the relevant context-related concepts, and identifies the gap in cohesive devices research. Exophora and homophora, the least visited cohesive devices in previous studies, were explored qualitatively and quantitatively in six opinion articles: four produced by eight postgraduate English as a Foreign Language (EFL) students at a university in the United Arab Emirates and two by professional native-speaker (NS) writers in the Independent and the Guardian. The six pieces were about the United Kingdom Independence Party (UKIP) leader’s call to ban the burqa in the UK and were analysed vis-à-vis the employment and function of homophora and exophora. The study found that both the EFL students and the native speakers employed exophora and homophora considerably in their writing to serve a variety of functions, including building assumptions, supporting main ideas, and involving the readers, among others.

Keywords: cohesive devices, context, culture, exophoric reference, homophoric reference

Procedia PDF Downloads 128
40215 Hybrid Adaptive Modeling to Enhance Robustness of Real-Time Optimization

Authors: Hussain Syed Asad, Richard Kwok Kit Yuen, Gongsheng Huang

Abstract:

Real-time optimization has been considered an effective approach for improving the energy-efficient operation of heating, ventilation, and air-conditioning (HVAC) systems. In model-based real-time optimization, model mismatches cannot be avoided; when they are significant, the performance of the real-time optimization is impaired and the expected energy saving is reduced. In this paper, model mismatches in the real-time optimization of a chiller plant are considered. In such optimization, a simplified semi-physical (grey-box) chiller model is typically used, which must be identified from available operation data. To overcome the mismatches associated with the chiller model, a hybrid Genetic Algorithm (HGA) method is used for online real-time training of the model. The HGA combines a Genetic Algorithm (GA) for global search with a traditional optimization method, which is faster and more efficient for local search, thereby avoiding the conventional trial-and-error process of GAs. The identification of the model parameters is formulated as an optimization problem whose objective function is the least-square error between the model output and the actual output of the chiller plant. A case study illustrates the implementation of the proposed method. It is shown that the proposed approach provides reliability in decision making, enhances the robustness of the real-time optimization strategy, and improves energy performance.
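
A minimal sketch of the hybrid idea follows, assuming averaging crossover and Gaussian mutation for the GA stage and a shrinking coordinate-descent pass for the local stage; the paper does not specify these exact operators, so every detail below is illustrative:

```python
import random

def hybrid_ga(loss, bounds, pop_size=30, generations=40, seed=0):
    """Hybrid GA sketch: a GA performs the global search, then a simple
    coordinate-descent pass refines the best individual locally.
    `loss` maps a parameter vector to a scalar error; `bounds` is a
    list of (lo, hi) limits, one per parameter."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss)
        parents = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # averaging crossover
            j = rng.randrange(len(child))                 # Gaussian mutation
            lo, hi = bounds[j]
            child[j] = min(hi, max(lo, child[j] + rng.gauss(0, (hi - lo) * 0.1)))
            children.append(child)
        pop = parents + children
    best = min(pop, key=loss)
    # local refinement: coordinate descent with a shrinking step
    step = [(hi - lo) * 0.05 for lo, hi in bounds]
    for _ in range(100):
        for j, (lo, hi) in enumerate(bounds):
            for delta in (step[j], -step[j]):
                trial = list(best)
                trial[j] = min(hi, max(lo, best[j] + delta))
                if loss(trial) < loss(best):
                    best = trial
        step = [s * 0.9 for s in step]
    return best
```

In the paper's setting, `loss` would be the least-square error between the grey-box chiller model's output and the measured plant output for a candidate parameter vector.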

Keywords: energy performance, hybrid adaptive modeling, hybrid genetic algorithms, real-time optimization, heating, ventilation, and air-conditioning

Procedia PDF Downloads 420
40214 Links between Landscape Management and Environmental Risk Assessment: Considerations from the Italian Context

Authors: Mara Balestrieri, Clara Pusceddu

Abstract:

Issues relating to destructive phenomena that can damage people and goods have returned to the centre of debate in Italy, with the increase in catastrophic episodes in recent years in a country that is highly vulnerable to hydrological risk. Environmental factors and the geological and geomorphological characteristics of a territory play an important role in determining its level of vulnerability and its natural tendency to risk. However, a territory is also subject to the requirements and transformations of society, and this brings other relevant factors into play. The reasons for the increase in destructive phenomena are often to be found in the territorial development models adopted. Stewardship of the landscape and management of risk are therefore related issues. This study aims to summarize the most relevant elements of this connection and, at the same time, to clarify the role of environmental risk assessment as a tool to aid the sustainable management of landscape. How should planners relate to this problem, and which aspects should be monitored in order to prepare responsible and useful interventions?

Keywords: assessment, landscape, risk, planning

Procedia PDF Downloads 465
40213 A Simulation Model and Parametric Study of Triple-Effect Desalination Plant

Authors: Maha BenHamad, Ali Snoussi, Ammar Ben Brahim

Abstract:

A steady-state analysis of a triple-effect thermal vapor compression desalination unit was performed. A mathematical model based on mass, salinity, and energy balances is developed. The purpose of this paper is to connect a process simulator with a process optimizer in order to study the influence of several operating variables on the performance of the unit and the cost of the produced water. A MATLAB program is used to solve the model equations, and Aspen HYSYS is used to model the plant. The model's validity is examined against a commercial plant, showing good agreement between industrial data and simulation results. The results show that the pressures of the last effect and of the compressed vapor have an important influence on the produced-water cost, and that increasing the temperature difference in the condenser decreases the specific heat-transfer area by about 22%.
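
The mass and salinity balances underlying such a model can be illustrated for a single evaporator effect; the flows, units, and function names below are this sketch's own assumptions, not the paper's equations:

```python
def effect_balance(feed_flow, feed_salinity, brine_salinity):
    """Mass and salinity balance for one evaporator effect:
    feed = distillate + brine (total mass is conserved), and
    feed_flow * feed_salinity = brine_flow * brine_salinity
    (salt leaves only with the brine). Flows in kg/s, salinities
    in g/kg. Returns (distillate_flow, brine_flow)."""
    brine_flow = feed_flow * feed_salinity / brine_salinity
    distillate_flow = feed_flow - brine_flow
    return distillate_flow, brine_flow
```

For example, 10 kg/s of feed at 35 g/kg concentrated to 70 g/kg brine yields 5 kg/s of distillate; chaining such balances across the three effects, together with the energy balances, builds up the plant model solved in MATLAB.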

Keywords: steady-state, triple effect, thermal vapor compressor, MATLAB, Aspen HYSYS

Procedia PDF Downloads 177