Search results for: agent based model
38643 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved
Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben
Abstract:
Model transformation, as a pivotal aspect of model-driven engineering, is attracting increasing attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to bridge the gap between different domains by sharing and exchanging knowledge. As model transformation has become widely used, a new requirement has emerged: to define the transformation process effectively and efficiently and to reduce the manual effort involved. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons, and focuses particularly on the granularity issue that arises in the transformation process. Compared with traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measures are combined into a refined transformation process, which resolves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, replacing manual effort.
Keywords: automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons
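As an illustration of the kind of semantic-plus-syntactic matching such a methodology relies on, the following Python sketch scores the similarity of two model-element names. The synonym table, the weighting scheme, and the 0.6 coefficient are illustrative assumptions, not the paper's actual measures:

```python
from difflib import SequenceMatcher

# Hypothetical synonym table standing in for a real semantic resource (e.g. WordNet):
SYNONYMS = {("client", "customer"), ("order", "purchase")}

def syntactic_sim(a: str, b: str) -> float:
    """Character-level (syntactic) similarity of two element names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_sim(a: str, b: str) -> float:
    """1.0 for identical or known-synonym terms, else 0.0."""
    a, b = a.lower(), b.lower()
    return 1.0 if a == b or (a, b) in SYNONYMS or (b, a) in SYNONYMS else 0.0

def combined_sim(a: str, b: str, w_sem: float = 0.6) -> float:
    """Weighted score used to decide whether two model elements should be mapped."""
    return w_sem * semantic_sim(a, b) + (1.0 - w_sem) * syntactic_sim(a, b)

print(combined_sim("Client", "customer"))      # synonym: semantic part dominates
print(combined_sim("ClientName", "ClientNo"))  # no synonym: syntactic part only
```

In a tool, pairs scoring above a threshold would be proposed as transformation mappings, with only the low-scoring remainder left for manual review.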
Procedia PDF Downloads 400
38642 Effect on Haemolymph Cellular Parameters of Periplaneta americana Following Challenge with Agrobacterium tumefaciens: A Possible Microbial Control Agent
Authors: Fouzia Qamar, Shahida Hasnain
Abstract:
The present study is primarily concerned with alterations in the haemocyte profile of adult male Periplaneta americana, with emphasis on the effect of bacterial inoculations on the haemogram, i.e., the total haemocyte count (THC) and the differential haemocyte count (DHC) of the different haemocyte types of the target insect. The haemolymph cellular profile showed considerable alterations under the effect of nine strains of Agrobacterium tumefaciens after 8, 16, and 24 hours of treatment, thereby signifying the potential role of Agrobacterium tumefaciens as a possible biocontrol agent against household pests.
Keywords: Agrobacterium tumefaciens, Periplaneta americana, haemolymph, cellular parameters
Procedia PDF Downloads 391
38641 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach
Authors: Jean Berger, Nassirou Lo, Martin Noel
Abstract:
Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target, multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling discrete agent actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass the decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangian relaxation of the integrality constraints. Should a target be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization
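To make the objective concrete, this minimal Python sketch maximizes the cumulative probability of successful detection over a toy grid by exhaustive enumeration of one agent's move sequences. The prior map, detection probability, and horizon are assumed values, and the brute-force search merely stands in for the paper's MIP/CPLEX machinery:

```python
from itertools import product

# Toy prior: probability that the target occupies each cell (assumed values).
PRIOR = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}
DETECT = 0.8  # probability of detecting the target when searching its cell
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)]

def success_prob(start, path):
    """Cumulative probability of detection along a sequence of moves."""
    remaining = dict(PRIOR)   # undetected probability mass left in each cell
    pos, total = start, 0.0
    for dx, dy in path:
        pos = (pos[0] + dx, pos[1] + dy)
        if pos in remaining:
            total += remaining[pos] * DETECT
            remaining[pos] *= 1.0 - DETECT   # revisits yield diminishing returns
    return total

def best_path(start, horizon=3):
    """Exhaustive search over all move sequences (stand-in for the MIP solver)."""
    return max(product(MOVES, repeat=horizon), key=lambda p: success_prob(start, p))

plan = best_path((0, 0))
print(plan, round(success_prob((0, 0), plan), 3))
```

The MIP formulation scales this idea to multiple agents and long horizons, where enumeration is hopeless.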
Procedia PDF Downloads 376
38640 CFD Simulation of a Large Scale Unconfined Hydrogen Deflagration
Authors: I. C. Tolias, A. G. Venetsanos, N. Markatos
Abstract:
In the present work, CFD simulations of a large-scale open deflagration experiment are performed. A stoichiometric hydrogen-air mixture occupies a 20 m hemisphere. Two combustion models, the Eddy Dissipation Model and a multi-physics combustion model based on Yakhot's equation for the turbulent flame speed, are compared and evaluated against the experiment. The values of the models' critical parameters are investigated. The effect of the turbulence model is also examined: the k-ε model and an LES approach were tested.
Keywords: CFD, deflagration, hydrogen, combustion model
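For readers unfamiliar with Yakhot's closure, the turbulent flame speed satisfies the implicit relation S_T = S_L · exp((u'/S_T)²), which must be solved numerically. The sketch below uses bisection; the laminar flame speed and turbulence intensity are assumed for illustration, not taken from the experiment:

```python
import math

def yakhot_st(s_l, u_prime, tol=1e-8):
    """Solve S_T = S_L * exp((u'/S_T)^2) for the turbulent flame speed.
    g(S) = S - S_L*exp((u'/S)^2) is strictly increasing, so bisection is safe."""
    g = lambda s: s - s_l * math.exp((u_prime / s) ** 2)
    lo, hi = s_l, 100.0 * s_l   # the root always lies above S_L
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Assumed illustrative values (not the experiment's) for a hydrogen-air mixture:
s_l, u_prime = 2.0, 4.0   # laminar flame speed and turbulence intensity, m/s
s_t = yakhot_st(s_l, u_prime)
print(f"S_T = {s_t:.2f} m/s")
```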
Procedia PDF Downloads 506
38639 An Attribute Based Access Control Model with POL Module for Dynamically Granting and Revoking Authorizations
Authors: Gang Liu, Huimin Song, Can Wang, Runnan Zhang, Lu Fang
Abstract:
Currently, resource sharing and system security are critical issues. This paper proposes a POL module, composed of a PRIVILEGE attribute (PA), an obligation, and a log, which improves the attribute-based access control (ABAC) model in dynamically granting and revoking authorizations. The new model, termed PABAC, is described in terms of the POL module structure, the attribute definitions, the policy formulation, and the authorization architecture, which demonstrate its advantages. The POL module addresses problems that are not predicted in advance and not described by the access control policy. It can serve as one of the subject attributes or resource attributes according to the practical application, which enhances the flexibility of the model compared with ABAC. A scenario that illustrates how this model is applied to the real world is provided.
Keywords: access control, attribute based access control, granting authorizations, privilege, revoking authorizations, system security
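A minimal sketch of how a PRIVILEGE attribute and a log could support dynamic granting and revocation alongside static ABAC rules. The policy contents, attribute names, and helper functions are hypothetical, not the PABAC specification:

```python
from dataclasses import dataclass

@dataclass
class Request:
    subject: dict
    resource: dict
    action: str

log = []  # POL 'log' component: records every authorization decision

def privilege_attr(subject):
    """PRIVILEGE attribute (PA): rights granted dynamically and revocable at runtime."""
    return set(subject.get("privileges", []))

def evaluate(request, policy):
    """Grant if a static role rule or a dynamically granted privilege allows the action."""
    allowed = (
        request.action in policy.get(request.subject.get("role"), set())
        or request.action in privilege_attr(request.subject)
    )
    # POL 'obligation' component, reduced here to mandatory audit logging:
    log.append((request.subject.get("id"), request.action, allowed))
    return allowed

policy = {"nurse": {"read"}}
alice = {"id": "alice", "role": "nurse", "privileges": ["write"]}  # 'write' granted dynamically
print(evaluate(Request(alice, {"type": "record"}, "write"), policy))  # True
alice["privileges"].remove("write")  # revocation: the PRIVILEGE attribute is withdrawn
print(evaluate(Request(alice, {"type": "record"}, "write"), policy))  # False
```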
Procedia PDF Downloads 361
38638 The DC Behavioural Electrothermal Model of Silicon Carbide Power MOSFETs under SPICE
Authors: Lakrim Abderrazak, Tahri Driss
Abstract:
This paper presents a new behavioural electrothermal model of the power silicon carbide (SiC) MOSFET under SPICE. The model is based on the SPICE level-1 MOS model, in which phenomena such as the drain leakage current (IDSS), the on-state resistance (RDSon), the gate threshold voltage (VGSth), the transconductance (gfs), the body-diode I-V characteristics, temperature dependence, and self-heating are included and represented using ABM (Analog Behavioural Modeling) blocks from the SPICE library. This ultimately makes the model flexible and easy to integrate into the various SPICE-based simulation software packages. The internal junction temperature of the component is calculated from the thermal model, using the electric power dissipated inside the device and its thermal impedance in the form of a localized Foster canonical network. The model parameters are extracted from manufacturers' data (datasheet curves) using polynomial interpolation with simulated annealing (SA) and weighted least squares (WLS). The model takes into account the various important phenomena within the transistor. Its effectiveness has been verified by SPICE simulation results as well as by measured data for the SiC MOS transistor C2M0025120D from CREE (1200 V, 90 A).
Keywords: SiC power MOSFET, DC electro-thermal model, ABM SPICE library, SPICE modelling, behavioural model, C2M0025120D CREE
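The junction-temperature calculation from a Foster network can be sketched in a few lines. The two R/τ pairs below are assumed placeholder values; real parameters would be fitted to the C2M0025120D datasheet, as the paper does:

```python
import math

# Hypothetical two-stage Foster network (R_i in K/W, tau_i in s) -- assumed
# values, not the fitted C2M0025120D parameters.
FOSTER = [(0.15, 1e-3), (0.35, 5e-2)]

def z_th(t):
    """Transient thermal impedance Zth(t) = sum_i R_i * (1 - exp(-t/tau_i))."""
    return sum(r * (1.0 - math.exp(-t / tau)) for r, tau in FOSTER)

def junction_temp(p_diss, t, t_case=25.0):
    """Junction temperature from the dissipated power and the thermal impedance."""
    return t_case + p_diss * z_th(t)

print(junction_temp(100.0, 1.0))  # near steady state: ~25 + 100 * 0.5 K/W = ~75 °C
```

In the full electrothermal model this temperature is fed back into the temperature-dependent electrical parameters at every time step.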
Procedia PDF Downloads 583
38637 Development of an Image-Based Biomechanical Model for Assessment of Hip Fracture Risk
Authors: Masoud Nasiri Sarvi, Yunhua Luo
Abstract:
Low-trauma hip fracture, usually caused by a fall from standing height, has become a main source of morbidity and mortality for the elderly. Factors affecting hip fracture include sex, race, age, body weight, height, body mass distribution, etc., and thus hip fracture risk in a fall differs widely from subject to subject. It is therefore necessary to develop a subject-specific biomechanical model to predict hip fracture risk. The objective of this study is to develop a two-level, image-based, subject-specific biomechanical model, consisting of a whole-body dynamics model and a proximal-femur finite element (FE) model, for more accurately assessing the risk of hip fracture in lateral falls. The information required for constructing the model is extracted from a whole-body and a hip DXA (dual-energy X-ray absorptiometry) image of the subject. The proposed model treats all parameters subject-specifically, which provides a fast, accurate, and inexpensive method for predicting hip fracture risk.
Keywords: bone mineral density, hip fracture risk, impact force, sideways falls
Procedia PDF Downloads 539
38636 Numerical Modelling of Wind Dispersal Seeds of Bromeliad Tillandsia recurvata L. (L.) Attached to Electric Power Lines
Authors: Bruna P. De Souza, Ricardo C. De Almeida
Abstract:
In some cities in the State of Paraná, Brazil, and in other countries, atmospheric bromeliads (Tillandsia spp., Bromeliaceae) are considered weeds on trees, electric power lines, satellite dishes, and other artificial supports. In this study, a numerical model was developed to simulate the wind dispersal of seeds of the Tillandsia recurvata species, with the objective of evaluating seed displacement in the city of Ponta Grossa, PR, Brazil, a region considered to be already infested. The model simulates the dispersal of each individual seed, integrating parameters of the atmospheric boundary layer (ABL) and the local wind simulated by the Weather Research and Forecasting (WRF) mesoscale atmospheric model for the 2012-2015 period. The dispersal model also incorporates the approximate number of bromeliads and source-height data collected from the most infested electric power lines. The seeds' terminal velocity, an important input that was not available in the literature, was measured in an experiment with fifty-one seeds of Tillandsia recurvata. Wind is the main dispersal agent acting on plumed seeds, whereas atmospheric turbulence is a determinant factor in transporting seeds to distances beyond 200 meters as well as in introducing random variability into the seed dispersal process. This variability was added to the model by applying an inverse fast Fourier transform to the energy spectra of the wind velocity components, based on boundary-layer meteorology theory and estimated from micrometeorological parameters produced by the WRF model. Seasonal and annual wind means were obtained from the surface wind data simulated by WRF for Ponta Grossa. The mean wind direction is assumed to be the most probable direction of bromeliad seed trajectories. Moreover, the atmospheric turbulence effect and the dispersal distances were analyzed in order to identify likely regions of infestation around the Ponta Grossa urban area. It is worth noting that this model could be applied to any species and locality as long as the seeds' biological data and meteorological data for the region of interest are available.
Keywords: atmospheric turbulence, bromeliad, numerical model, seed dispersal, terminal velocity, wind
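A highly simplified random-walk version of such a dispersal model (mean wind plus Gaussian gusts, constant terminal velocity) illustrates how flight time and turbulence set the dispersal distance. All numerical values are assumptions, and the Gaussian gust term stands in for the spectral (IFFT-based) turbulence synthesis used in the paper:

```python
import random

random.seed(1)  # reproducible illustration

def disperse(n_seeds, u_mean, sigma_u, v_t, release_height, dt=0.1):
    """Track each seed until it settles: horizontal wind = mean + Gaussian gust,
    vertical motion at the measured terminal velocity."""
    distances = []
    for _ in range(n_seeds):
        x, z = 0.0, release_height
        while z > 0.0:
            x += (u_mean + random.gauss(0.0, sigma_u)) * dt
            z -= v_t * dt
        distances.append(x)
    return distances

# Assumed values: 3 m/s mean wind, 1 m/s gust intensity, ~0.5 m/s terminal
# velocity, seeds released from a 10 m electric power line.
d = disperse(500, u_mean=3.0, sigma_u=1.0, v_t=0.5, release_height=10.0)
print(sum(d) / len(d))  # mean distance ~ u_mean * (height / v_t) = 60 m
```

Because the flight time is height/v_t, a lower terminal velocity or a higher release point lengthens the dispersal range roughly in proportion.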
Procedia PDF Downloads 142
38635 Evaluation of the Integration of a Direct Reduction Process into an Existing Steel Mill
Authors: Nils Mueller, Gregor Herz, Erik Reichelt, Matthias Jahn
Abstract:
In the context of climate change, the reduction of greenhouse gas emissions in all economic sectors is considered an important factor in meeting the demands of a sustainable energy system. The steel industry, as one of the large industrial CO₂ emitters, is currently highly dependent on fossil resources. In order to reduce coke consumption, and thereby CO₂ emissions, while still being able to further utilize existing blast furnaces, the possibility of integrating a direct reduction process (DRP) into a fully integrated steel mill was investigated. A blast furnace model, derived from literature data and implemented in Aspen Plus, was used to analyze the impact of DRI in the blast furnace process. Furthermore, a state-of-the-art DRP was modeled to investigate the possibility of substituting hydrogen for the reducing agent natural gas. A sensitivity analysis was carried out to find the maximum share of hydrogen as a reducing agent without penalty to the DRI quality. Lastly, the two modeled process steps were combined to form a complete route for producing pig iron. By varying the boundary conditions of the DRP while recording the CO₂ emissions of the two process steps, the overall potential for the reduction of CO₂ emissions was estimated. Within the simulated range, a maximum reduction of CO₂ emissions of 23.5% relative to the typical emissions of a blast furnace could be determined.
Keywords: blast furnace, CO₂ mitigation, DRI, hydrogen
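The trade-off can be illustrated with a back-of-the-envelope emission balance. The emission factors below are assumed round numbers of the right order of magnitude, not the paper's Aspen Plus results, and hydrogen is idealized as CO₂-free:

```python
# Illustrative emission factors in t CO2 per t of pig iron -- assumed round
# numbers (a typical blast-furnace figure is on the order of 1.6 t/t), not
# the paper's simulation results.
BF_EMISSIONS = 1.6        # conventional blast furnace route
DRP_NG_EMISSIONS = 0.9    # direct reduction with 100% natural gas

def route_emissions(h2_fraction):
    """CO2 per tonne when a share of the reducing gas is hydrogen; hydrogen is
    idealized as CO2-free, so emissions scale with the natural-gas share."""
    if not 0.0 <= h2_fraction <= 1.0:
        raise ValueError("h2_fraction must be between 0 and 1")
    return DRP_NG_EMISSIONS * (1.0 - h2_fraction)

def reduction_vs_bf(h2_fraction):
    """Relative CO2 reduction compared with the blast-furnace baseline."""
    return 1.0 - route_emissions(h2_fraction) / BF_EMISSIONS

for f in (0.0, 0.3, 1.0):
    print(f, f"{reduction_vs_bf(f):.1%}")
```

The paper's 23.5% figure is smaller than this idealized blend suggests because only part of the burden passes through the DRP and the blast furnace remains in operation.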
Procedia PDF Downloads 287
38634 Strength & Density of an Autoclaved Aerated Concrete Using Various Air Entraining Agent
Authors: Shashank Gupta, Shiva Garg
Abstract:
The purpose of the present paper is to study the changes in the strength characteristics and density of autoclaved aerated concrete (AAC) when different expansion agents are used. The expansion agent releases gas in the concrete, thereby making it lighter by reducing its density; it also increases the workability of the concrete. The air-entraining agents used in this study are hydrogen peroxide, oleic acid, and olive oil. The addition of these agents causes the concrete to rise like a cake, but it reduces the strength of the concrete due to the formation of air voids. The amounts of agent chosen for concrete production are 0.5%, 1%, and 1.5% by weight of cement.
Keywords: AAC, olive oil, hydrogen peroxide, oleic acid, steam curing
Procedia PDF Downloads 371
38633 Effects of Research-Based Blended Learning Model Using Adaptive Scaffolding to Enhance Graduate Students' Research Competency and Analytical Thinking Skills
Authors: Panita Wannapiroon, Prachyanun Nilsook
Abstract:
This paper reports the findings of a research and development (R&D) project aiming to develop the Research-Based Blended Learning Model Using Adaptive Scaffolding (RBBL-AS) to enhance graduate students' research competency and analytical thinking skills, and to study the results of using this model. The sample consisted of 10 experts in the field during the model development stage and 23 graduate students of KMUTNB during the RBBL-AS model try-out stage. The research procedure included four phases: 1) literature review, 2) model development, 3) model experiment, and 4) model revision and confirmation. The research results are divided into three parts according to these procedures. First, the data gathered from the literature review were reported as a draft model, and the findings from the expert interviews indicated that the model should include eight components to enhance graduate students' research competency and analytical thinking skills: 1) a cloud learning environment, 2) a Ubiquitous Cloud Learning Management System (UCLMS), 3) learning courseware, 4) learning resources, 5) adaptive scaffolding, 6) communication and collaboration tools, 7) learning assessment, and 8) research-based blended learning activities. Second, the findings from the experimental stage showed that posttest scores for research competency and analytical thinking skills were statistically significantly higher than pretest scores at the .05 level. The graduate students rated learning with the RBBL-AS model at a high level of satisfaction. Third, based on the findings from the experimental stage and the comments of the experts, the model was revised and is proposed in this report for further implementation and reference.
Keywords: research based learning, blended learning, adaptive scaffolding, research competency, analytical thinking skills
Procedia PDF Downloads 424
38632 Effects of Coupling Agent on the Properties of Henequen Microfiber (NF) Filled High Density Polyethylene (HDPE) Composites
Authors: Pravin Gaikwad, Prakash Mahanwar
Abstract:
The main objective of incorporating natural fibers such as henequen microfibers (NF) into a high-density polyethylene (HDPE) polymer matrix is to reduce cost and to enhance mechanical as well as other properties. The henequen microfibers were chopped manually to 5-7 mm in length and added into the polymer matrix at the optimized concentration of 8 wt%. In order to facilitate the link between the henequen microfibers (NF) and the HDPE matrix, a coupling agent, glycidoxy (epoxy) functional methoxy silane (GPTS), was added at concentrations of 0.1%, 0.3%, 0.5%, 0.7%, 0.9%, and 1% by weight of the total fibers. The tensile strength of the composite increased marginally, while the % elongation at break of the composites decreased with increasing silane loading. Tensile modulus and stiffness were observed to increase at 0.9 wt% GPTS loading. The flexural as well as the impact strength of the composite decreased with increasing GPTS loading. The dielectric strength of the composite was also found to increase marginally up to 0.5 wt% silane loading and thereafter remained constant.
Keywords: henequen microfibers (NF), polymer composites, HDPE, coupling agent, GPTS
Procedia PDF Downloads 442
38631 New Analytical Current-Voltage Model for GaN-based Resonant Tunneling Diodes
Authors: Zhuang Guo
Abstract:
In the field of GaN-based resonant tunneling diode (RTD) simulations, the traditional Tsu-Esaki formalism fails to predict the values of the peak currents and peak voltages in the simulated current-voltage (J-V) characteristics. The main reason is that, due to the strong internal polarization fields, a two-dimensional electron gas (2DEG) accumulates at the emitter, resulting in 2D-2D resonant tunneling currents, which become the dominant part of the total J-V characteristics. Because it is based on a 3D-2D resonant tunneling mechanism, the traditional Tsu-Esaki formalism cannot predict the J-V characteristics correctly. To overcome this shortcoming, we develop a new analytical model for the 2D-2D resonant tunneling currents generated in GaN-based RTDs. Compared with the Tsu-Esaki formalism, the new model makes the following modifications. First, considering the Heisenberg uncertainty principle, the new model corrects the expression for the density of states around the 2DEG eigenenergy levels at the emitter, so that it can predict the half width at half maximum (HWHM) of the resonant tunneling currents. Second, taking into account the effect of bias on the wave vectors at the collector, the new model modifies the expression for the transmission coefficients, bringing the predicted peak currents closer to the experimental data. The new analytical model successfully predicts the J-V characteristics of GaN-based RTDs, and it also reveals in more detail the resonant tunneling mechanisms at work in GaN-based RTDs, which helps in designing and fabricating high-performance GaN RTDs.
Keywords: GaN-based resonant tunneling diodes, Tsu-Esaki formalism, 2D-2D resonant tunneling, Heisenberg uncertainty
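For reference, the standard Tsu-Esaki (3D-2D) current density that the abstract contrasts against, together with a textbook-style sketch of the Lorentzian-broadened 2D density of states that the Heisenberg-uncertainty correction alludes to; the symbols and the broadening form are standard choices, not the paper's exact expressions:

```latex
% Standard Tsu-Esaki current density (3D emitter supplying a 2D well):
J = \frac{e\, m^{*} k_{B} T}{2\pi^{2}\hbar^{3}}
    \int_{0}^{\infty} T(E_{z})\,
    \ln\!\left[
      \frac{1 + \exp\!\big((E_{F} - E_{z})/k_{B}T\big)}
           {1 + \exp\!\big((E_{F} - E_{z} - eV)/k_{B}T\big)}
    \right] dE_{z}

% Lorentzian-broadened density of states around a 2DEG eigenenergy E_{0},
% with HWHM \Gamma tied to the state lifetime \tau by energy-time uncertainty:
\rho(E) \propto \frac{1}{\pi}\,\frac{\Gamma}{(E - E_{0})^{2} + \Gamma^{2}},
\qquad \Gamma \sim \frac{\hbar}{2\tau}
```

Replacing the continuous 3D supply function with such a broadened 2D level is what turns the integral into a sharply peaked 2D-2D resonance with a finite HWHM.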
Procedia PDF Downloads 79
38630 Functional Instruction Set Simulator of a Neural Network IP with Native Brain Float-16 Generator
Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula
Abstract:
A functional model that mimics the functional correctness of a neural network compute accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. A major challenge we faced was the incompatibility of GCC compilers with the BF-16 datatype, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, when working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads, dense or sparse, debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional-model-based scoreboard, or software model, written in SystemC. The proposed functional model executes assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to. The model thus gives considerable visibility and debug capability into the DUT, exposing the micro-steps of execution.
Keywords: ISA, neural network, Brain Float-16, DUT
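A native BF-16 generator of the kind described can be sketched in Python (the paper's implementation sits in a SystemC model). BF-16 keeps float32's 8-bit exponent and truncates the mantissa from 23 to 7 bits; the sketch below adds round-to-nearest-even on the dropped bits, a common choice that the paper may or may not use:

```python
import struct

def float_to_bf16_bits(x: float) -> int:
    """Bit pattern of Brain Float 16 (BF-16): float32's 8-bit exponent kept,
    mantissa truncated to 7 bits with round-to-nearest-even."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    rounding = 0x7FFF + ((bits >> 16) & 1)   # ties round to the even code
    return ((bits + rounding) >> 16) & 0xFFFF

def bf16_bits_to_float(h: int) -> float:
    """Expand a BF-16 pattern back to float32 by zero-padding the mantissa."""
    (x,) = struct.unpack("<f", struct.pack("<I", (h & 0xFFFF) << 16))
    return x

v = 3.1415926
h = float_to_bf16_bits(v)
print(hex(h), bf16_bits_to_float(h))  # 0x4049 -> 3.140625 (~3 decimal digits kept)
```

The wide exponent is why BF-16 suits neural network workloads: dynamic range matches float32 while halving the storage, at the cost of mantissa precision.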
Procedia PDF Downloads 99
38629 Design and Implementation of LabVIEW Based Relay Autotuning Controller for Level Setup
Authors: Manoj M. Sarode, Sharad P. Jadhav, Mukesh D. Patil, Pushparaj S. Suryawanshi
Abstract:
Even though the PID controller is widely used in industrial processes, tuning PID parameters is not easy: it is time-consuming and requires expert personnel. Another drawback of the PID controller is that the process dynamics may change over time, for example due to variation of the process load or normal wear and tear. To compensate for changes in process behavior over time, expert users are required to recalibrate the PID gains. Implementation of model-based controllers usually needs a process model; identifying a process model is a time-consuming job, and there is no guarantee of model accuracy. If the identified model is not accurate, the performance of the controller may degrade. Model-based controllers are also quite expensive, and the whole implementation procedure is sometimes tedious. To eliminate such issues, an autotuning PID controller becomes a vital element. A software-based relay feedback autotuning controller proves to be an efficient, upgradable, and maintenance-free controller. With a relay feedback autotuning controller, PID parameters can be obtained within a very short span of time. This paper presents the real-time implementation of a LabVIEW-based relay feedback autotuning PID controller. It was successfully developed and implemented to control the level of a laboratory setup, and its performance was analyzed for different setpoints and found satisfactory.
Keywords: autotuning, PID, liquid level control, recalibrate, LabVIEW, controller
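In a relay feedback test, a relay replaces the controller, forcing a limit-cycle oscillation from which the ultimate gain and period follow; classical Ziegler-Nichols rules then give the PID settings. The sketch below uses the standard describing-function formula Ku = 4d/(πa), with the measurement values assumed for illustration (the paper's tuning rules may differ):

```python
import math

def relay_pid(d, a, pu):
    """Ziegler-Nichols PID settings from one relay feedback experiment:
    d = relay amplitude, a = output oscillation amplitude, pu = oscillation period."""
    ku = 4.0 * d / (math.pi * a)             # ultimate gain (describing-function result)
    return 0.6 * ku, 0.5 * pu, 0.125 * pu    # Kp, Ti, Td (classic Z-N rules)

# Hypothetical measurements from a relay test on a level loop:
kp, ti, td = relay_pid(d=1.0, a=0.25, pu=40.0)
print(f"Kp={kp:.2f}, Ti={ti:.1f} s, Td={td:.1f} s")
```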
Procedia PDF Downloads 397
38628 The Value of Audit in Managing Supplier's Process Improvement
Authors: Mohammad E. Nikoofal, Mehmet Gumus
Abstract:
Besides the many benefits of outsourcing, firms are still concerned about the lack of critical information regarding both the risk levels and the actions of suppliers that are just a few links away. In this paper, we study the effectiveness of audit for the manufacturer in managing her supplier's process improvement effort when the supplier is privately informed about his disruption risk and actions. By comparing the agency costs associated with the optimal menu of contracts with and without audit, we completely characterize the value of audit in all cases from the perspectives of the manufacturer, the supplier, and the total supply chain. First, the analysis of the value of audit from the manufacturer's perspective shows that she can strictly benefit from auditing her supplier's actions. To the best of our knowledge, this result has not been documented before in the principal-agent literature under a standard setting where the agent is assumed to be risk-neutral and not protected by limited liability constraints. Second, we find that not only the manufacturer but also the supplier can strictly benefit from audit. Third, the audit enables the manufacturer to customize her contract offerings based on the reliability of the supplier. Finally, by analyzing the impact of the problem parameters on the value of audit, we identify the conditions under which an audit would be beneficial for the individual supply chain parties as well as the total supply chain.
Keywords: supply disruption, adverse selection, moral hazard incentives, audit
Procedia PDF Downloads 466
38627 An Emergentist Defense of Incompatibility between Morally Significant Freedom and Causal Determinism
Authors: Lubos Rojka
Abstract:
The common perception of morally responsible behavior is that it presupposes freedom of choice, and that free decisions and actions are not determined by natural events but by a person. In other words, the moral agent has the ability and the possibility of doing otherwise when making morally responsible decisions, and natural causal determinism cannot fully account for morally significant freedom. The incompatibility between a person's morally significant freedom and causal determinism appears to be a natural position. Nevertheless, some of the most influential philosophical theories of moral responsibility are compatibilist or semi-compatibilist, and they exclude the requirement of alternative possibilities, which contradicts the claims of classical incompatibilism. Compatibilists often employ Frankfurt-style thought experiments to prove their theory. The goal of this paper is to examine the role of imaginary Frankfurt-style examples in compatibilist accounts. More specifically, the compatibilist accounts defended by John Martin Fischer and Michael McKenna are inserted into the broader understanding of a person elaborated by Harry Frankfurt, Robert Kane, and Walter Glannon. Deeper analysis reveals that the exclusion of alternative possibilities based on Frankfurt-style examples is problematic and misleading. A more comprehensive account of moral responsibility and morally significant (source) freedom requires higher-order complex theories of human will and consciousness, in which rational and self-creative abilities, and a real possibility to choose otherwise at least on some occasions during a lifetime, are necessary. Theoretical moral reasons and their logical relations seem to require a sort of higher-order agent-causal incompatibilism. The ability of theoretical or abstract moral reasoning requires complex (strongly emergent) mental and conscious properties, among them an effective free will together with first- and second-order desires. Such a hierarchical theoretical model unifies reasons-responsiveness, mesh theory, and emergentism. It is incompatible with physical causal determinism, because such determinism only allows non-systematic processes that may be hard to predict, but not complex (strongly) emergent systems. An agent's effective will and conscious reflectivity are the starting point of a morally responsible action, which explains why a decision is 'up to the subject'. A free decision does not always have a complete causal history. This kind of emergentist source hyper-incompatibilism seems to be the best direction for the search for an adequate explanation of moral responsibility in the traditional (merit-based) sense. Physical causal determinism as a universal theory would exclude morally significant freedom and responsibility in the traditional sense, because it would exclude the emergence of, and supervenience by, the essential complex properties of human consciousness.
Keywords: consciousness, free will, determinism, emergence, moral responsibility
Procedia PDF Downloads 168
38626 Conduction Model Compatible for Multi-Physical Domain Dynamic Investigations: Bond Graph Approach
Abstract:
In the current paper, a domain-independent conduction model compatible with multi-physical system dynamic investigations is suggested. By means of a port-based approach, a classical nonlinear conduction model containing physical states is first represented. A compatible discrete configuration of the thermal domain, in line with the elastic domain, is then generated by enhancing the configuration of the conventional thermal element. The simulation results presented for a sample structure indicate that the suggested conduction model can cover a wide range of dynamic behavior of the thermal domain.
Keywords: multi-physical domain, conduction model, port based modeling, dynamic interaction, physical modeling
Procedia PDF Downloads 278
38625 Enhancement of Indexing Model for Heterogeneous Multimedia Documents: User Profile Based Approach
Authors: Aicha Aggoune, Abdelkrim Bouramoul, Mohamed Khiereddine Kholladi
Abstract:
Recent research shows that the user profile, as an important element, can improve heterogeneous information retrieval through its content. In this context, we present our indexing model for heterogeneous multimedia documents. This model is based on integrating the user profile into the indexing process. The general idea of our proposal is to exploit the concepts common to the representation of a document and the definition of a user through his profile. These two elements are added as additional indexing entities to enrich the indexes of the heterogeneous document corpus. We have developed the IRONTO domain ontology, which allows the annotation of documents. We also present the tool developed to validate the proposed model.
Keywords: indexing model, user profile, multimedia document, heterogeneous sources, ontology
Procedia PDF Downloads 355
38624 Multivariate Rainfall Disaggregation Using MuDRain Model: Malaysia Experience
Authors: Ibrahim Suliman Hanaish
Abstract:
Disaggregation of daily rainfall using stochastic models formulated with a multivariate approach (MuDRain) is discussed in this paper. Seven rain gauge stations are considered in this study, at distances from the reference station ranging from 4 km to 160 km in Peninsular Malaysia. The hourly rainfall data used cover the period from 1973 to 2008, and July and November are considered as examples of dry and wet periods, respectively. The cross-correlations among the rain gauges are estimated either from the hourly rainfall information available at the neighboring stations or, where such data are not used, from the daily records. This paper discusses the applicability of the MuDRain model for disaggregating daily rainfall into hourly rainfall for both sources of cross-correlation. The goodness of fit of the model was assessed by the reproduction of fitting statistics such as the means, variances, coefficients of skewness, lag-zero cross-correlation coefficients, and lag-one autocorrelation coefficients. It was found that the correlation coefficients extracted from daily data are slightly higher than the correlations based on the available hourly rainfall, especially for neighboring stations not more than 28 km apart. The results also showed that the MuDRain model did not reproduce the statistics very well, and the synthetic hourly rainfall reproduced the actual hyetographs poorly. Meanwhile, a good fit was found between the distribution functions of the historical and synthetic hourly rainfall. These discrepancies are unavoidable because of the low cross-correlation of hourly rainfall. The overall performance indicated that the MuDRain model would not be an appropriate choice for disaggregating daily rainfall.
Keywords: rainfall disaggregation, multivariate disaggregation rainfall model, correlation, stochastic model
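The core disaggregation constraint, that the synthetic hourly values must reproduce the observed daily total, can be sketched as a proportional adjusting step. The independent exponential draws below are only a stand-in for MuDRain's correlated multivariate generator:

```python
import random

random.seed(42)  # reproducible illustration

def disaggregate_day(daily_total, n_hours=24):
    """Draw provisional hourly amounts, then rescale so they sum exactly to the
    observed daily total -- the adjusting step applied to synthetic series.
    A real disaggregator would draw correlated values across stations."""
    raw = [random.expovariate(1.0) for _ in range(n_hours)]
    scale = daily_total / sum(raw)
    return [scale * r for r in raw]

hourly = disaggregate_day(36.0)
print(len(hourly), round(sum(hourly), 6))  # 24 values that preserve the 36 mm total
```

Preserving the daily total is easy; the hard part the abstract describes is also matching the hourly cross-correlations and hyetograph shapes.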
Procedia PDF Downloads 523
38623 Analysis on the Need of Engineering Drawing and Feasibility Study on 3D Model Based Engineering Implementation
Authors: Parthasarathy J., Ramshankar C. S.
Abstract:
Engineering drawings these days play an important role in every part of industry; by and large, they influence every phase of the product development process. Traditionally, drawings are used for communication in industry because they are the clearest way to represent product manufacturing information. Until recently, manufacturing activities were driven by engineering data captured in 2D paper documents or digital representations of those documents. Engineering drawings remain indispensable, yet they have the disadvantage of requiring re-entry of data throughout the manufacturing life cycle. This document-based approach is prone to errors and requires costly re-entry of data at every stage of the manufacturing life cycle, so there is a need to eliminate engineering drawings throughout the product development process and to implement 3D Model Based Engineering (3D MBE, or 3D MBD). Adopting MBD appears to be the next logical step to continue reducing time-to-market and improving product quality. Ideally, by fully applying the MBD concept, the product definition will no longer rely on engineering drawings throughout the product lifecycle. This project addresses the need for engineering drawings and their influence in various parts of industry, as well as the need to implement 3D Model Based Engineering with its advantages and the technical barriers that must be overcome for its implementation. The project also addresses the requirements of neutral formats and their realisation in order to implement the digital product definition principles in a lightweight format. In order to prove the concepts of 3D Model Based Engineering, the screw jack body part is also demonstrated. At ZF Windpower Coimbatore Limited, 3D Model Based Definition has been implemented for the torque arm (machining and casting), steel tube, pinion shaft, cover, and energy tube.
Keywords: engineering drawing, model based engineering MBE, MBD, CAD
Procedia PDF Downloads 437
38622 Combustion Analysis of Suspended Sodium Droplet
Authors: T. Watanabe
Abstract:
Combustion analysis of a suspended sodium droplet is performed by numerically solving the Navier-Stokes equations and the energy conservation equations. The combustion model consists of pre-ignition and post-ignition models: the reaction rate in the pre-ignition model is based on chemical kinetics, while that in the post-ignition model is based on the mass transfer rate of oxygen. The calculated droplet temperature is shown to be in good agreement with existing experimental data. The temperature field in and around the droplet is obtained, as well as the variation of the droplet shape, confirming that the present numerical model is effective for combustion analysis.
Keywords: analysis, combustion, droplet, sodium
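As a rough illustration of the two-stage reaction-rate model described in this abstract, the sketch below switches from an Arrhenius (chemical-kinetics) rate before ignition to a constant oxygen-mass-transfer-limited rate after it, inside a lumped droplet energy balance. Every constant and the form of the energy balance are illustrative assumptions, not values from the paper.

```python
import math

def reaction_rate(T, ignited, A=1.0e6, Ea=4.0e4, R=8.314, k_mass=0.5):
    """Illustrative reaction rate (all constants assumed)."""
    if not ignited:
        return A * math.exp(-Ea / (R * T))   # kinetics-controlled (pre-ignition)
    return k_mass                            # oxygen-transport-controlled (post-ignition)

def droplet_temperature(T0=400.0, T_ign=600.0, T_amb=300.0,
                        q=2.0e7, h=1.0e4, c_eff=5.0e4, dt=0.01, steps=2000):
    """Lumped (single-temperature) droplet energy balance, explicit Euler steps."""
    T, history = T0, []
    for _ in range(steps):
        ignited = T >= T_ign
        T += dt * (q * reaction_rate(T, ignited) - h * (T - T_amb)) / c_eff
        history.append(T)
    return history

temps = droplet_temperature()
```

With these assumed constants, the droplet heats rapidly under Arrhenius kinetics, ignites at the threshold, and then relaxes toward a mass-transfer-limited burning temperature set by the heat-loss balance.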
Procedia PDF Downloads 213
38621 The Leaching Kinetics of Zinc from Industrial Zinc Slag Waste
Authors: Hilary Rutto
Abstract:
The investigation aimed to determine the extent to which zinc can be extracted from secondary sources generated by the galvanising process, using dilute sulphuric acid under controlled laboratory conditions of temperature, solid-liquid ratio, and agitation rate. Each leaching experiment was conducted for a period of 2 hours, and the total zinc extracted was calculated from the amount of zinc dissolved per unit time relative to the initial zinc content of the zinc ash. Sulphuric acid was found to be an effective leaching agent, with an overall extraction of 91.1% at a concentration of 2 M, a solid/liquid ratio of 1 g per 200 mL of leaching solution, a temperature of 65 °C, and slurry agitation at 450 rpm. The leaching mechanism of zinc ash with sulphuric acid conformed well to the shrinking core model.
Keywords: leaching, kinetics, shrinking core model, zinc slag
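For the surface-reaction-controlled form of the shrinking core model mentioned in this abstract, conversion X and leaching time t satisfy 1 − (1 − X)^(1/3) = k·t. The sketch below fits the rate constant k by least squares through the origin to synthetic conversion data (illustrative values, not the paper's measurements):

```python
def scm_reaction_control(x):
    """Shrinking core model, surface-reaction controlled: g(X) = 1 - (1 - X)^(1/3)."""
    return 1.0 - (1.0 - x) ** (1.0 / 3.0)

# Illustrative (synthetic) data: leaching time in minutes vs. fractional Zn conversion.
t = [10.0, 20.0, 40.0, 60.0, 90.0, 120.0]
X = [0.14, 0.27, 0.49, 0.66, 0.83, 0.94]

g = [scm_reaction_control(x) for x in X]

# If the mechanism holds, g(X) is linear in t through the origin: g = k * t.
k = sum(gi * ti for gi, ti in zip(g, t)) / sum(ti * ti for ti in t)
rmse = (sum((gi - k * ti) ** 2 for gi, ti in zip(g, t)) / len(t)) ** 0.5
```

A small residual (rmse) relative to the range of g(X) supports the reaction-controlled mechanism; a diffusion-controlled mechanism would instead linearize 1 − 3(1 − X)^(2/3) + 2(1 − X) against t.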
Procedia PDF Downloads 160
38620 AgriFood Model in Ankara Regional Innovation Strategy
Authors: Coskun Serefoglu
Abstract:
The study analyses how a traditional sector such as agri-food can be mobilized through regional innovation strategies. A principal component analysis, together with qualitative information from in-depth interviews, focus groups, and surveys, was employed to identify the priority sectors. An agri-food model was then developed that includes both a linear model and an interactive model. The model consists of two main components: technological integration, and agricultural extension based on the land-grant university approach of the U.S., which is not yet a common practice in Turkey.
Keywords: regional innovation strategy, interactive model, agri-food sector, local development, planning, regional development
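As a sketch of how a principal component analysis can rank candidate sectors by a composite score, the following uses a toy indicator matrix (sector names and all values are invented for illustration) and extracts the first principal component by power iteration:

```python
# Toy indicator matrix: rows = candidate sectors, columns = standardized
# indicators (all values invented for illustration).
sectors = ["agri-food", "machinery", "ICT", "tourism"]
data = [
    [0.9, 0.8, 0.7],
    [0.4, 0.5, 0.6],
    [0.7, 0.9, 0.8],
    [0.2, 0.3, 0.1],
]

n, m = len(data), len(data[0])

# Center each indicator column.
means = [sum(row[j] for row in data) / n for j in range(m)]
centered = [[row[j] - means[j] for j in range(m)] for row in data]

# Sample covariance matrix of the indicators.
cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
        for b in range(m)] for a in range(m)]

# Leading eigenvector by power iteration -> first principal component.
v = [1.0] * m
for _ in range(200):
    w = [sum(cov[a][b] * v[b] for b in range(m)) for a in range(m)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

# Project each sector onto the first component and rank by score.
scores = [sum(centered[i][j] * v[j] for j in range(m)) for i in range(n)]
ranking = [name for _, name in sorted(zip(scores, sectors), reverse=True)]
```

In practice the component scores would be weighed against the qualitative evidence (interviews, focus groups, surveys) before declaring priority sectors.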
Procedia PDF Downloads 154
38619 A Nonlinear Parabolic Partial Differential Equation Model for Image Enhancement
Authors: Tudor Barbu
Abstract:
We present a robust nonlinear parabolic partial differential equation (PDE)-based denoising scheme. Our approach rests on a second-order anisotropic diffusion model, which is described first. A consistent and explicit numerical approximation algorithm is then constructed for this continuous model using the finite-difference method. Finally, restoration experiments and a method comparison that demonstrate the effectiveness of the proposed technique are discussed.
Keywords: anisotropic diffusion, finite differences, image denoising and restoration, nonlinear PDE model, numerical approximation schemes
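A minimal sketch of an explicit finite-difference scheme for anisotropic diffusion, in the spirit of the classical Perona-Malik model (the paper's own second-order model may differ; the step size, conductance function, and test image here are assumptions):

```python
import math

def perona_malik_step(u, dt=0.2, kappa=2.0):
    """One explicit finite-difference step of Perona-Malik-type diffusion.
    Boundary pixels are left untouched for simplicity."""
    rows, cols = len(u), len(u[0])
    out = [row[:] for row in u]
    def g(s):
        return math.exp(-(s / kappa) ** 2)   # edge-stopping conductance
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            dn = u[i - 1][j] - u[i][j]
            ds = u[i + 1][j] - u[i][j]
            de = u[i][j + 1] - u[i][j]
            dw = u[i][j - 1] - u[i][j]
            out[i][j] = u[i][j] + dt * (g(abs(dn)) * dn + g(abs(ds)) * ds
                                        + g(abs(de)) * de + g(abs(dw)) * dw)
    return out

# A flat 5x5 patch with one noisy pixel: diffusion pulls the outlier toward its
# neighbours, while the conductance g() slows diffusion across large gradients
# so genuine edges are preserved.
img = [[10.0] * 5 for _ in range(5)]
img[2][2] = 14.0
den = img
for _ in range(10):
    den = perona_malik_step(den)
```

The explicit scheme stays stable here because dt times the sum of the four conductances never exceeds one, so each updated pixel is a convex combination of its neighbourhood.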
Procedia PDF Downloads 315
38618 Research on Straightening Process Model Based on Iteration and Self-Learning
Authors: Hong Lu, Xiong Xiao
Abstract:
Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when such parts are heat treated, and they must then be straightened to meet straightness requirements. In the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the process. In this paper, the relationship between straightening load and deflection during the straightening process is analyzed, and a mathematical model of the process is established. Using this model, the straightening stroke is solved by an iterative method. Compared with the traditional straightening stroke algorithm, the stroke calculated by this method is much more precise, because it can adapt to changes in material performance parameters. Since straightening is widely used in the mass production of shaft parts, a knowledge base is used to store data from the straightening process, and a straightening stroke algorithm based on empirical data is set up. This paper thus establishes a straightening process control model that combines the iterative stroke method with the empirical-data algorithm. Finally, an experiment is designed to verify the straightening process control model.
Keywords: straightness, straightening stroke, deflection, shaft parts
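A minimal sketch of the iterative stroke solution: given an assumed load-deflection model with mildly nonlinear elastic springback (all coefficients invented for illustration, not the paper's model), the stroke is grown by a damped fraction of the predicted remaining deflection until the residual deflection vanishes:

```python
def residual_deflection(stroke, delta0=1.2, c=0.35, b=0.02):
    """Illustrative process model (coefficients assumed): pressing by `stroke`
    removes (stroke - springback) of the initial deflection delta0, where the
    elastic springback is mildly nonlinear in the stroke."""
    springback = c * stroke + b * stroke ** 2
    return delta0 - (stroke - springback)

def solve_stroke(tol=1e-6, max_iter=100, gain=0.8):
    """Damped fixed-point iteration: grow the stroke by a fraction of the
    deflection the model predicts would remain."""
    s = 0.0
    r = residual_deflection(s)
    for _ in range(max_iter):
        if abs(r) < tol:
            break
        s += gain * r
        r = residual_deflection(s)
    return s, r

stroke, resid = solve_stroke()
```

In the self-learning variant described in the abstract, the model coefficients would themselves be refreshed from the knowledge base of past strokes rather than held fixed as here.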
Procedia PDF Downloads 330
38617 The Proposal for a Framework to Face Opacity and Discrimination ‘Sins’ Caused by Consumer Creditworthiness Machines in the EU
Authors: Diogo José Morgado Rebelo, Francisco António Carneiro Pacheco de Andrade, Paulo Jorge Freitas de Oliveira Novais
Abstract:
Not everything in AI-powered consumer credit scoring turns out to be a wonder. When using AI in Creditworthiness Assessment (CWA), the ‘sins’ of opacity and unfairness must be addressed if the task is to be deemed responsible. AI software is not always 100% accurate, which can lead to misclassification; discrimination against some groups can be amplified, and a hetero-personalized identity can be imposed on the individual(s) affected. Autonomous CWA also sometimes lacks transparency when black-box models are used. However, for this purpose, human analysts ‘on the loop’ might not be the best remedy consumers are looking for in credit. This study explores the legality of implementing a Multi-Agent System (MAS) framework in consumer CWA to ensure compliance with Article 14(4) of the Proposal for an Artificial Intelligence Act (AIA) of 21 April 2021 (as per the last corrigendum by the European Parliament of 19 April 2024). Especially with the adoption of Art. 18(8)(9) of EU Directive 2023/2225 of 18 October, which will take effect on 20 November 2026, there should be more emphasis on the need for hybrid oversight in AI-driven scoring to ensure fairness and transparency. In fact, the range of EU regulations on AI-based consumer credit will soon impact the AI lending industry locally and globally, as shown by the broad territorial scope of the AIA’s Art. 2. Consequently, engineering the law of consumer CWA is imperative. The proposed MAS framework consists of several layers arranged in sequence: first, the Data Layer gathers legitimate predictor sets from traditional sources; then the Decision Support System Layer, whose neural network model is trained using k-fold cross-validation, provides recommendations based on the feeder data; the eXplainability (XAI) multi-structure comprises Three-Step Agents; and, lastly, the Oversight Layer provides a ‘Bottom Stop’ for analysts to intervene in a timely manner.
From the analysis, a vital component of this software is the XAI layer. It acts as a transparent curtain over the AI’s decision-making process, enabling comprehension, reflection, and feasible further oversight. Local Interpretable Model-agnostic Explanations (LIME) might act as a pillar by offering counterfactual insights. SHapley Additive exPlanations (SHAP), another agent in the XAI layer, could address potential discrimination issues by identifying the contribution of each feature to the prediction. Alternatively, for thin-file or no-file consumers, the Suggestion Agent can promote financial inclusion: it uses lawful alternative sources, such as share of wallet, among others, to search for more advantageous solutions to incomplete evaluation appraisals, based on genetic programming. Overall, this research aspires to bring the concept of machine-centered anthropocentrism to the table of EU policymaking. It acknowledges that, once put into service, credit analysts no longer exert full control over the data-driven entities programmers have given ‘birth’ to. With similar explanatory agents under supervision, AI itself can become self-accountable, prioritizing human concerns and values. AI decisions should not be vilified inherently; the issue lies in how they are integrated into decision-making and whether they align with non-discrimination principles and transparency rules.
Keywords: creditworthiness assessment, hybrid oversight, machine-centered anthropocentrism, EU policymaking
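A highly simplified sketch of the layered pipeline described in this abstract, with a linear-logistic stand-in for the Decision Support System Layer, exact per-feature contributions standing in for the XAI agents, and a borderline-score ‘Bottom Stop’; every feature name, weight, and threshold below is a hypothetical placeholder, not part of the proposed framework:

```python
import math

# Hypothetical feature weights for a linear-logistic stand-in model; the
# framework's actual Decision Support System Layer is a trained neural network.
WEIGHTS = {"income": 1.5, "debt_ratio": -2.0, "payment_history": 1.0}
BIAS = -0.2

def decision_layer(applicant):
    """Score in (0, 1): higher means more creditworthy (illustrative only)."""
    z = BIAS + sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def explanation_agent(applicant):
    """Per-feature contributions to the logit (exact for a linear model;
    a SHAP-style agent would play this role for a neural network)."""
    return {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}

def oversight_layer(score, low=0.35, high=0.65):
    """'Bottom Stop': borderline scores are routed to a human analyst."""
    if low <= score <= high:
        return "refer_to_analyst"
    return "approve" if score > high else "decline"

applicant = {"income": 0.6, "debt_ratio": 0.7, "payment_history": 0.9}
score = decision_layer(applicant)
outcome = oversight_layer(score)
contrib = explanation_agent(applicant)
```

The point of the sketch is the control flow: the explanation travels with the score, and the oversight layer, not the model, decides when a human must intervene.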
Procedia PDF Downloads 37
38616 Identification of Wiener Model Using Iterative Schemes
Authors: Vikram Saini, Lillie Dewan
Abstract:
This paper presents iterative schemes based on least squares, hierarchical least squares, and the stochastic approximation gradient method for the identification of a Wiener model with parametric structure. A gradient method based on stochastic approximation is presented for estimating the parameters of the Wiener model under noise conditions. Simulation results are presented for Wiener model structures with different static nonlinear elements in the presence of colored noise, giving a comparative analysis of the iterative methods. The stochastic gradient method shows improved estimation performance and provides fast convergence of the parameter estimates.
Keywords: hard non-linearity, least square, parameter estimation, stochastic approximation gradient, Wiener model
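A sketch of stochastic-gradient identification of a simple Wiener model (a two-tap FIR linear block followed by a static cubic nonlinearity), on noise-free synthetic data with a normalized gradient step; the system, input signal, and step size are all assumptions for illustration and not the paper's simulation setup:

```python
import math

# Assumed 'true' Wiener system for the demo: FIR linear block (b0, b1)
# followed by the static nonlinearity y = v + a * v**3.
B_TRUE, A_TRUE = (0.8, 0.4), 0.2

def wiener_output(u_now, u_prev, b0, b1, a):
    v = b0 * u_now + b1 * u_prev
    return v + a * v ** 3, v

b0, b1, a = 0.5, 0.0, 0.0        # initial parameter estimates
lr = 0.1                         # normalized stochastic-gradient step (assumed)
errors = []
u_prev = 0.0
for t in range(20000):
    u = 0.8 * math.sin(0.7 * t) + 0.4 * math.sin(2.3 * t)  # rich input signal
    y, _ = wiener_output(u, u_prev, *B_TRUE, A_TRUE)       # measured output
    yhat, v = wiener_output(u, u_prev, b0, b1, a)          # model prediction
    e = y - yhat
    # Gradient of yhat w.r.t. (b0, b1, a), then a normalized gradient update.
    s = 1.0 + 3.0 * a * v * v
    g = (s * u, s * u_prev, v ** 3)
    norm = 1.0 + g[0] ** 2 + g[1] ** 2 + g[2] ** 2
    b0 += lr * e * g[0] / norm
    b1 += lr * e * g[1] / norm
    a += lr * e * g[2] / norm
    errors.append(e * e)
    u_prev = u

mse_first = sum(errors[:200]) / 200
mse_last = sum(errors[-200:]) / 200
```

Fixing the linear coefficient of the nonlinearity at 1 removes the usual gain ambiguity between the two Wiener blocks, so the parameters are identifiable from input-output data alone.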
Procedia PDF Downloads 408
38615 Investigating the Challenges Faced by English Language Teachers in Implementing the Outcome Based Education Model in Engineering Universities of Sindh
Authors: Habibullah Pathan
Abstract:
The present study explores the problems faced by English Language Teaching (ELT) practitioners while implementing the Outcome Based Education (OBE) model in engineering universities of Sindh. OBE is an emerging model and an initiative of the International Engineering Alliance. Traditional educational systems are teacher-centered or curriculum-centered, and learners are often unable to achieve the desired outcomes, whereas the OBE model enables learners to know the outcomes before the start of the program. OBE is a circular process: it begins with the needs and demands of society and of stakeholders, who ask the experts to produce alumni able to fulfill those needs, and it ends with new enrollment in the respective programs of students who can work according to those demands. In engineering institutions, English language courses, like the engineering courses themselves, are taught on the OBE model. English language teachers were interviewed to learn in depth about the problems they face. The study found that teachers faced problems concerning pedagogy, OBE training, assessment, evaluation, and administrative support. This study will guide public- and private-sector English language teachers in coping with these challenges while teaching the English language on the OBE model.
Keywords: problems of ELT teachers, outcome based education (OBE), implementing, assessment
Procedia PDF Downloads 101
38614 Critical Review of Web Content Mining Extraction Mechanisms
Authors: Rabia Bashir, Sajjad Akbar
Abstract:
There is an inevitable demand for web mining due to the rapid increase of information on the Internet, but the striking variety of web structures has made the retrieval of relevant content a difficult task. To counter this issue, Web Content Mining (WCM) has emerged as a potential candidate that extracts and integrates suitable data resources for users. In the past few years, research has been done on several extraction techniques for WCM, i.e., agent-based, template-based, assumption-based, statistic-based, wrapper-based, and machine learning approaches. However, it is still unclear whether these approaches efficiently tackle the significant challenges of WCM. To answer this question, this paper identifies these challenges: language independence, structural flexibility, performance, automation, dynamicity, redundancy handling, intelligence, relevant content retrieval, and privacy. These challenges are then mapped to the existing extraction mechanisms, which helps in adopting the most suitable WCM approach for the conditions and characteristics at hand.
Keywords: content mining challenges, web content mining, web content extraction approaches, web information retrieval
Procedia PDF Downloads 551