Search results for: theoretical model
16177 Degradation of Diclofenac in Water Using FeO-Based Catalytic Ozonation in a Modified Flotation Cell
Authors: Miguel A. Figueroa, José A. Lara-Ramos, Miguel A. Mueses
Abstract:
Pharmaceutical residues are a class of emerging contaminants of anthropogenic origin that are present in many of the waters with which human beings interact daily and are starting to affect the ecosystem directly. Conventional wastewater treatment systems are not capable of degrading these pharmaceutical effluents because their designs cannot handle the intermediate products and biological effects that occur during treatment. It is therefore necessary to hybridize conventional wastewater systems with non-conventional processes. In the specific case of an ozonation process, its efficiency depends strongly on a good dispersion of ozone, long gas-liquid contact times, and the size of the ozone bubbles formed throughout the reaction system. In order to improve these parameters, the use of a modified flotation cell has recently been proposed as a reactive system; flotation cells are used at an industrial level to keep particles in suspension and to spread gas bubbles through the reactor volume at a high rate. The objective of the present work is the development of a mathematical model that can closely predict the kinetic rates of the reactions taking place in the flotation cell at an experimental scale, by identifying proper reaction mechanisms that take into account the modified chemical and hydrodynamic factors in the FeO-catalyzed ozonation of diclofenac aqueous solutions in a flotation cell. The methodology comprises three steps. First, an experimental phase in which a modified flotation cell reactor is used to analyze the effects of ozone concentration and catalyst loading on the degradation of diclofenac aqueous solutions; performance is evaluated through an index of utilized ozone, which relates the amount of ozone supplied to the system per milligram of degraded pollutant.
Next, a theoretical phase in which the reaction mechanisms taking place during the experiments are identified and proposed, detailing the multiple direct and indirect reactions the system goes through. Finally, a kinetic model is obtained that mathematically represents the reaction mechanisms with adjustable parameters that can be fitted to the experimental results, giving the model a proper physical meaning. The expected result is a robust reaction rate law that can simulate the improved mineralization of diclofenac in water using the modified flotation cell reactor. This methodology yielded the following results: a robust reaction-pathway mechanism showcasing the intermediates, free radicals, and products of the reaction; optimal values of the reaction rate constants, which gave simulated Hatta numbers lower than 3 for the modeled system; degradation percentages of 100%; and a TOC (total organic carbon) removal percentage of 69.9%, requiring an optimal FeO catalyst loading of only 0.3 g/L. These results show that a flotation cell could be used as a reactor in ozonation, catalytic ozonation, and photocatalytic ozonation processes, since it produces high reaction rate constants and reduces mass-transfer limitations (avoiding Ha > 3) by producing microbubbles and maintaining a good catalyst distribution.
Keywords: advanced oxidation technologies, iron oxide, emergent contaminants, AOTS intensification
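The regime check behind the Hatta numbers quoted above can be illustrated with a short sketch. For a pseudo-first-order gas-liquid reaction, film theory gives Ha = sqrt(k1*D)/kL, with Ha > 3 indicating a fast, film-limited reaction. The rate constant, diffusivity, and mass-transfer coefficient below are illustrative placeholders, not values from the study.

```python
import math

def hatta_pseudo_first_order(k1: float, diff: float, k_l: float) -> float:
    """Hatta number Ha = sqrt(k1 * D) / kL for a pseudo-first-order
    gas-liquid reaction (k1 in 1/s, D in m^2/s, kL in m/s)."""
    return math.sqrt(k1 * diff) / k_l

# Illustrative (assumed) values: ozone diffusivity in water ~1.7e-9 m^2/s,
# an effective first-order rate constant, and a liquid-film coefficient.
ha = hatta_pseudo_first_order(k1=50.0, diff=1.7e-9, k_l=1e-4)
regime = "fast (film) reaction" if ha > 3 else "no severe mass-transfer limitation"
print(f"Ha = {ha:.2f}: {regime}")
```

A smaller bubble size raises kL and the interfacial area, pushing Ha below 3, which is the effect the flotation cell's microbubbles are credited with.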
Procedia PDF Downloads 116
16176 Creation and Management of Knowledge for Organization Sustainability and Learning
Authors: Deepa Kapoor, Rajshree Singh
Abstract:
This paper appreciates the emergence and growing importance of knowledge as a new production factor, which makes the development of technologies, methodologies, and strategies for its measurement, creation, and diffusion one of the main priorities of organizations in the knowledge society. There are many models for the creation and management of knowledge, and diverse and varied perspectives for their study, analysis, and understanding. In this article, we take a theoretical approach to the types of models for the creation and management of knowledge; we discuss some of them and examine some of the difficulties and the key factors that determine the success of knowledge creation and management processes.
Keywords: knowledge creation, knowledge management, organizational development, organization learning
Procedia PDF Downloads 350
16175 Computation of Stress Intensity Factor Using Extended Finite Element Method
Authors: Mahmoudi Noureddine, Bouregba Rachid
Abstract:
In this paper, the stress intensity factors of a slant-cracked plate of AISI 304 stainless steel have been calculated using the extended finite element method (XFEM) and the finite element method (FEM) in ABAQUS software, and the results were compared with theoretical values.
Keywords: stress intensity factors, extended finite element method, stainless steel, ABAQUS
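For comparison, the theoretical benchmark for such a configuration is usually the closed-form solution for an inclined through-crack in an infinite plate under remote tension: K_I = sigma*sin^2(beta)*sqrt(pi*a) and K_II = sigma*sin(beta)*cos(beta)*sqrt(pi*a), where beta is the angle between the crack plane and the tensile axis. A minimal sketch with illustrative, assumed load and crack values (not the paper's exact case):

```python
import math

def slant_crack_sif(sigma: float, a: float, beta_deg: float):
    """Mode I/II stress intensity factors for a through crack of
    half-length a (m), inclined at beta_deg to the tensile axis,
    in an infinite plate under remote stress sigma (MPa)."""
    beta = math.radians(beta_deg)
    base = sigma * math.sqrt(math.pi * a)
    k1 = base * math.sin(beta) ** 2          # opening mode
    k2 = base * math.sin(beta) * math.cos(beta)  # sliding mode
    return k1, k2  # MPa*sqrt(m)

# Example: 100 MPa remote tension, 10 mm half-crack, 45-degree slant.
k1, k2 = slant_crack_sif(sigma=100.0, a=0.010, beta_deg=45.0)
print(f"K_I = {k1:.2f}, K_II = {k2:.2f} MPa*sqrt(m)")
```

At 45 degrees the two modes are equal; at 90 degrees the solution reduces to the pure Mode I case K_I = sigma*sqrt(pi*a).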
Procedia PDF Downloads 621
16174 The Reliability Analysis of Concrete Chimneys Due to Random Vortex Shedding
Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta
Abstract:
Chimneys are generally tall and slender structures with circular cross-sections, which makes them highly prone to wind forces. Wind exerts pressure on the wall of a chimney, producing unwanted forces. Vortex-induced oscillation is one such excitation and can lead to the failure of the chimney. Vortex-induced oscillation of chimneys is therefore of great concern to researchers and practitioners, since many failures of chimneys due to vortex shedding have occurred in the past. As a consequence, extensive research has taken place on the subject over decades. Many laboratory experiments have been performed to verify the theoretical models proposed to predict vortex-induced forces, including aero-elastic effects. Comparatively few prototype measurement data have been recorded to verify these models. For this reason, theoretical models developed with the help of laboratory data are used to analyze chimneys for vortex-induced forces, which calls for a reliability analysis of the predicted responses produced by the vortex shedding phenomenon. Although a sizeable literature exists on the vortex-induced oscillation of chimneys, including code provisions, reliability analysis of chimneys against failure caused by vortex shedding is scant. In the present study, a reliability analysis of chimneys against vortex-shedding failure is presented, assuming the uncertainty in the vortex shedding phenomenon to be significantly greater than the other uncertainties, which are hence ignored. Vortex shedding is modeled as a stationary random process and is represented by a power spectral density function (PSDF). It is assumed that the vortex-shedding forces are perfectly correlated and act over the top one-third height of the chimney. The PSDF of the tip displacement of the chimney is obtained by performing a frequency-domain spectral analysis using a matrix approach.
For this purpose, both the chimney and the random wind forces are discretized over a number of points along the height of the chimney. The method of analysis duly accounts for aero-elastic effects. The double-barrier threshold crossing level proposed by Vanmarcke is used for determining the probability of crossing different threshold levels of the tip displacement of the chimney. Assuming the annual distribution of the mean wind velocity to be a Gumbel type-I distribution, the fragility curve denoting the variation of the annual probability of threshold crossing against different threshold levels of the tip displacement is determined. The reliability estimate is derived from the fragility curve. A 210 m tall concrete chimney with a base diameter of 35 m, a top diameter of 21 m, and a wall thickness of 0.3 m is taken as an illustrative example, with terrain conditions corresponding to a city center. The expression for the PSDF of the vortex-shedding force is taken from Vickery and Basu. The results of the study show that the threshold-crossing reliability of the tip displacement is significantly influenced by the assumed structural damping and the Gumbel distribution parameters. Further, the aero-elastic effect influences the reliability estimate to a great extent for small structural damping.
Keywords: chimney, fragility curve, reliability analysis, vortex-induced vibration
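The frequency-domain step described above, in which the response PSDF is obtained by passing the force PSDF through the structural transfer function and integrating for the variance, can be sketched for a single mode. The properties below are illustrative, not those of the chimney studied; for a white-noise force PSDF S0 acting on a SDOF oscillator the exact variance is pi*S0/(c*k), which the numerical integration should reproduce.

```python
import math
import numpy as np

# Illustrative SDOF modal properties (not the 210 m chimney's values).
m, c, k = 1.0, 0.2, 100.0            # mass, damping, stiffness
s0 = 1.0                             # two-sided white force PSDF

w = np.linspace(-50.0, 50.0, 50001)  # frequency grid (rad/s)
h2 = 1.0 / ((k - m * w**2) ** 2 + (c * w) ** 2)  # |H(w)|^2
s_x = s0 * h2                        # response PSDF: S_x = |H|^2 * S_F

# variance of the displacement = integral of the response PSDF (trapezoid rule)
var = float(np.sum((s_x[1:] + s_x[:-1]) / 2 * np.diff(w)))
exact = math.pi * s0 / (c * k)       # closed form for white-noise input
print(f"numerical variance = {var:.5f}, exact = {exact:.5f}")
```

With a coloured (e.g. Vickery-Basu) force PSDF the integral no longer has a closed form, but the same |H|^2-weighted integration applies mode by mode.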
Procedia PDF Downloads 169
16173 Lab Bench for Synthetic Aperture Radar Imaging System
Authors: Karthiyayini Nagarajan, P. V. Ramakrishna
Abstract:
Radar imaging techniques provide extensive applications in the field of remote sensing, most notably Synthetic Aperture Radar (SAR), which provides high-resolution target images. This paper puts forward effective and realizable signal generation and processing for SAR images. The major units in the system include a camera, a signal generation unit, a signal processing unit, and a display screen. The real radio channel is replaced by a mathematical model based on an optical image, from which a reflected signal model is calculated in real time. The signal generation unit realizes the algorithm and forms the radar reflection model. The signal processing unit provides range and azimuth resolution through matched filtering and spectrum analysis procedures to form the radar image on the display screen. The restored image has the same quality as the optical image. This SAR imaging system has been designed and implemented using MATLAB and Quartus II tools on a Stratix III device as a lab bench that works in real time, to study and investigate radar imaging rudiments and signal processing schemes for educational and research purposes.
Keywords: synthetic aperture radar, radio reflection model, lab bench, imaging engineering
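The range-compression step that the abstract attributes to matched filtering can be sketched with a linear-FM (chirp) pulse: correlating the received signal with the transmitted chirp collapses the long pulse into a sharp peak at the target delay. The chirp parameters and delay below are arbitrary illustrative choices, not those of the lab bench described.

```python
import numpy as np

fs, T, B = 100e6, 10e-6, 20e6        # sample rate, pulse length, bandwidth
n_samp = int(T * fs)                 # samples in the pulse
t = np.arange(n_samp) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # linear-FM reference pulse

n0, N = 300, 4096                    # echo delay (samples), FFT length
rx = np.zeros(N, dtype=complex)
rx[n0:n0 + n_samp] = chirp           # noiseless point-target echo

# matched filtering = cross-correlation, computed in the frequency domain
y = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(chirp, N)))
peak = int(np.argmax(np.abs(y)))
print(f"compressed peak at sample {peak} (true delay {n0})")
```

The peak-to-sidelobe ratio of the compressed output grows with the time-bandwidth product B*T, which is what gives the fine range resolution despite the long transmitted pulse.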
Procedia PDF Downloads 502
16172 Gaussian Mixture Model Based Identification of Arterial Wall Movement for Computation of Distension Waveform
Authors: Ravindra B. Patil, P. Krishnamoorthy, Shriram Sethuraman
Abstract:
This work proposes a novel Gaussian Mixture Model (GMM) based approach for accurate tracking of the arterial wall and subsequent computation of the distension waveform using the radio-frequency (RF) ultrasound signal. The approach was evaluated on ultrasound RF data acquired with a prototype ultrasound system from an artery-mimicking flow phantom. The effectiveness of the proposed algorithm is demonstrated by comparison with existing wall-tracking algorithms. The experimental results show that the proposed method provides a 20% reduction in the error margin compared to the existing approaches in tracking arterial wall movement. This approach, coupled with an ultrasound system, can be used to estimate the arterial compliance parameters required for screening of cardiovascular disorders.
Keywords: distension waveform, Gaussian Mixture Model, RF ultrasound, arterial wall movement
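The core idea, fitting a Gaussian mixture and tracking the component means as wall positions, can be illustrated with a minimal two-component EM fit on synthetic wall-echo depths. This is a generic 1-D GMM sketch on assumed synthetic data, not the authors' RF tracking algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic depths (mm) of samples from the near and far wall echoes
x = np.concatenate([rng.normal(10.0, 0.2, 500),
                    rng.normal(16.0, 0.2, 500)])

def fit_gmm_1d(x, mu, var=1.0, iters=100):
    """Plain EM for a two-component 1-D Gaussian mixture."""
    mu = np.asarray(mu, dtype=float)
    var = np.array([var, var], dtype=float)
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        pdf = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return np.sort(mu)

near, far = fit_gmm_1d(x, mu=[8.0, 18.0])
print(f"wall positions ~ {near:.2f} mm and {far:.2f} mm")
```

Repeating the fit frame by frame and differencing the tracked means over the cardiac cycle yields a distension-like waveform.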
Procedia PDF Downloads 509
16171 Impact of Workers' Remittances on Poverty in Pakistan: A Time Series Analysis by ARDL
Authors: Syed Aziz Rasool, Ayesha Zaman
Abstract:
Poverty is one of the most important problems for any developing nation. Workers' remittances and investment play a crucial role in the development of a country by reducing its poverty level. This research studies the relationship between workers' remittances and poverty alleviation in Pakistan, focusing on their significant effect on poverty reduction. The study uses time series data for the period 1972-2013. An Autoregressive Distributed Lag (ARDL) model and an Error Correction Model (ECM) have been used to find the long-run and short-run relationships, respectively, between workers' remittances and the poverty level. The inflow of remittances showed a significant negative impact on the poverty level. Moreover, the coefficient of the error correction term, which explains the adjustment towards convergence, is highly significant and negative. Based on this research, policy makers should strongly focus on positive and effective policies to attract more remittances.
JEL Code: J61
Procedia PDF Downloads 291
16170 Service Quality Improvement in Ghana's Healthcare Supply Chain
Authors: Ammatu Alhassan
Abstract:
Quality healthcare delivery is a crucial indicator in assessing the overall developmental status of a country. There are many limitations in the Ghanaian healthcare supply chain due to the lack of studies on the correlation between quality health service and the healthcare supply chain. Patients who visit various healthcare providers face unpleasant experiences such as delays in the availability of their medications. In this study, the quality of services provided to Ghanaian outpatients who visit public healthcare providers was assessed to establish its effect on the healthcare supply chain, using a conceptual model. Donabedian's structure-process-outcome theory for service quality evaluation was used to analyse 20 Ghanaian hospitals, and the data obtained were tested using structural equation modelling (SEM). The findings from this research will help improve the overall quality of the Ghanaian healthcare supply chain. The model to be developed will clarify the linkage between quality healthcare and the healthcare supply chain and serve as a reference tool for future healthcare research in Ghana.
Keywords: Ghana, healthcare, outpatients, supply chain
Procedia PDF Downloads 193
16169 A Regression Model for Predicting Sugar Crystal Size in a Fed-Batch Vacuum Evaporative Crystallizer
Authors: Sunday B. Alabi, Edikan P. Felix, Aniediong M. Umo
Abstract:
Crystal size distribution is of great importance in sugar factories: it determines the market value of granulated sugar and also influences the cost of producing sugar crystals. Typically, sugar is produced in a fed-batch vacuum evaporative crystallizer. Crystallization quality is examined through the crystal size distribution at the end of the process, which is quantified by two parameters: the mean aperture (MA), the average crystal size of the distribution, and the coefficient of variation (CV), the width of the distribution. The lack of real-time measurement of sugar crystal size hinders feedback control and the eventual optimisation of the crystallization process. An attractive alternative is to use a soft sensor (model-based method) for online estimation of the sugar crystal size. Unfortunately, the available models of the sugar crystallization process are not suitable, as they do not contain variables that can be measured easily online. The main contribution of this paper is the development of a regression model for estimating sugar crystal size as a function of input variables that are easy to measure online. This has the potential to provide real-time estimates of crystal size for effective feedback control. Using seven input variables, namely initial crystal size (L0), temperature (T), vacuum pressure (P), feed flowrate (Ff), steam flowrate (Fs), initial supersaturation (S0), and crystallization time (t), preliminary studies were carried out using Minitab 14 statistical software. Based on the existing sugar crystallizer models and the typical ranges of these seven input variables, 128 datasets were obtained from a two-level factorial experimental design. These datasets were used to obtain a simple but online-implementable six-input crystal size model; the initial crystal size (L0) was found not to play a significant role. The goodness of the resulting regression model was evaluated.
The coefficient of determination, R², was obtained as 0.994, and the maximum absolute relative error (MARE) as 4.6%. The high R² (close to 1.0) and the reasonably low MARE indicate that the model is able to predict sugar crystal size accurately as a function of the six easy-to-measure online variables. Thus, the model can be used as a soft sensor to provide real-time estimates of sugar crystal size during the crystallization process in a fed-batch vacuum evaporative crystallizer.
Keywords: crystal size, regression model, soft sensor, sugar, vacuum evaporative crystallizer
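The two goodness-of-fit measures quoted above can be computed as follows for any least-squares fit. The six-input dataset here is a synthetic stand-in, since the paper's crystallizer data are not reproduced in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
# stand-in data: 128 runs of 6 online-measurable inputs -> crystal size
X = rng.uniform(0.0, 1.0, (128, 6))
beta_true = np.array([0.8, 0.3, -0.2, 0.5, 0.1, 0.4])
y = 1.0 + X @ beta_true + rng.normal(0.0, 0.01, 128)

# ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
mare = 100.0 * np.max(np.abs((y - y_hat) / y))   # max absolute relative error, %
print(f"R^2 = {r2:.4f}, MARE = {mare:.2f}%")
```

For a two-level factorial design the design matrix is orthogonal, so the coefficients can also be read off directly from contrasts; the generic least-squares route above works either way.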
Procedia PDF Downloads 212
16168 Analyzing the Effects of Supply and Demand Shocks in the Spanish Economy
Authors: José M Martín-Moreno, Rafaela Pérez, Jesús Ruiz
Abstract:
In this paper, we use a small open economy Dynamic Stochastic General Equilibrium (DSGE) model for the Spanish economy to search for a deeper characterization of the determinants of Spain's macroeconomic fluctuations throughout the period 1970-2008. To do this, we distinguish between tradable and non-tradable goods, to take into account the fact that the presence of non-tradable goods in this economy is one of the largest in the world. We estimate a DSGE model with supply and demand shocks (sectoral productivity, public spending, the international real interest rate, and preferences) using Kalman filter techniques. We find the following results. First, our variance decomposition analysis suggests that (1) the preference shock basically accounts for private consumption volatility, (2) the idiosyncratic productivity shock accounts for non-tradable output volatility, and (3) the sectoral productivity shock together with the international interest rate largely account for tradable output volatility. Second, the model closely replicates the time path observed in the data for the Spanish economy, and finally, it captures the main cyclical qualitative features of this economy reasonably well.
Keywords: business cycle, DSGE models, Kalman filter estimation, small open economy
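Likelihood-based estimation of a DSGE model rests on writing its linearized solution in state-space form and evaluating the likelihood with the Kalman filter. The sketch below filters a toy AR(1) state observed with noise; it is a minimal stand-in for the authors' Spanish-economy system, not a reproduction of it.

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    """Kalman filter for x_t = A x_{t-1} + w_t, y_t = C x_t + v_t.
    Returns the Gaussian log-likelihood and filtered state means."""
    x, P, loglik, xs = x0, P0, 0.0, []
    for yt in y:
        x, P = A @ x, A @ P @ A.T + Q              # predict
        S = C @ P @ C.T + R                        # innovation covariance
        innov = yt - C @ x
        loglik += -0.5 * (np.log(np.linalg.det(2 * np.pi * S))
                          + innov @ np.linalg.solve(S, innov))
        K = P @ C.T @ np.linalg.inv(S)             # Kalman gain
        x = x + K @ innov                          # update
        P = (np.eye(len(x)) - K @ C) @ P
        xs.append(x.copy())
    return loglik, np.array(xs)

# toy AR(1) state observed with noise
rng = np.random.default_rng(2)
T = 200
truth = np.zeros(T)
for t in range(1, T):
    truth[t] = 0.9 * truth[t - 1] + rng.normal(0, np.sqrt(0.1))
y = truth[:, None] + rng.normal(0, np.sqrt(0.5), (T, 1))

ll, xs = kalman_filter(y, A=np.array([[0.9]]), C=np.array([[1.0]]),
                       Q=np.array([[0.1]]), R=np.array([[0.5]]),
                       x0=np.zeros(1), P0=np.eye(1))
print(f"log-likelihood = {ll:.1f}")
```

In estimation, this log-likelihood would be maximized (or combined with priors) over the structural parameters that map into A, C, Q, and R.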
Procedia PDF Downloads 422
16167 The Acquisition of Case in Biological Domain Based on Text Mining
Authors: Shen Jian, Hu Jie, Qi Jin, Liu Wei Jie, Chen Ji Yi, Peng Ying Hong
Abstract:
To address the problem of acquiring biological cases relevant to design problems, a biological-instance acquisition method based on text mining is presented. Through the construction of a corpus text vector space and knowledge mining, the feature selection, similarity measurement, and case retrieval of texts in the field of biology are studied. First, we establish a vector space model of the corpus in the biological field and complete the preprocessing steps. Then, the corpus is retrieved using the vector space model combined with functional keywords to obtain the biological-domain examples related to the design problems. Finally, we verify the validity of the method with a worked text example.
Keywords: text mining, vector space model, feature selection, biologically inspired design
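The vector-space retrieval step can be sketched with a tiny TF-IDF implementation. The mini-corpus of biology snippets below is invented for illustration and is unrelated to the authors' actual corpus.

```python
import math
from collections import Counter

docs = [
    "gecko feet adhesion microscopic hairs setae",
    "lotus leaf water repellent surface microstructure",
    "shark skin riblets reduce drag in water flow",
]
query = "adhesion of gecko feet"

tokenized = [d.lower().split() for d in docs]
vocab = sorted({w for d in tokenized for w in d})
# idf with the plain log(N/df) weighting of the classic vector space model
idf = {w: math.log(len(docs) / sum(w in d for d in tokenized)) for w in vocab}

def tfidf(tokens):
    tf = Counter(tokens)
    return [tf[w] * idf[w] for w in vocab]   # out-of-vocab tokens are ignored

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

doc_vecs = [tfidf(d) for d in tokenized]
q_vec = tfidf(query.lower().split())
scores = [cosine(q_vec, v) for v in doc_vecs]
best = scores.index(max(scores))
print(f"best match: doc {best}: {docs[best]!r}")
```

Ranking by cosine similarity retrieves the gecko-adhesion snippet for the adhesion query, which is the retrieval behaviour the abstract relies on.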
Procedia PDF Downloads 266
16166 Thermodynamics of Aqueous Solutions of Organic Molecule and Electrolyte: Use Cloud Point to Obtain Better Estimates of Thermodynamic Parameters
Authors: Jyoti Sahu, Vinay A. Juvekar
Abstract:
Electrolytes are often used to bring about salting-in and salting-out of organic molecules and polymers (e.g., polyethylene glycols and proteins) from aqueous solutions. For quantification of these phenomena, a thermodynamic model is needed that can accurately predict the activity coefficient of the electrolyte as a function of temperature. The thermodynamic models available in the literature contain a large number of empirical parameters, which are estimated using the lower/upper critical solution temperature of the electrolyte/organic-molecule solution at different temperatures. Since the number of parameters is large, inaccuracies can creep in during their estimation, which can affect the reliability of predictions beyond the range in which the parameters were estimated. The cloud point of a solution is related to its free energy through temperature and composition derivatives; cloud point measurements can therefore be used for accurate estimation of the temperature and composition dependence of the parameters in the free energy model. Hence, if we use a two-pronged procedure, first using the cloud point of the solution to estimate some of the parameters of the thermodynamic model and then determining the rest from osmotic coefficient data, we gain on two counts. First, since fewer parameters are estimated in each of the two steps, we achieve higher accuracy of estimation. The second and more important gain is that the resulting model parameters are more sensitive to temperature, which is crucial when we wish to use the model outside the temperature window within which the parameters were estimated. The focus of the present work is to prove this proposition. We have used electrolyte (NaCl/Na2CO3)-water-organic molecule (isopropanol/ethanol) as the model system. The Robinson-Stokes-Glueckauf model is modified by incorporating temperature-dependent Flory-Huggins interaction parameters.
The Helmholtz free energy expression contains, in addition to the electrostatic and translational entropic contributions, three Flory-Huggins pairwise interaction contributions, χwp, χws, and χps (w: water, p: polymer/organic molecule, s: salt). These parameters depend on both temperature and concentration. The concentration dependence is expressed as a quadratic in the volume fractions of the interacting species, and the temperature dependence is expressed through a prescribed functional form. The temperature- and composition-dependent interaction parameters for the electrolyte-water-organic molecule system are estimated through measurement of the cloud point of the solution using a cloud-point apparatus. The model is then used to estimate the critical solution temperature (CST) of electrolyte-water-organic molecule solutions. We have experimentally determined the critical solution temperature for different compositions of the electrolyte-water-organic molecule solution and compared the results with the estimates based on our model; the two sets of values show good agreement. On the other hand, when only osmotic coefficients are used for estimation of the free energy model, the predicted CSTs show poor agreement with the experiments. The importance of CST data in the estimation of the parameters of the thermodynamic model is thus confirmed through this work.
Keywords: concentrated electrolytes, Debye-Hückel theory, interaction parameters, Robinson-Stokes-Glueckauf model, Flory-Huggins model, critical solution temperature
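How a temperature-dependent interaction parameter translates into a critical solution temperature can be sketched with the standard Flory-Huggins result for a chain of N segments, chi_c = (1/2)(1 + 1/sqrt(N))^2. The linear-in-1/T form chi(T) = A + B/T and the constants below are assumptions for illustration, not the functional form or parameters fitted in this work.

```python
import math

def chi_critical(n_seg: float) -> float:
    """Flory-Huggins critical interaction parameter for chain length N."""
    return 0.5 * (1.0 + 1.0 / math.sqrt(n_seg)) ** 2

def cst_from_chi(a: float, b: float, n_seg: float) -> float:
    """Solve chi(T) = A + B/T = chi_c for T (UCST-type behaviour, B > 0)."""
    return b / (chi_critical(n_seg) - a)

# illustrative constants only
A, B, N = 0.3, 100.0, 100.0
t_c = cst_from_chi(A, B, N)
print(f"chi_c = {chi_critical(N):.3f}, CST = {t_c:.1f} K")
```

Because the CST depends on chi(T) through this crossing condition, cloud-point (CST) data constrain the temperature coefficients of chi far more directly than isothermal osmotic data do, which is the paper's central argument.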
Procedia PDF Downloads 396
16165 Assessing Denitrification-Disintegration Model's Efficacy in Simulating Greenhouse Gas Emissions, Crop Growth, Yield, and Soil Biochemical Processes in Moroccan Context
Authors: Mohamed Boullouz, Mohamed Louay Metougui
Abstract:
Accurate modeling of greenhouse gas (GHG) emissions, crop growth, soil productivity, and biochemical processes is crucial, considering escalating global concerns about climate change and the urgent need to improve agricultural sustainability. This study thoroughly investigates the application of the denitrification-disintegration (DNDC) model in the context of Morocco's unique agro-climate. Our main research hypothesis is that the DNDC model offers an effective and powerful tool for precisely simulating a wide range of significant parameters, including greenhouse gas emissions, crop growth, yield potential, and complex soil biogeochemical processes, consistent with the intricate features of Morocco's agricultural environment. To verify this hypothesis, an extensive body of field data was gathered covering Morocco's various agricultural regions and encompassing a range of soil types, climatic factors, and crop varieties. These experimental data sets will serve as the foundation for careful model calibration and subsequent validation, ensuring the accuracy of the simulation results. In conclusion, the prospective research findings add to the global conversation on climate-resilient agricultural practices while promoting sustainable agricultural models in Morocco. The recognition of the DNDC model as a potent simulation tool tailored to Moroccan conditions may strengthen the ability of policy architects and agricultural actors to make informed decisions that advance not only food security but also environmental stability.
Keywords: greenhouse gas emissions, DNDC model, sustainable agriculture, Moroccan cropping systems
Procedia PDF Downloads 69
16164 Design and Implementation of a Lab Bench for Synthetic Aperture Radar Imaging System
Authors: Karthiyayini Nagarajan, P. V. RamaKrishna
Abstract:
Radar imaging techniques provide extensive applications in the field of remote sensing, most notably Synthetic Aperture Radar (SAR), which provides high-resolution target images. This paper puts forward effective and realizable signal generation and processing for SAR images. The major units in the system include a camera, a signal generation unit, a signal processing unit, and a display screen. The real radio channel is replaced by a mathematical model based on an optical image, from which a reflected signal model is calculated in real time. The signal generation unit realizes the algorithm and forms the radar reflection model. The signal processing unit provides range and azimuth resolution through matched filtering and spectrum analysis procedures to form the radar image on the display screen. The restored image has the same quality as the optical image. This SAR imaging system has been designed and implemented using MATLAB and Quartus II tools on a Stratix III device as a lab bench that works in real time, to study and investigate radar imaging rudiments and signal processing schemes for educational and research purposes.
Keywords: synthetic aperture radar, radio reflection model, lab bench
Procedia PDF Downloads 472
16163 Failure Analysis and Verification Using an Integrated Method for Automotive Electric/Electronic Systems
Authors: Lei Chen, Jian Jiao, Tingdi Zhao
Abstract:
Failures of automotive electric/electronic systems, which are universally considered to be safety-critical and software-intensive, may cause catastrophic accidents. Analysis and verification of failures in these kinds of systems is a major challenge as system complexity increases. Model checking is often employed to allow formal verification by ensuring that the system model conforms to specified safety properties: the system-level effects of failures are established, and their effects on system behavior are observed through formal verification. A hazard analysis technique called Systems-Theoretic Process Analysis (STPA) is capable of identifying design flaws that may cause potential failure hazards, including software and system design errors and unsafe interactions among multiple system components. This paper shows how model checking can be integrated with Systems-Theoretic Process Analysis to perform failure analysis and verification of automotive electric/electronic systems. As a result, safety requirements are optimized, and failure propagation paths are found. Finally, an automotive electric/electronic system case study is used to verify the effectiveness and practicability of the method.
Keywords: failure analysis and verification, model checking, system-theoretic process analysis, automotive electric/electronic system
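The essence of checking a safety property by model checking, exhaustively exploring the reachable states of a system model and flagging any state that violates an invariant, can be shown with a toy explicit-state checker. This is a didactic sketch, not the tooling used in the paper.

```python
from collections import deque

def check_invariant(init, successors, invariant):
    """Breadth-first reachability: return a reachable state violating
    the invariant (a counterexample), or None if the property holds."""
    seen, frontier = {init}, deque([init])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None

# toy model: a counter that wraps modulo 8; safety property: never reach 5
bad = check_invariant(0, lambda s: [(s + 1) % 8], lambda s: s != 5)
print(f"counterexample state: {bad}")
```

Real model checkers add temporal logics and symbolic state representations, but the counterexample-producing exploration above is the core mechanism that makes failure propagation paths visible.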
Procedia PDF Downloads 126
16162 Debriefing Practices and Models: An Integrative Review
Authors: Judson P. LaGrone
Abstract:
Simulation-based education in curricula was once a luxurious component of nursing programs but now serves as a vital element of an individual's learning experience. A debriefing occurs after the simulation scenario or clinical experience is completed, allowing the instructor(s) or trained professional(s) to act as a debriefer who guides a reflection with the purpose of acknowledging, assessing, and synthesizing the thought processes, decision-making processes, and actions/behaviors performed during the scenario or clinical experience. Debriefing is a vital component of the simulation process and educational experience, allowing the learner(s) to progressively build upon past experiences and current scenarios within a safe and welcoming environment, with a guided dialog to enhance future practice. The aim of this integrative review was to assess current practices of debriefing models in simulation-based education for healthcare professionals and students. The following databases were utilized for the search: CINAHL Plus, Cochrane Database of Systematic Reviews, EBSCO (ERIC), PsycINFO (Ovid), and Google Scholar. The advanced search option was useful to narrow the search (full text, Boolean operators, English language, peer-reviewed, published in the past five years). Key terms included debrief, debriefing, debriefing model, debriefing intervention, psychological debriefing, simulation, simulation-based education, simulation pedagogy, health care professional, nursing student, and learning process. Included studies focus on debriefing after clinical scenarios of nursing students, medical students, and interprofessional teams conducted between 2015 and 2020. Common themes were identified after the analysis of articles matching the search criteria. Several debriefing models are addressed in the literature, with similar effectiveness for participants in clinical simulation-based pedagogy.
The themes identified included (a) the importance of debriefing in simulation-based pedagogy, (b) the environment in which debriefing takes place, (c) who should conduct the debrief, (d) the length of the debrief, and (e) the methodology of the debrief. Debriefing models supported by theoretical frameworks and facilitated by trained staff are vital for a successful debriefing experience. Models ranged from self-debriefing, facilitator-led debriefing, video-assisted debriefing, and rapid cycle deliberate practice to reflective debriefing. A recurring finding emphasized the need for continued research into systematic tool development and analysis of the validity and effectiveness of current debriefing practices. There is a lack of consistency in debriefing models across nursing curricula, along with an increasing number of faculty ill-prepared to facilitate the debriefing phase of simulation.
Keywords: debriefing model, debriefing intervention, health care professional, simulation-based education
Procedia PDF Downloads 146
16161 Measuring Business Strategy and Information Systems Alignment
Authors: Amit Saraswat, Ruchi Tewari
Abstract:
Purpose: The research paper aims at understanding the alignment of business and IT in the Indian context and the business value attached to such alignment. Methodology: The study was conducted in two stages. Stage one: bibliographic research was conducted to evolve the parameters for defining alignment. Stage two: a model for strategic alignment was evolved in order to conduct an empirical study. The model is defined in terms of four fundamental domains of strategic management choice: business strategy, information strategy, organizational structure, and information technology structure. A questionnaire survey was conducted across organizations from four different industries, and the Structural Equation Modelling (SEM) technique was used to validate the model. Findings: In the Indian scenario, not all the subscales of alignment could be validated. It could be validated that organizational strategy impacts information strategy and information technology structure. Research limitations: The study is limited to the Indian context. Business-IT alignment may be culture-dependent, so further research is required to validate the model in other cultures. Originality/value: In the western world, several models of the alignment of business strategy and information systems are available, but they do not measure the extent of alignment, which the current study does in the Indian context. The findings of the study can be used by managers in strategizing and understanding their business and information systems needs holistically and cohesively, leading to efficient use of resources and output.
Keywords: business strategy, information technology (IT), business IT alignment, SEM
Procedia PDF Downloads 392
16160 Academic Staff Identity and Emotional Labour: Exploring Pride, Motivation, and Relationships in Universities
Authors: Keith Schofield, Garry R. Prentice
Abstract:
The perceptions of the work an academic does, and the environment in which they do it, contribute to the professional identity of that academic. In turn, this has implications for the level of involvement they have in their job, their satisfaction, and their work product. This research explores academic identities in British and Irish institutions and considers the complex interplay between identity, practice, and participation. Theoretical assumptions made in this paper assert that meaningful work has positive effects on work pride, organisational commitment, organisational citizenship, and motivation; when employees participate enthusiastically, they are likely to be more engaged, more successful, and more satisfied. Further examination is given to the context in which this participation happens; the nature of institutional processes, management, and relationships with colleagues, team members, and students is considered. The present study follows a mixed-methods approach to explore work satisfaction constructs in a number of academic contexts in the UK and Ireland. The quantitative component of this research (Convenience Sample: 155 academics and support/administrative staff; 36.1% male, 63.9% female; 60.8% academic staff, 39.2% support/administration staff; across a number of universities in the UK and Ireland) was based on an established emotional labour model and was tested across gender groups, job roles, and years of service. This was complemented by qualitative semi-structured interviews (Purposive Sample: 10 academics and 5 support/administrative staff across the same universities in the UK and Ireland) to examine various themes, including values within academia, work conditions, professional development, and the transmission of knowledge to students.
Experiences from both academic and support perspectives were sought in order to gain a holistic view of academia and to provide an opportunity to explore the dynamic of the academic/administrator relationship within the broader institutional context. The quantitative emotional labour model, tested via a path analysis, provided a robust description of the relationships within the data. The significant relationships found within the quantitative emotional labour model included a link between the non-expression of true feelings resulting in emotionally laborious work and lower levels of intrinsic motivation and higher levels of extrinsic motivation. Higher levels of intrinsic motivation also linked positively to work pride. These findings were further explored in the qualitative elements of the research, where themes emerged including the disconnection between faculty management and staff, personal fulfilment, and the friction between the identities of teacher, researcher/practitioner, and administrator. The implications of the research findings from this study are combined and discussed in relation to possible identity-related and emotional labour management-related interventions. Further, suggestions are made to institutions concerning the application of these findings, including the development of academic practices, with specific reference to the duality of identity required to service the combined teacher/researcher role. Broader considerations of the paper include how individuals and institutions may engage with the changing nature of students-as-consumers, as well as a recommendation to centralise personal fulfilment through the development of professional academic identities.
Keywords: academic work, emotional labour, identity friction, mixed methods
Procedia PDF Downloads 282
16159 A Mathematical Model for Reliability Redundancy Optimization Problem of K-Out-Of-N: G System
Authors: Gak-Gyu Kim, Won Il Jung
Abstract:
According to the remarkable development of science and technology, the functions and roles of engineering systems have recently diversified. Systems have become increasingly complex and precise, and thus system designers intent on maximizing reliability concentrate more effort at the design stage. This study deals with the reliability redundancy optimization problem (RROP) for a k-out-of-n: G system configuration with cold standby and warm standby components. This paper further presents an optimal mathematical model through which the following three elements may be combined in order to maximize the reliability of the system: (i) multiple component choices, (ii) redundant component quantity, and (iii) the choice of redundancy strategy. Therefore, we focus on the following three issues. First, we consider an RROP in which components have a warm standby state as well as a cold standby state. Second, eliminating the approximation approach of previous RROP studies, we construct a precise model for system reliability. Third, given the transition time when the state of components changes, we present not simply a workable solution but an advanced method. Moreover, for the wide applicability of RROPs, we use an absorbing continuous-time Markov chain and matrix analytic methods in the suggested mathematical model.
Keywords: RROP, matrix analytic methods, k-out-of-n: G system, MTTF, absorbing continuous time Markov chain
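As a minimal illustration of the absorbing continuous-time Markov chain approach mentioned above, the sketch below computes the MTTF of a k-out-of-n: G system of identical, non-repairable active components. It deliberately omits the cold/warm standby states of the paper's model, and all rates are illustrative:

```python
import numpy as np

def mttf_k_out_of_n(n, k, lam):
    """MTTF of a k-out-of-n:G system of identical components
    (failure rate lam, no repair, no standby) via an absorbing CTMC.
    Transient states: j = n, n-1, ..., k working components;
    absorption occurs when only k-1 components still work."""
    m = n - k + 1                      # number of transient states
    Q = np.zeros((m, m))               # transient generator block
    for s in range(m):
        j = n - s                      # components still working
        Q[s, s] = -j * lam             # total exit rate from state j
        if s + 1 < m:
            Q[s, s + 1] = j * lam      # a failure moves j -> j-1
    # Expected time to absorption starting from state j = n:
    # MTTF = e0^T (-Q)^{-1} 1
    return np.linalg.solve(-Q, np.ones(m))[0]

# 2-out-of-3 system with unit failure rates: MTTF = 1/3 + 1/2
print(round(mttf_k_out_of_n(3, 2, 1.0), 4))  # 0.8333
```

For this no-repair special case the result reduces to the closed form sum of 1/(j·lam) for j = k..n, which is a useful cross-check on the matrix solution.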
Procedia PDF Downloads 256
16158 A Damage-Plasticity Concrete Model for Damage Modeling of Reinforced Concrete Structures
Authors: Thanh N. Do
Abstract:
This paper addresses the modeling of two critical behaviors of concrete material in reinforced concrete components: (1) the increase in strength and ductility due to confining stresses from surrounding transverse steel reinforcement, and (2) the progressive deterioration in strength and stiffness due to high strain and/or cyclic loading. To improve on the state of the art, the author presents a new 3D constitutive model of concrete material based on plasticity and continuum damage mechanics theory to simulate both the confinement effect and the strength deterioration in reinforced concrete components. The model defines a yield function of the stress invariants and a compressive damage threshold based on the level of confining stresses to automatically capture the increase in strength and ductility under high compressive stresses. The model introduces two damage variables to describe the strength and stiffness deterioration under tensile and compressive stress states. The damage formulation characterizes the degrading behavior of concrete material well, including the nonsymmetric strength softening in tension and compression, as well as the progressive strength and stiffness degradation under primary and follower load cycles. The proposed damage model is implemented in a general-purpose finite element analysis program, allowing an extensive set of numerical simulations to assess its ability to capture the confinement effect and the degradation of the load-carrying capacity and stiffness of structural elements. It is validated against a collection of experimental data on the hysteretic behavior of reinforced concrete columns and shear walls under different load histories. These correlation studies demonstrate the ability of the model to describe vastly different hysteretic behaviors with a relatively consistent set of parameters. The model shows excellent consistency in response determination with very good accuracy.
Its numerical robustness and computational efficiency are also very good and will be further assessed with large-scale simulations of structural systems.
Keywords: concrete, damage-plasticity, shear wall, confinement
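The strength-softening behavior described above can be illustrated with a one-dimensional scalar-damage sketch. The exponential damage law and all parameters below are illustrative assumptions, not the paper's 3D damage-plasticity formulation:

```python
import math

def stress_1d(strain, E=30e3, eps0=1e-4, beta=2000.0):
    """A minimal 1D scalar-damage sketch: sigma = (1 - d) * E * eps,
    with damage d growing exponentially once strain exceeds the
    threshold eps0 (E in MPa; eps0 and beta are illustrative)."""
    if strain <= eps0:
        d = 0.0                                   # elastic, undamaged
    else:
        d = 1.0 - math.exp(-beta * (strain - eps0))  # 0 < d < 1
    return (1.0 - d) * E * strain, d

sig_el, d_el = stress_1d(5e-5)   # below threshold: no damage
sig_dm, d_dm = stress_1d(1e-3)   # softened response, 0 < d < 1
print(d_el, round(d_dm, 3))
```

The stiffness seen on unloading is (1 - d)·E, which is how a scalar damage variable captures the stiffness degradation under load cycles.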
Procedia PDF Downloads 173
16157 On Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Secondary Distant Metastases Growth in Patients with Lymph Nodes Metastases
Authors: Ella Tyuryumina, Alexey Neznanov
Abstract:
This paper is devoted to mathematical modelling of the progression and stages of breast cancer. We propose the consolidated mathematical growth model of primary tumor and secondary distant metastases growth in patients with lymph node metastases (CoM-III) as a new research tool. We are interested in: 1) modelling the whole natural history of primary tumor and secondary distant metastases growth in patients with lymph node metastases; 2) developing an adequate and precise CoM-III which reflects the relations between primary tumor and secondary distant metastases; 3) analyzing the CoM-III scope of application; 4) implementing the model as a software tool. Firstly, the CoM-III includes an exponential tumor growth model as a system of determinate nonlinear and linear equations. Secondly, the mathematical model corresponds to the TNM classification. It allows calculating different growth periods of primary tumor and secondary distant metastases in patients with lymph node metastases: 1) the ‘non-visible period’ for the primary tumor; 2) the ‘non-visible period’ for secondary distant metastases; 3) the ‘visible period’ for secondary distant metastases. The new predictive tool: 1) is a solid foundation for future studies of breast cancer models; 2) does not require any expensive diagnostic tests; 3) is the first predictor which makes a forecast using only current patient data, while the others are based on additional statistical data. Thus, the CoM-III model and predictive software: a) detect the different growth periods of primary tumor and secondary distant metastases in patients with lymph node metastases; b) forecast the period of distant metastases appearance in patients with lymph node metastases; c) have higher average prediction accuracy than the other tools; d) can improve forecasts of breast cancer survival and facilitate optimization of diagnostic tests.
The following are calculated by the CoM-III: the number of doublings for the ‘non-visible’ and ‘visible’ growth periods of secondary distant metastases, and the tumor volume doubling time (days) for those same periods. The CoM-III enables, for the first time, prediction of the whole natural history of primary tumor and secondary distant metastases growth at each stage (pT1, pT2, pT3, pT4) relying only on primary tumor sizes. Summarizing: a) the CoM-III correctly describes primary tumor and secondary distant metastases growth of stages IA, IIA, IIB, IIIB (T1-4N1-3M0) in patients with lymph node metastases (N1-3); b) it facilitates understanding of the appearance period and inception of secondary distant metastases.
Keywords: breast cancer, exponential growth model, mathematical model, primary tumor, secondary metastases, survival
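The doubling counts and doubling times reported above follow directly from the exponential growth assumption; the sketch below shows the generic formulas, not the CoM-III equations themselves, with illustrative volumes:

```python
import math

def doublings(v_start, v_end):
    """Number of volume doublings between two tumor volumes
    under exponential growth: log2(V_end / V_start)."""
    return math.log2(v_end / v_start)

def doubling_time(v1, v2, dt_days):
    """Tumor volume doubling time (days) from two measurements
    taken dt_days apart, assuming V(t) = V1 * 2**(t / DT)."""
    return dt_days / math.log2(v2 / v1)

# A tumor growing from 1 cm^3 to 8 cm^3 has undergone 3 doublings;
# if that growth took 300 days, the doubling time is 100 days.
print(doublings(1.0, 8.0), doubling_time(1.0, 8.0, 300))  # 3.0 100.0
```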
Procedia PDF Downloads 308
16156 Teaching the Temperature Dependence of Electrical Resistance of Materials through Arduino Investigation
Authors: Vinit Srivastava, Abhay Singh Thakur, Shivam Dubey, Rahul Vaish, Bharat Singh Rajpurohit
Abstract:
This study examines the problem of students' poor comprehension of the thermal dependence of resistance by investigating the concept through an evidence-based inquiry approach. It suggests a practical exercise to improve secondary school students' comprehension of how materials' resistance changes with temperature. The suggested exercise uses an Arduino and a Peltier device to test the resistance of aluminum and graphite at various temperatures. The study attempts to close the gap between the theoretical and practical facets of the subject, which students frequently find difficult to grasp. With the help of a variety of resistors made of various materials and pencils of varying grades, the Arduino experiment investigates the resistance of a metallic conductor (aluminum) and a semiconductor (graphite) at various temperatures. The purpose of the research is to clarify for students the relationship between temperature and resistance and to emphasize the importance of resistor material choice and measurement methods in obtaining precise and stable resistance values over dynamic temperature variations. The findings show that while the resistance of graphite decreases with temperature, the resistance of the metallic conductor rises with temperature. The results also show that the resistance values of the pencil resistors drop as softer, lower-grade pencil leads are used. In addition, resistors with smaller temperature coefficients of resistance (TCR) showed greater stability at lower temperatures. Overall, the results of this article show that the suggested experiment is a useful and practical method for teaching students about the temperature dependence of resistance. It emphasizes how crucial it is to take the resistor material and the resistance measurement technique into account when designing and selecting resistors for various uses.
The results of the study are anticipated to guide the creation of more efficient teaching methods to close the gap between the theoretical and practical components of science education.
Keywords: electrical resistance, temperature dependence, science education, inquiry-based activity, resistor stability
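The linear resistance-temperature model underlying the TCR discussion can be sketched as follows; the resistance values and coefficients are illustrative textbook numbers, not the experiment's measurements:

```python
def resistance_at(r0, alpha, t, t0=20.0):
    """Linear model R(T) = R0 * (1 + alpha * (T - T0)),
    with R0 measured at reference temperature T0 (°C)."""
    return r0 * (1 + alpha * (t - t0))

def tcr(r1, t1, r2, t2):
    """Temperature coefficient of resistance (per °C) estimated
    from two (resistance, temperature) measurements."""
    return (r2 / r1 - 1.0) / (t2 - t1)

# A conductor with positive alpha (resistance rises with T, as for
# aluminum); a semiconductor like graphite would have negative alpha.
r_hot = resistance_at(100.0, 0.0039, 70.0)       # 100 Ω at 20 °C, heated
print(round(r_hot, 2))                            # 119.5
print(round(tcr(100.0, 20.0, r_hot, 70.0), 4))    # recovers 0.0039
```

In the classroom version, the two (R, T) pairs would come from the Arduino's voltage-divider readings at two Peltier setpoints.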
Procedia PDF Downloads 78
16155 Computer Software for Calculating Electron Mobility of Semiconductor Compounds; Case Study for n-GaN
Authors: Emad A. Ahmed
Abstract:
Computer software to calculate electron mobility with respect to different scattering mechanisms has been developed. The software fully adopts a Graphical User Interface (GUI), designed in Microsoft Visual Basic 6.0. As a case study, the electron mobility of n-GaN was computed using this software. The behaviour of the mobility of n-GaN due to elastic scattering processes and its relation to temperature and doping concentration are discussed. The results agree with other available theoretical and experimental data.
Keywords: electron mobility, relaxation time, GaN, scattering, computer software, computational physics
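When several independent scattering mechanisms act together, their partial mobilities are conventionally combined with Matthiessen's rule; a minimal sketch with illustrative values, not the paper's computed n-GaN mobilities:

```python
def combined_mobility(mobilities):
    """Matthiessen's rule: reciprocal mobilities from independent
    scattering mechanisms add, 1/mu_total = sum(1/mu_i)."""
    return 1.0 / sum(1.0 / m for m in mobilities)

# Hypothetical partial mobilities (cm^2/V·s) for, e.g., ionized-impurity,
# acoustic-phonon, and polar-optical-phonon scattering:
print(round(combined_mobility([2000.0, 3000.0, 6000.0]), 1))  # 1000.0
```

The combined value is always below the smallest partial mobility, which is why the dominant (most limiting) mechanism controls the temperature and doping dependence.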
Procedia PDF Downloads 678
16154 Estimating Water Balance at Beterou Watershed, Benin Using Soil and Water Assessment Tool (SWAT) Model
Authors: Ella Sèdé Maforikan
Abstract:
Sustained water management requires quantitative information and knowledge of the spatiotemporal dynamics of the hydrological system within the basin, which can be achieved through research. Several studies have investigated both surface water and groundwater in the Beterou catchment; however, there are few published papers on the application of SWAT modeling in the Beterou catchment. The objective of this study was to evaluate the performance of SWAT in simulating the water balance within the watershed. The input data consist of a digital elevation model, land use maps, a soil map, climatic data, and discharge records. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI-2) approach. The calibration period ran from 1989 to 2006, with a four-year warm-up period (1985-1988); validation ran from 2007 to 2020. The goodness of fit was assessed using five indices: Nash-Sutcliffe efficiency (NSE), the ratio of the root mean square error to the standard deviation of measured data (RSR), percent bias (PBIAS), the coefficient of determination (R²), and Kling-Gupta efficiency (KGE). Results showed that the SWAT model successfully simulated river flow in the Beterou catchment, with NSE = 0.79, R² = 0.80 and KGE = 0.83 for calibration, against NSE = 0.78, R² = 0.78 and KGE = 0.85 for validation using site-based streamflow data. The relative error (PBIAS) ranges from -12.2% to 3.1%. The runoff curve number (CN2), moist bulk density (SOL_BD), baseflow alpha factor (ALPHA_BF), and available water capacity of the soil layer (SOL_AWC) were the most sensitive parameters. The study calls for further research with uncertainty analysis, offers recommendations for model improvement, and provides an efficient means to improve rainfall and discharge measurement data.
Keywords: watershed, water balance, SWAT modeling, Beterou
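The goodness-of-fit indices used above are standard and straightforward to compute directly; a minimal sketch over illustrative discharge values, not the Beterou streamflow data:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

def pbias(obs, sim):
    """Percent bias: positive means the model underestimates."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def rsr(obs, sim):
    """RMSE normalized by the standard deviation of observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return np.sqrt(np.sum((obs - sim)**2)) / np.sqrt(np.sum((obs - obs.mean())**2))

def kge(obs, sim):
    """Kling-Gupta efficiency from correlation, variability and bias ratios."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()       # variability ratio
    beta = sim.mean() / obs.mean()      # bias ratio
    return 1 - np.sqrt((r - 1)**2 + (alpha - 1)**2 + (beta - 1)**2)

obs = [12.0, 15.0, 30.0, 22.0, 18.0]    # illustrative daily flows (m^3/s)
sim = [11.0, 16.0, 28.0, 24.0, 17.0]
print(round(nse(obs, sim), 3), round(pbias(obs, sim), 1))
```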
Procedia PDF Downloads 60
16153 Rating Agreement: Machine Learning for Environmental, Social, and Governance Disclosure
Authors: Nico Rosamilia
Abstract:
The study evaluates the importance of non-financial disclosure practices for regulators, investors, businesses, and markets. It aims to create a sector-specific set of indicators for environmental, social, and governance (ESG) performance as an alternative to the ratings of the agencies. The existing literature extensively studies the implementation of ESG rating systems. By contrast, this study has a twofold outcome. Firstly, it should generalize incentive systems and governance policies for ESG and sustainable principles, and thereby contribute to the EU Sustainable Finance Disclosure Regulation. Secondly, it concerns the market and investors by highlighting successful sustainable investing. Indeed, the study contemplates the effect of ESG adoption practices on corporate value. The research explores the asset pricing angle in order to shed light on the fragmented debate on the finance of ESG: investors may be misguided about the positive or negative effects of ESG on performance. The paper proposes a different method to evaluate ESG performance. By comparing the results of a traditional econometric approach (Lasso) with a machine learning algorithm (Random Forest), the study establishes a set of indicators for ESG performance. The research thereby also contributes empirically to the theoretical strands of literature regarding model selection and variable importance in a finance framework. The algorithms yield sector-specific indicators. This set of indicators defines an alternative to the compounded scores of ESG rating agencies and avoids the possible offsetting effect of such scores. With this approach, the paper defines a sector-specific set of indicators to standardize ESG disclosure. Additionally, it tries to shed light on the absence of a clear understanding of the direction of the ESG effect on corporate value (the problem of endogeneity).
Keywords: ESG ratings, non-financial information, value of firms, sustainable finance
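The Lasso-versus-Random-Forest variable-importance comparison can be sketched on synthetic data; the code below uses scikit-learn with illustrative features standing in for ESG indicators, not the study's dataset:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in: 5 "indicator" columns, with only columns 0 and 2
# actually driving the corporate-value proxy y (names are hypothetical).
X = rng.normal(size=(300, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.1, size=300)

lasso = Lasso(alpha=0.05).fit(X, y)                     # sparse linear fit
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Both methods should single out indicators 0 and 2 as the drivers:
print(np.round(lasso.coef_, 2))                         # sparse coefficients
print(np.argsort(rf.feature_importances_)[::-1][:2])    # top-2 by importance
```

Agreement between the two rankings is the kind of cross-validation of indicator selection the study relies on; disagreement flags indicators whose effect is nonlinear or interaction-driven.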
Procedia PDF Downloads 88
16152 Through 7S Model to Promote the Service Innovation Management
Authors: Cheng Fang Hsu
Abstract:
The call center is the core of building a customer relationship management system. Under strong competitive stress, it becomes a new profit challenge for a successful enterprise. A call center is a department that not only provides customer service but also brings in business profit. This is a qualitative case study of the Taiwanese banking service industry, pursuing deeper exploration and analysis through business interviews and industrial analysis. The study traces the establishment, development, and management of the case call center after its reform. Through SWOT analysis and industrial analysis, the study adopted the 7S model to explain how the call center reformed from service-oriented to profit-oriented and from cost management to profit management. The results indicate how service innovation management enables the call center to operate as a market profit competition center. Recommendations are made to support the call center in marketing profit through service innovation management.
Keywords: call center, 7S model, service innovation management, bioinformatics
Procedia PDF Downloads 493
16151 SAP-Reduce: Staleness-Aware P-Reduce with Weight Generator
Authors: Lizhi Ma, Chengcheng Hu, Fuxian Wong
Abstract:
Partial reduce (P-Reduce) has achieved state-of-the-art performance for distributed machine learning in heterogeneous environments over the All-Reduce architecture. Dynamic P-Reduce based on the exponential moving average (EMA) approach predicts all the intermediate model parameters, which introduces unreliability: the approximation trick leads to incorrect model parameters across the nodes. In this paper, SAP-Reduce is proposed, a variant of the All-Reduce distributed training model with staleness-aware dynamic P-Reduce. SAP-Reduce directly utilizes an EMA-like algorithm to generate the normalized weights. To demonstrate the effectiveness of the algorithm, experiments are set up on a number of deep learning models, comparing the single-step training acceleration ratio and convergence time. It is found that SAP-Reduce, by simplifying dynamic P-Reduce, outperforms the intermediate-approximation variant. The empirical results show SAP-Reduce is 1.3× to 2.1× faster than existing baselines.
Keywords: collective communication, decentralized distributed training, machine learning, P-Reduce
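One way to realize a staleness-aware normalized weight generator is sketched below; the geometric decay rule is an assumption for illustration, not the paper's exact EMA-like algorithm:

```python
def staleness_weights(staleness, decay=0.9):
    """A minimal sketch of a staleness-aware weight generator:
    each worker's raw weight decays geometrically with its staleness
    (in steps behind the freshest worker), then the weights are
    normalized so the aggregate is a convex combination of updates."""
    raw = [decay ** s for s in staleness]
    total = sum(raw)
    return [r / total for r in raw]

# Three workers that are 0 / 1 / 3 steps stale: fresher updates
# receive larger normalized weights.
w = staleness_weights([0, 1, 3])
print([round(x, 3) for x in w], round(sum(w), 6))
```

Any monotone-decreasing function of staleness would serve the same purpose; normalization is what keeps the aggregated parameters well-scaled regardless of how many workers lag.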
Procedia PDF Downloads 36
16150 Membrane Distillation Process Modeling: Dynamical Approach
Authors: Fadi Eleiwi, Taous Meriem Laleg-Kirati
Abstract:
This paper presents complete dynamic modeling of a membrane distillation process. The model contains two consistent dynamic models: a 2D advection-diffusion equation for modeling the whole process and a modified heat equation for modeling the membrane itself. The complete model describes the temperature diffusion phenomenon across the feed, membrane, and permeate containers and the boundary layers of the membrane. It gives an online and complete temperature profile for each point in the domain. It explains the heat conduction and convection mechanisms that take place inside the process in terms of mathematical parameters and justifies process behavior during transient and steady-state phases. The process is monitored for any sudden change in performance at any instant of time. In addition, the model assists in maintaining production rates as desired and gives recommendations during membrane fabrication stages. System performance and parameters can be optimized and controlled using this complete dynamic model. The evolution of membrane boundary temperature with time, vapor mass transfer along the process, and the temperature difference between membrane boundary layers are depicted and included. Simulations were performed over the complete model with real membrane specifications. The plots show consistency between the 2D advection-diffusion model and the expected behavior of the system, as well as with the literature. The evolution of heat inside the membrane, starting from the transient response until reaching the steady-state response, is illustrated for fixed and varying times.
Keywords: membrane distillation, dynamical modeling, advection-diffusion equation, thermal equilibrium, heat equation
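A reduced, one-dimensional analogue of the advection-diffusion temperature model can be sketched with an explicit finite-difference scheme; all parameters below are illustrative, not the paper's membrane specifications:

```python
import numpy as np

# 1D advection-diffusion dT/dt + u dT/dx = a d2T/dx2, solved explicitly
# with upwind advection and central diffusion (a reduced analogue of the
# paper's 2D model; u, a and the geometry are illustrative).
nx, L = 50, 0.01            # grid points, domain length (m)
dx = L / (nx - 1)
u, a = 1e-4, 1e-7           # advection speed (m/s), thermal diffusivity (m2/s)
dt = 0.2 * dx**2 / a        # time step respecting the diffusive stability limit

T = np.full(nx, 20.0)       # initial temperature (°C)
T[0] = 60.0                 # hot feed-side boundary, held fixed
for _ in range(2000):
    Tn = T.copy()
    T[1:-1] = (Tn[1:-1]
               - u * dt / dx * (Tn[1:-1] - Tn[:-2])            # upwind advection
               + a * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2]))  # diffusion
    T[-1] = T[-2]           # zero-gradient permeate-side boundary
print(round(T[1], 2), round(T[-1], 2))
```

The heat front advances from the hot boundary toward the permeate side, mirroring the transient-to-steady-state temperature evolution described above; the full model adds the second spatial dimension and the membrane heat equation.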
Procedia PDF Downloads 276
16149 Performance Evaluation of Al Jame’s Roundabout Using SIDRA
Authors: D. Muley, H. S. Al-Mandhari
Abstract:
This paper evaluates the performance of a multi-lane, four-legged modern roundabout operating in Muscat using the SIDRA model. The performance measures include degree of saturation (DOS), average delay, and queue lengths. Geometric and traffic data were used for model preparation. Gap acceptance parameters, the critical gap, and the follow-up headway were used for calibration of the SIDRA model. The results from the analysis showed that the roundabout currently experiences delays of up to 610 seconds, with a DOS of 1.67 during the peak hour. Further, a sensitivity analysis of general and roundabout parameters, among them lane width, cruise speed, inscribed diameter, entry radius, and entry angle, showed that the inscribed diameter is the most crucial factor affecting delay and DOS. Upgrading the roundabout to a fully signalized junction was found to be the suitable solution; it will serve future years at LOS C, with a DOS of 0.9 and an average control delay of 51.9 seconds per vehicle in the design year.
Keywords: performance analysis, roundabout, sensitivity analysis, SIDRA
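The degree of saturation reported above is simply the demand-to-capacity ratio of an approach; a minimal sketch with illustrative volumes, not the Muscat counts:

```python
def degree_of_saturation(demand_veh_h, capacity_veh_h):
    """Degree of saturation (v/c ratio): values above 1.0 mean the
    approach is oversaturated, so queues grow without bound and
    delay estimates climb steeply."""
    return demand_veh_h / capacity_veh_h

# Illustrative peak-hour approach: 2000 veh/h demand against a
# 1200 veh/h entry capacity is heavily oversaturated.
print(round(degree_of_saturation(2000, 1200), 2))  # 1.67
```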
Procedia PDF Downloads 388
16148 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called the Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active, model-free Machine Learning (ML) techniques in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game theory and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
Keywords: security, internet of things, cloud computing, stackelberg game, machine learning, naive q-learning
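A toy version of the Q-learning defender can be sketched for a two-target game; the attacker distribution, rewards, and single-state setup below are illustrative assumptions, not the paper's SSG/SHARP formulation:

```python
import random

def q_learning_defender(episodes=5000, alpha=0.1, gamma=0.9, eps=0.1):
    """A minimal tabular Q-learning sketch for a defender choosing
    which of two targets to cover. The attacker strikes target 0 with
    probability 0.7; the defender earns +1 for covering the attacked
    target and -1 otherwise (a toy stand-in for the SSG payoffs)."""
    random.seed(0)
    q = [0.0, 0.0]                       # Q-value per defended target
    for _ in range(episodes):
        # epsilon-greedy action selection
        if random.random() < eps:
            action = random.randrange(2)
        else:
            action = max(range(2), key=lambda a: q[a])
        attack = 0 if random.random() < 0.7 else 1
        reward = 1.0 if action == attack else -1.0
        # single-state update: Q(a) += alpha * (r + gamma * max Q - Q(a))
        q[action] += alpha * (reward + gamma * max(q) - q[action])
    return q

q = q_learning_defender()
print(q[0] > q[1])   # the defender learns to favor the likelier target
```

The paper's Naïve Q-Learning operates over richer state and action spaces and against SUQR-governed attackers, but the model-free update rule is of this same form.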
Procedia PDF Downloads 358