Search results for: cognitive models
6725 Impulsivity, Emotional Regulation, Problematic Mukbang Watching and Eating Disorders in University Students
Authors: Aqsa Butt, Nida Zafar
Abstract:
The study assesses the relationship between impulsivity, emotional regulation, problematic mukbang watching, and eating disorders in university students. It was hypothesized that there would be a relationship between impulsivity, emotional regulation, problematic mukbang watching, and eating disorders in university students; that impulsivity and emotional regulation would predict problematic mukbang watching; and that problematic mukbang watching would predict eating disorders. A correlational research design was used. A sample of 200 students was taken from different public and private universities in Lahore. The Emotional Regulation Questionnaire (Gross & John, 2003), Abbreviated Barratt Impulsiveness Scale (Christopher et al., 2014), Problematic Mukbang Watching Scale (Kircaburun et al., 2020), and Eating Disorder Diagnostic Scale (Stice et al., 2004) were used for assessment. Results showed a significant positive relationship of impulsivity and expressive suppression with problematic mukbang watching, and a significant negative relationship between cognitive reappraisal and problematic mukbang watching. Problematic mukbang watching was significantly positively related to bulimia nervosa and binge eating. Furthermore, impulsivity and expressive suppression were significant positive predictors of problematic mukbang watching, and cognitive reappraisal was a significant negative predictor. Additionally, problematic mukbang watching significantly positively predicted bulimia nervosa and binge eating. The research has important implications for university students: it shows that excessive watching of such videos can lead to eating disorders such as bulimia nervosa and binge eating. This research provides an understanding of the effects of mukbang watching and adds to the existing body of knowledge on eating disorders.
Keywords: impulsivity, emotional regulation, problematic mukbang watching, eating disorders, students
Procedia PDF Downloads 65
6724 Patient Care Needs Assessment: An Evidence-Based Process to Inform Quality Care and Decision Making
Authors: Wynne De Jong, Robert Miller, Ross Riggs
Abstract:
Beyond the number of nurses providing care for patients, having nurses with the right skills, experience and education is essential to ensure the best possible outcomes for patients. Research studies continue to link nurse staffing and skill mix with nurse-sensitive patient outcomes; numerous studies clearly show that superior patient outcomes are associated with higher levels of regulated staff. Due to the limited number of tools and processes available to assist nurse leaders with staffing models of care, nurse leaders are constantly faced with the ongoing challenge to ensure their staffing models of care best suit their patient population. In 2009, several hospitals in Ontario, Canada participated in a research study to develop and evaluate an RN/RPN utilization toolkit. The purpose of this study was to develop and evaluate a toolkit for Registered Nurse/Registered Practical Nurse staff mix decision-making based on the College of Nurses of Ontario, Canada practice standards for the utilization of RNs and RPNs. This paper will highlight how an organization has further developed the Patient Care Needs Assessment (PCNA) questionnaire, a major component of the toolkit. Moreover, it will demonstrate how it has utilized the information from the PCNA to clearly identify patient and family care needs, thus providing evidence-based results to assist leaders with matching the best staffing skill mix to their patients.
Keywords: nurse staffing models of care, skill mix, nursing health human resources, patient safety
Procedia PDF Downloads 317
6723 A Context Aware Mobile Learning System with a Cognitive Recommendation Engine
Authors: Jalal Maqbool, Gyu Myoung Lee
Abstract:
Using smart devices for context aware mobile learning is becoming increasingly popular. This has led to mobile learning technology becoming an indispensable part of today’s learning environments and platforms. However, some fundamental issues remain: mobile learning still lacks the ability to truly understand human reaction and user behaviour, because current mobile learning systems are passive, unaware of learners’ changing contextual situations, and reliant on static information about mobile learners. In addition, current mobile learning platforms lack the capability to incorporate dynamic contextual situations into learners’ preferences. This work therefore aims to address these issues by designing a context aware framework which is able to sense a learner’s contextual situation, handle data dynamically, and use contextual information to suggest bespoke learning content according to the learner’s preferences. This is to be underpinned by a robust recommendation system capable of performing these functions, thus providing learners with a truly context-aware mobile learning experience: delivering learning content using smart devices and adapting to learning preferences as and when required. In addition, the design of the recommendation engine's algorithm has to be based on learner and application needs, personal characteristics and circumstances, as well as an understanding of human cognitive processes, so that the technology can interact effectively and deliver mobile learning content that is relevant to the learner’s contextual situation. The concept of this proposed project is to provide a new method of smart learning, based on a capable recommendation engine, for an intuitive mobile learning model driven by learner actions.
Keywords: aware, context, learning, mobile
Procedia PDF Downloads 246
6722 Evaluation of Commercials by Psychological Changes in Consumers’ Physiological Characteristics
Authors: Motoki Seguchi, Fumiko Harada, Hiromitsu Shimakawa
Abstract:
There are many local companies in the countryside that carefully produce and sell products, including crafts and foods made with traditional methods. These companies are likely to use commercials to advertise their products. However, it is difficult for companies to judge whether the commercials they create are having an impact on consumers. Therefore, to help create effective commercials, this study investigates which gimmicks in commercials affect which kinds of consumers. This study proposes a method for extracting psychological change points from the physiological characteristics of consumers while they are watching commercials and for estimating the gimmicks in the commercial that affect consumer engagement. In this method, change point detection is applied to pupil size to estimate gimmicks that affect consumers’ emotional engagement, and to electrodermal activity (EDA) to estimate gimmicks that affect cognitive engagement. A questionnaire is also used to identify the commercials that influence behavioral engagement. As a result of estimating the gimmicks that influence consumer engagement using this method, some common features emerged among the gimmicks. To influence cognitive engagement, it was useful to include flashback scenes, the message to be conveyed, the company’s name, and the company’s logos as gimmicks. Flashback scenes and story climaxes were useful in influencing emotional engagement. Furthermore, the use of storytelling commercials may or may not be useful, depending on which consumers are targeted and which behaviors are desired. The method also estimated the gimmicks that influence consumers for each target group and found that the useful gimmicks differ slightly between students and working adults. By using this method, companies can understand which gimmicks in a commercial affect which form of consumer engagement. The results of this study can therefore serve as a reference for the gimmicks that should be included when companies create commercials in the future.
Keywords: change point detection, estimating engagement, physiological characteristics, psychological changes, watching commercials
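As a hedged illustration of the change-point step described in this abstract, the sketch below applies the PELT detector from the open-source Python library ruptures to a simulated pupil-size trace. The signal, the "rbf" cost model and the penalty value are illustrative assumptions, not the authors' actual pipeline; the same call would apply to an EDA trace for cognitive engagement.

```python
import numpy as np
import ruptures as rpt

# Illustrative stand-in for a pupil-size trace recorded while a commercial plays
rng = np.random.default_rng(0)
pupil = np.concatenate([rng.normal(3.0, 0.1, 200),   # baseline viewing
                        rng.normal(3.6, 0.1, 150),   # reaction to a gimmick
                        rng.normal(3.2, 0.1, 250)])  # settling back down

# PELT finds an unknown number of change points under a penalty term
algo = rpt.Pelt(model="rbf").fit(pupil)
change_points = algo.predict(pen=10)  # sample indices where a new segment begins
print(change_points)  # map these indices back to timestamps in the commercial
```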
Procedia PDF Downloads 187
6721 Revisionism in Literature: Deconstructing Patriarchal Ideals in Margaret Atwood's The Penelopiad
Authors: Essam Abdelhamid Hegazy
Abstract:
This paper reads Margaret Atwood's The Penelopiad (2005) through a revisionist and deconstructive approach. The novel is a postmodernist exploration of the grand-narrative myth The Odyssey (800 BC) by Homer, who portrayed the heroic warrior and the faithful wife as the epitome of perfect male and female models, examples whom all must follow and mimic. In Atwood's narrative, the same two hero models are two great tricksters willing to perform any sort of obnoxious act to achieve their goals. This research examines how Atwood synthesized the change in the characters' narratives, leading to the humanization of the perfect hero and the ideal wife. The researcher has used a multidisciplinary approach in which feminist, revisionist and deconstructive theories were implemented to identify new interpretations of the myths that center the experiences and perspectives of women. The research finds that the revisionist approach was applied by giving the victimized and the voiceless an opportunity to speak out and retaliate against their persecutions.
Keywords: Margaret Atwood, patriarchal, Penelopiad, revisionism
Procedia PDF Downloads 83
6720 The Principle of Methodological Rationality and Security of Organisations
Authors: Jan Franciszek Jacko
Abstract:
This investigation presents the principle of methodological rationality of decision making and discusses the impact of an organisation's members' methodologically rational or irrational decisions on its security. This study formulates and partially justifies some research hypotheses regarding the impact. A thought experiment is used according to Max Weber's ideal types method. Two idealised situations ("models") are compared: Model A, where all decision-makers follow methodologically rational decision-making procedures, and Model B, in which these agents follow methodologically irrational decision-making practices. Analysing and comparing the two models allows the formulation of some research hypotheses regarding the impact of methodologically rational and irrational attitudes of members of an organisation on its security. In addition to this method, phenomenological analyses of rationality and irrationality are applied.
Keywords: methodological rationality, rational decisions, security of organisations, philosophy of economics
Procedia PDF Downloads 140
6719 Identification and Prioritisation of Students Requiring Literacy Intervention and Subsequent Communication with Key Stakeholders
Authors: Emilie Zimet
Abstract:
During networking and NCCD moderation meetings, best practices for identifying students who require literacy intervention are often discussed. Once these students are identified, consideration is given to the most effective process for prioritising those with the greatest need for literacy support, allocating resources, tracking intervention effectiveness, and communicating with teachers, external providers and parents. Through a workshop, the group will investigate best practices for identifying students who require literacy support and strategies for communicating and tracking their progress. In groups, participants will examine what they do in their own settings and then compare this with other models, including the researcher’s model, to decide the most effective path to identification and communication. Participants will complete a worksheet at the beginning of the session to reflect deeply on their current approaches. Participants will be asked to critically analyse their own identification processes for literacy intervention, ensuring students are not overlooked if they fall into the borderline category. A cut-off for students to access intervention will be considered so as not to place strain on already stretched resources, along with the most effective allocation of those resources. Furthermore, communicating learning needs and differentiation strategies to staff is paramount to the success of an intervention, and participants will look at the frequency of communication used to share such strategies and updates. At the end of the session, the group will look at creating or evolving models that allow for best practice in the identification and communication of literacy interventions. The proposed outcome of this research is a model for identifying students requiring literacy intervention that incorporates the allocation of resources and communication to key stakeholders. This will be done by pooling information and discussing a variety of models used in the participants' school settings.
Keywords: identification, student selection, communication, special education, school policy, planning for intervention
Procedia PDF Downloads 47
6718 Special Case of Trip Distribution Model and Its Use for Estimation of Detailed Transport Demand in the Czech Republic
Authors: Jiri Dufek
Abstract:
The national transport model of the Czech Republic has been modified in detail to obtain travel demand at the municipality level (cities and villages with over 300 inhabitants). As a technique for this detailed modelling, a three-dimensional procedure for calibrating gravity models was used. Besides zone production and attraction, which are usual in gravity models, an additional parameter for trip distribution was introduced, usually called the "third dimension". In the model, this parameter is the demand between regions. The distribution procedure involved calculating the appropriate skim matrices and multiplying them by three coefficients obtained by iteratively balancing production, attraction and the third dimension. This type of trip distribution was processed in R-project, and the results were used in the Czech Republic transport model created in PTV Vision. This process generated more precise results at the local level of the model (towns and villages).
Keywords: trip distribution, three dimension, transport model, municipalities
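The balancing step can be sketched compactly. The fragment below shows the classic two-factor (production/attraction) scaling on an invented three-zone example; the paper's third coefficient, for region-to-region demand, adds one more scaling pass of the same form. All numbers are illustrative assumptions.

```python
import numpy as np

def balance(seed, production, attraction, iters=100):
    """Iteratively scale a seed trip matrix so its row sums match zone
    production and its column sums match zone attraction. The 'third
    dimension' (inter-regional demand) would add an analogous third pass."""
    T = seed.astype(float).copy()
    for _ in range(iters):
        T *= (production / T.sum(axis=1))[:, None]   # fix row totals
        T *= (attraction / T.sum(axis=0))[None, :]   # fix column totals
        if np.allclose(T.sum(axis=1), production):
            break
    return T

cost = np.array([[1.0, 2.0, 3.0], [2.0, 1.0, 2.0], [3.0, 2.0, 1.0]])  # skim matrix
seed = np.exp(-0.5 * cost)                   # gravity-model deterrence term
P = np.array([100.0, 200.0, 150.0])          # zone productions
A = np.array([120.0, 180.0, 150.0])          # zone attractions (same total as P)
print(balance(seed, P, A).round(1))
```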
Procedia PDF Downloads 131
6717 Data-driven Decision-Making in Digital Entrepreneurship
Authors: Abeba Nigussie Turi, Xiangming Samuel Li
Abstract:
Data-driven business models are more typical for established businesses than for early-stage startups striving to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers such as poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in encompassing startup data analytics enablers and metrics aligned with startups' business models, ranging from customer-centric product development to servitization, the future of modern digital entrepreneurship.
Keywords: startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship
Procedia PDF Downloads 329
6716 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: 'Reddit'
Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell
Abstract:
Native language identification (NLI) is one of the growing subfields in natural language processing (NLP). The task is mainly concerned with predicting the native language of an author from their writing in a second language. In this paper, we investigate the performance of two types of features, content-based features vs. content-independent features, when they are evaluated on a different corpus (using social media data from Reddit). In this NLI task, the models are trained on one corpus (TOEFL), and the trained models are then evaluated on different data from an external corpus (Reddit). Three classifiers are used: a baseline, a linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones both when tested within the corpus and across corpora.
Keywords: NLI, NLP, content-based features, content-independent features, social media corpus, ML
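A minimal scikit-learn sketch of this train-on-TOEFL, test-on-Reddit protocol might look as follows; word n-grams stand in for content-based features, and character n-grams serve as a rough proxy for content-independent ones. The toy texts, labels and feature settings are assumptions, not the study's configuration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.pipeline import make_pipeline

# Hypothetical corpora: texts paired with each author's native language
toefl_texts = ["first training essay ...", "second training essay ..."]
toefl_labels = ["Arabic", "Chinese"]
reddit_texts = ["a reddit post ...", "another reddit post ..."]
reddit_labels = ["Arabic", "Chinese"]

def cross_corpus_accuracy(vectorizer):
    clf = make_pipeline(vectorizer, LogisticRegression(max_iter=1000))
    clf.fit(toefl_texts, toefl_labels)                 # train on TOEFL only
    return accuracy_score(reddit_labels, clf.predict(reddit_texts))

# Content-based features: word n-grams keep topical signal
print(cross_corpus_accuracy(TfidfVectorizer(ngram_range=(1, 2))))
# A rough content-independent proxy: character n-grams abstract away topic
print(cross_corpus_accuracy(TfidfVectorizer(analyzer="char", ngram_range=(2, 4))))
```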
Procedia PDF Downloads 139
6715 Temperature Control Improvement of Membrane Reactor
Authors: Pornsiri Kaewpradit, Chalisa Pourneaw
Abstract:
Temperature control improvement of a membrane reactor with an exothermic and reversible esterification reaction is studied in this work. It is well known that a batch membrane reactor requires different control strategies from a continuous one due to the fact that it is operated dynamically. Because of the effect of the operating temperature, a suitable control scheme has to be designed based on a reliable predictive model to achieve the desired objective. In this study, an optimization framework was first formulated in order to determine an optimal temperature trajectory for maximizing the desired product. In the model predictive control scheme, a set of predictive models was initially developed corresponding to the possible operating points of the system. The multiple predictive control moves are then calculated on-line using the developed models corresponding to the current operating point. The simulation results clearly show that temperature control is improved compared to the performance obtained with the conventional predictive controller. Further robustness tests were also investigated in this study.
Keywords: model predictive control, batch reactor, temperature control, membrane reactor
Procedia PDF Downloads 469
6714 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
Over recent years, with rapid increases in data availability and computing power, Machine Learning (ML) techniques have been adopted across a range of industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. To bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model has been developed at Dun and Bradstreet that blends Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns that scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables and feature interactions that demonstrate high information value in predicting customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards. Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which provides an estimate of the WoE for each bin. This capability helps to build powerful scorecards with sparse cases that cannot be achieved with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The analysis shows that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while remaining as transparent as traditional scorecards. It is therefore concluded that, with the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulty of explaining the models for regulatory purposes.
Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
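The WoE-matching idea can be sketched roughly as follows: instead of computing Weight of Evidence from observed good/bad counts per bin, derive the expected goods and bads from an ML model's probabilities. The feature name, model choice and equal-population binning below are illustrative assumptions, not Dun and Bradstreet's implementation.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
x = pd.Series(rng.uniform(0, 1, 5000), name="utilization")               # engineered feature
y = pd.Series((rng.uniform(0, 1, 5000) < 0.1 + 0.5 * x**2).astype(int))  # 1 = bad

ml = GradientBoostingClassifier().fit(x.to_frame(), y)
p_bad = pd.Series(ml.predict_proba(x.to_frame())[:, 1])                  # ML risk score

bins = pd.qcut(x, q=5)                                                   # scorecard-style bins

# Classical WoE from observed good/bad counts per bin
tab = pd.crosstab(bins, y)
woe_classic = np.log((tab[0] / tab[0].sum()) / (tab[1] / tab[1].sum()))

# WoE-like estimate matched to the ML score distribution: expected goods and
# bads per bin come from model probabilities, which stay stable in sparse bins
bad_exp = p_bad.groupby(bins, observed=False).sum()
good_exp = (1 - p_bad).groupby(bins, observed=False).sum()
woe_ml = np.log((good_exp / good_exp.sum()) / (bad_exp / bad_exp.sum()))

print(pd.DataFrame({"classic": woe_classic, "ml_matched": woe_ml}).round(3))
```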
Procedia PDF Downloads 136
6713 Risk and Emotion: Measuring the Effect of Emotion and Other Visceral Factors on Decision Making under Risk
Authors: Michael Mihalicz, Aziz Guergachi
Abstract:
Background: The science of modelling choice preferences has evolved over centuries into an interdisciplinary field contributing to several branches of microeconomics and mathematical psychology. Early theories in decision science rested on the logic of rationality, but as the field and related disciplines matured, descriptive theories emerged that could explain systematic violations of rationality through the cognitive mechanisms underlying the thought processes that guide human behaviour. Cognitive limitations are not, however, solely responsible for systematic deviations from rationality, and many researchers are now exploring the effect of visceral factors as the more dominant drivers. The current study builds on the existing literature by exploring sleep deprivation, thermal comfort, stress, hunger, fear, anger and sadness as moderators of three distinct elements that define individual risk preference under Cumulative Prospect Theory. Methodology: This study is designed to compare the risk preferences of participants experiencing an elevated affective or visceral state to those in a neutral state, using nonparametric elicitation methods across three domains. Two experiments will be conducted simultaneously using different methodologies. The first will sample visceral states and risk preferences at random over a two-week period by prompting participants to complete an online survey remotely. In each round of questions, participants will be asked to self-assess their current state using visual analogue scales before answering a series of lottery-style elicitation questions. The second experiment will be conducted in a laboratory setting using psychological primes to induce a desired state. In this experiment, emotional states will be recorded using emotion analytics and used as a basis for comparison between the two methods. Significance: The expected results include a series of measurable and systematic effects on the subjective interpretation of gamble attributes, and evidence supporting the proposition that a portion of the variability in human choice preferences unaccounted for by cognitive limitations can be explained by interacting visceral states. Significant results will promote awareness of the subconscious effect that emotions and other drive states have on the way people process and interpret information, and can guide more effective decision making by informing decision-makers of the sources and consequences of irrational behaviour.
Keywords: decision making, emotions, prospect theory, visceral factors
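For readers unfamiliar with Cumulative Prospect Theory, the sketch below shows its standard value and probability-weighting functions with the published Tversky-Kahneman (1992) parameter estimates; these are textbook forms and default values, not this study's elicited parameters.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """CPT value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are overweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Subjective value of a simple prospect: 50% chance of gaining 100, else nothing
p, gain = 0.5, 100.0
print(weight(p) * value(gain))   # CPT evaluation of the gamble
print(p * gain)                  # expected value, for contrast

# A visceral state could then be modelled as a shift in these parameters,
# e.g. a lower gamma (more distorted weighting) under acute stress.
```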
Procedia PDF Downloads 149
6712 Time Series Analysis of Depositing in Industrial Design in Brazil between 1996 and 2013
Authors: Jonas Pedro Fabris, Alberth Almeida Amorim Souza, Maria Emilia Camargo, Suzana Leitão Russo
Abstract:
With Law No. 9279 of May 14, 1996, the Brazilian government regulates the rights and obligations relating to industrial property, considering the economic development of the country, through the granting of patents, trademark registration, registration of industrial designs and other forms of protection. In this study, we apply the Box and Jenkins methodology to the series of industrial design deposits at the National Institute of Industrial Property for the period from May 1996 to April 2013. First, a graphical analysis of the data was carried out, observing the behavior of the data and the autocorrelation function. Based on the analysis of the charts and the statistical tests suggested by the Box and Jenkins methodology, the best model found for industrial design deposits was a SARIMA(2,1,0)(2,0,0), with a MAPE equal to 9.88%.
Keywords: ARIMA models, autocorrelation, Box and Jenkins models, industrial design, MAPE, time series
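A minimal statsmodels sketch of fitting the reported SARIMA(2,1,0)(2,0,0) specification and computing MAPE follows; the monthly frequency, the seasonal period of 12 and the synthetic series are assumptions standing in for the actual deposit data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly filing counts from May 1996, in place of the real series
idx = pd.date_range("1996-05-01", periods=204, freq="MS")
t = np.arange(204)
y = pd.Series(100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12), index=idx)

train, test = y[:-12], y[-12:]
model = SARIMAX(train, order=(2, 1, 0), seasonal_order=(2, 0, 0, 12)).fit(disp=False)
forecast = model.forecast(steps=12)

mape = np.mean(np.abs((test - forecast) / test)) * 100   # mean absolute percentage error
print(f"MAPE: {mape:.2f}%")
```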
Procedia PDF Downloads 546
6711 Identification of Nonlinear Systems Using Radial Basis Function Neural Network
Authors: C. Pislaru, A. Shebani
Abstract:
This paper uses the radial basis function neural network (RBFNN) for system identification of nonlinear systems. Five nonlinear systems are used to examine the performance of the RBFNN in modeling nonlinear systems: a dual tank system, a single tank system, a DC motor system, and two academic models. The feed-forward method is considered in this work for modelling the nonlinear dynamic models. The K-means clustering algorithm is used in this paper to select the centers of the radial basis function network because it is reliable, offers fast convergence, and can handle large data sets. The least mean square method is used to adjust the weights of the output layer, and the Euclidean distance method is used to measure the width of the Gaussian functions.
Keywords: system identification, nonlinear systems, neural networks, radial basis function, K-means clustering algorithm
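The pipeline described here is simple enough to sketch end to end. The fragment below uses K-means for the centers, a width derived from inter-center Euclidean distances, and a batch least-squares solve for the output weights (standing in for the paper's iterative least-mean-square update); the toy sine-wave system is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

def rbf_design(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))          # Gaussian activations

# Toy nonlinear system to identify: y = sin(x) plus noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, (200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.05, 200)

# 1) K-means selects the RBF centers (fast, handles large data sets)
centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
# 2) Gaussian width from the mean Euclidean distance between distinct centers
dists = [np.linalg.norm(a - b) for a in centers for b in centers]
width = np.mean([d for d in dists if d > 0]) / 2
# 3) Output-layer weights by least squares (the paper uses an LMS update)
Phi = rbf_design(X, centers, width)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

print("training RMSE:", np.sqrt(np.mean((Phi @ w - y) ** 2)))
```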
Procedia PDF Downloads 471
6710 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. In particular, these models are based on the Dirichlet Process Mixture of Generalized Linear regression (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at a 15-minute interval from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states. The influence on the free-flow and congested states was estimated to be higher than on the transitional flow condition in both the evening and morning peak periods. Estimation of the critical speed threshold using CR revealed that 47 mph and 48 mph are the speed thresholds for the congested and transitional traffic conditions during the morning and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.
Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection
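As a simplified stand-in for the DML framework, scikit-learn's BayesianGaussianMixture with a Dirichlet-process prior can illustrate how the data, rather than the analyst, choose the number of traffic regimes; the simulated speed/occupancy values below are invented, and the paper's actual model additionally regresses speed on occupancy within each component.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated 15-minute (speed mph, occupancy %) observations for three regimes
free  = np.column_stack([rng.normal(66, 3, 300), rng.normal(8, 2, 300)])
trans = np.column_stack([rng.normal(48, 4, 150), rng.normal(18, 4, 150)])
cong  = np.column_stack([rng.normal(30, 6, 150), rng.normal(35, 6, 150)])
data = np.vstack([free, trans, cong])

# The Dirichlet-process prior prunes unused components, so 10 candidates
# should collapse to roughly the three regimes planted in the data
dpm = BayesianGaussianMixture(n_components=10,
                              weight_concentration_prior_type="dirichlet_process",
                              random_state=0).fit(data)
labels = dpm.predict(data)
used = np.unique(labels)
print("active components:", used.size)
print("mean speed per component:", dpm.means_[used, 0].round(1))
```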
Procedia PDF Downloads 294
6709 Analyzing the Impact of Spatio-Temporal Climate Variations on the Rice Crop Calendar in Pakistan
Authors: Muhammad Imran, Iqra Basit, Mobushir Riaz Khan, Sajid Rasheed Ahmad
Abstract:
The present study investigates the space-time impact of climate change on the rice crop calendar in tropical Gujranwala, Pakistan. The climate change impact was quantified through climatic variables, and the existing calendar of the rice crop was compared with the phenological stages of the crop, depicted through the time series of the Normalized Difference Vegetation Index (NDVI) derived from Landsat data for the decade 2005-2015. A local-maxima method was applied to the NDVI time series to compute the rice phenological stages. Panel models with fixed and cross-section fixed effects were used to establish the relation between the climatic parameters and the NDVI time series across villages and across rice growing periods. Results show that the climatic parameters have a significant impact on the rice crop calendar. Moreover, the fixed effect model is a significant improvement over the cross-sectional fixed effect models (R-squared equal to 0.673 vs. 0.0338). We conclude that the high inter-annual variability of climatic variables causes high variability of NDVI and, thus, a shift in the rice crop calendar. Moreover, the inter-annual (temporal) variability of the rice crop calendar is high compared to the inter-village (spatial) variability. We suggest that local rice farmers adapt to this change in the rice crop calendar.
Keywords: Landsat NDVI, panel models, temperature, rainfall
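The local-maxima step can be sketched with scipy; the NDVI values, composite spacing and thresholds below are illustrative, not the Landsat series used in the study.

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical composite NDVI for one village across a rice season
ndvi = np.array([0.18, 0.20, 0.24, 0.35, 0.52, 0.68, 0.74, 0.71,
                 0.60, 0.47, 0.33, 0.25, 0.21])

# A local maximum above a vegetation threshold marks the peak (heading) stage
peaks, props = find_peaks(ndvi, height=0.5)
print("peak-stage composite:", peaks, "NDVI:", props["peak_heights"])

# Green-up onset: first composite where NDVI crosses a growth threshold
print("approximate green-up at composite", int(np.argmax(ndvi > 0.3)))
```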
Procedia PDF Downloads 205
6708 A Numerical Hybrid Finite Element Model for Lattice Structures Using 3D/Beam Elements
Authors: Ahmadali Tahmasebimoradi, Chetra Mang, Xavier Lorang
Abstract:
Thanks to the additive manufacturing process, lattice structures are replacing traditional structures in the aeronautical and automobile industries. In order to evaluate the mechanical response of lattice structures, one has to resort to numerical techniques. Ansys is a globally well-known and trusted commercial software package that allows us to model lattice structures and analyze their mechanical responses using either solid or beam elements. In this software, a script may be used to systematically generate lattice structures of any size. On the one hand, solid elements allow us to correctly model the contact between the substrates (the supports of the lattice structure) and the lattice structure, the local plasticity, and the junctions of the microbeams. However, their computational cost increases rapidly with the size of the lattice structure. On the other hand, although beam elements reduce the computational cost drastically, they do not correctly model the contact between the lattice structure and the substrates, nor the junctions of the microbeams. The notion of local plasticity is also no longer valid, and the deformed shape of the lattice structure does not correspond to that obtained using 3D solid elements. In this work, motivated by the pros and cons of the 3D and beam models, a numerical hybrid model is presented for lattice structures to reduce the computational cost of the simulations while avoiding the aforementioned drawbacks of the beam elements. This approach uses solid elements for the junctions and beam elements for the microbeams connecting the corresponding junctions to each other. When the global response of the structure is linear, the results from the hybrid models are in good agreement with those from the 3D models for body-centered cubic with z-struts (BCCZ) and body-centered cubic without z-struts (BCC) lattice structures. However, the hybrid models have difficulty converging when the effects of large deformation and local plasticity are considerable in the BCCZ structures. Furthermore, the effect of the junction size on the results of the hybrid models is investigated. For BCCZ lattice structures, the results are not affected by the junction size. This is also valid for BCC lattice structures as long as the ratio of the junction size to the diameter of the microbeams is greater than 2. The hybrid model can also take geometric defects into account. As a demonstration, the point clouds of two lattice structures are parametrized in a platform called LATANA (LATtice ANAlysis) developed by IRT-SystemX. In this process, for each microbeam of the lattice structures, an ellipse is fitted to capture the effect of shape variation and roughness. Each ellipse is represented by three parameters: semi-major axis, semi-minor axis, and angle of rotation. Given the parameters of the ellipses, the lattice structures are constructed in SpaceClaim (Ansys) using the geometrical hybrid approach. The results show a negligible discrepancy between the hybrid and 3D models, while the computational cost of the hybrid model is lower than that of the 3D model.
Keywords: additive manufacturing, Ansys, geometric defects, hybrid finite element model, lattice structure
Procedia PDF Downloads 114
6707 Multi-Model Super Ensemble Based Advanced Approaches for Monsoon Rainfall Prediction
Authors: Swati Bhomia, C. M. Kishtawal, Neeru Jaiswal
Abstract:
Traditionally, monsoon forecasts have encountered many difficulties stemming from numerous issues such as a lack of adequate upper air observations, the mesoscale nature of convection, proper resolution, radiative interactions, planetary boundary layer physics, mesoscale air-sea fluxes, representation of orography, etc. Uncertainties in any of these areas lead to large systematic errors. Global circulation models (GCMs), which are developed independently at different institutes and each carry somewhat different representations of the above processes, can be combined to reduce the collective local biases in space, time, and across variables. This is the basic concept behind the multi-model superensemble, which comprises a training and a forecast phase. The training phase learns from the recent past performance of the models and is used to determine statistical weights from a least-squares minimization via a simple multiple regression. These weights are then used in the forecast phase. The superensemble forecasts carry higher skill than the simple ensemble mean, the bias-corrected ensemble mean, and the best of the participating member models. This approach is a powerful post-processing method for estimating weather forecast parameters, reducing the direct model output errors. Although it can be applied successfully to continuous parameters like temperature, humidity, wind speed, and mean sea level pressure, in this paper the approach is applied to rainfall, a parameter quite difficult to handle with standard post-processing methods due to its high temporal and spatial variability. The present study aims at the development of advanced superensemble schemes comprising 1-5 day daily precipitation forecasts from five state-of-the-art global circulation models (GCMs), i.e., the European Centre for Medium Range Weather Forecasts (Europe), National Center for Environmental Prediction (USA), China Meteorological Administration (China), Canadian Meteorological Centre (Canada) and U.K. Meteorological Office (U.K.), obtained from the THORPEX Interactive Grand Global Ensemble (TIGGE), one of the most complete data sets available. The novel approaches include a dynamical model selection approach, in which the superior models among the participating member models are selected at each grid point and for each forecast step in the training period. A multi-model superensemble trained on similar conditions is also discussed in the present study, based on the assumption that training on similar conditions may provide better forecasts than the sequential training used in conventional multi-model ensemble (MME) approaches. Further, a variety of methods from the literature that incorporate a 'neighborhood' around each grid point, to allow for spatial error or uncertainty, have also been combined with the above-mentioned approaches. Comparison of these schemes against observations verifies that the newly developed approaches provide a more unified and skillful prediction of summer monsoon (June to September) rainfall than the conventional multi-model approach and the member models.
Keywords: multi-model superensemble, dynamical model selection, similarity criteria, neighborhood technique, rainfall prediction
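The training/forecast split at the heart of the superensemble reduces to a least-squares regression per grid point. The sketch below shows the idea on synthetic numbers for a single grid point; the gamma-distributed 'forecasts' and the true weighting are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_models = 120, 5                 # training window, member GCMs

# Hypothetical member rainfall forecasts and matching observations
F = rng.gamma(2.0, 3.0, (n_days, n_models))
obs = 0.4 * F[:, 0] + 0.3 * F[:, 2] + rng.normal(0, 1, n_days)

# Training phase: regress observations on member anomalies (removes model bias)
Fa = F - F.mean(axis=0)
A = np.column_stack([Fa, np.ones(n_days)])          # anomalies plus intercept
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
weights, intercept = coef[:-1], coef[-1]

# Forecast phase: apply the frozen weights to new member forecasts
f_new = rng.gamma(2.0, 3.0, n_models)
super_fc = intercept + (f_new - F.mean(axis=0)) @ weights
print("superensemble:", round(super_fc, 2), "ensemble mean:", round(f_new.mean(), 2))
```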
Procedia PDF Downloads 139
6706 The Impact of Window Opening Occupant Behavior Models on Building Energy Performance
Authors: Habtamu Tkubet Ebuy
Abstract:
Purpose: Conventional dynamic energy simulation tools go beyond the static dimension of simplified methods by providing better and more accurate predictions of building performance. However, their ability to forecast actual performance is undermined by a poor representation of human interactions. The purpose of this study is to examine the potential benefits of incorporating information on occupant diversity into the occupant behavior models used to simulate building performance. Co-simulating the stochastic behavior of occupants substantially increases the accuracy of the simulation. Design/methodology/approach: In this article, probabilistic models of inhabitants' window opening and closing behavior were developed in a separate multi-agent platform, SimOcc, and implemented in the building simulation tool TRNSYS, so that the interconnected window behavior can be reflected in the simulation analysis of the building. Findings: The results of the study show that modelling complex behaviors is important for predicting actual building performance, and they help identify the gap between reality and existing simulation methods. We hope this study and its results will serve as a guide for researchers interested in investigating occupant behavior in the future. Research limitations/implications: Further case studies involving multi-user behavior in complex commercial buildings are needed to better understand the impact of occupant behavior on building performance. Originality/value: This study is seen as a good opportunity to support the national strategy by offering a suitable tool to help stakeholders in the design phase of new or retrofitted buildings improve the performance of office buildings.
Keywords: occupant behavior, co-simulation, energy consumption, thermal comfort
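A window-opening behavior model of the kind co-simulated here can be sketched as a two-state stochastic process whose opening probability depends on indoor temperature; the logistic coefficients, closing probability and temperature cycle below are invented, not SimOcc's calibrated values.

```python
import numpy as np

def p_open(t_in, b0=-8.0, b1=0.3):
    """Logistic probability that an occupant opens the window this timestep."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * t_in)))

rng = np.random.default_rng(0)
window_open, trace = False, []
for hour in range(24):
    t_in = 20 + 6 * np.sin((hour - 6) * np.pi / 12)   # crude indoor temperature cycle
    if not window_open:
        window_open = rng.uniform() < p_open(t_in)     # stochastic opening decision
    else:
        window_open = rng.uniform() >= 0.2             # fixed hourly closing chance
    trace.append(int(window_open))
    # In the co-simulation, this state would be handed to TRNSYS each timestep
    # to update the zone ventilation rate before the next thermal calculation.
print(trace)
```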
Procedia PDF Downloads 105
6705 A Risk-Based Modeling Approach for Successful Adoption of CAATTs in Audits: An Exploratory Study Applied to Israeli Accountancy Firms
Authors: Alon Cohen, Jeffrey Kantor, Shalom Levy
Abstract:
Technology adoption models are extensively used in the literature to explore the drivers and inhibitors affecting the adoption of Computer Assisted Audit Techniques and Tools (CAATTs). Recent studies have suggested additional factors that may affect technology adoption by CPA firms. However, the adoption of CAATTs by financial auditors differs from the adoption of technologies in other industries. This is a result of the unique characteristics of the auditing process, which are expressed in the audit risk elements and the risk-based auditing approach, as encoded in the auditing standards. Since these audit risk factors are not part of the existing models used to explain technology adoption, those models do not fully correspond to the specific needs and requirements of the auditing domain. The overarching objective of this qualitative research is to fill this gap in the literature, which exists as a result of using generic technology adoption models. Following a pretest and based on semi-structured in-depth interviews with 16 Israeli CPA firms of different sizes, this study aims to reveal determinants related to audit risk factors that influence the adoption of CAATTs in audits and proposes a new modeling approach for the successful adoption of CAATTs. The findings emphasize several important aspects: (1) while large CPA firms have developed their own internal guidelines to assess the audit risk components, other CPA firms do not follow a formal and validated methodology to evaluate these risks; (2) large firms incorporate a variety of CAATTs, including self-developed advanced tools, whereas small and mid-sized CPA firms incorporate standard CAATTs and still need to catch up to better understand what CAATTs can offer and how they can contribute to audit quality; (3) the top management of mid-sized and small CPA firms should be more proactive and up to date about CAATTs' capabilities and contributions to audits; and (4) all CPA firms consider professionalism a major challenge that must be constantly managed to ensure optimal CAATTs operation. The study extends the existing knowledge of CAATTs adoption by looking at it from a risk-based auditing approach. It suggests a new model for CAATTs adoption that incorporates the influencing audit risk factors auditors should examine when considering CAATTs adoption. Since the model can be used in various audit scenarios and supports strategic, risk-based decisions, it maximizes the great potential of CAATTs for audit quality. The results and insights can be useful to CPA firms, internal auditors, CAATTs developers and regulators. Moreover, they may motivate audit standard-setters to issue updated guidelines regarding CAATTs adoption in audits.
Keywords: audit risk, CAATTs, financial auditing, information technology, technology adoption models
Procedia PDF Downloads 70
6704 Numerical Modelling of Immiscible Fluids Flow in Oil Reservoir Rocks during Enhanced Oil Recovery Processes
Authors: Zahreddine Hafsi, Manoranjan Mishra , Sami Elaoud
Abstract:
Ensuring the maximum recovery rate of oil from reservoir rocks is a challenging task that requires preliminary numerical analysis of the different techniques used to enhance the recovery process. After conventional oil recovery processes, and in order to retrieve the oil left behind after the primary recovery phase, water flooding is one of several techniques used for enhanced oil recovery (EOR). In this research work, EOR via water flooding is numerically modeled, and the hydrodynamic instabilities resulting from immiscible oil-water flow in reservoir rocks are investigated. An oil reservoir is a porous medium consisting of many fractures of tiny dimensions. For modeling purposes, the oil reservoir is considered as a collection of capillary tubes, which provides useful insights into how fluids behave in the reservoir pore spaces. The equations governing oil-water flow in oil reservoir rocks are developed and numerically solved following a finite element scheme. Numerical results are obtained using COMSOL Multiphysics software. The two-phase Darcy module of COMSOL Multiphysics allows modelling of the imbibition process by the injection of water (as the wetting phase) into an oil reservoir. The van Genuchten, Brooks-Corey and Leverett models were considered as retention models, the obtained flow configurations are compared, and the governing parameters are discussed. For the considered retention models, it was found that the onset of instabilities, viz. the fingering phenomenon, is highly dependent on the capillary pressure as well as the boundary conditions, i.e., the inlet pressure and the injection velocity.
Keywords: capillary pressure, EOR process, immiscible flow, numerical modelling
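Of the three retention models compared, the van Genuchten relations are compact enough to sketch directly, here together with the Mualem relative-permeability closure; the alpha and n parameter values are illustrative, not the study's fitted ones.

```python
import numpy as np

def van_genuchten_se(h, alpha=2.0, n=2.5):
    """Effective wetting-phase saturation vs. capillary head h [m]."""
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * np.abs(h)) ** n) ** (-m)

def mualem_krw(se, n=2.5):
    """Relative permeability of the wetting phase (Mualem closure)."""
    m = 1.0 - 1.0 / n
    return np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

h = np.linspace(0.01, 5.0, 5)          # capillary heads [m]
se = van_genuchten_se(h)
print(np.column_stack([h, se, mualem_krw(se)]).round(3))
# Brooks-Corey instead uses Se = (h_entry / h) ** lam above the entry head,
# giving a sharper air-entry transition; Leverett scales a reference
# J-function curve by sqrt(porosity / permeability).
```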
Procedia PDF Downloads 132
6703 The Types of Collaboration Models Driven by Public Art Establishment–Case Study of Taichung City
Authors: Cheng-Lung Yu, Ying-His Liao
Abstract:
Some evidence shows that public art accelerates local economic growth, and local governments even award public-private partnership collaborations to sustain the creation of public art for urban economic development. The public-private partnerships behind public art establishment make it clear that while public construction projects have been led by governmental policy, private developers have played crucial roles in driving innovative business models such as tourism investment, real estate appreciation, and community participation. This study shows how the types of collaboration have been driven by Taichung city government policy under the public art establishment regulation over the past three years. Through empirical analysis of several cases, the authors identify trends in public art development that support local economic growth in Taiwan.
Keywords: public art, public art establishment regulation, construction management, urban governance
Procedia PDF Downloads 34
6702 WEMax: Virtual Manned Assembly Line Generation
Authors: Won Kyung Ham, Kang Hoon Cho, Sang C. Park
Abstract:
Presented in this paper is the framework of the software ‘WEMax’. WEMax was developed for the analysis and simulation of manned assembly lines in order to sustain and improve the performance of manufacturing systems. In a manufacturing system, performance, such as productivity, is key to the competitiveness of the output products. However, the performance of manned assembly lines is difficult to forecast, because human labor is not a factor that computer simulation models or mathematical models can readily predict. Existing approaches to performance forecasting for manned assembly lines are limited to matters of the human itself, such as ergonomic and workload design, and to simulations that ignore human factors. Consequently, an approach for forecasting and improving manned assembly line performance needs to be researched. As a solution to this problem, this study proposes a framework for the generation and simulation of virtual manned assembly lines, and the framework has been implemented as software.
Keywords: performance forecasting, simulation, virtual manned assembly line, WEMax
Procedia PDF Downloads 328
6701 Parametric Models of Facade Designs of High-Rise Residential Buildings
Authors: Yuchen Sharon Sung, Yingjui Tseng
Abstract:
High-rise residential buildings have become the most mainstream housing pattern in the world’s metropolises under the current trend of urbanization. The facades of high-rise buildings are essential elements of the urban landscape, and their skins are important media between the interior and exterior of high-rise buildings. A skin not only connects users and environments but also plays an important functional and aesthetic role. This research studies the skins of high-rise residential buildings using the methodology of shape grammar to find the rules that determine the combinations of facade patterns, and analyzes the patterns’ parameters using the software Grasshopper. We chose a number of facades of high-rise residential buildings as sources to discover the underlying rules and concepts of facade skin generation. This research also documents the rules that influence the composition of facade skins. The items of the facade skins, such as windows, balconies, walls, sun visors and metal grilles, are treated as elements in the system of facade skins. The compositions of these elements are categorized and described by logical rules, and the types of high-rise building facade skins are modelled in Grasshopper. A variety of analyzed patterns can then be applied to other facade skins through this parametric mechanism. Using the patterns established in the models, researchers can analyze each single item in more detailed tests, and architects can apply each of these items to construct facades for other buildings through various combinations and permutations. The goal of these models is to develop a mechanism that generates prototypes in order to facilitate the generation of various facade skins.
Keywords: facade skin, grasshopper, high-rise residential building, shape grammar
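To make the rule idea concrete, the toy sketch below composes a facade grid from a few invented rules of the sort a shape grammar encodes and Grasshopper parameterizes; the specific rules and element codes are assumptions for illustration only.

```python
def facade(floors, bays, balcony_every=2, grille_floors=(0,)):
    """Compose a facade skin as a grid of element codes:
    W = window, B = balcony, G = metal grille, S = sun-visor band."""
    rows = []
    for f in range(floors - 1, -1, -1):            # print the top floor first
        if f % 5 == 0 and 0 < f < floors - 1:
            rows.append(" ".join(["S"] * bays))    # rule: visor band every 5 floors
        row = []
        for b in range(bays):
            if f in grille_floors:
                row.append("G")                    # rule: grilles at ground level
            elif b % balcony_every == 0:
                row.append("B")                    # rule: balconies in vertical stacks
            else:
                row.append("W")
        rows.append(" ".join(row))
    return "\n".join(rows)

print(facade(floors=12, bays=6))
```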
Procedia PDF Downloads 510
6700 Finite Element Modeling of Heat and Moisture Transfer in Porous Material
Authors: V. D. Thi, M. Li, M. Khelifa, M. El Ganaoui, Y. Rogaume
Abstract:
This paper presents a two-dimensional model to study heat and moisture transfer through porous building materials. Dynamic and static coupled models of heat and moisture transfer in porous material under low temperature are presented, and the coupled models, together with variable initial and boundary conditions, have been considered both analytically and using the finite element method. The resulting coupled model is converted into two nonlinear partial differential equations, which are then numerically solved by an implicit iterative scheme. The numerical results for temperature and moisture potential changes are compared with the experimental measurements available in the literature. The predicted results validate the theoretical model and demonstrate the effectiveness of the developed numerical algorithms. The model is expected to provide useful information for porous building material design based on heat and moisture transfer modelling.
Keywords: finite element method, heat transfer, moisture transfer, porous materials, wood
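As a simplified, hedged illustration of the implicit scheme, the sketch below solves only the heat-conduction half of the coupled problem in 1D with backward-Euler finite differences (the paper itself uses a 2D finite element discretization and couples a moisture-potential equation with temperature-dependent coefficients); all material values are invented.

```python
import numpy as np

# 1D transient conduction through a 10 cm porous wall, backward Euler in time
L, nx, alpha = 0.10, 21, 1e-7            # thickness [m], nodes, diffusivity [m^2/s]
dx, dt, steps = L / (nx - 1), 60.0, 360  # 1-minute steps over 6 hours

T = np.full(nx, 20.0)                    # initial temperature [degC]
T[0], T[-1] = -5.0, 20.0                 # cold outdoor face, warm indoor face

r = alpha * dt / dx**2
# Assemble the constant tridiagonal system A @ T_new = T_old
A = np.eye(nx)
for i in range(1, nx - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r

for _ in range(steps):
    T = np.linalg.solve(A, T)            # unconditionally stable implicit step
print(T.round(2))
# The full model would iterate this step with the moisture equation until the
# two fields agree, since the coefficients of each depend on the other.
```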
Procedia PDF Downloads 400
6699 Factors Impacting Geostatistical Modeling Accuracy and Modeling Strategy of Fluvial Facies Models
Authors: Benbiao Song, Yan Gao, Zhuo Liu
Abstract:
Geostatistical modeling is the key technique for reservoir characterization, and the quality of geological models greatly influences the prediction of reservoir performance, but few studies have quantified the factors impacting geostatistical reservoir modeling accuracy. In this study, 16 fluvial prototype models were established to represent different levels of geological complexity, and 6 cases ranging from 16 to 361 wells were defined to reproduce all 16 prototype models by different methodologies, including SIS, object-based and MPFS algorithms, accompanied by different constraint parameters. A modeling accuracy ratio was defined to quantify the influence of each factor, and ten realizations were averaged to represent each accuracy ratio under the same modeling conditions and parameter associations. In total, 5,760 simulations were run to quantify the relative contribution of each factor to simulation accuracy, and the results can be used as a strategy guide for facies modeling under similar conditions. It is found that data density, geological trend and geological complexity have a great impact on modeling accuracy. Modeling accuracy may reach up to 90% when the channel sand width reaches 1.5 times the well spacing, under any condition, for the SIS and MPFS methods. When well density is low, a geological trend may increase the modeling accuracy from 40% to 70%, while the use of a proper variogram makes only a very limited contribution for the SIS method. This implies that when well data are dense enough to cover simple geobodies, little effort is needed to construct an acceptable model; when geobodies are complex and data are insufficient, it is better to construct a set of robust geological trends than to rely on a variogram function alone. For the object-based method, modeling accuracy does not increase with data density as obviously as for the SIS method, but it keeps a rational appearance when data density is low. The MPFS method shows a similar trend to the SIS method, but using a proper geological trend together with a rational variogram in SIS may yield better modeling accuracy than the MPFS method. This implies that the geological modeling strategy for a real reservoir case needs to be optimized by evaluating the dataset, the geological complexity, the geological constraint information and the modeling objective.
Keywords: fluvial facies, geostatistics, geological trend, modeling strategy, modeling accuracy, variogram
Procedia PDF Downloads 264
6698 Construction of Ovarian Cancer-on-Chip Model by 3D Bioprinting and Microfluidic Techniques
Authors: Zakaria Baka, Halima Alem
Abstract:
Cancer is a major worldwide health problem that caused around ten million deaths in 2020. In addition, efforts to develop new anti-cancer drugs still face a high failure rate, partly due to the lack of preclinical models that recapitulate in-vivo drug responses. Indeed, the conventional cell culture approach (known as 2D cell culture) is far from reproducing the complex, dynamic and three-dimensional environment of tumors. To set up more in-vivo-like cancer models, 3D bioprinting is a promising technology due to its ability to produce 3D scaffolds containing different cell types with controlled distribution and precise architecture. Moreover, the introduction of microfluidic technology makes it possible to simulate in-vivo dynamic conditions through so-called “cancer-on-chip” platforms. Whereas several cancer types, such as lung cancer and breast cancer, have been modeled through the cancer-on-chip approach, only a few ovarian cancer models have been described. The aim of this work is to combine 3D bioprinting and microfluidic techniques to set up a 3D dynamic model of ovarian cancer. In the first phase, an alginate-gelatin hydrogel containing SKOV3 cells was used to produce tumor-like structures with an extrusion-based bioprinter. The desired form of the tumor-like mass was first designed in 3D CAD software. The hydrogel composition was then optimized to ensure good and reproducible printability. Cell viability in the bioprinted structures was assessed using a Live/Dead assay and a WST-1 assay. In the second phase, these bioprinted structures will be included in a microfluidic device that allows simultaneous testing of different drug concentrations. This microfluidic device was first designed through computational fluid dynamics (CFD) simulations to fix its precise dimensions. It was then manufactured through a molding method based on a 3D printed template. To confirm the results of the CFD simulations, doxorubicin (DOX) solutions were perfused through the device, and the DOX concentration in each culture chamber was determined. Once completely characterized, this model will be used to assess the efficacy of anti-cancer nanoparticles developed in the Jean Lamour institute.
Keywords: 3D bioprinting, ovarian cancer, cancer-on-chip models, microfluidic techniques
Procedia PDF Downloads 196
6697 The Case for Strategic Participation: How Facilitated Engagement Can Be Shown to Reduce Resistance and Improve Outcomes Through the Use of Strategic Models
Authors: Tony Mann
Abstract:
This paper sets out the case for involving and engaging employees/workers/stakeholders/staff in any significant change that is being considered by the senior executives of the organization. It establishes the rationale, the approach, the methodology of engagement and the benefits of a participative approach. It challenges the new norm of imposing change for fear of resistance and instead suggests that involving people produces better outcomes and a longer-lasting impact. Various strategic models are introduced and illustrated to explain how the process can be most effective. The paper highlights one model in particular, the Process Iceberg® Organizational Change model, that has proven instrumental in developing effective change. Its use is demonstrated in its various forms, explaining why so much change fails to address the key elements and how we can manage change more productively. 'Participation' in change is too often seen as negative, expensive and unwieldy. The paper aims to show that another model, UIA = O + E, can offset these difficulties and, in fact, produce much more positive and effective change.
Keywords: facilitation, stakeholders, buy-in, digital workshops
Procedia PDF Downloads 112
6696 Incidence of and Risk Factors for Post-Operative Cognitive Dysfunction (POCD) in Neurosurgical Patients: A Prospective Cohort Study
Authors: Suparna Bharadwaj, Sriganesh Kamath, Gopalakrishna K. N., Subhas Konar
Abstract:
Introduction: Post-operative cognitive dysfunction (POCD) is a spectrum of clinical syndromes presenting as emergence delirium (ED) and/or post-operative delirium (POD). ED is a transient state (minutes to hours) of marked agitation after the discontinuation of general anesthesia that does not respond to consoling measures. On the other hand, POD without identifiable etiology is not temporally related to emergence from anesthesia. These patients often emerge smoothly and may be lucid in the post-anesthesia care unit (PACU), but may develop fluctuating mental status, most commonly between postoperative days one and three. General anesthesia (GA) has been identified as a risk factor for POCD. Cranial surgeries involve brain handling in addition to exposure to GA. We hypothesize that the incidence of postoperative delirium after cranial surgery is twice that after spinal surgery. The primary objective of this study was to evaluate the incidence of emergence delirium and postoperative delirium in patients undergoing cranial and spinal neurosurgeries. The secondary objective was to identify the perioperative risk factors for ED and POD. Methods: This was a prospective cohort observational study conducted from March 2020 to September 2023 at a tertiary neurocentre. After obtaining institutional ethics committee approval, adult patients undergoing cranial or spinal surgery with a Glasgow Coma Scale score of 15 were included in the study. Patients undergoing cranial surgery are considered exposed to the risk factor, while patients undergoing spinal surgery are considered unexposed. All study subjects received standard general anesthesia. About twenty perioperative parameters were identified as candidate risk factors for POCD. ED was assessed using the Riker sedation-agitation scale, and POD was assessed using the confusion assessment method. A sample size of 2,000 patients was planned, with 1,000 each of cranial and spinal cases; however, only around 700 spinal patients could be recruited for this study. Results: In this study, about two thousand patients were screened for inclusion; 1,185 cranial cases and 742 spinal cases were considered for the final analysis. Both groups were similar in terms of demographics. The incidence of ED was 25.8% after cranial surgery vs. 10.24% after spinal surgery (relative risk 2.5). The incidence of POD was 20.25% after cranial surgery vs. 2.15% after spinal surgery (relative risk 9.3). All the proposed risk factors were assessed using binomial logistic regression. Conclusion: Cranial surgery exposes patients to a nine times higher risk for the development of postoperative delirium. The presence of ED predisposes to POD, suggesting that the two represent a spectrum.
Keywords: postoperative cognitive dysfunction, neurosurgical patients, cohort study, emergence delirium
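The headline relative risks are simple ratios of the two incidences; the worked check below reproduces them from the quoted percentages (the second ratio comes out near, though not exactly at, the reported 9.3, presumably due to rounding).

```python
def relative_risk(risk_exposed, risk_unexposed):
    """Risk ratio between exposed (cranial) and unexposed (spinal) cohorts."""
    return risk_exposed / risk_unexposed

# Emergence delirium: 25.8% after cranial vs. 10.24% after spinal surgery
print(round(relative_risk(0.258, 0.1024), 2))   # ~2.52, reported as 2.5

# Postoperative delirium: 20.25% vs. 2.15%
print(round(relative_risk(0.2025, 0.0215), 2))  # ~9.42, reported as 9.3
```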
Procedia PDF Downloads 12