Search results for: computational cognitive model
18394 Cultural Influence on Social Cognition in Social and Educational Psychology
Authors: Mbah Fidelix Njong, Sabi Emile Forkwa
Abstract:
Social cognition is an aspect of social psychology that focuses on how people process, store and apply information about others and social situations. It lays emphasis on the role cognitive processes play in our social interactions. In this article, we try to show how culture can influence our ways of thinking about others and how we feel about and interact with the world around us. Social cognitive processes involve perceiving people and how we learn about the people around us. It concerns the mental processes of remembering, thinking about and attending to other people with different cultural backgrounds, and how we attend to certain information about the world. Especially in an educational setting, students’ learning processes are more often than not influenced by their cultural background. We can also talk of social schemas, that is, people’s mental representations of social patterns and norms. These involve information about societal roles and the expectations of individuals within a group. These cognitive processes can also be influenced by culture. There are important cultural differences in social cognition. In any social situation, two individuals may have different interpretations. Each person brings in a unique background of experiences, knowledge, social influence, feelings and cultural variations. Cultural differences can also affect how people interpret social situations. The same social behavior in one cultural setting might have a completely different meaning and interpretation if observed or applied in another culture. However, as people interpret behaviors and derive meaning from the interpretations, they act based on their beliefs about the situations they are confronted with. This helps to reinforce and reproduce the cultural norms that influence their social cognition.
Keywords: social cognition, social schema, cultural influence, psychology
Procedia PDF Downloads 92
18393 Stock Market Prediction by Regression Model with Social Moods
Authors: Masahiro Ohmura, Koh Kakusho, Takeshi Okadome
Abstract:
This paper presents a regression model with autocorrelated errors in which the inputs are social moods obtained by analyzing the adjectives in Twitter posts using a document topic model. The regression model predicts the Dow Jones Industrial Average (DJIA) more precisely than autoregressive moving-average models.
Keywords: stock market prediction, social moods, regression model, DJIA
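A minimal sketch of such a regression with AR(1)-autocorrelated errors, written here with statsmodels' GLSAR; this is our illustration, not the authors' code, and the simulated mood inputs and coefficients are placeholders.

```python
# Sketch of a regression with AR(1)-autocorrelated errors (statsmodels GLSAR).
# The mood series and coefficients below are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
moods = rng.normal(size=(n, 3))            # e.g. daily "calm", "alert", "happy" scores
djia = 0.8 * moods[:, 0] - 0.3 * moods[:, 2] + rng.normal(size=n)

X = sm.add_constant(moods)
model = sm.GLSAR(djia, X, rho=1)           # AR(1) error structure
results = model.iterative_fit(maxiter=10)  # alternate OLS fit and rho estimation
print(results.params, model.rho)
```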
Procedia PDF Downloads 549
18392 Reducing the Computational Cost of a Two-way Coupling CFD-FEA Model via a Multi-scale Approach for Fire Determination
Authors: Daniel Martin Fellows, Sean P. Walton, Jennifer Thompson, Oubay Hassan, Kevin Tinkham, Ella Quigley
Abstract:
Structural integrity for cladding products is a key performance parameter, especially concerning fire performance. Cladding products such as PIR-based sandwich panels are tested rigorously, in line with industrial standards. Physical fire tests are necessary to ensure the customer's safety but can give little information about critical behaviours that can help develop new materials. Numerical modelling is a tool that can help investigate a fire's behaviour further by replicating the fire test. However, fire is an interdisciplinary problem, as it is a chemical reaction that behaves fluidly and impacts structural integrity. An analysis using Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) is needed to capture all aspects of a fire performance test. One method is a two-way coupling analysis that imports the updated changes in thermal data, due to the fire's behaviour, to the FEA solver in a series of iterations. In light of our recent work with Tata Steel U.K. using a two-way coupling methodology to determine fire performance, it has been shown that a program called FDS-2-Abaqus can predict a BS 476-22 furnace test with a degree of accuracy. The test demonstrated the fire performance of Tata Steel U.K.'s Trisomet product, a polyisocyanurate (PIR) based sandwich panel used for cladding. Previous works demonstrated the limitations of the current version of the program, the main limitation being the computational cost of modelling three Trisomet panels, totalling an area of 9 m². The computational cost increases substantially with the intention to scale up to an LPS 1181-1 test, which includes a total panel surface area of 200 m². The FDS-2-Abaqus program is developed further within this paper to overcome this obstacle and better accommodate Tata Steel U.K.'s PIR sandwich panels. The new developments aim to reduce the computational cost and error margin compared to experimental data. One avenue explored is a multi-scale approach in the form of Reduced Order Modeling (ROM). The approach allows the user to include refined details of the sandwich panels, such as the overlapping joints, without a computationally costly mesh size. Comparative studies will be made between the new implementations and the previous study completed using the original FDS-2-Abaqus program. Validation of the study will come from physical experiments in line with governing body standards such as BS 476-22 and LPS 1181-1. The physical experimental data include the panels' gas and surface temperatures and mechanical deformation. Conclusions are drawn, noting the impact of the new implementations and discussing the feasibility of scaling up further to a whole warehouse.
Keywords: fire testing, numerical coupling, sandwich panels, thermo fluids
Procedia PDF Downloads 79
18391 A Computational Framework for Decoding Hierarchical Interlocking Structures with SL Blocks
Authors: Yuxi Liu, Boris Belousov, Mehrzad Esmaeili Charkhab, Oliver Tessmann
Abstract:
This paper presents a computational solution for designing reconfigurable interlocking structures that are fully assembled with SL blocks. Formed by S-shaped and L-shaped tetracubes, the SL block is a specific type of interlocking puzzle. Analogous to molecular self-assembly, the aggregation of SL blocks builds a reversible, hierarchical, and discrete system in which a single module can be numerously replicated to compose semi-interlocking components that further align, wrap, and braid around each other to form complex high-order aggregations. These aggregations can be disassembled and reassembled, responding dynamically to design inputs and changes with a unique capacity for reconfiguration. To use these aggregations as architectural structures, we developed computational tools that automate the configuration of SL blocks based on architectural design objectives. There are three critical phases in our work. First, we revisit the hierarchy of the SL block system and devise a top-down design strategy. From this, we propose two key questions: 1) How to translate 3D polyominoes into SL block assembly? 2) How to decompose the desired voxelized shapes into a set of 3D polyominoes with interlocking joints? These two questions can be considered as the Hamiltonian path problem and the 3D polyomino tiling problem, respectively. Then, we derive our solution to each of them based on two methods. The first method is to construct the optimal closed path from an undirected graph built from the voxelized shape and translate the node sequence of the resulting path into the assembly sequence of SL blocks. The second approach describes interlocking relationships of 3D polyominoes as a joint connection graph. Lastly, we formulate the desired shapes and leverage our methods to achieve their reconfiguration at different levels. We show that our computational strategy facilitates the efficient design of hierarchical interlocking structures with a self-replicating geometric module.
Keywords: computational design, SL-blocks, 3D polyomino puzzle, combinatorial problem
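As an illustration of the first method, the sketch below (our simplification, not the authors' tool) searches for a Hamiltonian path through the 6-neighbour adjacency graph of a voxel set; the node sequence of such a path would then be read as an SL-block assembly order.

```python
# Backtracking search for a Hamiltonian path over a voxelized shape's
# adjacency graph; the resulting node sequence orders the block assembly.
from itertools import product

def voxel_graph(voxels):
    """6-neighbour adjacency over a set of (x, y, z) voxels."""
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    graph = {}
    for x, y, z in voxels:
        nbrs = [(x + dx, y + dy, z + dz) for dx, dy, dz in offsets]
        graph[(x, y, z)] = [n for n in nbrs if n in voxels]
    return graph

def hamiltonian_path(graph, path):
    if len(path) == len(graph):        # every voxel visited exactly once
        return path
    for nxt in graph[path[-1]]:
        if nxt not in path:
            found = hamiltonian_path(graph, path + [nxt])
            if found:
                return found
    return None

voxels = set(product(range(2), range(2), range(2)))  # a 2x2x2 cube of voxels
print(hamiltonian_path(voxel_graph(voxels), [(0, 0, 0)]))
```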
Procedia PDF Downloads 129
18390 Imaginal and in Vivo Exposure Blended with Emdr: Becoming Unstuck, an Integrated Inpatient Treatment for Post-Traumatic Stress Disorder
Authors: Merrylord Harb-Azar
Abstract:
Traditionally, PTSD treatment has involved trauma-focused cognitive behaviour therapy (TF CBT) to consolidate traumatic memories. A piloted integrated treatment of TF CBT and the eight-phase eye movement desensitisation and reprocessing therapy (EMDR) will hasten the rate at which memories are consolidated and enhance cognitive functioning in patients with PTSD. Patients spend a considerable amount of time in treatment managing traumas experienced firsthand or through aversive details, ranging from war, assaults, accidents, abuse, and hostage-related events to riots and natural disasters. The time spent in treatment or as an inpatient affects overall quality of life, relationships, cognitive functioning, and overall sense of identity. EMDR is being offered twice a week in conjunction with standard prolonged exposure as an inpatient treatment in a private hospital. Prolonged exposure for up to 5 hours per day elicits the affect response required for EMDR sessions in the afternoon to unlock unprocessed memories and facilitate consolidation in the amygdala and hippocampus. Results indicate faster consolidation of memories, reduction in symptoms in a shorter period of time, and reduction in admission time, which is enhancing quality of life and relationships and improving cognition. Results on the Impact of Events Scale (IES), the Trauma Symptoms Inventory (TSI), and the Posttraumatic Stress Disorder Checklist (PCL) demonstrate a significant reduction in symptoms, with large effect sizes to date. An integrated treatment approach for PTSD achieves a faster resolution of memories, improves cognition, and reduces the amount of time spent in therapy.
Keywords: EMDR enhances cognitive functioning, faster consolidation of trauma memory, integrated treatment of TF CBT and EMDR, reduction in inpatient admission time
Procedia PDF Downloads 145
18389 Structural Equation Modeling Semiparametric Truncated Spline Using Simulation Data
Authors: Adji Achmad Rinaldo Fernandes
Abstract:
SEM analysis is a complex multivariate analysis because it involves a number of exogenous and endogenous variables that are interconnected to form a model. The measurement model is divided into two, namely the reflective model (reflecting) and the formative model (forming). Before carrying out further tests on SEM, there are assumptions that must be met, namely the linearity assumption, to determine the form of the relationship. There are three modeling approaches to path analysis: parametric, nonparametric, and semiparametric. The aim of this research is to develop semiparametric SEM and obtain the best model. The data used in the research are secondary data that serve as the basis for generating the simulation data. Simulation data were generated with various sample sizes of 100, 300, and 500. In the semiparametric SEM analysis, the forms of the relationship studied were linear and quadratic, with one and two knot points and various levels of error variance (EV = 0.5; 1; 5). There are three levels of closeness of relationship for the analysis process in the measurement model, consisting of low (0.1-0.3), medium (0.4-0.6), and high (0.7-0.9) levels of closeness. The best model was obtained with a linear form of the relationship between X1 and Y1. In the measurement model, a characteristic of the reflective model is obtained, namely that the higher the closeness of the relationship, the better the model obtained. The originality of this research is the development of semiparametric SEM, which has not been widely studied by researchers.
Keywords: semiparametric SEM, measurement model, structural model, reflective model, formative model
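As a worked illustration of the truncated-spline building block named in the title (our sketch, not the author's code), the basis below combines a polynomial of the chosen degree with one truncated-power term per knot.

```python
# Truncated power basis: 1, x, ..., x^d, plus (x - k)_+^d for each knot k.
import numpy as np

def truncated_spline_basis(x, knots, degree=1):
    cols = [x ** d for d in range(degree + 1)]
    cols += [np.maximum(x - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

x = np.linspace(0.0, 1.0, 100)
B_lin = truncated_spline_basis(x, knots=[0.5], degree=1)        # linear, one knot
B_quad = truncated_spline_basis(x, knots=[0.3, 0.7], degree=2)  # quadratic, two knots
print(B_lin.shape, B_quad.shape)   # (100, 3) (100, 5)
```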
Procedia PDF Downloads 41
18388 A Lightweight Pretrained Encrypted Traffic Classification Method with Squeeze-and-Excitation Block and Sharpness-Aware Optimization
Authors: Zhiyan Meng, Dan Liu, Jintao Meng
Abstract:
Dependable encrypted traffic classification is crucial for improving cybersecurity and handling the growing amount of data. Large language models have shown that learning from large datasets can be effective, making pre-trained methods for encrypted traffic classification popular. However, attention-based pre-trained methods face two main issues: their large number of neural parameters is not suitable for low-computation environments like mobile devices and real-time applications, and they often overfit by getting stuck in local minima. To address these issues, we developed a lightweight transformer model, which reduces the computational parameters through lightweight vocabulary construction and a Squeeze-and-Excitation block. We use sharpness-aware optimization to avoid local minima during pre-training and capture temporal features with relative positional embeddings. Our approach keeps the model's classification accuracy high for downstream tasks. We conducted experiments on four datasets: USTC-TFC2016, VPN 2016, Tor 2016, and CICIOT 2022. Even with fewer than 18 million parameters, our method achieves classification results similar to those of methods with ten times as many parameters.
Keywords: sharpness-aware optimization, encrypted traffic classification, squeeze-and-excitation block, pretrained model
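A minimal PyTorch sketch of the Squeeze-and-Excitation block itself, following the published SE design; how the authors wire it into their lightweight transformer is not reproduced here.

```python
# Squeeze-and-Excitation over the channel features of a sequence:
# global-average-pool, bottleneck MLP, sigmoid gate, channel reweighting.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # squeeze
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),  # excite
            nn.Sigmoid(),
        )

    def forward(self, x):                 # x: (batch, channels, seq_len)
        weights = self.fc(x.mean(dim=-1))
        return x * weights.unsqueeze(-1)

x = torch.randn(8, 64, 128)
print(SEBlock(64)(x).shape)               # torch.Size([8, 64, 128])
```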
Procedia PDF Downloads 30
18387 Newly Designed Ecological Task to Assess Cognitive Map Reading Ability: Behavioral Neuro-Anatomic Correlates of Mental Navigation
Authors: Igor Faulmann, Arnaud Saj, Roland Maurer
Abstract:
Spatial cognition consists of a plethora of high-level cognitive abilities; among them, the ability to learn and to navigate in large-scale environments is probably one of the most complex skills. Navigation is thought to rely on the ability to read a cognitive map, defined as an allocentric representation of one's environment. These representations are intimately related to the two geometrical primitives of the environment: distance and direction. Also, many recent studies point to a predominant hippocampal and parahippocampal role in spatial cognition, as well as in the more specific cluster of navigational skills. In a previous study in humans, we used a newly validated test assessing cognitive map processing by evaluating the ability to judge relative distances and directions: the CMRT (Cognitive Map Recall Test). This study identified in topographically disorientated patients (1) behavioral differences between the evaluation of distances and of directions, and (2) distinct causality patterns assessed via VLSM (i.e., distinct cerebral lesions cause distinct response patterns depending on the modality, distance vs. direction questions). Thus, we hypothesized that: (1) if the CMRT really taps into the same resources as real navigation, there would be hippocampal, parahippocampal, and parietal activation, and (2) there exist underlying neuroanatomical and functional differences between the processing of these two modalities. Aiming toward a better understanding of the neuroanatomical correlates of the CMRT in humans, and more generally toward a better understanding of how the brain processes the cognitive map, we adapted the CMRT as an fMRI procedure. 23 healthy subjects (11 women, 12 men), all living in Geneva for at least 2 years, underwent the CMRT in fMRI. Results show, for distance and direction taken together, that the most active brain regions are the parietal, frontal, and cerebellar areas. Additionally, and as expected, patterns of brain activation differ when comparing the two modalities. Furthermore, distance processing seems to rely more on parietal regions (compared to other brain regions in the same modality and also to direction). It is interesting to note that no significant activity was observed in the hippocampal or parahippocampal areas for this modality. Direction processing seems to tap more into frontal and cerebellar brain regions (compared to other brain regions in the same modality and also to distance). Significant hippocampal and parahippocampal activity was shown only in this modality. These results demonstrate a complex interaction of structures that is compatible with response patterns observed in other navigational tasks, thus showing that the CMRT taps at least partially into the same brain resources as real navigation. Additionally, the differences between the processing of distances and directions lead to the conclusion that the human brain processes each modality distinctly. Further research should focus on the dynamics of this processing, allowing a clearer understanding of the two sub-processes.
Keywords: cognitive map, navigation, fMRI, spatial cognition
Procedia PDF Downloads 294
18386 Predicting the Exposure Level of Airborne Contaminants in Occupational Settings via the Well-Mixed Room Model
Authors: Alireza Fallahfard, Ludwig Vinches, Stephane Halle
Abstract:
In the workplace, the exposure level of airborne contaminants should be evaluated due to health and safety issues. This can be done by numerical models or experimental measurements, but the numerical approach can be useful when it is challenging to perform experiments. One of the simplest models is the well-mixed room (WMR) model, which has shown its usefulness for predicting inhalation exposure in many situations. However, since the WMR is limited to gases and vapors, it cannot be used to predict exposure to aerosols. The main objective is to modify the WMR model to expand its application to exposure scenarios involving aerosols. To reach this objective, the standard WMR model has been modified to consider the deposition of particles by gravitational settling and Brownian and turbulent deposition. Three deposition models were implemented in the model. The time-dependent concentrations of airborne particles predicted by the model were compared to experimental results obtained in a 0.512 m³ chamber. Polystyrene particles of 1, 2, and 3 µm in aerodynamic diameter were generated with a nebulizer under two air change per hour (ACH) conditions. The well-mixed condition and chamber ACH were determined by the tracer gas decay method. The mean friction velocity on the chamber surfaces, one of the input variables for the deposition models, was determined by computational fluid dynamics (CFD) simulation. For the experimental procedure, the particles were generated until reaching the steady-state condition (emission period). Then generation stopped, and concentration measurements continued until reaching the background concentration (decay period). The results of the tracer gas decay tests revealed that the ACHs of the chamber were 1.4 and 3.0, and the well-mixed condition was achieved. The CFD results showed that the average mean friction velocities and their standard deviations for the lowest and highest ACH were (8.87 ± 0.36) × 10⁻² m/s and (8.88 ± 0.38) × 10⁻² m/s, respectively. The numerical results indicated that the difference between the deposition rates predicted by the three deposition models was less than 2%. The experimental and numerical aerosol concentrations were compared in the emission period and decay period. In both periods, the prediction accuracy of the modified model improved in comparison with the classic WMR model. However, there is still a difference between the actual value and the predicted value. In the emission period, the modified WMR results closely follow the experimental data. However, the model significantly overestimates the experimental results during the decay period. This finding is mainly due to an underestimation of the deposition rate in the model and uncertainty related to measurement devices and particle size distribution. Comparing the experimental and numerical deposition rates revealed that the actual particle deposition rate is significant, but the deposition rate given by the mechanisms considered in the model was ten times lower than the experimental value. Thus, particle deposition is significant, will affect the airborne concentration in occupational settings, and should be considered in airborne exposure prediction models. The role of other removal mechanisms should be investigated.
Keywords: aerosol, CFD, exposure assessment, occupational settings, well-mixed room model, zonal model
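A minimal sketch of the modified mass balance described above (our illustration, with made-up source and deposition parameters): the classic WMR loss term, the air change rate, is augmented with a first-order particle deposition loss rate.

```python
# Well-mixed room balance with deposition: dC/dt = G/V - (ACH + beta) * C,
# integrated by forward Euler over an emission period then a decay period.
V = 0.512            # chamber volume, m^3 (from the study)
ach = 1.4 / 3600     # air change rate, 1/s
beta = 0.5 / 3600    # first-order deposition loss rate, 1/s (assumed)
G = 1e-9             # particle generation rate, kg/s (assumed)

dt, t_end = 1.0, 2 * 3600.0
C = 0.0
for step in range(int(t_end / dt)):
    generating = step * dt < 3600.0            # emission stops after one hour
    source = G / V if generating else 0.0
    C += dt * (source - (ach + beta) * C)
print(f"concentration after {t_end / 3600:.0f} h: {C:.3e} kg/m^3")
```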
Procedia PDF Downloads 103
18385 Remembering Route in an Unfamiliar Homogenous Environment
Authors: Ahmed Sameer, Braj Bhushan
Abstract:
The objective of our study was to compare two techniques (no landmark vs. imaginary landmark) for remembering a route while traversing an unfamiliar homogenous environment. We used two videos, each having nine identical turns with no landmarks. In the first video, the participant was required to remember the sequence of turns. In the second video, the participant was required to imagine a landmark at each turn and associate the turn with it. In both tasks, the participant was asked to recall the sequence of turns as it appeared in the video. Results showed that performance in the first condition, i.e., without the use of landmarks, was better than in the imaginary landmark condition. The difference, however, became significant when the participants were tested again about 30 minutes later, though performance was still better in the no-landmark condition. The finding is surprising given past research on memory and is explained in terms of cognitive factors such as mental workload.
Keywords: wayfinding, landmarks, unfamiliar environment, cognitive psychology
Procedia PDF Downloads 476
18384 Documenting the 15th Century Prints with RTI
Authors: Peter Fornaro, Lothar Schmitt
Abstract:
The Digital Humanities Lab and the Institute of Art History at the University of Basel are collaborating in the SNSF research project ‘Digital Materiality’. Its goal is to develop and enhance existing methods for the digital reproduction of cultural heritage objects in order to support art historical research. One part of the project focuses on the visualization of a small eye-catching group of early prints that are noteworthy for their subtle reliefs and glossy surfaces. Additionally, this group of objects – known as ‘paste prints’ – is characterized by its fragile state of preservation. Because of the brittle substances that were used for their production, most paste prints are heavily damaged and thus very hard to examine. These specific material properties make a photographic reproduction extremely difficult. To obtain better results we are working with Reflectance Transformation Imaging (RTI), a computational photographic method that is already used in archaeological and cultural heritage research. This technique allows documenting how three-dimensional surfaces respond to changing lighting situations. Our first results show that RTI can capture the material properties of paste prints and their current state of preservation more accurately than conventional photographs, although there are limitations with glossy surfaces because the mathematical models that are included in RTI are kept simple in order to keep the software robust and easy to use. To improve the method, we are currently developing tools for a more detailed analysis and simulation of the reflectance behavior. An enhanced analytical model for the representation and visualization of gloss will increase the significance of digital representations of cultural heritage objects. For collaborative efforts, we are working on a web-based viewer application for RTI images based on WebGL in order to make acquired data accessible to a broader international research community. At the ICDH Conference, we would like to present unpublished results of our work and discuss the implications of our concept for art history, computational photography and heritage science.
Keywords: art history, computational photography, paste prints, reflectance transformation imaging
Procedia PDF Downloads 276
18383 Fuzzy Data, Random Drift, and a Theoretical Model for the Sequential Emergence of Religious Capacity in Genus Homo
Authors: Margaret Boone Rappaport, Christopher J. Corbally
Abstract:
The ancient ape ancestral population from which living great ape and human species evolved had demographic features affecting their evolution. The population was large, had great genetic variability, and natural selection was effective at honing adaptations. The emerging populations of chimpanzees and humans were affected more by founder effects and genetic drift because they were smaller. Natural selection did not disappear, but it was not as strong. Consequences of the 'population crash' and the human effective population size are introduced briefly. The history of the ancient apes is written in the genomes of living humans and great apes. The expansion of the brain began before the human line emerged. Coalescence times for some genes are very old – up to several million years, long before Homo sapiens. The mismatch between gene trees and species trees highlights the anthropoid speciation processes, and gives the human genome history a fuzzy, probabilistic quality. However, it suggests traits that might form a foundation for capacities emerging later. A theoretical model is presented in which the genomes of early ape populations provide the substructure for the emergence of religious capacity later on the human line. The model does not search for religion, but for its foundations. It suggests a course by which an evolutionary line that began with prosimians eventually produced a human species with biologically based religious capacity. The model of the sequential emergence of religious capacity relies on cognitive science, neuroscience, paleoneurology, primate field studies, cognitive archaeology, genomics, and population genetics. It emphasizes five trait types: (1) Documented, positive selection of sensory capabilities on the human line may have favored survival, but also eventually enriched human religious experience. (2) The bonobo model suggests a possible down-regulation of aggression and increase in tolerance while feeding, as well as paedomorphism – but in a human species that remains cognitively sharp (unlike the bonobo). The two species emerged from the same ancient ape population, so it is logical to search for shared traits. (3) An up-regulation of emotional sensitivity and compassion seems to have occurred on the human line. This finds support in modern genetic studies. (4) The authors' published model of morality's emergence in Homo erectus encompasses a cognitively based decision-making capacity that was hypothetically overtaken, in part, by religious capacity. Together, they produced a strong, variable, biocultural capability to support human sociability. (5) The full flowering of human religious capacity came with the parietal expansion and smaller face (klinorhynchy) found only in Homo sapiens. Details from paleoneurology suggest the stage was set for human theologies. Larger parietal lobes allowed humans to imagine inner spaces, processes, and beings, and, with the frontal lobe, led to the first theologies composed of structured and integrated theories of the relationships between humans and the supernatural. The model leads to the evolution of a small population of African hominins that was ready to emerge with religious capacity when the species Homo sapiens evolved two hundred thousand years ago. By 50,000-60,000 years ago, when human ancestors left Africa, they were fully enabled.
Keywords: genetic drift, genomics, parietal expansion, religious capacity
Procedia PDF Downloads 341
18382 A Context Aware Mobile Learning System with a Cognitive Recommendation Engine
Authors: Jalal Maqbool, Gyu Myoung Lee
Abstract:
Using smart devices for context-aware mobile learning is becoming increasingly popular. This has led to mobile learning technology becoming an indispensable part of today’s learning environments and platforms. However, some fundamental issues remain - namely, mobile learning still lacks the ability to truly understand human reaction and user behaviour. This is due to the fact that current mobile learning systems are passive and not aware of learners’ changing contextual situations. They rely on static information about mobile learners. In addition, current mobile learning platforms lack the capability to incorporate dynamic contextual situations into learners’ preferences. Thus, this thesis aims to address the issues highlighted by designing a context-aware framework which is able to sense a learner’s contextual situations, handle data dynamically, and use contextual information to suggest bespoke learning content according to the learner’s preferences. This is to be underpinned by a robust recommendation system which has the capability to perform these functions, thus providing learners with a truly context-aware mobile learning experience, delivering learning content using smart devices and adapting to learning preferences as and when required. In addition, the design of the recommendation engine’s algorithm has to be based on learner and application needs, personal characteristics, and circumstances, and it must be able to comprehend human cognitive processes, enabling the technology to interact effectively and deliver mobile learning content that is relevant to the learner’s contextual situations. The concept of this proposed project is to provide a new method of smart learning, based on a capable recommendation engine, for an intuitive mobile learning model driven by learner actions.
Keywords: aware, context, learning, mobile
Procedia PDF Downloads 245
18381 A Lagrangian Hamiltonian Computational Method for Hyper-Elastic Structural Dynamics
Authors: Hosein Falahaty, Hitoshi Gotoh, Abbas Khayyer
Abstract:
The performance of a Hamiltonian-based particle method in the simulation of nonlinear structural dynamics is investigated in terms of stability and accuracy. The governing equation of motion is derived based on Hamilton's principle of least action, while the deformation gradient is obtained according to the Weighted Least Square method. The hyper-elasticity models of Saint Venant-Kirchhoff and a compressible version similar to Mooney-Rivlin are employed for the calculation of the second Piola-Kirchhoff stress tensor. The stability and accuracy of the numerical model are verified by reproducing critical stress fields in static and dynamic responses. As a result, although the performance of the Hamiltonian-based model is evaluated as acceptable in dealing with intense extensional stress fields, instabilities appear in the case of violent collisions, which can most likely be attributed to zero-energy singular modes.
Keywords: Hamilton's principle of least action, particle-based method, hyper-elasticity, analysis of stability
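A worked sketch of the Saint Venant-Kirchhoff constitutive step named above (our illustration, not the authors' solver; the Lamé parameters are made up): the second Piola-Kirchhoff stress follows from the deformation gradient via the Green-Lagrange strain.

```python
# Saint Venant-Kirchhoff: E = 0.5 (F^T F - I), S = lam tr(E) I + 2 mu E.
import numpy as np

def svk_second_piola_kirchhoff(F, lam=1.0e6, mu=0.5e6):
    E = 0.5 * (F.T @ F - np.eye(3))          # Green-Lagrange strain
    return lam * np.trace(E) * np.eye(3) + 2.0 * mu * E

F = np.eye(3) + np.array([[0.010, 0.002, 0.000],
                          [0.000, 0.010, 0.000],
                          [0.000, 0.000, -0.005]])   # small test deformation gradient
print(svk_second_piola_kirchhoff(F))
```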
Procedia PDF Downloads 341
18380 On the Evaluation of Different Turbulence Models through the Displacement of Oil-Water Flow in Porous Media
Authors: Sidique Gawusu, Xiaobing Zhang
Abstract:
Turbulence models play a significant role in all computational fluid dynamics based modelling approaches. There is, however, no general turbulence model suitable for all flow scenarios. Therefore, a successful numerical modelling approach is only achievable if a more appropriate closure model is used. This paper evaluates different turbulence models in the numerical modelling of oil-water flow within the Eulerian-Eulerian approach. A comparison between the obtained numerical results and published benchmark data showed reasonable agreement. The domain was meshed using a structured mesh, and a grid test was performed to ascertain grid independence. The evaluation of the models was made through analysis of velocity and pressure profiles across the domain. The models were assessed for their suitability to deliver scalable and precise numerical results. As a result, it is found that all the models except Standard-ω provide comparable results. The study also revealed new insights on flow in porous media, specifically oil reservoirs.
Keywords: turbulence modelling, simulation, multi-phase flows, water-flooding, heavy oil
Procedia PDF Downloads 279
18379 Metabolic Predictive Model for PMV Control Based on Deep Learning
Authors: Eunji Choi, Borang Park, Youngjae Choi, Jinwoo Moon
Abstract:
In this study, a predictive model for estimating the metabolism (MET) of the human body was developed for the optimal control of the indoor thermal environment. Human body images for indoor activities and human body joint coordinate values were collected as the data sets used in the predictive model. A deep learning algorithm was used in an initial model, and its numbers of hidden layers and hidden neurons were optimized. Lastly, the model's prediction performance was analyzed after the model was trained on the collected data. In conclusion, the possibility of MET prediction was confirmed, and directions for future study were proposed, namely developing more varied data and refining the predictive model.
Keywords: deep learning, indoor quality, metabolism, predictive model
Procedia PDF Downloads 257
18378 Model Averaging in a Multiplicative Heteroscedastic Model
Authors: Alan Wan
Abstract:
In recent years, the body of literature on frequentist model averaging in statistics has grown significantly. Most of this work focuses on models with different mean structures but leaves out the variance consideration. In this paper, we consider a regression model with multiplicative heteroscedasticity and develop a model averaging method that combines maximum likelihood estimators of unknown parameters in both the mean and variance functions of the model. Our weight choice criterion is based on a minimisation of a plug-in estimator of the model average estimator's squared prediction risk. We prove that the new estimator possesses an asymptotic optimality property. Our investigation of finite-sample performance by simulations demonstrates that the new estimator frequently exhibits very favourable properties compared to some existing heteroscedasticity-robust model average estimators. The model averaging method hedges against the selection of very bad models and serves as a remedy to variance function misspecification, which often discourages practitioners from modeling heteroscedasticity altogether. The proposed model average estimator is applied to the analysis of two real data sets.
Keywords: heteroscedasticity-robust, model averaging, multiplicative heteroscedasticity, plug-in, squared prediction risk
Procedia PDF Downloads 385
18377 Reliability Prediction of Tires Using Linear Mixed-Effects Model
Authors: Myung Hwan Na, Ho- Chun Song, EunHee Hong
Abstract:
Normal linear mixed-effects models are widely used to analyze data from repeated measurements. When heteroscedasticity and non-normality of the population distribution are detected at the same time, the normal linear mixed-effects model can give improper analysis results. To achieve more robust estimation, we use a heavy-tailed linear mixed-effects model, which gives more exact and reliable conclusions than the standard normal linear mixed-effects model.
Keywords: reliability, tires, field data, linear mixed-effects model
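A minimal statsmodels sketch of the normal linear mixed-effects baseline the abstract argues against (our illustration on simulated tire-wear data contaminated with heavy-tailed noise; the heavy-tailed estimator itself is not part of this library and is not shown).

```python
# Fit a normal linear mixed-effects model (random intercept per tire) to
# simulated repeated wear measurements with heavy-tailed errors.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_tires, n_times = 20, 5
data = pd.DataFrame({
    "tire": np.repeat(np.arange(n_tires), n_times),
    "time": np.tile(np.arange(n_times), n_tires),
})
tire_effect = rng.normal(0.0, 0.3, n_tires)           # random intercepts
noise = rng.standard_t(3, size=len(data)) * 0.1       # heavy-tailed errors
data["wear"] = 1.0 + 0.5 * data["time"] + tire_effect[data["tire"]] + noise

model = sm.MixedLM.from_formula("wear ~ time", data, groups="tire")
print(model.fit().summary())
```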
Procedia PDF Downloads 564
18376 Cognitive Benefits of Being Bilingual: The Effect of Language Learning on the Working Memory in Emerging Miao-Mandarin Juveniles in Rural Regions of China
Authors: Peien Ma
Abstract:
The bilingual effect/advantage hypothesis theorizes a positive effect of being bilingual on general cognitive abilities, but it was unknown which factors tend to modulate these bilingualism effects on working memory capacity. This study conducted empirical field research on a group of low-SES emerging bilinguals, Miao people, in the hill tribes of rural China to investigate whether bilingualism affected their verbal working memory performance. 20 Miao-Chinese bilinguals (13 girls and 7 boys with a mean age of 11.45, SD=1.67) and 20 Chinese monolingual peers (13 girls and 7 boys with a mean age of 11.6, SD=0.68) were recruited. These bilingual and monolingual juveniles, matched on age, sex, socioeconomic status, and educational status, completed a language background questionnaire and a standard forward and backward digit span test adapted from the Wechsler Adult Intelligence Scale-Revised (WAIS-R). The results showed that the bilinguals earned a significantly higher overall mean score on the task, suggesting superior working memory ability relative to the monolinguals. These bilingual cognitive benefits were independent of proficiency levels in the learners' two languages. The results suggest that bilingualism enhances working memory in sequential bilinguals from low-SES backgrounds and shed light on our understanding of the bilingual advantage from a psychological and social perspective.
Keywords: bilingual effects, heritage language, Miao/Hmong language, Mandarin, working memory
Procedia PDF Downloads 157
18375 A Method for Reduction of Association Rules in Data Mining
Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa
Abstract:
The use of association rule algorithms within data mining is recognized as being of great value in knowledge discovery in databases. Very often, the number of rules generated is high, sometimes even in databases with a small volume, so the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated with association algorithms. To this end, a computational algorithm was developed with the use of the Weka Application Programming Interface, which allows the execution of the method on different types of databases. After development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where the worst case presented a gain of more than 50%, considering the concepts of support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the results of association rules generated from the use of algorithms.
Keywords: data mining, association rules, rules reduction, artificial intelligence
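The paper drives Weka through its Java API; the self-contained Python sketch below is our substitute illustration of the underlying reduction idea: generate single-item candidate rules and keep only those that pass support, confidence, and lift thresholds.

```python
# Prune candidate association rules by support, confidence, and lift.
from itertools import combinations

transactions = [
    {"bread", "milk"}, {"bread", "milk"}, {"bread"}, {"butter"},
]
n = len(transactions)
support = lambda items: sum(items <= t for t in transactions) / n

items = sorted(set().union(*transactions))
rules = [(frozenset({a}), frozenset({b})) for a, b in combinations(items, 2)]
rules += [(rhs, lhs) for lhs, rhs in rules]          # both directions

kept = []
for lhs, rhs in rules:
    s = support(lhs | rhs)
    conf = s / support(lhs)
    lift = conf / support(rhs)
    if s >= 0.5 and conf >= 0.6 and lift > 1.0:      # pruning thresholds
        kept.append((set(lhs), set(rhs), round(conf, 2), round(lift, 2)))

print(f"{len(rules)} candidate rules -> {len(kept)} after reduction")
```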
Procedia PDF Downloads 161
18374 Characterization of Group Dynamics for Fostering Mathematical Modeling Competencies
Authors: Ayse Ozturk
Abstract:
The study extends prior research on modeling competencies by positioning students' cognitive and language resources as the foundation for pursuing their own lines of inquiry and expression through mathematical modeling. This strategy aims to answer the question that guides this study: "How do students' group approaches to modeling tasks affect their modeling competencies over a unit of instruction?" Six bilingual tenth-grade students worked on open-ended modeling problems, along with content focused on quantities, over six weeks. Each group was found to have a unique cognitive approach to solving these problems. Three different problem-solving strategies affected how the groups' modeling competencies changed. The results provide evidence that the discussion around groups' solutions, coupled with their reflections, advances group interpreting and validating competencies in the mathematical modeling process.
Keywords: cognition, collective learning, mathematical modeling competencies, problem-solving
Procedia PDF Downloads 159
18373 A Numerical Study on the Effects of N2 Dilution on the Flame Structure and Temperature Distribution of Swirl Diffusion Flames
Authors: Yasaman Tohidi, Shidvash Vakilipour, Saeed Ebadi Tavallaee, Shahin Vakilipoor Takaloo, Hossein Amiri
Abstract:
Numerical modeling is performed to study the effects of N2 addition to the fuel stream on the flame structure and temperature distribution of methane-air swirl diffusion flames with different swirl intensities. The open-source Field Operation and Manipulation (OpenFOAM) toolbox has been utilized as the computational tool. The flamelet approach, along with a modified k-ε model, is employed to model the flame characteristics. The results indicate that the presence of N2 in the fuel stream leads to a reduction in flame temperature. With increasing swirl intensity, the flame structure changes significantly. The flame has a conical shape at low swirl intensity; however, it has an hourglass shape with a shorter length at high swirl intensity. N2 dilution decreases the flame length at all swirl intensities; however, the rate of reduction is more noticeable at low swirl intensity.
Keywords: swirl diffusion flame, N2 dilution, OpenFOAM, swirl intensity
Procedia PDF Downloads 169
18372 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model
Authors: Donatella Giuliani
Abstract:
In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. Firstly, the Firefly Algorithm has been applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique centered on the flashing characteristics of fireflies. In this context, it is used to determine the number of clusters and the related cluster means in a histogram-based segmentation approach. Subsequently, these means are used in the initialization step for the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as the prior probabilities of each component. Applying the Bayes rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears fairly solid and reliable even when applied to complex grayscale images. The validation has been performed using different standard measures, more precisely: the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK), and the Davies-Bouldin (DB) index. The achieved results have strongly confirmed the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a considerable reduction in computational cost.
Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation
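A condensed scikit-learn sketch of the pipeline's second half (our illustration): cluster means seed the EM fit of a Gaussian mixture, and each intensity goes to the component of maximum posterior responsibility. The percentile-based initialization below is a crude stand-in for the firefly histogram search.

```python
# Seed a Gaussian mixture with initial means, fit by EM, and assign each
# pixel intensity to the component with maximum posterior responsibility.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
gray = np.clip(np.concatenate([rng.normal(60, 10, 4000),
                               rng.normal(170, 15, 6000)]), 0, 255)
intensities = gray.reshape(-1, 1)

init_means = np.percentile(gray, [25, 75]).reshape(-1, 1)  # stand-in for firefly-found means
gmm = GaussianMixture(n_components=2, means_init=init_means).fit(intensities)

labels = gmm.predict(intensities)        # argmax of posterior responsibilities
print(gmm.means_.ravel(), np.bincount(labels))
```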
Procedia PDF Downloads 217
18371 Health Risk Assessment of Exposing to Benzene in Office Building around a Chemical Industry Based on Numerical Simulation
Authors: Majid Bayatian, Mohammadreza Ashouri
Abstract:
Releasing hazardous chemicals is one of the major problems for office buildings in the chemical industry, and therefore environmental risks are inherent to these environments. The adverse health effects of airborne concentrations of benzene have been a matter of significant concern, especially in oil refineries. The chronic and acute adverse health effects caused by benzene exposure have attracted wide attention. Acute exposure to benzene through inhalation could cause headaches, dizziness, drowsiness, and irritation of the skin. Chronic exposure has been reported to cause aplastic anemia and leukemia in occupational settings. The association between chronic occupational exposure to benzene and the development of aplastic anemia and leukemia has been documented by several epidemiological studies. Numerous research works have investigated benzene emissions, determined benzene concentrations at different locations of the refinery plant, and reported considerable health risks. The high cost of industrial control measures requires justification through lifetime health risk assessment of exposed workers and the public. In the present study, a Computational Fluid Dynamics (CFD) model has been proposed to assess the exposure risk of an office building near a refinery due to its release of benzene. For the simulation, GAMBIT, FLUENT, and CFD-Post software were used as the pre-processor, processor, and post-processor, and the model was validated by comparison with experimental results for benzene concentration and wind speed. Model validation results showed that the model is highly valid, and it can therefore be used for health risk assessment. The simulation and risk assessment results showed that benzene could disperse to a nearby office building and that the exposure risk was unacceptable. According to the results of this study, a validated CFD model could be very useful for decision-makers regarding control measures and could support them in emergency planning for probable accidents. Also, this model can be used to assess exposure in various types of accidents as well as to other pollutants such as toluene, xylene, and ethylbenzene under different atmospheric conditions.
Keywords: health risk assessment, office building, benzene, numerical simulation, CFD
Procedia PDF Downloads 130
18370 Role of Collaborative Cultural Model to Step on Cleaner Energy: A Case of Kathmandu City Core
Authors: Bindu Shrestha, Sudarshan R. Tiwari, Sushil B. Bajracharya
Abstract:
Urban household cooking fuel choice is highly influenced by human behavior and energy culture parameters such as cognitive norms, material culture, and practices. Although these parameters have a leading role in moving Kathmandu's households toward cleaner fuels, they are not incorporated in the city's energy policy. This paper aims to identify trade-offs for transforming residents' cooking behavior towards cleaner technology, drawing on a questionnaire survey, observation, mapping, interviews, and quantitative analysis. The analysis recommends implementing a Collaborative Cultural Model (CCM) at the policy level to change the neighborhood's impact. The results showed that each household produces 439.56 kg of carbon emissions each year and that 20 percent used unclean technology due to low income levels. Residents who used liquefied petroleum gas (LPG) as their cooking fuel suffered from an energy crisis every year that created fuel hoarding, which ultimately creates more energy demand and carbon exposure. In conclusion, carbon emissions can be reduced by improving the residents' energy consumption culture. The study recommends that the city use the holistic action of changing habits as the soft power of collaboration, in a two-way participation approach among residents, the private sector, and government, to change their energy culture and behavior at the policy level.
Keywords: energy consumption pattern, collaborative cultural model, energy culture, fuel stacking
Procedia PDF Downloads 134
18369 Hybridization of Manually Extracted and Convolutional Features for Classification of Chest X-Ray of COVID-19
Authors: M. Bilal Ishfaq, Adnan N. Qureshi
Abstract:
COVID-19 is the most infectious disease of recent times; it was first reported in Wuhan, the capital city of Hubei province in China, and then spread rapidly throughout the whole world. On 11 March 2020, the World Health Organisation (WHO) declared it a pandemic. Since COVID-19 is highly contagious, it has affected approximately 219M people worldwide and caused 4.55M deaths. It has brought the importance of accurate diagnosis of respiratory diseases such as pneumonia and COVID-19 to the forefront. In this paper, we propose a hybrid approach for the automated detection of COVID-19 using medical imaging. We present the hybridization of manually extracted and convolutional features. Our approach combines Haralick texture features and convolutional features extracted from chest X-rays and CT scans. We also employ a minimum redundancy maximum relevance (MRMR) feature selection algorithm to reduce computational complexity and enhance classification performance. The proposed model is evaluated on four publicly available datasets, including Chest X-ray Pneumonia, COVID-19 Pneumonia, COVID-19 CTMaster, and VinBig data. The results demonstrate high accuracy and effectiveness, with 0.9925 on the Chest X-ray Pneumonia dataset, 0.9895 on the COVID-19, Pneumonia and Normal Chest X-ray dataset, 0.9806 on the COVID CTMaster dataset, and 0.9398 on the VinBig dataset. We further evaluate the effectiveness of the proposed model using ROC curves, where the AUC for the best-performing model reaches 0.96. Our proposed model provides a promising tool for the early detection and accurate diagnosis of COVID-19, which can assist healthcare professionals in making informed treatment decisions and improving patient outcomes. The results of the proposed model are quite plausible, and the system can be deployed in a clinical or research setting to assist in the diagnosis of COVID-19.
Keywords: COVID-19, feature engineering, artificial neural networks, radiology images
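A schematic sketch of the hybridization step (our reconstruction using scikit-image and torchvision, not the authors' pipeline; the MRMR selection stage is omitted): GLCM-based texture features are concatenated with features from a truncated CNN backbone.

```python
# Hybridize hand-crafted GLCM texture features with CNN features for one image.
import numpy as np
import torch
from skimage.feature import graycomatrix, graycoprops
from torchvision.models import resnet18

xray = (np.random.rand(224, 224) * 255).astype(np.uint8)   # stand-in chest X-ray

# Haralick-style texture features from the grey-level co-occurrence matrix
glcm = graycomatrix(xray, distances=[1], angles=[0], levels=256, normed=True)
texture = np.array([graycoprops(glcm, p)[0, 0]
                    for p in ("contrast", "homogeneity", "energy", "correlation")])

# Convolutional features from a truncated CNN backbone
cnn = resnet18(weights=None)
cnn.fc = torch.nn.Identity()                                # drop the classifier head
with torch.no_grad():
    t = torch.from_numpy(xray).float().expand(1, 3, -1, -1) # grey -> 3 channels
    conv = cnn(t / 255.0).squeeze().numpy()

hybrid = np.concatenate([texture, conv])                    # feed this to a classifier
print(hybrid.shape)                                         # (4 + 512,)
```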
Procedia PDF Downloads 75
18368 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing
Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah
Abstract:
The aim of this paper is to present a distributed implementation of the Type-2 Fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on an SPMD (Single Program, Multiple Data) architecture based on cooperative mobile agents following the AVPE (Agent Virtual Processing Element) model, in order to provide the processing resources needed for big data image segmentation. In this work, we focus on applying this algorithm to process a big data MRI (Magnetic Resonance Imaging) image of size (n x m). The image is encapsulated on the mobile agent team leader in order to be split into (m x n) pixels, one per AVPE. Each AVPE performs and exchanges the segmentation results and maintains asynchronous communication with the team leader until the convergence of the algorithm. Some interesting experimental results are obtained in terms of the accuracy and efficiency of the proposed implementation, thanks to the several useful capabilities of mobile agents introduced in this distributed computational model.
Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing
Procedia PDF Downloads 429
18367 Integrating HOTS Activities with Geogebra in Pre-Service Teachers' Preparation
Authors: Wajeeh Daher, Nimer Baya'a
Abstract:
High Order Thinking Skills (HOTS) are suggested today as essential for the cognitive development of students and for preparing them with real-life skills. Teachers are encouraged to use HOTS activities in the classroom to help their students develop higher order skills and deep thinking. So it is essential to prepare pre-service teachers to write and use HOTS activities for their students. This paper describes a model for integrating HOTS activities with GeoGebra in pre-service teachers’ preparation. This model describes four aspects of HOTS activities and working with them: activity components, the preparation procedure, the strategies and processes used in writing a HOTS activity, and the types of HOTS activities. In addition, the paper describes the pre-service teachers' difficulties in preparing and working with HOTS activities, as well as their perceptions regarding the use of these activities and GeoGebra in the mathematics classroom. The paper also describes the contribution of a HOTS activity to pupils' learning of mathematics, where this HOTS activity was prepared and taught by one pre-service teacher.
Keywords: high order thinking skills, HOTS activities, pre-service teachers, professional development
Procedia PDF Downloads 347
18366 Risk of Disrupted Eating Attitudes in Disabled Athletes
Authors: Zehra Buyuktuncer, Aylin H. Büyükkaragöz, Tuğçe N. Balcı, Nevin Ergun
Abstract:
Background: Adopting rigid dietary habits to enhance athletic performance could lead to eating disorders. A high prevalence of eating disorders among female athletes has already been reported. However, the risk of disordered eating among disabled athletes is not known. A better knowledge of the different eating behaviors and their prevalence in disabled athletes would be helpful to understand the interactions between eating and health. This study aimed to examine cognitive restraint, uncontrolled eating, and emotional eating behaviors in a disabled athlete population. Method: A total of 70 disabled Turkish national athletes (33 female, 37 male) from 5 sport branches (soccer, weight lifting, shooting, table tennis, and basketball) were involved in the study. Cognitive restraint, uncontrolled eating, and emotional eating behaviors were assessed using the revised version of the Three Factor Eating Questionnaire (TFEQ-R18). The questionnaires were administered by a dietitian during the athletes' preparation camps. Body weight, height, and waist circumference (WC) were measured, and body composition was analyzed by the bioelectrical impedance analysis method. Results: The TFEQ scales showed a cognitive dietary restraint score of 13.9±4.2, an uncontrolled eating score of 17.7±5.8, and an emotional eating score of 4.9±2.5. The mean total TFEQ-R18 score was 36.5±8.62. Neither the total TFEQ-R18 score nor the subscale scores differed significantly by gender or sport branch (p>0.05, for each). The scores were also similar across BMI groups (n=63; p>0.05). The total TFEQ, uncontrolled eating, and emotional eating scores were significantly higher among the athletes with congenital disabilities compared to the scores of the athletes with acquired disabilities (p<0.05, for each). Moreover, the cognitive dietary restraint score was significantly higher in athletes who would like to lose weight (p=0.009). Conclusion: Disabled athletes might be at risk of disordered eating. The different eating behaviors among disabled athletes should be assessed using validated tools to develop personalized nutritional strategies for those athletes.
Keywords: disabled athletes, eating behaviour, three-factor eating questionnaire-r18, body composition
Procedia PDF Downloads 335
18365 Towards a Measurement-Based E-Government Portals Maturity Model
Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri
Abstract:
The emerging concept of e-government transforms the way in which citizens deal with their governments. Thus, citizens can execute the intended services online anytime and anywhere. This results in great benefits for both governments (a reduced number of officers) and citizens (more flexibility and time savings). Therefore, building a maturity model to assess e-government portals becomes desirable to help in the improvement process of such portals. This paper aims at proposing an e-government maturity model based on the measurement of the presence of best practices. The main benefit of such a maturity model is to provide a way to rank an e-government portal based on the best practices it uses, and also to give a set of recommendations for moving to the next stage of the maturity model.
Keywords: best practices, e-government portal, maturity model, quality model
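A toy sketch of the measurement idea (our illustration; the practice checklist and stage cut-offs are invented, not the authors'): score a portal by the share of best practices present, then map the score to a maturity stage.

```python
# Rank a portal by the proportion of implemented best practices, then map
# that score onto maturity stages via cumulative cut-offs.
checked_practices = {
    "online_payment": True,
    "mobile_friendly": True,
    "single_sign_on": False,
    "service_tracking": True,
    "open_data": False,
}

score = sum(checked_practices.values()) / len(checked_practices)
stages = [(0.25, "emerging"), (0.50, "enhanced"),
          (0.75, "transactional"), (1.00, "connected")]
stage = next(name for cutoff, name in stages if score <= cutoff)
print(f"score = {score:.0%}, stage = {stage}")   # score = 60%, stage = transactional
```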
Procedia PDF Downloads 338