Search results for: game outcome prediction

3707 Artificial Neural Network Based Approach in Prediction of Potential Water Pollution Across Different Land-Use Patterns

Authors: M. Rüştü Karaman, İsmail İşeri, Kadir Saltalı, A. Reşit Brohi, Ayhan Horuz, Mümin Dizman

Abstract:

Considerable attention has recently been given to the environmental hazards caused by agricultural chemicals such as excess fertilizers. In this study, a neural network approach was investigated for the prediction of potential nitrate pollution across different land-use patterns, using a properly trained feedforward multilayer artificial neural network (ANN) model. Concentrations of selected anions, especially nitrate (NO3-), and cations were measured periodically in drainage waters collected from drain pipes placed in an irrigated tomato field, an unirrigated wheat field, and fallow and pasture lands. Soil samples were collected from the irrigated tomato field and the unirrigated wheat field on a 20 m x 20 m grid. Site-specific nitrate concentrations in the soil samples were measured for ANN-based simulation of the nitrate leaching potential of the land profiles. In applying the ANN model, a multilayer feedforward architecture was evaluated, and training, validation, and testing data sets containing the measured soil nitrate values were constructed based on spatial variability. Based on the testing results, the optimal network structure was 2-15-1 (R2 = 0.96, P < 0.01) for the unirrigated field and 2-10-1 (R2 = 0.96, P < 0.01) for the irrigated field. The results showed that the ANN model can be successfully used to predict potential nitrate leaching levels under different land-use patterns. However, for the best results, the model should be calibrated by training different network structures according to site-specific soil parameters and agricultural management practices.
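
As an illustration of the kind of feedforward network described above (for example, the reported 2-10-1 structure: two inputs, ten hidden neurons, one output), the sketch below fits a small multilayer perceptron to two inputs. The input features, synthetic data, and training settings are placeholders and not the authors' actual configuration.

```python
# Minimal sketch of a 2-10-1 feedforward ANN, assuming synthetic placeholder
# data in place of the measured grid soil-nitrate values.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(400, 2))                              # two site-specific covariates
y = 30 * X[:, 0] + 10 * X[:, 1] ** 2 + rng.normal(0, 1, 400)      # synthetic nitrate values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                   solver="lbfgs", max_iter=5000, random_state=0)
ann.fit(X_train, y_train)

print("Test R2:", r2_score(y_test, ann.predict(X_test)))
```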

Keywords: artificial intelligence, ANN, drainage water, nitrate pollution

Procedia PDF Downloads 295
3706 The UN Mediation in the Armed Conflict of Nepal and El Salvador: A Cross-Regional Comparative Perspective Study

Authors: Anu S. Krishna

Abstract:

This paper comparatively analyses UN involvement/intervention in the intra-state armed conflicts of El Salvador and Nepal. The peace mission in El Salvador is considered one of the most successful UN missions since the organisation became involved in peace-building activities. In the armed conflict in the South Asian country of Nepal, by contrast, the result was disappointing in comparison with its counterpart. The study takes three variables as indicators of the success or failure of international mediation: a) signing of a peace agreement, b) disarmament/demobilization, and c) constitutional mechanism. A significant amount of scholarship looks at the case of ONUSAL (United Nations Observer Mission in El Salvador); meanwhile, the armed conflict in Nepal and the role of UNMIN (United Nations Mission in Nepal) remain under-researched. The paper thus tries to shed light on these cross-regional contexts, which share certain similarities and dissimilarities in the nature of conflict. In addition, the international third-party involvement and the way the two cases were approached differ, which again affected the mediation outcome. The paper argues that, because the approach of UN-led international mediation in these peace missions was contextual and varied from case to case, it ultimately affected the mediation outcome as well.

Keywords: Nepal, UNMIN, El Salvador, ONUSAL, international mediation, armed conflict

Procedia PDF Downloads 382
3705 The Semiotics of Soft Power: An Examination of the South Korean Entertainment Industry

Authors: Enya Trenholm-Jensen

Abstract:

This paper employs various semiotic methodologies to examine the mechanism of soft power. Soft power refers to a country's global reputation and its ability to leverage that reputation to achieve certain aims. South Korea has invested heavily in its soft power strategy for a multitude of predominantly historical and geopolitical reasons. On account of this investment and the global prominence of its strategy, South Korea was considered the optimal candidate for the aims of this investigation. Having isolated the entertainment industry as one of the most heavily funded segments of the South Korean soft power strategy, the analysis restricted itself to this sector. Within this industry, two entertainment products were selected as case studies. The case studies were chosen based on commercial success according to metrics such as streams, purchases, and subsequent revenue. This criterion was deemed to be the most objective and verifiable indicator of the products' general appeal. The entertainment products which met the chosen criterion were Netflix's "Squid Game" and BTS' hit single "Butter". The methodologies employed were chosen according to the medium of the entertainment products. For "Squid Game," an aesthetic analysis was carried out to investigate how multi-layered meanings were mobilized in a show popularized by its visual grammar. To examine "Butter", both music semiology and linguistic analysis were employed. The music section featured an analysis underpinned by denotative and connotative music semiotic theories borrowing from scholars Theo van Leeuwen and Martin Irvine. The linguistic analysis focused on stance and semantic fields according to scholarship by George Yule and John W. DuBois. The aesthetic analysis of the first case study revealed intertextual references to famous artworks, which served to augment the emotional provocation of the Squid Game narrative. For the second case study, the findings exposed a set of musical meaning units arranged in a patchwork of familiar and futuristic elements to achieve a song that existed on the boundary between old and new. The linguistic analysis of the song's lyrics found a deceptively innocuous surface-level meaning that bore implications for authority, intimacy, and commercial success. Whether through means of visual metaphor, embedded auditory associations, or linguistic subtext, the collective findings of the three analyses exhibited a desire to conjure a form of positive arousal in the spectator. In the synthesis section, this process is likened to branding. Through an exploration of branding, the entertainment products can be understood as cogs in a larger operation aiming to create positive associations to Korea as a country and a concept. Limitations in the form of a timeframe-biased perspective are addressed, and directions for future research are suggested.

Keywords: BTS, cognitive semiotics, entertainment, soft power, south korea, squid game

Procedia PDF Downloads 136
3704 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts to produce an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature. For instance, Model Output Statistics (MOS) and running mean-bias removal are widely used techniques in the storm surge prediction domain. However, these methods have some drawbacks. For instance, MOS is based on multiple linear regression, and it needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting. This application creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast. To do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models; we use correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles, compare them with several existing models in the literature for forecasting storm surge levels, and investigate whether developing a complex ensemble model is indeed needed. To achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; thus, we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its timing. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, we consider them a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally have better performance compared to the simple average ensemble technique.
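
A minimal sketch of the weighting idea described above: individual member forecasts are combined using correlation-based weights and compared against the simple-average benchmark via RMSE. The forecast and observation arrays are synthetic placeholders, not NYHOPS data, and the weighting rule is only one plausible variant of what the abstract describes.

```python
import numpy as np

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

rng = np.random.default_rng(1)
obs = np.sin(np.linspace(0, 6, 200)) * 1.5                 # "observed" surge (synthetic)
members = np.stack([obs + rng.normal(0, s, obs.size)       # three imperfect member forecasts
                    for s in (0.2, 0.4, 0.8)])

# Simple average benchmark.
simple_avg = members.mean(axis=0)

# Correlation-weighted ensemble: weight each member by its correlation with
# observations over a training window (here the first half of the record).
half = obs.size // 2
corr = np.array([np.corrcoef(m[:half], obs[:half])[0, 1] for m in members])
weights = np.clip(corr, 0, None)
weights /= weights.sum()
weighted = np.tensordot(weights, members, axes=1)

print("RMSE simple average :", rmse(simple_avg[half:], obs[half:]))
print("RMSE corr-weighted  :", rmse(weighted[half:], obs[half:]))
```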

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 300
3703 E-Learning Platform for School Kids

Authors: Gihan Thilakarathna, Fernando Ishara, Rathnayake Yasith, Bandara A. M. R. Y.

Abstract:

E-learning is a crucial component of intelligent education, and even in the midst of a pandemic it is becoming increasingly important in the educational system. Several e-learning programs are available to students. Here, we decided to create an e-learning framework for children. We identified a few issues that teachers face with their online classes. When there are numerous students in an online classroom, how does a teacher recognize a student's focus on academics and below-the-surface behaviors? Some kids are not paying attention in class, and others are napping; the teacher is unable to keep track of each and every student. Another key challenge in e-learning is online exams, because students can cheat easily during them; hence, exam proctoring is needed. Here we propose an automated online exam cheating detection method using a web camera. The purpose of this project is also to present an e-learning platform for math education that includes games for kids as an alternative teaching method for math students. The games will be accessible via a web browser, and the imagery is drawn in a cartoonish style to help students learn math through play. Everything in this day and age is moving towards automation; however, automatic answer evaluation is only available for MCQ-based questions. As a result, the checker has a difficult time evaluating theory answers. The current system requires more manpower and takes a long time to evaluate responses, and it is also possible for two identical responses to be marked differently and receive two different grades. This application therefore employs machine learning techniques to provide an automatic evaluation of subjective responses based on the keywords provided to the computer from the student's input, resulting in a fair distribution of marks. In addition, it will save time and manpower. We used deep learning, machine learning, image processing, and natural language processing technologies to develop these research components.
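
A rough sketch of the keyword-based idea for scoring subjective answers: the student response is compared against teacher-supplied keywords and marks are awarded in proportion to the matches. The keyword list, scoring rule, and example answer are illustrative placeholders, not the platform's actual algorithm.

```python
import re

def keyword_score(answer: str, keywords: list[str], max_marks: float = 10.0) -> float:
    """Award marks in proportion to how many expected keywords the answer contains."""
    tokens = set(re.findall(r"[a-z]+", answer.lower()))
    hits = sum(1 for kw in keywords if kw.lower() in tokens)
    return round(max_marks * hits / len(keywords), 2)

# Hypothetical example: grading a short theory answer about photosynthesis.
expected = ["chlorophyll", "sunlight", "carbon", "oxygen", "glucose"]
student_answer = "Plants use sunlight and chlorophyll to make glucose and release oxygen."
print(keyword_score(student_answer, expected))   # 8.0 (4 of 5 keywords found)
```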

Keywords: math, education games, e-learning platform, artificial intelligence

Procedia PDF Downloads 140
3702 Dynamic Control Theory: A Behavioral Modeling Approach to Demand Forecasting amongst Office Workers Engaged in a Competition on Energy Shifting

Authors: Akaash Tawade, Manan Khattar, Lucas Spangher, Costas J. Spanos

Abstract:

Many grids are increasing the share of renewable energy in their generation mix, which makes energy generation less controllable. Buildings, which consume nearly 33% of all energy, are a key target for demand response, i.e., mechanisms for demand to meet supply. Understanding the behavior of office workers is a start towards developing demand response for one sector of building technology. The literature notes that dynamic computational modeling can be predictive of individual action, especially given that occupant behavior is traditionally abstracted away from demand forecasting. Recent work founded on Social Cognitive Theory (SCT) has provided a promising conceptual basis for modeling behavior, personal states, and environment using control-theoretic principles. Here, an adapted linear dynamical system of latent states and exogenous inputs is proposed to simulate energy demand amongst office workers engaged in a social energy-shifting game. The energy-shifting competition is implemented in an office in Singapore that is connected to a minigrid of buildings with a consistent 'price signal.' This signal is translated into a 'points signal' by a reinforcement learning (RL) algorithm to influence participant energy use. The dynamic model functions at the intersection of the points signals, baseline energy consumption trends, and SCT behavioral inputs to simulate future outcomes. This study endeavors to analyze how the dynamic model trains an RL agent and, subsequently, the degree of accuracy with which load deferability can be simulated. The results offer a generalizable behavioral model for energy competitions that provides a framework for further research on transfer learning for RL and, more broadly, transactive control.
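
A minimal sketch of the kind of linear dynamical system described above: latent behavioral states evolve under exogenous inputs (a points signal and a baseline trend), and a readout maps the state to predicted energy demand. The matrices and signals below are illustrative placeholders, not the fitted model from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 96                                   # e.g. 15-minute steps over one day
A = np.array([[0.92, 0.05],              # latent-state transition (placeholder values)
              [0.00, 0.85]])
B = np.array([[-0.30, 0.10],             # response to [points signal, baseline trend]
              [ 0.20, 0.05]])
C = np.array([1.0, 0.5])                 # readout: latent state -> energy demand

u = np.stack([np.sin(np.linspace(0, 2 * np.pi, T)),   # points signal (exogenous input)
              np.ones(T)], axis=1)                    # baseline trend

x = np.zeros(2)
demand = []
for t in range(T):
    x = A @ x + B @ u[t] + rng.normal(0, 0.01, 2)     # x_{t+1} = A x_t + B u_t + noise
    demand.append(float(C @ x))

print("Simulated demand, first 5 steps:", np.round(demand[:5], 3))
```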

Keywords: energy demand forecasting, social cognitive behavioral modeling, social game, transfer learning

Procedia PDF Downloads 98
3701 The Ability of Forecasting the Term Structure of Interest Rates Based on Nelson-Siegel and Svensson Model

Authors: Tea Poklepović, Zdravka Aljinović, Branka Marasović

Abstract:

Due to the importance of the yield curve and its estimation, valid methods for yield curve forecasting are indispensable in cases where security issues are scarce and/or trading on the secondary market is weak. Therefore, in this paper, after estimating weekly yield curves on the Croatian financial market from October 2011 to August 2012 using the Nelson-Siegel and Svensson models, the yield curves are forecast using a vector autoregressive (VAR) model and neural networks. In general, it can be concluded that both forecasting methods have good predictive ability, with forecasts of yield curves based on the Nelson-Siegel estimation model giving better results, in the sense of a lower mean squared error, than forecasts based on the Svensson model. In this case, neural networks also provide slightly better results. Finally, it can be concluded that the most appropriate approach to yield curve prediction is neural networks applied to the Nelson-Siegel estimates of the yield curves.
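
For reference, the Nelson-Siegel yield curve used for the weekly estimates has the standard three-factor form sketched below (one common parameterization); the parameter values in the example are illustrative, not the Croatian-market estimates.

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Standard Nelson-Siegel yield at maturity tau (in years)."""
    tau = np.asarray(tau, dtype=float)
    x = tau / lam
    loading = (1 - np.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - np.exp(-x))

maturities = np.array([0.25, 0.5, 1, 2, 5, 10])
# Illustrative parameters: long-run level 5%, short end 2% below it, a mild hump.
print(np.round(nelson_siegel(maturities, beta0=0.05, beta1=-0.02, beta2=0.01, lam=1.5), 4))
```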

Keywords: Nelson-Siegel Model, neural networks, Svensson Model, vector autoregressive model, yield curve

Procedia PDF Downloads 310
3700 Photo-Fenton Decolorization of Methylene Blue Adsolubilized on Co2+-Embedded Alumina Surface: Comparison of Process Modeling through Response Surface Methodology and Artificial Neural Network

Authors: Prateeksha Mahamallik, Anjali Pal

Abstract:

In the present study, Co(II)-adsolubilized surfactant-modified alumina (SMA) was prepared, and methylene blue (MB) degradation was carried out on the Co-SMA surface by a visible-light photo-Fenton process. The entire reaction proceeded on the solid surface, as MB was embedded on the Co-SMA surface. The reaction followed zero-order kinetics. Response surface methodology (RSM) and an artificial neural network (ANN) were used to model the decolorization of MB by the photo-Fenton process as a function of the dose of Co-SMA (10, 20 and 30 g/L), the initial concentration of MB (10, 20 and 30 mg/L), the concentration of H2O2 (174.4, 348.8 and 523.2 mM), and the reaction time (30, 45 and 60 min). The prediction capabilities of the two methodologies (RSM and ANN) were compared on the basis of the correlation coefficient (R2), root mean square error (RMSE), standard error of prediction (SEP), and relative percent deviation (RPD). Due to its lower RMSE (1.27), SEP (2.06), and RPD (1.17) and higher R2 (0.9966), ANN proved to be more accurate than RSM in predicting decolorization efficiency.
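
A small sketch of the comparison metrics mentioned above (R2, RMSE, SEP, RPD), computed for a set of predicted versus observed decolorization efficiencies. The formulas follow common definitions and the data are placeholders, so they may differ in detail from the authors' implementation.

```python
import numpy as np

def comparison_metrics(observed, predicted):
    """R2, RMSE, SEP and RPD under common definitions (these may differ in
    detail from the exact formulas used in the study)."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    resid = observed - predicted
    r2 = 1 - np.sum(resid ** 2) / np.sum((observed - observed.mean()) ** 2)
    rmse = np.sqrt(np.mean(resid ** 2))
    sep = 100 * rmse / observed.mean()                   # RMSE as % of the mean response
    rpd = 100 * np.mean(np.abs(resid) / observed)        # mean relative percent deviation
    return {"R2": round(r2, 4), "RMSE": round(rmse, 3),
            "SEP": round(sep, 3), "RPD": round(rpd, 3)}

# Placeholder decolorization efficiencies (%) for illustration only.
obs  = [62.1, 70.4, 81.3, 88.0, 93.5]
pred = [60.8, 71.0, 80.1, 89.2, 92.7]
print(comparison_metrics(obs, pred))
```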

Keywords: adsolubilization, artificial neural network, methylene blue, photo-fenton process, response surface methodology

Procedia PDF Downloads 243
3699 Outcome of Bowel Management Program in Patients with Spinal Cord Injury

Authors: Roongtiwa Chobchuen, Angkana Srikhan, Pattra Wattanapan

Abstract:

Background: Neurogenic bowel is a common condition after spinal cord injury. Most spinal cord injured patients have motor weakness and mobility impairment, which lead to constipation; moreover, the neural pathway involved in bowel function is interrupted. Therefore, a bowel management program should be implemented in nursing care as early as possible after the onset of the disease to prevent morbidity and mortality. Objective: To study the outcome of a bowel management program in patients with spinal cord injury admitted for a rehabilitation program. Study design: Descriptive study. Setting: Rehabilitation ward in Srinagarind Hospital. Population: Patients aged over 18 years with subacute to chronic spinal cord injury admitted to the rehabilitation ward of Srinagarind Hospital. Instrument: The neurogenic bowel dysfunction score (NBDS) was used to determine the severity of neurogenic bowel. Procedure and statistical analysis: All participants were asked to provide demographic data: age, gender, duration of disease, and diagnosis. Individual bowel function was assessed using the NBDS at admission. The patients and caregivers were trained by nurses in the bowel management program, which consisted of diet modification, abdominal massage, digital stimulation, and stool evacuation, including medication and physical activity. The outcome of the bowel management program was assessed by NBDS at discharge. The chi-square test was used to detect the difference in severity of neurogenic bowel between admission and discharge. Results: Sixteen spinal cord injured patients were enrolled in the study (age 45 ± 17 years, 69% male). Half of them (50%) had tetraplegia. On admission, 12.5%, 12.5%, 43.75% and 31.25% were categorized as very minor (NBDS 0-6), minor (NBDS 7-9), moderate (NBDS 10-13) and severe (NBDS 14+), respectively. The severity of neurogenic bowel was significantly lower at discharge (56.25%, 18.75%, 18.75% and 6.25% for the very minor, minor, moderate and severe groups, respectively; p < 0.001) compared with the NBDS at admission. Conclusions: Implementation of an effective bowel program decreases the severity of neurogenic bowel in patients with spinal cord injury.

Keywords: neurogenic bowel, NBDS, spinal cord injury, bowel program

Procedia PDF Downloads 230
3698 A Team-Based Learning Game Guided by a Social Robot

Authors: Gila Kurtz, Dan Kohen Vacs

Abstract:

Social robotics (SR) is an emerging field striving to deploy computers capable of resembling human shapes and mimicking human movements, gestures, and behaviors. The evolving capability of social robots to interact with humans offers groundbreaking learning and training opportunities. Studies show that social robots can offer instructional experiences that foster creativity, entertainment, enjoyment, and curiosity. These added values are essential for empowering instructional opportunities as gamified learning experiences. We present our project, which deploys an activity experienced in an escape room and aimed at team-based learning scaffolded by a social robot, NAO. An escape room is a well-known approach for gamified activities focused on a simulated scenario experienced by teams of participants. Usually, the simulation takes place in a physical environment where participants must complete a series of challenges in a limited amount of time; during this experience, players learn something about the assigned topic of the room. In the current learning simulation, students must "save the nation" by locating sensitive information that was stolen and stored in a vault with four locks. Team members have to look for hints and solve riddles mediated by NAO. Each solution provides a unique code for opening one of the four locks. NAO is also used to provide ongoing feedback on the team's performance. We recorded the proceedings of our activity and used the recording to conduct an evaluation study among ten experts in related areas. The experts were interviewed about their overall assessment of the learning activity and their perception of the added value of the robot. The results were very encouraging regarding the feasibility of NAO serving as a motivational tutor in adults' collaborative game-based learning. We believe that this study marks the first step toward a template for developing innovative team-based training using escape rooms supported by a humanoid robot.

Keywords: social robot, NAO, learning, team based activity, escape room

Procedia PDF Downloads 61
3697 Recurrent Neural Networks for Complex Survival Models

Authors: Pius Marthin, Nihal Ata Tutkun

Abstract:

Survival analysis has become one of the paramount procedures in the modeling of time-to-event data. When we encounter complex survival problems, the traditional approach remains limited in accounting for the complex correlational structure between the covariates and the outcome because of strong assumptions that limit the inference and prediction ability of the resulting models. Several studies exist on deep learning approaches to survival modeling; however, their application to complex survival problems still needs improvement. In addition, existing models do not fully address the complexity of the data structure and are subject to noise and redundant information. In this study, we design a deep learning technique (CmpXRnnSurv_AE) that overcomes the limitations imposed by traditional approaches and addresses the above issues to jointly predict the risk-specific probabilities and the survival function for recurrent events with competing risks. We introduce a component termed Risks Information Weights (RIW) as an attention mechanism to compute the weighted cumulative incidence function (WCIF) and an external auto-encoder (ExternalAE) as a feature selector to extract complex characteristics among the set of covariates responsible for the cause-specific events. We train our model using synthetic and real data sets and employ the appropriate metrics for complex survival models for evaluation. As benchmarks, we selected both traditional and machine learning models, and our model demonstrates better performance across all datasets.
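
The abstract does not give implementation details for CmpXRnnSurv_AE, so the following is only a schematic PyTorch sketch of the general model family it belongs to: an LSTM encodes each subject's covariate history and a softmax head outputs cause-specific risk probabilities per time step. The RIW attention and external auto-encoder components are omitted, and all layer sizes are placeholders.

```python
import torch
import torch.nn as nn

class RecurrentCompetingRisks(nn.Module):
    """Schematic recurrent model for recurrent events with competing risks."""
    def __init__(self, n_features: int, n_risks: int, hidden: int = 32):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        # +1 output for "no event" so each time step gets a proper distribution
        self.head = nn.Linear(hidden, n_risks + 1)

    def forward(self, x):                             # x: (batch, time, features)
        h, _ = self.encoder(x)                        # h: (batch, time, hidden)
        return torch.softmax(self.head(h), dim=-1)    # per-step risk probabilities

# Hypothetical usage with random data: 8 subjects, 12 visits, 5 covariates, 2 competing risks.
model = RecurrentCompetingRisks(n_features=5, n_risks=2)
probs = model(torch.randn(8, 12, 5))
print(probs.shape)        # torch.Size([8, 12, 3])
```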

Keywords: cumulative incidence function (CIF), risk information weight (RIW), autoencoders (AE), survival analysis, recurrent events with competing risks, recurrent neural networks (RNN), long short-term memory (LSTM), self-attention, multilayers perceptrons (MLPs)

Procedia PDF Downloads 76
3696 Air Dispersion Modeling for Prediction of Accidental Emission in the Atmosphere along Northern Coast of Egypt

Authors: Moustafa Osman

Abstract:

Modeling of air pollutants from accidental releases is performed to quantify the impact of industrial facilities on ambient air. Mathematical methods are required for predicting accidental scenarios in terms of the probability of failure-safe modes and for analysing consequences to quantify the environmental damage to human health. The initial statement of the mitigation plan supports implementation during production and maintenance periods. In a number of mathematical methods, the rate at which gaseous and liquid pollutants might be accidentally released is determined for various source types: point, line, and area sources. These emissions are integrated with meteorological conditions, using simplified stability parameters, to compare the dispersion coefficients of non-continuous air pollution plumes. The differences are reflected in concentration levels and in the greenhouse effect as the parcel load is transported in both urban and rural areas. This research reveals that the elevation effect near buildings and other structures is about five times higher than over open terrain. These results agree with Sutton's suggested dispersion coefficients for different stability classes.
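
The abstract does not state its equations explicitly; a standard Gaussian plume formulation with stability-dependent dispersion coefficients (of the Sutton/Pasquill-Gifford type) is commonly used for such point-source screening, and a sketch of it is given below with purely illustrative coefficients.

```python
import numpy as np

def gaussian_plume(q, u, x, y, z, h_stack, a=0.08, b=0.90, c=0.06, d=0.85):
    """Ground-reflected Gaussian plume concentration (g/m^3) for a point source.
    q: emission rate (g/s), u: wind speed (m/s), h_stack: effective stack height (m).
    sigma_y = a*x**b and sigma_z = c*x**d are illustrative power-law fits standing in
    for stability-class-dependent dispersion coefficients."""
    sigma_y = a * x ** b
    sigma_z = c * x ** d
    lateral = np.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (np.exp(-(z - h_stack) ** 2 / (2 * sigma_z ** 2)) +
                np.exp(-(z + h_stack) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration 500 m downwind of a 20 m stack.
print(gaussian_plume(q=50.0, u=4.0, x=500.0, y=0.0, z=0.0, h_stack=20.0))
```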

Keywords: air pollutants, dispersion modeling, GIS, health effect, urban planning

Procedia PDF Downloads 356
3695 Multi-Faceted Growth in Creative Industries

Authors: Sanja Pfeifer, Nataša Šarlija, Marina Jeger, Ana Bilandžić

Abstract:

The purpose of this study is to explore the different facets of growth among micro, small and medium-sized firms in Croatia and to analyze the differences between models designed for all micro, small and medium-sized firms and those in creative industries. Three growth prediction models were designed and tested using the growth of sales, employment and assets of the company as dependent variables. The key drivers of sales growth are prudent use of cash, industry affiliation and a higher share of intangible assets. Growth of assets depends on retained profits, internal and external sources of financing, as well as industry affiliation. Growth in employment is closely related to sources of financing, in particular debt, and it occurs less frequently than growth in sales and assets. The findings confirm the assumption that growth strategies of small and medium-sized enterprises (SMEs) in creative industries differ in specific ways from those of SMEs in general. Interestingly, only 2.2% of growing enterprises achieve growth in employment, assets and sales simultaneously.

Keywords: creative industries, growth prediction model, growth determinants, growth measures

Procedia PDF Downloads 317
3694 Exploring the Dualistic Nature of Design: Integrative Perspectives and Methodological Approaches in Design Research

Authors: Joni Agung Sudarmanto

Abstract:

The concept of design has historically been elusive and characterized by its fluidity, leading to divergent viewpoints on its fundamental nature. Guy Julier views design as inherent in material culture, while Sanders sees it as a collective endeavor focusing on the outcome. Design's dualistic nature, procedural and outcome-oriented, spans various domains, including objects, individuals, and the environment. This comprehensive view of design challenges the notion that design practice is distinct from research, highlighting their shared exploratory nature. The article explores methodological techniques in design research and the three prevalent approaches: "into design," "through design," and "for design." The contradictory meanings of design arise from its etymology and its duality as both process and result, leading to its integrative nature across objects, humans, and the environment. The parallels between design and research activities, underscoring their exploratory and knowledge-generating nature, are situated within creative research, challenging the perception of design practice as separate from research endeavors. The "into design" approach encourages interdisciplinary collaboration, enriching design research with diverse perspectives. The "through design" approach bridges theory and practice, producing more practical outcomes. The "for design" approach supports specific design solutions, providing designers with valuable guidance.

Keywords: dualistic nature of design, integrative perspectives, methodological approaches, design research

Procedia PDF Downloads 57
3693 Graph Clustering Unveiled: ClusterSyn - A Machine Learning Framework for Predicting Anti-Cancer Drug Synergy Scores

Authors: Babak Bahri, Fatemeh Yassaee Meybodi, Changiz Eslahchi

Abstract:

In the pursuit of effective cancer therapies, the exploration of combinatorial drug regimens is crucial to leverage synergistic interactions between drugs, thereby improving treatment efficacy and overcoming drug resistance. However, identifying synergistic drug pairs poses challenges due to the vast combinatorial space and the limitations of experimental approaches. This study introduces ClusterSyn, a machine learning (ML)-powered framework for classifying anti-cancer drug synergy scores. ClusterSyn employs a two-step approach involving drug clustering and synergy score prediction using a fully connected deep neural network. For each cell line in the training dataset, a drug graph is constructed, with nodes representing drugs and edge weights denoting synergy scores between drug pairs. Drugs are clustered using the Markov clustering (MCL) algorithm, and vectors representing the similarity of drug pairs to each cluster are input into the deep neural network for synergy score prediction (synergy or antagonism). Clustering results demonstrate effective grouping of drugs based on synergy scores, aligning similar synergy profiles. Subsequently, the neural network predictions and the synergy scores of the two drugs with the other drugs in their clusters are used to predict the synergy score of the considered drug pair. This approach facilitates comparative analysis with clustering and regression-based methods, revealing the superior performance of ClusterSyn over state-of-the-art methods such as DeepSynergy and DeepDDS on diverse datasets such as O'Neil and ALMANAC. The results highlight the remarkable potential of ClusterSyn as a versatile tool for predicting anti-cancer drug synergy scores.
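
A compact sketch of the Markov clustering step on a drug synergy graph: the (non-negative) synergy adjacency matrix is made column-stochastic and alternately expanded and inflated until clusters emerge. The inflation parameter and the toy matrix are placeholders; ClusterSyn's full pipeline (per-cell-line graphs, similarity vectors, the neural network) is not reproduced here.

```python
import numpy as np

def mcl(adjacency, inflation=2.0, iterations=30):
    """Core Markov clustering (MCL) loop; expects non-negative edge weights
    (e.g. synergy scores clipped at zero)."""
    n = adjacency.shape[0]
    M = adjacency + np.eye(n)                      # add self-loops
    M = M / M.sum(axis=0, keepdims=True)           # make columns stochastic
    for _ in range(iterations):
        M = M @ M                                  # expansion
        M = M ** inflation                         # inflation
        M = M / M.sum(axis=0, keepdims=True)
    clusters = {frozenset(np.flatnonzero(row > 1e-6).tolist()) for row in M if row.any()}
    return [sorted(c) for c in clusters]

# Toy graph of six drugs forming two synergistic triads (edge weight = synergy score).
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2)]:
    A[i, j] = A[j, i] = 5.0
for i, j in [(3, 4), (3, 5), (4, 5)]:
    A[i, j] = A[j, i] = 4.0

print(mcl(A))     # expected: the two triads come out as separate clusters
```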

Keywords: drug synergy, clustering, prediction, machine learning, deep learning

Procedia PDF Downloads 62
3692 Search for EEG Correlates of Mental States Using EEG Neurofeedback Paradigm

Authors: Cyril Kaplan

Abstract:

Twenty-six participants played four EEG neurofeedback (NF) games and were encouraged to find their own strategies to control the specific NF parameter. A mixed-methods analysis of performance in the games and of post-session interviews led to the identification of states of consciousness that correlated with success in the game. We found that an increase in left frontal beta activity was facilitated by evoking interest in the observed surroundings, for example by wondering what is happening behind the window or what lies in a drawer in front.

Keywords: EEG neurofeedback, states of consciousness, frontal beta activity, mixed methods

Procedia PDF Downloads 128
3691 Developing Medical Leaders: A Realistic Evaluation Study for Improving Patient Safety and Maximising Medical Engagement

Authors: Lisa Fox, Jill Aylott

Abstract:

There is a global need to identify ways to engage doctors in non-clinical matters such as medical leadership, service improvement and health system transformation. Using the core principles of Realistic Evaluation (RE), this study examined what works for doctors of different grades, specialities and experience in an acute NHS Hospital Trust in the UK. Realistic Evaluation is an alternative to more traditional cause-and-effect evaluation models and seeks to understand the interdependencies of Context, Mechanism and Outcome, proposing that Context (C) + Mechanism (M) = Outcome (O). In this study, context, mechanism and outcome were examined from within individual medical leaders to determine what enables medical engagement in a specific improvement project to reduce hospital inpatient mortality. Five qualitative case studies were undertaken with consultants who had regularly completed mortality reviews over a six-month period. The case studies involved semi-structured interviews to test the theory behind the drivers for medical engagement. The interviews were analysed using a theory-driven thematic analysis to identify CMO configurations that explain what works, for whom and in what circumstances. The findings showed that consultants with a longer length of service became more engaged if there were opportunities to be involved at the beginning of an improvement project, with more opportunities to affect its design. Those who were new to the consultant role were more engaged if they felt able to apply any learning directly in their own settings or if they could use it as an opportunity to understand more about the organisation they were working in. This study concludes that RE is a useful methodology for better understanding the complexities of motivation and consultant engagement in a trust-wide service improvement project. The study showed that there should be differentiated and bespoke training programmes to maximise each individual doctor's propensity for medical engagement. The RE identified that there are different ways to ensure that doctors have the right skills to feel confident in service improvement projects.

Keywords: realistic evaluation, medical leadership, medical engagement, patient safety, service improvement

Procedia PDF Downloads 200
3690 Evaluating the Effect of Spatial Qualities, Openness and Complexity, on Human Cognitive Performance within Virtual Reality

Authors: Pierre F. Gerard, Frederic F. Leymarie, William Latham

Abstract:

Architects have developed a series of objective evaluations, using spatial analysis tools such as Isovist, that show how certain spatial qualities are beneficial to specific human activities hosted in built environments. In return, they can build better-adapted environments by tuning those spatial qualities in their designs. In parallel, virtual reality technologies have been developed by engineers with the dream of creating a system that immerses users in a new form of spatial experience. These technologies have already demonstrated a useful range of benefits, not only in simulating critical events to help people acquire new skills but also in enhancing memory retention, to name just a few. This paper investigates the effects of two spatial qualities, openness and complexity, on cognitive performance within immersive virtual environments. The Isovist measure is used to design a series of room settings with different levels of each spatial quality. In an empirical study, each room was then used by every participant to solve a navigational puzzle game and rate their spatial experience. They were then asked to fill in a questionnaire before solving a visual-spatial memory quiz, which addressed how well they remembered the different rooms. Findings suggest that these spatial qualities have an effect on some of the measures, including navigation performance and memory retention. In particular, there is an order effect for the navigation puzzle game: participants tended to spend a longer time in the complex room settings. Moreover, there is an interaction effect: with more open settings, participants tended to perform better in a simple setting, whereas with more closed settings, they tended to perform better in a more complex setting. For the visual-spatial memory quiz, participants performed significantly better within the more open rooms. We believe this is a first step towards using virtual environments to enhance participants' cognitive performance through better use of specific spatial qualities.

Keywords: architecture, navigation, spatial cognition, virtual reality

Procedia PDF Downloads 121
3689 A Case Series on Isolated Lead aVR ST-Segment Elevation Clinical Significance and Outcome

Authors: Fae Princess Bermudez

Abstract:

Background: One of the least significant leads on a 12-lead electrocardiogram is the augmented right lead (aVR), as it is not as specific as the other leads. In this case series, the value of lead aVR, which is more often than not ignored, is highlighted. Three cases of aVR ST-segment elevation on 12-lead electrocardiogram are described, all of which ended in the demise of the patient. The importance of immediate revascularization in improving prognosis in this group of patients is described. Objectives: This case series aims primarily to present under-reported cases of isolated aVR ST-segment elevation myocardial infarction (STEMI), their course, and their outcome. More specific aims are to identify the criteria for determining isolated aVR STEMI, establish its clinical significance, and determine the appropriate management for patients with this ECG finding. Method: A short review of previous studies, case reports, articles and guidelines from 2011-2016 was conducted. The author reviewed the available literature, selected the works that proved significant for the presented cases, and described them in conjunction with the aforementioned cases. Findings: Based on the limited information on these rare or under-reported cases, it was found that isolated aVR STEMI had a poorer prognosis, leading to significant mortality and morbidity. aVR ST-elevation signified occlusion of the left coronary artery or severe three-vessel disease in the presence of an acute coronary syndrome. The 2013 guidelines from the American Heart Association/American College of Cardiology Foundation already recognized ST-elevation of lead aVR in isolation as a STEMI and hence recommended that patients with this particular ECG finding undergo reperfusion strategies to improve prognosis. Conclusion: Isolated aVR ST-segment elevation on ECG should alert physicians, especially emergency physicians, to the high probability of an acute coronary syndrome with a very poor prognosis. If this group of patients is not promptly managed, demise may ensue, with cardiogenic shock as the most probable cause. With this electrocardiogram finding, physicians must be quick to make clinical decisions to increase the chances of survival of this group of patients.

Keywords: AVR ST-elevation, diffuse ST-segment depression, left coronary artery infarction, myocardial infarction

Procedia PDF Downloads 195
3688 Comparative Analysis of Predictive Models for Customer Churn Prediction in the Telecommunication Industry

Authors: Deepika Christopher, Garima Anand

Abstract:

To determine the best model for churn prediction in the telecom industry, this paper compares 11 machine learning algorithms, namely Logistic Regression, Support Vector Machine, Random Forest, Decision Tree, XGBoost, LightGBM, CatBoost, AdaBoost, Extra Trees, Deep Neural Network, and a Hybrid Model (MLPClassifier). It also aims to pinpoint the top three factors that lead to customer churn and conducts customer segmentation to identify vulnerable groups. According to the data, the Logistic Regression model performs best, with an F1 score of 0.6215, 81.76% accuracy, 68.95% precision, and 56.57% recall. The top three attributes that cause churn are found to be tenure, Internet Service Fiber optic, and Internet Service DSL; the top three best-performing models in this article are Logistic Regression, Deep Neural Network, and AdaBoost. The K-means algorithm is applied to establish and analyze four different customer clusters. This study effectively identifies customers that are at risk of churn and may be used to develop and execute strategies that lower customer attrition.
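
A brief sketch of the evaluation reported above for the best-performing model, using scikit-learn's logistic regression with accuracy, precision, recall, and F1. The synthetic feature matrix stands in for the telecom dataset (tenure, internet service type, etc.), whose actual columns are not given here.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Synthetic stand-in for the churn data (imbalanced, as churn usually is).
X, y = make_classification(n_samples=4000, n_features=12, n_informative=6,
                           weights=[0.73, 0.27], random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=7)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = clf.predict(X_test)

print(f"accuracy  {accuracy_score(y_test, pred):.4f}")
print(f"precision {precision_score(y_test, pred):.4f}")
print(f"recall    {recall_score(y_test, pred):.4f}")
print(f"F1        {f1_score(y_test, pred):.4f}")
```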

Keywords: attrition, retention, predictive modeling, customer segmentation, telecommunications

Procedia PDF Downloads 43
3687 Implementation of Correlation-Based Data Analysis as a Preliminary Stage for the Prediction of Geometric Dimensions Using Machine Learning in the Forming of Car Seat Rails

Authors: Housein Deli, Loui Al-Shrouf, Hammoud Al Joumaa, Mohieddine Jelali

Abstract:

When forming metallic materials, fluctuations in material properties, process conditions, and wear lead to deviations in the component geometry. Several hundred features sometimes need to be measured, especially in the case of functional and safety-relevant components. These can only be measured offline due to the large number of features and the accuracy requirements. The risk of producing components outside the tolerances is minimized but not eliminated by the statistical evaluation of process capability and control measurements. The inspection intervals are based on the acceptable risk and come at the expense of productivity, but they remain reactive and, in some cases, considerably delayed. Due to the considerable progress made in the field of condition monitoring and measurement technology, permanently installed sensor systems, in combination with machine learning and artificial intelligence in particular, offer the potential to independently derive forecasts for component geometry and thus eliminate the risk of defective products, actively and preventively. The reliability of forecasts depends on the quality, completeness, and timeliness of the data. Measuring all geometric characteristics is neither sensible nor technically possible. This paper therefore uses the example of car seat rail production to discuss the necessary first step of feature selection and reduction by correlation analysis, as otherwise it would not be possible to forecast components in real time and inline. Four different car seat rails with an average of 130 features were selected and measured using a coordinate measuring machine (CMM). Running such a measuring program alone takes up to 20 minutes. In practice, this results in the risk of faulty production of at least 2000 components that have to be sorted or scrapped if the measurement results are negative. Over a period of 2 months, all measurement data (> 200 measurements per variant) were collected and evaluated using correlation analysis. As part of this study, the number of characteristics to be measured for all 6 car seat rail variants was reduced by over 80%. Specifically, direct correlations were proven for almost 100 of an average of 125 characteristics for 4 different products. A further 10 features correlate via indirect relationships, so that the number of features required for a prediction could be reduced to less than 20. A correlation coefficient > 0.8 was required for all correlations.
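
A minimal sketch of the correlation-based reduction step: features whose absolute pairwise correlation exceeds the 0.8 threshold are grouped, and only one representative per group is kept for measurement. The column names and data are placeholders for the CMM feature measurements.

```python
import numpy as np
import pandas as pd

def reduce_features(df: pd.DataFrame, threshold: float = 0.8) -> list[str]:
    """Keep one representative per group of strongly correlated features."""
    corr = df.corr().abs()
    keep = []
    for col in corr.columns:
        # drop col if it correlates strongly with a feature we already keep
        if not any(corr.loc[col, k] > threshold for k in keep):
            keep.append(col)
    return keep

# Placeholder data: 200 parts, a few geometric features, two of them nearly redundant.
rng = np.random.default_rng(3)
width = rng.normal(30.0, 0.05, 200)
df = pd.DataFrame({
    "rail_width": width,
    "flange_width": width + rng.normal(0, 0.005, 200),   # almost a copy of rail_width
    "hole_diameter": rng.normal(8.0, 0.02, 200),
    "profile_height": rng.normal(12.0, 0.03, 200),
})
print(reduce_features(df))   # e.g. ['rail_width', 'hole_diameter', 'profile_height']
```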

Keywords: long-term SHM, condition monitoring, machine learning, correlation analysis, component prediction, wear prediction, regression analysis

Procedia PDF Downloads 22
3686 Comparison of Different Intraocular Lens Power Calculation Formulas in People With Very High Myopia

Authors: Xia Chen, Yulan Wang

Abstract:

Purpose: To compare the accuracy of the Haigis, SRK/T, T2, Holladay 1, Hoffer Q, Barrett Universal II, Emmetropia Verifying Optical (EVO) and Kane formulas for intraocular lens power calculation in patients with an axial length (AL) ≥ 28 mm. Methods: In this retrospective single-center study, 50 eyes of 41 patients with AL ≥ 28 mm that underwent uneventful cataract surgery were enrolled. The actual postoperative refractive results were compared to the predicted refraction calculated with the different formulas (Haigis, SRK/T, T2, Holladay 1, Hoffer Q, Barrett Universal II, EVO and Kane), and the mean absolute prediction errors (MAE) at 1 month postoperatively were compared. Results: The MAEs of the different formulas were as follows: Haigis (0.509), SRK/T (0.705), T2 (0.999), Holladay 1 (0.714), Hoffer Q (0.583), Barrett Universal II (0.552), EVO (0.463) and Kane (0.441). No significant difference was found among the different formulas (P = .122). The Kane and EVO formulas achieved the lowest mean prediction error (PE) and median absolute error (MedAE) (p < 0.05). Conclusion: In this study, the Kane and EVO formulas predicted IOL power more successfully than the others in highly myopic eyes with an AL longer than 28 mm.

Keywords: cataract, power calculation formulas, intraocular lens, long axial length

Procedia PDF Downloads 55
3685 Lies and Pretended Fairness of Police Officers in Sharing

Authors: Eitan Elaad

Abstract:

The current study aimed to examine lying and pretended fairness by police personnel in sharing situations. Forty Israeli police officers and 40 laypeople from the community, all males, self-assessed their lie-telling ability, rated the frequency of their lies, evaluated the acceptability of lying, and indicated their use of rational and intuitive thinking while lying. Next, following the ultimatum game procedure, participants were asked to share 100 points with an imagined target, either a male policeman or a male non-policeman. Participants allocated points to the target person bearing in mind that the other person could accept or reject their offer. Participants' goal was to retain as many points as possible, and to this end, they could tell the target person that fewer than 100 points were available for distribution. We defined concealment, or lying, as the difference between the 100 points actually available and the number of points declared as available for sharing. Results indicated that police officers lied less to fellow police targets than to non-police targets, whereas laypeople lied less to non-police targets than to imagined police targets. The ratio between the points offered to the imagined target person and the points declared by the participant as available for sharing defined pretended fairness. Higher pretended fairness indicates a stronger motivation to display fair sharing even if the fair sharing is fictitious. Police officers presented higher pretended fairness to police targets than laypeople did, whereas laypeople displayed more pretended fairness to non-police targets than police officers did. We discuss the results in relation to occupational solidarity and loyalty among police personnel. Specifically, police work involves uncertainty, danger and risk, coercive authority, and the use of force, which isolates the police from the community and dictates strong bonds of solidarity between police personnel. It is therefore not surprising that police officers shared more points with (lied less to) fellow police targets than with non-police targets. On the other hand, police legitimacy, the belief that the police act honestly in the best interest of the citizens, shapes citizens' attitudes toward the police. The relatively low number of points laypeople declared as available for distribution to police targets indicates difficulties with the legitimacy of the Israeli police.

Keywords: lying, fairness, police solidarity, police legitimacy, sharing, ultimatum game

Procedia PDF Downloads 106
3684 Prediction of Critical Flow Rate in Tubular Heat Exchangers for the Onset of Damaging Flow-Induced Vibrations

Authors: Y. Khulief, S. Bashmal, S. Said, D. Al-Otaibi, K. Mansour

Abstract:

The prediction of the flow rates at which vibration-induced instability takes place in tubular heat exchangers due to cross-flow is of major importance to the performance and service life of such equipment. In this paper, a semi-analytical model for square tube arrays was extended and utilized to study triangular tube patterns. A laboratory test rig with an instrumented test section is used to measure the fluidelastic coefficients used for tuning the mathematical model. The test section can be made of any bundle pattern. In this study, two test sections were constructed for the normal triangular and the rotated triangular tube arrays. The developed scheme is utilized to predict the onset of flow-induced instability in the two triangular tube arrays, and the results are compared to those obtained for two other bundle configurations. The results of the four different tube patterns are viewed in the light of TEMA predictions. The comparison demonstrated that the TEMA guidelines are more conservative in all configurations considered.

Keywords: fluid-structure interaction, cross-flow, heat exchangers

Procedia PDF Downloads 265
3683 Dispersion Rate of Spilled Oil in Water Column under Non-Breaking Water Waves

Authors: Hanifeh Imanian, Morteza Kolahdoozan

Abstract:

The purpose of this study is to present a mathematical expression for calculating the dispersion rate of spilled oil in the water column under non-breaking water waves. To this end, a multiphase numerical model, in which the waves and the oil phase are computed concurrently and whose hydraulic calculations have been validated, is applied. More than 200 scenarios of oil spills in wavy waters were simulated using the multiphase numerical model, and the outcomes were collected in a database. The recorded results were investigated to identify the major parameters affecting vertical oil dispersion, and six parameters were finally identified as the main independent factors. Furthermore, statistical tests were conducted to identify any relationship between the dependent variable (dispersed oil mass in the water column) and the independent variables (water wave specifications, comprising height, length and wave period, and spilled oil characteristics, including density, viscosity and spilled oil mass). Finally, a mathematical-statistical relationship is proposed to predict the dispersed oil mass in marine waters. To verify the proposed relationship, a laboratory case available in the literature was selected; the rate of oil mass penetrating the water body computed by the suggested regression showed good agreement with the experimental data. The validated mathematical-statistical relationship is a useful tool for predicting oil dispersion in oil spill events in marine areas.

Keywords: dispersion, marine environment, mathematical-statistical relationship, oil spill

Procedia PDF Downloads 224
3682 Rainfall–Runoff Simulation Using WetSpa Model in Golestan Dam Basin, Iran

Authors: M. R. Dahmardeh Ghaleno, M. Nohtani, S. Khaledi

Abstract:

Flood simulation and prediction is one of the most active research areas in surface water management. WetSpa is a distributed, continuous, physically based model with a daily or hourly time step that describes precipitation, runoff, and evapotranspiration processes for both simple and complex contexts. This model uses a modified rational method for runoff calculation; runoff is routed along the flow path using the diffusion-wave equation, which depends on slope, velocity, and flow-route characteristics. The Golestan Dam Basin is located in Golestan Province, Iran, between 55° 16´ 50" and 56° 4´ 25" E and 37° 19´ 39" and 37° 49´ 28" N. The area of the catchment is about 224 km2, and elevations range from 414 m at the outlet to 2856 m, with an average slope of 29.78%. Results of the simulations show good agreement between calculated and measured hydrographs at the outlet of the basin. Based on the Nash-Sutcliffe model efficiency coefficient for the calibration period, the model estimated daily hydrographs and maximum flow rates with accuracies of up to 59% and 80.18%, respectively.
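
For reference, the Nash-Sutcliffe efficiency used to judge the calibration can be computed as below; the two short series are placeholders, not the simulated and observed hydrographs at the basin outlet.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe model efficiency: 1 is a perfect fit, 0 means no better
    than predicting the observed mean."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Placeholder daily discharge values (m^3/s), not the Golestan Dam Basin record.
obs = [2.1, 3.4, 8.9, 15.2, 9.7, 5.3, 3.0]
sim = [2.4, 3.1, 7.8, 13.9, 10.5, 5.9, 3.2]
print(round(nash_sutcliffe(obs, sim), 3))
```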

Keywords: watershed simulation, WetSpa, stream flow, flood prediction

Procedia PDF Downloads 235
3681 Outcomes of the Gastrocnemius Flap Performed by Orthopaedic Surgeons in Salvage Revision Knee Arthroplasty: A Retrospective Study at a Tertiary Orthopaedic Centre

Authors: Amirul Adlan, Robert McCulloch, Scott Evans, Michael Parry, Jonathan Stevenson, Lee Jeys

Abstract:

Background and Objectives: The gastrocnemius myofascial flap is used to manage soft-tissue defects over the anterior aspect of the knee in the context of a patient presenting with a sinus and periprosthetic joint infection (PJI) or extensor mechanism failure. The aim of this study was twofold: firstly, to evaluate the outcomes of gastrocnemius flaps performed by appropriately trained orthopaedic surgeons in the context of PJI and, secondly, to evaluate the infection-free survival of this patient group. Methods: We retrospectively reviewed 30 patients who underwent gastrocnemius flap reconstruction during staged revision total knee arthroplasty for prosthetic joint infection (PJI). All flaps were performed by an orthopaedic surgeon with orthoplastics training. Patients had a mean age of 68.9 years (range 50–84) and were followed up for a mean of 50.4 months (range 2–128 months). A total of 29 patients (97%) were categorized into Musculoskeletal Infection Society (MSIS) local extremity grade 3 (greater than two compromising factors), and 52% of PJIs were polymicrobial. The primary outcome measure was flap failure, and the secondary outcome measure was a recurrent infection. Results: Flap survival was 100% with no failures or early returns to theatre for flap problems such as necrosis or haematoma. Overall infection-free survival during the study period was 48% (13 of 27 infected cases). Using limb salvage as the outcome, 77% (23 of 30 patients) retained the limb. Infection recurrence occurred in 48% (10 patients) in the type B3 cohort and 67% (4 patients) in the type C3 cohort (p = 0.65). Conclusion: The surgical technique for a gastrocnemius myofascial flap is reliable and reproducible when performed by appropriately trained orthopaedic surgeons, even in high-risk groups. However, the risks of recurrent infection and amputation remain high within our series due to poor host and extremity factors.

Keywords: gastrocnemius flap, limb salvage, revision arthroplasty, outcomes

Procedia PDF Downloads 100
3680 Iraqi Short Term Electrical Load Forecasting Based on Interval Type-2 Fuzzy Logic

Authors: Firas M. Tuaimah, Huda M. Abdul Abbas

Abstract:

Accurate short-term load forecasting (STLF) is essential for a variety of decision-making processes. However, forecasting accuracy can drop due to the presence of uncertainty in the operation of energy systems or unexpected behavior of exogenous variables. The interval type-2 fuzzy logic system (IT2 FLS), with its additional degrees of freedom, provides an excellent tool for handling uncertainties and improves prediction accuracy. The training data used in this study cover the period from January 1, 2012 to February 1, 2012 for the winter season and the period from July 1, 2012 to August 1, 2012 for the summer season. The actual load forecasting period runs from January 22 to 28, 2012 for the winter model and from July 22 to 28, 2012 for the summer model. The real data are for the Iraqi power system and belong to the Ministry of Electricity.

Keywords: short term load forecasting, prediction interval, type 2 fuzzy logic systems, electric, computer systems engineering

Procedia PDF Downloads 385
3679 Prediction of Gully Erosion with Stochastic Modeling by Using Geographic Information System and Remote Sensing Data in the North of Iran

Authors: Reza Zakerinejad

Abstract:

Gully erosion is a serious problem threatening the sustainability of agricultural areas, rangeland, and water resources in a large part of Iran. This type of water erosion is the main source of sedimentation in many catchment areas in the north of Iran. Since many national assessment approaches have applied only qualitative models, the aim of this study is to predict the spatial distribution of gully erosion processes by means of detailed terrain analysis and GIS-based logistic regression in the loess deposits of a case study area in Golestan Province. In this study, a DEM with 25 m resolution derived from ASTER data was used, and Landsat ETM data were used for land-use mapping. The TreeNet model, as a stochastic modeling approach, was applied to predict the areas susceptible to gully erosion. For this model, 20% of the data were set aside as a learning set and a further 20% were used for evaluation based on ROC analysis. GIS and satellite image analysis techniques were applied to derive the input information for the stochastic model. The result of this study is a highly accurate map of gully erosion potential.

Keywords: TreeNet model, terrain analysis, Golestan Province, Iran

Procedia PDF Downloads 520
3678 Comparison of Cervical Length Using Transvaginal Ultrasonography and Bishop Score to Predict Successful Induction

Authors: Lubena Achmad, Herman Kristanto, Julian Dewantiningrum

Abstract:

Background: The Bishop score is a standard method used to predict the success of labor induction. This examination tends to be subjective, with high inter- and intraobserver variability, so it is presumed to have low predictive value for the outcome of labor induction. Cervical length measurement using transvaginal ultrasound is considered a more objective way to assess the cervix; moreover, it is not a complicated procedure and is less invasive than digital vaginal examination. Objective: To compare transvaginal ultrasound and the Bishop score in predicting successful induction. Methods: This was a prospective cohort study. One hundred and twenty women with singleton pregnancies undergoing induction of labor at 37-42 weeks who met the inclusion and exclusion criteria were enrolled. Cervical assessment by both transvaginal ultrasound and Bishop score was conducted prior to induction. Successful labor induction was defined as reaching the active phase ≤ 12 hours after induction. To determine the best cut-off points for cervical length and Bishop score, receiver operating characteristic (ROC) curves were plotted. Logistic regression analysis was used to determine which factors best predicted induction success. Results: Age, premature rupture of membranes, Bishop score, cervical length, and funneling were significant predictors of successful induction. ROC curve analysis showed that the best cut-off point for predicting successful induction was 25.45 mm for cervical length and 3 for the Bishop score. In logistic regression, only premature rupture of membranes and cervical length ≤ 25.45 mm significantly predicted successful labor induction. After excluding premature rupture of membranes as the indication for induction, a cervical length of less than 25.3 mm was a better predictor of successful induction. Conclusion: Compared to the Bishop score, cervical length measured by transvaginal ultrasound was a better predictor of successful induction.
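
A short sketch of how an optimal cut-off such as the 25.45 mm cervical length can be read off a ROC curve using the Youden index (sensitivity + specificity - 1); the data below are synthetic and purely illustrative.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(5)
# Synthetic cervical lengths (mm): successful inductions tend to have shorter cervices.
success = rng.normal(22, 4, 80)       # induction reached the active phase within 12 h
failure = rng.normal(30, 4, 40)
length = np.concatenate([success, failure])
outcome = np.concatenate([np.ones(80), np.zeros(40)])

# A shorter cervix predicts success, so score = -length makes higher scores "positive".
fpr, tpr, thresholds = roc_curve(outcome, -length)
youden = tpr - fpr
best = np.argmax(youden)
print("AUC:", round(roc_auc_score(outcome, -length), 3))
print("Best cut-off: cervical length <=", round(-thresholds[best], 1), "mm")
```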

Keywords: Bishop Score, cervical length, induction, successful induction, transvaginal sonography

Procedia PDF Downloads 314