Search results for: real time pest tracking
18032 Comics as an Intermediary for Media Literacy Education
Authors: Ryan C. Zlomek
Abstract:
The value of using comics in the literacy classroom has been explored since the 1930s. At that point in time, researchers had begun to implement comics into daily lesson plans and, in some instances, had started the development process for comics-supported curricula. In the mid-1950s, this type of research was cut short by the work of psychiatrist Fredric Wertham, whose research seemingly discovered a correlation between comic readership and juvenile delinquency. Since Wertham’s allegations, the comics medium has had a hard time finding its way back into education. Now, over fifty years later, the definition of literacy is in mid-transition as the world has become more visually oriented and students require the ability to interpret images as often as words. Through this transition, comics have found a place in the field of literacy education research as the focus shifts from traditional print to multimodal and media literacies. Comics are now believed to be an effective resource in bridging the gap between these different types of literacies. This paper seeks to better understand what students learn from the process of reading comics and how those skills line up with the core principles of media literacy education in the United States. In the first section, comics are defined to determine the exact medium that is being examined. The different conventions that the medium utilizes are also discussed. In the second section, the comics reading process is explored through a dissection of the ways a reader interacts with the page, panel, gutter, and different comic conventions found within a traditional graphic narrative. The concepts of intersubjective acts and visualization are attributed to the comics reading process as readers draw on real-world knowledge to decode meaning. In the next section, the learning processes that comics encourage are explored parallel to the core principles of media literacy education.
Each principle is explained, and the extent to which comics can act as an intermediary for this type of education is theorized. In the final section, the author examines comics use in his computer science and technology classroom. He lays out different theories he utilizes from Scott McCloud’s text Understanding Comics and how he uses them to break down media literacy strategies with his students. The article concludes with examples of how comics have positively impacted classrooms around the United States. It is stated that integrating comics into the classroom will not solve all issues related to literacy education but, rather, that comics can be a powerful multimodal resource for educators looking for new mediums to explore with their students.
Keywords: comics, graphic novels, mass communication, media literacy, metacognition
Procedia PDF Downloads 299
18031 Effects of Body Positioning on Videofluoroscopic Barium Esophagram in Healthy Cats
Authors: Hyeona Kim, Kichang Lee, Seunghee Lee, Jeongsu An, Kyungjun Min
Abstract:
Contrast videofluoroscopy is the diagnostic imaging technique for evaluating cats with dysphagia. Generally, videofluoroscopic studies have been done with the cat restrained in lateral recumbency. This is different from a neutral position, such as standing or sternal recumbency, which is the actual swallowing posture. We hypothesized that measurements of esophageal transit and peristalsis would be affected by body position. This experimental study analyzed the imaging findings of barium esophagrams in 5 cats. Each cat underwent videofluoroscopy during swallowing of liquid barium and barium-soaked kibble in the standing position and in lateral recumbency. Esophageal transit time and the number of esophageal peristaltic waves were compared between body positions. For liquid barium, transit time in the cervical esophagus (0.57 s), cranial thoracic esophagus (2.5 s), and caudal thoracic esophagus (1.10 s) was delayed when cats were in lateral recumbency. For kibble, transit time was more delayed than that of liquid through the entire esophagus in lateral recumbency. Liquid and kibble transit frequently began to slow at the thoracic inlet region, and transit time in the thoracic esophagus was significantly more delayed than in the cervical esophagus. In the standing position, 60.2% of liquid swallows stimulated primary esophageal peristalsis. In lateral recumbency, 50.5% of liquid swallows stimulated primary esophageal peristalsis. Other variables were not significantly different. Lateral body positioning increases entire esophageal transit time, with thoracic esophageal transit time the most significantly delayed. Lateral recumbency also decreases the number of primary esophageal peristaltic waves.
Keywords: barium esophagram, body positioning, cat, videofluoroscopy
Procedia PDF Downloads 201
18030 Correlation of P53 Gene Expression With Serum Alanine Transaminase Levels and Hepatitis B Viral Load in Cirrhosis and Hepatocellular Carcinoma Patients
Authors: Umme Shahera, Saifullah Munshi, Munira Jahan, Afzalun Nessa, Shahinul Alam, Shahina Tabassum
Abstract:
The development of HCC is a multi-stage process. Several extrinsic factors, such as aflatoxin, HBV, nutrition, alcohol, and trace elements, are thought to initiate and/or promote hepatocarcinogenesis. Alteration of p53 status is an important intrinsic factor in this process, as p53 is essential for preventing inappropriate cell proliferation and maintaining genome integrity following genotoxic stress. This study was designed to assess the correlation of p53 gene expression with HBV-DNA and serum alanine transaminase (ALT) in patients with cirrhosis and HCC. The study was conducted among 60 patients. The study population was divided into four groups (15 in each group): HBV-positive cirrhosis, HBV-negative cirrhosis, HBV-positive HCC, and HBV-negative HCC. Expression of the p53 gene was observed using real-time PCR. P53 gene expression in the above-mentioned groups was correlated with serum ALT level and HBV viral load. Expression of the p53 gene was significantly higher in HBV-positive patients with HCC than in HBV-positive cirrhosis. Similarly, the expression of p53 was significantly higher in HBV-positive HCC than in HBV-negative HCC patients. However, the expression of p53 was reduced in HBV-positive cirrhosis in comparison with HBV-negative cirrhosis. P53 gene expression in the liver was not correlated with serum ALT levels in any of the study groups. HBV-DNA load also did not correlate with p53 gene expression in HBV-positive HCC and HBV-positive cirrhosis patients. This study shows that p53 gene expression did not change significantly with ALT level or viral load in any of the study groups, though differential expression of the p53 gene was observed between cirrhosis and HCC patients.
Keywords: p53, ALT, HBV-DNA, liver cirrhosis, hepatocellular carcinoma
Procedia PDF Downloads 95
18029 An Evaluation of the Use of Telematics for Improving the Driving Behaviours of Young People
Authors: James Boylan, Denny Meyer, Won Sun Chen
Abstract:
Background: Globally, there is an increasing trend in road traffic deaths, reaching 1.35 million in 2016 in comparison to 1.3 million a decade ago, and overall, road traffic injuries are ranked as the eighth leading cause of death across all age groups. The reported death rate for younger drivers aged 16-19 years is almost twice the rate reported for older drivers aged 25 and above, with a rate of 3.5 road traffic fatalities per annum for every 10,000 licenses held. Telematics refers to a system with the ability to capture real-time data about vehicle usage. The data collected from telematics can be used to better assess a driver's risk. It is typically used to measure acceleration, turning, braking, and speed, as well as to provide locational information. With the Australian government creating the National Telematics Framework, there has been an increase in the government's focus on using telematics data to improve road safety outcomes. The purpose of this study is to test the hypothesis that improvements in telematics-measured driving behaviour relate to improvements in road safety attitudes measured by the Driving Behaviour Questionnaire (DBQ). Methodology: 28 participants were recruited and given a telematics device to insert into their vehicles for the duration of the study. The participants' driving behaviour over the course of the first month will be compared to their driving behaviour in the second month to determine whether feedback from telematics devices improves driving behaviour. Participants completed the DBQ, evaluated using a 6-point Likert scale (0 = never, 5 = nearly all the time), at the beginning, after the first month, and after the second month of the study. This is a well-established instrument used worldwide. Trends in the telematics data will be captured and correlated with the changes in the DBQ using regression models in SAS.
Results: The DBQ has provided a reliable measure (alpha = .823) of driving behaviour based on a sample of 23 participants, with an average of 50.5, a standard deviation of 11.36, and a range of 29 to 76, with higher scores indicating worse driving behaviours. This initial sample is well stratified in terms of gender and age (range 19-27). It is expected that in the next six weeks, a larger sample of around 40 will have completed the DBQ after experiencing in-vehicle telematics for 30 days, allowing a comparison with baseline levels. The trends in the telematics data over the first 30 days will be compared with the changes observed in the DBQ. Conclusions: It is expected that there will be a significant relationship between the improvements in the DBQ and the trends in reduced telematics-measured aggressive driving behaviours, supporting the hypothesis.
Keywords: telematics, driving behaviour, young drivers, driving behaviour questionnaire
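The reliability figure reported above (alpha = .823) is Cronbach's alpha. As a hedged aside, a minimal sketch of how that statistic is computed from item-level Likert scores; the scores below are invented illustration data, not the study's.

```python
# Cronbach's alpha from item-level questionnaire scores.
# Illustration only: the function and the example data are not from the study.

def cronbach_alpha(items):
    """items: list of item-score lists, one inner list per questionnaire item,
    all covering the same respondents in the same order."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):              # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(it) for it in items)
    totals = [sum(it[i] for it in items) for i in range(n)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))
```

Items that rise and fall together across respondents push alpha toward 1; unrelated items pull it toward (or below) zero.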
Procedia PDF Downloads 106
18028 Thinking Lean in ICU: A Time Motion Study Quantifying ICU Nurses’ Multitasking Time Allocation
Authors: Fatma Refaat Ahmed, Sally Mohamed Farghaly
Abstract:
Context: Intensive care unit (ICU) nurses often face pressure and constraints in their work, leading to the rationing of care when demands exceed available time and resources. Observations suggest that ICU nurses are frequently distracted from their core nursing roles by non-core tasks. This study aims to provide evidence on ICU nurses' multitasking activities and explore the association between nurses' personal and clinical characteristics and their time allocation. Research Aim: The aim of this study is to quantify the time spent by ICU nurses on multitasking activities and investigate the relationship between their personal and clinical characteristics and time allocation. Methodology: A self-observation form utilizing the "Diary" recording method was used to record the number of tasks performed by ICU nurses and the time allocated to each task category. Nurses also reported on the distractions encountered during their nursing activities. A convenience sample of 60 ICU nurses participated in the study, with each nurse observed for one nursing shift (6 hours), amounting to a total of 360 hours. The study was conducted in two ICUs within a university teaching hospital in Alexandria, Egypt. Findings: The results showed that ICU nurses completed 2,730 direct patient-related tasks and 1,037 indirect tasks during the 360-hour observation period. Nurses spent an average of 33.65 minutes on ventilator care-related tasks, 14.88 minutes on tube care-related tasks, and 10.77 minutes on inpatient care-related tasks. Additionally, nurses spent an average of 17.70 minutes on indirect care tasks per hour. The study identified correlations between nursing time and nurses' personal and clinical characteristics. Theoretical Importance: This study contributes to the existing research on ICU nurses' multitasking activities and their relationship with personal and clinical characteristics. 
The findings shed light on the significant time spent by ICU nurses on direct care for mechanically ventilated patients and the distractions that require attention from ICU managers. Data Collection: Data were collected using self-observation forms completed by participating ICU nurses. The forms recorded the number of tasks performed, the time allocated to each task category, and any distractions encountered during nursing activities. Analysis Procedures: The collected data were analyzed to quantify the time spent on different tasks by ICU nurses. Correlations were also examined between nursing time and nurses' personal and clinical characteristics. Question Addressed: This study addressed the question of how ICU nurses allocate their time across multitasking activities and whether there is an association between nurses' personal and clinical characteristics and time allocation. Conclusion: The findings of this study emphasize the need for a lean evaluation of ICU nurses' activities to identify and address potential gaps in patient care and distractions. Implementing lean techniques can improve efficiency, safety, clinical outcomes, and satisfaction for both patients and nurses, ultimately enhancing the quality of care and organizational performance in the ICU setting.
Keywords: motion study, ICU nurse, lean, nursing time, multitasking activities
Procedia PDF Downloads 68
18027 Local Binary Patterns-Based Statistical Data Analysis for Accurate Soccer Match Prediction
Authors: Mohammad Ghahramani, Fahimeh Saei Manesh
Abstract:
Winning a soccer game is based on thorough and deep analysis of the ongoing match. On the other hand, giant gambling companies are in vital need of such analysis to reduce their losses against their customers. In this research work, we perform deep, real-time analysis on every soccer match around the world, distinguishing our work from others by focusing on particular seasons, teams, and partial analytics. Our contributions are presented in the platform called “Analyst Masters.” First, we introduce various sources of information available for soccer analysis for teams around the world that helped us record live statistical data and information from more than 50,000 soccer matches a year. Our second and main contribution is to introduce our proposed in-play performance evaluation. The third contribution is developing new features from stable soccer matches. The statistics of soccer matches and their odds, before and in-play, are considered in image format versus time, including the halftime. Local Binary Patterns (LBP) are then employed to extract features from the image. Our analyses reveal incredibly interesting features and rules if a soccer match has reached enough stability. For example, our “8-minute rule” implies that if 'Team A' scores a goal and can maintain the result for at least 8 minutes, then the match would end in their favor in a stable match. We could also make accurate pre-match predictions of whether fewer or more than 2.5 goals would be scored. We benefit from Gradient Boosting Trees (GBT) to extract highly related features. Once the features are selected from this pool of data, decision trees decide if the match is stable. A stable match is then passed to a post-processing stage to check its properties, such as bettors’ and punters’ behavior and its statistical data, to issue the prediction. The proposed method was trained using 140,000 soccer matches and tested on more than 100,000 samples, achieving 98% accuracy in selecting stable matches.
Our database from 240,000 matches shows that one can get over 20% betting profit per month using Analyst Masters. Such consistent profit outperforms human experts and shows the inefficiency of the betting market. Top soccer tipsters achieve 50% accuracy and 8% monthly profit on average, and only on regional matches. Both our collected database of more than 240,000 soccer matches since 2012 and our algorithm would greatly benefit coaches and punters seeking accurate analysis.
Keywords: soccer, analytics, machine learning, database
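The LBP step named above can be illustrated with a minimal 8-neighbour implementation. This is a generic textbook sketch, not the authors' code, and the "image" here would be their statistics-versus-time encoding.

```python
# Minimal Local Binary Patterns: threshold each interior pixel's 8 neighbours
# against the centre and pack the results into one byte per pixel.
# Illustration only; real pipelines typically use a library implementation.

def lbp_image(img):
    """img: 2D list of intensities. Returns LBP codes for interior pixels,
    with bits ordered clockwise from the top-left neighbour."""
    h, w = len(img), len(img[0])
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]   # clockwise ring
    out = [[0] * (w - 2) for _ in range(h - 2)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= c:       # neighbour at/above centre
                    code |= 1 << bit
            out[y - 1][x - 1] = code
    return out
```

Histograms of these codes over image regions then serve as the texture features fed to a classifier such as GBT.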
Procedia PDF Downloads 238
18026 Noticing Nature: Benefits for Connectedness to Nature and Wellbeing
Authors: Dawn Watling, Lorraine Lecourtois, Adnan Levent, Ryan Jeffries, Aysha Bellamy
Abstract:
Mental health diagnoses are on the rise for adolescents worldwide, with many being unable to access support, and there is increasing use of social prescribing of time in nature. There is an increasing need to better understand the preventive benefits of spending time in nature. In this paper, research findings are presented from 599 seven- to 12-year-olds who completed two sets of questionnaires (before the visit and after a walk in nature). Participants spent time in one of three different biodiverse habitats. The findings explore predictors (including age, sex, and mental health) of increases in connection to nature and well-being. Secondly, research findings will be discussed from 313 eighteen- to 87-year-olds who completed questionnaires and had their heart rate monitored, followed by a self-guided walk. These findings explore predictors (including age, sex, connectedness to nature, well-being, and heart rate as a proxy measure of stress) of increases in mood and feelings of restoration. The discussion will focus on the converging evidence for taking time to notice nature and the role of different environments in enhancing connection to nature, well-being, and positive mental health.
Keywords: nature, connectedness to nature, social prescribing, wellbeing
Procedia PDF Downloads 32
18025 Optimal Continuous Scheduled Time for a Cumulative Damage System with Age-Dependent Imperfect Maintenance
Authors: Chin-Chih Chang
Abstract:
Many manufacturing systems suffer failures due to complex degradation processes and various environmental conditions, such as random shocks. Consider an operating system that is subject to random shocks and works at random times for successive jobs. When successive jobs often result in production losses and performance deterioration, it is better to do maintenance or replacement at a planned time. A preventive replacement (PR) policy is presented to replace the system before a failure occurs, at a continuous time T. In such a policy, the failure characteristics of the system are modeled as follows. Each job causes a random amount of additive damage to the system, and the system fails when the cumulative damage exceeds a failure threshold. Suppose that the deteriorating system suffers one of two types of shocks with age-dependent probabilities: a type-I (minor) shock is rectified by a minimal repair, while a type-II (catastrophic) shock causes the system to fail. A corrective replacement (CR) is performed immediately when the system fails. In summary, a generalized maintenance model for scheduling the replacement plan of an operating system is presented below. PR is carried out at time T, whereas CR is carried out when any type-II shock occurs or the total damage exceeds the failure level. The main objective is to determine the optimal continuous scheduled time of preventive replacement by minimizing the mean cost rate function. The existence and uniqueness of the optimal replacement policy are derived analytically. It can be seen that the present model is a generalization of previous models, and the policy with preventive replacement outperforms the one without preventive replacement.
Keywords: preventive replacement, working time, cumulative damage model, minimal repair, imperfect maintenance, optimization
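The trade-off the abstract optimizes can be made concrete with a Monte-Carlo sketch of the mean cost rate C(T) for a candidate preventive-replacement time T. All parameters (damage distribution, threshold, costs, shock rate) are invented for illustration; the paper derives the optimum analytically rather than by simulation.

```python
# Hedged sketch: simulate a cumulative-damage system with shocks arriving as a
# Poisson process; replace preventively at T (cost c_pr) or correctively on
# failure when damage passes the threshold (cost c_cr). Parameters are made up.
import random

def mean_cost_rate(T, threshold=10.0, c_pr=1.0, c_cr=5.0,
                   shock_rate=1.0, runs=2000, seed=0):
    rng = random.Random(seed)
    total_cost = total_time = 0.0
    for _ in range(runs):
        t = damage = 0.0
        while True:
            t += rng.expovariate(shock_rate)    # time to next job/shock
            if t >= T:                          # preventive replacement at T
                total_cost += c_pr
                total_time += T
                break
            damage += rng.expovariate(1.0)      # random additive damage
            if damage > threshold:              # failure -> corrective replacement
                total_cost += c_cr
                total_time += t
                break
    return total_cost / total_time
```

Scanning T over a grid shows the characteristic U-shape: replacing too early wastes preventive-replacement cost, replacing too late pays the larger corrective cost, and the minimizer approximates the optimal scheduled time.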
Procedia PDF Downloads 363
18024 Predicting Survival in Cancer: How Cox Regression Model Compares to Artificial Neural Networks?
Authors: Dalia Rimawi, Walid Salameh, Amal Al-Omari, Hadeel AbdelKhaleq
Abstract:
Prediction of the survival time of patients with cancer is a core factor that influences oncologists' decisions in different aspects, such as offered treatment plans, patients’ quality of life, and medication development. For a long time, proportional hazards Cox regression (PH Cox) was, and still is, the most well-known statistical method to predict survival outcomes. But due to the revolution in data science, new prediction models have been employed and have proved to be more flexible and to provide higher accuracy in this type of study. The artificial neural network is one of those models that is suitable for handling time-to-event prediction. In this study, we aim to compare PH Cox regression with the artificial neural network method with respect to data handling and the accuracy of each model.
Keywords: Cox regression, neural networks, survival, cancer
Procedia PDF Downloads 201
18023 On Transferring of Transient Signals along Hollow Waveguide
Authors: E. Eroglu, S. Semsit, E. Sener, U.S. Sener
Abstract:
In electromagnetics, there are three canonical boundary value problems with given initial conditions for the electromagnetic field sought, namely: the Cavity Problem, the Waveguide Problem, and the External Problem. The Cavity Problem and the Waveguide Problem have been rigorously studied, and new results arose in original works over the past decades. Based on studies of an analytical time-domain method, the Evolutionary Approach to Electromagnetics (EAE), electromagnetic field strength vectors produced by a time-dependent source function are sought. The fields are sought in the L2 Hilbert space. The source function that performs the signal transfer, its energy, and its surplus of energy are demonstrated with full clarity. The depth of the method and the ease of its application emerge from gathering the obtained results. The main discussion concerns a perfect electric conductor and a hollow waveguide. Although well-studied time-domain mode problems are mentioned, specifically the modes which have a hollow (i.e., medium-free) cross-section domain are considered.
Keywords: evolutionary approach to electromagnetics, time-domain waveguide mode, Neumann problem, Dirichlet boundary value problem, Klein-Gordon
Procedia PDF Downloads 329
18022 Insecticidal and Repellent Efficacy of Clove and Lemongrass Oils Against Museum Pest, Lepisma Saccharina (Zygentoma: Lepismatidae)
Authors: Suboohi Nasrin, MHD. Shahid, Abduraheem K.
Abstract:
India is a tropical country, and it is estimated that biotic and abiotic agents are the major factors in the destruction and deterioration of archival materials like herbaria, paper, cellulose, bookbindings, etc. Silverfish, German cockroaches, termites, booklice, tobacco beetles, and carpet beetles are the common insect pests in museums, and they cause deterioration to collections of museum specimens. Among them, the silverfish is one of the most notorious pests and is primarily responsible for the deterioration of archival materials. So far, investigations to overcome this existing problem have applied different management strategies such as chemical insecticides, fungicides, herbicides, nematicides, etc. However, synthetic molecules disturb the ecological balance, have detrimental effects on human health, reduce the beneficial microbial flora and fauna, etc. In view of this, a number of chemicals have been banned or advised against due to their long-lasting persistence in soil and water ecosystems and their carcinogenicity. That is why the authors used natural products with biocidal activity as cost-effective and eco-friendly approaches. In this study, various concentrations (30, 60, and 90 ml/L) of clove and lemongrass essential oils at different treatment durations (30, 60, 90, and 120 minutes) were investigated to test their repellent and insecticidal effects against silverfish. The results of the two-way ANOVA revealed that mortality was significantly influenced by oil concentration and treatment duration, and the interaction between the two independent factors was also significant. The mortality rate increased with increasing oil concentration for clove oil, and 100% mortality was recorded at 0.9 ml at 120 minutes. It was also observed that treatment duration had the strongest effect on the mortality rate of silverfish. Clove oil had a greater effect on silverfish than lemongrass oil.
The percentage repellency of adult silverfish was likewise oil-concentration and treatment-duration dependent, i.e., increases in concentration and treatment duration resulted in a higher repellency percentage. Clove oil was found to be more effective, showing a maximum repellency of 80.00% at the highest concentration (0.9 ml/cm2), while for lemongrass the highest repellency observed was 33.4% at 0.9 ml/cm2 in the treated area.
Keywords: adult silverfish, oils, oil concentration, treatment duration, mortality (%) and repellency
Procedia PDF Downloads 165
18021 'Systems' and Its Impact on Virtual Teams and Electronic Learning
Authors: Shavindrie Cooray
Abstract:
It is vital that students are supported in having balanced conversations about topics that might be controversial. This process is crucial to the development of critical thinking skills. It can be difficult to attain in e-learning environments, with some research finding that students report a perceived loss in the quality of knowledge exchange and performance. This research investigated whether Systems Theory could be applied to structure the discussion, improve information sharing, and reduce conflicts when students are working in online environments. The research involved 160 participants across four categories of student groups at a college in the Northeastern US. Each group was given a shared problem, and each group was expected to propose a solution. Two groups worked face-to-face: the first engaged with the problem and each other with no intervention from a facilitator, while the second worked on the problem using Systems tools to facilitate problem structuring, group discussion, and decision-making. There were also two types of virtual teams. The first virtual group likewise used Systems tools to facilitate problem structuring and group discussion; however, all interactions were conducted in a synchronous virtual environment. The second virtual team also met in real time but worked with no intervention. Findings from the study demonstrated that the teams (both virtual and face-to-face) using Systems tools shared more information with each other than the other teams; additionally, these teams reported an increased level of disagreement amongst their members but also expressed more confidence and satisfaction with the experience and resulting decision compared to the other groups.
Keywords: e-learning, virtual teams, systems approach, conflicts
Procedia PDF Downloads 137
18020 Predictive Modelling Approach to Identify Spare Parts Inventory Obsolescence
Authors: Madhu Babu Cherukuri, Tamoghna Ghosh
Abstract:
Factory supply chain management spends billions of dollars every year to procure and manage equipment spare parts. Due to technology and process changes, some of these spares become obsolete/dead inventory. Factories have huge dead inventories worth millions of dollars accumulating over time. This is due to the lack of a scientific methodology to identify them and send the inventory back to the suppliers on a timely basis. The standard approach followed across industries to deal with this is: if a part is not used for a set, pre-defined period of time, it is declared dead. This leads to an accumulation of dead parts over time, and these parts cannot be sold back to the suppliers as it is then too late under the contract agreement. Our main idea is that the time period for identifying a part as dead cannot be a fixed, pre-defined duration across all parts. Rather, it should depend on various properties of the part, like its historical consumption pattern, the type of part, how many machines it is used in, whether it is a preventive maintenance part, etc. We have designed a predictive algorithm which predicts part obsolescence well in advance with reasonable accuracy and which can help save millions.
Keywords: obsolete inventory, machine learning, big data, supply chain analytics, dead inventory
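The core idea, a part-specific obsolescence signal instead of one fixed cutoff, can be sketched as a simple feature-based risk score. The feature names, weights, and logistic form below are invented for illustration; the abstract does not disclose the actual algorithm.

```python
# Hypothetical sketch: score a spare part's dead-stock risk from the kinds of
# properties the abstract lists. Weights are illustrative assumptions only.
import math

def dead_stock_risk(months_since_use, machines_using_part,
                    is_preventive_maintenance, avg_monthly_demand):
    """Return a 0-1 risk that the part is effectively dead inventory."""
    # Long idle time raises risk; broad machine usage, a preventive-maintenance
    # role, and steady demand lower it.
    score = (0.15 * months_since_use
             - 0.30 * machines_using_part
             - 1.00 * (1 if is_preventive_maintenance else 0)
             - 2.00 * avg_monthly_demand)
    return 1.0 / (1.0 + math.exp(-score))   # squash to a probability-like risk
```

In practice such weights would be learned from historical consumption data, and parts whose risk crosses a threshold would be flagged for timely return to the supplier.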
Procedia PDF Downloads 319
18019 Detection of Atrial Fibrillation Using Wearables via Attentional Two-Stream Heterogeneous Networks
Authors: Huawei Bai, Jianguo Yao, Fellow, IEEE
Abstract:
Atrial fibrillation (AF) is the most common form of heart arrhythmia and is closely associated with mortality and morbidity in heart failure, stroke, and coronary artery disease. The development of single-spot optical sensors enables widespread photoplethysmography (PPG) screening, especially for AF, since it represents a more convenient and noninvasive approach. To our knowledge, most existing studies, based on public and unbalanced datasets, can barely handle the multiple noise sources in the real world and also lack interpretability. In this paper, we construct a large-scale PPG dataset using measurements collected from PPG wristwatch devices worn by volunteers and propose an attention-based two-stream heterogeneous neural network (TSHNN). The first stream is a hybrid neural network consisting of a three-layer one-dimensional convolutional neural network (1D-CNN) and a two-layer attention-based bidirectional long short-term memory (Bi-LSTM) network to learn representations from temporally sampled signals. The second stream extracts latent representations from the PPG time-frequency spectrogram using a five-layer CNN. The outputs from both streams are fed into a fusion layer for the outcome. Visualization of the learned attention weights demonstrates the effectiveness of the attention mechanism against noise. The experimental results show that the TSHNN outperforms all the competitive baseline approaches and, with 98.09% accuracy, achieves state-of-the-art performance.
Keywords: PPG wearables, atrial fibrillation, feature fusion, attention mechanism, hybrid network
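The fusion step, combining the temporal-stream and spectrogram-stream embeddings through attention, can be sketched generically. This shows only the softmax-weighted fusion idea; the dimensions and scoring vector are invented, and the actual TSHNN layers (1D-CNN, Bi-LSTM, 2D-CNN) are not reproduced here.

```python
# Hedged sketch of attention fusion: score each stream's embedding, softmax the
# scores into weights, and return the weighted sum as the fused representation.
import math

def attention_fuse(streams, score_w):
    """streams: list of equal-length embedding vectors, one per stream.
    score_w: scoring vector of the same length.
    Returns (attention weights, fused embedding)."""
    scores = [sum(w * x for w, x in zip(score_w, s)) for s in streams]
    m = max(scores)                                # numerically stable softmax
    exps = [math.exp(sc - m) for sc in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    dim = len(streams[0])
    fused = [sum(w * s[i] for w, s in zip(weights, streams))
             for i in range(dim)]
    return weights, fused
```

Inspecting the learned weights is what gives this style of model its interpretability: a noisy stream receives a low weight, which is the behaviour the abstract's attention-weight visualization illustrates.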
Procedia PDF Downloads 121
18018 A Study of Effect of Yoga on Choice Visual Reaction Time of Soccer Players
Authors: Vikram Singh, Parmod Kumar Sethi
Abstract:
The objective of the study was to examine the effectiveness of a common yoga protocol on the reaction time (choice visual reaction time, CVRT, measured in milliseconds/seconds) of male football players in the age group of 16 to 21 years. The 40 boys were initially measured on the parameters of years of experience and level of participation. They were randomly assigned into two groups, i.e., control and experimental. CVRT for both groups was measured on day 1, and post-intervention (the common yoga protocol here) it was measured after 45 days of training given to the experimental group after they had finished their regular fitness and soccer skill training. One-way ANOVA (univariate analysis) and an independent t-test using the SPSS 23 statistical package were applied to obtain and analyze the results. The experimental yoga protocol group showed a significant reduction in CVRT, whereas an insignificant difference in reaction times was observed for the control group after 45 days. The effect size was more than 52% for CVRT, indicating that the effect of treatment was large. The power of the study was also found to be high (> .80). There was a significant difference after 45 days of the yoga protocol in the choice visual reaction time of the experimental group, t(21.93) = 6.410, p = .000 (two-tailed). The null hypothesis (that there would be no difference in the reaction times of the control and experimental groups) was rejected at p < .05, and the alternative hypothesis was therefore accepted.
Keywords: reaction time, yoga protocol, t-test, soccer players
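The independent t-test reported above (run in SPSS) can be sketched in plain Python. The fractional degrees of freedom reported, t(21.93), indicate the unequal-variance (Welch) form, so that is the statistic computed here; the reaction-time values below are invented illustration data, not the study's.

```python
# Hedged sketch: Welch's t-statistic for two independent samples with
# unequal variances. Compare |t| against a t-distribution critical value
# (with Welch-Satterthwaite degrees of freedom) to get the p-value.
import math

def welch_t(a, b):
    """Welch's t-statistic for samples a and b."""
    def mean(xs):
        return sum(xs) / len(xs)

    def svar(xs):  # sample variance (n - 1 denominator)
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    se2 = svar(a) / len(a) + svar(b) / len(b)   # squared standard error
    return (mean(a) - mean(b)) / math.sqrt(se2)
```

A large |t| relative to the critical value, as with the study's 6.410, leads to rejecting the null hypothesis of equal group means.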
Procedia PDF Downloads 236
18017 Using Business Interactive Games to Improve Management Skills
Authors: Nuno Biga
Abstract:
Continuous process improvement is a permanent challenge for the managers of any organization. Lean management means that efficiency gains can be obtained through a systematic framework able to explore synergies between processes, eliminate waste of time, and save other resources. Leadership in organizations determines the efficiency of teams through its influence on collaborators, their motivation, and the consolidation of a feeling of (group) ownership. The “organization health” depends on the leadership style, which is directly influenced by the intrinsic characteristics of each personality and by leadership ability (leadership competencies). Therefore, it is important that managers can correct in advance any deviation from expected leadership exercises. Top management teams must assume the role of regulatory agents of leadership within the organization, ensuring the monitoring of actions and the alignment of managers in accordance with the humanist standards anchored in a visible Code of Ethics and Conduct. This article is built around an innovative model of “Business Interactive Games” (BI GAMES) that simulates a real-life management environment. It shows that the strategic management of operations depends on a complex set of variables, endogenous and exogenous to the intervening agents, that require specific skills and a set of critical processes to monitor. BI GAMES are designed for each management reality and have already been applied successfully in several contexts over the last five years, comprising educational and enterprise settings. Results from these experiences are used to demonstrate how serious games in working living labs contributed to improving the organizational environment by focusing on the evaluation of players’ (agents’) skills, empowering their capabilities, and the critical factors that create value in each context.
The implementation of the BI GAMES simulator highlights that leadership skills are decisive for the performance of teams, regardless of the sector of activity and the specificities of the organization whose operation is being simulated. The players in BI GAMES can be managers or employees in different roles in the organization, or students in a learning context. They interact with each other and are asked to decide among several options for the follow-up operation, for example, when costs and benefits are not fully known but depend on the actions of external parties (e.g., subcontracted enterprises and the actions of regulatory bodies). Each team must evaluate the resources used and needed in each operation, identify bottlenecks in the system of operations, assess the performance of the system through a set of key performance indicators, and set a coherent strategy to improve efficiency. Through gamification and the serious-games approach, organizational managers are able to confront the scientific approach to strategic decision-making with their real-life approach based on past experience. Considering that each BI GAMES team has a leader (chosen by draw), the performance of this player has a direct impact on the results obtained. Leadership skills are thus put to the test during the simulation of the functioning of each organization, allowing conclusions to be drawn at the end of the simulation, including their discussion amongst the participants.
Keywords: business interactive games, gamification, management empowerment skills, simulation living labs
Procedia PDF Downloads 112
18016 A Study on the Effect of Different Climate Conditions on Time of Balance of Bleeding and Evaporation in Plastic Shrinkage Cracking of Concrete Pavements
Authors: Hasan Ziari, Hassan Fazaeli, Seyed Javad Vaziri Kang Olyaei, Asma Sadat Dabiri
Abstract:
Cracks in concrete pavements provide a path for the ingress of corrosive substances, acids, oils, and water into the pavement and reduce its long-term durability and level of service. One of the causes of early cracking in concrete pavements is plastic shrinkage. This shrinkage occurs due to the formation of negative capillary pressures after the bleeding and evaporation rates at the pavement surface reach equilibrium. Cracks form if the tensile stresses caused by the restrained shrinkage exceed the tensile strength of the concrete. Different climate conditions change the rate of evaporation and thus the balance time of bleeding and evaporation, which in turn changes the severity of cracking. The present study examined the relationship between the balance time of bleeding and evaporation and the cracking area in concrete slabs using the standard method ASTM C1579 under 27 different environmental conditions, with continuous video recording and digital image analysis. The results showed that as the evaporation rate increased and the balance time decreased, crack severity increased significantly: reducing the balance time from its maximum to its minimum value increased the cracking area more than fourfold. It was also observed that the cracking area versus balance time curve can be interpreted in three sections. An examination of these three parts showed that the combination of climate conditions has a significant effect on increasing or decreasing these two variables. A single severe factor alone cannot create the critical conditions for plastic cracking, and by combining two mild environmental factors with one severe climate factor (in terms of surface evaporation rate), a considerable reduction in balance time and a sharp increase in cracking severity can be prevented.
The results of this study showed that balance time can be an essential factor in controlling and predicting plastic shrinkage cracking in concrete pavements, and it is necessary to control this factor when constructing concrete pavements in different climate conditions.
Keywords: bleeding and cracking severity, concrete pavements, climate conditions, plastic shrinkage
Procedia PDF Downloads 146
18015 Degradation of the Mechanical Properties of the Polypropylene Talc Nanocomposite in Chemical Environment
Authors: Ahmed Ouadah Bouakkaz, Mohamed Elmeguenni, Bel Abbes Bachir Bouiadjra, Mohamed Belhouari, Abdulmohsen Albedah
Abstract:
In this study, the effect of a chemical environment on the mechanical properties of a polypropylene-talc composite was analyzed. The talc proportion was varied in order to highlight the combined effects of the talc concentration and the time of immersion in the chemical environment (benzene) on the mechanical properties of the composite. Tensile tests were carried out to evaluate the mechanical properties of the PP-talc composite and to analyze the effect of immersion time on their variation. The results obtained show that increasing the immersion time has a very negative effect on the mechanical strength of the PP-talc composite, but this effect can be significantly reduced by increasing the talc proportion.
Keywords: polypropylene (PP), talc, nanocomposite, degradation
Procedia PDF Downloads 385
18014 SISSLE in Consensus-Based Ripple: Some Improvements in Speed, Security, Last Mile Connectivity and Ease of Use
Authors: Mayank Mundhra, Chester Rebeiro
Abstract:
Cryptocurrencies are rapidly finding wide application in areas such as Real Time Gross Settlement and payment systems. Ripple is a cryptocurrency that has gained prominence with banks and payment providers. It solves the Byzantine Generals' Problem with its Ripple Protocol Consensus Algorithm (RPCA), in which each server maintains a list of servers, called its Unique Node List (UNL), that represents the network for that server and is trusted not to collectively defraud it. The server believes that the network has come to a consensus when the members of its UNL come to a consensus on a transaction. In this paper, we improve Ripple to achieve better speed, security, last-mile connectivity, and ease of use. We implement guidelines and automated systems for building and maintaining UNLs for resilience, robustness, improved security, and efficient information propagation. We enhance the system to ensure that each server receives information from across the whole network rather than just from its UNL members. We also introduce the paradigm of UNL overlap as a function of information propagation and of the trust a server assigns to its own UNL. Our design not only reduces vulnerabilities such as eclipse attacks but also makes it easier to identify malicious behaviour and entities attempting to fraudulently double-spend or stall the system. We provide experimental evidence of the benefits of our approach over the current Ripple scheme: we observe speedup and success-rate improvements of ≥ 4.97x and 98.22x for information propagation, and of ≥ 3.16x and 51.70x for consensus.
Keywords: Ripple, Kelips, unique node list, consensus, information propagation
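The UNL-based consensus notion described above can be sketched minimally as follows. The server names, vote structure, and helper functions are illustrative simplifications, not the production Ripple implementation; the 80% quorum is the documented final-round agreement threshold of RPCA.

```python
# Minimal sketch of UNL-based voting in the spirit of RPCA (illustrative only).

QUORUM = 0.8  # final-round agreement threshold used by RPCA

def reaches_consensus(unl, votes, threshold=QUORUM):
    """A server declares consensus on a transaction when at least
    `threshold` of its UNL members vote for it."""
    approving = sum(1 for server in unl if votes.get(server, False))
    return approving / len(unl) >= threshold

def unl_overlap(unl_a, unl_b):
    """Fractional overlap between two UNLs -- the quantity the paper
    ties to information propagation and the trust a server assigns
    to its own UNL."""
    shared = set(unl_a) & set(unl_b)
    return len(shared) / min(len(unl_a), len(unl_b))

unl = ["s1", "s2", "s3", "s4", "s5"]
votes = {"s1": True, "s2": True, "s3": True, "s4": True, "s5": False}
print(reaches_consensus(unl, votes))               # 4/5 = 0.8 -> True
print(unl_overlap(unl, ["s3", "s4", "s5", "s6"]))  # 3 shared of min(5,4) -> 0.75
```

Low UNL overlap between servers is what makes eclipse attacks and forks easier, which motivates the paper's guidelines for building and maintaining UNLs.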
Procedia PDF Downloads 146
18013 Alpha: A Groundbreaking Avatar Merging User Dialogue with OpenAI's GPT-3.5 for Enhanced Reflective Thinking
Authors: Jonas Colin
Abstract:
Standing at the vanguard of AI development, Alpha represents an unprecedented synthesis of logical rigor and human abstraction, meticulously crafted to mirror the user's unique persona and personality. An avant-garde artefact in the realm of artificial intelligence, Alpha epitomizes a paradigm shift in personalized digital interaction, amalgamating user-specific dialogic patterns with the sophisticated algorithmic prowess of OpenAI's GPT-3.5 to engender a platform for enhanced metacognitive engagement and an individualized user experience. Underpinned by a sophisticated algorithmic framework, Alpha integrates vast datasets through a complex interplay of neural network models and symbolic AI, facilitating a dynamic, adaptive learning process. This integration enables the system to construct a detailed user profile, encompassing linguistic preferences, emotional tendencies, and cognitive styles, tailoring interactions to align with individual characteristics and conversational contexts. Furthermore, Alpha incorporates advanced metacognitive elements, enabling real-time reflection on and adaptation of its communication strategies. This self-reflective capability ensures continuous refinement of its interaction model, positioning Alpha not just as a technological marvel but as a harbinger of a new era in human-computer interaction, in which machines engage with us on a deeply personal and cognitive level, transforming our interaction with the digital world.
Keywords: chatbot, GPT-3.5, metacognition, symbiosis
Procedia PDF Downloads 70
18012 Output-Feedback Control Design for a General Class of Systems Subject to Sampling and Uncertainties
Authors: Tomas Menard
Abstract:
The synthesis of output-feedback control laws has been investigated by many researchers since the last century. While many results exist for Linear Time Invariant systems whose measurements are continuously available, control laws are nowadays usually implemented on microcontrollers, so the measurements are discrete-time by nature. This fact has to be taken into account explicitly in order to obtain satisfactory behavior of the closed-loop system. We consider here a general class of systems corresponding to an observability normal form which is subject to uncertainties in the dynamics and to sampling of the output. Indeed, in practice the modeling of the system is never perfect, which results in unknown uncertainties in the dynamics of the model. We propose an output-feedback algorithm based on a linear state feedback and a continuous-discrete time observer. The main feature of the proposed control law is that only discrete-time measurements of the output are needed. Furthermore, it is formally proven that the state of the closed-loop system converges exponentially toward the origin despite the unknown uncertainties. Finally, the performance of this control scheme is illustrated with simulations.
Keywords: dynamical systems, output feedback control law, sampling, uncertain systems
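The structure described above can be sketched on the simplest system in observability normal form, a double integrator: the observer predicts in open loop between samples and is corrected only when a discrete-time measurement arrives, while the state feedback uses the estimate alone. All gains, the sampling period, and the plant are illustrative choices, not the paper's tuned design.

```python
import numpy as np

# Sketch: linear state feedback + continuous-discrete observer for a
# double integrator. Gains and sampling period are illustrative.
dt, T_sample = 0.001, 0.05           # integration step / output sampling period
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([0.0, 1.0])
C = np.array([1.0, 0.0])
K = np.array([4.0, 4.0])             # state-feedback gain (closed-loop poles at -2)
L = np.array([20.0, 100.0])          # observer correction gain

x = np.array([1.0, 0.0])             # true state
xh = np.array([0.0, 0.0])            # observer estimate
steps_per_sample = int(T_sample / dt)

for k in range(int(10.0 / dt)):      # simulate 10 s
    u = -K @ xh                      # control uses only the estimate
    x = x + dt * (A @ x + B * u)
    xh = xh + dt * (A @ xh + B * u)  # open-loop prediction between samples
    if k % steps_per_sample == 0:    # discrete-time measurement arrives
        xh = xh + L * (C @ x - C @ xh) * T_sample

print(np.linalg.norm(x), np.linalg.norm(x - xh))  # both decay toward zero
```

Both the true state and the estimation error converge despite the output being available only every 50 ms, which is the qualitative behavior the paper proves for its uncertain class of systems.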
Procedia PDF Downloads 286
18011 Determination of Thermal Conductivity of Plaster Tow Material and Kapok Plaster by Numerical Method: Influence of the Heat Exchange Coefficient in Transitional Regime
Authors: Traore Papa Touty
Abstract:
This article presents a numerical method for determining the thermal conductivity of two local materials, kapok plaster and tow plaster. It consists of heating the front face of a wall made from each of these materials while insulating its rear face. We simultaneously study the evolution over time of the heat flux density at the rear face and of the temperature gradient between the heated face and the insulated face. The thermal conductivity is obtained once a steady state is reached, when the heat flux density and the temperature gradient no longer depend on time. The results showed that the theoretical value of the thermal conductivity is obtained when the material has reached its equilibrium state, and the values obtained for different convective exchange coefficients are appreciably equal to the experimental value.
Keywords: thermal conductivity, numerical method, heat exchange coefficient, transitional regime
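The idea of marching a transient solution to steady state and then reading off the conductivity from Fourier's law can be sketched in one dimension as follows. The geometry, material values, and the cold-face boundary condition (fixed temperature rather than insulated, so that a non-zero steady flux exists) are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# 1-D explicit finite-difference march to steady state, then recover
# conductivity from Fourier's law k = q * L / dT. Illustrative values.
k_true, rho, cp = 0.5, 900.0, 1000.0   # conductivity, density, heat capacity
L_wall, n = 0.05, 51                   # 5 cm slab, 51 nodes
dx = L_wall / (n - 1)
alpha = k_true / (rho * cp)            # thermal diffusivity
dt = 0.4 * dx * dx / alpha             # stable explicit time step (r = 0.4 < 0.5)

T = np.full(n, 20.0)                   # initial temperature (deg C)
T[0], T[-1] = 80.0, 20.0               # heated front face, cold rear face

for _ in range(50000):                 # march until the profile stops changing
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

q_rear = k_true * (T[-2] - T[-1]) / dx     # heat flux density at the rear face
k_est = q_rear * L_wall / (T[0] - T[-1])   # Fourier's law across the slab
print(k_est)                               # recovers k_true at steady state
```

As in the paper, the estimate is only meaningful once the flux density and the gradient no longer depend on time; evaluating them during the transient regime would bias the result.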
Procedia PDF Downloads 219
18010 Development of an Intelligent Decision Support System for Smart Viticulture
Authors: C. M. Balaceanu, G. Suciu, C. S. Bosoc, O. Orza, C. Fernandez, Z. Viniczay
Abstract:
The Internet of Things (IoT) represents the best option for smart vineyard applications, even if it is necessary to integrate the technologies required for their development. This article is based on the research and results obtained in the DISAVIT project. For smart agriculture, the project aims to provide a trustworthy, intelligent, integrated vineyard management solution based on the IoT. To achieve interoperability through the use of multiprotocol technology (the future connected wireless IoT), it is necessary to adopt an agnostic approach that provides a reliable environment addressing cyber security, IoT-based threats, and traceability through a blockchain-based design, while also creating a concept for long-term implementations (modular and scalable). These represent the main innovative technical aspects of this project. The DISAVIT project studies and promotes the incorporation of better management tools based on objective, data-based decisions, which are necessary for an agriculture adapted to, and more resistant to, climate change. It also exploits the opportunities generated by the digital services market for smart agriculture management stakeholders. The project's final result aims to improve decision-making, performance, and viticultural infrastructure, and to increase real-time data accuracy and interoperability. Innovative aspects such as end-to-end solutions, adaptability, scalability, security, and traceability place our product in a favorable position over competitors, as no solution on the market meets all of these requirements in a single product.
Keywords: blockchain, IoT, smart agriculture, vineyard
Procedia PDF Downloads 202
18009 Extreme Value Theory Applied in Reliability Analysis: Case Study of Diesel Generator Fans
Authors: Jelena Vucicevic
Abstract:
Reliability analysis is a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety, and monetary costs. There are ways to calculate reliability, unreliability, failure density, and failure rate. In this paper, the reliability of diesel generator fans was calculated through Extreme Value Theory. Extreme Value Theory is not widely used in the engineering field; its usage is well known in other areas such as hydrology, meteorology, and finance. The significance of this theory lies in the fact that, unlike other statistical methods, it focuses on rare and extreme values rather than on averages. It should be noted that the theory is not designed exclusively for extreme events, but for extreme values in any event. Therefore, this is a great opportunity to apply the theory and test whether it can be applied in this situation. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that no technical details are needed, and it can be implemented for any part for which we need to know the time to failure in order to schedule appropriate maintenance, but also to maximize usage and minimize costs. In this case, calculations have been made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with higher-quality fans to prevent future failures. The results show an approximation of the time for which the fans will work as they should, and the probability that the fans will work longer than a certain estimated time.
Extreme Value Theory can thus be applied not only to rare and extreme events, but to any event that has values which can be considered extreme.
Keywords: extreme value theory, lifetime, reliability analysis, statistic, time to failure
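The workflow described above can be sketched as follows. The field lifetimes are not reproduced in the abstract, so the data below are synthetic and purely illustrative; the Generalized Extreme Value family is one standard choice when applying Extreme Value Theory to lifetimes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical fan lifetimes in hours -- illustrative, not the study's data.
lifetimes = rng.weibull(1.5, 70) * 8000.0

# Fit a Generalized Extreme Value distribution to the observed lifetimes
shape, loc, scale = stats.genextreme.fit(lifetimes)
gev = stats.genextreme(shape, loc, scale)

# Probability that a fan survives beyond 10,000 hours,
# and the lifetime that 90% of fans are expected to exceed
p_survive = gev.sf(10000.0)
t10 = gev.ppf(0.10)
print(f"P(life > 10000 h) = {p_survive:.3f}, "
      f"10th-percentile life = {t10:.0f} h")
```

Quantities like `p_survive` are exactly the "probability of fans working more than a certain estimated time" that the abstract uses to support the replacement decision.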
Procedia PDF Downloads 328
18008 Numerical Investigation of the Needle Opening Process in a High Pressure Gas Injector
Authors: Matthias Banholzer, Hagen Müller, Michael Pfitzner
Abstract:
Gas internal combustion engines are widely used as propulsion systems or in power plants to generate heat and electricity. While there are different injection methods, including manifold port fuel injection and direct injection, the latter has more potential to increase the specific power by avoiding air displacement in the intake and to reduce combustion anomalies such as backfire or pre-ignition. During the opening process of the injector, multiple flow regimes occur: subsonic, transonic, and supersonic. To cover the wide range of Mach numbers, a compressible pressure-based solver is used. While the standard Pressure Implicit with Splitting of Operators (PISO) method is used for the coupling between velocity and pressure, a high-resolution non-oscillatory central scheme established by Kurganov and Tadmor calculates the convective fluxes. A blending function based on the local Mach and CFL numbers switches between the compressible and incompressible regimes of the developed model. As the considered operating points are well above the critical state of the fluids used, the ideal gas assumption is no longer valid. For the real-gas thermodynamics, models based on the Soave-Redlich-Kwong equation of state were implemented. The caloric properties are corrected using a departure formalism; for the viscosity and the thermal conductivity, the empirical correlation of Chung is used. For the injector geometry, the dimensions of a diesel injector were adapted. Simulations were performed using different nozzle and needle geometries and opening curves, and a significant influence of all three parameters can be clearly seen.
Keywords: high pressure gas injection, hybrid solver, hydrogen injection, needle opening process, real-gas thermodynamics
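A Soave-Redlich-Kwong density evaluation of the kind the solver relies on can be sketched as follows. The critical constants and acentric factor for hydrogen are standard literature values, and the operating point (300 K, 300 bar) is an illustrative injector-like condition, not one of the paper's cases.

```python
import numpy as np

# Sketch: SRK equation of state density for hydrogen (illustrative values).
R = 8.314462618                           # J/(mol K)
Tc, Pc, omega = 33.145, 1.2964e6, -0.219  # hydrogen critical data, acentric factor
M = 2.016e-3                              # molar mass, kg/mol

def srk_density(T, P):
    a = 0.42748 * R**2 * Tc**2 / Pc
    b = 0.08664 * R * Tc / Pc
    m = 0.480 + 1.574 * omega - 0.176 * omega**2
    alpha = (1.0 + m * (1.0 - np.sqrt(T / Tc)))**2
    # SRK in compressibility form: Z^3 - Z^2 + (A - B - B^2) Z - A B = 0
    A = a * alpha * P / (R * T)**2
    B = b * P / (R * T)
    roots = np.roots([1.0, -1.0, A - B - B**2, -A * B])
    Z = max(r.real for r in roots if abs(r.imag) < 1e-8)  # vapor root
    return P * M / (Z * R * T)

rho = srk_density(300.0, 30e6)  # 300 K, 300 bar
print(rho)                      # below the ideal-gas value, since Z > 1 for H2 here
```

At these supercritical conditions the compressibility factor deviates markedly from 1, which is why the paper replaces the ideal-gas assumption with a real-gas equation of state.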
Procedia PDF Downloads 461
18007 Effect of Fire Exposure on the Ultimate Strength of Loaded Columns
Authors: Hatem Hamdy Ghieth
Abstract:
In recent times, many fires have occurred in skeleton buildings, and a fire may continue for a long time and cause the collapse of the building. Such a collapse may happen due to the duration of exposure to fire as well as the level of loading on the load-carrying elements. This research presents a laboratory study of the behavior of reinforced concrete columns under axial load while exposed to fire, with temperatures reaching 650°C, applied from all sides of the columns. The main parameters of this study are the level of load applied to the column and the fire temperature, which was either 500°C or 650°C. Nine concrete columns with dimensions of 20x20x100 cm were cast; one of these columns was tested to determine the ultimate load, while the rest were exposed to fire according to the experimental schedule.
Keywords: columns, fire duration, concrete strength, level of loading
Procedia PDF Downloads 440
18006 Competition Between the Effects of Pesticides and Immune-activation on the Expression of Toll Pathway Genes
Authors: Dani Sukkar, Ali Kanso, Philippe Laval-Gilly, Jairo Falla-Angel
Abstract:
The honeybee immune system is challenged by different risk factors that induce various responses. However, complex scenarios in which bees are exposed to different pesticides simultaneously with immune activation have not been well evaluated. The Toll pathway is one of the main signaling pathways studied in invertebrate immune responses, and it is a good indicator of the effect of such complex interactions, in addition to key signaling elements of other pathways such as Relish of the immune deficiency (IMD) pathway, Eater, the phagocytosis receptor, and vitellogenin levels. Honeybee hemocytes extracted from 5th-instar larvae were exposed to imidacloprid and/or amitraz, with or without the presence of zymosan A as an immune activator. The expression of multiple immune-related genes, including spaetzle, Toll, myD88, relish, eater, and vitellogenin, was studied by real-time polymerase chain reaction after RNA extraction. The results demonstrated that the Toll pathway is mainly affected by the pesticides imidacloprid and amitraz, especially by their different combinations. Furthermore, immune activation by zymosan A, a fungal cell-wall component, mitigates to some extent the effect of the pesticides on the different levels of the Toll pathway. In addition, imidacloprid, amitraz, and zymosan A have complex and context-specific interactions that depend on the level of immune activation and the pathway evaluated, affecting immune-gene expression differently.
Keywords: toll pathway, immune modulation, β-glucan, imidacloprid, amitraz, honeybees, immune genes
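Real-time PCR readouts of the kind described above are commonly reduced to fold changes with the Livak 2^-ΔΔCt method. The Ct values below are hypothetical, and the reference gene is an assumption for illustration; the abstract does not report which reference gene or normalization was used.

```python
# Sketch of the Livak 2^-ddCt relative-expression calculation that
# underlies real-time PCR fold-change readouts. All Ct values are
# hypothetical; the reference gene is an illustrative assumption.

def relative_expression(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
    """Fold change of a target gene in treated vs. control samples,
    normalized to a reference gene (2^-ddCt method)."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values for a Toll-pathway gene after pesticide exposure
fold = relative_expression(ct_target_treated=26.0, ct_ref_treated=18.0,
                           ct_target_control=24.0, ct_ref_control=18.0)
print(fold)  # 2^-(2) = 0.25 -> four-fold down-regulation
```

A fold change below 1 corresponds to down-regulation of the gene in the treated hemocytes relative to controls.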
Procedia PDF Downloads 87
18005 Modeling the Time-Dependent Rheological Behavior of Clays Used in Fabrication of Ceramic
Authors: Larbi Hammadi, N. Boudjenane, N. Benhallou, R. Houjedje, R. Reffis, M. Belhadri
Abstract:
Many clays exhibit thixotropic behavior, in which the apparent viscosity of the material decreases with time of shearing at constant shear rate. The structural kinetic model (SKM) was used to characterize the thixotropic behavior of two different kinds of clays used in the fabrication of ceramics. The clays selected for analysis represent fluid and semisolid clay materials. The SKM postulates that the change in rheological behavior is associated with shear-induced breakdown of the internal structure of the clays. The model describes the decay of the structure with time at constant shear rate, assuming nth-order kinetics for the decay of the material structure with a given rate constant.
Keywords: ceramic, clays, structural kinetic model, thixotropy, viscosity
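A common form of the structural kinetic model can be sketched as follows: a structure parameter decays from 1 (intact) toward 0 (fully broken) with nth-order kinetics, and the apparent viscosity tracks it linearly between its initial and equilibrium values. The parameter values are illustrative, not fitted to the clays in the paper.

```python
import numpy as np

# Sketch of the structural kinetic model (SKM) for thixotropic breakdown
# at constant shear rate: dlam/dt = -k * lam^n, with
# eta = eta_inf + (eta_0 - eta_inf) * lam. Illustrative parameters.
eta_0, eta_inf = 12.0, 2.0    # initial / equilibrium apparent viscosity (Pa s)
k, n = 0.05, 2.0              # breakdown rate constant and kinetic order
dt, t_end = 0.1, 300.0

t = np.arange(0.0, t_end, dt)
lam = np.empty_like(t)
lam[0] = 1.0                  # fully structured material at the start of shearing
for i in range(1, len(t)):    # explicit Euler integration of the decay kinetics
    lam[i] = lam[i - 1] - dt * k * lam[i - 1] ** n

eta = eta_inf + (eta_0 - eta_inf) * lam
print(eta[0], eta[-1])        # viscosity decays from eta_0 toward eta_inf
```

Fitting k, n, and the two viscosity plateaus to shearing data at constant shear rate is how the model characterizes each clay.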
Procedia PDF Downloads 410
18004 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip-control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Efficient hardware alternatives are currently used more and more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability requirements of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile sensing systems except the force reconstruction process, a stage in which they have been applied less. This work presents a hardware implementation of a model-driven method reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. From the analysis of a software implementation of this model, the proposed implementation parallelizes the tasks that facilitate the execution of matrix operations and a two-dimensional optimization function to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel hardware characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate techniques for algorithm parallelization, guided by the rules of generalization, efficiency, and scalability in the tactile decoding process, and considering low latency, low power consumption, and real-time execution as the main design parameters.
The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulations by the Finite Element Modeling (FEM) technique of Hertzian and non-Hertzian contact events over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx® MPSoC XCZU9EG-2FFVB1156 platform that allows the reconstruction of force vectors following a scalable approach, from the information captured by tactile sensor arrays composed of up to 48×48 taxels that use various transduction technologies. The proposed implementation demonstrates a reduction of the estimation time by a factor of 180 compared to software implementations. Despite the relatively high values of the estimation errors, the information provided by this implementation on the tangential and normal tractions and on the triaxial reconstruction of forces allows the tactile properties of the touched object to be adequately reconstructed, similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced, the proposed implementation is useful for decoding contact forces in portable tactile sensing systems, thus helping to expand electronic-skin applications in robotic and biomedical contexts.
Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation
Procedia PDF Downloads 195
18003 The Effect of Naringenin on the Apoptosis in T47D Cell Line of Breast Cancer
Authors: AliAkbar Hafezi, Jahanbakhsh Asadi, Majid Shahbazi, Alijan Tabarraei, Nader Mansour Samaei, Hamed Sheibak, Roghaye Gharaei
Abstract:
Background: Breast cancer is the most common cancer in women, and in most cancer cells apoptosis is blocked. Given the importance of apoptosis in cancer cell death and the role of different genes in its induction or inhibition, the search for compounds that can initiate the process of apoptosis in tumor cells is discussed as a new strategy in anticancer drug discovery. The aim of this study was to investigate the effect of naringenin (NGEN) on apoptosis in the T47D breast cancer cell line. Materials and Methods: In this in vitro experimental study, the T47D breast cancer cell line was selected as the sample. The cells were treated for 24, 48, and 72 hours with doses of 20, 200, and 1000 µM of naringenin. Then, the transcription levels of genes involved in apoptosis, including Bcl-2, Bax, Caspase 3, Caspase 8, Caspase 9, P53, PARP-1, and FAS, were assessed using real-time PCR. The collected data were analyzed using IBM SPSS Statistics 24.0. Results: Naringenin at doses of 20, 200, and 1000 µM at all three time points (24, 48, and 72 hours) increased the expression of Caspase 3, P53, PARP-1, and FAS, reduced the expression of Bcl-2, and increased the Bax/Bcl-2 ratio; however, at none of the studied doses and times did it have a significant effect on the expression of Bax, Caspase 8, or Caspase 9. Conclusion: This study indicates that naringenin can reduce the growth of some cancer cells and cause their death through increased apoptosis and decreased expression of the anti-apoptotic Bcl-2 gene, resulting in the induction of apoptosis via both the intrinsic and extrinsic pathways.
Keywords: apoptosis, breast cancer, naringenin, T47D cell line
Procedia PDF Downloads 53