Search results for: event injuries
1386 Modelling a Hospital as a Queueing Network: Analysis for Improving Performance
Authors: Emad Alenany, M. Adel El-Baz
Abstract:
In this paper, the flow of different classes of patients into a hospital is modelled and analyzed by using the queueing network analyzer (QNA) algorithm and discrete event simulation. Input data for QNA are the rate and variability parameters of the arrival and service times in addition to the number of servers in each facility. The modelled patient flows largely match the real flows observed in a hospital in Egypt. Based on the analysis of the waiting times, two approaches are suggested for improving performance: separating patients into service groups, and adopting different service policies for sequencing patients through hospital units. Separating a specific group of patients with a higher performance target, to be served apart from the rest of the patients requiring a lower performance target, requires the same capacity while improving performance for the selected group. Besides, it is shown that adopting the shortest processing time and shortest remaining processing time service policies, among the other tested policies, would result in 11.47% and 13.75% reductions in average waiting time, respectively, relative to the first come first served policy.
Keywords: queueing network, discrete-event simulation, health applications, SPT
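The reported gap between sequencing policies can be illustrated with a toy single-server discrete-event simulation. The sketch below is a minimal, self-contained Python example with assumed exponential arrival and service rates; it uses neither the QNA parameters nor the hospital data of the paper, and the non-preemptive shortest-processing-time discipline merely stands in for the policies tested.

```python
import random
import heapq

def simulate(policy="FCFS", n_patients=5000, arrival_rate=1.0, service_rate=1.2, seed=42):
    """Single-server queue: compare FCFS with non-preemptive SPT sequencing."""
    rng = random.Random(seed)
    arrivals, services = [], []
    t = 0.0
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)
        arrivals.append(t)
        services.append(rng.expovariate(service_rate))

    waiting = []            # waiting patients as (sort key, arrival time, service time)
    clock = 0.0
    total_wait = 0.0
    i = 0
    while i < len(arrivals) or waiting:
        # admit everyone who has arrived by the current clock
        while i < len(arrivals) and arrivals[i] <= clock:
            key = services[i] if policy == "SPT" else arrivals[i]
            heapq.heappush(waiting, (key, arrivals[i], services[i]))
            i += 1
        if not waiting:                  # server idle: jump to the next arrival
            clock = arrivals[i]
            continue
        _, arr, svc = heapq.heappop(waiting)
        total_wait += clock - arr        # time spent waiting before service starts
        clock += svc
    return total_wait / n_patients

fcfs = simulate("FCFS")
spt = simulate("SPT")
print(f"FCFS avg wait: {fcfs:.3f}, SPT avg wait: {spt:.3f}, "
      f"reduction: {100 * (fcfs - spt) / fcfs:.1f}%")
```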
Procedia PDF Downloads 187
1385 The Ongoing Impact of Secondary Stressors on Businesses in Northern Ireland Affected by Flood Events
Authors: Jill Stephenson, Marie Vaganay, Robert Cameron, Caoimhe McGurk, Neil Hewitt
Abstract:
Purpose: The key aim of the research was to identify the secondary stressors experienced by businesses affected by single or repeated flooding and to determine to what extent businesses were affected by these stressors, along with any resulting impact on health. Additionally, the research aimed to establish the likelihood of businesses being re-exposed to the secondary stressors through assessing awareness of flood risk, implementation of property protection measures and level of community resilience. Design/methodology/approach: The chosen research method involved the distribution of a questionnaire survey to businesses affected by either single or repeated flood events. The questionnaire included the Impact of Event Scale (a 15-item self-report measure which assesses subjective distress caused by traumatic events). Findings: 55 completed questionnaires were returned by flood-impacted businesses. 89% of the businesses had sustained internal flooding, while 11% had experienced external flooding. The results established that the key secondary stressors experienced by businesses, in order of priority, were: flood damage, fear of recurring flooding, prevention of access to the premises/closure, loss of income, repair works, length of closure and insurance issues. There was a lack of preparedness for potential future floods and consequent vulnerability to the emergence of secondary stressors among flood-affected businesses, as flood resistance or flood resilience measures had only been implemented by 11% and 13% respectively. In relation to the psychological repercussions, the Impact of Event Scale scores suggested that potential prevalence of post-traumatic stress disorder (PTSD) was noted among 8 out of 55 respondents (15%). Originality/value: The results improve understanding of the enduring repercussions of flood events on businesses, indicating that residents are not the only group susceptible to the detrimental health impacts of flood events, and that single flood events may be just as likely as recurring flooding to contribute to ongoing stress. Lack of financial resources is a possible explanation for the lack of implementation of property protection measures among businesses, despite 49% experiencing flooding on multiple occasions. Therefore, it is recommended that policymakers consider potential sources of financial support or grants towards flood defences for flood-impacted businesses. Any form of assistance should be made available to businesses at the earliest opportunity, as there was no significant association between the time of the last flood event and the likelihood of experiencing PTSD symptoms.
Keywords: flood event, flood resilience, flood resistance, PTSD, secondary stressors
Procedia PDF Downloads 430
1384 Effect on Occupational Health Safety and Environment at Work from Metal Handicraft Using Rattanakosin Local Wisdom
Authors: Witthaya Mekhum, Waleerak Sittisom
Abstract:
This research investigated the effect on occupational health, safety and environment at work from metal handicraft using Rattanakosin local wisdom, focusing on pollution, accidents, and injuries from work. The sample group in this study included 48 metal handicraft workers in 5 communities, with questionnaires and interviews used to collect data. The TISI 18001 evaluation form was used for job safety analysis (JSA). The results showed that risk at work was reduced after applying the developed model. Banbu Community produces alloy bowls rubbed with stone. The high-risk process is the melting and hitting process. Before the application, the work risk was 82.71%. After the application of the developed model, the work risk was reduced to 50.61%. Banbart Community produces monk’s food bowls. The high-risk process is blow pipe welding. Before the application, the work risk was 93.59%. After the application of the developed model, the work risk was reduced to 48.14%. Bannoen Community produces circle gongs. The high-risk process is the milling process. Before the application, the work risk was 85.18%. After the application of the developed model, the work risk was reduced to 46.91%. Teethong Community produces gold leaf. The high-risk process is the hitting and spreading process. Before the application, the work risk was 86.42%. After the application of the developed model, the work risk was reduced to 64.19%. Ban Changthong Community produces gold ornaments. The high-risk process is the gold melting process. Before the application, the work risk was 67.90%. After the application of the developed model, the work risk was reduced to 37.03%. It can be concluded that with the application of the developed model, the work risk of the 5 communities was reduced in 3 main areas: (1) work illness reduced by 16.77%; (2) pollution from work reduced by 10.31%; (3) accidents and injuries from work reduced by 15.62%.
Keywords: occupational health, safety, local wisdom, Rattanakosin
Procedia PDF Downloads 440
1383 Kinematic of Thrusts and Tectonic Vergence in the Paleogene Orogen of Eastern Iran, Sechangi Area
Authors: Shahriyar Keshtgar, Mahmoud Reza Heyhat, Sasan Bagheri, Ebrahim Gholami, Seyed Naser Raiisosadat
Abstract:
The eastern Iranian ranges form a Z-shaped sigmoidal outcrop with a N-S-trending general strike on satellite images. They have long been known as the Sistan suture zone and have recently been identified as the product of an orogenic event referred to as either the Paleogene or the Sistan orogen. The flysch sedimentary basin of eastern Iran was filled by a huge volume of fine-grained Eocene turbiditic sediments, smaller amounts of pelagic deposits and Cretaceous ophiolitic slices, which are entirely remnants of older accretionary prisms that appeared in a fold-thrust belt developed above a subduction zone beneath the Lut/Afghan block, a portion of the Cimmerian superterrane. In these ranges, there are Triassic sedimentary and carbonate sequences (equivalent to the Nayband and Shotori Formations) along with scattered outcrops of Permian limestones (equivalent to the Jamal limestone) and greenschist-facies metamorphic rocks, probably belonging to the basement of the Lut block, which have tectonic contacts with younger rocks. Moreover, the younger Eocene detrital-volcanic rocks were also thrusted onto the Cretaceous or younger turbiditic deposits. The first-generation folds (parallel folds) and thrusts with slaty cleavage appeared parallel to the NE edge of the Lut block. Structural analysis shows that the dominant vergence of the thrusts is toward the southeast, so that the Permo-Triassic units of the Lut block have been thrusted onto the younger rocks, including older (probably Jurassic) granites. Additional structural studies show that the regional transport direction in this deformation event is from northwest to southeast, that is, from the outside to the inside of the orogen in the Sechangi area. Younger thrusts of the second deformation event were either formed directly as a result of the second deformation event, or they were older thrusts that were reactivated and folded, so that often two or more sets of slickenlines can be recognized on the thrust planes. These more recent thrusts have been redistributed in directions nearly perpendicular to the edge of the Lut block and parallel to the axial surfaces of the northwest-trending second-generation large-scale folds (radial folds). Some of these younger thrusts follow the out-of-the-syncline thrust system. Both the axial planes of these folds and the associated penetrative shear cleavage extend towards the northwest and dip to both the northeast and southwest, parallel to the younger thrusts. Large-scale buckling under a layer-parallel stress field created this deformation event. Such consecutive deformation events perpendicular to each other cannot be explained by the simple linear orogen models presented for eastern Iran so far and are more consistent with the oroclinal buckling model.
Keywords: thrust, tectonic vergence, orocline buckling, Sechangi, eastern Iranian ranges
Procedia PDF Downloads 78
1382 Anemia and Nutritional Status as Dominant Factor of the Event Low Birth Weight in Indonesia: A Systematic Review
Authors: Lisnawati Hutagalung
Abstract:
Background: Low birth weight (LBW) is one cause of newborn death. Babies with low birth weight tend to have slower cognitive development and growth retardation, and are more at risk of infectious disease and of death. Objective: To identify the risk factors and dominant factors that influence the incidence of LBW in Indonesia. Method: This research used several public health databases, such as Google Scholar and the UGM, UI and UNAND journals, covering 2012-2015. Records were filtered using the keywords ‘Risk Factors’ AND ‘Cause LBW’, yielding 2,757 studies. Filtering produced 5 public health studies that met the criteria. Results: Risk factors associated with LBW include environmental factors (exposure to cigarette smoke and residence), socio-demographic factors (age and socio-economic status) and maternal factors (anemia, placental abnormalities, maternal nutritional status, antenatal examinations, preeclampsia, parity, and complications in pregnancy). Anemia and nutritional status were the dominant factors affecting LBW. Conclusions: The risk factors that affect LBW are most commonly maternal factors. The dominant factors with the largest effect on LBW are anemia and the nutritional status of the mother during pregnancy.
Keywords: low birth weight, anemia, nutritional status, the dominant factor
Procedia PDF Downloads 365
1381 The Use of Social Media Sarcasm as a Response to Media-Coverage of Iran’s Unprecedented Attack on Israel
Authors: Afif J. Arabi
Abstract:
On April 15, 2024, Iran announced its unprecedented military attack by sending waves of more than 300 drones and ballistic missiles toward Israel. The attack lasted approximately five hours and was a widely covered, distributed, and followed media event. Iran’s military action against Israel had been long awaited across the Middle East since the early days of the October 7th war on Gaza and after a long history of verbal threats. While people in many Arab countries stayed up past midnight in anticipation of watching the disastrous results of this unprecedented attack, voices on traditional and social media alike started to question the timed public announcement of the attack, which gave Israel at least two hours of notice to prepare its defenses. When live news coverage started showing that nearly all the drones and missiles were intercepted by Israel – with help from the U.S. and other countries – and no deaths were reported, the social media response to this media event turned toward sarcasm, mockery, irony, and humor. Social media users posted sarcastic pictures, jokes, and comments mocking the Iranian offensive. This research examines this unique media event and the sarcastic response it generated on social media. The study aims to investigate the causes leading to media sarcasm in militarized political conflict, the social function of such generated sarcasm, and the role of social media as a platform for consuming frustration, dissatisfaction, and outrage passively through various media products. The study compares the serious traditional media coverage of the event with the humorous social media response across Arab countries. The research uses an eclectic theoretical approach, drawing on framing theory as a paradigm for understanding and investigating the coverage, and on social functionalism theory in media studies to examine sarcasm. Social functionalism theory is a sociological perspective that views society as a complex system whose parts work together to promote solidarity and stability. In the context of media and sarcasm, this theory would suggest that sarcasm serves specific functions within society, such as reinforcing social norms, providing a means for social critique, or functioning as a safety valve for expressing social tension. The research also includes a qualitative analysis of specific examples, including responses of social media commentators to such manifestations of political criticism. The preliminary findings of this study point to a heightened dramatization of the televised event and a widespread belief that this attack was a staged show incongruent with Iran’s official enmity and death threats toward Israel. The social media sarcasm reinforces the Arab view of Iran and Israel as mutual threats. This belief stems from the complex dynamics, historical context, and regional conflict surrounding these three parties: Iran, Israel, and the Arab states.
Keywords: social functionalism, social media sarcasm, television news framing, live militarized conflict coverage, Iran, Israel, communication theory
Procedia PDF Downloads 44
1380 A Simulation of Patient Queuing System on Radiology Department at Tertiary Specialized Referral Hospital in Indonesia
Authors: Yonathan Audhitya Suthihono, Ratih Dyah Kusumastuti
Abstract:
The radiology department in a tertiary referral hospital faces service operation challenges such as large and varied patient arrivals, which can increase the probability of patient queuing. During the COVID-19 pandemic, it is mandatory to apply social distancing protocols in the radiology department, so a strategy to prevent the accumulation of patients at one spot is required. The aim of this study is to identify an alternative solution which can reduce patients’ waiting time in the radiology department. Discrete event simulation (DES) is used for this study by constructing several improvement scenarios with Arena simulation software. Statistical analysis is used to test the validity of the base case scenario model and to investigate the performance of the improvement scenarios. The result of this study shows that the selected scenario is able to reduce patient waiting time significantly, which leads to more efficient services in the radiology department, serving patients more effectively and thus increasing patient satisfaction. The result of the simulation can be used by hospital management to improve the operational performance of the radiology department.
Keywords: discrete event simulation, hospital management, patient queuing model, radiology department services
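Arena is a commercial package; a rough open-source analogue of a base-case model like the one described can be sketched with the Python simpy library. The arrival rate, service time and number of imaging stations below are illustrative assumptions, not the hospital's data.

```python
import random
import simpy

RANDOM_SEED = 1
ARRIVAL_MEAN = 6.0      # assumed mean minutes between patient arrivals
SERVICE_MEAN = 10.0     # assumed mean minutes per radiology examination
N_MODALITIES = 2        # assumed number of imaging stations (servers)

waits = []

def patient(env, radiology):
    arrival = env.now
    with radiology.request() as req:
        yield req                      # wait for a free imaging station
        waits.append(env.now - arrival)
        yield env.timeout(random.expovariate(1.0 / SERVICE_MEAN))

def arrivals(env, radiology):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(patient(env, radiology))

random.seed(RANDOM_SEED)
env = simpy.Environment()
radiology = simpy.Resource(env, capacity=N_MODALITIES)
env.process(arrivals(env, radiology))
env.run(until=8 * 60)                  # one 8-hour working day
print(f"patients served: {len(waits)}, mean wait: {sum(waits) / len(waits):.1f} min")
```

Scenario comparison would then amount to re-running the model with different capacities or arrival schedules and comparing the resulting waiting-time statistics.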
Procedia PDF Downloads 119
1379 A General Framework for Measuring the Internal Fraud Risk of an Enterprise Resource Planning System
Authors: Imran Dayan, Ashiqul Khan
Abstract:
Internal corporate fraud, which is fraud carried out by internal stakeholders of a company, affects the well-being of the organisation just like its external counterpart. Even if such an act is carried out for the short-term benefit of a corporation, the act is ultimately harmful to the entity in the long run. Internal fraud often relies upon aberrations from usual business processes. Business processes are the lifeblood of a company in the modern managerial context. Such processes are developed and fine-tuned over time as a corporation grows through its life stages. Modern corporations have embraced technological innovations in their business processes, and Enterprise Resource Planning (ERP) systems being at the heart of such business processes is a testimony to that. Since ERP systems record a huge amount of data in their event logs, the logs are a treasure trove for anyone trying to detect any sort of fraudulent activity hidden within day-to-day business operations and processes. This research utilises the ERP systems in place within corporations to assess the likelihood of prospective internal fraud by developing a framework for measuring the risk of fraud through Process Mining techniques, and hence finds risky designs and loose ends within these business processes. This framework helps not only in identifying existing cases of fraud in the records of the event log, but also signals the overall riskiness of certain business processes, and hence draws attention to redesigning such processes to reduce the chance of future internal fraud while improving internal control within the organisation. The research adds value by applying the concepts of Process Mining to the analysis of data from modern applications of business process records, namely ERP event logs, and develops a framework that should be useful to internal stakeholders for strengthening internal control as well as providing external auditors with a useful tool in cases of suspicion. The research proves its usefulness through a few case studies conducted on large corporations with complex business processes and an ERP in place.
Keywords: enterprise resource planning, fraud risk framework, internal corporate fraud, process mining
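The deviation-flagging idea can be illustrated with a minimal, hand-rolled sketch over a toy purchase-to-pay log. The activity names, the three-trace log and the simple deviation score below are assumptions for illustration only; the paper's framework rests on full Process Mining techniques applied to real ERP event logs.

```python
from collections import Counter

# Toy event log: case id -> ordered list of activities, standing in for ERP event-log data.
event_log = {
    "PO-1001": ["create_po", "approve_po", "receive_goods", "three_way_match", "pay_invoice"],
    "PO-1002": ["create_po", "approve_po", "receive_goods", "three_way_match", "pay_invoice"],
    "PO-1003": ["create_po", "receive_goods", "pay_invoice"],   # approval and matching skipped
}

# Directly-follows relation: how often activity a is immediately followed by b.
dfg = Counter()
for trace in event_log.values():
    for a, b in zip(trace, trace[1:]):
        dfg[(a, b)] += 1

# A reference ("to-be") process; transitions outside it are flagged as risky deviations.
allowed = {("create_po", "approve_po"), ("approve_po", "receive_goods"),
           ("receive_goods", "three_way_match"), ("three_way_match", "pay_invoice")}

deviations = {pair: n for pair, n in dfg.items() if pair not in allowed}
risk_share = sum(deviations.values()) / sum(dfg.values())
print("deviating transitions:", deviations)
print(f"share of non-conforming transitions: {risk_share:.0%}")
```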
Procedia PDF Downloads 334
1378 Applied Complement of Probability and Information Entropy for Prediction in Student Learning
Authors: Kennedy Efosa Ehimwenma, Sujatha Krishnamoorthy, Safiya Al‑Sharji
Abstract:
The probability of an event lies in the interval [0, 1], with values determined by the number of outcomes of events in a sample space S. The probability Pr(A) that an event A will never occur is 0. The probability Pr(B) that event B will certainly occur is 1. This makes the outcomes of both events A and B a certainty. Furthermore, the sum of probabilities Pr(E₁) + Pr(E₂) + … + Pr(Eₙ) of a finite set of events in a given sample space S equals 1. Conversely, the difference between the probabilities of two events that will certainly occur is 0. This paper first discusses Bayes’ rule, the complement of probability, and the difference of probability for occurrences of learning events before applying them to the prediction of learning objects in student learning. Given that the probabilities sum to 1, to make a recommendation for student learning, this paper proposes that the difference between argMaxPr(S) and the probability of student performance quantifies the weight of learning objects for students. Using a skill-set dataset, the computational procedure demonstrates i) the probability of skill-set events that have occurred that would lead to higher-level learning; ii) the probability of the events that have not occurred that require subject-matter relearning; iii) the accuracy of the decision tree in the prediction of student performance into class labels; and iv) the information entropy of the skill-set data and its implications for student cognitive performance and the recommendation of learning.
Keywords: complement of probability, Bayes’ rule, prediction, pre-assessments, computational education, information theory
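A minimal sketch of the complement and entropy computations is shown below on a made-up skill-set sample. Interpreting argMaxPr(S) as the maximum attainable probability of 1 is this sketch's assumption, as are the pass/fail labels; the paper's dataset and decision-tree step are not reproduced.

```python
import math
from collections import Counter

# Toy skill-set outcomes for one cohort (illustrative labels, not the paper's data):
# each entry records whether a student passed the pre-assessment for a skill.
skill_outcomes = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]

counts = Counter(skill_outcomes)
n = len(skill_outcomes)

p_pass = counts["pass"] / n
p_fail = 1.0 - p_pass              # complement: probability the skill event did not occur

# Shannon entropy of the skill-set distribution (bits): high entropy means the
# pre-assessment outcomes carry more uncertainty about cognitive performance.
entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())

# Weight of a learning object, assumed here as the gap between the maximum attainable
# probability (1) and the student's current performance probability.
weight = 1.0 - p_pass
print(f"Pr(pass)={p_pass:.2f}, Pr(fail)={p_fail:.2f}, "
      f"entropy={entropy:.2f} bits, weight={weight:.2f}")
```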
Procedia PDF Downloads 161
1377 Evolution of Bombings against Transportation Infrastructure
Authors: Jonathan K. Hill
Abstract:
The transportation networks throughout Africa remain the only transportation infrastructure system in the world that is attacked by terrorists at a high frequency, so the international community can learn from each attack. The targeting of transportation should be recognized as a direct attack against a civilian population, so the international community should work to better understand the types of attacks utilized, the types of improvised explosive device designs adapted to transportation targets, and the ways the various modes of transportation have been attacked throughout the continent. Some countries have seen grenade attacks that have resulted in only injuries, while some countries have experienced large vehicle bombings that have resulted in hundreds of injuries and numerous deaths. With insurgencies, explosive devices have been small, complex, and generally targeted at an enemy of the insurgency. With terrorist bombings, the explosive devices have been large, brazen, and targeted at civilian populations. These civilian populations are easily targeted within the transportation system. The presentation provided by Assess Africa LLC is titled ‘Evolution of Bombings Against Transportation Infrastructure’ and covers improvised explosive device characteristics, how improvised explosive devices have been adapted to transportation targets in Africa, analyses of recent incidents, and some advice on effective protective measures. A main component of the improvised explosive device characteristics portion of the presentation focuses on the link between explosive device components, the intelligence network, and the bomb-builder’s network. By understanding the components, how the use of various components can be linked to a terrorist group’s capabilities, and how the bomb-builder acquires materials, the analysis of improvised explosive device attacks takes on a new direction – one that focuses on defeating the network instead of merely reviewing incidents of the past.
Keywords: Africa, bombings, critical infrastructure protection, transportation security
Procedia PDF Downloads 425
1376 Base Deficit Profiling in Patients with Isolated Blunt Traumatic Brain Injury – Correlation with Severity and Outcomes
Authors: Shahan Waheed, Muhammad Waqas, Asher Feroz
Abstract:
Objectives: To determine the utility of base deficit in traumatic brain injury for assessing severity and to correlate it with the conventional computed tomography scales used to grade the severity of head injury. Methodology: An observational cross-sectional study conducted in a tertiary care facility from 1st January 2010 to 31st December 2012. All patients with isolated traumatic brain injury presenting to the emergency department within 24 hours of the injury were included in the study. Initial Glasgow Coma Scale (GCS) and base deficit values were taken at presentation; the patients were followed during their hospital stay, and brain CT findings were recorded and graded as per the Rotterdam scale, with the findings cross-checked by a radiologist. The Glasgow Outcome Scale was taken at the last follow-up. Outcomes were dichotomized into favorable and unfavorable outcomes. Continuous variables with normal and non-normal distributions are reported as mean ± SD. Categorical variables are presented as frequencies and percentages. The relationship of the base deficit with GCS, GOS, brain CT findings and length of stay was calculated using Spearman’s correlation. Results: 154 patients were enrolled in the study. The mean age of the patients was 30 years, and 137 were male. As per the GCS, 34 patients had moderate and 109 had severe brain injuries. 34 percent of the total had an unfavorable outcome, with a mean of 18±14. The correlation between GCS on presentation and the base deficit was significant at the 0.01 level (0.004). The correlation between the Rotterdam CT findings, length of stay and the base deficit was not significant. Conclusion: The base deficit was found to be a good predictor of the severity of brain injury. There was no association between the severity of injuries on brain CT as per the Rotterdam scale and the base deficit. Further studies with larger sample sizes are needed to further evaluate these associations.
Keywords: base deficit, traumatic brain injury, Rotterdam, GCS
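The correlation step can be sketched in a few lines with scipy; the base deficit and GCS values below are fabricated for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Illustrative values only -- not the study's records: admission base deficit (mmol/L)
# and Glasgow Coma Scale for ten hypothetical patients with isolated blunt TBI.
base_deficit = np.array([2.0, 4.5, 6.0, 8.2, 3.1, 9.5, 1.2, 7.4, 5.3, 10.1])
gcs          = np.array([14,  12,  9,   7,   13,  5,   15,  8,   10,  4])

rho, p_value = spearmanr(base_deficit, gcs)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
# A strongly negative rho would mirror the reported finding that a larger base
# deficit accompanies a lower (more severe) GCS at presentation.
```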
Procedia PDF Downloads 443
1375 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data
Authors: N. Borjalilu, P. Rabiei, A. Enjoo
Abstract:
A Flight Data Monitoring (FDM) program assists an operator in the aviation industry in identifying, quantifying, assessing and addressing operational safety risks, in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator’s Safety Management System (SMS), allowing it to detect, confirm, and assess safety issues associated with human errors and to check the effectiveness of corrective actions. This article proposes a model for assessing the safety risk level of flight data across different event categories, based on fuzzy set values. It permits evaluation of the operational safety level from the point of view of flight activities. The main advantage of this method is the proposed qualitative safety analysis of flight data. This research gathers the opinions of aviation experts through a number of questionnaires related to flight data in four categories of occurrence that can take place during an accident or an incident: Runway Excursions (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each category (by F-TOPSIS) and applying it to the number of risks of the event, the safety risk of each related event can be obtained.
Keywords: F-TOPSIS, fuzzy set, flight data monitoring (FDM), flight safety
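The ranking step can be sketched with a crisp (non-fuzzy) TOPSIS over the four occurrence categories. The criteria, scores and weights below are invented for illustration, and the fuzzification of expert judgements used in F-TOPSIS is deliberately omitted.

```python
import numpy as np

# Simplified crisp TOPSIS over the four occurrence categories named in the abstract.
alternatives = ["RE", "CFIT", "MAC", "LOC-I"]
# rows: alternatives, columns: assumed criteria (e.g., severity, likelihood, detectability)
scores = np.array([[7.0, 6.0, 4.0],
                   [9.0, 3.0, 6.0],
                   [8.0, 2.0, 7.0],
                   [9.0, 4.0, 5.0]])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, True])   # treat all criteria as "higher = riskier"

norm = scores / np.linalg.norm(scores, axis=0)        # vector normalisation
weighted = norm * weights

ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
anti_ideal = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

d_plus = np.linalg.norm(weighted - ideal, axis=1)
d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)   # closeness to the riskiest (ideal) profile

for name, c in sorted(zip(alternatives, closeness), key=lambda x: -x[1]):
    print(f"{name}: closeness coefficient = {c:.3f}")
```

In the fuzzy version, each score and weight would instead be a triangular fuzzy number aggregated across the expert questionnaires before the same distance-and-closeness calculation.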
Procedia PDF Downloads 168
1374 Empirical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a process for managing energy consumption with a view to energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features. Appliance features are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting areas of operation of each residential appliance based on the power demand, and then detecting the time at which each selected appliance changes its state. In order to fit the capabilities of practical existing smart meters, we work on low-sampling-rate data with a frequency of (1/60) Hz. The data is simulated with the Load Profile Generator software (LPG), which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that uses behaviour simulation of people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect. It also facilitates the extraction of specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract a power interval within which the operation of the selected appliance falls, along with a time vector for the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. This method is evaluated using both data simulated with LPG and the real-world Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques
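A stripped-down version of the state-transition detection step is sketched below on a synthetic 1/60 Hz trace; the background load, appliance power and threshold are all assumptions, and the DTW-based identification stage is not shown.

```python
import numpy as np

# Synthetic 1/60 Hz (one sample per minute) aggregate power trace standing in for
# LPG output: a constant background load plus one appliance switching on and off.
rng = np.random.default_rng(0)
power = np.full(240, 150.0) + rng.normal(0, 5, 240)    # watts, 4 hours of samples
power[80:150] += 1200.0                                # appliance ON between minutes 80 and 150

def detect_events(signal, threshold=300.0):
    """Flag state transitions as sample-to-sample jumps larger than a threshold."""
    diffs = np.diff(signal)
    on_events = np.where(diffs > threshold)[0] + 1
    off_events = np.where(diffs < -threshold)[0] + 1
    return on_events, off_events

on_idx, off_idx = detect_events(power)
for t in on_idx:
    print(f"ON  transition near minute {t}")
for t in off_idx:
    print(f"OFF transition near minute {t}")
# The interval between an ON and the matching OFF transition delimits the appliance's
# operating window, from which power, geometrical and statistical features can be extracted.
```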
Procedia PDF Downloads 82
1373 Monitoring of Wound Healing Through Structural and Functional Mechanisms Using Photoacoustic Imaging Modality
Authors: Souradip Paul, Arijit Paramanick, M. Suheshkumar Singh
Abstract:
Traumatic injury is the leading worldwide health problem. Annually, millions of surgical wounds are created for the sake of routine medical care. The healing of these unintended injuries is always monitored based on visual inspection. The maximal restoration of tissue functionality remains a significant concern of clinical care. Although minor injuries heal well with proper care and medical treatment, large injuries are negatively influenced by various factors (vascular insufficiency, tissue coagulation) and heal poorly. Demographically, the number of people suffering from severe wounds and impaired healing conditions is burdensome for both human health and the economy. An incomplete understanding of the functional and molecular mechanisms of tissue healing often leads to a lack of proper therapies and treatment. Hence, strong and promising medical guidance is necessary for monitoring tissue regeneration processes. Photoacoustic imaging (PAI) is a non-invasive, hybrid imaging modality that can provide a suitable solution in this regard. Light combined with sound offers structural, functional and molecular information from greater penetration depths. Therefore, the molecular and structural mechanisms of tissue repair will be readily observable with PAI from the superficial layer down to the deep tissue region. Blood vessel formation and growth is an essential tissue-repairing component. These vessels supply nutrition and oxygen to the cells in the wound region. Angiogenesis (formation of new capillaries from existing blood vessels) contributes to new blood vessel formation during tissue repair. The betterment of tissue healing directly depends on angiogenesis. Other optical microscopy techniques can visualize angiogenesis at micron-scale penetration depths but are unable to provide deep tissue information. PAI overcomes this barrier due to its unique capability: it is ideally suited for deep tissue imaging and provides the rich optical contrast generated by hemoglobin in blood vessels. Hence, the early angiogenesis detection provided by PAI supports monitoring of the medical treatment of the wound. Along with functional properties, mechanical properties also play a key role in tissue regeneration. The wound heals through a dynamic series of physiological events like coagulation, granulation tissue formation, and extracellular matrix (ECM) remodeling. Tissue elasticity changes can therefore be identified using non-contact photoacoustic elastography (PAE). In a nutshell, angiogenesis and biomechanical properties are both critical parameters for tissue healing, and both can be characterized within a single imaging modality (PAI).
Keywords: PAT, wound healing, tissue coagulation, angiogenesis
Procedia PDF Downloads 106
1372 The Structural Pattern: An Event-Related Potential Study on Tang Poetry
Authors: ShuHui Yang, ChingChing Lu
Abstract:
Measuring event-related potentials (ERPs) has been fundamental to our understanding of how people process language. One specific ERP component, the P600, has been hypothesized to be associated with syntactic reanalysis processes. We, however, propose that the P600 is not restricted to reanalysis processes but is an index of structural pattern processing. To investigate structural pattern processing, we utilized the effects of stimulus degradation in structural priming. To put it another way, there would be no P600 effect if the structure of the prime was the same as the structure of the target; otherwise, there would be a P600 effect if the structure differed between the prime and the target. In the experiment, twenty-two participants were presented with four sentences of Tang poetry. The first two sentences, serving as primes, were constructed with the SVO+VP structure. The last two sentences, serving as targets, were divided into three types. Type one of the targets was SVO+VP. Type two of the targets was SVO+VPVP. Type three of the targets was VP+VP. The result showed that both of the targets SVO+VPVP and VP+VP elicited a positive-going brainwave, a P600 effect, in the 600-900 ms time window. Furthermore, the P600 component was larger for the target VP+VP than for the target SVO+VPVP; that is, the more dissimilar the structure was, the larger the P600 effect. These results indicate that the P600 is an index of structural processing, and that its intensity varies with the degree of structural heterogeneity.
Keywords: ERPs, P600, structural pattern, structural priming, Tang poetry
Procedia PDF Downloads 140
1371 Two-Level Graph Causality to Detect and Predict Random Cyber-Attacks
Authors: Van Trieu, Shouhuai Xu, Yusheng Feng
Abstract:
Tracking attack trajectories can be difficult when there is limited information about the nature of the attack. It is even more difficult when attack information is collected by Intrusion Detection Systems (IDSs), because current IDSs have limitations in identifying malicious and anomalous traffic. Moreover, IDSs only point out suspicious events but do not show how the events relate to each other or which event possibly caused another event to happen. Because of this, it is important to investigate new methods capable of tracking attack trajectories quickly, with less attack information and less dependency on IDSs, in order to prioritize actions during incident response. This paper proposes a two-level graph causality framework for tracking attack trajectories in internet networks by leveraging observable malicious behaviors to detect the most probable attack events that can cause another event to occur in the system. Technically, given a time series of malicious events, the framework extracts events with useful features, such as attack time and port number, and applies conditional independence tests to detect relationships between attack events. Using academic datasets collected by IDSs, experimental results show that the framework can quickly detect causal pairs that offer meaningful insights into the nature of the internet network, given only reasonable restrictions on network size and structure. Without the framework’s guidance, these insights could not be discovered by existing tools such as IDSs, and would cost expert human analysts significant time, if it were possible at all. The computational results from the proposed two-level graph network model reveal obvious patterns and trends. In fact, for more than 85% of causal pairs, the average time difference between the cause and effect events is within 5 minutes in both the computed and observed data. This result can be used as a preventive measure against future attacks. Although the forecast horizon may be short, from 0.24 seconds to 5 minutes, it is long enough to be used to design a prevention protocol to block those attacks.
Keywords: causality, multilevel graph, cyber-attacks, prediction
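The dependence-testing idea can be sketched with the simplest possible case: a plain (unconditional) chi-square test on a 2x2 table of made-up alert counts. The event names and counts are assumptions, and the paper's framework goes further by conditioning on other events and on features such as attack time and port number.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x2 contingency table built from a toy IDS alert log (not real data):
# rows:    a "port_scan" alert occurred / did not occur in a 5-minute window
# columns: an "exploit_attempt" alert followed in that window / did not follow
table = np.array([[42,  8],    # scan windows: 42 were followed by an exploit attempt, 8 were not
                  [15, 85]])   # no-scan windows: exploits appeared in only 15 of 100

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4g}")
if p < 0.05:
    print("port_scan and exploit_attempt are dependent -> candidate causal pair")
else:
    print("no evidence of dependence between the two event types")
```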
Procedia PDF Downloads 156
1370 Performance Evaluation of Production Schedules Based on Process Mining
Authors: Kwan Hee Han
Abstract:
The external environment of the enterprise is changing rapidly, driven mainly by global competition, cost reduction pressures, and new technology. In these situations, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity while carefully considering many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate realistic production schedules and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further use such as security investigation, auditing and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems. By using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the workload balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of generated production schedules, the quality of production schedules in manufacturing enterprises can be improved.
Keywords: data mining, event log, process mining, production scheduling
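One of the listed criteria, workstation utilization with a candidate bottleneck, can be computed directly from a scheduling event log, as in the toy sketch below; the jobs, stations and timings are invented, and the discovery and conformance-checking parts of process mining are not shown.

```python
from collections import defaultdict

# Toy scheduling event log (illustrative): (job, workstation, start, end) in minutes
# from the start of the shift. A real log would be exported by the scheduling system.
event_log = [
    ("J1", "WS-A", 0, 30), ("J1", "WS-B", 35, 80),
    ("J2", "WS-A", 30, 55), ("J2", "WS-B", 80, 140),
    ("J3", "WS-A", 55, 70), ("J3", "WS-B", 140, 190),
]
SHIFT_MINUTES = 200

busy = defaultdict(float)
for _, station, start, end in event_log:
    busy[station] += end - start

utilization = {station: minutes / SHIFT_MINUTES for station, minutes in busy.items()}
bottleneck = max(utilization, key=utilization.get)

for station, u in utilization.items():
    print(f"{station}: utilization {u:.0%}")
print(f"candidate bottleneck workstation: {bottleneck}")
```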
Procedia PDF Downloads 279
1369 Socioeconomic Burden of Life Long Disease: A Case of Diabetes Care in Bangladesh
Authors: Samira Humaira Habib
Abstract:
Diabetes has profound effects on individuals and their families. If diabetes is not well monitored and managed, it leads to long-term complications and a large and growing cost to the health care system. The prevalence and socioeconomic burden of diabetes, and the relative return on investment for eliminating or reducing that burden, are central to understanding its cost burden. The socioeconomic cost burden of diabetes has been well explored in developed countries but is almost absent from the literature of developing countries like Bangladesh. The main objective of the study is to estimate the total socioeconomic burden of diabetes. It is a prospective longitudinal follow-up study which is analytical in nature. Primary and secondary data were collected from patients undergoing treatment for diabetes at the out-patient department of the Bangladesh Institute of Research & Rehabilitation in Diabetes, Endocrine & Metabolic Disorders (BIRDEM). Of the 2,115 diabetic subjects, females constitute around 50.35% and males 49.65%. Among the subjects, 1,323 had controlled and 792 had uncontrolled diabetes. Cost analysis of the 2,115 diabetic patients shows that the total cost of diabetes management and treatment is US$ 903,018, with an average of US$ 426.95 per patient. Among direct costs, investigations and medical treatment at hospital constitute most of the cost of diabetes. The average hospital cost is US$ 311.79, which is alarming for diabetic patients. Among the indirect costs, the cost of productivity loss (US$ 51,110.1) is the highest item; together, the items constitute a total indirect cost of US$ 69,215.7. The incremental cost of intensive management of uncontrolled diabetes is US$ 101.54 per patient, the event-free time gained in this group is 0.55 years, and the life years gained are 1.19 years. The incremental cost per event-free year gained is US$ 198.12. The incremental cost of intensive management in the controlled group is US$ 89.54 per patient, the event-free time gained is 0.68 years, and the life years gained are 1.12 years. The incremental cost per event-free year gained is US$ 223.34. The EuroQoL difference between the groups is found to be 64.04. The cost-effectiveness ratio is found to be US$ 1.64 per unit of effect in the case of controlled diabetes and US$ 1.69 per unit of effect in the case of uncontrolled diabetes, so management of diabetes is highly cost-effective. Costs for young type 1 diabetic patients were concentrated in the upper socioeconomic class, and cost increased with the duration of diabetes. The dietary pattern showed that macronutrient intake and cost are significantly higher in the uncontrolled group than in their counterparts. Proper management and control of diabetes can decrease the cost of care in the long term.
Keywords: cost, cost-effective, chronic diseases, diabetes care, burden, Bangladesh
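The incremental cost-effectiveness calculation quoted above follows the standard ICER formula (incremental cost divided by event-free time gained). The sketch below simply plugs in the abstract's figures; any difference from the published US$ 198.12 and US$ 223.34 values would come from adjustments, such as discounting, that the abstract does not detail.

```python
# Incremental cost-effectiveness sketch using the figures quoted in the abstract.
groups = {
    "uncontrolled diabetes, intensive management": {"incr_cost": 101.54, "event_free_years": 0.55},
    "controlled diabetes, intensive management":   {"incr_cost": 89.54,  "event_free_years": 0.68},
}

for name, g in groups.items():
    icer = g["incr_cost"] / g["event_free_years"]   # ICER = incremental cost / gain
    print(f"{name}: ICER = US$ {icer:.2f} per event-free year gained")
```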
Procedia PDF Downloads 147
1368 Causes and Impacts of Marine Heatwaves in the Bay of Bengal Region in the Recent Period
Authors: Sudhanshu Kumar, Raghvendra Chandrakar, Arun Chakraborty
Abstract:
In the ocean, temperature extremes have the potential to devastate marine habitats and ecosystems, with ensuing socioeconomic consequences. In recent years, these extreme events have become more frequent and intense globally, and their increasing trend is expected to continue in the upcoming decades. They have recently attracted the interest of the public as well as of scientific researchers, which motivates us to analyze recent marine heatwave (MHW) events in the Bay of Bengal region. We have isolated 107 MHW events (above the 90th percentile threshold) in this region of the Indian Ocean and investigated the variation in duration, intensity, and frequency of MHW events during our test period (1982-2021). Our study reveals an average of three MHW events per year in the study region, with an increasing linear trend of 1.11 MHW events per decade. In the analysis, we found that the longest MHW event lasted about 99 days, which is far longer than the average MHW event duration. The maximum intensity was 5.29°C above the climatological mean, while the mean intensity was 2.03°C. In addition, we observed net heat flux accompanied by anticyclonic eddies to be the primary cause of these events. Moreover, we concluded that these events affect sea surface height and oceanic productivity, highlighting the adverse impact of MHWs on marine ecosystems.
Keywords: marine heatwaves, global warming, climate change, sea surface temperature, marine ecosystem
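The threshold-based detection can be sketched on a synthetic daily SST series. The seasonal cycle, noise and injected warm spell below are invented, and a 5-day minimum-duration criterion is assumed, since the abstract only states the 90th percentile threshold.

```python
import numpy as np

# Synthetic daily SST series standing in for the Bay of Bengal record (1982-2021 in the
# paper); the seasonal cycle, noise level and injected warm spell are illustrative only.
rng = np.random.default_rng(3)
days = np.arange(365 * 10)
doy = days % 365
sst = 28.5 + 1.5 * np.sin(2 * np.pi * days / 365.25) + rng.normal(0, 0.3, days.size)
sst[2000:2030] += 2.5                                 # inject a 30-day warm anomaly

# Day-of-year climatological mean and 90th-percentile threshold.
clim_mean = np.array([sst[doy == d].mean() for d in range(365)])
clim_p90 = np.array([np.percentile(sst[doy == d], 90) for d in range(365)])
exceed = sst > clim_p90[doy]

# Group consecutive exceedance days into events, keeping spells of at least 5 days.
events, start = [], None
for i, hot in enumerate(exceed):
    if hot and start is None:
        start = i
    elif not hot and start is not None:
        if i - start >= 5:
            events.append((start, i - 1))
        start = None
if start is not None and len(exceed) - start >= 5:
    events.append((start, len(exceed) - 1))

for s, e in events:
    intensity = (sst[s:e + 1] - clim_mean[doy[s:e + 1]]).max()   # degrees C above climatological mean
    print(f"MHW: days {s}-{e}, duration {e - s + 1} d, max intensity {intensity:.2f} C")
```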
Procedia PDF Downloads 123
1367 Aging and Falls Profile from Hospital Databases
Authors: Nino Chikhladze, Tamar Dochviri, Nato Pitskhelauri, Maia Bitskhinashvili
Abstract:
Population aging is a key social and demographic trend of the 21st century. Falls represent a prevalent geriatric syndrome that poses significant risks to the health and independence of older adults. The World Health Organization notes a lack of comprehensive data on falls in low- and middle-income countries, complicating the creation of effective prevention programs. To the authors’ best knowledge, no such studies have been conducted in Georgia. The aim of the study is to explore the epidemiology of falls in the elderly population. The hospitalization database of the National Center for Disease Control and Public Health of Georgia was used for this retrospective study. Fall-related injuries were identified using the ICD-10 classification: class XIX (S and T codes) for the type of injury and class XX (V-Y codes) for the external cause. Statistical data analyses were done using SPSS software version 23.0. The total number of fall-related hospitalizations for individuals aged 65 and older from 2015 to 2021 was 29,697. The study revealed that falls accounted for an average of 63% (ranging from 59% to 66%) of all hospitalizations and 68% (ranging from 65% to 70%) of injury-related hospitalizations during this period. Of all patients, 69% were women and 31% men (Chi²=4482.1, p<0.001). The highest rates of hospitalization were in the age groups 80-84 and 75-79. The probability of fall-related hospitalization was significantly higher in women (p<0.001) compared to men in all age groups except 65-69 years. In the target age group of 65 years and older, the probability of hospitalization increased significantly with increasing age (p<0.001). The study’s results can be leveraged to create evidence-based awareness programs, design targeted multi-domain interventions addressing specific risk factors, and enhance the quality of geriatric healthcare services in Georgia.
Keywords: elderly population, falls, geriatric patients, hospitalization, injuries
Procedia PDF Downloads 28
1366 Development of Trigger Tool to Identify Adverse Drug Events From Warfarin Administered to Patient Admitted in Medical Wards of Chumphae Hospital
Authors: Puntarikorn Rungrattanakasin
Abstract:
Objectives: To develop a trigger tool to warn about the risk of bleeding as an adverse event of warfarin use during admission to the medical wards of Chumphae Hospital. Methods: A retrospective study was performed by reviewing the medical records of patients admitted between June 1st, 2020 and May 31st, 2021. ADEs were evaluated by Naranjo’s algorithm. The international normalized ratio (INR) and events of bleeding during admissions were collected. Statistical analyses, including the chi-square test and a Receiver Operating Characteristic (ROC) curve for the optimal INR threshold, were used for the study. Results: Among the 139 admissions, the INR was found to vary between 0.86 and 14.91; there was a total of 15 bleeding events, of which 9 were mild and 6 were severe. Bleeding occurred whenever the INR was greater than 2.5, and this association reached statistical significance (p<0.05), in concordance with the ROC curve, which yielded 100% sensitivity and 60% specificity in the detection of a bleeding event. In this regard, an INR greater than 2.5 was considered the optimal threshold for promptly alerting to a bleeding tendency. Conclusions: An INR value greater than 2.5 (>2.5) would be an appropriate trigger tool to warn of the risk of bleeding for patients taking warfarin in Chumphae Hospital.
Keywords: trigger tool, warfarin, risk of bleeding, medical wards
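The threshold-selection step can be sketched with scikit-learn's ROC utilities and Youden's J statistic; the INR values and bleeding labels below are fabricated for illustration and are not the hospital's records.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Illustrative admission data (not the study's records): peak INR during the stay
# and whether a bleeding event was documented (1 = bleeding, 0 = none).
inr      = np.array([1.1, 1.8, 2.2, 2.4, 2.6, 2.8, 3.1, 3.6, 4.2, 5.0, 6.5, 9.0])
bleeding = np.array([0,   0,   0,   0,   0,   1,   0,   1,   1,   1,   1,   1])

fpr, tpr, thresholds = roc_curve(bleeding, inr)
youden_j = tpr - fpr                       # balance sensitivity against specificity
best = np.argmax(youden_j)
print(f"optimal INR threshold ~ {thresholds[best]:.2f} "
      f"(sensitivity {tpr[best]:.0%}, specificity {1 - fpr[best]:.0%})")
```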
Procedia PDF Downloads 148
1365 Initial Palaeotsunami and Historical Tsunami in the Makran Subduction Zone of the Northwest Indian Ocean
Authors: Mohammad Mokhtari, Mehdi Masoodi, Parvaneh Faridi
Abstract:
The history of tsunami-generating earthquakes along the Makran Subduction Zone provides evidence of the potential tsunami hazard for the whole coastal area. In comparison with other subduction zones in the world, the Makran region of southern Pakistan and southeastern Iran exhibits low seismicity. It is also one of the least studied areas of the northwest Indian Ocean with regard to tsunami studies. We present a review of studies dealing with historical tsunamis and ongoing palaeotsunami work, supported by the IGCP of UNESCO, in the Makran Subduction Zone. The historical record presented here includes about nine tsunamis in the Makran Subduction Zone, of which over 7 occurred in the eastern Makran. Tsunamis are not as common in the western Makran as in the eastern Makran, where a database of historical events exists. The historically well-documented event is the 1945 earthquake, with a moment magnitude of 8.1, and its tsunami in the western and eastern Makran. There are no details as to whether a tsunami was generated by a seismic event before 1945 off the western Makran, but several potentially large tsunamigenic events occurred in the MSZ before 1945: in 325 B.C., 1008, 1483, 1524, 1765, 1851, 1864, and 1897. Here we present new findings from a historical point of view; we would like to emphasize immediately that the area requires more intensive research investigation. As mentioned above, a palaeotsunami (geological evidence) study is now being planned, and here we present the first-phase results. From a risk point of view, the study shows, as a preliminary result, that within 20 minutes the wave reaches the Iranian as well as the Pakistani and Omani coastal zones with highly destructive tsunami waves capable of causing damaging inundation. It is important to note that the coastal areas of all states surrounding the MSZ are being developed very rapidly, so any event would have a devastating effect on this region. Although several papers have been published on modelling, seismology and tsunami deposits in the last decades, the Makran remains a forgotten subduction zone, and more data, such as the main crustal structure, fault locations, and their related parameters, are required.
Keywords: historical tsunami, Indian Ocean, Makran subduction zone, palaeotsunami
Procedia PDF Downloads 131
1364 In vitro Regeneration of Neural Cells Using Human Umbilical Cord Derived Mesenchymal Stem Cells
Authors: Urvi Panwar, Kanchan Mishra, Kanjaksha Ghosh, ShankerLal Kothari
Abstract:
Background: The increasing prevalence of neurodegenerative diseases has become a global challenge for the medical sciences. Adult neural stem cells are rare, and obtaining them from the central nervous system requires an invasive and painful procedure. Mesenchymal stem cell (MSC) therapies have shown remarkable application in the treatment of various cell injuries and cell loss. MSCs can be derived from various sources, like adult tissues, human bone marrow, umbilical cord blood and cord tissue. MSCs have similar proliferation and differentiation capabilities, but human umbilical cord-derived mesenchymal stem cells (hUCMSCs) have proved to be more beneficial with respect to cell procurement, differentiation to other cells, preservation, and transplantation. Material and method: The human umbilical cord is easily obtainable and non-controversial compared to bone marrow and other adult tissues. The umbilical cord can be collected after delivery of the baby, and its tissue can be cultured using the explant culture method. Culture media, DMEM/F12 + 10% FBS and DMEM/F12 + neural growth factors (bFGF, human noggin, B27), with antibiotics (streptomycin/gentamycin), were used to culture mesenchymal stem cells and to differentiate them into neural cells, respectively. Characterisation of the MSCs was done by flow cytometry for the surface markers CD90, CD73 and CD105 and by a colony-forming unit assay. The differentiated neural cells will be characterised by fluorescence markers for neurons, astrocytes, and oligodendrocytes; quantitative PCR for the genes Nestin and NeuroD1; and the Western blotting technique for the GAP43 protein. Result and discussion: High-quality MSCs were isolated in large numbers from the human umbilical cord via the explant culture method. The obtained MSCs were differentiated into neural cells like neurons, astrocytes and oligodendrocytes. The differentiated neural cells can be used to treat neural injuries and neural cell loss by delivering cells through non-invasive administration via cerebrospinal fluid (CSF) or blood. Moreover, the MSCs can also be delivered directly to different injured sites, where they differentiate into neural cells. Therefore, the human umbilical cord is demonstrated to be an inexpensive and easily available source of MSCs. Moreover, hUCMSCs can be a potential source for neural cell therapies and neural cell regeneration for neural cell injuries and neural cell loss. This new line of research will be helpful in treating and managing neural cell damage and neurodegenerative diseases like Alzheimer’s and Parkinson’s. The study still has a long way to go, but it is a promising approach for many neural disorders for which no satisfactory management is currently available.
Keywords: bone marrow, cell therapy, explant culture method, flow cytometer, human umbilical cord, mesenchymal stem cells, neurodegenerative diseases, neuroprotective, regeneration
Procedia PDF Downloads 202
1363 Dynamic EEG Desynchronization in Response to Vicarious Pain
Authors: Justin Durham, Chanda Rooney, Robert Mather, Mickie Vanhoy
Abstract:
The psychological construct of empathy is to understand a person’s cognitive perspective and experience the other person’s emotional state. Deciphering emotional states is conducive to interpreting vicarious pain. The study addresses empathy as a nonlinear dynamic process of simulation that allows individuals to understand the mental states of others and experience vicarious pain, exhibiting self-organized criticality. Such criticality follows from a combination of neural networks with an excitatory feedback loop generating bistability to resonate permutated empathy. Cortical networks exhibit diverse patterns of activity, including oscillations, synchrony and waves; however, the temporal dynamics of the neurophysiological activities underlying empathic processes remain poorly understood. Mu rhythms are EEG oscillations with dominant frequencies of 8-13 Hz that become synchronized when the body is relaxed with eyes open and the sensorimotor system is idle; thus, mu rhythm synchrony is expected to be highest in baseline conditions. When the sensorimotor system is activated, either by performing or simulating action, mu rhythms become suppressed or desynchronized; thus, they should be suppressed while observing video clips of painful injuries if previous research on mirror system activation holds. Twelve undergraduates contributed EEG data and survey responses to empathy and psychopathy scales, in addition to watching consecutive video clips of sports injuries. Participants watched a blank, black image on a computer monitor before and after observing a video of consecutive sports injury incidents. Each video condition lasted five minutes. A BIOPAC MP150 recorded EEG signals from sensorimotor and thalamocortical regions related to a complex neural network called the ‘pain matrix’. Physical and social pain activate this network, resonating vicarious pain responses in the processing of empathy. Five single-electrode EEG locations were applied over regions measuring sensorimotor electrical activity in microvolts (μV) to monitor mu rhythms. EEG signals were sampled at a rate of 200 Hz. Mu rhythm desynchronization was measured in the 8-13 Hz band at electrode sites F3 and F4. Data for each participant’s mu rhythms were analyzed via Fast Fourier Transformation (FFT) and multifractal time series analysis.
Keywords: desynchronization, dynamical systems theory, electroencephalography (EEG), empathy, multifractal time series analysis, mu waveform, neurophysiology, pain simulation, social cognition
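The band-power side of this analysis can be sketched with Welch's method (an averaged, FFT-based spectral estimate) at the reported 200 Hz sampling rate. The synthetic baseline and observation signals below are invented stand-ins for the F3/F4 recordings, and the multifractal analysis is not shown.

```python
import numpy as np
from scipy.signal import welch

FS = 200                      # Hz, the sampling rate reported in the abstract
rng = np.random.default_rng(5)

def mu_band_power(eeg, fs=FS, band=(8.0, 13.0)):
    """Estimate mu-band power of one EEG channel via Welch's PSD."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])   # integrate the 8-13 Hz band

# Synthetic signals (illustrative, not participant data): a 10 Hz mu rhythm embedded
# in noise at baseline, attenuated while observing the injury clips.
t = np.arange(0, 60, 1 / FS)
baseline = 8.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 4, t.size)
observe  = 3.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 4, t.size)

p_base, p_obs = mu_band_power(baseline), mu_band_power(observe)
desync = 100 * (p_base - p_obs) / p_base   # percent mu desynchronization
print(f"baseline mu power {p_base:.1f}, observation {p_obs:.1f}, desynchronization {desync:.0f}%")
```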
Procedia PDF Downloads 283
1362 Pharmacovigilance: An Empowerment in Safe Utilization of Pharmaceuticals
Authors: Pankaj Prashar, Bimlesh Kumar, Ankita Sood, Anamika Gautam
Abstract:
Pharmacovigilance (PV) has become a rapidly growing discipline in the pharmaceutical industry over the past few decades, as an integral part of clinical research and drug development. PV spans a broad scope, from drug manufacturing to regulation and safer utilization. The fundamental steps of PV include not only data collection and verification, coding of drugs and adverse drug reactions, causality assessment and timely reporting to the authorities, but also monitoring of drug manufacturing, safety issues and product quality, and the conduct of due diligence. Standardization of adverse event information, collaboration among multiple departments in different companies, and preparation of documents in accordance with both governmental and non-governmental bodies (FDA, EMA, GVP, ICH) are the advancements in the discipline of PV. De-harmonization, the lack of predictive drug safety models, inadequate government funding, non-reporting and non-acceptance of ADRs by developing countries, and reports made directly from patients to the monitoring centres are the major drawbacks of PV. Mandatory pharmacovigilance reporting, frequent inspections, government funding, and educating and training medical students, pharmacists and nurses in this segment can bring about empowerment in PV. This area needs to be addressed with a sense of urgency for the safe utilization of pharmaceuticals.
Keywords: pharmacovigilance, regulatory, adverse event, drug safety
Procedia PDF Downloads 1241361 Performance Comparison of Thread-Based and Event-Based Web Servers
Authors: Aikaterini Kentroti, Theodore H. Kaskalis
Abstract:
Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we experimentally evaluate and compare the performance and resource utilization of popular web servers that differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, or 30th) and the contents of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached twice the throughput of Node.js and five times that of Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are asked to respond by performing CPU-intensive tasks under increasing concurrent load.Keywords: apache, Go, Nginx, node.js, web server benchmarking
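A minimal sketch of the kind of CPU-intensive benchmark described above: concurrent clients repeatedly requesting a Fibonacci term from a server endpoint while throughput and mean latency are measured. The URL, client count, and measurement approach are illustrative assumptions, not the authors' benchmark harness.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://localhost:8080/fib/20"   # hypothetical endpoint returning the 20th Fibonacci term
CONCURRENT_CLIENTS = 50
REQUESTS_PER_CLIENT = 20

def client(_):
    """One simulated user issuing sequential requests and timing each one."""
    latencies = []
    for _ in range(REQUESTS_PER_CLIENT):
        start = time.perf_counter()
        with urlopen(URL) as resp:
            resp.read()
        latencies.append(time.perf_counter() - start)
    return latencies

if __name__ == "__main__":
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=CONCURRENT_CLIENTS) as pool:
        all_latencies = [lat for lats in pool.map(client, range(CONCURRENT_CLIENTS))
                         for lat in lats]
    elapsed = time.perf_counter() - start
    total = CONCURRENT_CLIENTS * REQUESTS_PER_CLIENT
    print(f"throughput: {total / elapsed:.1f} req/s, "
          f"mean latency: {1000 * sum(all_latencies) / total:.1f} ms")
```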
Procedia PDF Downloads 971360 C-Spine Imaging in a Non-trauma Centre: Compliance with NEXUS Criteria Audit
Authors: Andrew White, Abigail Lowe, Kory Watkins, Hamed Akhlaghi, Nicole Winter
Abstract:
The timing and appropriateness of diagnostic imaging are critical to the evaluation and management of traumatic injuries. Within the subclass of trauma patients, the prevalence of c-spine injury is less than 4%. However, the incidence of delayed diagnosis within this cohort has been documented at up to 20%, with inadequate radiological examination the most commonly cited issue. To identify patients in whom c-spine injury cannot be fully excluded on clinical examination alone and who should therefore undergo diagnostic imaging, a set of criteria is used to provide clinical guidance. The NEXUS (National Emergency X-Radiography Utilisation Study) criteria are a validated clinical decision-making tool used to facilitate selective c-spine radiography. The criteria allow clinicians to determine whether cervical spine imaging can be safely avoided in appropriate patients. The NEXUS criteria are widely used within the Emergency Department setting given their ease of use and relatively straightforward application, and they are adopted in the Victorian State Trauma System’s guidelines. This audit used retrospective data collection to examine the concordance of c-spine imaging in trauma patients with the NEXUS criteria and to assess compliance with state guidance on diagnostic imaging in trauma. Of the 183 patients who presented with trauma to the head, neck, or face (244 were excluded due to incorrect triage), 98 did not undergo imaging of the c-spine. Of those 98, 44% fulfilled at least one of the NEXUS criteria, meaning the c-spine could not be clinically cleared under the current guidelines. The criterion most commonly met was intoxication, comprising 42% (18 of 43), with midline spinal tenderness (or absence of documentation of this) the second most common at 23% (10 of 43). Intoxication being the most commonly met criterion is significant but not unexpected given the cohort of patients seen at St Vincent’s and within many emergency departments in general. Given that these patients will always meet the NEXUS criteria, an element of clinical judgment is likely needed, or concurrent use of the Canadian C-Spine Rule, to exclude the need for imaging. Midline tenderness was often recorded as met in the context of poor or absent documentation, emphasizing the importance of clear and accurate assessments. A distracting injury was identified in 7 of the 43 patients; however, only one of these patients exhibited a thoracic injury (a T11 compression fracture), with the remainder comprising injuries to the extremities. Some studies suggest that c-spine imaging may not be required in the evaluable blunt trauma patient despite distracting injuries in body regions that do not involve the upper chest, which emphasizes the need for standardized definitions of distracting injury, at least at a departmental or regional level. The data highlight the currently poor application of the NEXUS guidelines, with likely common themes across emergency departments, underscoring the need for further education on implementation and potential refinement or clarification of the criteria. Of note, there appeared to be no significant difference between levels of clinical experience in the rate of inappropriately clearing the c-spine against the guidelines.Keywords: imaging, guidelines, emergency medicine, audit
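For readers unfamiliar with the rule, a minimal sketch of the NEXUS low-risk logic as it might be applied during such an audit: if any of the five criteria is present, the c-spine cannot be cleared clinically and imaging is indicated. The function and field names are illustrative, not the audit's actual tooling.

```python
from dataclasses import dataclass

@dataclass
class NexusAssessment:
    """The five NEXUS criteria documented for a trauma patient."""
    midline_cervical_tenderness: bool
    focal_neurological_deficit: bool
    altered_level_of_alertness: bool
    intoxication: bool
    painful_distracting_injury: bool

def imaging_indicated(a: NexusAssessment) -> bool:
    """Imaging is indicated if any NEXUS criterion is met;
    only when all five are absent can the c-spine be cleared clinically."""
    return any([a.midline_cervical_tenderness,
                a.focal_neurological_deficit,
                a.altered_level_of_alertness,
                a.intoxication,
                a.painful_distracting_injury])

# Example reflecting the audit's most common finding: an intoxicated patient
# always meets at least one criterion, so imaging is indicated.
print(imaging_indicated(NexusAssessment(False, False, False, True, False)))  # True
```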
Procedia PDF Downloads 721359 Two-Way Reminder Systems to Support Activities of Daily Living for Adults with Cognitive Impairments: A Scoping Review
Authors: Julia Brudzinski, Ashley Croswell, Jade Mardin, Hannah Shilling, Jennifer Berg-Carnegie
Abstract:
Adults with brain injuries and mental illnesses commonly experience cognitive impairments that interfere with their participation in activities of daily living (ADLs). Prior research states that electronic reminder systems can support adults with cognitive impairments; however, previous studies focus primarily on one-way reminder systems. Research on adults with chronic diseases reported that two-way reminder systems yield better health outcomes and disease self-management compared to one-way reminder systems. Literature was identified through systematically searching 7 databases and hand-searching relevant reference lists. Retrieved studies were independently screened and reviewed by at least two members of the research team. Data was extracted on study design, participant characteristics, intervention details, study objectives, outcome measures, and important results. 574 articles were screened and reviewed. Nine articles met all inclusion criteria and were included. The literature focused on three main areas: system feasibility (n=8), stakeholder satisfaction (n=6), and efficacy of the two-way reminder systems (n=6). Participants in eight of the studies had brain injuries, with participants in only one study having a mental illness (i.e., schizophrenia). Two-way reminder systems were used to support participation in a wide range of ADLs. The current literature on two-way reminder systems to support ADLs for adults with cognitive impairments focuses on feasibility, stakeholder satisfaction, and system efficacy. Future research should focus on addressing the barriers to accessing and implementing two-way reminder systems and identifying specific client characteristics that would benefit most from using these systems.Keywords: brain injury, digital health, occupational therapy, activities of daily living, two-way reminder systems
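To make the one-way versus two-way distinction concrete, a minimal sketch of a two-way reminder loop: send a prompt, wait for the user's confirmation, and escalate if none arrives. The timings, channel, and escalation step are illustrative assumptions rather than a description of any system reviewed above.

```python
import time

def send_reminder(task: str) -> None:
    print(f"Reminder: time to {task}. Reply 'done' to confirm.")

def wait_for_confirmation(timeout_s: float) -> bool:
    """Placeholder for receiving a user reply (SMS, app notification, etc.).
    Returns True if the user confirmed within the timeout."""
    time.sleep(timeout_s)   # a real system would poll or await an incoming message
    return False            # assume no reply, for the sake of the example

def two_way_reminder(task: str, retries: int = 2, timeout_s: float = 1.0) -> None:
    for _ in range(retries + 1):
        send_reminder(task)
        if wait_for_confirmation(timeout_s):
            print("Confirmed - no further action.")
            return
    # A one-way system would stop after the first send; the two-way loop escalates.
    print(f"No confirmation for '{task}' - notifying a caregiver.")

two_way_reminder("take your medication")
```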
Procedia PDF Downloads 741358 Explanatory Variables for Crash Injury Risk Analysis
Authors: Guilhermina Torrao
Abstract:
An extensive number of studies have been conducted to determine the factors that influence crash injury risk (CIR); however, the uncertainties inherent in the selected variables have been neglected. A review of the existing literature is required not only to obtain an overview of the variables and measures but also to ascertain the implications of comparing studies without a systematic view of variable taxonomy. Therefore, the aim of this literature review is to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of the broad variation in variable selection in CIR analysis. The objective of this study is to demonstrate the variance in variable selection and classification when modeling injury risk for occupants of light vehicles by presenting an analytical review of the literature. Based on data collected from 64 journal publications reported over the past 21 years, the analytical review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries sustained by occupants of light vehicles. A cross-comparison analysis demonstrates that almost half of the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, for those that did, vehicle age/model year was the most frequently selected explanatory variable, used by 41% of the studies. For those studies that included a speed risk factor in their analyses, the majority (64%) used the legal speed limit as a ‘proxy’ for vehicle speed at the moment of the crash, imposing limitations on CIR analysis and modeling. Despite the proven effectiveness of airbags in mitigating injury severity following a crash, only 22% of studies included airbag deployment data. A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and to identify opportunities for improvement in future studies in the field of road injuries.Keywords: crash, exploratory, injury, risk, variables, vehicle
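As an illustration of how such explanatory variables are typically combined in CIR modeling, the sketch below fits a logistic regression of a binary injury outcome on several of the predictors discussed (vehicle weight, model year, posted speed limit as a speed proxy, airbag deployment) using synthetic data; the model form, variable names, and coefficients are assumptions for illustration, not the specification or results of any reviewed study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 1000

# Synthetic explanatory variables of the kind discussed in the review
vehicle_weight_kg = rng.normal(1400, 250, n)
model_year = rng.integers(1995, 2020, n)
speed_limit_kmh = rng.choice([50, 60, 80, 100, 110], n)   # proxy for crash speed
airbag_deployed = rng.integers(0, 2, n)

X = np.column_stack([vehicle_weight_kg, model_year, speed_limit_kmh, airbag_deployed])

# Synthetic injury outcome loosely tied to speed and airbag deployment
logit = 0.03 * (speed_limit_kmh - 80) - 0.8 * airbag_deployed
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
coefs = model.named_steps["logisticregression"].coef_[0]
for name, coef in zip(["weight", "model_year", "speed_limit", "airbag"], coefs):
    print(f"{name}: {coef:+.3f}")
```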
Procedia PDF Downloads 1351357 Purity Monitor Studies in Medium Liquid Argon TPC
Authors: I. Badhrees
Abstract:
This paper is an attempt to describe some of the results found over the course of a study in the field of particle physics. The study consists of two parts: one concerning the measurement of the cross section of the decay of the Z particle into two electrons, and the other concerning the measurement of the cross section of the multi-photon absorption process using a laser beam in a Liquid Argon Time Projection Chamber. The first part of the paper presents results based on the analysis of a data sample containing 8120 ee candidates, used to reconstruct the mass of the Z particle for each event, where each event has an ee pair with PT(e) > 20 GeV and η(e) < 2.5. Monte Carlo templates of the reconstructed Z particle were produced as a function of the Z mass scale. The distribution of the reconstructed Z mass in the data was compared to the Monte Carlo templates, and the total cross section was calculated to be 1432 pb. The second part concerns the Liquid Argon Time Projection Chamber (LAr TPC) and the interaction of a UV Nd:YAG laser (λ = 266 nm) with LAr, studied through the multi-photon ionization process as part of the R&D at Bern University. The main result of this study was the cross section of the multi-photon ionization process in LAr, σe = (1.24 ± 0.10 (stat) ± 0.30 (sys)) × 10⁻⁵⁶ cm⁴.Keywords: ATLAS, CERN, KACST, LArTPC, particle physics
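A minimal sketch of the dielectron invariant-mass reconstruction implied above, using the standard massless-particle approximation m² ≈ 2·pT1·pT2·(cosh Δη − cos Δφ); the example kinematics are made up for illustration, and the η cut is interpreted here as a cut on |η|.

```python
import math

def dielectron_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass (GeV) of an e+e- pair, neglecting the electron mass."""
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(max(m2, 0.0))

def passes_selection(pt, eta, pt_cut=20.0, eta_cut=2.5):
    """Single-electron selection quoted in the abstract: pT > 20 GeV, |eta| < 2.5."""
    return pt > pt_cut and abs(eta) < eta_cut

# Hypothetical candidate event (pT in GeV, eta and phi dimensionless)
pt1, eta1, phi1 = 45.0, 0.3, 0.1
pt2, eta2, phi2 = 42.0, -0.5, 2.9
if passes_selection(pt1, eta1) and passes_selection(pt2, eta2):
    print(f"m_ee = {dielectron_mass(pt1, eta1, phi1, pt2, eta2, phi2):.1f} GeV")
```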
Procedia PDF Downloads 346