Search results for: fast handover
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1907

17 An Innovation Decision Process View in an Adoption of Total Laboratory Automation

Authors: Chia-Jung Chen, Yu-Chi Hsu, June-Dong Lin, Kun-Chen Chan, Chieh-Tien Wang, Li-Ching Wu, Chung-Feng Liu

Abstract:

With fast advances in healthcare technology, various total laboratory automation (TLA) processes have been proposed. However, adopting TLA requires considerable funding. This study explores an early adoption experience by a large-scale hospital group in Taiwan, the Chimei Hospital Group (CMG), which owns three branch hospitals (Yongkang, Liouying and Chiali, in order of service scale), based on the five stages of Everett Rogers' innovation-decision process. 1. Knowledge stage: Over the years, two weaknesses existed in the laboratory department of CMG: 1) only a few examination categories (e.g., sugar testing and HbA1c) could be completed and reported within a day during an outpatient clinical visit; 2) the Yongkang Hospital laboratory space was dispersed across three buildings, resulting in duplicated investment in analysis instruments and inconvenient manual specimen transportation. Thus, the senior management of the department raised a crucial question: was it time to proceed with a redesign of the laboratory department? 2. Persuasion stage: At the end of 2013, Yongkang Hospital's new building and restructuring project created a great opportunity for the redesign of the laboratory department. However, not all laboratory colleagues agreed on the change. Thus, the top managers arranged a series of benchmark visits to help colleagues become aware of and accept TLA. Later, the director of the department submitted a formal report to the top management of CMG with the results of the benchmark visits, a preliminary feasibility analysis, potential benefits and so on. 3. Decision stage: The TLA suggestion was well supported by the top management of CMG, and they finally decided to carry out the project with an instrument-leasing strategy. After the announcement of a request for proposal and several vendor briefings, CMG confirmed their laboratory automation architecture and finally completed the contracts. At the same time, a cross-department project team was formed, and the laboratory department assigned a section leader to the National Taiwan University Hospital for one month of relevant training. 4. Implementation stage: During the implementation, the project team called regular meetings to review the results of the operations and to respond immediately with adjustments. The main project tasks included: 1) completion of the preparatory work for beginning the automation procedures; 2) ensuring information security and privacy protection; 3) formulating automated examination process protocols; 4) evaluating the performance of new instruments and instrument connectivity; 5) ensuring good integration with hospital information systems (HIS)/laboratory information systems (LIS); and 6) ensuring continued compliance with ISO 15189 certification. 5. Confirmation stage: In short, the core process changes include: 1) cancellation of signature seals on the specimen tubes; 2) transfer of daily examination reports to a data warehouse; 3) incorporation of routine pre-admission blood drawing and formal inpatient morning blood drawing into an automatically prepared tube mechanism. The study summarizes the following continuous improvement orientations: (1) flexible reference range set-up for new instruments in the LIS; (2) restructuring of the specimen categories; (3) continuous review and improvement of the examination process; (4) further evaluation of whether to install tube (specimen) delivery tracks.

Keywords: innovation decision process, total laboratory automation, health care

Procedia PDF Downloads 418
16 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA. Accordingly, identifying the tissue origin of these DNA fragments from the plasma can result in more accurate and fast disease diagnosis and precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to high cost and long turnaround time. To overcome these limitations, the idea of predicting OCRs from WGS is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to our proposed algorithm, which then predicts the most probable open chromatin regions from whole genome sequencing data. Our method integrates a signal processing approach with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering. To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions related to human blood samples in the ATAC-DB database. The percentage of overlap between predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. OCRs are mostly located at the transcription start sites (TSS) of genes. In this regard, we compared the concordance between the predicted OCRs and human gene TSS regions obtained from refTSS, which showed accordance of around 52.04% with all genes and ~78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to the existence of several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, which led to a tool named OCRDetector with some restrictions, such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and the consideration of multiple features. In contrast, we implemented a graph signal clustering based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
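
As an illustration of this kind of pipeline, the sketch below (not the authors' implementation; the window size, number of retained DFT components, correlation threshold, and the final degree-based split are all illustrative assumptions) runs depth windows through count normalization, a Discrete Fourier Transform, a correlation graph, and a crude two-way clustering into OCR+/OCR- candidates.

```python
# Minimal sketch: open-chromatin-like window prediction from cfDNA sequencing depth,
# loosely following the described steps: normalization -> DFT -> correlation graph -> clustering.
import numpy as np

def predict_ocr_windows(depth, window=500, corr_threshold=0.6):
    # 1) Split the per-base depth track into fixed-size windows and normalize counts.
    n_win = len(depth) // window
    mat = depth[: n_win * window].reshape(n_win, window).astype(float)
    mat = (mat - mat.mean(axis=1, keepdims=True)) / (mat.std(axis=1, keepdims=True) + 1e-9)

    # 2) Represent each window by the magnitude of its low-frequency DFT components.
    spectra = np.abs(np.fft.rfft(mat, axis=1))[:, :32]

    # 3) Build a correlation graph between windows and keep strong edges only.
    corr = np.corrcoef(spectra)
    adj = (corr > corr_threshold).astype(int)
    np.fill_diagonal(adj, 0)

    # 4) Split windows into two groups (OCR+ / OCR-) by thresholding graph degree,
    #    a crude stand-in for the graph-cut / correlation-clustering step in the paper.
    degree = adj.sum(axis=1)
    labels = (degree > np.median(degree)).astype(int)  # 1 ~ candidate OCR+, 0 ~ OCR-
    return labels

# Toy usage with synthetic depth data.
rng = np.random.default_rng(0)
toy_depth = rng.poisson(30, size=100_000)
print(predict_ocr_windows(toy_depth)[:20])
```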

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 148
15 Contactless Heart Rate Measurement System based on FMCW Radar and LSTM for Automotive Applications

Authors: Asma Omri, Iheb Sifaoui, Sofiane Sayahi, Hichem Besbes

Abstract:

Future vehicle systems demand advanced capabilities, notably in-cabin life detection and driver monitoring systems, with a particular emphasis on drowsiness detection. To meet these requirements, several techniques employ artificial intelligence methods based on real-time vital sign measurements. In parallel, Frequency-Modulated Continuous-Wave (FMCW) radar technology has garnered considerable attention in the domains of healthcare and biomedical engineering for non-invasive vital sign monitoring. FMCW radar offers a multitude of advantages, including its non-intrusive nature, continuous monitoring capacity, and its ability to penetrate through clothing. In this paper, we propose a system utilizing the AWR6843AOP radar from Texas Instruments (TI) to extract precise vital sign information. The radar allows us to estimate Ballistocardiogram (BCG) signals, which capture the mechanical movements of the body, particularly the ballistic forces generated by heartbeats and respiration. These signals are rich sources of information about the cardiac cycle, rendering them suitable for heart rate estimation. The process begins with real-time subject positioning, followed by clutter removal, computation of Doppler phase differences, and the use of various filtering methods to accurately capture subtle physiological movements. To address the challenges associated with FMCW radar-based vital sign monitoring, including motion artifacts due to subjects' movement or radar micro-vibrations, Long Short-Term Memory (LSTM) networks are implemented. LSTM's adaptability to different heart rate patterns and ability to handle real-time data make it suitable for continuous monitoring applications. Several crucial steps were taken, including feature extraction (involving amplitude, time intervals, and signal morphology), sequence modeling, heart rate estimation through the analysis of detected cardiac cycles and their temporal relationships, and performance evaluation using metrics such as Root Mean Square Error (RMSE) and correlation with reference heart rate measurements. For dataset construction and LSTM training, a comprehensive data collection system was established, integrating the AWR6843AOP radar, a heart rate belt, and a smartwatch for ground truth measurements. Rigorous synchronization of these devices ensured data accuracy. Twenty participants engaged in various scenarios, encompassing indoor and real-world conditions within a moving vehicle equipped with the radar system. Static and dynamic subject conditions were considered. The heart rate estimation through LSTM outperforms traditional signal processing techniques that rely on filtering, Fast Fourier Transform (FFT), and thresholding. It delivers an average accuracy of approximately 91% with an RMSE of 1.01 beats per minute (bpm). In conclusion, this paper underscores the promising potential of FMCW radar technology integrated with artificial intelligence algorithms in the context of automotive applications. This innovation not only enhances road safety but also paves the way for its integration into the automotive ecosystem to improve driver well-being and overall vehicular safety.
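
A minimal sketch of the LSTM regression stage is given below. It is not the authors' model: the feature dimension, window length, network size, and training loop are illustrative assumptions; it only shows how a window of radar-derived BCG features could be mapped to a heart-rate value and scored with RMSE.

```python
# Minimal sketch: LSTM regressor from radar-derived BCG feature sequences to heart rate (bpm).
import torch
import torch.nn as nn

class HeartRateLSTM(nn.Module):
    def __init__(self, n_features=8, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # regress a single bpm value per window

    def forward(self, x):                  # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # use the last time step's hidden state

# Toy training step on synthetic data; RMSE is the evaluation metric cited in the abstract.
model = HeartRateLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 200, 8)               # 32 windows, 200 radar frames, 8 features each
y = torch.empty(32, 1).uniform_(55, 95)   # reference heart rates from the chest belt
pred = model(x)
loss = torch.sqrt(nn.functional.mse_loss(pred, y))  # RMSE loss
loss.backward(); opt.step()
print(f"toy RMSE: {loss.item():.2f} bpm")
```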

Keywords: ballistocardiogram, FMCW Radar, vital sign monitoring, LSTM

Procedia PDF Downloads 72
14 Microfluidic Plasmonic Device for the Sensitive Dual LSPR-Thermal Detection of the Cardiac Troponin Biomarker in Laminar Flow

Authors: Andreea Campu, Ilinica Muresan, Simona Cainap, Simion Astilean, Monica Focsan

Abstract:

Acute myocardial infarction (AMI) is the most severe cardiovascular disease and has threatened human lives for decades; thus, continuous interest is directed towards the detection of cardiac biomarkers such as cardiac troponin I (cTnI) in order to predict risk and, implicitly, fulfill the early diagnosis requirements in AMI settings. Microfluidics is a major technology involved in the development of efficient sensing devices with real-time fast responses and on-site applicability. Microfluidic devices have gathered a lot of attention recently due to their advantageous features, such as high sensitivity and specificity, miniaturization and portability, ease of use, low cost, facile fabrication, and reduced sample manipulation. The integration of gold nanoparticles into the structure of microfluidic sensors has led to the development of highly effective detection systems, considering the unique properties of metallic nanostructures, specifically the Localized Surface Plasmon Resonance (LSPR), which makes them highly sensitive to their microenvironment. In this scientific context, we propose herein a novel detection device, which successfully combines the efficiency of gold bipyramids (AuBPs) as signal transducers and thermal generators with the sample-handling advantages of microfluidic channels into a miniaturized, portable, low-cost, specific, and sensitive test for dual LSPR-thermographic cTnI detection. Specifically, AuBPs with a longitudinal LSPR response at 830 nm were chemically synthesized using the seed-mediated growth approach and characterized in terms of optical and morphological properties. Further, the colloidal AuBPs were deposited onto pre-treated silanized glass substrates; a uniform nanoparticle coverage of the substrate was thus obtained and confirmed by extinction measurements showing a 43 nm blue-shift of the LSPR response as a consequence of the refractive index change. The as-obtained plasmonic substrate was then integrated into a microfluidic "Y"-shaped polydimethylsiloxane (PDMS) channel, fabricated using a laser cutter system. Both plasmonic and microfluidic elements were plasma treated in order to achieve a permanent bond. The as-developed microfluidic plasmonic chip was further coupled to an automated syringe pump system. The proposed biosensing protocol involves successive injections inside the microfluidic channel as follows: p-aminothiophenol and glutaraldehyde, to achieve a covalent bond between the metallic surface and the cTnI antibody; anti-cTnI, as a recognition element; and the target cTnI biomarker. The successful functionalization and capture of cTnI were monitored by LSPR detection; after each step, a red-shift of the optical response was recorded. Furthermore, as an innovative detection technique, thermal determinations were made after each injection by exposing the microfluidic plasmonic chip to 785 nm laser excitation, considering that AuBPs exhibit high light-to-heat conversion performance. By analysis of the thermographic images, thermal curves were obtained, showing a decrease in the thermal efficiency after the anti-cTnI-cTnI reaction took place. Thus, we developed a microfluidic plasmonic chip able to operate as both an LSPR and a thermal sensor for the detection of the cardiac troponin I biomarker, thus advancing diagnostic devices.
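
For illustration only (synthetic spectra, not the reported measurements), the sketch below shows how the LSPR peak wavelength can be extracted from successive extinction spectra and the red-shift after a binding step computed.

```python
# Minimal sketch: track the LSPR peak shift between extinction spectra recorded
# before and after a functionalization / binding step. Data here are synthetic.
import numpy as np

def lspr_peak(wavelengths, extinction):
    # Refine the argmax with a local quadratic fit for sub-nanometer peak estimation.
    i = int(np.argmax(extinction))
    lo, hi = max(i - 2, 0), min(i + 3, len(wavelengths))
    a, b, c = np.polyfit(wavelengths[lo:hi], extinction[lo:hi], 2)
    return -b / (2 * a)

wl = np.linspace(700, 950, 501)
before = np.exp(-((wl - 830.0) / 40) ** 2)            # bare AuBP substrate, peak ~830 nm
after = np.exp(-((wl - 838.5) / 40) ** 2)             # after anti-cTnI / cTnI binding
shift = lspr_peak(wl, after) - lspr_peak(wl, before)
print(f"LSPR red-shift: {shift:.1f} nm")
```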

Keywords: gold nanobipyramids, microfluidic device, localized surface plasmon resonance detection, thermographic detection

Procedia PDF Downloads 128
13 Measuring the Biomechanical Effects of Worker Skill Level and Joystick Crane Speed on Forestry Harvesting Performance Using a Simulator

Authors: Victoria L. Chester, Usha Kuruganti

Abstract:

The forest industry is a major economic sector of Canada and also one of the most dangerous industries for workers. The use of mechanized mobile forestry harvesting machines has successfully reduced the incidence of injuries in forest workers related to manual labor. However, these machines have also created additional concerns, including a steep machine operation learning curve, an increased length of the workday, repetitive strain injury, cognitive load, physical and mental fatigue, and increased postural loads due to sitting in a confined space. It is critical to obtain objective performance data for employers to develop appropriate work practices for this industry; however, ergonomic field studies of this industry are lacking, mainly due to the difficulties in obtaining comprehensive data while operators are cutting trees in the woods. The purpose of this study was to establish a measurement and experimental protocol to examine the effects of worker skill level and movement training speed (joystick crane speed) on harvesting performance using a forestry simulator. A custom wrist angle measurement device was developed as part of the study to monitor Euler angles during operation of the simulator. The device consisted of two accelerometers, a Bluetooth module, three 3 V coin cells, a microcontroller, a voltage regulator, and application software. Harvesting performance and crane data were provided by the simulator software and included tree-to-frame collisions, crane-to-tree collisions, boom tip distance, number of trees cut, etc. A pilot study of 3 operators with various skill levels was conducted to identify factors that distinguish highly skilled operators from novice or intermediate operators. Dependent variables such as reaction time, math skill, past work experience, training movement speed (e.g., joystick control speeds), harvesting experience level, muscle activity, and wrist biomechanics were measured and analyzed. A 10-channel wireless surface EMG system was used to monitor the amplitude and mean frequency of 10 upper extremity muscles pre- and post-performance on the forestry harvesting simulator. The results of the pilot study showed inconsistent changes in median frequency pre- and post-operation, but there was an increase in the activity of the flexor carpi radialis, anterior deltoid and upper trapezius of both arms. The wrist sensor results indicated that wrist supination and pronation occurred more than flexion and extension, with radial-ulnar rotation demonstrating the least movement. Overall, wrist angular motion increased as the crane speed increased from slow to fast. Further data collection is needed and will help industry partners determine the factors that separate operator skill levels, identify optimal training speeds, and determine the length of training required to bring new operators to an efficient skill level. In addition to effective employee training programs, the results of this work will be used for selective employee recruitment strategies to improve employee retention after training. Further, improved training procedures and knowledge of the physical and mental demands on workers will lead to highly trained and efficient personnel, reduced risk of injury, and optimal work protocols.
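
As an assumption about how such a wrist device could derive orientation (the abstract does not give the firmware details), the sketch below estimates roll and pitch angles from a static 3-axis accelerometer reading, the kind of Euler angle output the custom sensor described above could stream over Bluetooth.

```python
# Minimal sketch: wrist roll and pitch from static accelerometer components in g.
import math

def roll_pitch_from_accel(ax, ay, az):
    """Return (roll, pitch) in degrees, gravity-referenced, assuming the sensor is static."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# Example reading: gravity mostly on z, slight tilt toward x.
print(roll_pitch_from_accel(0.17, 0.03, 0.98))
```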

Keywords: EMG, forestry, human factors, wrist biomechanics

Procedia PDF Downloads 141
12 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop

Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen

Abstract:

Lenia is a system of cellular automata with continuous states, space and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL Interoperability. We demonstrate how CUDA as a low-level GPU programming paradigm allows optimizing performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata, which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time and states, thus providing additional fluidity and richness in emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids and over longer periods without long waiting times, thereby enabling the exploration and discovery of new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations. We developed a Lenia implementation for GPU using the C++ and CUDA programming languages, and CUDA/OpenGL Interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation against the existing ones in terms of speed, memory usage, configurability and scalability. In our comparison, we focus on the most important Lenia implementations, selected for their prominence, accessibility and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy using single versus double precision floating point arithmetic.
The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on GitHub and made freely accessible to the ALife community for further development.
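
For readers unfamiliar with the algorithm being benchmarked, the sketch below shows one Lenia update step in plain NumPy (FFT-based convolution of a ring kernel followed by a Gaussian growth mapping). It is illustrative only, unrelated to the CUDA C++ code discussed above, and uses standard example parameters (R = 13, mu = 0.15, sigma = 0.015, dt = 0.1).

```python
# Minimal sketch: one Lenia time step on a continuous-state grid via FFT convolution.
import numpy as np

def lenia_step(world, kernel_fft, dt=0.1, mu=0.15, sigma=0.015):
    # Convolve the continuous-state grid with the kernel in Fourier space.
    potential = np.real(np.fft.ifft2(np.fft.fft2(world) * kernel_fft))
    # Gaussian growth mapping into [-1, 1], then clip states back to [0, 1].
    growth = 2.0 * np.exp(-((potential - mu) ** 2) / (2 * sigma ** 2)) - 1.0
    return np.clip(world + dt * growth, 0.0, 1.0)

def ring_kernel(size=256, radius=13):
    y, x = np.ogrid[:size, :size]
    center = size // 2
    r = np.hypot(x - center, y - center) / radius
    rr = np.clip(r, 1e-6, 1 - 1e-6)
    core = np.exp(4.0 * (1.0 - 1.0 / (4.0 * rr * (1.0 - rr))))  # smooth ring profile
    core *= (r < 1)
    core /= core.sum()
    return np.fft.fft2(np.fft.ifftshift(core))                   # pre-shifted kernel FFT

rng = np.random.default_rng(0)
world = rng.random((256, 256))
kfft = ring_kernel()
for _ in range(10):
    world = lenia_step(world, kfft)
print(world.mean())
```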

Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis.

Procedia PDF Downloads 40
11 Metal-Organic Frameworks-Based Materials for Volatile Organic Compounds Sensing Applications: Strategies to Improve Sensing Performances

Authors: Claudio Clemente, Valentina Gargiulo, Alessio Occhicone, Giovanni Piero Pepe, Giovanni Ausanio, Michela Alfè

Abstract:

Volatile organic compound (VOC) emissions represent a serious risk to human health and the integrity of ecosystems, especially at high concentrations. For this reason, it is very important to continuously monitor environmental quality and develop fast and reliable portable sensors to allow on-site analysis. Chemiresistors have become promising candidates for VOC sensing owing to their ease of fabrication, the variety of suitable sensitive materials, and their simple sensing readout. A chemoresistive gas sensor is a transducer that measures the concentration of an analyte in the gas phase because its change in resistance is proportional to the amount of analyte present. The selection of the sensitive material, which interacts with the target analyte, is very important for the sensor performance. The most used VOC detection materials are metal oxides (MOx), owing to their rapid recovery, high sensitivity to various gas molecules, and easy fabrication. Their sensing performance can still be improved in terms of operating temperature, selectivity, and detection limit. Metal-organic frameworks (MOFs) have also attracted a lot of attention in the field of gas sensing due to their high porosity, high surface area, tunable morphologies, and structural variety. MOFs are generated by the self-assembly of multidentate organic ligands connecting with adjacent multivalent metal nodes via strong coordination interactions, producing stable and highly ordered crystalline porous materials with well-designed structures. However, most MOFs intrinsically exhibit low electrical conductivity. To improve this property, MOFs can be combined with organic and inorganic materials in a hybrid fashion to produce composite materials, or can be transformed into more stable structures. MOFs, indeed, can be employed as precursors of metal oxides with well-designed architectures via the calcination method. The MOF-derived MOx partially preserve the original structure, with high surface area and intrinsic open pores that act as trapping centers for gas molecules, and show a higher electrical conductivity. Core-shell heterostructures, in which the surface of a metal oxide core is completely coated by a MOF shell, forming a junction at the core-shell heterointerface, can also be synthesized. Nanocomposites in which MOF structures are intercalated with graphene-related materials can also be produced; their conductivity increases thanks to the high electron mobility of carbon materials. As MOF structures, zinc-based MOFs belonging to the ZIF family were selected in this work. Several Zn-based materials based on and/or derived from MOFs were produced, structurally characterized, and arranged in a chemoresistive architecture, also exploring the potential of different sensing-layer deposition approaches based on PLD (pulsed laser deposition) and, in the case of thermally labile materials, MAPLE (Matrix-Assisted Pulsed Laser Evaporation), to enhance adhesion to the support. The sensors were tested in a controlled-humidity chamber, allowing the concentration of ethanol, a typical analyte chosen among VOCs for a first survey, to be varied. The effect of heating the chemiresistor to improve sensing performance was also explored. Future research will focus on exploring new manufacturing processes for MOF-based gas sensors with the aim of improving sensitivity and selectivity and reducing operating temperatures.
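
As a small illustration of the chemoresistive transduction principle described above (hypothetical data, not the reported measurements), the sketch below computes the relative response of a sensing layer from a resistance-versus-time trace recorded around an ethanol exposure window.

```python
# Minimal sketch: relative chemiresistive response (delta R / R0) from a resistance trace.
import numpy as np

def chemiresistive_response(t, resistance, exposure_start, exposure_end):
    baseline = resistance[t < exposure_start].mean()            # R0 before ethanol exposure
    during = resistance[(t >= exposure_start) & (t <= exposure_end)]
    return (during.max() - baseline) / baseline                 # fractional response

t = np.linspace(0, 600, 601)                                    # seconds
r = 1e6 * (1 + 0.08 * ((t > 200) & (t < 400)))                  # synthetic 8% resistance rise
print(f"response: {chemiresistive_response(t, r, 200, 400):.2%}")
```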

Keywords: chemiresistors, gas sensors, graphene related materials, laser deposition, MAPLE, metal-organic frameworks, metal oxides, nanocomposites, sensing performance, transduction mechanism, volatile organic compounds

Procedia PDF Downloads 60
10 Design of DNA Origami Structures Using LAMP Products as a Combined System for the Detection of Extended Spectrum Beta-Lactamases

Authors: Kalaumari Mayoral-Peña, Ana I. Montejano-Montelongo, Josué Reyes-Muñoz, Gonzalo A. Ortiz-Mancilla, Mayrin Rodríguez-Cruz, Víctor Hernández-Villalobos, Jesús A. Guzmán-López, Santiago García-Jacobo, Iván Licona-Vázquez, Grisel Fierros-Romero, Rosario Flores-Vallejo

Abstract:

Beta-lactam antibiotics include some of the most frequently used small drug molecules against bacterial infections. Nevertheless, an alarming decrease in their efficacy has been reported due to the emergence of antibiotic-resistant bacteria. Infections caused by bacteria expressing extended-spectrum beta-lactamases (ESBLs) are difficult to treat and account for higher morbidity and mortality rates, delayed recovery, and a high economic burden. According to the Global Report on Antimicrobial Resistance Surveillance, it is estimated that mortality due to resistant bacteria will rise to 10 million cases per year worldwide. These facts highlight the importance of developing low-cost and readily accessible detection methods for drug-resistant ESBL bacteria to prevent their spread and promote accurate and fast diagnosis. Bacterial detection is commonly done using molecular diagnostic techniques, where PCR stands out for its high performance. However, this technique requires specialized equipment not available everywhere, is time-consuming, and has a high cost. Loop-Mediated Isothermal Amplification (LAMP) is an alternative technique that works at a constant temperature, significantly decreasing the equipment cost. It yields double-stranded DNA of several lengths, with repetitions of the target DNA sequence, as a product. Although positive and negative LAMP results can be discriminated by colorimetry, fluorescence, and turbidity, there is still large room for improvement in point-of-care implementation. DNA origami is a technique that allows the formation of 3D nanometric structures by folding a large single-stranded DNA (scaffold) into a determined shape with the help of short DNA sequences (staples), which hybridize with the scaffold. This research aimed to generate DNA origami structures using LAMP products as scaffolds to improve the sensitivity of ESBL detection in point-of-care diagnosis. For this study, the coding sequence of the CTX-M-15 ESBL of E. coli was used to generate the LAMP products. The set of LAMP primers was designed using PrimerExplorerV5. As a result, a target sequence of 200 nucleotides from the CTX-M-15 ESBL was obtained. Afterward, eight different DNA origami structures were designed using the target sequence in SDCadnano and analyzed with CanDo to evaluate the stability of the 3D structures. The designs were constructed minimizing the total number of staples to reduce cost and complexity for point-of-care applications. After analyzing the DNA origami designs, two structures were selected. The first was a zig-zag flat structure, while the second was a wall-like shape. Given the sequence repetitions in the scaffold, both could be assembled with only six different staples each, ranging from 18 to 80 nucleotides. Simulations of both structures were performed using scaffolds of different sizes, yielding stable structures in all cases. The generation of the LAMP products was verified by colorimetry and electrophoresis. The formation of the DNA structures was analyzed using electrophoresis and colorimetry. The modeling of novel detection methods through bioinformatics tools allows reliable control and prediction of results. To our knowledge, this is the first study that uses LAMP products and DNA origami in combination to detect ESBL-producing bacterial strains, which represents a promising methodology for point-of-care diagnosis.
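
As a toy illustration of the scaffold-staple hybridization requirement behind these designs (hypothetical sequences, not the CTX-M-15 target or the actual staples), the sketch below checks whether candidate staples are reverse-complementary to segments of a concatemeric LAMP-product scaffold.

```python
# Minimal sketch: a staple can fold the scaffold only if the scaffold contains
# the staple's reverse complement; LAMP concatemers offer repeated binding sites.
def reverse_complement(seq):
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq.upper()))

def staples_bind(scaffold, staples):
    return {s: reverse_complement(s) in scaffold.upper() for s in staples}

target = "ATGGTTAAGCGCTTTACGCAGT"                 # hypothetical 22-nt target segment
scaffold = target * 5                             # toy stand-in for a concatemeric LAMP product
staples = ["ACTGCGTAAAGCGCTTAACCAT", "AAAAAAAAAA"]
print(staples_bind(scaffold, staples))            # first staple binds, second does not
```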

Keywords: beta-lactamases, antibiotic resistance, DNA origami, isothermal amplification, LAMP technique, molecular diagnosis

Procedia PDF Downloads 219
9 Adequate Nutritional Support and Monitoring in Post-Traumatic High Output Duodenal Fistula

Authors: Richa Jaiswal, Vidisha Sharma, Amulya Rattan, Sushma Sagar, Subodh Kumar, Amit Gupta, Biplab Mishra, Maneesh Singhal

Abstract:

Background: Adequate nutritional support and daily patient monitoring have an independent therapeutic role in the successful management of high-output fistulae and early recovery after abdominal trauma. Case presentation: An 18-year-old girl was brought to the AIIMS emergency department with an alleged history of a heavy weight (an electric motor) falling onto her abdomen. She was evaluated as per Advanced Trauma Life Support (ATLS) protocols and diagnosed with significant abdominal trauma. After stabilization, she was referred to the trauma center. The abdomen was guarded, and focused assessment with sonography for trauma (FAST) was positive. Complete duodenojejunal (DJ) junction transection was found at laparotomy, and end-to-end repair was done. However, the patient was re-explored in view of biliary peritonitis on post-operative day 3, and an anastomotic leak was found with sloughing of the duodenal end. Resection of non-viable segments was done, followed by side-to-side anastomosis. Unfortunately, the anastomosis leaked again, this time due to a post-anastomotic kink, diagnosed on dye study. Due to a hostile abdomen, the patient was planned for supportive care, with a plan of build-up and delayed definitive surgery. Percutaneous transhepatic biliary drainage (PTBD) and split-thickness skin grafting (STSG) were required in the course as well. Nutrition: In the intensive care unit (ICU), the major goals of nutritional therapy were to improve wound healing, optimize nutrition, minimize enteral feed-associated complications, reduce biliary fistula output, and prepare the patient for definitive surgery. Feeding jejunostomy (FJ) was started from day 4 at a rate of 30 ml/h, along with total parenteral nutrition (TPN) and intravenous (IV) micronutrient support. Due to high bile output, bile refeeding was started from day 13. After 23 days of ICU stay, the patient was transferred to the general ward with a body mass index (BMI) < 11 kg/m2 and serum albumin of 1.5 g%. The patient was received in the ward in a catabolic phase with a high risk of refeeding syndrome. She was kept on FJ bolus feeds at a rate of 30–50 ml/h. After 3–4 days, while maintaining the patient's diet log, it was observed that the patient would refuse feeds at night and was becoming less responsive with every passing day. After a few minutes of conversation with the patient over a couple of days, she complained of enteral feed discharge in urine, mild pain, and signs of dumping syndrome. A dye study was done, which ruled out any enterovesical fistula, and conservative management was planned. At this time, a decision was taken for continuous slow-rate feeding through a commercial feeding pump at a rate of 2–3 ml/min. Drastic improvement was observed from the second day in gastro-intestinal symptoms and the general condition of the patient. The nutritional composition of feed, TPN and diet ranged between 800 and 2100 kcal and 50–95 g protein. After STSG, TPN was stopped. Periodic diet counselling was given to improve oral intake. At the time of discharge, the serum albumin level was 2.1 g%, weight 38.6 kg, and BMI 15.19 kg/m2. The patient was discharged on an oral diet. Conclusion: Successful management of post-traumatic proximal high-output fistulae is a challenging task due to impaired nutrient absorption and enteral feed-associated complications. Strategic and goal-based nutrition support can salvage such critically ill patients, as demonstrated in the present case.

Keywords: nutritional monitoring, nutritional support, duodenal fistula, abdominal trauma

Procedia PDF Downloads 259
8 Bridging the Communication Gap in Emergency Care: How Informational Pamphlets Enhance Satisfaction for Patients with Distal Radius Fractures

Authors: Amr Mansour, Boaz Granot, Amani Tatar, Assil Mahamid, Mohammad Haj Yahia, Fairoz Jayyusi, Eyal Behrbalk

Abstract:

INTRODUCTION: Distal radius fractures are common orthopedic injuries often treated in the fast-paced, high-stress environment of emergency departments (EDs). In such settings, patient satisfaction can be significantly influenced by the clarity of communication and the accessibility of information. This study explores the impact of providing an informational pamphlet that outlines ED processes, treatment expectations, and follow-up instructions on patient satisfaction across key domains, including trust, communication, organization, responsiveness, and overall experience. We hypothesize that a structured informational pamphlet will enhance patient satisfaction by fostering better understanding and aligning patient expectations with the realities of the ED visit. METHODS: A total of 100 adult patients treated for distal radius fractures between January and August 2024 participated in this survey-based study. Patients were randomized into two equal groups: one group received an informational pamphlet detailing their condition and treatment, while the other did not. Satisfaction levels were assessed using a structured questionnaire addressing five domains. Fisher's exact test was used to compare satisfaction measures between the two groups, and multivariate logistic regression analysis was conducted to evaluate the association between receiving an information sheet and high satisfaction. The study was approved by the Institutional Review Board. RESULTS: Patients who received an informational pamphlet reported significantly higher satisfaction across all five domains (p < .001). In Trust and Understanding, 82% of info-sheet recipients felt "in good hands," compared to 10% of non-recipients. For Communication, 86% rated doctor explanations as "very clear," versus 16% among non-recipients. Logistic regression showed that receiving an informational pamphlet was a significant predictor of high satisfaction with Discharge Explanation, i.e., clarity on condition, treatment, and follow-up (OR = 17.65, 95% CI: 4.74 - 65.77, p < .001), and with Reasonable Solution, i.e., feeling their primary concern was resolved (OR = 37.82, 95% CI: 8.75 - 163.42, p < .001). Other predictors, including fracture reduction, gender, and age, were not significant. DISCUSSION: This study highlights the substantial role that simple, cost-effective interventions like informational pamphlets can play in enhancing patient satisfaction in emergency care. By improving communication, fostering trust, and promoting a patient-centered approach, informational pamphlets offer a valuable tool for healthcare providers seeking to enhance the quality of care and patient experience in high-pressure emergency environments. However, the study's limitations, including its single-center design and reliance on self-reported satisfaction scores, may affect the generalizability of the results. Future research should consider a multi-center approach and explore long-term outcomes to further validate the efficacy of informational pamphlets in diverse ED settings. Ultimately, sustained improvement in patient satisfaction is a complex and dynamic issue necessitating a multifactorial approach, and other methods should also be explored to complement this strategy.
SIGNIFICANCE/CLINICAL RELEVANCE: This study demonstrates that providing an informational pamphlet in the ED setting can significantly improve patient satisfaction across multiple domains, emphasizing its potential as a simple, cost-effective tool to enhance communication, trust, and overall patient experience during emergency care for distal radius fractures. Integrating such interventions into standard ED protocols may foster a more patient-centered approach, improving both patient outcomes and healthcare efficiency.
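
For readers who want to reproduce the style of analysis (on synthetic data, not the study dataset), the sketch below runs the two methods named above: a Fisher's exact test on a 2x2 pamphlet-by-satisfaction table and a logistic regression of high satisfaction on pamphlet receipt.

```python
# Minimal sketch: Fisher's exact test and logistic regression on synthetic satisfaction data.
import numpy as np
from scipy.stats import fisher_exact
import statsmodels.api as sm

rng = np.random.default_rng(1)
pamphlet = np.repeat([1, 0], 50)                       # 50 recipients, 50 non-recipients
p_high = np.where(pamphlet == 1, 0.82, 0.10)           # illustrative satisfaction rates
high_satisfaction = rng.binomial(1, p_high)

# 2x2 table: rows = pamphlet yes/no, columns = high satisfaction yes/no.
table = [
    [np.sum((pamphlet == 1) & (high_satisfaction == 1)), np.sum((pamphlet == 1) & (high_satisfaction == 0))],
    [np.sum((pamphlet == 0) & (high_satisfaction == 1)), np.sum((pamphlet == 0) & (high_satisfaction == 0))],
]
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact: OR={odds_ratio:.2f}, p={p_value:.4f}")

# Logistic regression: high satisfaction ~ pamphlet (covariates such as age or gender
# would be added as extra columns of X).
X = sm.add_constant(pamphlet.astype(float))
fit = sm.Logit(high_satisfaction, X).fit(disp=0)
print("OR for pamphlet:", float(np.exp(fit.params[1])))
```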

Keywords: distal radius fracture, quality care, patient satisfaction, emergency medicine, patient-centered care, communication

Procedia PDF Downloads 16
7 Top Skills That Build Cultures at Organizations

Authors: Priyanka Botny Srinath, Alessandro Suglia, Mel McKendrick

Abstract:

Background: Organizational cultural studies integrate sociology and anthropology, portraying man as a creator of symbols, languages, beliefs, and ideologies: essentially, a creator and manager of meaning. In our research, we leverage analytical measures to discern whether an organization embodies a singular culture or a myriad of subcultures. Fast-forward to 2023, and our research thesis focuses on digitally measuring culture, coining it the "Work Culture Quotient." This entails conceptually mapping common experiential patterns to provide executives with insights into the digital organization journey, aiding them in understanding their current position and identifying future steps. Objectives: Finding the new-age skills that help define culture; understanding the implications of post-COVID effects; deriving a digital framework for measuring skill sets. Method: We conducted two comprehensive Delphi studies to distill essential insights. Delphi 1: Through a thematic analysis of interviews with 20 high-level leaders representing companies across diverse regions (India, Japan, the US, Canada, Morocco, and Uganda), we identified 20 key skills critical for cultivating a robust organizational culture. The skills are: influence, self-confidence, optimism, empathy, leadership, collaboration and cooperation, developing others, commitment, innovativeness, leveraging diversity, change management, team capabilities, self-control, digital communication, emotional awareness, team bonding, communication, problem solving, adaptability, and trustworthiness. Delphi 2: Subject matter experts were asked to complete a questionnaire derived from the thematic analysis in stage 1 to formalise themes and draw consensus amongst experts on the most important workplace skills. Results: The thematic analysis resulted in 20 workplace employee skills being identified. These skills were all included in the Delphi round 2 questionnaire. From the outputs, we analysed the data using RStudio to arrive at agreement and consensus; we also used a sum-of-squares method to compare agreements and extract themes, with a threshold of 80% agreement. This yielded three themes at over 80% agreement (leadership, collaboration and cooperation, communication) and three further themes at over 60% agreement (commitment, empathy, trustworthiness). From this, we selected five questionnaires to be included in the primary data collection phase, and these will be paired with digital footprints to provide a workplace culture quotient. Implications: The findings from these studies bear profound implications for decision-makers, revolutionizing their comprehension of organizational culture. Tackling the challenge of mapping the digital organization journey involves innovative methodologies that probe not only external landscapes but also internal cultural dynamics. This holistic approach furnishes decision-makers with a nuanced understanding of their organizational culture and visualizes pivotal skills for employee growth. This clarity enables informed choices resonating with the organization's unique cultural fabric. Anticipated outcomes transcend mere individual cultural measurements, aligning with organizational goals to unveil a comprehensive view of culture, exposing artifacts and depth.
Armed with this profound understanding, decision-makers gain tangible evidence for informed decision-making, strategically leveraging cultural strengths to cultivate an environment conducive to growth, innovation, and enduring success, ultimately leading to measurable outcomes.
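
As a small illustration of the consensus step described above (hypothetical ratings, not the study data), the sketch below computes percentage agreement per skill from round-2 expert responses and applies the 80% and 60% thresholds.

```python
# Minimal sketch: Delphi round-2 agreement per skill with 80% / 60% consensus bands.
import numpy as np

skills = ["leadership", "collaboration and cooperation", "communication",
          "commitment", "empathy", "trustworthiness", "optimism"]
# Rows = experts, columns = skills; 1 = rated as a most-important workplace skill.
ratings = np.array([
    [1, 1, 1, 1, 1, 1, 0],
    [1, 1, 1, 0, 1, 0, 1],
    [1, 1, 1, 1, 0, 1, 0],
    [1, 1, 1, 1, 1, 1, 0],
    [1, 0, 1, 0, 1, 0, 1],
])

agreement = ratings.mean(axis=0)       # fraction of experts endorsing each skill
for name, a in zip(skills, agreement):
    band = ">=80%" if a >= 0.8 else ">=60%" if a >= 0.6 else "below threshold"
    print(f"{name:32s} {a:.0%}  ({band})")
```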

Keywords: leadership, cooperation, collaboration, teamwork, work culture

Procedia PDF Downloads 45
6 Salmon Diseases Connectivity between Fish Farm Management Areas in Chile

Authors: Pablo Reche

Abstract:

Since the 1980s, aquaculture has become the biggest economic activity in southern Chile, with Salmo salar and Oncorhynchus mykiss being the main finfish species. High fish density makes both species prone to contracting diseases, which drives the industry to big losses and greatly affects the local economy. The three most concerning infective agents are the infectious salmon anemia virus (ISAv), the bacterium Piscirickettsia salmonis, and the copepod Caligus rogercresseyi. To regulate the industry, the government arranged the salmon farms within management areas named barrios, which coordinate the fallowing periods and antibiotic treatments of their salmon farms. In turn, barrios are gathered into larger management areas, named macrozonas, whose purpose is to minimize the risk of disease transmission between them and to enclose outbreaks within their boundaries. However, disease outbreaks still happen, and transmission to neighboring sites enlarges the initial event. Salmon disease agents are mostly transported passively by local currents. Thus, to understand how transmission occurs, the physical environment must first be studied. In Chile, salmon farming takes place in the inner seas of the southernmost regions of western Patagonia, between 41.5ºS and 55ºS. This coastal marine system is characterised by western winds, latitudinally modulated by the position of the South-East Pacific high-pressure centre, high precipitation rates and freshwater inflows from the numerous glaciers (including the largest ice cap outside Antarctica and Greenland). All of these forcings meet in a complex bathymetry and coastline system (deep fjords, shallow sills, narrow straits, channels, archipelagos, inlets, and isolated inner seas), driving an estuarine circulation (fast outflows westwards at the surface and slow deeper inflows eastwards). Such a complex system is modelled with the numerical model MIKE3, upon whose 3D current fields particle-track-biological models (one for each infective agent) are run in decoupled (offline) mode. Each agent's biology is parameterized by functions for maturation and mortality (reproduction not included). These parameterizations depend upon environmental factors such as temperature and salinity, so the agents' lifespan depends upon the environmental conditions the virtual agents encounter while being passively transported. CLIC (Connectivity-Lagrangian-IFOP-Chile) is a service platform that supports the graphical visualization of the connectivity matrices calculated from the particle trajectory files produced by the particle-track-biological models. On CLIC, users can select from a high-resolution grid (~1 km) the areas between which connectivity will be calculated. These areas can be barrios and macrozonas. Users can also select which nodes of these areas are allowed to release and scatter particles, the depth and frequency of the initial particle release, the climatic scenario (winter/summer), and the type of particle (ISAv, Piscirickettsia salmonis, Caligus rogercresseyi, plus an option for lifeless particles). Results include downstream probabilities (where the particles go) and upstream probabilities (where the particles come from), particle age, and vertical distribution, all aiming to understand how connectivity currently works in order to eventually propose a minimum-risk zonation for aquaculture purposes. Preliminary results in the Chiloé inner sea show that the risk depends not only upon dynamic conditions but also upon a barrio's location with respect to its neighbors.
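
As an illustration of the connectivity products visualized on CLIC (synthetic trajectories and hypothetical area labels, not IFOP model output), the sketch below builds a downstream connectivity matrix whose entry (i, j) is the probability that a particle released in area i ends up in area j.

```python
# Minimal sketch: downstream connectivity matrix from particle release/arrival areas.
import numpy as np

def connectivity_matrix(release_area, final_area, n_areas):
    counts = np.zeros((n_areas, n_areas))
    for src, dst in zip(release_area, final_area):
        if dst >= 0:                          # -1 marks particles that died or left the domain
            counts[src, dst] += 1
    released = np.bincount(release_area, minlength=n_areas).astype(float)
    return counts / np.maximum(released, 1)[:, None]   # row-normalized probabilities

rng = np.random.default_rng(0)
n_areas, n_particles = 4, 10_000              # e.g. 4 barrios
release = rng.integers(0, n_areas, n_particles)
drift = rng.integers(-1, 2, n_particles)      # crude stand-in for advection between neighbors
final = np.clip(release + drift, -1, n_areas - 1)
print(np.round(connectivity_matrix(release, final, n_areas), 2))
```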

Keywords: aquaculture zonation, Caligus rogercresseyi, Chilean Patagonia, coastal oceanography, connectivity, infectious salmon anemia virus, Piscirickettsia salmonis

Procedia PDF Downloads 152
5 Employee Engagement

Authors: Jai Bakliya, Palak Dhamecha

Abstract:

Today, customer satisfaction is given the utmost priority in every industry. In the hospitality industry this applies even more, as employees come in direct contact with customers while providing them services. Employee engagement is a concept adopted by human resource departments that impacts customer satisfaction. To satisfy your customers, it is necessary to ensure that the employees in the organisation are satisfied and engaged enough in their work that they meet the company's expectations and contribute to the process of achieving the company's goals and objectives. After all, employees are the human capital of the organisation. Employee engagement has become a top business priority for every organisation. In this fast-moving economy, business leaders know that having a high-potential and high-performing workforce is important for growth and survival. They recognize that a highly engaged workforce can increase innovation, productivity, and performance, while reducing costs related to retention and hiring in highly competitive talent markets. But while most executives see a clear need to improve employee engagement, many have yet to develop tangible ways to measure and tackle this goal. Employee engagement is an approach applied to establish an emotional connection between an employee and the organisation, which ensures the employee's commitment towards their work and affects the productivity and overall performance of the organisation. The study was conducted in the hospitality industry. A popular branded hotel was chosen as the sample unit. Data, both qualitative and quantitative, were collected from respondents. It was found that the employee engagement level of the organisation (hotel) is quite low. This means that employees are not emotionally connected with the organisation, which may in turn affect their performance. It is important to note that in the hospitality industry, individual employee performance, specifically in terms of emotional engagement, is critical, and, therefore, a low engagement level may contribute to low organisational performance. This study attempted to identify the employee engagement level. Another objective was to explore the factors impeding employee engagement and how employee engagement can be facilitated. In the hospitality industry, where people tend to work for as long as 16 to 18 hours a day, concepts like employee engagement are essential: employees get tired of their routine jobs, and in cases where job rotation cannot be done, employee engagement acts as a solution. The study was conducted at the Trident Hotel, Udaipur, on a sample of 30 in-house employees from 6 different departments: Accounts and General, Front Office, Food & Beverage Service, Housekeeping, Food & Beverage Production, and Engineering. It was conducted with the help of a research instrument, a questionnaire, and the data collection source was primary. Trident Udaipur is one of the busiest hotels in Udaipur, with a guest occupancy rate of nearly 80%. Due to the high occupancy rate, the staff of the hotel remain very busy and occupied all the time in their work. They work for their remuneration only. As a result, they do not have any encouragement for their work, nor are they interested in going the extra mile for the organisation.
The study results show that working environment factors, including recognition and appreciation, opinions of the employee, counselling, feedback from superiors, treatment by managers and respect from the organisation, are capable of increasing the employee engagement level in the hotel. The above results encouraged us to explore the factors that contributed to low employee engagement. It was found that factors such as recognition and appreciation, feedback from supervisors, opinions of the employee, counselling, and treatment by managers contributed negatively to the employee engagement level. A probable reason for the low contribution is that a number of employees gave negative feedback on the factors stated above. It seems that the structure of the organisation itself is responsible for the low employee engagement. The scope of this study is limited to the Trident Hotel situated in Udaipur. The limitation of the study is that the results and findings are based only on the responses of respondents at the Trident, Udaipur, and so the recommendations are applicable to the Trident, Udaipur, and not to all similar organisations across the country. The data collected were further analysed and interpreted, and conclusions were drawn. On the basis of the findings, suggestions were provided to the hotel for improvement.

Keywords: human resource, employee engagement, research, study

Procedia PDF Downloads 306
4 Development Programme Requirements for Managing and Supporting the Ever-Dynamic Job Roles of Middle Managers in Higher Education Institutions: The Espousal Demanded from the Human Resources Department; Case Studies of a New University in the United Kingdom

Authors: Mohamed Sameer Mughal, Andrew D. Ross, Damian J. Fearon

Abstract:

Background: The fast-changing landscape of UK Higher Education Institutions (HEIs) poses changes and challenges affecting Middle Managers (MM) in their job roles. MM contribute to the success of HEIs by maintaining the equilibrium and translating organizational strategies from senior staff into operational directives for junior staff. However, the data analyzed from the semi-structured interviews in this study show that the MM job role is becoming more complex due to changes and challenges creating colossal pressures and workloads in day-to-day working. Current development programme provisions by Human Resources (HR) departments in such HEIs are not feasible or applicable and do not match the true essence and requirements of MM, who suggest that programmes offered by HR are too generic to suit their precise needs and that tailor-made espousal is required to work effectively in their pertinent job roles. Methodologies: This study aims to capture the demands of MM Development Needs (DN) by means of a conceptual model as the conclusive part of the research, which is divided into 2 phases. Phase 1 was initiated by carrying out 2 pilot interviews with a retired Emeritus-status professor and an HR programmes development coordinator. Key themes from the pilot and the literature review fed into the formulation of a set of 22 questions (Kvale and Brinkmann) in the form of an interview questionnaire for qualitative data collection. The data strategy and collection consisted of purposeful sampling for 12 semi-structured interviews (n=12), each lasting approximately an hour. The MM interviewed were at faculty and departmental levels and included deans (n=2), heads of departments (n=4), subject leaders (n=2), and programme leaders (n=4). Participant recruitment was carried out via email and a snowballing technique. The interview data were transcribed verbatim and managed using Computer-Assisted Qualitative Data Analysis with NVivo ver. 11 software. Data were meticulously analyzed using Miles and Huberman's inductive approach of positivistic-style grounded theory, whereby key themes and categories emerged from the rich data collected. The data were precisely coded and classified into case studies (Robert Yin), with a main case study, sub-cases (4 classes of MM), and embedded cases (12 individual MM). Major Findings: An interim conceptual model emerged from analyzing the data, with main concepts that included key performance indicators (KPIs), HEI effectiveness and outlook, practices, processes and procedures, support mechanisms, student events, rules, regulations and policies, career progression, reporting/accountability, changes and challenges, and lastly skills and attributes. Conclusion: The dynamic elements affecting MM include increasing government pressures, student numbers, irrelevant development programmes, bureaucratic structures, transparency and accountability, organization policies, skill sets… These can only be confronted by employing structured development programmes originated by HR rather than generic provision. Future Work: Stage 2 (quantitative method) of the study plans to validate the interim conceptual model externally through a fully completed online survey questionnaire (Bram Oppenheim) from external HEIs (n=150). The total sample targeted is 1500 MM. The authors' contribution focuses on enhancing management theory and narrowing the gap between HR development programme provision and MM needs.

Keywords: development needs (DN), higher education institutions (HEIs), human resources (HR), middle managers (MM)

Procedia PDF Downloads 230
3 Blue Economy and Marine Mining

Authors: Fani Sakellariadou

Abstract:

The Blue Economy includes all marine-based and marine-related activities. They correspond to established, emerging, as well as yet-to-emerge ocean-based industries. Seabed mining is an emerging marine-based activity; its operations depend particularly on cutting-edge science and technology. The 21st century will face a crisis in resources as a consequence of the world's population growth and the rising standard of living. The natural capital stored in the global ocean is decisive for it to provide a wide range of sustainable ecosystem services. Seabed mineral deposits have been identified as having a high potential for critical elements and base metals. They have a crucial role in the fast evolution of green technologies. The major categories of marine mineral deposits are deep-sea deposits, including cobalt-rich ferromanganese crusts, polymetallic nodules, phosphorites, and deep-sea muds, as well as shallow-water deposits, including marine placers. Seabed mining operations may take place within continental shelf areas of nation-states. In international waters, the International Seabed Authority (ISA) has entered into 15-year contracts for deep-seabed exploration with 21 contractors. These contracts are for polymetallic nodules (18 contracts), polymetallic sulfides (7 contracts), and cobalt-rich ferromanganese crusts (5 contracts). Exploration areas are located in the Clarion-Clipperton Zone, the Indian Ocean, the Mid-Atlantic Ridge, the South Atlantic Ocean, and the Pacific Ocean. Potential environmental impacts of deep-sea mining include habitat alteration, sediment disturbance, plume discharge, toxic compound release, light and noise generation, and air emissions. They could cause burial and smothering of benthic species, health problems for marine species, biodiversity loss, a reduced photosynthetic mechanism, behavior change and masking of acoustic communication for mammals and fish, heavy metal bioaccumulation up the food web, a decrease in dissolved oxygen content, and climate change. An important concern related to deep-sea mining is our knowledge gap regarding deep-sea bio-communities. The ecological consequences that will be caused in the remote, unique, fragile, and little-understood deep-sea ecosystems and for their inhabitants are still largely unknown. The blue economy conceptualizes oceans as developing spaces supplying socio-economic benefits for current and future generations while also protecting, supporting, and restoring biodiversity and ecological productivity. In that sense, people should apply holistic management and assess marine mining impacts on ecosystem services, including the categories of provisioning, regulating, supporting, and cultural services. The variety in environmental parameters, the range in sea depth, the diversity in the characteristics of marine species, and the possible proximity to other existing maritime industries result in a wide span of marine mining impacts on the ability of ecosystems to support people and nature. In conclusion, the use of the untapped potential of the global ocean demands a responsible and sustainable attitude. Moreover, there is a need to change our lifestyle and move beyond the philosophy of single use. Living in a throw-away society based on a linear approach to resource consumption, humans are putting too much pressure on the natural environment. By applying modern, sustainable, and eco-friendly approaches according to the principles of the circular economy, substantial natural resource savings will be achieved.
Acknowledgement: This work is part of the MAREE project, financially supported by Division VI of IUPAC, and has been partly supported by the University of Piraeus Research Center.

Keywords: blue economy, deep-sea mining, ecosystem services, environmental impacts

Procedia PDF Downloads 82
2 The Outcome of Early Balance Exercises and Agility Training in Sports Rehabilitation for Patients Post Anterior Cruciate Ligament (ACL) Reconstruction

Authors: S. M. A. Ismail, M. I. Ibrahim, H. Masdar, F. M. Effendi, M. F. Suhaimi, A. Suun

Abstract:

Introduction: It is generally known that the rehabilitation process is as important as the reconstruction surgery. Several studies have focused on how early rehabilitation modalities can be initiated after surgery to ensure a safe return of patients to sports, or at least regaining the pre-injury level of function, following ACL reconstruction. Objectives: The main objective is to study and evaluate the outcome of early balance exercises and agility training in sports rehabilitation for patients post ACL reconstruction, and to compare the intervention and control protocols: material exercise (balance exercises and agility training with strengthening) versus a strengthening-only (non-material) rehabilitation protocol. The study followed a prospective intervention trial design. Materials and Methods: Patients who underwent ACL reconstruction at Selayang and Sg Buloh Hospitals from 2012 to 2014 were selected for this study. They were identified from the Malaysian Knee Ligament Registry (MKLR), and all patients had single-bundle reconstruction with an autograft hamstring tendon (semitendinosus and gracilis). ACL injuries from any type of sport were included. Subjects performed their assigned type of rehabilitation activity over 18 weekly sessions; all subjects attended all 18 sessions, and evaluation was done during the first, 9th and 18th sessions. The evaluation format was based on clinical assessment (anterior drawer, Lachman, pivot shift, laxity with a rolimeter, the end-point and thigh circumference) and scoring (Lysholm knee score and Tegner activity level scale). The rehabilitation protocol was initiated 24 weeks after the surgery. Results and Discussion: 100 patients were selected, of whom 94 were male and 6 female. The age range was 18 to 54 years, with an average of 28 years. All patients were evaluated from 24 weeks after the surgery. 50 of them were recruited for the material exercise protocol (balance exercises and agility training with strengthening) and 50 for the strengthening-only (non-material) protocol. Demographically, 85% suffered a sports injury, mainly from futsal and football. 39% of them had an abnormal BMI (26–38) and involvement of the left knee. 100% of patients had a basic radiographic X-ray of the knee and 98% had an MRI. All patients had negative anterior drawer, Lachman and pivot shift tests after ACL reconstruction and completion of rehabilitation. There were 95 subjects with a grade I injury, 5 with grade II and none with grade III, and 90% of them had a soft end-point. Overall they scored badly on presentation, with a Lysholm score of 53% (poor) and a Tegner activity level of 3/10. After completing 9 weeks of exercises, 90% of the material group had grade I laxity and 75% a firm end-point, with a Lysholm score of 71% (fair) and a Tegner activity level of 5/10, compared with the non-material group, who had 62% grade I laxity, 54% firm end-point, a Lysholm score of 62% (poor) and a Tegner activity level of 4/10. After completing 18 weeks of exercises, the material group maintained 90% grade I laxity with 100% firm end-point, the Lysholm score increased to 91% (excellent) and the Tegner activity level to 7/10, compared with the non-material group, who had 69% grade I laxity but maintained 54% firm end-point, with a Lysholm score of 76% (fair) and a Tegner activity level of 5/10. These results show that improvement was achieved faster in the material group, who reached a satisfactory level after the 9th session of exercises in 75% (15/20) of cases, compared with the non-material group, who only achieved 54% (7/13) after completing the 18th session. Most of them were grade I. These concepts are consolidated into our approach to preparing patients for return to play, including field testing and maintenance training. Conclusions: The basic approach in ACL rehabilitation is to ensure return to sports at 6 months post-operatively. Grade I and II laxity has a favourable and early satisfactory outcome based on clinical assessment and the Lysholm and Tegner scores. Reduction of the laxity grade indicates a satisfactory outcome. A firm end-point indicates the adequacy of rehabilitation before resuming the previous sport. Material exercise (balance exercises and agility training with strengthening) was beneficial and reliable for achieving a favourable and early satisfactory outcome compared with strengthening only (non-material). We have identified that the rehabilitation protocol varies between patients. Therefore, future post-ACL-reconstruction rehabilitation guidelines should focus on rehabilitation techniques rather than time.
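The group comparison above rests on simple proportions (15/20 versus 7/13 subjects reaching a satisfactory level). As an illustrative aid only, the short Python sketch below shows how such a two-proportion comparison could be checked; the 2x2 table construction and the use of Fisher's exact test are our assumptions and are not part of the study's stated analysis.

```python
# Illustrative sketch only: comparing the satisfactory-outcome rates reported above.
# The grouping into a 2x2 table and Fisher's exact test are assumptions for illustration.
from scipy.stats import fisher_exact

material_satisfactory, material_total = 15, 20        # 75% (15/20) after the 9th session
nonmaterial_satisfactory, nonmaterial_total = 7, 13   # 54% (7/13) after the 18th session

table = [
    [material_satisfactory, material_total - material_satisfactory],
    [nonmaterial_satisfactory, nonmaterial_total - nonmaterial_satisfactory],
]

odds_ratio, p_value = fisher_exact(table)
print(f"material group:     {material_satisfactory}/{material_total} "
      f"({material_satisfactory / material_total:.0%} satisfactory)")
print(f"non-material group: {nonmaterial_satisfactory}/{nonmaterial_total} "
      f"({nonmaterial_satisfactory / nonmaterial_total:.0%} satisfactory)")
print(f"Fisher's exact test: odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```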

Keywords: post anterior cruciate ligament (ACL) reconstruction, single bundle, hamstring tendon, sports rehabilitation, balance exercises, agility training

Procedia PDF Downloads 253
1 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, contribute to thermoregulation, and impart resonance to the voice, among other roles. Thus, the real function of the MS is still uncertain. Furthermore, the MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and when it is, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intra-individual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) based on features such as pixel value, spatial distribution, and shape. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For the comparison, we used Bland–Altman statistics, linear regression, and the Jaccard similarity coefficient. The linear regression showed a strong association and low dispersion between variables. The Bland–Altman analysis showed no significant differences between the methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
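The abstract describes a hybrid pipeline: SVM-based detection of candidate pixels, region growing from the detected seeds, morphological cleanup, and slice-wise accumulation of the segmented voxels into a volume. The original tool was implemented in Matlab®; the sketch below is a hypothetical Python re-expression of that pipeline under stated assumptions. The feature set (intensity plus normalized position), the flood-fill tolerance, the 3x3 structuring element, the single-seed-per-slice simplification, and the pre-trained classifier are illustrative choices, not the authors' implementation.

```python
# A minimal sketch of the described pipeline, assuming a CT volume as a NumPy array
# (Hounsfield units) and a pre-trained per-pixel SVM classifier. All parameter values
# are illustrative placeholders.
import numpy as np
from scipy import ndimage
from sklearn.svm import SVC
from skimage.segmentation import flood

def pixel_features(slice_hu):
    """Per-pixel features: intensity plus normalized (row, col) position."""
    rows, cols = np.indices(slice_hu.shape)
    return np.stack([slice_hu.ravel(),
                     rows.ravel() / slice_hu.shape[0],
                     cols.ravel() / slice_hu.shape[1]], axis=1)

def segment_ms_volume(ct_volume, svm: SVC, voxel_volume_mm3):
    """Segment the maxillary sinus slice by slice and return the total volume (mm^3)."""
    total_voxels = 0
    for slice_hu in ct_volume:
        # 1) SVM detection: classify every pixel as sinus candidate (1) or background (0).
        candidates = svm.predict(pixel_features(slice_hu)).reshape(slice_hu.shape)
        seeds = np.argwhere(candidates.astype(bool))
        if seeds.size == 0:
            continue
        # 2) Region growing from a detected seed (simplified: one seed per slice;
        #    the real tool may grow from every detected seed).
        seed = tuple(seeds[0])
        mask = flood(slice_hu, seed, tolerance=200)
        # 3) Morphological opening removes small false-positive islands.
        mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
        total_voxels += mask.sum()
    return total_voxels * voxel_volume_mm3
```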
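The validation compared automatic and manual volumes using linear regression, Bland–Altman statistics, and the Jaccard similarity coefficient. A minimal sketch of how such an agreement analysis could be computed is given below; the helper functions and the volume figures are made up for illustration and are not the authors' code or data.

```python
# Sketch of an agreement analysis between automatic and manual measurements,
# assuming per-exam binary masks and volume values are available. Data are fabricated
# solely to make the snippet runnable.
import numpy as np
from scipy import stats

def jaccard(mask_a, mask_b):
    """Jaccard similarity coefficient between two binary masks."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return intersection / union if union else 1.0

def bland_altman_limits(auto_vol, manual_vol):
    """Mean difference (bias) and 95% limits of agreement."""
    diff = np.asarray(auto_vol) - np.asarray(manual_vol)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Made-up volumes (cm^3) for a handful of exams:
auto = np.array([15.2, 18.9, 12.4, 20.1, 16.7])
manual = np.array([15.0, 19.3, 12.1, 19.8, 16.9])
slope, intercept, r, p, _ = stats.linregress(manual, auto)
bias, lo, hi = bland_altman_limits(auto, manual)
print(f"linear regression: r = {r:.3f}, slope = {slope:.2f}")
print(f"Bland-Altman bias = {bias:.2f}, limits of agreement [{lo:.2f}, {hi:.2f}]")

# Jaccard example on two small binary masks:
mask_auto = np.zeros((4, 4), bool); mask_auto[1:3, 1:3] = True
mask_manual = np.zeros((4, 4), bool); mask_manual[1:3, 1:4] = True
print(f"Jaccard = {jaccard(mask_auto, mask_manual):.2f}")
```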

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 503