Search results for: accuracy improvement

6348 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator

Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić

Abstract:

Purpose/Objective: Radiotherapy treatment planning requires the use of a CT simulator in order to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, in the end, the outcome of treatment for every single patient. Therefore, it is strongly advised by international recommendations to set up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, which are used on a daily, weekly, monthly or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated phantom CIRS 062QA, and a QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated for radiation therapy treatment planning. The CT simulator has built-in software, which enables fast and simple evaluation of CT QA parameters using the phantom provided with the CT simulator. On the other hand, the recommendations contain additional tests, which were done with the CIRS phantom. Also, legislation on ionizing radiation protection requires CT testing in defined periods of time. Taking into account the requirements of the law, the built-in tests of the CT simulator, and international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated through the study were the following: CT number accuracy, field uniformity, complete CT to ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy limits are +/- 5 HU of the value at commissioning. Field uniformity: +/- 10 HU in selected ROIs. The complete CT to ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of not more than 5%. Spatial and contrast resolution tests must comply with the tests obtained at commissioning; otherwise the machine requires service. The result of the image noise test must fall within the limit of 20% difference from the base value. Slice thickness must meet manufacturer specifications, and patient table stability with longitudinal transfer of the loaded table must not differ by more than 2 mm vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of CT simulator functionality and its clinical effectiveness in radiation treatment planning. The legal requirement for the clinic is to set up its own QA programme with minimum testing, but it remains the user's decision whether additional testing, as recommended by international organizations, will be implemented so as to improve the overall quality of the radiation treatment planning procedure, since the CT image quality used for radiation treatment planning influences the delineation of a tumor, the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to a patient.
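
For illustration, the published tolerances translate directly into simple automated checks. The following Python sketch (not the vendor's built-in QA software; all function names and example values are hypothetical) shows how the stated limits could be encoded:

```python
# A minimal sketch of the QC tolerances listed above. All variable names
# and baseline values are hypothetical illustrations, not the clinic's code.

def check_ct_number(measured_hu, baseline_hu, tol_hu=5.0):
    """CT number accuracy: within +/- 5 HU of the commissioning value."""
    return abs(measured_hu - baseline_hu) <= tol_hu

def check_field_uniformity(roi_means_hu, center_hu, tol_hu=10.0):
    """Field uniformity: each ROI within +/- 10 HU of the central ROI."""
    return all(abs(m - center_hu) <= tol_hu for m in roi_means_hu)

def check_ct_to_ed_curve(measured, commissioning, tol=0.05):
    """CT-to-ED curve: each point within 5% of the commissioning curve."""
    return all(abs(m - c) / abs(c) <= tol for m, c in zip(measured, commissioning))

def check_image_noise(noise, baseline_noise, tol=0.20):
    """Image noise: within 20% of the base value."""
    return abs(noise - baseline_noise) / baseline_noise <= tol

# Example daily run with made-up measurements:
print(check_ct_number(measured_hu=2.1, baseline_hu=0.0))        # True
print(check_field_uniformity([1.0, -4.2, 7.9], center_hu=0.0))  # True
```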

Keywords: CT simulator, radiotherapy, quality control, QA programme

Procedia PDF Downloads 535
6347 A Robust Visual Simultaneous Localization and Mapping for Indoor Dynamic Environment

Authors: Xiang Zhang, Daohong Yang, Ziyuan Wu, Lei Li, Wanting Zhou

Abstract:

Visual Simultaneous Localization and Mapping (VSLAM) uses cameras to collect information in unknown environments in order to realize simultaneous localization and environment map construction, and has a wide range of applications in autonomous driving, virtual reality and other related fields. At present, related VSLAM research can maintain high accuracy in static environments. But in dynamic environments, the movement of objects in the scene reduces the stability of the VSLAM system, resulting in inaccurate localization and mapping, or even failure. In this paper, a robust VSLAM method is proposed to deal effectively with this problem in dynamic environments. We propose a dynamic region removal scheme based on semantic segmentation neural networks and geometric constraints. Firstly, a semantic segmentation neural network is used to extract the prior active motion region, prior static region and prior passive motion region in the environment. Then, a lightweight frame tracking module initializes the transform pose between the previous frame and the current frame on the prior static region. A motion consistency detection module based on multi-view geometry and scene flow is used to divide the environment into static and dynamic regions, so that the dynamic object regions are successfully eliminated. Finally, only the static region is used for the tracking thread. Our research is based on the ORB-SLAM3 system, which is one of the most effective VSLAM systems available. We evaluated our method on the TUM RGB-D benchmark, and the results demonstrate that the proposed VSLAM method improves the accuracy of the original ORB-SLAM3 by 70%–98.5% in highly dynamic environments.
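
As a rough illustration of the geometric part of such a motion consistency check, the sketch below flags feature matches whose reprojection violates the epipolar constraint. It is a simplification: the fundamental matrix F, the matched points and the pixel threshold are assumed inputs, and the paper's full method additionally uses semantic priors and scene flow.

```python
import numpy as np

# Matched feature points whose distance to their epipolar line exceeds a
# threshold are flagged as dynamic. F, pts1, pts2 (Nx2 pixel coordinates)
# and the threshold are illustrative assumptions.

def epipolar_distances(F, pts1, pts2):
    """Distance of each pts2 point to the epipolar line of its pts1 match."""
    ones = np.ones((pts1.shape[0], 1))
    p1 = np.hstack([pts1, ones])          # homogeneous coordinates, (N, 3)
    p2 = np.hstack([pts2, ones])
    lines = (F @ p1.T).T                  # epipolar lines in image 2, (N, 3)
    num = np.abs(np.sum(lines * p2, axis=1))
    den = np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2)
    return num / den

def flag_dynamic(F, pts1, pts2, thresh_px=1.0):
    """Boolean mask: True where a match is inconsistent with static geometry."""
    return epipolar_distances(F, pts1, pts2) > thresh_px
```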

Keywords: dynamic scene, dynamic visual SLAM, semantic segmentation, scene flow, VSLAM

Procedia PDF Downloads 118
6346 Ozone Therapy for Disc Herniation: A Non-surgical Option

Authors: Shahzad Karim Bhatti

Abstract:

Background: Ozone, an allotrope of oxygen, can be used in the treatment of low back pain due to a herniated disc. It is a minimally invasive procedure that uses the biochemical properties of ozone to reduce disc volume and inflammation, resulting in significant pain relief. Aim: The purpose of this study was to evaluate the effectiveness of ozone therapy in combination with peri-ganglionic injection of local anesthetic and corticosteroid. Material and Methods: This retrospective study was done at the Interventional Radiology Department of Mayo Hospital, Lahore. A total of 49000 patients were included from January 2008 to March 2022. All the patients presented with clinical signs and symptoms of lumbar disc herniation, which was confirmed by an MRI scan of the lumbosacral spine. Pain reduction was assessed using the modified MacNab method. All the patients underwent percutaneous injection of ozone at a concentration of 27 micrograms/ml into the lumbar disc under fluoroscopic guidance, in combination with local anesthetic and corticosteroid in the peri-ganglionic space. Results were evaluated by two expert observers who were blinded to patient treatment. Results: A satisfactory therapeutic outcome was obtained. 55% of the patients showed complete recovery with resolution of symptoms. 20% of the patients complained of occasional episodic pain with no limitation of occupational activity. 15% of cases showed insufficient improvement. 5% of cases had insufficient improvement and went for surgery. 10% of cases never returned after the first visit. Conclusion: Intradiscal ozone for the treatment of herniated discs has revolutionized the percutaneous approach to nerve root compression, making it safer, more economical and easier to repeat, without the side effects of treatments currently used in Pakistan.

Keywords: pain, prolapse, ozone, back pain

Procedia PDF Downloads 30
6345 Collaboration During Planning and Reviewing in Writing: Effects on L2 Writing

Authors: Amal Sellami, Ahlem Ammar

Abstract:

Writing is acknowledged to be a cognitively demanding and complex task. Indeed, the writing process is composed of three iterative sub-processes, namely planning, translating (writing), and reviewing. Not only do second or foreign language learners need to write according to this process, but they also need to respect the norms and rules of language and writing in the text to be produced. Accordingly, researchers have suggested approaching writing as a collaborative task in order to alleviate its complexity. Consequently, collaboration has been implemented during the whole writing process or only during planning or reviewing. Researchers report that implementing collaboration during the whole process might be demanding in terms of time in comparison to individual writing tasks. Consequently, because of time constraints, teachers may avoid it. For this reason, it might be pedagogically more realistic to limit collaboration to one of the writing sub-processes (i.e., planning or reviewing). However, previous research implementing collaboration in planning or reviewing is limited and fails to explore the effects of these two conditions on the written text. Consequently, the present study examines the effects of collaboration in planning and collaboration in reviewing on the written text. To reach this objective, quantitative as well as qualitative methods were deployed to examine the written texts holistically and in terms of fluency, complexity, and accuracy. Participants of the study included 4 pairs in each group (n=8). They participated in two experimental conditions: (1) collaborative planning followed by individual writing and individual reviewing, and (2) individual planning followed by individual writing and collaborative reviewing. The comparative research findings indicate that while collaborative planning resulted in better overall text quality (more precisely, better content and organization ratings), better fluency, better complexity, and fewer lexical errors, collaborative reviewing produced better accuracy and fewer syntactical and mechanical errors. The discussion of the findings suggests the need to conduct more comparative research in order to further explore the effects of collaboration in planning or in reviewing. Pedagogical implications of the current study include advising teachers to choose between implementing collaboration in planning or in reviewing depending on their students' needs and what they need to improve.

Keywords: collaboration, writing, collaborative planning, collaborative reviewing

Procedia PDF Downloads 100
6344 Play-Based Intervention Training Program for Daycare Workers Attending to Children with Autism

Authors: Raymond E. Raguindin

Abstract:

Objective: This research studied the improvement in daycare workers' teaching of imitation, joint attention, and language activities using a play-based early intervention training program in Cabanatuan City, Nueva Ecija. Methods: Focus group discussions were conducted to explore the attitudes, beliefs, and practices of daycare workers. Results: Findings of the study revealed that daycare workers have existing knowledge and experience in teaching children with autism. Their workshops on managing inappropriate behaviors of children with autism resulted in a generally positive perception of accepting and teaching children with autism in daycare centers. Play-based activities were modelled and participated in by daycare workers. These included demonstration, modelling, prompting and providing social reinforcers as rewards. Five lectures and five training days were conducted to implement the training program. Daycare workers' levels of skill in teaching imitation, joint attention and language were gathered before and after participation in the training program. Findings suggest significant differences between pre-test and post-test scores. The workers showed significant improvement in facilitating imitation, joint attention, and language in children with autism after the play-based early intervention training. They were able to initiate and sustain imitation, joint attention, and language activities with adequate knowledge and confidence. Conclusions: 1. Existing attitudes and beliefs greatly influenced the positive delivery mode of instruction. 2. A teacher-directed approach to improve attention, imitation, joint attention, and language of children with autism can be acquired by daycare workers. 3. Teaching skills and experience can be used as a reference and basis for identifying future training needs.

Keywords: early intervention, imitation, joint attention, language

Procedia PDF Downloads 124
6343 Hearing Conservation Program for Vector Control Workers: Short-Term Outcomes from a Cluster-Randomized Controlled Trial

Authors: Rama Krishna Supramanian, Marzuki Isahak, Noran Naqiah Hairi

Abstract:

Noise-induced hearing loss (NIHL) is one of the most commonly recorded occupational diseases, despite being preventable. A Hearing Conservation Program (HCP) is designed to protect workers' hearing and prevent them from developing hearing impairment due to occupational noise exposures. However, there is still a lack of evidence regarding the effectiveness of this program. The purpose of this study was to determine the effectiveness of a Hearing Conservation Program (HCP) in preventing or reducing audiometric threshold changes among vector control workers. This study adopted a cluster randomized controlled trial design, with district health offices as the unit of randomization. Nine district health offices were randomly selected, and 183 vector control workers were randomized to the intervention or control group. The intervention included a safety and health policy, noise exposure assessment, noise control, distribution of appropriate hearing protection devices, a training and education program, and audiometric testing. The control group only underwent audiometric testing. Audiometric threshold changes observed in the intervention group showed improvement in the hearing threshold level for all frequencies except 500 Hz and 8000 Hz for the left ear. The hearing threshold changes ranged from 1.4 dB to 5.2 dB, with the largest improvement at higher frequencies, mainly 4000 Hz and 6000 Hz. Meanwhile, for the right ear, the mean hearing threshold level remained similar at 4000 Hz and 6000 Hz after 3 months of intervention. The Hearing Conservation Program (HCP) is effective in preserving the hearing of vector control workers involved in fogging activity, as well as increasing their knowledge, attitude and practice towards noise-induced hearing loss (NIHL).

Keywords: adult, hearing conservation program, noise-induced hearing loss, vector control worker

Procedia PDF Downloads 171
6342 Regularizing Software for Aerosol Particles

Authors: Christine Böckmann, Julia Rosemann

Abstract:

We present an inversion algorithm that is used in the European Aerosol Lidar Network for the inversion of data collected with multi-wavelength Raman lidar. These instruments measure backscatter coefficients at 355, 532, and 1064 nm, and extinction coefficients at 355 and 532 nm. The algorithm is based on manually controlled inversion of optical data, which allows for detailed sensitivity studies and thus provides us with comparably high quality of the derived data products. The algorithm allows us to derive particle effective radius and volume and surface-area concentration with comparably high confidence. The retrieval of the real and imaginary parts of the complex refractive index is still a challenge in view of the accuracy required for these parameters in climate change studies, in which light absorption needs to be known with high accuracy. Single-scattering albedo (SSA) can be computed from the retrieved microphysical parameters and allows us to categorize aerosols into high- and low-absorbing aerosols. From a mathematical point of view, the algorithm is based on the concept of using truncated singular value decomposition as the regularization method. This method was adapted to work for the retrieval of the particle size distribution function (PSD) and is called a hybrid regularization technique since it uses a triple of regularization parameters. The inversion of an ill-posed problem, such as the retrieval of the PSD, is always a challenging task because very small measurement errors will most often be hugely amplified during the solution process unless an appropriate regularization method is used. Even using a regularization method is difficult, since appropriate regularization parameters have to be determined. Therefore, in the next stage of our work we decided to use two regularization techniques in parallel for comparison purposes. The second method is an iterative regularization method based on Padé iteration; here, the number of iteration steps serves as the regularization parameter. We successfully developed semi-automated software for spherical particles which is able to run even on a parallel processor machine. From a mathematical point of view, it is also very important (as a selection criterion for an appropriate regularization method) to investigate the degree of ill-posedness of the problem, which we found to be moderate. We computed the optical data from mono-modal logarithmic PSDs and investigated particles of spherical shape in our simulations. We considered particle radii as large as 6 µm, which does not only cover the size range of particles in the fine-mode fraction of naturally occurring PSDs but also covers a part of the coarse-mode fraction. We considered errors of 15% in the simulation studies. For the SSA, 100% of all cases achieve relative errors below 12%. In more detail, 87% of all cases for 355 nm and 88% of all cases for 532 nm are well below 6%. With respect to the absolute error, for non- and weakly-absorbing particles with real parts 1.5 and 1.6, in all modes the accuracy limit +/- 0.03 is achieved. In sum, 70% of all cases stay below +/- 0.03, which is sufficient for climate change studies.
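
As context for readers unfamiliar with the method, the following minimal Python sketch shows plain truncated SVD regularization on a generic ill-posed linear system Ax = b. The authors' hybrid technique is considerably more elaborate (a triple of regularization parameters), and the synthetic system below is purely illustrative.

```python
import numpy as np

# Truncated SVD (TSVD) regularization: the truncation level k acts as the
# regularization parameter; small singular values, which amplify
# measurement noise, are discarded.

def tsvd_solve(A, b, k):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:k] = 1.0 / s[:k]               # keep only the k largest singular values
    return Vt.T @ (s_inv * (U.T @ b))

# Illustrative use on a synthetic ill-conditioned system:
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0, 1, 20), 10, increasing=True)  # ill-conditioned
x_true = rng.standard_normal(10)
b = A @ x_true + 1e-4 * rng.standard_normal(20)            # noisy "measurements"
x_k = tsvd_solve(A, b, k=6)
```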

Keywords: aerosol particles, inverse problem, microphysical particle properties, regularization

Procedia PDF Downloads 344
6341 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment

Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto

Abstract:

Forest inventories are essential to assess the composition, structure and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can improve on and overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data for extracting high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resources extraction was carried out using LAStools, GIS, ENVI and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands to be used in the extraction of forest classification covers using ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above Ground Biomass (AGB) and Carbon Stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is relatively much higher compared to the 10% canopy cover requirement. In the extracted canopy height, 80% of the trees' heights range from 12 m to 17 m. CS of the three forest covers based on the AGB were: 20819.59 kg/20x20 m for closed broadleaf, 8609.82 kg/20x20 m for broadleaf plantation and 15545.57 kg/20x20 m for open broadleaf. The average tree count for the three forest covers was 413 trees/ha. As such, the forest of San Manuel has high percent forest cover and high CS.
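
One central step of such a workflow, deriving a Canopy Height Model and a simple canopy-cover fraction, can be sketched as follows. This is a conceptual Python illustration, not the study's actual LAStools/ENVI scripts; the synthetic rasters and the 5 m height threshold are assumptions.

```python
import numpy as np

# CHM = first-return surface (DSM) minus bare-earth terrain (DTM);
# canopy cover = fraction of cells taller than a height threshold.

def canopy_height_model(dsm, dtm):
    chm = dsm - dtm
    return np.clip(chm, 0, None)          # negative artefacts set to zero

def canopy_cover_fraction(chm, height_threshold_m=5.0):
    return float(np.mean(chm >= height_threshold_m))

# Example with synthetic rasters:
dtm = np.zeros((100, 100))
dsm = np.random.default_rng(1).uniform(0, 20, size=(100, 100))
chm = canopy_height_model(dsm, dtm)
print(f"forest cover: {canopy_cover_fraction(chm):.0%}")
```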

Keywords: carbon stock, forest inventory, LiDAR, tree count

Procedia PDF Downloads 391
6340 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System

Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee

Abstract:

This work demonstrates a web crawler-based generalized end-to-end open domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge to answer any question, with the aim of finding an exact and correct answer in the form of a number, a noun, a short phrase, or a brief piece of text for the user's question. Analysis of the question, searching the relevant documents, and choosing an answer are three important steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract the top K documents from the web. The value of K can be calibrated on the basis of a trade-off between time and accuracy. This is followed by a passage ranking process using the MS MARCO dataset, trained on 500K queries, to extract the most relevant text passage and thereby shorten the lengthy documents. Further, a QA system is used to extract the answers from the shortened documents based on the query and return the top 3 answers. For the evaluation of such systems, accuracy is judged by the exact match between predicted answers and gold answers. But automatic evaluation methods fail due to the linguistic ambiguities inherent in the questions. Moreover, reference answers are often not exhaustive or are out of date. Hence, correct answers predicted by the system are often judged incorrect according to the automated metrics. One such scenario arises from the original Google Natural Questions (GNQ) dataset, which was collected and made available in the year 2016. Use of any such dataset proves to be inefficient with respect to any questions that have time-varying answers. For illustration, if the query is "Where will be the next Olympics?", the gold answer as given in the GNQ dataset is "Tokyo". Since the dataset was collected in the year 2016, and the next Olympics after 2016 were the 2020 Games held in Tokyo, this is absolutely correct. But if the same question is asked in 2022, then the answer is "Paris, 2024". Consequently, any evaluation based on the GNQ dataset will be incorrect. Such erroneous predictions are usually given to human evaluators for further validation, which is quite expensive and time-consuming. To address this erroneous evaluation, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric using the current timestamp along with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset was used, and the system achieved an accuracy of 78% for a test dataset comprising 100 QA pairs. This test data was automatically extracted using an analysis-based approach from 10K QA pairs of the GNQ dataset. The results obtained are encouraging. The proposed technique appears to have the possibility of developing into a useful scheme for gathering precise, reliable, and specific information in a real-time and efficient manner. Our subsequent experiments will be directed towards establishing the efficacy of the above system for a larger set of time-dependent QA pairs.
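
The core idea of the proposed time-dependent evaluation can be sketched as follows. This is a minimal Python illustration, not the paper's actual metric implementation; the timeline data structure, the dates and the Olympics example are assumptions.

```python
from datetime import datetime

# Instead of comparing a prediction against a single, possibly stale gold
# answer, compare it against the answer valid at the evaluation timestamp.

# Time-stamped gold answers: (valid_from, answer)
gold_timeline = [
    (datetime(2016, 1, 1), "Tokyo"),
    (datetime(2021, 8, 9), "Paris"),   # after the Tokyo 2020 Games closed
]

def gold_at(timeline, now):
    """Return the gold answer valid at time `now`."""
    valid = [ans for ts, ans in timeline if ts <= now]
    return valid[-1] if valid else None

def time_aware_match(predicted_answers, timeline, now, n=3):
    """True if the currently valid gold answer is among the top-n predictions."""
    return gold_at(timeline, now) in predicted_answers[:n]

print(time_aware_match(["Paris", "Tokyo"], gold_timeline, datetime(2022, 6, 1)))  # True
```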

Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation

Procedia PDF Downloads 102
6339 Oral Betahistine Versus Intravenous Diazepam in Acute Peripheral Vertigo: A Randomized, Double-Blind Controlled Trial

Authors: Saeed Abbasi, Davood Farsi, Soudabeh Shafiee Ardestani, Neda Valizadeh

Abstract:

Objectives: Peripheral vertigo is a common complaint of patients who are seen in emergency departments. In our study, we wanted to evaluate the effect of betahistine as an oral drug vs. intravenous diazepam for the treatment of acute peripheral vertigo. We also wanted to assess the possibility of substituting the parenteral drug with an oral one with fewer side effects. Materials and Methods: In this randomized, double-blind study, 101 patients were enrolled. The patients were divided into two groups in a double-blind randomized manner. Group A took an oral placebo and 10 mg of intravenous diazepam. Group B received 8 mg of oral betahistine and an intravenous placebo. Patients' symptoms and signs (vertigo severity, nausea, vomiting, nystagmus and gait) were evaluated at 0, 2, 4 and 6 hours by emergency physicians, and data were collected using a questionnaire. Results: In both groups, there was significant improvement in vertigo (betahistine group P=0.02 and diazepam group P=0.03). Analysis showed more improvement in vertigo severity after 4 hours of treatment in the betahistine group compared to the diazepam group (P=0.02). Nausea and vomiting were significantly lower in patients receiving diazepam after 2 and 6 hours (P=0.02 & P=0.03). No statistically significant differences were found between the groups in nystagmus, equilibrium and vertigo duration. Conclusion: The results of this randomized trial showed that both drugs had acceptable therapeutic effects in peripheral vertigo, although betahistine was significantly more efficacious after 4 hours of drug intake. Given the higher nausea and vomiting in the betahistine group, physicians should consider these side effects before drug prescription.

Keywords: acute peripheral vertigo, betahistine, diazepam, emergency department

Procedia PDF Downloads 390
6338 A Comparison of Convolutional Neural Network Architectures for the Classification of Alzheimer’s Disease Patients Using MRI Scans

Authors: Tomas Premoli, Sareh Rowlands

Abstract:

In this study, we investigate the impact of various convolutional neural network (CNN) architectures on the accuracy of diagnosing Alzheimer’s disease (AD) using patient MRI scans. Alzheimer’s disease is a debilitating neurodegenerative disorder that affects millions worldwide. Early, accurate, and non-invasive diagnostic methods are required for providing optimal care and symptom management. Deep learning techniques, particularly CNNs, have shown great promise in enhancing this diagnostic process. We aim to contribute to the ongoing research in this field by comparing the effectiveness of different CNN architectures and providing insights for future studies. Our methodology involved preprocessing MRI data, implementing multiple CNN architectures, and evaluating the performance of each model. We employed intensity normalization, linear registration, and skull stripping for our preprocessing. The selected architectures included VGG, ResNet, and DenseNet models, all implemented using the Keras library. We employed transfer learning and trained models from scratch to compare their effectiveness. Our findings demonstrated significant differences in performance among the tested architectures, with DenseNet201 achieving the highest accuracy of 86.4%. Transfer learning proved to be helpful in improving model performance. We also identified potential areas for future research, such as experimenting with other architectures, optimizing hyperparameters, and employing fine-tuning strategies. By providing a comprehensive analysis of the selected CNN architectures, we offer a solid foundation for future research in Alzheimer’s disease diagnosis using deep learning techniques. Our study highlights the potential of CNNs as a valuable diagnostic tool and emphasizes the importance of ongoing research to develop more accurate and effective models.
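
A minimal transfer-learning setup in the spirit of the one described can be sketched in Keras. This is not the authors' code: the input shape, the channel replication needed for ImageNet weights, the binary output and all hyperparameters are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumes 2D MRI slices resized to 224x224 and replicated to 3 channels so
# that ImageNet-pretrained weights can be used; binary AD-vs-control output.

base = tf.keras.applications.DenseNet201(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                      # freeze pretrained features

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),  # AD vs. control
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=20)
```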

Keywords: Alzheimer’s disease, convolutional neural networks, deep learning, medical imaging, MRI

Procedia PDF Downloads 74
6337 Designing Urban Spaces Differently: A Case Study of the Hercity Herstreets Public Space Improvement Initiative in Nairobi, Kenya

Authors: Rehema Kabare

Abstract:

As urban development initiatives continue to emerge and are implemented amid rapid urbanization and the effects of climate change in the global south, the plight of women is only now being noticed. The pandemic exposed the atrocities, violence and lack of safety women and girls face daily, both in their homes and in public urban spaces. This is a result of poorly implemented and managed urban structures, from whose design and implementation women have been left out for centuries. The UN-Habitat HerCity toolkit provides a unique opportunity for both governments and civil society actors to change course, with women and girls brought on board urban development initiatives and their designs and ideas made the focal point. This toolkit proves that when women and girls design, they design for everyone. The HerCity HerStreets Public Space Improvement Initiative resulted in a design that focused on two aspects: streets as a shared resource, and streets as public spaces. These two concepts illustrate that for streets to be experienced effectively as cultural spaces, they need to be user-friendly, safe and inclusive. This report demonstrates how HerCity HerStreets, as a pilot project, can be a benchmark for designing urban spaces in African cities. The project focused on five dimensions, including improving the air quality of the space, the allocation of space to street vending and boda-boda (passenger motorcycle) stops and parking, and the green coverage. The process displays how digital tools such as Minecraft and Kobo Toolbox can be utilized to improve citizens' participation in the development of public spaces, with a special focus on including vulnerable groups such as women, girls and youth.

Keywords: urban space, sustainable development, gender and the city, digital tools and urban development

Procedia PDF Downloads 84
6336 Evaluation of Yield and Yield Components of Malaysian Palm Oil Board-Senegal Oil Palm Germplasm Using Multivariate Tools

Authors: Khin Aye Myint, Mohd Rafii Yusop, Mohd Yusoff Abd Samad, Shairul Izan Ramlee, Mohd Din Amiruddin, Zulkifli Yaakub

Abstract:

A narrow genetic base is the main obstacle to breeding and genetic improvement in the oil palm industry. In order to broaden the genetic base, the Malaysian Palm Oil Board has extensively collected wild germplasm from its original range in 11 African countries: Nigeria, Senegal, Gambia, Guinea, Sierra Leone, Ghana, Cameroon, Zaire, Angola, Madagascar, and Tanzania. The germplasm collections were established and maintained as a field gene bank at the Malaysian Palm Oil Board (MPOB) Research Station in Kluang, Johor, Malaysia, to conserve a wide range of oil palm genetic resources for genetic improvement of the Malaysian oil palm industry. Therefore, assessing the performance and genetic diversity of the wild materials is very important for understanding the genetic structure of the natural oil palm population and for exploring genetic resources. Principal component analysis (PCA) and cluster analysis are very efficient multivariate tools for evaluating the genetic variation of germplasm and have been applied to many crops. In this study, eight populations of the MPOB-Senegal oil palm germplasm were studied to explore the pattern of genetic variation using PCA and cluster analysis. A total of 20 yield and yield component traits were used for PCA and Ward's clustering using SAS software version 9.4. The first four principal components, which have eigenvalues >1, accounted for 93% of the total variation, with values of 44%, 19%, 18% and 12%, respectively. PC1 showed the highest positive correlations with fresh fruit bunch (0.315), bunch number (0.321), oil yield (0.317), kernel yield (0.326), total economic product (0.324), and total oil (0.324), while PC2 had the largest positive associations with oil to wet mesocarp (0.397) and oil to fruit (0.458). The oil palm populations were grouped into four distinct clusters based on the 20 evaluated traits, implying that high genetic variation existed among the germplasm. Cluster 1 contains two populations, SEN 12 and SEN 10, while cluster 2 has only one population, SEN 3. Cluster 3 consists of three populations, SEN 4, SEN 6, and SEN 7, while SEN 2 and SEN 5 were grouped in cluster 4. Cluster 4 showed the highest mean values of fresh fruit bunch, bunch number, oil yield, kernel yield, total economic product, and total oil, and cluster 1 was characterized by high oil to wet mesocarp and oil to fruit. The desired traits that have the largest positive correlations with the extracted PCs could be utilized for the improvement of the oil palm breeding program. The populations from different clusters with the highest cluster means could be used for hybridization. The information from this study can be utilized for effective conservation and selection of the MPOB-Senegal oil palm germplasm for the future breeding program.
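
The multivariate workflow (PCA with the eigenvalue-greater-than-one criterion, followed by Ward's clustering) can be sketched in Python as follows. The study itself used SAS 9.4; the synthetic 8 x 20 trait matrix and the choice of four clusters below are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
X = rng.standard_normal((8, 20))          # placeholder: 8 populations x 20 traits

X_std = StandardScaler().fit_transform(X) # standardize traits
pca = PCA()
scores = pca.fit_transform(X_std)
n_keep = int(np.sum(pca.explained_variance_ > 1))   # Kaiser criterion
print(f"retained PCs: {n_keep}, variance explained: "
      f"{pca.explained_variance_ratio_[:n_keep].sum():.0%}")

Z = linkage(X_std, method="ward")         # Ward's hierarchical clustering
clusters = fcluster(Z, t=4, criterion="maxclust")
print("cluster assignment per population:", clusters)
```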

Keywords: cluster analysis, genetic variability, germplasm, oil palm, principal component analysis

Procedia PDF Downloads 167
6335 Simultaneous Interpreting and Meditation: An Experimental Study on the Effects of Qigong Meditation on Simultaneous Interpreting Performance

Authors: Lara Bruno, Ilaria Tipà, Franco Delogu

Abstract:

Simultaneous interpreting (SI) is a demanding language task which involves the concurrent activation of different cognitive processes. This complex activity requires interpreters not only to be proficient in their working languages but also to have a great ability to focus attention and control anxiety during their performance. Qigong meditation techniques have a positive impact on several cognitive functions, including attention and anxiety control. This study aims at exploring the influence of Qigong meditation on the quality of simultaneous interpreting. Twenty interpreting students, divided into two groups, were trained for 8 days in Qigong meditation practice. Before and after training, a brief simultaneous interpreting task was performed. The language combinations of group A and group B were respectively English-Italian and Chinese-Italian. Students' performances were recorded and rated by independent evaluators. Assessments were based on 12 different parameters, divided into 4 macro-categories: content, form, delivery and anxiety control. To determine whether there was any significant variation between the pre-training and post-training SI performance, ANOVA analyses were conducted on the ratings provided by the independent evaluators. The main results indicate a significant improvement in interpreting performance after the meditation training intervention for both groups. However, group A registered a higher improvement compared to group B. Nonetheless, positive effects of meditation were found in all the observed macro-categories. Meditation was not only beneficial for speech delivery and anxiety control but also for cognitive and attention abilities. From a cognitive and pedagogical point of view, the present results open new paths of research on the practice of meditation as a tool to improve SI performance.

Keywords: cognitive science, interpreting studies, Qigong meditation, simultaneous interpreting, training

Procedia PDF Downloads 161
6334 Choosing an Optimal Epsilon for Differentially Private Arrhythmia Analysis

Authors: Arin Ghazarian, Cyril Rakovski

Abstract:

Differential privacy has become the leading technique to protect the privacy of individuals in a database while allowing useful analysis to be done and the results to be shared. It puts a guarantee on the amount of privacy loss in the worst-case scenario. Differential privacy is not a toggle between full privacy and zero privacy. It controls the tradeoff between the accuracy of the results and the privacy loss using a single key parameter called epsilon (ε).
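
For context, epsilon typically enters through noise calibrated to it, as in the Laplace mechanism. The sketch below is a minimal Python illustration with toy data, not the paper's analysis: it shows how shrinking epsilon trades accuracy for privacy.

```python
import numpy as np

# Laplace mechanism: add noise with scale = sensitivity / epsilon to a
# query result. Smaller epsilon means more privacy and noisier answers.
# The query and values here are illustrative.

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(0.0, sensitivity / epsilon)

heart_rates = np.array([62, 71, 95, 58, 88])     # toy ECG-derived data
true_mean = heart_rates.mean()
sensitivity = (180 - 40) / len(heart_rates)      # bounded-range mean query

for eps in (0.1, 1.0, 10.0):
    noisy = laplace_mechanism(true_mean, sensitivity, eps)
    print(f"epsilon={eps:>4}: noisy mean = {noisy:.1f} (true {true_mean:.1f})")
```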

Keywords: arrhythmia, cardiology, differential privacy, ECG, epsilon, medical data, privacy preserving analytics, statistical databases

Procedia PDF Downloads 155
6333 Use of Locally Available Organic Resources for Soil Fertility Improvement on Farmers Yield in the Eastern and Greater Accra Regions of Ghana

Authors: Ebenezer Amoquandoh, Daniel Bruce Sarpong, Godfred K. Ofosu-Budu, Andreas Fliessbach

Abstract:

Soil quality is at stake globally, but under tropical conditions the loss of soil fertility may be existential. The current rates of soil nutrient depletion, erosion and environmental degradation in most of Africa's farmland urgently require methods for soil fertility restoration through affordable agricultural management techniques. The study assessed the effects of locally available organic resources on soil fertility, crop yield and profitability, compared to business as usual, on farms in the Eastern and Greater Accra regions of Ghana. In addition, we analyzed the change in farmers' perceptions and knowledge following their experience with the new techniques, assessed the effect of using locally available organic resources on farmers' yields, and determined the factors influencing the profitability of farming. Using the difference in mean score and proportion to estimate the extent to which farmers' perceptions, knowledge and practices have changed, the study showed that farmers' perception, knowledge and practice regarding the use of locally available organic resources have changed significantly. This paves the way for the sustainable use of locally available organic resources for soil fertility improvement. The propensity score matching technique and the endogenous switching regression model showed that using locally available organic resources has the potential to increase crop yield. It was also observed, using profit margin, net farm income and return on investment analyses, that it is more profitable to use locally available organic resources than the other soil fertility amendment techniques studied. The results further showed that socioeconomic factors, farm characteristics and institutional factors are significant in influencing farmers' decisions to use locally available organic resources, and their profitability.

Keywords: soil fertility, locally available organic resources, perception, profitability, sustainability

Procedia PDF Downloads 149
6332 Combined Cervical Headache Snag with Cervical Snag Half Rotation Techniques on Cervicogenic Headache Patients

Authors: Wael Salah Shendy, Moataz Mohamed EL Semary, Hosam Salah Murad, Adham A. Mohamed

Abstract:

Background: Cervicogenic headache is a major problem for many people suffering from upper cervical dysfunction, and there is great conflict over its physical therapy management. Objectives: To determine the effect of C1-C2 Mulligan SNAG mobilizations on cervicogenic headache and associated dizziness symptoms. Methods: Forty-eight patients with cervicogenic headache were included in the study, recruited from the outpatient clinic of the Faculty of Physical Therapy, Cairo University, and New Cairo outpatient clinics, and randomly assigned to three equal groups: group A (headache SNAG), group B (C1-C2 SNAG rotation) and group C (combined). Their mean ages were 29.37 ± 2.6, 29.31 ± 2.54 and 29.68 ± 2.65 years, respectively. The Neck Disability Index was used to examine neck pain intensity and CEH symptoms. The 6-Item Headache Impact Test (HIT-6) scale was used to examine headache severity and its adverse effects on social life and functions. The Flexion-Rotation Test (FRT) was also used to assess rotation ROM at the level of C1-C2 with a CROM device. The Dizziness Handicap Inventory (DHI) scale was used to evaluate dizziness symptoms. Evaluations were done pre- and post-treatment, and comparisons between groups were quantified. Correlations between the examined parameters were also measured. The headache SNAG and C1-C2 rotation SNAGs were applied separately as treatment interventions in groups A and B and combined in group C. Results: Group C showed significant improvement in all parameters compared to groups A and B; a positive correlation was found between NDI and HIT-6 scores, whereas a negative correlation was found between NDI and DHI scores. Conclusion: The SNAG mobilizations used in the study were effective in reducing cervicogenic headache and dizziness symptoms in all groups, with a noticeable improvement in the combined group.

Keywords: cervicogenic headache, cervical headache snag, cervical snag half rotation, cervical dizziness

Procedia PDF Downloads 199
6331 A Hybrid Model of Structural Equation Modelling-Artificial Neural Networks: Prediction of Influential Factors on Eating Behaviors

Authors: Maryam Kheirollahpour, Mahmoud Danaee, Amir Faisal Merican, Asma Ahmad Shariff

Abstract:

Background: The presence of nonlinearity among the risk factors of eating behavior causes bias in prediction models. The importance of accurately estimating eating behavior risk factors in the primary prevention of obesity has been established. Objective: The aim of this study was to explore the potential of a hybrid model of structural equation modeling (SEM) and artificial neural networks (ANN) to predict eating behaviors. Methods: Partial Least Squares SEM (PLS-SEM) and a hybrid model (SEM-Artificial Neural Networks, SEM-ANN) were applied to evaluate the factors affecting eating behavior patterns among university students. 340 university students participated in this study. The PLS-SEM analysis was used to check the effect of the emotional eating scale (EES), body shape concern (BSC), and body appreciation scale (BAS) on different categories of eating behavior patterns (EBP). Then, the hybrid model was built using a multilayer perceptron (MLP) with a feedforward network topology. Levenberg-Marquardt, a supervised learning algorithm, was applied as the learning method for MLP training. The tangent sigmoid function was used for the input layer, while the linear function was applied for the output layer. The coefficient of determination (R²) and mean square error (MSE) were calculated. Results: The hybrid model proved superior to the PLS-SEM method. Using the hybrid model, the optimal network was an MLP with a 3-17-8 architecture, with which the R² of the model increased by 27% while the MSE decreased by 9.6%. Moreover, it was found which of these factors significantly affected healthy and unhealthy eating behavior patterns; the p-value was reported to be less than 0.01 for most of the paths. Conclusion/Importance: Thus, a hybrid approach could be suggested as a significant methodological contribution from a statistical standpoint, and it can be implemented as software to be able to predict models with the highest accuracy.
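
A rough sketch of the ANN stage is given below in Python/scikit-learn. The study used Levenberg-Marquardt training, which scikit-learn does not offer, so 'lbfgs' stands in here; the 3-17-8 architecture is read as 3 inputs with hidden layers of 17 and 8 units, and the factor-score data are synthetic placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(7)
X = rng.standard_normal((340, 3))         # stand-in EES, BSC, BAS factor scores
y = X @ np.array([0.5, -0.3, 0.2]) + 0.1 * rng.standard_normal(340)

# tanh hidden activation approximates the tangent sigmoid; output is linear.
mlp = MLPRegressor(hidden_layer_sizes=(17, 8), activation="tanh",
                   solver="lbfgs", max_iter=2000, random_state=0)
mlp.fit(X, y)
pred = mlp.predict(X)
print(f"R2 = {r2_score(y, pred):.3f}, MSE = {mean_squared_error(y, pred):.4f}")
```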

Keywords: hybrid model, structural equation modeling, artificial neural networks, eating behavior patterns

Procedia PDF Downloads 157
6330 Comparison between the Quadratic and the Cubic Linked Interpolation on the Mindlin Plate Four-Node Quadrilateral Finite Elements

Authors: Dragan Ribarić

Abstract:

We employ the so-called problem-dependent linked interpolation concept to develop two cubic 4-node quadrilateral Mindlin plate finite elements with 12 external degrees of freedom. In problem-independent linked interpolation, the interpolation functions are independent of any problem material parameters and the rotation fields are not expressed in terms of the nodal displacement parameters. On the contrary, in problem-dependent linked interpolation, the interpolation functions depend on the material parameters and the rotation fields are expressed in terms of the nodal displacement parameters. Two cubic 4-node quadrilateral plate elements are presented, named Q4-U3 and Q4-U3R5. The first is modelled with one displacement and two rotation degrees of freedom at each of the four element nodes, and the second element has five additional internal degrees of freedom to attain polynomial completeness of the cubic form, which can be statically condensed within the element. Both elements are able to pass the constant-bending patch test exactly, as well as the non-zero constant-shear patch test on the oriented regular mesh geometry in the case of cylindrical bending. For any mesh shape, the elements have the correct rank, and only the three eigenvalues corresponding to rigid body motions are zero; there are no additional spurious zero-energy modes responsible for instability of the finite element models. In comparison with the problem-independent cubic linked interpolation implemented in Q9-U3, the nine-node plate element, significantly fewer degrees of freedom are employed in the model while retaining the interpolation conformity between adjacent elements. The presented elements are also compared to the existing problem-independent quadratic linked-interpolation element Q4-U2 and to the other known elements that also use quadratic or cubic linked interpolation, by testing them on several benchmark examples. A simple functional upgrade from quadratic to cubic linked interpolation, implemented in the Q4-U3 element, showed no significant improvement compared to the quadratic linked form of the Q4-U2 element. Only when the additional bubble terms are incorporated in the displacement and rotation function fields, completing the full cubic linked interpolation form, is a qualitative improvement achieved, in the Q4-U3R5 element. Nevertheless, the locking problem exists even for both presented elements, as in all pure displacement elements applied to very thin plates modelled by coarse meshes. But good, and even slightly better, performance can be noticed for the Q4-U3R5 element when compared with elements from the literature, provided the model meshes are moderately dense and the plate thickness is not extremely thin. In some cases, it is comparable to or even better than the Q9-U3 element, which has as many as 12 more external degrees of freedom. A significant improvement can be noticed in particular when modeling very skew plates, models with singularities in the stress fields, and circular plates with distorted meshes.

Keywords: Mindlin plate theory, problem-independent linked interpolation, problem-dependent interpolation, quadrilateral displacement-based plate finite elements

Procedia PDF Downloads 313
6329 Clinical Parameters Response to Low Level Laser Versus Monochromatic Near Infrared Photo Energy in Diabetic Patient with Peripheral Neuropathy

Authors: Abeer Ahmed Abdehameed

Abstract:

Background: Diabetic sensorimotor polyneuropathy (DSP) is one of the most common microvascular complications of type 2 diabetes. Loss of sensation is thought to contribute to a lack of static and dynamic stability and an increased risk of falling. Purpose: The purpose of this study was to compare the effects of low-level laser (LLL) and monochromatic near-infrared photo energy (MIRE) on pain, cutaneous sensation, static stability and an index of lower limb blood flow in diabetics with peripheral neuropathy. Methods: Forty subjects with diabetic peripheral neuropathy were recruited for the study. They were divided into two groups: the MIRE group, which included 20 patients, and the LLL group, which included 20 patients. All patients in the study were subjected to various physical assessment procedures, including pain, cutaneous sensation, Doppler flowmeter and static stability assessments. The baseline measurements were followed by treatment sessions conducted twice a week for 6 successive weeks. Results: The statistical analysis of the data revealed significant improvement in pain in both groups, with significant improvement in cutaneous sensation and static balance in the MIRE group compared to the LLL group; on the other hand, the results showed no significant differences in lower limb blood flow in either group. Conclusion: Low-level laser and monochromatic near-infrared therapy can improve painful symptoms in patients with diabetic neuropathy. In addition, MIRE is useful in improving cutaneous sensation and static stability in patients with diabetic neuropathy.

Keywords: diabetic neuropathy, doppler flow meter, low level laser, monochromatic near infrared photo energy

Procedia PDF Downloads 315
6328 Larger Diameter 22 MM-PDC Cutter Greatly Improves Drilling Efficiency of PDC Bit

Authors: Fangyuan Shao, Wei Liu, Deli Gao

Abstract:

With the increasing pace of oil and gas exploration, development and production worldwide, the demand for drilling speed-up technology is becoming more and more critical to reducing development cost. Highly efficient and customized PDC bits are important equipment in the bottom hole assembly (BHA). Therefore, improving the rock-breaking efficiency of PDC bits will help reduce drilling time and drilling cost. Advances in PDC bit technology have resulted in a leapfrogging improvement in the rate of penetration (ROP) of PDC bits over roller cone bits in soft to medium-hard formations. Recently, with the development of PDC technology, the diameter of the PDC cutter can be further expanded; the maximum diameter of the PDC cutters used in this paper is 22 mm. According to theoretical calculation, under the same depth of cut (DOC), the 22 mm-PDC cutter increases the exposure of the cutter, and the increase in PDC cutter diameter helps to increase the cutting area of the PDC cutter. In order to compare the cutting performance of the 22 mm-PDC cutter with the existing commonly used cutters, 16 mm, 19 mm and 22 mm PDC cutters were tested on a vertical turret lathe (VTL) in the laboratory under different DOCs: 0.5 mm, 1.0 mm, 1.5 mm, 2.0 mm, 2.5 mm and 3.0 mm. The rock sample used in the experiment was limestone. Results of the laboratory tests have shown that the new 22 mm-PDC cutter technology greatly improved cutting efficiency. On the one hand, as the DOC increases, the mechanical specific energy (MSE) of all cutters decreases, which means that the cutting efficiency increases. On the other hand, under the same DOC condition, the larger the cutter diameter, the larger the working area of the cutter, which leads to higher cutting efficiency. In view of the high performance of the 22 mm-PDC cutters, they were applied in full-scale bit field experiments. The results show that the bit with 22 mm-PDC cutters achieves a breakthrough improvement in ROP over bits with conventional 16 mm and 19 mm cutters in offset well drilling.
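
The geometric claim can be checked with a back-of-the-envelope calculation: approximating the engaged face of a cutter as a circular segment of depth h on a disc of radius r (ignoring back-rake and side-rake angles), the engaged area at a fixed DOC grows with cutter diameter. A short Python sketch with illustrative numbers only:

```python
import math

# Area of a circular segment of depth h on a disc of radius r:
# A = r^2 * arccos((r - h) / r) - (r - h) * sqrt(2*r*h - h^2)

def segment_area_mm2(diameter_mm, doc_mm):
    r, h = diameter_mm / 2.0, doc_mm
    return r * r * math.acos((r - h) / r) - (r - h) * math.sqrt(2 * r * h - h * h)

for d in (16, 19, 22):
    print(f"{d} mm cutter, DOC 2.0 mm: {segment_area_mm2(d, 2.0):.1f} mm^2")
# 16 mm -> ~14.5 mm^2, 19 mm -> ~16.0 mm^2, 22 mm -> ~17.1 mm^2
```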

Keywords: polycrystalline diamond compact, 22 mm-PDC cutters, cutting efficiency, mechanical specific energy

Procedia PDF Downloads 207
6327 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis

Authors: Mehrnaz Mostafavi

Abstract:

The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses the issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules in patients with pre-existing cancer. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine-learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured Query Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and its potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
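
A minimal sketch of the sentence-classification stage might look like this (Python/scikit-learn; the tiny training set is invented for illustration and is not the study's curated corpus):

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Classify report sentences as mentioning a pulmonary nodule or not.
train_sentences = [
    "A 6 mm pulmonary nodule is seen in the right upper lobe.",
    "Stable 4 mm nodule in the left lower lobe, unchanged from prior.",
    "No focal consolidation, effusion or pneumothorax.",
    "The heart size is within normal limits.",
]
train_labels = [1, 1, 0, 0]   # 1 = nodule sentence, 0 = other

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(train_sentences, train_labels)

print(clf.predict(["New 8 mm nodule in the lingula."]))  # -> [1]
```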

Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans

Procedia PDF Downloads 104
6326 Classification Using Worldview-2 Imagery of Giant Panda Habitat in Wolong, Sichuan Province, China

Authors: Yunwei Tang, Linhai Jing, Hui Li, Qingjie Liu, Xiuxia Li, Qi Yan, Haifeng Ding

Abstract:

The giant panda (Ailuropoda melanoleuca) is an endangered species living mainly in central China, where bamboos act as the main food source of wild giant pandas. Knowledge of the spatial distribution of bamboos therefore becomes important for identifying the habitat of giant pandas. There have been ongoing studies on mapping bamboos and other tree species using remote sensing. WorldView-2 (WV-2) is the first high-resolution commercial satellite with eight multi-spectral (MS) bands. Recent studies demonstrated that WV-2 imagery has high potential in the classification of tree species. Advanced classification techniques are important for utilising high spatial resolution imagery. It is generally agreed that object-based image analysis is a more desirable method than pixel-based analysis in processing high spatial resolution remotely sensed data. Classifiers that use spatial information combined with spectral information are known as contextual classifiers, and it has been suggested that contextual classifiers can achieve greater accuracy than non-contextual ones. Thus, spatial correlation can be incorporated into classifiers to improve classification results. The study area is located in the Wuyipeng area in Wolong, Sichuan Province. The complex environment makes information extraction difficult, since bamboos are sparsely distributed, mixed with brushes, and covered by other trees. Extensive fieldwork in Wuyipeng was carried out twice. The first campaign, on 11th June 2014, aimed at sampling feature locations for geometric correction and collecting training samples for classification; the second, on 11th September 2014, served to test the classification results. In this study, spectral separability analysis was first performed to select appropriate MS bands for classification. The reflectance analysis also provided information for expanding the sample set when only a few sample points were known. Then, a spatially weighted object-based k-nearest neighbour (k-NN) classifier was applied to the selected MS bands to identify seven land cover types (bamboo, conifer, broadleaf, mixed forest, brush, bare land, and shadow), accounting for spatial correlation within classes using geostatistical modelling. The spatially weighted k-NN method was compared with three alternatives: the traditional k-NN classifier, the Support Vector Machine (SVM) method and the Classification and Regression Tree (CART). Through field validation, it was shown that the classification result obtained using the spatially weighted k-NN method has the highest overall classification accuracy (77.61%) and Kappa coefficient (0.729); the producer's accuracy and user's accuracy reach 81.25% and 95.12% for the bamboo class, respectively, also higher than for the other methods. Photos of tree crowns were taken at sample locations using a fisheye camera so that canopy density could be estimated. It was found that it is difficult to identify bamboo in areas with a large canopy density (over 0.70); it is possible to extract bamboos in areas with a medium canopy density (from 0.2 to 0.7) and in sparse forest (canopy density less than 0.2). In summary, this study explores the ability of WV-2 imagery for bamboo extraction in a mountainous region in Sichuan. The study successfully identified the bamboo distribution, providing supporting knowledge for assessing the habitats of giant pandas.
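
One way to picture a spatially weighted k-NN vote is sketched below. This is an illustrative Python simplification: neighbours are found in spectral space and their votes are weighted by an exponential spatial-correlation term, whereas the actual study derived weights from fitted geostatistical models; the range parameter and toy data are assumptions.

```python
import numpy as np

def spatially_weighted_knn(x, xy, X_train, XY_train, y_train, k=5, range_m=100.0):
    spec_d = np.linalg.norm(X_train - x, axis=1)          # spectral distance
    idx = np.argsort(spec_d)[:k]                          # k nearest spectrally
    geo_d = np.linalg.norm(XY_train[idx] - xy, axis=1)    # geographic distance
    w = np.exp(-geo_d / range_m)                          # spatial correlation weight
    classes = np.unique(y_train[idx])
    votes = [w[y_train[idx] == c].sum() for c in classes]
    return classes[int(np.argmax(votes))]

# Toy example: 2 classes, 4-band pixels with map coordinates.
rng = np.random.default_rng(3)
X_train = rng.random((50, 4)); XY_train = rng.random((50, 2)) * 1000
y_train = rng.integers(0, 2, 50)
print(spatially_weighted_knn(X_train[0], XY_train[0], X_train, XY_train, y_train))
```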

Keywords: bamboo mapping, classification, geostatistics, k-NN, worldview-2

Procedia PDF Downloads 314
6325 Applicability of Linearized Model of Synchronous Generator for Power System Stability Analysis

Authors: J. Ritonja, B. Grcar

Abstract:

For synchronous generator simulation and analysis, and for power system stabilizer design and synthesis, a mathematical model of the synchronous generator is needed. The model has to describe the dynamics of the oscillations accurately, while remaining transparent enough for analysis and sufficiently simplified for control system design. To study the oscillations of the synchronous generator against the rest of the power system, a model of the synchronous machine connected to an infinite bus through a transmission line with resistance and inductance is needed. In this paper, the linearized reduced-order dynamic model of the synchronous generator connected to the infinite bus is presented and analysed in detail. This model accurately describes the dynamics of the synchronous generator only in a small vicinity of an equilibrium state; as operation departs from the selected equilibrium point, the accuracy of the model decreases considerably. The paper explains and summarizes the equations and the parameter determination for the linearized reduced-order mathematical model of the synchronous generator, which represents a useful basis for work on synchronous generator dynamic behaviour analysis and on control system design and synthesis. The main contribution of this paper is a detailed analysis of the accuracy of the linearized reduced-order dynamic model over the entire operating range of the synchronous generator. The borders of the regions in which the linearized reduced-order mathematical model accurately describes the synchronous generator's dynamics are determined by systematic numerical analysis, and a thorough eigenvalue analysis of the linearized models is performed over the entire operating range. The parameters of the linearized reduced-order dynamic model of a laboratory salient-pole synchronous generator were determined and used for the analysis. The theoretical conclusions were confirmed by the agreement of experimental and simulation results.
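
As a rough illustration of the eigenvalue-analysis step over an operating range, the sketch below linearizes the classical two-state swing-equation model about several operating angles and checks small-signal stability. The per-unit parameters are invented for the example, and the paper's reduced-order model contains more states than this; the sketch only shows the mechanics of the analysis.

```python
# Eigenvalue analysis of a classical two-state linearized machine model
# across operating angles (illustrative parameters, not the paper's model).
import numpy as np

H, D = 3.5, 1.0              # inertia constant (s), damping coefficient (pu)
omega_s = 2 * np.pi * 50     # synchronous speed (rad/s), 50 Hz assumed
Pmax = 1.2                   # E'V/X, air-gap power amplitude (pu)

for delta_deg in (10, 30, 50, 70, 95):
    # Synchronising coefficient at the chosen equilibrium angle.
    Ks = Pmax * np.cos(np.deg2rad(delta_deg))
    # Linearized state matrix for x = [d_delta, d_omega].
    A = np.array([[0.0, omega_s],
                  [-Ks / (2 * H), -D / (2 * H)]])
    eigs = np.linalg.eigvals(A)
    stable = np.all(eigs.real < 0)
    print(f"delta0 = {delta_deg:2d} deg  eigenvalues = {eigs}  stable = {stable}")
```

Past 90 degrees the synchronising coefficient turns negative and one eigenvalue moves into the right half-plane, which is the kind of operating-range boundary the paper's systematic analysis maps out.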

Keywords: eigenvalue analysis, mathematical model, power system stability, synchronous generator

Procedia PDF Downloads 248
6324 Agile Software Effort Estimation Using Regression Techniques

Authors: Mikiyas Adugna

Abstract:

Effort estimation is among the activities carried out in software development processes, and an accurate estimation model contributes to project success. Agile effort estimation is a complex task because of the dynamic nature of software development, and researchers are still studying it to enhance prediction accuracy. For these reasons, we investigated and propose a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has four major components: preprocessing, a train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, the dataset is split into a training set (80%) and a testing set (20%). Following the split, the two algorithms (Elastic Net and LASSO regression) are trained in two phases. In the first phase, both algorithms are trained with their default parameters and evaluated on the testing data. In the second phase, grid search (used to tune and select the optimum parameters) with 5-fold cross-validation produces the final trained model, which is then evaluated on the testing set. The experimental work used an agile story-point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the compared methods. Of the two, LASSO regression achieved the better predictive performance, with PRED(8%) and PRED(25%) of 100.0, MMRE of 0.0491, MMER of 0.0551, MdMRE of 0.0593, MdMER of 0.063, and MSE of 0.0007. These results imply that the LASSO-trained model is the most acceptable and offers higher estimation performance than models reported in the literature.
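
A minimal sketch of the described two-phase pipeline follows, assuming a scikit-learn implementation (the paper does not name its tooling). The story-point data, parameter grids, and split seed are placeholders; the sketch mirrors the paper's order of steps, including normalizing the full dataset before the split.

```python
# Sketch of the two-phase LASSO / Elastic Net pipeline described above.
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import MinMaxScaler

# Placeholder story-point data: feature matrix X and effort target y.
rng = np.random.default_rng(42)
X = rng.random((210, 6))
y = rng.random(210) * 20 + 1

# Preprocessing: normalise the whole dataset, then an 80/20 train-test split.
X = MinMaxScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.8, random_state=1)

def mmre(y_true, y_pred):
    """Mean magnitude of relative error, one of the paper's metrics."""
    return np.mean(np.abs(y_true - y_pred) / y_true)

for name, model, grid in [
    ("LASSO", Lasso(), {"alpha": [0.001, 0.01, 0.1, 1.0]}),
    ("ElasticNet", ElasticNet(), {"alpha": [0.001, 0.01, 0.1, 1.0],
                                  "l1_ratio": [0.2, 0.5, 0.8]}),
]:
    # Phase 1: default parameters, evaluated on the held-out test set.
    default_mmre = mmre(y_te, model.fit(X_tr, y_tr).predict(X_te))
    # Phase 2: grid search with 5-fold cross-validation, then final evaluation.
    search = GridSearchCV(model, grid, cv=5).fit(X_tr, y_tr)
    tuned_mmre = mmre(y_te, search.predict(X_te))
    print(f"{name}: default MMRE={default_mmre:.3f}, tuned MMRE={tuned_mmre:.3f}")
```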

Keywords: agile software development, effort estimation, elastic net regression, LASSO

Procedia PDF Downloads 73
6323 Enhancing the Implementation Strategy of Simultaneous Operations (SIMOPS) for the Major Turnaround at Pertamina Plaju Refinery

Authors: Fahrur Rozi, Daniswara Krisna Prabatha, Latief Zulfikar Chusaini

Abstract:

Pertamina Plaju Refinery, the oldest and historically least technologically advanced of Pertamina's refineries, faces a unique challenge. The refinery integrates facilities established by Shell in 1904 and by Stanvac (originally Standard Oil) in 1926, so its primary challenge is not complexity alone but ensuring reliability across an operational history of more than a century. In all that time, Plaju Refinery had never undergone a comprehensive major turnaround encompassing all of its units; the usual practice was partial turnarounds conducted sequentially across its primary, secondary, and tertiary (utilities and offsite) units. A significant shift came in the fourth quarter of 2023, when the refinery embarked on its first-ever major turnaround since its establishment, a decision driven by the alignment of maintenance timelines across various units. The major turnaround was scheduled for October-November 2023, spanning 45 calendar days, with the objective of enhancing the operational reliability of all refinery units. The extensive job list encompasses 1583 tasks across 18 units/areas, involving approximately 9000 contracted workers. In this context, the strategy of Simultaneous Operations (SIMOPS) execution emerges as a pivotal tool to optimize time efficiency and ensure safety. A Hazard Effect Management Process (HEMP) was employed to assess the risk rating of each task within the turnaround; of the tasks assessed, 22 were deemed high-risk and necessitated mitigation. The SIMOPS approach serves as a preventive measure against potential incidents, and since every turnaround period at Pertamina Plaju Refinery involves SIMOPS-related tasks, enhancing the implementation strategy of SIMOPS is imperative to minimize the occurrence of incidents. At least four improvements were introduced for the major turnaround at Plaju Refinery. The first is conducting systematic risk assessment and potential-hazard mitigation studies for SIMOPS tasks before task execution, as opposed to the previous on-site approach. The second is the completion of SIMOPS Job Mitigation and Work Matrices Sheets, which were often neglected in the past. The third is comprehensively raising workers' and contractors' awareness of potential hazards and mitigation strategies for SIMOPS tasks before and during the major turnaround. The final improvement is the introduction of a daily program for inspecting and observing work in progress on SIMOPS tasks; prior to these improvements, there was no established program for monitoring ongoing SIMOPS activities during a turnaround. This study elucidates the steps taken to enhance SIMOPS within Pertamina, drawing on the experience of Plaju Refinery as a guide, and an actual case study from our experience in the operational unit is provided. In conclusion, these efforts are essential for the success of the first-ever major turnaround at Plaju Refinery, with the SIMOPS strategy serving as a central component. Based on these experiences, enhancements have been made to Pertamina's official Internal Guidelines for Executing SIMOPS Risk Mitigation, benefiting all Pertamina units.

Keywords: process safety management, turnaround, oil refinery, risk assessment

Procedia PDF Downloads 76
6322 Effectiveness and Safety of Vitamin D3 Supplementation in Children across India: A Retrospective Study

Authors: Gunjal Vijaya, Madkholkar Nishikant, Pawar Roshan, Sharma Akhilesh

Abstract:

Background: Vitamin D deficiency poses a significant burden on the Indian pediatric population. This observational study evaluates the real-world effectiveness and safety of vitamin D3 supplementation in children. Methods: A retrospective multi-center study was conducted using data from electronic health records spanning 10 years (2014–2024). Pediatric patients aged 0–12 years with documented vitamin D deficiency who were supplemented with vitamin D3 for at least two months were included. Serum vitamin D levels, health improvements assessed with the PROMIS Global Physical Health Scale (global01) item rated from Excellent to Poor, and adverse effects were analyzed. A paired t-test was used for data analysis. Results: A total of 1,942 participants were included. Serum vitamin D levels rose significantly, from 18.01 ng/mL (±8.73) before supplementation to 35.29 ng/mL (±13.53) after supplementation (p < 0.001). Participants with poorer baseline health experienced a greater elevation in serum vitamin D levels. Improvements were reported in general health (70.64% rated excellent or very good), bone development (58.35% mean improvement), and immunity (62.41% mean improvement). Adverse events were rare (0.21%) and mild, consisting primarily of transient gastrointestinal symptoms. Conclusion: Vitamin D3 supplementation effectively corrects deficiency and improves pediatric health outcomes with minimal adverse effects. This study provides essential real-world evidence for its clinical application in India. Future research should focus on randomized controlled trials to validate these findings and assist in developing dosing regimens.
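
For readers unfamiliar with the analysis, the snippet below runs a paired t-test of the kind described, on values simulated around the reported means and standard deviations; it is illustrative only and does not use the study's records.

```python
# Illustrative paired t-test on simulated before/after serum vitamin D levels.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 1942
before = rng.normal(18.01, 8.73, n)           # simulated baseline levels
after = before + rng.normal(17.28, 10.0, n)   # shift toward the post mean

t_stat, p_value = stats.ttest_rel(after, before)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")  # expect p < 0.001, as reported
```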

Keywords: cholecalciferol, pediatric health, vitamin D deficiency, real world evidence

Procedia PDF Downloads 3
6321 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is not new, yet proper classification of this textual information in a given context has remained very difficult. We therefore conducted a systematic review of previous literature on sentiment classification and the AI-based techniques it employs, in order to better understand how to design and develop a robust, more accurate sentiment classifier that can correctly distinguish, in a given context, social media text such as hate speech from inverted compliments with a high level of accuracy. We evaluated over 250 articles from digital sources such as ScienceDirect, ACM, Google Scholar, and IEEE Xplore and narrowed the selection down to 31 studies. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources such as Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques such as CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development owing to its simplicity and AI library support. Based on the most important of these findings, we make recommendations for future research.
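
Below is a minimal Keras sketch of the kind of hybrid CNN+LSTM classifier the review found to perform best; the vocabulary size, sequence length, layer sizes, and dummy data are illustrative assumptions, not taken from any reviewed paper.

```python
# Minimal hybrid CNN+LSTM sentiment classifier sketch (illustrative sizes).
import numpy as np
from tensorflow.keras import layers, models

VOCAB, MAXLEN = 20000, 100  # assumed tokenizer vocabulary and sequence length

model = models.Sequential([
    layers.Embedding(VOCAB, 128),             # word embeddings
    layers.Conv1D(64, 5, activation="relu"),  # local n-gram features
    layers.MaxPooling1D(4),                   # downsample the feature maps
    layers.LSTM(64),                          # longer-range sequence context
    layers.Dense(1, activation="sigmoid"),    # binary sentiment output
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Dummy batch of already-tokenised posts with 0/1 sentiment labels.
X = np.random.randint(0, VOCAB, size=(32, MAXLEN))
y = np.random.randint(0, 2, size=(32,))
model.fit(X, y, epochs=1, verbose=0)
```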

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 116
6320 Enhancing Nursing Teams' Learning: The Role of Team Accountability and Team Resources

Authors: Sarit Rashkovits, Anat Drach-Zahavy

Abstract:

The research considers the unresolved question of the link between nursing team accountability, team learning, and the resulting team performance in nursing teams. Empirical findings reveal disappointing evidence regarding improvement in healthcare safety and quality, so there is a need to advance managerial knowledge of the factors that sustain healthcare teams' constant proactive improvement efforts, that is, team learning. We aim, first, to identify the organizational resources needed for team learning in nursing teams; second, to test the moderating role of nursing teams' learning resources in the team accountability-team learning link; and third, to test a moderated mediation model suggesting that nursing teams' accountability affects team performance by enhancing team learning when relevant resources are available to the team. We focus on the moderating role of three team learning resources, namely time availability, team autonomy, and performance data, in the relation between team accountability and team learning, and we test the proposed moderated mediation model on 44 nursing teams (462 nurses and 44 nursing managers). As expected, the results showed a positive significant link between team accountability and team learning, and subsequently team performance, when time availability and team autonomy were high rather than low. The positive team accountability-team learning link was, however, significant when team performance feedback was low rather than high. Accordingly, there was a positive mediated effect of team accountability on team performance via team learning when either time availability or team autonomy was high and the availability of team performance data was low; this mediated effect was negative when time availability and team autonomy were low and the availability of team performance data was high. We conclude that nurturing team accountability is not enough to achieve nursing team learning and the subsequent improvement in team performance. Rather, nursing teams must be given adequate time and autonomy, and performance feedback should be used with caution, as it may motivate nursing teams to repeat routine work strategies rather than explore improved ones.

Keywords: nursing teams' accountability, nursing teams' learning, performance feedback, teams' autonomy

Procedia PDF Downloads 266
6319 Feasibility Study on Developing and Enhancing of Flood Forecasting and Warning Systems in Thailand

Authors: Sitarrine Thongpussawal, Dasarath Jayasuriya, Thanaroj Woraratprasert, Sakawtree Prajamwong

Abstract:

Thailand grapples with recurrent floods that cause substantial repercussions for its economy, society, and environment. In 2021, the economic toll of these floods amounted to an estimated 53,282 million baht, primarily impacting the agricultural sector. The existing flood monitoring system in Thailand suffers from inaccuracies and insufficient information, resulting in delayed warnings and ineffective communication to the public. The Office of the National Water Resources (ONWR) is tasked with developing and integrating data and information systems for efficient water resources management, yet it faces challenges in monitoring accuracy, forecasting, and timely warnings. This study evaluates the viability of enhancing Thailand's Flood Forecasting and Warning (FFW) systems and formulates a comprehensive work package, grounded in international best practices, to enhance them. Employing qualitative research methodologies, the study conducted in-depth interviews and focus groups with pertinent agencies; data analysis involved techniques such as note-taking and document analysis. The study substantiates the feasibility of developing and enhancing FFW systems in Thailand. Implementation of international best practices can augment the precision of flood forecasting and warning systems, empowering local agencies and residents in high-risk areas to prepare proactively and thereby minimizing the adverse impact of floods on lives and property. This research underscores that Thailand can feasibly advance its FFW systems by adopting international best practices, enhancing accuracy, and improving preparedness. The study thus enriches the theoretical understanding of flood forecasting and warning systems and furnishes valuable recommendations for their enhancement in Thailand.

Keywords: flooding, forecasting, warning, monitoring, communication, Thailand

Procedia PDF Downloads 62