Search results for: gait time
14814 The Effect of Mindfulness Meditation on Pain, Sleep Quality, and Self-Esteem in Patients Receiving Hemodialysis in Jordan
Authors: Hossam N. Alhawatmeh, Areen I. Albustanji
Abstract:
Hemodialysis negatively affects physical and psychological health. Pain, poor sleep quality, and low self-esteem are highly prevalent among patients with end-stage renal disease (ESRD) who receive hemodialysis, significantly increasing mortality and morbidity in those patients. Mind-body interventions (MBI), such as mindfulness meditation, have recently been gaining popularity and have improved pain, sleep quality, and self-esteem in different populations. However, to the best of our knowledge, their effects on these health problems in patients receiving hemodialysis have not been studied in Jordan. Thus, the purpose of the study was to examine the effect of mindfulness meditation on pain, sleep quality, and self-esteem in patients with ESRD receiving hemodialysis in Jordan. An experimental repeated-measures, randomized, parallel control design was conducted on 60 end-stage renal disease patients undergoing hemodialysis between March and June 2023 in the dialysis center at a public hospital in Jordan. Participants were randomly assigned to the experimental (n = 30) and control (n = 30) groups using a simple random assignment method. The experimental group practiced mindfulness meditation for 30 minutes three times per week for five weeks during their hemodialysis treatments. The control group continued to receive hemodialysis treatment as usual for five weeks during hemodialysis sessions. The study variables for both groups were measured at baseline (Time 0), two weeks after the start of the intervention (Time 1), and at the end of the intervention (Time 2). The numerical rating scale (NRS), the Rosenberg Self-Esteem Scale (RSES-M), and the Pittsburgh Sleep Quality Index (PSQI) were used to measure pain, self-esteem, and sleep quality, respectively. SPSS version 25 was used to analyze the study data. The sample was described by frequency, mean, and standard deviation as appropriate. Repeated-measures analysis of variance (ANOVA) tests were run to test the study hypotheses.
The results of the repeated-measures ANOVA (within-subject) revealed that mindfulness meditation significantly decreased pain by the end of the intervention in the experimental group. Additionally, mindfulness meditation improved sleep quality and self-esteem in the experimental group, and these improvements were significant after two weeks of the intervention and at the end of the intervention. The results of the repeated-measures ANOVA (within- and between-subject) revealed that the experimental group, compared to the control group, experienced lower levels of pain and higher levels of sleep quality and self-esteem over time. In conclusion, the results provided substantial evidence supporting the positive impacts of mindfulness meditation on pain, sleep quality, and self-esteem in patients with ESRD undergoing hemodialysis. These results highlight the potential of mindfulness meditation as an adjunctive therapy in the comprehensive care of this patient population. Incorporating mindfulness meditation into the treatment plan for patients receiving hemodialysis may contribute to improved well-being and overall quality of life.
Keywords: hemodialysis, pain, sleep quality, self-esteem, mindfulness
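The within-subject analysis described above can be sketched in a few lines of code. The following is a minimal illustration, not the authors' analysis (which used SPSS): it computes the F-statistic for the time effect in a one-way repeated-measures ANOVA from scores measured at three time points; the data values are hypothetical.

```python
# One-way repeated-measures ANOVA: rows = subjects,
# columns = time points (Time 0, Time 1, Time 2).

def repeated_measures_anova(data):
    """Return the F-statistic for the within-subject time effect."""
    n_subj = len(data)
    n_time = len(data[0])
    grand_mean = sum(sum(row) for row in data) / (n_subj * n_time)

    # Sum of squares for the time effect (between time points)
    time_means = [sum(row[t] for row in data) / n_subj for t in range(n_time)]
    ss_time = n_subj * sum((m - grand_mean) ** 2 for m in time_means)

    # Sum of squares for subjects (removed from the error term,
    # which is what distinguishes the repeated-measures design)
    subj_means = [sum(row) / n_time for row in data]
    ss_subj = n_time * sum((m - grand_mean) ** 2 for m in subj_means)

    ss_total = sum((x - grand_mean) ** 2 for row in data for x in row)
    ss_error = ss_total - ss_time - ss_subj

    df_time = n_time - 1
    df_error = (n_subj - 1) * (n_time - 1)
    return (ss_time / df_time) / (ss_error / df_error)

# Hypothetical pain scores for 3 subjects at 3 time points
scores = [[5, 4, 3],
          [6, 5, 4],
          [7, 6, 6]]
print(round(repeated_measures_anova(scores), 3))  # prints 19.0
```

The obtained F would then be compared against the F distribution with (df_time, df_error) degrees of freedom to obtain a p-value.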
Procedia PDF Downloads 86
14813 Presence and Absence: The Use of Photographs in Paris, Texas
Authors: Yi-Ting Wang, Wen-Shu Lai
Abstract:
The subject of this paper is the photography in the 1983 film Paris, Texas, directed by Wim Wenders. Wenders is well known as a film director as well as a photographer, and photographs appear as an element in many of his films. Some of these photographs serve as details within the films, while others play important roles that are relevant to the story. This paper considers photographs in film as a specific type of text, which is the output of both still photography and the film itself. In the film Paris, Texas, three sets of important photographs appear whose symbolic meanings are as dialectical as their text types. The relationship between the existence of these photos and the storyline is both dependent and isolated. The film’s images fly by and progress into other images, while the photos in the film serve a unique narrative function by stopping the continuously flowing images, thus providing the viewer a space for imagination and contemplation. They are more than just artistic forms; they also contain multiple meanings. The photographs in Paris, Texas play the role of both presence and absence according to their shifting meanings. There are references to their presence: photographs exist between film time and narrative time, so in terms of the interaction between the characters in the film, photographs are a common symbol of the beginning and end of the characters’ journeys. In terms of the audience, the film’s photographs are a link in the viewing frame structure, through which the creative motivation of the film director can be explored. Photographs also point to the absence of certain objects: the scenes in the photos represent an imaginary map of emotion. The town of Paris, Texas is therefore isolated from the physical presence of the photograph, and is far more abstract than the reality in the film.
This paper embraces the ambiguous nature of photography and demonstrates its presence and absence in film with regard to the meaning of text. However, it is worth reflecting that the temporary nature of the interpretation of the film’s photographs is far greater than that of any other type of photographic text: the characteristics of the text cause the interpretation results to change along with the variations in the interpretation process, which makes their meaning a dynamic process. The photographs’ presence or absence in the context of Paris, Texas also demonstrates the presence and absence of the creator, time, the truth, and the imagination. The film becomes more complete as a result of the revelation of the photographs, while the intertextual connection between these two forms simultaneously provides multiple possibilities for the interpretation of the photographs in the film.
Keywords: film, Paris, Texas, photography, Wim Wenders
Procedia PDF Downloads 319
14812 Embedded Visual Perception for Autonomous Agricultural Machines Using Lightweight Convolutional Neural Networks
Authors: René A. Sørensen, Søren Skovsen, Peter Christiansen, Henrik Karstoft
Abstract:
Autonomous agricultural machines act in stochastic surroundings and therefore must be able to perceive the surroundings in real time. This perception can be achieved using image sensors combined with advanced machine learning, in particular deep learning. Deep convolutional neural networks excel in labeling and perceiving color images, and since the cost of high-quality RGB cameras is low, the hardware cost of good perception depends heavily on memory and computation power. This paper investigates the possibility of designing lightweight convolutional neural networks for semantic segmentation (pixel-wise classification) with reduced hardware requirements, to allow for embedded usage in autonomous agricultural machines. Using compression techniques, a lightweight convolutional neural network is designed to perform real-time semantic segmentation on an embedded platform. The network is trained on two large datasets, ImageNet and Pascal Context, to recognize up to 400 individual classes. The 400 classes are remapped into agricultural superclasses (e.g. human, animal, sky, road, field, shelterbelt and obstacle) and the ability to provide accurate real-time perception of agricultural surroundings is studied. The network is applied to the case of autonomous grass mowing using the NVIDIA Tegra X1 embedded platform. Feeding case-specific images to the network results in a fully segmented map of the superclasses in the image. As the network is still being designed and optimized, only a qualitative analysis of the method is complete at the abstract submission deadline. Following this deadline, the finalized design will be quantitatively evaluated on 20 annotated grass mowing images. Lightweight convolutional neural networks for semantic segmentation can be implemented on an embedded platform and show competitive performance with regard to accuracy and speed.
It is feasible to provide cost-efficient perceptive capabilities related to semantic segmentation for autonomous agricultural machines.
Keywords: autonomous agricultural machines, deep learning, safety, visual perception
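The remapping of fine-grained network outputs into agricultural superclasses can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class names and the superclass table are hypothetical stand-ins for the 400-class label set, and the per-pixel labels stand in for the network's argmax output.

```python
# Map fine-grained class labels to agricultural superclasses, then remap
# a per-pixel label image. Class and superclass names are hypothetical.

SUPERCLASS = {
    "person": "human", "child": "human",
    "dog": "animal", "cow": "animal",
    "grass": "field", "wheat": "field",
    "tree": "shelterbelt",
    "car": "obstacle", "fence": "obstacle",
}

def remap_segmentation(label_image):
    """Replace each pixel's fine-grained label with its superclass."""
    return [[SUPERCLASS.get(label, "obstacle")  # unknown -> obstacle
             for label in row]
            for row in label_image]

pixels = [["grass", "grass", "dog"],
          ["grass", "person", "car"]]
print(remap_segmentation(pixels))
```

Defaulting unknown classes to "obstacle" is a conservative, fail-safe choice for an autonomous mower; it is an assumption here, not a design decision reported in the abstract.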
Procedia PDF Downloads 396
14811 Development and Characterization of Wheat Bread with Lupin Flour
Authors: Paula M. R. Correia, Marta Gonzaga, Luis M. Batista, Luísa Beirão-Costa, Raquel F. P. Guiné
Abstract:
The purpose of the present work was to develop an innovative food product with good textural and sensorial characteristics. The product, a new type of bread, was prepared with wheat (90%) and lupin (10%) flours, without the addition of any preservatives. Several experiments were also carried out to find the most appropriate proportion of lupin flour. The optimized product was characterized considering its rheological, physical-chemical and sensorial properties. The water absorption of wheat flour with 10% lupin was higher than that of the normal wheat flours, and the Wheat Ceres flour presented the lowest value, with lower dough development time and high stability time. The breads presented low moisture but considerable water activity. The density of the bread decreased with the introduction of lupin flour. The breads were quite white, and during storage the colour parameters decreased. The lupin flour clearly increased the number of alveoli, but the total area increased significantly only for the Wheat Cerealis bread. The addition of lupin flour increased the hardness and chewiness of the breads, but the elasticity did not vary significantly. The lupin bread was sensorially similar to wheat bread produced with WCerealis flour, the main differences being crust rugosity, colour and alveoli characteristics.
Keywords: lupin flour, physical-chemical properties, sensorial analysis, wheat flour
Procedia PDF Downloads 514
14810 The Effect of Supercritical Fluid on the Extraction Efficiency of Heavy Metal from Soil
Authors: Haifa El-Sadi, Maria Elektorowicz, Reed Rushing, Ammar Badawieh, Asif Chaudry
Abstract:
Clay soils have particular properties that affect the assessment and remediation of contaminated sites. In clay soils, electro-kinetic transport of heavy metals has been carried out. The transport of these metals is predicated on maintaining a low pH throughout the cell, which, in turn, keeps the metals in the pore water phase where they are accessible to electro-kinetic transport. Supercritical fluid extraction and acid digestion were used for the analysis of heavy metal concentrations after the completion of electro-kinetic experimentation. Supercritical fluid (carbon dioxide) extraction is a new technique used to extract heavy metals (lead, nickel, calcium and potassium) from clayey soil. A comparison between supercritical extraction and acid digestion of the different metals was carried out. Supercritical fluid extraction, using ethylenediaminetetraacetic acid (EDTA) as a modifier, proved to be a more efficient and safer technique than acid digestion for extracting metals from clayey soil. The mixing time of soil with EDTA before extracting heavy metals from clayey soil was investigated. The optimum and most practical shaking time for the extraction of lead, nickel, calcium and potassium was two hours.
Keywords: clay soil, heavy metals, supercritical fluid extraction, acid digestion
Procedia PDF Downloads 467
14809 Management of Acute Biliary Pathology at Gozo General Hospital
Authors: Kristian Bugeja, Upeshala A. Jayawardena, Clarissa Fenech, Mark Zammit Vincenti
Abstract:
Introduction: Biliary colic, acute cholecystitis, and gallstone pancreatitis are some of the most common surgical presentations at Gozo General Hospital (GGH). National Institute for Health and Care Excellence (NICE) guidelines advise that suitable patients with acute biliary problems should be offered a laparoscopic cholecystectomy within one week of diagnosis. There has traditionally been difficulty in achieving this, mainly due to the reluctance of some surgeons to operate in the acute setting, limited timely access to MRCP and ERCP, and organizational issues. Methodology: A retrospective study was performed involving all biliary pathology-related admissions to GGH during the two-year period of 2019 and 2020. Patients’ files and the electronic case summary (ECS) were used for data collection, which included demographic data, primary diagnosis, co-morbidities, management, waiting time to surgery, length of stay, readmissions, and reasons for readmission. NICE clinical guideline 188 (Gallstone disease) was used as the standard. Results: 51 patients were included in the study. The mean age was 58 years, and 35 (68.6%) were female. The main diagnoses on admission were biliary colic in 31 (60.8%) and acute cholecystitis in 10 (19.6%). Others included gallstone pancreatitis in 3 (5.88%), chronic cholecystitis in 2 (3.92%), gall bladder malignancy in 4 (7.84%), and ascending cholangitis in 1 (1.96%). Management included laparoscopic cholecystectomy in 34 (66.7%), conservative management in 8 (15.7%), and ERCP in 6 (11.7%). The mean waiting time for laparoscopic cholecystectomy for patients with acute cholecystitis was 74 days, the range being between 3 and 146 days from the date of diagnosis. Only one patient diagnosed with acute cholecystitis and managed with laparoscopic cholecystectomy underwent surgery within the 7-day time frame. Hospital re-admissions were reported in 5 patients (9.8%), due to vomiting (1), ascending cholangitis (1), and gallstone pancreatitis (3).
Discussion: Guidelines were not met for patients presenting to Gozo General Hospital with acute biliary pathology. This resulted in 5 patients being re-admitted to hospital while waiting for definitive surgery. The local issues causing the delay to surgery need to be identified and steps taken to facilitate the provision of urgent cholecystectomy for suitable patients.
Keywords: biliary colic, acute cholecystitis, laparoscopic cholecystectomy, conservative management
Procedia PDF Downloads 161
14808 Empirical Investigation of Gender Differences in Information Processing Style, Tinkering, and Self-Efficacy for Robot Tele-Operation
Authors: Dilruba Showkat, Cindy Grimm
Abstract:
As robots become more ubiquitous, it is important for us to understand how different groups of people respond to possible ways of interacting with a robot. In this study, we focused on gender differences while users were tele-operating a humanoid robot that was physically co-located with them. We investigated three factors during the human-robot interaction: (1) information processing strategy, (2) self-efficacy, and (3) tinkering or exploratory behavior. The experimental results show that the information on how to use the robot was processed comprehensively by the female participants, whereas the male participants processed it selectively (p < 0.001). Males were more confident when using the robot than females (p = 0.0002). Males tinkered more with the robot than females (p = 0.0021). We found that tinkering was positively correlated (p = 0.0068) with task success and negatively correlated (p = 0.0032) with task completion time. Tinkering might have resulted in greater task success and lower task completion time for males. Our results show the importance of accounting for gender differences when developing interfaces for interacting with robots; the findings can be used to inform robot design decisions and open new research directions.
Keywords: humanoid robots, tele-operation, gender differences, human-robot interaction
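The reported correlations between tinkering and the outcome measures are of the kind computed below. This is a generic sketch of the Pearson correlation coefficient with made-up data, not the study's dataset or analysis code.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: tinkering actions vs. task completion time (minutes);
# more tinkering paired with shorter times gives a negative correlation,
# matching the sign of the relationship reported in the abstract.
tinkering = [2, 5, 7, 9, 12]
completion_time = [30, 26, 22, 19, 14]
print(round(pearson_r(tinkering, completion_time), 3))
```

In practice the correlation coefficient would be accompanied by a significance test, which is where the reported p-values (e.g. p = 0.0032) come from.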
Procedia PDF Downloads 167
14807 Massive Intrapartum Hemorrhage Following Inner Myometrial Laceration during a Vaginal Delivery: A Rare Case Report
Authors: Bahareh Khakifirooz, Arian Shojaei, Amirhossein Hajialigol, Bahare Abdolahi
Abstract:
Laceration of the inner layer of the myometrium can cause massive bleeding during and after childbirth, which can lead to the death of the mother if it is not diagnosed in time. We studied a rare case of massive intrapartum bleeding following myometrial laceration that was diagnosed correctly, and the patient survived thanks to timely treatment. The patient was a 26-year-old woman who was under observation for a term pregnancy with rupture of membranes (ROM) and vaginal bleeding. During the spontaneous course of labor, without receiving oxytocin, she had an estimated total blood loss of 750 mL. Despite the normal fetal heart rate, and with a maternal indication for cesarean section, she was transferred to the operating room and underwent a cesarean section. During the cesarean section, the amniotic fluid was clear; after the removal of the placenta, severe and clear bleeding was flowing from the posterior wall of the uterus, caused by a laceration of the inner layer of the myometrium in the posterior wall of the lower segment of the uterus. The myometrial laceration was repaired with absorbable continuous locked sutures and hemostasis was established. The patient then received uterotonic drugs and, after monitoring, was discharged from the hospital in good condition.
Keywords: intrapartum hemorrhage, inner myometrial laceration, labor, increased intrauterine pressure
Procedia PDF Downloads 25
14806 Solution of Singularly Perturbed Differential Difference Equations Using Liouville Green Transformation
Authors: Y. N. Reddy
Abstract:
The class of differential-difference equations which have characteristics of both classes, i.e., delay/advance and singularly perturbed behaviour, is known as singularly perturbed differential-difference equations. The expressions ‘positive shift’ and ‘negative shift’ are also used for ‘advance’ and ‘delay’, respectively. In general, an ordinary differential equation in which the highest-order derivative is multiplied by a small positive parameter and which contains at least one delay/advance is known as a singularly perturbed differential-difference equation. Singularly perturbed differential-difference equations arise in the modelling of various practical phenomena in bioscience, engineering, and control theory: specifically in variational problems, in describing the human pupil-light reflex, in a variety of models for physiological processes or diseases, and in first-exit-time problems in the modelling of the expected time for the generation of action potentials in nerve cells by random synaptic inputs in dendrites. In this paper, we envisage the use of the Liouville-Green transformation to find the solution of singularly perturbed differential-difference equations. First, using a Taylor series, the given singularly perturbed differential-difference equation is approximated by an asymptotically equivalent singular perturbation problem. Then the Liouville-Green transformation is applied to get the solution. Several model examples are solved, and the results are compared with other methods. It is observed that the present method gives better approximate solutions.
Keywords: difference equations, differential equations, singular perturbations, boundary layer
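The first step described above, replacing the shifted term by its Taylor expansion so that the delay equation becomes an ordinary singular perturbation problem, rests on the approximation y(x − δ) ≈ y(x) − δ y′(x) for a small shift δ. The sketch below checks this numerically on an arbitrary smooth test function (not a model from the paper); the error should shrink as O(δ²).

```python
import math

def taylor_delay_approx(y, dy, x, delta):
    """First-order Taylor approximation of the delayed value y(x - delta)."""
    return y(x) - delta * dy(x)

# Arbitrary smooth test function y(x) = exp(x) * sin(x) and its derivative
y  = lambda x: math.exp(x) * math.sin(x)
dy = lambda x: math.exp(x) * (math.sin(x) + math.cos(x))

x, delta = 1.0, 1e-3
exact  = y(x - delta)
approx = taylor_delay_approx(y, dy, x, delta)
print(abs(exact - approx))  # error is O(delta**2), here on the order of 1e-6
```

Because the discarded terms are O(δ²), the reduced equation is asymptotically equivalent to the original differential-difference equation for small shifts, which is exactly the premise of the method described in the abstract.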
Procedia PDF Downloads 199
14805 Neighbourhood Walkability and Quality of Life: The Mediating Role of Place Adherence and Social Interaction
Authors: Michał Jaśkiewicz
Abstract:
The relationship between walkability, place adherence, social relations and quality of life was explored in a Polish context. A considerable number of studies have suggested that environmental factors may influence quality of life through indirect pathways. The list of possible psychological mediators includes social relations and identity-related variables. Based on the results of Study 1, local identity is a significant mediator in the relationship between neighbourhood walkability and quality of life. It was assumed that pedestrian-oriented neighbourhoods enable residents to interact and that these spontaneous interactions can help to strengthen a sense of local identity, thus influencing quality of life. We therefore conducted further studies, testing the relationship experimentally in Studies 2a and 2b. Participants were exposed to (2a) photos of walkable/non-walkable neighbourhoods or (2b) descriptions of high/low-walkability neighbourhoods. They were then asked to assess the walkability of the neighbourhoods and to evaluate their potential social relations and quality of life in these places. In both studies, social relations with neighbours turned out to be a significant mediator between walkability and quality of life. In Study 3, we implemented the measure of overlapping individual and communal identity (fusion with the neighbourhood) and willingness to engage in collective action as mediators. Living in a walkable neighbourhood was associated with identity fusion with that neighbourhood. Participants who felt more fused expressed greater willingness to engage in collective action with other neighbours. Finally, this willingness was positively related to quality of life in the city. In Study 4, we used commuting time (an aspect of walkability related to the time that people spend travelling to work) as the independent variable. The results showed that a shorter average daily commuting time was linked to more frequent social interactions in the neighbourhood.
Individuals who assessed their social interactions as more frequent expressed a stronger city identification, which was in turn related to quality of life. To sum up, our research replicated and extended previous findings on the association between walkability and well-being measures. We introduced potential mediators of this relationship: social interactions in the neighbourhood and identity-related variables.
Keywords: walkability, quality of life, social relations, analysis of mediation
Procedia PDF Downloads 327
14804 Graphic Calculator Effectiveness in Biology Teaching and Learning
Authors: Nik Azmah Nik Yusuff, Faridah Hassan Basri, Rosnidar Mansor
Abstract:
The purpose of the study is to find out the effectiveness of using graphic calculators (GC) with Calculator-Based Laboratory 2 (CBL2) in the teaching and learning of form four biology for these topics: Nutrition, Respiration and Dynamic Ecosystem. Sixty form four science stream students were the participants of this study. The participants were divided equally into a treatment group and a control group. The treatment group used the GC with CBL2 during experiments, while the control group used ordinary conventional laboratory apparatus without the GC with CBL2. The instruments in this study were a pre-test, a post-test and a questionnaire. A t-test was used to compare the students’ biology achievement, while descriptive statistics were used to analyze the outcomes of the questionnaire. The findings of this study indicated that the use of the GC with CBL2 in biology had a significant positive effect. The highest mean was 4.43, for the item stating that the use of the GC with CBL2 saved time in collecting experimental results. The second highest mean was 4.10, for the item stating that the GC with CBL2 saved time in drawing and labelling graphs. The outcome of the questionnaire also showed that the GC with CBL2 was easy to use and saved time. Thus, teachers should use the GC with CBL2 in support of the efforts by the Malaysia Ministry of Education to encourage technology-enhanced lessons.
Keywords: biology experiments, Calculator-Based Laboratory 2 (CBL2), graphic calculators, Malaysia secondary school, teaching/learning
Procedia PDF Downloads 403
14803 Plotting of an Ideal Logic versus Resource Outflow Graph through Response Analysis on a Strategic Management Case Study Based Questionnaire
Authors: Vinay A. Sharma, Shiva Prasad H. C.
Abstract:
The initial stages of any project are often observed to be in a mixed set of conditions. Setting up the project is a tough task, but making the initial decisions is not especially complex, as some of the critical factors are yet to be introduced into the scenario. These simple initial decisions potentially shape the timeline and the subsequent events that might later be plotted on it. Proceeding towards a solution to the problem is the primary objective in the initial stages. Optimization of the solutions can come later, and hence the resources deployed towards attaining the solution are higher than they would be in the optimized versions. A ‘logic’ that counters the problem is essentially the core of the desired solution. Thus, if the problem is solved, the deployment of resources has led to the required logic being attained. As the project proceeds, the individuals working on it face fresh challenges as a team and become better accustomed to their surroundings. The developed, optimized solutions are then considered for implementation, as the individuals are now experienced, know better the consequences and causes of possible failure, and thus integrate adequate tolerances wherever required. Furthermore, as the team grows in strength, acquires substantial knowledge, and begins to transfer it efficiently, the individuals in charge of the project, along with the managers, focus more on the optimized solutions rather than the traditional ones to minimize the required resources. Hence, as time progresses, the authorities prioritize attainment of the required logic at a lower amount of dedicated resources. For empirical analysis of the stated theory, leaders and key figures in organizations are surveyed for their ideas on the appropriate logic required for tackling a problem. Key pointers spotted in successfully implemented solutions are noted from the analysis of the responses, and a metric for measuring logic is developed.
A graph is plotted with the quantifiable logic on the Y-axis and the resources dedicated to the solutions of various problems on the X-axis. The dedicated resources are plotted over time, and hence the X-axis is also a measure of time. In the initial stages of the project, the graph is rather linear, as the required logic is attained but the consumed resources are also high. With time, the authorities begin focusing on optimized solutions, since the logic attained through them is higher while the resources deployed are comparatively lower. Hence, the difference between consecutive plotted ‘resources’ reduces and, as a result, the slope of the graph gradually increases. Overall, the graph takes a parabolic shape (beginning at the origin), as with each resource investment, ideally, the difference keeps decreasing and the logic attained through the solution keeps increasing. Even if the resource investment is higher, the managers and authorities ideally make sure that the investment is being made for a proportionally high logic on a larger problem; that is, ideally the slope of the graph increases with the plotting of each point.
Keywords: decision-making, leadership, logic, strategic management
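The ideal curve described above can be generated directly from its two defining properties: each successive solution consumes fewer additional resources while attaining more additional logic, so the slope between consecutive points keeps increasing. The sketch below uses arbitrary illustrative numbers, not survey data.

```python
# Generate points of the ideal logic-vs-resources curve: resource
# increments shrink over time while logic increments grow, so the
# slope between consecutive points increases monotonically.

resource_steps = [10.0, 8.0, 6.0, 4.0, 2.0]   # shrinking investments
logic_steps    = [2.0, 4.0, 6.0, 8.0, 10.0]   # growing logic gains

points = [(0.0, 0.0)]  # the curve begins at the origin
for dr, dl in zip(resource_steps, logic_steps):
    r, l = points[-1]
    points.append((r + dr, l + dl))

slopes = [(l2 - l1) / (r2 - r1)
          for (r1, l1), (r2, l2) in zip(points, points[1:])]
print(points)
print(slopes)  # strictly increasing, giving the parabola-like shape
```

Checking that the slope sequence is strictly increasing is exactly the "ideal" condition stated in the abstract: each new investment buys proportionally more logic than the last.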
Procedia PDF Downloads 108
14802 Monte Carlo and Biophysics Analysis in a Criminal Trial
Authors: Luca Indovina, Carmela Coppola, Carlo Altucci, Riccardo Barberi, Rocco Romano
Abstract:
In this paper, a real court case, held in Italy at the Court of Nola, is considered, in which a correct physical description, conducted with both Monte Carlo and biophysical analysis, would have been sufficient to arrive at conclusions confirmed by documentary evidence. This is an example of how forensic physics can be useful in confirming documentary evidence in order to reach hardly questionable conclusions. This was a libel trial in which the defendant, Mr. DS (Defendant for Slander), had falsely accused one of his neighbors, Mr. OP (Offended Person), of having caused him some damages. The damages would have been caused by an external plaster piece that would have detached from the neighbor’s property and hit Mr. DS while he was in his garden, well over a meter away from the facade of the building from which the plaster piece would have detached. In the trial, Mr. DS claimed to have suffered a scratch on his forehead, but he never showed the plaster that had hit him, nor was he able to tell from where the plaster would have arrived. Furthermore, Mr. DS presented a medical certificate with a diagnosis of contusion of the cerebral cortex. On the contrary, the images of Mr. OP’s security cameras do not show any movement in the garden of Mr. DS in a long interval of time (about 2 hours) around the time of the alleged accident, nor do they show any people entering or coming out of the house of Mr. DS in the same interval of time. Biophysical analysis shows that both the diagnosis in the medical certificate and the wound declared by the defendant, already in conflict with each other, are not compatible with the fall of external plaster pieces too small to be found. The wind was at level 1 on the Beaufort scale, that is, unable to raise even dust (level 4 on the Beaufort scale).
Therefore, the motion of the plaster pieces can be described as projectile motion, whereas collisions with the building cornice can be treated using Newton’s law of restitution. Numerous Monte Carlo simulations show that the pieces of plaster could not have reached even the garden of Mr. DS, let alone a distance of over 1.30 meters. The results agree with the documentary evidence (images of Mr. OP’s security cameras) that Mr. DS could not have been hit by plaster pieces coming from Mr. OP’s property.
Keywords: biophysics analysis, Monte Carlo simulations, Newton’s law of restitution, projectile motion
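A Monte Carlo simulation of this kind can be sketched as below. The geometry and parameter ranges are illustrative assumptions (a detachment height of about 6 m, a small wind-limited horizontal launch speed, and a single inelastic bounce at the cornice), not the values from the expert report; the point is the method itself: sample launch conditions, apply projectile motion, and examine the distribution of landing distances.

```python
import math
import random

G = 9.81  # gravitational acceleration, m/s^2

def landing_distance(height, v_horizontal):
    """Horizontal distance travelled by a piece falling from `height`
    with initial horizontal speed `v_horizontal` (simple projectile,
    air resistance neglected)."""
    t_fall = math.sqrt(2.0 * height / G)
    return v_horizontal * t_fall

def monte_carlo_max_distance(trials=100_000, seed=42):
    """Sample launch conditions and return the largest landing distance."""
    rng = random.Random(seed)
    max_d = 0.0
    for _ in range(trials):
        h = rng.uniform(5.0, 7.0)   # assumed detachment height, m
        v = rng.uniform(0.0, 0.5)   # assumed horizontal speed, m/s
        e = rng.uniform(0.3, 0.7)   # restitution coefficient at the cornice
        # Crude simplification: one inelastic bounce scales the
        # horizontal speed by e before the free fall to the ground.
        max_d = max(max_d, landing_distance(h, v * e))
    return max_d

print(monte_carlo_max_distance())  # well below 1.30 m for these inputs
```

Under these assumed inputs the worst case is bounded analytically (v·e ≤ 0.35 m/s over a fall of at most 7 m gives under 0.5 m), so every sampled trajectory falls far short of the 1.30 m claimed in the trial, which is the shape of the argument made in the paper.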
Procedia PDF Downloads 131
14801 Conceptual Study on 4PL and Activities in Turkey
Authors: Berna Kalkan, Kenan Aydin
Abstract:
Companies attach importance to customer satisfaction in order to compete in a developing and changing market. This is possible when the customer receives the right product, with the right quality, at the right place, time and cost. In this regard, the extension of logistics services has played an active role in the formation and development of different logistics service concepts. The concept of logistics services plays an important role in the improvement of economic indicators today. Companies that use logistics providers can thus gain competitive advantage through lower costs, reduced time, and greater flexibility. In recent years, Fourth Party Logistics (4PL) has emerged as a new concept that covers the relationship between suppliers and firms in outsourcing. A 4PL provider is an integrator that offers comprehensive supply chain solutions with the technology, resources and capabilities that it possesses. 4PL has also attracted attention as a popular research topic in the recent past. In this paper, logistics outsourcing and 4PL concepts are analyzed and a literature review on 4PL activities is given. The previous studies in the literature and the approaches used in them are presented through an analysis of 4PL activities. In this context, a field study will be applied to 4PL providers and service buyers in Turkey. If appropriate, the results of this study will be shared in scientific venues.
Keywords: fourth party logistics, literature review, outsourcing, supply chain management
Procedia PDF Downloads 178
14800 Analysis of Brain Specific Creatine Kinase of Postmortem Cerebrospinal Fluid and Serum in Blunt Head Trauma Cases
Authors: Rika Susanti, Eryati Darwin, Dedi Afandi, Yanwirasti, Syahruddin Said, Noverika Windasari, Zelly Dia Rofinda
Abstract:
Introduction: Blunt head trauma is one of the leading causes of death associated with murders and other deaths involved in criminal acts. Brain-specific creatine kinase (CKBB) levels have been used as a biomarker for blunt head trauma, and the marker is now used as an alternative to autopsy. The aim of this study is to investigate CKBB levels in cerebrospinal fluid (CSF) and post-mortem serum in order to deduce the cause and time of death. Method: This investigation was conducted through a post-test-only group design involving deaths caused by blunt head trauma, which were compared to deaths caused by ketamine poisoning. There were eight treatment groups, each consisting of six adult rats (Rattus norvegicus) of the Sprague-Dawley strain. Examinations were done at 0, 1, 2, and 3 hours post-mortem, followed by observation of the brain tissue. Data were then analyzed statistically with a repeated-measures general linear model. Conclusion: There were increases in the level of CKBB in CSF and post-mortem serum in both the blunt head trauma and ketamine poisoning treatment groups. However, there were no significant differences between these two groups.
Keywords: blunt head trauma, CKBB, cause of death, estimated time of death
Procedia PDF Downloads 192
14799 Evaluating the Success of an Intervention Course in a South African Engineering Programme
Authors: Alessandra Chiara Maraschin, Estelle Trengove
Abstract:
In South Africa, only 23% of engineering students attain their degrees in the minimum time of 4 years. This begs the question: Why is the 4-year throughput rate so low? Improving the throughput rate is crucial to guiding students along the shortest possible path to completion. The Electrical Engineering programme has a fixed curriculum, and students must pass all courses in order to graduate. In South Africa, as in several other countries, many students rely on external funding such as bursaries from companies in industry. If students fail a course, they often lose their bursaries, and most might not be able to fund their 'repeating year' fees. It is thus important to improve the throughput rate, since for many students, graduating from university is a way out of poverty for an entire family. In Electrical Engineering, the Software Development I course (an introduction to C++ programming) has been found to be a significant hurdle course for students, with a low pass rate. It is well documented that students struggle with this type of course, as it introduces a number of new threshold concepts that can be challenging to grasp in a short time frame. In an attempt to mitigate this situation, a part-time night school for Software Development I was introduced in 2015 as an intervention measure. The course covers all the material from the Software Development I module and gives students who failed the course in the first semester a second chance to repeat it by taking the night-school course. The purpose of this study is to determine whether the introduction of this intervention course can be considered a success. The success of the intervention is assessed in two ways. The study first looks at whether the night-school course contributed to improving the pass rate of the Software Development I course.
Secondly, the study examines whether the intervention contributed to improving the overall throughput from the 2nd year to the 3rd year of study at a South African university. Second-year academic results for a sample of 1216 students were collected from 2010-2017. Preliminary results show that the lowest pass rate for Software Development I was in 2017, at 34.9%. Since the intervention course's inception, the pass rate for Software Development I has increased each year from 2015 to 2017 by 13.75%, 25.53% and 25.81% respectively. To conclude, the preliminary results show that the intervention course is a success in improving the pass rate of Software Development I.
Keywords: academic performance, electrical engineering, engineering education, intervention course, low pass rate, software development course, throughput
Procedia PDF Downloads 164
14798 Remediation of Oil and Gas Exploration and Production (O&G E&P) Wastes Using Soil-Poultry Dropping Amendment
Authors: Ofonime U. M. John, Justina I. R. Udotong, Victor O. Nwaugo, Ime R. Udotong
Abstract:
Oily wastes from oil and gas exploration and production (O&G E&P) activities were remediated for twelve weeks using a soil-poultry dropping amendment. Culture-dependent microbiological, chemical and enzymatic techniques were employed to assess the efficacy of the remediation process. Microbiological analyses of the remediated wastes showed increased hydrocarbonoclastic microbial populations with increased remediation time: 2.7 ± 0.1 × 10⁵ cfu/g to 8.3 ± 0.04 × 10⁶ cfu/g for hydrocarbon-utilizing bacteria, 1.7 ± 0.2 × 10³ cfu/g to 6.0 ± 0.01 × 10⁴ cfu/g for hydrocarbon-utilizing fungi, and 2.2 ± 0.1 × 10² cfu/g to 6.7 ± 0.1 × 10³ cfu/g for hydrocarbon-utilizing actinomycetes. Bacteria associated with the remediated wastes after the remediation period included the genera Bacillus, Pseudomonas, Beijerinckia, Acinetobacter, Alcaligenes and Serratia. Fungal isolates included species of Penicillium, Aspergillus and Cladosporium, while the actinomycetes included species of Rhodococcus, Nocardia and Streptomyces. Slight fluctuations in pH between 6.5 ± 0.2 and 7.1 ± 0.08 were recorded throughout the process, while total petroleum hydrocarbon (TPH) content decreased from 89,900 ± 0.03 mg/kg to 425 ± 0.1 mg/kg after twelve weeks of remediation. Polycyclic aromatic hydrocarbon (PAH) levels decreased with increased remediation time; naphthalene, fluorene, phenanthrene, anthracene, pyrene, chrysene and benzo(b)fluoranthene showed values < 0.01 after twelve weeks of remediation. Enzyme assays revealed increased dehydrogenase and urease activities with increased remediation time and decreased phenol oxidase activity with increased remediation period. There was a positive linear correlation between densities of hydrocarbonoclastic microbes and dehydrogenase activity. On the contrary, phenol oxidase and urease activities showed a negative correlation with microbial population.
Results of this study confirmed that remediation of oily wastes using a soil-poultry dropping amendment can result in eco-friendly O&G E&P wastes. It also indicates that urease and phenol oxidase activities can be reliable indices/tools to monitor PAH levels and rates of petroleum hydrocarbon degradation.
Keywords: dehydrogenase activity, oily wastes, remediation, soil-poultry dropping amendment
Procedia PDF Downloads 322
14797 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning
Authors: Tanvi P. Patel, Warish D. Patel
Abstract:
Association rule mining forms the core of data mining and is termed one of its well-known methodologies. The objective of mining is to find interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is imperative to mine patterns and then generate rules from these obtained patterns. For efficient targeted query processing, finding frequent patterns and itemset mining, an efficient itemset tree structure exists, named the Memory Efficient Itemset Tree (MEIT). The memory-efficient IT is efficient for storing itemsets but takes more time compared to the traditional IT. The proposed strategy generates maximal frequent itemsets from the memory-efficient itemset tree by using levelwise pruning. First, pre-pruning of items based on the minimum support count is carried out, followed by itemset tree reconstruction. With maximal frequent itemsets, fewer patterns are generated and the tree size is also reduced compared to MEIT. Therefore, the enhanced memory-efficient IT approach proposed here helps to optimize main memory overhead as well as reduce processing time.
Keywords: association rule mining, itemset mining, itemset tree, MEIT, maximal frequent pattern
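The abstract names levelwise pruning and maximal frequent itemsets but gives no algorithm details; as an illustration of the underlying technique, the following is a minimal self-contained Python sketch of levelwise (Apriori-style) mining of maximal frequent itemsets — not the paper's MEIT implementation, and all names are hypothetical:

```python
from itertools import combinations

def maximal_frequent_itemsets(transactions, min_support):
    """Levelwise search for maximal frequent itemsets:
    pre-prune single items below min_support, then grow
    candidates one level (itemset size) at a time."""
    # Level 1: pre-prune items by support count
    counts = {}
    for t in transactions:
        for item in t:
            counts[item] = counts.get(item, 0) + 1
    frequent = [frozenset([i]) for i, c in counts.items() if c >= min_support]
    all_frequent = set(frequent)
    while frequent:
        # Candidates one item larger, built from surviving items
        items = sorted({i for s in frequent for i in s})
        k = len(next(iter(frequent))) + 1
        candidates = [frozenset(c) for c in combinations(items, k)]
        # Keep only candidates meeting the minimum support count
        frequent = [c for c in candidates
                    if sum(1 for t in transactions if c <= t) >= min_support]
        all_frequent.update(frequent)
    # Maximal = frequent itemsets with no frequent proper superset
    return {s for s in all_frequent
            if not any(s < other for other in all_frequent)}
```

Reporting only the maximal sets is what shrinks the output: every frequent itemset is a subset of some maximal one, so fewer patterns need to be stored, mirroring the reduction the abstract claims over plain MEIT.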
Procedia PDF Downloads 371
14796 Socio-Technical Systems: Transforming Theory into Practice
Authors: L. Ngowi, N. H. Mvungi
Abstract:
This paper critically examines the evolution of socio-technical systems theory, its practices, and the challenges in system design and development. It examines concepts put forward by researchers focusing on the application of the theory in software engineering. Various methods have been developed that use socio-technical concepts in systems engineering, without remarkable success. The main constraint is the large amount of data and the inefficient techniques used in applying the concepts in systems engineering for developing time-bound systems within a limited/controlled budget. This paper critically examines each of the methods, highlights bottlenecks, and suggests the way forward. Since socio-technical systems theory only explains what to do, but not how to do it, engineers are not using the concept to save time and costs and to reduce the risks associated with new frameworks. Hence, a new framework, which can be considered a practical approach, is proposed that borrows concepts from the soft systems method, agile systems development, and object-oriented analysis and design to bridge the gap between theory and practice. The approach will enable the development of systems using socio-technical systems theory and encourage system engineers and software developers to use it in building worthwhile information systems, avoiding fragility and hostility in the work environment.
Keywords: socio-technical systems, human centered design, software engineering, cognitive engineering, soft systems, systems engineering
Procedia PDF Downloads 286
14795 Tourism Potentials of Ikogosi Warm Spring in Nigeria
Authors: A.I. Adeyemo
Abstract:
Ikogosi warm spring results from complex mechanical and chemical forces that generate internal heat in the rocks, producing warm and cold water at the same geographical location at the same time. From time immemorial, the local community thought it to be the work of a deity and worshipped the spring. This complex phenomenon has been a source of attraction to both local and international tourists over the years. 450 copies of a structured questionnaire were given out, and a total of 500 respondents were interviewed. The results showed that Ikogosi warm spring impacts the community positively by providing employment to the teeming youths and income to traders: 66% of the respondents confirmed that it increased their income, and transportation business increased by more than 73%. The level of enlightenment and socialization also increased greatly in the community. However, tourism also impacted the community negatively, as it increased crime rates such as stealing, kidnapping, prostitution, and unwanted pregnancy among secondary school girls and other teenagers. Overall, 50% of the respondents reported that tourism at the warm spring results in insecurity in the community. It also increased environmental problems such as noise and waste pollution; continuous movement on the land results in soil compaction leading to erosion and leaching, which in turn results in loss of soil fertility. It was concluded that if the potential of the spring is fully tapped, it will be a good avenue for income generation for the country.
Keywords: community, Ikogosi, revenue, warm spring
Procedia PDF Downloads 159
14794 An Automatic Large Classroom Attendance Conceptual Model Using Face Counting
Authors: Sirajdin Olagoke Adeshina, Haidi Ibrahim, Akeem Salawu
Abstract:
Large lecture theatres cannot be covered by a single camera but rather require a multicamera setup because of their size, shape, and seating arrangements, although an ordinary classroom can be captured with a single camera. Therefore, the design and implementation of a multicamera setup for a large lecture hall were considered. Researchers have emphasized the impact of class attendance on the academic performance of students. However, the traditional method of taking attendance is below standard, especially for large lecture theatres, because of the student population, the time required, its sophistication and exhaustiveness, and its susceptibility to manipulation. An automated large classroom attendance system is, therefore, imperative. The common approach in such systems is face detection and recognition, where known student faces are captured and stored for recognition purposes. This approach requires constant face database updates due to constant changes in facial features. Alternatively, face counting can be performed by cropping the localized faces in the video or image into a folder and then counting them. This research aims to develop a face localization-based approach to detect student faces in classroom images captured using a multicamera setup. A selected Haar-like feature cascade face detector, trained with an asymmetric goal to minimize the False Rejection Rate (FRR) relative to the False Acceptance Rate (FAR), was deployed on a Raspberry Pi 4B. A relationship between the two factors (FRR and FAR) was established using a constant (λ) as a trade-off between them for automatic adjustment during training. An evaluation of the proposed approach and the conventional AdaBoost on classroom datasets shows an improvement of 8% in TPR (a result of the low FRR) and a 7% reduction of the FRR. The average processing speed of the proposed approach was improved to 1.19 s execution time per image, compared to 2.38 s for the improved AdaBoost.
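The λ trade-off between FRR and FAR described in the paragraph above can be illustrated with a simple threshold search; the additive cost function and all names here are an illustrative sketch, not the trained cascade's actual objective:

```python
def pick_threshold(genuine_scores, impostor_scores, thresholds, lam):
    """Pick the detector score threshold minimizing FRR + lam * FAR.
    FRR: fraction of genuine (face) samples scored below the threshold;
    FAR: fraction of impostor (non-face) samples scored at or above it.
    lam weights the two error rates against each other, analogous to
    the abstract's constant λ (details here are assumptions)."""
    best_t, best_cost = None, float('inf')
    for t in thresholds:
        frr = sum(s < t for s in genuine_scores) / len(genuine_scores)
        far = sum(s >= t for s in impostor_scores) / len(impostor_scores)
        cost = frr + lam * far
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t
```

A large lam pushes the chosen threshold up (rejecting more non-faces at the cost of some misses), while a small lam favors a low FRR, which is the asymmetry the abstract describes.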
Consequently, the proposed approach achieved 97% TPR with an overhead constraint time of 22.9 s, compared to 46.7 s for the improved AdaBoost, when evaluated on images obtained from a large lecture hall (DK5) at USM.
Keywords: automatic attendance, face detection, haar-like cascade, manual attendance
Procedia PDF Downloads 72
14793 Filtering Momentum Life Cycles, Price Acceleration Signals and Trend Reversals for Stocks, Credit Derivatives and Bonds
Authors: Periklis Brakatsoulas
Abstract:
Recent empirical research shows a growing interest in investment decision-making under market anomalies that contradict the rational paradigm. Momentum is undoubtedly one of the most robust anomalies in empirical asset pricing research and has remained surprisingly lucrative ever since it was first documented. Although predominantly identified in equities, momentum premia are now evident across various asset classes. Yet few attempts have been made so far to provide traders with a diversified portfolio of strategies across different assets and markets. Moreover, the literature focuses on patterns in past returns rather than on mechanisms that signal future price directions prior to momentum runs. The aim of this paper is to develop a diversified portfolio approach to price distortion signals using daily position data on stocks, credit derivatives, and bonds. An algorithm allocates assets periodically, and new investment tactics take over upon price momentum signals and across different ranking groups. We focus on momentum life cycles, trend reversals, and price acceleration signals. The main effort here concentrates on the density, time span and maturity of momentum phenomena, to identify consistent patterns over time and to measure the predictive power of the buy-sell signals generated by these anomalies. To tackle this, we propose a two-stage modelling process. First, we generate forecasts on core macroeconomic drivers. Secondly, satellite models generate market risk forecasts using the core driver projections generated at the first stage as input. Moreover, using a combination of the ARFIMA and FIGARCH models, we examine the dependence of consecutive observations across time and portfolio assets, since long-memory behavior in the volatilities of one market appears to trigger persistent volatility patterns across other markets.
We believe that this is the first work that employs evidence of volatility transmissions among derivatives, equities, and bonds to identify momentum life cycle patterns.
Keywords: forecasting, long memory, momentum, returns
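The periodic ranking-and-allocation step the abstract describes can be sketched with a simple cross-sectional momentum rule; the trailing-return signal, asset names and equal-weight allocation below are illustrative assumptions, not the paper's ARFIMA/FIGARCH pipeline:

```python
def momentum_ranks(prices, lookback):
    """Cross-sectional momentum: rank assets by trailing return
    over `lookback` periods, highest first. `prices` maps an asset
    name to its price series (a list, oldest to newest)."""
    trailing = {a: p[-1] / p[-1 - lookback] - 1.0 for a, p in prices.items()}
    return sorted(trailing, key=trailing.get, reverse=True)

def allocate(prices, lookback, top_n):
    """Equal-weight the top-n momentum assets; in practice this
    would be re-run at each rebalancing date as new signals arrive."""
    winners = momentum_ranks(prices, lookback)[:top_n]
    return {a: 1.0 / top_n for a in winners}
```

The ranking groups in the abstract would correspond to slices of the sorted list (top decile, bottom decile, and so on) rather than a single top-n cut.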
Procedia PDF Downloads 102
14792 The Droplet Generation and Flow in the T-Shape Microchannel with the Side Wall Fluctuation
Authors: Yan Pang, Xiang Wang, Zhaomiao Liu
Abstract:
Droplet microfluidics, in which nanoliter to picoliter droplets act as individual compartments, is common to a diverse array of applications such as analytical chemistry, tissue engineering, microbiology and drug discovery. Droplet generation in a simplified two-dimensional T-shaped microchannel, with a main channel width of 50 μm and a side channel width of 25 μm, is simulated to investigate the effects of forced fluctuation of the side wall on droplet generation and flow. The periodic fluctuations are applied over a length of the side wall in the main channel of the T-junction, with the deformation shape of a double-clamped beam under uniform force, and vary with flow time and with the fluctuation period, form and position. Under most conditions, the fluctuations expand the distribution range of the droplet size but have little effect on the average size, whereas the shape of the fixed side wall chiefly changes the average droplet size. Droplet sizes show a periodic pattern over relative time when the fluctuation is applied to the side wall near the T-junction. The droplet emergence frequency is not altered by the fluctuation of the side wall under the same flow rate and geometry conditions. When the fluctuation period is similar to the droplet emergence period, the droplet size is as stable as in the no-fluctuation case.
Keywords: droplet generation, droplet size, flow field, forced fluctuation
Procedia PDF Downloads 282
14791 Exploring Fertility Dynamics in the MENA Region: Distribution, Determinants, and Temporal Trends
Authors: Dena Alhaloul
Abstract:
The Middle East and North Africa (MENA) region is characterized by diverse cultures, economies, and social structures. Fertility rates in MENA have seen significant changes over time, with variations among countries and subregions. Understanding fertility patterns in this region is essential due to their impact on demographic dynamics, healthcare, labor markets, and social policies. Rising or declining fertility rates have far-reaching consequences for the region's socioeconomic development. The main thrust of this study is to comprehensively examine fertility rates in the MENA region: to understand their distribution, determinants, and temporal trends, to provide insights into the factors influencing fertility decisions, to assess how fertility rates have evolved over time, and potentially to develop statistical models to characterize these trends. Methodologically, the study uses descriptive statistics to summarize and visualize fertility rate data, regression analyses to identify determinants of fertility rates, and statistical modeling to characterize temporal trends. In conclusion, the research will contribute to a deeper understanding of fertility dynamics in the MENA region, shedding light on the distribution of fertility rates, their determinants, and historical trends.
Keywords: fertility, distribution, modeling, regression
Procedia PDF Downloads 81
14790 Effect of Nitriding and Shot Peening on Corrosion Behavior and Surface Properties of Austenite Stainless Steel 316L
Authors: Khiaira S. Hassan, Abbas S. Alwan, Muna K. Abbass
Abstract:
This research aims to study the effect of liquid nitriding and shot peening on the hardness, surface roughness, residual stress, microstructure and corrosion behavior of austenitic stainless steel 316L. Chemical surface heat treatment by liquid nitriding was carried out at 500 °C for 1 h, followed by shot peening with steel balls of 1.25 mm diameter for exposure times of 10 and 20 min. Electrochemical corrosion testing was performed in sea water (3.5% NaCl solution) using a potentiostat. The results showed that the nitrided layer consists of a compound layer (white layer) and a diffusion zone immediately below it. The mechanical treatment (shot peening) led to the formation of compressive residual stresses in the surface layer, which increased the hardness of the stainless steel surface. All surface treatments (nitriding and shot peening) led to the formation of CrN in the hard surface layer. Both processes caused an increase in surface hardness and roughness, which increases with shot peening time. The corrosion results showed that the liquid nitriding and shot peening processes increase the corrosion rate to values higher than that of untreated stainless steel.
Keywords: stainless steel 316L, shot peening, nitriding, corrosion, hardness
Procedia PDF Downloads 468
14789 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler
Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury
Abstract:
Early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is delayed result reporting: typically a few days. This issue is the main objective of the HAMRAD project, which gave rise to a prototype autonomous monitoring device. It is based on the idea of sequential aerosol sampling using a carousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, the air is drawn through a filter positioned on the carousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed with an HPGe detector (50% relative efficiency) inside 8.5 cm lead shielding. The spectrometer output signal is then analyzed using DSP electronics and Gamwin software with preset nuclide libraries and other analysis parameters. After the counting, the filter is placed into a storage bin with a capacity of 250 filters, so the device can run autonomously for several months depending on the preset sampling frequency. The device is connected to a central server via GPRS/GSM, where the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be remotely adjusted through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in spectrum analysis is natural background subtraction. As detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there must exist an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide.
The prototype device proved able to detect atmospheric contamination at the level of mBq/m³ per 8 h of sampling.
Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler
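The optimal-decay-time trade-off the abstract describes — waiting lets short-lived radon progeny background die away, but leaves less of the fixed total time for counting — can be sketched numerically. The square-root figure of merit, the exponential background model, and all numbers below are illustrative assumptions, not the HAMRAD device's calibration:

```python
import math

def best_decay_delay(total_h, b0, b_floor, half_life_h, candidates):
    """Find the sample decay delay (in hours) minimizing a simplified
    detection-limit proxy. Background count rate: b0 decaying with
    half_life_h plus a constant floor b_floor. Counting time is
    whatever remains of the fixed total window total_h."""
    def figure_of_merit(d):
        background = b0 * 2.0 ** (-d / half_life_h) + b_floor
        t_count = total_h - d
        # Detection limit scales roughly with sqrt(background / counting time)
        return math.sqrt(background / t_count)
    return min(candidates, key=figure_of_merit)
```

With a strong, fast-decaying background the optimum sits well inside the window rather than at either end, which is the qualitative point the abstract makes about delayed spectrum acquisition.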
Procedia PDF Downloads 150
14788 Optimization of Pregelatinized Taro Boloso-I Starch as a Direct Compression Tablet Excipient
Authors: Tamrat Balcha Balla
Abstract:
Background: Tablets are still the most preferred means of drug delivery, and the search for new and improved direct compression tablet excipients is an area of research focus. Taro Boloso-I is a variety of Colocasia esculenta (L.) Schott yielding 67% more than the other varieties (Godare) in Ethiopia. This study aimed to enhance the flowability of pregelatinized Taro Boloso-I starch while keeping its compressibility and compactibility. Methods: A central composite design was used for the optimization of two factors, the temperature and duration of pregelatinization, against five responses: angle of repose, Hausner ratio, Kawakita compressibility index, mean yield pressure and tablet breaking force. Results and Discussion: An increase in both temperature and time resulted in a decrease in the angle of repose. The increase in temperature was shown to decrease both the Hausner ratio and the Kawakita compressibility index. The mean yield pressure was observed to increase with increasing levels of both temperature and time. The pregelatinized (optimized) Taro Boloso-I starch showed the desired flow properties and compressibility. Conclusions: Pregelatinized Taro Boloso-I starch can be regarded as a potential direct compression excipient in terms of flowability, compressibility and compactibility.
Keywords: starch, compression, pregelatinization, Taro Boloso-I
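A two-factor central composite design like the one used above has a standard structure: factorial corners, axial (star) points, and replicated center points. A minimal sketch of the coded design matrix (the center-point count and alpha value are conventional defaults, not the study's settings):

```python
import math

def central_composite_design(n_center=5, alpha=math.sqrt(2)):
    """Generate coded design points for a two-factor central composite
    design (e.g. pregelatinization temperature and duration):
    4 factorial corners at +/-1, 4 axial points at +/-alpha, plus
    replicated center points. Mapping coded levels to real
    temperatures/durations is left to the experimenter."""
    factorial = [(x1, x2) for x1 in (-1, 1) for x2 in (-1, 1)]
    axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
    center = [(0.0, 0.0)] * n_center
    return factorial + axial + center
```

The replicated center points estimate pure error, and the axial points let a quadratic response surface be fitted for each of the five responses.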
Procedia PDF Downloads 113
14787 Improving the Performance of Road Salt on Anti-Icing
Authors: Mohsen Abotalebi Esfahani, Amin Rahimi
Abstract:
Maintenance and management of road infrastructure is one of the most important and fundamental responsibilities of a country. Several methods have been under investigation for many years as preventive measures for the maintenance of asphalt pavements. Using a mixture of salt, sand and gravel is the most common method of de-icing, which can have numerous harmful consequences. Icy or snow-covered roads are a major cause of accidents in wet seasons, leading to substantial damage such as loss of time and energy, environmental pollution, destruction of buildings, traffic congestion and a rising chance of accidents. Consequently, every year the government incurs enormous costs to keep routes safe. In this study, asphalt pavement samples were examined in terms of compressive strength, tensile strength and resilient modulus under the influence of magnesium chloride, calcium chloride, sodium chloride, urea and pure water. The results showed that de-icing with calcium chloride solution and urea has the least negative effect on laboratory specimens, while de-icing with pure water has the most negative effect. Hence, simple techniques, new equipment, and reduced use of sand and salt can significantly reduce the risks and harmful effects of their excessive use while keeping roads safe.
Keywords: maintenance, sodium chloride, icy road, calcium chloride
Procedia PDF Downloads 284
14786 Reasons for Language Words in the Quran and Literary Approaches That Are Persian
Authors: Fateme Mazbanpoor, Sayed Mohammad Amiri
Abstract:
In this article, we examine the Persian words in the Quran and study the reasons for their presence in this holy book. The authors extracted about 70 Persian words from the Quran by referring to sources (Alalfaz ol Moarab ol Farsieh by Edishir, Almoarab by Javalighi, Almahzab and Etghan by Seuti, The Foreign Vocabulary of the Quran by Arthur Jeffery, etc.). Some of these words are: 'Abarigh', 'Estabragh', 'Barzakh', 'Din', 'Zamharir', 'Sondos', 'Sejil', 'Namaregh', 'Fil', etc. These Persian words entered Arabic, and finally the Quran, in two ways: (1) directly from Persian, and (2) via other languages. The first way: because of Iranian dominance over Hira, Yemen, the whole of Oman and Bahrain in the Sasanian period, there were political, religious, linguistic, literary, and trade ties between these Arab territories and Iran, causing Persian to influence Arabic and giving way to many Persian loanwords entering Arabic in this period. The second way: since the geographical and commercial conditions of the area were dominated by Iran, the Hejaz traded extensively with Mesopotamia and Yemen. On the other hand, Arabic, which was a relatively young language at that time, drew on other Semitic languages in order to expand its vocabulary (Syriac and Aramaic were themselves influenced by the languages of Iran). Consequently, due to the long relationship between Iranians and Arabs, some Persian words took longer routes, through Aramaic and Syriac, to find their way into the Quran.
Keywords: Quran, Persian word, Arabic language, Persian
Procedia PDF Downloads 462
14785 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators
Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros
Abstract:
Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness devotes considerable time to data quality processes, while a data project without data quality awareness suffers in terms of financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ according to the purpose of each data project. In particular, a big data project may involve many datasets and stakeholders and take a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators that describe the overall data quality of each dataset, for quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a Composite Indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were taken to develop the Dataset Quality Index (SDQI). First, we define standard data quality expectations. Second, we find indicators that can directly measure the data within datasets. Third, the indicators are aggregated into dimensions using factor analysis. Next, the indicators and dimensions are weighted by the effort of the data preparation process and by usability. Finally, the dimensions are aggregated into the Composite Indicator. The results of these analyses showed that: (1) the developed indicators and measurements comprised ten indicators; (2) for the data quality dimensions based on statistical characteristics, we found that the ten indicators can be reduced to four dimensions.
(3) For the developed Composite Indicator, we found that the SDQI can describe the overall quality of each dataset and can separate datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall description of data quality within datasets in a meaningful composition. We can use the SDQI to assess all the data in a data project, for effort estimation, and for prioritization. The SDQI also works well with agile methods, by using the SDQI for assessment in the first sprint; after passing the initial evaluation, we can add more specific data quality indicators in the next sprint.
Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis
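The weighted aggregation and three-level classification described above can be sketched as follows; the indicator names, weights and the 0.8/0.5 cut-offs are illustrative assumptions (the study derives its weights from preparation effort and usability, and the real index passes through factor-analysis dimensions first):

```python
def dataset_quality_index(indicator_scores, weights, good=0.8, acceptable=0.5):
    """Aggregate per-dataset indicator scores (each already scaled
    to 0..1) into one composite score by weighted average, then map
    the score onto the three SDQI-style quality levels."""
    total_w = sum(weights[k] for k in indicator_scores)
    score = sum(indicator_scores[k] * weights[k]
                for k in indicator_scores) / total_w
    if score >= good:
        level = 'Good Quality'
    elif score >= acceptable:
        level = 'Acceptable Quality'
    else:
        level = 'Poor Quality'
    return score, level
```

A single score per dataset is what enables the quick comparison and prioritization the abstract aims for: datasets can be ranked, and only the Poor Quality ones queued for deeper cleaning.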
Procedia PDF Downloads 139