Search results for: assurance
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 282

102 Investigation on Scattered Dose Rate and Exposure Parameters during Diagnostic Examination Done with an Overcouch X-Ray Tube in Nigerian Teaching Hospital

Authors: Gbenga Martins, Christopher J. Olowookere, Lateef Bamidele, Kehinde O. Olatunji

Abstract:

The aims of this research are to measure the scattered dose rate during X-ray examinations in an X-ray room, to compare the scattered dose rate with the exposure parameters selected for each body region examined, and to examine X-ray examinations performed with an overcouch tube. The research was carried out using Gamma Scout software installed on a laptop computer to record the radiation counts, pulse rate, and dose rate. The measurements were made by placing the detector at 90° to the incident X-ray beam. A proforma was used to collect patients' data such as age, sex, examination type, and initial diagnosis. Data such as focus-skin distance (FSD), body mass index (BMI), body thickness of the patients, and tube potential (kVp) were collected at Obafemi Awolowo University, Ile-Ife, Western Nigeria. A total of 136 patients was considered during this research. The dose rate ranged between 14.21 and 86.78 µSv/h for the plain abdominal region, 2.86 and 85.70 µSv/h for the lumbosacral region, 1.3 and 3.6 µSv/yr in the pelvic region, 2.71 and 28.88 µSv/yr for the leg region, and 3.06 and 29.98 µSv/yr in the hand region. The results of this study were compared with those of studies carried out in other countries. The findings indicated that the exposure parameters selected for each diagnostic examination contributed to the dose rate recorded. These results therefore call for a quality assurance program (QAP) in diagnostic X-ray units in Nigerian hospitals.

Keywords: X-radiation, exposure parameters, dose rate, pulse rate, number of counts, tube current, tube potential, diagnostic examination, scattered radiation

Procedia PDF Downloads 73
101 Approaching In vivo Dosimetry for Kilovoltage X-Ray Radiotherapy

Authors: Rodolfo Alfonso, David Alonso, Albin Garcia, Jose Luis Alonso

Abstract:

Recently, a new kilovoltage radiotherapy unit, model Xstrahl 200, donated to INOR's Department of Radiotherapy (DR-INOR) in the framework of an IAEA technical cooperation project, has been commissioned. This unit is able to treat shallow and moderately deep-lying lesions, as it provides 8 discrete beam qualities, from 40 to 200 kV. As part of the patient-specific quality assurance program established at DR-INOR for external beam radiotherapy, it has been recommended to implement in vivo dose measurements (IVD), as they can effectively uncover errors or failures in the radiotherapy process. For that purpose a radio-photoluminescence (RPL) dosimetry system, model XXX, also donated to DR-INOR by the same IAEA project, has been studied and commissioned. The main dosimetric parameters of the RPL system, such as reproducibility, linearity, and field size influence, were assessed. In a similar way, the response of radiochromic EBT3-type film was investigated for purposes of IVD. Both systems were calibrated in terms of entrance surface dose. Results of the dosimetric commissioning of RPL and EBT3 for IVD, and their pre-clinical implementation through end-to-end test cases, are presented. RPL dosimetry seems more advisable for hyper-fractionated schemes with larger fields and curved patient contours, as in chest wall irradiations, where the use of more than one dosimeter may be required. The radiochromic system involves smaller corrections with field size, but its sensitivity is lower; hence it is more suitable for hypo-fractionated treatments with smaller fields.
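As an illustration of the calibration step described above, the sketch below fits a first-order dose-response curve and inverts it to estimate entrance surface dose. The readings, coefficients and the assumption of linearity are hypothetical, not the commissioning data of this study.

```python
import numpy as np

# Hypothetical calibration data: dosimeter reader signal (a.u.) measured at
# known entrance surface doses (cGy). Values are illustrative only.
dose_cgy = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
signal = np.array([12.1, 24.6, 48.9, 99.2, 197.5])

# Fit a first-order calibration curve: signal = a * dose + b
a, b = np.polyfit(dose_cgy, signal, 1)

def signal_to_dose(s):
    """Invert the linear calibration to estimate entrance surface dose."""
    return (s - b) / a

# In vivo check: compare the estimated dose with the prescribed dose.
measured = signal_to_dose(73.5)
prescribed = 300.0
deviation_pct = 100.0 * (measured - prescribed) / prescribed
print(f"estimated dose: {measured:.1f} cGy, deviation: {deviation_pct:+.1f} %")
```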

Keywords: glass dosimetry, in vivo dosimetry, kilovoltage radiotherapy, radiochromic dosimetry

Procedia PDF Downloads 367
100 Hybrid Thresholding Lifting Dual Tree Complex Wavelet Transform with Wiener Filter for Quality Assurance of Medical Image

Authors: Hilal Naimi, Amelbahahouda Adamou-Mitiche, Lahcene Mitiche

Abstract:

Image denoising has been the main problem in the area of medical imaging. Its greatest challenge is to preserve data-carrying structures such as surfaces and edges in order to achieve good visual quality. Different algorithms with different denoising performances have been proposed in previous decades. More recently, models based on deep learning have shown great promise in outperforming all traditional approaches. However, these techniques are limited by the necessity of large training sample sizes and high computational costs. This research proposes a denoising approach based on the LDTCWT (Lifting Dual Tree Complex Wavelet Transform), using hybrid thresholding with a Wiener filter to enhance image quality. The LDTCWT is described as a lifting reformulation of the wavelet transform that produces complex coefficients by employing a dual tree of lifting wavelet filters to obtain the real and imaginary parts. This permits the transform to achieve approximate shift invariance and directionally selective filters while reducing computation time (properties lacking in the classical wavelet transform). To develop this approach, a hybrid thresholding function is modelled by integrating the Wiener filter into the thresholding function.
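A minimal sketch of the hybrid shrinkage idea follows, assuming an ordinary real discrete wavelet transform (PyWavelets) as a stand-in for the LDTCWT, which is not available in standard libraries; the threshold rule (universal threshold followed by an empirical Wiener-style gain) is a generic choice, not necessarily the paper's exact function.

```python
import numpy as np
import pywt

def hybrid_shrink(c, sigma, t):
    """Soft-threshold small coefficients, then apply a Wiener-style gain.

    Generic stand-in for the paper's hybrid thresholding: coefficients below
    t are removed; survivors are attenuated by c^2 / (c^2 + sigma^2).
    """
    soft = np.sign(c) * np.maximum(np.abs(c) - t, 0.0)
    return soft * (soft**2 / (soft**2 + sigma**2 + 1e-12))

def denoise(img, wavelet="db2", level=2):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Robust noise estimate from the finest diagonal subband (HH1).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    t = sigma * np.sqrt(2.0 * np.log(img.size))   # universal threshold
    out = [coeffs[0]]                             # keep approximation band
    for cH, cV, cD in coeffs[1:]:
        out.append(tuple(hybrid_shrink(c, sigma, t) for c in (cH, cV, cD)))
    return pywt.waverec2(out, wavelet)

noisy = np.random.rand(128, 128) + 0.1 * np.random.randn(128, 128)
clean = denoise(noisy)
```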

Keywords: lifting wavelet transform, image denoising, dual tree complex wavelet transform, wavelet shrinkage, wiener filter

Procedia PDF Downloads 131
99 Accessibility to Urban Parks for Low-income Residents in Chongqing, China: Perspective from Relative Deprivation

Authors: Junhang Luo

Abstract:

With the transformation of spatial structure and the deepening of urban development, residents' demand for a better life and their concern for equity in social resources are increasing. As an important social resource, the park plays an essential role in building environmentally sustainable cities. It is therefore important to examine park accessibility for low-income residents and how it relates to relative deprivation, so as to provide all residents with equitable services. Using the network and buffer methods of GIS, this paper analyzes urban park accessibility for low-income residents in Chongqing, China, and then conducts a satisfaction evaluation of park accessibility among low-income residents through questionnaire surveys along deprivation dimensions. Results show that the level of park accessibility in Chongqing varies significantly and that the degree of relative deprivation is relatively high. Improving the convenience of public transportation and increasing the number of community parks contribute positively to improving park accessibility and alleviating the relative deprivation of public resources. Combined with China's pattern of innovation in social governance, the paper suggests that urban park accessibility needs to be jointly governed and optimized by multiple social actors, from the government to the public, and that service efficiency requires index systems and planning standards adapted to local conditions in order to improve quality and promote equity. At the same time, building a complete park system and a sound legislative assurance system will also play a positive role in ensuring that all residents, especially low-income groups, can enjoy urban public space more fairly.
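As a hedged sketch of the buffer method mentioned above (not the study's actual workflow or data), the snippet below marks a residence as served when it falls within 500 m of a park; the file names, the 500 m threshold and the UTM zone 48N projection are illustrative assumptions.

```python
import geopandas as gpd

# Hypothetical layers: park polygons and residential points for low-income
# communities; file names are placeholders.
parks = gpd.read_file("chongqing_parks.shp").to_crs(epsg=32648)       # metric CRS
homes = gpd.read_file("lowincome_residences.shp").to_crs(epsg=32648)

# Buffer method: a residence is "served" if it lies within 500 m of any park.
service_area = parks.buffer(500).unary_union
homes["served"] = homes.geometry.within(service_area)

coverage = homes["served"].mean()
print(f"share of low-income residences within 500 m of a park: {coverage:.1%}")
```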

Keywords: urban park, accessibility, relative deprivation, GIS network analysis, Chongqing

Procedia PDF Downloads 126
98 Conflict, Confusion or Compromise: Violence against Women, A Case Study of Pakistan

Authors: Farhat Jabeen, Syed Asfaq Hussain Bukhari

Abstract:

The basic objective of this paper is to point out that the socio-cultural scenario of contemporary Pakistan reveals gender-based violence to be deeply rooted in society, irrespective of language and ethnicity. The paper explores possible reforms in Pakistan for diminishing violence. Women are not given their due role, rights, and respect; furthermore, they are treated as chattels. The presentation covers socio-customary practices in the context of discrimination, stigmatization, and violence against women. The paper envisages justice in the broader sense of recognition of rights for women; the masculine structure of society, socio-customary practices and discrimination against women are very serious concerns that need to be understood as a multidimensional problem. The paper focuses especially on understanding the obstacles women face in Pakistan in the constitutional scenario. Women encounter discrimination and human rights violations, including sexual violence and domestic abuse, and are disadvantaged by laws, policies, and programming that do not take their concerns into consideration. The presentation examines the role of honour killings in Pakistani communities, which affects women's self-assurance and their capability to raise campaigns for integrity where gender inequalities and discrimination in the social and legal domains are to be put right. The paper brings to light the range of practices, laws and legal justice regarding the status of women, and also covers attitudes towards compensation for murders/killings, domestic violence, rape, adultery, social behavior and recourse to justice.

Keywords: discrimination, cultural, women, violence

Procedia PDF Downloads 293
97 Nature-based Solutions for Mitigating the Impact of Climate Change on Plants: Utilizing Encapsulated Plant Growth Regulators and Associative Microorganisms

Authors: Raana Babadi Fathipour

Abstract:

Over the past decades, the atmospheric CO2 concentration and the worldwide average temperature have been increasing, and this trend is projected to soon become more extreme. This scenario of climate change intensifies abiotic stress factors (such as drought, flooding, salinity, and ultraviolet radiation) that threaten forests and related ecosystems as well as crop production. These factors can negatively affect plant growth and development, with a consequent reduction in plant biomass accumulation and yield, in addition to increasing plant susceptibility to biotic stresses. Recently, biostimulants have become a hotspot as an effective and economical alternative for reducing the negative impacts of stresses on plants. However, the majority of biostimulants have poor stability under environmental conditions, which leads to premature degradation and shortens their biological activity. To resolve these bottlenecks, micro- and nano-based formulations containing biostimulant molecules and/or microorganisms are gaining attention, as they demonstrate several advantages over conventional formulations. In this review, we focus on the encapsulation of plant growth regulators and plant-associated microorganisms as a strategy to boost their application for plant protection against abiotic stresses. We also address the potential limitations and challenges faced in implementing this technology, as well as possibilities for future research.

Keywords: biostimulants, seed priming, nanobiotechnology, plant growth-promoting rhizobacteria, plant growth regulators, microencapsulation

Procedia PDF Downloads 38
96 Efficacy of Technology for Successful Learning Experience; Technology Supported Model for Distance Learning: Case Study of Botho University, Botswana

Authors: Ivy Rose Mathew

Abstract:

The purpose of this study is to outline the efficacy of technology and the opportunities it can bring to implementing a successful delivery model in distance learning. Distance learning has proliferated over the past few years across the world. Challenges faced by current students of distance education include lack of motivation, a sense of isolation and a need for greater and improved communication. Hence the author proposes a creative, technology-supported model for distance learning, closely mirroring traditional face-to-face learning, that can be adopted by distance learning providers. This model suggests the usage of a range of technologies and social networking facilities, with the aim of creating a more engaging and sustaining learning environment to help overcome the isolation often noted by distance learners. While discussing the possibilities, the author also highlights the complexity and practical challenges of implementing such a model. Design/methodology/approach: Theoretical issues from previous research related to successful models for distance learning providers will be considered, together with the analysis of a case study from one of the largest private tertiary institutions in Botswana, Botho University. This case study illustrates important aspects of the distance learning delivery model and provides insights into how curriculum development is planned, quality assurance is done, and learner support is assured for a successful distance learning experience. Research limitations/implications: While some aspects of this study may not be applicable to other contexts, a number of new providers of distance learning can adapt the key principles of this delivery model.

Keywords: distance learning, efficacy, learning experience, technology supported model

Procedia PDF Downloads 210
95 Comparative Evaluation of EBT3 Film Dosimetry Using Flatbed Scanner, Densitometer and Spectrophotometer Methods and Its Applications in Radiotherapy

Authors: K. Khaerunnisa, D. Ryangga, S. A. Pawiro

Abstract:

Over the past few decades, film dosimetry has become a tool used in various radiotherapy modalities, either for clinical quality assurance (QA) or for dose verification. The response of the film to irradiation is usually expressed in optical density (OD) or net optical density (netOD). Because the film's response to radiation is not linear, the use of film as a dosimeter must go through a calibration process. This study aimed to compare the calibration curve response functions of various measurement methods: a flatbed scanner, a point densitometer and a spectrophotometer. For every response function, a radiochromic film calibration curve was generated from each method, with accuracy, precision and sensitivity analyses performed. netOD is obtained by measuring the change in the optical density (OD) of the film between before and after irradiation: with the film scanner, ImageJ is used to extract the pixel value of the film on the red channel of the three (RGB) channels; with the point densitometer, the change in OD before and after irradiation is calculated; and with the spectrophotometer, the change in absorbance before and after irradiation is calculated. The results showed that the three calibration methods gave readings with a netOD precision below 3% for an uncertainty of 1σ (one sigma). While the sensitivities of the three methods follow the same trend in film response to radiation, they differ in magnitude. The accuracy of the three methods was below 3% for doses above 100 cGy and 200 cGy, but for doses below 100 cGy it was above 3% when using the point densitometer and the spectrophotometer. When the three methods were used for clinical implementation, the results showed accuracy and precision below 2% for the scanner and the spectrophotometer, and above 3% for the point densitometer.
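The netOD computation described above can be sketched in a few lines; the pixel values and the power-law calibration coefficients below are invented for illustration (the power-law form is a common convention for radiochromic film, not necessarily this study's fit).

```python
import numpy as np

# Hypothetical red-channel pixel values (mean over a region of interest),
# e.g. extracted with ImageJ from RGB scans of EBT3 film.
pv_before = 41250.0   # unirradiated film
pv_after = 28730.0    # same film piece after irradiation

# Net optical density: change in OD between the two scans.
net_od = np.log10(pv_before / pv_after)

# A common calibration form for radiochromic film is a power law,
# dose = a * netOD + b * netOD**n; the coefficients here are illustrative.
a, b, n = 350.0, 4200.0, 2.5
dose_cgy = a * net_od + b * net_od**n
print(f"netOD = {net_od:.4f}, estimated dose = {dose_cgy:.0f} cGy")
```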

Keywords: calibration methods, EBT3 film dosimetry, flatbed scanner, densitometer, spectrophotometer

Procedia PDF Downloads 101
94 User-Perceived Quality Factors for Certification Model of Web-Based System

Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh

Abstract:

One of the most essential issues with software products is maintaining their relevance to the dynamics of users' requirements and expectations. Many studies have been carried out on quality aspects of software products to overcome these problems. Previous software quality assessment models and metrics have been introduced, each with strengths and limitations. In order to enhance the assurance and buoyancy of software products, certification models have been introduced and developed. From our previous experience in certification exercises and case studies, conducted in collaboration with several agencies in Malaysia, the requirement for a user-based software certification approach was identified and found to be in demand. The emergence of social network applications, new development approaches such as agile methods, and other varieties of software in the market have led to the domination of users over the software. As software becomes more accessible to the public through internet applications, users are becoming more critical of the quality of the services provided by the software. There are several categories of users in web-based systems, with different interests and perspectives. The classifications and metrics were identified through a brainstorming approach that included researchers, users and experts in this area. This new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. The developments are beneficial and valuable for overcoming the constraints and improving the application of software certification models in the future.

Keywords: software certification model, user centric approach, software quality factors, metrics and measurements, web-based system

Procedia PDF Downloads 375
93 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, the development of advanced in-situ monitoring systems is required to monitor process anomalies as input for further process control. Optical-based monitoring systems, such as CMOS cameras and NIR cameras, have proved to be effective ways to monitor geometrical distortion and abnormal thermal distribution. Therefore, many studies and applications focus on the availability of optical-based monitoring systems for detecting various types of defects. However, the capability of the monitoring setup itself is usually not quantified. In this study, a quantification model to evaluate the capability of monitoring setups for the LPBF machine, based on acquired monitoring data of a designed test artifact, is presented, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, the location of its integration, and the light conditions. The methodology of data processing to quantify the capability for each aspect is discussed. The minimal detectable feature size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model can quantify the monitoring system's performance, which makes it possible to evaluate monitoring systems with the same concept but different setups for the LPBF process, and provides direction for improving the setups.
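To make the notion of a "minimal detectable size" concrete, the back-of-the-envelope sketch below derives it from pixel pitch and a registration-accuracy margin; all numbers and the two-pixel Nyquist-style rule are illustrative assumptions, not the paper's quantification model.

```python
# Hedged estimate of the smallest defect an optical monitoring setup can
# resolve, from its hardware properties. All values are assumptions.
sensor_pixels = 2048            # pixels across the field of view
field_of_view_mm = 120.0        # imaged build-plate width
pixel_pitch_mm = field_of_view_mm / sensor_pixels   # object-space pixel size

# Rule of thumb: a feature must span at least ~2 pixels to be detected
# reliably; add an accuracy margin measured on a test artifact.
min_pixels_per_feature = 2
registration_error_mm = 0.03    # accuracy term from artifact measurements

min_detectable_mm = min_pixels_per_feature * pixel_pitch_mm + registration_error_mm
print(f"pixel pitch: {pixel_pitch_mm*1000:.1f} um/px, "
      f"minimal detectable size: {min_detectable_mm*1000:.0f} um")
```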

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 169
92 Design and Development of Herbal Formulations: Challenges and Solutions

Authors: B. Sathyanarayana

Abstract:

As per the report of the World Health Organization, more than 80% of the world's population uses medicines made from herbal and natural materials. These have stood the test of time for their safety, efficacy, cultural acceptability and lesser side effects. Quality assurance and control measures, such as national quality specifications and standards for herbal materials, good manufacturing practices (GMP) for herbal medicines, labelling, and licensing schemes for manufacturing, import and marketing, should be in place in every country where herbal medicines are regulated. These measures are vital for ensuring the safety and efficacy of herbal medicines. In the case of herbal products, the challenge begins at the design stage itself, except for classical products. The selection of herbal ingredients, the officinal parts to be used, and their proportions are vital. Once the formulation is designed, one should take the utmost care to produce a standardized product of assured quality and safety. Quality control measures should cover the validation of the quality and identity of raw materials, in-process control (as per SOP and GMP norms), and control at the level of the final product. Quality testing and safety and efficacy studies of the final product are required to ensure the safe and effective use of herbal products in human beings. Medicinal plants, being materials of natural origin, are subject to great variation, making it genuinely difficult to fix quality standards, especially in the case of polyherbal preparations. Manufacturing also needs modification according to the type of ingredients present. Hence, it becomes essential to develop a Standard Operating Procedure for each specific herbal product. The present paper throws light on the challenges encountered during the design and development of herbal products.

Keywords: herbal product, challenges, quality, safety, efficacy

Procedia PDF Downloads 477
91 Urban Transport Demand Management Multi-Criteria Decision Using AHP and SERVQUAL Models: Case Study of Nigerian Cities

Authors: Suleiman Hassan Otuoze, Dexter Vernon Lloyd Hunt, Ian Jefferson

Abstract:

Urbanization has continued to widen the gap between demand and the resources available to provide resilient and sustainable transport services in the cities of many fast-growing developing countries. Transport demand management is a decision-based optimization concept for both benchmarking and ensuring efficient use of transport resources. This study assesses the service quality of infrastructure and mobility services in the Nigerian cities of Kano and Lagos through five dimensions of quality (i.e., tangibility, reliability, responsiveness, safety assurance and empathy). The methodology adopts a hybrid AHP-SERVQUAL model applied to questionnaire surveys to gauge the quality of satisfaction and the views of experts in the field. The AHP results prioritize tangibility, which describes the state of transportation infrastructure and services, in terms of satisfaction qualities and intervention decision weights in the two cities. The results recorded 'unsatisfactory' indices of quality of performance, with satisfaction ratings of 48% and 49% for Kano and Lagos, respectively. The satisfaction indices are identified as indicators of the low performance of transportation demand management (TDM) measures and of the necessity to re-order priorities and take proactive steps towards infrastructure. The findings pilot a framework for the comparative assessment of recognizable standards in transport services, best practices of management, and the necessity of quality infrastructure to guarantee both resilient and sustainable urban mobility.
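A minimal sketch of the AHP step is shown below: a pairwise comparison matrix over the five SERVQUAL dimensions is reduced to priority weights via its principal eigenvector, with Saaty's consistency check. The comparison values are invented for illustration and are not the survey data of this study.

```python
import numpy as np

dims = ["tangibility", "reliability", "responsiveness", "assurance", "empathy"]
# Reciprocal pairwise comparison matrix on Saaty's 1-9 scale (illustrative).
A = np.array([
    [1,   3,   4,   3,   5],
    [1/3, 1,   2,   1,   3],
    [1/4, 1/2, 1,   1/2, 2],
    [1/3, 1,   2,   1,   3],
    [1/5, 1/3, 1/2, 1/3, 1],
])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)                  # principal eigenvalue index
w = np.abs(vecs[:, k].real)
w /= w.sum()                              # priority weights

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)         # consistency index
cr = ci / 1.12                            # random index RI = 1.12 for n = 5

for d, wi in zip(dims, w):
    print(f"{d:15s} {wi:.3f}")
print(f"consistency ratio = {cr:.3f} (conventionally acceptable if < 0.10)")
```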

Keywords: transportation demand management, multi-criteria decision support, transport infrastructure, service quality, sustainable transport

Procedia PDF Downloads 195
90 Incidences and Chemico-Mobility of Toxic Heavy Metals in Environmental Samples

Authors: I. Hilia, C. Hange, F. Hakala, M. Matheus, C. Jansen, J. Hidinwa, O. Awofolu

Abstract:

The article reports on the occurrence, levels, and mobility of selected trace metals in environmental samples. The conceptual basis was to examine the possible influence of anthropogenic activities and the impact on human and environmental health. Environmental samples (soil, plant and lower animal) were randomly collected from stratified study/sampling areas, preserved and pre-treated before analysis. A mineral acid digestion procedure was employed for the isolation of the metallic content of the samples, and elemental qualitative and quantitative analysis was performed by ICP-OES. The analytical protocol was validated through a quality assurance process and found acceptable, with quantitative metallic recoveries in the range of 85-90%; it was hence considered applicable for the analyses of environmental samples. The mean concentrations of analysed metals in soil samples ranged from 53.2-2532.8 mg/kg (Cu), 59.5-2020.1 mg/kg (Zn), 1.80-21.26 mg/kg (Cd) and 19.6-140.9 mg/kg (Pb). The mean levels in grass samples ranged from 9.33-38.63 mg/kg (Cu), 64.20-105.18 mg/kg (Zn), 0.28-0.73 mg/kg (Cd) and 0.53-16.26 mg/kg (Pb), while the mean levels in the lower animal sample (beetle) varied from 9.6-105.3 mg/kg (Cu), 134.1-297.2 mg/kg (Zn), 0.63-3.78 mg/kg (Cd) and 8.0-29.1 mg/kg (Pb) across sample collection points (SCPs) 1-4, respectively. Metallic transfer factors (TFs) were in the order Zn > Cd > Cu > Pb, with metal pollution indices (MPIs) in the order SCP1 > SCP2 > SCP3 > SCP4. About 60-70% of the analysed metals were above the maximum allowable limits (MALs) in soil and plant samples. The results revealed the general prevalence of the analysed metals at all sampled sites, with indications of metallic mobility across the food chain, which portends dire consequences for environmental and human health. Systematic environmental remediation and pollution abatement strategies are recommended.
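The two indices reported above can be sketched as follows; the formulations (TF as the plant-to-soil concentration ratio, MPI as the geometric mean of measured concentrations) follow common usage in the literature rather than this article's exact definitions, and the concentrations are invented.

```python
import numpy as np

# Illustrative concentrations (mg/kg), not the study's data.
soil = {"Cu": 1200.0, "Zn": 900.0, "Cd": 10.0, "Pb": 80.0}
grass = {"Cu": 25.0, "Zn": 85.0, "Cd": 0.5, "Pb": 8.0}

# Transfer factor: metal concentration in the plant relative to the soil.
tf = {m: grass[m] / soil[m] for m in soil}

# Metal pollution index: geometric mean of the measured concentrations.
mpi = float(np.prod(list(grass.values())) ** (1.0 / len(grass)))

for m, v in sorted(tf.items(), key=lambda kv: -kv[1]):
    print(f"TF({m}) = {v:.3f}")
print(f"MPI(grass) = {mpi:.2f} mg/kg")
```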

Keywords: trace metals, pollution, human health, incidences, ICP-OES

Procedia PDF Downloads 135
89 An Audit on Tracheal Tube Cuff Pressure Check and Monitoring during Current Practice

Authors: Mahmoud Hassanin, Roshan Thawale, Kiran Yelamati

Abstract:

Background: In current practice, regular intraoperative monitoring of endotracheal cuff pressure is not routine, despite the significant number of clinicians interested in checking it after intubation to ensure a good seal and adequate ventilation. Aims and objectives: To highlight that current practice has no guidance on regular intra-operative monitoring of endotracheal tube cuff pressure, which could improve patient safety and the post-operative experience. Methods: A local departmental survey was conducted, targeting anaesthetists' current practice and measuring their knowledge and problem awareness, with a view to improving patient satisfaction and changing the current approach. Results: The participants were not using the manometer, despite their interest in ensuring that the cuff pressure was high enough to provide a proper seal. More than 50% of the participants did not know the ideal range of endotracheal tube cuff pressure, and 32% did not know whether a manometer was available in the theatre. Despite these findings, 100% of the participants used various methods to ensure adequate cuff pressure. The collected data revealed that at least 26% of the participants confirmed that they had seen patients with post-intubation complications. Conclusion: There is increasing importance placed on quality assurance. Clinical practice varies widely among practitioners, with the only consistency being the omission of cuff manometers during routine intra-operative management, despite their proven benefit and efficacy. Anaesthetists and ODPs should be encouraged to use cuff pressure manometers, and the availability of portable pressure manometers can help to maintain safe cuff pressures in patients requiring endotracheal intubation.

Keywords: endotracheal cuff pressure, intra-operative monitoring, current practice, patient satisfaction

Procedia PDF Downloads 76
88 Global Healthcare Village Based on Mobile Cloud Computing

Authors: Laleh Boroumand, Muhammad Shiraz, Abdullah Gani, Rashid Hafeez Khokhar

Abstract:

Cloud computing, being the use of hardware and software delivered as a service over a network, has applications in the area of health care. The emergency cases reported in most medical centers prompt the need for an efficient scheme to make health data available with a short response time. To this end, we propose a mobile global healthcare village (MGHV) model that combines the components of three deployment models, namely country, continent and global health clouds, to help solve the problem mentioned above. In the continent model, two data centers are created, of which one is local and the other global. The local one serves the requests of residents within the continent, whereas the global one serves the requests of all others. With the methods adopted, there is an assurance of the availability of relevant medical data to patients, specialists and emergency staff regardless of location and time. From our intensive experiments using a simulation approach, it was observed that a service broker policy optimized for response time yields very good performance in terms of response-time reduction. Our results remain comparable to others as the number of virtual machines increases (80-640 virtual machines), with the proportional increase in response time staying within 9%. The results of our simulation experiments show that utilizing MGHV leads to a reduction in health care expenditure and helps solve the problem of unqualified medical staff faced by both developed and developing countries.

Keywords: mobile cloud computing (MCC), e-healthcare, availability, response time, service broker policy

Procedia PDF Downloads 337
87 Method Validation for Heavy Metal Determination in Spring Water and Sediments

Authors: Habtamu Abdisa

Abstract:

Spring water is particularly valuable due to its high mineral content, which is beneficial for human health. However, anthropogenic activities often unbalance the natural levels of its composition, which can cause adverse health effects. Regular monitoring of naturally occurring environmental resources is of great concern in the world today. Spectrophotometric methods are among the best for qualifying and quantifying the mineral contents of environmental water samples. This research was conducted to evaluate the quality of spring water with respect to its heavy metal composition. A grab sampling technique was employed to collect representative samples, including duplicates. The samples were then treated with concentrated HNO3 to a pH below 2 and stored at 4 °C. The samples were digested and analyzed for cadmium (Cd), chromium (Cr), manganese (Mn), copper (Cu), iron (Fe) and zinc (Zn) following method validation. Atomic Absorption Spectrometry (AAS) was utilized for the sample analysis. Quality control measures, including blanks, duplicates and certified reference materials (CRMs), were implemented to ensure the accuracy and precision of the analytical results. Of the metals analyzed in the water samples, Cd and Cr were found to be below the detection limit. However, the concentrations of Mn, Cu, Fe and Zn ranged between mean values of 0.119-0.227 mg/L, 0.142-0.166 mg/L, 0.183-0.267 mg/L and 0.074-0.181 mg/L, respectively. Sediment analysis revealed mean concentration ranges of 348.31-429.21 mg/kg, 0.23-0.28 mg/kg, 18.73-22.84 mg/kg, 2.76-3.15 mg/kg, 941.84-1128.56 mg/kg and 42.39-66.53 mg/kg for Mn, Cd, Cu, Cr, Fe and Zn, respectively. The study established that the evaluated spring water and its associated sediment met the regulatory standards and guidelines for heavy metal concentrations. Furthermore, this research can enhance quality assurance and control processes for environmental sample analysis, ensuring the generation of reliable data.
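One common convention for the method detection limit named in the keywords is MDL = t(n-1, 0.99) x s over low-level replicates (the US EPA procedure); the sketch below applies it to invented replicate data and is not this study's validation result.

```python
import numpy as np
from scipy import stats

# Replicate analyses of a low-level spiked blank (mg/L); values invented.
replicates_mg_l = np.array([0.021, 0.018, 0.024, 0.020, 0.019, 0.023, 0.022])

n = len(replicates_mg_l)
s = replicates_mg_l.std(ddof=1)          # replicate standard deviation
t99 = stats.t.ppf(0.99, df=n - 1)        # one-sided 99th percentile t value

mdl = t99 * s
print(f"s = {s:.4f} mg/L, t(0.99, {n-1}) = {t99:.3f}, MDL = {mdl:.4f} mg/L")
```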

Keywords: method validation, heavy metal, spring water, sediment, method detection limit

Procedia PDF Downloads 35
86 Modeling Flow and Deposition Characteristics of Solid CO2 during Choked Flow of CO2 Pipeline in CCS

Authors: Teng Lin, Li Yuxing, Han Hui, Zhao Pengfei, Zhang Datong

Abstract:

With the development of carbon capture and storage (CCS), the flow assurance of CO2 transportation becomes more important, particularly for supercritical CO2 pipelines. A relieving system using a choke valve is applied to control the pressure in the CO2 pipeline. However, the temperature of the fluid can drop rapidly because of Joule-Thomson cooling (JTC), which may cause solid CO2 to form and block the pipe. In this paper, a Computational Fluid Dynamics (CFD) model, using a modified Lagrangian method, the Reynolds Stress Transport model (RSM) for turbulence and a stochastic tracking model (STM) for particle trajectories, was developed to predict the deposition characteristics of solid carbon dioxide. The model predictions were in good agreement with the experimental data published in the literature. It was observed that the particle distribution affected the deposition behavior. In the region of the sudden expansion, the smaller particles, accumulated tightly on the wall, were dominant in pipe blockage. By contrast, the solid CO2 particles deposited near the outlet were usually bigger and their stacked structure was looser. According to the calculation results, the particle motion can be classified into four main types: turbulent motion close to the sudden expansion structure, balanced motion in the middle region after the sudden expansion, inertial motion near the outlet, and escape. Because of these four types of motion, the particle deposits accumulate primarily in the sudden expansion region, the reattachment region and the outlet region. The Stokes number also has an effect on the deposition ratio, and it is recommended to avoid Stokes numbers in the range of 3-8.
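For readers unfamiliar with the dimensionless group in the last sentence, the sketch below evaluates the standard Stokes number St = ρp·dp²·U/(18·μ·L) for a few particle sizes; every property value is an illustrative assumption, not the paper's simulation input.

```python
# Standard Stokes number: particle response time over flow time scale.
# All property values below are assumptions for illustration.
rho_p = 1562.0      # solid CO2 particle density, kg/m^3
mu = 1.5e-5         # gas dynamic viscosity, Pa*s
u = 10.0            # characteristic fluid velocity, m/s
l_char = 0.05       # characteristic length (e.g. pipe diameter), m

def stokes_number(d_p):
    tau_p = rho_p * d_p**2 / (18.0 * mu)   # particle relaxation time, s
    return tau_p * u / l_char

for d_um in (1, 5, 10, 20, 60, 100):
    st = stokes_number(d_um * 1e-6)
    flag = "  <-- in the 3-8 range flagged by the study" if 3 <= st <= 8 else ""
    print(f"d_p = {d_um:3d} um -> St = {st:7.3f}{flag}")
```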

Keywords: carbon capture and storage, carbon dioxide pipeline, gas-particle flow, deposition

Procedia PDF Downloads 342
85 Development of Alpha Spectroscopy Method with Solid State Nuclear Track Detector Using Aluminium Thin Films

Authors: Nidal Dwaikat

Abstract:

This work presents the development of an alpha spectroscopy method with solid-state nuclear track detectors using aluminum thin films. The resolution of this method is high, and it is able to discriminate between alpha particles at different incident energies. It can measure the exact number of alpha particles at a specific energy without needing a calibration of alpha track diameter versus alpha energy. The method was tested using a Cf-252 alpha standard source at energies of 5.11 MeV, 3.86 MeV and 2.7 MeV, produced by varying the detector-source distance. On the front side, two detectors were covered with two aluminum thin films, and the third detector was kept uncovered. The thicknesses of the aluminum thin films were selected carefully (using SRIM 2013) such that one of the films blocks the two lower-energy alpha particles (3.86 MeV and 2.7 MeV) while alpha particles at the higher energy (5.11 MeV) can penetrate the film and reach the detector's surface. The second thin film blocks alpha particles at the lowest energy (2.7 MeV) and allows alpha particles at the two higher energies (5.11 MeV and 3.86 MeV) to penetrate and produce tracks. On the uncovered detector, alpha particles at all three energies can produce tracks. For quality assurance and accuracy, the detectors were mounted on copper substrates thick enough to block exposure from the backside. The tracks on the first detector are due to alpha particles at 5.11 MeV. The difference between the track counts on the second and first detectors is due to alpha particles at 3.86 MeV. Finally, by subtracting the track count on the second detector from that on the third (uncovered) detector, we find the tracks due to alpha particles at 2.7 MeV. Knowing the efficiency calibration factor, we can then calculate the activity of the standard source exactly.
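The subtraction logic of the three-detector arrangement reads directly as arithmetic; the track counts, efficiency and exposure time below are invented to illustrate it, and a complete activity estimate would also include the source-detector geometry factor.

```python
# The three-detector subtraction described above, as a small worked example.
n1 = 1520   # detector under the thicker film: only 5.11 MeV alphas register
n2 = 2890   # detector under the thinner film: 5.11 and 3.86 MeV alphas register
n3 = 4310   # uncovered detector: all three energies register

tracks = {
    "5.11 MeV": n1,
    "3.86 MeV": n2 - n1,
    "2.70 MeV": n3 - n2,
}

efficiency = 0.9        # assumed track-registration efficiency
t_exposure_s = 3600.0   # assumed exposure time, s

for energy, n in tracks.items():
    rate_bq = n / (efficiency * t_exposure_s)   # per-detector emission rate
    print(f"{energy}: {n} tracks -> {rate_bq:.3f} Bq seen by the detector")
```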

Keywords: aluminium thin film, alpha particles, copper substrate, CR-39 detector

Procedia PDF Downloads 339
84 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With the industry's complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction and maximized operational performance. Challenges such as data silos, heterogeneity, real-time data management and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, 'reskilling and upskilling' employees, and establishing robust data management training programmes play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in an ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud Computing, data optimization

Procedia PDF Downloads 40
83 Unlocking Justice: Exploring the Power and Challenges of DNA Analysis in the Criminal Justice System

Authors: Sandhra M. Pillai

Abstract:

This article examines the relevance, difficulties and potential applications of DNA analysis in the criminal justice system. A potent tool for connecting suspects to crime scenes, clearing the innocent of wrongdoing and resolving cold cases, DNA analysis has transformed forensic investigations. The scientific foundations of DNA analysis, including DNA extraction, sequencing and statistical analysis, are covered in the article. To guarantee accurate and trustworthy findings, it also discusses the significance of quality assurance procedures, the chain of custody and DNA sample storage. DNA analysis has significantly advanced science, but it also raises substantial moral and legal issues. To safeguard individual rights and uphold public confidence, privacy concerns, possible discrimination and abuse of DNA information must be properly addressed. The paper also emphasises the effects on people and communities within the criminal justice system while highlighting the necessity of equity, openness and fair access to DNA testing. The essay describes the obstacles and future directions for DNA analysis. It looks at cutting-edge technologies like next-generation sequencing, which promises to make DNA analysis quicker and more affordable. To ensure the appropriate and informed use of DNA evidence, it also emphasises the significance of multidisciplinary collaboration among scientists, law enforcement organisations, legal experts and policymakers. In conclusion, DNA analysis has enormous potential for improving the course of criminal justice. We can exploit the potential of DNA technology while respecting the ideals of justice, fairness and individual rights by navigating the ethical, legal and societal issues and encouraging discussion and collaboration.

Keywords: DNA analysis, DNA evidence, reliability, validity, legal frame, admissibility, ethical considerations, impact, future direction, challenges

Procedia PDF Downloads 40
82 Evaluating the Dosimetric Performance for 3D Treatment Planning System for Wedged and Off-Axis Fields

Authors: Nashaat A. Deiab, Aida Radwan, Mohamed S. Yahiya, Mohamed Elnagdy, Rasha Moustafa

Abstract:

This study evaluates the dosimetric performance of our institution's 3D treatment planning system for wedged and off-axis 6 MV photon beams, guided by the recommended QA tests documented in AAPM TG-53, the NCS Report 15 test packages, IAEA TRS-430 and ESTRO Booklet No. 7. The study was performed on an Elekta Precise linear accelerator designed for a clinical range of 4, 6 and 15 MV photon beams, with asymmetric jaws and a fully integrated multileaf collimator that enables high conformance to the target with sharp field edges. Ten tests were applied to a solid water-equivalent phantom together with a 2D array dose detection system. The doses calculated using the 3D treatment planning system PrecisePLAN were compared with measured doses to make sure that the dose calculations are accurate for simple situations such as square and elongated fields, different SSDs, beam modifiers (e.g. wedges, blocks, MLC-shaped fields) and asymmetric collimator settings. The QA results showed dosimetric accuracy of the TPS within the specified tolerance limits, with two exceptions: for the large elongated wedged field, the errors on and outside the central axis were 0.2% and 0.5%, respectively, and for the off-planned and off-axis elongated fields, the errors on and outside the central axis were 0.2% and 1.1%, respectively. The dosimetric results investigated yielded differences within the accepted tolerance levels as recommended. Differences between dose values predicted by the TPS and measured values at the same point result from limitations of the dose calculation, uncertainties in the measurement procedure, or fluctuations in the output of the accelerator.
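The basic pass/fail comparison behind such tests is a percent deviation between calculated and measured dose at matched points, as sketched below with invented doses and an assumed 3% tolerance (tolerances in the cited reports vary by geometry).

```python
# Percent deviation between TPS-calculated and measured dose at matched
# points. Doses and the tolerance are illustrative, not the study's data.
points = [
    # (label, D_tps_cGy, D_measured_cGy)
    ("square 10x10, CAX", 200.0, 199.1),
    ("wedged 20x10, CAX", 185.0, 184.6),
    ("wedged 20x10, off-axis", 150.0, 148.3),
]
tolerance_pct = 3.0   # assumed tolerance for this beam/geometry class

for label, d_tps, d_meas in points:
    dev = 100.0 * (d_tps - d_meas) / d_meas
    verdict = "PASS" if abs(dev) <= tolerance_pct else "FAIL"
    print(f"{label:25s} deviation = {dev:+5.2f} %  {verdict}")
```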

Keywords: quality assurance, dose calculation, wedged fields, off-axis fields, 3D treatment planning system, photon beam

Procedia PDF Downloads 412
81 Quality Assurance for the Climate Data Store

Authors: Judith Klostermann, Miguel Segura, Wilma Jans, Dragana Bojovic, Isadora Christel Jimenez, Francisco Doblas-Reyes, Judit Snethlage

Abstract:

The Climate Data Store (CDS), developed by the Copernicus Climate Change Service (C3S) implemented by the European Centre for Medium-Range Weather Forecasts (ECMWF) on behalf of the European Union, is intended to become a key instrument for exploring climate data. The CDS contains both raw and processed data to provide users with information about the past, present and future climate of the earth. It allows easy and free access to climate data and indicators, presenting an important asset for scientists and stakeholders on the path towards achieving a more sustainable future. The C3S Evaluation and Quality Control (EQC) function assesses the quality of the CDS by undertaking a comprehensive user requirement assessment to measure user satisfaction. Recommendations will be developed for the improvement and expansion of the CDS datasets and products. User requirements will be identified concerning the fitness of the datasets, the toolbox, and the overall CDS service. The EQC function of the CDS will help C3S to make the service more robust: underpinned by validated data that follow high-quality standards, while remaining user-friendly. This function will be developed in close collaboration with the users of the service. Through their feedback, suggestions and contributions, the CDS can become more accessible and meet the requirements of a diverse range of users. Stakeholders and their active engagement are thus an important aspect of CDS development. This will be achieved through direct interactions with users, such as meetings, interviews or workshops, as well as different feedback mechanisms like surveys or helpdesk services at the CDS. The results provided by the users will be categorized as a function of CDS products, so that their specific interests can be monitored and linked to the right product. Through this procedure, we will identify the requirements and criteria for data and products in order to build the corresponding recommendations for the improvement and expansion of the CDS datasets and products.

Keywords: climate data store, Copernicus, quality, user engagement

Procedia PDF Downloads 121
80 Integrated Risk Management in The Supply Chain of Essential Medicines in Zambia

Authors: Mario M. J. Musonda

Abstract:

Access to health care is a human right, which includes having timely access to affordable, quality essential medicines at the right place and in sufficient quantity. However, inefficient public-sector supply chain management contributes to constant shortages of essential medicines at health facilities. The literature review involved a desktop study of published research studies and reports on risk management, supply chain management of essential medicines, and their integration to increase the efficiency of the latter. The research was conducted on a sample population comprising offices under the Ministry of Health Headquarters, Lusaka Provincial and District Offices, selected health facilities in Lusaka, Medical Stores Limited, the Zambia Medicines Regulatory Authority and cooperating partners. Individuals were selected judgmentally according to their functions in the selection, quantification, regulation, procurement, storage, distribution, quality assurance and dispensing of essential medicines. Structured interviews and discussions were held with selected experts, and self-administered questionnaires were distributed; data were collected and analysed from the 35 usable questionnaires returned out of the 50 distributed. The highest-prioritised risks were inadequate and inconsistent fund disbursements, weak information management systems, weak quality management systems and insufficient resources (human resources and infrastructure), among others. The results of this research can be used to increase the efficiency of the public-sector supply chain of essential medicines and other pharmaceuticals. The study showed that participating institutions and organisations need to implement effective risk management systems to increase the efficiency of the entire supply chain in order to avoid and/or reduce shortages of essential medicines at health facilities.

Keywords: essential medicine, risk assessment, risk management, supply chain, supply chain risk management

Procedia PDF Downloads 414
79 An Overview of Technology Availability to Support Remote Decentralized Clinical Trials

Authors: Simone Huber, Bianca Schnalzer, Baptiste Alcalde, Sten Hanke, Lampros Mpaltadoros, Thanos G. Stavropoulos, Spiros Nikolopoulos, Ioannis Kompatsiaris, Lina Pérez-Breva, Vallivana Rodrigo-Casares, Jaime Fons-Martínez, Jeroen de Bruin

Abstract:

Developing new medicines and health solutions and improving patient health currently rely on the successful execution of clinical trials, which generate relevant safety and efficacy data. For their success, recruitment and retention of participants are among the most challenging aspects of protocol adherence. The main barriers include: i) lack of awareness of clinical trials; ii) long distance from the clinical site; iii) the burden on participants, including the duration and number of clinical visits; and iv) high dropout rates. Most of these aspects can be addressed with a new paradigm, namely Remote Decentralized Clinical Trials (RDCTs). Furthermore, the COVID-19 pandemic has highlighted additional advantages and challenges for RDCTs in practice, allowing participants to join trials from home without depending on site visits, among other benefits. Nevertheless, RDCTs should follow the processes and quality assurance of conventional clinical trials, which involve several steps. For each part of the trial, the Building Blocks, existing software and technologies were assessed through a systematic search. The technology needed to perform RDCTs is widely available and validated but is as yet segmented and developed in silos, as different software solutions address different parts of the trial and at various levels. The present paper analyses the availability of technology to perform RDCTs, identifying gaps and providing an overview of the Basic Building Blocks and functionalities that need to be covered to support the described processes.

Keywords: architectures and frameworks for health informatics systems, clinical trials, information and communications technology, remote decentralized clinical trials, technology availability

Procedia PDF Downloads 178
78 Adverse Curing Conditions and Performance of Concrete: Bangladesh Perspective

Authors: T. Manzur

Abstract:

Concrete is the predominant construction material in Bangladesh. In large projects, stringent quality control procedures are usually followed under the supervision of experienced engineers and skilled labourers. However, in small projects, and particularly at locations distant from major cities, proper quality control is often an issue. It has been found from experience that such quality-related issues arise mainly from inappropriate proportioning of concrete mixes and improper curing conditions. In most cases an external curing method is followed, which requires the supply of an adequate quantity of water along with proper protection against evaporation. Often these conditions are missing on general construction sites, which eventually leads to the production of weaker concrete in terms of both strength and durability. In this study, an attempt has been made to investigate the performance of typical concreting work in the country when subjected to several adverse curing conditions that are quite common on various small to medium construction sites. A total of six different adverse curing conditions were simulated in the laboratory, and samples were kept under those conditions for several days. A set of samples was also kept under a normal curing condition with a proper supply of curing water. The performance of the concrete was evaluated in terms of compressive strength, tensile strength, chloride permeability and drying shrinkage. Reductions of about 37% and 25% in 28-day compressive and tensile strength, respectively, were observed for samples subjected to the most adverse curing condition, compared to samples under normal curing. Normally cured concrete exhibited moderate permeability (close to low permeability), whereas concrete under adverse curing conditions showed very high permeability values. Similar results were obtained from the shrinkage tests. This study will thus assist concerned engineers and supervisors in understanding the importance of quality assurance during the curing period of concrete.

Keywords: adverse, concrete, curing, compressive strength, drying shrinkage, permeability, tensile strength

Procedia PDF Downloads 175
77 Security of Database Using Chaotic Systems

Authors: Eman W. Boghdady, A. R. Shehata, M. A. Azem

Abstract:

Database (DB) security demands permitting authorized users' actions and prohibiting non-authorized users' and intruders' actions on the DB and the objects inside it. Successful organizations demand the confidentiality of their DBs: they do not allow unauthorized access to their data/information, and they demand assurance that their data is protected against any malicious or accidental modification. DB protection and confidentiality are the security concerns. There are four types of controls to obtain DB protection: access control, information flow control, inference control, and cryptographic control. Cryptographic control is considered the backbone of DB security; it secures the DB by encryption during storage and communication. Current cryptographic techniques are classified into two types: traditional classical cryptography using standard algorithms (DES, AES, IDEA, etc.) and chaos cryptography using continuous (Chua, Rossler, Lorenz, etc.) or discrete (Logistic, Henon, etc.) maps. The defining characteristic of chaos is its extreme sensitivity to the initial conditions of the system. In this paper, DB-security systems based on chaotic algorithms are described. The pseudo-random number generators (PRNGs) derived from the different chaotic algorithms are implemented using Matlab, and their statistical properties are evaluated using NIST and other statistical test suites. These algorithms are then used to secure a conventional DB (plaintext), where the statistical properties of the ciphertext are also tested. To increase the complexity of the PRNGs and to pass all the NIST statistical tests, we propose two hybrid PRNGs: one based on two chaotic logistic maps and another based on two chaotic Henon maps, where the two chaotic algorithms run side by side, starting from random, independent initial conditions and parameters (the encryption keys). The resulting hybrid PRNGs passed the NIST statistical test suite.
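A toy sketch of the two-map hybrid idea is given below: two logistic maps x_{n+1} = r·x(1-x) run side by side from independent keys, and their byte streams are XOR-combined into a keystream. The parameters and the crude byte-extraction rule are illustrative only; this is not the paper's validated design, and a real system would be checked against the NIST suite as the authors describe.

```python
def logistic_stream(x0, r=3.99):
    """Byte stream from a logistic map in its chaotic regime (r close to 4)."""
    x = x0
    while True:
        x = r * x * (1.0 - x)        # chaotic iteration
        yield int(x * 256) & 0xFF    # crude byte extraction from the state

def hybrid_bytes(key1, key2, length):
    """XOR-combine two independently keyed logistic streams."""
    g1, g2 = logistic_stream(key1), logistic_stream(key2)
    return bytes(next(g1) ^ next(g2) for _ in range(length))

def xor_encrypt(data: bytes, key1: float, key2: float) -> bytes:
    ks = hybrid_bytes(key1, key2, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

msg = b"SELECT * FROM patients;"
ct = xor_encrypt(msg, 0.123456789, 0.987654321)
assert xor_encrypt(ct, 0.123456789, 0.987654321) == msg   # symmetric keystream
print(ct.hex())
```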

Keywords: algorithms and data structure, DB security, encryption, chaotic algorithms, Matlab, NIST

Procedia PDF Downloads 242
76 Calculating Asphaltenes Precipitation Onset Pressure by Using Cardanol as Precipitation Inhibitor: A Strategy to Increment the Oil Well Production

Authors: Camilo A. Guerrero-Martin, Erik Montes Paez, Marcia C. K. Oliveira, Jonathan Campos, Elizabete F. Lucas

Abstract:

Asphaltene precipitation is considered a formation damage problem that can reduce the oil recovery factor. It fouls piping and surface installations, causes serious flow assurance complications and reduces oil well production. Therefore, researchers have shown interest in chemical treatments to control this phenomenon. The aim of this paper is to assess the asphaltene precipitation onset of crude oils in the presence of cardanol by titrating the crude with n-heptane. Moreover, based on the results obtained at atmospheric pressure, the asphaltene precipitation onset pressures were calculated to predict asphaltene precipitation in the reservoir, using differential liberation and refractive index data for the oils. The influence of cardanol concentration on the asphaltene stabilization of three Brazilian crude oil samples (with similar API densities) was studied. Formulations of cardanol in toluene were therefore prepared at 0, 3, 5, 10 and 15 m/m%. The formulations were added to the crude at a 2:98 ratio. The petroleum samples were characterized by API density, elemental analysis and differential liberation tests. The asphaltene precipitation onset (APO) was determined by titrating with n-heptane and monitoring with near-infrared (NIR) spectroscopy. UV-Vis spectroscopy experiments were also performed to assess the precipitated asphaltene content. The asphaltene precipitation envelopes (APE) were also determined by numerical simulation (Multiflash). In addition, suitable artificial lift systems (ALS) for the oils were selected, based on the downhole well profile and a screening methodology. Finally, the oil flow rates were modelled by NODAL production system analysis in the PIPESIM software. The results show that the asphaltene precipitation onsets of the crude oils were 2.2, 2.3 and 6.0 mL of n-heptane per g of oil. Cardanol was an effective inhibitor of asphaltene precipitation for the crude oils used in this study, since it displaces the precipitation pressure of the oil to lower values. This indicates that cardanol can increase oil well productivity.

Keywords: asphaltenes, NODAL analysis production system, precipitation pressure onset, inhibitory molecule

Procedia PDF Downloads 148
75 Creation of Computerized Benchmarks to Facilitate Preparedness for Biological Events

Authors: B. Adini, M. Oren

Abstract:

Introduction: Communicable diseases and pandemics pose a growing threat to the well-being of the global population. A vital component of protecting public health is the creation and sustenance of continuous preparedness for such hazards. A joint Israeli-German task force was deployed to develop an advanced tool for self-evaluation of emergency preparedness for various types of biological threats. Methods: Based on a comprehensive literature review and interviews with leading content experts, an evaluation tool was developed, based on quantitative and qualitative parameters and indicators. A modified Delphi process was used to achieve consensus among over 225 experts from Germany and Israel concerning the items to be included in the evaluation tool. The validity and applicability of the tool for medical institutions were examined in a series of simulation and field exercises. Results: Over 115 German and Israeli experts reviewed and examined the proposed parameters as part of the modified Delphi cycles. A consensus of over 75% of the experts was attained for 183 out of 188 items. The relative importance of each parameter was rated as part of the Delphi process, in order to define its impact on overall emergency preparedness. The parameters were integrated into computerized web-based software that calculates scores of emergency preparedness for biological events. Conclusions: The parameters developed in the joint German-Israeli project serve as benchmarks that delineate the actions to be implemented in order to create and maintain ongoing preparedness for biological events. The computerized evaluation tool enables continuous monitoring of the level of readiness, so that strengths and gaps can be identified and corrected appropriately. Adoption of such a tool is recommended as an integral component of quality assurance of public health and safety.
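As a hedged illustration of how such a tool might turn Delphi-rated parameters into a preparedness score (the project's actual 188-item parameter set and weights are not reproduced here), the sketch below computes a weighted overall score and flags gaps; all names, weights and scores are invented.

```python
# Invented parameters with Delphi-style importance weights and achieved scores.
parameters = [
    # (parameter, relative importance weight, achieved score 0-100)
    ("standard operating procedures", 0.30, 80),
    ("staff training and exercises",  0.25, 65),
    ("surge capacity",                0.20, 50),
    ("laboratory diagnostics",        0.15, 90),
    ("risk communication",            0.10, 70),
]

total_weight = sum(w for _, w, _ in parameters)
score = sum(w * s for _, w, s in parameters) / total_weight
print(f"overall preparedness score: {score:.1f} / 100")

# Gap report: flag parameters scoring below a chosen threshold.
for name, _, s in parameters:
    if s < 60:
        print(f"gap identified: {name} ({s}/100)")
```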

Keywords: biological events, emergency preparedness, bioterrorism, natural biological events

Procedia PDF Downloads 396
74 Experimental Research and Analyses of Yoruba Native Speakers’ Chinese Phonetic Errors

Authors: Obasa Joshua Ifeoluwa

Abstract:

Phonetics is the foundation and most important part of language learning. Through an acoustic experiment conducted with the Praat software, this article visually compares Yoruba students' pronunciation of Chinese consonants, vowels and tones with that of native Chinese speakers. The article is aimed at Yoruba native speakers learning Chinese phonetics; therefore, Yoruba students were selected. The students surveyed were required to be at an elementary level and to have learned Chinese for less than six months. The students selected are all undergraduates majoring in Chinese Studies at the University of Lagos; they have already learned Chinese Pinyin and are familiar with the pinyin used in the questionnaire provided. The Chinese speakers selected are those who have passed the level-two Mandarin proficiency examination, which serves as an assurance that their pronunciation is standard. This work finds that, in pronouncing Mandarin consonants, Yoruba students cannot distinguish the voiced from the voiceless, or the aspirated from the unaspirated, phonetic features. For instance, while pronouncing [pʰ], the spectrogram clearly shows that the voice onset time (VOT) of a Chinese speaker is longer than that of a Yoruba native speaker, which means that the Yoruba speaker is pronouncing the unaspirated counterpart [p]. Another difficulty is pronouncing affricates and fricatives such as [tʂ], [tʂʰ], [ʂ], [ʐ], [tɕ], [tɕʰ] and [ɕ], because these sounds are not in the phonetic system of the Yoruba language. In terms of vowels, some students find it difficult to pronounce the allophonic high vowels [ɿ] and [ʅ], and therefore pronounce them as their phoneme [i]; another pronunciation error is pronouncing [y] as [u] or, as shown in one spectrogram, as [iu]. In terms of tone, it is most difficult for students to differentiate between the second (rising) and third (falling-rising) tones, because both place emphasis on a rising pitch. This work concludes that the major errors made by Yoruba students while pronouncing Chinese sounds are caused by interference from their first language (L1) and sometimes by their lingua franca.

Keywords: Chinese, Yoruba, error analysis, experimental phonetics, consonant, vowel, tone

Procedia PDF Downloads 76
73 The Advantages of Using DNA-Barcoding for Determining the Fraud in Seafood

Authors: Elif Tugce Aksun Tumerkan

Abstract:

Although seafood is an important part of the human diet and a highly traded category of the international food industry, it generally remains overlooked in the global food security context. Food product authentication is of central interest, with the aim both of avoiding commercial fraud and of limiting risks that might harm human health. In recent years, with increasing consumer demand for transparency regarding food content, several instrumental analyses for detecting food fraud have emerged, based on analytical methodologies such as proteomics and metabolomics. Whereas fish and seafood were previously consumed fresh, advancing technology has increased the consumption of processed and packaged seafood. Once seafood is processed or packaged, morphological identification becomes impossible because some of the external features have been removed. The main quality-related issues for fish and seafood are content authentication problems such as mislabelled products, which may be contaminated or replaced, partly or completely, by lower-quality or cheaper ones. For all these reasons, truthful, consistent and easily applicable analytical methods are needed to assure the correct labelling and verification of seafood products. DNA-barcoding methods, which have become popular and robust tools in taxonomic research on endangered or cryptic species in recent years, are also used for food traceability. Compared with proteomic and metabolomic analyses, DNA-based methods offer the chance to identify all types of food, whether raw, spiced or processed. This advantage arises because DNA is a comparatively more stable molecule than proteins and other biomolecules; furthermore, DNA shows sequence variation between species and is found in all organisms, making DNA-based analysis preferable. This review was performed to clarify the main advantages of using DNA barcoding for detecting seafood fraud compared with other techniques.

Keywords: DNA-barcoding, genetic analysis, food fraud, mislabelling, packaged seafood

Procedia PDF Downloads 140