Search results for: error detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5095

1285 Detection of the Abundance of Chicken Skin in Hamburgers in Tehran

Authors: Ghazanfari Masoumeh, Hajimohammadi Bahador, Eskandari Soheyl, Karimian Khosroshahi Nader

Abstract:

Consumption of ready-to-cook meat products such as hamburgers and sausages is increasing worldwide, especially in big cities, so the safety and quality of these food products are vital for consumers. Considering the price of meat and the increasing demand for meat products, the possibility of substituting cheap, unauthorized tissues (undesirable animal by-products such as slaughter waste, lung tissue, spleen, organs of the abdominal cavity, chicken gizzard, skin, etc.) has increased in recent years. In this study, 30 industrial and 30 handmade hamburgers from fast-food restaurants were examined, against Iranian National Standard for Hamburger No. 2304, for the use of unauthorized tissues. The purpose of this study was to determine, based on histological methods, the use of chicken skin in hamburgers produced from chicken meat in Tehran. The rates of skin use were 2% in industrial and 9% in handmade samples. Statistically, the use of unauthorized tissues was significantly higher in handmade samples than in industrial samples (P < 0.05). The results showed that handmade hamburgers, with their higher adulteration rate and non-compliance with the national hamburger standard, could be a potential health hazard.

Keywords: histology, adulteration, unauthorized tissues, undesirable animal tissues

Procedia PDF Downloads 442
1284 Application of the Electrical Resistivity Tomography and Tunnel Seismic Prediction 303 Methods for Detection of Fracture Zones Ahead of a Tunnel: A Case Study

Authors: Nima Dastanboo, Xiao-Qing Li, Hamed Gharibdoost

Abstract:

The purpose of this study is to investigate the geological properties ahead of a tunnel face using the Electrical Resistivity Tomography (ERT) and Tunnel Seismic Prediction (TSP303) methods. In deep tunnels with complex hydro-geological conditions, it is important to study the geological structures of the region before excavation; otherwise, unexpected accidents may impose serious damage on the project. During construction of the Nosoud tunnel in western Iran, the ERT and TSP303 methods were employed to predict the geological conditions dynamically during excavation. In this paper, based on the engineering background of the Nosoud tunnel, the main results of applying these methods are discussed. This work demonstrates that seismic prediction and electrical resistivity tomography are two geophysical techniques able to detect fracture zones ahead of a tunnel face. The results of the two methods were in agreement with each other, but the TSP303 results were more accurate and of higher quality. In this case, TSP303 was a useful tool for predicting unstable geological structures ahead of the tunnel face during excavation. Thus, using another geophysical method together with TSP303 could be helpful as decision support in excavation, especially in complicated geological conditions.

Keywords: tunnel seismic prediction (TSP303), electrical resistivity tomography (ERT), seismic wave, velocity analysis, low-velocity zones

Procedia PDF Downloads 124
1283 A Real-Time Development Study for an Automated Centralized Remote Monitoring System at Royal Belum Forest

Authors: Amri Yusoff, Shahrizuan Shafiril, Ashardi Abas, Norma Che Yusoff

Abstract:

Illegal logging has been causing serious damage to our forests, contributing to flash floods, avalanches, global warming, and other problems. This understandably makes us wonder why, what, and who has made it happen, and often it is already too late by the time the cause is known. Even the Malaysian Royal Belum forest has not been spared from land clearing or illegal activity by the natives, although this area has been gazetted as a protected area preserved for future generations. Furthermore, because of its sizeable and wide area, these illegal activities are difficult to monitor and control. Action must be taken to prevent all of these unhealthy activities from recurring. Therefore, a remote monitoring device must be developed to capture critical real-time data such as temperature, humidity, gases, fire, and rain detection, which indicate the current state of the natural habitat in the forest. The device location can also be detected via GPS, with the latitude and longitude of its current location transmitted by SMS over the GSM network. All of its readings are sent in real time for data management and analysis. The results will benefit the monitoring bodies and relevant authorities in keeping the forest in its natural state. Furthermore, this research gathers a unified data set, which will be analysed and compared with an existing method.
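As an illustration of how such a node could package its readings and push them out by SMS over GSM, the following minimal Python sketch (not the authors' firmware) formats a report and sends it with standard AT commands through a serial GSM modem; the serial port, baud rate, and phone number are placeholders.

    # Minimal sketch: packaging sensor readings with a GPS fix and sending them
    # as an SMS over a GSM modem using standard AT commands.
    # The serial port name, baud rate, and phone number are placeholder assumptions.
    import time
    import serial  # pyserial

    def build_report(temp_c, humidity, gas_ppm, fire, rain, lat, lon):
        # Compact, human-readable payload; a real deployment may prefer a fixed format.
        return (f"T={temp_c:.1f}C H={humidity:.0f}% GAS={gas_ppm:.0f}ppm "
                f"FIRE={int(fire)} RAIN={int(rain)} LAT={lat:.5f} LON={lon:.5f}")

    def send_sms(port, number, text):
        with serial.Serial(port, 9600, timeout=5) as modem:
            modem.write(b"AT+CMGF=1\r")           # switch modem to text mode
            time.sleep(0.5)
            modem.write(f'AT+CMGS="{number}"\r'.encode())
            time.sleep(0.5)
            modem.write(text.encode() + b"\x1a")  # Ctrl+Z terminates the message
            time.sleep(3)

    if __name__ == "__main__":
        report = build_report(29.4, 87, 412, fire=False, rain=True,
                              lat=5.54321, lon=101.45678)
        send_sms("/dev/ttyUSB0", "+60123456789", report)  # placeholder port/number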

Keywords: remote monitoring system, forest data, GSM, GPS, wireless sensor

Procedia PDF Downloads 400
1282 Accurate Position Electromagnetic Sensor Using Data Acquisition System

Authors: Z. Ezzouine, A. Nakheli

Abstract:

This paper presents a high position electromagnetic sensor system (HPESS) that is applicable to moving-object detection. The authors have developed a high-performance position sensor prototype dedicated to a student laboratory. The challenge was to obtain a highly accurate, real-time sensor able to calculate position, length, or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion to electric energy with direct contact, and the output signal can then be fed to an electronic circuit. The voltage change at the sensor output is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ device (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specifically for sensor signal monitoring. The data are then recorded and viewed in a user interface written in National Instruments LabVIEW. On-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, and inexpensive transducer for highly sophisticated control systems.
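For readers who do not use LabVIEW, the following hedged Python sketch shows how the same NI USB-6281 voltage channel could be sampled with the nidaqmx library and mapped to displacement through a hypothetical linear calibration; the device name, channel, and calibration slope are assumptions, not values from the paper.

    # Hedged sketch: reading the sensor voltage from an NI USB-6281 with the nidaqmx
    # Python library, then mapping voltage to displacement with a hypothetical
    # linear calibration. Device/channel names and the slope are assumptions.
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    SENSITIVITY_MM_PER_V = 12.5  # hypothetical calibration factor (mm per volt)

    def read_displacement(samples=1000, rate=10_000):
        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0", min_val=-10.0, max_val=10.0)
            task.timing.cfg_samp_clk_timing(rate, sample_mode=AcquisitionType.FINITE,
                                            samps_per_chan=samples)
            voltages = task.read(number_of_samples_per_channel=samples)
        mean_v = sum(voltages) / len(voltages)
        return SENSITIVITY_MM_PER_V * mean_v  # displacement in mm

    if __name__ == "__main__":
        print(f"Estimated displacement: {read_displacement():.3f} mm")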

Keywords: electromagnetic sensor, accuracy, data acquisition, position measurement

Procedia PDF Downloads 272
1281 Hedonic Price Analysis of Consumer Preference for Musa spp in Northern Nigeria

Authors: Yakubu Suleiman, S. A. Musa

Abstract:

The research was conducted to determine the physical characteristics of banana fruits that influence consumer preference for the fruit in Northern Nigeria. Socio-economic characteristics of the respondents were also identified. Simple descriptive statistics and a hedonic price model were used to analyze the socio-economic and consumer preference data, respectively, collected with 1,000 structured questionnaires. The result revealed an R2 value of 0.633, meaning that 63.3% of the variation in banana price was explained by the explanatory variables included in the model: colour, size, degree of ripeness, softness, surface blemish, cleanliness of the fruits, weight, length, and cluster size. The remaining 36.7% is attributed to the error term, or random disturbance, of the model. The calculated intercept was 1886.5 and was statistically significant (P < 0.01), meaning that about N1,886.5 worth of banana fruits could be bought by consumers without considering the variables included in the model. Moreover, consumers showed significant preferences for colour, size, degree of ripeness, softness, weight, length, and cluster size of banana fruits, significant at P < 0.01, P < 0.05, or P < 0.1. However, consumers did not show significant preferences for surface blemish, cleanliness, or variety of the banana fruit; all of these were non-significant with negative signs. Based on the findings of the research, it is recommended that plant breeders and research institutes concentrate on producing banana fruits with the physical characteristics found to be statistically significant: cluster size, degree of ripeness, softness, length, size, and skin colour.
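A minimal sketch of the kind of hedonic regression described here, using statsmodels OLS on synthetic data (the values and coefficients are placeholders, not the survey data):

    # Minimal hedonic price regression sketch (not the authors' data or code):
    # price is regressed on fruit attributes; the coefficients give the implicit
    # price of each attribute and R-squared the share of price variation explained.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200  # toy sample; the study used 1,000 questionnaires

    # Hypothetical attribute scores (e.g. colour, size, ripeness, softness, cluster size)
    X = rng.normal(size=(n, 5))
    price = 1886.5 + X @ np.array([120.0, 90.0, 60.0, 40.0, 30.0]) \
            + rng.normal(scale=200.0, size=n)

    model = sm.OLS(price, sm.add_constant(X)).fit()
    print(model.params)     # intercept + implicit attribute prices
    print(model.rsquared)   # share of price variation explained
    print(model.pvalues)    # significance of each attribute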

Keywords: analysis, consumers, preference, variables

Procedia PDF Downloads 320
1280 Simulation-Based Validation of Safe Human-Robot-Collaboration

Authors: Titanilla Komenda

Abstract:

Human-machine collaboration defines a direct interaction between humans and machines to fulfil specific tasks. These so-called collaborative machines are used without fencing and interact with humans in predefined workspaces. Even though human-machine collaboration enables a flexible adaptation to variable degrees of freedom, industrial applications are rarely found. The reason for this is not a lack of technical progress but rather limitations in planning processes that must ensure safety for operators. Until now, humans and machines were mainly considered separately in the planning process, focusing on ergonomics and system performance, respectively. Within human-machine collaboration, those aspects must not be seen in isolation from each other but rather need to be analysed in interaction. Furthermore, a simulation model is needed that can validate the system performance and ensure the safety of the operator at any given time. Following on from this, a holistic simulation model is presented, enabling a simulative representation of collaborative tasks, including both humans and machines. The presented model includes not only a geometry and a motion model of interacting humans and machines but also a numerical behaviour model of humans as well as a Boolean probabilistic sensor model. With this, error scenarios can be simulated by validating system behaviour in unplanned situations. As these models can be defined on the basis of Failure Mode and Effects Analysis as well as probabilities of errors, the implementation in a collaborative model is discussed and evaluated regarding limitations and simulation times. The functionality of the model is shown on industrial applications by comparing simulation results with video data. The analysis shows the impact of considering human factors in the planning process, in contrast to meeting system performance only. In this sense, an optimisation function is presented that meets the trade-off between human and machine factors and aids in a successful and safe realisation of collaborative scenarios.
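As a rough illustration of a Boolean probabilistic sensor model driven by FMEA-style failure probabilities, the following Monte Carlo sketch estimates how often an intrusion into the shared workspace would go undetected; all probabilities are invented for illustration and are not taken from the paper.

    # Hedged sketch of a Boolean probabilistic sensor model: in each simulated cycle
    # the safety sensor either detects the human (True) or fails (False) with an
    # FMEA-derived failure probability; the share of undetected intrusions is
    # estimated by Monte Carlo. All probabilities below are invented.
    import random

    P_SENSOR_FAILURE = 1e-4      # probability the sensor misses the human in a cycle
    P_HUMAN_IN_WORKSPACE = 0.05  # probability the human is inside the shared workspace

    def simulate(cycles=1_000_000, seed=42):
        rng = random.Random(seed)
        undetected = 0
        for _ in range(cycles):
            human_present = rng.random() < P_HUMAN_IN_WORKSPACE
            sensor_ok = rng.random() >= P_SENSOR_FAILURE
            if human_present and not sensor_ok:
                undetected += 1  # unplanned situation the robot would not react to
        return undetected / cycles

    if __name__ == "__main__":
        print(f"Estimated rate of undetected intrusions: {simulate():.2e} per cycle")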

Keywords: human-machine-system, human-robot-collaboration, safety, simulation

Procedia PDF Downloads 346
1279 Development of Lipid Architectonics for Improving Efficacy and Ameliorating the Oral Bioavailability of Elvitegravir

Authors: Bushra Nabi, Saleha Rehman, Sanjula Baboota, Javed Ali

Abstract:

Aim: The objective of the research undertaken is analytical (HPLC) method validation for the anti-HIV drug Elvitegravir (EVG), together with forced degradation studies of the drug under different stress conditions to determine its stability. This is envisaged in order to determine a suitable technique for drug estimation to be employed in further research. Furthermore, a comparative pharmacokinetic profile of the drug from lipid architectonics and from a drug suspension would be obtained after oral administration. Method: Lipid architectonics (LA) of EVG were formulated using a probe sonication technique and optimized using QbD (Box-Behnken design). For estimation of the drug during further analysis, the HPLC method was validated for linearity, precision, accuracy, and robustness, and the limit of detection (LOD) and limit of quantification (LOQ) were determined. Furthermore, HPLC quantification of the forced degradation studies was carried out under different stress conditions (acid-induced, base-induced, oxidative, photolytic, and thermal). For the pharmacokinetic (PK) study, albino Wistar rats weighing 200-250 g were used. Different formulations were given by the oral route, and blood was collected at designated time intervals. A plasma concentration-time profile was plotted, from which the relevant pharmacokinetic parameters were determined.
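For reference, the LOD and LOQ estimates referred to here are commonly computed with the ICH Q2(R1) formulas from the response standard deviation and the calibration slope; the numbers in this small sketch are placeholders, not values from this study.

    # Sketch of the ICH Q2(R1) estimates commonly used for LOD and LOQ:
    #   LOD = 3.3 * sigma / S,  LOQ = 10 * sigma / S
    # where sigma is the standard deviation of the response (e.g. residual SD of the
    # calibration line) and S is the slope of the calibration curve.
    # The numbers below are placeholders, not values from this study.
    sigma = 0.015   # response standard deviation (placeholder)
    slope = 0.85    # calibration slope, response per (ug/mL) (placeholder)

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"LOD ~ {lod:.3f} ug/mL, LOQ ~ {loq:.3f} ug/mL")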

Keywords: AIDS, Elvitegravir, HPLC, nanostructured lipid carriers, pharmacokinetics

Procedia PDF Downloads 125
1278 Identification of Potential Predictive Biomarkers for Early Diagnosis of Preeclampsia: Growth Factors to MicroRNAs

Authors: Sadia Munir

Abstract:

Preeclampsia contributes to worldwide maternal mortality, with approximately 100,000 deaths a year. It complicates about 10% of all pregnancies and is the leading cause of maternal admission to intensive care units. Predicting preeclampsia is a major challenge in obstetrics. More importantly, no major progress has been achieved in the treatment of preeclampsia. As the placenta is the main cause of the disease, the only way to treat it is to remove the placenta and deliver the baby. In developed countries, the cost of an average case of preeclampsia is estimated at £9,000. Notably, preeclampsia may have an impact on the health of the mother or infant beyond the pregnancy. We performed a systematic search of PubMed using combinations of terms such as preeclampsia, biomarkers, treatment, hypoxia, inflammation, oxidative stress, vascular endothelial growth factor A, activin A, inhibin A, placental growth factor, transforming growth factor β-1, Nodal, placenta, trophoblast cells, and microRNAs. In this review, we have summarized current knowledge on the identification of potential biomarkers for the diagnosis of preeclampsia. Although these studies show promising data for the early diagnosis of preeclampsia, the current value of these factors as biomarkers for the precise prediction of preeclampsia has its limitations. Therefore, future studies are needed to build on some of the very promising and interesting data and to develop affordable and widely available tests for the early detection and treatment of preeclampsia.

Keywords: activin, biomarkers, growth factors, microRNA

Procedia PDF Downloads 425
1277 Self-Calibration of Fish-Eye Camera for Advanced Driver Assistance Systems

Authors: Atef Alaaeddine Sarraj, Brendan Jackman, Frank Walsh

Abstract:

Tomorrow’s car will be more automated and increasingly connected. Innovative and intuitive interfaces are essential to accompany this functional enrichment. For that, today the automotive companies are competing to offer an advanced driver assistance system (ADAS) which will be able to provide enhanced navigation, collision avoidance, intersection support and lane keeping. These vision-based functions require an accurately calibrated camera. To achieve such differentiation in ADAS requires sophisticated sensors and efficient algorithms. This paper explores the different calibration methods applicable to vehicle-mounted fish-eye cameras with arbitrary fields of view and defines the first steps towards a self-calibration method that adequately addresses ADAS requirements. In particular, we present a self-calibration method after comparing different camera calibration algorithms in the context of ADAS requirements. Our method gathers data from unknown scenes while the car is moving, estimates the camera intrinsic and extrinsic parameters and corrects the wide-angle distortion. Our solution enables continuous and real-time detection of objects, pedestrians, road markings and other cars. In contrast, other camera calibration algorithms for ADAS need pre-calibration, while the presented method calibrates the camera without prior knowledge of the scene and in real-time.
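As a hedged illustration (not the authors' pipeline), once some calibration procedure has produced fisheye intrinsics K and distortion coefficients D, OpenCV's fisheye model can undistort each frame in real time; the matrices below are placeholders.

    # Hedged sketch: once a calibration (classical or self-calibration) has produced
    # fisheye intrinsics K and distortion D, OpenCV's fisheye model can undistort
    # each frame. K and D below are placeholders, not calibrated values.
    import cv2
    import numpy as np

    K = np.array([[420.0, 0.0, 640.0],
                  [0.0, 420.0, 360.0],
                  [0.0, 0.0, 1.0]])                   # placeholder intrinsics
    D = np.array([[-0.02], [0.01], [0.0], [0.0]])     # placeholder distortion (k1..k4)

    def undistort(frame):
        h, w = frame.shape[:2]
        new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
            K, D, (w, h), np.eye(3), balance=0.0)
        map1, map2 = cv2.fisheye.initUndistortRectifyMap(
            K, D, np.eye(3), new_K, (w, h), cv2.CV_16SC2)
        return cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)

    if __name__ == "__main__":
        img = cv2.imread("fisheye_frame.png")  # placeholder input frame
        if img is not None:
            cv2.imwrite("undistorted.png", undistort(img))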

Keywords: advanced driver assistance system (ADAS), fish-eye, real-time, self-calibration

Procedia PDF Downloads 232
1276 A Machine Learning Framework Based on Biometric Measurements for Automatic Fetal Head Anomalies Diagnosis in Ultrasound Images

Authors: Hanene Sahli, Aymen Mouelhi, Marwa Hajji, Amine Ben Slama, Mounir Sayadi, Farhat Fnaiech, Radhwane Rachdi

Abstract:

Fetal abnormality is still a public health problem affecting both mother and baby. Head defects are among the highest-risk fetal deformities. Fetal head categorization is a sensitive task that needs considerable attention from neurological experts. In this sense, biometric measurements can be extracted by gynecologists and compared with ground-truth charts to identify normal or abnormal growth. The fetal head biometric measurements, such as biparietal diameter (BPD), occipito-frontal diameter (OFD), and head circumference (HC), need to be monitored, and an expert should carry out their manual delineation. This work proposes a new approach to automatically compute BPD, OFD, and HC based on morphological characteristics extracted from the head shape. The studied data, selected at the same gestational age (GA) from fetal ultrasound (US) images, are classified into two categories: normal and abnormal. The abnormal subjects include hydrocephalus, microcephaly, and dolichocephaly anomalies. Using a support vector machine (SVM) method, this study achieved high classification accuracy for the automated detection of anomalies. The proposed method is promising, even though it does not need expert intervention.
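A hedged sketch of the classification step, an SVM trained on the three biometric features (BPD, OFD, HC) to separate normal from abnormal growth; the data here are synthetic placeholders rather than the study's ultrasound measurements.

    # Hedged sketch of the classification step: an SVM trained on biometric features
    # (BPD, OFD, HC in mm) to separate normal from abnormal head growth.
    # The data are synthetic placeholders, not the study's measurements.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    normal = rng.normal([55.0, 70.0, 200.0], [3.0, 4.0, 10.0], size=(100, 3))
    abnormal = rng.normal([48.0, 80.0, 170.0], [5.0, 6.0, 15.0], size=(100, 3))
    X = np.vstack([normal, abnormal])
    y = np.array([0] * 100 + [1] * 100)  # 0 = normal, 1 = abnormal

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_tr, y_tr)
    print(f"Held-out accuracy: {clf.score(X_te, y_te):.2f}")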

Keywords: biometric measurements, fetal head malformations, machine learning methods, US images

Procedia PDF Downloads 275
1275 Implications of the Optimisation Algorithm for the Forecast Performance of an Artificial Neural Network in Streamflow Modelling

Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed

Abstract:

The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under two ANN structural configurations: (1) single-hidden-layer and (2) double-hidden-layer feedforward back-propagation networks. Results revealed that the GDM optimisation algorithm, with its adaptive learning capability, generally used a relatively shorter time in both training and validation phases than the LM and Br algorithms, although learning may not be consummated; this held in all instances, including the prediction of extreme flow conditions for 1-day- and 5-day-ahead horizons. In specific statistical terms, the average model performance efficiencies using the coefficient of efficiency (CE) statistic were Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, for the training and validation phases respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, the adoption of ANNs for real-time forecasting should employ training algorithms that do not carry the computational overhead of LM, which requires computation of the Hessian matrix, protracted time, and sensitivity to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure and forecast quality as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should consider the implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
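For reference, the coefficient of efficiency (Nash-Sutcliffe CE) and the relative error statistics used for the comparison can be computed as in the sketch below; the observed and simulated values are placeholders.

    # Sketch of the comparison metrics: the coefficient of efficiency (Nash-Sutcliffe
    # CE) and simple error statistics (MAE, MAPE) between observed and simulated
    # streamflow. The arrays below are placeholders, not the study's data.
    import numpy as np

    def coefficient_of_efficiency(obs, sim):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def mae(obs, sim):
        return np.mean(np.abs(np.asarray(obs, float) - np.asarray(sim, float)))

    def mape(obs, sim):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 100.0 * np.mean(np.abs((obs - sim) / obs))

    observed = [120.0, 98.0, 143.0, 110.0, 87.0]
    simulated = [115.0, 102.0, 139.0, 118.0, 90.0]
    print(f"CE={coefficient_of_efficiency(observed, simulated):.3f}, "
          f"MAE={mae(observed, simulated):.1f}, MAPE={mape(observed, simulated):.1f}%")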

Keywords: streamflow, neural network, optimisation, algorithm

Procedia PDF Downloads 138
1274 Saudi Human Awareness Needs: A Survey of How Human Errors and Mistakes Lead to the Leakage of Confidential Data, with Proposed Solutions in Saudi Arabia

Authors: Amal Hussain Alkhaiwani, Ghadah Abdullah Almalki

Abstract:

Recently, human error has increasingly become a major factor in security breaches that affect confidential data, and most cyber data breaches are caused by human error. With one individual mistake, an attacker can gain access to the entire network and bypass the implemented access controls without any immediate detection. Unaware employees are vulnerable to social engineering cyber-attacks. Providing security awareness to people is part of the organizational protection process; cyber risks cannot be reduced by implementing technology alone, and human awareness of security will significantly reduce the risks and encourage changes in staff cyber-awareness. In this paper, we focus on human awareness and the need for people to maintain the required level of security education; we review human errors and introduce a proposed solution to prevent breaches from occurring again. Recently, Saudi Arabia has faced many attacks using different methods of social engineering. As Saudi Arabia has become a target for many countries and individuals, a defense mechanism is needed that begins with awareness, in order to keep our privacy and protect confidential data against possible intended attacks.

Keywords: cybersecurity, human aspects, human errors, human mistakes, security awareness, Saudi Arabia, security program, security education, social engineering

Procedia PDF Downloads 135
1273 Metal-Organic Frameworks-Based Materials for Volatile Organic Compounds Sensing Applications: Strategies to Improve Sensing Performances

Authors: Claudio Clemente, Valentina Gargiulo, Alessio Occhicone, Giovanni Piero Pepe, Giovanni Ausanio, Michela Alfè

Abstract:

Volatile organic compound (VOC) emissions represent a serious risk to human health and the integrity of ecosystems, especially at high concentrations. For this reason, it is very important to continuously monitor environmental quality and to develop fast and reliable portable sensors that allow on-site analysis. Chemiresistors have become promising candidates for VOC sensing because of their ease of fabrication, the variety of suitable sensitive materials, and the simplicity of their sensing data. A chemoresistive gas sensor is a transducer that allows the concentration of an analyte in the gas phase to be measured, because the change in resistance is proportional to the amount of analyte present. The selection of the sensitive material, which interacts with the target analyte, is very important for the sensor performance. The most widely used VOC detection materials are metal oxides (MOx), owing to their rapid recovery, high sensitivity to various gas molecules, and easy fabrication. Their sensing performance can still be improved in terms of operating temperature, selectivity, and detection limit. Metal-organic frameworks (MOFs) have attracted much attention in the field of gas sensing due to their high porosity, high surface area, tunable morphologies, and structural variety. MOFs are generated by the self-assembly of multidentate organic ligands connecting with adjacent multivalent metal nodes via strong coordination interactions, producing stable and highly ordered crystalline porous materials with well-designed structures. However, most MOFs intrinsically exhibit low electrical conductivity. To improve this property, MOFs can be combined with organic and inorganic materials in a hybrid fashion to produce composite materials, or they can be transformed into more stable structures. MOFs can indeed be employed as precursors of metal oxides with well-designed architectures via calcination. The MOF-derived MOx partially preserve the original structure, with high surface area and intrinsic open pores that act as trapping centers for gas molecules, and show higher electrical conductivity. Core-shell heterostructures, in which the surface of a metal oxide core is completely coated by a MOF shell, forming a junction at the core-shell heterointerface, can also be synthesized. Nanocomposites in which MOF structures are intercalated with graphene-related materials can also be produced; their conductivity increases thanks to the high electron mobility of the carbon materials. As MOF structures, zinc-based MOFs belonging to the ZIF family were selected in this work. Several Zn-based materials based on and/or derived from MOFs were produced, structurally characterized, and arranged in a chemoresistive architecture, also exploring the potential of different approaches for sensing-layer deposition based on PLD (pulsed laser deposition) and, in the case of thermally labile materials, MAPLE (Matrix Assisted Pulsed Laser Evaporation) to enhance adhesion to the support. The sensors were tested in a controlled-humidity chamber, allowing the concentration of ethanol, a typical analyte chosen among the VOCs for a first survey, to be varied. The effect of heating the chemiresistor to improve sensing performance was also explored. Future research will focus on exploring new manufacturing processes for MOF-based gas sensors with the aim of improving sensitivity and selectivity and reducing operating temperatures.
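For context, the response of a chemiresistor is usually quantified as the relative resistance change against the clean-air baseline; a minimal sketch with placeholder readings:

    # Minimal sketch of the usual chemiresistive response metric: the relative change
    # of resistance with respect to the baseline in clean air,
    #   S = (R_gas - R_air) / R_air.
    # Resistance values below are placeholders, not measurements from this work.
    def response(r_air_ohm, r_gas_ohm):
        return (r_gas_ohm - r_air_ohm) / r_air_ohm

    baseline = 1.20e6        # resistance in clean air (placeholder, ohms)
    under_ethanol = 0.85e6   # resistance under an ethanol pulse (placeholder, ohms)
    print(f"Sensor response: {response(baseline, under_ethanol):+.1%}")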

Keywords: chemiresistors, gas sensors, graphene related materials, laser deposition, MAPLE, metal-organic frameworks, metal oxides, nanocomposites, sensing performance, transduction mechanism, volatile organic compounds

Procedia PDF Downloads 38
1272 Correlation between Cephalometric Measurements and Visual Perception of Facial Profile in Skeletal Type II Patients

Authors: Choki, Supatchai Boonpratham, Suwannee Luppanapornlarp

Abstract:

The objective of this study was to find the correlation between cephalometric measurements and visual perception of the facial profile in skeletal type II patients. In this study, 250 lateral cephalograms of female patients aged 20 to 22 years were analyzed. The profile outlines of all samples were hand-traced and transformed into silhouettes by the principal investigator. Profile ratings were carried out by 9 orthodontists on a visual analogue scale from one to ten (increasing level of convexity). Thirty-seven hard tissue and soft tissue cephalometric measurements were analyzed by the principal investigator. All measurements were repeated after a 2-week interval for error assessment. Finally, the rankings of visual perception were correlated with the cephalometric measurements using the Spearman correlation coefficient (P < 0.05). The results show that increased facial convexity was correlated with higher values of ANB (A point, nasion, and B point), AF-BF (distance from A point to B point in mm), L1-NB (distance from lower incisor to NB line in mm), anterior maxillary alveolar height, posterior maxillary alveolar height, overjet, hard tissue H angle, soft tissue H angle, and lower lip to E plane (absolute correlation values from 0.277 to 0.711). In contrast, increased facial convexity was correlated with lower values of Pg to N perpendicular and Pg to NB in mm (absolute correlation values -0.302 and -0.294, respectively). Among the soft tissue measurements, the H angles had a higher correlation with visual perception than the facial contour angle, nasolabial angle, and lower lip to E plane. In conclusion, the findings of this study indicate that the correlation of cephalometric measurements with visual perception was less than expected; only 29% of cephalometric measurements had a significant correlation with visual perception. Therefore, diagnosis based solely on cephalometric analysis can result in failure to meet the patient’s esthetic expectations.
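A minimal sketch of the statistical step, the Spearman rank correlation between one cephalometric measurement and the visual analogue ratings, using scipy; the paired values are placeholders.

    # Hedged sketch of the statistics step: Spearman rank correlation between one
    # cephalometric measurement (e.g. ANB angle) and the mean VAS profile rating.
    # The paired values below are placeholders, not the study's measurements.
    from scipy.stats import spearmanr

    anb_deg = [2.1, 4.5, 6.0, 3.2, 7.8, 5.1, 8.4, 2.9]      # placeholder ANB angles
    vas_score = [2.0, 4.0, 6.5, 3.0, 8.0, 5.0, 9.0, 2.5]    # placeholder mean ratings

    rho, p_value = spearmanr(anb_deg, vas_score)
    print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")   # significant if p < 0.05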

Keywords: cephalometric measurements, facial profile, skeletal type II, visual perception

Procedia PDF Downloads 121
1271 Using Squeezed Vacuum States to Enhance the Sensitivity of Ground Based Gravitational Wave Interferometers beyond the Standard Quantum Limit

Authors: Giacomo Ciani

Abstract:

This paper reviews the impact of quantum noise on modern gravitational wave interferometers and explains how squeezed vacuum states are used to push the noise below the standard quantum limit. With the first detection of gravitational waves from a pair of colliding black holes in September 2015 and subsequent detections including that of gravitational waves from a pair of colliding neutron stars, the ground-based interferometric gravitational wave observatories LIGO and VIRGO have opened the era of gravitational-wave and multi-messenger astronomy. Improving the sensitivity of the detectors is of paramount importance to increase the number and quality of the detections, fully exploiting this new information channel about the universe. Although still in the commissioning phase and not at nominal sensitivity, these interferometers are designed to be ultimately limited by a combination of shot noise and quantum radiation pressure noise, which define an envelope known as the standard quantum limit. Despite the name, this limit can be beaten with the use of advanced quantum measurement techniques, with the use of squeezed vacuum states being currently the most mature and promising. Different strategies for implementation of the technology in the large-scale detectors, in both their frequency-independent and frequency-dependent variations, are presented, together with an analysis of the main technological issues and expected sensitivity gain.

Keywords: gravitational waves, interferometers, squeezed vacuum, standard quantum limit

Procedia PDF Downloads 137
1270 Descriptive Study of Adverse Drug Reactions in a Paediatric Hospital in Mongolia from 2015 to 2019

Authors: Khaliun Nyambayar, Nomindari Azzaya, Batkhuyag Purevjav

Abstract:

Pharmacovigilance was officially introduced in Mongolia in 2003, in accordance with Health Minister Order 183 for the registry of adverse drug reactions (ADR), which was approved in 2006 and reviewed in 2010. This study was designed to evaluate the incidence and common types of adverse drug reactions among hospitalized children, the frequency of adverse drug reactions reported by health care providers, and the follow-up processes resulting from adverse drug reactions. A retrospective study was conducted of paediatric patients who experienced an adverse drug reaction from 2015 to 2019, extracted from the “yellow” card at the State Research Center for Maternal and Child Health, (city). A total of 417 adverse drug reactions were reported, with an overall incidence of 80 (21.5%). Adverse reactions resulting from the use of antibiotics (particularly gentamycin, cephalosporins, and vancomycin) were usually mild. ADRs were reported by physicians and nurses (93.8%) and pharmacists (6.25%). Although documentation of physician notification occurred for 93% of adverse drug reactions, only 29% of cases were documented in the patient's medical chart, 13% included follow-up education for the individuals involved, and 10% were updated in the allergy profile of the hospital computer system. Measures to improve the detection and reporting of adverse drug reactions by all health care professionals should be taken, to enhance our understanding of the nature and impact of these reactions in children.

Keywords: adverse drug reaction, pediatric, yellow card, Mongolia

Procedia PDF Downloads 95
1269 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data

Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar

Abstract:

It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science, so the development of accurate, robust, and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. There are still two dominant major forecasting methods, Box-Jenkins ARIMA and exponential smoothing (ES), and new methods are still derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing remains one of the most practically relevant forecasting methods available due to its simplicity, robustness, and accuracy as an automatic forecasting procedure, especially in the famous M-competitions. Despite their success and widespread use in many areas, ES models have some shortcomings that negatively affect the accuracy of forecasts. Therefore, a new forecasting method, called the ATA method, is proposed in this study to cope with these shortcomings. The new method is obtained from traditional ES models by modifying the smoothing parameters; therefore both methods have similar structural forms, and ATA can easily be adapted to all of the individual ES models. However, ATA has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modelling the trend component and handling seasonality patterns by utilizing classical decomposition. Therefore, the ATA method is expanded to higher-order ES methods for additive, multiplicative, additive damped, and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performance is compared to that of their counterpart ES models on the M3 competition data set, since it is still the most recent and comprehensive time-series data collection available. It is shown that the models outperform their counterparts in almost all settings, and when a model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared based on popular error metrics.
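As a hedged illustration, the sketch below contrasts simple exponential smoothing (constant weight alpha) with an ATA-style level recursion whose weight p/t decays with the time index; the recursion shown is our reading of the published ATA method, not the authors' implementation, and the data are placeholders.

    # Hedged sketch: simple exponential smoothing next to an ATA-style level recursion
    # in which the weight p/t changes with the time index t instead of staying constant.
    # The ATA recursion is our reading of the published method (parameter p), not the
    # authors' code; data are placeholders.
    def simple_exponential_smoothing(series, alpha):
        level = series[0]
        for x in series[1:]:
            level = alpha * x + (1.0 - alpha) * level
        return level  # one-step-ahead forecast

    def ata_level(series, p=1):
        level = series[0]
        for t, x in enumerate(series[1:], start=2):
            if t <= p:
                level = x                                  # initialisation phase
            else:
                level = (p / t) * x + ((t - p) / t) * level
        return level  # one-step-ahead forecast

    data = [12.0, 13.5, 12.8, 14.2, 15.0, 14.6, 15.8]
    print(simple_exponential_smoothing(data, alpha=0.3))
    print(ata_level(data, p=2))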

Keywords: accuracy, exponential smoothing, forecasting, initial value

Procedia PDF Downloads 164
1268 Surveillance of Adverse Events Following Immunization during New Vaccines Introduction in Cameroon: A Cross-Sectional Study on the Role of Mobile Technology

Authors: Andreas Ateke Njoh, Shalom Tchokfe Ndoula, Amani Adidja, Germain Nguessan Menan, Annie Mengue, Eric Mboke, Hassan Ben Bachir, Sangwe Clovis Nchinjoh, Yauba Saidu, Laurent Cleenewerck De Kiev

Abstract:

Vaccines play a great role in protecting populations globally. Vaccine products are subject to rigorous quality control and approval before use to ensure safety. Even if all actors take the required precautions, some people may still have adverse events following immunization (AEFI), caused by the vaccine composition or by an error in its administration. AEFI under-reporting is pronounced in low-income settings like Cameroon. The country introduced electronic platforms to strengthen surveillance. With the introduction of several novel vaccines, such as COVID-19 vaccines and the novel oral polio vaccine type 2 (nOPV2), there was a need to monitor AEFI in the country. A cross-sectional study was conducted from July to December 2022. Data on AEFI per region of Cameroon were reviewed for the past five years. Data were analyzed with MS Excel, and the results are presented as proportions. AEFI reporting was uncommon in Cameroon. With the introduction of novel vaccines in 2021, the health authorities adopted new tools and training to capture cases. The number of AEFI detected almost doubled using the Open Data Kit (ODK) compared with previous platforms, especially following the introduction of the nOPV2 and COVID-19 vaccines. The AEFI rates were 1.9 and 160 per 100,000 administered doses of nOPV2 and COVID-19 vaccines, respectively. This mobile tool captured individual information for people with AEFI from all regions, and the platform helped to identify common AEFI following the use of these new vaccines. The ODK mobile technology was vital in improving AEFI reporting and providing data for monitoring the use of new vaccines in Cameroon.

Keywords: adverse events following immunization, Cameroon, COVID-19 vaccines, nOPV, ODK

Procedia PDF Downloads 66
1267 Quantitative Evaluation of Endogenous Reference Genes for ddPCR under Salt Stress Using a Moderate Halophile

Authors: Qinghua Xing, Noha M. Mesbah, Haisheng Wang, Jun Li, Baisuo Zhao

Abstract:

Droplet digital PCR (ddPCR) is being increasingly adopted for gene detection and quantification because of its higher sensitivity and specificity. According to previous observations and our lab data, it is essential to use endogenous reference genes (RGs) when investigating gene expression at the mRNA level under salt stress. This study aimed to select and validate suitable RGs for gene expression analysis under salt stress using ddPCR. Six candidate RGs were selected based on the tandem mass tag (TMT)-labeled quantitative proteomics of Alkalicoccus halolimnae at four salinities. The expression stability of these candidate genes was evaluated using statistical algorithms (geNorm, NormFinder, BestKeeper and RefFinder). There was only a small fluctuation in the cycle threshold (Ct) value and copy number of the pdp gene. Its expression stability was ranked highest by all algorithms, making it the most suitable RG for quantification of expression by both qPCR and ddPCR in A. halolimnae under salt stress. The single RG pdp and RG combinations were used to normalize the expression of ectA, ectB, ectC, and ectD under four salinities. The present study constitutes the first systematic analysis of endogenous RG selection for halophiles responding to salt stress. This work provides a valuable theoretical basis and a reference approach for internal control identification in ddPCR-based stress response models.
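For context, ddPCR concentrations follow from a Poisson correction of the positive/negative droplet counts; the sketch below shows that calculation with an assumed droplet volume (about 0.85 nL for common systems) and placeholder counts.

    # Hedged sketch of the Poisson correction behind ddPCR quantification: the mean
    # copies per droplet is lambda = -ln(negative fraction), and the concentration
    # follows from the droplet volume. The droplet volume (~0.85 nL) and the counts
    # below are assumptions/placeholders, not values from this study.
    import math

    DROPLET_VOLUME_UL = 0.85e-3  # ~0.85 nL per droplet, expressed in microliters

    def copies_per_ul(total_droplets, positive_droplets):
        negative_fraction = (total_droplets - positive_droplets) / total_droplets
        lam = -math.log(negative_fraction)   # mean copies per droplet (Poisson)
        return lam / DROPLET_VOLUME_UL

    print(f"{copies_per_ul(total_droplets=15000, positive_droplets=4200):.0f} copies/uL")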

Keywords: endogenous reference gene, salt stress, ddPCR, RT-qPCR, Alkalicoccus halolimnae

Procedia PDF Downloads 79
1266 Sound Performance of a Composite Acoustic Coating With Embedded Parallel Plates Under Hydrostatic Pressure

Authors: Bo Hu, Shibo Wang, Haoyang Zhang, Jie Shi

Abstract:

With the development of sonar detection technology, the acoustic stealth technology of underwater vehicles is facing severe challenges. Underwater acoustic coatings are being developed towards low-frequency absorption capability and broad absorption bandwidth. In this paper, an acoustic model of an underwater acoustic coating made of composite material embedded with a periodic steel structure is presented. The model has multiple high absorption peaks in the frequency range of 1-8 kHz, where it achieves high sound absorption and broad-bandwidth performance. It is found that the frequencies of the absorption peaks are related to the classic half-wavelength transmission principle. The sound absorption performance of the acoustic model is investigated by the finite element method using COMSOL software. The sound absorption mechanism of the proposed model is explained by the distributions of the displacement vector field. The influence of the geometric parameters of the periodic steel structure, including thickness and spacing, on the sound absorption of the proposed model is further discussed. The acoustic model proposed in this study provides an idea for the design of underwater low-frequency broadband acoustic coatings, and the results show the feasibility of practical underwater application.
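For context, the half-wavelength principle mentioned here places absorption peaks near f_n = n*c/(2L) for a layer of thickness L; the sketch below uses an assumed sound speed and thickness purely for illustration.

    # Sketch of the classic half-wavelength condition that the peak frequencies are
    # related to: resonances of a layer of thickness L occur near f_n = n * c / (2 * L).
    # The sound speed and thickness below are assumptions for illustration only,
    # not the coating's actual material parameters.
    def half_wavelength_peaks(sound_speed_m_s, thickness_m, orders=(1, 2, 3)):
        return [n * sound_speed_m_s / (2.0 * thickness_m) for n in orders]

    c = 1500.0   # assumed effective longitudinal sound speed in the coating, m/s
    L = 0.25     # assumed layer thickness, m
    for n, f in zip((1, 2, 3), half_wavelength_peaks(c, L)):
        print(f"n={n}: f ~ {f / 1000:.2f} kHz")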

Keywords: acoustic coating, composite material, broad frequency bandwidth, sound absorption performance

Procedia PDF Downloads 153
1265 Data Disorders in Healthcare Organizations: Symptoms, Diagnoses, and Treatments

Authors: Zakieh Piri, Shahla Damanabi, Peyman Rezaii Hachesoo

Abstract:

Introduction: Healthcare organizations, like other organizations, suffer from a number of disorders such as business sponsor disorder, business acceptance disorder, cultural/political disorder, data disorder, etc. As quality in healthcare mostly depends on the quality of data, we aimed to identify data disorders and their symptoms in two teaching hospitals. Methods: Using a self-constructed questionnaire, we asked 20 questions related to the quality and usability of patient data stored in patient records. The research population consisted of 150 managers, physicians, nurses, and medical records staff who were working at the time of the study. We also asked their views about the symptoms of, and treatments for, any data disorders they mentioned in the questionnaire. We analyzed the answers using qualitative methods. Results: After classifying the answers, we found six main data disorders: incomplete data, missing data, late data, blurred data, manipulated data, and illegible data. The majority of participants believed they had an important role in the treatment of data disorders, while others attributed the disorders to health system problems. Discussion: As clinicians have an important role in producing data, they can easily identify symptoms and disorders of patient data. Health information managers can also play an important role in the early detection of data disorders by proactively monitoring and periodically checking the data.

Keywords: data disorders, quality, healthcare, treatment

Procedia PDF Downloads 417
1264 Creating Database and Building 3D Geological Models: A Case Study on Bac Ai Pumped Storage Hydropower Project

Authors: Nguyen Chi Quang, Nguyen Duong Tri Nguyen

Abstract:

This article is a first step in researching and outlining the structure of the geotechnical database for the geological survey of a power project; in the context of this report, the database has been created for the Bac Ai pumped storage hydropower project. To provide a method of organizing and storing geological and topographic survey data and experimental results in a spatial database, the RockWorks software is used to bring optimal efficiency to the process of exploiting, using, and analyzing data in support of design work in power engineering consulting. Three-dimensional (3D) geotechnical models, covering stratigraphy, lithology, porosity, etc., are created from the survey data. The results of the 3D geotechnical model for the Bac Ai pumped storage hydropower project include six closely stacked stratigraphic formations modelled by the Horizons method, whereas the modeling of engineering geological parameters is performed by geostatistical methods. The accuracy and reliability are assessed through error statistics, empirical evaluation, and expert methods. The three-dimensional model analysis allows better visualization of volumetric calculations, excavation and backfilling of the lake area, tunneling of power pipelines, and calculation of on-site construction material reserves. In general, the application of engineering geological modeling makes the design work more intuitive and comprehensive, helping construction designers better identify and offer the most optimal design solutions for the project. The database ensures updating and synchronization, and enables 3D modeling of geological and topographic data to be integrated with design data according to building information modeling. This is also the base platform for BIM and GIS integration.

Keywords: database, engineering geology, 3D Model, RockWorks, Bac Ai pumped storage hydropower project

Procedia PDF Downloads 147
1263 ADA Tool for Satellite InSAR-Based Ground Displacement Analysis: The Granada Region

Authors: M. Cuevas-González, O. Monserrat, A. Barra, C. Reyes-Carmona, R.M. Mateos, J. P. Galve, R. Sarro, M. Cantalejo, E. Peña, M. Martínez-Corbella, J. A. Luque, J. M. Azañón, A. Millares, M. Béjar, J. A. Navarro, L. Solari

Abstract:

Geohazard-prone areas require continuous monitoring to detect risks, understand the phenomena occurring in those regions, and prevent disasters. Satellite interferometry (InSAR) has become a trustworthy technique for ground movement detection and monitoring in the last few years. InSAR-based techniques allow large areas to be processed, providing a high number of displacement measurements at low cost. However, the results provided by such techniques are usually not easy for non-experienced users to interpret, hampering their use by decision makers. This work presents a set of tools developed in the framework of different projects (Momit, Safety, U-Geohaz, Riskcoast), and an example of their use in the Granada coastal area (Spain) is shown. The ADA (Active Displacement Areas) tool has been developed with the aim of easing the management, use, and interpretation of InSAR-based results. It provides semi-automatic extraction of the most significant ADAs through the ADAFinder application. This tool aims to support the exploitation of the European Ground Motion Service (EU-GMS), which will provide consistent, regular, and reliable information regarding natural and anthropogenic ground motion phenomena all over Europe.

Keywords: ground displacements, InSAR, natural hazards, satellite imagery

Procedia PDF Downloads 184
1262 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing

Authors: Yehjune Heo

Abstract:

As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using convolutional neural networks (CNNs), based on the choice of loss functions and optimizers. The CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including cross-entropy, center loss, cosine proximity, and hinge loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach and compare generalization power. It is important to note that we use a subset of LivDet and that the database is the same across all training and testing for each model. This way, we can compare the performance, in terms of generalization, on unseen data across all the different models. The best CNN (AlexNet) with the appropriate loss function and optimizer results in more than a 3% performance gain over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' accuracy together with their parameter counts and mean average error rates, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
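A hedged sketch of the experimental pattern, swapping loss functions and optimizers on a single small CNN, is shown below; the architecture, input size, and training settings are placeholders, not the paper's AlexNet/VGGNet/ResNet configurations.

    # Hedged sketch (not the paper's models): swapping loss functions and optimizers
    # on one small CNN to compare spoof-vs-live classification, in the spirit of the
    # study. Architecture, image size, and training settings are placeholders.
    from tensorflow import keras

    def build_cnn(input_shape=(96, 96, 1), num_classes=2):
        return keras.Sequential([
            keras.layers.Input(shape=input_shape),
            keras.layers.Conv2D(32, 3, activation="relu"),
            keras.layers.MaxPooling2D(),
            keras.layers.Conv2D(64, 3, activation="relu"),
            keras.layers.GlobalAveragePooling2D(),
            keras.layers.Dense(num_classes, activation="softmax"),
        ])

    losses = {"cross_entropy": "categorical_crossentropy", "hinge": "categorical_hinge"}
    optimizers = {"adam": keras.optimizers.Adam, "sgd": keras.optimizers.SGD,
                  "rmsprop": keras.optimizers.RMSprop, "adadelta": keras.optimizers.Adadelta}

    # x_train/y_train would be fingerprint patches and one-hot live/spoof labels.
    def compare(x_train, y_train, x_val, y_val, epochs=5):
        results = {}
        for loss_name, loss in losses.items():
            for opt_name, opt_cls in optimizers.items():
                model = build_cnn()
                model.compile(optimizer=opt_cls(), loss=loss, metrics=["accuracy"])
                hist = model.fit(x_train, y_train, validation_data=(x_val, y_val),
                                 epochs=epochs, verbose=0)
                results[(loss_name, opt_name)] = hist.history["val_accuracy"][-1]
        return results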

Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer

Procedia PDF Downloads 118
1263 E-Vet Smart Rapid System: Detection of Farm Disease Based on an Expert System to Support Epidemic Disease Control

Authors: Malik Abdul Jabbar Zen, Wiwik Misaco Yuniarti, Azisya Amalia Karimasari, Novita Priandini

Abstract:

Zoonosis, an infection transmitted from animals to humans and vice versa, has increased in the last 20 years. Experts predict that zoonosis will be a threat to communities in the future, since it accounts for 70% of emerging infectious diseases (EID) and carries a high mortality of 50%-90%. Zoonoses spread from animals to humans through contaminated food are known as foodborne diseases. One World One Health, as the conceptual approach to zoonosis prevention, requires cross-disciplinary cooperation to accelerate and streamline the handling of animal-based disease. The E-Vet Smart Rapid System is an integrated innovation in veterinary expertise applications that is able to facilitate prevention, treatment, and education against pandemic diseases and zoonoses. The system is built on a decision support system (DSS) method and provides a knowledge database that is expected to facilitate rapid, precise, and accurate identification of disease, as well as of the resulting deduction. Testing was conducted through black-box test cases and a questionnaire (N = 30) using a validity and reliability approach. The black-box test cases reveal that the E-Vet Rapid System is able to deliver results in accordance with the system design, and the questionnaire shows that the system is valid (r > 0.361) and reliable (α > 0.3610).

Keywords: diagnosis, disease, expert systems, livestock, zoonosis

Procedia PDF Downloads 435
1260 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of electron tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in recent decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (compressed sensing-total variation minimization) algorithms to reveal more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not been properly addressed yet, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters like the range of tilt angles, image noise level, or object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 65
1259 In vivo Mechanical Characterization of Facial Skin Combining Digital Image Correlation and Finite Element

Authors: Huixin Wei, Shibin Wang, Linan Li, Lei Zhou, Xinhao Tu

Abstract:

Facial skin is a biomedical material with complex mechanical properties of anisotropy, viscoelasticity, and hyperelasticity. The mechanical properties of facial skin are crucial for a number of applications, including facial plastic surgery, animation, dermatology, the cosmetic industry, and impact biomechanics. Skin is a complex multi-layered material which can be broadly divided into three main layers: the epidermis, the dermis, and the hypodermis. Collagen fibers account for 75% of the dry weight of dermal tissue, and it is these fibers which are responsible for the mechanical properties of skin. Much research on the anisotropic mechanical properties has concentrated on in vitro testing, but there is a great difference between the in vivo and in vitro mechanical properties of skin. In this study, we present a method to measure the mechanical properties of facial skin in vivo. Digital image correlation (DIC) and indentation tests were used to obtain the experimental data, including the deformation of the facial surface and the indentation force-displacement curve. The experiment was then simulated using a finite element (FE) model. Computed tomography (CT) and reconstruction techniques were applied to obtain the real tissue geometry, yielding a three-dimensional FE model of facial skin comprising a bi-layer system. As the epidermis is relatively thin, the epidermis and dermis were regarded as one layer in this study, with the hypodermis below it. The upper layer was modeled with a Gasser-Ogden-Holzapfel (GOH) model to describe the hyperelastic and anisotropic behavior of the dermis, and the lower layer was modeled as linear elastic. In conclusion, the material properties of the two layers were determined by minimizing the error between the FE data and the experimental data.
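For reference, the GOH strain-energy function is commonly written as a neo-Hookean ground-matrix term plus an exponential term per fibre family; the sketch below implements that standard literature form with placeholder parameters, which may differ from the exact formulation used in the paper.

    # Hedged sketch of the strain-energy form commonly used for the GOH model
    # (standard literature form; the paper's exact parameterisation may differ):
    #   W = mu/2 * (I1 - 3) + sum_i k1/(2*k2) * (exp(k2 * E_i^2) - 1),
    #   E_i = kappa * (I1 - 3) + (1 - 3*kappa) * (I4_i - 1), taken only when E_i > 0.
    # Parameter values below are placeholders, not fitted skin properties.
    import math

    def goh_energy(I1, I4_fibres, mu, k1, k2, kappa):
        iso = 0.5 * mu * (I1 - 3.0)                  # neo-Hookean ground matrix
        aniso = 0.0
        for I4 in I4_fibres:                         # one term per fibre family
            E = kappa * (I1 - 3.0) + (1.0 - 3.0 * kappa) * (I4 - 1.0)
            E = max(E, 0.0)                          # fibres only bear tension
            aniso += k1 / (2.0 * k2) * (math.exp(k2 * E * E) - 1.0)
        return iso + aniso

    print(goh_energy(I1=3.2, I4_fibres=(1.1, 1.05), mu=0.02, k1=0.5, k2=10.0, kappa=0.2))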

Keywords: facial skin, indentation test, finite element, digital image correlation, computed tomography

Procedia PDF Downloads 97
1258 The Effect of Low-Power Laser on CK and Some Markers of Delayed Onset Muscle Soreness (DOMS)

Authors: Bahareh Yazdanparast Chaharmahali

Abstract:

The study examined the effect of low-power laser therapy on knee range of motion (flexion and extension), resting angle of the knee joint, knee circumference, and ratings of delayed-onset muscle soreness-induced pain, 24 and 48 hours after eccentric training of the knee flexor muscles (hamstrings). We also investigated the effects of pulsed ultrasound on swelling, relaxed knee angle, flexion and extension knee angles, and pain. Twenty female college students voluntarily participated in this research. On day 1, in order to induce delayed-onset muscle soreness, subjects eccentrically trained their knee flexor muscles. On day 2, subjects were randomly divided into two groups: control and low-power laser therapy. Variables (knee flexion and extension range of motion, resting knee joint angle, and knee circumference) were measured and analyzed 24 and 48 hours after eccentric training. Data are reported as means ± standard error (SE), and repeated-measures analysis was used to assess differences within groups. The treatment (low-power laser therapy) had significant effects on delayed-onset muscle soreness markers. At 24 and 48 hours after training, a significant difference was observed between the mean pain of the two groups; this difference was significant between the low-power laser therapy and control groups, and the Bonferroni post hoc test was significant. Low-power laser therapy as used in this study significantly diminished the effects of delayed-onset muscle soreness on swelling and on relaxed knee extension and flexion angles.

Keywords: creatine kinase, DOMS, eccentric training, low power laser

Procedia PDF Downloads 228
1257 Evaluation of Elemental Impurities in Drugs According to Pharmacopoeia by Use of the FESEM-EDS Technique

Authors: Rafid Doulab

Abstract:

Control of elemental impurities in the pharmaceutical industry is indispensable to ensure pharmaceutical safety with respect to 24 elements. Although atomic absorption and inductively coupled plasma methods are used in the U.S. Pharmacopeia and the European Pharmacopoeia, FESEM with an energy-dispersive spectrometer can be applied as an alternative analysis method, giving quantitative and qualitative results for a variety of elements without chemical pretreatment, unlike other techniques. This technique is characterized by short analysis time, less contamination, no reagent consumption, generation of minimal residue or waste, limited sample preparation time, and minimal analysis error. Using simple dilution for powders or direct analysis for liquids, we analyzed the usefulness of the EDS method in testing with field emission scanning electron microscopy (FESEM, SUPRA 55, Carl Zeiss, Germany) with an X-ray energy-dispersive spectrometer (XFlash6l10, Bruker, Germany). The samples were analyzed directly without coating by applying 5 µ of the diluted sample of known concentration onto a carbon stub, with the accelerating voltage set according to sample thickness; the result for each spot was given in atomic percentage, and, using an Avogadro-based conversion factor, the final result is obtained in micrograms. Conclusion and recommendation: The conclusion of this study is that FESEM-EDS can be applied within the U.S. Pharmacopeia and the ICH Q3D guideline as a precise and accurate method for elemental impurity analysis of drugs or bulk materials, to determine the permitted daily exposure (PDE) in liquid or solid specimens, and to obtain better results than other techniques, since it does not require complex methods or chemicals for digestion, which can interfere with the final results, and the sample can be kept for re-analysis at any time. The recommendation is to use this technique in the pharmacopoeias as a standard method, like the inductively coupled plasma techniques ICP-AES/ICP-OES and ICP-MS.
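The conversion from EDS atomic percent to a mass-based result relies on weighting by atomic masses; the sketch below shows the standard atomic-percent to weight-percent step with placeholder values, as one reading of the "Avogadro conversion factor" mentioned above.

    # Hedged sketch of the standard atomic-percent to weight-percent conversion used
    # when turning EDS atomic concentrations into mass-based results (one reading of
    # the "Avogadro conversion factor" in the abstract). Values are placeholders.
    ATOMIC_MASS = {"Pb": 207.2, "Cd": 112.41, "As": 74.92, "O": 16.00}  # g/mol

    def atomic_to_weight_percent(atomic_percent):
        weighted = {el: at * ATOMIC_MASS[el] for el, at in atomic_percent.items()}
        total = sum(weighted.values())
        return {el: 100.0 * w / total for el, w in weighted.items()}

    spot = {"Pb": 0.02, "Cd": 0.01, "As": 0.01, "O": 99.96}  # placeholder atomic %
    print(atomic_to_weight_percent(spot))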

Keywords: pharmacopoeia, FESEM-EDS, element impurities, atomic concentration

Procedia PDF Downloads 97
1256 Detection of Brackish Water Biological Fingerprints in Potable Water

Authors: Abdullah Mohammad, Abdullah Alshemali, Esmaeil Alsaleh

Abstract:

The chemical composition of desalinated water is modified to make it more acceptable to the end user. Sometimes this modification is achieved by mixing with brackish water, which is known to contain a variety of minerals. Expectedly, besides minerals, bacterial communities indigenous to the brackish water access the final mixture and hence reach the end consumer. The current project examined the safety of using brackish water as an ingredient in potable water. Pseudomonas aeruginosa strains were detected in potable and brackish water samples collected from storage facilities in residential areas as well as from main water distribution and storage tanks. The application of molecular and biochemical fingerprinting methods, including phylogeny, RFLP (restriction fragment length polymorphism), MLST (multilocus sequence typing), and substrate specificity testing, suggested that the potable water P. aeruginosa strains most probably originated from brackish water. Additionally, all sixty-four isolates showed a multi-drug resistance (MDR) phenotype and harboured the three genes responsible for biofilm formation. These virulence factors represent serious health hazards, compelling the scientific community to revise the WHO (World Health Organization) and USEPA (US Environmental Protection Agency) potable water quality guidelines, particularly those related to the types of bacterial genera that evade the current water quality guidelines.

Keywords: potable water, brackish water, Pseudomonas aeruginosa, multidrug resistance

Procedia PDF Downloads 104