Search results for: imputation method of missing data
38264 Evaluation of the Efficacy and Tolerance of Gabapentin in the Treatment of Neuropathic Pain
Authors: A. Ibovi Mouondayi, S. Zaher, R. Assadi, K. Erraoui, S. Sboul, J. Daoudim, S. Bousselham, K. Nassar, S. Janani
Abstract:
INTRODUCTION: Neuropathic pain (NP), caused by damage to the somatosensory nervous system, has a significant impact on quality of life and is associated with a high economic burden on the individual and society. The treatment of neuropathic pain draws on a wide range of therapeutic agents, including gabapentin. OBJECTIVE: The objective of this study was to evaluate the efficacy and tolerance of gabapentin in the treatment of neuropathic pain. MATERIALS AND METHODS: This is a monocentric, cross-sectional, descriptive, retrospective study conducted in our department over a period of 19 months, from October 2020 to April 2022. Missing parameters were collected during phone calls to the patients concerned. The diagnostic tool adopted was the DN4 questionnaire in its dialectal Arabic version. The impact of NP was assessed by the visual analog scale (VAS) for pain, sleep, and function. The impact of NP on mood was assessed by the Hospital Anxiety and Depression Scale (HAD) in its validated Arabic version. Patients followed up for depression and other psychiatric pathologies were excluded. RESULTS: Data from a total of 67 patients were collected. The average age was 64 years (+/- 15 years), with extremes ranging from 26 to 94 years. There were 58 women and 9 men (M/F sex ratio 0.15). Cervical radiculopathy was found in 21% of this population, and lumbosacral radiculopathy in 61%. Gabapentin was introduced at doses ranging from 300 to 1800 mg per day, with an average dose of 864 mg (+/- 346) per day for an average duration of 12.6 months. Before treatment, 93% of patients had non-restorative sleep quality (VAS > 3), 54% of patients had a pain VAS greater than 5, and function was normal in only 9% of patients. The mean HAD anxiety score was 3.25 (standard deviation: 2.70), and the mean HAD depression score was 3.79 (standard deviation: 1.79).
After treatment, all patients had improved sleep quality (p<0.0001). Significant differences were also noted in pain VAS, function, and HAD anxiety and depression scores. Gabapentin was stopped for side effects (dizziness and drowsiness) and/or an unsatisfactory response. CONCLUSION: Our data demonstrate a favorable effect of gabapentin in the management of neuropathic pain, with a significant before-and-after difference in patients' quality of life and an acceptable tolerance profile.
Keywords: neuropathic pain, chronic pain, treatment, gabapentin
Procedia PDF Downloads 95
38263 The Bidirectional Effect between Parental Burnout and the Child’s Internalized and/or Externalized Behaviors
Authors: Aline Woine, Moïra Mikolajczak, Virginie Dardier, Isabelle Roskam
Abstract:
Background information: Becoming a parent is said to be the happiest event one can ever experience in one’s life. This popular (and almost absolute) truth–which no reasonable and decent human being would ever dare question on pain of being singled out as a bad parent–contrasts with the nuances that reality offers. Indeed, while many parents do thrive in their parenting role, some others falter and become progressively overwhelmed by their parenting role, ineluctably caught in a spiral of exhaustion. Parental burnout (henceforth PB) sets in when parental demands (stressors) exceed parental resources. While it is now generally acknowledged that PB affects the parent’s behavior in terms of neglect and violence toward their offspring, little is known about the impact that the syndrome might have on the children’s internalized (anxious and depressive symptoms, somatic complaints, etc.) and/or externalized (irritability, violence, aggressiveness, conduct disorder, oppositional disorder, etc.) behaviors. Furthermore, at the time of writing and to the best of our knowledge, no research has yet tested the reverse effect, namely, that of the child's internalized and/or externalized behaviors on the onset and/or maintenance of parental burnout symptoms. Goals and hypotheses: The present pioneering research proposes to fill an important gap in the existing literature on PB by investigating the bidirectional effect between PB and the child’s internalized and/or externalized behaviors. Relying on a cross-lagged longitudinal study with three waves of data collection (4 months apart), our study tests a transactional model with bidirectional and recursive relations between observed variables across the three waves, as well as autoregressive paths and cross-sectional correlations.
Methods: As we write this, wave-two data are being collected via Qualtrics, and we expect a final sample of about 600 participants composed of French-speaking (snowball sample) and English-speaking (Prolific sample) parents. Structural equation modeling is employed using Stata version 17. In order to retain as much statistical power as possible, we use all available data and therefore apply maximum likelihood with missing values (mlmv) as the estimation method to compute the parameter estimates. To limit (insofar as possible) shared-method-variance bias in the evaluation of the child’s behavior, the study relies on a multi-informant evaluation approach. Expected results: We expect our three-wave longitudinal study to show that PB symptoms (measured at T1) increase the occurrence/intensity of the child’s externalized and/or internalized behaviors (measured at T2 and T3). We further expect the occurrence/intensity of the child’s externalized and/or internalized behaviors (measured at T1) to increase the risk for PB (measured at T2 and T3). Conclusion: Should our hypotheses be confirmed, our results will make an important contribution to the understanding of both PB and children’s behavioral issues, thereby opening interesting theoretical and clinical avenues.
Keywords: exhaustion, structural equation modeling, cross-lagged longitudinal study, violence and neglect, child-parent relationship
Procedia PDF Downloads 73
38262 Improved Classification Procedure for Imbalanced and Overlapped Situations
Authors: Hankyu Lee, Seoung Bum Kim
Abstract:
The issues of imbalance and overlap in the class distribution are important in various applications of data mining. An imbalanced dataset is a special case of a classification problem in which the number of observations of one class (i.e., the major class) heavily exceeds the number of observations of the other class (i.e., the minor class). An overlapped dataset is one in which many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. Class imbalance and overlap pose a challenging issue because they degrade the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts: nonoverlapping, lightly overlapping, and severely overlapping, and applying a classification algorithm in each part. These three parts were determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
Keywords: classification, imbalanced data with class overlap, split data space, support vector machine
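The three-way split described above can be illustrated with a simplified stand-in: the paper determines the regions from the Hausdorff distance and the margin of a modified support vector machine, but the same idea can be sketched by using each observation's distance to its nearest opposite-class neighbour as an overlap score (the `severe` and `light` thresholds below are arbitrary illustration values, not from the study):

```python
import math

def split_regions(points, labels, severe=0.5, light=1.5):
    """Partition observations into severe-overlap, light-overlap and
    non-overlap regions using the distance to the nearest opposite-class
    neighbour as a crude overlap score (a stand-in for the paper's
    Hausdorff-distance / SVM-margin criterion)."""
    regions = []
    for i, p in enumerate(points):
        d_opp = min(math.dist(p, q)
                    for j, q in enumerate(points) if labels[j] != labels[i])
        if d_opp < severe:
            regions.append("severe")
        elif d_opp < light:
            regions.append("light")
        else:
            regions.append("nonoverlap")
    return regions

# Two classes: most points are far apart, but the classes mix near the origin.
pts = [(0.0, 0.0), (0.2, 0.0), (5.0, 5.0), (0.1, 0.1)]
lbl = [0, 0, 1, 1]
print(split_regions(pts, lbl))  # → ['severe', 'severe', 'nonoverlap', 'severe']
```

A separate classifier would then be trained inside each region, as the procedure prescribes.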
Procedia PDF Downloads 308
38261 Semi-Automatic Method to Assist Expert for Association Rules Validation
Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen
Abstract:
In order to help the expert validate association rules extracted from data, some quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first category depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, and manually reviewing the rules is thus a very hard task. To solve this problem, we propose, in this paper, a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize the association rules with their classification quality to give the expert an overview and to assist him during the validation process.
Keywords: association rules, rule-based classification, classification quality, validation
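Steps (i)–(iii) above can be sketched in miniature; this is a minimal illustration in which the item names, the first-match firing strategy, and the accuracy measure are assumptions for the example, not details taken from the paper:

```python
# (i) Association rules recast as classification rules: antecedent itemset -> class.
rules = [
    (frozenset({"fever", "cough"}), "flu"),
    (frozenset({"rash"}), "allergy"),
]

# (ii) Use the generated classifiers for data classification (first matching rule wins).
def classify(record, rules, default="unknown"):
    for antecedent, label in rules:
        if antecedent <= record:  # the rule fires if its antecedent is contained
            return label
    return default

# (iii) Attach a classification-quality score to each rule to guide the expert.
def rule_accuracy(rule, data):
    antecedent, label = rule
    fired = [(rec, cls) for rec, cls in data if antecedent <= rec]
    if not fired:
        return 0.0
    return sum(1 for _, cls in fired if cls == label) / len(fired)

data = [(frozenset({"fever", "cough"}), "flu"),
        (frozenset({"fever", "cough", "rash"}), "flu"),
        (frozenset({"rash"}), "allergy")]
print([rule_accuracy(r, data) for r in rules])  # → [1.0, 0.5]
```

The expert then inspects the rules together with their quality scores instead of an undifferentiated list.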
Procedia PDF Downloads 440
38260 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-time
Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl
Abstract:
In recent years, SQL injection attacks have been identified as being prevalent against web applications. They affect network security and user data, which leads to a considerable loss of money and data every year. This paper presents the use of classification algorithms in machine learning, with a method that classifies login-form inputs into "SQLi" or "Non-SQLi", thus increasing the reliability and accuracy of the results in terms of deciding whether an operation is an attack or a valid operation. A method, WebAppShield, based on web-app auto-generated twin data structure replication with shielding against SQLi attacks, has been developed: it verifies all users and prevents attackers (SQLi attacks) from entering and/or accessing the database, admitting only the operations that the machine learning module predicts as "Non-SQLi". A special login form has been developed with a special instance of data validation; this verification process secures the web application from its early stages. The system has been tested and validated, and up to 99% of SQLi attacks have been prevented.
Keywords: SQL injection, attacks, web application, accuracy, database
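As a toy illustration of labeling login inputs "SQLi" or "Non-SQLi", the sketch below scores inputs against a few well-known injection signatures; the actual system uses a trained machine learning module, so the patterns and threshold here are illustrative assumptions only:

```python
import re

# A few well-known SQL injection signatures (illustrative, not the paper's feature set).
SQLI_PATTERNS = [
    r"(?i)\bor\b\s+\d+\s*=\s*\d+",  # tautologies such as OR 1=1
    r"(?i)\bunion\b\s+\bselect\b",  # UNION-based extraction
    r"--",                          # SQL comment used to truncate queries
    r"(?i)\bdrop\b\s+\btable\b",    # destructive statements
    r"'",                           # stray quote breaking out of a string literal
]

def classify_input(text):
    """Label a login-form input as 'SQLi' or 'Non-SQLi'."""
    hits = sum(1 for p in SQLI_PATTERNS if re.search(p, text))
    return "SQLi" if hits > 0 else "Non-SQLi"

print(classify_input("admin' OR 1=1 --"))  # → SQLi
print(classify_input("alice"))             # → Non-SQLi
```

A learned classifier replaces the hand-written signature list with weights fitted on labeled traffic, but the input/output contract is the same.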
Procedia PDF Downloads 153
38259 A Method for Reduction of Association Rules in Data Mining
Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa
Abstract:
The use of association rule algorithms within data mining is recognized as being of great value for knowledge discovery in databases. Very often, the number of rules generated is high, sometimes even in databases with a small volume, so the success of the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated with association algorithms. Therefore, a computational algorithm was developed using the Weka Application Programming Interface, which allows the execution of the method on different types of databases. After the development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where the worst case presented a gain of more than 50%, considering support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the results of association rules generated by these algorithms.
Keywords: data mining, association rules, rules reduction, artificial intelligence
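The support, confidence, and lift measures used for the reduction can be sketched as follows; the thresholds and transactions are invented for illustration (the paper's algorithm runs through the Weka API on real databases):

```python
def support(itemset, transactions):
    """Fraction of transactions containing the itemset."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def reduce_rules(rules, transactions, min_sup=0.4, min_conf=0.7, min_lift=1.0):
    """Keep only rules (lhs -> rhs) that pass all three thresholds."""
    kept = []
    for lhs, rhs in rules:
        sup = support(lhs | rhs, transactions)
        sup_lhs = support(lhs, transactions)
        sup_rhs = support(rhs, transactions)
        conf = sup / sup_lhs if sup_lhs else 0.0
        lift = conf / sup_rhs if sup_rhs else 0.0
        if sup >= min_sup and conf >= min_conf and lift >= min_lift:
            kept.append((lhs, rhs))
    return kept

transactions = [{"a", "b"}, {"a", "b", "c"}, {"a"}, {"b", "c"}]
rules = [(frozenset("a"), frozenset("b")), (frozenset("c"), frozenset("b"))]
print(reduce_rules(rules, transactions))  # only c -> b survives all thresholds
```

Here a -> b is dropped because its confidence (2/3) falls below the 0.7 threshold, while c -> b passes with confidence 1.0 and lift above 1.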
Procedia PDF Downloads 162
38258 Evaluating the Implementation of a Quality Management System in the COVID-19 Diagnostic Laboratory of a Tertiary Care Hospital in Delhi
Authors: Sukriti Sabharwal, Sonali Bhattar, Shikhar Saxena
Abstract:
Introduction: The COVID-19 molecular diagnostic laboratory is the cornerstone of COVID-19 diagnosis, as the patient’s treatment and management protocol depend on the molecular results. For this purpose, it is extremely important that the laboratory producing these results adheres to quality management processes to increase the accuracy and validity of the reports generated. We started our own molecular diagnostic setup at the onset of the pandemic and therefore conducted this study to generate our quality management data and help us improve on our weak points. Materials and Methods: A total of 14561 samples were evaluated by the retrospective observational method. The quality variables analysed were classified into pre-analytical, analytical, and post-analytical variables, and the results are presented as percentages. Results: Among the pre-analytical variables, sample leakage was the most common cause of rejection of samples (134/14561, 0.92%), followed by non-generation of SRF ID (76/14561, 0.52%) and non-compliance with triple packaging (44/14561, 0.3%). The other pre-analytical aspects assessed were incomplete patient identification (17/14561, 0.11%), insufficient sample quantity (12/14561, 0.08%), missing forms/samples (7/14561, 0.04%), samples in the wrong vials/empty VTM tubes (5/14561, 0.03%), and LIMS entry not done (2/14561, 0.01%). We were unable to obtain internal quality control in 0.37% of samples (55/14561). We also experienced two incidences of cross-contamination among the samples, resulting in false-positive results. Among the post-analytical factors, a total of 0.07% of samples (11/14561) could not be dispatched within the stipulated time frame. Conclusion: Adherence to quality control processes is paramount for the smooth running of any diagnostic laboratory, especially those involved in critical reporting.
Not only do the indicators help keep laboratory parameters in check, but they also allow comparison with other laboratories.
Keywords: laboratory quality management, COVID-19, molecular diagnostics, healthcare
Procedia PDF Downloads 165
38257 Divergence of Innovation Capabilities within the EU
Authors: Vishal Jaunky, Jonas Grafström
Abstract:
The development of the European Union’s (EU) single economic market and rapid technological change have resulted in major structural changes in the economies of the EU’s member states. The general liberalization process that the countries have undergone together has convinced the governments of the member states of the need to upgrade their economic and training systems in order to be able to face economic globalization. Several signs of economic convergence have been found, but less is known about knowledge production. This paper addresses the convergence pattern of technological innovation in 13 EU member states over the time period 1990-2011 by means of parametric and non-parametric techniques. The parametric approaches revolve around the neoclassical convergence theories. This paper reveals divergence of both the β and σ types. Further, we found evidence of stochastic divergence, and a non-parametric convergence approach, distribution dynamics, shows a tendency towards divergence. This result is supported by the occurrence of γ-divergence. The policies of the EU to reduce the technological gap among its member states seem to be missing their target, something that can have negative long-run consequences for the market.
Keywords: convergence, patents, panel data, European Union
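σ-convergence (or divergence) of the kind tested above can be checked by tracking the cross-sectional dispersion of the indicator over time: rising dispersion signals σ-divergence. A minimal sketch with invented patent figures (not the study's data):

```python
import math
import statistics

def sigma_dispersion(panel):
    """Cross-sectional standard deviation of log values, one entry per year.
    A rising series over time indicates sigma-divergence."""
    return {year: statistics.pstdev([math.log(v) for v in values])
            for year, values in panel.items()}

# Hypothetical patents-per-capita for three countries in two years.
panel = {1990: [10.0, 12.0, 11.0], 2011: [10.0, 40.0, 90.0]}
disp = sigma_dispersion(panel)
print(disp[2011] > disp[1990])  # → True: dispersion grew, i.e. sigma-divergence
```

β-convergence would instead regress the growth rate on the initial level; a negative coefficient means laggards catch up.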
Procedia PDF Downloads 290
38256 Problems of Boolean Reasoning Based Biclustering Parallelization
Authors: Marcin Michalak
Abstract:
Biclustering is a method of two-dimensional data analysis. In recent years it has become possible to express this task in terms of Boolean reasoning, for processing continuous, discrete, and binary data. The mathematical background of this approach — the proved ability to induce exact and inclusion-maximal biclusters fulfilling assumed criteria — is a strong advantage of the method. Unfortunately, the core of the method has quite high computational complexity. In the paper, the basics of the Boolean reasoning approach to biclustering are presented. In this context, the problems of parallelizing the computation are raised.
Keywords: Boolean reasoning, biclustering, parallelization, prime implicant
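The notion of an exact, inclusion-maximal bicluster mentioned above can be made concrete for binary data: a set of rows and columns is an exact bicluster if the induced submatrix is all ones, and it is inclusion-maximal if no further row or column can be added. A small sketch (the matrix is an invented example):

```python
def is_exact_bicluster(M, rows, cols):
    """True if the submatrix M[rows][cols] contains only ones."""
    return all(M[r][c] == 1 for r in rows for c in cols)

def is_inclusion_maximal(M, rows, cols):
    """True if (rows, cols) is an exact bicluster that no row or column extends."""
    if not is_exact_bicluster(M, rows, cols):
        return False
    extra_row = any(r not in rows and all(M[r][c] == 1 for c in cols)
                    for r in range(len(M)))
    extra_col = any(c not in cols and all(M[r][c] == 1 for r in rows)
                    for c in range(len(M[0])))
    return not (extra_row or extra_col)

M = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
print(is_inclusion_maximal(M, {0, 1}, {0, 1}))  # → True: cannot be extended
print(is_inclusion_maximal(M, {0}, {0, 1}))     # → False: row 1 also fits
```

The Boolean reasoning method enumerates such maximal biclusters via prime implicants, and it is this enumeration whose cost motivates the parallelization question.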
Procedia PDF Downloads 125
38255 Predicting Medical Check-Up Patient Re-Coming Using Sequential Pattern Mining and Association Rules
Authors: Rizka Aisha Rahmi Hariadi, Chao Ou-Yang, Han-Cheng Wang, Rajesri Govindaraju
Abstract:
As medical check-ups increase in popularity, a huge amount of medical check-up data is stored in databases without being put to use. These data can actually be very useful for future strategic planning if mined correctly. On the other hand, patients arrive unpredictably, and the limited available facilities keep the medical check-up service offered by hospitals from being maximal. To solve this problem, this study used those medical check-up data to predict patient re-coming. The sequential pattern mining (SPM) and association rules methods were chosen because they are suitable for predicting patient re-coming from sequential data. First, based on patients' personal information, the data were grouped into … groups, and discriminant analysis was then performed to check the significance of the grouping. Second, for each group, frequent patterns were generated using the SPM method. Third, based on the frequent patterns of each group, pairs of variables were extracted using association rules to obtain a general pattern of re-coming patients. Last, a discussion and conclusion give some implications of the results.
Keywords: patient re-coming, medical check-up, health examination, data mining, sequential pattern mining, association rules, discriminant analysis
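The second step, generating frequent patterns with SPM, can be sketched for length-2 sequential patterns: count, for each ordered pair of events, the fraction of patient visit sequences in which the first event precedes the second. The event names and the support threshold below are invented for illustration:

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(sequences, min_support=0.6):
    """Support of each ordered event pair (a occurs before b) across sequences."""
    counts = Counter()
    for seq in sequences:
        # Each sequence contributes at most once per ordered pair.
        pairs = {(seq[i], seq[j]) for i, j in combinations(range(len(seq)), 2)}
        counts.update(pairs)
    n = len(sequences)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

visits = [["check-up", "abnormal-result", "re-come"],
          ["check-up", "normal-result"],
          ["check-up", "abnormal-result", "re-come"]]
patterns = frequent_pairs(visits)
print(("abnormal-result", "re-come") in patterns)  # → True, support 2/3
```

Association rules over these frequent patterns then yield statements such as "abnormal result → re-come", which is the general re-coming pattern the study extracts per group.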
Procedia PDF Downloads 642
38254 Troubleshooting Petroleum Equipment Based on Wireless Sensors Based on Bayesian Algorithm
Authors: Vahid Bayrami Rad
Abstract:
In this research, common methods and techniques have been investigated with a focus on intelligent fault-finding and monitoring systems in the oil industry. Remote and intelligent control methods are considered a necessity for implementing various operations in the oil industry, and benefiting from the knowledge extracted, with the help of data mining algorithms, from the countless data generated is a viable way to speed up monitoring and troubleshooting operations in today's big oil companies. Therefore, by comparing data mining algorithms and examining their efficiency, structure, and response under different conditions, the proposed (Bayesian) algorithm, using data clustering together with data analysis and evaluation via a colored Petri net, provides an applicable and dynamic model from the point of view of reliability and response time. By using this method, it is possible to achieve a dynamic and consistent model of the remote control system, prevent the occurrence of leakage in oil pipelines and refineries, and reduce costs as well as human and financial errors. The statistical data obtained from the evaluation process show increased reliability, availability, and speed compared to previous methods.
Keywords: wireless sensors, petroleum equipment troubleshooting, Bayesian algorithm, colored Petri net, rapid miner, data mining-reliability
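The Bayesian reasoning at the core of such fault finding can be illustrated with a single posterior update: given a sensor alarm, Bayes' theorem combines the prior fault rate with the sensor's detection and false-alarm rates. All the numbers below are invented for illustration, not taken from the study:

```python
def fault_posterior(prior, p_alarm_given_fault, p_alarm_given_ok):
    """P(fault | alarm) via Bayes' theorem."""
    p_alarm = p_alarm_given_fault * prior + p_alarm_given_ok * (1 - prior)
    return p_alarm_given_fault * prior / p_alarm

# Hypothetical pipeline-leak sensor: 1% base leak rate,
# 95% detection rate, 2% false-alarm rate.
post = fault_posterior(prior=0.01, p_alarm_given_fault=0.95, p_alarm_given_ok=0.02)
print(round(post, 3))  # → 0.324
```

Even a reliable sensor yields a modest posterior when faults are rare, which is why such systems typically fuse several sensors (or repeated readings) before raising a troubleshooting action.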
Procedia PDF Downloads 67
38253 Modified Naive Bayes-Based Prediction Modeling for Crop Yield Prediction
Authors: Kefaya Qaddoum
Abstract:
Most greenhouse growers desire a predictable amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness: it retains redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction, which uses an L1 penalty, is capable of clearing out redundant predictors, and a modification of the LARS algorithm is devised to solve this problem, making the method applicable to a wide range of data. In the experimental section, a study was conducted to examine the effect of redundant and irrelevant predictors and to test the method on a WSG dataset of tomato yields, where there are many more predictors than data points and where the urgent need to predict weekly yield is the goal of this approach. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN) and is shown to be fairly good.
Keywords: tomato yield prediction, naive Bayes, redundancy, WSG
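The unregularized Gaussian naive Bayes baseline that the paper starts from can be sketched as follows; the L1/LARS modification itself is beyond a few lines, so this shows only the plain classifier, with invented feature values:

```python
import math
from collections import defaultdict

def fit_gnb(X, y):
    """Per-class mean and variance of each feature, plus class priors."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    model = {}
    for cls, rows in by_class.items():
        stats = []
        for col in zip(*rows):
            mu = sum(col) / len(col)
            var = sum((v - mu) ** 2 for v in col) / len(col) + 1e-9  # avoid zero variance
            stats.append((mu, var))
        model[cls] = (stats, len(rows) / len(X))
    return model

def predict_gnb(model, x):
    def log_post(cls):
        stats, prior = model[cls]
        return math.log(prior) + sum(
            -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
            for v, (mu, var) in zip(x, stats))
    return max(model, key=log_post)

# Hypothetical weekly sensor features (e.g. temperature, radiation indices).
X = [[1.0, 0.9], [1.2, 1.1], [5.0, 5.2], [5.2, 4.9]]
y = ["low", "low", "high", "high"]
model = fit_gnb(X, y)
print(predict_gnb(model, [1.1, 1.0]))  # → low
print(predict_gnb(model, [5.1, 5.0]))  # → high
```

The paper's contribution is, in effect, to drive the contribution of redundant features in the `log_post` sum to zero via an L1 penalty fitted with a LARS-style algorithm.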
Procedia PDF Downloads 237
38252 Comparative Study of Expository and Simulation Method of Teaching Woodwork at Federal University of Technology, Minna, Nigeria
Authors: Robert Ogbanje Okwori
Abstract:
The research studied the expository and simulation methods of teaching woodwork at the Federal University of Technology, Minna, Niger State, Nigeria. The purpose of the study was to compare the expository and simulation methods of teaching woodwork and determine which method is more effective in improving the performance of students in woodwork. Two research questions and two hypotheses were formulated to guide the study. Fifteen objective questions and two theory questions were used for data collection; the questions set were on the structure of timber. The study used a quasi-experimental design. The population of the study consisted of 25 woodwork students at the 300 level of the Federal University of Technology, Minna, Niger State, Nigeria. The lesson plans for the expository method and the questions were validated by three lecturers in the Department of Industrial and Technology Education, Federal University of Technology, Minna, Nigeria. The validators checked the appropriateness of the test items, and all corrections and inputs were effected before administration of the instrument. Data obtained were analyzed using the mean, standard deviation, and t-test statistics. The null hypotheses were tested using the t-test statistic at the 0.05 level of significance. The findings of the study showed that the simulation method of teaching improved students' performance in woodwork and that the performance of the students was not influenced by gender. Based on the findings, it was concluded that there was a significant difference in the mean achievement scores of students taught woodwork using the simulation method, implying that the simulation method is more effective than the expository method of teaching woodwork. Therefore, woodwork teachers should adopt the simulation method of teaching woodwork for better performance.
It was recommended that the simulation method be used by woodwork lecturers to teach woodwork, since students perform better with the method, and that teachers be trained and re-trained in using the simulation method for teaching woodwork. Teachers should be encouraged to use the simulation method in their instructional delivery because it will allow them to identify their areas of strength and weakness when imparting knowledge to woodwork students. Government and various agencies should assist in procuring materials and equipment for wood workshops to enable students to effectively practice what they have been taught using the simulation method.
Keywords: comparative, expository, simulation, woodwork
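The t-test comparison used in the study can be sketched as follows, with invented achievement scores (Welch's form, which does not assume equal group variances, is shown; whether the study used the pooled or Welch variant is not stated):

```python
import math
import statistics

def welch_t(a, b):
    """Welch's two-sample t statistic (no equal-variance assumption)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

expository = [10, 12, 14]   # hypothetical achievement scores
simulation = [20, 22, 24]
t = welch_t(simulation, expository)
print(round(t, 3))  # → 6.124
```

A t statistic this large for such small groups would fall well beyond the 0.05 critical value, leading to rejection of the null hypothesis of equal means.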
Procedia PDF Downloads 428
38251 CFD Modeling of Boiling in a Microchannel Based On Phase-Field Method
Authors: Rahim Jafari, Tuba Okutucu-Özyurt
Abstract:
The hydrodynamics and heat transfer characteristics of a vaporized elongated bubble in a rectangular microchannel have been simulated based on the Cahn-Hilliard phase-field method. In the simulations, the initially nucleated bubble starts growing as it comes in contact with superheated water. The growing shape of the bubble was compared with the available experimental data in the literature.
Keywords: microchannel, boiling, Cahn-Hilliard method, simulation
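A one-dimensional explicit step of the Cahn–Hilliard equation, dφ/dt = ∇²(φ³ − φ − γ∇²φ), can be sketched as below; the grid, time step, and parameters are illustrative assumptions, not the paper's CFD setup. A key property worth verifying is that the scheme conserves total phase mass:

```python
def laplacian(f):
    """Discrete 1-D Laplacian with periodic boundaries (unit grid spacing)."""
    n = len(f)
    return [f[(i - 1) % n] - 2 * f[i] + f[(i + 1) % n] for i in range(n)]

def cahn_hilliard_step(phi, dt=1e-4, gamma=0.5):
    """One explicit Euler step of d(phi)/dt = lap(phi^3 - phi - gamma*lap(phi))."""
    mu = [p ** 3 - p - gamma * l for p, l in zip(phi, laplacian(phi))]
    return [p + dt * l for p, l in zip(phi, laplacian(mu))]

# A sharp vapor/liquid interface (phi = +1 vapor, -1 liquid) relaxes over time.
phi = [1.0] * 8 + [-1.0] * 8
for _ in range(100):
    phi = cahn_hilliard_step(phi)
print(abs(sum(phi)) < 1e-9)  # → True: total phase mass is conserved
```

Because the right-hand side is the Laplacian of a chemical potential, the update redistributes φ without creating or destroying it, which is the property that makes the phase-field method suitable for tracking a growing bubble.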
Procedia PDF Downloads 425
38250 Compartmental Model Approach for Dosimetric Calculations of ¹⁷⁷Lu-DOTATOC in Adenocarcinoma Breast Cancer Based on Animal Data
Authors: M. S. Mousavi-Daramoroudi, H. Yousefnia, S. Zolghadri, F. Abbasi-Davani
Abstract:
Dosimetry is an indispensable and valuable part of patient treatment planning, used to minimize the absorbed dose in vital tissues. In this study, in accordance with the suitable characteristics of DOTATOC and ¹⁷⁷Lu, after preparing ¹⁷⁷Lu-DOTATOC under optimal conditions for the first time in Iran, the radionuclidic and radiochemical purity of the solution was investigated using an HPGe spectrometer and the ITLC method, respectively. The biodistribution of the compound was assayed for the treatment of adenocarcinoma breast cancer in tumor-bearing BALB/c mice. The results demonstrate that ¹⁷⁷Lu-DOTATOC is a profitable selection for therapy of these tumors. Because of the vital role of internal dosimetry before and during therapy, efforts to improve the accuracy and speed of dosimetric calculations are necessary. For this reason, a new method was developed to calculate the absorbed dose by combining a compartmental model, animal dosimetry, data extrapolated from animal to human, and the MIRD method. Since the compartmental model is built on the experimental data, this approach may confidently increase the accuracy of the dosimetric data.
Keywords: ¹⁷⁷Lu-DOTATOC, biodistribution modeling, compartmental model, internal dosimetry
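The compartmental part of such a calculation can be sketched with a one-compartment example: activity decays with an effective constant (biological clearance plus ¹⁷⁷Lu physical decay), the time-activity curve is integrated numerically to a cumulated activity, and the MIRD scheme converts it to dose via an S value. The clearance rate and S value below are invented placeholders, not values from the study:

```python
import math

T_PHYS = 6.647 * 24.0            # Lu-177 physical half-life in hours
LAM_PHYS = math.log(2) / T_PHYS
LAM_BIO = 0.01                   # hypothetical biological clearance, 1/h
LAM_EFF = LAM_PHYS + LAM_BIO

def cumulated_activity(a0, t_end=2000.0, dt=0.1):
    """Trapezoidal integral of A(t) = a0 * exp(-LAM_EFF * t), in MBq*h."""
    total = 0.0
    for i in range(int(t_end / dt)):
        t0, t1 = i * dt, (i + 1) * dt
        total += 0.5 * dt * (a0 * math.exp(-LAM_EFF * t0)
                             + a0 * math.exp(-LAM_EFF * t1))
    return total

a0 = 10.0                        # uptake in the tumor compartment, MBq (assumed)
a_tilde = cumulated_activity(a0)
S_VALUE = 1e-4                   # hypothetical MIRD S value, Gy per MBq*h
print(a_tilde * S_VALUE)         # absorbed-dose estimate in Gy
```

For a mono-exponential curve the integral has the closed form a0/LAM_EFF, which provides a built-in check on the numerical integration; a multi-compartment model replaces the single exponential with the solution of coupled transfer equations.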
Procedia PDF Downloads 221
38249 Study of a Few Additional Posterior Projection Data to 180° Acquisition for Myocardial SPECT
Authors: Yasuyuki Takahashi, Hirotaka Shimada, Takao Kanzaki
Abstract:
A dual-detector SPECT system is widely used for myocardial SPECT studies. With 180-degree (180°) acquisition, reconstructed images are distorted in the posterior wall of the myocardium due to the lack of sufficient posterior projection data. We hypothesized that the quality of myocardial SPECT images can be improved by adding the acquisition of only a few posterior projections to ordinary 180° acquisition. The proposed acquisition method (the 180° plus acquisition method) uses the dual-detector SPECT system with a pair of detectors arranged perpendicularly at 90°. The sampling angle was 5°, and the acquisition range was 180°, from 45° right anterior oblique to 45° left posterior oblique. After the 180° acquisition, the detector moved to additional acquisition positions on the reverse side: once for 2 projections, twice for 4 projections, or three times for 6 projections. Since these acquisition methods cannot be performed on the present system, actual data acquisition was done over 360° with a sampling angle of 5°, and the projection data corresponding to the above acquisition positions were extracted for reconstruction. We performed phantom studies and a clinical study. SPECT images were compared by profile curve analysis and also quantitatively by contrast ratio. The distortion was improved by the 180° plus method. Profile curve analysis showed an increased contrast of the cardiac cavity. Analysis with the contrast ratio revealed that the SPECT images of the phantoms and the clinical study were improved over 180° acquisition by the present methods. The difference in contrast was not clearly recognized between 180° plus 2 projections, 180° plus 4 projections, and 180° plus 6 projections. The 180° plus 2 projections method may thus be feasible for myocardial SPECT, because both the distortion of the image and the contrast were improved.
Keywords: 180° plus acquisition method, a few posterior projections, dual-detector SPECT system, myocardial SPECT
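The contrast-ratio analysis can be sketched from a count profile across the left ventricle: taking the maximum (wall) and minimum (cavity) of the profile curve, a common contrast definition is (max − min)/(max + min). Whether this matches the paper's exact definition is an assumption, and the profiles below are invented:

```python
def profile_contrast(profile):
    """Contrast of a count profile: (max - min) / (max + min)."""
    hi, lo = max(profile), min(profile)
    return (hi - lo) / (hi + lo)

# Hypothetical wall-cavity-wall count profiles across the left ventricle.
profile_180 = [90, 35, 88]        # plain 180-degree acquisition
profile_180_plus2 = [95, 22, 94]  # 180 degrees plus 2 posterior projections
print(profile_contrast(profile_180) < profile_contrast(profile_180_plus2))  # → True
```

A deeper cavity minimum relative to the wall maximum raises this ratio, which is the direction of improvement the study reports for the 180° plus methods.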
Procedia PDF Downloads 296
38248 An Analysis of Oil Price Changes and Other Factors Affecting Iranian Food Basket: A Panel Data Method
Authors: Niloofar Ashktorab, Negar Ashktorab
Abstract:
Oil exports fund nearly half of Iran’s government expenditures, and for many years other countries have imposed various sanctions against Iran. Sanctions that primarily target Iran’s key energy sector have harmed Iran’s economy. The strategic effects of the sanctions might diminish as Iran adjusts to them economically. In this study, we evaluate the impact of the oil price and the sanctions against Iran on food commodity prices by using a panel data method. We find that the food commodity prices, the oil price, and the real exchange rate are stationary. The results show a positive effect of oil price changes, the real exchange rate, and the sanctions on food commodity prices.
Keywords: oil price, food basket, sanctions, panel data, Iran
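The direction of the estimated effect can be illustrated with the simplest building block of such a panel model, a bivariate least-squares slope of food prices on oil prices; the numbers are invented, and the study itself estimates a full panel specification including the exchange rate and sanctions:

```python
def ols_slope_intercept(x, y):
    """Closed-form least-squares fit of y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

oil = [40.0, 50.0, 60.0, 70.0]      # hypothetical oil price index
food = [81.0, 101.0, 121.0, 141.0]  # hypothetical food basket index
a, b = ols_slope_intercept(oil, food)
print(b)  # → 2.0 (a positive oil-price effect)
```

A panel estimator applies this logic across countries or commodities and time simultaneously, with fixed effects absorbing unit-specific levels.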
Procedia PDF Downloads 357
38247 Structural Health Monitoring of the 9-Story Torre Central Building Using Recorded Data and Wave Method
Authors: Tzong-Ying Hao, Mohammad T. Rahmani
Abstract:
The Torre Central building is a 9-story shear wall structure located in Santiago, Chile, and has been instrumented since 2009. Events of different intensity (ambient vibrations, weak and strong earthquake motions) have been recorded, and thus the building can serve as a full-scale benchmark to evaluate the structural health monitoring method developed. The first part of this article presents an analysis of inter-story drifts and of changes in the first system frequencies (estimated from the relative displacement response of the 8th floor with respect to the basement in the recorded data) as baseline indicators of the occurrence of damage. During the 2010 Chile earthquake, the system frequencies were observed to decrease by approximately 24% in the EW and 27% in the NS motions. Near the end of shaking, an increase of about 17% in the EW motion was detected. The structural health monitoring (SHM) method based on changes in wave traveling time (the wave method) within a layered shear beam model of the structure is presented in the second part of this article. If structural damage occurs, the velocity of waves propagating through the structure changes. The wave method measures the velocities of shear wave propagation from the impulse responses generated from data recorded at various locations inside the building. Our analysis and results show that the detected changes in wave velocities are consistent with the observed damage. On this basis, the wave method proves suitable for actual implementation in structural health monitoring systems.
Keywords: Chile earthquake, damage detection, earthquake response, impulse response, layered shear beam, structural health monitoring, Torre Central building, wave method, wave travel time
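The wave-method step of measuring travel time from impulse responses can be sketched as finding the lag that maximizes the cross-correlation between the base and roof responses; the sampling interval, building height, and signals below are invented for illustration:

```python
def best_lag(base, roof, max_lag):
    """Lag (in samples) maximizing the cross-correlation of roof vs. base."""
    def corr(lag):
        return sum(base[n] * roof[n + lag] for n in range(len(base) - lag))
    return max(range(max_lag + 1), key=corr)

# Toy impulse responses: the roof response is the base response delayed 3 samples.
base = [0.0, 1.0, 0.5, 0.2, 0.0, 0.0, 0.0, 0.0]
roof = [0.0, 0.0, 0.0, 0.0, 1.0, 0.5, 0.2, 0.0]
lag = best_lag(base, roof, max_lag=5)
dt, height = 0.01, 27.0          # s per sample and building height in m (assumed)
velocity = height / (lag * dt)   # shear-wave velocity estimate, m/s
print(lag)  # → 3
print(velocity)
```

A drop in this velocity between events, with the height fixed, is the damage indicator the method tracks.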
Procedia PDF Downloads 364
38246 Development of a Multi-User Country Specific Food Composition Table for Malawi
Authors: Averalda van Graan, Joelaine Chetty, Malory Links, Agness Mwangwela, Sitilitha Masangwi, Dalitso Chimwala, Shiban Ghosh, Elizabeth Marino-Costello
Abstract:
Food composition data is becoming increasingly important as dealing with food insecurity and malnutrition in its persistent form of under-nutrition is now coupled with increasing over-nutrition and its related ailments in the developing world, of which Malawi is not spared. In the absence of a food composition database (FCDB) inherent to our dietary patterns, efforts were made to develop a country-specific FCDB for nutrition practice, research, and programming. The main objective was to develop a multi-user, country-specific food composition database, and table from existing published and unpublished scientific literature. A multi-phased approach guided by the project framework was employed. Phase 1 comprised a scoping mission to assess the nutrition landscape for compilation activities. Phase 2 involved training of a compiler and data collection from various sources, primarily; institutional libraries, online databases, and food industry nutrient data. Phase 3 subsumed evaluation and compilation of data using FAO and IN FOODS standards and guidelines. Phase 4 concluded the process with quality assurance. 316 Malawian food items categorized into eight food groups for 42 components were captured. The majority were from the baby food group (27%), followed by a staple (22%) and animal (22%) food group. Fats and oils consisted the least number of food items (2%), followed by fruits (6%). Proximate values are well represented; however, the percent missing data is huge for some components, including Se 68%, I 75%, Vitamin A 42%, and lipid profile; saturated fat 53%, mono-saturated fat 59%, poly-saturated fat 59% and cholesterol 56%. A multi-phased approach following the project framework led to the development of the first Malawian FCDB and table. The table reflects inherent Malawian dietary patterns and nutritional concerns. The FCDB can be used by various professionals in nutrition and health. 
Rising over-nutrition, non-communicable diseases (NCDs), and changing diets create a growing demand for nutrient profiles of processed foods and for complete lipid profiles.
Keywords: analytical data, dietary pattern, food composition data, multi-phased approach
Procedia PDF Downloads 94
38245 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly: it is postulated that more than one billion bases will be produced per year in 2020. The growth rate of produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data: storing the data, searching it, and finding the hidden information in it. An analysis platform for genomics big data is therefore required. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one of the distributed computing frameworks and lies at the core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied for handling the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
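The abstract names the two ingredients but not the algorithm itself, so the following is only a minimal sketch of how a MapReduce-style k-mer count could feed a fuzzy membership function; the function names, thresholds, and toy reads are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

def mapper(read, k=4):
    """Map step: emit (k-mer, 1) pairs from one sequencing read."""
    for i in range(len(read) - k + 1):
        yield read[i:i + k], 1

def reducer(pairs):
    """Reduce step: sum the counts per k-mer."""
    counts = defaultdict(int)
    for kmer, n in pairs:
        counts[kmer] += n
    return counts

def fuzzy_abundance(count, low=2, high=10):
    """Fuzzy membership in the 'abundant' set: 0 at or below `low`,
    1 at or above `high`, linear ramp in between."""
    if count <= low:
        return 0.0
    if count >= high:
        return 1.0
    return (count - low) / (high - low)

reads = ["ACGTACGTAC", "ACGTACGTTT"]          # toy reads
pairs = [p for r in reads for p in mapper(r)]  # map phase
counts = reducer(pairs)                        # reduce phase
scores = {kmer: fuzzy_abundance(c) for kmer, c in counts.items()}
```

In a Hadoop deployment, `mapper` and `reducer` would run on separate nodes over partitions of the read set, which is where the parallelization mentioned in the abstract comes in.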
Procedia PDF Downloads 300
38244 Numerical Resolving of Net Faradaic Current in Fast-Scan Cyclic Voltammetry Considering Induced Charging Currents
Authors: Gabriel Wosiak, Dyovani Coelho, Evaldo B. Carneiro-Neto, Ernesto C. Pereira, Mauro C. Lopes
Abstract:
In this work, the theoretical and experimental effects of induced charging currents on fast-scan cyclic voltammetry (FSCV) are investigated. Induced charging currents arise from the effect of ohmic drop in electrochemical systems, which depends on the presence of an uncompensated resistance. They cause the capacitive contribution to the total current to differ from the capacitive current measured in the absence of electroactive species. The paper shows that the induced charging current is relevant when the capacitive current magnitude is close to the total current, even for systems with a low time constant. In these situations, the conventional background subtraction method may be inaccurate. A method is developed that separates the faradaic and capacitive currents by combining voltammetric experimental data with finite element simulation to obtain a potential-dependent capacitance. The method was tested in a standard electrochemical cell with platinum ultramicroelectrodes, under different experimental conditions as well as on previously reported literature data. The proposed method allows the real capacitive current to be separated even in situations where the conventional background subtraction method is clearly inappropriate.
Keywords: capacitive current, fast-scan cyclic voltammetry, finite-element method, electroanalysis
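As a numerical illustration of the correction step, the sketch below subtracts a capacitive contribution i_c = C(E)·v from the total current; the linear form of C(E) and all numeric values are hypothetical stand-ins for the potential-dependent capacitance that the paper obtains from finite element simulation.

```python
def capacitance(e):
    """Hypothetical potential-dependent capacitance C(E) in farads;
    in the paper this comes from finite element simulation."""
    return 1e-6 * (1.0 + 0.5 * e)

def faradaic_current(i_total, potentials, scan_rate):
    """Subtract the capacitive contribution i_c = C(E) * v from the
    measured total current to recover the faradaic part."""
    return [i - capacitance(e) * scan_rate
            for i, e in zip(i_total, potentials)]

scan_rate = 100.0                     # V/s, fast-scan regime
potentials = [0.0, 0.2, 0.4]          # V, toy sweep points
i_total = [1.0e-4, 1.2e-4, 1.5e-4]    # A, toy measured currents
i_f = faradaic_current(i_total, potentials, scan_rate)
```

The point of the method is that C(E) here already includes the induced charging effect, unlike a blank-solution background scan.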
Procedia PDF Downloads 75
38243 Detection of Keypoint in Press-Fit Curve Based on Convolutional Neural Network
Authors: Shoujia Fang, Guoqing Ding, Xin Chen
Abstract:
The quality of press-fit assembly is closely related to the reliability and safety of the product. This paper proposes a keypoint detection method based on a convolutional neural network to improve the accuracy of keypoint detection in press-fit curves, providing an auxiliary basis for judging the quality of press-fit assembly. A press-fit curve is a curve of press-fit force versus displacement. Since both force data and displacement data are time-series data, a one-dimensional convolutional neural network is used to process the press-fit curve. After the obtained press-fit data is filtered, a multi-layer one-dimensional convolutional neural network automatically learns features of the press-fit curve, which are then sent to a multi-layer perceptron that outputs the keypoint of the curve. We used data from press-fit assembly equipment in the actual production process to train the CNN model, and different data from the same equipment to evaluate detection performance. Compared with existing research results, the detection performance was significantly improved. This method can provide a reliable basis for the judgment of press-fit quality.
Keywords: keypoint detection, curve feature, convolutional neural network, press-fit assembly
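The abstract only names the architecture, so the following reduces the idea to its smallest ingredient: one hand-set 1D convolution filter that responds where the slope of a toy force curve changes. A real model would learn many such filters over filtered production data and feed them to a perceptron; the curve and kernel here are invented for illustration.

```python
def conv1d(signal, kernel):
    """Valid-mode 1D convolution (cross-correlation) over a series."""
    k = len(kernel)
    return [sum(s * w for s, w in zip(signal[i:i + k], kernel))
            for i in range(len(signal) - k + 1)]

# Toy press-fit force curve: gradual press, then a sharp seating step.
force = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 12.0, 13.0, 14.0]
# A difference kernel fires where the local slope changes.
kernel = [-1.0, 0.0, 1.0]
response = conv1d(force, kernel)
# Keypoint = centre of the window with the strongest response.
keypoint = max(range(len(response)), key=response.__getitem__) + 1
```

A trained 1D CNN replaces the hand-set kernel with learned ones, but the sliding-window mechanics are the same.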
Procedia PDF Downloads 231
38242 Appropriation of Cryptocurrencies as a Payment Method by South African Retailers
Authors: Neliswa Dyosi
Abstract:
Purpose - Using an integrated Technology-Organization-Environment (TOE) framework and the Model of Technology Appropriation (MTA) as a theoretical lens, this interpretive qualitative study seeks to understand and explain the factors that influence the appropriation, non-appropriation, and disappropriation of bitcoin as a payment method by South African retailers. Design/methodology/approach - The study adopts the interpretivist philosophical paradigm, with multiple case studies as the research strategy. For data collection, the study follows a qualitative approach: qualitative data will be collected from six retailers in various industries, using semi-structured interviews and documents as the data collection techniques. Purposive and snowball sampling techniques will be used to identify participants within the organizations, and data will be analyzed using thematic analysis. Originality/value - Using a deductive approach, the study seeks to provide a descriptive and explanatory contribution to theory. It contributes to theory development by integrating the MTA and TOE frameworks as a means to understand the technology adoption behaviors of organizations, in this case retailers. This is also the first study that takes such an integrated TOE-MTA approach to understand the adoption and use of a payment method. South Africa is ranked among the top ten countries in the world in cryptocurrency adoption, yet there is still a dearth of literature on the current state of adoption and usage of bitcoin as a payment method in South Africa. The study will contribute to the existing literature, as bitcoin is gaining popularity as an alternative payment method across the globe.
Keywords: cryptocurrency, bitcoin, payment methods, blockchain, appropriation, online retailers, TOE framework, disappropriation, non-appropriation
Procedia PDF Downloads 137
38241 A Comparative Study to Evaluate Chronological Age and Dental Age in the North Indian Population Using Cameriere's Method
Authors: Ranjitkumar Patil
Abstract:
Age estimation is important in forensic dentistry. Dental age estimation has emerged as an alternative to skeletal age determination, and methods based on the stages of tooth formation, as appreciated on radiographs, seem more appropriate for assessing age than those based on skeletal development. The study was done to evaluate dental age in the north Indian population using Cameriere's method. Aims/Objectives: The study was conducted to assess the dental age of north Indian children using Cameriere's method and to compare chronological age and dental age for validation of the method in this population. A comparative study of two years' duration was conducted on the OPG data (acquired with a PLANMECA ProMax 3D unit) of 497 individuals aged 5 to 15 years, selected by a simple random technique; ethical approval was obtained from the institutional ethical committee. The data were obtained based on inclusion and exclusion criteria and analyzed by software for dental age estimation. Statistical analysis: Student's t-test was used to compare the morphological variables of males with those of females and to compare observed age with estimated age. The regression formula was also calculated. Results: The present study compared 497 subjects, distributed between males and females, whose dental age was assessed on panoramic radiographs following the widely accepted method described by Cameriere. Statistical analysis indicated that gender does not have a significant influence on age estimation (R² = 0.787). Conclusion: This infers that Cameriere's method can be effectively applied to the north Indian population.
Keywords: forensic, dental age, skeletal age, chronological age, Cameriere's method
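The comparison of observed (chronological) and estimated (dental) age by Student's t-test can be sketched as a paired t statistic; the five age pairs below are toy values for illustration only, not the study's data, and no p-value lookup is included.

```python
import math
import statistics

def paired_t(observed, estimated):
    """Paired t statistic for chronological vs estimated ages;
    a t near zero indicates no systematic estimation bias."""
    diffs = [o - e for o, e in zip(observed, estimated)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)        # sample SD of the differences
    return mean_d / (sd_d / math.sqrt(n))

# Toy paired ages in years (chronological, dental estimate).
chron = [7.2, 9.5, 11.0, 12.4, 14.1]
dental = [7.0, 9.9, 10.8, 12.6, 13.9]
t = paired_t(chron, dental)
```

With real data, t would be compared against the t distribution with n - 1 degrees of freedom to judge significance.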
Procedia PDF Downloads 116
38240 Teacher’s Perception of Dalcroze Method Course as Teacher’s Enhancement Course: A Case Study in Hong Kong
Authors: Ka Lei Au
Abstract:
The Dalcroze method has been emerging in music classrooms, and music teachers are encouraged to integrate music and movement in their teaching. Music programs in colleges in Hong Kong have been introducing method courses, such as the Orff and Dalcroze methods, into music teaching as teacher's education programs. Since the targeted students of such a course are music teachers who decide what approach to use in their own classrooms, their perception is of significant value in identifying how applicable this approach is to their teaching, given the local teaching and learning culture and environment. This qualitative study aims to explore how the Dalcroze method as a teacher's education course is perceived by music teachers from three aspects: 1) application in music teaching, 2) self-enhancement, and 3) expectation. Through the lens of music teachers, data were collected by survey from 30 music teachers taking the Dalcroze method course in music teaching in Hong Kong. The findings reveal the value of the Dalcroze method in Hong Kong and teachers' intentions toward it. They also provide a significant reference for better development of such courses in the future, in adaptation to the culture, the teaching and learning environment, and teachers', students', and parents' perceptions of this approach.
Keywords: Dalcroze method, music teaching, perception, self-enhancement, teacher's education
Procedia PDF Downloads 405
38239 Optimizing Communications Overhead in Heterogeneous Distributed Data Streams
Authors: Rashi Bhalla, Russel Pears, M. Asif Naeem
Abstract:
In this 'information explosion era', analyzing data, a critical commodity, and mining knowledge from vertically distributed data streams incur a huge communication cost. However, efforts to decrease communication in a distributed environment can adversely affect classification accuracy; a research challenge therefore lies in maintaining a balance between transmission cost and accuracy. This paper proposes a method based on Bayesian inference to reduce the communication volume in a heterogeneous distributed environment while retaining prediction accuracy. Our experimental evaluation reveals that a significant reduction in communication can be achieved across a diverse range of dataset types.
Keywords: big data, Bayesian inference, distributed data stream mining, heterogeneous distributed data
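The abstract states the goal rather than the mechanism, so the sketch below illustrates only the underlying trade-off: a coordinator polls distributed sites one at a time and stops requesting data once a naive-Bayes-style posterior is confident enough, saving transmissions at a small cost in evidence gathered. The likelihood ratios and the confidence threshold are invented for illustration, not taken from the paper.

```python
import math

def posterior(prior, likelihood_ratios):
    """Combine a class prior with per-site likelihood ratios in
    log-odds space (naive Bayes independence assumption)."""
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    return 1 / (1 + math.exp(-log_odds))   # back to probability

def classify(prior, site_likelihoods, threshold=0.9):
    """Poll sites in order; stop (saving further transmissions) once
    the posterior is confident in either direction."""
    used = []
    p = prior
    for lr in site_likelihoods:
        used.append(lr)                    # one transmission per site
        p = posterior(prior, used)
        if p >= threshold or p <= 1 - threshold:
            break
    return p, len(used)                    # posterior, messages sent

p, msgs = classify(0.5, [3.0, 4.0, 2.0, 5.0])
```

Here only two of the four sites ever transmit, which is the communication reduction the paper aims for; the balance is tuned through the threshold.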
Procedia PDF Downloads 161
38238 The Identification of Environmentally Friendly People: A Case of South Sumatera Province, Indonesia
Authors: Marpaleni
Abstract:
The Intergovernmental Panel on Climate Change (IPCC) declared in 2007 that global warming and climate change are not just a series of events caused by nature, but rather are caused by human behaviour. Thus, reducing the impact of human activities on climate change requires information about how people respond to environmental issues and what constraints they face. However, information on these and other phenomena remains largely missing, or is not fully integrated within existing data systems. The proposed study aims to fill this gap in knowledge by focusing on the Environmentally Friendly Behaviour (EFB) of the people of Indonesia, taking the province of South Sumatera as a case study. EFB is defined as any activity in which people engage to improve the condition of natural resources and/or to diminish the impact of their behaviour on the environment. This activity is measured in terms of consumption in five areas at the household level, namely housing, energy, water usage, recycling, and transportation. By adopting Indonesia's Environmentally Friendly Behaviour survey conducted by Statistics Indonesia in 2013, this study aims to precisely identify one's orientation towards EFB based on socio-demographic characteristics such as age, income, occupation, location, education, gender, and family size. The results of this research will be useful for precisely identifying what support people require to strengthen their EFB, for identifying the specific constraints that different actors and groups face, and for developing a more holistic understanding of EFB in relation to particular demographic and socio-economic contexts. As the empirical data are drawn from the national sample framework, which will continue to be collected, they can also be used to forecast and monitor the future of EFB.
Keywords: environmentally friendly behavior, demographic, South Sumatera, Indonesia
Procedia PDF Downloads 285
38237 A Comprehensive Method of Fault Detection and Isolation based on Testability Modeling Data
Authors: Junyou Shi, Weiwei Cui
Abstract:
Testability modeling is a commonly used method in the testability design and analysis of systems. A dependency matrix is obtained from testability modeling, from which a quantitative evaluation of fault detection and isolation can be given. Based on the dependency matrix, a diagnosis tree can be derived; the tree provides the procedures for fault detection and isolation. In practice, however, the dependency matrix usually includes both built-in test (BIT) and manual test. BIT runs its tests automatically and is not limited by the procedures, so the method above cannot deliver a more efficient diagnosis that exploits the advantages of BIT. A comprehensive method of fault detection and isolation is therefore proposed, which combines the advantages of BIT and manual test by splitting the matrix. The result of the case study shows that the method is effective.
Keywords: fault detection, fault isolation, testability modeling, BIT
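A dependency matrix and the quantitative evaluation built on it can be illustrated in a few lines; the 3x3 matrix below is a made-up example rather than the paper's case study, and the proposed matrix splitting would amount to evaluating the BIT columns of such a matrix before the manual-test columns.

```python
# Toy dependency matrix: rows are faults F1..F3, columns tests T1..T3;
# a 1 means the test detects that fault.
D = [
    [1, 1, 0],  # F1
    [0, 1, 1],  # F2
    [1, 0, 1],  # F3
]

def detection_rate(matrix):
    """Fraction of faults detected by at least one test."""
    return sum(1 for row in matrix if any(row)) / len(matrix)

def isolate(syndrome, matrix):
    """Return the faults whose row matches the observed pass/fail
    syndrome exactly; a unique match means full isolation."""
    return [i for i, row in enumerate(matrix) if row == syndrome]

fdr = detection_rate(D)          # every fault is detectable here
faults = isolate([1, 1, 0], D)   # syndrome matches F1 only
```

Because every row of this toy matrix is distinct, each fault has a unique syndrome and is fully isolable; rows that coincide would form an ambiguity group.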
Procedia PDF Downloads 336
38236 Information Extraction Based on Search Engine Results
Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk
Abstract:
Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, which represents a new method of data gathering compared with traditional methods. Submitting a query for each of a large number of keywords by hand takes a long time and effort; hence, we developed a user interface program that searches automatically, taking multiple keywords at the same time, and left this program to collect the wanted data automatically. The collected raw data is processed using mathematical and statistical theories to eliminate unwanted data and convert the remainder into usable data.
Keywords: search engines, information extraction, agent system
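The interface program itself is not shown in the abstract, so the sketch below only mimics its shape: batch-query a pluggable engine for hit counts, then apply a simple statistical filter to discard unwanted outliers. The engine stub, query strings, and z-score cutoff are all illustrative assumptions.

```python
import statistics

def hit_counts(queries, search):
    """Collect raw result counts for a batch of queries; `search` is
    any callable returning a hit count (a real engine API or the
    interface program would be plugged in here)."""
    return {q: search(q) for q in queries}

def filter_outliers(counts, z=1.5):
    """Drop counts further than `z` standard deviations from the
    mean, a simple statistical cleanup of unwanted data."""
    values = list(counts.values())
    mu = statistics.mean(values)
    sd = statistics.pstdev(values)
    return {q: c for q, c in counts.items()
            if sd == 0 or abs(c - mu) <= z * sd}

# Stand-in for a live engine: fixed counts for four test queries.
fake_engine = {"data mining": 120, "data imputation": 80,
               "data cleaning": 100, "zzz noise": 10000}.get
raw = hit_counts(["data mining", "data imputation",
                  "data cleaning", "zzz noise"], fake_engine)
clean = filter_outliers(raw)
```

Swapping `fake_engine` for a call to an actual search API turns the same two-step pipeline into the automated multi-keyword collector the paper describes.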
Procedia PDF Downloads 430
38235 Big Data Strategy for Telco: Network Transformation
Abstract:
Big data has the potential to improve the quality of services; enable the infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents a new method to reverse this trend and improve profitability. The benefits of big data and next-generation networks, however, go well beyond improved customer relationship management. Next-generation networks are in a prime position to monetize rich supplies of customer information, while being mindful of legal and privacy issues. Transforming data assets into new revenue streams will become integral to high performance.
Keywords: big data, next generation networks, network transformation, strategy
Procedia PDF Downloads 361