Search results for: high correlated data
39511 Gut Metabolite Profiling of the Ethnic Groups from Assam, India
Authors: Madhusmita Dehingia, Supriyo Sen, Bhuwan Bhaskar, Tulsi Joishy, Mojibur R. Khan
Abstract:
Human gut microbes and their metabolites are important for maintaining homeostasis in the gut and are responsible for many metabolic and immune-mediated diseases. In the present study, we determined the profiles of the gut metabolites of five different ethnic groups (Bodo, Tai-Phake, Karbi, Tea tribe and Tai-Aiton) of Assam. Fecal metabolite profiling of the 39 individuals belonging to the ethnic groups was carried out using gas chromatography–mass spectrometry (GC-MS), and comparison was performed among the tribes for common and unique metabolites produced within their gut. Partial Least Squares Discriminant Analysis (PLS-DA) of the metabolites suggested that the individuals grouped according to their ethnicity. Among the 66 abundant metabolites, 12 metabolites were found to be common among the five ethnic groups. Additionally, some ethnicity-specific metabolites were also detected. For example, the Tea tribe of Assam had higher gut levels of the tea components aniline and benzoate than the other groups. Metabolites of microbial origin were also correlated with the already published metagenomic data of the same ethnic groups, and functional analysis was carried out based on the Human Metabolome Database.
Keywords: ethnicity, gut microbiota, GC-MS, metabolites
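As a rough illustration of the PLS-DA step described in this abstract, the sketch below projects a metabolite-abundance matrix onto two latent components with scikit-learn after one-hot encoding the group labels. All data, preprocessing choices and variable names here are stand-ins, not the authors' pipeline.

```python
# Minimal PLS-DA sketch (not the authors' pipeline): project GC-MS metabolite
# profiles onto two latent components and inspect grouping by ethnicity.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
groups = ["Bodo", "Tai-Phake", "Karbi", "Tea", "Tai-Aiton"]
n_samples, n_metabolites = 39, 66
X = rng.lognormal(size=(n_samples, n_metabolites))   # stand-in metabolite abundances
y = rng.choice(len(groups), size=n_samples)          # stand-in ethnicity labels

Y = np.eye(len(groups))[y]                           # one-hot encode classes (PLS-DA)
X_std = StandardScaler().fit_transform(np.log1p(X))  # log-transform and autoscale

pls = PLSRegression(n_components=2).fit(X_std, Y)
scores = pls.transform(X_std)                        # latent scores for a 2-D grouping plot
print(scores[:5])
```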
Procedia PDF Downloads 423
39510 Computational Identification of Signalling Pathways in Protein Interaction Networks
Authors: Angela U. Makolo, Temitayo A. Olagunju
Abstract:
The knowledge of signaling pathways is central to understanding the biological mechanisms of organisms, since it has been identified that in eukaryotic organisms the number of signaling pathways determines the number of ways the organism will react to external stimuli. Signaling pathways are studied using protein interaction networks constructed from protein-protein interaction data obtained using high throughput experimental procedures. However, these high throughput methods are known to produce very high rates of false positive and negative interactions. In order to construct a useful protein interaction network from this noisy data, computational methods are applied to validate the protein-protein interactions. In this study, we designed a computational technique to identify signaling pathways from a protein interaction network constructed using validated protein-protein interaction data. A weighted interaction graph of Saccharomyces cerevisiae (baker's yeast) was constructed, using proteins as the nodes and interactions between them as edges. The weights were obtained using a Bayesian probabilistic network to estimate the posterior probability of interaction between two proteins given the gene expression measurement as biological evidence. Only interactions above a threshold were accepted for the network model. A pathway was formalized as a simple path in the interaction network from a starting protein to an ending protein of interest. We were able to identify some pathway segments, one of which is a segment of the pathway that signals the start of the process of meiosis in S. cerevisiae.
Keywords: Bayesian networks, protein interaction networks, Saccharomyces cerevisiae, signalling pathways
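The pathway-search idea described above can be illustrated with a toy weighted graph: keep only edges whose posterior interaction probability clears a threshold, then enumerate simple paths between a start and an end protein. The edge list and probabilities below are invented for illustration; only the general procedure follows the abstract.

```python
# Toy sketch of the described pipeline: build a weighted protein interaction
# graph, keep edges whose posterior probability exceeds a threshold, and treat
# candidate pathway segments as simple paths between two proteins of interest.
import networkx as nx

# (protein_a, protein_b, posterior probability of true interaction) -- illustrative values
scored_interactions = [
    ("IME1", "UME6", 0.92), ("UME6", "RIM11", 0.81), ("RIM11", "IME2", 0.74),
    ("IME1", "IME2", 0.35), ("IME2", "NDT80", 0.88), ("UME6", "NDT80", 0.42),
]
THRESHOLD = 0.5

G = nx.Graph()
for a, b, p in scored_interactions:
    if p >= THRESHOLD:                      # accept only confident interactions
        G.add_edge(a, b, weight=p)

# Candidate pathway segments from a starting protein to an ending protein of interest.
for path in nx.all_simple_paths(G, source="IME1", target="NDT80", cutoff=5):
    print(path)
```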
Procedia PDF Downloads 545
39509 Learning from Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach
Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak
Abstract:
Computer vision systems recently made a big leap thanks to deep neural networks. However, these systems require correctly labeled large datasets in order to be trained properly, which is very difficult to obtain for medical applications. Two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes each sample very important in learning. As a result, if not handled properly, label noise significantly degrades the performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on a retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves the classification algorithm's performance in the presence of noisy labels.
Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity
Procedia PDF Downloads 161
39508 The Extent of Big Data Analysis by the External Auditors
Authors: Iyad Ismail, Fathilatul Abdul Hamid
Abstract:
This research mainly investigated the extent of big data analysis by external auditors. This paper adopts grounded theory as a framework for conducting a series of semi-structured interviews with eighteen external auditors. The findings describe the extent of big data availability and big data analysis usage by external auditors in the Gaza Strip, Palestine. The study's outcomes point to a series of auditing procedures that can improve external auditing techniques and lead to a high-quality audit process. This research is also valuable for auditing firms, giving insight into their mechanisms and identifying the most important strategies that help in achieving competitive audit quality. The results aim to guide academic and professional auditing institutions in developing big data analysis techniques for external auditors. This paper provides appropriate information for the decision-making process and a source of future information affecting technological auditing.
Keywords: big data analysis, external auditors, audit reliance, internal audit function
Procedia PDF Downloads 70
39507 The Effect of Acute Aerobic Exercise after Consumption of Four Different Diets on Serum Levels Irisin, Insulin and Glucose in Overweight Men
Authors: Majid Mardaniyan Ghahfarokhi, Abdolhamid Habibi, Majid Mohammad Shahi
Abstract:
Combining exercise and diet has been proposed as the most important strategy to reduce weight and control obesity-related factors, including irisin, insulin, and glucose. The aim of this study was to investigate the effect of aerobic exercise combined with four different diets on serum levels of irisin, insulin, and glucose in overweight men. Methods: In this quasi-experimental study, 8 overweight men (BMI 29.23±0.47) with an average age of 23±1.6 years voluntarily participated in 4 sessions at one-week intervals. The study was conducted in an exercise physiology lab. In each session, subjects performed a 30-minute treadmill test at 60-70% of maximum heart rate after consuming a high-carbohydrate, high-fat, high-protein, or normal diet. For biochemical measurement, three blood samples were taken: in the fasting state, two hours after the meal, and after exercise. Results: Statistical analysis showed that serum irisin levels decreased after consumption of all four diets, but this reduction was significant only after the high-fat diet (p ≤ 0.038). Serum insulin and glucose concentrations increased after consuming all four diets; however, the increase was significant only after the high-carbohydrate diet (p ≤ 0.001 and p ≤ 0.042, respectively). In addition, during exercise after consuming the normal, high-carbohydrate, high-protein, and high-fat diets, irisin increased significantly (p ≤ 0.021, p ≤ 0.049, p ≤ 0.001, and p ≤ 0.003, respectively), insulin decreased significantly (p ≤ 0.002, p ≤ 0.001, p ≤ 0.001, and p ≤ 0.002, respectively), and glucose decreased significantly (p ≤ 0.001, p ≤ 0.001, p ≤ 0.001, and p ≤ 0.002, respectively). The highest increase in irisin levels was observed after aerobic activity following the high-protein diet, and the greatest decrease in insulin and glucose levels was observed after aerobic exercise following the high-carbohydrate diet. Conclusion: It seems that diet alone, and exercise following different diets, can have a significant effect on serum irisin, insulin, and glucose levels in overweight young men.
Keywords: acute aerobic exercise, diet, irisin, overweight
Procedia PDF Downloads 259
39506 The Impact of Digital Inclusive Finance on the High-Quality Development of China's Export Trade
Authors: Yao Wu
Abstract:
In the context of financial globalization, China has put forward the policy goal of high-quality development, and the digital economy, with its advantage of information resources, is driving China's export trade to achieve high-quality development. Due to the long-standing financing constraints of small and medium-sized export enterprises, how to expand the export scale of small and medium-sized enterprises has become a major hurdle for the development of China's export trade. This paper firstly adopts the hierarchical analysis method to establish an evaluation system for the high-quality development of China's export trade; secondly, panel data for 30 provinces in China from 2011 to 2018 are selected for empirical analysis to establish a model of the impact of digital inclusive finance on the high-quality development of China's export trade; and, based on the analysis of the heterogeneous-firm trade model, a mediating effect model is established to verify the mediating role of credit constraints in the high-quality development of China's export trade. Based on the above analysis, this paper concludes that digital inclusive finance, with its unique digital and inclusive nature, alleviates the credit constraint problem among SMEs, enhances the binary marginal effect of SMEs' exports, optimizes their export scale and structure, and promotes the high-quality development of regional and even national export trade. Finally, based on the findings of this paper, we propose insights and suggestions for digital inclusive finance to promote the high-quality development of export trade.
Keywords: digital inclusive finance, high-quality development of export trade, fixed effects, binary marginal effects
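For readers unfamiliar with the fixed-effects setup named in the keywords, the minimal sketch below runs a two-way (province and year) fixed-effects regression of a placeholder export-quality index on a placeholder digital-inclusive-finance index; it is not the paper's indicator system or estimation code.

```python
# Bare-bones two-way fixed-effects sketch (LSDV form): regress an export-quality
# index on a digital inclusive finance index with province and year dummies.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
provinces = [f"prov_{i}" for i in range(30)]
years = list(range(2011, 2019))
panel = pd.DataFrame([(p, t) for p in provinces for t in years], columns=["province", "year"])
panel["dif_index"] = rng.normal(size=len(panel))               # digital inclusive finance (placeholder)
panel["export_quality"] = 0.3 * panel["dif_index"] + rng.normal(size=len(panel))

# C(province) and C(year) absorb the province and year fixed effects.
fe_model = smf.ols("export_quality ~ dif_index + C(province) + C(year)", data=panel).fit()
print(fe_model.params["dif_index"], fe_model.pvalues["dif_index"])
```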
Procedia PDF Downloads 93
39505 A Machine Learning Decision Support Framework for Industrial Engineering Purposes
Authors: Anli Du Preez, James Bekker
Abstract:
Data is currently one of the most critical and influential emerging technologies. However, the true potential of data is yet to be exploited since, currently, only about 1% of generated data is ever actually analyzed for value creation. There is a data gap where data is not explored due to the lack of data analytics infrastructure and the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen's framework development methodology. The study focused on machine learning algorithms, which are a subset of data analytics. The developed framework is designed to assist inexperienced data analysts in choosing the appropriate machine learning algorithm given the purpose of their application.
Keywords: Data analytics, Industrial engineering, Machine learning, Value creation
Procedia PDF Downloads 168
39504 Applying Laser Scanning and Digital Photogrammetry for Developing an Archaeological Model Structure for Old Castle in Germany
Authors: Bara' Al-Mistarehi
Abstract:
Documentation and assessment of the conservation state of an archaeological structure are significant procedures in any management plan. However, it has always been a challenge to apply this with a low-cost and safe methodology. It is also a time-demanding procedure. Therefore, a low-cost, efficient methodology for documenting the state of a structure is needed. In the scope of this research, digital photogrammetry and laser scanning are applied to one of the most significant structures in Germany, the Old Castle (German: Altes Schloss). The site is well known for its unique features. However, the castle suffers from serious deterioration threats because of the environmental conditions and the absence of continuous monitoring, maintenance and repair plans. Digital photogrammetry is a generally accepted technique for the collection of 3D representations of the environment. For this reason, this image-based technique has been extensively used to produce high quality 3D models of heritage sites and historical buildings for documentation and presentation purposes. Additionally, terrestrial laser scanners are used, which directly measure 3D surface coordinates based on the run-time of reflected light pulses. These systems feature high data acquisition rates, good accuracy and high spatial data density. Despite the potential of each single approach, in this research work maximum benefit is expected from a combination of data from both digital cameras and terrestrial laser scanners. Within the paper, the usage, application and advantages of the technique will be investigated in terms of building a highly realistic 3D textured model of some parts of the old castle. The model will be used as a diagnostic tool for the conservation state of the castle and as a means of monitoring future changes.
Keywords: Digital photogrammetry, Terrestrial laser scanners, 3D textured model, archaeological structure
Procedia PDF Downloads 179
39503 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters
Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu
Abstract:
Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system built on an innovative approach of ensemble machine learning and adaptive differentiation algorithms, and applies it to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare it with traditional anomaly detection approaches such as a static threshold and other deviation-based detection techniques. The experiment results show that our algorithm correctly identifies the unexpected performance variances of any running application, with an acceptable false positive rate. This proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability of daily data center operations.
Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning
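A generic, much-simplified version of the ensemble idea (not Alibaba's production algorithm) can be sketched by letting several deviation-based detectors vote on each data point, in contrast to a single static threshold:

```python
# Generic sketch of ensembling simple deviation-based detectors by majority vote.
# Illustrative only; not the production system described above.
import numpy as np

def zscore_detector(x, k=3.0):
    return np.abs(x - x.mean()) > k * x.std()

def iqr_detector(x, k=1.5):
    q1, q3 = np.percentile(x, [25, 75])
    return (x < q1 - k * (q3 - q1)) | (x > q3 + k * (q3 - q1))

def ewma_detector(x, alpha=0.2, k=3.0):
    ewma, flags = x[0], np.zeros_like(x, dtype=bool)
    for i, v in enumerate(x):
        flags[i] = abs(v - ewma) > k * x.std()
        ewma = alpha * v + (1 - alpha) * ewma     # adapt the baseline over time
    return flags

rng = np.random.default_rng(2)
latency = rng.normal(100, 5, 500)                 # synthetic response-time metric
latency[400:410] += 40                            # injected performance anomaly

votes = zscore_detector(latency).astype(int) + iqr_detector(latency) + ewma_detector(latency)
anomalies = np.where(votes >= 2)[0]               # flag points at least two detectors agree on
print(anomalies)
```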
Procedia PDF Downloads 201
39502 South African Students' Statistical Literacy in the Conceptual Understanding about Measures of Central Tendency after Completing Their High School Studies
Authors: Lukanda Kalobo
Abstract:
In South Africa, the High School Mathematics Curriculum provides teachers with specific aims and skills to be developed, which involve understanding the measures of central tendency. The exploration begins with definitions of statistical literacy and the measures of central tendency and a discussion of why statistical literacy is essential today. It furthermore discusses the statistical literacy basics involved in understanding the concepts of measures of central tendency. A statistical literacy test on the measures of central tendency was administered to 78 first-year students who had come directly from high school in order to collect data. The results indicated that students seemed to have forgotten the statistical literacy involved in understanding the concepts of measures of central tendency after completing their high school studies. The authors present inferences regarding the alignment between statistical literacy and the understanding of the concepts of the measures of central tendency, leading to the conclusion that there is a need to provide in-service and pre-service training.
Keywords: conceptual understanding, mean, median, mode, statistical literacy
Procedia PDF Downloads 304
39501 Item-Trait Pattern Recognition of Replenished Items in Multidimensional Computerized Adaptive Testing
Authors: Jianan Sun, Ziwen Ye
Abstract:
Multidimensional computerized adaptive testing (MCAT) is a popular research topic in psychometrics. It is important for practitioners to clearly know the item-trait patterns of administered items when a test like MCAT is operated. Item-trait pattern recognition refers to detecting which latent traits in a psychological test are measured by each of the specified items. If the item-trait patterns of the replenished items in the MCAT item pool are well detected, the interpretability of the items can be improved, which in turn helps the abilities of the examinees attending the MCAT to be accurately estimated. This research explores solving the item-trait pattern recognition problem of replenished items in the MCAT item pool from the perspective of statistical variable selection. The popular multidimensional item response theory model, the multidimensional two-parameter logistic model, is assumed to fit the MCAT response data. The proposed method uses the least absolute shrinkage and selection operator (LASSO) to detect item-trait patterns of replenished items based on the essential information of item responses and ability estimates of examinees collected from a designed MCAT procedure. Several advantages of the proposed method are outlined. First, the proposed method does not strictly depend on the relative order between the replenished items and the selected operational items, so it allows the replenished items to be mixed into the operational items in a reasonable order, such as one reflecting content constraints or other test requirements. Second, the LASSO used in this research improves the interpretability of the multidimensional replenished items in MCAT. Third, the proposed method exploits the advantage of the shrinkage idea for variable selection, so it can help check the item quality and key dimension features of replenished items and saves more time and labor in response data collection than the traditional factor analysis method. Moreover, the proposed method ensures that the dimensions of replenished items are recognized consistently with the dimensions of operational items in the MCAT item pool. Simulation studies are conducted to investigate the performance of the proposed method under different conditions, varying the dimensionality of the item pool, latent trait correlation, item discrimination, test length and item selection criteria in MCAT. Results show that the proposed method can accurately detect the item-trait patterns of the replenished items in both the two-dimensional and the three-dimensional item pools. Selecting enough operational items by Bayesian A-optimality from an item pool consisting of highly discriminating items can improve the recognition accuracy of item-trait patterns of replenished items for the proposed method. The pattern recognition accuracy for the conditions with correlated traits is better than for those with independent traits, especially for item pools consisting of comparatively low-discriminating items. To sum up, the proposed data-driven method based on the LASSO can accurately and efficiently detect the item-trait patterns of replenished items in MCAT.
Keywords: item-trait pattern recognition, least absolute shrinkage and selection operator, multidimensional computerized adaptive testing, variable selection
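The variable-selection idea can be sketched very roughly: for one replenished item, regress its binary responses on the examinees' ability estimates with an L1 penalty and read the item-trait pattern off the non-zero coefficients. The sketch below uses scikit-learn's L1-penalised logistic regression as a stand-in for the paper's LASSO formulation, with simulated data rather than an operational MCAT.

```python
# Simplified stand-in for the described LASSO step: detect which latent traits
# load on a replenished item from its responses and the examinees' ability
# estimates. All data below are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_examinees, n_traits = 2000, 3
theta = rng.normal(size=(n_examinees, n_traits))        # ability estimates (stand-in)

true_pattern = np.array([1.2, 0.0, 0.9])                # item truly measures traits 1 and 3
logits = theta @ true_pattern - 0.3                     # 2PL-style linear predictor
responses = (rng.random(n_examinees) < 1 / (1 + np.exp(-logits))).astype(int)

lasso_item = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
lasso_item.fit(theta, responses)
pattern = (np.abs(lasso_item.coef_[0]) > 1e-6).astype(int)   # non-zero -> trait is measured
print("recovered item-trait pattern:", pattern)
```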
Procedia PDF Downloads 130
39500 Early Hypothyroidism after Radiotherapy for Nasopharyngeal Carcinoma
Authors: Nejla Fourati, Zied Fessi, Fatma Dhouib, Wicem Siala, Leila Farhat, Afef Khanfir, Wafa Mnejja, Jamel Daoud
Abstract:
Purpose: The reported incidence of radiation-induced hypothyroidism in nasopharyngeal cancer (NPC) ranges from 15% to 55%. In reported data, it is considered a common late complication of definitive radiation and is mainly observed 2 years after the end of treatment. The aim of this study was to evaluate the incidence of early hypothyroidism within 6 months after radiotherapy. Patients and methods: From June 2017 to February 2020, 35 patients treated with concurrent chemo-radiotherapy (CCR) for NPC were included in this prospective study. Median age was 49 years [23-68] with a sex ratio of 2.88. All patients received intensity modulated radiotherapy (IMRT) at a dose of 69.96 Gy in 33 daily fractions with weekly cisplatin (40 mg/m²) chemotherapy. Thyroid stimulating hormone (TSH) and free thyroxine (FT4) levels were measured before the start of radiotherapy and 6 months after. Different dosimetric parameters for the thyroid gland were reported: the volume (cc), the mean dose (Dmean) and the percentage of the volume receiving more than 45 Gy (V45Gy). The Wilcoxon test was used to compare these different parameters between patients with or without hypothyroidism. Results: At baseline, 5 patients (14.3%) had hypothyroidism and were excluded from the analysis. For the remaining 30 patients, 9 patients (30%) developed hypothyroidism 6 months after the end of radiotherapy. The median thyroid volume was 10.3 cc [4.6-23]. The median Dmean and V45Gy were 48.3 Gy [43.15-55.4] and 74.8 [38.2-97.9], respectively. No significant difference was noted for any of the studied parameters. Conclusion: Early hypothyroidism occurring within 6 months after CCR for NPC seems to be a common complication (30%) that should be screened for. Good patient monitoring with regular measurement of TSH and FT4 makes it possible to treat hypothyroidism in the asymptomatic phase. This would be correlated with an improvement in the quality of life of these patients. The results of our study do not show a correlation between the thyroid doses and the occurrence of hypothyroidism. This is probably related to the high doses received by the thyroid in our series. These findings encourage more optimization to limit thyroid doses and thus the risk of radiation-induced hypothyroidism.
Keywords: nasopharyngeal carcinoma, hypothyroidism, early complication, thyroid dose
Procedia PDF Downloads 131
39499 Surveying Earthquake Vulnerabilities of District 13 of Kabul City, Afghanistan
Authors: Mohsen Mohammadi, Toshio Fujimi
Abstract:
High population and irregular urban development in Kabul city, Afghanistan's capital, are among the factors that increase its vulnerability to earthquake disasters (on top of its location in a highly seismic region); this can lead to widespread economic loss and casualties. This study aims to evaluate earthquake risks in Kabul's 13th district based on scientific data. The research data, which include hazard curves of Kabul, vulnerability curves, and a questionnaire survey through sampling in district 13, have been incorporated to develop risk curves. To estimate potential casualties, we used a set of M parameters in a model developed by Coburn and Spence. The results indicate that in the worst-case scenario, more than 90% of district 13, which comprises mostly residential buildings, is exposed to high risk; this may lead to nearly 1,000 million USD in economic loss and 120 thousand casualties (equal to 25.88% of the 13th district's population) for a nighttime earthquake. To reduce risks, we propose reconstructing the most vulnerable buildings, which are primarily adobe and masonry buildings. A comparison of risk reduction between reconstructing adobe and masonry buildings indicates that rebuilding adobe buildings would be more effective.
Keywords: earthquake risk evaluation, Kabul, mitigation, vulnerability
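As a loose illustration of the M-parameter style of casualty estimation (the exact Coburn and Spence formulation and the study's calibrated inputs are not reproduced here), a multiplicative chain of occupancy, entrapment and fatality factors might look like this:

```python
# Generic multiplicative casualty sketch in the spirit of an M-parameter model.
# Parameter names and values are illustrative placeholders only.
def estimate_casualties(collapsed_buildings, occupants_per_building,
                        occupancy_at_time, trapped_ratio, fatality_ratio):
    """Chain the factors: people exposed -> present -> trapped -> fatalities."""
    exposed = collapsed_buildings * occupants_per_building
    present = exposed * occupancy_at_time        # e.g. higher at night for residential stock
    trapped = present * trapped_ratio
    return trapped * fatality_ratio

# Illustrative night vs. day scenario for a district dominated by adobe/masonry housing.
night = estimate_casualties(collapsed_buildings=20_000, occupants_per_building=6,
                            occupancy_at_time=0.9, trapped_ratio=0.5, fatality_ratio=0.3)
day = estimate_casualties(20_000, 6, occupancy_at_time=0.4, trapped_ratio=0.5, fatality_ratio=0.3)
print(f"nighttime estimate: {night:,.0f}, daytime estimate: {day:,.0f}")
```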
Procedia PDF Downloads 281
39498 Evaluation of Fluidized Bed Bioreactor Process for Mmabatho Waste Water Treatment Plant
Authors: Shohreh Azizi, Wag Nel
Abstract:
The rapid population growth in South Africa has increased the requirement for wastewater treatment facilities. The aim of this study is to assess the potential use of a fluidized bed bioreactor for the Mmabatho sewage treatment plant. Samples were collected daily from the inlet and outlet of the reactor to analyze pH, Chemical Oxygen Demand (COD), Biochemical Oxygen Demand (BOD), and Total Suspended Solids (TSS) as per the standard method APHA 2005. The studies were undertaken at a continuous laboratory scale, and analytical data were collected before and after treatment. Reductions of 87.22% COD and 89.80% BOD were achieved. The fluidized bed bioreactor achieved BOD/COD removal as well as nutrient removal. Efforts were also made to study the impact on the biological system if the domestic wastewater gets contaminated with any industrial contamination, and the results show that the biological system can tolerate high total dissolved solids up to 6000 mg/L as well as high heavy metal concentrations up to 4 mg/L. The data obtained through the experimental research demonstrate that the FBBR may be used (<3 h total hydraulic retention time) for secondary treatment in the Mmabatho wastewater treatment plant.
Keywords: fluidized bed bioreactor, wastewater treatment plant, biological system, high TDS, heavy metal
Procedia PDF Downloads 167
39497 Levels of Selected Adipokines in Women with Gestational Diabetes and Type 2 Diabetes, Their Relationship to Metabolic Parameters
Authors: David Karasek, Veronika Kubickova, Ondrej Krystynik, Dominika Goldmannova, Lubica Cibickova, Jan Schovanek
Abstract:
Introduction: Adiponectin, adipocyte-fatty acid-binding protein (A-FABP), and Wnt1 inducible signaling pathway protein-1 (WISP-1) are adipokines particularly associated with insulin resistance. The aim of the study was to compare their levels in women with gestational diabetes (GDM), type 2 diabetes mellitus (T2DM) and healthy controls and to determine their relation with metabolic parameters. Methods: Fifty women with GDM, 50 women with T2DM, and 35 healthy women were included in the study. In addition to adipokines, anthropometric and lipid parameters and markers of insulin resistance and glucose control were assessed in all participants. Results: Compared to healthy controls, only significantly lower levels of adiponectin were detected in women with GDM, whereas lower levels of adiponectin and higher levels of A-FABP and WISP-1 were present in women with T2DM. Women with T2DM also had lower levels of adiponectin and higher levels of A-FABP compared to women with GDM. In women with GDM or T2DM, adiponectin correlated negatively with body mass index (BMI), triglycerides (TG), and C-peptide and positively with HDL-cholesterol; A-FABP correlated positively with BMI, TG, waist circumference, and C-peptide. Moreover, there was a positive correlation between WISP-1 and C-peptide in women with T2DM. Conclusion: Adverse adipokine production, reflecting dysfunctional fat tissue, is less pronounced in women with GDM than in women with T2DM, but more pronounced than in healthy women. Acknowledgment: Supported by AZV NV18-01-00139 and MH CZ DRO (FNOl, 00098892).
Keywords: adiponectin, adipocyte-fatty acid binding protein, wnt1 inducible signaling pathway protein-1, gestational diabetes, type 2 diabetes mellitus
Procedia PDF Downloads 134
39496 Blockchain Solutions for IoT Challenges: Overview
Authors: Amir Ali Fatoorchi
Abstract:
Regardless of the advantages of IoT devices, they have limitations such as storage, compute, and security problems. In recent years, a lot of blockchain-based research in IoT has been published and presented. In this paper, we present the security issues of IoT. IoT has three levels of security issues: low-level, intermediate-level, and high-level. We survey and compare blockchain-based solutions for high-level security issues and show how the underlying technology of Bitcoin and Ethereum could solve IoT problems.
Keywords: Blockchain, security, data security, IoT
Procedia PDF Downloads 210
39495 Analysis of School Burnout and Academic Motivation through Structural Equation Modeling
Authors: Ismail Seçer
Abstract:
The purpose of this study is to analyze the relationship between school burnout and academic motivation in high school students. The working group of the study consists of 455 students from the high schools in Erzurum city center, selected with an appropriate sampling method. The School Burnout Scale and the Academic Motivation Scale were used to collect data. Correlation analysis and structural equation modeling were used in the analysis of the data collected through the study. As a result of the study, it was determined that there are significant negative relations between school burnout and academic motivation, and that school burnout has direct and indirect significant effects on the "getting over himself", "using knowledge" and "exploration" dimensions through the latent variable of academic motivation. Lastly, it was determined that school burnout is a significant predictor of academic motivation.
Keywords: school burnout, motivation, structural equation modeling, university
Procedia PDF Downloads 325
39494 Spiritual Symbols of African Fruits as Responsive Catalysts for Naturopathy
Authors: Orogun Daniel Oghenekevhwe
Abstract:
Africa, being an agrarian continent, has an abundance of fruits that are both nutritional and medicinal. Despite the abundance of these healing elements, Africa leads global statistics on poor healthcare. Among others, there are two noticeable challenges in the healthcare system: poor access to and high cost of medical healthcare. The effects of both the access and economic constraints are (1) low responsiveness and (2) a high mortality rate. While the United Nations and the global health community continue to work towards reducing mortality rates and improving responsiveness to healthcare and wellness, this paper investigates how some Africans use the spiritual symbols of African fruits as responsive catalysts to embrace naturopathy, thereby reducing the effects and impacts of poor healthcare in Africa. The main argument is whether there are links between spiritual symbols and fruits that influence Africans' response to naturopathy and low-cost healthcare. Following that is the question of how medical healthcare responds to such development. Bitter kola (Garcinia) is the case study fruit, and Sunnyside in Pretoria, South Africa, has been identified as one of the high-traffic selling points of herbal fruits. A mixed research method is applied, with an expected 20 quantitative respondents among sellers and nutritionists and 50 qualitative respondents among consumers. Based on the results, it should be clear how spirituality contributes to alternative healthcare and how it can be further encouraged to bridge the gap between the high demand and low supply of healthcare in Africa and beyond.
Keywords: spiritual symbols, naturopathy, African fruits, spirituality, healthcare
Procedia PDF Downloads 71
39493 Medicompills Architecture: A Mathematical Precise Tool to Reduce the Risk of Diagnosis Errors on Precise Medicine
Authors: Adriana Haulica
Abstract:
Powered by machine learning, precise medicine is now tailored to use genetic and molecular profiling, with the aim of optimizing therapeutic benefits for cohorts of patients. As the majority of machine learning algorithms come from heuristics, the outputs have contextual validity. This is not very restrictive in the sense that medicine itself is not an exact science. Meanwhile, the progress made in molecular biology, bioinformatics, computational biology, and precise medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much of the available information as possible. The purpose of this paper is to present a deeper vision for the future of artificial intelligence in precise medicine. In fact, current machine learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from the classical methods prevents obtaining 100% evidence on the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept tool for information processing in precise medicine that delivers diagnosis and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine in a direct or indirect manner, but also technical databases, Natural Language Processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new, tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known "needle in a haystack" approach usually used when machine learning algorithms have to process differential genomic or molecular data to find biomarkers. Also, even if the input is drawn from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool. This approach deciphers the biological meaning of input data up to the metabolic and physiologic mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It iteratively translates input data into bio-semantic units with the help of contextual information until bio-logical operations can be performed on the basis of the "common denominator" rule. The rigor of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical "proofs". The major impact of this architecture is expressed by the high accuracy of the diagnosis. Delivered as a multiple-condition diagnosis, constituted by main diseases along with unhealthy biological states, this format is highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture is highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to the better design of clinical trials and speed them up.
Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics
Procedia PDF Downloads 70
39492 Implementation of Algorithm K-Means for Grouping District/City in Central Java Based on Macro Economic Indicators
Authors: Nur Aziza Luxfiati
Abstract:
Clustering is partitioning data sets into sub-sets or groups in such a way that elements share certain properties, with a high level of similarity within one group and a low level of similarity between groups. The K-means algorithm is one of the clustering algorithms most widely used as a grouping tool in scientific and industrial applications because the basic idea of the k-means algorithm is very simple. This research applies the clustering technique using the k-means algorithm as a method of addressing national development imbalances between regions in Central Java Province based on macroeconomic indicators. The data sample used is secondary data obtained from the Central Java Provincial Statistics Agency regarding macroeconomic indicators, which is part of the publication of the 2019 National Socio-Economic Survey (Susenas) data. Outliers are detected using the z-score, and the number of clusters (k) is determined using the elbow method. After the clustering process is carried out, the validation is tested using the Between-Class Variation (BCV) and Within-Class Variation (WCV) methods. The results showed that outlier detection using z-score normalization found no outliers. In addition, the clustering test obtained a ratio value that was not high, namely 0.011%. There are two district/city clusters in Central Java Province which have economic similarities based on the variables used, namely the first cluster with a high economic level consisting of 13 districts/cities and the second cluster with a low economic level consisting of 22 districts/cities. Within the second, low-economy cluster, the authors further grouped districts/cities based on similarities in macroeconomic indicators: 20 districts by Gross Regional Domestic Product, 19 districts by Poverty Depth Index, 5 districts by Human Development, and 10 districts by Open Unemployment Rate.
Keywords: clustering, K-Means algorithm, macroeconomic indicators, inequality, national development
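A compact sketch of the workflow described above, with scikit-learn and placeholder indicator values: z-score normalisation, an elbow check on inertia, k-means with k = 2, and a simple within-/between-cluster variation ratio.

```python
# Compact sketch of the described workflow. The macroeconomic indicator values
# below are random placeholders, not the Central Java data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
indicators = rng.normal(size=(35, 4))            # 35 districts/cities x 4 macro indicators
X = StandardScaler().fit_transform(indicators)   # z-score normalisation (outlier check via |z| > 3)

# Elbow method: inspect how inertia (within-cluster variation) drops as k grows.
inertias = {k: KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_ for k in range(1, 7)}
print(inertias)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
wcv = km.inertia_                                 # within-class variation
bcv = np.sum((X - X.mean(axis=0)) ** 2) - wcv     # between-class variation (total SS minus WCV)
print("cluster sizes:", np.bincount(km.labels_), "WCV/BCV ratio:", wcv / bcv)
```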
Procedia PDF Downloads 158
39491 Providing Security to Private Cloud Using Advanced Encryption Standard Algorithm
Authors: Annapureddy Srikant Reddy, Atthanti Mahendra, Samala Chinni Krishna, N. Neelima
Abstract:
In our present world, we are generating a lot of data, and we need a specific device to store all these data. Generally, we store data in pen drives, hard drives, etc. Sometimes we may lose the data due to the corruption of devices. To overcome all these issues, we implemented a cloud space for storing the data, and it provides more security to the data. We can access the data just using the internet from anywhere in the world. We implemented all of this in Java using the NetBeans IDE. Once a user uploads the data, he does not have any rights to change it. Users' uploaded files are stored in the cloud with the system time as the file name, and the directory is created with some random words. The cloud accepts the data only if the size of the file is less than 2 MB.
Keywords: cloud space, AES, FTP, NetBeans IDE
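A client-side sketch of the same idea in Python, using the `cryptography` package's AES-GCM primitive to encrypt a file and enforce the 2 MB limit before upload; the upload call and file name are placeholders, and this is not the Java/NetBeans implementation described above.

```python
# Client-side sketch: enforce the 2 MB limit, encrypt with AES-256-GCM, then hand
# the ciphertext to an upload routine (placeholder). Not the authors' Java code.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

MAX_SIZE = 2 * 1024 * 1024                       # cloud accepts files under 2 MB only

def encrypt_for_cloud(path: str, key: bytes) -> bytes:
    data = open(path, "rb").read()
    if len(data) >= MAX_SIZE:
        raise ValueError("file exceeds the 2 MB upload limit")
    nonce = os.urandom(12)                       # unique nonce per encryption
    ciphertext = AESGCM(key).encrypt(nonce, data, None)
    return nonce + ciphertext                    # store the nonce alongside the ciphertext

def decrypt_from_cloud(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)
# blob = encrypt_for_cloud("report.pdf", key)    # hypothetical file name
# upload(blob)                                   # placeholder for the FTP/cloud transfer
```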
Procedia PDF Downloads 206
39490 Efficient Storage and Intelligent Retrieval of Multimedia Streams Using H. 265
Authors: S. Sarumathi, C. Deepadharani, Garimella Archana, S. Dakshayani, D. Logeshwaran, D. Jayakumar, Vijayarangan Natarajan
Abstract:
The need of the hour for the customers who use a dial-up or a low broadband connection for their internet services is to access HD video data. This can be achieved by developing a new video format using H. 265. This is the latest video codec standard developed by the ISO/IEC Moving Picture Experts Group (MPEG) and the ITU-T Video Coding Experts Group (VCEG) in April 2013. This new standard for video compression has the potential to deliver higher performance than earlier standards such as H. 264/AVC. In comparison with H. 264, HEVC offers a clearer, higher quality image at half the original bitrate. At this lower bitrate, it is possible to transmit high definition videos using low bandwidth. It doubles the data compression ratio, supporting 8K Ultra HD and resolutions up to 8192×4320. In the proposed model, we design a new video format which supports this H. 265 standard. The major application areas in the coming future would include enhancements in the performance of digital television services like Tata Sky and Sun Direct, Blu-ray Discs, mobile video, video conferencing, and internet and live video streaming.
Keywords: access HD video, H. 265 video standard, high performance, high quality image, low bandwidth, new video format, video streaming applications
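In practice, one common way to produce an H. 265/HEVC stream is FFmpeg's libx265 encoder; the generic transcoding wrapper below is only an illustration and not the custom video format proposed in the abstract.

```python
# Generic H.265/HEVC transcode via FFmpeg's libx265 encoder (not the custom
# format proposed above). A CRF around 28 in x265 is commonly cited as giving
# quality roughly comparable to x264 at a substantially lower bitrate.
import subprocess

def encode_hevc(src: str, dst: str, crf: int = 28) -> None:
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c:v", "libx265", "-crf", str(crf), "-preset", "medium",
         "-c:a", "copy", dst],
        check=True,
    )

# encode_hevc("lecture_1080p.mp4", "lecture_1080p_hevc.mp4")  # hypothetical file names
```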
Procedia PDF Downloads 354
39489 High-Quality Flavor of Black Belly Pork under Lightning Corona Discharge Using Tesla Coil for High Voltage Education
Authors: Kyung-Hoon Jang, Jae-Hyo Park, Kwang-Yeop Jang, Dongjin Kim
Abstract:
The Tesla coil is an electrical resonant transformer circuit designed by inventor Nikola Tesla in 1891. It is used to produce high-voltage, low-current and high-frequency alternating current electricity. Tesla experimented with a number of different configurations consisting of two or sometimes three coupled resonant electric circuits. This paper focuses on applying a Tesla coil to cuisine for high-quality flavor and taste conditioning, as well as on high-voltage education under 50 kV corona discharge. The results revealed that roasting black belly pork with a Tesla coil is faster than conventional methods such as a hot grill or steel plate, depending on the applied voltage level and application time. In addition, carbohydrate and crude protein increased, whereas sodium and saccharides significantly decreased after the lightning surge by the Tesla coil. This idea will be useful in high voltage education and high voltage application.
Keywords: corona discharge, Tesla coil, high voltage application, high voltage education
Procedia PDF Downloads 328
39488 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EV), low energy consumption systems have become more and more important. One of the key technologies to realize low energy consumption is a dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several features, such as high temporal resolution, which can achieve 1 Mframe/s, and a high dynamic range (120 dB). However, the point that can contribute to low energy consumption the most is its sparsity; to be more specific, this sensor only captures the pixels that have an intensity change. In other words, there is no signal in the area that does not have any intensity change. That is to say, this sensor is more energy efficient than conventional sensors such as RGB cameras because we can remove redundant data. On the other hand, it is difficult to handle the data because the data format is completely different from an RGB image; for example, acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, polarity (two values: +1 or -1) and a time stamp; it does not include intensity such as RGB values. Therefore, as we cannot use existing algorithms straightforwardly, we have to design a new processing algorithm to cope with DVS data. In order to solve difficulties caused by data format differences, most of the prior art makes frame data and feeds it to deep learning models such as Convolutional Neural Networks (CNN) for object detection and recognition purposes. However, even though we can feed the data, it is still difficult to achieve good performance due to a lack of intensity information. Although polarity is often used as intensity instead of RGB pixel value, it is apparent that polarity information is not rich enough. Considering this context, we proposed to use the timestamp information as a data representation that is fed to deep learning. Concretely, we also make frame data divided by a certain time period, then assign an intensity value according to the timestamp in each frame; for example, a high value is given to a recent signal. We expected that this data representation could capture the features, especially of moving objects, because the timestamp represents the movement direction and speed. Using this proposed method, we made our own dataset with a DVS fixed on a parked car to develop an application for a surveillance system that can detect persons around the car. We think DVS is one of the ideal sensors for surveillance purposes because this sensor can run for a long time with low energy consumption in a non-dynamic situation. For comparison purposes, we reproduced a state-of-the-art method as a benchmark, which makes frames in the same way as ours and feeds polarity information to a CNN. Then, we measured the object detection performances of the benchmark and ours on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
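The timestamp-based representation can be sketched in a few lines of NumPy: within each frame window, a pixel takes a value that decays with the age of its most recent event, so recent motion appears brighter. The resolution, window length and decay constant below are arbitrary choices, not the paper's settings.

```python
# Sketch of a timestamp-based frame: each pixel takes a value that decays with
# the age of its most recent event, so recent motion appears brighter.
import numpy as np

def events_to_time_surface(events, height, width, t_ref, tau=30e-3):
    """events: array of (x, y, polarity, timestamp) rows, timestamps in seconds."""
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, _pol, t in events:
        age = t_ref - t                                   # seconds since the event fired
        frame[int(y), int(x)] = max(frame[int(y), int(x)], np.exp(-age / tau))
    return frame                                          # values in (0, 1], 0 = no event

# Three illustrative events inside a 33 ms window ending at t_ref = 0.100 s.
events = np.array([[10, 5, 1, 0.070], [10, 6, -1, 0.090], [40, 20, 1, 0.099]])
surface = events_to_time_surface(events, height=48, width=64, t_ref=0.100)
print(surface[5, 10], surface[6, 10], surface[20, 40])
```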
Procedia PDF Downloads 97
39487 Anthropometric Parameters of Classroom Furniture in Public and Private Universities of Karachi
Authors: Farhan Iqbal
Abstract:
Ergonomics has implications in the classroom. The present study aimed at finding out the comfort level of university students with classroom furniture, which may affect students' learning. Two public and one private institution were targeted. Purposive sampling was done. Four hundred and seventy-five students volunteered to reply to a questionnaire. Different furniture items were measured and descriptively compared with the ISO 5970 standard. Overall discomfort was found to be statistically significant as compared to comfort. Comfort and discomfort were found to be negatively correlated. Genders did not differ on upper-body discomfort, though the median score showed men to be more comfortable in the upper body. GPA was found to be independent of comfort level. The most afflicted areas were the neck, shoulders, upper back, lower back and pelvis. The present study will be helpful for all educational institutions of Pakistan. Future studies may be carried out with structural and functional anthropometric data of students for redesigning the classroom furniture.
Keywords: anthropometry, classroom furniture, comfort, discomfort, learning
Procedia PDF Downloads 311
39486 Ownership, Management Responsibility and Corporate Performance of the Listed Firms in Kazakhstan
Authors: Gulnara Moldasheva
Abstract:
The research explores the relationship between management responsibility and corporate governance of listed companies in Kazakhstan. This research employs firm-level data of randomly selected listed non-financial firms and firm-level data of the "operational" financial sector, consisting of the banking sector, insurance companies and accumulated pension funds, using multivariate regression analysis under a fixed-effect model approach. Ownership structure includes institutional ownership, managerial ownership and private investors' ownership. Management responsibility of the firm is expressed by the firm's decision on the amount of leverage. Results of the cross-sectional panel study for non-financial firms showed that only institutional shareholding is significantly negatively correlated with the debt-to-equity ratio. Findings from the "operational" financial sector show that leverage is significantly affected only by CEO/Chair duality and the size of financial institutions, and insignificantly affected by ownership structure. Also, the findings show that there is a significant negative relationship between profitability and the debt-to-equity ratio for non-financial firms, which is consistent with the pecking order theory. Generally, the results suggest that corporate governance and management responsibility play an important role in the corporate performance of listed firms in Kazakhstan.
Keywords: ownership, corporate governance, debt to equity ratio, corporate performance
Procedia PDF Downloads 343
39485 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network
Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang
Abstract:
‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics applying a range of mathematical, statistical and rule-based approaches can be used to reveal important information on demand from the available data provided at second, minute or hourly intervals. This study aims to combine these two concepts to improve on the current water end-use disaggregation problem by applying a wide range of the most advanced pattern recognition techniques to analyse concurrent high-resolution water-energy consumption data. The obtained results have shown that recognition accuracies of all end-uses have significantly increased, especially for mechanised categories, including the clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
Keywords: deep learning network, smart metering, water end use, water-energy data
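A minimal Keras sketch of this kind of classifier, operating on fixed-length windows of paired water and energy readings, is given below; the architecture, window length and category list are illustrative and not the model used in the study.

```python
# Minimal sketch: classify fixed-length windows of paired water/energy readings
# into end-use categories. Architecture and category list are illustrative only.
import numpy as np
import tensorflow as tf

CATEGORIES = ["clothes washer", "dishwasher", "evaporative cooler", "shower", "tap"]
WINDOW = 120                                     # samples per event window
N_CHANNELS = 2                                   # water flow + concurrent energy use

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_CHANNELS)),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(64, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(len(CATEGORIES), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Random stand-in data; real training would use labelled smart-meter event windows.
X = np.random.rand(256, WINDOW, N_CHANNELS).astype("float32")
y = np.random.randint(0, len(CATEGORIES), size=256)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```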
Procedia PDF Downloads 306
39484 Delineation of Subsurface Tectonic Structures Using Gravity, Magnetic and Geological Data, in the Sarir-Hameimat Arm of the Sirt Basin, NE Libya
Authors: Mohamed Abdalla Saleem, Hana Ellafi
Abstract:
The study area is located in the eastern part of the Sirt Basin, in the Sarir-Hameimat arm of the basin, south of the Amal High. The area covers the northern part of the Hameimat Trough and the Rakb High. All of these tectonic elements are part of the major and common tectonics that were created when the old Sirt Arch collapsed, and most of them are trending NW-SE. This study has been conducted to investigate the subsurface structures and the sedimentological characterization of the area and to attempt to define its tectonic and stratigraphic development. About 7,600 land gravity measurements, 22,500 gridded magnetic data points, and petrographic core data from some wells were used to investigate the subsurface structural features both vertically and laterally. A third-order separation of the regional trends from the original Bouguer gravity data has been chosen. The residual gravity map reveals a significant number of high anomalies distributed in the area, separated by a group of thick sediment centers. The reduction-to-the-pole magnetic map also shows nearly the same major trends and anomalies in the area. Applying further interpretation filters reveals that these high anomalies are sourced from different depth levels; some are deep-rooted, and others are intruded igneous bodies within the sediment layers. The petrographic sedimentology study for some wells in the area confirmed the presence of these igneous bodies and defined their composition as most likely to be gabbro hosted by marine shale layers. Depth investigation of these anomalies by the average depth spectrum shows that the average basement depth is about 7.7 km, while the top of the intrusions is about 2.65 km, and some near-surface magnetic sources are about 1.86 km. The depth values of the magnetic anomalies and their locations were estimated specifically using the 3D Euler deconvolution technique. The obtained results suggest that the maximum depth of the sources is about 4,938 m. The total horizontal gradient of the magnetic data shows that the trends are mostly extending NW-SE, others are NE-SW, and a third group has an N-S extension. This variety in trend direction shows that the area experienced different tectonic regimes throughout its geological history.
Keywords: sirt basin, tectonics, gravity, magnetic
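The 3D Euler deconvolution step mentioned above can be illustrated with a bare-bones windowed least-squares solve of the Euler homogeneity equation for the source position and base level; the field values, gradients and structural index below are placeholders, not the study's gridded data.

```python
# Bare-bones windowed 3D Euler deconvolution sketch: solve the homogeneity
# equation in least squares for the source position (x0, y0, z0) and base level B.
import numpy as np

def euler_window(x, y, z, T, Tx, Ty, Tz, structural_index):
    """x, y, z, field T and gradients Tx, Ty, Tz are 1-D arrays over one data window."""
    N = structural_index
    # Rearranged Euler equation: x0*Tx + y0*Ty + z0*Tz + N*B = x*Tx + y*Ty + z*Tz + N*T
    A = np.column_stack([Tx, Ty, Tz, N * np.ones_like(T)])
    b = x * Tx + y * Ty + z * Tz + N * T
    solution, *_ = np.linalg.lstsq(A, b, rcond=None)
    x0, y0, z0, base = solution
    return x0, y0, z0, base          # z0 is the estimated source depth

# Tiny synthetic window (values are arbitrary stand-ins for gridded magnetic data).
rng = np.random.default_rng(5)
x = rng.uniform(0, 1000, 25); y = rng.uniform(0, 1000, 25); z = np.zeros(25)
Tx, Ty, Tz = rng.normal(size=(3, 25))
T = rng.normal(size=25)
print(euler_window(x, y, z, T, Tx, Ty, Tz, structural_index=1.0))
```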
Procedia PDF Downloads 67
39483 Machine Learning Algorithms for Rocket Propulsion
Authors: Rômulo Eustáquio Martins de Souza, Paulo Alexandre Rodrigues de Vasconcelos Figueiredo
Abstract:
In recent years, there has been a surge in interest in applying artificial intelligence techniques, particularly machine learning algorithms. Machine learning is a data-analysis technique that automates the creation of analytical models, making it especially useful for modeling complex situations. As a result, this technology aids in reducing human intervention while producing accurate results. This methodology is also extensively used in aerospace engineering, since this is a field that encompasses several high-complexity operations, such as rocket propulsion. Rocket propulsion is a high-risk operation in which engine failure could result in the loss of life. As a result, it is critical to use computational methods capable of precisely representing the spacecraft's analytical model to guarantee its safety and operation. Thus, this paper describes the use of machine learning algorithms for rocket propulsion to show that this technique is an efficient way to deal with challenging and restrictive aerospace engineering activities. The paper focuses on three machine-learning-aided rocket propulsion applications: set-point control of an expander-bleed rocket engine, supersonic retro-propulsion of a small-scale rocket, and leak detection and isolation on rocket engine data. This paper describes the data-driven methods used for each implementation in depth and presents the obtained results.
Keywords: data analysis, modeling, machine learning, aerospace, rocket propulsion
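As a generic illustration of the leak detection and isolation task (not the method of any study cited above), an unsupervised detector can be trained on nominal engine sensor traces and used to flag off-nominal samples:

```python
# Illustrative unsupervised leak-detection sketch on simulated engine sensor data
# (chamber pressure, fuel and oxidizer line pressures, nozzle temperature).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(6)
nominal = rng.normal(loc=[60.0, 80.0, 75.0, 900.0], scale=[1.0, 1.5, 1.5, 10.0], size=(2000, 4))
leak = nominal[-50:].copy()
leak[:, 1] -= np.linspace(0, 15, 50)             # fuel-line pressure decays during a simulated leak

detector = IsolationForest(contamination=0.03, random_state=0).fit(nominal[:1500])
labels = detector.predict(np.vstack([nominal[1500:], leak]))   # -1 = anomalous sample
print("flagged samples:", int((labels == -1).sum()))
```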
Procedia PDF Downloads 115
39482 A Simple Device for Characterizing High Power Electron Beams for Welding
Authors: Aman Kaur, Colin Ribton, Wamadeva Balachandaran
Abstract:
Electron beam welding, due to its inherent advantages, is extensively used for material processing where high precision is required. Especially in the aerospace and nuclear industries, there are high quality requirements, and the cost of materials and processes is very high, which makes it very important to ensure the beam quality is maintained and checked prior to carrying out the welds. Although the processes in these industries are highly controlled, even minor changes in the operating parameters of the electron gun can cause variations in the beam quality large enough to result in poor welding. To measure the beam quality, a simple device has been designed that can be used at high powers. The device consists of two slits in the x and y axes, which collect a small portion of the beam current when the beam is deflected over the slits. The signals received from the device are processed in data acquisition hardware and the dedicated software developed for the device. The device has been used in controlled laboratory environments to analyse the signals and the weld quality relationships by varying the focus current. The results showed matching trends in the weld dimensions and the beam characteristics. Further experimental work is being carried out to determine the ability of the device and signal processing software to detect subtle changes in the beam quality and to relate these to the physical weld quality indicators.
Keywords: electron beam welding, beam quality, high power, weld quality indicators
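One generic way such slit signals can be reduced to a beam-quality figure is to reconstruct the current profile along one axis and compute its full width at half maximum; the synthetic profile below is only a sketch, not the device's dedicated software.

```python
# Generic reduction of a slit-probe signal to a beam-width figure: take the
# current profile collected as the beam sweeps across one slit and compute its
# full width at half maximum (FWHM). The profile below is synthetic.
import numpy as np

def fwhm(position_mm, current):
    half = current.min() + 0.5 * (current.max() - current.min())
    above = np.where(current >= half)[0]                # indices inside the half-maximum band
    return position_mm[above[-1]] - position_mm[above[0]]

position = np.linspace(-2.0, 2.0, 400)                  # deflection across the x slit, mm
profile = 5.0 * np.exp(-position**2 / (2 * 0.25**2))    # synthetic Gaussian beam, sigma = 0.25 mm
profile += np.random.default_rng(7).normal(0, 0.05, position.size)

print(f"estimated beam FWHM: {fwhm(position, profile):.2f} mm")   # ~2.355 * sigma ≈ 0.59 mm
```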
Procedia PDF Downloads 324