Search results for: thermochemical database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1632

1422 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network

Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

We consider a database that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. While models learned from these data that can predict future traffic speed would benefit applications such as car navigation systems, building predictive models for every link becomes a nontrivial job if the number of links in a given network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building; on the other hand, k-NN takes a long time to make a prediction because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale is that we are better off looking at only the recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different sets of features, namely the current and past traffic speeds of the target link and of the neighboring links up- and down-stream. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
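As a toy illustration of the idea described above, searching only a recent window of the speed database rather than all of it, a minimal k-NN predictor might look like the following sketch; the feature vectors, speeds, and the `window` parameter are invented for illustration and are not the authors' actual implementation.

```python
def knn_predict_speed(history, query, k=3, window=None):
    """Predict the next speed of a link with k-NN over past feature vectors.

    history: list of (feature_vector, next_speed) pairs, oldest first.
    query:   current feature vector (e.g., current/past speeds of the target
             link and its up/down-stream neighbors).
    window:  if set, search only the most recent `window` records -- the
             speed-up idea discussed in the abstract.
    """
    candidates = history[-window:] if window else history
    # Squared Euclidean distance between feature vectors.
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    # No model building: just rank stored records by distance at query time.
    neighbors = sorted(candidates, key=lambda rec: dist(rec[0], query))[:k]
    # Predict the average of the neighbors' observed next speeds.
    return sum(speed for _, speed in neighbors) / k
```

Restricting `candidates` to a recent window shrinks the search cost linearly, which is exactly the accuracy-versus-time trade-off the experiments measure.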

Keywords: big data, k-NN, machine learning, traffic speed prediction

Procedia PDF Downloads 329
1421 Transcriptome Analysis of Saffron (Crocus sativus L.) Stigma Focusing on Identification of Genes Involved in the Biosynthesis of Crocin

Authors: Parvaneh Mahmoudi, Ahmad Moeni, Seyed Mojtaba Khayam Nekoei, Mohsen Mardi, Mehrshad Zeinolabedini, Ghasem Hosseini Salekdeh

Abstract:

Saffron (Crocus sativus L.) is one of the most important spice and medicinal plants. The three-branched style of C. sativus flowers is the most economically important part of the plant and is known as saffron, which has several medicinal properties. Despite the economic and biological significance of this plant, knowledge about its molecular characteristics is very limited. In the present study, we constructed, for the first time, a comprehensive dataset for the C. sativus stigma through de novo transcriptome sequencing using the Illumina paired-end sequencing technology. A total of 52075128 reads were generated and assembled into 118075 unigenes, with an average length of 629 bp and an N50 of 951 bp. Of these, 66171 unigenes (56%) were annotated in the non-redundant National Center for Biotechnology Information (NCBI) database, 30938 (26%) were annotated in the Swiss-Prot database, and 10273 (8.7%) were mapped to 141 Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways, while 52560 (44%) and 40756 (34%) unigenes were assigned to Gene Ontology (GO) categories and Eukaryotic Orthologous Groups of proteins (KOG), respectively. In addition, 65 candidate genes involved in the three stages of crocin biosynthesis were identified. Finally, transcriptome sequencing of the saffron stigma was used to identify 6779 potential microsatellite (SSR) molecular markers. This high-throughput de novo transcriptome sequencing provides a valuable public resource of transcript sequences of C. sativus. Moreover, most of the candidate genes potentially involved in crocin biosynthesis were identified and could be further utilized in functional genomics studies, and the numerous SSRs obtained may help address open questions about the origin of this amphiploid species with probably little genetic diversity.
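For readers unfamiliar with the N50 statistic quoted above (951 bp), it is the contig length L such that contigs of length ≥ L contain at least half of the total assembled bases; a minimal computation, with made-up contig lengths for illustration, is:

```python
def n50(contig_lengths):
    """Return the N50 of an assembly: the length L such that contigs of
    length >= L together cover at least half of the total assembly length."""
    lengths = sorted(contig_lengths, reverse=True)
    half = sum(lengths) / 2
    running = 0
    for length in lengths:
        running += length
        if running >= half:
            return length
```

For example, `n50([2, 2, 2, 3, 3, 4, 8, 8])` returns 8, since the two longest contigs already cover 16 of the 32 total bases.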

Keywords: saffron, transcriptome, NGS, bioinformatic

Procedia PDF Downloads 52
1420 A General Framework for Knowledge Discovery from Echocardiographic and Natural Images

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.

Keywords: active contour, Bayesian, echocardiographic image, feature vector

Procedia PDF Downloads 415
1419 Real-World Comparison of Adherence to and Persistence with Dulaglutide and Liraglutide in UAE e-Claims Database

Authors: Ibrahim Turfanda, Soniya Rai, Karan Vadher

Abstract:

Objectives— The study aims to compare real-world adherence to and persistence with dulaglutide and liraglutide in patients with type 2 diabetes (T2D) initiating treatment in the UAE. Methods— This was a retrospective, non-interventional study (observation period: 01 March 2017–31 August 2019) using the UAE Dubai e-Claims database. Included: adult patients initiating dulaglutide/liraglutide 01 September 2017–31 August 2018 (index period) with: ≥1 claim for T2D in the 6 months before index date (ID); ≥1 claim for dulaglutide/liraglutide during index period; and continuous medical enrolment for ≥6 months before and ≥12 months after ID. Key endpoints, assessed 3/6/12 months after ID: adherence to treatment (proportion of days covered [PDC; PDC ≥80% considered ‘adherent’], per-group mean±standard deviation [SD] PDC); and persistence (number of continuous therapy days from ID until discontinuation [i.e., >45 days gap] or end of observation period). Patients initiating dulaglutide/liraglutide were propensity score matched (1:1) based on baseline characteristics. Between-group comparison of adherence was analysed using the McNemar test (α=0.025). Persistence was analysed using Kaplan–Meier estimates with log-rank tests (α=0.025) for between-group comparisons. This study presents 12-month outcomes. Results— Following propensity score matching, 263 patients were included in each group. Mean±SD PDC for all patients at 12 months was significantly higher in the dulaglutide versus the liraglutide group (dulaglutide=0.48±0.30, liraglutide=0.39±0.28, p=0.0002). The proportion of adherent patients favored dulaglutide (dulaglutide=20.2%, liraglutide=12.9%, p=0.0302), as did the probability of being adherent to treatment (odds ratio [97.5% CI]: 1.70 [0.99, 2.91]; p=0.03). The proportion of persistent patients also favored dulaglutide (dulaglutide=15.2%, liraglutide=9.1%, p=0.0528), as did the probability of discontinuing treatment 12 months after ID (p=0.027).
Conclusions— Based on the UAE Dubai e-Claims database data, dulaglutide initiators exhibited significantly greater adherence in terms of mean PDC versus liraglutide initiators. The proportion of adherent patients and the probability of being adherent favored the dulaglutide group, as did treatment persistence.
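The proportion-of-days-covered (PDC) measure used above has a simple operational definition: distinct days with drug supply divided by days in the observation period, with overlapping fills counted once. A minimal sketch, with invented fill records (not claims data from this study), might be:

```python
def pdc(fills, period_days):
    """Proportion of days covered.

    fills: list of (start_day, days_supplied) tuples, with days expressed as
    offsets from the index date. Overlapping supply counts once, because we
    collect *distinct* covered days.
    """
    covered = set()
    for start, supply in fills:
        covered.update(d for d in range(start, start + supply)
                       if 0 <= d < period_days)
    return len(covered) / period_days

def is_adherent(value, threshold=0.80):
    """Patients with PDC >= 80% are classified as adherent (per the abstract)."""
    return value >= threshold
```

For example, two 30-day fills starting on days 0 and 25 of a 100-day window cover days 0–54, giving a PDC of 0.55.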

Keywords: adherence, dulaglutide, effectiveness, liraglutide, persistence

Procedia PDF Downloads 85
1418 The Study of the Socio-Economic and Environmental Impact on the Semi-Arid Environments Using GIS in the Eastern Aurès, Algeria

Authors: Benmessaoud Hassen

Abstract:

We propose in this study to address the socio-economic and environmental impact on the physical environment, especially its spatiotemporal dynamics, in the semi-arid and arid eastern Aurès. Including 11 municipalities, the study area spreads over a relatively large surface of about 60,000 ha. The temporal depth is substantial, determined by three dates of analysis of environmental variation spread over the period 1987–2007. The multi-source data acquired in this context are integrated into a geographic information system (GIS). This allows, among other things, the calculation of areas and classes for each of the 4 thematic layers previously defined by a method inspired by MEDALUS (Mediterranean Desertification and Land Use). The database created is composed of four layers of information (population, livestock, farming and land use). Its analysis in space and time has been supplemented by validation against ground truth. Once corrected, the database was used to develop a comprehensive map through the calculation of a socio-economic and environmental index (ISCE). The resulting map and information do not consist only of figures on the present situation but could be used to forecast future trends.

Keywords: impact of socio-economic and environmental, spatiotemporal dynamics, semi-arid environments, GIS, Eastern Aurès

Procedia PDF Downloads 290
1417 Mapping and Database on Mass Movements along the Eastern Edge of the East African Rift in Burundi

Authors: L. Nahimana

Abstract:

The eastern edge of the East African Rift in Burundi shows many mass movement phenomena: landslides, mudflows, debris flows, spectacular erosion (mega-gullies), flash floods and alluvial deposits. These phenomena usually occur during the rainy season, and their extent and the consecutive damage vary widely. Managing them requires a methodological approach to their mapping backed by a structured database. The elements of this database are: the three-dimensional extent of the phenomenon, the natural causes and conditions (lithology, slope, weathering depth and products, rainfall patterns, natural environment) and the anthropogenic factors corresponding to various human activities. The extent of the area informs the possibilities and opportunities for mitigation techniques. The lithological nature allows understanding the influence of the nature and structure of the rock on the intensity of weathering, as well as the geotechnical properties of the weathering products. The slope influences land stability. The intensity of annual, monthly and daily rainfall helps to understand the conditions of water saturation of the terrain. Certain natural circumstances, such as the presence of streams and rivers, promote foot-slope erosion and thus the occurrence and activity of mass movements. The construction of infrastructure such as new roads and agglomerations deeply modifies the flow of surface and underground water, which is followed by mass movements. Using geospatial data selected along the East African Rift in Burundi, cases of mass movements are presented illustrating the nature, importance, various factors and extent of the damage. An analysis of these elements for each hazard can guide options for mitigating the phenomenon and its consequences.

Keywords: mass movement, landslide, mudflow, debris flow, spectacular erosion, mega-gully, flash flood, alluvial deposit, East African rift, Burundi

Procedia PDF Downloads 276
1416 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework

Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi

Abstract:

There is a huge amount of lecture video data available for public use, and many more lecture videos are created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task, so an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework; hence, the Hadoop framework for distributed computing over big video data is used. The first step in the process is automatic video segmentation and key-frame detection to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology to key-frames. The OCR output and detected slide text line types are adopted for keyword extraction, by which both video- and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework.

Keywords: video lectures, big video data, video retrieval, hadoop

Procedia PDF Downloads 493
1415 Least-Square Support Vector Machine for Characterization of Clusters of Microcalcifications

Authors: Baljit Singh Khehra, Amar Partap Singh Pharwaha

Abstract:

Clusters of Microcalcifications (MCCs) are the most frequent symptom of Ductal Carcinoma in Situ (DCIS) recognized by mammography. The Least-Square Support Vector Machine (LS-SVM) is a variant of the standard SVM. In this paper, LS-SVM is proposed as a classifier for classifying MCCs as benign or malignant based on relevant features extracted from enhanced mammograms. To establish the credibility of the LS-SVM classifier for classifying MCCs, a comparative evaluation of its relative performance for different kernel functions is made, using confusion matrices and ROC analysis. Experiments are performed on data extracted from mammogram images of the DDSM database: a total of 380 suspicious areas are collected, containing 235 malignant and 145 benign samples. A set of 50 features is calculated for each suspicious area, after which an optimal subset of the 23 most suitable features is selected by Particle Swarm Optimization (PSO). The results of the proposed study are quite promising.
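The distinguishing property of LS-SVM mentioned above is that, by replacing the SVM's inequality constraints with equality constraints, training reduces to solving one linear system rather than a quadratic program. A self-contained sketch in a common formulation (linear system [[0, 1ᵀ], [1, K + I/γ]][b; α] = [0; y]) follows; the data points, γ, and the linear kernel are illustrative choices, not the paper's configuration:

```python
def solve(A, rhs):
    """Gauss-Jordan elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def lssvm_train(X, y, gamma=10.0, kernel=None):
    """Train an LS-SVM classifier by solving one (n+1)x(n+1) linear system."""
    if kernel is None:
        kernel = lambda u, v: sum(p * q for p, q in zip(u, v))  # linear kernel
    n = len(X)
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] + [float(t) for t in y]
    for i in range(n):
        A[0][i + 1] = A[i + 1][0] = 1.0
        for j in range(n):
            A[i + 1][j + 1] = kernel(X[i], X[j]) + (1.0 / gamma if i == j else 0.0)
    sol = solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    def predict(x):
        s = b + sum(a * kernel(xi, x) for a, xi in zip(alpha, X))
        return 1 if s >= 0 else -1  # malignant (+1) vs benign (-1), say
    return predict
```

Swapping in an RBF kernel only changes the `kernel` callable, which is what makes the abstract's kernel comparison straightforward.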

Keywords: clusters of microcalcifications, ductal carcinoma in situ, least-square support vector machine, particle swarm optimization

Procedia PDF Downloads 334
1414 Artificial Intelligent Tax Simulator to Minimize Tax Liability for Multinational Corporations

Authors: Sean Goltz, Michael Mayo

Abstract:

The purpose of this research is to use the Global-Regulation.com database of the world's laws, focusing on tax treaties between countries, to create an AI-driven tax simulator that runs an AI agent through potential tax scenarios across countries. The AI agent's goal is to identify the scenario that results in minimum tax liability based on the tax treaties between countries; the results will be visualized as a three-dimensional matrix in an online web application. Multinational corporations run their business through multiple countries, and these countries, in turn, have tax treaties with many other countries to regulate the payment of taxes on income transferred between them. As a result, planning the best tax scenario across multiple countries and numerous tax treaties is almost impossible by hand. This research proposes to use the Global-Regulation.com database of world laws in English (machine translated by the Google and Microsoft APIs) to create a simulator that includes the information in the tax treaties. Once it is ready, an AI agent will be sent through the simulator to identify the scenario that results in minimum tax liability. Identifying the best tax scenario across countries may save multinational corporations, like Google, billions of dollars annually. Given the nature of the raw data and the domain of taxes (i.e., numbers), this is a promising ground to employ artificial intelligence towards a practical and beneficial purpose.
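One way to picture the search problem the agent faces is as a shortest-path problem over a graph of countries, where each treaty edge retains a fraction of the transferred income. The sketch below is only an illustration of that framing, with invented withholding fractions (not real treaty rates) and a plain Dijkstra search standing in for whatever agent the authors envision:

```python
import heapq
import math

def cheapest_route(rates, source, target, amount):
    """rates[(a, b)]: hypothetical withholding fraction on income moved from
    country a to country b. Returns (net_amount, route) maximizing what
    survives the chain of transfers.

    Dijkstra runs on edge weights -log(1 - rate): minimizing the summed logs
    maximizes the product of retained fractions.
    """
    graph = {}
    for (a, b), r in rates.items():
        graph.setdefault(a, []).append((b, -math.log(1.0 - r)))
    dist, prev = {source: 0.0}, {}
    heap, seen = [(0.0, source)], set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in seen:
            continue
        seen.add(node)
        if node == target:
            break
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    route = [target]
    while route[-1] != source:
        route.append(prev[route[-1]])
    route.reverse()
    return amount * math.exp(-dist[target]), route
```

With a 30% direct rate from A to B but 5% rates via C, the router prefers the indirect A→C→B chain, retaining 90.25 instead of 70 of every 100 transferred.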

Keywords: taxation, law, multinational, corporation

Procedia PDF Downloads 166
1413 Correlation and Prediction of Biodiesel Density

Authors: Nieves M. C. Talavera-Prieto, Abel G. M. Ferreira, António T. G. Portugal, Rui J. Moreira, Jaime B. Santos

Abstract:

The knowledge of biodiesel density over large ranges of temperature and pressure is important for predicting the behavior of fuel injection and combustion systems in diesel engines, and for the optimization of such systems. In this study, cottonseed oil was transesterified into biodiesel and its density was measured at temperatures between 288 K and 358 K and pressures between 0.1 MPa and 30 MPa, with an expanded uncertainty estimated at ±1.6 kg·m⁻³. The experimental pressure-volume-temperature (pVT) cottonseed data were used along with literature data for 18 other biodiesels to build a database used to test the correlation of density with temperature and pressure using the Goharshadi–Morsali–Abbaspour equation of state (GMA EoS). To our knowledge, this is the first time that density measurements are presented for cottonseed biodiesel under such high pressures, and the first use of the GMA EoS to model biodiesel density. The tested EoS allowed correlations within 0.2 kg·m⁻³, corresponding to average relative deviations within 0.02%. The database was then used to develop and test a new fully predictive model derived from the observed linear relation between density and degree of unsaturation (DU), which depends on the biodiesel FAME profile. The average density deviation of this method was only about 3 kg·m⁻³ within the temperature and pressure limits of application. These results represent appreciable improvements in the context of density prediction at high pressure when compared with other equations of state.
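The predictive model above rests on an observed linear relation between density and degree of unsaturation; fitting such a relation is ordinary least squares. The sketch below shows the fit mechanics only, on invented (DU, density) pairs, not the paper's data or coefficients:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x, e.g. density (kg/m^3)
    against degree of unsaturation (DU). Returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b
```

Given the fitted (a, b) for each temperature and pressure condition, predicting a new biodiesel's density then only requires its DU, computed from its FAME profile.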

Keywords: biodiesel density, correlation, equation of state, prediction

Procedia PDF Downloads 582
1412 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.

Keywords: active contour, Bayesian, echocardiographic image, feature vector

Procedia PDF Downloads 389
1411 Approaches to Estimating the Radiation and Socio-Economic Consequences of the Fukushima Daiichi Nuclear Power Plant Accident Using the Data Available in the Public Domain

Authors: Dmitry Aron

Abstract:

Major radiation accidents carry potential risks of negative consequences for public health not only due to exposure but also because of the large-scale emergency measures taken by authorities to protect the population, which can lead to unreasonable social and economic damage. It is, as a rule, technically difficult to assess the possible costs and damages of decisions on evacuation or resettlement of residents in the shortest possible time, since this requires specially prepared information systems containing relevant demographic and economic parameters and incoming data on the radiation situation. Foreign observers also face difficulties in assessing the consequences of an accident on foreign territory, since they usually do not have official, detailed statistical data on that territory beforehand, and may regard the use of unofficial data from open Internet sources as unreliable and overly labor-consuming. This paper describes an approach to the prompt creation of a relational database containing detailed actual data on the economics, demographics and radiation situation of Fukushima Prefecture during the Fukushima Daiichi NPP accident, obtained by the author from open Internet sources. This database was developed and used to assess the number of evacuated residents, radiation doses, expected financial losses and other parameters of the affected areas. The costs for areas with temporarily evacuated and long-term resettled populations were investigated, and the radiological and economic effectiveness of the measures taken to protect the population was estimated. Some of the results are presented in the article.
The study showed that such a tool for analyzing the consequences of radiation accidents can be prepared in a short space of time for the entire territory of Japan, and it can serve for the modeling of social and economic consequences for hypothetical accidents for any nuclear power plant in its territory.
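A relational database of the kind described can be prototyped quickly with an embedded engine. The sketch below uses Python's built-in sqlite3 with a toy schema (municipal demographics joined to dose-rate measurements); every name and number is invented for illustration and is not actual Fukushima data:

```python
import sqlite3

# In-memory database: demographics table plus radiation measurements table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE municipality (
        id INTEGER PRIMARY KEY,
        name TEXT,
        population INTEGER
    );
    CREATE TABLE dose_rate (
        municipality_id INTEGER REFERENCES municipality(id),
        measured_on TEXT,
        microsievert_per_hour REAL
    );
""")
conn.executemany("INSERT INTO municipality VALUES (?, ?, ?)",
                 [(1, "Town A", 20000), (2, "Town B", 8000)])
conn.executemany("INSERT INTO dose_rate VALUES (?, ?, ?)",
                 [(1, "2011-03-20", 12.0), (2, "2011-03-20", 2.5)])

# Population living where the measured dose rate exceeds a screening
# threshold -- a rough stand-in for "residents subject to evacuation".
row = conn.execute("""
    SELECT SUM(m.population)
    FROM municipality m
    JOIN dose_rate d ON d.municipality_id = m.id
    WHERE d.microsievert_per_hour > 10
""").fetchone()
print(row[0])  # 20000
```

The value of the relational form is exactly this kind of join: once demographic, economic and radiological tables share keys, evacuation counts, doses and cost estimates become single queries.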

Keywords: Fukushima, radiation accident, emergency measures, database

Procedia PDF Downloads 159
1410 A Review on Microbial Enhanced Oil Recovery and Controlling Its Produced Hydrogen Sulfide Effects on Reservoir and Transporting Pipelines

Authors: Ali Haratian, Soroosh Emami Meybodi

Abstract:

Using viable microbial cultures within hydrocarbon reservoirs to enhance oil recovery through metabolic activities is what we recognize as microbial enhanced oil recovery (MEOR). As with many other industrial processes, MEOR has both pros and cons. The creation of sulfides such as hydrogen sulfide, as a result of injecting sulfate-containing seawater into hydrocarbon reservoirs to maintain the required reservoir pressure, leads to the growth of sulfate-reducing bacteria (SRB) near the injection wells, turning the reservoir sour; however, SRB are not the only microbial process stimulating the formation of sulfides. Along with SRB, thermochemical sulfate reduction, or thermal redox reaction (TSR), is also known to produce highly concentrated zones of H₂S in the reservoir fluids, capable of causing corrosion. Owing to the extent of the topic, the formation of H₂S is discussed in more detail. The undesirable production of sulfide species in reservoirs can lead to serious operational, environmental, and financial problems, in particular for transporting pipelines. Consequently, devising reservoir souring control strategies during the production of oil and gas is the only way to prevent damage in terms of environment, finance, and manpower, which requires determining the compound's reactivity, origin, and partitioning behavior. This article provides a comprehensive review of progress made in this field and of possible new strategies in the petroleum industry.

Keywords: corrosion, hydrogen sulfide, NRB, reservoir souring, SRB

Procedia PDF Downloads 179
1409 Climate Change and Health in Policies

Authors: Corinne Kowalski, Lea de Jong, Rainer Sauerborn, Niamh Herlihy, Anneliese Depoux, Jale Tosun

Abstract:

Climate change is considered one of the biggest threats to human health of the 21st century, yet the link between climate change and health has received relatively little attention in the media, in research and in policy-making. A long-term, broad overview of how health is represented in legislation on climate change is missing from the legislative literature: it is unknown if, or how, health is referred to in legal clauses addressing climate change in national and European legislation. Integrating science-based evidence on the impacts of climate change on health into policies could be a key step to inciting the political and societal changes necessary to decelerate global warming, and may also drive the implementation of new strategies to mitigate the consequences for health systems. To provide an overview of this issue, we are analyzing the Global Climate Legislation Database provided by the Grantham Research Institute on Climate Change and the Environment, established in 2008 at the London School of Economics and Political Science. The database consists of climate change legislation in 99 countries around the world (updated as of 1 January 2015) and offers relevant information about the state of climate-related policies. We will use it to systematically analyze the 829 identified pieces of legislation to identify how health is represented in climate change legislation. We are conducting exploratory research on national and supranational legislation and anticipate that health is addressed in various forms. The goal is to highlight how often, in what specific terms, and in which aspects health or the health risks of climate change are mentioned in various legislations; the position and recurrence of mentions of health are also of importance.
Data will be extracted with complete quotation of the sentence which mentions health, which will allow for second qualitative stage to analyze which aspects of health are represented and in what context. This study is part of an interdisciplinary project called 4CHealth that confronts results of the research done on scientific, political and press literature to better understand how the knowledge on climate change and health circulates within those different fields and whether and how it is translated to real world change.
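The extraction step described above, pulling out each sentence that mentions health for later qualitative coding, can be sketched with a simple regular-expression pass; the term list below is an illustrative starting point, not the project's actual codebook:

```python
import re

# Candidate health-related terms (assumed for illustration).
HEALTH_TERMS = re.compile(r"\b(health|disease|mortality|morbidity)\b", re.I)

def health_mentions(text):
    """Return (matched_term, sentence) pairs for every sentence of a piece
    of legislation that mentions a health-related term."""
    hits = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        m = HEALTH_TERMS.search(sentence)
        if m:
            hits.append((m.group(0).lower(), sentence))
    return hits
```

Keeping the whole sentence alongside the matched term preserves the context needed for the second, qualitative stage of the analysis.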

Keywords: climate change, explorative research, health, policies

Procedia PDF Downloads 335
1408 Fluid Prescribing Post Laparotomies

Authors: Gusa Hall, Barrie Keeler, Achal Khanna

Abstract:

Introduction: NICE guidelines have highlighted the consequences of IV fluid mismanagement. The main aim of this study was to audit fluid prescribing post laparotomy to identify whether fluids were prescribed in accordance with NICE guidelines. Methodology: A retrospective database search of eight specific laparotomy procedures (right and left colectomy, Hartmann’s procedure, small bowel resection, perforated ulcer, abdominoperineal resection, anterior resection, panproctocolectomy, subtotal colectomy) highlighted 29 laparotomies between April 2019 and May 2019. Two of the 29 patients had secondary procedures during the same admission, giving n=27 patients. Database case notes were reviewed for date of procedure, length of admission, fluids prescribed and amount, nasogastric tube output, daily blood results for the electrolytes sodium and potassium, and operative losses. Results: Of the 27 patients identified between April 2019 and May 2019, 93% (25/27) received IV fluids, but only 19% (5/27) received the correct IV fluids in accordance with NICE guidelines. 93% (25/27) of those who received IV fluids had correct electrolyte levels (sodium and potassium), and 100% (27/27) of patients received blood tests (U&Es) for electrolyte levels. Operative losses were not documented for any patient (0/27). IV fluids matched nasogastric tube output in 100% (3/3) of patients who had a nasogastric tube in situ. Conclusion: A PubMed literature review on barriers to safer IV prescribing highlighted educational interventions focused on prescriber knowledge rather than on how to execute the prescribing task. This audit suggests IV fluids post laparotomy are not being prescribed consistently in accordance with NICE guidelines. Surgical management plans should be clearer on IV fluid and electrolyte requirements for the 24 hours after the plan has been initiated.
In addition, further teaching and training around IV prescribing is needed together with frequent surgical audits on IV fluid prescribing post-surgery to evaluate improvements.

Keywords: audit, IV Fluid prescribing, laparotomy, NICE guidelines

Procedia PDF Downloads 91
1407 Database of Pharmacogenetics HLA-A*31:01 Allele in Thai Population and Carbamazepine-Induced SCARs

Authors: Watchawin Ekphinitphithaya, Patompong Satapornpong

Abstract:

Introduction: Carbamazepine (CBZ) is one of the antiepileptic drugs (AEDs) most prescribed by neurologists and non-neurologists worldwide. CBZ is usually prescribed along with other drugs, leading to the possibility of severe cutaneous adverse drug reactions (SCARs). HLA-B*15:02 is strongly associated with CBZ-induced Stevens-Johnson syndrome and toxic epidermal necrolysis (SJS-TEN) in Han Chinese and other Asian populations but not in European populations, while the HLA-A*31:01 allele has been reported to be associated with CBZ-induced SCARs in European and Japanese populations. Objective: The aim of this study is to investigate the distribution of the pharmacogenetic HLA-A*31:01 marker, associated with carbamazepine-induced SCARs, in a healthy Thai population. Materials and Methods: In this prospective study, 350 unrelated healthy Thais were recruited. Human leukocyte antigen-A alleles were genotyped using PCR-sequence specific oligonucleotides (PCR-SSO). Results: The most frequent HLA-A alleles were HLA-A*11:01 (190 alleles, 27.14%), HLA-A*24:02 (82 alleles, 11.71%), HLA-A*02:03 (80 alleles, 11.43%), HLA-A*33:03 (76 alleles, 10.86%), HLA-A*02:07 (58 alleles, 8.29%), HLA-A*02:01 (35 alleles, 5.00%), HLA-A*24:07 (29 alleles, 4.14%), HLA-A*02:06 – HLA-A*30:01 (15 alleles, 2.14%), and HLA-A*01:01 (14 alleles, 2.00%). In particular, the HLA-A*31:01 allele accounted for 6 of 700 alleles (0.86%) in the healthy Thai population. Studies have reported varying distributions of HLA-A*31:01 in Asians, including 2% in Han Chinese, 9% in Japanese and 5% in Koreans; in addition, this allele is found in approximately 2-5% of Caucasian populations. Conclusions: A pharmacogenetics database is thus vital in many populations, especially Thais, to support screening for the HLA-A*31:01 allele before initiating treatment, so as to avoid CBZ-induced SCARs.

Keywords: Carbamazepine, HLA-A*31:01, Thai population, pharmacogenetics

Procedia PDF Downloads 139
1406 Forensic Analysis of Signal Messenger on Android

Authors: Ward Bakker, Shadi Alhakimi

Abstract:

The number of people moving towards more privacy-focused instant messaging applications has grown significantly. Signal is one of these instant messaging applications, which makes it interesting for digital investigators. In this research, we evaluate the artifacts generated by the Signal messenger for Android. The evaluation was done by using the features that Signal provides to create artifacts, after which we made an image of the internal storage and of the process memory and analysed it manually. The manual analysis revealed the content that Signal stores in different locations during its operation, allowing us to identify the artifacts and interpret how they were used. We also examined the source code of Signal and, using the knowledge obtained from it, developed a tool that decrypts some of the artifacts using the key stored in the Android Keystore. In general, we found that most artifacts are encrypted and encoded, even after decryption. During data visualization, we also found that Signal does not use relationships between the data. Two interesting groups of artifacts were identified: those related to the database and those stored in the process memory dump. In the database, we found plaintext private and group chats, and in the memory dump, we were able to retrieve the plaintext access code to the application. We conclude that Signal contains a wealth of artifacts that could be very valuable to a digital forensic investigation.

Keywords: forensic, signal, Android, digital

Procedia PDF Downloads 50
1405 A Cloud Computing System Using Virtual Hyperbolic Coordinates for Services Distribution

Authors: Telesphore Tiendrebeogo, Oumarou Sié

Abstract:

Cloud computing technologies have attracted considerable interest in recent years and have become increasingly important for many existing database applications. Cloud computing provides a new mode of use and provision of IT resources in general: such resources can be used on demand by anybody with access to the internet. In particular, the Cloud platform provides an easy-to-use interface between providers and users, allowing providers to develop and deliver software and databases to users across locations. Currently, many Cloud platform providers support large-scale database services. However, most of these support only simple keyword-based queries and cannot answer complex queries efficiently, due to the lack of efficient multi-attribute indexing techniques. Existing Cloud platform providers seek to improve the performance of indexing techniques for complex queries. In this paper, we define a new cloud computing architecture based on a Distributed Hash Table (DHT) and design a prototype system. We then build and evaluate an indexing structure based on a hyperbolic tree, using virtual coordinates taken in the hyperbolic plane. Our experimental results, compared against other cloud systems, show that our solution ensures consistency and scalability for the Cloud platform.
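The core operation behind virtual-coordinate schemes like the one described above is greedy routing by hyperbolic distance. A minimal sketch, assuming the Poincaré disk model of the hyperbolic plane (the coordinates, neighbor table, and routing loop below are illustrative assumptions, not the authors' implementation):

```python
import math

def hyperbolic_distance(u, v):
    """Distance between two points of the open unit disk (|p| < 1)."""
    (ux, uy), (vx, vy) = u, v
    diff2 = (ux - vx) ** 2 + (uy - vy) ** 2
    denom = (1 - (ux**2 + uy**2)) * (1 - (vx**2 + vy**2))
    return math.acosh(1 + 2 * diff2 / denom)

def greedy_route(start, target, neighbors):
    """Repeatedly forward to the neighbor closest to the target."""
    path, current = [start], start
    while current != target:
        nxt = min(neighbors[current],
                  key=lambda n: hyperbolic_distance(n, target))
        if hyperbolic_distance(nxt, target) >= hyperbolic_distance(current, target):
            break  # greedy dead end (cannot occur in a greedy tree embedding)
        path.append(nxt)
        current = nxt
    return path
```

In a greedy embedding of a tree into the hyperbolic plane, this loop is guaranteed to reach the target, which is what makes hyperbolic coordinates attractive for DHT-style addressing.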

Keywords: virtual coordinates, cloud, hyperbolic plane, storage, scalability, consistency

Procedia PDF Downloads 392
1404 Landslide Susceptibility Mapping: A Comparison between Logistic Regression and Multivariate Adaptive Regression Spline Models in the Municipality of Oudka, Northern Morocco

Authors: S. Benchelha, H. C. Aoudjehane, M. Hakdaoui, R. El Hamdouni, H. Mansouri, T. Benchelha, M. Layelmam, M. Alaoui

Abstract:

The logistic regression (LR) and multivariate adaptive regression spline (MARSpline) methods are applied and verified for the analysis of landslide susceptibility in Oudka, Morocco, using a geographical information system. From a spatial database containing data on landslide mapping, topography, soil, hydrology and lithology, eight landslide-related factors (elevation, slope, aspect, distance to streams, distance to roads, distance to faults, lithology, and the Normalized Difference Vegetation Index, NDVI) were calculated or extracted. Using these factors, landslide susceptibility indexes were calculated by the two methods. Before the calculation, the database was divided into two parts: one for the formation of the models and one for their validation. The results of the landslide susceptibility analysis were verified using success and prediction rates to evaluate the quality of these probabilistic models. This verification showed that the MARSpline model performs best, with a success rate (AUC = 0.963) and a prediction rate (AUC = 0.951) higher than those of the LR model (success rate AUC = 0.918, prediction rate AUC = 0.901).
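The LR half of this comparison can be sketched in a few lines: fit a logistic model on conditioning factors, score each cell with a susceptibility index, and validate with the AUC. The synthetic two-factor data and the plain gradient-descent fit below are illustrative stand-ins, assuming nothing about the paper's actual eight-factor dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 2))            # standardized factors, e.g. slope, NDVI
true_w = np.array([2.0, -1.5])
p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = (rng.random(n) < p).astype(float)  # 1 = landslide cell, 0 = stable cell

# Plain gradient descent on the log-loss (no intercept, for brevity).
w = np.zeros(2)
for _ in range(500):
    pred = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (pred - y) / n

def auc(scores, labels):
    """AUC as the normalized Mann-Whitney U statistic."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos, n_neg = labels.sum(), (1 - labels).sum()
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

susceptibility = 1.0 / (1.0 + np.exp(-(X @ w)))
model_auc = auc(susceptibility, y)
```

The same AUC routine applied to a held-out partition gives the "prediction rate" used in the validation step above.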

Keywords: landslide susceptibility mapping, logistic regression, multivariate adaptive regression spline, Oudka, Taounate

Procedia PDF Downloads 160
1403 Catalytic Pyrolysis of Barley Straw for the Production of Fuels and Chemicals

Authors: Funda Ates

Abstract:

Primary energy sources such as petroleum, coal and natural gas are principally responsible for the world's energy consumption. However, these sources are being depleted at a rapidly increasing rate worldwide, and they have environmentally damaging effects. Renewable energy sources are capable of providing a considerable fraction of world energy demand in this century. Biomass is one of the most abundant and widely utilized sources of renewable energy in the world, and it can be converted into commercial fuels suitable to substitute for fossil fuels. Many types of biomass can be converted through thermochemical processes into solid, liquid or gaseous fuels. Pyrolysis is the thermal decomposition of biomass in the absence of air or oxygen. In this study, barley straw was investigated as an alternative feedstock for obtaining fuels and chemicals via pyrolysis in a fixed-bed reactor. The influence of pyrolysis temperature in the range 450-750 °C, as well as the effect of a catalyst on the products, was investigated and the results compared. A maximum oil yield of 20.4% was obtained at a moderate temperature of 550 °C, and the oil yield decreased when a catalyst was used. The pyrolysis oils were examined by instrumental analysis and GC/MS, which revealed that they were chemically very heterogeneous at all temperatures, with phenolics the most abundant compound class. The catalyst lowered the required reaction temperature: most of the components obtained with a catalyst at moderate temperatures were close to those obtained at high temperatures without one. Moreover, the use of a catalyst also decreased the amount of oxygenated compounds produced.

Keywords: barley straw, pyrolysis, catalyst, phenolics

Procedia PDF Downloads 191
1402 Bibliometric Analysis of Global Research Trends on Organization Culture, Strategic Leadership and Performance Using Scopus Database

Authors: Anyia Nduka, Aslan Bin Amad Senin

Abstract:

Taking a behavioral perspective on Organization Culture, Strategic Leadership and Performance (OC, SL, P), we examine the role of strategic leadership as a key mediating mechanism linking organizational culture and performance to organizational capacities. Given the increasing dependence of modern businesses on the use and scientific discovery of relevant data, research efforts around the globe have accelerated. In today's corporate world, strategic leadership remains the most sustainable source of performance and competitive advantage. It is therefore critical to gain a deep understanding of this research area and to strengthen new collaborative networks in support of the transition towards these integrative efforts. This bibliometric analysis examines global trends in OC, SL and performance research based on publication output, author co-authorship, and co-occurrences of author keywords among authors and affiliated countries. A total of 2,829 journal articles published between 1974 and 2021 were retrieved from the Scopus database. The findings show a significant increase in the number of publications, with strong global collaboration (e.g., between the USA and the UK). We also found that while most countries/territories without collaborations were concentrated in the developing world, the outstanding performance of Asian countries and the volume of their collaborations should be emulated.
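The keyword co-occurrence step of such a bibliometric analysis reduces to counting how often pairs of author keywords appear on the same article. A minimal sketch, with invented sample records standing in for a real Scopus export:

```python
from collections import Counter
from itertools import combinations

# Each record is the author-keyword list of one article (illustrative data).
records = [
    ["organizational culture", "strategic leadership", "performance"],
    ["strategic leadership", "performance"],
    ["organizational culture", "performance"],
]

cooccurrence = Counter()
for keywords in records:
    for pair in combinations(sorted(set(keywords)), 2):
        cooccurrence[pair] += 1

top_pair, top_count = cooccurrence.most_common(1)[0]
```

The resulting pair counts are exactly the edge weights of the keyword co-occurrence network that tools such as VOSviewer visualize.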

Keywords: organizational culture, strategic leadership, organizational resilience, performance

Procedia PDF Downloads 46
1401 Correlation between Funding and Publications: A Pre-Step towards Future Research Prediction

Authors: Ning Kang, Marius Doornenbal

Abstract:

Funding is a very important, if not crucial, resource for research projects. Usually, funding organizations publish a description of the funded research to describe the scope of the funding award. Logically, we would expect research outcomes to align with this funding award, and for that reason we might be able to predict future research topics from present funding award data. That said, it remains to be shown if and how future research topics can be predicted using funding information. In this paper, we extract funding project information and the abstracts of the resulting papers from the Gateway to Research database as one group, and use papers from the same domains and publication years in the Scopus database as a baseline comparison group. We annotate both the project awards and the papers resulting from the funded projects with linguistic features (noun phrases), and then calculate tf-idf vectors and the cosine similarity between these two sets of features. We show that the cosine similarity within the project-paper group is greater than that of the project-baseline group, and that these two groups of similarities are significantly different. Based on this result, we conclude that funding information does correlate, at the topical level, with the content of the funded project's future research output. How funding really changes the course of science or of scientific careers remains an elusive question.
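The tf-idf plus cosine-similarity machinery described above can be sketched in pure Python. The toy "award", "paper" and "baseline" documents below are invented noun-phrase lists, not the study's data:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Return one tf-idf weighted term dictionary per document."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse term-weight dictionaries."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

award = ["speech recognition", "neural network", "acoustic model"]
paper = ["neural network", "acoustic model", "language model"]
baseline = ["protein folding", "crystal structure", "x-ray diffraction"]
vecs = tfidf_vectors([award, paper, baseline])
```

As expected, the award is topically close to its own paper and orthogonal to the unrelated baseline document, which is the pattern the study measures at scale.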

Keywords: natural language processing, noun phrase, tf-idf, cosine similarity

Procedia PDF Downloads 217
1400 Attribute Based Comparison and Selection of Modular Self-Reconfigurable Robot Using Multiple Attribute Decision Making Approach

Authors: Manpreet Singh, V. P. Agrawal, Gurmanjot Singh Bhatti

Abstract:

Over the last decades, there has been significant technological advancement in the field of robotics, and a number of modular self-reconfigurable robots have been introduced that can help in space exploration and in search and rescue operations during earthquakes, among other applications. As the number of self-reconfigurable robots grows, choosing the optimum one becomes a real concern for robot users, given the increase in available features, facilities and complexity. The objective of this research work is to present a multiple attribute decision making (MADM) based methodology for the coding, evaluation, comparison, ranking and selection of modular self-reconfigurable robots, using the technique for order preference by similarity to ideal solution (TOPSIS). In total, 86 attributes that affect structure and performance were identified, and a database of modular self-reconfigurable robots organized by these pertinent attributes was generated. This database is very useful for users selecting a robot that suits their operational needs. Two visual methods, a linear graph and a spider chart, are proposed for ranking the robots. The methodology is illustrated using five robots (Atron, Smores, Polybot, M-Tran 3, Superbot) and ranks them successfully, showing that Smores is the best robot for the operational need illustrated; the methodology is found to be very effective and simple to use.
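The TOPSIS ranking at the heart of this methodology is short enough to sketch directly. The 5x3 decision matrix (five robots, three benefit-type attributes) and the weights below are invented for illustration; the study used 86 attributes:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Closeness coefficient of each alternative (higher = better)."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.sqrt((m ** 2).sum(axis=0))        # vector normalization
    v = norm * np.asarray(weights, dtype=float)     # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)

scores = topsis(
    matrix=[[7, 9, 6], [9, 8, 9], [5, 6, 4], [8, 7, 7], [6, 5, 5]],
    weights=[0.5, 0.3, 0.2],
    benefit=[True, True, True],   # all three attributes are benefit criteria
)
best = int(np.argmax(scores))     # index of the top-ranked robot
```

With these toy numbers the second robot wins, since it is at or near the ideal on the two most heavily weighted attributes.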

Keywords: self-reconfigurable robots, MADM, TOPSIS, morphogenesis, scalability

Procedia PDF Downloads 193
1399 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET

Authors: Tyler T. Procko, Steve Collins

Abstract:

New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases such as SQL Server (2016 and above). Traditional methods of data access involve comparatively unnecessary steps which compromise system performance. This work posits that the established ORM (Object-Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core, combined with a framework of modern SQL Server coding conventions, have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers and cybersecurity professionals: simplicity, speed and security. Simplicity is engendered by cutting out the "middleman" steps, effectively making API data access a white box, whereas traditional methods are a black box. Speed is improved because fewer translational steps are taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code. Future considerations include a generalization of the CODA method and extension beyond the .NET ecosystem to other programming languages.

Keywords: API data access, database, JSON, .NET core, SQL server

Procedia PDF Downloads 42
1398 Association of Non-Synonymous SNPs in the DC-SIGN Receptor Gene with Tuberculosis (TB)

Authors: Saima Suleman, Kalsoom Sughra, Naeem Mahmood Ashraf

Abstract:

Mycobacterium tuberculosis causes a communicable chronic illness that is highly studied, as it is present in approximately one third of the world's population in either active or latent form. A person's genetic makeup plays an important part in immunity against the disease, and one important associated factor is single nucleotide polymorphism (SNP) in relevant genes. In this study, we examined the association between SNPs of the CD209 gene (which encodes the DC-SIGN receptor) and tuberculosis patients. Both dry-lab (in silico) and wet-lab (RFLP) analyses were carried out. The GWAS Catalog and the GEO database were searched for previous association data: no association study related to CD209 nsSNPs was found, but the role of CD209 in pulmonary tuberculosis has been addressed in the GEO database, so CD209 was selected for this study. Databases including Ensembl and the 1000 Genomes Project were used to retrieve SNP data in VCF format, which were then submitted to different software tools to sort the SNPs into benign and deleterious. Selected SNPs were further annotated using 3-D modeling with the I-TASSER online server. Furthermore, selected nsSNPs were checked in Gujrat and Faisalabad populations through RFLP analysis. In this study population, two nsSNPs were found to be associated with tuberculosis, while one nsSNP was not found to be associated with the disease.
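The variant-sorting step above starts from VCF records. A minimal sketch of filtering a VCF for non-synonymous (missense) candidates; the records, rsIDs and the simplified "CSQ=missense" annotation are invented for illustration (real Ensembl VEP annotations are far richer):

```python
# A tiny, hand-written VCF fragment (illustrative only).
VCF_TEXT = """\
##fileformat=VCFv4.2
#CHROM\tPOS\tID\tREF\tALT\tQUAL\tFILTER\tINFO
19\t7812733\trs0000001\tA\tG\t.\tPASS\tCSQ=upstream
19\t7811245\trs0000002\tG\tA\t.\tPASS\tCSQ=missense
19\t7810800\trs0000003\tC\tT\t.\tPASS\tCSQ=missense
"""

def missense_snps(vcf_text):
    """Return (rsID, ref, alt) for records annotated as missense."""
    hits = []
    for line in vcf_text.splitlines():
        if line.startswith("#"):          # skip meta and header lines
            continue
        chrom, pos, rsid, ref, alt, qual, flt, info = line.split("\t")
        if "missense" in info:
            hits.append((rsid, ref, alt))
    return hits

candidates = missense_snps(VCF_TEXT)
```

In practice this shortlist would then go to deleteriousness predictors and structural annotation, as the abstract describes.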

Keywords: association, CD209, DC-SIGN, tuberculosis

Procedia PDF Downloads 282
1397 The Use of Voice in Online Public Access Catalog as a Faster Searching Device

Authors: Maisyatus Suadaa Irfana, Nove Eka Variant Anna, Dyah Puspitasari Sri Rahayu

Abstract:

Technological developments provide convenience to people everywhere. Communication between humans and computers has conventionally been conducted via text, but with the development of voice technology it can now be conducted by voice, much like communication between human beings. This provides an easy facility for many people, especially those with special needs. Here, voice search technology is applied to searching the book collection in an OPAC (Online Public Access Catalog), so library visitors can find the books they need faster and more easily. Integration with Google is used to convert voice into text. To optimize search time and results, the server first downloads all the book data available in the server database and converts it into JSON format. Several processing steps are then applied: the JSON data is decomposed (parsed) into arrays, an index is built, and an analyzer is applied to the results. The aim is to make searching much faster than the usual OPAC search, where the database is queried directly for every search request. A data-update menu is provided so that users can perform their own data updates and get the latest information.
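The parse-then-index pipeline described above can be sketched as an in-memory inverted index built from JSON catalog records, so a voice-transcribed query never touches the database. The catalog records and field names below are invented for illustration:

```python
import json
from collections import defaultdict

CATALOG_JSON = json.dumps([
    {"id": 1, "title": "Introduction to Information Retrieval"},
    {"id": 2, "title": "Database System Concepts"},
    {"id": 3, "title": "Speech and Language Processing"},
])

def build_index(catalog_json):
    """Map each title word to the set of book ids containing it."""
    index = defaultdict(set)
    for book in json.loads(catalog_json):
        for token in book["title"].lower().split():
            index[token].add(book["id"])
    return index

def search(index, query):
    """Return ids of books matching every word of the transcribed query."""
    tokens = query.lower().split()
    if not tokens:
        return set()
    result = index.get(tokens[0], set()).copy()
    for t in tokens[1:]:
        result &= index.get(t, set())
    return result

index = build_index(CATALOG_JSON)
```

The speech-to-text step (here assumed to be Google's API) simply supplies the `query` string; everything after that is a fast in-memory lookup.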

Keywords: OPAC, voice, searching, faster

Procedia PDF Downloads 318
1396 Macrocycles Enable Tuning of Uranyl Electrochemistry by Lewis Acids

Authors: Amit Kumar, Davide Lionetti, Victor Day, James Blakemore

Abstract:

Capture and activation of the water-soluble uranyl dication (UO₂²⁺) remains a challenging problem, as few rational approaches are available for modulating the reactivity of this species. Here, we report the divergent synthesis of heterobimetallic complexes in which UO₂²⁺ is held in close proximity to a range of redox-inactive metals by tailored macrocyclic ligands. Crystallographic and spectroscopic studies confirm assembly of homologous U(VI)(μ-OAr)₂Mⁿ⁺ cores with a range of mono-, di-, and trivalent Lewis acids (Mⁿ⁺). X-ray diffraction (XRD) and cyclic voltammetry (CV) data suggest preferential binding of K⁺ in an 18-crown-6-like cavity and Na⁺ in a 15-crown-5-like cavity, both appended to Schiff-base type sites that selectively bind UO₂²⁺. CV data demonstrate that the U(VI)/U(V) reduction potential in these complexes shifts positive and the rate of electron transfer decreases with increasing Lewis acidity of the incorporated redox-inactive metals. Moreover, spectroelectrochemical studies confirm the formation of U(V) species in the case of the monometallic UO₂²⁺ complex, consistent with results from prior studies. However, unique features were observed during spectroelectrochemical studies in the presence of the K⁺ ion, suggesting new insights into electronic structure may be accessible with the heterobimetallic complexes. Overall, these findings suggest that interactions with Lewis acids could be effectively leveraged for rational tuning of the electronic and thermochemical properties of the 5f elements, reminiscent of strategies more commonly employed with 3d transition metals.

Keywords: electrochemistry, Lewis acid, macrocycle, uranyl

Procedia PDF Downloads 110
1395 Gypsum Composites with CDW as Raw Material

Authors: R. Santos Jiménez, A. San-Antonio-González, M. del Río Merino, M. González Cortina, C. Viñas Arrebola

Abstract:

On average, Europe generates around 890 million tons of construction and demolition waste (CDW) per year, and only 50% of this CDW is recycled. This is far from the objectives set in the European Directive for 2020, and aware of this situation, European countries are implementing national policies to prevent avoidable waste and to promote measures that increase recycling and recovery. In Spain, one of these measures has been the development of a CDW recycling guide for the manufacture of mortar, concrete, bricks and lightweight aggregates. However, there is still not enough information on the possibility of incorporating CDW materials into the manufacture of gypsum products. In view of the foregoing, the Universidad Politécnica de Madrid is creating a database with information on the possibility of incorporating CDW materials into the manufacture of gypsum products. The objective of this study is to extend this database by analysing the feasibility of incorporating two different types of CDW into a gypsum matrix: ceramic brick waste (perforated brick and double hollow brick) and extruded polystyrene (XPS) waste. Results show that it is possible to incorporate up to 25% of ceramic waste and 4% of XPS waste over the weight of gypsum in a gypsum matrix. Furthermore, the addition of ceramic waste yields an 8% increase in surface hardness and a 25% reduction in capillary water absorption. On the other hand, the addition of XPS yields a 26% reduction in density and a 37% improvement in thermal conductivity.

Keywords: CDW, waste materials, ceramic waste, XPS, construction materials, gypsum

Procedia PDF Downloads 479
1394 An Engineer-Oriented Life Cycle Assessment Tool for Building Carbon Footprint: The Building Carbon Footprint Evaluation System in Taiwan

Authors: Hsien-Te Lin

Abstract:

The purpose of this paper is to introduce the BCFES (Building Carbon Footprint Evaluation System), an LCA (life cycle assessment) tool developed by the Low Carbon Building Alliance (LCBA) in Taiwan. A qualified BCFES for the building industry should be able to evaluate the carbon footprint throughout all stages of the life cycle of a building project, including the production, transportation and manufacturing of materials, construction, daily energy usage, renovation and demolition. However, many existing BCFESs are too complicated and not very designer-friendly, creating obstacles to the implementation of carbon reduction policies. One of the greatest obstacles is the misapplication of the carbon footprint inventory standards PAS 2050 and ISO 14067, which are designed for mass-produced goods rather than building projects. When these product-oriented rules are applied to building projects, one must compute a tremendous amount of data for raw materials and the transportation of construction equipment throughout the construction period, based on purchasing lists and construction logs. This verification method is cumbersome by nature and unhelpful to the promotion of low carbon design. With a view to providing an engineer-oriented BCFES with pre-diagnosis functions, a component input/output (I/O) database system and a scenario simulation method for building energy are proposed herein. Most existing BCFESs base their calculations on a product-oriented carbon database for raw materials like cement, steel, glass, and wood. However, data on raw materials is of little use for encouraging carbon reduction design without a feedback mechanism, because an engineering project is designed not in terms of raw materials but of building components, such as flooring, walls, roofs, ceilings, roads or cabinets. The LCBA Database has therefore been compiled from existing carbon footprint databases for raw materials together with architectural graphic standards.
Project designers can now use the LCBA Database to conduct low carbon design in a much simpler and more efficient way. Daily energy usage throughout a building's life cycle, including air conditioning, lighting, and electrical equipment, is very difficult for the building designer to predict, and a good BCFES should provide a simplified, designer-friendly method to overcome this obstacle. In this paper, the author has developed a simplified tool, the dynamic Energy Use Intensity (EUI) method, to accurately predict energy usage with simple multiplications and additions using EUI data and the designed efficiency levels for the building envelope, air conditioning, lighting and electrical equipment. Remarkably simple to use, it can help designers pre-diagnose hotspots in a building's carbon footprint and further enhance low carbon designs. The BCFES-LCBA offers the advantages of an engineer-friendly component I/O database, simplified energy prediction methods, pre-diagnosis of carbon hotspots and sensitivity to good low carbon designs, making it an increasingly popular carbon management tool in Taiwan. To date, about thirty projects have been awarded BCFES-LCBA certification, and the assessment has become mandatory in some cities.
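The "simple multiplications and additions" of the dynamic EUI method can be sketched as follows. All EUI values, end-use categories and efficiency factors below are invented for illustration; the real BCFES uses calibrated Taiwanese EUI data:

```python
# Annual energy estimated per end use as: floor area x baseline EUI x
# designed efficiency factor, then summed (illustrative numbers only).
BASELINE_EUI = {              # kWh per m^2 per year, by end use (assumed)
    "air_conditioning": 60.0,
    "lighting": 25.0,
    "equipment": 30.0,
}

def annual_energy_kwh(floor_area_m2, efficiency_levels):
    """Sum of area x baseline EUI x efficiency factor over end uses.

    efficiency_levels maps each end use to a factor where 1.0 is the
    baseline design and 0.8 means 20% more efficient than baseline.
    """
    return sum(floor_area_m2 * eui * efficiency_levels.get(use, 1.0)
               for use, eui in BASELINE_EUI.items())

design = {"air_conditioning": 0.8, "lighting": 0.7, "equipment": 1.0}
energy = annual_energy_kwh(1000.0, design)
```

Because each end use contributes a separate additive term, a designer can immediately see which term dominates, which is exactly the hotspot pre-diagnosis the text describes.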

Keywords: building carbon footprint, life cycle assessment, energy use intensity, building energy

Procedia PDF Downloads 115
1393 Incidence of Lymphoma and Gonorrhea Infection: A Retrospective Study

Authors: Diya Kohli, Amalia Ardeljan, Lexi Frankel, Jose Garcia, Lokesh Manjani, Omar Rashid

Abstract:

Gonorrhea is the second most common sexually transmitted disease (STD) in the United States of America. It affects the urethra, rectum, or throat, and the cervix in females. Lymphoma is a cancer of the immune network called the lymphatic system, which includes the lymph nodes/glands, spleen, thymus gland, and bone marrow; it can affect many organs in the body. When a lymphocyte develops a genetic mutation, it can signal other cells into rapid proliferation, producing many mutated lymphocytes. Multiple studies have explored the incidence of cancer in people infected with STDs such as Gonorrhea; for instance, the studies conducted by Wang Y-C and colleagues, as well as Caini S. and colleagues, established a direct correlation between Gonorrhea infection and the incidence of prostate cancer. We hypothesized that Gonorrhea infection also increases the incidence of Lymphoma. This research study aimed to evaluate the correlation between Gonorrhea infection and the incidence of Lymphoma. The data for the research were provided by a Health Insurance Portability and Accountability Act (HIPAA) compliant national database, which was used to compare patients infected with Gonorrhea against uninfected patients to establish a correlation with the prevalence of Lymphoma, using ICD-10 and ICD-9 codes. Access to the database was granted by Holy Cross Health, Fort Lauderdale, for academic research, and standard statistical methods were applied throughout. For the period between January 2010 and December 2019, the query identified 254 Lymphoma cases in the infected group and 808 in the control group; the two groups were matched by age range and CCI score. The incidence of Lymphoma was 0.998% (254 of 25,455 patients) in the Gonorrhea group compared to 3.174% (808 of 25,455 patients) in the control group. This was statistically significant, with a p-value < 2.2×10⁻¹⁶ and an OR = 0.431 (95% CI 0.381-0.487). The patients were then matched by antibiotic treatment to avoid treatment bias; here the incidence of Lymphoma was 1.215% (82 of 6,748 patients) in the Gonorrhea group compared to 2.949% (199 of 6,748 patients) in the control group, again statistically significant, with a p-value < 5.4×10⁻¹⁰ and an OR = 0.468 (95% CI 0.367-0.596). The study shows a statistically significant correlation between Gonorrhea and a reduced incidence of Lymphoma. Further evaluation is recommended to assess the potential of Gonorrhea infection in reducing Lymphoma risk.
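The reported odds ratios and confidence intervals come from a standard 2x2 table computation, which can be sketched as below. The Woolf (log) method for the CI is an assumption here; the abstract does not state which statistics package the database used, and the toy counts are not the study's matched cohorts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR), Woolf method
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Toy counts, invented for illustration:
or_, lo, hi = odds_ratio_ci(10, 90, 20, 80)
```

An OR below 1 with an upper confidence bound still below 1, as in both of the study's comparisons, indicates a statistically significant reduction in odds among the exposed group.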

Keywords: gonorrhea, lymphoma, STDs, cancer, ICD

Procedia PDF Downloads 170