Search results for: claims database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1955

1505 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering

Authors: Emiel Caron

Abstract:

Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g. to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references, i.e. they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Due to the scoring, different rules can be combined to join scientific references, i.e. the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large gold-standard set of highly cited papers, shows an average precision of 99% and recall of 95%. The method is therefore accurate but cautious, i.e. it weights precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g. in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
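The pairwise scoring and single-linkage step described above can be sketched in a few lines. This is an illustrative toy, not the Patstat implementation: the sample references, the rule weight, and the 0.8 threshold are hypothetical, and only two rules (overall string similarity plus a matching publication year extracted by regular expression) stand in for the full rule set. Single-linkage clustering of above-threshold pairs is exactly the computation of connected components, done here with a small union-find.

```python
import re
from difflib import SequenceMatcher

# Toy raw references; in Patstat the full reference sits in one attribute.
refs = [
    "Smith J, Nature, 1999, vol 401, p 23",
    "Smith J., Nature 401 (1999) 23",
    "Jones A, Science, 2005, vol 308",
]

def year(ref):
    # Metadata extraction by regular expression, as in the abstract.
    m = re.search(r"\b(19|20)\d{2}\b", ref)
    return m.group(0) if m else None

def pair_score(a, b):
    # Rules reinforce each other: string similarity plus a metadata rule.
    score = SequenceMatcher(None, a.lower(), b.lower()).ratio()
    if year(a) is not None and year(a) == year(b):
        score += 0.3              # hypothetical weight for a matching year
    return score

THRESHOLD = 0.8                   # hypothetical; tuned by expert evaluation

# Single-linkage clustering = connected components over above-threshold
# pairs, implemented with a tiny union-find.
parent = list(range(len(refs)))

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path compression
        i = parent[i]
    return i

for i in range(len(refs)):
    for j in range(i + 1, len(refs)):
        if pair_score(refs[i], refs[j]) >= THRESHOLD:
            parent[find(i)] = find(j)

clusters = {}
for i in range(len(refs)):
    clusters.setdefault(find(i), []).append(i)
print(sorted(map(sorted, clusters.values())))
```

The two "Smith" variants merge because the similarity rule and the year rule reinforce each other; the "Jones" reference stays in its own singleton cluster, mirroring the precision-over-recall behaviour described above.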

Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics

Procedia PDF Downloads 188
1504 The Quranic Case for Resurrection

Authors: Maira Farooq Maneka

Abstract:

Death has increasingly caused humans to investigate its reality and what, if anything, lies after it, with personal conviction and concern. To date, it remains a matter of speculation. Major world religions rarely offer arguments other than ‘faith’ when justifying claims about life after death (LAD), as it is an unseen phenomenon. This paper attempts to analyse the Islamic idea of resurrection (after death) and its justification, which is distinct from faith and instead contemplative in nature. To do this, a legal lens was adopted, which allowed the categorisation of selected Quranic arguments under the headings of direct evidence, indirect evidence, and intuitive reasoning. Results: four kinds of direct evidence are discussed under the themes of sleep, droughts, predictions, and the Quranic challenge. The section on indirect evidence narrows its scope to two of the many broad possible signs pointing towards the reality of resurrection: signs found in nature, such as the sun and water, and signs one finds within the human body, such as the creation and function of human fingertips. Finally, the last section tries to amalgamate the Quran’s appeal to human rationality, which facilitates the reader in accepting the possibility of resurrection and hence a final Day of Judgement. These include the notions of accountability, pleasure, pain, and human agency.

Keywords: Islam, life after death, Quran, resurrection

Procedia PDF Downloads 92
1503 Evaluation of Antiurolithiatic Potentials from Cucumis sativus Fruits

Authors: H. J. Pramod, S. Pethkar

Abstract:

The antiurolithiatic potential of Cucumis sativus fruit extracts at different doses, and of cystone (a standard formulation) at a dose of 750 mg/kg, was evaluated in both preventive and curative regimens in Wistar rats. Urolithiasis was induced by adding 0.75% v/v ethylene glycol (EG) to the drinking water for 28 days in all groups except normal rats. At the end of the experimental period (28th day), urinary parameters (urine volume, routine urine analysis, and levels of calcium, phosphate, oxalate, magnesium, and sodium), serum biomarkers (creatinine, BUN, uric acid, ALP, ALT, and AST), and kidney homogenate analyses (levels of calcium, oxalate, and phosphate) were carried out. The treated groups showed a significant increase in urine output compared to the normal group. The extract significantly decreased the urinary excretion of calcium, phosphate, magnesium, sodium, and oxalate. Both preventive and curative treatment with the extracts decreased the stone-forming constituents in the kidneys of urolithiatic rats. Furthermore, the kidneys of all groups were excised and sectioned for histopathological examination, which further supports the claim that the extract possesses antiurolithiatic activity.

Keywords: Cucumis sativus, urolithiasis, ethylene glycol, cystone

Procedia PDF Downloads 538
1502 Association of Clostridium difficile Infection and Bone Cancer

Authors: Daniela Prado, Lexi Frankel, Amalia Ardeljan, Lokesh Manjani, Matthew Cardeiro, Omar Rashid

Abstract:

Background: Clostridium difficile (C. diff) is a gram-positive bacterium that is known to cause life-threatening diarrhea and severe inflammation of the colon. It originates as an alteration of the gut microbiome and can be transmitted through spores. Recent studies have shown a high association between the development of C. diff in cancer patients due to extensive hospitalization. However, research is lacking regarding C. diff’s association with the causation or prevention of cancer. The objective of this study was therefore to assess the correlation between Clostridium difficile infection (CDI) and the incidence of bone cancer. Methods: This retrospective analysis used data provided by a Health Insurance Portability and Accountability Act (HIPAA) compliant national database to evaluate patients infected versus patients not infected with C. diff using ICD-10 and ICD-9 codes. Access to the database was granted by Holy Cross Health, Fort Lauderdale, for the purpose of academic research. Standard statistical methods were used. Results: Between January 2010 and December 2019, the query resulted in 78,863 patients in each of the infected and control groups. The two groups were matched by age range and CCI score. The incidence of bone cancer was 659 patients (0.835%) in the C. diff group compared to 1,941 patients (2.461%) in the control group. The difference was statistically significant, with a P-value < 2.2x10^-16 and an odds ratio (OR) = 0.33 (95% confidence interval (CI) 0.31-0.37). Treatment for CDI was analyzed for both the C. diff infected and noninfected populations. 91 out of 16,676 (0.55%) patients with a prior C. diff infection treated with antibiotics were compared to 275 out of 16,676 (1.65%) patients with no history of CDI who received antibiotic treatment. Results remained statistically significant, with a P-value < 2.2x10^-16 and an OR = 0.42 (95% CI 0.37-0.48). Conclusion: The study shows a statistically significant correlation between C. diff and a reduced incidence of bone cancer. Further evaluation is recommended to assess the potential of C. difficile in reducing bone cancer incidence.
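The reported odds ratio can be reproduced directly from the counts in the abstract. The sketch below uses the Woolf (log) method for the 95% CI, which lands very close to, though not exactly on, the reported (0.31-0.37); the abstract does not state which interval method was actually used.

```python
import math

# 2x2 table from the abstract: bone cancer vs. no bone cancer,
# C. diff group vs. matched control group (n = 78,863 each).
a, b = 659, 78863 - 659      # C. diff:  cancer / no cancer
c, d = 1941, 78863 - 1941    # control:  cancer / no cancer

odds_ratio = (a * d) / (b * c)

# Woolf (log) method for the 95% confidence interval.
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```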

Keywords: bone cancer, colitis, clostridium difficile, microbiome

Procedia PDF Downloads 273
1501 Killed by the ‘Subhuman’: Jane Longhurst’s Murder and the Construction of the ‘Extreme Pornography’ Problem in the British National Press

Authors: Dimitrios Akrivos, Alexandros K. Antoniou

Abstract:

This paper looks at the crucial role of the British news media in the construction of extreme pornography as a social problem, suggesting that this paved the way for the subsequent criminalization of such material through the introduction of the Criminal Justice and Immigration Act 2008. Focusing on the high-profile case of Graham Coutts, it examines the British national press’ reaction to Jane Longhurst’s murder through a qualitative content analysis of 251 relevant news articles. Specifically, the paper documents the key arguments expressed in the corresponding claims-making process. It considers the different ways in which the consequent ‘trial by media’ presented this exceptional case as the ‘tip of the iceberg’ and eventually translated into policy. The analysis sheds light on the attempts to ‘piggyback’ the issue of extreme pornography on child sexual abuse images as well as the textual and visual mechanisms used to establish an ‘us versus them’ dichotomy in the pertinent media discourse. Finally, the paper assesses the severity of the actual risk posed by extreme pornography, concluding that its criminalization should not merely be dismissed as the outcome of an institutionalized media panic.

Keywords: criminalization, extreme pornography, social problem, trial by media

Procedia PDF Downloads 237
1500 Energy Intensity: A Case of Indian Manufacturing Industries

Authors: Archana Soni, Arvind Mittal, Manmohan Kapshe

Abstract:

Energy has been recognized as one of the key inputs for the economic growth and social development of a country. High economic growth naturally means a high level of energy consumption. However, in the present energy scenario, where there is a wide gap between energy generation and energy consumption, it is extremely difficult to match demand with supply. As India is one of the largest and most rapidly growing developing countries, there is an impending energy crisis which requires immediate measures to be adopted. In this situation, the concept of Energy Intensity comes under special focus as a means to ensure energy security in an environmentally sustainable way. Energy Intensity is defined as the energy consumed per unit of output in the context of industrial energy practices. It is a key determinant of projections of future energy demand, which assist in policy making. Energy Intensity is inversely related to energy efficiency: the less energy required to produce a unit of output or service, the greater the energy efficiency. The Energy Intensity of Indian manufacturing industries is among the highest in the world and accounts for enormous energy consumption. Hence, reducing the Energy Intensity of Indian manufacturing industries is one of the best strategies to achieve a lower level of energy consumption and conserve energy. This study attempts to analyse the factors which influence the Energy Intensity of Indian manufacturing firms and how they can be used to reduce Energy Intensity. The paper considers six of the largest energy-consuming manufacturing industries in India, viz. the aluminium, cement, iron and steel, textile, fertilizer, and paper industries, and conducts a detailed Energy Intensity analysis using data from the PROWESS database of the Centre for Monitoring Indian Economy (CMIE). A total of twelve independent explanatory variables, based on factors such as raw material, labour, machinery, repair and maintenance, production technology, outsourcing, research and development, number of employees, wages paid, profit margin, and capital invested, have been taken into consideration for the analysis.
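The definition above reduces to a one-line formula; the figures below are invented for illustration and are not from the PROWESS data.

```python
def energy_intensity(energy_consumed_gj, output_tonnes):
    """Energy consumed per unit of physical output (here GJ per tonne)."""
    return energy_consumed_gj / output_tonnes

# A hypothetical cement plant consuming 3.4 GJ per tonne of output.
ei = energy_intensity(3_400_000, 1_000_000)
efficiency = 1 / ei   # energy efficiency is the inverse of intensity
print(ei)             # 3.4
```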

Keywords: energy intensity, explanatory variables, manufacturing industries, PROWESS database

Procedia PDF Downloads 327
1499 Development of DNA Fingerprints in Selected Medicinal Plants of India

Authors: V. Verma, Hazi Raja

Abstract:

Conventionally, morphological descriptors are routinely used for establishing the identity of varieties. However, these morphological descriptors suffer from many drawbacks, such as the influence of environment on trait expression, epistatic interactions, and pleiotropic effects. Furthermore, the paucity of a sufficient number of these descriptors for the unequivocal identification of an increasing number of reference collection varieties forces a search for alternatives. Therefore, DNA fingerprint-based techniques were selected to define the systematic position of selected medicinal plants such as Plumbago zeylanica, Desmodium gangeticum, and Uraria picta. DNA fingerprinting of herbal plants can be useful in authenticating the various claims of medicinal uses related to the plants, and in germplasm characterization and conservation. In plants it has not only helped in identifying species but has also defined a new realm in plant genomics and plant breeding, and helped in conserving biodiversity. With the world paving the way for developments in biotechnology, DNA fingerprinting promises to be a very powerful tool in our future endeavors. Data will be presented on the development of microsatellite markers (SSR) used to fingerprint, characterize, and assess genetic diversity among 12 accessions of Plumbago zeylanica, 4 accessions of Desmodium gangeticum, and 4 accessions of Uraria picta.

Keywords: Plumbago zeylanica, Desmodium gangeticum, Uraria picta, microsatellite markers

Procedia PDF Downloads 211
1498 ROSgeoregistration: Aerial Multi-Spectral Image Simulator for the Robot Operating System

Authors: Andrew R. Willis, Kevin Brink, Kathleen Dipple

Abstract:

This article describes a software package called ROSgeoregistration intended for use with the robot operating system (ROS) and the Gazebo 3D simulation environment. ROSgeoregistration provides tools for the simulation, test, and deployment of aerial georegistration algorithms and is available at github.com/uncc-visionlab/rosgeoregistration. A model creation package is provided which downloads multi-spectral images from the Google Earth Engine database and, if necessary, incorporates these images into a single, possibly very large, reference image. Additionally, a Gazebo plugin which uses the real-time sensor pose and image formation model to generate simulated imagery using the specified reference image is provided, along with related plugins for UAV-relevant data. The novelty of this work is threefold: (1) this is the first system to link the massive multi-spectral imaging database of Google’s Earth Engine to the Gazebo simulator, (2) this is the first example of a system that can simulate geospatially and radiometrically accurate imagery from multiple sensor views of the same terrain region, and (3) integration with other UAS tools creates a new holistic UAS simulation environment to support UAS system and subsystem development where real-world testing would generally be prohibitive. Sensed imagery and ground truth registration information are published to client applications, which can receive imagery synchronously with telemetry from other payload sensors, e.g., IMU, GPS/GNSS, barometer, and windspeed sensor data. To highlight functionality, we demonstrate ROSgeoregistration for simulating Electro-Optical (EO) and Synthetic Aperture Radar (SAR) image sensors and an example use case for developing and evaluating image-based UAS position feedback, i.e., pose estimation for image-based Guidance, Navigation, and Control (GNC) applications.

Keywords: EO-to-EO, EO-to-SAR, flight simulation, georegistration, image generation, robot operating system, vision-based navigation

Procedia PDF Downloads 99
1497 Prioritizing Road Safety Based on the Quasi-Induced Exposure Method and Utilization of the Analytical Hierarchy Process

Authors: Hamed Nafar, Sajad Rezaei, Hamid Behbahani

Abstract:

Safety analysis of roads through accident rates, one of the most widely used tools, has traditionally relied on the direct exposure method, which is based on vehicle-kilometers traveled and vehicle travel time. However, due to some fundamental flaws in its theory and difficulties in gaining access to the required data, such as traffic volume and the distance and duration of trips, as well as various problems in determining exposure for specific time, place, and individual categories, there is a need for an algorithm for prioritizing road safety so that, with a new exposure method, the problems of the previous approaches can be resolved. In this way, an efficient application may lead to more realistic comparisons, and the new method would be applicable to a wider range of time, place, and individual categories. Therefore, an algorithm was introduced to prioritize the safety of roads using the quasi-induced exposure method and the analytical hierarchy process. For this research, 11 provinces of Iran were chosen as case study locations. A rural accident database was created for these provinces, the validity of the quasi-induced exposure method for Iran’s accident database was explored, and the involvement ratio for different characteristics of drivers and vehicles was measured. Results showed that the quasi-induced exposure method was valid in determining the real exposure in the provinces under study. Results also showed a significant difference in prioritization between the new and traditional approaches. This difference mostly stems from the perspective of the quasi-induced exposure method in determining exposure, the opinion of experts, and the quantity of accident data. Overall, the results of this research show that prioritization based on the new approach is more comprehensive and reliable compared to prioritization in the traditional approach, which depends on various parameters including driver-vehicle characteristics.
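The involvement ratio mentioned above follows the usual quasi-induced exposure logic: not-at-fault drivers in two-vehicle crashes are taken as a quasi-random sample of traffic, so a group's share among at-fault drivers divided by its share among not-at-fault drivers indicates over- or under-involvement relative to exposure. The driver groups and counts below are hypothetical, not from the 11-province Iranian database.

```python
# Hypothetical crash counts by driver group (two-vehicle crashes).
at_fault     = {"young driver": 320, "middle-aged": 540, "older driver": 140}
not_at_fault = {"young driver": 210, "middle-aged": 660, "older driver": 130}

def involvement_ratio(group):
    # Not-at-fault drivers serve as the exposure proxy.
    p_at  = at_fault[group] / sum(at_fault.values())
    p_not = not_at_fault[group] / sum(not_at_fault.values())
    return p_at / p_not   # > 1 means over-involved relative to exposure

for g in at_fault:
    print(g, round(involvement_ratio(g), 2))
```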

Keywords: road safety, prioritizing, quasi-induced exposure, analytical hierarchy process

Procedia PDF Downloads 332
1496 A Real-World Roadmap and Exploration of Quantum Computers' Capacity to Trivialise Internet Security

Authors: James Andrew Fitzjohn

Abstract:

This paper intends to discuss and explore the practical aspects of cracking encrypted messages with quantum computers. The theory of this process has been shown and well described both in academic papers and headline-grabbing news articles, but amid all the theory and hyperbole, we must be careful to assess the practicalities of these claims. Therefore, we will use real-world devices and proof-of-concept code to prove or disprove the notion that quantum computers will render the encryption technologies used by many websites unfit for purpose. It is time to discuss and implement the practical aspects of the process, as many advances in quantum computing hardware and software have recently been made. This paper will set expectations regarding the useful lifespan of RSA and cipher lengths and propose alternative encryption technologies. We will set out comprehensive roadmaps describing when and how encryption schemes can be used, including when they can no longer be trusted. Cost will also be factored into our investigation; for example, it would make little financial sense to spend millions of dollars on a quantum computer to recover a private key in seconds when a commodity GPU could perform the same task in hours. It is hoped that the real-world results depicted in this paper will help influence the owners of websites, who can take appropriate actions to improve the security of their provisions.

Keywords: quantum computing, encryption, RSA, roadmap, real world

Procedia PDF Downloads 123
1495 GIS-Based Identification of Overloaded Distribution Transformers and Calculation of Technical Electric Power Losses

Authors: Awais Ahmed, Javed Iqbal

Abstract:

Pakistan has for many years been facing extreme challenges from an energy deficit due to a shortage of power generation compared to increasing demand. A part of this energy deficit is also contributed by the power lost in the transmission and distribution network. Unfortunately, distribution companies are not equipped with modern technologies and methods to identify and eliminate these losses. According to estimates, total energy lost in the early 2000s was between 20 and 26 percent. To address this issue, the present research study was designed with the objective of developing a standalone GIS application for distribution companies having the capability of loss calculation as well as identification of overloaded transformers. For this purpose, the Hilal Road feeder in Faisalabad Electric Supply Company (FESCO) was selected as the study area. An extensive GPS survey was conducted to identify each consumer, linking it to the secondary pole of the transformer, geo-referencing equipment, and documenting conductor sizes. To identify overloaded transformers, the accumulated kWh readings of the consumers on a transformer were compared with a threshold kWh. Technical losses of the 11kV and 220V lines were calculated using data from the substation and the resistance of the network calculated from the geo-database. To automate the process, a standalone GIS application was developed using ArcObjects with engineering analysis capabilities. The application uses the GIS database developed for the 11kV and 220V lines to display and query spatial data and presents results in the form of graphs. The results show a technical loss of about 14% on both the high tension (HT) and low tension (LT) networks, while 4 out of 15 general duty transformers were found to be overloaded. The study shows that GIS can be a very effective tool for distribution companies in the management and planning of their distribution network.
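The two checks the application automates can be sketched as follows: flagging a transformer whose accumulated consumer kWh implies an average load above a rating-derived threshold, and estimating technical line loss as I²R. The transformer ratings, readings, power factor, and 80% load limit below are placeholder values, not FESCO data; in the study the resistances come from the GIS geo-database.

```python
# name: (rating in kVA, accumulated consumer kWh per month) -- illustrative
transformers = {
    "T1": (100, 31_000),
    "T2": (200, 118_000),
}
HOURS, PF, LOAD_LIMIT = 730, 0.9, 0.8   # hours/month, power factor, 80% limit

def overloaded(rating_kva, kwh):
    avg_kw = kwh / HOURS                 # average demand from kWh readings
    return avg_kw > LOAD_LIMIT * rating_kva * PF

def line_loss_kw(current_a, resistance_ohm, phases=3):
    # Technical loss: I^2 R per phase, summed over phases, in kW.
    return phases * current_a**2 * resistance_ohm / 1000

print([n for n, (kva, kwh) in transformers.items() if overloaded(kva, kwh)])
print(round(line_loss_kw(60, 0.4), 2))   # 60 A on 0.4 ohm per phase
```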

Keywords: geographical information system, GIS, power distribution, distribution transformers, technical losses, GPS, SDSS, spatial decision support system

Procedia PDF Downloads 372
1494 Interpreting Privacy Harms from a Non-Economic Perspective

Authors: Christopher Muhawe, Masooda Bashir

Abstract:

With increased Internet Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has been in tandem with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in the United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms from an economic or physical stance negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical harm perspective negates the fact that data insecurity may result in harms which run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy, and human social relations, and the furtherance of the existence of a free society. No economic value can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.

Keywords: data breach and misuse, economic harms, privacy harms, psychological harms

Procedia PDF Downloads 192
1493 International Conference on Comparative Religion and Mythology

Authors: Mara Varelaki

Abstract:

In response to the challenge of the environmental crisis, the discipline of environmental ethics examines the relation of human beings towards the environment and the value of the non-human constituents of the surrounding world. In the face of this crisis, assumptions regarding human and nature relations ought to be traced and reexamined, because they can cause difficulties in diagnosing problematic attitudes towards the environment and non-human animals. This paper examines the claims that European and Judeo-Christian cosmogonic myths place the human figure at the core of the creation of the cosmos, thus affirming a hierarchical structure where humans occupy the top, and that they establish a perception of nature as a non-human other. By doing so, these narratives provide some justification for the notion of the human-nature dichotomy and human domination over other life forms and ecosystems. These anthropocentric assumptions evolved into what Hilde Lindemann terms master narratives, and their influence extends to ecocentric ethical theories which attempt, and often fail, to shed the anthropocentrism of the western ethical tradition. The goal of this paper is (1) to trace the anthropocentric assumptions embedded in western thought and (2) to articulate how they maintain their grip on our contemporary understanding of the human relation to, and position within, the environment, thus showing the need for a method of detecting and bracketing anthropocentric assumptions in social narratives and ethical frameworks.

Keywords: cosmogonies, anthropocentrism, human/nature dichotomy, master narratives, ecocentrism

Procedia PDF Downloads 103
1492 Musical Instruments Classification Using Machine Learning Techniques

Authors: Bhalke D. G., Bormane D. S., Kharate G. K.

Abstract:

This paper presents the classification of musical instruments using machine learning techniques. The classification has been carried out using temporal, spectral, cepstral, and wavelet features. Detailed feature analysis is carried out using separate and combined features. Further, instrument models have been developed using K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) classifiers. The benchmark McGill University database has been used to test the performance of the system. Experimental results show that SVM performs better compared to the KNN classifier.
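As a minimal sketch of the KNN side of the approach (an SVM would normally come from a library), the toy below classifies instruments from two hand-crafted features, e.g. a spectral centroid in Hz and a zero-crossing rate. The feature values and instrument labels are invented for illustration; the actual system extracts temporal, spectral, cepstral, and wavelet features from audio.

```python
import math

# (spectral_centroid_hz, zero_crossing_rate) -> label; invented values.
train = [
    ((2600.0, 0.11), "violin"),
    ((2500.0, 0.10), "violin"),
    (( 700.0, 0.03), "cello"),
    (( 650.0, 0.04), "cello"),
]

def knn_predict(x, k=3):
    # k nearest training points by Euclidean distance, then majority vote.
    nearest = sorted(train, key=lambda p: math.dist(x, p[0]))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

print(knn_predict((2550.0, 0.10)))  # lies near the violin examples
```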

Keywords: feature extraction, SVM, KNN, musical instruments

Procedia PDF Downloads 476
1491 Risk of Fractures at Different Anatomic Sites in Patients with Irritable Bowel Syndrome: A Nationwide Population-Based Cohort Study

Authors: Herng-Sheng Lee, Chi-Yi Chen, Wan-Ting Huang, Li-Jen Chang, Solomon Chih-Cheng Chen, Hsin-Yi Yang

Abstract:

A variety of gastrointestinal disorders, such as Crohn’s disease, ulcerative colitis, and coeliac disease, are recognized as risk factors for osteoporosis and osteoporotic fractures. One recent study suggests that individuals with irritable bowel syndrome (IBS) might also be at increased risk of osteoporosis and osteoporotic fractures. Up to now, the association between IBS and the risk of fractures at different anatomic sites has not been completely clear. We conducted a population-based cohort analysis to investigate the fracture risk of IBS in comparison with a non-IBS group. We identified 29,505 adults aged ≥ 20 years with newly diagnosed IBS using the Taiwan National Health Insurance Research Database in 2000-2012. A comparison group was constructed of patients without IBS who were matched according to gender and age. The occurrence of fracture was monitored until the end of 2013. We analyzed the risk of fracture events in IBS using Cox proportional hazards regression models. Patients with IBS had a higher incidence of osteoporotic fractures compared with the non-IBS group (12.34 versus 9.45 per 1,000 person-years) and an increased risk of osteoporotic fractures (adjusted hazard ratio [aHR] = 1.27, 95% confidence interval [CI] = 1.20-1.35). Site-specific analysis showed that the IBS group had a higher risk of fractures of the spine, forearm, hip, and hand than the non-IBS group. With further stratification by gender and age, a higher aHR for osteoporotic fractures in the IBS group was seen across all age groups in males, but only in the elderly among females. In addition, female gender, advanced age, low income, hypertension, coronary artery disease, cerebrovascular disease, and depressive disorders were identified as independent osteoporotic fracture risk factors in IBS patients. IBS should be considered a risk factor for osteoporotic fractures, particularly in female individuals and for fracture sites located at the spine, forearm, hip, and hand.
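The reported rates can be reconstructed as events per 1,000 person-years. The event and person-year counts below are invented to reproduce the published rates, not the actual NHIRD tallies, and the crude rate ratio they yield differs from the adjusted HR of 1.27, which accounts for covariates via the Cox model.

```python
def incidence_per_1000py(events, person_years):
    """Incidence rate expressed per 1,000 person-years of follow-up."""
    return 1000 * events / person_years

# Hypothetical counts chosen to match the reported rates (12.34 vs 9.45).
ibs     = incidence_per_1000py(2468, 200_000)
control = incidence_per_1000py(1890, 200_000)
crude_rate_ratio = ibs / control   # crude, i.e. before Cox adjustment

print(round(ibs, 2), round(control, 2))   # 12.34 9.45
```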

Keywords: irritable bowel syndrome, fracture, gender difference, longitudinal health insurance database, public health

Procedia PDF Downloads 225
1490 Rethinking the Pre-Trial Detention Law of Ethiopia: An International Law and Constitutional Law Perspective

Authors: Addisu Teshama

Abstract:

The existing criminal procedure law, which is the main determinant of the phenomenon of pre-trial detention, is under revision in Ethiopia. The drafting work is completed and has been submitted for approval to the House of Peoples' Representatives. The drafters of the draft law claim that the existing law is not in harmony with the constitutionally and internationally recognized principles pertinent to pre-trial detention regulation. Further, the drafters allege that the drafting process is dictated by the human rights principles recognized in the FDRE constitution and the international human rights instruments ratified by Ethiopia. This article aims to assess the plausibility of the claims of the drafters. For that purpose, this article uses the standards and guidelines articulated by international human rights standard setters as benchmarks against which to juxtapose and judge the existing law and the draft criminal procedure and evidence code (DCrimPEC). The study found that many aspects of the existing pre-trial detention law of Ethiopia are not in compliance with international law standards. The DCrimPEC aims to harmonize the existing law with the constitution and international law standards. In this regard, the study found that the DCrimPEC has made significant changes to pre-trial detention policies that were not in harmony with the principle of the presumption of innocence. However, gaps remain.

Keywords: pre-trial detention, right to personal liberty, right to bail, Ethiopia

Procedia PDF Downloads 47
1489 Investigating the Side Effects in Patients with Severe COVID-19 and Choosing Appropriate Medication Regimens to Deal with Them

Authors: Rasha Ahmadi

Abstract:

In December 2019, a coronavirus, currently identified as SARS-CoV-2, produced a series of acute atypical respiratory illnesses in Wuhan, Hubei Province, China. The sickness induced by this virus was named COVID-19. The virus is transmittable between humans and has caused pandemics worldwide. The number of death tolls continues to climb and a huge number of countries have been obliged to perform social isolation and lockdown. Lack of focused therapy continues to be a problem. Epidemiological research showed that senior patients were more susceptible to severe diseases, whereas children tend to have milder symptoms. In this study, we focus on other possible side effects of COVID-19 and more detailed treatment strategies. Using bioinformatics analysis, we first isolated the gene expression profile of patients with severe COVID-19 from the GEO database. Patients' blood samples were used in the GSE183071 dataset. We then categorized the genes with high and low expression. In the next step, we uploaded the genes separately to the Enrichr database and evaluated our data for signs and symptoms as well as related medication regimens. The results showed that 138 genes with high expression and 108 genes with low expression were observed differentially in the severe COVID-19 VS control group. Symptoms and diseases such as embolism and thrombosis of the abdominal aorta, ankylosing spondylitis, suicidal ideation or attempt, regional enteritis were observed in genes with high expression and in genes with low expression of acute and subacute forms of ischemic heart, CNS infection and poliomyelitis, synovitis and tenosynovitis. Following the detection of diseases and possible signs and symptoms, Carmustine, Bithionol, Leflunomide were evaluated more significantly for high-expression genes and Chlorambucil, Ifosfamide, Hydroxyurea, Bisphenol for low-expression genes. 
In general, examining the different and less visible aspects of COVID-19 and identifying possible treatments can significantly aid in the emergency care and hospitalization of patients.
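As an illustration, the high/low expression split described above can be sketched as follows. The gene names, fold changes, and thresholds here are invented for illustration; the study derives the actual groups from the GSE183071 blood samples.

```python
# Hypothetical differential-expression results: (gene, log2 fold change
# vs. control, adjusted p-value). Values are illustrative only.
degs = [
    ("IFI27",   3.1, 0.001),
    ("S100A12", 2.4, 0.004),
    ("CD3E",   -1.8, 0.010),
    ("IL7R",   -2.2, 0.002),
    ("MMP8",    2.9, 0.020),
]

# Keep only significant genes, then split by direction of change; the two
# resulting lists would be uploaded separately to Enrichr for enrichment.
significant = [(g, lfc) for g, lfc, p in degs if p < 0.05]
high_expression = [g for g, lfc in significant if lfc > 1]
low_expression  = [g for g, lfc in significant if lfc < -1]
```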

Keywords: phenotypes, drug regimens, gene expression profiles, bioinformatics analysis, severe COVID-19

Procedia PDF Downloads 135
1488 A Framework for an Automated Decision Support System for Selecting Safety-Conscious Contractors

Authors: Rawan A. Abdelrazeq, Ahmed M. Khalafallah, Nabil A. Kartam

Abstract:

Selection of competent contractors for construction projects is usually accomplished through competitive bidding or negotiated contracting, in which the contract bid price is the basic criterion for selection. The evaluation of a contractor's safety performance is still not a typical criterion in the selection process, despite the existence of various safety prequalification procedures. There is a critical need for practical and automated systems that enable owners and decision makers to evaluate contractor safety performance among other important contractor selection criteria. These systems should ultimately favor the selection of safety-conscious contractors by virtue of their past good safety records and current safety programs. This paper presents an exploratory sequential mixed-methods approach to develop a framework for an automated decision support system that evaluates contractor safety performance based on a multitude of indicators and metrics identified through a comprehensive review of construction safety research and a survey distributed to domain experts. The framework is developed in three phases: (1) determining the indicators that depict a contractor's current and past safety performance; (2) soliciting input from construction safety experts regarding the identified indicators, their metrics, and their relative significance; and (3) designing a decision support system using relational database models to integrate the identified indicators and metrics into a system that assesses and rates the safety performance of contractors.
The proposed automated system is expected to hold several advantages, including: (1) reducing the likelihood of selecting contractors with poor safety records; (2) enhancing the odds of completing the project safely; and (3) encouraging contractors to exert more effort to improve their safety performance and practices in order to increase their bid-winning opportunities, which can lead to significant safety improvements in the construction industry. This should prove useful to decision makers and researchers alike and should help improve the safety record of the construction industry.
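The rating step of phase (3) can be sketched as a weighted combination of normalized indicators. The indicator names, weights, and contractor scores below are hypothetical stand-ins for the metrics elicited from the expert survey.

```python
# Illustrative only: weights would come from the expert survey described above.
INDICATOR_WEIGHTS = {
    "emr":            0.30,  # experience modification rate (normalized, 1 = best)
    "osha_citations": 0.25,  # citation record over recent years (1 = best)
    "training_hours": 0.25,  # safety training hours per worker (1 = best)
    "safety_program": 0.20,  # audit score of the current safety program (0-1)
}

def safety_score(indicators: dict) -> float:
    """Combine pre-normalized indicators (0-1, 1 = safest) into one rating."""
    return round(sum(INDICATOR_WEIGHTS[k] * indicators[k]
                     for k in INDICATOR_WEIGHTS), 3)

contractors = {
    "A": {"emr": 0.9, "osha_citations": 0.8, "training_hours": 0.7, "safety_program": 0.9},
    "B": {"emr": 0.5, "osha_citations": 0.4, "training_hours": 0.9, "safety_program": 0.6},
}
# Rank contractors from safest to least safe, as the DSS would for the owner.
ranked = sorted(contractors, key=lambda c: safety_score(contractors[c]), reverse=True)
```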

Keywords: construction safety, contractor selection, decision support system, relational database

Procedia PDF Downloads 276
1487 The Optimal Irrigation in the Mitidja Plain

Authors: Gherbi Khadidja

Abstract:

In the Mediterranean region, water resources are limited and very unevenly distributed in space and time. The main objective of this project is the development of a wireless network for the management of water resources in the Mitidja plain in northern Algeria, which helps farmers irrigate in the most optimized way and addresses the problem of water shortage in the region. Therefore, we will develop an aid tool that can modernize and replace some traditional techniques, driven by the real needs of the crops and by the soil and climatic conditions (soil moisture, precipitation, characteristics of the unsaturated zone). These data are collected in real time by sensors, analyzed by an algorithm, and displayed on a mobile application and a website. The results are essential information and alerts, with recommendations for action, that help farmers ensure the sustainability of the agricultural sector under water shortage conditions. In the first part, we set up a wireless sensor network for precise management of water resources, using equipment that measures the water content of the soil, such as a Watermark probe connected via an acquisition card to an Arduino Uno, which collects the sensed data and transmits it through a GSM module to a website, where it is stored in a database for later study. In the second part, we display the results on a website and a mobile application that use the database to remotely manage our smart irrigation system. This allows the farmer to use this technology and offers growers remote access, via wireless communication, to the field conditions and the irrigation operation from home or the office. The tool to be developed will also draw on satellite imagery for land use and soil moisture.
These tools will make it possible to follow the evolution of crop water needs over time and to predict the impact on water resources. According to the references consulted, such a tool can reduce irrigation volumes by up to 40%, which represents more than 100 million m3 of savings per year for the Mitidja. This volume is equivalent to a medium-size dam.
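A minimal sketch of the alert logic the application could apply is given below. A Watermark probe reports soil water tension (higher readings mean drier soil); the tension thresholds and the rain-forecast cutoff are illustrative, since real values depend on the crop and soil type.

```python
# Hypothetical decision rule for the farmer-facing alerts; thresholds
# (centibars of soil tension, mm of forecast rain) are assumptions.
def irrigation_advice(soil_tension_cb: float, rain_forecast_mm: float) -> str:
    if rain_forecast_mm >= 10:
        return "hold: rain expected"
    if soil_tension_cb >= 60:
        return "irrigate now"        # soil is dry
    if soil_tension_cb >= 30:
        return "irrigate soon"
    return "no irrigation needed"    # soil is still moist

advice = irrigation_advice(soil_tension_cb=75, rain_forecast_mm=0)
```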

Keywords: optimal irrigation, soil moisture, smart irrigation, water management

Procedia PDF Downloads 102
1486 A Research on Determining the Viability of a Job Board Website for Refugees in Kenya

Authors: Prince Mugoya, Collins Oduor Ondiek, Patrick Kanyi Wamuyu

Abstract:

The Refugee Job Board Website is a web-based application that provides a platform for organizations to post jobs specifically for refugees. Organizations upload job opportunities, and refugees can view them on the website. The website also allows refugees to input their skills and qualifications. The methodology used to develop this system is the waterfall (traditional) methodology. Software development tools include Brackets, which is used to code the website, and phpMyAdmin to manage the database in which all the data are stored.

Keywords: information technology, refugee, skills, utilization, economy, jobs

Procedia PDF Downloads 163
1485 Operationalizing the Concept of Community Resilience through Community Capitals Framework-Based Index

Authors: Warda Ajaz

Abstract:

This study uses the ‘Community Capitals Framework’ (CCF) to develop a community resilience index that can serve as a useful tool for measuring the resilience of communities in diverse contexts and backgrounds. CCF is an important analytical tool for assessing holistic community change. The framework identifies seven major types of community capitals: natural, cultural, human, social, political, financial, and built, and claims that the communities that have been successful in supporting healthy, sustainable community and economic development have paid attention to all of these capitals. The framework therefore proposes to study community development through the identification of assets in these major capitals (stock), investment in these capitals (flow), and the interaction between these capitals. Capital-based approaches have been used extensively to assess community resilience, especially in the context of natural disasters and extreme events. This study therefore identifies key indicators for estimating each of the seven capitals through an extensive literature review and then develops an index to calculate a community resilience score. The CCF-based community resilience index presents an innovative way of operationalizing the concept of community resilience and will contribute toward decision-relevant research regarding adaptation to and mitigation of community vulnerabilities to climate change-induced and other adverse events.
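The index construction can be sketched as aggregating one pre-normalized score per capital into a single resilience value. The capital scores and the equal weighting below are illustrative assumptions; the study derives its indicators from the literature review.

```python
# The seven capitals of the Community Capitals Framework.
CAPITALS = ["natural", "cultural", "human", "social",
            "political", "financial", "built"]

def resilience_index(capital_scores: dict) -> float:
    """Average the seven capital scores (each pre-normalized to 0-1)
    into a single community resilience score. Equal weights are an
    illustrative assumption."""
    return round(sum(capital_scores[c] for c in CAPITALS) / len(CAPITALS), 3)

# Invented scores for one community, e.g. from min-max normalized indicators.
community = {"natural": 0.6, "cultural": 0.5, "human": 0.8, "social": 0.7,
             "political": 0.4, "financial": 0.55, "built": 0.65}
score = resilience_index(community)
```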

Keywords: adverse events, community capitals, community resilience, climate change, economic development, sustainability

Procedia PDF Downloads 264
1484 DHL CSI Solution Design Project

Authors: Mohammed Al-Yamani, Yaser Miaji

Abstract:

The DHL Customer Solutions and Innovation (CSI) department has been experiencing difficulties when comparing quotes for different customers in different years. Currently, employees process the data by opening several loaded Excel files containing the quotes and manually copying values to another Excel workbook where the comparison is made. This project consists of developing a new and effective database for the DHL CSI department so that information is stored together in the same catalog. To this end, we have been assigned to find an efficient algorithm that can deal with the different formats of the Excel workbooks to copy and store the express customer rates for the core products (DOX, WPX, IMP) for comparison purposes.
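The consolidation step could be sketched as below, assuming each workbook has already been read into rows (for example with openpyxl or pandas). The column labels are invented to mimic two of the differing quote formats; the real formats are not specified in the abstract.

```python
# Invented rows standing in for two differently formatted Excel workbooks.
RAW_ROWS = [
    {"Product": "DOX", "Rate (USD)": "12.50", "Year": 2019},  # format A
    {"prod_code": "WPX", "rate_usd": 15.0, "year": 2020},     # format B
    {"prod_code": "FRT", "rate_usd": 9.0, "year": 2020},      # not a core product
]

CORE_PRODUCTS = {"DOX", "WPX", "IMP"}

def normalize(row: dict) -> dict:
    """Map heterogeneous column names onto one catalog schema."""
    return {
        "product": row.get("Product") or row.get("prod_code"),
        "rate": float(row.get("Rate (USD)") or row.get("rate_usd")),
        "year": row.get("Year") or row.get("year"),
    }

# Keep only core-product rates, stored together in one catalog for comparison.
catalog = [normalize(r) for r in RAW_ROWS
           if (r.get("Product") or r.get("prod_code")) in CORE_PRODUCTS]
```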

Keywords: DHL, solution design, Oracle, Excel

Procedia PDF Downloads 405
1483 "Black Book": Dutch Prototype or Jewish Outsider

Authors: Eyal Boers

Abstract:

This paper shall demonstrate how films can offer a valuable and innovative approach to the study of images, stereotypes, and national identity. "Black Book" ("Zwartboek", 2006), a World War Two film directed by Paul Verhoeven, tells the story of Rachel Stein, a young Jewish woman who becomes a member of a resistance group in the Netherlands. The main hypothesis in this paper maintains that Rachel's character possesses both features of the Dutch prototype (a white, secular, sexual, freedom-loving individualist who seems "Dutch" enough to be accepted into a Dutch resistance group and even infiltrate the local Nazi headquarters) and features which can be defined as specifically Jewish (a black-haired victim persecuted by the Nazis, transforming herself into a gentile, while remaining loyal to her fellow Jews and ultimately immigrating to Israel and becoming a Hebrew teacher in a Kibbutz). Finally, this paper claims that Rachel's "Dutchness" is symptomatic of Dutch nostalgia in the 21st century for the Jews as "others" who blend into dominant Dutch culture, while Rachel's "Jewish Otherness" reflects a transnational identity – one that is always shifting and traverses cultural and national boundaries. In this sense, a film about Dutch Jews in the Second World War reflects on issues of identity in the 21st century.

Keywords: Dutch, film, stereotypes, identity

Procedia PDF Downloads 121
1482 A Study on the Role of Human Rights in the Aid Allocations of China and the United States

Authors: Shazmeen Maroof

Abstract:

The study is motivated by a desire to investigate whether there is substance to claims that, relative to traditional donors, China disregards human rights considerations when allocating overseas aid. While the stated policy of the U.S. is that consideration of potential aid recipients' respect for human rights is mandatory, some quantitative studies have cast doubt on whether this is reflected in actual allocations. There is a lack of academic literature that formally assesses the extent to which the two countries' aid allocations differ, which is essential to test whether the criticisms of China's aid policy in comparison to that of the U.S. are justified. Using data on two standard human rights measures, the 'Political Terror Scale' and 'Civil Liberties', the study analyses the two donors' aid allocations among 125 countries over the period 2000 to 2014. The bivariate analysis demonstrated that a significant share of China's aid flows to countries with poor human rights records. At the same time, the U.S. seems little different in providing aid to these countries. The empirical results obtained from the Fractional Logit model also provided some support for the general pessimism regarding China's provision of aid to countries with poor human rights records, yet they challenge the optimists expecting better-targeted aid from the U.S. These findings hold both when aid is split into humanitarian and non-humanitarian components and in the sample of countries whose human rights records fall below a threshold level.
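The Fractional Logit idea can be sketched as follows: an aid share y in [0, 1] is modeled as E[y | x] = logistic(b0 + b1·x) and fit by maximizing a Bernoulli quasi-log-likelihood. The data points (human rights score, aid share) and the coarse grid search below are purely illustrative; real estimation would use Newton-type maximization on the study's panel.

```python
import math

def logistic(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def quasi_loglik(beta, data) -> float:
    """Bernoulli quasi-log-likelihood, valid for fractional outcomes."""
    b0, b1 = beta
    return sum(y * math.log(logistic(b0 + b1 * x))
               + (1 - y) * math.log(1 - logistic(b0 + b1 * x))
               for x, y in data)

# Invented observations: (human rights score, share of donor aid received).
data = [(1.0, 0.05), (2.0, 0.10), (4.0, 0.02), (5.0, 0.01)]

# A coarse grid search stands in for proper quasi-ML estimation.
grid = [(b0 / 10, b1 / 10) for b0 in range(-40, 1) for b1 in range(-10, 11)]
best = max(grid, key=lambda b: quasi_loglik(b, data))
```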

Keywords: China's aid policy, foreign aid allocation, human rights, United States Foreign Assistance Act

Procedia PDF Downloads 107
1481 Artificial Neural Networks and Hidden Markov Model in Landslides Prediction

Authors: C. S. Subhashini, H. L. Premaratne

Abstract:

Landslides are the most recurrent and prominent disaster in Sri Lanka. Sri Lanka has been subjected to a number of extreme landslide disasters that resulted in significant loss of life, material damage, and distress. It is necessary to explore solutions for preparedness and mitigation to reduce the recurrent losses associated with landslides. Artificial Neural Networks (ANNs) and Hidden Markov Models (HMMs) are now widely used in many computer applications spanning multiple domains. This research examines the effectiveness of Artificial Neural Networks and Hidden Markov Models in landslide prediction and the possibility of applying this modern technology to predict landslides in a prominent geographical area of Sri Lanka. A thorough survey was conducted with the participation of resource persons from several national universities in Sri Lanka to identify and rank the influencing factors for landslides. A landslide database was created using existing topographic, soil, drainage, and land cover maps and historical data. The landslide-related factors, which include external factors (Rainfall and Number of Previous Occurrences) and internal factors (Soil Material, Geology, Land Use, Curvature, Soil Texture, Slope, Aspect, Soil Drainage, and Soil Effective Thickness), are extracted from the landslide database. These factors are used to assess the likelihood of landslide occurrence using an ANN and an HMM. Each model acquires the relationship between the landslide factors and the hazard index during the training session. The models, with the landslide-related factors as inputs, are trained to predict three classes: ‘landslide occurs’, ‘landslide does not occur’, and ‘landslide likely to occur’. Once trained, the models are able to predict the most likely class for the prevailing data.
Finally, the two models were compared with regard to prediction accuracy, False Acceptance Rate, and False Rejection Rate. This research indicates that the Artificial Neural Network can serve as a stronger decision support system, predicting landslides more efficiently and effectively than the Hidden Markov Model.
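The comparison metrics can be sketched directly: treating 'landslide occurs' as the positive class, the False Acceptance Rate counts negatives accepted as positive, and the False Rejection Rate counts positives rejected. The labels below are invented to illustrate the computation, not taken from the study's data.

```python
def far_frr(y_true, y_pred, positive="landslide occurs"):
    """False Acceptance Rate and False Rejection Rate for one positive class."""
    false_accepts = sum(1 for t, p in zip(y_true, y_pred)
                        if t != positive and p == positive)
    false_rejects = sum(1 for t, p in zip(y_true, y_pred)
                        if t == positive and p != positive)
    negatives = sum(1 for t in y_true if t != positive)
    positives = sum(1 for t in y_true if t == positive)
    return false_accepts / negatives, false_rejects / positives

# Invented ground truth and model predictions over the three classes.
y_true = ["landslide occurs", "landslide occurs", "landslide does not occur",
          "landslide does not occur", "landslide does not occur",
          "landslide likely to occur"]
y_pred = ["landslide occurs", "landslide does not occur",
          "landslide does not occur", "landslide occurs",
          "landslide does not occur", "landslide likely to occur"]
far, frr = far_frr(y_true, y_pred)
```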

Keywords: landslides, influencing factors, neural network model, hidden markov model

Procedia PDF Downloads 382
1480 Associated Factors of Hypertension, Hypercholesterolemia and Double Burden Hypertension-Hypercholesterolemia in Patients With Congestive Heart Failure: Hospital Based Study

Authors: Pierre Mintom, William Djeukeu Asongni, Michelle Moni, William Dakam, Christine Fernande Nyangono Biyegue

Abstract:

Background: In order to prevent congestive heart failure, control of hypertension and hypercholesterolemia is necessary, because these risk factors frequently occur in combination. Objective: The aim of the study is to determine the prevalence and risk factors of hypertension, hypercholesterolemia, and the double burden of hypertension-hypercholesterolemia in patients with congestive heart failure. Methodology: A database of 98 patients suffering from congestive heart failure was used. These patients were recruited from August 15, 2017, to March 5, 2018, in the Cardiology department of the Deido District Hospital of Douala. The database provides information on sociodemographic parameters, biochemical examinations, characteristics of heart failure, and food consumption. ESC/ESH and NCEP-ATP III definitions were used to define hypercholesterolemia (total cholesterol ≥200 mg/dl) and hypertension (SBP ≥140 mmHg and/or DBP ≥90 mmHg). The double burden of hypertension-hypercholesterolemia was defined as total cholesterol (TC) ≥200 mg/dl, SBP ≥140 mmHg, and DBP ≥90 mmHg. Results: The prevalences of hypertension (HTA), hypercholesterolemia, and the double burden HTA-hypercholesterolemia were 61.2%, 66.3%, and 45.9%, respectively. No sociodemographic factor was associated with hypertension, hypercholesterolemia, or the double burden, but male gender was significantly associated (p<0.05) with hypercholesterolemia. HypoHDLemia significantly increased the risk of hypercholesterolemia and of the double burden, by 19.664 times (p=0.001) and 14.968 times (p=0.021), respectively. Regarding dietary habits, the consumption of rice, of peanuts and derivatives, and of cottonseed oil was significantly associated (p<0.05) with the occurrence of hypertension. The consumption of tomatoes, green bananas, corn and derivatives, peanuts and derivatives, and cottonseed oil was significantly associated (p<0.05) with the occurrence of hypercholesterolemia.
The consumption of palm oil and cottonseed oil was associated with the occurrence of the double burden of hypertension-hypercholesterolemia. Consumption of eggs protects against hypercholesterolemia, and consumption of peanuts and tomatoes protects against the double burden. Conclusion: Hypercholesterolemia associated with hypertension appears to be a complicating factor of congestive heart failure. The key risk factors are mainly diet-based, suggesting the importance of nutritional education for patients. New management protocols emphasizing diet should be considered.
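The case definitions above translate directly into code. The following sketch applies them to invented patient values (the study's actual records are not reproduced here); note that the double burden uses the stricter conjunction of both blood pressure cutoffs, as defined in the abstract.

```python
# Thresholds per the abstract: hypercholesterolemia TC >= 200 mg/dl
# (NCEP-ATP III); hypertension SBP >= 140 and/or DBP >= 90 mmHg (ESC/ESH);
# double burden: TC >= 200 and SBP >= 140 and DBP >= 90.
def classify(tc, sbp, dbp):
    return {
        "hypercholesterolemia": tc >= 200,
        "hypertension": sbp >= 140 or dbp >= 90,
        "double_burden": tc >= 200 and sbp >= 140 and dbp >= 90,
    }

# (total cholesterol mg/dl, SBP mmHg, DBP mmHg) for three invented patients.
patients = [(230, 150, 95), (180, 145, 80), (210, 120, 70)]
double_burden_prevalence = (sum(classify(*p)["double_burden"] for p in patients)
                            / len(patients))
```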

Keywords: risk factors, hypertension, hypercholesterolemia, congestive heart failure

Procedia PDF Downloads 66
1479 Methodology to Achieve Non-Cooperative Target Identification Using High Resolution Range Profiles

Authors: Olga Hernán-Vega, Patricia López-Rodríguez, David Escot-Bocanegra, Raúl Fernández-Recio, Ignacio Bravo

Abstract:

Non-Cooperative Target Identification has become a key research domain in the defense industry, since it provides the ability to recognize targets at long distance and under any weather condition. High Resolution Range Profiles, one-dimensional radar images in which the reflectivity of a target is projected onto the radar line of sight, are widely used for the identification of flying targets. Accordingly, an approach to Non-Cooperative Target Identification based on applying Singular Value Decomposition to a matrix of range profiles is presented. Target identification based on one-dimensional radar images compares a collection of profiles of a given target, namely the test set, with the profiles included in a pre-loaded database, namely the training set. The classification is improved by using Singular Value Decomposition, since it allows each aircraft to be modeled as a subspace and recognition to be accomplished in a transformed domain where the main features are easier to extract, hence reducing unwanted information such as noise. Singular Value Decomposition makes it possible to define a signal subspace, which contains the highest percentage of the energy, and a noise subspace, which is discarded. This way, only the valuable information of each target is used in the recognition process. The identification algorithm is based on finding the target that minimizes the angle between subspaces and takes place in a transformed domain. Two metrics based on Singular Value Decomposition, F1 and F2, are applied in the identification process. In the case of F2, the angle is weighted, since the top vectors set the importance of the contribution to the formation of a target signal; F1, on the contrary, simply uses the unweighted angle.
In order to build a wide database of radar signatures and evaluate the performance, range profiles are obtained through numerical simulation of seven civil aircraft along defined trajectories taken from an actual measurement. Given the nature of the datasets, the main drawback of using simulated profiles instead of actual measured profiles is that the former imply an ideal identification scenario, since measured profiles suffer from noise, clutter, and other unwanted information while simulated profiles do not. In this case, the test and training samples have a similar nature and usually a similarly high signal-to-noise ratio, so to assess the feasibility of the approach, the addition of noise has been considered before the creation of the test set. The identification results obtained with the unweighted and weighted metrics are analysed to determine which algorithm provides the best robustness against noise in a realistic scenario. To confirm the validity of the methodology, identification experiments with profiles coming from electromagnetic simulations are conducted, revealing promising results. Considering the dissimilarities between the test and training sets when noise is added, the recognition performance improves when weighting is applied. Future experiments with larger sets are expected to be conducted, with the aim of finally using actual profiles as test sets in a real hostile situation.
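A minimal sketch of the subspace-matching idea (the unweighted F1 metric) follows: each aircraft's training profiles form a matrix whose leading left singular vectors span its signal subspace, and a test set is assigned to the aircraft whose subspace makes the smallest principal angle with it. The data here are random stand-ins for real range profiles, and the signal-subspace rank is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def signal_subspace(profiles: np.ndarray, rank: int = 3) -> np.ndarray:
    """Columns of `profiles` are range profiles; keep the top left
    singular vectors as an orthonormal basis of the signal subspace."""
    u, _, _ = np.linalg.svd(profiles, full_matrices=False)
    return u[:, :rank]

def min_principal_angle(q1: np.ndarray, q2: np.ndarray) -> float:
    """Smallest principal angle (radians) between orthonormal subspaces."""
    s = np.linalg.svd(q1.T @ q2, compute_uv=False)
    return float(np.arccos(np.clip(s.max(), -1.0, 1.0)))

# Two "aircraft", each with a distinct dominant scattering structure plus noise.
base_a = rng.normal(size=(64, 1))
base_b = rng.normal(size=(64, 1))
train_a = signal_subspace(base_a @ rng.normal(size=(1, 20)) + 0.05 * rng.normal(size=(64, 20)))
train_b = signal_subspace(base_b @ rng.normal(size=(1, 20)) + 0.05 * rng.normal(size=(64, 20)))
test = signal_subspace(base_a @ rng.normal(size=(1, 10)) + 0.05 * rng.normal(size=(64, 10)))

# Identify the target by the smallest subspace angle (unweighted metric).
identified = ("A" if min_principal_angle(test, train_a)
              < min_principal_angle(test, train_b) else "B")
```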

Keywords: HRRP, NCTI, simulated/synthetic database, SVD

Procedia PDF Downloads 350
1478 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data

Authors: Michelangelo Sofo, Giuseppe Labianca

Abstract:

In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the field of human nutrition research, the curative process based on the analysis of clinical data is a very delicate operation, because there are multiple solutions for the management of food-related pathologies (for example, intolerances and allergies, cholesterol metabolism management, diabetic pathologies, arterial hypertension, and even obesity and breathing and sleep problems). In this regard, in this research work a system was created that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and was created tailored to the real working needs of an expert in human nutrition using human-centered design (ISO 9241-210); it therefore keeps in step with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task consists of drawing up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to the nutritional data, plus a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules based on time series and visual analytics techniques that make it possible to evaluate the complete picture of the situation and the evolution of the diet assigned for specific pathologies.
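The symbolic (deductive) component can be sketched as rule firing over clinical facts, which also yields a simple form of explainability: each suggestion carries the conditions that triggered it. The rules and condition names below are invented for illustration and are not DietAdhoc®'s actual knowledge base.

```python
# Hypothetical diet rules: (set of required patient conditions, suggested diet).
RULES = [
    ({"arterial hypertension"}, "low-sodium diet"),
    ({"cholesterol metabolism disorder"}, "low-saturated-fat diet"),
    ({"lactose intolerance"}, "lactose-free diet"),
    ({"obesity", "sleep apnea"}, "hypocaloric diet with weight-loss follow-up"),
]

def suggest_diets(patient_conditions: set) -> list:
    """Fire every rule whose conditions are all present, and report which
    conditions triggered it -- the explanation shown to the nutritionist."""
    return [(diet, sorted(conds)) for conds, diet in RULES
            if conds <= patient_conditions]

suggestions = suggest_diets({"arterial hypertension", "lactose intolerance"})
```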

Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm

Procedia PDF Downloads 14
1477 Comparison of Machine Learning-Based Models for Predicting Streptococcus pyogenes Virulence Factors and Antimicrobial Resistance

Authors: Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Diego Santibañez Oyarce, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán

Abstract:

Streptococcus pyogenes is a gram-positive bacterium involved in a wide range of diseases and is a major human-specific bacterial pathogen. In Chile, the 'Ministerio de Salud' declared an alert this year due to the increase in strains throughout the year. This increase can be attributed to a multitude of factors, including antimicrobial resistance (AMR) and virulence factors (VF). Understanding these VF and AMR is crucial for developing effective strategies and improving public health responses. Moreover, experimental identification and characterization of these pathogenic mechanisms are labor-intensive and time-consuming. Therefore, new computational methods are required to provide robust techniques for accelerating this identification. Advances in machine learning (ML) algorithms represent an opportunity to refine and accelerate the discovery of VF associated with Streptococcus pyogenes. In this work, we evaluate the accuracy of various machine learning models in predicting the virulence factors and antimicrobial resistance of Streptococcus pyogenes, with the objective of providing new methods for identifying the pathogenic mechanisms of this organism. Our comprehensive approach involved the download of 32,798 GenBank files of S. pyogenes from the NCBI database, coupled with the incorporation of data from the Virulence Factor Database (VFDB) and the Comprehensive Antibiotic Resistance Database (CARD), which contains AMR gene sequences and resistance profiles. These datasets provided labeled examples of both virulent and non-virulent genes, providing a robust foundation for feature extraction and model training. We employed preprocessing, characterization, and feature extraction techniques on primary nucleotide/amino acid sequences and selected the optimal features for model training. The feature set was constructed using sequence-based descriptors (e.g., k-mers and one-hot encoding) and functional annotations based on database prediction.
The ML models compared are logistic regression, decision trees, support vector machines, and neural networks, among others. The results of this work show some differences in accuracy between the algorithms; these differences allow us to identify aspects that represent unique opportunities for a more precise and efficient characterization and identification of VF and AMR. This comparative analysis underscores the value of integrating machine learning techniques in predicting S. pyogenes virulence and AMR, offering potential pathways for more effective diagnostic and therapeutic strategies. Future work will focus on incorporating additional omics data, such as transcriptomics, and exploring advanced deep learning models to further enhance predictive capabilities.
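The k-mer feature construction mentioned above can be sketched as follows. The two toy sequences and the virulent/non-virulent labels are invented for illustration; the study builds its features from the VFDB/CARD-labeled GenBank sequences.

```python
from collections import Counter
from itertools import product

def kmer_features(seq: str, k: int = 2) -> list:
    """Frequency vector over all 4**k DNA k-mers, in lexicographic order."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(len(seq) - k + 1, 1)
    return [counts[''.join(km)] / total for km in product("ACGT", repeat=k)]

# Toy sequences standing in for gene sequences; labels are hypothetical.
X = [kmer_features("ATGGCGTACGTT"), kmer_features("ATGAAAAAAGAA")]
y = [1, 0]  # 1 = virulence factor gene, 0 = non-virulent

# X and y would then be fed to logistic regression, decision trees,
# support vector machines, neural networks, etc., for comparison.
```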

Keywords: antibiotic resistance, Streptococcus pyogenes, virulence factors, machine learning

Procedia PDF Downloads 17
1476 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management

Authors: Chokri Slim

Abstract:

The objective of this research is to answer the following question: Is there a significant difference between regression models and statistical learning models in predicting credit risk management? A Multiple Linear Regression (MLR) model was compared with neural networks, including a Multi-Layer Perceptron (MLP), and a Support Vector Regression (SVR) model. The population of this study comprises 50 banks listed on the Tunis Stock Exchange (TSE) from 2000 to 2016. Firstly, we show the factors that have a significant effect on the quality of the loan portfolios of banks in Tunisia. Secondly, the study attempts to establish that the systematic use of objective techniques and methods, designed to apprehend and assess risk when considering applications for granting credit, has a positive effect on the quality of banks' loan portfolios and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of the empirical tests confirm our claims.
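The comparison protocol can be sketched as follows: fit each candidate on training data, then rank the models by out-of-sample RMSE. A one-variable least-squares fit stands in for the MLR here; the paper's MLP and SVR would plug into the same interface. All data values are invented.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for one predictor (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return lambda x: a + b * x

def rmse(model, xs, ys):
    """Root mean squared error on held-out observations."""
    return (sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)) ** 0.5

# x = an invented bank risk indicator, y = non-performing-loan ratio (%).
train_x, train_y = [1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.1, 8.0]
test_x, test_y = [5.0, 6.0], [10.1, 11.9]

mean_y = sum(train_y) / len(train_y)
candidates = {"MLR": fit_linear(train_x, train_y),
              "mean baseline": lambda x: mean_y}
# Rank models from best (lowest test RMSE) to worst.
ranking = sorted(candidates, key=lambda name: rmse(candidates[name], test_x, test_y))
```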

Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines

Procedia PDF Downloads 147