Search results for: data mining techniques
28785 Affects Associations Analysis in Emergency Situations
Authors: Joanna Grzybowska, Magdalena Igras, Mariusz Ziółko
Abstract:
Association rule learning is an approach for discovering interesting relationships in large databases. The analysis of relations, invisible at first glance, is a source of new knowledge which can subsequently be used for prediction. We used this data mining technique (an automatic and objective method) to learn about interesting affect associations in a corpus of emergency phone calls. We also made an attempt to match the revealed rules with their possible situational context. The corpus was collected and subjectively annotated by two researchers. Each of 3306 recordings contains information on emotion: (1) type (sadness, weariness, anxiety, surprise, stress, anger, frustration, calm, relief, compassion, contentment, amusement, joy), (2) valence (negative, neutral, or positive), (3) intensity (low, typical, alternating, high). Additional information that provides clues to the speaker's emotional state was also annotated: speech rate (slow, normal, fast), characteristic vocabulary (filled pauses, repeated words), and conversation style (normal, chaotic). Exponentially many rules can be extracted from a set of items (an item is a single piece of previously annotated information). To generate rules in the form of an implication X → Y (where X and Y are frequent k-itemsets), the Apriori algorithm was used, as it avoids performing needless computations. Then, two basic measures (Support and Confidence) and several additional symmetric and asymmetric objective measures (e.g. Laplace, Conviction, Interest Factor, Cosine, correlation coefficient) were calculated for each rule. Each applied interestingness measure revealed different rules; we selected some top rules for each measure. Owing to the specificity of the corpus (emergency situations), most of the strong rules contain only negative emotions. There are, though, strong rules including neutral or even positive emotions.
Three examples of the strongest rules are: {sadness} → {anxiety}; {sadness, weariness, stress, frustration} → {anger}; {compassion} → {sadness}. Association rule learning revealed the strongest configurations of affects (as well as configurations of affects with affect-related information) in our emergency phone calls corpus. The acquired knowledge can be used for prediction, to fill in the emotional profile of a new caller. Furthermore, a rule-related possible-context analysis may be a clue to the situation a caller is in.
Keywords: data mining, emergency phone calls, emotional profiles, rules
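As an illustration of the Support and Confidence measures used above, the following sketch computes both for a toy rule; the item labels and tiny "corpus" are made up for illustration and are not the study's data:

```python
# Toy annotated recordings: each is a set of items (labels are hypothetical).
transactions = [
    {"sadness", "anxiety", "fast speech"},
    {"sadness", "anxiety"},
    {"sadness", "weariness", "stress", "frustration", "anger"},
    {"compassion", "sadness"},
    {"calm"},
]

def support(itemset, transactions):
    """Fraction of transactions that contain every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(x, y, transactions):
    """Confidence of the rule X -> Y: support(X u Y) / support(X)."""
    return support(x | y, transactions) / support(x, transactions)

rule_conf = confidence({"sadness"}, {"anxiety"}, transactions)  # 0.5 here
```

Apriori's contribution is pruning: it only extends itemsets whose sub-itemsets are already frequent, so the exponential candidate space is never fully enumerated.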
Procedia PDF Downloads 408
28784 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit
Authors: Ahmed Elrewainy
Abstract:
Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials ("endmembers") present in the scene share the spectral pixels in different amounts called "abundances". Unmixing the data cube is an important task for determining the endmembers present in the cube for the analysis of these images. Unsupervised unmixing is done with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem "basis pursuit" can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. This optimization problem is solved using a proximal method, iterative thresholding. The l1-norm basis pursuit optimization problem was used as a sparsity-based unmixing technique to unmix real and synthetic hyperspectral data cubes.
Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets
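The l1-minimization-by-iterative-thresholding idea can be sketched with a plain ISTA loop; the random dictionary, 3-sparse abundance vector, and parameters below are synthetic stand-ins, not the authors' actual setup:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter):
    """Iterative soft-thresholding for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient term
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

# Synthetic mixing: a 3-sparse abundance vector observed through a random
# dictionary (stand-ins for a real endmember dictionary and pixel spectrum).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 50))
x_true = np.zeros(50)
x_true[[3, 17, 40]] = [1.0, -0.5, 2.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.01, n_iter=3000)
```

With a small penalty and enough iterations the recovered vector closely approximates the sparse ground truth, which is the behavior the unmixing relies on.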
Procedia PDF Downloads 195
28783 From Text to Data: Sentiment Analysis of Presidential Election Political Forums
Authors: Sergio V Davalos, Alison L. Watkins
Abstract:
User generated content (UGC) such as website posts has data associated with it: time of the post, gender, location, type of device, and number of words. The text entered in UGC can provide a valuable dimension for analysis. In this research, each user post is treated as a collection of terms (words). In addition to the number of words per post, the frequency of each term is determined per post and as the sum of occurrences over all posts. This research focuses on one specific aspect of UGC: sentiment. Sentiment analysis (SA) was applied to the content (user posts) of two sets of political forums related to the US presidential elections of 2012 and 2016. Sentiment analysis derives data from the text, which enables the subsequent application of data analytic methods. The SASA (SAIL/SAI Sentiment Analyzer) model was used for sentiment analysis. The application of SASA resulted in a sentiment score for each post. Based on the sentiment scores for the posts, there are significant differences between the content and sentiment of the two sets of forums for the 2012 and 2016 presidential elections. In the 2012 forums, 38% of the forums started with positive sentiment and 16% with negative sentiment. In the 2016 forums, 29% started with positive sentiment and 15% with negative sentiment. There were also changes in sentiment over time. For both elections, as the election got closer, the cumulative sentiment score became negative. The candidate who won each election appeared in more posts than the losing candidate. In Trump's case, there were more negative posts about him than Clinton's most numerous category of posts, which were positive. KNIME topic modeling was used to derive topics from the posts. There were also changes in topics and keyword emphasis over time. Initially, the political parties were the most referenced, and as the election got closer the emphasis shifted to the candidates.
The SASA method proved to predict sentiment better than four other methods in SentiBench. The research resulted in deriving sentiment data from text. In combination with other data, the sentiment data provided insight and discovery about user sentiment in the US presidential elections of 2012 and 2016.
Keywords: sentiment analysis, text mining, user generated content, US presidential elections
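The step from raw post text to a numeric sentiment score can be illustrated with a minimal lexicon-based scorer; the word list and example posts below are invented, and SASA itself uses a trained statistical model rather than this simple lookup:

```python
# A made-up mini-lexicon; real analyzers such as SASA use trained models.
LEXICON = {"great": 1, "win": 1, "hope": 1, "bad": -1, "lose": -1, "corrupt": -1}

def sentiment_score(post):
    """Sum of term polarities: > 0 positive, < 0 negative, 0 neutral."""
    return sum(LEXICON.get(token, 0) for token in post.lower().split())

posts = ["Great debate, I hope he can win", "Bad policies and corrupt promises"]
scores = [sentiment_score(p) for p in posts]  # [3, -2]
```

Once every post has a score, the per-post numbers support the kind of cumulative and over-time comparisons reported above.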
Procedia PDF Downloads 192
28782 Aerobic Bioprocess Control Using Artificial Intelligence Techniques
Authors: M. Caramihai, Irina Severin
Abstract:
This paper deals with the design of an intelligent control structure for a bioprocess of Hansenula polymorpha yeast cultivation. The objective of the process control is to produce biomass in a desired physiological state. The work demonstrates that the designed Hybrid Control Techniques (HCT) are able to recognize specific bioprocess evolution trajectories using neural networks trained specifically for this purpose, in order to estimate the model parameters and to adjust the overall bioprocess evolution through an expert system and a fuzzy structure. The design of the control algorithm, as well as its tuning through realistic simulations, is presented. Taking into consideration the synergism of different paradigms like fuzzy logic, neural networks, and symbolic artificial intelligence (AI), this paper presents a complete, working intelligent control architecture with application to bioprocess control.
Keywords: bioprocess, intelligent control, neural nets, fuzzy structure, hybrid techniques
Procedia PDF Downloads 421
28781 Teaching, Learning and Evaluation Enhancement of Information Communication Technology Education in Schools through Pedagogical and E-Learning Techniques in the Sri Lankan Context
Authors: M. G. N. A. S. Fernando
Abstract:
This study uses a researchable framework to improve the quality of ICT education and the Teaching Learning Assessment/Evaluation (TLA/TLE) process. It utilizes existing resources while improving the methodologies, along with pedagogical techniques and e-learning approaches, used in the secondary schools of Sri Lanka. The study was carried out in two phases. Phase I focused on investigating the factors which affect the quality of ICT education. Based on the key factors of Phase I, Phase II focused on the design of an Experimental Application Model with 6 activity levels. Each level in the activity model covers one or more levels in the Revised Bloom's Taxonomy. Towards further enhancement of the activity levels, other pedagogical techniques (activity-based learning, e-learning techniques, problem-solving activities, peer discussions, etc.) were incorporated into each level of the activity model as appropriate. The application model was validated by a panel of teachers, including a domain expert, and was also tested in the school environment. The validity of performance was proved using six hypothesis tests and other methodologies. The analysis shows that student performance with problem-solving activities increased by 19.5% due to the different treatment levels used. Compared to the existing process, it was also proved that the embedded techniques (a mixture of traditional and modern pedagogical methods and their applications) are more effective for the skills development of teachers and students.
Keywords: activity models, Bloom's taxonomy, ICT education, pedagogies
Procedia PDF Downloads 163
28780 Incidences and Factors Associated with Perioperative Cardiac Arrest in Trauma Patient Receiving Anesthesia
Authors: Visith Siriphuwanun, Yodying Punjasawadwong, Suwinai Saengyo, Kittipan Rerkasem
Abstract:
Objective: To determine the incidence of and factors associated with perioperative cardiac arrest in trauma patients who received anesthesia for emergency surgery. Design and setting: Retrospective cohort study of trauma patients during anesthesia for emergency surgery at a university hospital in northern Thailand. Patients and methods: This study was approved by the medical ethics committee, Faculty of Medicine, Maharaj Nakorn Chiang Mai Hospital, Thailand. We reviewed the data of 19,683 trauma patients receiving anesthesia over nearly a decade, between January 2007 and March 2016. The data analyzed included patient characteristics, trauma surgery procedures, anesthesia information such as ASA physical status classification, anesthesia techniques, anesthetic drugs, location of anesthesia performed, and cardiac arrest outcomes. This study excluded trauma patients who had received local anesthesia by surgeons or monitored anesthesia care (MAC) and patients with missing information. Factors associated with perioperative cardiac arrest were identified with univariate analyses. A multiple regression model with risk ratios (RR) and 95% confidence intervals (CI) was used to identify factors correlated with perioperative cardiac arrest. The multicollinearity of all variables was examined by bivariate correlation matrix. A stepwise algorithm was used; variables with a p-value less than 0.02 were selected for further multivariate analysis. A p-value of less than 0.05 was considered statistically significant. Measurements and results: The incidence of perioperative cardiac arrest in trauma patients receiving anesthesia for emergency surgery was 170.04 per 10,000 cases.
Factors associated with perioperative cardiac arrest in trauma patients were age over 65 years (RR=1.41, CI=1.02–1.96, p=0.039), ASA physical status 3 or higher (RR=4.19–21.58, p < 0.001), site of surgery (intracranial, intrathoracic, upper intra-abdominal, and major vascular; each p < 0.001), cardiopulmonary comorbidities (RR=1.55, CI=1.10–2.17, p=0.012), hemodynamic instability with shock prior to receiving anesthesia (RR=1.60, CI=1.21–2.11, p < 0.001), special techniques for surgery such as cardiopulmonary bypass (CPB) and hypotensive techniques (RR=5.55, CI=2.01–15.36, p=0.001 and RR=6.24, CI=2.21–17.58, p=0.001, respectively), and a history of alcohol abuse (RR=5.27, CI=4.09–6.79, p < 0.001). Conclusion: The incidence of perioperative cardiac arrest in trauma patients receiving anesthesia for emergency surgery was very high and correlated with many factors, especially patient age, cardiopulmonary comorbidities, a history of alcohol addiction, increasing ASA physical status, preoperative shock, special techniques for surgery, and sites of surgery including the brain, thorax, abdomen, and major vascular region. Anesthesiologists and multidisciplinary teams in the pre- and perioperative periods should remain alert for warning signs of impending cardiac arrest and be quick to manage high-risk surgical trauma patients. Furthermore, healthcare policy should promote protection against accidents in high-risk groups of the population as well.
Keywords: perioperative cardiac arrest, trauma patients, emergency surgery, anesthesia, risk factors, incidence
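The risk ratios and 95% confidence intervals reported above follow the standard log-scale calculation, which can be sketched as below; the 2x2 cell counts are hypothetical, not the study's data:

```python
import math

def risk_ratio_ci(a, n1, c, n0, z=1.96):
    """Risk ratio for exposed (a events of n1) vs unexposed (c events of n0),
    with a 95% CI from the standard error of log(RR)."""
    rr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 30 arrests among 1000 elderly patients vs 20 among
# 1000 younger patients (not the study's actual cell counts).
rr, lo, hi = risk_ratio_ci(30, 1000, 20, 1000)
```

A CI that excludes 1.0 corresponds to the statistically significant associations listed in the abstract.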
Procedia PDF Downloads 169
28779 Insights on Behavior of Tunisian Auditors
Authors: Dammak Saida, Mbarek Sonia
Abstract:
This paper aims to examine the impact of public interest commitment, attitude towards independence enforcement, and organizational ethical culture on auditors' ethical behavior. It also tests the moderating effect of gender diversity on these relationships. The sample consisted of 100 Tunisian chartered accountants. An online survey was used to collect the data, and statistical data analysis techniques were used to test the hypotheses. The findings of this study provide practical implications for accounting professionals, regulators, and audit firms, as they help understand auditors' beliefs and behaviors, which implies more effective mechanisms for improving their ethical values.
Keywords: public interest, independence, organizational culture, professional behavior, Tunisian auditors
Procedia PDF Downloads 74
28778 Designing Energy Efficient Buildings for Seasonal Climates Using Machine Learning Techniques
Authors: Kishor T. Zingre, Seshadhri Srinivasan
Abstract:
Energy consumption by the building sector is increasing at an alarming rate throughout the world, leading to more building-related CO₂ emissions into the environment. In buildings, the main contributors to energy consumption are heating, ventilation, and air-conditioning (HVAC) systems, lighting, and electrical appliances. It is hypothesised that energy efficiency in buildings can be achieved by implementing sustainable technologies such as i) enhancing the thermal resistance of fabric materials to reduce heat gain (in hotter climates) and heat loss (in colder climates), ii) enhancing daylighting and lighting systems, iii) improving HVAC systems, and iv) occupant localization. The energy performance of these sustainable technologies is highly dependent on climatic conditions. This paper investigated the use of machine learning techniques for the accurate prediction of air-conditioning energy in seasonal climates. The data required to train the machine learning techniques were obtained from computational simulations performed on a 3-story commercial building using the EnergyPlus program plugged in with OpenStudio and Google SketchUp. The EnergyPlus model was calibrated against experimental measurements of surface temperatures and heat flux prior to being employed for the simulations. It has been observed from the simulations that the performance of sustainable fabric materials (for walls, roofs, and windows) such as phase change materials, insulation, cool roofs, etc. varies with climate conditions. Various renewable technologies were also applied to the building's flat roofs in various climates to investigate the potential for electricity generation. It has been observed that the proposed technique overcomes the shortcomings of existing approaches, such as local linearization or over-simplifying assumptions.
In addition, the proposed method can be used for real-time estimation of building air-conditioning energy.
Keywords: building energy efficiency, EnergyPlus, machine learning techniques, seasonal climates
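The core idea of learning air-conditioning energy from climate features can be sketched minimally; here ordinary least squares on synthetic data stands in for the (unspecified) machine learning techniques, and the feature names and coefficients are assumptions, not values from the EnergyPlus study:

```python
import numpy as np

# Synthetic hourly records (hypothetical): outdoor temperature (deg C), solar
# irradiance (W/m2), occupancy fraction; the target is AC energy (kWh),
# standing in for values an EnergyPlus simulation would provide.
rng = np.random.default_rng(1)
X = rng.uniform([15, 0, 0], [35, 1000, 1], size=(200, 3))
y = 0.8 * X[:, 0] + 0.01 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 0.5, 200)

# Ordinary least squares with an intercept, as a simple baseline predictor.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

A nonlinear learner would replace the least-squares fit, but the workflow (simulated climate features in, predicted cooling energy out) is the same, which is what enables the real-time estimation mentioned above.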
Procedia PDF Downloads 114
28777 Use of Locally Effective Microorganisms in Conjunction with Biochar to Remediate Mine-Impacted Soils
Authors: Thomas F. Ducey, Kristin M. Trippe, James A. Ippolito, Jeffrey M. Novak, Mark G. Johnson, Gilbert C. Sigua
Abstract:
The Oronogo-Duenweg mining belt (approximately 20 square miles around the Joplin, Missouri area) is a designated United States Environmental Protection Agency Superfund site due to lead-contaminated soil and groundwater from former mining and smelting operations. Over almost a century of mining (from 1848 to the late 1960s), an estimated ten million tons of cadmium-, lead-, and zinc-containing material were deposited on approximately 9,000 acres. At sites that have undergone remediation, in which the O, A, and B horizons have been removed along with the lead contamination, the exposed C horizon remains recalcitrant to revegetation efforts. These sites also suffer from poor soil microbial activity, as measured by soil extracellular enzymatic assays, though 16S ribosomal ribonucleic acid (rRNA) analysis indicates that microbial diversity is equal to that of sites that have avoided mine-related contamination. Soil analysis reveals low soil organic carbon along with high levels of bioavailable zinc, which reflect the poor soil fertility conditions and low microbial activity. Our study looked at the use of several materials to restore and remediate these sites, with the goal of improving soil health. The materials, and their purposes for incorporation into the study, were as follows: manure-based biochar for binding zinc and other heavy metals responsible for phytotoxicity; locally sourced biosolids and compost to incorporate organic carbon into the depleted soils; and effective microorganisms harvested from nearby pristine sites to provide a stable community for nutrient cycling in the newly composited 'soil material'. Our results indicate that all four materials used in conjunction provide the greatest benefit to these mine-impacted soils, based on above-ground biomass, microbial biomass, and soil enzymatic activities.
Keywords: locally effective microorganisms, biochar, remediation, reclamation
Procedia PDF Downloads 217
28776 Two-Stage Hospital Efficiency Analysis Including Qualitative Evidence: A Greek Case
Authors: Panos Xenos, Milton Nektarios, John Yfantopoulos
Abstract:
Background: Policy makers, professional organizations, and payers have introduced a variety of initiatives and reforms in health systems worldwide, aimed at improving hospital efficiency. Their efforts are concentrated in two main categories: constraining increasing healthcare costs and enhancing the quality of services provided. Research Objectives: This study examines the efficiency of 112 Greek public hospitals for the year 2009, evaluates the importance of bootstrapping techniques, and investigates the effect of contextual factors on hospital efficiency. Furthermore, the effect of qualitative evidence on hospital efficiency is explored using data from 28 large hospitals. Methods: We applied Data Envelopment Analysis, augmented by bootstrapping techniques, to estimate efficiency scores. In order to measure the effect of environmental factors on hospital efficiency, we used Tobit regression analysis. The significance of our models is evaluated using statistical tests to compare distributions. Results: The Kolmogorov-Smirnov test between the original and the bootstrap-corrected efficiency scores indicates that their distributions are significantly different (p-value<0.01). The environmental factors that seem to influence efficiency are Occupancy Rating and the ratio between Outpatient Visits and Inpatient Days. Results indicate that the inclusion of the quality variable in DEA modelling generates statistically significant variations in efficiency scores (p-value<0.05). Conclusions: The inclusion of quality variables and the use of bootstrap resampling in efficiency analysis impose a statistically significant effect on the distribution of efficiency scores. As a policy conclusion, we highlight the importance of these methods for hospital efficiency analysis and, by implication, for healthcare resource allocation.
Keywords: hospitals, efficiency, quality, data envelopment analysis, Greek public hospital sector
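The Kolmogorov-Smirnov comparison of original versus bootstrap-corrected efficiency distributions can be sketched as follows; the two score lists are invented, and a full analysis would also compute the p-value (e.g. with `scipy.stats.ks_2samp`):

```python
def ks_statistic(xs, ys):
    """Two-sample Kolmogorov-Smirnov statistic: max gap between the ECDFs."""
    xs, ys = sorted(xs), sorted(ys)
    ecdf = lambda s, v: sum(x <= v for x in s) / len(s)
    return max(abs(ecdf(xs, v) - ecdf(ys, v)) for v in xs + ys)

# Invented DEA scores: bias correction typically shifts scores downward.
original = [0.95, 0.90, 0.88, 0.85, 0.80, 0.78]
corrected = [0.84, 0.80, 0.77, 0.74, 0.70, 0.69]
d = ks_statistic(original, corrected)  # 2/3 for these lists
```

A large statistic (gap between the empirical CDFs) is what leads to the reported conclusion that the two distributions differ significantly.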
Procedia PDF Downloads 309
28775 Quantifying User-Related, System-Related, and Context-Related Patterns of Smartphone Use
Authors: Andrew T. Hendrickson, Liven De Marez, Marijn Martens, Gytha Muller, Tudor Paisa, Koen Ponnet, Catherine Schweizer, Megan Van Meer, Mariek Vanden Abeele
Abstract:
Quantifying and understanding the myriad ways people use their phones, and how that impacts their relationships, cognitive abilities, mental health, and well-being, is increasingly important in our phone-centric society. However, most studies on the patterns of phone use have focused on theory-driven tests of specific usage hypotheses using self-report questionnaires or analyses of smaller datasets. In this work, we present a series of analyses from a large corpus of over 3000 users that combine data-driven and theory-driven analyses to identify reliable smartphone usage patterns and clusters of similar users. Furthermore, we compare the stability of user clusters across user- and system-initiated sessions, as well as during the hypothesized ritualized behavior times directly before and after sleeping. Our results indicate support for some hypothesized usage patterns but present a more complete and nuanced view of how people use smartphones.
Keywords: data mining, experience sampling, smartphone usage, health and well-being
Procedia PDF Downloads 163
28774 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data are used. Usually, however, the proportion of complete data is rather small, which leads to most information being neglected. The subset of complete cases may also be strongly biased. In addition, the reason that data are missing might itself contain information, which is ignored with that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values, compared to using the usually small percentage of complete data (the baseline). It is also interesting to see how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set (the complete data) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data.
After having found the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other; however, the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Also, demand estimates derived from the whole data set are much more accurate than the baseline estimate. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
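A baseline version of the imputation step can be sketched with simple column-mean imputation; the actual study uses unsupervised learning (clustering, PCA, neural networks), and the subscription rows below are fabricated:

```python
# Fabricated search subscriptions: (rooms, max price in CHF); None = missing.
rows = [(3, 2000), (2, None), (None, 1500), (4, 2600), (3, None), (2, 1400)]

def mean_impute(rows):
    """Replace each missing entry with the column mean of observed values."""
    cols = list(zip(*rows))
    means = [sum(v for v in c if v is not None) / sum(v is not None for v in c)
             for c in cols]
    return [tuple(m if v is None else v for v, m in zip(r, means)) for r in rows]

full = mean_impute(rows)  # e.g. (2, None) becomes (2, 1875.0)
```

The evaluation loop described above would then hide known values at random, impute them, and compare against the hidden truth to pick the best-performing algorithm and parameters.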
Procedia PDF Downloads 285
28773 Objective Evaluation on Medical Image Compression Using Wavelet Transformation
Authors: Amhimmid Mohammed Saffour, Mustafa Mohamed Abdullah
Abstract:
The use of computers for handling image data in healthcare is growing. However, the amount of data produced by modern image-generating techniques is vast. This data might be a problem from a storage point of view or when the data is sent over a network. This paper uses the wavelet transform technique for medical image compression. A MATLAB program was designed to evaluate the medical image storage and transmission time problem at Sebha Medical Center, Libya. In this paper, three different sets of computed tomography images (abdomen, brain, and chest) were selected and compressed using the wavelet transform. Objective evaluation was performed to measure the quality of the compressed images. In this evaluation, the results show that the Peak Signal to Noise Ratio (PSNR), which indicates the quality of the compressed image, ranges from 25.89 dB to 34.35 dB for abdomen images, 23.26 dB to 33.3 dB for brain images, and 25.5 dB to 36.11 dB for chest images. These values show that a compression ratio of nearly 30:1 is acceptable.
Keywords: medical image, MATLAB, image compression, wavelets, objective evaluation
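The PSNR figures quoted above follow the standard definition (10·log10 of the squared peak value over the mean squared error); a self-contained sketch with toy pixel values rather than actual CT data:

```python
import math

def psnr(original, compressed, max_val=255):
    """Peak signal-to-noise ratio in dB between two equal-size pixel arrays."""
    mse = sum((a - b) ** 2 for a, b in zip(original, compressed)) / len(original)
    return float("inf") if mse == 0 else 10 * math.log10(max_val ** 2 / mse)

# Toy 8-bit pixel values standing in for a CT slice and its reconstruction.
orig = [52, 55, 61, 66, 70, 61, 64, 73]
recon = [53, 55, 60, 66, 71, 60, 64, 74]
quality = psnr(orig, recon)
```

Higher PSNR means less distortion; values in the 25-36 dB range reported above indicate visible but clinically tolerated loss at the stated compression ratio.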
Procedia PDF Downloads 285
28772 Hydrological Analysis for Urban Water Management
Authors: Ranjit Kumar Sahu, Ramakar Jha
Abstract:
Urban water management is the practice of managing freshwater, waste water, and storm water as components of a basin-wide management plan. It builds on existing water supply and sanitation considerations within an urban settlement by incorporating urban water management within the scope of the entire river basin. The pervasive problems generated by urban development have prompted the present work to study the spatial extent of urbanization in the Golden Triangle of Odisha connecting the cities of Bhubaneswar (20.2700° N, 85.8400° E), Puri (19.8106° N, 85.8314° E), and Konark (19.9000° N, 86.1200° E), and the patterns of periodic changes in urban development (systematic/random), in order to develop future plans for (i) urbanization promotion areas and (ii) urbanization control areas. Using remote sensing with USGS (U.S. Geological Survey) Landsat 8 maps, supervised classification of the urban sprawl was performed for 1980-2014, specifically after 2000. This work presents the following: (i) time series analysis of hydrological data (ground water and rainfall), (ii) application of SWMM (Storm Water Management Model) and other soft computing techniques for urban water management, and (iii) uncertainty analysis of model parameters (urban sprawl and correlation analysis). The outcome of the study shows drastic growth in urbanization and depletion of ground water levels in the area, which is discussed briefly. Other related outcomes, such as the declining trend of rainfall and the rise of sand mining in the local vicinity, are also discussed.
Research of this kind will (i) improve water supply and consumption efficiency, (ii) upgrade drinking water quality and waste water treatment, (iii) increase the economic efficiency of services to sustain operations and investments for water, waste water, and storm water management, and (iv) engage communities to reflect their needs and knowledge for water management.
Keywords: Storm Water Management Model (SWMM), uncertainty analysis, urban sprawl, land use change
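The time-series component, such as a depleting groundwater trend, can be sketched with a least-squares slope; the annual depth-to-water values below are made up for illustration, not measurements from the study area:

```python
# Invented annual depth-to-groundwater series (m below ground), 2000-2014.
years = list(range(2000, 2015))
depth = [5.1, 5.3, 5.2, 5.6, 5.8, 6.0, 6.1, 6.4, 6.3, 6.7,
         7.0, 7.1, 7.4, 7.6, 7.9]

# Least-squares slope: a positive value means the water table is deepening.
n = len(years)
mx = sum(years) / n
my = sum(depth) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(years, depth))
         / sum((x - mx) ** 2 for x in years))
```

A statistically oriented analysis would pair such a slope with a trend test (e.g. Mann-Kendall) before drawing conclusions about rainfall or groundwater decline.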
Procedia PDF Downloads 425
28771 Zinc (II) Complexes of Nitrogen, Oxygen and Sulfur Coordination Modes: Synthesis, Spectral Studies and Antibacterial Activities
Authors: Ayodele Odularu, Peter Ajibade, Albert Bolhuis
Abstract:
This study aimed to assess the antibacterial activities of four zinc(II) complexes. Zinc(II) complexes with nitrogen, oxygen, and sulfur coordination modes were synthesized using direct substitution reactions. The characterization techniques involved physicochemical properties (molar conductivity) and spectroscopic techniques. The molar conductivity indicated the non-electrolytic nature of the zinc(II) complexes. The spectral studies of the zinc(II) complexes were done using electronic spectra (UV-Vis) and Fourier transform infrared spectroscopy (FT-IR). Spectral data from the spectroscopic studies confirmed the coordination of the mixed ligands with the zinc(II) ion. The antibacterial activities of the zinc(II) complexes were all supportive of Overtone's concept and Tweedy's chelation theory for the bacterial strains S. aureus MRSA252 and E. coli MC4100, because the zones of inhibition were greater than those of the corresponding ligands. In summary, the zinc(II) complexes ZEPY, ZE1PH, ZE1PY, and ZE135PY all have potential for antibacterial activity.
Keywords: antibacterial activities, spectral studies, syntheses, zinc(II) complexes
Procedia PDF Downloads 281
28770 Ibrutinib and the Potential Risk of Cardiac Failure: A Review of Pharmacovigilance Data
Authors: Abdulaziz Alakeel, Roaa Alamri, Abdulrahman Alomair, Mohammed Fouda
Abstract:
Introduction: Ibrutinib is a selective, potent, and irreversible small-molecule inhibitor of Bruton's tyrosine kinase (BTK). It forms a covalent bond with a cysteine residue (CYS-481) at the active site of BTK, leading to inhibition of BTK enzymatic activity. The drug is indicated to treat certain types of cancer, such as mantle cell lymphoma (MCL), chronic lymphocytic leukaemia, and Waldenström's macroglobulinaemia (WM). Cardiac failure refers to the inability of the heart muscle to pump adequate blood to the body's organs. There are multiple types of cardiac failure, including left- and right-sided heart failure and systolic and diastolic heart failure. The aim of this review is to evaluate the risk of cardiac failure associated with the use of ibrutinib and to suggest regulatory recommendations if required. Methodology: The Signal Detection team at the National Pharmacovigilance Center (NPC) of the Saudi Food and Drug Authority (SFDA) performed a comprehensive signal review using its national database as well as the World Health Organization (WHO) database (VigiBase) to retrieve related information for assessing the causality between cardiac failure and ibrutinib. We used the WHO-Uppsala Monitoring Centre (UMC) criteria as the standard for assessing the causality of the reported cases. Results: Case review: The number of retrieved cases for the combined drug/adverse drug reaction was 212 global ICSRs as of July 2020. The reviewers selected and assessed causality for the well-documented ICSRs with completeness scores of 0.9 and above (35 ICSRs); the value 1.0 represents the highest score, for the best-written ICSRs. Among the reviewed cases, more than half provide a supportive association (four probable and 15 possible cases). Data mining: The disproportionality between the observed and the expected reporting rates for the drug/adverse drug reaction pair is estimated using the information component (IC), a tool developed by the WHO-UMC to measure the reporting ratio.
A positive IC reflects a higher statistical association, while negative values indicate a lower statistical association, with the null value equal to zero. The result (IC=1.5) revealed a positive statistical association for the drug/ADR combination, which means "ibrutinib" with "cardiac failure" has been observed more often than expected when compared to other medications available in the WHO database. Conclusion: Health regulators and healthcare professionals must be aware of the potential risk of cardiac failure associated with ibrutinib, and the monitoring of any signs or symptoms in treated patients is essential. The weighted cumulative evidence identified from the causality assessment of the reported cases and from data mining is sufficient to support a causal association between ibrutinib and cardiac failure.
Keywords: cardiac failure, drug safety, ibrutinib, pharmacovigilance, signal detection
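The information component behind the quoted IC=1.5 is commonly computed as a log2 observed-to-expected reporting ratio with a continuity correction; in the sketch below only the 212 combination reports come from the abstract, the marginal counts are invented to roughly reproduce the reported value, and the full WHO-UMC estimator (with its credibility interval, e.g. IC025) is more involved:

```python
import math

def information_component(n_combined, n_drug, n_reaction, n_total):
    """log2 observed-to-expected reporting ratio with a +0.5 continuity
    correction (a common simplified form of the IC)."""
    expected = n_drug * n_reaction / n_total
    return math.log2((n_combined + 0.5) / (expected + 0.5))

# 212 ibrutinib/cardiac-failure reports (from the abstract); the drug,
# reaction, and database totals below are invented for illustration.
ic = information_component(212, 11_200, 100_000, 15_000_000)
```

An IC whose lower credibility bound stays above zero is the usual screening threshold for a disproportionality signal of this kind.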
Procedia PDF Downloads 129
28769 Solar Power Generation in a Mining Town: A Case Study for Australia
Authors: Ryan Chalk, G. M. Shafiullah
Abstract:
Climate change is a pertinent issue facing governments and societies around the world. The industrial revolution has resulted in a steady increase in the average global temperature. The mining and energy production industries have been significant contributors to this change, prompting governments to intervene by promoting low-emission technology within these sectors. This paper first reviews the energy problem in Australia and in the mining sector, with a focus on the energy requirements and production methods utilised in Western Australia (WA). Renewable energy in the form of utility-scale solar photovoltaics (PV) provides a solution to these problems by supplying emission-free energy that can supplement the existing natural gas turbines in operation at the proposed site. This research presents a custom renewable solution for the mining site considering the specific township network, local weather conditions, and seasonal load profiles. The required PV output is sized to supply slightly over 50% of the town's power requirements during the peak (summer) period, resulting in close to full coverage during the trough (winter) period. DIgSILENT PowerFactory software was used to simulate the characteristics of the existing infrastructure and the effects of integrating PV. Large-scale PV penetration introduces technical challenges to the network, including voltage deviation, increased harmonic distortion, increased available fault current, and reduced power factor. Results also show that cloud cover has a dramatic and unpredictable effect on the output of a PV system. The preliminary analyses conclude that mitigation strategies are needed to overcome voltage deviations, unacceptable levels of harmonics, excessive fault current, and low power factor. Mitigation strategies are proposed to control these issues, predominantly through the use of high-quality, purpose-built inverters.
Results show that inverters with harmonic filtering reduce harmonic injections to a level acceptable under Australian standards. Furthermore, configuring inverters to supply active and reactive power helps mitigate low power factor. FACTS devices such as the SVC and STATCOM also reduce harmonics and improve the power factor of the network, and finally, energy storage helps to smooth the power supply.
Keywords: climate change, mitigation strategies, photovoltaic (PV), power quality
Procedia PDF Downloads 166
28768 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning
Authors: Arun Sanjel, Greg Speegle
Abstract:
Every second, massive amounts of data are generated, and Data-Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and error-prone. The automated conversion of a sequential program to a DISC program would therefore significantly improve productivity. However, synthesizing a user's intended program from an input specification is a complex problem with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique that identifies sequential components and translates them into equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.
Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC
Procedia PDF Downloads 107
28767 Advanced Structural Analysis of Energy Storage Materials
Authors: Disha Gupta
Abstract:
The aim of this research is to apply X-ray and e-beam characterization techniques to lithium-ion battery materials in order to improve battery performance. The key characterization techniques employed are synchrotron X-ray Absorption Spectroscopy (XAS) combined with X-ray diffraction (XRD), scanning electron microscopy (SEM), and transmission electron microscopy (TEM), providing a more holistic approach to understanding material properties. This research effort provides additional battery characterization knowledge that supports the development of new cathode, anode, electrolyte, and separator materials, leading to better and more efficient battery performance. Both ex-situ and in-situ synchrotron experiments were performed on LiFePO₄, one of the most common cathode materials, from different commercial sources, and their structural analysis was conducted using the Athena/Artemis software. This analysis was then extended to other cathode materials such as LiMnₓFe₍₁₋ₓ₎PO₄ and to sulphate systems such as Li₂Mn(SO₄)₂ and Li₂Co₀.₅Mn₀.₅(SO₄)₂. XAS data were collected at the Fe and P K-edges for LiFePO₄, and at the Fe, Mn, and P K-edges for LiMnₓFe₍₁₋ₓ₎PO₄, to conduct an exhaustive study of the structure. For the sulphate system Li₂Mn(SO₄)₂, XAS data were collected at both the Mn and S K-edges. Finite Difference Method for Near-Edge Structure (FDMNES) simulations were also conducted for various iron, manganese, and phosphate model compounds and compared with the experimental XANES data to understand mainly the pre-edge structural information of the absorbing atoms. The Fe K-edge XAS results showed a charge compensation occurring on the Fe atom for all the differently synthesized LiFePO₄ materials as well as for the LiMnₓFe₍₁₋ₓ₎PO₄ systems. However, the Mn K-edge results differed as the Mn concentration changed in the materials.
For the sulphate-based system Li₂Mn(SO₄)₂, however, no change in the Mn K-edge was observed, even though electrochemical studies showed Mn redox reactions.
Keywords: Li-ion batteries, electrochemistry, X-ray absorption spectroscopy, XRD
Procedia PDF Downloads 150
28766 Breast Cancer Survivability Prediction via Classifier Ensemble
Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia
Abstract:
This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then searches for the most important features among the four groups, i.e., those that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features selected from SEER by the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated on exactly the same SEER data (period 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. When the data size is increased to cover the whole database (period 1973-2014), the overall weighted average F-score rises to 92.4% on the held-out unseen test set.
Keywords: classifier ensemble, breast cancer survivability, data mining, SEER
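The two-level design described above can be sketched with scikit-learn on synthetic data. This is a hedged approximation, not the paper's system: scikit-learn has no Bayesian network classifier, so logistic regression stands in for it, and each base learner here sees the full feature set rather than the paper's per-group selected features.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for SEER survivability records (binary label).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Base learners feed their predicted probabilities to a meta-learner,
# mirroring the paper's two-level ensemble (Naive Bayes on top).
ensemble = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                ("nb", GaussianNB()),
                ("lr", LogisticRegression(max_iter=1000))],  # BN stand-in
    final_estimator=GaussianNB(),
    stack_method="predict_proba")
ensemble.fit(X_tr, y_tr)
score = f1_score(y_te, ensemble.predict(X_te), average="weighted")
print(f"weighted F-score: {score:.3f}")
```

The weighted average F-score is the same metric the paper reports, though the number here reflects only the synthetic data.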
Procedia PDF Downloads 328
28765 Global City Typologies: 300 Cities and Over 100 Datasets
Authors: M. Novak, E. Munoz, A. Jana, M. Nelemans
Abstract:
Cities and local governments the world over are interested in employing circular strategies as a means to bring about food security, create employment, and increase resilience. The selection and implementation of circular strategies is facilitated by modeling their effects locally and by understanding the impacts such strategies have had in other (comparable) cities and how those impacts would translate locally. Urban areas are heterogeneous in their geographic, economic, and social characteristics, governance, and culture. To better understand the effect of circular strategies on urban systems, we create a dataset covering over 300 cities around the world, designed to facilitate circular-strategy scenario modeling. This new dataset integrates data from over 20 prominent global, national, and urban data sources, such as the Global Human Settlement Layer and the International Labour Organization, and incorporates employment data from over 150 cities collected bottom-up from local departments and data providers. The dataset is built to be reproducible. Various clustering techniques are explored in the paper. The result is sets of clusters of cities, which can be used for further research and analysis and to support comparative, regional, and national policy making on circular cities.
Keywords: data integration, urban innovation, cluster analysis, circular economy, city profiles, scenario modelling
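One of the clustering techniques such a dataset supports can be sketched as follows. The three city indicators and their scales below are purely illustrative stand-ins for the integrated dataset described above; standardization before clustering keeps any single indicator from dominating the distance metric.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical city feature matrix for 300 cities: e.g. population (log),
# GDP per capita, manufacturing employment share -- illustrative only.
cities = rng.normal(size=(300, 3)) * [2.0, 1.0, 0.5] + [10, 30, 20]

# Standardize, then group the cities into a fixed number of typologies.
X = StandardScaler().fit_transform(cities)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
print(np.bincount(km.labels_))  # number of cities per typology cluster
```

In practice the cluster count and the clustering algorithm itself would be varied and compared, as the abstract indicates.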
Procedia PDF Downloads 180
28764 Intrusion Detection Using Dual Artificial Techniques
Authors: Rana I. Abdulghani, Amera I. Melhum
Abstract:
With the rapid growth in the use of networked computers, and given the view shared by most computer security experts that the goal of building a fully secure system is never effectively achieved, intrusion detection systems (IDS) were designed. This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of swarm intelligence; here the algorithm was enhanced to obtain the minimum error rate by amending the cluster centers whenever a better fitness value is found during the training stages. Results show that this modification gives more efficient exploration than the original algorithm. The second uses a back-propagation neural network (BP). Finally, the results of the two methods were compared using the NSL-KDD data sets for the construction and evaluation of intrusion detection systems. This research is only interested in clustering the given connection records into two categories (normal and abnormal). Practical experiments yield an intrusion detection rate of 99.183818% for the enhanced PSO and 69.446416% for the BP neural network.
Keywords: IDS, SI, BP, NSL-KDD, PSO
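A minimal, generic PSO in the global-best form can be sketched as follows. This is not the enhanced cluster-center variant described above; it simply illustrates the velocity/position update that the paper's method builds on, here minimizing the sphere function (minimum at the origin).

```python
import numpy as np

def pso(objective, dim=2, n_particles=30, iters=200, seed=0):
    """Minimal particle swarm optimizer (global-best topology)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                   # personal bests
    pbest_val = np.apply_along_axis(objective, 1, pos)
    gbest = pbest[pbest_val.argmin()].copy()             # global best
    w, c1, c2 = 0.7, 1.5, 1.5        # inertia and acceleration coefficients
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        vals = np.apply_along_axis(objective, 1, pos)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

best, best_val = pso(lambda x: np.sum(x ** 2))   # sphere function
print(best_val)  # converges close to zero
```

In the IDS setting described above, the particle positions would encode cluster centers and the objective would be the clustering error on the training records.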
Procedia PDF Downloads 382
28763 The Journey of a Malicious HTTP Request
Authors: M. Mansouri, P. Jaklitsch, E. Teiniker
Abstract:
SQL injection against web applications is a very popular kind of attack. Mechanisms such as intrusion detection systems exist to detect this attack, but these strategies often rely on techniques implemented at the higher layers of the application and do not consider the low level of system calls. The problem with considering only the high-level perspective is that an attacker can circumvent the detection tools using techniques such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow catching the critical data at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program, or web application, "speaks" the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conduct an experiment to demonstrate the suitability of system call analysis for detecting SQL injection, and we are able to detect the attack. We therefore conclude that system calls are not only powerful in detecting low-level attacks but also enable us to detect high-level attacks such as SQL injection.
Keywords: Linux system calls, web attack detection, interception, SQL
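As a toy illustration of the idea, the sketch below scans strace-style read() lines for URL-encoded SQL injection payloads. The log format, file descriptors, and patterns are hypothetical, not the paper's actual interception mechanism; the point is that decoding happens after the syscall boundary, so URL encoding no longer hides the attack.

```python
import re
from urllib.parse import unquote

# Hypothetical strace-style log lines (layout is illustrative).
trace = [
    'read(4, "GET /item?id=42 HTTP/1.1", 8192) = 24',
    'read(4, "GET /item?id=42%27%20OR%20%271%27=%271 HTTP/1.1", 8192) = 47',
    'write(1, "hello", 5) = 5',
]

SQLI_PATTERNS = [r"'\s*OR", r"UNION\s+SELECT", r"--\s*$"]

def suspicious(line):
    """Flag read() syscalls whose payload matches an injection pattern,
    after undoing the URL encoding an attacker might use for evasion."""
    m = re.match(r"read\(\d+, \"(.*)\",", line)
    if not m:
        return False
    payload = unquote(m.group(1))  # decode %27 -> ', %20 -> space, etc.
    return any(re.search(p, payload, re.IGNORECASE) for p in SQLI_PATTERNS)

flags = [suspicious(l) for l in trace]
print(flags)  # only the URL-encoded injection attempt is flagged
```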
Procedia PDF Downloads 359
28762 Phytoremediation of Artisanal Gold Mine Tailings: Potential of Chrysopogon zizanioides and Andropogon gayanus in the Sahelian Climate
Authors: Yamma Rose, Kone Martine, Yonli Arsène, Wanko Ngnien Adrien
Abstract:
Pollution of soils and, consequently, of water resources by micropollutants from gold mine tailings constitutes a major threat in developing countries due to the lack of waste treatment. Phytoremediation is an alternative for extracting or trapping micropollutants from soils contaminated by mining residues. The potential of Chrysopogon zizanioides (an acclimated plant) and Andropogon gayanus (a native plant) to accumulate arsenic (As), mercury (Hg), iron (Fe), and zinc (Zn) was studied in Ouagadougou, Burkina Faso, using artisanal gold mine tailings. The phytoremediation effectiveness of the two plant species was studied in 75 pots of 30 liters each, containing mining residues from the artisanal gold processing site in the rural commune of Nimbrogo. The experiments cover three modalities, arranged in a randomized manner: Tn, planted unpolluted soils; To, unplanted mine tailings; and Tp, planted mine tailings. The pots were amended quarterly with compost to provide nutrients to the plants. The phytoremediation assessment consists of comparing the growth, the biomass, and the capacity of these two herbaceous plants to extract or trap Hg, Fe, Zn, and As from mining residues in a controlled environment. The analysis of the plant species cultivated in mine tailings shows significantly higher relative growth indices for A. gayanus (34.38%) than for C. zizanioides (20.37%), while the biomass analysis reveals that C. zizanioides has greater foliage and root system growth than A. gayanus. The results after a culture time of six months showed that both C. zizanioides and A. gayanus have the potential to accumulate Hg, Fe, Zn, and As, with root biomass accumulating more than above-ground biomass in both species. Although the bioaccumulation factor (BCF) values for both plants are low (<1), the removal efficiencies of Hg, Fe, Zn, and As reach 45.13%, 42.26%, 21.5%, and 2.87%, respectively, after 24 weeks of culture with C. zizanioides, whereas pots grown with A. gayanus give removal efficiencies of 43.55%, 41.52%, 2.87%, and 1.35% for Fe, Zn, Hg, and As, respectively. These results indicate that both plant species have a strong phytoremediation potential, although that of A. gayanus is somewhat lower than that of C. zizanioides.
Keywords: artisanal gold mine tailings, Andropogon gayanus, Chrysopogon zizanioides, phytoremediation
Procedia PDF Downloads 65
28761 Harmonic Mitigation and Total Harmonic Distortion Reduction in Grid-Connected PV Systems: A Case Study Using Real-Time Data and Filtering Techniques
Authors: Atena Tazikeh Lemeski, Ismail Ozdamar
Abstract:
This study presents a detailed analysis of harmonic distortion in a grid-connected photovoltaic (PV) system using real-time data captured from a solar power plant. Harmonics introduced by inverters in PV systems can degrade power quality and lead to increased Total Harmonic Distortion (THD), which poses challenges such as transformer overheating, increased power losses, and potential grid instability. This research addresses these issues by applying the Fast Fourier Transform (FFT) to identify significant harmonic components and employing notch filters to target specific frequencies, particularly the 3rd harmonic (150 Hz), which was identified as the largest contributor to THD. Initial analysis of the unfiltered voltage signal revealed a THD of 21.15%, with prominent harmonic peaks at 150 Hz, 250 Hz, and 350 Hz, corresponding to the 3rd, 5th, and 7th harmonics, respectively. After implementing the notch filters, the THD was reduced to 5.72%, demonstrating the effectiveness of this approach in mitigating harmonic distortion without affecting the fundamental frequency. This paper provides practical insights into the application of real-time filtering techniques in PV systems and their role in improving overall grid stability and power quality. The results indicate that targeted harmonic mitigation is crucial for the sustainable integration of renewable energy sources into modern electrical grids.
Keywords: grid-connected photovoltaic systems, fast Fourier transform, harmonic filtering, inverter-induced harmonics
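The FFT-plus-notch pipeline described above can be sketched with NumPy/SciPy on a synthetic signal. The harmonic amplitudes below are illustrative (the plant data are not public), and only the 150 Hz notch is applied here, so the residual THD differs from the paper's 5.72%.

```python
import numpy as np
from scipy.signal import filtfilt, iirnotch

fs = 10_000                         # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
# Synthetic inverter voltage: 50 Hz fundamental plus 3rd/5th/7th harmonics.
v = (np.sin(2 * np.pi * 50 * t) + 0.20 * np.sin(2 * np.pi * 150 * t)
     + 0.10 * np.sin(2 * np.pi * 250 * t) + 0.05 * np.sin(2 * np.pi * 350 * t))

def thd(signal, fund=50, n_harm=10):
    """THD from single-sided FFT magnitudes at integer multiples of `fund`."""
    spec = np.abs(np.fft.rfft(signal)) / len(signal) * 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    amp = lambda f: spec[np.argmin(np.abs(freqs - f))]
    harmonics = np.sqrt(sum(amp(k * fund) ** 2 for k in range(2, n_harm + 1)))
    return harmonics / amp(fund)

# Notch out the dominant 3rd harmonic at 150 Hz (zero-phase filtering).
b, a = iirnotch(w0=150, Q=30, fs=fs)
v_filtered = filtfilt(b, a, v)
print(f"THD before: {thd(v):.1%}, after: {thd(v_filtered):.1%}")
```

Cascading further notches at 250 Hz and 350 Hz would suppress the remaining harmonics the same way, while the narrow Q leaves the 50 Hz fundamental essentially untouched.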
Procedia PDF Downloads 34
28760 Gold-Bearing Alteration Zones in South Eastern Desert of Egypt: Geology and Remote Sensing Analysis
Authors: Mohamed F. Sadek, Safaa M. Hassan, Safwat S. Gabr
Abstract:
Several alteration zones hosting gold mineralization are widespread in the South Eastern Desert of Egypt, where gold has been mined in many localities since the time of the Pharaohs. Sukkari is currently the only producing gold mine in the Eastern Desert of Egypt; it is therefore necessary to conduct more detailed studies of these locations using modern exploration methods. Remote sensing plays an important role in lithological mapping and in the detection of associated hydrothermal mineralization, particularly in the exploration for gold. This study focuses on three localities in the South Eastern Desert of Egypt, namely Beida, Defiet, and Hoteib-Eiqat, aiming to detect the gold-bearing hydrothermal alteration zones using integrated remote sensing data, field study, and mineralogical investigation. These areas are dominated by Precambrian basement rocks, including metamorphic and magmatic assemblages. They comprise ophiolitic serpentinite-talc carbonate and island-arc metavolcanics, which were intruded by syn- to late-orogenic mafic and felsic intrusions, mainly gabbro, granodiorite, and monzogranite. Processed Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and Landsat-8 images are used in the present study to map the gold-bearing hydrothermal alteration zones. Band ratioing and principal component analysis techniques are used to discriminate the different lithologic units exposed in the three study areas, and field study and mineralogical investigation are used to verify the remote sensing data. This study concludes that integrating remote sensing data with geological, field, and mineralogical investigations is very effective for lithological discrimination, detailed geological mapping, and detection of gold-bearing hydrothermal alteration zones.
More detailed exploration for gold mineralization with the help of remote sensing techniques is recommended to evaluate the potentiality of the study areas.
Keywords: Pan-African, Egypt, Landsat-8, ASTER, gold, alteration zones
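Band ratioing and principal component analysis, the two image-processing techniques named above, can be sketched as follows. The band cube is synthetic and the band indices are illustrative, not the actual ASTER/Landsat-8 band combinations used in the study.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Synthetic 6-band image cube (100x100 pixels); real work would load
# calibrated ASTER or Landsat-8 reflectance bands instead.
bands = rng.gamma(2.0, 50.0, size=(6, 100, 100))
pixels = bands.reshape(6, -1).T          # (n_pixels, n_bands)

# Band ratioing: a per-pixel ratio of two bands highlights spectral
# contrasts (e.g., of alteration minerals); band choice is illustrative.
ratio = bands[3] / (bands[1] + 1e-6)

# PCA: concentrates inter-band variance into a few components, which are
# then inspected as images for lithological discrimination.
pca = PCA(n_components=3).fit(pixels)
print(ratio.shape, pca.explained_variance_ratio_)
```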
Procedia PDF Downloads 127
28759 Multi-Scaled Non-Local Means Filter for Medical Images Denoising: Empirical Mode Decomposition vs. Wavelet Transform
Authors: Hana Rabbouch
Abstract:
In recent years, there has been considerable growth in denoising techniques devoted mainly to medical imaging. This important evolution is due not only to the progress of computing techniques but also to the emergence of multi-resolution analysis (MRA) on both mathematical and algorithmic bases. In this paper, a comparative study is conducted between the two best-known MRA-based decomposition techniques: the Empirical Mode Decomposition (EMD) and the Discrete Wavelet Transform (DWT). The comparison is carried out in a framework of multi-scale denoising, where a Non-Local Means (NLM) filter is applied scale by scale to a sample of benchmark medical images. The results prove the effectiveness of the multiscaled denoising, especially when the NLM filtering is coupled with the EMD.
Keywords: medical imaging, non-local means, denoising, multiscaled analysis, empirical mode decomposition, wavelets
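A minimal pixelwise NLM filter, without the multi-scale EMD/DWT coupling evaluated in the paper, can be sketched in plain NumPy; the phantom image, patch sizes, and smoothing parameter h are illustrative.

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.15):
    """Pixelwise non-local means: each pixel becomes a weighted average of
    search-window pixels, weighted by the similarity of their patches."""
    pr, sr = patch // 2, search // 2
    padded = np.pad(img, pr + sr, mode="reflect")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pr + sr, j + pr + sr
            ref = padded[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            w_sum = v_sum = 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)
                    w_sum += w
                    v_sum += w * padded[ni, nj]
            out[i, j] = v_sum / w_sum
    return out

rng = np.random.default_rng(0)
clean = np.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32))  # smooth phantom
noisy = clean + rng.normal(0, 0.1, clean.shape)
denoised = nlm_denoise(noisy)
print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
```

In the paper's multi-scale framework, this filter would be applied separately to each EMD mode or DWT subband before reconstruction, rather than once to the raw image.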
Procedia PDF Downloads 141
28758 Machine Learning Techniques in Seismic Risk Assessment of Structures
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of seismic hazard and risk assessment for different types of structures. The first step is the development of ground-motion models, which are used for forecasting ground-motion intensity measures (IMs) given source characteristics, source-to-site distance, and local site conditions for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudo-spectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forests, and Support Vector Machines. The results indicate that these algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, with Random Forest in particular outperforming the other algorithms; the conventional method remains the better tool when only limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between the structural demand responses (e.g., component deformations, accelerations, internal forces, etc.)
and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined limit states, and therefore they control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate predictions in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analyses.
Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine
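The first comparison, linear regression versus a tree ensemble for ground-motion prediction, can be sketched on synthetic data. The functional form and coefficients below are illustrative stand-ins for recorded ground motions, not a calibrated attenuation model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 3000
mag = rng.uniform(4.0, 7.5, n)        # moment magnitude
dist = rng.uniform(5.0, 200.0, n)     # source-to-site distance, km
vs30 = rng.uniform(180.0, 760.0, n)   # site-condition proxy, m/s

# Synthetic ln(PGA) with saturation-like nonlinearity plus record noise.
ln_pga = (1.2 * mag - 0.1 * mag ** 2 - 1.4 * np.log(dist + 10)
          - 0.3 * np.log(vs30 / 760) + rng.normal(0, 0.3, n))

X = np.column_stack([mag, dist, vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, ln_pga, random_state=0)

linear = LinearRegression().fit(X_tr, y_tr)
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
for name, model in [("linear", linear), ("random forest", forest)]:
    print(name, mean_squared_error(y_te, model.predict(X_te)))
```

With the nonlinear distance and magnitude terms present and ample data, the forest's test error falls below the linear fit's, consistent with the abstract's finding for data-rich settings.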
Procedia PDF Downloads 106
28757 A Proposal of Advanced Key Performance Indicators for Assessing Six Performances of Construction Projects
Authors: Wi Sung Yoo, Seung Woo Lee, Youn Kyoung Hur, Sung Hwan Kim
Abstract:
Large-scale construction projects are continuously increasing, and the need for tools to monitor and evaluate project success is emphasized. At the construction industry level, there are limitations in deriving performance evaluation factors that reflect the diversity of construction sites, and in systems that can objectively evaluate and manage performance. Additionally, there are difficulties in integrating the structured and unstructured data generated at construction sites and in deriving improvements from them. In this study, we propose Key Performance Indicators (KPIs) that enable a performance evaluation reflecting the increased diversity of construction sites and the unstructured data they generate, and we present a model for measuring performance with the derived indicators. The comprehensive performance of a unit construction site is assessed in six areas (time, cost, quality, safety, environment, productivity) using 26 indicators. We collect performance indicator information from 30 construction sites that meet legal standards and have been successfully completed, and we apply data augmentation and optimization techniques to establish measurement standards for each indicator. In other words, the KPIs for construction site performance evaluation presented in this study provide standards for evaluating performance in six areas using institutional requirement data and document data. This can be expanded into a performance evaluation system that considers the scale and type of construction project. The indicators are also expected to serve as a comprehensive index for the construction industry and as basic data for tracking competitiveness at the national level and establishing policies.
Keywords: key performance indicator, performance measurement, structured and unstructured data, data augmentation
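Aggregating the six area scores into a single site-level figure can be sketched as a weighted average. The weights and scores below are hypothetical, not the paper's measurement standards, which are derived per indicator from the 30-site data.

```python
# Hypothetical per-area scores (0-100) for one site, and area weights
# summing to 1.0 -- both purely illustrative.
areas = ["time", "cost", "quality", "safety", "environment", "productivity"]
weights = {"time": 0.20, "cost": 0.20, "quality": 0.20,
           "safety": 0.15, "environment": 0.10, "productivity": 0.15}
site_scores = {"time": 82, "cost": 74, "quality": 90,
               "safety": 88, "environment": 70, "productivity": 79}

overall = sum(weights[a] * site_scores[a] for a in areas)
print(f"composite KPI score: {overall:.2f} / 100")  # 81.25 for these inputs
```

In the paper's framework each area score would itself be aggregated from its underlying indicators against their established measurement standards before this final weighting.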
Procedia PDF Downloads 42
28756 Using Open Source Data and GIS Techniques to Overcome Data Deficiency and Accuracy Issues in the Construction and Validation of Transportation Network: Case of Kinshasa City
Authors: Christian Kapuku, Seung-Young Kho
Abstract:
An accurate representation of the transportation system serving the region is one of the important aspects of transportation modeling. Such representation often requires developing an abstract model of the system elements, which in turn requires a significant amount of data, surveys, and time. However, in some cases, such as in developing countries, data deficiencies and time and budget constraints do not always allow such an accurate representation, leaving room for assumptions that may negatively affect the quality of the analysis. With the emergence of open source Internet data, especially in mapping technologies, as well as advances in Geographic Information Systems (GIS), opportunities to tackle these issues have arisen. The objective of this paper is therefore to demonstrate such an application through the practical case of developing the transportation network for the city of Kinshasa. GIS geo-referencing was used to construct the digitized map of Transportation Analysis Zones from available scanned images. Centroids were then dynamically placed at the centers of activity using an activity density map. Next, the road network with its characteristics was built from OpenStreetMap data and other official road inventory data by intersecting their layers and cleaning up unnecessary links such as residential streets. The accuracy of the final network was then checked by comparing it with satellite images from Google and Bing. For validation, the final network was exported into Emme 3 to check for potential network coding issues. Results show a high accuracy between the built network and the satellite images, which can mostly be attributed to the use of open source data.
Keywords: geographic information system (GIS), network construction, transportation database, open source data
Procedia PDF Downloads 167