Search results for: decision tree algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6270

990 Copyright Clearance for Artificial Intelligence Training Data: Challenges and Solutions

Authors: Erva Akin

Abstract:

The use of copyrighted material for machine learning purposes is a challenging issue in the field of artificial intelligence (AI). While machine learning algorithms require large amounts of data to train and to improve their accuracy and creativity, the use of copyrighted material without permission from the authors may infringe on their intellectual property rights. To overcome the legal hurdle that copyright poses to data sharing, access, and re-use, the use of copyrighted material for machine learning purposes may be considered permissible under certain circumstances. For example, if the copyright holder has given permission to use the data through a licensing agreement, then its use for machine learning purposes may be lawful. It is also argued that copying for non-expressive purposes that do not involve conveying expressive elements to the public, such as automated data extraction, should not be seen as infringing. The focus of such ‘copy-reliant technologies’ is on understanding language rules, styles, and syntax, and no creative ideas are being used. However, the non-expressive use defense sits within the framework of the fair use doctrine, which allows the use of copyrighted material for research or educational purposes. Questions arise because the fair use doctrine is not available in EU law; instead, the InfoSoc Directive provides for a rigid system of exclusive rights with a list of exceptions and limitations. One could only argue that non-expressive uses of copyrighted material for machine learning purposes do not constitute a ‘reproduction’ in the first place. Nevertheless, the use of machine learning with copyrighted material is difficult because EU copyright law applies to the mere use of the works. Two solutions can be proposed to address the problem of copyright clearance for AI training data.
The first is to introduce a broad exception for text and data mining, either mandatory or limited to commercial and scientific purposes. The second is for copyright laws to permit the reproduction of works for non-expressive purposes, which opens the door to discussions regarding the transposition of the fair use principle from the US into EU law. Both solutions aim to provide more space for AI developers to operate and encourage greater freedom, which could lead to more rapid innovation in the field. The Data Governance Act presents a significant opportunity to advance these debates. Finally, issues concerning the balance of general public interests and legitimate private interests in machine learning training data must be addressed. In my opinion, it is crucial that robot-created output should fall into the public domain. Machines depend on human creativity, innovation, and expression. To encourage technological advancement and innovation, freedom of expression and freedom of business operation must be prioritised.

Keywords: artificial intelligence, copyright, data governance, machine learning

Procedia PDF Downloads 77
989 A Rule Adumbrated: Bailment on Terms

Authors: David Gibbs-Kneller

Abstract:

Only parties to a contract can enforce it. This is the doctrine of privity of contract. Carriage contracts frequently involve intermediated relationships. While the carrier and cargo-owner will agree on a contract for carriage, there is no privity or consideration between the cargo-owner and third parties. To overcome this, the contract utilizes ‘bailment on terms’, or the rule in Morris. Morris v C W Martin & Sons Ltd is authority for the following: a sub-bailee and bailor may rely on terms of a bailment where the bailor has consented to sub-bailment “on terms”. Bailment on terms can play a significant part in making litigation decisions and determining liability. It is used in standard form contracts, and courts have also strived to find consent to bailment on terms in agreements so as to avoid the consequences of privity of contract. However, what this paper exposes is the false legal basis for this model. Lord Denning gave an adumbrated account of the law of bailments to justify the rule in Morris. What Lord Denning was really doing was objecting to the doctrine of privity. To do so, he wrongly asserted there was a lacuna in the law that meant third parties could not avail themselves of the terms of a contract. Next, he provided a false analogy between purely contractual rights and possessory liens. Finally, he gave accounts of authorities to say they supported the rule in Morris when they did not. Surprisingly, subsequent case law on the point has not properly engaged with this reasoning. The Pioneer Container held that since the rule in Morris lay in bailments, the decision is not dependent on the doctrine of privity. Yet the basis for this statement was Morris itself. Once these reasons have been discounted, all that bailment on terms rests on is the claim that the law of bailments is an independent source of law. Bailment on terms should not be retained, for it is contrary to established principles in the law of property, tort, and contract.
Retaining it undermines the certainty of those principles and risks their collapse, because there is nothing that keeps bailment on terms within the confines of bailments only. As such, bailment on terms is not good law and should not be used in standard form contracts or by the courts as a means of determining liability. If bailment on terms is a pragmatic rule to retain, it is recommended that the rules governing carriage contracts be amended.

Keywords: bailment, carriage of goods, contract law, privity

Procedia PDF Downloads 184
988 Empirical Analysis of Forensic Accounting Practices for Tackling Persistent Fraud and Financial Irregularities in the Nigerian Public Sector

Authors: Sani AbdulRahman Bala

Abstract:

This empirical study delves into the realm of forensic accounting practices within the Nigerian Public Sector, seeking to quantitatively analyze their efficacy in addressing the persistent challenges of fraud and financial irregularities. With a focus on empirical data, this research employs a robust methodology to assess the current state of fraud in the Nigerian Public Sector and evaluate the performance of existing forensic accounting measures. Through quantitative analyses, including statistical models and data-driven insights, the study aims to identify patterns, trends, and correlations associated with fraudulent activities. The research objectives include scrutinizing documented fraud cases, examining the effectiveness of established forensic accounting practices, and proposing data-driven strategies for enhancing fraud detection and prevention. Leveraging quantitative methodologies, the study seeks to measure the impact of technological advancements on forensic accounting accuracy and efficiency. Additionally, the research explores collaborative mechanisms among government agencies, regulatory bodies, and the private sector by quantifying the effects of information sharing on fraud prevention. The empirical findings from this study are expected to provide a nuanced understanding of the challenges and opportunities in combating fraud within the Nigerian Public Sector. The quantitative insights derived from real-world data will contribute to the refinement of forensic accounting strategies, ensuring their effectiveness in addressing the unique complexities of financial irregularities in the public sector. The study's outcomes aim to inform policymakers, practitioners, and stakeholders, fostering evidence-based decision-making and proactive measures for a more resilient and fraud-resistant financial governance system in Nigeria.

Keywords: fraud, financial irregularities, Nigerian public sector, quantitative investigation

Procedia PDF Downloads 50
987 BIM4Cult: Leveraging BIM and IoT for Enhancing Fire Safety in Historical Buildings

Authors: Anastasios Manos, Despina Elisabeth Filippidou

Abstract:

Introduction: Historical buildings are an integral part of the cultural heritage of every place, and beyond the obvious need for protection against risks, they have specific requirements regarding the handling of hazards and disasters such as fire, floods, earthquakes, etc. Ensuring high levels of protection and safety for these buildings is imperative for two distinct but interconnected reasons: a) they themselves constitute cultural heritage, and b) they are often used as museums/cultural spaces, necessitating the protection of both human life (visitors and workers) and the cultural treasures they house. However, these buildings present serious constraints in implementing the necessary protective measures due to their unique architecture, construction methods, and/or the structural materials used in the past, which have created an existing condition that is sometimes challenging to reshape and operate within the framework of modern regulations and protection measures. One of the most devastating risks that threaten historical buildings is fire. Catastrophic fires demonstrate the need for timely evaluation of fire safety measures in historical buildings. Recognizing the criticality of protecting historical buildings from the risk of fire, the Confederation of Fire Protection Associations in Europe (CFPA-E) issued specific guidelines in 2013 (CFPA-E Guideline No 30:2013 F) for the fire protection of historical buildings at the European level. However, until now, few actions have been taken to leverage modern technologies in the construction and maintenance of buildings, such as Building Information Modeling (BIM) and the Internet of Things (IoT), for the protection of historical buildings from risks like fires, floods, etc. The project BIM4Cult has been developed to fill this gap.
It is a tool for timely assessment and monitoring of the fire safety level of historical buildings, using BIM and IoT technologies in an integrated manner. The tool serves as a decision-support expert system for improving the fire safety of historical buildings by continuously monitoring, controlling, and assessing critical risk factors for fire.
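The decision-support idea of continuously assessing critical risk factors can be sketched as a minimal rule-based check over IoT sensor readings. The sensor names, thresholds, and weights below are invented for illustration and are not taken from BIM4Cult:

```python
# Hypothetical rule-based fire-risk check over IoT sensor readings,
# in the spirit of an expert system for historical buildings.
RULES = [
    # (sensor key, threshold, risk weight, description)
    ("temperature_c", 55.0, 3, "ambient temperature above 55 °C"),
    ("smoke_ppm", 150.0, 4, "smoke concentration above 150 ppm"),
    ("co_ppm", 35.0, 2, "carbon monoxide above 35 ppm"),
]

def assess_fire_risk(readings):
    """Return (aggregate risk score, list of triggered rule descriptions)."""
    score, triggered = 0, []
    for key, threshold, weight, description in RULES:
        if readings.get(key, 0.0) > threshold:
            score += weight
            triggered.append(description)
    return score, triggered

score, alerts = assess_fire_risk(
    {"temperature_c": 60.0, "smoke_ppm": 20.0, "co_ppm": 40.0}
)
print(score, alerts)  # temperature and CO rules fire: score 5
```

A real deployment would of course calibrate thresholds per building and feed the score back into the BIM model; this only illustrates the monitoring-and-assessment loop.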

Keywords: IoT, fire, BIM, expert system

Procedia PDF Downloads 62
986 Evaluating Viability of Using South African Forestry Process Biomass Waste Mixtures as an Alternative Pyrolysis Feedstock in the Production of Bio-Oil

Authors: Thembelihle Portia Lubisi, Malusi Ntandoyenkosi Mkhize, Jonas Kalebe Johakimu

Abstract:

Fertilizers play an important role in maintaining the productivity and quality of plants. Inorganic fertilizers (containing nitrogen, phosphorus, and potassium) are widely used in South Africa, as they are considered inexpensive and highly productive. When applied, a portion of the excess fertilizer is retained in the soil, and a portion enters water streams due to surface runoff or the irrigation system adopted. Excess nutrients from the fertilizers entering the water stream eventually result in harmful algal blooms (HABs) in freshwater systems, which not only disrupt wildlife but can also produce toxins harmful to humans. The use of agro-chemicals such as pesticides and herbicides has been associated with increased antimicrobial resistance (AMR) in humans, as the plants are consumed by humans. This bacterial resistance poses a threat, as it prevents the health sector from treating infectious diseases. Archaeological studies have found that pyrolysis liquids were already used in the time of the Neanderthals as a biocide and plant protection product. Pyrolysis is the thermal degradation of plant biomass or organic material under anaerobic conditions, leading to the production of char, bio-oils, and syngas. Bio-oil constituents can be categorized into water-soluble (wood vinegar) and water-insoluble fractions (tar and light oils). Wood vinegar (pyroligneous acid) is said to contain highly oxygenated compounds including acids, alcohols, aldehydes, ketones, phenols, esters, furans, and other multifunctional compounds, with molecular weights and compositions depending on the biomass material from which it is derived and the pyrolysis operating conditions. Various researchers have found wood vinegar to be efficient in the eradication of termites, effective in plant protection and plant growth, and antibacterial, inhibiting micro-organisms such as candida yeast, E. coli, etc.
This study investigated the characterisation of South African forestry product processing waste with the intention of evaluating its potential as feedstock for bio-oil production via the pyrolysis process. The ability to use biomass waste materials in the production of wood vinegar has the advantage that it not only allows for the reduction of environmental pollution and landfill requirements, but also does not negatively affect food security. The biomass wastes investigated were from the popular tree types in KZN, namely pine saw dust (PSD), pine bark (PB), eucalyptus saw dust (ESD), and eucalyptus bark (EB). Furthermore, the research investigates the possibility of mixing the different wastes, with the aims of lessening the cost of raw material separation prior to feeding into the pyrolysis process and of increasing the amount of biomass material available for beneficiation. Two mixtures were prepared: a 50/50 mixture of PSD and ESD (EPSD), and a mixture containing pine saw dust, eucalyptus saw dust, pine bark, and eucalyptus bark (EPSDB). Characterisation of the biomass waste will include proximate analysis (volatiles, ash, fixed carbon), ultimate analysis (carbon, hydrogen, nitrogen, oxygen, sulphur), higher heating value, structural analysis (cellulose, hemicellulose, and lignin), and thermogravimetric analysis.
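As a small illustration of the proximate analysis mentioned above: on a dry basis, fixed carbon is conventionally obtained by difference from volatile matter and ash. A minimal sketch, with illustrative percentages rather than measured values from this study:

```python
def fixed_carbon_dry_basis(volatile_matter_pct, ash_pct):
    """Proximate analysis by difference (dry basis):
    fixed carbon = 100 % - volatile matter - ash."""
    return 100.0 - volatile_matter_pct - ash_pct

# Illustrative numbers only (e.g. a woody biomass with high volatiles):
print(fixed_carbon_dry_basis(84.0, 0.5))  # 15.5
```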

Keywords: characterisation, biomass waste, saw dust, wood waste

Procedia PDF Downloads 56
985 Risk in the South African Sectional Title Industry: An Assurance Perspective

Authors: Leandi Steenkamp

Abstract:

The sectional title industry has been a part of the property landscape in South Africa for almost half a century, and plays a significant role in addressing the housing problem in the country. Stakeholders such as owners and investors in sectional title property are in most cases not directly involved in the management thereof, and place reliance on the audited annual financial statements of bodies corporate for decision-making purposes. Although the industry seems to be highly regulated, the legislation regarding accounting and auditing of sectional title is vague and ambiguous. Furthermore, there are no industry-specific auditing and accounting standards to guide accounting and auditing practitioners in performing their work and industry financial benchmarks are not readily available. In addition, financial pressure on sectional title schemes is often very high due to the fact that some owners exercise unrealistic pressure to keep monthly levies as low as possible. All these factors have an impact on the business risk as well as audit risk of bodies corporate. Very little academic research has been undertaken on the sectional title industry in South Africa from an accounting and auditing perspective. The aim of this paper is threefold: Firstly, to discuss the findings of a literature review on uncertainties, ambiguity and confusing aspects in current legislation regarding the audit of a sectional title property that may cause or increase audit and business risk. Secondly, empirical findings of risk-related aspects from the results of interviews with three groups of body corporate role-players will be discussed. The role-players were body corporate trustee chairpersons, body corporate managing agents and accounting and auditing practitioners of bodies corporate. Specific reference will be made to business risk and audit risk. 
Thirdly, practical recommendations will be made on possibilities of closing the audit expectation gap, and further research opportunities in this regard will be discussed.

Keywords: assurance, audit, audit risk, body corporate, corporate governance, sectional title

Procedia PDF Downloads 259
984 Unraveling the Political Complexities of the Textile and Clothing Waste Ecosystem: A Case Study on Melbourne Metropolitan Civic Waste Management Practices

Authors: Yasaman Samie

Abstract:

The ever-increasing rate of textile and clothing (T&C) waste generation and commonly ineffective waste management practices have long been a challenge for civic waste management. This challenge stems not only from the complexity of T&C material components but also from the heterogeneous nature of the T&C waste management sector and the disconnection between its stakeholders. To date, there is little research that investigates the importance of a governmental structure and its role in T&C waste management practices and decision-making. This paper reflects on the impacts and involvement of governments, acts, and legislation on the effectiveness of T&C waste management practices, which are carried out by multiple players in a city context. In doing so, this study first develops a methodical framework for holistically analyzing a city’s T&C waste ecosystem. Central to this framework are six dimensions: social, environmental, economic, political, cultural, and educational, as well as the connections between these dimensions, such as socio-political and cultural-political. Second, it delves into the political dimension and its interconnections with varying aspects of T&C waste. In this manner, the study takes metropolitan Melbourne as a case and draws on the social theory of Actor-Network Theory and the principles of supply chain design and planning. Data collection was through two rounds of semi-structured interviews with 18 key players in the T&C waste ecosystem (including charities, city councils, private sector providers, and producers), mainly within metropolitan Melbourne but also in other Australian and European cities. Research findings expand on the role of the politics of waste in facilitating a proactive approach to T&C waste management in cities.
This is achieved through a revised definition of T&C waste and its characteristics, a discussion of the varying perceptions of value in waste, a prioritization of waste types in civic waste management practices, and an account of how all these aspects should be reflected in the acts and legislation in place.

Keywords: civic waste management, multi-stakeholder ecosystem, textile and clothing waste, waste and governments

Procedia PDF Downloads 105
983 Strengths and Weaknesses of Tally, an LCA Tool for Comparative Analysis

Authors: Jacob Seddlemeyer, Tahar Messadi, Hongmei Gu, Mahboobeh Hemmati

Abstract:

The main purpose of the first tier of this study is to quantify and compare the embodied environmental impacts associated with alternative materials applied to Adohi Hall, a residence building on the University of Arkansas campus, Fayetteville, AR. This 200,000-square-foot building has 5 stories built with mass timber and is compared to another scenario in which the same edifice is built with a steel frame. Based on the defined goal and scope of the project, the materials respective to the two building options are compared in terms of Global Warming Potential (GWP), from cradle to construction site, which includes the material manufacturing stage (raw material extraction, processing, supply, transport, and manufacture) plus transportation to the site (modules A1-A4, per the standard EN 15804 definition). The fossil fuels consumed and CO2 emitted in association with buildings are the major sources of their climate change impacts. In this study, GWP is primarily assessed to the exclusion of other environmental factors. The second tier of this work is to evaluate Tally’s performance in the decision-making process through the design phases, as well as to determine its strengths and weaknesses. Tally is a Life Cycle Assessment (LCA) tool capable of conducting a cradle-to-grave analysis. As opposed to other software applications, Tally is specifically targeted at building LCA. As a peripheral application, this software tool runs directly within the core modeling platform, Revit. This unique functionality causes Tally to stand out from other similar tools in building-sector LCA analysis. The results of this study also provide insights for making more environmentally efficient decisions in the built environment and help the move toward reducing Green House Gas (GHG) emissions and mitigating GWP.
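The cradle-to-site comparison described above amounts to summing GWP contributions over modules A1-A4 for each design option. A minimal sketch, with entirely made-up placeholder values rather than Adohi Hall results:

```python
def total_gwp(modules):
    """Sum GWP (kg CO2-eq) over life-cycle modules A1-A4
    (manufacturing stages A1-A3 plus transport to site A4)."""
    return sum(modules[m] for m in ("A1", "A2", "A3", "A4"))

# Placeholder module totals for two hypothetical design options:
mass_timber = {"A1": 1200.0, "A2": 150.0, "A3": 300.0, "A4": 80.0}
steel_frame = {"A1": 2100.0, "A2": 140.0, "A3": 450.0, "A4": 70.0}

print(total_gwp(mass_timber))  # 1730.0
print(total_gwp(steel_frame))  # 2760.0
```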

Keywords: comparison, GWP, LCA, materials, Tally

Procedia PDF Downloads 218
982 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. These obstacles could make the diffusion of eco-design and LCA methods in the manufacturing sectors impossible. This article addresses the research question: how can the LCA method be adapted so that it can be generalized massively and its performance improved? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, an analysis of the literature was conducted to identify existing automation methods. Given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. In the second part, our development of automated construction is presented (reconciliation and implementation of data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating the process of data mapping and hence product modeling. This method is also able to complete the LCA process on its own within minutes; the calculations and the LCA report are generated automatically. The tool developed has shown that automation by code is a viable solution to meet LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, the proposed method shows potential for a wide range of applications.
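The data-reconciliation step can be pictured as automatically mapping bill-of-materials rows to life cycle inventory (LCI) impact factors and flagging unmatched rows for manual review. A minimal sketch in which the material names and factors are hypothetical, not taken from the paper's tool:

```python
# Hypothetical LCI impact factors, kg CO2-eq per kg of material.
LCI_FACTORS = {
    "copper": 4.1,
    "pvc": 2.6,
    "steel": 1.9,
}

def automated_gwp(bill_of_materials):
    """Map each (material, mass_kg) row to an impact factor and sum
    the impacts; unmatched rows are flagged for manual review."""
    total, unmatched = 0.0, []
    for material, mass_kg in bill_of_materials:
        factor = LCI_FACTORS.get(material.lower())
        if factor is None:
            unmatched.append(material)
        else:
            total += factor * mass_kg
    return total, unmatched

total, unmatched = automated_gwp([("Copper", 2.0), ("PVC", 1.5), ("resin", 0.3)])
print(round(total, 2), unmatched)  # 12.1 ['resin']
```

The flagging of unmatched rows mirrors why a fully automatic pipeline still needs an escape hatch: reconciliation only works where the mapping is known.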

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 76
981 Disaster Education and Children with Visual Impairment

Authors: Vassilis Argyropoulos, Magda Nikolaraizi, Maria Papazafiri

Abstract:

This study describes a series of learning workshops that took place within the CUIDAR project. The workshops aimed to empower children to share their experiences and views in relation to natural hazards and disasters. The participants were ten primary school students who had severe visual impairments or multiple disabilities and visual impairments (MDVI). The main objectives of the workshops were: a) to promote the access of the children through the use of appropriate educational material, such as texts in braille, enlarged text, and tactile maps, and the implementation of differentiated instruction; b) to make children aware of their rights to access information and to participate in planning and decision-making, especially in relation to disaster education programs; and c) to encourage children to take an active role during the workshops through child-led and experiential learning activities. The children expressed their views regarding the meaning of hazards and disasters. Following this, they discussed their experiences and emotions regarding natural hazards and disasters, and they chose to place the emphasis on the hazard most pertinent to them, their community, and their region, namely fires. Therefore, they recalled fires that had caused major disasters and discussed the impact these fires had on their community or country. Furthermore, they were encouraged to become aware of their own role and responsibility to prevent a fire, or to get prepared and know how to behave if a fire occurs. They realized that prevention and preparation are a matter of personal responsibility. They also felt the responsibility to inform their own families. Finally, they met important people involved in fire protection, such as rescuers and firefighters, and had the opportunity to engage in dialogue with them.
In conclusion, through child-led workshops and experiential, accessible activities, the students had the opportunity to share their own experiences, to express their views and questions, to broaden their knowledge, and to realize their personal responsibility in disaster risk reduction, specifically in relation to fires.

Keywords: accessibility, children, disasters, visual impairment

Procedia PDF Downloads 204
980 Modeling the Effects of Temperature on Ambient Air Quality Using AERMOD

Authors: Mustapha Babatunde, Bassam Tawabini, Ole John Nielson

Abstract:

Air dispersion (AD) models such as AERMOD are important tools for estimating the environmental impacts of air pollutant emissions into the atmosphere from anthropogenic sources. The outcome of these models is significantly linked to climate conditions such as air temperature, which is expected to differ in the future due to the global warming phenomenon. With projections from scientific sources of impending changes to the future climate of Saudi Arabia, especially the anticipated temperature rise, there is a potential direct impact on the dispersion patterns of air pollutants predicted by AD models. To our knowledge, no similar studies have been carried out in Saudi Arabia to investigate such an impact. Therefore, this research investigates the effects of climate temperature change on air quality in the Dammam Metropolitan Area, Saudi Arabia, using AERMOD coupled with station data, with sulphur dioxide (SO₂) as a model air pollutant. The research uses the AERMOD model to predict SO₂ dispersion trends in the surrounding area. Emissions from five (5) industrial stacks at twenty-eight (28) receptors in the study area were considered for the climate period (2010-2019) and the future mid-century period (2040-2060) under different scenarios of elevated temperature profiles (+1 °C, +3 °C, and +5 °C) across averaging periods of 1 hr, 4 hr, and 8 hr. Results showed that levels of SO₂ at the receiving sites under current and simulated future climatic conditions fall within the allowable limits of the WHO and KSA air quality standards. Results also revealed that the projected rise in temperature would only mildly increase SO₂ concentration levels: the average increase was 0.04%, 0.14%, and 0.23% for temperature increases of 1, 3, and 5 degrees, respectively.
In conclusion, the outcome of this work elucidates the degree of the effects of the global warming and climate change phenomena on air quality and can help policymakers in their decision-making, given the significant health challenges associated with ambient air pollution in Saudi Arabia.
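The reported scenario comparison boils down to a percentage-increase calculation between baseline and elevated-temperature concentrations. A sketch with placeholder concentrations, not the study's modelled values:

```python
def percent_increase(baseline, scenario):
    """Relative change of a scenario concentration vs. baseline, in %."""
    return (scenario - baseline) / baseline * 100.0

# Placeholder SO2 concentrations (µg/m3) for one receptor:
baseline_so2 = 12.50   # baseline climate period
warmer_so2 = 12.53     # hypothetical +3 °C scenario

print(round(percent_increase(baseline_so2, warmer_so2), 2))  # 0.24
```

Averaging such per-receptor changes across receptors and averaging periods is what yields summary figures like the 0.04-0.23% increases reported above.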

Keywords: air quality, sulfur dioxide, dispersion models, global warming, KSA

Procedia PDF Downloads 70
979 Ministers of Parliament and Their Official Web Sites: A New Media Tool of Political Communication

Authors: Wijayanada Rupasinghe, A. H. Dinithi Jayasekara

Abstract:

In a modern democracy, new media can be used by governments to involve citizens in decision-making, and by civil society to engage people in specific issues. However, new media can also be used to broaden political participation by helping citizens communicate with their representatives and with each other. Arguably, this political communication is most important during election campaigns, when political parties and candidates seek to mobilize citizens and persuade them to vote for a given party or candidate. New media must be used by parliaments, parliamentarians, governments, and political parties, as they are highly effective tools to involve and inform citizens in public policymaking and in the formation of governments. But all these groups must develop strategies to deal with a wide array of both positive and negative effects of these rapidly growing media. New media has begun to take precedence over other communication outlets, in part because of its heightened accessibility and usability. Using a personal website can empower the public in a way that is far faster, cheaper, and more pervasive than other forms of communication. Such sites encourage pluralism, reach young people more than other media, and encourage greater participation, accountability, and transparency. This research discusses the impact politicians’ personal websites have on their overall electability and likability and explores how website integration is an essential campaign tactic on both the local and national level. It examined the impact a personal website has on the way constituents view politicians, and how politicians can use their websites most effectively, incorporating these new media outlets as essential campaign tools and tactics. A mixed-method approach using content analysis was adopted; thirty websites of Sri Lankan politicians were selected for analysis.
Research revealed that politicians’ new media usage significantly influenced and enriched the experience an individual has with the public figure.

Keywords: election campaign, ministers, new media, parliament, politicians' websites

Procedia PDF Downloads 356
978 SAFECARE: Integrated Cyber-Physical Security Solution for Healthcare Critical Infrastructure

Authors: Francesco Lubrano, Fabrizio Bertone, Federico Stirano

Abstract:

Modern societies strongly depend on Critical Infrastructures (CI). Hospitals, power supplies, water supplies, telecommunications are just few examples of CIs that provide vital functions to societies. CIs like hospitals are very complex environments, characterized by a huge number of cyber and physical systems that are becoming increasingly integrated. Ensuring a high level of security within such critical infrastructure requires a deep knowledge of vulnerabilities, threats, and potential attacks that may occur, as well as defence and prevention or mitigation strategies. The possibility to remotely monitor and control almost everything is pushing the adoption of network-connected devices. This implicitly introduces new threats and potential vulnerabilities, posing a risk, especially to those devices connected to the Internet. Modern medical devices used in hospitals are not an exception and are more and more being connected to enhance their functionalities and easing the management. Moreover, hospitals are environments with high flows of people, that are difficult to monitor and can somehow easily have access to the same places used by the staff, potentially creating damages. It is therefore clear that physical and cyber threats should be considered, analysed, and treated together as cyber-physical threats. This means that an integrated approach is required. SAFECARE, an integrated cyber-physical security solution, tries to respond to the presented issues within healthcare infrastructures. The challenge is to bring together the most advanced technologies from the physical and cyber security spheres, to achieve a global optimum for systemic security and for the management of combined cyber and physical threats and incidents and their interconnections. Moreover, potential impacts and cascading effects are evaluated through impact propagation models that rely on modular ontologies and a rule-based engine. 
Indeed, the SAFECARE architecture foresees i) a macroblock related to the cyber security field, where innovative tools are deployed to monitor network traffic, systems and medical devices; ii) a physical security macroblock, where video management systems are coupled with access control management, building management systems and innovative AI algorithms to detect behavioural anomalies; and iii) an integration system that collects all incoming incidents, simulates their potential cascading effects, and provides alerts and updated information regarding asset availability.

Keywords: cyber security, defence strategies, impact propagation, integrated security, physical security

Procedia PDF Downloads 155
977 The Outcome of Using Machine Learning in Medical Imaging

Authors: Adel Edwar Waheeb Louka

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to improve the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19. This underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field suggests that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from those used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. It is trained on 8577 images and validated on a validation split of 20%. These models are evaluated using the external dataset for validation; the models' accuracy, precision, recall, F1-score, IOU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. 
The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928. Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
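The evaluation metrics reported above (accuracy, precision, recall, F1-score, IOU) can be made concrete with a short sketch. The functions below are illustrative only; the confusion counts and masks in the test are invented and are not taken from the study's data.

```python
def precision_recall_f1(tp, fp, fn):
    """Per-class precision, recall and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

def iou(pred_mask, true_mask):
    """Intersection-over-Union for flat binary segmentation masks."""
    inter = sum(1 for p, t in zip(pred_mask, true_mask) if p and t)
    union = sum(1 for p, t in zip(pred_mask, true_mask) if p or t)
    return inter / union if union else 1.0
```

The same formulas apply per class in a multi-class setting such as COVID-19 vs. pneumonia vs. normal, averaged across classes.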

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 57
974 Climate Change and Urban Flooding: The Need to Rethink Urban Flood Management through Resilience

Authors: Suresh Hettiarachchi, Conrad Wasko, Ashish Sharma

Abstract:

The ever-changing and expanding urban landscape increases the stress on urban systems to support and maintain safe and functional living spaces. Flooding presents one of the more serious threats to this safety, putting a larger number of people in harm’s way in congested urban settings. Climate change is adding to this stress by creating a dichotomy in the urban flood response: on the one hand, climate change is causing storms to intensify, resulting in more destructive, rarer floods, while on the other hand, longer dry periods are decreasing the severity of more frequent, less intense floods. This variability creates a need to be more agile and innovative in how we design for and manage urban flooding. Here, we argue that to cope with the challenge climate change brings, we need to move towards urban flood management through resilience rather than flood prevention. We also argue that dealing with the larger variation in flood response under climate change means that we need to look at flooding from all aspects rather than the single-dimensional focus on flood depths and extents. In essence, we need to rethink how we manage flooding in the urban space. This change in our thought process and approach to flood management requires a practical way to assess and quantify the resilience that is built into the urban landscape, so that informed decision-making can support the required changes in planning and infrastructure design. Towards that end, we propose a Simple Urban Flood Resilience Index (SUFRI) based on a robust definition of resilience as a tool to assess flood resilience. The application of a simple resilience index such as the SUFRI can provide a practical tool that considers urban flood management in a multi-dimensional way and can present solutions that were not previously considered. 
When such an index is grounded on a clear and relevant definition of resilience, it can be a reliable and defensible way to assess and assist the process of adapting to the increasing challenges in urban flood management with climate change.
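The abstract does not give the SUFRI formula, so the following is only a hypothetical sketch of how a simple multi-dimensional resilience index might aggregate normalised indicator scores; every indicator name and weight below is an assumption, not the authors' specification.

```python
def resilience_index(indicators, weights):
    """Weighted mean of indicator scores, each normalised to [0, 1]."""
    assert set(indicators) == set(weights), "indicators and weights must match"
    total_w = sum(weights.values())
    return sum(indicators[k] * weights[k] for k in indicators) / total_w

# Hypothetical site scores and weights (illustrative only).
site = {"drainage_capacity": 0.6, "warning_system": 0.8,
        "recovery_resources": 0.5}
w = {"drainage_capacity": 0.5, "warning_system": 0.3,
     "recovery_resources": 0.2}
```

A higher index would indicate a more flood-resilient urban configuration; in practice the indicator set, normalisation, and weighting would come from the resilience definition the paper adopts.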

Keywords: urban flood resilience, climate change, flood management, flood modelling

Procedia PDF Downloads 40
975 Effect of Environmental Parameters on the Water Solubility of the Polycyclic Aromatic Hydrocarbons and Derivatives using Taguchi Experimental Design Methodology

Authors: Pranudda Pimsee, Caroline Sablayrolles, Pascale De Caro, Julien Guyomarch, Nicolas Lesage, Mireille Montréjaud-Vignoles

Abstract:

The MIGR’HYCAR research project was initiated to provide decisional tools for risks connected to oil spill drifts in continental waters. These tools aim to serve in the decision-making process once oil spill pollution occurs and/or as reference tools to study scenarios of potential impacts of pollution on a given site. This paper focuses on the distribution of polycyclic aromatic hydrocarbons (PAHs) and derivatives from oil spills in water as a function of environmental parameters. Eight petroleum oils covering a representative range of commercially available products were tested. 41 PAHs and derivatives, among them 16 EPA priority pollutants, were studied by dynamic tests at laboratory scale. The chemical profile of the water-soluble fraction differed from the parent oil profile due to the varying water solubilities of the oil components. Semi-volatile compounds (naphthalenes) constitute the major part of the water-soluble fraction. A large variation in the composition of the water-soluble fraction was highlighted depending on oil type. Moreover, four environmental parameters (temperature, suspended solid quantity, salinity, and oil:water surface ratio) were investigated with the Taguchi experimental design methodology. The results showed that the oils divide into three groups: the solubility of domestic fuel and Jet A1 presented a high sensitivity to the parameters studied, meaning they must be taken into account. For gasoline (SP95-E10) and diesel fuel, a medium sensitivity to the parameters was observed. The four other oils showed low sensitivity to the parameters studied. Finally, three parameters were found to be significant towards the water-soluble fraction.
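A minimal sketch of the Taguchi main-effects calculation may clarify the methodology used here: each factor's effect is the difference in mean response between its levels across an orthogonal array of runs. The array and response values below are invented for illustration (a small L4 array with 3 two-level factors, not the study's 4-parameter design or data).

```python
# Each row: (factor levels for 3 two-level factors), measured response.
L4 = [
    ((0, 0, 0), 12.0),
    ((0, 1, 1), 15.0),
    ((1, 0, 1), 9.0),
    ((1, 1, 0), 11.0),
]

def main_effect(runs, factor):
    """Mean response at level 1 minus mean response at level 0."""
    lo = [y for levels, y in runs if levels[factor] == 0]
    hi = [y for levels, y in runs if levels[factor] == 1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)
```

The orthogonality of the array lets each main effect be estimated from only four runs instead of a full factorial, which is what makes the Taguchi approach attractive for screening environmental parameters.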

Keywords: monitoring, PAHs, water soluble fraction, SBSE, Taguchi experimental design

Procedia PDF Downloads 310
974 A Qualitative Study into the Success and Challenges in Embedding Evidence-Based Research Methods in Operational Policing Interventions

Authors: Ahmed Kadry, Gwyn Dodd

Abstract:

There has been a growing call globally for police forces to embed evidence-based policing research methods into police interventions in order to better understand and evaluate their impact. This research study highlights the success and challenges that police forces may encounter when trying to embed evidence-based research methods within their organisation. 10 in-depth qualitative interviews were conducted with police officers and staff at Greater Manchester Police (GMP) who were tasked with integrating evidence-based research methods into their operational interventions. The findings of the study indicate that with adequate resources and individual expertise, evidence-based research methods can be applied to operational work, including the testing of initiatives with strict controls in order to fully evaluate the impact of an intervention. However, the findings also indicate that this may only be possible where an operational intervention is heavily resourced with police officers and staff who have a strong understanding of evidence-based policing research methods, attained for example through their own graduate studies. In addition, the findings reveal that ample planning time was needed to trial operational interventions that would require strict parameters for what would be tested and how it would be evaluated. In contrast, interviewees underscored that operational interventions with the need for a speedy implementation were less likely to have evidence-based research methods applied. The study contributes to the wider literature on evidence-based policing by providing considerations for police forces globally wishing to apply evidence-based research methods to more of their operational work in order to understand their impact. The study also provides considerations for academics who work closely with police forces in assisting them to embed evidence-based policing. 
This includes how academics can provide their expertise to police decision makers wanting to underpin their work through evidence-based research methods, such as providing guidance on how to evaluate the impact of their work with varying research methods that they may otherwise be unaware of.  

Keywords: evidence based policing, evidence-based practice, operational policing, organisational change

Procedia PDF Downloads 129
973 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is now well established, yet properly classifying this textual information in a given context remains difficult. As a result, we conducted a systematic review of previous literature on sentiment classification and the AI-based techniques that have been used, in order to better understand how to design and develop a robust and more accurate sentiment classifier that can correctly classify social media text of a given context between hate speech and inverted compliments with a high level of accuracy, by assessing different artificial intelligence techniques. We evaluated over 250 articles from digital sources like ScienceDirect, ACM, Google Scholar, and IEEE Xplore and narrowed them down to 31 studies. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of performance accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources like Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI-oriented library support. Based on some of the important findings from this study, we made recommendations for future research.

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 107
972 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution

Authors: Masomeh Jamshid Nejad

Abstract:

Nowadays, statistical literacy is no longer merely a useful skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called a bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in economics and business management to visualize and work with data following a normal distribution. Since technology is now interconnected with education, it is important to teach statistics topics to undergraduate students in the context of tools such as Python, R-Studio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners’ knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students (from the Business Management program) were compared in this study: one group underwent Excel-based instruction while the other relied only on traditional teaching methods. We analyzed experimental data and BBA participants’ responses to statistics-related questions focusing on the normal distribution, including its key attributes, such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning supports learners in comprehending statistical concepts more effectively compared with the group taught by the traditional method. In addition, students receiving Excel-based instruction showed greater ability in visualizing and interpreting data that follow a normal distribution.
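The normal-distribution quantities the students work with have direct Excel formula equivalents (AVERAGE, STDEV.S, and NORM.DIST with cumulative=TRUE). As an illustration only, the same computations can be sketched in a few lines of Python; the data in the test are invented.

```python
import math

def mean_and_sd(xs):
    """Sample mean and sample standard deviation (Excel: AVERAGE, STDEV.S)."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return m, math.sqrt(var)

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ N(mu, sigma^2); Excel: NORM.DIST(x, mu, sigma, TRUE)."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))
```

For example, normal_cdf(mu + 1.96 * sigma, mu, sigma) is close to 0.975, which is the tail value students meet when constructing 95% confidence intervals.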

Keywords: statistics, Excel-based instruction, data visualization, pedagogy

Procedia PDF Downloads 46
971 The Influence of Travel Experience within Perceived Public Transport Quality

Authors: Armando Cartenì, Ilaria Henke

Abstract:

Perceived public transport quality is an important driver that influences both customer satisfaction and mobility choices. Competition among transport operators creates a need to improve the quality of services and to identify which attributes are perceived as relevant by passengers. Among the “traditional” public transport quality attributes are, for example, travel and waiting time, regularity of the services, and ticket price. By contrast, there are some “non-conventional” attributes that could significantly influence customer satisfaction jointly with the “traditional” ones. Among these, the beauty/aesthetics of the transport terminals (e.g. rail stations and bus terminals) is probably one of the most impactful on user perception. Starting from these considerations, the point stressed in this paper was whether (and how much) the experience of the overall journey (e.g. how long the travel is, how many transport modes must be used) influences the perception of public transport quality. The aim of this paper was to investigate the weight of terminal quality (e.g. aesthetics, comfort and services offered) within the overall travel experience. The case study was the extra-urban Italian bus network. Passengers at the major Italian bus terminal were interviewed, and the analysis of the results shows that about 75% of travellers are willing to pay up to 30% more on the ticket price for a high-quality terminal. A travel experience effect was observed: the average perceived transport quality varies with the characteristics of the overall trip. Passengers on a “long trip” (travel time greater than 2 hours) perceived the overall quality of the trip as “low” even if they passed through a high-quality terminal; the opposite occurs for “short trip” passengers. This means that if travellers pass through a high-quality station, their overall perception of that terminal can be significantly reduced if they are tired from a long trip. 
This result is important and, if confirmed through other case studies, will allow us to conclude that the “travel experience impact” must be considered as an explicit design variable for public transport services and planning.
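The keywords mention a discrete choice model; a hypothetical binary-logit sketch illustrates how the willingness to pay for a high-quality terminal could be represented as a function of the price premium and trip duration. All coefficient values below are invented for illustration and are not estimates from the survey.

```python
import math

def logit_prob(utility_diff):
    """Binary-logit choice probability from a utility difference."""
    return 1 / (1 + math.exp(-utility_diff))

def choose_quality_terminal(price_premium_pct, travel_hours,
                            b_price=-0.05, b_time=-0.4, asc=2.0):
    """Probability of choosing the high-quality terminal alternative.

    asc is a hypothetical alternative-specific constant capturing the
    terminal's aesthetic/comfort appeal; b_time encodes the observed
    'travel experience effect' (longer trips lower the perceived value).
    """
    u = asc + b_price * price_premium_pct + b_time * travel_hours
    return logit_prob(u)
```

In such a specification, the finding that long-trip passengers discount even a high-quality terminal corresponds to a negative travel-time coefficient eroding the terminal's constant.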

Keywords: transportation planning, sustainable mobility, decision support system, discrete choice model, design problem

Procedia PDF Downloads 286
970 The Investigation of Oil Price Shocks by Using a Dynamic Stochastic General Equilibrium: The Case of Iran

Authors: Bahram Fathi, Karim Alizadeh, Azam Mohammadbagheri

Abstract:

The aim of this paper is to investigate the role of oil price shocks in explaining business cycles in Iran using a dynamic stochastic general equilibrium (DSGE) approach. This model incorporates both productivity and oil revenue shocks. The results indicate that productivity shocks are relatively more important to business cycles than oil shocks. The model with two shocks produces different values for volatility, but these values have the same ranking as that of the actual data for most variables. In addition, the actual data are close to the ratio of standard deviations to the output obtained from the model with two shocks. The model with only a productivity shock produces figures most similar in volatility magnitude to those of the actual data. Next, we use Impulse Response Functions (IRFs) to evaluate the capability of the model. The IRFs show no effect of an oil shock on the capital stock or on labor hours, which is a feature of the model: when the log-linearized system of equations is solved numerically, investment and labor hours are not functions of the oil shock. This research recommends using different techniques to check the model’s robustness. One method is to make all decision variables functions of the oil shock by inducing stationarity in the model differently; another is to impose a bond adjustment cost. This study intends to fill that gap. To achieve this objective, we first derive a DSGE model that allows for world oil price and productivity shocks. Second, we calibrate the model to the Iranian economy. Next, we compare the moments from the theoretical model with both single and multiple shocks with those obtained from the actual data to see the extent to which business cycles in Iran can be explained by the total oil revenue shock. 
Then, we use an impulse response function to evaluate the role of world oil price shocks. Finally, we present implications of the findings and interpretations in accordance with economic theory.
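The impulse responses discussed above are built from persistent shock processes. A minimal sketch of the impulse response function of an AR(1) oil-price process is given below; the persistence value in the test is assumed for illustration and is not the paper's calibration.

```python
def impulse_response(rho, horizon, shock=1.0):
    """IRF of x_t = rho * x_{t-1} + e_t to a one-time unit shock e_0.

    Returns the path [x_0, x_1, ..., x_{horizon-1}], which decays
    geometrically at rate rho for a stationary process (|rho| < 1).
    """
    path = [shock]
    for _ in range(horizon - 1):
        path.append(rho * path[-1])
    return path
```

In a full DSGE solution the IRF of each endogenous variable is a linear combination of such decaying shock paths, which is why a variable whose policy rule has a zero loading on the oil shock, as found here for investment and labor hours, shows a flat IRF.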

Keywords: oil price, shocks, dynamic stochastic general equilibrium, Iran

Procedia PDF Downloads 424
969 Tiger Team Strategy as a Health District Response to the COVID-19 Pandemic in Sydney, Australia during the Period between March 2020 to January 2022

Authors: Rehana Khan

Abstract:

Background: The study investigates the experiences of Tiger Teams within the Sydney Local Health District during the COVID-19 pandemic. Aim: The aims were to understand the experiences of the Tiger Team members, to evaluate the effectiveness of Tiger Teams, and to elicit any learnings for future implementation of Tiger Teams in a similar context. Methods: Tiger Team members who worked from March 2020 to January 2022 were approached, with 23 members agreeing to participate in the study. Individual interviews were undertaken by a researcher on a virtual platform. Thematic analysis was used to analyse the data. Saturation was deemed to have been reached when no new themes or subthemes arose within the final three interviews. Results: Four themes emerged: diversity worked well in Tiger Teams; fear of the unknown and challenging conversations were the main challenges of Tiger Teams; improved use of resources and more structure around the strategy of the Tiger Team model would help in future implementations; and the Sydney Local Health District’s response to the pandemic was uniformly considered effective in keeping the community safe. In relation to Sydney Local Health District’s response in future pandemics, participants suggested having a pool of staff in readiness to undertake Tiger Team duties when required; prioritising staff welfare at all levels of involvement during a pandemic; maintaining transparent communication and relationship building between the Executive level, Tiger Team members and the clinical floor level in relation to decision making; and improving documentation, including evaluations of the COVID-19 pandemic response. Implications: The study provides constructive insights into the experiences of Tiger Team members, and these findings will help inform future planning for surge and secondment of staff in public health emergencies.

Keywords: Tiger Team, pandemic response, future planning, COVID-19

Procedia PDF Downloads 70
968 Community Involvement in Reducing Maternal and Perinatal Mortality in Cross River State, Nigeria: 'The Saving Mother Giving Life' Strategic Approach in Cross River State

Authors: Oluwayemisi Femi-Pius, Kazeem Arogundade, Eberechukwu Eke, Jimmy Eko

Abstract:

Introduction: Community involvement in improving people’s own health has been widely adopted as a strategy across Sub-Saharan Africa, principally to ensure equitable access to essential health care and to improve the uptake of maternal and newborn health services, especially in poor-resource settings. Method: The Saving Mother Giving Life (SMGL) Initiative, implemented by Pathfinder International with funding support from USAID, conducted a Health Facility Assessment (HFA) and found that the maternal mortality ratio in Cross River State was 812 per 100,000 live births and perinatal mortality was 160 per 1,000 live births. To reduce maternal and perinatal mortality, Pathfinder International mobilized, selected and trained community members as community volunteers, traditional birth attendants, and emergency transport service volunteer drivers, mainly to address the delays in decision making and in reaching the health facility among pregnant women. Results: The results showed that the maternal mortality ratio in Cross River State decreased to about 25% of its baseline value, from 812 per 100,000 live births to 206 per 100,000 live births by June 2018, and perinatal mortality was reduced to about 35% of its baseline value, from 160 per 1,000 live births to 58 per 1,000 live births by June 2018. Data also show that ANC visits increased from 7,451 to 11,344, and institutional deliveries increased from 8,931 at baseline to 10,784 in June 2018. There was also a remarkable uptake of post-partum family planning, from 0 at baseline to 233 in June 2018. Conclusion: There is clear evidence that community involvement yields positive maternal outcomes and is pivotal for sustaining most health interventions.

Keywords: maternal mortality, Nigeria, pathfinder international, perinatal mortality, saving mother giving life

Procedia PDF Downloads 180
967 Impact of Work Cycles on Autonomous Digital Learning

Authors: Bi̇rsen Tutunis, Zuhal Aydin

Abstract:

Guided digital learning has attracted many researchers as it leads to autonomous learning. Developments in guided digital learning have led to changes in teaching and learning in English Language Teaching classes (Jeong-Bae, 2014). This study reports on tasks designed under the principles of learner autonomy on the online learning platform ‘Webquest’, with the purpose of teaching English to Turkish tertiary-level students at a foundation university in Istanbul. The guided digital learning blog project contents were organized according to work-cycle phases (planning and negotiation phase, decision-making phase, project phase and evaluation phase), which are compatible with the principles of autonomous learning (Legenhausen, 2003). The aim of the study was to implement the class blog project and to find out its impact on students’ behaviours and beliefs towards autonomous learning. A mixed-method research approach was taken. 24 tertiary-level students participated in the study on a voluntary basis. Data analysis was performed with the Statistical Package for the Social Sciences. According to the results, students’ attitudes towards digital learning did not differ before and after the training application, but their learning styles and digital learning knowledge scores did: both increased after the training application. Autonomous beliefs, autonomous behaviours, group cohesion and group norms differed before and after the training application. Students’ motivation levels, strategies for learning English, perceptions of responsibility and out-of-class activity scores also differed before and after the training application. It was seen that work cycles in online classes create student-centred learning that fosters autonomy. This paper will present the work cycles in detail, and the researchers will give examples of in-class and beyond-class activities and blog projects.

Keywords: guided digital learning, work cycles, English language teaching, autonomous learning

Procedia PDF Downloads 64
966 Modeling the Effects of Temperature on Air Pollutant Concentration

Authors: Mustapha Babatunde, Bassam Tawabini, Ole John Nielson

Abstract:

Air dispersion (AD) models such as AERMOD are important tools for estimating the environmental impacts of air pollutant emissions into the atmosphere from anthropogenic sources. The outcome of these models is significantly linked to climatic conditions such as air temperature, which is expected to differ in the future due to the global warming phenomenon. With scientific projections of impending changes to the future climate of Saudi Arabia, especially an anticipated temperature rise, there is a potential direct impact on the air pollutant dispersion patterns produced by AD models. To our knowledge, no similar studies have been carried out in Saudi Arabia to investigate such an impact. Therefore, this research investigates the effects of climate temperature change on air quality in the Dammam Metropolitan area, Saudi Arabia, using AERMOD coupled with station data, with sulphur dioxide (SO2) as a model air pollutant. The research uses AERMOD to predict SO2 dispersion trends over the surrounding area. Emissions from five (5) industrial stacks, at twenty-eight (28) receptors in the study area, were considered for the climate period (2010-2019) and the future mid-century period (2040-2060) under different scenarios of elevated temperature profiles (+1°C, +3°C and +5°C) across averaging periods of 1 hr, 4 hr and 8 hr. Results showed that levels of SO2 at the receiving sites under current and simulated future climatic conditions fall within the allowable limits of the WHO and KSA air quality standards. Results also revealed that the projected rise in temperature would produce only a mild increment in SO2 concentration levels: the average increases in SO2 levels were 0.04%, 0.14%, and 0.23% for temperature increases of 1, 3, and 5 degrees, respectively. 
In conclusion, the outcome of this work elucidates the degree of the effects of the global warming and climate change phenomena on air quality and can help policymakers in their decision-making, given the significant health challenges associated with ambient air pollution in Saudi Arabia.
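AERMOD itself is far more sophisticated, but the textbook Gaussian plume equation sketched below illustrates why meteorological inputs (wind speed and the stability-dependent dispersion coefficients) drive the predicted SO2 concentrations at a receptor. All parameter values in the test are illustrative, not the study's inputs.

```python
import math

def gaussian_plume(q, u, sigma_y, sigma_z, y, z, h):
    """Steady-state Gaussian plume concentration at a downwind receptor.

    q: emission rate (g/s), u: wind speed (m/s), h: effective stack
    height (m), y: crosswind distance (m), z: receptor height (m);
    sigma_y/sigma_z: dispersion coefficients for the given downwind
    distance and atmospheric stability class.
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # Vertical term includes the ground-reflection image source.
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

Because concentration is inversely proportional to wind speed and to the dispersion coefficients, temperature-driven changes in atmospheric stability translate directly into changes in modelled receptor concentrations.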

Keywords: air quality, sulphur dioxide, global warming, air dispersion model

Procedia PDF Downloads 126
965 Contribution of PALB2 and BLM Mutations to Familial Breast Cancer Risk in BRCA1/2 Negative South African Breast Cancer Patients Detected Using High-Resolution Melting Analysis

Authors: N. C. van der Merwe, J. Oosthuizen, M. F. Makhetha, J. Adams, B. K. Dajee, S-R. Schneider

Abstract:

Women representing high-risk breast cancer families, who test negative for pathogenic mutations in BRCA1 and BRCA2, are four times more likely to develop breast cancer compared to women in the general population. Sequencing of genes involved in genomic stability and DNA repair has led to the identification of novel contributors to familial breast cancer risk. These include BLM and PALB2. Bloom’s syndrome is a rare homozygous autosomal recessive chromosomal instability disorder with a high incidence of various types of neoplasia; BLM is associated with breast cancer when in a heterozygous state. PALB2, on the other hand, binds to BRCA2, and together they partake actively in DNA damage repair. Archived DNA samples of 66 BRCA1/2-negative high-risk breast cancer patients were retrospectively selected based on the presence of an extensive family history of the disease (>3 affected members per family). All coding regions and splice-site boundaries of both genes were screened using High-Resolution Melting Analysis. Samples exhibiting variation were bi-directionally Sanger sequenced on an automated platform. The clinical significance of each variant was assessed using various in silico and splice-site prediction algorithms. Comprehensive screening identified a total of 11 BLM and 26 PALB2 variants. The variants detected ranged from globally common to rare and included three novel mutations. Three BLM and two PALB2 likely pathogenic mutations were identified that could account for the disease in these extensive breast cancer families in the absence of BRCA mutations (BLM c.11T>A, p.V4D; BLM c.2603C>T, p.P868L; BLM c.3961G>A, p.V1321I; PALB2 c.421C>T, p.Gln141Ter; PALB2 c.508A>T, p.Arg170Ter). Conclusion: The study confirmed the contribution of pathogenic mutations in BLM and PALB2 to the familial breast cancer burden in South Africa, explaining the presence of the disease in 7.5% of the BRCA1/2-negative families with an extensive family history of breast cancer. 
Segregation analysis will be performed to confirm the clinical impact of these mutations for each of these families. These results justify the inclusion of both genes in a comprehensive breast and ovarian next-generation sequencing cancer panel; they should be screened simultaneously with BRCA1 and BRCA2, as this might explain a significant percentage of familial breast and ovarian cancer in South Africa.
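The HGVS-style coding variant names listed above (e.g. BLM c.11T>A) follow a regular pattern for simple substitutions. The small parser below is illustrative only and is not part of the authors' analysis pipeline; it handles only single-nucleotide substitutions, not the full HGVS nomenclature.

```python
import re

# Matches simple coding substitutions like "c.11T>A" (position, ref, alt).
VARIANT_RE = re.compile(r"c\.(\d+)([ACGT])>([ACGT])")

def parse_variant(hgvs):
    """Return (position, ref_base, alt_base) from a simple substitution."""
    m = VARIANT_RE.fullmatch(hgvs.replace(" ", ""))
    if not m:
        raise ValueError(f"unsupported variant notation: {hgvs}")
    return int(m.group(1)), m.group(2), m.group(3)
```

Such normalisation is useful when cross-referencing screening results against variant databases, where spacing and notation inconsistencies are common.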

Keywords: Bloom Syndrome, familial breast cancer, PALB2, South Africa

Procedia PDF Downloads 221
964 Investigating the Influence of Activation Functions on Image Classification Accuracy via Deep Convolutional Neural Network

Authors: Gulfam Haider, Sana Danish

Abstract:

Convolutional Neural Networks (CNNs) have emerged as powerful tools for image classification, and the choice of optimizer profoundly affects their performance. The study of optimizers and their adaptations remains a topic of significant importance in machine learning research. While numerous studies have explored and advocated for various optimizers, the efficacy of these optimization techniques is still subject to scrutiny. This work aims to address the challenges surrounding the effectiveness of optimizers by conducting a comprehensive analysis and evaluation. The primary focus of this investigation lies in examining the performance of different optimizers when employed in conjunction with the popular activation function, the Rectified Linear Unit (ReLU). By incorporating ReLU, known for its favorable properties in prior research, the aim is to bolster the effectiveness of the optimizers under scrutiny. Specifically, we evaluate these optimizers in combination with both the original Softmax activation function and the modified ReLU activation function, carefully assessing their impact on overall performance. To achieve this, a series of experiments is conducted using a well-established benchmark dataset for image classification tasks, namely the Canadian Institute for Advanced Research dataset (CIFAR-10). The optimizers selected for investigation encompass a range of prominent algorithms, including Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD). The performance analysis encompasses a comprehensive evaluation of the classification accuracy, convergence speed, and robustness of the CNN models trained with each optimizer. Through rigorous experimentation and meticulous assessment, we discern the strengths and weaknesses of the different optimization techniques, providing valuable insights into their suitability for image classification tasks. 
By conducting this in-depth study, we contribute to the existing body of knowledge surrounding optimizers in CNNs, shedding light on their performance characteristics for image classification. The findings gleaned from this research serve to guide researchers and practitioners in making informed decisions when selecting optimizers and activation functions, thus advancing the state-of-the-art in the field of image classification with convolutional neural networks.
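As a rough illustration of how the compared optimizers differ, the sketch below (not the study's code; a toy one-dimensional quadratic stands in for the CNN, with commonly used default decay rates) implements the textbook update rules of SGD, Adagrad, RMSprop, Adadelta, and Adam:

```python
import numpy as np

def optimize(rule, grad_fn, x0, steps=200, lr=0.1, eps=1e-8):
    """Minimize a 1-D objective with one of five first-order update rules."""
    x = float(x0)
    m = v = 0.0      # first/second-moment accumulators (Adam, RMSprop, Adagrad)
    acc_dx = 0.0     # running average of squared updates (Adadelta)
    for t in range(1, steps + 1):
        g = grad_fn(x)
        if rule == "sgd":
            dx = -lr * g
        elif rule == "adagrad":
            v += g * g                      # accumulate all squared gradients
            dx = -lr * g / (np.sqrt(v) + eps)
        elif rule == "rmsprop":
            v = 0.9 * v + 0.1 * g * g       # exponentially decayed accumulation
            dx = -lr * g / (np.sqrt(v) + eps)
        elif rule == "adadelta":
            v = 0.9 * v + 0.1 * g * g
            dx = -np.sqrt(acc_dx + eps) / np.sqrt(v + eps) * g
            acc_dx = 0.9 * acc_dx + 0.1 * dx * dx
        elif rule == "adam":
            m = 0.9 * m + 0.1 * g           # beta1 = 0.9
            v = 0.999 * v + 0.001 * g * g   # beta2 = 0.999
            m_hat = m / (1 - 0.9 ** t)      # bias correction
            v_hat = v / (1 - 0.999 ** t)
            dx = -lr * m_hat / (np.sqrt(v_hat) + eps)
        x += dx
    return x

# Toy objective f(x) = (x - 3)^2 with gradient 2(x - 3); the minimum is at x = 3.
grad = lambda x: 2.0 * (x - 3.0)
results = {r: optimize(r, grad, x0=0.0)
           for r in ["sgd", "adagrad", "rmsprop", "adadelta", "adam"]}
print(results)
```

Even on this toy objective, the behaviors the study measures on CIFAR-10 are visible: SGD with a well-chosen learning rate converges rapidly, while Adadelta's self-scaling updates start very small and make little progress in the same budget, which is why convergence speed is worth comparing empirically.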

Keywords: deep neural network, optimizers, RMSprop, ReLU, stochastic gradient descent

Procedia PDF Downloads 110
963 Comparison of Parametric and Bayesian Survival Regression Models in Simulated and HIV Patient Antiretroviral Therapy Data: Case Study of Alamata Hospital, North Ethiopia

Authors: Zeytu G. Asfaw, Serkalem K. Abrha, Demisew G. Degefu

Abstract:

Background: HIV/AIDS remains a major public health problem in Ethiopia, heavily affecting people of productive and reproductive age. We aimed to compare the performance of parametric survival analysis and Bayesian survival analysis in simulations and in a real dataset, focusing on determining predictors of HIV patient survival. Methods: Parametric survival models with Exponential, Weibull, Log-normal, Log-logistic, Gompertz, and Generalized gamma distributions were considered. A simulation study was carried out under two prior specifications, informative and noninformative. A retrospective cohort study was conducted on HIV-infected patients under Highly Active Antiretroviral Therapy in Alamata General Hospital, North Ethiopia. Results: A total of 320 HIV patients were included in the study, of whom 52.19% were female and 47.81% male. According to the Kaplan-Meier survival estimates for the two sex groups, females showed better survival times than their male counterparts. The median survival time of HIV patients was 79 months. During the follow-up period, 89 (27.81%) deaths occurred and 231 (72.19%) individuals were censored. The average baseline cluster of differentiation 4 (CD4) cell count for HIV/AIDS patients was 126.01, but after three years of antiretroviral therapy follow-up it had risen to 305.74, which was quite encouraging. Age, functional status, tuberculosis screening, past opportunistic infection, baseline CD4 cell count, World Health Organization clinical stage, sex, marital status, employment status, occupation type, and baseline weight were found to be statistically significant factors for longer survival of HIV patients. The standard errors of all covariates in the Bayesian log-normal survival model were smaller than those in the classical one. 
Hence, Bayesian survival analysis showed better performance than classical parametric survival analysis when subjective data analysis was performed by incorporating expert opinion and historical knowledge about the parameters. Conclusions: HIV/AIDS patient mortality could be reduced through timely antiretroviral therapy with special attention to the potential factors. Moreover, the Bayesian log-normal survival model was preferable to the classical log-normal survival model for determining predictors of HIV patient survival.
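Kaplan-Meier estimates and median survival times of the kind reported above can be computed with a short routine. The following is a minimal Python sketch on hypothetical follow-up data (not the Alamata cohort):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimator: survival probability at each distinct event time.
    times: follow-up time per subject; events: 1 = death observed, 0 = censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                # subjects still under observation
        d = np.sum((times == t) & (events == 1))    # deaths at time t
        s *= 1.0 - d / at_risk
        surv.append((t, s))
    return surv

def median_survival(surv):
    """First event time at which the estimated survival drops to 0.5 or below."""
    for t, s in surv:
        if s <= 0.5:
            return t
    return None  # median not reached within follow-up

# Hypothetical cohort: months of follow-up and event indicators.
times  = [6, 13, 21, 30, 31, 37, 38, 47, 52, 60]
events = [1,  1,  0,  1,  1,  0,  1,  1,  0,  1]
km = kaplan_meier(times, events)
print(median_survival(km))
```

Censored subjects leave the risk set without counting as deaths, which is exactly what distinguishes this estimator from a naive proportion surviving.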

Keywords: antiretroviral therapy (ART), Bayesian analysis, HIV, log-normal, parametric survival models

Procedia PDF Downloads 180
962 Maximizing the Community Services of Multi-Location Public Facilities in Urban Residential Areas by the Use of Constructing the Accessibility Index and Spatial Buffer Zone

Authors: Yen-Jong Chen, Jei-An Su

Abstract:

Public use facilities provide the basic infrastructure supporting the needs of urban sustainable development. These facilities include roads (streets), parking areas, green spaces, public schools, and city parks. However, how to acquire land of the proper location and size remains uncertain in a capitalist economy where land is largely privately owned, as in the cities of Taiwan. The proper acquisition of reserved land for local public facilities (RLPF) has been continuously debated by the Taiwanese government for more than 30 years. Recently, the government has been re-evaluating projects connected with existing RLPF policies from the viewpoint of local residents' needs, including the living environments of older adults. This challenging task includes addressing the requests of official bureau administrators, citizens whose property rights and current use status are affected, and other stakeholders, along with the means of development. To simplify the decision to acquire or release public land, we selected only public facilities needed for community life, including parks, green spaces, plaza squares, and land for kindergartens, schools, and local stadiums; this study categorized these spaces as the community's "leisure public facilities" (LPF). By constructing an accessibility index for the services of such multi-function facilities, we computed and produced a GIS map of spatial buffer zones for each LPF. Through these procedures, the service needs provided by each LPF were clearly identified, and spatial buffer zone envelope mapping was then used to evaluate these service areas. The results obtained can help decide which RLPF should be acquired or released so that community services can be maximized under a limited budget.
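The abstract does not specify the form of the accessibility index, so the following is only a hedged sketch of one simple coverage-style variant: each facility whose circular service buffer contains a residence contributes its service weight to that residence's score. All coordinates, radii, and weights are hypothetical.

```python
import math

def accessibility_index(point, facilities):
    """Coverage-style accessibility score for a residential point.
    facilities: iterable of (x, y, service_radius, service_weight); a facility
    contributes its weight if the point lies inside its circular buffer."""
    score = 0.0
    for fx, fy, radius, weight in facilities:
        if math.hypot(point[0] - fx, point[1] - fy) <= radius:
            score += weight
    return score

# Hypothetical leisure public facilities: (x, y, service radius in km, weight).
facilities = [
    (0.0, 0.0, 1.0, 2.0),   # park
    (2.0, 0.0, 1.5, 1.0),   # plaza square
    (5.0, 5.0, 0.5, 3.0),   # stadium
]
home = (1.0, 0.0)
print(accessibility_index(home, facilities))  # → 3.0 (park + plaza buffers overlap here)
```

In a GIS workflow the same idea is typically expressed as buffer-and-overlay operations on facility layers; evaluating the score over a grid of residential points yields the service-area map described in the abstract.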

Keywords: urban public facilities, community demand, accessibility, spatial buffer zone, Taiwan

Procedia PDF Downloads 69
961 Parental Awareness and Willingness to Vaccinate Adolescent Daughters against Human Papilloma Virus for Cervical Cancer Prevention in Eastern Region of Kenya: Towards Affirmative Action

Authors: Jacinta Musyoka, Wesley Too

Abstract:

Cervical cancer is the second leading cause of cancer-related deaths in Kenya and the second most common cancer among women, yet it is preventable through the strategies put in place, which include vaccination of young adolescent girls with the Human Papilloma Virus vaccine (HPVV). Kenya carries a high burden of cervical cancer, a leading cause of death among women of reproductive age, and this burden is expected to double by 2025 if the necessary steps, namely vaccinating girls between the ages of 9 and 14 and screening women, are not taken. Parental decisions are critical in ensuring that daughters receive this vaccine. Hence, this study sought to establish parental willingness, and the factors associated with acceptability, to vaccinate adolescent daughters against the human papilloma virus for cervical cancer prevention in Machakos County, Eastern Region of Kenya. Method: A cross-sectional study design with a mixed-methods approach was used to collect data at Nguluni Health Centre in Matungulu sub-county, Machakos County, Kenya. The study targeted parents aged 18 years and above of adolescent girls seeking health care services in the Matungulu sub-county area. A total of 220 parents with adolescent girls aged 10-14 years were enrolled after informed consent was obtained, and all ethical considerations were observed. Quantitative data were analyzed using multivariate regression analysis, and thematic analysis was used for qualitative data on parents' perceptions of HPVV. Results, conclusions, and recommendations are ongoing; we expect to report the findings and articulate the study's contributions before October 2022.
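As a minimal illustration of the kind of association analysis such a study reports (the counts below are hypothetical, not study data), an odds ratio for willingness to vaccinate by parental awareness can be computed from a 2x2 table:

```python
import math

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table [[a, b], [c, d]]:
    rows = exposure (aware / unaware), columns = outcome (willing / unwilling)."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """Woolf (log-method) 95% confidence interval for the odds ratio."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))

# Hypothetical counts: parental awareness of cervical cancer vs. willingness.
aware_willing, aware_unwilling = 90, 30
unaware_willing, unaware_unwilling = 40, 60
or_ = odds_ratio(aware_willing, aware_unwilling, unaware_willing, unaware_unwilling)
ci = or_ci95(aware_willing, aware_unwilling, unaware_willing, unaware_unwilling)
print(round(or_, 2), ci)
```

A multivariate (e.g., logistic) regression generalizes this by adjusting each such association for the other measured factors simultaneously.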

Keywords: adolescents, human papilloma virus, Kenya, parents

Procedia PDF Downloads 104