Search results for: artificial agency
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2868

2268 Medical Diagnosis of Retinal Diseases Using Artificial Intelligence Deep Learning Models

Authors: Ethan James

Abstract:

Over one billion people worldwide suffer from some level of vision loss or blindness as a result of progressive retinal diseases. Many patients, particularly in developing areas, are misdiagnosed or not diagnosed at all because of inadequate diagnostic tools and screening methods. Artificial intelligence (AI) based on deep learning (DL) convolutional neural networks (CNN) has recently attracted strong interest in ophthalmology for computer-based imaging diagnosis, disease prognosis, and risk assessment. Optical coherence tomography (OCT) is a popular imaging technique used to capture high-resolution cross-sections of the retina. In ophthalmology, DL has been applied to fundus photographs, optical coherence tomography, and visual fields, achieving robust classification performance in the detection of various retinal diseases, including macular degeneration, diabetic retinopathy, and retinitis pigmentosa. However, there is still no complete diagnostic model for analyzing these retinal images that provides a diagnostic accuracy above 90%. Thus, the purpose of this project was to develop an AI model that uses machine learning techniques to automatically diagnose specific retinal diseases from OCT scans. The algorithm consists of a neural network architecture based on residual neural networks with cyclic pooling, trained on a dataset of over 20,000 real-world OCT images. This DL model can ultimately aid ophthalmologists in diagnosing patients with these retinal diseases more quickly and more accurately, thereby facilitating earlier treatment and improved post-treatment outcomes.
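
As an illustration of the kind of residual convolutional classifier described above, here is a minimal sketch of a small residual CNN for multi-class OCT classification. The layer sizes, the four-class output, and the use of a plain residual block with global average pooling (as a stand-in for the cyclic pooling mentioned) are assumptions for illustration, not the authors' architecture.

```python
# Hypothetical sketch: a small residual CNN for multi-class OCT image classification.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)                     # skip connection

class OCTClassifier(nn.Module):
    def __init__(self, num_classes=4):               # assumed number of disease classes
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(1, 32, 7, stride=2, padding=3),
                                  nn.BatchNorm2d(32), nn.ReLU(inplace=True))
        self.blocks = nn.Sequential(ResidualBlock(32), ResidualBlock(32))
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(32, num_classes))

    def forward(self, x):
        return self.head(self.blocks(self.stem(x)))

model = OCTClassifier()
dummy = torch.randn(2, 1, 224, 224)                   # two grayscale OCT scans
print(model(dummy).shape)                             # torch.Size([2, 4])
```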

Keywords: artificial intelligence, deep learning, imaging, medical devices, ophthalmic devices, ophthalmology, retina

Procedia PDF Downloads 181
2267 Metareasoning Image Optimization Q-Learning

Authors: Mahasa Zahirnia

Abstract:

The purpose of this paper is to explore new and effective ways of optimizing satellite images using artificial intelligence, and the process of implementing reinforcement learning to enhance the quality of the data captured within the image. In our implementation of the Bellman equations of reinforcement learning, the associated state diagrams, and multi-stage image processing, we were able to enhance image quality and to detect and define objects. Reinforcement learning is the differentiator in the area of artificial intelligence, and Q-learning relies on trial and error to achieve its goals. The reward system embedded in Q-learning allows the agent to self-evaluate its performance and decide on the best possible course of action based on the current and future environment. Results show that within a simulated environment built on commercially available images, the detection rate was 40-90%. Reinforcement learning through the Q-learning algorithm is not just a desired but a required design criterion for image optimization and enhancement. The proposed methods are a cost-effective way of resolving uncertainty in the data, because reinforcement learning finds ideal policies to manage the process using a smaller sample of images.
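
A minimal tabular sketch of the Q-learning update referred to above is shown below; the toy environment, reward values, and hyperparameters are placeholders rather than the paper's image-processing setup.

```python
# Minimal tabular Q-learning sketch illustrating the Bellman-style update.
import numpy as np

n_states, n_actions = 16, 4
alpha, gamma, epsilon = 0.1, 0.9, 0.2
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

def step(state, action):
    # Dummy environment: the action nudges the state; reward only in the goal state.
    next_state = min(state + action, n_states - 1)
    reward = 1.0 if next_state == n_states - 1 else 0.0
    return next_state, reward

for episode in range(500):
    s = 0
    for t in range(50):
        a = rng.integers(n_actions) if rng.random() < epsilon else int(Q[s].argmax())
        s_next, r = step(s, a)
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a')
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next
        if s == n_states - 1:
            break

print(Q.round(2))
```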

Keywords: Q-learning, image optimization, reinforcement learning, Markov decision process

Procedia PDF Downloads 215
2266 From Biosensors towards Artificial Intelligence: A New Era in Toxoplasmosis Diagnostics and Therapeutics

Authors: Gehan Labib Abuelenain, Azza Fahmi, Salma Awad Mahmoud

Abstract:

Toxoplasmosis is a global parasitic disease caused by the protozoan Toxoplasma gondii (T. gondii), with a high infection rate that affects one third of the human population and results in severe implications in pregnant women, neonates, and immunocompromised patients. Anti-parasitic treatments and schemes available against toxoplasmosis have barely evolved over the last two decades. The available T. gondii therapeutics cannot completely eradicate tissue cysts produced by the parasite and are not well-tolerated by immunocompromised patients. This work aims to highlight new trends in Toxoplasma gondii diagnosis by providing a comprehensive overview of the field, summarizing recent findings, and discussing the new technological advancements in toxoplasma diagnosis and treatment. Advancements in therapeutics utilizing trends in molecular biophysics, such as biosensors, epigenetics, and artificial intelligence (AI), might provide solutions for disease management and prevention. These insights will provide tools to identify research gaps and proffer planning options for disease control.

Keywords: toxoplasmosis, diagnosis, therapeutics, biosensors, AI

Procedia PDF Downloads 35
2265 Nonlinear Aerodynamic Parameter Estimation of a Supersonic Air to Air Missile by Using Artificial Neural Networks

Authors: Tugba Bayoglu

Abstract:

Aerodynamic parameter estimation is crucial in the missile design phase, since an accurate, high-fidelity aerodynamic model is required for designing a high-performance and robust control system, developing high-fidelity flight simulations, and verifying computational and wind tunnel test results. However, in the literature there are few missile aerodynamic parameter identification studies, for three main reasons: (1) most air-to-air missiles cannot fly at constant speed, (2) the number and duration of missile flight tests are much smaller than those of fixed-wing aircraft, and (3) the variation of missile aerodynamic parameters with respect to Mach number is larger than that of fixed-wing aircraft. In addition to these challenges, identification of aerodynamic parameters at high wind angles by classical estimation techniques brings another difficulty to the estimation process. The reason is that most estimation techniques require polynomials or splines to model the behavior of the aerodynamics. However, for missiles with a large variation of aerodynamic parameters with respect to the flight variables, the order of the proposed model increases, which brings computational burden and complexity. Therefore, this study aims to solve the nonlinear aerodynamic parameter identification problem for a supersonic air-to-air missile by using Artificial Neural Networks. The proposed method will be tested using simulated data generated with a six-degree-of-freedom missile model involving a nonlinear aerodynamic database. The data will be corrupted by adding noise to the measurement model. Then, using the flight variables and measurements, the parameters will be estimated. Finally, the prediction accuracy will be investigated.
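
The workflow outlined above (simulate flight variables through a nonlinear aerodynamic relation, corrupt the measurements with noise, fit a neural network, and check prediction accuracy) can be sketched as follows. The coefficient model and parameter ranges are illustrative stand-ins, not the authors' 6-DoF missile database.

```python
# Hedged sketch of the simulate -> corrupt -> fit -> evaluate workflow.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
mach = rng.uniform(1.2, 4.0, n)          # supersonic Mach numbers
alpha = rng.uniform(-20, 20, n)          # angle of attack [deg]
delta = rng.uniform(-10, 10, n)          # fin deflection [deg]

# Assumed nonlinear normal-force coefficient, for illustration only.
cn_true = 0.05 * alpha + 0.002 * alpha**3 / (1 + mach) + 0.03 * delta
cn_meas = cn_true + rng.normal(0, 0.05, n)   # measurement noise

X = np.column_stack([mach, alpha, delta])
X_tr, X_te, y_tr, y_te = train_test_split(X, cn_meas, random_state=0)

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(net.score(X_te, y_te), 3))
```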

Keywords: air to air missile, artificial neural networks, open loop simulation, parameter identification

Procedia PDF Downloads 279
2264 An Analysis of a Canadian Personalized Learning Curriculum

Authors: Ruthanne Tobin

Abstract:

The shift to a personalized learning (PL) curriculum in Canada represents an innovative approach to teaching and learning that is also evident in various initiatives across the 32-nation OECD. The premise behind PL is that empowering individual learners to have more input into how they access and construct knowledge, and express their understanding of it, will result in more meaningful school experiences and academic success. In this paper presentation, the author reports on a document analysis of the new curriculum in the province of British Columbia. Three theoretical frameworks are used to analyze the new curriculum. Framework 1 focuses on five dominant aspects (FDA) of PL at the classroom level. Framework 2 focuses on conceptualizing and enacting personalized learning (CEPL) within three spheres of influence. Framework 3 focuses on the integration of three types of knowledge (content, technological, and pedagogical). Analysis is ongoing, but preliminary findings suggest that the new curriculum addresses framework 1 quite well, which identifies five areas of personalized learning: 1) assessment for learning; 2) effective teaching and learning; 3) curriculum entitlement (choice); 4) school organization; and 5) “beyond the classroom walls” (learning in the community). Framework 2 appears to be less well developed in the new curriculum. This framework speaks to the dynamics of PL within three spheres of interaction: 1) nested agency, comprised of overarching constraints [and enablers] from policy makers, school administrators and community; 2) relational agency, which refers to a capacity for professionals to develop a network of expertise to serve shared goals; and 3) students’ personalized learning experience, which integrates differentiation with self-regulation strategies. Framework 3 appears to be well executed in the new PL curriculum, as it employs the theoretical model of technological, pedagogical content knowledge (TPACK) in which there are three interdependent bodies of knowledge. Notable within this framework is the emphasis on the pairing of technologies with excellent pedagogies to significantly assist students and teachers. This work will be of high relevance to educators interested in innovative school reform.

Keywords: curriculum reform, K-12 school change, innovations in education, personalized learning

Procedia PDF Downloads 282
2263 Artificial Neural Network-Based Prediction of Effluent Quality of Wastewater Treatment Plant Employing Data Preprocessing Approaches

Authors: Vahid Nourani, Atefeh Ashrafi

Abstract:

Prediction of treated wastewater quality is a matter of growing importance in the water treatment process. To this end, the artificial neural network (ANN), as a robust data-driven approach, has been widely used for forecasting the effluent quality of wastewater treatment. However, developing an ANN model based on appropriate input variables is a major concern, because numerous parameters are collected from the treatment process and their number keeps increasing with the development of electronic sensors. Various studies have been conducted, using different clustering methods, to classify the most relevant and effective input variables. Nevertheless, the selection of dominant input variables among wastewater treatment parameters, which could effectively lead to more accurate prediction of water quality, has often been overlooked. In the present study, two ANN models were developed with the aim of forecasting the effluent quality of the Tabriz city wastewater treatment plant. Biochemical oxygen demand (BOD) was used as the target water quality parameter. Model A used Principal Component Analysis (PCA), a linear variance-based clustering method, for input selection. Model B used the variables identified by the mutual information (MI) measure. With the optimal ANN structure, the comparison of model B with model A showed up to a 15% increment in the Determination Coefficient (DC). Thus, this study highlights the advantage of the PCA method in selecting dominant input variables for ANN modeling of wastewater treatment plant performance.
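
The two input-selection routes described above can be contrasted with the short sketch below: PCA components versus the variables ranked highest by mutual information, each feeding a small MLP that predicts BOD. The data are synthetic placeholders, not the Tabriz plant records.

```python
# Illustrative comparison: PCA-based vs. mutual-information-based input selection.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import mutual_info_regression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(800, 12))                 # 12 treatment-process variables (synthetic)
bod = 2 * X[:, 0] - X[:, 3] ** 2 + 0.5 * X[:, 7] + rng.normal(0, 0.3, 800)

X_tr, X_te, y_tr, y_te = train_test_split(X, bod, random_state=0)

# Model A: linear variance-based reduction with PCA.
pca = PCA(n_components=4).fit(X_tr)
ann_a = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
ann_a.fit(pca.transform(X_tr), y_tr)

# Model B: keep the four inputs with the highest mutual information.
mi = mutual_info_regression(X_tr, y_tr, random_state=0)
top = np.argsort(mi)[-4:]
ann_b = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
ann_b.fit(X_tr[:, top], y_tr)

print("DC (R^2), PCA inputs:", round(ann_a.score(pca.transform(X_te), y_te), 3))
print("DC (R^2), MI inputs :", round(ann_b.score(X_te[:, top], y_te), 3))
```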

Keywords: Artificial Neural Networks, biochemical oxygen demand, principal component analysis, mutual information, Tabriz wastewater treatment plant, wastewater treatment plant

Procedia PDF Downloads 128
2262 [Keynote Speech]: Determination of Naturally Occurring and Artificial Radionuclide Activity Concentrations in Marine Sediments in Western Marmara, Turkey

Authors: Erol Kam, Z. U. Yümün

Abstract:

Natural and artificial radionuclides cause radioactive contamination of the environment; like other non-biodegradable pollutants (heavy metals, etc.), they sink to the sea floor and accumulate in sediments. The habitats of benthic foraminifera living on or within seafloor sediments are particularly affected by radioactive pollution in the marine environment. Thus, determining the radionuclides is important for pollution analysis. Radioactive pollution accumulates at the lowest level of the food chain and reaches humans at the highest level; the greater the accumulation, the more the environment is endangered. This study used gamma spectrometry to investigate the natural and artificial radionuclide distribution of sediment samples taken from living benthic foraminifera habitats in the Western Marmara Sea. The radionuclides K-40, Cs-137, Ra-226, Mn-54, Zr-95+ and Th-232 were identified in the sediment samples. For this purpose, 18 core samples were taken from depths of about 25-30 meters in the Marmara Sea in 2016. The locations of the core samples were selected specifically from discharge points of domestic and industrial areas, port locations, and similar sites to represent pollution in the study area. Gamma spectrometric analysis was used to determine the radioactive properties of the sediments. The radionuclide activity concentrations obtained in the sediment samples were Cs-137 = 0.9-9.4 Bq/kg, Th-232 = 18.9-86 Bq/kg, Ra-226 = 10-50 Bq/kg, K-40 = 24.4-670 Bq/kg, Mn-54 = 0.71-0.9 Bq/kg and Zr-95+ = 0.18-0.19 Bq/kg. These values were compared with the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) data, and an environmental analysis was carried out. The Ra-226 series, the Th-232 series, and the K-40 radionuclides accumulate naturally and are increasing every day due to anthropogenic pollution. Although the Ra-226 values obtained in the study areas remained within normal limits according to the UNSCEAR values, the K-40 and Th-232 series values were found to be high in almost all the locations.

Keywords: Ra-226, Th-232, K-40, Cs-137, Mn-54, Zr-95+, radionuclides, Western Marmara Sea

Procedia PDF Downloads 420
2261 Application of Artificial Neural Network for Prediction of Retention Times of Some Secoestrane Derivatives

Authors: Nataša Kalajdžija, Strahinja Kovačević, Davor Lončar, Sanja Podunavac Kuzmanović, Lidija Jevrić

Abstract:

In order to investigate the relationship between retention and structure, a quantitative structure-retention relationship (QSRR) study was applied to predict the retention times of a set of 23 secoestrane derivatives in reversed-phase thin-layer chromatography. After the calculation of molecular descriptors, a suitable subset of descriptors was selected by stepwise multiple linear regression. The Artificial Neural Network (ANN) method was then employed to model the nonlinear structure-retention relationships. The ANN technique resulted in a 5-6-1 ANN model with a correlation coefficient of 0.98. We found that the following descriptors have a significant effect on the retention times: critical pressure, total energy, protease inhibition, distribution coefficient (logD), and the lipophilicity parameter (miLogP). The prediction results are in very good agreement with the experimental ones. This approach provides a new and effective method for predicting the chromatographic retention index of the secoestrane derivatives investigated.

Keywords: lipophilicity, QSRR, RP TLC retention, secoestranes

Procedia PDF Downloads 454
2260 Eco-Agriculture for Effective Solid Waste Management in Minna, Nigeria

Authors: A. Abdulkadir, Y. M. Bello, A. A. Okhimamhe, H. Ibrahim, M. B. Matazu, L. S. Barau

Abstract:

The increasing volume of solid waste generated, collected, and disposed of daily complicates adequate waste management by the relevant agency, the Niger State Environmental Protection Agency (NISEPA). In addition, the impacts of solid waste on the natural environment and human livelihoods require the identification of cost-effective ways for sustainable municipal waste management in Nigeria. These signal the need to identify environment-friendly initiatives and local solutions to address municipal solid waste. A research field was secured at Pago, Minna, Niger State, which is located in the guinea savanna belt of Nigeria, within longitude 6°36′43.11″-45.11″ E and latitude 9°29′37.61″-37.62″ N. Poultry droppings, decomposed household waste manure, and an NPK treatment were used. The experimental field was divided into three replications with four treatments in each replication, making a total of twelve plots. The treatments were allotted using a Randomized Complete Block Design (RCBD), and the data collected were analyzed using SPSS software. The results show variation in plant height and number of leaves at 50% flowering; poultry droppings recorded the greatest height and number of leaves, while household waste manure competed fairly well with the NPK treatment. Similarly, the various treatments significantly increased vegetable yield, with the control (no treatment) recording the lowest yield for the three vegetable samples. Adopting this organic manure for cultivation not only enhances environmental quality and the attainment of food security but will also contribute to local economic development, poverty alleviation, and social inclusion.
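
To illustrate how an RCBD of this shape (three blocks by four treatments) is typically analyzed, here is a minimal two-way ANOVA sketch with treatment and block effects. The yield figures are simulated placeholders, not the field data, and the analysis mirrors in Python what the authors did in SPSS.

```python
# Illustrative RCBD analysis: two-way ANOVA with treatment and block effects.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(3)
treatments = ["poultry", "household_waste", "NPK", "control"]
rows = []
for block in range(1, 4):                      # three replications (blocks)
    for trt in treatments:                     # four treatments per block
        effect = {"poultry": 3.0, "household_waste": 2.5, "NPK": 2.7, "control": 1.0}[trt]
        rows.append({"block": block, "treatment": trt,
                     "yield_kg": effect + 0.2 * block + rng.normal(0, 0.3)})
df = pd.DataFrame(rows)

model = ols("yield_kg ~ C(treatment) + C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```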

Keywords: environmental issues, food security, NISEPA, solid waste

Procedia PDF Downloads 345
2259 Analyzing Middle Actors' Influence on Land Use Policy: A Case Study in Central Kalimantan, Indonesia

Authors: Kevin Soubly, Kaysara Khatun

Abstract:

This study applies the existing Middle-Out Perspective (MOP) as a complementing analytical alternative to the customary dichotomous options of top-down vs. bottom-up strategies of international development and commons governance. It expands the framework by applying it to a new context of land management and environmental change, enabling fresh understandings of decision making around land use. Using a case study approach in Central Kalimantan, Indonesia among a village of indigenous Dayak, this study explores influences from both internal and external middle actors, utilizing qualitative empirical evidence and incorporating responses across 25 village households and 11 key stakeholders. Applying the factors of 'agency' and 'capacity' specific to the MOP, this study demonstrates middle actors’ unique capabilities and criticality to change due to their influence across various levels of decision-making. Study results indicate that middle actors play a large role, both passively and actively, both directly and indirectly, across various levels of decision-making, perception-shaping, and commons governance. In addition, the prominence of novel 'passive' middle actors, such as the internet, can provide communities themselves with a level of agency beyond that provided by other middle actors such as NGOs and palm oil industry entities – which often operate at the behest of the 'top' or out of self-interest. Further, the study posits that existing development and decision-making frameworks may misidentify the 'bottom' as the 'middle,' raising questions about traditional development and livelihood discourse, strategies, and support, from agricultural production to forest management. In conclusion, this study provides recommendations including that current policy preconceptions be reevaluated to engage middle actors in locally-adapted, integrative manners in order to improve governance and rural development efforts more broadly.

Keywords: environmental management, governance, Indonesia, land use, middle actors, middle-out perspective

Procedia PDF Downloads 115
2258 Evaluation of the Conditions of Managed Aquifer Recharge in the West African Basement Area

Authors: Palingba Aimé Marie Doilkom, Mahamadou Koïta, Jean-Michel Vouillamoz, Angelbert Biaou

Abstract:

Most African populations in rural areas rely on groundwater for their consumption. Indeed, in the face of climate change and strong demographic growth, groundwater, particularly in basement areas, is increasingly in demand. The sustainability of water resources in this type of environment is therefore becoming a major issue. Groundwater recharge can be natural or artificial. Unlike natural recharge, which often results from the natural infiltration of surface water (e.g., a share of rainfall), artificial recharge consists of inducing water infiltration through appropriate works in order to artificially replenish the water stock of an aquifer. Artificial recharge is therefore one of the measures that can be implemented to secure water supply, combat the effects of climate change, and, more generally, contribute to improving the quantitative status of groundwater bodies. It is in this context that the present research is conducted, with the aim of developing artificial recharge in order to contribute to the sustainability of basement aquifers under climatic variability and the constantly increasing water needs of populations. To achieve the expected results, it is important to determine the characteristics of the infiltration basins and to identify areas suitable for their implementation. The geometry of the aquifer was reproduced, and its hydraulic properties were collected and characterized, including boundary conditions, hydraulic conductivity, effective porosity, recharge, Van Genuchten parameters, and saturation indices. The aquifer of the Sanon experimental site is made up of three layers, namely the saprolite, the fissured horizon, and the fresh basement; only the saprolite and the fissured medium were considered in the simulations. The first results with the FEFLOW model show that the water table reacts continuously for the first 100 days before stabilizing, and the hydraulic head increases by an average of 1 m. The further away from the basin, the less the water table reacts. However, if a variable hydraulic head is imposed on the basins, the response of the water table is not uniform over time: the lower the basin hydraulic head, the less it affects the water table. These simulations will be continued by refining the characteristics of the basins in order to obtain a configuration suitable for effective recharge.
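
The qualitative behaviour reported above (a head rise near the basin that stabilizes after a transient and decays with distance) can be reproduced with a deliberately simple 1-D transient sketch. It is an explicit finite-difference toy, not a FEFLOW model, and all parameter values are assumptions rather than the Sanon site properties.

```python
# Toy 1-D transient groundwater model: head rise under an infiltration basin.
import numpy as np

length, nx = 500.0, 101            # domain length [m], grid points
dx = length / (nx - 1)
T, S = 1.0e-2, 0.05                # assumed transmissivity [m^2/s] and storativity [-]
D = T / S                          # hydraulic diffusivity
dt = 0.4 * dx**2 / D               # satisfies the explicit stability limit
h = np.zeros(nx)                   # head change relative to the initial level [m]

basin = slice(0, 5)                # cells beneath the infiltration basin
recharge = 1e-6                    # assumed basin infiltration flux [m/s]

t, t_end = 0.0, 100 * 86400.0      # simulate 100 days
while t < t_end:
    lap = np.empty(nx)
    lap[1:-1] = (h[2:] - 2 * h[1:-1] + h[:-2]) / dx**2
    lap[0] = 2 * (h[1] - h[0]) / dx**2     # no-flow boundary on the basin side
    lap[-1] = 0.0
    h += D * dt * lap
    h[basin] += dt * recharge / S          # water added by the basin
    h[-1] = 0.0                            # fixed head far from the basin
    t += dt

print(f"head rise at basin: {h[0]:.2f} m, at 250 m: {h[nx // 2]:.2f} m")
```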

Keywords: basement area, FEFLOW, infiltration basin, MAR

Procedia PDF Downloads 76
2257 A Comprehensive Theory of Communication with Biological and Non-Biological Intelligence for a 21st Century Curriculum

Authors: Thomas Schalow

Abstract:

It is commonly recognized that our present curriculum is not preparing students to function in the 21st century. This is particularly true in regard to communication needs across cultures - both human and non-human. In this paper, a comprehensive theory of communication, based on communication with non-human cultures and intelligences, is presented to meet the following three imminent contingencies: communicating with sentient biological intelligences, communicating with extraterrestrial intelligences, and communicating with artificial super-intelligences. The paper begins with the argument that we need to become much more serious about communicating with the non-human, intelligent life forms that already exist around us here on Earth. We need to broaden our definition of communication and reach out to other sentient life forms in order to provide humanity with a better perspective of its place within our ecosystem. The paper next examines the science and philosophy behind CETI (communication with extraterrestrial intelligences) and how it could prove useful even in the absence of contact with alien life. However, CETI's assumptions and methodology need to be revised in accordance with the communication theory proposed in this paper if we are truly serious about finding and communicating with life beyond Earth. The final theme explored in this paper is communication with non-biological super-intelligences. Humanity has never been truly compelled to converse with other species, and our failure to seriously consider such intercourse has left us largely unprepared to deal with communication in a future that will be mediated and controlled by computer algorithms. Fortunately, our experience dealing with other cultures can provide us with a framework for this communication. The basic concepts behind intercultural communication can be applied to the three types of communication envisioned in this paper if we are willing to recognize that we are in fact dealing with other cultures when we interact with other species, alien life, and artificial super-intelligence. The ideas considered in this paper will require a new mindset for humanity, but a new disposition will yield substantial gains. A curriculum that is truly ready for the 21st century needs to be aligned with this new theory of communication.

Keywords: artificial intelligence, CETI, communication, language

Procedia PDF Downloads 364
2256 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques

Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas

Abstract:

The reason for conducting this research is to develop an algorithm that is capable of classifying news articles from the automobile industry, according to the competitive actions that they entail, with the use of Text Mining (TM) methods. This requires testing how to properly preprocess the data by preparing pipelines that fit each algorithm best. The pipelines are tested along with nine different classification algorithms in the realms of regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines: Logistic Regression (LR) and an Artificial Neural Network (ANN). These algorithms are optimized further by testing several parameters of each. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm is capable of reaching an accuracy of 94%.
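
A hedged sketch of one such preprocessing-plus-classifier pipeline is shown below, pairing TF-IDF features with logistic regression and reporting accuracy, precision, recall, and F1. The tiny corpus and label set are invented; the authors' actual pipelines, features, and classes are not specified in the abstract.

```python
# Illustrative text-classification pipeline: TF-IDF features + logistic regression.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

texts = [
    "Automaker cuts prices on electric sedan lineup",
    "New SUV model launched with hybrid drivetrain",
    "Dealer network expanded into three new regions",
    "Price war intensifies among compact car makers",
    "Partnership announced for battery cell production",
    "Recall issued after brake software fault",
] * 10  # repeated so the toy example has enough samples per class
labels = ["pricing", "new_product", "expansion", "pricing", "expansion", "other"] * 10

X_tr, X_te, y_tr, y_te = train_test_split(texts, labels, random_state=0, stratify=labels)

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), stop_words="english")),
    ("lr", LogisticRegression(max_iter=1000)),
])
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```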

Keywords: artificial neural network, competitive dynamics, logistic regression, text classification, text mining

Procedia PDF Downloads 121
2255 The AI Arena: A Framework for Distributed Multi-Agent Reinforcement Learning

Authors: Edward W. Staley, Corban G. Rivera, Ashley J. Llorens

Abstract:

Advances in reinforcement learning (RL) have resulted in recent breakthroughs in the application of artificial intelligence (AI) across many different domains. An emerging landscape of development environments is making powerful RL techniques more accessible for a growing community of researchers. However, most existing frameworks do not directly address the problem of learning in complex operating environments, such as dense urban settings or defense-related scenarios, that incorporate distributed, heterogeneous teams of agents. To help enable AI research for this important class of applications, we introduce the AI Arena: a scalable framework with flexible abstractions for distributed multi-agent reinforcement learning. The AI Arena extends the OpenAI Gym interface to allow greater flexibility in learning control policies across multiple agents with heterogeneous learning strategies and localized views of the environment. To illustrate the utility of our framework, we present experimental results that demonstrate performance gains due to a distributed multi-agent learning approach over commonly-used RL techniques in several different learning environments.
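
The AI Arena's actual API is not given in the abstract, so the sketch below is a hypothetical minimal multi-agent environment in the Gym spirit it describes: each agent receives a localized observation and submits its own action every step, which is the hook where heterogeneous learning strategies would plug in.

```python
# Hypothetical minimal multi-agent environment with localized per-agent views.
import numpy as np

class MultiAgentGridEnv:
    """Toy environment: N agents on a line, each observing only its own position."""

    def __init__(self, n_agents=3, size=10):
        self.n_agents, self.size = n_agents, size

    def reset(self):
        self.pos = np.zeros(self.n_agents, dtype=int)
        return {i: self._obs(i) for i in range(self.n_agents)}   # localized views

    def _obs(self, i):
        return np.array([self.pos[i] / self.size])

    def step(self, actions):
        # actions: dict agent_id -> 0 (stay) or 1 (move right)
        rewards, obs = {}, {}
        for i, a in actions.items():
            self.pos[i] = min(self.pos[i] + int(a), self.size)
            rewards[i] = 1.0 if self.pos[i] == self.size else 0.0
            obs[i] = self._obs(i)
        done = bool(np.all(self.pos == self.size))
        return obs, rewards, done, {}

rng = np.random.default_rng(0)
env = MultiAgentGridEnv()
obs = env.reset()
for _ in range(20):
    actions = {i: int(rng.integers(2)) for i in obs}   # heterogeneous policies would go here
    obs, rewards, done, _ = env.step(actions)
    if done:
        break
```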

Keywords: reinforcement learning, multi-agent, deep learning, artificial intelligence

Procedia PDF Downloads 157
2254 Implications of Humanizing Pedagogy on Learning Design in a Technology-Enhanced Language Learning Environment: Critical Reflections on Student Identity and Agency

Authors: Mukhtar Raban

Abstract:

Nelson Mandela University subscribes to a humanizing pedagogy (HP), housed under broader critical pedagogy, that underpins and informs learning and teaching activities at the institution. The investigation sought to explore the implications of humanizing and critical pedagogical considerations for a technology-enhanced language learning (TELL) environment in a university course. The paper inquires into the design of a learning resource in the online learning environment of an English communication module that applied HP principles. With the objective of creating agentive spaces for foregrounding identity, student voice, critical self-reflection, and recognition of others’ humanity, a flexible and open 'My Presence' feature was added to the TELL environment that allowed students and lecturers to share elements of their backgrounds in a ‘mutually vulnerable’ manner as a way of establishing digital identity and a more ‘human’ presence in the online language learning encounter, serving as a catalyst for the recognition of the ‘other’. Following a qualitative research design, the study adopted an auto-ethnographic approach, complementing the critical inquiry nature embedded in the activity’s practices. The study’s findings provide critical reflections and deductions on the possibilities of leveraging digital human expression within a humanizing pedagogical framework to advance the realization of HP adoption in language learning and teaching encounters. It was found that the consideration of humanizing pedagogical principles in the design of online learning was more effective when the critical outcomes were explicated to students and lecturers prior to the completion of the activities. The integration of humanizing pedagogy also led to a contextual advancement of ‘affective’ language learning. Upon critical reflection and analysis, student identity and agency can flourish in a technology-enhanced learning environment when humanizing and critical pedagogy influence the learning design.

Keywords: critical reflection, humanizing pedagogy, student identity, technology-enhanced language learning

Procedia PDF Downloads 135
2253 The Implication of Disaster Risk Identification to Cultural Heritage-The Scenarios of Flood Risk in Taiwan

Authors: Jieh-Jiuh Wang

Abstract:

Disasters happen frequently today due to global climate change. Cultural heritage conservation should therefore be considered from the perspective of the surrounding environment and large-scale disasters. Most current thinking about disaster prevention for cultural heritage in Taiwan is single-point thinking that emphasizes firefighting, decay prevention, and structural reinforcement while ignoring the environment as a whole. Traditional conservation cannot defend against the increasingly severe and frequent natural disasters caused by climate change, and more and more cultural heritage sites face a high risk of disaster. This study adopts the perspective of risk identification and takes floods as the main disaster category. Using a geographic information system, it analyzes the number and categories of cultural heritage sites that might suffer from disasters by integrating the latest flooding potential data from the National Fire Agency and the Water Resources Agency with basic cultural heritage data. It examines the actual flood risk faced by cultural heritage and serves as a basis for future consideration of risk measures and preparations for disaster reduction. The study finds a positive relationship between the disaster exposure of national cultural heritage and rainfall intensity. In order of flood impact, the most affected are historic buildings, followed by historic sites designated by municipalities and counties, and then national historic sites and relics. Traditional settlements and cultural landscapes, however, are not impacted, which may be related to taboo spaces in the traditional culture of site selection (concepts of disaster avoidance). Regarding regional distribution, cultural heritage in central and northern Taiwan suffers from more severe floods, while heritage in northern and eastern Taiwan suffers from greater flooding depths.
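
The GIS overlay described above reduces, at its core, to a spatial join of heritage locations with flood-potential polygons and a count of exposed sites per category. The schematic below shows that operation with invented geometries; it is not the national datasets or the study's actual layers.

```python
# Schematic spatial join: heritage points intersected with flood-potential polygons.
import geopandas as gpd
from shapely.geometry import Point, Polygon

flood_zones = gpd.GeoDataFrame(
    {"rain_intensity": ["350 mm/day", "600 mm/day"]},          # assumed scenario labels
    geometry=[Polygon([(0, 0), (4, 0), (4, 4), (0, 4)]),
              Polygon([(4, 0), (8, 0), (8, 4), (4, 4)])],
    crs="EPSG:3826",
)

heritage = gpd.GeoDataFrame(
    {"category": ["historic building", "municipal site", "national site"]},
    geometry=[Point(1, 1), Point(5, 2), Point(9, 3)],           # the last point lies outside
    crs="EPSG:3826",
)

exposed = gpd.sjoin(heritage, flood_zones, how="inner", predicate="intersects")
print(exposed.groupby(["rain_intensity", "category"]).size())
```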

Keywords: cultural heritage, flood, preventive conservation, risk management

Procedia PDF Downloads 338
2252 Prediction of Compressive Strength Using Artificial Neural Network

Authors: Vijay Pal Singh, Yogesh Chandra Kotiyal

Abstract:

Structures are a combination of various load-carrying members that transfer the loads from the superstructure safely to the foundation. At the design stage, the loading of the structure is defined and appropriate material choices are made based upon their properties, mainly related to strength. The strength of materials keeps reducing with time because of many factors, such as environmental exposure and deformation caused by unpredictable external loads. Hence, various techniques are used to predict the strength of materials used in structures. Among these, non-destructive testing (NDT) techniques are the ones that can be used to predict the strength without damaging the structure. In the present study, the compressive strength of concrete has been predicted using an Artificial Neural Network (ANN). The predicted strength was compared with the experimentally obtained compressive strength of the concrete, and equations were developed for different models. A good correlation has been obtained between the strength predicted by these models and the experimental values. Further, a correlation has been developed for strength prediction by regression analysis using two NDT techniques. It was found that the percentage error of the predicted strength was reduced by using the combined techniques in place of single techniques.
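
As a toy illustration of combining two NDT measurements rather than one, the regression sketch below fits strength against rebound number and ultrasonic pulse velocity separately and together, then compares the percentage error. The data and the assumed strength relation are synthetic; the study's ANN comparison is not reproduced here.

```python
# Illustrative single- vs. combined-NDT regression for compressive strength.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(4)
n = 200
rebound = rng.uniform(20, 50, n)                 # rebound hammer number
upv = rng.uniform(3.0, 5.0, n)                   # ultrasonic pulse velocity [km/s]
strength = 0.8 * rebound + 6.0 * upv + rng.normal(0, 2.0, n)   # assumed relation [MPa]

inputs = {"rebound only": rebound[:, None],
          "UPV only": upv[:, None],
          "combined": np.column_stack([rebound, upv])}

for name, X in inputs.items():
    model = LinearRegression().fit(X, strength)
    err = mean_absolute_percentage_error(strength, model.predict(X)) * 100
    print(f"{name:12s} mean % error: {err:.1f}")
```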

Keywords: rebound, ultra-sonic pulse, penetration, ANN, NDT, regression

Procedia PDF Downloads 428
2251 Thermalytix: An Advanced Artificial Intelligence Based Solution for Non-Contact Breast Screening

Authors: S. Sudhakar, Geetha Manjunath, Siva Teja Kakileti, Himanshu Madhu

Abstract:

Diagnosis of breast cancer at early stages leads to better clinical and survival outcomes. Survival rates in developing countries like India are very low due to the accessibility and affordability issues of screening tests such as mammography. In addition, mammography is not very effective in younger women with dense breasts. This leaves a gap in current screening methods. Thermalytix is a new technique for detecting breast abnormalities in a non-contact, non-invasive way. It is an AI-enabled computer-aided diagnosis solution that automates the interpretation of high-resolution thermal images and identifies potential malignant lesions. The solution is low-cost, easy to use, portable, and effective in all age groups. This paper presents the results of a retrospective comparative analysis of Thermalytix against mammography and clinical breast examination for breast cancer screening. Thermalytix was found to have better sensitivity than both tests, with good specificity as well. In addition, Thermalytix identified all malignant patients without palpable lumps.

Keywords: breast cancer screening, radiology, thermalytix, artificial intelligence, thermography

Procedia PDF Downloads 291
2250 The Role of ChatGPT in Enhancing ENT Surgical Training

Authors: Laura Brennan, Ram Balakumar

Abstract:

ChatGPT was developed by OpenAI (November 2022) as a powerful artificial intelligence (AI) language model designed to produce human-like text from user-written prompts. To gain the most from the system, user-written prompts must give context-specific information. This article aims to give guidance on how to optimize the use of ChatGPT in the context of education for otolaryngology. Otolaryngology is a specialist field in which little time is dedicated to providing education to both medical students and doctors. Additionally, otolaryngology trainees have seen a reduction in learning opportunities since the COVID-19 pandemic. In this article, we look at these various barriers to medical education in otolaryngology training and suggest ways in which ChatGPT can overcome them and assist in simulation-based training. Examples drawn from the authors' experience show how this can be achieved and highlight the practicalities. What this article finds is that while ChatGPT cannot replace traditional mentorship and practical surgical experience, it can serve as an invaluable supplementary resource for simulation-based medical education in otolaryngology.

Keywords: artificial intelligence, otolaryngology, surgical training, medical education

Procedia PDF Downloads 159
2249 Investigating the Effect of Artificial Intelligence on the Improvement of Green Supply Chain in Industry

Authors: Sepinoud Hamedi

Abstract:

Over the past few decades, companies have shown growing concern about the environmental impact of their manufacturing activities. Green supply chain management has been considered by manufacturers as an attainable option to reduce the environmental impact of operations while at the same time improving their operational performance. At the same time, the advent of digitalization and globalization in the supply chain space has led to a growing acknowledgment of the importance of data processing methodologies, such as big data analytics and artificial intelligence (BDA-AI) technologies, in improving and optimizing supply chain performance. Moreover, supply chain collaboration partly mediates the relationship between these technologies and supply chain performance. Studies show that the use of BDA-AI technologies has a significant impact on environmental process integration and green supply chain collaboration, and they also underline that both environmental process integration and green supply chain collaboration have a significant effect on environmental performance. Correspondingly, a smart supply chain contributes to green performance through managing green relationships and setting up green operations.

Keywords: green supply chain, artificial intelligence, manufacturers, technology, environmental

Procedia PDF Downloads 73
2248 Machine Learning Techniques in Bank Credit Analysis

Authors: Fernanda M. Assef, Maria Teresinha A. Steiner

Abstract:

The aim of this paper is to compare and discuss better classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study uses a database of 5,432 companies that are clients of the bank, where 2,600 clients are classified as non-defaulters, 1,551 are classified as defaulters, and 1,281 are temporarily defaulters, meaning that the clients are overdue on their payments for up to 180 days. For each case, a total of 15 attributes was considered for a one-against-all assessment using four different techniques: Artificial Neural Networks Multilayer Perceptron (ANN-MLP), Artificial Neural Networks Radial Basis Functions (ANN-RBF), Logistic Regression (LR), and Support Vector Machines (SVM). For each method, different parameters were analyzed. Initially, the data were coded using thermometer coding (for numerical attributes) or dummy coding (for nominal attributes). The methods were then evaluated for each parameter, and the best result of each technique was compared in terms of accuracy, false positives, false negatives, true positives, and true negatives. This comparison showed that the best method, in terms of accuracy, was ANN-RBF (79.20% for non-defaulter classification, 97.74% for defaulters, and 75.37% for the temporarily defaulter classification). However, the best accuracy does not always represent the best technique. For instance, on the classification of temporarily defaulters, this technique was surpassed, in terms of false positives, by SVM, which had the lowest rate (0.07%) of false positive classifications. All these details are discussed in light of the results found, and an overview of what was presented is given in the conclusion of this study.
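
A simplified one-against-all setup of the kind described (three client classes handled by per-class binary classifiers) can be sketched as follows with logistic regression and an RBF-kernel SVM on synthetic data. The class proportions echo the abstract, but the features and results are placeholders, and an RBF-kernel SVM is not the same as the authors' ANN-RBF network.

```python
# Illustrative one-against-all credit-risk classification on synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score

X, y = make_classification(n_samples=5432, n_features=15, n_informative=8,
                           n_classes=3, weights=[0.48, 0.28, 0.24], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, base in {"LR": LogisticRegression(max_iter=1000),
                   "SVM": SVC(kernel="rbf")}.items():
    ovr = OneVsRestClassifier(base).fit(X_tr, y_tr)
    pred = ovr.predict(X_te)
    print(name, "accuracy:", round(accuracy_score(y_te, pred), 3))
    print(confusion_matrix(y_te, pred))   # inspect false positives/negatives per class
```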

Keywords: artificial neural networks (ANNs), classifier algorithms, credit risk assessment, logistic regression, machine learning, support vector machines

Procedia PDF Downloads 103
2247 Dogmatic Analysis of Legal Risks of Using Artificial Intelligence: The European Union and Polish Perspective

Authors: Marianna Iaroslavska

Abstract:

ChatGPT is becoming commonplace. However, only a few people think about the legal risks of using large language models in their daily work. The main dilemmas concern the following areas: who owns the copyright to what somebody creates through ChatGPT; what OpenAI can do with the prompt you enter; whether you can accidentally infringe another creator's rights through ChatGPT; and how the data somebody enters into the chat is protected. This paper will present these and other legal risks of using large language models at work, using dogmatic methods and case studies. The paper will present a legal analysis of AI risks against the background of European Union law and Polish law. This analysis will answer questions about how to protect data, how to make sure you do not violate copyright, and what is at stake with the AI Act, which recently came into force in the EU. If your work is related to the EU area and you use AI in your work, this paper will be a real goldmine for you. The copyright law in force in Poland does not protect your rights to a work that is created with the help of AI. So if you start selling such a work, you may face two main problems. First, someone may steal your work, and you will not be entitled to any protection, because a work created with AI does not have any legal protection. Second, the AI may have created the work by infringing on another person's copyright, so they will be able to claim damages from you. In addition, the EU's current AI Act imposes a number of additional obligations related to the use of large language models. The AI Act divides artificial intelligence into four risk levels and imposes different requirements depending on the level of risk. The EU regulation is aimed primarily at those developing and marketing artificial intelligence systems in the EU market. In addition to the above obstacles, personal data protection comes into play, which is very strictly regulated in the EU. If you violate personal data protection rules by entering information into ChatGPT, you will be liable for the violation. When using AI within the EU or in cooperation with entities located in the EU, you have to take a number of risks into account. This paper will highlight such risks and explain how they can be avoided.

Keywords: EU, AI act, copyright, polish law, LLM

Procedia PDF Downloads 20
2251 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection

Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari

Abstract:

In this paper, an Artificial Neural Network (ANN) was developed to predict Asphaltene Precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. In this study, experimental data from six different oil fields were collected. Seventy percent of the data was used to develop the ANN model, and different ANN architectures were examined. A network with the trainlm (Levenberg-Marquardt) training algorithm was found to be the best network for estimating the AP. To check the validity of the proposed model, the model was used to predict the AP for the remaining thirty percent of the data, which had not been used in training. The Mean Square Error (MSE) of the prediction was 0.0018, which confirms the excellent prediction capability of the proposed model. In the second part of this study, the ANN model predictions were compared with modified Hirschberg model predictions. The ANN was found to provide more accurate estimates than the modified Hirschberg model. Finally, the proposed model was employed to examine the effect of different operating parameters during gas injection on the AP. It was found that the AP is most sensitive to the reservoir temperature. Furthermore, increasing the carbon dioxide concentration in the liquid phase increases the AP.

Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs

Procedia PDF Downloads 364
2245 New Advanced Medical Software Technology Challenges and Evolution of the Regulatory Framework in Expert Software, Artificial Intelligence, and Machine Learning

Authors: Umamaheswari Shanmugam, Silvia Ronchi, Radu Vornicu

Abstract:

Software, artificial intelligence, and machine learning can improve healthcare through innovative and advanced technologies that are able to use the large amount and variety of data generated during healthcare services every day. As we read in the news, over 500 machine learning or other artificial intelligence medical devices have now received FDA clearance or approval, the first ones even preceding the year 2000. One of the big advantages of these new technologies is the ability to gain experience and knowledge from real-world use and to continuously improve their performance. Healthcare systems and institutions can benefit greatly, because the use of advanced technologies improves both the efficiency and the efficacy of healthcare. Software defined as a medical device is stand-alone software that is intended to be used for patients for one or more specific medical purposes: diagnosis, prevention, monitoring, prediction, prognosis, treatment, or alleviation of a disease or other health conditions; replacing or modifying any part of a physiological or pathological process; or managing information received from in vitro specimens derived from the human body - without achieving its principal intended action by pharmacological, immunological, or metabolic means. Software qualified as a medical device must comply with the general safety and performance requirements applicable to medical devices. These requirements are necessary to ensure high performance and quality and to protect patients' safety. The evolution and continuous improvement of software used in healthcare must take into consideration the increase in regulatory requirements, which are becoming more complex in each market. The gap between these advanced technologies and the new regulations is the biggest challenge for medical device manufacturers. Regulatory requirements can be considered a market barrier, as they can delay or obstruct device approval, but they are necessary to ensure performance, quality, and safety; at the same time, they can be a business opportunity if the manufacturer is able to define the appropriate regulatory strategy in advance. This abstract provides an overview of the current regulatory framework, the evolution of international requirements, and the standards applicable to medical device software in potential markets all over the world.

Keywords: artificial intelligence, machine learning, SaMD, regulatory, clinical evaluation, classification, international requirements, MDR, 510k, PMA, IMDRF, cyber security, health care systems

Procedia PDF Downloads 89
2244 Application of Neural Network in Portfolio Product Companies: Integration of Boston Consulting Group Matrix and Ansoff Matrix

Authors: M. Khajezadeh, M. Saied Fallah Niasar, S. Ali Asli, D. Davani Davari, M. Godarzi, Y. Asgari

Abstract:

This study aims to explore the joint application of the Boston and Ansoff matrices in the operational development of products. We conduct a deep analysis, utilizing an Artificial Neural Network, to predict the position of a product in the market when the company is interested in increasing its share. The data are gathered from two industries, hygiene and detergent. This is done by investigating the behavior of top-player companies and recommending strategic orientations. In conclusion, this combined analysis is appropriate for operational development; it also plays an important role in establishing the position of a product in the market for both the hygiene and detergent industries. More importantly, it elaborates the company's strategies for increasing its market share based on a combination of the Boston Consulting Group (BCG) matrix and the Ansoff matrix.

Keywords: artificial neural network, portfolio analysis, BCG matrix, Ansoff matrix

Procedia PDF Downloads 142
2243 A Philosophical Investigation into African Conceptions of Personhood in the Fourth Industrial Revolution

Authors: Sanelisiwe Ndlovu

Abstract:

Cities have become testbeds for automation and experimenting with artificial intelligence (AI) in managing urban services and public spaces. Smart Cities and AI systems are changing most human experiences from health and education to personal relations. For instance, in healthcare, social robots are being implemented as tools to assist patients. Similarly, in education, social robots are being used as tutors or co-learners to promote cognitive and affective outcomes. With that general picture in mind, one can now ask a further question about Smart Cities and artificial agents and their moral standing in the African context of personhood. There has been a wealth of literature on the topic of personhood; however, there is an absence of literature on African personhood in highly automated environments. Personhood in African philosophy is defined by the role one can and should play in the community. However, in today’s technologically advanced world, a risk is that machines become more capable of accomplishing tasks that humans would otherwise do. Further, on many African communitarian accounts, personhood and moral standing are associated with active relationality with the community. However, in the Smart City, human closeness is gradually diminishing. For instance, humans already do engage and identify with robotic entities, sometimes even romantically. The primary aim of this study is to investigate how African conceptions of personhood and community interact in a highly automated environment such as Smart Cities. Accordingly, this study lies in presenting a rarely discussed African perspective that emphasizes the necessity and the importance of relationality in handling Smart Cities and AI ethically. Thus, the proposed approach can be seen as the sub-Saharan African contribution to personhood and the growing AI debates, which takes the reality of the interconnectedness of society seriously. And it will also open up new opportunities to tackle old problems and use existing resources to confront new problems in the Fourth Industrial Revolution.

Keywords: smart city, artificial intelligence, personhood, community

Procedia PDF Downloads 202
2242 Contraceptives: Experiences of Agency and Coercion of Young People Living in Colombia

Authors: Paola Montenegro, Maria de los Angeles Balaguera Villa

Abstract:

Contraceptive methods play a fundamental role in preventing unwanted pregnancies and protecting users from sexually transmitted infections (STIs). Despite being known to almost the entire population of reproductive age living in Colombia, there are barriers, practices, and complex notions about contraceptives that affect their desired mass use and effectiveness. This work aims to analyse some of the perceptions and practices discussed with young people (13-28 years old) living in Colombia regarding the use of contraceptives in their daily lives, their preferences and needs, and the perceived side effects. This research also examines the perceived paradox in autonomy that young people experience regarding contraceptive use: on the one hand, its use (or lack of it) is interpreted as an act of self-determination and a primary example of reproductive agency; on the other hand, it was frequently associated with coercion and limited autonomy derived from the gaps in reliable information available to young people, the difficulty of accessing certain preferred methods, and sometimes the coercion exercised by doctors, partners, and/or family members. The data and analysis discussed in this work stem from a research project whose objective was to provide information about the needs and preferences in sexual and reproductive health of young people living in Colombia in relation to a possible telehealth service that could close the gap in access to quality care and safe information. Through a mixed-methods approach, this study collected 5,736 responses to a virtual survey disseminated nationwide in Colombia and 47 in-person interviews (24 of them with people who were assigned female at birth and 21 with local key stakeholders in the abortion ecosystem). Quantitative data were analyzed using Stata SE Version 16.0, and qualitative analysis was completed in NVivo using thematic analysis. Key findings on contraception use among young people living in Colombia reveal that 85.8% of participants had used a contraceptive method in the last two years, and that the most commonly used methods were condoms, contraceptive pills, the morning-after pill, and the withdrawal method. The remaining 14.2% of respondents, who declared not having used contraceptives in the last two years, expressed that the main four barriers to access were: "Lack of knowledge about contraceptive methods and where to obtain information and/or access them" (13.9%), "Having had sex with people who have vaginas" (10.2%), "Cost of contraceptive method" (8.4%), and "Difficulties in obtaining medical authorisations" (7.6%). These barriers coincided with the ones used to explain the non-use of contraceptives among young people, which reveals that limitations in information, cost, and quality of care represent structural issues that need to be addressed in programmes, services, and public policy. Finally, interviews showed that young people perceive contraceptive use and non-use as a reaffirmation of reproductive agency, and that limitations to this agency can be explained by the widespread incomplete knowledge about how methods work and the prevalence of other social representations of contraception associated with trust, fidelity, and partner preferences, which in the end constrain young people's autonomy.

Keywords: contraception, family planning, premarital fertility, unplanned pregnancy

Procedia PDF Downloads 76
2241 Profit-Based Artificial Neural Network (ANN) Trained by Migrating Birds Optimization: A Case Study in Credit Card Fraud Detection

Authors: Ashkan Zakaryazad, Ekrem Duman

Abstract:

A typical classification technique ranks the instances in a data set according to the likelihood of belonging to one (positive) class. A credit card (CC) fraud detection model ranks transactions in terms of the probability of being fraudulent. In fact, this approach is often criticized, because firms do not care about fraud probability but about the profitability or costliness of detecting a fraudulent transaction. The key contribution of this study is to focus on profit maximization in the model building step. The artificial neural network proposed in this study works on the basis of profit maximization instead of minimizing the prediction error. Moreover, some studies have shown that the backpropagation algorithm, like other gradient-based algorithms, usually gets trapped in local optima, and that swarm-based algorithms are more successful in this respect. In this study, we train our profit-maximization ANN using Migrating Birds Optimization (MBO), which was recently introduced to the literature.
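
The core idea above - score a small network by expected profit rather than squared error and search its weights with a population-based metaheuristic - is sketched below. The search loop is a generic random-mutation stand-in, not the actual MBO algorithm, and the transaction amounts, investigation cost, and network size are invented.

```python
# Sketch: profit-based objective for a tiny neural network, searched by a
# generic population-based metaheuristic (a stand-in for MBO).
import numpy as np

rng = np.random.default_rng(5)
n, d = 1000, 6
X = rng.normal(size=(n, d))
is_fraud = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n)) > 1.2
amount = rng.uniform(10, 500, n)              # synthetic transaction amounts

def forward(w, X):
    W1 = w[:d * 8].reshape(d, 8)
    b1, W2, b2 = w[d * 8:d * 8 + 8], w[d * 8 + 8:d * 8 + 16], w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))   # fraud score in (0, 1)

def profit(w):
    flag = forward(w, X) > 0.5
    saved = amount[flag & is_fraud].sum()     # fraud caught -> amount saved
    cost = 5.0 * flag.sum()                   # assumed fixed investigation cost
    return saved - cost

dim = d * 8 + 8 + 8 + 1
pop = rng.normal(size=(30, dim))
for _ in range(200):                          # crude population search, not MBO
    scores = np.array([profit(w) for w in pop])
    best = pop[scores.argsort()[-10:]]        # keep the 10 most profitable networks
    pop = best[rng.integers(10, size=30)] + rng.normal(0, 0.1, (30, dim))

print("best profit found:", round(max(profit(w) for w in pop), 1))
```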

Keywords: neural network, profit-based neural network, sum of squared errors (SSE), MBO, gradient descent

Procedia PDF Downloads 475
2240 Thermal Barrier Coated Diesel Engine With Neural Networks Mathematical Modelling

Authors: Hanbey Hazar, Hakan Gul

Abstract:

In this study, the piston, exhaust valves, and intake valves of a diesel engine were coated with a 300 µm thick layer of tungsten carbide (WC) using the HVOF coating method. Mathematical modeling of the coated and uncoated (standard) engines was performed using Artificial Neural Networks (ANN). The purpose was to decrease the number of test repetitions and reduce the test cost through mathematical modeling of the engines with ANN. The results obtained from the tests were entered into the ANN, and the engines' values at all speeds were thereby estimated. The results obtained from the tests were compared with those obtained from the ANN and were observed to be consistent. It was also observed that, with the thermal barrier coating, the hydrocarbon (HC), carbon monoxide (CO), and smoke density values of the diesel engine decreased, but nitrogen oxides (NOx) increased. Furthermore, it was determined that the results obtained through mathematical modeling by means of ANN reduced the number of test repetitions. Therefore, it was understood that time, fuel, and labor could be saved in this way.

Keywords: artificial neural network, diesel engine, mathematical modelling, thermal barrier coating

Procedia PDF Downloads 528
2239 Modeling of Digital and Settlement Consolidation of Soil under Oedometer

Authors: Yu-Lin Shen, Ming-Kuen Chang

Abstract:

In addition to a considerable amount of machinery and equipment, petrochemical plants contain intricate networks of transmission pipelines. Long-term corrosion may lead to pipeline thinning and rupture, causing serious safety concerns. With the advances in non-destructive testing technology, more rapid and long-range ultrasonic detection techniques are often used for pipeline inspection. The electromagnetic acoustic transducer (EMAT) requires no couplant for detection; it is a non-contact ultrasonic technique suitable for inspecting lines with elevated temperatures or roughened surfaces. In this study, we prepared artificial defects in a pipeline for Electromagnetic Acoustic Transducer (EMAT) testing in order to survey the relationship between defect location, size, and the EMAT signal. It was found that the EMAT signal amplitude exhibited greater attenuation with larger defect depth and length. In addition, a bigger flat-bottom hole diameter produced greater amplitude attenuation. In summary, the signal amplitude attenuation of EMAT was affected by the defect depth, the defect length, and the hole diameter.

Keywords: EMAT, artificial defect, NDT, ultrasonic testing

Procedia PDF Downloads 333