Search results for: artificial intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2026

1156 Leveraging xAPI in a Corporate e-Learning Environment to Facilitate the Tracking, Modelling, and Predictive Analysis of Learner Behaviour

Authors: Libor Zachoval, Daire O Broin, Oisin Cawley

Abstract:

E-learning platforms such as Blackboard have two major shortcomings: limited data capture, a result of the limitations of SCORM (Shareable Content Object Reference Model), and a lack of incorporation of Artificial Intelligence (AI) and machine learning algorithms, which could lead to better course adaptations. With the recent development of the Experience Application Programming Interface (xAPI), many additional types of data can be captured, which opens a window of possibilities from which online education can benefit. In a corporate setting, where companies invest billions in the learning and development of their employees, some learner behaviours can be troublesome because they hinder a learner's knowledge development. Behaviours that hinder knowledge development also raise ambiguity about a learner's knowledge mastery, specifically those related to gaming the system. Furthermore, a company receives little benefit from its investment if employees pass courses without possessing the required knowledge, and potential compliance risks may arise. Using xAPI and rules derived from a state-of-the-art review, we identified three learner behaviours, primarily related to guessing, in a corporate compliance course. The identified behaviours are: trying each option for a question, specifically for multiple-choice questions; selecting a single option for all the questions on the test; and continuously repeating tests upon failing, as opposed to going over the learning material. These behaviours were detected in learners who repeated the test at least four times before passing the course. These findings suggest that gauging a learner's mastery from multiple-choice test scores alone is a naive approach. Thus, next steps will consider the incorporation of additional data points, knowledge estimation models to model a learner's knowledge mastery more accurately, and analysis of the data for correlations between knowledge development and the identified learner behaviours. Additional work could explore how learner behaviours could be utilised to make changes to a course, for example, to course content (certain sections of learning material may be shown not to help many learners master the intended learning outcomes) or course design (such as the type and duration of feedback).
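
A minimal sketch of how behaviour rules of this kind might be expressed over simplified xAPI statements. The record fields (learner, attempt, question, choice, verb) and verb names are assumptions for illustration only, not the study's actual data model.

```python
# Sketch, assuming each xAPI statement has been flattened into a dict with
# "learner", "attempt", "question", "choice" and "verb" fields (hypothetical).
from collections import defaultdict

def tried_every_option(statements, options_per_question):
    """Flag (learner, question) pairs where every answer option was tried."""
    tried = defaultdict(set)
    for s in statements:
        if s["verb"] == "answered":
            tried[(s["learner"], s["question"])].add(s["choice"])
    return [key for key, choices in tried.items()
            if choices >= options_per_question[key[1]]]

def same_option_whole_test(statements):
    """Flag attempts where a single option letter was chosen for all questions."""
    per_attempt = defaultdict(set)
    for s in statements:
        if s["verb"] == "answered":
            per_attempt[(s["learner"], s["attempt"])].add(s["choice"])
    return [key for key, choices in per_attempt.items() if len(choices) == 1]

def repeats_without_review(statements, min_attempts=4):
    """Flag learners who repeated the test >= min_attempts times with no
    'viewed-material' statement recorded in between."""
    by_learner = defaultdict(list)
    for s in statements:
        by_learner[s["learner"]].append(s)
    flagged = []
    for learner, events in by_learner.items():
        attempts = [e for e in events if e["verb"] == "completed-test"]
        reviews = [e for e in events if e["verb"] == "viewed-material"]
        if len(attempts) >= min_attempts and not reviews:
            flagged.append(learner)
    return flagged
```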

Keywords: artificial intelligence, corporate e-learning environment, knowledge maintenance, xAPI

Procedia PDF Downloads 113
1155 Advancements in AI Training and Education for a Future-Ready Healthcare System

Authors: Shamie Kumar

Abstract:

Background: Radiologists and radiographers (RR) need to educate themselves and their colleagues to ensure that AI is integrated safely, usefully, and meaningfully, in a direction that always benefits the patients. AI education and training are fundamental to the way RR work and interact with it, such that they feel confident using it as part of their clinical practice in a way they understand. Methodology: This exploratory research outlines the current education and training gaps for radiographers and radiologists in AI radiology diagnostics. It reviews the status, skills, and challenges of education and teaching, the understanding of artificial intelligence within daily clinical practice, why it is fundamental, and why learning about AI is essential for wider adoption. Results: Current knowledge among RR is very sparse and country-dependent, and since radiologists are the majority of end-users of AI, their targeted training and AI learning opportunities surpass those available to radiographers. Many papers suggest there is a lack of knowledge, understanding, and training in AI in radiology amongst RR, and because of this they are unable to comprehend exactly how AI works and integrates, the benefits of using it, and its limitations. There is an indication they wish to receive specific training; however, both professions need to actively engage in learning about it and develop the skills that enable them to use it effectively. There is expected variability amongst the professions in their degree of commitment to AI, as most do not understand its value; this only adds to the need to train and educate RR. Currently, there is little AI teaching in either undergraduate or postgraduate study programs, and it is not readily available. In addition, there are other training programs, courses, workshops, and seminars available; most of these are short, single sessions rather than a continuation of learning, and cover only a basic understanding of AI and peripheral topics such as ethics, legal issues, and the potential of AI. There appears to be an obvious gap between the content such training programs offer and what RR need and want to learn. Because of this, there is a risk of ineffective learning outcomes and of attendees feeling a lack of clarity and depth of understanding of the practicality of using AI in a clinical environment. Conclusion: Education, training, and courses need to have defined learning outcomes with relevant concepts, ensuring theory and practice are taught as a continuation of the learning process based on use cases specific to a clinical working environment. Undergraduate and postgraduate courses should be developed robustly, ensuring they are delivered by experts in the field; in addition, training and other programs should be delivered as continued professional development and aligned with accredited institutions for a degree of quality assurance.

Keywords: artificial intelligence, training, radiology, education, learning

Procedia PDF Downloads 81
1154 Hidro-IA: An Artificial Intelligent Tool Applied to Optimize the Operation Planning of Hydrothermal Systems with Historical Streamflow

Authors: Thiago Ribeiro de Alencar, Jacyro Gramulia Junior, Patricia Teixeira Leite

Abstract:

The area of the electricity sector that deals, in a coordinated manner, with meeting energy needs from hydroelectric plants is called Operation Planning of Hydrothermal Power Systems (OPHPS). Its purpose is to find an operating policy that provides electrical power to the system over a given period with reliability and minimal cost. Therefore, it is necessary to determine an optimal generation schedule for each hydroelectric plant, in each interval, so that the system meets the demand reliably, avoiding rationing in years of severe drought, and minimizes the expected cost of operation over the planning horizon, defining an appropriate strategy for thermal complementation. Several optimization algorithms specifically applied to this problem have been developed and are in use. Although they provide solutions to various problems encountered, these algorithms have some weaknesses, such as difficulties in convergence and simplification of the original formulation of the problem, owing to the complexity of the objective function. An alternative to these challenges is the development of more sophisticated and reliable simulation-optimization techniques that can assist operation planning. Thus, this paper presents the development of a computational tool, namely Hidro-IA, to solve the optimization problem identified and to provide the user with easy handling. The Genetic Algorithm (GA) was adopted as the intelligent optimization technique, and the programming language is Java. First, the chromosomes were modelled; then the fitness function of the problem and the operators involved were implemented; finally, the graphical interfaces for user access were developed. The results obtained with the Genetic Algorithm were compared with the nonlinear programming (NLP) optimization technique. Tests were conducted with seven hydraulically interconnected hydroelectric plants, using historical streamflow from 1953 to 1955. The comparison between the GA and NLP techniques shows that the operating cost obtained with the GA becomes increasingly smaller than that of NLP as the number of interconnected hydroelectric plants increases. The program achieved coherent performance in problem resolution, without the need to simplify the calculations, together with ease of manipulating the simulation parameters and visualizing the output results.
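
An illustrative sketch only, not the Hidro-IA formulation: a tiny genetic algorithm searching hydro generation schedules that minimize the cost of thermal complementation. Plant capacities, demand, and the quadratic thermal cost are invented placeholders.

```python
# Toy GA sketch under stated assumptions (placeholder data, simplified cost).
import numpy as np

rng = np.random.default_rng(0)
n_plants, n_stages = 7, 36                         # e.g. monthly stages, 1953-1955
capacity = rng.uniform(200, 600, n_plants)         # MW per plant (placeholder)
demand = rng.uniform(1500, 2500, n_stages)         # MW per stage (placeholder)

def cost(schedule):
    """Thermal complementation cost: quadratic price on unmet demand, plus spill penalty."""
    hydro = (schedule * capacity[:, None]).sum(axis=0)
    thermal = np.clip(demand - hydro, 0.0, None)
    spill = np.clip(hydro - demand, 0.0, None).sum()
    return (thermal ** 2).sum() + 10.0 * spill

def evolve(pop_size=60, generations=300, mut_rate=0.1):
    pop = rng.uniform(0, 1, (pop_size, n_plants, n_stages))   # dispatch fractions
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]   # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.uniform(size=a.shape) < 0.5, a, b)  # uniform crossover
            mutate = rng.uniform(size=child.shape) < mut_rate
            child = np.clip(child + mutate * rng.normal(0, 0.1, child.shape), 0, 1)
            children.append(child)
        pop = np.concatenate([parents, np.array(children)])
    best = pop[np.argmin([cost(ind) for ind in pop])]
    return best, cost(best)

best_schedule, best_cost = evolve()
print("best complementation cost:", round(best_cost, 1))
```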

Keywords: energy, optimization, hydrothermal power systems, artificial intelligence, genetic algorithms

Procedia PDF Downloads 416
1153 Smart Construction Sites in KSA: Challenges and Prospects

Authors: Ahmad Mohammad Sharqi, Mohamed Hechmi El Ouni, Saleh Alsulamy

Abstract:

Due to the worldwide revolution in emerging technologies, the need to exploit and employ innovative technologies for new functions and purposes in different fields has become a matter of note. Saudi Arabia is considered one of the most powerful economies in the world, and the construction sector contributes substantially to its economy. Thus, the construction sector in KSA should keep pace with the rapid digital revolution and transformation and implement smart devices on sites. A Smart Construction Site (SCS) includes smart devices, artificial intelligence, the internet of things, augmented reality, building information modeling, geographical information systems, and cloud information. This paper aims to study the level of implementation of SCS in KSA, analyze the obstacles and challenges of adopting SCS, and find the critical success factors for its implementation. A survey of close-ended questions (scale and multiple-choice) was conducted among professionals in the construction sector of Saudi Arabia. A total of twenty-nine questions was prepared for respondents. Twenty-four scale questions were established and categorized into several themes: quality, scheduling, cost, occupational safety and health, technologies and applications, and general perception; the 5-point Likert scale (very low to very high) was adopted for these. In addition, five close-ended multiple-choice questions were prepared; these were derived from a previous study conducted in the United Kingdom (UK) and the Dominican Republic (DR) and were rearranged and organized to fit the structured survey, in order to compare the Kingdom of Saudi Arabia with the United Kingdom (UK) and the Dominican Republic (DR). A total of one hundred respondents participated in this survey from all regions of the Kingdom of Saudi Arabia: southern, central, western, eastern, and northern. The drivers, obstacles, and success factors for implementing smart devices and technologies in KSA's construction sector were investigated and analyzed. It was concluded that KSA is on the right path toward adopting smart construction sites, with results comparable to, and in some factors even better than, the UK.

Keywords: artificial intelligence, construction projects management, internet of things, smart construction sites, smart devices

Procedia PDF Downloads 144
1152 Data Access, AI Intensity, and Scale Advantages

Authors: Chuping Lo

Abstract:

This paper presents a simple model demonstrating that, ceteris paribus, countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Therefore, large countries, which inherently have greater data resources, tend to have higher incomes than smaller countries, such that the former may be more hesitant than the latter to liberalize cross-border data flows in order to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade, as they are better able to utilize global data.

Keywords: digital intensity, digital divide, international trade, economies of scale

Procedia PDF Downloads 56
1151 Ethical Issues in AI: Analyzing the Gap Between Theory and Practice - A Case Study of AI and Robotics Researchers

Authors: Sylvie Michel, Emmanuelle Gagnou, Joanne Hamet

Abstract:

Artificial intelligence poses major new ethical dilemmas. This article identifies a gap between the ethical questions that AI/robotics researchers grapple with in their research practice and those identified by a literature review. The objective is to understand which ethical dilemmas AI researchers identify or are concerned by, in order to compare them with the existing literature. This will make it possible to conduct training and awareness initiatives for AI researchers, encouraging them to consider these questions during the development of AI. Qualitative analyses were conducted based on direct observation of an AI/robotics research team focused on collaborative robotics over several months. Subsequently, semi-structured interviews were conducted with 16 members of the team. The entire process took place during the first semester of 2023. The observations were analyzed using an analytical framework, and the interviews were thematically analyzed using NVivo software. While the literature identifies three primary ethical concerns regarding AI (transparency, bias, and responsibility), the results first demonstrate that AI researchers are primarily concerned with the publication and valorization of their work, with their initial ethical concerns revolving around this matter. Questions arise regarding the extent to which publications should be "marketed" and the usefulness of some publications. Research ethics are a central consideration for these teams. Second, another result shows that the researchers studied adopt a consequentialist ethics (though not explicitly formulated as such). They ponder the consequences of their developments in terms of safety (for humans in relation to robots/AI), worker autonomy in relation to the robot, and the role of work in society (can robots take over jobs?). Lastly, the results indicate that the ethical dilemmas highlighted in the literature (responsibility, transparency, bias) do not explicitly appear in AI/robotics research. AI/robotics researchers raise specific and pragmatic ethical questions, concerning primarily publications and, afterwards, consequentialist considerations. The results demonstrate that these concerns are distant from the existing literature. However, the dilemmas highlighted in the literature also deserve to be explicitly contemplated by researchers. This article proposes that the journals these researchers target should mandate ethical reflection for all presented works. Furthermore, the results suggest offering awareness programs in the form of short educational sessions for researchers.

Keywords: ethics, artificial intelligence, research, robotics

Procedia PDF Downloads 71
1150 Exploring the Potential of Replika: An AI Chatbot for Mental Health Support

Authors: Nashwah Alnajjar

Abstract:

This research paper provides an overview of Replika, an AI chatbot application that uses natural language processing technology to engage in conversations with users. The app was developed to provide users with a virtual AI friend who can converse with them on various topics, including mental health. This study explores the experiences of Replika users using quantitative research methodology. A survey was conducted with 12 participants to collect data on their demographics, usage patterns, and experiences with the Replika app. The results showed that Replika has the potential to play a role in mental health support and well-being.

Keywords: Replika, chatbot, mental health, artificial intelligence, natural language processing

Procedia PDF Downloads 74
1149 A Comparative Study of Multi-SOM Algorithms for Determining the Optimal Number of Clusters

Authors: Imèn Khanchouch, Malika Charrad, Mohamed Limam

Abstract:

The interpretation of cluster quality and the determination of the optimal number of clusters is still a crucial problem in clustering. In this paper, we focus on the multi-SOM clustering method, which overcomes the problem of extracting the number of clusters from the SOM map through the use of a clustering validity index. We then tested multi-SOM on real and artificial data sets with evaluation criteria not used previously, such as the Davies-Bouldin index, the Dunn index, and the silhouette index. The developed multi-SOM algorithm is compared to the k-means and BIRCH methods. Results show that it is more efficient than classical clustering methods.
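
A small sketch of the index-based selection idea, using scikit-learn's k-means as a stand-in (the multi-SOM algorithm itself is not reproduced here) and an artificial data set: the cluster count is scanned and the validity indices pick the optimum.

```python
# Sketch: choosing the number of clusters via validity indices (k-means stand-in).
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, davies_bouldin_score

X, _ = make_blobs(n_samples=500, centers=4, random_state=1)   # artificial data set
scores = {}
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=1).fit_predict(X)
    scores[k] = {"silhouette": silhouette_score(X, labels),          # higher is better
                 "davies_bouldin": davies_bouldin_score(X, labels)}  # lower is better

best_by_silhouette = max(scores, key=lambda k: scores[k]["silhouette"])
best_by_db = min(scores, key=lambda k: scores[k]["davies_bouldin"])
print("optimal k (silhouette):", best_by_silhouette)
print("optimal k (Davies-Bouldin):", best_by_db)
```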

Keywords: clustering, SOM, multi-SOM, DB index, Dunn index, silhouette index

Procedia PDF Downloads 591
1148 Half-Circle Fuzzy Number Threshold Determination via Swarm Intelligence Method

Authors: P. W. Tsai, J. W. Chen, C. W. Chen, C. Y. Chen

Abstract:

In recent years, many researchers have been involved in the field of fuzzy theory. However, there are still many issues to be resolved, especially on topics related to controller design, such as robotics, artificial intelligence, and nonlinear systems. Besides fuzzy theory, swarm intelligence algorithms are also a popular field for researchers. In this paper, a concept of utilizing one of the swarm intelligence methods, called Bacterial-GA Foraging, to find a stabilizing common P matrix for the fuzzy controller system is proposed. An example is given in the paper as well.
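
A conceptual sketch only: the paper uses Bacterial-GA Foraging, but here a plain random search stands in for it, simply to show what checking a common P matrix means (A_i^T P + P A_i negative definite for every rule subsystem). The subsystem matrices are invented stable examples.

```python
# Sketch, assuming two stable fuzzy-rule subsystems; random search replaces
# Bacterial-GA Foraging purely for illustration.
import numpy as np

rng = np.random.default_rng(3)
A_list = [np.array([[-2.0, 1.0], [0.0, -1.5]]),
          np.array([[-1.0, 0.5], [-0.3, -2.0]])]

def is_pos_def(M):
    return np.all(np.linalg.eigvalsh((M + M.T) / 2) > 0)

def satisfies_lyapunov(P, A_list):
    """Common quadratic Lyapunov condition: A_i^T P + P A_i < 0 for every rule i."""
    return all(np.all(np.linalg.eigvalsh(A.T @ P + P @ A) < 0) for A in A_list)

def random_search(trials=10000):
    for _ in range(trials):
        L = rng.normal(size=(2, 2))
        P = L @ L.T + 1e-3 * np.eye(2)        # random symmetric positive-definite candidate
        if is_pos_def(P) and satisfies_lyapunov(P, A_list):
            return P
    return None

P = random_search()
print("common P matrix found:\n", P)
```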

Keywords: half-circle fuzzy numbers, predictions, swarm intelligence, Lyapunov method

Procedia PDF Downloads 674
1147 Compression and Air Storage Systems for Small Size CAES Plants: Design and Off-Design Analysis

Authors: Coriolano Salvini, Ambra Giovannelli

Abstract:

The use of renewable energy sources for electric power production leads to reduced CO2 emissions and contributes to improving domestic energy security. On the other hand, the intermittency and unpredictability of their availability pose relevant problems in fulfilling the load demand safely and cost-efficiently over time. Significant benefits in terms of "grid system applications", "end-use applications" and "renewable applications" can be achieved by introducing energy storage systems. Among the currently available solutions, CAES (Compressed Air Energy Storage) shows favorable features. Small and medium size plants equipped with artificial air reservoirs can constitute an interesting option for efficient and cost-effective distributed energy storage systems. The present paper addresses the design and off-design analysis of the compression system of small size CAES plants suited to absorbing electric power in the range of hundreds of kilowatts. The system of interest consists of an intercooled (and, if needed, aftercooled) multi-stage reciprocating compressor and a man-made reservoir obtained by connecting large-diameter steel pipe sections. A specific methodology for the preliminary sizing and off-design modeling of the system has been developed. Since, during the charging phase, the electric power absorbed over time has to change according to the peculiar CAES requirements, and the pressure ratio increases continuously as the reservoir fills, the compressor has to work at variable mass flow rate. In order to ensure an appropriately wide range of operations, particular attention has been paid to the selection of the most suitable compressor capacity control device. Given the capacity regulation margin of the compressor and the actual level of charge of the reservoir, the proposed approach allows the instant-by-instant evaluation of the minimum and maximum electric power absorbable from the grid. The developed tool gives useful information to appropriately size the compression system and to manage it in the most effective way. Various cases characterized by different system requirements are analysed; results are given and widely discussed.
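
A back-of-the-envelope sketch, not the authors' sizing methodology: pressure build-up in a pipe-section reservoir during charging and the corresponding power of an intercooled multi-stage compressor with equal stage pressure ratios. All numerical values are placeholders.

```python
# Sketch under stated assumptions: ideal gas, isothermal reservoir, equal stage
# pressure ratios, constant isentropic efficiency.
R, cp, gamma = 287.0, 1005.0, 1.4       # air properties (J/kg K, J/kg K, -)
T_in, T_res = 293.15, 293.15            # intake and reservoir temperature (K)
p_amb = 1.013e5                         # ambient pressure, Pa
V_res = 80.0                            # reservoir volume, m^3 (pipe sections)
p0, p_max = 5e5, 70e5                   # initial / final reservoir pressure, Pa
m_dot = 0.5                             # charged mass flow rate, kg/s
n_stages, eta = 4, 0.82                 # number of stages, isentropic efficiency

dt, t, p = 60.0, 0.0, p0                # time step (s)
mass = p0 * V_res / (R * T_res)
while p < p_max:
    pr_stage = (p / p_amb) ** (1.0 / n_stages)                 # equal stage pressure ratio
    w = n_stages * cp * T_in * (pr_stage ** ((gamma - 1) / gamma) - 1) / eta  # J/kg
    power_kw = m_dot * w / 1e3                                  # instantaneous electric power
    mass += m_dot * dt
    p = mass * R * T_res / V_res                                # isothermal reservoir (assumed)
    t += dt
print(f"charging time ~{t/3600:.1f} h, final power demand ~{power_kw:.0f} kW")
```

With these placeholder numbers the final power demand lands in the hundreds of kilowatts, i.e. the plant size range the paper targets.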

Keywords: artificial air storage reservoir, compressed air energy storage (CAES), compressor design, compression system management

Procedia PDF Downloads 219
1146 The Use of Artificial Intelligence in the Context of a Space Traffic Management System: Legal Aspects

Authors: George Kyriakopoulos, Photini Pazartzis, Anthi Koskina, Crystalie Bourcha

Abstract:

The need to secure safe access to and return from outer space, as well as to ensure the viability of outer space operations, keeps alive the debate over organizing space traffic through a Space Traffic Management (STM) system. The proliferation of outer space activities in recent years, as well as the dynamic emergence of the private sector, has gradually resulted in a diverse universe of actors operating in outer space. These developments have had an increasingly adverse impact on outer space sustainability, as the growing number of space debris clearly demonstrates. This landscape poses considerable threats to the outer space environment and its operators, which need to be addressed by a combination of scientific-technological measures and regulatory interventions. In this context, recourse to recent technological advancements, in particular Artificial Intelligence (AI) and machine learning systems, could yield substantial results in promoting space traffic management with respect to collision avoidance as well as launch and re-entry procedures/phases. New technologies can support the prospects of a successful space traffic management system at an international scale by enabling, inter alia, timely, accurate and analytical processing of large data sets and rapid decision-making, more precise space debris identification and tracking, and overall minimization of collision risks and reduction of operational costs. What is more, a significant part of space activities (i.e. the launch and/or re-entry phase) takes place in airspace rather than in outer space, hence the overall discussion also involves the highly developed, both technically and legally, international (and national) Air Traffic Management (ATM) system. Nonetheless, from a regulatory perspective, the use of AI for the purposes of space traffic management raises implications that merit particular attention. Key issues in this regard include the delimitation of AI-based activities as space activities, the designation of the applicable legal regime (international space or air law, national law), the assessment of the nature and extent of international legal obligations regarding space traffic coordination, as well as the appropriate liability regime applicable to AI-based technologies when operating for space traffic coordination, taking into particular consideration the dense regulatory developments at EU level. In addition, the prospects of institutionalizing international cooperation and promoting an international governance system, together with the challenges of establishing a comprehensive international STM regime, are revisited in the light of the intervention of AI technologies. This paper aims at examining the regulatory implications raised by the use of AI technology in the context of space traffic management operations and its key correlating concepts (SSA, space debris mitigation), drawing in particular on international and regional considerations in the field of STM (e.g. UNCOPUOS, the International Academy of Astronautics, the European Space Agency, among other actors), the promising advancements of the EU approach to AI regulation and, last but not least, national approaches regarding the use of AI in the context of space traffic management, in toto.
Acknowledgment: The present work was co-funded by the European Union and Greek national funds through the Operational Program "Human Resources Development, Education and Lifelong Learning " (NSRF 2014-2020), under the call "Supporting Researchers with an Emphasis on Young Researchers – Cycle B" (MIS: 5048145).

Keywords: artificial intelligence, space traffic management, space situational awareness, space debris

Procedia PDF Downloads 240
1145 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation

Authors: Jonathan Gong

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, however, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from those used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. The model is trained on 8577 images and validated on a validation split of 20%. These models are evaluated using the external dataset for validation, and their accuracy, precision, recall, f1-score, IOU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928. Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
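
A rough sketch of the DenseNet201 transfer-learning classifier portion in Keras (the autoencoder branch described in the abstract is omitted); the head layer sizes and input resolution are assumptions, not the author's exact architecture.

```python
# Sketch: frozen DenseNet201 feature extractor + dense classification head.
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.DenseNet201(include_top=False, weights="imagenet",
                                          input_shape=(224, 224, 3))
base.trainable = False                        # use DenseNet201 as a frozen feature extractor

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(3, activation="softmax"),    # COVID-19 / normal / pneumonia
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels),
#           epochs=20)   # the paper used 4035 training / 807 validation images
```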

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 121
1144 Electromechanical Behaviour of Chitosan Based Electroactive Polymer

Authors: M. Sarikanat, E. Akar, I. Şen, Y. Seki, O. C. Yılmaz, B. O. Gürses, L. Cetin, O. Özdemir, K. Sever

Abstract:

Chitosan is a natural, nontoxic, cheap polyelectrolyte polymer. In this study, a chitosan-based electroactive polymer (CBEAP) was fabricated. The electroactive properties of this polymer were investigated at different voltages. It exhibited excellent tip displacement at low voltages (1, 3, 5, 7 V), and tip displacement increased as the applied voltage increased. The best tip displacement observed was 28 mm at 5 V. CBEAP was characterized by scanning electron microscopy, X-ray diffraction, and tensile testing. CBEAP exhibited the desired electroactive properties at low voltages and is suitable for use in artificial muscles and various robotic applications.

Keywords: chitosan, electroactive polymer, electroactive properties

Procedia PDF Downloads 505
1143 Adapting Liability in the Era of Automated Decision-Making: A South African Labour Law Perspective

Authors: Aisha Adam

Abstract:

This study critically examines the transformative impact of automated decision-making (ADM) and artificial intelligence (AI) systems on South African labour law. As AI technologies increasingly infiltrate workplaces, existing liability frameworks face challenges in addressing the unique complexities presented by these innovations. This article explores the necessity of redefining liability to accommodate the nuanced landscape of ADM and AI within South African labour law. It emphasises the importance of ensuring responsible deployment and safeguarding the rights of workers amid evolving technological dynamics. This research investigates the central concern of fairness, bias, and discrimination in ADM and AI decision-making. Focusing on algorithmic bias and discriminatory outcomes, the paper advocates for the integration of mechanisms within the South African legal framework, particularly under the Promotion of Equality and Prevention of Unfair Discrimination Act (PEPUDA) and the Employment Equity Act (EEA). The study scrutinises the shifting dynamics of the employment relationship, calling for clear guidelines on the responsibilities and liabilities of employers, employees, and technology providers. Furthermore, the article analyses legal and policy responses to ADM and AI within South African labour law, exploring potential amendments to legislation, guidelines, and codes of practice. It assesses the role of regulatory bodies, specifically the Commission for Conciliation, Mediation, and Arbitration (CCMA), in overseeing and enforcing responsible practices in the workplace. Lastly, the research evaluates the impact of ADM and AI on human and social rights in the South African context. Emphasising the protection of constitutional rights, including fair labour practices, privacy, and equality, the study proposes remedies and safeguards. It advocates for a multidisciplinary approach involving legal, technological, and ethical considerations to redefine liability in South African labour law effectively. The article contends that a shift from accountability to responsibility is crucial for promoting fairness, antidiscrimination, and the protection of human and social rights in the age of automated decision-making. It calls for collaborative efforts among stakeholders to shape responsible practices and redefine liability in this evolving technological landscape.

Keywords: automated decision-making, artificial intelligence, labour law, vicarious liability

Procedia PDF Downloads 66
1142 Towards a Computational Model of Consciousness: Global Abstraction Workspace

Authors: Halim Djerroud, Arab Ali Cherif

Abstract:

We assume that conscious functions are implemented automatically; in other words, that consciousness, as well as the non-conscious aspects of human thought, planning, and perception, is produced by biologically adaptive algorithms. We propose that the mechanisms of consciousness can be produced using adaptive algorithms similar to those executed by these mechanisms. In this paper, we propose a computational model of consciousness, the "Global Abstraction Workspace", which is an internal environment model conceived as a multi-agent system. This system is able to evolve and generate new data and processes, as well as actions in the environment.

Keywords: artificial consciousness, cognitive architecture, global abstraction workspace, multi-agent system

Procedia PDF Downloads 329
1141 Using the Semantic Web Technologies to Bring Adaptability in E-Learning Systems

Authors: Fatima Faiza Ahmed, Syed Farrukh Hussain

Abstract:

The last few decades have seen a large proportion of our population turning towards e-learning technologies, from learning tools used in primary and elementary schools to competency-based e-learning systems specifically designed for applications such as finance and marketing. The huge diversity of this audience brings a large number of challenges for the designers of these e-learning systems, one of which is adaptability. This paper focuses on the adaptability of the learning material in an e-learning course and on how artificial intelligence and the semantic web can be used as effective tools for this purpose. The study demonstrated that the semantic web, still a hot topic in computer science, can be a powerful tool in designing and implementing adaptable e-learning systems.

Keywords: adaptable e-learning, HTMLParser, information extraction, semantic web

Procedia PDF Downloads 313
1140 Selecting the Best RBF Neural Network Using PSO Algorithm for ECG Signal Prediction

Authors: Najmeh Mohsenifar, Narjes Mohsenifar, Abbas Kargar

Abstract:

In this paper, a stable method for predicting ECG signals with RBF neural networks, selected by the PSO algorithm, is presented. Although the ECG signal of a healthy person is quasi-periodic, the electrocardiographic data of a patient contain distortions; therefore, there is no precise mathematical model for prediction. Here, we have exploited neural networks, which are capable of complicated nonlinear mapping. Although the architecture and spread of RBF networks are usually selected through trial and error, the PSO algorithm has been used to choose the best neural network. In this way, 2 seconds of a recorded ECG signal are employed to predict a duration of 20 seconds in advance. Our simulations show that the PSO algorithm can find the RBF neural network with minimum MSE, and the accuracy of the predicted ECG signal is 97%.
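
A simplified sketch of the PSO-tuned RBF idea: a synthetic quasi-periodic signal stands in for real ECG data, and a one-dimensional particle swarm searches the RBF spread that minimizes validation MSE of a one-step-ahead predictor. Window sizes, centre count, and PSO constants are assumptions.

```python
# Sketch under stated assumptions (synthetic data, spread-only PSO search).
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 40, 0.01)
signal = np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.normal(size=t.size)  # stand-in for ECG

lag = 50
X = np.array([signal[i:i + lag] for i in range(len(signal) - lag)])
y = signal[lag:]
split = int(0.7 * len(X))
Xtr, ytr, Xva, yva = X[:split], y[:split], X[split:], y[split:]
centers = Xtr[rng.choice(len(Xtr), 40, replace=False)]        # fixed RBF centers

def rbf_features(X, spread):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * spread ** 2))

def validation_mse(spread):
    Phi = rbf_features(Xtr, spread)
    w = np.linalg.lstsq(Phi, ytr, rcond=None)[0]               # linear output weights
    return float(np.mean((rbf_features(Xva, spread) @ w - yva) ** 2))

# one-dimensional PSO over the spread parameter
n_particles, iters = 15, 30
pos = rng.uniform(0.1, 5.0, n_particles)
vel = np.zeros(n_particles)
pbest, pbest_val = pos.copy(), np.array([validation_mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)]
for _ in range(iters):
    r1, r2 = rng.uniform(size=n_particles), rng.uniform(size=n_particles)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.05, 10.0)
    vals = np.array([validation_mse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]
print("best spread:", round(float(gbest), 3), "validation MSE:", round(float(min(pbest_val)), 6))
```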

Keywords: electrocardiogram, RBF artificial neural network, PSO algorithm, predict, accuracy

Procedia PDF Downloads 617
1139 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies - it will kill an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e. season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
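
A minimal sketch of the model-averaging idea with scikit-learn: soft voting averages the probabilities of a logistic regression, a random forest, and a neural network. The synthetic features stand in for the study's timing, weather, and pollutant variables.

```python
# Sketch: soft-voting ensemble of the three algorithm families named in the abstract.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# stand-in for timing variables, weather forecasts and past pollutant readings
X, y = make_classification(n_samples=2000, n_features=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                ("nn", MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000,
                                     random_state=0))],
    voting="soft")                      # average the three models' predicted probabilities
ensemble.fit(X_train, y_train)
print("combined accuracy:", round(ensemble.score(X_test, y_test), 3))
```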

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 110
1138 A New Approach to Predicting Physical Biometrics from Behavioural Biometrics

Authors: Raid R. O. Al-Nima, S. S. Dlay, W. L. Woo

Abstract:

A relationship between face and signature biometrics is established in this paper. A new approach is developed to predict faces from signatures by using artificial intelligence. A multilayer perceptron (MLP) neural network is used to generate face details from features extracted from signatures; here, the face is the physical biometric and the signature is the behavioural biometric. The new method establishes a relationship between the two biometrics and regenerates a visible face image from the signature features. Furthermore, the performance of our new technique is demonstrated in terms of minimum error rates compared to published work.
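
An illustrative sketch only: a multilayer perceptron regressing face feature vectors from signature feature vectors. The synthetic mapping below merely stands in for the paper's extracted biometric features.

```python
# Sketch under stated assumptions (synthetic signature and face feature vectors).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, sig_dim, face_dim = 400, 30, 64
signatures = rng.normal(size=(n_samples, sig_dim))             # behavioural biometric features
W = rng.normal(size=(sig_dim, face_dim))
faces = np.tanh(signatures @ W) + 0.05 * rng.normal(size=(n_samples, face_dim))  # physical biometric features

X_train, X_test, y_train, y_test = train_test_split(signatures, faces, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
pred = mlp.predict(X_test)
print("mean absolute error per face feature:", float(np.abs(pred - y_test).mean()))
```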

Keywords: behavioural biometric, face biometric, neural network, physical biometric, signature biometric

Procedia PDF Downloads 468
1137 Evaluation of Low Temperature as Treatment Tool for Eradication of Mediterranean Fruit Fly (Ceratitis capitata) in Artificial Diet

Authors: Farhan J. M. Al-Behadili, Vineeta Bilgi, Miyuki Taniguchi, Junxi Li, Wei Xu

Abstract:

The Mediterranean fruit fly (Ceratitis capitata) is one of the most destructive pests of fruits and vegetables. The medfly originated in Africa, has spread to many countries, and is currently an endemic pest in Western Australia. It has been recorded on over 300 plant species, including fruits, vegetables, and nuts; its main hosts include blueberries, citrus, stone fruit, pome fruits, peppers, tomatoes, and figs. Global trade in fruits and other farm-fresh products suffers from the damage caused by this pest, which has prompted the need to develop more effective ways to control it. The available quarantine treatment technologies mainly include chemical treatment (e.g., fumigation) and non-chemical treatments (e.g., cold, heat, and irradiation). In recent years, with the loss of several chemicals, it has become even more important to rely on non-chemical postharvest control technologies (i.e., heat, cold, and irradiation) to control fruit flies. Cold treatment is one of the most promising directions in postharvest treatment because it is free of chemical residues, mitigates or kills the pest population, increases the firmness of the fruits, and prolongs storage time. It can also be applied to fruits after packing and 'in transit' during lengthy sea transport for export. However, only limited systematic study of the cold treatment of medfly stages in artificial diets has been reported, and such work is critical to provide a scientific basis for comparison with previous research on plant products and for designing effective cold treatments suitable for exported plant products. The overall purpose of this study was to evaluate and understand medfly responses to cold treatments; the different medfly stages were tested. The long-term goal is to optimize current postharvest treatments and develop more environmentally friendly, cost-effective, and efficient treatments for controlling the medfly. Cold treatment with different exposure times was studied to evaluate cold eradication treatment of the Mediterranean fruit fly (Ceratitis capitata) reared on a carrot diet. Mortality was the main aspect studied, together with the effect of exposure time on the mean mortality of the medfly stages.

Keywords: cold treatment, fruit fly, Ceratitis capitata, carrot diet, temperature effects

Procedia PDF Downloads 219
1136 Intelligent Prediction System for Diagnosis of Heart Attack

Authors: Oluwaponmile David Alao

Abstract:

Due to the increase in the death rate as a result of heart attacks, there is a need to develop a system that can be useful in the diagnosis of the disease at the medical centre. Such a system will help prevent misdiagnosis by the medical practitioner or physician. In this research work, a heart disease dataset obtained from the UCI repository has been used to develop an intelligent prediction and diagnosis system. The system is modelled as a feedforward neural network and trained with the backpropagation algorithm. A recognition rate of 86% is obtained from testing the network.
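
A hedged sketch of such a classifier: a feedforward network trained with backpropagation on the UCI heart disease data. The file name and column layout below are assumptions about how the dataset might be exported, not details given in the abstract.

```python
# Sketch, assuming a local CSV export of the UCI heart disease data with a
# "target" column (hypothetical layout).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("heart.csv")                    # assumed local copy of the UCI dataset
X = df.drop(columns=["target"]).values           # clinical attributes
y = (df["target"] > 0).astype(int).values        # 1 = heart disease present

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16, 8), solver="adam",
                                    max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("recognition rate:", round(model.score(X_test, y_test), 3))
```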

Keywords: heart disease, artificial neural network, diagnosis, prediction system

Procedia PDF Downloads 440
1135 Best Resource Recommendation for a Stochastic Process

Authors: Likewin Thomas, M. V. Manoj Kumar, B. Annappa

Abstract:

The aim of this study was to develop an artificial neural network-based recommendation model for an online process, using the complexity of load, performance, and average servicing time of the resources. Here, the proposed model investigates resource performance using the stochastic gradient descent method for learning a ranking function. A probabilistic cost function is implemented to identify the optimal θ values (load) for each resource. Based on this result, the resource most suitable for performing the currently executing task is recommended. Test results on the CoSeLoG project are presented, with an accuracy of 72.856%.
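
A small sketch of the idea, not the paper's ADALINE model: a linear scorer trained by stochastic gradient descent on load, performance, and mean service time, then used to recommend the best resource for the next task. All data below are invented placeholders.

```python
# Sketch under stated assumptions (synthetic resource features and suitability).
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(1)
# columns: current load, historical performance, average servicing time (assumed features)
X = rng.uniform(0, 1, size=(500, 3))
true_theta = np.array([-0.6, 0.9, -0.5])          # placeholder ground-truth preference
suitability = X @ true_theta + 0.05 * rng.normal(size=500)

model = SGDRegressor(max_iter=2000, tol=1e-6, random_state=1)   # trained by SGD
model.fit(X, suitability)

candidate_resources = rng.uniform(0, 1, size=(5, 3))
scores = model.predict(candidate_resources)
print("recommended resource index:", int(np.argmax(scores)))
```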

Keywords: ADALINE, neural network, gradient descent, process mining, resource behaviour, polynomial regression model

Procedia PDF Downloads 378
1134 The Impact of Artificial Intelligence on Human Rights Development

Authors: Kerols Seif Said Botros

Abstract:

The relationship between development and human rights has been debated for a long time. Various principles, from the right to development to development-based human rights, are applied to understand the dynamics between these two concepts. Despite the measures taken, the connection between development and human rights remains vague. Nevertheless, attention to the connection between these two notions and to the need to strengthen human rights has increased in recent years. It is then examined whether the right to sustainable development is accepted in various human rights instruments, which lends support to the claim cited above. The paper then cites domestic and international human rights treaties, as well as jurisprudence and regulations defining human rights institutions, to support this view.

Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers’ rights, justice, security

Procedia PDF Downloads 42
1133 Immuno-Modulatory Role of Weeds in Feeds of Cyprinus Carpio

Authors: Vipin Kumar Verma, Neeta Sehgal, Om Prakash

Abstract:

Cyprinus carpio is widespread in the lakes and rivers of Europe and Asia. Heavy losses in the natural environment due to anthropogenic activities, including pollution, as well as pathogenic diseases, have landed this fish on the IUCN Red List of vulnerable species. The significance of a suitable diet in preserving the health status of fish is widely recognized. In the present study, artificial feed supplemented with the leaves of two weed plants, Eichhornia crassipes and Ricinus communis, was evaluated for its role on the fish immune system. To achieve this objective, fish were acclimatized to laboratory conditions (25 ± 1 °C; 12L:12D) for 10 days prior to the start of the experiment and divided into 4 groups: non-challenged (negative control = A) and challenged [positive control (B) and experimental (C & D)]. Groups A and B were fed non-supplemented feed, while groups C and D were fed feed supplemented with 5% Eichhornia crassipes and 5% Ricinus communis, respectively. The supplemented feeds were evaluated for their effect on growth, health, the immune system, and disease resistance in fish challenged with Vibrio harveyi. Fingerlings of C. carpio (weight 2.0 ± 0.5 g) were exposed to a fresh overnight culture of V. harveyi (concentration 2 × 10⁵) through bath immunization for 2 hours at 10-day intervals for 40 days. Growth was monitored through the increase in relative weight. The rate of mortality due to bacterial infection, as well as due to the effect of the feed, was recorded accordingly. The immune response of fish was analyzed through differential leucocyte count, percentage phagocytosis, and phagocytic index. The effect of V. harveyi on fish organs was examined through histopathological examination of internal organs such as the spleen, liver, and kidney. The change in the immune response was also observed through gene expression analysis. The antioxidant potential of the plant extracts was measured through DPPH and FRAP assays, and the amounts of total phenols and flavonoids were calculated through biochemical analysis. The chemical composition of the plants' methanol extracts was determined by GC-MS analysis, which showed the presence of various secondary metabolites and other compounds. The investigation revealed an immuno-modulatory effect of the plants when supplemented in the artificial feed of the fish.

Keywords: immuno-modulation, GC-MS, Cyprinus carpio, Eichhornia crassipes, Ricinus communis

Procedia PDF Downloads 479
1132 Identification of Suitable Sites for Rainwater Harvesting in Salt Water Intruded Area by Using Geospatial Techniques in Jafrabad, Amreli District, India

Authors: Pandurang Balwant, Ashutosh Mishra, Jyothi V., Abhay Soni, Padmakar C., Rafat Quamar, Ramesh J.

Abstract:

Seawater intrusion into coastal aquifers has become one of the major environmental concerns. Although it is a natural phenomenon, it can be induced by anthropogenic activities such as excessive exploitation of groundwater, seacoast mining, etc. The geological and hydrogeological conditions, including groundwater heads and groundwater pumping patterns in coastal areas, also influence the magnitude of seawater intrusion. However, this problem can be remediated by taking preventive measures such as rainwater harvesting and artificial recharge. The present study is an attempt to identify suitable sites for rainwater harvesting in the salt-intrusion-affected area near the coastal aquifer of Jafrabad town, Amreli district, Gujarat, India. The physico-chemical water quality results show that, of the 25 groundwater samples collected from the study area, most were found to contain high concentrations of Total Dissolved Solids (TDS), with major fractions of Na and Cl ions. The Cl/HCO3 ratio was also found to be greater than 1, which indicates saltwater contamination in the study area. A geophysical survey was conducted at nine sites within the study area to explore the extent of seawater contamination. From the inverted resistivity sections, low-resistivity zones (<3 Ohm m) associated with seawater contamination were demarcated in the north block pit and south block pit of the NCJW mines, Mitiyala village, Lotpur, and Lunsapur village at depths of 33 m, 12 m, 40 m, 37 m, and 24 m, respectively. Geospatial techniques, in combination with the Analytical Hierarchy Process (AHP) and considering hydrogeological factors, geographical features, drainage pattern, water quality, and the geophysical results for the study area, were exploited to identify potential zones for rainwater harvesting. A rainwater harvesting suitability model was developed in ArcGIS 10.1 software, and a rainwater harvesting suitability map for the study area was generated. AHP in combination with weighted overlay analysis is an appropriate method to identify rainwater harvesting potential zones. The suitability map can be further utilized as a guidance map for the development of rainwater harvesting infrastructure in the study area, for either artificial groundwater recharge facilities or the direct use of harvested rainwater.
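
A sketch of the AHP step only: the criteria and pairwise judgements below are invented for illustration and are not the study's actual comparisons; the toy rasters merely show the weighted-overlay arithmetic.

```python
# Sketch: AHP criterion weights, consistency ratio, and a toy weighted overlay.
import numpy as np

criteria = ["slope", "drainage density", "soil", "land use", "groundwater quality"]
# Saaty-style pairwise comparison matrix (reciprocal, diagonal = 1) - assumed judgements
A = np.array([[1,   3,   5,   4,   2  ],
              [1/3, 1,   3,   2,   1/2],
              [1/5, 1/3, 1,   1/2, 1/4],
              [1/4, 1/2, 2,   1,   1/3],
              [1/2, 2,   4,   3,   1  ]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                           # AHP criterion weights (principal eigenvector)

lambda_max, n = eigvals.real[k], A.shape[0]
CI = (lambda_max - n) / (n - 1)
RI = 1.12                                          # Saaty's random index for n = 5
print("consistency ratio:", round(CI / RI, 3))     # < 0.1 means judgements are acceptable
for name, w in zip(criteria, weights):
    print(f"{name:20s} weight = {w:.3f}")

# weighted overlay: suitability = sum(weight_i * reclassified_layer_i) per cell
layers = np.stack([np.random.default_rng(i).integers(1, 6, size=(4, 4))
                   for i in range(n)])             # toy 4x4 reclassified rasters
suitability = np.tensordot(weights, layers, axes=1)
print("suitability grid:\n", np.round(suitability, 2))
```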

Keywords: analytical hierarchy process, groundwater quality, rainwater harvesting, seawater intrusion

Procedia PDF Downloads 163
1131 An Easy Approach for Fabrication of Macroporous Apatite-Based Bone Cement Used As Potential Trabecular Bone Substitute

Authors: Vimal Kumar Dewangan, T. S. Sampath Kumar, Mukesh Doble, Viju Daniel Varghese

Abstract:

Apatite-based, i.e., calcium-deficient hydroxyapatite (CDHAp), bone cement is a well-known potential bone graft/substitute in orthopaedics due to its chemical similarity to natural bone mineral. Therefore, an easy approach was attempted to fabricate apatite-based (CDHAp) bone cement with improved injectability, bioresorbability, and macroporosity. In this study, the desired bone cement was developed by mixing a solid phase (consisting of wet-chemically synthesized nanocrystalline hydroxyapatite and commercially available synthetic tricalcium phosphate) and a liquid phase (consisting of a cement binding accelerator with a few biopolymers in a dilute acidic solution), along with a liquid porogen (polysorbate) or, for comparison, a solid porogen (mannitol), in an optimized liquid-to-powder ratio. The fabricated cement sets within the clinically preferred setting time (≤20 minutes), is well injectable (>70%), and is stable at ~7.3-7.4 (physiological pH). The CDHAp-phase bone cement was obtained by immersing the fabricated, set cement in phosphate buffer solution and other similar artificial body fluids and incubating it at physiological conditions for seven days, as confirmed through X-ray diffraction and Fourier transform infrared spectroscopy analyses. The so-formed synthetic apatite-based bone cement has acceptable compressive strength (within the range of trabecular bone), with average interconnected pore sizes falling in the macropore range (~50-200 μm) inside the cement, as verified by scanning electron microscopy (SEM), mercury intrusion porosimetry, and micro-CT analysis. It is also biodegradable (degrading ~19-22% within 10-12 weeks) when incubated in artificial body fluids under physiological conditions. The biocompatibility study of the bone cement incubated with MG63 cells shows a significant increase in cell viability after the 3rd day of incubation compared with the control, and the cells were well attached and completely spread on the surface of the bone cement, as confirmed through SEM and fluorescence microscopy analyses. With all this, we can conclude that the developed synthetic macroporous apatite-based bone cement may have the potential to become a promising material for use as a trabecular bone substitute.

Keywords: calcium deficient hydroxyapatite, synthetic apatite-based bone cement, injectability, macroporosity, trabecular bone substitute

Procedia PDF Downloads 79
1130 Scope of Rainwater Harvesting in Residential Plots of Dhaka City

Authors: Jubaida Gulshan Ara, Zebun Nasreen Ahmed

Abstract:

Urban flooding and drought have been major problems in Dhaka city, particularly in recent years. The continuous increase of the city's built-up area, which limits rainwater infiltration zones, is thought to be the main cause of the problem. Proper rainwater management, even at the individual plot level, might bring significant improvement in this regard. As the residential use pattern occupies a significant portion of the city surface, the scope of rainwater harvesting (RWH) in residential buildings can be investigated. This paper reports on research which explored the scope of rainwater harvesting in residential plots with multifamily apartment buildings in Dhaka city. The research investigated the basics of RWH; contextual information, i.e., hydro-geological and meteorological data of Dhaka city; and the rules and legislation for residential building construction. The study also explored contemporary rainwater harvesting practices in the local and international contexts. On the basis of this theoretical understanding, 21 sample case studies, in different phases of construction, were selected from seven different categories of plot sizes in different residential areas of Dhaka city. Primary data from the 21 case-study buildings were collected from a physical survey and from design drawings, accompanied by a questionnaire survey. All necessary secondary data were gathered from published and other relevant sources. The collected primary and secondary data were used to calculate and analyze the RWH needs for each case study, based on the theoretical understanding. The main findings have been compiled and compared to observe residential development trends with regard to building rainwater harvesting systems. The study has found that, in multifamily apartment buildings of Dhaka city, the storage and recharge structure size for rainwater harvesting increases with the number of occupants and with the size of the plot. Hence, the demand-to-supply ratio remains almost the same for different plot sizes, and consequently the size of the storage structure increases significantly in large-scale plots. It has been found that rainwater can meet only 12%-30% of the total restricted water demand of these residential buildings of Dhaka city. Therefore, artificial groundwater recharge might be a more suitable option for RWH than storage. The study concluded that, in multifamily residential apartments of Dhaka city, artificial groundwater recharge might be a more suitable option for RWH than storing the rainwater on site.
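
A rough worked example of the supply-versus-demand comparison behind such figures. The roof area, rainfall, runoff coefficient, occupancy, and per-capita demand below are assumptions for illustration, not values reported in the study.

```python
# Sketch: annual rooftop rainwater supply vs. restricted (non-potable) demand.
roof_area_m2 = 250.0           # catchment area of one apartment plot (assumed)
annual_rainfall_mm = 2000.0    # order of magnitude for Dhaka (assumed)
runoff_coefficient = 0.85      # losses on a concrete roof (assumed)

# 1 mm of rain on 1 m2 yields 1 litre
annual_supply_litres = roof_area_m2 * annual_rainfall_mm * runoff_coefficient

occupants = 80                 # e.g. 16 apartments x 5 persons (assumed)
restricted_demand_lpcd = 50    # non-potable uses only: flushing, washing (assumed)
annual_demand_litres = occupants * restricted_demand_lpcd * 365

coverage = annual_supply_litres / annual_demand_litres
print(f"supply  {annual_supply_litres/1000:.0f} m3/yr, demand {annual_demand_litres/1000:.0f} m3/yr")
print(f"rainwater covers about {100*coverage:.0f}% of the restricted demand")
```

With these placeholder numbers the coverage comes out around 29%, i.e. within the 12%-30% range the study reports.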

Keywords: Dhaka city, rainwater harvesting, residential plots, urban flood

Procedia PDF Downloads 182
1129 Market Index Trend Prediction using Deep Learning and Risk Analysis

Authors: Shervin Alaei, Reza Moradi

Abstract:

Trading in financial markets is subject to risks due to their high volatility. Here, using an LSTM neural network and some risk-based feature engineering, we developed a method that can accurately predict trends of the Tehran Stock Exchange market index a few days in advance. Our test results show that the proposed method, with an average prediction accuracy of more than 94%, is superior to other common machine learning algorithms. To the best of our knowledge, this is the first work incorporating deep learning and risk factors to accurately predict market trends.
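
A minimal sketch of this kind of model: a synthetic price series stands in for Tehran Stock Exchange index data, the rolling volatility is used as a simple risk-based feature, and an LSTM classifies the next-day trend. Window length, features, and architecture are assumptions.

```python
# Sketch under stated assumptions (synthetic data, binary next-day trend label).
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, 3000)
prices = 100 * np.cumprod(1 + returns)                          # placeholder index series

lookback = 20
vol = np.convolve(returns ** 2, np.ones(10) / 10, mode="same") ** 0.5   # simple risk feature
features = np.column_stack([returns, vol])

X, y = [], []
for i in range(lookback, len(prices) - 1):
    X.append(features[i - lookback:i])
    y.append(1.0 if prices[i + 1] > prices[i] else 0.0)          # next-day trend label
X, y = np.array(X), np.array(y)
split = int(0.8 * len(X))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(lookback, features.shape[1])),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X[:split], y[:split], epochs=5, batch_size=64, verbose=0)
print("test accuracy:", model.evaluate(X[split:], y[split:], verbose=0)[1])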

Keywords: deep learning, LSTM, trend prediction, risk management, artificial neural networks

Procedia PDF Downloads 139
1128 The Extent of Virgin Olive-Oil Prices' Distribution Revealing the Behavior of Market Speculators

Authors: Fathi Abid, Bilel Kaffel

Abstract:

The olive tree, the olive harvest during the winter season, and the production of olive oil (better known by professionals as the crushing operation) have interested institutional traders such as olive-oil offices, private food-industry companies refining and extracting pomace olive oil, and export-import public and private companies specializing in olive oil. The major problem facing olive oil producers each winter campaign is, contrary to what might be expected, not whether the harvest will be good, but whether the sale price will allow them to cover production costs and achieve a reasonable profit margin. These questions are entirely legitimate, judging by the importance of the issue and the heavy complexity of the uncertainty and competition, made tougher by a high level of indebtedness and by the experience and expertise of speculators and producers whose objectives are sometimes conflicting. The aim of this paper is to study the formation mechanism of olive oil prices in order to learn about speculators' behavior and expectations in the market, how they contribute through their industry knowledge and their financial alliances, and the size of the financial challenge that may be involved for them in building private information channels globally to take advantage of it. The methodology used in this paper is based on two stages. In the first stage, we study econometrically the formation mechanisms of the olive oil price in order to understand market participants' behavior, by implementing ARMA, SARMA, GARCH, and stochastic diffusion process models. The second stage is devoted to prediction purposes, using a combined wavelet-ANN approach. Our main findings indicate that olive oil market participants interact with each other in a way that promotes the formation of stylized facts. Unstable participant behavior creates the volatility clustering, nonlinear dependence, and cyclicity phenomena. By imitating each other in some periods of the campaign, different participants contribute to the fat tails observed in the olive oil price distribution. The best prediction model for the olive oil price is based on a backpropagation artificial neural network approach, with input information based on wavelet decomposition and recent past history.
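
A sketch of the second-stage (wavelet-ANN) predictor using PyWavelets and a scikit-learn MLP. The synthetic price series, window length, and wavelet choice are placeholders, not the paper's olive-oil data or exact specification.

```python
# Sketch: wavelet coefficients of a sliding window feed an MLP one-step-ahead predictor.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 1500
price = 3.0 + 0.4 * np.sin(np.arange(n) * 2 * np.pi / 250) + np.cumsum(rng.normal(0, 0.01, n))

window = 64
def wavelet_features(segment):
    coeffs = pywt.wavedec(segment, "db4", level=3)   # approximation + detail coefficients
    return np.concatenate(coeffs)

X, y = [], []
for i in range(window, n):
    X.append(wavelet_features(price[i - window:i]))
    y.append(price[i])                               # next value after the window
X, y = np.array(X), np.array(y)
split = int(0.8 * len(X))

mlp = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1500, random_state=0)
mlp.fit(X[:split], y[:split])
pred = mlp.predict(X[split:])
print("out-of-sample RMSE:", float(np.sqrt(np.mean((pred - y[split:]) ** 2))))
```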

Keywords: olive oil price, stylized facts, ARMA model, SARMA model, GARCH model, combined wavelet-artificial neural network, continuous-time stochastic volatility mode

Procedia PDF Downloads 330
1127 Pattern Identification in Statistical Process Control Using Artificial Neural Networks

Authors: M. Pramila Devi, N. V. N. Indra Kiran

Abstract:

Control charts, predominantly in the form of the X-bar chart, are important tools in statistical process control (SPC). They are useful in determining whether a process is behaving as intended or whether there are some unnatural causes of variation. A process is out of control if a point falls outside the control limits or a series of points exhibits an unnatural pattern. In this paper, a study is carried out on four training algorithms for control chart pattern (CCP) recognition. For each algorithm, the optimal structure is identified; the algorithms are then studied for type I and type II errors and for generalization, with and without early stopping, and the best one is proposed.
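
A small sketch of CCP recognition with a backpropagation network. The synthetic pattern generators follow the usual formulations (normal, trend, shift, cyclic), but the parameter values and network size are assumptions, not those used in the paper.

```python
# Sketch: synthetic control chart patterns classified by an MLP with early stopping.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
T = np.arange(32)                                      # observations per chart window

def make_pattern(kind):
    noise = rng.normal(0, 1, T.size)
    if kind == "normal":     return noise
    if kind == "up_trend":   return noise + 0.1 * T
    if kind == "down_trend": return noise - 0.1 * T
    if kind == "up_shift":   return noise + 2.0 * (T >= 16)
    if kind == "down_shift": return noise - 2.0 * (T >= 16)
    if kind == "cyclic":     return noise + 1.5 * np.sin(2 * np.pi * T / 8)

kinds = ["normal", "up_trend", "down_trend", "up_shift", "down_shift", "cyclic"]
X = np.array([make_pattern(k) for k in kinds for _ in range(400)])
y = np.repeat(np.arange(len(kinds)), 400)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), solver="adam", max_iter=500,
                    early_stopping=True, validation_fraction=0.15, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy with early stopping:", round(clf.score(X_test, y_test), 3))
```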

Keywords: control chart pattern recognition, neural network, backpropagation, generalization, early stopping

Procedia PDF Downloads 362