Search results for: standardization artificial intelligence
2032 Calm, Confusing and Chaotic: Investigating Humanness through Sentiment Analysis of Abstract Artworks
Authors: Enya Autumn Trenholm-Jensen, Hjalte Hviid Mikkelsen
Abstract:
This study was undertaken to nuance the discussion surrounding what it means to be human in a time of unparalleled technological development. Subjectivity was deemed an accessible facet of humanity to study, and art a fitting medium through which to probe it. Upon careful theoretical consideration, abstract art was found to fit the parameters of the study, with the added benefit of being, as yet, uninterpretable from an AI perspective. It was hypothesised that dissimilar appraisals of the art stimuli would surface in both sentiment and terminology. Opinion data were collected through survey responses and analysed using Valence Aware Dictionary for sEntiment Reasoning (VADER) sentiment analysis. The results reflected the enigmatic nature of subjectivity through erratic ratings of the art stimuli; however, significant themes were found in the terminology used in the responses. The implications of the findings are discussed in relation to the uniqueness, or lack thereof, of human subjectivity, and directions for future research are provided.
Keywords: abstract art, artificial intelligence, cognition, sentiment, subjectivity
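The VADER analysis mentioned above can be illustrated with a stripped-down, lexicon-based scorer. This is a hypothetical sketch, not the real VADER implementation: the tiny valence lexicon and the single negation rule below are invented for illustration, whereas the actual tool ships a rated lexicon of several thousand entries plus heuristics for punctuation, capitalisation, degree modifiers and negation.

```python
# Toy lexicon-based sentiment scorer, illustrating the idea behind VADER.
# The lexicon values and the /4.0 squashing constant are made up.

VALENCE = {"calm": 1.3, "confusing": -1.2, "chaotic": -1.8,
           "beautiful": 2.1, "ugly": -2.0, "interesting": 1.0}
NEGATORS = {"not", "never", "no"}

def polarity(text: str) -> float:
    """Return a crude compound score in [-1, 1] for one survey response."""
    words = text.lower().split()
    total = 0.0
    for i, w in enumerate(words):
        score = VALENCE.get(w.strip(".,!?"), 0.0)
        # flip the valence if the previous token is a negator
        if i > 0 and words[i - 1] in NEGATORS:
            score = -score
        total += score
    # squash into [-1, 1], loosely mimicking VADER's normalised compound score
    return max(-1.0, min(1.0, total / 4.0))
```

In practice one would call `SentimentIntensityAnalyzer.polarity_scores` from the `vaderSentiment` package rather than roll one's own.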
Procedia PDF Downloads 116
2031 Influence of Temperature and Immersion on the Behavior of a Polymer Composite
Authors: Quentin C.P. Bourgogne, Vanessa Bouchart, Pierre Chevrier, Emmanuel Dattoli
Abstract:
This study presents experimental and theoretical work on a polyphenylene sulfide reinforced with 40 wt% short glass fibers (PPS GF40) and on its neat matrix. Thermoplastics are widely used in the automotive industry to lightweight parts, and their substitution for metals is now reaching under-the-hood components near the engine. In this area, parts are subjected to high temperatures and immersed in a cooling liquid composed of water and glycol, which can affect the mechanical properties of the composite. The aim of this work was thus to quantify the evolution of the mechanical properties of the thermoplastic composite as a function of temperature and liquid-aging effects, in order to support reliable part design. An experimental tensile campaign was carried out at different temperatures and for various glycol proportions in the cooling liquid, under monotonic and cyclic loadings, on both the neat and the reinforced PPS. These tests highlighted the main physical phenomena occurring under such harsh hydrothermal loading conditions. Temperature and cooling-liquid aging affect the mechanical behavior of the material in several ways: the more water the cooling liquid contains, the more the mechanical behavior is affected, and PPS proved more sensitive to absorption than to the chemical aggressiveness of the cooling liquid, which explains this dominant sensitivity. Two kinds of behavior were noted: elasto-plastic below the glass transition temperature and visco-pseudo-plastic above it. Viscosity is the leading phenomenon above the glass transition temperature for the PPS and can also matter below it, mostly under cyclic conditions and at low stress rates.
Finally, it was observed that loading this composite at high temperatures reduces the benefit of the fibers. A new phenomenological model was then built to account for these experimental observations. This model predicts the evolution of the mechanical properties as a function of the loading environment with fewer parameters than previous studies, and it describes and predicts the mechanical response with very good accuracy (2% average error at worst) over a wide range of hydrothermal conditions. A temperature-humidity equivalence principle was established for the PPS, allowing aging effects to be incorporated within the proposed model. Finally, a limit on the accuracy achievable by any model on this data set was determined by applying an artificial-intelligence-based model, allowing a comparison between AI-based and phenomenological models.
Keywords: aging, analytical modeling, mechanical testing, polymer matrix composites, sequential model, thermomechanical
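The temperature-humidity equivalence principle mentioned above can be sketched as follows: absorbed water is mapped onto an equivalent temperature shift, so one master curve covers both effects. Every parameter value below (reference modulus, glass transition, equivalence coefficient, sigmoid shape) is invented for illustration and is not taken from the paper.

```python
# Illustrative phenomenological model of stiffness loss with temperature and
# absorbed water, using a humidity-to-temperature equivalence. All numbers
# are hypothetical.
import math

E0 = 12.0    # GPa, reference modulus of the composite (made up)
TG = 90.0    # deg C, glass transition of the PPS matrix (made up)
K_EQ = 40.0  # deg C per % absorbed water: equivalence coefficient (made up)

def modulus(temp_c: float, water_pct: float) -> float:
    """Predicted modulus (GPa) from an equivalent temperature."""
    t_eq = temp_c + K_EQ * water_pct  # temperature-humidity equivalence
    # sigmoidal stiffness drop around the glass transition
    return E0 * (1.0 - 0.6 / (1.0 + math.exp(-(t_eq - TG) / 10.0)))
```

Under this sketch, a wet sample at 50 deg C with 1% water behaves like a dry sample at 90 deg C, which is the equivalence principle in miniature.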
Procedia PDF Downloads 116
2030 Current Applications of Artificial Intelligence (AI) in Chest Radiology
Authors: Angelis P. Barlampas
Abstract:
Learning objectives: The purpose of this study is to briefly inform the reader about the applications of AI in chest radiology. Background: Currently, there are 190 FDA-approved radiology AI applications, with 42 (22%) pertaining specifically to thoracic radiology. Imaging findings or procedure details: AI aids chest radiology in the following ways. It detects and segments pulmonary nodules. It subtracts bone to provide an unobstructed view of the underlying lung parenchyma and provides further information on nodule characteristics, such as nodule location, two-dimensional size or three-dimensional (3D) volume, change in size over time, attenuation data (i.e., mean, minimum, and/or maximum Hounsfield units [HU]), morphological assessments, or combinations of the above. It reclassifies indeterminate pulmonary nodules into low or high risk with higher accuracy than conventional risk models. It detects pleural effusion and differentiates tension pneumothorax from nontension pneumothorax. It detects cardiomegaly, calcification, consolidation, mediastinal widening, atelectasis, fibrosis and pneumoperitoneum. It automatically localises vertebral segments, labels ribs and detects rib fractures. It measures the distance from the tube tip to the carina and localises both endotracheal tubes and central vascular lines. It detects consolidation and progression of parenchymal diseases such as pulmonary fibrosis or chronic obstructive pulmonary disease (COPD), and can evaluate lobar volumes. It identifies and labels pulmonary bronchi and vasculature, quantifies air-trapping, and offers emphysema evaluation. It provides functional respiratory imaging, whereby high-resolution CT images are post-processed to quantify airflow by lung region and may be used to quantify key biomarkers such as airway resistance, air-trapping, ventilation mapping, lung and lobar volume, and blood vessel and airway volume. It assesses the lung parenchyma by way of density evaluation.
It provides percentages of tissues within defined attenuation (HU) ranges, besides furnishing automated lung segmentation and lung volume information. It improves image quality for noisy images with a built-in denoising function. It detects emphysema, a common condition in patients with a history of smoking, as well as hyperdense or opacified regions, thereby aiding in the diagnosis of certain pathologies, such as COVID-19 pneumonia. It aids in cardiac segmentation and calcium detection, aorta segmentation and diameter measurements, and vertebral body segmentation and density measurements. Conclusion: The future is yet to come, but AI is already a helpful tool for daily practice in radiology. It is assumed that the continuing progress of computerised systems and improvements in software algorithms will render AI the right hand of the radiologist.
Keywords: artificial intelligence, chest imaging, nodule detection, automated diagnoses
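The density evaluation described above reduces to counting voxels per attenuation band. A minimal sketch, assuming NumPy and a six-voxel synthetic "scan"; the -950 HU band is a commonly used emphysema cut-off (LAA-950), but the ranges here are illustrative only.

```python
# Percentage of voxels falling in defined Hounsfield-unit (HU) ranges,
# mimicking the density-evaluation step of a chest-CT AI tool.
import numpy as np

def hu_range_percentages(volume, ranges):
    """Map each half-open (lo, hi) HU range to the % of voxels inside it."""
    v = np.asarray(volume, dtype=float)
    return {f"[{lo}, {hi})": 100.0 * float(np.mean((v >= lo) & (v < hi)))
            for lo, hi in ranges}

scan = np.array([-980, -960, -900, -700, -300, 40])  # synthetic HU values
pct = hu_range_percentages(scan, [(-1024, -950), (-950, -700)])
```

On the synthetic values, two of six voxels fall below -950 HU, the crude emphysema fraction.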
Procedia PDF Downloads 72
2029 Modeling Visual Memorability Assessment with Autoencoders Reveals Characteristics of Memorable Images
Authors: Elham Bagheri, Yalda Mohsenzadeh
Abstract:
Image memorability refers to the phenomenon where certain images are more likely to be remembered by humans than others. It is a quantifiable and intrinsic attribute of an image. Understanding how visual perception and memory interact is important in both cognitive science and artificial intelligence. It reveals the complex processes that support human cognition and helps to improve machine learning algorithms by mimicking the brain's efficient data processing and storage mechanisms. To explore the computational underpinnings of image memorability, this study examines the relationship between an image's reconstruction error, distinctiveness in latent space, and its memorability score. A trained autoencoder is used to replicate human-like memorability assessment inspired by the visual memory game employed in memorability estimations. This study leverages a VGG-based autoencoder that is pre-trained on the vast ImageNet dataset, enabling it to recognize patterns and features that are common to a wide and diverse range of images. An empirical analysis is conducted using the MemCat dataset, which includes 10,000 images from five broad categories: animals, sports, food, landscapes, and vehicles, along with their corresponding memorability scores. The memorability score assigned to each image represents the probability of that image being remembered by participants after a single exposure. The autoencoder is finetuned for one epoch with a batch size of one, attempting to create a scenario similar to human memorability experiments where memorability is quantified by the likelihood of an image being remembered after being seen only once. The reconstruction error, which is quantified as the difference between the original and reconstructed images, serves as a measure of how well the autoencoder has learned to represent the data. 
The reconstruction error of each image, the error reduction, and its distinctiveness in latent space are calculated and correlated with the memorability score. Distinctiveness is measured as the Euclidean distance between each image's latent representation and its nearest neighbor within the autoencoder's latent space. Different structural and perceptual loss functions are considered to quantify the reconstruction error. The results indicate a strong correlation between the reconstruction error and distinctiveness of images and their memorability scores. This suggests that images with more unique, distinct features, which challenge the autoencoder's compressive capacities, are inherently more memorable. There is also a negative correlation between memorability and the reduction in reconstruction error achieved during fine-tuning relative to the autoencoder pre-trained on ImageNet, which suggests that highly memorable images are harder to reconstruct, probably because they have features that are more difficult for the autoencoder to learn. These insights suggest a new pathway for evaluating image memorability that could impact industries reliant on visual content, and they mark a step forward in merging the fields of artificial intelligence and cognitive science. The current research opens avenues for utilizing neural representations as instruments for understanding and predicting visual memory.
Keywords: autoencoder, computational vision, image memorability, image reconstruction, memory retention, reconstruction error, visual perception
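The distinctiveness measure defined above (Euclidean distance from each latent vector to its nearest neighbour) can be sketched directly. The latent vectors below are random stand-ins for real encoder outputs, and the pairwise-distance approach assumes the set fits in memory.

```python
# Nearest-neighbour distance in latent space: the distinctiveness score
# described in the abstract, computed for a small batch of latent vectors.
import numpy as np

def nearest_neighbour_distance(latents):
    """Per-row Euclidean distance to the closest *other* row."""
    z = np.asarray(latents, dtype=float)
    # full pairwise distance matrix, with the self-distance masked out
    d = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)
```

An image whose latent vector sits far from every other (a large value here) would be predicted as more memorable under the reported correlation.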
Procedia PDF Downloads 90
2028 Improving the Gain of a Multiband Antenna by Adding an Artificial Magnetic Conductor Metasurface
Authors: Amira Bousselmi
Abstract:
This article presents a PIFA antenna designed for geolocation (GNSS) applications operating at 1.278 GHz, 2.8 GHz, 5.7 GHz and 10 GHz. To improve the performance of the antenna, an artificial magnetic conductor (AMC) structure was used. Backing the antenna with the AMC resulted in a measured gain of 4.78 dBi. The results of simulations and measurements are presented; CST Microwave Studio was used to design the antenna and compare its performance. The antenna design methodology, the design and characterization of the AMC surface, and the simulated and measured performance of the AMC-backed antenna are discussed.
Keywords: multiband antenna, global navigation satellite system, AMC, Galileo
Procedia PDF Downloads 77
2027 Optimal Cropping Pattern in an Irrigation Project: A Hybrid Model of Artificial Neural Network and Modified Simplex Algorithm
Authors: Safayat Ali Shaikh
Abstract:
Software has been developed for determining the optimal cropping pattern in an irrigation project, considering a land constraint, a water-availability constraint and a pick-up flow constraint, using a modified Simplex algorithm. Artificial neural network (ANN) models have been developed to predict rainfall, and an AR(1) model was used to generate 1,000 years of rainfall data to train the ANN. Simulation has been carried out with the expected rainfall data. Eight crops and three soil classes have been considered in the optimization model. The area under each crop and each soil class has been quantified using the modified Simplex algorithm to obtain the optimum net return. The efficacy of the software has been tested using data from a large irrigation project in India.
Keywords: artificial neural network, large irrigation project, modified simplex algorithm, optimal cropping pattern
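The optimisation described above is a linear program: maximise net return over crop areas subject to land and water limits. The paper solves an eight-crop, three-soil-class instance with a modified Simplex algorithm; the sketch below brute-forces a hypothetical two-crop, one-soil-class instance on a 1-ha grid, with all return and water coefficients invented.

```python
# Toy cropping-pattern optimisation: maximise net return subject to a land
# constraint and a water-availability constraint. Coefficients are made up.

def best_cropping_pattern(land=100.0, water=450.0):
    """Return (area_a, area_b, net_return) maximising return on a 1-ha grid.

    Crop A: return 3 units/ha, needs 5 water/ha.
    Crop B: return 2 units/ha, needs 3 water/ha.
    """
    best = (0.0, 0.0, 0.0)
    for a in range(int(land) + 1):
        for b in range(int(land) - a + 1):          # land: a + b <= land
            if 5 * a + 3 * b <= water:              # water availability
                ret = 3 * a + 2 * b
                if ret > best[2]:
                    best = (float(a), float(b), float(ret))
    return best
```

A real solver (Simplex or `scipy.optimize.linprog`) would work on the continuous problem; enumeration is used here only to keep the sketch dependency-free.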
Procedia PDF Downloads 203
2026 An Exploration of Anti-Terrorism Laws in Nigeria
Authors: Sani Mohammed Adam
Abstract:
This work reviews the security challenges facing Nigeria and explores the relevance of laws and policies in tackling the menace. It examines the adequacy of available legislation and the functionality of relevant institutions such as the Armed Forces, the Nigeria Police Force, the State Security Service, the Defence Intelligence Agency and the National Intelligence Agency. Comparisons are made with other jurisdictions, including, inter alia, Homeland Security in the USA and the counter-terrorism laws of the United Kingdom. Recommendations are made on how to strengthen both institutions and laws to curtail the growth of terrorism in Nigeria.
Keywords: legislations, Nigeria, security, terrorism
Procedia PDF Downloads 679
2025 Improving Fingerprinting-Based Localization System Using Generative AI
Authors: Getaneh Berie Tarekegn, Li-Chia Tai
Abstract:
With the rapid advancement of artificial intelligence, low-power built-in sensors on Internet of Things devices, and communication technologies, location-aware services have become increasingly popular and have permeated every aspect of people’s lives. Global navigation satellite systems (GNSSs) are the default method of providing continuous positioning services for ground and aerial vehicles, as well as consumer devices (smartphones, watches, tablets, etc.). However, the environment affects satellite positioning, particularly indoors, in dense urban and suburban areas enclosed by skyscrapers, or when deep shadows obscure satellite signals. This is because (1) indoor environments are more complicated due to the many objects surrounding the receiver; (2) reflection within a building is highly dependent on the surrounding environment, including the positions of objects and human activity; and (3) satellite signals are too weak to penetrate building walls and so cannot reach indoor receivers. GPS is also highly power-hungry, which poses a severe challenge for battery-powered IoT devices. These challenges limit IoT applications. Consequently, precise, seamless, and ubiquitous positioning, navigation and timing (PNT) systems are crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities, including traffic monitoring, emergency alarms, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization.
We also employed a reliable signal-fingerprint feature extraction method based on t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and Long-Term Evolution (LTE) fingerprints. The proposed scheme reduced the site-surveying workload required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of the proposed system is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, the proposed scheme significantly improves positioning performance and reduces radio map construction costs compared to traditional methods.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
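The fingerprinting step underlying the scheme above can be sketched in its simplest form: locate a device by comparing its received-signal-strength (RSS) vector to a database of surveyed fingerprints. A real system would use the GAN-built radio map and t-SNE features; this sketch uses plain k-nearest neighbours on invented RSS values.

```python
# k-NN fingerprint localization: average the coordinates of the k surveyed
# fingerprints whose RSS vectors are closest to the query. Synthetic data.
import numpy as np

def knn_locate(db_rss, db_xy, query_rss, k=2):
    """Estimate (x, y) by averaging the k nearest fingerprints' positions."""
    db = np.asarray(db_rss, dtype=float)
    d = np.linalg.norm(db - np.asarray(query_rss, dtype=float), axis=1)
    nearest = np.argsort(d)[:k]                      # indices of k closest rows
    return np.asarray(db_xy, dtype=float)[nearest].mean(axis=0)
```

The 78.5% survey-reduction claim corresponds to generating most of such database rows synthetically instead of measuring them on site.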
Procedia PDF Downloads 42
2024 A Survey of Response Generation of Dialogue Systems
Authors: Yifan Fan, Xudong Luo, Pingping Lin
Abstract:
An essential task in the field of artificial intelligence is to allow computers to interact with people through natural language. Research on virtual assistants and dialogue systems has therefore received widespread attention from industry and academia. Response generation plays a crucial role in dialogue systems, so to push forward research on this topic, this paper surveys various methods for response generation. We sort these methods into three categories. The first includes finite-state-machine methods, framework methods, and instance methods. The second contains full-text indexing methods, ontology methods, vast-knowledge-base methods, and some others. The third covers retrieval methods and generative methods. We also discuss some hybrid methods based on knowledge and deep learning. We compare their advantages and disadvantages and point out how these studies can be improved further. Our discussion covers studies published in leading conferences such as IJCAI and AAAI in recent years.
Keywords: deep learning, generative, knowledge, response generation, retrieval
Procedia PDF Downloads 134
2023 Analysis of Q-Learning on Artificial Neural Networks for Robot Control Using Live Video Feed
Authors: Nihal Murali, Kunal Gupta, Surekha Bhanot
Abstract:
Training of artificial neural networks (ANNs) using reinforcement learning (RL) techniques is widely discussed in the robot-learning literature. The high model complexity of ANNs along with the model-free nature of RL algorithms provides a desirable combination for many robotics applications. There is a huge need for algorithms that generalize from raw sensory inputs, such as vision, without any hand-engineered features or domain heuristics. In this paper, the standard control problem of a line-following robot was used as a test bed, and an ANN controller for the robot was trained on images from a live video feed using Q-learning. A virtual agent was first trained in a simulation environment and then deployed onto the robot’s hardware. The robot successfully learns to traverse a wide range of curves and displays excellent generalization ability. Qualitative analysis of the evolution of policies, performance, and weights of the network provides insights into the nature and convergence of the learning algorithm.
Keywords: artificial neural networks, Q-learning, reinforcement learning, robot learning
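The Q-learning update used above (with an ANN approximating the Q-function over camera frames) can be shown in its tabular form on a toy problem. The five-cell 1-D "line" below, the reward of +1 for reaching the right end, and all hyperparameters are invented stand-ins for the paper's setup.

```python
# Tabular Q-learning on a toy 5-cell line: the agent learns to move right.
# Update rule: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).
import random

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.1, n=5):
    random.seed(0)
    q = [[0.0, 0.0] for _ in range(n)]  # actions: 0 = left, 1 = right
    for _ in range(episodes):
        s = 0
        while s < n - 1:                # cell n-1 is the terminal "goal"
            # epsilon-greedy action selection
            a = random.randrange(2) if random.random() < eps else \
                (0 if q[s][0] > q[s][1] else 1)
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n - 1 else 0.0
            # Q-learning update: bootstrap on the best next-state value
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train()
```

In the paper an ANN replaces the table so that raw video frames, rather than discrete cell indices, index the Q-values.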
Procedia PDF Downloads 372
2022 Identifying Confirmed Resemblances in Problem-Solving Engineering, Both in the Past and Present
Authors: Colin Schmidt, Adrien Lecossier, Pascal Crubleau, Philippe Blanchard, Simon Richir
Abstract:
Introduction: The widespread availability of artificial intelligence, exemplified by Generative Pre-trained Transformers (GPT) relying on large language models (LLMs), has caused a seismic shift in the realm of knowledge. Everyone now has the capacity to swiftly learn how these models can serve them well or not. Today, conversational AI like ChatGPT is grounded in neural transformer models, a significant advance in natural language processing facilitated by the emergence of renowned LLMs built on the transformer architecture. Inventiveness of an LLM: OpenAI's GPT-3 stands as a premier LLM, capable of handling a broad spectrum of natural language processing tasks without requiring fine-tuning, reliably producing text that reads as if authored by humans. However, even with an understanding of how LLMs respond to questions, there may lurk behind OpenAI's seemingly endless responses an inventive model yet to be uncovered; some unforeseen reasoning may emerge from the interconnection of neural networks. Just as a Soviet researcher in the 1940s asked whether inventions share common factors, enabling an understanding of how and according to what principles humans create them, it is equally legitimate today to explore whether solutions provided by LLMs to complex problems also share common denominators. Theory of Inventive Problem Solving (TRIZ): We revisit some fundamentals of TRIZ and how Genrich Altshuller was inspired by the idea that inventions and innovations are essential means to solve societal problems. It is crucial to note that traditional problem-solving methods often fall short of discovering innovative solutions: the design team is frequently hampered by psychological barriers stemming from confinement within a highly specialized knowledge domain that is difficult to question. We presume that ChatGPT utilizes the TRIZ 40 inventive principles.
Hence, the objective of this research is to decipher the inventive model of LLMs, particularly that of ChatGPT, through a comparative study. This will enhance the efficiency of sustainable innovation processes and shed light on how the construction of a solution to a complex problem is devised. Description of the experimental protocol: To confirm or reject our main hypothesis, that ChatGPT uses TRIZ, we follow a stringent protocol, detailed in the paper, drawing on insights from a panel of two TRIZ experts. Conclusion and future directions: In this endeavor, we sought to comprehend how an LLM like GPT addresses complex challenges, analyzing the inventive model of the responses provided by ChatGPT by comparing it to an existing standard model, TRIZ 40. Problem solving remains the main focus of our endeavors.
Keywords: artificial intelligence, TRIZ, ChatGPT, inventiveness, problem-solving
Procedia PDF Downloads 74
2021 Deep Neural Network Approach for Navigation of Autonomous Vehicles
Authors: Mayank Raj, V. G. Narendra
Abstract:
Ever since the DARPA challenge on autonomous vehicles in 2005, there has been a lot of buzz about ‘autonomous vehicles’ amongst major tech giants such as Google, Uber, and Tesla. Numerous approaches have been adopted to solve this problem, which can have a long-lasting impact on mankind. In this paper, we have used deep learning techniques and the TensorFlow framework to build a neural network model that predicts the features needed for navigation of autonomous vehicles: speed, acceleration, steering angle, and brake. The deep neural network has been trained on images and sensor data obtained from the comma.ai dataset. A heatmap was used to check for correlation among the features, and four important features were finally selected, making this a multivariate regression problem. The final model had five convolutional layers followed by five dense layers. Finally, the predicted values were tested against the labeled data, with mean squared error used as the performance metric.
Keywords: autonomous vehicles, deep learning, computer vision, artificial intelligence
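The heatmap-based feature screening mentioned above boils down to inspecting the correlation matrix and pruning near-duplicate features. A minimal sketch with NumPy on synthetic data; the feature names and the 0.95 threshold are invented for illustration.

```python
# Correlation-based feature pruning: keep a feature only if it is not
# near-duplicated (|corr| >= threshold) by an already-kept feature.
import numpy as np

def drop_correlated(X, names, threshold=0.95):
    """Return the feature names retained after pruning correlated pairs."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []  # indices of retained features
    for j in range(len(names)):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return [names[j] for j in kept]
```

A heatmap is just this matrix rendered as colours; the pruning rule makes the visual judgment explicit.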
Procedia PDF Downloads 158
2020 Utilizing Federated Learning for Accurate Prediction of COVID-19 from CT Scan Images
Authors: Jinil Patel, Sarthak Patel, Sarthak Thakkar, Deepti Saraswat
Abstract:
Recently, the COVID-19 outbreak spread across the world, leading the World Health Organization to classify it as a global pandemic. To save a patient’s life, COVID-19 symptoms have to be identified quickly, but using an artificial intelligence (AI) model to identify them within the allotted time is challenging, and the RT-PCR test has proved inadequate in determining the COVID status of a patient. A computed tomography (CT) scan of the patient is a better alternative for determining whether the patient has COVID-19. However, compiling and storing all the data from various hospitals on a central server is itself challenging; federated learning helps to resolve this problem. Certain deep learning models help to classify COVID-19. This paper details work with several such models, namely VGG19, ResNet50, MobileNetV2, and Deep Learning Aggregation (DLA), while maintaining privacy with homomorphic encryption.
Keywords: federated learning, COVID-19, CT scan, homomorphic encryption, ResNet50, VGG19, MobileNetV2, DLA
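The aggregation step at the heart of the federated setup above can be sketched with federated averaging (FedAvg): each hospital trains locally, and only model weights travel to the server, which combines them weighted by local dataset size. The NumPy arrays below stand in for real VGG19/ResNet50 parameter tensors, and the encryption layer is omitted from this sketch.

```python
# FedAvg aggregation: weighted average of per-client weight lists.
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Combine clients' layer weights, weighted by local dataset size."""
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    return [sum(w[i] * (s / total)
                for w, s in zip(client_weights, client_sizes))
            for i in range(n_layers)]
```

In the paper's setting, the hospitals would additionally encrypt their weight updates (homomorphic encryption) so the server aggregates without seeing them in the clear.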
Procedia PDF Downloads 73
2019 Estimating Solar Irradiance on a Tilted Surface Using Artificial Neural Networks with Differential Outputs
Authors: Hsu-Yung Cheng, Kuo-Chang Hsu, Chi-Chang Chan, Mei-Hui Tseng, Chih-Chang Yu, Ya-Sheng Liu
Abstract:
Photovoltaic modules are usually not installed horizontally, to avoid water or dust accumulation. However, measured irradiance data on tilted surfaces are rarely available, since installing pyranometers at various tilt angles incurs high costs. Estimating solar irradiance on tilted surfaces is therefore an important research topic. In this work, artificial neural networks (ANNs) are utilized to construct a transfer model that estimates solar irradiance on tilted surfaces. Instead of predicting tilted irradiance directly, the proposed method estimates the difference between the horizontal irradiance and the irradiance on a tilted surface; the outputs of the ANNs in the proposed design are these differential values. The experimental results show that the proposed ANNs with differential outputs substantially improve estimation accuracy compared to ANNs that estimate the tilted irradiance directly.
Keywords: photovoltaics, artificial neural networks, tilted irradiance, solar energy
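The differential-output idea described above can be sketched end to end: fit the *difference* between tilted and horizontal irradiance, then add the prediction back to the measured horizontal value. A least-squares linear model stands in for the ANN, and all the data below is synthetic with invented coefficients.

```python
# Differential-target regression: predict (tilted - horizontal), then
# reconstruct tilted = horizontal + predicted difference. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
horiz = rng.uniform(200, 1000, size=50)     # horizontal irradiance, W/m^2
sun_elev = rng.uniform(10, 80, size=50)     # solar elevation, degrees
# synthetic "true" tilted irradiance (coefficients invented)
tilted = 1.15 * horiz + 0.5 * sun_elev + rng.normal(0, 2, size=50)

X = np.column_stack([horiz, sun_elev, np.ones_like(horiz)])
diff = tilted - horiz                       # the differential target
coef, *_ = np.linalg.lstsq(X, diff, rcond=None)

predicted_tilted = horiz + X @ coef         # add the estimated difference back
```

The appeal of the differential target is that the model only needs to learn the (smaller, smoother) correction rather than the full irradiance signal.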
Procedia PDF Downloads 397
2018 A Hybrid Distributed Algorithm for Multi-Objective Dynamic Flexible Job Shop Scheduling Problem
Authors: Aydin Teymourifar, Gurkan Ozturk
Abstract:
In this paper, a hybrid distributed algorithm is suggested for the multi-objective dynamic flexible job shop scheduling problem. The proposed algorithm is high-level, in that several algorithms search the space on different machines simultaneously, and hybrid, in that it takes advantage of artificial intelligence, evolutionary, and optimization methods. Distribution is carried out at different levels, and new approaches are used in the design of the algorithm; the Apache Spark and Hadoop frameworks handle its distribution. The Pareto-optimality approach is used for solving the multi-objective benchmarks. The suggested algorithm, which is able to solve large-size problems in short times, has been compared with successful algorithms from the literature, and the results demonstrate its high speed and efficiency.
Keywords: distributed algorithms, Apache Spark, Hadoop, flexible dynamic job shop scheduling, multi-objective optimization
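The Pareto-optimality approach mentioned above keeps every schedule that no other schedule beats on all objectives at once. A minimal sketch for two minimised objectives (say, makespan and total tardiness); the candidate objective vectors below are invented.

```python
# Pareto front extraction for minimisation: a point is kept unless some
# other point is <= on every objective and different somewhere (hence
# strictly better on at least one objective).

def pareto_front(points):
    """Return the non-dominated subset of a list of objective tuples."""
    front = []
    for p in points:
        dominated = any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front
```

In the distributed setting, each worker could filter its own candidates this way before the fronts are merged and filtered once more.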
Procedia PDF Downloads 354
2017 Prediction of Wind Speed by Artificial Neural Networks for Energy Application
Authors: S. Adjiri-Bailiche, S. M. Boudia, H. Daaou, S. Hadouche, A. Benzaoui
Abstract:
In this work, the variation of wind speed with altitude is modeled and described using neural networks. Measured data (wind speed and direction, temperature and humidity at 10 m) are used as inputs, with wind speed at 50 m above sea level as the target. Predicted wind speeds are compared with values extrapolated to 50 m above sea level. The results show that prediction by artificial neural networks is very accurate.
Keywords: MATLAB, neural network, power law, vertical extrapolation, wind energy, wind speed
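The power-law vertical extrapolation that the ANN predictions above are compared against is a one-liner: wind speed at height z2 from the speed measured at z1. The 1/7 shear exponent used as the default here is a common open-terrain assumption, not a value from the paper.

```python
# Power-law vertical extrapolation of wind speed: v2 = v1 * (z2 / z1) ** alpha.

def power_law(v1, z1, z2, alpha=1.0 / 7.0):
    """Extrapolated wind speed at height z2 given speed v1 measured at z1."""
    return v1 * (z2 / z1) ** alpha
```

For example, 5 m/s measured at 10 m extrapolates to roughly 6.3 m/s at 50 m with the 1/7 exponent; the ANN replaces this fixed exponent with a data-driven mapping.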
Procedia PDF Downloads 692
2016 Application of Artificial Neural Network for Prediction of High Tensile Steel Strands in Post-Tensioned Slabs
Authors: Gaurav Sancheti
Abstract:
This study presents an application of artificial neural networks (ANNs) in determining the quantity of high tensile steel (HTS) strands required in post-tensioned (PT) slabs. Various PT slab configurations were generated by varying the span and depth of the slab, and the quantity of HTS strands required for each configuration was recorded. ANNs with the backpropagation algorithm and varying architectures were developed, and their performance was evaluated in terms of mean squared error (MSE). The recorded data on the quantity of HTS strands served as the training database for the developed ANNs, and the networks were validated using various validation techniques. The results show that the proposed ANNs have great potential, with good prediction and generalization capability.
Keywords: artificial neural networks, back propagation, conceptual design, high tensile steel strands, post tensioned slabs, validation techniques
Procedia PDF Downloads 221
2015 DURAFILE: A Collaborative Tool for Preserving Digital Media Files
Authors: Santiago Macho, Miquel Montaner, Raivo Ruusalepp, Ferran Candela, Xavier Tarres, Rando Rostok
Abstract:
During our lives, we generate a great deal of personal information, such as photos, music, text documents and videos, that links us with our past. This data, which used to be tangible, is now digital information stored in our computers, implying a software dependence for making it accessible in the future. Technology, however, constantly evolves and goes through regular shifts, quickly rendering various file formats obsolete. The need for future access to data affects not only personal users but also organizations: in a digital environment, a reliable preservation plan and the ability to adapt to fast-changing technology are essential for maintaining data collections in the long term. We present in this paper the European FP7 project DURAFILE, which provides the technology to preserve media files for personal users and organizations while maintaining their quality.
Keywords: artificial intelligence, digital preservation, social search, digital preservation plans
Procedia PDF Downloads 445
2014 Smoker Recognition from Lung X-Ray Images Using Convolutional Neural Network
Authors: Moumita Chanda, Md. Fazlul Karim Patwary
Abstract:
Smoking is one of the most widespread recreational drug-use behaviors, and it contributes to birth defects, COPD, heart attacks, and erectile dysfunction. To curb this health hazard, it is imperative that smoking be identified and treated. Numerous smoking-cessation programs have been created, and they demonstrate how beneficial it can be to help someone stop smoking at the right time. Medical imaging can serve as an effective smoking detector. Other wearables, such as RF-based proximity sensors worn on the collar and wrist to detect when the hand is close to the mouth, have been proposed in the past, but they are not impervious to confounding factors. In this study, we build a system that discriminates between smokers and non-smokers in real time with high sensitivity and specificity, by imaging the human lung and analyzing the X-ray data with machine learning. Given sufficient accuracy, such a system could be utilized in hospitals, in the screening of candidates for the army or police, or in university admissions.
Keywords: CNN, smoker detection, non-smoker detection, OpenCV, artificial intelligence, X-ray image detection
Procedia PDF Downloads 84
2013 The Impact of Artificial Intelligence on the Behavior of Children and Autism
Authors: Sara Fayez Fawzy Mikhael
Abstract:
Inclusive education services for students with autism remain in the early stages of development in Thailand. Although many more children with autism have been attending schools since the Thai government introduced the Education Provision for People with Disabilities Act in 2008, the services that students with autism and their families receive are generally lacking. This quantitative study used the Attitude and Preparedness to Teach Students with Autism Scale (APTSAS) to investigate 110 primary school teachers’ attitudes and preparedness to teach students with autism in the general education classroom. Descriptive statistical analysis of the data found that student behavior was the most significant factor in building teachers’ negative attitudes toward students with autism. The majority of teachers also indicated that their pre-service education did not prepare them to meet the learning needs of children with autism, in particular those who are non-verbal. The study is significant and provides direction for enhancing teacher education for inclusivity in Thailand.
Keywords: attitude, autism, teachers, Thailand, sports activities, movement skills, motor skills
Procedia PDF Downloads 100
2012 The Impact of Artificial Intelligence on Autism Attitude and Skills
Authors: Samwail Fahmi Francis Yacoub
Abstract:
Inclusive education services for students with autism remain in the early stages of development in Thailand. Although many more children with autism have been attending schools since the Thai government introduced the Education Provision for People with Disabilities Act in 2008, the services that students with autism and their families receive are generally lacking. This quantitative study used the Attitude and Preparedness to Teach Students with Autism Scale (APTSAS) to investigate 110 primary school teachers’ attitudes and preparedness to teach students with autism in the general education classroom. Descriptive statistical analysis of the data found that student behavior was the most significant factor in building teachers’ negative attitudes toward students with autism. The majority of teachers also indicated that their pre-service education did not prepare them to meet the learning needs of children with autism, in particular those who are non-verbal. The study is significant and provides direction for enhancing teacher education for inclusivity in Thailand.
Keywords: attitude, autism, teachers, movement skills, motor skills, children, behavior
Procedia PDF Downloads 52
2011 Real Estate Trend Prediction with Artificial Intelligence Techniques
Authors: Sophia Liang Zhou
Abstract:
For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about the macroeconomic determinants of housing prices and largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. This work studied five different metropolitan areas representing different market trends and compared three time-lag situations: no lag, a 6-month lag, and a 12-month lag. Linear regression (LR), random forest (RF), and artificial neural network (ANN) models were employed to model real estate prices using datasets with the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, and personal income, in five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data from March 2005 to December 2018 were collected from the Federal Reserve Bank, the FBI, and Freddie Mac. In the original data, some factors are monthly, some quarterly, and some yearly. Thus, two methods to compensate for missing values, backfill and interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF’s inherent limitations. Both the ANN and LR methods generated predictive models with high accuracy (> 95%). It was found that personal income, GDP, population, and measures of debt consistently appeared as the most important factors. It was also shown that the technique used to compensate for missing values in the dataset and the implementation of a time lag can have a significant influence on model performance and require further investigation.
The best-performing models varied for each area, but the backfilled 12-month-lag LR models and the interpolated no-lag ANN models showed the most stable performance overall, with accuracies > 95% for each city. This study reveals the influence of input variables in different markets. It also provides evidence to support future studies to identify the optimal time lag and data-imputing methods for establishing accurate predictive models.
Keywords: linear regression, random forest, artificial neural network, real estate price prediction
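The two imputation strategies compared in the study, backfill and interpolation, can be illustrated with a minimal pure-Python sketch (illustrative only; the quarterly series below is invented, not the study's data):

```python
def backfill(series):
    """Fill each None with the next observed value (backward fill)."""
    out = list(series)
    nxt = None
    for i in range(len(out) - 1, -1, -1):
        if out[i] is None:
            out[i] = nxt
        else:
            nxt = out[i]
    return out

def interpolate(series):
    """Linearly interpolate interior None gaps between observed values."""
    out = list(series)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while j < len(out) and out[j] is None:
                j += 1
            if i > 0 and j < len(out):  # only fill interior gaps
                step = (out[j] - out[i - 1]) / (j - i + 1)
                for k in range(i, j):
                    out[k] = out[i - 1] + step * (k - i + 1)
            i = j
        else:
            i += 1
    return out

# A quarterly factor reported in month 1 of each quarter, None otherwise:
gdp = [100.0, None, None, 106.0]
print(backfill(gdp))     # [100.0, 106.0, 106.0, 106.0]
print(interpolate(gdp))  # [100.0, 102.0, 104.0, 106.0]
```

The two methods imply different information assumptions: backfill leaks the next quarter's value into earlier months, while interpolation assumes a smooth transition, which is why the study finds the choice can materially change model performance.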
Procedia PDF Downloads 103
2010 Study of a Crude Oil Desalting Plant of the National Iranian South Oil Company in Gachsaran by Using Artificial Neural Networks
Authors: H. Kiani, S. Moradi, B. Soltani Soulgani, S. Mousavian
Abstract:
Desalting/dehydration plants (DDP) are often installed in crude oil production units in order to remove water-soluble salts from an oil stream. In order to optimize this process, the desalting unit should be modeled. In this research, an artificial neural network is used to model the efficiency of the desalting unit as a function of the input parameters. The results of this research show that the model is in good agreement with the experimental data.
Keywords: desalting unit, crude oil, neural networks, simulation, recovery, separation
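As a minimal sketch of the modeling idea (not the authors' network; the layer sizes, weights, and input names below are arbitrary assumptions for illustration), a feedforward pass mapping normalized operating parameters to a predicted efficiency might look like:

```python
import math

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """One hidden layer with sigmoid activations, single linear output."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
              for w, b in zip(w_hidden, b_hidden)]
    return sum(wo * h for wo, h in zip(w_out, hidden)) + b_out

# Hypothetical normalized inputs: temperature, wash-water ratio, mixing intensity
x = [0.5, 0.3, 0.8]
w_hidden = [[0.2, -0.4, 0.1], [0.7, 0.5, -0.2]]   # 2 hidden neurons
b_hidden = [0.0, 0.1]
w_out = [0.6, -0.3]
b_out = 0.5
efficiency = forward(x, w_hidden, b_hidden, w_out, b_out)
print(round(efficiency, 3))
```

In practice, the weights would be fitted to the plant's measured efficiencies by backpropagation rather than set by hand as here.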
Procedia PDF Downloads 450
2009 Current Zonal Isolation Regulation and Standards: A Compare and Contrast Review in Plug and Abandonment
Authors: Z. A. Al Marhoon, H. S. Al Ramis, C. Teodoriu
Abstract:
Well integrity is one of the major elements considered in drilling geothermal, oil, and gas wells. Well integrity means minimizing the risk of unplanned fluid flow in the wellbore throughout the well's lifetime. It is maximized by applying technical concepts along with practical experience and strategic planning. These practices are usually governed by standardization and regulation entities. Practices during well construction can affect the integrity of the seal at the time of abandonment. On the other hand, achieving a perfect barrier system is impracticable due to the associated cost. This results in a needed balance between regulatory requirements and practical application. Guidelines are only effective when they are attainable in practice. Various governmental regulations and international standards offer different guidelines on what constitutes high-quality isolation from unwanted flow. Each regulatory or standardization body differs in its requirements based on the abandonment objective. Some regulations account more for environmental impact, water table contamination, and possible leaks. Other regulations might lean toward driving more economic benefit while achieving acceptable isolation criteria. The research methodology used in this work is a literature review combined with a compare-and-contrast analysis. A literature review of various zonal isolation regulations and standards was conducted, covering guidelines from NORSOK (the Norwegian governing entity), BSEE (the US offshore governing entity), and API (American Petroleum Institute) combined with ISO (International Organization for Standardization). The compare-and-contrast analysis was conducted by assessing the objective of each abandonment regulation and standard. The current state of well barrier regulation is a balancing act. On one side of this balance sit environmental impact and complete zonal isolation.
The other side of the scale is practical application and the associated cost. Some standards provide a fair amount of detail concerning technical requirements and are often flexible about the associated cost. These guidelines cover environmental impact with rules that prevent major or disastrous environmental effects from improper sealing of wells. Usually, these regulations are concerned with the near-term quality of the seal rather than the long term. Consequently, applying these guidelines becomes more feasible, from a cost point of view, for the entities required to plug wells. Other regulations, by contrast, have well-integrity procedures and requirements that lean toward tighter environmental restrictions with increased associated costs. There, the environmental impact is covered in its entirety, including medium to small environmental impacts of barrier-installing operations, and clear, precise attention to long-term leakage prevention is present. The result of the compare-and-contrast analysis of the literature showed that there are various objectives that might tip the scale from one side of the balance (cost) to the other (sealing quality), especially in reference to zonal isolation. Furthermore, investing in initial well construction is a crucial part of ensuring safe final well abandonment. Safety and cost savings at the end of the well life cycle depend on a well-constructed isolation system at the beginning of the life cycle. Long-term studies on zonal isolation using various hydraulic or mechanical materials need to take place to further assess permanently abandoned wells and achieve the desired balance. Well drilling and isolation techniques will be more effective when they are operationally feasible and have a reasonable associated cost to aid the local economy.
Keywords: plug and abandon, P&A regulation, P&A standards, international guidelines, gap analysis
Procedia PDF Downloads 133
2008 Estimation of Fouling in a Cross-Flow Heat Exchanger Using Artificial Neural Network Approach
Authors: Rania Jradi, Christophe Marvillet, Mohamed Razak Jeday
Abstract:
One of the most frequently encountered problems in industrial heat exchangers is fouling, which degrades the thermal and hydraulic performance of this type of equipment, leading to failure if undetected. It occurs due to the accumulation of undesired material on the heat transfer surface. It is therefore necessary to understand heat exchanger fouling dynamics in order to plan mitigation strategies, ensuring sustainable and safe operation. This paper proposes an Artificial Neural Network (ANN) approach to estimate the fouling resistance in a cross-flow heat exchanger from operating data collected on a phosphoric acid concentration loop. A set of 361 operating data points was used to validate the proposed model. The ANN attains AARD = 0.048%, MSE = 1.811x10⁻¹¹, RMSE = 4.256x10⁻⁶, and r² = 99.5% accuracy, which confirms that it is a credible and valuable approach for industrialists and technologists who are faced with the drawbacks of fouling in heat exchangers.
Keywords: cross-flow heat exchanger, fouling, estimation, phosphoric acid concentration loop, artificial neural network approach
Procedia PDF Downloads 198
2007 A Review Paper on Data Security in Precision Agriculture Using Internet of Things
Authors: Tonderai Muchenje, Xolani Mkhwanazi
Abstract:
Precision agriculture uses a number of technologies, devices, protocols, and computing paradigms to optimize agricultural processes. Big data, artificial intelligence, cloud computing, and edge computing are all used to handle the huge amounts of data generated by precision agriculture. However, precision agriculture is still emerging and has a low level of security features. Furthermore, future solutions will demand data availability and accuracy as key points to help farmers, and security is important for building robust and efficient systems. Since precision agriculture comprises a wide variety and quantity of resources, security must address issues such as compatibility, constrained resources, and massive data. Moreover, conventional protection schemes used on the traditional internet may not be useful for agricultural systems, creating extra demands and opportunities. Therefore, this paper aims at reviewing the state of the art of precision agriculture security, particularly in open-field agriculture, discussing its architecture, describing security issues, and presenting the major challenges and future directions.
Keywords: precision agriculture, security, IoT, EIDE
Procedia PDF Downloads 90
2006 Technology Impact in Learning and Teaching English Language Writing
Authors: Laura Naka
Abstract:
The invention of computer writing programs has changed the way second language writing is taught. These artificial intelligence engines can provide students with feedback on their essays and their grammatical and spelling errors, along with convenient writing and editing tools to facilitate the student's writing process. However, it has not yet been proven that this technology helps students improve their writing skills. There are several programs that are of great assistance to students with respect to their writing skills. New technology provides students with different software programs that enable them to be more creative and to express their opinions and ideas in words, pictures, and sounds, but in the end, the main and most reliable feedback should be given by their teachers. No matter how new technology affects writing skills, the most valuable feedback still comes from teachers. This research will present some of the advantages and disadvantages that new technology has in the writing process for students. The research takes place at the University of Gjakova ‘Fehmi Agani’, Faculty of Education, Preschool Program. It aims to collect random-sample responses using questionnaires and observation.
Keywords: English language learning, technology, academic writing, teaching L2
Procedia PDF Downloads 571
2005 Oxygen Transport in Blood Flows Past Staggered Fiber Arrays: A Computational Fluid Dynamics Study of an Oxygenator in an Artificial Lung
Authors: Yu-Chen Hsu, Kuang C. Lin
Abstract:
The artificial lung, called extracorporeal membrane oxygenation (ECMO), is an important medical machine that supports persons whose heart and lungs are failing. Previously, oxygen transport in steady deoxygenated blood flows passing through hollow fibers was investigated experimentally and computationally. The present study computationally analyzes the effect of biological pulsatile flow on oxygen transport in blood. A 2-D model with a pulsatile flow condition is employed. The power law model is used to describe the non-Newtonian flow, and the Hill equation is utilized to simulate the oxygen saturation of hemoglobin. The dimensionless parameters for the physical model include the Reynolds number (Re), Womersley parameter (α), pulsation amplitude (A), Sherwood number (Sh), and Schmidt number (Sc). The present model with steady-state flow conditions is well validated against previous experiments and simulations. It is observed that pulsating flow amplitudes significantly influence the velocity profile, the partial pressure of oxygen (PO2), the oxygen saturation (SO2), and the oxygen mass transfer rate (ṁ_O2). Comparing steady-state and pulsating flows, our findings suggest that the consideration of pulsating flow in the computational model is needed when Re is raised from 2 to 10, a typical range for flow in an artificial lung.
Keywords: artificial lung, oxygen transport, non-Newtonian flows, pulsating flows
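The Hill equation mentioned above relates the oxygen partial pressure to the fraction of saturated hemoglobin through a sigmoidal curve. A minimal sketch (the P50 and Hill coefficient below are typical textbook values for adult hemoglobin, not the paper's fitted parameters):

```python
def hill_saturation(po2_mmhg, p50=26.6, n=2.7):
    """Fraction of hemoglobin saturated at a given O2 partial pressure.

    SO2 = PO2^n / (P50^n + PO2^n), where P50 is the pressure at 50%
    saturation and n is the Hill coefficient (cooperativity).
    """
    return po2_mmhg ** n / (p50 ** n + po2_mmhg ** n)

# Saturation rises steeply around P50 (sigmoidal dissociation curve):
for po2 in (10, 26.6, 40, 100):
    print(po2, round(hill_saturation(po2), 3))
```

By construction, saturation is exactly 0.5 at PO2 = P50 and approaches 1 at arterial pressures near 100 mmHg, which is what makes the fiber-side oxygen flux so sensitive to the local PO2 field in the simulation.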
Procedia PDF Downloads 311
2004 A Nanofibrous PHBV Tube with Schwann Cells as an Artificial Nerve Graft Contributing to Rat Sciatic Nerve Regeneration across a 30-mm Defect Bridge
Authors: Esmaeil Biazar
Abstract:
A nanofibrous PHBV nerve conduit was evaluated for its efficiency in promoting nerve regeneration in rats. The designed conduits were investigated by physical, mechanical, and microscopic analyses. The conduits were implanted into a 30-mm gap in the sciatic nerves of the rats. Four months after surgery, the regenerated nerves were evaluated by macroscopic assessment and histology. This polymeric conduit had sufficiently high mechanical properties to serve as a nerve guide. The results demonstrated that with the cell-seeded nanofibrous graft, the sciatic nerve trunk had been reconstructed, with restoration of nerve continuity and formation of myelinated nerve fibers. For the grafts, especially the nanofibrous conduits with cells, the gastrocnemius muscle cells on the operated side were uniform in size and structure. This study proves the feasibility of an artificial conduit with Schwann cells for nerve regeneration by bridging a long defect in a rat model.
Keywords: sciatic regeneration, Schwann cell, artificial conduit, nanofibrous PHBV, histological assessments
Procedia PDF Downloads 323
2003 Black Swans, Public Administration and Informatics
Authors: Anastasis Petrou
Abstract:
Black Swan Theories (BSTs) have existed since the 2nd century BC. However, problematisation in the interdisciplinary field of Public Administration and Informatics (PA&I) about the impact of Black Swans as rare events in society is a more recent phenomenon, albeit one with a growing, though dispersed, body of research literature. This paper offers a synopsis of the core issues and questions raised in the PA&I literature about the impacts of rare events on society, notes the need for knowledge accumulation and explainability processes around rare events, and asks what could help explain the occurrence, severity, heterogeneity, and overall impact of Black Swans and the challenges they represent for established scientific methods. The second part of the paper considers how the use of Artificial Intelligence (AI) could assist researchers in better explaining rare events in PA&I. However, the research shows that whilst AI use at the start of knowledge accumulation and explainability processes about rare events is beneficial, it is also fraught with challenges, as discussed herein. The paper concludes with recommendations for future research.
Keywords: black swans, public administration, AI, informatics
Procedia PDF Downloads 15