Search results for: artificial air storage reservoir
2963 Dispersions of Carbon Black in Microemulsions
Authors: Mohamed Youssry, Dominique Guyomard, Bernard Lestriez
Abstract:
In order to enhance the energy and power densities of electrodes for energy storage systems, the formulation and processing of electrode slurries have proved to be a critical issue in determining electrode performance. In this study, we introduce a novel approach to formulating carbon black slurries based on microemulsion and lyotropic liquid crystalline phases (namely, the lamellar phase) composed of a non-ionic surfactant (Triton X-100), decanol and water. Simultaneous measurements of the electrical and rheological properties of the slurries under shear flow were conducted to elucidate the microstructure evolution with surfactant concentration and decanol/water ratio at rest, as well as the structural transition under steady shear, which was confirmed by rheo-microscopy. Interestingly, the carbon black slurries at a low decanol/water ratio are weak (flowable) gels with higher electrical conductivity than those at a higher ratio, which show a strong-gel viscoelastic response. In addition, the slurries show recoverable electrical behaviour under shear flow in tandem with the viscosity trend. It is likely that the oil-in-water microemulsion enhances the slurries’ stability without affecting the percolating network of carbon black. On the other hand, the oil-in-water analogue and the bilayer structure of the lamellar phase make the slurries less conductive as a consequence of losing network percolation. These findings are encouraging for formulating microemulsion-based electrodes for energy storage systems (lithium-ion batteries).
Keywords: electrode slurries, microemulsion, microstructure transition, rheo-electrical properties
Procedia PDF Downloads 266
2962 The Effect of Artificial Intelligence on Communication and Information Systems
Authors: Sameh Ibrahim Ghali Hanna
Abstract:
Information systems (IS) are crucial to the operation of private and public organizations in developing and developed countries. Developing countries are saddled with many project failures during the implementation of information systems, yet successful information systems are greatly needed there in order to strengthen their economies. This paper is therefore highly relevant in view of the high failure rate of information systems in developing countries, which needs to be reduced to acceptable levels by means of the recommended interventions. The paper centres on a review of IS development in developing countries: it presents evidence of IS successes and failures in developing countries and posits a model to address the failures. The proposed model can then be used by developing countries to lower their IS project implementation failure rate. A comparison is drawn between IS development in developing and developed countries. The paper provides valuable information to assist in decreasing IS failure and in developing IS models and theories on IS development for developing countries.
Keywords: research information systems (RIS), research information, heterogeneous sources, data quality, data cleansing, science system, standardization artificial intelligence, AI, enterprise information system, EIS, integration developing countries, information systems, IS development, information systems failure, information systems success, information systems success model
Procedia PDF Downloads 21
2961 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction
Authors: Qais M. Yousef, Yasmeen A. Alshaer
Abstract:
Over the last few years, the amount of data available around the globe has increased rapidly. This has come with the emergence of concepts such as big data and the Internet of Things, which have furnished a suitable way of making data available all over the world. However, managing this massive amount of data remains a challenge due to its wide variety of types and its distribution. Locating a required file on the first attempt is therefore not an easy task, owing to the large similarity of names among different files distributed on the web; consequently, the accuracy and speed of search are negatively affected. This work presents a method that uses electroencephalography (EEG) signals to locate files based on their contents. Building on the concept of processing natural mind waves, this work records the mind-wave signals of different people, extracts their most appropriate features using a multi-objective metaheuristic algorithm, and then classifies them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and to retrieve them as the first choice for the user.
Keywords: artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization
Procedia PDF Downloads 175
2960 Web Development in Information Technology with Javascript, Machine Learning and Artificial Intelligence
Authors: Abdul Basit Kiani, Maryam Kiani
Abstract:
Online developers now have the tools necessary to create online apps that are not only reliable but also highly interactive, thanks to the introduction of JavaScript frameworks and APIs. The objective is to give a broad overview of the recent advances in the area. The fusion of machine learning (ML) and artificial intelligence (AI) has expanded the possibilities for web development. Modern websites now include chatbots, clever recommendation systems, and customization algorithms built in. In the rapidly evolving landscape of modern websites, it has become increasingly apparent that user engagement and personalization are key factors for success. To meet these demands, websites now incorporate a range of innovative technologies. One such technology is chatbots, which provide users with instant assistance and support, enhancing their overall browsing experience. These intelligent bots are capable of understanding natural language and can answer frequently asked questions, offer product recommendations, and even help with troubleshooting. Moreover, clever recommendation systems have emerged as a powerful tool on modern websites. By analyzing user behavior, preferences, and historical data, these systems can intelligently suggest relevant products, articles, or services tailored to each user's unique interests. This not only saves users valuable time but also increases the chances of conversions and customer satisfaction. Additionally, customization algorithms have revolutionized the way websites interact with users. By leveraging user preferences, browsing history, and demographic information, these algorithms can dynamically adjust the website's layout, content, and functionalities to suit individual user needs. This level of personalization enhances user engagement, boosts conversion rates, and ultimately leads to a more satisfying online experience. 
In summary, the integration of chatbots, clever recommendation systems, and customization algorithms into modern websites is transforming the way users interact with online platforms. These advanced technologies not only streamline user experiences but also contribute to increased customer satisfaction, improved conversions, and overall website success.
Keywords: Javascript, machine learning, artificial intelligence, web development
Procedia PDF Downloads 80
2959 Technical and Economic Potential of Partial Electrification of Railway Lines
Authors: Rafael Martins Manzano Silva, Jean-Francois Tremong
Abstract:
Electrification of railway lines makes it possible to increase the speed, power, capacity and energy efficiency of rolling stock. However, this electrification process is complex and costly. An electrification project is not just about the design of the catenary: it also includes the installation of the structures around electrification, such as substations, electrical isolation, signalling, telecommunication and civil engineering structures. France has more than 30,000 km of railways, of which only 53% are electrified. The other 47% of railways use diesel locomotives and represent only 10% of the traffic (tonne-km). For this reason, a new type of electrification, less expensive than the usual one, is required to enable the modernization of these railways. One solution could be the use of hybrid trains. This technology opens up new opportunities for less expensive infrastructure development, such as the partial electrification of railway lines. On a partially electrified railway, the power supply of these hybrid trains could come either from the catenary or from an on-board energy storage system (ESS). The on-board ESS would then cover the energy needs of the train along the non-electrified zones, while in electrified zones the catenary would feed the train and recharge the on-board ESS. This paper deals with identifying the technical and economic potential of partial electrification of railway lines. The study provides different electrification scenarios in which the most expensive places to electrify are instead covered by the on-board ESS. The target is to reduce the cost of new electrification projects, i.e. to reduce the cost of electrification infrastructure without increasing the cost of rolling stock. In this study, scenarios are constructed as a function of the electrification cost of each structure.
The electrification cost varies considerably because the installation of catenary supports in tunnels and on bridges and viaducts is much more expensive than in other zones of the railway. These scenarios are used to describe the power supply system and to choose between the catenary and the on-board energy storage depending on the position of the train on the railway. To identify the influence of each partial electrification scenario on the sizing of the on-board ESS, a model of the railway line and of the rolling stock is developed for a real case: a railway line located in the south of France. The energy consumption and the power demanded at each point of the line for each power supply (catenary or on-board ESS) are provided at the end of the simulation. Finally, the cost of a partial electrification is obtained by adding the civil engineering costs of the zones to be electrified to the cost of the on-board ESS. The study of the technical and economic potential ends with the identification of the most economically interesting electrification scenario.
Keywords: electrification, hybrid, railway, storage
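The scenario construction described here can be sketched numerically. The following is a minimal, illustrative cost comparison; the per-kilometre catenary costs, the per-train ESS cost and the line layout are invented placeholders for the sketch, not figures from the study:

```python
# Illustrative comparison of full vs. partial electrification cost.
# All unit costs below are assumptions for the sketch, not study data.
CATENARY_COST_PER_KM = {"open": 1.0, "tunnel": 4.0, "viaduct": 3.0}  # M EUR/km
ESS_COST_PER_TRAIN = 0.5  # M EUR, assumed price of one on-board storage pack

def scenario_cost(segments, electrified, n_trains):
    """Total cost of one electrification scenario.

    segments: list of (zone_kind, length_km) tuples describing the line.
    electrified: set of segment indices equipped with catenary.
    Any non-electrified segment is covered by the on-board ESS instead,
    which adds a per-train storage cost.
    """
    catenary = sum(CATENARY_COST_PER_KM[kind] * km
                   for i, (kind, km) in enumerate(segments) if i in electrified)
    has_gaps = any(i not in electrified for i in range(len(segments)))
    return catenary + (ESS_COST_PER_TRAIN * n_trains if has_gaps else 0.0)

line = [("open", 20), ("tunnel", 3), ("open", 15), ("viaduct", 2)]
full = scenario_cost(line, {0, 1, 2, 3}, n_trains=10)  # electrify everything
partial = scenario_cost(line, {0, 2}, n_trains=10)     # skip tunnel and viaduct
```

With these assumed numbers, skipping the tunnel and viaduct (the expensive zones) and fitting ten trains with storage comes out cheaper than full electrification, which is exactly the trade-off the scenarios explore.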
Procedia PDF Downloads 431
2958 Climate Changes Impact on Artificial Wetlands
Authors: Carla Idely Palencia-Aguilar
Abstract:
Artificial wetlands play an important role in the Guasca Municipality in Colombia, not only because they are used by agroindustry, but also because more than 45 species were found there, some of which are endemic or migratory birds. Remote sensing was used to determine changes in the area occupied by water in the artificial wetlands by means of ASTER and MODIS images for different time periods. Evapotranspiration was also determined by three methods: the Surface Energy Balance System (SEBS, Su) algorithm, the Surface Energy Balance (SEBAL, Bastiaanssen) algorithm, and FAO potential evapotranspiration. Empirical equations were also developed to relate the Normalized Difference Vegetation Index (NDVI) to net radiation, ambient temperature and rainfall, with an R2 of 0.83. Groundwater level fluctuations on a daily basis were studied as well. Data from a piezometer placed next to the wetland were fitted to rainfall changes (with two weather stations located in the proximity of the wetlands) by means of multiple regression and time series analysis; the R2 between calculated and measured values was higher than 0.98. Nearby weather stations provided data for ordinary kriging as well as for the Digital Elevation Model (DEM) developed using PCI software. Standard models (exponential, spherical, circular, Gaussian, linear) to describe spatial variation were tested. Ordinary cokriging between the height and rainfall variables was also tested, to determine whether the accuracy of the interpolation would increase. The results showed no significant differences, given that the mean result of the spherical function for the rain samples after ordinary kriging was 58.06 with a standard deviation of 18.06.
Cokriging, using a spherical function for the rain variable, a power function for the height variable and a spherical function for the cross variable (rain and height), had a mean of 57.58 and a standard deviation of 18.36. The threat of eutrophication was also studied, given the lack of awareness among neighbouring residents and deficiencies in government oversight. Water quality was determined over the years; different parameters were studied to determine the chemical characteristics of the water. In addition, 600 pesticides were screened for by gas and liquid chromatography. Results showed that coliforms, nitrogen, phosphorus and prochloraz were the most significant contaminants.
Keywords: DEM, evapotranspiration, geostatistics, NDVI
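For reference, the spherical model named above, one of the standard semivariogram functions tested for ordinary kriging, can be written compactly. This is the generic textbook form; the nugget, sill and range values in the example are arbitrary, not the fitted parameters from the study:

```python
def spherical_variogram(h, nugget, sill, rng):
    """Spherical semivariogram: rises as 1.5*(h/a) - 0.5*(h/a)**3 up to
    the range a, then flattens at nugget + sill."""
    if h <= 0.0:
        return 0.0
    if h >= rng:
        return nugget + sill
    ratio = h / rng
    return nugget + sill * (1.5 * ratio - 0.5 * ratio ** 3)
```

Plugging in a nugget of 0.1, a sill of 1.0 and a range of 10 gives 0.7875 at half the range and 1.1 at and beyond the range.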
Procedia PDF Downloads 120
2957 Variations in the Frequency-Magnitude Distribution with Depth in Kalabsha Area, Aswan, South Egypt
Authors: Ezzat Mohamed El-Amin
Abstract:
Mapping the earthquake-size distribution in various tectonic regimes on local to regional scales reveals statistically significant variations in the b-value of the frequency-magnitude distribution, in the range of at least 0.4 to 2.0. We map the earthquake frequency-magnitude distribution (b-value) as a function of depth in the reservoir-triggered seismicity (RTS) area of the Kalabsha region in southern Egypt. About 1680 well-located events recorded during 1981–2014 in the Kalabsha region are selected for the analysis. The earthquake data sets are separated into 5 km depth intervals from 0 to 25 km. The result shows a systematic decrease in b-value down to 12 km, followed by an increase. The increase in b-value is interpreted to be caused by the presence of fluids. We also investigate the spatial distribution of the b-value with depth. Significant variations are detected, with b ranging from 0.7 to 1.19. Low b-value areas at 5 km depth indicate localized high stresses, which are favourable for future rupture.
Keywords: seismicity, frequency-magnitude, b-value, earthquake
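b-value mapping of this kind is commonly based on the Aki/Utsu maximum-likelihood estimator; a minimal sketch follows (the completeness magnitude and binning width here are illustrative assumptions, not the paper's processing parameters):

```python
import math

def b_value_mle(magnitudes, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value:
    b = log10(e) / (mean(M) - (Mc - dm/2)).

    magnitudes: catalogue magnitudes; only events with M >= mc are used.
    mc: magnitude of completeness; dm: binning width (Utsu correction).
    """
    selected = [m for m in magnitudes if m >= mc]
    if not selected:
        raise ValueError("no events above the completeness magnitude")
    mean_m = sum(selected) / len(selected)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))
```

Applied per 5 km depth interval, this estimator yields the kind of depth profile reported above; a larger mean magnitude above Mc gives a lower b.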
Procedia PDF Downloads 559
2956 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation
Authors: Abdal-Hafeez Alhussein
Abstract:
Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models—especially machine learning and natural language processing—are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.
Keywords: artificial intelligence, information technology, automation, scalability
Procedia PDF Downloads 17
2955 Harnessing Artificial Intelligence for Early Detection and Management of Infectious Disease Outbreaks
Authors: Amarachukwu B. Isiaka, Vivian N. Anakwenze, Chinyere C. Ezemba, Chiamaka R. Ilodinso, Chikodili G. Anaukwu, Chukwuebuka M. Ezeokoli, Ugonna H. Uzoka
Abstract:
Infectious diseases continue to pose significant threats to global public health, necessitating advanced and timely detection methods for effective outbreak management. This study explores the integration of artificial intelligence (AI) in the early detection and management of infectious disease outbreaks. Leveraging vast datasets from diverse sources, including electronic health records, social media, and environmental monitoring, AI-driven algorithms are employed to analyze patterns and anomalies indicative of potential outbreaks. Machine learning models, trained on historical data and continuously updated with real-time information, contribute to the identification of emerging threats. The implementation of AI extends beyond detection, encompassing predictive analytics for disease spread and severity assessment. Furthermore, the paper discusses the role of AI in predictive modeling, enabling public health officials to anticipate the spread of infectious diseases and allocate resources proactively. Machine learning algorithms can analyze historical data, climatic conditions, and human mobility patterns to predict potential hotspots and optimize intervention strategies. The study evaluates the current landscape of AI applications in infectious disease surveillance and proposes a comprehensive framework for their integration into existing public health infrastructures. The implementation of an AI-driven early detection system requires collaboration between public health agencies, healthcare providers, and technology experts. Ethical considerations, privacy protection, and data security are paramount in developing a framework that balances the benefits of AI with the protection of individual rights. The synergistic collaboration between AI technologies and traditional epidemiological methods is emphasized, highlighting the potential to enhance a nation's ability to detect, respond to, and manage infectious disease outbreaks in a proactive and data-driven manner. 
The findings of this research underscore the transformative impact of harnessing AI for early detection and management, offering a promising avenue for strengthening the resilience of public health systems in the face of evolving infectious disease challenges. This paper advocates for the integration of artificial intelligence into the existing public health infrastructure for early detection and management of infectious disease outbreaks. The proposed AI-driven system has the potential to revolutionize the way we approach infectious disease surveillance, providing a more proactive and effective response to safeguard public health.
Keywords: artificial intelligence, early detection, disease surveillance, infectious diseases, outbreak management
Procedia PDF Downloads 66
2954 Biodiesel Production from Palm Oil Using an Oscillatory Baffled Reactor
Authors: Malee Santikunaporn, Tattep Techopittayakul, Channarong Asavatesanupap
Abstract:
Biofuel production, especially that of biodiesel, has gained tremendous attention during the last decade due to environmental concerns and dwindling petroleum reserves. This research investigates the influence of operating parameters, such as the alcohol-to-oil molar ratio (4:1, 6:1, and 9:1) and the amount of catalyst (1, 1.5, and 2 wt.%), on the transesterification of refined palm oil (RPO) in a medium-scale oscillatory baffled reactor. It has been shown that an increase in the methanol-to-oil ratio results in an increase in fatty acid methyl ester (FAME) content, while the amount of catalyst has an insignificant effect on the FAME content. Engine testing was performed on B0 (100 v/v% diesel) and a blended fuel, B50 (50 v/v% diesel). Combustion of B50 was found to give lower torque compared to pure diesel, and the exhaust gas from B50 was found to contain lower concentrations of CO and CO2.
Keywords: biodiesel, palm oil, transesterification, oscillatory baffled reactor
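The molar-ratio parameter translates into reagent masses in a straightforward way. A back-of-the-envelope sketch, assuming a typical literature value of about 847 g/mol for the average palm-oil triglyceride (not a value measured in this study):

```python
M_OIL = 847.0       # g/mol, assumed average molar mass of palm-oil triglyceride
M_METHANOL = 32.04  # g/mol

def methanol_mass(oil_mass_g, molar_ratio):
    """Methanol mass (g) needed for a given oil mass at molar_ratio : 1."""
    return (oil_mass_g / M_OIL) * molar_ratio * M_METHANOL

def catalyst_mass(oil_mass_g, wt_percent):
    """Catalyst mass (g) at the stated weight percentage of the oil."""
    return oil_mass_g * wt_percent / 100.0
```

For one mole (847 g) of oil at the 6:1 ratio, that is about 192 g of methanol, plus about 8.5 g of catalyst at 1 wt.%.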
Procedia PDF Downloads 177
2953 Seismic Data Analysis of Intensity, Orientation and Distribution of Fractures in Basement Rocks for Reservoir Characterization
Authors: Mohit Kumar
Abstract:
Natural fractures are classified into two broad categories, joints and faults, on the basis of shear movement in the deposited strata. Natural fractures always have a strong structural relationship with extensional or non-extensional tectonics, and sometimes the result is seen in the form of microcracks. Geological evidence suggests that both large and small-scale fractures help in analysing seismic anisotropy, which contributes essentially to characterizing the petrophysical behaviour associated with directional fluid migration. One may ask why basement studies are needed at all, since basement has historically been treated as non-productive and geoscientists took little interest in exploring basement rocks. Basement rock is subjected to high pressure and temperature and tends to be highly fractured because of the tectonic stresses applied to the formation, along with other geological factors such as depositional trend, internal stress of the rock body, rock rheology, and pore-fluid and capillary pressure. Sometimes carbonate rocks also play the role of basement, with an igneous body (e.g. basalt) deposited over them, and fluid migrates from the carbonate to the igneous rock owing to buoyancy forces and the permeability generated by fracturing. So, in order to analyse the complete petroleum system, fluid migration characterization (FMC) through the fractured media is necessary, including fracture intensity, orientation and distribution in both the basement rock and the country rock. A good understanding of fractures can thus help in projecting a wellbore trajectory that passes through potentially permeable zones generated under intensified pressure-temperature and tectonic stress conditions.
This paper deals with the analysis of these fracture properties, namely intensity, orientation and distribution, in basement rock. Large-scale fractures can be interpreted on a seismic section; small-scale fractures, however, are ambiguous to interpret because fractures in basement rock lie below the seismic wavelength, so their identification is error-prone. Seismic attribute techniques also help to delineate fractures and subtle changes in fracture zones, which can be inferred from azimuthal anisotropy in velocity and amplitude, and from spectral decomposition. Seismic azimuthal anisotropy derives fracture intensity and orientation from compressional-wave and converted-wave data, based on the variation of amplitude or velocity with azimuth. Still, detailed analysis of fractured basement requires full isotropic and anisotropic analysis of the fracture matrix and the surrounding rock matrix in order to characterize the spatial variability of the basement fractures that support the migration of fluid from the basement into the overlying rock.
Keywords: basement rock, natural fracture, reservoir characterization, seismic attribute
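In its simplest form, the azimuthal-anisotropy step reduces to fitting A(phi) = A0 + B*cos(2*(phi - phi0)) to amplitude-versus-azimuth data, where B tracks fracture intensity and phi0 the fracture strike. Below is a simplified sketch that assumes equally spaced azimuths (so the least-squares fit collapses to simple projections); it is not the full anisotropic inversion used in practice:

```python
import math

def fit_azimuthal_anisotropy(azimuths, amplitudes):
    """Fit A(phi) = a0 + B*cos(2*(phi - phi0)).

    Assumes azimuths equally spaced over [0, pi), which makes the
    cos(2*phi)/sin(2*phi) basis orthogonal, so the coefficients follow
    from projections. Returns (mean amplitude, anisotropy B, strike phi0).
    """
    n = len(azimuths)
    a0 = sum(amplitudes) / n
    a1 = 2.0 / n * sum(a * math.cos(2 * p) for p, a in zip(azimuths, amplitudes))
    a2 = 2.0 / n * sum(a * math.sin(2 * p) for p, a in zip(azimuths, amplitudes))
    return a0, math.hypot(a1, a2), 0.5 * math.atan2(a2, a1)
```

On synthetic data with A0 = 2, B = 0.5 and phi0 = 0.3 rad, the projections recover all three parameters.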
Procedia PDF Downloads 197
2952 Analysis of Reflection of Elastic Waves in Three Dimensional Model Comprised with Viscoelastic Anisotropic Medium
Authors: Amares Chattopadhyay, Akanksha Srivastava
Abstract:
A unified approach is presented to study the reflection of a plane wave in a three-dimensional model comprising a triclinic viscoelastic medium. The phase velocities of the reflected qP, qSV and qSH waves have been calculated for the concerned medium using the eigenvalue approach. A generalized method has been implemented to compute the complex form of the amplitude ratios. Further, we discuss the nature of the reflection coefficients of the qP, qSV and qSH waves. The amplitude ratios are found to be strongly influenced by the viscoelastic parameter, the polar angle and the azimuthal angle. The article focuses in particular on the effect of viscoelasticity in highly anisotropic media, which yields notable information about the reflection coefficients of the qP, qSV and qSH waves. The outcomes may be further useful for better exploration of all types of hydrocarbon reservoirs and for advances in the field of reflection seismology.
Keywords: amplitude ratios, three dimensional, triclinic, viscoelastic
Procedia PDF Downloads 230
2951 Off-Policy Q-learning Technique for Intrusion Response in Network Security
Authors: Zheni S. Stefanova, Kandethody M. Ramachandran
Abstract:
With the increasing dependency on our computer devices, we face the necessity of adequate, efficient and effective mechanisms for protecting our networks. There are two main problems that intrusion detection systems (IDS) attempt to solve: 1) detecting an attack by analyzing the incoming traffic and inspecting the network (intrusion detection), and 2) producing a prompt response when the attack occurs (intrusion prevention). It is critical to create an intrusion detection model that detects a breach in the system in time, and it is also challenging to make it provide an automatic response, with an acceptable delay, at every stage of the monitoring process. We cannot afford to adopt security measures that demand high computational power, nor can we accept a mechanism that reacts with a delay. In this paper, we propose an intrusion response mechanism based on artificial intelligence and, more precisely, reinforcement learning techniques (RLT). RLT helps us create a decision agent that controls the process of interacting with the undetermined environment. The goal is to find an optimal policy, which represents the intrusion response; the reinforcement learning problem is therefore solved using a Q-learning approach. Our agent produces an optimal immediate response in the process of evaluating the network traffic. This Q-learning approach establishes the balance between exploration and exploitation and provides a unique, self-learning and strategic artificial intelligence response mechanism for IDS.
Keywords: cyber security, intrusion prevention, optimal policy, Q-learning
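The heart of such a mechanism, the off-policy Q-learning update together with an epsilon-greedy policy for the exploration/exploitation balance, fits in a few lines. The states, actions and reward below are toy placeholders standing in for the paper's traffic model:

```python
import random

ALPHA, GAMMA = 0.5, 0.9  # learning rate and discount factor (assumed values)

def choose_action(q, state, actions, epsilon=0.1):
    """Epsilon-greedy policy: explore with probability epsilon, else exploit."""
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: q.get((state, a), 0.0))

def q_update(q, state, action, reward, next_state, actions):
    """Off-policy update: Q(s,a) += alpha*(r + gamma*max_a' Q(s',a') - Q(s,a))."""
    best_next = max(q.get((next_state, a), 0.0) for a in actions)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + ALPHA * (reward + GAMMA * best_next - old)
```

Rewarding a "block" response to malicious traffic quickly raises its Q-value, so the greedy policy converges on blocking for that state.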
Procedia PDF Downloads 236
2950 Quality and Shelf life of UHT Milk Produced in Tripoli, Libya
Authors: Faozia A. S. Abuhtana, Yahia S. Abujnah, Said O. Gnann
Abstract:
Ultra-high-temperature (UHT) processed milk is widely distributed and preferred in numerous countries all over the world due to its relatively high quality and long shelf life. Because of the notably high consumption rate of UHT milk in Libya, and the scarcity of local studies on the product, this study was designed to assess the shelf life of locally produced and imported reconstituted sterilized whole milk samples marketed in Tripoli, Libya. Four locally produced and three imported brands were used in this study. All samples were stored at room temperature (25 ± 2 °C) for an 8-month period and subjected to physical, chemical, microbiological and sensory tests. These tests included measurement of pH, specific gravity and percent acidity, and determination of fat, protein and melamine content. Microbiological tests included total aerobic count, total psychrotrophic bacteria, total spore-forming bacteria and total coliform counts. Results indicated no detectable microbial growth of any type during the study period, and no melamine was detected in any sample. On the other hand, a gradual decline in pH accompanied by a gradual increase in percent acidity was observed in both locally produced and imported samples. These changes reached their lowest and highest values, respectively, during the 24th week of storage. For instance, pH values were (6.40, 6.55, 6.55, 6.15) for local brands and (6.30, 6.50, 6.20) for imported brands, while percent acidity reached (0.185, 0.181, 0.170, 0.183) and (0.180, 0.180, 0.171) for local and imported brands, respectively, at the 24th week. A similar pattern of decline was also observed in specific gravity, fat and protein content in some local and imported samples, especially at later stages of the study. In both cases, some of the recorded pH, percent acidity, specific gravity and fat content values were in violation of the accepted limits set by Libyan standard no.
356 for sterilized milk. These changes in pH, percent acidity and other UHT sterilized milk constituents during storage coincided with a gradual decrease in the degree of acceptance of the stored milk samples of both types, as shown by the sensory scores recorded by the panelists. In either case, the degree of acceptance was significantly low at late stages of storage, and most milk samples became relatively unacceptable after the 18th and 20th week for untrained and trained panelists, respectively.
Keywords: UHT milk, shelf life, quality, gravity, bacteria
Procedia PDF Downloads 338
2949 Prediction of Road Accidents in Qatar by 2022
Authors: M. Abou-Amouna, A. Radwan, L. Al-kuwari, A. Hammuda, K. Al-Khalifa
Abstract:
There is growing concern over the increasing incidence of road accidents and the consequent loss of human life in Qatar. In light of the planned World Cup 2022 event, Qatar should take into consideration future deaths caused by road accidents, and past trends should be examined to give a reasonable picture of what may happen. Qatar's roads should be laid out and paved in a way that accommodates the high population expected at that time, since there will be a huge number of visitors from around the world. Qatar should also consider the road accident risks arising in that period and plan to maintain high-level safety strategies. Given the increase in the number of road accidents in Qatar from 1995 to 2012, the elements affecting and causing road accidents are analysed. This paper aims to identify and critique the factors that have a strong effect on causing road accidents in the State of Qatar, and to predict the total number of road accidents in Qatar in 2022. Alternative methods are discussed, and the ones most applicable according to previous research are selected for further study. The methods that suit the existing case in Qatar are the multiple linear regression (MLR) model and the artificial neural network (ANN). These methods are analysed and their findings compared. Using MLR, the number of accidents in 2022 is predicted to be 355,226; using the ANN, 216,264. We conclude that MLR gave better results than the ANN because the artificial neural network does not fit data with a large range of variation well.
Keywords: road safety, prediction, accident, model, Qatar
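The MLR side of the comparison amounts to an ordinary least-squares fit of accident counts against time (the full model has several predictors; this sketch keeps one). The counts below are made up for illustration, not the study's data:

```python
def fit_linear_trend(years, counts):
    """Closed-form OLS for counts ~ intercept + slope * year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(counts) / n
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, counts))
    slope = sxy / sxx
    return mean_y - slope * mean_x, slope  # (intercept, slope)

def predict(intercept, slope, year):
    return intercept + slope * year
```

Fitting the 1995-2012 history and evaluating the line at 2022 is how a trend-based MLR forecast of this kind would be produced in its simplest form.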
Procedia PDF Downloads 258
2948 AIR SAFE: an Internet of Things System for Air Quality Management Leveraging Artificial Intelligence Algorithms
Authors: Mariangela Viviani, Daniele Germano, Simone Colace, Agostino Forestiero, Giuseppe Papuzzo, Sara Laurita
Abstract:
Nowadays, people spend most of their time in closed environments, in offices, or at home. Therefore, secure and highly livable environmental conditions are needed to reduce the probability of aerial viruses spreading. Also, to lower the human impact on the planet, it is important to reduce energy consumption. Heating, Ventilation, and Air Conditioning (HVAC) systems account for the major part of energy consumption in buildings [1]. Devising systems to control and regulate the airflow is, therefore, essential for energy efficiency. Moreover, an optimal setting for thermal comfort and air quality is essential for people’s well-being, at home or in offices, and increases productivity. Thanks to the features of Artificial Intelligence (AI) tools and techniques, it is possible to design innovative systems with: (i) Improved monitoring and prediction accuracy; (ii) Enhanced decision-making and mitigation strategies; (iii) Real-time air quality information; (iv) Increased efficiency in data analysis and processing; (v) Advanced early warning systems for air pollution events; (vi) Automated and cost-effective monitoring network; and (vii) A better understanding of air quality patterns and trends. We propose AIR SAFE, an IoT-based infrastructure designed to optimize air quality and thermal comfort in indoor environments leveraging AI tools. AIR SAFE employs a network of smart sensors collecting indoor and outdoor data to be analyzed in order to take any corrective measures to ensure the occupants’ wellness. The data are analyzed through AI algorithms able to predict the future levels of temperature, relative humidity, and CO₂ concentration [2]. Based on these predictions, AIR SAFE takes actions, such as opening/closing the window or the air conditioner, to guarantee a high level of thermal comfort and air quality in the environment.
In this contribution, we present the results from the AI algorithm we have implemented on the first set of data collected in a real environment. The results were compared with other models from the literature to validate our approach.
Keywords: air quality, internet of things, artificial intelligence, smart home
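The control loop described above — predict near-term indoor conditions, then actuate a window or the HVAC — can be sketched minimally. The linear-trend predictor and the 1000 ppm CO₂ threshold below are illustrative assumptions, not the AI models or setpoints used by AIR SAFE:

```python
import numpy as np

CO2_LIMIT_PPM = 1000  # illustrative comfort threshold, not from the paper

def predict_next(history, horizon=1):
    """Fit a linear trend to recent readings and extrapolate `horizon` steps."""
    t = np.arange(len(history))
    slope, intercept = np.polyfit(t, history, 1)
    return slope * (len(history) - 1 + horizon) + intercept

def decide_action(co2_history):
    """Trigger ventilation if the predicted CO2 level exceeds the limit."""
    predicted = predict_next(co2_history)
    return "open_window" if predicted > CO2_LIMIT_PPM else "hold"

readings = [620, 680, 750, 830, 910, 990]  # ppm, rising occupancy
print(decide_action(readings))
```

In AIR SAFE the predictor is a learned AI model rather than a linear fit, but the predict-then-actuate structure is the same.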
Procedia PDF Downloads 93
2947 A Combined Approach Based on Artificial Intelligence and Computer Vision for Qualitative Grading of Rice Grains
Authors: Hemad Zareiforoush, Saeed Minaei, Ahmad Banakar, Mohammad Reza Alizadeh
Abstract:
The quality inspection of rice (Oryza sativa L.) during its various processing stages is very important. In this research, an artificial intelligence-based model coupled with computer vision techniques was developed as a decision support system for qualitative grading of rice grains. For conducting the experiments, first, 25 samples of rice grains with different levels of percentage of broken kernels (PBK) and degree of milling (DOM) were prepared, and their qualitative grade was assessed by experienced experts. Then, the quality parameters of the same samples examined by the experts were determined using a machine vision system. A grading model was developed based on fuzzy logic theory in MATLAB software to relate the qualitative characteristics of the product to its quality. In total, 25 rules were used for qualitative grading, based on the AND operator and the Mamdani inference system. The fuzzy inference system consisted of two input linguistic variables, namely DOM and PBK, which were obtained by the machine vision system, and one output variable (quality of the product). The model output was finally defuzzified using the Center of Maximum (COM) method. In order to evaluate the developed model, the output of the fuzzy system was compared with the experts’ assessments. It was revealed that the developed model can estimate the qualitative grade of the product with an accuracy of 95.74%.
Keywords: machine vision, fuzzy logic, rice, quality
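A Mamdani inference step of the kind described (AND as min, aggregation as max, Center of Maximum defuzzification) can be sketched as follows. The membership ranges and the three sample rules are illustrative assumptions; the actual system used 25 expert-calibrated rules:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

# Illustrative linguistic terms (ranges are assumptions, not the paper's calibration)
DOM_SETS = {"low": (0, 0, 50), "medium": (25, 50, 75), "high": (50, 100, 100)}
PBK_SETS = {"low": (0, 0, 20), "medium": (10, 25, 40), "high": (30, 100, 100)}
Q_SETS   = {"poor": (0, 0, 4), "fair": (2, 5, 8), "good": (6, 10, 10)}

# A few Mamdani rules with the AND (min) operator; the paper used 25 such rules
RULES = [
    (("high", "low"), "good"),
    (("medium", "medium"), "fair"),
    (("low", "high"), "poor"),
]

def grade(dom, pbk):
    """Return a 0-10 quality grade via min-AND, max-aggregation, COM defuzzification."""
    q = np.linspace(0, 10, 501)
    aggregated = np.zeros_like(q)
    for (dom_term, pbk_term), q_term in RULES:
        strength = min(tri(dom, *DOM_SETS[dom_term]), tri(pbk, *PBK_SETS[pbk_term]))
        aggregated = np.maximum(aggregated, np.minimum(strength, tri(q, *Q_SETS[q_term])))
    peak = aggregated.max()
    if peak == 0:
        return None
    # Center of Maximum: mean of the points where the aggregated output peaks
    return float(q[np.isclose(aggregated, peak)].mean())

print(round(grade(80, 5), 2))   # well-milled, few broken kernels -> high grade
```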
Procedia PDF Downloads 419
2946 Comparative Evaluation of Different Extenders and Sperm Protectors to Keep the Spermatozoa Viable for More than 24 Hours
Authors: A. M. Raseona, D. M. Barry, T. L. Nedambale
Abstract:
Preservation of semen is an important process to ensure that semen quality is sufficient for assisted reproductive technology. This study evaluated the effectiveness of different extenders to preserve Nguni bull semen stored at a controlled room temperature of 24 °C for three days, as an alternative to the frozen-thawed semen straws used for artificial insemination. Semen samples were collected from two Nguni bulls using an electro-ejaculator and transported to the laboratory for evaluation. Pooled semen was aliquoted into three extenders (Triladyl, Ham’s F10, and M199) at a dilution ratio of 1:4, then stored at a controlled room temperature of 24 °C. Sperm motility was analysed after 0, 24, 48, and 72 hours. Morphology and viability were analysed after 72 hours. The study was replicated four times, and the data were analysed by analysis of variance (ANOVA). Triladyl showed a higher viability percentage and consistent total motility for three days. Ham’s F10 showed higher progressive motility compared to the other extenders. There was no significant difference in viability between Ham’s F10 and M199. No significant difference was observed in total abnormality between the two Nguni bulls. In conclusion, Nguni semen can be preserved in Triladyl or in Ham’s F10 and M199 culture media stored at 24 °C and remain viable for three days. Triladyl proved to be the best extender, showing high viability and consistency in total motility compared to Ham’s F10 and M199.
Keywords: bull semen, artificial insemination, Triladyl, Ham’s F10, M199, viability
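As a sketch of the ANOVA step, a hand-rolled one-way analysis can be run on motility data of the kind collected here. The readings below are illustrative placeholders, not the study's measurements:

```python
import numpy as np

def one_way_anova(*groups):
    """Hand-rolled one-way ANOVA: returns (F statistic, df_between, df_within)."""
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_data) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical total-motility readings (%) after 72 h for the three extenders
triladyl = [72, 75, 70, 74]
hams_f10 = [65, 68, 63, 66]
m199     = [58, 61, 55, 60]

f_stat, dfb, dfw = one_way_anova(triladyl, hams_f10, m199)
print(f"F({dfb},{dfw}) = {f_stat:.1f}")  # compare against an F-table critical value
```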
Procedia PDF Downloads 500
2945 Neuro-Fuzzy Approach to Improve Reliability in Auxiliary Power Supply System for Nuclear Power Plant
Authors: John K. Avor, Choong-Koo Chang
Abstract:
The transfer of electrical loads at power generation stations from the Standby Auxiliary Transformer (SAT) to the Unit Auxiliary Transformer (UAT), and vice versa, is through a fast bus transfer scheme. Fast bus transfer is a time-critical application where the transfer process depends on various parameters, thus transfer schemes apply advanced algorithms to ensure power supply reliability and continuity. In a nuclear power generation station, supply continuity is essential, especially for critical class 1E electrical loads. Bus transfers must, therefore, be executed accurately within 4 to 10 cycles in order to achieve safety system requirements. However, the main problem is that there are instances where transfer schemes have been scrambled by inaccurate interpretation of key parameters and have consequently failed to transfer several critical loads from the UAT to the SAT during a main generator trip event. Although several techniques have been adopted to develop robust transfer schemes, a combination of Artificial Neural Networks and Fuzzy Systems (Neuro-Fuzzy) has not been extensively used. In this paper, we apply the concept of Neuro-Fuzzy to determine the plant operating mode and dynamically predict the appropriate bus transfer algorithm to be selected based on the first cycle of voltage information. The performance of the Sequential Fast Transfer and Residual Bus Transfer schemes was evaluated through simulation and integration of the Neuro-Fuzzy system. The objective of adopting the Neuro-Fuzzy approach in the bus transfer scheme is to utilize the signal validation capabilities of the artificial neural network, specifically the back-propagation algorithm, which is very accurate in learning completely new systems.
This research presents a combined effect of artificial neural networks and fuzzy systems to accurately interpret key bus transfer parameters, such as the magnitude of the residual voltage, the decay time, and the associated phase angle of the residual voltage, in order to determine the possibility of high-speed bus transfer for a particular bus and the corresponding transfer algorithm. This demonstrates potential for general applicability to improve the reliability of the auxiliary power distribution system. The scheme is implemented on the APR1400 nuclear power plant auxiliary system.
Keywords: auxiliary power system, bus transfer scheme, fuzzy logic, neural networks, reliability
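A minimal back-propagation sketch of the kind of classifier described — mapping first-cycle voltage features to a transfer decision — is shown below. The features, labels, decision rule, and network size are illustrative assumptions, not the APR1400 implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set (illustrative only): features stand in for residual
# voltage magnitude, decay time, and residual phase angle, all normalised to
# [0, 1]; label 1 means a high-speed transfer is judged feasible.
X = rng.uniform(0, 1, (200, 3))
y = ((X[:, 0] + X[:, 2]) > 1.0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)   # 8 hidden units
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 1.0
for _ in range(5000):                    # plain batch back-propagation
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = out - y                      # output delta (cross-entropy loss)
    d_h = (d_out @ W2.T) * h * (1 - h)   # hidden delta
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(0)

out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
accuracy = ((out > 0.5).astype(float) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```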
Procedia PDF Downloads 171
2944 Stability Optimization of NaBH₄ via pH and H₂O:NaBH₄ Ratios for Large Scale Hydrogen Production
Authors: Parth Mehta, Vedasri Bai Khavala, Prabhu Rajagopal, Tiju Thomas
Abstract:
There is an increasing need for alternative clean fuels, and hydrogen (H₂) has long been considered a promising solution with a high calorific value (142 MJ/kg). However, the storage of H₂ and the expensive processes for its generation have hindered its usage. Sodium borohydride (NaBH₄) can potentially be used as an economically viable means of H₂ storage. Thus far, there have been attempts to optimize the half-life of NaBH₄ in aqueous media by stabilizing it with sodium hydroxide (NaOH) at various pH values. Other reports have shown that H₂ yield and reaction kinetics remained constant for all H₂O:NaBH₄ ratios above 30:1, without any acidic catalysts. Here we highlight the importance of pH and the H₂O:NaBH₄ ratio (80:1, 40:1, 20:1 and 10:1 by weight) for NaBH₄ stabilization (half-life reaction time at room temperature) and corrosion minimization of H₂ reactor components. It is interesting to observe that at any particular pH ≥ 10 (e.g., pH = 10, 11 and 12), the H₂O:NaBH₄ ratio does not have the expected linear dependence with stability. On the contrary, high stability was observed at the 10:1 H₂O:NaBH₄ ratio across all these pH values. When the H₂O:NaBH₄ ratio is increased from 10:1 to 20:1 and beyond (up to 80:1), constant stability (% degradation) is observed with respect to time. For practical usage (consumption within 6 hours of making the NaBH₄ solution), 15% degradation at pH 11 and an H₂O:NaBH₄ ratio of 10:1 is recommended. Increasing this ratio demands a higher NaOH concentration at the same pH, thus requiring a higher concentration or volume of acid (e.g., HCl) for H₂ generation. The reactions are done with tap water to render the results useful from an industrial standpoint. The observed stability regimes are rationalized based on complexes associated with NaBH₄ when solvated in water, which depend sensitively on both pH and the H₂O:NaBH₄ ratio.
Keywords: hydrogen, sodium borohydride, stability optimization, H₂O:NaBH₄ ratio
Procedia PDF Downloads 120
2943 Graph-Oriented Summary for Optimized Resource Description Framework Graphs Streams Processing
Authors: Amadou Fall Dia, Maurras Ulbricht Togbe, Aliou Boly, Zakia Kazi Aoul, Elisabeth Metais
Abstract:
Existing RDF (Resource Description Framework) Stream Processing (RSP) systems allow continuous processing of RDF data issued from different application domains, such as weather stations measuring phenomena, geolocation, IoT applications, drinking water distribution management, and so on. However, the processing window often expires before finishing the entire session, and RSP systems immediately delete data streams after each processed window. Such a mechanism does not allow optimized exploitation of the RDF data streams, as the most relevant and pertinent information in the data is often not used in due time and is almost impossible to exploit for further analyses. It would be better to keep the most informative part of the data within streams while minimizing the memory storage space. In this work, we propose an RDF graph summarization system based on explicitly and implicitly expressed needs through three main approaches: (1) an approach for user queries (SPARQL) in order to extract their needs and group them into a more global query, (2) an extension of the closeness centrality measure from Social Network Analysis (SNA) to determine the most informative parts of the graph, and (3) an RDF graph summarization technique combining the extracted user query needs and the extended centrality measure. Experiments and evaluations show efficient results in terms of memory storage space and the most expected approximate query results on summarized graphs compared to the source ones.
Keywords: centrality measures, RDF graphs summary, RDF graphs stream, SPARQL query
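The closeness-centrality ingredient of approach (2) can be sketched on a toy graph: a breadth-first search computes shortest-path distances, and the standard closeness score follows. The graph and node names are illustrative only; the paper extends this measure for RDF graphs:

```python
from collections import deque

def closeness(graph, node):
    """Closeness centrality of `node` in an undirected graph given as an
    adjacency dict: (reachable nodes) / (sum of shortest-path distances)."""
    dist = {node: 0}
    queue = deque([node])
    while queue:                      # breadth-first search from `node`
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    reachable = len(dist) - 1
    return reachable / sum(d for d in dist.values() if d) if reachable else 0.0

# Toy RDF-like graph: triples flattened to undirected edges between resources
graph = {
    "station":  {"sensor1", "sensor2", "city"},
    "sensor1":  {"station", "reading1"},
    "sensor2":  {"station"},
    "city":     {"station"},
    "reading1": {"sensor1"},
}
ranked = sorted(graph, key=lambda n: closeness(graph, n), reverse=True)
print(ranked[0])  # the hub node scores highest
```

Nodes with high closeness are the "most informative parts" kept in the summary.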
Procedia PDF Downloads 203
2942 Comparison between Two Software Packages GSTARS4 and HEC-6 about Prediction of the Sedimentation Amount in Dam Reservoirs and to Estimate Its Efficient Life Time in the South of Iran
Authors: Fatemeh Faramarzi, Hosein Mahjoob
Abstract:
Building dams on rivers for the utilization of water resources disturbs the hydrodynamic equilibrium and results in all or part of the sediments carried by the water being deposited in the dam reservoir. This phenomenon also has significant impacts on the water and sediment flow regime and, in the long term, can cause morphological changes in the environment surrounding the river, reducing the useful life of the reservoir and threatening sustainable development through inefficient management of water resources. In the past, empirical methods were used to predict the sedimentation amount in dam reservoirs and to estimate their useful lifetime. Recently, however, mathematical and computational models have been widely used as suitable tools in reservoir sedimentation studies. These models usually solve the equations using the finite element method. This study compares the results from two software packages, GSTARS4 and HEC-6, in predicting the sedimentation amount in the Dez Dam, southern Iran. Each model provides a one-dimensional, steady-state simulation of sediment deposition and erosion by solving the equations of momentum, flow and sediment continuity, and sediment transport. GSTARS4 (Generalized Sediment Transport Model for Alluvial River Simulation), a one-dimensional mathematical model that simulates bed changes in both longitudinal and transverse directions by using flow tubes in a quasi-two-dimensional scheme, was used to calibrate a period of 47 years and forecast the next 47 years of sedimentation in the Dez Dam. This dam is among the highest dams in the world (203 m high), irrigates more than 125000 hectares of downstream lands, and plays a major role in flood control in the region. The input data, including geometry and hydraulic and sedimentary data, span 1955 to 2003 on a daily basis. To predict future river discharge, in this research, the time series data were assumed to repeat after 47 years.
Finally, the obtained result was very satisfactory in the delta region, with the output from GSTARS4 almost identical to the hydrographic profile surveyed in 2003. In the Dez reservoir, owing to its great length (65 km) and large volume, vertical currents are dominant, making the calculations by the above-mentioned method inaccurate. To solve this problem, we used the empirical reduction method to calculate the sedimentation in the downstream area, which gave very good results. Thus, we demonstrated that by combining these two methods, a very suitable model for sedimentation in the Dez Dam over the study period can be obtained. The present study also demonstrated that the outputs of both methods are essentially the same.
Keywords: Dez Dam, prediction, sedimentation, water resources, computational models, finite element method, GSTARS4, HEC-6
Procedia PDF Downloads 313
2941 A Non-Destructive Estimation Method for Internal Time in Perilla Leaf Using Hyperspectral Data
Authors: Shogo Nagano, Yusuke Tanigaki, Hirokazu Fukuda
Abstract:
Vegetables harvested early in the morning or late in the afternoon are valued in plant production, so the time of harvest is important. The biological functions known as circadian clocks have a significant effect on this harvest timing. The purpose of this study was to non-destructively estimate the state of the circadian clock and thereby construct a method for determining a suitable harvest time. We took eight samples of green perilla (Perilla frutescens var. crispa) every 4 hours, six times over 1 day, and analyzed all samples at the same time. A hyperspectral camera was used to collect spectrum intensities at 141 different wavelengths (350–1050 nm). Calculation of the correlations between the spectrum intensity at each wavelength and the harvest time suggested the suitability of the hyperspectral camera for non-destructive estimation. However, even the most highly correlated wavelength showed only a weak correlation, so we used machine learning to raise the accuracy of estimation and constructed a machine learning model to estimate the internal time of the circadian clock. Artificial neural networks (ANN) were used for machine learning because they are an effective analysis method for large amounts of data. Using the estimation model resulted in an error between estimated and real times of 3 min. The estimations were made in less than 2 hours. Thus, we successfully demonstrated this method of non-destructively estimating internal time.
Keywords: artificial neural network (ANN), circadian clock, green perilla, hyperspectral camera, non-destructive evaluation
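The estimation pipeline — regress internal time on spectral intensities while respecting the 24 h wrap-around — can be sketched with a linear least-squares stand-in for the ANN. The spectra below are synthetic placeholders, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the study's data: 48 leaf samples, 141-band spectra
# whose intensities vary with the circadian phase (purely illustrative).
hours = rng.uniform(0, 24, 48)
phase = 2 * np.pi * hours / 24
spectra = np.outer(np.sin(phase), rng.normal(size=141)) \
        + np.outer(np.cos(phase), rng.normal(size=141)) \
        + 0.05 * rng.normal(size=(48, 141))

# Regress the sine/cosine encoding of internal time on the spectra, then
# decode the angle with arctan2 — this avoids the 0 h / 24 h wrap-around.
targets = np.column_stack([np.sin(phase), np.cos(phase)])
X = np.column_stack([spectra, np.ones(len(spectra))])  # add intercept column
coef, *_ = np.linalg.lstsq(X, targets, rcond=None)
pred = X @ coef
est_hours = (np.arctan2(pred[:, 0], pred[:, 1]) % (2 * np.pi)) * 24 / (2 * np.pi)

err = np.abs(est_hours - hours)
err = np.minimum(err, 24 - err)  # circular error in hours
print(f"mean absolute error: {err.mean() * 60:.1f} min")
```

The sine/cosine target encoding is the standard trick for circular quantities; an ANN regressor, as in the paper, would slot in place of the least-squares fit.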
Procedia PDF Downloads 299
2940 Use of Polymeric Materials in the Architectural Preservation
Authors: F. Z. Benabid, F. Zouai, A. Douibi, D. Benachour
Abstract:
Fluorinated polymers and polyacrylics have been widely used in the field of historical monuments. PVDF offers easy processing, good UV resistance, and good chemical inertness. Despite the good physical characteristics of PMMA and its low price relative to PVDF, its deterioration under UV radiation limits its use as a protective agent for stone. A PVDF/PMMA blend is therefore a promising compromise for architectural restoration, since it is the best route, in terms of quality and price, to new polymeric materials with enhanced properties. Films of different compositions of the two polymers in a suitable solvent (DMF) were prepared and subjected to artificial ageing and salt-fog exposure, spectroscopic analysis (FTIR and UV), and optical analysis (refractive index). Given its great interest for the building field, a variety of standard tests has been elaborated for the first time at the central laboratory of ENAP (Souk-Ahras) in order to evaluate our blend's performance. The obtained results have allowed observing the behavior of the different compositions of the blend under the various tests. The addition of PVDF to PMMA enhances the latter's resistance to natural and artificial ageing and to salt fog. On the other hand, PMMA enhances the optical properties of the blend. Finally, the 70/30 composition of the blend is in agreement with the results of previous works and is the suitable proportion for an eventual application.
Keywords: blend, PVDF, PMMA, preservation, historic monuments
Procedia PDF Downloads 309
2939 Ethical Considerations of Disagreements Between Clinicians and Artificial Intelligence Recommendations: A Scoping Review
Authors: Adiba Matin, Daniel Cabrera, Javiera Bellolio, Jasmine Stewart, Dana Gerberi (librarian), Nathan Cummins, Fernanda Bellolio
Abstract:
OBJECTIVES: Artificial intelligence (AI) tools are becoming more prevalent in healthcare settings, particularly for diagnostic and therapeutic recommendations, with a surge expected in the coming years. The bedside use of this technology opens the possibility of disagreements between the recommendations of AI algorithms and clinicians’ judgment. There is a paucity of literature analyzing the nature and possible outcomes of these potential conflicts, particularly with respect to ethical considerations. The goal of this scoping review is to identify, analyze, and classify current themes and potential strategies addressing ethical conflicts arising from disagreement between AI and human recommendations. METHODS: A protocol was written prior to the initiation of the study. Relevant literature was searched by a medical librarian using the terms artificial intelligence, healthcare, liability, ethics, and conflict. The search was run in 2021 in Ovid Cochrane Central Register of Controlled Trials, Embase, Medline, IEEE Xplore, Scopus, and Web of Science Core Collection. Articles describing the role of AI in healthcare that mentioned conflict between humans and AI were included in the primary search. Two investigators working independently and in duplicate screened titles and abstracts and reviewed the full text of potentially eligible studies. Data were abstracted into tables and reported by themes. We followed the methodological guidelines of the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR). RESULTS: Of 6846 titles and abstracts, 225 full texts were selected, and 48 articles were included in this review: 23 as original research and review papers, and 25 as editorials and commentaries with similar themes. There was a lack of consensus in the included articles on who would be held liable for mistakes incurred by following AI recommendations.
There appears to be a dichotomy in the perceived ethical consequences depending on whether the negative outcome results from a human-versus-AI conflict or from a deviation from the standard of care. Themes identified included transparency versus opacity of recommendations, data bias, liability for outcomes, regulatory frameworks, and the overall scope of artificial intelligence in healthcare. A relevant issue identified was clinicians’ concern about the “black box” nature of these recommendations and their ability to judge the appropriateness of AI guidance. CONCLUSION: AI clinical tools are being rapidly developed and adopted, and the use of this technology will create conflicts between AI algorithms and healthcare workers, with various outcomes. In turn, these conflicts may have legal and ethical implications. There is limited consensus about ethical and liability responsibility for outcomes arising from such disagreements. This scoping review identified the importance of framing the problem in terms of whether or not the conflict involves the standard of care, informed by the themes of transparency/opacity, data bias, legal liability, absent regulatory frameworks, and understanding of the technology. Finally, only limited recommendations to mitigate ethical conflicts between AI and humans have been identified. Further work is necessary in this field.
Keywords: ethics, artificial intelligence, emergency medicine, review
Procedia PDF Downloads 93
2938 Numerical Evaluation of Lateral Bearing Capacity of Piles in Cement-Treated Soils
Authors: Reza Ziaie Moayed, Saeideh Mohammadi
Abstract:
Soft soils are encountered in many civil engineering projects, such as coastal, marine, and road works. Because of the low shear strength and stiffness of soft soils, large settlements and low bearing capacity occur under superstructure loads, making civil engineering activities more difficult and more costly. In the case of soft soils, ground improvement is a suitable method to increase the shear strength and stiffness for engineering purposes. In recent years, the artificial cementation of soil with cement and lime has been extensively used for soft soil improvement. Cement stabilization is a well-established technique for improving soft soils: artificial cementation increases the shear strength and hardness of natural soils. On the other hand, in soft soils, piles are commonly used to transfer loads to deeper strata. By using cement-treated soil around the piles, high bearing capacity and low settlement of the piles can be achieved. In the present study, the lateral bearing capacity of short piles in cemented soils is investigated by a numerical approach. For this purpose, the three-dimensional (3D) finite difference software FLAC 3D is used. Cement-treated soil exhibits strain hardening-softening behavior, because the bonds between the cementing agent and the soil particles break. To simulate this behavior, a strain hardening-softening constitutive model is used for the cement-treated soft soil. Additionally, the conventional elastic-plastic Mohr-Coulomb constitutive model and a linear elastic model are used for the stress-strain behavior of the natural soil and the pile, respectively. To determine the parameters of the constitutive models, and also to verify the numerical model, the results of available triaxial laboratory tests and in-situ pile loading tests in cement-treated soft soil are used. A parametric study is conducted to determine the parameters that most affect the bearing capacity of piles in cement-treated soils.
In the present paper, the effects of various lengths and heights of the artificially cemented area, different pile diameters and lengths, and the material properties are studied. The effect of the choice of constitutive model for the cement-treated soil on the computed bearing capacity of the pile is also investigated.
Keywords: bearing capacity, cement-treated soils, FLAC 3D, pile
Procedia PDF Downloads 126
2937 Generative Pre-Trained Transformers (GPT-3) and Their Impact on Higher Education
Authors: Sheelagh Heugh, Michael Upton, Kriya Kalidas, Stephen Breen
Abstract:
This article aims to raise awareness of the opportunities and issues that the artificial intelligence (AI) tool GPT-3 (Generative Pre-trained Transformer-3) brings to higher education. Technological disruptors have featured in higher education (HE) since Konrad Zuse developed the first functional programmable automatic digital computer. The flurry of technological advances, such as personal computers, smartphones, the world wide web, search engines, and artificial intelligence (AI), has regularly caused disruption and discourse across the educational landscape around harnessing the change for good. Accepting that AI’s influence is inevitable, we took a mixed-methods approach through participatory action research and evaluation. Joining HE communities, reviewing the literature, and conducting our own research around Chat GPT-3, we reviewed our institutional approach, changing our current practices and developing policy linked to assessments and the use of Chat GPT-3. We review the impact on HE of GPT-3, a high-powered natural language processing (NLP) system first seen in 2020. Historically, HE has flexed and adapted with each technological advancement, and the latest debates among educationalists focus on the issues around this version of AI, which creates natural human-language text from prompts and can also generate code and images. This paper explores how Chat GPT-3 affects the current educational landscape: we debate current views around plagiarism, research misconduct, and the credibility of assessment, and determine the tool’s value in developing skills for the workplace and enhancing critical analysis skills. These questions led us to review our institutional policy and explore the effects on our current assessments and the development of new assessments. Conclusions: After exploring the pros and cons of Chat GPT-3, it is evident that this form of AI cannot be un-invented. Technology needs to be harnessed for positive outcomes in higher education.
We have observed materials developed through AI and considered their potential effects on our development of future assessments and teaching methods. Materials developed through Chat GPT-3 can still aid student learning, but they have led us to redevelop our institutional policy around plagiarism and academic integrity.
Keywords: artificial intelligence, Chat GPT-3, intellectual property, plagiarism, research misconduct
Procedia PDF Downloads 89
2936 Restored CO₂ from Flue Gas and Utilization by Converting to Methanol by 3 Step Processes: Steam Reforming, Reverse Water Gas Shift and Hydrogenation
Authors: Rujira Jitrwung, Kuntima Krekkeitsakul, Weerawat Patthaveekongka, Chiraphat Kumpidet, Jarukit Tepkeaw, Krissana Jaikengdee, Anantachai Wannajampa
Abstract:
Flue gas discharged from a coal-fired or gas combustion power plant contains around 12% carbon dioxide (CO₂), 6% oxygen (O₂), and 82% nitrogen (N₂). CO₂ is a greenhouse gas that contributes to global warming. Carbon Capture, Utilization, and Storage (CCUS) is a tool to deal with this CO₂. Flue gas is drawn down from the chimney and filtered, then compressed to a pressure of 8 bar. This compressed flue gas is sent to a three-stage Pressure Swing Adsorption (PSA) unit filled with activated carbon. Experiments showed an optimum adsorption pressure of 7 bar, at which CO₂ was adsorbed step by step in the 1st, 2nd, and 3rd stages, giving CO₂ concentrations of 29.8, 66.4, and 96.7%, respectively. The mixed gas from the last stage is composed of 96.7% CO₂, 2.7% N₂, and 0.6% O₂. This mixed CO₂ product gas, obtained from the three-stage PSA, contains highly concentrated CO₂ and is ready for methanol synthesis. The mixed CO₂ was tested in a 5 Liter/day methanol synthesis reactor skid by a three-step process: steam reforming, then reverse water gas shift, and then hydrogenation. The results showed that mixed CO₂/CH₄ proportions of 70/30, 50/50, 30/70, and 10/90 % (v/v) yielded 2.4, 4.3, 5.6, and 6.0 Liter/day of methanol and saved 40, 30, 20, and 5% of CO₂, respectively. The optimum condition for both methanol yield and CO₂ consumption used a CO₂/CH₄ ratio of 43/57 % (v/v), which yielded 4.8 Liter/day of methanol and saved 27% of CO₂ compared with traditional methanol production from methane steam reforming (5 Liter/day), which consumes no CO₂.
Keywords: carbon capture utilization and storage, pressure swing adsorption, reforming, reverse water gas shift, methanol
Procedia PDF Downloads 187
2935 The Role of Artificial Intelligence in Creating Personalized Health Content for Elderly People: A Systematic Review Study
Authors: Mahnaz Khalafehnilsaz, Rozina Rahnama
Abstract:
Introduction: The elderly population is growing rapidly, and with this growth comes an increased demand for healthcare services. Artificial intelligence (AI) has the potential to revolutionize the delivery of healthcare services to the elderly population. In this study, the various ways in which AI is used to create health content for elderly people, and its transformative impact on the healthcare industry, are explored. Method: A systematic review of the literature was conducted to identify studies that have investigated the role of AI in creating health content specifically for elderly people. Several databases, including PubMed, Scopus, and Web of Science, were searched for relevant articles published between 2000 and 2022. The search strategy employed a combination of keywords related to AI, personalized health content, and the elderly. Studies that utilized AI to create health content for elderly individuals were included, while those that did not meet the inclusion criteria were excluded. A total of 20 articles that met the inclusion criteria were identified. Findings: The findings of this review highlight the diverse applications of AI in creating health content for elderly people. One significant application is the use of natural language processing (NLP), which involves the creation of chatbots and virtual assistants capable of providing personalized health information and advice to elderly patients. AI is also utilized in the field of medical imaging, where algorithms analyze medical images such as X-rays, CT scans, and MRIs to detect diseases and abnormalities. Additionally, AI enables the development of personalized health content for elderly patients by analyzing large amounts of patient data to identify patterns and trends that can inform healthcare providers in developing tailored treatment plans.
Conclusion: AI is transforming the healthcare industry by providing a wide range of applications that can improve patient outcomes and reduce healthcare costs. From creating chatbots and virtual assistants to analyzing medical images and developing personalized treatment plans, AI is revolutionizing the way healthcare is delivered to elderly patients. Continued investment in this field is essential to ensure that elderly patients receive the best possible care.
Keywords: artificial intelligence, health content, older adult, healthcare
Procedia PDF Downloads 69
2934 The Estimation Method of Inter-Story Drift for Buildings Based on Evolutionary Learning
Authors: Kyu Jin Kim, Byung Kwan Oh, Hyo Seon Park
Abstract:
Structural health monitoring systems based on seismic responses have been employed to reduce seismic damage. The inter-story drift ratio, which is the major index for seismic capacity assessment, is used for estimating the seismic damage of buildings. Meanwhile, seismic response analysis to estimate the structural responses of a building demands a significantly high computational cost, a burden that grows with the increasing number of high-rise and large buildings. To estimate the inter-story drift ratio of buildings under earthquakes efficiently, this paper suggests an estimation method for inter-story drift using an artificial neural network (ANN). In the method, a radial basis function neural network (RBFNN) is integrated with an optimization algorithm that tunes its variables through evolutionary learning, referred to as an evolutionary radial basis function neural network (ERBFNN). The method estimates the inter-story drift without seismic response analysis when buildings are subjected to new earthquakes. The effectiveness of the estimation method is verified through a simulation using a multi-degree-of-freedom system.
Keywords: structural health monitoring, inter-story drift ratio, artificial neural network, radial basis function neural network, genetic algorithm
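The RBFNN core of such a method can be sketched as Gaussian basis activations followed by closed-form output weights. In the ERBFNN the centers and widths are tuned by evolutionary learning (a genetic algorithm); here they are simply fixed, and the drift data are synthetic placeholders:

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf_features(X, centers, width):
    """Gaussian radial basis activations for each (sample, center) pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

# Toy stand-in data: map two normalised ground-motion features (e.g. peak
# acceleration, dominant period) to a peak inter-story drift ratio; the
# target function and values are illustrative only.
X = rng.uniform(0, 1, (300, 2))
y = 0.01 * X[:, 0] * (1 + 0.5 * np.sin(3 * X[:, 1]))  # synthetic drift ratio

# In the ERBFNN the centers/widths are tuned by a genetic algorithm; here we
# simply place centers on random training samples and fix the width.
centers = X[rng.choice(len(X), 20, replace=False)]
width = 0.3
Phi = rbf_features(X, centers, width)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # output weights in closed form

pred = rbf_features(X, centers, width) @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse:.5f}")
```

Because the output layer is linear, only the centers and widths need nonlinear optimization, which is what the evolutionary search in the ERBFNN targets.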
Procedia PDF Downloads 327