Search results for: artificial roughness
Paper Count: 2523

1413 AI-Based Information System for Hygiene and Safety Management of Shared Kitchens

Authors: Jongtae Rhee, Sangkwon Han, Seungbin Ji, Junhyeong Park, Byeonghun Kim, Taekyung Kim, Byeonghyeon Jeon, Jiwoo Yang

Abstract:

The shared kitchen is a concept that transfers the value of the sharing economy to the kitchen. It is a type of kitchen equipped with cooking facilities that allows multiple companies or chefs to share time and space and use it jointly. These shared kitchens provide economic benefits and convenience, such as reduced investment costs and rent, but also increase the risk of safety management, such as cross-contamination of food ingredients. Therefore, to manage the safety of food ingredients and finished products in a shared kitchen where several entities jointly use the kitchen and handle various types of food ingredients, it is critical to manage the following: the freshness of food ingredients, user hygiene and safety, and cross-contamination of cooking equipment and facilities. In this study, we propose a machine learning-based system for hygiene safety and cross-contamination management, which are highly difficult to manage. User clothing management and user access management, which are most relevant to the hygiene and safety of shared kitchens, are handled through machine learning-based methodologies, and cutting board usage management, which is most relevant to cross-contamination management, is implemented as an integrated safety management system based on artificial intelligence. First, to prevent cross-contamination of food ingredients, we use images collected through a real-time camera to determine whether the food ingredients match a given cutting board based on a real-time object detection model, YOLO v7. To manage the hygiene of user clothing, we use a camera-based facial recognition model to recognize the user and a real-time object detection model to determine whether a sanitary hat and mask are worn. In addition, to manage access for users qualified to enter the shared kitchen, we utilize a machine learning-based signature recognition module. By comparing the pairwise distance between the contract signature and the signature at the time of entrance to the shared kitchen, access permission is determined through a pre-trained signature verification model. These machine learning-based safety management tasks are integrated into a single information system, and each result is managed in an integrated database. Through this, users are warned of safety hazards through the tablet PC installed in the shared kitchen, and managers can track the causes of hygiene and safety incidents. As a result of the system integration analysis, real-time safety management services can be continuously provided by artificial intelligence, and machine learning-based methodologies are used for integrated safety management of shared kitchens that allow dynamic contracts among various users. By solving this problem, we were able to secure the feasibility and safety of the shared kitchen business.
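As an illustration of the access-control step, the sketch below shows a minimal pairwise-distance decision rule over signature embeddings. The embedding dimension, threshold value, and function names are illustrative assumptions, not the system's actual implementation.

```python
import numpy as np

def verify_signature(contract_embedding: np.ndarray,
                     entry_embedding: np.ndarray,
                     threshold: float = 0.5) -> bool:
    """Grant access when the pairwise (Euclidean) distance between the two
    signature embeddings is below a verification threshold."""
    distance = np.linalg.norm(contract_embedding - entry_embedding)
    return distance < threshold

# Hypothetical usage: in practice the embeddings would come from a pre-trained
# signature verification network (e.g., a Siamese model).
contract = np.random.rand(128)   # embedding stored when the contract was signed
entrance = np.random.rand(128)   # embedding captured at the kitchen entrance
print("access granted" if verify_signature(contract, entrance) else "access denied")
```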

Keywords: artificial intelligence, food safety, information system, safety management, shared kitchen

Procedia PDF Downloads 62
1412 DeepNIC: A Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep Learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neural Networks)? Will DL be the absolute tool for data classification? All current solutions consist in repositioning the variables in a 2x2 matrix using their correlation proximity. In doing so, one obtains an image whose pixels are the variables. We implement a technology, DeepNIC, that offers the possibility of obtaining an image for each variable, which can be analyzed by simple CNNs. Material and methods: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision trees, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in 3 dimensions: performance, complexity, and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 hyperparameters used in the Neurops. By varying these 2 hyperparameters, we obtain a 2x2 matrix of probabilities for each NIC. We can combine these 10 NICs with the functions AND, OR, and XOR. The total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels. The intensity of the pixels is proportional to the probability of the associated NIC, and the color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the GSE22513 public data (omic data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison on several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.
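The core mapping from NIC probabilities to pixel intensities can be sketched as follows; the grid size, probability values, and tiling are illustrative assumptions and not the authors' actual image layout.

```python
import numpy as np

def nic_to_gray(nic_probabilities: np.ndarray) -> np.ndarray:
    """Map a matrix of NIC probabilities (values in [0, 1]) to 8-bit grey levels,
    so that a higher probability produces a brighter pixel."""
    return np.clip(np.round(nic_probabilities * 255), 0, 255).astype(np.uint8)

# Hypothetical grid of probabilities obtained by varying two hyperparameters.
nic_grid = np.array([[0.12, 0.80],
                     [0.45, 0.97]])
tile = nic_to_gray(nic_grid)

# Several NIC tiles (one per criterion or logical combination) could then be
# assembled into a single per-variable image for a basic CNN to classify.
image = np.hstack([tile, nic_to_gray(1.0 - nic_grid)])
print(image)
```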

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 118
1411 Lung Cancer Detection and Multi-Level Classification Using Discrete Wavelet Transform Approach

Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar

Abstract:

Uncontrolled growth of abnormal cells in the lung, in the form of a tumor, can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years, provided there is timely diagnosis, detection, and prediction, which reduces the reliance of many treatment options on risky invasive surgery and increases the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) is used for image enhancement, which gives the best results. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region properties (area, perimeter, diameter, centroid, and eccentricity) are measured for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifiers. Two levels of classification are employed: K-Nearest Neighbor (KNN) is used to determine whether the patient's condition is normal or abnormal, while an Artificial Neural Network (ANN) is used to identify the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technique shows encouraging results for real-time information and online detection in future research.
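A minimal sketch of the texture-feature and first-level classification step is given below, assuming scikit-image and scikit-learn; the ROI data, feature choice, and neighbor count are illustrative, not the study's actual configuration.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.neighbors import KNeighborsClassifier

def glcm_features(roi: np.ndarray) -> list:
    """Texture features from a grey-level co-occurrence matrix of an 8-bit ROI."""
    glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, prop)[0, 0]
            for prop in ("contrast", "homogeneity", "energy", "correlation")]

# Hypothetical training data: feature vectors from segmented lung ROIs
# with labels 0 = normal, 1 = abnormal.
rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(20)]
X = np.array([glcm_features(r) for r in rois])
y = rng.integers(0, 2, size=20)

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict([glcm_features(rois[0])]))   # first-level normal/abnormal decision
```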

Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI

Procedia PDF Downloads 150
1410 Neural Network-Based Acoustic Annoyance Model for Laptop Hard Disk Drive

Authors: Yichao Ma, Chengsiong Chin, Wailok Woo

Abstract:

Since the last decade, there has been rapid growth in digital multimedia, such as high-resolution media files and three-dimensional movies; hence, there is a need for large digital storage such as the Hard Disk Drive (HDD). As such, users expect to have a quieter HDD in their laptops. In this paper, a jury test has been conducted on a group of 34 people, 17 of whom are students representing potential consumers, while the remaining participants are engineers who know the HDD. A total of 13 HDD sound samples have been selected from over a hundred HDD noise recordings. These samples were selected based on an agreed subjective feeling. The samples were played to the participants using a head acoustic playback system, which enabled them to experience, as closely as possible, the same environment as was recorded. Analysis has been conducted, and the obtained results indicate that different groups have different perceptions of the noises. Two neural network-based acoustic annoyance models are established based on the back-propagation neural network. Four psychoacoustic metrics, loudness, sharpness, roughness, and fluctuation strength, are used as the inputs of the model, and the subjective evaluation results are taken as the output. The developed models are reasonably accurate in simulating both training and test samples.
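A minimal regression sketch of such a model is shown below, using scikit-learn's multilayer perceptron in place of the paper's back-propagation network; the synthetic metric values and network size are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical jury data: rows are HDD samples, columns are the four
# psychoacoustic metrics (loudness, sharpness, roughness, fluctuation strength);
# y is the mean subjective annoyance rating from the jury test.
rng = np.random.default_rng(1)
X = rng.uniform([1.0, 0.5, 0.0, 0.0], [6.0, 3.0, 1.0, 1.0], size=(13, 4))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.1, size=13)

X_scaled = StandardScaler().fit_transform(X)
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1)
model.fit(X_scaled, y)
print(model.predict(X_scaled[:3]))   # predicted annoyance for the first samples
```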

Keywords: HDD noise, jury test, neural network model, psychoacoustic annoyance

Procedia PDF Downloads 431
1409 Development of a Multi-Locus DNA Metabarcoding Method for Endangered Animal Species Identification

Authors: Meimei Shi

Abstract:

Objectives: The identification of endangered species, especially the simultaneous detection of multiple species in complex samples, plays a critical role in alleged wildlife crime incidents and prevents illegal trade. This study aimed to develop a multi-locus DNA metabarcoding method for endangered animal species identification. Methods: Several pairs of universal primers were designed according to conserved mitochondrial gene regions. Experimental mixtures were artificially prepared by mixing well-defined species, including endangered species, e.g., forest musk deer, bear, tiger, pangolin, and sika deer. The artificial samples were prepared with 1-16 well-characterized species at DNA concentrations of 1% to 100%. After multiplex PCR amplification and parameter modification, the amplified products were analyzed by capillary electrophoresis and used for NGS library preparation. The DNA metabarcoding was carried out based on Illumina MiSeq amplicon sequencing. The data were processed with quality trimming, read filtering, and OTU clustering; representative sequences were blasted using BLASTn. Results: According to the parameter modification and multiplex PCR amplification results, five primer sets targeting COI, Cytb, 12S, and 16S, respectively, were selected as the NGS library amplification primer panel. High-throughput sequencing data analysis showed that the established multi-locus DNA metabarcoding method was sensitive and could accurately identify all species in the artificial mixtures, including the endangered animal species Moschus berezovskii, Ursus thibetanus, Panthera tigris, Manis pentadactyla, and Cervus nippon at 1% DNA concentration. In conclusion, the established species identification method provides technical support for customs and forensic scientists to prevent the illegal trade of endangered animals and their products.
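A minimal sketch of the read-filtering step that precedes OTU clustering is shown below; the quality and length thresholds, and the in-memory record format, are illustrative assumptions rather than the study's actual pipeline settings.

```python
def mean_phred(quality_string: str, offset: int = 33) -> float:
    """Average Phred score of a FASTQ quality string (Phred+33 encoding assumed)."""
    return sum(ord(ch) - offset for ch in quality_string) / len(quality_string)

def filter_reads(records, min_quality: float = 25.0, min_length: int = 100):
    """Keep reads that pass simple length and mean-quality thresholds before
    OTU clustering and BLASTn assignment."""
    return [(seq, qual) for seq, qual in records
            if len(seq) >= min_length and mean_phred(qual) >= min_quality]

# Hypothetical records as (sequence, quality) pairs parsed from a FASTQ file.
records = [("ACGT" * 40, "I" * 160), ("ACGT" * 10, "#" * 40)]
print(len(filter_reads(records)))   # -> 1 read survives filtering
```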

Keywords: DNA metabarcoding, endangered animal species, mitochondria nucleic acid, multi-locus

Procedia PDF Downloads 133
1408 Forecast Financial Bubbles: Multidimensional Phenomenon

Authors: Zouari Ezzeddine, Ghraieb Ikram

Abstract:

Drawing on the academic literature that points out the limitations of previous studies, this article sets out the reasons why the prediction of financial bubbles is a multidimensional problem. A new modeling framework for predicting financial bubbles is proposed, linking a set of variables presented across several dimensions, which dictates its multidimensional character, and taking into account the preferences of financial actors. A multicriteria anticipation of the appearance of bubbles in international financial markets helps guard against a possible crisis.

Keywords: classical measures, predictions, financial bubbles, multidimensional, artificial neural networks

Procedia PDF Downloads 571
1407 Innovation Management in E-Health Care: The Implementation of New Technologies for Health Care in Europe and the USA

Authors: Dariusz M. Trzmielak, William Bradley Zehner, Elin Oftedal, Ilona Lipka-Matusiak

Abstract:

The use of new technologies should create new value for all stakeholders in the healthcare system. The article focuses on demonstrating that technologies or products typically enable new functionality, a higher standard of service, or a higher level of knowledge and competence for clinicians. It also highlights the key benefits that can be achieved through the use of artificial intelligence, such as relieving clinicians of many tasks and enabling the expansion and greater specialisation of healthcare services. The comparative analysis allowed the authors to create a classification of new technologies in e-health according to health needs and benefits for patients, doctors, and healthcare systems, i.e., the main stakeholders in the implementation of new technologies and products in healthcare. The added value of the development of new technologies in healthcare is diagnosed. The work is both theoretical and practical in nature. The primary research methods are bibliographic analysis and analysis of research data and market potential of new solutions for healthcare organisations. The bibliographic analysis is complemented by the author's case studies of implemented technologies, mostly based on artificial intelligence or telemedicine. In the past, patients were often passive recipients, the end point of the service delivery system, rather than stakeholders in the system. One of the dangers of powerful new technologies is that patients may become even more marginalised. Healthcare will be provided and delivered in an increasingly administrative, programmed way. The doctor may also become a robot, carrying out programmed activities - using 'non-human services'. An alternative approach is to put the patient at the centre, using technologies, products, and services that allow them to design and control technologies based on their own needs. An important contribution to the discussion is to open up the different dimensions of the user (carer and patient) and to make them aware of healthcare units implementing new technologies. The authors of this article outline the importance of three types of patients in the successful implementation of new medical solutions. The impact of implemented technologies is analysed based on: 1) "Informed users", who are able to use the technology based on a better understanding of it; 2) "Engaged users" who play an active role in the broader healthcare system as a result of the technology; 3) "Innovative users" who bring their own ideas to the table based on a deeper understanding of healthcare issues. The authors' research hypothesis is that the distinction between informed, engaged, and innovative users has an impact on the perceived and actual quality of healthcare services. The analysis is based on case studies of new solutions implemented in different medical centres. In addition, based on the observations of the Polish author, who is a manager at the largest medical research institute in Poland, with analytical input from American and Norwegian partners, the added value of the implementations for patients, clinicians, and the healthcare system will be demonstrated.

Keywords: innovation, management, medicine, e-health, artificial intelligence

Procedia PDF Downloads 12
1406 Leadership in the Era of AI: Growing Organizational Intelligence

Authors: Mark Salisbury

Abstract:

The arrival of artificially intelligent avatars and the automation they bring is worrying many of us, not only for our livelihood but for the jobs that may be lost to our kids. We worry about what our place will be as human beings in this new economy where much of it will be conducted online in the metaverse – in a network of 3D virtual worlds – working with intelligent machines. The Future of Leadership was written to address these fears and show what our place will be – the right place – in this new economy of AI avatars, automation, and 3D virtual worlds. But to be successful in this new economy, our job will be to bring wisdom to our workplace and the marketplace, and we will use AI avatars and 3D virtual worlds to do it. However, this book is about more than AI and the avatars that we will work with in the metaverse. It's about building organizational intelligence (OI) -- the capability of an organization to comprehend and create knowledge relevant to its purpose; in other words, it is the intellectual capacity of the entire organization. Increasing organizational intelligence requires a new kind of knowledge worker, a wisdom worker, which in turn requires a new kind of leadership. This book begins your story of how to become a leader of wisdom workers and be successful in the emerging wisdom economy. After this presentation, conference participants will be able to do the following: recognize the characteristics of the new generation of wisdom workers and how they differ from their predecessors; recognize that new leadership methods and techniques are needed to lead this new generation of wisdom workers; apply personal and professional values – personal integrity, belief in something larger than yourself, and keeping the best interest of others in mind – to improve your work performance and lead others; exhibit an attitude of confidence, courage, and reciprocity of sharing knowledge to increase your productivity and influence others; leverage artificial intelligence to accelerate your ability to learn, augment your decision-making, and influence others; and utilize new technologies to communicate with human colleagues and intelligent machines to develop better solutions more quickly.

Keywords: metaverse, generative artificial intelligence, automation, leadership, organizational intelligence, wisdom worker

Procedia PDF Downloads 35
1405 Comparison of the Surface Hardness of Glass Ionomer Cement Filling Material Soaked in Alcohol-Containing Mouthwash and Alcohol-Free Mouthwash

Authors: Farid Yuristiawan, Aulina R. Rahmi, Detty Iryani, Gunawan

Abstract:

Glass ionomer cement is one of the filling materials often used in the field of dentistry because it is relatively inexpensive and widely available. Surface hardness is one of the most important properties of a restoration material; it is the ability of the material to resist indentation, which is directly connected to the material's compressive strength and its ability to withstand abrasion. The higher the surface hardness of a material, the better it withstands abrasion. The presence of glass ionomer cement in the mouth makes it susceptible to any substance that enters the mouth, one of which is mouthwash, a solution used for many purposes such as antisepsis, astringency, and the prevention of caries and bad breath. The presence of alcohol in mouthwash could affect the properties of glass ionomer cement, including its surface hardness. Objective: To compare the surface hardness of glass ionomer cement soaked in alcohol-containing mouthwash and in alcohol-free mouthwash. Methods: This research is a laboratory experimental study. Thirty samples were made from GC FUJI IX GP EXTRA and then soaked in artificial saliva for the first 24 hours inside an incubator whose temperature and humidity were controlled. The samples were then divided into three groups: the first group was soaked in alcohol-containing mouthwash, the second group in alcohol-free mouthwash, and the control group in artificial saliva, each for 6 hours inside the incubator. Listerine was the mouthwash used in this research, and surface hardness was examined using a Vickers hardness tester. The results show a mean surface hardness of 16.36 VHN for the first group, 24.04 VHN for the second group, and 43.60 VHN for the control group. One-way ANOVA with a post hoc Bonferroni comparison test shows significant results (p = 0.00). Conclusions: The data show statistically significant differences in surface hardness between the groups: the surface hardness of the first group is lower than that of the second group, and the surface hardness of both the first (alcohol-containing mouthwash) and second (alcohol-free mouthwash) groups is lower than that of the control group (p = 0.00).
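The group comparison described above can be sketched with SciPy as follows; the individual hardness readings are illustrative (only the group means are reported in the abstract), and the Bonferroni correction is applied to the pairwise t-tests.

```python
from itertools import combinations
from scipy import stats

# Hypothetical Vickers hardness readings (VHN) for the three groups; the abstract
# reports only the group means (16.36, 24.04, and 43.60 VHN).
groups = {
    "alcohol_mouthwash": [15.8, 16.4, 16.9, 16.1, 16.6],
    "alcohol_free":      [23.5, 24.2, 24.6, 23.8, 24.1],
    "artificial_saliva": [42.9, 43.7, 44.1, 43.2, 44.1],
}

f_stat, p_value = stats.f_oneway(*groups.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Post hoc pairwise t-tests with a Bonferroni-corrected significance level.
alpha = 0.05 / len(list(combinations(groups, 2)))
for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
    t, p = stats.ttest_ind(a, b)
    print(f"{name_a} vs {name_b}: p = {p:.4f}, significant = {p < alpha}")
```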

Keywords: glass ionomer cement, mouthwash, surface hardness, Vickers hardness tester

Procedia PDF Downloads 219
1404 Safeguarding the Construction Industry: Interrogating and Mitigating Emerging Risks from AI in Construction

Authors: Abdelrhman Elagez, Rolla Monib

Abstract:

This empirical study investigates the observed risks associated with adopting Artificial Intelligence (AI) technologies in the construction industry and proposes potential mitigation strategies. While AI has transformed several industries, the construction industry is slowly adopting advanced technologies like AI, introducing new risks that lack critical analysis in the current literature. A comprehensive literature review identified a research gap, highlighting the lack of critical analysis of risks and the need for a framework to measure and mitigate the risks of AI implementation in the construction industry. Consequently, an online survey was conducted with 24 project managers and construction professionals, possessing experience ranging from 1 to 30 years (with an average of 6.38 years), to gather industry perspectives and concerns relating to AI integration. The survey results yielded several significant findings. Firstly, respondents exhibited a moderate level of familiarity (66.67%) with AI technologies, while the industry's readiness for AI deployment and current usage rates remained low at 2.72 out of 5. Secondly, the top-ranked barriers to AI adoption were identified as lack of awareness, insufficient knowledge and skills, data quality concerns, high implementation costs, absence of prior case studies, and the uncertainty of outcomes. Thirdly, the most significant risks associated with AI use in construction were perceived to be a lack of human control (decision-making), accountability, algorithm bias, data security/privacy, and lack of legislation and regulations. Additionally, the participants acknowledged the value of factors such as education, training, organizational support, and communication in facilitating AI integration within the industry. These findings emphasize the necessity for tailored risk assessment frameworks, guidelines, and governance principles to address the identified risks and promote the responsible adoption of AI technologies in the construction sector.

Keywords: risk management, construction, artificial intelligence, technology

Procedia PDF Downloads 76
1403 Development of Wear Resistant Ceramic Coating on Steel Using High Velocity Oxygen Flame Thermal Spray

Authors: Abhijit Pattnayak, Abhijith N.V, Deepak Kumar, Jayant Jain, Vijay Chaudhry

Abstract:

Hard and dense ceramic coatings deposited on the surface provide an ideal solution to the poor tribological properties exhibited by some popular stainless steels like EN-36 and 17-4PH. These steels are widely used in the nuclear, fertilizer, food processing, and marine industries under extreme environmental conditions. The present study focuses on the development of Al₂O₃-CeO₂-rGO-based coatings on the surface of 17-4PH steel using the High-Velocity Oxygen Flame (HVOF) thermal spray process. The coating is developed using an oxyacetylene flame. Further, we report the physical (density, surface roughness, surface energetics), metallurgical (scanning electron microscopy, X-ray diffraction, Raman spectroscopy), mechanical (Vickers and nano-hardness), tribological (wear, scratch hardness), and chemical (corrosion) characterization of both the as-sprayed coating and the substrate (17-4PH steel). Comparing these properties will help us understand the microstructure-property relationship of the coating and reveal the necessity and challenges of such coatings.

Keywords: thermal spray process, HVOF, ceramic coating, hardness, wear, corrosion

Procedia PDF Downloads 88
1402 Interfacial Investigation and Chemical Bonding in Graphene Reinforced Alumina Ceramic Nanocomposites

Authors: Iftikhar Ahmad, Mohammad Islam

Abstract:

Thermally exfoliated graphene nanosheets (GNS) were reinforced into an Al2O3 ceramic, and the nanocomposites were consolidated using a rapid high-frequency induction heat sintering route. The resulting nanocomposites demonstrated higher mechanical properties due to efficient GNS incorporation and chemical interaction with the Al2O3 matrix grains. The enhancement in mechanical properties is attributed to (i) uniformly dispersed GNS in the consolidated structure, (ii) the ability of GNS to decorate Al2O3 nanoparticles, and (iii) strong GNS/Al2O3 chemical interaction during colloidal mixing and pullout/crack-bridging toughening mechanisms during mechanical testing. The GNS/Al2O3 interaction during the different processing stages was thoroughly examined by thermal and structural investigation of the interfacial area. The formation of an intermediate aluminum oxycarbide phase (Al2OC) via a confined carbothermal reduction reaction at the GNS/Al2O3 interface was observed using advanced electron microscopes. The GNS surface roughness improves GNS/Al2O3 mechanical interlocking and chemical compatibility. The sturdy interface phase facilitates efficient load transfer and delayed failure through the impediment of crack propagation. The resulting nanocomposites, therefore, offer superior toughness.

Keywords: ceramics, nanocomposites, interfaces, nanostructures, electron microscopy, Al2O3

Procedia PDF Downloads 356
1401 Bridging Minds and Nature: Revolutionizing Elementary Environmental Education Through Artificial Intelligence

Authors: Hoora Beheshti Haradasht, Abooali Golzary

Abstract:

Environmental education plays a pivotal role in shaping the future stewards of our planet. Leveraging the power of artificial intelligence (AI) in this endeavor presents an innovative approach to captivate and educate elementary school children about environmental sustainability. This paper explores the application of AI technologies in designing interactive and personalized learning experiences that foster curiosity, critical thinking, and a deep connection to nature. By harnessing AI-driven tools, virtual simulations, and personalized content delivery, educators can create engaging platforms that empower children to comprehend complex environmental concepts while nurturing a lifelong commitment to protecting the Earth. With the pressing challenges of climate change and biodiversity loss, cultivating an environmentally conscious generation is imperative. Integrating AI in environmental education revolutionizes traditional teaching methods by tailoring content, adapting to individual learning styles, and immersing students in interactive scenarios. This paper delves into the potential of AI technologies to enhance engagement, comprehension, and pro-environmental behaviors among elementary school children. Modern AI technologies, including natural language processing, machine learning, and virtual reality, offer unique tools to craft immersive learning experiences. Adaptive platforms can analyze individual learning patterns and preferences, enabling real-time adjustments in content delivery. Virtual simulations, powered by AI, transport students into dynamic ecosystems, fostering experiential learning that goes beyond textbooks. AI-driven educational platforms provide tailored content, ensuring that environmental lessons resonate with each child's interests and cognitive level. By recognizing patterns in students' interactions, AI algorithms curate customized learning pathways, enhancing comprehension and knowledge retention. Utilizing AI, educators can develop virtual field trips and interactive nature explorations. Children can navigate virtual ecosystems, analyze real-time data, and make informed decisions, cultivating an understanding of the delicate balance between human actions and the environment. While AI offers promising educational opportunities, ethical concerns must be addressed. Safeguarding children's data privacy, ensuring content accuracy, and avoiding biases in AI algorithms are paramount to building a trustworthy learning environment. By merging AI with environmental education, educators can empower children not only with knowledge but also with the tools to become advocates for sustainable practices. As children engage in AI-enhanced learning, they develop a sense of agency and responsibility to address environmental challenges. The application of artificial intelligence in elementary environmental education presents a groundbreaking avenue to cultivate environmentally conscious citizens. By embracing AI-driven tools, educators can create transformative learning experiences that empower children to grasp intricate ecological concepts, forge an intimate connection with nature, and develop a strong commitment to safeguarding our planet for generations to come.

Keywords: artificial intelligence, environmental education, elementary children, personalized learning, sustainability

Procedia PDF Downloads 75
1400 Role of Process Parameters on Pocket Milling with Abrasive Water Jet Machining Technique

Authors: T. V. K. Gupta, J. Ramkumar, Puneet Tandon, N. S. Vyas

Abstract:

Abrasive Water Jet Machining (AWJM) is an unconventional machining process well known for machining hard-to-cut materials. The primary research focus for this process has been through-cutting, and very limited literature is available on pocket milling using AWJM. The present work is an attempt to use this process for milling applications considering a set of process parameters. Four input parameters, which have been considered by researchers for part separation, are selected for the above application, i.e., abrasive size, flow rate, standoff distance, and traverse speed. Pockets of definite size are machined to investigate surface roughness, material removal rate, and pocket depth. Based on the data obtained from experiments on SS304 material, it is observed that a higher traverse speed gives a better finish because of the reduction in particle energy density, and a lower depth is also observed. An increase in the standoff distance and abrasive flow rate reduces the rate of material removal as the jet loses its focus and collisions occur among the particles. ANOVA for each individual output parameter has been studied to identify the significant process parameters.

Keywords: abrasive flow rate, surface finish, abrasive size, standoff distance, traverse speed

Procedia PDF Downloads 299
1399 A Potential Spin-orbit Torque Device Using the Tri-layer Structure

Authors: Chih-Wei Cheng, Wei-Jen Chan, Yu-Han Huang, Yi-Tsung Lin, Yen-Wei Huang, Min-Cheng Chen, Shou-Zen Chang, G. Chern, Yuan-Chieh Tseng

Abstract:

Developing spin-orbit torque (SOT) devices that combine field-free switching, perpendicular magnetic anisotropy (PMA), and low switching current is one of the many challenges in spintronics today. We propose a CoFeB/Ta/CoFeB tri-layer antiferromagnetic SOT device that could meet the above requirements. The device's PMA was developed by adopting a CoFeB–MgO interface. The key to the success of this structure is to ensure that (i) changes of the interlayer exchange coupling (IEC) and the CoFeB anisotropy can occur simultaneously, and (ii) one of the CoFeB layers has a slightly tilted moment in the beginning. When sufficient current is applied, the spin Hall effect (SHE) reverses the already-tilted CoFeB layer, and the other CoFeB layer can be reversed simultaneously by the IEC in a field-free manner. Adjusting the thickness of Ta can modify the coupling state to reduce the switching current while the field-free nature is preserved. Micromagnetic simulation suggests that the Néel orange-peel effect (NOPE) is non-negligible due to interface roughness and the coupling effect in the presence of perpendicular anisotropy. Fortunately, the Néel field induced by the NOPE appears to favor the field-free reversal.

Keywords: CoFeB, spin-orbit torque, antiferromagnetic, MRAM, trilayer

Procedia PDF Downloads 110
1398 Effect of the Deposition Time of Hydrogenated Nanocrystalline Si Grown on Porous Alumina Film on Glass Substrate by Plasma Processing Chemical Vapor Deposition

Authors: F. Laatar, S. Ktifa, H. Ezzaouia

Abstract:

The Plasma Enhanced Chemical Vapor Deposition (PECVD) method is used to deposit hydrogenated nanocrystalline silicon films (nc-Si:H) on Porous Anodic Alumina Films (PAF) on glass substrates for different deposition durations. The influence of the deposition time (DT) on the physical properties of nc-Si:H grown on PAF was investigated through an extensive correlation between the micro-structural and optical properties of these films. In this paper, we present an extensive study of the morphological, structural, and optical properties of these films by Atomic Force Microscopy (AFM), X-Ray Diffraction (XRD) techniques, and a UV-Vis-NIR spectrometer. It was found that changes in DT can modify the film thickness and the surface roughness and eventually improve the optical properties of the composite. Optical properties (optical thicknesses, refractive indexes (n), absorption coefficients (α), extinction coefficients (k), and the values of the optical transitions EG) of this kind of sample were obtained using the data of the transmittance T and reflectance R spectra recorded by the UV-Vis-NIR spectrometer. We used the Cauchy and Wemple–DiDomenico models for the analysis of the dispersion of the refractive index and the determination of the optical properties of these films.
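As a sketch of how the Cauchy model can be fitted to the dispersion data, the snippet below uses SciPy's curve_fit; the wavelength grid, refractive-index values, and initial coefficients are illustrative assumptions, not the measured data.

```python
import numpy as np
from scipy.optimize import curve_fit

def cauchy(wavelength_nm, A, B, C):
    """Cauchy dispersion relation n(lambda) = A + B/lambda^2 + C/lambda^4 (lambda in nm)."""
    return A + B / wavelength_nm**2 + C / wavelength_nm**4

# Hypothetical refractive-index values extracted from the transmittance/reflectance spectra.
wl = np.array([400.0, 500.0, 600.0, 800.0, 1000.0, 1500.0])
n_meas = np.array([2.45, 2.30, 2.22, 2.14, 2.10, 2.06])

(A, B, C), _ = curve_fit(cauchy, wl, n_meas, p0=(2.0, 1e4, 1e9))
print(f"A = {A:.3f}, B = {B:.3e} nm^2, C = {C:.3e} nm^4")
```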

Keywords: hydrogenated nanocrystalline silicon, plasma processing chemical vapor deposition, X-ray diffraction, optical properties

Procedia PDF Downloads 373
1397 AI-Driven Solutions for Optimizing Master Data Management

Authors: Srinivas Vangari

Abstract:

In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises. Master Data Management (MDM) plays a crucial role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI (Quantitative and Qualitative Analysis) into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.

Keywords: artificial intelligence, master data management, data governance, data quality

Procedia PDF Downloads 9
1396 Low Enrollment in Civil Engineering Departments: Challenges and Opportunities

Authors: Alaa Yehia, Ayatollah Yehia, Sherif Yehia

Abstract:

There is a recurring issue of low enrollments across many civil engineering departments in postsecondary institutions. While there have been moments where enrollments began to increase, civil engineering departments have found themselves facing low enrollments, at around 60%, over the last five years across the Middle East. There are many reasons that could be attributed to this decline, such as low entry-level salaries, over-saturation of civil engineering graduates in the job market, and a lack of construction projects due to an impending or current recession. However, this recurring problem alludes to an intrinsic issue with the curriculum. The societal shift to the usage of high technology such as machine learning (ML) and artificial intelligence (AI) demands individuals who are proficient at utilizing it. Therefore, existing curriculums must adapt to this change in order to provide an education that is suitable for potential and current students. In order to provide potential solutions to this issue, this paper considers two possible implementations of high technology into the civil engineering curriculum. The first approach is to implement a course that introduces applications of high technology in civil engineering contexts, while the other approach is to intertwine applications of high technology throughout the degree. Both approaches, however, should meet the requirements of accreditation agencies. In addition to the proposed improvement in the civil engineering curriculum, different pedagogical practices must be adopted as well. The passive learning approach might not be appropriate for Gen Z students; current students, now more than ever, need to be introduced to engineering topics and practice through different learning methods to ensure they will have the necessary skills for the job market. Different learning methods that incorporate high technology applications, like AI, must be integrated throughout the curriculum to make the civil engineering degree more attractive to prospective students. Moreover, the paper provides insight into the importance and approach of adapting the civil engineering curriculum to address the current low enrollment crisis that civil engineering departments globally, but specifically in the Middle East, are facing.

Keywords: artificial intelligence (AI), civil engineering curriculum, high technology, low enrollment, pedagogy

Procedia PDF Downloads 158
1395 The Efficacy of Box Lesion+ Procedure in Patients with Atrial Fibrillation: Two-Year Follow-up Results

Authors: Oleg Sapelnikov, Ruslan Latypov, Darina Ardus, Samvel Aivazian, Andrey Shiryaev, Renat Akchurin

Abstract:

OBJECTIVE: The MAZE procedure is one of the most effective surgical methods for the treatment of atrial fibrillation (AF), and several modifications of it are now in use. In our study, we conducted a clinical analysis of the "Box lesion+" approach during the MAZE procedure over a two-year follow-up. METHODS: We studied the results of open-heart, on-pump procedures performed in our hospital from 2017 to 2018. Thirty-two (32) patients with atrial fibrillation (AF) were included in this study: fifteen (15) patients had concomitant coronary artery bypass grafting, and seventeen (17) patients had mitral valve repair. The mean age was 62.3±8.7 years, with a prevalence of men (56.1%). The mean duration of AF was 4.75±5.44 and 7.07±8.14 years, respectively. In all cases, we performed an endocardial Cryo-MAZE procedure with concomitant myocardial revascularization or mitral valve surgery. All patients in this study underwent pulmonary vein (PV) isolation and ablation of the mitral isthmus with additional isolation of the left atrial (LA) posterior wall (Box lesion+ procedure). The mean follow-up was 2 years. RESULTS: All cases were performed without any complications. Additional isolation of the posterior wall did not significantly prolong the operative or artificial circulation time. The Cryo-MAZE procedure itself lasted 20±2.1 min, the whole operation took 192±24 min, and the artificial circulation time was 103±12 min. According to the design of the study, we performed a clinical investigation of the patients at 12 months and at 2 years from the initial procedure. At 12 months, 81.8% of patients were free of AF, and 75.8% at two years of follow-up. CONCLUSIONS: Isolation of the left atrial posterior wall and the perimitral area may considerably improve the efficacy of surgical treatment, as demonstrated by the significant decrease in AF recurrences during the whole period of follow-up.

Keywords: atrial fibrillation, cryoablation, left atrium isolation, open heart procedure

Procedia PDF Downloads 123
1394 Comparison of GIS-Based Soil Erosion Susceptibility Models Using Support Vector Machine, Binary Logistic Regression and Artificial Neural Network in the Southwest Amazon Region

Authors: Elaine Lima Da Fonseca, Eliomar Pereira Da Silva Filho

Abstract:

The modeling of areas susceptible to soil loss by hydro-erosive processes is a simplified instrument of reality intended to predict future behavior from the observation and interaction of a set of geoenvironmental factors. The models of areas with potential for soil loss will be obtained through binary logistic regression, artificial neural networks, and support vector machines. The choice of the municipality of Colorado do Oeste, in the south of the western Amazon, is due to soil degradation caused by anthropogenic activities, such as agriculture, road construction, overgrazing, and deforestation, and by its environmental and socioeconomic configuration. Initially, a soil erosion inventory map will be constructed through various field investigations, including the use of remotely piloted aircraft, orbital imagery, and the PLANAFLORO/RO database. One hundred sampling units with the presence of erosion will be selected based on the assumptions indicated in the literature, and, to complement the dichotomous analysis, 100 units with no erosion will be randomly designated. The next step will be the selection of the predictive parameters that exert, jointly, directly, or indirectly, some influence on the mechanism of occurrence of soil erosion events. The chosen predictors are altitude, declivity, aspect or orientation of the slope, curvature of the slope, composite topographic index, stream power index, lineament density, normalized difference vegetation index, drainage density, lithology, soil type, erosivity, and ground surface temperature. After evaluating the relative contribution of each predictor variable, the erosion susceptibility model will be applied to the municipality of Colorado do Oeste, Rondônia, through the SPSS Statistics 26 software. Evaluation of the model will occur through the determination of the Cox & Snell R², the Nagelkerke R², the Hosmer-Lemeshow test, the log-likelihood value, and the Wald test, in addition to analysis of the confusion matrix, ROC curve, and cumulative gain according to the model specification. The validation of the synthesis map resulting from the models of potential soil erosion risk will occur by means of Kappa indices, accuracy, and sensitivity, as well as by field verification of the erosion susceptibility classes using drone photogrammetry. Thus, it is expected to obtain a map of the following erosion susceptibility classes: very low, low, moderate, high, and very high, which may constitute a screening tool to identify areas where more detailed investigations need to be carried out, applying social resources more efficiently.
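A minimal sketch of one of the compared models, binary logistic regression evaluated with the ROC curve and Kappa index, is shown below using scikit-learn rather than SPSS; the synthetic predictor matrix and train/test split are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, cohen_kappa_score, confusion_matrix
from sklearn.model_selection import train_test_split

# Hypothetical design matrix: rows are sampling units (100 erosion, 100 non-erosion),
# columns are the 13 terrain/land-cover predictors (slope, NDVI, drainage density, ...).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 13))
y = np.concatenate([np.ones(100, dtype=int), np.zeros(100, dtype=int)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

prob = model.predict_proba(X_te)[:, 1]      # erosion susceptibility score
pred = (prob >= 0.5).astype(int)
print("ROC AUC:", roc_auc_score(y_te, prob))
print("Cohen's kappa:", cohen_kappa_score(y_te, pred))
print("confusion matrix:\n", confusion_matrix(y_te, pred))
```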

Keywords: modeling, susceptibility to erosion, artificial intelligence, Amazon

Procedia PDF Downloads 64
1393 Impact of Water Storage Structures on Groundwater Recharge in Jeloula Basin, Central Tunisia

Authors: I. Farid, K. Zouari

Abstract:

An attempt has been made to examine the effect of water storage structures on groundwater recharge in a semi-arid agroclimatic setting in Jeloula Basin (Central Tunisia). In this area, surface water in rivers is seasonal, and therefore groundwater is the perennial source of water supply for domestic and agricultural purposes. Three pumped storage water power plants (PSWPP) have been built to increase the overall water availability in the basin and support agricultural livelihoods of rural smallholders. The scale and geographical dispersion of these multiple lakes restrict the understanding of these coupled human-water systems and the identification of adequate strategies to support riparian farmers. In the present review, hydrochemistry and isotopic tools were combined to get an insight into the processes controlling mineralization and recharge conditions in the investigated aquifer system. This study showed a slight increase in the groundwater level, especially after the artificial recharge operations and a decline when the water volume moves down during drought periods. Chemical data indicate that the main sources of salinity in the waters are related to water-rock interactions. Data inferred from stable isotopes in groundwater samples indicated recharge with modern rainfall. The investigated surface water samples collected from the PSWPP are affected by a significant evaporation and reveal large seasonal variations, which could be controlled by the water volume changes in the open surface reservoirs and the meteorological conditions during evaporation, condensation, and precipitation. The geochemical information is comparable to the isotopic results and illustrates that the chemical and isotopic signatures of reservoir waters differ clearly from those of groundwaters. These data confirm that the contribution of the artificial recharge operations from the PSWPP is very limited.

Keywords: Jeloula basin, recharge, hydrochemistry, isotopes

Procedia PDF Downloads 146
1392 Multi-Template Molecularly Imprinted Polymer: Synthesis, Characterization and Removal of Selected Acidic Pharmaceuticals from Wastewater

Authors: Lawrence Mzukisi Madikizela, Luke Chimuka

Abstract:

The removal of organics from wastewater offers better water quality; therefore, the purpose of this work was to investigate the use of a molecularly imprinted polymer (MIP) for the elimination of selected organics from water. A multi-template MIP for the adsorption of naproxen, ibuprofen, and diclofenac was synthesized using a bulk polymerization method. The MIP was synthesized at 70°C by employing 2-vinylpyridine, ethylene glycol dimethacrylate, toluene, and 1,1'-azobis-(cyclohexanecarbonitrile) as the functional monomer, cross-linker, porogen, and initiator, respectively. Thermogravimetric characterization indicated that the polymer backbone collapses at 250°C, and scanning electron microscopy revealed the porous and rough nature of the MIP after elution of the templates. The performance of the MIP in aqueous solutions was evaluated by optimizing several adsorption parameters. The optimized adsorption conditions were 50 mg of MIP, an extraction time of 10 min, a sample pH of 4.6, and an initial concentration of 30 mg/L. The imprinting factors obtained for naproxen, ibuprofen, and diclofenac were 1.25, 1.42, and 2.01, respectively. The order of selectivity for the MIP was diclofenac > ibuprofen > naproxen. The MIP showed great swelling in water, with an initial swelling rate of 2.62 g/(g min). The synthesized MIP proved able to adsorb naproxen, ibuprofen, and diclofenac from contaminated deionized water, wastewater influent, and effluent.

Keywords: adsorption, molecularly imprinted polymer, multi template, pharmaceuticals

Procedia PDF Downloads 300
1391 Enhancing the Efficiency of Buildings through Translucent Concrete

Authors: Humaira Athar, Brajeshwar Singh

Abstract:

Generally, the brightness of the indoor environment of buildings is maintained entirely by artificial lighting, which consumes a large amount of resources. It is reported that lighting consumes about 19% of the total generated electricity, which accounts for about 30-40% of total energy consumption. One possible way to reduce the lighting energy is to exploit sunlight, either through the use of suitable devices or through energy-efficient materials like translucent concrete. Translucent concrete is an architectural concrete which allows the passage of natural as well as artificial light through it. Several attempts have been made on different aspects of translucent concrete, such as light-guiding materials (glass fibers, plastic fibers, cylinders, etc.), concrete mix design, and manufacturing methods for use as building elements. Concerns are, however, raised on various related issues, such as poor compatibility between the optical fibers and the cement paste, unaesthetic appearance due to disturbance of the arrangement of fibers during vibration, and high shrinkage in flowable concrete due to its high water/cement ratio. There is a need to develop a translucent concrete that meets the structural safety requirements of OPC concrete while maximizing the energy saving in illumination power and thermal load in buildings. Translucent concrete was produced using pre-treated plastic optical fibers (POF, 2 mm dia.) and high-slump white concrete. The concrete mix was proportioned in the ratio of 1:1.9:2.1 with a w/c ratio of 0.40. The POF content was varied from 0.8-9 vol.%. The mechanical properties and light transmission of this concrete were determined. The thermal conductivity of the samples was measured by a transient plane source technique. Daylight illumination was measured by a lux grid method as per BIS:SP-41. It was found that the compressive strength of the translucent concrete increased with decreasing optical fiber content. An increase of ~28% in the compressive strength of the concrete was noticed when the fiber was pre-treated. FE-SEM images showed little debonded zone between the fibers and the cement paste, which was well supported by the pull-out bond strength test results (~187% improvement over untreated fibers). The light transmission of the concrete was in the range of 3-7%, depending on fiber spacing (5-20 mm). The average daylight illuminance (~75 lux) was nearly equivalent to the criterion specified for circulation lighting (80 lux). The thermal conductivity of the translucent concrete was reduced by 28-40% with respect to plain concrete. The thermal load calculated by the heat conduction equation was ~16% more than that of plain concrete. Based on Design-Builder software, the total annual illumination energy load of a room using translucent concrete on one side was 162.36 kW, compared with an energy load of 249.75 kW for a room without it. The calculated energy saving on account of illumination power was ~25%. A marginal improvement in thermal comfort was also noticed. It is concluded that translucent concrete has the advantages of existing concrete (load bearing) together with translucency and insulation characteristics. It saves a significant amount of energy by providing natural daylight instead of artificial power consumption for illumination.
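The conductive component of the thermal load mentioned above follows Fourier's law; the sketch below evaluates it for an illustrative panel, with the conductivity, geometry, and temperature difference assumed rather than taken from the study.

```python
def conduction_load(k: float, area: float, delta_t: float, thickness: float) -> float:
    """Steady-state conductive heat flow Q = k * A * dT / d (Fourier's law), in watts."""
    return k * area * delta_t / thickness

# Illustrative 1 m^2 panel, 0.1 m thick, 10 K indoor-outdoor temperature difference.
# A 28-40% lower thermal conductivity reduces the conductive load proportionally.
for k in (1.4, 1.4 * 0.66):
    print(f"k = {k:.2f} W/m.K -> Q = {conduction_load(k, 1.0, 10.0, 0.1):.0f} W")
```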

Keywords: energy saving, light transmission, microstructure, plastic optical fibers, translucent concrete

Procedia PDF Downloads 124
1390 Characterization of Nanostructured and Conventional TiAlN and AlCrN Coated ASTM-SA213-T-11 Boiler Steel

Authors: Vikas Chawla, Buta Singh Sidhu, Amita Rani, Amit Handa

Abstract:

The main objective of the present work is the microstructural and mechanical characterization of conventional and nanostructured TiAlN and AlCrN coatings deposited on T-11 boiler steel. In the case of the conventional coatings, Al-Cr and Ti-Al metallic powders were deposited using a plasma spray process, followed by gas nitriding of the surface, which was done in the laboratory with optimized parameters after conducting several trials on plasma-sprayed specimens. The plasma-assisted physical vapor deposition (PAPVD) process was employed for depositing the nanostructured TiAlN and AlCrN coatings. Field emission scanning electron microscopy (FE-SEM) with energy dispersive X-ray analysis (EDAX), X-ray diffraction (XRD) analysis, atomic force microscopy (AFM) analysis, and X-ray mapping techniques have been used to study the surface and cross-sectional morphology of the coatings. The surface roughness and micro-hardness were also measured. Good adhesion of the conventional thick TiAlN and AlCrN coatings was found. Based on the outcomes of this work, the coatings under study are recommended for application to the super-heater and re-heater tubes of boilers.

Keywords: nanostructure, physical vapour deposition, oxides, thin films, electron microscopy

Procedia PDF Downloads 135
1389 Development of Transparent Nano-Structured Super-Hydrophobic Coating on Glass and Evaluation of Anti-Dust Properties

Authors: Abhilasha Mishra, Neha Bhatt

Abstract:

Super-hydrophobicity is an effect in which surface roughness and chemical composition combine to produce an unusually water- and dust-repellent surface. Super-hydrophobic surfaces are widely used in many applications, such as automobile and aircraft windshields, lenses, solar cells, roofing, boat hulls, paints, etc. Four coating solutions were prepared by varying the compositions of 1,1,1,3,3,3-hexamethyldisilazane (HMDS) and tetraethylorthosilicate (TEOS) sol. These solutions were coated on glass slides by a spin coating method and etched at high temperatures ranging from 250 to 350 °C. All the coatings were studied for different properties such as water repellency, anti-dust behavior, transparency, and contact angle. The stability of the coatings was also studied with respect to temperature, the external environment, and pH. It was found that all coatings impart significant super-hydrophobicity to a glass surface, with contact angles ranging from 156° to 162°, and have good stability in the external environment. The results of the different coatings were observed and compared with each other. On increasing the number of coating layers, the super-hydrophobicity and anti-dust properties increase, but after three coatings the transparency of the coating starts decreasing.

Keywords: super-hydrophobic, contact angle, coating, anti-dust

Procedia PDF Downloads 253
1388 Intelligent Process and Model Applied for E-Learning Systems

Authors: Mafawez Alharbi, Mahdi Jemmali

Abstract:

E-learning is a developing area, especially in education, and it can provide several benefits to learners. An intelligent system that collects all the components satisfying user preferences is therefore important. This research presents an approach that is capable of personalizing e-information and giving users what they need according to their preferences. The proposal can build up knowledge from further evaluations made by the user and, in addition, can learn from the user's habits. Finally, we show a walk-through to demonstrate how the intelligent process works.

Keywords: artificial intelligence, architecture, e-learning, software engineering, processing

Procedia PDF Downloads 188
1387 Strategies to Synthesize Ambient Stable Ultrathin Ag Film Supported on Oxide Substrate

Authors: Allamula Ashok, Peela Lasya, Daljin Jacob, P. Muhammed Razi, Satyesh Kumar Yadav

Abstract:

We report zinc (Zn) as a seed layer material and the need for a specific deposition sequence to grow ultrathin silver (Ag) films on quartz (SiO₂). Ag films of thickness 4, 6, 8, and 10 nm were deposited by DC magnetron sputtering, without and with Zn seed layers of thickness 1, 2, and 4 nm. The effect of the Zn seed layer thickness and its annealing on the surface morphology, sheet resistance, and stability of the ultrathin Ag films is investigated. We show that by increasing the Zn seed layer thickness from 1 to 2 nm, there is a five-order-of-magnitude reduction in the sheet resistance of 6 nm Ag films. We find that annealing of the seed layer is crucial to achieving stability of the ultrathin Ag films: the 6 nm Ag film with 2 nm of as-deposited Zn is unstable to 100 °C annealing, while the 6 nm Ag film with an annealed 2 nm Zn seed layer is stable. The 2 nm Zn-seeded 8 nm Ag film maintained a constant sheet resistance of 7 Ω/□ over 6 months of exposure to ambient conditions. Among the ultrathin films grown, the 8 nm Ag film with a 2 nm Zn seed layer had the best figure of merit, with a sheet resistance of 7 Ω/□, a mean absolute surface roughness (Ra) of ~1 nm, and an optical transparency of 61%. Such stable exposed ultrathin Ag films can find applications as catalysts, sensors, and transparent and conductive electrodes for solar cells, LEDs, and plasmonic devices.
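The figure of merit is not defined in the abstract; if Haacke's figure of merit for transparent conductors is assumed, it can be evaluated from the reported values as in the sketch below.

```python
def haacke_fom(transmittance: float, sheet_resistance_ohm_sq: float) -> float:
    """Haacke figure of merit for transparent conductors: phi = T^10 / R_sheet."""
    return transmittance ** 10 / sheet_resistance_ohm_sq

# Reported values for the 2 nm Zn-seeded 8 nm Ag film: T = 61 %, R_sheet = 7 ohm/sq.
phi = haacke_fom(0.61, 7.0)
print(f"phi ~ {phi:.2e} per ohm")   # ~1e-3, assuming Haacke's definition
```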

Keywords: ultrathin Ag films, magnetron sputtering, thermal stability, seed layer, exposed silver, zinc

Procedia PDF Downloads 35
1386 Floodplain Modeling of River Jhelum Using HEC-RAS: A Case Study

Authors: Kashif Hassan, M.A. Ahanger

Abstract:

Floods have become more frequent and severe due to the effects of global climate change and human alterations of the natural environment, and flood prediction, forecasting, and control is one of the greatest challenges facing the world today. The forecasting of floods is achieved by the use of hydraulic models such as HEC-RAS, which are designed to simulate the flow processes of surface water. Extreme flood events in the river Jhelum, lasting from a day to a few days, are a major disaster in the State of Jammu and Kashmir, India. In the present study, the HEC-RAS model was applied to two different reaches of the river Jhelum in order to estimate the flood levels corresponding to 25, 50, and 100 year return period flood events at important locations and to deduce the flood vulnerability of important areas and structures. The flow rates for the two reaches were derived from a flood-frequency analysis of 50 years of historic peak flow data. Manning's roughness coefficient n was selected using detailed analysis. Rating curves were also generated to serve as the basis for determining the boundary conditions. Calibration and validation procedures were applied in order to ensure the reliability of the model. A sensitivity analysis was also performed in order to ensure the accuracy of Manning's n in generating water surface profiles.
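Manning's roughness coefficient enters the model through Manning's equation; the sketch below evaluates the discharge for an illustrative channel cross-section, with all numerical values assumed rather than taken from the Jhelum reaches.

```python
def manning_discharge(n: float, area_m2: float, hydraulic_radius_m: float,
                      slope: float) -> float:
    """Discharge from Manning's equation (SI units): Q = (1/n) * A * R^(2/3) * S^(1/2)."""
    return (1.0 / n) * area_m2 * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Illustrative values for a natural channel reach (assumptions, not study data):
# n = 0.035, A = 450 m^2, R = 3.2 m, S = 0.0004.
q = manning_discharge(0.035, 450.0, 3.2, 0.0004)
print(f"Q ~ {q:.0f} m^3/s")
```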

Keywords: flood plain, HEC-RAS, Jhelum, return period

Procedia PDF Downloads 421
1385 African Personhood and the Regulation of Brain-Computer Interface (BCI) Technologies: A South African view

Authors: Meshandren Naidoo, Amy Gooden

Abstract:

Implantable brain-computer interface (BCI) technologies have developed to the point where brain-computer communication is possible. This has great potential in the medical field, as it allows persons who have lost capacities, such as movement or speech, to interact and communicate. However, ethicists and regulators call for a strict approach to these technologies due to their impact on personhood. This research demonstrates that the personhood debate is more nuanced and that, where an African approach to personhood is used, it may produce results more favorable to the development and use of this technology.

Keywords: artificial intelligence, law, neuroscience, ethics

Procedia PDF Downloads 124
1384 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is therefore not far-fetched, yet the proper classification of this textual information in a given context has remained very difficult. As a result, we conducted a systematic review of the previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier that can correctly distinguish, in a given context, social media text that constitutes hate speech from inverted compliments with a high level of accuracy. We evaluated over 250 articles from digital sources like ScienceDirect, ACM, Google Scholar, and IEEE Xplore and whittled the number of studies down to 31. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources such as Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed single deep learning techniques as well as machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI-based library functionality. Based on some of the important findings from this study, we made recommendations for future research.
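A minimal sketch of the hybrid CNN+LSTM architecture highlighted by the review is given below using Keras; the vocabulary size, sequence length, layer widths, and random data are illustrative assumptions, not a reproduction of any reviewed model.

```python
import numpy as np
from tensorflow.keras import layers, models

VOCAB_SIZE, MAX_LEN = 20_000, 100   # illustrative values

# Hybrid CNN + LSTM sentiment classifier of the kind the review found to perform best.
model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 128),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.LSTM(64),
    layers.Dense(1, activation="sigmoid"),   # e.g., hate speech vs. non-hateful text
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Hypothetical integer-encoded posts and binary labels (real data would come from
# Twitter, SST, SemEval, or similar corpora).
X = np.random.randint(0, VOCAB_SIZE, size=(32, MAX_LEN))
y = np.random.randint(0, 2, size=(32,))
model.fit(X, y, epochs=1, verbose=0)
print(model.predict(X[:2], verbose=0))
```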

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 113