Search results for: Intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1562

362 The Effect of Artificial Intelligence on Food and Beverages

Authors: Remon Karam Zakry Kelada

Abstract:

This survey research aims to examine the service quality standard of food and beverage service staff in the hotel business by studying the service standard of three sample hotels: Siam Kempinski Hotel Bangkok, Four Seasons Resort Chiang Mai, and Banyan Tree Phuket. In order to identify the international service standard of food and beverage service, triangulated research, i.e. quantitative, qualitative, and survey methods, was employed. In this research, questionnaires and in-depth interviews were used to obtain data on the sequences and methods of service. There were three parts to the modified questionnaires used to measure service quality and guest satisfaction, covering service facilities, attentiveness, responsibility, reliability, and circumspection. This study used simple random sampling to derive subjects, with a questionnaire return rate of 70%, or 280 respondents. Data were analyzed using SPSS to find the arithmetic mean, standard deviation, and percentage, with comparisons made using t-tests and one-way ANOVA. The results revealed that the service quality of the three hotels was at an international level that could create high satisfaction among international customers. Recommendations for implementation were to maintain the areas of good service quality and to improve some dimensions of service quality, such as reliability. Training in service standards, product knowledge, and new technology should be provided for employees. Furthermore, in order to develop the service quality of the industry, training collaboration between hotel organizations and academic institutions in food and beverage service should be considered.

Keywords: food and beverage staff, food poisoning, food production, hygiene knowledge, BPA, health, regulations, toxicity, service standard, food and beverage department, sequence of service, service method

Procedia PDF Downloads 34
361 How Envisioning Process Is Constructed: An Exploratory Research Comparing Three International Public Televisions

Authors: Alexandre Bedard, Johane Brunet, Wendellyn Reid

Abstract:

Public television is constantly trying to maintain and develop its audience, and to achieve those goals, it needs a strong and clear vision. Vision, or envisioning, is a multidimensional process; it is simultaneously a conduit that orients and fixes the future, an idea that comes before the strategy, and a means by which action is accomplished, from a business perspective. Vision is also often studied in a prescriptive and instrumental manner. Based on our understanding of the literature, we were able to explain how envisioning, as a process, is a creative one; it takes place in the mind and uses wisdom and intelligence through a process of evaluation, analysis, and creation. Through an aggregation of the literature, we built a model of the envisioning process, based on past experiences, perceptions, and knowledge and influenced by the context, namely the individual, the organization, and the environment. Through exploratory research in which vision was deciphered from discourse, using a qualitative and abductive approach and a grounded theory perspective, we explored three extreme cases, with eighteen interviews with experts, leaders, politicians, and other actors of the industry, and more than twenty hours of interviews in three different countries. We compared the strategy, the business model, and the political and legal forces. We also looked at the history of each industry from an inertial point of view. Our analysis of the data revealed that a legitimacy effect due to the audience, the innovation, and the creativity of the institutions was at the cornerstone of what would influence the envisioning process. This allowed us to identify how different the process was for the Canadian, French, and UK public broadcasters, although we concluded that all three had a socially constructed vision for their future, based on stakeholder management and an emerging role for managers: ideas brokers.

Keywords: envisioning process, international comparison, television, vision

Procedia PDF Downloads 132
360 Anyword: A Digital Marketing Tool to Increase Productivity in Newly Launching Businesses

Authors: Jana Atteah, Wid Jan, Yara AlHibshi, Rahaf AlRougi

Abstract:

Anyword is an AI copywriting tool that helps marketers create effective campaigns for specific audiences. It offers a wide range of templates for various platforms, brand voice guidelines, and valuable analytics insights. Anyword is used by top global companies and has been recognized as one of the "Fastest Growing Products" in the 2023 software awards. A recent study examined the utilization and impact of AI-powered writing tools, specifically focusing on the adoption of AI in writing pursuits and the use of the Anyword platform. The results indicate that a majority of respondents (52.17%) had not previously used Anyword, but those who had were generally satisfied with the platform. Notable productivity improvements were observed among 13% of the participants, while an additional 34.8% reported a slight increase in productivity. A majority (47.8%) maintained a neutral stance, suggesting that their productivity remained unaffected. Only a minimal percentage (4.3%) claimed that their productivity did not improve with the usage of Anyword AI. In terms of the quality of written content generated, the participants responded positively. Approximately 91% of participants gave Anyword AI a score of 5 or higher, with roughly 17% giving it a perfect score. A small percentage (approximately 9%) gave a low score between 0-2. The mode result was a score of 7, indicating a generally positive perception of the quality of content generated using Anyword AI. These findings suggest that AI can contribute to increased productivity and positively influence the quality of written content. Further research and exploration of AI tools in writing pursuits are warranted to fully understand their potential and limitations.

Keywords: artificial intelligence, marketing platforms, productivity, user interface

Procedia PDF Downloads 63
359 An IoT-Enabled Crop Recommendation System Utilizing Message Queuing Telemetry Transport (MQTT) for Efficient Data Transmission to AI/ML Models

Authors: Prashansa Singh, Rohit Bajaj, Manjot Kaur

Abstract:

In the modern agricultural landscape, precision farming has emerged as a pivotal strategy for enhancing crop yield and optimizing resource utilization. This paper introduces an innovative Crop Recommendation System (CRS) that leverages the Internet of Things (IoT) technology and the Message Queuing Telemetry Transport (MQTT) protocol to collect critical environmental and soil data via sensors deployed across agricultural fields. The system is designed to address the challenges of real-time data acquisition, efficient data transmission, and dynamic crop recommendation through the application of advanced Artificial Intelligence (AI) and Machine Learning (ML) models. The CRS architecture encompasses a network of sensors that continuously monitor environmental parameters such as temperature, humidity, soil moisture, and nutrient levels. This sensor data is then transmitted to a central MQTT server, ensuring reliable and low-latency communication even in bandwidth-constrained scenarios typical of rural agricultural settings. Upon reaching the server, the data is processed and analyzed by AI/ML models trained to correlate specific environmental conditions with optimal crop choices and cultivation practices. These models consider historical crop performance data, current agricultural research, and real-time field conditions to generate tailored crop recommendations. The implemented system achieves 99% recommendation accuracy.
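
The sensing-to-recommendation pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the paho-mqtt and scikit-learn libraries, the broker address, topic name, feature set, and model file are all assumptions for the example.

```python
import json
import joblib
import paho.mqtt.client as mqtt

# Hypothetical pre-trained crop recommendation model (e.g., a scikit-learn classifier).
model = joblib.load("crop_recommender.joblib")

def on_message(client, userdata, msg):
    # Field sensor nodes are assumed to publish JSON readings such as
    # {"temperature": 24.1, "humidity": 61.0, "soil_moisture": 0.32, "nitrogen": 45}
    reading = json.loads(msg.payload)
    features = [[reading["temperature"], reading["humidity"],
                 reading["soil_moisture"], reading["nitrogen"]]]
    crop = model.predict(features)[0]
    print(f"Recommended crop for {msg.topic}: {crop}")

client = mqtt.Client()                        # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect("broker.example.com", 1883)    # central MQTT server
client.subscribe("farm/field1/sensors")       # low-bandwidth, low-latency transport
client.loop_forever()
```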

Keywords: IoT, MQTT protocol, machine learning, sensor, publisher, subscriber, agriculture, humidity

Procedia PDF Downloads 68
358 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images

Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn

Abstract:

The detection and segmentation of mitochondria from fluorescence microscopy are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the task of detection and segmentation challenging. In the literature, a number of open-source software tools and artificial intelligence (AI) methods have been described for analyzing mitochondrial images, achieving remarkable classification and quantitation results. However, the combined expertise in the medical field and in AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimal detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python and OpenCV libraries, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using the mitochondrial fluorescence dataset. Ground-truth labels generated using Labkit were also used to evaluate the performance of our detection and segmentation model. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence mitochondrial dataset. A discussion on the methods and future perspectives of fully automated frameworks concludes the paper.
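
A simplified sketch of the three-stage pipeline (pre-processing, binarization, coarse segmentation) using Python and OpenCV is given below; the CLAHE and Otsu settings, file name, and minimum-area filter are illustrative assumptions rather than the parameters used in the study.

```python
import cv2

# Stage 1: pre-processing -- contrast enhancement (CLAHE) and denoising.
img = cv2.imread("mitochondria.tif", cv2.IMREAD_GRAYSCALE)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = cv2.GaussianBlur(clahe.apply(img), (5, 5), 0)

# Stage 2: image binarization -- Otsu thresholding separates mitochondria from background.
_, binary = cv2.threshold(enhanced, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Stage 3: coarse segmentation -- keep contours whose shape statistics look mitochondrial.
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
detections = [c for c in contours if cv2.contourArea(c) > 20]   # illustrative area filter
print(f"{len(detections)} candidate mitochondria detected")
```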

Keywords: 2D, binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation

Procedia PDF Downloads 357
357 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning

Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah

Abstract:

Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into a heart’s structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance efficiency in echocardiogram use for diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower dimensional representations and the K-means algorithm for clustering them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance when compared to utilizing a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in the F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in the F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.
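
A minimal sketch of the two-stage idea (a convolutional autoencoder producing low-dimensional representations, then K-means over the codes) is shown below, assuming PyTorch and scikit-learn; the patch size, layer sizes, and number of clusters are illustrative, and the trainable inverse average layer is omitted.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class PatchAutoencoder(nn.Module):
    def __init__(self, code_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(                      # 1x64x64 patch -> code
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(), nn.Linear(32 * 16 * 16, code_dim))
        self.decoder = nn.Sequential(                      # code -> reconstructed patch
            nn.Linear(code_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = PatchAutoencoder()
patches = torch.rand(256, 1, 64, 64)          # stand-in for echocardiographic patches
# ... train with a reconstruction loss, e.g. MSE(model(patches), patches) ...
with torch.no_grad():
    codes = model.encoder(patches).numpy()
views = KMeans(n_clusters=8, n_init=10).fit_predict(codes)  # unsupervised view groups
```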

Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning

Procedia PDF Downloads 32
356 Evaluating the Satisfaction of Chinese Consumers toward Influencers at TikTok

Authors: Noriyuki Suyama

Abstract:

The progress and spread of digitalization have led to the provision of a variety of new services. The recent progress in digitalization can be attributed to rapid developments in science and technology. First, the research and diffusion of artificial intelligence (AI) have made dramatic progress. Around 2000, the third wave of AI research, which had been underway for about 50 years, arrived. Specifically, machine learning and deep learning became possible in AI, and the ability of AI to acquire knowledge, define that knowledge, and update its own knowledge in a quantitative manner made the use of big data practical even on commercial PCs. On the other hand, with the spread of social media, information exchange has become more common in our daily lives, and the lending and borrowing of goods and services, in other words, the sharing economy, has become widespread. The scope of this trend is not limited to any one industry, and its momentum is growing as the SDGs take root. In addition, social networking services (SNS), a part of social media, have brought about the evolution of the retail business. In the past few years, social networking services involving users or companies have especially flourished. The People's Republic of China (hereinafter referred to as "China") is a country that is stimulating enormous consumption through its own unique SNS, which differ from those used in developed countries around the world. This paper focuses on the effectiveness and challenges of influencer marketing by examining the influence of influencers on users' behavior and satisfaction with Chinese SNS. Specifically, a quantitative survey of TikTok users living in China was conducted, with the aim of gaining new insights from the analysis and discussion. As a result, several important findings and insights were obtained.

Keywords: customer satisfaction, social networking services, influencer marketing, Chinese consumers’ behavior

Procedia PDF Downloads 89
355 Integrating Knowledge Distillation of Multiple Strategies

Authors: Min Jindong, Wang Mingxia

Abstract:

With the widespread use of artificial intelligence in daily life, computer vision, especially deep convolutional neural network models, has developed rapidly. As the complexity of real-world visual target detection tasks and the required recognition accuracy increase, target detection network models also become very large. A huge deep neural network model is not conducive to deployment on edge devices with limited resources, and the timeliness of network model inference is poor. In this paper, knowledge distillation is used to compress the huge and complex deep neural network model, and the knowledge contained in the complex network model is comprehensively transferred to another lightweight network model. Different from traditional knowledge distillation methods, we propose a novel knowledge distillation that incorporates multi-faceted features, called M-KD. When training and optimizing the deep neural network model for target detection, the soft-target output of the teacher network, the relationships between the layers of the teacher network, and the feature attention maps of the hidden layers of the teacher network are all transferred to the student network as knowledge. At the same time, we also introduce an intermediate transition layer, that is, an intermediate guidance layer, between the teacher network and the student network to make up for the huge gap between them. Finally, this paper adds an exploration module to the traditional knowledge distillation teacher-student network model: the student network model not only inherits the knowledge of the teacher network but also explores some new knowledge and characteristics. Comprehensive experiments using different distillation parameter configurations across multiple datasets and convolutional neural network models demonstrate that our proposed network model achieves substantial improvements in speed and accuracy.
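
A rough sketch of such a multi-faceted distillation loss (soft targets plus hidden-layer attention maps) is given below in PyTorch; the temperature, weights, and attention-map formulation are standard choices assumed for illustration, not the exact M-KD configuration.

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    # Spatial attention of a hidden feature map: channel-wise mean of squared activations.
    return F.normalize(feat.pow(2).mean(dim=1).flatten(1), dim=1)

def mkd_loss(student_logits, teacher_logits, student_feat, teacher_feat,
             labels, T=4.0, alpha=0.7, beta=100.0):
    hard = F.cross_entropy(student_logits, labels)                 # ground-truth supervision
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),      # teacher soft targets
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T
    attn = F.mse_loss(attention_map(student_feat), attention_map(teacher_feat))
    return (1 - alpha) * hard + alpha * soft + beta * attn

# Example shapes: logits (N, classes), hidden feature maps (N, C, H, W).
loss = mkd_loss(torch.randn(8, 10), torch.randn(8, 10),
                torch.randn(8, 64, 14, 14), torch.randn(8, 64, 14, 14),
                torch.randint(0, 10, (8,)))
```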

Keywords: object detection, knowledge distillation, convolutional network, model compression

Procedia PDF Downloads 278
354 Efficient Chess Board Representation: A Space-Efficient Protocol

Authors: Raghava Dhanya, Shashank S.

Abstract:

This paper delves into the intersection of chess and computer science, specifically focusing on the efficient representation of chess game states. We propose two methods: the Static Method and the Dynamic Method, each offering unique advantages in terms of space efficiency and computational complexity. The Static Method aims to represent the game state using a fixed-length encoding, allocating 192 bits to capture the positions of all pieces on the board. This method introduces a protocol for ordering and encoding piece positions, ensuring efficient storage and retrieval. However, it faces challenges in representing pieces no longer in play. In contrast, the Dynamic Method adapts to the evolving game state by dynamically adjusting the encoding length based on the number of pieces in play. By incorporating Alive Bits for each piece kind, this method achieves greater flexibility and space efficiency. Additionally, it includes provisions for encoding additional game state information such as castling rights and en passant squares. Our findings demonstrate that the Dynamic Method offers superior space efficiency compared to traditional Forsyth-Edwards Notation (FEN), particularly as the game progresses and pieces are captured. However, it comes with increased complexity in encoding and decoding processes. In conclusion, this study provides insights into optimizing the representation of chess game states, offering potential applications in chess engines, game databases, and artificial intelligence research. The proposed methods offer a balance between space efficiency and computational overhead, paving the way for further advancements in the field.
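
A minimal sketch of the fixed-length (static) idea, 32 pieces in a fixed order with 6 bits each for a square index, totalling 192 bits, is shown below; the piece ordering and the handling of captured pieces are illustrative assumptions rather than the exact protocol proposed in the paper.

```python
# 32 pieces in a fixed, agreed-upon order; each piece stores a 6-bit square index
# (a1 = 0 ... h8 = 63), giving 32 * 6 = 192 bits for the whole position.
PIECE_ORDER = (["wK", "wQ", "wR1", "wR2", "wB1", "wB2", "wN1", "wN2"]
               + [f"wP{i}" for i in range(1, 9)]
               + ["bK", "bQ", "bR1", "bR2", "bB1", "bB2", "bN1", "bN2"]
               + [f"bP{i}" for i in range(1, 9)])

def encode_static(squares):
    """squares: list of 32 square indices, one per piece in PIECE_ORDER."""
    state = 0
    for sq in squares:
        state = (state << 6) | (sq & 0x3F)
    return state                      # a 192-bit integer

def decode_static(state):
    squares = []
    for _ in range(32):
        squares.append(state & 0x3F)
        state >>= 6
    return list(reversed(squares))

start = list(range(32))               # dummy placement: piece i on square i
assert decode_static(encode_static(start)) == start
```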

Keywords: chess, optimisation, encoding, bit manipulation

Procedia PDF Downloads 50
353 Automated Detection of Targets and Retrieve the Corresponding Analytics Using Augmented Reality

Authors: Suvarna Kumar Gogula, Sandhya Devi Gogula, P. Chanakya

Abstract:

Augmented reality is defined as the collection of digital or computer-generated information, such as images, audio, video, and 3D models, overlaid onto the real-time environment. Augmented reality can be thought of as a blend between the completely synthetic and the completely real. Augmented reality provides scope in a wide range of industries, such as manufacturing, retail, gaming, advertisement, and tourism, and brings out new dimensions in the modern digital world. As it overlays content, it enhances users' knowledge by providing content blended with the real world. In this application, we integrated augmented reality with data analytics and with the cloud, so that the virtual content is generated on the basis of the data present in the database; we used marker-based augmented reality, where every marker is stored in the database with a corresponding unique ID. This application can be used in a wide range of industries for different business processes, but in this paper, we mainly focus on the marketing industry, helping the customer gain knowledge about the products in the market, mainly their prices, customer feedback, quality, and other benefits. This application also focuses on providing better market strategy information for marketing managers, who obtain data about stocks, sales, customer responses about the product, etc. In this paper, we also include reports from the feedback obtained from different people after the demonstration, and finally, we present the future scope of augmented reality in different business processes through integration with new technologies like cloud, big data, artificial intelligence, etc.

Keywords: augmented reality, data analytics, catch room, marketing and sales

Procedia PDF Downloads 237
352 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices. These make use of different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users but is mostly in an unreadable format, which needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical data. The data is not subject to the implementation of consistency and redundancy measures as most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers make use of various techniques in such programs to accomplish this, but sometimes neglect the effect some of these techniques may have on database performance. One of the techniques generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, this keeps the database busy and locked for the period of time that the processing takes place. Because of this, it decreases the overall performance of the database server and therefore the system's performance. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU, storage, and processing-time performance.
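
The contrast between the single-step approach and the three-step pull-process-push technique can be sketched as follows; SQLite and the table, column, and decoding logic are illustrative assumptions standing in for the GPRS/GPS decoding described above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, raw TEXT, decoded TEXT)")
conn.executemany("INSERT INTO readings (raw) VALUES (?)",
                 [("4750.1234,N",), ("01330.5678,E",)])
conn.commit()

# Step 1: pull -- read the unprocessed rows into an in-memory list (array list),
# then leave the database free for other clients.
rows = list(conn.execute("SELECT id, raw FROM readings"))

# Step 2: process -- decode outside the database, so it is not kept busy or locked
# while the (potentially slow) processing runs.
processed = [(raw.replace(",", " "), row_id) for row_id, raw in rows]

# Step 3: push -- write all results back in one short batch.
conn.executemany("UPDATE readings SET decoded = ? WHERE id = ?", processed)
conn.commit()
```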

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 244
351 An Efficient Machine Learning Model to Detect Metastatic Cancer in Pathology Scans Using Principal Component Analysis Algorithm, Genetic Algorithm, and Classification Algorithms

Authors: Bliss Singhal

Abstract:

Machine learning (ML) is a branch of artificial intelligence (AI) in which computers analyze data and find patterns in the data. The study focuses on the detection of metastatic cancer using ML. Metastatic cancer is the stage at which cancer has spread to other parts of the body and is the cause of approximately 90% of cancer-related deaths. Normally, pathologists spend hours each day manually classifying whether tumors are benign or malignant. This tedious task contributes to metastases being mislabeled over 60% of the time and emphasizes the importance of being aware of human error and other inefficiencies. ML is a good candidate to improve the correct identification of metastatic cancer, saving thousands of lives; it can also improve the speed and efficiency of the process, thereby requiring fewer resources and less time. So far, the deep learning methodology of AI has been used in research to detect cancer. This study is a novel approach to determining the potential of using preprocessing algorithms combined with classification algorithms in detecting metastatic cancer. The study used two preprocessing algorithms, principal component analysis (PCA) and the genetic algorithm, to reduce the dimensionality of the dataset, and then used three classification algorithms, logistic regression, decision tree classifier, and k-nearest neighbors, to detect metastatic cancer in the pathology scans. The highest accuracy of 71.14% was produced by the ML pipeline comprising PCA, the genetic algorithm, and the k-nearest neighbor algorithm, suggesting that preprocessing and classification algorithms have great potential for detecting metastatic cancer.
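
A rough sketch of one of the pipelines above (PCA for dimensionality reduction followed by a k-nearest neighbors classifier) is given below using scikit-learn; the synthetic data stands in for the pathology-scan features, and the genetic-algorithm feature selection step is omitted for brevity.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline

# Synthetic stand-in for extracted pathology-scan features (benign vs. metastatic).
X, y = make_classification(n_samples=2000, n_features=200, n_informative=30,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

pipeline = Pipeline([
    ("pca", PCA(n_components=20)),                  # preprocessing: dimensionality reduction
    ("knn", KNeighborsClassifier(n_neighbors=5)),   # classification
])
pipeline.fit(X_train, y_train)
print(f"test accuracy: {pipeline.score(X_test, y_test):.3f}")
```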

Keywords: breast cancer, principal component analysis, genetic algorithm, k-nearest neighbors, decision tree classifier, logistic regression

Procedia PDF Downloads 81
350 Mammographic Multi-View Cancer Identification Using Siamese Neural Networks

Authors: Alisher Ibragimov, Sofya Senotrusova, Aleksandra Beliaeva, Egor Ushakov, Yuri Markin

Abstract:

Mammography plays a critical role in screening for breast cancer in women, and artificial intelligence has enabled the automatic detection of diseases in medical images. Many of the current techniques used for mammogram analysis focus on a single view (mediolateral or craniocaudal view), while in clinical practice, radiologists consider multiple views of mammograms from both breasts to make a correct decision. Consequently, computer-aided diagnosis (CAD) systems could benefit from incorporating information gathered from multiple views. In this study, we introduce a method based on a Siamese neural network (SNN) model that simultaneously analyzes mammographic images from three views: bilateral and ipsilateral. In this way, when a decision is made on a single image of one breast, attention is also paid to two other images: a view of the same breast in a different projection and an image of the other breast. Consequently, the algorithm closely mimics the radiologist's practice of paying attention to the entire examination of a patient rather than to a single image. Additionally, to the best of our knowledge, this research represents the first experiments conducted using the recently released Vietnamese dataset of digital mammography (VinDr-Mammo). On an independent test set of images from this dataset, the best model achieved an AUC of 0.87 per image. This suggests that the approach can provide a valuable automated second opinion in the interpretation of mammograms and breast cancer diagnosis, which in the future may help to alleviate the burden on radiologists and serve as an additional layer of verification.
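
A rough structural sketch of a tri-view Siamese-style model, with one weight-shared encoder applied to the main view, the ipsilateral view, and the contralateral view, is shown below in PyTorch; the layer sizes and input resolution are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class TriViewSiamese(nn.Module):
    def __init__(self, embed_dim=128):
        super().__init__()
        # One weight-shared encoder applied to all three views (the "Siamese" branches).
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim), nn.ReLU(),
        )
        self.classifier = nn.Linear(3 * embed_dim, 1)   # benign/malignant logit

    def forward(self, main_view, ipsilateral_view, contralateral_view):
        z = [self.encoder(v) for v in (main_view, ipsilateral_view, contralateral_view)]
        return self.classifier(torch.cat(z, dim=1))

model = TriViewSiamese()
dummy = torch.randn(2, 1, 256, 256)          # stand-in mammogram views
logits = model(dummy, dummy, dummy)          # shape (2, 1)
```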

Keywords: breast cancer, computer-aided diagnosis, deep learning, multi-view mammogram, siamese neural network

Procedia PDF Downloads 138
349 Challenges to Reaching Higher Education in Developing Countries

Authors: Suhail Shersad

Abstract:

Introduction: In developing countries, access to higher education for the lower socioeconomic strata is very poor, at less than 0.05%. The challenges faced by prospective students in these circumstances in pursuing higher education have been explored through direct interaction with them and their families in urban slums of New Delhi. This study included evaluation of the demographics, social indices, expectations, and perceptions of selected communities. Results: The results show that poor life expectancy, low exposure to technology, lack of social infrastructure, and poor sanitary conditions have reduced their drive for academic achievement, despite a good level of intelligence and critical thinking skills among these students. The perception of the community, including parents, shows that despite their desire to excel, there are too many roadblocks to achieving a fruitful professional life for the next generation. Discussion: The prerequisites of higher education may have to be revisited to be more inclusive of socially handicapped students. The knowledge, skills, and attributes required for the higher education system should form the baseline for creating a roadmap for higher secondary education suited to local needs. Conventional parameters like marks and grading have to be re-examined so that life skills and vocational training form part of the core curriculum. Essential skills should be incorporated at an earlier age, providing an alternative pathway for such students to join higher education. Conclusion: There is a need to bridge the disconnect that exists between higher education planning, the needs of the concerned cohorts, and the existing higher secondary education. The variables that contribute to making such a decision have to be examined further.

Keywords: access to higher education, prerequisites of higher education, society expectations, social mobility

Procedia PDF Downloads 386
348 Clustering for Detection of the Population at Risk of Anticholinergic Medication

Authors: A. Shirazibeheshti, T. Radwan, A. Ettefaghian, G. Wilson, C. Luca, Farbod Khanizadeh

Abstract:

Anticholinergic medication has been associated with events such as falls, delirium, and cognitive impairment in older patients. To further assess this, anticholinergic burden scores have been developed to quantify risk. A risk model based on clustering was deployed in a healthcare management system to cluster patients into multiple risk groups according to the anticholinergic burden scores of the multiple medicines prescribed to them, in order to facilitate clinical decision-making. To do so, anticholinergic burden scores of drugs were extracted from the literature, which categorizes the risk on a scale of 1 to 3. Given the patients' prescription data in the healthcare database, a weighted anticholinergic risk score was derived per patient based on the prescription of multiple anticholinergic drugs. This study was conducted on over 300,000 records of patients currently registered with a major regional UK-based healthcare provider. The weighted risk scores were used as inputs to an unsupervised learning algorithm (mean-shift clustering) that groups patients into clusters representing different levels of anticholinergic risk. To further evaluate the performance of the model, associations between the average risk score within each group and other factors, such as socioeconomic status (i.e., Index of Multiple Deprivation) and an index of health and disability, were investigated. The clustering identifies a group of 15 patients at the highest risk from multiple anticholinergic medications. Our findings also show that this group of patients is located within more deprived areas of London compared to the population of other risk groups. Furthermore, the prescription of anticholinergic medicines is more skewed towards female than male patients, indicating that females are more at risk from this kind of multiple medication. The risk may be monitored and controlled in healthcare management systems that are well equipped with artificial intelligence.
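
A minimal sketch of the scoring-and-clustering step is shown below, assuming scikit-learn; the burden scores, prescriptions, simple summation weighting, and bandwidth are illustrative placeholders for the values derived from the literature and the 300,000-patient database.

```python
import numpy as np
from sklearn.cluster import MeanShift

# Illustrative anticholinergic burden scores (scale 1-3) and patient prescriptions.
burden = {"amitriptyline": 3, "paroxetine": 3, "loratadine": 1, "ranitidine": 1}
prescriptions = [
    ["loratadine"],
    ["ranitidine", "loratadine"],
    ["amitriptyline", "ranitidine"],
    ["amitriptyline", "paroxetine"],
    ["amitriptyline", "paroxetine", "loratadine"],
]

# Weighted per-patient risk score: here simply the sum of the burden scores of all
# anticholinergic drugs prescribed to that patient.
scores = np.array([[sum(burden[d] for d in drugs)] for drugs in prescriptions])

clusters = MeanShift(bandwidth=1.0).fit(scores)   # unsupervised grouping into risk bands
print(clusters.labels_)                           # cluster id (risk group) per patient
```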

Keywords: anticholinergic medicines, clustering, deprivation, socioeconomic status

Procedia PDF Downloads 211
347 Decision-Making Strategies on Smart Dairy Farms: A Review

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh

Abstract:

Farm management and operations will drastically change due to access to real-time data, real-time forecasting, and tracking of physical items in combination with Internet of Things developments to further automate farm operations. Dairy farms have embraced technological innovations and procured vast amounts of permanent data streams during the past decade; however, the integration of this information to improve the whole farm-based management and decision-making does not exist. It is now imperative to develop a system that can collect, integrate, manage, and analyse on-farm and off-farm data in real-time for practical and relevant environmental and economic actions. The developed systems, based on machine learning and artificial intelligence, need to be connected for useful output, a better understanding of the whole farming issue, and environmental impact. Evolutionary computing can be very effective in finding the optimal combination of sets of some objects and, finally, in strategy determination. The system of the future should be able to manage the dairy farm as well as an experienced dairy farm manager with a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming as well as improving and maintaining good animal welfare and the quality of dairy products. This review aims to provide an insight into the state-of-the-art of big data applications and evolutionary computing in relation to smart dairy farming and identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.

Keywords: big data, evolutionary computing, cloud, precision technologies

Procedia PDF Downloads 189
346 Optimizing the Readability of Orthopaedic Trauma Patient Education Materials Using ChatGPT-4

Authors: Oscar Covarrubias, Diane Ghanem, Christopher Murdock, Babar Shafiq

Abstract:

Introduction: ChatGPT is an advanced language AI tool designed to understand and generate human-like text. The aim of this study is to assess the ability of ChatGPT-4 to rewrite orthopaedic trauma patient education materials at the recommended 6th-grade level. Methods: Two independent reviewers accessed ChatGPT-4 (chat.openai.com) and gave identical instructions to simplify the readability of the provided text to a 6th-grade level. All trauma-related articles by the Orthopaedic Trauma Association (OTA) and the American Academy of Orthopaedic Surgeons (AAOS) were sequentially provided. The academic grade level was determined using the Flesch-Kincaid Grade Level (FKGL) and Flesch Reading Ease (FRE). Paired t-tests and Wilcoxon rank-sum tests were used to compare the FKGL and FRE between the ChatGPT-4 revised and original text. The inter-rater correlation coefficient (ICC) was used to assess variability in ChatGPT-4 generated text between the two reviewers. Results: ChatGPT-4 significantly reduced FKGL and increased FRE scores in the OTA (FKGL: 5.7±0.5 compared to the original 8.2±1.1; FRE: 76.4±5.7 compared to the original 65.5±6.6, p < 0.001) and AAOS articles (FKGL: 5.8±0.8 compared to the original 8.9±0.8; FRE: 76±5.5 compared to the original 56.7±5.9, p < 0.001). On average, 14.6% of OTA and 28.6% of AAOS articles required at least two revisions by ChatGPT-4 to achieve a 6th-grade reading level. ICC demonstrated poor reliability for FKGL (OTA 0.24, AAOS 0.45) and moderate reliability for FRE (OTA 0.61, AAOS 0.73). Conclusion: This study provides a novel, simple, and efficient method of using language AI to optimize the readability of patient education content, which may only require the surgeon's final proofreading. This method would likely be as effective for other medical specialties.
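
The two readability measures used above can be reproduced with a short script; the formulas are the standard FKGL and FRE definitions, while the syllable counter below is a crude heuristic assumed for illustration (dedicated readability libraries are more accurate).

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of vowels; real tools use pronunciation dictionaries.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps, spw = len(words) / sentences, syllables / len(words)
    fre = 206.835 - 1.015 * wps - 84.6 * spw     # Flesch Reading Ease
    fkgl = 0.39 * wps + 11.8 * spw - 15.59       # Flesch-Kincaid Grade Level
    return fkgl, fre

print(readability("The surgeon fixed the broken bone. You can walk on it in six weeks."))
```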

Keywords: artificial intelligence, AI, chatGPT, patient education, readability, trauma education

Procedia PDF Downloads 72
345 The Intersection of Art and Technology: Innovations in Visual Communication Design

Authors: Sareh Enjavi

Abstract:

In recent years, the field of visual communication design has seen a significant shift in the way that art is created and consumed, with the advent of new technologies such as virtual reality, augmented reality, and artificial intelligence. This paper explores the ways in which technology is changing the landscape of visual communication design and how designers are incorporating new technological tools into their artistic practices. The primary objective of this research is to investigate the ways in which technology is influencing the creative process of designers and artists in the field of visual communication design. The paper also aims to examine the challenges and limitations that arise from the intersection of art and technology in visual communication design and to identify strategies for overcoming these challenges. Drawing on examples from a range of fields, including advertising, fine art, and digital media, this paper highlights the innovations that are emerging as artists and designers use technology to push the boundaries of traditional artistic expression. The paper argues that embracing technological innovation is essential for the continued evolution of visual communication design. By exploring the intersection of art and technology, designers can create new visual experiences that engage and inspire audiences in new ways. The research also contributes to the theoretical and methodological understanding of the intersection of art and technology, a topic that has gained significant attention in recent years. Ultimately, this paper emphasizes the importance of embracing innovation and experimentation in the field of visual communication design and highlights the exciting innovations emerging from the intersection of art and technology.

Keywords: visual communication design, art and technology, virtual reality, interactive art, creative process

Procedia PDF Downloads 118
344 A Virtual Reality Simulation Tool for Reducing the Risk of Building Content during Earthquakes

Authors: Ali Asgary, Haopeng Zhou, Ghassem Tofighi

Abstract:

Use of virtual reality (VR), augmented reality (AR), and extended reality (XR) technologies for training and education has increased in recent years as more hardware and software tools have become available and accessible to larger groups of users. Similarly, the applications of these technologies in earthquake-related training and education are on the rise. Several studies have reported promising results for the use of VR and AR for evacuation behaviour and training under earthquake situations. They simulate the impacts that an earthquake has on buildings and buildings' contents, and how building occupants and users can find safe spots or open paths to the outside. Considering that a considerable number of earthquake injuries and fatalities are linked to the behaviour of building contents, our goal is to use these technologies to reduce the impacts of building contents on people. Building on our artificial intelligence (AI) based indoor earthquake risk assessment application, which enables users to use their mobile device to assess the risks associated with building contents during earthquakes, we develop a virtual reality application to demonstrate the behaviour of different building contents during earthquakes, their associated moving, spreading, falling, and collapsing risks, and their risk mitigation methods. We integrate realistic seismic models and building-content behaviour, with and without risk mitigation measures, in a virtual reality environment. The application can be used for training of architects, interior design experts, and building users to enhance indoor safety of buildings that can sustain earthquakes. This paper describes and demonstrates the application development background, structure, components, and usage.

Keywords: virtual reality, earthquake damage, building content, indoor risks, earthquake risk mitigation, interior design, unity game engine, oculus

Procedia PDF Downloads 105
343 The Effect of Technology and Artificial Intelligence on Legal Securities and Privacy Issues

Authors: Kerolis Samoul Zaghloul Noaman

Abstract:

Space law is the newest entry in the basket of international law, arriving in the latter half of the 20th century. In the last hundred and fifty years, courts and scholars have developed a consensus that custom is an important source of international law. Article 38(1)(b) of the Statute of the International Court of Justice recognized international custom as a source of international law. State practices and usages have a greater role to play in formulating customary international law. This paper examines those state practices which may qualify to become international customary law. Since 1979 (after the Moon Treaty), no hard law has been developed in the area of space exploration. The paper attempts to link state practices and custom in space exploration with the development of customary international law in space activities. It uses the doctrinal approach of legal research to examine these current questions of international law. The paper explores different international legal documents, including General Assembly resolutions, treaty principles, working papers of the UN, cases relating to customary international law, and the writings of jurists regarding space law and customary international law. It is argued that principles such as the common heritage of mankind, the non-military zone, sovereign equality, nuclear-weapon-free space, and protection of the outer space environment evolved as state practices among the international community and are qualified to become international customary law.

Keywords: social networks privacy issues, social networks security issues, social networks privacy precautions measures, social networks security precautions measures

Procedia PDF Downloads 20
342 Analysis of Truck Drivers’ Distraction on Crash Risk

Authors: Samuel Nderitu Muchiri, Tracy Wangechi Maina

Abstract:

Truck drivers face a myriad of challenges in their profession. Enhancements in logistics effectiveness can be pivotal in propelling economic developments. The specific objective of the study was to assess the influence of driver distraction on crash risk. The study is significant as it elucidates best practices that truck drivers can embrace in an effort to enhance road safety. These include amalgamating behaviors that enable drivers to fruitfully execute multifaceted functions such as finding and following routes, evading collisions, monitoring speed, adhering to road regulations, and evaluating vehicle systems’ conditions. The analysis involved an empirical review of ten previous studies related to the research topic. The articles revealed that driver distraction plays a substantial role in road accidents and other crucial road security incidents across the globe. Africa depends immensely on the freight transport sector to facilitate supply chain operations. Several studies indicate that drivers who operate primarily on rural roads, such as those found in Sub-Saharan Africa, have an increased propensity to engage in distracted activities such as cell phone usage while driving. The findings also identified the need for digitalization in truck driving operations, including carrier management techniques such as fatigue management, artificial intelligence, and automating functions like cell phone usage controls. The recommendations can aid policymakers and commercial truck carriers in deepening their understanding of driver distraction and enforcing mitigations to foster road safety.

Keywords: truck drivers, distraction, digitalization, crash risk, road safety

Procedia PDF Downloads 49
341 Design and Characterization of Ecological Materials Based on Demolition and Concrete Waste, Casablanca (Morocco)

Authors: Mourad Morsli, Mohamed Tahiri, Azzedine Samdi

Abstract:

Cities are the urbanized territories most prone to the consumption of resources (materials, energy). In Morocco, the economic capital Casablanca is one of them, with its 4M inhabitants and its 60% share of the economic and industrial activity of the kingdom. In the absence of a legal framework in force, urban development has favored the generation of millions of tons of demolition and construction waste scattered in open spaces, causing a significant nuisance to the environment and citizens. Hence, the main objective of our work is to valorize concrete waste. The representative wastes are mainly concrete and fired clay bricks, ceramic tiles, marble panels, gypsum, and scrap metal. The work carried out includes: geolocation using a combination of artificial intelligence, GIS, and Google Earth, which allowed the estimation of the quantity of these wastes per site; then the sorting, crushing, grinding, and physicochemical characterization of the collected samples, which allowed the definition of exploitation routes for each extracted fraction for the integrated management of the said wastes. In the present work, we exploited the fractions obtained after sieving the representative samples to incorporate them in the manufacture of new ecological construction materials. The prepared formulations were tested and characterized in terms of physical criteria (specific surface area, flexural and compressive strength) and appearance (cracks, deformation). We will present in detail the main results of our research work and also describe the specific properties of each material developed.

Keywords: demolition and construction waste, GIS combination software, inert waste recovery, ecological materials, Casablanca, Morocco

Procedia PDF Downloads 134
340 Social-Cognitive Aspects of Interpretation: Didactic Approaches in Language Processing and English as a Second Language Difficulties in Dyslexia

Authors: Schnell Zsuzsanna

Abstract:

Background: The interpretation of written texts and language processing in the visual domain, in other words, atypical reading ability, also known as dyslexia, is an ever-growing phenomenon in today's societies and educational communities. The much-researched problem affects cognitive abilities and, coupled with normal intelligence, typically manifests as difficulties in the differentiation of sounds and orthography and in the holistic processing of written words. The factors of susceptibility are varied: social, cognitive-psychological, and linguistic factors interact with each other. Methods: The research explains the psycholinguistics of dyslexia on the basis of several empirical experiments and demonstrates how the domain-general abilities of inhibition, retrieval from the mental lexicon, priming, phonological processing, and visual modality transfer affect successful language processing and interpretation. Interpretation of visual stimuli is hindered, and the problem seems to be embedded in a sociocultural, psycholinguistic, and cognitive background. This makes the picture even more complex, suggesting that understanding and resolving the issues of dyslexia has to be interdisciplinary, aided by several disciplines in the humanities and social sciences, and should be researched from an empirical approach, where the practical, educational corollaries can be analyzed on an applied basis. Aim and applicability: The lecture sheds light on the applied, cognitive aspects of interpretation, the social-cognitive traits of language processing, and the mental underpinnings of cognitive interpretation strategies in different languages (namely, Hungarian and English), offering solutions with a few applied techniques for success in foreign language learning that can be useful advice for developers of testing methodologies and measures across ESL teaching and testing platforms.

Keywords: dyslexia, social cognition, transparency, modalities

Procedia PDF Downloads 84
339 Winning the Future of Education in Africa through Project-Based Learning: How the Implementation of PBL Pedagogy Can Transform Africa's Educational System from Theory-Based to Practice-Based in the School Curriculum

Authors: Bismark Agbemble

Abstract:

This paper discusses how project-based learning (PBL) is being implemented in African education. The paper navigates the liminal aspects of PBL as a pedagogical approach to bridge the divide between theoretical knowledge and its application within school curricula. Because PBL embodies contextualized learning, it creates an opportunity for students to work on projects that are academically relevant to their local settings. The paper presents how PBL develops critical thinking, problem-solving, cooperation, and communication, which are vital in preparing young citizens for the 21st-century revolution. In addition, it stresses the possibility that PBL could become a stimulus for creativity and innovation, wherein learning is driven by intrinsic motivation. The paper advocates for a holistic approach based on teachers' professional development, with the provision of adequate infrastructural facilities and resource allocation, thus ensuring the success and sustainability of PBL in African education systems. In the end, the paper positions PBL as a transformative educational methodology with great potential to help shape an African generation that is prepared for the future.

Keywords: student-centered pedagogy, constructivist learning theory, self-directed learning, active exploration, real-world challenges, STEM, 21st century skills, curriculum design, classroom management, project-based learning curriculum, global intelligence, social and communication skills, transferable skills, critical thinking, investigatable learning, life skills

Procedia PDF Downloads 55
338 An Intelligent Prediction Method for Annular Pressure Driven by Mechanism and Data

Authors: Zhaopeng Zhu, Xianzhi Song, Gensheng Li, Shuo Zhu, Shiming Duan, Xuezhe Yao

Abstract:

Accurate calculation of wellbore pressure is of great significance for preventing wellbore risk during drilling. The traditional mechanism model requires many iterative solving procedures in the calculation process, which reduces calculation efficiency and makes it difficult to meet the demand for dynamic control of wellbore pressure. In recent years, many scholars have introduced artificial intelligence algorithms into wellbore pressure calculation, which significantly improves the calculation efficiency and accuracy of wellbore pressure. However, due to the 'black box' property of intelligent algorithms, existing intelligent wellbore pressure models struggle to perform outside the scope of their training data and overreact to data noise, often resulting in abnormal calculation results. In this study, the multi-phase flow mechanism is embedded into the objective function of the neural network model as a constraint condition, and an intelligent prediction model of wellbore pressure under this constraint is established based on more than 400,000 sets of pressure measurement while drilling (MPD) data. The constraint of the multi-phase flow mechanism makes the predictions of the neural network model more consistent with the distribution law of wellbore pressure, which overcomes the black-box attribute of the neural network model to some extent. The main improvement is that the accuracy on the independent test data set is further improved, and abnormal calculated values essentially disappear. This method is a prediction method driven by MPD data and the multi-phase flow mechanism, and it is the main way to predict wellbore pressure accurately and efficiently in the future.
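
The idea of embedding a mechanism into the objective function can be sketched as a constrained (physics-informed) loss, as below in PyTorch; the hydrostatic lower-bound term is a deliberately simplified stand-in for the paper's multi-phase flow model, and the feature set, units, and weights are assumptions.

```python
import torch
import torch.nn as nn

# Total loss = data loss + lambda * penalty for violating the mechanism constraint.
net = nn.Sequential(nn.Linear(5, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
lam, g = 10.0, 9.81

def training_step(features, measured_p, mud_density, depth):
    pred = net(features).squeeze(-1)
    data_loss = nn.functional.mse_loss(pred, measured_p)           # fit to MPD measurements
    hydrostatic = mud_density * g * depth / 1e6                    # MPa, illustrative units
    mech_penalty = torch.relu(hydrostatic - pred).pow(2).mean()    # simplified physics term
    loss = data_loss + lam * mech_penalty
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```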

Keywords: multiphase flow mechanism, pressure while drilling data, wellbore pressure, mechanism constraints, combined drive

Procedia PDF Downloads 174
337 Determination of Optimum Parameters for Thermal Stress Distribution in Composite Plate Containing a Triangular Cutout by Optimization Method

Authors: Mohammad Hossein Bayati Chaleshtari, Hadi Khoramishad

Abstract:

Minimizing the stress concentration around a triangular cutout in infinite perforated plates subjected to a uniform heat flux, which induces thermal stresses, is an important consideration in engineering design. Furthermore, understanding the parameters that affect stress concentration and properly selecting these parameters enables the designer to achieve a reliable design. In thermal stress analysis, the parameters affecting the stress distribution around the cutout include the fiber angle, heat flux angle, bluntness, and rotation angle of the cutout for orthotropic materials. This paper examines the effect of these parameters on the thermal stress analysis of infinite perforated plates with a central triangular cutout. The least amount of thermal stress around the triangular cutout is sought using a novel swarm intelligence optimization technique called the dragonfly optimizer, which is inspired by the living and hunting behavior of dragonflies in nature. In this study, using two-dimensional thermoelastic theory and based on Lekhnitskii's complex variable technique, the stress analysis of an orthotropic infinite plate with a circular cutout under a uniform heat flux was extended to a plate containing a quasi-triangular cutout in the thermal steady-state condition. To achieve this goal, a conformal mapping function was used to map an infinite plate containing a quasi-triangular cutout onto the outside of a unit circle. The plate is under a uniform heat flux at infinity, and Neumann boundary conditions with a thermally insulated condition at the edge of the cutout were considered.
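
As a sketch of the optimization step, the four design parameters can be searched with an off-the-shelf evolutionary optimizer; differential evolution from SciPy is used here as a stand-in for the dragonfly optimizer, and the objective below is a dummy surrogate rather than the complex-variable stress solution.

```python
import numpy as np
from scipy.optimize import differential_evolution

def stress_concentration(x):
    """Placeholder objective. In the paper, this would evaluate the maximum normalized
    thermal stress around the cutout via the complex-variable solution; here a smooth
    dummy surrogate is used purely for illustration."""
    fiber_angle, flux_angle, bluntness, rotation = x
    return (1.5 + np.cos(np.radians(2 * (fiber_angle - flux_angle)))
            + 0.5 * bluntness + 0.1 * np.cos(np.radians(3 * rotation)))

bounds = [(0, 90), (0, 90), (0.05, 0.25), (0, 120)]   # degrees / bluntness parameter
result = differential_evolution(stress_concentration, bounds, seed=0)
print(result.x, result.fun)                            # optimal parameters, minimum stress
```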

Keywords: infinite perforated plate, complex variable method, thermal stress, optimization method

Procedia PDF Downloads 147
336 Signs, Signals and Syndromes: Algorithmic Surveillance and Global Health Security in the 21st Century

Authors: Stephen L. Roberts

Abstract:

This article offers a critical analysis of the rise of syndromic surveillance systems for the advanced detection of pandemic threats within contemporary global health security frameworks. The article traces the iterative evolution and ascendancy of three such novel syndromic surveillance systems for the strengthening of health security initiatives over the past two decades: 1) The Program for Monitoring Emerging Diseases (ProMED-mail); 2) The Global Public Health Intelligence Network (GPHIN); and 3) HealthMap. This article demonstrates how each newly introduced syndromic surveillance system has become increasingly oriented towards the integration of digital algorithms into core surveillance capacities to continually harness and forecast upon infinitely generating sets of digital, open-source data, potentially indicative of forthcoming pandemic threats. This article argues that the increased centrality of the algorithm within these next-generation syndromic surveillance systems produces a new and distinct form of infectious disease surveillance for the governing of emergent pathogenic contingencies. Conceptually, the article also shows how the rise of this algorithmic mode of infectious disease surveillance produces divergences in the governmental rationalities of global health security, leading to the rise of an algorithmic governmentality within contemporary contexts of Big Data and these surveillance systems. Empirically, this article demonstrates how this new form of algorithmic infectious disease surveillance has been rapidly integrated into diplomatic, legal, and political frameworks to strengthen the practice of global health security – producing subtle, yet distinct shifts in the outbreak notification and reporting transparency of states, increasingly scrutinized by the algorithmic gaze of syndromic surveillance.

Keywords: algorithms, global health, pandemic, surveillance

Procedia PDF Downloads 184
335 Potassium-Phosphorus-Nitrogen Detection and Spectral Segmentation Analysis Using Polarized Hyperspectral Imagery and Machine Learning

Authors: Nicholas V. Scott, Jack McCarthy

Abstract:

Military, law enforcement, and counter terrorism organizations are often tasked with target detection and image characterization of scenes containing explosive materials in various types of environments where light scattering intensity is high. Mitigation of this photonic noise using classical digital filtration and signal processing can be difficult. This is partially due to the lack of robust image processing methods for photonic noise removal, which strongly influence high resolution target detection and machine learning-based pattern recognition. Such analysis is crucial to the delivery of reliable intelligence. Polarization filters are a possible method for ambient glare reduction by allowing only certain modes of the electromagnetic field to be captured, providing strong scene contrast. An experiment was carried out utilizing a polarization lens attached to a hyperspectral imagery camera for the purpose of exploring the degree to which an imaged polarized scene of potassium, phosphorus, and nitrogen mixture allows for improved target detection and image segmentation. Preliminary imagery results based on the application of machine learning algorithms, including competitive leaky learning and distance metric analysis, to polarized hyperspectral imagery, suggest that polarization filters provide a slight advantage in image segmentation. The results of this work have implications for understanding the presence of explosive material in dry, desert areas where reflective glare is a significant impediment to scene characterization.
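
A minimal sketch of spectral clustering-based segmentation of a hyperspectral cube is given below; k-means is used as a simple stand-in for the competitive leaky learning and distance-metric analysis mentioned above, and the random cube stands in for real polarized imagery.

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-in (polarized) hyperspectral cube: height x width x spectral bands.
h, w, bands = 64, 64, 120
cube = np.random.rand(h, w, bands)

pixels = cube.reshape(-1, bands)                                   # one spectrum per pixel
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
segmentation = labels.reshape(h, w)                                # segment map for the scene
```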

Keywords: explosive material, hyperspectral imagery, image segmentation, machine learning, polarization

Procedia PDF Downloads 141
334 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce by "reskilling and upskilling" employees and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.

Keywords: master data management, IoT, AI and ML, cloud computing, data optimization

Procedia PDF Downloads 70
333 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector

Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau

Abstract:

A performance measurement framework (PMF) is an essential tool in any organization to assess the performance of its processes. It guides businesses to stay on track with their objectives and to benchmark themselves against the market. With the growing trend of the digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the conducted research, no such system has been developed in academia or industry. In this context, this paper covers a variety of methodologies on performance measurement, overviews the major AI and Big Data applications in the banking sector, and covers an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, on the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. The aforementioned classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application. This library is arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. This proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt. Furthermore, it will help them meet their business objectives through understanding the potential impact of such solutions on the entire organization.
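
The indexed metric library described above can be sketched as a simple keyed collection supporting rapid filtering; the classification keys and metric names below are illustrative assumptions, not entries from the proposed framework.

```python
# Hypothetical indexed metric library: (department, application) -> candidate metrics.
METRIC_LIBRARY = {
    ("risk", "credit-scoring model"): ["AUC", "Gini coefficient", "default rate delta"],
    ("operations", "document-processing automation"): ["straight-through processing rate",
                                                       "average handling time"],
    ("marketing", "churn prediction"): ["recall at top decile", "campaign uplift"],
}

def find_metrics(department=None, application=None):
    """Rapid search/retrieval: filter the library by department and/or application."""
    return {key: metrics for key, metrics in METRIC_LIBRARY.items()
            if (department is None or key[0] == department)
            and (application is None or application in key[1])}

print(find_metrics(department="risk"))
```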

Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement

Procedia PDF Downloads 198