Search results for: virtual Machine
971 Simulation of Particle Damping in Boring Tool Using Combined Particles
Authors: S. Chockalingam, U. Natarajan, D. M. Santhoshsarang
Abstract:
Particle damping is a more promising vibration-attenuation technique for boring tools than other types of damping, with minimal effect on the strength, rigidity, and stiffness ratio of the machine tool structure. Due to the cantilever nature of the boring tool holder in operation, the tool suffers chatter as its slenderness ratio increases. In this study, copper-stainless steel (SS) particles were packed inside the boring tool to act as a damper. The damper suppresses chatter generated during machining and also improves the machining efficiency of the tool at a better slenderness ratio. In the first approach to particle damping, combined Cu-SS particles were packed inside the vibrating tool, whereas copper and stainless steel particles were packed separately inside another tool, and their effectiveness was analysed in this simulation. This study evaluates, by finite element simulation, the efficiency of boring tools equipped with particles of copper, stainless steel, and a combination of both. The newly modified boring tool holder with particle damping was simulated using ANSYS 12.0 with and without particles. The aim of this study is to enhance structural rigidity through particle damping, thus avoiding the occurrence of resonance in the boring tool during machining.
Keywords: boring bar, copper-stainless steel, chatter, particle damping
Procedia PDF Downloads 461
970 Exploring Multimodal Communication: Intersections of Language, Gesture, and Technology
Authors: Rasha Ali Dheyab
Abstract:
In today's increasingly interconnected and technologically-driven world, communication has evolved beyond traditional verbal exchanges. This paper delves into the fascinating realm of multimodal communication, a dynamic field at the intersection of linguistics, gesture studies, and technology. The study of how humans convey meaning through a combination of spoken language, gestures, facial expressions, and digital platforms has gained prominence as our modes of interaction continue to diversify. This exploration begins by examining the foundational theories in linguistics and gesture studies, tracing their historical development and mutual influences. It further investigates the role of nonverbal cues, such as gestures and facial expressions, in augmenting and sometimes even altering the meanings conveyed by spoken language. Additionally, the paper delves into the modern technological landscape, where emojis, GIFs, and other digital symbols have emerged as new linguistic tools, reshaping the ways in which we communicate and express emotions. The interaction between traditional and digital modes of communication is a central focus of this study. The paper investigates how technology has not only introduced new modes of expression but has also influenced the adaptation of existing linguistic and gestural patterns in online discourse. The emergence of virtual reality and augmented reality environments introduces yet another layer of complexity to multimodal communication, offering new avenues for studying how humans navigate and negotiate meaning in immersive digital spaces. Through a combination of literature review, case studies, and theoretical analysis, this paper seeks to shed light on the intricate interplay between language, gesture, and technology in the realm of multimodal communication. 
By understanding how these diverse modes of expression intersect and interact, we gain valuable insights into the ever-evolving nature of human communication and its implications for fields ranging from linguistics and psychology to human-computer interaction and digital anthropology.
Keywords: multimodal communication, linguistics, gesture studies, emojis, verbal communication, digital
Procedia PDF Downloads 81
969 A Preliminary Kinematic Comparison of Vive and Vicon Systems for the Accurate Tracking of Lumbar Motion
Authors: Yaghoubi N., Moore Z., Van Der Veen S. M., Pidcoe P. E., Thomas J. S., Dexheimer B.
Abstract:
Optoelectronic 3D motion capture systems, such as the Vicon kinematic system, are widely utilized in biomedical research to track joint motion. These systems are considered powerful and accurate measurement tools with <2 mm average error. However, these systems are costly and may be difficult to implement and utilize in a clinical setting. 3D virtual reality (VR) is gaining popularity as an affordable and accessible tool to investigate motor control and perception in a controlled, immersive environment. The HTC Vive VR system includes puck-style trackers that seamlessly integrate into its VR environments. These affordable, wireless, lightweight trackers may be more feasible for clinical kinematic data collection. However, the accuracy of HTC Vive Trackers (3.0), when compared to optoelectronic 3D motion capture systems, remains unclear. In this preliminary study, we compared the HTC Vive Tracker system to a Vicon kinematic system in a simulated lumbar flexion task. A 6-DOF robot arm (SCORBOT ER VII, Eshed Robotec/RoboGroup, Rosh Ha’Ayin, Israel) completed various reaching movements to mimic increasing levels of hip flexion (15°, 30°, 45°). Light reflective markers, along with one HTC Vive Tracker (3.0), were placed on the rigid segment separating the elbow and shoulder of the robot. We compared position measures simultaneously collected from both systems. Our preliminary analysis shows no significant differences between the Vicon motion capture system and the HTC Vive tracker in the Z axis, regardless of hip flexion. In the X axis, we found no significant differences between the two systems at 15 degrees of hip flexion but minimal differences at 30 and 45 degrees, ranging from .047 cm ± .02 SE (p = .03) at 30 degrees hip flexion to .194 cm ± .024 SE (p < .0001) at 45 degrees of hip flexion. In the Y axis, we found a minimal difference for 15 degrees of hip flexion only (.743 cm ± .275 SE; p = .007). 
This preliminary analysis shows that the HTC Vive Tracker may be an appropriate, affordable option for gross motor motion capture when the Vicon system is not available, such as in clinical settings. Further research is needed to compare these two motion capture systems in different body poses and for different body segments.
Keywords: lumbar, Vive tracker, Vicon system, 3D motion, ROM
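The system-agreement comparison above boils down to paired differences: subtract simultaneous position samples and summarize with a mean difference and standard error. A minimal sketch follows; the position samples are illustrative numbers, not the study's data.

```python
import numpy as np

# Hypothetical simultaneous position samples (cm) from the two systems.
vicon = np.array([10.00, 10.20, 9.90, 10.10, 10.00])  # reference system
vive = np.array([10.10, 10.30, 10.00, 10.15, 10.05])  # Vive tracker

diff = vive - vicon                          # per-sample disagreement
mean_diff = diff.mean()                      # mean bias between the systems
se = diff.std(ddof=1) / np.sqrt(len(diff))   # standard error of the mean
```

A paired t-test on `diff` would then decide whether the bias is significant, as in the per-axis comparisons reported above.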
Procedia PDF Downloads 101
968 Supervisor Controller-Based Colored Petri Nets for Deadlock Control and Machine Failures in Automated Manufacturing Systems
Authors: Husam Kaid, Abdulrahman Al-Ahmari, Zhiwu Li
Abstract:
This paper develops a robust deadlock control technique for shared and unreliable resources in automated manufacturing systems (AMSs) based on structural analysis and colored Petri nets, which consists of three steps. The first step involves using strict minimal siphon control to create a live (deadlock-free) system that does not consider resource failure. The second step uses an approach based on colored Petri nets, in which all monitors designed in the first step are merged into a single monitor. The third step addresses the deadlock control problems caused by resource failures. For all resource failures in the Petri net model, a common recovery subnet based on a colored Petri net is proposed. The common recovery subnet is added to the system obtained in the second step to make the system reliable. The proposed approach is evaluated using an AMS from the literature. The results show that the proposed approach can be applied to an unreliable complex Petri net model, has a simpler structure and less computational complexity, and can obtain one common recovery subnet to model all resource failures.
Keywords: automated manufacturing system, colored Petri net, deadlocks, siphon
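At its core, deadlock in a Petri net means reaching a marking where no transition is enabled. A minimal token-game sketch makes this concrete; the tiny two-place net below is made up and far simpler than the paper's AMS model or its siphon-based monitors.

```python
# A Petri net marking is deadlocked when no transition is enabled.
def enabled(marking, pre):
    return [t for t, need in pre.items()
            if all(marking[p] >= n for p, n in need.items())]

def fire(marking, pre, post, t):
    m = dict(marking)
    for p, n in pre[t].items():   # consume input tokens
        m[p] -= n
    for p, n in post[t].items():  # produce output tokens
        m[p] = m.get(p, 0) + n
    return m

# Toy net: t1 moves the token from p1 to p2; t2 consumes it without
# replacement, after which nothing is enabled, i.e. a deadlock.
pre = {"t1": {"p1": 1}, "t2": {"p2": 1}}
post = {"t1": {"p2": 1}, "t2": {}}
m0 = {"p1": 1, "p2": 0}
```

A supervisor in the spirit of the paper would add monitor places to `pre`/`post` so that markings like the final one become unreachable.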
Procedia PDF Downloads 129
967 Exploratory Study of the Influencing Factors for Hotels' Competitors
Authors: Asma Ameur, Dhafer Malouche
Abstract:
Hotel competitiveness research is an essential phase of the marketing strategy for any hotel. Knowing a hotel's competitors helps the hotelier grasp its position in the market and helps customers make the right choice when picking a hotel. Thus, competitiveness is an important indicator that can be influenced by various factors. In fact, the issue of competitiveness, the ability to cope with competition, remains a difficult and complex concept to define and to exploit. Therefore, the purpose of this article is to carry out an exploratory study to calculate a competitiveness indicator for hotels. Further on, this paper makes it possible to determine the criteria with a direct or indirect effect on the image and the perception of a hotel. The present research looks into the right model for hotel competitiveness. For this reason, we exploit different theoretical contributions in the field of machine learning. Thus, we use statistical techniques such as Principal Component Analysis (PCA) to reduce the dimensions, as well as other statistical modeling techniques. This paper presents a survey of the techniques and methods in hotel competitiveness research. Furthermore, this study allows us to deduce the significant variables that influence the determination of a hotel's competitors. Lastly, the experiments discussed in this article found that a hotel's competitors are influenced by several factors with different rates.
Keywords: competitiveness, e-reputation, hotels' competitors, online hotel reviews, principal component analysis, statistical modeling
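The PCA step mentioned above (reducing many hotel indicators to a few components) can be sketched with an eigendecomposition of the standardized data's covariance matrix. The hotel-feature matrix below is a made-up toy (hypothetical rating, price level, review volume), not the study's data.

```python
import numpy as np

# Rows: hotels; columns: hypothetical competitiveness indicators.
X = np.array([[4.5, 120.0, 800.0],
              [3.8,  80.0, 150.0],
              [4.9, 200.0, 950.0],
              [3.2,  60.0, 100.0]])

Xc = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each indicator
cov = np.cov(Xc, rowvar=False)              # covariance of standardized data
eigvals, eigvecs = np.linalg.eigh(cov)      # PCA via eigendecomposition
order = np.argsort(eigvals)[::-1]           # components by variance explained
explained = eigvals[order] / eigvals.sum()
scores = Xc @ eigvecs[:, order]             # hotels in component space
```

Indicators loading heavily on the leading components are candidates for the "significant variables" the study seeks.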
Procedia PDF Downloads 119
966 IoT and Advanced Analytics Integration in Biogas Modelling
Authors: Rakesh Choudhary, Ajay Kumar, Deepak Sharma
Abstract:
The main goal of this paper is to investigate the challenges and benefits of IoT integration in biogas production. This overview explains how the inclusion of IoT can enhance biogas production efficiency. The collected data can be explored by advanced analytics, including artificial intelligence (AI) and machine learning (ML) algorithms, consequently improving bio-energy processes. To boost biogas generation efficiency, this report examines the use of IoT devices for real-time data collection on key parameters, e.g., pH, temperature, gas composition, and microbial growth. Real-time monitoring through big data has made it possible to detect diverse, complex trends in the biogas production process. Insights from advanced analytics can also help improve bio-energy production and optimize operational conditions. Moreover, IoT allows remote observation, control, and management, which decreases the manual intervention needed whilst increasing process effectiveness. Such a paradigm shift in the incorporation of IoT technologies into biogas production systems helps to achieve higher productivity levels as well as better-quality biomethane through real-time, monitoring-based proactive decision-making, thus driving continuous performance improvement.
Keywords: internet of things, biogas, renewable energy, sustainability, anaerobic digestion, real-time monitoring, optimization
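The proactive decision-making described above often starts as simple rules over the monitored parameters. A minimal alert-rule sketch follows; the setpoint ranges are illustrative placeholders, not recommended digester operating limits.

```python
# Hypothetical acceptable ranges for two monitored digester parameters.
LIMITS = {"ph": (6.8, 7.4), "temp_c": (35.0, 40.0)}

def check_reading(reading):
    """Return the list of parameters that are out of range."""
    alerts = []
    for key, (lo, hi) in LIMITS.items():
        if not lo <= reading[key] <= hi:
            alerts.append(key)
    return alerts
```

In a fuller pipeline, such rule-based alerts would sit alongside the ML models mentioned above, which can flag trends the fixed thresholds miss.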
Procedia PDF Downloads 20
965 Violence Detection and Tracking on Moving Surveillance Video Using Machine Learning Approach
Authors: Abe Degale D., Cheng Jian
Abstract:
When creating automated video surveillance systems, violent action recognition is crucial. In recent years, hand-crafted feature detectors have been the primary method for achieving violence detection, such as the recognition of fighting activity. Researchers have also looked into learning-based representational models. On benchmark datasets created especially for the detection of violent sequences in sports and movies, these methods produced good accuracy results. The Hockey dataset's videos with surveillance camera motion present challenges for these algorithms in learning discriminating features. Deep representation-based methods have shown success in image recognition and human activity detection challenges. For the purpose of detecting violent images and identifying aggressive human behaviours, this research proposes a deep representation-based model using the transfer learning idea. The results show that the suggested approach outperforms state-of-the-art accuracy levels by learning the most discriminating features, attaining 99.34% and 99.98% accuracy on the Hockey and Movies datasets, respectively.
Keywords: violence detection, Faster R-CNN, transfer learning, surveillance video
Procedia PDF Downloads 107
964 Assessing the Self-Directed Learning Skills of the Undergraduate Nursing Students in a Medical University in Bahrain: A Quantitative Study
Authors: Catherine Mary Abou-Zaid
Abstract:
This quantitative study discusses concerns with the self-directed learning (SDL) skills of undergraduate nursing students in a medical university in Bahrain. The study covered all four years of the nursing undergraduate programme, compiling data collected from the students themselves by survey questionnaire. The aim of the study is to understand and change attitudes toward self-directed learning among the undergraduate students. The SDL skills of the undergraduate student nurses have been noticed to be lacking, and motivation to perform without supervision outside the classroom is very low. Their use of the resources available in the virtual learning environment and within the university is not as good as it should be for university students at this level, and they do not use these resources to their own advantage. They are not prepared for the transition from high school to an academic environment such as a university or college. For some students, it is the first time in their academic lives that they have faced sharing a classroom with the opposite sex. For some, this is a major issue, and we as academics need to be aware of all the issues that they bring to higher education. Design Methodology: The chosen design methodology was quantitative, using convenience sampling of students asked to complete a survey questionnaire. This sampling method was chosen because of the time constraint. The questionnaire was completed by the undergraduate students themselves while in class, analyzed with the Statistical Package for the Social Sciences (SPSS), the results interpreted by the researcher, and the findings published in the paper. The analyzed data will also be reported on, and from this information we as educators will be able to see the students' weaknesses regarding self-directed learning.
The aims and objectives of the research will be used as recommendations for the improvement of resources for the students to improve their SDL skills. Conclusion: The results will give educators an insight into how we can change the self-directed learning techniques of the students and enable them to embrace these skills and to focus more on being self-directed in their studies, rather than having to be put on an SDL pathway by the educators themselves. This evidence will come from the analysis of the statistical data. It may even change the way in which students are selected for the nursing programme. These recommendations will be reported to the head of school and also to the nursing faculty.
Keywords: self-directed learning, undergraduate students, transition, statistical package for social sciences (SPSS), higher education
Procedia PDF Downloads 315
963 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date
Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian
Abstract:
To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) for each patient admitted to the hospital. This assignment of the EDD is largely based on the doctor's judgment, which can be difficult for cases that are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would help doctors make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the Length of Stay (and hence the EDD) of inpatients at the point of admission. The predictive performance of the model was compared to that of the clinicians using accuracy measures. Overall, the best performing model was able to predict the EDD with a 38% reduction in Average Squared Error (ASE) compared to the first EDD determined by the present method. Important predictors of the EDD include the provisional diagnosis code, patient's age, attending doctor at admission, medical specialty at admission, accommodation type, and the mean length of stay of the patient in the past year. The predictive model can be used as a tool to accurately predict the EDD.
Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven
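The length-of-stay prediction described above can be sketched as a regression on admission-time features. The tiny toy below uses ordinary least squares on two hypothetical predictors (age and an acuity score); the data, features, and model are illustrative stand-ins, not the paper's pipeline.

```python
import numpy as np

# Toy admission-time features and observed length of stay (days).
age = np.array([30, 45, 60, 70, 50, 65], dtype=float)
acuity = np.array([1, 2, 3, 3, 2, 3], dtype=float)   # hypothetical severity
los = np.array([2, 4, 7, 8, 5, 7], dtype=float)

X = np.column_stack([np.ones_like(age), age, acuity])  # intercept + features
coef, *_ = np.linalg.lstsq(X, los, rcond=None)         # ordinary least squares
predicted_los = X @ coef                               # EDD = admit date + LOS
```

In practice, the categorical predictors listed above (diagnosis code, specialty, accommodation type) would be encoded and a richer model used, but the EDD is derived the same way: admission date plus predicted LOS.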
Procedia PDF Downloads 174
962 Control Flow around NACA 4415 Airfoil Using Slot and Injection
Authors: Imine Zakaria, Meftah Sidi Mohamed El Amine
Abstract:
One of the most vital aerodynamic organs of a flying machine is the wing, which allows it to fly in the air efficiently. The flow around the wing is very sensitive to changes in the angle of attack. Beyond a certain value, the boundary layer separates on the upper surface, which causes instability and total degradation of aerodynamic performance, called stall. Controlling the flow around an airfoil has therefore become a research concern in the aeronautics field. There are two techniques for controlling flow around a wing to improve its aerodynamic performance: passive and active control. Blowing and suction are among the active techniques that control boundary layer separation around an airfoil. Their objective is to give energy to the air particles in the boundary layer separation zones and to create vortex structures that homogenize the velocity near the wall and allow control. Blowing and suction have long been used as flow control actuators around obstacles; in 1904, Prandtl applied permanent blowing to a cylinder to delay boundary layer separation. In the present study, several numerical investigations have been developed to predict a turbulent flow around an aerodynamic profile. A CFD code was used for several angles of attack in order to validate the present work against the literature in the case of a clean profile. The variation of the lift coefficient CL with the momentum coefficient
Keywords: CFD, control flow, lift, slot
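The momentum coefficient mentioned above is the standard nondimensional measure of blowing intensity in active flow control, Cμ = ṁ Ujet / (½ ρ U∞² S). A one-line sketch, with all numeric values purely illustrative:

```python
# Momentum coefficient of a blowing jet; inputs are illustrative, not the
# paper's slot configuration.
def momentum_coefficient(m_dot, u_jet, rho, u_inf, ref_area):
    # C_mu = (m_dot * U_jet) / (0.5 * rho * U_inf**2 * S)
    return (m_dot * u_jet) / (0.5 * rho * u_inf**2 * ref_area)
```

Plotting CL against this quantity, as the truncated sentence above begins to describe, shows how much jet momentum is needed before the lift benefit saturates.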
Procedia PDF Downloads 197
961 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic
Authors: Fei Gao, Rodolfo C. Raga Jr.
Abstract:
This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project aims to improve early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. The phase relation values of each attribute were used to analyze and choose the attributes that might influence the examinee's survival probability, using the Diabetes Health Indicators Dataset from Kaggle as the research data. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle
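The cross-validation procedure referenced above can be sketched in a few lines: split the data into k folds, hold out each fold in turn, and average a metric over the folds. The "model" below is a trivial majority-class baseline that ignores the features, purely to keep the sketch self-contained; it is not the paper's classifiers.

```python
import numpy as np

# Minimal k-fold cross-validation with a majority-class baseline.
def kfold_accuracy(y, k=5):
    idx = np.arange(len(y))
    folds = np.array_split(idx, k)
    scores = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        majority = np.bincount(y[train]).argmax()   # "train" the baseline
        scores.append(float((y[test] == majority).mean()))
    return float(np.mean(scores))                   # fold-averaged accuracy
```

Substituting each of the eight real classifiers for the baseline, and precision/recall/F1/AUC-ROC for accuracy, yields the comparison the abstract describes.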
Procedia PDF Downloads 75
960 Advances in Artificial Intelligence Using Speech Recognition
Authors: Khaled M. Alhawiti
Abstract:
This research study aims to present a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. Precisely, it can be affirmed that speech recognition facilitates its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research intends to present an illustration of recent technological advancements associated with artificial intelligence. Recent research has revealed that accurate decoding of speech remains the utmost issue in speech recognition. In order to overcome this issue, different statistical models were developed by researchers. Some of the most prominent statistical models include the acoustic model (AM), language model (LM), lexicon model, and hidden Markov models (HMMs). This research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. It has been recognized that artificial intelligence is among the most efficient and reliable methods used in speech recognition.
Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance
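The decoding step behind the HMMs mentioned above is typically the Viterbi algorithm: find the most likely hidden-state sequence given the observations. A toy two-state sketch follows; the model and its probabilities are made up for illustration, not drawn from any real recognizer.

```python
import numpy as np

# Toy Viterbi decoder for a discrete-observation HMM.
def viterbi(obs, start, trans, emit):
    delta = start * emit[:, obs[0]]           # best path prob per state
    back = np.zeros((len(obs), len(start)), dtype=int)
    for t in range(1, len(obs)):
        scores = delta[:, None] * trans       # extend every path one step
        back[t] = scores.argmax(axis=0)       # best predecessor per state
        delta = scores.max(axis=0) * emit[:, obs[t]]
    states = [int(delta.argmax())]
    for t in range(len(obs) - 1, 0, -1):      # backtrack the best path
        states.append(int(back[t, states[-1]]))
    return states[::-1]

start = np.array([0.6, 0.4])                  # initial state probabilities
trans = np.array([[0.7, 0.3], [0.4, 0.6]])    # transition probabilities
emit = np.array([[0.9, 0.1], [0.2, 0.8]])     # emission probabilities
best_path = viterbi([0, 0, 1], start, trans, emit)
```

Production recognizers work in log space and combine acoustic, lexicon, and language model scores, but the dynamic-programming core is the same.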
Procedia PDF Downloads 478
959 A Parallel Computation Based on GPU Programming for a 3D Compressible Fluid Flow Simulation
Authors: Sugeng Rianto, P.W. Arinto Yudi, Soemarno Muhammad Nurhuda
Abstract:
Computing a 3D compressible fluid flow for a virtual environment with haptic interaction is a non-trivial issue, especially in reaching good performance and a balance between visualization, tactile feedback interaction, and computation. In this paper, we describe our approach to computation methods based on parallel programming on a GPU. The 3D fluid flow solvers have been developed for smoke dispersion simulation by combining cubic interpolated propagation (CIP) based fluid flow solvers with the parallelism and programmability of the GPU. The fluid flow solver is generated in a GPU-CPU message-passing scheme for rapid development of haptic feedback modes for fluid dynamic data. A rapid solution in fluid flow solvers is developed by applying cubic interpolated propagation (CIP) fluid flow solvers; from this scheme, multiphase fluid flow equations can be solved simultaneously. To accelerate the computation further, the Navier-Stokes equations (NSEs) are packed into channels of texels, where computation models are performed on pixels that can be considered a grid of cells. Therefore, despite the complexity of the obstacle geometry, processing on multiple vertices and pixels can be done simultaneously in parallel. The data are also shared in global memory for the CPU to control the haptics in providing kinaesthetic interaction and feeling. The results show that GPU-based parallel computation approaches provide effective simulation of a compressible fluid flow model for real-time interaction in 3D computer graphics on a PC platform. This report has shown the feasibility of a new approach to solving the compressible fluid flow equations on the GPU. The experimental tests proved that compressible fluid flowing over the few model obstacles, with haptic interaction, can be effectively and efficiently simulated at a reasonable frame rate with realistic visualization.
These results confirm that good performance and a balance between visualization, tactile feedback interaction, and computation can be achieved successfully.
Keywords: CIP, compressible fluid, GPU programming, parallel computation, real-time visualisation
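The per-cell update that a GPU solver like the one above runs in parallel can be illustrated on the CPU with a one-dimensional advection step. This sketch uses first-order upwind differencing, deliberately far simpler than the paper's CIP scheme, and assumes u ≥ 0 with Courant number c = u·dt/dx ≤ 1.

```python
import numpy as np

# One explicit advection step over a 1D grid of cells; on a GPU each cell's
# update would be an independent thread (or texel, in the paper's scheme).
def advect_upwind(q, u, dx, dt):
    c = u * dt / dx                       # Courant number
    qn = q.copy()
    qn[1:] = q[1:] - c * (q[1:] - q[:-1])  # upwind difference per cell
    return qn

q0 = np.array([0.0, 1.0, 0.0, 0.0])       # a "smoke" pulse
q1 = advect_upwind(q0, u=1.0, dx=1.0, dt=1.0)   # c = 1: exact one-cell shift
```

Note the update for cell i reads only cells i and i-1, which is exactly why the computation maps so well onto independent pixels/texels.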
Procedia PDF Downloads 432
958 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction
Authors: Priyadarsini Samal, Rajesh Singla
Abstract:
Many mobile games entertain while also introducing stress to the human brain. In recognizing this mental stress, the brain-computer interface (BCI) plays an important role. It offers various neuroimaging approaches which help in analyzing brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the pattern in brain signals when mental stress is introduced. Two healthy volunteers played a game whose aim was to search for hidden words in a grid, with levels chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the levels changed from easy to medium to hard. A total of 16 EEG features were analyzed for this experiment, including power band features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, another game similar in nature was played by the volunteers. A suitable regression model was designed for prediction, where the feature sets of the first and second games were used for testing and training purposes, respectively, and an accuracy of 73% was found.
Keywords: brain computer interface, electroencephalogram, regression model, stress, word search
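A relative band-power feature of the kind counted among the 16 EEG features above can be computed from the signal's power spectrum. In this sketch the "EEG" is a synthetic pure 10 Hz sine (inside the alpha band), and the 256 Hz sampling rate is an assumption, so nearly all power should land in the 8-13 Hz band.

```python
import numpy as np

fs = 256                                     # sampling rate in Hz (assumed)
t = np.arange(fs) / fs                       # one second of samples
signal = np.sin(2 * np.pi * 10 * t)          # synthetic 10 Hz "alpha" rhythm

spectrum = np.abs(np.fft.rfft(signal)) ** 2  # power spectrum
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
band = (freqs >= 8) & (freqs <= 13)          # alpha band mask
relative_alpha = spectrum[band].sum() / spectrum.sum()
```

Repeating this for each band (theta, beta, etc.) and epoch gives the relative-power feature vector that a classifier such as an SVM consumes.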
Procedia PDF Downloads 187
957 Electrocardiogram-Based Heartbeat Classification Using Convolutional Neural Networks
Authors: Jacqueline Rose T. Alipo-on, Francesca Isabelle F. Escobar, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar Al Dahoul
Abstract:
Electrocardiogram (ECG) signal analysis and processing are crucial in the diagnosis of cardiovascular diseases, which are considered one of the leading causes of mortality worldwide. However, traditional rule-based analysis of large volumes of ECG data is time-consuming, labor-intensive, and prone to human error. With advances in the programming paradigm, algorithms such as machine learning have been increasingly used to analyze ECG signals. In this paper, various deep learning algorithms were adapted to classify five classes of heartbeat types. The dataset used in this work is the synthetic MIT-BIH Arrhythmia dataset produced by generative adversarial networks (GANs). Deep learning models such as the ResNet-50 convolutional neural network (CNN), a 1-D CNN, and long short-term memory (LSTM) were evaluated and compared. ResNet-50 was found to outperform the other models in terms of recall and F1 score, with five-fold average scores of 98.88% and 98.87%, respectively. The 1-D CNN, on the other hand, was found to have the highest average precision of 98.93%.
Keywords: heartbeat classification, convolutional neural network, electrocardiogram signals, generative adversarial networks, long short-term memory, ResNet-50
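The precision, recall, and F1 scores compared above are computed per class from true/false positive and negative counts. A minimal sketch; the label vectors are tiny made-up examples, not the MIT-BIH data.

```python
import numpy as np

# Per-class precision, recall, and F1 from predicted and true labels.
def prf(y_true, y_pred, positive):
    tp = np.sum((y_pred == positive) & (y_true == positive))
    fp = np.sum((y_pred == positive) & (y_true != positive))
    fn = np.sum((y_pred != positive) & (y_true == positive))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return float(precision), float(recall), float(f1)

precision, recall, f1 = prf(np.array([1, 1, 0, 1, 0]),
                            np.array([1, 0, 0, 1, 1]), positive=1)
```

For the five-class heartbeat task these would be computed per class and then averaged over the five cross-validation folds, as the reported scores are.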
Procedia PDF Downloads 128
956 Lightweight Hybrid Convolutional and Recurrent Neural Networks for Wearable Sensor Based Human Activity Recognition
Authors: Sonia Perez-Gamboa, Qingquan Sun, Yan Zhang
Abstract:
Non-intrusive sensor-based human activity recognition (HAR) is utilized in a spectrum of applications, including fitness tracking devices, gaming, health care monitoring, and smartphone applications. Deep learning models such as convolutional neural networks (CNNs) and long short-term memory (LSTM) recurrent neural networks (RNNs) provide a way to achieve HAR accurately and effectively. In this paper, we design a multi-layer hybrid architecture with CNN and LSTM and explore a variety of multi-layer combinations. Based on this exploration, we present a lightweight, hybrid, multi-layer model which improves recognition performance by integrating local, scale-invariant features with the temporal dependencies of activities. The experimental results demonstrate the efficacy of the proposed model, which achieves a 94.7% activity recognition rate on a benchmark human activity dataset. This model outperforms traditional machine learning and other deep learning methods. Additionally, our implementation achieves a balance between recognition rate and training time consumption.
Keywords: deep learning, LSTM, CNN, human activity recognition, inertial sensor
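Before an inertial sensor stream reaches a CNN/LSTM model like the one above, it is usually cut into fixed-length, overlapping windows. A minimal segmentation sketch; the window and step sizes are illustrative, not the paper's settings.

```python
import numpy as np

# Cut a 1D sensor stream into overlapping fixed-length windows.
def sliding_windows(stream, window, step):
    starts = range(0, len(stream) - window + 1, step)
    return np.stack([stream[s:s + window] for s in starts])

windows = sliding_windows(np.arange(10), window=4, step=2)
```

Each window (here 4 samples, 50% overlap) becomes one training example: the CNN layers extract local features within a window, and the LSTM layers model dependencies across time.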
Procedia PDF Downloads 150
955 The Proposal for a Framework to Face Opacity and Discrimination ‘Sins’ Caused by Consumer Creditworthiness Machines in the EU
Authors: Diogo José Morgado Rebelo, Francisco António Carneiro Pacheco de Andrade, Paulo Jorge Freitas de Oliveira Novais
Abstract:
Not everything in AI-powered consumer credit scoring turns out to be a wonder. When using AI in Creditworthiness Assessment (CWA), the 'sins' of opacity and unfairness must be considered for the task to be deemed responsible. AI software is not always 100% accurate, which can lead to misclassification. Discrimination against some groups can be exponentiated. A hetero-personalized identity can be imposed on the individual(s) affected. Also, autonomous CWA sometimes lacks transparency when using black-box models. However, for this intended purpose, human analysts 'on-the-loop' might not be the best remedy consumers are looking for in credit. This study seeks to explore the legality of implementing a Multi-Agent System (MAS) framework in consumer CWA to ensure compliance with the regulation outlined in Article 14(4) of the Proposal for an Artificial Intelligence Act (AIA), dated 21 April 2021 (as per the last corrigendum by the European Parliament on 19 April 2024). Especially with the adoption of Art. 18(8)(9) of EU Directive 2023/2225 of 18 October, which will go into effect on 20 November 2026, there should be more emphasis on the need for hybrid oversight in AI-driven scoring to ensure fairness and transparency. In fact, the range of EU regulations on AI-based consumer credit will soon impact the AI lending industry locally and globally, as shown by the broad territorial scope of the AIA's Art. 2. Consequently, engineering the law of consumers' CWA is imperative. Generally, the proposed MAS framework consists of several layers arranged in a specific sequence, as follows: firstly, the Data Layer gathers legitimate predictor sets from traditional sources; then, the Decision Support System Layer, whose neural network model is trained using k-fold cross-validation, provides recommendations based on the feeder data; the eXplainability (XAI) multi-structure comprises Three-Step Agents; and, lastly, the Oversight Layer has a 'Bottom Stop' for analysts to intervene in a timely manner.
From this analysis, a vital component of the software is the XAI layer. It acts as a transparent curtain over the AI's decision-making process, enabling comprehension, reflection, and further feasible oversight. Local Interpretable Model-agnostic Explanations (LIME) might act as a pillar by offering counterfactual insights. SHapley Additive exPlanations (SHAP), another agent in the XAI layer, could address potential discrimination issues by identifying the contribution of each feature to the prediction. Alternatively, for thin-file or no-file consumers, the Suggestion Agent can promote financial inclusion. It uses lawful alternative sources, such as share of wallet, among others, to search for more advantageous solutions to incomplete evaluation appraisals based on genetic programming. Overall, this research aspires to bring the concept of Machine-Centered Anthropocentrism to the table of EU policymaking. It acknowledges that, when put into service, credit analysts no longer exert full control over the data-driven entities programmers have given 'birth' to. With similar explanatory agents under supervision, AI itself can become self-accountable, prioritizing human concerns and values. AI decisions should not be vilified inherently; the issue lies in how they are integrated into decision-making and whether they align with non-discrimination principles and transparency rules.
Keywords: creditworthiness assessment, hybrid oversight, machine-centered anthropocentrism, EU policymaking
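A much simpler cousin of the SHAP-style feature attribution described above is permutation importance: shuffle one feature at a time and measure how much the model's error grows. Everything in this sketch (the data, the linear "scoring model", and the feature roles) is made up for illustration; it is not the proposed framework's XAI agent.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                  # three hypothetical predictors
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Fit a toy linear scorer (intercept appended as a constant column).
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
predict = lambda M: np.column_stack([M, np.ones(len(M))]) @ coef
base_err = np.mean((predict(X) - y) ** 2)

importance = []
for j in range(3):                             # shuffle one feature at a time
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(float(np.mean((predict(Xp) - y) ** 2) - base_err))
```

A feature whose shuffling barely moves the error contributes little to the score; one whose shuffling explodes the error is doing the real work, which is exactly the kind of signal a discrimination audit would inspect.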
Procedia PDF Downloads 34954 Difference between 'HDR Ir-192 and Co-60 Sources' for High Dose Rate Brachytherapy Machine
Authors: Md Serajul Islam
Abstract:
High Dose Rate (HDR) brachytherapy is used for cancer patients. In our country's context, HDR is used only for the treatment of cervical and breast cancer. The air kerma rate in air at a reference distance of less than one meter from the source is the recommended quantity for the specification of the gamma-ray source Ir-192 in brachytherapy. The absorbed dose for the patients is directly proportional to the air kerma rate. Therefore, the air kerma rate should be determined before the first use of the source on patients by a qualified medical physicist who is independent of the source manufacturer. The air kerma rate is then applied in the calculation of the dose delivered to patients in their planning systems. In practice, high dose rate (HDR) Ir-192 afterloader machines are mostly used in brachytherapy treatment. Currently, HDR Co-60 machines are increasingly coming into operation as well. The essential advantage of Co-60 sources is their longer half-life compared to Ir-192. The use of HDR Co-60 afterloading machines is therefore quite attractive for developing countries. This work describes the dosimetry of HDR afterloading machines according to the protocols of IAEA-TECDOC-1274 (2002) with the nuclides Ir-192 and Co-60. We used three different measurement methods (with a ring chamber, with a solid phantom in free air, and with a well chamber) for each of the protocols. We have shown that the standard deviations of the measured air kerma rate for the Co-60 source are generally larger than those for the Ir-192 source. The measurements with the well chamber had the lowest deviation from the certificate value. Across all protocols and methods, the deviations remained within a maximum of about 1% for Ir-192 and 2.5% for Co-60 sources.Keywords: Ir-192 source, cancer, patients, cheap treatment cost
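The practical advantage of Co-60's longer half-life can be quantified with simple exponential decay of the air kerma rate. The half-lives used below are the standard published values (Ir-192 ≈ 73.8 days, Co-60 ≈ 5.27 years); the one-year comparison is illustrative, not taken from the paper.

```python
import math

def remaining_fraction(t_days, half_life_days):
    """Exponential decay: fraction of the initial air kerma rate left after t."""
    return math.exp(-math.log(2) * t_days / half_life_days)

HALF_LIFE_DAYS = {"Ir-192": 73.8, "Co-60": 5.27 * 365.25}

for nuclide, hl in HALF_LIFE_DAYS.items():
    f = remaining_fraction(365.25, hl)
    print(f"{nuclide}: {f:.1%} of the initial air kerma rate after one year")
```

After a year, an Ir-192 source retains only a few percent of its initial air kerma rate and must be exchanged several times per year, whereas a Co-60 source is still close to 90% strength, which is why Co-60 afterloaders reduce running costs.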
Procedia PDF Downloads 236953 Bhumastra “Unmanned Ground Vehicle”
Authors: Vivek Krishna, Nikhil Jain, A. Mary Posonia A., Albert Mayan J
Abstract:
Terrorism and insurgency are significant global issues that require constant attention and effort from governments and scientists worldwide. To combat these threats, nations invest billions of dollars in developing new defensive technologies to protect civilians. Breakthroughs in vehicle automation have led to the use of sophisticated machines for many dangerous and critical anti-terrorist activities. Our concept of an "Unmanned Ground Vehicle" can carry out tasks such as border security, surveillance, mine detection, and active combat, either independently or in tandem with human control. The robot's movement can be wirelessly controlled by a person in a distant location, or the vehicle can travel to a pre-programmed destination autonomously in situations where personal control is not feasible. Our defence system comprises two units: a control unit that regulates mobility and a motion-tracking unit. The remote operator uses the camera's live visual feed to manually operate both units, and the rover can automatically detect movement. The rover is driven manually with a joystick or mouse, and a wireless modem enables a soldier in a combat zone to control the rover via an additional controller feature.Keywords: robotics, computer vision, machine learning, artificial intelligence, future of AI
Procedia PDF Downloads 124952 A Sociolinguistic Approach to the Translation of Children’s Literature: Exploring Identity Issues in the American English Translation of Manolito Gafotas
Authors: Owen Harrington-Fernandez, Pilar Alderete-Diez
Abstract:
Up until recently, translation studies treated children's literature as something of a marginal preoccupation, but the recent attention that this text type has attracted suggests that it may be fertile ground for research. This paper contributes to this new research avenue by applying a sociolinguistic theoretical framework to explore issues around the intersubjective co-construction of identity in the American English translation of the Spanish children's story Manolito Gafotas. The application of Bucholtz and Hall's framework achieves two objectives: (1) it identifies shifts in the translation of the main character's behaviour as culturally and morally motivated manipulations, and (2) it demonstrates how the context of translation becomes the very censorship machine that delegitimises the identity of the main character and, concomitantly, the identity of the implied reader(s). If we take identity to be an intersubjective phenomenon, then it logically follows that expurgating the identity of the main character necessarily shifts the identity of the implied reader(s) as well. It is a double censorship of identity carried out under the auspices of an intellectual colonisation of a Spanish text. After reporting on the results of the analysis, the paper ends by raising the question of censorship in translation, and, more specifically, in children's literature, in order to promote debate around this topic.Keywords: censorship, identity, sociolinguistics, translation
Procedia PDF Downloads 261951 A Hybrid Feature Selection and Deep Learning Algorithm for Cancer Disease Classification
Authors: Niousha Bagheri Khulenjani, Mohammad Saniee Abadeh
Abstract:
Learning from very big datasets is a significant problem for most present data mining and machine learning algorithms. MicroRNA (miRNA) is one of the important big genomic and non-coding datasets representing the genome sequences. In this paper, a hybrid method for the classification of miRNA data is proposed. Due to the variety of cancers and the high number of genes, analyzing the miRNA dataset has been a challenging problem for researchers. The number of features is high relative to the number of samples, and the data suffer from being imbalanced. A feature selection method is used to select the features with more ability to distinguish between classes and to eliminate obscure features. Afterward, a Convolutional Neural Network (CNN) classifier for the classification of cancer types is utilized, which employs a Genetic Algorithm to find optimized hyper-parameters of the CNN. In order to make the classification process of the CNN faster, a Graphics Processing Unit (GPU) is recommended for calculating the mathematical equations in a parallel way. The proposed method is tested on a real-world dataset with 8,129 patients, 29 different types of tumors, and 1,046 miRNA biomarkers, taken from The Cancer Genome Atlas (TCGA) database.Keywords: cancer classification, feature selection, deep learning, genetic algorithm
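The filter-style feature selection step can be sketched as follows. This is a generic illustration on synthetic data, using a t-score-like class-separation measure; it is not the authors' actual selection method, and the dimensions are far smaller than the 1,046-biomarker TCGA dataset.

```python
import numpy as np

def rank_features(X, y):
    """Score each feature by between-class mean separation over its spread
    (a simple filter-style stand-in for the paper's selection step)."""
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    spread = X.std(axis=0) + 1e-9
    return np.abs(m0 - m1) / spread

def select_top_k(X, y, k):
    """Indices of the k features with the largest separation scores."""
    return np.argsort(rank_features(X, y))[::-1][:k]

rng = np.random.default_rng(0)
n, p, informative = 60, 200, 5            # many "miRNA" features, few informative
X = rng.normal(0, 1, (n, p))
y = np.array([0] * 30 + [1] * 30)
X[y == 1, :informative] += 2.0            # shift only the informative features

chosen = select_top_k(X, y, k=5)
print(sorted(chosen.tolist()))            # should recover the informative block
```

In the hybrid method, only the retained columns would then be passed to the CNN, whose hyper-parameters a genetic algorithm tunes; the filter step here simply shows why discarding obscure features shrinks the input the network must learn from.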
Procedia PDF Downloads 111950 Dematerialized Beings in Katherine Dunn's Geek Love: A Corporeal and Ethical Study under Posthumanities
Authors: Anum Javed
Abstract:
This study identifies the dynamic image of the human body that continues its metamorphosis in the virtual field of reality. It calls attention to the ways in which humans start co-evolving with other life forms, technology in particular, and are striving to establish a realm outside the physical framework of matter. The problem exceeds the area of technological ethics by explicitly entering the space of literary texts and criticism. Textual analysis of Geek Love (1989) by Katherine Dunn is combined with the posthumanist perspectives of Pramod K. Nayar to trace psycho-somatic changes in man's nature of being. It uncovers the meaning people give to their experiences in this budding social and cultural phenomenon of material representation, tied up with personal practices and technological innovations. It also undertakes an ethical, physical and psychological reassessment of man within the context of technological evolutions. The study indicates the elements that have rendered morphological freedom and new materialism in man's consciousness. Moreover, this work inquires into what it means to be human in this time of accelerating change, where surgeries, implants, extensions, cloning and robotics have shaped a new sense of being. It attempts to go beyond the individual's body image and explores how objectifying media and culture have influenced people's judgement of others on new material grounds. It further argues for a decentring of the glorified image of man as an independent entity because of his energetic partnership with intelligent machines and external agents. The history of the future progress of technology is also mentioned. The methodology adopted is posthumanist techno-ethical textual analysis. This work necessitates a negotiating relationship between man and technology in order to achieve a harmonious and balanced interconnected existence. The study concludes by recommending a call for an ethical set of codes to be cultivated for techno-human habituation. 
Posthumanism ushers in a strong need to adopt new ethics within the terminology of neo-materialist humanism.Keywords: corporeality, dematerialism, human ethos, posthumanism
Procedia PDF Downloads 147949 Using ePortfolios to Map Social Work Graduate Competencies
Authors: Cindy Davis
Abstract:
Higher education is changing globally, and there is increasing pressure from professional social work accreditation bodies for academic programs to demonstrate how students have successfully met mandatory graduate competencies. As professional accreditation organizations increase their demand for evidence of graduate competencies, documenting and recording learning outcomes becomes increasingly challenging for academics and students. Studies in higher education have found support for the pedagogical value of ePortfolios: a flexible personal learning space that is owned by the student and includes opportunities for assessment, feedback and reflection, as well as a virtual space to store evidence of professional competencies and graduate attributes. Examples of institutional uses of ePortfolios include the e-administration of a diverse student population, the assessment of student learning, the demonstration of graduate attributes attained, and students' future career preparation. The current paper presents a case study on the introduction of ePortfolios for social work graduates in Australia as part of an institutional approach to technology-enhanced learning and e-learning. Social work graduates were required to submit an ePortfolio hosted on PebblePad. The PebblePad platform was selected because it places the student at the center of their learning whilst providing powerful tools for staff to structure, guide and assess that learning. The ePortfolio included documentation and evidence of how the student met each graduate competency as set out by the social work accreditation body in Australia (AASW). This digital resource played a key role in the process of external professional accreditation by clearly documenting and evidencing how students met the required graduate competencies. 
In addition, student feedback revealed that this resource provided students with a consolidation of their learning experiences and assisted them in obtaining employment post-graduation. There were also significant institutional factors that were key to successful implementation, such as investment in the digital technology, capacity building amongst academics, and technical support for staff and students.Keywords: accreditation, social work, teaching, technology
Procedia PDF Downloads 139948 A Simulation-Optimization Approach to Control Production, Subcontracting and Maintenance Decisions for a Deteriorating Production System
Authors: Héctor Rivera-Gómez, Eva Selene Hernández-Gress, Oscar Montaño-Arango, Jose Ramon Corona-Armenta
Abstract:
This research studies the joint production, maintenance and subcontracting control policy for an unreliable, deteriorating manufacturing system. Production activities are controlled by a derivation of the Hedging Point Policy, and since the system is subject to deterioration, it progressively loses its capacity to satisfy product demand. Multiple deterioration effects are considered, reflected mainly in the quality of the parts produced and the reliability of the machine. Subcontracting is available as support to satisfy product demand; overhaul maintenance can also be conducted to reduce the effects of deterioration. The main objective of the research is to determine simultaneously the production, maintenance and subcontracting rates that minimize the total incurred cost. A stochastic dynamic programming model is developed and solved through a simulation-based approach combining statistical analysis and optimization with the response surface methodology. The obtained results highlight the strong interactions between production, deterioration and quality, which justify the development of an integrated model. A numerical example and a sensitivity analysis are presented to validate our results.Keywords: subcontracting, optimal control, deterioration, simulation, production planning
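The response surface step can be illustrated by fitting a second-order polynomial to simulated cost observations and minimising it analytically. The cost function, candidate rates, and noise below are invented for illustration; the paper optimises over three control variables rather than the single rate shown here.

```python
import numpy as np

# Simulated total cost observed at several candidate production rates
# (a stand-in for the simulation runs in the paper).
rates = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
cost = 5.0 - 8.0 * rates + 6.0 * rates**2              # true optimum at 8/12 = 2/3
cost = cost + np.array([0.05, -0.03, 0.02, -0.04, 0.01])  # "simulation" noise

# Second-order response surface: cost ≈ b0 + b1*r + b2*r^2
A = np.column_stack([np.ones_like(rates), rates, rates**2])
b0, b1, b2 = np.linalg.lstsq(A, cost, rcond=None)[0]

r_star = -b1 / (2 * b2)   # analytic minimiser of the fitted surface
print(round(float(r_star), 2))  # close to the true optimum 2/3
```

The same idea scales up: each simulation run supplies one noisy cost observation, the quadratic surface smooths the noise, and its stationary point gives the next candidate for the joint production/maintenance/subcontracting rates.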
Procedia PDF Downloads 579947 AI and the Future of Misinformation: Opportunities and Challenges
Authors: Noor Azwa Azreen Binti Abd. Aziz, Muhamad Zaim Bin Mohd Rozi
Abstract:
Moving towards the 4th Industrial Revolution, artificial intelligence (AI) is now more popular than ever. This subject gains significance every day and is continually expanding, often merging with other fields. Rather than remaining passive observers, we benefit from understanding modern technology by delving into its inner workings. However, in a world teeming with digital information, the impact of AI on the spread of disinformation has garnered significant attention. The dissemination of inaccurate or misleading information is referred to as misinformation, and it poses a serious threat to democratic society, public debate, and individual decision-making. This article delves into the connection between AI and the dissemination of false information, exploring its potential, risks, and ethical issues as AI technology advances. The rise of AI has ushered in a new era in the dissemination of misinformation, as AI-driven technologies are increasingly responsible for curating, recommending, and amplifying information on online platforms. While AI holds the potential to enhance the detection and mitigation of misinformation through natural language processing and machine learning, it also raises concerns about the amplification and propagation of false information. AI-powered deepfake technology, for instance, can generate hyper-realistic videos and audio recordings, making it increasingly challenging to discern fact from fiction.Keywords: artificial intelligence, digital information, disinformation, ethical issues, misinformation
Procedia PDF Downloads 92946 Correlation Analysis to Quantify Learning Outcomes for Different Teaching Pedagogies
Authors: Kanika Sood, Sijie Shang
Abstract:
A fundamental goal of education is to prepare students to become part of the global workforce by making beneficial contributions to society. In this paper, we analyze student performance for multiple courses that involve different teaching pedagogies: a cooperative learning technique and an inquiry-based learning strategy. Student performance includes student engagement, grades, and attendance records. We perform this study in the Computer Science department for online and in-person courses with 450 students. We perform correlation analysis to study the relationship between student scores and other parameters, such as gender and mode of learning. We use natural language processing and machine learning to analyze student feedback data and performance data. We assess the learning outcomes of two teaching pedagogies for undergraduate and graduate courses to showcase the impact of pedagogical adoption and learning outcomes as determinants of academic achievement. Early findings suggest that when using the specified pedagogies, students become experts on their topics and demonstrate enhanced engagement with peers.Keywords: bag-of-words, cooperative learning, education, inquiry-based learning, in-person learning, natural language processing, online learning, sentiment analysis, teaching pedagogy
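The two analysis ingredients named above (correlation between performance measures, and bag-of-words sentiment on feedback text) can be sketched minimally. The scores, word lists, and feedback sentence below are invented for illustration and are not the study's data.

```python
import numpy as np

# Toy student records: engagement score vs final grade (illustrative only).
engagement = np.array([3, 5, 2, 8, 7, 6, 4, 9])
grades     = np.array([55, 70, 50, 88, 80, 75, 60, 92])
r = np.corrcoef(engagement, grades)[0, 1]
print(f"Pearson r = {r:.2f}")  # strongly positive for this toy data

# Minimal bag-of-words sentiment proxy for free-text feedback.
POSITIVE = {"engaging", "clear", "helpful", "enjoyed"}
NEGATIVE = {"confusing", "boring", "rushed"}

def sentiment(text):
    """Count positive minus negative words; crude stand-in for NLP models."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment("The group work was engaging and the examples were clear"))
```

A real analysis would also test significance, control for mode of learning, and use a trained sentiment model rather than fixed word lists, but the structure (one numeric correlation per parameter, one sentiment score per feedback item) is the same.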
Procedia PDF Downloads 77945 Performance of On-site Earthquake Early Warning Systems for Different Sensor Locations
Authors: Ting-Yu Hsu, Shyu-Yu Wu, Shieh-Kung Huang, Hung-Wei Chiang, Kung-Chun Lu, Pei-Yang Lin, Kuo-Liang Wen
Abstract:
Regional earthquake early warning (EEW) systems are not suitable for Taiwan, as most destructive seismic hazards arise from inland earthquakes. These leave the lead-time provided by regional EEW systems before a destructive earthquake wave arrives essentially null. On the other hand, an on-site EEW system can provide more lead-time in a region closer to the epicenter, since only seismic information of the target site is required. Instead of leveraging the information of several stations, the on-site system extracts P-wave features from the first few seconds of vertical ground acceleration at a single station and predicts the oncoming earthquake intensity at the same station from these features. Since seismometers can be triggered by non-earthquake events such as a passing truck or other human activities, to reduce the likelihood of false alarms, a seismometer was installed at three different locations on the same site, and the performance of the EEW system for these three sensor locations was discussed. The results show that the location on the ground of the first floor of a school building may be a good choice, since false alarms could be reduced and the cost of installation and maintenance is the lowest.Keywords: earthquake early warning, on-site, seismometer location, support vector machine
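The on-site idea of extracting P-wave features from the first few seconds of vertical acceleration can be sketched as follows. The features, thresholds, and signals are illustrative assumptions; the paper itself uses a trained support vector machine rather than the hand-set decision rule below.

```python
import numpy as np

def pwave_features(accel, fs):
    """Two features from the first 3 s of vertical acceleration:
    peak amplitude and a cumulative-absolute-velocity (CAV) proxy."""
    window = accel[: int(3 * fs)]
    peak = float(np.max(np.abs(window)))
    cav = float(np.sum(np.abs(window)) / fs)
    return peak, cav

def alert(peak, cav, peak_thresh=0.05, cav_thresh=0.02):
    """Threshold rule standing in for the trained SVM: warn only when
    both features are large, so short transients (e.g. a passing truck)
    do not trigger a false alarm."""
    return peak > peak_thresh and cav > cav_thresh

fs = 100  # Hz
t = np.arange(0, 3, 1 / fs)
quake = 0.2 * np.sin(2 * np.pi * 5 * t) * np.exp(t)  # growing P-wave onset
truck = np.zeros_like(t)
truck[50:55] = 0.2                                    # brief isolated spike

print(alert(*pwave_features(quake, fs)))  # True
print(alert(*pwave_features(truck, fs)))  # False
```

The truck spike has a large peak but almost no accumulated energy, so requiring both features suppresses it, which mirrors why sensor placement (and hence ambient noise level) matters so much for the false-alarm rate.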
Procedia PDF Downloads 244944 Application of Latent Class Analysis and Self-Organizing Maps for the Prediction of Treatment Outcomes for Chronic Fatigue Syndrome
Authors: Ben Clapperton, Daniel Stahl, Kimberley Goldsmith, Trudie Chalder
Abstract:
Chronic fatigue syndrome (CFS) is a condition characterised by chronic disabling fatigue and other symptoms that cannot currently be explained by any underlying medical condition. Although clinical trials support the effectiveness of cognitive behaviour therapy (CBT), the success rate for individual patients is modest. Patients vary in their response, and little is known about which factors predict or moderate treatment outcomes. The aim of the project is to develop a prediction model from baseline characteristics of patients, such as demographic, clinical and psychological variables, which may predict the likely treatment outcome, provide guidance for clinical decision making, and help clinicians recommend the best treatment. The project aims to identify subgroups of patients with similar baseline characteristics that are predictive of treatment effects, using modern cluster analyses and data mining machine learning algorithms. The characteristics of these groups will then be used to inform the types of individuals who benefit from a specific treatment. In addition, the results will provide a better understanding of for whom the treatment works. The suitability of different clustering methods for identifying subgroups of CFS patients and their response to different treatments is compared.Keywords: chronic fatigue syndrome, latent class analysis, prediction modelling, self-organizing maps
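A self-organizing map of the kind named in the keywords can be sketched in plain NumPy. The patient profiles below are synthetic, and the map size, learning rate, and neighbourhood schedule are illustrative choices, not the study's configuration.

```python
import numpy as np

def train_som(X, n_units=2, epochs=50, lr0=0.5, seed=1):
    """Minimal 1-D self-organizing map: each unit holds a weight vector;
    the best-matching unit (BMU) and its grid neighbours move toward
    each sample, with shrinking learning rate and neighbourhood."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0, 1, (n_units, X.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)
        sigma = max(1.0 * (1 - epoch / epochs), 0.3)
        for x in X[rng.permutation(len(X))]:
            bmu = int(np.argmin(np.linalg.norm(W - x, axis=1)))
            h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * sigma**2))
            W += lr * h[:, None] * (x - W)
    return W

def assign(W, X):
    """Map each patient profile to its nearest SOM unit (its subgroup)."""
    return np.array([int(np.argmin(np.linalg.norm(W - x, axis=1))) for x in X])

# Synthetic baseline profiles: two latent patient subgroups in 4 variables.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 4)), rng.normal(2, 0.5, (20, 4))])
labels = assign(train_som(X, n_units=2), X)
print(labels)
```

Once the map has settled, each unit's weight vector summarises one subgroup's baseline profile, and the treatment outcomes within each subgroup can then be compared, which is the moderator analysis the project describes.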
Procedia PDF Downloads 226943 Correlation of Material Mechanical Characteristics Obtained by Means of Standardized and Miniature Test Specimens
Authors: Vaclav Mentl, P. Zlabek, J. Volak
Abstract:
New methods of mechanical testing based on miniature test specimens (e.g. the Small Punch Test) have been developed recently. The most important advantage of these methods is the nearly non-destructive withdrawal of test material and the small size of the test specimen, which is valuable for remaining-lifetime assessment when a sufficient volume of representative material cannot be withdrawn from the component in question. Conversely, their most important disadvantage stems from the necessity to correlate test results with the results of standardised test procedures and to build up a database of in-service material data. The paper describes the results of fatigue tests performed on miniature test specimens in comparison with traditional fatigue tests for several steels applied in the power-producing industry. Special miniature test specimen fixtures were designed and manufactured for fatigue testing on the Zwick/Roell 10HPF5100 testing machine. The miniature test specimens were produced from the traditional test specimens. Seven different steels were fatigue loaded (R = 0.1) at room temperature.Keywords: mechanical properties, miniature test specimens, correlations, small punch test, micro-tensile test, mini-charpy impact test
Procedia PDF Downloads 538942 An Online Questionnaire Investigating UK Mothers' Experiences of Bottle Refusal by Their Breastfed Baby
Authors: Clare Maxwell, Lorna Porcellato, Valerie Fleming, Kate Fleming
Abstract:
A review of global online forums and social media reveals large numbers of mothers experiencing bottle refusal by their breastfed baby. Precise numbers are difficult to determine due to a lack of data; however, established virtual communities contain thousands of posts on the issue. Mothers report various negative consequences of bottle refusal, including delaying their return to work, time and financial outlay spent on methods to overcome it, and experiencing stress, anxiety, and resentment of breastfeeding. A search of the literature identified no existing studies, and due to this lack of epidemiological data, a study investigating mothers' experiences of bottle refusal by their breastfed baby was undertaken. The aim of the study was to investigate UK mothers' experiences of bottle refusal by their breastfed baby. Data were collected using an online questionnaire gathering quantitative and qualitative data. 841 UK mothers who had experienced or were experiencing bottle refusal by their breastfed baby completed the questionnaire. Data were analyzed using descriptive statistics and non-parametric testing. The results showed that 61% (516/840) of mothers reported their breastfed baby was still refusing or had never accepted a bottle, with 39% (324/840) reporting their baby had eventually accepted. The most frequently reported reason to introduce a bottle was so the partner/family could feed the baby, 59% (499/839). 75% (634/841) of mothers intended their baby to feed from a bottle 'occasionally'. Babies who accepted a bottle were more likely to be older at the first attempt to introduce one than babies who refused (Mdn = 12 weeks vs 8 weeks, n = 286) (p < 0.001). The median length of time to acceptance was 9 weeks (Mdn = 9, IQR = 18, R = 103.9, n = 306), with an older age at the first attempt being associated with a shorter time to acceptance (p < 0.002). 
60% (500/841) of mothers stated that none of the methods they used had worked. 26% (222/841) of mothers reported that bottle refusal had had a negative impact upon their overall breastfeeding experience. 47% (303/604) reported they would have tried to introduce a bottle earlier to prevent refusal. This study provides a unique insight into the scenario of bottle refusal by breastfed babies. It highlights that bottle refusal by breastfed babies is a significant issue which requires recognition from those communicating breastfeeding information to mothers.Keywords: bottle feeding, bottle refusal, breastfeeding, infant feeding
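The non-parametric testing reported above (comparing, for example, ages at first bottle attempt between accepters and refusers) can be illustrated with a hand-computed Mann-Whitney U statistic. The ages below are invented small samples, not the study's data.

```python
import itertools

def mann_whitney_u(a, b):
    """U statistic for group a vs group b: the number of (a_i, b_j) pairs
    with a_i < b_j, counting ties as 1/2. Values of U near 0 or near
    len(a)*len(b) indicate the two groups' ranks barely overlap."""
    u = 0.0
    for x, y in itertools.product(a, b):
        if x < y:
            u += 1.0
        elif x == y:
            u += 0.5
    return u

# Illustrative ages (weeks) at first bottle attempt.
accepted = [10, 12, 14, 12, 16, 11]   # hypothetical accepters: older at first try
refused  = [7, 8, 9, 8, 10, 6]        # hypothetical refusers: younger at first try

u = mann_whitney_u(refused, accepted)
print(u, len(refused) * len(accepted))  # U near the maximum: ranks barely overlap
```

In practice one would convert U to a p-value (exactly for small samples, or via the normal approximation for samples of the study's size), which is how comparisons such as the 12-week vs 8-week medians above are tested.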
Procedia PDF Downloads 164