Search results for: S/R machine
852 Advancement of Computer Science Research in Nigeria: A Bibliometric Analysis of the Past Three Decades
Authors: Temidayo O. Omotehinwa, David O. Oyewola, Friday J. Agbo
Abstract:
This study aims to gather a proper perspective of the development landscape of Computer Science research in Nigeria. Therefore, a bibliometric analysis of 4,333 bibliographic records of Computer Science research in Nigeria in the last 31 years (1991-2021) was carried out. The bibliographic data were extracted from the Scopus database and analyzed using VOSviewer and the bibliometrix R package through the biblioshiny web interface. The findings of this study revealed that Computer Science research in Nigeria has a growth rate of 24.19%. The most developed and well-studied research areas in the Computer Science field in Nigeria are machine learning, data mining, and deep learning. The social structure analysis result revealed that there is a need for improved international collaborations. Sparsely established collaborations are largely influenced by geographic proximity. The funding analysis result showed that Computer Science research in Nigeria is under-funded. The findings of this study will be useful for researchers conducting Computer Science-related research. Experts can gain insights into how to develop a strategic framework that will advance the field in a more impactful manner. Government agencies and policymakers can also utilize the outcome of this research to develop strategies for improved funding for Computer Science research.
Keywords: bibliometric analysis, biblioshiny, computer science, Nigeria, science mapping
Procedia PDF Downloads 112
851 Building Safety Through Real-time Design Fire Protection Systems
Authors: Mohsin Ali Shaikh, Song Weiguo, Muhammad Kashan Surahio, Usman Shahid, Rehmat Karim
Abstract:
When a disaster threatens the occupied area of a structure and personal safety is at stake, the effectiveness of disaster prevention, evacuation, and rescue operations can be summarized by three assessment indicators: personal safety, property preservation, and attribution of responsibility. These indicators apply regardless of the disaster that affects the building. People need to leave the hazardous area and reach a safe place as soon as possible, because there is no other way to respond. The outcome of a tragedy is thus closely related to how quickly people are advised to evacuate and how quickly they are rescued. This study considers present fire prevention systems with a view to addressing catastrophes and improving building safety. It proposes the methods of Prevention Level for Deployment in Advance and Spatial Transformation by Human-Machine Collaboration. We present and prototype a real-time fire protection system architecture for building disaster prevention, evacuation, and rescue operations. The design encourages the use of simulations to check the efficacy of evacuation, rescue, and disaster prevention procedures throughout the planning and design phase of the structure.
Keywords: prevention level, building information modeling, quality management system, simulated reality
Procedia PDF Downloads 70
850 Eco-Drive Predictive Analytics
Authors: Sharif Muddsair, Eisels Martin, Giesbrecht Eugenie
Abstract:
As society develops, the demand for the movement of people also increases gradually. The various modes of transport have impacts to different extents, depending mainly on technical and operating conditions. Up-to-date telematics systems offer the transport industry a revolution: their appropriate use can help to substantially improve efficiency. Vehicle monitoring and fleet tracking are among the services used for improving the efficiency and effectiveness of utility vehicles. There are many telematics systems that may contribute to eco-driving. Generally, they can be grouped according to their role in the driving cycle:
• Before driving – eco-route selection,
• While driving – advanced driver assistance,
• After driving – remote analysis.
Our point of interest lies in the third category (after driving – remote analysis). Telematics systems (TS) make it possible to record driving patterns in real time and analyze the data later on, so that driver-classification-specific hints (fast driver, slow driver, aggressive driver, etc.) can be given to encourage an eco-friendly driving style. Together with the growing number of vehicles and the development of information technology, telematics has become an active research subject in IT and the car industry. Telematics has gone a long way from providing navigation solutions and assisting the driver to becoming an integral part of the vehicle. Today’s telematics ensure the safety, comfort, and convenience of the driver.
Keywords: internet of things, IoT, connected vehicle, CV, TS, telematics services, ML, machine learning
Procedia PDF Downloads 307
849 Image Inpainting Model with Small-Sample Size Based on Generative Adversary Network and Genetic Algorithm
Authors: Jiawen Wang, Qijun Chen
Abstract:
The performance of most machine-learning methods for image inpainting depends on the quantity and quality of the training samples. However, it is very expensive or even impossible to obtain a great number of training samples in many scenarios. In this paper, an image inpainting model based on a generative adversarial network (GAN) is constructed for cases in which the number of training samples is small. Firstly, a feature extraction network (F-net) is incorporated into the GAN to utilize the available information of the inpainting image. The weighted sum of the extracted feature and random noise acts as the input to the generative network (G-net). The proposed network can be trained well even when the sample size is very small. Secondly, in the completion phase for each damaged image, a genetic algorithm is designed to search for an optimized noise input for the G-net; based on this optimized input, the parameters of the G-net and F-net are further learned (once the completion of a given damaged image ends, the parameters revert to the original values obtained in the training phase) to generate an image patch that not only fills the missing part of the damaged image smoothly but also has visual semantics.
Keywords: image inpainting, generative adversarial nets, genetic algorithm, small-sample size
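The noise-search step described in this abstract can be sketched as a simple genetic algorithm. The sketch below is illustrative only: the quadratic toy loss stands in for the G-net reconstruction objective, and the function name `evolve_noise` and all hyperparameters (population size, mutation rate, generations) are assumptions, not the authors' implementation.

```python
import random

def evolve_noise(fitness, dim=8, pop_size=20, generations=60,
                 mutation_rate=0.2, seed=0):
    """Search for a noise vector minimizing `fitness` with a simple GA."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                 # best (lowest loss) first
        parents = pop[: pop_size // 2]        # truncation selection, elitist
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)       # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(dim):              # Gaussian mutation
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0, 0.1)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# Toy stand-in for the G-net loss: distance to a fixed "ideal" latent code.
target = [0.5] * 8
loss = lambda z: sum((zi - ti) ** 2 for zi, ti in zip(z, target))
best = evolve_noise(loss)
```

Keeping the parent half of each generation makes the search elitist, so the best candidate's loss never increases between generations.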
Procedia PDF Downloads 130
848 Simulation of Particle Damping in Boring Tool Using Combined Particles
Authors: S. Chockalingam, U. Natarajan, D. M. Santhoshsarang
Abstract:
Particle damping is a more promising vibration-attenuation technique for boring tools than other types of damping, as it has minimal effect on the strength, rigidity, and stiffness ratio of the machine tool structure. Due to the cantilevered nature of the boring tool holder in operation, the tool suffers chatter as its slenderness ratio increases. In this study, copper-stainless steel (SS) particles were packed inside the boring tool to act as a damper. The damper suppresses chatter generated during machining and also improves the machining efficiency of the tool at a better slenderness ratio. In the first approach to particle damping, combined Cu-SS particles were packed inside the vibrating tool, whereas copper and stainless steel particles were selected separately and packed inside another tool, and their effectiveness was analysed in this simulation. This study evaluates, by finite element simulation, the efficiency of boring tools equipped with particles such as copper, stainless steel, and a combination of both. The newly modified boring tool holder with particle damping was simulated using ANSYS 12.0 with and without particles. The aim of this study is to enhance structural rigidity through particle damping, thus avoiding the occurrence of resonance in the boring tool during machining.
Keywords: boring bar, copper-stainless steel, chatter, particle damping
Procedia PDF Downloads 461
847 Supervisor Controller-Based Colored Petri Nets for Deadlock Control and Machine Failures in Automated Manufacturing Systems
Authors: Husam Kaid, Abdulrahman Al-Ahmari, Zhiwu Li
Abstract:
This paper develops a robust deadlock control technique for shared and unreliable resources in automated manufacturing systems (AMSs) based on structural analysis and colored Petri nets, which consists of three steps. The first step involves using strict minimal siphon control to create a live (deadlock-free) system that does not consider resource failure. The second step uses an approach based on colored Petri nets, in which all monitors designed in the first step are merged into a single monitor. The third step addresses the deadlock control problems caused by resource failures. For all resource failures in the Petri net model, a common recovery subnet based on colored Petri nets is proposed. The common recovery subnet is added to the system obtained in the second step to make the system reliable. The proposed approach is evaluated using an AMS from the literature. The results show that the proposed approach can be applied to an unreliable complex Petri net model, has a simpler structure and less computational complexity, and can obtain one common recovery subnet to model all resource failures.
Keywords: automated manufacturing system, colored Petri net, deadlocks, siphon
Procedia PDF Downloads 129
846 Exploratory Study of the Influencing Factors for Hotels' Competitors
Authors: Asma Ameur, Dhafer Malouche
Abstract:
Research on hotel competitiveness is an essential phase of the marketing strategy for any hotel. Certainly, knowing a hotel's competitors helps the hotelier grasp its position in the market and helps customers make the right choice when picking a hotel. Thus, competitiveness is an important indicator that can be influenced by various factors. In fact, competitiveness, the ability to cope with competition, remains a difficult and complex concept to define and to exploit. Therefore, the purpose of this article is to carry out an exploratory study to calculate a competitiveness indicator for hotels. Further on, this paper makes it possible to determine the criteria with a direct or indirect effect on the image and perception of a hotel. The present research looks for the right model of hotel competitiveness. For this reason, we exploit different theoretical contributions in the field of machine learning. Thus, we use statistical techniques such as Principal Component Analysis (PCA) to reduce the dimensions, as well as other techniques of statistical modeling. This paper presents a survey covering the techniques and methods in hotel competitiveness research. Furthermore, this study allows us to deduce the significant variables that influence the determination of a hotel's competitors. Lastly, the experiments discussed in this article found that a hotel's competitors are influenced by several factors with different rates.
Keywords: competitiveness, e-reputation, hotels' competitors, online hotel review, principal component analysis, statistical modeling
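The dimensionality-reduction role that PCA plays here can be illustrated on two correlated indicator features. The sketch below, a minimal stdlib-only example with made-up "hotel indicator" scores (not data from the study), eigen-decomposes the 2-D sample covariance matrix to obtain the principal axes and the variance share of the first component.

```python
import math

def pca_2d(points):
    """Principal-axis variances of 2-D data via the sample covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # Sample covariance matrix entries.
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]] from the characteristic polynomial.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc     # l1 >= l2
    explained = l1 / (l1 + l2)                # variance share of 1st component
    return l1, l2, explained

# Two correlated, hypothetical "hotel indicator" features.
data = [(1, 2), (2, 4.1), (3, 5.9), (4, 8.2), (5, 10)]
l1, l2, explained = pca_2d(data)
```

When `explained` is close to 1, the two indicators carry nearly the same information and a single principal component can replace them, which is the motivation for using PCA before the statistical modeling step.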
Procedia PDF Downloads 119
845 IoT and Advanced Analytics Integration in Biogas Modelling
Authors: Rakesh Choudhary, Ajay Kumar, Deepak Sharma
Abstract:
The main goal of this paper is to investigate the challenges and benefits of IoT integration in biogas production. This overview explains how the inclusion of IoT can enhance biogas production efficiency. The collected data can be explored with advanced analytics, including artificial intelligence (AI) and machine learning (ML) algorithms, consequently improving bio-energy processes. To boost biogas generation efficiency, this report examines the use of IoT devices for real-time data collection on key parameters, e.g., pH, temperature, gas composition, and microbial growth. Real-time monitoring through big data has made it possible to detect diverse, complex trends in the biogas production process. Insights informed by advanced analytics can also help improve bio-energy production and optimize operational conditions. Moreover, IoT allows remote observation, control, and management, which decreases the manual intervention needed while increasing process effectiveness. Such a paradigm shift in the incorporation of IoT technologies into biogas production systems helps to achieve higher productivity levels and higher-quality biomethane through proactive decision-making based on real-time monitoring, thus driving continuous performance improvement.
Keywords: internet of things, biogas, renewable energy, sustainability, anaerobic digestion, real-time monitoring, optimization
Procedia PDF Downloads 21
844 Violence Detection and Tracking on Moving Surveillance Video Using Machine Learning Approach
Authors: Abe Degale D., Cheng Jian
Abstract:
When creating automated video surveillance systems, violent action recognition is crucial. In recent years, hand-crafted feature detectors have been the primary method for achieving violence detection, such as the recognition of fighting activity. Researchers have also looked into learning-based representational models. On benchmark datasets created especially for the detection of violent sequences in sports and movies, these methods produced good accuracy results. The Hockey dataset's videos with surveillance camera motion present challenges for these algorithms in learning discriminating features. Deep representation-based methods have shown success in image recognition and human activity detection challenges. For the purpose of detecting violent images and identifying aggressive human behaviours, this research proposes a deep representation-based model using the transfer learning idea. The results show that the suggested approach outperforms state-of-the-art accuracy levels by learning the most discriminating features, attaining 99.34% and 99.98% accuracy on the Hockey and Movies datasets, respectively.
Keywords: violence detection, faster RCNN, transfer learning, surveillance video
Procedia PDF Downloads 110
843 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date
Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian
Abstract:
To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) to each patient admitted to the hospital. This assignment of the EDD is largely based on the doctor's judgment, which can be difficult for cases that are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would help doctors make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the Length of Stay (and hence the EDD) of inpatients at the point of admission. The predictive performance of the model was compared to that of the clinicians using accuracy measures. Overall, the best-performing model was found to predict the EDD with a 38% reduction in Average Squared Error (ASE) compared to the first EDD determined by the present method. Important predictors of the EDD include the provisional diagnosis code, the patient's age, the attending doctor at admission, the medical specialty at admission, the accommodation type, and the mean length of stay of the patient in the past year. The predictive model can be used as a tool to accurately predict the EDD.
Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven
Procedia PDF Downloads 174
842 Control Flow around NACA 4415 Airfoil Using Slot and Injection
Authors: Imine Zakaria, Meftah Sidi Mohamed El Amine
Abstract:
One of the most vital aerodynamic organs of a flying machine is the wing, which allows it to fly in the air efficiently. The flow around the wing is very sensitive to changes in the angle of attack. Beyond a certain value, the boundary layer separates on the upper surface, causing instability and a total degradation of aerodynamic performance known as stall. Controlling the flow around an airfoil has therefore become a major concern of researchers in the aeronautics field. There are two techniques for controlling the flow around a wing to improve its aerodynamic performance: passive and active control. Blowing and suction are among the active techniques that control boundary layer separation around an airfoil. Their objective is to give energy to the air particles in the boundary layer separation zones and to create vortex structures that homogenize the velocity near the wall and allow control. Blowing and suction have long been used as flow control actuators around obstacles; in 1904, Prandtl applied permanent blowing to a cylinder to delay boundary layer separation. In the present study, several numerical investigations have been developed to predict turbulent flow around an aerodynamic profile. A CFD code was used for several angles of attack in order to validate the present work against the literature in the case of a clean profile. The variation of the lift coefficient CL with the momentum coefficient is also examined.
Keywords: CFD, control flow, lift, slot
Procedia PDF Downloads 200
841 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic
Authors: Fei Gao, Rodolfo C. Raga Jr.
Abstract:
This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project aims to improve early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. The phase relation values of each attribute were used to analyze and choose the attributes that might influence the examiner's survival probability, using the Diabetes Health Indicators Dataset from Kaggle as the research data. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle
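The cross-validation process mentioned above can be sketched in a few lines: shuffle the sample indices, partition them into k folds, and rotate which fold is held out for testing. The sketch below is a stdlib-only illustration with a toy majority-class baseline standing in for the eight real algorithms; all names and the synthetic 70/30 label split are assumptions.

```python
import random

def k_fold_indices(n_samples, k=5, seed=42):
    """Split sample indices into k shuffled, near-equal folds."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(records, labels, train_fn, eval_fn, k=5):
    """Mean held-out score over k train/test rotations."""
    folds = k_fold_indices(len(records), k)
    scores = []
    for i, test_idx in enumerate(folds):
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        model = train_fn([records[j] for j in train_idx],
                         [labels[j] for j in train_idx])
        scores.append(eval_fn(model, [records[j] for j in test_idx],
                              [labels[j] for j in test_idx]))
    return sum(scores) / k

# Toy classifier: always predict the majority class seen in training.
def train_majority(xs, ys):
    return max(set(ys), key=ys.count)

def accuracy(model, xs, ys):
    return sum(1 for y in ys if y == model) / len(ys)

data = list(range(100))
labels = [1] * 70 + [0] * 30          # synthetic 70% majority class
mean_acc = cross_validate(data, labels, train_majority, accuracy)
```

Because every sample is held out exactly once, the mean accuracy of the majority baseline converges to the majority-class prevalence (0.7 here), which is the floor any real classifier has to beat.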
Procedia PDF Downloads 77
840 Advances in Artificial Intelligence Using Speech Recognition
Authors: Khaled M. Alhawiti
Abstract:
This research study aims to present a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. Precisely, it can be affirmed that speech recognition facilitates its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research intends to present an illustration of recent technological advancements associated with artificial intelligence. Recent research has revealed that decoding of speech remains the foremost issue in speech recognition. In order to overcome these issues, different statistical models were developed by researchers. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). The research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic phonetics, and artificial intelligence. Artificial intelligence has been recognized as the most efficient and reliable method used in speech recognition.
Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance
Procedia PDF Downloads 478
839 The Reflection Framework to Enhance the User Experience for Cultural Heritage Spaces’ Websites in Post-Pandemic Times
Authors: Duyen Lam, Thuong Hoang, Atul Sajjanhar, Feifei Chen
Abstract:
With emerging interactive technology applications helping users connect with cultural artefacts in progressively new ways, the cultural heritage sector gains significantly. Issues with interactive apps can be tested via several techniques, including usability surveys and usability evaluations. The severe usability problems of museums' interactive technologies commonly involve interaction, control, and navigation processes. This study confirms that audio guides offer a poorly immersive experience for navigating an exhibition and engaging with a virtual environment, which are the most vital features of new interactive technologies such as AR and VR. In addition, our usability surveys and heuristic evaluations disclosed many usability issues of these interactive technologies relating to interaction functions. Additionally, we used the Wayback Machine to examine which interactive apps/technologies were deployed on these websites while physical visits were limited by the COVID-19 pandemic lockdowns. Based on these inputs, we propose the Reflection framework to enhance the UX in the cultural heritage domain with detailed guidelines.
Keywords: framework, user experience, cultural heritage, interactive technology, museum, COVID-19 pandemic, usability survey, heuristic evaluation, guidelines
Procedia PDF Downloads 69
838 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction
Authors: Priyadarsini Samal, Rajesh Singla
Abstract:
Many mobile games provide entertainment while also introducing stress to the human brain. In recognizing this mental stress, the brain-computer interface (BCI) plays an important role. It offers various neuroimaging approaches that help in analyzing brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the patterns in brain signals when mental stress is introduced. Two healthy volunteers played a game whose aim was to search for hidden words in a grid, with the levels chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the levels changed from easy to medium to hard. A total of 16 EEG features were analyzed for this experiment, including power band features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, another game, similar in nature, was played by the volunteers. A suitable regression model was designed for prediction, in which the feature sets of the first and second games were used for testing and training purposes, respectively, and an accuracy of 73% was found.
Keywords: brain computer interface, electroencephalogram, regression model, stress, word search
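Relative band power, one of the feature families mentioned above, can be sketched without any signal-processing library: sum the squared DFT magnitudes of the bins falling inside a frequency band. The example below is a minimal, stdlib-only sketch on a synthetic one-second trace with a strong 10 Hz (alpha) and a weak 20 Hz (beta) component; the signal, sampling rate, and band edges are illustrative assumptions, not the paper's data.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power of `signal` in [f_lo, f_hi) Hz via a naive O(n^2) DFT."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            X = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            power += abs(X) ** 2
    return power

# Synthetic 1-second "EEG" trace: strong 10 Hz (alpha) plus weak 20 Hz (beta).
fs, n = 128, 128
sig = [math.sin(2 * math.pi * 10 * t / fs)
       + 0.3 * math.sin(2 * math.pi * 20 * t / fs)
       for t in range(n)]

alpha = band_power(sig, fs, 8, 13)     # alpha band
beta = band_power(sig, fs, 13, 30)     # beta band
rel_alpha = alpha / (alpha + beta)     # relative alpha power
```

A real pipeline would use an FFT and Welch averaging instead of this quadratic DFT, but the resulting relative-power feature fed to the SVM is the same quantity.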
Procedia PDF Downloads 188
837 Electrocardiogram-Based Heartbeat Classification Using Convolutional Neural Networks
Authors: Jacqueline Rose T. Alipo-on, Francesca Isabelle F. Escobar, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar Al Dahoul
Abstract:
Electrocardiogram (ECG) signal analysis and processing are crucial in the diagnosis of cardiovascular diseases, which are considered one of the leading causes of mortality worldwide. However, the traditional rule-based analysis of large volumes of ECG data is time-consuming, labor-intensive, and prone to human error. With the advancement of the programming paradigm, techniques such as machine learning have been increasingly used to analyze ECG signals. In this paper, various deep learning algorithms were adapted to classify five classes of heartbeat types. The dataset used in this work is the synthetic MIT-BIH Arrhythmia dataset produced with generative adversarial networks (GANs). Various deep learning models, such as the ResNet-50 convolutional neural network (CNN), a 1-D CNN, and long short-term memory (LSTM), were evaluated and compared. ResNet-50 was found to outperform the other models in terms of recall and F1 score, with five-fold average scores of 98.88% and 98.87%, respectively. The 1-D CNN, on the other hand, was found to have the highest average precision of 98.93%.
Keywords: heartbeat classification, convolutional neural network, electrocardiogram signals, generative adversarial networks, long short-term memory, ResNet-50
Procedia PDF Downloads 130
836 Lightweight Hybrid Convolutional and Recurrent Neural Networks for Wearable Sensor Based Human Activity Recognition
Authors: Sonia Perez-Gamboa, Qingquan Sun, Yan Zhang
Abstract:
Non-intrusive sensor-based human activity recognition (HAR) is utilized in a spectrum of applications, including fitness tracking devices, gaming, health care monitoring, and smartphone applications. Deep learning models such as convolutional neural networks (CNNs) and long short-term memory (LSTM) recurrent neural networks (RNNs) provide a way to achieve HAR accurately and effectively. In this paper, we design a multi-layer hybrid architecture with CNN and LSTM and explore a variety of multi-layer combinations. Based on this exploration, we present a lightweight, hybrid, multi-layer model that improves recognition performance by integrating local, scale-invariant features with the temporal dependencies of activities. The experimental results demonstrate the efficacy of the proposed model, which achieves a 94.7% activity recognition rate on a benchmark human activity dataset. This model outperforms traditional machine learning and other deep learning methods. Additionally, our implementation achieves a balance between recognition rate and training time consumption.
Keywords: deep learning, LSTM, CNN, human activity recognition, inertial sensor
Procedia PDF Downloads 152
835 The Proposal for a Framework to Face Opacity and Discrimination ‘Sins’ Caused by Consumer Creditworthiness Machines in the EU
Authors: Diogo José Morgado Rebelo, Francisco António Carneiro Pacheco de Andrade, Paulo Jorge Freitas de Oliveira Novais
Abstract:
Not everything in AI-powered consumer credit scoring turns out to be a wonder. When using AI in Creditworthiness Assessment (CWA), the 'sins' of opacity and unfairness must be addressed for the task to be deemed responsible. AI software is not always 100% accurate, which can lead to misclassification. Discrimination against some groups can be exponentiated, and a hetero-personalized identity can be imposed on the individual(s) affected. Autonomous CWA also sometimes lacks transparency when black box models are used. However, for this intended purpose, human analysts 'on-the-loop' might not be the best remedy consumers are looking for in credit. This study seeks to explore the legality of implementing a Multi-Agent System (MAS) framework in consumer CWA to ensure compliance with the regulation outlined in Article 14(4) of the Proposal for an Artificial Intelligence Act (AIA), dated 21 April 2021 (as per the last corrigendum by the European Parliament on 19 April 2024). Especially with the adoption of Art. 18(8)(9) of the EU Directive 2023/2225, of 18 October, which will go into effect on 20 November 2026, there should be more emphasis on the need for hybrid oversight in AI-driven scoring to ensure fairness and transparency. In fact, the range of EU regulations on AI-based consumer credit will soon impact the AI lending industry locally and globally, as shown by the broad territorial scope of the AIA's Art. 2. Consequently, engineering the law of consumers' CWA is imperative. Generally, the proposed MAS framework consists of several layers arranged in a specific sequence, as follows: firstly, the Data Layer gathers legitimate predictor sets from traditional sources; then, the Decision Support System Layer, whose neural network model is trained using k-fold cross-validation, provides recommendations based on the feeder data; the eXplainability (XAI) multi-structure comprises Three-Step-Agents; and, lastly, the Oversight Layer has a 'Bottom Stop' for analysts to intervene in a timely manner.
From the analysis, a vital component of this software is the XAI layer. It appears as a transparent curtain covering the AI's decision-making process, enabling comprehension, reflection, and further feasible oversight. Local Interpretable Model-agnostic Explanations (LIME) might act as a pillar by offering counterfactual insights. SHapley Additive exPlanations (SHAP), another agent in the XAI layer, could address potential discrimination issues by identifying the contribution of each feature to the prediction. Alternatively, for thin-file or no-file consumers, the Suggestion Agent can promote financial inclusion. It uses lawful alternative sources, such as share of wallet, among others, to search for more advantageous solutions to incomplete evaluation appraisals, based on genetic programming. Overall, this research aspires to bring the concept of Machine-Centered Anthropocentrism to the table of EU policymaking. It acknowledges that, when put into service, credit analysts no longer exert full control over the data-driven entities programmers have given 'birth' to. With similar explanatory agents under supervision, AI itself can become self-accountable, prioritizing human concerns and values. AI decisions should not be vilified inherently; the issue lies in how they are integrated into decision-making and whether they align with non-discrimination principles and transparency rules.
Keywords: creditworthiness assessment, hybrid oversight, machine-centered anthropocentrism, EU policymaking
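The feature-contribution idea behind the SHAP agent rests on Shapley values: average the marginal contribution of each feature over all orderings in which features are revealed. The sketch below computes them exactly for a tiny, hypothetical linear credit score (the weights, feature names, and zero baseline are illustrative assumptions, not the SHAP library or the authors' model).

```python
from itertools import permutations

def shapley_values(features, predict):
    """Exact Shapley values: average marginal contribution of each feature
    over all feature orderings, with absent features zeroed out (a common
    baseline choice)."""
    names = list(features)
    contrib = {f: 0.0 for f in names}
    orders = list(permutations(names))
    for order in orders:
        present = {f: 0.0 for f in names}   # baseline: feature absent -> 0
        prev = predict(present)
        for f in order:
            present[f] = features[f]        # reveal the feature's true value
            cur = predict(present)
            contrib[f] += cur - prev        # its marginal contribution here
            prev = cur
    return {f: v / len(orders) for f, v in contrib.items()}

# Hypothetical linear credit score; weights are illustrative only.
weights = {"income": 0.6, "debt_ratio": -0.3, "age": 0.1}
score = lambda x: sum(weights[f] * x[f] for f in weights)

applicant = {"income": 1.0, "debt_ratio": 0.5, "age": 0.2}
phi = shapley_values(applicant, score)
```

For a linear model the Shapley value of each feature reduces to weight times value, and the contributions sum exactly to the prediction minus the baseline, which is what makes the decomposition auditable for discrimination checks. Real SHAP implementations approximate this average, since exact enumeration is exponential in the number of features.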
Procedia PDF Downloads 36
834 Difference between 'HDR Ir-192 and Co-60 Sources' for High Dose Rate Brachytherapy Machine
Authors: Md Serajul Islam
Abstract:
High Dose Rate (HDR) brachytherapy is used to treat cancer patients. In our country, HDR is currently used only for the treatment of cervical and breast cancer. The air kerma rate in air at a reference distance of less than a meter from the source is the recommended quantity for the specification of the gamma-ray source Ir-192 in brachytherapy. The absorbed dose for the patient is directly proportional to the air kerma rate. Therefore, the air kerma rate should be determined before the first use of the source on patients by a qualified medical physicist who is independent of the source manufacturer. The air kerma rate is then applied in the calculation of the dose delivered to patients in their planning systems. In practice, high dose rate (HDR) Ir-192 afterloading machines are mostly used in brachytherapy treatment. Currently, HDR Co-60 machines increasingly come into operation as well. The essential advantage of Co-60 sources is their longer half-life compared to Ir-192, which also makes HDR Co-60 afterloading machines quite interesting for developing countries. This work describes the dosimetry of HDR afterloading machines according to the protocol IAEA-TECDOC-1274 (2002) with the nuclides Ir-192 and Co-60. We used three different measurement methods (with a ring chamber, with a solid phantom and in free air, and with a well chamber) depending on the protocol. We show that the standard deviations of the measured air kerma rate for the Co-60 source are generally larger than those of the Ir-192 source. The measurements with the well chamber had the lowest deviation from the certificate value. Across all protocols and methods, the deviations for both nuclides were at most about 1% for Ir-192 and 2.5% for Co-60 sources.
Keywords: Ir-192 source, cancer, patients, cheap treatment cost
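The operational advantage of Co-60's longer half-life can be quantified with the standard exponential-decay law. The short sketch below compares the fraction of the initial air kerma rate remaining after 90 days; the half-lives used (Ir-192 ≈ 73.8 days, Co-60 ≈ 5.27 years) are standard nuclide data, while the 90-day interval is just an illustrative choice.

```python
import math

def remaining_fraction(half_life_days, elapsed_days):
    """Fraction of the initial air kerma rate left after exponential decay."""
    return math.exp(-math.log(2) * elapsed_days / half_life_days)

HALF_LIFE_DAYS = {"Ir-192": 73.8, "Co-60": 5.27 * 365.25}

# After a 90-day interval:
ir_left = remaining_fraction(HALF_LIFE_DAYS["Ir-192"], 90)
co_left = remaining_fraction(HALF_LIFE_DAYS["Co-60"], 90)
```

Over the same 90 days an Ir-192 source loses more than half its output while a Co-60 source loses only a few percent, which is why Ir-192 sources must be exchanged several times a year whereas a Co-60 source can serve for many years, an economic argument for developing countries.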
Procedia PDF Downloads 239833 Bhumastra “Unmanned Ground Vehicle”
Authors: Vivek Krishna, Nikhil Jain, A. Mary Posonia A., Albert Mayan J
Abstract:
Terrorism and insurgency are significant global issues that require constant attention and effort from governments and scientists worldwide. To combat these threats, nations invest billions of dollars in developing new defensive technologies to protect civilians. Breakthroughs in vehicle automation have enabled the use of sophisticated machines for many dangerous and critical anti-terrorist activities. Our concept of an "Unmanned Ground Vehicle" can carry out tasks such as border security, surveillance, mine detection, and active combat, either independently or in tandem with human control. The robot's movement can be wirelessly controlled by a person in a distant location, or the vehicle can travel autonomously to a pre-programmed destination in situations where personal control is not feasible. Our defence system comprises two units: a control unit that regulates mobility and a motion-tracking unit. A remote operator uses the camera's live visual feed to operate both units manually, and the rover can automatically detect movement. The rover is driven by an operator using a joystick or mouse, and a wireless modem enables a soldier in a combat zone to control the rover via an additional controller feature.Keywords: robotics, computer vision, machine learning, artificial intelligence, future of AI
Procedia PDF Downloads 126832 A Sociolinguistic Approach to the Translation of Children’s Literature: Exploring Identity Issues in the American English Translation of Manolito Gafotas
Authors: Owen Harrington-Fernandez, Pilar Alderete-Diez
Abstract:
Up until recently, translation studies treated children’s literature as something of a marginal preoccupation, but the recent attention that this text type has attracted suggests that it may be fertile ground for research. This paper contributes to this new research avenue by applying a sociolinguistic theoretical framework to explore issues around the intersubjective co-construction of identity in the American English translation of the Spanish children’s story, Manolito Gafotas. The application of Bucholtz and Hall’s framework achieves two objectives: (1) it identifies shifts in the translation of the main character’s behaviour as culturally and morally motivated manipulations, and (2) it demonstrates how the context of translation becomes the very censorship machine that delegitimises the identity of the main character and, concomitantly, the identity of the implied reader(s). If we take identity to be an intersubjective phenomenon, then it logically follows that expurgating the identity of the main character necessarily shifts the identity of the implied reader(s) as well. It is a double censorship of identity carried out under the auspices of an intellectual colonisation of a Spanish text. After reporting on the results of the analysis, the paper ends by raising the question of censorship in translation, and, more specifically, in children’s literature, in order to promote debate around this topic.Keywords: censorship, identity, sociolinguistics, translation
Procedia PDF Downloads 261831 A Hybrid Feature Selection and Deep Learning Algorithm for Cancer Disease Classification
Authors: Niousha Bagheri Khulenjani, Mohammad Saniee Abadeh
Abstract:
Learning from very large datasets is a significant problem for most present data mining and machine learning algorithms. MicroRNA (miRNA) data form one of the important large genomic, non-coding datasets representing the genome sequences. In this paper, a hybrid method for the classification of miRNA data is proposed. Due to the variety of cancers and the high number of genes, analyzing the miRNA dataset has been a challenging problem for researchers. The number of features is high relative to the number of samples, and the data suffer from class imbalance. A feature selection method is used to select the features most able to distinguish between classes and to eliminate obscuring features. Afterward, a Convolutional Neural Network (CNN) classifier for the classification of cancer types is utilized, which employs a Genetic Algorithm to identify optimized hyper-parameters of the CNN. To make the classification by the CNN faster, a Graphics Processing Unit (GPU) is recommended for performing the mathematical computations in parallel. The proposed method is tested on a real-world dataset with 8,129 patients, 29 different types of tumors, and 1,046 miRNA biomarkers, taken from The Cancer Genome Atlas (TCGA) database.Keywords: cancer classification, feature selection, deep learning, genetic algorithm
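The Genetic Algorithm step of such a pipeline can be sketched as below. The abstract does not give the actual search space or fitness, so the hyper-parameter grid and the stand-in fitness function are illustrative assumptions only; in practice the fitness would be the CNN's cross-validated accuracy:

```python
import random

# Illustrative CNN hyper-parameter grid (not the paper's actual space).
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "filters": [16, 32, 64],
    "kernel_size": [3, 5, 7],
}

def random_individual(rng):
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def fitness(ind):
    # Stand-in for CNN validation accuracy; peaks at lr=1e-3,
    # 32 filters, kernel size 5 (an arbitrary choice for the demo).
    return (-abs(ind["learning_rate"] - 1e-3) * 100
            - abs(ind["filters"] - 32) / 32
            - abs(ind["kernel_size"] - 5) / 5)

def evolve(generations=30, pop_size=12, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # uniform crossover: take each gene from either parent
            child = {k: rng.choice([a[k], b[k]]) for k in SEARCH_SPACE}
            if rng.random() < 0.2:               # point mutation
                k = rng.choice(list(SEARCH_SPACE))
                child[k] = rng.choice(SEARCH_SPACE[k])
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because parents are carried over unchanged, the best configuration found so far is never lost between generations.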
Procedia PDF Downloads 112830 A Simulation-Optimization Approach to Control Production, Subcontracting and Maintenance Decisions for a Deteriorating Production System
Authors: Héctor Rivera-Gómez, Eva Selene Hernández-Gress, Oscar Montaño-Arango, Jose Ramon Corona-Armenta
Abstract:
This research studies the joint production, maintenance and subcontracting control policy for an unreliable, deteriorating manufacturing system. Production activities are controlled by a variant of the Hedging Point Policy; because the system is subject to deterioration, it progressively loses its capacity to satisfy product demand. Multiple deterioration effects are considered, reflected mainly in the quality of the parts produced and in the reliability of the machine. Subcontracting is available as support to satisfy product demand, and overhaul maintenance can be conducted to reduce the effects of deterioration. The main objective of the research is to determine simultaneously the production, maintenance and subcontracting rates that minimize the total incurred cost. A stochastic dynamic programming model is developed and solved through a simulation-based approach combining statistical analysis and optimization with the response surface methodology. The obtained results highlight the strong interactions between production, deterioration and quality, which justify the development of an integrated model. A numerical example and a sensitivity analysis are presented to validate the results.Keywords: subcontracting, optimal control, deterioration, simulation, production planning
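The base production rule of a classic Hedging Point Policy, which the paper extends with subcontracting and overhaul decisions, can be sketched as follows (a minimal illustration, not the authors' derived variant):

```python
def hedging_point_production_rate(inventory, hedging_level, u_max, demand_rate):
    """Classic hedging point control: run at maximum rate below the
    hedging level, match demand at the level, and stop above it."""
    if inventory < hedging_level:
        return u_max          # build up the safety stock
    if inventory == hedging_level:
        return demand_rate    # hold the stock steady
    return 0.0                # let excess inventory drain
```

In the deteriorating-system setting, the hedging level itself typically becomes a function of the machine's deterioration state.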
Procedia PDF Downloads 580829 AI and the Future of Misinformation: Opportunities and Challenges
Authors: Noor Azwa Azreen Binti Abd. Aziz, Muhamad Zaim Bin Mohd Rozi
Abstract:
Moving towards the 4th Industrial Revolution, artificial intelligence (AI) is now more popular than ever. The subject gains significance every day and continually expands, often merging with other fields. Rather than remaining passive observers, we benefit from understanding modern technology by delving into its inner workings. In a world teeming with digital information, however, the impact of AI on the spread of disinformation has garnered significant attention. Misinformation, the dissemination of inaccurate or misleading information, poses a serious threat to democratic society, public debate, and individual decision-making. This article examines the connection between AI and the dissemination of false information, exploring its potential, risks, and ethical issues as AI technology advances. The rise of AI has ushered in a new era in the dissemination of misinformation, as AI-driven technologies are increasingly responsible for curating, recommending, and amplifying information on online platforms. While AI holds the potential to enhance the detection and mitigation of misinformation through natural language processing and machine learning, it also raises concerns about the amplification and propagation of false information. AI-powered deepfake technology, for instance, can generate hyper-realistic videos and audio recordings, making it increasingly challenging to discern fact from fiction.Keywords: artificial intelligence, digital information, disinformation, ethical issues, misinformation
Procedia PDF Downloads 95828 Correlation Analysis to Quantify Learning Outcomes for Different Teaching Pedagogies
Authors: Kanika Sood, Sijie Shang
Abstract:
A fundamental goal of education is to prepare students to join the global workforce and make beneficial contributions to society. In this paper, we analyze student performance for multiple courses that involve different teaching pedagogies: a cooperative learning technique and an inquiry-based learning strategy. Student performance includes student engagement, grades, and attendance records. We perform this study in the Computer Science department for online and in-person courses with 450 students. We perform correlation analysis to study the relationship between student scores and other parameters, such as gender and mode of learning. We use natural language processing and machine learning to analyze student feedback data and performance data. We assess the learning outcomes of the two teaching pedagogies for undergraduate and graduate courses to showcase pedagogical adoption and learning outcome as determinants of academic achievement. Early findings suggest that under these pedagogies, students become experts on their topics and show enhanced engagement with peers.Keywords: bag-of-words, cooperative learning, education, inquiry-based learning, in-person learning, natural language processing, online learning, sentiment analysis, teaching pedagogy
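The core of such a correlation analysis is the sample Pearson coefficient between a score and another numeric attribute. A minimal sketch with hypothetical attendance-versus-grade data (not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length
    numeric sequences, in [-1, 1]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical attendance (%) vs. final grade for six students:
attendance = [60, 70, 80, 85, 90, 95]
grades     = [55, 62, 71, 75, 84, 88]
r = pearson_r(attendance, grades)
```

A value of `r` close to 1 would indicate that higher attendance tracks higher grades in this sample; significance testing would still be needed before drawing conclusions.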
Procedia PDF Downloads 77827 Performance of On-site Earthquake Early Warning Systems for Different Sensor Locations
Authors: Ting-Yu Hsu, Shyu-Yu Wu, Shieh-Kung Huang, Hung-Wei Chiang, Kung-Chun Lu, Pei-Yang Lin, Kuo-Liang Wen
Abstract:
Regional earthquake early warning (EEW) systems are not well suited to Taiwan, as most destructive seismic hazards arise from in-land earthquakes. These leave regional EEW systems with essentially no lead-time before a destructive earthquake wave arrives. An on-site EEW system, by contrast, can provide more lead-time in a region closer to the epicenter, since only seismic information from the target site is required. Instead of leveraging information from several stations, the on-site system extracts P-wave features from the first few seconds of vertical ground acceleration at a single station and predicts the oncoming earthquake intensity at that station from these features. Since seismometers can be triggered by non-earthquake events, such as the passing of a truck or other human activities, a seismometer was installed at three different locations on the same site to reduce the likelihood of false alarms, and the performance of the EEW system for these three sensor locations was compared. The results show that the location on the ground floor of a school building may be a good choice, since false alarms could be reduced and the cost of installation and maintenance is the lowest.Keywords: earthquake early warning, on-site, seismometer location, support vector machine
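The feature-extraction step can be sketched as below. The abstract does not list the exact P-wave features used, so the two shown here (peak absolute amplitude and a cumulative-absolute-amplitude proxy over the trigger window) are illustrative assumptions; the real system feeds such features to a support vector machine:

```python
def p_wave_features(vertical_accel, sample_rate_hz, window_s=3.0):
    """Extract two simple features from the first seconds of vertical
    ground acceleration after triggering:
      - peak absolute amplitude in the window
      - sum of |a| * dt over the window (a crude intensity proxy)."""
    n = int(window_s * sample_rate_hz)
    window = vertical_accel[:n]          # samples after the window are ignored
    dt = 1.0 / sample_rate_hz
    peak = max(abs(a) for a in window)
    cav = sum(abs(a) for a in window) * dt
    return peak, cav
```

A classifier trained on records from the chosen sensor location would then map these features to a predicted intensity, with the decision threshold tuned to balance missed alarms against false ones.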
Procedia PDF Downloads 244826 Application of Latent Class Analysis and Self-Organizing Maps for the Prediction of Treatment Outcomes for Chronic Fatigue Syndrome
Authors: Ben Clapperton, Daniel Stahl, Kimberley Goldsmith, Trudie Chalder
Abstract:
Chronic fatigue syndrome (CFS) is a condition characterised by chronic disabling fatigue and other symptoms that currently cannot be explained by any underlying medical condition. Although clinical trials support the effectiveness of cognitive behaviour therapy (CBT), the success rate for individual patients is modest. Patients vary in their response, and little is known about which factors predict or moderate treatment outcomes. The aim of the project is to develop a prediction model from the baseline characteristics of patients, such as demographic, clinical and psychological variables, which may predict the likely treatment outcome, provide guidance for clinical decision making, and help clinicians to recommend the best treatment. The project aims to identify subgroups of patients with similar baseline characteristics that are predictive of treatment effects, using modern cluster analyses and data mining machine learning algorithms. The characteristics of these groups will then be used to inform the types of individuals who benefit from a specific treatment. In addition, the results will provide a better understanding of for whom the treatment works. The suitability of different clustering methods for identifying subgroups of CFS patients and their responses to different treatments is compared.Keywords: chronic fatigue syndrome, latent class analysis, prediction modelling, self-organizing maps
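One of the clustering methods named in the keywords, the self-organizing map, can be sketched in one dimension as below. This is an illustrative toy over a single scalar baseline feature with hypothetical values; the real analysis clusters multivariate baseline characteristics:

```python
import random

def train_som(data, n_nodes=3, epochs=50, lr0=0.5, seed=1):
    """Minimal 1-D self-organizing map: each node is a scalar prototype;
    the best-matching unit and (early in training) its neighbours are
    pulled toward each presented sample."""
    rng = random.Random(seed)
    nodes = [rng.uniform(min(data), max(data)) for _ in range(n_nodes)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                    # decaying learning rate
        neigh = max(0.0, 0.5 * (1 - 2 * epoch / epochs))   # shrinking neighbourhood
        for x in data:
            bmu = min(range(n_nodes), key=lambda i: abs(nodes[i] - x))
            for i in range(n_nodes):
                influence = 1.0 if i == bmu else (neigh if abs(i - bmu) == 1 else 0.0)
                nodes[i] += lr * influence * (x - nodes[i])
    return sorted(nodes)

# Three well-separated hypothetical baseline-score clusters:
scores = [1.0, 1.2, 0.9, 5.0, 5.1, 4.8, 9.0, 9.2, 8.9]
nodes = train_som(scores)
```

After training, each patient would be assigned to the nearest prototype, and treatment outcomes compared across the resulting subgroups.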
Procedia PDF Downloads 226825 Correlation of Material Mechanical Characteristics Obtained by Means of Standardized and Miniature Test Specimens
Authors: Vaclav Mentl, P. Zlabek, J. Volak
Abstract:
New methods of mechanical testing based on miniature test specimens (e.g., the Small Punch Test) have been developed recently. The most important advantage of these methods is the nearly non-destructive withdrawal of test material and the small size of the test specimens, which is valuable for remaining-lifetime assessment when a sufficient volume of representative material cannot be withdrawn from the component in question. Conversely, the most important disadvantage of such methods stems from the necessity to correlate their results with the results of standardised test procedures and to build up a database of in-service material data. Correlations between the miniature test specimen data and the results of standardised tests are therefore necessary. The paper describes the results of fatigue tests performed on miniature test specimens in comparison with traditional fatigue tests for several steels applied in the power-producing industry. Special fixtures for miniature test specimens were designed and manufactured for fatigue testing on a Zwick/Roell 10HPF5100 testing machine. The miniature test specimens were produced from the traditional test specimens. Seven different steels were fatigue loaded (R = 0.1) at room temperature.Keywords: mechanical properties, miniature test specimens, correlations, small punch test, micro-tensile test, mini-charpy impact test
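The loading condition quoted at the end, R = 0.1, fixes the cycle shape: R is the stress ratio sigma_min / sigma_max, so for a given maximum stress the minimum stress, amplitude, and mean stress follow directly. A short sketch (the 400 MPa value is purely illustrative):

```python
def stress_cycle_parameters(sigma_max, stress_ratio):
    """For a constant-amplitude fatigue test with stress ratio
    R = sigma_min / sigma_max, return (sigma_min, stress amplitude,
    mean stress), all in the units of sigma_max."""
    sigma_min = stress_ratio * sigma_max
    amplitude = (sigma_max - sigma_min) / 2.0
    mean = (sigma_max + sigma_min) / 2.0
    return sigma_min, amplitude, mean

# Illustrative cycle at sigma_max = 400 MPa with R = 0.1:
smin, amp, mean = stress_cycle_parameters(400.0, 0.1)
```

R = 0.1 thus means a pulsating tension-tension cycle with a tensile mean stress, which matters when comparing miniature and standard specimen results.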
Procedia PDF Downloads 540824 Managing the Magnetic Protection of Workers in Magnetic Resonance Imaging
Authors: Safoin Aktaou, Aya Al Masri, Kamel Guerchouche, Malorie Martin, Fouad Maaloul
Abstract:
Introduction: In the ‘Magnetic Resonance Imaging (MRI)’ department, all workers involved in preparing and positioning the patient, cleaning the tunnel, etc. are likely to be exposed to ‘ElectroMagnetic Fields (EMF)’ emitted by the MRI device. Exposure to EMF can cause adverse radio-biological effects in workers. The purpose of this study is to propose an organizational process to manage and control EMF risks. Materials and methods: The study was conducted at seven MRI departments using machines with 1.5 and 3 Tesla magnetic fields. We assessed the exposure in each department by measuring the two electromagnetic fields (static and dynamic) at different distances from the MRI machine, both inside and around the examination room. Measurement values were compared with British and American references (those of the UK's ‘Medicines and Healthcare products Regulatory Agency (MHRA)’ and the ‘American College of Radiology (ACR)’). Results: Based on the EMF measurements and their comparison with the recommendations of these learned societies, a zoning system that adapts to the needs of different MRI services across the country was proposed. In effect, three risk areas were identified within the MRI services. This led to the development of a good-practice guide for the magnetic protection of MRI workers. Conclusion: The guide established by our study is a standard that allows MRI workers to protect themselves against the risks of electromagnetic fields.Keywords: comparison with international references, measurement of electromagnetic fields, magnetic protection of workers, magnetic resonance imaging
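The three-zone scheme amounts to classifying each measured location against field thresholds. A minimal sketch, where the 0.5 mT and 3 mT thresholds are illustrative placeholders and not the limits adopted in the study:

```python
def emf_zone(measured_mT, controlled_limit_mT=0.5, restricted_limit_mT=3.0):
    """Classify a location around the MRI magnet into one of three
    risk zones from its measured static field (in millitesla).
    Threshold values here are placeholders for illustration."""
    if measured_mT < controlled_limit_mT:
        return "public"        # no special restriction
    if measured_mT < restricted_limit_mT:
        return "controlled"    # access limited to trained workers
    return "restricted"        # access limited and time-controlled
```

In practice, each zone would map to signage, access rules, and work-time limits in the good-practice guide.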
Procedia PDF Downloads 165823 Fuzzy Neuro Approach for Integrated Water Management System
Authors: Stuti Modi, Aditi Kambli
Abstract:
This paper addresses the need for an intelligent water management and distribution system in smart cities to ensure optimal consumption and distribution of water for drinking and sanitation purposes. Water, being a limited resource in cities, requires an effective system for collection, storage and distribution. In this paper, applications of the two most widely used types of data-driven models, namely artificial neural networks (ANN) and fuzzy logic-based models, to modelling in the water resources management field are considered. The objective of this paper is to review the principles of various types and architectures of neural networks and fuzzy adaptive systems and their applications to integrated water resources management. The final goal of the review is to expose and formulate a progressive direction for their applicability and for further research on AI-related and data-driven techniques, and to demonstrate the applicability of neural networks, fuzzy systems and other machine learning techniques to practical issues of regional water management. In addition, the paper deals with water storage, using an ANN to find the optimal reservoir level and to predict peak daily demands.Keywords: artificial neural networks, fuzzy systems, peak daily demand prediction, water management and distribution
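The fuzzy side of such a system can be sketched with a tiny two-rule controller mapping reservoir level to a release rate. The membership shapes, rule outputs, and the weighted-average defuzzification below are illustrative assumptions, not the paper's design:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function rising from a to b
    and falling from b to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def release_rate(reservoir_level_pct):
    """Two-rule fuzzy controller:
       IF level is low  THEN release little (10 units),
       IF level is high THEN release a lot  (90 units),
    combined by weighted-average defuzzification."""
    low = tri(reservoir_level_pct, -1, 0, 60)
    high = tri(reservoir_level_pct, 40, 100, 101)
    if low + high == 0:
        return 0.0
    return (low * 10.0 + high * 90.0) / (low + high)
```

Between the two membership peaks the output interpolates smoothly, which is the practical appeal of fuzzy control over hard thresholds for reservoir operation.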
Procedia PDF Downloads 187