Search results for: machine readable format
679 Characterization and Nanostructure Formation of Banana Peels Nanosorbent with Its Application
Authors: Opeyemi Atiba-Oyewo, Maurice S. Onyango, Christian Wolkersdorfer
Abstract:
The characterization and nanostructure formation of banana peels as a sorbent material are described in this paper. The transformation of this agricultural waste via mechanical milling to enhance its properties, such as changes in microstructure and surface area, for water pollution control and other applications was studied. Mechanical milling was employed using a planetary continuous milling machine with ethanol as a milling solvent, and samples were taken at time intervals between 10 h and 30 h to examine the structural changes. The samples were characterised by X-ray diffraction (XRD), scanning electron microscopy (SEM), Fourier transform infrared (FTIR) spectroscopy, transmission electron microscopy (TEM) and Brunauer–Emmett–Teller (BET) analysis. Results revealed three typical structures with different deformation mechanisms, grain sizes within the range of 71 nm down to 12 nm, and the nanostructure of the particles and fibres. The particle size decreased from 65 µm to 15 nm as the milling progressed over a period of 30 h. The morphological properties of the materials indicated that the particle shapes become regular and uniform as the milling progresses. Furthermore, particle fracturing increased the surface area from 1.0694 to 4.5547 m²/g. The functional groups responsible for the banana peels' capacity to coordinate and remove metal ions, such as the carboxylic and amine groups, were identified at absorption bands of 1730 and 889 cm⁻¹, respectively. However, the choice of this sorbent material for sorption or any other application will depend on the composition of the pollutant to be removed.
Keywords: characterization, nanostructure, nanosorbent, eco-friendly, banana peels, mechanical milling, water quality
Procedia PDF Downloads 286
678 Mining User-Generated Contents to Detect Service Failures with Topic Model
Authors: Kyung Bae Park, Sung Ho Ha
Abstract:
Online user-generated contents (UGC) significantly change the way customers behave (e.g., shop, travel), and the pressing need to handle the overwhelming amount of various UGC is one of the paramount issues for management. However, current approaches (e.g., sentiment analysis) are often ineffective for leveraging textual information to detect the problems or issues from which a certain business suffers. In this paper, we apply Latent Dirichlet Allocation (LDA) text mining to a popular online review site dedicated to complaints from users. We find that LDA efficiently detects customer complaints, and that further inspection with visualization techniques is effective for categorizing the problems or issues. As such, management can identify the issues at stake and prioritize them accordingly, in a timely manner, given a limited amount of resources. The findings provide managerial insights into how analytics on social media can help maintain and improve reputation management. Our interdisciplinary approach also yields several insights from applying machine learning techniques in the marketing research domain. On a broader technical note, this paper illustrates the details of how to implement LDA in the R program from beginning (data collection in R) to end (LDA analysis in R), since such instruction is still largely undocumented. In this regard, it will help lower the barrier for interdisciplinary researchers to conduct related research.
Keywords: latent dirichlet allocation, R program, text mining, topic model, user generated contents, visualization
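As a rough illustration of the workflow described above, a minimal sketch is given below, written in Python with scikit-learn rather than the R implementation the paper documents; the sample reviews, topic count, and vectorizer settings are assumptions for demonstration only.

```python
# Minimal sketch of LDA topic modeling on complaint texts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reviews = [
    "waited two hours for delivery and the order was wrong",
    "billing charged me twice and support never replied",
    "the agent was rude and hung up on me",
]

# LDA works on raw term counts, not TF-IDF weights.
vectorizer = CountVectorizer(stop_words="english", max_features=5000)
counts = vectorizer.fit_transform(reviews)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)  # per-document topic proportions

# Show the top words per topic so complaint categories can be labeled.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```

In practice the topic count and vocabulary would be tuned on the full complaint corpus, with the top words per topic inspected visually as the abstract suggests.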
Procedia PDF Downloads 187
677 Thermo-Mechanical Properties of PBI Fiber Reinforced HDPE Composites: Effect of Fiber Length and Composition
Authors: Shan Faiz, Arfat Anis, Saeed M. Al-Zarani
Abstract:
High density polyethylene (HDPE) and polybenzimidazole (PBI) fiber composites were prepared by melt blending in a twin screw extruder (TSE). The thermo-mechanical properties of PBI fiber reinforced HDPE composite samples (1%, 4% and 8% fiber content) with fiber lengths of 3 mm and 6 mm were investigated using differential scanning calorimetry (DSC), a universal testing machine (UTM), a rheometer and scanning electron microscopy (SEM). The effects of fiber content and fiber length on the thermo-mechanical properties of the HDPE-PBI composites were studied. The DSC analysis showed a decrease in the crystallinity of the HDPE-PBI composites with increasing fiber loading; the maximum decrease observed was 12% at 8% fiber content. The thermal stability was found to increase with the addition of fiber: T50% increased notably, by 40 °C, for both grades of HDPE at 8% fiber content. The mechanical properties were not much affected by the increase in fiber content. The optimum tensile strength was achieved at 4% fiber content, with a slight increase of 9% in tensile strength observed. No noticeable change was observed in flexural strength. In the rheology study, the complex viscosities of the HDPE-PBI composites were higher than that of the HDPE matrix and increased substantially with even the minimum PBI fiber loading of 1%. We found that the addition of the PBI fiber resulted in a modest improvement in the thermal stability and mechanical properties of the prepared composites.
Keywords: PBI fiber, high density polyethylene, composites, melt blending
Procedia PDF Downloads 368
676 Computational Fluid Dynamics Simulation of Reservoir for Dwell Time Prediction
Authors: Nitin Dewangan, Nitin Kattula, Megha Anawat
Abstract:
The hydraulic reservoir is a key component in mobile construction vehicles; most off-road earth-moving construction machinery requires large side-mounted hydraulic reservoirs. Reservoir construction is often highly non-uniform, since designers use such shapes to utilize the space available under the vehicle. Apart from virtual simulation, there is no way to determine how well the oil utilizes the reservoir space or whether the design is valid. Computational fluid dynamics (CFD) helps predict reservoir space utilization through vortex mapping, path line plots and dwell time prediction, to ensure the design is valid and efficient for the vehicle. The dwell time acceptance criterion for effective reservoir design is 15 seconds. This paper describes a hydraulic reservoir simulation carried out with the CFD tool AcuSolve using an automated meshing strategy. Free-surface flow and a moving reference mesh are used to define the oil flow level inside the reservoir. The first baseline design was not able to meet the acceptance criterion, i.e., its dwell time fell below 15 seconds, because the oil entry and exit ports were very close together. CFD was used to redefine the port locations so that the oil dwell time in the reservoir increased, and the CFD analysis also informed a baffle design for effective space utilization. The final design proposed through the CFD analysis was used for physical validation on the machine.
Keywords: reservoir, turbulence model, transient model, level set, free-surface flow, moving frame of reference
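Before running a full CFD study, the 15-second criterion can be sanity-checked against the ideal mean residence time, volume divided by flow rate. The sketch below is a back-of-the-envelope aid only; the volume and flow figures are illustrative assumptions, not values from the paper.

```python
# Ideal mean residence time of oil in the reservoir, assuming perfect
# mixing: t = V / Q. Real dwell time is lower when inlet and outlet
# ports short-circuit, which is what the CFD path-line study detects.
def mean_residence_time(oil_volume_l: float, flow_rate_lpm: float) -> float:
    """Ideal dwell time in seconds for a given volume and flow rate."""
    return (oil_volume_l / flow_rate_lpm) * 60.0

t = mean_residence_time(oil_volume_l=60.0, flow_rate_lpm=120.0)
print(f"ideal dwell time: {t:.1f} s")  # 30.0 s for these assumed values
```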
Procedia PDF Downloads 153
675 XAI Implemented Prognostic Framework: Condition Monitoring and Alert System Based on RUL and Sensory Data
Authors: Faruk Ozdemir, Roy Kalawsky, Peter Hubbard
Abstract:
Accurate estimation of remaining useful life (RUL) provides a basis for effective predictive maintenance, reducing unexpected downtime for industrial equipment. However, while models such as the Random Forest have effective predictive capabilities, they are so-called 'black box' models whose limited interpretability is a barrier to the critical diagnostic decisions required in industries such as aviation. The purpose of this work is to present a prognostic framework that embeds Explainable Artificial Intelligence (XAI) techniques to provide essential transparency into the decision-making mechanisms of machine learning methods based on sensor data, with the objective of procuring actionable insights for the aviation industry. Sensor readings have been gathered from critical equipment such as turbofan jet engines and landing gear, and the RUL is predicted by a Random Forest model. The workflow involves data gathering, feature engineering, model training, and evaluation; the datasets for these critical components are trained and evaluated independently. While the models serve suitable predictions and their performance metrics are reasonably good, such complex models obscure the reasoning behind their predictions and may even undermine the confidence of the decision-maker or the maintenance teams. This is followed, in the second phase, by global explanations using SHAP and local explanations using LIME to bridge the gap in reliability within industrial contexts. These tools analyze model decisions, highlighting feature importance and explaining how each input variable affects the output. This dual approach offers a general comprehension of the overall model behavior and detailed insight into specific predictions. In its third component, the proposed framework incorporates causal analysis in the form of Granger causality tests in order to move beyond correlation toward causation. This allows the system not only to predict failures but also to present reasons to relevant personnel, from the key sensor features to the possible failure mechanisms they are linked to. Establishing causality between sensor behaviors and equipment failures creates much value for maintenance teams through better root cause identification and effective preventive measures, and contributes to making the system more explainable. In yet another stage, several simple surrogate models, including decision trees and linear models, are used to approximate the complex Random Forest model. These simpler models act as backups, replicating important aspects of the original model's behavior. If the feature explanations obtained from the surrogate model are cross-validated with the primary model, the insights derived are more reliable and provide an intuitive sense of how the input variables affect the predictions. We then create an iterative explainable feedback loop, where the knowledge learned from the explainability methods feeds back into the training of the models. This drives a cycle of continuous improvement in both model accuracy and interpretability over time. By systematically integrating new findings, the model is expected to adapt to changed conditions and further develop its prognostic capability. These components are then presented to decision-makers through the development of a fully transparent condition monitoring and alert system.
The system provides a holistic tool for maintenance operations by leveraging RUL predictions, feature importance scores, persistent sensor threshold values, and autonomous alert mechanisms. Since the system provides explanations for its predictions, along with active alerts, maintenance personnel can make informed decisions regarding the correct interventions to extend the life of critical machinery.
Keywords: predictive maintenance, explainable artificial intelligence, prognostic, RUL, machine learning, turbofan engines, C-MAPSS dataset
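A minimal sketch of the prediction-plus-explanation core (a Random Forest RUL model with SHAP attributions) is given below; the synthetic data and sensor names are assumptions, since the actual study uses C-MAPSS and landing-gear sensor data.

```python
# Random Forest RUL regression with SHAP global feature importance.
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                               # stand-in sensor features
rul = 100 - 30 * X[:, 0] + rng.normal(scale=5, size=500)    # toy RUL target

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, rul)

# TreeExplainer computes exact, fast SHAP values for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Rank sensors by mean absolute SHAP value (global importance);
# the sensor names are hypothetical placeholders.
importance = np.abs(shap_values).mean(axis=0)
for name, score in zip(["T24", "T50", "P30", "Nf"], importance):
    print(name, round(float(score), 2))
```

Local LIME explanations and the Granger causality step would sit on top of the same fitted model; they are omitted here for brevity.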
Procedia PDF Downloads 9
674 A Real-Time Snore Detector Using Neural Networks and Selected Sound Features
Authors: Stelios A. Mitilineos, Nicolas-Alexander Tatlas, Georgia Korompili, Lampros Kokkalas, Stelios M. Potirakis
Abstract:
Obstructive Sleep Apnea Hypopnea Syndrome (OSAHS) is a widespread chronic disease that mostly remains undetected, mainly because it is diagnosed via polysomnography, which is a time- and resource-intensive procedure. Screening the disease's symptoms at home could serve as an alternative approach to alert individuals who potentially suffer from OSAHS without compromising their everyday routine. Since snoring is usually linked to OSAHS, developing a snore detector is appealing as an enabling technology for screening OSAHS at home using ubiquitous equipment like commodity microphones (included in, e.g., smartphones). In this context, this study developed a snore detection tool; herein, we present the approach, the selection of specific sound features that discriminate snoring from environmental sounds, and the performance of the proposed tool. Furthermore, a Real-Time Snore Detector (RTSD) was built upon the snore detection tool and employed on whole-night sleep sound recordings, resulting in a large dataset of snoring sound excerpts that is made freely available to the public. The RTSD may be used either as a stand-alone tool that offers insight into an individual's sleep quality or as an independent component of OSAHS screening applications in future developments.
Keywords: obstructive sleep apnea hypopnea syndrome, apnea screening, snoring detection, machine learning, neural networks
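The feature-plus-classifier idea can be sketched as follows; MFCCs via librosa stand in for the paper's selected sound features, and the synthetic waveforms and network size are assumptions for illustration.

```python
# Snore vs. environmental-sound classification from audio features.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def mfcc_features(y: np.ndarray, sr: int = 16000) -> np.ndarray:
    """Mean MFCC vector summarizing one one-second sound excerpt."""
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)

# Synthetic stand-ins: real work would load labeled snore/noise excerpts.
rng = np.random.default_rng(0)
snores = [rng.normal(size=16000).astype(np.float32) for _ in range(8)]
noises = [rng.uniform(-1, 1, 16000).astype(np.float32) for _ in range(8)]

X = np.array([mfcc_features(y) for y in snores + noises])
labels = np.array([1] * len(snores) + [0] * len(noises))

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X, labels)
print(clf.predict(X))  # 1 = snore, 0 = environmental sound
```

The real-time detector would apply the same feature extraction to a sliding window over the live microphone stream.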
Procedia PDF Downloads 208
673 A Hybrid Genetic Algorithm and Neural Network for Wind Profile Estimation
Authors: M. Saiful Islam, M. Mohandes, S. Rehman, S. Badran
Abstract:
The increasing need for wind power is directing us to acquire precise knowledge of wind resources, and methodical investigation of potential locations is required for wind power deployment. High penetration of wind energy into the grid is leading to multi-megawatt installations with huge investment costs, which makes it essential to determine appropriate places for wind farm operation. For accurate assessment, detailed examination of the wind speed profile, relative humidity, temperature and other geological or atmospheric parameters is required. Among all the uncertainty factors influencing wind power estimation, vertical extrapolation of wind speed is perhaps the most difficult and critical one. Different approaches have been used for the extrapolation of wind speed to hub height, mainly based on the log law, the power law and various modifications of the two. This paper proposes an Artificial Neural Network (ANN) and Genetic Algorithm (GA) based hybrid model, namely GA-NN, for the vertical extrapolation of wind speed. This model is very simple in the sense that it does not require any parametric estimations, such as the wind shear coefficient, roughness length or atmospheric stability, and it is also reliable compared to other methods. The model uses measured wind speeds at 10 m, 20 m and 30 m heights to estimate wind speeds up to 100 m. Good agreement is found between measured and estimated wind speeds at 30 m and 40 m, with approximately 3% mean absolute percentage error. Comparisons with an ANN alone and with the power law further prove the feasibility of the proposed method.
Keywords: wind profile, vertical extrapolation of wind, genetic algorithm, artificial neural network, hybrid machine learning
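For context, the power-law baseline mentioned above, together with a small neural network mapping the 10/20/30 m measurements to a higher level, can be sketched as below. The GA step that tunes the network in the paper is omitted, and the data are synthetic.

```python
# Power-law wind extrapolation baseline plus a small neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor

def power_law(v_ref, h_ref, h, alpha=1/7):
    """Standard wind-profile power law: v(h) = v_ref * (h/h_ref)**alpha."""
    return v_ref * (h / h_ref) ** alpha

print(power_law(5.0, 10.0, 40.0))  # extrapolate a 10 m reading to 40 m

# Synthetic training set: speeds at 10/20/30 m as inputs, 40 m as target.
rng = np.random.default_rng(1)
v10 = rng.uniform(3, 9, size=300)
X = np.column_stack([v10,
                     power_law(v10, 10, 20),
                     power_law(v10, 10, 30)])
y = power_law(v10, 10, 40) + rng.normal(0, 0.1, 300)

nn = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000).fit(X, y)
print(nn.predict(X[:3]))
```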
Procedia PDF Downloads 490
672 MIM and Experimental Studies of the Thermal Drift in an Ultra-High Precision Instrument for Dimensional Metrology
Authors: Kamélia Bouderbala, Hichem Nouira, Etienne Videcoq, Manuel Girault, Daniel Petit
Abstract:
Thermal drifts caused by the power dissipated by the mechanical guiding systems constitute the main limit to enhancing the accuracy of an ultra-high precision cylindricity measuring machine. For this reason, a high precision compact prototype has been designed to simulate the behaviour of the instrument. It ensures in situ calibration of four capacitive displacement probes by comparison with four laser interferometers. The set-up includes three heating wires for simulating the power dissipated by the mechanical guiding systems, four additional heating wires located between each laser interferometer head and its respective holder, 19 platinum resistance thermometers (Pt100) to observe the temperature evolution inside the set-up, and four Pt100 sensors to monitor the ambient temperature. A Reduced Model (RM) based on the Modal Identification Method (MIM) was developed and optimized by comparison with the experimental results. Thereafter, time-dependent tests were performed under several conditions to measure the temperature variation at 19 fixed positions in the system and compared to the calculated RM results. The RM results show good agreement with experiment and reproduce the temperature variations well, revealing the value of the proposed RM for evaluating the thermal behaviour of the system.
Keywords: modal identification method (MIM), thermal behavior and drift, dimensional metrology, measurement
Procedia PDF Downloads 396
671 Fostering Non-Traditional Student Success in an Online Music Appreciation Course
Authors: Linda Fellag, Arlene Caney
Abstract:
E-learning has earned an essential place in academia because it promotes learner autonomy, student engagement, and technological aptitude, and allows for flexible learning. However, despite these advantages, educators have been slower to embrace e-learning for ESL and other non-traditional students, for fear that such students will not succeed without the direct faculty contact and academic support of face-to-face classrooms. This study aims to determine whether an online course designed for non-traditional students can produce retention and performance rates that compare favorably with those of students in standard online sections of the same course aimed at traditional college-level students. One Music faculty member is currently collaborating with an English instructor to redesign an online college-level Music Appreciation course for non-traditional college students. At Community College of Philadelphia, Introduction to Music Appreciation was recently designated as one of the few college-level courses that advanced ESL and developmental English students can take while completing their language studies. Beginning in Fall 2017, the course will be critical for international students who must maintain full-time student status under visa requirements. In its current online format, however, Music Appreciation is designed for traditional college students, and faculty who teach these sections have been reluctant to revise the course to address the needs of non-traditional students. Interestingly, the presenters maintain that the online platform is the ideal place to develop language and college readiness skills in at-risk students while maintaining the course's curricular integrity. The two faculty presenters describe how curriculum, rather than technology, drives the redesign of the digitized music course, and how self-study materials, guided assignments, and periodic assessments promote independent learning and comprehension of the material. The 'scaffolded' modules allow ESL and developmental English students to build on prior knowledge, preview key vocabulary, discuss content, and complete graded tasks that demonstrate comprehension. Activities and assignments, in turn, enhance college success by allowing students to practice academic reading strategies, writing, speaking, and student-faculty and peer-peer communication and collaboration. The course components facilitate a comparison of student performance and retention in the redesigned and existing online sections of Music Appreciation, as well as in previous sections with at-risk students. Indirect, qualitative measures include student attitudinal surveys and evaluations. Direct, quantitative measures include withdrawal rates, tests of disciplinary knowledge, and final grades. The study will compare the outcomes of three cohorts in the two versions of the online course: ESL students, at-risk developmental students, and college-level students. These data will also be compared with retention and outcomes data for the three cohorts in f2f Music Appreciation, which permitted non-traditional student enrollment from 1998 to 2005. During this eight-year period, the presenter addressed the problems of at-risk students by adding language and college success support, which resulted in strong retention and outcomes. The presenters contend that the redesigned course will produce favorable outcomes among all three cohorts because it contains components that proved successful with at-risk learners in f2f sections of the course.
Results of their study will be published in 2019, after the redesigned online course has met for two semesters.
Keywords: college readiness, e-learning, music appreciation, online courses
Procedia PDF Downloads 177
670 Artificial Intelligence Assisted Sentiment Analysis of Hotel Reviews Using Topic Modeling
Authors: Sushma Ghogale
Abstract:
With the surge in user-generated content, feedback, and reviews on the internet, it has become possible, and important, to know consumers' opinions about products and services. These data are important both for potential customers and for the businesses providing the services. Data from social media is attracting significant attention and has become the most prominent channel for expressing unregulated opinion. Prospective customers look for reviews from experienced customers before deciding to buy a product or service, and several websites provide a platform for users to post their feedback for the provider and potential customers. However, the biggest challenge in analyzing such data lies in extracting latent features and providing term-level analysis of the data. This paper proposes an approach that uses topic modeling to classify reviews into topics and conducts sentiment analysis to mine the opinions. The approach can analyse and classify latent topics mentioned by reviewers on business sites, review sites, or social media, using topic modeling to identify the importance of each topic. It is followed by sentiment analysis to assess the satisfaction level for each topic. The approach classifies hotel reviews using multiple machine learning techniques, comparing different classifiers to mine the opinions in user reviews through sentiment analysis. The experiment concludes that the Multinomial Naïve Bayes classifier produces higher accuracy than the other classifiers.
Keywords: latent Dirichlet allocation, topic modeling, text classification, sentiment analysis
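The winning classifier can be reproduced in outline as below: a TF-IDF representation feeding a Multinomial Naïve Bayes model. The sample reviews and labels are illustrative assumptions.

```python
# TF-IDF + Multinomial Naive Bayes sentiment classification pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["spotless room and friendly staff",
           "dirty bathroom, never again",
           "great location near the beach",
           "rude reception and noisy at night"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

model = make_pipeline(TfidfVectorizer(stop_words="english"), MultinomialNB())
model.fit(reviews, labels)
print(model.predict(["staff were friendly but the room was dirty"]))
```

In the full system, the same texts would first be grouped by LDA topic, with a sentiment model applied within each topic to score satisfaction per aspect.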
Procedia PDF Downloads 97
669 Bridge Health Monitoring: A Review
Authors: Mohammad Bakhshandeh
Abstract:
Structural Health Monitoring (SHM) is a crucial and necessary practice that plays a vital role in ensuring the safety and integrity of critical structures, and in particular, bridges. The continuous monitoring of bridges for signs of damage or degradation through Bridge Health Monitoring (BHM) enables early detection of potential problems, allowing for prompt corrective action to be taken before significant damage occurs. Although all monitoring techniques aim to provide accurate and decisive information regarding the remaining useful life, safety, integrity, and serviceability of bridges, understanding the development and propagation of damage is vital for maintaining uninterrupted bridge operation. Over the years, extensive research has been conducted on BHM methods, and experts in the field have increasingly adopted new methodologies. In this article, we provide a comprehensive exploration of the various BHM approaches, including sensor-based, non-destructive testing (NDT), model-based, and artificial intelligence (AI)-based methods. We also discuss the challenges associated with BHM, including sensor placement and data acquisition, data analysis and interpretation, cost and complexity, and environmental effects, through an extensive review of relevant literature and research studies. Additionally, we examine potential solutions to these challenges and propose future research ideas to address critical gaps in BHM.
Keywords: structural health monitoring (SHM), bridge health monitoring (BHM), sensor-based methods, machine-learning algorithms, model-based techniques, sensor placement, data acquisition, data analysis
Procedia PDF Downloads 90
668 Measuring Greenhouse Gas Exchange from Paddy Field Using Eddy Covariance Method in Mekong Delta, Vietnam
Authors: Vu H. N. Khue, Marian Pavelka, Georg Jocher, Jiří Dušek, Le T. Son, Bui T. An, Ho Q. Bang, Pham Q. Huong
Abstract:
Agriculture is an important economic sector of Vietnam, the most widespread activity being wet rice cultivation. These activities are also known as the main contributor to national greenhouse gas emissions. In order to understand more about greenhouse gas exchange in these activities and to investigate the factors influencing carbon cycling and sequestration in this type of ecosystem, the first eddy covariance station was installed in a paddy field in Long An province, Mekong Delta, in 2019. The station was equipped with state-of-the-art equipment for CO₂ and CH₄ gas exchange and micrometeorological measurements. In this study, data from the station were processed following the ICOS (Integrated Carbon Observation System) recommendations for CO₂, while CH₄ was manually processed and gap-filled using a random forest model from methane-gapfill-ml, a machine learning package, as there is no standard method for CH₄ flux gap-filling yet. Finally, the carbon equivalent (Cₑ) balance based on the CO₂ and CH₄ fluxes was estimated. The results show that in 2020, even though a new water management practice, alternate wetting and drying, was applied to reduce methane emissions, the paddy field released 928 g Cₑ.m⁻².yr⁻¹; in 2021, this was reduced to 707 g Cₑ.m⁻².yr⁻¹. At the provincial level, rice cultivation activities in Long An, with a total area of 498,293 ha, released 4.6 million tons of Cₑ in 2020 and 3.5 million tons of Cₑ in 2021.
Keywords: eddy covariance, greenhouse gas, methane, rice cultivation, Mekong Delta
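The gap-filling idea can be sketched as below with a generic random forest: train on half-hours where the CH₄ flux was measured and predict the gaps from meteorological drivers. This mirrors the approach of methane-gapfill-ml but does not reproduce that package's actual API; the columns and values are assumptions.

```python
# Random-forest gap-filling of a CH4 flux series from met drivers.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Assumed half-hourly record: drivers plus a CH4 flux column with gaps.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "t_soil": rng.uniform(24, 32, 1000),
    "water_level": rng.uniform(-5, 15, 1000),
    "par": rng.uniform(0, 1500, 1000),
    "ch4_flux": rng.uniform(0, 0.4, 1000),
})
df.loc[df.sample(frac=0.3, random_state=0).index, "ch4_flux"] = np.nan

drivers = ["t_soil", "water_level", "par"]
train = df.dropna(subset=["ch4_flux"])      # rows where flux was measured
gaps = df[df["ch4_flux"].isna()]            # rows to fill

rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(train[drivers], train["ch4_flux"])
df.loc[gaps.index, "ch4_flux"] = rf.predict(gaps[drivers])  # filled series
```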
Procedia PDF Downloads 142
667 The Effect of Alkaline Treatment on Tensile Strength and Morphological Properties of Kenaf Fibres for Yarn Production
Authors: A. Khalina, K. Shaharuddin, M. S. Wahab, M. P. Saiman, H. A. Aisyah
Abstract:
This paper investigates the effect of alkali treatment on the mechanical properties of kenaf (Hibiscus cannabinus) fibre for the development of yarn. Two different fibre sources were used for the yarn production. Kenaf fibres were treated with sodium hydroxide (NaOH) at concentrations of 3, 6, 9, and 12% prior to the fibre opening process and tested for their tensile strength and Young's modulus. The selected fibres were then introduced to a fibre opener at three different opening processing parameters, namely the speeds of the roller feeder, small drum, and big drum. The fibre diameter, surface morphology, and durability of the fibres with respect to the machine were characterized. The results show that the NaOH concentration has a large effect on the fibre mechanical properties. The tensile strength and modulus of the treated fibres of both types improved significantly compared to the untreated fibres, especially at the optimum level of 6% NaOH; it is worth highlighting that 6% NaOH is the optimum concentration for the alkaline treatment. The untreated fibres and the fibres treated with 6% NaOH were then introduced to the fibre opener, and it was found that the treated fibre produced a larger fibre diameter with better surface morphology than the untreated fibre. A higher speed during opening was found to produce a higher yield of opened kenaf fibres.
Keywords: alkaline treatment, kenaf fibre, tensile strength, yarn production
Procedia PDF Downloads 249
666 Rank-Based Chain-Mode Ensemble for Binary Classification
Authors: Chongya Song, Kang Yen, Alexander Pons, Jin Liu
Abstract:
In the field of machine learning, ensembles have been employed as a common methodology to improve performance over multiple base classifiers. However, true predictions are often canceled out by false ones during consensus due to a phenomenon called the 'curse of correlation', which manifests as strong interference among the predictions produced by the base classifiers. In addition, existing practices are still not able to effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that both problems are caused by inherent deficiencies in the consensus approach. We therefore create an enhanced ensemble algorithm which adopts a designed rank-based chain-mode consensus to overcome the two problems. To evaluate the proposed ensemble algorithm, we employ the well-known benchmark data set NSL-KDD (the improved version of the KDDCup99 dataset produced by the University of New Brunswick) to make comparisons between the proposed algorithm and 8 common ensemble algorithms. In particular, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in the improvements in accuracy and reliability over the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score and area under the receiver operating characteristic curve.
Keywords: consensus, curse of correlation, imbalance classification, rank-based chain-mode ensemble
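For intuition, converting scores to ranks before combining can be sketched as below. This is a plain rank-averaging baseline, so that one miscalibrated model cannot dominate the vote; the paper's chain-mode consensus is its novel contribution, and its details are not reproduced here.

```python
# Rank-based consensus baseline over base-classifier scores.
import numpy as np

def rank_consensus(scores: np.ndarray) -> np.ndarray:
    """scores: (n_classifiers, n_samples) positive-class scores.
    Returns consensus values in [0, 1] from averaged per-classifier ranks."""
    n = scores.shape[1]
    ranks = scores.argsort(axis=1).argsort(axis=1)  # 0 = lowest score
    return ranks.mean(axis=0) / (n - 1)

scores = np.array([[0.90, 0.20, 0.60],   # classifier A
                   [0.55, 0.50, 0.52],   # classifier B (poorly calibrated)
                   [0.80, 0.10, 0.70]])  # classifier C
print(rank_consensus(scores))  # highest consensus for the first sample
```

Because ranks discard the absolute score scale, classifier B's compressed outputs contribute exactly as much ordering information as A's and C's.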
Procedia PDF Downloads 138
665 Client Hacked Server
Authors: Bagul Abhijeet
Abstract:
Background: The client-server model is the backbone of today's internet communication, in which a normal user does not have control over a particular website or server. By using the same processing model, however, one can gain unauthorized access to a particular server. In this paper, we discuss an application scenario of hacking a simple website or server, consisting of an unauthorized way to access the server database. This application autonomously takes direct access of a simple website or server and retrieves all essential information maintained by the administrator. In this system, the IP address of the server is given as input to retrieve the user ID and password of the server. This leads to breaking the administrative security of the server and acquiring control of the server database, whereas a virus helps to escape server security by crashing the whole server. Objective: To counter malicious attacks, protect government websites, and uncover the illegal activities of hackers. Results: After implementing different hacking as well as non-hacking techniques, the system hacks simple websites with normal security credentials. It provides access to the server database and allows the attacker to perform database operations from the client machine. Experiments running the application against different servers provided satisfactory results. Conclusion: In this paper, we have presented an approach to hacking a server which includes both hacking and non-hacking methods. These algorithms and methods provide an efficient way to hack a server database, and breaking network security in this way allows a new and better security framework to be introduced. The term 'hacking' should not be considered only for its illegal activities; it should also be used to strengthen our global network.
Keywords: hacking, vulnerabilities, dummy request, virus, server monitoring
Procedia PDF Downloads 252
664 Temperature Evolution, Microstructure and Mechanical Properties of Heat-Treatable Aluminum Alloy Welded by Friction Stir Welding: Comparison with Tungsten Inert Gas
Authors: Saliha Gachi, Mouloud Aissani, Fouad Boubenider
Abstract:
Friction Stir Welding (FSW) is a solid-state welding technique that can join materials without melting the plates to be welded. In this work, we demonstrate the potential of FSW for joining the heat-treatable aluminum alloy 2024-T3, which is reputed to be difficult to weld by fusion techniques. The FSW joint is then compared with one obtained by a conventional fusion process, Tungsten Inert Gas (TIG) welding. The FSW welds were made using an FSW tool mounted on a milling machine, while single-pass welding was applied to fabricate the TIG joint. The comparison between the two processes was made on temperature evolution, mechanical behavior, and microstructure. The microstructural examination revealed that the FSW weld is composed of four zones: the base metal (BM), the heat affected zone (HAZ), the thermo-mechanically affected zone (THAZ) and the nugget zone (NZ). The NZ exhibits recrystallized, equiaxed, refined grains that induce better mechanical properties and good ductility compared to the TIG joint, where the grains in the welded region are larger than in the BM due to the elevated heat input. The microhardness results show that, in the FSW weld, the THAZ contains the lowest microhardness values, which increase in the NZ; in the TIG process, however, the lowest values are localized in the NZ.
Keywords: friction stir welding, tungsten inert gas, aluminum, microstructure
Procedia PDF Downloads 277
663 Decision Making System for Clinical Datasets
Authors: P. Bharathiraja
Abstract:
Computer-aided decision making systems are used to enhance the diagnosis and prognosis of diseases and to assist clinicians and junior doctors in clinical decision making. Medical data used for decision making should be definite and consistent. Data mining and soft computing techniques are used for cleaning the data and for incorporating human reasoning into decision making systems. A fuzzy rule based inference technique can be used for classification in order to incorporate human reasoning into the decision making process. In this work, missing values are imputed using the mean or mode of the attribute. The data are normalized using min-max normalization to improve the design and efficiency of the fuzzy inference system. The fuzzy inference system is used to handle the uncertainties that exist in the medical data. Equal-width partitioning is used to partition the attribute values into appropriate fuzzy intervals. Fuzzy rules are generated using a class-based associative rule mining algorithm. The system is trained and tested using the heart disease data set from the University of California at Irvine (UCI) Machine Learning Repository. The data were split into training and testing sets using a hold-out approach. From the experimental results, it can be inferred that classification using a fuzzy inference system performs better than trivial IF-THEN rule based classification approaches. Furthermore, it is observed that the use of fuzzy logic and a fuzzy inference mechanism handles uncertainty and resembles human decision making. The system can be used in the absence of a clinical expert to assist junior doctors and clinicians in clinical decision making.
Keywords: decision making, data mining, normalization, fuzzy rule, classification
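The preprocessing steps named above (mean/mode imputation followed by min-max normalization) can be sketched as follows; the column names and values are illustrative, the study itself using the UCI heart disease attributes.

```python
# Mean/mode imputation and min-max normalization with pandas.
import pandas as pd

df = pd.DataFrame({"age": [63, 54, None, 41],
                   "chol": [233, None, 250, 204],
                   "sex": ["m", None, "f", "m"]})

# Numeric gaps -> column mean; categorical gaps -> column mode.
for col in ["age", "chol"]:
    df[col] = df[col].fillna(df[col].mean())
df["sex"] = df["sex"].fillna(df["sex"].mode()[0])

# Min-max normalization maps each numeric attribute onto [0, 1],
# which keeps the equal-width fuzzy intervals well conditioned.
for col in ["age", "chol"]:
    lo, hi = df[col].min(), df[col].max()
    df[col] = (df[col] - lo) / (hi - lo)
print(df)
```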
Procedia PDF Downloads 518
662 Water Body Detection and Estimation from Landsat Satellite Images Using Deep Learning
Authors: M. Devaki, K. B. Jayanthi
Abstract:
The identification of water bodies from satellite images has recently received a great deal of attention. Different methods have been developed to distinguish water bodies from various satellite images that vary in terms of time and space. Urban water body identification arises in numerous applications where a great deal of certainty is required. There has been a sharp rise in the use of satellite images to map natural resources, including urban water bodies and forests, during the past several years. This is because water and forest resources depend on each other so heavily that ongoing monitoring of both is essential to their sustainable management. The relevant elements of satellite pictures have been selected using a variety of techniques, including machine learning. In this work, a convolutional neural network (CNN) architecture is created that can classify a superpixel in a complex metropolitan scene into one of two classes: containing water or not. The deep learning technique CNN has advanced tremendously in a variety of visual tasks. A CNN can improve classification performance by exploiting the spectral-spatial regularities of the input data and extracting deep features hierarchically from raw pictures. The water body area is then calculated using the satellite image's resolution. Experimental results demonstrate that the suggested method outperformed conventional approaches in terms of water extraction accuracy from remote-sensing images, with an average overall accuracy of 97%.
Keywords: water body, deep learning, satellite images, convolutional neural network
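A small CNN for the two-class superpixel decision could look like the sketch below; the patch size, channel count, and layer widths are assumptions rather than the paper's exact architecture, and the training data here are random stand-ins for labeled Landsat patches.

```python
# Small Keras CNN labeling an image patch as water / not water.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(32, 32, 3)),         # RGB patch around a superpixel
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # P(water)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Toy stand-in data: real training would use labeled Landsat patches.
X = np.random.rand(64, 32, 32, 3).astype("float32")
y = np.random.randint(0, 2, size=(64, 1))
model.fit(X, y, epochs=1, batch_size=16, verbose=0)
```

The water area estimate then follows by counting water-labeled pixels and multiplying by the per-pixel ground area given by the image resolution.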
Procedia PDF Downloads 90
661 Single Atom Manipulation with 4 Scanning Tunneling Microscope Technique
Authors: Jianshu Yang, Delphine Sordes, Marek Kolmer, Christian Joachim
Abstract:
Nanoelectronics, for example calculating circuits integrating molecule-scale logic gates and atomic-scale circuits, have recently been constructed and investigated. A major challenge is the characterization of their functional properties, because of the problem of connecting the atomic scale to the micrometer scale. New experimental instruments and new processes have therefore been proposed, and technique improvements to achieve precise measurement at the atomic scale while connecting to micrometer-scale electrical controllers are ongoing. Our new machine, a low temperature ultra-high vacuum four-probe scanning tunneling microscope, a custom instrument constructed by Omicron GmbH, is expected to scale characterization down to the atomic level. Here, we present our first verified results on the performance of this new instrument. The sample we selected is the Au(111) surface, and the measurements were taken at 4.2 K. The atomic-resolution surface structure was observed with each of the four scanners, with a noise level better than 3 pm. With the tip-sample distance calibrated by I-z spectra, the sample conductance was derived from atomically localized I-V spectra. Furthermore, the surface conductance measurement was performed using two methods: (1) landing two STM tips on the surface with the sample floating; and (2) keeping the sample floating and grounding one of the landed tips. In addition, single atom manipulation was achieved with a modified tip design, which is comparable to a conventional LT-STM.
Keywords: low temperature ultra-high vacuum four scanning tunneling microscope, nanoelectronics, point contact, single atom manipulation, tunneling resistance
Procedia PDF Downloads 280
660 Mixed Mode Fracture Analyses Using Finite Element Method of Edge Cracked Heavy Spinning Annulus Pulley
Authors: Bijit Kalita, K. V. N. Surendra
Abstract:
The rotating disk is one of the most indispensable parts of a rotating machine and has found many applications in diverse fields of science and technology. In this paper, we consider the problem of a heavy spinning disk mounted on a rotor system and acted upon by boundary traction. Finite element modelling is used at various loading conditions to determine the mixed-mode stress intensity factors. The effect of combined shear and normal traction on the boundary is incorporated in the analysis under the action of gravity. The variation near the crack tip is characterized in terms of the stress intensity factor (SIF), with the aim of finding the SIF for a wide range of parameters. The results of finite element analyses carried out on the compressed disk of a belt pulley arrangement using fracture mechanics concepts are shown. A total of one hundred cases of the problem are solved for each of the variations in the loading arc parameter and crack orientation, using finite element models of the disk under compression. All models were prepared and analyzed for the uncracked disk, for a disk with a single crack at different orientations emanating from the shaft hole, and for a disk with a pair of cracks emerging from the same center hole. Curves are plotted for various loading conditions. Finally, crack propagation paths are determined using kink angle concepts.
Keywords: crack-tip deformations, static loading, stress concentration, stress intensity factor
Procedia PDF Downloads 144
659 Finite Element Modeling of Two-Phase Microstructure during Metal Cutting
Authors: Junior Nomani
Abstract:
This paper presents a novel approach to modelling the metal cutting of duplex stainless steels, a two-phase alloy regarded as a difficult-to-machine material. Calculation and control of shear strains and stresses during cutting are essential to achieving ideal cutting conditions: values that are too low or too high lead to higher required cutting forces or excessive heat generation, causing premature tool wear failure. A 2D finite element cutting model was created based on electron backscatter diffraction (EBSD) imagery of the duplex microstructure. A mesh was generated using the 'object-oriented' software OOF2 (version 2.1.11), converting the microstructural images to quadrilateral elements. A virtual workpiece was created in the ABAQUS modelling software, in which a rigid tool advanced towards the workpiece, simulating serrated-edge chip formation during cutting. The calculated stress and strain contour plots correlated well with those of similar finite element models for austenitic stainless steel alloys, and the virtual chip form profile is similar to experimental frozen machining chip samples. The model output provides a new insight into the strain behavior of the two-phase material as it transitions from the workpiece into the chip.
Keywords: duplex stainless steel, ABAQUS, OOF2, chip formation
Procedia PDF Downloads 100
658 The Role of Community Activism in Promoting Social Justice around Housing Issues: A Case Study of the Western Cape
Authors: Mapule Maema
Abstract:
The paper aims to highlight the role that community activism has played in promoting social justice around housing issues in the Western Cape. The Western Cape is one of the most spatially segregated provinces in South Africa and continues to exhibit grave inequalities between cities, townships and farms. These inequalities cut across intersecting issues such as race, class, gender, and politics. The main challenges facing marginalized communities in the Western Cape include access to housing, land and basic services. This is not peculiar to the Western Cape; the entire country faces similar challenges. However, the Western Cape is seen as the fastest-urbanizing province in the country due to tourism. Various social movements have been formed across the country to counter these challenges; this paper, however, focuses on the resilience communities have fostered despite the myriad housing and spatial crises they face. The paper focuses on the Legal Resources Centre's clients from an informal settlement called Imizamo Yethu, based in the Hout Bay Valley area. The 18-hectare settlement houses approximately 33,600 people. On 21 July 2017, Hout Bay experienced violent protests following an eviction order passed by the City of Cape Town. The protest was characterized by tensions within the community regarding the super-blocking initiative, which aims to establish roads in informal settlements to ensure basic services. Residents against the process argued that there were no proper consultations to educate them on what the process entailed. Public participation is one of the objectives municipalities aim to promote; however, it remains a great challenge. In order to highlight the experiences of the LRC clients, what motivated their involvement in the movement, how they felt about their participation, and their aspirations, the paper will employ qualitative research methods. Qualitative research methods enable the researcher to gain a deeper and more nuanced understanding of the social world through the eyes of those who experienced it. It is a flexible methodology that also enables one to understand social processes and the significance they generate. Data will be collected using the World Café as a focus group method. The World Café is a simple, effective and flexible format for hosting group dialogue. The steps taken when setting up a World Café include the following: setting the context (why you are bringing people together and what you want to achieve), creating a hospitable space (making participants feel at home and free to discuss issues), exploring questions that matter, connecting diverse perspectives (the opportunity to actively contribute your thinking), listening together for patterns and insights, and sharing collective discoveries and learnings. Secondary data will be used to augment the data collected, and stories of impact will be drawn from the exercises. This paper will contribute to the discourse on sustainable housing and urban development, and the research outputs will be disseminated to the public for learning.
Keywords: community activism, influence, social justice, development
Procedia PDF Downloads 138
657 Automatic Multi-Label Image Annotation System Guided by Firefly Algorithm and Bayesian Method
Authors: Saad M. Darwish, Mohamed A. El-Iskandarani, Guitar M. Shawkat
Abstract:
Nowadays, the amount of available multimedia data is continuously on the rise, and finding a required image is a challenging task for an ordinary user. Content based image retrieval (CBIR) computes relevance based on the visual similarity of low-level image features such as color, textures, etc. However, there is a gap between such low-level visual features and the semantic meanings required by applications. The typical method of bridging the semantic gap is automatic image annotation (AIA), which extracts semantic features using machine learning techniques. In this paper, a multi-label image annotation system guided by the Firefly algorithm and a Bayesian method is proposed. Firstly, images are segmented using maximum intra-cluster variance and the Firefly algorithm, a swarm-based approach with high convergence speed and low computation cost that searches for the optimal multiple thresholds. Feature extraction techniques based on color features and region properties are applied to obtain the representative features. After that, the images are annotated using a translation model based on the Net Bayes system, which is efficient for multi-label learning, with high precision and low complexity. Experiments are performed using the Corel database. The results show that the proposed system outperforms traditional ones for automatic image annotation and retrieval.
Keywords: feature extraction, feature selection, image annotation, classification
Procedia PDF Downloads 586
656 Investigating the Determinants and Growth of Financial Technology Depth of Penetration among the Heterogeneous Africa Economies
Authors: Tochukwu Timothy Okoli, Devi Datt Tewari
Abstract:
The high rate of Fintech adoption has not translated into greater financial inclusion and development in Africa. This problem is attributed to poor Fintech diversification and usefulness on the continent, a concept referred to in this study as the Fintech depth of penetration. The study therefore assessed its determinants and growth process in a panel of three emerging, twenty-four frontier and five fragile African economies, disaggregated with dummies over the period 2004-2018 to allow for heterogeneity between groups. The System Generalized Method of Moments (GMM) technique reveals that the average depth of mobile banking and automated teller machine (ATM) use follows a dynamic, heterogeneous process. Moreover, users' previous experiences/compatibility, trial-ability/income, and financial development were the major factors that raise its usefulness, whereas perceived risk, financial openness, and the inflation rate significantly limit it. The growth rates of mobile banking, ATM, and internet banking in 2018 were, on average, 41.82, 0.4, and 20.8 per cent greater, respectively, than their average rates in 2004. These greater averages after the 2009 financial crisis suggest that countries resort to Fintech as a risk-mitigating tool. This study therefore recommends greater Fintech diversification through improved literacy, institutional development, financial liberalization, and continuous innovation.
Keywords: depth of fintech, emerging Africa, financial technology, internet banking, mobile banking
Procedia PDF Downloads 131
655 A Regulator's Assessment of Consumer Risk When Evaluating a User Test for an Umbrella Brand Name in an over the Counter Medicine
Authors: A. Bhatt, C. Bassi, H. Farragher, J. Musk
Abstract:
Background: All medicines placed on the EU market are legally required to be accompanied by labeling and a package leaflet, which provide comprehensive information enabling their safe and appropriate use. Mock-ups, with the results of assessments using a target patient group, must be submitted with a marketing authorisation application. Consumers need confidence in non-prescription OTC medicines in order to manage their minor ailments, and umbrella brands assist purchasing decisions by enabling easy identification within a particular therapeutic area. A number of regulatory agencies have risk management tools and guidelines to assist in developing umbrella brands for OTC medicines; however, assessment and decision making remain subjective and inconsistent. This study presents an evaluation in the UK following the US FDA warning concerning methaemoglobinaemia, prompted by 21 reported cases (11 in children under 2 years) caused by OTC oral analgesics containing benzocaine. Methods: Standard face-to-face, structured, task-based user interviews with 25 consumers aged 15-91 years, using a standard questionnaire and rating scale, were conducted independently between June and October 2015 in the consumers' homes. The evaluation assessed whether individuals could discriminate between the labelling, safety information and warnings on the cartons and PILs of 3 different OTC medicine packs sharing the same umbrella name. Each pack presented a different information hierarchy, using differently coloured cartons, and contained one of 3 different active ingredients: benzocaine (an oromucosal spray) and two lozenges containing, respectively, 2,4-dichlorobenzyl alcohol with amylmetacresol, and hexylresorcinol (for the symptomatic relief of sore throat pain). The test was designed to determine whether the warnings on the carton and leaflet were prominent and accessible enough to alert users that one product contained benzocaine, with its risk of methaemoglobinaemia, and to refer them to the leaflet for the signs of the condition and what to do should it occur. Results: Two consumers did not locate the warnings on the side of the pack but eventually found them on the back, and two suggestions were made to further improve the accessibility of the methaemoglobinaemia warning. With a gold pack design for the oromucosal spray, all consumers could differentiate between the 3 drugs, the minimum age particulars, the pharmaceutical form and the risk factor methaemoglobinaemia. The warnings for benzocaine were deemed clear or very clear, and the appearance of the 3 packs was either very well differentiated or quite well differentiated. The PIL test passed on all criteria. All consumers could use the product correctly and identify the risk factors, confirming that the critical information necessary for safe use was legible and easily accessible, so that confusion and errors were minimised. Conclusion: Patients with known methaemoglobinaemia are likely to be vigilant in checking for benzocaine-containing products, despite similar umbrella brand names across a range of active ingredients. Despite these findings, the package design and spray format were not deemed sufficient to mitigate potential safety risks associated with differences in target populations and contraindications when submitted to the Regulatory Agency. Although risk management tools are increasingly being used by agencies to provide objective assurance of package safety, further transparency, reduced subjectivity and proportionate risk assessment should be demonstrated.
Keywords: labelling, OTC, risk, user testing
Procedia PDF Downloads 309
654 Stress Concentration Trend for Combined Loading Conditions
Authors: Aderet M. Pantierer, Shmuel Pantierer, Raphael Cordina, Yougashwar Budhoo
Abstract:
Stress concentration occurs when there is an abrupt change in the geometry of a mechanical part under loading. Such changes in geometry can include holes, notches, or cracks within the component, and they create larger localized stresses within the part. This maximum stress is difficult to determine, as it occurs directly at the point of minimum area, and strain gauges have yet to be developed that can analyze stresses over such minute areas. Therefore, a stress concentration factor must be utilized. The stress concentration factor is a dimensionless parameter calculated solely from the geometry of a part. The factor is multiplied by the nominal, or average, stress of the component, which can be found analytically or experimentally. Stress concentration graphs exist for common loading conditions and geometrical configurations to aid in determining the maximum stress a part can withstand; these graphs were developed from historical experimental data. This project seeks to verify a stress concentration graph for combined loading conditions. The aforementioned graph was developed using CATIA finite element analysis software, and its results will be validated through further testing. The 3D modeled parts will be subjected to further finite element analysis using Patran-Nastran software. The finite element models will then be verified by testing physical specimens using a tensile testing machine. Once the data are validated, the unique stress concentration graph will be submitted for publication so it can aid engineers in future projects.
Keywords: stress concentration, finite element analysis, finite element models, combined loading
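For reference, the relationship such graphs encode is the standard definition of the theoretical stress concentration factor (a textbook identity, not a result of this project):

```latex
% Theoretical stress concentration factor: peak stress at the
% discontinuity relative to the nominal (average) stress.
K_t = \frac{\sigma_{\max}}{\sigma_{\mathrm{nom}}},
\qquad
\sigma_{\max} = K_t \, \sigma_{\mathrm{nom}}.
% Example: a small circular hole in a wide plate under uniaxial
% tension gives K_t \approx 3 at the edge of the hole.
```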
Procedia PDF Downloads 444
653 Celebrity Culture and Social Role of Celebrities in Türkiye during the 1990s: The Case of Türkiye, Newspaper, Radio, Television (TGRT) Channel
Authors: Yelda Yenel, Orkut Acele
Abstract:
In a media-saturated world, celebrities have become ubiquitous figures, encountered both in public spaces and within the privacy of our homes, seamlessly integrating into daily life. From Alexander the Great to contemporary media personalities, the image of the celebrity has persisted throughout history, manifesting in various forms and contexts. Over time, as the relationship between society and the market evolved, so too did the roles and behaviors of celebrities. These transformations offer insights into the cultural climate, revealing shifts in habits and worldviews. In Türkiye, the emergence of private television channels brought an influx of celebrities into everyday life, making them a pervasive part of daily routines. To understand modern celebrity culture, it is essential to examine the ideological functions of media within political, economic, and social contexts. Within this framework, celebrities serve as both reflections and creators of cultural values and, at times, act as intermediaries, offering insights into the society of their era. Starting its broadcasting life in 1992 with religious films and religious conversation programs, the Türkiye Newspaper, Radio, Television channel (TGRT) later changed its appearance, slogan, and the celebrities it featured in response to the political atmosphere. Celebrities played a critical role in the transformation from the existing slogan 'Peace has come to the screen' to 'Watch and see what will happen'. Celebrities hold significant roles in society, and their images are produced and circulated by various actors, including media organizations and public relations teams; understanding these dynamics is crucial for analyzing their influence and impact. This study aims to explore Turkish society in the 1990s, focusing on TGRT and its visual and discursive characteristics regarding celebrity figures such as Seda Sayan. The first section examines the historical development of celebrity culture and its transformations, guided by the conceptual framework of celebrity studies. The complex and interconnected image of the celebrity, as introduced by post-structuralist approaches, plays a fundamental role in making sense of existing relationships. This section traces the existence and functions of celebrities from antiquity to the present day. The second section explores the economic, social, and cultural contexts of 1990s Türkiye, focusing on the media landscape and the visibility that became prominent in the neoliberal era following the 1980s. This section also discusses the political factors underlying TGRT's transformation, such as the 1997 military memorandum. The third section analyzes TGRT as a case study, focusing on its significance as an Islamic television channel and the shifts in its public image, categorized into two distinct periods. The channel's programming, which aligned with Islamic teachings, and the celebrities who featured prominently during these periods became the public face of both TGRT and the broader society. In particular, the transition to a more 'secular' format during TGRT's second phase is analyzed, focusing on changes in celebrity attire and program formats. This study reveals that celebrities are used as indicators of ideology, benefiting from this instrumentalization by enhancing their own fame while reflecting the prevailing cultural hegemony in society.
Keywords: celebrity culture, media, neoliberalism, TGRT
Procedia PDF Downloads 33
652 Prediction of Distillation Curve and Reid Vapor Pressure of Dual-Alcohol Gasoline Blends Using Artificial Neural Network for the Determination of Fuel Performance
Authors: Leonard D. Agana, Wendell Ace Dela Cruz, Arjan C. Lingaya, Bonifacio T. Doma Jr.
Abstract:
The purpose of this paper is to predict the fuel performance parameters, which include the drivability index (DI), vapor lock index (VLI), and vapor lock potential, from the distillation curve and Reid vapor pressure (RVP) of dual alcohol-gasoline fuel blends. The distillation curve and Reid vapor pressure were predicted using artificial neural networks (ANNs) with macroscopic properties such as boiling points, RVP, and molecular weights as the input layer. The ANN consists of 5 hidden layers and was trained using Bayesian regularization. The training mean square error (MSE) and R-value for the ANN for RVP are 91.4113 and 0.9151, respectively, while the training MSE and R-value for the distillation curve are 33.4867 and 0.9927. Fuel performance analysis of the dual alcohol-gasoline blends indicated that highly volatile gasoline blended with dual alcohols results in fuel blends that do not comply with the ASTM D4814 standard. Mixtures of low-volatility gasoline and 10% methanol or 10% ethanol can still be blended with up to 10% C3 and C4 alcohols. Gasoline of intermediate volatility containing 10% methanol or 10% ethanol can still be blended with C3 and C4 alcohols that have low RVPs, such as 1-propanol, 1-butanol, 2-butanol, and i-butanol.
Keywords: dual alcohol-gasoline blends, distillation curve, machine learning, reid vapor pressure
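A rough sketch of the regression setup is given below using scikit-learn; note that L2 regularization stands in for the paper's Bayesian regularization (a MATLAB-style trainbr routine is not available in scikit-learn), and the features and data are illustrative assumptions.

```python
# Five-hidden-layer MLP regressor for RVP from blend descriptors.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Inputs stand in for blend descriptors such as component boiling
# points, molecular weights, and alcohol fractions (3 toy features).
X = rng.uniform(size=(200, 3))
rvp = 40 + 25 * X[:, 0] - 10 * X[:, 2] + rng.normal(0, 1, 200)  # toy RVP, kPa

nn = MLPRegressor(hidden_layer_sizes=(16, 16, 16, 16, 16),  # 5 hidden layers
                  alpha=1e-3,          # L2 penalty in lieu of Bayesian reg.
                  max_iter=5000).fit(X, rvp)
print(nn.predict(X[:3]))
```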
Procedia PDF Downloads 103
651 Procedural Protocol for Dual Energy Computed Tomography (DECT) Inversion
Authors: Rezvan Ravanfar Haghighi, S. Chatterjee, Pratik Kumar, V. C. Vani, Priya Jagia, Sanjiv Sharma, Susama Rani Mandal, R. Lakshmy
Abstract:
Dual energy computed tomography (DECT) aims at recording the HU(V) values for a sample at two different voltages, V = V1, V2, and thus obtaining the electron density (ρe) and effective atomic number (Zeff) of the substance. In the present paper, we aim to obtain a numerical algorithm by which (ρe, Zeff) can be obtained from the HU(100) and HU(140) data, where V = 100, 140 kVp. The idea is to use this inversion method to characterize and distinguish between lipid and fibrous coronary artery plaques. With the aim of developing the inversion algorithm for low-Zeff materials, as is the case with non-calcified coronary artery plaque, we prepare aqueous samples whose calculated values of (ρe, Zeff) lie in the ranges 2.65×10²³ ≤ ρe ≤ 3.64×10²³ per cc and 6.80 ≤ Zeff ≤ 8.90. We fill the phantom with these known samples and experimentally determine HU(100) and HU(140) for the same pixels. Knowing that the HU(V) values are related to the attenuation coefficient of the system, we present an algorithm by which (ρe, Zeff) is calibrated with respect to (HU(100), HU(140)). The calibration is done with a known set of 20 samples; its accuracy is checked with a different set of 23 known samples. We find that the calibration gives ρe with an accuracy of ±4%, while Zeff is found within ±1% of the actual value, with 95% confidence. In this inversion method, the (ρe, Zeff) of the scanned sample can be found by eliminating the effects of the CT machine and by ensuring that the determinations of the two unknowns (ρe, Zeff) do not interfere with each other. It is found that this algorithm can be used to predict the chemical characteristics (ρe, Zeff) of unknown scanned materials with a 95% confidence level, by inversion of the DECT data.
Keywords: chemical composition, dual-energy computed tomography, inversion algorithm
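For background, inversions of this kind are commonly built on an Alvarez-Macovski-type parameterization of the linear attenuation coefficient; the generic form is shown below, with the caveat that the authors' exact calibration function is not stated in the abstract.

```latex
% Alvarez-Macovski-type decomposition of the attenuation coefficient
% (a generic form; the authors' calibration may differ):
\mu(E) = \rho_e \left[ \alpha(E) + \beta(E)\, Z_{\mathrm{eff}}^{\,n} \right],
\qquad n \approx 3\text{--}4,
% with Hounsfield units defined relative to water at each voltage V:
\mathrm{HU}(V) = 1000 \,\frac{\mu(V) - \mu_w(V)}{\mu_w(V)}.
% Measuring HU(100) and HU(140) thus yields two equations in the two
% unknowns (\rho_e, Z_{eff}), which the calibration inverts.
```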
Procedia PDF Downloads 438
650 Hand Gesture Interface for PC Control and SMS Notification Using MEMS Sensors
Authors: Keerthana E., Lohithya S., Harshavardhini K. S., Saranya G., Suganthi S.
Abstract:
In an epoch of expanding human-machine interaction, the development of innovative interfaces that bridge the gap between physical gestures and digital control has gained significant momentum. This study introduces a distinct solution that leverages a combination of MEMS (Micro-Electro-Mechanical Systems) sensors, an Arduino Mega microcontroller, and a PC to create a hand gesture interface for PC control and SMS notification. The core of the system is an ADXL335 MEMS accelerometer sensor integrated with an Arduino Mega, which communicates with a PC via a USB cable. The ADXL335 provides real-time acceleration data, which is processed by the Arduino to detect specific hand gestures. These gestures, such as left, right, up, down, or custom patterns, are interpreted by the Arduino, and corresponding actions are triggered. In the context of SMS notifications, when a gesture indicative of a new SMS is recognized, the Arduino relays this information to the PC through the serial connection. The PC application, designed to monitor the Arduino's serial port, displays these SMS notifications in the serial monitor. This study offers an engaging and interactive means of interfacing with a PC by translating hand gestures into meaningful actions, opening up opportunities for intuitive computer control. Furthermore, the integration of SMS notifications adds a practical dimension to the system, notifying users of incoming messages as they interact with their computers. The use of MEMS sensors, Arduino, and serial communication serves as a promising foundation for expanding the capabilities of gesture-based control systems.
Keywords: hand gestures, multiple cables, serial communication, sms notification
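The PC-side serial monitor described above can be sketched in Python with pyserial as below; the port name, baud rate, and message format ("GESTURE:<name>" / "SMS:<text>") are assumptions about how the Arduino formats its output.

```python
# PC-side listener for gesture and SMS events sent over USB serial.
import serial  # pyserial

# Port name is platform-dependent, e.g. "COM3" on Windows or
# "/dev/ttyACM0" on Linux; 9600 baud matches a typical Arduino sketch.
with serial.Serial(port="COM3", baudrate=9600, timeout=1) as link:
    while True:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # read timed out; poll again
        if line.startswith("SMS:"):
            print("New SMS notification:", line[4:])
        elif line.startswith("GESTURE:"):
            print("Gesture detected:", line[8:])
```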
Procedia PDF Downloads 71