Search results for: smelting techniques
5755 Helicopter Exhaust Gases Cooler in Terms of Computational Fluid Dynamics (CFD) Analysis
Authors: Mateusz Paszko, Ksenia Siadkowska
Abstract:
Due to their low-altitude and relatively low-speed flight, helicopters are easy targets for modern combat assets, e.g., infrared-guided missiles. Current techniques aim to increase the combat effectiveness of military helicopters. Protection of the helicopter in flight from early detection, tracking and, finally, destruction can be realized in many ways. One of them is cooling the hot exhaust gases emitted from the engines to the atmosphere in special heat exchangers. Nowadays, this process is realized in ejective coolers, where a strong exchange of heat and momentum takes place between the hot exhaust gases and cold air ejected from the atmosphere. The flow of air and exhaust gases, the mixture of the two, and the heat transfer between the cold air and the hot exhaust gases are described by differential equations covering mass transport (flow continuity), the ejection of cold air by the expanding exhaust gases, conservation of momentum and energy, and the constitutive (physical relationship) equations. Calculating these processes in an ejective cooler by means of classical mathematical analysis is extremely difficult or even impossible. Because of this, it is necessary to apply a numerical approach using modern computer programs. The paper discusses the general usability of Computational Fluid Dynamics (CFD) in the process of designing an ejective exhaust gas cooler cooperating with a helicopter turbine engine. In this work, the CFD calculations have been performed for an ejective-based cooler cooperating with the PA W3 helicopter's engines.
Keywords: aviation, CFD analysis, ejective-cooler, helicopter techniques
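For reference, the conservation laws listed above can be written in their generic differential form as follows (a standard statement of mass, momentum and energy conservation for a compressible flow; the mixing/ejection source terms and closures actually used by the authors are not given in the abstract):

\[
\frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\mathbf{u}) = 0, \qquad
\frac{\partial (\rho\mathbf{u})}{\partial t} + \nabla\cdot(\rho\mathbf{u}\otimes\mathbf{u}) = -\nabla p + \nabla\cdot\boldsymbol{\tau}, \qquad
\frac{\partial (\rho E)}{\partial t} + \nabla\cdot\big[(\rho E + p)\,\mathbf{u}\big] = \nabla\cdot(k\nabla T) + \nabla\cdot(\boldsymbol{\tau}\cdot\mathbf{u}).
\]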
Procedia PDF Downloads 332
5754 A Convolutional Neural Network-Based Model for Lassa fever Virus Prediction Using Patient Blood Smear Image
Authors: A. M. John-Otumu, M. M. Rahman, M. C. Onuoha, E. P. Ojonugwa
Abstract:
A Convolutional Neural Network (CNN) model for predicting Lassa fever was built using the Python 3.8.0 programming language, alongside the Keras 2.2.4 and TensorFlow 2.6.1 libraries as the development environment, in order to reduce the current high risk of Lassa fever in West Africa, particularly in Nigeria. The study was prompted by some major flaws in the existing conventional laboratory equipment for diagnosing Lassa fever (RT-PCR), as well as flaws in the AI-based techniques that have been used for probing and prognosis of Lassa fever reported in the literature. A total of 15,679 blood smear microscopic images were collected. The proposed model was trained on 70% of the dataset and tested on the remaining 30% of the microscopic images to avoid overfitting. A 3x3x3 convolution filter was used in the proposed system to extract features from the microscopic images. The proposed CNN-based model had a recall of 96%, a precision of 93%, an F1 score of 95%, and an accuracy of 94% in predicting and accurately classifying the images into clean or infected samples. Based on empirical evidence from the literature consulted, the proposed model outperformed the other existing AI-based techniques evaluated. If properly deployed, the model will assist physicians, medical laboratory scientists, and patients in making accurate diagnoses of Lassa fever cases, allowing the mortality rate due to the Lassa fever virus to be reduced through sound decision-making.
Keywords: artificial intelligence, ANN, blood smear, CNN, deep learning, Lassa fever
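As a rough illustration of the kind of model described above, the sketch below builds a small binary image classifier in Keras with a 70/30 split; the layer counts, filter numbers and input resolution are assumptions for illustration, not the exact architecture reported in the abstract.

```python
# Minimal Keras sketch of a clean-vs-infected blood-smear classifier.
# Architecture details (input size, layer widths) are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(input_shape=(128, 128, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # 0 = clean, 1 = infected
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy",
                           tf.keras.metrics.Precision(),
                           tf.keras.metrics.Recall()])
    return model

# 70/30 train/test split mirroring the protocol described in the abstract:
# from sklearn.model_selection import train_test_split
# X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
# build_model().fit(X_tr, y_tr, validation_data=(X_te, y_te), epochs=20)
```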
Procedia PDF Downloads 120
5753 Effect of Highway Construction on Soil Properties and Soil Organic Carbon (SOC) Along the Lagos-Badagry Expressway, Lagos, Nigeria
Authors: Fatai Olakunle Ogundele
Abstract:
Road construction is increasingly common in today's world as human development expands and people rely on cars for daily transportation. The construction of a large network of roads has dramatically altered the landscape and impacted well-being in a number of deleterious ways. In addition, roads can shift population demographics and be a source of pollution to the environment. Road construction activities normally alter the soil's physical properties through compaction on the road itself and on adjacent areas, as well as its chemical and biological properties, among other effects. Understanding how roadside soil properties are influenced by road construction activities can serve as a basis for formulating conservation-based management strategies. Therefore, this study examined the effects of road construction on soil properties and soil organic carbon along the Lagos-Badagry Expressway, Lagos, Nigeria. The study adopted purposive sampling techniques, and 40 soil samples were collected at a depth of 0-30 cm from each of the identified road intersections and infrastructures using a soil auger. The soil samples collected were taken to the laboratory for analysis of soil properties and carbon stock using standard methods. Both descriptive and inferential statistical techniques were applied to analyze the data obtained. The results revealed that soil compaction inhibits ecological succession on roadsides, in that increased compaction suppresses plant growth as well as causes changes in soil quality.
Keywords: highway, soil properties, organic carbon, road construction, land degradation
Procedia PDF Downloads 80
5752 A Flute Tracking System for Monitoring the Wear of Cutting Tools in Milling Operations
Authors: Hatim Laalej, Salvador Sumohano-Verdeja, Thomas McLeay
Abstract:
Monitoring of tool wear in milling operations is essential for achieving the desired dimensional accuracy and surface finish of a machined workpiece. Although there are numerous statistical models and artificial intelligence techniques available for monitoring the wear of cutting tools, these techniques cannot pinpoint which cutting edge of the tool, or which insert in the case of indexable tooling, is worn or broken. Currently, the task of monitoring wear on the tool cutting edges is carried out by the operator, who performs manual inspections, causing undesirable stoppages of machine tools and, consequently, costs incurred from lost productivity. The present study is concerned with the development of a flute tracking system to segment signals related to each physical flute of a three-flute cutter used in an end milling operation. The purpose of the system is to monitor the cutting condition of individual flutes separately in order to determine their progressive wear rates and to predict imminent tool failure. The results of this study clearly show that signals associated with each flute can be effectively segmented using the proposed flute tracking system. Furthermore, the results illustrate that, by segmenting the sensor signal by flutes, it is possible to investigate the wear of each physical cutting edge of the cutting tool. These findings are significant in that they facilitate online condition monitoring of a cutting tool for each specific flute without the need for operators/engineers to perform manual inspections of the tool.
Keywords: machining, milling operation, tool condition monitoring, tool wear prediction
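The abstract does not detail the segmentation algorithm itself; the sketch below only illustrates the basic idea of flute-wise segmentation, assuming a constant spindle speed so that each tooth-passing interval can be assigned cyclically to one of the three flutes.

```python
# Illustrative flute-wise segmentation for a 3-flute cutter at constant spindle
# speed; this is a simplified stand-in, not the tracking method of the paper.
import numpy as np

def segment_by_flute(signal, fs, spindle_rpm, n_flutes=3):
    """Split a sensor signal into per-flute segments, assigned cyclically."""
    tooth_period = 60.0 / (spindle_rpm * n_flutes)       # seconds per tooth engagement
    samples_per_tooth = int(round(tooth_period * fs))    # samples per engagement
    segments = {f: [] for f in range(n_flutes)}
    for i in range(0, len(signal) - samples_per_tooth, samples_per_tooth):
        flute = (i // samples_per_tooth) % n_flutes
        segments[flute].append(signal[i:i + samples_per_tooth])
    return {f: np.array(chunks) for f, chunks in segments.items()}

# Example: RMS per flute as a crude per-edge wear indicator.
fs, rpm = 10_000, 3_000
sig = np.random.randn(fs * 2)                  # stand-in for a measured vibration signal
per_flute = segment_by_flute(sig, fs, rpm)
rms = {f: float(np.sqrt((chunks ** 2).mean())) for f, chunks in per_flute.items()}
print(rms)
```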
Procedia PDF Downloads 303
5751 A Machine Learning Based Framework for Education Levelling in Multicultural Countries: UAE as a Case Study
Authors: Shatha Ghareeb, Rawaa Al-Jumeily, Thar Baker
Abstract:
In Abu Dhabi, there are many different education curricula, and the private schools and quality assurance sector supervises a large number of private schools serving many nationalities. As there are many different education curricula in Abu Dhabi to meet expatriates' needs, each has different requirements for registration and success. In addition, there are different age groups for starting education in each curriculum. In fact, each curriculum has a different number of years, assessment techniques, reassessment rules, and exam boards. Currently, students who transfer between curricula are not placed in the right year group, because the start and end dates of each academic year, and the date-of-birth cut-off for each year group, differ between curricula; as a result, we find students who are either too young or too old for their year group, which creates gaps in their learning and performance. In addition, there is no way of storing student data throughout their academic journey so that schools can track the student learning process. In this paper, we propose to develop a computational framework applicable in multicultural countries such as the UAE, in which multiple education systems are implemented. The ultimate goal is to use cloud and fog computing technology, integrated with artificial intelligence and machine learning techniques, to aid in a smooth transition when assigning students to their year groups, and to provide levelling and differentiation information on students who relocate from one education curriculum to another, whilst also having the ability to store and access student data from anywhere throughout their academic journey.
Keywords: admissions, algorithms, cloud computing, differentiation, fog computing, levelling, machine learning
Procedia PDF Downloads 142
5750 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows
Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid
Abstract:
Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing the topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics can be very demanding in both time and cost. Meanwhile, computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged in such a way that the model parameters can be evaluated from measured data. However, this approach is not always possible, and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, an adaptive-control Ensemble Kalman Filter is implemented to assimilate the observation data optimally in order to obtain an accurate estimation of the topography. The main features of this method are, on the one hand, the ability to solve for different complex geometries with no need for any rearrangement of the original model to rewrite it in an explicit form and, on the other hand, its strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples and locations of observations. The obtained results demonstrate the high reliability and accuracy of the proposed technique.
Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil
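The second-stage update can be illustrated with a generic stochastic Ensemble Kalman Filter analysis step, sketched below; the forward shallow-water solver, the adaptivity and the fractional-stage structure of the proposed method are not reproduced here.

```python
# Generic stochastic EnKF analysis step (illustration only, not the authors'
# full adaptive scheme): the ensemble is corrected toward perturbed observations.
import numpy as np

def enkf_update(ensemble, observations, H, obs_cov, rng=np.random.default_rng(0)):
    """ensemble: (n_state, n_members); observations: (n_obs,); H: (n_obs, n_state)."""
    n_state, n_members = ensemble.shape
    anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
    P = anomalies @ anomalies.T / (n_members - 1)              # ensemble covariance
    S = H @ P @ H.T + obs_cov
    K = P @ H.T @ np.linalg.solve(S, np.eye(len(observations)))  # Kalman gain
    obs_pert = observations[:, None] + rng.multivariate_normal(
        np.zeros(len(observations)), obs_cov, size=n_members).T
    return ensemble + K @ (obs_pert - H @ ensemble)

# Toy usage: estimate a 50-point bed profile from 10 noisy free-surface samples.
n, m, p = 50, 30, 10
H = np.zeros((p, n)); H[np.arange(p), np.linspace(0, n - 1, p, dtype=int)] = 1.0
ens = np.random.default_rng(1).normal(size=(n, m))
obs = np.zeros(p)
ens_updated = enkf_update(ens, obs, H, 0.01 * np.eye(p))
print(ens_updated.shape)
```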
Procedia PDF Downloads 130
5749 Surface Characterization of Zincblende and Wurtzite Semiconductors Using Nonlinear Optics
Authors: Hendradi Hardhienata, Tony Sumaryada, Sri Setyaningsih
Abstract:
Current progress in the field of nonlinear optics has enabled precise surface characterization of semiconductor materials. Nonlinear optical techniques are favorable due to their nondestructive measurement and their ability to work in non-vacuum and ambient conditions. The advance of bond hyperpolarizability models opens a wide range of nanoscale surface investigations, including the possibility of detecting molecular orientation at the surface of silicon and zincblende semiconductors, investigating electric-field-induced second harmonic fields at the semiconductor interface, detecting surface impurities and, very recently, studying surface defects such as twin boundaries in wurtzite semiconductors. In this work, we show, using nonlinear optical techniques such as nonlinear bond models, how arbitrary polarization of the incoming electric field in Rotational Anisotropy Spectroscopy experiments can provide more information regarding the origin of the nonlinear sources in zincblende and wurtzite semiconductor structures. In addition, using hyperpolarizability considerations, we describe how the nonlinear susceptibility tensor describing SHG can be well modelled using only a few parameters because of the symmetry of the bonds. We also show how the third-harmonic intensity changes considerably when the incoming field polarization is changed from s-polarized to p-polarized. Finally, we propose a method to investigate surface reconstruction and defects in wurtzite and zincblende structures at the nanoscale level.
Keywords: surface characterization, bond model, rotational anisotropy spectroscopy, effective hyperpolarizability
Procedia PDF Downloads 158
5748 Comparative Study on Sensory Profiles of Liquor from Different Dried Cocoa Beans
Authors: Khairul Bariah Sulaiman, Tajul Aris Yang
Abstract:
Malaysian dried cocoa beans have been reported to have low-quality flavour and are often sold at discounted prices. Various efforts have been made to improve the quality of Malaysian beans. Among these efforts are the introduction of the shallow box fermentation technique and pulp preconditioning through pod storage. However, nearly four decades after these efforts began, Malaysian cocoa farmers still receive lower prices for their beans. Therefore, this study was carried out in order to assess the flavour quality of dried cocoa beans produced by the shallow box fermentation technique and by the combination of shallow box fermentation with pod storage, compared to dried cocoa beans obtained from Ghana. A total of eight samples of dried cocoa beans were used in this study; one sample was Ghanaian beans (coded no. 8), while the rest were Malaysian cocoa beans with different post-harvest processing (coded nos. 1, 2, 3, 4, 5, 6 and 7). Cocoa liquor was prepared from all samples using the prescribed techniques, and sensory evaluation was carried out using the Quantitative Descriptive Analysis (QDA) method on a 0-10 scale by trained Malaysian Cocoa Board panelists. Sensory evaluation showed that the cocoa attribute of all cocoa liquors ranged from 3.5 to 5.3, whereas bitterness ranged from 3.4 to 4.6 and astringency from 3.9 to 5.5. Meanwhile, the acid or sourness attribute of all cocoa liquors ranged from 1.6 to 3.6. In general, the cocoa liquor prepared from sample no. 4 had an almost similar flavour profile to the Ghanaian sample, with no significant difference at p < 0.05 in most flavour attributes, compared to the other six samples.
Keywords: cocoa beans, flavour, fermentation, shallow box, pods storage
Procedia PDF Downloads 394
5747 Exploring the Synergistic Effects of Aerobic Exercise and Cinnamon Extract on Metabolic Markers in Insulin-Resistant Rats through Advanced Machine Learning and Deep Learning Techniques
Authors: Masoomeh Alsadat Mirshafaei
Abstract:
The present study aims to explore the effect of an 8-week aerobic training regimen combined with cinnamon extract on serum irisin and leptin levels in insulin-resistant rats. Additionally, this research leverages various machine learning (ML) and deep learning (DL) algorithms to model the complex interdependencies between exercise, nutrition, and metabolic markers, offering a groundbreaking approach to obesity and diabetes research. Forty-eight Wistar rats were selected and randomly divided into four groups: control, training, cinnamon, and training-cinnamon. The training protocol was conducted over 8 weeks, with sessions 5 days a week at 75-80% VO2 max. The cinnamon and training-cinnamon groups were injected with 200 ml/kg/day of cinnamon extract. Data analysis included serum data, dietary intake, exercise intensity, and metabolic response variables, with blood samples collected 72 hours after the final training session. The dataset was analyzed using one-way ANOVA (P<0.05) and fed into various ML and DL models, including Support Vector Machines (SVM), Random Forest (RF), and Convolutional Neural Networks (CNN). Traditional statistical methods indicated that aerobic training, with and without cinnamon extract, significantly increased serum irisin and decreased leptin levels. Among the algorithms, the CNN model provided superior performance in identifying specific interactions between cinnamon extract concentration and exercise intensity, optimizing the increase in irisin and the decrease in leptin. The CNN model achieved an accuracy of 92%, outperforming the SVM (85%) and RF (88%) models in predicting the optimal conditions for metabolic marker improvements. The study demonstrated that advanced ML and DL techniques can uncover nuanced relationships and potential cellular responses to exercise and dietary supplements that are not evident through traditional methods. These findings advocate for the integration of advanced analytical techniques in nutritional science and exercise physiology, paving the way for personalized health interventions in managing obesity and diabetes.
Keywords: aerobic training, cinnamon extract, insulin resistance, irisin, leptin, convolutional neural networks, exercise physiology, support vector machines, random forest
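A minimal sketch of the classical ML comparison mentioned above (SVM versus random forest) is given below, run on a synthetic stand-in dataset; the actual features (irisin, leptin, intake, intensity) and the CNN branch are not reproduced here.

```python
# SVM vs. random forest comparison on synthetic tabular data; the features are
# placeholders for the rat metabolic/exercise variables described above.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=480, n_features=8, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("SVM", SVC(kernel="rbf", C=1.0)),
                    ("Random Forest", RandomForestClassifier(n_estimators=200,
                                                             random_state=0))]:
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: accuracy = {acc:.2f}")
```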
Procedia PDF Downloads 37
5746 Efficient Microspore Isolation Methods for High Yield Embryoids and Regeneration in Rice (Oryza sativa L.)
Authors: S. M. Shahinul Islam, Israt Ara, Narendra Tuteja, Sreeramanan Subramaniam
Abstract:
Through anther and microspore culture methods, completely homozygous plants can be produced within a year, as compared to the lengthy inbreeding method. Isolated microspore culture is one of the most important techniques for the rapid development of haploid plants. The efficiency of this method is influenced by several factors, such as culture conditions, growth regulators, plant media, pretreatments, the physical and growth conditions of the donor plants, the pollen isolation procedure, etc. The main purpose of this study was to improve the isolated microspore culture protocol in order to increase the efficiency of embryoid induction and regeneration and to reduce albinism. In this study, we tested three different microspore isolation procedures (by glass rod, by homogenizer and by blending) and evaluated their efficiency for gametic embryogenesis. Three types of media were used: washing, pre-culture and induction. The induction medium was AMC (modified MS) supplemented with 2,4-D (2.5 mg/l), kinetin (0.5 mg/l), a higher amount of D-mannitol (90 g/l) instead of sucrose, and two amino acids (L-glutamine and L-serine). Of the three main microspore isolation procedures, homogenizer isolation (P4) showed the best performance in ELS induction (177%) and green plantlets (104%) compared with the other techniques. Albinism occurred in all cases, but microspore isolation from excised anthers by glass rod and by homogenizer produced fewer albino plants, which was also one of the important findings of this study.
Keywords: androgenesis, pretreatment, microspore culture, regeneration, albino plants, Oryza sativa
Procedia PDF Downloads 362
5745 Ant Lion Optimization in a Fuzzy System for Benchmark Control Problem
Authors: Leticia Cervantes, Edith Garcia, Oscar Castillo
Abstract:
Today, there are several control problems where the main objective is to obtain the best possible control in order to decrease the error in the application. Many techniques can be used to control these problems, such as neural networks, PID control, fuzzy logic, optimization techniques and many more. In this work, fuzzy logic, through a fuzzy system, and an optimization technique are used to control the case of study. Specifically, Ant Lion Optimization (ALO) is used to optimize a fuzzy system that controls the velocity of a simple treadmill. The main objective is to achieve control of the velocity in the control problem using the ALO optimization. First, a simple fuzzy system with two inputs (error and error change) and one output (desired speed) was used to control the velocity of the treadmill; results were obtained, but to decrease the error, the ALO optimization was applied to optimize the fuzzy system of the treadmill. With the optimization in place, the simulation was performed, and the results show that using the ALO optimization the control of the velocity was better than with the conventional fuzzy system. This paper describes some basic concepts to help understand the idea of this work and the methodology of the investigation (control problem, fuzzy system design, optimization); the results are then presented, with the optimization applied to the fuzzy system. A comparison between the simple fuzzy system and the optimized fuzzy system is presented, showing that the optimization improved the control with good results. The major finding of the study is that ALO optimization is a good alternative for improving control because it helped to decrease the error in the control application, regardless of the control technique being optimized. As a final statement, it is important to mention that the selected methodology was appropriate because the control of the treadmill was improved using the optimization technique.
Keywords: ant lion optimization, control problem, fuzzy control, fuzzy system
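A minimal sketch of the optimization loop described above is given below; SciPy's differential evolution is used purely as a stand-in for Ant Lion Optimization (which is not available in SciPy), and the treadmill model, membership-function shapes and rule base are illustrative assumptions.

```python
# Tuning a tiny fuzzy speed controller for a first-order treadmill model.
# Differential evolution stands in for ALO; plant and rule base are assumed.
import numpy as np
from scipy.optimize import differential_evolution

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_controller(error, d_error, p):
    # p = [width_e, width_de, out_neg, out_zero, out_pos]
    w_e, w_de, o_n, o_z, o_p = p
    mu_n = tri(error, -2 * w_e, -w_e, 0) + tri(d_error, -2 * w_de, -w_de, 0)
    mu_z = tri(error, -w_e, 0, w_e) + tri(d_error, -w_de, 0, w_de)
    mu_p = tri(error, 0, w_e, 2 * w_e) + tri(d_error, 0, w_de, 2 * w_de)
    num = mu_n * o_n + mu_z * o_z + mu_p * o_p
    return num / (mu_n + mu_z + mu_p + 1e-9)       # weighted-average defuzzification

def cost(p, ref=2.0, dt=0.05, steps=200, tau=0.8):
    v, prev_err, total = 0.0, ref, 0.0
    for _ in range(steps):
        err = ref - v
        u = fuzzy_controller(err, err - prev_err, p)
        v += dt * (u - v) / tau                     # first-order treadmill dynamics
        total += err ** 2
        prev_err = err
    return total

bounds = [(0.1, 3)] * 2 + [(-5, 0), (-1, 1), (0, 5)]
result = differential_evolution(cost, bounds, seed=0, maxiter=50, tol=1e-6)
print("optimized parameters:", result.x, "cost:", result.fun)
```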
Procedia PDF Downloads 399
5744 Evaluating the Radiation Dose Involved in Interventional Radiology Procedures
Authors: Kholood Baron
Abstract:
Radiologic interventional studies use fluoroscopic imaging guidance to perform both diagnostic and therapeutic procedures. These can result in high radiation doses being delivered to the patients and also to the radiology team, due to the prolonged fluoroscopy time and the large number of images taken, even when dose-minimizing techniques and modern fluoroscopic tools are applied. Moreover, these procedures are part of the everyday routine of interventional radiology doctors, assistant nurses, and radiographers. Thus, it is important to estimate the radiation dose they receive in order to give objective advice and reduce the radiation exposure of both the patient and the radiology team. The aim of this study was to determine the total radiation dose reaching the radiologist and the patient during an interventional procedure and to determine the impact of certain parameters on the patient dose. Method: The radiation dose was measured with TLDs (thermoluminescent dosimeters). Physicians, patients, nurses, and radiographers wore TLDs during 12 interventional radiology procedures performed in two hospitals, Mubarak Hospital and the Chest Hospital. This study highlights the need for interventional radiologists to be mindful of the radiation doses received by both patients and medical staff during interventional radiology procedures. The findings emphasize the impact of factors such as fluoroscopy duration and the number of images taken on the patient dose. By raising awareness and providing insights into optimizing techniques and protective measures, this research contributes to the overall goal of reducing radiation doses and ensuring the safety of patients and medical staff.
Keywords: dosimetry, radiation dose, interventional radiology procedures, patient radiation dose
Procedia PDF Downloads 1115743 Cut-Out Animation as an Technic and Development inside History Process
Authors: Armagan Gokcearslan
Abstract:
Since its earliest years, the art of animation has developed very rapidly in terms of script, sound and music, motion, character design, the techniques used and the technological tools developed. Technical variety attracts particular attention in the art of animation. Perceived as a kind of illusion in the beginning, animations commonly used the Flash Sketch technique. Animation artists using the Flash Sketch technique created scenes by drawing them on a blackboard with chalk. The Flash Sketch technique was used by early animation artists like Emile Cohl, Winsor McCay and Blackton. Then, tools like the magic lantern, thaumatrope, phenakistiscope, and zoetrope were developed and came to be used intensively in the first years of the art of animation. Today, on the other hand, the art of animation is affected by developments in computer technology. It is possible to create three-dimensional and two-dimensional animations with the help of various computer software. The cut-out technique is among the important techniques used in the art of animation. The cut-out animation technique is based on the art of paper cutting. Examining cut-out animations, it is observed that they technically resemble the art of paper cutting. The art of paper cutting has a rooted history. It is possible to see the oldest examples of paper cutting in China in the period after the 2nd century B.C., when the Chinese invented paper. The most popular artist using the cut-out animation technique is the German artist Lotte Reiniger. This study, titled 'Cut-Out Animation as a Technique and Its Development within the Historical Process', will cover the art of paper cutting, the relationship between the art of paper cutting and cut-out animation, its development within the historical process, animation artists producing artworks in this field, important cut-out animations, and their technical properties.
Keywords: cut-out, paper art, animation, technique
Procedia PDF Downloads 275
5742 Markowitz and Implementation of a Multi-Objective Evolutionary Technique Applied to the Colombia Stock Exchange (2009-2015)
Authors: Feijoo E. Colomine Duran, Carlos E. Peñaloza Corredor
Abstract:
The modeling of investment component selection (portfolio selection) involves a variety of problems that can be addressed with optimization techniques under evolutionary schemes. By its nature, the problem of selecting investment components involves a dichotomous relationship between two opposing elements: the portfolio's performance and the risk incurred by choosing it. This relationship was modeled by Markowitz as a mean (performance)-variance (risk) problem, i.e., performance must be maximized and risk minimized. This research included the study and implementation of multi-objective evolutionary techniques to solve these problems, taking as the experimental framework equities traded on the Colombia Stock Exchange between 2009 and 2015. Comparisons of three multi-objective evolutionary algorithms, namely the Non-dominated Sorting Genetic Algorithm II (NSGA-II), the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and Indicator-Based Selection in Multiobjective Search (IBEA), were performed using two well-known performance measures, the hypervolume indicator and the R2 indicator, together with a nonparametric statistical analysis using the Wilcoxon rank-sum test. The comparative analysis also includes an evaluation, through the Sharpe ratio, of the financial efficiency of the investment portfolios chosen by the implementation of the various algorithms. It is shown that the portfolios provided by the implementation of the algorithms mentioned above are very well positioned among the different stock indices provided by the Colombia Stock Exchange.
Keywords: finance, optimization, portfolio, Markowitz, evolutionary algorithms
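The mean-variance trade-off underlying the study can be illustrated with the sketch below, which samples random long-only portfolios, extracts the non-dominated (Pareto) set and computes Sharpe ratios; NSGA-II, SPEA2, IBEA and the hypervolume/R2 indicators themselves are not reproduced here, and the return data is synthetic.

```python
# Random-portfolio illustration of the Markowitz objectives (return vs. risk);
# the return series is synthetic, standing in for Colombian equity data.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=(1000, 6))    # stand-in daily returns, 6 assets
mu, cov = returns.mean(axis=0), np.cov(returns.T)

def evaluate(w):
    return w @ mu, np.sqrt(w @ cov @ w)               # (expected return, risk)

weights = rng.dirichlet(np.ones(6), size=5000)        # long-only, fully invested
perf = np.array([evaluate(w) for w in weights])       # columns: return, risk

def non_dominated(points):
    """Indices of portfolios not dominated in (maximize return, minimize risk)."""
    keep = []
    for i, (r_i, s_i) in enumerate(points):
        dominated = np.any((points[:, 0] >= r_i) & (points[:, 1] <= s_i) &
                           ((points[:, 0] > r_i) | (points[:, 1] < s_i)))
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = non_dominated(perf)
sharpe = perf[front, 0] / perf[front, 1]               # risk-free rate assumed 0
best = front[np.argmax(sharpe)]
print("best Sharpe portfolio weights:", weights[best].round(3))
```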
Procedia PDF Downloads 302
5741 Management of Urban Watering: A Study of the Application of Technologies and Legislation in Goiania, Brazil
Authors: Vinicius Marzall, Jussanã Milograna
Abstract:
Urban stormwater drainage remains a major challenge for most Brazilian cities. Not unlike most of them, Goiania, a state capital located in the Midwest of the country, has little legislation on the subject and only one registered compensatory drainage technique. This paper aims to present some solutions adopted in other Brazilian cities with consolidated legislation, suggesting techniques for detention tanks on a building site. The study analyzed and compared the legislation of Curitiba, Porto Alegre and Sao Paulo with the current legislation and policies of Goiania. After this, models were created with the adopted data for sizing detention tanks using the envelope curve method, considering synthetic series of intense precipitation and building sites between 250 m² and 600 m² with an imperviousness ratio of 50%. The results showed great differences between the legislation of Goiania and the documentation of the other cities analyzed, for example in the number of drainage techniques applied to the reality of each city, the educational actions to raise public awareness about the care of watercourses, and the political management of dedicated funds for drainage matters. Besides, the use of detention tanks proved practicable, given that the tank occupies less than 3% of the building site, whatever the size of the terrain, restoring the outflow to pre-development rates in extreme rainfall events. Also, a linear equation was developed to size the detention tank based on the building site area in Goiania, making the calculation and implementation simpler for non-specialists.
Keywords: clean technology, legislation, rainwater management, urban drainwater
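The envelope curve method mentioned above can be sketched as below; the IDF curve, runoff coefficient and allowable discharge are illustrative assumptions, not the values adopted for Goiania in the study.

```python
# Envelope-curve sizing sketch: required storage is the maximum, over storm
# durations, of inflow volume minus allowable outflow volume. IDF coefficients,
# runoff coefficient and allowable discharge are assumed values.
def detention_volume(area_m2, imperviousness=0.5, q_allow_l_s=2.0,
                     idf=lambda t_min: 3000.0 / (t_min + 20.0)):  # i in mm/h (assumed IDF)
    """Return required storage (m^3) by the envelope-curve method."""
    c = imperviousness                      # runoff coefficient taken as imperviousness ratio
    best = 0.0
    for t_min in range(5, 241, 5):          # storm durations from 5 min to 4 h
        i_mm_h = idf(t_min)
        inflow = c * (i_mm_h / 1000.0 / 3600.0) * area_m2 * (t_min * 60.0)   # m^3
        outflow = (q_allow_l_s / 1000.0) * (t_min * 60.0)                    # m^3
        best = max(best, inflow - outflow)
    return best

for site in (250, 400, 600):                # building sites in the range studied (m^2)
    v = detention_volume(site)
    print(f"{site} m2 site -> {v:.1f} m3 "
          f"({100 * v / site:.1f}% of the site at 1 m depth)")
```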
Procedia PDF Downloads 159
5740 Visual Inspection of Road Conditions Using Deep Convolutional Neural Networks
Authors: Christos Theoharatos, Dimitris Tsourounis, Spiros Oikonomou, Andreas Makedonas
Abstract:
This paper focuses on the problem of visually inspecting and recognizing the road conditions in front of moving vehicles, targeting automotive scenarios. The goal of road inspection is to identify whether the road is slippery or not, as well as to detect possible anomalies on the road surface, like potholes or body bumps/humps. Our work is based on an artificial intelligence methodology for real-time monitoring of road conditions in autonomous driving scenarios, using state-of-the-art deep convolutional neural network (CNN) techniques. Initially, the road and ego lane are segmented within the field of view of the camera that is integrated into the front part of the vehicle. A novel classification CNN is utilized to distinguish between plain and slippery road textures (e.g., wet, snow, etc.). Simultaneously, a robust detection CNN identifies severe surface anomalies within the ego lane, such as potholes and speed bumps/humps, within a distance of 5 to 25 meters. The overall methodology is illustrated within the scope of an integrated application (or system), which can be incorporated into complete Advanced Driver-Assistance Systems (ADAS) that provide a full range of functionalities. The proposed techniques present state-of-the-art detection and classification results and real-time performance running on AI accelerator devices like Intel's Myriad 2/X Vision Processing Unit (VPU).
Keywords: deep learning, convolutional neural networks, road condition classification, embedded systems
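A compact sketch of the texture-classification branch is given below; the backbone, input size and class set are assumptions for illustration, and the segmentation and anomaly-detection networks of the paper are not reproduced.

```python
# Small transfer-learning-style classifier over ego-lane image crops; backbone,
# resolution and class set (plain / wet / snow) are illustrative assumptions.
import tensorflow as tf

def road_texture_classifier(n_classes=3, input_shape=(96, 96, 3)):
    backbone = tf.keras.applications.MobileNetV2(include_top=False,
                                                 weights=None,
                                                 input_shape=input_shape)
    model = tf.keras.Sequential([
        backbone,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(n_classes, activation="softmax"),  # plain / wet / snow
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = road_texture_classifier()
model.summary()
```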
Procedia PDF Downloads 134
5739 Pricing Strategy in Marketing: Balancing Value and Profitability
Authors: Mohsen Akhlaghi, Tahereh Ebrahimi
Abstract:
Pricing strategy is a vital component in achieving the balance between customer value and business profitability. The aim of this study is to provide insights into the factors, techniques, and approaches involved in pricing decisions. The study utilizes a descriptive approach to discuss various aspects of pricing strategy in marketing, drawing on concepts from market research, consumer psychology, competitive analysis, and adaptability. This approach presents a comprehensive view of pricing decisions. The result of this exploration is a framework that highlights key factors influencing pricing decisions. The study examines how factors such as market positioning, product differentiation, and brand image shape pricing strategies. Additionally, it emphasizes the role of consumer psychology in understanding price elasticity, perceived value, and price-quality associations that influence consumer behavior. Various pricing techniques, including charm pricing, prestige pricing, and bundle pricing, are mentioned as methods to enhance sales by influencing consumer perceptions. The study also underscores the importance of adaptability in responding to market dynamics through regular price monitoring, dynamic pricing, and promotional strategies. It recognizes the role of digital platforms in enabling personalized pricing and dynamic pricing models. In conclusion, the study emphasizes that effective pricing strategies strike a balance between customer value and business profitability, ultimately driving sales, enhancing brand perception, and fostering lasting customer relationships.
Keywords: business, customer benefits, marketing, pricing
Procedia PDF Downloads 79
5738 Valence and Arousal-Based Sentiment Analysis: A Comparative Study
Authors: Usama Shahid, Muhammad Zunnurain Hussain
Abstract:
This research paper presents a comprehensive analysis of a sentiment analysis approach that employs valence and arousal as its foundational pillars, in comparison to traditional techniques. Sentiment analysis is an indispensable task in natural language processing that involves the extraction of opinions and emotions from textual data. The valence and arousal dimensions, representing the positivity/negativity and the intensity of emotions, respectively, enable the creation of four quadrants, each representing a specific emotional state. The study seeks to determine the impact of utilizing these quadrants to identify distinct emotional states on the accuracy and efficiency of sentiment analysis, in comparison to traditional techniques. The results reveal that the valence and arousal-based approach outperforms other approaches, particularly in identifying nuanced emotions that may be missed by conventional methods. The study's findings are crucial for applications such as social media monitoring and market research, where the accurate classification of emotions and opinions is paramount. Overall, this research highlights the potential of using valence and arousal as a framework for sentiment analysis and offers invaluable insights into the benefits of incorporating specific types of emotions into the analysis. These findings have significant implications for researchers and practitioners in the field of natural language processing, as they provide a basis for the development of more accurate and effective sentiment analysis tools.
Keywords: sentiment analysis, valence and arousal, emotional states, natural language processing, machine learning, text analysis, sentiment classification, opinion mining
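The four-quadrant mapping can be sketched as below; the quadrant labels follow common conventions in the affective-computing literature rather than the exact categories used in the paper, and valence/arousal are assumed to be scaled to [-1, 1].

```python
# Mapping a (valence, arousal) pair to one of four coarse emotional states;
# labels and scaling are assumptions for illustration.
def quadrant(valence, arousal):
    if valence >= 0 and arousal >= 0:
        return "happy/excited"        # high valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry/stressed"       # low valence, high arousal
    if valence < 0 and arousal < 0:
        return "sad/bored"            # low valence, low arousal
    return "calm/content"             # high valence, low arousal

for v, a in [(0.7, 0.6), (-0.5, 0.8), (-0.6, -0.4), (0.4, -0.7)]:
    print((v, a), "->", quadrant(v, a))
```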
Procedia PDF Downloads 101
5737 Comprehensive Feature Extraction for Optimized Condition Assessment of Fuel Pumps
Authors: Ugochukwu Ejike Akpudo, Jank-Wook Hur
Abstract:
The increasing demand for improved productivity, maintainability, and reliability has prompted rapidly increasing research on the emerging condition-based maintenance concept of prognostics and health management (PHM). Various fuel pumps serve critical functions in several hydraulic systems; hence, their failure can have daunting effects on productivity, safety, etc. The need for condition monitoring and assessment of these pumps cannot be overemphasized, and this has led to an upsurge in research studies on standard feature extraction techniques for optimized condition assessment of fuel pumps. By extracting time-based, frequency-based and the more robust time-frequency based features from their vibrational signals, a more comprehensive feature assessment (and selection) can be achieved for a more accurate and reliable condition assessment of these pumps. With the aid of emerging classification and regression algorithms like locally linear embedding (LLE), we propose a method for comprehensive condition assessment of electromagnetic fuel pumps (EMFPs). Results show that LLE, as a comprehensive feature extraction technique, yields better feature fusion/dimensionality reduction results for condition assessment of EMFPs than the use of single features. Also, unlike other feature fusion techniques, its capabilities as a fault classification technique were explored, and the results show an acceptable accuracy level using standard performance metrics for evaluation.
Keywords: electromagnetic fuel pumps, comprehensive feature extraction, condition assessment, locally linear embedding, feature fusion
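The comprehensive feature route described above can be sketched as below, extracting a few time- and frequency-domain features from windowed vibration signals and fusing them with scikit-learn's LocallyLinearEmbedding; the window length, feature set and LLE settings are illustrative assumptions.

```python
# Time/frequency feature extraction from windowed vibration data followed by
# LLE-based fusion; signal, window length and feature set are assumed.
import numpy as np
from scipy.stats import kurtosis
from sklearn.manifold import LocallyLinearEmbedding

def window_features(window, fs):
    rms = np.sqrt(np.mean(window ** 2))
    crest = np.max(np.abs(window)) / (rms + 1e-12)
    kurt = kurtosis(window)
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return [rms, crest, kurt, centroid]

fs, win = 10_000, 1024
signal = np.random.randn(fs * 5)                        # stand-in for pump vibration
windows = [signal[i:i + win] for i in range(0, len(signal) - win, win)]
X = np.array([window_features(w, fs) for w in windows])

embedding = LocallyLinearEmbedding(n_neighbors=10, n_components=2, random_state=0)
X_fused = embedding.fit_transform(X)                    # fused low-dimensional features
print(X_fused.shape)
```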
Procedia PDF Downloads 117
5736 Analysis of Ionospheric Variations over Japan during 23rd Solar Cycle Using Wavelet Techniques
Authors: C. S. Seema, P. R. Prince
Abstract:
The characterization of spatio-temporal inhomogeneities occurring in the ionospheric F₂ layer is important since these variations are direct consequences of the electrodynamical coupling between the magnetosphere and solar events. The temporal and spatial variations of the F₂ layer, which occur with periods of several days or even years, are mainly due to geomagnetic and meteorological activities. The hourly F₂-layer critical frequency (foF2) over the 23rd solar cycle (1996-2008) from three ionosonde stations (Wakkanai, Kokubunji, and Okinawa) in the northern hemisphere, which fall within the same longitudinal span, is analyzed using continuous wavelet techniques. The Morlet wavelet is used to transform the continuous time series of foF2 into a two-dimensional time-frequency space, quantifying the time evolution of the oscillatory modes. The presence of significant time patterns (periodicities) at a particular time and the time location of each periodicity are detected from the two-dimensional representation of the wavelet power, in the plane of scale and period of the time series. The mean strength of each periodicity over the entire period of analysis is studied using the global wavelet spectrum. The quasi-biennial, annual, semiannual, 27-day, diurnal and 12-hour variations of foF2 are clearly evident in the wavelet power spectra at all three stations. Critical frequency oscillations with multi-day periods (2-3 days and 9 days at the low-latitude station, 6-7 days at all stations and 15 days at the mid-high latitude station) are also superimposed on larger time-scale variations.
Keywords: continuous wavelet analysis, critical frequency, ionosphere, solar cycle
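The wavelet workflow can be sketched as below on a synthetic hourly series standing in for foF2; the scale range and the use of PyWavelets' Morlet ('morl') wavelet are assumptions for illustration.

```python
# Continuous wavelet transform and global wavelet spectrum on a synthetic
# hourly series with 12 h and ~27 day components (stand-in for foF2 data).
import numpy as np
import pywt

dt_hours = 1.0
t = np.arange(0, 24 * 365)                               # one year of hourly samples
series = (np.sin(2 * np.pi * t / 12.0)                   # semidiurnal component
          + 0.5 * np.sin(2 * np.pi * t / (27 * 24.0))    # ~27-day component
          + 0.3 * np.random.default_rng(0).normal(size=t.size))

scales = np.arange(1, 800)
coeffs, freqs = pywt.cwt(series, scales, "morl", sampling_period=dt_hours)
power = np.abs(coeffs) ** 2
global_spectrum = power.mean(axis=1)                     # mean power over time, per scale
periods_hours = 1.0 / freqs
print("dominant period (h):", periods_hours[np.argmax(global_spectrum)])
```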
Procedia PDF Downloads 220
5735 Modular 3D Environmental Development for Augmented Reality
Authors: Kevin William Taylor
Abstract:
This work used industry-standard practices and technologies as a foundation to explore current and future advancements in modularity for 3D environment production. Covering environment generation and AI-assisted generation, this study investigated how these areas will shape the industry's goal of achieving full immersion within augmented reality environments. The study explores modular environment construction techniques utilized in large-scale 3D productions, including the reasoning behind this approach to production, the principles of successful development, potential pitfalls, and different methodologies for successful implementation in commercial and proprietary interactive engines. A focus is placed on the role of 3D artists in the future of environment development, which will require adaptability to new approaches as the field evolves in response to tandem technological advancements. Industry findings and projections theorize how these factors will impact the widespread utilization of augmented reality in daily life and will continue to inform the direction of technology towards expansive interactive environments. They will change the tools and techniques utilized in the development of environments for games, film, and VFX. This study concludes that this technology will be the cornerstone for the creation of AI-driven AR that is able to fully theme our world and change how we see and engage with one another. This will impact the concept of a virtual self-identity, which will become as prevalent as real-world identity. While this progression scares or even threatens some, it is safe to say that we are seeing the beginnings of a technological revolution that will surpass the impact that the smartphone had on modern society.
Keywords: virtual reality, augmented reality, training, 3D environments
Procedia PDF Downloads 122
5734 Influence of Surface Preparation Effects on the Electrochemical Behavior of 2098-T351 Al–Cu–Li Alloy
Authors: Rejane Maria P. da Silva, Mariana X. Milagre, João Victor de S. Araujo, Leandro A. de Oliveira, Renato A. Antunes, Isolda Costa
Abstract:
Al-Cu-Li alloys are advanced materials for aerospace applications because of their interesting mechanical properties and low density when compared with conventional Al alloys. However, Al-Cu-Li alloys are susceptible to localized corrosion. The near-surface deformed layer (NSDL) induced by the rolling process during the production of the alloy, and its removal by polishing, can influence the corrosion susceptibility of these alloys. In this work, the influence of surface preparation on the electrochemical activity of AA2098-T351 (an Al-Cu-Li alloy) was investigated using a correlation between surface chemistry, microstructure, and electrochemical activity. Two conditions were investigated: the polished and the as-received surfaces of the alloy. The morphology of the two types of surfaces was investigated using confocal laser scanning microscopy (CLSM) and optical microscopy. The surface chemistry was analyzed by X-ray photoelectron spectroscopy (XPS) and energy dispersive X-ray spectroscopy (EDS). Global electrochemical techniques (potentiodynamic polarization and EIS) and a local electrochemical technique (localized electrochemical impedance spectroscopy, LEIS) were used to examine the electrochemical activity of the surfaces. The results obtained in this study showed that, for the as-received surface, the near-surface deformed layer (NSDL), which is composed of Mg-rich bands, influenced the electrochemical behavior of the alloy. The results also showed higher electrochemical activity for the polished surface condition compared to the as-received one.
Keywords: Al-Cu-Li alloys, surface preparation effects, electrochemical techniques, localized corrosion
Procedia PDF Downloads 159
5733 Participation in IAEA Proficiency Test to Analyse Cobalt, Strontium and Caesium in Seawater Using Direct Counting and Radiochemical Techniques
Authors: S. Visetpotjanakit, C. Khrautongkieo
Abstract:
Radiation monitoring in the environment and foodstuffs is one of the main responsibilities of the Office of Atoms for Peace (OAP), the nuclear regulatory body of Thailand. The main goal of the OAP is to assure the safety of the Thai people and the environment from any radiological incident. Various radioanalytical methods have been developed to monitor radiation and radionuclides in environmental and foodstuff samples. To validate our analytical performance, several proficiency test exercises from the International Atomic Energy Agency (IAEA) have been performed. Here, the results of a proficiency test exercise referred to as the Proficiency Test for Tritium, Cobalt, Strontium and Caesium Isotopes in Seawater 2017 (IAEA-RML-2017-01) are presented. All radionuclides except ³H were analysed using various radioanalytical methods, i.e., direct gamma-ray counting for determining ⁶⁰Co, ¹³⁴Cs and ¹³⁷Cs, and developed radiochemical techniques for analysing ¹³⁴Cs and ¹³⁷Cs using an AMP pre-concentration technique and ⁹⁰Sr using a di-(2-ethylhexyl) phosphoric acid (HDEHP) liquid extraction technique. The analysis results were submitted to the IAEA. All results passed the IAEA criteria, i.e., accuracy, precision and trueness, and obtained 'Accepted' status. This confirms the quality of the data from the OAP environmental radiation laboratory for monitoring radiation in the environment.
Keywords: international atomic energy agency, proficiency test, radiation monitoring, seawater
Procedia PDF Downloads 1715732 Effects of Auxetic Antibacterial Zwitterion Carboxylate and Sulfate Copolymer Hydrogels for Diabetic Wound Healing Application
Authors: Udayakumar Vee, Franck Quero
Abstract:
Zwitterionic polymers have generally been viewed as a new class of antimicrobial and non-fouling materials. They offer broad versatility for chemical modification and hence great freedom for accurate molecular design, bearing an equimolar number of homogeneously distributed anionic and cationic groups along their polymer chains. This study explores the effectiveness of auxetic zwitterion carboxylate/sulfonate hydrogels in a diabetes-induced mouse model. A series of silver-doped auxetic zwitterion carboxylate/sulfonate/vinylaniline copolymer hydrogels is designed via a 3D printer. The zwitterion monomers have been characterized by FT-IR and NMR techniques. The effect of changing the monomers and of different loading ratios of Ag over zwitterion on the antimicrobial properties and biocompatibility of the final hydrogel materials will be investigated in detail. The synthesized auxetic hydrogels have been characterized using a wide range of techniques to help establish the relationship between the molecular-level and macroscopic properties of these materials, including mechanical and antibacterial properties, biocompatibility and wound healing ability. The comparative studies and results of this work provide new insights and guide us in choosing a better auxetic structured material for a broad spectrum of wound healing applications in the animal model. We expect this approach to provide a versatile and robust platform for biomaterial design that could lead to promising treatments for wound healing applications.
Keywords: auxetic, zwitterion, carboxylate, sulfonate, polymer, wound healing
Procedia PDF Downloads 1405731 A Grey-Box Text Attack Framework Using Explainable AI
Authors: Esther Chiramal, Kelvin Soh Boon Kai
Abstract:
Explainable AI is a strong strategy implemented to understand complex black-box model predictions in a human-interpretable language. It provides the evidence required to deploy trustworthy and reliable AI systems. However, it also opens the door to locating possible vulnerabilities in an AI model. Traditional adversarial text attacks use word substitution, data augmentation techniques, and gradient-based attacks on powerful pre-trained Bidirectional Encoder Representations from Transformers (BERT) variants to generate adversarial sentences. These attacks are generally white-box in nature and not practical, as they can be easily detected by humans, e.g., changing the word from 'Poor' to 'Rich'. We propose a simple yet effective grey-box cum black-box approach that does not require knowledge of the model, while using a set of surrogate Transformer/BERT models to perform the attack using explainable AI techniques. As Transformers are the current state-of-the-art models for almost all Natural Language Processing (NLP) tasks, an attack generated on BERT1 is transferable to BERT2. This transferability is made possible by the attention mechanism in the transformer, which allows the model to capture long-range dependencies in a sequence. Using the power of BERT generalisation via attention, we attempt to exploit how transformers learn by attacking a few surrogate transformer variants, which are all based on different architectures. We demonstrate that this approach is highly effective in generating semantically good sentences by changing as little as one word, in a way that is not detectable by humans while still fooling other BERT models.
Keywords: BERT, explainable AI, Grey-box text attack, transformer
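The grey-box idea can be sketched as below: word importance on a surrogate classifier is estimated by deletion (a simple stand-in for an explainability method) and the most influential word is substituted; the surrogate model, substitution lexicon and scoring are illustrative assumptions, not the framework of the paper.

```python
# Single-word substitution attack against a surrogate sentiment classifier.
# Deletion-based importance stands in for a full explainability method.
from transformers import pipeline

surrogate = pipeline("sentiment-analysis")          # surrogate BERT-family classifier

def score(text):
    out = surrogate(text)[0]
    return out["score"] if out["label"] == "POSITIVE" else -out["score"]

def attack(sentence, substitutes):
    words = sentence.split()
    base = score(sentence)
    # Rank words by how much their removal changes the surrogate's score.
    drops = [abs(base - score(" ".join(words[:i] + words[i + 1:])))
             for i in range(len(words))]
    target = max(range(len(words)), key=lambda i: drops[i])
    best = sentence
    for cand in substitutes.get(words[target].lower(), []):
        perturbed = " ".join(words[:target] + [cand] + words[target + 1:])
        if abs(score(perturbed) - base) > abs(score(best) - base):
            best = perturbed
    return best

subs = {"good": ["decent", "passable"], "bad": ["subpar", "shoddy"]}  # assumed lexicon
print(attack("the movie was surprisingly good", subs))
```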
Procedia PDF Downloads 1375730 The Clustering of Multiple Sclerosis Subgroups through L2 Norm Multifractal Denoising Technique
Authors: Yeliz Karaca, Rana Karabudak
Abstract:
Multifractal denoising techniques are used in the identification of significant attributes by removing noise from the dataset. Magnetic resonance (MR) imaging is the most sensitive method for identifying chronic disorders of the nervous system such as multiple sclerosis (MS). MRI and Expanded Disability Status Scale (EDSS) data belonging to 120 individuals who have one of the subgroups of MS (Relapsing-Remitting MS (RRMS), Secondary Progressive MS (SPMS), Primary Progressive MS (PPMS)), as well as 19 healthy individuals in the control group, have been used in this study. The study is comprised of the following stages: (i) The L2-norm multifractal denoising technique, one of the multifractal techniques, has been applied to the MS data (MRI and EDSS); in this way, a new dataset has been obtained. (ii) The new MS dataset, obtained from the original MS dataset via the L2 multifractal denoising technique, has been fed to the K-Means and Fuzzy C-Means (FCM) clustering algorithms, which are among the unsupervised methods; thus, their clustering performances have been compared. (iii) In the identification of significant attributes in the MS dataset through the multifractal denoising (L2-norm) technique, using the K-Means and FCM algorithms on the MS subgroups and the control group of healthy individuals, excellent performance has been obtained. According to the clustering results based on the MS subgroups obtained in the study, successful clustering results have been achieved with the K-Means and FCM algorithms by applying the L2-norm multifractal denoising technique to the MS dataset. Clustering performance has been more successful with the denoised dataset (L2-norm MS dataset), in which significant attributes are obtained by applying the L2-norm denoising technique, for both K-Means and FCM.
Keywords: clinical decision support, clustering algorithms, multiple sclerosis, multifractal techniques
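Stage (ii) can be sketched as below, running K-Means and a small hand-written Fuzzy C-Means on the same feature matrix and comparing silhouette scores; the synthetic data stands in for the denoised MRI/EDSS features, and the L2-norm multifractal denoising step itself is not reproduced.

```python
# K-Means vs. a minimal Fuzzy C-Means on synthetic data (139 samples, 4 groups),
# standing in for the denoised MS/control feature matrix.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=139, centers=4, n_features=5, random_state=0)

km_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

def fuzzy_c_means(X, c=4, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))              # membership matrix
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = 1.0 / dist ** (2 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)            # standard FCM update
    return U.argmax(axis=1)

fcm_labels = fuzzy_c_means(X)
print("K-Means silhouette:", round(silhouette_score(X, km_labels), 3))
print("FCM silhouette:   ", round(silhouette_score(X, fcm_labels), 3))
```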
Procedia PDF Downloads 1685729 Antibacterial Zwitterion Carboxylate and Sulfonate Copolymer Auxetic Hydrogels for Diabetic Wound Healing Application
Authors: Udayakumar Veerabagu, Franck Quero
Abstract:
Zwitterion carboxylate and sulfonate polymers have generally been viewed as a new class of antimicrobial and non-fouling materials. They offer broad versatility for chemical modification and hence great freedom for accurate molecular design, bearing an equimolar number of homogeneously distributed anionic and cationic groups along their polymer chains. This study explores the effectiveness of auxetic zwitterion carboxylate/sulfonate hydrogels in a diabetes-induced mouse model. A series of silver-doped auxetic zwitterion carboxylate/sulfonate/vinylaniline copolymer hydrogels is designed via a 3D printer. The zwitterion monomers have been characterized by FT-IR and NMR techniques. The effect of changing the monomers and of different loading ratios of Ag over zwitterion on the antimicrobial properties and biocompatibility of the final hydrogel materials will be investigated in detail. The synthesized auxetic hydrogels have been characterized using a wide range of techniques to help establish the relationship between the molecular-level and macroscopic properties of these materials, including mechanical and antibacterial properties, biocompatibility and wound healing ability. The comparative studies and results of this work provide new insights and guide us in choosing a better auxetic structured material for a broad spectrum of wound healing applications in the animal model. We expect this approach to provide a versatile and robust platform for biomaterial design that could lead to promising treatments for wound healing applications.
Keywords: auxetic, zwitterion, carboxylate, sulfonate, polymer, wound healing
Procedia PDF Downloads 1565728 Porcelain Paste Processing by Robocasting 3D: Parameters Tuning
Authors: A. S. V. Carvalho, J. Luis, L. S. O. Pires, J. M. Oliveira
Abstract:
Additive manufacturing (AM) technologies have experienced remarkable growth in recent years due to the development and diffusion of a wide range of three-dimensional (3D) printing techniques. Nowadays, techniques like fused filament fabrication are available to non-industrial users, while techniques like 3D printing, PolyJet, selective laser sintering and stereolithography are mainly found in industry. Robocasting (R3D) shows great potential due to its ability to shape materials with a wide range of viscosities. Industrial porcelain compositions showing different rheological behaviour can be prepared and used as candidate materials to be processed by R3D. The use of this AM technique in industry is still very limited. In this work, a specific porcelain composition with suitable rheological properties will be processed by R3D, and a systematic study of printing parameter tuning will be shown. The porcelain composition was formulated based on an industrial spray-dried porcelain powder. The powder particle size and morphology were analysed. The powders were mixed with water and an organic binder in a ball mill at 200 rpm for 24 hours. The batch viscosity was adjusted by the addition of an acid solution, and the batch was mixed again. The paste density, viscosity, zeta potential, particle size distribution and pH were determined. In the R3D system, different speed and pressure settings were studied to assess their impact on the fabrication of porcelain models. These models were dried at 80 °C for 24 hours and sintered in air at 1350 °C for 2 hours. The stability of the models, their walls and their surface quality were studied, and their physical properties were assessed. The studied processing parameters have a high impact on the quality of the models; moreover, they have a high impact on the stacking of the filaments. The adequate tuning of the parameters has a huge influence on the final properties of the porcelain models. This work contributes to a better assimilation of AM technologies in the ceramic industry. Acknowledgments: The RoboCer3D project, a project of additive rapid manufacturing through 3D printing of ceramic material (POCI-01-0247-FEDER-003350), financed by Compete 2020, PT 2020 and the European Regional Development Fund (FEDER) through the International and Competitive Operational Program (POCI) under the PT2020 partnership agreement.
Keywords: additive manufacturing, porcelain, robocasting, R3D
Procedia PDF Downloads 1625727 Process Monitoring Based on Parameterless Self-Organizing Map
Authors: Young Jae Choung, Seoung Bum Kim
Abstract:
Statistical Process Control (SPC) is a popular technique for process monitoring. A widely used tool in SPC is the control chart, which is used to detect the abnormal status of a process and maintain the controlled status of the process. Traditional control charts, such as Hotelling's T² control chart, are effective techniques for detecting abnormal observations and monitoring processes. However, many complicated manufacturing systems exhibit nonlinearity because of the different demands of the market. In this case, the unregulated use of a traditional linear modeling approach may not be effective. In reality, many industrial processes exhibit nonlinear and time-varying properties because of fluctuations in process raw materials, slow shifts of the set points, aging of the main process components, seasonal effects, and catalyst deactivation. The use of traditional SPC techniques with time-varying data will degrade the performance of the monitoring scheme. To address these issues, in the present study, we propose a parameterless self-organizing map (PLSOM)-based control chart. The PLSOM-based control chart can not only manage a situation where the distribution or parameters of the target observations change, but also address the nonlinearity of modern manufacturing systems. The control limits of the proposed PLSOM chart are established by estimating the empirical level of significance on the percentile using a bootstrap method. Experimental results with simulated data and actual process data from a thin-film transistor-liquid crystal display process demonstrated the effectiveness and usefulness of the proposed chart.
Keywords: control chart, parameter-less self-organizing map, self-organizing map, time-varying property
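The monitoring idea can be sketched as below: a SOM is trained on in-control data, each observation is scored by its quantization error, and a control limit is obtained from a bootstrap percentile of those errors; a standard SOM with a decaying neighborhood is used here as a stand-in for the parameterless (PLSOM) update rule, which is not spelled out in the abstract.

```python
# SOM-based monitoring statistic with a bootstrap-percentile control limit.
# A plain SOM replaces the PLSOM update rule; data and grid size are assumed.
import numpy as np

def train_som(data, grid=(6, 6), iters=3000, seed=0):
    rng = np.random.default_rng(seed)
    rows, cols = grid
    W = rng.normal(size=(rows, cols, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                  indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(W - x, axis=2)
        bmu = np.unravel_index(np.argmin(d), d.shape)      # best-matching unit
        sigma = 2.0 * np.exp(-t / iters)                   # shrinking neighborhood
        lr = 0.5 * np.exp(-t / iters)                      # decaying learning rate
        h = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2) / (2 * sigma ** 2))
        W += lr * h[..., None] * (x - W)
    return W

def quantization_error(W, x):
    return float(np.min(np.linalg.norm(W - x, axis=2)))

rng = np.random.default_rng(1)
in_control = rng.normal(size=(500, 4))                     # stand-in for process data
W = train_som(in_control)
qe = np.array([quantization_error(W, x) for x in in_control])
boot = [np.percentile(rng.choice(qe, size=len(qe), replace=True), 99)
        for _ in range(200)]
control_limit = float(np.mean(boot))                       # bootstrap 99th-percentile UCL

new_obs = rng.normal(loc=3.0, size=4)                      # a shifted observation
print("QE:", quantization_error(W, new_obs), "limit:", control_limit,
      "out-of-control:", quantization_error(W, new_obs) > control_limit)
```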
Procedia PDF Downloads 2755726 Pellegrini-Stieda Syndrome: A Physical Medicine and Rehabilitation Approach
Authors: Pedro Ferraz-Gameiro
Abstract:
Introduction: The Pellegrini-Stieda lesion is the result of post-traumatic calcification and/or ossification of the medial collateral ligament (MCL) of the knee. When this calcification is accompanied by gonalgia (knee pain) and limitation of knee flexion, it is called Pellegrini-Stieda syndrome. The pathogenesis is probably the calcification of a post-traumatic hematoma at least three weeks after the initial trauma, or secondary to repetitive microtrauma. On anteroposterior radiographs, a Pellegrini-Stieda lesion appears as a linear vertical ossification or calcification of the proximal portion of the MCL, usually near the medial femoral condyle. Patients with Pellegrini-Stieda syndrome present with knee pain associated with loss of range of motion. Treatment is usually conservative, with analgesic and anti-inflammatory drugs, either systemic or intra-articular. Physical medicine and rehabilitation techniques associated with shock wave therapy can be a way of reducing pain and inflammation. Patients with persistent instability and significant limitation of knee mobility may require surgical excision. Methods: Research was done using PubMed Central with the term 'Pellegrini-Stieda syndrome'. Discussion/conclusion: Medical treatment is the rule, with initial rest, anti-inflammatory drugs, and physiotherapy. If left untreated, this ossification can potentially form a significant bone mass, which can compromise the range of motion of the knee. Physical medicine and rehabilitation techniques associated with shock wave therapy are a way of reducing pain and inflammation.
Keywords: knee, Pellegrini-Stieda syndrome, rehabilitation, shock waves therapy
Procedia PDF Downloads 140