Search results for: hemsball shooting techniques
5785 Surface Characterization of Zincblende and Wurtzite Semiconductors Using Nonlinear Optics
Authors: Hendradi Hardhienata, Tony Sumaryada, Sri Setyaningsih
Abstract:
Current progress in the field of nonlinear optics has enabled precise surface characterization of semiconductor materials. Nonlinear optical techniques are favorable because they are nondestructive and can operate in non-vacuum, ambient conditions. The advance of bond hyperpolarizability models opens a wide range of nanoscale surface investigations, including the possibility of detecting molecular orientation at the surface of silicon and zincblende semiconductors, investigating electric-field-induced second harmonic fields at the semiconductor interface, detecting surface impurities and, very recently, studying surface defects such as twin boundaries in wurtzite semiconductors. In this work, we show, using nonlinear optical techniques such as nonlinear bond models, how arbitrary polarization of the incoming electric field in Rotational Anisotropy Spectroscopy experiments can provide more information regarding the origin of the nonlinear sources in zincblende and wurtzite semiconductor structures. In addition, using hyperpolarizability considerations, we describe how the nonlinear susceptibility tensor describing SHG can be well modelled using only a few parameters because of the symmetry of the bonds. We also show how the third harmonic intensity feature changes considerably when the incoming field polarization angle is changed from s-polarized to p-polarized. We also propose a method to investigate surface reconstruction and defects in wurtzite and zincblende structures at the nanoscale.
Keywords: surface characterization, bond model, rotational anisotropy spectroscopy, effective hyperpolarizability
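As a minimal illustration of the "few parameters" point above, the sketch below evaluates a generic threefold-symmetric rotational-anisotropy SHG intensity, I(φ) = |a + b·cos(3φ)|², where a and b stand in for effective hyperpolarizability combinations; the functional form and the coefficient values are illustrative assumptions, not results from the paper.

```python
import numpy as np

def ras_shg_intensity(phi, a=1.0, b=0.6):
    """Generic rotational-anisotropy SHG intensity for a threefold-symmetric
    surface: isotropic term a plus anisotropic term b*cos(3*phi).
    a and b stand in for effective hyperpolarizability combinations."""
    return np.abs(a + b * np.cos(3 * phi)) ** 2

phi = np.linspace(0.0, 2 * np.pi, 361)          # azimuthal rotation angle (1 degree steps)
for pol, (a, b) in {"p-in": (1.0, 0.6), "s-in": (0.4, 0.8)}.items():
    intensity = ras_shg_intensity(phi, a, b)
    print(pol, "max:", round(intensity.max(), 3), "min:", round(intensity.min(), 3))
```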
Procedia PDF Downloads 159
5784 Comparative Study on Sensory Profiles of Liquor from Different Dried Cocoa Beans
Authors: Khairul Bariah Sulaiman, Tajul Aris Yang
Abstract:
Malaysian dried cocoa beans have been reported to have low flavour quality and are often sold at discounted prices. Various efforts have been made to improve the quality of Malaysian beans. Among these efforts are the introduction of the shallow box fermentation technique and pulp preconditioning through pod storage. However, nearly four decades after these efforts began, Malaysian cocoa farmers still receive lower prices for their beans. So, this study was carried out in order to assess the flavour quality of dried cocoa beans produced by the shallow box fermentation technique and by the combination of shallow box fermentation with pod storage, and to compare them with dried cocoa beans obtained from Ghana. A total of eight samples of dried cocoa were used in this study; one of the samples was Ghanaian beans (coded no. 8), while the rest were Malaysian cocoa beans with different post-harvest processing (coded nos. 1, 2, 3, 4, 5, 6 and 7). Cocoa liquor was prepared from all samples using the prescribed techniques, and sensory evaluation was carried out using the Quantitative Descriptive Analysis (QDA) method on a 0-10 scale by Malaysian Cocoa Board trained panelists. Sensory evaluation showed that the cocoa attribute for all cocoa liquors ranged from 3.5 to 5.3, whereas bitterness ranged from 3.4 to 4.6 and astringency from 3.9 to 5.5. Meanwhile, the acid or sourness attribute of all cocoa liquors ranged from 1.6 to 3.6. In general, the cocoa liquor prepared from sample no. 4 had a flavour profile closest to the Ghanaian sample and was not significantly different from it at p < 0.05 in terms of most flavour attributes, in contrast to the other six samples.
Keywords: cocoa beans, flavour, fermentation, shallow box, pods storage
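A minimal sketch of the significance test behind the comparison above: a one-way ANOVA on panelist scores for a single attribute across samples, using scipy; the score arrays below are invented placeholders, not data from the study.

```python
from scipy import stats

# Hypothetical QDA panel scores (0-10 scale) for the "cocoa" attribute.
# Values are placeholders, not measurements from the study.
sample_4 = [5.1, 5.3, 4.9, 5.2, 5.0]
sample_8_ghana = [5.2, 5.4, 5.0, 5.3, 5.1]
sample_1 = [3.6, 3.9, 3.4, 3.7, 3.5]

f_stat, p_value = stats.f_oneway(sample_4, sample_8_ghana, sample_1)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# Attributes with p < 0.05 would be flagged as significantly different
# between samples, mirroring the study's significance threshold.
```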
Procedia PDF Downloads 395
5783 Exploring the Synergistic Effects of Aerobic Exercise and Cinnamon Extract on Metabolic Markers in Insulin-Resistant Rats through Advanced Machine Learning and Deep Learning Techniques
Authors: Masoomeh Alsadat Mirshafaei
Abstract:
The present study aims to explore the effect of an 8-week aerobic training regimen combined with cinnamon extract on serum irisin and leptin levels in insulin-resistant rats. Additionally, this research leverages various machine learning (ML) and deep learning (DL) algorithms to model the complex interdependencies between exercise, nutrition, and metabolic markers, offering a groundbreaking approach to obesity and diabetes research. Forty-eight Wistar rats were selected and randomly divided into four groups: control, training, cinnamon, and training-cinnamon. The training protocol was conducted over 8 weeks, with sessions 5 days a week at 75-80% VO2 max. The cinnamon and training-cinnamon groups were injected with 200 ml/kg/day of cinnamon extract. Data analysis included serum data, dietary intake, exercise intensity, and metabolic response variables, with blood samples collected 72 hours after the final training session. The dataset was analyzed using one-way ANOVA (P<0.05) and fed into various ML and DL models, including Support Vector Machines (SVM), Random Forest (RF), and Convolutional Neural Networks (CNN). Traditional statistical methods indicated that aerobic training, with and without cinnamon extract, significantly increased serum irisin and decreased leptin levels. Among the algorithms, the CNN model provided superior performance in identifying specific interactions between cinnamon extract concentration and exercise intensity, optimizing the increase in irisin and the decrease in leptin. The CNN model achieved an accuracy of 92%, outperforming the SVM (85%) and RF (88%) models in predicting the optimal conditions for metabolic marker improvements. The study demonstrated that advanced ML and DL techniques can uncover nuanced relationships and potential cellular responses to exercise and dietary supplements that are not evident through traditional methods. These findings advocate for the integration of advanced analytical techniques in nutritional science and exercise physiology, paving the way for personalized health interventions in managing obesity and diabetes.
Keywords: aerobic training, cinnamon extract, insulin resistance, irisin, leptin, convolutional neural networks, exercise physiology, support vector machines, random forest
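A minimal sketch of the classifier comparison described above, using scikit-learn for the SVM and Random Forest baselines; the synthetic features, labels and split are illustrative assumptions rather than the study's dataset, and the CNN branch is omitted for brevity.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Placeholder feature matrix: [exercise intensity, cinnamon dose, dietary intake, ...]
X = rng.normal(size=(200, 6))
y = rng.integers(0, 2, size=200)   # 1 = "improved metabolic markers" (synthetic label)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
for name, model in {"SVM": SVC(kernel="rbf"),
                    "Random Forest": RandomForestClassifier(n_estimators=200)}.items():
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name} accuracy: {acc:.2f}")
```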
Procedia PDF Downloads 41
5782 Efficient Microspore Isolation Methods for High Yield Embryoids and Regeneration in Rice (Oryza sativa L.)
Authors: S. M. Shahinul Islam, Israt Ara, Narendra Tuteja, Sreeramanan Subramaniam
Abstract:
Through anther and microspore culture methods, complete homozygous plants can be produced within a year, as compared to the long inbreeding method. Isolated microspore culture is one of the most important techniques for the rapid development of haploid plants. The efficiency of this method is influenced by several factors such as culture conditions, growth regulators, plant media, pretreatments, the physical and growth conditions of the donor plants, the pollen isolation procedure, etc. The main purpose of this study was to improve the isolated microspore culture protocol in order to increase the efficiency of embryoid induction and regeneration and to reduce albinism. In this study, we tested mainly three different microspore isolation procedures (by glass rod, by homogenizer and by blending) and determined their effect on gametic embryogenesis. Three types of media, viz. washing, pre-culture and induction, were used. The induction medium was AMC (modified MS) supplemented with 2,4-D (2.5 mg/l), kinetin (0.5 mg/l), a higher amount of D-mannitol (90 g/l) instead of sucrose, and two amino acids (L-glutamine and L-serine). Of the three main microspore isolation procedures, homogenizer isolation (P4) showed the best performance in ELS induction (177%) and green plantlets (104%) compared with the other techniques. Albinism occurred in all cases, but microspore isolation from excised anthers by glass rod and by homogenizer produced fewer albino plants, which was also one of the important findings of this study.
Keywords: androgenesis, pretreatment, microspore culture, regeneration, albino plants, Oryza sativa
Procedia PDF Downloads 364
5781 Ant Lion Optimization in a Fuzzy System for Benchmark Control Problem
Authors: Leticia Cervantes, Edith Garcia, Oscar Castillo
Abstract:
Today, there are many control problems in which the main objective is to obtain the best possible control and decrease the error in the application. Many techniques can be used to address these problems, such as neural networks, PID control, fuzzy logic, optimization techniques and more. In this work, a fuzzy system combined with an optimization technique is used to control the case of study: Ant Lion Optimization (ALO) is used to optimize a fuzzy system that controls the velocity of a simple treadmill. The main objective is to achieve velocity control in this benchmark problem using ALO. First, a simple fuzzy system was used to control the velocity of the treadmill; it has two inputs (error and error change) and one output (desired speed). Results were obtained, and then, to decrease the error, ALO was applied to optimize the fuzzy system of the treadmill. With the optimization in place, the simulation was performed, and the results show that the ALO-optimized controller regulates the velocity better than the conventional fuzzy system. The paper describes the basic concepts needed to understand the idea of this work and the methodology of the investigation (control problem, fuzzy system design, optimization), presents the results, and compares the simple fuzzy system with the optimized one. The major finding of the study is that ALO is a good alternative for improving control, because it helped to decrease the error in the control application regardless of the underlying control technique being optimized. As a final statement, it is important to mention that the selected methodology was appropriate, because the control of the treadmill improved when the optimization technique was used.
Keywords: ant lion optimization, control problem, fuzzy control, fuzzy system
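A minimal sketch of the kind of two-input, one-output fuzzy controller described above, written with hand-rolled triangular membership functions; the membership breakpoints and output levels are placeholders of the sort ALO would tune, and the ALO search loop itself is omitted.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_speed(error, d_error):
    """Toy two-input (error, error change) / one-output (speed correction) controller.
    Membership breakpoints are arbitrary placeholders; in the paper these are the
    kind of parameters ALO would tune to minimize the control error."""
    neg_e, zero_e, pos_e = tri(error, -2, -1, 0), tri(error, -1, 0, 1), tri(error, 0, 1, 2)
    neg_d, pos_d = tri(d_error, -2, -1, 0), tri(d_error, 0, 1, 2)
    # Rule strengths (min as AND) mapped to crisp output levels, then a weighted average
    rules = [(min(neg_e, neg_d), -1.0), (zero_e, 0.0), (min(pos_e, pos_d), 1.0)]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) + 1e-9
    return num / den

print(fuzzy_speed(error=0.8, d_error=0.5))   # positive correction -> speed up
```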
Procedia PDF Downloads 403
5780 Evaluating the Radiation Dose Involved in Interventional Radiology Procedures
Authors: Kholood Baron
Abstract:
Radiologic interventional studies use fluoroscopy imaging guidance to perform both diagnostic and therapeutic procedures. These can result in high radiation doses being delivered to the patients and also to the radiology team, due to the prolonged fluoroscopy time and the large number of images taken, even when dose-minimizing techniques and modern fluoroscopic tools are applied. Moreover, these procedures are part of the everyday routine of interventional radiology doctors, assistant nurses, and radiographers. Thus, it is important to estimate the radiation exposure dose they receive in order to give objective advice and reduce the radiation exposure of both the patient and the radiology team. The aim of this study was to determine the total radiation dose reaching the radiologist and the patient during an interventional procedure and to determine the impact of certain parameters on the patient dose. Method: The radiation dose was measured with TLDs (thermoluminescent dosimeters). Physicians, patients, nurses, and radiographers wore TLDs during 12 interventional radiology procedures performed in two hospitals, Mubarak and Chest Hospital. This study highlights the need for interventional radiologists to be mindful of the radiation doses received by both patients and medical staff during interventional radiology procedures. The findings emphasize the impact of factors such as fluoroscopy duration and the number of images taken on the patient dose. By raising awareness and providing insights into optimizing techniques and protective measures, this research contributes to the overall goal of reducing radiation doses and ensuring the safety of patients and medical staff.
Keywords: dosimetry, radiation dose, interventional radiology procedures, patient radiation dose
Procedia PDF Downloads 114
5779 Cut-Out Animation as a Technique and Its Development within the Historical Process
Authors: Armagan Gokcearslan
Abstract:
The art of animation has developed very rapidly in terms of script, sound and music, motion, character design, the techniques used and the technological tools developed, from its first years until today. Technical variety attracts particular attention in the art of animation. Perceived as a kind of illusion in the beginning, animations commonly used the Flash Sketch technique, in which animation artists created scenes by drawing them on a blackboard with chalk. The Flash Sketch technique was used by pioneering animation artists like Emile Cohl, Winsor McCay and Blackton. Later, tools like the magic lantern, thaumatrope, phenakistiscope and zoetrope were developed and used intensely in the first years of the art of animation. Today, on the other hand, the art of animation is shaped by developments in computer technology, and it is possible to create three-dimensional and two-dimensional animations with the help of various computer software. The cut-out technique is among the important techniques used in the art of animation. The cut-out animation technique is based on the art of paper cutting; examining cut-out animations shows that they technically resemble the art of paper cutting. The art of paper cutting has a deep-rooted history, and its oldest samples can be found in China in the period after the 2nd century B.C., when the Chinese invented paper. The most popular artist using the cut-out animation technique is the German artist Lotte Reiniger. This study, titled "Cut-Out Animation as a Technique and Its Development within the Historical Process", embraces the art of paper cutting, the relationship between the art of paper cutting and cut-out animation, its development within the historical process, animation artists producing artworks in this field, important cut-out animations, and their technical properties.
Keywords: cut-out, paper art, animation, technic
Procedia PDF Downloads 276
5778 Markowitz and Implementation of a Multi-Objective Evolutionary Technique Applied to the Colombia Stock Exchange (2009-2015)
Authors: Feijoo E. Colomine Duran, Carlos E. Peñaloza Corredor
Abstract:
Modeling the selection of the components of a financial investment (portfolio) raises a variety of problems that can be addressed with optimization techniques under evolutionary schemes. By its nature, the problem of selecting investment components involves a dichotomous relationship between two opposing elements: the portfolio performance and the risk incurred by choosing it. This relationship was modeled by Markowitz as a mean (performance) - variance (risk) problem, i.e., performance must be maximized while risk is minimized. This research included the study and implementation of multi-objective evolutionary techniques to solve these problems, taking as the experimental framework the equity market of the Colombia Stock Exchange between 2009 and 2015. Three multi-objective evolutionary algorithms, namely the Nondominated Sorting Genetic Algorithm II (NSGA-II), the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and Indicator-Based Selection in Multiobjective Search (IBEA), were compared using two well-known performance measures, the hypervolume indicator and the R2 indicator, and a nonparametric statistical analysis based on the Wilcoxon rank-sum test was also carried out. The comparative analysis also includes an evaluation, through the Sharpe ratio, of the financial efficiency of the investment portfolios chosen by the different algorithms. It is shown that the portfolios provided by these algorithms are very well positioned with respect to the different stock indices published by the Colombia Stock Exchange.
Keywords: finance, optimization, portfolio, Markowitz, evolutionary algorithms
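A minimal sketch of the two quantities the comparison above rests on: the Markowitz mean-variance objectives for a candidate weight vector and the Sharpe ratio used to judge financial efficiency; the return series and risk-free rate are invented placeholders, not Colombia Stock Exchange data.

```python
import numpy as np

rng = np.random.default_rng(1)
returns = rng.normal(0.001, 0.02, size=(250, 4))   # placeholder daily returns, 4 assets
weights = np.array([0.4, 0.3, 0.2, 0.1])           # a candidate portfolio (sums to 1)

mu = returns.mean(axis=0)
cov = np.cov(returns, rowvar=False)

# Markowitz objectives: maximize expected return, minimize variance (risk)
port_return = float(weights @ mu)
port_variance = float(weights @ cov @ weights)

risk_free = 0.0002                                  # assumed daily risk-free rate
sharpe = (port_return - risk_free) / np.sqrt(port_variance)
print(f"return={port_return:.5f} variance={port_variance:.6f} Sharpe={sharpe:.3f}")
```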
Procedia PDF Downloads 302
5777 Management of Urban Watering: A Study of the Application of Technologies and Legislation in Goiania, Brazil
Authors: Vinicius Marzall, Jussanã Milograna
Abstract:
Urban stormwater drainage remains a major challenge for most Brazilian cities. Like most of them, Goiania, a state capital located in the Midwest of the country, has little legislation on the subject and only one registered compensatory technique for stormwater management. This paper aims to present solutions adopted in other Brazilian cities with consolidated legislation, suggesting techniques based on detention tanks on building sites. The study analyzed and compared the legislation of Curitiba, Porto Alegre and Sao Paulo with the current legislation and policies of Goiania. Models were then created, with adopted data, to size detention tanks using the envelope curve method, considering synthetic series of intense precipitation and building sites between 250 m² and 600 m² with an impervious rate of 50%. The results showed great differences between the legislation of Goiania and the documentation of the other cities analyzed, for example in the number of drainage techniques applicable to the reality of the cities, in educational actions to raise population awareness about the care of watercourses, and in political management, such as having dedicated funds for drainage matters. Besides, the use of a detention tank proved feasible, given that the tank occupies less than 3% of the building site, whatever the size of the terrain, while keeping the outflow at pre-occupation rates during extreme rainfall events. A linear equation was also developed to size the detention tank as a function of the building site area in Goiania, making the calculation and implementation simpler for non-specialists.
Keywords: clean technology, legislation, rainwater management, urban drainwater
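A minimal sketch of an envelope-curve sizing calculation of the kind described above: the required storage is the maximum, over storm durations, of inflow volume minus allowed outflow volume. The IDF coefficients, runoff coefficient and release rate are assumed values, not the study's synthetic series or its fitted linear equation.

```python
def idf_intensity(duration_min, k=1000.0, b=12.0, n=0.75):
    """Assumed intensity-duration relation i = k / (duration + b)**n, in mm/h."""
    return k / (duration_min + b) ** n

def detention_volume(area_m2, runoff_coeff=0.5, release_lps=2.0):
    """Envelope method: max over durations of (inflow volume - released volume)."""
    best = 0.0
    for t_min in range(5, 241, 5):                      # storm durations 5-240 min
        i_m_per_min = idf_intensity(t_min) / 1000.0 / 60.0   # mm/h -> m per minute
        inflow = runoff_coeff * i_m_per_min * area_m2 * t_min  # m3 entering in t_min
        outflow = release_lps / 1000.0 * 60.0 * t_min          # m3 released in t_min
        best = max(best, inflow - outflow)
    return best

for area in (250, 400, 600):                            # building sites in the study's range
    print(area, "m2 ->", round(detention_volume(area), 2), "m3 of storage")
```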
Procedia PDF Downloads 159
5776 Visual Inspection of Road Conditions Using Deep Convolutional Neural Networks
Authors: Christos Theoharatos, Dimitris Tsourounis, Spiros Oikonomou, Andreas Makedonas
Abstract:
This paper focuses on the problem of visually inspecting and recognizing the road conditions in front of moving vehicles, targeting automotive scenarios. The goal of road inspection is to identify whether the road is slippery or not, as well as to detect possible anomalies on the road surface like potholes or body bumps/humps. Our work is based on an artificial intelligence methodology for real-time monitoring of road conditions in autonomous driving scenarios, using state-of-the-art deep convolutional neural network (CNN) techniques. Initially, the road and ego lane are segmented within the field of view of the camera that is integrated into the front part of the vehicle. A novel classification CNN is utilized to distinguish between plain and slippery road textures (e.g., wet, snow, etc.). Simultaneously, a robust detection CNN identifies severe surface anomalies within the ego lane, such as potholes and speed bumps/humps, within a distance of 5 to 25 meters. The overall methodology is illustrated within the scope of an integrated application (or system), which can be incorporated into complete Advanced Driver-Assistance Systems (ADAS) that provide a full range of functionalities. The proposed techniques achieve state-of-the-art detection and classification results and real-time performance running on AI accelerator devices like Intel's Myriad 2/X Vision Processing Unit (VPU).
Keywords: deep learning, convolutional neural networks, road condition classification, embedded systems
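A minimal sketch of a road-texture classification CNN in PyTorch; the layer sizes, input resolution and three-class output are assumptions for illustration, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class RoadTextureCNN(nn.Module):
    """Small illustrative classifier for road-surface textures (e.g. dry / wet / snow).
    The layer sizes and the three-class output are assumptions for this sketch."""
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):
        x = self.features(x)                 # expects 64x64 RGB crops of the ego lane
        return self.classifier(x.flatten(1))

logits = RoadTextureCNN()(torch.randn(1, 3, 64, 64))
print(logits.shape)   # torch.Size([1, 3])
```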
Procedia PDF Downloads 135
5775 Pricing Strategy in Marketing: Balancing Value and Profitability
Authors: Mohsen Akhlaghi, Tahereh Ebrahimi
Abstract:
Pricing strategy is a vital component in achieving the balance between customer value and business profitability. The aim of this study is to provide insights into the factors, techniques, and approaches involved in pricing decisions. The study utilizes a descriptive approach to discuss various aspects of pricing strategy in marketing, drawing on concepts from market research, consumer psychology, competitive analysis, and adaptability. This approach presents a comprehensive view of pricing decisions. The result of this exploration is a framework that highlights key factors influencing pricing decisions. The study examines how factors such as market positioning, product differentiation, and brand image shape pricing strategies. Additionally, it emphasizes the role of consumer psychology in understanding price elasticity, perceived value, and price-quality associations that influence consumer behavior. Various pricing techniques, including charm pricing, prestige pricing, and bundle pricing, are mentioned as methods to enhance sales by influencing consumer perceptions. The study also underscores the importance of adaptability in responding to market dynamics through regular price monitoring, dynamic pricing, and promotional strategies. It recognizes the role of digital platforms in enabling personalized pricing and dynamic pricing models. In conclusion, the study emphasizes that effective pricing strategies strike a balance between customer value and business profitability, ultimately driving sales, enhancing brand perception, and fostering lasting customer relationships.
Keywords: business, customer benefits, marketing, pricing
Procedia PDF Downloads 79
5774 Valence and Arousal-Based Sentiment Analysis: A Comparative Study
Authors: Usama Shahid, Muhammad Zunnurain Hussain
Abstract:
This research paper presents a comprehensive analysis of a sentiment analysis approach that employs valence and arousal as its foundational pillars, in comparison to traditional techniques. Sentiment analysis is an indispensable task in natural language processing that involves the extraction of opinions and emotions from textual data. The valence and arousal dimensions, representing the positivity/negativity and the intensity of emotions, respectively, enable the creation of four quadrants, each representing a specific emotional state. The study seeks to determine the impact of utilizing these quadrants to identify distinct emotional states on the accuracy and efficiency of sentiment analysis, in comparison to traditional techniques. The results reveal that the valence and arousal-based approach outperforms other approaches, particularly in identifying nuanced emotions that may be missed by conventional methods. The study's findings are crucial for applications such as social media monitoring and market research, where the accurate classification of emotions and opinions is paramount. Overall, this research highlights the potential of using valence and arousal as a framework for sentiment analysis and offers invaluable insights into the benefits of incorporating specific types of emotions into the analysis. These findings have significant implications for researchers and practitioners in the field of natural language processing, as they provide a basis for the development of more accurate and effective sentiment analysis tools.
Keywords: sentiment analysis, valence and arousal, emotional states, natural language processing, machine learning, text analysis, sentiment classification, opinion mining
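A minimal sketch of the quadrant mapping described above: a (valence, arousal) pair is assigned to one of four emotional states; the thresholds and labels are illustrative assumptions.

```python
def emotion_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair in [-1, 1] x [-1, 1] to one of the four
    emotional quadrants. Labels are illustrative assumptions, not the paper's."""
    if valence >= 0 and arousal >= 0:
        return "high-arousal positive (e.g. excitement)"
    if valence >= 0:
        return "low-arousal positive (e.g. calm)"
    if arousal >= 0:
        return "high-arousal negative (e.g. anger)"
    return "low-arousal negative (e.g. sadness)"

print(emotion_quadrant(0.7, 0.6))
print(emotion_quadrant(-0.4, -0.8))
```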
Procedia PDF Downloads 102
5773 Comprehensive Feature Extraction for Optimized Condition Assessment of Fuel Pumps
Authors: Ugochukwu Ejike Akpudo, Jank-Wook Hur
Abstract:
The increasing demand for improved productivity, maintainability, and reliability has prompted rapidly increasing research studies on the emerging condition-based maintenance concept: prognostics and health management (PHM). Varieties of fuel pumps serve critical functions in several hydraulic systems; hence, their failure can have daunting effects on productivity, safety, etc. The need for condition monitoring and assessment of these pumps cannot be overemphasized, and this has led to an upsurge in research studies on standard feature extraction techniques for optimized condition assessment of fuel pumps. By extracting time-based, frequency-based and the more robust time-frequency-based features from these vibrational signals, a more comprehensive feature assessment (and selection) can be achieved for a more accurate and reliable condition assessment of these pumps. With the aid of nonlinear dimensionality-reduction algorithms like locally linear embedding (LLE), we propose a method for comprehensive condition assessment of electromagnetic fuel pumps (EMFPs). Results show that LLE, as a comprehensive feature extraction technique, yields better feature fusion/dimensionality reduction results for condition assessment of EMFPs than the use of single features. Also, unlike other feature fusion techniques, its capabilities as a fault classification technique were explored, and the results show an acceptable accuracy level using standard performance metrics for evaluation.
Keywords: electromagnetic fuel pumps, comprehensive feature extraction, condition assessment, locally linear embedding, feature fusion
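A minimal sketch of the LLE-based feature fusion step using scikit-learn; the feature matrix and the neighbour/component settings are placeholders, not the pump dataset or the paper's configuration.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
# Placeholder feature matrix: rows = vibration records, columns = time-, frequency-
# and time-frequency-domain features (values are synthetic, not pump data).
features = rng.normal(size=(120, 24))

lle = LocallyLinearEmbedding(n_neighbors=10, n_components=3)
fused = lle.fit_transform(features)     # fused / dimensionality-reduced representation
print(fused.shape)                      # (120, 3) -> fed to a downstream classifier
```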
Procedia PDF Downloads 117
5772 Analysis of Ionospheric Variations over Japan during 23rd Solar Cycle Using Wavelet Techniques
Authors: C. S. Seema, P. R. Prince
Abstract:
The characterization of spatio-temporal inhomogeneities occurring in the ionospheric F₂ layer is important since these variations are direct consequences of the electrodynamical coupling between the magnetosphere and solar events. The temporal and spatial variations of the F₂ layer, which occur with periods of several days or even years, are mainly due to geomagnetic and meteorological activities. The hourly F₂ layer critical frequency (foF2) over the 23rd solar cycle (1996-2008) at three ionosonde stations (Wakkanai, Kokubunji, and Okinawa) in the northern hemisphere, which fall within the same longitudinal span, is analyzed using continuous wavelet techniques. The Morlet wavelet is used to transform the continuous time series of foF2 into a two-dimensional time-frequency space, quantifying the time evolution of the oscillatory modes. The presence of significant time patterns (periodicities) and the time location of each periodicity are detected from the two-dimensional representation of the wavelet power in the plane of scale and period of the time series. The mean strength of each periodicity over the entire period of analysis is studied using the global wavelet spectrum. The quasi-biennial, annual, semiannual, 27-day, diurnal and 12-hour variations of foF2 are clearly evident in the wavelet power spectra at all three stations. Critical frequency oscillations with multi-day periods (2-3 days and 9 days at the low-latitude station, 6-7 days at all stations and 15 days at the mid-high-latitude station) are also superimposed on the larger time-scale variations.
Keywords: continuous wavelet analysis, critical frequency, ionosphere, solar cycle
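A minimal sketch of a Morlet continuous wavelet transform and global wavelet spectrum using PyWavelets; the synthetic hourly series below merely mimics a diurnal and a 27-day component and is not foF2 data.

```python
import numpy as np
import pywt

# Synthetic stand-in for an hourly foF2 series: a diurnal (24 h) and a 27-day
# oscillation plus noise. Amplitudes and the series itself are illustrative only.
hours = np.arange(0, 24 * 365)
signal = (np.sin(2 * np.pi * hours / 24)
          + 0.5 * np.sin(2 * np.pi * hours / (24 * 27))
          + 0.2 * np.random.default_rng(0).normal(size=hours.size))

scales = np.arange(1, 512)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0)  # periods in hours
power = np.abs(coeffs) ** 2
global_spectrum = power.mean(axis=1)            # mean strength of each periodicity
dominant_period_h = 1.0 / freqs[np.argmax(global_spectrum)]
print(f"dominant period of the synthetic series: {dominant_period_h:.1f} hours")
```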
Procedia PDF Downloads 223
5771 Modular 3D Environmental Development for Augmented Reality
Authors: Kevin William Taylor
Abstract:
This work used industry-standard practices and technologies as a foundation to explore current and future advancements in modularity for 3D environmental production. Covering environmental generation and AI-assisted generation, this study investigated how these areas will shape the industry's goal of achieving full immersion within augmented reality environments. The study explores the modular environmental construction techniques utilized in large-scale 3D productions, including the reasoning behind this approach to production, the principles of its successful development, potential pitfalls, and different methodologies for successful implementation in commercial and proprietary interactive engines. A focus is placed on the role of 3D artists in the future of environmental development, who will need to adapt to new approaches as the field evolves in response to tandem technological advancements. Industry findings and projections theorize how these factors will impact the widespread utilization of augmented reality in daily life. This will continue to inform the direction of technology towards expansive interactive environments and will change the tools and techniques utilized in the development of environments for games, film, and VFX. This study concludes that this technology will be the cornerstone for the creation of AI-driven AR that is able to fully theme our world and change how we see and engage with one another. It will impact the concept of a virtual self-identity, which will become as prevalent as real-world identity. While this progression scares or even threatens some, it is safe to say that we are seeing the beginnings of a technological revolution that will surpass the impact that the smartphone had on modern society.
Keywords: virtual reality, augmented reality, training, 3D environments
Procedia PDF Downloads 124
5770 Influence of Surface Preparation Effects on the Electrochemical Behavior of 2098-T351 Al–Cu–Li Alloy
Authors: Rejane Maria P. da Silva, Mariana X. Milagre, João Victor de S. Araujo, Leandro A. de Oliveira, Renato A. Antunes, Isolda Costa
Abstract:
Al-Cu-Li alloys are advanced materials for aerospace applications because of their interesting mechanical properties and low density compared with conventional Al alloys. However, Al-Cu-Li alloys are susceptible to localized corrosion. The near-surface deformed layer (NSDL) induced by the rolling process during the production of the alloy, and its removal by polishing, can influence the corrosion susceptibility of these alloys. In this work, the influence of surface preparation effects on the electrochemical activity of AA2098-T351 (an Al-Cu-Li alloy) was investigated using a correlation between surface chemistry, microstructure, and electrochemical activity. Two conditions were investigated: the polished and the as-received surfaces of the alloy. The morphology of the two types of surfaces was investigated using confocal laser scanning microscopy (CLSM) and optical microscopy. The surface chemistry was analyzed by X-ray photoelectron spectroscopy (XPS) and energy dispersive X-ray spectroscopy (EDS). Global electrochemical techniques (potentiodynamic polarization and EIS) and a local electrochemical technique (localized electrochemical impedance spectroscopy, LEIS) were used to examine the electrochemical activity of the surfaces. The results obtained in this study showed that, for the as-received surface, the near-surface deformed layer (NSDL), which is composed of Mg-rich bands, influenced the electrochemical behavior of the alloy. The results showed higher electrochemical activity for the polished surface condition compared to the as-received one.
Keywords: Al-Cu-Li alloys, surface preparation effects, electrochemical techniques, localized corrosion
Procedia PDF Downloads 160
5769 Participation in IAEA Proficiency Test to Analyse Cobalt, Strontium and Caesium in Seawater Using Direct Counting and Radiochemical Techniques
Authors: S. Visetpotjanakit, C. Khrautongkieo
Abstract:
Radiation monitoring in the environment and foodstuffs is one of the main responsibilities of the Office of Atoms for Peace (OAP), the nuclear regulatory body of Thailand. The main goal of the OAP is to assure the safety of the Thai people and the environment from any radiological incidents. Various radioanalytical methods have been developed to monitor radiation and radionuclides in environmental and foodstuff samples. To validate our analytical performance, several proficiency test exercises from the International Atomic Energy Agency (IAEA) have been performed. Here, the results of a proficiency test exercise referred to as the Proficiency Test for Tritium, Cobalt, Strontium and Caesium Isotopes in Seawater 2017 (IAEA-RML-2017-01) are presented. All radionuclides except ³H were analysed using various radioanalytical methods, i.e., direct gamma-ray counting for determining ⁶⁰Co, ¹³⁴Cs and ¹³⁷Cs, and developed radiochemical techniques for analysing ¹³⁴Cs and ¹³⁷Cs using an AMP pre-concentration technique and ⁹⁰Sr using a di-(2-ethylhexyl) phosphoric acid (HDEHP) liquid extraction technique. The analysis results were submitted to the IAEA. All results passed the IAEA criteria, i.e., accuracy, precision and trueness, and obtained 'Accepted' status. This confirms the quality of data from the OAP environmental radiation laboratory for monitoring radiation in the environment.
Keywords: international atomic energy agency, proficiency test, radiation monitoring, seawater
Procedia PDF Downloads 172
5768 Effects of Auxetic Antibacterial Zwitterion Carboxylate and Sulfate Copolymer Hydrogels for Diabetic Wound Healing Application
Authors: Udayakumar Vee, Franck Quero
Abstract:
Zwitterionic polymers generally have been viewed as a new class of antimicrobial and non-fouling materials. They offer broad versatility for chemical modification and hence great freedom for accurate molecular design, as they bear an equimolar number of homogeneously distributed anionic and cationic groups along their polymer chains. This study explores the effectiveness of auxetic zwitterion carboxylate/sulfonate hydrogels in a diabetic-induced mouse model. A series of silver metal-doped auxetic zwitterion carboxylate/sulfonate/vinylaniline copolymer hydrogels is designed via a 3D printer. The zwitterion monomers have been characterized by FT-IR and NMR techniques. The effect of changing the monomers and of different loading ratios of Ag over zwitterion on the antimicrobial properties and biocompatibility of the final hydrogel materials will be investigated in detail. The synthesized auxetic hydrogels have been characterized using a wide range of techniques to help establish the relationship between the molecular-level and macroscopic properties of these materials, including mechanical properties, antibacterial activity, biocompatibility and wound-healing ability. The comparative studies and results of this work provide new insights and guide the choice of a better auxetic structured material for a broad spectrum of wound-healing applications in the animal model. We expect this approach to provide a versatile and robust platform for biomaterial design that could lead to promising treatments for wound-healing applications.
Keywords: auxetic, zwitterion, carboxylate, sulfonate, polymer, wound healing
Procedia PDF Downloads 142
5767 A Grey-Box Text Attack Framework Using Explainable AI
Authors: Esther Chiramal, Kelvin Soh Boon Kai
Abstract:
Explainable AI is a strong strategy implemented to understand complex black-box model predictions in a human-interpretable language. It provides the evidence required for the use of trustworthy and reliable AI systems. On the other hand, however, it also opens the door to locating possible vulnerabilities in an AI model. Traditional adversarial text attacks use word substitution, data augmentation techniques, and gradient-based attacks on powerful pre-trained Bidirectional Encoder Representations from Transformers (BERT) variants to generate adversarial sentences. These attacks are generally white-box in nature and not practical, as they can be easily detected by humans, e.g., by changing the word from "Poor" to "Rich". We propose a simple yet effective grey-box cum black-box approach that does not require knowledge of the model, while using a set of surrogate Transformer/BERT models to perform the attack using explainable AI techniques. As Transformers are the current state-of-the-art models for almost all natural language processing (NLP) tasks, an attack generated on BERT1 is transferable to BERT2. This transferability is made possible by the attention mechanism in the transformer, which allows the model to capture long-range dependencies in a sequence. Using the power of BERT generalisation via attention, we attempt to exploit how transformers learn by attacking a few surrogate transformer variants, which are all based on different architectures. We demonstrate that this approach is highly effective in generating semantically good sentences by changing as little as one word that is not detectable by humans while still fooling other BERT models.
Keywords: BERT, explainable AI, Grey-box text attack, transformer
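The attack itself is not spelled out above, so the sketch below shows only the model-agnostic ingredient the description implies: scoring each word by how much a surrogate model's confidence drops when that word is removed, then taking the top-ranked word as the substitution target; surrogate_confidence is a stand-in callable, not a real BERT call.

```python
from typing import Callable, List, Tuple

def rank_words_by_influence(words: List[str],
                            surrogate_confidence: Callable[[str], float]) -> List[Tuple[str, float]]:
    """Score each word by the drop in the surrogate model's confidence when the
    word is removed (a simple leave-one-out attribution). The most influential
    word is the natural target for a single-word substitution attack."""
    base = surrogate_confidence(" ".join(words))
    drops = []
    for i, w in enumerate(words):
        reduced = " ".join(words[:i] + words[i + 1:])
        drops.append((w, base - surrogate_confidence(reduced)))
    return sorted(drops, key=lambda kv: kv[1], reverse=True)

# Toy stand-in for a surrogate BERT sentiment model (not a real model call).
def toy_confidence(text: str) -> float:
    return 0.9 if "poor" in text.lower() else 0.4

print(rank_words_by_influence("The service was poor and slow".split(), toy_confidence))
```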
Procedia PDF Downloads 138
5766 The Clustering of Multiple Sclerosis Subgroups through L2 Norm Multifractal Denoising Technique
Authors: Yeliz Karaca, Rana Karabudak
Abstract:
Multifractal denoising techniques are used in the identification of significant attributes by removing the noise of the dataset. Magnetic resonance (MR) imaging is the most sensitive method for identifying chronic disorders of the nervous system such as multiple sclerosis. MRI and Expanded Disability Status Scale (EDSS) data belonging to 120 individuals who have one of the subgroups of MS (relapsing remitting MS (RRMS), secondary progressive MS (SPMS), primary progressive MS (PPMS)), as well as 19 healthy individuals in the control group, have been used in this study. The study comprises the following stages: (i) the L2 norm multifractal denoising technique, one of the multifractal techniques, has been applied to the MS data (MRI and EDSS), and in this way a new dataset has been obtained; (ii) the new MS dataset has been fed into the K-Means and Fuzzy C-Means (FCM) clustering algorithms, which are among the unsupervised methods, and the clustering performances have been compared; (iii) in the identification of significant attributes in the MS dataset through the multifractal denoising (L2 norm) technique using the K-Means and FCM algorithms on the MS subgroups and the control group of healthy individuals, excellent performance has been obtained. According to the clustering results based on the MS subgroups obtained in this study, successful clustering results have been achieved with the K-Means and FCM algorithms by applying the L2 norm multifractal denoising technique to the MS dataset. Clustering performance has been more successful with the denoised dataset (L2_Norm MS dataset), in which significant attributes are obtained by applying the L2 norm denoising technique.
Keywords: clinical decision support, clustering algorithms, multiple sclerosis, multifractal techniques
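A minimal sketch of the clustering stage using scikit-learn's K-Means (FCM is omitted since it is not part of scikit-learn); the feature matrix stands in for the denoised MRI/EDSS attributes and is synthetic, not the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Placeholder feature matrix standing in for the denoised MRI/EDSS attributes of
# 139 subjects (120 MS patients + 19 controls); values are synthetic, not study data.
X_denoised = np.vstack([rng.normal(loc=c, scale=1.0, size=(35, 8)) for c in (0, 3, 6, 9)])[:139]

# Cluster into the four expected groups (RRMS, SPMS, PPMS, healthy controls).
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_denoised)
print("cluster sizes:", np.bincount(labels))
print("silhouette score:", round(silhouette_score(X_denoised, labels), 3))
```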
Procedia PDF Downloads 171
5765 Antibacterial Zwitterion Carboxylate and Sulfonate Copolymer Auxetic Hydrogels for Diabetic Wound Healing Application
Authors: Udayakumar Veerabagu, Franck Quero
Abstract:
Zwitterion carboxylate and sulfonate polymers generally have been viewed as a new class of antimicrobial and non-fouling materials. They offer broad versatility for chemical modification and hence great freedom for accurate molecular design, as they bear an equimolar number of homogeneously distributed anionic and cationic groups along their polymer chains. This study explores the effectiveness of auxetic zwitterion carboxylate/sulfonate hydrogels in a diabetic-induced mouse model. A series of silver metal-doped auxetic zwitterion carboxylate/sulfonate/vinylaniline copolymer hydrogels is designed via a 3D printer. The zwitterion monomers have been characterized by FT-IR and NMR techniques. The effect of changing the monomers and of different loading ratios of Ag over zwitterion on the antimicrobial properties and biocompatibility of the final hydrogel materials will be investigated in detail. The synthesized auxetic hydrogels have been characterized using a wide range of techniques to help establish the relationship between the molecular-level and macroscopic properties of these materials, including mechanical properties, antibacterial activity, biocompatibility and wound-healing ability. The comparative studies and results of this work provide new insights and guide the choice of a better auxetic structured material for a broad spectrum of wound-healing applications in the animal model. We expect this approach to provide a versatile and robust platform for biomaterial design that could lead to promising treatments for wound-healing applications.
Keywords: auxetic, zwitterion, carboxylate, sulfonate, polymer, wound healing
Procedia PDF Downloads 157
5764 Porcelain Paste Processing by Robocasting 3D: Parameters Tuning
Authors: A. S. V. Carvalho, J. Luis, L. S. O. Pires, J. M. Oliveira
Abstract:
Additive manufacturing (AM) technologies experienced remarkable growth in recent years due to the development and diffusion of a wide range of three-dimensional (3D) printing techniques. Nowadays, some techniques are available to non-industrial users, like fused filament fabrication, while techniques like 3D printing, polyjet, selective laser sintering and stereolithography are mainly spread in industry. Robocasting (R3D) shows great potential due to its ability to shape materials with a wide range of viscosities. Industrial porcelain compositions showing different rheological behaviour can be prepared and used as candidate materials to be processed by R3D, yet the use of this AM technique in industry is still very limited. In this work, a specific porcelain composition with suitable rheological properties was processed by R3D, and a systematic study of the tuning of the printing parameters is shown. The porcelain composition was formulated based on an industrial spray-dried porcelain powder. The powder particle size and morphology were analysed. The powders were mixed with water and an organic binder in a ball mill at 200 rpm for 24 hours. The batch viscosity was adjusted by the addition of an acid solution and mixed again. The paste density, viscosity, zeta potential, particle size distribution and pH were determined. In the R3D system, different speed and pressure settings were studied to assess their impact on the fabrication of porcelain models. These models were dried at 80 °C for 24 hours and sintered in air at 1350 °C for 2 hours. The stability of the models and the quality of their walls and surfaces were studied, and their physical properties were assessed. The microstructure and layer adhesion were observed by SEM. The studied processing parameters have a high impact on the quality of the models and on the stacking of the filaments, and their adequate tuning has a huge influence on the final properties of the porcelain models. This work contributes to a better assimilation of AM technologies in the ceramic industry. Acknowledgments: The RoboCer3D project (project of additive rapid manufacturing through 3D printing ceramic material, POCI-01-0247-FEDER-003350), financed by Compete 2020, PT 2020, and the European Regional Development Fund (FEDER) through the International and Competitive Operational Program (POCI) under the PT2020 partnership agreement.
Keywords: additive manufacturing, porcelain, robocasting, R3D
Procedia PDF Downloads 163
5763 Process Monitoring Based on Parameterless Self-Organizing Map
Authors: Young Jae Choung, Seoung Bum Kim
Abstract:
Statistical process control (SPC) is a popular technique for process monitoring. A widely used tool in SPC is the control chart, which is used to detect the abnormal status of a process and maintain the process in a controlled state. Traditional control charts, such as Hotelling's T² control chart, are effective techniques to detect abnormal observations and monitor processes. However, many complicated manufacturing systems exhibit nonlinearity because of the different demands of the market; in this case, the unregulated use of a traditional linear modeling approach may not be effective. In reality, many industrial processes have nonlinear and time-varying properties because of fluctuations in process raw materials, slow shifts of the set points, aging of the main process components, seasonal effects, and catalyst deactivation. The use of traditional SPC techniques with time-varying data will degrade the performance of the monitoring scheme. To address these issues, in the present study, we propose a parameterless self-organizing map (PLSOM)-based control chart. The PLSOM-based control chart can not only manage a situation in which the distribution or parameters of the target observations change, but also address the nonlinearity of modern manufacturing systems. The control limits of the proposed PLSOM chart are established by estimating the empirical level of significance on the percentile using a bootstrap method. Experimental results with simulated data and actual process data from a thin-film transistor-liquid crystal display process demonstrated the effectiveness and usefulness of the proposed chart.
Keywords: control chart, parameter-less self-organizing map, self-organizing map, time-varying property
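A minimal sketch of the bootstrap percentile estimation of a control limit described above; the in-control statistic, the nominal false-alarm rate and the new observation are invented placeholders, not the PLSOM quantization errors used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
# Placeholder monitoring statistic computed from in-control training data
# (e.g. a distance between each observation and its best-matching SOM unit).
in_control_stat = rng.gamma(shape=2.0, scale=1.0, size=500)

# Bootstrap the empirical upper control limit at a nominal 1% false-alarm rate.
boot_limits = [np.percentile(rng.choice(in_control_stat, size=in_control_stat.size,
                                        replace=True), 99)
               for _ in range(2000)]
ucl = float(np.mean(boot_limits))
print(f"upper control limit: {ucl:.3f}")

new_stat = 9.5                       # a new observation's statistic (made up)
print("out of control" if new_stat > ucl else "in control")
```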
Procedia PDF Downloads 276
5762 Pellegrini-Stieda Syndrome: A Physical Medicine and Rehabilitation Approach
Authors: Pedro Ferraz-Gameiro
Abstract:
Introduction: The Pellegrini-Stieda lesion is the result of post-traumatic calcification and/or ossification of the medial collateral ligament (MCL) of the knee. When this calcification is accompanied by gonalgia and limitation of knee flexion, it is called Pellegrini-Stieda syndrome. The pathogenesis is probably the calcification of a post-traumatic hematoma at least three weeks after the initial trauma, or secondary to repetitive microtrauma. On anteroposterior radiographs, a Pellegrini-Stieda lesion appears as a linear vertical ossification or calcification of the proximal portion of the MCL, usually near the medial femoral condyle. Patients with Pellegrini-Stieda syndrome present with knee pain associated with loss of range of motion. The treatment is usually conservative, with analgesic and anti-inflammatory drugs, either systemic or intra-articular. Physical medicine and rehabilitation techniques associated with shock wave therapy can be a way of reducing pain and inflammation. Patients who maintain instability with significant limitation of knee mobility may require surgical excision. Methods: Research was done using PubMed Central with the terms Pellegrini-Stieda syndrome. Discussion/conclusion: Medical treatment is the rule, with initial rest, anti-inflammatories, and physiotherapy. If left untreated, this ossification can potentially form a significant bone mass, which can compromise the range of motion of the knee. Physical medicine and rehabilitation techniques associated with shock wave therapy are a way of reducing pain and inflammation.
Keywords: knee, Pellegrini-Stieda syndrome, rehabilitation, shock waves therapy
Procedia PDF Downloads 142
5761 The Application and Relevance of Costing Techniques in Service-Oriented Business Organizations: A Review of the Activity-Based Costing (ABC) Technique
Authors: Udeh Nneka Evelyn
Abstract:
The shortcomings of traditional costing systems in terms of validity, accuracy, consistency, and relevance have increased the need for modern management accounting systems. Activity-Based Costing (ABC) can be used as a modern tool for planning, control and decision making by management. Past studies on the ABC system have focused on manufacturing firms, leaving studies on service firms somewhat scanty. This paper reviewed the application and relevance of the activity-based costing technique in service-oriented business organizations by employing a qualitative research method which relied heavily on a literature review of past and current relevant articles focusing on ABC. Findings suggest that ABC is not only appropriate for use in a manufacturing environment; it is also most appropriate for service organizations such as financial institutions, the healthcare industry and government organizations. In fact, some banking and financial institutions have been applying the concept for years under other names. One of them is unit costing, which is used to calculate the cost of banking services by determining the cost and consumption of each unit of output of the functions required to deliver the service. ABC, in very basic terms, may provide a very good payback for businesses. Some of the benefits that relate directly to the financial services industry are: identification of the most profitable customers, more accurate product and service pricing, increased product profitability, and well-organized process costs.
Keywords: business, costing, organizations, planning, techniques
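A minimal sketch of an activity-based cost allocation of the kind discussed above: activity-pool costs are turned into driver rates and assigned to a service in proportion to its driver consumption; the activities and figures are invented, not from the review.

```python
# Activity cost pools and their total driver volumes (hypothetical figures).
activity_costs = {"loan processing": 120_000.0, "customer support": 80_000.0}
driver_volumes = {"loan processing": 4_000, "customer support": 16_000}   # driver units

# Cost-driver rate = pool cost / driver volume.
rates = {a: activity_costs[a] / driver_volumes[a] for a in activity_costs}

# Driver units consumed by one service line (e.g. a mortgage product).
consumption = {"loan processing": 250, "customer support": 900}
product_cost = sum(rates[a] * consumption[a] for a in consumption)

print({a: round(r, 2) for a, r in rates.items()})
print("allocated cost:", round(product_cost, 2))
```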
Procedia PDF Downloads 241
5760 Synthesis, Structural, Spectroscopic and Nonlinear Optical Properties of New Picolinate Complex of Manganese (II) Ion
Authors: Ömer Tamer, Davut Avcı, Yusuf Atalay
Abstract:
A novel picolinate complex of the manganese(II) ion, [Mn(pic)2] [pic: picolinate or 2-pyridinecarboxylate], was prepared and fully characterized by single-crystal X-ray structure determination. The manganese(II) complex was characterized by FT-IR, FT-Raman and UV-Vis spectroscopic techniques. The C=O, C=N and C=C stretching vibrations were found to be strong and simultaneously active in the IR and Raman spectra. In order to support these experimental techniques, density functional theory (DFT) calculations were performed with Gaussian 09W. Although supramolecular interactions have some influence on the molecular geometry in the solid-state phase, the calculated data show that the predicted geometries can reproduce the structural parameters. The molecular modeling and the calculations of the IR, Raman and UV-Vis spectra were performed at DFT levels. The nonlinear optical (NLO) properties of the synthesized complex were evaluated by determining the dipole moment (µ), polarizability (α) and hyperpolarizability (β). The obtained results demonstrated that the manganese(II) complex is a good candidate NLO material. The stability of the molecule arising from hyperconjugative interactions and charge delocalization was analyzed using natural bond orbital (NBO) analysis. The highest occupied and lowest unoccupied molecular orbitals (HOMO and LUMO), also known as the frontier molecular orbitals, were simulated, and the obtained energy gap confirmed that charge transfer occurs within the manganese(II) complex. The molecular electrostatic potential (MEP) of the synthesized manganese(II) complex displays the electrophilic and nucleophilic regions. From the MEP, the most negative region is located over the carboxyl O atoms, while the positive region is located over the H atoms.
Keywords: DFT, picolinate, IR, Raman, nonlinear optic
Procedia PDF Downloads 500
5759 Scientific and Technical Basis for the Application of Textile Structures in Glass Using Pate De Verre Technique
Authors: Walaa Hamed Mohamed Hamza
Abstract:
Textile structures are the ways in which the threads are interlaced on the loom to form the woven fabric. Different methods of interlacing the two thread systems produce different textile structures, which differ from each other in their surface appearance; these include the so-called simple textile structures. Textile compositions are the basis of woven fabric, through which aesthetic values can be achieved in the textile industry by weaving the threads of yarn with the weft at varying degrees of interlacing, which may reach the total dominance of one of the two thread groups over the other. Hence the idea of how art and design can make use of different textile structures, under the modern pate de verre technique, to create designs suitable for glass products employed in interior architecture. The research problem: textile structures, in general, have a significant impact on the appearance of fabrics in terms of form and aesthetics; how can the characteristics of different textile compositions be exploited in glass designs with different artistic values? The research achieves its goal by employing simple textile structures in innovative artistic designs using the pate de verre technique, as well as by using designs resulting from the textile structures in external architecture to add various aesthetic values. The importance of the research lies in the revival of heritage using ancient techniques, in the synergy between different fields of applied arts such as glass and textiles, and in studying the different and varied effects resulting from each textile composition and the possibility of using them in various designs in interior architecture. The research shows that, by employing simple textile compositions, innovative artistic designs produced using the pate de verre technique can be used in interior architecture.
Keywords: glass, interior architecture, pate de verre, textile structures
Procedia PDF Downloads 296
5758 The Staphylococcus aureus Exotoxin Recognition Using Nanobiosensor Designed by an Antibody-Attached Nanosilica Method
Authors: Hamed Ahari, Behrouz Akbari Adreghani, Vadood Razavilar, Amirali Anvar, Sima Moradi, Hourieh Shalchi
Abstract:
Considering the ever-increasing population and the industrialization of human life, traditional techniques are no longer sufficient to detect the toxins produced in food products. This is because the isolation time for food products is not cost-effective and, in most cases, the precision of practical techniques such as bacterial cultivation suffers from operator errors or errors in the mixtures used. Hence, with the advent of nanotechnology, the design of selective and smart sensors is one of the greatest industrial advances in the quality control of food products: within a few minutes, and with very high precision, they can identify the amount and toxicity of the bacteria. Methods and Materials: In this technique, a sensor based on the attachment of the bacterial antibody to a nanoparticle was used. In this part of the research, medium-sized silica nanoparticles of 10 nanometers, in the form of a solid powder (Notrino brand), were utilized as the absorption basis for the recognition of the bacterial toxin. Then the suspension produced from the antibody-linked nanosilica was placed next to samples of distilled water contaminated with Staphylococcus aureus bacterial toxin at a concentration of 10⁻³, so that, if any toxin existed in the sample, a bond between the toxin antigen and the antibody would be formed. Finally, the light absorption related to the binding of the antigen to the particle-attached antibody was measured using spectrophotometry. The 23S rRNA gene, which is conserved in all Staphylococcus spp., was also used as a control. The accuracy of the test was monitored by using a serial dilution (10⁻⁶) of an overnight cell culture of Staphylococcus spp. bacteria (OD600: 0.02 = 10⁷ cells). It showed that the sensitivity of PCR is 10 bacteria per ml within a few hours. Result: The results indicate that the sensor detects down to a concentration of 10⁻⁴. Additionally, the sensitivity of the sensors was examined after 60 days; the sensor gave confirmatory results up to day 56, and its response started to decrease after that period. Conclusions: Compared with conventional methods such as culturing and molecular techniques (e.g., the polymerase chain reaction), the practical advantages of the nanobiosensor are its accuracy, sensitivity and specificity; moreover, it reduces the detection time from hours to 30 minutes.
Keywords: exotoxin, nanobiosensor, recognition, Staphylococcus aureus
Procedia PDF Downloads 387
5757 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study researches the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic estimates.
Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
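A minimal sketch of the text branch described above: TF-IDF features extracted from report text feeding a Random Forest classifier via scikit-learn; the toy reports, labels and settings are illustrative assumptions, not the Indiana University dataset.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

# Toy report snippets and labels standing in for the radiology reports;
# the texts, labels and pipeline settings are illustrative assumptions only.
reports = ["no acute cardiopulmonary abnormality",
           "right lower lobe opacity concerning for pneumonia",
           "clear lungs, normal heart size",
           "large left pleural effusion with compressive atelectasis"]
labels = [0, 1, 0, 1]                        # 0 = normal, 1 = finding present

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      RandomForestClassifier(n_estimators=100, random_state=0))
model.fit(reports, labels)
print(model.predict(["mild cardiomegaly, no focal opacity"]))
```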
Procedia PDF Downloads 47
5756 Effects of Waist-to-Hip Ratio and Visceral Fat Measurements Improvement on Offshore Petrochemical Company Shift Employees' Work Efficiency
Authors: Essam Amerian
Abstract:
The aim of this study was to investigate the effects of improving waist-to-hip ratio (WHR) and visceral fat components on the health of shift workers in an offshore petrochemical company. A total of 100 male shift workers participated in the study, with an average age of 40.5 years and an average BMI of 28.2 kg/m². The study employed a randomized controlled trial design, with participants assigned to either an intervention group or a control group. The intervention group received a 12-week program that included dietary counseling, physical activity recommendations, and stress management techniques. The control group received no intervention. The outcomes measured were changes in WHR, visceral fat components, blood pressure, and lipid profile. The results showed that the intervention group had a statistically significant improvement in WHR (p<0.001) and visceral fat components (p<0.001) compared to the control group. Furthermore, there were statistically significant improvements in systolic blood pressure (p=0.015) and total cholesterol (p=0.034) in the intervention group compared to the control group. These findings suggest that implementing a 12-week program that includes dietary counseling, physical activity recommendations, and stress management techniques can effectively improve WHR, visceral fat components, and cardiovascular health among shift workers in an offshore petrochemical company.
Keywords: body composition, waist-hip-ratio, visceral fat, shift worker, work efficiency
Procedia PDF Downloads 80