Search results for: and model-based techniques
4833 Genetic Characterization of Barley Genotypes via Inter-Simple Sequence Repeat
Authors: Mustafa Yorgancılar, Emine Atalay, Necdet Akgün, Ali Topal
Abstract:
In this study, polymerase chain reaction (PCR)-based inter-simple sequence repeat (ISSR) markers, a DNA fingerprinting technique, were used to investigate the genetic relationships among barley crossbreed genotypes in Turkey. Selection based on the genetic background via ISSR is important in breeding programs because it shortens breeding time. Fourteen ISSR primers generated a total of 97 bands, of which 81 (83.35%) were polymorphic. The highest total resolution power (RP) values were obtained from the F2 (0.53) and M16 (0.51) primers. According to the ISSR results, the genetic similarity index ranged between 0.64 and 0.95; Line 3 and Line 6 were the closest genotypes, while Line 36 was the most distant. The ISSR markers were found to be promising for assessing genetic diversity in barley crossbreed genotypes.
Keywords: barley, crossbreed, genetic similarity, ISSR
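Genetic similarity indices of this kind are typically computed from binary band presence/absence scores across the ISSR primers. The following is a minimal illustrative sketch (not the authors' code) of how a pairwise similarity such as the Dice coefficient can be derived from such a matrix; the genotype names and band scores are invented placeholders.

```python
import numpy as np

def dice_similarity(a, b):
    """Dice similarity for two binary band-presence vectors (1 = band present)."""
    shared = np.sum((a == 1) & (b == 1))
    return 2.0 * shared / (np.sum(a) + np.sum(b))

# Hypothetical 0/1 band scores for three genotypes across 10 ISSR bands.
bands = {
    "Line 3":  np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1]),
    "Line 6":  np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1]),
    "Line 36": np.array([0, 1, 0, 1, 1, 0, 0, 1, 0, 1]),
}

for g1 in bands:
    for g2 in bands:
        if g1 < g2:
            print(g1, "vs", g2, round(dice_similarity(bands[g1], bands[g2]), 2))
```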
Procedia PDF Downloads 347
4832 Nanoparticles Using in Chiral Analysis with Different Methods of Separation
Authors: Bounoua Nadia, Rebizi Mohamed Nadjib
Abstract:
Chiral molecules are stereoselective with respect to their particular biological roles. Enantiomers differ significantly in their biochemical responses in a biological environment. Despite current advances in drug discovery and pharmaceutical biotechnology, the chiral separation of some racemic mixtures remains one of the greatest challenges, because the available techniques are too costly and time-consuming for the assessment of therapeutic drugs in the early stages of development worldwide. Nanoparticles have become one of the most investigated and explored nanotechnology-derived nanostructures, especially in chirality, where several studies report improved enantiomeric separation of different racemic mixtures. The production of surface-modified nanoparticles has helped to address these limitations: their sensitivity, accuracy, and enantioselectivity can be optimized, which makes surface-modified nanoparticles convenient for enantiomeric identification and separation.
Keywords: chirality, enantiomeric recognition, selectors, analysis, surface-modified nanoparticles
Procedia PDF Downloads 94
4831 Trajectory Planning Algorithms for Autonomous Agricultural Vehicles
Authors: Caner Koc, Dilara Gerdan Koc, Mustafa Vatandas
Abstract:
The fundamental components of autonomous agricultural robot design are a working understanding of coordinates, correct construction of the desired route, and sensing of environmental elements. Agricultural robots employ a variety of sensors, hardware, and software to implement these systems. These enable the fully automated driving system of an autonomous vehicle to simulate how a human-driven vehicle would respond to changing environmental conditions. To calculate the vehicle's motion trajectory from the sensor data, this automation system typically consists of a sophisticated software architecture based on object detection and driving decisions. In this study, the software architecture of an autonomous agricultural vehicle is compared with trajectory planning techniques.
Keywords: agriculture 5.0, computational intelligence, motion planning, trajectory planning
Procedia PDF Downloads 78
4830 Proactive Pure Handoff Model with SAW-TOPSIS Selection and Time Series Predict
Authors: Harold Vásquez, Cesar Hernández, Ingrid Páez
Abstract:
This paper applies cognitive radio techniques and a pure proactive handoff model to decrease interference between the primary user (PU) and secondary user (SU), comparing it with a reactive handoff model. The multi-criteria decision models SAW and TOPSIS are studied and combined with three dynamic prediction techniques: AR, MA, and ARMA. Four metrics are used to evaluate the best model: the number of failed handoffs, the number of handoffs, the number of predictions, and the number of interferences. The results show the advantages of using this type of pure proactive model to predict changes in the PU on the selected channel and reduce interference. The model that showed the best performance was TOPSIS-MA; although TOPSIS-AR had a higher predictive ability, this was not reflected in interference reduction.
Keywords: cognitive radio, spectrum handoff, decision making, time series, wireless networks
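TOPSIS ranks alternatives by their closeness to an ideal solution over several weighted criteria. The sketch below is a generic, minimal TOPSIS implementation, not the authors' model; the criteria matrix, weights, and benefit/cost flags for candidate channels are illustrative assumptions.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives (rows) over criteria (columns) with TOPSIS."""
    m = decision_matrix / np.linalg.norm(decision_matrix, axis=0)  # vector-normalize columns
    v = m * weights                                                # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness coefficient: higher is better

# Hypothetical channels scored on bandwidth, expected PU idle time, interference level.
channels = np.array([[6.0, 0.8, 0.2],
                     [4.0, 0.9, 0.1],
                     [8.0, 0.5, 0.4]])
weights = np.array([0.3, 0.4, 0.3])
benefit = np.array([True, True, False])    # interference is a cost criterion
print(topsis(channels, weights, benefit))  # one closeness score per candidate channel
```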
Procedia PDF Downloads 487
4829 Investigating the Demand of Short-Shelf Life Food Products for SME Wholesalers
Authors: Yamini Raju, Parminder S. Kang, Adam Moroz, Ross Clement, Alistair Duffy, Ashley Hopwell
Abstract:
Accurate prediction of fresh produce demand is one of the challenges faced by Small and Medium Enterprise (SME) wholesalers. Current research in this area has focused on a limited number of factors specific to a single product or business type. This paper gives an overview of the current literature on the variability factors used to predict demand and the existing forecasting techniques for short shelf life products. It then extends this work by adding new factors and investigating whether there is a time lag and possible noise in the orders. It also identifies the most important factors using correlation and Principal Component Analysis (PCA).
Keywords: demand forecasting, deteriorating products, food wholesalers, principal component analysis, variability factors
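PCA can indicate which variability factors explain most of the variance in historical order data. Below is a minimal illustrative sketch using scikit-learn; the factor names and synthetic data are assumptions for illustration, not the study's dataset.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
factors = ["temperature", "weekday", "promotion", "holiday", "price"]
X = rng.normal(size=(200, len(factors)))          # stand-in for historical order features

X_std = StandardScaler().fit_transform(X)         # PCA is scale-sensitive
pca = PCA(n_components=3).fit(X_std)

print("explained variance ratio:", pca.explained_variance_ratio_)
# Loadings show how strongly each factor contributes to each principal component.
for i, comp in enumerate(pca.components_):
    top = factors[int(np.argmax(np.abs(comp)))]
    print(f"PC{i + 1}: dominant factor = {top}")
```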
Procedia PDF Downloads 520
4828 Computational Fluid Dynamics of a Bubbling Fluidized Bed in Wood Pellets
Authors: Opeyemi Fadipe, Seong Lee, Guangming Chen, Steve Efe
Abstract:
In comparison to conventional combustion technologies, fluidized bed combustion has several advantages, such as superior heat transfer characteristics due to homogeneous particle mixing, lower temperature needs, nearly isothermal process conditions, and the ability to operate continuously. Computational fluid dynamics (CFD) techniques can help anticipate the intricate combustion process and the hydrodynamics of a fluidized bed thoroughly. The bubbling fluidized bed was modeled using the Eulerian-Eulerian approach, including the kinetic theory of the flow. The model was validated by comparing it with other simulations of the fluidized bed. The effects of operational gas velocity, volume fraction, and feed rate were also investigated numerically. A higher gas velocity and feed rate cause an increase in fluidization of the bed.
Keywords: fluidized bed, operational gas velocity, volume fraction, computational fluid dynamics
Procedia PDF Downloads 83
4827 Comparison of Chemical Coagulation and Electrocoagulation for Boron Removal from Synthetic Wastewater Using Aluminium
Authors: Kartikaningsih Danis, Yao-Hui Huang
Abstract:
Various techniques, both conventional and advanced, have been employed for boron removal from water and wastewater. Electrocoagulation involves an electrolytic reactor for coagulation/flotation with aluminium as both anode and cathode. In chemical coagulation, aluminium is used as the coagulant, which may induce secondary pollution. The purpose of this study is to investigate and compare the performance of electrocoagulation and chemical coagulation for boron removal from synthetic wastewater. The effects of different parameters, such as reaction pH, coagulant dosage, and initial boron concentration, were examined. The results show that boron removal using chemical coagulation was lower. At the optimum condition (e.g., pH 8 and 0.8 mol coagulant dosage), boron removal efficiencies for chemical coagulation and electrocoagulation were 61% and 91%, respectively. In addition, electrocoagulation needs no chemical reagents and makes boron treatment easy to apply.
Keywords: boron removal, chemical coagulation, aluminum, electro-coagulation
Procedia PDF Downloads 404
4826 DeClEx-Processing Pipeline for Tumor Classification
Authors: Gaurav Shinde, Sai Charan Gongiguntla, Prajwal Shirur, Ahmed Hambaba
Abstract:
Health issues are significantly increasing, putting a substantial strain on healthcare services. This has accelerated the integration of machine learning in healthcare, particularly following the COVID-19 pandemic. We introduce DeClEx, a pipeline that ensures data mirrors real-world settings by incorporating Gaussian noise and blur and employing autoencoders to learn intermediate feature representations. Subsequently, our convolutional neural network, paired with spatial attention, provides accuracy comparable to state-of-the-art pre-trained models while achieving a threefold improvement in training speed. Furthermore, we provide interpretable results using explainable AI techniques. Denoising and deblurring, classification, and explainability are thus integrated in a single pipeline called DeClEx.
Keywords: machine learning, healthcare, classification, explainability
Procedia PDF Downloads 55
4825 Association of Social Data as a Tool to Support Government Decision Making
Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias
Abstract:
Based on data on child labor, this work raises questions about how to understand and locate the factors that make up child labor rates, and which properties are important for analyzing these cases. Using data mining techniques to discover valid patterns in Brazilian social databases, data on child labor in the State of Tocantins (located in the north of Brazil, with a territory of 277,000 km² comprising 139 counties) were evaluated. This work aims to detect factors that are deterministic for the practice of child labor and their relationships with financial, educational, regional, and social indicators, generating information that is not explicit in the government database, thus enabling better monitoring and updating of policies for this purpose.
Keywords: social data, government decision making, association of social data, data mining
Procedia PDF Downloads 369
4824 Combining Shallow and Deep Unsupervised Machine Learning Techniques to Detect Bad Actors in Complex Datasets
Authors: Jun Ming Moey, Zhiyaun Chen, David Nicholson
Abstract:
Bad actors are often hard to detect in data that imprints their behaviour patterns because they are comparatively rare events embedded in non-bad actor data. An unsupervised machine learning framework is applied here to detect bad actors in financial crime datasets that record millions of transactions undertaken by hundreds of actors (<0.01% bad). Specifically, the framework combines ‘shallow’ (PCA, Isolation Forest) and ‘deep’ (Autoencoder) methods to detect outlier patterns. Detection performance analysis for both the individual methods and their combination is reported.
Keywords: detection, machine learning, deep learning, unsupervised, outlier analysis, data science, fraud, financial crime
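The general idea is to score each record with several unsupervised detectors and flag the ones that are consistently extreme. The sketch below shows one plausible way to combine Isolation Forest, PCA reconstruction error, and a small autoencoder; it is illustrative only and not the authors' pipeline, and the synthetic data, component counts, and thresholds are assumptions.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 10))                   # stand-in for transaction features
X[:5] += 6.0                                      # a handful of synthetic "bad actor" rows
X = StandardScaler().fit_transform(X)

# Shallow detector 1: Isolation Forest anomaly score (higher = more anomalous).
iso = IsolationForest(random_state=0).fit(X)
score_iso = -iso.score_samples(X)

# Shallow detector 2: PCA reconstruction error.
pca = PCA(n_components=4).fit(X)
recon_pca = pca.inverse_transform(pca.transform(X))
score_pca = np.linalg.norm(X - recon_pca, axis=1)

# "Deep" detector: a small autoencoder (an MLP trained to reproduce its input).
ae = MLPRegressor(hidden_layer_sizes=(4,), max_iter=500, random_state=0).fit(X, X)
score_ae = np.linalg.norm(X - ae.predict(X), axis=1)

# Combine by averaging rank-normalised scores, then inspect the most suspicious rows.
def ranks(s):
    return np.argsort(np.argsort(s)) / len(s)

combined = (ranks(score_iso) + ranks(score_pca) + ranks(score_ae)) / 3
flagged = np.argsort(combined)[-5:]
print("flagged rows:", sorted(flagged.tolist()))
```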
Procedia PDF Downloads 94
4823 Effects of Drying and Extraction Techniques on the Profile of Volatile Compounds in Banana Pseudostem
Authors: Pantea Salehizadeh, Martin P. Bucknall, Robert Driscoll, Jayashree Arcot, George Srzednicki
Abstract:
Banana is one of the most important crops produced in large quantities in tropical and sub-tropical countries. Of the total plant material grown, approximately 40% is considered waste and left in the field to decay. This practice allows fungal diseases such as Sigatoka Leaf Spot to develop, limiting plant growth and spreading spores in the air that can cause respiratory problems in the surrounding population. The pseudostem is considered a waste residue of production (60 to 80 tonnes/ha/year), although it is a good source of dietary fiber and volatile organic compounds (VOCs). Strategies to process banana pseudostem into palatable, nutritious and marketable food materials could provide significant social and economic benefits. Extraction of VOCs with desirable odor from dried and fresh pseudostem could improve the smell of products from the confectionery and bakery industries. Incorporation of banana pseudostem flour into bakery products could provide cost savings and improve nutritional value. The aim of this study was to determine the effects of drying methods and different banana species on the profile of volatile aroma compounds in dried banana pseudostem. The banana species analyzed were Musa acuminata and Musa balbisiana. Fresh banana pseudostem samples were processed by either freeze-drying (FD) or heat pump drying (HPD). The extraction of VOCs was performed at ambient temperature using vacuum distillation, and the resulting, mostly aqueous, distillates were analyzed using headspace solid phase microextraction (SPME) gas chromatography-mass spectrometry (GC-MS). Optimal SPME adsorption conditions were 50 °C for 60 min using a Supelco 65 μm PDMS/DVB Stableflex fiber. Compounds were identified by comparison of their electron impact mass spectra with those from the Wiley 9 / NIST 2011 combined mass spectral library. The results showed that the two species have notably different VOC profiles. Both species contained VOCs that have been established in the literature to have pleasant, appetizing aromas. These included l-Menthone, D-Limonene, trans-linalool oxide, 1-Nonanol, cis-6-Nonen-1-ol, 2,6-Nonadien-1-ol, Benzenemethanol, 4-methyl-, 1-Butanol, 3-methyl-, hexanal, 1-Propanol, 2-methyl- acid, 2-Methyl-2-butanol. The results show that banana pseudostem VOCs are better preserved by FD than by HPD. This study is still in progress and should lead to the optimization of processing techniques that would promote the utilization of banana pseudostem in the food industry.
Keywords: heat pump drying, freeze drying, SPME, vacuum distillation, VOC analysis
Procedia PDF Downloads 335
4822 Stochastic Default Risk Estimation Evidence from the South African Financial Market
Authors: Mesias Alfeus, Kirsty Fitzhenry, Alessia Lederer
Abstract:
The present paper provides empirical studies to estimate defaultable bonds in the South African financial market. The main goal is to estimate the unobservable factors affecting bond yields for major South African banks. The maximum likelihood approach is adopted as the estimation methodology. Extended Kalman filtering techniques are employed in order to tackle the fact that the factors cannot be observed directly. Multi-dimensional Cox-Ingersoll-Ross (CIR)-type factor models are considered. Results show that default risk increased sharply in the South African financial market during COVID-19, and the CIR model with jumps exhibits better performance.
Keywords: default intensity, unobservable state variables, CIR, α-CIR, extended Kalman filtering
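In a CIR-type model the latent factor follows dx_t = κ(θ − x_t)dt + σ√x_t dW_t, and the extended Kalman filter recursively estimates this unobservable state from observed yields. The snippet below only sketches an Euler-Maruyama simulation of such a factor path; the parameter values are placeholders, not the estimates reported in the study.

```python
import numpy as np

def simulate_cir(x0, kappa, theta, sigma, dt, n_steps, seed=0):
    """Euler-Maruyama path of a CIR factor: dx = kappa*(theta - x)*dt + sigma*sqrt(x)*dW."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt))
        x[t + 1] = x[t] + kappa * (theta - x[t]) * dt + sigma * np.sqrt(max(x[t], 0.0)) * dw
        x[t + 1] = max(x[t + 1], 0.0)   # full truncation keeps the factor non-negative
    return x

# Placeholder parameters: mean reversion 0.5, long-run level 0.04, volatility 0.1, daily steps.
path = simulate_cir(x0=0.03, kappa=0.5, theta=0.04, sigma=0.1, dt=1 / 252, n_steps=252)
print(path[:5])
```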
Procedia PDF Downloads 111
4821 Quantization of Damped Systems Based on the Doubling of Degrees of Freedom
Authors: Khaled I. Nawafleh
Abstract:
In this paper, a canonical approach for studying dissipative oscillators based on the doubling of degrees of freedom is provided. Expressions for the Lagrangians of the elementary modes of the system are given, which lead to the familiar classical equations of motion for the dissipative oscillator. The equation for one variable is the time-reversed motion of the second variable. The extended Bateman Lagrangian is discussed in detail, specifically for a time-dependent dual extended damped oscillator. A Hamilton-Jacobi analysis showing the equivalence with the Lagrangian approach is also obtained. For that purpose, the technique of separation of variables was applied, and the quantization process was achieved.
Keywords: doubling of degrees of freedom, dissipated harmonic oscillator, Hamilton-Jacobi, time-dependent Lagrangians, quantization
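For orientation, the standard (time-independent) Bateman dual Lagrangian illustrates the doubling of degrees of freedom described here; the paper's extension makes the coefficients time-dependent, which is not reproduced in this sketch.

```latex
% Standard Bateman dual Lagrangian (time-independent form, shown for orientation only).
\[
  L = m\,\dot{x}\,\dot{y} + \frac{\gamma}{2}\left(x\,\dot{y} - \dot{x}\,y\right) - k\,x\,y
\]
% Euler-Lagrange equations: varying y gives the damped oscillator,
% varying x gives its time-reversed (amplified) mirror image.
\[
  m\ddot{x} + \gamma\dot{x} + kx = 0, \qquad
  m\ddot{y} - \gamma\dot{y} + ky = 0
\]
```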
Procedia PDF Downloads 68
4820 A General Strategy for Noise Assessment in Open Mining Industries
Authors: Diego Mauricio Murillo Gomez, Enney Leon Gonzalez Ramirez, Hugo Piedrahita, Jairo Yate
Abstract:
This paper proposes a methodology for the management of noise in open mining industries based on an integral concept, which considers occupational and environmental noise as a whole. The approach relies on the characterization of sources, the combination of several measurement techniques, and the use of acoustic prediction software. A discussion of the difference between frequently used acoustic indicators, such as Leq and LAV, is carried out, aiming to establish common ground for homologation. The results show that the correct integration of this data allows not only for a more robust technical analysis but also for a more strategic route of intervention, as several departments of the company work together. Noise control measures can be designed to provide a healthy acoustic environment that benefits not only the exposed workers but also the outdoor community.
Keywords: environmental noise, noise control, occupational noise, open mining
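The equivalent continuous sound level Leq compared against LAV here is an energy average of the time-varying level: Leq = 10·log10((1/N)·Σ 10^(L_i/10)) over N sampled levels L_i. A minimal sketch of that calculation, with made-up sample values, is shown below.

```python
import numpy as np

def leq(levels_db):
    """Equivalent continuous sound level from equally spaced dB samples."""
    return 10.0 * np.log10(np.mean(10.0 ** (np.asarray(levels_db) / 10.0)))

# Hypothetical A-weighted level samples recorded near heavy machinery.
samples = [78.0, 82.5, 90.0, 85.0, 79.5, 88.0]
print(f"Leq = {leq(samples):.1f} dB(A)")
```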
Procedia PDF Downloads 269
4819 Conductive Clay Nanocomposite Using Smectite and Poly(O-Anisidine)
Authors: M. Şahin, E. Erdem, M. Saçak
Abstract:
In this study, Na-smectite crystals purified from bentonite were used after being swollen with benzyltributylammonium bromide (BTBAB) as an alkyl ammonium salt. The swelling process was carried out using 0.2 g of BTBAB per 0.8 g of smectite with 4 h of mixing time, after investigating conditions such as mixing time and the amount of swelling agent. Then, the conductive poly(o-anisidine) (POA)/smectite nanocomposite was prepared in the presence of the swollen Na-smectite using ammonium persulfate (APS) as oxidant in aqueous acidic medium. The POA content and conductivity of the prepared nanocomposite were systematically investigated as a function of polymerization conditions such as the treatment time of the swollen smectite in monomer solution and the o-anisidine/APS mole ratio. The POA/smectite nanocomposite was characterized by XRD, FTIR and SEM techniques and was compared separately with the components of the composite.
Keywords: clay, composite, conducting polymer, poly(o-anisidine)
Procedia PDF Downloads 325
4818 Lexicon-Based Sentiment Analysis for Stock Movement Prediction
Authors: Zane Turner, Kevin Labille, Susan Gauch
Abstract:
Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We present a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
Keywords: computational finance, sentiment analysis, sentiment lexicon, stock movement prediction
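A lexicon-based approach rates a text chunk by looking up each token's sentiment weight and aggregating. The sketch below illustrates that scoring step only; the tiny lexicon and the call-transcript excerpt are invented placeholders, not the financial lexicon built in the paper.

```python
import re

# Hypothetical domain lexicon: word -> sentiment weight (positive = bullish).
lexicon = {"growth": 0.8, "beat": 0.6, "strong": 0.5,
           "decline": -0.7, "risk": -0.4, "miss": -0.6}

def sentiment_score(text, lexicon):
    """Average lexicon weight over the tokens that appear in the lexicon."""
    tokens = re.findall(r"[a-z']+", text.lower())
    weights = [lexicon[t] for t in tokens if t in lexicon]
    return sum(weights) / len(weights) if weights else 0.0

call_excerpt = "Revenue growth was strong this quarter despite supply risk."
score = sentiment_score(call_excerpt, lexicon)
print("sentiment:", round(score, 2), "-> predicted move:", "up" if score > 0 else "down")
```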
Procedia PDF Downloads 127
4817 Lexicon-Based Sentiment Analysis for Stock Movement Prediction
Authors: Zane Turner, Kevin Labille, Susan Gauch
Abstract:
Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We introduce a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
Keywords: computational finance, sentiment analysis, sentiment lexicon, stock movement prediction
Procedia PDF Downloads 170
4816 Comparison of Two Different Methods for Peptide Synthesis
Authors: Klaudia Chmielewska, Krystyna Dzierzbicka, Iwona Inkielewicz-Stepniak
Abstract:
Carnosine, an endogenous peptide consisting of β-alanine and L-histidine, has a variety of functions, including antioxidant and antiglycation activity and reducing the toxicity of metal ions. It has therefore been proposed as a therapeutic agent for many pathological states, although its therapeutic index is limited by quick enzymatic cleavage. To overcome this limitation, there is a need to create new derivatives that are less prone to hydrolysis while preserving the therapeutic effect. The poster summarizes the efficiency of two peptide synthesis methods: (1) the mixed anhydride method with isobutyl chloroformate and N-methylmorpholine (NMM), and (2) carbodiimide-mediated coupling via an appropriate condensing reagent, here CDI. The methods were used to obtain dipeptides that are derivatives of carnosine. The obtained dipeptides were prepared in the form of methyl esters, and their structures will be confirmed by 1H NMR, 13C NMR, MS and elemental analysis techniques. Later on, they will be analyzed for their antioxidant properties in comparison to carnosine.
Keywords: carnosine, method, peptide, synthesis
Procedia PDF Downloads 159
4815 The Pioneering Model in Teaching Arabic as a Mother Tongue through Modern Innovative Strategies
Authors: Rima Abu Jaber Bransi, Rawya Jarjoura Burbara
Abstract:
This study deals with two pioneering approaches in teaching Arabic as a mother tongue: first, computerization of literary and functional texts in the mother tongue; second, the pioneering model in teaching writing skills by computerization. The significance of the study lies in its treatment of a serious problem that is faced in the era of technology, which is the widening gap between the pupils and their mother tongue. The innovation in the study is that it introduces modern methods and tools and a pioneering instructional model that turns the process of mother tongue teaching into an effective, meaningful, interesting and motivating experience. In view of the Arabic language diglossia, standard Arabic and spoken Arabic, which constitutes a serious problem to the pupil in understanding unused words, and in order to bridge the gap between the pupils and their mother tongue, we resorted to computerized techniques; we took texts from the pre-Islamic period (Jahiliyya), starting with the Mu'allaqa of Imru' al-Qais and other selected functional texts, and computerized them for teaching in an interesting way that saves time and effort, develops high thinking strategies, expands the literary good taste among the pupils, and gives the text added values that neither the book, the blackboard, the teacher nor the worksheets provide. On the other hand, we have developed a pioneering computerized model that aims to develop the pupil's ability to think, to provide his imagination with the elements of growth, invention and connection, to motivate him to be creative, and to raise the level of his scores and scholastic achievements. The model consists of four basic stages in teaching according to the following order: 1. The preparatory stage, 2. The reading comprehension stage, 3. The writing stage, 4. The evaluation stage. Our lecture will introduce a detailed description of the model with illustrations and samples from the units that we built, highlighting some aspects of the uniqueness and innovation that are specific to this model and the different integrated tools and techniques that we developed. One of the most significant conclusions of this research is that teaching languages through the employment of new computerized strategies is very likely to get Arabic-speaking pupils out of the circle of passive reception into active and serious action and interaction. The study also emphasizes the argument that the computerized model of teaching can change the role of the pupil's mind from being a store of knowledge for a short time into a partner in producing knowledge and storing it in a coherent way that prevents it from being forgotten and keeps it in memory for a long period of time. Consequently, the learners also turn into partners in evaluation by expressing their views, giving their notes and observations, and applying the method of peer-teaching and learning.
Keywords: classical poetry, computerization, diglossia, writing skill
Procedia PDF Downloads 225
4814 Estimating Cyclone Intensity Using INSAT-3D IR Images Based on Convolution Neural Network Model
Authors: Divvela Vishnu Sai Kumar, Deepak Arora, Sheenu Rizvi
Abstract:
Forecasting a cyclone through satellite images consists of estimating the intensity of the cyclone and predicting it before the cyclone arrives. This research work can help people take safety measures before a cyclone comes. The prediction of cyclone intensity is very important to save lives and minimize the damage caused by cyclones, which are among the costliest natural disasters and cause extensive damage globally. The authors propose five different CNN (Convolutional Neural Network) models that estimate the intensity of cyclones from INSAT-3D IR images. Among the techniques used to estimate intensity, the best model proposed by the authors estimates intensity with a root mean squared error (RMSE) of 10.02 kts.
Keywords: estimating cyclone intensity, deep learning, convolution neural network, prediction models
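For intensity estimation, the satellite IR image is mapped to a single wind-speed value, so the network ends in one linear output unit and is trained with a squared-error loss, with RMSE then reported in knots. The Keras sketch below shows a generic CNN regressor of that kind; the layer sizes and the 128×128 input shape are assumptions for illustration, not the authors' architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_intensity_cnn(input_shape=(128, 128, 1)):
    """Small CNN that regresses cyclone intensity (in kt) from a single IR image."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1),                       # single linear unit: intensity in knots
    ])
    model.compile(optimizer="adam",
                  loss="mse",
                  metrics=[tf.keras.metrics.RootMeanSquaredError()])
    return model

model = build_intensity_cnn()
model.summary()
# model.fit(train_images, train_intensities_kt, validation_split=0.1, epochs=20)
```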
Procedia PDF Downloads 127
4813 Research of Applicable Ground Reinforcement Method in Double-Deck Tunnel Junction
Authors: SKhan Park, Seok Jin Lee, Jong Sun Kim, Jun Ho Lee, Bong Chan Kim
Abstract:
Because of the large economic losses caused by traffic congestion in metropolitan areas, various studies on underground network design and construction techniques have been performed in developed countries. In Korea, a study has been performed to develop a versatile double-deck deep tunnel model. This paper introduces the development of a ground reinforcement method to enable safe tunnel construction in weakened pillar sections such as tunnel junctions. An applicable ground reinforcement method for the weakened section is proposed, and it is expected that the method will be verified by field application tests.
Keywords: double-deck tunnel, ground reinforcement, tunnel construction, weakened pillar section
Procedia PDF Downloads 409
4812 Throughflow Effects on Thermal Convection in Variable Viscosity Ferromagnetic Liquids
Authors: G. N. Sekhar, P. G. Siddheshwar, G. Jayalatha, R. Prakash
Abstract:
The problem of thermal convection in a temperature- and magnetic-field-sensitive Newtonian ferromagnetic liquid is studied in the presence of a uniform vertical magnetic field and throughflow. Using a combination of Galerkin and shooting techniques, the critical eigenvalues are obtained for the stationary mode. The effect of the Prandtl number (Pr > 1) on onset is insignificant, and the nonlinearity of the non-buoyancy magnetic parameter M3 is found to have no influence on the onset of ferroconvection. The magnetic buoyancy number M1 and the variable viscosity parameter V have destabilizing influences on the system. The effect of the throughflow Peclet number Pe is to delay the onset of ferroconvection, and this effect is independent of the direction of flow.
Keywords: ferroconvection, magnetic field dependent viscosity, temperature dependent viscosity, throughflow
Procedia PDF Downloads 265
4811 Computational Homogenization of Thin Walled Structures: On the Influence of the Global vs Local Applied Plane Stress Condition
Authors: M. Beusink, E. W. C. Coenen
Abstract:
The increased application of novel structural materials, such as high grade asphalt, concrete and laminated composites, has sparked the need for a better understanding of the often complex, non-linear mechanical behavior of such materials. The effective macroscopic mechanical response is generally dependent on the applied load path. Moreover, it is also significantly influenced by the microstructure of the material, e.g. embedded fibers, voids and/or grain morphology. At present, multiscale techniques are widely adopted to assess micro-macro interactions in a numerically efficient way. Computational homogenization techniques have been successfully applied over a wide range of engineering cases, e.g. cases involving first order and second order continua, thin shells and cohesive zone models. Most of these homogenization methods rely on Representative Volume Elements (RVE), which model the relevant microstructural details in a confined volume. Imposed through kinematical constraints or boundary conditions, a RVE can be subjected to a microscopic load sequence. This provides the RVE's effective stress-strain response, which can serve as constitutive input for macroscale analyses. Simultaneously, such a study of a RVE gives insight into fine scale phenomena such as microstructural damage and its evolution. It has been reported by several authors that the type of boundary conditions applied to the RVE affects the resulting homogenized stress-strain response. As a consequence, dedicated boundary conditions have been proposed to appropriately deal with this concern. For the specific case of a planar assumption for the analyzed structure, e.g. plane strain, axisymmetric or plane stress, this assumption needs to be addressed consistently in all considered scales. Although in many multiscale studies a planar condition has been employed, the related impact on the multiscale solution has not been explicitly investigated. This work therefore focuses on the influence of the planar assumption for multiscale modeling. In particular the plane stress case is highlighted, by proposing three different implementation strategies which are compatible with a first-order computational homogenization framework. The first method consists of applying classical plane stress theory at the microscale, whereas with the second method a generalized plane stress condition is assumed at the RVE level. For the third method, the plane stress condition is applied at the macroscale by requiring that the resulting macroscopic out-of-plane forces are equal to zero. These strategies are assessed through a numerical study of a thin walled structure and the resulting effective macroscale stress-strain response is compared. It is shown that there is a clear influence of the length scale at which the planar condition is applied.
Keywords: first-order computational homogenization, planar analysis, multiscale, microstructures
Procedia PDF Downloads 233
4810 Coding Considerations for Standalone Molecular Dynamics Simulations of Atomistic Structures
Authors: R. O. Ocaya, J. J. Terblans
Abstract:
The laws of Newtonian mechanics allow ab-initio molecular dynamics to model and simulate particle trajectories in material science by defining a differentiable potential function. This paper discusses some considerations for the coding of ab-initio programs for simulation on a standalone computer and illustrates the approach by C language codes in the context of embedded metallic atoms in the face-centred cubic structure. The algorithms use velocity-time integration to determine particle parameter evolution for up to several thousands of particles in a thermodynamical ensemble. Such functions are reusable and can be placed in a redistributable header library file. While there are both commercial and free packages available, their heuristic nature prevents dissection. In addition, developing own codes has the obvious advantage of teaching techniques applicable to new problems.
Keywords: C language, molecular dynamics, simulation, embedded atom method
Procedia PDF Downloads 305
4809 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network
Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang
Abstract:
‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics applying a range of mathematical, statistical and rule-based approaches can be used to reveal important information on demand from the available data provided at second, minute or hourly intervals. This study aims to combine these two concepts to address the current water end-use disaggregation problem by applying a wide range of advanced pattern recognition techniques to analyse concurrent high-resolution water-energy consumption data. The obtained results show that recognition accuracies of all end uses have significantly increased, especially for mechanised categories, including clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
Keywords: deep learning network, smart metering, water end use, water-energy data
Procedia PDF Downloads 306
4808 Patterns Obtained by Using Knitting Technique in Textile Crafts
Authors: Özlem Erzurumlu, Nazan Oskay, Ece Melek
Abstract:
Knitting, one of the textile manufacturing techniques, is produced using a single-yarn system. Knitted wares, structurally consisting of loops, have flexible structures. Knitting can be shaped and given volume easily owing to the possibility of increasing or decreasing the number of loops, being manufactured in circular form, and its flexible structure. While knitted wares were originally manufactured to meet practical requirements, knitting later took its place in the art field, overflowing beyond industrial production. The textile artist converts ideas into artistic products by using textile and non-textile materials with aesthetic concerns and creative impulses. When textile crafts are observed at the present time, we see that the knitting technique has an extensive area of use, such as sculpture, panels, installation art and performing art. This study examines how the knitting technique is used in textile crafts by observing the patterns obtained by this technique.
Keywords: art, textile, knitting art, textile crafts
Procedia PDF Downloads 707
4807 Software Defect Analysis - Eclipse Dataset
Authors: Amrane Meriem, Oukid Salyha
Abstract:
The presence of defects or bugs in software can lead to costly setbacks, operational inefficiencies, and compromised user experiences. Machine Learning (ML) techniques have been integrated to predict and preemptively address software defects. ML represents a proactive strategy aimed at identifying potential anomalies, errors, or vulnerabilities within code before they manifest as operational issues. By analyzing historical data, such as code changes, feature implementations, and defect occurrences, development teams can anticipate and mitigate these issues, thus enhancing software quality, reducing maintenance costs, and ensuring smoother user interactions. In this work, we used a recommendation system to improve the performance of ML models in terms of predicting code severity and estimating effort.
Keywords: software engineering, machine learning, bugs detection, effort estimation
Procedia PDF Downloads 86
4806 Classification of Coughing and Breathing Activities Using Wearable and a Light-Weight DL Model
Authors: Subham Ghosh, Arnab Nandi
Abstract:
Background: The proliferation of Wireless Body Area Networks (WBAN) and Internet of Things (IoT) applications demonstrates the potential for continuous monitoring of physical changes in the body. These technologies are vital for health monitoring tasks, such as identifying coughing and breathing activities, which are necessary for disease diagnosis and management. Monitoring activities such as coughing and deep breathing can provide valuable insights into a variety of medical issues. Wearable radio-based antenna sensors, which are lightweight and easy to incorporate into clothing or portable goods, provide continuous monitoring. This mobility gives them a substantial advantage over stationary environmental sensors such as cameras and radar, which are constrained to certain places. Furthermore, using compressive techniques provides benefits such as reduced data transmission rates and memory needs. These wearable sensors offer more advanced and diverse health monitoring capabilities. Methodology: This study analyzes the feasibility of using a semi-flexible antenna operating at 2.4 GHz (ISM band) and positioned around the neck and near the mouth to identify three activities: coughing, deep breathing, and idleness. A vector network analyzer (VNA) is used to collect time-varying complex reflection coefficient data from the perturbed antenna nearfield. The reflection coefficient (S11) conveys nuanced information caused by simultaneous variations in the nearfield radiation of the three activities across time. The signatures are sparsely represented with Gaussian-windowed Gabor spectrograms. The Gabor spectrogram is used as a sparse representation approach, which reassigns the ridges of the spectrogram images to improve their resolution and focus on essential components. The antenna is biocompatible in terms of specific absorption rate (SAR). The sparsely represented Gabor spectrogram images are fed into a lightweight deep learning (DL) model for feature extraction and classification. Two antenna locations are investigated in order to determine the most effective localization for the three different activities. Findings: Cross-validation techniques were used on data from both locations. Due to the complex form of the recorded S11, separate analyses and assessments were performed on the magnitude, phase, and their combination. The combination of magnitude and phase fared better than the separate analyses. Various sliding window sizes, ranging from 1 to 5 seconds, were tested to find the best window for activity classification. It was discovered that a neck-mounted design was effective at detecting the three distinct activities.
Keywords: activity recognition, antenna, deep-learning, time-frequency
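A Gaussian-windowed Gabor spectrogram is essentially a short-time Fourier transform with a Gaussian window, turning the time-varying |S11| trace into a time-frequency image a lightweight network can classify. The sketch below shows that representation step only (without the ridge reassignment described above); the sampling rate, window length, and the synthetic signal are placeholder assumptions, not the study's settings.

```python
import numpy as np
from scipy.signal import stft

fs = 100.0                                  # assumed samples per second from the VNA sweep
t = np.arange(0, 10, 1 / fs)
# Stand-in |S11| trace: slow breathing-like modulation plus a brief cough-like burst.
s11 = 0.05 * np.sin(2 * np.pi * 0.3 * t)
s11[400:430] += 0.2 * np.random.default_rng(0).normal(size=30)

# Gaussian-windowed STFT = Gabor transform; the magnitude gives the spectrogram image.
f, frames, Z = stft(s11, fs=fs, window=("gaussian", 16), nperseg=128, noverlap=96)
spectrogram = np.abs(Z)
print(spectrogram.shape)                    # (frequency bins, time frames) fed to the DL model
```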
Procedia PDF Downloads 9
4805 A Dynamic Mechanical Thermal T-Peel Test Approach to Characterize Interfacial Behavior of Polymeric Textile Composites
Authors: J. R. Büttler, T. Pham
Abstract:
Basic understanding of interfacial mechanisms is of importance for the development of polymer composites. For this purpose, we need techniques to analyze the quality of interphases, their chemical and physical interactions and their strength and fracture resistance. In order to investigate the interfacial phenomena in detail, advanced characterization techniques are favorable. Dynamic mechanical thermal analysis (DMTA) using a rheological system is a sensitive tool. T-peel tests were performed with this system to investigate the temperature-dependent peel behavior of woven textile composites. A model system was made of polyamide (PA) woven fabric laminated with films of polypropylene (PP) or PP modified by grafting with maleic anhydride (PP-g-MAH). Firstly, control measurements were performed with PP matrices alone. Polymer melt investigations, as well as the extensional stress, extensional viscosity and extensional relaxation modulus at -10 °C, 100 °C and 170 °C, demonstrate similar viscoelastic behavior for films made of PP-g-MAH and its non-modified PP-control. Frequency sweeps have shown that PP-g-MAH has a zero phase viscosity of around 1600 Pa·s and PP-control has a similar zero phase viscosity of 1345 Pa·s. Also, the gelation points are similar at 2.42×10⁴ Pa (118 rad/s) and 2.81×10⁴ Pa (161 rad/s) for PP-control and PP-g-MAH, respectively. Secondly, the textile composite was analyzed. The extensional stress of PA66 fabric laminated with either PP-control or PP-g-MAH at -10 °C, 25 °C and 170 °C for strain rates of 0.001–1 s⁻¹ was investigated. The laminates containing the modified PP need more stress for T-peeling. However, the strengthening effect due to the modification decreases with increasing temperature, and at 170 °C, just above the melting temperature of the matrix, the difference disappears. Independent of the matrix used in the textile composite, there is a decrease of extensional stress with increasing temperature. It appears that the more viscous the matrix, the weaker the laminar adhesion. Possibly, the measurement is influenced by the fact that the laminate becomes stiffer at lower temperatures. Adhesive lap-shear testing at room temperature supports the findings obtained with the T-peel test. Additional analysis of the textile composite at the microscopic level ensures that the fibers are well embedded in the matrix. Atomic force microscopy (AFM) imaging of a cross section of the composite shows no gaps between the fibers and matrix. Measurements of the water contact angle show that the MAH-grafted PP is more polar than the virgin PP, which suggests a more favorable chemical interaction of PP-g-MAH with PA compared to the non-modified PP. In fact, this study indicates that T-peel testing by DMTA is a technique to achieve more insights into polymeric textile composites.
Keywords: dynamic mechanical thermal analysis, interphase, polyamide, polypropylene, textile composite
Procedia PDF Downloads 129
4804 Systematic Formulation Development and Evaluation of Self-Nanoemulsifying Systems of Rosuvastatin Employing QbD Approach and Chemometric Techniques
Authors: Sarwar Beg, Gajanand Sharma, O. P. Katare, Bhupinder Singh
Abstract:
The current studies entail the development of self-nanoemulsifying drug delivery systems (SNEDDS) of rosuvastatin, employing a rational QbD-based approach for enhancing its oral bioavailability. SNEDDS were prepared using a blend of lipidic and emulsifying excipients, i.e., Peceol, Tween 80, and Transcutol HP. The prepared formulations were evaluated by in vitro drug release, ex vivo permeation, in situ perfusion, and in vivo pharmacokinetic studies in rats, which demonstrated a 3-4 fold improvement in the biopharmaceutical performance of the developed formulations. Cytotoxicity studies using the MTT assay and histopathological studies in intestinal cells revealed a lack of cytotoxicity and thereby the safety and efficacy of the developed formulations.
Keywords: SNEDDS, bioavailability, solubility, Quality by Design (QbD)
Procedia PDF Downloads 505