Search results for: solution validation
2127 Analysis of Nonlinear Dynamic Systems Excited by Combined Colored and White Noise Excitations
Authors: Siu-Siu Guo, Qingxuan Shi
Abstract:
In this paper, single-degree-of-freedom (SDOF) systems subjected to combined white noise and colored noise excitations are investigated. By expressing the colored noise excitation as a second-order filtered white noise process and introducing the colored noise as an additional state variable, the equation of motion for the SDOF system under colored noise is artificially transformed into that of a multi-degree-of-freedom (MDOF) system under white noise excitations. As a consequence, the corresponding Fokker-Planck-Kolmogorov (FPK) equation governing the joint probability density function (PDF) of the state variables increases to four dimensions (4-D), and the solution procedure and computer program become much more sophisticated. The exponential-polynomial closure (EPC) method, widely applied to SDOF systems under white noise excitations, is developed and improved for systems under colored noise excitations and for solving the complex 4-D FPK equation. Monte Carlo simulation (MCS) is performed to test the approximate EPC solutions. Two examples involving Gaussian and non-Gaussian colored noise excitations are considered, and the corresponding band-limited power spectral densities (PSDs) of the colored noise excitations are given separately. Numerical studies show that the developed EPC method provides relatively accurate estimates of the stationary probabilistic solutions. Moreover, the mean up-crossing rate (MCR), a statistical parameter important for reliability and failure analysis, is taken into account.
Keywords: filtered noise, narrow-banded noise, nonlinear dynamic, random vibration
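The filtered-noise construction described above can be sketched generically as follows; the filter parameters and the drift and diffusion symbols are illustrative placeholders, not the paper's notation:

```latex
% SDOF oscillator under white noise W_1(t) plus colored noise F(t)
\ddot{x} + g(x,\dot{x}) = W_1(t) + F(t), \qquad
% colored noise modeled as second-order filtered white noise W_2(t)
\ddot{F} + 2\zeta_f\,\omega_f\,\dot{F} + \omega_f^2 F = W_2(t).
% With the augmented state vector z = (x,\dot{x},F,\dot{F}), the
% stationary FPK equation for the joint PDF p(z) is four-dimensional:
\sum_{i=1}^{4}\frac{\partial}{\partial z_i}\!\left[a_i(\mathbf{z})\,p(\mathbf{z})\right]
-\frac{1}{2}\sum_{i=1}^{4}\sum_{j=1}^{4}
\frac{\partial^2}{\partial z_i\,\partial z_j}\!\left[b_{ij}\,p(\mathbf{z})\right]=0.
```

Appending the filter state in this way is what raises the FPK equation from two state variables to four, which is the complexity the improved EPC method addresses.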
Procedia PDF Downloads 225
2126 An Efficient Subcarrier Scheduling Algorithm for Downlink OFDMA-Based Wireless Broadband Networks
Authors: Hassen Hamouda, Mohamed Ouwais Kabaou, Med Salim Bouhlel
Abstract:
The growth of wireless technology has made opportunistic scheduling a widespread theme in recent research. Providing high system throughput without sacrificing fairness of allocation is a very challenging task, and a suitable policy for resource allocation among users is of crucial importance. This study focuses on scheduling multiple streaming flows on the downlink of a WiMAX system based on orthogonal frequency division multiple access (OFDMA). In this paper, we take the first step in formulating and analyzing this problem scrupulously, and we propose a new scheduling scheme based on the Round Robin (RR) algorithm. Because of its non-opportunistic nature, RR does not take radio conditions into account, which affects both system throughput and multi-user diversity. Our contribution, called MORRA (Modified Round Robin Opportunistic Algorithm), addresses this issue: MORRA not only exploits the concept of an opportunistic scheduler but also takes other parameters into account in the allocation process, namely a courtesy coefficient (CC) and the buffer occupancy (BO). Performance evaluation shows that this well-balanced scheme outperforms both the RR and MaxSNR schedulers and demonstrates that choosing between system throughput and fairness is not required.
Keywords: OFDMA, opportunistic scheduling, fairness hierarchy, courtesy coefficient, buffer occupancy
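A minimal sketch of the kind of combined metric such a scheduler might use, assuming a multiplicative weighting of channel quality, courtesy coefficient, and buffer occupancy; the abstract does not give MORRA's actual formula, so the metric and numbers below are hypothetical:

```python
# Illustrative opportunistic round-robin subcarrier allocation.
# The weighting of SNR, courtesy coefficient (CC) and buffer occupancy (BO)
# is an assumption for illustration, not MORRA's published formula.

def allocate_subcarrier(users):
    """Pick the user with the best combined metric for one subcarrier.

    Each user is a dict with:
      snr -- current channel quality on this subcarrier (dB)
      cc  -- courtesy coefficient, grows while a user waits unserved
      bo  -- buffer occupancy, fraction of the queue filled (0..1)
    """
    def metric(u):
        # Channel-aware (opportunistic) term times fairness/urgency terms.
        return u["snr"] * u["cc"] * (0.5 + u["bo"])
    return max(users, key=metric)

users = [
    {"id": "A", "snr": 20.0, "cc": 1.0, "bo": 0.2},
    {"id": "B", "snr": 12.0, "cc": 2.5, "bo": 0.9},  # long-waiting, full buffer
    {"id": "C", "snr": 25.0, "cc": 1.0, "bo": 0.1},
]
chosen = allocate_subcarrier(users)
```

Even though user C has the best channel, the waiting time and full buffer of user B outweigh it, which is the balance between throughput and fairness the abstract describes.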
Procedia PDF Downloads 300
2125 Traffic Prediction with Raw Data Utilization and Context Building
Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao
Abstract:
Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain has two major focuses: (1) accurate forecasting of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses both issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain intra-trajectory features with graph-based encoding and inter-trajectory ones with a grid-based model and a back-projection technique that restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strengths of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.
Keywords: traffic prediction, raw data utilization, context building, data reduction
Procedia PDF Downloads 127
2124 3D Label-Free Bioimaging of Native Tissue with Selective Plane Illumination Optical Microscopy
Authors: Jing Zhang, Yvonne Reinwald, Nick Poulson, Alicia El Haj, Chung See, Mike Somekh, Melissa Mather
Abstract:
Biomedical imaging of native tissue using light offers the potential to obtain excellent structural and functional information in a non-invasive manner with good temporal resolution. Image contrast can be derived from intrinsic absorption, fluorescence, or scatter, or through the use of extrinsic contrast agents. A major challenge in applying optical microscopy to in vivo tissue imaging is the effect of light attenuation, which limits the penetration depth and the achievable imaging resolution. Recently, Selective Plane Illumination Microscopy (SPIM) has been used to map the 3D distribution of fluorophores dispersed in biological structures. In this approach, a focused sheet of light illuminates the sample from the side to excite fluorophores within the sample of interest, and images are formed by detecting fluorescence emission orthogonal to the illumination axis. By scanning the sample along the detection axis and acquiring a stack of images, 3D volumes can be obtained. The combination of rapid image acquisition, a low photon dose to the sample, and optical sectioning makes SPIM an attractive approach for imaging biological samples in 3D. To date, all implementations of SPIM rely on the use of fluorescence reporters, be they endogenous or exogenous. This approach has the disadvantage that, in the case of exogenous probes, the specimens are altered from their native state, rendering them unsuitable for in vivo studies; in general, fluorescence emission is also weak and transient. Here we present, for the first time to our knowledge, a label-free implementation of SPIM that has downstream applications in the clinical setting. The experimental setup used in this work incorporates both label-free and fluorescent illumination arms, in addition to a high-specification camera that can be partitioned for simultaneous imaging of both fluorescence emission and light scattered from intrinsic sources of optical contrast in the sample being studied.
This work first involved calibration of the imaging system and validation of the label-free method with well-characterised fluorescent microbeads embedded in agarose gel. 3D constructs of mammalian cells cultured in agarose gel at varying cell concentrations were then imaged. A time-course study to track cell proliferation in the 3D construct was also carried out, and finally a native tissue sample was imaged. For each sample, multiple images were obtained by scanning the sample along the detection axis, and 3D maps were reconstructed. The results validated label-free SPIM as a viable approach for imaging cells in a 3D gel construct and in native tissue. This technique has potential for use in a near-patient environment, providing results quickly in an easy-to-use manner, with more information, improved spatial resolution, and greater depth penetration than current approaches.
Keywords: bioimaging, optics, selective plane illumination microscopy, tissue imaging
Procedia PDF Downloads 248
2123 Development and Validation of a Quantitative Measure of Engagement in the Analysing Aspect of Dialogical Inquiry
Authors: Marcus Goh Tian Xi, Alicia Chua Si Wen, Eunice Gan Ghee Wu, Helen Bound, Lee Liang Ying, Albert Lee
Abstract:
The Map of Dialogical Inquiry provides a conceptual look at the underlying nature of future-oriented skills. According to the Map, learning is learner-oriented, with conversational time shifted from teachers to learners, who play a strong role in deciding what and how they learn. For example, in courses operating on the principles of Dialogical Inquiry, learners left the classroom with a deeper understanding of the topic, broader exposure to differing perspectives, and stronger critical thinking capabilities compared to traditional approaches to teaching. Despite its contributions to learning, the Map is grounded in a qualitative approach, both in its development and in its application for providing feedback to learners and educators. Studies hinge on open-ended responses by Map users, which can be time-consuming and resource-intensive. The present research is motivated by this gap in practicality and aims to develop and validate a quantitative measure of the Map. In addition, a quantifiable measure may strengthen applicability by making learning experiences trackable and comparable. The Map outlines eight learning aspects that learners should engage with holistically; this research focuses on the Analysing aspect. According to the Map, Analysing has four key components: liking or engaging in logic, using interpretative lenses, seeking patterns, and critiquing and deconstructing. Existing scales of constructs related to these components (e.g., critical thinking, rationality) were identified so that the current scale could adapt items from them. Specifically, items were phrased beginning with an "I", followed by an action phrase, to assess learners' engagement with Analysing either in general or in classroom contexts.
Paralleling standard scale development procedures, the 26-item Analysing scale was administered to 330 participants alongside existing scales with varying levels of association to Analysing, in order to establish construct validity. Subsequently, the scale was refined, and its dimensionality, reliability, and validity were determined. Confirmatory factor analysis (CFA) revealed whether scale items loaded onto the four factors corresponding to the components of Analysing. To refine the scale, items were systematically removed via an iterative procedure, according to their factor loadings and the results of likelihood ratio tests at each step; eight items were removed this way. The Analysing scale is better conceptualised as unidimensional, rather than comprising the four components identified by the Map, for three reasons: 1) the covariance matrix of the model specified for the CFA was not positive definite, 2) the correlations among the four factors were high, and 3) exploratory factor analyses did not yield an easily interpretable factor structure for Analysing. Regarding validity, since the Analysing scale had higher correlations with conceptually similar scales than with conceptually distinct scales, with minor exceptions, construct validity was largely established. Overall, the satisfactory reliability and validity of the scale suggest that the current procedure can produce a valid and easy-to-use measure for each aspect of the Map.
Keywords: analytical thinking, dialogical inquiry, education, lifelong learning, pedagogy, scale development
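The reliability assessment mentioned above is typically done with an internal-consistency coefficient. A minimal sketch of Cronbach's alpha, the standard such coefficient for a unidimensional scale; the responses below are invented for illustration, as the abstract reports no raw data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of totals).
# Sample responses are made up; rows = respondents, columns = scale items.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """rows: list of respondents, each a list of item scores."""
    k = len(rows[0])
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Perfectly consistent items (every column identical) give alpha = 1.
responses = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [5, 5, 5]]
alpha = cronbach_alpha(responses)
```

In practice one would compute this on the retained 18 items after the iterative removal step; values above roughly 0.7 are conventionally read as satisfactory reliability.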
Procedia PDF Downloads 91
2122 Enhancing Cellulose Acetate Films: Impact of Glycerol and Ionic Liquid Plasticizers
Authors: Rezzouq Asiya, Bouftou Abderrahim, Belfadil Doha, Taoufyk Azzeddine, El Bouchti Mehdi, Zyade Souad, Cherkaoui Omar, Majid Sanaa
Abstract:
Plastic packaging is widely used, but the pollution it causes is a major environmental problem. Solutions require new sustainable technologies, environmental management, and the use of bio-based polymers for sustainable packaging. Cellulose acetate (CA) is a bio-based polymer used in a variety of applications, such as the manufacture of plastic films, textiles, and filters. However, it has limitations in terms of thermal stability and rigidity, which necessitates the addition of plasticizers to optimize its use in packaging. Plasticizers are molecules that increase the flexibility of polymers, but their influence on the chemical and physical properties of CA films has not been studied in detail: some studies have focused on mechanical and thermal properties, but an in-depth analysis is needed to understand the interactions between the additives and the polymer matrix. In this study, the aim is to examine the effect of two types of plasticizers, glycerol (a conventional plasticizer) and an ionic liquid, on the transparency and the mechanical, thermal, and barrier properties of cellulose acetate (CA) films prepared by the solution-casting method. Various analytical techniques were used to characterize these films, including infrared spectroscopy (FT-IR), X-ray diffraction (XRD), thermogravimetric analysis (TGA), water vapor permeability (WVP), oxygen permeability, scanning electron microscopy (SEM), opacity and transmission analysis, and mechanical tests.
Keywords: cellulose acetate, plasticizers, biopolymers, ionic liquid, glycerol
Procedia PDF Downloads 40
2121 Design of Aesthetic Acoustic Metamaterials Window Panel Based on Sierpiński Fractal Triangle for Sound-Silencing with Free Airflow
Authors: Sanjeet Kumar Singh, Shantanu Bhattacharya
Abstract:
The design of a high-efficiency, low-frequency (<1000 Hz) soundproof window or wall absorber that is transparent to airflow is presented. Due to the massive rise in human population and modernization, environmental noise has risen significantly worldwide. Prolonged noise exposure can cause severe physiological and psychological symptoms such as nausea, headaches, fatigue, and insomnia. There has been continuous growth in building construction and infrastructure such as offices, bus stops, and airports due to the urban population. Generally, a ventilated window is used to bring fresh air into the room, but unwanted noise comes along with it. Researchers have used traditional approaches such as noise-barrier mats in front of the window, or have designed the entire window from sound-absorbing materials. However, such solutions are not aesthetically pleasing, and at the same time they are heavy and not adequate for low-frequency noise shielding. To address this challenge, we design a transparent hexagonal panel based on the Sierpiński fractal triangle, which is aesthetically pleasing and demonstrates a normal-incidence sound absorption coefficient of more than 0.96 around 700 Hz and a transmission loss of around 23 dB, while maintaining air circulation through the triangular cutout. Next, we present a concept for fabricating large acoustic panels for large-scale applications, which leads to suppressing urban noise pollution.
Keywords: acoustic metamaterials, ventilation, urban noise pollution, noise control
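To put the reported figures in energy terms, the standard definition of transmission loss can be inverted; this is a quick check using the values quoted in the abstract, not additional data from the paper:

```python
# Transmission loss TL (dB) relates to the power transmission coefficient
# tau by TL = 10 * log10(1 / tau), so tau = 10 ** (-TL / 10).

def transmission_coefficient(tl_db):
    """Fraction of incident sound power transmitted through the panel."""
    return 10 ** (-tl_db / 10)

tau = transmission_coefficient(23.0)  # TL ~ 23 dB from the abstract
absorbed = 0.96                       # reported normal-incidence absorption
```

A 23 dB transmission loss means only about 0.5% of the incident acoustic power passes through the panel, consistent with an absorption coefficient near 0.96 at the 700 Hz resonance.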
Procedia PDF Downloads 108
2120 Effect of Temperature on Pervaporation Performance of Ag-Poly Vinyl Alcohol Nanocomposite Membranes
Authors: Asmaa Selim, Peter Mizsey
Abstract:
Bio-ethanol is considered to have high potential as a green, renewable energy source owing to its environmental benefits and high efficiency. In the present study, silver nanoparticles were generated in situ in poly(vinyl alcohol) (PVA) by solution casting in order to improve its potential for the pervaporation of ethanol-water mixtures. The effect of silver content on the pervaporation separation index and the enrichment factor of the membrane at 15 mass% water and 40 °C is reported. Pervaporation data for the nanocomposite membranes showed around a 100% increase in water permeance, while the intrinsic selectivity decreased: the water permeances of the original crosslinked PVA membrane and the 2.5% silver-loaded PVA membrane are 26.65 and 70.45 g/(m²·kPa·h), respectively. The values of the total flux and the water flux are close to each other, indicating that the membranes could effectively be used to break the azeotropic point of ethanol-water mixtures. The effect of temperature on the pervaporation performance, permeation parameters, and diffusion coefficients of both water and ethanol is discussed. The negative heat of sorption (ΔHs) values, calculated on the basis of the estimated Arrhenius activation energies, indicate that the sorption process was controlled by Langmuir's mode. The overall results showed that the membrane containing 0.5 mass% of Ag salt exhibited excellent PV performance.
Keywords: bio-ethanol, diffusion coefficient, nanocomposite, pervaporation, poly (vinyl alcohol), silver nanoparticles
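A minimal sketch of the two figures of merit named above, using their standard definitions; the permeate composition and flux below are assumed values for illustration, since the abstract reports only the feed composition:

```python
# Separation factor and pervaporation separation index (PSI), as usually
# defined. Feed is 15 mass% water (from the abstract); the permeate water
# fraction and the total flux are hypothetical illustration values.

def separation_factor(x_water, y_water):
    """alpha = (y_w / y_e) / (x_w / x_e): feed fraction x, permeate y."""
    return (y_water / (1 - y_water)) / (x_water / (1 - x_water))

def pervaporation_separation_index(total_flux, alpha):
    """PSI = J * (alpha - 1); zero for a membrane that does not separate."""
    return total_flux * (alpha - 1)

x = 0.15    # feed: 15 mass% water (abstract)
y = 0.90    # permeate water fraction -- assumed
alpha = separation_factor(x, y)
psi = pervaporation_separation_index(500.0, alpha)  # flux in g/(m2 h) -- assumed
```

The PSI combines flux and selectivity in one number, which is why it is the quantity the study tracks as silver content changes: a membrane can gain permeance (higher J) while losing intrinsic selectivity (lower alpha), exactly the trade-off reported.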
Procedia PDF Downloads 170
2119 Ultrathin Tin-Silicalite 1 Zeolite Membrane in Ester Solvent Recovery
Authors: Kun Liang Ang, Eng Toon Saw, Wei He, Xuecheng Dong, Seeram Ramakrishna
Abstract:
Ester solvents are widely used in the pharmaceutical, printing, and flavor industries due to their good miscibility, low toxicity, and high volatility. Through pervaporation, these ester solvents can be recovered from industrial wastewater. While metal-doped silicalite-1 zeolite membranes are commonly used for organic solvent recovery in the pervaporation process, these ceramic membranes suffer from low permeation flux, mainly due to the high thickness of the metal-doped zeolite layer. Herein, a simple method of fabricating an ultrathin tin-silicalite-1 membrane supported on an alumina tube is reported. This ultrathin membrane is able to achieve high permeation flux and a high separation factor for an ester in dilute aqueous solution. Nanosized tin-silicalite-1 seeds smaller than 500 nm were formed through hydrothermal synthesis. The seeds were then deposited onto the alumina tube by dip coating, and the tin-silicalite-1 membrane was grown by hydrothermal synthesis in an autoclave through the secondary growth method. The effects of several synthesis factors, such as seed size, substrate surface pore size, and secondary growth conditions, on zeolite membrane growth were studied. The microstructure, morphology, and thickness of the tin-silicalite-1 membrane were examined. The membrane separation performance and stability are also reported.
Keywords: ceramic membrane, pervaporation, solvent recovery, Sn-MFI zeolite
Procedia PDF Downloads 189
2118 Comparative Fracture Parameters of Khaya ivorensis and Magnolia obovata: Outlooks for the Development of Sustainable Mobility Materials
Authors: Riccardo Houngbegnon, Loic Chrislin Nguedjio, Valery Doko, José Xavier, Miran Merhar, Rostand Moutou Pitti
Abstract:
Against a backdrop of heightened awareness of environmental impact and the reduction of space debris, the use of sustainable materials for mobility applications is emerging as a promising way to minimize the environmental footprint of our technologies. Among recent innovative developments in the use of wood, the Japanese species Magnolia obovata attracted particular interest when it was used in the design of the first wooden satellite, launched in November 2024. The aim of this project is to explore new species that could replace M. obovata in a mobility context. Khaya ivorensis, a tropical African species, was selected and compared to M. obovata in terms of resistance to cracking, a key criterion for the durability of mobility infrastructures. Prior to the cracking tests, K. ivorensis and M. obovata were characterized to determine their basic mechanical properties. The results presented here relate to this characterization phase, in particular the four-point bending, compression, and BING tests, which provided strengths and moduli. These results were compared with those found in the literature, which allowed us to observe a number of differences. Charpy impact tests were also performed and compared to the critical energy release rate in order to estimate the ability of the two species to absorb energy, particularly following impacts and various shocks.
Keywords: energy release rate, Khaya ivorensis, magnolia obovata, wood for mobility
Procedia PDF Downloads 6
2117 Authenticity of Lipid and Soluble Sugar Profiles of Various Oat Cultivars (Avena sativa)
Authors: Marijana M. Ačanski, Kristian A. Pastor, Djura N. Vujić
Abstract:
The identification of lipid and soluble sugar components in flour samples of different cultivars belonging to the common oat species (Avena sativa L.) was performed: spring oat, winter oat, and hulless oat. Fatty acids were extracted from the flour samples with n-hexane and derivatized into volatile methyl esters using TMSH (trimethylsulfonium hydroxide in methanol). Soluble sugars were then extracted from the defatted and dried oat flour samples with 96% ethanol and further derivatized into the corresponding TMS-oximes using hydroxylamine hydrochloride solution and BSTFA (N,O-bis-(trimethylsilyl)-trifluoroacetamide). The hexane and ethanol extracts of each oat cultivar were analyzed using a GC-MS system. The lipid and simple sugar compositions are very similar in all samples of the investigated cultivars. A chemometric tool was applied to the numeric values of the automatically integrated surface areas of the detected lipid and simple sugar components in their corresponding derivatized forms. Hierarchical cluster analysis shows a very high similarity between the investigated oat flour samples according to fatty acid content (0.9955) and a moderate similarity according to soluble sugar content (0.50). These preliminary results support the idea of establishing methods for oat flour authentication and provide the means for distinguishing oat flour samples, regardless of variety, from flour made of other cereal species by lipid and simple sugar profile analysis alone.
Keywords: oat cultivars, lipid composition, soluble sugar composition, GC-MS, chemometrics, authentication
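The similarity values quoted above come from comparing samples' component profiles. A minimal sketch of the underlying similarity measure, Pearson correlation between peak-area profiles; the peak areas below are invented for illustration, not the paper's data:

```python
# Pearson correlation between two samples' integrated peak-area profiles,
# the kind of pairwise similarity a hierarchical clustering is built on.
from math import sqrt

def correlation(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sqrt(sum((x - ma) ** 2 for x in a))
    sb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Hypothetical peak areas for four components per sample.
spring = [10.2, 5.1, 3.3, 0.8]
winter = [10.0, 5.0, 3.4, 0.9]
other_cereal = [2.0, 9.5, 0.5, 4.0]
r_oats = correlation(spring, winter)         # near 1: very similar profiles
r_cross = correlation(spring, other_cereal)  # much lower
```

A near-unity correlation among oat cultivars with a markedly lower cross-species value is exactly the pattern that would let flour be authenticated as oat regardless of cultivar.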
Procedia PDF Downloads 295
2116 Elemental and Magnetic Properties of Bed Sediment of Siang River, a Major River of Brahmaputra Basin
Authors: Abhishek Dixit, Sandip S. Sathe, Chandan Mahanta
Abstract:
The Siang river originates at the Angsi glacier in southern Tibet (where it is known as the Yarlung Tsangpo). After traveling through the Indus-Tsangpo suture zone and the deep gorges near Namcha Barwa peak, it takes a southward turn and enters India, where it is known as the Siang river and becomes a major tributary of the Brahmaputra in the Assam plains. In this study, we have analyzed the bed sediment of the Siang river at two locations (one at the extreme upstream near the India-China border and one downstream, before the Siang-Brahmaputra confluence). We have also sampled bed sediment at a remote location on the Yammeng river, an eastern tributary of the Siang. The magnetic hysteresis properties show a combination of paramagnetic and weak ferromagnetic behavior with a multidomain state. Moreover, Curie temperature analysis indicates a titanomagnetite solid-solution series, which causes the weak ferromagnetic signature. Given that the magnetic minerals are in a multidomain state, the presence of Ti- and Fe-bearing heavy minerals may be inferred. The chemical index of alteration indicates weakly weathered sediment; the Yammeng river sample, being close to the source, shows fresh grains subjected to physical weathering and little chemical alteration. Enriched Ca and K and depleted Na and Mg with respect to upper continental crust concentrations also point toward less intense chemical weathering, along with the dominance of calcite weathering.
Keywords: bed sediment, magnetic properties, Siang, weathering
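The chemical index of alteration (CIA) used above has a standard definition in terms of molar oxide proportions. A minimal sketch with illustrative oxide concentrations, not the paper's measurements:

```python
# CIA = 100 * Al2O3 / (Al2O3 + CaO* + Na2O + K2O), in molar proportions,
# where CaO* is the CaO in the silicate fraction. Oxide wt% values below
# are invented for illustration.

MOLAR_MASS = {"Al2O3": 101.96, "CaO": 56.08, "Na2O": 61.98, "K2O": 94.20}

def cia(wt_percent):
    """Chemical index of alteration from oxide wt% (CaO assumed silicate)."""
    mol = {ox: wt_percent[ox] / MOLAR_MASS[ox] for ox in MOLAR_MASS}
    return 100 * mol["Al2O3"] / (
        mol["Al2O3"] + mol["CaO"] + mol["Na2O"] + mol["K2O"]
    )

# Fresh, feldspar-rich sediment sits near CIA ~ 50; intensely weathered
# residues (e.g. kaolinite) approach 100.
sample = {"Al2O3": 15.0, "CaO": 3.5, "Na2O": 3.0, "K2O": 2.8}
value = cia(sample)
```

A value near 50, as in this illustrative sample, corresponds to the weakly weathered, physically dominated regime the abstract describes for the Yammeng sediment.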
Procedia PDF Downloads 120
2115 Study of Corrosion Behavior of Experimental Alloys with Different Levels of Cr and High Levels of Mo Compared to AISI 444
Authors: Ana P. R. N. Barroso, Maurício N. Kleinberg, Frederico R. Silva, Rodrigo F. Guimarães, Marcelo M. V. Parente, Walney S. Araújo
Abstract:
The fight against the accelerated wear of equipment used in the oil and gas sector is a challenge for minimizing maintenance costs. Corrosion being one of the main agents of equipment deterioration, alternative materials are sought that exhibit improved corrosion resistance at low production cost. This study aims to evaluate the corrosion behavior of experimental alloys containing 15% and 17% chromium (Cr) and 5% molybdenum (Mo) in comparison with a commercial AISI 444 alloy. Microstructural analyses were performed on samples of the alloys before and after the electrochemical tests. Two samples of each solubilized alloy were also taken for analysis of corrosion behavior by potentiodynamic polarization (PP) and electrochemical impedance spectroscopy (EIS) tests, with an immersion time of 24 hours in an acidic electrolytic solution. The curves obtained from the PP and EIS tests indicated that, among the experimental alloys, the alloy with the higher chromium content (17%) had higher corrosion resistance, confirming the beneficial effect of chromium addition. When the experimental alloys are compared with the commercial AISI 444 alloy, the commercial alloy shows corrosion resistance superior to that of the experimental alloys in both assays, PP and EIS. The microstructural analyses performed after the PP and EIS tests confirmed these results. They suggest that the addition of molybdenum at these levels did not favor the electrochemical behavior of the experimental ferritic alloys in the electrolytic medium studied.
Keywords: corrosion, molybdenum, electrochemical tests, experimental alloys
Procedia PDF Downloads 573
2114 Ontology Expansion via Synthetic Dataset Generation and Transformer-Based Concept Extraction
Authors: Andrey Khalov
Abstract:
The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches for extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNNs) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning techniques can be applied to fine-tune large language models (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.
Keywords: ontology expansion, synthetic dataset, transformer fine-tuning, concept extraction, DOLCE, BERT, taxonomy, LLM, NER
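One small, self-contained piece of the entity-extraction step described above is decoding the token-level BIO tags that a BERT-style token classifier emits into entity spans. A minimal sketch; the tokens, tags, and label set are invented examples, as the paper's label inventory is not given:

```python
# Decode BIO-tagged tokens into (entity_text, entity_type) spans.
# B-X opens a span of type X, I-X continues it, O closes any open span.

def decode_bio(tokens, tags):
    """Group (token, tag) pairs into entity spans."""
    spans, current, current_type = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append((" ".join(current), current_type))
            current, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current.append(token)
        else:
            if current:
                spans.append((" ".join(current), current_type))
            current, current_type = [], None
    if current:
        spans.append((" ".join(current), current_type))
    return spans

tokens = ["Acme", "Corp", "outsources", "helpdesk", "to", "Globex"]
tags   = ["B-ORG", "I-ORG", "O", "B-SERVICE", "O", "B-ORG"]
entities = decode_bio(tokens, tags)
```

The resulting spans are what would then be aligned against ontology concepts (DOLCE/ITSMO classes) in the framework's next stage.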
Procedia PDF Downloads 14
2113 Nanoparticles in Drug Delivery and Therapy of Alzheimer's Disease
Authors: Nirupama Dixit, Anyaa Mittal, Neeru Sood
Abstract:
Alzheimer's disease (AD) is a progressive form of dementia, contributing to up to 70% of dementia cases; it is mostly observed in the elderly but is not restricted to old age. The pathophysiology of the disease is characterized by specific pathological changes in the brain: the accumulation of metal ions, the formation of extracellular β-amyloid (Aβ) peptide aggregates, and tangles of hyperphosphorylated tau protein inside neurons, which damage neuronal connections irreversibly. The current obstacles to improving the quality of life of Alzheimer's patients are that diagnosis is made at a late stage of the disease and that the available medications do not treat the underlying causes. The targeted delivery of drugs across the blood-brain barrier (BBB) faces several limitations with traditional approaches. To overcome these drug delivery limitations, nanoparticles provide a promising solution. This review focuses on current strategies for efficient targeted drug delivery using nanoparticles and for improving the quality of therapy provided to the patient. Nanoparticles can be used to encapsulate drugs (which are generally hydrophobic) to ensure their passage to the brain; they can be conjugated to metal-ion chelators to reduce the metal load in neural tissue, thus lowering the harmful effects of oxidative damage; and they can be conjugated with drugs and monoclonal antibodies against endogenous BBB receptors. Finally, this review covers how nanoparticles can play a role in diagnosing the disease.
Keywords: Alzheimer's disease, β-amyloid plaques, blood brain barrier, metal chelators, nanoparticles
Procedia PDF Downloads 490
2112 AniMoveMineR: Animal Behavior Exploratory Analysis Using Association Rules Mining
Authors: Suelane Garcia Fontes, Silvio Luiz Stanzani, Pedro L. Pizzigatti Corrêa, Ronaldo G. Morato
Abstract:
Environmental changes and major natural disasters are increasingly prevalent in the world due to the damage humanity has caused to nature, and this damage directly affects the lives of animals. Thus, the study of animal behavior and of animals' interactions with the environment can provide knowledge that guides researchers and public agencies in preservation and conservation actions. Exploratory analysis of animal movement can reveal patterns of animal behavior, and with technological advances the ability to track animals, and consequently the scope of behavioral studies, has expanded. There is a great deal of research on animal movement and behavior, but a proposal that combines these resources, allows exploratory analysis of animal movement, and provides statistical measures on individual animal behavior and its interaction with the environment is missing. The contribution of this paper is the framework AniMoveMineR, a unified solution that aggregates trajectory analysis and data mining techniques to explore animal movement data, as a first step toward answering questions about individual animal behavior and interactions with other animals over time and space. We evaluated the framework using data from monitored jaguars in Miranda, in the Brazilian Pantanal, in order to verify whether AniMoveMineR allows the interaction level between these jaguars to be identified. The results were positive and provided indications about the individual behavior of the jaguars and about which jaguars have the highest or lowest correlation.
Keywords: data mining, data science, trajectory, animal behavior
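The association-rules machinery named in the title rests on two standard quantities, support and confidence. A minimal sketch in which each "transaction" lists the animals observed together in one space-time window; the jaguar names and sightings are invented for illustration:

```python
# Support and confidence of an association rule A -> B over transactions.
# Each transaction is the set of animals co-occurring in one window.

def support(transactions, itemset):
    """Fraction of transactions containing the whole itemset."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(transactions, antecedent, consequent):
    """P(consequent | antecedent) estimated from the transactions."""
    return support(transactions, antecedent | consequent) / support(
        transactions, antecedent
    )

windows = [
    {"jaguar_A", "jaguar_B"},
    {"jaguar_A", "jaguar_B", "jaguar_C"},
    {"jaguar_A"},
    {"jaguar_B", "jaguar_C"},
]
sup = support(windows, {"jaguar_A", "jaguar_B"})        # 2/4 = 0.5
conf = confidence(windows, {"jaguar_A"}, {"jaguar_B"})  # 2/3
```

Rules with high support and confidence between two individuals are what the framework reads as a high interaction level between those jaguars.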
Procedia PDF Downloads 144
2111 Design of Liquid Crystal Based Interface to Study the Interaction of Gram Negative Bacterial Endotoxin with Milk Protein Lactoferrin
Authors: Dibyendu Das, Santanu Kumar Pal
Abstract:
Milk protein lactoferrin (Lf) exhibits potent antibacterial activity due to its interaction with the Gram-negative bacterial cell membrane component lipopolysaccharide (LPS). This paper presents the fabrication of new liquid crystal (LC)-based biosensors to explore the interaction between Lf and LPS. LPS self-assembles at the aqueous/LC interface and orients the interfacial nematic 4-cyano-4'-pentylbiphenyl (5CB) LCs in a homeotropic fashion (exhibiting a dark optical image under a polarized optical microscope). Interestingly, on exposure of the LPS-decorated aqueous/LC interface to Lf, the optical image of the LCs changed from dark to bright, indicating an ordering transition of the interfacial LCs from the homeotropic to a tilted/planar state. The ordering transition reflects strong binding between Lf and interfacial LPS which, in turn, perturbs the orientation of the LCs. With the help of epifluorescence microscopy, we further confirmed the interfacial LPS-Lf binding event by imaging the presence of FITC-tagged Lf at the LPS-laden aqueous/LC interface. Finally, we investigated the conformational behavior of Lf in solution as well as in the presence of LPS using circular dichroism (CD) spectroscopy, reconfirmed with vibrational circular dichroism (VCD) spectroscopy, and found that Lf undergoes a transition from an alpha-helix to a random-coil-like structure in the presence of LPS. As a whole, the results described in this paper establish a robust approach to probing the interaction between LPS and Lf through the ordering transitions of LCs at the aqueous/LC interface.
Keywords: endotoxin, interface, lactoferrin, lipopolysaccharide
Procedia PDF Downloads 266
2110 Fuzzy Approach for the Evaluation of Feasibility Levels of Vehicle Movement on the Disaster-Stricken Zone's Roads
Authors: Gia Sirbiladze
Abstract:
Route planning problems are among the activities with the highest impact on logistical planning, transportation, and distribution because of their effects on efficiency in resource management, service levels, and client satisfaction. In extreme conditions, the difficulty of vehicle movement between different customers makes travel times imprecise and the feasibility of movement uncertain. A feasibility level of vehicle movement on a closed route in the disaster-stricken zone is defined for the construction of an objective function. Experts' evaluations of the uncertain parameters are expressed as q-rung orthopair fuzzy numbers (q-ROFNs). A fuzzy bi-objective combinatorial optimization formulation of the fuzzy vehicle routing problem (FVRP) is constructed based on techniques of possibility theory. The FVRP is reduced to a bi-criteria partitioning problem over so-called "promising" routes selected from all admissible closed routes. The convenient selection of the "promising" routes allows the reduced problem to be solved in real-time computing. For the numerical solution of the bi-criteria partitioning problem, the ε-constraint approach is used. Supporting software for the main results is designed. The constructed model is illustrated with a numerical example.
Keywords: q-rung orthopair fuzzy sets, facility location selection problem, multi-objective combinatorial optimization problem, partitioning problem
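The ε-constraint idea, keeping one objective and moving the other into a constraint whose bound is swept, can be sketched on a toy route set. The routes, their scores, and the risk measure below are invented for illustration and do not come from the paper's numerical example:

```python
# Hypothetical "promising" closed routes, each scored by two objectives:
# total movement time and (1 - feasibility level) treated as a risk.
routes = {
    "R1": (10.0, 0.40),
    "R2": (12.0, 0.25),
    "R3": (15.0, 0.10),
    "R4": (11.0, 0.35),
}

def epsilon_constraint(routes, eps):
    """Minimize time subject to risk <= eps: the second objective is
    converted into a constraint, as in the epsilon-constraint method."""
    feasible = {k: v for k, v in routes.items() if v[1] <= eps}
    if not feasible:
        return None
    return min(feasible, key=lambda k: feasible[k][0])

# Sweeping eps traces an approximation of the Pareto front.
for eps in (0.10, 0.25, 0.40):
    print(eps, epsilon_constraint(routes, eps))
```

Each value of ε yields one trade-off point; collecting them over the sweep gives the decision maker the time-versus-feasibility frontier.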
Procedia PDF Downloads 134
2109 Outsourcing the Front End of Innovation
Abstract:
The paper presents a new method for efficient innovation process management. Even though innovation management methods, tools, and knowledge are well established and documented in the literature, most companies still do not manage innovation efficiently. Especially in SMEs, the front end of innovation (problem identification, idea creation, and selection) is often not optimally performed. Our eMIPS methodology represents a sort of "umbrella methodology": a well-defined set of procedures that can be dynamically adapted to the concrete case in a company. In daily practice, various methods (e.g., for problem identification and idea creation) can be applied, depending on the company's needs. The methodology is based on the proactive involvement of the company's employees, supported by appropriate methods and external experts. The phases presented are performed through a mixture of face-to-face activities (workshops) and online (eLearning) activities taking place in the Moodle eLearning environment and using other e-communication channels. One part of the outcome is an identified set of opportunities and concrete solutions ready for implementation. Another very important result concerns the innovation competences the participating employees gain in concrete tools and methods for idea management. In addition, the employees acquire solid experience in dynamic, efficient, and solution-oriented management of the invention process. eMIPS also represents a way of establishing or improving the innovation culture in an organization. The first results in a pilot company were excellent, both in terms of participant motivation and the results achieved.
Keywords: creativity, distance learning, front end, innovation, problem
Procedia PDF Downloads 328
2108 Selective Solvent Extraction of Co from Ni and Mn through Outer-Sphere Interactions
Authors: Korban Oosthuizen, Robert C. Luckay
Abstract:
Due to the growing popularity of electric vehicles and the importance of cobalt in the cathode material of lithium-ion batteries, demand for this metal is on the rise. Recycling cathode materials by means of solvent extraction is an attractive way of recovering cobalt and easing the pressure on limited natural resources. In this study, a series of straight-chain and macrocyclic diamine ligands was developed for the selective recovery of cobalt by solvent extraction from solutions containing nickel and manganese; this combination of metals is the major cathode material used in electric vehicle batteries. The ligands can be protonated and then function as ion-pairing ligands targeting the anionic [CoCl₄]²⁻, a species not observed for Ni or Mn. Selectivity for Co was found to be good at very high chloride concentrations and low pH. Longer chains or larger macrocycles enhanced selectivity, and linear chains on the amide side groups also resulted in greater selectivity than branched groups. The cation of the chloride salt used to adjust the chloride concentration appears to play a major role in extraction through salting-out effects. The ligands developed in this study show good selectivity for Co over Ni and Mn but require very high chloride concentrations to function. This research nevertheless opens the door for further investigations into using diamines as solvent extraction ligands for the recovery of cobalt from spent lithium-ion batteries.
Keywords: hydrometallurgy, solvent extraction, cobalt, lithium-ion batteries
Procedia PDF Downloads 78
2107 Comparative Study of Ni Catalysts Supported by Silica and Modified by Metal Additions Co and Ce for the Steam Reforming of Methane
Authors: Ali Zazi, Ouiza Cherifi
Abstract:
The catalyst materials Ni-SiO₂, Ni-Co-SiO₂, and Ni-Ce-SiO₂ were synthesized by the classical impregnation method on a silica support. This involves combining the silica with an adequate amount of a solution of nickel nitrate, or of nickel nitrate and cobalt nitrate, or of nickel nitrate and cerium nitrate, then mixing, drying, and calcining at 700 °C. These catalysts were characterized by different physicochemical analysis techniques. Atomic absorption spectrometry indicates that the actual contents of nickel, cerium, and cobalt are close to the theoretical contents assumed beforehand, confirming that the nitrate solutions impregnated the silica support well. The BET results show that the specific surface area decreases slightly after impregnation with nickel nitrate or with the Co and Ce additives, and decreases slightly further after the reaction, likely due to coke deposition. X-ray diffraction shows the presence of the SiO₂ and NiO phases for all catalysts, the CoO phase for the catalyst promoted by Co, and the Ce₂O₂ phase for the catalyst promoted by Ce. The methane steam reforming reaction was carried out in a fixed-bed quartz reactor, and the reactants and products were analyzed by gas chromatography. This study shows that the metal addition of cerium or cobalt improves most of the catalytic performance of Ni for the steam reforming of methane. The catalysts rank in order of decreasing activity and catalytic performance as follows: Ni-Ce/SiO₂ > Ni-Co/SiO₂ > Ni/SiO₂.
Keywords: cerium, cobalt, heterogeneous catalysis, hydrogen, methane, steam reforming, synthesis gas
Procedia PDF Downloads 192
2106 Improving the Employee Transfer Experience within an Organization
Authors: Drew Fockler
Abstract:
This research examines how to improve an employee's experience when transferring between departments within an organization. It includes a historical review of a Canadian retail organization; based on this review, gaps are identified between current and future visions to show where problems with existing training and development practices need to be resolved in order to reduce front-line employee turnover. The strategies within this paper support leaders through the LEAD (Listen, Explore, Act, and Develop) change management model, which supports the change process. This research proposes three possible solutions for improving the experience of an employee transferring between departments. The best of these is to create a Training Manager position within the retail store: a Training Manager could support both employees and leadership with the training and development of staff who are moving between departments. An implementation plan was created using the TransX model, a hybrid of leader-member exchange theory and transformational leadership theory that facilitates organizational change by creating a common vision. Finally, this research provides next steps as well as future considerations for enhancing the Training Manager role within an organization.
Keywords: employee transfers, employee engagement, human resources, employee induction, TransX model, LEAD change management model
Procedia PDF Downloads 77
2105 Nigerian Football System: Examining Micro-Level Practices against a Global Model for Integrated Development of Mass and Elite Sport
Authors: Iorwase Derek Kaka’an, Peter Smolianov, Steven Dion, Christopher Schoen, Jaclyn Norberg, Charles Gabriel Iortimah
Abstract:
This study examines the current state of football in Nigeria to identify practices that could be useful internationally and to determine areas for improvement. Over 200 sources of literature on sport delivery systems in successful sporting nations were analyzed to construct a globally applicable model of elite football integrated with mass participation, comprising the following three levels: macro (socio-economic, cultural, legislative, and organizational), meso (infrastructures, personnel, and services enabling sports programs), and micro (operations, processes, and methodologies for the development of individual athletes). The model has received scholarly validation and has been shown to be a framework for program analysis that is not culturally bound. It has recently been utilized to further the understanding of sports systems such as US rugby, tennis, soccer, swimming, and volleyball, as well as Dutch and Russian swimming. A questionnaire was developed using the above-mentioned model, and the survey questions were validated by 12 experts, including academicians, executives from sports governing bodies, football coaches, and administrators. To identify best practices and determine areas for improvement in Nigerian football, 116 coaches completed the questionnaire. Useful exemplars and possible improvements were further identified through semi-structured discussions with 10 Nigerian football administrators and experts. Finally, a content analysis of the Nigeria Football Federation's website and organizational documentation was conducted. This paper focuses on the micro level of Nigerian football delivery, particularly talent search and development as well as advanced athlete preparation and support. The results suggested that Nigeria could share such progressive practices as the provision of football programs in all schools and full-time coaches paid by governments based on the level of coach education.
Nigerian football administrators and coaches could provide better football services affordable to all, where success in mass and elite sport is guided by science focused on athletes' needs. International best practices could be better implemented, such as lifelong guidelines for the health and excellence of everyone; the integration of fitness tests into player development and ranking, as done by the best Dutch, English, French, Russian, Spanish, and other European clubs; the integration of educational and competitive events for elite and developing athletes as well as fans, as done at the 2018 World Cup in Russia; and academies with multi-stage athlete nurturing, as done by Ajax in Africa and by FC Barcelona and other top clubs expanding across the world. The methodical integration of these practices into the balanced development of mass and elite football will help contribute to international sporting success as well as national health, education, crime control, and social harmony in Nigeria.
Keywords: football, high performance, mass participation, Nigeria, sport development
Procedia PDF Downloads 70
2104 AutoML: Comprehensive Review and Application to Engineering Datasets
Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili
Abstract:
The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resulting meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML and surveys several widely used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML to various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
Keywords: automated machine learning, uncertainty, engineering dataset, regression
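The core loop that AutoML platforms automate, fitting many candidate models and keeping the one with the lowest validation error, can be sketched in miniature. The candidates, data, and function names below are illustrative assumptions, not part of any surveyed platform:

```python
def fit_mean(xs, ys):
    """Baseline model: always predict the training mean."""
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    b = my - a * mx
    return lambda x: a * x + b

def mse(model, xs, ys):
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def auto_select(candidates, train, valid):
    """Fit every candidate on the training split and keep the one
    with the lowest validation error."""
    fitted = {name: f(*train) for name, f in candidates.items()}
    return min(fitted, key=lambda n: mse(fitted[n], *valid))

train = ([0, 1, 2, 3], [0.1, 1.9, 4.1, 6.0])
valid = ([4, 5], [8.1, 9.9])
best = auto_select({"mean": fit_mean, "linear": fit_linear}, train, valid)
print(best)  # linear
```

Real AutoML systems add hyperparameter optimization, feature engineering, and architecture search on top of this selection loop, but the validation-driven comparison remains the same.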
Procedia PDF Downloads 61
2103 Early Diagnosis of Myocardial Ischemia Based on Support Vector Machine and Gaussian Mixture Model by Using Features of ECG Recordings
Authors: Merve Begum Terzi, Orhan Arikan, Adnan Abaci, Mustafa Candemir
Abstract:
Acute myocardial infarction is a major cause of death in the world, so its fast and reliable diagnosis is a major clinical need. ECG is the most important diagnostic methodology used to make decisions about the management of cardiovascular diseases. In patients with acute myocardial ischemia, temporary chest pain, together with changes in the ST segment and T wave of the ECG, occurs shortly before the start of myocardial infarction. In this study, a technique that detects changes in the ST/T sections of the ECG is developed for the early diagnosis of acute myocardial ischemia. For this purpose, a database of real ECG recordings was constituted, containing records from 75 patients presenting symptoms of chest pain who underwent elective percutaneous coronary intervention (PCI). Twelve-lead ECGs of the patients were recorded before and during the PCI procedure. Two ECG epochs are analyzed for each patient: the pre-inflation ECG, acquired before any catheter insertion, and the occlusion ECG, acquired during balloon inflation. Using the pre-inflation and occlusion recordings, ECG features that are critical in the detection of acute myocardial ischemia are identified, and the most discriminative features are extracted. A classification technique based on the support vector machine (SVM) approach, operating with linear and radial basis function (RBF) kernels, is developed to detect ischemic events using ST-T-derived joint features from the non-ischemic and ischemic states of the patients. The dataset is randomly divided into training and testing sets, and the training set is used to optimize the SVM hyperparameters by grid search with 10-fold cross-validation. The SVMs are designed specifically for each patient by tuning the kernel parameters to obtain the optimal classification performance.
Applying the developed classification technique to real ECG recordings shows that the proposed technique provides highly reliable detection of anomalies in ECG signals. Furthermore, to develop a detection technique that can be used in the absence of an ECG recording obtained during the healthy stage, the detection of acute myocardial ischemia based only on ECG recordings obtained during ischemia is also investigated. For this purpose, a Gaussian mixture model (GMM) is used to represent the joint PDF of the most discriminating ECG features of myocardial ischemia. Then, a Neyman-Pearson type of approach is developed to detect outliers that would correspond to acute myocardial ischemia. The Neyman-Pearson decision strategy computes the average log-likelihood values of ECG segments and compares them with a range of threshold values. For different discrimination thresholds and numbers of ECG segments, the probability of detection and the probability of false alarm are computed, and the corresponding ROC curves are obtained. The results indicate that increasing the number of ECG segments provides higher performance for the GMM-based classification. Moreover, a comparison between the performance of the SVM- and GMM-based classifiers showed that the SVM provides higher classification performance over the ECG recordings of a considerable number of patients.
Keywords: ECG classification, Gaussian mixture model, Neyman-Pearson approach, support vector machine
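The average-log-likelihood thresholding step can be sketched with a single 1-D Gaussian standing in for the GMM. The feature values and the threshold below are invented for illustration; the paper's detector uses a multivariate GMM over several ST-T features:

```python
from math import log, pi

def fit_gaussian(samples):
    """Maximum-likelihood mean and variance of a 1-D Gaussian."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((s - mu) ** 2 for s in samples) / n
    return mu, var

def avg_log_likelihood(segment, mu, var):
    c = -0.5 * log(2 * pi * var)
    return sum(c - (x - mu) ** 2 / (2 * var) for x in segment) / len(segment)

def is_ischemic(segment, mu, var, threshold):
    """Neyman-Pearson style decision: declare an outlier (possible
    ischemia) when the average log-likelihood under the model of
    normal beats falls below the chosen threshold."""
    return avg_log_likelihood(segment, mu, var) < threshold

# Hypothetical 1-D ST-deviation feature values from normal beats.
normal = [0.02, -0.01, 0.03, 0.00, -0.02, 0.01]
mu, var = fit_gaussian(normal)

print(is_ischemic([0.01, -0.01, 0.02], mu, var, threshold=-3.0))  # False
print(is_ischemic([0.40, 0.45, 0.38], mu, var, threshold=-3.0))   # True
```

Sweeping the threshold over a range and counting detections versus false alarms is exactly what produces the ROC curves reported in the abstract.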
Procedia PDF Downloads 162
2102 Copper (II) Complex of New Tetradentate Asymmetrical Schiff Base Ligand: Synthesis, Characterization, and Catecholase-Mimetic Activity
Authors: Cahit Demetgul, Sahin Bayraktar, Neslihan Beyazit
Abstract:
Metalloenzymes are enzyme proteins containing metal ions, which are directly bound to the protein or to enzyme-bound non-protein components. One of the major metalloenzymes playing a key role in oxidation reactions is catechol oxidase, which shows catecholase activity, i.e., the oxidation of a broad range of catechols to quinones through the four-electron reduction of molecular oxygen to water. Studies on model compounds mimicking catecholase activity are very useful and promising for the development of new, more efficient bioinspired catalysts for in vitro oxidation reactions. In this study, a new tetradentate asymmetrical Schiff base and its Cu(II) complex were synthesized by condensation of 4-nitro-1,2-phenylenediamine with 6-formyl-7-hydroxy-5-methoxy-2-methylbenzopyran-4-one and by using an appropriate Cu(II) salt, respectively. The prepared compounds were characterized by elemental analysis, FT-IR, NMR, UV-Vis, and magnetic susceptibility. The catecholase-mimicking activity of the new Schiff base Cu(II) complex was evaluated through the oxidation of 3,5-di-tert-butylcatechol (3,5-DTBC) in methanol at 25 °C, with electronic spectra recorded at different time intervals. The yield of the quinone (3,5-DTBQ) was determined from the absorbance of the resulting solution measured at 400 nm. The compatibility of the catalytic reaction with Michaelis-Menten kinetics was also investigated. In conclusion, we found that the new Schiff base Cu(II) complex has a significant capacity to catalyze the oxidation of catechol to o-quinone.
Keywords: catecholase activity, Michaelis-Menten kinetics, Schiff base, transition metals
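Testing compatibility with Michaelis-Menten kinetics amounts to fitting initial rates to v = Vmax·[S] / (Km + [S]). The sketch below evaluates that rate law; the Vmax and Km values are illustrative placeholders, not the measured parameters of this complex:

```python
def michaelis_menten_rate(s, vmax, km):
    """Initial rate v = Vmax*[S] / (Km + [S]); at [S] = Km the rate
    is exactly Vmax/2. Parameter values here are hypothetical."""
    return vmax * s / (km + s)

vmax, km = 2.0e-3, 0.5e-3  # hypothetical units: M/min and M
for s in (0.25e-3, 0.5e-3, 2.0e-3):
    v = michaelis_menten_rate(s, vmax, km)
    print(f"[S] = {s:.2e} M -> v = {v:.2e} M/min")
```

In practice, the rates come from the slope of the 400 nm absorbance trace at each substrate concentration, and Vmax and Km are obtained by nonlinear regression (or a Lineweaver-Burk plot) against this equation.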
Procedia PDF Downloads 310
2101 Prefabricated Integral Design of Building Services
Authors: Mina Mortazavi
Abstract:
The common approach in the construction industry for restraint requirements, in existing structures or new constructions, is to have non-structural components (NSCs) assembled and installed on-site by different MEP subcontractors. This leads to a lack of coordination, higher costs, longer construction times, and complications due to inaccurate building information modelling (BIM). Introducing NSCs into a consistent BIM system from the beginning of the design process, and considering their seismic loads in the analysis and design, can improve coordination and reduce costs and time. One solution is to use prefabricated mounts with the MEP services attached, delivered as an integral module. This eliminates the majority of coordination complications and reduces design and installation costs and time. A more advanced approach is to install as many NSCs as possible in the same prefabricated module, which gives the structural engineer the opportunity to account for the component weights and locations in the analysis and design of the prefabricated support. This efficient approach eliminates coordination and access issues, leading to enhanced quality control. This research focuses on the existing literature on modular sub-assemblies that are integrated with architectural and structural components. Modular MEP systems take advantage of the precision provided by BIM tools to meet exact requirements and achieve a buildable design every time. Modular installations that include MEP systems provide efficient solutions for the installation of MEP services and components.
Keywords: building services, modularisation, prefabrication, integral building design
Procedia PDF Downloads 72
2100 Energy Efficient Plant Design Approaches: Case Study of the Sample Building of the Energy Efficiency Training Facilities
Authors: Idil Kanter Otcu
Abstract:
Nowadays, due to growing problems of energy supply and the drastic reduction of non-renewable natural resources, new applications in the energy sector and steps toward greater efficiency in energy consumption are required. Buildings account for a large share of energy consumption, and increasing the structural density of buildings increases that consumption further. Curbing it calls for energy-efficient approaches to building design and the integration of new systems using emerging technologies. As new systems for the productive use of generated energy are developed, buildings that require less energy to operate, with rational use of resources, also need to be developed. One way to reduce the energy requirements of buildings is through landscape planning, design, and application. Requirements such as heating, cooling, and lighting can be met with lower energy consumption through planting design, which supports a more efficient and rational use of resources. Within this context, rather than a planting design that considers only the ecological and aesthetic features of plants, these considerations should also extend to spatial organization, taking into account the relationship between the site and open spaces with respect to climatic elements and planting design. In this way, the planting design can serve an additional purpose. In this study, a landscape design that takes into consideration location, local climate morphology, and solar angle is illustrated using a sample building project.
Keywords: energy efficiency, landscape design, plant design, xeriscape landscape
Procedia PDF Downloads 261
2099 IoT-Based Process Model for Heart Monitoring Process
Authors: Dalyah Y. Al-Jamal, Maryam H. Eshtaiwi, Liyakathunisa Syed
Abstract:
Connecting health services with technology is in high demand as people's health conditions worsen day by day. Engaging new technologies such as the Internet of Things (IoT) in medical services can enhance patient care. In particular, patients suffering from chronic diseases, such as cardiac patients, need special care and monitoring. Some efforts have previously been made to automate and improve patient monitoring systems; however, these efforts have limitations and lack the real-time capability needed for chronic diseases. In this paper, an improved process model for a patient monitoring system specialized for cardiac patients is presented. A survey was distributed and interviews were conducted to gather the requirements for improving the cardiac patient monitoring system, and Business Process Model and Notation (BPMN) was used to model the proposed process. The proposed system uses IoT technology to assist doctors in remotely monitoring and following up with their heart patients in real time. To validate the effectiveness of the proposed solution, simulation analysis was performed using the Bizagi Modeler tool; the results show performance improvements in the heart monitoring process. For future work, the authors suggest enhancing the proposed system to cover all chronic diseases.
Keywords: IoT, process model, remote patient monitoring system, smart watch
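The real-time decision step at the heart of such a monitoring process, classifying each incoming reading and raising an alert when it leaves a safe range, can be sketched as follows. The thresholds and labels are illustrative placeholders only, not clinical values and not part of the paper's BPMN model:

```python
def check_reading(bpm, low=50, high=120):
    """Classify a single heart-rate reading against illustrative
    (non-clinical) bradycardia/tachycardia thresholds."""
    if bpm < low:
        return "alert: bradycardia"
    if bpm > high:
        return "alert: tachycardia"
    return "normal"

def monitor(stream):
    # In a real deployment, `stream` would be live smartwatch data
    # and alerts would be pushed to the cardiologist in real time.
    return [(bpm, check_reading(bpm)) for bpm in stream]

print(monitor([72, 130, 45, 88]))
```

In the BPMN process, this check corresponds to the gateway that routes a reading either back to routine logging or onward to the notification task.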
Procedia PDF Downloads 332
2098 Review of Electronic Voting as a Panacea for Election Malpractices in Nigerian Political System: Challenges, Benefits, and Issues
Authors: Muhammad Muhammad Suleiman
Abstract:
The Nigerian political system has witnessed rising occurrences of election malpractice in the last decade, driven by ballot rigging and other forms of electoral fraud. In order to find a sustainable solution to this malpractice, the introduction of electronic voting (e-voting) has been suggested. This paper reviews the challenges, benefits, and issues associated with e-voting as a panacea for election malpractice in Nigeria. A review of the existing literature revealed that e-voting can reduce the cost of conducting elections and reduce the opportunity for electoral fraud. The review suggests that introducing e-voting into the Nigerian political system would require adequate cybersecurity measures, trust-building initiatives, and proper legal frameworks to ensure successful implementation. It is recommended that there be an effective policy to ensure the security of the system as well as the credibility of the results. Furthermore, a comprehensive awareness campaign should be conducted to ensure that voters understand the process and are comfortable using the system. In conclusion, e-voting has the potential to reduce election malpractice in the Nigerian political system, but its successful implementation will require effective policy interventions and trust-building initiatives. Additionally, the costs of acquiring the necessary infrastructure and equipment and of implementing proper legal frameworks need to be considered.
Keywords: electronic voting, general election, candidate, INEC, cyberattack
Procedia PDF Downloads 104