Search results for: analytical validation
2875 Safety Validation of Black-Box Autonomous Systems: A Multi-Fidelity Reinforcement Learning Approach
Authors: Jared Beard, Ali Baheri
Abstract:
As autonomous systems become more prominent in society, ensuring their safe application becomes increasingly important. This is clearly demonstrated by autonomous cars traveling through a crowded city or robots traversing a warehouse with heavy equipment. Human environments can be complex, with high-dimensional state and action spaces. This gives rise to two problems. One is that analytic solutions may not be possible. The other is that in simulation-based approaches, searching the entirety of the problem space can be computationally intractable, ruling out formal methods. To overcome this, approximate solutions may seek to find failures or estimate their likelihood of occurrence. One such approach is adaptive stress testing (AST), which uses reinforcement learning to induce failures in the system. Its premise is that a learned model can be used to help find new failure scenarios, making better use of simulations. Despite these strengths, AST fails to find particularly sparse failures and can be inclined to find solutions similar to those found previously. To help overcome this, multi-fidelity learning can be used: information from lower-fidelity simulations can be used to build up samples less expensively and to cover the solution space more effectively, finding a broader set of failures. Recent work in multi-fidelity learning has passed information bidirectionally using “knows what it knows” (KWIK) reinforcement learners to minimize the number of samples in high-fidelity simulators (thereby reducing computation time and load). The contribution of this work, then, is the development of a bidirectional multi-fidelity AST framework. Such an algorithm uses multi-fidelity KWIK learners in an adversarial context to find failure modes.
Thus far, a KWIK learner has been used to train an adversary in a grid world to prevent an agent from reaching its goal, demonstrating the utility of KWIK learners in an AST framework. The next step is implementation of the bidirectional multi-fidelity AST framework described. Testing will be conducted in a grid world containing an agent attempting to reach a goal position and an adversary tasked with intercepting the agent, as demonstrated previously. Fidelities will be modified by adjusting the size of a time step, with higher fidelity effectively allowing for more responsive closed-loop feedback. Results will compare the single KWIK AST learner with the multi-fidelity algorithm with respect to the number of samples, distinct failure modes found, and the relative effect of learning after a number of trials.
Keywords: multi-fidelity reinforcement learning, multi-fidelity simulation, safety validation, falsification
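The abstract does not give the grid-world implementation. As a rough illustration of the adversarial setup (a learned adversary rewarded for intercepting a goal-seeking agent), a minimal tabular Q-learning sketch on a 1-D grid might look like the following; the grid size, reward, and all hyperparameters are invented, and a KWIK learner would replace the plain Q-update used here.

```python
import random

N = 5                 # cells 0..4; agent starts at 0, adversary starts at 4
ACTIONS = (-1, 0, 1)  # adversary moves left, stays, or moves right

def episode(Q, eps, rng, learn=True, alpha=0.5, gamma=0.9):
    agent, adv = 0, N - 1
    for _ in range(N):
        agent = min(agent + 1, N - 1)          # scripted agent walks toward its goal
        state = (agent, adv)
        qs = Q.setdefault(state, [0.0] * len(ACTIONS))
        a = rng.randrange(len(ACTIONS)) if rng.random() < eps else qs.index(max(qs))
        adv = min(max(adv + ACTIONS[a], 0), N - 1)
        reward = 1.0 if adv == agent else 0.0  # interception = a failure is found
        if learn:
            nxt = Q.get((min(agent + 1, N - 1), adv), [0.0] * len(ACTIONS))
            target = reward if reward else gamma * max(nxt)
            qs[a] += alpha * (target - qs[a])
        if reward:
            return True
    return False

rng = random.Random(0)
Q = {}
for _ in range(3000):                          # train the adversary
    episode(Q, eps=0.2, rng=rng)
intercepted = episode(Q, eps=0.0, rng=rng, learn=False)
print("greedy adversary intercepts:", intercepted)
```

After training, the greedy adversary reliably finds the interception failure mode; in the multi-fidelity version described above, fidelity would correspond to the step size of this loop.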
Procedia PDF Downloads 157
2874 The Comparison of Stress Level between Students with Parents and Those without Parents
Authors: Hendeh Majdi, Zahra Arzjani
Abstract:
This research aimed to compare the level of stress between students with parents and those without parents through a descriptive-analytical study. A total of 128 questionnaires (64 for students with parents and 64 for students without parents) were distributed among high school students in Ray City, Tehran Province, selected through stratified sampling. The results showed that the level of stress in students without parents was significantly affected, and the most important proposal is that further study should be devoted to decreasing the level of stress in students without parents.
Keywords: stress, students with parents, without parents, Ray city
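The abstract does not specify the statistical test used. A comparison of mean stress scores between two independent groups is commonly done with Welch's two-sample t-test, sketched below; the scores and group sizes are illustrative only, not the study's data.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(group_a, group_b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    na, nb = len(group_a), len(group_b)
    va, vb = variance(group_a), variance(group_b)   # sample variances
    return (mean(group_a) - mean(group_b)) / sqrt(va / na + vb / nb)

# Illustrative stress scores (higher = more stressed)
without_parents = [32, 35, 30, 38, 36, 33, 37, 34]
with_parents    = [25, 28, 24, 30, 27, 26, 29, 23]

t = welch_t(without_parents, with_parents)
print(f"t = {t:.2f}")  # positive: the first group reports higher mean stress
```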
Procedia PDF Downloads 499
2873 Stress Field Induced By an Interfacial Edge Dislocation in a Multi-Layered Medium
Authors: Aditya Khanna, Andrei Kotousov
Abstract:
A novel method is presented for obtaining the stress field induced by an edge dislocation in a multi-layered composite. To demonstrate the applications of the obtained solution, we consider the problem of an interfacial crack in a periodically layered bimaterial medium. The crack is modeled as a continuous distribution of edge dislocations, and the Distributed Dislocation Technique (DDT) is utilized to obtain numerical results for the energy release rate (ERR). The numerical results correspond well with previously published results, and the comparison serves as a validation of the obtained dislocation solution.
Keywords: distributed dislocation technique, edge dislocation, elastic field, interfacial crack, multi-layered composite
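The building block that the DDT distributes along the crack is the single edge dislocation stress field. The sketch below evaluates the classic plane-strain solution for a dislocation in an infinite homogeneous medium (not the paper's multilayer solution; the material constants are illustrative values for aluminium).

```python
from math import pi

def edge_dislocation_stress(x, y, b=2.86e-10, mu=26e9, nu=0.33):
    """Plane-strain stresses (Pa) of an edge dislocation at the origin of an
    infinite homogeneous medium -- the kernel that DDT superposes."""
    D = mu * b / (2 * pi * (1 - nu))
    r2 = x * x + y * y
    sxx = -D * y * (3 * x * x + y * y) / r2**2
    syy = D * y * (x * x - y * y) / r2**2
    sxy = D * x * (x * x - y * y) / r2**2
    return sxx, syy, sxy

sxx, syy, sxy = edge_dislocation_stress(1e-9, 1e-9)
print(sxx, syy, sxy)
```

The field is singular at the core and odd in y for the normal stresses, which is what makes the superposed crack-face traction integral equation solvable by collocation.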
Procedia PDF Downloads 445
2872 Additive Friction Stir Manufacturing Process: Interest in Understanding Thermal Phenomena and Numerical Modeling of the Temperature Rise Phase
Authors: Antoine Lauvray, Fabien Poulhaon, Pierre Michaud, Pierre Joyot, Emmanuel Duc
Abstract:
Additive Friction Stir Manufacturing (AFSM) is a new industrial process that follows the emergence of friction-based processes. AFSM is a solid-state additive process using the energy produced by friction at the interface between a rotating non-consumable tool and a substrate. Friction depends on various parameters, such as axial force, rotation speed, and friction coefficient. The filler material is a metallic rod that flows through a hole in the tool. Unlike Friction Stir Welding (FSW), for which abundant literature exists addressing many aspects from process implementation to characterization and modeling, there are still few research works focusing on AFSM. Therefore, there is still a lack of understanding of the physical phenomena taking place during the process. This research work aims at better understanding and implementing the AFSM process through numerical simulation and experimental validation performed on a prototype effector. Such an approach is considered a promising way to study the influence of the process parameters and, finally, to identify a relevant process window. The deposition of material through the AFSM process takes place in several phases; in chronological order, these are the docking phase, the dwell-time phase, the deposition phase, and the removal phase. The present work focuses on the dwell-time phase, which raises the temperature of the system composed of the tool, the filler material, and the substrate through pure friction. Analytic modeling of friction-based heat generation considers the rotational speed and the contact pressure as its main parameters. Another influential parameter is the friction coefficient, assumed to be variable due to self-lubrication of the system as temperature rises and to smoothing of the roughness of the contacting materials over time.
This study proposes, through numerical modeling followed by experimental validation, to question the influence of the various input parameters on the dwell-time phase. Rotation speed, temperature, spindle torque, and axial force are the main parameters monitored during the experiments and serve as reference data for the calibration of the numerical model. This research shows that the geometry of the tool, as well as fluctuations of the input parameters such as axial force and rotational speed, strongly influence the temperature reached and/or the time required to reach the targeted temperature. The main outcome is the prediction of a process window, which is a key result for a more efficient process implementation.
Keywords: numerical model, additive manufacturing, friction, process
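The analytic friction heating model is not spelled out in the abstract. A common simplification, assumed here purely for illustration, integrates the local friction power mu·p·omega·r over a circular tool–substrate contact of radius R under uniform pressure, which gives the closed form P = (2/3)·pi·mu·p·omega·R³; the numeric values below are invented.

```python
from math import pi

def friction_power_numeric(mu, p, omega, R, n=10000):
    """Midpoint-rule integration of dP = mu*p*omega*r over rings 2*pi*r*dr."""
    dr = R / n
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * dr
        total += mu * p * omega * r * 2 * pi * r * dr
    return total

def friction_power_closed(mu, p, omega, R):
    """Closed form for uniform contact pressure: P = (2/3)*pi*mu*p*omega*R^3."""
    return (2.0 / 3.0) * pi * mu * p * omega * R**3

# Illustrative values: mu = 0.3, 50 MPa contact pressure, 400 rad/s, 10 mm radius
mu, p, omega, R = 0.3, 50e6, 400.0, 0.01
print(friction_power_numeric(mu, p, omega, R), friction_power_closed(mu, p, omega, R))
```

In the paper's setting both mu and p would vary with temperature and time, which is exactly why a calibrated numerical model rather than this closed form is needed.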
Procedia PDF Downloads 147
2871 Flow Visualization in Biological Complex Geometries for Personalized Medicine
Authors: Carlos Escobar-del Pozo, César Ahumada-Monroy, Azael García-Rebolledo, Alberto Brambila-Solórzano, Gregorio Martínez-Sánchez, Luis Ortiz-Rincón
Abstract:
Numerical simulations of flow in complex biological structures have gained considerable attention in recent years. However, the major issue is the validation of the results. The present work shows a Particle Image Velocimetry (PIV) flow visualization technique in complex biological structures, particularly in intracranial aneurysms. A methodology to reconstruct and generate a transparent model has been developed, as well as visualization and particle tracking techniques. The generated transparent models allow visualizing the flow patterns with a regular camera using the visualization techniques. The final goal is to use visualization as a tool to provide more information for treatment and surgery decisions in aneurysms.
Keywords: aneurysms, PIV, flow visualization, particle tracking
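At its core, PIV estimates displacement by cross-correlating interrogation windows from two consecutive frames. A brute-force sketch of that idea on a toy pair of frames (everything here is invented for illustration; real PIV uses FFT-based correlation and sub-pixel peak fitting):

```python
def displacement(frame_a, frame_b, max_shift=3):
    """Brute-force cross-correlation of two interrogation windows: the shift
    (dy, dx) maximizing the correlation is the estimated particle displacement."""
    h, w = len(frame_a), len(frame_a[0])
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = sum(frame_a[y][x] * frame_b[y + dy][x + dx]
                        for y in range(h) for x in range(w)
                        if 0 <= y + dy < h and 0 <= x + dx < w)
            if best is None or score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

# Toy 8x8 frames: a 2x2 particle blob moved down 1 px and right 2 px
frame_a = [[0] * 8 for _ in range(8)]
frame_b = [[0] * 8 for _ in range(8)]
for y in (2, 3):
    for x in (2, 3):
        frame_a[y][x] = 1
        frame_b[y + 1][x + 2] = 1

print(displacement(frame_a, frame_b))  # → (1, 2)
```

Dividing the recovered displacement by the inter-frame time gives the local velocity vector, which is what is compared against the CFD results for validation.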
Procedia PDF Downloads 90
2870 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Models
Authors: Alam Ali, Ashok Kumar Pathak
Abstract:
Path analysis is a statistical technique used to evaluate the direct and indirect effects of variables in path models. One or more structural regression equations are used to estimate a series of parameters in path models to find the best fit to the data. However, the assumptions of classical regression models, such as ordinary least squares (OLS), are sometimes violated by the nature of the data, resulting in insignificant direct and indirect effects of exogenous variables. This article aims to explore the effectiveness of a copula-based regression approach as an alternative to classical regression, specifically when variables are linked through an elliptical copula.
Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique
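The classical OLS baseline that the copula approach is compared against can be sketched on the simplest path model, X → M → Y with a direct X → Y path: the direct effect is the coefficient of X controlling for M, the indirect effect is the product a·b along the mediated path, and for OLS the total effect equals their sum. All data and coefficients below are invented.

```python
from statistics import mean

def ols_slope(x, y):
    """Simple-regression slope of y on x (with intercept)."""
    mx, my = mean(x), mean(y)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

def ols_two(x, m, y):
    """Slopes of y on (x, m) with intercept, via the 2x2 normal equations."""
    cx = [v - mean(x) for v in x]
    cm = [v - mean(m) for v in m]
    cy = [v - mean(y) for v in y]
    sxx = sum(v * v for v in cx); smm = sum(v * v for v in cm)
    sxm = sum(p * q for p, q in zip(cx, cm))
    sxy = sum(p * q for p, q in zip(cx, cy)); smy = sum(p * q for p, q in zip(cm, cy))
    det = sxx * smm - sxm * sxm
    return (sxy * smm - smy * sxm) / det, (smy * sxx - sxy * sxm) / det

# Toy path model X -> M -> Y with a direct X -> Y path (true b = 1.5, c' = 0.8)
X = [1, 2, 3, 4, 5, 6]
M = [2 * xi + e for xi, e in zip(X, [0.5, -0.3, 0.2, -0.4, 0.1, -0.1])]
Y = [1.5 * mi + 0.8 * xi for mi, xi in zip(M, X)]

a = ols_slope(X, M)            # path X -> M
direct, b = ols_two(X, M, Y)   # X -> Y controlling for M, and path M -> Y
indirect = a * b
total = ols_slope(X, Y)
print(direct, indirect, total)  # for OLS: total = direct + indirect
```

A copula-based regression would replace the two `ols_*` fits while keeping the same direct/indirect decomposition.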
Procedia PDF Downloads 43
2869 Solid State Drive End to End Reliability Prediction, Characterization and Control
Authors: Mohd Azman Abdul Latif, Erwan Basiron
Abstract:
A flaw or drift from expected operational performance in one component (NAND, PMIC, controller, DRAM, etc.) may affect the reliability of the entire Solid State Drive (SSD) system. Therefore, it is important to ensure the required quality of each individual component through qualification testing specified by standards or user requirements. Qualification testing is time-consuming and comes at a substantial cost to product manufacturers. A highly technical team drawn from all the key stakeholders embarks on reliability prediction from the beginning of new product development, identifies critical-to-reliability parameters, performs full-blown characterization to embed margin into product reliability, and establishes controls to ensure that product reliability is sustainable in mass production. The paper will discuss a comprehensive development framework, comprehending the SSD end to end from design to assembly, in-line inspection, and in-line testing, that is able to predict and validate product reliability at the early stage of new product development. During the design stage, the SSD will go through intense reliability margin investigation with a focus on assembly process attributes, process equipment control, in-process metrology, and also a forward-looking product roadmap. Once these pillars are completed, the next step is to perform process characterization and build up a reliability prediction model. Next, for the design validation process, the reliability prediction, specifically a solder joint simulator, will be established. The SSDs will be stratified into non-operating and operating tests with a focus on solder joint reliability and connectivity/component latent failures, with prevention through design intervention and containment through the Temperature Cycle Test (TCT). Some of the SSDs will be subjected to physical solder joint analyses called Dye and Pry (DP) and cross-section analysis.
The results will be fed back to the simulation team for any corrective actions required to further improve the design. Once the SSD is validated and proven working, the monitor phase will be implemented, whereby Design for Assembly (DFA) rules will be updated. At this stage, the design changes and the process and equipment parameters are under control. Predictable product reliability early in product development will enable on-time qualification sample delivery to the customer, will optimize product development validation and the effective use of development resources, and will avoid forced late investment to bandage end-of-life product failures. Understanding the critical-to-reliability parameters earlier will allow focus on increasing the product margin, which will increase customer confidence in product reliability.
Keywords: e2e reliability prediction, SSD, TCT, solder joint reliability, NUDD, connectivity issues, qualifications, characterization and control
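The abstract does not state the solder joint reliability model used. A common choice for relating field thermal cycling to an accelerated TCT is the Norris–Landzberg acceleration factor, sketched below with textbook SnPb-solder constants; all values are illustrative assumptions, not the authors' model.

```python
from math import exp

def norris_landzberg(dT_use, dT_test, f_use, f_test, Tmax_use, Tmax_test,
                     n=1.9, m=1.0 / 3.0, Ea_over_k=1414.0):
    """Thermal-cycle fatigue acceleration factor AF = N_use / N_test.
    dT in kelvin, f in cycles/day, Tmax in kelvin; the default constants
    are the classic Norris-Landzberg values for SnPb solder."""
    return ((dT_test / dT_use) ** n
            * (f_use / f_test) ** m
            * exp(Ea_over_k * (1.0 / Tmax_use - 1.0 / Tmax_test)))

# Field: 0..40 C daily cycle; TCT chamber: -40..85 C at 24 cycles/day
af = norris_landzberg(dT_use=40, dT_test=125, f_use=1, f_test=24,
                      Tmax_use=313.0, Tmax_test=358.0)
print(f"one test cycle ~ {af:.1f} field cycles")
```

An AF of several field cycles per chamber cycle is what lets a few weeks of TCT stand in for years of service when setting the reliability margin.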
Procedia PDF Downloads 174
2868 Implementation and Validation of Therapeutic Tourism Products for Families With Children With Autism Spectrum Disorder in Azores Islands: “Azores All in Blue” Project
Authors: Ana Rita Conde, Pilar Mota, Tânia Botelho, Suzana Caldeira, Isabel Rego, Jessica Pacheco, Osvaldo Silva, Áurea Sousa
Abstract:
Tourism promotes well-being and health in children with autism spectrum disorder (ASD) and their families. The literature indicates the need to provide tourist activities that integrate a therapeutic component in order to promote the development and well-being of children with ASD. The study aims to implement tourist offers in the Azores that integrate this therapeutic component and to assess their suitability and impact on the well-being and health of the children and their caregivers. Using a mixed methodology, the study integrates families that experience and evaluate the impact of tourism products developed specifically for them.
Keywords: autism spectrum disorder, children, therapeutic tourism activities, well-being, health, inclusive tourism
Procedia PDF Downloads 144
2867 Fault Diagnosis in Induction Motor
Authors: Kirti Gosavi, Anita Bhole
Abstract:
The paper demonstrates the simulation and steady-state performance of a three-phase squirrel cage induction motor and the detection of a broken rotor bar fault using MATLAB. This simulation model is successfully used for the detection of broken rotor bar faults in induction machines. A dynamic model using a PWM inverter and a mathematical model of the motor are developed. The dynamic simulation of a small-power induction motor is one of the key steps in the validation of the design process of the motor drive system; it is needed to eliminate inadvertent design errors and the resulting errors in prototype construction and testing. The simulation model is helpful for detecting faults in a three-phase induction motor using motor current signature analysis (MCSA).
Keywords: squirrel cage induction motor, pulse width modulation (PWM), fault diagnosis, induction motor
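In MCSA, a broken rotor bar classically shows up as sidebands at (1 ± 2ks)·f_s around the supply frequency, where s is the per-unit slip. A small sketch of that arithmetic (the motor values are illustrative, not from the paper):

```python
def slip(sync_rpm, rotor_rpm):
    """Per-unit slip of an induction motor."""
    return (sync_rpm - rotor_rpm) / sync_rpm

def broken_bar_sidebands(f_supply, s, k_max=2):
    """Classic MCSA broken-rotor-bar sideband frequencies (1 +/- 2*k*s)*f_supply."""
    return sorted(f_supply * (1 + sign * 2 * k * s)
                  for k in range(1, k_max + 1) for sign in (-1, 1))

s = slip(1500, 1440)                 # 4-pole motor on a 50 Hz supply
print(f"slip = {s:.3f}")             # 0.040
print(broken_bar_sidebands(50.0, s)) # sidebands near 42, 46, 54, 58 Hz
```

In practice, the fault is declared when these sideband components rise above a threshold relative to the fundamental in the stator current spectrum.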
Procedia PDF Downloads 633
2866 Hydrogen Purity: Developing Low-Level Sulphur Speciation Measurement Capability
Authors: Sam Bartlett, Thomas Bacquart, Arul Murugan, Abigail Morris
Abstract:
Fuel cell electric vehicles provide the potential to decarbonise road transport, create new economic opportunities, diversify national energy supply, and significantly reduce the environmental impacts of road transport. A potential issue, however, is that the catalyst used at the fuel cell cathode is susceptible to degradation by impurities, especially sulphur-containing compounds. A recent European Directive (2014/94/EU) stipulates that, from November 2017, all hydrogen provided to fuel cell vehicles in Europe must comply with the hydrogen purity specifications listed in ISO 14687-2; this includes reactive and toxic chemicals such as ammonia and total sulphur-containing compounds. This requirement poses great analytical challenges due to the instability of some of these compounds in calibration gas standards at relatively low amount fractions and the difficulty associated with undertaking measurements of groups of compounds rather than individual compounds. Without the available reference materials and analytical infrastructure, hydrogen refuelling stations will not be able to demonstrate compliance with the ISO 14687 specifications. The hydrogen purity laboratory at NPL provides world-leading, accredited purity measurements to allow hydrogen refuelling stations to evidence compliance with ISO 14687. Utilising state-of-the-art methods developed by NPL’s hydrogen purity laboratory, including a novel method for measuring total sulphur compounds at 4 nmol/mol and a hydrogen impurity enrichment device, we provide the capabilities necessary to achieve these goals. An overview of these capabilities will be given in this paper.
As part of the EMPIR hydrogen co-normative project ‘Metrology for sustainable hydrogen energy applications’, NPL is developing a validated analytical methodology for the measurement of speciated sulphur-containing compounds in hydrogen at low amount fractions (pmol/mol to nmol/mol) to allow identification and measurement of individual sulphur-containing impurities in real samples of hydrogen (as opposed to a ‘total sulphur’ measurement). This is achieved by producing a suite of stable gravimetrically prepared primary reference gas standards containing low amount fractions of sulphur-containing compounds (hydrogen sulphide, carbonyl sulphide, carbon disulphide, 2-methyl-2-propanethiol and tetrahydrothiophene have been selected for use in this study) to be used in conjunction with novel dynamic dilution facilities to enable generation of pmol/mol to nmol/mol level gas mixtures (a dynamic method is required, as compounds at these levels would be unstable in gas cylinder mixtures). Method development and optimisation are performed using gas chromatographic techniques assisted by cryo-trapping technologies and coupled with sulphur chemiluminescence detection to allow improved qualitative and quantitative analyses of sulphur-containing impurities in hydrogen. The paper will review the state-of-the-art gas standard preparation techniques, including the use and testing of dynamic dilution technologies for reactive chemical components in hydrogen. Method development will also be presented, highlighting the advances in the measurement of speciated sulphur compounds in hydrogen at low amount fractions.
Keywords: gas chromatography, hydrogen purity, ISO 14687, sulphur chemiluminescence detector
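The dilution arithmetic behind such dynamic facilities is simple mass-flow mixing: each stage scales the parent amount fraction by the ratio of the parent flow to the total flow. The two-stage sketch below uses invented flows and fractions purely for illustration.

```python
def dilute(x_parent, flow_parent, flow_diluent):
    """Amount fraction after mixing a parent standard with pure diluent gas,
    assuming ideal mixing: x_out = x_parent * f_parent / (f_parent + f_diluent)."""
    return x_parent * flow_parent / (flow_parent + flow_diluent)

# Stage 1: 100 nmol/mol H2S standard, 10 mL/min into 990 mL/min hydrogen
x1 = dilute(100e-9, 10.0, 990.0)      # -> 1 nmol/mol
# Stage 2: 10 mL/min of that mixture into 990 mL/min hydrogen
x2 = dilute(x1, 10.0, 990.0)          # -> 10 pmol/mol
print(f"{x1 * 1e9:.2f} nmol/mol, {x2 * 1e12:.2f} pmol/mol")
```

In the real facility, the dominant uncertainty comes from the flow measurements and from adsorption losses of the reactive sulphur species, which is why the dilutors themselves must be validated.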
Procedia PDF Downloads 225
2865 Building an Opinion Dynamics Model from Experimental Data
Authors: Dino Carpentras, Paul J. Maher, Caoimhe O'Reilly, Michael Quayle
Abstract:
Opinion dynamics is a sub-field of agent-based modeling that focuses on people’s opinions and their evolution over time. Despite the rapid increase in the number of publications in this field, it is still not clear how to apply these models to real-world scenarios. Indeed, there is no agreement on how people update their opinion while interacting. Furthermore, it is not clear whether different topics will show the same dynamics (e.g., more polarized topics may behave differently). These problems are mostly due to the lack of experimental validation of the models. Some previous studies started bridging this gap in the literature by directly measuring people’s opinions before and after the interaction. However, these experiments force people to express their opinion as a number instead of using natural language (and then, eventually, encoding it as numbers). This is not the way people normally interact, and it may strongly alter the measured dynamics. Another limitation of these studies is that they usually average all the topics together, without checking whether different topics show different dynamics. In our work, we collected data from 200 participants on 5 unpolarized topics. Participants expressed their opinions in natural language (“agree” or “disagree”). We also measured the certainty of their answer, expressed as a number between 1 and 10. However, this value was not shown to other participants, to keep the interaction based on natural language. We then showed the opinion (and not the certainty) of another participant and, after a distraction task, repeated the measurement. To make the data compatible with opinion dynamics models, we multiplied opinion and certainty to obtain a new parameter (here called “continuous opinion”) ranging from -10 to +10 (using agree = 1 and disagree = -1). We first checked the 5 topics individually, finding that all of them behaved in a similar way despite having different initial opinion distributions.
This suggested that the same model could be applied to different unpolarized topics. We also observed that people tend to maintain similar levels of certainty, even when they change their opinion. This is a strong violation of what is suggested by common models, where people starting at, for example, +8 will first move towards 0 instead of directly jumping to -8. We also observed social influence, meaning that people exposed to “agree” were more likely to move to higher levels of continuous opinion, while people exposed to “disagree” were more likely to move to lower levels. However, we also observed that the effect of influence was smaller than the effect of random fluctuations. This configuration, too, differs from standard models, where noise, when present, is usually much smaller than the effect of social influence. Starting from this, we built an opinion dynamics model that explains more than 80% of the data variance. This model was also able to show the natural emergence of polarization from unpolarized states. This experimental approach offers a new way to build models grounded in experimental data. Furthermore, the model offers new insight into the fundamental terms of opinion dynamics models.
Keywords: experimental validation, micro-dynamics rule, opinion dynamics, update rule
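The encoding and the qualitative update rule described above can be sketched as follows; the functional form of the update and all parameter values are illustrative assumptions, not the authors' fitted model.

```python
import random

def encode(opinion, certainty):
    """Map ('agree'|'disagree', certainty 1..10) to a continuous opinion in [-10, 10]."""
    return (1 if opinion == "agree" else -1) * certainty

def update(own, shown, influence=0.5, noise_sd=1.5, rng=random):
    """Small pull toward the shown opinion plus a larger random fluctuation,
    mirroring the finding that noise dominates social influence."""
    pulled = own + influence * (1 if shown == "agree" else -1)
    return max(-10, min(10, pulled + rng.gauss(0, noise_sd)))

rng = random.Random(42)
print(encode("agree", 8), encode("disagree", 3))     # 8 -3
print(update(encode("agree", 8), "disagree", rng=rng))
```

Note that in this encoding, a change of stance with unchanged certainty is a jump across zero (e.g. +8 to -8), exactly the behavior the experiment observed and standard bounded-confidence models rule out.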
Procedia PDF Downloads 109
2864 Psychometric Validation of Czech Version of Spiritual Needs Assessment for Patients: The First Part of Research
Authors: Lucie Mrackova, Helena Kisvetrova
Abstract:
Spirituality is an integral part of human life. In a secular environment, spiritual needs are often overlooked, especially in acute nursing care. The Spiritual Needs Assessment for Patients (SNAP), which also exists in a Czech version (SNAP-CZ), can be used for objective evaluation. The aim of this study was to measure the psychometric properties of SNAP-CZ and to find correlations between SNAP-CZ and sociodemographic and clinical variables. A cross-sectional study was performed with tools assessing spiritual needs (SNAP-CZ), anxiety (Beck Anxiety Inventory; BAI), depression (Beck Depression Inventory; BDI), pain (Visual Analogue Scale; VAS), self-sufficiency (Barthel Index; BI), and cognitive function (Montreal Cognitive Assessment; MoCA), together with selected sociodemographic data. The psychometric properties of SNAP-CZ were tested using factor analysis, reliability and validity tests, and correlations between the questionnaire and sociodemographic data and clinical variables. Internal consistency was established with Cronbach’s alpha for the overall score, the respective domains, and individual items. Reliability was assessed by test-retest using the intraclass correlation coefficient (ICC). Data for correlation analysis were processed using Pearson's correlation coefficient. The study included 172 trauma patients (mean age 40.6 ± 12.1 years) who had experienced polytrauma or severe monotrauma. There were a total of 106 (61.6%) male subjects, and 140 (81.4%) respondents identified themselves as non-believers. The full-scale Cronbach's alpha was 0.907. The test-retest ICC of the individual domains ranged from 0.924 to 0.960. Factor analysis resulted in a three-factor solution: psychosocial needs (alpha = 0.788), spiritual needs (alpha = 0.886), and religious needs (alpha = 0.841). Correlation analysis using Pearson's correlation coefficient showed that the domain of psychosocial needs significantly correlated only with gender (r = 0.178, p = 0.020).
Males had a statistically significantly lower average value in this domain (mean = 12.5) compared to females (mean = 13.8). The domain of spiritual needs significantly correlated with gender (r = 0.199, p = 0.009), social status (r = 0.156, p = 0.043), faith (r = -0.250, p = 0.001), anxiety (r = 0.194, p = 0.011), and depression (r = 0.155, p = 0.044). The domain of religious needs significantly correlated with age (r = 0.208, p = 0.007), education (r = -0.161, p = 0.035), faith (r = -0.575, p < 0.0001), and depression (r = 0.179, p = 0.019). Overall, the whole SNAP scale significantly correlated with gender (r = 0.219, p = 0.004), social status (r = 0.175, p = 0.023), faith (r = -0.334, p < 0.0001), anxiety (r = 0.177, p = 0.022), and depression (r = 0.173, p = 0.025). The results of this study corroborate the reliability of the SNAP-CZ and support its future use in the nursing care of trauma patients in a secular society. Acknowledgment: The study was supported by grant no. IGA_FZV_2020_003.
Keywords: acute nursing care, assessment of spiritual needs, patient, psychometric validation, spirituality
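Cronbach's alpha, used above for internal consistency, is straightforward to compute from item scores: alpha = k/(k-1)·(1 − Σ item variances / variance of the total score). A minimal sketch on invented ratings (not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(pvariance(i) for i in items) / pvariance(totals))

# Invented 5-point ratings from 4 respondents on 3 items of one domain
items = [[4, 2, 5, 3],
         [4, 1, 5, 3],
         [5, 2, 4, 3]]
print(f"alpha = {cronbach_alpha(items):.3f}")
```

Values above roughly 0.7 are conventionally taken as acceptable internal consistency, which is the benchmark the reported domain alphas (0.788–0.886) clear.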
Procedia PDF Downloads 104
2863 Navigating Neural Pathways to Success with Students on the Autism Spectrum
Authors: Panda Krouse
Abstract:
This work is a marriage of the science of applied behavior analysis and an educator’s look at neuroscience. The focus is integrating what we know about the anatomy of the brain in autism with evidence-based practices in education. It is a bold attempt to present links between neurological research and the application of evidence-based practices in education; in researching this work, no articles making these connections were found. Identified areas of structural difference in the brain are aligned with evidence-based strategies. A brief literature review identifies how these areas affect overt behavior, which is what we, as educators, can see and measure. Giving further justification and validation of our practices in education from a second scientific field is significant for continued improvement in intervention for students on the autism spectrum.
Keywords: autism, evidence based practices, neurological differences, education intervention
Procedia PDF Downloads 67
2862 Author Name Disambiguation for Biomedical Literature
Authors: Parthiban Srinivasan
Abstract:
PubMed provides online access to the National Library of Medicine database (MEDLINE) and other publications, which together contain close to 25 million scientific citations from 1865 to the present, comprising close to 80 million author name instances. For any work of literature, a fundamental issue is to identify the individual(s) who wrote it and, conversely, to identify all of the works that belong to a given individual. Due to the lack of universal standards for name information, name ambiguity has two aspects: name synonymy (a single author with multiple name representations) and name homonymy (multiple authors sharing the same name representation). In this talk, we present some results from our extensive work in author name disambiguation for PubMed citations. Information will be presented on the effectiveness and shortcomings of different aspects of successful name disambiguation, such as parsing, validation, standardization, and normalization.
Keywords: disambiguation, normalization, parsing, PubMed
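A common first step in the parsing and normalization stages mentioned above is reducing name variants to a blocking key (surname plus first initial) so that synonymous forms group together before finer disambiguation. The sketch below is an illustration of that idea only; real PubMed parsing must handle suffixes, compound surnames, and transliteration.

```python
def normalize_author(raw):
    """Parse 'Last, First M.' or 'First M. Last' into a blocking key 'last_f'
    (lower-cased surname + first initial)."""
    raw = raw.strip().rstrip(".")
    if "," in raw:
        last, first = (part.strip() for part in raw.split(",", 1))
    else:
        tokens = raw.split()
        last, first = tokens[-1], tokens[0]
    return f"{last.lower()}_{first[0].lower()}"

variants = ["Srinivasan, Parthiban", "P. Srinivasan", "Parthiban Srinivasan"]
keys = {normalize_author(v) for v in variants}
print(keys)  # all three variants map to the same blocking key
```

Homonymy then has to be resolved within each block using evidence such as affiliations, co-authors, and topics, which is where the bulk of the disambiguation work lies.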
Procedia PDF Downloads 300
2861 Non-Linear Control Based on State Estimation for the Convoy of Autonomous Vehicles
Authors: M-M. Mohamed Ahmed, Nacer K. M’Sirdi, Aziz Naamane
Abstract:
In this paper, a longitudinal and lateral control approach based on a nonlinear observer is proposed for a convoy of autonomous vehicles following a desired trajectory. To the authors' best knowledge, this topic has not yet been sufficiently addressed in the literature on the control of multiple vehicles. The modeling of the convoy of vehicles is revisited using a robotic method for simulation purposes and control design. With these models, a sliding mode observer is proposed to estimate the states of each vehicle in the convoy from the available sensors; then a sliding mode control based on this observer is used to control the longitudinal and lateral movement. The validation and performance evaluation are done using the well-known driving simulator Scanner-Studio. Results are presented for different maneuvers of 5 vehicles.
Keywords: autonomous vehicles, convoy, non-linear control, non-linear observer, sliding mode
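The longitudinal part of a sliding mode convoy controller can be illustrated on a single leader-follower pair: define a sliding surface s = ė + λe on the spacing error e and apply a bounded switching acceleration u = K·sign(s). The model, gains, and scenario below are invented for illustration and omit the observer, lateral dynamics, and chattering mitigation used in the paper.

```python
from math import copysign

def simulate(dt=0.01, steps=2000, lam=0.5, K=2.0, d_des=10.0):
    """Longitudinal sliding mode control of one follower behind a
    constant-speed leader; returns the final spacing error (m)."""
    x_l, v_l = 50.0, 10.0        # leader position (m), velocity (m/s)
    x_f, v_f = 0.0, 8.0          # follower starts 40 m behind the desired gap
    for _ in range(steps):
        e = (x_l - x_f) - d_des  # spacing error (positive = too far back)
        e_dot = v_l - v_f
        s = e_dot + lam * e      # sliding surface
        u = copysign(K, s)       # switching control: bounded acceleration
        x_l += v_l * dt
        x_f += v_f * dt
        v_f += u * dt
    return (x_l - x_f) - d_des

e_final = simulate()
print(f"spacing error after 20 s: {e_final:.3f} m")
```

Once the state reaches the surface, the error decays roughly as e^(-λt), with small residual chattering from the discrete sign switching; in the paper, the controller additionally runs on states reconstructed by the sliding mode observer.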
Procedia PDF Downloads 141
2860 Preliminary Study of Human Reliability of Control in Case of Fire Based on the Decision Processes and Stress Model of Human in a Fire
Authors: Seung-Un Chae, Heung-Yul Kim, Sa-Kil Kim
Abstract:
This paper presents the findings of a preliminary study on human control performance in case of fire. The relationship between human control and human decision-making is studied using a decision-process and stress model of humans in a fire, covering the human behavior aspects involved in the decision process during a fire incident. The decision process comprises six individual perceptual processes: recognition, validation, definition, evaluation, commitment, and reassessment. A person may then be stressed while seeking an optimal decision for their activity. This paper explores problems in human control processes and stresses in a catastrophic situation. The future approach will thus be concerned with reducing stresses and ambiguous, irrelevant information.
Keywords: human reliability, decision processes, stress model, fire
Procedia PDF Downloads 986
2859 Design Optimization and Thermoacoustic Analysis of Pulse Tube Cryocooler Components
Authors: K. Aravinth, C. T. Vignesh
Abstract:
The usage of pulse tube cryocoolers has increased significantly, mainly due to the advantage of the absence of moving parts. The underlying idea of this project is to optimize the design of the pulse tube, regenerator, and resonator in a cryocooler and to analyze the thermoacoustic oscillations with respect to the design parameters. A Computational Fluid Dynamics (CFD) model with time-dependent validation is built to predict performance. The continuity, momentum, and energy equations are solved for the various porous media regions. The effect of changing the geometries and orientation on performance is validated and investigated. The pressure, temperature, and velocity fields in the regenerator and pulse tube are evaluated. The performance results of this optimized design are compared with the existing pulse tube cryocooler design. The sinusoidal behavior of the cryocooler in the acoustic streaming patterns in the pulse tube is also evaluated.
Keywords: acoustics, cryogenics, design, optimization
Procedia PDF Downloads 175
2858 Numerical and Experimental Investigation of Mixed-Mode Fracture of Cement Paste and Interface Under Three-Point Bending Test
Authors: S. Al Dandachli, F. Perales, Y. Monerie, F. Jamin, M. S. El Youssoufi, C. Pelissou
Abstract:
The goal of this research is to study the fracture process and mechanical behavior of concrete under mixed-mode I–II stress, which is essential for ensuring the safety of concrete structures. For this purpose, three-point bending tests under variable load and geometry on notched samples of cement paste and of composite samples (cement paste/siliceous aggregate) are modeled in two dimensions by employing Cohesive Zone Models (CZMs). Through experimental validation of these tests, the CZM demonstrates its capacity to predict fracture propagation at the local scale.
Keywords: cement paste, interface, cohesive zone model, fracture, three-point bending test
Procedia PDF Downloads 150
2857 Ancient Egyptian Industry Technology of Canopic Jars, Analytical Study and Conservation Processes of Limestone Canopic Jar
Authors: Abd El Rahman Mohamed
Abstract:
Canopic jars, made by the ancient Egyptians from different materials, were used to preserve the viscera during the mummification process. The canopic jar studied here dates back to the Late Period (712-332 BC) and is held in the Grand Egyptian Museum (GEM), Giza, Egypt. The jar was carved from limestone and covered with a monkey-head lid, with eyes and ears painted in red pigment and outlined in black pigment. The jar contains textile bandages holding mummy viscera with resin, together with blocks of black resin. Canopic jars were made using the sculpting tools of the ancient Egyptians, such as metal (copper) chisels and hammers, and the mass of the jar was emptied from the inside using a tool invented by the ancient Egyptians called the emptying drill. This study also aims to use analytical techniques to identify the components of the jar, its contents, pigments, and previous restoration materials, and to understand its deterioration aspects. Visual assessment, isolation and identification of fungi, optical microscopy (OM), scanning electron microscopy (SEM), X-ray fluorescence spectroscopy (XRF), X-ray diffraction (XRD), and Fourier transform infrared spectroscopy (FTIR) were used in our study. The jar showed different signs of deterioration, such as dust, dirt, stains, scratches, classifications, missing parts, and breaks; previous conservation materials included iron wire, completion mortar, and an adhesive used for assembly. The results revealed that the jar was carved from dolomitic limestone, and identified red hematite pigment, mastic resin, and linen textile bandages. The previous adhesive was animal glue, and gypsum had been used for the previous completion. The most dominant microbial infections on the jar were the fungi Penicillium waksmanii and Nigrospora sphaerica, Actinomycetes sp., and spore-forming Gram-positive bacilli.
Conservation procedures have been applied with high accuracy to conserve the jar, including mechanical and chemical cleaning, re-assembly, completion, and consolidation.
Keywords: Canopic jar, Consolidation, Mummification, Resin, Viscera.
Procedia PDF Downloads 722856 Computational Models for Accurate Estimation of Joint Forces
Authors: Ibrahim Elnour Abdelrahman Eltayeb
Abstract:
Computational modelling is a method for investigating joint forces during movement. Subject-specific models can estimate joint forces with high accuracy; however, constructing subject-specific models remains time-consuming and expensive. The purpose of this paper was to identify which alterations to generic computational models yield a better estimation of joint forces, and to appraise the impact of these alterations on the accuracy of the estimated forces. Different strategies of alteration were found: the joint model, the muscle model, and the optimisation problem. All of these alterations affected joint contact force accuracy, showing the potential for improving model predictions without resorting to costly and time-consuming medical imaging.
Keywords: joint force, joint model, optimisation problem, validation
Procedia PDF Downloads 1702855 Issues on Optimizing the Structural Parameters of the Induction Converter
Authors: Marinka K. Baghdasaryan, Siranush M. Muradyan, Avgen A. Gasparyan
Abstract:
Analytical expressions are obtained for the current and angular errors, as well as for the frequency characteristics, of an induction converter, describing their relation to its structural parameters and to the core and winding characteristics. Based on an estimation of the dependences obtained, a mathematical problem of parametric optimization is formulated that can successfully be used for investigating and diagnosing an induction converter.
Keywords: induction converters, magnetic circuit material, current and angular errors, frequency response, mathematical formulation, structural parameters
Procedia PDF Downloads 3452854 Online Electric Current Based Diagnosis of Stator Faults on Squirrel Cage Induction Motors
Authors: Alejandro Paz Parra, Jose Luis Oslinger Gutierrez, Javier Olaya Ochoa
Abstract:
In the present paper, five electric-current-based methods for analyzing electrical faults on the stator of induction motors (IM) are used and compared. The analysis seeks to extend the application of the multiple-reference-frames diagnosis technique. An eccentricity indicator is presented to improve the application of the Park's Vector Approach technique. Most of the fault indicators are validated, and some others are revised; they agree with the technical literature and published results. A three-phase 3 hp squirrel-cage IM, specially modified to establish different fault levels, is used for validation purposes.
Keywords: motor fault diagnosis, induction motor, MCSA, ESA, extended Park's vector approach, multiparameter analysis
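The Park's Vector Approach mentioned in the abstract can be illustrated with a short sketch. The transform below is the standard power-invariant form commonly used in this technique; the current signals are synthetic, not the paper's measurements:

```python
import numpy as np

def parks_vector(ia, ib, ic):
    """Park's vector components from three-phase stator currents
    (standard power-invariant transform)."""
    i_d = np.sqrt(2 / 3) * ia - (1 / np.sqrt(6)) * ib - (1 / np.sqrt(6)) * ic
    i_q = (1 / np.sqrt(2)) * (ib - ic)
    return i_d, i_q

# Balanced (healthy) three-phase currents at 60 Hz.
t = np.linspace(0, 0.1, 1000)
w = 2 * np.pi * 60
ia = np.cos(w * t)
ib = np.cos(w * t - 2 * np.pi / 3)
ic = np.cos(w * t + 2 * np.pi / 3)

i_d, i_q = parks_vector(ia, ib, ic)
radius = np.hypot(i_d, i_q)
# A healthy motor traces a circle (constant radius); an inter-turn
# stator fault flattens the locus into an ellipse (radius modulation).
print(radius.max() - radius.min())  # ~0 for the balanced case
```

The diagnostic indicator is the modulation of the locus radius: zero for a symmetric machine, growing with fault severity.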
Procedia PDF Downloads 3482853 Residual Evaluation by Thresholding and Neuro-Fuzzy System: Application to Actuator
Authors: Y. Kourd, D. Lefebvre, N. Guersi
Abstract:
The monitoring of industrial processes is required to ensure the operating conditions of industrial systems through automatic detection and isolation of faults. In this paper, we propose a fault-diagnosis method based on a neuro-fuzzy technique and the choice of a threshold, validated on the DAMADICS electro-pneumatic actuator benchmark test bench. In the first phase of the method, we construct a model representing the normal state of the system for fault detection; the generated residuals are analyzed, and thresholds are chosen to build a signature table. These signatures reveal groups of non-detectable faults. In the second phase, we build faulty models to capture the faults in the system that were not isolated in the first phase.
Keywords: residuals analysis, threshold, neuro-fuzzy system, residual evaluation
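The residual-evaluation step described in the abstract can be sketched minimally: residuals between measurements and the nominal-state model are compared against thresholds to produce a fault-signature vector. The sensor values and thresholds below are hypothetical placeholders, not the benchmark's data:

```python
import numpy as np

def evaluate_residuals(measured, predicted, thresholds):
    """Compare absolute residuals against per-sensor thresholds and
    return a binary fault signature (1 = residual exceeds threshold)."""
    residuals = np.abs(np.asarray(measured) - np.asarray(predicted))
    return (residuals > np.asarray(thresholds)).astype(int)

# Hypothetical actuator readings vs. nominal-model predictions.
measured = [1.02, 0.48, 3.90]
predicted = [1.00, 0.50, 3.00]
thresholds = [0.05, 0.05, 0.50]  # chosen from nominal-operation noise

signature = evaluate_residuals(measured, predicted, thresholds)
print(signature.tolist())  # [0, 0, 1] -> third residual flags a fault
```

Rows of such signatures over all fault scenarios form the signature table; faults whose rows are all zeros are the non-detectable group mentioned above.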
Procedia PDF Downloads 4462852 The Role of Genetic Markers in Prostate Cancer Diagnosis and Treatment
Authors: Farman Ali, Asif Mahmood
Abstract:
The utilization of genetic markers in prostate cancer management represents a significant advance in personalized medicine, offering the potential for more precise diagnosis and tailored treatment strategies. This paper explores the pivotal role of genetic markers in the diagnosis and treatment of prostate cancer, emphasizing their contribution to the identification of individual risk profiles, tumor aggressiveness, and response to therapy. By integrating current research findings, we discuss the application of genetic markers in developing targeted therapies and the implications for patient outcomes. Despite the promising advancements, challenges such as accessibility, cost, and the need for further validation in diverse populations remain. The paper concludes with an outlook on future directions, underscoring the importance of genetic markers in revolutionizing prostate cancer care.
Keywords: prostate cancer, genetic markers, personalized medicine, BRCA1 and BRCA2
Procedia PDF Downloads 622851 Predictive Semi-Empirical NOx Model for Diesel Engine
Authors: Saurabh Sharma, Yong Sun, Bruce Vernham
Abstract:
Accurate prediction of NOx emissions is a continuing challenge in the field of diesel engine-out emission modeling. Performing experiments for every condition and scenario costs a significant amount of money and man-hours; therefore, a model-based development strategy has been implemented to address this issue. NOx formation is highly dependent on the burned-gas temperature and the O2 concentration inside the cylinder. Current empirical models are developed by calibrating parameters representing the engine operating conditions against the measured NOx, which limits the prediction of purely empirical models to the region where they have been calibrated. An alternative is presented in this paper, which focuses on utilizing in-cylinder combustion parameters to form a predictive semi-empirical NOx model. The result of this work is a fast and predictive NOx model built from physical parameters and empirical correlations. The model is developed from steady-state data collected over the entire operating region of the engine and from a predictive combustion model developed in Gamma Technologies (GT)-Power using the Direct-Injection (DI)-Pulse combustion object. In this approach, the temperature in both the burned and unburned zones is considered during the combustion period, i.e., from Intake Valve Closing (IVC) to Exhaust Valve Opening (EVO). The oxygen concentration consumed in the burned zone and the trapped fuel mass are also considered in developing the reported model. Several statistical methods are used to construct the model, including individual machine-learning methods and ensemble machine-learning methods. A detailed validation of the model on multiple diesel engines is reported in this work, with a substantial number of cases tested for different engine configurations over a large span of speed and load points.
Different sweeps of operating conditions, such as Exhaust Gas Recirculation (EGR), injection timing, and Variable Valve Timing (VVT), are also considered for the validation. The model shows very good predictability and robustness at both sea-level and altitude conditions under different ambient conditions. Its advantages, such as high accuracy and robustness at different operating conditions, low computational time, and the smaller number of data points required for calibration, establish a platform on which the model-based approach can be used for the engine calibration and development process. Moreover, this work aims to establish a framework for future model development toward other targets such as soot, Combustion Noise Level (CNL), the NO2/NOx ratio, etc.
Keywords: diesel engine, machine learning, NOₓ emission, semi-empirical
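The semi-empirical idea, regressing NOx against the in-cylinder burned-zone temperature and oxygen concentration that dominate thermal NO formation, can be sketched as follows. The data, the activation-energy value, and the functional form are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical in-cylinder quantities per steady-state point:
# burned-zone temperature [K] and burned-zone O2 mole fraction [-].
T_burn = rng.uniform(1800.0, 2600.0, 500)
o2 = rng.uniform(0.02, 0.12, 500)

# Surrogate "measured" NOx with a thermal-NO-like Arrhenius form
# (an illustrative stand-in): NOx = A * o2 * exp(-Ea / T_burn).
nox = 1e9 * o2 * np.exp(-38000.0 / T_burn)
nox *= np.exp(0.05 * rng.standard_normal(500))  # log-normal noise

# Semi-empirical fit: ln(NOx) = a + b / T_burn + c * ln(o2),
# solved by ordinary least squares.
A = np.column_stack([np.ones_like(T_burn), 1.0 / T_burn, np.log(o2)])
coef, *_ = np.linalg.lstsq(A, np.log(nox), rcond=None)
a, b, c = coef
print(b, c)  # b recovers ~-38000 (activation term), c recovers ~1
```

The fitted coefficients carry physical meaning (activation temperature, oxygen-concentration exponent), which is what makes such a correlation extrapolate better than a purely empirical map of operating conditions.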
Procedia PDF Downloads 1142850 A Quantitative Structure-Adsorption Study on Novel and Emerging Adsorbent Materials
Authors: Marc Sader, Michiel Stock, Bernard De Baets
Abstract:
Given the large amount of adsorption data for adsorbate gases on adsorbent materials in the literature, it is attractive to predict such adsorption data without experimentation. A quantitative structure-activity relationship (QSAR) is developed to correlate molecular characteristics of gases, and existing knowledge of materials, with their respective adsorption properties. The application of Random Forest, a machine-learning method, to a set of adsorption isotherms over a wide range of partial pressures and concentrations is studied. The predicted adsorption isotherms are fitted to several adsorption equations to estimate the adsorption properties. To impute the adsorption properties of desired gases on desired materials, leave-one-out cross-validation is employed. Extensive experimental results for a range of settings are reported.
Keywords: adsorption, predictive modeling, QSAR, random forest
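The Random Forest plus leave-one-out procedure named in the abstract can be sketched in a few lines with scikit-learn. The descriptors and capacity values below are toy placeholders standing in for the literature data, not the paper's dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import LeaveOneOut

# Hypothetical descriptors: [molecular weight, polarizability, pore size]
# with a Langmuir-type saturation capacity as the target (toy values).
X = np.array([
    [16.0, 2.6, 0.8], [44.0, 2.9, 0.8], [28.0, 1.7, 0.8],
    [16.0, 2.6, 1.2], [44.0, 2.9, 1.2], [28.0, 1.7, 1.2],
    [16.0, 2.6, 1.6], [44.0, 2.9, 1.6], [28.0, 1.7, 1.6],
])
y = np.array([1.1, 3.2, 0.9, 1.8, 4.6, 1.4, 2.2, 5.1, 1.7])

# Leave-one-out: each gas/material pair is imputed from all the others.
preds = np.empty_like(y)
for train_idx, test_idx in LeaveOneOut().split(X):
    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(X[train_idx], y[train_idx])
    preds[test_idx] = rf.predict(X[test_idx])

mae = np.mean(np.abs(preds - y))
print(mae)  # mean absolute error of the imputed capacities
```

In the paper's setting the target would be the whole isotherm (or the parameters of a fitted adsorption equation) rather than a single scalar, but the imputation loop is the same.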
Procedia PDF Downloads 2272849 Asymptotic Expansion of the Korteweg-de Vries-Burgers Equation
Authors: Jian-Jun Shu
Abstract:
It is common knowledge that many physical problems (such as non-linear shallow-water waves and wave motion in plasmas) can be described by the Korteweg-de Vries (KdV) equation, which possesses certain special solutions known as solitary waves or solitons. As a marriage of the KdV equation and the classical Burgers equation, the Korteweg-de Vries-Burgers (KdVB) equation is a mathematical model of waves on shallow water surfaces in the presence of viscous dissipation. Asymptotic analysis is a method of describing limiting behaviour and is a key tool for exploring the differential equations that arise in the mathematical modeling of real-world phenomena. By using variable transformations, the asymptotic expansion of the KdVB equation is presented in this paper. The asymptotic expansion may provide a good benchmark for validating the corresponding numerical scheme.
Keywords: asymptotic expansion, differential equation, Korteweg-de Vries-Burgers (KdVB) equation, soliton
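For reference, the "marriage" can be written out explicitly. This is the standard form of the KdVB equation; the paper's own scaling of the coefficients may differ:

```latex
% KdV (nonlinearity + dispersion):   u_t + u\,u_x + \mu\,u_{xxx} = 0
% Burgers (nonlinearity + viscosity): u_t + u\,u_x - \nu\,u_{xx} = 0
% KdVB combines both effects:
u_t + u\,u_x - \nu\,u_{xx} + \mu\,u_{xxx} = 0
```

Here \(\nu > 0\) is the dissipation coefficient and \(\mu\) the dispersion coefficient; the KdVB equation reduces to KdV as \(\nu \to 0\) and to Burgers as \(\mu \to 0\).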
Procedia PDF Downloads 2492848 A Discovery on the Symmetrical Pattern of Mirror Primes in P²: Applications in the Formal Proof of the Goldbach Conjecture
Authors: Yingxu Wang
Abstract:
The base-6 structure and properties of mirror primes are discovered in this work towards a proof of the Goldbach conjecture. This paper reveals a fundamental pattern of pairs of mirror primes adjacent to any even number nₑ > 2 at symmetrical distances on both sides, determined by a methodology of Mirror Prime Decomposition (MPD). MPD leads to a formal proof of the Goldbach conjecture, which holds because any pivot even number nₑ > 2 is the half-sum of at least one adjacent pair of mirror primes. This work has not only revealed the analytic pattern of base-6 primes but also proven the validity of the Goldbach conjecture over all even numbers.
Keywords: number theory, primes, mirror primes, double recursive patterns, Goldbach conjecture, formal proof, mirror-prime decomposition, applications
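The symmetric-pair property the abstract rests on can be checked computationally for small pivots. This sketch does not reproduce the paper's MPD or base-6 analysis; it only searches, for a pivot nₑ, for primes nₑ - d and nₑ + d whose half-sum is nₑ (equivalently, a Goldbach pair for 2nₑ):

```python
def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    f = 2
    while f * f <= n:
        if n % f == 0:
            return False
        f += 1
    return True

def mirror_prime_pair(n_e):
    """Return primes (n_e - d, n_e + d) symmetric about the pivot n_e,
    searching outward from d = 0 (which allows the pair (p, p))."""
    for d in range(n_e - 1):
        lo, hi = n_e - d, n_e + d
        if is_prime(lo) and is_prime(hi):
            return lo, hi
    return None

# Goldbach restated: 2 * n_e = p1 + p2 for every pivot n_e > 2.
for n_e in range(3, 200):
    pair = mirror_prime_pair(n_e)
    assert pair is not None and sum(pair) == 2 * n_e
print(mirror_prime_pair(12))  # (11, 13): 11 + 13 = 2 * 12
```

A finite check like this is of course evidence, not a proof; the abstract's claim is that MPD supplies the general argument.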
Procedia PDF Downloads 502847 Multi-Scale Modelling of the Cerebral Lymphatic System and Its Failure
Authors: Alexandra K. Diem, Giles Richardson, Roxana O. Carare, Neil W. Bressloff
Abstract:
Alzheimer's disease (AD) is the most common form of dementia and, although it has been researched for over 100 years, there is still no cure or preventive medication. Its onset and progression are closely related to the accumulation of the neuronal metabolite Aβ. This raises the question of how metabolites and waste products are eliminated from the brain, as the brain does not have a traditional lymphatic system. In recent years, the rapid uptake of Aβ into cerebral artery walls and its clearance along those arteries towards the lymph nodes in the neck has been suggested and confirmed in mouse studies, which has led to the hypothesis that interstitial fluid (ISF) in the basement membranes of the walls of cerebral arteries provides the pathways for the lymphatic drainage of Aβ. This mechanism, however, requires a net reverse flow of ISF inside the blood vessel wall relative to the blood flow, and the driving forces for such a mechanism remain unknown. While possible driving mechanisms have been studied using mathematical models in the past, a mechanism for net reverse flow has not been discovered yet. Here, we aim to address the question of the driving force of this reverse lymphatic drainage of Aβ (also called perivascular drainage) by using multi-scale numerical and analytical modelling. The numerical simulation software COMSOL Multiphysics 4.4 is used to develop a fluid-structure interaction model of a cerebral artery, which models blood flow and displacements in the artery wall due to blood pressure changes. An analytical model of a layer of basement membrane inside the wall governs the flow of ISF and, therefore, solute drainage, based on the pressure changes and wall displacements obtained from the cerebral artery model. The findings suggest that the components of the basement membrane play an active role in facilitating a reverse flow and that stiffening of the artery wall with age is a major risk factor for the impairment of brain lymphatics.
Additionally, our model supports the hypothesis of a close association between cerebrovascular diseases and the failure of perivascular drainage.
Keywords: Alzheimer's disease, artery wall mechanics, cerebral blood flow, cerebral lymphatics
Procedia PDF Downloads 5262846 Application of Groundwater Level Data Mining in Aquifer Identification
Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen
Abstract:
Investigation and research are key to the conjunctive use of surface water and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually from geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system, reflecting the system response to the combination of hydrogeological structure, groundwater injection, and extraction. This study applies analytical tools to the observation database to develop a methodology for identifying confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression-curve analysis, and a decision tree. The developed methodology is then applied to groundwater-layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis applies the Fourier transform to time-series groundwater-level observations to analyze the daily frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, i.e., the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviour. The decision tree uses the information obtained from the above analytical tools and selects the best estimate of the hydrogeological structure.
The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This high accuracy indicates that the developed methodology is an effective tool for identifying hydrogeological structures.
Keywords: aquifer identification, decision tree, groundwater, Fourier transform
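The frequency-analysis feature described in the abstract, the amplitude of the daily (1 cycle/day) component of the groundwater level induced by pumping, can be sketched as follows. The hourly series is synthetic, and the amplitudes are illustrative, not the Taiwanese observation data:

```python
import numpy as np

def daily_amplitude(series, samples_per_day=24):
    """Amplitude of the 1 cycle/day component of an hourly series (FFT)."""
    n = len(series)
    t = np.arange(n)
    # Remove the linear trend so slow drawdown does not leak into the spectrum.
    detrended = series - np.polyval(np.polyfit(t, series, 1), t)
    spectrum = np.fft.rfft(detrended)
    freqs = np.fft.rfftfreq(n, d=1.0)  # cycles per hour
    k = np.argmin(np.abs(freqs - 1.0 / samples_per_day))
    return 2.0 * np.abs(spectrum[k]) / n

# Hypothetical hourly levels over 30 days: a confined aquifer responds
# sharply to daily pumping, so this amplitude is a useful input feature
# for the decision tree separating confined from unconfined layers.
hours = np.arange(30 * 24)
level = 50.0 - 0.001 * hours + 0.30 * np.sin(2 * np.pi * hours / 24)
print(round(daily_amplitude(level), 2))  # 0.3, the injected daily amplitude
```

Features like this one, together with the rainfall cross-correlation lag and the regression-curve rate, are what the decision tree consumes.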
Procedia PDF Downloads 157