Search results for: hypothesis modelling
2528 Emotional Intelligence Training: Helping Non-Native Pre-Service EFL Teachers to Overcome Speaking Anxiety: The Case of Pre-Service Teachers of English, Algeria
Authors: Khiari Nor El Houda, Hiouani Amira Sarra
Abstract:
Many EFL students with high capacities remain hidden because they suffer from speaking anxiety (SA). Most of them find public speaking very demanding: they feel unable to communicate, fear making mistakes, and fear negative evaluation or being called on. With the growing number of learners who suffer from foreign language speaking anxiety (FLSA), it is becoming increasingly difficult to ignore its harmful effects on their performance and success, especially during their first contact with pupils, as they will be teaching in the near future. Different researchers have suggested different ways to minimize the negative effects of FLSA. The present study sheds light on emotional intelligence (EI) skills training as an effective strategy, not only to improve public speaking success but also to help pre-service EFL teachers lessen their speaking anxiety and, eventually, to prepare them for their professional career. A quasi-experiment was used to examine the research hypothesis. We worked with two groups of third-year EFL students at Oum El Bouaghi University. The Foreign Language Classroom Anxiety Scale (FLCAS) and the Emotional Quotient Inventory (EQ-i) were used to collect data on the participants’ FLSA and EI levels. The analysis of the data showed that the assumed negative correlation between EI and FLSA was statistically validated by the Pearson correlation test, leading to the conclusion that the more emotionally intelligent individuals are, the less anxious they will be. In addition, the lack of improvement in the control group and the noteworthy improvement in the experimental group led us to conclude that EI skills training is an effective strategy for reducing FLSA, and we therefore confirmed our research hypothesis.
Keywords: emotional intelligence, emotional intelligence skills training, EQ-I, FLCAS, foreign language speaking anxiety, pre-service EFL teachers
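The negative EI–FLSA association reported above rests on a Pearson correlation. A minimal sketch of that computation is below; the score lists are invented for illustration (they are not the study's data), only shaped so that higher EI pairs with lower anxiety.

```python
# Pearson correlation between EI and FLSA scores.
# The score lists below are invented for illustration; they are NOT
# the study's data, only shaped to show a negative EI-FLSA relation.
from math import sqrt

def pearson_r(x, y):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ei   = [85, 92, 70, 110, 98, 76, 104, 88]    # EQ-i style EI scores (hypothetical)
flsa = [120, 98, 135, 80, 95, 128, 84, 110]  # FLCAS style anxiety scores (hypothetical)
r = pearson_r(ei, flsa)
print(f"Pearson r = {r:.2f}")  # a negative r indicates: higher EI, lower anxiety
```

A significance test (as in the study) would additionally convert r to a t statistic with n − 2 degrees of freedom.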
Procedia PDF Downloads 140
2527 Thermochemical Modelling for Extraction of Lithium from Spodumene and Prediction of Promising Reagents for the Roasting Process
Authors: Allen Yushark Fosu, Ndue Kanari, James Vaughan, Alexandre Chagnes
Abstract:
Spodumene is a lithium-bearing mineral of great interest due to the increasing demand for lithium in emerging electric and hybrid vehicles. The conventional method of processing the mineral requires an unavoidable thermal transformation of the α-phase to the β-phase, followed by roasting with suitable reagents to produce lithium salts for downstream processes. The selection of an appropriate roasting reagent is key to the success of the process and the overall lithium recovery. Several studies have been conducted to identify reagents that improve process efficiency, leading to sulfation, alkaline, chlorination, fluorination, and carbonizing routes for lithium recovery from the mineral. HSC Chemistry is a thermochemical software package that can be used to model metallurgical process feasibility and predict possible reaction products prior to experimental investigation. The software was employed to investigate and explain the behaviour of the various reagents reported in the literature for spodumene roasting up to 1200°C. The simulation indicated that all reagents used for sulfation and alkaline roasting were feasible in the direction of lithium salt production. Chlorination was feasible only when Cl2 and CaCl2 were used as chlorination agents, but not NaCl or KCl. Depending on the lithium salt formed during carbonizing and fluorination, the process was either spontaneous or non-spontaneous over the temperature range investigated. The HSC software was further used to simulate and predict promising reagents which may be equally effective for roasting the mineral for efficient lithium extraction but have not yet been considered by researchers.
Keywords: thermochemical modelling, HSC chemistry software, lithium, spodumene, roasting
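The feasibility screening described above boils down to the sign of the Gibbs energy change, ΔG = ΔH − TΔS, across the roasting temperature range. The sketch below shows that screening logic in plain Python; the ΔH/ΔS values are placeholders for illustration, not data for any real spodumene reagent.

```python
# Gibbs-energy screening of roasting reactions, in the spirit of an
# HSC Chemistry feasibility study: a reaction is thermodynamically
# feasible where dG = dH - T*dS < 0.  The dH/dS values below are
# placeholders for illustration, NOT data for any real spodumene reagent.
def delta_g(dh_kj, ds_kj_per_k, t_kelvin):
    """Gibbs free energy change (kJ/mol) at temperature T (K)."""
    return dh_kj - t_kelvin * ds_kj_per_k

def feasible_range(dh_kj, ds_kj_per_k, t_min=298, t_max=1473, step=25):
    """Temperatures (K) in [t_min, t_max] where the reaction is spontaneous."""
    return [t for t in range(t_min, t_max + 1, step)
            if delta_g(dh_kj, ds_kj_per_k, t) < 0]

# Hypothetical endothermic reaction driven by entropy gain:
ts = feasible_range(dh_kj=150.0, ds_kj_per_k=0.120)
print(f"spontaneous above ~{ts[0]} K" if ts else "never spontaneous")
```

Sweeping temperature this way mirrors how reagents can be classified as feasible throughout the range, feasible only above some temperature, or infeasible.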
Procedia PDF Downloads 158
2526 Modelling and Numerical Analysis of Thermal Non-Destructive Testing on Complex Structure
Authors: Y. L. Hor, H. S. Chu, V. P. Bui
Abstract:
Composite materials are widely used to replace conventional materials, especially in the aerospace industry, to reduce the weight of devices. A composite is formed by combining reinforcing materials via adhesive bonding to produce a bulk material with altered macroscopic properties. In bulk composites, degradation may occur at the microscopic scale, i.e. in each individual reinforcing fibre layer or, especially, in the matrix layer, in the form of delamination, inclusions, disbonds, voids, cracks, and porosity. In this paper, we focus on the detection of defects in the matrix layer, where the adhesion between the composite plies is in contact but coupled only through a weak bond. In practice, adhesive defects are tested through various non-destructive methods. Among them, pulsed phase thermography (PPT) has shown advantages, providing improved sensitivity, large-area coverage, and high-speed testing. The aim of this work is to develop an efficient numerical model to study the application of PPT to the non-destructive inspection of weak bonding in composite material. The resulting thermal evolution field comprises internal reflections between the interfaces of the defects and the specimen, and the key features of the defects present in the material can be obtained from the investigation of the thermal evolution of the field distribution. Computational simulation of such inspections has allowed the techniques to be improved and applied to various inspections, such as materials with high thermal conductivity and more complex structures.
Keywords: pulsed phase thermography, weak bond, composite, CFRP, computational modelling, optimization
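The core of PPT is to Fourier-transform each pixel's cooling transient and compare phases: heat trapped above a subsurface defect shifts the phase relative to a sound area. A minimal sketch of that phase extraction is below; the two decay curves are synthetic stand-ins, not simulated CFRP data.

```python
# Pulsed phase thermography (PPT) in one line of physics: Fourier-transform
# each pixel's cooling transient and compare phases; a subsurface defect
# shifts the phase relative to a sound area.  The two decay curves below
# are synthetic stand-ins, not simulated CFRP data.
import numpy as np

fs, n = 50.0, 256                      # sampling rate (Hz), number of frames
t = np.arange(1, n + 1) / fs
sound  = 1.0 / np.sqrt(t)              # semi-infinite body cooling ~ t^(-1/2)
defect = 1.0 / np.sqrt(t) + 0.15 * np.exp(-t / 0.8)  # extra heat trapped above a defect

def phase_at_bin(signal, k=1):
    """Phase (rad) of the k-th frequency bin of the signal's FFT."""
    return np.angle(np.fft.rfft(signal)[k])

contrast = phase_at_bin(defect) - phase_at_bin(sound)
print(f"phase contrast at bin 1: {contrast:.4f} rad")  # non-zero phase contrast flags the defect
```

In a full PPT model, the phase image at several frequency bins is inspected, since lower frequencies probe deeper below the surface.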
Procedia PDF Downloads 174
2525 Non-Linear Dynamic Analyses of Grouted Pile-Sleeve Connection
Authors: Mogens Saberi
Abstract:
The focus of this article is to present the experience gained from the design of a grouted pile-sleeve connection and to present simple design expressions which can be used in the preliminary design phase of such connections. The grouted pile-sleeve connection serves as a connection between an offshore jacket foundation and pre-installed piles located in the seabed. The jacket foundation supports a wind turbine generator, resulting in significant dynamic loads on the connection. The connection is designed with shear keys in order to optimize the overall design, but little experience is currently available in the use of shear keys in such connections. It is found that the consequence of introducing shear keys in the design is a very complex stress distribution which requires special attention due to significant fatigue loads. An optimal geometrical shape of the shear keys is introduced in order to avoid large stress concentration factors and to allow relatively easy fabrication. The connection is analysed in ANSYS Mechanical, where the grout is modelled by a non-linear material model which allows for cracking of the grout material and captures its elastic-plastic behaviour. Special types of finite elements are used in the interface between the pile sleeve and the grout material to model the slip surface between the grout and the steel. Based on the performed finite element modelling, simple design expressions are introduced.
Keywords: fatigue design, non-linear finite element modelling, structural dynamics, simple design expressions
Procedia PDF Downloads 384
2524 Structured-Ness and Contextual Retrieval Underlie Language Comprehension
Authors: Yao-Ying Lai, Maria Pinango, Ashwini Deo
Abstract:
While grammatical devices are essential to language processing, how comprehension utilizes cognitive mechanisms is less emphasized. This study addresses this issue by probing the complement coercion phenomenon: an entity-denoting complement following verbs like begin and finish receives an eventive interpretation. For example, (1) “The queen began the book” receives an agentive reading like (2) “The queen began [reading/writing/etc.…] the book.” Such sentences engender additional processing cost in real-time comprehension. The traditional account attributes this cost to an operation that coerces the entity-denoting complement to an event, assuming that these verbs require eventive complements. However, on closer examination, examples like “Chapter 1 began the book” undermine this assumption. An alternative, the Structured Individual (SI) hypothesis, proposes that the complement following aspectual verbs (AspV; e.g. begin, finish) is conceptualized as a structured individual, construed as an axis along various dimensions (e.g. spatial, eventive, temporal, informational). The composition of an animate subject and an AspV such as (1) engenders an ambiguity between an agentive reading along the eventive dimension like (2), and a constitutive reading along the informational/spatial dimension like (3) “[The story of the queen] began the book,” in which the subject is interpreted as a subpart of the complement denotation. Comprehenders need to resolve the ambiguity by searching contextual information, resulting in additional cost. To evaluate the SI hypothesis, a questionnaire was employed. Method: target AspV sentences such as “Shakespeare began the volume” were preceded by one of the following types of context sentence: (A) Agentive-biasing, in which an event was mentioned (…writers often read…), (C) Constitutive-biasing, in which a constitutive meaning was hinted at (Larry owns collections of Renaissance literature.), (N) Neutral context, which allowed both interpretations.
Thirty-nine native speakers of English were asked to (i) rate each context–target sentence pair on a 1–5 scale (5 = fully understandable), and (ii) choose possible interpretations for the target sentence given the context. The SI hypothesis predicts that comprehension is harder in the Neutral condition than in the biasing conditions because no contextual information is provided to resolve the ambiguity. Comprehenders should also obtain the specific interpretation corresponding to the context type. Results: the (A) Agentive-biasing and (C) Constitutive-biasing conditions were rated higher than the (N) Neutral condition (p < .001), while all conditions were within the acceptable range (> 3.5 on the 1–5 scale). This suggests that when relevant contextual information is lacking, semantic ambiguity decreases comprehensibility. The interpretation task shows that participants selected the biased agentive/constitutive reading in conditions (A) and (C), respectively. In the Neutral condition, the agentive and constitutive readings were chosen equally often. Conclusion: these findings support the SI hypothesis: the meaning of AspV sentences is conceptualized as a parthood relation involving structured individuals. We argue that semantic representation makes reference to spatial structured-ness (an abstracted axis). To obtain an appropriate interpretation, comprehenders utilize contextual information to enrich the conceptual representation of the sentence in question. This study connects semantic structure to humans’ conceptual structure and provides a processing model that incorporates contextual retrieval.
Keywords: ambiguity resolution, contextual retrieval, spatial structured-ness, structured individual
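The condition comparison above is a standard two-sample contrast on mean ratings. A minimal sketch of such a contrast (Welch's t statistic, stdlib only) is below; the rating lists are invented for illustration and are not the study's data.

```python
# A minimal Welch's t statistic for comparing mean comprehensibility
# ratings between a biasing condition and the Neutral condition.
# The rating lists are invented for illustration, not the study's data.
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

biasing = [5, 4, 5, 4, 5, 5, 4, 5, 4, 5]   # e.g. Agentive-biasing ratings
neutral = [4, 3, 4, 4, 3, 4, 3, 4, 4, 3]   # Neutral-context ratings
t = welch_t(biasing, neutral)
print(f"Welch t = {t:.2f}")  # a large positive t: biasing condition rated higher
```

A full analysis would also compute the Welch–Satterthwaite degrees of freedom and the corresponding p-value.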
Procedia PDF Downloads 333
2523 Using Building Information Modelling to Mitigate Risks Associated with Health and Safety in the Construction and Maintenance of Infrastructure Assets
Authors: Mohammed Muzafar, Darshan Ruikar
Abstract:
BIM, an acronym for Building Information Modelling, refers to the practice of creating a computer-generated model capable of displaying the planning, design, construction, and operation of a structure. The resulting simulation is a data-rich, object-oriented, intelligent, and parametric digital representation of the facility, from which views and data appropriate to various users' needs can be extracted and analysed to generate information that can be used to make decisions and to improve the process of delivering the facility. BIM also refers to a shift in culture that will influence the way the built environment and infrastructure operate and how they are delivered. One of the main issues of concern in the UK construction industry at present is its record on Health and Safety (H&S). It is, therefore, important that new technologies such as BIM are developed to help improve the quality of health and safety. Historically, the H&S record of the construction industry in the UK is relatively poor compared to the manufacturing industries. BIM and the digital environment it operates within now allow design and construction data to be used in a more intelligent way. It allows data generated by the design process to be re-purposed to improve efficiencies in other areas of a project. This evolutionary step in design is creating exciting opportunities not only for the designers themselves but for every stakeholder in any given project. From designers, engineers, and contractors through to H&S managers, BIM is accelerating a cultural change. The paper introduces the concept behind a research project that mitigates the H&S risks associated with the construction, operation, and maintenance of assets through the adoption of BIM.
Keywords: building information modeling, BIM levels, health, safety, integration
Procedia PDF Downloads 251
2522 Simulation of the Visco-Elasto-Plastic Deformation Behaviour of Short Glass Fibre Reinforced Polyphthalamides
Authors: V. Keim, J. Spachtholz, J. Hammer
Abstract:
The importance of fibre reinforced plastics continually increases due to their excellent mechanical properties and low material and manufacturing costs, combined with significant weight reduction. Today, components are usually designed and assessed numerically using finite element methods (FEM) to avoid expensive laboratory tests. These programs are based on material models that include material-specific deformation characteristics. In this research project, material models for short glass fibre reinforced plastics are presented to simulate the visco-elasto-plastic deformation behaviour. Prior to modelling, specimens of the material EMS Grivory HTV-5H1, consisting of a polyphthalamide matrix reinforced by 50 wt.% short glass fibres, are characterized experimentally in terms of the highly time-dependent deformation behaviour of the matrix material. To minimize the experimental effort, the cyclic deformation behaviour under tensile and compressive loading (R = −1) is characterized by isothermal complex low cycle fatigue (CLCF) tests. By combining, in one experiment, cycles at two strain amplitudes and strain rates spanning three orders of magnitude together with relaxation intervals, the visco-elastic deformation is characterized. To identify the visco-plastic deformation, monotonic tensile tests, either displacement controlled or strain controlled (CERT), are compared. All relevant modelling parameters for this complex superposition of simultaneously varying mechanical loadings are quantified by these experiments. Subsequently, two different material models are compared with respect to their accuracy in describing the visco-elasto-plastic deformation behaviour. First, an extended 12-parameter model based on Chaboche (EVP-KV2) is used to model cyclic visco-elasto-plasticity at two time scales.
The parameters of the model, which includes a total separation of elastic and plastic deformation, are obtained by computational optimization using a genetic algorithm, an evolutionary algorithm driven by a fitness function. Second, the 12-parameter visco-elasto-plastic material model by Launay is used. This model contains a different type of flow function, based on the definition of the visco-plastic deformation as a part of the overall deformation. The accuracy of the models is verified against the corresponding experimental LCF tests.
Keywords: complex low cycle fatigue, material modelling, short glass fibre reinforced polyphthalamides, visco-elasto-plastic deformation
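The genetic-algorithm identification step can be illustrated on a toy problem: recovering the relaxation time of a single-term visco-elastic decay from synthetic test data. This sketch only mirrors the fitness-function optimization idea; it is not the 12-parameter EVP-KV2 model, and all constants are invented.

```python
# Toy genetic-algorithm parameter identification: recover the relaxation
# time tau of a single-term decay E(t) = E0 * exp(-t / tau) from synthetic
# "test data".  This mirrors only the fitness-driven evolutionary search;
# it is NOT the 12-parameter EVP-KV2 model from the paper.
import math
import random

random.seed(1)
E0, TAU_TRUE = 1000.0, 2.5
times = [0.1 * i for i in range(1, 60)]
data = [E0 * math.exp(-t / TAU_TRUE) for t in times]

def fitness(tau):
    """Negative squared error between model and data (higher is better)."""
    return -sum((E0 * math.exp(-t / tau) - d) ** 2 for t, d in zip(times, data))

# Minimal GA: random population, keep the best half, mutate survivors to refill.
pop = [random.uniform(0.5, 10.0) for _ in range(20)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]                                   # elitist selection
    pop = survivors + [max(0.05, s + random.gauss(0, 0.1)) for s in survivors]

tau_est = max(pop, key=fitness)
print(f"estimated tau = {tau_est:.2f} (true {TAU_TRUE})")
```

A real identification run would evolve all model parameters simultaneously and use crossover in addition to mutation.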
Procedia PDF Downloads 215
2521 Family of Density Curves of Queensland Soils from Compaction Tests, on a 3D Z-Plane Function of Moisture Content, Saturation, and Air-Void Ratio
Authors: Habib Alehossein, M. S. K. Fernando
Abstract:
Soil density depends on the volume of the voids and the proportion of water and air in the voids. However, there is a limit to the contraction of the voids at any given compaction energy, beyond which additional water is used to reduce the void volume further by lubricating the particles' frictional contacts. Hence, at an optimum moisture content and a specific compaction energy, the density of unsaturated soil can be maximized where the void volume is at a minimum. However, when considering a full compaction curve and the permutations and variations of all these components (soil, air, water, and energy), laboratory soil compaction tests can become expensive, time-consuming, and exhausting. Analytical methods constructed from a few test data can therefore be developed and used to reduce such effort significantly. Concentrating on compaction test results, this study discusses the analytical modelling method developed for some fine-grained and coarse-grained soils of Queensland. Soil properties and characteristics, such as full functional compaction curves under various compaction energy conditions, were studied and developed for a few soil types. Using MATLAB, several generic analytical codes were created for this study, covering all possible compaction parameters and results as they occur in a soil mechanics laboratory. These MATLAB codes produce a family of curves that describe the relationships between density, moisture content, void ratio, saturation, and compaction energy.
Keywords: analytical, MATLAB, modelling, compaction curve, void ratio, saturation, moisture content
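The family of curves rests on two standard phase relations: dry density follows from the void ratio, ρd = Gs·ρw/(1 + e), and the saturation line links void ratio to moisture content via S·e = w·Gs. A minimal sketch is below (in Python rather than the study's MATLAB); the constant-saturation curves it produces include the zero-air-void line at S = 1, and the numeric values are generic, not Queensland test data.

```python
# The phase relations behind a family of compaction curves: dry density
# rho_d = Gs * rho_w / (1 + e), and the saturation relation S*e = w*Gs.
# Constant-S curves (e.g. the zero-air-void line, S = 1) follow directly.
# Values are generic illustrations, not Queensland test data.
RHO_W = 1.0  # density of water, Mg/m^3

def dry_density(gs, e):
    """Dry density (Mg/m^3) from specific gravity Gs and void ratio e."""
    return gs * RHO_W / (1.0 + e)

def void_ratio(gs, w, s):
    """Void ratio from moisture content w (decimal) and saturation s (decimal)."""
    return w * gs / s

def density_curve(gs, s, w_values):
    """Dry densities along a constant-saturation curve."""
    return [dry_density(gs, void_ratio(gs, w, s)) for w in w_values]

ws = [0.06, 0.08, 0.10, 0.12, 0.14]
zav = density_curve(gs=2.65, s=1.0, w_values=ws)   # zero-air-void line
print([round(r, 3) for r in zav])
```

Sweeping s over, say, 0.7 to 1.0 produces the whole family against which measured compaction points can be plotted.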
Procedia PDF Downloads 90
2520 Optimum Design of Hybrid (Metal-Composite) Mechanical Power Transmission System under Uncertainty by Convex Modelling
Authors: Sfiso Radebe
Abstract:
Design models dealing with flawless composite structures are in abundance, and in them the mechanical properties of the composite are assumed to be known a priori. However, if the worst-case scenario is assumed, in which material defects combine with processing anomalies in the composite structure, a different solution is obtained. Furthermore, if the system being designed combines hybrid elements in series, each individually affected by variations in material constants, a different approach must be taken. The body of literature contains a compendium of research investigating the different failure modes affecting hybrid metal-composite structures, covering the failure of hybrid joints, structural deformation, transverse displacement, and the suppression of vibration and noise. In the present study, a system employing a combination of two or more hybrid power-transmitting elements is explored for the least favourable dynamic loads as well as for weight minimization, subject to uncertain material properties. Elastic constants are assumed to be uncertain-but-bounded quantities varying slightly around their nominal values, and the solution is determined using convex models of uncertainty. Convex analysis of the problem leads to the computation of the least favourable solution and ultimately to a robust design. This approach contrasts with a deterministic analysis, in which the average values of the elastic constants are employed in the calculations, neglecting the variations in material properties.
Keywords: convex modelling, hybrid, metal-composite, robust design
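The uncertain-but-bounded idea can be sketched as an anti-optimization over an interval box: for a monotone response, the least favourable value sits at a vertex of the box of bounded elastic constants. The cantilever tip deflection δ = PL³/(3EI) below is a stand-in response function, and all numbers are illustrative, not from the paper.

```python
# Anti-optimization sketch in the spirit of convex (interval) modelling:
# with the elastic modulus E and second moment I known only to lie in
# bounded intervals around their nominal values, the least favourable
# response is found by searching the vertices of the uncertainty box.
# The cantilever deflection d = P*L^3 / (3*E*I) is a stand-in response.
from itertools import product

def tip_deflection(p, length, e, i):
    """Cantilever tip deflection under end load P."""
    return p * length**3 / (3.0 * e * i)

def worst_case(p, length, e_nom, i_nom, rel_unc):
    """Max deflection over all vertices of the [nom*(1-u), nom*(1+u)] box."""
    e_bounds = (e_nom * (1 - rel_unc), e_nom * (1 + rel_unc))
    i_bounds = (i_nom * (1 - rel_unc), i_nom * (1 + rel_unc))
    return max(tip_deflection(p, length, e, i)
               for e, i in product(e_bounds, i_bounds))

nominal = tip_deflection(1000.0, 1.2, 70e9, 8e-8)
worst = worst_case(1000.0, 1.2, 70e9, 8e-8, rel_unc=0.05)
print(f"nominal {nominal*1e3:.2f} mm, worst case {worst*1e3:.2f} mm")
```

For non-monotone responses or ellipsoidal bound sets, the extremum is found by constrained optimization rather than vertex enumeration; the vertex search here is the simplest convex-model special case.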
Procedia PDF Downloads 211
2519 Modelling Agricultural Commodity Price Volatility with Markov-Switching Regression, Single Regime GARCH and Markov-Switching GARCH Models: Empirical Evidence from South Africa
Authors: Yegnanew A. Shiferaw
Abstract:
Background: commodity price volatility originating from excessive commodity price fluctuation has been a global problem, especially after the recent financial crises. Volatility is a measure of risk or uncertainty in financial analysis. It plays a vital role in risk management, portfolio management, and equity pricing. Objectives: the core objective of this paper is to examine the relationship between the prices of agricultural commodities and the oil price, gas price, coal price, and exchange rate (USD/Rand). In addition, the paper fits an appropriate model that best describes the log-return price volatility and estimates Value-at-Risk and expected shortfall. Data and methods: the data used in this study are the daily returns of agricultural commodity prices from 2 January 2007 to 31 October 2016, namely white maize, yellow maize, wheat, sunflower, soya, corn, and sorghum. The paper applies the three-state Markov-switching (MS) regression, the standard single-regime GARCH, and the two-regime Markov-switching GARCH (MS-GARCH) models. Results: to choose the best-fitting model, the log-likelihood function, Akaike information criterion (AIC), Bayesian information criterion (BIC), and deviance information criterion (DIC) are employed under three distributions for the innovations. The results indicate that: (i) the prices of agricultural commodities were found to be significantly associated with the price of coal, the price of natural gas, the price of oil, and the exchange rate; (ii) for all agricultural commodities except sunflower, k=3 had higher log-likelihood values and lower AIC and BIC values.
Thus, the three-state MS regression model outperformed the two-state MS regression model; (iii) MS-GARCH(1,1) with generalized error distribution (ged) innovations performs best for white maize and yellow maize; MS-GARCH(1,1) with Student-t distribution (std) innovations performs better for sorghum; MS-gjrGARCH(1,1) with ged innovations performs better for wheat, sunflower, and soya; and MS-GARCH(1,1) with std innovations performs better for corn. In conclusion, this paper provides a practical guide to modelling agricultural commodity prices with MS regression and MS-GARCH processes, and it can serve as a reference when facing such modelling problems.
Keywords: commodity prices, MS-GARCH model, MS regression model, South Africa, volatility
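The model selection described above follows standard information-criterion formulas: AIC = 2k − 2 ln L and BIC = k ln n − 2 ln L, with the lowest value preferred. A minimal sketch is below; the log-likelihoods and parameter counts are invented placeholders, not the paper's estimates.

```python
# Model selection with information criteria, as used to compare the MS and
# MS-GARCH fits: AIC = 2k - 2*lnL, BIC = k*ln(n) - 2*lnL; the model with
# the lowest value wins.  The log-likelihoods below are invented
# placeholders, not the paper's estimates.
from math import log

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    return k * log(n) - 2 * log_lik

n = 2450  # roughly ten years of daily returns (hypothetical sample size)
candidates = {
    "2-state MS": {"log_lik": 5210.0, "k": 8},
    "3-state MS": {"log_lik": 5248.0, "k": 15},
}
best_bic = min(candidates,
               key=lambda m: bic(candidates[m]["log_lik"], candidates[m]["k"], n))
print("preferred by BIC:", best_bic)
```

Note how BIC's k·ln(n) term penalizes the extra regime more heavily than AIC's 2k; with these placeholder likelihoods the gain still justifies the third state.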
Procedia PDF Downloads 202
2518 The Long-Term Impact of Health Conditions on Social Mobility Outcomes: A Modelling Study
Authors: Lise Retat, Maria Carmen Huerta, Laura Webber, Franco Sassi
Abstract:
Background: Intra-generational social mobility (ISM) can be defined as the extent to which individuals change their socio-economic position over a period of time or during their entire life course. The relationship between poor health and ISM is well established. Therefore, quantifying the impact that potential health policies have on ISM, now and into the future, would provide evidence for how social inequality could be reduced. This paper takes overweight and obesity as an example condition and estimates the mean change in earnings per individual if the UK were to introduce policies to effectively reduce overweight and obesity. Methods: the HealthLumen (HL) individual-based model was used to estimate the impact of obesity on social mobility measures, such as earnings, occupation, and wealth. The HL tool models each individual's probability of experiencing downward ISM as a result of their overweight and obesity status. One outcome of interest, for example, was the cumulative mean change in earnings per person of implementing a policy that would reduce adult overweight and obesity by 1% each year between 2020 and 2030 in the UK. Results: preliminary analysis showed that by reducing adult overweight and obesity by 1% each year between 2020 and 2030, the cumulative additional mean earnings would be approximately 1,000 Euro per adult by 2030. Additional analysis will include other social mobility indicators. Conclusions: these projections are important for illustrating the role of health in social mobility and for providing evidence for how health policy can make a difference to social mobility outcomes and, in turn, help to reduce inequality.
Keywords: modelling, social mobility, obesity, health
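The shape of such a cumulative-earnings projection can be sketched with back-of-envelope arithmetic: each policy year removes a slice of prevalence, and every percentage point avoided contributes a small annual earnings gain thereafter. The per-point gain below is a hypothetical placeholder, not a HealthLumen output; only the 1%-per-year, 2020-2030 policy design comes from the text.

```python
# Back-of-envelope sketch of how a prevalence-reduction policy maps to a
# cumulative mean earnings change: each year the policy removes 1 pp of
# adult overweight/obesity prevalence, and each point avoided yields a
# small annual earnings gain.  The 18 EUR/pp/year figure is a hypothetical
# placeholder, NOT a HealthLumen estimate.
def cumulative_gain(years, annual_cut_pp, gain_per_pp_year):
    """Cumulative mean earnings gain per adult over the policy horizon."""
    total, removed_pp = 0.0, 0.0
    for _ in range(years):
        removed_pp += annual_cut_pp          # prevalence points removed so far
        total += removed_pp * gain_per_pp_year
    return total

# 10 policy years (2020-2030), 1 pp cut per year:
print(f"~{cumulative_gain(10, 1.0, 18.0):.0f} EUR per adult")
```

An individual-based model replaces the fixed per-point gain with simulated transition probabilities per person, but the accumulation logic is the same.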
Procedia PDF Downloads 122
2517 Finite Difference Modelling of Temperature Distribution around Fire Generated Heat Source in an Enclosure
Authors: A. A. Dare, E. U. Iniegbedion
Abstract:
Industrial furnaces generally involve enclosures of fire, typically initiated by the combustion of gases. The fire leads to a temperature distribution inside the enclosure. A proper understanding of the temperature and velocity distributions within the enclosure is often required for optimal design and use of the furnace. This study was therefore directed at the numerical modelling of the temperature distribution inside an enclosure, as is typical of a furnace. A mathematical model was developed from the conservation of mass, momentum, and energy. The stream function-vorticity formulation of the governing equations was solved by an alternating direction implicit (ADI) finite difference technique. The finite difference formulation obtained was then developed into a computer code (a MATLAB program), which was used to determine the temperature, velocities, stream function, and vorticity. The effect of wall heat conduction was also considered by assuming one-dimensional heat flow through the wall. The results obtained showed that the transient temperature distribution assumed a uniform profile which becomes more chaotic with increasing time. The vertical velocity showed increasingly turbulent behaviour with time, while the horizontal velocity assumed decreasingly laminar behaviour with time. All of these behaviours are consistent with those reported in the literature. The developed model has provided an understanding of the heat transfer process in an industrial furnace.
Keywords: heat source, modelling, enclosure, furnace
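The time-marching idea behind such finite difference solvers can be shown on a deliberately simplified relative: explicit 1D transient heat conduction between two walls held at fixed temperatures. (The paper uses an implicit ADI scheme on the 2D stream function-vorticity equations; this sketch only illustrates the discretized diffusion update.)

```python
# Simplified relative of the paper's solver: explicit finite differences
# for 1D transient heat conduction, T_i(new) = T_i + R*(T_{i+1} - 2T_i + T_{i-1}),
# with fixed wall temperatures.  R = alpha*dt/dx^2 must satisfy R <= 0.5
# for stability of this explicit scheme (ADI schemes remove that limit).
N, STEPS, R = 21, 2000, 0.25          # grid nodes, time steps, diffusion number

T = [0.0] * N
T[0], T[-1] = 0.0, 1.0                # cold and hot walls (nondimensional)

for _ in range(STEPS):
    T = ([T[0]]
         + [T[i] + R * (T[i+1] - 2*T[i] + T[i-1]) for i in range(1, N-1)]
         + [T[-1]])

# After enough steps the profile approaches the linear steady state.
print(f"midpoint temperature: {T[N//2]:.4f}")  # ~0.5 for a linear profile
```

The ADI technique replaces this explicit update with two implicit half-steps (one per spatial direction), each requiring only a tridiagonal solve, which is why it remains stable at much larger time steps.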
Procedia PDF Downloads 255
2516 Development of a Comprehensive Energy Model for Canada
Authors: Matthew B. Davis, Amit Kumar
Abstract:
With the potentially dangerous impacts of climate change on the horizon, Canada has an opportunity to take a lead role on the international stage in demonstrating how energy use intensity and greenhouse gas emission intensity may be effectively reduced. Through bottom-up modelling of Canada's energy sector using the Long-range Energy Alternatives Planning (LEAP) software, it can be determined where efforts should be concentrated to produce the most positive energy management results. By analyzing a provincially integrated Canada, one can develop strategies that minimize economic harm while transitioning to lower-emission energy technologies. Canada's electricity sector plays an important role in accommodating these transitional technologies, as fossil-fuel based power production is prevalent in many parts of the country and is responsible for a large portion (17%) of Canada's greenhouse gas emissions. Current findings incorporate an in-depth model of Canada's current energy supply and demand sectors, as well as a business-as-usual scenario up to the year 2035. This allows for in-depth analysis of energy flow from resource potential, to extraction, to fuel and electricity production, to energy end use and emissions in Canada's residential, transportation, commercial, institutional, industrial, and agricultural sectors. Bottom-up modelling techniques such as these are useful for critically analyzing and comparing the various possible scenarios for implementing sustainable energy measures. This work can aid government in creating effective energy and environmental policies, as well as guide industry on which technology or process changes would be most worthwhile to pursue.
Keywords: energy management, LEAP, energy end-use, GHG emissions
Procedia PDF Downloads 301
2515 Patriarchy and Clearance Rates of Sexual Victimization: A Multilevel Analysis
Authors: Margaret Schmuhl, Michelle Cubellis
Abstract:
Violence against women (VAW) is a widespread social problem affecting nearly two million women in the United States each year. Recently, feminist criminologists have sought to examine patriarchy as a guiding framework for understanding violence against women. The literature on VAW often examines measures of structural gender equality while overlooking ideological patriarchy, which is necessary for structural inequality to remain unchallenged. Additionally, the empirical literature generally focuses on extreme forms of VAW, rape and femicide, often neglecting more common types of violence. This literature, under the theoretical guidance of the Liberal, Radical, and Marxist feminist traditions, finds mixed support for the relationship between patriarchy and VAW. Explanations for these inconsistencies may include data availability and the use of different operationalizations of structural patriarchy. Research is needed that examines fuller operationalizations of patriarchy in social institutions and extends this theoretical framework to the criminal justice response to VAW (i.e., clearance rates). This study examines sexual violence clearance rates under the theoretical guidance of these feminist traditions, using incident- and county-level data from the National Incident-Based Reporting System and other sources in multilevel modelling. The findings suggest mixed support for the feminist hypotheses and indicate that patriarchy and gender equality differentially affect arrest clearance rates and clearance through exceptional means for sexual violence.
Keywords: clearance rates, gender equality, multilevel modelling, patriarchy, sexual victimization, violence against women
Procedia PDF Downloads 183
2514 Improving Trainings of Mineral Processing Operators Through Gamification and Modelling and Simulation
Authors: Pedro A. S. Bergamo, Emilia S. Streng, Jan Rosenkranz, Yousef Ghorbani
Abstract:
Within the often-hazardous mineral industry, simulation training has rapidly gained appreciation as an important method of increasing site safety and productivity through enhanced operator skill and knowledge. Performance calculations related to froth flotation, one of the most important concentration methods, are probably the hardest topic taught during the training of plant operators. Currently, most training programmes teach those skills by traditional methods such as slide presentations and hand-written exercises, with a heavy focus on memorization. To improve on these trainings, we developed "MinFloat", which teaches the operating formulas of the froth flotation process with the help of gamification. The simulation core, based on a first-principles flotation model, was implemented in Unity3D, and an instructor tutoring system was developed which presents didactic content and reviews the selected answers. The game was tested by 25 professionals with extensive experience in the mining industry, based on a questionnaire formulated for training evaluations. According to their feedback, the game scored well in terms of quality, didactic efficacy, and inspiring character. The testers' feedback on the main target audience and the outlook for the presented solution are discussed. This paper aims to provide technical background on the construction of educational games for the mining industry, besides showing how feedback from experts can be gathered more efficiently thanks to new technologies such as online forms.
Keywords: training evaluation, simulation-based training, modelling and simulation, froth flotation
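A staple of the operator calculations such a flotation trainer must teach is first-order flotation kinetics, R(t) = R∞·(1 − e^(−kt)). The sketch below uses generic textbook-style constants; it is not MinFloat's internal model, whose details are not given here.

```python
# First-order flotation kinetics, a standard operator calculation:
# cumulative recovery R(t) = R_inf * (1 - exp(-k*t)), where R_inf is the
# ultimate recovery and k the flotation rate constant.  The constants are
# generic textbook-style values, NOT MinFloat's internals.
import math

def recovery(t_min, r_inf=0.90, k=0.8):
    """Cumulative recovery (fraction) after t_min minutes of flotation."""
    return r_inf * (1.0 - math.exp(-k * t_min))

for t in (1, 2, 5, 10):
    print(f"t = {t:2d} min -> R = {recovery(t):.3f}")
```

Recovery rises steeply at first and saturates toward R∞, which is exactly the intuition (diminishing returns of longer residence time) a trainee operator needs to internalize.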
Procedia PDF Downloads 1132513 Modelling High Strain Rate Tear Open Behavior of a Bilaminate Consisting of Foam and Plastic Skin Considering Tensile Failure and Compression
Authors: Laura Pytel, Georg Baumann, Gregor Gstrein, Corina Klug
Abstract:
Premium cars often coat the instrument panels with a bilaminate consisting of a soft foam and a plastic skin. The coating is torn open during passenger airbag deployment under high strain rates. Characterizing and simulating the top coat layer is crucial for predicting the attenuation that delays the airbag deployment, affecting the design of the restraint system, and for reducing the need to adjust simulations through expensive physical component testing. Up to now, bilaminates used within cars have either been modelled with a two-dimensional shell formulation treating the whole coating system as a single layer, which misses the interaction of the two layers, or by combining a three-dimensional foam layer with a two-dimensional skin layer while omitting the foam in significant regions such as the expected tear line and the hinge, where high compression is expected. In both cases, the properties of the coating causing the attenuation are not considered. Further, at present, the available material information, such as the failure dependencies of the two layers, as well as data at strain rates of up to 200 1/s, is insufficient. The velocity of the passenger airbag flap during an airbag shot was measured at about 11.5 m/s at first ripping; digital image correlation evaluation showed resulting strain rates above 1500 1/s. This paper provides a high strain rate material characterization of a bilaminate consisting of a thin polypropylene foam and a thermoplastic olefin (TPO) skin and the creation of validated material models. With the help of a split Hopkinson tension bar, strain rates of 1500 1/s were achieved. The experimental data were used to calibrate and validate a more physical modelling approach to the forced ripping of the bilaminate. In the presented model, the three-dimensional foam layer is continuously tied to the two-dimensional skin layer, allowing failure in both layers at any possible position.
The simulation results show good agreement in terms of the trajectory of the flaps and their velocity during ripping. The resulting attenuation of the airbag deployment, measured by the contact force between airbag and flaps, increases and provides usable data for dimensioning modules of an airbag system. Keywords: bilaminate ripping behavior, high strain rate material characterization and modelling, induced material failure, TPO and foam
Procedia PDF Downloads 692512 Application of a Model-Free Artificial Neural Networks Approach for Structural Health Monitoring of the Old Lidingö Bridge
Authors: Ana Neves, John Leander, Ignacio Gonzalez, Raid Karoumi
Abstract:
Systematic monitoring and inspection are needed to assess the present state of a structure and predict its future condition. If an irregularity is noticed, repair actions may take place, and an adequate intervention will most probably reduce future maintenance costs, minimize downtime and increase safety by avoiding the failure of the structure as a whole or of one of its structural parts. For this to be possible, decisions must be made at the right time, which implies using systems that can detect abnormalities at an early stage. In this sense, Structural Health Monitoring (SHM) is seen as an effective tool for improving the safety and reliability of infrastructures. This paper explores the decision-making problem in SHM regarding the maintenance of civil engineering structures. The aim is to assess the present condition of a bridge based exclusively on measurements, using the method suggested in this paper, such that action is taken coherently with the information made available by the monitoring system. Artificial Neural Networks are trained, and their ability to predict structural behavior is evaluated in the light of a case study where acceleration measurements are acquired from a bridge located in Stockholm, Sweden. This relatively old bridge is still in operation despite exhibiting obvious problems already reported in previous inspections. The prediction errors provide a measure of the accuracy of the algorithm and are subjected to further investigation, which comprises concepts like clustering analysis and statistical hypothesis testing. These make it possible to interpret the obtained prediction errors, draw conclusions about the state of the structure and thus support decision making regarding its maintenance. Keywords: artificial neural networks, clustering analysis, model-free damage detection, statistical hypothesis testing, structural health monitoring
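The hypothesis-testing step described above — checking whether the prediction errors of the trained network have drifted away from their baseline distribution — can be sketched with a simple two-sample test. The error values and the choice of Welch's t-statistic below are illustrative assumptions, not the authors' exact procedure:

```python
import math

def welch_t(a, b):
    """Welch's t-statistic for two samples with unequal variances."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (mb - ma) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical ANN prediction errors: baseline (healthy state) vs. a
# later monitoring period on the same bridge.
baseline = [0.010, 0.012, 0.009, 0.011, 0.010, 0.013, 0.008, 0.011]
current = [0.021, 0.019, 0.024, 0.020, 0.022, 0.023, 0.018, 0.025]

t = welch_t(baseline, current)
# A large positive t suggests the error distribution has shifted,
# i.e. the trained model no longer describes the measured behaviour.
print(round(t, 1))
```

In practice the statistic would be compared against a critical value (or a p-value computed from the t-distribution) before flagging the structure for inspection.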
Procedia PDF Downloads 2082511 Assessing and Managing the Risk of Inland Acid Sulfate Soil Drainage via Column Leach Tests and 1D Modelling: A Case Study from South East Australia
Authors: Nicolaas Unland, John Webb
Abstract:
The acidification and mobilisation of metals during the oxidation of acid sulfate soils exposed during lake bed drying is an increasingly common phenomenon under climate scenarios with reduced rainfall. In order to assess the risk of generating high concentrations of acidity and dissolved metals, chromium suite analyses are fundamental but sometimes limited in characterising the potential risks these soils pose. This study combines such fundamental test work with incubation tests and 1D modelling to investigate the risks associated with the drying of Third Reedy Lake in South East Australia. Core samples were collected from depths of up to 0.5 m below the lake bed at 19 locations across the lake’s footprint, using a boat platform. Samples were subjected to a chromium suite of analyses, including titratable actual acidity, chromium reducible sulfur and acid neutralising capacity. Concentrations of reduced sulfur up to 0.08 %S and net acidities up to 0.15 %S indicate that acid sulfate soils have formed on the lake bed during permanent inundation over the last century. A further subset of samples was prepared in 7 columns and subjected to accelerated heating, drying and wetting over a period of 64 days in the laboratory. Results from the incubation trial indicate that while pyrite oxidation proceeded, minimal change to soil pH or leachate acidity occurred, suggesting that the internal buffering capacity of lake bed sediments was sufficient to neutralise a large proportion of the acidity produced. A 1D mass balance model was developed to assess potential changes in lake water quality during drying, based on the results of the chromium suite and incubation tests. Results from the above test work and modelling suggest that acid sulfate soils pose a moderate to low risk to the Third Reedy Lake system. Further, the risks can be effectively managed during the initial stages of lake drying via flushing with available mildly alkaline water.
The study finds that while test work such as chromium suite analysis is fundamental in characterizing acid sulfate soil environments, it can overestimate the risks associated with the soils. Subsequent incubation test work may characterise such soils more accurately and lead to better-informed management strategies. Keywords: acid sulfate soil, incubation, management, model, risk
Procedia PDF Downloads 3582510 Effect of Co-Infection With Intestinal Parasites on COVID-19 Severity: A Prospective Observational Cohort Study
Authors: Teklay Gebrecherkos, Dawit Wolday, Muhamud Abdulkader
Abstract:
Background: COVID-19 symptomatology in Africa appears significantly less serious than in the industrialized world. Our hypothesis for this phenomenon is that a different, more activated immune system due to parasite infections contributes to a milder COVID-19 outcome. We investigated this hypothesis in an endemic area in sub-Saharan Africa. Methods: Ethiopian COVID-19 patients were enrolled and screened for intestinal parasites between July 2020 and March 2021. The primary outcome was the proportion of patients with severe COVID-19. SARS-CoV-2 infection was confirmed by RT-PCR on samples obtained from nasopharyngeal swabs, while direct microscopic examination, the modified Ritchie concentration method, and the Kato-Katz method were used to identify parasites and ova from a fresh stool sample. Ordinal logistic regression models were used to estimate the association between parasite infection and COVID-19 severity. Models were adjusted for sex, age, residence, education level, occupation, body mass index, and comorbidities. Data were analyzed using STATA version 14. A p-value <0.05 was considered statistically significant. Results: A total of 751 SARS-CoV-2 infected patients were enrolled, of whom 284 (37.8%) had an intestinal parasitic infection. Only 27/255 (10.6%) severe COVID-19 patients were co-infected with intestinal parasites, while 257/496 (51.8%) non-severe COVID-19 patients were parasite positive (p<0.0001). Patients co-infected with parasites had lower odds of developing severe COVID-19, with an adjusted odds ratio (AOR) of 0.14 (95% CI 0.09–0.24; p<0.0001) for all parasites, AOR 0.20 (95% CI 0.11–0.38; p<0.0001) for protozoa, and AOR 0.13 (95% CI 0.07–0.26; p<0.0001) for helminths. When stratified by species, co-infection with Entamoeba spp., Hymenolepis nana, and Schistosoma mansoni implied a lower probability of developing severe COVID-19. There were 11 deaths (1.5%), all among patients without parasites (p=0.009).
Conclusions: Parasite co-infection is associated with a reduced risk of severe COVID-19 in African patients. Parasite-driven immunomodulatory responses may mute the hyper-inflammation associated with severe COVID-19. Keywords: COVID-19, SARS-CoV-2, intestinal parasite, RT-PCR, co-infection
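As a sanity check, the counts quoted in the abstract yield a crude (unadjusted) odds ratio pointing in the same protective direction. Note that the paper's headline figure (AOR 0.14) comes from ordinal logistic regression adjusted for covariates, so this back-of-the-envelope value is not expected to match it exactly:

```python
# Counts from the abstract: 27/255 severe and 257/496 non-severe
# patients were parasite-positive.
severe_with, severe_without = 27, 255 - 27          # 27, 228
nonsevere_with, nonsevere_without = 257, 496 - 257  # 257, 239

# Crude odds ratio: odds of parasite co-infection among severe cases
# divided by the odds among non-severe cases.
odds_severe = severe_with / severe_without
odds_nonsevere = nonsevere_with / nonsevere_without
crude_or = odds_severe / odds_nonsevere
print(round(crude_or, 2))  # → 0.11
```

A crude OR of about 0.11 versus the adjusted 0.14 suggests the covariate adjustment moderated, but did not remove, the protective association.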
Procedia PDF Downloads 602509 4D Modelling of Low Visibility Underwater Archaeological Excavations Using Multi-Source Photogrammetry in the Bulgarian Black Sea
Authors: Rodrigo Pacheco-Ruiz, Jonathan Adams, Felix Pedrotti
Abstract:
This paper introduces the applicability of underwater photogrammetric survey within challenging conditions as the main tool to enhance and enrich the process of documenting archaeological excavation through the creation of 4D models. Photogrammetry has been attempted on underwater archaeological sites since at least the 1970s, and today the production of traditional 3D models is becoming common practice within the discipline. Underwater photogrammetry is more often implemented to record exposed underwater archaeological remains and less so as a dynamic interpretative tool. It therefore tends to be applied in bright environments when underwater visibility is > 1 m, limiting its implementation on the many submerged archaeological sites in more turbid conditions. Recent years have seen significant development of better digital photographic sensors and improvements in optical technology, ideal for darker environments. Such developments, in tandem with powerful computing systems for processing, have allowed underwater photogrammetry to be used by this research as a standard recording and interpretative tool. Using multi-source photogrammetry (five GoPro HERO5 Black cameras), this paper presents the accumulation of daily (4D) underwater surveys carried out at the Early Bronze Age (3,300 BC) to Late Ottoman (17th century AD) archaeological site of Ropotamo in the Bulgarian Black Sea under challenging conditions (< 0.5 m visibility). It demonstrates that underwater photogrammetry can and should be used as one of the main recording methods, even in low light and poor underwater conditions, as a way to better understand the complexity of the underwater archaeological record. Keywords: 4D modelling, Black Sea Maritime Archaeology Project, multi-source photogrammetry, low visibility underwater survey
Procedia PDF Downloads 2362508 Hysteresis Modeling in Iron-Dominated Magnets Based on a Deep Neural Network Approach
Authors: Maria Amodeo, Pasquale Arpaia, Marco Buzio, Vincenzo Di Capua, Francesco Donnarumma
Abstract:
Different deep neural network architectures have been compared and tested to predict magnetic hysteresis in the context of pulsed electromagnets for experimental physics applications. Modelling quasi-static or dynamic major and, especially, minor hysteresis loops is one of the most challenging topics in computational magnetism. Recent attempts at mathematical prediction in this context using Preisach models could not attain better than percent-level accuracy. Hence, this work explores neural network approaches and shows that the architecture that best fits the measured magnetic field behaviour, including the effects of hysteresis and eddy currents, is the nonlinear autoregressive exogenous (NARX) neural network model. This architecture aims to achieve a relative RMSE of the order of a few hundred ppm for complex magnetic field cycling, including arbitrary sequences of pseudo-random high-field and low-field cycles. The NARX-based architecture is compared with the state of the art, showing better performance than the classical operator-based and differential models, and is tested on a reference quadrupole magnetic lens used for CERN particle beams, chosen as a case study. The training and test datasets are a representative example of real-world magnet operation; this makes the good results obtained very promising for future applications in this context. Keywords: deep neural network, magnetic modelling, measurement and empirical software engineering, NARX
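A NARX model predicts the next output from lagged outputs and lagged exogenous inputs. The sketch below keeps that regressor structure but substitutes a linear least-squares readout for the deep network, on a toy first-order system standing in for the magnet; the dynamics and coefficients are invented for illustration:

```python
import numpy as np

# Toy system standing in for the magnet: field y driven by excitation
# current u with one-step memory, y[t] = 0.5*y[t-1] + 0.3*u[t-1].
# A NARX network learns exactly this kind of lagged mapping, with a
# neural net in place of the linear fit used here.
T = 200
u = np.sin(0.1 * np.arange(T))
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.3 * u[t - 1]

# Build the NARX regressor matrix [y[t-1], u[t-1]] and fit the readout.
X = np.column_stack([y[:-1], u[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(np.round(coef, 3))  # recovers [0.5, 0.3] on this noise-free data
```

With deeper lag windows and a nonlinear map, the same construction gives the one-step NARX predictor that can then be iterated to simulate full field cycles.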
Procedia PDF Downloads 1302507 Testing Nature Based Solutions for Air Quality Improvement: Aveiro Case Study
Authors: A. Ascenso, C. Silveira, B. Augusto, S. Rafael, S. Coelho, J. Ferreira, A. Monteiro, P. Roebeling, A. I. Miranda
Abstract:
Innovative nature-based solutions (NBSs) can provide answers to the challenges that urban areas currently face due to urban densification and extreme weather conditions. The recognized effects of NBSs include improvements in quality of life, mental and physical health and air quality, among others. Part of the work developed in the scope of the UNaLab project, which aims to guide cities in developing and implementing their own co-created NBSs, intends to assess the impacts of NBSs on air quality, using the city of Eindhoven as a case study. The state-of-the-art online air quality modelling system WRF-Chem was applied to simulate meteorological and concentration fields over the study area with a spatial resolution of 1 km² for the year 2015. The baseline simulation (without NBSs) was validated by comparing the model results with monitored data retrieved from the Eindhoven air quality database, showing adequate model performance. In addition, land use changes were applied in a set of simulations to assess the effects of different types of NBSs. Finally, these simulations were compared with the baseline scenario and the impacts of the NBSs were assessed. Reductions in pollutant concentrations, namely NOx and PM, were found after the application of the NBSs in the Eindhoven study area. The present work is particularly important for supporting public planners and decision makers in understanding the effects of their actions and planning more sustainable cities for the future. Keywords: air quality, modelling approach, nature based solutions, urban area
Procedia PDF Downloads 2382506 Optimizing the Morphology and Flow Patterns of Scaffold Perfusion Systems for Effective Cell Deposition Using Computational Fluid Dynamics
Authors: Vineeth Siripuram, Abhineet Nigam
Abstract:
A bioreactor is an engineered system that supports a biologically active environment. Over the years, advances in bioreactors have been widely adopted worldwide for varied applications ranging from sewage treatment to tissue cloning. Driven by tissue and organ shortage, tissue engineering has emerged as an alternative to transplantation for the reconstruction of lost or damaged organs. In this study, computational fluid dynamics (CFD) has been used to model porous medium flow in scaffolds (taken from the literature) with different flow patterns. A detailed analysis of different scaffold geometries and their influence on cell deposition in the perfusion system has been carried out using CFD. Considering that the scaffold should mimic organ or tissue structures in a three-dimensional manner, certain assumptions were made accordingly. Research on scaffolds has been extensively carried out in different bioreactors; however, there has been less focus on the morphology of the scaffolds and the flow patterns in which the perfusion system is laid out. The objective of this paper is to employ a computational approach using CFD simulation to determine the optimal morphology and the anisotropic measurements of various scaffold samples. Using a predictive computational modelling approach, the variables that exert dominant effects on cell deposition within the scaffold were prioritised, and corresponding changes to the scaffold morphology and the flow patterns in the perfusion systems were made. A Eulerian approach was adopted in multiple CFD simulations, and it is observed that the morphological and topological changes in the scaffold perfusion system are of great importance in the commercial applications of scaffolds. Keywords: cell seeding, CFD, flow patterns, modelling, perfusion systems, scaffold
Procedia PDF Downloads 1602505 Partisan Agenda Setting in Digital Media World
Authors: Hai L. Tran
Abstract:
Previous research on agenda-setting effects has often focused on the top-down influence of the media at the aggregate level, while overlooking the capacity of audience members to select media and content that fit their individual dispositions. The decentralized characteristics of online communication and digital news create more choices and greater user control, thereby enabling each audience member to seek out a unique blend of media sources, issues, and elements of messages and to mix them into a coherent individual picture of the world. This study examines how audiences use media differently depending on their prior dispositions, thereby making sense of the world in ways that are congruent with their preferences and cognitions. The current undertaking is informed by theoretical frameworks from two distinct lines of scholarship. According to the ideological migration hypothesis, individuals choose to live in communities with ideologies like their own to satisfy their need to belong. One tends to move away from Zip codes that are ideologically incongruent and toward those that are more aligned with one’s ideological orientation. This geographical division along ideological lines has been documented in social psychology research. As an extension of agenda setting, the agendamelding hypothesis argues that audiences seek out information in attractive media and blend it into a coherent narrative that fits with a common agenda shared by others who think as they do and communicate with them about issues of public concern. In other words, individuals, through their media use, identify themselves with a group or community that they want to join. Accordingly, the present study hypothesizes that because ideology plays a role in pushing people toward a physical community that fits their need to belong, it also leads individuals to receive an idiosyncratic blend of media and to be influenced by such selective exposure in deciding which issues are more relevant.
Consequently, the individualized focus of media choices affects how audiences perceive political news coverage and what they know about political issues. The research project utilizes recent data from the American Trends Panel survey conducted by the Pew Research Center to explore the nuanced nature of agenda setting at the individual level and amid heightened polarization. Hypothesis testing is performed with both nonparametric and parametric procedures, including regression and path analysis. This research explores the media-public relationship from a bottom-up approach, considering the ability of active audience members to select among media in a larger process that entails agenda setting. It should encourage agenda-setting scholars to further examine effects at the individual, rather than aggregate, level. In addition to theoretical contributions, the study’s findings are useful for media professionals in building and maintaining relationships with the audience in light of changes in market share due to the spread of digital and social media. Keywords: agenda setting, agendamelding, audience fragmentation, ideological migration, partisanship, polarization
Procedia PDF Downloads 592504 Building Information Modelling (BIM) and Unmanned Aerial Vehicles (UAV) Technologies in Road Construction Project Monitoring and Management: Case Study of a Project in Cyprus
Authors: Yiannis Vacanas, Kyriacos Themistocleous, Athos Agapiou, Diofantos Hadjimitsis
Abstract:
Building Information Modelling (BIM) technology is considered by construction professionals to be a very valuable process in modern design, procurement and project management. Construction professionals of all disciplines can use the single 3D model that BIM technology provides to design a project accurately and, furthermore, to monitor the progress of construction works effectively and efficiently. Unmanned Aerial Vehicles (UAVs), a technology initially developed for military applications, are now readily accessible and have already been used by commercial industries, including the construction industry. UAV technology has mainly been used to collect images that allow visual monitoring of building and civil engineering project conditions in various circumstances. UAVs, nevertheless, have undergone significant advances in equipment capabilities and now have the capacity to acquire high-resolution imagery from many angles in a cost-effective manner; using photogrammetry methods, one can determine characteristics such as distances, angles, areas, volumes and elevations of an area from overlapping images. In order to examine the potential of combining BIM and UAV technologies in construction project management, this paper presents the results of a case study of a typical road construction project where the two technologies were used together to achieve efficient and accurate as-built data collection on the progress of the works, with outcomes such as volumes and the production of sections and 3D models: information necessary for project progress monitoring and efficient project management. Keywords: BIM, project management, project monitoring, UAV
Procedia PDF Downloads 3032503 Albanian Students’ Errors in Spoken and Written English and the Role of Error Correction in Assessment and Self-Assessment
Authors: Arburim Iseni, Afrim Aliti, Nagri Rexhepi
Abstract:
This paper focuses on an important aspect of student performance: linguistic errors. It aims to explore the nature of the language errors and mistakes of Albanian intermediate (B1) students and attempts to trace their possible sources or causes by classifying the error samples into interlingual and intralingual errors. The hypothesis that interlingual errors may be determined or induced by native language influence seems to be confirmed by the significant number of such errors found among Albanian EFL students in the Study Program of English Language and Literature at the State University of Tetova. The findings of this study reveal that L1 interference first, and then ignorance of English grammar rules, constitute the main sources of errors, even though carelessness cannot be ruled out. Although we conducted our study with 300 students of intermediate (B1) level, we believe that this hypothesis would need to be confirmed by further research, perhaps with a larger number of students at different proficiency levels, in order to draw firmer and more accurate conclusions. The analysis of the questionnaires was done according to quantitative and qualitative research methods. The study was also conducted by taking written samples on different topics from our students and then distributing them with comments to the students and university teachers. The questionnaires were designed to gather information from 300 students and 48 EFL teachers, all of whom study or teach in the Study Program of English Language and Literature at the State University of Tetova. From the analyzed written samples of the students and face-to-face interviews, we gained useful insights into some important aspects of students’ error-making and error-correction.
These different research methodologies were combined in order to constitute holistic research, and the findings of the questionnaires helped us arrive at sounder solutions for minimizing the potential gap between students and teachers. Keywords: L1 and L2, linguistics, applied linguistics, SLA, Albanian EFL students and teachers, errors and mistakes, students’ assessment and self-assessment
Procedia PDF Downloads 4882502 Large Eddy Simulation with Energy-Conserving Schemes: Understanding Wind Farm Aerodynamics
Authors: Dhruv Mehta, Alexander van Zuijlen, Hester Bijl
Abstract:
Large Eddy Simulation (LES) numerically resolves the large, energy-containing eddies of a turbulent flow, while modelling the small dissipative eddies. On a wind farm, these large scales carry the energy that wind turbines extract and are also responsible for transporting the turbines’ wakes, which may interact with downstream turbines and certainly with the atmospheric boundary layer (ABL). In this situation, it is important to conserve the energy that these wakes carry, which could otherwise be altered artificially through numerical dissipation introduced by the schemes used for spatial discretisation and temporal integration. Numerical dissipation has been reported to cause the premature recovery of turbine wakes, leading to an over-prediction of the power produced by wind farms. An energy-conserving scheme is free from numerical dissipation and ensures that the energy of the wakes is increased or decreased only by the action of molecular viscosity or the action of wind turbines (body forces). The aim is to create an LES package with energy-conserving schemes to simulate wind turbine wakes correctly, in order to gain insight into power production, wake meandering, etc. Such knowledge will be useful in designing more efficient wind farms with minimal wake interaction, which, if unchecked, could lead to major losses in energy production per unit area of the wind farm. For their research, the authors intend to use the Energy-Conserving Navier-Stokes code developed by the Energy Research Centre of the Netherlands. Keywords: energy-conserving schemes, modelling turbulence, Large Eddy Simulation, atmospheric boundary layer
Procedia PDF Downloads 4652501 Dynamic Wetting and Solidification
Authors: Yulii D. Shikhmurzaev
Abstract:
The modelling of non-isothermal free-surface flows coupled with the solidification process has become a topic of intensive research with the advent of additive manufacturing, where complex 3-dimensional structures are produced by successive deposition and solidification of microscopic droplets of different materials. The issue is that both the spreading of liquids over solids and the propagation of the solidification front into the fluid and along the solid substrate pose fundamental difficulties for mathematical modelling. The first of these processes, known as ‘dynamic wetting’, leads to the well-known ‘moving contact-line problem’ where, as shown recently both experimentally and theoretically, the contact angle formed by the free surface with the solid substrate is not a function of the contact-line speed but is rather a functional of the flow field. The modelling of the propagating solidification front requires a generalization of the classical Stefan problem able to describe the onset of the process and the non-equilibrium regime of solidification. Furthermore, given that dynamic wetting and solidification occur concurrently and interactively, they should be described within the same conceptual framework. The present work addresses this formidable problem and presents a mathematical model capable of describing the key element of additive manufacturing in a self-consistent and singularity-free way. The model is illustrated with simple examples highlighting its main features. The main idea of the work is that both dynamic wetting and solidification, as well as some other fluid flows, are particular cases in a general class of flows where interfaces form and/or disappear. This conceptual framework allows one to derive a mathematical model from first principles using the methods of irreversible thermodynamics.
Crucially, the interfaces are considered not as zero-mass entities introduced via a Gibbsian ‘dividing surface’ but as the 2-dimensional surface phases produced in the continuum limit, in which the thickness of what is physically an interfacial layer vanishes and its properties are characterized by ‘surface’ parameters (surface tension, surface density, etc.). This approach allows for mass exchange between the surface and bulk phases, which is the essence of interface formation. As shown numerically, the onset of solidification is preceded by a pure interface-formation stage, whilst the Stefan regime is the final stage, in which the temperature at the solidification front asymptotically approaches the solidification temperature. The developed model can also be applied to flows with substrate melting as well as complex flows where both types of phase transition take place. Keywords: dynamic wetting, interface formation, phase transition, solidification
Procedia PDF Downloads 652500 Modelling Phase Transformations in Zircaloy-4 Fuel Cladding under Transient Heating Rates
Authors: Jefri Draup, Antoine Ambard, Chi-Toan Nguyen
Abstract:
Zirconium alloys exhibit solid-state phase transformations under thermal loading. These can lead to a significant evolution of the microstructure and associated mechanical properties of materials used in nuclear fuel cladding structures. Therefore, the ability to capture the effects of phase transformation on the material constitutive behavior is of interest under conditions of severe transient thermal loading. Whilst typical Avrami, or Johnson-Mehl-Avrami-Kolmogorov (JMAK), type models for phase transformations have been shown to correlate well with the behavior of Zircaloy-4 under constant heating rates, the effects of variable and fast heating rates are not fully explored. The present study utilises the results of in-situ high-energy synchrotron X-ray diffraction (SXRD) measurements in order to validate the phase transformation models for Zircaloy-4 under fast, variable heating rates. These models are used to assess the performance of fuel cladding structures under loss-of-coolant accident (LOCA) scenarios. The results indicate that simple Avrami-type models can provide a reasonable indication of the phase distribution in experimental test specimens under variable fast thermal loading. However, the accuracy of these models deteriorates under the faster heating regimes, i.e., 100 °C s⁻¹. The study highlights areas for improvement of simple Avrami-type models, such as the inclusion of the temperature-rate dependence of the JMAK n-exponent. Keywords: accident, fuel, modelling, zirconium
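For reference, the isothermal JMAK relation underlying the Avrami-type models discussed above gives the transformed phase fraction X as a function of time; the Arrhenius form of the rate constant k is a standard assumption rather than something stated in the abstract:

```latex
X(t) = 1 - \exp\!\left(-k\,t^{\,n}\right),
\qquad
k(T) = k_0 \exp\!\left(-\frac{Q}{RT}\right)
```

Under a variable heating history T(t), the transformation is typically advanced incrementally via the additivity rule, which is precisely where a rate-dependent Avrami exponent n, as suggested in the abstract, departs from the classical isothermal picture.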
Procedia PDF Downloads 1422499 Determination of ILSS of Composite Materials Using Micromechanical FEA Analysis
Authors: K. Rana, H.A.Saeed, S. Zahir
Abstract:
Interlaminar shear stress (ILSS) is a key parameter that quantifies the properties of composite materials. These properties can determine the suitability of a material for a specific purpose, such as aerospace or automotive applications. A modelling approach for the determination of ILSS is presented in this paper. Geometric modelling of the composite material is performed in the TexGen software, where the reinforcement, the cured matrix and their interfaces are modelled separately, as per the actual geometry. The mechanical properties of the matrix and reinforcements are modelled separately, which incorporates the anisotropy of the real-world composite material. The ASTM D2344 test is modelled in ANSYS for ILSS. In macroscopic analysis, the model approximates the anisotropy of the material and uses orthotropic properties by applying homogenization techniques. Shear stress analysis in that case does not reflect the actual real-world scenario but rather approximates it. In this paper, the actual geometry and properties of the reinforcement and matrix are modelled to capture the actual stress state during the testing of samples as per ASTM standards. Testing of samples was also performed in order to validate the results. The fibre volume fraction of the yarn is determined by image analysis of the manufactured samples. The fibre volume fraction data are incorporated into the numerical model to correct the transversely isotropic properties of the yarn. A comparison between experimental and simulated results is presented. Keywords: ILSS, FEA, micromechanical, fibre volume fraction, image analysis
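The ASTM D2344 short-beam test modelled here also admits a closed-form estimate of the short-beam (interlaminar shear) strength from the measured peak load, which is what FEA results are ultimately compared against. A minimal sketch, with illustrative specimen dimensions:

```python
def short_beam_strength(p_max_n, width_mm, thickness_mm):
    """ASTM D2344 short-beam strength: F_sbs = 0.75 * P_max / (b * h).

    Returns strength in MPa given peak load in N and the specimen
    cross-section (width b, thickness h) in mm.
    """
    return 0.75 * p_max_n / (width_mm * thickness_mm)

# Illustrative specimen: 2000 N peak load, 10 mm wide, 4 mm thick.
print(short_beam_strength(2000.0, 10.0, 4.0))  # → 37.5 (MPa)
```

The 0.75 factor is the classical parabolic shear-stress distribution maximum for a beam of rectangular cross-section, evaluated at the mid-plane where interlaminar failure initiates.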
Procedia PDF Downloads 373