Search results for: testing fresh concrete
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5629

3589 Electrochemical Corrosion and Mechanical Properties of Structural Materials for Oil and Gas Applications in Simulated Deep-Sea Well Environments

Authors: Turin Datta, Kisor K. Sahu

Abstract:

Structural materials used in today's onshore and offshore oil and gas exploration and drilling must possess superior tensile properties and excellent resistance to corrosive degradation, including general corrosion, localized corrosion (pitting and crevice), and environmentally assisted cracking such as stress corrosion cracking and hydrogen embrittlement. High Pressure and High Temperature (HPHT) wells typically operate at temperatures and pressures that can exceed 300-350 °F and 10,000 psi (69 MPa) respectively, which necessitates the use of exotic materials for these demanding sources of natural resources. This investigation focuses on the evaluation of the tensile properties and corrosion behavior of AISI 4140 High-Strength Low-Alloy (HSLA) steel, possessing a tempered martensitic microstructure, and Duplex 2205 Stainless Steel (DSS), comprising austenitic and ferritic phases. The selection of these two alloys is primarily based on economic considerations, as 4140 HSLA is cheaper than DSS 2205. Given the aggressive chemical species encountered in deep oil and gas wells, such as chloride ions (Cl⁻), carbon dioxide (CO₂), hydrogen sulphide (H₂S), and other mineral and organic acids, DSS 2205, with its dual-phase microstructure, can mitigate the degradation resulting from the simultaneous presence of chloride ions and hydrogen. Tensile testing indicates ductile failure of DSS 2205, whereas 4140 HSLA exhibits quasi-cleavage fracture due to tempered martensite embrittlement. Potentiodynamic polarization testing shows that DSS 2205 has higher corrosion resistance than 4140 HSLA: the former exhibits passivity, signifying resistance to localized corrosion, while the latter exhibits active dissolution across the entire environmental parameter space tested.
From the Scanning Electron Microscopy (SEM) evaluation, it is understood that stable pits appear in DSS 2205 only when the temperature exceeds the critical pitting temperature (CPT). SEM observation of the corroded 4140 HSLA specimen tested in aqueous 3.5 wt.% NaCl solution reveals intergranular cracking, which arises from the adsorption and diffusion of hydrogen during polarization, causing hydrogen-induced cracking/hydrogen embrittlement. General corrosion testing of DSS 2205 coupons in acidic brine (pH ~3.0) at ambient temperature indicates no weight loss even after three months, whereas the corrosion rate of AISI 4140 HSLA is significantly higher after just one month of testing.
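
The coupon-based general corrosion rates mentioned above are conventionally converted from weight loss to a penetration rate. A minimal sketch of the standard weight-loss formula used in immersion (ASTM G31-style) testing; the coupon values below are illustrative, not the study's data:

```python
def corrosion_rate_mmpy(weight_loss_g, area_cm2, time_h, density_g_cm3):
    """Corrosion rate in mm/year from coupon weight loss.

    CR = (K * W) / (A * T * D), with K = 8.76e4 when W is in g,
    A in cm^2, T in hours, and D in g/cm^3.
    """
    K = 8.76e4
    return (K * weight_loss_g) / (area_cm2 * time_h * density_g_cm3)

# Illustrative coupon: 0.12 g lost over 720 h (about one month),
# 20 cm^2 exposed area, steel density 7.85 g/cm^3.
rate = corrosion_rate_mmpy(0.12, 20.0, 720.0, 7.85)
```

A DSS 2205 coupon showing no measurable weight loss would give a rate of essentially zero by the same formula.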

Keywords: DSS 2205, polarization, pitting, SEM

Procedia PDF Downloads 264
3588 Biochar - A Multi-Beneficial and Cost-Effective Amendment to Clay Soil for Stormwater Runoff Treatment

Authors: Mohammad Khalid, Mariya Munir, Jacelyn Rice Boyaue

Abstract:

Highways are considered a major source of pollution to stormwater, and highway runoff can introduce various contaminants, including nutrients, indicator bacteria, heavy metals, chloride, and phosphorus compounds, which can have negative impacts on receiving waters. This study assessed the ability of biochar to remove contaminants and to improve the water-holding capacity of a soil-biochar mixture. Ten commercially available biochars were strategically selected. Lab-scale batch testing was performed at 3% and 6% by weight of the soil to obtain a preliminary estimate of contaminant removal along with hydraulic conductivity and water retention capacity. From these studies, the six best-performing candidates and an application rate of 6% were selected for the column studies. The soil-biochar mixture was packed into assembled 7.62 cm columns to a fixed height of 76.2 cm based on hydraulic conductivity. A total of eight column experiments were conducted for nutrient, heavy metal, and indicator bacteria analysis over a period of one year, which included a drying as well as a deicing period. The saturated hydraulic conductivity was greatly improved, which is attributed to the high porosity of the biochar-soil mixture. Initial data from the column testing show that biochar may significantly remove nutrients, indicator bacteria, and heavy metals. Overall, the study demonstrates that biochar could be efficiently applied with clay soil to improve the soil's hydraulic characteristics as well as remove pollutants from stormwater runoff.
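
Saturated hydraulic conductivity of the kind reported above is typically obtained from a constant-head column test via Darcy's law. A minimal sketch; the numbers are purely illustrative, not the study's measurements:

```python
def constant_head_k(volume_cm3, length_cm, area_cm2, head_cm, time_s):
    """Saturated hydraulic conductivity K (cm/s) from a constant-head
    test, by Darcy's law: K = Q * L / (A * h * t)."""
    return (volume_cm3 * length_cm) / (area_cm2 * head_cm * time_s)

# Illustrative run: 100 cm^3 collected in 100 s through a 10 cm
# specimen with 50 cm^2 cross-section under a 4 cm constant head.
k = constant_head_k(100.0, 10.0, 50.0, 4.0, 100.0)
```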

Keywords: biochar, nutrients, indicator bacteria, storm-water treatment, sustainability

Procedia PDF Downloads 121
3587 The Development of Traffic Devices Using Natural Rubber in Thailand

Authors: Weeradej Cheewapattananuwong, Keeree Srivichian, Godchamon Somchai, Wasin Phusanong, Nontawat Yoddamnern

Abstract:

Natural rubber for traffic devices in Thailand has been developed and researched for several years. The quality of Ribbed Smoked Sheet (RSS) rubber is better than that of 60% Dry Rubber Content (DRC60%) rubber; however, the cost of admixtures, especially CaCO₃ and sulphur, is higher than the cost of the RSS itself. This research considers flexible guideposts and Rubber Fender Barriers (RFB). For the flexible guideposts, both RSS and DRC60% are used, but for RFB only RSS is used, owing to the controlled performance tests. The objective of both devices is to decrease the number of accidents, fatality rates, and serious injuries; their function is to protect road users and vehicles by absorbing impact forces, reducing motorist injuries from severe to moderate. The aim is to find the best practice for traffic devices using natural rubber under engineering concepts. In addition, material performance parameters such as tensile strength and durability are evaluated, and the modulus of elasticity and related properties are calculated. In the laboratory, crash simulation, finite element modelling of materials, LRFD, and concrete technology methods are applied. After calculation, trial material compositions are mixed and tested in the laboratory; tensile, compressive, and weathering (durability) tests follow ASTM standards. Furthermore, a cycle-repetition test of the flexible guideposts is carried out. The final step is to fabricate all materials and build a real test section in the field. For RFB, 13 crash tests are planned: 7 pickup truck tests and 6 motorcycle tests.
These vehicular crash tests are the first of their kind in Thailand, carried out by trial and error; for example, the road crash test standard was changed from NCHRP-TL3 (100 kph) to MASH 2016, because MASH 2016 improves on NCHRP in terms of the speed, type, and weight of vehicles and the crash angle. From the MASH procedures, Test Level 6 (TL-6), comprising a 2,270 kg pickup truck at 100 kph and a 25-degree crash angle, is selected. The final real crash test will then be performed, and the whole system will be evaluated again in Korea. The researchers hope that the number of road accidents will decrease and that Thailand will no longer rank among the top ten countries for road accidents in the world.

Keywords: LRFD, load and resistance factor design, ASTM, American Society for Testing and Materials, NCHRP, National Cooperative Highway Research Program, MASH, Manual for Assessing Safety Hardware

Procedia PDF Downloads 128
3586 Pushover Analysis of a Typical Bridge Built in Central Zone of Mexico

Authors: Arturo Galvan, Jatziri Y. Moreno-Martinez, Daniel Arroyo-Montoya, Jose M. Gutierrez-Villalobos

Abstract:

Bridges are among the most seismically vulnerable structures in highway transportation systems. The general process for assessing the seismic vulnerability of a bridge involves the evaluation of its overall capacity and demand. One of the most common procedures for obtaining this capacity is a pushover analysis of the structure. Typically, bridge capacity is assessed using non-linear static methods or non-linear dynamic analyses; the non-linear dynamic approaches use step-by-step numerical solutions, at the cost of considerable computing time. In this study, a non-linear static ('pushover') analysis was performed to predict the collapse mechanism of a typical bridge built in the central zone of Mexico (Celaya, Guanajuato). The bridge superstructure consists of three simply supported spans with a total length of 76 m: 22 m for each end span and 32 m for the central span. The deck is 14 m wide and the concrete slab is 18 cm deep. The substructure comprises frames of five piers with hollow box-shaped sections; each pier is 7.05 m high and 1.20 m in diameter. The numerical model was created using commercial software considering linear and non-linear elements. In all cases, the piers were represented by frame-type elements with geometrical properties obtained from the structural project and construction drawings of the bridge. The deck was modeled with a mesh of rectangular thin-shell (plate bending and stretching) finite elements. A moment-curvature analysis was performed for the pier sections, considering in each pier the effect of confined concrete and its reinforcing steel. In this way, plastic hinges were defined at the base of the piers to carry out the pushover analysis. In addition, time-history analyses were performed using 19 accelerograms of real earthquakes registered in Guanajuato.
In this way, the displacements produced in the bridge were determined. Finally, pushover analysis was applied through displacement control of the piers to obtain the overall capacity of the bridge before failure occurs. It was concluded that the lateral deformation of the piers under a critical earthquake in this zone is almost imperceptible, owing to the geometry and reinforcement demanded by current design standards; compared with these demands, the displacement capacity of the piers is excessive. The analysis also showed that the five-pier frames increase the stiffness in the transverse direction of the bridge. It is therefore proposed to reduce these frames from five piers to three, maintaining the same geometrical characteristics and the same reinforcement in each pier, as well as the mechanical properties of the materials (concrete and reinforcing steel). A pushover analysis of this configuration indicated that the bridge would continue to exhibit adequate seismic behavior, at least for the 19 accelerograms considered in this study. Material, construction, time, and labor costs would thereby be reduced in this case study.
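
The capacity curve obtained from a pushover analysis of this kind is often idealized as elastic-perfectly-plastic when comparing it with displacement demands. A minimal sketch of that idealization; the stiffness and yield values are hypothetical, not properties of this bridge:

```python
def capacity_force(displacement_cm, k_elastic_kn_cm, f_yield_kn):
    """Base shear (kN) on an elastic-perfectly-plastic idealized
    pushover curve: linear up to yield, then a flat plateau."""
    return min(k_elastic_kn_cm * displacement_cm, f_yield_kn)

# Hypothetical pier frame: elastic stiffness 100 kN/cm, yield at 500 kN.
f_elastic = capacity_force(3.0, 100.0, 500.0)  # below yield displacement
f_plastic = capacity_force(8.0, 100.0, 500.0)  # on the plastic plateau
```

If the displacement demand from the time-history analyses falls well inside the elastic branch, the capacity is excessive relative to demand, which is the situation the study describes.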

Keywords: collapse mechanism, moment-curvature analysis, overall capacity, pushover analysis

Procedia PDF Downloads 152
3585 Comparative Analysis of Glycated Hemoglobin (HbA1c) between HPLC and Immunoturbidimetry Methods in Type II Diabetes Mellitus Patients

Authors: Intanri Kurniati, Raja Iqbal Mulya Harahap, Agustyas Tjiptaningrum, Reni Zuraida

Abstract:

Background: The prevalence of diabetes mellitus is still increasing, and the disease has become a health and social burden worldwide. Glycation of various proteins is increased in diabetic patients compared with non-diabetic subjects, and some of these glycated proteins are suggested to be involved in the development and progression of chronic diabetic complications. Among them, glycated hemoglobin (HbA1c) is commonly used as the gold-standard index of glycemic control in the clinical setting. Several methods exist for HbA1c testing; the most commonly used is immunoturbidimetry. This research aimed to compare HbA1c levels measured by immunoturbidimetry and by HPLC in T2DM patients. Methods: The study involved 77 patients from Abd Muluk Hospital, Bandar Lampung; patients gave consent and then underwent phlebotomy, and each sample was tested for HbA1c with both the Turbidimetric Inhibition Immunoassay (TINIA) and High-Performance Liquid Chromatography (HPLC) methods. Results: The mean ± SD with the TINIA method was 9.2 ± 1.2, while the mean HbA1c level with the HPLC method was 9.6 ± 1.2. A t-test showed no significant difference between the methods (p > 0.05), suggesting high agreement between the two, and both are eligible for use with patients. Discussion: The absence of a significant difference among the research subjects indicates high conformity of the two methods, so either is suitable for clinical monitoring of patients. Conclusion: HbA1c levels are elevated in patients with T2DM whether measured with the HPLC or the TINIA method, and there was no significant difference between the methods.
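
The method comparison above rests on a paired t-test between the two assays. A minimal pure-Python sketch of the statistic; the five paired readings are made up for illustration, not the study's 77-patient data:

```python
import math

def paired_t(x, y):
    """Paired t statistic for two sets of measurements on the same
    subjects (here: TINIA vs. HPLC HbA1c readings)."""
    d = [b - a for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Illustrative paired HbA1c readings (%) for five hypothetical patients.
tinia = [9.0, 9.4, 8.8, 9.6, 9.2]
hplc  = [9.5, 9.7, 9.3, 9.9, 9.6]
t_stat = paired_t(tinia, hplc)
```

The computed t statistic would then be compared against the t distribution with n-1 degrees of freedom to obtain the p-value.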

Keywords: diabetes mellitus, glycated hemoglobin, HbA1c, HPLC, immunoturbidimetry

Procedia PDF Downloads 99
3584 The Effect of Transparent Oil Wood Stain on the Colour Stability of Spruce Wood during Weathering

Authors: Eliska Oberhofnerova, Milos Panek, Stepan Hysek, Martin Lexa

Abstract:

Nowadays, the use of wood both indoors and outdoors is constantly increasing. However, wood is a natural organic material, and in the exterior it is subjected to a degradation process caused by abiotic factors (solar radiation, rain, moisture, wind, dust, etc.). This process affects only the surface layers of wood, but neglecting basic rules of wood protection increases the possibility of attack by biological agents and thereby impairs the function of the wooden element. Wood degradation can be slowed by proper surface treatment, especially for wood species of low natural durability such as spruce. Modern coating systems are subject to many requirements, such as colour stability, hydrophobicity, low volatile organic compound (VOC) content, long service life, and easy maintenance. The aim of this study is to evaluate the colour stability of spruce wood (Picea abies), as the basic parameter indicating coating durability, treated with two layers of a transparent natural-oil wood stain and exposed to outdoor conditions. The test specimens were exposed to natural weathering for 2 years and to artificial weathering in a UV chamber for 2,000 hours. Colour parameters were measured before and during exposure with a spectrophotometer in the CIELab colour space. Untreated and treated wood, and the two testing procedures, were compared. The results showed a significant effect of the coating on the colour stability of wood, as expected. Nevertheless, the progressive colour changes observed during exposure differed according to the testing procedure applied, natural versus artificial.
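
Colour change measured in the CIELab space is conventionally summarized as the Euclidean colour difference ΔE*ab (CIE76). A minimal sketch; the coordinate values are illustrative, not the study's readings:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two CIELab triplets
    (L*, a*, b*): the Euclidean distance in CIELab space."""
    return math.sqrt(sum((c2 - c1) ** 2 for c1, c2 in zip(lab1, lab2)))

# Illustrative spectrophotometer readings before and after weathering.
before = (50.0, 10.0, 20.0)
after  = (53.0, 14.0, 20.0)
de = delta_e_ab(before, after)
```

Tracking ΔE*ab over exposure time gives the colour-stability curves that the natural and artificial weathering procedures are compared on.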

Keywords: colour stability, natural and artificial weathering, spruce wood, transparent coating

Procedia PDF Downloads 220
3583 Influence of Infinite Elements in Vibration Analysis of High-Speed Railway Track

Authors: Janaki Rama Raju Patchamatla, Emani Pavan Kumar

Abstract:

Increasing existing train speeds and introducing high-speed trains in India as part of Vision 2020 is challenging in terms of both economic viability and technical feasibility. More than economic viability, technical feasibility has to be thoroughly checked for safe operation and execution. Trains moving at high speeds need a well-established, firm, and safe track thoroughly tested against vibration effects. With increased train speeds, the track structure and the layered soil-structure interaction have to be critically assessed for vibration and displacements. Physical establishment of track, testing, and experimentation is a costly and time-consuming process; software-based modelling and simulation give relatively reliable, cost-effective means of testing the effects of critical parameters such as sleeper design and density and the properties of track and sub-grade. The present paper reports the applicability of infinite elements in reducing unrealistic stress-wave reflections from the soil-structure model boundary. The influence of the infinite elements is quantified in terms of the displacement time histories of the adjoining soil and the deformation pattern in general. In addition, the railhead response histories at various locations show that the numerical model is realistic, without aberrations at the boundaries. The numerical model is quite promising in its ability to simulate the critical parameters of track design.

Keywords: high-speed railway track, finite element method, infinite elements, vibration analysis, soil-structure interface

Procedia PDF Downloads 272
3582 Degradation Kinetics of Cardiovascular Implants Employing Full Blood and Extra-Corporeal Circulation Principles: Mimicking the Human Circulation In vitro

Authors: Sara R. Knigge, Sugat R. Tuladhar, Hans-Klaus Höffler, Tobias Schilling, Tim Kaufeld, Axel Haverich

Abstract:

Tissue-engineered (TE) heart valves based on degradable electrospun fiber scaffolds represent a promising approach to overcoming the known limitations of mechanical or biological prostheses. However, the mechanical stress in the high-pressure system of the human circulation is a severe challenge for these delicate materials. Hence, the prediction of the scaffolds' in vivo degradation kinetics must be as accurate as possible to prevent fatal events in future animal or even clinical trials. This study therefore investigates whether long-term testing in full blood provides more meaningful results regarding degradation behavior than conventional tests in simulated body fluid (SBF) or Phosphate-Buffered Saline (PBS). Fiber mats were produced from a polycaprolactone (PCL)/tetrafluoroethylene solution by electrospinning, and their morphology was characterized via scanning electron microscopy (SEM). A test set-up with porcine full blood, consisting of a reaction vessel, an oxygenator unit, and a roller pump, was established to provide a maximally physiological degradation environment. The blood parameters (pO2, pCO2, temperature, and pH) were monitored with an online test system. All tests were also carried out in the test circuit with SBF and PBS to compare conventional degradation media with the novel full-blood setting. Polymer degradation was quantified by SEM image analysis, differential scanning calorimetry (DSC), and Raman spectroscopy, and tensile and cyclic loading tests were performed to evaluate the mechanical integrity of the scaffold. Preliminary results indicate that PCL degrades more slowly in full blood than in SBF or PBS. Water uptake is more pronounced in the full-blood group, and PCL preserves its mechanical integrity longer when degraded in full blood. Protein absorption increased during the degradation process, and red blood cells, platelets, and their aggregates adhered to the PCL.
Presumably, the degradation led to a more hydrophilic polymer surface, which promoted protein adsorption and blood cell adhesion. Testing degradable implants in full blood allows more reliable scaffold materials to be developed in the future; material tests in small and large animal trials can thereby be focused on candidates that have proven to function well in an in-vivo-like setting.
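
When comparing degradation media in long-term studies of this kind, polymer mass loss is often approximated by first-order kinetics. A minimal sketch; the rate constant below is hypothetical and not a measured value for PCL in any of the media described:

```python
import math

def remaining_mass(m0_mg, k_per_month, t_months):
    """First-order degradation model: m(t) = m0 * exp(-k * t).

    A smaller fitted k for the full-blood group than for SBF/PBS
    would express the slower degradation reported above.
    """
    return m0_mg * math.exp(-k_per_month * t_months)

# Hypothetical 100 mg scaffold with k = 0.05 / month, after one year.
m = remaining_mass(100.0, 0.05, 12.0)
```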

Keywords: electrospun scaffold, full blood degradation test, long-term polymer degradation, tissue engineered aortic heart valve

Procedia PDF Downloads 150
3581 Social-Cognitive Aspects of Interpretation: Didactic Approaches in Language Processing and English as a Second Language Difficulties in Dyslexia

Authors: Schnell Zsuzsanna

Abstract:

Background: The interpretation of written texts and language processing in the visual domain, in other words atypical reading ability, also known as dyslexia, is an ever more prominent phenomenon in today's societies and educational communities. This much-researched problem affects cognitive abilities and, although coupled with normal intelligence, typically manifests as difficulties in differentiating sounds and orthography and in the holistic processing of written words. The factors of susceptibility are varied: social, cognitive-psychological, and linguistic factors interact with each other. Methods: The research explains the psycholinguistics of dyslexia on the basis of several empirical experiments and demonstrates how the domain-general abilities of inhibition, retrieval from the mental lexicon, priming, phonological processing, and visual modality transfer affect successful language processing and interpretation. Interpretation of visual stimuli is hindered, and the problem seems to be embedded in a sociocultural, psycholinguistic, and cognitive background. This makes the picture even more complex, suggesting that understanding and resolving the issues of dyslexia has to be interdisciplinary, aided by several disciplines in the humanities and social sciences, and should be researched empirically, so that the practical, educational corollaries can be analyzed on an applied basis. Aim and applicability: The lecture sheds light on the applied, cognitive aspects of interpretation, the social-cognitive traits of language processing, and the mental underpinnings of cognitive interpretation strategies in different languages (namely Hungarian and English), offering solutions through a few applied techniques for success in foreign language learning that can be useful to developers of testing methodologies and measures across ESL teaching and testing platforms.

Keywords: dyslexia, social cognition, transparency, modalities

Procedia PDF Downloads 84
3580 Issues in Organizational Assessment: The Case of Frustration Tolerance Measurement in Mexico

Authors: David Ruiz, Carlos Nava, Roberto Carbajal

Abstract:

The psychological profile has become one of the most important sources of information for individual selection and the hiring process in any organization. Psychological instruments are used to collect data about variables considered critically important for performance at work. However, because of conceptual confusion in organizational psychology, most of the information provided by psychological testing is not directly useful for Mexican human resources professionals making hiring decisions. The aims of this paper are 1) to underline the lack of conceptual precision in the theoretical foundations of testing in Mexico and 2) to present a reliability and validity analysis of a frustration tolerance instrument created as an alternative to heuristically conducted individual assessment in organizations. First, a description of assessment conditions in Mexico is given. Second, an instrument and a theoretical framework are presented as an alternative to current assessment practices in the country. A total of 65 psychology students of the Iztacala Faculty of Superior Studies were assessed. Cronbach's alpha coefficient was calculated, and an exploratory factor analysis was carried out to test the scale's unidimensionality. The reliability analysis revealed good internal consistency of the scale (Cronbach's α = 0.825). Factor analysis produced 4 factors for the scale; however, the factor loadings and explained variance support the scale's unidimensionality. It is concluded that the instrument has good psychometric properties that will allow human resources professionals to collect useful data. Different possibilities for conducting psychological assessment are suggested for future development.
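
The internal-consistency figure reported above follows the standard formula α = k/(k-1) · (1 - Σσ²ᵢ/σ²ₜ). A minimal sketch with a tiny made-up item matrix, not the study's 65-respondent data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score lists
    (each inner list holds one item's scores across respondents)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(pvariance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Illustrative 3-item scale answered by four respondents.
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5], [1, 3, 3, 5]])
```

Population variance is used consistently here; using sample variance throughout gives the same α.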

Keywords: psychological assessment, frustration tolerance, human resources, organizational psychology

Procedia PDF Downloads 309
3579 Determining Which Material Properties Resist the Tool Wear When Machining Pre-Sintered Zirconia

Authors: David Robert Irvine

Abstract:

In the dental restoration sector, there has been a shift to using zirconia. With the ever-increasing need to shorten lead times and deliver restorations faster, the zirconia is machined in its pre-sintered state instead of grinding the very hard sintered state. As with all machining, there is tool wear; while investigating the tooling used to machine pre-sintered zirconia, it became apparent that the wear rate depends more on material build-up and abrasion than on plastic deformation as in conventional metal machining. It also came to light that tool material currently cannot be selected on the basis of wear resistance, as there are no data. Different works have analysed the effects of the individual wear mechanisms separately, using similar if not the same materials. In this work, the testing method used to analyse the wear was modified from ISO 8688:1989 to use pre-sintered zirconia and the cutting conditions used in dentistry to machine it. This understanding was developed through a series of tests based on machining operations, to give the best representation of the multiple wear factors that can occur in machining of pre-sintered zirconia, such as three-body abrasion, material build-up, surface welding, plastic deformation, tool vibration, and thermal cracking. From the testing, it was found that carbide grades with low trans-granular rupture toughness fail by abrasion, while those with high trans-granular rupture toughness fail by edge chipping from build-up or thermal effects. The results can assist the development of these tools and the restorative dental process. This work was completed with the aim of assisting the selection of tool material for future tools, along with a deeper understanding of the properties that confer resistance to abrasive wear and material build-up.

Keywords: abrasive wear, cemented carbide, pre-sintered zirconia, tool wear

Procedia PDF Downloads 160
3578 Speech Recognition Performance by Adults: A Proposal for a Battery for Marathi

Authors: S. B. Rathna Kumar, Pranjali A Ujwane, Panchanan Mohanty

Abstract:

The present study aimed to develop a battery for assessing speech recognition performance by adults in Marathi. A total of four word lists were developed considering word frequency, word familiarity, words in common use, and phonemic balance. Each word list consists of 25 words (15 monosyllabic words in CVC structure and 10 monosyllabic words in CVCV structure). Equivalence analysis and performance-intensity function testing were carried out using the four word lists on a total of 150 native speakers of Marathi belonging to different regions of Maharashtra (Vidarbha, Marathwada, Khandesh and Northern Maharashtra, Pune, and Konkan). The subjects were divided equally into five groups based on the above-mentioned regions. It was found that there was no significant difference (p > 0.05) in speech recognition performance between groups for each word list or between word lists for each group; hence, the four word lists developed were equally difficult for all the groups and can be used interchangeably. The performance-intensity (PI) function curve showed a semi-linear form, and the mean slopes of the linear portions of the curves indicated an average increase in word recognition score of 4.64%, 4.73%, 4.68%, and 4.85% per dB for lists 1 to 4 respectively. Although no data are available on speech recognition tests for adults in Marathi, most of the findings of the study are in line with research reports on other languages. The four word lists thus developed were found to have sufficient reliability and validity in assessing speech recognition performance by adults in Marathi.
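
Per-dB slopes like those quoted above come from fitting the linear portion of the performance-intensity curve. A minimal least-squares sketch; the data points are fabricated (and perfectly linear) for illustration, not the Marathi norms:

```python
def pi_slope(levels_db, scores_pct):
    """Least-squares slope (% per dB) of the linear portion of a
    performance-intensity function."""
    n = len(levels_db)
    mx = sum(levels_db) / n
    my = sum(scores_pct) / n
    num = sum((x - mx) * (y - my) for x, y in zip(levels_db, scores_pct))
    den = sum((x - mx) ** 2 for x in levels_db)
    return num / den

# Illustrative linear segment rising 4.6 % per dB.
slope = pi_slope([30, 35, 40, 45], [20.0, 43.0, 66.0, 89.0])
```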

Keywords: speech recognition performance, phonemic balance, equivalence analysis, performance-intensity function testing, reliability, validity

Procedia PDF Downloads 357
3577 Modelling of Geotechnical Data Using Geographic Information System and MATLAB for Eastern Ahmedabad City, Gujarat

Authors: Rahul Patel

Abstract:

Ahmedabad, a city located in western India, is experiencing rapid growth due to urbanization and industrialization, and it is projected to become a metropolitan city in the near future, resulting in extensive construction activity. Soil testing is necessary before construction can commence, requiring construction companies and contractors to conduct soil testing periodically. The focus of this study is the process of creating a digitally formatted spatial database integrating geotechnical data with a Geographic Information System (GIS). Building a comprehensive geotechnical (geo-)database involves three steps: collecting borehole data from reputable sources, verifying the accuracy and removing redundancy in the data, and standardizing and organizing the geotechnical information for integration into the database. Once the database is complete, it is integrated with GIS, allowing users to visualize, analyze, and interpret geotechnical information spatially. Using a Topo to Raster interpolation process in GIS, estimated values are assigned to all locations based on the sampled geotechnical data values. The study area was contoured for SPT N-values, soil classification, Φ-values, and bearing capacity (t/m²). Various interpolation techniques were cross-validated to ensure the accuracy of the information. The resulting GIS map enables the calculation of SPT N-values, Φ-values, and bearing capacities for different footing widths at various depths. This study highlights the potential of GIS to provide an efficient solution to complex problems that would otherwise be tedious to achieve by other means. Not only does GIS offer greater accuracy, but it also generates valuable information that can be used as input for correlation analysis. Furthermore, the system serves as a decision-support tool for geotechnical engineers.
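
Raster interpolation of sparse borehole values, as performed above in GIS, can be illustrated with inverse-distance weighting, a simpler scheme than the Topo to Raster tool the study uses. The borehole coordinates and N-values below are made up:

```python
import math

def idw(boreholes, x, y, power=2):
    """Inverse-distance-weighted estimate at (x, y) from a list of
    (bx, by, value) borehole tuples; exact at a borehole location."""
    num = den = 0.0
    for bx, by, value in boreholes:
        d = math.hypot(x - bx, y - by)
        if d == 0:
            return float(value)  # point coincides with a borehole
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Illustrative SPT N-values at three borehole locations.
holes = [(0, 0, 10), (10, 0, 20), (0, 10, 30)]
n_est = idw(holes, 5, 5)
```

Cross-validation, as in the study, would withhold each borehole in turn and compare the interpolated value at its location against the measured one.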

Keywords: ArcGIS, borehole data, geographic information system, geo-database, interpolation, SPT N-value, soil classification, Φ-Value, bearing capacity

Procedia PDF Downloads 74
3576 Study of Ageing in the Marine Environment of Bonded Composite Structures by Ultrasonic Guided Waves: Comparison of a Conventional Carbon-Epoxy Composite and a Recyclable Resin-Based Composite

Authors: Hamza Hafidi Alaoui, Damien Leduc, Mounsif Ech Cherif El Kettani

Abstract:

This study is dedicated to the evaluation of the ageing of turbine blades in sea conditions, based on ultrasonic Non-Destructive Testing (NDT) methods, and is being developed within the framework of the European Interreg TIGER project. The Tidal Stream Industry Energiser project, known as TIGER, is the biggest ever Interreg project, driving collaboration and cost reduction through tidal turbine installations in the UK and France. The TIGER project will drive the growth of tidal stream energy to become a greater part of the energy mix, with significant benefits for coastal communities. In the bay of Paimpol-Bréhat (Brittany), different samples of composite material and bonded composite/composite structures have been immersed at the same time near a turbine. The studied samples are either conventional carbon-epoxy composites or composites based on a recyclable resin (Recyclamine). One objective of the study is to compare the ageing of the two types of structure. A sample of each structure is retrieved every 3 to 6 months, analyzed using ultrasonic guided waves and bulk waves, and compared to reference samples. In order to classify the damage level as a function of time spent under the sea, the measurements have been compared to a rheological model based on the Finite Element Method (FEM). Ageing of the composite material, as well as of the adhesive, is identified. The aim is to improve the quality of the turbine blade structure in terms of longevity and reduced maintenance needs.

Keywords: non-destructive testing, ultrasound, composites, guided waves

Procedia PDF Downloads 220
3575 Wear Measurement of Thermomechanical Parameters of the Metal Carbide

Authors: Riad Harouz, Brahim Mahfoud

Abstract:

The ribs and rings on bars for reinforced concrete are obtained by a hot rolling process with finishing rolls in metal carbide, which have a groove rolled around their outside diameter. Our observation is that this groove presents geometrical wear by the end of its service cycle, which is specified in tonnage. In our study, we first determined experimental measurements of the wear in terms of thermo-mechanical parameters (speed, load, and temperature) and the influence of these parameters on the wear. In the second stage, we developed a mathematical lifetime model useful for predicting the wear and its evolution.

Keywords: lifetime, metal carbides, modeling, thermo-mechanical, wear

Procedia PDF Downloads 310
3574 The Adsorption of Zinc Metal in Waste Water Using ZnCl2 Activated Pomegranate Peel

Authors: S. N. Turkmen, A. S. Kipcak, N. Tugrul, E. M. Derun, S. Piskin

Abstract:

Activated carbon is an amorphous form of carbon with an extremely extended surface area, which results from its porous structure. Activated carbon can be obtained from a variety of materials, such as coal and cellulosic materials, by both physical and chemical methods. The prepared activated carbon can be used to decolorize and deodorize, and also for the removal of organic and inorganic pollutants. In this study, pomegranate peel was subjected to 800 W of microwave power for 1 to 4 minutes; fresh pomegranate peel was used as the reference material. ZnCl2 was then used for chemical activation. After the activation process, the activated pomegranate peels were used for the adsorption of Zn metal (40 ppm) from waste water. As a result of the adsorption experiments, removal of the heavy metal ranged from 85% to 89%.
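As a rough illustration, removal efficiencies like the 85-89% reported above are conventionally computed from the initial and residual metal concentrations. The sketch below assumes hypothetical residual concentrations, since only the initial 40 ppm is given in the abstract:

```python
def removal_efficiency(c0_ppm, ce_ppm):
    """Percent of metal removed from solution: (C0 - Ce) / C0 * 100."""
    return (c0_ppm - ce_ppm) / c0_ppm * 100.0

# The initial Zn concentration of 40 ppm is from the abstract; the residual
# concentrations (4.4 and 6.0 ppm) are hypothetical illustrations.
print(round(removal_efficiency(40.0, 4.4), 1))  # -> 89.0
print(round(removal_efficiency(40.0, 6.0), 1))  # -> 85.0
```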

Keywords: activated carbon, adsorption, chemical activation, microwave, pomegranate peel

Procedia PDF Downloads 547
3573 Mondoc: Informal Lightweight Ontology for Faceted Semantic Classification of Hypernymy

Authors: M. Regina Carreira-Lopez

Abstract:

Lightweight ontologies seek to make concrete the union relationships between a parent node and a secondary node, also called a "child node". This logical relation (L) can be formally defined as a triple ontological relation (LO), with LO equivalent to ⟨LN, LE, LC⟩, where LN represents a finite set of nodes (N); LE is a set of entities (E), each of which represents a relationship between nodes, forming a rooted tree of ⟨LN, LE⟩; and LC is a finite set of concepts (C), encoded in a formal language (FL). Mondoc enables more refined searches on semantic and classified facets for retrieving specialized knowledge about Atlantic migrations, from the Declaration of Independence of the United States of America (1776) to the end of the Spanish Civil War (1939). The model seeks to increase documentary relevance by applying an inverse frequency of co-occurrent hypernymy phenomena to a concrete dataset of textual corpora, with the RMySQL package. Mondoc profiles archival utilities implementing SQL programming code and allows data export to XML schemas, achieving semantic and faceted analysis of speech by analyzing keywords in context (KWIC). The methodology applies random and unrestricted sampling techniques with RMySQL to verify the resonance phenomena of inverse documentary relevance between the number of co-occurrences of the same term (t) in more than two documents of a set of texts (D). Secondly, the research also evidences that co-associations between (t) and its corresponding synonyms and antonyms (synsets) are likewise inverse. The results from grouping facets or polysemic words with synsets in more than two textual corpora within their syntagmatic context (nouns, verbs, adjectives, etc.) state how to proceed with the semantic indexing of hypernymy phenomena for subject-heading lists and for authority lists for documentary and archival purposes.
Mondoc contributes to the development of web directories and seems to achieve a proper and more selective search of e-documents (classification ontology). It can also foster the production of on-line catalogs for semantic authorities, or concepts, through XML schemas, because its applications could be used for implementing data models, by a prior adaptation of the base ontology to structured meta-languages such as OWL and RDF (descriptive ontology). Mondoc serves the classification of concepts and applies a semantic indexing approach to facets. It enables information retrieval, as well as quantitative and qualitative data interpretation. The model reproduces a tuple ⟨LN, LE, LT, LCFL, BKF⟩, where LN is a set of entities that connect with other nodes to form a rooted tree in ⟨LN, LE⟩, LT specifies a set of terms, and LCFL acts as a finite set of concepts, encoded in a formal language, L. Mondoc only resolves partial problems of linguistic ambiguity (in the case of synonymy and antonymy); neither the pragmatic dimension of natural language nor the cognitive perspective is addressed. To achieve this goal, forthcoming programming developments should target meta-languages with structured documents in XML.

Keywords: hypernymy, information retrieval, lightweight ontology, resonance

Procedia PDF Downloads 125
3572 Use of Information Technology in the Government of a State

Authors: Pavel E. Golosov, Vladimir I. Gorelov, Oksana L. Karelova

Abstract:

There are visible changes in the world organization, environment, and health of national conscience that create a background for discussion of a possible redefinition of global, state, and regional management goals. The authors apply sustainable development criteria to a hierarchical management scheme intended to lead the world community to non-contradictory growth. Concrete definitions are discussed with respect to the decision-making process, which chiefly represents the state. With the help of system analysis, it is highlighted how to understand who would carry the distinctive sign of world leadership in the near future.

Keywords: decision-making, information technology, public administration

Procedia PDF Downloads 512
3571 Forest Fire Burnt Area Assessment in a Part of West Himalayan Region Using Differenced Normalized Burnt Ratio and Neural Network Approach

Authors: Sunil Chandra, Himanshu Rawat, Vikas Gusain, Triparna Barman

Abstract:

Forest fires are a recurrent phenomenon in the Himalayan region owing to the presence of vulnerable forest types, topographical gradients, climatic conditions, and anthropogenic pressure. The present study focuses on the identification of forest fire-affected areas in a small part of the West Himalayan region using the differenced normalized burn ratio method and spectral unmixing methods. The study area has rugged terrain with sub-tropical pine forest, montane temperate forest, and sub-alpine forest and scrub. The major cause of fires in this region is anthropogenic: human-induced fires are set to obtain fresh leaves, scare wild animals away from agricultural crops, and support grazing within reserved forests, or are ignited for cooking and other purposes. These fires affect large areas on the ground, necessitating precise estimation for management and policy making. In the present study, two approaches have been used for burnt area analysis. The first uses the differenced normalized burn ratio (dNBR) index, computed from burn ratio values generated using the Short-Wave Infrared (SWIR) and Near Infrared (NIR) bands of Sentinel-2 imagery. The results of the dNBR have been compared with the outputs of the spectral unmixing methods. It has been found that the dNBR produces good results in fire-affected areas with a homogeneous forest stratum and slopes of less than 5 degrees. However, in rugged terrain where the landscape is largely shaped by topographical variation, vegetation type, and tree density, the results may be strongly influenced by topography, complexity of tree composition, fuel load composition, and soil moisture.
Hence, variations in these factors may not be handled effectively by the dNBR approach commonly followed for burnt area assessment over large areas. The second approach attempted in this study therefore uses a spectral unmixing method in which each individual pixel is tested before an information class is assigned to it. The method uses a neural network approach utilizing Sentinel-2 bands. Training and testing data are generated from the Sentinel-2 data and the national field inventory, which are then used to generate outputs using machine learning tools. Analysis of the results indicates that fire-affected regions and their severity can be better estimated using spectral unmixing methods, which can resolve noise in the data and classify individual pixels into precise burnt/unburnt classes.
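For reference, the NBR and dNBR computations described above can be sketched as follows; the band values used here are hypothetical reflectances, not data from the study:

```python
def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance values
    (for Sentinel-2, bands B8A and B12 are commonly used)."""
    return (nir - swir) / (nir + swir)

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    """Differenced NBR: pre-fire NBR minus post-fire NBR.
    Larger positive values indicate more severe burning."""
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

# Hypothetical reflectances for a healthy pixel that later burns:
severity = dnbr(nir_pre=0.45, swir_pre=0.15, nir_post=0.20, swir_post=0.30)
print(round(severity, 2))  # -> 0.7
```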

Keywords: categorical data, log linear modeling, neural network, shifting cultivation

Procedia PDF Downloads 54
3570 Permeodynamic Particulate Matter Filtration for Improved Air Quality

Authors: Hamad M. Alnagran, Mohammed S. Imbabi

Abstract:

Particulate matter (PM) in the air we breathe is detrimental to health. Overcoming this problem has attracted interest and prompted research into the use of PM filtration in commercial buildings and homes. The consensus is that tangible health benefits can result from the use of PM filters in most urban environments, cleaning up a building's fresh air supply and thereby reducing residents' exposure to airborne PM. The authors have investigated and are developing a new large-scale Permeodynamic Filtration Technology (PFT) capable of permanently filtering and removing airborne PM from outdoor spaces, thus also benefiting internal spaces such as the interiors of buildings. Theoretical models were developed, and laboratory trials carried out, to determine permeodynamic filtration efficiency and pressure drop as functions of PM particle size distribution, and to validate them through measurement. The conclusion is that PFT offers a potentially viable, cost-effective end-of-pipe solution to the problem of airborne PM.

Keywords: air filtration, particulate matter, particle size distribution, permeodynamic

Procedia PDF Downloads 204
3569 Potential of ᵞ-Polyglutamic Acid for Cadmium Toxicity Alleviation in Rice

Authors: N. Kotabin, Y. Tahara, K. Issakul, O. Chunhachart

Abstract:

Cadmium (Cd(II)) is one of the major toxic elemental pollutants, hazardous to humans, animals, and plants. γ-Polyglutamic acid (γ-PGA) is an extracellular biopolymer produced by several species of Bacillus that has been reported to be an effective biosorbent for metal ions. The effect of γ-PGA on the growth of rice under laboratory conditions was investigated. Rice seeds were germinated and then grown at 30±1°C on filter paper soaked with Cd solution and γ-PGA for 7 days. The results showed that Cd significantly inhibited growth, reducing root and shoot lengths. Fresh and dry weights also decreased compared with the control; however, the addition of 500 mg·L-1 γ-PGA protected rice seedlings from the adverse effects of Cd. The analysis of physiological traits revealed that Cd caused a decrease in total chlorophyll and soluble protein contents and amylase activities in all treatments. The Cd content in seedling tissues increased in the 250 μM Cd treatment (P < 0.05), but the addition of 500 mg·L-1 γ-PGA resulted in a noticeable decrease in Cd (P < 0.05).

Keywords: polyglutamic acid, cadmium, rice, Bacillus subtilis

Procedia PDF Downloads 299
3568 Probiotics’ Antibacterial Activity on Beef and Camel Minced Meat at Altered Ranges of Temperature

Authors: Rania Samir Zaki

Abstract:

Owing to their inhibitory effects, selected probiotic Lactobacilli may be used as antimicrobials against some hazardous microorganisms responsible for the spoilage of fresh beef (cattle) minced meat and camel minced meat. Lactic acid bacteria were isolated from camel meat. The isolates included 1 Lactobacillus fermenti, 4 Lactobacillus plantarum, 4 Lactobacillus bulgaricus, 3 Lactobacillus acidophilus, and 1 Lactobacillus brevis. The most efficient inhibitory organism was Lactobacillus plantarum, which can be used as a probiotic with antibacterial activity. All microbiological analyses were made at time 0 and on the first and second days, at different temperature ranges [4±2°C (chilling temperature), 25±2°C, and 38±2°C]. Results showed a significant decrease of pH from 6.2 to 5.1 in the various types of meat, in addition to reductions in Total Bacterial Count, Enterococci, Bacillus cereus, and Escherichia coli, together with stable Coliform counts and the absence of Staphylococcus aureus.

Keywords: antibacterial, camel meat, inhibition, probiotics

Procedia PDF Downloads 299
3567 Solid State Drive End to End Reliability Prediction, Characterization and Control

Authors: Mohd Azman Abdul Latif, Erwan Basiron

Abstract:

A flaw or drift from expected operational performance in one component (NAND, PMIC, controller, DRAM, etc.) may affect the reliability of the entire Solid State Drive (SSD) system. It is therefore important to ensure the required quality of each individual component through qualification testing specified by standards or user requirements. Qualification testing is time-consuming and comes at a substantial cost to product manufacturers. A highly technical team drawn from all the key stakeholders embarks on reliability prediction from the beginning of new product development, identifies critical-to-reliability parameters, performs full characterization to embed margin into product reliability, and establishes controls to ensure that product reliability is sustained in mass production. This paper discusses a comprehensive development framework that comprehends the SSD end to end, from design to assembly, in-line inspection, and in-line testing, and is able to predict and validate product reliability at the early stages of new product development. During the design stage, the SSD goes through an intense reliability margin investigation focused on assembly process attributes, process equipment control, and in-process metrology, while also comprehending the forward-looking product roadmap. Once these pillars are completed, the next step is to perform process characterization and build a reliability prediction model. Next, for the design validation process, a reliability prediction tool, specifically a solder joint simulator, is established. The SSDs are stratified into non-operating and operating tests, with a focus on solder joint reliability and connectivity/component latent failures, addressed by prevention through design intervention and by containment through the Temperature Cycle Test (TCT). Some of the SSDs are subjected to physical solder joint analysis, namely Dye and Pry (DP) and cross-section analysis.
The results are fed back to the simulation team for any corrective actions required to further improve the design. Once the SSD is validated and proven working, it moves to the monitor phase, in which the Design for Assembly (DFA) rules are updated. At this stage, the design changes and the process and equipment parameters are under control. Predictable product reliability early in product development enables on-time delivery of qualification samples to customers, optimizes product development validation, makes effective use of development resources, and avoids forced late investment to bandage end-of-life product failures. Understanding the critical-to-reliability parameters early allows focus on increasing the product margin, which increases customer confidence in product reliability.

Keywords: e2e reliability prediction, SSD, TCT, solder joint reliability, NUDD, connectivity issues, qualifications, characterization and control

Procedia PDF Downloads 174
3566 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) for a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big-data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where the items in a basket are fully connected. A cluster is a collection of fully connected items, i.e., a specific group of items that has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed vs. expected frequency), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge numbers of transactions can be represented and processed efficiently.
For the demonstration, a total of 13,254 metabolic syndrome training observations were fed into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, those associated with sociodemographics, habits, and activities. Some, such as cancer examination, house type, and vaccination, were intentionally included to gain predictive analytics insights into variable selection. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules were then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. A set of rules (many estimated equations, from a statistical perspective), on the other hand, may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that use a subset of the observations, such as bootstrap resampling with an appropriate sample size.
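To make the co-occurrence graph concrete, a minimal sketch (not the platform's actual implementation) can count pairwise arcs over transactions, with each row contributing a fully connected clique of its items; the item names below are invented for illustration:

```python
from itertools import combinations
from collections import Counter

def cooccurrence_counts(transactions):
    """Count how often each pair of items appears together in a row;
    each transaction contributes a fully connected clique of its items."""
    pairs = Counter()
    for basket in transactions:
        for a, b in combinations(sorted(set(basket)), 2):
            pairs[(a, b)] += 1
    return pairs

# Toy rows standing in for observations with many predictors each:
rows = [
    {"smoker", "low_activity", "metabolic_syndrome"},
    {"smoker", "low_activity", "metabolic_syndrome"},
    {"smoker", "high_activity"},
]
arcs = cooccurrence_counts(rows)
# A frequently co-occurring pair is a candidate rule (hypothesis):
print(arcs.most_common(1))
```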

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 276
3565 Evaluation of the Grammar Questions at the Undergraduate Level

Authors: Preeti Gacche

Abstract:

A considerable part of undergraduate-level English examination papers is devoted to grammar. Hence, the grammar questions in the question papers were evaluated, and the opinions of both students and teachers about them were obtained and analyzed. A grammar test of 100 marks was administered to 43 students to check their performance. The question papers were evaluated by 10 different teachers and their scores compared. The analysis of 38 university question papers reveals that, on average, 20 percent of marks are allotted to grammar. Almost all grammar topics are tested. Abundant use of grammatical terminology is observed in the questions. Decontextualization, repetition, the possibility of multiple correct answers, and grammatical errors in framing the questions have been observed. Opinions of teachers and students about grammar questions vary in many respects. The students' responses are analyzed by medium of instruction and by sex. The medium at the school level and the sex of the students are found to play no role as far as interest in the study of grammar is concerned. English-medium students solve grammar questions intuitively, whereas non-English-medium students need to recall the rules of grammar. Prepositions, verbs, articles, and modal auxiliaries are found to be easy topics for most students, whereas the use of conjunctions is the most difficult topic. Out-of-context grammar items are more difficult to answer than contextualized ones; hence, contextualized texts for testing grammar items are desirable. No formal training in setting questions is imparted to teachers by competent authorities such as the university, and they need to be trained in testing. Statistically, there is no significant change of score with a change of rater in the testing of grammar items. There is scope for future improvement: the question papers need to be evaluated, and feedback obtained from students and teachers.

Keywords: context, evaluation, grammar, tests

Procedia PDF Downloads 353
3564 Seismic Evaluation of Multi-Plastic Hinge Design Approach on RC Shear Wall-Moment Frame Systems against Near-Field Earthquakes

Authors: Mohsen Tehranizadeh, Mahboobe Forghani

Abstract:

The impact of higher modes on the seismic response of dual structural systems consisting of concrete moment-resisting frames and RC shear walls is investigated against near-field earthquakes in this paper. A 20-story reinforced concrete shear wall-special moment frame structure is designed in accordance with ASCE 7 requirements, and a nonlinear model of the structure is built on the OpenSees platform. Nonlinear time history dynamic analyses with 3 near-field records are performed. In order to further understand structural collapse behavior in the near field, the response of the structure at the moment of collapse, especially the formation of plastic hinges, is explored. The results revealed that, owing to the amplification of moment at the top of the wall due to higher modes, a plastic hinge can form in the upper part of the wall, even when it is designed and detailed for plastic hinging at the base only (according to the ACI code). On the other hand, shear forces in excess of capacity design values can develop due to the contribution of the higher modes of vibration to the dynamic response in the near field, causing brittle shear or sliding failure modes. Past investigations on shear walls clearly show that the dual-hinge design concept is effective at reducing the effects of the second mode of response. An advantage of the concept is that, when combined with capacity design, it can allow relaxation of special reinforcing detailing in large portions of the wall. In this study, to investigate the implications of the multi-hinge design approach, 4 models with varying arrangements of plastic hinges at the base and over the height of the shear wall are considered. Results based on the time history analyses show that the dual or multi plastic hinge approach can be useful for controlling the high moment and shear demands of the higher mode effect.

Keywords: higher mode effect, near-field earthquake, nonlinear time history analysis, multi plastic hinge design

Procedia PDF Downloads 430
3563 Predicting Foreign Direct Investment of IC Design Firms from Taiwan to East and South China Using Lotka-Volterra Model

Authors: Bi-Huei Tsai

Abstract:

This work explores the inter-region investment behaviors of the integrated circuit (IC) design industry from Taiwan to China using the amount of foreign direct investment (FDI). Given the mutual dependence among IC design industrial locations, the Lotka-Volterra model is utilized to explore the FDI interactions between South and East China, considering the effects of inter-regional collaborations on FDI flows into China. The evolution of FDI into South China by the IC design industry significantly inspires subsequent FDI into East China, while FDI into East China by Taiwan's IC design industry significantly hinders subsequent FDI into South China. The supply chain of the IC industry includes IC design, manufacturing, packaging, and testing enterprises. IC manufacturing, packaging, and testing industries depend on the IC design industry to gain advanced business benefits. The FDI amount from Taiwan's IC design industry into East China is the greatest among the four regions: North, East, Mid-West, and South China; the FDI amount into South China is the second largest. If IC design houses buy more equipment and bring more capital into South China, those in East China will be under pressure to undertake more FDI into East China to maintain the leading-position advantages of the supply chain there. On the other hand, as FDI in East China rises, FDI in South China will successively decline, since capital becomes concentrated in East China. The predictions of the Lotka-Volterra model for FDI trends are accurate because the industrial interactions between the two regions are included. Finally, this work confirms that the FDI flows cannot reach a stable equilibrium point, so the FDI inflows into East and South China will expand in the future.
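The competitive Lotka-Volterra dynamics underlying this kind of analysis can be sketched with a simple Euler integration. The coefficients and initial FDI stocks below are invented for illustration and are not the values fitted in the study (in particular, they happen to converge, whereas the study finds no stable equilibrium):

```python
def lotka_volterra_step(x_east, x_south, dt=0.01):
    """One Euler step of a two-region Lotka-Volterra system.
    Coefficients are illustrative: South-China FDI stimulates East-China
    FDI (positive cross term), while East-China FDI suppresses
    South-China FDI (negative cross term)."""
    a1, b1, c1 = 0.8, 0.01, +0.005   # East: growth, self-limitation, boost from South
    a2, b2, c2 = 0.6, 0.01, -0.004   # South: growth, self-limitation, drag from East
    dx_east = x_east * (a1 - b1 * x_east + c1 * x_south)
    dx_south = x_south * (a2 - b2 * x_south + c2 * x_east)
    return x_east + dt * dx_east, x_south + dt * dx_south

east, south = 10.0, 8.0   # hypothetical initial FDI stocks
for _ in range(2000):     # integrate 20 time units
    east, south = lotka_volterra_step(east, south)
print(round(east, 1), round(south, 1))
```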

Keywords: Lotka-Volterra model, foreign direct investment, competition, equilibrium analysis

Procedia PDF Downloads 363
3562 In vitro Skin Model for Enhanced Testing of Antimicrobial Textiles

Authors: Steven Arcidiacono, Robert Stote, Erin Anderson, Molly Richards

Abstract:

There are numerous standard test methods for antimicrobial textiles that measure activity against specific microorganisms. However, these results often do not translate to the performance of treated textiles when worn by individuals. Standard test methods apply a single target organism, grown under optimal conditions, to a textile, then recover the organism to quantitate it and determine activity; this does not reflect the actual performance environment, which consists of polymicrobial communities under less-than-optimal conditions, or the interaction of the textile with the skin substrate. Here we propose the development of an in vitro skin model method to bridge the gap between laboratory testing and wear studies. The model will consist of a defined polymicrobial community of 5-7 commensal microbes simulating the skin microbiome, seeded onto a solid tissue platform representing the skin. The protocol would entail adding a non-commensal test organism of interest to the defined community and applying a textile sample to the solid substrate. Following incubation, the textile would be removed and the organisms recovered and quantitated to determine antimicrobial activity. Important parameters to consider include the identification and assembly of the defined polymicrobial community, growth conditions that allow the establishment of a stable community, and the choice of skin surrogate. This model could answer the following questions: 1) is the treated textile effective against the target organism? 2) how is the defined community affected? and 3) does the textile cause unwanted effects on the skin simulant? The proposed model would determine activity under conditions comparable to the intended application and provide expanded knowledge relative to current test methods.

Keywords: antimicrobial textiles, defined polymicrobial community, in vitro skin model, skin microbiome

Procedia PDF Downloads 137
3561 Qualitative Analysis of Current Child Custody Evaluation Practices

Authors: Carolyn J. Ortega, Stephen E. Berger

Abstract:

The role of the custody evaluator is perhaps one of the most controversial and risky endeavors in clinical practice. Complaints filed with licensing boards regarding a child-custody evaluation constitute the second most common reason for such an event. Although the evaluator is expected to answer for the family-law court what is in the “best interest of the child,” there is a lack of clarity on how to establish this in any empirically validated manner. Hence, practitioners must contend with a nebulous framework in formulating their methodological procedures that inherently places them at risk in an already litigious context. This study sought to qualitatively investigate patterns of practice among doctoral practitioners conducting child custody evaluations in the area of Southern California. Ten psychologists were interviewed who devoted between 25 and 100% of their California private practice to custody work. All held Ph.D. degrees with a range of eight to 36 years of experience in custody work. Semi-structured interviews were used to investigate assessment practices, ensure adherence to guidelines, risk management, and qualities of evaluators. Forty-three Specific Themes were identified using Interpretive Phenomenological Analysis (IPA). Seven Higher Order Themes clustered on salient factors such as use of Ethics, Law, Guidelines; Parent Variables; Child Variables; Psychologist Variables; Testing; Literature; and Trends. Evaluators were aware of the ever-present reality of a licensure complaint and thus presented idiosyncratic descriptions of risk management considerations. Ambiguity about quantifying and validly tapping parenting abilities was also reviewed. Findings from this study suggested a high reliance on unstructured and observational methods in child custody practices.

Keywords: forensic psychology, psychological testing, assessment methodology, child custody

Procedia PDF Downloads 284
3560 Modeling of Thermally Induced Acoustic Emission Memory Effects in Heterogeneous Rocks with Consideration for Fracture Development

Authors: Vladimir A. Vinnikov

Abstract:

The paper proposes a model of an inhomogeneous rock mass with an initially random distribution of microcracks on mineral grain boundaries. It describes the behavior of cracks in a medium under the effect of a thermal field, with the medium heated instantaneously to a predetermined temperature. Crack growth occurs according to the concepts of fracture mechanics, provided that the stress intensity factor K exceeds the critical value Kc. The modeling of thermally induced acoustic emission memory effects is based on the assumption that every event of crack nucleation or crack growth caused by heating is accompanied by a single acoustic emission event. Parameters of the thermally induced acoustic emission memory effect produced by cyclic heating and cooling (with the temperature amplitude increasing from cycle to cycle) were calculated for several rock texture types (massive, banded, and disseminated). The study substantiates the adaptation of the proposed model to humidity interference with the thermally induced acoustic emission memory effect, and the influence of humidity on the effect in quasi-homogeneous and banded rocks is estimated. It is shown that such modeling allows the structure and texture of rocks to be taken into account and the influence of interference factors on the distinctness of the thermally induced acoustic emission memory effect to be estimated. The numerical modeling can be used to obtain information about past thermal impacts on rocks and to determine the degree of rock disturbance by means of non-destructive testing.
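The growth criterion described above (crack extension once K exceeds Kc) can be sketched as follows; the fracture-mechanics form K = Y·σ·√(πa) and all numerical values are illustrative assumptions, not parameters from the paper:

```python
import math

def stress_intensity(sigma_pa, a_m, y=1.12):
    """Mode-I stress intensity factor K = Y * sigma * sqrt(pi * a)
    for a shallow surface crack of depth a under stress sigma (Y ~ 1.12)."""
    return y * sigma_pa * math.sqrt(math.pi * a_m)

def grows(sigma_pa, a_m, k_ic):
    """Crack growth criterion of the model: K exceeds the critical value Kc."""
    return stress_intensity(sigma_pa, a_m) >= k_ic

# Illustrative numbers (not from the paper): thermal stress in a fully
# constrained mineral grain, sigma = E * alpha * dT.
E, alpha, dT = 70e9, 8e-6, 100.0           # Pa, 1/K, K
sigma = E * alpha * dT                      # 56 MPa
print(grows(sigma, a_m=100e-6, k_ic=1.0e6))  # -> True (K ~ 1.1 MPa*sqrt(m))
print(grows(sigma, a_m=10e-6, k_ic=1.0e6))   # -> False
```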

Keywords: degree of rock disturbance, non-destructive testing, thermally induced acoustic emission memory effects, structure and texture of rocks

Procedia PDF Downloads 263