Search results for: pharmacodynamic testing
2516 Modelling of Geotechnical Data Using Geographic Information System and MATLAB for Eastern Ahmedabad City, Gujarat
Authors: Rahul Patel
Abstract:
Ahmedabad, a city located in western India, is experiencing rapid growth due to urbanization and industrialization. It is projected to become a metropolitan city in the near future, resulting in various construction activities. Soil testing is necessary before construction can commence, requiring construction companies and contractors to periodically conduct soil testing. The focus of this study is on the process of creating a spatial database that is digitally formatted and integrated with geotechnical data and a Geographic Information System (GIS). Building a comprehensive geotechnical (Geo)-database involves three steps: collecting borehole data from reputable sources, verifying the accuracy and redundancy of the data, and standardizing and organizing the geotechnical information for integration into the database. Once the database is complete, it is integrated with GIS, allowing users to visualize, analyze, and interpret geotechnical information spatially. Using a Topographic to Raster interpolation process in GIS, estimated values are assigned to all locations based on sampled geotechnical data values. The study area was contoured for SPT N-Values, Soil Classification, Φ-Values, and Bearing Capacity (T/m2). Various interpolation techniques were cross-validated to ensure information accuracy. This GIS map enables the calculation of SPT N-Values, Φ-Values, and bearing capacities for different footing widths and various depths. This study highlights the potential of GIS in providing an efficient solution to complex phenomena that would otherwise be tedious to achieve through other means. Not only does GIS offer greater accuracy, but it also generates valuable information that can be used as input for correlation analysis. Furthermore, this system serves as a decision support tool for geotechnical engineers.Keywords: ArcGIS, borehole data, geographic information system, geo-database, interpolation, SPT N-value, soil classification, Φ-Value, bearing capacity
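The contouring step described above rests on spatial interpolation of sampled borehole values onto a grid. A minimal, illustrative sketch of that idea in Python (using SciPy's griddata rather than the ArcGIS Topo to Raster tool actually used in the study, and with made-up borehole coordinates and SPT N-values) is:

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical borehole locations (x, y in metres) and SPT N-values (not study data).
boreholes = np.array([[100.0, 200.0], [450.0, 180.0], [300.0, 520.0], [620.0, 410.0]])
spt_n = np.array([12.0, 25.0, 18.0, 30.0])

# Regular grid spanning the study area.
gx, gy = np.mgrid[0:700:50j, 0:600:50j]

# Interpolate SPT N-values onto the grid (linear interpolation inside the convex hull).
grid_n = griddata(boreholes, spt_n, (gx, gy), method="linear")
print(np.nanmin(grid_n), np.nanmax(grid_n))
```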
Procedia PDF Downloads 74
2515 Study of Ageing in the Marine Environment of Bonded Composite Structures by Ultrasonic Guided Waves. Comparison of the Case of a Conventional Carbon-epoxy Composite and a Recyclable Resin-Based Composite
Authors: Hamza Hafidi Alaoui, Damien Leduc, Mounsif Ech Cherif El Kettani
Abstract:
This study is dedicated to the evaluation of the ageing of turbine blades in sea conditions, based on ultrasonic Non-Destructive Testing (NDT) methods. This study is being developed within the framework of the European Interreg TIGER project. The Tidal Stream Industry Energiser Project, known as TIGER, is the biggest ever Interreg project driving collaboration and cost reduction through tidal turbine installations in the UK and France. The TIGER project will drive the growth of tidal stream energy to become a greater part of the energy mix, with significant benefits for coastal communities. In the bay of Paimpol-Bréhat (Brittany), different samples of composite material and bonded composite/composite structures have been immersed at the same time near a turbine. The studied samples are either conventional carbon-epoxy composite samples or composite samples based on a recyclable resin (called Recyclamine). One of the objectives of the study is to compare the ageing of the two types of structure. A sample of each structure is picked up every 3 to 6 months and analyzed using ultrasonic guided waves and bulk waves and compared to reference samples. In order to classify the damage level as a function of time spent under the sea, the measurements have been compared to a rheological model based on the Finite Element Method (FEM). Ageing of the composite material, as well as that of the adhesive, is identified. The aim is to improve the quality of the turbine blade structure in terms of longevity and reduced maintenance needs.
Keywords: non-destructive testing, ultrasound, composites, guided waves
Procedia PDF Downloads 221
2514 Influence of Magnetized Water on the Split Tensile Strength of Concrete
Authors: Justine Cyril E. Nunag, Nestor B. Sabado Jr., Jienne Chester M. Tolosa
Abstract:
Concrete has high compressive strength but low tensile strength. The small tensile strength of concrete is regarded as its primary weakness, which is why it is typically reinforced with steel, a material that is resistant to tension. Even with steel, however, cracking can occur. In strengthening concrete, only a few researchers have modified the water to be used in a concrete mix. This study aims to compare the split tensile strength of normal structural concrete to concrete prepared with magnetic water and a quick-setting admixture. In this context, magnetic water is defined as tap water that has undergone a magnetic process to become magnetized water. To test the hypothesis that magnetized concrete leads to higher split tensile strength, twenty concrete specimens were made. There were five groups, each with five samples, that were differentiated by the number of cycles (0, 50, 100, and 150). The split tensile strength data from the Universal Testing Machine were then analyzed using various statistical models and tests to determine the significant effect of magnetized water. The result showed a moderate (+0.579) but still significant degree of correlation. The researchers also discovered that using magnetic water for 50 cycles did not result in a significant increase in the concrete's split tensile strength, which influenced the analysis of variance. These results suggest that a concrete mix containing magnetic water and a quick-setting admixture alters the typical split tensile strength of normal concrete. Magnetic water has a significant impact on concrete tensile strength. The hardness property of magnetic water influenced the split tensile strength of concrete. In addition, a higher number of cycles results in stronger water magnetism. The laboratory test results show that a higher cycle count translates to a higher tensile strength.
Keywords: hardness property, magnetic water, quick-setting admixture, split tensile strength, universal testing machine
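The split tensile strength reported from a universal testing machine trial is conventionally computed from the peak load and the cylinder geometry. The abstract does not give the specimen dimensions or loads, so the values in the sketch below are illustrative assumptions only:

```python
import math

def split_tensile_strength(peak_load_n: float, length_mm: float, diameter_mm: float) -> float:
    """Splitting tensile strength T = 2P / (pi * L * D), returned in MPa."""
    return 2.0 * peak_load_n / (math.pi * length_mm * diameter_mm)

# Illustrative values only (not taken from the study):
# a 150 mm x 300 mm cylinder failing at a 200 kN peak load.
print(round(split_tensile_strength(200_000, 300, 150), 2), "MPa")
```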
Procedia PDF Downloads 146
2513 Solid State Drive End to End Reliability Prediction, Characterization and Control
Authors: Mohd Azman Abdul Latif, Erwan Basiron
Abstract:
A flaw or drift from expected operational performance in one component (NAND, PMIC, controller, DRAM, etc.) may affect the reliability of the entire Solid State Drive (SSD) system. Therefore, it is important to ensure the required quality of each individual component through qualification testing specified using standards or user requirements. Qualification testing is time-consuming and comes at a substantial cost for product manufacturers. A highly technical team drawn from all the eminent stakeholders embarks on reliability prediction from the beginning of new product development, identifies critical-to-reliability parameters, performs full-blown characterization to embed margin into product reliability, and establishes controls to ensure that product reliability is sustainable in mass production. The paper will discuss a comprehensive development framework, comprehending the SSD end to end from design to assembly, in-line inspection and in-line testing, that is able to predict and validate product reliability at the early stage of new product development. During the design stage, the SSD goes through an intense reliability margin investigation with a focus on assembly process attributes, process equipment control, in-process metrology, and also a forward-looking product roadmap. Once these pillars are completed, the next step is to perform process characterization and build up a reliability prediction model. Next, for the design validation process, the reliability prediction, specifically a solder joint simulator, will be established. The SSDs will be stratified into Non-Operating and Operating tests with a focus on solder joint reliability and connectivity/component latent failures, through prevention by design intervention and containment through the Temperature Cycle Test (TCT). Some of the SSDs will be subjected to physical solder joint analysis, namely Dye and Pry (DP) and cross-section analysis. The results will be fed back to the simulation team for any corrective actions required to further improve the design. Once the SSD is validated and proven working, it will be subjected to the monitor phase, whereby Design for Assembly (DFA) rules will be updated. At this stage, the design change, process, and equipment parameters are under control. Predictable product reliability at early product development will enable on-time qualification sample delivery to the customer, optimize product development validation and development resources, and avoid forced late investment to bandage end-of-life product failures. Understanding the critical-to-reliability parameters earlier will allow focus on increasing the product margin, which will increase customer confidence in product reliability.
Keywords: e2e reliability prediction, SSD, TCT, solder joint reliability, NUDD, connectivity issues, qualifications, characterization and control
Procedia PDF Downloads 174
2512 Predictive Analytics for Theory Building
Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim
Abstract:
Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods in statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big data predictive analytics platform™ based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows in a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training records are plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, associated with sociodemographics, habits, and activities. Some are intentionally included to gain predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. Then the rules are validated with an external testing dataset including 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the hypotheses generated are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods utilizing a part of the observations, such as bootstrap resampling with an appropriate sample size.
Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building
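As an informal illustration of the co-occurrence idea described above (not the platform's actual algorithm, which the abstract does not disclose), item pairs can be counted across transactions and ranked by frequency and by "surprise", i.e., the observed co-occurrence versus the count expected if the items were independent. All item names below are hypothetical:

```python
from collections import Counter
from itertools import combinations

# Hypothetical transactions: each row lists the binary predictors present for one person.
rows = [
    {"high_waist", "high_bp", "smoker"},
    {"high_waist", "high_bp"},
    {"high_waist", "low_activity"},
    {"high_bp", "smoker"},
]

item_count = Counter()
pair_count = Counter()
for row in rows:
    item_count.update(row)
    pair_count.update(combinations(sorted(row), 2))

n = len(rows)
for (a, b), observed in pair_count.most_common():
    expected = item_count[a] * item_count[b] / n  # expected count under independence
    surprise = observed / expected
    print(f"{a} & {b}: observed={observed}, expected={expected:.2f}, surprise={surprise:.2f}")
```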
Procedia PDF Downloads 277
2511 Evaluation of the Grammar Questions at the Undergraduate Level
Authors: Preeti Gacche
Abstract:
A considerable part of undergraduate-level English examination papers is devoted to grammar. Hence, the grammar questions in the question papers are evaluated, and the opinions of both students and teachers about them are obtained and analyzed. A grammar test of 100 marks is administered to 43 students to check their performance. The question papers have been evaluated by 10 different teachers and their scores compared. The analysis of 38 university question papers reveals that, on average, 20 percent of the marks are allotted to grammar. Almost all the grammar topics are tested. Abundant use of grammatical terminology is observed in the questions. Decontextualization, repetition, the possibility of multiple correct answers, and grammatical errors in framing the questions have been observed. Opinions of teachers and students about grammar questions vary in many respects. The students' responses are analyzed medium-wise and sex-wise. The medium at the school level and the sex of the students are found to play no role as far as interest in the study of grammar is concerned. English-medium students solve grammar questions intuitively, whereas non-English-medium students are required to recollect the rules of grammar. Prepositions, verbs, articles and modal auxiliaries are found to be easy topics for most students, whereas the use of conjunctions is the most difficult topic. Out-of-context grammar items are difficult to answer in comparison with contextualized grammar items. Hence, contextualized texts to test grammar items are desirable. No formal training in setting questions is imparted to teachers by competent authorities like the university. They need to be trained in testing. Statistically, there is no significant change of score with a change in the rater in the testing of grammar items. There is scope for future improvement. The question papers need to be evaluated, and feedback needs to be obtained from students and teachers for future improvement.
Keywords: context, evaluation, grammar, tests
Procedia PDF Downloads 354
2510 Predicting Foreign Direct Investment of IC Design Firms from Taiwan to East and South China Using Lotka-Volterra Model
Authors: Bi-Huei Tsai
Abstract:
This work explores the inter-region investment behaviors of the integrated circuit (IC) design industry from Taiwan to China using the amount of foreign direct investment (FDI). Given the mutual dependence among different IC design industrial locations, the Lotka-Volterra model is utilized to explore the FDI interactions between South and East China. Effects of inter-regional collaborations on FDI flows into China are considered. Evolutions of FDIs into South China for the IC design industry significantly inspire the subsequent FDIs into East China, while FDIs into East China for Taiwan's IC design industry significantly hinder the subsequent FDIs into South China. The supply chain along the IC industry includes IC design, manufacturing, packaging and testing enterprises. IC manufacturing, packaging and testing industries depend on the IC design industry to gain advanced business benefits. The FDI amount from Taiwan's IC design industry into East China is the greatest among the four regions: North, East, Mid-West and South China. The FDI amount from Taiwan's IC design industry into South China is the second largest. If IC design houses buy more equipment and bring more capital into South China, those in East China will have pressure to undertake more FDIs into East China to maintain the leading-position advantages of the supply chain in East China. On the other hand, as the FDIs in East China rise, the FDIs in South China will successively decline since capital has concentrated in East China. The prediction of the Lotka-Volterra model in FDI trends is accurate because the industrial interactions between the two regions are included. Finally, this work confirms that the FDI flows cannot reach a stable equilibrium point, so the FDI inflows into East and South China will expand in the future.
Keywords: Lotka-Volterra model, foreign direct investment, competition, equilibrium analysis
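For readers unfamiliar with the model, a two-region Lotka-Volterra competition system of the general form used in such diffusion and competition studies can be integrated numerically as sketched below. The coefficients are purely illustrative placeholders, not the parameters estimated in the paper:

```python
import numpy as np
from scipy.integrate import odeint

def lotka_volterra(f, t, a1, b11, b12, a2, b21, b22):
    """Two-region competition model: f = (FDI into East China, FDI into South China)."""
    x, y = f
    dxdt = x * (a1 + b11 * x + b12 * y)  # East China FDI growth
    dydt = y * (a2 + b21 * x + b22 * y)  # South China FDI growth
    return [dxdt, dydt]

# Illustrative coefficients only: a positive b12 encodes South-China FDI stimulating
# East-China FDI, a negative b21 encodes the hindering effect described in the abstract.
params = (0.8, -0.01, 0.004, 0.6, -0.006, -0.012)
t = np.linspace(0, 20, 200)
trajectory = odeint(lotka_volterra, [5.0, 3.0], t, args=params)
print(trajectory[-1])  # FDI levels at the end of the horizon
```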
Procedia PDF Downloads 363
2509 In vitro Skin Model for Enhanced Testing of Antimicrobial Textiles
Authors: Steven Arcidiacono, Robert Stote, Erin Anderson, Molly Richards
Abstract:
There are numerous standard test methods for antimicrobial textiles that measure activity against specific microorganisms. However, many times these results do not translate to the performance of treated textiles when worn by individuals. Standard test methods apply a single target organism grown under optimal conditions to a textile, then recover the organism to quantitate and determine activity; this does not reflect the actual performance environment that consists of polymicrobial communities in less than optimal conditions or interaction of the textile with the skin substrate. Here we propose the development of in vitro skin model method to bridge the gap between lab testing and wear studies. The model will consist of a defined polymicrobial community of 5-7 commensal microbes simulating the skin microbiome, seeded onto a solid tissue platform to represent the skin. The protocol would entail adding a non-commensal test organism of interest to the defined community and applying a textile sample to the solid substrate. Following incubation, the textile would be removed and the organisms recovered, which would then be quantitated to determine antimicrobial activity. Important parameters to consider include identification and assembly of the defined polymicrobial community, growth conditions to allow the establishment of a stable community, and choice of skin surrogate. This model could answer the following questions: 1) is the treated textile effective against the target organism? 2) How is the defined community affected? And 3) does the textile cause unwanted effects toward the skin simulant? The proposed model would determine activity under conditions comparable to the intended application and provide expanded knowledge relative to current test methods.Keywords: antimicrobial textiles, defined polymicrobial community, in vitro skin model, skin microbiome
Procedia PDF Downloads 139
2508 Qualitative Analysis of Current Child Custody Evaluation Practices
Authors: Carolyn J. Ortega, Stephen E. Berger
Abstract:
The role of the custody evaluator is perhaps one of the most controversial and risky endeavors in clinical practice. Complaints filed with licensing boards regarding a child-custody evaluation constitute the second most common reason for such an event. Although the evaluator is expected to answer for the family-law court what is in the “best interest of the child,” there is a lack of clarity on how to establish this in any empirically validated manner. Hence, practitioners must contend with a nebulous framework in formulating their methodological procedures that inherently places them at risk in an already litigious context. This study sought to qualitatively investigate patterns of practice among doctoral practitioners conducting child custody evaluations in the area of Southern California. Ten psychologists were interviewed who devoted between 25 and 100% of their California private practice to custody work. All held Ph.D. degrees with a range of eight to 36 years of experience in custody work. Semi-structured interviews were used to investigate assessment practices, ensure adherence to guidelines, risk management, and qualities of evaluators. Forty-three Specific Themes were identified using Interpretive Phenomenological Analysis (IPA). Seven Higher Order Themes clustered on salient factors such as use of Ethics, Law, Guidelines; Parent Variables; Child Variables; Psychologist Variables; Testing; Literature; and Trends. Evaluators were aware of the ever-present reality of a licensure complaint and thus presented idiosyncratic descriptions of risk management considerations. Ambiguity about quantifying and validly tapping parenting abilities was also reviewed. Findings from this study suggested a high reliance on unstructured and observational methods in child custody practices.Keywords: forensic psychology, psychological testing, assessment methodology, child custody
Procedia PDF Downloads 285
2507 Modeling of Thermally Induced Acoustic Emission Memory Effects in Heterogeneous Rocks with Consideration for Fracture Development
Authors: Vladimir A. Vinnikov
Abstract:
The paper proposes a model of an inhomogeneous rock mass with initially random distribution of microcracks on mineral grain boundaries. It describes the behavior of cracks in a medium under the effect of thermal field, the medium heated instantaneously to a predetermined temperature. Crack growth occurs according to the concept of fracture mechanics provided that the stress intensity factor K exceeds the critical value of Kc. The modeling of thermally induced acoustic emission memory effects is based on the assumption that every event of crack nucleation or crack growth caused by heating is accompanied by a single acoustic emission event. Parameters of the thermally induced acoustic emission memory effect produced by cyclic heating and cooling (with the temperature amplitude increasing from cycle to cycle) were calculated for several rock texture types (massive, banded, and disseminated). The study substantiates the adaptation of the proposed model to humidity interference with the thermally induced acoustic emission memory effect. The influence of humidity on the thermally induced acoustic emission memory effect in quasi-homogeneous and banded rocks is estimated. It is shown that such modeling allows the structure and texture of rocks to be taken into account and the influence of interference factors on the distinctness of the thermally induced acoustic emission memory effect to be estimated. The numerical modeling can be used to obtain information about the thermal impacts on rocks in the past and determine the degree of rock disturbance by means of non-destructive testing.Keywords: degree of rock disturbance, non-destructive testing, thermally induced acoustic emission memory effects, structure and texture of rocks
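In the fracture-mechanics criterion cited above, a crack of half-length a growing under the thermally induced stress σ is assumed to extend (and to emit one acoustic emission event) when the stress intensity factor reaches its critical value. In the standard mode-I form (the geometry factor Y depends on the crack and grain-boundary configuration assumed in the model, and is an assumption here, not a quantity given in the abstract):

```latex
K = Y\,\sigma\sqrt{\pi a}, \qquad \text{crack growth (one AE event) when } K \ge K_c .
```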
Procedia PDF Downloads 264
2506 A Retrospective Cross-Sectional Study on the Prevalence and Factors Associated with Virological Non-Suppression among HIV-Positive Adult Patients on Antiretroviral Therapy in Woliso Town, Oromia, Ethiopia
Authors: Teka Haile, Behailu Hawulte, Solomon Alemayehu
Abstract:
Background: HIV virological failure still remains a problem in HIV/AIDS treatment and care. This study aimed to describe the prevalence and identify the factors associated with viral non-suppression among HIV-positive adult patients on antiretroviral therapy in Woliso Town, Oromia, Ethiopia. Methods: A retrospective cross-sectional study was conducted among 424 HIV-positive patients attending antiretroviral therapy (ART) in Woliso Town during the period from August 25, 2020 to August 30, 2020. Data collected from patient medical records were entered into Epi Info version 2.3.2.1 and exported to SPSS version 21.0 for analysis. Logistic regression analysis was done to identify factors associated with viral load non-suppression, and the statistical significance of odds ratios was declared using a 95% confidence interval and a p-value < 0.05. Results: A total of 424 patients were included in this study. The mean age (± SD) of the study participants was 39.88 (± 9.995) years. The prevalence of HIV viral load non-suppression was 55 (13.0%) with 95% CI (9.9-16.5). A second-line ART treatment regimen (Adjusted Odds Ratio (AOR) = 8.98, 95% Confidence Interval (CI): 2.64, 30.58) and routine viral load testing (AOR = 0.01, 95% CI: 0.001, 0.02) were significantly associated with virological non-suppression. Conclusion: Virological non-suppression was high, which hinders the achievement of the third global 95 target. The second-line regimen and routine viral load testing were significantly associated with virological non-suppression. This suggests the need to assess the effectiveness of antiretroviral drugs for epidemic control. It also clearly shows the need to decentralize third-line ART treatment for those patients in need.
Keywords: virological non-suppression, HIV-positive, ART, Woliso town, Ethiopia
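The adjusted odds ratios quoted above come from a multivariable logistic regression. A minimal sketch of how such estimates and their 95% confidence intervals are typically derived, here with statsmodels on synthetic patient-level records (the study's own data are not available, so the effect sizes below are arbitrary), is:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
second_line = rng.integers(0, 2, n)     # 1 = second-line ART regimen
routine_vl = rng.integers(0, 2, n)      # 1 = routine viral load testing

# Illustrative effects only: second-line raises, routine testing lowers, the odds.
logit = -1.5 + 1.8 * second_line - 2.0 * routine_vl
p = 1 / (1 + np.exp(-logit))
non_suppressed = rng.binomial(1, p)     # 1 = viral load not suppressed

X = sm.add_constant(pd.DataFrame({"second_line_art": second_line,
                                  "routine_vl_test": routine_vl}))
model = sm.Logit(non_suppressed, X).fit(disp=False)

summary = pd.DataFrame({"AOR": np.exp(model.params),
                        "CI_low": np.exp(model.conf_int()[0]),
                        "CI_high": np.exp(model.conf_int()[1])})
print(summary)
```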
Procedia PDF Downloads 153
2505 Energy Retrofitting Application Research to Achieve Energy Efficiency in Hot-Arid Climates in Residential Buildings: A Case Study of Saudi Arabia
Authors: A. Felimban, A. Prieto, U. Knaack, T. Klein
Abstract:
This study aims to present an overview of recent research in building energy-retrofitting strategy applications and analyzing them within the context of hot arid climate regions which is in this case study represented by the Kingdom of Saudi Arabia. The main goal of this research is to do an analytical study of recent research approaches to show where the primary gap in knowledge exists and outline which possible strategies are available that can be applied in future research. Also, the paper focuses on energy retrofitting strategies at a building envelop level. The study is limited to specific measures within the hot arid climate region. Scientific articles were carefully chosen as they met the expression criteria, such as retrofitting, energy-retrofitting, hot-arid, energy efficiency, residential buildings, which helped narrow the research scope. Then the papers were explored through descriptive analysis and justified results within the Saudi context in order to draw an overview of future opportunities from the field of study for the last two decades. The conclusions of the analysis of the recent research confirmed that the field of study had a research shortage on investigating actual applications and testing of newly introduced energy efficiency applications, lack of energy cost feasibility studies and there was also a lack of public awareness. In terms of research methods, it was found that simulation software was a major instrument used in energy retrofitting application research. The main knowledge gaps that were identified included the need for certain research regarding actual application testing; energy retrofitting strategies application feasibility; the lack of research on the importance of how strategies apply first followed by the user acceptance of developed scenarios.Keywords: energy efficiency, energy retrofitting, hot arid, Saudi Arabia
Procedia PDF Downloads 125
2504 A Handheld Light Meter Device for Methamphetamine Detection in Oral Fluid
Authors: Anindita Sen
Abstract:
Oral fluid is a promising diagnostic matrix for drugs of abuse compared to urine and serum. Detection of methamphetamine in oral fluid would pave way for the easy evaluation of impairment in drivers during roadside drug testing as well as ensure safe working environments by facilitating evaluation of impairment in employees at workplaces. A membrane-based point-of-care (POC) friendly pre-treatment technique has been developed which aided elimination of interferences caused by salivary proteins and facilitated the demonstration of methamphetamine detection in saliva using a gold nanoparticle based colorimetric aptasensor platform. It was found that the colorimetric response in saliva was always suppressed owing to the matrix effects. By navigating the challenging interfering issues in saliva, we were successfully able to detect methamphetamine at nanomolar levels in saliva offering immense promise for the translation of these platforms for on-site diagnostic systems. This subsequently motivated the development of a handheld portable light meter device that can reliably transduce the aptasensors colorimetric response into absorbance, facilitating quantitative detection of analyte concentrations on-site. This is crucial due to the prevalent unreliability and sensitivity problems of the conventional drug testing kits. The fabricated light meter device response was validated against a standard UV-Vis spectrometer to confirm reliability. The portable and cost-effective handheld detector device features sensitivity comparable to the well-established UV-Vis benchtop instrument and the easy-to-use device could potentially serve as a prototype for a commercial device in the future.Keywords: aptasensors, colorimetric gold nanoparticle assay, point-of-care, oral fluid
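A handheld light meter of this kind typically converts a measured transmitted intensity to absorbance with the Beer-Lambert relation and reads concentration off a calibration curve. The sketch below only illustrates that conversion; the intensities, calibration points and concentrations are hypothetical, not the device's actual calibration data:

```python
import math
import numpy as np

def absorbance(i_blank: float, i_sample: float) -> float:
    """Beer-Lambert absorbance A = log10(I0 / I) from light-meter intensities."""
    return math.log10(i_blank / i_sample)

# Hypothetical calibration points: (methamphetamine concentration in nM, assay absorbance).
conc = np.array([0.0, 50.0, 100.0, 200.0])
absn = np.array([0.050, 0.120, 0.190, 0.330])
slope, intercept = np.polyfit(conc, absn, 1)   # linear calibration fit

a_unknown = absorbance(i_blank=1000.0, i_sample=720.0)
print(f"A = {a_unknown:.3f} -> ~{(a_unknown - intercept) / slope:.0f} nM (illustrative only)")
```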
Procedia PDF Downloads 62
2503 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis
Authors: Wenbo Du, Xiaomei Ma
Abstract:
With the empowerment offered by Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to dig out more diagnostic information. What is noticeable is that most of the extant empirical CDA-based research puts much emphasis on individual-level diagnostic purposes, with very few studies concerned about learners' group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked in that it might be more practical in a classroom setting. Additionally, the group-level diagnostic information obtained via current CDA always results in a "flat pattern", that is, the mastery/non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome does not bring much more benefit than the original total score. To address these issues, the present study attempts to apply cluster analysis for group classification and quantile regression analysis to pinpoint learners' performance at different proficiency levels (beginner, intermediate and advanced), thus enhancing the interpretation of the CDA results extracted from a group of EFL learners' reading performance on a diagnostic reading test designed by the PELDiaG research team from a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than those of CDA, and that quantile regression analysis pictures more insightful characteristics of learners with different reading proficiencies. The findings are helpful and practical for instructors refining the EFL reading curriculum and instructional plans tailored to the group classification results and quantile regression analysis. Meanwhile, these innovative statistical methods could also make up for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression
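A minimal sketch of the two techniques named above, using scikit-learn's Gaussian mixture (EM) clustering and statsmodels' quantile regression on synthetic scores (the PELDiaG data are not public, so the variables below are stand-ins), might look like this:

```python
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
skill = rng.normal(0, 1, n)                     # stand-in for a latent reading ability
total = 50 + 10 * skill + rng.normal(0, 5, n)   # diagnostic test total score

df = pd.DataFrame({"skill": skill, "total": total})

# EM clustering into three proficiency groups (beginner / intermediate / advanced).
gm = GaussianMixture(n_components=3, random_state=0)
df["group"] = gm.fit_predict(df[["total"]])

# Quantile regression of the total score on the skill variable at three quantiles.
for q in (0.25, 0.50, 0.75):
    fit = smf.quantreg("total ~ skill", df).fit(q=q)
    print(f"q={q}: slope={fit.params['skill']:.2f}")
```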
Procedia PDF Downloads 146
2502 Pros and Cons of Nanoparticles on Health
Authors: Amber Shahi, Ayesha Tazeen, Abdus Samad, Shama Parveen
Abstract:
Nanoparticles (NPs) are tiny particles. According to the International Organization for Standardization, the size range of NPs is in the nanometer range (1-100 nm). They show distinct properties that are not shown by larger particles of the same material. NPs are currently being used in different fields due to their unique physicochemical nature. NPs are a boon for medical sciences, environmental sciences, electronics, and textile industries. However, there is growing concern about their potential adverse effects on human health. This poster presents a comprehensive review of the current literature on the pros and cons of NPs on human health. The poster will discuss the various types of interactions of NPs with biological systems. There are a number of beneficial uses of NPs in the field of health and environmental welfare. NPs are very useful in disease diagnosis, antimicrobial action, and the treatment of diseases like Alzheimer’s. They can also cross the blood-brain barrier, making them capable of treating brain diseases. Additionally, NPs can target specific tumors and be used for cancer treatment. To treat environmental health, NPs also act as catalytic converters to reduce pollution from the environment. On the other hand, NPs also have some negative impacts on the human body, such as being cytotoxic and genotoxic. They can also affect the reproductive system, such as the testis and ovary, and sexual behavior. The poster will further discuss the routes of exposure of NPs. The poster will conclude with a discussion of the current regulations and guidelines on the use of NPs in various applications. It will highlight the need for further research and the development of standardized toxicity testing methods to ensure the safe use of NPs in various applications. When using NPs in diagnosis and treatment, we should also take into consideration their safe concentration in the body. Overall, this poster aims to provide a comprehensive overview of the pros and cons of NPs on human health and to promote awareness and understanding of the potential risks and benefits associated with their use.Keywords: disease diagnosis, human health, nanoparticles, toxicity testing
Procedia PDF Downloads 81
2501 Hearing Conservation Program for Vector Control Workers: Short-Term Outcomes from a Cluster-Randomized Controlled Trial
Authors: Rama Krishna Supramanian, Marzuki Isahak, Noran Naqiah Hairi
Abstract:
Noise-induced hearing loss (NIHL) is one of the most commonly recorded occupational diseases, despite being preventable. A Hearing Conservation Program (HCP) is designed to protect workers' hearing and prevent them from developing hearing impairment due to occupational noise exposures. However, there is still a lack of evidence regarding the effectiveness of this program. The purpose of this study was to determine the effectiveness of a Hearing Conservation Program (HCP) in preventing or reducing audiometric threshold changes among vector control workers. This study adopts a cluster randomized controlled trial design, with district health offices as the unit of randomization. Nine district health offices were randomly selected, and 183 vector control workers were randomized to the intervention or control group. The intervention included a safety and health policy, noise exposure assessment, noise control, distribution of appropriate hearing protection devices, a training and education program, and audiometric testing. The control group only underwent audiometric testing. Audiometric threshold changes observed in the intervention group showed improvement in the hearing threshold level for all frequencies except 500 Hz and 8000 Hz for the left ear. The hearing threshold changes ranged from 1.4 dB to 5.2 dB, with the largest improvement at the higher frequencies, mainly 4000 Hz and 6000 Hz. Meanwhile, for the right ear, the mean hearing threshold level remained similar at 4000 Hz and 6000 Hz after 3 months of intervention. The Hearing Conservation Program (HCP) is effective in preserving the hearing of vector control workers involved in fogging activity as well as increasing their knowledge, attitude and practice towards noise-induced hearing loss (NIHL).
Keywords: adult, hearing conservation program, noise-induced hearing loss, vector control worker
Procedia PDF Downloads 170
2500 The Importance of Imaging and Functional Tests for Early Detection of Occupational Diseases in Kosovo's Miners
Authors: Krenare Shabani, Kreshnike Dedushi Hoti, Serbeze Kabashi, Jeton Shatri, Arben Rroji, Mrikë Bunjaku, Leotrim Berisha, Jona Kosova, Edmond Puca, Bleriana Shabani
Abstract:
Introduction: Workers in Kosovo's mining industry are subjected to hazardous working conditions and airborne particles, such as silica dust, which can cause silicosis and other severe respiratory illnesses. The purpose of this research is to assess the health impacts of such exposures, as well as the importance of imaging and functional testing in detecting pathological changes early on. Methodology: The study is prospective and cross-sectional and was carried out during 2024. 626 people (446 miners and 180 non-miners) were enrolled in the study. Subjects underwent spirometry and chest radiography. Data were analysed with SPSS 24. Results: The average age of the participants is 48 years. Demographics and Smoking: Smoking was common among young miners. Radiological Changes: Radiographic abnormalities in the lungs were seen in 23.1% of miners and 10.6% of non-miners, including small irregular opacities and emphysematous changes. Lung Function: The FEV1/FVC ratio decreased with increased exposure time, indicating a decline in pulmonary function. Impact of Exposure Duration: Longer exposure duration was associated with a higher number of miners experiencing cough and requiring further medical investigations such as CT scans and biopsies. Conclusions: Medical imaging and functional testing are critical for early diagnosis of lung abnormalities in miners. The findings demonstrate a strong correlation between extended exposure to mine dust and the development of respiratory disorders, emphasising the importance of preventative measures and routine health monitoring.
Keywords: silicosis, miners, imaging, spirometry
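The FEV1/FVC ratio referred to above is simply the forced expiratory volume in one second divided by the forced vital capacity. A small sketch follows; note that the 0.70 cut-off used here is a common clinical convention for an obstructive pattern, not a figure stated in the abstract, and the spirometry values are hypothetical:

```python
def fev1_fvc_ratio(fev1_l: float, fvc_l: float) -> float:
    """Ratio of forced expiratory volume in 1 s to forced vital capacity."""
    return fev1_l / fvc_l

# Hypothetical spirometry readings for one miner (litres).
ratio = fev1_fvc_ratio(fev1_l=2.6, fvc_l=4.0)
print(f"FEV1/FVC = {ratio:.2f}")
if ratio < 0.70:  # widely used threshold suggesting an obstructive pattern
    print("Below the conventional 0.70 threshold - obstructive pattern possible.")
```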
Procedia PDF Downloads 29
2499 Teaching Practices for Subverting Significant Retentive Learner Errors in Arithmetic
Authors: Michael Lousis
Abstract:
The systematic identification of the most conspicuous and significant errors made by learners during three years of testing of their progress in learning Arithmetic was accomplished throughout the development of the Kassel Project in England and Greece. How retentive these errors were over the three years of the officially provided school instruction of Arithmetic in these countries has also been shown. The learners' errors in Arithmetic stemmed from a sample comprising two hundred (200) English students and one hundred and fifty (150) Greek students. The sample was purposefully selected according to the students' participation in each testing session in the development of the three-year project, in both domains simultaneously, Arithmetic and Algebra. Specific teaching practices have been devised and are presented in this study for subverting these learners' errors, which were found to be retentive at the level of the nationally provided mathematical education of each country. The invention and development of these proposed teaching practices were founded on the rationality of the theoretical accounts concerning the explanation, prediction and control of the errors, on the conceptual metaphor, and on an analysis that tried to identify the required cognitive components and skills of the specific tasks, in terms of Psychology and Cognitive Science as applied to information processing. The aim of the implementation of these instructional practices is not only the subversion of these errors but also the achievement of mathematical competence, which was defined as being constituted of three elements: appropriate representations, appropriate meaning, and appropriately developed schemata. However, praxis is of paramount importance, because there is no 'real truth' independent of science, and because praxis serves as quality control when it takes the form of a cognitive method.
Keywords: arithmetic, cognitive science, cognitive psychology, information-processing paradigm, Kassel project, level of the nationally provided mathematical education, praxis, remedial mathematical teaching practices, retentiveness of errors
Procedia PDF Downloads 317
2498 A 3D Cell-Based Biosensor for Real-Time and Non-Invasive Monitoring of 3D Cell Viability and Drug Screening
Authors: Yuxiang Pan, Yong Qiu, Chenlei Gu, Ping Wang
Abstract:
In the past decade, three-dimensional (3D) tumor cell models have attracted increasing interest in the field of drug screening due to their great advantages in simulating more accurately the heterogeneous tumor behavior in vivo. Drug sensitivity testing based on 3D tumor cell models can provide more reliable in vivo efficacy prediction. The gold-standard fluorescence staining, however, can hardly achieve real-time and label-free monitoring of the viability of 3D tumor cell models. In this study, a micro-groove impedance sensor (MGIS) was specially developed for dynamic and non-invasive monitoring of 3D cell viability. 3D tumor cells were trapped in the micro-grooves with opposing gold electrodes for in-situ impedance measurement. A change in the number of live cells causes an inversely proportional change in the impedance magnitude of the entire cell/Matrigel construct, reflecting the proliferation and apoptosis of the 3D cells. It was confirmed that the 3D cell viability detected by the MGIS platform is highly consistent with standard live/dead staining. Furthermore, the accuracy of the MGIS platform was demonstrated quantitatively using a 3D lung cancer model and sophisticated drug sensitivity testing. In addition, the parameters of micro-groove impedance chip processing and the measurement experiments were optimized in detail. The results demonstrated that the MGIS-based 3D cell biosensor would be a promising platform to improve the efficiency and accuracy of cell-based anti-cancer drug screening in vitro.
Keywords: micro-groove impedance sensor, 3D cell-based biosensors, 3D cell viability, micro-electromechanical systems
Procedia PDF Downloads 129
2497 A Radiomics Approach to Predict the Evolution of Prostate Imaging Reporting and Data System Score 3/5 Prostate Areas in Multiparametric Magnetic Resonance
Authors: Natascha C. D'Amico, Enzo Grossi, Giovanni Valbusa, Ala Malasevschi, Gianpiero Cardone, Sergio Papa
Abstract:
Purpose: To characterize, through a radiomic approach, the nature of areas classified PI-RADS (Prostate Imaging Reporting and Data System) 3/5, recognized in multiparametric prostate magnetic resonance with T2-weighted (T2w), diffusion and perfusion sequences with paramagnetic contrast. Methods and Materials: 24 cases undergoing multiparametric prostate MR and biopsy were admitted to this pilot study. The clinical outcome of the PI-RADS 3/5 areas was established through biopsy, which found 8 malignant tumours. The analysed images were acquired with a Philips Achieva 1.5T machine with a CE T2-weighted sequence in the axial plane. Semi-automatic tumour segmentation was carried out on the MR images using the 3DSlicer image analysis software. 45 shape-based, intensity-based and texture-based features were extracted and represented the input for preprocessing. An evolutionary algorithm (a TWIST system based on the KNN algorithm) was used to subdivide the dataset into training and testing sets and to select the features yielding the maximal amount of information. After this pre-processing, 20 input variables were selected, and different machine learning systems were used to develop a predictive model based on a training-testing crossover procedure. Results: The best machine learning system (a three-layer feed-forward neural network) obtained a global accuracy of 90% (80% sensitivity and 100% specificity) with a ROC of 0.82. Conclusion: Machine learning systems coupled with radiomics show promising potential in distinguishing benign from malignant tumours in PI-RADS 3/5 areas.
Keywords: machine learning, MR prostate, PI-Rads 3, radiomics
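A minimal sketch of the kind of pipeline described above (feature matrix in, feed-forward network, accuracy and ROC evaluation), here with scikit-learn and random placeholder features and labels rather than the study's radiomic variables, so the printed metrics are meaningless and purely structural:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(24, 20))          # 24 cases x 20 selected features (placeholders)
y = rng.integers(0, 2, size=24)        # 1 = malignant at biopsy (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
scaler = StandardScaler().fit(X_train)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

proba = clf.predict_proba(scaler.transform(X_test))[:, 1]
print("accuracy:", accuracy_score(y_test, proba > 0.5))
print("ROC AUC :", roc_auc_score(y_test, proba))
```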
Procedia PDF Downloads 188
2496 Friction and Wear, Including Mechanisms, Modeling, Characterization, Measurement and Testing (Bangladesh Case)
Authors: Gor Muradyan
Abstract:
The paper is about friction and wear, including mechanisms, modeling, characterization, measurement and testing, in the context of Bangladesh. Bangladesh is a developing country with a large population of approximately 145 million living on a very small territory, so buildings stand very close to each other. Because the pipelines are very old and people often receive dirty water, many projects are ongoing under the ADB. In those projects, the contractors use HDD (Horizontal Directional Drilling) machines and the Grundoburst, both of which work underground. As the ground in Bangladesh is very sludgy, the machines cannot work properly because of the large friction in the soil. When the drilling work is finished, the machine pulls the pipe underground. Very often, pulling the pipes becomes very complicated because of the friction, and long sections of pipe laying cannot be completed. In that case additional problems arise and additional work must be done. Because long sections of pipe cannot be laid owing to the large friction in the soil, contractors must make more joints and perform more pressure tests, which is always connected with additional expenditure and lost time. The HDD machine can pull pipes from 75 mm to 500 mm, depending on the soil condition, over lengths of up to about 500 m, depending on how much friction acts on the puller: the greater the friction, the less it can pull. The other machine, the Grundoburst, does not work in this soil condition at all. It works with an air compressor and is used for smaller-diameter pipes, from 20 mm to 63 mm; in most cases it is used for installing house connection pipes and making service connections. To reduce the friction, contractors use a pulling head larger than the pipe, which brings the friction down, but this approach also fails in sludge. For the reasons mentioned above, friction is of major significance in this kind of work. There are many ways to reduce the friction, and in this paper we introduce the approaches we have investigated during our practice in Bangladesh.
Keywords: Bangladesh, friction and wear, HDD machines, reducing friction
Procedia PDF Downloads 319
2495 Analyzing the Effect of Multilingualism, Language 1, and Language 2 on Reading Comprehension
Authors: Judith Hanke
Abstract:
Due to the increase in students with reading difficulties, a digital reading support with diagnostics was developed to foster the individual student's reading comprehension. The digital reading support focused on the reading comprehension of elementary school students. The digital reading packages consist of literary texts with aligned reading exercises. The number of students with German as a second language is growing in Germany. Students with multilingualism, language 1, and language 2 learn German together in school. The research focus is on determining whether and to what extent multilingualism, language 1, and language 2 affect reading comprehension. For the methodology, an ABA design was selected for the intervention study to examine the reading support. The study ran from April 2023 until July 2023 and collected quantitative data on individuals, groups, and classes. It comprised a survey group (N = 58) and a control group (N = 53). The quantitative data were collected from 3 classes of 3 teachers and 47 students for all three test times. To show differences between the groups, a standardized reading comprehension test was used at the three test times: pretest, posttest, and follow-up. The standardized test consists of three subtests covering word comprehension, sentence comprehension, and text comprehension. The main findings include that students who spoke German as their first language had the best test scores. Interestingly, students with a different language had better test scores than students with German as the first language and (an) other language/s. Also, the students with another language outperformed the native language speakers in one of the subtests of the post-testing. The variables of spoken language at home and German as a second language were also examined and correlated with the test results. One significant correlation was found between spoken language at home and the text comprehension subtest of the pretest. Additionally, the variable German as a second language had multiple significant correlations in the pretest, posttest and follow-up. The significance of the study lies in understanding the influence of several languages, language 1, and language 2, on reading comprehension.
Keywords: multilingualism, language 1, language 2, reading comprehension, second language
Procedia PDF Downloads 32
2494 Testing of Complicated Bus Bar Protection Using Smart Testing Methodology
Authors: K. N. Dinesh Babu
Abstract:
In this paper, the protection of a complicated bus arrangement with a dual bus coupler and bus sectionalizer, using low-impedance differential protection applicable to very high voltages such as 220 kV and 400 kV, is discussed. In many power generation stations, several operational procedures are implemented to utilize the transfer bus as the main bus and to facilitate the maintenance of circuit breakers and current transformers (in each section) without shutting down the bay(s). Owing to this fact, the complications in operational philosophy have posed challenges for the bus bar protection implementation. Many bus topologies allow any one of the main buses available in the station to be used as an auxiliary bus. In such a system, pre-defined precautions and procedures are laid down as guidelines, which are followed before assigning any bus as an auxiliary bus. The procedure involves shifting of links, changing rotary switches, insertion of test blocks, and so on, thereby inviting unreliable operation. This kind of unreliable operation or inadvertent procedural lapse may result in the isolation of the bus bar from the grid due to the unpredictable operation of the bus bar protection relay, a commonly occurring phenomenon caused by manual mistakes. With the sophisticated configuration and implementation of logic in modern intelligent electronic devices, the operator is free to select the transfer arrangement without sacrificing the protection required by a bus differential system for reliable operation, and labor-intensive processes are completely eliminated. This paper deals with the procedure to test the security logic for such special scenarios using a Megger-make SMRT relay test set on the bus bar protection relay, to assure system stability and to dispense with the specific operational precautions and procedures.
Keywords: bus bar protection, by-pass isolator, blind spot, breaker failure, intelligent electronic device, end fault, bus unification, directional principle, zones of protection, breaker re-trip, under voltage security, smart megger relay tester
Procedia PDF Downloads 69
2493 Proton Irradiation Testing on Commercial Enhancement Mode GaN Power Transistor
Authors: L. Boyaci
Abstract:
Two basic pieces of equipment in the electrical power subsystem of space satellites are the Power Conditioning Unit (PCU) and the Power Distribution Unit (PDU). Today, the main switching element used in power equipment in satellites is the silicon (Si)-based radiation-hardened MOSFET. GaNFETs have superior performance over MOSFETs in terms of their conduction and switching characteristics. The GaNFET has started to take the MOSFET's place in many industrial applications, especially by virtue of its switching performance. If GaNFETs can also be used in equipment for space applications, this would be a great revolution for future space power subsystem designs. In this study, the effect of proton irradiation on Gallium Nitride based power transistors was investigated. Four commercial enhancement-mode GaN power transistors from Efficient Power Conversion Corporation (EPC) were irradiated with 30 MeV protons while the devices were switching. A flux of 8.2x10⁹ protons/cm²/s was applied for 12.5 seconds to reach an ultimate fluence of 10¹¹ protons/cm². Vgs-Ids characteristics were measured and recorded for each device before, during and after irradiation to observe whether any destructive events occurred. Proton-induced permanent damage to the devices was not observed. All the devices remained healthy and continued to operate. For two of these devices, further irradiation was applied with the same flux for 30 minutes, up to a total fluence level of 1.476x10¹³ protons/cm². We observed that the GaNFETs were fully functional under this high level of radiation, and no destructive events or irreversible failures took place in the transistors. The results reveal that the GaNFETs irradiated in this experiment show radiation tolerance under proton testing and are very important candidates for being among the future power switching elements in space.
Keywords: enhancement mode GaN power transistors, proton irradiation effects, radiation tolerance
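As a quick consistency check on the fluence figures above (fluence is simply flux multiplied by exposure time), the quoted numbers can be reproduced directly; the short sketch below only restates the abstract's own values:

```python
flux = 8.2e9  # protons / cm^2 / s

# 12.5 s exposure quoted for the first irradiation step
print(f"{flux * 12.5:.3e} protons/cm^2")     # ~1.025e11, i.e. ~1e11 as stated

# 30 min of additional exposure applied to two of the devices
print(f"{flux * 30 * 60:.3e} protons/cm^2")  # 1.476e13, matching the abstract
```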
Procedia PDF Downloads 153
2492 Investigations of Bergy Bits and Ship Interactions in Extreme Waves Using Smoothed Particle Hydrodynamics
Authors: Mohammed Islam, Jungyong Wang, Dong Cheol Seo
Abstract:
The Smoothed Particle Hydrodynamics (SPH) method is a novel, meshless, Lagrangian numerical method that has shown promise in accurately predicting the hydrodynamics of water-structure interactions in violent flow conditions. The main goal of this study is to build confidence in the versatility of the SPH-based tool, so that it can be used as a complement to physical model testing capabilities and support the research needed for the performance evaluation of ships and offshore platforms exposed to extreme and harsh environments. In the current endeavor, an open-sourced SPH-based tool was used and validated for the modeling and prediction of the hydrodynamic interactions of a 6-DOF ship and bergy bits. The study involved the modeling of a modern generic drillship and simplified bergy bits in floating and towing scenarios and in regular and irregular wave conditions. The predictions were validated using model-scale measurements on a moored ship towed at multiple oblique angles approaching a floating bergy bit in waves. Overall, this study results in a thorough comparison, in terms of performance and accuracy, between the model-scale measurements and the predictions from the SPH tool. The SPH-predicted ship motions and forces were primarily within ±5% of the measurements. The velocity and pressure distributions and the wave characteristics over the free surface depict realistic interactions of the wave, the ship, and the bergy bit. This work identifies and presents several challenges in preparing the input file, particularly in defining the mass properties of complex geometry, the computational requirements, and the post-processing of the outcomes.
Keywords: SPH, ship and bergy bit, hydrodynamic interactions, model validation, physical model testing
Procedia PDF Downloads 133
2491 The Legal Procedure of Attestation of Public Servants
Authors: Armen Yezekyan
Abstract:
The main purpose of this research is to comprehensively explore and identify the problems of the attestation of public servants and to propose solutions to these issues through an in-depth analysis of the laws and the legal theoretical literature. For the detailed analysis of the above-mentioned problems, we will use several research methods whose implementation aims to ensure the objectivity and clarity of the scientific research and its results.
Keywords: attestation, attestation commission, competition commission, public servant, public service, testing
Procedia PDF Downloads 414
2490 Symmetry of Performance across Lower Limb Tests between the Dominant and Non-Dominant Legs
Authors: Ghulam Hussain, Herrington Lee, Comfort Paul, Jones Richard
Abstract:
Background: To determine the functional limitations of the lower limbs or readiness to return to sport, most rehabilitation programs use some form of testing; however, it is still unknown what the pass criterion should be. This study aims to investigate the differences between the dominant and non-dominant leg performances across several lower limb tasks: hop tests, two-dimensional (2D) frontal plane projection angle (FPPA) tests, and isokinetic muscle tests. This study also provides reference values for the limb symmetry index (LSI) for the hop and isokinetic muscle strength tests. Twenty recreationally active participants were recruited, 11 males and 9 females (age 23.65±2.79 years; height 169.9±3.74 cm; body mass 74.72±5.81 kg). All tests were undertaken on the dominant and non-dominant legs. These tests are (1) Hop tests, which include the horizontal hop for distance and the crossover hop test; (2) Frontal plane projection angle (FPPA): 2D capturing from two different tasks, forward hop landing and squatting; and (3) Isokinetic muscle strength tests: four different muscle groups were tested: quadriceps, hamstrings, ankle plantar flexors, and hip extensors. The main outcome measurements were, for (1) the hop tests: the maximum distance, measured with a standard tape measure when undertaking the single/crossover hop for distance; for (2) the FPPA: the knee valgus angle, measured from the maximum knee flexion position using a single 2D camera; and for (3) the isokinetic muscle strength tests: three variables, peak torque, peak torque to body weight, and total work to body weight. All the muscle strength tests were applied in both concentric and eccentric muscle actions at a speed of 60°/sec. This study revealed no differences between the dominant and non-dominant leg performance; an LSI of 85% was achieved by the majority of the subjects in both the hop and isokinetic muscle tests, and, therefore, one leg's hop performance can define the other's.
Keywords: 2D FPPA, hop tests, isokinetic testing, LSI
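The limb symmetry index behind the 85% benchmark above is conventionally the between-limb ratio expressed as a percentage. A small sketch with hypothetical hop distances follows; the exact LSI definition can vary between studies, so the formula here is an assumption of the usual form, not a value taken from the paper:

```python
def limb_symmetry_index(non_dominant: float, dominant: float) -> float:
    """LSI as the non-dominant / dominant performance ratio, in percent."""
    return 100.0 * non_dominant / dominant

# Hypothetical single-hop-for-distance results (cm), not study data.
lsi = limb_symmetry_index(non_dominant=152.0, dominant=165.0)
print(f"LSI = {lsi:.1f}%  ->  {'meets' if lsi >= 85 else 'below'} the 85% criterion")
```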
Procedia PDF Downloads 672489 Molecularly Imprinted Nanoparticles (MIP NPs) as Non-Animal Antibodies Substitutes for Detection of Viruses
Authors: Alessandro Poma, Kal Karim, Sergey Piletsky, Giuseppe Battaglia
Abstract:
The recent and increasing threat of infectious influenza diseases to public health has prompted interest in the detection of avian influenza virus (AIV) H5N1 in humans as well as animals. A variety of technologies for diagnosing AIV infection have been developed. However, various disadvantages (cost, lengthy analyses, and the need for high-containment facilities) make these methods less than ideal in practical application. Molecularly Imprinted Polymeric Nanoparticles (MIP NPs) are suitable to overcome these limitations, offering high affinity, selectivity, scalability, and cost-effectiveness, along with the versatility of post-modification (fluorescent, magnetic, or optical labelling), opening the way to the potential introduction of improved diagnostic tests capable of providing rapid differential diagnosis. Here we present our first results on the production and testing of MIP NPs for the detection of AIV H5N1. Recent developments in the solid-phase synthesis of MIP NPs mean that, for the first time, a reliable supply of 'soluble' synthetic antibodies can be made available for testing as potential biologically or diagnostically active molecules. The MIP NPs have the potential to detect viruses that are widely circulating in farm animals and indeed humans. Early and accurate identification of the infectious agent will expedite appropriate control measures; diagnosis at an early stage of infection of a herd, flock, or individual maximizes the efficiency with which containment, prevention, and possibly treatment strategies can be implemented. More importantly, substantiating the practicability of these novel reagents should lead to an initial reduction in, and eventually the potential total replacement of, the use of animals, both large and small, to raise such specific serological materials.Keywords: influenza virus, molecular imprinting, nanoparticles, polymers
Procedia PDF Downloads 3632488 Functionalized Nanoporous Ceramic Membranes for Electrodialysis Treatment of Harsh Wastewater
Authors: Emily Rabe, Stephanie Candelaria, Rachel Malone, Olivia Lenz, Greg Newbloom
Abstract:
Electrodialysis (ED) is a well-developed technology for ion removal in a variety of applications. However, many industries generate harsh wastewater streams that are incompatible with traditional ion exchange membranes. Membrion® has developed novel ceramic-based ion exchange membranes (IEMs) offering several advantages over traditional polymer membranes: high performance at low pH, chemical resistance to oxidizers, and a rigid structure that minimizes swelling. These membranes are synthesized with our patented silane-based sol-gel techniques. The pore size, shape, and network structure are engineered through a molecular self-assembly process in which thermodynamic driving forces are used to direct where and how pores form. Either cationic or anionic groups can be added within the membrane nanopore structure to create cation- or anion-exchange membranes. The ceramic IEMs are produced on a roll-to-roll manufacturing line with low-temperature processing. Membrane performance testing is conducted using in-house permselectivity, area-specific resistance, and ED stack testing setups. Ceramic-based IEMs show performance comparable to traditional IEMs and offer some unique advantages. Long exposure to highly acidic solutions has a negligible impact on ED performance. Additionally, we have observed stable performance in the presence of strong oxidizing agents such as hydrogen peroxide. This stability is expected, as the ceramic backbone of these materials is already in a fully oxidized state. These data suggest that ceramic membranes made using sol-gel chemistry could be an ideal solution for acidic and/or oxidizing wastewater streams from processes such as semiconductor manufacturing and mining.Keywords: ion exchange, membrane, silane chemistry, nanostructure, wastewater
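For context on the permselectivity testing mentioned above, the sketch below shows the standard apparent-permselectivity estimate from a static membrane-potential measurement (measured potential divided by the ideal Nernst potential). The abstract does not specify the test protocol, so the salt concentrations, temperature, and measured potential used here are hypothetical.

```python
import math

R = 8.314    # gas constant, J/(mol*K)
F = 96485.0  # Faraday constant, C/mol

def nernst_potential(c_high, c_low, T=298.15):
    """Ideal membrane potential (V) for a perfectly permselective membrane
    separating two monovalent salt solutions; activities are approximated
    by concentrations."""
    return (R * T / F) * math.log(c_high / c_low)

def apparent_permselectivity(measured_mV, c_high, c_low, T=298.15):
    """Apparent permselectivity = measured potential / ideal potential."""
    return (measured_mV / 1000.0) / nernst_potential(c_high, c_low, T)

# Hypothetical test: 0.5 M / 0.1 M KCl, 36.5 mV measured across the membrane
alpha = apparent_permselectivity(36.5, 0.5, 0.1)
print(f"apparent permselectivity ~ {alpha:.2f}")  # about 0.88 under these assumed conditions
```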
Procedia PDF Downloads 862487 Micro-Scale Digital Image Correlation-Driven Finite Element Simulations of Deformation and Damage Initiation in Advanced High Strength Steels
Authors: Asim Alsharif, Christophe Pinna, Hassan Ghadbeigi
Abstract:
The development of next-generation advanced high strength steels (AHSS) used in the automotive industry requires a better understanding of local deformation and damage development at the scale of their microstructures. This work focuses on dual-phase DP1000 steels and involves micro-mechanical tensile testing inside a scanning electron microscope (SEM) combined with digital image correlation (DIC) to quantify the heterogeneity of deformation in both ferrite and martensite and its evolution up to fracture. Natural features of the microstructure are used for the correlation, carried out using LaVision's DaVis software. Strain localization is observed in both phases, with tensile strain values up to 130% and 110% recorded in ferrite and martensite, respectively, just before final fracture. Damage initiation sites have been observed in martensite during deformation but could not be correlated with local strain values. A finite element (FE) model of the microstructure was then developed in Abaqus to map stress distributions over representative areas of the microstructure, forcing the model to deform as in the experiment by applying the DIC-measured displacement maps as boundary conditions. A MATLAB code has been developed to automatically mesh the microstructure from SEM images and to map the displacement vectors from DIC onto the FE mesh. Results show that damage initiation at the ferrite-martensite interface correlates with local principal stress values of about 1700 MPa in the martensite phase. Damage in ferrite is now being investigated, and results are expected to bring new insight into damage development in DP steels.Keywords: advanced high strength steels, digital image correlation, finite element modelling, micro-mechanical testing
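The abstract describes mapping DIC displacement vectors onto the FE mesh so that the Abaqus model deforms as in the experiment. The authors' code is in MATLAB and is not reproduced here; the following Python sketch is only an assumed illustration of that interpolation step, with hypothetical grid dimensions, node coordinates, and function names.

```python
import numpy as np
from scipy.interpolate import griddata

def map_dic_to_fe_nodes(dic_points, dic_u, dic_v, fe_node_xy):
    """Interpolate DIC displacement components (u, v) measured at the DIC
    subset centres onto FE boundary-node coordinates, e.g. to be written
    out as prescribed-displacement boundary conditions."""
    u_nodes = griddata(dic_points, dic_u, fe_node_xy, method="linear")
    v_nodes = griddata(dic_points, dic_v, fe_node_xy, method="linear")
    # Fall back to nearest-neighbour where a node lies just outside the DIC grid
    nan = np.isnan(u_nodes)
    if nan.any():
        u_nodes[nan] = griddata(dic_points, dic_u, fe_node_xy[nan], method="nearest")
        v_nodes[nan] = griddata(dic_points, dic_v, fe_node_xy[nan], method="nearest")
    return u_nodes, v_nodes

# Hypothetical data: a 20x20 DIC grid over a 50x50 micrometre field of view
gx, gy = np.meshgrid(np.linspace(0, 50, 20), np.linspace(0, 50, 20))
dic_points = np.column_stack([gx.ravel(), gy.ravel()])
dic_u = 0.002 * dic_points[:, 0]           # synthetic uniform 0.2% strain in x
dic_v = np.zeros(len(dic_points))
fe_boundary_nodes = np.array([[0.0, 10.0], [50.0, 10.0], [25.0, 0.0]])
u_bc, v_bc = map_dic_to_fe_nodes(dic_points, dic_u, dic_v, fe_boundary_nodes)
print(u_bc)  # prescribed x-displacements at the FE boundary nodes
```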
Procedia PDF Downloads 146