Search results for: strength prediction models
10172 Behaviour of Lightweight Expanded Clay Aggregate Concrete Exposed to High Temperatures
Authors: Lenka Bodnárová, Rudolf Hela, Michala Hubertová, Iveta Nováková
Abstract:
This paper addresses the behaviour of lightweight expanded clay aggregate concrete exposed to high temperatures. Lightweight aggregates from expanded clay are produced by firing raw material at temperatures up to 1050°C. The aggregates have suitable properties in terms of volume stability when exposed to temperatures up to 1050°C, which could indicate their suitability for construction applications with a higher risk of fire. The test samples were exposed to heat using the standard temperature-time curve ISO 834. Negative changes in the resulting mechanical properties, such as compressive strength, tensile strength, and flexural strength, were evaluated. A visual evaluation of the specimens was also performed. On specimens exposed to excessive heat, explosive spalling was observed, caused by the evaporation of a considerable amount of unbound water from the inner structure of the concrete.
Keywords: expanded clay aggregate, explosive spalling, high temperature, lightweight concrete, temperature-time curve ISO 834
Procedia PDF Downloads 447
10171 Measuring the Unmeasurable: A Project of High Risk Families Prediction and Management
Authors: Peifang Hsieh
Abstract:
The prevention of child abuse has aroused serious concern in Taiwan because of the disparity between the number of reported child abuse cases, which doubled over the past decade, and the scarcity of social workers. New Taipei City, the most populous city in Taiwan, where over 70% of its 4 million citizens belong to migrant families in which the needs of children can easily be neglected due to insufficient support from relatives and communities, sees an urgent need for a social support system that preemptively identifies and reaches out to families at high risk of child abuse, so as to offer timely assistance and preventive measures to safeguard the welfare of the children. Big data analysis is the inspiration. Since high-risk families of child abuse share certain characteristics, New Taipei City decided to consolidate detailed background information from the departments of social affairs, education, labor, and health (for example, the parents' employment and health status, and whether they are imprisoned, fugitives, or under substance abuse), and to cross-reference it for accurate and prompt identification of the high-risk families in need. 'The Service Center for High-Risk Families' (SCHF) was established to integrate data across departments. Using the machine learning random forest method to build a risk prediction model that can detect early the families most likely to experience child abuse, the SCHF marks high-risk families red, yellow, or green to indicate the urgency of intervention, so that the families concerned can be provided timely services. The accuracy and recall rates of the model were 80% and 65%, respectively.
This prediction model can not only improve the child abuse prevention process, by helping social workers differentiate the risk level of newly reported cases and thereby significantly reduce their workload, but can also be referenced for future policy-making.
Keywords: child abuse, high-risk families, big data analysis, risk prediction model
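The triage step described in the abstract above can be sketched in a few lines. This is an illustrative sketch only: the probability cut-offs and the toy labels are hypothetical and are not taken from the SCHF system; only the accuracy and recall definitions follow standard usage.

```python
# Illustrative sketch: the tier cut-offs and the toy data below are
# hypothetical, not taken from the SCHF system described in the abstract.

def risk_tier(p, red=0.7, yellow=0.4):
    """Map a predicted abuse-risk probability to a triage colour."""
    if p >= red:
        return "red"      # urgent intervention
    if p >= yellow:
        return "yellow"   # monitor closely
    return "green"        # routine follow-up

def accuracy_and_recall(y_true, y_pred):
    """Accuracy and recall (sensitivity) for binary labels (1 = high risk)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true), tp / (tp + fn)

tiers = [risk_tier(p) for p in (0.85, 0.55, 0.1)]
acc, rec = accuracy_and_recall([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])
print(tiers, round(acc, 2), round(rec, 2))  # ['red', 'yellow', 'green'] 0.8 0.67
```

A model meeting the reported 80% accuracy with 65% recall would, on this scale, miss roughly one in three truly high-risk families, which is why the colour tiers err toward earlier outreach.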
Procedia PDF Downloads 135
10170 Continuum-Based Modelling Approaches for Cell Mechanics
Authors: Yogesh D. Bansod, Jiri Bursa
Abstract:
The quantitative study of cell mechanics is of paramount interest since it regulates the behavior of the living cells in response to the myriad of extracellular and intracellular mechanical stimuli. The novel experimental techniques together with robust computational approaches have given rise to new theories and models, which describe cell mechanics as a combination of biomechanical and biochemical processes. This review paper encapsulates the existing continuum-based computational approaches that have been developed for interpreting the mechanical responses of living cells under different loading and boundary conditions. The salient features and drawbacks of each model are discussed from both structural and biological points of view. This discussion can contribute to the development of even more precise and realistic computational models of cell mechanics based on continuum approaches or on their combination with microstructural approaches, which in turn may provide a better understanding of mechanotransduction in living cells.
Keywords: cell mechanics, computational models, continuum approach, mechanical models
Procedia PDF Downloads 363
10169 Mathematical Modeling and Optimization of Burnishing Parameters for 15NiCr6 Steel
Authors: Tarek Litim, Ouahiba Taamallah
Abstract:
The present paper is an investigation of the effect of burnishing on the surface integrity of a component made of 15NiCr6 steel. This work presents a statistical study based on regression, and Taguchi's design has allowed the development of mathematical models to predict the output responses as a function of the technological parameters studied. The response surface methodology (RSM) showed the simultaneous influence of the burnishing parameters and allowed the optimal processing parameters to be identified. ANOVA analysis of the results validated the prediction models, with coefficients of determination of 90.60% and 92.41% for roughness and hardness, respectively. Furthermore, a multi-objective optimization allowed the identification of a regime characterized by P = 10 kgf, i = 3 passes, and f = 0.074 mm/rev, which favours minimum roughness and maximum hardness. The result was validated by desirability values of D = 0.99 and 0.95 for roughness and hardness, respectively.
Keywords: 15NiCr6 steel, burnishing, surface integrity, Taguchi, RSM, ANOVA
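The desirability values D = 0.99 and 0.95 quoted above come from the standard desirability-function approach commonly paired with RSM. A minimal sketch follows; the response bounds below are invented for illustration and are not the paper's fitted models.

```python
# Hedged sketch of the desirability-function approach used with RSM;
# the response bounds below are hypothetical, not the paper's models.
import math

def d_minimize(y, y_best, y_worst):
    """Desirability for a response to be minimized (e.g., roughness)."""
    if y <= y_best:
        return 1.0
    if y >= y_worst:
        return 0.0
    return (y_worst - y) / (y_worst - y_best)

def d_maximize(y, y_worst, y_best):
    """Desirability for a response to be maximized (e.g., hardness)."""
    if y >= y_best:
        return 1.0
    if y <= y_worst:
        return 0.0
    return (y - y_worst) / (y_best - y_worst)

def overall_D(*ds):
    """Composite desirability: geometric mean of individual desirabilities."""
    return math.prod(ds) ** (1.0 / len(ds))

d_ra = d_minimize(0.25, y_best=0.2, y_worst=1.2)   # roughness Ra (um)
d_hv = d_maximize(460, y_worst=300, y_best=480)    # hardness (HV)
print(round(d_ra, 2), round(d_hv, 2), round(overall_D(d_ra, d_hv), 2))
```

The multi-objective optimum is simply the parameter regime that maximizes the composite D, which is how a pair of values like (0.99, 0.95) is reported per response.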
Procedia PDF Downloads 191
10168 To Optimise the Mechanical Properties of Structural Concrete by Partial Replacement of Natural Aggregates by Glass Aggregates
Authors: Gavin Gengan, Hsein Kew
Abstract:
Glass from various recycling processes is considered a material that can be used as aggregate. Waste glass is available from different sources and has been used in the construction industry over the last decades. The current study aims to use recycled glass as a partial replacement for conventional aggregate materials. The experimental programme was designed to optimise the mechanical properties of structural concrete made with recycled glass aggregates (GA). Natural aggregate (NA) was partially substituted by GA in a 30 N/mm2 concrete mix design in proportions of 10%, 20%, 25%, 30%, 40%, and 50%. It was found that compressive strength declines with an increasing proportion of GA. The optimum percentage replacement of NA by GA is 25%. The heat of hydration was also investigated with thermocouples placed in the concrete. This revealed an early acceleration of hydration heat in glass concrete, resulting from the thermal properties of glass. The gain in the heat of hydration and the better bonding of glass aggregates, together with the pozzolanic activity of the finest glass particles, caused the concrete to develop early-age and long-term strength higher than that of the control concrete.
Keywords: concrete, compressive strength, glass aggregates, heat of hydration, pozzolanic
Procedia PDF Downloads 208
10167 On Strengthening Program of Sixty Years Old Dome Using Carbon Fiber
Authors: Humayun R. H. Kabir
Abstract:
A reinforced concrete dome, built 60 years ago, of circular shape with a diameter of 30 m, was in a distressed condition due to adverse weathering effects, such as high temperature and wind, and poor maintenance. It was decided to restore the dome to its full strength for future use. A full material strength and durability check, including a petrography test, was conducted. It was observed that the concrete strength was in an acceptable range, while the bars were corroded by more than 40% of their original configuration. Widespread cracks appeared in almost every square meter. A strengthening program comprising crack filling by the injection method and carbon fiber layup and wrap was adopted. An Ultrasound Pulse Velocity (UPV) test was conducted to observe crack depth. A Ground Penetration Radar (GPR) test was conducted to observe internal bar conditions and internal cracks. Finally, a load test was conducted to certify the carbon fiber effectiveness, the injection method procedure, and the overall behavior of the dome.
Keywords: dome, strengthening program, carbon fiber, load test
Procedia PDF Downloads 256
10166 Lexicon-Based Sentiment Analysis for Stock Movement Prediction
Authors: Zane Turner, Kevin Labille, Susan Gauch
Abstract:
Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We present a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
Keywords: computational finance, sentiment analysis, sentiment lexicon, stock movement prediction
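The lexicon-based scoring the abstract describes (words mapped to sentiment scores, aggregated over a text chunk) can be sketched as follows. The tiny lexicon here is invented for illustration; the paper builds its lexicon from financial conference call logs.

```python
# Minimal lexicon-based scoring sketch. The lexicon below is invented for
# illustration; the paper derives its lexicon from conference-call logs.

LEXICON = {"growth": 0.8, "beat": 0.6, "strong": 0.5,
           "miss": -0.7, "decline": -0.6, "weak": -0.5}

def score_text(text, lexicon=LEXICON):
    """Average the sentiment scores of lexicon words found in the text."""
    hits = [lexicon[w] for w in text.lower().split() if w in lexicon]
    return sum(hits) / len(hits) if hits else 0.0

def predict_movement(text, threshold=0.0):
    """Map the aggregate sentiment score to an up/down call."""
    return "up" if score_text(text) > threshold else "down"

print(predict_movement("strong growth will beat estimates"))    # up
print(predict_movement("weak quarter and a decline in sales"))  # down
```

The domain-specific advantage the authors report comes entirely from the contents of the lexicon: a generic lexicon would miss or mis-score finance-specific cue words.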
Procedia PDF Downloads 127
10165 Lexicon-Based Sentiment Analysis for Stock Movement Prediction
Authors: Zane Turner, Kevin Labille, Susan Gauch
Abstract:
Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We introduce a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
Keywords: computational finance, sentiment analysis, sentiment lexicon, stock movement prediction
Procedia PDF Downloads 170
10164 Optimization of Submerged Arc Welding Parameters for Joining SS304 and MS1018
Authors: Jasvinder Singh, Manjinder Singh
Abstract:
Welding of dissimilar materials is a complicated process due to the difference in the melting points of the two materials. Their thermal conductivities and coefficients of thermal expansion also differ; therefore, the residual stresses produced in the weldment and base metal are the most critical problem associated with joining dissimilar materials. Tensile strength and impact toughness are also reduced by the residual stresses. In the present research work, an attempt has been made to weld the dissimilar materials SS304 and MS1018 by submerged arc welding (SAW). Through trial runs, the most effective parameters (welding current, arc voltage, welding speed, and nozzle-to-plate distance) were selected to weld these materials. The fractional factorial technique was used to optimize the welding parameters. The effects on tensile strength (TS), fracture toughness (FT), and microhardness of the weldment were studied. It was concluded that by optimizing welding current, voltage, and welding speed, the properties of the weldment can be enhanced.
Keywords: SAW, tensile strength (TS), fracture toughness, microhardness
Procedia PDF Downloads 538
10163 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies - it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. 
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
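The model-averaging step described in the abstract can be sketched as follows. The three stand-in functions below are hypothetical placeholders for the trained logistic regression, random forest, and neural network; they are not the study's actual models.

```python
# Sketch of the averaging step: combine the three top-performing models by
# averaging their predicted probabilities. The "models" are stand-ins.

def ensemble_predict(models, x, threshold=0.5):
    """Average per-model probabilities, then threshold to a class label."""
    probs = [m(x) for m in models]
    p_mean = sum(probs) / len(probs)
    return p_mean, ("unhealthy" if p_mean >= threshold else "healthy")

# Hypothetical stand-ins for logistic regression, random forest, neural net:
logreg = lambda x: 0.62
forest = lambda x: 0.71
neural = lambda x: 0.58

p, label = ensemble_predict([logreg, forest, neural], x=None)
print(round(p, 2), label)  # 0.64 unhealthy
```

Averaging damps the variance of any single model, which is the mechanism behind the framework's claim of outperforming each individual model.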
Procedia PDF Downloads 127
10162 Modeling and Numerical Simulation of Heat Transfer and Internal Loads at Insulating Glass Units
Authors: Nina Penkova, Kalin Krumov, Liliana Zashcova, Ivan Kassabov
Abstract:
Insulating glass units (IGU) are widely used in advanced and renovated buildings in order to reduce the energy needed for heating and cooling. Rules for the choice of IGU to ensure energy efficiency and thermal comfort in the indoor space are well known. The existence of internal loads (gauge or vacuum pressure) in the hermetically sealed gas space requires additional attention in the design of facades. The internal loads appear with variations of altitude, meteorological pressure, and gas temperature relative to their values at the time of sealing. The gas temperature depends on the presence of coatings, the coating position in the transparent multi-layer system, the IGU geometry and space orientation, and its fixing on the facade, and varies with the climate conditions. An algorithm for modeling and numerical simulation of the thermal fields and internal pressure in the gas cavity of insulating glass units as a function of the meteorological conditions is developed. It includes models of radiation heat transfer in the solar and infrared wavelength ranges, indoor and outdoor convection heat transfer, and free convection in the sealed gas space, assuming the gas to be compressible. The algorithm allows prediction of the temperature and pressure stratification in the gas domain of the IGU for different fixing systems. The models are validated by comparison of the numerical results with experimental data obtained by hot-box testing. Numerical calculations and estimation of the 3D temperature and fluid flow fields, thermal performance, and internal loads of an IGU in a window system are implemented.
Keywords: insulating glass units, thermal loads, internal pressure, CFD analysis
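The internal-load mechanism described above can be illustrated with a back-of-the-envelope ideal-gas estimate for a sealed cavity of fixed volume; the numbers are illustrative and are not taken from the paper's CFD model.

```python
# Back-of-the-envelope sketch of the internal-load mechanism: for a sealed
# cavity of fixed volume, the ideal gas law gives the pressure change when
# the gas temperature departs from its value at sealing. Numbers are
# illustrative, not from the paper's model.

def cavity_pressure(p_seal, T_seal, T_now):
    """Isochoric ideal-gas estimate: p/T = const (temperatures in kelvin)."""
    return p_seal * (T_now / T_seal)

p_seal = 101325.0   # Pa, atmospheric pressure at sealing
T_seal = 293.15     # 20 C at sealing
T_hot = 333.15      # 60 C, e.g. behind an absorbing coating in summer
gauge = cavity_pressure(p_seal, T_seal, T_hot) - p_seal
print(round(gauge))  # roughly 14 kPa of gauge pressure loading the panes
```

Even this zero-dimensional estimate shows why coating position and solar exposure matter: a 40 K temperature rise in the cavity produces a pressure load of the order of ten kilopascals on the glass panes.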
Procedia PDF Downloads 273
10161 Application of EEG Wavelet Power to Prediction of Antidepressant Treatment Response
Authors: Dorota Witkowska, Paweł Gosek, Lukasz Swiecicki, Wojciech Jernajczyk, Bruce J. West, Miroslaw Latka
Abstract:
In clinical practice, the selection of an antidepressant often degrades to lengthy trial-and-error. In this work, we employ the normalized wavelet power of alpha waves as a biomarker of antidepressant treatment response. This novel EEG metric takes into account both the non-stationarity and intersubject variability of alpha waves. We recorded resting, 19-channel EEG (closed eyes) in 22 inpatients suffering from unipolar (UD, n=10) or bipolar (BD, n=12) depression. The EEG measurement was done at the end of the short washout period which followed previously unsuccessful pharmacotherapy. The normalized alpha wavelet power of 11 responders was markedly different from that of 11 nonresponders at several, mostly temporoparietal, sites. Using the prediction of treatment response based on the normalized alpha wavelet power, we achieved 81.8% sensitivity and 81.8% specificity for channel T4.
Keywords: alpha waves, antidepressant, treatment outcome, wavelet
Procedia PDF Downloads 315
10160 The Use of Piezocone Penetration Test Data for the Assessment of Iron Ore Tailings Liquefaction Susceptibility
Authors: Breno M. Castilho
Abstract:
The Iron Ore Quadrangle, located in the state of Minas Gerais, Brazil is responsible for most of the country’s iron ore production. As a result, some of the biggest tailings dams in the country are located in this area. In recent years, several major failure events have happened in Tailings Storage Facilities (TSF) located in the Iron Ore Quadrangle. Some of these failures were found to be caused by liquefaction flowslides. This paper presents Piezocone Penetration Test (CPTu) data that was used, by applying Olson and Peterson methods, for the liquefaction susceptibility assessment of the iron ore tailings that are typically found in most TSF in the area. Piezocone data was also used to determine the steady-state strength of the tailings so as to allow for comparison with its drained strength. Results have shown great susceptibility for liquefaction to occur in the studied tailings and, more importantly, a large reduction in its strength. These results are key to understanding the failures that took place over the last few years.
Keywords: Piezocone Penetration Test CPTu, iron ore tailings, mining, liquefaction susceptibility assessment
Procedia PDF Downloads 233
10159 Early-Age Cracking of Low Carbon Concrete Incorporating Ferronickel Slag as Supplementary Cementitious Material
Authors: Mohammad Khan, Arnaud Castel
Abstract:
Concrete viscoelastic properties such as shrinkage, creep, and the associated relaxation are important in assessing the risk of cracking during the first few days after placement. This paper investigates the early-age mechanical and viscoelastic properties, restrained shrinkage-induced cracking, and time to cracking of concrete incorporating ferronickel slag (FNS) as supplementary cementitious material. Compressive strength, indirect tensile strength, and elastic modulus were measured. Tensile creep and drying shrinkage were measured on dog-bone shaped specimens. Restrained shrinkage-induced stresses and the concrete cracking age were assessed using the ring test. Results revealed that the early-age strength development of FNS blended concrete is lower than that of the corresponding ordinary Portland cement (OPC) concrete. FNS blended concrete showed significantly higher tensile creep. The risk of early-age cracking for the restrained specimens depends on the development of concrete tensile stress, considering both restrained shrinkage and tensile creep, and on the development of the tensile strength. FNS blended concrete showed only a 20% reduction in time to cracking compared to the reference OPC concrete, and this reduction is significantly lower compared to fly ash and ground granulated blast furnace slag blended concretes at a similar replacement level.
Keywords: ferronickel slag, restraint shrinkage, tensile creep, time to cracking
Procedia PDF Downloads 185
10158 Generalized Limit Equilibrium Solution for the Lateral Pile Capacity Problem
Authors: Tomer Gans-Or, Shmulik Pinkert
Abstract:
The determination of lateral pile capacity per unit length is a key aspect of geotechnical engineering. Traditional approaches for assessing pile lateral capacity in cohesive soils involve the application of upper-bound and lower-bound plasticity theorems. However, a comprehensive solution encompassing the entire spectrum of soil strength parameters, particularly in frictional soils with or without cohesion, is still lacking. This research introduces an innovative implementation of the slice-method limit equilibrium solution for lateral capacity assessment. For any given numerical discretization of the soil domain around the pile, the lateral capacity evaluation is based on the mobilized strength concept. The critical failure geometry is then found by a unique optimization procedure which includes both factor-of-safety minimization and geometrical optimization. The robustness of the suggested methodology lies in the solution being independent of any predefined assumptions. Validation of the solution is accomplished through comparison with established plasticity solutions for cohesive soils. Furthermore, the study demonstrates the applicability of the limit equilibrium method to previously unresolved cases involving frictional and cohesive-frictional soils. Beyond providing capacity values, the method enables the use of the mobilized strength concept to generate safety-factor distributions for scenarios representing pre-failure states.
Keywords: lateral pile capacity, slice method, limit equilibrium, mobilized strength
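The geometry-optimization idea above (searching for the failure surface that minimizes the factor of safety) can be sketched schematically. The fs(theta) function below is a made-up stand-in for the slice-method factor-of-safety computation, which the paper performs over a discretized soil domain.

```python
# Schematic sketch: sweep a geometric parameter of the trial failure surface
# and keep the minimum factor of safety. fs(theta) is a hypothetical
# stand-in for the slice-method FS computation described in the abstract.

def fs(theta):
    """Hypothetical factor of safety vs. a wedge-angle parameter (radians)."""
    return 1.2 + 0.8 * (theta - 0.9) ** 2

def critical_surface(fs_func, lo, hi, steps=1000):
    """Grid search: return (theta, FS) minimizing FS over [lo, hi]."""
    best = min((fs_func(lo + (hi - lo) * i / steps), lo + (hi - lo) * i / steps)
               for i in range(steps + 1))
    return best[1], best[0]

theta_c, fs_min = critical_surface(fs, 0.3, 1.5)
print(round(theta_c, 2), round(fs_min, 2))  # 0.9 1.2
```

In the paper the search space is the full discretized geometry rather than a single parameter, but the principle is the same: the governing failure surface is the one with the lowest factor of safety.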
Procedia PDF Downloads 61
10157 Validation of Nutritional Assessment Scores in Prediction of Mortality and Duration of Admission in Elderly, Hospitalized Patients: A Cross-Sectional Study
Authors: Christos Lampropoulos, Maria Konsta, Vicky Dradaki, Irini Dri, Konstantina Panouria, Tamta Sirbilatze, Ifigenia Apostolou, Vaggelis Lambas, Christina Kordali, Georgios Mavras
Abstract:
Objectives: Malnutrition in hospitalized patients is related to increased morbidity and mortality. The purpose of our study was to compare various nutritional scores in order to detect the most suitable one for assessing the nutritional status of elderly, hospitalized patients, and to correlate them with mortality and extension of admission duration due to the patients’ critical condition. Methods: The sample population included 150 patients (78 men, 72 women, mean age 80±8.2). Nutritional status was assessed by the Mini Nutritional Assessment (MNA, full and short-form), the Malnutrition Universal Screening Tool (MUST) and the short Nutritional Appetite Questionnaire (sNAQ). Sensitivity, specificity, positive and negative predictive values, and ROC curves were assessed after adjustment for the cause of current admission, a known prognostic factor according to previously applied multivariate models. Primary endpoints were mortality (from admission until 6 months afterwards) and duration of hospitalization, compared to national guidelines for closed consolidated medical expenses. Results: Concerning mortality, MNA (short-form and full) and sNAQ had similar, low sensitivity (25.8%, 25.8% and 35.5% respectively), while MUST had higher sensitivity (48.4%). In contrast, all the questionnaires had high specificity (94%-97.5%). Short-form MNA and sNAQ had the best positive predictive values (72.7% and 78.6% respectively), whereas all the questionnaires had similar negative predictive values (83.2%-87.5%). MUST had the highest ROC curve (0.83) in contrast to the remaining questionnaires (0.73-0.77). With regard to extension of admission duration, all four scores had relatively low sensitivity (48.7%-56.7%), specificity (68.4%-77.6%), positive predictive value (63.1%-69.6%), negative predictive value (61%-63%) and ROC curve (0.67-0.69). Conclusion: The MUST questionnaire is more advantageous in predicting mortality due to its higher sensitivity and ROC curve. 
None of the nutritional scores is suitable for prediction of extended hospitalization.
Keywords: duration of admission, malnutrition, nutritional assessment scores, prognostic factors for mortality
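The four diagnostic quantities compared above can be computed from a 2x2 confusion matrix as follows; the counts in the example are invented for illustration and are not the study's data.

```python
# Diagnostic metrics from a 2x2 confusion matrix. The counts below are
# invented for illustration; they are not the study's data.

def diagnostics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

d = diagnostics(tp=15, fp=4, fn=16, tn=115)
print({k: round(v, 3) for k, v in d.items()})
```

The pattern reported in the abstract (low sensitivity with high specificity and NPV) is typical when the positive class, here mortality, is rare and the score's cut-off is conservative.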
Procedia PDF Downloads 346
10156 Study on Brick Aggregate Made Pervious Concrete at Zero Fine Level
Authors: Monjurul Hasan, Golam Kibria, Abdus Salam
Abstract:
Pervious concrete is a form of lightweight porous concrete, obtained by eliminating the fine aggregate from the normal concrete mix. The advantages of this type of concrete are lower density, lower cost due to lower cement content, lower thermal conductivity, relatively low drying shrinkage, no segregation, and no capillary movement of water. In this paper, an investigation is made of the mechanical response of pervious concrete at zero fine level (zero-fine concrete) made with local brick aggregate. The effect of aggregate size variation on the strength, void ratio, and permeability of the zero-fine concrete is studied. Finally, a comparison is also presented between stone aggregate and brick aggregate pervious concrete. In total, 75 concrete cylinders were tested for compressive strength, 15 cylinders for void ratio, and 15 cylinders for permeability. The mix proportion (cement : coarse aggregate) was kept fixed at 1:6 (by weight), with a water-cement ratio of 0.35 for preparing the sample specimens. The brick aggregate size varied among 25 mm, 19 mm, and 12 mm. It has been found that the compressive strength decreased with increasing aggregate size while permeability increased, and concrete made with 19 mm maximum aggregate size yields the optimum value. No significant differences in the strength and permeability tests are observed between brick aggregate and stone aggregate zero-fine concrete.
Keywords: pervious concrete, brick aggregate concrete, zero fine concrete, permeability, porosity
Procedia PDF Downloads 555
10155 An Experimental Study on the Influence of Mineral Admixtures on the Fire Resistance of High-Strength Concrete
Authors: Ki-seok Kwon, Dong-woo Ryu, Heung-Youl Kim
Abstract:
Although high-strength concrete has many advantages over generic concrete at normal temperatures (around 20℃), it undergoes spalling at high temperatures, which constitutes its structurally fatal drawback. In this study, fire resistance tests were conducted for 3 hours in accordance with ASTM E119 on bearing wall specimens, 3,000 mm x 3,000 mm x 300 mm in dimensions, to investigate the influence the type of admixture exerts on the fire resistance performance of high-strength concrete. Portland cement, blast furnace slag, fly ash, and silica fume were used as admixtures, of which 2 or 3 components were combined to make 7 types of mixtures. In the 56 MPa specimens, the severity of spalling was in the order SF5 > F25 > S65SF5 > S50. Specimen S50, where an admixture consisting of 2 components was added, did not undergo spalling. In the 70 MPa specimens, the severity of spalling was in the order SF5 > F25SF5 > S45SF5, and the result was similar to that observed in the 56 MPa specimens. Acknowledgements: This study was conducted with the support of the project "Development of performance-based fire safety design of the building and improvement of fire safety" (18AUDP-B100356-04), which is under the management of the Korea Agency for Infrastructure Technology Advancement as part of the urban architecture research project for the Ministry of Land, Infrastructure and Transport, for which we extend our deep thanks.
Keywords: high strength concrete, mineral admixture, fire resistance, social disaster
Procedia PDF Downloads 14410154 Sugarcane Bagasse Ash Geopolymer Mixtures: A Step Towards Sustainable Materials
Authors: Mohammad J. Khattak, Atif Khan, Thomas C. Pesacreta
Abstract:
Millions of tons of sugarcane bagasse ash (SBA) are produced as a byproduct of burning sugarcane bagasse in power plants to run the steam engines for sugar production. This bagasse ash is disposed of in landfills, affecting their overall capacity. SBA contains very fine particles that can easily become airborne, causing serious respiratory health risks when inhaled. This research study evaluated the utilization of a high dosage of SBA for developing geopolymer-based “green” construction materials. An experimental design matrix was developed with varying dosages of SBA (0, 20%, 60%, and 80%) and Na₂SiO₃/NaOH ratios (0, 0.5, 1, 1.5, 2) based on response surface methodology. The precursor (consisting of SBA and fly ash) to aggregate ratio was kept constant at 30:70 and the alkali-to-binder ratio was maintained at 0.45 for all the mixtures. Geopolymer samples of size 50.8 x 50.8 mm (2” x 2”) were cast and cured at 65°C for 48 hours in a water bath, followed by curing at room temperature for 24 hours. The samples were then tested for compressive strength as per ASTM C39. The results revealed that, depending on SBA dosage, the compressive strengths ranged from 6.78 MPa to 22.63 MPa. Moreover, the effect of SiO₂, Na₂O and Fe₂O₃ on the compressive strength of these mixtures was also evaluated. The results showed that the compressive strength increased with increasing Na₂O and Fe₂O₃ concentration in the binder. It was also observed that the compressive strength of the SBA-based geopolymer mixtures improved as the SiO₂ content increased, reaching an optimum at 42%. However, a further increase in SiO₂ reduced the strength of the mixtures. The resulting geopolymer mixtures possess compressive strengths meeting the requirements set by the ASTM standard. Such mixtures can be used as structural and non-structural elements: strong road bases, sidewalks, curbs, and bricks for buildings and highway infrastructure. 
Using industrial SBA in geopolymer-based construction materials can reduce the carbon emissions related to cement production, reduce the landfill burden from SBA disposal, and mitigate the health risks associated with the high silica content of SBA.Keywords: compressive strength, geopolymer concrete, green materials, sugarcane bagasse ash
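An optimum such as the 42% SiO₂ reported above can be located from dosage–strength data with a simple quadratic (response-surface) fit; a minimal sketch with illustrative toy data, not the study's measurements:

```python
# Hedged sketch: quadratic fit of compressive strength vs. SiO2 content
# and the vertex of the fitted parabola as the estimated optimum.
# Data points are illustrative stand-ins, centered near 42% by design.
import numpy as np

sio2 = np.array([34.0, 38.0, 42.0, 46.0, 50.0])       # % SiO2 in binder
strength = np.array([14.0, 19.5, 22.6, 19.5, 14.0])   # MPa (toy values)

a, b, c = np.polyfit(sio2, strength, 2)   # strength ~ a*x**2 + b*x + c
optimum = -b / (2 * a)                    # vertex of the parabola
```

With real response-surface data the same vertex calculation applies per factor, and the fit quality (R²) should be checked before trusting the optimum.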
Procedia PDF Downloads 1010153 Activation Parameters of the Low Temperature Creep Controlling Mechanism in Martensitic Steels
Abstract:
Martensitic steels with an ultimate tensile strength beyond 2000 MPa are applied in the powertrains of vehicles due to their excellent fatigue strength and high creep resistance. However, the creep-controlling mechanism in martensitic steels at ambient temperatures up to 423 K is not evident. The purpose of this study is to review the low temperature creep (LTC) behavior of martensitic steels at temperatures from 363 K to 523 K. Thus, the validity of a logarithmic creep law is reviewed and the stress and temperature dependence of the creep parameters α and β are revealed. Furthermore, creep tests are carried out which include stepped changes in temperature or stress, respectively. On the one hand, the change of the creep rate due to a temperature step provides information on the magnitude of the activation energy of the LTC-controlling mechanism; on the other hand, the stress-step approach provides information on the magnitude of the activation volume. The magnitude, the temperature dependency, and the stress dependency of both material-specific activation parameters may deliver a significant contribution to the disclosure of the nature of the LTC rate-controlling mechanism.Keywords: activation parameters, creep mechanisms, high strength steels, low temperature creep
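The temperature-step evaluation described above can be sketched under an Arrhenius assumption: the apparent activation energy follows from the creep-rate ratio across a small temperature step. The rates and temperatures below are illustrative, not measured values from the study:

```python
# Hedged sketch: apparent activation energy from a temperature-step
# creep test, assuming Arrhenius-type rate control (rate ~ exp(-Q/kT)).
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def activation_energy(rate1, rate2, t1, t2):
    """Q = k_B * ln(rate2/rate1) / (1/t1 - 1/t2), in eV."""
    return K_B * math.log(rate2 / rate1) / (1.0 / t1 - 1.0 / t2)

# Illustrative: creep rate rises from 1e-9/s to 4e-9/s on a 423 K -> 443 K step
q = activation_energy(1e-9, 4e-9, 423.0, 443.0)
```

The stress-step counterpart is analogous, with the activation volume extracted from the rate ratio across a stress increment at constant temperature.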
Procedia PDF Downloads 17110152 Evaluation and Compression of Different Language Transformer Models for Semantic Textual Similarity Binary Task Using Minority Language Resources
Authors: Ma. Gracia Corazon Cayanan, Kai Yuen Cheong, Li Sha
Abstract:
Training a language model for a minority language has been a challenging task. The lack of available corpora to train and fine-tune state-of-the-art language models is still a challenge in the area of Natural Language Processing (NLP). Moreover, the need for high computational resources and bulk data limits the attainment of this task. In this paper, we present the following contributions: (1) we introduce and use a translation pair set of Tagalog and English (TL-EN) in pre-training a language model for a minority language resource; (2) we fine-tune and evaluate top-ranking pre-trained semantic textual similarity binary task (STSB) models on both TL-EN and STS dataset pairs; (3) we then reduce the size of the model to offset the need for high computational resources. Based on our results, the models that were pre-trained on translation pairs and STS pairs can perform well on the STSB task. Also, reducing a model to a smaller dimension has no negative effect on performance but rather yields a notable increase in the similarity scores. Moreover, models that were pre-trained on a similar dataset have a tremendous effect on the model’s performance scores.Keywords: semantic matching, semantic textual similarity binary task, low resource minority language, fine-tuning, dimension reduction, transformer models
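The binary decision in an STSB-style task typically thresholds a similarity score between sentence embeddings; a minimal sketch in which toy vectors stand in for transformer sentence embeddings, and the threshold value is an illustrative assumption:

```python
# Hedged sketch: cosine similarity of sentence embeddings thresholded
# into a binary similar/not-similar label, as in an STSB-style task.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def is_similar(emb_a, emb_b, threshold=0.5):
    """Binary STS label: 1 if cosine similarity clears the threshold."""
    return 1 if cosine(emb_a, emb_b) >= threshold else 0

# Toy 3-d "embeddings" for two sentence pairs
pair_close = is_similar([0.9, 0.1, 0.3], [0.8, 0.2, 0.25])
pair_far   = is_similar([0.9, 0.1, 0.3], [-0.2, 0.9, -0.5])
```

In practice the threshold would be tuned on a validation split rather than fixed a priori.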
Procedia PDF Downloads 21110151 Evaluation of Static Modulus of Elasticity Depending on Concrete Compressive Strength
Authors: Klara Krizova, Rudolf Hela
Abstract:
The paper focuses on the dependence of the modulus of elasticity on concretes of different compositions. The objective of the experiment was to obtain a summary of how the elastic modulus develops as a function of variability in concrete mix design. An essential part of this work was initiated as a reaction to building practice, where questions about elastic moduli arose because concrete structures often did not achieve the required and expected values. With growing interest in this theme, the questions surrounding the elastic modulus have been investigated further.Keywords: concrete, compressive strength, modulus of elasticity, EuroCode 2
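The keywords point to EuroCode 2, whose Table 3.1 (EN 1992-1-1) gives the standard relation between the mean secant modulus and the mean compressive strength, Ecm = 22·(fcm/10)^0.3 in GPa with fcm in MPa (for quartzite aggregates); a minimal sketch:

```python
# EC2 (EN 1992-1-1, Table 3.1) relation between mean secant modulus
# of elasticity and mean compressive strength, quartzite aggregates.
def ec2_elastic_modulus(fcm_mpa):
    """Mean modulus of elasticity Ecm in GPa per EN 1992-1-1."""
    return 22.0 * (fcm_mpa / 10.0) ** 0.3

ecm_c30 = ec2_elastic_modulus(38.0)  # C30/37: fcm = fck + 8 = 38 MPa
```

Measured moduli that fall well below such code values are precisely the discrepancy the abstract describes from building practice.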
Procedia PDF Downloads 45510150 Replacement of the Distorted Dentition of the Cone Beam Computed Tomography Scan Models for Orthognathic Surgery Planning
Authors: T. Almutairi, K. Naudi, N. Nairn, X. Ju, B. Eng, J. Whitters, A. Ayoub
Abstract:
Purpose: At present, Cone Beam Computed Tomography (CBCT) imaging does not record dental morphology accurately due to the scattering produced by metallic restorations and the reported magnification. The aim of this pilot study is the development and validation of a new method for the replacement of the distorted dentition of CBCT scans with the dental image captured by a digital intraoral camera. Materials and Method: Six dried skulls with orthodontic brackets on the teeth were used in this study. Three intra-oral markers made of dental stone were constructed and attached to the orthodontic brackets. The skulls were CBCT scanned, and the occlusal surfaces were captured using the TRIOS® 3D intraoral scanner. Marker-based and surface-based registrations were performed to fuse the digital intra-oral scan (IOS) into the CBCT models. This produced a new composite digital model of the skull and dentition. The skulls were scanned again using the commercially accurate Faro® laser arm to produce the 'gold standard' model for the assessment of the accuracy of the developed method. The accuracy of the method was assessed by measuring the distance between the occlusal surfaces of the new composite model and the 'gold standard' 3D model of the skull and teeth. The procedure was repeated a week apart to measure the reproducibility of the method. Results: The results showed no statistically significant difference between the measurements on the first and second occasions. The absolute mean distance between the new composite model and the laser model ranged between 0.11 mm and 0.20 mm. Conclusion: The dentition of the CBCT scan can be accurately replaced with the dental image captured by the intra-oral scanner to create a composite model. 
This method will improve the accuracy of orthognathic surgical prediction planning, with the final goal of fabricating a physical occlusal wafer to guide orthognathic surgery without the need for dental impressions.Keywords: orthognathic surgery, superimposition, models, cone beam computed tomography
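The marker-based rigid registration step described above can be sketched with the Kabsch algorithm: given corresponding marker coordinates in the intraoral-scan and CBCT frames, a least-squares rotation and translation fuses the two coordinate systems. The coordinates below are illustrative, not data from the study:

```python
# Hedged sketch: marker-based rigid registration (Kabsch algorithm).
# Recovers R, t such that cbct_point ~ R @ ios_point + t.
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform mapping src points onto dst points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    h = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))       # guard against reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = dst_c - r @ src_c
    return r, t

# Illustrative marker positions (mm) in the intraoral-scan frame
markers_ios = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
r_known = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])  # 90 deg about z
markers_cbct = markers_ios @ r_known.T + np.array([5.0, -2, 3])

r, t = rigid_register(markers_ios, markers_cbct)
fused = markers_ios @ r.T + t                     # should match markers_cbct
```

With noisy real markers the residual distances after registration give exactly the kind of sub-millimetre accuracy figure the abstract reports.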
Procedia PDF Downloads 19810149 Evaluation of Engineering Cementitious Composites (ECC) with Different Percentage of Fibers
Authors: Bhaumik Merchant, Ajay Gelot
Abstract:
Concrete is good in compression, but it starts to fail once tensile strain is applied to it, whereas steel is good in tension and can bear deflection up to its elastic limit. This project examines the behavior of engineered cementitious composites (ECC) reinforced with different amounts of Polyvinyl Alcohol (PVA) fibers. In this research, PVA fiber is added to the cementitious matrix at dosages up to 2% to find the optimum fiber content at which the maximum compressive, tensile and flexural strengths are obtained. PVA is basically an adhesive of the kind used to formulate glue. Generally, cracks develop due to excessive loading, leading to progressive damage of the structural component. A plasticizer is used in this research to increase workability. With the optimum amount of PVA fibers, crack widths can be limited to 60 µm to 100 µm, which can also reduce the resources and funds required for the rehabilitation of structures. Initially, this fiber concrete can cost about double that of conventional concrete, but since it extends the service life of the structure, it proves less costly than conventional concrete in the long run.Keywords: compressive strength, engineered cementitious composites, flexural strength, polyvinyl alcohol fibers, rehabilitation of structures
Procedia PDF Downloads 29010148 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate
Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe
Abstract:
This paper assesses the in-sample forecasting of South African exchange rates, comparing a linear ARIMA model and a SETAR model. The study uses monthly adjusted data of South African exchange rates with 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. Mean absolute error (MAE), root mean squared error (RMSE) and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecast capability of the models. The Diebold–Mariano (DM) test is employed to check forecast accuracy in order to distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR seems to outperform ARIMA.Keywords: ARIMA, error metrics, model selection, SETAR
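The three error metrics used in the comparison are straightforward to compute; a minimal sketch with toy exchange-rate values, not the study's 420-observation series:

```python
# Hedged sketch: MAE, RMSE and MAPE, the error metrics used to compare
# the ARIMA and SETAR forecasts. Values below are illustrative.
import math

def mae(actual, forecast):
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    return math.sqrt(
        sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mape(actual, forecast):
    return 100 * sum(abs((a - f) / a)
                     for a, f in zip(actual, forecast)) / len(actual)

rates    = [14.5, 14.8, 15.1, 14.9]   # toy monthly rates
forecast = [14.4, 14.9, 15.0, 15.1]

errors = mae(rates, forecast), rmse(rates, forecast), mape(rates, forecast)
```

RMSE is never smaller than MAE on the same errors, since squaring weights large deviations more heavily; the Diebold–Mariano test then asks whether the loss differential between two forecasts is significantly different from zero.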
Procedia PDF Downloads 24410147 Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Secondary Distant Metastases Growth
Authors: Ella Tyuryumina, Alexey Neznanov
Abstract:
This study is an attempt to obtain reliable data on the natural history of breast cancer growth. We analyze the opportunities for using classical mathematical models (exponential and logistic tumor growth models, Gompertz and von Bertalanffy tumor growth models) to describe the growth of the primary tumor and the secondary distant metastases of human breast cancer. The research aim is to improve the prediction accuracy of breast cancer progression using an original mathematical model referred to as CoMPaS and the corresponding software. We are interested in: 1) modelling the whole natural history of the primary tumor and the secondary distant metastases; 2) developing an adequate and precise CoMPaS which reflects the relations between the primary tumor and the secondary distant metastases; 3) analyzing the CoMPaS scope of application; 4) implementing the model as a software tool. The foundation of CoMPaS is the exponential tumor growth model, which is described by determinate nonlinear and linear equations. CoMPaS corresponds to the TNM classification. It allows calculating different growth periods of the primary tumor and the secondary distant metastases: 1) the ‘non-visible period’ of the primary tumor; 2) the ‘non-visible period’ of the secondary distant metastases; 3) the ‘visible period’ of the secondary distant metastases. CoMPaS is validated on clinical data of 10-year and 15-year survival depending on the tumor stage and the diameter of the primary tumor. The new predictive tool: 1) is a solid foundation for future studies of breast cancer growth models; 2) does not require any expensive diagnostic tests; 3) is the first predictor that makes a forecast using only current patient data, while the others rely on additional statistical data. 
The CoMPaS model and predictive software: a) fit clinical trials data; b) detect different growth periods of the primary tumor and the secondary distant metastases; c) forecast the period of appearance of the secondary distant metastases; d) have a higher average prediction accuracy than other tools; e) can improve forecasts of breast cancer survival and facilitate the optimization of diagnostic tests. The following are calculated by CoMPaS: the number of doublings for the ‘non-visible’ and ‘visible’ growth periods of the secondary distant metastases, and the tumor volume doubling time (days) for the ‘non-visible’ and ‘visible’ growth periods of the secondary distant metastases. CoMPaS enables, for the first time, prediction of the ‘whole natural history’ of the primary tumor and the secondary distant metastases growth at each stage (pT1, pT2, pT3, pT4), relying only on the primary tumor sizes. Summarizing: a) CoMPaS correctly describes the primary tumor growth of stages IA, IIA, IIB, IIIB (T1-4N0M0) without metastases in lymph nodes (N0); b) it facilitates the understanding of the period of appearance and inception of the secondary distant metastases.Keywords: breast cancer, exponential growth model, mathematical model, metastases in lymph nodes, primary tumor, survival
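The doubling counts that CoMPaS reports follow directly from the exponential growth model: the number of volume doublings from a single cell to a tumor of a given diameter, assuming spherical geometry. The cell-volume figure below is a common modelling assumption, not a value taken from the paper:

```python
# Hedged sketch: number of volume doublings under exponential growth,
# from one cell to a sphere of a given diameter. The cell volume is an
# illustrative modelling assumption (~10-micron cell).
import math

CELL_VOLUME_MM3 = 1e-6

def doublings_to_diameter(diameter_mm):
    """n such that 2**n cells fill a sphere of the given diameter."""
    volume = math.pi * diameter_mm ** 3 / 6
    return math.log2(volume / CELL_VOLUME_MM3)

n_visible = doublings_to_diameter(10.0)  # ~1 cm, clinically detectable size
```

Under these assumptions roughly 30 doublings separate a single cell from a 1 cm tumor, which is why most of the natural history is the 'non-visible period'.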
Procedia PDF Downloads 34110146 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data
Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar
Abstract:
It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science; therefore the development of accurate, robust and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. There are still two dominant major forecasting methods: Box-Jenkins ARIMA and Exponential Smoothing (ES), and new methods are still derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing is still one of the most practically relevant forecasting methods available due to its simplicity, robustness and accuracy as an automatic forecasting procedure, especially in the famous M-Competitions. Despite their success and widespread use in many areas, ES models have some shortcomings that negatively affect the accuracy of forecasts. Therefore, a new forecasting method is proposed in this study to cope with these shortcomings; it will be called the ATA method. This new method is obtained from traditional ES models by modifying the smoothing parameters; therefore both methods have similar structural forms, and ATA can be easily adapted to all of the individual ES models. However, ATA has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modeling the trend component and handling seasonality patterns by utilizing classical decomposition. Therefore, the ATA method is expanded to higher order ES methods for additive, multiplicative, additive damped and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performances are compared to their counterpart ES models on the M3 competition data set, since it is still the most recent and comprehensive time-series data collection available. 
It is shown that the models outperform their counterparts in almost all settings, and when a model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons when the models’ forecasting accuracies are compared based on popular error metrics.Keywords: accuracy, exponential smoothing, forecasting, initial value
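For context, classical simple exponential smoothing (SES) is the family ATA modifies: ATA replaces the constant smoothing weight with a time-varying one. The p/t scheme in the second function is our reading of that weighting idea and is included for illustration only, not as the authors' exact formulation:

```python
# Hedged sketch: classical SES with constant weight alpha, and an
# ATA-style variant where the smoothing weight decays with time.
# The p/t scheme is an illustrative assumption, not the paper's spec.
def ses(series, alpha):
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level  # one-step-ahead forecast

def ata_level(series, p=1):
    level = series[0]
    for t, x in enumerate(series[1:], start=2):
        w = min(p / t, 1.0)              # weight decays as observations accrue
        level = w * x + (1 - w) * level
    return level

data = [10.0, 12.0, 11.0, 13.0, 12.5]
f_ses = ses(data, alpha=0.3)
f_ata = ata_level(data, p=1)
```

A useful sanity check: with p = 1 the time-varying weight reduces the level to the running sample mean of the series.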
Procedia PDF Downloads 17710145 BART Matching Method: Using Bayesian Additive Regression Tree for Data Matching
Authors: Gianna Zou
Abstract:
Propensity score matching (PSM), introduced by Paul R. Rosenbaum and Donald Rubin in 1983, is a popular statistical matching technique which tries to estimate treatment effects by taking into account covariates that could impact the efficacy of the study medication in clinical trials. PSM can be used to reduce the bias due to confounding variables. However, PSM assumes that the response values are normally distributed. In some cases, this assumption may not hold. In this paper, a machine learning method, the Bayesian Additive Regression Tree (BART), is used as a more robust method of matching. BART can work well when models are misspecified, since it can be used to model heterogeneous treatment effects. Moreover, it has the capability to handle non-linear main effects and multiway interactions. In this research, a BART Matching Method (BMM) is proposed to provide a more reliable matching method than PSM. By comparing the analysis results from PSM and BMM, BMM performs well and has better prediction capability when the response values are not normally distributed.Keywords: BART, Bayesian, matching, regression
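The matching step itself is common to both approaches: each treated subject is paired with the control whose score is closest, where the score is a propensity estimate under PSM and, as we read the proposal, a BART-derived prediction under BMM. A minimal greedy sketch with illustrative scores:

```python
# Hedged sketch: greedy one-to-one nearest-neighbour matching on a
# per-subject score (propensity under PSM, model prediction under BMM).
# Scores below are illustrative.
def greedy_match(treated, controls):
    """Pair each treated score with the closest unused control score."""
    available = dict(enumerate(controls))
    pairs = []
    for i, t_score in enumerate(treated):
        j = min(available, key=lambda k: abs(available[k] - t_score))
        pairs.append((i, j))
        del available[j]          # each control is used at most once
    return pairs

pairs = greedy_match([0.30, 0.70], [0.10, 0.32, 0.68, 0.90])
```

Real implementations usually add a caliper (a maximum allowed score distance) so that poorly matched pairs are dropped rather than forced.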
Procedia PDF Downloads 14710144 Reliability Assessment of Various Empirical Formulas for Prediction of Scour Hole Depth (Plunge Pool) Using a Comprehensive Physical Model
Authors: Majid Galoie, Khodadad Safavi, Abdolreza Karami Nejad, Reza Roshan
Abstract:
In this study, a comprehensive scouring model has been developed in order to evaluate the accuracy of various empirical relationships proposed for the prediction of scour hole depth in plunge pools by Martins, Mason, Chian and Veronese. To this end, scour hole depths caused by free-falling jets from a flip bucket into a plunge pool were investigated, considering various discharges, angles, scouring times, etc. The final results demonstrated that all the mentioned empirical formulas, except the Mason formula, were in reasonable agreement with the experimental data.Keywords: scour hole depth, plunge pool, physical model, reliability assessment
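Of the formulas under review, the Veronese relation is perhaps the most widely quoted; in its common form it estimates the ultimate scour depth below tailwater level as ys = 1.90·H^0.225·q^0.54. A minimal sketch with illustrative inputs (the coefficient and exponents should be checked against the original sources before any design use):

```python
# Hedged sketch: the Veronese (1937) empirical scour-depth relation as
# commonly quoted: ys = 1.90 * H**0.225 * q**0.54, with ys the scour
# depth below tailwater (m), H the drop height (m), q the unit
# discharge (m^2/s). Inputs below are illustrative.
def veronese_scour_depth(unit_discharge, drop_height):
    return 1.90 * drop_height ** 0.225 * unit_discharge ** 0.54

ys = veronese_scour_depth(5.0, 50.0)  # q = 5 m^2/s over a 50 m drop
```

Comparing such closed-form estimates against measured depths from the physical model is exactly the reliability assessment the study performs.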
Procedia PDF Downloads 53510143 Characterization of Mechanical Properties of Graphene-Modified Epoxy Resin for Pipeline Repair
Authors: Siti Nur Afifah Azraai, Lim Kar Sing, Nordin Yahaya, Norhazilan Md Noor
Abstract:
This experimental study consists of a characterization of epoxy grout in which graphene nanoplatelet particles were added at 2% to a commercial epoxy resin to evaluate its behavior relative to the neat epoxy resin. Compressive tests, tensile tests and flexural tests were conducted to study the effect of graphene nanoplatelets on the neat epoxy resin. Comparing graphene-based and neat epoxy grout, there is no significant increase in strength, owing to a weak interface in the graphene nanoplatelet/epoxy composites. In this experiment, the tensile and flexural strengths of the graphene-based epoxy grout are slightly lower than those of the neat epoxy grout. Nevertheless, the addition of graphene has produced more consistent results, as indicated by a smaller standard deviation of strength. Furthermore, the graphene has also improved the ductility of the grout, hence reducing its brittle behaviour. This shows that the performance of graphene-based grout is reliably predictable and able to minimize sudden rupture. This is important since the repair design of damaged pipelines is deterministic in nature.Keywords: composite, epoxy resin, graphene nanoplatelets, pipeline
Procedia PDF Downloads 482