Search results for: hybrid models
6698 Generating a Multiplex Sensing Platform for the Accurate Diagnosis of Sepsis
Authors: N. Demertzis, J. L. Bowen
Abstract:
Sepsis is a complex and rapidly evolving condition, resulting from uncontrolled, prolonged activation of the host immune system due to pathogenic insult. The aim of this study is the development of a multiplex electrochemical sensing platform, capable of detecting both pathogen-associated and host immune markers to enable the rapid and definitive diagnosis of sepsis. A combination of aptamer and molecular imprinting approaches has been employed to generate sensing systems for lipopolysaccharide (LPS), C-reactive protein (CRP) and procalcitonin (PCT). Gold working electrodes were mechanically polished and electrochemically cleaned with 0.1 M sulphuric acid using cyclic voltammetry (CV). Following activation, a self-assembled monolayer (SAM) was generated by incubating the electrodes with a thiolated anti-LPS aptamer / dithiodibutyric acid (DTBA) mixture (1:20). 3-aminophenylboronic acid (3-APBA) in combination with the anti-LPS aptamer was used for the development of the hybrid molecularly imprinted sensor (apta-MIP). Aptasensors targeting PCT and CRP were also fabricated, following the same approach as in the case of LPS, with mercaptohexanol (MCH) replacing DTBA. In the case of the CRP aptasensor, the SAM was formed following incubation of a 1:1 aptamer:MCH mixture. However, in the case of PCT, the SAM was formed with the aptamer itself, with subsequent backfilling with 1 μM MCH. The binding performance of all systems has been evaluated using electrochemical impedance spectroscopy. The apta-MIP’s polymer thickness is controlled by varying the number of electropolymerisation cycles. At the ideal number of polymerisation cycles, the polymer must cover the electrode surface and create a binding pocket around LPS and its aptamer binding site. Fewer polymerisation cycles will create a hybrid system which resembles an aptasensor, while more cycles will cover the complex entirely and demonstrate bulk polymer-like behaviour. Both the aptasensor and the apta-MIP were challenged with LPS and compared to conventional imprinted polymers (absence of the aptamer from the binding site, polymer formed in the presence of LPS) and non-imprinted polymers (NIPs, absence of LPS whilst the hybrid polymer is formed). A stable LPS aptasensor, capable of detecting down to 5 pg/ml of LPS, was generated. The apparent Kd of the system was estimated at 17 pM, with a Bmax of approximately 50 pM. The aptasensor demonstrated high specificity to LPS. The apta-MIP demonstrated superior recognition properties, with a limit of detection of 1 fg/ml and a Bmax of 100 pg/ml. The CRP and PCT aptasensors were both able to detect down to 5 pg/ml. Whilst the full binding performance is still being evaluated, none of the sensors demonstrates cross-reactivity towards LPS, CRP or PCT. In conclusion, stable aptasensors capable of detecting LPS, PCT and CRP at low concentrations have been generated. The realisation of a multiplex panel such as that described herein will effectively contribute to the rapid, personalised diagnosis of sepsis.
Keywords: aptamer, electrochemical impedance spectroscopy, molecularly imprinted polymers, sepsis
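The apparent Kd and Bmax quoted above are the parameters of a one-site saturation (Langmuir-type) binding curve fitted to the impedance responses. A minimal sketch of such a fit is shown below; the concentration and response values are hypothetical placeholders, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, bmax, kd):
    """One-site saturation binding: response = Bmax * C / (Kd + C)."""
    return bmax * c / (kd + c)

# Hypothetical LPS concentrations (pM) and normalised EIS responses
conc = np.array([1, 5, 10, 25, 50, 100, 250], dtype=float)
resp = np.array([2.5, 10.0, 17.5, 28.0, 36.0, 42.0, 46.5])

(bmax_fit, kd_fit), _ = curve_fit(langmuir, conc, resp, p0=[50.0, 20.0])
print(f"Apparent Kd ~ {kd_fit:.1f} pM, Bmax ~ {bmax_fit:.1f}")
```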
Procedia PDF Downloads 125
6697 Development of Highly Repellent Silica Nanoparticles Treatment for Protection of Bio-Based Insulation Composite Material
Authors: Nadia Sid, Alan Taylor, Marion Bourebrab
Abstract:
The construction sector is on the critical path to decarbonise the European economy by 2050. In order to achieve this objective, it must reduce its CO2 emissions by 90% and its energy consumption by as much as 50%. For this reason, a new class of low-environmental-impact construction materials named “eco-materials” is becoming increasingly important in the struggle against climate change. A European-funded collaborative project, ISOBIO, coordinated by TWI, aims to take a radical approach to the use of bio-based aggregates to create novel construction materials that are usable in high volume using traditional methods, as well as in developing markets such as the exterior insulation of existing housing stock. The approach taken for this project is to use finely chopped material protected from bio-degradation through the use of functionalized silica nanoparticles. TWI is exploring the development of novel inorganic-organic hybrid nano-materials to be applied as a surface treatment onto bio-based aggregates. These nanoparticles are synthesized by sol-gel processing and then functionalised with silanes to impart multifunctionality, e.g. hydrophobicity, fire resistance and chemical bonding between the silica nanoparticles and the bio-based aggregates. This talk will illustrate the approach taken by TWI to design the functionalized silica nanoparticles using a material-by-design approach. The formulation and synthesis process will be presented together with the challenges addressed by these hybrid nano-materials. The results obtained with regard to water repellence and fire resistance will be displayed together with preliminary public results of the ISOBIO project. (This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 641927).
Keywords: bio-sourced material, composite material, durable insulation panel, water repellent material
Procedia PDF Downloads 237
6696 Construction of QSAR Models to Predict Potency on a Series of Substituted Imidazole Derivatives as Anti-fungal Agents
Authors: Sara El Mansouria Beghdadi
Abstract:
Quantitative structure–activity relationship (QSAR) modelling is one of the main computer tools used in medicinal chemistry. Over the past two decades, the incidence of fungal infections has increased due to the development of resistance. In this study, QSAR modelling was performed on a series of esters of 2-carboxamido-3-(1H-imidazole-1-yl)propanoic acid derivatives. These compounds have shown moderate to very good antifungal activity. Multiple linear regression (MLR) was used to generate linear 2D-QSAR models. The dataset consists of 115 compounds with their antifungal activity (log MIC) against «Candida albicans» (ATCC SC5314). Descriptors were calculated, and different models were generated using ChemOffice, Avogadro, and GaussView software. The selected model was validated. The study suggests that an increase in lipophilicity and a reduction in the electronic character of the substituent in R1, as well as a reduction in the steric hindrance of the substituent in R2 and its aromatic character, support the potentiation of the antifungal effect. The results of the QSAR could help scientists propose new compounds with higher antifungal activities intended for immunocompromised patients susceptible to multi-resistant nosocomial infections.
Keywords: quantitative structure–activity relationship, imidazole, antifungal, candida albicans (ATCC SC5314)
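A 2D-QSAR model of this kind is essentially a multiple linear regression of activity on computed descriptors. The sketch below illustrates the workflow with synthetic stand-ins for the descriptor matrix and log MIC values (the real study used descriptors calculated with ChemOffice, Avogadro and GaussView).

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical descriptor matrix (e.g. logP, electronic and steric parameters) for 115 compounds
rng = np.random.default_rng(0)
X = rng.normal(size=(115, 4))                                   # stand-in descriptors
log_mic = 1.2 - 0.8 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.2, 115)  # stand-in activity

X_tr, X_te, y_tr, y_te = train_test_split(X, log_mic, test_size=0.25, random_state=1)
mlr = LinearRegression().fit(X_tr, y_tr)                        # the MLR step of the QSAR
print("coefficients:", np.round(mlr.coef_, 3))
print("test R2:", round(r2_score(y_te, mlr.predict(X_te)), 3))  # external validation metric
```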
Procedia PDF Downloads 84
6695 The Design of the Questionnaire of Attitudes in Physics Teaching
Authors: Ricardo Merlo
Abstract:
Attitude is a hypothetical construct that can be measured meaningfully to determine the favorable or unfavorable predisposition that students have towards the teaching of sciences such as Physics. Although the state of the art on attitude tests used in Physics teaching reports different design and validation models for different groups of students, the weight given to each dimension underpinning the attitude has scarcely been evaluated. In this work, a methodology for the construction of attitude questionnaires is proposed that allows the teacher to design and validate the measurement instrument for different Physics subjects taught in the classroom at the university level, according to the weight assigned to the affective, knowledge, and behavioural dimensions. Finally, questionnaire models were tested in the case of incoming university students, achieving significant results in the improvement of Physics teaching.
Keywords: attitude, physics teaching, motivation, academic performance
Procedia PDF Downloads 70
6694 Testing and Validation of Stochastic Models in Epidemiology
Authors: Snigdha Sahai, Devaki Chikkavenkatappa Yellappa
Abstract:
This study outlines approaches for testing and validating stochastic models used in epidemiology, focusing on the integration and functional testing of simulation code. It details methods for combining simple functions into comprehensive simulations, distinguishing between deterministic and stochastic components, and applying tests to ensure robustness. Techniques include isolating stochastic elements, utilizing large sample sizes for validation, and handling special cases. Practical examples are provided using R code to demonstrate integration testing, handling of incorrect inputs, and special cases. The study emphasizes the importance of both functional and defensive programming to enhance code reliability and user-friendliness.Keywords: computational epidemiology, epidemiology, public health, infectious disease modeling, statistical analysis, health data analysis, disease transmission dynamics, predictive modeling in health, population health modeling, quantitative public health, random sampling simulations, randomized numerical analysis, simulation-based analysis, variance-based simulations, algorithmic disease simulation, computational public health strategies, epidemiological surveillance, disease pattern analysis, epidemic risk assessment, population-based health strategies, preventive healthcare models, infection dynamics in populations, contagion spread prediction models, survival analysis techniques, epidemiological data mining, host-pathogen interaction models, risk assessment algorithms for disease spread, decision-support systems in epidemiology, macro-level health impact simulations, socioeconomic determinants in disease spread, data-driven decision making in public health, quantitative impact assessment of health policies, biostatistical methods in population health, probability-driven health outcome predictions
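The abstract's practical examples are in R; the sketch below illustrates the same ideas in Python under assumed details: the stochastic transmission step is isolated behind a seeded random generator, a large-sample test checks that the simulated mean matches the deterministic expectation, and a defensive test checks that invalid inputs fail loudly.

```python
import numpy as np

def infections_step(s, i, beta, n, rng):
    """Stochastic component: new infections in one step, drawn from a binomial."""
    p_inf = 1.0 - np.exp(-beta * i / n)        # per-susceptible infection probability
    return rng.binomial(s, p_inf)

def test_mean_matches_expectation(n_reps=100_000):
    """Validation with a large sample: stochastic mean should approach the deterministic value."""
    rng = np.random.default_rng(42)
    s, i, beta, n = 990, 10, 0.3, 1000
    expected = s * (1.0 - np.exp(-beta * i / n))
    draws = [infections_step(s, i, beta, n, rng) for _ in range(n_reps)]
    assert abs(np.mean(draws) - expected) < 0.05, "stochastic mean drifted from expectation"

def test_rejects_bad_input():
    """Defensive programming: a negative susceptible count should raise an error."""
    rng = np.random.default_rng(0)
    try:
        infections_step(-1, 10, 0.3, 1000, rng)
    except ValueError:
        return
    raise AssertionError("negative susceptible count was not rejected")

test_mean_matches_expectation()
test_rejects_bad_input()
```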
Procedia PDF Downloads 6
6693 From the Sharing Economy to Social Manufacturing: Analyzing Collaborative Service Networks in the Manufacturing Domain
Authors: Babak Mohajeri
Abstract:
In recent years, the conventional business model of ownership has shifted towards accessibility in a variety of markets. Two trends can be observed in the evolution of this rental-like business model. The first is the technological development that enables the emergence of new business models, which are becoming increasingly agile and flexible. For example, Spotify, an online music streaming company, conveniently provides consumers with access to millions of music tracks through the smartphone, tablet or computer. Similarly, Car2Go, the car sharing company, provides its members with access to flexible, nearby shared cars. The second trend is the increase in communication and connections via social networks. This trend enables a shift to peer-to-peer, accessibility-based business models. Conventionally, companies provide their customers with access to the company's own products or services. In the peer-to-peer model, by contrast, companies facilitate access and connections across their customers so that they can use property, skills, competencies or services owned by other customers. This is the so-called sharing economy business model. The aim of this study is to investigate a new and emerging type of sharing economy model in which the roles of customers and service providers may change dramatically. This new model is called Collaborative Service Networks. We propose a mechanism for the Collaborative Service Networks business model. Uber and Airbnb, two successful and growing companies, have been selected as our case studies, and their business models are analyzed. Finally, we study the emergence of collaborative service networks in the manufacturing domain. Our findings point to a new manufacturing paradigm called social manufacturing.
Keywords: sharing economy, collaborative service networks, social manufacturing, manufacturing development
Procedia PDF Downloads 317
6692 Comparative Operating Speed and Speed Differential Day and Night Time Models for Two Lane Rural Highways
Authors: Vinayak Malaghan, Digvijay Pawar
Abstract:
Speed is the independent parameter which plays a vital role in highway design. The design consistency of highways is checked based on the variation in operating speed. Often the design consistency fails to meet the driver’s expectation, which results in a difference between operating and design speed. Literature reviews have shown that a significant number of crashes take place on horizontal curves due to lack of design consistency. This paper focuses on a continuous speed profile study of the tangent-to-curve transition for both daytime and nighttime. Data were collected using a GPS device, which gives a continuous speed profile, and other parameters such as acceleration and deceleration were analyzed along the tangent-to-curve transition. In the present study, models were developed to predict operating speed on tangents and horizontal curves, as well as a model indicating the speed reduction from tangent to curve, based on continuous speed profile data. It is observed from the study that vehicles tend to decelerate from the approach tangent up to a point between the beginning and the midpoint of the curve, and then accelerate from the curve to the tangent transition. The models generated were compared for day and night and can be used in road safety improvement by evaluating geometric design consistency.
Keywords: operating speed, design consistency, continuous speed profile data, day and night time
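Operating-speed models of this kind are commonly regressions of the 85th percentile speed on geometric variables. The sketch below fits one assumed functional form, V85 = a + b/R with curve radius R, to hypothetical observations; the form, the values, and the tangent speed are illustrative, not the paper's calibrated day/night models.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical observations: curve radius (m) and measured 85th percentile curve speed (km/h)
radius = np.array([100, 150, 200, 300, 400, 600, 800], dtype=float)
v85_curve = np.array([52, 58, 63, 69, 73, 77, 79], dtype=float)

# Assumed model form: V85 = a + b * (1/R)
X = (1.0 / radius).reshape(-1, 1)
model = LinearRegression().fit(X, v85_curve)
a, b = model.intercept_, model.coef_[0]
print(f"V85 ~ {a:.1f} {b:+.0f}/R  (R in metres)")

# Speed reduction relative to an assumed approach-tangent speed of 82 km/h at R = 200 m
v85_tangent = 82.0
reduction = v85_tangent - model.predict(np.array([[1.0 / 200]]))[0]
print(f"predicted speed reduction at R=200 m: {reduction:.1f} km/h")
```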
Procedia PDF Downloads 157
6691 A Basic Modeling Approach for the 3D Protein Structure of Insulin
Authors: Daniel Zarzo Montes, Manuel Zarzo Castelló
Abstract:
Proteins play a fundamental role in biology, but their structure is complex, and it is a challenge for teachers to conceptually explain the differences between their primary, secondary, tertiary, and quaternary structures. On the other hand, there are currently many computer programs to visualize the 3D structure of proteins, but they require advanced training and knowledge. Moreover, it becomes difficult to visualize the sequence of amino acids in these models, and how the protein conformation is reached. Given this drawback, a simple and instructive procedure is proposed in order to teach the protein structure to undergraduate and graduate students. For this purpose, insulin has been chosen because it is a protein that consists of 51 amino acids, a relatively small number. The methodology has consisted of the use of plastic atom models, which are frequently used in organic chemistry and biochemistry to explain the chirality of biomolecules. For didactic purposes, when the aim is to teach the biochemical foundations of proteins, a manipulative system seems convenient, starting from the chemical structure of amino acids. It has the advantage that the bonds between amino acids can be conveniently rotated, following the pattern marked by the 3D models. First, the 51 amino acids were modeled, and then they were linked according to the sequence of this protein. Next, the three disulfide bonds that characterize the stability of insulin have been established, and then the alpha-helix structure has been formed. In order to reach the tertiary 3D conformation of this protein, different interactive models available on the Internet have been visualized. In conclusion, the proposed methodology seems very suitable for biology and biochemistry students because they can learn the fundamentals of protein modeling by means of a manipulative procedure as a basis for understanding the functionality of proteins. This methodology would be conveniently useful for a biology or biochemistry laboratory practice, either at the pre-graduate or university level.Keywords: protein structure, 3D model, insulin, biomolecule
Procedia PDF Downloads 55
6690 Numerical Model Validation Using Durbin Method
Authors: H. Al-Hajeri
Abstract:
The computation of the effectiveness of turbulence enhancement surface features, such as ribs, as a means of promoting mixing and hence heat transfer has attracted the continued attention of the engineering community. In this study, the simulation of a three-dimensional cooling passage is carried out employing a number of turbulence models, including the Durbin model. The cooling passage consists of a square-section duct whose upper and lower surfaces feature staggered cuboid ribs. The main objective of this paper is to provide comparisons of the performance of the v2-f model against other established turbulence models as implemented in the commercial CFD code Ansys Fluent. The present study demonstrates that the v2-f model can successfully capture the isothermal air flow phenomena in flow over obstacles.
Keywords: CFD, cooling passage, Durbin model, turbulence model
Procedia PDF Downloads 503
6689 A Sliding Mesh Technique and Compressibility Correction Effects of Two-Equation Turbulence Models for a Pintle-Perturbed Flow Analysis
Authors: J. Y. Heo, H. G. Sung
Abstract:
Numerical simulations have been performed to assess the compressibility correction of two-equation turbulence models suitable for large-scale separation flows perturbed by pintle strokes. In order to take pintle movement into account, a sliding mesh method was applied. The chamber pressure, mass flow rate, and thrust have been analyzed, and the response lag and sensitivity at the chamber and nozzle were estimated for a movable pintle. The nozzle performance during pintle reciprocation, i.e., its insertion and extraction processes, was analyzed to better understand the dynamic performance of the pintle nozzle.
Keywords: pintle, sliding mesh, turbulent model, compressibility correction
Procedia PDF Downloads 489
6688 Hate Speech Detection Using Deep Learning and Machine Learning Models
Authors: Nabil Shawkat, Jamil Saquer
Abstract:
Social media has accelerated our ability to engage with others and eliminated many communication barriers. On the other hand, the widespread use of social media resulted in an increase in online hate speech. This has drastic impacts on vulnerable individuals and societies. Therefore, it is critical to detect hate speech to prevent innocent users and vulnerable communities from becoming victims of hate speech. We investigate the performance of different deep learning and machine learning algorithms on three different datasets. Our results show that the BERT model gives the best performance among all the models by achieving an F1-score of 90.6% on one of the datasets and F1-scores of 89.7% and 88.2% on the other two datasets.
Keywords: hate speech, machine learning, deep learning, abusive words, social media, text classification
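Fine-tuning BERT is beyond a short snippet, but the evaluation workflow behind the F1-score comparison can be illustrated with a simpler baseline. The sketch below trains a TF-IDF plus logistic regression classifier on a handful of hypothetical labelled texts and reports the F1-score; it is an illustrative stand-in, not one of the models from the study.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

# Hypothetical labelled examples (1 = hateful, 0 = not hateful)
texts = ["you are all wonderful", "I hate that group of people", "nice weather today",
         "those people should disappear", "great match last night", "get out of our country"]
labels = [0, 1, 0, 1, 0, 1]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

preds = clf.predict(texts)          # in practice this would be a held-out test split
print("F1-score:", f1_score(labels, preds))
```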
Procedia PDF Downloads 136
6687 Enhancing Sell-In and Sell-Out Forecasting Using Ensemble Machine Learning Method
Authors: Vishal Das, Tianyi Mao, Zhicheng Geng, Carmen Flores, Diego Pelloso, Fang Wang
Abstract:
Accurate sell-in and sell-out forecasting is a ubiquitous problem in the retail industry. It is an important element of any demand planning activity. As a global food and beverage company, Nestlé has hundreds of products in each geographical location that they operate in. Each product has its sell-in and sell-out time series data, which are forecasted on a weekly and monthly scale for demand and financial planning. To address this challenge, Nestlé Chile, in collaboration with Amazon Machine Learning Solutions Lab, has developed their in-house solution of using machine learning models for forecasting. Similar products are combined together such that there is one model for each product category. In this way, the models learn from a larger set of data, and there are fewer models to maintain. The solution is scalable to all product categories and is developed to be flexible enough to include any new product or eliminate any existing product in a product category based on requirements. We show how we can use the machine learning development environment on Amazon Web Services (AWS) to explore a set of forecasting models and create business intelligence dashboards that can be used with the existing demand planning tools in Nestlé. We explored recent deep learning networks (DNN), which show promising results for a variety of time series forecasting problems. Specifically, we used a DeepAR autoregressive model that can group similar time series together and provide robust predictions. To further enhance the accuracy of the predictions and include domain-specific knowledge, we designed an ensemble approach using DeepAR and XGBoost regression model. As part of the ensemble approach, we interlinked the sell-out and sell-in information to ensure that a future sell-out influences the current sell-in predictions. Our approach outperforms the benchmark statistical models by more than 50%. The machine learning (ML) pipeline implemented in the cloud is currently being extended for other product categories and is getting adopted by other geomarkets.
Keywords: sell-in and sell-out forecasting, demand planning, DeepAR, retail, ensemble machine learning, time-series
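As a rough illustration of the ensemble idea, and not Nestlé's production pipeline, the sketch below combines a base forecast with an XGBoost regressor trained on lagged sell-out features so that sell-out history influences the sell-in prediction. The data are synthetic, the DeepAR component is replaced by a naive seasonal baseline, and all feature names are hypothetical.

```python
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
weeks = 156
sell_out = 100 + 10 * np.sin(np.arange(weeks) * 2 * np.pi / 52) + rng.normal(0, 3, weeks)
sell_in = np.roll(sell_out, 2) + rng.normal(0, 4, weeks)       # synthetic linked series

df = pd.DataFrame({"sell_in": sell_in, "sell_out": sell_out})
df["base_forecast"] = df["sell_in"].shift(52)                  # naive seasonal stand-in for DeepAR
for lag in (1, 2, 4):                                          # interlink sell-out history with the sell-in target
    df[f"sell_out_lag{lag}"] = df["sell_out"].shift(lag)
df = df.dropna()

features = ["base_forecast", "sell_out_lag1", "sell_out_lag2", "sell_out_lag4"]
train, test = df.iloc[:-12], df.iloc[-12:]

model = XGBRegressor(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(train[features], train["sell_in"])
pred = model.predict(test[features])
mape = np.mean(np.abs(pred - test["sell_in"]) / test["sell_in"]) * 100
print(f"ensemble MAPE on hold-out weeks: {mape:.1f}%")
```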
Procedia PDF Downloads 274
6686 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models
Authors: Ramin Vafadary, Maryam Khanbaghi
Abstract:
Forecasting electricity load is important for various purposes like planning, operation, and control. Forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria, namely the mean absolute error and the root mean square error. The National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time-series forecasts, we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
Keywords: bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow probability, time series
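A minimal sketch of the SARIMA step on hourly load data is shown below, using statsmodels' SARIMAX with a 24-hour seasonal period; the order values and the synthetic series are placeholders, not the tuned model or the NREL data from the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.metrics import mean_absolute_error

# Synthetic hourly load with a daily (24 h) seasonal pattern
rng = np.random.default_rng(0)
hours = pd.date_range("2023-01-01", periods=30 * 24, freq="h")
load = 50 + 15 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 2, len(hours))
series = pd.Series(load, index=hours)

train, test = series[:-24], series[-24:]            # hold out the last 24 hours
results = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24)).fit(disp=False)
forecast = results.forecast(steps=24)

print("MAE:", mean_absolute_error(test, forecast))
print("RMSE:", np.sqrt(np.mean((test.values - forecast.values) ** 2)))
```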
Procedia PDF Downloads 95
6685 Hidden Markov Movement Modelling with Irregular Data
Authors: Victoria Goodall, Paul Fatti, Norman Owen-Smith
Abstract:
Hidden Markov Models have become popular for the analysis of animal tracking data. These models are being used to model the movements of a variety of species in many areas around the world. A common assumption of the model is that the observations need to have regular time steps. In many ecological studies, this will not be the case. The objective of the research is to modify the movement model to allow for irregularly spaced locations and investigate the effect on the inferences which can be made about the latent states. A modification of the likelihood function to allow for these irregular spaced locations is investigated, without using interpolation or averaging the movement rate. The suitability of the modification is investigated using GPS tracking data for lion (Panthera leo) in South Africa, with many observations obtained during the night, and few observations during the day. Many nocturnal predator tracking studies are set up in this way, to obtain many locations at night when the animal is most active and is difficult to observe. Few observations are obtained during the day, when the animal is expected to rest and is potentially easier to observe. Modifying the likelihood function allows the popular Hidden Markov Model framework to be used to model these irregular spaced locations, making use of all the observed data.Keywords: hidden Markov Models, irregular observations, animal movement modelling, nocturnal predator
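One standard way to let a hidden Markov model accommodate irregular sampling, which is not necessarily the exact likelihood modification used by the authors, is to define a continuous-time transition rate matrix and rescale the transition probabilities by each observed time gap. The forward-algorithm sketch below illustrates this with made-up rates, gaps, and emission likelihoods.

```python
import numpy as np
from scipy.linalg import expm

# Two behavioural states (e.g. resting vs. moving) with a hypothetical transition *rate* matrix Q
Q = np.array([[-0.2, 0.2],
              [0.5, -0.5]])                 # rates per hour
pi0 = np.array([0.5, 0.5])                  # initial state distribution

# Irregular gaps between GPS fixes (hours): dense at night, one long daytime gap
dt = np.array([0.5, 0.5, 6.0, 0.5])
# Per-state emission likelihoods of each of the 5 fixes (hypothetical values)
emis = np.array([[0.8, 0.1],
                 [0.7, 0.2],
                 [0.3, 0.6],
                 [0.9, 0.1],
                 [0.8, 0.2]])

# Forward algorithm with a gap-dependent transition matrix P(dt) = expm(Q * dt)
alpha = pi0 * emis[0]
loglik = np.log(alpha.sum())
alpha /= alpha.sum()
for k, gap in enumerate(dt, start=1):
    P = expm(Q * gap)                       # transition probabilities over this specific gap
    alpha = (alpha @ P) * emis[k]
    loglik += np.log(alpha.sum())           # scale to avoid numerical underflow
    alpha /= alpha.sum()
print("log-likelihood of the track:", loglik)
```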
Procedia PDF Downloads 244
6684 Comparison of Adsorbents for Ammonia Removal from Mining Wastewater
Authors: F. Al-Sheikh, C. Moralejo, M. Pritzker, W. A. Anderson, A. Elkamel
Abstract:
Ammonia in mining wastewater is a significant problem, and treatment can be especially difficult in cold climates where biological treatment is not feasible. An adsorption process is one of the alternative processes that can be used to reduce ammonia concentrations to acceptable limits, and therefore a strongly acidic H+ form LEWATIT ion exchange resin and a Bowie Chabazite Na form AZLB-Na zeolite were tested to assess their effectiveness. For these adsorption tests, two packed bed columns (a mini-column constructed from a 32-cm long x 1-cm diameter piece of glass tubing, and a 60-cm long x 2.5-cm diameter Ace Glass chromatography column) containing varying quantities of the adsorbents were used. A mining wastewater with an ammonia concentration of 22.7 mg/L was fed through the columns at controlled flowrates. In the experimental work, the maximum capacities of the LEWATIT ion exchange resin were 0.438, 0.448, and 1.472 mg/g for 3, 6, and 9 g, respectively, in the mini-column and 1.739 mg/g for 141.5 g in the larger Ace column, while the capacities of the AZLB-Na zeolite were 0.424 and 0.784 mg/g for 3 and 6 g, respectively, in the mini-column and 1.1636 mg/g for 38.5 g in the Ace column. In the theoretical work, Thomas, Adams-Bohart, and Yoon-Nelson models were constructed to describe the breakthrough curve of the adsorption process and to find the constants of the above-mentioned models. In the regeneration tests, 5% hydrochloric acid, HCl (v/v), and 10% sodium hydroxide, NaOH (w/v), were used to regenerate the LEWATIT resin and the AZLB-Na zeolite with 44% and 63.8% recovery, respectively. In conclusion, continuous flow adsorption using a LEWATIT ion exchange resin and an AZLB-Na zeolite is efficient when using a co-flow technique for removal of ammonia from wastewater. The Thomas, Adams-Bohart, and Yoon-Nelson models satisfactorily fit the data, with R2 close to 1 in all cases.
Keywords: AZLB-Na zeolite, continuous adsorption, Lewatit resin, models, regeneration
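The Thomas model mentioned above relates the effluent-to-feed concentration ratio to time on stream. The sketch below fits its common form, Ct/C0 = 1 / (1 + exp(kTh·q0·m/Q − kTh·C0·t)), to hypothetical breakthrough data with scipy; the flow rate, mass, and measured points are illustrative placeholders, not the experimental values.

```python
import numpy as np
from scipy.optimize import curve_fit

C0 = 22.7      # feed ammonia concentration, mg/L (from the abstract)
Q = 0.1        # flow rate, L/min (hypothetical)
m = 6.0        # adsorbent mass, g (hypothetical)

def thomas(t, k_th, q0):
    """Thomas model: Ct/C0 = 1 / (1 + exp(k_th*q0*m/Q - k_th*C0*t))."""
    return 1.0 / (1.0 + np.exp(k_th * q0 * m / Q - k_th * C0 * t))

# Hypothetical breakthrough data: time on stream (min) vs. Ct/C0
t = np.array([0, 30, 60, 90, 120, 150, 180, 240], dtype=float)
ct_c0 = np.array([0.01, 0.03, 0.10, 0.28, 0.55, 0.78, 0.90, 0.98])

(k_fit, q0_fit), _ = curve_fit(thomas, t, ct_c0, p0=[0.002, 40.0], maxfev=10000)
print(f"k_Th ~ {k_fit:.4f} L/(mg*min), q0 ~ {q0_fit:.2f} mg/g")
```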
Procedia PDF Downloads 389
6683 Benefits of Automobile Electronic Technology in the Logistics Industry in Third World Countries
Authors: Jonathan Matyenyika
Abstract:
In recent years, automobile manufacturers have increasingly produced vehicles equipped with cutting-edge automotive electronic technology to match today's fast-paced digital world; this has brought various benefits to the different business sectors that use these vehicles to turn over a profit. In the logistics industry, vehicles equipped with this technology have proved to be very useful; this paper focuses on the benefits that such vehicles bring to the logistics industry. Automotive vehicle manufacturers have introduced new electronic features to their vehicles to enhance and improve overall performance, efficiency, safety and driver comfort. Some of these features have proved to be beneficial to logistics operators. Starting with the introduction of adaptive cruise control in long-distance haulage vehicles, to see how this system benefits drivers, we carried out research in the form of interviews with long-distance truck drivers, the main question being what major difference they had experienced since they started to operate vehicles equipped with this technology. Most stated that they are less tired and are able to drive longer distances compared to when they used vehicles not equipped with this system. As a result, they can deliver faster and take on the next assignment, thus improving efficiency and bringing in more monetary return for the logistics company. Secondly, the introduction of electric hybrid technology: this system allows the vehicle to be propelled by electric power stored in batteries located in the vehicle instead of fossil fuel. Consequently, this benefits the logistics company, as vehicles become cheaper to run because electricity is more affordable than fossil fuel. The merging of electronic systems into vehicles has proved to be of great benefit, and this research shows that it can benefit the logistics industry in plenty of ways.
Keywords: logistics, manufacturing, hybrid technology, haulage vehicles
Procedia PDF Downloads 57
6682 Improving the Accuracy of Stress Intensity Factors Obtained by Scaled Boundary Finite Element Method on Hybrid Quadtree Meshes
Authors: Adrian W. Egger, Savvas P. Triantafyllou, Eleni N. Chatzi
Abstract:
The scaled boundary finite element method (SBFEM) is a semi-analytical numerical method, which introduces a scaling center in each element’s domain, thus transitioning from a Cartesian reference frame to one resembling polar coordinates. Consequently, an analytical solution is achieved in radial direction, implying that only the boundary need be discretized. The only limitation imposed on the resulting polygonal elements is that they remain star-convex. Further arbitrary p- or h-refinement may be applied locally in a mesh. The polygonal nature of SBFEM elements has been exploited in quadtree meshes to alleviate all issues conventionally associated with hanging nodes. Furthermore, since in 2D this results in only 16 possible cell configurations, these are precomputed in order to accelerate the forward analysis significantly. Any cells, which are clipped to accommodate the domain geometry, must be computed conventionally. However, since SBFEM permits polygonal elements, significantly coarser meshes at comparable accuracy levels are obtained when compared with conventional quadtree analysis, further increasing the computational efficiency of this scheme. The generalized stress intensity factors (gSIFs) are computed by exploiting the semi-analytical solution in radial direction. This is initiated by placing the scaling center of the element containing the crack at the crack tip. Taking an analytical limit of this element’s stress field as it approaches the crack tip, delivers an expression for the singular stress field. By applying the problem specific boundary conditions, the geometry correction factor is obtained, and the gSIFs are then evaluated based on their formal definition. Since the SBFEM solution is constructed as a power series, not unlike mode superposition in FEM, the two modes contributing to the singular response of the element can be easily identified in post-processing. Compared to the extended finite element method (XFEM) this approach is highly convenient, since neither enrichment terms nor a priori knowledge of the singularity is required. Computation of the gSIFs by SBFEM permits exceptional accuracy, however, when combined with hybrid quadtrees employing linear elements, this does not always hold. Nevertheless, it has been shown that crack propagation schemes are highly effective even given very coarse discretization since they only rely on the ratio of mode one to mode two gSIFs. The absolute values of the gSIFs may still be subject to large errors. Hence, we propose a post-processing scheme, which minimizes the error resulting from the approximation space of the cracked element, thus limiting the error in the gSIFs to the discretization error of the quadtree mesh. This is achieved by h- and/or p-refinement of the cracked element, which elevates the amount of modes present in the solution. The resulting numerical description of the element is highly accurate, with the main error source now stemming from its boundary displacement solution. Numerical examples show that this post-processing procedure can significantly improve the accuracy of the computed gSIFs with negligible computational cost even on coarse meshes resulting from hybrid quadtrees.Keywords: linear elastic fracture mechanics, generalized stress intensity factors, scaled finite element method, hybrid quadtrees
Procedia PDF Downloads 146
6681 Towards a Standardization in Scheduling Models: Assessing the Variety of Homonyms
Authors: Marcel Rojahn, Edzard Weber, Norbert Gronau
Abstract:
Terminology is a critical instrument for every researcher. Different terminologies for the same research object may arise in different research communities, and through this inconsistency many synergistic effects get lost. Theories and models will be more understandable and reusable if a common terminology is applied. This paper examines the terminological (in)consistency of the research field of job-shop scheduling through a literature review. There is an enormous variety in the choice of terms and mathematical notation for the same concept. The comparability, reusability, and combinability of scheduling methods are unnecessarily hampered by the arbitrary use of homonyms and synonyms. The acceptance of the variables and notation forms used in the community is shown by means of a compliance quotient, determined through the evaluation of 240 scientific publications on planning methods.
Keywords: job-shop scheduling, terminology, notation, standardization
Procedia PDF Downloads 109
6680 Estimating Evapotranspiration of Irrigated Maize in Brazil Using a Hybrid Modelling Approach and Satellite Image Inputs
Authors: Ivo Zution Goncalves, Christopher M. U. Neale, Hiran Medeiros, Everardo Mantovani, Natalia Souza
Abstract:
Multispectral and thermal infrared imagery from satellite sensors, coupled with climate and soil datasets, was used to estimate evapotranspiration and biomass in center pivots planted to maize in Brazil during the 2016 season. The hybrid remote-sensing-based model named Spatial EvapoTranspiration Modelling Interface (SETMI) was applied using multispectral and thermal infrared imagery from the Landsat Thematic Mapper instrument. Field data collected by the IRRIGER center pivot management company included daily weather information, such as maximum and minimum temperature, precipitation, and relative humidity, for estimating reference evapotranspiration. In addition, soil water content data were obtained every 0.20 m in the soil profile down to 0.60 m depth throughout the season. Early-season soil samples were used to obtain water-holding capacity, wilting point, saturated hydraulic conductivity, initial volumetric soil water content, layer thickness, and saturated volumetric water content. Crop canopy development parameters and irrigation application depths were also inputs of the model. The modeling approach is based on the reflectance-based crop coefficient approach contained within the SETMI hybrid ET model, using relationships developed in Nebraska. The model was applied to several fields located in Minas Gerais State in Brazil, at approximate latitude -16.630434 and longitude -47.192876. The model provides estimates of actual crop evapotranspiration (ET), crop irrigation requirements, and all soil water balance outputs, including biomass estimation, using multi-temporal satellite image inputs. An interpolation scheme based on the growing degree-day concept was used to model the periods between satellite inputs, filling the gaps between image dates and obtaining daily data. Actual and accumulated ET, accumulated cold temperature and water stress, and crop water requirements estimated by the model were compared with data measured at the experimental fields. Results indicate that the SETMI modeling approach, using data assimilation, produced reliable daily ET and crop water requirement estimates for maize, interpolated between remote sensing observations, confirming the applicability of the SETMI model, with the new relationships developed in Nebraska, for estimating mainly ET and water requirements in Brazil under tropical conditions.
Keywords: basal crop coefficient, irrigation, remote sensing, SETMI
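The growing degree-day interpolation mentioned above can be sketched as follows: the basal crop coefficient retrieved from reflectance on each image date is interpolated on a cumulative-GDD axis rather than on calendar days. All temperatures, the base temperature, the image dates, and the Kcb values below are hypothetical placeholders.

```python
import numpy as np

base_temp = 10.0                                   # assumed maize base temperature, °C
tmax = np.array([30, 31, 29, 33, 32, 30, 28, 31, 34, 33, 32, 30], dtype=float)
tmin = np.array([18, 17, 16, 19, 18, 17, 15, 18, 20, 19, 18, 17], dtype=float)

gdd_daily = np.maximum((tmax + tmin) / 2.0 - base_temp, 0.0)
gdd_cum = np.cumsum(gdd_daily)                     # cumulative GDD for each day

# Basal crop coefficient (Kcb) retrieved from reflectance on two image dates (days 2 and 10)
image_days = [2, 10]
kcb_images = [0.45, 0.95]
gdd_at_images = gdd_cum[image_days]

# Interpolate Kcb for every day on the cumulative-GDD axis, not on calendar days
kcb_daily = np.interp(gdd_cum, gdd_at_images, kcb_images)
print(np.round(kcb_daily, 2))
```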
Procedia PDF Downloads 140
6679 Assessment of Material Type, Diameter, Orientation and Closeness of Fibers in Vulcanized Reinforced Rubbers
Authors: Ali Osman Güney, Bahattin Kanber
Abstract:
In this work, the effect of the material type, diameter, orientation and closeness of fibers on the general performance of reinforced vulcanized rubbers is investigated using the finite element method with experimental verification. Various fiber materials such as hemp, nylon and polyester are used for different fiber diameters, orientations and closeness. 3D finite element models are developed by considering bonded contact elements at the fiber and rubber sheet interfaces. The fibers are assumed to be linear elastic, while the vulcanized rubber is considered hyper-elastic. After an experimental verification of the finite element results, the developed models are analyzed under a prescribed displacement that causes tension. The normal stresses in the fibers and the shear stresses between the fibers and the rubber sheet are investigated in all models. The large deformation of the reinforced rubber sheet is also represented for various fiber conditions under incremental loading. A general assessment is made of the best fiber properties of reinforced rubber sheets for tension-load conditions.
Keywords: reinforced vulcanized rubbers, fiber properties, out of plane loading, finite element method
Procedia PDF Downloads 346
6678 Improving the Biomechanical Resistance of a Treated Tooth via Composite Restorations Using Optimised Cavity Geometries
Authors: Behzad Babaei, B. Gangadhara Prusty
Abstract:
The objective of this study is to assess the hypotheses that a restored tooth with a class II occlusal-distal (OD) cavity can be strengthened by designing an optimized cavity geometry, as well as by selecting a composite restoration with optimized elastic moduli, when there is a sharp de-bonded edge at the interface of the tooth and restoration. Methods: A scanned human maxillary molar tooth was segmented into dentine and enamel parts. The dentine and enamel profiles were extracted and imported into finite element (FE) software. The enamel rod orientations were estimated virtually. Fifteen models of the restored tooth with different occlusal cavity depths (1.5, 2, and 2.5 mm) and internal cavity angles were generated. By using a semi-circular stone part, a 400 N load was applied to two contact points of the restored tooth model. The junctions between the enamel, dentine, and restoration were considered perfectly bonded. All parts in the model were considered homogeneous, isotropic, and elastic. Quadrilateral and triangular elements were employed in the models. A mesh convergence analysis was conducted to verify that the number of elements did not influence the simulation results. According to the criterion of a 5% error in the stress, we found that a total of over 14,000 elements resulted in convergence of the stress. A Python script was employed to automatically assign moduli of 2-22 GPa (in increments of 4 GPa) to the composite restorations, 18.6 GPa to the dentine, and two different elastic moduli to the enamel (72 GPa in the enamel rods’ direction and 63 GPa in the perpendicular one). Linear, homogeneous, and elastic material models were considered for the dentine, enamel, and composite restorations. 108 FEA simulations were successively conducted. Results: The internal cavity angles (α) significantly altered the peak maximum principal stress at the interface of the enamel and restoration. The strongest structures against the contact loads were observed in the models with α = 100° and 105°. Even when the enamel rods’ directional mechanical properties were disregarded, interestingly, the models with α = 100° and 105° exhibited the highest resistance against the mechanical loads. Regarding the effect of occlusal cavity depth, the models with 1.5 mm depth showed higher resistance to contact loads than the models with thicker cavities (2.0 and 2.5 mm). Moreover, composite moduli in the range of 10-18 GPa alleviated the stress levels in the enamel. Significance: For the class II OD cavity models in this study, the optimal geometries, composite properties, and occlusal cavity depths were determined. Designing the cavities with α ≥ 100° was significantly effective in minimizing peak stress levels. The composite restoration with optimized properties reduced the stress concentrations at critical points of the models. Additionally, when more enamel was preserved, a sturdier enamel-restoration interface against the mechanical loads was observed.
Keywords: dental composite restoration, cavity geometry, finite element approach, maximum principal stress
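The abstract mentions a Python script that automatically assigns material moduli across the model variants. A simplified stand-in is sketched below: it only builds the run matrix of material assignments and geometric parameters, with the coupling to the FE solver omitted and the set of internal cavity angles assumed (the study itself reports 108 simulations, so its exact parameter grid differs).

```python
from itertools import product

composite_moduli = range(2, 23, 4)           # 2, 6, 10, 14, 18, 22 GPa (from the abstract)
cavity_depths = (1.5, 2.0, 2.5)              # mm (from the abstract)
cavity_angles = (90, 95, 100, 105, 110)      # degrees (assumed set of internal angles)

DENTINE_E = 18.6                             # GPa
ENAMEL_E = {"along_rods": 72.0, "perpendicular": 63.0}   # GPa

runs = []
for e_comp, depth, angle in product(composite_moduli, cavity_depths, cavity_angles):
    runs.append({
        "composite_E_GPa": e_comp,
        "dentine_E_GPa": DENTINE_E,
        "enamel_E_GPa": ENAMEL_E,
        "cavity_depth_mm": depth,
        "cavity_angle_deg": angle,
    })

print(len(runs), "candidate simulations")    # each entry would be passed to the FE solver
```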
Procedia PDF Downloads 101
6677 An Application of Graph Theory to the Electrical Circuit Using Matrix Method
Authors: Samai'la Abdullahi
Abstract:
A graph is a pair of two sets, vertices and edges, and it provides a pictorial representation of a system using two basic elements: nodes and edges. A node is represented by a circle (either hollow or shaded), and an edge is represented by a line segment connecting two nodes. In this paper, we present a circuit network as an application of graph theory; circuit models of the graph are represented using the logical connection method, where we formulate the adjacency and incidence matrices and apply truth tables.
Keywords: euler circuit and path, graph representation of circuit networks, representation of graph models, representation of circuit network using logical truth table
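As an illustration of the matrix method, the adjacency and incidence matrices of a small circuit graph can be built as follows; the example circuit (four junctions, five branches) is made up for illustration and is not one from the paper.

```python
import numpy as np

# A small circuit graph: nodes are circuit junctions, edges are branches (components)
nodes = ["n0", "n1", "n2", "n3"]
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]

n, m = len(nodes), len(edges)
adjacency = np.zeros((n, n), dtype=int)
incidence = np.zeros((n, m), dtype=int)

for k, (i, j) in enumerate(edges):
    adjacency[i, j] = adjacency[j, i] = 1          # undirected adjacency
    incidence[i, k] = 1                            # oriented incidence: +1 where the branch leaves,
    incidence[j, k] = -1                           # -1 where it enters

print("adjacency matrix:\n", adjacency)
print("incidence matrix:\n", incidence)
```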
Procedia PDF Downloads 561
6676 Using Neural Networks for Click Prediction of Sponsored Search
Authors: Afroze Ibrahim Baqapuri, Ilya Trofimov
Abstract:
Sponsored search is a multi-billion dollar industry and makes up a major source of revenue for search engines (SE). Click-through-rate (CTR) estimation plays a crucial role in ad selection and greatly affects SE revenue, advertiser traffic and user experience. We propose a novel architecture for solving the CTR prediction problem by combining artificial neural networks (ANN) with decision trees. First, we compare the ANN with other popular machine learning models used for this task. Then we go on to combine the ANN with MatrixNet (a proprietary implementation of boosted trees) and evaluate the performance of the system as a whole. The results show that our approach provides a significant improvement over existing models.
Keywords: neural networks, sponsored search, web advertisement, click prediction, click-through rate
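MatrixNet is proprietary, so the sketch below substitutes scikit-learn's gradient boosted trees and combines them with a small neural network by averaging predicted click probabilities on synthetic data. It is only meant to illustrate the hybrid ANN-plus-trees idea, not the architecture proposed in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import log_loss

# Synthetic "impression" features with a rare positive (click) class
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0).fit(X_tr, y_tr)
gbt = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

p_ann = ann.predict_proba(X_te)[:, 1]
p_gbt = gbt.predict_proba(X_te)[:, 1]
p_mix = 0.5 * (p_ann + p_gbt)                      # simple probability-averaging combination

for name, p in [("ANN", p_ann), ("boosted trees", p_gbt), ("combined", p_mix)]:
    print(name, "log-loss:", round(log_loss(y_te, p), 4))
```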
Procedia PDF Downloads 572
6675 Seismic Behavior of Suction Caisson Foundations
Authors: Mohsen Saleh Asheghabadi, Alireza Jafari Jebeli
Abstract:
Increasing population growth requires more sustainable development of energy. Offshore wind is a non-contaminating energy with an inexhaustible source. One of the vital parameters in such structures is the choice of foundation type. Suction caissons are now used extensively worldwide for offshore wind turbines. Considering the presence of a number of offshore wind farms in earthquake areas, the study of the seismic behavior of suction caissons is necessary for better design. In this paper, the results obtained from three suction caisson models with different diameters (D) and skirt lengths (L) in saturated sand were compared with centrifuge test results. All models are analyzed using the 3D finite element (FE) method, taking account of the elasto-plastic Mohr–Coulomb constitutive model for soil, which is available in the ABAQUS library. The earthquake load was applied to the base of the models with a maximum acceleration of 0.65g. The results showed that the numerical method is in relatively good agreement with the centrifuge results. The settlement and rotation of the foundation decrease with increasing skirt length and foundation diameter. The sand soil outside the caisson is prone to liquefaction due to its low confinement.
Keywords: liquefaction, suction caisson foundation, offshore wind turbine, numerical analysis, seismic behavior
Procedia PDF Downloads 119
6674 Robotics Technology Supported Pedagogic Models in Science, Technology, Engineering, Arts and Mathematics Education
Authors: Sereen Itani
Abstract:
As the world aspires to technological innovation, innovative robotics technology-supported pedagogic models in STEAM education (Science, Technology, Engineering, Arts, and Mathematics) are critical for our global education system to build and enhance the next generation's 21st century skills. Thus, diverse international schools endeavor to construct an integrated robotics and technology-enhanced curriculum based on interdisciplinary subjects. Accordingly, it is vital that the globe remains resilient in STEAM fields by equipping future learners and educators with innovative technology experiences through robotics to support such fields. A variety of advanced teaching methods is employed to learn about robotics technology-integrated pedagogic models. Therefore, it is only when STEAM and innovations in robotic technology become integrated with real-world applications that transformational learning can occur. The implementation of robotics STEAM education faces major challenges globally. Moreover, STEAM skills and concepts are often communicated in separation from the real world. Instilling a passion for robotics and STEAM subjects, together with educators’ preparation, could lead to students majoring in such fields by acquiring enough knowledge to make vital contributions to the global STEAM industries. Thus, this necessitates the establishment of pedagogic models such as innovative robotics technologies to enhance STEAM education and develop students’ 21st-century skills. Moreover, an innovative ICT-supported robotics classroom will help educators empower and assess students academically. Globally, robotics design systems and platforms are being developed in schools and university labs, creating a suitable environment for robotics cross-discipline STEAM learning. Accordingly, the research aims at raising awareness of the importance of robotics design systems and of methodologies for the effective employment of robotics innovative technology-supported pedagogic models to enhance and develop STEAM education globally and enhance the next generation's 21st century skills.
Keywords: education, robotics, STEAM (Science, Technology, Engineering, Arts and Mathematics Education), challenges
Procedia PDF Downloads 384
6673 The Effect of Hybrid SPD Process on Mechanical Properties, Drawability, and Plastic Anisotropy of DC03 Steel
Authors: Karolina Kowalczyk-Skoczylas
Abstract:
The hybrid SPD process called DRECE (Dual Rolls Equal Channel Extrusion) combines the concepts of the ECAP method and CONFORM extrusion and is intended for processing sheet-metal workpieces. The material, in the form of a metal strip, is subjected to plastic deformation by passing through the shaping tool at a given angle α. Importantly, in this process the dimensions of the metal strip do not change after the pass is completed. Subsequent DRECE passes allow for increasing the effective strain in the tested material. The method has a significant effect on the microstructure and mechanical properties of the strip. The experimental tests have been conducted on the unconventional DRECE device at VŠB Ostrava, the Czech Republic. The DC03 steel strips have been processed in several passes - up to six. Then, both Erichsen cupping tests and static tensile tests have been performed to evaluate the effect of the DRECE process on the drawability, plastic anisotropy and mechanical properties of the investigated steel. Both the yield strength and the ultimate tensile strength increase significantly after consecutive passes. Drawability decreases slightly after the first and second passes. Then it stabilizes at a reasonably high level, which means that the steel retains a drawability useful for technological processes. It was found that the material is characterized by normal anisotropy. In the microstructure, an intensification of the development of microshear bands and their mutual intersection is observed, which leads to the fragmentation of the grains into smaller volumes and, consequently, to the formation of an ultrafine-grained structure. "The project was co-financed by the European Union within the programme "The European Funds for Śląsk (Silesia) 2021-2027".
Keywords: SPD process, low carbon steel, mechanical properties, plastic deformation, microstructure evolution
Procedia PDF Downloads 16
6672 Computer Simulation Studies of Aircraft Wing Architectures on Vibration Responses
Authors: Shengyong Zhang, Mike Mikulich
Abstract:
Vibration is a crucial limiting consideration in the analysis and design of airplane wing structures to avoid disastrous failures due to the propagation of existing cracks in the material. In this paper, we build CAD models of aircraft wings to capture the design intent of different structural configurations. Subsequent FEA vibration analysis is performed to study the natural vibration properties and impulsive responses of the resulting user-defined wing models. This study reveals the variation of the wing’s vibration characteristics with respect to changes in its structural configuration. Integrating CAD modelling and FEA vibration analysis enables designers to improve wing architectures and implement design requirements in the preliminary design stage.
Keywords: aircraft wing, CAD modelling, FEA, vibration analysis
Procedia PDF Downloads 165
6671 A High Content Screening Platform for the Accurate Prediction of Nephrotoxicity
Authors: Sijing Xiong, Ran Su, Lit-Hsin Loo, Daniele Zink
Abstract:
The kidney is a major target for toxic effects of drugs, industrial and environmental chemicals and other compounds. Typically, nephrotoxicity is detected late during drug development, and regulatory animal models could not solve this problem. Validated or accepted in silico or in vitro methods for the prediction of nephrotoxicity are not available. We have established the first and currently only pre-validated in vitro models for the accurate prediction of nephrotoxicity in humans and the first predictive platforms based on renal cells derived from human pluripotent stem cells. In order to further improve the efficiency of our predictive models, we recently developed a high content screening (HCS) platform. This platform employed automated imaging in combination with automated quantitative phenotypic profiling and machine learning methods. 129 image-based phenotypic features were analyzed with respect to their predictive performance in combination with 44 compounds with different chemical structures that included drugs, environmental and industrial chemicals and herbal and fungal compounds. The nephrotoxicity of these compounds in humans is well characterized. A combination of chromatin and cytoskeletal features resulted in high predictivity with respect to nephrotoxicity in humans. Test balanced accuracies of 82% or 89% were obtained with human primary or immortalized renal proximal tubular cells, respectively. Furthermore, our results revealed that a DNA damage response is commonly induced by different PTC-toxicants with diverse chemical structures and injury mechanisms. Together, the results show that the automated HCS platform allows efficient and accurate nephrotoxicity prediction for compounds with diverse chemical structures.Keywords: high content screening, in vitro models, nephrotoxicity, toxicity prediction
Procedia PDF Downloads 313
6670 Using Mathematical Models to Predict the Academic Performance of Students from Initial Courses in Engineering School
Authors: Martín Pratto Burgos
Abstract:
The Engineering School of the University of the Republic in Uruguay has offered an Introductory Mathematical Course since the second semester of 2019. This course has been designed to help students prepare for the math courses that are essential for Engineering degrees, namely Math1, Math2, and Math3 in this research. The research proposes to build a model that can accurately predict students' activity and academic progress based on their performance in the three essential mathematical courses. Additionally, there is a need for a model that can forecast the effect of the Introductory Mathematical Course on the approval of the three essential courses during the first academic year. The techniques used are Principal Component Analysis and predictive modelling using the Generalised Linear Model. The dataset includes information from 5135 engineering students and 12 different characteristics based on activity and course performance. Two models are created, for a type of data that follows a binomial distribution, using the R programming language. Model 1 retains variables whose p-value is less than 0.05, and Model 2 uses the stepAIC function to remove variables and reach the lowest AIC score. After using Principal Component Analysis, the main components represented on the y-axis are the approval of the Introductory Mathematical Course, and on the x-axis the approval of the Math1 and Math2 courses as well as student activity three years after taking the Introductory Mathematical Course. Model 2, which considered students' activity, performed the best, with an AUC of 0.81 and an accuracy of 84%. According to Model 2, a student's engagement in school activities will continue for three years after the approval of the Introductory Mathematical Course. This is because they have successfully completed the Math1 and Math2 courses. Passing the Math3 course does not have any effect on the student's activity. Concerning academic progress, the best fit is Model 1, with an AUC of 0.56 and an accuracy rate of 91%. The model indicates that if students pass the three first-year courses, they will progress according to the timeline set by the curriculum. Both models show that the Introductory Mathematical Course does not directly affect students' activity and academic progress. The best model to explain the impact of the Introductory Mathematical Course on the three first-year courses was Model 1, with an AUC of 0.76 and 98% accuracy. The model shows that if students pass the Introductory Mathematical Course, it will help them pass the Math1 and Math2 courses without affecting their performance in the Math3 course. Combining the three predictive models: if students pass the Math1 and Math2 courses, they will stay active for three years after taking the Introductory Mathematical Course, and they will also continue following the recommended engineering curriculum. Additionally, the Introductory Mathematical Course helps students pass Math1 and Math2 when they start Engineering School. The models obtained in the research do not consider the time students took to pass the three Math courses, but they can successfully assess courses in the university curriculum.
Keywords: machine-learning, engineering, university, education, computational models
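The study's binomial GLMs were fitted in R; the sketch below reproduces the general workflow in Python on synthetic data: a logistic regression (binomial GLM) predicting a course-approval outcome from performance indicators, evaluated with AUC and accuracy. All variables and coefficients are placeholders, not the study's dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)
n = 5135
# Synthetic stand-ins for the 12 activity/performance characteristics
X = rng.normal(size=(n, 12))
# Synthetic binary outcome, e.g. "approved Math1 and Math2 in the first year"
logit = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
glm = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

proba = glm.predict_proba(X_te)[:, 1]
print("AUC:", round(roc_auc_score(y_te, proba), 3))
print("accuracy:", round(accuracy_score(y_te, glm.predict(X_te)), 3))
```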
Procedia PDF Downloads 94
6669 Reconstruction of Holographic Dark Energy in Chameleon Brans-Dicke Cosmology
Authors: Surajit Chattopadhyay
Abstract:
The accelerated expansion of the current universe is well established in the literature. Dark energy and modified gravity are two approaches to account for this accelerated expansion. In the present work, we consider scalar field models of dark energy, namely the tachyon and DBI-essence models, in the framework of chameleon Brans-Dicke cosmology. The equation of state parameter is reconstructed, and the subsequent cosmological implications are studied. We examined the stability of the obtained phantom-divide-crossing solutions under a quantum correction of massless conformally invariant fields, and we have seen that the quantum correction can be small when the phantom crossing occurs, so that these solutions can remain stable under the quantum correction. In the subsequent phase, we have established a correspondence between the NHDE model and the quintessence, DBI-essence and tachyon scalar field models in the framework of chameleon Brans–Dicke cosmology. We reconstruct the potentials and the dynamics for the three scalar field models we have considered. The reconstructed potentials are found to increase with the evolution of the universe, and at a very late stage they are observed to decay.
Keywords: dark energy, holographic principle, modified gravity, reconstruction
Procedia PDF Downloads 412