Search results for: CLIL models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6543

5763 Orthogonal Regression for Nonparametric Estimation of Errors-In-Variables Models

Authors: Anastasiia Yu. Timofeeva

Abstract:

Two new algorithms for nonparametric estimation of errors-in-variables models are proposed. The first algorithm is based on a penalized regression spline. The spline is represented as a piecewise-linear function, and an orthogonal regression is estimated for each linear portion; this algorithm is iterative. The second algorithm involves locally weighted regression estimation. When the independent variable is measured with error, such estimation is a complex nonlinear optimization problem. The simulation results show the advantage of the second algorithm under the assumption that the true values of the smoothing parameters are known. Nevertheless, using goodness-of-fit indexes to select the smoothing parameters gives similar results, albeit with an oversmoothing effect.
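For readers unfamiliar with orthogonal regression, the sketch below (not from the paper) fits a straight line to data with errors in both variables using SciPy's ODR package. The simulated data, error scales, and variable names are illustrative assumptions; the paper's piecewise-spline and locally weighted extensions are not reproduced here.

```python
# Minimal orthogonal (errors-in-variables) regression sketch using scipy.odr.
# Illustrative only: simulated data, not the algorithms proposed in the paper.
import numpy as np
from scipy import odr

rng = np.random.default_rng(0)
x_true = np.linspace(0, 10, 100)
y_true = 2.0 * x_true + 1.0
x_obs = x_true + rng.normal(scale=0.5, size=x_true.size)  # error in the regressor
y_obs = y_true + rng.normal(scale=0.5, size=y_true.size)  # error in the response

def linear(beta, x):
    # beta[0] * x + beta[1]
    return beta[0] * x + beta[1]

model = odr.Model(linear)
data = odr.RealData(x_obs, y_obs, sx=0.5, sy=0.5)  # supply the assumed error scales
fit = odr.ODR(data, model, beta0=[1.0, 0.0]).run()
print("slope, intercept:", fit.beta)
```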

Keywords: grade point average, orthogonal regression, penalized regression spline, locally weighted regression

Procedia PDF Downloads 395
5762 Methodology for Obtaining Static Alignment Model

Authors: Lely A. Luengas, Pedro R. Vizcaya, Giovanni Sánchez

Abstract:

In this paper, a methodology is presented to obtain a static alignment model for any transtibial amputee. The proposed methodology starts from experimental data collected at the Hospital Militar Central, Bogotá, Colombia. The effects of transtibial prosthesis malalignment on amputees were measured in terms of joint angles, center of pressure (COP), and weight distribution. Statistical tools were used to obtain the model parameters, and mathematical predictive models of prosthetic alignment were created. The proposed models were validated in amputees, with promising results for the static alignment of the prosthesis. The static alignment process is unique to each subject; nevertheless, the proposed methodology can be applied to any transtibial amputee.

Keywords: information theory, prediction model, prosthetic alignment, transtibial prosthesis

Procedia PDF Downloads 238
5761 Steel Bridge Coating Inspection Using Image Processing with Neural Network Approach

Authors: Ahmed Elbeheri, Tarek Zayed

Abstract:

Steel bridge deterioration has been one of the major infrastructure problems in North America in recent years, and it is mainly attributed to harsh weather conditions. Steel bridges suffer fatigue cracks and corrosion, which necessitate immediate inspection. Visual inspection is the most common technique for steel bridge inspection, but it depends on the inspector's experience, the inspection conditions, and the work environment. Consequently, many Non-destructive Evaluation (NDE) models have been developed that use non-destructive technologies to be more accurate, reliable, and less dependent on human judgment. Non-destructive techniques such as the eddy current method, the radiographic method (RT), the ultrasonic method (UT), infrared thermography, and laser technology have been used. In this work, digital image processing is used for corrosion detection as an alternative to visual inspection. Previous models have used grey-level and color digital images for processing; however, color images proved better because the color of the rust distinguishes it from different backgrounds. Detecting rust is important because it is the first warning of corrosion and a sign of coating erosion. To decide which steel element should be repainted and how urgently, the percentage of rust should be calculated. In this paper, an image processing approach is developed to detect corrosion and its severity. Two models were developed: the first to detect rust and the second to estimate the rust percentage.
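As an illustration of color-based rust detection, the following sketch (not the authors' models) thresholds reddish-brown hues in HSV space with OpenCV and reports a rust percentage. The file name and HSV bounds are assumptions that would need calibration against real inspection images.

```python
# Illustrative rust-detection sketch: HSV color thresholding, not the paper's models.
import cv2
import numpy as np

img = cv2.imread("bridge_member.jpg")          # hypothetical photo of a steel element
if img is None:
    raise FileNotFoundError("supply a photograph of the inspected steel element")
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Assumed reddish-brown hue range for rust; must be calibrated per camera/lighting.
lower = np.array([0, 60, 40])
upper = np.array([25, 255, 200])
mask = cv2.inRange(hsv, lower, upper)

# Remove isolated pixels before measuring the rusted area.
kernel = np.ones((5, 5), np.uint8)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

rust_percentage = 100.0 * np.count_nonzero(mask) / mask.size
print(f"Estimated rust coverage: {rust_percentage:.1f}%")
```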

Keywords: steel bridge, bridge inspection, steel corrosion, image processing

Procedia PDF Downloads 283
5760 Analysis of the 2023 Karnataka State Elections Using Online Sentiment

Authors: Pranav Gunhal

Abstract:

This paper presents an analysis of Twitter sentiment towards the Karnataka elections held in 2023, utilizing transformer-based models specifically designed for sentiment analysis in Indic languages. Through an innovative data collection approach involving a combination of novel data augmentation methods, online data preceding the election were analyzed. The study focuses on sentiment classification, effectively distinguishing between positive, negative, and neutral posts while specifically targeting sentiment regarding a loss for the Bharatiya Janata Party (BJP) or a win for the Indian National Congress (INC). Leveraging a high-performing transformer architecture, IndicBERT, coupled with carefully fine-tuned hyperparameters, the AI models employed in this study achieved remarkable accuracy in predicting the INC's victory in the election. The findings shed new light on the potential of cutting-edge transformer-based models in capturing and analyzing sentiment dynamics within the Indian political landscape. The implications of this research are far-reaching, providing invaluable insights to political parties for informed decision-making and strategic planning in preparation for the forthcoming 2024 Lok Sabha elections.
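A minimal sketch of a three-class sentiment classifier built on IndicBERT with Hugging Face Transformers is shown below. The checkpoint name, label set, and example text are assumptions, and the classification head is untrained; the study's data augmentation and fine-tuning procedure are not reproduced here.

```python
# Sketch of a three-class sentiment classifier on top of IndicBERT.
# The checkpoint name, labels, and input text are assumptions; fine-tuning on
# labeled election tweets (as in the study) would be required before use.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "ai4bharat/indic-bert"  # assumed public IndicBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=3)

labels = ["negative", "neutral", "positive"]
texts = ["placeholder tweet text about the Karnataka election"]

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits          # untrained head: outputs are not meaningful yet
preds = logits.argmax(dim=-1)
print([labels[i] for i in preds.tolist()])
```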

Keywords: sentiment analysis, twitter, Karnataka elections, congress, BJP, transformers, Indic languages, AI, novel architectures, IndicBERT, lok sabha elections

Procedia PDF Downloads 69
5759 Piping Fragility Composed of Different Materials by Using OpenSees Software

Authors: Woo Young Jung, Min Ho Kwon, Bu Seog Ju

Abstract:

A failure of a non-structural component can cause significant damage in critical facilities such as nuclear power plants and hospitals. Historically, damage from leaking sprinkler systems was reported to have shut down hospitals for several weeks after the 1971 San Fernando and 1994 Northridge earthquakes. In most cases, water leakage was observed at cross joints, sprinkler heads, and T-joint connections in piping systems during and after the seismic events. Hence, the primary objective of this study was to understand the seismic performance of T-joint connections and to develop an analytical finite element (FE) model of the T-joints of a 2-inch fire protection piping system in hospitals subjected to seismic ground motions. To evaluate the FE models of the piping systems in OpenSees, two material models were used: 1) the Steel02 material and 2) the Pinching4 material. The results of the current study reveal that the nonlinear moment-rotation FE models for the threaded T-joint reconcile well with the experimental results for both material models. However, the system-level fragility determined from multiple nonlinear time history analyses at the threaded T-joint differed slightly: the fragility determined with the Pinching4 material was more conservative than that obtained with the Steel02 material in the piping system.
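To illustrate how system-level fragility curves of the kind compared above are commonly derived from nonlinear time history results, the sketch below fits a lognormal fragility function to failure fractions at several intensity levels. The intensity measures and failure counts are invented placeholders, not results from the study.

```python
# Generic lognormal fragility fit from binary analysis outcomes (illustrative data).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

im = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8, 1.0])   # intensity measure, e.g., PGA in g
n_runs = 20                                            # analyses per intensity level
n_fail = np.array([0, 1, 3, 7, 13, 17, 19])            # assumed leak/failure counts
p_fail = n_fail / n_runs

def fragility(x, theta, beta):
    # Lognormal CDF: P(failure | IM = x), median theta, dispersion beta
    return norm.cdf(np.log(x / theta) / beta)

(theta, beta), _ = curve_fit(fragility, im, p_fail, p0=[0.5, 0.4])
print(f"median capacity theta = {theta:.3f}, dispersion beta = {beta:.3f}")
```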

Keywords: fragility, t-joint, piping, leakage, sprinkler

Procedia PDF Downloads 284
5758 Comparison of Two Neural Networks To Model Margarine Age And Predict Shelf-Life Using Matlab

Authors: Phakamani Xaba, Robert Huberts, Bilainu Oboirien

Abstract:

The present study aimed to develop and compare two neural-network-based predictive models to predict the shelf life/product age of South African margarine using free fatty acid (FFA), water droplet size (D3.3), water droplet distribution (e-sigma), moisture content, peroxide value (PV), anisidine value (AnV), and total oxidation (totox) value as input variables. Brick margarine products with ages ranging from fresh (week 0) to week 47 were sourced; the products had been stored at 10 and 25 °C and were characterized. JMP and MATLAB models to predict shelf life/margarine age were developed and their performances were compared. The key performance indicators used to evaluate the models were the correlation coefficient (CC), root mean square error (RMSE), and mean absolute percentage error (MAPE) relative to the actual data. The MATLAB-developed model showed better performance on all three indicators: its correlation coefficient was 99.86% versus 99.74% for the JMP model, its RMSE was 0.720 compared to 1.005, and its MAPE was 7.4% compared to 8.571%. The MATLAB model was therefore selected as the most accurate, and the number of hidden neurons/nodes was then optimized to develop a single predictive model. The optimized MATLAB model with 10 hidden neurons outperformed the models with 1 and 5 hidden neurons. The developed models can be used by margarine manufacturers, food research institutions, researchers, etc., to predict shelf life/margarine product age, optimize the addition of antioxidants, extend product shelf life, and proactively troubleshoot problems related to changes that affect margarine shelf life without conducting expensive trials.
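The study used MATLAB and JMP; purely as an analogous illustration, the sketch below trains a small feed-forward network in scikit-learn to map the seven inputs listed above to product age, using random placeholder data in place of the measured values.

```python
# Analogous feed-forward network for margarine age prediction (placeholder data only).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 7))             # 7 inputs: FFA, D3.3, e-sigma, moisture, PV, AnV, totox
age_weeks = rng.uniform(1, 47, size=200)  # placeholder target: product age in weeks

X_tr, X_te, y_tr, y_te = train_test_split(X, age_weeks, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("MAPE:", mean_absolute_percentage_error(y_te, model.predict(X_te)))
```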

Keywords: margarine shelf-life, predictive modelling, neural networks, oil oxidation

Procedia PDF Downloads 180
5757 Flexural Behavior of Composite Hybrid Beam Models Combining Steel Inverted T-Section and RC Flange

Authors: Abdul Qader Melhem, Hacene Badache

Abstract:

This paper deals with the theoretical and experimental study of shear connection via simple steel reinforcement shear connectors, namely steel reinforcing bars bent into L-shapes, instead of the commonly used headed studs. The suggested L-shape connectors are made from readily available steel reinforcement. The composite section therefore consists of a steel inverted T-section embedded within a lightly reinforced concrete flange at the top slab, acting as a unit. It should be noted that the cross section of these composite models involves a steel inverted T-beam replacing the steel top flange of a standard I-beam section. The paper concentrates on the elastic and elastic-plastic behavior of these composite models. Failure modes, whether by cracking of the concrete or failure of the shear connection, are investigated in detail. Elastic and elastoplastic formulas of the composite model have been computed for different locations of the neutral axis. A deflection formula has been derived; its value was close to the test value. A supporting design curve is provided, which is valuable for both design engineers and researchers. Finally, suggested design curves and equations are presented, and a check is made between theoretical and experimental outcomes.

Keywords: composite, elastic-plastic, failure, inverted T-section, L-Shape connectors

Procedia PDF Downloads 206
5756 Analysis of Expert Information in Linguistic Terms

Authors: O. Poleshchuk, E. Komarov

Abstract:

In this paper, semantic spaces with the properties of completeness and orthogonality (complete orthogonal semantic spaces) were chosen as models of expert evaluations. As theoretical and practical studies have shown, all the properties of complete orthogonal semantic spaces correspond to the thinking activity of experts, which is why these semantic spaces were chosen for modeling. Two methods for constructing such spaces are proposed. Models of comparative and fuzzy cluster analysis of expert evaluations were developed. The practical application of the developed methods has demonstrated their viability and validity.

Keywords: expert evaluation, comparative analysis, fuzzy cluster analysis, theoretical and practical studies

Procedia PDF Downloads 513
5755 Proposal of Design Method in the Semi-Acausal System Model

Authors: Shigeyuki Haruyama, Ken Kaminishi, Junji Kaneko, Tadayuki Kyoutani, Siti Ruhana Omar, Oke Oktavianty

Abstract:

This study proposes a method for defining value and function in the manufacturing sector. In the current discussion of modeling methods, the definition of 1D-CAE has remained ambiguous and non-conceptual. Across all physics fields, such methods are defined through the formulation of differential algebraic equations, applying only time derivatives and simulation. We propose a semi-acausal modeling concept and a differential algebraic equation method as a new modeling approach, whose efficiency has been verified by comparing the numerical analysis results of the semi-acausal modeling calculation with FEM theory calculations.

Keywords: system model, physical models, empirical models, conservation law, differential algebraic equation, object-oriented

Procedia PDF Downloads 465
5754 A Neural Network Approach to Understanding Turbulent Jet Formations

Authors: Nurul Bin Ibrahim

Abstract:

Advancements in neural networks have offered valuable insights into fluid dynamics, notably in addressing turbulence-related challenges. In this research, we introduce multiple applications of neural network models, namely feed-forward and recurrent neural networks, to explore the relationship between jet formations and stratified turbulence within stochastically excited Boussinesq systems. Using machine learning tools such as TensorFlow and PyTorch, the study has created models that effectively mimic and reveal the underlying features of the complex patterns of jet formation and stratified turbulence. These models do more than help us understand these patterns; they also offer a faster way to solve problems in stochastic systems, improving upon traditional numerical techniques for solving stochastic differential equations, such as the Euler-Maruyama method. In addition, the research includes a thorough comparison with the Statistical State Dynamics (SSD) approach, a well-established method for studying chaotic systems. This comparison helps evaluate how well neural networks can capture the complex relationship between jet formations and stratified turbulence. The results of this study underscore the potential of neural networks in computational physics and fluid dynamics, opening up new possibilities for more efficient and accurate simulations in these fields.
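For context on the baseline numerical scheme mentioned above, the sketch below applies the Euler-Maruyama method to a simple Ornstein-Uhlenbeck stochastic differential equation. The equation and parameters are illustrative only; the stochastically excited Boussinesq system studied in the paper is far more involved.

```python
# Euler-Maruyama integration of a simple Ornstein-Uhlenbeck SDE (illustrative).
import numpy as np

theta, mu, sigma = 1.0, 0.0, 0.3   # assumed OU parameters
dt, n_steps = 1e-3, 10_000
rng = np.random.default_rng(42)

x = np.empty(n_steps + 1)
x[0] = 1.0
for k in range(n_steps):
    dW = rng.normal(scale=np.sqrt(dt))                       # Brownian increment
    x[k + 1] = x[k] + theta * (mu - x[k]) * dt + sigma * dW  # drift + diffusion step

print("final value:", x[-1])
```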

Keywords: neural networks, machine learning, computational fluid dynamics, stochastic systems, simulation, stratified turbulence

Procedia PDF Downloads 53
5753 Housing Price Dynamics: Comparative Study of 1980-1999 and the New Millenium

Authors: Janne Engblom, Elias Oikarinen

Abstract:

The understanding of housing price dynamics is important to a great number of agents: portfolio investors, banks, real estate brokers, and construction companies, as well as policy makers and households. A panel dataset follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, forming a wide range of linear models; a special case of panel data models is dynamic in nature. A complication of a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates, and several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Common Correlated Effects (CCE) estimator for dynamic panel data, which also accounts for the cross-sectional dependence caused by common structures of the economy; in the presence of cross-sectional dependence, standard OLS gives biased estimates. U.S. housing price dynamics were examined empirically using the dynamic CCE estimator, with the first difference of housing price as the dependent variable and the first differences of per capita income, interest rate, housing stock, and lagged price, together with the deviation of housing prices from their long-run equilibrium level, as independent variables. These deviations were also estimated from the data. The aim of the analysis was to provide estimates and compare them between 1980-1999 and 2000-2012. Based on data for 50 U.S. cities over 1980-2012, differences in the short-run housing price dynamics estimates were mostly significant when the two time periods were compared. Significance tests of the differences were provided by a model containing interaction terms between the independent variables and a time dummy variable. Residual analysis showed very low cross-sectional correlation of the model residuals compared with the standard OLS approach, indicating a good fit of the CCE estimator model. The estimates of the dynamic panel data model were in line with the theory of housing price dynamics. The results also suggest that the dynamics of the housing market are evolving over time.

Keywords: dynamic model, panel data, cross-sectional dependence, interaction model

Procedia PDF Downloads 236
5752 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing

Authors: Yehjune Heo

Abstract:

As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss function and optimizer. The CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including cross-entropy, center loss, cosine proximity, and hinge loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach and compare generalization power. It is important to note that the same subset of LivDet is used across all training and testing for each model, so that the generalization performance on unseen data can be compared across all models. The best CNN (AlexNet), with the appropriate loss function and optimizer, yields a performance gain of more than 3% over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' accuracy together with their parameter counts and mean average error rates, in order to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
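The skeleton below shows one way such a loss-function and optimizer comparison can be organized in PyTorch. The tiny network, random tensors, and the two loss/optimizer pairs are placeholders for illustration, not the paper's architectures, configurations, or LivDet data.

```python
# Skeleton for comparing loss functions and optimizers on a small CNN classifier.
# Placeholder model and random data; not the paper's AlexNet/VGGNet/ResNet setup.
import torch
import torch.nn as nn

def make_model():
    return nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        nn.Flatten(), nn.Linear(8 * 4 * 4, 2))        # 2 classes: live vs. spoof

x = torch.randn(32, 1, 64, 64)                         # placeholder fingerprint patches
y = torch.randint(0, 2, (32,))

configs = {
    "cross_entropy + Adam": (nn.CrossEntropyLoss(), torch.optim.Adam),
    "multi-margin (hinge-style) + SGD": (nn.MultiMarginLoss(), torch.optim.SGD),
}

for name, (criterion, opt_cls) in configs.items():
    model = make_model()
    optimizer = opt_cls(model.parameters(), lr=1e-3)
    for _ in range(5):                                 # a few toy iterations
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"{name}: final loss {loss.item():.4f}")
```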

Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer

Procedia PDF Downloads 116
5751 Understanding the Role of Social Entrepreneurship in Building Mobility of a Service Transportation Models

Authors: Liam Fassam, Pouria Liravi, Jacquie Bridgman

Abstract:

Introduction: The way we travel is rapidly changing; car ownership and use are declining among young people and residents of urban areas. The increasing role and popularity of sharing-economy companies like Uber also highlight a movement towards consuming transportation solutions as a service [Mobility as a Service]. This research looks to bridge the knowledge gap that exists between city mobility, smart cities, the sharing economy, and social entrepreneurship business models. Understanding this subject is crucial for smart city design, as access to affordable transport has been identified as a contributing factor to social isolation, leading to issues around health and wellbeing. Methodology: To explore the current fit vis-a-vis transportation business models and social impact, this research undertook a comparative analysis between a systematic literature review and a Delphi study. The systematic literature review was undertaken to gain an appreciation of current academic thinking on 'social entrepreneurship and smart city mobility'. The second phase of the research initiated a Delphi study across a group of 22 participants to review future opinion on 'how can social entrepreneurship assist city mobility sharing models?'. The Delphi delivered an initial 220 results, which, once cross-checked for duplication, resulted in 130. These 130 answers were sent back to participants to be scored for importance against a 5-point Likert scale, enabling a top-10 list of areas for shared user transport in society to be gleaned. One further round (round 4) identified no change in the coefficient of variation, thus no further rounds were required. Findings: Initial results of the literature review returned 1,021 journals using the search criteria 'social entrepreneurship and smart city mobility'. Filtering by peer review, date, region, and Chartered Association of Business Schools ranking produced a resultant journal list of 75. Of these, 58 focused on smart city design, 9 on social enterprise in cityscapes, 6 on smart city network design, and 3 on social impact, with no journals arguing the need for social entrepreneurship to be allied to city mobility. The future inclusion factors from the Delphi expert panel indicated that smart cities need to include shared-economy models in their strategies. Furthermore, social isolation borne of infrastructure costs needs addressing through holistic, apolitical social enterprise models, and a better understanding of social benefit measurement is needed. Conclusion: In investigating the collaboration between key public transportation stakeholders, a theoretical model of social enterprise transportation was formed that positively addresses the smart city needs of reduced transport poverty and social isolation. As such, the research has identified how a revised Mobility as a Service business model allied to social entrepreneurship can deliver measurable social benefits associated with smart city design, extending existing research.

Keywords: social enterprise, collaborative transportation, new models of ownership, transport social impact

Procedia PDF Downloads 126
5750 Modeling and Benchmarking the Thermal Energy Performance of Palm Oil Production Plant

Authors: Mathias B. Michael, Esther T. Akinlabi, Tien-Chien Jen

Abstract:

Thermal energy consumption in a palm oil production plant comprises mainly steam, hot water, and hot air. In most efficient plants, hot water and hot air are generated from the steam supply system. Research has shown that the thermal energy used in palm oil production plants accounts for about 70 percent of the plant's total energy consumption. In order to manage a plant's energy efficiently, its energy systems must be modelled and optimized. This paper presents a model of the steam supply system of a typical palm oil production plant in Ghana. The models include exergy and energy models of the steam boiler, the steam turbine, and the palm oil mill. The paper further simulates the virtual plant model to obtain the thermal energy performance of the plant under study. The simulation results show that, under normal operating conditions, the boiler energy performance is considerably below the expected level as a result of several factors, including intermittent biomass fuel supply, the significant moisture content of the biomass fuel, and significant heat losses. The total thermal energy performance of the virtual plant is set as a baseline. The study finally recommends a number of energy efficiency measures to improve the plant's energy performance.

Keywords: palm biomass, steam supply, exergy and energy models, energy performance benchmark

Procedia PDF Downloads 334
5749 Practical Modelling of RC Structural Walls under Monotonic and Cyclic Loading

Authors: Reza E. Sedgh, Rajesh P. Dhakal

Abstract:

Shear walls have been used extensively as the main lateral force resisting system in multi-storey buildings. Recent developments in performance-based design urge practicing engineers to conduct nonlinear static or dynamic analyses to evaluate the seismic performance of multi-storey shear wall buildings by employing the distinct analytical models suggested in the literature. For practical purposes, macroscopic models for simulating the global and local nonlinear behavior of structural walls are preferred over microscopic models. The required skill level, the computational time, and limited access to specialized RC finite element packages prevent the general application of the latter in the performance-based design or assessment of multi-storey shear wall buildings in design offices. Hence, this paper sets out to verify the capability of the nonlinear shell element in a commercially available package (SAP2000) to simulate the response of selected specimens under monotonic and cyclic loads, using the rather simplified cyclic material laws available in the analytical tool. The selection of constitutive models, the determination of the related parameters of the constituent materials, and an appropriate nonlinear shear model are presented in detail. Adoption of the proposed simple model demonstrated that the predicted results follow the overall trend of the experimental force-displacement curves. Although the predicted ultimate strength and the overall shape of the hysteresis loops agreed reasonably well with the experiments, predicting the ultimate displacement (the point of significant strength degradation) remains challenging in some cases.

Keywords: analytical model, nonlinear shell element, structural wall, shear behavior

Procedia PDF Downloads 383
5748 Seafloor and Sea Surface Modelling in the East Coast Region of North America

Authors: Magdalena Idzikowska, Katarzyna Pająk, Kamil Kowalczyk

Abstract:

Seafloor topography is a fundamental issue in geological, geophysical, and oceanographic studies. Single-beam or multibeam sonars attached to the hulls of ships emit a hydroacoustic signal from transducers and reproduce the topography of the seabed. This solution provides good accuracy and spatial resolution. Bathymetric data from ship surveys are provided by the National Centers for Environmental Information of the National Oceanic and Atmospheric Administration. Unfortunately, most of the seabed is still unmapped, as there are many gaps to be explored between ship survey tracks; moreover, such measurements are very expensive and time-consuming. A solution is the raster bathymetric models shared by the General Bathymetric Chart of the Oceans, whose products are compilations of different data sets, both raw and processed. Measurements of gravity anomalies also serve as indirect data for the development of bathymetric models. Some forms of seafloor relief (e.g., seamounts) increase the force of the Earth's pull, leading to changes in the sea surface. Based on satellite altimetry data, sea surface height and marine gravity anomalies can be estimated, and from these anomalies it is possible to infer the structure of the seabed. The main goal of this work is to create regional bathymetric models and models of the sea surface in the area of the east coast of North America, a region of seamounts and undulating seafloor. The research includes an analysis of the methods and techniques used, an evaluation of the interpolation algorithms, grid densification, and the creation of grid models. The data used are raster bathymetric models in NetCDF format, survey data from multibeam soundings in MB-System format, and satellite altimetry data from the Copernicus Marine Environment Monitoring Service. The methodology includes data extraction, processing, mapping, and spatial analysis, and the results were visualized with Geographic Information System tools. The result is an extension of the state of knowledge on the quality and usefulness of the data used for seabed and sea surface modeling, and on the accuracy of the generated models. Sea level is averaged over time and space (excluding waves, tides, etc.); its changes, along with knowledge of the topography of the ocean floor, inform us indirectly about the volume of the entire ocean. The true shape of the ocean surface is further varied by phenomena such as tides, differences in atmospheric pressure, wind systems, the thermal expansion of water, and phases of ocean circulation. Depending on the location of a point, the greater the depth, the smaller the trend of sea level change. Studies show that combining data sets from different sources, with different accuracies, can affect the quality of sea surface and seafloor topography models.
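As a simplified stand-in for the interpolation step in regional bathymetric modelling, the sketch below grids scattered depth soundings onto a regular raster with SciPy. The coordinates and depths are synthetic, not survey data, and the chosen interpolation method is only one of several candidates the study evaluates.

```python
# Gridding scattered depth soundings onto a regular raster (synthetic data).
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(7)
lon = rng.uniform(-70.0, -60.0, 500)                  # scattered "survey" points
lat = rng.uniform(35.0, 45.0, 500)
depth = -3000 + 1500 * np.exp(-((lon + 65) ** 2 + (lat - 40) ** 2))  # fake seamount

grid_lon, grid_lat = np.meshgrid(np.linspace(-70, -60, 200),
                                 np.linspace(35, 45, 200))
grid_depth = griddata((lon, lat), depth, (grid_lon, grid_lat), method="linear")

print("gridded cells with data:", np.count_nonzero(~np.isnan(grid_depth)))
```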

Keywords: seafloor, sea surface height, bathymetry, satellite altimetry

Procedia PDF Downloads 60
5747 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course

Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu

Abstract:

This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. A dataset of 140 rows pertaining to the performance of two batches of students was used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula (Gaussian) were used to generate synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data were used to train machine learning models using the PyCaret package. For the CTGAN data, the Ada Boost Classifier (ADA) was found to be the best-fitting ML model, whereas the CTGAN with Gaussian Copula yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy with the real data. ROC-AUC analysis was performed for all ten classes of the target variable (grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, whereas the LR model with the Gaussian data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each grade value separately. The LR model with Gaussian data showed consistently better AUC scores than the ADA model with CTGAN data, except for two grade values, C- and A-.
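The per-grade ROC-AUC analysis described above can be reproduced in outline with scikit-learn's one-vs-rest AUC computation, as sketched below. The classifier and the generated data are placeholders, not the study's student records or PyCaret models.

```python
# Per-class (one-vs-rest) ROC-AUC computation, mirroring the grade-wise analysis.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import label_binarize

grades = ["A", "A-", "B+", "B", "B-", "C+", "C", "C-", "D", "F"]
X, y = make_classification(n_samples=1000, n_features=8, n_informative=6,
                           n_classes=10, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)

y_bin = label_binarize(y_te, classes=range(10))
for k, grade in enumerate(grades):
    auc = roc_auc_score(y_bin[:, k], proba[:, k])     # one grade vs. the rest
    print(f"grade {grade}: AUC = {auc:.3f}")
print("mean AUC:", roc_auc_score(y_te, proba, multi_class="ovr"))
```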

Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, gaussian copula CTGAN

Procedia PDF Downloads 28
5746 Vibrations of Springboards: Mode Shape and Time Domain Analysis

Authors: Stefano Frassinelli, Alessandro Niccolai, Riccardo E. Zich

Abstract:

Diving is an important Olympic sport in which the effective performance of the athlete is related to the ability to interact correctly with the springboard: the elevation of the jump and the correctness of the dive are influenced by the vibrations of the board. In this paper, the vibrations of the springboard are analyzed by means of typical tools for vibration analysis. First, a modal analysis is carried out on two different models of the springboard; then these two models, together with a third, are analyzed in the time domain by integrating the equations of motion of deformable bodies. All these analyses are compared with experimental data measured on a real springboard by means of a 6-axis accelerometer; these measurements are aimed at assessing the proposed models. The acquired data are analyzed both in the frequency domain and in the time domain.
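A basic frequency-domain step of the kind described above is sketched here: estimating the dominant vibration frequencies of an accelerometer trace via an FFT. The signal is synthetic (two decaying modes plus noise) and the sampling rate is an assumption; real data would come from the 6-axis accelerometer mentioned in the abstract.

```python
# Estimating dominant vibration frequencies from an accelerometer trace (synthetic).
import numpy as np

fs = 1000.0                                    # assumed sampling rate, Hz
t = np.arange(0, 5.0, 1 / fs)
signal = (np.exp(-0.5 * t) * np.sin(2 * np.pi * 4.2 * t)            # fake 1st mode
          + 0.3 * np.exp(-1.0 * t) * np.sin(2 * np.pi * 13.0 * t)   # fake 2nd mode
          + 0.05 * np.random.default_rng(3).normal(size=t.size))    # measurement noise

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

# Report the two strongest spectral peaks above 1 Hz.
mask = freqs > 1.0
top = np.argsort(spectrum[mask])[-2:]
print("dominant frequencies [Hz]:", np.sort(freqs[mask][top]))
```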

Keywords: springboard analysis, modal analysis, time domain analysis, vibrations

Procedia PDF Downloads 439
5745 Predicting the Effect of Silicon Electrode Design Parameters on Thermal Performance of a Lithium-Ion Battery

Authors: Harika Dasari, Eric Eisenbraun

Abstract:

The present study models the role of electrode structural characteristics in the thermal behavior of lithium-ion batteries. Preliminary modeling runs have employed a 1D lithium-ion battery model coupled to a two-dimensional axisymmetric model, using silicon as the battery anode material. The two models are coupled through the heat generated and the average temperature. Our study focuses on the silicon anode particle size; it is observed that silicon anodes with nano-sized particles reduce the temperature of the battery in comparison with anodes having larger particles. These results are discussed in the context of the relationship between particle size and the thermal transport properties of the electrode.

Keywords: particle size, NMC, silicon, heat generation, separator

Procedia PDF Downloads 266
5744 Stability Analysis of Two-delay Differential Equation for Parkinson's Disease Models with Positive Feedback

Authors: M. A. Sohaly, M. A. Elfouly

Abstract:

Parkinson's disease (PD) is a heterogeneous movement disorder that often appears in the elderly. PD is induced by a loss of dopamine secretion, and some drugs increase the secretion of dopamine. In this paper, we study the stability of PD models formulated as nonlinear delay differential equations. After a period of drug use, the drugs act as positive feedback and increase the tremors of patients; the differential equation then has positive coefficients, and the system is unstable under these conditions. We present a set of suggested modifications to make the system more compatible with the biodynamic system. While a set of numerical examples is given, this research paper is concerned with the mathematical analysis, and no clinical data have been used.
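To make the instability under positive feedback concrete, the sketch below integrates a toy linear two-delay equation with a fixed-step Euler scheme and a constant history. The equation and coefficients are illustrative assumptions, not the paper's PD model.

```python
# Toy two-delay linear DDE, x'(t) = a*x(t - tau1) + b*x(t - tau2), integrated with
# fixed-step Euler and a constant history. Illustrative coefficients only.
import numpy as np

a, b = 0.4, 0.3            # positive coefficients -> positive feedback (unstable)
tau1, tau2 = 1.0, 2.0      # the two delays
h = 0.01                   # step size chosen so the delays are integer multiples
n_delay1, n_delay2 = int(tau1 / h), int(tau2 / h)
T = 30.0
n_steps = int(T / h)

x = np.empty(n_steps + 1)
x[0] = 1.0                 # constant history x(t) = 1 for t <= 0

def delayed(x, k, n_delay):
    # Return x(t_k - tau); fall back to the constant history for early times.
    return x[k - n_delay] if k - n_delay >= 0 else 1.0

for k in range(n_steps):
    dx = a * delayed(x, k, n_delay1) + b * delayed(x, k, n_delay2)
    x[k + 1] = x[k] + h * dx

print("x(T) =", x[-1], "(grows without bound when both coefficients are positive)")
```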

Keywords: Parkinson's disease, stability, simulation, two delay differential equation

Procedia PDF Downloads 109
5743 Selection of Variogram Model for Environmental Variables

Authors: Sheikh Samsuzzhan Alam

Abstract:

The present study investigates the selection of the variogram model for analyzing the spatial variation of environmental variables with a trend. Sometimes the auto-fitted theoretical variogram does not really capture the true nature of the empirical semivariogram, so proper exploration and analysis are needed to select the best variogram model. For this study, an open-source dataset from the California Soil Resource Lab is used to illustrate the problems that arise when fitting a theoretical variogram. The five most commonly used variogram models (linear, Gaussian, exponential, Matern, and spherical) were fitted to the experimental semivariogram. Ordinary kriging was used to evaluate the accuracy of the selected variograms through cross-validation. This study is useful for selecting an appropriate theoretical variogram model for environmental variables.
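One simple way to compare candidate theoretical variograms before kriging is a least-squares fit to the empirical semivariogram, as sketched below for the spherical and exponential models. The lag and semivariance values are invented placeholders, not the California Soil Resource Lab data.

```python
# Fitting candidate variogram models to an empirical semivariogram (placeholder values).
import numpy as np
from scipy.optimize import curve_fit

lags = np.array([5, 10, 20, 40, 60, 80, 100, 120], dtype=float)
gamma_emp = np.array([0.21, 0.35, 0.55, 0.78, 0.88, 0.93, 0.95, 0.96])  # assumed

def spherical(h, nugget, sill, a):
    # Spherical model: rises to nugget + sill at range a, then stays flat.
    return np.where(h < a,
                    nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3),
                    nugget + sill)

def exponential(h, nugget, sill, a):
    # Exponential model approaching the sill asymptotically (practical range a).
    return nugget + sill * (1 - np.exp(-3 * h / a))

results = {}
for name, model in [("spherical", spherical), ("exponential", exponential)]:
    params, _ = curve_fit(model, lags, gamma_emp, p0=[0.1, 1.0, 60.0], maxfev=10000)
    rss = np.sum((model(lags, *params) - gamma_emp) ** 2)
    results[name] = rss
    print(f"{name}: RSS = {rss:.4f}, (nugget, sill, range) = {np.round(params, 3)}")
print("preferred model:", min(results, key=results.get))
```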

Keywords: anisotropy, cross-validation, environmental variables, kriging, variogram models

Procedia PDF Downloads 313
5742 Chemometric Analysis of Raw Milk Quality Originating from Conventional and Organic Dairy Farming in AP Vojvodina, Serbia

Authors: Sanja Podunavac-Kuzmanović, Denis Kučević, Strahinja Kovačević, Milica Karadžić, Lidija Jevrić

Abstract:

The present study describes the application of chemometric methods in the analysis of milk samples collected from a conventional dairy farm and an organic dairy farm in AP Vojvodina, Republic of Serbia. The chemometric analysis included univariate regression modeling and the Analysis of Variance (ANOVA) method. ANOVA was used to determine the differences in fatty acid content between the milk samples from the conventional and organic farms. The results of the ANOVA testing indicate that there is a highly statistically significant difference between the contents of saturated and unsaturated fatty acids in milk from the two types of dairy farming. In addition, linear univariate models were obtained by modeling the linear relationships between milk fat content and saturated fatty acid content, and between milk fat content and unsaturated fatty acid content. The models based on the milk samples originating from organic farming are statistically better than the models based on the milk samples from conventional farming.
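The two analysis steps named above, one-way ANOVA and univariate linear modeling, can be sketched with SciPy as follows. The fatty-acid and milk-fat numbers are invented placeholders, not the study's measurements.

```python
# One-way ANOVA and a univariate linear fit on placeholder fatty-acid data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
sfa_conventional = rng.normal(70.0, 2.0, 30)   # saturated FA share, conventional farm
sfa_organic = rng.normal(66.0, 2.0, 30)        # saturated FA share, organic farm

f_stat, p_value = stats.f_oneway(sfa_conventional, sfa_organic)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Univariate linear model: saturated FA content vs. milk fat content (placeholder data).
milk_fat = rng.normal(4.0, 0.3, 30)
sfa = 50.0 + 5.0 * milk_fat + rng.normal(0, 1.0, 30)
slope, intercept, r, p, se = stats.linregress(milk_fat, sfa)
print(f"linear model: SFA = {intercept:.2f} + {slope:.2f} * fat, r^2 = {r**2:.3f}")
```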

Keywords: chemometrics, milk, organic farming, quality control

Procedia PDF Downloads 221
5741 Estimating Bridge Deterioration for Small Data Sets Using Regression and Markov Models

Authors: Yina F. Muñoz, Alexander Paz, Hanns De La Fuente-Mella, Joaquin V. Fariña, Guilherme M. Sales

Abstract:

The primary approaches for estimating bridge deterioration use Markov-chain models and regression analysis. Traditional Markov models have problems estimating the required transition probabilities when a small sample size is used. Often, reliable bridge data have not been collected over long periods, so large data sets may not be available. This study presents an important change to the traditional approach by using the Small Data Method to estimate transition probabilities. The results illustrate that the Small Data Method and the traditional approach provide similar estimates; however, the former provides results that are more conservative, that is, slightly lower than expected bridge condition ratings compared with the traditional approach. Considering that bridges are critical infrastructure, the Small Data Method, which uses more information and provides more conservative estimates, may be more appropriate when the available sample size is small. In addition, regression analysis was used to calculate bridge deterioration: condition ratings were determined for bridge groups, and the best regression model was selected for each group. The results obtained were very similar to those obtained using Markov chains; however, it is desirable to use more data for better results.
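For readers new to the Markov-chain mechanism the study builds on, the sketch below propagates bridge condition ratings through an assumed transition matrix over several decades. The four-state matrix and horizon are illustrative, not probabilities estimated from bridge data.

```python
# Propagating bridge condition ratings through an assumed Markov transition matrix.
import numpy as np

# States: condition ratings 4 (good) .. 1 (poor); each row sums to 1.
P = np.array([
    [0.90, 0.10, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],   # worst state is absorbing
])

state = np.array([1.0, 0.0, 0.0, 0.0])   # all bridges start in the best condition
ratings = np.array([4, 3, 2, 1])

for year in range(0, 31, 10):
    expected_rating = state @ ratings
    print(f"year {year:2d}: expected condition rating = {expected_rating:.2f}")
    state = state @ np.linalg.matrix_power(P, 10)   # advance ten years
```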

Keywords: concrete bridges, deterioration, Markov chains, probability matrix

Procedia PDF Downloads 325
5740 Faster, Lighter, More Accurate: A Deep Learning Ensemble for Content Moderation

Authors: Arian Hosseini, Mahmudul Hasan

Abstract:

To address the increasing need for efficient and accurate content moderation, we propose an efficient and lightweight deep classification ensemble structure. Our approach is based on a combination of simple visual features, designed for high-accuracy classification of violent content with low false positives. Our ensemble architecture utilizes a set of lightweight models with narrowed-down color features, and we apply it to both images and videos. We evaluated our approach using a large dataset of explosion and blast contents and compared its performance to popular deep learning models such as ResNet-50. Our evaluation results demonstrate significant improvements in prediction accuracy, while benefiting from 7.64x faster inference and lower computation cost. While our approach is tailored to explosion detection, it can be applied to other similar content moderation and violence detection use cases as well. Based on our experiments, we propose a "think small, think many" philosophy in classification scenarios. We argue that transforming a single, large, monolithic deep model into a verification-based step model ensemble of multiple small, simple, and lightweight models with narrowed-down visual features can possibly lead to predictions with higher accuracy.

Keywords: deep classification, content moderation, ensemble learning, explosion detection, video processing

Procedia PDF Downloads 28
5739 Future Design and Innovative Economic Models for Futuristic Markets in Developing Countries

Authors: Nessreen Y. Ibrahim

Abstract:

Designing the future according to a realistic analytical study of futuristic market needs can be a milestone strategy for making a substantial improvement in developing countries' economies. In developing countries, access to high technology and the latest scientific approaches is very limited. The financial problems of low- and medium-income countries have negative effects on the kind and quality of new technologies imported and applied in their markets. Thus, there is a strong need for a paradigm shift in design-process thinking to improve and evolve their development strategies. This paper discusses future possibilities in developing countries and how they can design their own future according to specific Future Design Models (FDM), established to solve particular economic problems as well as political and cultural conflicts. FDM is a strategic thinking framework that provides improvement in both content and process. The content includes beliefs, values, mission, purpose, conceptual frameworks, research, and practice, while the process includes design methodology, design systems, and design management tools. The main objective of this paper was to build an innovative economic model to design a chosen possible futuristic scenario by understanding future market needs, analyzing the real-world setting, solving the model questions through future-driven design, and finally interpreting the results to discuss to what extent they can be transferred to the real world. The paper discusses Egypt as a potential case study: since Egypt has highly complex economic problems, highly dynamic political factors, and very rich cultural aspects, it is a very challenging example for applying FDM. The results recommend using FDM numerical modeling as a starting point for designing the future.

Keywords: developing countries, economic models, future design, possible futures

Procedia PDF Downloads 252
5738 Forecasting Solid Waste Generation in Turkey

Authors: Yeliz Ekinci, Melis Koyuncu

Abstract:

Successful planning of solid waste management systems requires an accurate prediction of the amount of solid waste generated in an area. Waste management planning can protect the environment and human health; hence it is tremendously important for countries. A lack of information on waste generation can cause many environmental and health problems. Turkey is a country that plans to join the European Union, and solid waste management is one of the most significant criteria that must be handled in order to become part of this community. A solid waste management system requires a good forecast of solid waste generation; thus, this study aims to forecast solid waste generation in Turkey. Artificial Neural Network and Linear Regression models will be used for this aim. Many models will be run, and the best one will be selected based on predetermined performance measures.

Keywords: forecast, solid waste generation, solid waste management, Turkey

Procedia PDF Downloads 491
5737 CAD Tool for Parametric Design modification of Yacht Hull Surface Models

Authors: Shahroz Khan, Erkan Gunpinar, Kemal Mart

Abstract:

Recently, parametric design techniques have become a vital concept in the field of Computer Aided Design (CAD), providing the designer with a sophisticated platform to automate the design process efficiently. In these techniques, the design process starts by parameterizing the important features of the design model (typically the key dimensions) and implementing design constraints. The design constraints help to retain the overall shape of the model while its parameters are modified. However, initializing an appropriate number of design parameters and constraints is the crucial part of parametric design techniques, especially for complex surface models such as yacht hulls. This paper introduces a method to create complex surface models suited to parametric design techniques, a method to define the right number of parameters and the respective design constraints, and a system to implement the design parameters subject to the design constraint schema. In our proposed approach, the design process starts by dividing the yacht hull into three sections. Each section consists of different shape lines, which together form the overall shape of the yacht hull. The shape lines are created using cubic Bezier curves, which allow greater design flexibility. Design parameters and constraints are defined on the shape lines in 3D design space to facilitate better and individual handling of the parameters by the designer. Afterwards, shape modifiers are developed that allow the modification of each parameter while satisfying the respective set of criteria and design constraints: geometric continuities should be maintained between the shape lines of the three sections, the fairness of the hull surfaces should be preserved after modification, and the effect of a single parameter on the other parameters should be negligible during design modification. The constraints are defined individually on the shape lines of each section and mutually between the shape lines of two connecting sections. In order to validate and visualize the design results of the shape modifiers, a real-time graphical interface was created.
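The basic building block described above, a cubic Bezier shape line defined by four control points, can be evaluated as in the sketch below. The control points are arbitrary placeholders for one hull shape line, not the tool's actual parameterization or constraint schema.

```python
# Evaluating a cubic Bezier shape line from four control points (placeholder values).
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=50):
    """Return n points on the cubic Bezier curve defined by control points p0..p3."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Hypothetical control points (x, y, z) of a single hull shape line.
p0 = np.array([0.0, 0.0, 0.0])
p1 = np.array([1.0, 0.8, 0.1])
p2 = np.array([3.0, 1.0, 0.3])
p3 = np.array([5.0, 1.2, 0.0])

curve = cubic_bezier(p0, p1, p2, p3)
print("first point:", curve[0], "last point:", curve[-1])
```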

Keywords: design parameter, design constraints, shape modifiers, yacht hull

Procedia PDF Downloads 286
5736 Modelling of Damage as Hinges in Segmented Tunnels

Authors: Gelacio JuáRez-Luna, Daniel Enrique GonzáLez-RamíRez, Enrique Tenorio-Montero

Abstract:

Frame elements coupled with spring elements are used to model the development of hinges in segmented tunnels; the spring elements model rotational, transversal, and axial failure. These spring elements are equipped with constitutive models to represent the moment, shear force, and axial force independently. The constitutive models are formulated based on damage mechanics and experimental tests reported in the literature. The mesh of the segmented tunnels was discretized in the software GiD, and the nonlinear analyses were carried out in the finite element software ANSYS. These analyses provide the capacity curves of the primary and secondary linings of a segmented tunnel. Two numerical examples of segmented tunnels show the capability of the spring elements to release energy through the development of hinges. The first example is a segmental concrete lining discretized with frame elements and loaded until hinges occurred in the lining. The second example is a tunnel with primary and secondary linings, discretized with a double-ring frame model: the outer ring simulates the segmental concrete lining, and the inner ring simulates the secondary cast-in-place concrete lining. Spring elements also model the joints between the segments in the circumferential direction and the ring joints, which connect parallel adjacent rings. The computed load versus displacement curves are consistent with the numerical and experimental results reported in the literature. It is shown that modelling a tunnel with primary and secondary linings using frame elements and springs provides reasonable results and saves computational cost compared with 2D or 3D models equipped with smeared crack models.

Keywords: damage, hinges, lining, tunnel

Procedia PDF Downloads 374
5735 The Creation of a Yeast Model for 5-oxoproline Accumulation

Authors: Pratiksha Dubey, Praveen Singh, Shantanu Sen Gupta, Anand K. Bachhawat

Abstract:

5-oxoproline (pyroglutamic acid) is a cyclic lactam of glutamic acid. In the cell, it can be produced by several different pathways and is metabolized into glutamate with the help of the 5-oxoprolinase enzyme (OPLAH or OXP1). The inhibition of the 5-oxoprolinase enzyme in mammals was found to result in heart failure and is thought to be a consequence of oxidative stress [1]. To analyze the consequences of 5-oxoproline accumulation more clearly, we are generating models of 5-oxoproline accumulation in yeast. The yeast 5-oxoproline accumulation model is being developed by two different strategies. The first is overexpression of the mouse γ-glutamylcyclotransferase enzyme, which degrades the γ-glu-met dipeptide, taken up by the cell from the medium, into 5-oxoproline and methionine. The second strategy is to provide a high concentration of 5-oxoproline externally to the yeast cells. The intracellular 5-oxoproline levels in both models are being evaluated, and the metabolic and cellular consequences are being investigated.

Keywords: 5-oxoproline, pyroglutamic acid, yeast, genetics

Procedia PDF Downloads 68
5734 Detecting Earnings Management via Statistical and Neural Networks Techniques

Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie

Abstract:

Predicting earnings management is vital for capital market participants, financial analysts, and managers. The aim of this research is to answer the following question: is there a significant difference between the regression model and neural network models in predicting earnings management, and which leads to a superior prediction? In approaching this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study comprises 94 companies listed on the Tehran Stock Exchange (TSE) from 2003 to 2011. After the results of all models were acquired, ANOVA was used to test the hypotheses. In general, the statistical results showed that the precision of the GRNN did not differ significantly from that of the MLP. In addition, the mean square errors of the MLP and GRNN differed significantly from that of the multivariable LR model. These findings support the notion of nonlinear behavior in earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management using neural network techniques rather than linear regression models.

Keywords: earnings management, generalized linear regression, neural networks multi-layer perceptron, Tehran stock exchange

Procedia PDF Downloads 404