Search results for: multivariate models.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7105


5305 Determinants of International Volatility Passthroughs of Agricultural Commodities: A Panel Analysis of Developing Countries

Authors: Tetsuji Tanaka, Jin Guo

Abstract:

The extant literature has not succeeded in uncovering the common determinants of price volatility transmission of agricultural commodities from international to local markets, and has rarely investigated the role of self-sufficiency measures in the context of national food security. We analyzed various factors to determine the degree of price volatility transmission of wheat, rice, and maize between world and domestic markets, using GARCH models with dynamic conditional correlation (DCC) specifications and panel feasible generalized least squares models. We found that a grain autarky system has the potential to diminish volatility pass-throughs for the three grain commodities. Furthermore, we discovered that substitutive consumption behavior between maize and wheat buffers the volatility transmission of both, but that rice does not function as a transmission-relieving element for the volatility of either wheat or maize. The effectiveness of grain consumption substitution in insulating domestic markets from global pass-throughs is greater than that of cereal self-sufficiency. These implications are highly beneficial for governments of developing countries seeking to protect their domestic food markets from uncertainty in foreign markets and, as such, to improve food security.
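The first stage of a DCC-GARCH estimation fits a univariate GARCH(1,1) conditional-variance recursion to each return series before the correlation stage. A minimal sketch of that recursion, with illustrative (not fitted) parameters; the DCC correlation stage and the panel-FGLS step are omitted:

```python
def garch11_variance(returns, omega=0.05, alpha=0.1, beta=0.85):
    """Filter conditional variances h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}.

    Parameters are illustrative placeholders, not estimates from the paper.
    The recursion is seeded at the unconditional variance omega/(1-alpha-beta).
    """
    h = [omega / (1.0 - alpha - beta)]  # unconditional variance as h_0
    for r in returns[:-1]:              # h_t depends on the previous return
        h.append(omega + alpha * r * r + beta * h[-1])
    return h
```

The DCC stage would then standardize each series by these conditional variances and model the time-varying correlation between world and domestic returns.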

Keywords: food security, GARCH, grain self-sufficiency, volatility transmission

Procedia PDF Downloads 145
5304 The Comparison of Primary B-Cell and NKT-Cell Non-Hodgkin Lymphomas in Nasopharynx, Nasal Cavity, and Paranasal Sinuses

Authors: Jiajia Peng, Jianqing Qiu, Jianjun Ren, Yu Zhao

Abstract:

Background: We aimed to compare clinical and survival differences between B-cell (B-NHL) and NKT-cell non-Hodgkin lymphomas (NKT-NHL) located in the nasal cavity, nasopharynx, and paranasal sinuses, which are usually categorized together as one sinonasal type. Methods: Patients diagnosed with primary B-NHL and NKT-NHL of the nasal cavity, nasopharynx, and paranasal sinuses in the SEER database were included. We identified these patients based on histological types and anatomical sites and subsequently conducted univariate and multivariate Cox regression and Kaplan–Meier analyses to examine cancer-specific survival (CSS) outcomes. Results: Overall, most B-NHL cases originated in the nasopharynx, while the majority of NKT-NHL cases occurred in the nasal cavity. Notably, CSS outcomes improved significantly over time in all sinonasal B-NHL cases, whereas no such trend was observed for any sinonasal NKT-NHL type. Additionally, increasing age was linked to an elevated risk of death in B-NHL, particularly in the nasal cavity (HR: 3.37), but not in NKT-NHL. Compared with B-NHL, the adverse effect of higher stage on CSS was more evident in NKT-NHL, particularly at the nasopharynx site (HR: 5.12). Furthermore, radiotherapy was beneficial for survival in patients with sinonasal B-NHL and NKT-NHL, except in those with NKT-NHL of the nasopharynx. However, chemotherapy has been beneficial for CSS only in patients with B-NHL of the paranasal sinuses (HR: 0.42) since 2010, and not in the other types of B-NHL or NKT-NHL. Conclusions: Although B-NHL and NKT-NHL of the nasal cavity, nasopharynx, and paranasal sinuses have similar anatomical locations, their clinical demographics and prognoses are largely different, and they should be treated and studied as distinct diseases.
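The Kaplan–Meier analysis named in the methods rests on the product-limit estimator, which can be sketched in a few lines of pure Python. This is a generic implementation on toy data, not the SEER analysis itself:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate for right-censored data.

    times:  follow-up time for each subject
    events: 1 if the event (death) was observed, 0 if censored
    Returns (event_time, survival_probability) pairs.
    """
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if deaths:
            surv *= 1.0 - deaths / at_risk   # multiply by conditional survival
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)  # remove deaths and censored
    return curve
```

Each output pair gives the survival probability just after an event time; censored subjects leave the risk set without changing the curve.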

Keywords: B-cell non-Hodgkin lymphomas, NKT-cell non-Hodgkin lymphomas, nasal cavity lymphomas, nasal sinuses lymphomas, nasopharynx lymphomas

Procedia PDF Downloads 92
5303 Evaluation of UI for 3D Visualization-Based Building Information Applications

Authors: Monisha Pattanaik

Abstract:

In scenarios where users work with large hierarchical data structures combined with visualizations (for example, construction 3D models, manufacturing equipment models, Gantt charts, and building plans), the data structures are dense, consisting of multiple parent nodes up to 50 levels deep together with their siblings and descendants, and therefore convey an immediate feeling of complexity. With customers moving to consumer-grade enterprise software, it is crucial to make sophisticated features available on touch devices and smaller screen sizes. This paper evaluates a UI component that allows users to scroll through all density levels using a slider overlay on top of the hierarchy table, performing several actions to focus on one set of objects at any point in time. The overlay component also solves the problem of excessive horizontal scrolling of the entire table on a fixed pane. The component can be customized to navigate through parents only, siblings only, or a specific part of the hierarchy. The UI component was evaluated by end users of the application and by human-computer interaction (HCI) experts to test its usability, with statistical results and recommendations for handling complex hierarchical data visualizations.

Keywords: building information modeling, digital twin, navigation, UI component, user interface, usability, visualization

Procedia PDF Downloads 125
5302 Study of the Influence of Eccentricity Due to Configuration and Materials on Seismic Response of a Typical Building

Authors: A. Latif Karimi, M. K. Shrimali

Abstract:

Seismic design is a critical stage in the design and construction of a building. It includes strategies for designing earthquake-resistant buildings to ensure the health, safety, and security of the building occupants and assets. Hence, it becomes very important to understand precisely the behavior of structural members in order to construct buildings that can yield a better response to seismic forces. This paper investigates the behavior of a typical structure subjected to ground motion. The corresponding mode shapes and modal frequencies are studied to interpret the response of an actual structure using different fabricated models and 3D visual models. In this study, three different structural configurations are subjected to horizontal ground motion, and the effects of stiffness eccentricity and the placement of infill walls are checked to determine how each parameter contributes to a building's response to dynamic forces. The deformation data from lab experiments and the analysis in the SAP2000 software are reviewed to obtain the results. This study revealed that the seismic response of a building can be improved by introducing higher deformation capacity into the building. In addition, proper design of infill walls and maintaining a symmetrical configuration are key factors in building stability during an earthquake.

Keywords: eccentricity, seismic response, mode shape, building configuration, building dynamics

Procedia PDF Downloads 188
5301 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software operates only with strictly regulated rules, it cannot aid customers beyond those limitations. Loan prediction therefore requires the application of machine learning (ML) techniques. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required ML libraries, the models are created and evaluated using appropriate metrics. The random forest performed with the highest accuracy, 80.17%, and was therefore implemented into the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper combining model development and the Django framework with deployment to the Alibaba cloud computing platform.
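As a hedged illustration of one of the four models, a minimal k-nearest-neighbour classifier by majority vote is sketched below; the feature rows and labels are hypothetical toy data, not the study's dataset or features:

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among the k nearest training points
    (squared Euclidean distance -- monotone in the true distance)."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, x)), label)
        for row, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

In the study's setting, the rows would be applicant features and the labels repayment outcomes; the random forest the authors selected follows the same fit/predict pattern with an ensemble of decision trees.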

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 115
5300 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market

Authors: Cristian Păuna

Abstract:

In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investment has become the primary tool for profiting from speculation in financial markets. A significant number of traders, private and institutional investors, participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed to build a reliable trend line, which is the basis for limit conditions and automated investment signals and the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals, how to build limit conditions as a mathematical filter for investment opportunities, and how to integrate all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index, allowing different automated investment algorithms to be analyzed and compared. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a risk-to-reward ratio of 1:6.12 was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with similar models, as this paper shows. The general idea sustained by this paper is that the Price Prediction Line model is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
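The abstract does not disclose the Price Prediction Line algorithm itself, so the sketch below is a simplified stand-in: an ordinary least-squares trend line fitted to recent prices, with an entry signal emitted when the last price falls a threshold below the line. Both the trend-line form and the threshold are assumptions for illustration only:

```python
def trend_line(prices):
    """Ordinary least-squares line through (t, price); returns (slope, intercept)."""
    n = len(prices)
    ts = range(n)
    t_mean = sum(ts) / n
    p_mean = sum(prices) / n
    slope = (sum((t - t_mean) * (p - p_mean) for t, p in zip(ts, prices))
             / sum((t - t_mean) ** 2 for t in ts))
    return slope, p_mean - slope * t_mean

def signal(prices, threshold):
    """Emit 'buy' when the last price sits `threshold` below the fitted line."""
    slope, intercept = trend_line(prices)
    predicted = slope * (len(prices) - 1) + intercept
    return "buy" if prices[-1] < predicted - threshold else "hold"
```

A limit condition in the paper's sense would gate such signals further, e.g. by requiring a minimum slope or a volatility bound before an order is placed.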

Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex

Procedia PDF Downloads 122
5299 Large-Scale Method to Assess the Seismic Vulnerability of Heritage Buildings: Modal Updating of Numerical Models and Vulnerability Curves

Authors: Claire Limoge Schraen, Philippe Gueguen, Cedric Giry, Cedric Desprez, Frédéric Ragueneau

Abstract:

The Mediterranean area is characterized by numerous monumental and vernacular masonry structures illustrating old ways of building and living. These precious buildings are often poorly documented, present complex shapes and loadings, and are protected by the States, leading to legal constraints. The area also presents moderate to high seismic activity, and even moderate earthquakes can be magnified by local site effects and cause collapse or significant damage. Moreover, the structural resistance of masonry buildings, especially less famous ones or those located in rural zones, has generally been lowered by many factors: poor maintenance, unsuitable restoration, ambient pollution, and previous earthquakes. Recent earthquakes prove that any damage to these architectural witnesses to our past is irreversible, leading to the necessity of acting preventively. This means providing preventive assessments for hundreds of structures with no or few documents. In this context, we propose a general method, based on hierarchized numerical models, to provide preliminary structural diagnoses at a regional scale, indicating whether more precise investigations and models are necessary for each building. To this aim, we adapt different tools, some under development, such as photogrammetry, and some to be created, such as a preprocessor that builds meshes for FEM software starting from pictures, in order to allow dynamic studies of the buildings in the panel. We made an inventory of 198 baroque chapels and churches situated in the French Alps. Their structural characteristics were then determined through field surveys and the MicMac photogrammetric software. Using structural criteria, we determined eight types of churches and seven types of chapels. We studied their dynamic behavior with CAST3M, using the EC8 spectrum and accelerograms of the studied zone. This allowed us to quantify the effect of the needed simplifications in the most sensitive zones and to choose the most effective ones.
We also proposed threshold criteria based on the damage observed in in situ surveys, old pictures, and the Italian code; these criteria are relevant for linear models. To validate the structural types, we carried out a campaign of vibratory measurements using ambient noise and velocimeters. This also allowed us to validate the method on old masonry and to identify the modal characteristics of 20 churches. We then proceeded to a dynamic identification between numerical and experimental modes, updating the linear models through material and geometrical parameters that are often unknown because of the complexity of the structures and materials. The numerically optimized values were verified against the measurements we made on the masonry components in situ and in the laboratory. We are now working on non-linear models that redistribute the strains, so as to validate the damage threshold criteria used to compute the vulnerability curves of each defined structural type. Our current results show a good correlation between experimental and numerical data, validating the final modeling simplifications and the global method. We now plan to use non-linear analysis in the critical zones in order to test reinforcement solutions.

Keywords: heritage structures, masonry numerical modeling, seismic vulnerability assessment, vibratory measure

Procedia PDF Downloads 480
5298 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphics processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and multiple relaxation time (MRT) models. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. Particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, considering all external forces. Previous models distribute particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes these deficiencies and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model for simulating complex multiphase fluid flows, the approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature.
The GPU code was then used to simulate particle transport and deposition in a turbulent impinging jet at Re=10,000. Simulations were conducted for L/D=2, 4, and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of the Stokes number on the particle deposition profile was studied at the different L/D ratios. For comparative studies, an in-house serial CPU code was also developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup of about 350 over the serial code running on a single CPU.

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 195
5297 Improved Elastoplastic Bounding Surface Model for the Mathematical Modeling of Geomaterials

Authors: Andres Nieto-Leal, Victor N. Kaliakin, Tania P. Molina

Abstract:

The nature of most engineering materials is quite complex. It is, therefore, difficult to devise a general mathematical model that will cover all possible ranges and types of excitation and behavior of a given material. As a result, the development of mathematical models is based upon simplifying assumptions regarding material behavior. Such simplifications result in some material idealization; for example, one of the simplest material idealizations is to assume that the material behavior is elastic. However, soils are nonhomogeneous, anisotropic, path-dependent materials that exhibit nonlinear stress-strain relationships, changes in volume under shear, dilatancy, as well as time-, rate- and temperature-dependent behavior. Over the years, many constitutive models, possessing different levels of sophistication, have been developed to simulate the behavior of geomaterials, particularly cohesive soils. Early in the development of constitutive models, it became evident that elastic or standard elastoplastic formulations, employing purely isotropic hardening and predicated on the existence of a yield surface surrounding a purely elastic domain, were incapable of realistically simulating the behavior of geomaterials. Accordingly, more sophisticated constitutive models have been developed, such as bounding surface elastoplasticity. The essence of the bounding surface concept is the hypothesis that plastic deformations can occur for stress states either within or on the bounding surface; thus, unlike classical yield surface elastoplasticity, plastic states are not restricted to those lying on a surface. Elastoplastic bounding surface models have been improved over time; however, there is still a need to improve their capability to simulate the response of anisotropically consolidated cohesive soils, especially the response in extension tests.
Thus, in this work an improved constitutive model was developed that can more accurately predict the diverse stress-strain phenomena exhibited by cohesive soils; in particular, it includes an improved rotational hardening rule that better simulates the response of cohesive soils in extension. The generalized definition of the bounding surface model provides a convenient and elegant framework for unifying various previous versions of the model for anisotropically consolidated cohesive soils. The Generalized Bounding Surface Model for cohesive soils is a fully three-dimensional, time-dependent model that accounts for both inherent and stress-induced anisotropy, employing a non-associative flow rule. The numerical implementation of the model in a computer code followed an adaptive multistep integration scheme in conjunction with local iteration and radial return. The one-step trapezoidal rule was used to obtain the stiffness matrix that defines the relationship between the stress increment and the strain increment. After testing the model through extensive comparisons of simulations to experimental data, it has been shown to give quite good results. The new model successfully simulates the response of different cohesive soils, for example, Cardiff kaolin, Spestone kaolin, and Lower Cromer till. The simulated undrained stress paths, stress-strain responses, and excess pore pressures are in very good agreement with the experimental values, especially in extension.

Keywords: bounding surface elastoplasticity, cohesive soils, constitutive model, modeling of geomaterials

Procedia PDF Downloads 304
5296 Environmental Effects on Energy Consumption of Smart Grid Consumers

Authors: S. M. Ali, A. Salam Khan, A. U. Khan, M. Tariq, M. S. Hussain, B. A. Abbasi, I. Hussain, U. Farid

Abstract:

The environment and surroundings play a pivotal role in structuring the lifestyle of consumers, and living standards in turn affect their energy consumption. In the smart grid paradigm, climatic drifts, weather parameters, and the green environment relate directly to the energy profiles of the various consumers: residential, commercial, and industrial. Considering these factors helps utilities shape load curves and optimally manage demand and supply. Thus, there is a pressing need to develop correlation models of load and weather parameters and to critically analyze the factors affecting the energy profiles of smart grid consumers. In this paper, we elaborate the various environmental and weather factors affecting consumer demand. Moreover, we developed correlation models, namely Pearson, Spearman, and Kendall, to capture the interrelation between the dependent (load) parameter and the independent (weather) parameters. Furthermore, we validated our discussion with real data from the state of Texas. The numerical simulations demonstrated the strong relation of climatic drifts to the energy consumption of smart grid consumers.
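Two of the three correlation models named here can be sketched in pure Python; the Spearman version below ranks naively and ignores ties, and Kendall's tau would follow the same pattern with concordant/discordant pair counting. The load and temperature values in the test are toy numbers, not the Texas data:

```python
def pearson(x, y):
    """Pearson correlation: covariance over the product of standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman correlation: Pearson applied to ranks (ties ignored in this sketch)."""
    rank = lambda v: [sorted(v).index(e) for e in v]
    return pearson(rank(x), rank(y))
```

Spearman and Kendall capture monotone but nonlinear load-weather relations that a Pearson coefficient can understate, which is why all three are compared.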

Keywords: climatic drifts, correlation analysis, energy consumption, smart grid, weather parameter

Procedia PDF Downloads 357
5295 Driving towards Better Health: A Cross-Sectional Study of the Prevalence and Correlates of Obesity among Commercial Drivers in East London, South Africa

Authors: Daniel Ter Goon, Aanuoluwa O. Adedokun, Eyitayo Omolara Owolabi, Oladele Vincent Adeniyi, Anthony Idowu Ajayi

Abstract:

Background: The unhealthy food choices and sedentary lifestyle of commercial drivers predispose them to obesity and obesity-related diseases. Yet, no attention has been paid to the obesity burden among this high-risk group in South Africa. This study examines the prevalence of obesity and its risk factors among commercial drivers in East London, South Africa. Methods: This cross-sectional study utilized the WHO STEPwise approach to screen for obesity among 403 drivers in Buffalo City Metropolitan Municipality (BCMM), South Africa. Anthropometric, blood pressure, and blood glucose measurements were taken following standard procedures. Overweight and obesity were defined as a body mass index (BMI) of 25.0–29.9 kg/m² and ≥ 30 kg/m², respectively. Bivariate and multivariate analyses were used to determine the prevalence and determinants of obesity. Results: The mean age of the participants was 43.3 (SD 12.5) years; mean height and weight were 170.1 cm (SD 6.2) and 83 kg (SD 18.7), respectively. The prevalence of overweight and obesity was 34.0% and 38.0%, respectively. After adjusting for confounding factors, only age (OR 1.6, CI 1.0-2.7), hypertension (OR 3.6, CI 2.3-5.7), and non-smoking (OR 2.0, CI 1.3-3.1) were independent predictors of obesity. Conclusion: The prevalence of overweight and obesity is high among commercial drivers. Age, hypertension, and non-smoking were independent predictors of obesity in the sample. Measures aimed at promoting health and reducing obesity should be prioritized for this group.
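The BMI bands used in the study translate directly into code. The helper below applies the cut-offs quoted in the abstract; the 18.5 kg/m² underweight/normal boundary is the standard WHO value, an addition not stated in the abstract:

```python
def bmi_category(weight_kg, height_m):
    """Classify BMI = weight / height^2 using the WHO bands the study applies:
    overweight 25.0-29.9 kg/m^2, obese >= 30 kg/m^2 (18.5 boundary assumed)."""
    bmi = weight_kg / height_m ** 2
    if bmi >= 30:
        return "obese"
    if bmi >= 25:
        return "overweight"
    if bmi >= 18.5:
        return "normal"
    return "underweight"
```

Applied to the reported sample means (83 kg, 170.1 cm), the BMI is about 28.7 kg/m², inside the overweight band, consistent with the 34% overweight prevalence reported.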

Keywords: obesity and overweight, commercial taxi drivers, risk factors, South Africa

Procedia PDF Downloads 327
5294 Characterising Stable Model by Extended Labelled Dependency Graph

Authors: Asraful Islam

Abstract:

The extended dependency graph (EDG) is a state-of-the-art isomorphic graph for representing normal logic programs (NLPs) that can characterize the consistency of an NLP by graph analysis. To construct the vertices and arcs of an EDG, additional renaming atoms and rules beyond those the given program provides are used, resulting in higher space complexity than the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and we prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare the space complexity. Results show that the ELDG uses less memory to store nodes, arcs, and cycles than the EDG. To exhibit the desirability of the ELDG, firstly, the stable models of the kernel form of an NLP are characterized by the admissible coloring of the ELDG; secondly, a relation is established between the stable models of a kernel program and the handles of the minimal odd cycles appearing in the corresponding ELDG; thirdly, to the best of our knowledge, for the first time an inverse transformation from a dependency graph to the represented NLP w.r.t. the ELDG is defined, enabling analytical results to be transferred from the graph to the program straightforwardly.
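The paper's ELDG construction is not reproduced in the abstract, but the size relation it compares can be illustrated with a plain labelled dependency graph: one node per atom and one signed arc per body literal, built from a toy program given in (head, positive body, negative body) form. The rule representation here is an assumption for illustration:

```python
def dependency_graph(program):
    """Build a labelled dependency graph from a normal logic program.

    program: list of rules (head, positive_body_atoms, negative_body_atoms).
    Returns a set of arcs (body_atom, head, sign) with '+' for positive
    and '-' for negated body literals -- no renaming atoms are introduced,
    so node/arc counts match the program itself, as with a TDG.
    """
    arcs = set()
    for head, pos, neg in program:
        for b in pos:
            arcs.add((b, head, '+'))
        for b in neg:
            arcs.add((b, head, '-'))
    return arcs
```

Odd cycles through '-' arcs (such as p depending negatively on itself via q) are exactly the structures whose handles the paper relates to stable models.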

Keywords: normal logic program, isomorphism of graphs, extended labelled dependency graph, inverse graph transformation, graph colouring

Procedia PDF Downloads 199
5293 Studying the Impact of Soil Characteristics on the Displacement of Retaining Walls Using the Finite Element Method

Authors: Mojtaba Ahmadabadi, Akbar Masoudi, Morteza Rezai

Abstract:

In this paper, the effect of soil and wall characteristics on the displacement of retaining walls was investigated using the finite element method. Thirty-two different models were studied with different parameters, allowing displacement to be calculated at any height of the wall for frictional-cohesive soils. The main purpose of this research is to determine the soil characteristics most effective in reducing wall displacement. Comparison of the models showed that an overall increase in the internal friction angle, in the friction angle between soil and wall, and in the modulus of elasticity reduces wall displacement, whereas an increase in the specific weight of the soil increases it. Based on the results, all wall displacements were of the overturning type, with bulging of the backfill soil; the internal friction angle and the friction angle between soil and wall had the greatest impact on reducing wall displacement. One advantage of this study is that it takes into account all parameters of the soil and wall, together with the displacement distribution in the wall and backfill soil, with the aim of providing the best conditions for reducing wall displacement.

Keywords: retaining wall, fem, soil and wall interaction, angle of internal friction of the soil, wall displacement

Procedia PDF Downloads 378
5292 Predictive Analysis of the Stock Price Market Trends with Deep Learning

Authors: Suraj Mehrotra

Abstract:

The stock market is a volatile, bustling marketplace that is a cornerstone of economics; it defines whether companies are successful or in a downward spiral. A thorough understanding of it is important: many companies have whole divisions dedicated to analyzing both their own stock and that of rival companies. Linking the world of finance with artificial intelligence (AI), especially in the stock market, is a relatively recent development. Predicting how stocks will perform, considering all external factors and previous data, has traditionally been a human task. With the help of AI, however, machine learning models can make more complete predictions of financial trends. In the stock market specifically, predicting the open, close, high, and low prices for the next day is very hard; machine learning makes the task much easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. When used effectively, new doors can be opened in the business and finance world, and companies can make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural network-based approaches, and provides a detailed analysis of the techniques as well as the challenges of predictive analysis. Four models, linear regression, a neural network, a decision tree, and naïve Bayes, were evaluated on the stocks of Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan Chase, and Johnson & Johnson. On the testing set, the naïve Bayes and linear regression models achieved the highest accuracy, followed by the neural network model and then the decision tree model.
The training set showed similar results, except that the decision tree model was perfect, with complete accuracy in its predictions. This suggests that the decision tree model overfitted the training set, which explains its weaker testing-set performance.
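A toy Gaussian naïve Bayes, the model family reported most accurate on the test set, can be sketched as follows. The one-feature synthetic data stands in for the actual stock features, which the abstract does not give:

```python
import math

def gnb_fit(X, y):
    """Estimate per-class priors and per-feature Gaussian (mean, variance)."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        stats = []
        for col in zip(*rows):
            mu = sum(col) / len(col)
            var = sum((v - mu) ** 2 for v in col) / len(col) + 1e-9  # variance floor
            stats.append((mu, var))
        model[c] = (len(rows) / len(y), stats)
    return model

def gnb_predict(model, x):
    """Pick the class maximising log prior plus Gaussian log-likelihoods."""
    def score(c):
        prior, stats = model[c]
        s = math.log(prior)
        for v, (mu, var) in zip(x, stats):
            s += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
        return s
    return max(model, key=score)
```

The "naïve" independence assumption keeps the model simple and hard to overfit, which is consistent with it generalising better here than the unpruned decision tree.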

Keywords: machine learning, testing set, artificial intelligence, stock analysis

Procedia PDF Downloads 83
5291 Creating Energy Sustainability in an Enterprise

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

As we enter the new era of artificial intelligence (AI) and cloud computing, almost every industry sector relies on the machine learning and natural language processing capabilities of AI and on energy-efficient hardware and software devices. In these sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and for slowing the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet, and Profits); the 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is a growing focus on and demand for renewable energy. With this growing demand, there is also growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In this paper, we discuss the driving forces, such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms, that influence the three main pillars of sustainability. Through this paper, we offer an overall perspective on enterprise strategies, with a primary focus on the cultural shifts involved in adopting energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.

Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure

Procedia PDF Downloads 94
5290 End-to-End Spanish-English Sequence Learning Translation Model

Authors: Vidhu Mitha Goutham, Ruma Mukherjee

Abstract:

The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning using an encoder-decoder model is now paving the path to higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model that includes source language detection. We achieve competitive results using a duo-lingo-corpus-trained model, providing prospective, ready-made plug-in use for compound sentences and document translations. Our model serves as a decent system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, it also stands as a well-optimized deep neural network model and solution.

Keywords: attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation

Procedia PDF Downloads 163
5289 Corporate Governance of State-Owned Enterprises: A Comparative Analysis

Authors: Adeyemi Adebayo, Barry Ackers

Abstract:

This paper comparatively analyses the corporate governance of SOEs in South Africa and Singapore in the context of the World Bank’s framework for corporate governance of SOEs. This framework ensures that the analysis holistically covers key aspects of corporate governance of SOEs in these states. To ground our understanding of the paths taken by SOEs in each state, the paper presents the evolution and reforms of their SOEs before analyzing key aspects of their corporate governance. The analysis shows that even though SOEs in South Africa and Singapore are comparable in a number of ways, there are notable differences. In this context, the paper finds that the main difference between the corporate governance of SOEs in South Africa and Singapore is their organizing model. The analysis further shows, among other findings, that SOE boards in Singapore are better remunerated. A further finding is that, even though some board members are politically connected, Singaporean SOE boards are better constituted on the basis of skills and experience than SOE boards in South Africa. Overall, the analysis opens up new debates and concludes by providing avenues for further research.

Keywords: corporate governance, comparative corporate governance, corporate governance framework, government business enterprises, government linked companies, organizing models, ownership models, state-owned companies, state-owned enterprises

Procedia PDF Downloads 204
5288 Multidimensional Poverty and Child Cognitive Development

Authors: Bidyadhar Dehury, Sanjay Kumar Mohanty

Abstract:

According to the Right to Education Act of India, education is the fundamental right of all children aged 6-14 years, irrespective of their status. Using unit-level data from the India Human Development Survey (IHDS), we examined the inter-relationship between the level of poverty and the academic performance of children aged 8-11 years. The level of multidimensional poverty was measured across five dimensions and 10 indicators using the Alkire-Foster approach. The weighted deprivation score was obtained by giving equal weight to each dimension and to the indicators within each dimension. The weighted deprivation score varies from 0 to 1 and was grouped into four categories: non-poor, vulnerable, multidimensionally poor, and severely multidimensionally poor. The academic performance index was measured from three variables (reading skills, math skills, and writing skills) using principal component analysis (PCA). Bivariate and multivariate analyses were used. Because the outcome variable was ordinal, predicted probabilities were calculated using ordinal logistic regression. The predicted probability of a good academic performance index was 0.202 if the child was severely multidimensionally poor, 0.235 if multidimensionally poor, 0.264 if vulnerable, and 0.316 if non-poor. Hence, as the level of poverty among children decreases from severely multidimensionally poor to non-poor, the probability of good academic performance increases.
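As an illustration of the scoring described above, the Python sketch below computes a weighted deprivation score over 10 equally weighted indicators and maps it to the four poverty groups. The deprivation flags, the assumed two-indicators-per-dimension split, and the cut-offs (0.2, 1/3, 0.5, following common Alkire-Foster conventions) are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch of the weighted deprivation score: 5 dimensions,
# 10 indicators (assumed 2 per dimension), equal weight per dimension
# split equally among its indicators. Cut-offs are illustrative.

def deprivation_score(indicators, dims=5, per_dim=2):
    """indicators: 0/1 deprivation flags, grouped by dimension."""
    w = 1.0 / (dims * per_dim)  # each indicator carries weight 1/10
    return sum(flag * w for flag in indicators)

def poverty_group(score):
    if score < 0.2:
        return "non-poor"
    elif score < 1 / 3:
        return "vulnerable"
    elif score < 0.5:
        return "multidimensionally poor"
    return "severely multidimensionally poor"

child = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]  # deprived in 4 of 10 indicators
score = deprivation_score(child)
print(round(score, 2), poverty_group(score))
```

A child deprived in 4 of 10 indicators scores 0.4 and falls in the multidimensionally poor group under these cut-offs.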

Keywords: multidimensional poverty, academic performance index, reading skills, math skills, writing skills, India

Procedia PDF Downloads 578
5287 Error Amount in Viscoelasticity Analysis Depending on Time Step Size and Method Used in ANSYS

Authors: A. Fettahoglu

Abstract:

The theory of viscoelasticity is used by many researchers to represent the behavior of materials such as pavements on roads or bridges. Several studies have used analytical methods and rheology to predict material behavior with simple models. Today, more complex engineering structures are analyzed using the Finite Element Method, in which material behavior is embedded by means of three-dimensional viscoelastic material laws. As a result, structures of irregular geometry and domain, like the pavements of bridges, can be analyzed by means of the Finite Element Method and three-dimensional viscoelastic equations. In the scope of this study, the rheological models embedded in ANSYS, namely generalized Maxwell elements and Prony series, the two methods ANSYS uses to represent viscoelastic material behavior, are presented explicitly. Subsequently, a practical problem with an analytical solution given in the literature is used to verify the applicability of the viscoelasticity tool embedded in ANSYS. Finally, the amount of error in the ANSYS results is compared with the analytical results to indicate the influence of the method used and the time step size.
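For readers unfamiliar with the Prony series mentioned above, the relaxation modulus of a generalized Maxwell model is a sum of decaying exponentials over the Maxwell branches, G(t) = G_inf + sum_i G_i exp(-t/tau_i). The short Python sketch below evaluates this form; the coefficients are made-up illustrations, not material inputs from the study.

```python
import math

# Minimal sketch of a Prony-series relaxation modulus, the form used for
# generalized Maxwell models. Coefficients are illustrative only.

def prony_modulus(t, g_inf, terms):
    """terms: list of (G_i, tau_i) pairs, one per Maxwell branch."""
    return g_inf + sum(g * math.exp(-t / tau) for g, tau in terms)

terms = [(0.4, 1.0), (0.2, 10.0)]        # two Maxwell branches
print(prony_modulus(0.0, 0.4, terms))    # instantaneous modulus, ~1.0
print(prony_modulus(1e6, 0.4, terms))    # long-time modulus -> G_inf = 0.4
```

At t = 0 all branches contribute fully; as t grows, each branch relaxes on its own time scale tau_i and the modulus decays toward the long-term value G_inf.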

Keywords: generalized Maxwell model, finite element method, Prony series, time step size, viscoelasticity

Procedia PDF Downloads 356
5286 Virtual Modelling of Turbulent Fibre Flow in a Low Consistency Refiner for a Sustainable and Energy Efficient Process

Authors: Simon Ingelsten, Anton Lundberg, Vijay Shankar, Lars-Olof Landström, Örjan Johansson

Abstract:

The flow in a low consistency disc refiner is simulated with the aim of identifying flow structures that may be of importance for a future study to optimise the energy efficiency of refining processes. A simplified flow geometry is used, in which a single groove of a refiner disc is modelled. Two different fibre models are used to simulate turbulent fibre suspension flow in the groove. The first is a Bingham viscoplastic fluid model, in which the fibre suspension is treated as a non-Newtonian fluid with a yield stress. The second is a new model proposed in a recent study, in which the suspended fibres' effect on the flow is accounted for through a modelled orientation distribution function (ODF). Both models yielded similar results with small differences. Certain flow characteristics that were expected from the literature were identified. Some of these flow characteristics may be of importance in a future effort to optimise the refiner geometry and increase energy efficiency. A more detailed flow model is, however, needed for the simulations to yield results valid for quantitative use in such an optimisation study. An outline of the next steps in such a study is proposed.
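To make the first fibre model concrete: a Bingham viscoplastic fluid is commonly implemented in CFD codes through a regularized effective viscosity that grows as the shear rate approaches zero, so that unyielded regions behave as near-rigid plugs. The sketch below uses this common regularization; the yield stress and plastic viscosity values are illustrative assumptions, not parameters from the study.

```python
# Sketch of a regularized Bingham effective viscosity:
#   mu_eff = mu_p + tau_y / shear_rate
# with a small epsilon to avoid division by zero at vanishing shear.
# Parameter values are illustrative only.

def bingham_effective_viscosity(shear_rate, tau_y, mu_p, eps=1e-6):
    """tau_y: yield stress; mu_p: plastic viscosity; shear_rate >= 0."""
    return mu_p + tau_y / (shear_rate + eps)

# Low shear -> very high apparent viscosity (plug-like, unyielded zones);
# high shear -> viscosity approaches the plastic viscosity mu_p.
print(bingham_effective_viscosity(0.01, tau_y=5.0, mu_p=0.1))
print(bingham_effective_viscosity(1000.0, tau_y=5.0, mu_p=0.1))
```

This is the simplest (Papanastasiou-style regularizations are another option) way to embed a yield-stress fluid into a standard Newtonian-type solver loop.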

Keywords: disc refiner, fibre flow, sustainability, turbulence modelling

Procedia PDF Downloads 392
5285 Elaboration and Physico-Chemical Characterization of Edible Films Made from Chitosan and Spray Dried Ethanolic Extracts of Propolis

Authors: David Guillermo Piedrahita Marquez, Hector Suarez Mahecha, Jairo Humberto Lopez

Abstract:

It was necessary to establish which formulation is suitable for the preservation of aquaculture products, and for this purpose edible films were made. These films were characterized in order to determine their morphology and their physicochemical, mechanical, and optical properties. Six formulations of chitosan and encapsulated ethanolic extract of propolis were developed because of their activity against pathogens and because of their properties, which allow the creation of polymer networks impermeable to gases and vapor and resistant to physical damage. Across the six formulations, the concentration of the matrix material (1% w/v, 2% w/v) and the concentration of the bioactive (0.5% w/v, 1% w/v, 1.5% w/v) were varied, and the results were compared with statistical and multivariate analysis methods. It was observed that the matrices showed greater impermeability and thickness than the control samples and the samples reported in the literature. These films also showed notable uniformity and greater resistance to physical damage compared with edible films made from other biopolymers. However, the action of some compounds had a negative effect on the mechanical properties and drastically changed the optical properties; the bioactive affects the polymer matrix. It was determined that the films with 2% w/v chitosan and 1.5% w/v encapsulated extract exhibited the best properties and suffered the negative impact of immiscible substances to a lesser extent.

Keywords: chitosan, edible films, ethanolic extract of propolis, mechanical properties, optical properties, physical characterization, scanning electron microscopy (SEM)

Procedia PDF Downloads 428
5284 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome

Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler

Abstract:

Deep neural network (DNN) models are being explored in the clinical domain, following their recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to their multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting 1-year mortality of patients who underwent major cardiovascular procedures (MCVPs), using a temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by the Veterans Affairs Informatics and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created from the patient history data in the 2-year window prior to the index MCVP, and a temporal image was created from these variables for each individual patient. To generate the explanation for the DNN model, we defined a new concept called the impact score, based on the impact of the presence/value of clinical conditions on the predicted outcome. Like the log odds ratios reported by a logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The DNN model achieved an area under the curve (AUC) of 78.5%, while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman’s rho = 0.74), which helped validate our explanation.
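The validation step above compares two rankings, which can be sketched with a plain Spearman rank correlation (no ties assumed, so the classic d-squared formula applies). The impact scores and log odds below are invented for illustration; only the method mirrors the abstract.

```python
# Sketch of Spearman's rho between hypothetical DNN impact scores and
# LR log odds ratios. Assumes no ties, so rho = 1 - 6*sum(d^2)/(n(n^2-1)).

def rank(xs):
    """1-based ranks of xs (ties not handled)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman_rho(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

impact = [0.8, 0.1, 0.5, 0.9, 0.3]   # made-up aggregated impact scores
log_or = [1.2, 0.0, 0.7, 1.0, 0.2]   # made-up LR log odds ratios
print(round(spearman_rho(impact, log_or), 2))  # 0.9
```

A rho near 1 means the two explanation methods rank the clinical conditions almost identically, which is the kind of agreement the study reports (rho = 0.74).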

Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model

Procedia PDF Downloads 141
5283 How Unicode Glyphs Revolutionized the Way We Communicate

Authors: Levi Corallo

Abstract:

Typed language produced by humans on computers and cell phones marks a significant departure from previous modes of written language exchange. While acronyms remain one of the most prominent markers of typed language, another and perhaps more recent revolution in the way humans communicate has been the use of symbols or glyphs, primarily Emojis, globally introduced on the iPhone keyboard by Apple in 2008. This paper seeks to analyze the use of symbols in typed communication from both a linguistic and a machine learning perspective. The Unicode system is explored, and methods of encoding are juxtaposed with current machine and human perception. The paper examines how typed symbol usage functions in conversation, as well as current research methods dealing with Emojis, such as sentiment analysis and predictive text models. This study proposes that sequential analysis is a significant feature for analyzing Unicode characters in a corpus with machine learning. Models that try to learn or translate the meaning of Emojis should learn using bi- and tri-grams of Emoji, as well as observing the relationships between combinations of different Emoji in tandem. The sociolinguistics of an entirely new vernacular, referred to here as 'typed language', is also delineated across this analysis of Unicode glyphs from both a semantic and a technical perspective.
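The proposed bi- and tri-gram analysis of Emoji can be sketched in a few lines of Python: consecutive emoji code points in a message are collected and windowed into n-grams. The example message and the crude code-point range check below are illustrative assumptions; a real pipeline would use the full Unicode emoji property data rather than a single range.

```python
# Sketch of emoji n-gram extraction from a typed message. The range check
# 0x1F300-0x1FAFF covers the main emoji blocks but is a simplification:
# real emoji detection should consult Unicode emoji property data.

def emoji_ngrams(text, n=2):
    glyphs = [ch for ch in text if 0x1F300 <= ord(ch) <= 0x1FAFF]
    return [tuple(glyphs[i:i + n]) for i in range(len(glyphs) - n + 1)]

msg = "great game \U0001F600\U0001F3C6\U0001F389 tonight"
print(emoji_ngrams(msg, 2))
# two bigrams: (grinning face, trophy), (trophy, party popper)
```

Feeding such n-grams, instead of isolated glyphs, into a sentiment or translation model is exactly the sequential treatment the abstract argues for.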

Keywords: unicode, text symbols, emojis, glyphs, communication

Procedia PDF Downloads 185
5282 QSAR Studies of Certain Novel Heterocycles Derived from Bis-1,2,4-Triazoles as Anti-Tumor Agents

Authors: Madhusudan Purohit, Stephen Philip, Bharathkumar Inturi

Abstract:

In this paper we report quantitative structure-activity relationship studies of novel bis-triazole derivatives for predicting their activity profile. The full model encompassed a dataset of 46 bis-triazoles. The Tripos Sybyl X 2.0 program was used to conduct CoMSIA QSAR modeling. The Partial Least Squares (PLS) method was used for statistical analysis and to derive a QSAR model based on the field values of the CoMSIA descriptors. The compounds were divided into test and training sets, and the compounds were evaluated with various CoMSIA parameters to arrive at the best QSAR model. An optimum number of components was first determined separately by cross-validated regression for the CoMSIA model and then applied in the final analysis. A series of parameters was examined, and the best-fit model was obtained using the donor, partition coefficient, and steric parameters. The CoMSIA model demonstrated good statistical results, with a regression coefficient (r2) and cross-validated coefficient (q2) of 0.575 and 0.830, respectively. The standard error for the predicted model was 0.16322. In the CoMSIA model, the steric descriptors make a marginally larger contribution than the electrostatic descriptors. The finding that the steric descriptor is the largest contributor to the CoMSIA QSAR model is consistent with the observation that more than half of the binding site area is occupied by steric regions.

Keywords: 3D QSAR, CoMSIA, triazoles, novel heterocycles

Procedia PDF Downloads 432
5281 Excitation Modeling for Hidden Markov Model-Based Speech Synthesis Based on Wavelet Analysis

Authors: M. Kiran Reddy, K. Sreenivasa Rao

Abstract:

The conventional Hidden Markov Model (HMM)-based speech synthesis system (HTS) uses only a pulse excitation model, which differs significantly from the natural excitation signal. Hence, buzziness can be perceived in the speech generated using HTS. This paper proposes an efficient excitation modeling method that can significantly reduce buzziness and improve the quality of HMM-based speech synthesis. The proposed approach models the pitch-synchronous residual frames extracted from the residual excitation signal. Each pitch-synchronous residual frame is parameterized using 30 wavelet coefficients, which are found to accurately capture the perceptually important information in the residual waveform. In the synthesis phase, the residual frames are reconstructed from the generated wavelet coefficients and pitch-synchronously overlap-added to generate the excitation signal. The proposed excitation modeling method is integrated into an HMM-based speech synthesis system. Evaluation results indicate that the speech synthesized with the proposed excitation model is significantly better than speech generated using state-of-the-art excitation modeling methods.

Keywords: excitation modeling, hidden Markov models, pitch-synchronous frames, speech synthesis, wavelet coefficients

Procedia PDF Downloads 235
5280 The Relationship between the Use of Social Networks with Executive Functions and Academic Performance in High School Students in Tehran

Authors: Esmail Sadipour

Abstract:

The use of social networks is increasing day by day in all societies. The purpose of this research was to examine the relationship between the use of social networks (Instagram, WhatsApp, and Telegram) and executive functions and academic performance in first-year female high school students. This research was applied in purpose, quantitative in data type, and correlational in technique. The population consisted of all first-year female high school students in district 2 of Tehran. Using Green's formula, a sample size of 150 was determined and selected by the cluster random method: from the 17 high schools in district 2 of Tehran, 5 high schools were selected by simple random sampling, one class was then selected from each, and a total of 155 students were included. A researcher-made questionnaire was used to measure the use of social networks, the Barkley test (2012) was used for executive functions, and last semester's GPA was used for academic performance. Pearson's correlation coefficient and multivariate regression were used to analyze the data. The results showed a negative relationship between the amount of social network use and self-control, self-motivation, and time self-management. In other words, the greater the use of social networks, the weaker the students' executive functions of self-control, self-motivation, and time self-management. Also, with increased use of social networks, the students' academic performance decreased.
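The Pearson correlation step used in the analysis can be sketched as follows. The hours-of-use and self-control scores are invented for illustration and merely mimic the negative relationship the study reports; they are not data from the study.

```python
import math

# Sketch of Pearson's correlation coefficient on made-up data:
# daily hours on social networks vs. a self-control score.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

hours = [1, 2, 3, 4, 5]              # illustrative usage hours
self_control = [9, 8, 6, 5, 3]       # illustrative self-control scores
print(round(pearson_r(hours, self_control), 3))  # ~ -0.993
```

A value near -1, as here, corresponds to the strong negative relationship between usage and executive-function scores described in the abstract.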

Keywords: social networks, executive function, academic performance, working memory

Procedia PDF Downloads 74
5279 A Multilevel Approach of Reproductive Preferences and Subsequent Behavior in India

Authors: Anjali Bansal

Abstract:

Reproductive preferences mainly concern two questions: when a couple wants children, and how many they want. Questions related to these desires are often included in fertility surveys, as they can provide relevant information on subsequent behavior. The aim of the study is to observe whether respondents' answers to these questions changed over time. We also tried to identify the socio-economic and demographic factors associated with the stability (or instability) of fertility preferences. For this purpose, we used data from IHDS1 (2004-05) and the follow-up survey IHDS2 (2011-12) and applied bivariate, multivariate, and multilevel repeated-measures analyses to find the consistency between responses. The analysis shows that women's preferences change over time: 52% of women were not consistent in their desired family size, and large inconsistencies were found in the desire to continue childbearing. To get a better overview of these inconsistencies, we computed the intra-class correlation (ICC), which quantifies the consistency of individuals' fertility responses across the two time periods. We also found that the husband's desire for an additional child, specifically a male offspring, contributes to these variations. Our findings lead us to the conclusion that, in India, individuals' fertility preferences changed over the seven-year period, as the intra-class correlation turned out to be very small, reflecting variation within individuals. Concerted efforts should therefore be made to educate people and conduct motivational programs promoting family planning for family welfare.

Keywords: change, consistency, preferences, over time

Procedia PDF Downloads 153
5278 Naked Machismo: Uncovered Masculinity in an Israeli Home Design Campaign

Authors: Gilad Padva, Sigal Barak Brandes

Abstract:

This research centers on an unexpected Israeli advertising campaign for Elemento, a local furniture company, which eroticizes male nudity. The campaign comprises a series of printed ads depicting naked male models in effeminate positions, published in Haaretz, a small but highly prestigious daily newspaper typically read by urban middle-upper-class left-wing Israelis. Apparently, this campaign embodies an alternative masculinity that challenges the prevalent machismo in Israeli society and advertising. Although some of the ads focus on young men in effeminate positions, they never expose their genitals and anuses, and their bodies are never permeable. The 2010s Elemento male models seemingly contrast with conventional representations of manhood in contemporary mainstream advertising. They display a somewhat inactive, passive, and self-indulgent masculinity that involves 'conspicuous leisure'. In the process of commodity fetishism, the advertised furniture is emptied of the original meaning of its production and then filled with new meanings in ways that both mystify the product and turn it into a fetish object. Yet our research critically reconsiders this sensational campaign as a sophisticated patriarchal parody that does not subvert but rather reconfirms and even fetishizes patriarchal premises; it parodies effeminacy rather than the prevalent (Israeli) machismo. Following Pierre Bourdieu's politics of cultural taste, our research reconsiders and criticizes the male models' domesticated masculinity in a fantasized, cosmopolitan, hedonistic habitus. Notwithstanding, we suggest that the Elemento campaign, despite its conformity, does question some Israeli and global axioms about gender roles, corporeal ideologies, idealized bodies, and domesticated phalluses and anuses. Although the naked truth is uncovered by this campaign, it does erect a vibrant discussion of contemporary masculinities and their exploitation in current mass consumption.

Keywords: male body, campaign, advertising, gender studies, men's studies, Israeli culture, masculinity, parody, effeminacy

Procedia PDF Downloads 200
5277 Healthy Lifestyle and Risky Behaviors amongst Students of Physical Education High Schools

Authors: Amin Amani, Masomeh Reihany Shirvan, Mahla Nabizadeh Mashizi, Mohadese Khoshtinat, Mohammad Elyas Ansarinia

Abstract:

The purpose of this study is to examine the relationship between a healthy lifestyle and risky behavior in physical education students of Bojnourd schools. The study sample consisted of teenagers studying in the second and third grades of Bojnourd's high schools. Using stratified sampling, 604 second-grade students and 600 third-grade students from physical education schools in Bojnourd were tested. For sample selection, the population was divided into four areas: north, east, west, and south. Then, according to the number of students in each area, the sample size for each stratum was determined. Two questionnaires were used to collect data, covering three parts: demographic data, the Iranian teenagers' risk-taking scale (IARS), and prevention methods with emphasis on the importance of the family's role. Central tendency and dispersion indices such as the standard deviation, multivariate analysis of variance, and multivariate regression analysis were used. The results showed that the observed F was significant (P ≤ 0.01) and that 21% of the variance in risky behavior was explained by lack of awareness. Given the significance of the regression, the coefficients of risky behavior in the prediction equation showed that each type of teenage risky behavior can have an impact on a healthy lifestyle.

Keywords: healthy lifestyle, high-risk behavior, students, physical education

Procedia PDF Downloads 178
5276 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, on the basis of which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process includes the following stages, which are reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, yielding a reliability estimation polynomial and a quantitative reliability estimate; 6) calculation of the weights of the elements. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with complex structure is produced. As a result, structural analysis of systems and the research and design of systems with optimal structure are carried out.
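Stages 2-5 above amount to turning the minimal success paths into a reliability polynomial. Assuming independent elements, the same number the ODNF polynomial produces can be obtained by inclusion-exclusion over unions of the paths, as in the small Python sketch below; the two-path example system (two parallel branches in series with a shared element) is illustrative, not from the paper.

```python
from itertools import combinations

# Sketch: system reliability from minimal success paths, by
# inclusion-exclusion over path unions, assuming independent elements.

def system_reliability(paths, p):
    """paths: list of sets of element ids; p: element id -> reliability."""
    total = 0.0
    for k in range(1, len(paths) + 1):
        for subset in combinations(paths, k):
            union = set().union(*subset)
            prob = 1.0
            for e in union:
                prob *= p[e]
            total += (-1) ** (k + 1) * prob
    return total

p = {1: 0.9, 2: 0.8, 3: 0.95}
paths = [{1, 3}, {2, 3}]          # element 3 in series with (1 parallel 2)
print(round(system_reliability(paths, p), 4))  # 0.931
```

The inclusion-exclusion expansion grows exponentially in the number of paths, which is precisely why the orthogonalization algorithm of stage 4 is valuable: an orthogonal DNF lets the reliability polynomial be read off term by term without overlap corrections.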

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element

Procedia PDF Downloads 55