Search results for: large language models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15822

11592 Study of the Influence of Eccentricity Due to Configuration and Materials on Seismic Response of a Typical Building

Authors: A. Latif Karimi, M. K. Shrimali

Abstract:

Seismic design is a critical stage in the design and construction of a building. It includes strategies for designing earthquake-resistant buildings to ensure the health, safety, and security of the building occupants and assets. It is therefore very important to understand the behavior of structural members precisely in order to construct buildings that respond better to seismic forces. This paper investigates the behavior of a typical structure when subjected to ground motion. The corresponding mode shapes and modal frequencies are studied to interpret the response of an actual structure using different fabricated models and 3D visual models. In this study, three different structural configurations are subjected to horizontal ground motion, and the effects of “stiffness eccentricity” and the placement of infill walls are examined to determine how each parameter contributes to a building’s response to dynamic forces. The deformation data from lab experiments and the analysis in SAP2000 software are reviewed to obtain the results. This study revealed that the seismic response of a building can be improved by introducing higher deformation capacity into the building. Proper design of infill walls and a symmetrical building configuration are the key factors in building stability during an earthquake.

Keywords: eccentricity, seismic response, mode shape, building configuration, building dynamics

Procedia PDF Downloads 184
11591 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software. However, because static software operates only within strictly regulated rules, it cannot aid customers beyond those limitations. Loan prediction therefore calls for the application of machine learning (ML) techniques. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using Anaconda Navigator and the required ML libraries, the models are created and evaluated with the appropriate metrics. The random forest performed best, with an accuracy of 80.17%, and was therefore integrated into the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, one of the four largest cloud computing providers. Hence, to the best of our knowledge, this research is the first academic work to combine model development and the Django framework with deployment to the Alibaba cloud computing platform.
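The model-selection step this abstract describes, evaluating several candidate classifiers on held-out data and deploying the most accurate one, can be sketched as follows. This is an illustrative sketch only: the labels and predictions are invented toy data, not the paper's loan dataset or its trained models.

```python
# Hedged sketch of comparing candidate classifiers by test-set accuracy.
# Model names mirror the abstract; all predictions are toy data.

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground-truth labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

y_test = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]          # 1 = loan repaid, 0 = default
candidate_predictions = {
    "random_forest":       [1, 0, 1, 1, 0, 1, 0, 1, 1, 1],
    "decision_tree":       [1, 0, 0, 1, 0, 1, 1, 1, 1, 0],
    "knn":                 [1, 1, 1, 0, 0, 1, 0, 1, 1, 1],
    "logistic_regression": [1, 0, 1, 1, 1, 1, 0, 0, 0, 1],
}

scores = {name: accuracy(y_test, pred)
          for name, pred in candidate_predictions.items()}
best = max(scores, key=scores.get)    # the model to wrap in the web app
print(best, scores[best])             # random_forest wins on this toy data
```

In the paper's pipeline, the winning model would then be serialized and served from a Django view; only the selection logic is shown here.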

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 113
11590 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market

Authors: Cristian Păuna

Abstract:

In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investment became the primary tool for profiting from speculation in financial markets. A significant number of traders and private or institutional investors participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed for building a reliable trend line, which is the base for limit conditions and automated investment signals, the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals, how to build limit conditions as a mathematical filter for investment opportunities, and the methodology for integrating all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a risk-to-reward ratio of 1:6.12 was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals.
The general idea sustained by this paper is that the Price Prediction Line model presented is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.

Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex

Procedia PDF Downloads 120
11589 The Efficacy of Clobazam for Landau-Kleffner Syndrome

Authors: Nino Gogatishvili, Davit Kvernadze, Giorgi Japharidze

Abstract:

Background and aims: Landau-Kleffner syndrome (LKS) is a rare disorder with epileptic seizures and acquired aphasia. It usually starts in initially healthy children. The first symptoms are language regression and behavioral disturbances, and the sleep EEG reveals abnormal epileptiform activity. The aim was to discuss the efficacy of Clobazam for Landau-Kleffner syndrome. Case report: We report the case of an 11-year-old boy born after an uneventful pregnancy and delivery. He began to walk at 11 months and to speak in simple phrases at the age of 2.5 years. At the age of 18 months, he had febrile convulsions; at the age of 5 years, the parents noticed language regression, stuttering, and serious behavioral dysfunction, including hyperactivity and temper outbursts. No epileptic seizures were observed. MRI showed no abnormality. Neuropsychological testing revealed verbal auditory agnosia. Sleep EEG showed abundant left fronto-temporal spikes, present over 85% of non-rapid eye movement (non-REM) sleep. Treatment was started with Clobazam. After ten weeks, the EEG had improved, and stuttering and behavior also improved. Results: Since the start of Clobazam treatment, stuttering and behavior have improved. Now, at 11 years old, he is without antiseizure medication. Sleep EEG shows fronto-temporal spikes on the left side over 10-49% of non-REM sleep, as well as bioccipital spikes, slow-wave discharges, and spike-waves. Conclusions: This case provides further support for the efficacy of Clobazam in patients with LKS.

Keywords: Landau-Kleffner syndrome, antiseizure medication, stuttering, aphasia

Procedia PDF Downloads 54
11588 Finite Element Modelling and Optimization of Post-Machining Distortion for Large Aerospace Monolithic Components

Authors: Bin Shi, Mouhab Meshreki, Grégoire Bazin, Helmi Attia

Abstract:

Large monolithic components are widely used in the aerospace industry in order to reduce airplane weight. Milling is an important operation in the manufacturing of monolithic parts. More than 90% of the material may be removed in the milling operation to obtain the final shape, which results in low rigidity and post-machining distortion. Post-machining distortion is the deviation of the final shape from the original design after the clamps are released. It is a major challenge in the machining of monolithic parts, causing billions in economic losses every year. Three sources are directly related to part distortion: initial residual stresses (RS) generated by previous manufacturing processes, machining-induced RS, and the thermal load generated during machining. A finite element model was developed to simulate a milling process and predict the post-machining distortion. In this study, a rolled aluminum plate (AA7175) with a thickness of 60 mm was used for the raw block. The initial residual stress distribution in the block was measured using a layer-removal method. A stress-mapping technique was developed to implement the initial stress distribution into the part; it is demonstrated that this technique significantly accelerates the simulation. Machining-induced residual stresses on the machined surface were measured using an MTS3000 hole-drilling strain-gauge system. The measured RS was applied to the machined surface of a plate to predict the distortion, and the prediction was compared with experimental results. It was found that the effect of the machining-induced residual stress on the distortion of a thick plate is very limited; the distortion can be ignored if the wall thickness is larger than a certain value. The RS generated by the thermal load during machining is another important factor causing part distortion, yet very little research on this topic has been reported in the literature.
A coupled thermo-mechanical FE model was developed to evaluate the thermal effect on the plastic deformation of a plate. A moving heat source with a feed rate was used to simulate the dynamic cutting heat in a milling process. After the heat source passed over the part surface, a thin layer was removed to simulate the cutting operation. The results show that, for different feed rates and plate thicknesses, plastic deformation/distortion occurs only if the temperature exceeds a critical level. It was found that the initial residual stress makes a major contribution to the part distortion; the machining-induced stress has limited influence on the distortion of thin-wall structures once the wall thickness is larger than a certain value; and the thermal load can also generate part distortion when the cutting temperature is above a critical level. The developed numerical model was employed to predict the distortion of a frame part with complex structures, and the predictions were in good agreement with the experimental measurements. By optimizing the position of the part inside the raw plate using the developed numerical models, the part distortion was reduced by 50%.

Keywords: modelling, monolithic parts, optimization, post-machining distortion, residual stresses

Procedia PDF Downloads 38
11587 Improved Elastoplastic Bounding Surface Model for the Mathematical Modeling of Geomaterials

Authors: Andres Nieto-Leal, Victor N. Kaliakin, Tania P. Molina

Abstract:

The nature of most engineering materials is quite complex. It is, therefore, difficult to devise a general mathematical model that covers all possible ranges and types of excitation and behavior of a given material. As a result, the development of mathematical models is based upon simplifying assumptions regarding material behavior. Such simplifications result in some material idealization; for example, one of the simplest material idealizations is to assume that the material behaves elastically. However, soils are nonhomogeneous, anisotropic, path-dependent materials that exhibit nonlinear stress-strain relationships, changes in volume under shear, dilatancy, as well as time-, rate-, and temperature-dependent behavior. Over the years, many constitutive models, possessing different levels of sophistication, have been developed to simulate the behavior of geomaterials, particularly cohesive soils. Early in the development of constitutive models, it became evident that elastic or standard elastoplastic formulations, employing purely isotropic hardening and predicated on the existence of a yield surface surrounding a purely elastic domain, were incapable of realistically simulating the behavior of geomaterials. Accordingly, more sophisticated constitutive models have been developed, for example, bounding surface elastoplasticity. The essence of the bounding surface concept is the hypothesis that plastic deformations can occur for stress states either within or on the bounding surface; thus, unlike classical yield surface elastoplasticity, plastic states are not restricted to those lying on a surface. Elastoplastic bounding surface models have been improved over time; however, there is still a need to improve their capability to simulate the response of anisotropically consolidated cohesive soils, especially the response in extension tests.
Thus, in this work, an improved constitutive model that can more accurately predict diverse stress-strain phenomena exhibited by cohesive soils was developed; in particular, an improved rotational hardening rule that better simulates the response of cohesive soils in extension. The generalized definition of the bounding surface model provides a convenient and elegant framework for unifying various previous versions of the model for anisotropically consolidated cohesive soils. The Generalized Bounding Surface Model for cohesive soils is a fully three-dimensional, time-dependent model that accounts for both inherent and stress-induced anisotropy, employing a non-associative flow rule. The model's numerical implementation in a computer code followed an adaptive multistep integration scheme in conjunction with local iteration and radial return. The one-step trapezoidal rule was used to obtain the stiffness matrix that defines the relationship between the stress increment and the strain increment. After testing the model through extensive comparisons of simulations to experimental data on cohesive soils, it has been shown to give quite good simulations. The new model successfully simulates the response of different cohesive soils, for example, Cardiff Kaolin, Spestone Kaolin, and Lower Cromer Till. The simulated undrained stress paths, stress-strain response, and excess pore pressures are in very good agreement with the experimental values, especially in extension.

Keywords: bounding surface elastoplasticity, cohesive soils, constitutive model, modeling of geomaterials

Procedia PDF Downloads 301
11586 Environmental Effects on Energy Consumption of Smart Grid Consumers

Authors: S. M. Ali, A. Salam Khan, A. U. Khan, M. Tariq, M. S. Hussain, B. A. Abbasi, I. Hussain, U. Farid

Abstract:

The environment and surroundings play a pivotal role in structuring the lifestyle of consumers, and living standards, in turn, affect consumers' energy consumption. In the smart grid paradigm, climatic drifts, weather parameters, and the green environment relate directly to the energy profiles of the various consumers, such as residential, commercial, and industrial. Considering these factors helps policymakers shape utility load curves and optimally manage demand and supply. Thus, there is a pressing need to develop correlation models of load and weather parameters and to critically analyze the factors affecting the energy profiles of smart grid consumers. In this paper, we elaborate on the various environmental and weather factors affecting consumer demand. Moreover, we developed correlation models, namely Pearson, Spearman, and Kendall, relating the dependent (load) parameter to the independent (weather) parameters. Furthermore, we validated our discussion with real data from the state of Texas. The numerical simulations proved the effective relation of climatic drifts with the energy consumption of smart grid consumers.
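The three correlation measures named above can be computed directly; a minimal pure-Python sketch follows, using illustrative temperature/load pairs rather than the Texas data from the study.

```python
import math

def pearson(x, y):
    """Pearson's r: strength of the *linear* association between x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ranks(v):
    """Rank positions (1-based); values here are assumed distinct."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks (monotonic trend)."""
    return pearson(ranks(x), ranks(y))

def kendall(x, y):
    """Kendall's tau: (concordant - discordant) pairs over all pairs."""
    n = len(x)
    con = dis = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            con += s > 0
            dis += s < 0
    return (con - dis) / (n * (n - 1) / 2)

temperature = [18, 21, 25, 28, 31, 34, 37]         # independent (weather), degC
load        = [410, 430, 480, 550, 640, 760, 900]  # dependent (load), kWh

print(pearson(temperature, load))   # strong positive, but below 1 (convex trend)
print(spearman(temperature, load))  # ~1.0: the trend is perfectly monotonic
print(kendall(temperature, load))   # 1.0: every pair is concordant
```

The three measures disagree exactly when the load/weather relation is monotonic but nonlinear, which is why the study computes all of them rather than Pearson alone.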

Keywords: climatic drifts, correlation analysis, energy consumption, smart grid, weather parameter

Procedia PDF Downloads 354
11585 Systems Engineering and Project Management Process Modeling in the Aeronautics Context: Case Study of SMEs

Authors: S. Lemoussu, J. C. Chaudemar, R. A. Vingerhoeds

Abstract:

The aeronautics sector is currently experiencing unprecedented growth, largely due to innovative projects. In several cases, such innovative developments are being carried out by Small and Medium-sized Enterprises (SMEs). For instance, in Europe, a handful of SMEs are leading projects like airships, large civil drones, or flying cars. These SMEs all have limited resources, must make strategic decisions, take considerable financial risks, and at the same time must take into account the constraints of safety, cost, time, and performance, like any commercial organization in this industry. Moreover, no international regulations yet fully cover the development and certification of such projects. The absence of a precise and sufficiently detailed regulatory framework requires very close contact with regulatory bodies, but SMEs do not always have sufficient resources and internal knowledge to handle this complexity and to discuss these issues. This poses additional challenges for those SMEs that have system integration responsibilities and that must provide all the necessary means of compliance to demonstrate their ability to design, produce, and operate airships with the expected level of safety and reliability. The final objective of our research is thus to provide a methodological framework supporting SMEs in their development, taking into account recent innovations and the institutional rules of the sector. We aim to contribute to this problem by developing a specific Model-Based Systems Engineering (MBSE) approach, in which airspace regulations, aeronautics standards, and international norms on systems engineering are formalized in a set of models. This paper presents the ongoing research project combining Systems Engineering and Project Management process modeling, taking into account the metamodeling problem.

Keywords: aeronautics, certification, process modeling, project management, SME, systems engineering

Procedia PDF Downloads 152
11584 Strategies for the Optimization of Ground Resistance in Large Scale Foundations for Optimum Lightning Protection

Authors: Oibar Martinez, Clara Oliver, Jose Miguel Miranda

Abstract:

In this paper, we discuss the standard improvements that can be made to reduce the earth resistance in difficult terrains for optimum lightning protection, their practical limitations, and how the modeling can be refined for accurate diagnostics and ground resistance minimization. Ground resistance minimization can be approached in three different ways: burying vertical electrodes connected in parallel, burying horizontal conductive plates or meshes, or modifying the terrain itself, either by replacing the terrain material in a large volume or by adding earth-enhancing compounds. The use of vertical electrodes connected in parallel poses several practical limitations. To prevent loss of effectiveness, a minimum distance must be kept between electrodes, typically around five times the electrode length; otherwise, the overlapping of the local equipotential lines around each electrode reduces the efficiency of the configuration. Adding parallel electrodes reduces the resistance and facilitates the measurement, but the basic parallel-resistor formula of circuit theory will always underestimate the final resistance; numerical simulation of the equipotential lines around the electrodes overcomes this limitation. The resistance of a single electrode is always proportional to the soil resistivity. Electrodes are usually installed with a backfilling material of high conductivity, which increases the effective diameter. However, the improvement is marginal, since the electrode diameter enters the estimation of the ground resistance only through a logarithmic function. Substances used for efficient chemical treatment must be environmentally friendly and must feature stability, high hygroscopicity, low corrosivity, and high electrical conductivity. A number of earth-enhancement materials are commercially available; many consist of carbon-based materials or clays like bentonite.
These materials can also be used as backfilling materials to reduce the resistance of an electrode. Chemical treatment of soil raises environmental issues: some products contain copper sulfate or other copper-based compounds, which may not be environmentally friendly. Carbon-based compounds are relatively inexpensive and have very low resistivities, but they suffer from corrosion issues; typically, the carbon can corrode and destroy a copper electrode in around five years, and these compounds raise potential environmental concerns as well. Some earthing-enhancement materials contain cement, which, after installation, acquires properties very close to those of concrete; this prevents the earthing-enhancement material from leaching into the soil. After analyzing different configurations, we conclude that a buried conductive ring with vertical electrodes connected periodically should be the optimum baseline solution for the grounding of a large structure installed on high-resistivity terrain. To show this, a practical example is explained in which we simulate the ground resistance of a conductive ring buried in a terrain with a resistivity on the order of 1 kOhm·m.
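Two quantitative claims in this abstract, that a single electrode's resistance is proportional to soil resistivity and that the diameter enters only logarithmically, can be illustrated with the classic formula for a vertical rod in uniform soil, R = ρ(ln(4L/d) − 1)/(2πL) (Dwight's approximation; the rod dimensions below are illustrative, not taken from the paper):

```python
import math

def rod_resistance(rho, length, diameter):
    """Ground resistance (ohm) of one vertical rod in uniform soil.
    Dwight's approximation: R = rho * (ln(4L/d) - 1) / (2*pi*L)."""
    return rho / (2 * math.pi * length) * (math.log(4 * length / diameter) - 1)

base = rod_resistance(1000.0, 3.0, 0.016)   # 1 kOhm*m soil, 3 m rod, 16 mm dia
thick = rod_resistance(1000.0, 3.0, 0.064)  # backfill quadruples the diameter

# Doubling the soil resistivity doubles the resistance exactly...
print(rod_resistance(2000.0, 3.0, 0.016) / base)   # -> 2.0
# ...while quadrupling the diameter removes only about a quarter of it.
print(round(1.0 - thick / base, 2))                # -> 0.25
```

This is why the abstract calls backfill improvement "marginal": even a large change in effective diameter is flattened by the logarithm, whereas any change in soil resistivity passes through linearly.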

Keywords: grounding improvements, large scale scientific instrument, lightning risk assessment, lightning standards

Procedia PDF Downloads 121
11583 Research on Natural Lighting Design of Atriums Based on Energy-Saving Aim

Authors: Fan Yu

Abstract:

An atrium is a place for natural climate exchange between the indoor and outdoor spaces of a building, and it plays an active role in the overall energy conservation, climate control, and environmental purification of buildings. Its greatest contribution is serving as a natural light collector and distributor, solving the problem of natural lighting in large, deep spaces. In practice, however, the atrium often causes excess energy consumption due to improper design, given its large size and extensive use of glass. With building energy conservation as its goal, this paper emphasizes the significance of the natural lighting of atriums. Through literature research, case analysis, and other methods, four factors are studied: the light transmittance through the top of the atrium, the geometric proportions of the atrium space, the size and position of windows, and the material of the wall surfaces in the atrium; the influence of different architectural compositions on the natural light distribution of the atrium is discussed. Based on the analysis of relevant cases, it is proposed that, when designing the natural lighting of an atrium, attention should be paid to the height and width of the atrium, the atrium walls should have rough surfaces, and the top-level windows of the atrium should be minimized, in order to introduce more natural light into the building and achieve the goal of energy conservation.

Keywords: energy conservation, atrium, natural lighting, architectural design

Procedia PDF Downloads 172
11582 Characterising Stable Model by Extended Labelled Dependency Graph

Authors: Asraful Islam

Abstract:

The extended dependency graph (EDG) is a state-of-the-art isomorphic graph representation of normal logic programs (NLPs) that can characterize the consistency of NLPs by graph analysis. To construct the vertices and arcs of an EDG, additional renaming atoms and rules beyond those the given program provides are used, resulting in higher space complexity compared to the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and we prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare the space complexity; the results show that the ELDG uses less memory to store nodes, arcs, and cycles than the EDG. To exhibit the desirability of the ELDG, firstly, the stable models of the kernel form of an NLP are characterized by the admissible coloring of the ELDG; secondly, a relation is established between the stable models of a kernel program and the handles of the minimal odd cycles appearing in the corresponding ELDG; thirdly, to the best of our knowledge, for the first time, an inverse transformation from a dependency graph to the represented NLP w.r.t. the ELDG is defined, which enables transferring analytical results from the graph to the program straightforwardly.

Keywords: normal logic program, isomorphism of graph, extended labelled dependency graph, inverse graph transformation, graph colouring

Procedia PDF Downloads 198
11581 Studying the Impact of Soil Characteristics in Displacement of Retaining Walls Using Finite Element

Authors: Mojtaba Ahmadabadi, Akbar Masoudi, Morteza Rezai

Abstract:

In this paper, the effects of soil and wall characteristics on wall displacement were investigated using the finite element method. Thirty-two different models were studied with different parameters. These studies made it possible to calculate the displacement at any height of the wall for frictional-cohesive soils. The main purpose of this research is to determine the soil characteristics that are most effective in reducing wall displacement. Comparing the different models showed that an increase in the internal friction angle, the angle of friction between soil and wall, or the modulus of elasticity reduces the wall displacement, whereas an increase in the unit weight of the soil increases it. Based on the results, all wall displacements were of the overturning type, and the backfill soil bulged. The results show that the internal friction angle and the angle of friction between soil and wall have the highest impact on reducing wall displacement. One advantage of this study is that it takes into account all the parameters of the soil and wall as well as the displacement distribution in the wall and backfill soil. The aim of this study is to identify the conditions that best reduce wall displacement and to describe the displacement distribution in the wall and soil.

Keywords: retaining wall, fem, soil and wall interaction, angle of internal friction of the soil, wall displacement

Procedia PDF Downloads 377
11580 Voice Quality in Italian-Speaking Children with Autism

Authors: Patrizia Bonaventura, Magda Di Renzo

Abstract:

This project aims to measure and assess voice quality in children with autism. Few previous studies have analyzed the voice quality of individuals with autism; those that exist found abnormal voice characteristics, such as a high pitch, a great pitch range, and a sing-song quality. Existing studies did not focus specifically on Italian-speaking children's voices and analyzed only a few acoustic parameters. The present study aimed to gather more data and to perform acoustic analysis of the voices of children with autism in order to identify patterns of abnormal voice features that might shed some light on the causes of the dysphonia and possibly be used to create a pediatric assessment tool for the early identification of autism. The participants were five native Italian-speaking boys with autism between the ages of 4 and 10 years (mean 6.8 ± SD 1.4). The children had a diagnosis of autism, were verbal, and had no comorbid conditions (such as Down syndrome or ADHD). The children's voices were recorded during the production of the sustained vowels [ah] and [ih] and of sentences from the Italian version of the CAPE-V voice assessment test. The following voice parameters, representative of normal quality, were analyzed by acoustic spectrography in Praat: speaking fundamental frequency, F0 range, average intensity, and dynamic range. The results showed that the pitch parameters (speaking fundamental frequency and F0 range), as well as the intensity parameters (average intensity and dynamic range), were significantly different from the corresponding normal reference thresholds. Variability among children was also found, confirming a tendency, revealed in previous studies, toward individual variation in these aspects of voice quality. The results indicate a general pattern of abnormal voice quality characterized by a high pitch and large variations in pitch and intensity. These acoustic voice characteristics found in Italian-speaking autistic children match those found in children speaking other languages, indicating that the autism symptoms affecting voice quality may be independent of the children's native language.

Keywords: autism, voice disorders, speech science, acoustic analysis of voice

Procedia PDF Downloads 52
11579 Relationship of Macro-Concepts in Educational Technologies

Authors: L. R. Valencia Pérez, A. Morita Alexander, Peña A. Juan Manuel, A. Lamadrid Álvarez

Abstract:

This research identifies explanatory variables involved in educational technology and the relationships among them, all encompassed in four macro-concepts: cognitive inequality, economy, food, and language. These provide the guideline for a more detailed knowledge of educational systems, communication and equipment, physical space, and teachers; all of them, interacting with each other, give rise to what is called educational technology management. These elements contribute to a very specific knowledge of communications equipment, networks and computer equipment, systems, and content repositories. The intent is to establish the importance of knowing the global environment in the transfer of knowledge to poor countries, so that it does not diminish their capacity to be authentic or to preserve their cultures, their languages or dialects, their hierarchies, and their real needs; in short, to respect the customs of the different towns, villages, or cities that are to be reached through the use of internationally agreed professional educational technologies. The methodology used in this research is analytical-descriptive, which allows each of the relevant variables to be explained in order to achieve an optimal incorporation of educational technology in a model that gives results in the medium term. The idea is that concepts will be progressively integrated with others of greater coverage until reaching macro-concepts of national coverage that serve as elements of conciliation in the different federal and international reforms.
At the center of the model is educational technology, which is directly related to the concepts contained in factors such as the educational system, communication and equipment, spaces, and teachers, all globally immersed in the macro-concepts of cognitive inequality, economics, food, and language. One of the major contributions of this article is to express this idea as an algorithm that allows the indicator to be evaluated with as little bias as possible, since the other indicators are taken from international reference entities, such as the OECD in the area of education systems, so that they are not influenced by particular political or interest-group pressures. This work opens the way for a relationship among the entities involved, conceptual, procedural, and human, to clearly identify the convergence of their impact on the problem of education and how this relationship can contribute to an improvement; it also shows the possibility of reaching a comprehensive education reform for all.

Keywords: macro-concept relationships, cognitive inequality, economics, food and language

Procedia PDF Downloads 186
11578 Predictive Analysis of the Stock Price Market Trends with Deep Learning

Authors: Suraj Mehrotra

Abstract:

The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It signals whether companies are thriving or in decline, and a thorough understanding of it is important: many companies have whole divisions dedicated to analysing both their own stock and that of rival companies. Linking the world of finance with artificial intelligence (AI), and the stock market in particular, is a relatively recent development. Predicting how stocks will behave, considering all external factors and previous data, has traditionally been a human task. With the help of AI, however, machine learning models can make more complete predictions of financial trends. In the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard to do, and machine learning makes this task considerably easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. Used effectively, such models open new doors in the business and finance world and help companies make better, more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural-network-based approaches, and provides a detailed analysis of these techniques as well as the challenges in predictive analysis. Comparing the testing-set accuracy of four different models (linear regression, neural network, decision tree, and naïve Bayes) on several stocks (Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, JPMorgan Chase, and Johnson & Johnson), the naïve Bayes and linear regression models worked best. On the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model. The training set showed similar results, except that the decision tree model achieved perfect accuracy in its predictions, indicating that it likely overfitted the training set and therefore generalised poorly to the testing set.
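The overfitting signature described above, a decision tree with perfect training accuracy but weaker testing accuracy, can be reproduced on synthetic data. The sketch below uses scikit-learn with made-up "indicator" features and up/down labels, not the paper's actual stock dataset or models.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Synthetic "technical indicator" features and next-day up/down labels.
X = rng.normal(size=(500, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

nb = GaussianNB().fit(X_tr, y_tr)
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

print("NB test accuracy:  ", nb.score(X_te, y_te))
print("Tree train accuracy:", tree.score(X_tr, y_tr))  # 1.0: memorises training set
print("Tree test accuracy: ", tree.score(X_te, y_te))  # lower: overfitting
```

An unconstrained decision tree splits until every training leaf is pure, which is exactly the "perfect training accuracy" the abstract reports; the gap to its testing accuracy is the overfitting being diagnosed.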

Keywords: machine learning, testing set, artificial intelligence, stock analysis

Procedia PDF Downloads 81
11577 Analysis and Modeling of the Building’s Facades in Terms of Different Convection Coefficients

Authors: Enes Yasa, Guven Fidan

Abstract:

Building simulation tools need to evaluate convective heat exchange between the external air and wall surfaces more accurately. Previous analyses have demonstrated the significant effect of convective heat transfer coefficient values on the room energy balance. Some authors have pointed out that the large discrepancies observed between widely used building thermal models can be attributed to the different correlations used to calculate or impose the value of the convective heat transfer coefficients, and numerous researchers have performed sensitivity calculations showing that the choice of convective heat transfer coefficient values can lead to differences of 20% to 40% in energy demand. The thermal losses to the ambient from a building surface or a roof-mounted solar collector represent an important portion of the overall energy balance and depend heavily on wind-induced convection. In an effort to help designers make better use of the correlations available in the literature for the external convection coefficients due to wind, a critical discussion and a suitable tabulation are presented, on the basis of the algebraic form of the coefficients and their dependence upon characteristic length and wind direction, in addition to wind speed. Many research works conducted since the early 1980s have focused on convection heat transfer problems inside buildings. In this context, a Computational Fluid Dynamics (CFD) program has been used to predict external convective heat transfer coefficients at external building surfaces. For the building facade model, the effects of wind speed and of the temperature difference between the surfaces and the external air have been analyzed, showing different heat transfer conditions and coefficients. To provide further information on external convective heat transfer coefficients, a numerical study is presented in this paper, using the commercial Computational Fluid Dynamics package CFX to predict convective heat transfer coefficients at external building surfaces.
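As an illustration of the algebraic forms under discussion (these are commonly quoted McAdams-type textbook fits, not correlations derived in this work), the wind-driven external coefficient is often expressed as a linear function of wind speed at low speeds and a power law above:

```python
def h_external(v):
    """Wind-driven external convective coefficient h [W/(m^2 K)] for wind
    speed v [m/s]: a McAdams-type linear fit for v <= 5 m/s and a power-law
    form above, with the coefficients as commonly quoted in textbooks."""
    if v <= 5.0:
        return 5.7 + 3.8 * v
    return 6.47 * v ** 0.78

for v in (0.0, 2.0, 5.0, 10.0):
    print(f"v = {v:4.1f} m/s  ->  h = {h_external(v):5.1f} W/(m^2 K)")
```

Fits of this kind are what a CFD study such as this one can refine, since they ignore characteristic length and wind direction, the very dependencies the tabulation above discusses.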

Keywords: CFD in buildings, external convective heat transfer coefficients, building facades, thermal modelling

Procedia PDF Downloads 405
11576 Information Technology Approaches to Literature Text Analysis

Authors: Ayse Tarhan, Mustafa Ilkan, Mohammad Karimzadeh

Abstract:

Science was considered part of philosophy in ancient Greece. By the nineteenth century, it was understood that philosophy was very inclusive and that social and human sciences such as literature, history, and psychology should be separated and perceived as autonomous branches of science. The computer, too, was first seen as a tool of mathematical science. Over time, computer science has grown to encompass every area in which technology exists, and its growth compelled its division into different disciplines, just as philosophy had been divided into different branches of science. Now there is almost no branch of science in which computers are not used. One of the newer autonomous disciplines of computer science is digital humanities, and one of its areas is literature. The material of literature is words, and thanks to software tools created with programming languages, analyses that would take a literature researcher months to complete by hand can be carried out quickly and objectively. This article introduces three different tools, created with the programming languages Python and R, that literary researchers can use in their work. The purpose of introducing these tools is to set an example for the future development of dedicated tools or programs for Ottoman language and literature and to support such initiatives. The first example is the stylometry tool developed in R. The second is the Metrical Tool, developed in Python, which is used to measure metrical data in poems. The last literature analysis tool introduced in this article is Voyant Tools, a multifunctional and easy-to-use tool.
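To give a flavor of what such tools compute, the minimal sketch below (hypothetical code, not taken from any of the tools named above) derives two staple stylometric measures, word frequencies and the type-token ratio, in Python:

```python
from collections import Counter
import re

def word_tokens(text):
    """Lowercase word tokens; a deliberately simple tokenizer."""
    return re.findall(r"[a-zA-Z']+", text.lower())

def type_token_ratio(text):
    """Distinct words divided by total words: a crude richness measure."""
    toks = word_tokens(text)
    return len(set(toks)) / len(toks)

sample = "the cat sat on the mat and the cat slept"
print(Counter(word_tokens(sample)).most_common(2))  # [('the', 3), ('cat', 2)]
print(round(type_token_ratio(sample), 2))           # 0.7
```

Real stylometry tools build on exactly these frequency profiles, comparing them across authors or texts to attribute or characterise style.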

Keywords: digital humanities, literature, information technologies, stylometry, the Metrical Tool, Voyant Tools

Procedia PDF Downloads 135
11575 Using Seismic and GPS Data for Hazard Estimation in Some Active Regions in Egypt

Authors: Abdel-Monem Sayed Mohamed

Abstract:

Egypt's rapidly growing development is accompanied by rising living standards, particularly in its urban areas. However, there is limited experience in Egypt in quantifying the sources of risk and in designing efficient strategies to avert the serious impacts of earthquakes. Both the historical record and recent instrumental data show several seismo-active regions in Egypt where significant earthquakes have occurred. Owing to Egypt's particular tectonic features, the Aswan, Greater Cairo, Red Sea, and Sinai Peninsula regions are territories of high seismic risk that have to be monitored with up-to-date technologies. Investigation and interpretation of the seismic events led to an evaluation of the seismic hazard for disaster prevention and for the safety of densely populated regions and vital national projects such as the High Dam. In addition, for monitoring recent crustal movements, GPS, the most powerful technique of satellite geodesy, is used, with geodetic networks covering these seismo-active regions. The results from the two data sets are compared and combined in order to determine the main characteristics of the deformation and to estimate the hazard for the specified regions. The final output compiled from the seismological and geodetic analyses sheds light on the geodynamic regime of these seismo-active regions and places Aswan and Greater Cairo in the lowest class according to horizontal crustal strain classifications. This work will serve as a basis for the development of so-called catastrophe models and can further be used for catastrophic risk management. It also attempts to evaluate the risk of large catastrophic losses in important regions, including the High Dam, strategic buildings, and archaeological sites. Studies of possible earthquake and loss scenarios are a critical issue for decision making in insurance as part of mitigation measures.
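The b-value listed among the keywords is routinely estimated from an earthquake catalogue with the Aki maximum-likelihood formula, b = log10(e) / (M_mean - M_c), where M_c is the magnitude of completeness. The sketch below uses an illustrative synthetic catalogue, not the Egyptian data analysed in this study.

```python
import math

def b_value_aki(magnitudes, m_c):
    """Aki (1965) maximum-likelihood b-value for events with M >= m_c."""
    ms = [m for m in magnitudes if m >= m_c]
    mean_m = sum(ms) / len(ms)
    return math.log10(math.e) / (mean_m - m_c)

# Illustrative catalogue of magnitudes above a completeness level of 3.0.
catalog = [3.0, 3.2, 3.1, 3.8, 4.5, 3.4, 3.3, 5.1, 3.6, 3.4]
print(round(b_value_aki(catalog, 3.0), 2))  # 0.68 for this synthetic set
```

Tectonically stable regions tend toward higher b-values (relatively more small events), which is one ingredient in the hazard classifications discussed above.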

Keywords: b-value, Gumbel distribution, seismic and GPS data, strain parameters

Procedia PDF Downloads 439
11574 Corporate Governance of State-Owned Enterprises: A Comparative Analysis

Authors: Adeyemi Adebayo, Barry Ackers

Abstract:

This paper comparatively analyses the corporate governance of SOEs in South Africa and Singapore in the context of the World Bank's framework for corporate governance of SOEs. This framework ensures that the analysis holistically covers the key aspects of corporate governance of SOEs in these states. In order to ground our understanding of the paths taken by SOEs in each state, the paper presents the evolution and reforms of SOEs in the two states before analyzing key aspects of their corporate governance. The analysis shows that even though SOEs in South Africa and Singapore are comparable in a number of ways, there are notable differences, the main one being their organizing model. The analysis further shows, among other findings, that SOE boards in Singapore are better remunerated and, even though some board members are politically connected, are better constituted on the basis of skills and experience than SOE boards in South Africa. Overall, the analysis opens up new debates and concludes by providing avenues for further research.

Keywords: corporate governance, comparative corporate governance, corporate governance framework, government business enterprises, government linked companies, organizing models, ownership models, state-owned companies, state-owned enterprises

Procedia PDF Downloads 203
11573 Developing a Secure Iris Recognition System by Using Advance Convolutional Neural Network

Authors: Kamyar Fakhr, Roozbeh Salmani

Abstract:

Alphonse Bertillon developed the first biometric security system in the 1800s. Today, many governments and giant companies are considering or have procured biometrically enabled security schemes. The iris is a kaleidoscope of patterns and colors, and each individual holds a set of irises more unique than their thumbprint. Every day, giant companies like Google and Apple experiment with reliable biometric systems. Yet, after almost 200 years of improvement, face ID does not work with masks, grants access to fake 3D images, and there is still no global use of biometric recognition systems as a national identity (ID) card. The goal of this paper is to demonstrate the advantages of iris recognition over other biometric recognition systems. It makes two contributions: first, it illustrates how a very large amount of internet fraud and cyber abuse happens due to bugs in face recognition systems, drawing on a very large dataset of 3.4M people; second, it discusses how establishing a secure global network of iris recognition devices connected to authoritative convolutional neural networks could be the safest solution to this dilemma. Another aim of this study is to provide a system that prevents infiltration caused by cyber-attacks and blocks all wireframes to the data until the main user ceases the procedure.

Keywords: biometric system, convolutional neural network, cyber-attack, secure

Procedia PDF Downloads 199
11572 A Framework for Chinese Domain-Specific Distant Supervised Named Entity Recognition

Authors: Qin Long, Li Xiaoge

Abstract:

Knowledge graphs have become a new form of knowledge representation. However, there is no consensus regarding a plausible definition of entities and relationships in domain-specific knowledge graphs, and, owing to several limitations and deficiencies, existing approaches to recognising domain-specific entities and relationships are far from perfect. In particular, named entity recognition in Chinese domains is a critical task for natural language processing applications, but a bottleneck for Chinese named entity recognition in new domains is the lack of annotated data. To address this challenge, a domain-specific distant supervised named entity recognition framework is proposed. The framework is divided into two stages: first, a distant supervised corpus is generated by an entity linking model based on a graph attention neural network; second, the generated corpus is used as input to train the distant supervised named entity recognition model, which then extracts named entities. The linking model is verified on the CCKS2019 entity linking corpus, where its F1 value is 2% higher than that of the benchmark method. A re-pre-trained BERT language model is added to the benchmark method, and the results show that it is better suited to distant supervised named entity recognition tasks. Finally, the framework is applied to the computer domain, and the results show that it can successfully extract domain named entities.
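The corpus-generation idea behind distant supervision can be sketched in a few lines: entity names drawn from a knowledge base are matched against raw text to auto-produce BIO-tagged training data, with no human annotation. The gazetteer entries and sentence below are illustrative, not taken from the framework described above.

```python
def distant_label(tokens, gazetteer):
    """Auto-label tokens with BIO tags by longest-match lookup in a gazetteer."""
    labels = ["O"] * len(tokens)
    for i in range(len(tokens)):
        for j in range(len(tokens), i, -1):        # try the longest span first
            if " ".join(tokens[i:j]) in gazetteer:
                labels[i] = "B-TERM"
                for k in range(i + 1, j):
                    labels[k] = "I-TERM"
                break
    return labels

gaz = {"graph attention network", "named entity recognition"}
toks = "we train a graph attention network model".split()
print(distant_label(toks, gaz))
# ['O', 'O', 'O', 'B-TERM', 'I-TERM', 'I-TERM', 'O']
```

Labels produced this way are noisy (lookup cannot disambiguate context), which is precisely why the framework pairs the matching step with a learned entity linking model.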

Keywords: distant named entity recognition, entity linking, knowledge graph, graph attention neural network

Procedia PDF Downloads 80
11571 Error Amount in Viscoelasticity Analysis Depending on Time Step Size and Method Used in ANSYS

Authors: A. Fettahoglu

Abstract:

The theory of viscoelasticity is used by many researchers to represent the behavior of materials such as pavements on roads or bridges. Several earlier studies used analytical methods and rheology to predict the behavior of simple material models. Today, more complex engineering structures are analyzed using the Finite Element Method, in which material behavior is embedded by means of three-dimensional viscoelastic material laws. As a result, structures of unusual geometry and domain, such as bridge pavements, can be analyzed by means of the Finite Element Method and three-dimensional viscoelastic equations. In the scope of this study, the rheological models embedded in ANSYS, namely generalized Maxwell elements and Prony series, the two methods ANSYS uses to represent viscoelastic material behavior, are presented explicitly. Subsequently, a practical problem with an analytical solution given in the literature is used to verify the applicability of the viscoelasticity tool embedded in ANSYS. Finally, the amount of error in the ANSYS results is compared with the analytical results to indicate the influence of the method used and the time step size.
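The Prony-series form mentioned above represents the relaxation modulus as G(t) = G_inf + sum_i G_i exp(-t / tau_i), each (G_i, tau_i) pair corresponding to one arm of the generalized Maxwell model. The minimal evaluation below uses made-up coefficients, not material values from this study.

```python
import math

def prony_modulus(t, g_inf, terms):
    """Relaxation modulus G(t) = G_inf + sum_i G_i * exp(-t / tau_i),
    where terms is a list of (G_i, tau_i) Maxwell-arm pairs."""
    return g_inf + sum(g * math.exp(-t / tau) for g, tau in terms)

# Illustrative two-arm generalized Maxwell model (invented coefficients).
terms = [(40.0, 0.1), (25.0, 10.0)]
print(prony_modulus(0.0, 10.0, terms))           # instantaneous modulus: 75.0
print(round(prony_modulus(1e6, 10.0, terms), 6)) # long-time modulus -> 10.0
```

Because each arm decays on its own time scale tau_i, the choice of analysis time step relative to the smallest tau_i is exactly what drives the error behavior examined in the paper.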

Keywords: generalized Maxwell model, finite element method, Prony series, time step size, viscoelasticity

Procedia PDF Downloads 354
11570 Reliability Analysis of a Life Support System in a Public Aquarium

Authors: Mehmet Savsar

Abstract:

Complex life support systems (LSS) are used in all large commercial and public aquariums in order to keep the fish alive. The reliability of individual pieces of equipment, as well as of the complete system, is extremely important and critical, since the life and safety of valuable fish depend on these life support systems. Failure of a critical device or piece of equipment that has no redundancy has negative consequences and affects life support as a whole. In this paper, we consider a life support system in a large public aquarium at the Kuwait Scientific Center and present a procedure and analysis showing how the reliability of such systems can be estimated using appropriate tools and collected data. We also propose possible improvements to system reliability. In particular, the addition of parallel components and spare parts is considered, and the number of spare parts needed for each component to achieve a required reliability during a specified lead time is calculated. The results show that significant improvements in system reliability can be achieved by operating some LSS components in parallel and by keeping certain numbers of spares in the spare parts inventories. The procedures and results presented in this paper are expected to be useful for aquarium engineers and maintenance managers dealing with LSS.
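The two improvement levers named above can be sketched with standard reliability formulas: parallel redundancy via R = 1 - (1 - r)^n, and spare-part sizing via a Poisson demand model over the resupply lead time. All parameter values below are hypothetical, not figures from the aquarium study.

```python
import math

def parallel_reliability(r, n):
    """Reliability of n identical components in parallel (any one suffices)."""
    return 1.0 - (1.0 - r) ** n

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k + 1))

def spares_needed(failure_rate, lead_time, target):
    """Smallest spare count s with P(demand during lead time <= s) >= target."""
    lam = failure_rate * lead_time        # expected failures during lead time
    s = 0
    while poisson_cdf(s, lam) < target:
        s += 1
    return s

print(parallel_reliability(0.90, 2))      # redundancy lifts 0.90 to 0.99
print(spares_needed(0.002, 500.0, 0.95))  # hypothetical pump: 3 spares for 95%
```

With an expected one failure per lead time (0.002/h over 500 h), three spares already cover about 98% of demand scenarios, which illustrates why modest spare inventories yield large availability gains.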

Keywords: life support systems, aquariums, reliability, failures, availability, spare parts

Procedia PDF Downloads 268
11569 Virtual Modelling of Turbulent Fibre Flow in a Low Consistency Refiner for a Sustainable and Energy Efficient Process

Authors: Simon Ingelsten, Anton Lundberg, Vijay Shankar, Lars-Olof Landström, Örjan Johansson

Abstract:

The flow in a low consistency disc refiner is simulated with the aim of identifying flow structures possibly of importance for a future study to optimise the energy efficiency of refining processes. A simplified flow geometry is used, in which a single groove of a refiner disc is modelled. Two different fibre models are used to simulate turbulent fibre suspension flow in the groove. The first is a Bingham viscoplastic fluid model, in which the fibre suspension is treated as a non-Newtonian fluid with a yield stress. The second is a new model proposed in a recent study, in which the effect of the suspended fibres on the flow is accounted for through a modelled orientation distribution function (ODF). Both models yielded similar results with small differences. Certain flow characteristics that were expected, and that are reported in the literature, were identified; some of these may be important in a future effort to optimise the refiner geometry and increase energy efficiency. Further study with a more detailed flow model is, however, needed for the simulations to yield results valid for quantitative use in such an optimisation study. An outline of the next steps in such a study is proposed.
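The Bingham viscoplastic model mentioned above is typically implemented in CFD codes through a regularised effective viscosity; the Papanastasiou form is used here purely for illustration, and the parameter values are invented rather than taken from the study.

```python
import math

def bingham_viscosity(shear_rate, tau_y, mu_p, m=1000.0):
    """Papanastasiou-regularised Bingham effective viscosity:
    mu_eff = mu_p + tau_y * (1 - exp(-m * gamma_dot)) / gamma_dot,
    where tau_y is the yield stress and mu_p the plastic viscosity."""
    return mu_p + tau_y * (1.0 - math.exp(-m * shear_rate)) / shear_rate

# Illustrative fibre-suspension parameters (not values from this work).
for gd in (0.01, 1.0, 100.0):
    print(gd, round(bingham_viscosity(gd, tau_y=5.0, mu_p=0.02), 4))
```

The regularisation caps the viscosity at low shear rates instead of letting it diverge, which keeps the yield-stress behaviour numerically tractable in a turbulent groove-flow simulation.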

Keywords: disc refiner, fibre flow, sustainability, turbulence modelling

Procedia PDF Downloads 391
11568 Nature of a Supercritical Mesophase

Authors: Hamza Javar Magnier, Leslie V. Woodcock

Abstract:

It has been reported that at temperatures above the critical temperature there is no “continuity of liquid and gas”, as originally hypothesized by van der Waals. Rather, both gas and liquid phases, with their characteristic properties, extend to supercritical temperatures. Each phase is bounded by the locus of a percolation transition, i.e. a higher-order thermodynamic phase change associated with percolation of gas clusters in a large void, or of liquid interstitial vacancies in a large cluster. Between these two phase boundaries, it is reported, there exists a mesophase that resembles an otherwise homogeneous dispersion of gas micro-bubbles in liquid (foam) and a dispersion of liquid micro-droplets in gas (mist). Such a colloidal-like state of a pure one-component fluid represents a hitherto uncharted equilibrium state of matter besides pure solid, liquid or gas. Here we provide compelling evidence, from molecular dynamics (MD) simulations, for the existence of this supercritical mesophase and its colloidal nature. We report preliminary results of computer simulations for a model fluid using a simplistic representation of atoms or molecules, i.e. a hard-core repulsion with an attraction so short-ranged that the atoms are referred to as “adhesive spheres”. Molecular clusters, and hence percolation transitions, are unambiguously defined. Graphics of color-coded clusters show the colloidal characteristics of the supercritical mesophase.
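The unambiguous cluster definition mentioned above can be sketched in a few lines: spheres whose centres lie within a bonding distance are merged with union-find, and the percolation analysis then operates on the resulting cluster sizes. The coordinates and bonding distance below are illustrative, not simulation parameters from this work.

```python
def find(parent, i):
    """Union-find root lookup with path halving."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def clusters(positions, bond):
    """Cluster sizes (descending) for spheres bonded when centres are
    within `bond` of each other; brute-force O(n^2) pair search."""
    n = len(positions)
    parent = list(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            d2 = sum((a - b) ** 2 for a, b in zip(positions[i], positions[j]))
            if d2 <= bond ** 2:
                parent[find(parent, i)] = find(parent, j)
    sizes = {}
    for i in range(n):
        r = find(parent, i)
        sizes[r] = sizes.get(r, 0) + 1
    return sorted(sizes.values(), reverse=True)

pts = [(0, 0, 0), (1.0, 0, 0), (1.9, 0, 0), (5, 5, 5)]
print(clusters(pts, bond=1.1))  # [3, 1]: one trimer and an isolated sphere
```

A percolation transition is then detected when the largest cluster spans the simulation box, which is the criterion underlying the phase boundaries discussed above.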

Keywords: critical phenomena, mesophase, supercritical, square-well, critical parameters

Procedia PDF Downloads 415
11567 Development of a Shape Based Estimation Technology Using Terrestrial Laser Scanning

Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park

Abstract:

The goal of this research is to estimate structural shape change using terrestrial laser scanning. The study develops a data reduction and shape change estimation algorithm for large-capacity scan data. The point cloud of the scan data is converted to voxels and sampled, and a shape estimation technique is studied to detect changes in structure patterns, such as skyscrapers, bridges, and tunnels, based on large point cloud data. The point cloud analysis applies the octree data structure to speed up the post-processing for change detection. The sampled point cloud provides representative values of the shape information and is used as a model for detecting changes in the data structure. The aim of the shape estimation model is to develop a technology that can detect not only gradual but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance.
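The voxel conversion-and-sampling step can be illustrated as follows: scan points are binned into a regular grid and each occupied voxel is reduced to its centroid, shrinking the cloud while keeping a representative value per cell. The coordinates and voxel size here are made up, and a production pipeline would use an octree rather than this flat dictionary.

```python
from collections import defaultdict

def voxel_downsample(points, voxel):
    """Bin 3D points into a grid of cell size `voxel`; return one centroid
    per occupied cell (sorted by cell index for a deterministic order)."""
    bins = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel) for c in p)
        bins[key].append(p)
    return [tuple(sum(c) / len(ps) for c in zip(*ps))
            for key, ps in sorted(bins.items())]

pts = [(0.1, 0.1, 0.0), (0.2, 0.3, 0.0), (2.5, 0.1, 0.0)]
print(voxel_downsample(pts, voxel=1.0))  # two centroids: two occupied voxels
```

Comparing the per-voxel representative values between two scan epochs is then a cheap proxy for detecting shape change without touching the raw point cloud.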

Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement

Procedia PDF Downloads 219
11566 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome

Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler

Abstract:

Deep neural network (DNN) models are being explored in the clinical domain, following their recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to their multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting 1-year mortality of patients who underwent major cardiovascular procedures (MCVPs), using a temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by the Veterans Affairs Informatics and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created from the patient history data in the 2-year window prior to the index MCVP, and a temporal image was created from these variables for each individual patient. To generate the explanation for the DNN model, we defined a new concept called the impact score, based on the impact that the presence or value of a clinical condition has on the predicted outcome. Like the log odds ratios reported by a logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The DNN model achieved an area under the curve (AUC) of 78.5%, while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman's rho = 0.74), which helped validate our explanation.
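The validation step, correlating impact scores with LR log odds ratios via Spearman's rho, can be sketched with the classic rank-difference formula (valid when there are no ties). The score vectors below are illustrative, not values from the study.

```python
def rank(xs):
    """1-based ranks of the values in xs (assumes no ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(xs, ys):
    """Spearman's rho via rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

impact = [0.9, 0.2, 0.5, 0.7, 0.1]   # hypothetical DNN impact scores
log_or = [1.2, 0.1, 0.8, 0.6, -0.3]  # hypothetical LR log odds ratios
print(round(spearman(impact, log_or), 2))  # 0.9
```

A high rho means the two explanation methods rank the clinical conditions similarly even where their magnitudes differ, which is the sense in which the study's rho of 0.74 validates the impact score.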

Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model

Procedia PDF Downloads 139
11565 Analysis of the Development of Communicative Skills After Participating in the Equine-Assisted-Therapy Program Step-By-Step in Communication

Authors: Leticia Souza Guirra, Márcia Eduarda Vieira Ramos, Edlaine Souza Pereira, Leticia Correa Celeste

Abstract:

Introduction: Studies indicate that equine-assisted therapy enables improvements in several areas of functioning that are impaired in children with autism spectrum disorder (ASD), such as social interaction and communication. Objective: The study proposes to analyze the development of dialogic skills of a verbal child with ASD after participation in the equine-assisted therapy program Step By Step in Communication. Method: This is quantitative and qualitative research through a case study of a six-year-old child diagnosed with ASD, belonging to a group of practitioners of the Brazilian National Equine-Assisted Therapy Association. The Behavioral Observation Protocol (PROC) was used to evaluate communicative skills before and after the intervention, which consisted of 24 weekly sessions. Results: All conversational skills increased in frequency, including participation in dialogue and initiation of interaction. The child also more often waited for his turn and answered the interlocutor, while the emission of topics unrelated to the conversation and echolalia decreased significantly after the intervention. Conclusion: The child studied showed improvement in communicative skills after participating in the equine-assisted therapy program Step By Step in Communication. Contributions: This study contributes to a greater understanding of the impact of equine-assisted therapy on the communicative abilities of children with ASD.

Keywords: equine-assisted-therapy, autism spectrum disorder, language, communication, language and hearing sciences

Procedia PDF Downloads 70
11564 QSAR Studies of Certain Novel Heterocycles Derived from Bis-1,2,4-Triazoles as Anti-Tumor Agents

Authors: Madhusudan Purohit, Stephen Philip, Bharathkumar Inturi

Abstract:

In this paper, we report a quantitative structure-activity relationship (QSAR) study of novel bis-triazole derivatives for predicting their activity profile. The full model encompassed a dataset of 46 bis-triazoles. The Tripos SYBYL-X 2.0 program was used to conduct the CoMSIA QSAR modeling. The partial least-squares (PLS) method was used for the statistical analysis and to derive a QSAR model based on the field values of the CoMSIA descriptors. The compounds were divided into test and training sets and evaluated with various CoMSIA parameters to arrive at the best QSAR model. An optimum number of components was first determined separately by cross-validated regression for the CoMSIA model and then applied in the final analysis. A series of parameters was examined, and the best-fit model was obtained using the donor, partition coefficient, and steric parameters. The CoMSIA model demonstrated good statistical results, with a regression coefficient (r²) and a cross-validated coefficient (q²) of 0.575 and 0.830, respectively, and a standard error of 0.16322 for the predicted model. In the CoMSIA model, the steric descriptors make a marginally larger contribution than the electrostatic descriptors. The finding that the steric descriptor is the largest contributor to the CoMSIA QSAR model is consistent with the observation that more than half of the binding site area is occupied by steric regions.

Keywords: 3D QSAR, CoMSIA, triazoles, novel heterocycles

Procedia PDF Downloads 431
11563 Effect of Molecular Weight Distribution on Toughening Performance of Polybutadiene in Polystyrene

Authors: Mohamad Mohsen Yavarizadeh

Abstract:

Polystyrene (PS) and related homopolymers are brittle materials that typically fail in tensile tests at very low strains. These polymers can be toughened by the addition of rubbery particles, which initiate a large number of crazes that produce substantial plastic strain at relatively low stresses. Considerable energy is dissipated in the formation of these crazes, producing a relatively tough material whose impact toughness is more than five times that of pure PS. While cross-linking of the rubbery phase is necessary in this mechanism of toughening, another mechanism has also been introduced, in which low molecular weight liquid rubbers toughen PS when dispersed as small pools in the glassy matrix without any cross-linking. However, this mechanism, which is based on local plasticization, fails to act properly under high-strain-rate deformation, i.e. in impact tests. In this work, the combination of these two mechanisms was explored. To do so, polybutadiene (PB) rubbers with a bimodal molecular weight distribution were prepared, in which comparable fractions of very high and very low molecular weight rubbers were mixed. Incorporation of these materials into the PS matrix in a reactive process resulted in a more significant increase in the toughness of PS. In other words, although low molecular weight PB is ineffective in high-strain-rate impact tests by itself, it showed a significant synergistic effect when combined with high molecular weight PB. Surprisingly, incorporation of just 10% low molecular weight PB doubled the impact toughness of regular high-impact PS (HIPS). It was observed that most of the rubbery particles could initiate crazes. The effectiveness of low molecular weight PB in the impact test was attributed to the low-strain-rate deformation of each individual craze, a consequence of the large number of crazes produced in this material: high molecular weight PB chains make an appropriate dispersion of the rubbery phase possible, creating a large number of crazes in the PS matrix and consequently decreasing the velocity of each craze, so that the low molecular weight PB has enough time to locally plasticize the craze fibrils and enhance energy dissipation.

Keywords: molecular weight distribution, polystyrene, toughness, homopolymer

Procedia PDF Downloads 431