Search results for: computer modelling
3690 Clustering and Modelling Electricity Conductors from 3D Point Clouds in Complex Real-World Environments
Authors: Rahul Paul, Peter McTaggart, Luke Skinner
Abstract:
Maintaining public safety and network reliability are the core objectives of all electricity distributors globally. For many electricity distributors, managing vegetation clearances from their above-ground assets (poles and conductors) is the most important and costly risk mitigation control employed to meet these objectives. Light Detection and Ranging (LiDAR) is widely used by utilities as a cost-effective method to inspect their spatially distributed assets at scale, often captured using high-powered LiDAR scanners attached to fixed-wing or rotary aircraft. The resulting 3D point cloud model is used by these utilities to perform engineering-grade measurements that guide the prioritisation of vegetation cutting programs. Advances in computer vision and machine learning approaches are increasingly applied to increase automation and reduce inspection costs and time; however, real-world LiDAR capture variables (e.g., aircraft speed and height) create complexity, noise, and missing data, reducing the effectiveness of these approaches. This paper proposes a method for identifying each conductor from LiDAR data via clustering methods that can precisely reconstruct conductors in complex real-world configurations in the presence of high levels of noise. It proposes 3D catenary models for individual clusters fitted to the captured LiDAR data points using a least squares method. An iterative learning process is used to identify potential conductor models between pole pairs. The proposed method identifies the optimum parameters of the catenary function and then fits the LiDAR points to reconstruct the conductors.
Keywords: point cloud, LiDAR data, machine learning, computer vision, catenary curve, vegetation management, utility industry
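For illustration, the least-squares catenary fit this abstract describes can be sketched with SciPy's `curve_fit` (the span geometry, noise level, and starting guess below are invented for the example, not taken from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

def catenary(x, a, x0, y0):
    """2D catenary: y = y0 + a * (cosh((x - x0) / a) - 1),
    with catenary parameter a and lowest point (x0, y0)."""
    return y0 + a * (np.cosh((x - x0) / a) - 1)

# Synthetic noisy "LiDAR" points sampled from a known catenary.
rng = np.random.default_rng(0)
x = np.linspace(-40, 40, 200)                    # positions along the span (m)
y_true = catenary(x, a=120.0, x0=5.0, y0=15.0)   # true conductor shape
y_obs = y_true + rng.normal(0, 0.05, x.size)     # measurement noise

# Least-squares fit of the catenary parameters to the point cloud.
params, _ = curve_fit(catenary, x, y_obs, p0=(100.0, 0.0, 10.0))
a_fit, x0_fit, y0_fit = params
```

A full pipeline would first cluster the points per conductor and per span; the fit above is the per-cluster model-fitting step.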
Procedia PDF Downloads 99
3689 Assessment the Correlation of Rice Yield Traits by Simulation and Modelling Methods
Authors: Davood Barari Tari
Abstract:
In order to investigate the correlation of rice traits under different nitrogen management methods by modelling, an experiment was laid out in a rice paddy at an experimental field in the Caspian coastal region from 2013 to 2014. The variety used was Shiroudi, a high-yielding variety. Nitrogen management followed two methods: nitrogen amount at four levels plus a control (30, 60, 90, and 120 kg N ha-1, and control) and nitrogen splitting at four levels (T1: 50% basal + 50% at maximum tillering stage; T2: 33.33% basal + 33.33% at maximum tillering stage + 33.33% at panicle initiation stage; T3: 25% basal + 37.5% at maximum tillering stage + 37.5% at panicle initiation stage; T4: 25% basal + 25% at maximum tillering stage + 50% at panicle initiation stage). Results showed that nitrogen traits, total grain number, filled spikelets, and panicle number per m2 had a significant correlation with grain yield. Results of the calibration and validation of the rice model indicated that the correlation between rice yield and yield components was reproduced accurately. The correlation between panicle length and grain yield was the weakest. Physiological indices were simulated with low accuracy. According to the results, investigating the correlation between physiological, morphological, and phenological rice traits and yield by modelling and simulation methods is very useful.
Keywords: rice, physiology, modelling, simulation, yield traits
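The trait-yield correlation screening this abstract relies on can be sketched with a Pearson correlation matrix (the plot counts and trait values below are illustrative, not the trial's data):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 40  # hypothetical number of plot observations

# Simulate yield components loosely tied to grain yield (illustrative model).
panicles_m2 = rng.normal(400, 40, n)        # panicle number per m2
filled_spikelets = rng.normal(85, 5, n)     # filled spikelets (%)
grain_yield = (0.01 * panicles_m2 + 0.05 * filled_spikelets
               + rng.normal(0, 0.3, n))     # t ha-1, synthetic

# Pearson correlation matrix among the traits, as used to screen
# which yield components correlate with grain yield.
traits = np.vstack([grain_yield, panicles_m2, filled_spikelets])
corr = np.corrcoef(traits)
```

Row/column 0 of `corr` then gives each trait's correlation with grain yield.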
Procedia PDF Downloads 342
3688 Seepage Modelling of Jatigede Dam Towards Cisampih Village Based on Analysis Soil Characteristic Using Method Soil Reaction to Water, West Java Indonesia
Authors: Diemas Purnama Muhammad Firman Pratama, Denny Maulana Malik
Abstract:
The Jatigede Dam has been a mega project in Indonesia since 1963. The area around the dam is geologically complex, with active faults, and landslides can occur. This research focuses on soil testing. Its purpose is to assess the soil quality around the Jatigede Dam as affected by water seepage from the dam, so that seepage modelling of the surrounding area, including Cisampih Village, can be produced. The method of this research is SRW (Soil Reaction to Water). Three samples were taken near the dam. Four parameters determine water seepage: V (the velocity at which the soil releases water), Dl (the ability of the soil to release water), Ds (the ability of the soil to absorb water), and Dt (the ability of the soil to hold water). A further product of the interaction between water and soil is the angle formed between the water flow and the vertical line, called SIAT. SIAT has two types, na1 and na2. The values for the three samples are 280.333 degrees, 270 degrees, and 270 degrees, respectively. The difference is that na1 is the angle of water interaction with respect to Dt, while na2 is the angle of water interaction with respect to Dl and Ds. Calculated with the SRW method, the first to third samples have values of 7, 11.5, and 9. Based on these data, the territory around the Jatigede Dam is interpreted as easily affected by water seepage, because the soil reaction condition is too poor for the soil to hold water.
Keywords: Jatigede Dam, Cisampih village, water seepage, soil quality
Procedia PDF Downloads 374
3687 Training of Future Computer Science Teachers Based on Machine Learning Methods
Authors: Meruert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova
Abstract:
The article highlights and describes the characteristic features of real-time face detection in images and videos using machine learning algorithms. The research work was reviewed with students of the educational programmes "6B01511-Computer Science", "7M01511-Computer Science", "7M01525-STEM Education", and "8D01511-Computer Science" of L.N. Gumilyov Eurasian National University. As a result, the advantages and disadvantages of the Haar Cascade (OpenCV), HoG-SVM (Histogram of Oriented Gradients with a Support Vector Machine), and MMOD CNN (Max-Margin Object Detection convolutional neural network, Dlib) detectors used for face detection were determined. Dlib is a general-purpose, cross-platform software library written in C++; it includes detectors used for face detection. The Haar Cascade algorithm in OpenCV is efficient for fast face detection. The considered work forms a basis for the development of machine learning methods by future computer science teachers.
Keywords: algorithm, artificial intelligence, education, machine learning
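The Haar Cascade detector the abstract compares is built from rectangle features computed over an integral image; a minimal plain-NumPy sketch of that building block (not the OpenCV implementation itself) is:

```python
import numpy as np

def integral_image(img):
    """Cumulative sum over rows and columns of a grayscale image."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, y, x, h, w):
    """Sum of pixels in the h-by-w rectangle with top-left corner (y, x),
    using four lookups in a zero-padded integral image."""
    p = np.pad(ii, ((1, 0), (1, 0)))  # pad so y=0 / x=0 work uniformly
    return p[y + h, x + w] - p[y, x + w] - p[y + h, x] + p[y, x]

def two_rect_haar_feature(img, y, x, h, w):
    """Two-rectangle Haar-like feature: left half minus right half."""
    ii = integral_image(img)
    half = w // 2
    return rect_sum(ii, y, x, h, half) - rect_sum(ii, y, x + half, h, half)

# A window whose left half is bright and right half dark gives a large
# positive response, the kind of contrast pattern cascade stages test for.
img = np.zeros((4, 4))
img[:, :2] = 1.0
response = two_rect_haar_feature(img, y=0, x=0, h=4, w=4)
```

A real cascade chains thousands of such features with learned thresholds; the constant-time rectangle sums are what make the detector fast.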
Procedia PDF Downloads 73
3686 Solution of Singularly Perturbed Differential Difference Equations Using Liouville Green Transformation
Authors: Y. N. Reddy
Abstract:
The class of differential-difference equations which have characteristics of both classes, i.e., delay/advance and singularly perturbed behaviour, is known as singularly perturbed differential-difference equations. The expressions ‘positive shift’ and ‘negative shift’ are also used for ‘advance’ and ‘delay’, respectively. In general, an ordinary differential equation in which the highest-order derivative is multiplied by a small positive parameter and which contains at least one delay/advance is known as a singularly perturbed differential-difference equation. Singularly perturbed differential-difference equations arise in the modelling of various practical phenomena in bioscience, engineering, and control theory, specifically in variational problems, in describing the human pupil-light reflex, in a variety of models for physiological processes or diseases, and in first-exit-time problems in modelling the expected time for the generation of action potentials in nerve cells by random synaptic inputs in dendrites. In this paper, we envisage the use of the Liouville Green transformation to find the solution of singularly perturbed differential-difference equations. First, using Taylor series, the given singularly perturbed differential-difference equation is approximated by an asymptotically equivalent singular perturbation problem. Then the Liouville Green transformation is applied to get the solution. Several model examples are solved, and the results are compared with other methods. It is observed that the present method gives better approximate solutions.
Keywords: difference equations, differential equations, singular perturbations, boundary layer
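For context, the core of the Liouville-Green (WKB) step is the standard textbook result, not this paper's specific scheme: for an equation reduced to the form ε²y″ = f(x) y with f(x) > 0, the transformation yields the approximation

```latex
% Standard Liouville-Green (WKB) approximation for \varepsilon^2 y'' = f(x)\,y, \; f(x) > 0:
y(x) \;\approx\; f(x)^{-1/4}\left[
  C_1 \exp\!\Big(\tfrac{1}{\varepsilon}\!\int^{x}\!\sqrt{f(t)}\,dt\Big)
  + C_2 \exp\!\Big(-\tfrac{1}{\varepsilon}\!\int^{x}\!\sqrt{f(t)}\,dt\Big)\right]
```

where the two exponentials capture the boundary-layer behaviour near the interval ends.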
Procedia PDF Downloads 199
3685 Recommendations for Teaching Word Formation for Students of Linguistics Using Computer Terminology as an Example
Authors: Svetlana Kostrubina, Anastasia Prokopeva
Abstract:
This research presents a comprehensive study of the word formation processes in computer terminology within the English and Russian languages and provides learners with a system of exercises for training these skills. Its originality lies in its comparative approach, which shows both general patterns and specific features of English and Russian computer term word formation. The key point is the development of a system of exercises for training computer terminology based on Bloom’s taxonomy. The data contain 486 units (228 English terms from the Glossary of Computer Terms and 258 Russian terms from the Terminological Dictionary-Reference Book). The objective is to identify the main affixation models in English and Russian computer term formation and to develop exercises. To achieve this goal, the authors employed Bloom’s taxonomy as a methodological framework to create a systematic exercise program aimed at enhancing students’ cognitive skills in analyzing, applying, and evaluating computer terms. The exercises are appropriate for various levels of learning, from basic recall of definitions to higher-order thinking skills, such as synthesizing new terms and critically assessing their usage in different contexts. The methodology also includes: a method of scientific and theoretical analysis for the systematization of linguistic concepts and clarification of the conceptual and terminological apparatus; a method of nominative and derivative analysis for identifying word-formation types; a method of word-formation analysis for organizing linguistic units; a classification method for determining structural types of abbreviations applicable to the field of computer communication; a quantitative analysis technique for determining the productivity of methods for forming abbreviations of computer vocabulary in English and Russian, with tabular data processing for a visual presentation of the results obtained; and a technique of interlingual comparison for identifying common and distinct features of abbreviations of computer terms in Russian and English. The research shows that affixation retains its productivity in English and Russian computer term formation. Bloom’s taxonomy allows us to plan a training program and predict the effectiveness of the compiled program based on an assessment of the teaching methods used.
Keywords: word formation, affixation, computer terms, Bloom's taxonomy
Procedia PDF Downloads 11
3684 Development of Total Maximum Daily Load Using Water Quality Modelling as an Approach for Watershed Management in Malaysia
Authors: S. A. Che Osmi, W. M. F. Wan Ishak, H. Kim, M. A. Azman, M. A. Ramli
Abstract:
Rivers are among the most important water sources for many activities, including industrial and domestic usage such as daily use, transportation, power supply, and recreation. However, increasing activity along rivers has multiplied the sources of pollutants entering the water bodies and degraded river water quality. It has become a challenge to develop effective river management that ensures the water sources of the river are well managed and regulated. In Malaysia, several approaches to river management have been implemented, such as the Integrated River Basin Management (IRBM) program, led by the Department of Irrigation and Drainage (DID), Malaysia, for coordinating the management of resources in a natural environment on a river basin scale to ensure their sustainability. Nowadays, Total Maximum Daily Load (TMDL) is one of the best approaches to river management in Malaysia. TMDL implementation is regulated and practised in the United States. A study on the development of a TMDL for the Malacca River has been carried out through water quality monitoring, the development of a water quality model using the Environmental Fluid Dynamics Code (EFDC), and a TMDL implementation plan. The implementation of the TMDL will help stakeholders and regulators control and improve the water quality of the river. It is a good approach for river management in Malaysia.
Keywords: EFDC, river management, TMDL, water quality modelling
Procedia PDF Downloads 328
3683 Performance of Derna Steam Power Plant at Varying Super-Heater Operating Conditions Based on Exergy
Authors: Idris Elfeituri
Abstract:
In the current study, energy and exergy analyses of a 65 MW steam power plant were carried out. The study investigated the effect of variations in the overall conductance of the super-heater on the performance of an existing steam power plant located in Derna, Libya. The performance of the power plant was estimated by a mathematical model which considers the off-design operating conditions of each component. A fully interactive computer program based on the mass, energy, and exergy balance equations has been developed. The maximum exergy destruction was found in the steam generation unit. A 50% reduction in the design value of the overall conductance of the super-heater decreases the net electrical power generated by at least 13 MW and the overall plant exergy efficiency by at least 6.4%, while increasing the total exergy destruction by at least 14 MW. The results show that the super-heater design and operating conditions play an important role in the thermodynamic performance and fuel utilization of the power plant. Moreover, these considerations are very useful when deciding whether to replace or renovate the super-heater of the power plant.
Keywords: exergy, super-heater, fouling, steam power plant, off-design
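The flow-exergy bookkeeping behind this kind of component-level analysis can be sketched with the textbook definition of specific flow exergy (the steam states below are invented for illustration, not Derna plant data):

```python
def flow_exergy(h, s, h0, s0, T0):
    """Specific flow exergy psi = (h - h0) - T0*(s - s0), relative to the
    dead state (h0, s0) at ambient temperature T0 (textbook definition)."""
    return (h - h0) - T0 * (s - s0)

T0 = 298.15  # dead-state temperature, K

# Illustrative steam states (h in kJ/kg, s in kJ/kg.K) across an adiabatic
# component such as a length of piping: enthalpy falls, entropy rises.
psi_in = flow_exergy(h=3450.0, s=6.80, h0=104.9, s0=0.367, T0=T0)
psi_out = flow_exergy(h=3250.0, s=6.95, h0=104.9, s0=0.367, T0=T0)

# For an adiabatic component with no work, the drop in flow exergy is the
# exergy destroyed (Gouy-Stodola), in kJ per kg of steam.
exergy_destruction = psi_in - psi_out
```

Summing such terms over every component locates the largest destruction, which the study finds in the steam generation unit.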
Procedia PDF Downloads 333
3682 Analyzing the Attitudes of Prep-Class Students at Higher Education towards Computer-Based Foreign Language Education
Authors: Sakine Sincer
Abstract:
In today’s world, the borders between countries are fading and globalization is accelerating. It is an undeniable fact that this trend mostly results from developments and improvements in technology. Technology, which dominates our lives to a great extent, has turned out to be one of the most important resources for building an effective and fruitful educational atmosphere. Nowadays, technology is a significant means of arranging educational activities at all levels of education, whether primary, secondary, or tertiary. This study aims at analyzing the attitudes of prep-class students towards computer-based foreign language education. Within the scope of this study, prep-class students at a university in Ankara, Turkey in the 2013-2014 academic year participated. The participants were asked to fill in the 'Computer-Based Educational Attitude Scale.' The data gathered in this study were analyzed using descriptive statistics (means, standard deviations, percentages) as well as t-tests and ANOVA. At the end of the analysis, it was found that the participants had a highly positive attitude towards computer-based language education.
Keywords: computer-based education, foreign language education, higher education, prep-class
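The t-test/ANOVA comparison the abstract mentions can be sketched with SciPy (the group labels and scores below are synthetic stand-ins, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical attitude scores (1-5 Likert scale) for two groups of
# prep-class students; the grouping variable is invented for illustration.
group_a = rng.normal(4.1, 0.5, 40)  # e.g. students with prior computer experience
group_b = rng.normal(3.8, 0.5, 40)  # e.g. students without

# Independent two-sample t-test: do the mean attitude scores differ?
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# One-way ANOVA generalises the comparison to three or more groups.
group_c = rng.normal(4.0, 0.5, 40)
f_stat, p_anova = stats.f_oneway(group_a, group_b, group_c)
```

A p-value below the chosen significance level (commonly 0.05) would indicate a difference in attitudes between the groups.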
Procedia PDF Downloads 438
3681 Computer-Integrated Surgery of the Human Brain, New Possibilities
Authors: Ugo Galvanetto, Pirto G. Pavan, Mirco Zaccariotto
Abstract:
The discipline of computer-integrated surgery (CIS) will provide equipment able to improve the efficiency of healthcare systems and, more importantly, clinical results. Surgeons and machines will cooperate in new ways that will extend surgeons’ ability to train, plan, and carry out surgery. Patient-specific CIS of the brain requires several steps: 1 - Fast generation of brain models. Based on image recognition of MR images and equipped with artificial intelligence, image recognition techniques should differentiate among all brain tissues and segment them. After that, automatic mesh generation should create the mathematical model of the brain in which the various tissues (white matter, grey matter, cerebrospinal fluid, etc.) are clearly located in the correct positions. 2 - Reliable and fast simulation of the surgical process. Computational mechanics will be the crucial aspect of the entire procedure. New algorithms will be used to simulate the mechanical behaviour of cutting through cerebral tissues. 3 - Real-time provision of visual and haptic feedback. A sophisticated human-machine interface based on ergonomics and psychology will provide the feedback to the surgeon. The present work addresses point 2 in particular. Modelling the cutting of soft tissue in a structure as complex as the human brain is an extremely challenging problem in computational mechanics. The finite element method (FEM), which accurately represents complex geometries and accounts for material and geometrical nonlinearities, is the most used computational tool to simulate the mechanical response of soft tissues. However, the main drawback of FEM lies in the mechanics theory on which it is based, classical continuum mechanics, which assumes matter is a continuum with no discontinuity. FEM must resort to complex tools such as pre-defined cohesive zones, external phase-field variables, and demanding remeshing techniques to include discontinuities. However, all approaches that equip FEM with the capability to describe material separation, such as interface elements with cohesive zone models, X-FEM, element erosion, and phase-field, have drawbacks that make them unsuitable for surgery simulation. Interface elements require a priori knowledge of crack paths. The use of X-FEM in 3D is cumbersome. Element erosion does not conserve mass. The phase-field approach adopts a diffusive crack model instead of describing the true tissue separation typical of surgical procedures. Modelling discontinuities, so difficult for computational approaches based on classical continuum mechanics, is instead easy for novel computational methods based on peridynamics (PD). PD is a non-local theory of mechanics formulated with no use of spatial derivatives. Its governing equations are valid at points or surfaces of discontinuity, and it is therefore especially suited to describing crack propagation and fragmentation problems. Moreover, PD does not require any criterion to decide the direction of crack propagation or the conditions for crack branching or coalescence; in PD-based computational methods, cracks develop spontaneously in the way which is most convenient from an energy point of view. Therefore, in PD computational methods, crack propagation in 3D is as easy as it is in 2D, a remarkable advantage with respect to all other computational techniques.
Keywords: computational mechanics, peridynamics, finite element, biomechanics
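The non-local bond idea at the core of peridynamics can be illustrated with a 1D sketch (bond-based PD, with an invented discretisation; this is not the paper's brain model): every pair of nodes within a horizon forms a bond, and a bond's stretch is what a damage criterion would test to "cut" material without remeshing.

```python
import numpy as np

def bond_stretches(x, u, horizon):
    """Stretch s = (|xi + eta| - |xi|) / |xi| for every bond (pair of nodes
    within the horizon) of a 1D peridynamic bar.
    x: reference node positions, u: displacements, horizon: interaction radius."""
    n = len(x)
    stretches = []
    for i in range(n):
        for j in range(i + 1, n):
            xi = x[j] - x[i]              # reference bond vector
            if abs(xi) > horizon:
                continue                  # outside the non-local horizon
            eta = u[j] - u[i]             # relative displacement
            stretches.append((abs(xi + eta) - abs(xi)) / abs(xi))
    return np.array(stretches)

# Uniform 0.1% stretch of a bar: every bond sees the same stretch. A cut is
# modelled by simply deleting bonds whose stretch exceeds a critical value,
# with no remeshing -- the property the abstract highlights.
x = np.linspace(0.0, 1.0, 11)   # reference positions
u = 0.001 * x                   # uniform-strain displacement field
s = bond_stretches(x, u, horizon=0.31)
```

In a full PD code, each bond also carries a pairwise force proportional to its stretch, and broken bonds simply stop contributing.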
Procedia PDF Downloads 80
3680 Carbon Nanotube Field Effect Transistor - a Review
Authors: P. Geetha, R. S. D. Wahida Banu
Abstract:
The crowning advances in silicon-based electronic technology have dominated the computation world for the past decades. The captivating performance of Si devices lies in the sustained scaling down of their physical dimensions, thereby increasing device density and improving performance. However, fundamental physical, technological, economic, and manufacturing limitations restrict further miniaturization of Si-based devices. The pitfalls of scaling down devices include process variation, short-channel effects, high leakage currents, and reliability concerns. To fix these problems, one must either follow a new concept that manages the current hitches or support the available concept with different materials. The new concepts are spintronics, quantum computation, and two-terminal molecular devices. Otherwise, the presently used, well-known three-terminal devices can be modified with different materials suited to addressing the scaling-down difficulties. The first approach lies in the far future, since it needs considerable effort; the second is the more promising path. Modelling paves the way to knowing not only the current-voltage characteristics but also the performance of new devices. So, it is desirable to model a new device with suitable gate control and project its abilities towards handling high current, high power, and high frequency with short delay and high velocity, together with excellent electronic and optical properties. The carbon nanotube has become a thriving material to replace silicon in nanodevices. Well-planned, optimized utilization of the carbon material leads to many more advantages. The unique nature of this organic material enables developments in almost all fields of application, from the automobile industry to medical science, and especially in electronics, on which the automation industry depends. Much research is being done in this area. This paper reviews the carbon nanotube field effect transistor with various gate configurations, numbers of channel elements, CNT wall configurations, and different modelling techniques.
Keywords: array of channels, carbon nanotube field effect transistor, double gate transistor, gate wrap around transistor, modelling, multi-walled CNT, single-walled CNT
Procedia PDF Downloads 325
3679 Exploring the Applications of Neural Networks in the Adaptive Learning Environment
Authors: Baladitya Swaika, Rahul Khatry
Abstract:
Computer adaptive tests (CATs) are among the most efficient ways of testing the cognitive abilities of students. CATs are based on item response theory (IRT), which performs item selection by maximum-information selection (or selection from the posterior) and ability estimation with maximum-likelihood (ML) or maximum a posteriori (MAP) estimators. This study aims at combining both classical and Bayesian approaches to IRT to create a dataset which is then fed to a neural network that automates the process of ability estimation, and at comparing it to traditional CAT models designed using IRT. The study uses Python as the base coding language, PyMC for statistical modelling of the IRT, and scikit-learn for the neural network implementations. On creation of the model and on comparison, it is found that the neural-network-based model performs 7-10% worse than the IRT model for score estimation. Although it performs poorly compared to the IRT model, the neural network model can be beneficially used in back-ends for reducing time complexity: the IRT model has to re-calculate the ability every time it gets a request, whereas the prediction from a neural network can be done in a single step by an existing trained regressor. This study also proposes a new kind of framework whereby the neural network model could incorporate feature sets other than the normal IRT feature set and use a neural network’s capacity for learning unknown functions to give rise to better CAT models. Categorical features, such as test type, could be learnt and incorporated into IRT functions with the help of techniques like logistic regression, and could be used to learn functions expressed as models which may not be trivial to express via equations. Such a framework, when implemented, would be highly advantageous in psychometrics and cognitive assessments. This study gives a brief overview of how neural networks can be used in adaptive testing, not only by reducing time complexity but also by being able to incorporate newer and better datasets, which would eventually lead to higher-quality testing.
Keywords: computer adaptive tests, item response theory, machine learning, neural networks
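The single-forward-pass speed-up described here can be sketched with synthetic two-parameter-logistic (2PL) IRT data and a scikit-learn regressor (the item counts, parameters, and network size are all invented stand-ins for the study's PyMC/neural-network setup):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
n_items, n_examinees = 20, 2000

# Synthetic 2PL data: abilities theta, item discriminations a, difficulties b.
theta = rng.normal(0, 1, n_examinees)
a = rng.uniform(0.5, 2.0, n_items)
b = rng.normal(0, 1, n_items)
p_correct = 1 / (1 + np.exp(-a * (theta[:, None] - b)))  # 2PL response probability
responses = (rng.random((n_examinees, n_items)) < p_correct).astype(float)

# Train a network to map a full response pattern to the latent ability. Once
# trained, estimating a new examinee's ability is one forward pass, instead
# of a fresh likelihood maximisation per request.
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
net.fit(responses[:1500], theta[:1500])
predicted = net.predict(responses[1500:])
```

Comparing `predicted` against the held-out `theta[1500:]` gives the accuracy gap the study quantifies at 7-10%.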
Procedia PDF Downloads 175
3678 Neural Network Modelling for Turkey Railway Load Carrying Demand
Authors: Humeyra Bolakar Tosun
Abstract:
The transport sector has an undisputed place in human life. The need for transport access increases day by day with the growing population. Rail network extent, urban transport planning, infrastructure improvements, transportation management, and other related areas are key factors, making it quite necessary to improve transportation in our country. In this context, domestic rail freight demand planning plays an important role. Growth in the transportation field has made requirements such as improved transport quality mandatory. The variables used in this study, known and used in earlier studies, are rail freight transport, railway line length, population, and energy consumption. Net rail freight demand was modelled by multiple regression and ANN methods. The dependent variable (output) of the model is net rail freight demand, and six input variables were determined. Outcome values were extracted from the model using the ANN and the regression model. In the regression model, some parameters are considered determinative, and the coefficients of the determinants give meaningful results. As a result, the ANN model was shown to be more successful than the traditional regression model.
Keywords: railway load carrying, neural network, modelling transport, transportation
Procedia PDF Downloads 143
3677 Frequency Recognition Models for Steady State Visual Evoked Potential Based Brain Computer Interfaces (BCIs)
Authors: Zeki Oralhan, Mahmut Tokmakçı
Abstract:
SSVEP-based brain-computer interface (BCI) systems are preferred because of their high information transfer rate (ITR) and practical use. ITR is the parameter describing overall BCI performance; to reach a high ITR, one requirement of a BCI system is high accuracy. In this study, we investigated recognizing the SSVEP in a shorter time and with a lower error rate. In the experiment, there were 8 flickers on a liquid crystal display (LCD). Participants gazed at the flicker with a 12 Hz frequency and 50% duty cycle ratio on the LCD for 10 seconds. During the experiment, EEG signals were acquired via an EEG device. The EEG data were filtered in a preprocessing session. After that, the Canonical Correlation Analysis (CCA), multiset CCA (MsetCCA), phase-constrained CCA (PCCA), and multiway CCA (MwayCCA) methods were applied to the data. The highest average accuracy value was reached when MsetCCA was applied.
Keywords: brain computer interface, canonical correlation analysis, human computer interaction, SSVEP
Procedia PDF Downloads 266
3676 Design and Realization of Computer Network Security Perception Control System
Authors: El Miloudi Djelloul
Abstract:
Based on an analysis of applications of perception control technology to computer network security status and security protection measures, and from the angles of the network physical environment and network software system environmental security, this paper provides a network security perception control solution using the Internet of Things (IoT), telecom, and other perception technologies. The Security Perception Control System operates in the computer network environment, utilizing Radio Frequency Identification (RFID) from the IoT and telecom integration technology for the integrated system design. In the physical network security environment, RFID temperature, humidity, and gas perception technologies are used for surveillance of environmental data; dynamic perception technology is used for the network system security environment; and user-defined security parameters and security logs are used for quick data analysis. Control is extended over the I/O interface, and through the development of an API and AT commands, Computer Network Security Perception Control based on the Internet and GSM/GPRS is achieved, enabling users to carry out interactive perception and control of the network security environment via the Web and e-mail as well as PDA, mobile phone short message, and the Internet. In system testing, real-time security information data perception with a deviation of 3-5% was achieved through a middleware server, proving the feasibility of the Computer Network Security Perception Control System.
Keywords: computer network, perception control system security strategy, Radio Frequency Identification (RFID)
Procedia PDF Downloads 446
3675 Using Flow Line Modelling, Remote Sensing for Reconstructing Glacier Volume Loss Model for Athabasca Glacier, Canadian Rockies
Authors: Rituparna Nath, Shawn J. Marshall
Abstract:
Glaciers are among the most sensitive climatic indicators, as they respond strongly to small climatic shifts. We develop a flow line model of glacier dynamics to simulate the past and future extent of glaciers in the Canadian Rocky Mountains, with the aim of coupling this model within larger-scale regional climate models of glacier response to climate change. This paper focuses on glacier-climate modelling and reconstructions of glacier volume from the Little Ice Age (LIA) to the present for Athabasca Glacier, Alberta, Canada. Glacier thickness, volume, and mass change will be reconstructed using flow line modelling and the examination of different climate scenarios that are able to give good reconstructions of LIA ice extent. With the availability of SPOT 5 imagery, digital elevation models, and the GIS Arc Hydro tool, ice catchment properties (glacier width and LIA moraines) have been extracted using automated procedures. Simulation of glacier mass change will inform estimates of meltwater runoff over the historical period, and model calibration from the LIA reconstruction will aid in future projections of the effects of climate change on glacier recession. Furthermore, the model developed will be effective for further future studies with ensembles of glaciers.
Keywords: flow line modeling, Athabasca Glacier, glacier mass balance, remote sensing, Arc Hydro tool, Little Ice Age
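Flow line models of this kind step ice thickness forward under a mass balance and a flux law; a minimal 1D shallow-ice-approximation sketch (standard illustrative constants and grid, not the study's calibrated model) is:

```python
import numpy as np

# Standard illustrative constants: ice density, gravity, Glen exponent, rate factor.
RHO, G, N, A = 917.0, 9.81, 3, 1.0e-24

def sia_step(H, bed, dx, dt, mass_balance):
    """Advance ice thickness H by one explicit step of dH/dt = b - dq/dx,
    with the shallow-ice flux q = -D * dh/dx and diffusivity D(H, dh/dx)."""
    h = bed + H                                   # ice surface elevation
    dhdx = np.gradient(h, dx)                     # surface slope
    D = (2.0 * A / (N + 2)) * (RHO * G) ** N * H ** (N + 2) * np.abs(dhdx) ** (N - 1)
    q = -D * dhdx                                 # ice flux
    H_new = H + dt * (mass_balance - np.gradient(q, dx))
    return np.maximum(H_new, 0.0)                 # thickness stays non-negative

# Sanity case: a uniform slab on a flat bed with zero mass balance has no
# surface slope, hence no flux, and should not change.
H0 = np.full(50, 100.0)
H1 = sia_step(H0, bed=np.zeros(50), dx=1000.0, dt=3600.0, mass_balance=0.0)
```

Iterating such steps under a climate-driven mass balance, calibrated to reproduce the LIA extent, is the basic machinery behind the volume reconstruction described above.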
Procedia PDF Downloads 268
3674 Development of a Paediatric Head Model for the Computational Analysis of Head Impact Interactions
Authors: G. A. Khalid, M. D. Jones, R. Prabhu, A. Mason-Jones, W. Whittington, H. Bakhtiarydavijani, P. S. Theobald
Abstract:
Head injury in childhood is a common cause of death or permanent disability from injury. However, despite its frequency and significance, there is little understanding of how a child’s head responds during injurious loading. Whilst infant post mortem human subject (PMHS) experimentation is a logical approach to understanding injury biomechanics, it is the authors’ opinion that a lack of subject availability is hindering potential progress. Computer modelling adds great value when considering adult populations; however, its potential remains largely untapped for infant surrogates. The complexities of child growth and development, which result in age-dependent changes in anatomy, geometry, and physical response characteristics, present new challenges for computational simulation. Further geometric challenges are presented by the intricate infant cranial bones, which are separated by sutures and fontanelles and demonstrate a visible fibre orientation. This study presents an FE model of a newborn infant’s head, developed from high-resolution computed tomography scans and informed by published tissue material properties. To mimic the fibre orientation of immature cranial bone, anisotropic properties were applied to the FE cranial bone model, with elastic moduli representing the bone response both parallel and perpendicular to the fibre orientation. Biofidelity of the computational model was confirmed by global validation against published PMHS data, replicating experimental impact tests with a series of computational simulations in terms of head kinematic responses. Numerical results confirm that the FE head model’s mechanical response is in favourable agreement with the PMHS drop test results.
Keywords: finite element analysis, impact simulation, infant head trauma, material properties, post mortem human subjects
Procedia PDF Downloads 326
3673 Numerical Methods versus Bjerksund and Stensland Approximations for American Options Pricing
Authors: Marasovic Branka, Aljinovic Zdravka, Poklepovic Tea
Abstract:
Numerical methods such as binomial and trinomial trees and finite difference methods can be used to price a wide range of options contracts for which no analytical solutions are known. American options are the best-known options of this kind. Besides numerical methods, American options can be valued with approximation formulas, such as the Bjerksund-Stensland formulas from 1993 and 2002. When the value of an American option is approximated by the Bjerksund-Stensland formulas, the computer time spent on the calculation is very short. The computer time spent using numerical methods can vary from less than one second to several minutes or even hours. However, to conduct a comparative analysis of numerical methods and the Bjerksund-Stensland formulas, we limit the computer calculation time of the numerical methods to less than one second. Therefore, we ask the question: which method is most accurate at nearly the same computer calculation time?
Keywords: Bjerksund and Stensland approximations, computational analysis, finance, options pricing, numerical methods
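The numerical side of such a comparison can be sketched with a minimal Cox-Ross-Rubinstein binomial tree for an American put (the parameters below are generic, not the contracts analysed in the paper). Disabling the early-exercise check in the same tree recovers the European price, which the American value must dominate:

```python
import math

def put_binomial(S0, K, r, sigma, T, steps, american=True):
    """Price a put with a Cox-Ross-Rubinstein binomial tree."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs
    values = [max(K - S0 * u**j * d**(steps - j), 0.0) for j in range(steps + 1)]
    # backward induction; optionally take the early-exercise value
    for i in range(steps - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1.0 - p) * values[j])
            if american:
                cont = max(cont, K - S0 * u**j * d**(i - j))
            values[j] = cont
    return values[0]

am = put_binomial(100.0, 100.0, 0.05, 0.2, 1.0, 200, american=True)
eu = put_binomial(100.0, 100.0, 0.05, 0.2, 1.0, 200, american=False)
```

With 200 steps this runs well under a second, which is the computation budget the comparison imposes on the numerical methods.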
Procedia PDF Downloads 456
3672 Building Information Models Utilization for Design Improvement of Infrastructure
Authors: Keisuke Fujioka, Yuta Itoh, Masaru Minagawa, Shunji Kusayanagi
Abstract:
In this study, building information models of underground temporary structures and adjacent embedded pipes were constructed to show the importance of information on underground pipes adjacent to the structures for enhancing the productivity of construction execution. Next, the bar chart used in the actual construction process was employed to make a Gantt chart, and critical path analysis was carried out to show that accurate information on the arrangement of existing underground pipes can be used to enhance the productivity of the construction of underground structures. In the analyzed project, the unforeseen existence of underground pipes did not cause significant construction delay, owing to the management ability of the construction manager. However, in many construction projects in developing countries, the existence of unforeseen embedded pipes often causes substantial delay. Design changes based on uncertainty in the position information of embedded pipes can also be an important risk for contractors in domestic construction. CPM analyses were therefore performed using project-management software for a scenario in which the influence of the tasks causing construction delay was assumed to be more significant. Through the analyses, the efficiency of information management on underground pipes and of BIM analysis in the design stage for workability improvement was indirectly confirmed.
Keywords: building-information modelling, construction information modelling, design improvement, infrastructure
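The critical path method (CPM) used above reduces to a forward pass (earliest start/finish) and a backward pass (latest start/finish) over the task network; tasks with zero float form the critical path. A minimal sketch on a hypothetical pipe-relocation network (not the project's actual schedule):

```python
# hypothetical tasks: name -> (duration in days, predecessors), topologically ordered
tasks = {
    "excavate":       (5, []),
    "locate_pipes":   (3, []),
    "relocate_pipes": (4, ["locate_pipes"]),
    "shoring":        (6, ["excavate", "relocate_pipes"]),
    "structure":      (10, ["shoring"]),
}

def critical_path(tasks):
    # forward pass: earliest start (es) / earliest finish (ef)
    es, ef = {}, {}
    for name, (dur, preds) in tasks.items():
        es[name] = max((ef[p] for p in preds), default=0)
        ef[name] = es[name] + dur
    project_end = max(ef.values())
    # backward pass: latest start (ls) / latest finish (lf)
    ls, lf = {}, {}
    for name in reversed(list(tasks)):
        dur, _ = tasks[name]
        succs = [s for s in tasks if name in tasks[s][1]]
        lf[name] = min((ls[s] for s in succs), default=project_end)
        ls[name] = lf[name] - dur
    # zero total float = critical
    critical = [n for n in tasks if es[n] == ls[n]]
    return project_end, critical

duration, critical = critical_path(tasks)
```

Here the pipe-handling chain, not the excavation itself, drives the 23-day duration, which is exactly the kind of result that makes accurate embedded-pipe information valuable at the design stage.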
Procedia PDF Downloads 308
3671 Invasive Ranges of Gorse (Ulex europaeus) in South Australia and Sri Lanka Using Species Distribution Modelling
Authors: Champika S. Kariyawasam
Abstract:
The distribution of gorse (Ulex europaeus) in South Australia has been modelled using 126 presence-only location records as a function of seven climate parameters. The predicted range of U. europaeus lies mainly along the Mount Lofty Ranges in the Adelaide Hills and on Kangaroo Island. Annual precipitation and yearly average aridity index were the variables contributing most to the final model formulation. The jackknife procedure was employed to identify the contribution of different variables to the gorse model outputs, and response curves were used to predict changes with changing environmental variables. Based on this analysis, it was revealed that the combined effect of one or more variables can have a completely different impact on the model prediction than the original variables on their own. This work also demonstrates the need for a careful approach when selecting environmental variables for projecting correlative models to climatically distinct areas. Maxent proved a robust model when projecting the fitted species distribution model to another area with changing climatic conditions, whereas the generalized linear model, bioclim, and domain models proved less robust in this regard. These findings are important not only for predicting and managing invasive alien gorse in South Australia and Sri Lanka but also in other countries of its invasive range.
Keywords: invasive species, Maxent, species distribution modelling, Ulex europaeus
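Of the compared models, bioclim is the simplest to state: it bounds each climate variable by its range over the presence records and classifies any site inside the resulting box as climatically suitable, which is part of why it transfers poorly to climatically distinct areas. A toy sketch with made-up values (not the study's 126 records or seven parameters):

```python
def bioclim_envelope(presences):
    """Rectilinear climate envelope: per-variable (min, max) over presence records."""
    env = {}
    for var in presences[0]:
        vals = [p[var] for p in presences]
        env[var] = (min(vals), max(vals))
    return env

def suitable(env, site):
    """A site is suitable only if every variable falls inside the envelope."""
    return all(lo <= site[v] <= hi for v, (lo, hi) in env.items())

# hypothetical presence records: annual precipitation (mm) and an aridity index
records = [
    {"precip": 620, "aridity": 0.48},
    {"precip": 710, "aridity": 0.55},
    {"precip": 805, "aridity": 0.61},
]
env = bioclim_envelope(records)
```

A projection site whose precipitation falls outside the training range is rejected outright, however suitable its other variables — the rigidity that Maxent's continuous response curves avoid.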
Procedia PDF Downloads 134
3670 Input-Output Analysis in Laptop Computer Manufacturing
Authors: H. Z. Ulukan, E. Demircioğlu, M. Erol Genevois
Abstract:
The scope of this paper, and the aim of the proposed model, is to apply monetary input-output (I-O) analysis to point out the importance of reusing know-how and other requirements in order to reduce production costs in a manufacturing process for a laptop computer. An I-O approach using the monetary input-output model is employed to demonstrate the impacts of different factors in a manufacturing process. A sensitivity analysis showing the correlation between these different factors is also presented. It is expected that the recommended model would have an advantageous effect on the cost minimization process.
Keywords: input-output analysis, monetary input-output model, manufacturing process, laptop computer
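At the core of any monetary I-O model is the Leontief balance x = Ax + d, solved as x = (I − A)⁻¹d; a sensitivity analysis then perturbs the technical coefficients in A and recomputes x. A two-sector sketch with hypothetical coefficients (not the paper's data):

```python
def leontief_output(A, d):
    """Solve x = A x + d, i.e. x = (I - A)^-1 d, for a 2-sector economy
    by direct 2x2 inversion (A: technical coefficients, d: final demand)."""
    a, b = 1.0 - A[0][0], -A[0][1]
    c, e = -A[1][0], 1.0 - A[1][1]
    det = a * e - b * c
    x0 = (e * d[0] - b * d[1]) / det
    x1 = (-c * d[0] + a * d[1]) / det
    return [x0, x1]

# hypothetical coefficients for a two-sector laptop supply chain:
# A[i][j] = monetary input from sector i per unit output of sector j
A = [[0.1, 0.3],
     [0.2, 0.1]]
d = [100.0, 50.0]   # final demand
x = leontief_output(A, d)
```

The gross outputs exceed final demand because each unit of output itself consumes intermediate inputs; lowering a coefficient in A (e.g. through know-how reuse) shrinks that gap, which is the cost-reduction mechanism the model quantifies.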
Procedia PDF Downloads 391
3669 Evolution of Deformation in the Southern Central Tunisian Atlas: Parameters and Modelling
Authors: Mohamed Sadok Bensalem, Soulef Amamria, Khaled Lazzez, Mohamed Ghanmi
Abstract:
The southern-central Tunisian Atlas presents a typical example of an external zone. It occupies a particular position in the North African chains: firstly, it is the eastern limit of the atlassic structures; secondly, it forms the boundary between the belt structures to the north and the stable Saharan platform to the south. The study of the evolution of deformation is based on several methods, classical as well as numerical. The principal parameters controlling the genesis of folds in the southern-central Tunisian Atlas are the reactivation of pre-existing faults during the later compressive phase, the evolution of the decollement level, and the relation between thin- and thick-skinned deformation. One of the principal characteristics of the southern-central Tunisian Atlas is the variation of belt structure directions: the NE-SW direction, named the atlassic direction in Tunisia; the NW-SE direction carried along the Gafsa fault (the eastern limit of the southern atlassic accident); and the E-W direction defined in the southern Tunisian Atlas. This variation of direction is the result of an important variation of deformation during the different tectonic phases. A classical modelling of the Jebel ElKebar anticline, based on the throws of the pre-existing faults and their reactivation during compressive phases, shows the importance of extensional deformation, particularly during the Aptian-Albian period, compared with that of the later compression (Alpine phases). A numerical modelling, based on the software Rampe E.M. 1.5.0 and applied to the anticline of Jebel Orbata, confirms the interpretation of a “fault-related fold” with a decollement level within the Triassic successions. The other important parameter in the evolution of deformation is the vertical migration of the decollement level; indeed, the shallower the decollement level within the recent series, the more accentuated the deformation.
The evolution of deformation is marked by the development of a duplex structure in Jebel At Taghli (eastern limit of Jebel Orbata). Consequently, the evolution of deformation is proportional to the depth of the decollement level; the most important deformation occurs in the higher successions and is thus associated with thin-skinned deformation, the decollement level permitting the passive transfer of deformation into the cover.
Keywords: evolution of deformation, pre-existing faults, decollement level, thin-skinned
Procedia PDF Downloads 126
3668 Multilevel Modelling of Modern Contraceptive Use in Nigeria: Analysis of the 2013 NDHS
Authors: Akiode Ayobami, Akiode Akinsewa, Odeku Mojisola, Salako Busola, Odutolu Omobola, Nuhu Khadija
Abstract:
Purpose: Evidence exists that family planning use can contribute to a reduction in infant and maternal mortality in any country. Despite these benefits, contraceptive use in Nigeria remains very low, at only 10% among married women. Understanding the factors that predict contraceptive use is very important in order to improve the situation. In this paper, we analysed data from the 2013 Nigerian Demographic and Health Survey (NDHS) to better understand predictors of contraceptive use in Nigeria. The use of logistic regression and other traditional models in this type of situation is not appropriate, as they do not account for the social-structure influence on the response variable brought about by the hierarchical nature of the data. We therefore used multilevel modelling to explore the determinants of contraceptive use in order to account for the significant variation in modern contraceptive use by socio-demographic and other proximate variables across the different Nigerian states. Method: The data have a two-level hierarchical structure. We considered the data of 26,403 married women of reproductive age at level 1 and nested them within the 36 states and the Federal Capital Territory, Abuja, at level 2. We modelled use of modern contraceptives against demographic variables, being told about family planning at a health facility, having heard of family planning on TV, in a magazine or on the radio, and the husband's desire for more children, nested within state. Results: Our results showed that the independent variables in the model were significant predictors of modern contraceptive use. The estimated variance components for the null, random-intercept, and random-slope models were significant (p=0.00), indicating that the variation in contraceptive use across the Nigerian states is significant and needs to be accounted for in order to accurately determine the predictors of contraceptive use; hence the data are best fitted by the multilevel model.
Only being told about family planning at the health facility and religion had a significant random effect, implying that their predictability of contraceptive use varies across the states. Conclusion and Recommendation: The results showed that providing family planning information at the health facility and religion need to be considered when programming to improve contraceptive use at the state level.
Keywords: multilevel modelling, family planning, predictors, Nigeria
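One way to see why the multilevel model is preferred is the intraclass correlation (ICC): in a random-intercept logistic model the level-1 residual variance on the latent scale is fixed at π²/3, so the share of variation attributable to states is σ²ᵤ/(σ²ᵤ + π²/3). A sketch with a hypothetical between-state variance (not an estimate from the NDHS data):

```python
import math

def latent_icc(sigma_u2):
    """Latent-scale ICC for a random-intercept logistic model:
    between-cluster variance over total latent variance."""
    return sigma_u2 / (sigma_u2 + math.pi ** 2 / 3)

# hypothetical between-state variance of the random intercept
icc = latent_icc(0.8)   # share of variation in use attributable to states
```

A variance of 0.8 puts roughly a fifth of the latent variation at the state level; an ordinary logistic regression would ignore this clustering entirely, understating the standard errors of state-varying predictors.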
Procedia PDF Downloads 418
3667 Understanding New Zealand’s 19th Century Timber Churches: Techniques in Extracting and Applying Underlying Procedural Rules
Authors: Samuel McLennan, Tane Moleta, Andre Brown, Marc Aurel Schnabel
Abstract:
The development of ecclesiastical buildings within New Zealand has produced some unique design characteristics that take influence from both international styles and local building methods. This research looks at how procedural modelling can be used to define such common characteristics and to understand how they are shared and developed within different examples of a similar architectural style. This will be achieved through the creation of procedural digital reconstructions of the various timber Gothic churches built during the 19th century in the city of Wellington, New Zealand. ‘Procedural modelling’ is a digital modelling technique that has been growing in popularity, particularly within the game and film industries, as well as in other fields such as industrial design and architecture. Such a design method entails the creation of a parametric ‘ruleset’ that can be easily adjusted to produce many variations of geometry, rather than the single geometry typically found in traditional CAD software. Key precedents within this area of digital heritage include work by Haegler, Müller, and Gool, Nicholas Webb and Andre Brown, and most notably Mark Burry. What these precedents all share is that the forms of the reconstructed architecture have been generated using computational rules and an understanding of the architects’ geometric reasoning. This is also true within this research, as Gothic architecture makes use of only a select range of forms (such as the pointed arch) that can be accurately replicated using the same standard geometric techniques originally used by the architect. The methodology of this research involves firstly establishing a sample group of similar buildings, documenting the existing samples, researching any lost samples to find evidence such as architectural plans, photos, and written descriptions, and then culminating all the findings in a single 3D procedural asset within the software ‘Houdini’.
The end result will be an adjustable digital model that contains all the architectural components of the sample group, such as the various naves, buttresses, and windows. These components can then be selected and arranged to create visualisations of the sample group. Because timber Gothic churches in New Zealand share many details between designs, the created collection of architectural components can also be used to approximate similar designs not included in the sample group, such as designs found beyond the Wellington region. This creates an initial library of architectural components that can be further expanded to encapsulate as wide a sample size as desired. Such a methodology greatly improves upon the efficiency and adjustability of digital modelling compared with current practices in digital heritage reconstruction. It also gives greater accuracy to speculative design, as a lack of evidence for lost structures can be compensated for by approximating them from components of still-existing or better-documented examples. This research will also bring attention to the cultural significance these types of buildings have within the local area, addressing the public’s general unawareness of architectural history identified in the Wellington-based research ‘Moving Images in Digital Heritage’ by Serdar Aydin et al.
Keywords: digital forensics, digital heritage, gothic architecture, Houdini, procedural modelling
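The pointed arch mentioned above is a good example of a form that reduces to a simple parametric rule. A minimal sketch of an equilateral two-centred arch — each half a circular arc whose centre is the opposite springing point, radius equal to the span — written here in plain Python rather than the Houdini ruleset the research actually uses:

```python
import math

def pointed_arch(span, n=20):
    """Equilateral two-centred (Gothic) arch as a polyline.
    Springing points at (0, 0) and (span, 0); apex at (span/2, span*sqrt(3)/2)."""
    pts = []
    # left half: circle centred at (span, 0), swept from 180 deg to 120 deg
    for i in range(n + 1):
        a = math.radians(180 - 60 * i / n)
        pts.append((span + span * math.cos(a), span * math.sin(a)))
    # right half mirrors the left about x = span/2 (apex emitted once)
    right = [(span - x, y) for (x, y) in reversed(pts[:-1])]
    return pts + right

arch = pointed_arch(span=4.0)
```

Exposing `span` (and, in a fuller ruleset, the arc radius) as parameters is what lets one asset regenerate the arch across naves and windows of different widths.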
Procedia PDF Downloads 131
3666 Effectiveness of Computer-Based Cognitive Training in Improving Attention-Deficit/Hyperactivity Disorder Rehabilitation
Authors: Marjan Ghazisaeedi, Azadeh Bashiri
Abstract:
Background: Attention-Deficit/Hyperactivity Disorder (ADHD) is one of the most common psychiatric disorders of early childhood; in addition to its main symptoms, it produces significant deficits in educational, social and individual relationships. Considering the importance of rehabilitation for ADHD patients in controlling these problems, this study investigated the advantages of computer-based cognitive training in these patients. Methods: This review article was conducted by searching articles since 2005 in scientific databases and e-journals, using keywords including computerized cognitive rehabilitation, computer-based training and ADHD. Results: Since drugs have short-term effects and also many side effects in the rehabilitation of ADHD patients, using supplementary methods such as computer-based cognitive training is one of the best solutions. This approach provides quick feedback and has no side effects, and it shows promising results in the cognitive rehabilitation of ADHD, especially for working memory and attention. Conclusion: Considering the different cognitive dysfunctions in ADHD patients, application of computerized cognitive training has the potential to improve cognitive functions and, consequently, social, academic and behavioral performance in patients with this disorder.
Keywords: ADHD, computer-based cognitive training, cognitive functions, rehabilitation
Procedia PDF Downloads 277
3665 3D Biomechanics Analysis of Tennis Elbow Factors & Injury Prevention Using Computer Vision and AI
Authors: Aaron Yan
Abstract:
Tennis elbow has been a leading injury and problem among amateur and even professional players, and many factors contribute to it. In this research, we apply state-of-the-art sensor-less computer vision and AI technology to study the biomechanics of a player’s tennis movements during training and competition as they relate to the causes of tennis elbow. We provide a framework for the analysis of key biomechanical parameters and their correlations with the specific tennis strokes and movements that can lead to tennis elbow or elbow injury. We also devise a method for using AI to automatically detect player forms that can lead to tennis elbow development, for on-court injury prevention.
Keywords: tennis elbow, computer vision, AI, 3DAT
Procedia PDF Downloads 46
3664 The Experimental and Numerical Analysis of the Joining Processes for Air Conditioning Systems
Authors: M.St. Węglowski, D. Miara, S. Błacha, J. Dworak, J. Rykała, K. Kwieciński, J. Pikuła, G. Ziobro, A. Szafron, P. Zimierska-Nowak, M. Richert, P. Noga
Abstract:
In this paper, the results of welding of car air-conditioning elements are presented. These systems are based mainly on environmentally unfriendly refrigerants; thus, car producers will have to stop using traditional refrigerants and change to carbon dioxide (R744), which is environmentally friendly. However, it should be noted that an air-conditioning system working with the R744 refrigerant operates at high temperature (up to 150 °C) and high pressure (up to 130 bar). These two parameters are much higher than for other refrigerants, so new materials, designs and joining technologies are strongly needed for these systems. AISI 304 and 316L steels as well as 5xxx-series aluminium alloys are ranked among the prospective materials. As joining processes, laser welding, plasma welding, electron beam welding and high rotary friction welding can be applied. In the study, metallographic examination based on light microscopy as well as SEM was applied to estimate the quality of the welded joints. The analysis of welding was supported by numerical modelling based on the Sysweld software. The results indicated that using laser, plasma and electron beam welding, it is possible to obtain welds of proper quality in stainless steel. Moreover, high rotary friction welding guarantees metallic continuity in the aluminium welded area. The metallographic examination revealed that grain growth in the heat-affected zone (HAZ) of the laser and electron beam welded joints was not observed, due to the low heat input and short welding time. Grain growth and subgrains can be observed at room temperature when the solidification mode is austenitic; this caused only minor microstructural changes during solidification. A columnar grain structure was found in the weld metal, while equiaxed grains were detected at the interface.
The numerical modelling of the laser welding process allowed the temperature profile in the welded joint to be estimated and the dimensions of the welds to be predicted. Agreement between the FEM analysis and the experimental data was achieved.
Keywords: car air-conditioning, microstructure, numerical modelling, welding
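Alongside a full FEM treatment such as Sysweld, the quasi-steady temperature field around a moving point heat source can be sanity-checked with the classical Rosenthal thick-plate solution, T = T₀ + Q/(2πkR) · exp(−v(R + ξ)/(2α)). The parameters below are illustrative, not the welding conditions of this study:

```python
import math

def rosenthal_temp(xi, y, z, Q=1500.0, v=0.01, k=15.0, alpha=4e-6, T0=293.0):
    """Quasi-steady Rosenthal thick-plate solution for a moving point source.
    xi: distance along travel direction (positive ahead of the source, m);
    Q: absorbed power (W); v: travel speed (m/s); k: conductivity (W/m.K);
    alpha: thermal diffusivity (m^2/s); T0: ambient temperature (K)."""
    R = math.sqrt(xi * xi + y * y + z * z)
    return T0 + Q / (2.0 * math.pi * k * R) * math.exp(-v * (R + xi) / (2.0 * alpha))

# temperature 2 mm behind the source, 1 mm below the surface
t_behind = rosenthal_temp(-0.002, 0.0, 0.001)
```

The exponential term reproduces the characteristic asymmetry of a weld pool: isotherms are stretched behind the moving source and compressed ahead of it, which is a quick qualitative check on an FEM temperature profile.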
Procedia PDF Downloads 408
3663 Meditation Based Brain Painting Promotes Foreign Language Memory through Establishing a Brain-Computer Interface
Authors: Zhepeng Rui, Zhenyu Gu, Caitilin de Bérigny
Abstract:
In the current study, we designed an interactive meditation and brain painting application to cultivate users’ creativity, promote meditation, reduce stress, and improve cognition while attempting to learn a foreign language. User tests and data analyses were conducted on 42 male and 42 female participants to better understand sex-associated psychological and aesthetic differences. Our method utilized brain-computer interfaces to import meditation and attention data and create artwork in meditation-based applications. Female participants showed statistically significantly different language learning outcomes following three meditation paradigms, and the art style of the brain painting helped females with language memory. Our results suggest that meditation practice and brain painting exercises were the most effective methods for promoting memory and attention, contributing to language learning, concentration, and foreign word memorization. We conclude that a short period of meditation practice can help in learning a foreign language. These findings provide new insights into meditation, creative language education, brain-computer interfaces, and human-computer interaction.
Keywords: brain-computer interface, creative thinking, meditation, mental health
Procedia PDF Downloads 127
3662 Modelling Mode Choice Behaviour Using Cloud Theory
Authors: Leah Wright, Trevor Townsend
Abstract:
Mode choice models are crucial instruments in the analysis of travel behaviour. These models show the relationship between an individual’s choice of transportation mode for a given O-D pair and the individual’s socioeconomic characteristics, such as household size and income level, age and/or gender, and the features of the transportation system. The most popular functional forms of these models are based on utility-based choice theory, which addresses the uncertainty in the decision-making process with the use of an error term. However, with the development of artificial intelligence, many researchers have started to take a different approach to travel demand modelling. In recent times, researchers have looked at using neural networks, fuzzy logic and rough set theory to develop improved mode choice formulas. The concept of cloud theory has recently been introduced to model decision-making under uncertainty. Unlike the previously mentioned theories, cloud theory recognises a relationship between randomness and fuzziness, two of the most common types of uncertainty. This research aims to investigate the use of cloud theory in mode choice models, and this paper highlights the conceptual framework of a mode choice model using cloud theory. Merging decision-making under uncertainty with mode choice models is state of the art. The cloud theory model is expected to address the issues and concerns with the nested logit and improve the design of mode choice models and their use in travel demand analysis.
Keywords: cloud theory, decision-making, mode choice models, travel behaviour, uncertainty
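For contrast with the cloud-theory approach, the standard utility-based formulation reduces, under i.i.d. Gumbel error terms, to the multinomial logit choice probabilities — a softmax over the systematic utilities. The utility values below are hypothetical, chosen only to show the mechanics:

```python
import math

def mode_choice_probs(utilities):
    """Multinomial logit: P(mode i) = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities.values())                 # stabilise the exponentials
    expv = {mode: math.exp(v - m) for mode, v in utilities.items()}
    total = sum(expv.values())
    return {mode: e / total for mode, e in expv.items()}

# hypothetical systematic utilities for one O-D pair and one traveller
V = {"car": -0.5, "bus": -1.2, "walk": -2.0}
probs = mode_choice_probs(V)
```

The error term's fixed Gumbel distribution is exactly the restriction cloud theory relaxes: the logit handles randomness but has no notion of the fuzziness in how a traveller perceives the utilities themselves.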
Procedia PDF Downloads 387
3661 Mathematical Modelling of Slag Formation in an Entrained-Flow Gasifier
Authors: Girts Zageris, Vadims Geza, Andris Jakovics
Abstract:
Gasification processes are of great interest due to their generation of renewable energy in the form of syngas from biodegradable waste. It is, therefore, important to study the factors that play a role in the efficiency of gasification and the longevity of the machines in which gasification takes place. This study focuses on the latter, aiming to optimize an entrained-flow gasifier by reducing slag formation on its walls and thereby reduce maintenance costs. A CFD mathematical model of an entrained-flow gasifier is constructed: the model of an actual gasifier is rendered in 3D and appropriately meshed. The turbulent gas flow in the gasifier is then modelled with the realizable k-ε approach, taking devolatilization, combustion and coal gasification into account. Various such simulations are conducted, obtaining results for different air inlet positions and tracking particles of varying sizes undergoing devolatilization and gasification. The model identifies potential problem zones where most particles collide with the gasifier walls, indicating the risk regions where ash deposits are most likely to form. In conclusion, the effects of air inlet positioning and of the particle sizes allowed in the main gasifier tank on the formation of an ash layer are discussed, and possible solutions for decreasing the number of undesirable deposits are proposed. Additionally, an estimate is given of the impact of different factors, such as temperature, gas properties and gas content, and of the different forces acting on the particles undergoing gasification.
Keywords: biomass particles, gasification, slag formation, turbulence k-ε modelling
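The size dependence of wall impacts can be illustrated with the Stokes particle relaxation time τ = ρₚdₚ²/(18μ): larger particles retain their injection velocity longer and so penetrate further radially before gas drag arrests them. A minimal explicit-Euler sketch under illustrative conditions (Stokes drag only — not the gasifier's actual geometry, flow field, or force balance):

```python
def radial_penetration(d_p, rho_p=1400.0, mu=4e-5, u_r0=2.0, dt=1e-4, t_end=0.5):
    """Radial distance travelled by a particle injected with radial speed u_r0
    into gas with no radial velocity, decelerated by Stokes drag alone.
    d_p: particle diameter (m); rho_p: density (kg/m^3); mu: gas viscosity (Pa.s)."""
    tau = rho_p * d_p ** 2 / (18.0 * mu)   # particle relaxation time (s)
    u_r, r, t = u_r0, 0.0, 0.0
    while t < t_end:
        u_r += -u_r / tau * dt             # drag decays the radial velocity
        r += u_r * dt
        t += dt
    return r

r_small = radial_penetration(100e-6)   # 100 micron particle
r_big = radial_penetration(500e-6)     # 500 micron particle
```

The small particle's trajectory saturates near u_r0·τ well within the run, while the larger particle is still drifting outward at t_end, which is the inertial effect that makes coarse particles the dominant contributors to wall deposits in such models.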
Procedia PDF Downloads 285