Search results for: estimating of trajectory
632 Economic Growth After an Earthquake: A Synthetic Control Approach
Authors: Diego Diaz H., Cristian Larroulet
Abstract:
Although a large earthquake has clear and immediate consequences such as deaths, destruction of infrastructure, and displacement (at least temporary) of part of the population, scientific research on the impact of a geological disaster on economic activity is inconclusive, especially when looking beyond the very short term. Estimating the economic impact years after a disaster strikes is non-trivial, since there is an unavoidable difficulty in attributing the observed effect to the disaster rather than to other economic shocks. Case studies are performed that determine the impact of earthquakes in Chile, Japan, and New Zealand at a regional level by applying the synthetic control method, using the natural disaster as the treatment. This consists of constructing a counterfactual from every region in the same country that is not affected (or is only slightly affected) by the earthquake. The results show that the economies of Canterbury and Tohoku achieved greater levels of GDP per capita in the years after the disaster than they would have in its absence. For the case of Chile, however, the region of Maule experienced a decline in GDP per capita because of the earthquake. All the results are robust according to the placebo tests. The results also suggest that national institutional quality improves the growth process after the disaster.
Keywords: earthquake, economic growth, institutional quality, synthetic control
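The counterfactual construction described above amounts to finding non-negative donor-region weights, summing to one, that reproduce the treated region's pre-earthquake trajectory. A minimal sketch in Python (all GDP figures are invented for illustration, and the soft sum-to-one constraint via an appended penalty row is one common implementation choice, not necessarily the authors'):

```python
import numpy as np
from scipy.optimize import nnls

# Pre-treatment GDP per capita (rows = years) for the treated region
# and three unaffected "donor" regions. Values are made up for illustration.
treated = np.array([10.0, 10.5, 11.0, 11.6])
donors = np.array([
    [9.0, 12.0, 10.5],
    [9.4, 12.6, 11.0],
    [9.8, 13.2, 11.5],
    [10.3, 13.9, 12.1],
])

def synthetic_weights(X, y, penalty=1e3):
    # Non-negative least squares with a soft sum-to-one constraint:
    # append a heavily weighted row asking the weights to add up to 1.
    A = np.vstack([X, penalty * np.ones(X.shape[1])])
    b = np.append(y, penalty)
    w, _ = nnls(A, b)
    return w

w = synthetic_weights(donors, treated)
synthetic = donors @ w     # counterfactual pre-treatment trajectory
gap = treated - synthetic  # small pre-treatment gap = good synthetic control
```

After the event year, the same weighted combination of donor regions extends the counterfactual forward, and the post-treatment gap is read as the earthquake's effect.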
Procedia PDF Downloads 223
631 Comparison between Simulation and Experimentally Observed Interactions between Two Different Sized Magnetic Beads in a Fluidic System
Authors: Olayinka Oduwole, Steve Sheard
Abstract:
The magnetic separation of biological cells using superparamagnetic beads has been widely used for various bioassays. These bioassays can further be integrated with other laboratory components to form a biosensor, which can be used for cell sorting, mixing, purification, transport, manipulation, etc. These bio-sensing applications have also been facilitated by the wide availability of magnetic beads, produced by different manufacturers, which range in size and magnetic properties. In order to improve the efficiency and separation capabilities of these biosensors, it is important to determine the magnetically induced velocities and the interaction of beads within the magnetic field; this will help biosensor users choose the desired magnetic bead for their specific application. This study presents, for the first time, the interaction between a pair of different sized superparamagnetic beads suspended in a static fluid moving within a uniform magnetic field, using a modified finite-time finite-difference scheme. A captured video was used to record the trajectory pattern, and good agreement was obtained between the simulated trajectories and the video data. The model is, therefore, a good approximation for predicting the velocities as well as the interaction between various magnetic particles which differ in size and magnetic properties, for bio-sensing applications requiring a low concentration of magnetic beads.
Keywords: biosensor, magnetic field, magnetic separation, superparamagnetic bead
Procedia PDF Downloads 473
630 Aircraft Automatic Collision Avoidance Using Spiral Geometric Approach
Authors: M. Orefice, V. Di Vito
Abstract:
This paper describes a collision avoidance algorithm developed from the mathematical modeling of insect flight in terms of spiral and conchospiral geometric paths. It calculates an avoidance manoeuvre that prevents the infringement of a predefined distance threshold between ownship and the considered intruder, while minimizing the ownship's deviation from its original path and complying with the aircraft performance limitations and dynamic constraints. The algorithm is designed to be suitable for real-time applications, so that it can be considered for implementation in the most recent airborne automatic collision avoidance systems using traffic data received through an ADS-B IN device. The presented approach is able to take into account the rules-of-the-air: a specifically designed decision-making logic, based on the encounter geometry, selects the direction of the calculated avoidance manoeuvre so that it complies with the rules-of-the-air, for instance the fundamental right-of-way rule. In the paper, the proposed collision avoidance algorithm is presented and its preliminary design and software implementation are described. The applicability of this method has been proved through preliminary simulation tests performed in a 2D environment considering single-intruder encounter geometries, as reported and discussed in the paper.
Keywords: ADS-B based application, collision avoidance, RPAS, spiral geometry
Procedia PDF Downloads 241
629 Optimal Tuning of Linear Quadratic Regulator Controller Using a Particle Swarm Optimization for Two-Rotor Aerodynamical System
Authors: Ayad Al-Mahturi, Herman Wahid
Abstract:
This paper presents an optimal state feedback controller based on the Linear Quadratic Regulator (LQR) for a two-rotor aerodynamical system (TRAS). The TRAS is a highly nonlinear multi-input multi-output (MIMO) system with two degrees of freedom and cross coupling. Two parameters define the behavior of the LQR controller, the state weighting matrix and the control weighting matrix, and both influence its performance. Particle Swarm Optimization (PSO) is proposed to optimally tune the weighting matrices of the LQR. The major concern of using the LQR controller is to stabilize the TRAS by making the beam move quickly and accurately, either tracking a trajectory or reaching a desired altitude. Simulations were carried out in MATLAB/Simulink, with the system decoupled into two single-input single-output (SISO) systems. Compared with the optimized proportional, integral and derivative (PID) controller provided by INTECO, the PSO-tuned LQR controller gives better performance in terms of both transient and steady-state response.
Keywords: LQR controller, optimal control, particle swarm optimization (PSO), two-rotor aerodynamical system (TRAS)
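The PSO search over the LQR weighting parameters can be sketched as below. The closed-loop cost is replaced here by a toy quadratic surface, since evaluating the true cost requires simulating the full nonlinear TRAS model; the swarm hyper-parameters (inertia w, cognitive/social gains c1, c2) are illustrative defaults, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(q):
    # Stand-in for the closed-loop performance index J(Q, R); in the paper
    # this would come from simulating the TRAS under the resulting LQR gain.
    return (q[0] - 3.0) ** 2 + (q[1] - 0.5) ** 2

def pso(cost, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(0, 10, (n_particles, dim))   # candidate weighting params
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

q_opt = pso(cost)  # swarm's best weighting parameters
```

In the paper's setting, each particle would encode the diagonal entries of Q and R, and the fitness evaluation would run the decoupled SISO simulations.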
Procedia PDF Downloads 322
628 An Improved Tracking Approach Using Particle Filter and Background Subtraction
Authors: Amir Mukhtar, Dr. Likun Xia
Abstract:
An improved, robust and efficient visual target tracking algorithm using particle filtering is proposed. Particle filtering has proven very successful for estimation in non-Gaussian and non-linear problems. In this paper, the particle filter is used with a color feature to estimate the target state over time. Color distributions are used because they are scale- and rotation-invariant, robust to partial occlusion, and computationally efficient; robustness is further improved by working in the YIQ color scheme, since color-based particle filter tracking often gives inaccurate results when the light intensity changes during a video stream. Tracking is performed by comparing the chrominance histograms of the target and the candidate positions (particles). Furthermore, a background subtraction technique is used for size estimation of the target. A qualitative evaluation of the proposed algorithm is performed on several real-world videos. The experimental results demonstrate that the improved algorithm can track moving objects very well under illumination changes, occlusion, and a moving background.
Keywords: tracking, particle filter, histogram, corner points, occlusion, illumination
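The particle weighting step described above compares chrominance histograms between the target model and each candidate region; a common similarity measure for this is the Bhattacharyya coefficient (whether the paper uses exactly this measure is an assumption). A minimal sketch with synthetic (I, Q) chrominance samples standing in for image patches:

```python
import numpy as np

def chrominance_histogram(pixels, bins=8):
    # pixels: N x 2 array of (I, Q) chrominance values, assumed scaled to [0, 1)
    h, _ = np.histogramdd(pixels, bins=(bins, bins), range=((0, 1), (0, 1)))
    return h.ravel() / h.sum()

def bhattacharyya(p, q):
    # Similarity in [0, 1]; 1 means identical distributions.
    return float(np.sum(np.sqrt(p * q)))

rng = np.random.default_rng(1)
target = rng.uniform(0, 1, (500, 2)) * 0.5      # target's chrominance cloud
match = rng.uniform(0, 1, (500, 2)) * 0.5       # candidate over the target
miss = 0.5 + rng.uniform(0, 1, (500, 2)) * 0.5  # candidate off the target

h_t = chrominance_histogram(target)
w_match = bhattacharyya(h_t, chrominance_histogram(match))
w_miss = bhattacharyya(h_t, chrominance_histogram(miss))
# A particle over the true location receives a much larger weight,
# so resampling concentrates particles on the target.
```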
Procedia PDF Downloads 381
627 Assessment of Dose-Area Product of Common Radiographic Examinations in Selected Southern Nigerian Hospitals
Authors: Lateef Bamidele
Abstract:
Over the years, radiographic examinations have been the most used diagnostic tools in the Nigerian health care system, but most diagnostic examinations carried out do not have records of patient doses. This lack of adequate information on patient doses has been a major hindrance to quantifying the radiological risk associated with radiographic examinations. This study aimed at estimating the dose-area product (DAP) of patients examined in the X-ray units of selected hospitals in Southern Nigeria. The standard projections selected are chest posterior-anterior (PA), abdomen anterior-posterior (AP), pelvis AP, pelvis lateral (LAT), skull AP/PA, skull LAT, lumbar spine AP, and lumbar spine LAT. Entrance surface dose (ESD) was measured using thermoluminescent dosimeters (TLD). Measured ESDs were converted into DAP using the beam area on the patient. The results show that the mean DAP ranged from 0.17 to 18.35 Gycm². The values obtained in this study were found to be higher than those of NRPB-HPE, an indication that operational conditions are not optimized.
Keywords: dose-area product, radiographic examinations, patient doses, optimization
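The ESD-to-DAP conversion used in the study can be illustrated as follows (the function name and the example field size and dose are hypothetical; a rigorous conversion would also correct for backscatter and the exact beam area at the skin):

```python
def dose_area_product(esd_mgy, field_width_cm, field_height_cm):
    """Approximate DAP (mGy*cm^2) as entrance surface dose times beam area.

    This mirrors the conversion described in the abstract: measured ESD
    multiplied by the irradiated field area on the patient.
    """
    return esd_mgy * field_width_cm * field_height_cm

# Example: a chest PA exposure with 2.5 mGy ESD over a 35 x 43 cm field
dap = dose_area_product(2.5, 35, 43)   # 3762.5 mGy*cm^2, i.e. 3.7625 Gy*cm^2
```

The example value falls inside the 0.17 to 18.35 Gycm² range reported in the abstract.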
Procedia PDF Downloads 176
626 Evaluating the Methods of Retrofitting and Renovating the Masonry Schools of Ahvaz City
Authors: Navid Khayat, Babak Mombeni
Abstract:
This study investigates the retrofitting of schools in Ahvaz City. Three schools, namely Enghelab, Sherafat, and Golchehreh, are initially examined through Schmidt hammer and ultrasonic tests. Based on these tests and checks on the structures, reconstruction methods are presented for each school, together with an estimate of the cost, the general feasibility, and the expected duration of the reconstruction project. After reconstruction, the tests are re-performed on the rebuilt parts, and the results indicate a significant improvement in structural performance. According to the results, although the use of fiber reinforced polymers (FRP) for structural retrofitting is costly, the low execution costs and the other benefits of FRP make it one of the most effective retrofitting methods overall. Applying a concrete coating to walls is another effective method of retrofitting buildings: a grid of horizontal and vertical bars is installed on the wall and concrete is then poured over it. The use of concrete coating on concrete and brick structures yields useful results, and experience indicates that the poured concrete fills the joints well and provides appropriate bonding and adhesion.
Keywords: renovation, retrofitting, masonry structures, concrete coating
Procedia PDF Downloads 453
625 An Improved Convolution Deep Learning Model for Predicting Trip Mode Scheduling
Authors: Amin Nezarat, Naeime Seifadini
Abstract:
Trip mode selection is a behavioral characteristic of passengers with immense importance for travel demand analysis, transportation planning, and traffic management. Identifying the trip mode distribution allows transportation authorities to adopt appropriate strategies to reduce travel time, traffic, and air pollution. The majority of existing trip mode inference models operate on human-selected features with traditional machine learning algorithms. However, human-selected features are sensitive to changes in traffic and environmental conditions and susceptible to personal biases, which can make them inefficient. One way to overcome these problems is to use neural networks capable of extracting high-level features from raw input. In this study, a convolutional neural network (CNN) architecture is used to predict the trip mode distribution based on raw GPS trajectory data. The key innovation of this paper is the design of the CNN input layer layout and the normalization operation, in a way that is not only compatible with the CNN architecture but can also represent the fundamental features of motion, including speed, acceleration, jerk, and bearing rate. The highest prediction accuracy achieved with the proposed configuration, for a convolutional neural network with batch normalization, is 85.26%.
Keywords: predicting, deep learning, neural network, urban trip
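The motion features mentioned (speed, acceleration, jerk, bearing rate) can be derived from raw GPS fixes roughly as follows. This is a sketch using a flat-earth approximation that is adequate for short urban trips; the paper's exact preprocessing and normalization may differ:

```python
import numpy as np

def motion_features(lat, lon, t):
    """Derive speed, acceleration, jerk and bearing rate from a GPS track.

    lat/lon in degrees, t in seconds. Uses a local flat-earth approximation.
    """
    R = 6371000.0  # mean Earth radius, metres
    lat, lon, t = map(np.asarray, (lat, lon, t))
    phi = np.radians(lat)
    dx = R * np.cos(phi[:-1]) * np.radians(np.diff(lon))  # east displacement
    dy = R * np.radians(np.diff(lat))                     # north displacement
    dt = np.diff(t).astype(float)
    speed = np.hypot(dx, dy) / dt
    accel = np.diff(speed) / dt[1:]
    jerk = np.diff(accel) / dt[2:]
    bearing = np.degrees(np.arctan2(dx, dy)) % 360        # heading from north
    bearing_rate = np.abs(np.diff(bearing)) / dt[1:]
    return speed, accel, jerk, bearing_rate
```

Stacking these per-point channels into a fixed-size grid is what the abstract's input-layer layout refers to.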
Procedia PDF Downloads 138
624 Estimating the Probability of Winning the Best Actor/Actress Award Conditional on the Best Picture Nomination with Bayesian Hierarchical Models
Authors: Svetlana K. Eden
Abstract:
Movies and TV shows have long been part of modern culture. We all have our preferred genre, story, actors, and actresses. But can we objectively discern good acting from bad? As laymen, we are probably not objective, but what about the Oscar academy members? Are their votes based on objective measures? Academy members are probably also biased due to many factors, including their professional affiliations or advertisement exposure. Heavily advertised films bring more publicity to their cast and are likely to have bigger budgets. Because a bigger budget may also help earn a Best Picture (BP) nomination, we hypothesize that best actor/actress (BA) nominees from BP-nominated movies have higher chances of winning the award than BA nominees from non-BP-nominated films. To test this hypothesis, three Bayesian hierarchical models are proposed and their performance evaluated. The results from all three models largely support our hypothesis. Depending on the proportion of BP nominations among BA nominees, the odds ratios (estimated over expected) of winning the BA award conditional on BP nomination vary from 2.8 [0.8, 7.0] to 4.3 [2.0, 15.8] for actors and from 1.5 [0.0, 12.2] to 5.4 [2.7, 14.2] for actresses.
Keywords: Oscar, best picture, best actor/actress, bias
Procedia PDF Downloads 223
623 Africa as Endemically a War Continent: Explaining the Changing Pattern of Armed Conflicts in Africa
Authors: Kenneth Azaigba
Abstract:
The history of post-colonial African states has been dubbed a history of endemic warfare in the existing literature. Indeed, Africa's political environment is characterized by a multiplicity of threats to peace and security. Africa's leading drivers of conflict include abundant (especially mineral) resources, personal rule and the attendant political authoritarianism, manipulation of identity politics across ethnicity, marginalization of communities, and electoral malpractices resulting in contested legitimacy and consequent violence. However, the character of armed conflict in Africa is changing. This paper attempts to reconstruct the trajectory of armed conflicts in Africa and explain the changing pattern. It contends that large-scale political violence in Africa is on the decline, rendering the endemic-warfare thesis an inappropriate paradigm for explaining political conflicts in Africa. The paper also posits that although small-scale conflicts are springing up and exhibiting trans-border dimensions, these patterns of armed conflict are not peculiar to Africa but are emerging waves of global conflict. The shift in the scale of warfare in Africa is explained as a function of a multiplicity of post-Cold War global contradictions. Inclusive governance, social justice, and economic security are articulated as workable panaceas for mitigating warfare in Africa.
Keywords: Africa, conflicts, pattern, war
Procedia PDF Downloads 387
622 Evaluating the Influence of Financial Technology (FinTech) on Sustainable Finance: A Comprehensive Global Analysis
Authors: Muhammad Kashif
Abstract:
The primary aim of this paper is to investigate the influence of financial technology (FinTech) on sustainable finance. The sample spans 2010 to 2021 and encompasses data from 89 countries worldwide. The study employed a two-stage least squares (2SLS) regression approach with instrumental variables and validated the findings using a two-step system generalized method of moments (GMM). The findings indicate that fintech has a significant favorable impact on sustainable finance. Other factors such as institutional quality, socio-economic conditions, and renewable energy also have a significant and beneficial influence on the trajectory of sustainable finance, while globalization's impact is positive but insignificant. Furthermore, fintech is crucial in driving the transition toward a sustainable future characterized by a lower-carbon economy. The study found that fintech has extensive application across various sectors of sustainable finance, has substantial potential to create long-term positive effects on it, and can integrate extensively with other technologies to facilitate diversified growth in sustainable finance. Additionally, this study highlights fintech-related trends and research opportunities in sustainable finance, showing how the two can promote each other worldwide, with important policy implications for countries looking to advance sustainable finance through technology.
Keywords: sustainable development goals (SDGs), financial technology (FinTech), genuine savings index (GSI), financial stability index, sustainable finance
Procedia PDF Downloads 134
621 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques
Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah
Abstract:
Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming, and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction cost. The development of standard sets of measurement rules, understandable by all those involved in a construction project, has not fully solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges encountered in this exploratory study are also reported, and recommendations for future studies are proposed.
Keywords: BIM, construction projects, cost estimation, NRM, ontology
Procedia PDF Downloads 551
620 Maximum Initial Input Allowed to Iterative Learning Control Set-up Using Singular Values
Authors: Naser Alajmi, Ali Alobaidly, Mubarak Alhajri, Salem Salamah, Muhammad Alsubaie
Abstract:
Iterative Learning Control (ILC) is known as a control tool for overcoming periodic disturbances in repetitive systems. The technique aims to drive the error signal to zero as the number of operations increases. The learning process in this context depends strongly on the initial input: selected properly, it makes learning far more effective than starting blind. ILC uses previously recorded execution data to update the input of the following execution (trial) such that a reference trajectory is followed with high accuracy. Error convergence in ILC is generally highly dependent on the input applied to the plant at trial 1, so a good choice of initial input signal makes learning faster and, as a consequence, drives the error to zero faster as well. In the work presented here, an upper limit based on singular values is derived for the initial input signal applied at trial 1, such that the system follows the reference in fewer trials without responding aggressively or exceeding the working envelope within which a system, such as a robot arm, is required to move. Simulation results are presented to illustrate the theory introduced in this paper.
Keywords: initial input, iterative learning control, maximum input, singular values
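The trial-to-trial learning described above can be sketched with a simple P-type ILC update on a lifted (matrix) description of one trial; the plant impulse response and learning gain below are invented for illustration, and the singular values of exactly this kind of lifted matrix are what the paper's input bound is built from:

```python
import numpy as np
from scipy.linalg import toeplitz

# Lifted model of one trial: y = G u, with G the lower-triangular Toeplitz
# matrix of impulse-response samples of a (toy) stable discrete plant.
N = 20
h = 0.5 ** np.arange(N)
G = toeplitz(h, np.zeros(N))

ref = np.ones(N)   # reference trajectory to be tracked on every trial
L = 0.5            # P-type learning gain

u = np.zeros(N)    # trial-1 input: here the system "starts blind"
errs = []
for trial in range(30):
    y = G @ u
    e = ref - y
    errs.append(np.linalg.norm(e))
    u = u + L * e  # ILC update from the previous trial's error

# Largest singular value of the trial-to-trial error map (I - L*G);
# values below 1 give monotone convergence of the error norm.
sigma_max = np.linalg.svd(np.eye(N) - L * G, compute_uv=False)[0]
```

A well-chosen trial-1 input would start the loop with a small initial error instead of `u = 0`, which is exactly where the paper's singular-value bound on the initial input enters.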
Procedia PDF Downloads 241
619 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models
Authors: Yoonsuh Jung
Abstract:
As DNA microarray data contain a relatively small sample size compared to the number of genes, high dimensional models are often employed. In high dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection: it selects the parameter value with the smallest cross-validated score. However, selecting a single value as the "optimal" value of the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidate tuning parameters first, then average the candidates with different weights depending on their performance. The additional steps of estimating the weights and averaging the candidates rarely increase the computational cost, while they can considerably improve on traditional cross-validation. We show on real and simulated data sets that the value selected by the suggested method often leads to stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation.
Keywords: cross validation, parameter averaging, parameter selection, regularization parameter search
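The candidate-averaging idea can be sketched as below; the inverse-score weighting and the top-k truncation are illustrative choices, and the paper's actual weighting scheme may differ:

```python
import numpy as np

def averaged_parameter(candidates, cv_scores, top_k=3):
    """Average the top-k tuning-parameter candidates, weighting each by the
    inverse of its cross-validated error (smaller error = larger weight).
    """
    candidates = np.asarray(candidates, dtype=float)
    cv_scores = np.asarray(cv_scores, dtype=float)
    idx = np.argsort(cv_scores)[:top_k]   # k best candidates
    weights = 1.0 / cv_scores[idx]
    weights /= weights.sum()
    return float(np.sum(weights * candidates[idx]))

lambdas = [0.01, 0.05, 0.1, 0.5, 1.0]
scores = [0.40, 0.31, 0.30, 0.33, 0.45]   # cross-validated error per lambda
lam = averaged_parameter(lambdas, scores)
```

Instead of committing to the single minimizer (0.1 here), the returned value blends the near-optimal candidates, which damps the sampling variability the abstract warns about.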
Procedia PDF Downloads 415
618 The Influence of Collaboration on Individual Writing Quality: The Case of Iranian vs. Malaysian Freshers
Authors: Seyed Yasin Yazdi-Amirkhiz, Azirah Hashim
Abstract:
This study set out to comparatively investigate the influence of collaborative writing on the quality of the individual writing of four female Iranian and four female Malaysian students. The participants, first-semester students at a private university in Malaysia who were homogeneous in terms of age, gender, study discipline, and language proficiency, were divided into two Iranian and two Malaysian dyads. The dyads performed collaborative writing tasks for 15 sessions; after every three consecutive collaborative writing sessions, each participant was asked to attempt a writing task individually. Both collaborative and individual writing tasks comprised isomorphic graphic prompts (IELTS Academic Module Task 1). The writing quality of the five individually produced texts was scored in terms of task achievement (TA), cohesion/coherence (C/C), grammatical range/accuracy (GR/A), and lexical resources (LR). The findings indicated a hierarchy of development in TA and C/C among all the students, while LR showed minor improvement only among three of the Malaysian students, and GR/A barely exhibited any progress among all the participants. Intermittent progressions and regressions were also discerned in the trajectory of their writing development. The findings are discussed in the light of the socio-cultural and emergentist perspectives, the typology of the tasks used, and the role of the participants' level of language proficiency.
Keywords: collaborative writing, writing quality, individual writing, collaboration
Procedia PDF Downloads 458
617 MapReduce Logistic Regression Algorithms with RHadoop
Authors: Byung Ho Jung, Dong Hoon Lim
Abstract:
Logistic regression is a statistical method for analyzing a dataset in which one or more independent variables determine an outcome. It is used extensively in numerous disciplines, including the medical and social science fields. In this paper, we address the problem of estimating the parameters of a logistic regression within the MapReduce framework using RHadoop, which integrates R with a Hadoop environment and is applicable to large-scale data. Three learning algorithms for logistic regression are considered: the gradient descent method, the cost minimization method, and the Newton-Raphson method. The Newton-Raphson method does not require a learning rate, while the gradient descent and cost minimization methods need a manually chosen learning rate. The experimental results demonstrated that our learning algorithms using RHadoop scale well and efficiently process large data sets on commodity hardware. We also compared the performance of the Newton-Raphson method with the gradient descent and cost minimization methods; the results showed the Newton-Raphson method to be the most robust on all data tested.
Keywords: big data, logistic regression, MapReduce, RHadoop
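As a sketch of why Newton-Raphson needs no learning rate: each step solves a local quadratic model of the log-likelihood exactly. A serial (non-MapReduce) Python version with simulated data is shown below; in the paper's setting, the sums X'WX and X'(y - p) would be accumulated across mappers and combined in the reducer:

```python
import numpy as np

def logistic_newton(X, y, iters=25):
    """Fit logistic regression by Newton-Raphson:
    beta <- beta + (X'WX)^-1 X'(y - p), with W = diag(p(1 - p))."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
        W = p * (1 - p)                       # diagonal of the weight matrix
        grad = X.T @ (y - p)
        hess = X.T @ (X * W[:, None])
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(400), rng.normal(size=400)])
true_beta = np.array([-0.5, 2.0])
y = (rng.random(400) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)
beta_hat = logistic_newton(X, y)   # recovers true_beta up to sampling noise
```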
Procedia PDF Downloads 284
616 Nonparametric Path Analysis with Truncated Spline Approach in Modeling Rural Poverty in Indonesia
Authors: Usriatur Rohma, Adji Achmad Rinaldo Fernandes
Abstract:
Nonparametric path analysis is a statistical method that does not rely on the assumption that the form of the regression curve is known. The purpose of this study is to determine the best nonparametric truncated spline path function between linear and quadratic polynomial degrees with 1, 2, and 3 knot points, and to test the significance of the estimates of the best function in a model of the effect of population migration and agricultural economic growth on rural poverty through the unemployment rate, using the t-test statistic at the jackknife resampling stage. The data used in this study are secondary data obtained from statistical publications. The results showed that the best nonparametric truncated spline path model is a quadratic polynomial with 3 knot points. In addition, the significance tests of the best estimates, obtained using jackknife resampling, show that all exogenous variables have a significant influence on the endogenous variables.
Keywords: nonparametric path analysis, truncated spline, linear, quadratic, rural poverty, jackknife resampling
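The truncated spline path functions compared in the study are built from the truncated power basis; a minimal sketch (the knot locations are invented for illustration):

```python
import numpy as np

def truncated_spline_basis(x, degree, knots):
    """Truncated power basis: 1, x, ..., x^degree, plus (x - k)_+^degree
    for each knot k. Degree 1 gives the linear and degree 2 the quadratic
    spline families compared in the paper."""
    x = np.asarray(x, dtype=float)
    cols = [x ** p for p in range(degree + 1)]
    cols += [np.maximum(x - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

# Quadratic spline with 3 knots: 1 + degree + len(knots) = 6 basis columns
B = truncated_spline_basis(np.linspace(0, 10, 50), degree=2, knots=[2.5, 5.0, 7.5])
```

Regressing an endogenous variable on such a basis, then comparing degrees and knot counts by fit, is the model-selection step the abstract describes.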
Procedia PDF Downloads 46
615 Enhancing Transfer Path Analysis with In-Situ Component Transfer Path Analysis for Interface Forces Identification
Authors: Raef Cherif, Houssine Bakkali, Wafaa El Khatiri, Yacine Yaddaden
Abstract:
The analysis of how vibrations are transmitted between components is required in many engineering applications. Transfer path analysis (TPA) has been a valuable engineering tool for solving Noise, Vibration, and Harshness (NVH) problems using sub-structuring. The most challenging part of a TPA analysis is estimating the equivalent forces at the contact points between the active and passive sides. The in-situ component TPA method calculates these forces by inverting the frequency response functions (FRFs) measured on the passive subsystem, which relate the motion at indicator points to the forces at the interface. However, this matrix inversion can pose problems: ill-conditioned matrices lead to inaccurate results. This paper establishes a TPA model for an academic system consisting of two plates linked by four springs. A numerical study has been performed to improve the identification of the interface forces. Several parameters are studied and discussed, such as the singular value rejection threshold and the number and position of the indicator points used in the matrix inversion.
Keywords: transfer path analysis, matrix inverse method, indicator points, SVD decomposition
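Singular value rejection in the FRF-matrix inversion can be sketched as follows; the system sizes, the tolerance, and the synthetic FRF matrix are illustrative:

```python
import numpy as np

def forces_truncated_svd(H, y, rel_tol=1e-3):
    """Estimate interface forces f from indicator responses y = H f by
    pseudo-inversion, rejecting singular values below rel_tol * s_max
    to tame ill-conditioning."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    keep = s >= rel_tol * s[0]
    s_inv = np.zeros_like(s)
    s_inv[keep] = 1.0 / s[keep]          # rejected directions contribute 0
    return Vt.T @ (s_inv * (U.T @ y))

rng = np.random.default_rng(3)
H = rng.normal(size=(12, 4))             # FRFs: 12 indicator points, 4 forces
f_true = np.array([1.0, -2.0, 0.5, 3.0])
y = H @ f_true + 1e-6 * rng.normal(size=12)   # measured responses + noise
f_est = forces_truncated_svd(H, y)
```

Over-determining the problem (more indicator points than forces, as here) and raising `rel_tol` both trade bias for robustness against the ill-conditioning discussed in the abstract.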
Procedia PDF Downloads 84
614 Comparison of Statistical Methods for Estimating Missing Precipitation Data in the River Subbasin Lenguazaque, Colombia
Authors: Miguel Cañon, Darwin Mena, Ivan Cabeza
Abstract:
This work compared and evaluated the applicability of statistical methods for the estimation of missing precipitation data in the basin of the Lenguazaque River, located in the departments of Cundinamarca and Boyacá, Colombia. The methods used were simple linear regression, the distance-rate method, local averages, mean rates, correlation with nearby stations, and multiple regression. The effectiveness of the methods was assessed using three statistical tools: the coefficient of determination (r²), the standard error of estimation, and the Bland-Altman agreement test. The analysis was performed by randomly removing real rainfall values in each of the seasons and then estimating them with the above methodologies to fill in the missing values. It was thereby determined that, under the conditions encountered, the method with the highest performance and accuracy is multiple regression with three nearby stations, under a random application scheme supported by the precipitation behavior of the related data sets.
Keywords: statistical comparison, precipitation data, river subbasin, Bland and Altman
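The best-performing scheme (multiple regression on three nearby stations, validated on randomly removed values) can be sketched as follows, with synthetic rainfall standing in for the station records:

```python
import numpy as np

rng = np.random.default_rng(4)

# Monthly rainfall at three nearby stations and the target station (synthetic)
nearby = rng.gamma(2.0, 40.0, size=(120, 3))
target = (0.5 * nearby[:, 0] + 0.3 * nearby[:, 1] + 0.2 * nearby[:, 2]
          + rng.normal(0, 5, 120))

# Remove 20 values at random, fit on the rest, then estimate the gaps
missing = rng.choice(120, size=20, replace=False)
obs = np.setdiff1d(np.arange(120), missing)

A = np.column_stack([np.ones(len(obs)), nearby[obs]])
coef, *_ = np.linalg.lstsq(A, target[obs], rcond=None)

A_miss = np.column_stack([np.ones(len(missing)), nearby[missing]])
estimated = A_miss @ coef

# One of the three evaluation statistics used in the paper:
r = np.corrcoef(estimated, target[missing])[0, 1]
r2 = r ** 2
```

The same held-out comparison would feed the standard error of estimation and the Bland-Altman test mentioned in the abstract.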
Procedia PDF Downloads 467
613 Wind Wave Modeling Using MIKE 21 SW Spectral Model
Authors: Pouya Molana, Zeinab Alimohammadi
Abstract:
Determining wind wave characteristics is essential for implementing Coastal and Marine engineering projects such as designing coastal and marine structures and estimating sediment transport and coastal erosion rates. In order to predict significant wave height (H_s), this study applies the third-generation spectral wave model MIKE 21 SW alongside the CEM method. For calibration and verification of the SW model, two data sets of meteorological and wave-spectrum measurements are used. The model was driven by time-varying wind forcing, and the results showed that the difference ratio mean, the standard deviation of the difference ratio, and the correlation coefficient of the SW model for the H_s parameter are 1.102, 0.279, and 0.983, respectively, whereas for the CEM method the same statistics are 0.869, 1.317, and 0.8359. Comparing these results reveals that the CEM method produces larger errors than the MIKE 21 SW third-generation spectral wave model, and that a higher correlation coefficient does not necessarily mean higher accuracy.
Keywords: MIKE 21 SW, CEM method, significant wave height, difference ratio
Procedia PDF Downloads 402
612 Comparative Assessment of a Distributed Model and a Lumped Model for Estimating of Sediments Yielding in Small Urban Areas
Authors: J.Zambrano Nájera, M.Gómez Valentín
Abstract:
Increasing urbanization during the twentieth century has brought, as one major problem, increased sediment production. Hydraulic erosion is one of the major causes of increased sediment in small urban catchments. Such increases in sediment yield in headwater urban catchments can cause obstruction of drainage systems, making it impossible to capture urban runoff, increasing runoff volumes, and thus exacerbating urban flooding. For these reasons, it is increasingly important to study sediment production in urban watersheds in order to properly analyze and solve sediment-related problems. The study of sediment production has improved with the use of mathematical modeling. A new physically based model applicable to small headwater urban watersheds is therefore proposed, one that keeps the advantages of distributed physically based models but with more realistic data requirements. Additionally, the proposed model is compared with a lumped model, reviewing the results and the advantages and disadvantages of each.
Keywords: erosion, hydrologic modeling, urban runoff, sediment modeling, sediment yielding, urban planning
Procedia PDF Downloads 347
611 Craniopharyngiomas: Surgical Techniques: The Combined Interhemispheric Sub-Commissural Translamina Terminalis Approach to Tumors in and Around the Third Ventricle: Neurological and Functional Outcome
Authors: Pietro Mortini, Marco Losa
Abstract:
Objective: Resection of large lesions growing into the third ventricle remains demanding surgery, sometimes at risk of severe post-operative complications. Transcallosal and transcortical routes have been considered the approaches of choice to access the third ventricle; however, neurological consequences such as memory loss have been reported. We report clinical results of the previously described combined interhemispheric sub-commissural translamina terminalis approach (CISTA) for the resection of large lesions located in the third ventricle. Methods: The authors conducted a retrospective analysis of 10 patients who were operated on through the CISTA for the resection of lesions growing into the third ventricle. Results: Total resection was achieved in all cases. Cognitive worsening occurred in only one case. No perioperative deaths were recorded and, at last follow-up, all patients were alive. One year after surgery, 80% of patients had an excellent outcome, with a KPS of 100 and Glasgow Outcome Score (GOS). Conclusion: The CISTA represents a safe and effective alternative to transcallosal and transcortical routes for resecting lesions growing into the third ventricle. It allows a multiangle trajectory to access the third ventricle, with a wide working area free from critical neurovascular structures, without any section of the corpus callosum, the anterior commissure, or the fornix.
Keywords: craniopharyngioma, surgery, sub-commissural translamina terminalis approach (CISTA)
Procedia PDF Downloads 293
610 Influence of Physical Properties on Estimation of Mechanical Strength of Limestone
Authors: Khaled Benyounes
Abstract:
Determination of rock mechanical properties such as unconfined compressive strength (UCS), Young's modulus (E), and tensile strength by the Brazilian test (Rtb) is considered the most important component in drilling and mining engineering projects. Research aimed at establishing correlations between strength and physical parameters of rocks has always been of interest to mining and reservoir engineering. To this end, many limestone blocks were collected from the quarry located in Meftah (Algeria), and cores were prepared in the laboratory using a core drill. This work examines the relationships between mechanical properties and some physical properties of limestone. Several empirical equations are established between UCS and physical properties of limestone (such as dry bulk density, P-wave velocity, dynamic Young's modulus, alteration index, and total porosity). Other correlations, UCS versus tensile strength and dynamic versus static Young's modulus, have also been found. Based on the Mohr-Coulomb failure criterion, we were able to establish mathematical relationships that allow estimating the cohesion and internal friction angle from UCS and indirect tensile strength. Results from this study can be useful to the mining industry for resolving a range of geomechanical problems such as slope stability.
Keywords: limestone, mechanical strength, Young's modulus, porosity
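The paper's empirical equations are not reproduced in the abstract, but the Mohr-Coulomb step it describes follows from a standard identity: with the strength ratio m = UCS/|tensile strength|, sin(phi) = (m - 1)/(m + 1) and c = UCS/(2*sqrt(m)). A minimal sketch under that assumption:

```python
import math

def mohr_coulomb_params(ucs, tensile_strength):
    """Estimate cohesion c and internal friction angle phi (degrees)
    from UCS and the absolute tensile strength via the Mohr-Coulomb
    failure criterion (standard identity, assumed here, not the
    paper's fitted empirical equations)."""
    m = ucs / tensile_strength              # strength ratio sigma_c / |sigma_t|
    sin_phi = (m - 1.0) / (m + 1.0)         # from m = (1 + sin phi)/(1 - sin phi)
    phi = math.degrees(math.asin(sin_phi))
    cohesion = ucs / (2.0 * math.sqrt(m))   # c = sigma_c / (2 sqrt(m))
    return cohesion, phi
```

For example, a limestone with UCS = 100 MPa and indirect tensile strength 10 MPa gives a cohesion of about 15.8 MPa.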
Procedia PDF Downloads 454
609 Prediction Fluid Properties of Iranian Oil Field with Using of Radial Based Neural Network
Authors: Abdolreza Memari
Abstract:
In this article, a numerical method has been used in order to estimate the viscosity of crude oil. We use this method to measure the crude oil's viscosity in three states: saturated-oil viscosity, viscosity above the bubble point, and viscosity below the saturation pressure. The crude oil's viscosity is then estimated using the Khan model and the rolling-ball method. These data, which include the conditions under which viscosity was measured, are then used to train a radial basis function (RBF) neural network. This network is a two-layer artificial neural network whose hidden-layer activation function is a Gaussian; standard training algorithms are used to train it. After training the network, the results of the experimental method and of the artificial intelligence approach are compared. Once trained, the network is able to estimate crude oil viscosity with acceptable accuracy without using the Khan model or the experimental conditions, and under other conditions as well. Results show that the radial basis network has a high capability for estimating crude oil viscosity; saving time and cost is another advantage of this investigation.
Keywords: viscosity, Iranian crude oil, radial basis, neural network, rolling-ball method, Khan model
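A minimal sketch of the kind of radial basis function network the abstract describes: a Gaussian hidden layer with fixed centers and a linear output layer fitted by least squares. The centers, width, and training scheme here are illustrative assumptions, not the paper's configuration:

```python
import math

def gaussian(x, c, width):
    # Hidden-layer activation: Gaussian radial basis function.
    return math.exp(-((x - c) ** 2) / (2.0 * width ** 2))

def train_rbf(xs, ys, centers, width):
    """Fit the linear output weights by least squares (normal equations
    solved with Gaussian elimination); centers and width are fixed."""
    phi = [[gaussian(x, c, width) for c in centers] for x in xs]
    n = len(centers)
    a = [[sum(phi[k][i] * phi[k][j] for k in range(len(xs)))
          for j in range(n)] for i in range(n)]
    b = [sum(phi[k][i] * ys[k] for k in range(len(xs))) for i in range(n)]
    for i in range(n):                       # forward elimination, partial pivot
        p = max(range(i, n), key=lambda r: abs(a[r][i]))
        a[i], a[p] = a[p], a[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = a[r][i] / a[i][i]
            for j in range(i, n):
                a[r][j] -= f * a[i][j]
            b[r] -= f * b[i]
    w = [0.0] * n
    for i in reversed(range(n)):             # back substitution
        w[i] = (b[i] - sum(a[i][j] * w[j] for j in range(i + 1, n))) / a[i][i]
    return w

def predict(x, centers, width, w):
    # Linear combination of Gaussian hidden-unit activations.
    return sum(wi * gaussian(x, c, width) for wi, c in zip(w, centers))
```

With centers placed at the training points, the least-squares fit reduces to exact interpolation, which is the classical RBF setting.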
Procedia PDF Downloads 501
608 A Fast Algorithm for Electromagnetic Compatibility Estimation for Radio Communication Network Equipment in a Complex Electromagnetic Environment
Authors: C. Temaneh-Nyah
Abstract:
Electromagnetic compatibility (EMC) is the ability of Radio Communication Equipment (RCE) to operate with a desired quality of service in a given Electromagnetic Environment (EME) without creating harmful interference to other RCE. This paper presents an algorithm that improves the simulation speed of estimating the EMC of RCE in a complex EME, based on a stage-by-stage frequency-energy filtering criterion. The algorithm considers different interference types, including blocking and intermodulation, and consists of the following steps: a simplified energy criterion, in which filtering is based on comparing the free-space interference level to the industrial noise level; a frequency criterion, which checks whether the interfering emission's characteristics overlap with the receiver's channel characteristics; and lastly a detailed energy criterion, in which the real channel interference level is compared to the noise level. At each of these stages, some interference cases are filtered out by the relevant criterion. This reduces the total number of pairwise combinations of RCE involved in the tedious detailed energy analysis and thus improves simulation speed.
Keywords: electromagnetic compatibility, electromagnetic environment, simulation of communication network
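The three-stage filtering the abstract describes can be sketched as successive list filters; the field names and threshold logic below are illustrative assumptions, not the paper's data model:

```python
def stage_filter(pairs, industrial_noise, channel_noise):
    """Stage-by-stage frequency-energy filtering of candidate
    interference pairs (levels in dB; fields are hypothetical)."""
    # Stage 1: simplified energy criterion -- drop pairs whose free-space
    # interference level is already below the industrial noise floor.
    stage1 = [p for p in pairs if p["free_space_level"] > industrial_noise]
    # Stage 2: frequency criterion -- keep only pairs whose emission
    # spectrum overlaps the victim receiver's channel.
    stage2 = [p for p in stage1 if p["overlaps_channel"]]
    # Stage 3: detailed energy criterion -- compare the real channel
    # interference level to the receiver noise level.
    return [p for p in stage2 if p["channel_level"] > channel_noise]
```

Only the pairs surviving all three stages need the expensive detailed energy analysis, which is where the speed-up comes from.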
Procedia PDF Downloads 218
607 A Comparative Study of Generalized Autoregressive Conditional Heteroskedasticity (GARCH) and Extreme Value Theory (EVT) Model in Modeling Value-at-Risk (VaR)
Authors: Longqing Li
Abstract:
The paper addresses the inefficiency of the classical model in measuring Value-at-Risk (VaR) using a normal distribution or a Student's t distribution. Specifically, the paper focuses on the one-day-ahead VaR of major stock markets' daily returns in the US, UK, China, and Hong Kong over the most recent ten years, at the 95% confidence level. To improve predictive power and search for the best performing model, the paper proposes two leading alternatives, Extreme Value Theory (EVT) and a family of GARCH models, and compares their relative performance. The main contribution can be summarized in two aspects. First, the paper extends the GARCH family by incorporating EGARCH and TGARCH to shed light on their differences in estimating one-day-ahead VaR. Second, to account for the non-normality of financial market distributions, the paper applies the Generalized Error Distribution (GED), instead of the normal distribution, to govern the innovation term. A dynamic back-testing procedure is employed to assess the performance of each model, the GARCH family and the conditional EVT. The conclusion is that exponential GARCH yields the best out-of-sample one-day-ahead VaR forecasts. Moreover, the performance discrepancy between the GARCH family and the conditional EVT is indistinguishable.
Keywords: Value-at-Risk, Extreme Value Theory, conditional EVT, backtesting
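A simplified sketch of one-day-ahead GARCH(1,1) VaR with standard-normal innovations; the paper itself fits EGARCH/TGARCH with GED innovations, and the parameters here are assumed known rather than estimated:

```python
from statistics import NormalDist

def garch11_var(returns, omega, alpha, beta, level=0.95):
    """One-day-ahead VaR under a GARCH(1,1) volatility recursion with
    standard-normal innovations (a simplification of the paper's
    EGARCH/TGARCH-with-GED setup)."""
    # Start the recursion at the unconditional variance omega/(1-alpha-beta).
    var_t = omega / (1.0 - alpha - beta)
    for r in returns:
        # sigma^2_{t+1} = omega + alpha * r_t^2 + beta * sigma^2_t
        var_t = omega + alpha * r * r + beta * var_t
    sigma = var_t ** 0.5                    # forecast volatility sigma_{t+1}
    z = NormalDist().inv_cdf(1.0 - level)   # left-tail quantile, e.g. -1.645
    return -z * sigma                       # VaR reported as a positive loss
```

Swapping the normal quantile for a GED quantile, or the variance recursion for the EGARCH log-variance recursion, recovers the specifications the paper compares.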
Procedia PDF Downloads 321
606 Presentation of a Mix Algorithm for Estimating the Battery State of Charge Using Kalman Filter and Neural Networks
Authors: Amin Sedighfar, M. R. Moniri
Abstract:
Determination of state of charge (SOC) has become an increasingly important issue in all applications that include a battery. In fact, SOC estimation is a fundamental need for batteries, which are the most important energy storage elements in Hybrid Electric Vehicles (HEVs), smart grid systems, drones, UPS systems, and so on. For these applications, the SOC estimation algorithm is expected to be precise and easy to implement. This paper presents an online method for estimating the SOC of Valve-Regulated Lead Acid (VRLA) batteries. The proposed method uses the well-known Kalman Filter (KF) and Neural Networks (NNs), and all simulations have been done with MATLAB software. The NN is trained offline using data collected from the battery discharging process. A generic cell model is used whose underlying dynamic behavior comprises two capacitors (bulk and surface) and three resistors (terminal, surface, and end), where the SOC determined from the voltage represents the bulk capacitor. The aim of this work is to compare the performance of conventional integration-based SOC estimation methods with the mixed algorithm. Moreover, by incorporating the effect of temperature, the final result becomes more accurate.
Keywords: Kalman filter, neural networks, state-of-charge, VRLA battery
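A minimal scalar sketch of the Kalman-filter half of such a mixed algorithm: a coulomb-counting prediction followed by a voltage-measurement update. The linear OCV model and noise values are illustrative assumptions, not the paper's VRLA cell model (which has two capacitors and three resistors):

```python
def kf_soc_step(soc, p, current_a, dt_s, v_meas,
                capacity_ah=2.0, q=1e-7, r=1e-4,
                ocv_slope=0.8, ocv_offset=3.2):
    """One predict/update cycle of a scalar Kalman filter for SOC.
    The linear OCV model v = ocv_offset + ocv_slope * soc and all
    noise covariances are hypothetical, for illustration only."""
    # Predict: coulomb counting (discharge current positive).
    soc_pred = soc - current_a * dt_s / (capacity_ah * 3600.0)
    p_pred = p + q                                  # add process noise
    # Update: correct the prediction with the voltage measurement.
    h = ocv_slope                                   # measurement Jacobian
    k = p_pred * h / (h * p_pred * h + r)           # Kalman gain
    innov = v_meas - (ocv_offset + ocv_slope * soc_pred)
    soc_new = soc_pred + k * innov
    p_new = (1.0 - k * h) * p_pred                  # covariance shrinks
    return soc_new, p_new
```

In the mixed scheme, an offline-trained NN would refine the measurement model (e.g., the OCV-SOC map, including temperature), while the KF handles the online recursion.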
Procedia PDF Downloads 192
605 The Experiences of Agency in the Utilization of Twitter for English Language Learning in a Saudi EFL Context
Authors: Fahd Hamad Alqasham
Abstract:
This longitudinal study investigates Saudi students' trajectory of use and experiences of Twitter as an innovative tool for in-class English language learning in a Saudi tertiary English as a foreign language (EFL) context over a 12-week semester. The study adopted van Lier's agency theory (2008, 2010) as the analytical framework to obtain an in-depth analysis of how the learners could utilize Twitter to create innovative ways to engage in English learning inside the language classroom. The study implemented a mixed-methods approach with six data collection instruments: a research log, observations, focus groups, initial and post-project interviews, and a post-project questionnaire. The study was conducted at Qassim University, specifically in the Preparatory Year Program (PYP) on the main campus. The sample included 25 male students studying in the first level of the PYP. The findings revealed that although Twitter's affordances initially played a crucial role in motivating the learners to exercise their agency inside the classroom to learn English, contextual constraints, mainly anxiety, the university infrastructure, and the teacher's role, negatively influenced the sustainability of Twitter's use beyond week nine of its implementation.
Keywords: CALL, agency, innovation, EFL, language learning
Procedia PDF Downloads 72
604 Evaluation of Nutrition Supplement on Body Composition during Catch-Up Growth, in a Pre-Clinical Model of Growth Restriction
Authors: Bindya Jacob
Abstract:
The aim of the present study was to assess the quality of catch-up growth induced by an Oral Nutrition Supplement (ONS) in an animal model of growth restriction due to undernutrition. The quality of catch-up growth was assessed by the proportions of lean body mass (LBM) and fat mass (FM). Young SD rats were food-restricted at 70% of normal caloric intake for 4 weeks and refed at 120% of normal caloric intake for 4 weeks. The refeeding diet provided 50% of calories from the animal diet and 50% from an ONS formulated for optimal growth. After refeeding, the quantity and quality of catch-up growth were measured, including weight, length, LBM, and FM. During nutrient restriction, body weight and length were reduced compared to healthy controls. Both LBM and FM were significantly lower than in healthy controls (p < 0.001). Refeeding with the ONS resulted in increases in weight and length, with significant catch-up growth compared to baseline (p < 0.001). Detailed examination of body composition showed that the catch-up in body weight was due to a proportionate increase of LBM and FM, resulting in a final body composition similar to healthy controls. These data support the use of well-designed ONS for recovery from growth restriction due to undernutrition and a return to the normal growth trajectory, characterized by a normal ratio of lean to fat mass.
Keywords: catch up growth, body composition, nutrient restriction, healthy growth
Procedia PDF Downloads 438
603 Software Reliability Prediction Model Analysis
Authors: Lela Mirtskhulava, Mariam Khunjgurua, Nino Lomineishvili, Koba Bakuria
Abstract:
Software reliability prediction provides a great opportunity to measure the software failure rate at any point throughout system testing. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. We focus on a software reliability model in this article, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence, which consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
Keywords: exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability
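A small sketch under stated assumptions: the exponential DF of inter-failure times, plus a Monte Carlo estimate of the total transmission time of a block sequence when each failure triggers a retransmission of the current block (the retry model here is assumed for illustration, not taken from the paper):

```python
import math
import random

def failure_df(t, lam):
    """DF of the time between adjacent failures: F(t) = 1 - exp(-lam * t),
    with mean time to failure 1/lam."""
    return 1.0 - math.exp(-lam * t)

def simulate_sequence_time(lam_fail, block_time, n_blocks, rng):
    """Total transmission time of a sequence of basic blocks when every
    failure forces retransmission of the current block (illustrative
    time-redundancy model)."""
    total = 0.0
    for _ in range(n_blocks):
        while True:
            t_fail = rng.expovariate(lam_fail)  # exponential time to failure
            if t_fail >= block_time:            # block transmitted intact
                total += block_time
                break
            total += t_fail                     # failure mid-block: retry
    return total
```

Averaging `simulate_sequence_time` over many runs (and drawing `n_blocks` at random) gives an empirical estimate of the DF the paper derives analytically.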
Procedia PDF Downloads 464