Search results for: digital surface model (DSM)
6397 Development of a System for Fitting Clothes and Accessories Using Augmented Reality
Authors: Dinmukhamed T., Vassiliy S.
Abstract:
This article proposes a system for fitting clothes and accessories based on augmented reality. A logical data model has been developed, taking into account the decision-making module (colors, style, type, material, popularity, etc.) based on personal data (age, gender, weight, height, leg size, hoist length, geolocation, photogrammetry, number of purchases of certain types of clothing, etc.) and statistical data from the purchase history (number of items, price, size, color, style, etc.). In addition, to present this information to the user, it is planned to develop an augmented reality system using a QR code. This system for selecting and fitting clothing and accessories based on augmented reality will be used in stores to reduce the time a buyer needs to decide on a choice of clothes.
Keywords: augmented reality, online store, decision-making module, QR code, clothing store, queue
Procedia PDF Downloads 157
6396 A Preliminary Study for Building an Arabic Corpus of Pair Questions-Texts from the Web: Aqa-Webcorp
Authors: Wided Bakari, Patrice Bellot, Mahmoud Neji
Abstract:
With the development of electronic media and the heterogeneity of Arabic data on the Web, the idea of building a clean corpus for certain applications of natural language processing, including machine translation, information retrieval, and question answering, has become more and more pressing. In this manuscript, we seek to create and develop our own corpus of question-text pairs. This corpus will then provide a better basis for our experimentation step. We model its construction with a method, tailored to Arabic, that recovers texts from the web that could prove to be answers to our factual questions. To do this, we developed a script that extracts a list of HTML pages for a given query. These pages are then cleaned so as to obtain a database of texts and a corpus of question-text pairs. In addition, we give preliminary results of our proposed method. Some investigations into the construction of Arabic corpora are also presented in this document.
Keywords: Arabic, web, corpus, search engine, URL, question, corpus building, script, Google, html, txt
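The cleaning step, stripping fetched HTML pages down to plain text, can be sketched with Python's standard-library `html.parser`. This is an illustrative reimplementation, not the authors' script; the class and function names are assumptions.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        # Keep only non-empty text outside script/style regions
        if not self._skip and data.strip():
            self.parts.append(data.strip())

def html_to_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)

page = "<html><head><style>p{color:red}</style></head><body><p>Answer text.</p></body></html>"
print(html_to_text(page))  # -> Answer text.
```

The cleaned strings can then be paired with the originating questions to populate the question-text corpus.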
Procedia PDF Downloads 323
6395 Drying Shrinkage of Concrete: Scale Effect and Influence of Reinforcement
Authors: Qier Wu, Issam Takla, Thomas Rougelot, Nicolas Burlion
Abstract:
In the framework of French underground disposal of intermediate level radioactive wastes, concrete is widely used as a construction material for containers and tunnels. Drying shrinkage is one of the most disadvantageous phenomena of concrete structures. Cracks generated by differential shrinkage could impair the mechanical behavior, increase the permeability of concrete and act as a preferential path for aggressive species, hence leading to an overall decrease in durability and serviceability. It is of great interest to understand the drying shrinkage phenomenon in order to predict and even to control the strains of concrete. The question is whether the results obtained from laboratory samples are in accordance with the measurements on a real structure. Another question concerns the influence of reinforcement on drying shrinkage of concrete. As part of a global project with Andra (French National Radioactive Waste Management Agency), the present study aims to experimentally investigate the scale effect as well as the influence of reinforcement on the development of drying shrinkage of two high performance concretes (based on CEM I and CEM V cements, according to European standards). Various sizes of samples are chosen, from ordinary laboratory specimens up to real-scale specimens: prismatic specimens with different volume-to-surface (V/S) ratios, thin slices (thickness of 2 mm), cylinders with different sizes (37 and 160 mm in diameter), hollow cylinders, cylindrical columns (height of 1000 mm) and square columns (320×320×1000 mm). The square columns have been manufactured with different reinforcement rates and can be considered as mini-structures, to approximate the behavior of a real voussoir from the waste disposal facility. All the samples are kept, in a first stage, at 20°C and 50% of relative humidity (initial conditions in the tunnel) in a specific climatic chamber developed by the Laboratory of Mechanics of Lille. 
The mass evolution and the drying shrinkage are monitored regularly. The results obtained show that specimen size has a great impact on the water loss and drying shrinkage of concrete: specimens with a smaller V/S ratio and a smaller size exhibit greater drying shrinkage. The correlation between mass variation and drying shrinkage follows the same tendency for all specimens in spite of the size difference. However, the influence of the reinforcement rate on drying shrinkage is not clear from the present results. The second stage of conservation (50°C and 30% relative humidity) could provide additional insight into these influences.
Keywords: concrete, drying shrinkage, mass evolution, reinforcement, scale effect
Procedia PDF Downloads 183
6394 Privacy Preserving Data Publishing Based on Sensitivity in Context of Big Data Using Hive
Authors: P. Srinivasa Rao, K. Venkatesh Sharma, G. Sadhya Devi, V. Nagesh
Abstract:
Privacy preserving data publication is a major concern at present because the amount of data being published through the internet is increasing day by day; on account of its size, this data has been termed Big Data. This project addresses privacy preservation in the context of Big Data using a data warehousing solution called Hive. We implemented Nearest Similarity Based clustering (NSB) with bottom-up generalization to achieve (v,l)-anonymity. (v,l)-Anonymity deals with sensitivity vulnerabilities and ensures individual privacy. We also calculate the sensitivity levels by a simple comparison method using the index values, classifying the different levels of sensitivity. The experiments were carried out in the Hive environment to verify the efficiency of the algorithms with Big Data. The framework also supports the execution of existing algorithms without any changes. The model in the paper outperforms existing models.
Keywords: sensitivity, sensitive level, clustering, Privacy Preserving Data Publication (PPDP), bottom-up generalization, Big Data
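A minimal sketch of the anonymity check, under the assumption that v is the minimum equivalence-class size and l the minimum number of distinct sensitive values per class (the abstract does not define the parameters, so this interpretation and all field names are assumptions):

```python
def satisfies_vl_anonymity(clusters, v, l):
    """Assumed check: every cluster must contain at least v records
    and at least l distinct sensitive values."""
    for records in clusters:
        sensitive_values = {r["disease"] for r in records}
        if len(records) < v or len(sensitive_values) < l:
            return False
    return True

clusters = [
    [{"age": 34, "disease": "flu"}, {"age": 36, "disease": "asthma"}],
    [{"age": 51, "disease": "flu"}, {"age": 55, "disease": "flu"}],
]
# Second cluster has only one distinct sensitive value, so the check fails
print(satisfies_vl_anonymity(clusters, v=2, l=2))  # -> False
```

In a pipeline like the one described, a generalization step would be repeated until every published equivalence class passes such a check.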
Procedia PDF Downloads 295
6393 Forecasting Exchange Rate between Thai Baht and the US Dollar Using Time Series Analysis
Authors: Kunya Bowornchockchai
Abstract:
The objective of this research is to forecast the monthly exchange rate between the Thai baht and the US dollar and to compare two forecasting methods: the Box-Jenkins method and Holt's method. Results show that the Box-Jenkins method is the more suitable of the two for this series. The selected model is ARIMA(1,1,0) without constant, with the forecasting equation Y_t = Y_{t-1} + 0.3691(Y_{t-1} - Y_{t-2}), where Y_t is the value of the time series at time t.
Keywords: Box-Jenkins method, Holt's method, mean absolute percentage error (MAPE), exchange rate
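The one-step-ahead forecast of an ARIMA(1,1,0) model without constant is simple enough to compute directly. A minimal sketch, using the coefficient reported in the abstract but illustrative (not the study's) exchange-rate values:

```python
def arima_110_forecast(series, phi=0.3691):
    """One-step-ahead forecast for ARIMA(1,1,0) without constant:
    Y_t = Y_{t-1} + phi * (Y_{t-1} - Y_{t-2})."""
    y1, y2 = series[-1], series[-2]
    return y1 + phi * (y1 - y2)

rates = [32.10, 32.50]  # hypothetical THB/USD values, not the study's data
print(round(arima_110_forecast(rates), 4))  # 32.50 + 0.3691 * 0.40 -> 32.6476
```

The model applies the AR(1) coefficient to the last observed difference, so an upward move of 0.40 baht carries forward as a further rise of about 0.148 baht.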
Procedia PDF Downloads 254
6392 Fast and Robust Long-term Tracking with Effective Searching Model
Authors: Thang V. Kieu, Long P. Nguyen
Abstract:
Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, this algorithm is not robust in cases where the object is lost through a sudden change of direction, occlusion, or going out of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning that analyzes the response map of each frame, and a classification algorithm using random ferns for a reliable target re-locating mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the experimental results indicated that the precision and success rate of the proposed algorithm were 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.
Keywords: correlation filter, long-term tracking, random fern, real-time tracking
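One common statistic for raising a target-loss warning from a correlation response map (not necessarily the authors' exact criterion) is the peak-to-sidelobe ratio (PSR): a sharp, isolated peak yields a high PSR, while a diffuse response suggests the target is lost. A minimal sketch on a toy 5x5 map:

```python
import statistics

def peak_to_sidelobe_ratio(response, exclude=1):
    """PSR of a 2D response map: (peak - sidelobe mean) / sidelobe std.
    Cells within `exclude` of the peak are left out of the sidelobe."""
    flat = [(v, r, c) for r, row in enumerate(response) for c, v in enumerate(row)]
    peak, pr, pc = max(flat)
    sidelobe = [v for v, r, c in flat if abs(r - pr) > exclude or abs(c - pc) > exclude]
    mu = statistics.mean(sidelobe)
    sigma = statistics.pstdev(sidelobe)
    return (peak - mu) / sigma if sigma else float("inf")

# Toy response: mild checkerboard noise with a clear peak in the centre
resp = [[0.2 if (r + c) % 2 else 0.0 for c in range(5)] for r in range(5)]
resp[2][2] = 1.0
print(round(peak_to_sidelobe_ratio(resp), 2))  # 9.0: a confident detection
```

A tracker would compare the PSR of each frame against a threshold and trigger the re-detection (random fern) stage when it drops.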
Procedia PDF Downloads 138
6391 Optimizing Human Diet Problem Using Linear Programming Approach: A Case Study
Authors: P. Priyanka, S. Shruthi, N. Guruprasad
Abstract:
Health is a common theme in most cultures; in fact, all communities have their own concepts of health as part of their culture, yet health continues to be a neglected entity. Planning of the human diet should be done very carefully, selecting the food items or groups of food items as well as the composition involved. Low price and good taste of foods are regarded as the two major factors for optimal human nutrition. Linear programming techniques have been used extensively for human diet formulation for quite a number of years. In the process, we mainly apply the simplex method, a very useful tool based on the theory of elementary row operations from linear algebra, together with the additional rules the method prescribes, to solve the problem. This study is an attempt to develop a programming model for optimal planning and best use of nutrient ingredients.
Keywords: diet formulation, linear programming, nutrient ingredients, optimization, simplex method
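A full simplex implementation is beyond an abstract, but for two foods the same LP optimum can be found by enumerating the vertices of the feasible region, since a linear program attains its minimum at a vertex. A minimal sketch with entirely hypothetical costs and nutrient values (not the case study's data):

```python
from itertools import combinations

def solve_diet_lp(costs, nutrients, minima):
    """Minimise costs . x subject to nutrients @ x >= minima and x >= 0,
    for TWO foods, by enumerating vertices of the feasible region."""
    # Each nutrient constraint boundary is a line a1*x + a2*y = b; add the axes.
    lines = [(a[0], a[1], b) for a, b in zip(nutrients, minima)]
    lines += [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
    best = None
    for (a1, a2, b1), (c1, c2, b2) in combinations(lines, 2):
        det = a1 * c2 - a2 * c1
        if abs(det) < 1e-12:
            continue  # parallel lines: no intersection vertex
        x = (b1 * c2 - b2 * a2) / det   # Cramer's rule
        y = (a1 * b2 - c1 * b1) / det
        feasible = x >= -1e-9 and y >= -1e-9 and all(
            a[0] * x + a[1] * y >= m - 1e-9 for a, m in zip(nutrients, minima))
        if feasible:
            cost = costs[0] * x + costs[1] * y
            if best is None or cost < best[0]:
                best = (cost, x, y)
    return best

# Hypothetical: two foods, rows = (protein, energy) per unit, minima per day
cost, x, y = solve_diet_lp(costs=(0.5, 0.7),
                           nutrients=[(2.0, 7.0), (30.0, 25.0)],
                           minima=(14.0, 120.0))
print(round(cost, 5))  # 2.31875, at roughly 3.06 units of food 1 and 1.13 of food 2
```

The simplex method reaches the same vertex by pivoting between adjacent vertices instead of enumerating them all, which is what makes it scale to many foods and nutrients.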
Procedia PDF Downloads 558
6390 Robot Movement Using the Trust Region Policy Optimization
Authors: Romisaa Ali
Abstract:
The policy gradient approach is a family of deep reinforcement learning methods that combines deep neural networks (DNN) with reinforcement learning (RL) to discover the optimum of a control problem through experience gained from the interaction between the robot and its surroundings. Earlier policy gradient algorithms could not handle the over- or under-estimation errors introduced by the deep neural network model. This article discusses a state-of-the-art (SOTA) policy gradient technique, trust region policy optimization (TRPO), applying it in various environments and comparing it with another policy gradient method, proximal policy optimization (PPO), to explain their robust optimization. Experience data is gathered during various training phases after observing the impact of hyper-parameters on neural network performance.
Keywords: deep neural networks, deep reinforcement learning, proximal policy optimization, state-of-the-art, trust region policy optimization
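The contrast between the two methods is easiest to see in PPO's clipped surrogate objective: where TRPO enforces a KL-divergence trust region as a hard constraint, PPO simply clips the policy probability ratio. A minimal sketch with made-up sample values:

```python
def ppo_clipped_objective(ratios, advantages, eps=0.2):
    """Mean of min(r*A, clip(r, 1-eps, 1+eps)*A) over samples: the clipped
    surrogate that PPO maximises in place of TRPO's KL trust-region constraint."""
    total = 0.0
    for r, a in zip(ratios, advantages):
        clipped = max(1.0 - eps, min(r, 1.0 + eps))
        total += min(r * a, clipped * a)
    return total / len(ratios)

# Ratio 1.5 with a positive advantage is clipped to 1.2; ratio 0.9 lies inside the band
print(ppo_clipped_objective([1.5, 0.9], [1.0, -2.0]))  # ~= (1.2*1 + 0.9*(-2)) / 2 = -0.3
```

The clipping removes the incentive to push the new policy far from the old one, which is the same stability goal TRPO pursues with its explicit trust region.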
Procedia PDF Downloads 169
6389 An Adjoint-Based Method to Compute Derivatives with Respect to Bed Boundary Positions in Resistivity Measurements
Authors: Mostafa Shahriari, Theophile Chaumont-Frelet, David Pardo
Abstract:
Resistivity measurements are used to characterize the Earth’s subsurface. They are categorized into two different groups: (a) those acquired on the Earth’s surface, for instance, controlled source electromagnetic (CSEM) and Magnetotellurics (MT), and (b) those recorded with borehole logging instruments such as Logging-While-Drilling (LWD) devices. LWD instruments are mostly used for geo-steering purposes, i.e., to adjust dip and azimuthal angles of a well trajectory to drill along a particular geological target. Modern LWD tools measure all nine components of the magnetic field corresponding to three orthogonal transmitter and receiver orientations. In order to map the Earth’s subsurface and perform geo-steering, we invert measurements using a gradient-based method that utilizes the derivatives of the recorded measurements with respect to the inversion variables. For resistivity measurements, these inversion variables are usually the constant resistivity value of each layer and the bed boundary positions. It is well-known how to compute derivatives with respect to the constant resistivity value of each layer using semi-analytic or numerical methods. However, similar formulas for computing the derivatives with respect to bed boundary positions are unavailable. The main contribution of this work is to provide an adjoint-based formulation for computing derivatives with respect to the bed boundary positions. The key idea to obtain the aforementioned adjoint state formulations for the derivatives is to separate the tangential and normal components of the field and treat them differently. This formulation allows us to compute the derivatives faster and more accurately than with traditional finite differences approximations. In the presentation, we shall first derive a formula for computing the derivatives with respect to the bed boundary positions for the potential equation. Then, we shall extend our formulation to 3D Maxwell’s equations. 
Finally, by considering a 1D domain and reducing the dimensionality of the problem, which is a common practice in the inversion of resistivity measurements, we shall derive a formulation to compute the derivatives of the measurements with respect to the bed boundary positions using a 1.5D variational formulation. Then, we shall illustrate the accuracy and convergence properties of our formulations by comparing numerical results with the analytical derivatives for the potential equation. For the 1.5D Maxwell's system, we shall compare our numerical results based on the proposed adjoint-based formulation vs those obtained with a traditional finite difference approach. Numerical results shall show that our proposed adjoint-based technique produces enhanced accuracy solutions while its cost is negligible, as opposed to the finite difference approach that requires the solution of one additional problem per derivative.
Keywords: inverse problem, bed boundary positions, electromagnetism, potential equation
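The finite-difference baseline that the adjoint method is compared against can be sketched in a few lines; the point is that each derivative costs one or two extra forward solves, which is what the adjoint formulation avoids. The function here is a stand-in, not a resistivity measurement:

```python
def central_difference(f, x, h=1e-5):
    """O(h^2) central finite-difference approximation of df/dx,
    costing two extra evaluations of f per inversion variable."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = lambda x: x ** 3          # stand-in for a measurement as a function of a bed boundary
exact = 3.0 * 2.0 ** 2        # analytic derivative of x^3 at x = 2
approx = central_difference(f, 2.0)
print(abs(approx - exact) < 1e-6)  # True: accurate, but at the cost of extra solves
```

For a forward model that requires solving Maxwell's equations, each such extra evaluation is a full simulation, so the per-derivative cost dominates; the adjoint approach obtains all derivatives from one additional adjoint solve.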
Procedia PDF Downloads 178
6388 The End Is Just the Beginning: The Importance of Project Post-Implementation Reviews
Authors: Catalin-Teodor Dogaru, Ana-Maria Dogaru
Abstract:
Success means different things to different people. For us project managers, it is even harder to find a definition: many factors have to be included in the evaluation, and the literature is not very helpful, lacking consensus and neutrality. Post-implementation reviews (PIR) can be an efficient tool for evaluating how things worked on a certain project. Despite visible progress, PIR is not yet a well-detailed subject and there is no common understanding of it, which may be why some organizations include it in the project lifecycle and some do not. Through this paper, we point out the reasons why all project managers should pay proper attention to this important step and to the elements that can be assessed beyond the already famous triple constraint of scope, cost, and time. It is essential to note that a PIR is not a checklist: it provides an edge by eliminating subjectivity and judging projects based on actual proof. Based on our experience, the success indicator model presented in this paper contributes to the success of the project! At the same time, it increases trust among customers, who will perceive success more objectively.
Keywords: project, post implementation, review, success, indicators
Procedia PDF Downloads 371
6387 An Evaluation of a First Year Introductory Statistics Course at a University in Jamaica
Authors: Ayesha M. Facey
Abstract:
The evaluation sought to determine the factors associated with the high failure rate among students taking a first-year introductory statistics course. Utilizing Tyler's objective-based model, the main objectives were: to assess the effectiveness of the lecturer's teaching strategies; to determine the proportion of students who attend lectures and tutorials frequently and the impact of infrequent attendance on performance; to determine how the assigned activities assisted students' understanding of the course content; to ascertain the possible issues faced by students in understanding the course material and obtain possible solutions to these challenges; and to determine whether the learning outcomes had been achieved, based on an assessment of the second in-course examination. A quantitative survey research strategy was employed, and the study population was students enrolled in semester one of the academic year 2015/2016. A convenience sampling approach was employed, resulting in a sample of 98 students. Primary data was collected using self-administered questionnaires over a one-week period. Secondary data was obtained from the results of the second in-course examination. Data were entered and analyzed in SPSS version 22, and both univariate and bivariate analyses were conducted on the information obtained from the questionnaires. Univariate analyses provided a description of the sample through means, standard deviations and percentages, while bivariate analyses were done using Spearman's rho correlation coefficient and chi-square analyses. For the secondary data, an item analysis was performed to obtain the reliability of the examination questions, the difficulty index and the discrimination index. The examination results also provided information on the weak areas of the students and highlighted the learning outcomes that were not achieved.
Findings revealed that students were more likely to participate in lectures than tutorials and that attendance was high for both. There was a significant relationship between participation in lectures and performance on the examination. However, a high proportion of students had been absent from three or more tutorials as well as lectures. A higher proportion of students indicated that they sometimes completed the assignments given in lectures, while they rarely completed tutorial worksheets. Students who were more likely to complete their assignments were significantly more likely to perform well on their examination. Additionally, students faced a number of challenges in understanding the course content; probability, the binomial distribution and the normal distribution were the most challenging topics, and the item analysis also highlighted these topics as problem areas. Difficulty with mathematics, application and analysis were the major challenges faced by students, and most students indicated that some of these challenges could be alleviated if additional examples were worked in lectures and they were given more time to solve questions. Analysis of the examination results showed that a number of learning outcomes were not achieved for several topics. Based on the findings, recommendations were made to adjust grade allocations, the delivery of lectures and the methods of assessment.
Keywords: evaluation, item analysis, Tyler's objective based model, university statistics
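The two item-analysis statistics used here have standard textbook forms: the difficulty index is the proportion of examinees answering an item correctly, and the discrimination index contrasts the top and bottom scorers (commonly the top and bottom 27%). A minimal sketch with hypothetical scores, not the study's data:

```python
def difficulty_index(responses):
    """Proportion of examinees answering the item correctly (0/1 scores)."""
    return sum(responses) / len(responses)

def discrimination_index(responses, totals, frac=0.27):
    """Upper-group difficulty minus lower-group difficulty,
    using the top and bottom `frac` of examinees ranked by total score."""
    ranked = [r for _, r in sorted(zip(totals, responses))]
    k = max(1, round(frac * len(ranked)))
    lower, upper = ranked[:k], ranked[-k:]
    return sum(upper) / k - sum(lower) / k

item = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]              # hypothetical 0/1 scores on one item
scores = [55, 62, 30, 70, 25, 80, 65, 40, 35, 75]  # hypothetical exam totals
print(difficulty_index(item))                      # 0.6: a moderately difficult item
print(discrimination_index(item, scores))          # 1.0: item perfectly separates groups
```

An item with a very low difficulty index or a near-zero discrimination index is exactly the kind of "problem area" the abstract's analysis would flag.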
Procedia PDF Downloads 190
6386 Finite Element Analysis of RC Frames with Retrofitted Infill Walls
Authors: M. Ömer Timurağaoğlu, Adem Doğangün, Ramazan Livaoğlu
Abstract:
The evaluation of the performance of infilled reinforced concrete (RC) frames has been a significant challenge for engineers, and the strengthening of infill walls is an important means of enhancing the behavior of RC infilled frames. The aim of this study is to investigate the behaviour of retrofitted infill walls of RC frames using finite element analysis. For this purpose, a one-storey, one-bay infilled RC frame and a strengthened infilled RC frame, which have the same geometry and material properties as the frames tested in the laboratory, are modelled using different analytical approaches. A fibrous material is used to strengthen the infill walls and frame. The results of the finite element analysis are then evaluated to determine whether these analytical approaches estimate the behavior correctly. The infilled and strengthened infilled RC frames are modelled in the finite element program ABAQUS. Finally, the data obtained from the nonlinear finite element analysis is compared with the experimental results.
Keywords: finite element analysis, infilled RC frames, infill wall, strengthening
Procedia PDF Downloads 529
6385 Text Similarity in Vector Space Models: A Comparative Study
Authors: Omid Shahmirzadi, Adam Lugowski, Kenneth Younge
Abstract:
Automatic measurement of semantic text similarity is an important task in natural language processing. In this paper, we evaluate the performance of different vector space models on this task. We address the real-world problem of modeling patent-to-patent similarity and compare TFIDF (and related extensions), topic models (e.g., latent semantic indexing), and neural models (e.g., paragraph vectors). Contrary to expectations, the added computational cost of text embedding methods is justified only when: 1) the target text is condensed; and 2) the similarity comparison is trivial. TFIDF performs surprisingly well in other cases: in particular for longer and more technical texts or for making finer-grained distinctions between nearest neighbors. Unexpectedly, extensions to the TFIDF method, such as adding noun phrases or calculating term weights incrementally, were not helpful in our context.
Keywords: big data, patent, text embedding, text similarity, vector space model
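The TFIDF baseline can be sketched in pure Python; this is one common TF-IDF variant (raw counts times smoothed IDF) with cosine similarity, not necessarily the paper's exact weighting, and the three "patent" snippets are invented:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Sparse TF-IDF vectors as dicts: raw term count times log(N/df) + 1."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d.split()))
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    return [{t: c * idf[t] for t, c in Counter(d.split()).items()} for d in docs]

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

docs = ["battery electrode assembly", "battery electrode design", "hydraulic brake valve"]
v = tfidf_vectors(docs)
print(cosine(v[0], v[1]) > cosine(v[0], v[2]))  # True: related texts score higher
```

Ranking nearest neighbors by this cosine score is the "finer-grained distinctions" setting in which the paper finds TFIDF hard to beat.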
Procedia PDF Downloads 175
6384 Solution of Insurance Pricing Model Giving Optimum Premium Level for Both Insured and Insurer by Game Theory
Authors: Betul Zehra Karagul
Abstract:
A game consists of the strategies available to each actor and the rules that govern how those strategies may be chosen, how the actors evaluate their knowledge, and the utility of the resulting outcomes. Game theory examines human behavior (preferences) in strategic situations in which each actor of a game considers the actions that others will make alongside his own moves. In a game with a finite number of players, each choosing among strategies with certain probabilities, there is a profile of strategies from which no player can benefit by deviating unilaterally; this is called a Nash equilibrium. Insurance is a two-person game in which the insurer and the insured are the actors, and both sides act in favor of their own utility functions. The insured has to pay a premium to buy the insurance cover; the insured will want to pay a low premium, while the insurer wants to receive a high premium. In this study, the state of equilibrium for insurance pricing is examined from the point of view of both the insurer and the insured using game theory.
Keywords: game theory, insurance pricing, Nash equilibrium, utility function
Procedia PDF Downloads 362
6383 Correlation between Consumer Knowledge of the Circular Economy and Consumer Behavior towards Its Application: A Canadian Exploratory Study
Authors: Christopher E. A. Ramsey, Halia Valladares Montemayor
Abstract:
This study examined whether the dissemination of information about the circular economy (CE) has any bearing on the likelihood of the implementation of its concepts on an individual basis. Specifically, the goal of this research study was to investigate the impact of consumer knowledge about the circular economy on their behavior in applying such concepts. Given that our current linear supply chains are unsustainable, it is of great importance that we understand what mechanisms are most effective in encouraging consumers to embrace CE. The theoretical framework employed was the theory of planned behavior (TPB). TPB, with its analysis of how attitude, subjective norms, and perceived behavioral control affect intention, provided an adequate model for testing the effects of increased information about the CE on the implementation of its recommendations. The empirical research consisted of a survey distributed among university students, faculty, and staff at a Canadian University in British Columbia.
Keywords: circular economy, consumer behavior, sustainability, theory of planned behavior
Procedia PDF Downloads 124
6382 Land Art in Public Spaces Design: Remediation, Prevention of Environmental Risks and Recycling as a Consequence of the Avant-Garde Activity of Landscape Architecture
Authors: Karolina Porada
Abstract:
Over the last 40 years, there has been a trend in landscape architecture whose supporters do not regard pro-ecological or postmodern solutions as the essential goal in the design of public green spaces, shifting their attention instead to the 'sculptural' shaping of areas with the use of slopes, hills, embankments, and other landforms. This group of designers can be considered avant-garde, and their activities refer to land art. Initial research shows that such approaches appear particularly frequently on former post-industrial sites and landfills, utilizing materials such as debris and post-mining waste in their construction. Given the high degradation of the environment surrounding modern man, brownfields are a challenge and a field of interest for the representatives of the landscape architecture avant-garde, who through their projects try to recover lost lands by means of transformations, supported by engineering and ecological knowledge, to create places where nature can develop again. The analysis of a dozen or so sites made it possible to reach an important conclusion: apart from the cultural aspects (including artistic activities), green areas formally referring to land art are important in the process of remediation of post-industrial sites and waste recycling (e.g. from construction sites). In these processes, there is also potential for applying the concept of Nature-Based Solutions, i.e. solutions allowing for the natural development of the site in such a way as to cope with environmental problems such as air pollution, soil contamination (through phytoremediation) and climate change. The paper presents examples of modern parks whose compositions are based on shaping the terrain surface in a way that refers to land art, at the same time providing examples of brownfield reuse and waste recycling.
For the purposes of the site analysis, research methods such as historical-interpretation studies, case studies, qualitative research and the method of logical argumentation were used. The obtained results provide information about the role that landscape architecture can play in the process of remediation of degraded areas, while guaranteeing benefits such as visually attractive landscapes, low implementation costs, and improvement of the quality of the natural environment.
Keywords: brownfields, contemporary parks, landscape architecture, remediation
Procedia PDF Downloads 150
6381 Design and Implementation of an Affordable Electronic Medical Records in a Rural Healthcare Setting: A Qualitative Intrinsic Phenomenon Case Study
Authors: Nitika Sharma, Yogesh Jain
Abstract:
Introduction: An efficient information system helps improve service delivery and provides the foundation for policy and regulation of the other building blocks of a health system. Health care organizations require the integrated working of their various sub-systems, and efficient EMR software boosts teamwork amongst these sub-systems, thereby resulting in improved service delivery. Although there has been a huge impetus for EMR under the Digital India initiative, it has still not been mandated in India and is generally implemented only in well-funded public or private healthcare organizations. Objective: The study was conducted to understand the factors that lead to the successful adoption of an affordable EMR in a low-resource healthcare organization, and it intended to understand the design of the EMR and address the solutions to the challenges faced in its adoption. Methodology: The study was conducted in a non-profit registered healthcare organization that has been providing healthcare facilities to more than 2500 villages, including certain areas that are difficult to access. The data was collected with the help of field notes, in-depth interviews and participant observation. A total of 16 participants using the EMR from different departments were enrolled via a purposive sampling technique. The participants included in the study had been working in the organization since before the implementation of the EMR system. The study was conducted over a one-month period from 25 June to 20 July 2018. Ethical approval was obtained from the institute, along with the prior approval of the participants. Data analysis: A word document of more than 4000 words was obtained after transcribing and translating the answers of respondents. It was further analyzed by focused coding, a line-by-line review of the transcripts, underlining words, phrases or sentences that might suggest themes, in order to conduct thematic narrative analysis.
Results: The answers were thematically grouped under four headings: 1. governance of the organization, 2. architecture and design of the software, 3. features of the software, 4. challenges faced in adoption and the solutions to address them. It was inferred that the successful implementation was attributable to the easy and comprehensive design of the system, which has facilitated not only easy data storage and retrieval but also contributed to a decision support system for the staff. Portability has led to increased acceptance by physicians. The proper division of labor, increased efficiency of staff, incorporation of auto-correction features and facilitation of task shifting have led to increased acceptance amongst users in various departments. Geographical inhibitions, low computer literacy and high patient load were the major challenges faced during implementation. Despite the dual efforts made by both the architects and administrators to combat these challenges, the organization still faces certain ongoing challenges. Conclusion: Whenever any new technology is adopted, there are certain innovators, early adopters, late adopters and laggards, and the same pattern was followed in the adoption of this software. The challenges were overcome with the joint efforts of the organization's administrators and users. This case study thereby provides a framework for implementing similar systems in the public sector of countries that are struggling to digitize healthcare amid a crunch of human and financial resources.
Keywords: EMR, healthcare technology, e-health, EHR
Procedia PDF Downloads 105
6380 Effects of Cattaneo-Christov Heat Flux on 3D Magnetohydrodynamic Viscoelastic Fluid Flow with Variable Thermal Conductivity
Authors: Muhammad Ramzan
Abstract:
A mathematical model has been envisaged to discuss three-dimensional viscoelastic fluid flow with the effect of Cattaneo-Christov heat flux in the presence of magnetohydrodynamics (MHD). Variable thermal conductivity, with the impact of homogeneous-heterogeneous reactions and a convective boundary condition, is also taken into account. The homotopy analysis method is employed to obtain series solutions. Graphical illustrations depicting the behaviour of sundry parameters on the skin friction coefficient and all involved distributions are also given. It is observed that the velocity components are decreasing functions of the viscoelastic fluid parameter. Furthermore, the strengths of the homogeneous and heterogeneous reactions have opposite effects on the concentration distribution. A comparison with a published paper has also been established, and an excellent agreement is obtained; hence reliable results are presented.
Keywords: Cattaneo-Christov heat flux, homogeneous-heterogeneous reactions, magnetic field, variable thermal conductivity
Procedia PDF Downloads 197
6379 Risk Management of Water Derivatives: A New Commodity in The Market
Authors: Daniel Mokatsanyane, Johnny Jansen Van Rensburg
Abstract:
This paper is a concise introduction to risk management in the water derivatives market. Water, a new commodity in the market, is one of the most important commodities on earth: as important to life and the planet as crops, metals, and energy are, none of them matters without water. This paper presents a brief overview of water as a tradable commodity via a first-of-its-kind futures contract on the Nasdaq Veles California Water Index (NQH2O). The Generalised Autoregressive Conditional Heteroscedasticity (GARCH) statistical model is used to measure the price volatility of the instrument and its performance since it began trading. We describe the main products, illustrate their usage in risk management, discuss key challenges with the modeling and valuation of water as a traded commodity, and finally discuss how water derivatives may be treated as an alternative asset investment class.
Keywords: water derivatives, commodity market, Nasdaq Veles California Water Index (NQH2O), water price, risk management
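The core of a GARCH(1,1) model is a one-line conditional-variance recursion, which can be sketched without any fitting library. The parameter values below are illustrative, not estimates for the NQH2O index:

```python
def garch_11_variances(returns, omega, alpha, beta):
    """Conditional variance recursion of GARCH(1,1):
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    Starts from the unconditional variance omega / (1 - alpha - beta)."""
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

rets = [0.01, -0.03, 0.02, 0.00]  # hypothetical daily log-returns
vols = garch_11_variances(rets, omega=1e-5, alpha=0.1, beta=0.85)
print(len(vols) == len(rets), all(v > 0 for v in vols))  # True True
```

In practice omega, alpha, and beta would be estimated by maximum likelihood from the traded index's return history; the recursion above is what turns those estimates into a volatility path for risk management.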
Procedia PDF Downloads 136
6378 Precision Assessment of the Orthometric Heights Determination in the Northern Part of Libya
Authors: Jamal A. Gledan, Akrm H. Algnin
Abstract:
The Global Positioning System (GPS) satellite-based technology has been utilized extensively in recent years in a wide range of Geomatics and Geographic Information Systems (GIS) applications. One of the main challenges of dealing with GPS-based heights is converting them into Mean Sea Level (MSL) heights, which are used in surveying and mapping. In this research work, height differences at 50 points in the northern part of Libya were determined using both ordinary levelling (in which the geoid is the reference datum) and GPS techniques (in which the ellipsoid is the reference datum). In addition, this study utilized the EGM2008 model to obtain the undulation values between the ellipsoidal and orthometric heights. These undulation values, together with the ellipsoidal heights obtained from GPS observations, were used to compute the orthometric heights. This research presents an economical alternative to the expensive traditional levelling technique, particularly for topographic mapping.Keywords: geoid undulation, GPS, ordinary and geodetic levelling, orthometric height
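The computation described above rests on the standard relation between the three height types: with h the ellipsoidal height from GPS and N the geoid undulation (here taken from EGM2008), the orthometric height H follows as:

```latex
H \approx h - N
```

Strictly, h is measured along the ellipsoidal normal and H along the curved plumb line, so the relation is an approximation, but it is the standard one used in practice for converting GPS heights to MSL heights.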
Procedia PDF Downloads 445
6377 Optimal Tracking Control of a Hydroelectric Power Plant Incorporating Neural Forecasting for Uncertain Input Disturbances
Authors: Marlene Perez Villalpando, Kelly Joel Gurubel Tun
Abstract:
In this paper, we propose an optimal control strategy for a hydroelectric power plant subject to input disturbances such as meteorological phenomena. The engineering characteristics of the system are described by a nonlinear model. The random availability of renewable sources is predicted by a high-order neural network trained with an extended Kalman filter, whereas the power generation is regulated by the optimal control law. The main advantage of the system is the stabilization of the amount of power generated in the plant. A control supervisor maintains the stability and availability of reservoir water levels for power generation. The proposed approach demonstrated good performance in stabilizing the reservoir level and the power generation along their desired trajectories in the presence of disturbances.Keywords: hydropower, high order neural network, Kalman filter, optimal control
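To give a flavour of the Kalman-filter machinery named in this abstract, here is a minimal scalar (linear) Kalman filter tracking a slowly varying inflow from noisy measurements. This is a simplified stand-in, not the paper's EKF-trained high-order neural forecaster, and all numbers are illustrative:

```python
def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter: q = process-noise variance, r = measurement-noise
    variance, (x0, p0) = initial state estimate and its variance."""
    x, p, estimates = x0, p0, []
    for z in measurements:
        p += q                # predict: uncertainty grows between measurements
        k = p / (p + r)       # Kalman gain weighs prediction vs. measurement
        x += k * (z - x)      # update the state with the measurement residual
        p *= (1.0 - k)        # posterior uncertainty shrinks
        estimates.append(x)
    return estimates

# Noisy readings scattered around a true inflow of 5.0 m^3/s (illustrative)
z = [5.3, 4.8, 5.1, 4.9, 5.2, 5.0, 4.7, 5.1]
est = kalman_1d(z)
print(est[-1])  # converges toward the true inflow
```

The extended Kalman filter used in the paper applies the same predict/update cycle to the (nonlinear) neural-network weight dynamics via local linearisation.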
Procedia PDF Downloads 298
6376 The Effect of Spatial Variability on Axial Pile Design of Closed Ended Piles in Sand
Authors: Cormac Reale, Luke J. Prendergast, Kenneth Gavin
Abstract:
While significant improvements have been made in axial pile design methods over recent years, the influence of soil's natural variability has not been adequately accounted for within them. Soil variability is a crucial parameter to consider, as it can account for large variations in pile capacity across the same site. This paper seeks to address this knowledge deficit by demonstrating how soil spatial variability can be accommodated in existing cone penetration test (CPT) based pile design methods, in the form of layered non-homogeneous random fields. These random fields model the scope of a given property's variance and define how it varies spatially. A Monte Carlo analysis of the pile is performed, taking into account parameter uncertainty and spatial variability described using the measured scales of fluctuation. The results are discussed in light of Eurocode 7, and the effect of spatial averaging on design capacities is analysed.Keywords: pile axial design, reliability, spatial variability, CPT
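As a hedged sketch of the Monte Carlo / random-field approach described above, the snippet below samples a one-dimensional spatially correlated (exponential/Markov) random field of cone resistance along the pile and accumulates a toy shaft capacity per realisation. The soil parameters and the capacity rule are illustrative assumptions, not the paper's calibrated design method:

```python
import numpy as np

rng = np.random.default_rng(42)

depth = np.linspace(0.25, 10.0, 40)   # element mid-depths (m), 0.25 m spacing
mean_qc, cov_qc = 10.0, 0.3           # mean cone resistance (MPa) and CoV (assumed)
theta = 1.0                           # vertical scale of fluctuation (m, assumed)

# Exponential (Markov) autocorrelation: rho(tau) = exp(-2|tau|/theta)
tau = np.abs(depth[:, None] - depth[None, :])
corr = np.exp(-2.0 * tau / theta)
L = np.linalg.cholesky(corr + 1e-10 * np.eye(len(depth)))  # jitter for stability

n_sims, alpha_shaft = 2000, 0.01      # shaft-friction coefficient (assumed)
caps = np.empty(n_sims)
for i in range(n_sims):
    g = L @ rng.standard_normal(len(depth))              # correlated N(0,1) field
    qc = mean_qc * np.exp(cov_qc * g - 0.5 * cov_qc**2)  # approx. lognormal, mean 10
    caps[i] = alpha_shaft * qc.sum() * 0.25              # sum over 0.25 m elements

print(caps.mean(), caps.std())  # capacity statistics across realisations
```

The spread of `caps` across realisations is exactly the spatial-averaging effect the abstract refers to: a larger scale of fluctuation `theta` yields less averaging along the shaft and hence more variable capacities.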
Procedia PDF Downloads 246
6375 Effect of Non-Newtonian Behavior of Oil Phase on Oil-Water Stratified Flow in a Horizontal Channel
Authors: Satish Kumar Dewangan, Santosh Kumar Senapati
Abstract:
The present work focuses on the investigation of the effect of non-Newtonian behavior on the oil-water stratified flow in a horizontal channel using ANSYS Fluent. The coupled level set and volume of fluid (CLSVOF) method has been used to capture the evolving interface, assuming unsteady, coaxial flow with constant fluid properties. The diametric variation of oil volume fraction, mixture velocity, total pressure, and pressure gradient has been studied. The non-Newtonian behavior of the oil has been represented by the power-law model in order to investigate the effect of the flow behavior index. The stratified flow pattern tends to assume a dispersed flow pattern as the oil behavior changes to non-Newtonian. The pressure gradient is found to be highly sensitive to the flow behavior index. The findings could be useful in designing transportation pipelines in petroleum industries.Keywords: oil-water stratified flow, horizontal channel, CLSVOF, non-Newtonian behaviour
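The power-law (Ostwald-de Waele) model mentioned above gives an apparent viscosity that varies with shear rate. A minimal sketch, with illustrative parameter values rather than the paper's oil properties:

```python
def apparent_viscosity(shear_rate, K, n):
    """Power-law model: mu_app = K * gamma_dot**(n - 1).
    n < 1: shear-thinning, n = 1: Newtonian, n > 1: shear-thickening."""
    return K * shear_rate ** (n - 1)

# Illustrative oil-phase parameters (assumed, not from the paper)
K, n = 0.05, 0.8  # consistency index (Pa.s^n), flow behaviour index
for g in (1.0, 10.0, 100.0):
    print(g, apparent_viscosity(g, K, n))  # viscosity falls as shear rate rises
```

For n < 1 the oil thins near the wall where shear rates are highest, which is one mechanism by which the flow behaviour index reshapes the pressure gradient reported in the study.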
Procedia PDF Downloads 491
6374 Static and Dynamic Load on Hip Contact of Hip Prosthesis and Thai Femoral Bones
Authors: K. Chalernpon, P. Aroonjarattham, K. Aroonjarattham
Abstract:
Total hip replacement has been one of the most successful operations in hip arthritis surgery. The purpose of this research was to develop a dynamic hip-contact model of the Thai femoral bone, to analyze the stress distribution on the implant and the strain distribution on the bone model under daily activities, and to compare these with a static load simulation. The results showed differences in maximum von Mises stress of 0.14 percent under the walking condition and 0.03 percent under the stair-climbing condition, and differences in equivalent total strain of 0.52 percent under walking and 0.05 percent under stair climbing. The muscular forces should be evaluated under the dynamic condition to reduce the maximum von Mises stress and equivalent total strain.Keywords: dynamic loading, static load, hip prosthesis, Thai femur, femoral bone, finite element analysis
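For reference, the von Mises (equivalent) stress reported above is computed from the six Cauchy stress components by the standard formula, sketched here:

```python
import math

def von_mises(sx, sy, sz, txy, tyz, tzx):
    """Equivalent (von Mises) stress from normal (sx, sy, sz) and
    shear (txy, tyz, tzx) components of the Cauchy stress tensor."""
    return math.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2 + (sz - sx) ** 2)
                     + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))

print(von_mises(100.0, 0, 0, 0, 0, 0))  # uniaxial tension of 100 -> 100.0
print(von_mises(0, 0, 0, 50.0, 0, 0))   # pure shear of 50 -> 50*sqrt(3)
```

Finite element packages evaluate exactly this quantity at each integration point; the study's 0.14/0.03 percent figures are differences in its maximum between static and dynamic load cases.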
Procedia PDF Downloads 349
6373 Peristaltic Transport of a Jeffrey Fluid with Double-Diffusive Convection in Nanofluids in the Presence of Inclined Magnetic Field
Authors: Safia Akram
Abstract:
In this article, the effects of peristaltic transport with double-diffusive convection in nanofluids through an asymmetric channel with different waveforms are presented. Mathematical modelling for two-dimensional and two-directional flows of a Jeffrey fluid model along with double-diffusive convection in nanofluids is given. Exact solutions are obtained for the nanoparticle fraction field, concentration field, temperature field, stream functions, pressure gradient, and pressure rise in terms of axial and transverse coordinates under the restrictions of long wavelength and low Reynolds number. With the help of computational and graphical results, the effects of Brownian motion, thermophoresis, Dufour, Soret, and Grashof numbers (thermal, concentration, nanoparticle) on peristaltic flow patterns with double-diffusive convection are discussed.Keywords: nanofluid particles, peristaltic flow, Jeffrey fluid, magnetic field, asymmetric channel, different waveforms
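For reference, the extra stress tensor of the Jeffrey fluid model named above is commonly written as follows (symbols as standard in the literature, not taken from this paper: μ is the dynamic viscosity, λ₁ the ratio of relaxation to retardation times, λ₂ the retardation time, and γ̇ the shear-rate tensor):

```latex
\mathbf{S} = \frac{\mu}{1+\lambda_1}
  \left( \dot{\boldsymbol{\gamma}} + \lambda_2\, \ddot{\boldsymbol{\gamma}} \right)
```

Setting λ₁ = λ₂ = 0 recovers the classical Newtonian (viscous) fluid as a special case.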
Procedia PDF Downloads 381
6372 Capex Planning with and without Additional Spectrum
Authors: Koirala Abarodh, Maghaiya Ujjwal, Guragain Phani Raj
Abstract:
This analysis focuses on defining a spectrum evaluation model for telecom operators in terms of total cost of ownership (TCO). A quantitative, case-study research methodology was used. Specific input parameters, such as target user experience, year-on-year traffic growth, capacity-site limit per year, target additional spectrum type, bandwidth, spectrum efficiency, and UE penetration, have been used in the spectrum evaluation process, and the desired outputs, in terms of the number of sites, capex in USD, and required spectrum bandwidth, have been calculated. Furthermore, this study compares capex investment for target growth with and without additional spectrum. As a result, the combination of additional spectrum bands at 700 and 2600 MHz evaluates better in terms of TCO and performance. We therefore recommend using these bands for expansion rather than expanding within the current 1800 and 2100 MHz bands.Keywords: spectrum, capex planning, case study methodology, TCO
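The spectrum-versus-sites trade-off described above can be sketched with a deliberately simple capacity rule. Every number and the per-site capacity formula below are illustrative assumptions for exposition, not the paper's calibrated inputs:

```python
import math

def sites_needed(traffic_gbps, bw_mhz, spec_eff_bps_hz):
    """Toy rule: each site carries bandwidth * spectral efficiency."""
    per_site_gbps = bw_mhz * 1e6 * spec_eff_bps_hz / 1e9
    return math.ceil(traffic_gbps / per_site_gbps)

traffic = 40.0        # busy-hour demand in Gbps (assumed)
cost_per_site = 0.12  # USD million per new site (assumed)

base = sites_needed(traffic, bw_mhz=40, spec_eff_bps_hz=1.5)   # existing bands only
extra = sites_needed(traffic, bw_mhz=90, spec_eff_bps_hz=1.5)  # with added spectrum

print(base, extra)                     # fewer sites once spectrum is added
print((base - extra) * cost_per_site)  # capex avoided, in USD million
```

Even this toy model reproduces the qualitative conclusion: adding bandwidth per site substitutes directly for site count, so the TCO comparison reduces to spectrum fees versus avoided site capex.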
Procedia PDF Downloads 64
6371 STATCOM’s Contribution to the Improvement of Voltage Plan and Power Flow in an Electrical Transmission Network
Authors: M. Adjabi, A. Amiar, P. O. Logerais
Abstract:
Flexible Alternating Current Transmission Systems (FACTS) have been used for nearly four decades and exhibit very good dynamic performance. The purpose of this work is to study the behavior of a system in which a Static Synchronous Compensator (STATCOM) is located at the midpoint of a transmission line, with the system operating in disturbed modes under various levels of load. Starting from the analysis of various alternatives, the studied model leads to verifying the ability of the STATCOM to maintain the voltage plan and to improve the power flow in the electro-energetic system of the eastern region of the Algerian 400 kV transmission network. The steady-state performance of the STATCOM’s controller is analyzed through computer simulations with the Matlab/Simulink program. The simulation results demonstrate that the STATCOM can be effectively applied in power transmission systems to solve the problems of poor dynamic performance and voltage regulation.Keywords: STATCOM, reactive power, power flow, voltage plan, Algerian network
Procedia PDF Downloads 569
6369 Naïve Bayes: A Classical Approach for the Epileptic Seizures Recognition
Authors: Bhaveek Maini, Sanjay Dhanka, Surita Maini
Abstract:
Electroencephalography (EEG) is used worldwide to classify epileptic seizures. Identifying an epileptic seizure through manual EEG analysis is a crucial and demanding task for the neurologist, as it takes a great deal of effort and time. The risk of human error in EEG is always high, as acquiring the signals requires manual intervention. Disease diagnosis using machine learning (ML) has been explored continuously since its inception; moreover, where large numbers of datasets have to be analyzed, ML is proving a boon for doctors. In this research paper, the authors propose two ML models, logistic regression (LR) and Naïve Bayes (NB), to predict epileptic seizures based on general parameters. The two techniques are applied to the epileptic seizure recognition dataset available in the UCI ML repository. The algorithms are trained and tested on an 80:20 split (80% for training and 20% for testing), and the performance of each model is validated by 10-fold cross-validation. The proposed study claims accuracies of 81.87% and 95.49% for LR and NB, respectively.Keywords: epileptic seizure recognition, logistic regression, Naïve Bayes, machine learning
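To make the NB pipeline above concrete, here is a minimal Gaussian Naïve Bayes classifier with an 80:20 train/test split. It is a hedged stand-in: the UCI EEG dataset is not bundled here, so a synthetic two-class problem is used instead:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic two-class data standing in for EEG-derived features
X = np.vstack([rng.normal(0.0, 1.0, (200, 4)), rng.normal(3.0, 1.0, (200, 4))])
y = np.array([0] * 200 + [1] * 200)
idx = rng.permutation(len(y))
split = int(0.8 * len(y))                  # 80:20 split, as in the paper
tr, te = idx[:split], idx[split:]

def fit(X, y):
    """Per-class feature means, variances, and priors."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(0), Xc.var(0) + 1e-9, len(Xc) / len(y))
    return params

def predict(params, X):
    """Argmax over class log-posteriors under independent Gaussians."""
    scores = []
    for c, (mu, var, prior) in params.items():
        ll = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var)
        scores.append(ll.sum(1) + np.log(prior))
    classes = list(params)
    return np.array([classes[i] for i in np.argmax(scores, axis=0)])

model = fit(X[tr], y[tr])
acc = (predict(model, X[te]) == y[te]).mean()
print(acc)  # well-separated synthetic classes -> high accuracy
```

The 10-fold cross-validation the paper uses would repeat this fit/score cycle over ten rotated train/test partitions and average the accuracies.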
Procedia PDF Downloads 61
6368 Potency of Minapolitan Area Development to Enhance Gross Domestic Product and Prosperity in Indonesia
Authors: Shobrina Silmi Qori Tarlita, Fariz Kukuh Harwinda
Abstract:
Indonesia has an 81,000-kilometre coastline and is 70% water by surface area; it is known as a country with huge potential in the fisheries sector, one able to support more than 50% of Gross Domestic Product (GDP). Yet, according to data from the Department of Marine Affairs and Fisheries, the fisheries sector contributed only 20% of total GDP in 1998, and the sharpest decline in fisheries-sector income occurred in 2009. These conditions arise from several factors. There is no integrated working platform for fisheries and marine management in areas whose high productivity could raise economic returns for the country year after year. Labour demand, in both large and small companies, depends on natural conditions, so many people become unemployed when the weather or other natural conditions are unfavourable for fisheries and marine operations, especially aquaculture and capture fisheries. Many fishermen, especially in Indonesia, treat fishing as an additional or side job to meet their own needs, although they are on average poor. A further major problem is the lack of a sustainable development programme to stabilize the productivity of fisheries and marine resources, such as protecting the environment of fish nursery grounds and migration channels; the result is low resource productivity even as Indonesia's growing population needs ever more food to meet its nutritional demands. The development of the Minapolitan Area is an alternative solution: a better place for aquaculturists as well as fishermen, focused on systematic business effort in fisheries and marine management.
Minapolitan is an integrated area that gathers and unites those focusing their business on the fisheries sector, so that it can trigger fishery activity wherever Minapolitan management is applied intensively. Minapolitan is therefore expected to reinforce sustainable development by increasing the productivity of capture fisheries as well as aquaculture, and it is also expected to increase GDP and earnings for many people and to bring wider prosperity. Against this background, this paper explains the Minapolitan Area and the design for reinforcing it through zonation of fishery and marine exploitation areas of both high and low productivity. Hopefully, this solution will help answer the economic and social issues of declining food resources, especially fishery and marine resources.Keywords: Minapolitan, fisheries, economy, Indonesia
Procedia PDF Downloads 463