Search results for: first order error analysis

37221 Arabic Character Recognition Using Regression Curves with the Expectation Maximization Algorithm

Authors: Abdullah A. AlShaher

Abstract:

In this paper, we demonstrate how regression curves can be used to recognize 2D non-rigid handwritten shapes. Each shape is represented by a set of non-overlapping, uniformly distributed landmarks. The underlying models use second-order polynomials to model the shapes within a training set. To estimate the regression models, we need to extract the coefficients that describe the variations within a shape class. Hence, a least-squares method is used to estimate these modes. We then proceed by training these coefficients using the Expectation Maximization algorithm. Recognition is carried out by finding the least-error landmark displacement with respect to the model curves. Handwritten isolated Arabic characters are used to evaluate our approach.
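The least-squares fitting and least-error classification steps can be sketched as follows. This is a minimal illustration assuming landmarks stored as NumPy arrays of (x, y) points; the EM training stage is omitted and the sketch is not the authors' implementation.

```python
import numpy as np

def fit_quadratic_curve(landmarks):
    """Least-squares fit of a 2nd-order polynomial y = a*x^2 + b*x + c
    to a shape's (x, y) landmark points."""
    x, y = landmarks[:, 0], landmarks[:, 1]
    A = np.column_stack([x**2, x, np.ones_like(x)])   # design matrix for a quadratic model
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs                                      # [a, b, c]

def displacement_error(landmarks, coeffs):
    """Sum of squared vertical displacements of the landmarks from the model curve."""
    x, y = landmarks[:, 0], landmarks[:, 1]
    return np.sum((y - np.polyval(coeffs, x)) ** 2)

def classify(landmarks, class_models):
    """Pick the class whose model curve yields the smallest displacement error."""
    return min(class_models, key=lambda c: displacement_error(landmarks, class_models[c]))
```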

Keywords: character recognition, regression curves, handwritten Arabic letters, expectation maximization algorithm

Procedia PDF Downloads 145
37220 Second Harmonic Generation of Higher-Order Gaussian Laser Beam in Density Rippled Plasma

Authors: Jyoti Wadhwa, Arvinder Singh

Abstract:

This work presents a theoretical investigation of enhanced second-harmonic generation of a higher-order Gaussian laser beam in plasma having a density ramp. The mechanism responsible for the self-focusing of a laser beam in plasma is considered to be the relativistic mass variation of plasma electrons under the effect of a highly intense laser beam. Using the moment theory approach and considering the Wentzel-Kramers-Brillouin approximation for the nonlinear Schrodinger wave equation, the differential equation governing the spot size of the higher-order Gaussian laser beam in plasma is derived. The nonlinearity induced by the laser beam creates a density gradient in the background plasma electrons, which is responsible for the excitation of the electron plasma wave. The large-amplitude electron plasma wave interacts with the fundamental beam, which further produces coherent radiation at double the frequency of the incident beam. The analysis shows the important role of the different modes of the higher-order Gaussian laser beam and of the density ramp in the efficiency of the generated harmonics.

Keywords: density rippled plasma, higher order Gaussian laser beam, moment theory approach, second harmonic generation

Procedia PDF Downloads 179
37219 Discrete Sliding Modes Regulator with Exponential Holder for Non-Linear Systems

Authors: G. Obregon-Pulido, G. C. Solis-Perales, J. A. Meda-Campaña

Abstract:

In this paper, we present a sliding mode controller in discrete time. The design of the controller is based on the theory of regulation for nonlinear systems. In the problem of disturbance rejection and/or output tracking, it is known that in discrete time a controller that uses the zero-order holder only guarantees tracking at the sampling instants, but not between them. It is shown that using the so-called exponential holder, it is possible to guarantee asymptotic zero output tracking error also between sampling instants. To stabilize the closed-loop system, we introduce the sliding mode approach, relaxing the requirement of the existence of a linear stabilizing control law.

Keywords: regulation theory, sliding modes, discrete controller, ripple-free tracking

Procedia PDF Downloads 54
37218 An Investigation of Crop Diversity’s Impact on Income Risk of Selected Crops

Authors: Saeed Yazdani, Sima Mohamadi Amidabadi, Amir Mohamadi Nejad, Farahnaz Nekoofar

Abstract:

As a result of uncertainty about the quantity of agricultural products, greater significance has been attached to risk management in the agricultural sector. Farmers normally seek to minimize risk, and crop diversity has always been a means to reduce it. The study at hand seeks to explore the long-term impact of crop diversity on income risk reduction. The timeframe of the study is 1998 to 2018. Initially, the Herfindahl index was used to estimate crop diversity in different periods; next, the Hodrick-Prescott filter was applied to estimate income risk in both nominal and real terms. Finally, using the Vector Error Correction Model (VECM), the long-term impact of crop diversity on the two modes of risk for the farmer's income was estimated. Given the long-term model's results, it is evident that in the long run crop diversity can reduce income fluctuations in both nominal and real terms. Moreover, the results showed that if a fluctuation shock affects agricultural income in the short run, 4 and 3 cycles are needed to balance out the shock in nominal and real terms, respectively; in other words, in each cycle 25% and 33% of the shock impact can be removed, respectively. Thus, as the error correction coefficients showed, policies need to be put in place to prevent income shocks. In case of a shock, it needs to be balanced out over a four-year period when inflation is taken into account and over a three-year period irrespective of inflation, and reparative policies such as insurance services should be developed.
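The two measurement steps named in the abstract can be sketched briefly: the Herfindahl index of crop concentration and the Hodrick-Prescott decomposition of income into trend and cycle. This is a minimal sketch on synthetic data using statsmodels; the VECM estimation and the study's actual series are not reproduced.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

def herfindahl_index(crop_areas):
    """Herfindahl index: sum of squared area shares; lower values mean more crop diversity."""
    shares = np.asarray(crop_areas, dtype=float)
    shares = shares / shares.sum()
    return np.sum(shares ** 2)

print(herfindahl_index([120.0, 80.0, 40.0, 10.0]))       # hypothetical crop areas (ha)

# Income risk proxy: deviation of farm income from its Hodrick-Prescott trend.
rng = np.random.default_rng(0)
income = pd.Series(100 + np.cumsum(rng.normal(2, 5, 21)),
                   index=range(1998, 2019))              # synthetic annual income, 1998-2018
cycle, trend = hpfilter(income, lamb=100)                # lambda = 100, a common choice for annual data
income_risk = cycle.std()                                # dispersion of deviations from trend
print(income_risk)
```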

Keywords: risk, long-term model, Herfindahl index, time series model, vector error correction model

Procedia PDF Downloads 24
37217 Capability Prediction of Machining Processes Based on Uncertainty Analysis

Authors: Hamed Afrasiab, Saeed Khodaygan

Abstract:

Prediction of machining process capability at the design stage plays a key role in reaching precise design and manufacturing of mechanical products. Inaccuracies in the machining process lead to errors in the position and orientation of machined features on the part, and strongly affect the process capability and the final quality of the product. In this paper, an efficient systematic approach is given to investigate machining errors in order to predict the manufacturing errors of the parts and the capability of the corresponding machining processes. A mathematical formulation of fixture locator modeling is presented to establish the relationship between the part errors and the related sources. Based on this method, the final machining errors of the part can be accurately estimated by relating them to the combined dimensional and geometric tolerances of the workpiece-fixture system. The method is developed for uncertainty analysis based on both worst-case and statistical approaches. The application of the presented method is illustrated through an example, and the computational results are compared with Monte Carlo simulation results.
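The contrast between the worst-case and statistical (Monte Carlo) approaches can be illustrated with a generic linearized tolerance stack-up. The sensitivities and tolerances below are placeholders, not the paper's fixture-locator model.

```python
import numpy as np

# Sensitivities of the machined-feature position error to each error source
# (a hypothetical linearized error model, not the paper's formulation).
sensitivities = np.array([1.0, -0.5, 0.8])
tolerances = np.array([0.02, 0.05, 0.03])      # mm, symmetric +/- tolerances

# Worst-case approach: every source at its tolerance limit, signs aligned.
worst_case = np.sum(np.abs(sensitivities) * tolerances)

# Statistical approach: sources vary independently, here uniformly within tolerance.
rng = np.random.default_rng(42)
samples = rng.uniform(-tolerances, tolerances, size=(100_000, len(tolerances)))
errors = samples @ sensitivities
mc_bound = np.quantile(np.abs(errors), 0.9973)  # 99.73% coverage, analogous to +/-3 sigma
print(f"worst case: {worst_case:.4f} mm, Monte Carlo bound: {mc_bound:.4f} mm")
```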

Keywords: process capability, machining error, dimensional and geometrical tolerances, uncertainty analysis

Procedia PDF Downloads 307
37216 Value Proposition and Value Creation in Network Environments: An Experimental Study of Academic Productivity via the Application of Bibliometrics

Authors: R. Oleko, A. Saraceni

Abstract:

The aim of this research is to provide a rigorous evaluation of the existing academic productivity in relation to value proposition and creation in networked environments. Bibliometrics is a rigorous approach used to structure existing literature in an objective and reliable manner. To that aim, a thorough bibliometric analysis was performed in order to assess the large volume of information encountered in a structured and reliable manner. A clear distinction between networks and service networks was considered indispensable in order to capture the effects of each network type's properties on value creation processes. Via the use of bibliometric parameters, this review was able to capture the state of the art in both value proposition and value creation. The results provide a rigorous assessment of the annual scientific production, the most influential journals, and the leading corresponding-author countries. By means of citation analysis, the most frequently cited manuscripts and countries for each network type were identified. Moreover, by means of co-citation analysis, existing collaborative patterns were detected through the creation of reference co-citation networks and country collaboration networks. Co-word analysis was also performed in order to provide an overview of the conceptual structure in both networks and service networks. The acquired results provide a rigorous and systematic assessment of the existing scientific output in networked settings. As such, they contribute positively to a better understanding of the distinct impact of service networks on value proposition and value creation when compared to regular networks. The implications derived can serve as a guide for informed decision-making by practitioners during network formation and provide a structured evaluation that can stand as a basis for future research in the field.

Keywords: bibliometrics, co-citation analysis, networks, service networks, value creation, value proposition

Procedia PDF Downloads 203
37215 Exploring Error-Minimization Protocols for Upper-Limb Function During Activities of Daily Life in Chronic Stroke Patients

Authors: M. A. Riurean, S. Heijnen, C. A. Knott, J. Makinde, D. Gotti, J. VD. Kamp

Abstract:

Objectives: The current study is done in preparation for a randomized controlled study investigating the effects of an implicit motor learning protocol implemented using an extension-supporting glove. It will explore different protocols to find out which is preferred when studying motor learning in the chronic stroke population that struggles with hand spasticity. Design: This exploratory study will follow 24 individuals who have a chronic stroke (> 6 months) during their usual care journey. We will record the results of two 9-Hole Peg Tests (9HPT) done during their therapy sessions with a physiotherapist or in their home, before and after 4 weeks of them wearing an extension-supporting glove used to employ the to-be-studied protocols. The participants will wear the glove 3 times/week for one hour while performing their activities of daily living and record the times they wore it in a diary. Their experience will be monitored through telecommunication once every week. Subjects: Individuals who have had a stroke at least 6 months prior to participation, with hand spasticity measured on the modified Ashworth Scale of maximum 3, and finger flexion motor control measured on the Motricity Index of at least 19/33. Exclusion criteria: extreme hemi-neglect. Methods: The participants will be randomly divided into 3 groups: one group using the glove in a pre-set way of decreasing support (implicit motor learning), one group using the glove in a self-controlled way of decreasing support (autonomous motor learning), and the third using the glove with constant support (as control). Before and after the 4-week period, there will be an intake session and a post-assessment session. Analysis: We will compare the results of the two 9HPTs to check whether the protocols were effective. Furthermore, we will compare the results between the three groups to find the preferred one. A qualitative analysis will be run of the experience of participants throughout the 4-week period. Expected results: We expect that the group using the implicit learning protocol will show superior results.

Keywords: implicit learning, hand spasticity, stroke, error minimization, motor task

Procedia PDF Downloads 59
37214 Neural Network Based Path Loss Prediction for Global System for Mobile Communication in an Urban Environment

Authors: Danladi Ali

Abstract:

In this paper, we measured GSM signal strength in the city of Dnepropetrovsk in order to predict path loss in the study area using nonlinear autoregressive neural network prediction, and we also used neural network clustering to determine the average GSM signal strength received in the study area. The nonlinear autoregressive neural network predicted that the GSM signal is attenuated with a mean square error (MSE) of 2.6748 dB; this attenuation value is used to modify the COST 231 Hata and Okumura-Hata models. The neural network clustering revealed that -75 dB to -95 dB is received most frequently. This means that the signal strength received in the study area is mostly weak.
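For reference, the standard COST-231 Hata urban path-loss formula that the measured attenuation would modify can be sketched as below. Applying the 2.6748 dB value as a simple additive correction is an assumption for illustration; the paper's actual modification may differ.

```python
import math

def cost231_hata(f_mhz, d_km, h_base=30.0, h_mobile=1.5, metropolitan=True):
    """Standard COST-231 Hata urban path-loss model (valid roughly 1500-2000 MHz),
    with base-station height h_base and mobile height h_mobile in metres."""
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile - (1.56 * math.log10(f_mhz) - 0.8)
    c = 3.0 if metropolitan else 0.0
    return (46.3 + 33.9 * math.log10(f_mhz) - 13.82 * math.log10(h_base) - a_hm
            + (44.9 - 6.55 * math.log10(h_base)) * math.log10(d_km) + c)

def modified_cost231(f_mhz, d_km, correction_db=2.6748, **kw):
    """Hypothetical modification: add the measured attenuation offset as a constant term."""
    return cost231_hata(f_mhz, d_km, **kw) + correction_db

print(modified_cost231(1800, 1.0))   # path loss in dB at 1800 MHz, 1 km
```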

Keywords: one-dimensional multilevel wavelets, path loss, GSM signal strength, propagation, urban environment and model

Procedia PDF Downloads 382
37213 Quintic Spline Solution of Fourth-Order Parabolic Equations Arising in Beam Theory

Authors: Reza Mohammadi, Mahdieh Sahebi

Abstract:

We develop a method based on polynomial quintic splines for the numerical solution of fourth-order non-homogeneous parabolic partial differential equations with variable coefficients. By using polynomial quintic splines at off-step points in space and finite differences in the time direction, we obtain two three-level implicit methods. Stability analysis of the presented method has been carried out. We solve four test problems numerically to validate the derived method. Numerical comparison with other methods shows the superiority of the presented scheme.

Keywords: fourth-order parabolic equation, variable coefficient, polynomial quintic spline, off-step points

Procedia PDF Downloads 352
37212 NOx Prediction by Quasi-Dimensional Combustion Model of Hydrogen Enriched Compressed Natural Gas Engine

Authors: Anas Rao, Hao Duan, Fanhua Ma

Abstract:

The dependency on fossil fuels can be minimized by using hydrogen-enriched compressed natural gas (HCNG) in transportation vehicles. However, the NOx emissions of HCNG engines are significantly higher, and this has turned out to be their major drawback. Therefore, the study of NOx emissions of HCNG engines is a very important area of research. In this context, experiments have been performed at different hydrogen percentages, ignition timings, air-fuel ratios, manifold absolute pressures, loads, and engine speeds. Afterwards, the simulation has been accomplished with a quasi-dimensional combustion model of the HCNG engine. In order to investigate the NOx emissions, the NO mechanism has been coupled to the quasi-dimensional combustion model of the HCNG engine. Three NOx mechanisms, the thermal NOx, prompt NOx, and N2O mechanisms, have been used to predict NOx emissions. For validation purposes, the NO curve has been transformed into NO packets based on a temperature difference of 100 K for lean-burn and 60 K for stoichiometric conditions, while the width of each packet has been taken as the ratio of the crank duration of the packet to the total burn duration. The combustion chamber of the engine has been divided into three zones, with each zone equal to the product of the summation of NO packets and space. In order to check the accuracy of the model, the percentage error of the NOx emission has been evaluated; it lies in the range of ±6% and ±10% for the lean-burn and stoichiometric conditions, respectively. Finally, the percentage contribution of each NO formation mechanism has been evaluated.

Keywords: quasi-dimensional combustion, thermal NO, prompt NO, NO packet

Procedia PDF Downloads 251
37211 Vehicles Analysis, Assessment and Redesign Related to Ergonomics and Human Factors

Authors: Susana Aragoneses Garrido

Abstract:

Every day, the roads are the scene of numerous accidents involving vehicles, producing thousands of deaths and serious injuries all over the world. Investigations have revealed that Human Factors (HF) are one of the main causes of road accidents in modern societies. Distracted driving (involving external or internal aspects of the vehicle), which is considered a human factor, is a serious and emerging risk to road safety. Consequently, further analysis of this issue is essential due to its significance for today's society. The objectives of this investigation are the detection and assessment of HF in order to provide solutions (including a better vehicle design) which might mitigate road accidents. The methodology of the project is divided into different phases. First, a statistical analysis of public databases from Spain and the UK is provided. Second, the data are classified in order to analyse the major causes involved in road accidents. Third, a simulation between different paths and vehicles is presented. The causes related to HF are assessed by Failure Mode and Effects Analysis (FMEA). Fourth, different car models are evaluated using the Rapid Upper Limb Assessment (RULA). Additionally, the JACK Siemens PLM tool is used with the intention of evaluating the human factor causes and providing a redesign of the vehicles. Finally, improvements in the car design are proposed with the intention of reducing the implication of HF in traffic accidents. The results from the statistical analysis, the simulations, and the evaluations confirm that accidents are an important issue in today's society, especially accidents caused by HF such as distractions. The results explore the reduction of external and internal HF through a global risk analysis of vehicle accidents. Moreover, the evaluation of the different car models using the RULA method and JACK Siemens PLM proves the importance of a good adjustment of the driver's seat in order to avoid harmful postures and therefore distractions. For this reason, a car redesign is proposed for the driver to acquire the optimum position and consequently reduce the human factors in road accidents.

Keywords: vehicle analysis, assessment, ergonomics, car redesign

Procedia PDF Downloads 335
37210 Vibration Analysis of FGM Sandwich Panel with Cut-Outs Using Refined Higher-Order Shear Deformation Theory (HSDT) Based on Isogeometric Analysis

Authors: Lokanath Barik, Abinash Kumar Swain

Abstract:

This paper presents a vibration analysis of an FGM sandwich structure with a complex profile, governed by a refined higher-order shear deformation theory (RHSDT) and using isogeometric analysis (IGA). Functionally graded sandwich plates provide a wide range of applications in the aerospace, defence, and aircraft industries due to their ability to distribute material functions to influence the thermo-mechanical properties as desired. In practical applications, these structures generally have intricate profiles, and their response to loads is significantly affected by cut-outs. IGA is primarily a NURBS-based technique that is effective in solving higher-order differential equations due to its inherent C1 continuity imposition in the solution space for a single patch. Complex structures generally require multiple patches to accurately represent the geometry, and hence there is a loss of continuity at adjoining patch junctions. Therefore, patch coupling is desired to maintain continuity requirements throughout the domain. In this work, a novel strong coupling approach is provided that generates a well-defined NURBS-based model while achieving continuity. The methodology is validated against free vibration analyses of sandwich plates from the existing literature. The results are in good agreement with the analytical solution for different plate configurations and power-law indices. Numerical examples of rectangular and annular plates are discussed with variable boundary conditions. Additionally, parametric studies are provided, varying the aspect ratio and porosity ratio and examining their influence on the natural frequency of the plate.

Keywords: vibration analysis, FGM sandwich structure, multipatch geometry, patch coupling, IGA

Procedia PDF Downloads 82
37209 Site Analysis’ Importance as a Valid Factor in Building Design

Authors: Mekwa Eme, Anya Chukwuma

Abstract:

The act of evaluating a particular site physically and socially in order to create a good design solution that will address the physical and interior environment of the location is known as architectural site analysis. This essay describes site analysis as a useful design component. According to the introduction and supporting research, site evaluation and analysis are crucial to good design in terms of topography, orientation, site size, accessibility, rainfall, wind direction, and times of sunrise and sunset. Methodology: Both quantitative and qualitative analyses are used in this paper, with primary and secondary types of data collection. The information was gathered via the case study approach, previously published literature, journals, the internet, a local poll, oral interviews, inquiries, and in-person interviews. The purpose of this is to clarify the benefits of site analysis for the design process and its implications for the working or building stage. Results: Each site's criteria are unique in terms of factors such as soil, plants, trees, accessibility, topography, and security. This will make it easier for the architect and environmentalist to decide on the concept, shape, and supporting structures of the design. It is crucial because, before any design work is done, the nature of the target location is determined through site visits and research. The site study covers the location, contours, site features, and accessibility, among other topics. Site analysis is a key component of architectural education, enabling students and working architects to understand the nature of the site they will be working on. The building's orientation, the site's circulation, and the sustainability of the site may all be determined through a thorough study of the site's features.

Keywords: analysis, climate, statistics, design

Procedia PDF Downloads 249
37208 Energy Consumption Modeling for Strawberry Greenhouse Crop by Adaptive Neuro-Fuzzy Inference System Technique: A Case Study in Iran

Authors: Azar Khodabakhshi, Elham Bolandnazar

Abstract:

Agriculture, as the most important food manufacturing sector, is not only an energy consumer but is also known as an energy supplier. Energy use is considered a helpful parameter for analyzing and evaluating agricultural sustainability. In this study, the pattern of energy consumption of strawberry greenhouses of Jiroft in Kerman province, Iran, was surveyed. The total input energy required in strawberry production was calculated as 113314.71 MJ/ha. Electricity, with a 38.34% contribution to the total energy, was the largest energy consumer in strawberry production. In this study, neuro-fuzzy networks were used for yield modeling in the production of strawberries. Results showed that the best model for predicting strawberry yield had a correlation coefficient, root mean square error (RMSE), and mean absolute percentage error (MAPE) equal to 0.9849, 0.0154 kg/ha, and 0.11%, respectively. With regard to these results, it can be said that the neuro-fuzzy method can predict and model strawberry crop yield well.
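The reported error metrics can be computed as sketched below for predicted versus observed yields. The yield values are placeholders for illustration, not the study's data, and the ANFIS model itself is not reproduced.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """RMSE, MAPE (%), and Pearson correlation for observed vs. predicted values."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0
    r = np.corrcoef(y_true, y_pred)[0, 1]
    return rmse, mape, r

# Hypothetical observed/predicted strawberry yields (kg/ha), for illustration only.
obs = [5200, 4800, 5100, 5500]
pred = [5190, 4815, 5090, 5520]
print(regression_metrics(obs, pred))
```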

Keywords: crop yield, energy, neuro-fuzzy method, strawberry

Procedia PDF Downloads 381
37207 Reliability and Validity for Measurement of Body Composition: A Field Method

Authors: Ahmad Hashim, Zarizi Ab Rahman

Abstract:

Measurement of body composition via field methods relies on several popular instruments used to estimate the percentage of body fat. Among the instruments used are the Body Mass Index, Bio Impedance Analysis, and the Skinfold Test. None of these three instruments involves high costs or high technical skills; they are mobile, save time, and are suitable for use in large populations. Because all three instruments can estimate the percentage of body fat, it is important to identify the most appropriate instrument with the highest reliability. Hence, this study was conducted to determine the reliability and convergent validity of the instruments. A total of 40 students, male and female, aged between 13 and 14 years, participated in this study. The study found that the test-retest Pearson correlation coefficient of reliability for the three instruments is very high, r = .99. The inter-class reliability is also at a high level, with r = .99 for the Body Mass Index and Bio Impedance Analysis and r = .96 for the Skinfold Test. The intra-class reliability coefficients for the three instruments are likewise high: Body Mass Index r = .99, Bio Impedance Analysis r = .97, and Skinfold Test r = .90. However, the Standard Error of Measurement values for the three instruments indicate that the Body Mass Index is the most appropriate instrument, with a mean value of .000672, compared with the other instruments. The findings show that the Body Mass Index is the instrument that is the most accurate and reliable in estimating body fat percentage for the population studied.
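For reference, BMI is weight divided by height squared, and a standard error of measurement is commonly derived from test-retest reliability as SEM = SD·sqrt(1 − r). The abstract does not state its exact SEM formula, so the sketch below uses this standard form with placeholder data.

```python
import numpy as np

def bmi(weight_kg, height_m):
    """Body Mass Index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def standard_error_of_measurement(scores, reliability):
    """Common SEM definition: SD of the scores times sqrt(1 - reliability)."""
    return np.std(scores, ddof=1) * np.sqrt(1.0 - reliability)

# Hypothetical body-fat estimates (%) from one instrument and its test-retest r.
fat_percent = np.array([18.2, 22.5, 15.9, 27.3, 20.1])
print(bmi(55.0, 1.60))                                    # approx. 21.5
print(standard_error_of_measurement(fat_percent, 0.99))   # small SEM for high reliability
```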

Keywords: reliability, validity, body mass index, bio impedance analysis and skinfold test

Procedia PDF Downloads 336
37206 Analysis of the Inverse Kinematics for 5 DOF Robot Arm Using D-H Parameters

Authors: Apurva Patil, Maithilee Kulkarni, Ashay Aswale

Abstract:

This paper proposes an algorithm to develop the kinematic model of a 5 DOF robot arm. The formulation of the problem is based on finding the D-H parameters of the arm. A brute-force iterative method is employed to solve the system of nonlinear equations. The focus of the paper is on obtaining accurate solutions by reducing the root mean square error. The results obtained are implemented to grip objects. The trajectories followed by the end effector for the required workspace coordinates are plotted. The methodology used here can be applied to any other kinematic chain of up to six DOF.
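The forward-kinematic model underlying such an inverse-kinematics search is built by chaining standard Denavit-Hartenberg transforms, as sketched below. The 5-DOF table values are placeholders, since the abstract does not give the arm's actual parameters.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_table):
    """Chain the per-joint transforms; returns the end-effector pose as a 4x4 matrix."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Hypothetical 5-DOF D-H table: (d, a, alpha) per joint -- placeholder values only.
dh_table = [(0.10, 0.0, np.pi / 2), (0.0, 0.25, 0.0), (0.0, 0.20, 0.0),
            (0.0, 0.0, np.pi / 2), (0.08, 0.0, 0.0)]
print(forward_kinematics([0.1, -0.3, 0.5, 0.2, 0.0], dh_table)[:3, 3])  # end-effector position
```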

Keywords: 5 DOF robot arm, D-H parameters, inverse kinematics, iterative method, trajectories

Procedia PDF Downloads 202
37205 Frequency Recognition Models for Steady State Visual Evoked Potential Based Brain Computer Interfaces (BCIs)

Authors: Zeki Oralhan, Mahmut Tokmakçı

Abstract:

SSVEP-based brain computer interface (BCI) systems have been preferred because of their high information transfer rate (ITR) and practical use. ITR is the parameter describing overall BCI performance. A high ITR value requires, among other things, that the BCI system has high accuracy. In this study, we investigated how to recognize SSVEPs in a shorter time and with a lower error rate. In the experiment, there were 8 flickers on a liquid crystal display (LCD). Participants gazed at the flicker with a 12 Hz frequency and a 50% duty cycle ratio on the LCD for 10 seconds. During the experiment, EEG signals were acquired via an EEG device. The EEG data were filtered in the preprocessing session. After that, Canonical Correlation Analysis (CCA), Multiset CCA (MsetCCA), phase-constrained CCA (PCCA), and Multiway CCA (MwayCCA) methods were applied to the data. The highest average accuracy value was reached when MsetCCA was applied.
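The baseline CCA step can be sketched as below: EEG is correlated against sine/cosine reference signals at each candidate stimulation frequency, and the frequency with the highest canonical correlation is selected. This is a generic sketch using scikit-learn's CCA on synthetic data; the MsetCCA, PCCA, and MwayCCA variants are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_correlation(eeg, freq, fs, n_harmonics=2):
    """Max canonical correlation between multichannel EEG (samples x channels)
    and sine/cosine references at a stimulation frequency and its harmonics."""
    t = np.arange(eeg.shape[0]) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs += [np.sin(2 * np.pi * h * freq * t), np.cos(2 * np.pi * h * freq * t)]
    Y = np.column_stack(refs)
    u, v = CCA(n_components=1).fit_transform(eeg, Y)
    return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

def recognize_frequency(eeg, candidate_freqs, fs=256):
    """Pick the candidate flicker frequency with the highest canonical correlation."""
    return max(candidate_freqs, key=lambda f: cca_correlation(eeg, f, fs))

# Hypothetical usage: 2 s of 8-channel EEG and 8 candidate flicker frequencies.
eeg = np.random.default_rng(0).standard_normal((512, 8))
print(recognize_frequency(eeg, [8, 9, 10, 11, 12, 13, 14, 15]))
```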

Keywords: brain computer interface, canonical correlation analysis, human computer interaction, SSVEP

Procedia PDF Downloads 266
37204 Forecasting Nokoué Lake Water Levels Using Long Short-Term Memory Network

Authors: Namwinwelbere Dabire, Eugene C. Ezin, Adandedji M. Firmin

Abstract:

The prediction of hydrological flows (rainfall-depth or rainfall-discharge) is becoming increasingly important in the management of hydrological risks such as floods. In this study, the Long Short-Term Memory (LSTM) network, a state-of-the-art algorithm dedicated to time series, is applied to predict the daily water level of Nokoué Lake in Benin. This paper aims to provide an effective and reliable method capable of reproducing the future daily water level of Nokoué Lake, which is influenced by a combination of two phenomena: rainfall and river flow (runoff from the Ouémé River, the Sô River, the Porto-Novo lagoon, and the Atlantic Ocean). Performance analysis based on the forecasting horizon indicates that LSTM can predict the water level of Nokoué Lake up to a forecast horizon of t+10 days. Performance metrics such as Root Mean Square Error (RMSE), coefficient of correlation (R²), Nash-Sutcliffe Efficiency (NSE), and Mean Absolute Error (MAE) agree on a forecast horizon of up to t+3 days, and their values remain stable for forecast horizons of t+1, t+2, and t+3 days. The values of R² and NSE are greater than 0.97 during the training and testing phases in the Nokoué Lake basin. Based on the evaluation indices used to assess the model's performance, a forecast horizon of t+3 days is chosen for predicting future daily water levels.
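Of the reported metrics, the Nash-Sutcliffe Efficiency is the least common in general ML toolkits; a minimal sketch of NSE and RMSE for observed versus forecast water levels is given below on placeholder values. The LSTM model itself is not reproduced.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 indicates a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    return float(np.sqrt(np.mean((np.asarray(obs, float) - np.asarray(sim, float)) ** 2)))

# Hypothetical daily water levels (m): observed vs. t+3 day forecasts.
obs = [1.82, 1.85, 1.90, 1.95, 1.93]
fc  = [1.80, 1.86, 1.89, 1.97, 1.94]
print(nash_sutcliffe(obs, fc), rmse(obs, fc))
```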

Keywords: forecasting, long short-term memory cell, recurrent artificial neural network, Nokoué lake

Procedia PDF Downloads 64
37203 Numerical Simulation of Kangimi Reservoir Sedimentation, Kaduna State, Nigeria

Authors: Abdurrasheed Sa'id, Abubakar Isma'il, Waheed Alayande

Abstract:

This study focused on carrying out numerical simulations of Kangimi reservoir sedimentation. Different numerical sediment transport models were reviewed, and GSTARS3 was selected. The model was developed using the 1977 data and calibrated by simulating the 2012 profile and sediment deposition, which were compared with the 2012 hydrographic survey results of the NWRI. The model was validated by simulating the 2016 deposition and comparing the results with NWRI estimates. The performance of the proposed model was also tested using statistical parameters such as MSE (Mean Square Error), MAPE (Mean Absolute Percentage Error), and R² (coefficient of determination), with values of 1.32 m, 0.17%, and 0.914 respectively, which show strong agreement. After the calibration, validation, and performance testing, the model was used to simulate the 2032 and 2062 profiles and deposition. The results showed that by 2032 the reservoir will be silted by 25.34 MCM, or 43.3% of the design capacity, and by 60.7% of the capacity by the year 2062. A number of sedimentation mitigation measures were recommended.

Keywords: NWRI- national water resources institute, sedimentation, GSTARS3, model

Procedia PDF Downloads 219
37202 The Per Capita Income, Energy Production and Environmental Degradation: A Comprehensive Assessment of the Existence of the Environmental Kuznets Curve Hypothesis in Bangladesh

Authors: Ashique Mahmud, MD. Ataul Gani Osmani, Shoria Sharmin

Abstract:

In the first quarter of the twenty-first century, the most substantial global concern is environmental contamination, which has become a priority for both the national and international community. Keeping this crucial fact in mind, this study applied different statistical and econometric methods to identify whether the gross national income of the country has a significant impact on electricity production from nonrenewable sources and on different air pollutants such as carbon dioxide, nitrous oxide, and methane emissions. Moreover, the primary objective of this research was to analyze whether the environmental Kuznets curve hypothesis holds for the examined variables. After analyzing different statistical properties of the variables, this study came to the conclusion that the environmental Kuznets curve hypothesis holds for gross national income and carbon dioxide emissions in Bangladesh in both the short run and the long run. This conclusion is based on the findings of ordinary least squares estimations, ARDL bound tests, short-run causality analysis, the Error Correction Model, and other pre-diagnostic and post-diagnostic tests that have been employed in the structural model. Moreover, this study demonstrates that the trajectory of gross national income and carbon dioxide emissions is in its initial stage of development and that emissions will increase up to an optimal peak; the compositional effect will then force emissions to decrease, and environmental quality will be restored in the long run.
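The core EKC test is usually a regression of emissions on income and squared income, with the hypothesis supported by a positive linear and a negative quadratic coefficient. The sketch below illustrates this with ordinary least squares on synthetic data using statsmodels; the paper's ARDL and error-correction machinery is not reproduced.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic inverted-U data: emissions rise with income, peak, then fall.
rng = np.random.default_rng(1)
gni = np.linspace(400, 2500, 40)                              # hypothetical per-capita GNI series
co2 = 0.004 * gni - 1.25e-6 * gni**2 + rng.normal(0, 0.05, gni.size)

# EKC test: CO2 = b0 + b1*GNI + b2*GNI^2 + e; the hypothesis holds if b1 > 0 and b2 < 0.
X = sm.add_constant(np.column_stack([gni, gni**2]))
res = sm.OLS(co2, X).fit()
b0, b1, b2 = res.params
print(f"b1 = {b1:.4g}, b2 = {b2:.4g}, turning point = {-b1 / (2 * b2):.0f}")
```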

Keywords: environmental Kuznets curve hypothesis, carbon dioxide emission in Bangladesh, gross national income in Bangladesh, autoregressive distributed lag model, granger causality, error correction model

Procedia PDF Downloads 150
37201 High-Resolution ECG Automated Analysis and Diagnosis

Authors: Ayad Dalloo, Sulaf Dalloo

Abstract:

Electrocardiogram (ECG) recordings are prone to noise and artifacts, which create ambiguity in analysis by physicians and can lead to errors of diagnosis. Such drawbacks may be overcome with high-resolution methods such as discrete wavelet analysis and digital signal processing (DSP) techniques. The ECG signal analysis is implemented in three stages: ECG preprocessing, feature extraction, and classification, with the aim of realizing high-resolution ECG diagnosis and improved detection of abnormal conditions in the heart. The preprocessing stage involves removing spurious artifacts (noise) due to such factors as muscle contraction, motion, and respiration. ECG features are extracted by applying DSP techniques and a suggested sloping method. These measured features represent the peak amplitude values and intervals of the P, Q, R, S, R', and T waves on the ECG, along with other features such as ST elevation, QRS width, heart rate, electrical axis, and QR and QT intervals. The classification is performed using these extracted features and the criteria for cardiovascular diseases. The ECG diagnostic system is successfully applied to 12-lead ECG recordings for 12 cases. The system is provided with information enabling it to diagnose 15 different diseases. The physician's and the computer's diagnoses are compared, with 90% agreement with respect to the physician's diagnosis, and the time taken for diagnosis is 2 seconds. All of these operations are programmed in the Matlab environment.
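The R-peak detection and heart-rate step can be illustrated generically with a band-pass filter and peak finder, as sketched below with SciPy on a synthetic signal. This is not the paper's sloping method, only a common baseline for the same stage of the pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_r_peaks(ecg, fs=500):
    """Band-pass filter the ECG to emphasise the QRS band, locate R peaks,
    and return peak indices together with the estimated heart rate (bpm)."""
    b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    peaks, _ = find_peaks(filtered, distance=int(0.3 * fs),     # beats >= 300 ms apart
                          height=0.5 * np.max(filtered))
    rr = np.diff(peaks) / fs                                    # R-R intervals in seconds
    heart_rate = 60.0 / rr.mean() if rr.size else float("nan")
    return peaks, heart_rate

# Hypothetical usage on a synthetic spike train resembling R waves (~72 bpm).
fs = 500
t = np.arange(0, 10, 1 / fs)
beat_times = np.arange(0.5, 10, 0.83)
ecg = sum(np.exp(-0.5 * ((t - bt) / 0.01) ** 2) for bt in beat_times)
print(detect_r_peaks(ecg, fs)[1])   # estimated heart rate (bpm)
```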

Keywords: ECG diagnostic system, QRS detection, ECG baseline removal, cardiovascular diseases

Procedia PDF Downloads 297
37200 Photoelastic Analysis and Finite Elements Analysis of a Stress Field Developed in a Double Edge Notched Specimen

Authors: A. Bilek, M. Beldi, T. Cherfi, S. Djebali, S. Larbi

Abstract:

Finite element analysis and photoelasticity are used to determine the stress field developed in a double edge notched specimen loaded in tension. The specimen is cut from a birefringent plate. Experimental isochromatic fringes are obtained with circularly polarized light on the analyzer of a regular polariscope. The fringes represent the loci of points of equal maximum shear stress. In order to obtain the stress values corresponding to the fringe orders recorded in the notched specimen, particularly in the neighborhood of the notches, a calibration disc made of the same material is loaded in compression along its diameter to determine the photoelastic fringe value. This fringe value is also used in the finite element solution in order to obtain the simulated photoelastic fringes, the isochromatics as well as the isoclinics. A color scale is used by the software to represent the simulated fringes on the whole model. The stress concentration factor can be readily obtained at the notches. Good agreement is obtained between the experimental and the simulated fringe patterns, and between the graphs of the shear stress, particularly in the neighborhood of the notches. The purpose of this paper is to show that the isochromatic and isoclinic fringe patterns in a stressed model can be obtained rapidly and accurately by finite element analysis, as the experimental procedure can be time-consuming. Stress fields can therefore be analyzed in three-dimensional models as long as the meshing and the boundary conditions are properly set in the program.
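The conversion from fringe order to maximum shear stress follows the standard stress-optic law, and the calibration disc gives the material fringe value from the centre stress of a diametrally compressed disc. The sketch below uses these standard photoelasticity relations with placeholder values; it is not tied to the paper's specific specimen.

```python
import math

def fringe_value_from_disc(load_n, diameter_m, fringe_order_center):
    """Material fringe value f_sigma from a diametrally compressed calibration disc:
    at the disc centre (sigma1 - sigma2) = 8P / (pi * D * t), so f_sigma = 8P / (pi * D * N)."""
    return 8.0 * load_n / (math.pi * diameter_m * fringe_order_center)

def max_shear_stress(fringe_order, f_sigma, thickness_m):
    """Stress-optic law: tau_max = (sigma1 - sigma2) / 2 = N * f_sigma / (2 * t)."""
    return fringe_order * f_sigma / (2.0 * thickness_m)

# Hypothetical calibration and measurement, for illustration only.
f_sigma = fringe_value_from_disc(load_n=500.0, diameter_m=0.06, fringe_order_center=3.0)
print(max_shear_stress(fringe_order=4.0, f_sigma=f_sigma, thickness_m=0.005))  # Pa
```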

Keywords: isochromatic fringe, isoclinic fringe, photoelasticity, stress concentration factor

Procedia PDF Downloads 229
37199 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that have emerged in recent years is the Passive Optical Network. This paper aims to show the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.
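The data-rate versus bit-error-rate trade-off can be illustrated with the standard Q-factor relation BER = 0.5·erfc(Q/√2): for a fixed received power, a higher bit rate lowers Q and raises the BER. The scaling of Q with bit rate below is a simplified assumption for illustration only.

```python
import math

def ber_from_q(q):
    """Standard relation for binary signalling: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))

# Hypothetical: Q degrades roughly with sqrt(bit rate) for fixed received power,
# so higher data rates push the BER above a target threshold (e.g. 1e-9) sooner.
q_ref, rate_ref = 8.0, 1.25e9        # reference Q at 1.25 Gb/s (illustrative values)
for rate in (1.25e9, 2.5e9, 5e9, 10e9):
    q = q_ref * math.sqrt(rate_ref / rate)
    print(f"{rate / 1e9:5.2f} Gb/s -> Q = {q:4.2f}, BER = {ber_from_q(q):.2e}")
```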

Keywords: FTTH, quad play, play service, access networks, data rate

Procedia PDF Downloads 415
37198 Near Infrared Spectrometry to Determine the Quality of Milk, Experimental Design Setup and Chemometrics: Review

Authors: Meghana Shankara, Priyadarshini Natarajan

Abstract:

Infrared (IR) spectroscopy has revolutionized the way we look at the materials around us. Unraveling the patterns in the molecular spectra of materials to analyze their composition and properties has been one of the most interesting challenges in modern science. Applications of IR spectrometry are numerous in the fields of pharmaceuticals, health, food and nutrition, oils, agriculture, construction, polymers, beverages, fabrics, and much more, limited only by the curiosity of the people. Near Infrared (NIR) spectrometry is applied robustly in analyzing solid and liquid substances because it is a non-destructive analysis method. In this paper, we review the application of NIR spectrometry in milk quality analysis and present the modes of measurement applied in the NIRS measurement setup, the Design of Experiment (DoE), and the classification/quantification algorithms used for milk composition prediction, such as Fat%, Protein%, Lactose%, and Solids Not Fat (SNF%), along with different approaches for adulterant identification. We also discuss the important NIR ranges for the chosen milk parameters. The performance metrics used in the comparison of the various chemometric approaches include Root Mean Square Error (RMSE), R², slope, offset, sensitivity, specificity, and accuracy.
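A common chemometric baseline for this kind of quantification is partial least squares regression from NIR spectra to a constituent such as Fat%. The sketch below uses scikit-learn on synthetic stand-in spectra; it is not taken from any dataset in the review, and in practice the number of latent variables would be chosen by cross-validation.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for NIR absorbance spectra (samples x wavelengths) and fat %.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 200))
y = X[:, 50] * 0.8 + X[:, 120] * 0.4 + rng.normal(0, 0.1, 120) + 3.5   # hypothetical fat %

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=10)
pls.fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()
rmse = np.sqrt(np.mean((y_te - y_hat) ** 2))
print(f"RMSE = {rmse:.3f}, R^2 = {pls.score(X_te, y_te):.3f}")
```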

Keywords: chemometrics, design of experiment, milk quality analysis, NIRS measurement modes

Procedia PDF Downloads 271
37197 Argumentative and Enunciative Analysis of Spanish Political Discourse

Authors: Cristina Diez

Abstract:

One of the most important challenges of discourse analysis is to find the linguistic mechanisms of subjectivity. The present article aims to raise the need for an argumentative and enunciative analysis to reach the subjective tissue of language. The intention is to prove that the instructions inscribed in language itself are those that indicate how a statement is to be interpreted, and that argumentative value is implied at the semantic level. For this, the theory of argumentation of Ducrot and Anscombre will be applied. First, a reflection on the study of subjectivity and enunciation in language will be presented, followed by concrete proposals on the linguistic mechanisms that speakers use, either consciously or unconsciously, to finally focus on those argumentative tools that political discourse uses in order to influence the audience.

Keywords: argumentation, enunciation, discourse analysis, subjectivity

Procedia PDF Downloads 201
37196 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

City governance involves various data, including city component data, demographic data, housing data, and all kinds of business data. These data reflect different aspects of people, events, and activities. Data generated by various systems differ in form, and the data sources differ because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources collected in different ways raise several issues that need to be resolved. Problems of data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining, and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis, and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data, and space data. Metadata needs to be referred to and read when an application needs to access, manipulate, or display the data. Uniform metadata management ensures the effectiveness and consistency of data in the processes of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search, and data delivery.

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 393
37195 A Continuous Boundary Value Method of Order 8 for Solving the General Second Order Multipoint Boundary Value Problems

Authors: T. A. Biala

Abstract:

This paper deals with the numerical integration of general second-order multipoint boundary value problems. This has been achieved through the development of a continuous linear multistep method (LMM). The continuous LMM is used to construct a main discrete method to be used with some initial and final methods (also obtained from the continuous LMM) so that together they form a discrete analogue of the continuous second-order boundary value problems. These methods are used as boundary value methods and adapted to cope with the integration of general second-order multipoint boundary value problems. The convergence, the use, and the region of absolute stability of the methods are discussed. Several numerical examples are implemented to elucidate our solution process.
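For orientation, a second-order two-point boundary value problem can be solved numerically as sketched below with SciPy's solve_bvp, after rewriting it as a first-order system. This is only a generic illustration; the paper's continuous LMM and boundary value methods, and the extension to multipoint conditions, are not reproduced.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Example second-order BVP: y'' + y = 0 on [0, pi/2], y(0) = 0, y(pi/2) = 1
# (exact solution y = sin(x)), written as a first-order system for solve_bvp.
def rhs(x, Y):
    y, yp = Y
    return np.vstack([yp, -y])

def bc(Ya, Yb):
    return np.array([Ya[0] - 0.0, Yb[0] - 1.0])

x = np.linspace(0, np.pi / 2, 11)
Y0 = np.zeros((2, x.size))                     # initial guess for y and y'
sol = solve_bvp(rhs, bc, x, Y0)
print(sol.status, np.max(np.abs(sol.sol(x)[0] - np.sin(x))))  # status 0 and a small error
```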

Keywords: linear multistep methods, boundary value methods, second order multipoint boundary value problems, convergence

Procedia PDF Downloads 377
37194 Performance Evaluation of a Minimum Mean Square Error-Based Physical Sidelink Shared Channel Receiver under Fading Channel

Authors: Yang Fu, Jaime Rodrigo Navarro, Jose F. Monserrat, Faiza Bouchmal, Oscar Carrasco Quilis

Abstract:

Cellular Vehicle-to-Everything (C-V2X) is considered a promising solution for future autonomous driving. From Release 16 to Release 17, the Third Generation Partnership Project (3GPP) has introduced the definitions and services for 5G New Radio (NR) V2X. Experience from previous generations has shown that establishing a simulator for C-V2X communications is an essential preliminary step to achieving reliable and stable communication links. This paper proposes a complete framework of a link-level simulator based on the 3GPP specifications for the Physical Sidelink Shared Channel (PSSCH) of the 5G NR Physical Layer (PHY). In this framework, several algorithms in the receiver part, i.e., sliding-window channel estimation and Minimum Mean Square Error (MMSE)-based equalization, are developed. Finally, the performance of the developed PSSCH receiver is validated through extensive simulations under different assumptions.
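The MMSE equalization step can be sketched in its standard per-subcarrier form, x̂ = conj(H)·y / (|H|² + σ²). The sketch below is generic (QPSK over a flat Rayleigh-faded subcarrier) and does not reflect the internals of the described link-level simulator.

```python
import numpy as np

def mmse_equalize(y, h, noise_var):
    """Per-subcarrier MMSE equalization: x_hat = conj(h) * y / (|h|^2 + noise_var)."""
    return np.conj(h) * y / (np.abs(h) ** 2 + noise_var)

# Hypothetical QPSK symbols through a flat Rayleigh fading channel per subcarrier.
rng = np.random.default_rng(0)
n, noise_var = 1024, 0.1
x = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
h = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
y = h * x + noise
x_hat = mmse_equalize(y, h, noise_var)
print(np.mean(np.abs(x_hat - x) ** 2))   # residual error after equalization
```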

Keywords: C-V2X, channel estimation, link-level simulator, sidelink, 3GPP

Procedia PDF Downloads 130
37193 The SBO/LOCA Analysis of TRACE/SNAP for Kuosheng Nuclear Power Plant

Authors: J. R. Wang, H. T. Lin, Y. Chiang, H. C. Chen, C. Shih

Abstract:

Kuosheng Nuclear Power Plant (NPP) is located on the northern coast of Taiwan. Its nuclear steam supply system is a BWR/6 type designed and built by General Electric on a twin-unit concept. First, the methodology of the Kuosheng NPP SPU (Stretch Power Uprate) safety analysis TRACE/SNAP model was developed in this research. Then, in order to estimate the safety of Kuosheng NPP under more severe conditions, an SBO (Station Blackout) + LOCA (Loss-of-Coolant Accident) transient analysis of the Kuosheng NPP SPU TRACE/SNAP model was performed. In addition, an animation model of Kuosheng NPP was presented using the animation function of SNAP together with the TRACE/SNAP analysis results.

Keywords: TRACE, safety analysis, BWR/6, severe accident

Procedia PDF Downloads 714
37192 Design of the Compliant Mechanism of a Biomechanical Assistive Device for the Knee

Authors: Kevin Giraldo, Juan A. Gallego, Uriel Zapata, Fanny L. Casado

Abstract:

Compliant mechanisms are designed to deform in a controlled manner in response to external forces, utilizing the flexibility of their components to store potential elastic energy during deformation and gradually release it upon returning to their original form. This article explores the design of a knee orthosis intended to assist users during stand-up motion. The orthosis makes use of a compliant mechanism to balance the user's weight, thereby minimizing the strain on leg muscles during stand-up motion. The primary function of the compliant mechanism is to store and exchange potential energy, so that when coupled with the gravitational potential of the user, the total potential energy variation is minimized. The design process for the semi-rigid knee orthosis involved material selection and the development of a numerical model of the compliant mechanism seen as a spring. Geometric properties are obtained through the numerical modeling of the spring once the desired stiffness and safety factor values have been attained. Subsequently, a 3D finite element analysis was conducted. The study demonstrates a strong correlation between the maximum stress in the mathematical model (250.22 MPa) and in the simulation (239.8 MPa), with a 4.16% error. Both analyses give safety factors close to unity: 1.02 for the mathematical approach and 1.1 for the simulation, a 7.84% margin of error. The spring's stiffness, calculated at 90.82 Nm/rad analytically and 85.71 Nm/rad in the simulation, exhibits a 5.62% difference. These results suggest significant potential for the proposed device in assisting patients with knee orthopedic restrictions, contributing to ongoing efforts in advancing the understanding and treatment of knee osteoarthritis.
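The quoted percentage differences can be reproduced directly from the reported values; the short check below takes relative differences with respect to the analytical (mathematical-model) values.

```python
def relative_difference(analytical, simulated):
    """Percentage difference of the simulation relative to the analytical value."""
    return abs(analytical - simulated) / analytical * 100.0

print(relative_difference(250.22, 239.8))   # max stress (MPa)     -> ~4.16 %
print(relative_difference(1.02, 1.1))       # safety factor        -> ~7.84 %
print(relative_difference(90.82, 85.71))    # stiffness (Nm/rad)   -> ~5.6 %
```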

Keywords: biomechanics, compliant mechanisms, gonarthrosis, orthoses

Procedia PDF Downloads 36