Search results for: penalized spline regression method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21456

18336 Preparation of Amorphous silica from Algerian Diatomite and Its Properties

Authors: S. Medeghri, S. Hamzaoui, M. Zerdali, S. Masatomo

Abstract:

This work presents a facile, economical and ecological method to produce pure amorphous silica from Algerian diatomite. Sodium silicate is commonly used as the precursor in preparing silica gel from diatomite. In this study, the preparation of sodium silicate is preceded by acid washing of the raw diatomite; acid is then slowly added to precipitate silica at different pH values and obtain silica gel. The silica gel is characterized by EDX, ICP-MS and XRD. EDX reveals that the purity of the silica is 98% after purification, compared to the raw diatomite.

Keywords: diatomite, acid cleaning, dissolution, amorphous silica, purity

Procedia PDF Downloads 576
18335 Interpretation of the Russia-Ukraine 2022 War via N-Gram Analysis

Authors: Elcin Timur Cakmak, Ayse Oguzlar

Abstract:

This study presents a bigram and trigram analysis of the tweets sent by Twitter users about the Russia-Ukraine war. On February 24, 2022, Russian President Vladimir Putin declared a military operation against Ukraine, and all eyes turned to this war. Many people, especially those living in Russia and Ukraine, reacted to the war, protested, and expressed deep concern, feeling that the safety of their families and their futures was at stake. The most popular way to express such views is through social media, and many people convey their feelings on Twitter, one of the most frequently used social media tools. Since the beginning of the war, thousands of tweets about it have been posted from many countries of the world. These tweets were extracted through the Twitter API and analysed with the Python programming language. The aim of the study is to find the word sequences in these tweets by the n-gram method, which is widely used in computational linguistics and natural language processing. The tweet language used in the study is English. The data set consists of tweets obtained between February 24, 2022, and April 24, 2022. Tweets containing the hashtags #ukraine, #russia, #war, #putin, and #zelensky together were captured as raw data, cleaned in a preprocessing stage, and then passed to the analysis stage. In the data analysis part, sentiment analysis shows what people post about the war: negative messages make up the majority of all tweets, at 63.6%. Furthermore, the most frequently used bigram and trigram word groups are found.
The most frequent bigrams are "he, is", "I, do", and "I, am", and the most frequent trigrams are "I, do, not", "I, am, not", and "I, can, not". In the machine learning phase, classification accuracy is measured with the Classification and Regression Trees (CART) and Naïve Bayes (NB) algorithms, applied separately to bigrams and trigrams. For bigrams, NB gives the highest accuracy and F-measure, while CART gives the highest precision and recall. For trigrams, CART achieves the highest accuracy, precision, and F-measure, while NB gives the highest recall.
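The core n-gram counting step is straightforward to reproduce. A minimal sketch in Python, using illustrative toy tweets rather than the study's data set:

```python
from collections import Counter

def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) in a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# toy stand-ins for preprocessed tweets
tweets = [
    "i do not support this war",
    "i am not in ukraine but i do worry",
    "he is calling for peace",
]

bigram_counts = Counter()
trigram_counts = Counter()
for tweet in tweets:
    tokens = tweet.lower().split()
    bigram_counts.update(ngrams(tokens, 2))
    trigram_counts.update(ngrams(tokens, 3))

print(bigram_counts.most_common(3))
```

On a real corpus the same two counters, built after cleaning, yield exactly the frequency rankings reported above.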

Keywords: classification algorithms, machine learning, sentiment analysis, Twitter

Procedia PDF Downloads 73
18334 The Impact of Job Meaningfulness on the Relationships between Job Autonomy, Supportive Organizational Climate, and Job Satisfaction

Authors: Sashank Nyapati, Laura Lorente-Prieto, Maria Peiro

Abstract:

The general objective of this study is to analyse the mediating role of meaningfulness in the relationships between job autonomy and job satisfaction, and between supportive organizational climate and job satisfaction. The Job Characteristics Model, Conservation of Resources theory, and the Job Demands-Resources theory were used as the theoretical framework. Data were obtained from the 5th European Working Conditions Survey (EWCS), and the sample was composed of 1005 and 1000 workers from Spain and Portugal, respectively. The analysis was conducted using the SOBEL macro for SPSS (a multiple regression mediation model) developed by Preacher and Hayes in 2003. Results indicated that meaningfulness partially mediates both the job autonomy-job satisfaction and the supportive organizational climate-job satisfaction relationships. The mediated proportions are large enough to support substantial conclusions, in particular that job meaningfulness plays an essential, if indirect, role in the amount of satisfaction one experiences at work. Some theoretical and practical implications are discussed.
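The Sobel test behind the SOBEL macro reduces to a simple formula for the indirect effect and its standard error. A sketch with hypothetical path coefficients, not the study's estimates:

```python
import math

def sobel_test(a, sa, b, sb):
    """Sobel z statistic for the indirect effect a*b, where a and b are
    the two path coefficients and sa, sb their standard errors."""
    indirect = a * b
    se = math.sqrt(b ** 2 * sa ** 2 + a ** 2 * sb ** 2)
    return indirect, indirect / se

# hypothetical paths: autonomy -> meaningfulness (a), meaningfulness -> satisfaction (b)
effect, z = sobel_test(a=0.40, sa=0.05, b=0.30, sb=0.06)
print(f"indirect effect = {effect:.3f}, z = {z:.2f}")
```

A |z| above 1.96 indicates a significant indirect (mediated) effect at the 5% level.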

Keywords: meaningfulness, job autonomy, supportive organizational climate, job satisfaction

Procedia PDF Downloads 536
18333 The Challenges for Engineers to Change the Construction Method in Brazil

Authors: Yuri B. Cesarino, Vinícius R. Domingues, Darym J. F. Campos

Abstract:

Developing countries face some restraints in adopting new technologies and construction methods. Some nations, such as Brazil, still use conventional construction methodologies despite their lower cost-effectiveness. This research was conducted to demonstrate how industrialized construction methods could be implemented in Brazil, especially in times of need. Drawing on the consensus among authors with different perspectives, it is clear that the industrialized method is more suitable for construction development because of its great advantages. However, this process is unlikely to be adopted in the country as a result of several socio-economic restraints. Nonetheless, Brazilian engineers have a major challenge ahead of them, and it will take more than creativity to solve such an issue.

Keywords: Brazilian engineers, construction methods, industrialized construction, infrastructure

Procedia PDF Downloads 284
18332 Sensitivity Analysis of Prestressed Post-Tensioned I-Girder and Deck System

Authors: Tahsin A. H. Nishat, Raquib Ahsan

Abstract:

Sensitivity analysis of design parameters can become a significant factor while designing any structural system. The objectives of the study are to analyse the sensitivity of the deck slab thickness parameter obtained from both the conventional and the optimum design methodology of a pre-stressed post-tensioned I-girder and deck system, and to compare the relative significance of slab thickness. For the conventional method, the values of 14 design parameters obtained by the conventional iterative design of a real-life I-girder bridge project have been considered. For the optimization method, cost optimization of this system has been done using the global optimization methodology 'Evolutionary Operation (EVOP)'. The problem, from which the optimum values of the 14 design parameters have been obtained, contains 14 explicit constraints and 46 implicit constraints. For both sets of design parameters, sensitivity analysis has been conducted on the deck slab thickness, which can be very sensitive around the obtained optimum solution. Deviations of slab thickness on both the upper and lower side of its optimum value have been considered, reflecting its realistic possible range of variation during construction, while the remaining parameters have been kept unchanged. For small deviations from the optimum value, compliance with the explicit and implicit constraints has been examined, and variations in the cost have been estimated. It is found that, without violating any constraint, the deck slab thickness obtained by the conventional method can be increased by up to 25 mm, whereas the thickness obtained by cost optimization can be increased by only 0.3 mm. This result suggests that slab thickness is less sensitive in the conventional method of design. Therefore, for realistic design purposes, a sensitivity analysis should be conducted for either design procedure of the girder and deck system.
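The one-at-a-time perturbation loop described above can be sketched with a toy cost model and a single feasibility band, both hypothetical stand-ins for the paper's 46 implicit constraints:

```python
def cost(thickness_mm):
    # hypothetical cost model: cost grows with slab thickness
    return 1000.0 + 2.5 * thickness_mm

def constraints_ok(thickness_mm):
    # hypothetical feasibility band standing in for the real constraints
    return 180.0 <= thickness_mm <= 225.0

def max_feasible_increase(t0, step=1.0, limit=50.0):
    """Largest increase over t0 (in multiples of `step` mm) that keeps
    all constraints satisfied, holding every other parameter fixed."""
    n = 0
    while (n + 1) * step <= limit and constraints_ok(t0 + (n + 1) * step):
        n += 1
    return n * step

t0 = 200.0   # hypothetical slab thickness at the design point, in mm
print(max_feasible_increase(t0), cost(t0 + max_feasible_increase(t0)) - cost(t0))
```

A small feasible increase, as found for the optimized design, signals a highly sensitive parameter; a large one, as for the conventional design, signals slack.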

Keywords: sensitivity analysis, optimum design, evolutionary operations, PC I-girder, deck system

Procedia PDF Downloads 137
18331 Texture Identification Using Vision System: A Method to Predict Functionality of a Component

Authors: Varsha Singh, Shraddha Prajapati, M. B. Kiran

Abstract:

Texture identification is useful in predicting the functionality of a component. Many existing texture identification methods are contact methods, which limits their measuring speed. These contact techniques use a diamond stylus, and the stylus, being sharp, can damage the surface under inspection; hence these techniques can only be used for statistical sampling. Though these contact methods are very accurate, they do not give complete information for full characterization of the surface. In this context, the presented method assumes special significance. The method uses a relatively low-cost vision system for image acquisition. Software based on the wavelet transform is developed for analyzing texture images. Specimens are made using different manufacturing processes (shaping, grinding, milling, etc.). During experimentation, the specimens are illuminated using proper lighting, and texture images are captured using a CCD camera connected to the vision system. The software installed in the vision system processes these images and subsequently identifies the manufacturing process from the texture.
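The wavelet step can be illustrated with a single-level Haar transform: the energy of the detail coefficients is a crude roughness feature that separates smooth from rough surfaces. A minimal sketch on toy "images" (lists of pixel rows), not the paper's actual descriptor:

```python
import math

def haar_step(signal):
    """One level of the 1-D Haar transform: (approximation, detail)."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def row_detail_energy(image):
    """Mean energy of horizontal detail coefficients over all rows --
    a crude texture-roughness feature."""
    total, count = 0.0, 0
    for row in image:
        _, detail = haar_step(row)
        total += sum(d * d for d in detail)
        count += len(detail)
    return total / count

smooth = [[10, 10, 10, 10]] * 4   # flat surface
rough  = [[10, 0, 10, 0]] * 4     # alternating texture
print(row_detail_energy(smooth), row_detail_energy(rough))
```

In practice several decomposition levels and orientations are used, and the resulting energies feed a classifier that maps texture to manufacturing process.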

Keywords: diamond stylus, manufacturing process, texture identification, vision system

Procedia PDF Downloads 289
18330 Cash Flow Position and Corporate Performance: A Study of Selected Manufacturing Companies in Nigeria

Authors: Uzoma Emmanuel Igboji

Abstract:

The study investigates the effects of cash flow position on corporate performance in the manufacturing sector of Nigeria, using multiple regression techniques. The study involved a survey of five (5) manufacturing companies quoted on the Nigerian Stock Exchange. The data were obtained from the annual reports of the selected companies under study. The results show that operating and financing cash flow positions have a significant positive relationship with corporate performance, while the investing cash flow position has a significant negative relationship. The researcher recommends that the regulatory authorities encourage external auditors of these quoted companies to use cash flow ratios in evaluating the performance of a company before expressing an independent opinion on the financial statements. This will give detailed financial information to existing and potential investors to make informed economic decisions.
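The multiple-regression step can be reproduced with the normal equations. A self-contained sketch with hypothetical firm-year rows (not the companies' actual figures):

```python
def ols(X, y):
    """Ordinary least squares via the normal equations X'X b = X'y,
    solved with Gaussian elimination (fine for a handful of regressors)."""
    n, k = len(X), len(X[0])
    xtx = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)]
           for a in range(k)]
    xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    for col in range(k):                      # forward elimination w/ pivoting
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    b = [0.0] * k                             # back substitution
    for r in range(k - 1, -1, -1):
        b[r] = (xty[r] - sum(xtx[r][c] * b[c] for c in range(r + 1, k))) / xtx[r][r]
    return b

# hypothetical rows: [intercept, operating CF, investing CF, financing CF]
X = [[1, 5.0, -2.0, 1.0], [1, 6.0, -1.0, 2.0], [1, 4.0, -3.0, 0.5],
     [1, 7.0, -2.5, 1.5], [1, 5.5, -1.5, 1.0]]
y = [10.0, 12.5, 8.0, 13.0, 11.0]   # performance proxy (illustrative)
print(ols(X, y))
```

The signs of the fitted coefficients on the three cash flow regressors correspond to the positive/negative relationships the abstract reports.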

Keywords: cash flow, financing, performance, operating

Procedia PDF Downloads 315
18329 Digital Library Evaluation by SWARA-WASPAS Method

Authors: Mehmet Yörükoğlu, Serhat Aydın

Abstract:

Since the discovery of the manuscript, mechanical methods for storing, transferring, and using information have evolved into digital methods over time. In this process, libraries, the centers of information, have also become digitized and, by taking on a structure with no physical boundaries, accessible from anywhere in the world at any time. In this context, some criteria for information obtained from digital libraries have become more important for users. This paper evaluates, from different perspectives, the user criteria that make a digital library more useful. The Step-Wise Weight Assessment Ratio Analysis-Weighted Aggregated Sum Product Assessment (SWARA-WASPAS) method is used for its flexibility and easy calculation steps in the evaluation of digital library criteria. Three digital libraries are evaluated by information technology experts according to five conflicting main criteria: 'interface design', 'effects on users', 'services', 'user engagement' and 'context'. Finally, the alternatives are ranked in descending order.
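The two stages compose cleanly: SWARA turns the experts' comparative-importance values into weights, and WASPAS scores each alternative as a blend of weighted-sum and weighted-product models. A sketch with hypothetical expert inputs and ratings, not the study's:

```python
def swara_weights(s):
    """SWARA weights from comparative-importance values s_j, with the
    criteria already sorted from most to least important (s[0] unused)."""
    q = [1.0]
    for sj in s[1:]:
        q.append(q[-1] / (1.0 + sj))
    total = sum(q)
    return [qj / total for qj in q]

def waspas_scores(matrix, weights, lam=0.5):
    """Joint WASPAS score per alternative; rows are alternatives with
    benefit-type criteria, normalized by each column's maximum."""
    cols = list(zip(*matrix))
    norm = [[x / max(col) for x, col in zip(row, cols)] for row in matrix]
    scores = []
    for row in norm:
        wsm = sum(w * x for w, x in zip(weights, row))
        wpm = 1.0
        for w, x in zip(weights, row):
            wpm *= x ** w
        scores.append(lam * wsm + (1 - lam) * wpm)
    return scores

w = swara_weights([0.0, 0.20, 0.15, 0.30, 0.10])  # hypothetical s_j values
ratings = [  # three digital libraries rated on the five criteria (hypothetical)
    [7, 8, 6, 9, 7],
    [9, 7, 8, 6, 8],
    [6, 9, 7, 7, 9],
]
scores = waspas_scores(ratings, w)
ranking = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
print(ranking)
```

The descending sort of the joint scores is exactly the final ranking step the abstract mentions.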

Keywords: digital library, multi criteria decision making, SWARA-WASPAS method

Procedia PDF Downloads 151
18328 The Use of Bituminaria bituminosa (L.) Stirton and Microbial Biotechnologies for Restoration of Degraded Pastoral Lands: The Case of the Middle Atlas of Morocco

Authors: O. Zennouhi, M. El Mderssa, J. Ibijbijen, E. Bouiamrine, L. Nassiri

Abstract:

Rangelands and silvopastoral systems of the Middle Atlas are under heavy pressure, which has led to pasture degradation, invasion by non-palatable and toxic species, and edaphic aridification due to the regression of the global vegetation cover. In this situation, the introduction of multipurpose leguminous shrubs such as Bituminaria bituminosa (L.) Stirton, commonly known as bituminous clover, could be a promising socio-ecological alternative for the rehabilitation of these degraded areas. The application of biofertilizers, such as plant growth-promoting rhizobacteria and especially phosphate-solubilizing bacteria (PSB), can ensure a successful installation of this plant in the selected degraded areas. The main objective of the present work is to produce well-inoculated seedlings in the greenhouse using the most efficient PSB strains, to increase their ability to resist environmental constraints once transplanted to the field in the central Middle Atlas.

Keywords: biofertilizers, bituminaria bituminosa, phosphate solubilizing bacteria, rehabilitation

Procedia PDF Downloads 151
18327 Real-Time Recognition of the Terrain Configuration to Improve Driving Stability for Unmanned Robots

Authors: Bongsoo Jeon, Jayoung Kim, Jihong Lee

Abstract:

Methods for measuring or estimating ground shape with a laser range finder and a vision sensor (exteroceptive sensors) have a critical weakness in that they need a prior database to classify acquired data as a particular surface condition for driving. Also, ground information from exteroceptive sensors does not reflect the deflection of the ground surface caused by the movement of UGVs. Therefore, this paper proposes a method for recognizing exact and precise ground shape using an Inertial Measurement Unit (IMU) as a proprioceptive sensor. The method recognizes the attitude of a robot in real time using the IMU and compensates the attitude data for angle errors through an analysis of vehicle dynamics. The method is verified by outdoor driving experiments with a real mobile robot.
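A common building block for this kind of proprioceptive attitude estimation is the complementary filter, which trusts the integrated gyro short-term and an accelerometer-derived angle long-term. A minimal single-axis sketch (the paper's dynamics-based compensation is more involved):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyro rate (rad/s) with an accelerometer-derived angle (rad):
    trust the integrated gyro short-term and the accelerometer long-term."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# hypothetical samples: stationary robot, gyro reading a constant 0.05 rad/s bias
angle, dt = 0.0, 0.01
for _ in range(1000):
    angle = complementary_filter(angle, gyro_rate=0.05, accel_angle=0.0, dt=dt)
print(angle)  # pure integration would drift to 0.5 rad; the filter stays bounded
```

The accelerometer reference prevents the unbounded drift that pure gyro integration would accumulate over the 10 simulated seconds.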

Keywords: inertial measurement unit, laser range finder, real-time recognition of the ground shape, proprioceptive sensor

Procedia PDF Downloads 287
18326 External Validation of Established Pre-Operative Scoring Systems in Predicting Response to Microvascular Decompression for Trigeminal Neuralgia

Authors: Kantha Siddhanth Gujjari, Shaani Singhal, Robert Andrew Danks, Adrian Praeger

Abstract:

Background: Trigeminal neuralgia (TN) is a heterogeneous pain syndrome characterised by short paroxysms of lancinating facial pain in the distribution of the trigeminal nerve, often triggered by usually innocuous stimuli. TN has a low prevalence of less than 0.1%, and 80% to 90% of cases are caused by compression of the trigeminal nerve by an adjacent artery or vein. The root entry zone of the trigeminal nerve is most sensitive to neurovascular conflict (NVC), causing dysmyelination. Whilst microvascular decompression (MVD) is an effective treatment for TN with NVC, not all patients achieve long-term pain relief. Pre-operative scoring systems by Panczykowski and Hardaway have been proposed but have not been externally validated. These pre-operative scoring systems are composite scores calculated according to the subtype of TN, the presence and degree of neurovascular conflict, and the response to medical treatments. There is discordance between neurosurgeons and radiologists in the assessment of NVC identified on pre-operative magnetic resonance imaging (MRI). To the best of our knowledge, the prognostic impact of this difference of interpretation on MVD has not previously been investigated in the form of a composite scoring system such as those suggested by Panczykowski and Hardaway. Aims: This study aims to identify prognostic factors and externally validate the proposed scoring systems by Panczykowski and Hardaway for TN. A secondary aim is to investigate the prognostic difference between a neurosurgeon's interpretation of NVC on MRI and a radiologist's. Methods: This retrospective cohort study included 95 patients who underwent de novo MVD in a single neurosurgical unit in Melbourne. Data were recorded from patients' hospital records and the neurosurgeon's correspondence from perioperative clinic reviews.
Patient demographics, type of TN, distribution of TN, response to carbamazepine, and the neurosurgeon's and radiologist's interpretations of NVC on MRI were clearly described prospectively and preoperatively in the correspondence. Scoring systems published by Panczykowski et al. and Hardaway et al. were used to determine composite scores, which were compared with the recurrence of TN recorded during follow-up over 1 year. Categorical data were analysed using Pearson chi-square testing. Independent numerical and nominal data were analysed with logistic regression. Results: Logistic regression showed that a Panczykowski composite score of greater than 3 points was associated with a higher likelihood of pain-free outcome 1 year post-MVD, with an OR of 1.81 (95% CI 1.41-2.61, p=0.032). The composite score using the neurosurgeon's impression of NVC had an OR of 2.96 (95% CI 2.28-3.31, p=0.048). A Hardaway composite score of greater than 2 points was associated with a higher likelihood of pain-free outcome 1 year post-MVD, with an OR of 3.41 (95% CI 2.58-4.37, p=0.028). The composite score using the neurosurgeon's impression of NVC had an OR of 3.96 (95% CI 3.01-4.65, p=0.042). Conclusion: Composite scores developed by Panczykowski and Hardaway were validated for the prediction of response to MVD in TN. A composite score based on the neurosurgeon's interpretation of NVC on MRI had a greater correlation with pain-free outcomes 1 year post-MVD than one based on the radiologist's.
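For a single dichotomised predictor such as "composite score > 3", the odds ratio and its Wald 95% CI can be computed directly from a 2×2 table, which is a useful sanity check on logistic-regression output. A sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio(table):
    """Odds ratio and Wald 95% CI from a 2x2 table
    [[exposed & pain-free, exposed & recurrence],
     [unexposed & pain-free, unexposed & recurrence]]."""
    (a, b), (c, d) = table
    est = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(est) - 1.96 * se_log)
    hi = math.exp(math.log(est) + 1.96 * se_log)
    return est, (lo, hi)

# hypothetical counts: composite score > 3 vs. pain-free at one year
est, (lo, hi) = odds_ratio([[40, 10], [25, 20]])
print(f"OR = {est:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

A CI that excludes 1 corresponds to the significant p-values reported above.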

Keywords: de novo microvascular decompression, neurovascular conflict, prognosis, trigeminal neuralgia

Procedia PDF Downloads 74
18325 Reducing Uncertainty of Monte Carlo Estimated Fatigue Damage in Offshore Wind Turbines Using FORM

Authors: Jan-Tore H. Horn, Jørgen Juncher Jensen

Abstract:

Uncertainties in fatigue damage estimation of non-linear systems are highly dependent on the tail behaviour and extreme values of the stress range distribution. By using a combination of the First Order Reliability Method (FORM) and Monte Carlo simulations (MCS), the accuracy of the fatigue estimates may be improved for the same computational effort. The method is applied to a bottom-fixed, monopile-supported large offshore wind turbine, which is a non-linear and dynamically sensitive system. Different curve-fitting techniques for the fatigue damage distribution have been used depending on the sea-state-dependent response characteristics, and the effect of a bi-linear S-N curve is discussed. Finally, analyses are performed for several environmental conditions to investigate the long-term applicability of this multistep method. Wave loads are calculated using state-of-the-art theory, while wind loads are applied with a simplified model based on rotor thrust coefficients.
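As a baseline for what FORM accelerates, the plain MCS part can be sketched with Miner's rule and a one-slope S-N curve; the stress-range distribution, S-N constants, and cycle counts below are purely illustrative, and the FORM step is not included:

```python
import random

def mc_fatigue_damage(n_cycles, n_samples, c=1e12, m=3.0, seed=1):
    """Monte Carlo estimate of Miner-rule fatigue damage: stress ranges
    drawn from a Weibull distribution, S-N curve N(S) = c / S**m."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        d = 0.0
        for _ in range(n_cycles):
            s = rng.weibullvariate(30.0, 2.0)  # stress range, MPa (illustrative)
            d += s ** m / c                    # damage increment 1 / N(s)
        total += d
    return total / n_samples

dmg = mc_fatigue_damage(n_cycles=200, n_samples=50)
print(dmg)
```

Because the damage sum is dominated by the distribution's upper tail (S enters with exponent m), variance-reduction schemes like the FORM-assisted approach pay off most there.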

Keywords: fatigue damage, FORM, monopile, Monte Carlo, simulation, wind turbine

Procedia PDF Downloads 260
18324 Stability Indicating RP – HPLC Method Development, Validation and Kinetic Study for Amiloride Hydrochloride and Furosemide in Pharmaceutical Dosage Form

Authors: Jignasha Derasari, Patel Krishna M, Modi Jignasa G.

Abstract:

Chemical stability of pharmaceutical molecules is a matter of great concern, as it affects the safety and efficacy of the drug product. Stability testing data provide the basis for understanding how the quality of a drug substance and drug product changes with time under the influence of various environmental factors. Besides this, it also helps in selecting a proper formulation and package, as well as proper storage conditions and shelf life, which is essential for regulatory documentation. The ICH guideline states that stress testing is intended to identify the likely degradation products, which further helps in determining the intrinsic stability of the molecule, establishing degradation pathways, and validating the stability indicating procedures. A simple, accurate and precise stability indicating RP-HPLC method was developed and validated for the simultaneous estimation of Amiloride Hydrochloride and Furosemide in tablet dosage form. Separation was achieved on a Phenomenex Luna ODS C18 column (250 mm × 4.6 mm i.d., 5 µm particle size) using a mobile phase consisting of orthophosphoric acid (pH 3.5, adjusted with 0.1% TEA in water) and acetonitrile (50:50 %v/v) in isocratic mode at a flow rate of 1.0 ml/min, with an injection volume of 20 µl and a detection wavelength of 283 nm. Retention times for Amiloride Hydrochloride and Furosemide were 1.810 min and 4.269 min, respectively. Linearity of the proposed method was obtained in the ranges of 40-60 µg/ml and 320-480 µg/ml, with correlation coefficients of 0.999 and 0.998 for Amiloride Hydrochloride and Furosemide, respectively. A forced degradation study was carried out on the combined dosage form under various stress conditions, including hydrolysis (acid and base), oxidative and thermal conditions, as per ICH guideline Q2 (R1). The RP-HPLC method showed adequate separation of Amiloride Hydrochloride and Furosemide from their degradation products.
The proposed method was validated as per ICH guidelines for specificity, linearity, accuracy, precision and robustness for the estimation of Amiloride Hydrochloride and Furosemide in a commercially available tablet dosage form, and the results were found to be satisfactory and significant. The developed and validated stability indicating RP-HPLC method can be used successfully for marketed formulations. Forced degradation studies generate degradants in a much shorter span of time, mostly a few weeks, and can be used to develop the stability indicating method, which can later be applied to the analysis of samples from accelerated and long-term stability studies. Further, a kinetic study was also performed for the different forced degradation parameters of the same combination, which helps in determining the order of reaction.
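For the kinetic part, a first-order rate constant can be recovered from assay data by regressing ln(C) on time. A sketch with synthetic assay values (not the study's measurements):

```python
import math

def first_order_rate(times, concentrations):
    """Least-squares slope of ln(C) against t; returns the first-order
    rate constant k, so that C(t) = C0 * exp(-k * t)."""
    logs = [math.log(c) for c in concentrations]
    n = len(times)
    mt = sum(times) / n
    ml = sum(logs) / n
    num = sum((t - mt) * (l - ml) for t, l in zip(times, logs))
    den = sum((t - mt) ** 2 for t in times)
    return -num / den

t = [0, 2, 4, 6, 8]                    # stress time, hours (synthetic)
c = [100.0, 90.5, 81.9, 74.1, 67.0]    # assay, % label claim (~exp(-0.05 t))
k = first_order_rate(t, c)
print(f"k = {k:.4f} per hour")
```

If the ln(C)-vs-t plot is linear the degradation is first order; linearity of C itself would indicate zero order instead.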

Keywords: amiloride hydrochloride, furosemide, kinetic study, stability indicating RP-HPLC method validation

Procedia PDF Downloads 464
18323 Study of Flow-Induced Noise Control Effects on Flat Plate through Biomimetic Mucus Injection

Authors: Chen Niu, Xuesong Zhang, Dejiang Shang, Yongwei Liu

Abstract:

Fish can secrete a high-molecular-weight fluid onto their skin to enable rapid movement in the water. In this work, we employ a hybrid method that combines Computational Fluid Dynamics (CFD) and the Finite Element Method (FEM) to investigate the effects of different mucus viscosities and injection velocities on the fluctuating pressure in the boundary layer and the flow-induced structural vibration noise of a flat plate model. To accurately capture the transient flow distribution on the plate surface, we use Large Eddy Simulation (LES), and the mucus inlet is positioned at a sufficient distance from the model to ensure effective coverage. Mucus injection is modeled using the Volume of Fluid (VOF) method for multiphase flow calculations. The results demonstrate that mucus control of the pulsating pressure effectively reduces flow-induced structural vibration noise, providing an approach for controlling flow-induced noise in underwater vehicles.

Keywords: mucus, flow control, noise control, flow-induced noise

Procedia PDF Downloads 146
18322 Relevant LMA Features for Human Motion Recognition

Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier

Abstract:

Motion recognition from videos is a very complex task due to the high variability of motions. This paper describes the challenges of human motion recognition, especially the motion representation step with relevant features. Our descriptor vector is inspired by the Laban Movement Analysis method. We select discriminative features using the Random Forest algorithm in order to remove redundant features and make learning algorithms operate faster and more effectively. We validate our method on the MSRC-12 and UTKinect datasets.
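A full Random Forest importance computation is too long to sketch here, but the underlying idea, ranking features by how well they separate motion classes, can be illustrated with the simpler Fisher score (a stand-in for, not a reimplementation of, the paper's method):

```python
def fisher_scores(X, y):
    """Fisher score per feature: between-class variance over within-class
    variance -- a simple stand-in for Random Forest importance ranking."""
    classes = sorted(set(y))
    scores = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        mean = sum(col) / len(col)
        num = den = 0.0
        for cls in classes:
            vals = [col[i] for i in range(len(y)) if y[i] == cls]
            mc = sum(vals) / len(vals)
            num += len(vals) * (mc - mean) ** 2
            den += sum((v - mc) ** 2 for v in vals)
        scores.append(num / den if den else float("inf"))
    return scores

# toy descriptors: feature 0 separates the two motion classes; feature 1 is noise
X = [[0.1, 5.0], [0.2, 3.0], [0.15, 4.0], [0.9, 4.5], [1.0, 3.5], [0.95, 5.0]]
y = [0, 0, 0, 1, 1, 1]
s = fisher_scores(X, y)
print(s[0] > s[1])  # feature 0 is far more discriminative
```

Dropping the low-scoring components of the descriptor vector is what shrinks the feature set and speeds up the downstream learner.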

Keywords: discriminative LMA features, features reduction, human motion recognition, random forest

Procedia PDF Downloads 195
18321 Nonlinear Dynamic Analysis of Base-Isolated Structures Using a Partitioned Solution Approach and an Exponential Model

Authors: Nicolò Vaiana, Filip C. Filippou, Giorgio Serino

Abstract:

The solution of the nonlinear dynamic equilibrium equations of base-isolated structures with a conventional monolithic solution approach, i.e., an implicit single-step time integration method employed with an iteration procedure, together with existing nonlinear analytical models, such as differential equation models, to simulate the dynamic behavior of seismic isolators, can require a significant computational effort. In order to reduce the numerical computations, a partitioned solution method and a one-dimensional nonlinear analytical model are presented in this paper. A partitioned solution approach can be easily applied to base-isolated structures in which the base isolation system is much more flexible than the superstructure. Thus, in this work, the explicit, conditionally stable central difference method is used to evaluate the nonlinear response of the base isolation system, and the implicit, unconditionally stable Newmark constant average acceleration method is adopted to predict the linear response of the superstructure, with the benefit of avoiding iterations within each time step of a nonlinear dynamic analysis. The proposed mathematical model is able to simulate the dynamic behavior of seismic isolators without requiring the solution of a nonlinear differential equation, as is needed with the widely used differential equation models. The proposed mixed explicit-implicit time integration method and nonlinear exponential model are adopted to analyze a three-dimensional seismically isolated structure with a lead rubber bearing system subjected to earthquake excitation. The numerical results show the good accuracy and the significant computational efficiency of the proposed solution approach and analytical model compared to the conventional solution method and mathematical model adopted in this work.
Furthermore, the low stiffness of the base isolation system with lead rubber bearings allows for a critical time step considerably larger than the imposed ground acceleration time step, thus avoiding stability problems in the proposed mixed method.
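The explicit half of the scheme is the classical central difference recursion, which is stable precisely because a soft isolation layer has a long natural period. A minimal linear SDOF sketch with illustrative values, not the paper's isolated structure or its exponential isolator model:

```python
def central_difference(m, c, k, p, dt, u0=0.0, v0=0.0):
    """Explicit central-difference integration of m*u'' + c*u' + k*u = p(t).
    Conditionally stable: requires dt < T_n / pi."""
    a0 = (p[0] - c * v0 - k * u0) / m
    u_prev = u0 - dt * v0 + 0.5 * dt * dt * a0   # fictitious displacement at t = -dt
    u_curr = u0
    lhs = m / dt ** 2 + c / (2 * dt)
    history = [u0]
    for i in range(len(p) - 1):
        rhs = (p[i] - (k - 2 * m / dt ** 2) * u_curr
               - (m / dt ** 2 - c / (2 * dt)) * u_prev)
        history.append(rhs / lhs)
        u_prev, u_curr = u_curr, history[-1]
    return history

# hypothetical soft isolation layer in damped free vibration
m, k = 1000.0, 4000.0                 # kg, N/m -> T_n ~ 3.14 s
dt = 0.01                             # far below the stability limit T_n / pi
steps = int(10.0 / dt)
u = central_difference(m, 50.0, k, [0.0] * (steps + 1), dt, u0=0.05)
print(max(abs(x) for x in u))         # bounded, slowly decaying response
```

For a nonlinear isolator, k*u is simply replaced by the restoring-force evaluation at the current displacement, which is why no iteration is needed within the step.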

Keywords: base-isolated structures, earthquake engineering, mixed time integration, nonlinear exponential model

Procedia PDF Downloads 280
18320 Accurate Positioning Method of Indoor Plastering Robot Based on Line Laser

Authors: Guanqiao Wang, Hongyang Yu

Abstract:

There is a lot of repetitive work in the traditional construction industry, and replacing these manual tasks with robots can significantly improve production efficiency. Therefore, robots appear more and more frequently in the construction industry. Navigation and positioning are very important tasks for construction robots, and the requirements for positioning accuracy are high. Traditional indoor robots mainly use radio-frequency or vision methods for positioning. Compared with ordinary robots, an indoor plastering robot needs to be positioned closer to the wall for plastering, so the requirements for positioning accuracy in construction are higher; with the large error of traditional navigation positioning methods, the robot either cannot plaster the wall from its inexact position or plasters it with a large error. A new positioning method is proposed that is assisted by line lasers and uses image-processing-based positioning to refine the result of the traditional positioning step. In actual work, filtering, edge detection, the Hough transform and other operations are performed on the images captured by the camera. Each time the position of the laser line is found, it is compared with the standard value, and the robot is moved or rotated to complete the positioning. The experimental results show that the actual positioning error is reduced to less than 0.5 mm by this accurate positioning method.
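The laser-line detection step can be sketched with a bare-bones Hough accumulator over edge points (synthetic points below, not camera data):

```python
import math

def hough_lines(points, n_theta=180):
    """Vote in (theta, rho) space for every edge point and return the
    dominant line as (theta, rho, votes)."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(t, rho)] = acc.get((t, rho), 0) + 1
    (t_best, rho_best), votes = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * t_best / n_theta, rho_best, votes

# synthetic vertical "laser line" at x = 20 in the edge image
edge_points = [(20, y) for y in range(40)]
theta, rho, votes = hough_lines(edge_points)
print(theta, rho, votes)  # theta = 0 (line normal along x), rho = 20, 40 votes
```

Comparing the detected (theta, rho) with the expected values gives the correction the robot applies by translating or rotating.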

Keywords: indoor plastering robot, navigation, precise positioning, line laser, image processing

Procedia PDF Downloads 148
18319 Determination of the Bank's Customer Risk Profile: Data Mining Applications

Authors: Taner Ersoz, Filiz Ersoz, Seyma Ozbilge

Abstract:

In this study, the clients who applied to a bank branch for loans were analyzed through data mining. The study used information such as the amounts of loans received by personal and SME clients working with the bank branch, installment numbers, the number of delays in loan installments, payments held in other banks, and the number of banks to which they were in debt between 2010 and 2013. The client risk profile was examined through Classification and Regression Tree (CART) analysis, one of the decision tree classification methods. At the end of the study, 5 different types of customers were identified on the decision tree. These customer types were rated by the risk they pose to the bank branch, and the customers were classified according to the risk ratings.
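The heart of CART is repeatedly choosing the split that most reduces impurity; a single-split sketch on hypothetical client data (not the branch's records):

```python
def best_gini_split(values, labels):
    """Exhaustive search for the threshold minimizing weighted Gini
    impurity -- the core step a CART tree repeats recursively."""
    def gini(lab):
        n = len(lab)
        if n == 0:
            return 0.0
        p = sum(lab) / n          # labels are 0/1 risk flags
        return 2 * p * (1 - p)
    best = (None, float("inf"))
    for thr in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= thr]
        right = [l for v, l in zip(values, labels) if v > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (thr, score)
    return best

# hypothetical clients: number of delayed installments vs. default flag
delays  = [0, 0, 1, 2, 5, 6, 7, 9]
default = [0, 0, 0, 0, 1, 1, 1, 1]
print(best_gini_split(delays, default))  # splits cleanly at 2 delays
```

Applied recursively to each resulting subset, this procedure grows the tree whose leaves correspond to the customer risk types.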

Keywords: client classification, loan suitability, risk rating, CART analysis

Procedia PDF Downloads 338
18318 The Origins of Inflation in Tunisia

Authors: Narimen Rdhaounia Mohamed Kouni

Abstract:

Our aim in this paper is to identify the origins of inflation in Tunisia over the period from 1988 to 2018. An ARDL methodology is used to estimate the model. We also studied the effect of the informal economy on inflation; to this end, we estimated the size of the informal economy in Tunisia based on the Gutmann method. The results show three main origins of inflation: the fiscal policy adopted by Tunisia, particularly after the revolution; the increase in monetary variables; and the informal economy, which plays an important role in inflation.
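The Gutmann currency-ratio calculation is simple enough to sketch: currency holdings above a base-year currency/deposit ratio are assumed to finance informal activity at the velocity of recorded GDP. The figures below are hypothetical, not Tunisian data:

```python
def gutmann_informal_gdp(gdp, currency, deposits, base_ratio):
    """Gutmann currency-ratio estimate of the informal economy: currency
    in excess of the base-year currency/deposit ratio is assumed to
    circulate at the same velocity as recorded GDP."""
    legal_currency = base_ratio * deposits
    excess_currency = currency - legal_currency
    velocity = gdp / (legal_currency + deposits)
    return velocity * excess_currency

# hypothetical national-accounts figures (billions of local currency)
informal = gutmann_informal_gdp(gdp=100.0, currency=25.0,
                                deposits=80.0, base_ratio=0.20)
print(informal)
```

The choice of base year (and hence base_ratio) drives the whole estimate, which is the method's best-known limitation.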

Keywords: inflation, consumer price index, informal, gutmann method, ARDL model

Procedia PDF Downloads 82
18317 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: ’Reddit’

Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell

Abstract:

Native language identification is one of the growing subfields in natural language processing (NLP). The task of native language identification (NLI) is mainly concerned with predicting the native language of an author writing in a second language. In this paper, we investigate the performance of two types of features, content-based features vs. content-independent features, when they are evaluated on a different corpus (using the social media data 'Reddit'). In this NLI task, the predefined models are trained on one corpus (TOEFL), and the trained models are then evaluated on different data from an external corpus (Reddit). Three classifiers are used in this task: a baseline, a linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones both within the corpus and across corpora.

Keywords: NLI, NLP, content-based features, content-independent features, social media corpus, ML

Procedia PDF Downloads 137
18316 Impact of Microfinance in Promoting Rural Economic Growth in Nigeria

Authors: Udeh Anastasia Ifeoma

Abstract:

The need to develop rural areas in developing countries, which have suffered decades of neglect, is increasing. It is against this background that this paper examined the contribution of microfinance to Nigeria's gross domestic product. Time series data for the 12-year period 1999-2010 were collated from the Central Bank of Nigeria's published annual reports. Least squares (LS) regression was used to analyze the data. The result revealed that microfinance activities made a negative and non-significant contribution to gross domestic product in Nigeria. The paper recommends that, since rural poverty is often a product of poor infrastructure, the government should make a conscious effort to industrialize the rural areas, thereby motivating microfinance institutions to locate their offices and extend credit facilities there and so improving rural economic growth.

Keywords: microfinance, rural economic growth, Nigeria, developing countries

Procedia PDF Downloads 451
18315 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data

Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar

Abstract:

It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science, so the development of accurate, robust and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. Two major methods still dominate, Box-Jenkins ARIMA and exponential smoothing (ES), and new methods continue to be derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing remains one of the most practically relevant forecasting methods available due to its simplicity, robustness and accuracy as an automatic forecasting procedure, as demonstrated especially in the famous M-Competitions. Despite this success and widespread use in many areas, ES models have some shortcomings that negatively affect forecast accuracy. Therefore, this study proposes a new forecasting method, called the ATA method, to cope with these shortcomings. The new method is obtained from traditional ES models by modifying the smoothing parameters: both methods thus have similar structural forms, and ATA can be easily adapted to each of the individual ES models, but ATA has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modeling the trend component and handling seasonality patterns by utilizing classical decomposition. Therefore, the ATA method is expanded to the higher-order ES methods for additive, multiplicative, additive damped and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performances are compared to their counterpart ES models on the M3-Competition data set, since it is still the most recent and comprehensive time-series data collection available.
It is shown that the models outperform their counterparts in almost all settings, and when a model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-Competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared using popular error metrics.
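
A minimal level-only sketch contrasting classical simple exponential smoothing with the kind of modified weighting the abstract describes for ATA, where the smoothing weight at step t varies as p/t rather than staying at a constant alpha. This is an illustration of the weighting idea only, not the trended models of the paper.

```python
def ses(x, alpha):
    """Classical simple exponential smoothing: constant weight alpha."""
    s = x[0]
    for v in x[1:]:
        s = alpha * v + (1 - alpha) * s
    return s  # one-step-ahead level forecast

def ata_level(x, p=1):
    """ATA-style level: time-varying weight p/t once t exceeds p."""
    s = x[0]
    for t, v in enumerate(x[1:], start=2):
        w = p / t if t > p else 1.0
        s = w * v + (1 - w) * s
    return s

series = [10.0, 12.0, 11.0, 13.0, 12.5, 14.0]
print(ses(series, alpha=0.3), ata_level(series, p=1))
```

A nice sanity check of the time-varying weight: with p = 1 the recursion collapses to the running mean of the series, while larger p keeps more weight on recent observations.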

Keywords: accuracy, exponential smoothing, forecasting, initial value

Procedia PDF Downloads 177
18314 Design and Development of Hybrid Rocket Motor

Authors: Aniket Aaba Kadam, Manish Mangesh Panchal, Roushan Ashit Sharma

Abstract:

This project focuses on the design and development of a lab-scale hybrid rocket motor to accurately determine the regression rate of a fuel/oxidizer combination consisting of solid paraffin and gaseous oxygen (GOX). Hybrid motors offer the advantage of on-demand thrust control over both solid and liquid systems in certain applications. The thermodynamic properties of the propellant combination were calculated using NASA CEA at different chamber pressures and corresponding O/F values to determine initial operating conditions with suitable peak temperatures and optimal O/F values. The project also includes the design of the injector orifice and the determination of the final design configurations of the motor casing, pressure control setup, and valve configuration. This research will be valuable in advancing the understanding of paraffin-based propulsion and improving the performance of hybrid rocket motors.
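
The regression rate the project sets out to measure is commonly summarized by the empirical power law r = a * Gox**n; the sketch below uses placeholder coefficients, since the actual paraffin/GOX constants would come from the motor's test firings.

```python
def regression_rate(gox, a=0.1, n=0.6):
    """Fuel regression rate (mm/s) for oxidizer mass flux gox (kg/m^2/s).

    a and n are placeholder empirical constants; real values are fitted
    from fired-grain measurements for a given fuel/oxidizer pair.
    """
    return a * gox ** n

# regression rate rises sub-linearly with oxidizer mass flux
for gox in (50, 100, 200):
    print(gox, round(regression_rate(gox), 3))
```

Because n is below 1 for most hybrid fuels, doubling the oxidizer flux less than doubles the regression rate, which is why throttling a hybrid shifts its O/F ratio.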

Keywords: hybrid rocket, NASA CEA, injector, thrust control

Procedia PDF Downloads 103
18313 Optimal Portfolio Selection under Treynor Ratio Using Genetic Algorithms

Authors: Imad Zeyad Ramadan

Abstract:

In this paper, a genetic algorithm (GA) was developed to construct the optimal portfolio based on the Treynor method. The GA maximizes the Treynor ratio under a budget constraint to select the best allocation of the budget among the companies in the portfolio. The results show that the GA was able to construct a conservative portfolio which includes companies from the three sectors. This indicates that the GA reduced the risk to the investor, as it chose some companies with positive betas (moving with the market) and some with negative betas (moving against the market).
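
A minimal GA sketch of the task described: maximize the portfolio Treynor ratio (Rp - Rf) / beta_p under a full-budget constraint (weights normalized to sum to 1). The returns, betas and GA settings below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
r = np.array([0.12, 0.10, 0.15, 0.08])   # expected returns (invented)
beta = np.array([1.2, 0.9, 1.5, 0.7])    # stock betas (invented)
rf = 0.03                                 # risk-free rate

def treynor(w):
    w = np.abs(w)
    w = w / w.sum()                       # enforce the budget constraint
    return (w @ r - rf) / (w @ beta)

pop = rng.random((50, 4))                 # initial population of weight vectors
for _ in range(100):
    fit = np.array([treynor(w) for w in pop])
    parents = pop[np.argsort(fit)[-25:]]            # selection: keep the best half
    kids = (parents[rng.integers(0, 25, 25)] +
            parents[rng.integers(0, 25, 25)]) / 2   # crossover: average two parents
    kids += rng.normal(0, 0.05, kids.shape)         # mutation
    pop = np.vstack([parents, kids])

best = max(pop, key=treynor)
print(np.abs(best) / np.abs(best).sum(), treynor(best))
```

Because the Treynor ratio is a linear-fractional function of the weights, its unconstrained-sector maximum sits at a simplex vertex; sector or diversification constraints like the paper's are what force the GA toward mixed, more conservative portfolios.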

Keywords: optimization, genetic algorithm, portfolio selection, Treynor method

Procedia PDF Downloads 449
18312 The Use of Random Set Method in Reliability Analysis of Deep Excavations

Authors: Arefeh Arabaninezhad, Ali Fakher

Abstract:

Since deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods have been suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables that depend on ground behavior are required. The random set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using the random set method, smooth bounds on system responses are obtained with a relatively small number of simulations compared to fully probabilistic methods; the random set approach has therefore been proposed for reliability analysis in geotechnical problems. In the present study, the application of the random set method to the reliability analysis of deep excavations is investigated through three deep excavation projects that were monitored during excavation. A finite element code is utilized for numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables, and subsequently reduce the number of required finite element calculations, a sensitivity analysis is carried out. Input data for the finite element model are obtained by combining the upper and lower bounds of the input variables. The probability share of each finite element calculation is determined from the probability assignments of the input variables present in these combinations. The horizontal displacement of the top point of the excavation is considered the main response of the system. The result of the reliability analysis for each deep excavation is presented by constructing the belief and plausibility distribution functions (i.e., lower and upper bounds) of the system response obtained from the deterministic finite element calculations.
To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models was compared to the in situ measurements, and good agreement was observed. The comparison also showed that the random set finite element method is suitable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.
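
A toy sketch of the bound-combination step described above: two input variables, each with two probability-weighted ranges; every combination of range bounds yields a response interval, and belief/plausibility of exceeding a displacement threshold accumulate from the joint weights. The placeholder response model and all numbers are invented, standing in for the paper's finite element runs.

```python
from itertools import product

# (interval, probability assignment) per input variable -- invented values
friction = [((28, 34), 0.6), ((30, 38), 0.4)]   # friction angle (degrees)
modulus = [((40, 60), 0.5), ((50, 80), 0.5)]    # soil modulus (MPa)

def response(phi, E):
    # placeholder model: top-point displacement (mm) falls with both inputs
    return 500.0 / (phi * E ** 0.5)

threshold = 2.5  # mm, the displacement limit
belief = plaus = 0.0
for (f_int, pf), (e_int, pe) in product(friction, modulus):
    # evaluate the model at all 4 bound combinations -> response interval
    vals = [response(f, e) for f in f_int for e in e_int]
    lo, hi = min(vals), max(vals)
    w = pf * pe                      # joint probability of this focal element
    if lo > threshold:               # interval lies entirely in the failure region
        belief += w
    if hi > threshold:               # interval at least touches the failure region
        plaus += w
print(belief, plaus)  # lower and upper probability of exceeding the threshold
```

The gap between belief and plausibility is exactly the epistemic uncertainty the random set formulation keeps visible, which a single-valued probability of failure would hide.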

Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty

Procedia PDF Downloads 268
18311 Prediction of Dubai Financial Market Stocks Movement Using K-Nearest Neighbor and Support Vector Regression

Authors: Abdulla D. Alblooshi

Abstract:

The stock market is a reflection of human behavior and psychology, such as fear, greed, and discipline, manifested in the form of price movements during trading sessions. Predicting stock movements and prices is therefore challenging. However, those trading sessions produce a large amount of data that can be used to train an AI agent to predict stock movements, and such predictions would be advantageous. In this paper, the movements of three DFM-listed stocks are studied using historical price movements and technical indicator values, which are used to train an agent with KNN and SVM methods to predict future price movements. MATLAB toolboxes and a simple script are used to process and classify the information and output the prediction. The different learning methods and parameters are also compared using metrics such as RMSE, MAE, and R².
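
An illustrative Python analogue of the setup described (the paper itself uses MATLAB toolboxes): fit KNN and SVR on a lagged price plus a simple moving-average indicator, then compare them with RMSE and MAE. The price series is simulated, not DFM data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(4)
price = np.cumsum(rng.normal(0, 1, 300)) + 100   # simulated price series

# features: previous close and a 5-step moving average (a technical indicator)
ma5 = np.convolve(price, np.ones(5) / 5, mode="valid")
X = np.column_stack([price[4:-1], ma5[:-1]])
y = price[5:]                                    # next-step price to predict
X_train, X_test = X[:250], X[250:]
y_train, y_test = y[:250], y[250:]

for model in (KNeighborsRegressor(n_neighbors=5), SVR(C=10.0)):
    pred = model.fit(X_train, y_train).predict(X_test)
    rmse = mean_squared_error(y_test, pred) ** 0.5
    mae = mean_absolute_error(y_test, pred)
    print(type(model).__name__, round(rmse, 3), round(mae, 3))
```

Keeping the train/test split strictly chronological, as above, is essential for time-series data; a shuffled split would leak future prices into training.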

Keywords: KNN, ANN, SVM, stocks, technical indicators, RSI, MACD, moving averages, RMSE, MAE

Procedia PDF Downloads 171
18310 An Infinite Mixture Model for Modelling Stutter Ratio in Forensic Data Analysis

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Forensic DNA analysis has received much attention over the last three decades due to its incredible usefulness in human identification. The statistical interpretation of DNA evidence is recognised as one of the most mature fields in forensic science. Peak heights in an electropherogram (EPG) are approximately proportional to the amount of template DNA in the original sample being tested. A stutter is a minor peak in an EPG that does not correspond to an allele of a potential contributor; it is considered an artefact, presumed to arise from miscopying or slippage during PCR. Stutter peaks are mostly analysed in terms of the stutter ratio, calculated relative to the corresponding parent allele height. Analysis of mixture profiles has always been problematic in evidence interpretation, especially in the presence of PCR artefacts like stutters. Unlike binary and semi-continuous models, continuous models assign a probability (as a continuous weight) to each possible genotype combination and significantly enhance the use of continuous peak height information, resulting in more efficient and reliable interpretations. A sound methodology to distinguish between stutters and real alleles is therefore essential for the accuracy of the interpretation, and any such method must be able to model stutter peaks. Bayesian nonparametric methods provide increased flexibility in applied statistical modelling. Mixture models are frequently employed as fundamental data analysis tools in the clustering and classification of data and assume unidentified heterogeneous sources for the data. In model-based clustering, each unknown source is represented by a cluster, and the clusters are modelled using parametric models. Specifying the number of components in finite mixture models, however, is practically difficult, even though the calculations are relatively simple.
Infinite mixture models, in contrast, do not require the user to specify the number of components. Instead, a Dirichlet process, which is an infinite-dimensional generalization of the Dirichlet distribution, is used to deal with the problem of choosing the number of components. The Chinese restaurant process (CRP), the stick-breaking process and the Pólya urn scheme are frequently used constructions of the Dirichlet process prior in Bayesian mixture models. In this study, we illustrate an infinite mixture of simple linear regression models for modelling the stutter ratio and introduce some modifications to overcome weaknesses associated with the CRP.
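
A minimal sketch of the Chinese restaurant process prior mentioned above: each new observation joins an existing cluster with probability proportional to the cluster's size, or opens a new cluster with probability proportional to the concentration parameter alpha. This illustrates only the prior over partitions, not the paper's regression mixture.

```python
import random

def crp(n, alpha, seed=0):
    """Sample cluster assignments for n observations from a CRP(alpha)."""
    rng = random.Random(seed)
    tables = []                       # tables[k] = number of customers at table k
    assignment = []
    for _ in range(n):
        # unnormalized weights: existing table sizes, then alpha for a new table
        weights = tables + [alpha]
        k = rng.choices(range(len(weights)), weights=weights)[0]
        if k == len(tables):          # customer opened a new table
            tables.append(0)
        tables[k] += 1
        assignment.append(k)
    return assignment, tables

assignment, tables = crp(100, alpha=1.0)
print(len(tables), tables)  # number of clusters and their sizes
```

In an infinite mixture of regressions, each "table" would carry its own slope and intercept for the stutter-ratio model, with the CRP deciding how many such components the data support.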

Keywords: Chinese restaurant process, Dirichlet prior, infinite mixture model, PCR stutter

Procedia PDF Downloads 330
18309 Optimising Transcranial Alternating Current Stimulation

Authors: Robert Lenzie

Abstract:

Transcranial electrical stimulation (tES) features significantly in the research literature. However, the effects of tES on brain activity are still poorly understood at the scalp surface level, at the level of Brodmann areas, and in terms of impact on neural networks. Using a method such as electroencephalography (EEG) in conjunction with tES might make it possible to understand in more depth the brain's response and the mechanisms behind the alterations reported in the literature. Directly observing the effect of tES on EEG may offer high-temporal-resolution data on the brain activity changes and modulations brought on by tES, corresponding to various processing stages within the brain. This paper provides unpublished information on a cutting-edge methodology that may reveal details about the dynamics of how the human brain works beyond what is currently achievable with existing methods.

Keywords: tACS, frequency, EEG, optimal

Procedia PDF Downloads 83
18308 Predictors of Response to Interferon Therapy in Chronic Hepatitis C Virus Infection

Authors: Ali Kassem, Ehab Fawzy, Mahmoud Sef el-eslam, Fatma Salah- Eldeen, El zahraa Mohamed

Abstract:

Introduction: The combination of interferon (INF) and ribavirin is the preferred treatment for chronic hepatitis C virus (HCV) infection. However, non-response to this therapy remains common and is associated with viral factors such as HCV genotype and viral load, in addition to host factors such as sex, HLA type and cytokine polymorphisms. Aim of the work: The aim of this study was to determine predictors of response to INF therapy in chronic HCV-infected patients treated with INF alpha and ribavirin combination therapy. Patients and Methods: The present study included 110 patients (62 males, 48 females) with chronic HCV infection. Their ages ranged from 20 to 59 years. Inclusion criteria followed the protocol of the Egyptian National Committee for Control of Viral Hepatitis. Patients were recruited to receive INF-ribavirin combination therapy: 54 patients received pegylated INF α-2a (180 μg) with weight-based ribavirin therapy (1000 mg if < 75 kg, 1200 mg if > 75 kg) for 48 weeks, and 53 patients received pegylated INF α-2b (1.5 μg/kg/week) with weight-based ribavirin therapy (800 mg if < 65 kg, 1000 mg if 65-75 kg and 1200 mg if > 75 kg). One hundred and seven liver biopsies were included in the study and submitted for histopathological examination. Hematoxylin and eosin (H&E) stained sections were prepared to assess both the grade and the stage of chronic viral hepatitis, in addition to the degree of steatosis. The modified hepatic activity index (HAI) grading, modified Ishak staging and Metavir grading and staging systems were used. Laboratory follow-up included HCV PCR at the 12th week, to assess the early virologic response (EVR), and again at the 24th week. At the end of the course, HCV PCR was done and repeated 6 months later to document the end-of-treatment virologic response (ETR) and sustained virologic response (SVR), respectively.
Results: One hundred and seven patients, 62 males (57.9%) and 45 females (42.1%), completed the course and were included in this study. The age of the patients ranged from 20 to 59 years, with a mean of 40.39 ± 10.03 years. Six months after the end of treatment, patients were categorized into two groups: Group 1, patients who achieved a sustained virological response (SVR), and Group 2, patients who did not achieve a sustained virological response (non-SVR), comprising non-responders, breakthrough cases and relapsers. In our study, 58 (54.2%) patients showed SVR, 18 (16.8%) were non-responders, 15 (14%) showed breakthrough and 16 (15%) were relapsers. Univariate binary regression analysis of the possible risk factors for non-SVR showed that the significant factors were higher age, higher fasting insulin level, higher Metavir stage and higher grade of hepatic steatosis. Multivariate binary regression analysis showed that the only independent risk factor for non-SVR was a high fasting insulin level. Conclusion: Younger age, lower Metavir stage, lower steatosis grade and lower fasting insulin level are good predictors of SVR and could be used to predict the response to pegylated interferon/ribavirin therapy.
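
A hedged sketch of the multivariate binary (logistic) regression analysis described, on simulated patients; in this toy data only fasting insulin truly drives the outcome, mirroring the study's finding, and all variables and scales are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 300
age = rng.uniform(20, 60, n)          # years
insulin = rng.uniform(2, 30, n)       # fasting insulin (invented scale)
stage = rng.integers(0, 5, n)         # Metavir-like stage 0-4

# toy outcome: only fasting insulin truly drives non-SVR here
logit = 0.25 * (insulin - 15)
non_svr = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([age, insulin, stage])
model = LogisticRegression(max_iter=2000).fit(X, non_svr)
odds_ratios = np.exp(model.coef_[0])  # per-unit odds ratios for each predictor
print(dict(zip(["age", "insulin", "stage"], odds_ratios.round(2))))
```

An odds ratio well above 1 for insulin, with the others near 1, is the multivariate analogue of the paper's conclusion that fasting insulin is the only independent risk factor.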

Keywords: chronic HCV infection, interferon ribavirin combination therapy, predictors to antiviral therapy, treatment response

Procedia PDF Downloads 396
18307 Food Composition Tables Used as an Instrument to Estimate the Nutrient Ingest in Ecuador

Authors: Ortiz M. Rocío, Rocha G. Karina, Domenech A. Gloria

Abstract:

There are several tools to assess the nutritional status of a population, and a main instrument commonly used to build those tools is the food composition table (FCT). Despite the importance of FCTs, there are many sources of error and variability in building such tables, which can lead to under- or overestimation of a population's nutrient intake. This work identified the different food composition tables used as instruments to estimate nutrient intake in Ecuador. Data for choosing the FCTs were collected through key informants (self-completed questionnaires), supplemented with institutional web research. A questionnaire with general variables (origin, year of edition, etc.) and methodological variables (method of elaboration, information in the table, etc.) was applied to the identified FCTs. These variables were defined based on an extensive literature review, and a descriptive content analysis was performed. Ten printed tables and three databases were reported, all of which were treated indistinctly as food composition tables. We managed to obtain information on 69% of the references; several informants referred to printed documents that were not accessible, and internet searches were not successful. Of the nine final tables, eight are from Latin America, and five of these were constructed by the indirect method (compilation of already published data), with a database from the United States Department of Agriculture (USDA) as the main source of information. One FCT was constructed using the direct method (bromatological analysis) and has its origin in Ecuador. All of the tables made a clear distinction between foods and their cooking methods, 88% expressed nutrient values per 100 g of edible portion, 77% gave precise additional information about the use of the table, and 55% presented all the macro- and micronutrients in detail. The most complete FCTs were those of INCAP (Central America) and the Composition of Foods (Mexico).
The most frequently referenced table was the Ecuadorian food composition table of 1965 (70%). The indirect method was used for most tables in this study. However, this method has the disadvantage of generating less reliable food composition tables, because foods vary in composition, so a database cannot accurately predict the composition of any isolated sample of a food product. In conclusion, weighing the pros and cons, and despite its being elaborated by the indirect method, it is considered appropriate to work with the INCAP Central America FCT, given its proximity to our country and a food item list very similar to ours. It is also imperative to keep as a reference the Ecuadorian food composition table, which, although not updated, was constructed using the direct method with Ecuadorian foods. Both tables will therefore be used to elaborate a questionnaire for assessing the food consumption of the Ecuadorian population; where values differ, only the INCAP values will be taken, because it is an updated table.

Keywords: Ecuadorian food composition tables, FCTs elaborated by the direct method, nutrient intake of Ecuadorians, Latin American food composition tables

Procedia PDF Downloads 432