Search results for: Statistical Approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16643

15923 Effectiveness of Homoeopathic Medicine Conium Maculatum 200 C for Management of Pyuria

Authors: Amir Ashraf

Abstract:

Homoeopathy is an alternative system of medicine founded by the German physician Samuel Hahnemann in 1796. It has been used globally for various health conditions for more than 200 years. In India, homoeopathy is considered a major system of alternative medicine. Homoeopathy has been found effective in various medical conditions, including pyuria, a condition in which pus cells are found in the urine. Homoeopathically potentized Conium maculatum (hemlock) is an important remedy commonly used for reducing pyuria. Aim: To reduce the number of pus cells found in urine using Conium Mac 200C. Methods: Design: small-N design. Samples: purposive sampling of 5 cases diagnosed with pyuria. Tools: personal data schedule and ICD-10 criteria for pyuria. Techniques: the potentized homoeopathic medicine Conium Mac in the 200th potency was used. Statistical Analysis: the statistical analyses were done using non-parametric tests. Results: a significant pre/post difference was identified. Conclusion: the homoeopathic potency Conium Mac 200C is effective in reducing elevated levels of pus cells found in urine samples.
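As a rough sketch of the kind of non-parametric pre/post comparison the abstract describes, an exact Wilcoxon signed-rank test for a small-N paired design can be computed by brute force. The pus-cell counts below are invented for illustration, not data from the study:

```python
from itertools import product

def wilcoxon_exact(pre, post):
    """Exact one-sided Wilcoxon signed-rank test for a tiny paired sample,
    testing whether 'post' values are systematically lower than 'pre'.
    Assumes no ties among the absolute differences (enough for a sketch)."""
    d = [b - a for a, b in zip(pre, post) if b != a]
    n = len(d)
    order = sorted(range(n), key=lambda i: abs(d[i]))
    rank = {i: r + 1 for r, i in enumerate(order)}   # 1 = smallest |difference|
    w_pos = sum(rank[i] for i in range(n) if d[i] > 0)  # positive-rank sum
    # Exact null distribution: every rank carries a random sign, so
    # enumerate all 2^n sign patterns and count those at least as extreme.
    count = sum(1 for signs in product([0, 1], repeat=n)
                if sum((r + 1) * s for r, s in enumerate(signs)) <= w_pos)
    return w_pos, count / 2 ** n

# Hypothetical pre/post pus-cell counts per high-power field for 5 cases.
pre = [18, 25, 12, 30, 22]
post = [6, 10, 4, 12, 8]
w, p = wilcoxon_exact(pre, post)  # w = 0, p = 1/32 ~ 0.031
```

With all five counts decreasing, the one-sided exact p-value is 1/32, which illustrates the smallest attainable p-value in a 5-case design.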

Keywords: homoeopathy, alternative medicine, pyuria, Conium Mac, small N design, non-parametric tests, homeopathic physician, Ashirvad Hospital, Kannur

Procedia PDF Downloads 316
15922 Automatic Post Stroke Detection from Computed Tomography Images

Authors: C. Gopi Jinimole, A. Harsha

Abstract:

For detecting strokes, Computed Tomography (CT) is the preferred modality for imaging abnormalities or infarction in the brain. Because of the problems in the window settings used to evaluate brain CT images, they perform very poorly in early-stage infarction detection. This paper presents an automatic estimation method for the window settings of CT images to obtain proper contrast of the hyper infarction present in the brain. In the proposed work, the window width is estimated automatically for each slice, and the window centre is changed to a new value of 31 HU, which is the average of the HU values of grey matter and white matter in the brain. The automatic window width estimation is based on the average of the median of statistical central moments. With the newly suggested window centre and the estimated window width, hyper infarction or post-stroke regions in CT brain images are properly detected. The proposed approach assists radiologists in CT evaluation of early quantitative signs of delayed stroke, so that the severe hemorrhage it can lead to may be prevented by providing timely medication to the patients.
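The windowing step the abstract describes, remapping Hounsfield units around a fixed centre of 31 HU, can be sketched as follows. The window width here is a placeholder value, since the paper estimates it per slice from central moments:

```python
def apply_window(hu_values, wc, ww):
    """Map Hounsfield units to an 8-bit display range for a given
    window centre (wc) and window width (ww): values below wc - ww/2
    clip to 0, values above wc + ww/2 clip to 255."""
    lo, hi = wc - ww / 2.0, wc + ww / 2.0
    out = []
    for v in hu_values:
        if v <= lo:
            out.append(0)
        elif v >= hi:
            out.append(255)
        else:
            out.append(int(round((v - lo) / (hi - lo) * 255)))
    return out

# Centre fixed at 31 HU (the paper's grey/white-matter average);
# the width of 40 HU is an illustrative placeholder, not the estimate.
pixels = apply_window([-10, 20, 31, 42, 80], wc=31, ww=40)
```

Narrowing the width stretches the contrast across a smaller HU band, which is what makes a subtle infarction visible.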

Keywords: computed tomography (CT), hyper infarction or post stroke region, Hounsfield Unit (HU), window centre (WC), window width (WW)

Procedia PDF Downloads 189
15921 Characteristics of Wood Plastics Nano-Composites Made of Agricultural Residues and Urban Recycled Polymer Materials

Authors: Amir Nourbakhsh Habibabadi, Alireza Ashori

Abstract:

Context: The growing concern over the management of plastic waste and the high demand for wood-based products have led to the development of wood-plastic composites. Agricultural residues, which are abundantly available, can be used as a source of lignocellulosic fibers in the production of these composites. The use of recycled polymers and nanomaterials is also a promising approach to enhance the mechanical and physical properties of the composites. Research Aim: The aim of this study was to investigate the feasibility of using recycled high-density polyethylene (rHDPE), polypropylene (rPP), and agricultural residue fibers for manufacturing wood-plastic nano-composites. The effects of these materials on the mechanical properties of the composites, specifically tensile and flexural strength, were studied. Methodology: The study utilized an experimental approach in which extruders and hot presses were used to fabricate the composites. Five types of cellulosic residue fibers (bagasse, corn stalk, rice straw, sunflower, and canola stem) and three nanomaterials (carbon nanotubes, nano-silica, and nanoclay) were used, with a coupling agent added to chemically bind the wood/polymer fibers, chemicals, and reinforcement. The mechanical properties of the composites were then analyzed. Findings: The study found that composites made with rHDPE provided moderately superior tensile and flexural properties compared to rPP samples. The addition of agricultural residues in several types of wood-plastic nano-composites significantly improved their bending and tensile properties, with bagasse having the most significant advantage over other lignocellulosic materials. The use of recycled polymers, agricultural residues, and nano-silica resulted in composites with the best strength properties.
Theoretical Importance: The study's findings suggest that using agricultural fiber residues as reinforcement in wood/plastic nanocomposites is a viable approach to improve the mechanical properties of the composites. Additionally, the study highlights the potential of using recycled polymers in the development of value-added products without compromising the product's properties. Data Collection and Analysis Procedures: The study collected data on the mechanical properties of the composites using tensile and flexural tests. Statistical analyses were performed to determine the significant effects of the various materials used. Question addressed: Can agricultural residues and recycled polymers be used to manufacture wood-plastic nano-composites with enhanced mechanical properties? Conclusion: The study demonstrates the feasibility of using agricultural residues and recycled polymers in the production of wood-plastic nano-composites. The addition of these materials significantly improved the mechanical properties of the composites, with bagasse being the most effective agricultural residue. The study's findings suggest that composites made from recycled materials can offer value-added products without sacrificing performance.

Keywords: polymer, composites, wood, nano

Procedia PDF Downloads 58
15920 Statistical Model to Examine the Impact of the Inflation Rate and Real Interest Rate on the Bahrain Economy

Authors: Ghada Abo-Zaid

Abstract:

Introduction: Oil is one of the main income sources in Bahrain, so low oil prices influence economic growth and the investment rate. For example, economic growth was 3.7% in 2012 and fell to 2.9% in 2015. The investment rate was 9.8% in 2012 and dropped to 5.9% and -12.1% in 2014 and 2015, respectively. The inflation rate peaked in 2013 at 3.3%. Objectives: The objective is to build statistical models to examine the effect of the real interest rate and the inflation rate on economic growth in Bahrain from 2000 to 2018. Methods: This study is based on 18 years of data, and a multiple regression model is used for the analysis. All missing data are omitted from the analysis. Results: A regression model is used to examine the association between Gross National Product (GNP), the inflation rate, and the real interest rate. We found that (i) an increase in the real interest rate decreases GNP, and (ii) an increase in the inflation rate does not affect economic growth in Bahrain, since the average inflation rate was almost 2%, which is considered low. Conclusion: There is a significant impact of the real interest rate on GNP in Bahrain, while the inflation rate does not show any negative influence on GNP, as it was not large enough to affect the economic growth rate in Bahrain negatively.
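The multiple regression described in the Methods can be sketched with plain least squares. The figures below are invented to mimic the shape of the Bahrain series, not the study's data:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting -- plenty for
    a regression with a handful of predictors."""
    k = len(X[0])
    # Build the augmented system [X'X | X'y].
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k + 1):
                A[r][j] -= f * A[c][j]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):
        b[r] = (A[r][k] - sum(A[r][j] * b[j] for j in range(r + 1, k))) / A[r][r]
    return b

# Hypothetical data generated from GNP growth = 4.0 - 0.5*rate + 0.1*inflation;
# a real analysis would use the Bahrain series described in the abstract.
obs = [(1.0, 2.0), (2.0, 1.5), (3.0, 3.3), (1.5, 2.5), (2.5, 1.0), (0.5, 2.0)]
X = [[1.0, rate, infl] for rate, infl in obs]
y = [4.0 - 0.5 * rate + 0.1 * infl for rate, infl in obs]
beta = ols(X, y)  # ~ [4.0, -0.5, 0.1]
```

The negative coefficient on the real interest rate mirrors result (i) of the abstract: higher real rates are associated with lower GNP growth.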

Keywords: gross national product, Egypt, regression model, interest rate

Procedia PDF Downloads 141
15919 Statistical Inferences for the GQARCH-Itô-Jumps Model Based on the Realized Range Volatility

Authors: Fu Jinyu, Lin Jinguan

Abstract:

This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion used to model high-frequency data, and the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the “GQARCH-Itô-Jumps model.” We adopt realized range-based threshold estimation for the high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for parametric estimation. The asymptotic theory is established for the proposed estimators, mainly in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite-sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approaches can be practically used on some financial data.
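The realized range idea, using intra-interval highs and lows instead of returns only, can be illustrated with the classical Parkinson scaling; the threshold correction for jumps that the paper develops is omitted in this sketch:

```python
import math

def realized_range(highs, lows):
    """Realized range volatility estimator: sums squared intra-interval
    log ranges, scaled by 1/(4 ln 2) so each term is unbiased for the
    interval variance of a driftless Brownian motion (Parkinson scaling)."""
    return sum(math.log(h / l) ** 2
               for h, l in zip(highs, lows)) / (4.0 * math.log(2.0))

# Hypothetical intra-day (high, low) pairs for a handful of 5-min intervals.
highs = [101.2, 101.0, 100.8, 101.5]
lows = [100.4, 100.1, 100.2, 100.6]
rv = realized_range(highs, lows)
```

A return-based estimator would see only the interval endpoints; the range keeps the information carried by the path's extremes, which is the motivation stated in the abstract.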

Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate

Procedia PDF Downloads 138
15918 Posterior Thigh Compartment Syndrome Associated with Hamstring Avulsion and Antiplatelet Therapy

Authors: Andrea Gatti, Federica Coppotelli, Ma Primavera, Laura Palmieri, Umberto Tarantino

Abstract:

Aim of study: The scientific literature is short of studies weighing the pros and cons of the paratricipital approach for the treatment of humeral shaft fractures. The lateral paratricipital approach is a valid alternative to the classical posterior approach to the humeral shaft, as it preserves both the triceps muscle and the elbow extensor mechanism. Based on our experience, this retrospective analysis aims at analyzing the outcome, risks, and benefits of the lateral paratricipital approach for humeral shaft fractures. Methods: Our study includes 14 patients treated between 2018 and 2019 for unilateral humeral shaft fractures: 13 with a B1 or B2 fracture and one patient with a type C fracture (according to the AO/OTA classification). Six of our patients identified as male and 8 as female; the average age was 57.8 years (range 21-73 years). A lateral paratricipital approach was performed on all 14 patients, sparing the triceps muscle by avoiding olecranon osteotomy and by assessing the integrity and preservation of the radial nerve; osteosynthesis of the humeral shaft fracture was performed by means of plates and screws. After surgery, all patients started elbow functional rehabilitation with acceptable pain management. Post-operative follow-up was carried out by assessing radiographs, MEPS (Mayo Elbow Performance Score) and DASH (Disability of the Arm, Shoulder and Hand) functional scores, and the ROM of the affected joint. Results: All 14 patients had an optimal post-operative course with adequate osteosynthesis and functional rehabilitation, entirely preserving the operated elbow joint; the mean elbow ROM was 0-118.6 degrees (range 0-130), while the average MEPS score was 86 (range 75-100) and the average DASH score was 79.9 (range 21.7-86.1). Only 2 patients suffered from temporary radial nerve apraxia, which resolved at subsequent follow-ups.
Conclusion: The lateral paratricipital approach preserves both the integrity of the triceps muscle and the elbow biomechanism, but we strongly recommend that additional studies be carried out to highlight differences between it and the classical posterior approach in treating humeral shaft fractures.

Keywords: paratricipital approach, humerus shaft fracture, posterior approach humeral shaft, paratricipital postero-lateral approach

Procedia PDF Downloads 112
15917 An Analytical Approach to Calculate Thermo-Mechanical Stresses in Integral Abutment Bridge Piles

Authors: Jafar Razmi

Abstract:

Integral abutment bridges are bridges that do not have joints. When these bridges are subject to large seasonal and daily temperature variations, the expansion and contraction of the bridge slab are transferred to the piles. Since the piles are deep in the soil, the displacement induced by the slab can cause bending and stresses in the piles. These stresses cause fatigue and failure of the piles. A complex mechanical interaction exists between the slab, piles, soil, and abutment, and it needs to be understood in order to calculate the stresses in the piles. This paper uses a mechanical approach to develop analytical equations for this complex structure and determine the stresses in the piles. The solution to these analytical equations is developed and compared with finite element analysis results and experimental data. Our comparison shows that the analytical approach can accurately predict the displacement in the piles. This approach offers a simplified technique that can be utilized without the need for a computationally expensive finite element model.
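The driving mechanism, slab expansion imposing a lateral displacement on the pile heads, can be sketched with textbook formulas. The fixed-fixed sway idealisation and all numbers below are illustrative assumptions, since the paper derives its own soil-structure equations:

```python
def slab_thermal_displacement(alpha, d_temp, length):
    """Free thermal length change of the bridge slab: delta = alpha*dT*L.
    With symmetric abutments, each pile head sees roughly half of it."""
    return alpha * d_temp * length

def pile_head_moment(E, I, delta, L):
    """Bending moment induced at the head of a pile idealised as a
    fixed-fixed member undergoing lateral sway delta: M = 6*E*I*delta/L^2.
    The real boundary conditions depend on the soil; this is a sketch."""
    return 6.0 * E * I * delta / L ** 2

# Illustrative values: steel pile, 100 m slab, 30 degC seasonal swing.
d = slab_thermal_displacement(alpha=1.0e-5, d_temp=30.0, length=100.0)  # 0.03 m
m = pile_head_moment(E=200e9, I=4.0e-4, delta=d / 2.0, L=8.0)           # N*m
```

Even a few centimetres of slab movement produces substantial cyclic moments in the piles, which is why the fatigue concern arises.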

Keywords: integral abutment bridges, piles, thermo-mechanical stress, stress and strains

Procedia PDF Downloads 227
15916 Defects Estimation of Embedded Systems Components by a Bond Graph Approach

Authors: I. Gahlouz, A. Chellil

Abstract:

This paper concerns the estimation of faults in system components using an unknown input observer. To reach this goal, we used the bond graph approach to physical modelling. We show that this graphical tool allows the representation of system component faults as unknown inputs within the state representation of the considered physical system. The causal and structural features of the system (controllability, observability, finite structure, and infinite structure) were then studied with the bond graph approach in order to design an unknown input observer, which is used for estimating the component faults.

Keywords: estimation, bond graph, controllability, observability

Procedia PDF Downloads 395
15915 Health Monitoring and Failure Detection of Electronic and Structural Components in Small Unmanned Aerial Vehicles

Authors: Gopi Kandaswamy, P. Balamuralidhar

Abstract:

Fully autonomous small Unmanned Aerial Vehicles (UAVs) are increasingly being used in many commercial applications. Although a lot of research has been done to develop safe, reliable, and durable UAVs, accidents due to electronic and structural failures are not uncommon and pose a huge safety risk to UAV operators and the public. Hence there is a strong need for an automated health monitoring system for UAVs, with a view to minimizing mission failures and thereby increasing safety. This paper describes our approach to monitoring the electronic and structural components in a small UAV without the need for additional sensors. Our system monitors data from four sources: sensors, navigation algorithms, control inputs from the operator, and flight controller outputs. It then performs statistical analysis on the data and applies a rule-based engine to detect failures. This information can then be fed back into the UAV, and a decision to continue or abort the mission can be taken automatically by the UAV, independent of the operator. Our system has been verified using data obtained from real flights over the past year, from UAVs of various sizes that we have designed and deployed for various applications.
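A minimal stand-in for the statistical rule engine described above (the paper's actual rules are not given) could flag sustained deviations from a sensor's nominal behaviour:

```python
def detect_fault(readings, mean, std, z_thresh=3.0, run_len=3):
    """Fire a fault rule when `run_len` consecutive readings deviate from
    the nominal mean by more than z_thresh standard deviations. Returns
    the index at which the rule fires, or None if it never does.
    A hypothetical rule -- one of many a real engine would combine."""
    run = 0
    for t, x in enumerate(readings):
        if abs(x - mean) > z_thresh * std:
            run += 1
            if run >= run_len:
                return t
        else:
            run = 0
    return None

# Illustrative vibration channel: nominal mean 0.0, std 0.5; three
# consecutive out-of-band samples trigger the rule at index 5.
alarm = detect_fault([0.1, 0.0, -0.2, 2.1, 2.4, 2.2, 0.1], mean=0.0, std=0.5)
```

Requiring a run of violations rather than a single spike is a common way to trade detection delay for a lower false-alarm rate, which matters when the UAV aborts a mission autonomously.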

Keywords: fault detection, health monitoring, unmanned aerial vehicles, vibration analysis

Procedia PDF Downloads 237
15914 A Generic Approach to Reuse Unified Modeling Language Components Following an Agile Process

Authors: Rim Bouhaouel, Naoufel Kraïem, Zuhoor Al Khanjari

Abstract:

Unified Modeling Language (UML) is one of the most widespread modeling languages, standardized by the Object Management Group (OMG). The model-driven engineering (MDE) community therefore attempts to reuse UML diagrams rather than construct them from scratch. A UML model emerges from a specific software development process, yet existing model-generation methods focus on transformation techniques without considering that process. Our work aims to construct a UML component from fragments of UML diagrams based on an agile method. We define a UML fragment as a portion of a UML diagram that expresses a business target. To guide the generation of fragments of UML models using an agile process, we need a flexible approach that adapts to agile changes and covers all of its activities. We use software product lines (SPL) to derive a fragment of the agile process method. This paper explains our approach, named RECUP, to generating UML fragments following an agile process, and gives an overview of its different aspects. We present the approach and define its phases and artifacts.

Keywords: UML, component, fragment, agile, SPL

Procedia PDF Downloads 375
15913 Identifying and Quantifying Factors Affecting Traffic Crash Severity under Heterogeneous Traffic Flow

Authors: Praveen Vayalamkuzhi, Veeraragavan Amirthalingam

Abstract:

Studies on highway safety are becoming the need of the hour, as over 400 lives are lost every day in India due to road crashes. In order to evaluate the factors that lead to different levels of crash severity, it is necessary to investigate the level of safety of highways and their relation to crashes. In the present study, an attempt is made to identify the factors that contribute to road crashes and to quantify their effect on the severity of road crashes. The study was carried out on a four-lane divided rural highway in India. The variables considered in the analysis include components of the horizontal alignment of the highway (straight or curved sections), time of day, driveway density, presence of a median, median openings, gradient, operating speed, and annual average daily traffic. These variables were selected after a preliminary analysis. The major complexities in the study are the heterogeneous traffic and the speed variation between different classes of vehicles along the highway. To quantify the impact of each of these factors, statistical analyses were carried out using a logit model and negative binomial regression. The output from the statistical models showed that the horizontal alignment components, driveway density, time of day, operating speed, and annual average daily traffic have a significant relation to crash severity, i.e., both fatal and injury crashes. Further, annual average daily traffic has a more significant effect on severity than the other variables. The contribution of the highway's horizontal components to crash severity is also significant. Logit models predicted crashes better than the negative binomial regression models. The results of the study will help transport planners to consider these aspects at the planning stage for highways operated under heterogeneous traffic flow conditions.
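A logit model of the kind used in the study can be sketched with plain gradient ascent on invented crash records, where severity is regressed on a normalised operating speed and a curved-section indicator (both names and data are illustrative, not the study's variables or records):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(X, y, lr=0.5, epochs=3000):
    """Binary logit fitted by plain gradient ascent (no regularisation) --
    a minimal sketch of a crash-severity model (1 = severe, 0 = otherwise)."""
    k, n = len(X[0]), len(X)
    b = [0.0] * k
    for _ in range(epochs):
        p = [sigmoid(sum(bj * xj for bj, xj in zip(b, xi))) for xi in X]
        for j in range(k):
            b[j] += lr * sum((yi - pi) * xi[j]
                             for xi, yi, pi in zip(X, y, p)) / n
    return b

# Rows: [intercept, normalised speed, curved-section indicator].
X = [[1, 0.2, 0], [1, 0.3, 0], [1, 0.4, 0], [1, 0.5, 0],
     [1, 0.6, 1], [1, 0.7, 1], [1, 0.8, 1], [1, 0.9, 1]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
beta = fit_logit(X, y)
p_hi = sigmoid(sum(b * x for b, x in zip(beta, [1, 0.9, 1])))  # fast, on curve
p_lo = sigmoid(sum(b * x for b, x in zip(beta, [1, 0.2, 0])))  # slow, straight
```

Positive fitted coefficients on speed and curvature correspond to the abstract's finding that these factors raise the odds of a severe crash.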

Keywords: geometric design, heterogeneous traffic, road crash, statistical analysis, level of safety

Procedia PDF Downloads 277
15912 Examination of the Main Behavioral Patterns of Male and Female Students in Islamic Azad University

Authors: Sobhan Sobhani

Abstract:

This study examined the behavioral patterns of students and their determinants according to the "symbolic interaction" sociological perspective, in the form of 7 hypotheses. The behavioral patterns of students were classified into 8 categories: religious, scientific, political, artistic, sporting, national, parents, and teachers. They were evaluated from student opinions on a five-point Likert rating scale. The statistical population included all male and female students of the Islamic Azad University, Behbahan branch, from whom 600 students (268 females and 332 males) were selected randomly. The following statistical methods were used: frequency and percentage, mean, t-test, Pearson correlation coefficient, and multi-way analysis of variance. The statistical analysis showed that: 1. There is a significant difference between male and female students in their disposition toward religious figures, artists, teachers, and parents. 2. There is a significant difference between students from urban and rural areas in assuming the behavioral patterns of religious, political, scientific, artistic, and national figures and teachers. 3. The most important criterion for students' selection of behavioral patterns is intellectual identification with the pattern. 4. The most important factor influencing the behavioral patterns of male and female students is parents, followed by friends. 5. Boys are affected by teachers, the Internet, and satellite programs more than girls; girls assume behavioral patterns from books more than boys. 6. There is a significant difference between students in the humanities, technical, medical, and engineering disciplines in selecting religious and political figures as behavioral patterns. 7. There is a significant difference between students belonging to different subcultures in assuming the behavioral patterns of religious, scientific, and cultural figures.
8. Between first- and fourth-year students, there is a significant difference only in selecting religious figures. 9. There is a significant negative correlation between the education level of parents and the selection of religious and political figures and teachers. 10. There is a significant negative correlation between family income and the selection of political and religious figures.

Keywords: behavioral patterns, male and female students, Islamic Azad University

Procedia PDF Downloads 346
15911 Ecopsychological Approach to Enhance Space Consciousness Toward Environment

Authors: Tiwi Kamidin

Abstract:

After years of effort to integrate environmental education, studies keep revealing that Malaysians have still not reached the desired level of commitment toward the environment. Some researchers note that the health of our planet depends on our mental health, especially where our psychological and spiritual life is split from the natural world. Therefore, this study discusses an ecopsychological approach to enhancing space consciousness toward the environment. Space consciousness represents freedom not only from the ego but also from dependency on the things of this world, from materialism and materiality; it is the spiritual dimension which alone can give transcendent and true meaning to this world. If pupils can balance this internal awareness, they will come to respect the environment as part of themselves and their family, rather than merely as a contributor to the continuance of human life. Qualitative findings showed that the informants considered their consciousness toward the environment to have changed.

Keywords: ecopsychological approach, space consciousness, environmental education, environment

Procedia PDF Downloads 289
15910 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia

Authors: Rohan Bhasin

Abstract:

Agrammatism in non-fluent aphasia can be defined as a language disorder wherein a patient can only use content words (nouns, verbs, and adjectives) for communication; their speech is devoid of functional word types like conjunctions and articles, producing speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles, etc.) around content words (nouns, verbs, and adjectives) using a combination of natural language processing and deep learning algorithms, which can be applied to assist communication. The approach the paper investigates is LSTMs or seq2seq: a sequence-to-sequence (seq2seq) or LSTM model takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example consisting of a pair (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing the functional words to keep just the content words. However, this approach requires a lot of training data to produce coherent output. The assumption of this approach is that the content words received in the input are preserved, i.e., they are not altered after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where such word order might not be inherently correct. The approach can therefore be used to assist communication in mild agrammatism in non-fluent aphasia: by generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations.
Our project thus translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
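The training-pair generation step described above, stripping functional words from complete sentences, can be sketched directly. The stoplist here is a tiny illustrative subset of a real closed-class word list:

```python
# Hypothetical closed-class stoplist; a real pipeline would use a full
# functional-word lexicon or a POS tagger to identify content words.
FUNCTION_WORDS = {"a", "an", "the", "is", "are", "was", "were", "in", "on",
                  "at", "of", "to", "and", "but", "or", "with", "for"}

def make_training_pair(sentence):
    """Strip functional words from a complete sentence, yielding the
    (content-words, full-sentence) pair the seq2seq model trains on."""
    tokens = sentence.lower().split()
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return " ".join(content), sentence

pair = make_training_pair("The dog is in the garden")
# pair == ("dog garden", "The dog is in the garden")
```

At inference time the model runs in the opposite direction: given the agrammatic content words, it proposes the full sentence as a candidate for the patient to select.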

Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM

Procedia PDF Downloads 147
15909 Simulation Based Analysis of Gear Dynamic Behavior in Presence of Multiple Cracks

Authors: Ahmed Saeed, Sadok Sassi, Mohammad Roshun

Abstract:

Gears are important components with a vital role in many rotating machines. One of the common causes of gear failure is tooth fatigue cracking; however, its early detection is still a challenging task. The objective of this study is to develop a numerical model that simulates the effect of tooth cracks on the resulting gear vibrations and consequently permits early fault detection. In contrast to other published papers, this work incorporates the possibility of multiple simultaneous cracks with different depths. As cracks significantly alter the stiffness of the tooth, finite element software is used to determine the stiffness variation with respect to the angular position for different combinations of crack orientation and depth. A simplified six-degree-of-freedom nonlinear lumped-parameter model of a one-stage spur gear system is proposed to study the vibration with and without cracks. The stiffness model for the cracked tooth permitted updating the physical parameters of the six-degree-of-freedom equations of motion describing the vibration of the gearbox. The vibration simulation results of the gearbox were obtained using Simulink/MATLAB. The effect of one crack at different levels was studied thoroughly. The changes in the mesh stiffness and the vibration response were found to be consistent with previously published works. In addition, various statistical time-domain parameters were considered; they showed different degrees of sensitivity to the crack depth. Multiple cracks were then introduced at different locations, and the vibration response along with the statistical parameters were obtained again for a general case of degradation (increases in crack depth, crack number, and crack locations). It was found that although some parameters increase in value as the deterioration level increases, they show almost no change, or even decrease, when the number of cracks increases.
Therefore, the use of any single statistical parameter could be misleading if not considered in an appropriate way.
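The statistical time-domain parameters mentioned above typically include indicators such as RMS, crest factor, and kurtosis, which can be computed as follows (the exact parameter set used in the paper is not specified):

```python
import math

def time_domain_features(x):
    """RMS, crest factor, and kurtosis of a vibration signal -- common
    time-domain indicators for tracking gear-tooth crack severity.
    Crest factor and kurtosis grow when a crack injects impulsive spikes."""
    n = len(x)
    mean = sum(x) / n
    rms = math.sqrt(sum(v * v for v in x) / n)
    peak = max(abs(v) for v in x)
    m2 = sum((v - mean) ** 2 for v in x) / n   # 2nd central moment
    m4 = sum((v - mean) ** 4 for v in x) / n   # 4th central moment
    return {"rms": rms, "crest": peak / rms, "kurtosis": m4 / m2 ** 2}

# A clean periodic signal as a healthy-gear stand-in.
feats = time_domain_features([0.0, 1.0, 0.0, -1.0] * 8)
```

Monitoring several of these indicators together, rather than any single one, is exactly the point of the abstract's closing caution.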

Keywords: spur gear, cracked tooth, numerical simulation, time-domain parameters

Procedia PDF Downloads 252
15908 Impact of Yogic Exercise on Cardiovascular Function on Selected College Students of High Altitude

Authors: Benu Gupta

Abstract:

The purpose of the study was to assess the impact of yogic exercise on cardiovascular function in selected college students at high altitude. The research was conducted on college students at high altitude in Shimla, measuring cardiovascular function [blood pressure (BP), VO2 max (TLC), and pulse rate (PR)] with respect to yogic exercise. A total of 139 students were randomly selected from Himachal University colleges in Shimla. The study was conducted in three phases: the subjects were identified in the first phase of the research program, and in the next phase they were physiologically tested, with the yogic exercise battery administered over different time frames. All subjects underwent three months of yogic exercise and were then evaluated physiologically again [cardiovascular measurements: blood pressure (BP), VO2 max (TLC), and pulse rate (PR)] with standard equipment. Statistical analyses of the variables (PR, BP (SBP and DBP), and TLC) were performed. The results reveal a significant difference in TLC, whereas there was no significant difference in PR. For BP, the statistical analysis suggested no significant difference; rather, the participants' BP moved closer to the normal standard BP, i.e., 120/80 mmHg.

Keywords: cardiovascular function, college students, high altitude, yogic exercise

Procedia PDF Downloads 213
15907 Combining Corpus Linguistics and Critical Discourse Analysis to Study Power Relations in Hindi Newspapers

Authors: Vandana Mishra, Niladri Sekhar Dash, Jayshree Charkraborty

Abstract:

This paper focuses on the application of corpus linguistics techniques to the critical discourse analysis (CDA) of Hindi newspapers. While corpus linguistics is the study of language as expressed in corpora (samples) of 'real world' text, CDA is an interdisciplinary approach to the study of discourse that views language as a form of social practice. CDA has mainly been studied from a qualitative perspective; however, recent studies have begun combining corpus linguistics with CDA to analyze large volumes of text for the study of existing power relations in society. The corpus under study is also of sizable volume (1 million words of Hindi newspaper text), and its analysis requires an alternative analytical procedure. We have therefore combined the quantitative approach, i.e., the use of corpus techniques, with CDA's traditional qualitative analysis. In this context, we have focused on keyword analysis, sorting the concordance lines of the selected keywords, and calculating the collocates of the keywords, using WordSmith Tools for all these analyses. The analysis starts with identifying the keywords in the political news corpus when compared with the main news corpus. The keywords are extracted from the corpus based on their keyness, calculated through statistical tests such as the chi-squared test and the log-likelihood test on the frequent words of the corpus. Some of the top-occurring keywords are मोदी (Modi), भाजपा (BJP), कांग्रेस (Congress), सरकार (Government), and पार्टी (Political party). This is followed by concordance analysis of these keywords, which generates thousands of lines, of which we select a few and examine them based on our objective. We have also calculated the collocates of the keywords based on their Mutual Information (MI) score. Both concordance and collocation help to identify lexical patterns in the political texts.
Finally, all the quantitative results derived from the corpus techniques are interpreted qualitatively in accordance with CDA theory, to examine the ways in which political news discourse produces social and political inequality, power abuse, or domination.
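The keyness and collocation scores described above follow standard corpus-linguistics formulas; the counts below are invented for illustration, not drawn from the Hindi corpus:

```python
import math

def log_likelihood(a, b, c, d):
    """Log-likelihood keyness of a word occurring a times in a c-token
    study corpus vs b times in a d-token reference corpus (Dunning's G2)."""
    e1 = c * (a + b) / (c + d)   # expected frequency in the study corpus
    e2 = d * (a + b) / (c + d)   # expected frequency in the reference corpus
    g2 = 0.0
    if a:
        g2 += a * math.log(a / e1)
    if b:
        g2 += b * math.log(b / e2)
    return 2 * g2

def mutual_information(f_xy, f_x, f_y, n):
    """Pointwise MI for a collocate pair: log2(N * f(x,y) / (f(x) * f(y)))."""
    return math.log2(n * f_xy / (f_x * f_y))

# Hypothetical counts: a word 120x in a 100k-token political subcorpus
# vs 40x in a 900k-token reference corpus.
keyness = log_likelihood(120, 40, 100_000, 900_000)
mi = mutual_information(f_xy=10, f_x=100, f_y=100, n=100_000)
```

A keyness score above the chi-squared critical value (6.63 for p < 0.01 with one degree of freedom) marks the word as a keyword of the political subcorpus.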

Keywords: critical discourse analysis, corpus linguistics, Hindi newspapers, power relations

Procedia PDF Downloads 201
15906 Nonparametric Path Analysis with a Truncated Spline Approach in Modeling Waste Management Behavior Patterns

Authors: Adji Achmad Rinaldo Fernandes, Usriatur Rohma

Abstract:

Nonparametric path analysis is a statistical method that does not require the functional form of the regression curve to be known in advance. The purposes of this study are (1) to determine the best truncated spline nonparametric path function, comparing linear and quadratic polynomial degrees with 1, 2, and 3 knot points, and (2) to assess the significance of the estimates of the best truncated spline nonparametric path function in a model of the effect of perceived benefits and perceived convenience on the behavior of converting waste into economic value, mediated by the intention to change people's mindset about waste, using the t-test statistic at the jackknife resampling stage. The data used in this study are primary data obtained from research grants. The results show that the best nonparametric truncated spline path model is the quadratic polynomial degree with 3 knot points. In addition, the significance test of the best truncated spline nonparametric path function estimate using jackknife resampling shows that all exogenous variables have a significant influence on the endogenous variables.
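The truncated spline family compared in the study can be sketched as a design matrix of polynomial terms plus truncated power terms at the knots. The following Python sketch illustrates the quadratic case with 3 knots; the data, knot locations, and the plain least-squares fit are invented stand-ins for the path-analysis estimation:

```python
import numpy as np

def truncated_spline_basis(x, degree=2, knots=(0.25, 0.5, 0.75)):
    """Design matrix [1, x, ..., x^p, (x-k1)_+^p, ..., (x-kK)_+^p]
    for a truncated polynomial spline of degree p with K knots."""
    x = np.asarray(x, dtype=float)
    cols = [x ** d for d in range(degree + 1)]
    for k in knots:
        cols.append(np.where(x > k, (x - k) ** degree, 0.0))
    return np.column_stack(cols)

# Hypothetical data: one path-model segment fitted by least squares
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 100)
B = truncated_spline_basis(x)          # quadratic, 3 knots -> 6 columns
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
fitted = B @ coef
print(B.shape)
```

Comparing linear vs. quadratic degrees with 1, 2, or 3 knots then amounts to rebuilding this basis with different `degree` and `knots` arguments and selecting the model by a fit criterion.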

Keywords: nonparametric path analysis, truncated spline, linear, quadratic, behavior to turn waste into economic value, jackknife resampling

Procedia PDF Downloads 0
15905 A Theorem Related to Sample Moments and Two Types of Moment-Based Density Estimates

Authors: Serge B. Provost

Abstract:

Numerous statistical inference and modeling methodologies are based on sample moments rather than the actual observations. A result justifying the validity of this approach is introduced. More specifically, it will be established that, given the first n moments of a sample of size n, one can recover the original n sample points. This implies that a sample of size n and its first n associated moments contain precisely the same amount of information. In practice, it is efficient to make use of a limited number of initial moments, as most of the relevant distributional information is included in them. Two types of density estimation techniques that rely on such moments will be discussed. The first expresses a density estimate as the product of a suitable base density and a polynomial adjustment whose coefficients are determined by equating the moments of the density estimate to the sample moments. The second assumes that the derivative of the logarithm of a density function can be represented as a rational function; this gives rise to a system of linear equations involving sample moments, and the density estimate is then obtained by solving a differential equation. Unlike kernel density estimation, these methodologies are ideally suited to modeling ‘big data’, as they only require a limited number of moments, irrespective of the sample size. What is more, they produce simple closed-form expressions that are amenable to algebraic manipulations. They also turn out to be more accurate, as will be shown in several illustrative examples.
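The recovery result stated above can be sketched directly: the first n raw moments give the power sums, Newton's identities convert those to elementary symmetric polynomials, and the sample points are the roots of the resulting monic polynomial. A minimal Python illustration (not the author's implementation):

```python
import numpy as np

def recover_sample(moments):
    """Recover n sample points from their first n raw sample moments:
    moments -> power sums -> elementary symmetric polynomials (via
    Newton's identities) -> roots of the monic polynomial."""
    n = len(moments)
    p = [n * m for m in moments]          # power sums p_k = n * m_k
    e = [1.0]                             # e_0 = 1
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * p[i - 1] for i in range(1, k + 1))
        e.append(s / k)
    # Monic polynomial x^n - e1 x^(n-1) + e2 x^(n-2) - ... whose roots
    # are exactly the original sample points
    coeffs = [(-1) ** k * e[k] for k in range(n + 1)]
    return np.sort(np.roots(coeffs).real)

sample = np.array([1.0, 2.0, 4.0])
moments = [np.mean(sample ** k) for k in (1, 2, 3)]
print(recover_sample(moments))  # recovers [1. 2. 4.]
```

For large n this root-finding route is numerically delicate, which is consistent with the paper's point that a limited number of initial moments is preferable in practice.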

Keywords: density estimation, log-density, polynomial adjustments, sample moments

Procedia PDF Downloads 145
15904 Statistical Tools for SFRA Diagnosis in Power Transformers

Authors: Rahul Srivastava, Priti Pundir, Y. R. Sood, Rajnish Shrivastava

Abstract:

For interpreting the signatures of sweep frequency response analysis (SFRA) of transformers, various statistical techniques serve as effective tools for either phase-to-phase comparison or sister-unit comparison. In this paper, along with a discussion of SFRA, several statistical indicators such as the cross correlation coefficient (CCF), root square error (RSQ), comparative standard deviation (CSD), absolute difference (DABS), mean square error (MSE) and min-max ratio (MM) are presented through several case studies. These methods require the sample data size and the spot frequencies of the SFRA signatures being compared. The techniques are based on power signal processing tools that can simplify results, and limits can be defined for the severity of faults occurring in the transformer due to short-circuit forces or ageing. The advantages of using statistical techniques for analyzing SFRA results are illustrated through several case studies, and the results obtained determine the state of the transformer.
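Several of the indicators named above can be computed in a few lines. The following Python sketch uses hypothetical SFRA magnitude signatures, and the exact formulas (e.g., for the min-max ratio) are one plausible reading, not necessarily those used in the paper:

```python
import numpy as np

def sfra_metrics(sig_a, sig_b):
    """Statistical indicators for comparing two SFRA signatures
    sampled at the same spot frequencies (one plausible formulation)."""
    a = np.asarray(sig_a, dtype=float)
    b = np.asarray(sig_b, dtype=float)
    ccf = float(np.corrcoef(a, b)[0, 1])     # cross correlation coefficient
    mse = float(np.mean((a - b) ** 2))       # mean square error
    dabs = float(np.mean(np.abs(a - b)))     # absolute difference per point
    mm = float(np.sum(np.minimum(a, b)) / np.sum(np.maximum(a, b)))  # min-max ratio
    return {"CCF": ccf, "MSE": mse, "DABS": dabs, "MM": mm}

# Hypothetical magnitude signatures (positive dB scale) at 50 spot
# frequencies: a reference trace and a measurement offset by 0.5 dB
freqs = np.logspace(1, 6, 50)
reference = 50.0 - 5.0 * np.log10(freqs)
measured = reference + 0.5
m = sfra_metrics(reference, measured)
print(m["CCF"], m["MSE"])
```

A healthy unit would show CCF close to 1 and small error terms; diagnostic limits on these values would then flag the severity of a fault.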

Keywords: absolute difference (DABS), cross correlation coefficient (CCF), mean square error (MSE), min-max ratio (MM-ratio), root square error (RSQ), standard deviation (CSD), sweep frequency response analysis (SFRA)

Procedia PDF Downloads 681
15903 A Review on Existing Challenges of Data Mining and Future Research Perspectives

Authors: Hema Bhardwaj, D. Srinivasa Rao

Abstract:

Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets can be termed 'big data'. Big data mining and analysis are extremely helpful for business activities such as decision making, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data also brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. The study presents the ideas of data mining, data analysis, and knowledge discovery techniques that have recently been developed, together with practical application systems. The conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses the main big data and data mining challenges facing management.

Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges

Procedia PDF Downloads 92
15902 Communicative Language Teaching Technique: A Neglected Approach in Reading Comprehension Instruction

Authors: Olumide Yusuf Jimoh

Abstract:

Reading comprehension is an interactive and purposeful process of getting meaning from and bringing meaning to a text. Over the years, teachers of the English language (in Nigeria) have been glued to the monotonous method of making students read comprehension passages silently and then answer the questions that follow, without making the reading session interactive. Hence, students often find such exercises monotonous and boring. Consequently, students' interest in language learning continues to dwindle, and this often affects their overall academic performance. Relying on Communication Accommodation Theory, therefore, the study employed a qualitative research design to examine the Communicative Language Teaching Approach (CLTA) in reading comprehension. Moreover, techniques such as the Genuinely Collaborative Reading Approach (GCRA), jigsaw reading, pre-reading tasks, and post-reading tasks were examined. The researcher submits that effective reading comprehension cannot be done passively: students must respond to what they read; they must interact not only with the materials being read but also with one another and with the teacher. This can be achieved by developing communicative and interactive reading programs.

Keywords: collaborative reading approach, communicative teaching, interactive reading program, pre-reading task, reading comprehension

Procedia PDF Downloads 80
15901 Person-Environment Fit (PE Fit): Evidence from Brazil

Authors: Jucelia Appio, Danielle Deimling De Carli, Bruno Henrique Rocha Fernandes, Nelson Natalino Frizon

Abstract:

The purpose of this paper is to investigate whether there are positive and significant correlations between the dimensions of Person-Environment Fit (Person-Job, Person-Organization, Person-Group and Person-Supervisor) at the “Best Companies to Work for” in Brazil in 2017. A quantitative approach with a descriptive method was used, with the research sample defined as the "150 Best Companies to Work for", according to a database collected in 2017 and provided by the Fundação Instituto de Administração (FIA) of the University of São Paulo (USP). Regarding the data analysis procedures, asymmetry and kurtosis, factorial analysis, Kaiser-Meyer-Olkin (KMO) tests, Bartlett's sphericity test and Cronbach's alpha were used for the 69 research variables, and Pearson's correlation analysis was performed as the statistical technique for testing the hypothesis. As the main result, we highlight that there was a positive and significant correlation between the dimensions of Person-Environment Fit, corroborating hypothesis H1 that there is a positive and significant correlation between Person-Job Fit, Person-Organization Fit, Person-Group Fit and Person-Supervisor Fit.
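The core statistical step, Pearson's correlation between two fit dimensions, can be sketched as follows. The Likert-scale scores are invented for illustration and are not the FIA survey data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two score vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc ** 2) * np.sum(yc ** 2)))

# Invented Likert-scale (1-5) scores on two fit dimensions for 8 respondents
person_job = [4, 5, 3, 4, 2, 5, 4, 3]
person_org = [4, 4, 3, 5, 2, 5, 3, 3]
r = pearson_r(person_job, person_org)
print(round(r, 3))  # a strong positive correlation for these toy scores
```

In the study, the same coefficient would be computed for each pair of Person-Environment Fit dimensions and tested for significance.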

Keywords: Human Resource Management (HRM), Person-Environment Fit (PE), strategic people management, best companies to work for

Procedia PDF Downloads 124
15900 The Implementation of Organizational Ecoinnovativeness as an Expression of a Strategic Approach of an Organization

Authors: Marzena Hajduk-Stelmachowicz

Abstract:

This paper presents the reasons why the implementation of organizational eco-innovation (based on the requirements of the international standard ISO 14001) can be an expression of a strategic approach of an organization. Different issues associated with Environmental Management Systems are also elaborated.

Keywords: environmental management system, ISO 14001, organizational ecoinnovativeness, ecoinnovation

Procedia PDF Downloads 297
15899 Socio-Economic and Psychological Factors of Moscow Population Deviant Behavior: Sociological and Statistical Research

Authors: V. Bezverbny

Abstract:

The relevance of the project stems from the steady growth of deviant behavior statistics among Moscow citizens. In recent years, the socioeconomic well-being, wealth and life expectancy of Moscow residents have been rising regularly, but the rates of crime and drug addiction have also grown seriously. Another serious problem for Moscow is the economic stratification of the population: the cost of otherwise identical residential areas differs by a factor of 2.5. The project is aimed at a comprehensive study, and the development of a methodology for the evaluation, of the main factors and reasons for the growth of deviant behavior in Moscow. The main project objective is to find the links between the quality of the urban environment and the dynamics of citizens' deviant behavior at the regional and municipal levels, using statistical research methods and GIS modeling. The conducted research allowed: 1) to evaluate the dynamics of deviant behavior in Moscow's different administrative districts; 2) to describe the reasons for the increase in crime, drug addiction, alcoholism and suicide tendencies among the city population; 3) to develop a classification of city districts based on the crime rate; 4) to create a statistical database containing the main indicators of deviant behavior of the Moscow population in 2010-2015, including information on crime level, alcoholism, drug addiction and suicides; 5) to present statistical indicators that characterize the dynamics of deviant behavior of the Moscow population under conditions of expanding city territory; 6) to analyze the main sociological theories and factors of deviant behavior in order to specify the types of deviation; 7) to consider the main theoretical statements of urban sociology devoted to the reasons for deviant behavior in megalopolis conditions. To explore the level of differentiation of the factors of deviant behavior, a questionnaire was developed, and a sociological survey involving more than 1000 people from different districts of the city was conducted.
The survey made it possible to study the socio-economic and psychological factors of deviant behavior. It also included Moscow residents' open-ended answers about the most pressing problems in their districts and the reasons for wanting to leave their place of residence. The results of the survey lead to the conclusion that the main factors of deviant behavior in Moscow are a high level of social inequality, a large number of illegal migrants and homeless people, the proximity of large transport hubs and stations, ineffective police work, the availability of alcohol and drugs, a low level of psychological comfort for Moscow citizens, and a large number of construction projects.

Keywords: deviant behavior, megapolis, Moscow, urban environment, social stratification

Procedia PDF Downloads 178
15898 Multi-Elemental Analysis Using Inductively Coupled Plasma Mass Spectrometry for the Geographical Origin Discrimination of Greek Giant Beans “Gigantes Elefantes”

Authors: Eleni C. Mazarakioti, Anastasios Zotos, Anna-Akrivi Thomatou, Efthimios Kokkotos, Achilleas Kontogeorgos, Athanasios Ladavos, Angelos Patakas

Abstract:

“Gigantes Elefantes” is a particularly dynamic crop of giant beans cultivated in western Macedonia (Greece). This variety of large beans, grown in this area and specifically in the regions of Prespes and Kastoria, is a protected designation of origin (PDO) product with high nutritional quality. Mislabeling of geographical origin and blending with unidentified samples are common fraudulent practices in the Greek food market, with financial and possible health consequences. In recent decades, multi-elemental composition analysis has been used to identify the geographical origin of foods and agricultural products. In an attempt to discriminate the authenticity of Greek beans, multi-elemental analysis (Ag, Al, As, B, Ba, Be, Ca, Cd, Co, Cr, Cs, Cu, Fe, Ga, Ge, K, Li, Mg, Mn, Mo, Na, Nb, Ni, P, Pb, Rb, Re, Se, Sr, Ta, Ti, Tl, U, V, W, Zn, Zr) was performed by inductively coupled plasma mass spectrometry (ICP-MS) on 320 samples of beans originating from Greece (Prespes and Kastoria), China and Poland. All samples were collected during the autumn of 2021. The obtained data were analysed by principal component analysis (PCA), an unsupervised statistical method that reduces the dimensionality of large datasets. Statistical analysis revealed a clear separation of beans cultivated in Greece from those from China and Poland. An adequate discrimination of geographical origin between bean samples originating from the two Greek regions, Prespes and Kastoria, was also evident.
Acknowledgment: This research has been financed by the Public Investment Programme/General Secretariat for Research and Innovation, under the call “YPOERGO 3”, code 2018SE01300000, project title: ‘Elaboration and implementation of methodology for authenticity and geographical origin assessment of agricultural products’. Our results suggest that multi-elemental analysis combined with an appropriate multivariate statistical method could be a useful tool for the geographical authentication of beans.
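A PCA workflow of the kind described, standardizing element concentrations and projecting samples onto the leading components, can be sketched as follows. The element means and the two-group setup are invented for illustration, not the study's measurements:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA via SVD of the standardized data matrix: returns sample
    scores and the fraction of variance explained per component."""
    X = np.asarray(X, dtype=float)
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    scores = Xs @ Vt[:n_components].T
    explained = (S ** 2 / np.sum(S ** 2))[:n_components]
    return scores, explained

# Invented element concentrations (rows: bean samples; columns standing
# in for three elements, e.g. K, Mg, Rb) for two hypothetical origins
rng = np.random.default_rng(1)
group_a = rng.normal([9000, 1500, 12], [200, 50, 1], size=(10, 3))
group_b = rng.normal([8000, 1300, 20], [200, 50, 1], size=(10, 3))
scores, explained = pca_scores(np.vstack([group_a, group_b]))
print(scores.shape, explained)
```

When the element profiles of the two origins genuinely differ, as in this toy setup, the two groups separate along the first principal component; this is the kind of pattern reported for the Greek versus imported samples.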

Keywords: geographical origin, authenticity, multi-elemental analysis, beans, ICP-MS, PCA

Procedia PDF Downloads 57
15897 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood

Authors: Randa Alharbi, Vladislav Vyshemirsky

Abstract:

Systems biology is an important field of science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their functions, and their interactions. A well-designed model requires selecting a suitable mechanism that can capture the main features of the system, defining its essential components, and representing an appropriate law for the interactions between those components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable for describing and analysing them. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model; it describes the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time is governed by the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet inference in such a complex system is challenging, as it requires the evaluation of a likelihood that is intractable in most cases. Different statistical methods allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation (ABC) is a common approach for tackling such inference, relying on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper we discuss the efficiency and possible practical issues of each method, taking their computational time into account.
We demonstrate likelihood-free inference by analysing a model of the repressilator using both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.
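The rejection-ABC idea described above, simulating from the model and keeping parameter draws whose summary statistic falls close to the observed one, can be sketched on a toy model. Here an exponential waiting-time process stands in for the intractable CTMC, and all numerical values are illustrative:

```python
import random

def abc_rejection(observed_stat, prior_sample, simulate, eps, n_accept=100):
    """Rejection ABC: draw a parameter from the prior, simulate data,
    and keep the draw when the simulated summary statistic is within
    eps of the observed one. Accepted draws approximate the posterior."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        if abs(simulate(theta) - observed_stat) < eps:
            accepted.append(theta)
    return accepted

# Toy stand-in for an intractable model: infer the rate of an exponential
# waiting-time process from the mean of 50 observations
random.seed(42)
true_rate = 2.0
observed = sum(random.expovariate(true_rate) for _ in range(50)) / 50

posterior = abc_rejection(
    observed_stat=observed,
    prior_sample=lambda: random.uniform(0.1, 10.0),  # flat prior on the rate
    simulate=lambda r: sum(random.expovariate(r) for _ in range(50)) / 50,
    eps=0.05,
)
posterior_mean = sum(posterior) / len(posterior)
print(round(posterior_mean, 2))  # concentrates near the true rate
```

The computational expense the abstract mentions is visible even here: most prior draws are rejected, and the cost grows quickly as eps shrinks or the simulator becomes heavier, which motivates the comparison with PMCMC.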

Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)

Procedia PDF Downloads 190
15896 Application of Stochastic Models to Annual Extreme Streamflow Data

Authors: Karim Hamidi Machekposhti, Hossein Sedghi

Abstract:

This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. The Auto-Regressive Integrated Moving Average (ARIMA) model was used to simulate these series and to forecast future values. For the analysis, annual extreme streamflow data of the Jelogir Majin station (above the Karkheh dam reservoir) for the years 1958–2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) for the development of the ARIMA model. The ARIMA(4,1,1) model developed was found to be the most suitable for simulating the annual extreme streamflow of the Karkheh River. The model was found to be appropriate for forecasting ten years of annual extreme streamflow and for assisting decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) codes were used to determine the best model for this series.
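The differencing-then-autoregression core of the ARIMA workflow can be sketched without a statistics package. This Python sketch fits an ARIMA(4,1,0)-style model by ordinary least squares on a synthetic trending series; the paper's ARIMA(4,1,1) additionally includes a moving-average term, and SAS/SPSS handle the full estimation there:

```python
import numpy as np

def fit_ar_on_differences(series, p=4):
    """Difference once (d = 1) to remove the trend, then estimate AR(p)
    coefficients on the differenced series by ordinary least squares."""
    y = np.diff(np.asarray(series, dtype=float))
    # Lag matrix: row t holds [y[t-1], y[t-2], ..., y[t-p]]
    X = np.column_stack([y[p - 1 - k: len(y) - 1 - k] for k in range(p)])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef

def forecast(series, coef, steps=10):
    """Iterated one-step forecasts on the differences, re-integrated
    back to the level of the original series."""
    y = list(np.diff(np.asarray(series, dtype=float)))
    level = float(series[-1])
    p = len(coef)
    out = []
    for _ in range(steps):
        next_diff = float(np.dot(coef, y[-1:-p - 1:-1]))
        y.append(next_diff)
        level += next_diff
        out.append(level)
    return out

# Synthetic 48-year "annual peak" series with a linear trend plus noise
rng = np.random.default_rng(3)
flow = 100 + 2.0 * np.arange(48) + rng.normal(0, 1, 48)
coef = fit_ar_on_differences(flow, p=4)
preds = forecast(flow, coef, steps=10)
print([round(v, 1) for v in preds[:3]])
```

As in the study, differencing removes the trend so that the autoregressive structure can be estimated on a stationary series, and ten-step forecasts are then built by re-integrating the predicted differences.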

Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river

Procedia PDF Downloads 133
15895 A Pragmatic Study of Falnama Texts Based on Critical Discourse Analysis Approach

Authors: Raziyeh Mashhadi Moghadam

Abstract:

Persian writings in the form of stories, scientific articles, historiographies, biographies, and philosophical, religious, and poetic arguments have established their presence in the past and the present. Any piece of text is composed in a unique style depending on its content and subject. In this paper, a manuscript called the Falnama of the Prophet is reviewed. Only a few scattered pages of this version are extant, and the author, invoking the names of twenty-four prophets, seeks to explore the present and future of the reader. This version is analyzed based on Norman Fairclough’s Critical Discourse Analysis (CDA) approach to unravel the underlying processes in this type of manuscript. The spelling of some words and sentences differs from that of modern written Persian.

Keywords: application of Falnama texts, critical discourse analysis, Fairclough’s approach

Procedia PDF Downloads 92
15894 The Assessment of Forest Wood Biomass Potential in Terms of Sustainable Development

Authors: Julija Konstantinavičienė, Vlada Vitunskienė

Abstract:

The role of sustainable biomass, including wood biomass, is becoming more important because of the European Green Deal. The new EU Forest Strategy is a flagship element of the European Green Deal and a key action under the EU Biodiversity Strategy for 2030. The first measure of this strategy is promoting sustainable forest management, including encouraging the sustainable use of wood-based resources. The first aim of this research was to develop and present a new approach to the concept of forest wood biomass potential in terms of sustainable development, distinguishing theoretical, technical and sustainable potential and detailing their constraints. The second aim was to prepare a methodology outline for the assessment of sustainable forest wood biomass potential and to check this methodology empirically, considering economic, social and ecological constraints. The basic methodologies of the research are a review of research (combining semi-systematic and integrative review methodologies), the rapid assessment method and statistical data analysis. The developed methodology for assessing forest wood potential in terms of sustainable development can be used in Lithuania and in other countries and will allow this potential to be compared at different temporal and spatial levels. The application of the methodology can support the development of new national strategies for the wood sector.

Keywords: assessment, constraints, forest wood biomass, methodology, potential, sustainability

Procedia PDF Downloads 106