Search results for: Statistical Analysis
28907 Nonparametric Path Analysis with a Truncated Spline Approach in Modeling Waste Management Behavior Patterns
Authors: Adji Achmad Rinaldo Fernandes, Usriatur Rohma
Abstract:
Nonparametric path analysis is a statistical method that does not rely on the assumption that the shape of the regression curve is known. The purpose of this study is twofold: to determine the best truncated spline nonparametric path function among linear and quadratic polynomial degrees with 1, 2, and 3 knot points, and to test the significance of the best function's estimates in a model of the effect of perceived benefits and perceived convenience on the behavior of converting waste into economic value, mediated by the intention to change people's mindset about waste, using the t-test statistic at the jackknife resampling stage. The data used in this study are primary data obtained from research grants. The results show that the best truncated spline nonparametric path model is the quadratic polynomial degree with 3 knot points. In addition, significance testing of the best function's estimates using jackknife resampling shows that all exogenous variables have a significant influence on the endogenous variables.
Keywords: nonparametric path analysis, truncated spline, linear, quadratic, behavior to turn waste into economic value, jackknife resampling
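The jackknife significance step described in this abstract can be sketched as follows: each observation is left out in turn, the coefficient is re-estimated, and the spread of the leave-one-out replicates gives a standard error for a t statistic. This is a minimal sketch on an ordinary linear slope with made-up data, not the authors' truncated-spline path model.

```python
import math

# Jackknife significance test for a slope estimate (illustrative sketch;
# the paper applies the same idea to truncated-spline path coefficients).
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 13.9, 16.1]

def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = sum((a - mx) ** 2 for a in xs)
    return num / den

full = slope(x, y)
# Leave-one-out (jackknife) replicates of the slope
reps = [slope(x[:i] + x[i + 1:], y[:i] + y[i + 1:]) for i in range(len(x))]
n = len(reps)
mean_rep = sum(reps) / n
se = math.sqrt((n - 1) / n * sum((r - mean_rep) ** 2 for r in reps))
t_stat = full / se  # compare against a t(n-1) critical value
print(round(full, 3))  # → 1.996
```

A coefficient is judged significant when its jackknife t statistic exceeds the critical value of the t distribution with n-1 degrees of freedom.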
Procedia PDF Downloads 46
28906 Analysis of Operating Speed on Four-Lane Divided Highways under Mixed Traffic Conditions
Authors: Chaitanya Varma, Arpan Mehar
Abstract:
The present study demonstrates a procedure for analysing speed data collected on various four-lane divided sections in India. Field data were collected at straight and curved sections on rural highways with the help of a radar speed gun and a video camera. The data collected at the sections were analysed and parameters pertaining to speed distributions were estimated. Different statistical distributions were fitted to vehicle-type speed data and to mixed traffic speed data. It was found that vehicle-type speed data follow either the normal or the log-normal distribution, whereas mixed traffic speed data follow more than one type of statistical distribution. The most common fits observed on mixed traffic speed data were the beta and Weibull distributions. Separate operating speed models based on traffic and roadway geometric parameters were proposed in the present study. An operating speed model with traffic parameters and curve geometry parameters was established. Two different operating speed models were proposed, with the variables 1/R and Ln(R), and were found to be realistic over different ranges of curve radius. The models developed in the present study are simple and realistic and can be used for forecasting operating speed on four-lane highways.
Keywords: highway, mixed traffic flow, modeling, operating speed
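Choosing between the normal and log-normal candidates mentioned in this abstract can be sketched by comparing the maximum log-likelihood of each fit; the speed sample below is synthetic, not the field data.

```python
import math, random

# Sketch: choose between normal and log-normal fits for a speed sample
# by comparing maximum log-likelihoods (synthetic, skewed speed data).
random.seed(1)
speeds = [math.exp(random.gauss(4.0, 0.5)) for _ in range(500)]  # ~km/h

def normal_loglik(data):
    n = len(data)
    mu = sum(data) / n
    var = sum((v - mu) ** 2 for v in data) / n
    # Maximized normal log-likelihood: -n/2 * (ln(2*pi*var) + 1)
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def lognormal_loglik(data):
    logs = [math.log(v) for v in data]
    # Log-normal likelihood = normal likelihood of logs minus the Jacobian term
    return normal_loglik(logs) - sum(logs)

better = "log-normal" if lognormal_loglik(speeds) > normal_loglik(speeds) else "normal"
print(better)  # → log-normal
```

In practice an information criterion (AIC/BIC) or a goodness-of-fit test would be used to compare more than two candidate distributions, as the study does for beta and Weibull fits.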
Procedia PDF Downloads 459
28905 Multivariate Analysis of the Relationship between Professional Burnout, Emotional Intelligence and Health Level in Teachers University of Guayaquil
Authors: Viloria Marin Hermes, Paredes Santiago Maritza, Viloria Paredes Jonathan
Abstract:
The aim of this study is to assess the prevalence of Burnout syndrome in a sample of 600 professors at the University of Guayaquil (Ecuador) using the Maslach Burnout Inventory (M.B.I.). In addition, assessment was made of the effects on health from professional burnout using the General Health Questionnaire (G.H.Q.-28), and the influence of Emotional Intelligence on prevention of its symptoms using the Spanish version of the Trait Meta-Mood Scale (T.M.M.S.-24). After confirmation of the underlying factor structure, the three measurement tools showed high levels of internal consistency, and specific cut-off points were proposed for the group of Latin American academics in the M.B.I. Statistical analysis showed the syndrome is present extensively, particularly on medium levels, with notably low scores given for Professional Self-Esteem. The application of Canonical Correspondence Analysis revealed that low levels of self-esteem are related to depression, with a lack of personal resources related to anxiety and insomnia, whereas the ability to perceive and control emotions and feelings improves perceptions of professional effectiveness and performance.
Keywords: burnout, academics, emotional intelligence, general health, canonical correspondence analysis
Procedia PDF Downloads 369
28904 Role of DatScan in the Diagnosis of Parkinson's Disease
Authors: Shraddha Gopal, Jayam Lazarus
Abstract:
Aims: To study the referral practice and the impact of DAT scans in the diagnosis or exclusion of Parkinson's disease. Settings and design: A retrospective study. Materials and methods: A retrospective study of the results of 60 patients referred for a DAT scan over a period of 2 years from the Department of Neurology at Northern Lincolnshire and Goole NHS Trust. The reason for DAT scan referral was noted under 5 categories weighed against Parkinson's disease: drug-induced Parkinsonism, essential tremor, diagnostic dilemma, not responding to Parkinson's treatment, and others. We assessed the number of patients who were diagnosed with Parkinson's disease against the number of patients in whom Parkinson's disease was excluded or an alternative diagnosis was made. Statistical methods: Microsoft Excel was used for data collection and statistical analysis. Results: 30 of the 60 scans were performed to confirm a diagnosis of early Parkinson's disease, 13 to differentiate essential tremor from Parkinsonism, 6 to exclude drug-induced Parkinsonism, 5 to look for an alternative diagnosis in patients not responding to anti-Parkinson medication, and 6 indications were outside the recommended guidelines. 55% of cases had a diagnosis of Parkinson's disease confirmed, and 43.33% had Parkinson's disease excluded. 33 of the 60 scans showed bilateral abnormalities and confirmed the clinical diagnosis of Parkinson's disease. Conclusion: The DAT scan provides valuable information, confirming Parkinson's disease in 55% of patients and excluding the diagnosis in 43.33% of patients, aiding an alternative diagnosis.
Keywords: DATSCAN, Parkinson's disease, diagnosis, essential tremors
Procedia PDF Downloads 230
28903 Geometric Imperfections in Lattice Structures: A Simulation Strategy to Predict Strength Variability
Authors: Xavier Lorang, Ahmadali Tahmasebimoradi, Chetra Mang, Sylvain Girard
Abstract:
The additive manufacturing processes (e.g. selective laser melting) allow us to produce lattice structures which have less weight, higher impact absorption capacity, and better thermal exchange properties compared to classical structures. Unfortunately, geometric imperfections (defects) in the lattice structures are by-products of the manufacturing process. These imperfections decrease the lifetime and the strength of the lattice structures and alter their mechanical responses. The objective of the paper is to present a simulation strategy which allows us to take into account the effect of the geometric imperfections on the mechanical response of the lattice structure. In the first part, an identification method of geometric imperfection parameters of the lattice structure based on point clouds is presented. These point clouds are based on tomography measurements. The point clouds are fed into the platform LATANA (LATtice ANAlysis) developed by IRT-SystemX to characterize the geometric imperfections. This is done by projecting the point clouds of each microbeam along the beam axis onto a 2D surface. Then, by fitting an ellipse to the 2D projections of the points, the geometric imperfections are characterized by three parameters of an ellipse: semi-major/minor axes and angle of rotation. From the calculated parameters of the microbeam geometric imperfections, a statistical analysis is carried out to determine a probability density law based on a statistical hypothesis. Microbeam samples are randomly drawn from the density law and are used to generate lattice structures. In the second part, a finite element model for the lattice structure with the simplified geometric imperfections (ellipse parameters) is presented. This numerical model is used to simulate the generated lattice structures. The propagation of the uncertainties of geometric imperfections is shown through the distribution of the computed mechanical responses of the lattice structures.
Keywords: additive manufacturing, finite element model, geometric imperfections, lattice structures, propagation of uncertainty
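The ellipse characterization step can be illustrated with the second moments of the projected points: the eigenvalues of the 2D covariance give the semi-axes and its principal direction gives the rotation angle, i.e. the three imperfection parameters named in the abstract. The points below are synthetic, not tomography data, and this moment-based fit is a sketch, not necessarily the method implemented in LATANA.

```python
import math

def ellipse_params(points):
    """Equivalent-ellipse semi-axes and rotation from 2D point second moments."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points) / n
    syy = sum((p[1] - cy) ** 2 for p in points) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc          # covariance eigenvalues
    angle = 0.5 * math.atan2(2 * sxy, sxx - syy)   # principal-axis rotation
    # Uniformly sampled boundary points have variance a^2/2 along each axis
    return math.sqrt(2 * l1), math.sqrt(2 * l2), angle

# Points on an ellipse with semi-axes 3 and 1, rotated by 30 degrees
a, b, theta = 3.0, 1.0, math.radians(30)
pts = [(a * math.cos(t) * math.cos(theta) - b * math.sin(t) * math.sin(theta),
        a * math.cos(t) * math.sin(theta) + b * math.sin(t) * math.cos(theta))
       for t in (math.radians(k) for k in range(360))]
print(tuple(round(v, 3) for v in ellipse_params(pts)))  # → (3.0, 1.0, 0.524)
```

The recovered angle 0.524 rad is the imposed 30-degree rotation, and the semi-axes match the generating ellipse.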
Procedia PDF Downloads 183
28902 Importance of Different Spatial Parameters in Water Quality Analysis within Intensive Agricultural Area
Authors: Marina Bubalo, Davor Romić, Stjepan Husnjak, Helena Bakić
Abstract:
Even though European Council Directive 91/676/EEC, known as the Nitrates Directive, was adopted in 1991, the issue of water quality preservation in areas of intensive agricultural production still persists all over Europe. High nitrate nitrogen concentrations in surface water and groundwater originating from diffuse sources are one of the most important environmental problems in modern intensive agriculture. The fate of nitrogen in soil, surface water and groundwater in agricultural areas is mostly affected by anthropogenic activity (i.e. agricultural practice) and by hydrological and climatological conditions. The aim of this study was to identify the impact of land use, soil type, soil vulnerability to pollutant percolation, and natural aquifer vulnerability on nitrate occurrence in surface water and groundwater within an intensive agricultural area. The study was set in Varaždin County (northern Croatia), which is under the significant influence of the large rivers Drava and Mura; as a result, the entire area is dominated by alluvial soil with a shallow active profile, mainly on a gravel base. The negative agricultural impact on water quality in this area is evident; therefore, half of the county is part of the delineated nitrate vulnerable zones (NVZ). Data on water quality were collected from 7 surface water and 8 groundwater monitoring stations in the county. A recent study of the area also included a detailed inventory of agricultural production and fertilizer use, with the aim of producing a new agricultural land use database as one of the dominant parameters. The analysis of this database, done using ArcGIS 10.1, showed that 52.7% of the total county area is agricultural land and 59.2% of the agricultural land is used for intensive agricultural production. On the other hand, 56% of the soil within the county is classified as vulnerable to pollutant percolation. The situation is similar with natural aquifer vulnerability; the northern part of the county ranges from high to very high aquifer vulnerability.
Statistical analysis of the water quality data was done using SPSS 13.0. Cluster analysis grouped both surface water and groundwater stations into two groups according to nitrate nitrogen concentrations. Mean nitrate nitrogen concentrations in surface water range from 4.2 to 5.5 mg/l in group 1 and from 24 to 42 mg/l in group 2. The results are similar, but evidently higher, in groundwater samples; mean nitrate nitrogen concentrations range from 3.9 to 17 mg/l in group 1 and from 36 to 96 mg/l in group 2. ANOVA confirmed the statistical significance of the grouping of stations classified in the same group. The previously listed parameters (land use, soil type, etc.) were used in factorial correspondence analysis (FCA) to detect the importance of each parameter for local water quality. Since these parameters mostly cannot be altered, there is an obvious need for more precise and better adapted land management in such conditions.
Keywords: agricultural area, nitrate, factorial correspondence analysis, water quality
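The station-grouping step described above can be sketched with a simple two-means clustering on mean nitrate concentrations. The station names and values below are illustrative, chosen only to resemble the reported group ranges in mg/l, not the study's monitoring data.

```python
# Sketch: grouping monitoring stations into two clusters by mean nitrate
# nitrogen concentration, as the abstract's cluster analysis does.
stations = {"S1": 4.2, "S2": 5.5, "S3": 4.8, "S4": 24.0, "S5": 35.0, "S6": 42.0}

def two_means_1d(data, iters=20):
    vals = sorted(data.values())
    c1, c2 = vals[0], vals[-1]          # initialize centroids at the extremes
    for _ in range(iters):
        g1 = {k for k, v in data.items() if abs(v - c1) <= abs(v - c2)}
        g2 = set(data) - g1
        c1 = sum(data[k] for k in g1) / len(g1)
        c2 = sum(data[k] for k in g2) / len(g2)
    return g1, g2

low, high = two_means_1d(stations)
print(sorted(low), sorted(high))  # → ['S1', 'S2', 'S3'] ['S4', 'S5', 'S6']
```

A one-way ANOVA across the resulting groups would then test whether the between-group difference in mean concentration is statistically significant.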
Procedia PDF Downloads 258
28901 Comparison of Solar Radiation Models
Authors: O. Behar, A. Khellaf, K. Mohammedi, S. Ait Kaci
Abstract:
Up to now, most validation studies have been based on the MBE and RMSE and have therefore focused only on long- and short-term performance to test and classify solar radiation models. This traditional analysis does not take into account the quality of modeling and linearity. In our analysis we have tested 22 solar radiation models that are capable of providing instantaneous direct and global radiation at any given location worldwide. We introduce a new indicator, which we name the Global Accuracy Indicator (GAI), to examine the linear relationship between the measured and predicted values and the quality of modeling, in addition to long- and short-term performance. Note that the quality of a model is represented by the t-statistical test, the model linearity is given by the correlation coefficient, and the long- and short-term performance are known, respectively, through the MBE and RMSE. An important finding of this research is that the use of the GAI avoids faulty validation when using the traditional methodology, which might result in erroneous prediction of the performance of solar power conversion systems.
Keywords: solar radiation model, parametric model, performance analysis, Global Accuracy Indicator (GAI)
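The four ingredients the GAI combines can be computed as below: MBE and RMSE for long- and short-term error, the correlation coefficient for linearity, and a t-statistic for model quality (the Stone-type form commonly used in radiation-model validation; the exact GAI aggregation is not specified in the abstract). The measured/predicted irradiance values are illustrative.

```python
import math

# Sketch of the statistics underlying the GAI (illustrative W/m^2 values).
measured  = [210.0, 340.0, 480.0, 610.0, 720.0, 800.0]
predicted = [200.0, 350.0, 470.0, 630.0, 700.0, 820.0]

n = len(measured)
errors = [p - m for p, m in zip(predicted, measured)]
mbe = sum(errors) / n                                   # mean bias error
rmse = math.sqrt(sum(e * e for e in errors) / n)        # root mean square error
# Stone-type t-statistic: t = sqrt((n-1) * MBE^2 / (RMSE^2 - MBE^2))
t_stat = math.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2))
mm, mp = sum(measured) / n, sum(predicted) / n
r = (sum((a - mm) * (b - mp) for a, b in zip(measured, predicted))
     / math.sqrt(sum((a - mm) ** 2 for a in measured)
                 * sum((b - mp) ** 2 for b in predicted)))
print(round(mbe, 2), round(rmse, 2))  # → 1.67 15.81
```

A model is accepted on the t-test criterion when its t-statistic is below the critical t value for n-1 degrees of freedom.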
Procedia PDF Downloads 348
28900 Econometric Analysis of Organic Vegetable Production in Turkey
Authors: Ersin Karakaya, Halit Tutar
Abstract:
Healthy nutrition requires food that can be trusted. The production and dissemination of organic products in Turkey is evolving rapidly on the basis of preserving the ecological balance, ensuring sustainability in agriculture and offering quality, reliable products to consumers. This study uses data for Turkey for the years 2002-2015 on organic vegetable production: cultivated land, production levels, production quantity, number of products and number of farmers. The goal is an econometric analysis of the factors affecting the value of organic vegetable production (number of products, number of farmers and cultivated land). The main material of the study is secondary data on organic vegetable production in Turkey for the 2002-2015 period, and the regression analysis of the factors affecting the value of production is carried out by the least squares method with the EViews statistical software package.
Keywords: number of farmers, cultivated land, EViews, Turkey
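The least squares step can be sketched by solving the normal equations for a production model of the kind described (production ~ b0 + b1·farmers + b2·area). The numbers are synthetic, not the study's series, and the study itself fits the model in EViews.

```python
# Minimal ordinary least squares sketch via the normal equations X'X b = X'y.
farmers = [5, 8, 12, 15, 20, 24]        # e.g. thousands of farmers (assumed units)
area    = [10, 14, 19, 25, 31, 36]      # e.g. thousand hectares
prod    = [57, 82, 114, 147, 186, 218]  # e.g. thousand tonnes

X = [[1.0, f, a] for f, a in zip(farmers, area)]
XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
Xty = [sum(r[i] * y for r, y in zip(X, prod)) for i in range(3)]

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [v] for row, v in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [x - f * y for x, y in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

b0, b1, b2 = solve3(XtX, Xty)
print(round(b0, 3), round(b1, 3), round(b2, 3))  # → 2.0 3.0 4.0
```

With only 14 annual observations, as in the study period, standard errors and diagnostic tests matter as much as the point estimates; a statistics package handles those alongside the coefficients.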
Procedia PDF Downloads 305
28899 Carbon Sequestration Modeling in the Implementation of REDD+ Programmes in Nigeria
Authors: Oluwafemi Samuel Oyamakin
Abstract:
The forest in Nigeria is currently estimated to extend to around 9.6 million hectares, but decades ago it expanded over central and southern Nigeria. The forest estate is shrinking due to long-term human exploitation for agricultural development, fuel wood demand, uncontrolled forest harvesting and urbanization, amongst other factors, compounded by population growth in rural areas. Nigeria has lost more than 50% of its forest cover since 1990 and currently less than 10% of the country is forested. The current deforestation rate is estimated at 3.7%, which is one of the highest in the world. Reducing Emissions from Deforestation and forest Degradation, plus conservation, sustainable management of forests and enhancement of forest carbon stocks, constitutes what is referred to as REDD+. This study evaluated some of the existing ways of computing carbon stocks using eight indigenous tree species: Mansonia, Shorea, Bombax, Terminalia superba, Khaya grandifolia, Khaya senegalensis, Pines and Gmelina arborea. While these components are the essential elements of the REDD+ programme, they can be brought under a broader framework of systems analysis designed to arrive at optimal solutions for future predictions through the statistical distribution pattern of carbon sequestered by various tree species. Available data on the height and diameter of trees in Ibadan were studied, their respective carbon sequestration potentials were assessed, and tests were applied to determine the statistical distribution that best describes the carbon sequestration pattern of the trees. The results of this study suggest a reasonable statistical distribution for carbon sequestered in simulation studies and hence allow planners and government to determine resource forecasts for sustainable development, especially where experiments with real-life systems are infeasible. Sustainable management of forests can then be achieved by projecting the future condition of forests under different management regimes, thereby supporting conservation and REDD+ programmes in Nigeria.
Keywords: REDD+, carbon, climate change, height and diameter
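Estimating carbon from height and diameter data, as described above, typically goes through an allometric biomass equation. This sketch uses a generic Chave-type pantropical form and a carbon fraction of biomass; the coefficients, wood density and tree measurements are illustrative placeholders, not values from the study.

```python
# Sketch: above-ground biomass from diameter and height via a generic
# allometric form AGB = a * (rho * D^2 * H)^b, then carbon as ~47% of biomass.
# A, B, CARBON_FRACTION and wood_density are assumed, not the study's values.
A, B, CARBON_FRACTION = 0.0673, 0.976, 0.47

def carbon_kg(diameter_cm, height_m, wood_density=0.6):
    agb = A * (wood_density * diameter_cm ** 2 * height_m) ** B  # kg dry biomass
    return CARBON_FRACTION * agb

trees = [(32.0, 18.5), (41.0, 22.0), (27.5, 16.0)]  # (diameter cm, height m)
stocks = [carbon_kg(d, h) for d, h in trees]
```

Fitting a probability distribution to such per-tree carbon values across a stand is then what allows the simulation-based forecasting the abstract describes.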
Procedia PDF Downloads 165
28898 Analysis of Advancements in Process Modeling and Reengineering at Fars Regional Electric Company, Iran
Authors: Mohammad Arabi
Abstract:
Business Process Reengineering (BPR) is a systematic approach to fundamentally redesigning organizational processes to achieve significant improvements in organizational performance. At Fars Regional Electric Company, implementing BPR is deemed essential to increase productivity, reduce costs, and improve service quality. This article examines how BPR can help enhance the performance of Fars Regional Electric Company. The objective of this research is to evaluate and analyze the advancements in process modeling and reengineering at the company and to provide solutions for improving the productivity and efficiency of organizational processes. This research employs both qualitative and quantitative methods and includes interviews with senior managers and experts at Fars Regional Electric Company. The analytical tools include process modeling software such as Bizagi and ARIS, and statistical analysis software such as SPSS and Minitab. Data analysis was conducted using advanced statistical methods. The results indicate that the use of BPR techniques can lead to a significant reduction in process execution time and an overall improvement in quality. Implementing BPR at Fars Regional Electric Company has led to increased productivity, reduced costs, and improved overall performance. This study shows that, with proper implementation of BPR and the use of modeling tools, the company can achieve significant improvements in its processes. Recommendations: (1) Continuous training for staff: invest in continuous training of staff to enhance their skills and knowledge in BPR. (2) Use of advanced technologies: utilize modeling and analysis software to improve processes. (3) Implementation of effective management systems: employ knowledge and information management systems to enhance organizational performance. (4) Continuous monitoring and review of processes: regularly review and revise processes to ensure ongoing improvements. This article highlights the importance of improving organizational processes at Fars Regional Electric Company and recommends that managers and decision-makers seriously consider reengineering processes and utilizing modeling technologies to achieve developmental goals and continuous improvement.
Keywords: business process reengineering, electric company, Fars province, process modeling advancements
Procedia PDF Downloads 47
28897 Investigation of Effective Parameters on Water Quality of Iranian Rivers Using Hydrochemical and Statistical Methods
Authors: Maryam Sayadi, Rana Sedighpour, Hossein Rezaie
Abstract:
In this study, in order to evaluate the water quality of the Gamasiab and Gharehsoo rivers, located in Kermanshah province, data from a 5-year statistical period covering the years 2014-2018 were used. To evaluate the hydrochemistry of the water, the type and hydrogeochemical facies of the river water were first determined using Stiff and Piper diagrams. Then, based on the Gibbs diagram and combination diagrams, the factors controlling the chemical parameters of the two rivers were identified. Saturation indices were used to predict the possibility of dissolution and deposition of some minerals. Then, in order to classify the water at different sections, fourteen water quality indicators for different uses, along with the WHO standard, were used. Finally, factor analysis was used to determine the processes affecting the hydrochemistry of the two rivers. The results of this study showed that in both rivers the predominant type and facies are calcium bicarbonate. Also, the main factor changing the chemical quality of the water in both the Gamasiab and Gharehsoo rivers is the water-rock reaction. According to the results of the factor analysis, in both rivers two factors have the greatest impact on water quality in the region. Among the parameters of the Gamasiab river, HCO3-, Na+ and Cl- had the highest factor loadings in the first factor, and in the second factor SO42- and Mg2+ were selected as the main parameters. In the Gharehsoo river, the parameters Ca2+, Cl- and Na+ have the highest factor loadings in the first factor, and Mg2+ and SO42- in the second factor. The dissolution of carbonate formations, due to their abundance and extent in the two basins, has the more significant effect on changing the water chemistry; it has saturated the river water with respect to aragonite, calcite and dolomite. Due to the low contribution of the second factor to changing the chemical parameters, the water of both rivers is undersaturated with respect to evaporative minerals such as gypsum, halite and anhydrite at all stations. Based on the Schoeller and Wilcox diagrams and other quality indicators at these two sections, the values of the main physicochemical parameters are in the desired range for drinking and agriculture. The results of the Langelier, Ryznar, Larson-Skold and Puckorius indices showed that the water is corrosive for industry.
Keywords: factor analysis, hydrochemical, saturation index, surface water quality
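A mineral saturation index of the kind used above is SI = log10(IAP/Ksp): positive values mean the water is oversaturated (the mineral may precipitate), negative values undersaturated (it may dissolve). The ion activities below are assumed for illustration; the calcite Ksp of 10^-8.48 at 25 °C is the commonly tabulated value.

```python
import math

# Sketch: mineral saturation index SI = log10(IAP / Ksp).
def saturation_index(iap, ksp):
    return math.log10(iap / ksp)

# Example: calcite, CaCO3 <-> Ca2+ + CO3^2-, Ksp ~ 10^-8.48 at 25 C.
# The ion activities below are illustrative, not values from the study.
a_ca, a_co3 = 10 ** -3.2, 10 ** -4.9
si_calcite = saturation_index(a_ca * a_co3, 10 ** -8.48)
print(round(si_calcite, 2))  # → 0.38
```

A positive SI such as this is consistent with the carbonate saturation the abstract reports for calcite, aragonite and dolomite.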
Procedia PDF Downloads 125
28896 Transformation of Health Communication Literacy in Information Technology during Pandemic in 2019-2022
Authors: K. Y. S. Putri, Heri Fathurahman, Yuki Surisita, Widi Sagita, Kiki Dwi Arviani
Abstract:
Society needs the assistance of academics to understand and become skilled in health communication literacy. Information technology develops very fast, while health communication literacy skills for obtaining health information during the pandemic have not kept pace with it. The research question is whether health communication literacy influenced information technology use for health information during the pandemic in Indonesia, and the purpose of the study is to determine that influence. The concepts of health communication literacy and information technology are used in this study, and previous research supports it. The study uses quantitative methods, with data collected through questionnaires. The validity and reliability tests of this study were positive, so the analysis could proceed to the next statistical step. Descriptive results for the health communication literacy variable are positive in all dimensions, as are all dimensions of information technology. Statistical tests show a strong influence of health communication literacy on information technology. Both variables are discussed in terms of this influence: health communication literacy has a strong effect on information technology, and the respondents of this study have high information technology skills. Health communication literacy for obtaining health information during the 2019-2022 pandemic is therefore needed. The research advice is that academics are still very much needed by the community in the development of society during the pandemic.
Keywords: health information, health information needs, literacy health communication, information technology
Procedia PDF Downloads 138
28895 Multi-Source Data Fusion for Urban Comprehensive Management
Authors: Bolin Hua
Abstract:
In city governance, various data are involved, including city component data, demographic data, housing data and all kinds of business data. These data reflect different aspects of people, events and activities. Data generated by various systems differ in form, and their sources differ because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues which need to be resolved. Problems of data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data and space data. Metadata must be referred to and read when an application needs to access, manipulate and display the data. Uniform metadata management ensures the effectiveness and consistency of data in the process of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search and data delivery.
Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data
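The duplicate-data comparison problem named above can be sketched as merging records from two source systems into one central store, keyed by a shared identifier, with conflicting field values flagged for review. The record layout and values are illustrative.

```python
# Sketch: fusing person records from two source systems into a central
# store, detecting duplicates by a shared key and flagging field conflicts.
# Field names and records are hypothetical examples.
source_a = [{"id": "P1", "name": "Li Wei", "district": "Haidian"},
            {"id": "P2", "name": "Wang Fang", "district": "Chaoyang"}]
source_b = [{"id": "P2", "name": "Wang Fang", "district": "Dongcheng"},
            {"id": "P3", "name": "Zhao Lei", "district": "Xicheng"}]

central, conflicts = {}, []
for record in source_a + source_b:
    key = record["id"]
    if key not in central:
        central[key] = dict(record)
    else:
        for field, value in record.items():
            if central[key].get(field) not in (None, value):
                conflicts.append((key, field))  # same key, differing values
print(sorted(central), conflicts)  # → ['P1', 'P2', 'P3'] [('P2', 'district')]
```

In a real deployment the conflict list would feed a resolution policy (e.g. trust the most recently updated source), which is exactly where uniform metadata about provenance and update time becomes necessary.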
Procedia PDF Downloads 392
28894 Ranking the Elements of Relationship Market Orientation Banks (Case Study: Saderat Bank of Iran)
Authors: Sahar Jami, Iman Valizadeh
Abstract:
Today, banks should not only seek new customers but also consider the maintenance and retention of existing ones and establish a stable relationship with them. In this regard, relationship marketing seeks to create, maintain, and promote the relationship between customers and other stakeholders to the benefit of all parties involved. This is possible only through interactive transactions and the fulfillment of promises. Given the importance of relationship marketing in banks, creating the conditions for it is highly important. Therefore, the present study aims at exploring the conditions for relationship marketing in Saderat Bank of Iran and at prioritizing its variables using the analytic hierarchy process (AHP). A questionnaire was designed for the paired comparison of relationship marketing elements. After distributing this questionnaire among the members of the statistical population, 20 bank experts, data analysis was carried out with the Expert Choice software.
Keywords: relationship marketing, relationship market orientation, Saderat Bank of Iran, hierarchical analysis
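The AHP prioritization step can be sketched as follows: a pairwise-comparison matrix on the Saaty 1-9 scale is reduced to priority weights, here with the geometric-mean (row) method that Expert Choice-style tools commonly support. The three elements and comparison values are illustrative, not the study's questionnaire data.

```python
import math

# Sketch: AHP priority weights from a pairwise-comparison matrix
# (geometric-mean method; illustrative judgments for three elements).
M = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 0.5, 1.0]]

geo = [math.prod(row) ** (1 / len(row)) for row in M]  # row geometric means
total = sum(geo)
weights = [g / total for g in geo]
print([round(w, 3) for w in weights])  # → [0.648, 0.23, 0.122]
```

A full AHP run would also check the consistency ratio of the matrix (CR below 0.1 is the usual acceptance threshold) before using the weights for ranking.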
Procedia PDF Downloads 417
28893 Application of a Model-Free Artificial Neural Networks Approach for Structural Health Monitoring of the Old Lidingö Bridge
Authors: Ana Neves, John Leander, Ignacio Gonzalez, Raid Karoumi
Abstract:
Systematic monitoring and inspection are needed to assess the present state of a structure and predict its future condition. If an irregularity is noticed, repair actions may take place, and an adequate intervention will most probably reduce future maintenance costs, minimize downtime and increase safety by avoiding the failure of the structure as a whole or of one of its structural parts. For this to be possible, decisions must be made at the right time, which implies using systems that can detect abnormalities at an early stage. In this sense, Structural Health Monitoring (SHM) is seen as an effective tool for improving the safety and reliability of infrastructures. This paper explores the decision-making problem in SHM regarding the maintenance of civil engineering structures. The aim is to assess the present condition of a bridge based exclusively on measurements, using the method suggested in this paper, such that action is taken coherently with the information made available by the monitoring system. Artificial Neural Networks are trained, and their ability to predict structural behavior is evaluated in the light of a case study where acceleration measurements are acquired from a bridge located in Stockholm, Sweden. This relatively old bridge is presently still in operation despite experiencing obvious problems already reported in previous inspections. The prediction errors provide a measure of the accuracy of the algorithm and are subjected to further investigation, which comprises concepts like clustering analysis and statistical hypothesis testing. These make it possible to interpret the obtained prediction errors, draw conclusions about the state of the structure, and thus support decision making regarding its maintenance.
Keywords: artificial neural networks, clustering analysis, model-free damage detection, statistical hypothesis testing, structural health monitoring
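The model-free idea above can be sketched in miniature: a reference model predicts the next response sample, and the statistics of the prediction error indicate whether the structural state has changed. Here a trivial moving-average predictor on synthetic signals stands in for the trained neural network; the signals are made up, not bridge accelerations.

```python
import math, random

# Sketch: prediction-error-based damage indication on synthetic signals.
# The "model" is a moving-average predictor standing in for the ANN.
random.seed(0)
baseline = [math.sin(0.1 * i) + random.gauss(0, 0.05) for i in range(300)]
changed  = [1.3 * math.sin(0.1 * i) + random.gauss(0, 0.05) for i in range(300)]

def prediction_errors(signal, window=5):
    # one-step-ahead error of a moving-average predictor
    return [signal[i] - sum(signal[i - window:i]) / window
            for i in range(window, len(signal))]

def rms(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

e_ref, e_new = prediction_errors(baseline), prediction_errors(changed)
# An error level well above the baseline band suggests a changed state
print(rms(e_new) > rms(e_ref))  # → True
```

In the paper's setting, clustering of the errors and hypothesis tests on their distribution replace this simple RMS comparison, but the decision logic is the same: errors outside the baseline behavior trigger further investigation.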
Procedia PDF Downloads 207
28892 Enhancing Technical Trading Strategy on the Bitcoin Market using News Headlines and Language Models
Authors: Mohammad Hosein Panahi, Naser Yazdani
Abstract:
We present a technical trading strategy that leverages the FinBERT language model and financial news analysis, with a focus on news related to a subset of Nasdaq 100 stocks. Our approach surpasses the baseline Range Break-out strategy in the Bitcoin market, yielding a remarkable 24.8% increase in the win ratio for all Friday trades and an impressive 48.9% surge for short trades specifically on Fridays. Moreover, we conduct rigorous hypothesis testing to establish the statistical significance of these improvements. Our findings underscore the considerable potential of our NLP-driven approach for enhancing trading strategies and achieving greater profitability within financial markets.
Keywords: quantitative finance, technical analysis, bitcoin market, NLP, language models, FinBERT, technical trading
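Testing whether an improved win ratio is statistically significant, as the abstract describes, can be sketched with a two-proportion z-test. The trade counts below are illustrative, not the paper's data.

```python
import math

# Sketch: two-proportion z-test on win ratios (illustrative trade counts).
wins_base, n_base = 52, 100   # e.g. baseline Range Break-out Friday trades
wins_new,  n_new  = 65, 100   # e.g. strategy with the FinBERT news filter

p1, p2 = wins_base / n_base, wins_new / n_new
p_pool = (wins_base + wins_new) / (n_base + n_new)   # pooled win probability
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_base + 1 / n_new))
z = (p2 - p1) / se
print(round(z, 2))  # → 1.87; compare to 1.96 for a two-sided 5% test
```

With these made-up counts the improvement falls just short of two-sided 5% significance, which illustrates why sample size (number of trades) matters as much as the raw win-ratio gain.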
Procedia PDF Downloads 7328891 Effect of Angles Collision, Absorption, Dash and Their Relationship with the Finale Results Case the Algerian Elite Team Triple Jump
Authors: Guebli Abdelkader, Zerf Mohammed, Mekkades Moulay Idriss, BenGoua Ali, Atouti Nouredinne, Habchi Nawel
Abstract:
The paper aims to show the influence of angles on the results of the triple jump. Our background shows that the movement is a series of motions characterized by complex angles in the three phases (hop, step, and jump), and that the combination of the push-off across these phases shapes the final result. Our data were obtained from the National Athletics Championship 2013, which was filmed and analyzed with the software Kinovea. Based on the statistical analysis, we confirm that there is a positive relationship between the angle of the leg, the hip angle and the angle of the trunk at touchdown during the hop, step, and jump, and a negative correlation with the angle of the knee at touchdown.
Keywords: kinematic variables, triple jump, final results, digital achievement
Procedia PDF Downloads 32528890 Simulation Based Analysis of Gear Dynamic Behavior in Presence of Multiple Cracks
Authors: Ahmed Saeed, Sadok Sassi, Mohammad Roshun
Abstract:
Gears are important components with a vital role in many rotating machines. One of the common causes of gear failure is tooth fatigue cracking; however, its early detection is still a challenging task. The objective of this study is to develop a numerical model that simulates the effect of tooth cracks on the resulting gear vibrations and consequently permits early fault detection. In contrast to other published papers, this work incorporates the possibility of multiple simultaneous cracks with different depths. As cracks significantly alter the stiffness of the tooth, finite element software is used to determine the stiffness variation with respect to the angular position for different combinations of crack orientation and depth. A simplified six-degrees-of-freedom nonlinear lumped parameter model of a one-stage spur gear system is proposed to study the vibration with and without cracks. The stiffness model developed for the cracked tooth is used to update the physical parameters in the equations of motion describing the vibration of the gearbox. The vibration of the gearbox was simulated using Simulink/Matlab. The effect of a single crack at different depth levels was studied thoroughly. The resulting changes in mesh stiffness and vibration response were found to be consistent with previously published work. In addition, various statistical time-domain parameters were considered; they showed different degrees of sensitivity to the crack depth. Multiple cracks were then introduced at different locations, and the vibration response along with the statistical parameters was obtained again for a general case of degradation (increasing crack depth, crack number, and crack locations). It was found that although some parameters increase in value as the deterioration level increases, they show almost no change, or even decrease, when the number of cracks increases. Therefore, the use of any statistical parameter could be misleading if it is not interpreted in an appropriate way.
Keywords: spur gear, cracked tooth, numerical simulation, time-domain parameters
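The statistical time-domain parameters the abstract refers to typically include quantities such as RMS, peak, kurtosis, and crest factor. A minimal sketch of how they are computed is given below; the synthetic sine-plus-impulse signal stands in for a gearbox vibration record and is invented for illustration, not taken from the study.

```python
import math

def time_domain_parameters(signal):
    """Compute common statistical time-domain indicators of a vibration signal."""
    n = len(signal)
    mean = sum(signal) / n
    centred = [x - mean for x in signal]
    variance = sum(c * c for c in centred) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak = max(abs(x) for x in signal)
    # Kurtosis is highly sensitive to impulsive content such as crack-induced impacts.
    kurtosis = (sum(c ** 4 for c in centred) / n) / variance ** 2
    crest_factor = peak / rms
    return {"rms": rms, "peak": peak, "kurtosis": kurtosis, "crest": crest_factor}

# Synthetic example: a sine wave with one impulsive spike mimicking a crack impact.
signal = [math.sin(2 * math.pi * k / 64) for k in range(640)]
signal[100] += 3.0
params = time_domain_parameters(signal)
```

For a pure sine the kurtosis is 1.5 and the crest factor is about 1.41; the added impulse raises both, which is why such indicators are used as crack-sensitivity measures.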
Procedia PDF Downloads 265
28889 Frame to Frameless: Stereotactic Operation Progress in Robot Time
Authors: Zengmin Tian, Bin Lv, Rui Hui, Yupeng Liu, Chuan Wang, Qing Liu, Hongyu Li, Yan Qi, Li Song
Abstract:
Objective: In recent years, robots have been used to replace the stereotactic frame. This paper investigates the safety and effectiveness of frameless stereotactic surgery in the treatment of children with cerebral palsy. Methods: Clinical data of 425 children with spastic cerebral palsy were retrospectively analyzed. The patients were treated with robot-assisted frameless stereotactic surgery of nuclear mass destruction. Motor function was evaluated with the Gross Motor Function Measure-88 (GMFM-88) before the operation and at 1 week and 3 months after the operation, and statistical analysis was performed. Results: Postoperative CT showed that the destruction area covered the predetermined target in all patients. Minimal bleeding of the puncture channel occurred in 2 patients and mild fever in 3 cases; otherwise, no severe surgical complication occurred. The GMFM-88 scores were 49.1±22.5 before the operation, and 52.8±24.2 and 64.2±21.4 at 1 week and 3 months after the operation, respectively. The difference between the pre- and postoperative scores was statistically significant (P<0.01). After 3 months, the total effective rate was 98.1%, and the average improvement in motor function was 24.3%. Conclusion: Having replaced the traditional frame, robot-assisted frameless stereotactic surgery is safe and reliable for children with spastic cerebral palsy and has positive significance in improving patients' motor function.
Keywords: cerebral palsy, robotics, stereotactic techniques, frameless operation
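The pre/post comparison of GMFM-88 scores reported above is a paired design; a minimal sketch of the paired t statistic and the average improvement rate is shown below. The scores are invented for illustration and are not the study's patient data.

```python
import math

def paired_t(before, after):
    """Paired t statistic for pre/post scores (e.g. GMFM-88 before vs. after surgery)."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d / (sd_d / math.sqrt(n))

# Illustrative scores only (hypothetical, not the study's data).
before = [45.0, 52.0, 38.0, 61.0, 49.5, 55.0]
after = [58.0, 66.0, 50.0, 72.0, 63.5, 68.0]
t = paired_t(before, after)
# Average per-patient improvement rate, in percent.
improvement = sum((a - b) / b for a, b in zip(after, before)) / len(before) * 100
```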
Procedia PDF Downloads 85
28888 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators
Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros
Abstract:
Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness devotes almost as much time to data quality processes, while a data project without such awareness suffers negative impacts on financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ according to the purpose of each data project. In particular, a big data project may involve many datasets and stakeholders, and thus take a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators that describe the overall data quality of each dataset, for quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were followed to develop the Dataset Quality Index (SDQI). First, we define standard data quality expectations. Second, we find indicators that can be measured directly on the data within the datasets. Third, the indicators are aggregated into dimensions using factor analysis. Next, the indicators and dimensions are weighted by the effort required for the data preparation process and by usability. Finally, the dimensions are aggregated into the composite indicator. The results of these analyses showed that: (1) ten useful indicators and measurements were developed; (2) for the data quality dimensions based on statistical characteristics, the ten indicators could be reduced to four dimensions; and (3) the developed composite indicator, the SDQI, can describe the overall quality of each dataset and can separate datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall, meaningful description of data quality within datasets. The SDQI can be used to assess all data in a data project, for effort estimation, and for prioritization. The SDQI also works well with agile methods, by using it for assessment in the first sprint; after passing the initial evaluation, more specific data quality indicators can be added in subsequent sprints.
Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis
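The final weighting-and-aggregation step can be sketched as below. The dimension names, weights, and level thresholds are hypothetical stand-ins, not the SDQI's actual values.

```python
def composite_index(dimension_scores, weights):
    """Aggregate weighted dimension scores (0-1 scale) into a single composite index."""
    total_w = sum(weights.values())
    return sum(dimension_scores[d] * w for d, w in weights.items()) / total_w

def quality_level(index):
    """Map the composite index to the three levels described above (thresholds assumed)."""
    if index >= 0.8:
        return "Good Quality"
    if index >= 0.5:
        return "Acceptable Quality"
    return "Poor Quality"

# Hypothetical dimensions and weights for one dataset.
weights = {"completeness": 0.4, "validity": 0.3, "consistency": 0.2, "uniqueness": 0.1}
scores = {"completeness": 0.95, "validity": 0.80, "consistency": 0.70, "uniqueness": 1.0}
sdqi = composite_index(scores, weights)  # weighted mean of the four dimensions
level = quality_level(sdqi)
```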
Procedia PDF Downloads 138
28887 A Nonlocal Means Algorithm for Poisson Denoising Based on Information Geometry
Authors: Dongxu Chen, Yipeng Li
Abstract:
This paper presents an information-geometric Nonlocal Means (NLM) algorithm for Poisson denoising. NLM estimates a noise-free pixel as a weighted average of image pixels, where each pixel is weighted according to the similarity between image patches in Euclidean space. In this work, every pixel is modeled as a Poisson distribution locally estimated by Maximum Likelihood (ML), and all such distributions constitute a statistical manifold. The NLM denoising algorithm is then conducted on this statistical manifold, where the Fisher information matrix can be used to compute geodesic distances between distributions, which serve as the similarity measure between patches. This approach was demonstrated to be competitive with related state-of-the-art methods.
Keywords: image denoising, Poisson noise, information geometry, nonlocal-means
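For two Poisson distributions with means λ₁ and λ₂ the Fisher-Rao geodesic distance has the closed form 2·|√λ₁ − √λ₂|, which makes a manifold-based NLM easy to sketch. The toy 1-D "image", patch size, and filtering parameter below are assumptions for illustration, not the paper's implementation.

```python
import math

def geodesic_dist(p1, p2):
    """Fisher-Rao geodesic distance between patches of Poisson means:
    2*|sqrt(l1) - sqrt(l2)| per pixel, combined over the patch."""
    return math.sqrt(sum((2 * (math.sqrt(a) - math.sqrt(b))) ** 2 for a, b in zip(p1, p2)))

def nlm_denoise(pixels, patch=1, h=2.0):
    """Nonlocal-means estimate: each pixel is a weighted average of all pixels,
    weighted by patch similarity measured on the statistical manifold."""
    n = len(pixels)
    out = []
    for i in range(n):
        pi = pixels[max(0, i - patch):i + patch + 1]
        num = den = 0.0
        for j in range(n):
            pj = pixels[max(0, j - patch):j + patch + 1]
            if len(pj) != len(pi):
                continue  # skip boundary patches of different size (simplification)
            w = math.exp(-(geodesic_dist(pi, pj) / h) ** 2)
            num += w * pixels[j]
            den += w
        out.append(num / den)
    return out

noisy = [4, 5, 4, 20, 21, 19, 4, 5, 4]  # toy Poisson-count signal with two regions
denoised = nlm_denoise(noisy)
```

Because similar patches get weight near 1 and dissimilar ones near 0, each flat region is averaged with its look-alikes while the step between regions is preserved.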
Procedia PDF Downloads 284
28886 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection
Authors: Mahshid Arabi
Abstract:
With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but vulnerabilities and security threats have increased just as significantly. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. The AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forests and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, T-test and ANOVA statistical tests were employed, and results were measured using the accuracy, sensitivity, and specificity of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the proposed comprehensive model reduced cyber-attacks by an average of 85%. Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on these results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and the continuous training of human resources.
Keywords: data protection, digital technologies, information security, modern management
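The evaluation indicators named above (accuracy, sensitivity, specificity) are all computed from a classifier's confusion matrix. A minimal sketch follows; the counts are hypothetical and not the study's results.

```python
def classification_metrics(tp, fn, fp, tn):
    """Accuracy, sensitivity (recall on attacks) and specificity (recall on normal traffic)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# Hypothetical intrusion-detection confusion matrix: 'positive' = attack traffic.
acc, sens, spec = classification_metrics(tp=850, fn=150, fp=40, tn=960)
```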
Procedia PDF Downloads 28
28885 Measurement and Prediction of Speed of Sound in Petroleum Fluids
Authors: S. Ghafoori, A. Al-Harbi, B. Al-Ajmi, A. Al-Shaalan, A. Al-Ajmi, M. Ali Juma
Abstract:
Seismic methods play an important role in the exploration for hydrocarbon reservoirs. However, the success of the method depends strongly on the reliability of the measured or predicted information regarding the velocity of sound in the media. The speed of sound has long been used to study the thermodynamic properties of fluids. In this study, experimental data on the speed of sound in a binary mixture of toluene and octane are reported and analyzed. A three-factor, three-level Box-Behnken design is used to determine the significance of each factor, the synergetic effects of the factors, and the most significant factors affecting the speed of sound. The developed mathematical model and statistical analysis provided a critical analysis of the simultaneous interactive effects of the independent variables, indicating that the developed quadratic models were highly accurate and predictive.
Keywords: experimental design, octane, speed of sound, toluene
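The quadratic model behind a three-level design is fitted by least squares on a design matrix containing linear, squared, and interaction terms. A minimal sketch with two coded factors and invented coefficients (synthetic data, not the study's measurements) is shown below.

```python
import numpy as np

def quadratic_design(x1, x2):
    """Design-matrix row for a full quadratic (response surface) model in two factors."""
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

# Synthetic 'speed of sound' response with known coefficients (illustration only).
true_beta = np.array([1300.0, -4.0, 2.5, 0.3, -0.1, 0.05])
levels = [-1.0, 0.0, 1.0]  # coded factor levels, as in a three-level design
X = np.array([quadratic_design(a, b) for a in levels for b in levels])
y = X @ true_beta

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares fit of the model
```

With noise-free synthetic data the fit recovers the coefficients exactly; with real measurements, the significance of each term would then be judged, e.g., by ANOVA.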
Procedia PDF Downloads 271
28884 Statistical Data Analysis of Migration Impact on the Spread of HIV Epidemic Model Using Markov Monte Carlo Method
Authors: Ofosuhene O. Apenteng, Noor Azina Ismail
Abstract:
Over the last several years, concern has developed over how to minimize the spread of the HIV/AIDS epidemic in many countries. The AIDS epidemic has tremendously stimulated the development of mathematical models of infectious diseases, and the transmission dynamics of HIV infection that eventually develops into AIDS has played a pivotal role in the building of such models. Since the initial HIV and AIDS models introduced in the 1980s, various improvements have been made to the modeling of HIV/AIDS frameworks. In this paper, we present the impact of migration on the spread of HIV/AIDS. The epidemic model is formulated as a system of nonlinear differential equations to supplement the statistical approach. The model is calibrated using HIV incidence data from Malaysia between 1986 and 2011. Bayesian inference based on Markov Chain Monte Carlo is used to validate the model by fitting it to the data and to estimate the unknown parameters of the model. The results suggest that migrants who stay for a long time contribute to the spread of HIV. The model also indicates that susceptible individuals become infected and move to the HIV compartment at a rate that is greater than the removal rate from the HIV compartment to the AIDS compartment. The disease-free steady state is unstable, since the basic reproduction number is 1.627309. This is a serious concern from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium.
Keywords: epidemic model, HIV, MCMC, parameter estimation
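To show how a compartment model of this kind behaves when the basic reproduction number exceeds 1, here is a minimal Euler integration of a susceptible (S) → HIV (H) → AIDS (A) system with a constant migration inflow. The structure, rates, and parameter values are invented for illustration and are not the paper's fitted Malaysian estimates, nor its MCMC calibration.

```python
def simulate(beta, rho, mu, m, s0=0.99, h0=0.01, a0=0.0, dt=0.01, steps=5000):
    """Euler integration of a minimal S -> H -> A model with migration inflow m
    into the susceptible class. Returns the HIV-compartment trajectory."""
    s, h, a = s0, h0, a0
    history = [h]
    for _ in range(steps):
        new_inf = beta * s * h           # mass-action infection rate
        ds = m - new_inf - mu * s
        dh = new_inf - rho * h - mu * h  # rho: progression rate HIV -> AIDS
        da = rho * h - mu * a
        s, h, a = s + dt * ds, h + dt * dh, a + dt * da
        history.append(h)
    return history

# For this simple model the basic reproduction number is R0 = beta / (rho + mu).
beta, rho, mu, m = 0.5, 0.2, 0.02, 0.02
r0 = beta / (rho + mu)
traj = simulate(beta, rho, mu, m)
```

With these assumed values R0 ≈ 2.27 > 1, so the disease-free state is unstable and the HIV compartment grows toward an endemic level, which is the qualitative behaviour the abstract describes.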
Procedia PDF Downloads 598
28883 Determination of Anti-Fungal Activity of Cedrus deodara Oil against Oligoporus placentus, Trametes versicolor and Xylaria acuminata on Populus deltoids
Authors: Sauradipta Ganguly, Akhato Sumi, Sanjeet Kumar Hom, Ajan T. Lotha
Abstract:
Populus deltoides is a hardwood used predominantly for the manufacturing of plywood, matchsticks, and paper in India and hence has high economic significance. Wood-decaying fungi cause serious damage to Populus deltoides products, as the wood itself is perishable and vulnerable to decaying agents, decreasing its aesthetic value, which in turn results in significant monetary loss for the wood industries concerned. The aim of the study was to determine the antifungal activity of Cedrus deodara oil against three primary wood-decaying fungi, namely white-rot fungi (Trametes versicolor), brown-rot fungi (Oligoporus placentus), and soft-rot fungi (Xylaria acuminata), on Populus deltoides samples under optimum laboratory conditions. The susceptibility of Populus deltoides samples to fungal attack and the ability of deodar oil to control colonization of the wood-rotting fungi on the samples were assessed. Three concentrations of deodar oil were considered as treating solutions: 4%, 5%, and 6%. The Populus deltoides samples were treated with these solutions, and the ability of the oil to prevent fungal attack was assessed using an accelerated laboratory test in a Biochemical Oxygen Demand incubator at a temperature of 25 ± 2°C and a relative humidity of 70 ± 4%. The efficacy test and statistical analysis of deodar oil against Trametes versicolor, Oligoporus placentus, and Xylaria acuminata on P. deltoides samples exhibited light, minor, and negligible mycelial growth at 4%, 5%, and 6% concentrations of deodar oil, respectively, whereas moderate to heavy attack was observed on the surface of the control samples. Statistical analysis further established that the treatments were statistically significant and had inhibited fungal growth of all three fungal species by almost 3 to 5 times.
Keywords: Populus deltoides, Trametes versicolor, Oligoporus placentus, Xylaria acuminata, deodar oil, treatment
Procedia PDF Downloads 124
28882 Optimization of the Fabrication Process for Particleboards Made from Oil Palm Fronds Blended with Empty Fruit Bunch Using Response Surface Methodology
Authors: Ghazi Faisal Najmuldeen, Wahida Amat-Fadzil, Zulkafli Hassan, Jinan B. Al-Dabbagh
Abstract:
The objective of this study was to evaluate the optimum fabrication process variables for producing particleboards from oil palm frond (OPF) particles and empty fruit bunch fiber (EFB). Response surface methodology was employed to analyse the effect of hot press temperature (150–190°C), press time (3–7 minutes), and EFB blending ratio (0–40%) on the particleboards' modulus of rupture, modulus of elasticity, internal bonding, water absorption, and thickness swelling. A Box-Behnken experimental design was carried out to develop the statistical models used for the optimisation of the fabrication process variables. All factors were found to have a statistically significant effect on the particleboard properties. The statistical analysis indicated that all models showed a significant fit with the experimental results. The optimum particleboard properties were obtained at the optimal fabrication process conditions: press temperature of 186°C, press time of 5.7 min, and EFB/OPF ratio of 30.4%. Incorporating oil palm fronds and empty fruit bunch to produce particleboards improved the particleboard properties. The OPF-EFB particleboards fabricated at the optimized conditions satisfied the ANSI A208.1-1999 specification for general purpose particleboards.
Keywords: empty fruit bunch fiber, oil palm fronds, particleboards, response surface methodology
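Once the response surface models are fitted, the optimum settings are located by searching the experimental region of the fitted function. The sketch below does this by exhaustive grid search over the same factor ranges; the quadratic response function is a made-up stand-in, not the paper's fitted model.

```python
def predicted_mor(temp, time, ratio):
    """Hypothetical fitted quadratic model for modulus of rupture (illustration only)."""
    t = (temp - 170.0) / 20.0   # code each factor roughly to [-1, 1]
    p = (time - 5.0) / 2.0
    r = (ratio - 20.0) / 20.0
    return 18.0 - 3.0 * t * t - 2.0 * p * p - 2.5 * (r - 0.5) ** 2 + 0.5 * t

def grid_search():
    """Exhaustive search over the experimental region for the best predicted response."""
    best = None
    for temp in range(150, 191):          # 150-190 deg C
        for time10 in range(30, 71):      # 3.0-7.0 min in 0.1-min steps
            for ratio in range(0, 41):    # 0-40 % EFB
                val = predicted_mor(temp, time10 / 10.0, ratio)
                if best is None or val > best[0]:
                    best = (val, temp, time10 / 10.0, ratio)
    return best

y, temp, time_, ratio = grid_search()
```

In practice a desirability function would combine all five responses before optimizing; the single-response search above just illustrates the idea.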
Procedia PDF Downloads 225
28881 A Fully Automated New-Fangled VESTAL to Label Vertebrae and Intervertebral Discs
Authors: R. Srinivas, K. V. Ramana
Abstract:
This paper presents a novel method called VESTAL to label vertebrae and intervertebral discs. Each vertebra has certain statistical feature properties. To label vertebrae and discs, a new equation modeling the path of the spinal cord is derived using statistical properties of the spinal canal, and VESTAL uses this equation for labeling. For each vertebra and intervertebral disc, both the posterior and interior width and height are measured. The calculated values are compared with real values measured using vernier calipers, and the comparison produced accurate results with 95% efficiency. VESTAL was applied to 350 MR images from 50 patients and obtained 100% accuracy in labeling.
Keywords: spine, vertebrae, intervertebral disc, labeling, statistics, texture, disc
Procedia PDF Downloads 361
28880 128-Multidetector CT for Assessment of Optimal Depth of Electrode Array Insertion in Cochlear Implant Operations
Authors: Amina Sultan, Mohamed Ghonim, Eman Oweida, Aya Abdelaziz
Abstract:
Objective: To assess the diagnostic reliability of multidetector CT in the pre- and post-operative evaluation of cochlear implant candidates. Material and Methods: The study includes 40 patients (18 males and 22 females) with a mean age of 5.6 years. They were classified into two groups: in Group A (20 patients) the cochlear implant device was Nucleus-22, and in Group B (20 patients) the device was MED-EL. Cochlear length (CL) and cochlear height (CH) were measured pre-operatively by 128-multidetector CT. Electrode length (EL) and insertion depth angle (α) were measured post-operatively by MDCT. Results: For Group A, mean CL was 9.1 mm ± 0.4 SD; mean CH was 4.1 ± 0.3 SD; mean EL was 18 ± 2.7 SD; mean α angle was 299.05 ± 37 SD. A statistically significant correlation (P < 0.05) was found between preoperative CL and post-operative EL (r²=0.6), as well as between EL and α angle (r²=0.7). For Group B, mean CL was 9.1 mm ± 0.3 SD; mean CH was 4.1 ± 0.4 SD; mean EL was 27 ± 2.1 SD; mean α angle was 287.6 ± 41.7 SD. A statistically significant correlation was found between CL and both EL (r²=0.6) and α angle (r²=0.5). A strong correlation was also found between EL and α angle (r²=0.8). A statistically significant difference was detected between the two devices with regard to electrode length. Conclusion: Multidetector CT is a reliable tool for preoperative planning and post-operative evaluation of the outcomes of cochlear implant operations. Cochlear length is a valuable prognostic parameter for predicting the depth of electrode array insertion, which can influence the criteria for device selection.
Keywords: angle of insertion (α angle), cochlear implant (CI), cochlear length (CL), Multidetector Computed Tomography (MDCT)
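The r² values reported above are coefficients of determination from simple linear correlation. A minimal sketch of that computation follows; the measurements are invented for illustration, not the study's patient data.

```python
def r_squared(xs, ys):
    """Coefficient of determination of a simple linear fit (squared Pearson r)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov * cov / (vx * vy)

# Illustrative measurements only (mm): cochlear length vs. inserted electrode length.
cl = [8.7, 8.9, 9.0, 9.1, 9.2, 9.4, 9.5]
el = [16.0, 17.1, 17.5, 18.2, 18.4, 19.6, 19.9]
r2 = r_squared(cl, el)
```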
Procedia PDF Downloads 191
28879 Applying Arima Data Mining Techniques to ERP to Generate Sales Demand Forecasting: A Case Study
Authors: Ghaleb Y. Abbasi, Israa Abu Rumman
Abstract:
This paper modeled the sales history of five products of a medical supply company in Jordan, archived from 2012 to 2015 and binned monthly. Consistent patterns extracted from the sales demand history in the Enterprise Resource Planning (ERP) system were used to generate sales demand forecasts with the time series statistical technique Auto Regressive Integrated Moving Average (ARIMA). This was used to model realistic sales demand patterns, predict future demand, and select the best model for each of the five products. The analysis revealed that the current replenishment system produced inventory overstocking.
Keywords: ARIMA models, sales demand forecasting, time series, R code
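Full ARIMA fitting is normally done with a statistics package (the study used R). To illustrate the idea only, the sketch below hand-rolls a minimal ARIMA(1,1,0): difference the series, fit an AR(1) on the differences by least squares, and forecast one step ahead. The demand figures are invented, not the company's data.

```python
def arima_110_forecast(series):
    """One-step forecast from a minimal ARIMA(1,1,0): first-difference the series,
    fit AR(1) on the differences by least squares, then undo the differencing."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    x, y = diffs[:-1], diffs[1:]
    phi = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)  # AR(1) coefficient
    next_diff = phi * diffs[-1]
    return series[-1] + next_diff

# Illustrative monthly demand for one product (units), trending upward.
demand = [100, 104, 109, 115, 122, 130, 139]
forecast = arima_110_forecast(demand)
```

The differencing (the "I" in ARIMA) removes the trend so that the AR term models month-to-month changes rather than levels.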
Procedia PDF Downloads 384
28878 An Infinite Mixture Model for Modelling Stutter Ratio in Forensic Data Analysis
Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer
Abstract:
Forensic DNA analysis has received much attention over the last three decades due to its incredible usefulness in human identification, and the statistical interpretation of DNA evidence is recognised as one of the most mature fields in forensic science. Peak heights in an electropherogram (EPG) are approximately proportional to the amount of template DNA in the original sample being tested. A stutter is a minor peak in an EPG that is not masquerading as an allele of a potential contributor and is considered an artefact presumed to arise from miscopying or slippage during the PCR. Stutter peaks are mostly analysed in terms of the stutter ratio, calculated relative to the corresponding parent allele height. The analysis of mixture profiles has always been problematic in evidence interpretation, especially in the presence of PCR artefacts such as stutters. Unlike binary and semi-continuous models, continuous models assign a probability (as a continuous weight) to each possible genotype combination and significantly enhance the use of continuous peak height information, resulting in more efficient and reliable interpretations. Therefore, a sound methodology to distinguish between stutters and real alleles is essential for the accuracy of the interpretation, and any such method has to be able to model stutter peaks. Bayesian nonparametric methods provide increased flexibility in applied statistical modelling. Mixture models are frequently employed as fundamental data analysis tools for the clustering and classification of data and assume unidentified heterogeneous sources for the data. In model-based clustering, each unknown source is reflected by a cluster, and the clusters are modelled using parametric models. Specifying the number of components in finite mixture models, however, is practically difficult, even though the calculations are relatively simple. Infinite mixture models, in contrast, do not require the user to specify the number of components. Instead, a Dirichlet process, which is an infinite-dimensional generalization of the Dirichlet distribution, is used to deal with the problem of the number of components. The Chinese restaurant process (CRP), the stick-breaking process, and the Pólya urn scheme are frequently used to construct Dirichlet priors in Bayesian mixture models. In this study, we illustrate an infinite mixture of simple linear regression models for modelling the stutter ratio and introduce some modifications to overcome weaknesses associated with the CRP.
Keywords: Chinese restaurant process, Dirichlet prior, infinite mixture model, PCR stutter
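The Chinese restaurant process mentioned above can be sketched in a few lines: customer i joins an existing table with probability proportional to its occupancy, or opens a new table with probability proportional to a concentration parameter α, so the number of clusters need not be fixed in advance. The α value and seed below are arbitrary choices for illustration, not the study's settings.

```python
import random

def crp_partition(n, alpha, rng):
    """Sample a partition of n customers via the Chinese restaurant process:
    customer i joins table k with probability |table k| / (i + alpha),
    or opens a new table with probability alpha / (i + alpha)."""
    tables = []  # sizes of the occupied tables
    for i in range(n):
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        for k, size in enumerate(tables):
            acc += size
            if r < acc:
                tables[k] += 1
                break
        else:
            tables.append(1)  # open a new table

    return tables

rng = random.Random(0)
tables = crp_partition(100, alpha=1.0, rng=rng)
```

In the mixture-model setting each table corresponds to one regression component, so the sampled partition determines how many components the model currently uses.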
Procedia PDF Downloads 328