Search results for: statistical data analysis
41893 A Clustering-Based Approach for Weblog Data Cleaning
Authors: Amine Ganibardi, Cherif Arab Ali
Abstract:
This paper addresses data cleaning as part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers in log files reflect usage activity, i.e., end-users' clicks and the underlying user-agents' hits. Since Web Usage Mining is interested in end-users' behavior, user-agents' hits are treated as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: (i) a server records requests interlaced in sequential order regardless of their source or type, and (ii) website resources may be set up to be requestable interchangeably by end-users and user-agents. Current methods are content-centric, based on filtering heuristics over relevant/irrelevant items in terms of cleaning attributes such as resource filetype extensions, resources pointed to by hyperlinks/URIs, HTTP methods, and user-agents. These methods require exhaustive extra-weblog data and prior knowledge of the relevant and/or irrelevant items to be treated as clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive Web for three reasons: resources may be set up as clickable by end-users regardless of their type, website resources may be indexed by frame names without filetype extensions, and web contents are generated and cancelled differently from one end-user to another. To overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the level of the requested and referring resources attributes. It is insensitive to logging content and needs no extra-weblog data. The statistical property used captures the structure of the logging feature generated by webpage requests in terms of clicks and hits.
Since a webpage consists of a single URI and several components, this feature results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. The clustering-based method is therefore meant to identify two clusters by applying an appropriate distance to the frequency matrix of the requested and referring resources. As the clicks-to-hits ratio is single to multiple, the clicks cluster is the smaller of the two in number of requests. Hierarchical Agglomerative Clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose clicks-to-hits ratios range from 1/2 to 1/15. The optimal clustering, selected on the basis of average linkage and maximum inter-cluster inertia, always results in two clusters. Evaluating the smaller cluster, referred to as the clicks cluster, in terms of confusion-matrix indicators yields a true positive rate of 97%. The content-centric cleaning methods, i.e., conventional and advanced cleaning, yield a lower rate of 91%. The proposed clustering-based cleaning thus outperforms content-centric methods for dynamic and responsive web design without needing any extra-weblog data. Such an improvement in cleaning quality is likely to refine dependent analyses.
Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data
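The pipeline the abstract describes can be sketched in a few lines. This is a minimal illustration on synthetic frequency data, not the paper's actual data: the toy frequencies and cluster sizes are assumptions, and since the frequency matrix here is all-numeric, Gower's distance is stood in for by a range-normalised Manhattan distance, to which it reduces in that case.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Toy frequency matrix: one row per logged request, columns are the
# frequencies of its requested and referring resource (hypothetical data).
# Clicks (single-URI page requests) occur rarely; hits (embedded
# components) repeat often, reflecting the single-to-multiple ratio.
rng = np.random.default_rng(0)
clicks = rng.uniform(1, 3, size=(5, 2))    # rare: end-user page requests
hits = rng.uniform(20, 30, size=(25, 2))   # frequent: component requests
freq = np.vstack([clicks, hits])

# On all-numeric data, Gower's distance reduces to a range-normalised
# Manhattan distance, so normalise each column by its range first.
normed = (freq - freq.min(axis=0)) / (freq.max(axis=0) - freq.min(axis=0))
dist = pdist(normed, metric="cityblock")

# Hierarchical agglomerative clustering, average linkage, cut at 2 clusters.
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")

# The clicks cluster is identified as the smaller of the two.
sizes = {c: int((labels == c).sum()) for c in (1, 2)}
clicks_cluster = min(sizes, key=sizes.get)
print(sizes[clicks_cluster])  # size of the putative clicks cluster
```

On this well-separated toy data the smaller cluster recovers exactly the five synthetic click rows; on real logs the separation is what the 97% true positive rate measures.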
Procedia PDF Downloads 168
41892 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions
Authors: K. Hardy, A. Maurushat
Abstract:
Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are turning to different methodologies to deliver reliable empirics in real time. The question is no longer why people did something over the last 10 years, but why they are doing it now, whether it is undesirable, and how change can be promoted immediately. Big Data analytics relies heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.
Keywords: big data, open data, productivity, data governance
Procedia PDF Downloads 370
41891 Sexual Cognitive Behavioral Therapy: Psychological Performance and Openness to Experience
Authors: Alireza Monzavi Chaleshtari, Mahnaz Aliakbari Dehkordi, Amin Asadi Hieh, Majid Kazemnezhad
Abstract:
This research was conducted to determine the effectiveness of sexual cognitive behavioral therapy on psychological performance and openness to experience in women. The study was experimental, with a pre-test/post-test design. The statistical population consisted of all working, married women with membership in the researcher's Instagram social network who had problems in marital-sexual relationships (N = 900). From this population, 30 women were selected by convenience sampling and randomly assigned to two groups (15 in the experimental group and 15 in the control group). The Depression, Anxiety and Stress Scale (DASS) and the Costa and McCrae personality questionnaire were used to collect data, and Dr. Mehrnaz Ali Akbari's cognitive behavioral therapy protocol was used for the treatment sessions. Data were analyzed using analysis of covariance in SPSS 22. The results showed that sexual cognitive behavioral therapy has a positive and significant effect on psychological performance and openness to experience in women. It can be concluded that interventions such as sexual cognitive behavioral therapy can be used to treat marital problems.
Keywords: sexual cognitive behavioral therapy, psychological function, openness to experience, women
Procedia PDF Downloads 77
41890 Knowledge, Attitudes and Readiness of Students towards Higher Order Thinking Skills
Authors: Mohd Aderi Che Noh, Tuan Rahayu Tuan Lasan
Abstract:
Higher order thinking skills (HOTS) are an important element of the Malaysian education system, intended to produce a knowledgeable generation able to think critically and creatively in order to face future challenges. The educational challenges of the 21st century require all students to have HOTS. Therefore, this study aims to identify students' level of knowledge, attitude and readiness towards HOTS. The respondents were 127 form-four students from schools in the Federal Territory of Putrajaya. The study is a quantitative survey using a questionnaire to collect data. Data were analyzed using the Statistical Package for the Social Sciences (SPSS) 23.0. The results showed that students' knowledge, attitudes and readiness towards HOTS were at a high level. Inferential analysis showed a significant relationship between knowledge and both attitude and readiness towards HOTS. This study provides information to schools and teachers to improve teaching and learning, increase students' HOTS, and fulfil the Ministry of Education's goal of producing globally competitive human capital.
Keywords: high order thinking skills, teaching, education, Malaysia
Procedia PDF Downloads 210
41889 Simulation of Government Management Model to Increase Financial Productivity System Using Govpilot
Authors: Arezou Javadi
Abstract:
The use of algorithmic models that rely on software calculations, and the simulation of new government management approaches with specialized software, have recently increased the productivity and efficiency of government management systems. This has shifted the management approach from the old break-and-fix model, with its low efficiency and usefulness, to a more capable, higher-efficiency model called the partnership-with-resident model. Using Govpilot™ software, the relationship between people in a system and the government was examined. The two-tailed interaction method was the outsourcing of a goal in a system, formed in the order of goals, qualified executive people, an optimal executive model, and, finally, a summary of additional activities at different statistical levels. The results showed that the participation of people in a financial implementation system with a statistical potential of P ≥ 5% caused a significant increase in investment and initial capital in the government system, with maximum project implementation in a smart government.
Keywords: machine learning, financial income, statistical potential, govpilot
Procedia PDF Downloads 87
41887 Road Safety in Great Britain: An Exploratory Data Analysis
Authors: Jatin Kumar Choudhary, Naren Rayala, Abbas Eslami Kiasari, Fahimeh Jafari
Abstract:
Great Britain has one of the safest road networks in the world. However, the consequences of any death or serious injury are devastating for loved ones, as well as for those who help the severely injured. This paper aims to analyse Great Britain's road safety situation and identify areas where the total damage caused by accidents can be significantly and quickly reduced. We conduct an exploratory data analysis using STATS19 data. For the past 30 years, the UK has had a good record in reducing fatalities, ranking third based on the number of road deaths per million inhabitants. Around 165,000 accidents were reported in Great Britain in 2009, and the number decreased every year, falling below 120,000 by 2019. The government continues to drive down road deaths by empowering responsible road users and by identifying and addressing the factors that make roads less safe.
Keywords: road safety, data analysis, openstreetmap, feature expanding
Procedia PDF Downloads 139
41886 Poster: Incident Signals Estimation Based on a Modified MCA Learning Algorithm
Authors: Rashid Ahmed, John N. Avaritsiotis
Abstract:
Many signal-subspace-based approaches have been proposed for determining the fixed Direction of Arrival (DOA) of plane waves impinging on an array of sensors. Two procedures for neural-network-based DOA estimation are presented. First, Principal Component Analysis (PCA) is employed to extract the maximum eigenvalue and its eigenvector from the signal subspace to estimate the DOA. Second, Minor Component Analysis (MCA) is a statistical method for extracting the eigenvector associated with the smallest eigenvalue of the covariance matrix. In this paper, we modify an MCA(R) learning algorithm to enhance convergence, which is essential for moving the MCA algorithm towards practical applications. A learning rate parameter is also presented that ensures fast convergence of the algorithm, because it directly affects the convergence of the weight vector, and the error level depends on its value. MCA is then performed to determine the estimated DOA. Preliminary results are furnished to illustrate the convergence achieved.
Keywords: direction of arrival, neural networks, principal component analysis, minor component analysis
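Minor-component extraction of the kind described above can be sketched with a simple iterative scheme: repeatedly applying (I − ηC) and renormalising converges to the eigenvector of the smallest eigenvalue of the covariance matrix C, provided the learning rate η is small enough. This is a generic sketch under assumed data, not the authors' modified MCA(R) algorithm; the covariance matrix and η below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 3-channel sensor-array covariance matrix (assumed data).
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 1.0]])

# Iterate w <- w - eta * C w, then renormalise: this is power iteration
# on (I - eta*C), whose dominant eigenvector is C's minor eigenvector
# when eta is small.  The learning rate eta governs convergence speed,
# as the abstract notes.
eta = 0.1
w = rng.standard_normal(3)
for _ in range(2000):
    w = w - eta * (C @ w)
    w /= np.linalg.norm(w)

# Check against the exact minor eigenvector from a direct eigendecomposition.
vals, vecs = np.linalg.eigh(C)
exact = vecs[:, 0]                 # eigenvector of the smallest eigenvalue
align = abs(float(w @ exact))      # |cosine| ~ 1 if converged
print(round(align, 4))
```

In a DOA setting the converged weight vector would then be fed into the noise-subspace spectrum search; here only the eigenvector extraction itself is shown.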
Procedia PDF Downloads 450
41885 The Predictive Role of Attachment and Adjustment in the Decision-Making Process in Infertility
Authors: A. Luli, A. Santona
Abstract:
It is rare for individuals in a relationship to consider the possibility of having procreation problems, now or in the future. Yet infertility is a condition that affects millions of people around the world. Infertile individuals often have to deal with psychological, relational and social problems; they have to review their choices and, where necessary, consider new ones. Different studies have examined the decisions infertile individuals face in dealing with infertility and its treatment, but none has focused on the decision-making style they use to address the problem or on the factors that influence it. The aim of this paper is to define the decision-making style used by infertile persons to resolve the 'problem', and the potential predictive role of attachment and dyadic adjustment. The total sample comprises 251 participants, divided into two groups: an experimental group of 114 participants (62 males and 52 females, aged between 25 and 59 years) and a control group of 137 participants (65 males and 72 females, aged between 22 and 49 years). The battery of instruments comprises the General Decision Making Style inventory (GDMS), the Experiences in Close Relationships Questionnaire Revised (ECR-R), the Dyadic Adjustment Scale (DAS), and the Symptom Checklist-90-R (SCL-90-R). The analyses showed a prevalence of the rational decision-making style for both males and females, with no statistically significant difference between the experimental and control groups. The analyses also showed a statistically significant relationship between decision-making styles and adult attachment styles for both males and females.
In this case, only for males was there a statistically significant difference between the experimental and control groups. Another statistically significant relationship was found between decision-making styles and the adjustment scales for both males and females; here too, the difference between the two groups was significant only for males. These results enrich the literature on decision-making styles in infertile individuals, showing the predictive role of attachment styles and adjustment and thereby confirming the few results already in the literature.
Keywords: adjustment, attachment, decision-making style, infertility
Procedia PDF Downloads 332
41884 Assessing Smallholder Farmers’ Perception of Climate Change and Coping Strategies Adopted in the Olifants Catchment of South Africa
Authors: Mary Funke Olabanji, Thando Ndarana, Nerhene Davis, Sylvester Okechukwu Ilo
Abstract:
Scientific evidence indicates that climate change is already being experienced by farmers, and its impacts are felt in agricultural and food systems. Understanding farmers' perceptions of climate change and how they respond to it is essential to developing and implementing appropriate policies for agriculture and food security. This paper aims to contribute to the understanding of farmers' perceptions of climate change, the coping strategies they adopt, the long-term implications of their adaptation choices, and the barriers to their decisions to adapt. Data were collected from 73 randomly selected respondents in five districts located in the Olifants catchment of South Africa. A combination of descriptive statistics and chi-square tests in the Statistical Package for the Social Sciences (SPSS) was used to analyse the survey data. Results show that smallholder farmers have an in-depth perception of climate change. The most significant changes perceived by farmers were increased temperature and reduced rainfall. The results equally revealed that smallholder farmers in the Olifants catchment had adopted several adaptation strategies in response to perceived climate change, the most significant being changes to cropping patterns and planting dates, use of improved seed varieties, and chemical fertilizers. The study therefore concludes that crop diversification and agroforestry were the more effective and sustainable strategies for mitigating the impact of climate change.
Keywords: adaptation, climate change, perception, smallholder farmers
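A chi-square test of the kind used above checks whether perceived change and adopted strategy are associated. A minimal sketch on a hypothetical contingency table (the counts below are invented for illustration, not the survey's data):

```python
from scipy.stats import chi2_contingency

# Hypothetical cross-tabulation: rows are the perceived change
# (higher temperature, lower rainfall), columns are the adaptation
# adopted (changed planting date, improved seed, chemical fertilizer).
table = [[18, 9, 7],
         [12, 15, 12]]

chi2, p, dof, expected = chi2_contingency(table)
# Degrees of freedom for a 2x3 table: (2-1)*(3-1) = 2.
print(dof, round(p, 3))
```

A p-value below 0.05 would indicate that the adaptation choice is not independent of the perceived change.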
Procedia PDF Downloads 182
41883 Data-Driven Infrastructure Planning for Offshore Wind Farms
Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree
Abstract:
The calculations done at the beginning of a wind farm's life are rarely reliable, which makes it important to study the failure and repair rates of wind turbines under various conditions. The miscalculation arises because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means the reliability function is exponential in nature. This research aims to create a more accurate model using sensory data and a data-driven approach. Data cleaning and processing are done by comparing the wind turbines' power curve data with SCADA data, which are then converted to times-to-repair and times-to-failure time series. Several different mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease turbine downtime.
Keywords: reliability, Bayesian parameter inference, maximum likelihood estimation, Weibull function, SCADA data
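The maximum-likelihood fitting step can be sketched as follows on synthetic times-to-failure (drawn here from an exponential distribution as an assumption, standing in for the SCADA-derived series). When the true failure rate is constant, the fitted two-parameter Weibull shape comes out close to 1, at which point the Weibull reduces to the exponential — matching the "almost identical results" noted above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic times-to-failure in hours (assumed data, not the paper's).
ttf = rng.exponential(scale=500.0, size=400)

# Exponential MLE with location fixed at 0: the fitted scale is the
# sample mean (rate = 1/mean).
exp_loc, exp_scale = stats.expon.fit(ttf, floc=0)

# Two-parameter Weibull MLE (location fixed at 0).  A shape near 1
# indicates a constant hazard, i.e. the exponential special case.
shape, w_loc, w_scale = stats.weibull_min.fit(ttf, floc=0)
print(round(shape, 2))
```

With real component data the fitted shape deviating from 1 would signal an ageing (shape > 1) or infant-mortality (shape < 1) failure pattern, which is what motivates relaxing the constant-rate assumption.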
Procedia PDF Downloads 86
41882 Mapping Crime against Women in India: Spatio-Temporal Analysis, 2001-2012
Authors: Ritvik Chauhan, Vijay Kumar Baraik
Abstract:
Women are most vulnerable to crime despite occupying a central position in shaping society as children's first teachers. In India too, despite equal rights and constitutional safeguards, the incidence of crime against women is large and grave, and rape in particular has been increasing over time. This paper explores the spatial and temporal aspects of crime against women in India with special reference to rape. It also examines crime against women and its spatial, socio-economic and demographic associates, using data obtained from the National Crime Records Bureau, the Indian Census and other government sources of the Government of India. Simple statistical methods, choropleth mapping and other cartographic representation methods have been used to examine crime rates, spatio-temporal patterns of crime, and the association of crime with its correlates. The major findings are visible spatial variations across the country and rising trends in both incidence and rates over the reference period. The study also indicates that some geographical associations are observed; however, selected socio-economic indicators seem to have no significant bearing on crime against women at this level.
Keywords: crime against women, crime mapping, trend analysis, society
Procedia PDF Downloads 328
41881 A Highly Accurate Computer-Aided Diagnosis: CAD System for the Diagnosis of Breast Cancer by Using Thermographic Analysis
Authors: Mahdi Bazarganigilani
Abstract:
Computer-aided diagnosis (CAD) systems can play a crucial role in diagnosing serious diseases such as breast cancer at the earliest possible stage. In this paper, a CAD system for the diagnosis of breast cancer is introduced and evaluated. The system was developed using spatio-temporal analysis of a set of consecutive thermographic images, employing wavelet transformation. From this analysis, a highly accurate machine learning model using a random forest was obtained. The final results showed a promising accuracy of 91% in terms of the F1 measure on sample data from 200 patients. The CAD system was further extended to provide a detailed analysis of the effect of smaller sub-areas of each breast on the occurrence of cancer.
Keywords: computer-aided diagnosis systems, thermographic analysis, spatio-temporal analysis, image processing, machine learning
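The wavelet step in such a pipeline can be sketched with a single level of the Haar discrete wavelet transform applied to a per-pixel temperature sequence across consecutive thermograms. This is a generic illustration, not the paper's actual feature set: the sequence and the energy features below are assumptions.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform (even-length input)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: slow trend
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: local change
    return approx, detail

# Toy per-pixel temperature sequence (degrees C) over eight thermograms.
seq = np.array([30.1, 30.2, 30.0, 30.3, 31.0, 31.4, 31.1, 31.6])
approx, detail = haar_dwt(seq)

# Energy split between approximation and detail coefficients is a typical
# wavelet feature that could feed a classifier such as a random forest.
features = [float(np.sum(approx**2)), float(np.sum(detail**2))]
print(len(features))
```

Because the Haar transform is orthonormal, the two feature energies sum to the energy of the original sequence, so no information is lost in the decomposition.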
Procedia PDF Downloads 209
41880 The Impact of Effective Employee Retention Strategies to the Success of the Hotel Industry of Rwanda
Authors: Ange Meghane Hakizimana, Landry Ndikuriyo
Abstract:
Retention of employees in the hospitality industry is a recurring item on the organizational agenda, involving all the combined efforts to retain the best available workforce. The general objective of this research is to assess the impact of effective employee retention strategies on the success of the hotel industry at the Galileo Hotel, Huye District, Rwanda, for the period 2019-2021. Herzberg's two-factor theory and equity theory were used. The study adopted a descriptive research design, which allowed us to study the elements in their natural form without alteration. Primary and secondary data were collected, sorted and entered into the Statistical Package for the Social Sciences (SPSS) version 26 for analysis. Frequencies, descriptive statistics and percentages were used to establish the extent to which employee retention strategies impact the success of the hotel industry in Rwanda, and this was analyzed using regression and correlation analysis. The results revealed that employee training and development had an influence of 24.8% on the success of the hotel industry in Rwanda. The employee reward system contributes 20.7%; its t-value of 3.475 exceeds the standard threshold of 1.96, with a p-value of 0.002, so the reward system has a strong positive impact on the success of the hotel industry in Rwanda. The results also show that 15.7% of the success of the hospitality industry in Rwanda is due to the employees' work environment; with a t-value of 4.384 and a p-value of 0.000, this too shows a positive impact.
The hotel industry and its managers should give priority to retaining their employees, as this has already been proven to be an effective approach to offering good customer service. In addition, employee retention reduces the expenses associated with recruitment and turnover.
Keywords: success, hotel industry, training and development, employee reward system, employee work environment
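The significance test reported above compares the slope's t-value against the 1.96 threshold. A minimal sketch of that computation on hypothetical survey scores (the scores, sample size and variable names are assumptions, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical Likert-style scores: reward-system rating vs. perceived
# success, standing in for the SPSS regression the abstract reports.
reward = rng.normal(3.5, 0.8, size=60)
success = 0.6 * reward + rng.normal(0, 0.7, size=60)

res = stats.linregress(reward, success)
n = len(reward)
# t statistic for the slope, derived from the correlation coefficient;
# |t| > 1.96 corresponds to significance at roughly the 5% level.
t_value = res.rvalue * np.sqrt(n - 2) / np.sqrt(1 - res.rvalue**2)
significant = bool(abs(t_value) > 1.96)
print(significant)
```

The same |t| > 1.96 comparison underlies the reported conclusions for the reward system (t = 3.475) and work environment (t = 4.384).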
Procedia PDF Downloads 95
41879 Investigating Elements That Influence Higher Education Institutions’ Digital Maturity
Authors: Zarah M. Bello, Nathan Baddoo, Mariana Lilley, Paul Wernick
Abstract:
In this paper, we present findings from a multi-part study to evaluate candidate elements reflecting the level of digital capability maturity (DCM) in higher education and the relationship between these elements. We will use these findings to propose a model of DCM for educational institutions. We suggest that the success of learning in higher education is dependent in part on the level of maturity of digital capabilities of institutions as well as the abilities of learners and those who support the learning process. It is therefore important to have a good understanding of the elements that underpin this maturity as well as their impact and interactions in order to better exploit the benefits that technology presents to the modern learning environment and support its continued improvement. Having identified ten candidate elements of digital capability that we believe support the level of a University’s maturity in this area as well as a number of relevant stakeholder roles, we conducted two studies utilizing both quantitative and qualitative research methods. In the first of these studies, 85 electronic questionnaires were completed by various stakeholders in a UK university, with a 100% response rate. We also undertook five in-depth interviews with management stakeholders in the same university. We then utilized statistical analysis to process the survey data and conducted a textual analysis of the interview transcripts. Our findings support our initial identification of candidate elements and support our contention that these elements interact in a multidimensional manner. This multidimensional dynamic suggests that any proposal for improvement in digital capability must reflect the interdependency and cross-sectional relationship of the elements that contribute to DCM. Our results also indicate that the notion of DCM is strongly data-centric and that any proposed maturity model must reflect the role of data in driving maturity and improvement. 
We present these findings as a key step towards the design of an operationalisable DCM maturity model for universities.
Keywords: digital capability, elements, maturity, maturity framework, university
Procedia PDF Downloads 141
41878 Propagation of DEM Varying Accuracy into Terrain-Based Analysis
Authors: Wassim Katerji, Mercedes Farjas, Carmen Morillo
Abstract:
Terrain-based analysis produces derived products from an input DEM, and these products are needed to perform various analyses. To use these products efficiently in decision-making, their accuracies must be estimated systematically. This paper proposes a procedure to assess the accuracy of these derived products by calculating the accuracy of the slope dataset and its significance, taking the accuracy of the DEM as input. Building on previously published research on modeling the relative accuracy of a DEM, specifically ASTER and SRTM DEMs covering Lebanon as the study area, the analysis showed that ASTER has low significance over the majority of the area, with only 2% of the modeled terrain reaching 50% or more significance. SRTM showed better significance, with 37% of the modeled terrain at 50% or more. Statistical analysis showed that the accuracy of the slope dataset, calculated on a cell-by-cell basis, is highly correlated with the accuracy of the input DEM. This correlation is lower between slope accuracy and slope significance, and much higher between the modeled slope and slope significance.
Keywords: terrain-based analysis, slope, accuracy assessment, digital elevation model (DEM)
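The propagation of DEM error into the derived slope dataset can be sketched by Monte Carlo on a synthetic surface: perturb the DEM at several vertical-error levels and measure the resulting slope error. The surface, grid size and error levels below are assumptions for illustration, not the paper's ASTER/SRTM data.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic 50x50 DEM (metres) on a 30 m grid.
x = np.linspace(0.0, 1.0, 50)
dem = 200.0 * np.outer(np.sin(3 * x), np.cos(2 * x))
cell = 30.0

def slope_deg(z, cell):
    """Slope in degrees from a DEM via finite-difference gradients."""
    gy, gx = np.gradient(z, cell)
    return np.degrees(np.arctan(np.hypot(gx, gy)))

# Perturb the DEM with increasing vertical RMSE and record the RMSE of
# the induced slope error: slope accuracy should track DEM accuracy.
dem_rmse = [2.0, 5.0, 10.0, 20.0]
slope_rmse = []
for s in dem_rmse:
    diff = slope_deg(dem + rng.normal(0, s, dem.shape), cell) - slope_deg(dem, cell)
    slope_rmse.append(float(np.sqrt((diff ** 2).mean())))

corr = float(np.corrcoef(dem_rmse, slope_rmse)[0, 1])
print(corr > 0.9)
```

The near-linear growth of slope RMSE with DEM RMSE illustrates the high correlation between the two accuracies that the abstract reports.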
Procedia PDF Downloads 444
41877 Statistical Quality Control on Assignable Causes of Variation on Cement Production in Ashaka Cement PLC Gombe State
Authors: Hamisu Idi
Abstract:
The present study focuses on the impact of assignable causes of variation on the quality of cement production. Exploratory research was done on a monthly basis, with data obtained from a secondary source, i.e., the records kept by an automated recompilation machine. The machine keeps all records of mill downtime, which the process manager checks for validation, referring any fault to the department responsible for maintenance or measurement so as to prevent future occurrences. The findings indicated that the products of Ashaka Cement Plc can be considered of good quality, since all production processes were found to be in control (within preset specifications), with the exception of natural causes of variation, which are normal in the production process and do not affect the outcome of the product; these are reduced to the barest minimum, since they cannot be totally eliminated. It is hoped that the findings of this study will be of great assistance to the management of the Ashaka cement factory, and to the process manager in particular, at various levels in the monitoring and implementation of statistical process control. The study thereby contributes to knowledge in this area and will hopefully open up further research in this direction.
Keywords: cement, quality, variation, assignable cause, common cause
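The "in control" judgement above is the standard Shewhart control-chart check: subgroup means falling inside limits set from within-subgroup variation indicate only common causes; points outside signal assignable causes. A minimal X-bar chart sketch on hypothetical in-control measurements (the values, subgroup sizes and units are assumptions, not the plant's data):

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical cement fineness measurements: 25 daily subgroups of 4
# samples, simulated in control around a mean of 350.
data = rng.normal(350, 4, size=(25, 4))

xbar = data.mean(axis=1)
grand_mean = xbar.mean()
# Shewhart X-bar limits from the mean within-subgroup range;
# A2 = 0.729 is the standard control-chart constant for subgroups of 4.
r_bar = (data.max(axis=1) - data.min(axis=1)).mean()
ucl = grand_mean + 0.729 * r_bar
lcl = grand_mean - 0.729 * r_bar

# Subgroup means outside [LCL, UCL] would flag assignable causes.
out_of_control = int(((xbar > ucl) | (xbar < lcl)).sum())
print(out_of_control)
```

With a process subject only to common-cause variation, as simulated here, essentially all subgroup means fall within the limits.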
Procedia PDF Downloads 259
41876 Effect of Gender on Carcass Parameters in Japanese Quail
Authors: M. Bolacali
Abstract:
This study was conducted to determine the effect of sex on carcass characteristics in Japanese quails. A total of 320 one-day-old quail chicks were randomly allocated to two sex groups of 160 chicks each, according to a completely randomized design. Each sex group was then divided into five replicate groups of 32 chicks, and all replicate groups were housed in cages. The normality of distribution for all data was tested with the Shapiro-Wilk test at a 95% confidence interval, and P ≤ 0.05 was interpreted as significant. Normally distributed data were analyzed with the general linear model procedure of the SPSS software, with results expressed as mean ± standard deviation of five replications; Duncan's multiple range test was used for multiple comparisons in significant groups. For data whose distribution differed from normal, the Kruskal-Wallis H-test was applied as a nonparametric test, with results expressed as median, minimum and maximum values; pairwise comparisons of groups were made when the Kruskal-Wallis H-test was significant. The study period lasted 42 days. Hot carcass, cold carcass, heart and leg percentages were higher in male quails than in female quails (P < 0.05), whereas liver and breast percentages were higher in female quails, though not significantly (P > 0.05). The highest slaughter and carcass weights were determined in the female quails in cages. In conclusion, quail meat producers who would like to obtain higher carcass weight for greater economic profit may be recommended to raise female quails in cages.
Keywords: carcass yield, chick, gender, management
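The test-selection logic above (Shapiro-Wilk normality check, then a parametric test if it passes and Kruskal-Wallis otherwise) can be sketched as follows. The carcass percentages below are invented for illustration, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Hypothetical hot-carcass percentages for 32 male and 32 female quails.
male = rng.normal(72.0, 1.5, size=32)
female = rng.normal(70.5, 1.5, size=32)

# Normality check at the 95% confidence level, as in the study.
normal = all(stats.shapiro(g).pvalue > 0.05 for g in (male, female))

if normal:
    stat, p = stats.ttest_ind(male, female)   # parametric route
else:
    stat, p = stats.kruskal(male, female)     # nonparametric fallback

# P <= 0.05 would be interpreted as a significant sex effect.
print(p <= 0.05)
```

The same routing applies per trait, which is why the abstract reports some comparisons as significant (P < 0.05) and others not (P > 0.05).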
Procedia PDF Downloads 187
41875 FRATSAN: A New Software for Fractal Analysis of Signals
Authors: Hamidreza Namazi
Abstract:
Fractal analysis assesses the fractal characteristics of data. It comprises several methods for assigning fractal characteristics to a dataset, which may be theoretical or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion and networks. Fractal analysis is now widely used across the sciences. An important limitation is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; other essential characteristics have to be considered. For this purpose, a Visual C++ based software called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: fractal dimension, Jeffrey's measure and the Hurst exponent. After computing these measures, the software plots a graph for each one. Besides computing the three measures, the software can also classify whether or not a signal is fractal. The software uses a dynamic method of analysis for all measures: a sliding window whose length equals 10% of the total number of data entries is moved one data entry at a time to obtain each measure. This makes the computation very sensitive to slight changes in the data, giving the user an acute analysis. To test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes; for instance, by analyzing the Hurst exponent plot of an EEG signal in patients with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
Keywords: EEG signals, fractal analysis, fractal dimension, Hurst exponent, Jeffrey's measure
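The Hurst exponent that FRATSAN tracks can be estimated by classical rescaled-range (R/S) analysis: the slope of log(R/S) against log(window length). A minimal sketch on white noise, for which the exponent is known to be near 0.5 (the window sizes and test signal are assumptions, not the software's internals):

```python
import numpy as np

def hurst_rs(x, windows=(16, 32, 64, 128)):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis."""
    rs_vals = []
    for w in windows:
        rs = []
        for i in range(len(x) // w):
            seg = x[i * w:(i + 1) * w]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation
            r = dev.max() - dev.min()           # range of the deviation
            s = seg.std()                       # segment standard deviation
            if s > 0:
                rs.append(r / s)
        rs_vals.append(np.mean(rs))
    # Slope of log(R/S) vs. log(window length) estimates H.
    h, _ = np.polyfit(np.log(windows), np.log(rs_vals), 1)
    return float(h)

rng = np.random.default_rng(2)
white = rng.standard_normal(4096)   # uncorrelated signal, H near 0.5
h = hurst_rs(white)
print(round(h, 2))
```

Running such an estimator over a sliding window, as FRATSAN does, turns abrupt shifts in H into the visible plot changes the abstract mentions for seizure-onset prediction.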
Procedia PDF Downloads 466
41874 The Role of Creative Entrepreneurship in the Development of Croatian Economy
Authors: Marko Kolakovic
Abstract:
Creative industries are an important sector for the growth and development of knowledge economies. They have a positive impact on employment, economic growth, exports, and the quality of life in the areas where they are developed. The creative sectors include architecture, design, advertising, publishing, music, film, television and radio, video games, visual and performing arts, and heritage. Following the positive development trends of creative industries at the global and European levels, this paper analyzes creative industries in general and the specific characteristics of creative entrepreneurship. Special focus is put on the influence of information and communication technology on the development of new creative business models and the protection of intellectual property rights. One part of the paper analyzes the status of creative industries and creative entrepreneurship in Croatia. The main objective of the paper is to draw conclusions about the potential and development of creative industries in Croatia, using statistical analysis of Croatian creative industries and information gained from interviews with entrepreneurs. Creative industries in Croatia are at the beginning of their development, and a growth strategy still does not exist at the national level. The statistical analysis showed that in 2015 creative enterprises made up 9% of all enterprises in Croatia, employed 5.5% of the workforce, and contributed 4.01% of GDP. Croatian creative entrepreneurs build competitive advantage using their creative resources and create specific business models. The main obstacles they face are a lack of business experience and the impossibility of focusing on creative activities alone. In their business, they use digital technologies and are focused on export.
The conclusion is that creative industries in Croatia have development potential, but adequate measures must be taken to use this potential in the right way. Keywords: creative entrepreneurship, knowledge economy, business models, intellectual property
Procedia PDF Downloads 206
41873 Stability-Indicating High-Performance Thin-Layer Chromatography Method for Estimation of Naftopidil
Authors: P. S. Jain, K. D. Bobade, S. J. Surana
Abstract:
A simple, selective, precise, and stability-indicating high-performance thin-layer chromatographic method for the analysis of Naftopidil, both in bulk and in pharmaceutical formulation, has been developed and validated. The method employed HPTLC aluminium plates precoated with silica gel as the stationary phase. The solvent system consisted of hexane:ethyl acetate:glacial acetic acid (4:4:2 v/v). The system was found to give a compact spot for Naftopidil (Rf value of 0.43±0.02). Densitometric analysis of Naftopidil was carried out in absorbance mode at 253 nm. The linear regression analysis data for the calibration plots showed a good linear relationship, with r²=0.999±0.0001 with respect to peak area, in the concentration range of 200-1200 ng per spot. The method was validated for precision, recovery, and robustness. The limits of detection and quantification were 20.35 and 61.68 ng per spot, respectively. Naftopidil was subjected to acid and alkali hydrolysis, oxidation, and thermal degradation, and was found to be susceptible to all of these conditions. The degraded product was well resolved from the pure drug, with a significantly different Rf value. Statistical analysis proves that the method is repeatable, selective, and accurate for the estimation of the investigated drug. The proposed HPTLC method can be applied for the identification and quantitative determination of Naftopidil in bulk drug and pharmaceutical formulation. Keywords: naftopidil, HPTLC, validation, stability, degradation
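The calibration-plot regression and the detection/quantification limits reported above follow the standard ICH relations LOD = 3.3·σ/S and LOQ = 10·σ/S, where S is the calibration slope and σ the residual standard deviation. A minimal Python sketch of that calculation — the peak-area values below are hypothetical, not the paper's data:

```python
import statistics

def calibration(conc, area):
    """Least-squares calibration line: area = slope * conc + intercept,
    plus the residual standard deviation of the fit."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(area) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, area))
             / sum((x - mx) ** 2 for x in conc))
    intercept = my - slope * mx
    residual_sd = statistics.pstdev(
        [y - (slope * x + intercept) for x, y in zip(conc, area)])
    return slope, intercept, residual_sd

# Hypothetical peak areas over the 200-1200 ng/spot range
conc = [200, 400, 600, 800, 1000, 1200]
area = [1050, 2020, 3080, 3990, 5010, 6000]
slope, intercept, sd = calibration(conc, area)
lod = 3.3 * sd / slope   # ICH limit of detection
loq = 10.0 * sd / slope  # ICH limit of quantification
```

By construction LOQ/LOD = 10/3.3 ≈ 3.03, which matches the ratio of the reported 61.68 and 20.35 ng-per-spot limits.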
Procedia PDF Downloads 396
41872 A Sequential Approach for Random-Effects Meta-Analysis
Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya
Abstract:
The objective of meta-analysis is to combine results from several independent studies in order to generalize and provide an evidence base for decision making. However, recent studies show that the magnitude of effect size estimates reported in many areas of research changes with year of publication, and this can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring effect size estimates in meta-analysis, but they are based on statistical theory applicable to the fixed-effect model (FEM). For the random-effects model (REM), the analysis incorporates the heterogeneity variance, tau-squared, whose estimation creates complications. This paper proposes the use of the Gombay and Serbian (2005) truncated CUSUM-type test, with asymptotically valid critical values, for sequential monitoring of the REM. Simulation results show that the test does not control the Type I error well and is not recommended. Further work is required to derive an appropriate test in this important area of application. Keywords: meta-analysis, random-effects model, sequential test, temporal changes in effect sizes
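For context, the heterogeneity variance tau-squared mentioned above is commonly estimated with the DerSimonian-Laird moment estimator, which is built from Cochran's Q statistic. A minimal Python sketch (the study effects and within-study variances are illustrative, not from the paper):

```python
def dersimonian_laird(effects, variances):
    """DerSimonian-Laird moment estimator of the heterogeneity
    variance tau^2 in a random-effects meta-analysis."""
    w = [1.0 / v for v in variances]                  # fixed-effect weights
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    k = len(effects)
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                # truncated at zero
    return ybar, q, tau2

# Hypothetical study effect sizes and within-study variances
effects = [0.1, 0.6, -0.2, 0.8, 0.3]
variances = [0.02, 0.02, 0.02, 0.02, 0.02]
ybar, q, tau2 = dersimonian_laird(effects, variances)
```

The truncation at zero is one of the complications the abstract alludes to: the moment estimate can go negative, and the resulting boundary behaviour carries over into any sequential statistic built on tau-squared.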
Procedia PDF Downloads 467
41871 Implementation of Correlation-Based Data Analysis as a Preliminary Stage for the Prediction of Geometric Dimensions Using Machine Learning in the Forming of Car Seat Rails
Authors: Housein Deli, Loui Al-Shrouf, Hammoud Al Joumaa, Mohieddine Jelali
Abstract:
When forming metallic materials, fluctuations in material properties, process conditions, and wear lead to deviations in component geometry. Several hundred features sometimes need to be measured, especially in the case of functional and safety-relevant components, and these can only be measured offline due to the large number of features and the accuracy requirements. The risk of producing components outside the tolerances is minimized, but not eliminated, by the statistical evaluation of process capability and by control measurements. The inspection intervals are based on the acceptable risk and come at the expense of productivity, yet they remain reactive and, in some cases, considerably delayed. Due to the considerable progress made in the fields of condition monitoring and measurement technology, permanently installed sensor systems, combined in particular with machine learning and artificial intelligence, offer the potential to independently derive forecasts of component geometry and thus eliminate the risk of defective products, actively and preventively. The reliability of forecasts depends on the quality, completeness, and timeliness of the data. Measuring all geometric characteristics is neither sensible nor technically possible. This paper therefore uses the example of car seat rail production to discuss the necessary first step of feature selection and reduction by correlation analysis, as otherwise it would not be possible to forecast components in real time and inline. Four different car seat rails with an average of 130 features were selected and measured using a coordinate measuring machine (CMM). Running such a measuring program alone takes up to 20 minutes. In practice, this results in the risk of faulty production of at least 2000 components that have to be sorted or scrapped if the measurement results are negative.
Over a period of two months, all measurement data (> 200 measurements per variant) was collected and evaluated using correlation analysis. As part of this study, the number of characteristics to be measured for all six car seat rail variants was reduced by over 80%. Specifically, direct correlations were proven for almost 100 of an average of 125 characteristics across the four different products. A further 10 features correlate via indirect relationships, so that the number of features required for a prediction could be reduced to fewer than 20. A correlation coefficient of >0.8 was required for all correlations. Keywords: long-term SHM, condition monitoring, machine learning, correlation analysis, component prediction, wear prediction, regression analysis
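The reduction step described above — keep one representative of each group of features whose pairwise correlation exceeds 0.8 and drop the rest — can be sketched as follows. This is a greedy Python illustration under assumed toy data, not the authors' pipeline; feature names and values are hypothetical.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def reduce_features(data, threshold=0.8):
    """Greedily keep one representative per group of features whose
    pairwise |r| exceeds the threshold; drop the rest."""
    kept = []
    for name in data:
        if all(abs(pearson(data[name], data[k])) <= threshold for k in kept):
            kept.append(name)
    return kept

# Toy data: feature_b is a scaled copy of feature_a, feature_c is independent
random.seed(0)
a = [random.gauss(0, 1) for _ in range(100)]
data = {
    "feature_a": a,
    "feature_b": [2.0 * v + 0.5 for v in a],   # |r| = 1 with feature_a
    "feature_c": [random.gauss(0, 1) for _ in range(100)],
}
kept = reduce_features(data, threshold=0.8)
```

Dropped features are then recovered at prediction time from their kept correlates, which is what allows the inline measurement program to shrink without losing the ability to forecast every characteristic.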
Procedia PDF Downloads 46
41870 Finding Data Envelopment Analysis Targets Using Multi-Objective Programming in DEA-R with Stochastic Data
Authors: R. Shamsi, F. Sharifi
Abstract:
In this paper, we obtain the projection of inefficient units in data envelopment analysis (DEA) in the case of stochastic inputs and outputs, using a multi-objective programming (MOP) structure. In some problems, the inputs might be stochastic while the outputs are deterministic, and vice versa. In such cases, we propose a multi-objective DEA-R model, because in some cases (e.g., when unnecessary and irrational weights in the BCC model reduce the efficiency score) a decision-making unit (DMU) that is efficient under the DEA-R model is classified as inefficient by the BCC model. In other cases, only ratios of the stochastic data may be available (e.g., the ratio of stochastic inputs to stochastic outputs). We therefore provide a multi-objective DEA model without explicit outputs and prove that the input-oriented MOP DEA-R model under constant returns to scale can be replaced by the MOP-DEA model without explicit outputs under variable returns to scale, and vice versa. Using interactive methods to solve the proposed model yields a projection corresponding to the viewpoint of the decision maker and the analyst, which is nearer to reality and more practical. Finally, an application is provided. Keywords: DEA-R, multi-objective programming, stochastic data, data envelopment analysis
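The full MOP DEA-R model above needs a linear-programming solver, but the ratio idea it builds on can be illustrated in the degenerate single-input, single-output constant-returns case, where DEA efficiency reduces to comparing each unit's productivity ratio with the best observed ratio. A hedged Python sketch with hypothetical DMU data:

```python
def crs_efficiency(inputs, outputs):
    """Single-input/single-output CCR (constant-returns) efficiency:
    each DMU's output/input ratio relative to the best ratio observed.
    In this special case, DEA works directly on ratios -- the kind of
    ratio data that motivates DEA-R."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical DMUs: one input (e.g., staff) and one output (e.g., cases handled)
inputs = [4.0, 2.0, 5.0, 3.0]
outputs = [8.0, 6.0, 5.0, 9.0]
eff = crs_efficiency(inputs, outputs)  # the DMU with the best ratio scores 1.0
```

An inefficient DMU's projection in this toy setting is simply the input level at which its ratio would match the best one; the paper's contribution is handling the multi-dimensional, stochastic version of this via MOP.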
Procedia PDF Downloads 104
41869 Statistical Channel Modeling for Multiple-Input-Multiple-Output Communication System
Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany
Abstract:
The performance of wireless communication systems is affected mainly by the environment of the associated channel, which is characterized by dynamic and unpredictable behavior. In this paper, different statistical earth-satellite channel models are studied, with emphasis on two main models: first, the Rice-lognormal model, chosen because it represents an environment including the shadowing and multipath components that affect the propagated signal along its path; and second, a three-state model that takes into account different fading conditions (clear area, moderate shadowing, and heavy shadowing). The provided models are based on the AWGN, Rician, Rayleigh, and lognormal distributions, whose probability density functions (PDFs) are presented. The transmission system bit error rate (BER), peak-to-average power ratio (PAPR), and channel capacity are measured and analyzed against the fading models. The simulations are implemented using MATLAB, and the results show the performance of the transmission system over the different channel models. Keywords: fading channels, MIMO communication, RNS scheme, statistical modeling
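The paper's simulations use MATLAB; as a language-neutral illustration of the two envelope distributions it relies on, here is a Python sketch that draws Rayleigh samples (no line-of-sight path) and Rician samples (a line-of-sight path of K-factor times the scattered power). The σ and K values are illustrative assumptions.

```python
import math
import random

random.seed(42)

def rayleigh_sample(sigma=1.0):
    """Rayleigh envelope: magnitude of a zero-mean complex Gaussian
    (scattered multipath only, no line-of-sight component)."""
    i, q = random.gauss(0, sigma), random.gauss(0, sigma)
    return math.hypot(i, q)

def rician_sample(k_factor=4.0, sigma=1.0):
    """Rician envelope: complex Gaussian plus a line-of-sight component
    whose power is K times the scattered power (K = A^2 / (2*sigma^2))."""
    los = math.sqrt(2 * k_factor) * sigma   # LOS amplitude A
    i = random.gauss(los, sigma)            # put the LOS power on the I branch
    q = random.gauss(0, sigma)
    return math.hypot(i, q)

rayleigh = [rayleigh_sample() for _ in range(20000)]
rician = [rician_sample(k_factor=4.0) for _ in range(20000)]
mean_rayleigh = sum(rayleigh) / len(rayleigh)  # theory: sigma*sqrt(pi/2) ≈ 1.2533
```

Multiplying the Rician envelope by a lognormal shadowing term gives the Rice-lognormal samples the abstract refers to; the three-state model switches among parameter sets like these according to the clear/moderate/heavy shadowing state.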
Procedia PDF Downloads 146
41868 Case Study: Throughput Analysis over PLC Infrastructure as Last Mile Residential Solution in Colombia
Authors: Edward P. Guillen, A. Karina Martinez Barliza
Abstract:
Power line communications (PLC), as a last-mile solution for providing communication services, has the advantage of transmitting over channels already used for electrical distribution. However, these channels were not designed for this purpose, and for that reason, telecommunication companies in Colombia want to know how PLC would compare to cable modem or DSL in cost and network performance. This paper analyzes PLC throughput for residential complex scenarios using PLC network scenarios, and some statistical results are shown. Keywords: home network, power line communication, throughput analysis, power factor, cost, last mile solution
Procedia PDF Downloads 265
41867 Application of Scanning Electron Microscopy and X-Ray Evaluation of the Main Digestion Methods for Determination of Macroelements in Plant Tissue
Authors: Krasimir I. Ivanov, Penka S. Zapryanova, Stefan V. Krustev, Violina R. Angelova
Abstract:
Three commonly used digestion methods (dry ashing, acid digestion, and microwave digestion), in different variants, were compared for the digestion of tobacco leaves. Three main macroelements (K, Ca, and Mg) were analyzed using an AAS spectrometer (Spectra AA 220, Varian, Australia). The accuracy and precision of the measurements were evaluated using the Polish reference material CTR-VTL-2 (Virginia tobacco leaves). To elucidate the problems with elemental recovery, X-ray and SEM-EDS analyses of all residues after digestion were performed. The X-ray investigation showed the formation of KClO4 when HClO4 was used as part of the acid mixture. The use of HF in Ca and Mg determination led to the formation of CaF2 and MgF2. The results were confirmed by energy-dispersive X-ray microanalysis. The SPSS program for Windows was used for statistical data processing. Keywords: digestion methods, plant tissue, determination of macroelements, K, Ca, Mg
Procedia PDF Downloads 316
41866 Antibiotic Prescribing Pattern and Associated Risk Factors Promoting Antibiotic Resistance, a Cross Sectional Study in a Regional Hospital in Ghana
Authors: Nicholas Agyepong, Paul Gyan
Abstract:
Inappropriate prescribing of antibiotics is a common healthcare concern globally, resulting in an increased risk of adverse reactions and the emergence of antimicrobial resistance. Wrong antibiotic prescribing habits may lead to ineffective and unsafe treatment and worsening of the disease condition, and thus increase healthcare costs. This study examined the antibiotic prescribing pattern and associated risk factors at a regional hospital in the Bono region of Ghana. A retrospective cross-sectional study was conducted to describe the prescribing practices at the hospital from January 2014 to December 2021. A systematic random sampling method was used to select the participants for the study. STATA version 16 software was used for data management and analysis. Descriptive statistics and logistic regression analysis were used to analyze the data, with statistical significance set at p<0.05. Antibiotic consumption was equivalent to 11 DDDs per 1000 inhabitants per day. The most commonly prescribed antibiotic was amoxicillin/clavulanic acid (14.39%), followed by erythromycin (11.44%) and ciprofloxacin (11.36%). Antibiotic prescriptions increased steadily over the eight years studied (2014: n=59,280 to 2021: n=190,320). Prescribers above the age of 35 were more likely to prescribe antibiotics than those between the ages of 20 and 25 (COR=21.00; 95% CI: 1.78-48.10; p=0.016). Prescribers with at least 6 years of experience were also significantly more likely to prescribe antibiotics than those with at most 5 years of experience (COR=14.17; 95% CI: 2.39-84.07; p=0.004). The establishment of an antibiotic stewardship program in hospitals is therefore imperative, and further studies need to be conducted in other facilities to establish a national antibiotic prescription guideline. Keywords: antibiotic, antimicrobial resistance, prescription, prescribers
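The consumption figure quoted above is the WHO's standard metric of DDDs per 1000 inhabitants per day (DID). A minimal Python sketch of the calculation — the catchment population and total DDD figures below are hypothetical, chosen only so the result lands on the study's value of 11:

```python
def ddd_per_1000_per_day(total_ddds, population, days):
    """WHO antibiotic consumption metric: defined daily doses (DDDs)
    per 1000 inhabitants per day (DID)."""
    return total_ddds * 1000.0 / (population * days)

# Hypothetical figures: a catchment population of 100,000 consuming
# 401,500 DDDs over one year
did = ddd_per_1000_per_day(total_ddds=401_500, population=100_000, days=365)
```

A DID of 11 means that, on an average day, 11 of every 1000 people in the catchment area were receiving one defined daily dose of an antibiotic.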
Procedia PDF Downloads 45
41865 Investigation of Maritime Accidents with Exploratory Data Analysis in the Strait of Çanakkale (Dardanelles)
Authors: Gizem Kodak
Abstract:
The Strait of Çanakkale, together with the Strait of Istanbul and the Sea of Marmara, forms the Turkish Straits System. In other words, the Strait of Çanakkale is the southern gate of the system that connects the Black Sea countries with the rest of the world. Due to the heavy maritime traffic, it is important to examine the accident characteristics in the region scientifically. In particular, the results indicated by descriptive statistics are of critical importance for strengthening the safety of navigation. At this point, exploratory data analysis offers strategic outputs in terms of defining the problem and knowing the strengths and weaknesses with respect to possible accident risk. The study aims to determine the accident characteristics in the Strait of Çanakkale through temporal and spatial analysis of historical data, using exploratory data analysis (EDA) as the research method. The study's results will reveal the general characteristics of maritime accidents in the region and form the infrastructure for future studies. Keywords: maritime accidents, EDA, Strait of Çanakkale, navigational safety
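The temporal and spatial EDA the abstract describes amounts to grouping accident records by time and location and inspecting the resulting counts. A minimal Python sketch on invented records — the dates, locations, and accident types below are illustrative placeholders, not the study's historical dataset:

```python
from collections import Counter

# Hypothetical accident records: (date, location, type)
accidents = [
    ("2021-01-14", "Nara Burnu", "collision"),
    ("2021-01-29", "Nara Burnu", "grounding"),
    ("2021-03-02", "Kilitbahir", "collision"),
    ("2021-11-18", "Nara Burnu", "grounding"),
    ("2021-12-05", "Kepez", "machinery failure"),
    ("2021-12-21", "Nara Burnu", "collision"),
]

# Temporal view: accidents per month (descriptive statistics over time)
by_month = Counter(date[5:7] for date, _, _ in accidents)

# Spatial view: accidents per location, to flag high-risk segments
by_location = Counter(loc for _, loc, _ in accidents)
hotspot, count = by_location.most_common(1)[0]
```

Tabulations like these are the "strategic outputs" of EDA: they point to when and where accident risk concentrates before any formal model is fitted.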
Procedia PDF Downloads 95
41864 Using ANN in Emergency Reconstruction Projects Post Disaster
Authors: Rasha Waheeb, Bjorn Andersen, Rafa Shakir
Abstract:
Purpose: The purpose of this study is to avoid the delays that occur in emergency reconstruction projects, especially in post-disaster circumstances, whether natural or man-made, given their particular national and humanitarian importance. We present theoretical and practical concepts for project management in the construction industry that deal with a range of global and local trends. The study aims to identify the factors causing delays in construction projects in Iraq that affect time, cost, and quality, and to find the best solutions for addressing these delays by setting parameters to restore balance. Thirty projects in different areas of construction were selected as a sample for this study. Design/methodology/approach: This study discusses reconstruction strategies and the delays in time and cost caused by different delay factors in selected projects in Iraq (Baghdad as a case study). A case study approach was adopted, with thirty construction projects of different types and sizes selected from the Baghdad region. Project participants from the case projects provided data through a data collection instrument distributed via a survey. A mixed-methods approach was applied. Mathematical data analysis was used to construct models that predict delays in project time and cost before projects start, with artificial neural network (ANN) analysis selected as the mathematical approach. These models are mainly intended to help decision makers in construction projects find solutions to delays before they cause any inefficiency in the project being implemented, and to remove obstacles thoroughly in order to develop this industry in Iraq. This approach was practiced using the data collected through the survey and questionnaire.
Findings: The most important delay factors identified as leading to schedule overruns were contractor failure, redesign of plans and change orders, security issues, selection of low-price bids, weather factors, and owner failures. Some of these are in line with findings from similar studies in other countries and regions, but some are unique to the Iraqi project sample, such as security issues and low-price bid selection. Originality/value: ANN analysis was selected because ANNs have rarely been used in project management and have never been used in Iraq to find solutions for problems in the construction industry. The methodology can also be applied to complicated problems that have no ready interpretation or solution. In some cases where statistical analysis was conducted, the problem did not follow a linear equation or showed only a weak correlation; we therefore suggest using ANNs, since they handle nonlinear problems and can find the relationship between input and output data, which proved very supportive. Keywords: construction projects, delay factors, emergency reconstruction, innovation ANN, post disasters, project management
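A minimal pure-Python sketch of the kind of feedforward network such an approach uses: one hidden layer trained by gradient descent to map delay-factor scores to a normalized expected delay. The architecture, learning rate, and the four training rows are illustrative assumptions, not the authors' 30-project dataset or their actual network.

```python
import math
import random

random.seed(7)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyANN:
    """One-hidden-layer feedforward network trained by plain gradient descent."""
    def __init__(self, n_in, n_hidden):
        rnd = lambda: random.uniform(-0.5, 0.5)
        self.w1 = [[rnd() for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [rnd() for _ in range(n_hidden)]
        self.w2 = [rnd() for _ in range(n_hidden)]
        self.b2 = rnd()

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        return sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)

    def train_step(self, x, target, lr=0.5):
        y = self.forward(x)
        d_out = (y - target) * y * (1 - y)   # output delta, squared-error loss
        for j, h in enumerate(self.h):
            d_hid = d_out * self.w2[j] * h * (1 - h)   # backpropagated delta
            self.w2[j] -= lr * d_out * h
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * d_hid * xi
            self.b1[j] -= lr * d_hid
        self.b2 -= lr * d_out
        return (y - target) ** 2

# Hypothetical inputs: [contractor risk, design-change rate, security risk],
# target: normalized expected delay
data = [([0.9, 0.8, 0.7], 0.9), ([0.2, 0.1, 0.1], 0.1),
        ([0.7, 0.3, 0.8], 0.8), ([0.1, 0.2, 0.3], 0.2)]
net = TinyANN(n_in=3, n_hidden=4)
first = sum(net.train_step(x, t) for x, t in data)
for _ in range(2000):
    last = sum(net.train_step(x, t) for x, t in data)
```

The sigmoid units are what give the model the nonlinear input-output mapping the abstract argues for in cases where linear regression fits poorly or correlations are weak.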
Procedia PDF Downloads 165