Search results for: statistical methods
4613 Analysis of Fertilizer Effect in the Tilapia Growth of Mozambique (Oreochromis mossambicus)
Authors: Sérgio Afonso Mulema, Andrés Carrión García, Vicente Ernesto
Abstract:
This paper analyses the effect of fertilizer (organic and inorganic) on the growth of tilapia. An experiment was implemented at the Aquapesca Company of Mozambique, in which four different treatments were considered. Each type of fertilizer was applied in two of these treatments; feed was supplied to the third treatment, and the fourth was taken as a control. The weight and length of the tilapia were used as the growth parameters, and physical-chemical parameters were registered to measure the water quality. The results show that weight and length differed for tilapias cultivated under different treatments. These differences were most evident in the organic and feed treatments, which showed the largest and smallest values of these parameters, respectively. In order to prove that these differences were caused only by the applied treatment, without interference from the aquatic environment, a Fisher discriminant analysis was applied, which confirmed that the treatments were exposed to the same environmental conditions.
Keywords: Fertilizer, tilapia, growth, statistical methods.
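A two-class Fisher discriminant on synthetic water-quality readings illustrates the check described above: heavily overlapping classes (near-chance accuracy) indicate no systematic environmental difference between treatments. The variables and values below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical water-quality readings (pH, temperature) for two treatments;
# the heavy class overlap mimics treatments sharing the same environment
a = rng.normal([7.0, 26.0], 0.3, size=(50, 2))
b = rng.normal([7.1, 26.1], 0.3, size=(50, 2))

# Fisher direction w = Sw^-1 (mu_a - mu_b), threshold at the projected midpoint
mu_a, mu_b = a.mean(axis=0), b.mean(axis=0)
Sw = np.cov(a, rowvar=False) + np.cov(b, rowvar=False)
w = np.linalg.solve(Sw, mu_a - mu_b)
threshold = w @ (mu_a + mu_b) / 2

# Near-chance accuracy means the two groups are barely separable,
# i.e. no systematic environmental difference between the treatments
accuracy = ((a @ w > threshold).mean() + (b @ w < threshold).mean()) / 2
```

With well-separated classes the same statistic would approach 1, which is why low separability supports the "same environment" conclusion.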
4612 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution.
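The U-shaped density below is the classical isotropic (Clarke/Jakes) special case of the Doppler-shift PDF discussed above; the paper's non-isotropic and line-of-sight (Rician) extensions modify this shape. A minimal numerical check that the density integrates to one, with an assumed maximum Doppler shift:

```python
import numpy as np

f_d = 100.0  # assumed maximum Doppler shift in Hz (illustrative)
f = np.linspace(-0.9999 * f_d, 0.9999 * f_d, 200001)
# Isotropic-scattering Doppler-shift PDF: 1 / (pi f_d sqrt(1 - (f/f_d)^2))
pdf = 1.0 / (np.pi * f_d * np.sqrt(1.0 - (f / f_d) ** 2))

# Riemann-sum check that the density integrates to ~1; the small deficit
# comes from clipping the integrable singularities at +/- f_d
area = np.sum(pdf) * (f[1] - f[0])
```

The density is largest near ±f_d and smallest at zero shift, matching the familiar "bathtub" Jakes Doppler spectrum.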
4611 Affine Combination of Splitting Type Integrators, Implemented with Parallel Computing Methods
Authors: Adrian Alvarez, Diego Rial
Abstract:
In this work we present a family of new convergent splitting-type methods of high order that avoid negative sub-steps, a feature that allows their application to irreversible problems. The methods consist of affine combinations of results obtained with Lie-Trotter integrators of different step sizes. Some examples are presented, in particular a pair of semilinear differential equations, where the methods are compared with symplectic integrators. The number of basic integrations required is comparable to that of symplectic integrators, but this technique makes it possible to perform the computations in parallel, thus reducing the run times. We exemplify this by exhibiting some implementations, whose schemes are simple thanks to the modularity and scalability of the process.
Keywords: Lie-Trotter integrators, irreversible problems, splitting methods without negative steps, MPI, HPC.
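A minimal instance of the idea can be sketched as follows: a first-order Lie-Trotter composition whose accuracy is raised by an affine (Richardson-type) combination of results at two step sizes. The matrices are toy examples chosen so both sub-flows are explicit and involve no negative sub-steps; this is not the paper's parallel implementation:

```python
import numpy as np

# Split the generator A + B into a nilpotent part A and a diagonal part B,
# both of whose flows are explicit
A = np.array([[0.0, 1.0], [0.0, 0.0]])  # exp(hA) = I + hA since A^2 = 0
B = np.diag([-1.0, -2.0])               # exp(hB) = diag(e^-h, e^-2h)

def expA(h):
    return np.eye(2) + h * A

def expB(h):
    return np.diag(np.exp(np.diag(B) * h))

def lie_trotter(y0, T, n):
    # First-order Lie-Trotter composition exp(hB) exp(hA), n steps of size h
    h = T / n
    step = expB(h) @ expA(h)
    y = y0.copy()
    for _ in range(n):
        y = step @ y
    return y

# Reference solution via eigendecomposition of A + B
T, y0 = 1.0, np.array([1.0, 1.0])
vals, V = np.linalg.eig(A + B)
y_exact = (V @ np.diag(np.exp(vals * T)) @ np.linalg.inv(V)) @ y0

err_coarse = np.linalg.norm(lie_trotter(y0, T, 50) - y_exact)
err_fine = np.linalg.norm(lie_trotter(y0, T, 100) - y_exact)
ratio = err_coarse / err_fine  # ~2: halving h halves the error (order 1)

# Affine combination of the two step sizes cancels the leading error term;
# the two runs it combines are independent, hence trivially parallelizable
y_combo = 2.0 * lie_trotter(y0, T, 100) - lie_trotter(y0, T, 50)
err_combo = np.linalg.norm(y_combo - y_exact)
```

The combination `2*y(h/2) - y(h)` eliminates the O(h) error term without ever using a negative step inside either composition, which is the property that keeps the scheme usable for irreversible problems.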
4610 Autonomous Robots' Visual Perception in Underground Terrains Using Statistical Region Merging
Authors: Omowunmi E. Isafiade, Isaac O. Osunmakinde, Antoine B. Bagula
Abstract:
Robots' visual perception is a field that is gaining increasing attention from researchers. This is partly due to emerging trends in the commercial availability of 3D scanning systems or devices that produce a high level of information accuracy for a variety of applications. In the history of mining, the mortality rate of mine workers has been alarming, and robots exhibit a great deal of potential to tackle safety issues in mines. However, an effective vision system is crucial to safe autonomous navigation in underground terrains. This work investigates robots' perception in underground terrains (mines and tunnels) using the statistical region merging (SRM) model. SRM reconstructs the main structural components of an image by a simple but effective statistical analysis. An investigation is conducted on different regions of the mine, such as the shaft, stope and gallery, using publicly available mine frames together with a stream of locally captured mine images. An investigation is also conducted on a stream of underground tunnel image frames, using the XBOX Kinect 3D sensors. The Kinect sensors produce streams of red, green and blue (RGB) and depth images of 640 x 480 resolution at 30 frames per second. Integrating the depth information with drivability gives a strong cue to the analysis, which produces 3D results augmenting the drivable and non-drivable regions detected in 2D. The results of the 2D and 3D experiments on different terrains, mines and tunnels, together with the qualitative and quantitative evaluation, reveal that a good drivable region can be detected in dynamic underground terrains.
Keywords: Drivable Region Detection, Kinect Sensor, Robots' Perception, SRM, Underground Terrains.
4609 Approximate Solutions to Large Stein Matrix Equations
Authors: Khalide Jbilou
Abstract:
In the present paper, we propose numerical methods for solving the Stein equation AXC - X - D = 0 where the matrix A is large and sparse. Such problems appear in discrete-time control problems, filtering and image restoration. We consider the case where the matrix D is of full rank and the case where D is factored as a product of two matrices. The proposed methods are Krylov subspace methods based on the block Arnoldi algorithm. We give theoretical results and we report some numerical experiments.
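For small dense instances, the Stein equation AXC - X = D can be solved directly by vectorisation, a useful baseline against which Krylov-based solvers such as the block Arnoldi methods proposed here can be checked. A sketch on random 6x6 matrices, scaled so the Stein operator is nonsingular (this is a dense baseline, not the paper's large-sparse method):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = 0.3 * rng.normal(size=(n, n))  # scaled so lambda_i(A)*lambda_j(C) != 1
C = 0.3 * rng.normal(size=(n, n))
D = rng.normal(size=(n, n))

# vec(A X C - X) = (C^T kron A - I) vec(X), with column-major (Fortran) vec
K = np.kron(C.T, A) - np.eye(n * n)
x = np.linalg.solve(K, D.flatten(order="F"))
X = x.reshape(n, n, order="F")

residual = np.linalg.norm(A @ X @ C - X - D)  # ~ machine precision
```

The Kronecker system has n^2 unknowns, which is exactly why this direct route is infeasible for the large sparse A targeted by the Krylov methods.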
4608 A Cost Optimization Model for the Construction of Bored Piles
Authors: Kenneth M. Oba
Abstract:
Adequate management, control, and optimization of cost are essential elements of a successful construction project. A multiple linear regression optimization model was formulated to address the problem of costs associated with pile construction operations. A total of 32 PVC-reinforced concrete piles, with a diameter of 300 mm and a length of 5.4 m, were studied during construction. The soil upon which the piles were installed was mostly silty sand and was completely submerged in water at Bonny, Nigeria. The piles are friction piles installed by the boring method, using a piling auger. The volumes of soil removed, the weight of the reinforcement cage installed, and the volumes of fresh concrete poured into the PVC void were determined. The cost of constructing each pile was then determined from the calculated quantities. A model was derived and subjected to statistical tests using the Statistical Package for the Social Sciences (SPSS) software. The model turned out to be adequate, fit, and to have a high predictive accuracy, with an R2 value of 0.833.
Keywords: Cost optimization modelling, multiple linear models, pile construction, regression models.
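A multiple linear regression cost model of the kind described can be sketched as follows; the pile quantities, coefficients and noise level below are synthetic stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 32  # number of piles, as in the study (quantities below are synthetic)
soil = rng.uniform(0.3, 0.5, n)      # m^3 of soil removed per pile
steel = rng.uniform(40, 80, n)       # kg of reinforcement cage installed
concrete = rng.uniform(0.3, 0.5, n)  # m^3 of concrete poured
# Assumed unit costs plus noise, purely for illustration
cost = 120 * soil + 1.5 * steel + 300 * concrete + rng.normal(0, 5, n)

# Least-squares fit of cost = b0 + b1*soil + b2*steel + b3*concrete
X = np.column_stack([np.ones(n), soil, steel, concrete])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)

# Coefficient of determination, analogous to the study's reported R^2
pred = X @ beta
r2 = 1 - np.sum((cost - pred) ** 2) / np.sum((cost - cost.mean()) ** 2)
```

With real quantity take-offs in place of the synthetic columns, `beta` gives per-unit cost estimates and `r2` the goodness of fit.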
4607 A New Performance Characterization of Transient Analysis Method
Authors: José Peralta, Gabriela Peretti, Eduardo Romero, Carlos Marqués
Abstract:
This paper proposes a new performance characterization for the test strategy for second-order filters known as the Transient Analysis Method (TRAM). We evaluate the ability of the addressed test strategy to detect deviation faults under simultaneous statistical fluctuation of the non-faulty parameters. For this purpose, we use Monte Carlo simulations and a fault model that considers only one component of the filter under test as faulty, while the other components adopt random values (within their tolerance bands) obtained from their statistical distributions. The new data reported here show (for the filters under study) the presence of hard-to-test components and relatively low fault coverage values for small deviation faults. These results suggest that the fault coverage value obtained using only nominal values for the non-faulty components (the traditional evaluation of TRAM) seems to be a poor predictor of the test performance.
Keywords: Testing, fault analysis, analog filter test, parametric fault detection.
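The evaluation idea, one faulty component with the remaining components drawn randomly inside their tolerance bands, can be sketched with a generic second-order filter pole frequency. The component values, tolerances and acceptance window below are illustrative assumptions, not TRAM's actual test observables:

```python
import numpy as np

rng = np.random.default_rng(3)
R1 = R2 = 10e3   # ohms, nominal (hypothetical second-order filter)
C1 = C2 = 10e-9  # farads, nominal
tol = 0.05       # 5% tolerance band on the non-faulty components

def f0(r1, r2, c1, c2):
    # Pole frequency of a generic second-order RC filter
    return 1.0 / (2 * np.pi * np.sqrt(r1 * r2 * c1 * c2))

# Illustrative acceptance window on the observed pole frequency
lo, hi = 0.92 * f0(R1, R2, C1, C2), 1.08 * f0(R1, R2, C1, C2)

def coverage(deviation, n=20000):
    # Faulty component: C1 shifted by `deviation`; the non-faulty
    # components are drawn randomly inside their tolerance bands
    r1 = R1 * rng.uniform(1 - tol, 1 + tol, n)
    r2 = R2 * rng.uniform(1 - tol, 1 + tol, n)
    c2 = C2 * rng.uniform(1 - tol, 1 + tol, n)
    c1 = C1 * (1 + deviation)
    f = f0(r1, r2, c1, c2)
    return np.mean((f < lo) | (f > hi))

small_fault = coverage(-0.15)  # partially masked by tolerance fluctuations
large_fault = coverage(-0.50)  # always pushes f0 outside the window
```

The small deviation is only detected in part of the Monte Carlo runs because the random tolerances can compensate it, which is the paper's point about nominal-value coverage being over-optimistic.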
4606 Fault Detection of Drinking Water Treatment Process Using PCA and Hotelling's T2 Chart
Authors: Joval P George, Dr. Zheng Chen, Philip Shaw
Abstract:
This paper deals with the application of Principal Component Analysis (PCA) and the Hotelling's T2 chart, using data collected from a drinking water treatment process. PCA is applied primarily for the dimensional reduction of the collected data. The Hotelling's T2 control chart was used for the fault detection of the process. The data were taken from a United Utilities multistage water treatment works, downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T2, can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T2 chart from the collected data.
Keywords: Principal component analysis, Hotelling's T2 chart, multivariate statistical process control, drinking water treatment.
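A minimal numpy sketch of the PCA plus Hotelling's T2 monitoring scheme described above, on synthetic correlated data. The variable layout and the chi-square control limit are simplifying assumptions; a production chart would normally use an F-distribution-based limit:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic stand-in for correlated process measurements driven by 2 latent modes
n, p, k = 200, 5, 2
X = rng.normal(size=(n, k)) @ rng.normal(size=(k, p)) + 0.1 * rng.normal(size=(n, p))

# PCA on standardized data: keep the first k principal components
Xc = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:k].T
var = s[:k] ** 2 / (n - 1)

# Hotelling's T^2 of each sample in the retained score space
T2 = np.sum(scores ** 2 / var, axis=1)

# Rough chi-square control limit (chi2.ppf(0.99, df=2) = 9.21)
limit = 9.21
alarm_rate = np.mean(T2 > limit)
```

In-control data should trip the chart at roughly the nominal 1% rate; a process fault shifts the scores and drives T2 above the limit.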
4605 E-Learning Methodology Development using Modeling
Authors: Sarma Cakula, Maija Sedleniece
Abstract:
Simulation and modeling computer programs are concerned with the construction of models for analyzing different perspectives and possibilities under changing conditions. The paper presents the theoretical justification and evaluation of a qualitative e-learning development model in the perspective of advancing modern technologies. The principles of qualitative e-learning in higher education, the productivity of the study process using modern technologies, different kinds of methods, and future perspectives of e-learning in formal education have been analyzed. A theoretically grounded and practically tested model of developing e-learning methods using different technologies for different types of classroom has been worked out, which can be used in a professor's decision-making process to choose the most effective e-learning methods.
Keywords: E-learning, modeling, e-learning methods development, personal knowledge management
4604 Reading Literacy and Methods of Improving Reading
Authors: Iva Košek Bartošová, Andrea Jokešová, Eva Kozlová, Helena Matějová
Abstract:
The paper presents results of a research team from the Faculty of Education, University of Hradec Králové in the Czech Republic. It introduces the reading methods most used in the first classes of primary school and presents results of a pilot research project focused on mastering reading techniques and the quality of reading comprehension of pupils in the first half of a school year, during training in reading by an analytic-synthetic method and by a genetic method. These methods of practicing reading skills are the most used ones in the Czech Republic. During the school year 2015/16, measurements were made of two groups of pupils of the first year, and quantitative and qualitative parameters of pupils' reading outputs were monitored by several methods. Both of these methods are based on different theoretical foundations, and each of them follows a specific educational and methodical procedure. This contribution presents results obtained during the piloting project and draws pilot conclusions, which will be verified in subsequent broader research at the end of the first school year of primary school.
Keywords: Analytic-synthetic method of reading, genetic method of reading, reading comprehension, reading literacy, reading methods, reading speed.
4603 Space Telemetry Anomaly Detection Based on Statistical PCA Algorithm
Authors: B. Nassar, W. Hussein, M. Mokhtar
Abstract:
The critical concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e. a status or a measurement) to be checked. As a consequence, there is a continuous improvement of TM monitoring systems to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important to respond to occurring failures. Statistical multivariate latent techniques are among the vital learning tools that are used to tackle the problem above coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper, we present a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between these operating conditions. Furthermore, the algorithm provides competent information for prediction as well as adding more insight and physical interpretation to the ADCS operation.
Keywords: Space telemetry monitoring, multivariate analysis, PCA algorithm, space operations.
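A PCA residual (squared prediction error) detector in the spirit of the approach described can be sketched on synthetic stand-ins for ADCS channels; the latent structure, fault channel and bias magnitude are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(5)
# Two latent modes driving six hypothetical ADCS telemetry channels
W = np.array([[1.0, 1.0, 0.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]])

normal = rng.normal(size=(300, 2)) @ W + 0.05 * rng.normal(size=(300, 6))
faulty = rng.normal(size=(100, 2)) @ W + 0.05 * rng.normal(size=(100, 6))
faulty[:, 4] += 0.8  # bias fault on a channel outside the latent modes

# Fit PCA on normal-state data only; keep k = 2 principal directions
mu = normal.mean(0)
_, _, Vt = np.linalg.svd(normal - mu, full_matrices=False)
P = Vt[:2]

def spe(X):
    # Squared prediction error: energy left outside the principal subspace
    R = (X - mu) - (X - mu) @ P.T @ P
    return np.sum(R ** 2, axis=1)

threshold = np.percentile(spe(normal), 99)
detection_rate = np.mean(spe(faulty) > threshold)
false_alarm_rate = np.mean(spe(normal) > threshold)
```

Because the model is trained on normal telemetry only, any behaviour that leaves the learned subspace, here a bias on one channel, inflates the residual and is flagged.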
4602 Adaptation of State/Transition-Based Methods for Embedded System Testing
Authors: Abdelaziz Guerrouat, Harald Richter
Abstract:
In this paper, test generation methods and appropriate fault models for the testing and analysis of embedded systems described as (extended) finite state machines ((E)FSMs) are presented. Compared to simple FSMs, EFSMs specify not only the control flow but also the data flow. We therefore define a two-level fault model to cover both aspects. The goal of this paper is to reuse well-known FSM-based test generation methods for the automation of embedded system testing. These methods have been widely used in the testing and validation of protocols and communicating systems. (E)FSM-based specification and testing is particularly advantageous because (E)FSMs carry the formal semantics of already standardised formal description techniques (FDTs), which are popular in the design of hardware and software systems.
Keywords: Formal methods, testing and validation, finite state machines, formal description techniques.
4601 Transmitting a Distance Training Model to the Community in the Upper Northeastern Region
Authors: Teerawach Khamkorn, Laongtip Mathurasa, Savittree Rochanasmita Arnold, Witthaya Mekhum
Abstract:
The objective of this research is to transmit a distance training model to the community in the upper northeastern region. The sample group consists of 60 community leaders in the municipality of sub-district Kumphawapi, Kumphawapi District, Udonthani Province. The research instruments were: 1) an achievement test for the community leaders' training and 2) satisfaction questionnaires for the community leaders. The statistics used in the data analysis were the mean, percentage, standard deviation, and the statistical T-test. The findings reveal that: 1) the efficiency of the distance training developed by the researcher, as measured by the average scores of the community leaders during and after training, was higher than the setup criterion; 2) both groups of participants achieved higher knowledge than in their pre-training state; 3) the comparison of achievements between the two groups showed no difference; 4) the community leaders expressed high-to-highest satisfaction.
Keywords: Distance Training, Management, Technology, Transmitting.
4600 Classification of Computer Generated Images from Photographic Images Using Convolutional Neural Networks
Authors: Chaitanya Chawla, Divya Panwar, Gurneesh Singh Anand, M. P. S Bhatia
Abstract:
This paper presents a deep-learning mechanism for classifying computer generated images and photographic images. The proposed method includes a convolutional layer capable of automatically learning correlations between neighbouring pixels. In its standard form, a Convolutional Neural Network (CNN) learns features based on an image's content rather than its structural features. The proposed layer is designed to suppress an image's content and robustly learn the sensor pattern noise features (usually inherited from image processing in a camera) as well as the statistical properties of images. The method was assessed on recent natural and computer generated images, and it was concluded that it performs better than the current state-of-the-art methods.
Keywords: Image forensics, computer graphics, classification, deep learning, convolutional neural networks.
4599 Novel Methods for Desulfurization of Fuel Oils
Authors: H. Hosseini
Abstract:
Because of the requirement for low sulfur content in fuel oils, it is necessary to develop alternative methods for the desulfurization of heavy fuel oil. Due to the disadvantages of HDS technologies, such as cost, safety, and environmental concerns, new methods have been developed. Among these methods is ultrasound-assisted oxidative desulfurization, by which compounds such as benzothiophene and dibenzothiophene can be oxidized. An alternative method is sulfur removal from heavy fuel oil using activated carbon in a packed column under batch conditions. The removal of sulfur compounds in this case reaches about 99%. The most important property of activated carbon is its adsorption capacity, which is due to its high surface area and pore volume.
Keywords: Desulfurization, fuel oil, activated carbon, ultrasound-assisted oxidative desulfurization.
4598 The Relationships between Market Orientation and Competitiveness of Companies in Banking Sector
Authors: P. Jangl, M. Mikuláštík
Abstract:
The objective of the paper is to measure and compare the market orientation of Swiss and Czech banks, as well as to examine statistically the degree of influence it has on the competitiveness of the institutions. The analysis of market orientation is based on the collection, analysis and correct interpretation of the data. A descriptive analysis of market orientation describes the current situation. Research on the relation between competitiveness and market orientation in the sector of big international banks is suggested, with the expectation that a strong relationship exists. In part, the work served as a reconfirmation of the suitability of classic methodologies for the measurement of banks' market orientation.
Two types of data were gathered: firstly, by measuring the subjectively perceived market orientation of a company and, secondly, by quantifying its competitiveness. All data were collected from a sample of small, mid-sized and large banks. We used numerical secondary data from Bureau van Dijk's international statistical BANKSCOPE database.
The statistical analysis led to the following results. Firstly, assuming classical market orientation measures to be scientifically justified, Czech banks are statistically less market-oriented than Swiss banks. Secondly, among small Swiss banks that are not broadly active internationally, only a small relationship exists between market orientation measures and market-share-based competitiveness measures. Thirdly, among all Swiss banks, a strong relationship exists between market orientation measures and market-share-based competitiveness measures. These results imply the existence of a strong relation in the sector of big international banks. A strong statistical relationship has also been proven to exist between market orientation measures and the equity/total assets ratio in Switzerland.
Keywords: Market Orientation, Competitiveness, Marketing Strategy, Measurement of Market Orientation, Relation between Market Orientation and Competitiveness, Banking Sector.
4597 Identifying Corruption in Legislation using Risk Analysis Methods
Authors: Chvalkovska, J., Jansky, P., Mejstrik, M.
Abstract:
The objective of this article is to discuss the potential of economic analysis as a tool for the identification and evaluation of corruption in legislative acts. We propose that corruption be perceived as a risk variable within the legislative process. We therefore find it appropriate to employ risk analysis methods, used in various fields of economics, for the evaluation of corruption in legislation. Furthermore, we propose the incorporation of these methods into the so-called corruption impact assessment (CIA), a general framework for the detection of corruption in legislative acts. The applications of the risk analysis methods are demonstrated on examples of the implementation of the proposed CIA in the Czech Republic.
4596 Analysis of Air Quality in the Outdoor Environment of the City of Messina by an Application of the Pollution Index Method
Authors: G. Cannistraro, L. Ponterio
Abstract:
This paper reports an analysis of the outdoor air pollution of the urban centre of the city of Messina. The variations of the most critical pollutant concentrations (PM10, O3, CO, C6H6) and their trends with respect to climatic parameters and vehicular traffic have been studied. Linear regressions were performed to represent the relations among the pollutants; the differences between pollutant concentrations on weekends and weekdays were also analyzed. In order to evaluate air pollution and its effects on human health, a method for calculating a pollution index was implemented and applied in the urban centre of the city. This index is based on the weighted mean of the most detrimental air pollutant concentrations with respect to their limit values for the protection of human health. The analyzed data on the polluting substances were collected by the Assessorship of the Environment of the Regional Province of Messina in the year 2004. A statistical analysis of the air quality index trends is also reported.
Keywords: Environmental pollution, Pollutants levels, Linear regression, Air Quality Index, Statistical analysis.
4595 Project and Module Based Teaching and Learning
Authors: Jingyu Hou
Abstract:
This paper proposes a new teaching and learning approach: project and module based teaching and learning (PMBTL). The PMBTL approach incorporates the merits of project/problem based and module based learning methods and overcomes their limitations. The correlation between teaching, learning, practice and assessment is emphasized in this approach, and new methods have been proposed accordingly. The distinct features of these new methods differentiate the PMBTL approach from conventional teaching approaches. Evaluation of this approach in practical teaching and learning activities demonstrates its effectiveness and stability in improving the performance and quality of teaching and learning. The approach proposed in this paper can also inform the design of other teaching units.
Keywords: Computer science education, project and module based, software engineering.
4594 Holistic Approach to Teaching Mathematics in Secondary School as a Means of Improving Students’ Comprehension of Study Material
Authors: Natalia Podkhodova, Olga Sheremeteva, Mariia Soldaeva
Abstract:
Creating favourable conditions for students’ comprehension of mathematical content is one of the primary problems in teaching mathematics in secondary school. The fact of comprehension includes the ability to build a working situational model and thus becomes an important means of solving mathematical problems. This paper describes a holistic approach to teaching mathematics designed to address the primary challenges of such teaching; specifically, the challenge of students’ comprehension. Essentially, this approach consists of (1) establishing links between the attributes of the notion: the sense, the meaning, and the term; (2) taking into account the components of student’s subjective experience—value-based emotions, contextual, procedural and communicative—during the educational process; (3) linking together different ways to present mathematical information; (4) identifying and leveraging the relationships between real, perceptual and conceptual (scientific) mathematical spaces by applying real-life situational modelling. The article describes approaches to the practical use of these foundational concepts. Identifying how proposed methods and techniques influence understanding of material used in teaching mathematics was the primary goal. The study included an experiment in which 256 secondary school students took part: 142 in the study group and 114 in the control group. All students in these groups had similar levels of achievement in math and studied math under the same curriculum. In the course of the experiment, comprehension of two topics — “Derivative” and “Trigonometric functions”—was evaluated. Control group participants were taught using traditional methods. 
Students in the study group were taught using the holistic method: under the teacher's guidance, they carried out assignments designed to establish linkages between a notion's characteristics and to convert information from one mode of presentation to another, as well as assignments that required the ability to operate with all modes of presentation. Identification, accounting for, and transformation of subjective experience were associated with methods of stimulating the emotional-value component of the studied mathematical content (discussions of lesson titles, assignments aimed at creating study dominants, performing theme-related physical exercise, etc.). The use of techniques that form inter-subject notions based on linkages between perceptual, real and mathematical conceptual spaces proved to be of special interest to the students. Results of the experiment were analysed by presenting students in each group with a final test on each of the studied topics. The test included assignments that required building real situational models. Statistical analysis was used to aggregate the test results. Pearson's chi-square (χ²) criterion was used to establish the statistical significance of the results (pass/fail on the modelling test). A significant difference in results was revealed (p < 0.001), which allowed the conclusion that students in the study group showed better comprehension of mathematical information than those in the control group. The total number of assignments completed by each student was analysed as well, with average results calculated for each group. The statistical significance of the difference in results against the quantitative criterion (number of completed assignments) was determined using Student's t-test, which showed that students in the study group completed significantly more assignments than those in the control group (p = 0.0001).
The authors thus conclude that the observed increase in the level of comprehension of the study material resulted from the methods and techniques applied.
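The pass/fail comparison with Pearson's chi-square criterion can be reproduced on a 2x2 contingency table; the counts below are hypothetical illustrations, not the study's data:

```python
import numpy as np

# Hypothetical pass/fail counts on the situational-modelling test
#                  pass  fail
study = np.array([110.0, 32.0])    # n = 142
control = np.array([55.0, 59.0])   # n = 114
obs = np.vstack([study, control])

# Expected counts under independence: (row total * column total) / grand total
row = obs.sum(1, keepdims=True)
col = obs.sum(0, keepdims=True)
exp = row @ col / obs.sum()

chi2 = np.sum((obs - exp) ** 2 / exp)  # df = 1 for a 2x2 table
significant = chi2 > 10.83             # critical value for p < 0.001, df = 1
```

The same statistic generalizes to any r x c table with df = (r-1)(c-1); the group-mean comparison in the study would use Student's t-test instead.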
Keywords: Comprehension of mathematical content, holistic approach to teaching mathematics in secondary school, subjective experience, technology of the formation of inter-subject notions.
4593 Unsupervised Feature Selection Using Feature Density Functions
Authors: Mina Alibeigi, Sattar Hashemi, Ali Hamzeh
Abstract:
Since dealing with high-dimensional data is computationally complex and sometimes even intractable, several feature reduction methods have recently been developed to reduce the dimensionality of the data in order to simplify the analysis in various applications such as text categorization, signal processing, image retrieval, and gene expression analysis. Among feature reduction techniques, feature selection is one of the most popular methods due to its preservation of the original features. In this paper, we propose a new unsupervised feature selection method which removes redundant features from the original feature space by the use of the probability density functions of the various features. To show the effectiveness of the proposed method, popular feature selection methods have been implemented and compared. Experimental results on several datasets derived from the UCI repository illustrate the effectiveness of our proposed method in comparison with the other methods, in terms of both classification accuracy and the number of selected features.
Keywords: Feature, Feature Selection, Filter, Probability Density Function
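A much-simplified density-based redundancy filter in the spirit of the proposal (not the authors' algorithm): features whose standardized marginal density profiles nearly coincide with an already-kept feature are dropped. The data, histogram estimator and L1 threshold are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500
f1 = rng.normal(size=n)
f2 = f1 + 0.01 * rng.normal(size=n)  # near-duplicate of f1 (redundant)
f3 = rng.uniform(-3, 3, n)           # independent feature, different density
X = np.column_stack([f1, f2, f3])

def hist_density(x, bins=20):
    # Normalized histogram as a crude density estimate
    h, _ = np.histogram(x, bins=bins, density=True)
    return h / h.sum()

# Greedy filter: drop a feature whose standardized density profile almost
# coincides (small L1 distance) with that of an already-kept feature
kept = []
for j in range(X.shape[1]):
    z = (X[:, j] - X[:, j].mean()) / X[:, j].std()
    d = hist_density(z)
    if all(np.abs(d - kd).sum() > 0.2 for _, kd in kept):
        kept.append((j, d))

selected = [j for j, _ in kept]  # the near-duplicate f2 is filtered out
```

Comparing only marginal densities is a deliberate simplification: two independent features with the same distribution would wrongly look redundant here, which is one reason a full method needs more than this sketch.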
4592 A Comparison of Recent Methods for Solving a Model 1D Convection Diffusion Equation
Authors: Ashvin Gopaul, Jayrani Cheeneebash, Kamleshsing Baurhoo
Abstract:
In this paper we study some numerical methods for solving a model one-dimensional convection-diffusion equation. The semi-discretisation of the space variable results in a system of ordinary differential equations whose solution involves the evaluation of a matrix exponential. Since the calculation of this term is computationally expensive, we study some methods based on Krylov subspaces and on the restrictive Taylor series approximation, respectively. We also consider the Chebyshev pseudospectral collocation method for the spatial discretisation, and we present the numerical solutions obtained by these methods.
Keywords: Chebyshev Pseudospectral collocation method, convection-diffusion equation, restrictive Taylor approximation.
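Before turning to Krylov or pseudospectral machinery, a plain finite-difference baseline for the model equation u_t + a u_x = ν u_xx helps fix intuition. The sketch below (first-order upwind convection, central diffusion, explicit Euler) is an assumed simple scheme, not one of the paper's methods:

```python
import numpy as np

a, nu = 1.0, 0.01  # convection speed and diffusion coefficient (illustrative)
N, L = 200, 1.0
dx = L / N
x = np.linspace(0, L, N + 1)
u = np.exp(-200 * (x - 0.3) ** 2)  # Gaussian pulse initial condition

# Explicit Euler in time under a combined convection/diffusion step restriction
dt = 0.4 * min(dx / a, dx ** 2 / (2 * nu))
for _ in range(int(0.3 / dt)):     # integrate to t ~ 0.3
    un = u.copy()
    u[1:-1] = (un[1:-1]
               - a * dt / dx * (un[1:-1] - un[:-2])              # upwind u_x
               + nu * dt / dx ** 2 * (un[2:] - 2 * un[1:-1] + un[:-2]))  # u_xx
    u[0] = u[-1] = 0.0             # homogeneous Dirichlet boundaries

peak_position = x[np.argmax(u)]    # pulse convected right, ~ 0.3 + a*t
```

The pulse travels at speed a while physical (and numerical) diffusion lowers and broadens it, the behaviour any of the more sophisticated schemes in the paper must reproduce more accurately.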
4591 Food Quality Labels and their Perception by Consumers in the Czech Republic
Authors: Sarka Velcovska
Abstract:
The paper deals with quality labels used in the food products market, especially labels of quality, labels of origin, and labels of organic farming. The aim of the paper is to identify how these labels are perceived by consumers in the Czech Republic. The first part covers the definition and specification of the food quality labels relevant in the Czech Republic. The second part discusses the results of marketing research. Data were collected by personal questioning. Empirical findings on 150 respondents relate to consumer awareness and perception of the national and European food quality labels used in the Czech Republic, attitudes towards purchases of labelled products, and interest in information about the labels. Statistical methods, specifically Pearson's chi-square test of independence, the coefficient of contingency, and the coefficient of association, are used to determine whether significant differences exist among selected demographic categories of Czech consumers.
Keywords: Food quality labels, quality labels awareness, quality labels perception, marketing research.
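A minimal sketch of the chi-square test of independence and the coefficient of contingency, run on a purely hypothetical awareness-by-age contingency table (the counts are invented for illustration and are not the study's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = age groups, columns =
# aware / not aware of a quality label (illustrative counts only).
table = np.array([
    [30, 20],   # 18-34
    [35, 15],   # 35-54
    [20, 30],   # 55+
])

chi2, p, dof, expected = chi2_contingency(table)

# Coefficient of contingency: C = sqrt(chi2 / (chi2 + n)).
n = table.sum()
C = np.sqrt(chi2 / (chi2 + n))
```

A small p-value would indicate that label awareness is not independent of age group in this toy table.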
4590 Computational Methods in Official Statistics with an Example on Calculating and Predicting Diabetes Mellitus [DM] Prevalence in Different Age Groups within Australia in Future Years, in Light of the Aging Population
Authors: D. Hilton
Abstract:
An analysis of the Australian Diabetes Screening Study estimated the prevalence of undiagnosed diabetes mellitus [DM] in a high-risk, general-practice-based cohort. DM prevalence varied from 9.4% to 18.1% depending upon the diagnostic criteria used, with age being a highly significant risk factor. Using the gold-standard oral glucose tolerance test, the prevalence of DM was 22-23% in those aged >= 70 years and <15% in those aged 40-59 years. Opportunistic screening in Australian general practice can therefore identify many persons with undiagnosed type 2 DM. An Australian Bureau of Statistics document published three years ago reported the highest rate of DM in men aged 65-74 years [19%], whereas for women the rate was highest in those over 75 years [13%]. The Australian Bureau of Statistics also reported in 2007 that 13% of the population was over 65 years of age, and projected that this share will increase to 23-25% by 2056 and further to 25-28% by 2101; this information must be factored in when age-related diabetes prevalence predictions are calculated. This 10-15 percentage-point increase in the proportion of elderly persons in the population has dramatic implications for the estimated number of elderly persons with DM in these age groups. Computational estimates reflecting the age-related demographic changes reported in these official statistical documents will be presented for 2056 and 2101 for different age groups. This has relevance for future diabetes prevalence rates and shows that, along with many countries worldwide, Australia is facing a growing pandemic. In contrast, Japan is expected to see a decrease in the number of persons with diabetes over the next twenty years.
Keywords: Epidemiological methods, aging, prevalence.
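A back-of-the-envelope sketch of how the changing age structure feeds into case estimates. Only the 13% versus roughly 24% elderly shares are taken from the abstract; the population figure and band-specific prevalence rates below are illustrative assumptions, not official statistics.

```python
def projected_dm_cases(population, share_65_plus,
                       prevalence_65_plus, prevalence_under_65):
    """Two-band prevalence projection (illustrative only).

    Splits the population into 65+ and under-65 bands and applies a
    band-specific DM prevalence rate to each.
    """
    elderly = population * share_65_plus
    younger = population - elderly
    return elderly * prevalence_65_plus + younger * prevalence_under_65

# 13% aged 65+ today versus ~24% projected for 2056 (from the abstract);
# the 25 million population and the 19% / 5% prevalence rates are assumptions.
cases_now = projected_dm_cases(25_000_000, 0.13, 0.19, 0.05)
cases_2056 = projected_dm_cases(25_000_000, 0.24, 0.19, 0.05)
```

Even with prevalence rates held fixed, shifting the age structure alone increases the estimated case count substantially.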
4589 Watermark-based Counter for Restricting Digital Audio Consumption
Authors: Mikko Löytynoja, Nedeljko Cvejic, Tapio Seppänen
Abstract:
In this paper we introduce three watermarking methods that can be used to count the number of times that a user has played some content. The proposed methods are tested with audio content in our experimental system using the most common signal processing attacks. The test results show that the watermarking methods used enable the watermark to be extracted under the most common attacks with a low bit error rate.
Keywords: Digital rights management, restricted usage, content protection, spread spectrum, audio watermarking.
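A minimal spread-spectrum sketch of a watermark-based play counter, assuming a keyed pseudo-random chip sequence per counter bit. This illustrates the general spread-spectrum technique named in the keywords, not the paper's three methods.

```python
import numpy as np

def embed_count(audio, count, key=42, bits=8, alpha=0.05):
    """Embed a play counter into an audio buffer (illustrative sketch).

    Each bit of `count` flips the sign of a pseudo-random +/-1 chip
    sequence added to one segment of the signal; `key` seeds the
    generator shared by embedder and detector.
    """
    rng = np.random.default_rng(key)
    seg = len(audio) // bits
    out = audio.astype(float).copy()
    for i in range(bits):
        chip = rng.choice([-1.0, 1.0], size=seg)
        sign = 1.0 if (count >> i) & 1 else -1.0
        out[i * seg:(i + 1) * seg] += alpha * sign * chip
    return out

def extract_count(audio, key=42, bits=8):
    """Recover the counter by correlating each segment with its chip."""
    rng = np.random.default_rng(key)
    seg = len(audio) // bits
    count = 0
    for i in range(bits):
        chip = rng.choice([-1.0, 1.0], size=seg)
        if np.dot(audio[i * seg:(i + 1) * seg], chip) > 0:
            count |= 1 << i
    return count
```

The host signal contributes only zero-mean noise to each correlation, so the counter survives as long as the chip segments are long enough relative to the embedding strength.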
4588 A Decision Boundary based Discretization Technique using Resampling
Authors: Taimur Qureshi, Djamel A Zighed
Abstract:
Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy, and understandability of induction models. Usually, discretization and other statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason, we argue that a discretization performed on a sample of the population is only an estimate of that of the entire population. Most existing discretization methods partition the attribute range into two or more intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, improving the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is thus to observe whether resampling can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
Keywords: Bootstrap, discretization, resampling, soft decision trees.
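A sketch of the resampling idea, using the midpoint of the two class means as a deliberately simple per-replicate cut estimator; this estimator is an assumption for illustration, not the paper's decision-boundary method.

```python
import numpy as np

def bootstrap_cut_points(x, y, n_boot=200, seed=0):
    """Bootstrap candidate cut points for a continuous attribute x
    with binary class labels y (illustrative sketch).

    Each replicate resamples (x, y) with replacement and re-estimates
    a cut point; the spread of the candidates reflects the sampling
    variability that a single-sample discretization would hide.
    """
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y)
    cuts = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(x), size=len(x))
        xb, yb = x[idx], y[idx]
        if len(set(yb)) < 2:       # degenerate resample: one class only
            continue
        m0, m1 = xb[yb == 0].mean(), xb[yb == 1].mean()
        cuts.append((m0 + m1) / 2.0)
    return np.array(cuts)
```

On two well-separated classes the candidate cuts cluster around the population decision boundary, and their spread quantifies the estimate's uncertainty.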
4587 Assessing Basic Computer Applications’ Skills of College-Level Students in Saudi Arabia
Authors: Mohammed A. Gharawi, Majed M. Khoja
Abstract:
This paper reports the findings of a study conducted at the Institute of Public Administration (IPA) in Saudi Arabia. The study applied both qualitative and quantitative approaches to assess the level of basic computer application skills among students enrolled in the institution's preparatory programs. Qualitative data were collected through semi-structured interviews with instructors who had previously been assigned to teach introductory information technology courses. Quantitative data were collected by administering a self-report questionnaire and a written statistical test. Three hundred eighty enrolled students responded to the questionnaire, and one hundred forty-two completed the statistical test. The results indicate that most of the students enrolled in the IPA's preparatory programs lack the skills necessary to deal with computer applications.
Keywords: Assessment, Computer Applications, Computer Literacy, Institute of Public Administration, Saudi Arabia.
4586 Institutional Determinants of Economic Growth in Georgia and in Other Post-Communist Economies
Authors: Nazira Kakulia, Tsotne Zhghenti
Abstract:
Institutional development is one of the current topics in economic science. New trends and directions of institutional development depend largely on its structure and framework. The transformation of institutions is an important problem for every economy, especially for developing countries. The first research goal is to determine the importance of, and interactions between, different institutions in Georgia. Using the World Governance Indicators and Economic Freedom indexes, the size of each institutional group can be calculated. The second aim of this research is to evaluate Georgia's institutional backwardness in comparison with other post-communist economies. We use statistical and econometric methods to evaluate the difference between the levels of institutional development in Georgia and in leading post-communist economies. The major findings of this research are coefficients that assess the deviation (i.e., lag) of institutional indicators between Georgia and the leading post-communist countries with which it is compared. The last part of the article analyses the selected coefficients.
Keywords: Post-communist transition, institutions, economic growth, institutional development.
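One simple way to express such deviation (lag) coefficients is a country's indicator score as a share of a leading economy's score. The indicator names and values below are invented for the example, not actual WGI or Economic Freedom figures.

```python
# Hypothetical governance-indicator scores on a 0-100 scale
# (illustrative values only, not real index data).
indicators = ["rule_of_law", "control_of_corruption", "regulatory_quality"]
georgia = {"rule_of_law": 62, "control_of_corruption": 74, "regulatory_quality": 77}
leader = {"rule_of_law": 85, "control_of_corruption": 88, "regulatory_quality": 84}

# Lag coefficient: Georgia's score as a share of the leading
# economy's score; values below 1 indicate backwardness.
lag = {k: round(georgia[k] / leader[k], 3) for k in indicators}
```

A coefficient near 1 signals parity on that institutional dimension; the smallest coefficients mark where the development gap is widest.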
4585 A Framework for Improving Trade Contractors’ Productivity Tracking Methods
Authors: Sophia Hayes, Kenny L. Liang, Sahil Sharma, Austin Shema, Mahmoud Bader, Mohamed Elbarkouky
Abstract:
Despite being one of the country's most significant economic contributors, Canada's construction industry lags behind other sectors when it comes to labor productivity improvements. The construction industry is highly collaborative, as a general contractor will hire trade contractors to perform most of a project's work; this means low productivity from one contractor can have a domino effect on the shared success of a project. To address this issue and encourage trade contractors to improve their productivity tracking methods, an investigative study was conducted on the productivity views and tracking methods of various trade contractors. Additionally, an in-depth review was done of four standard tracking methods used in the construction industry: cost codes, benchmarking, the job productivity measurement (JPM) standard, and WorkFace Planning (WFP). The four tracking methods were used as a baseline for comparing the trade contractors' responses, determining gaps within their current tracking methods, and making improvement recommendations. Fifteen interviews were conducted with different trades to analyze how contractors value productivity. The results of these analyses indicate that there are gaps within the construction industry in the understanding of the purpose and value of productivity tracking. The trade contractors also shared their current productivity tracking systems, which were then compared to the four standard tracking methods. Gaps were identified in their various tracking methods, and, using a framework, recommendations were made based on the type of trade on how to improve how they track productivity.
Keywords: Trade contractors’ productivity, productivity tracking, cost codes, benchmarking, job productivity measurement (JPM), WorkFace Planning (WFP).
4584 Fuzzy based Security Threshold Determining for the Statistical En-Route Filtering in Sensor Networks
Authors: Hae Young Lee, Tae Ho Cho
Abstract:
In many sensor network applications, sensor nodes are deployed in open environments and hence are vulnerable to physical attacks that can compromise a node's cryptographic keys. False sensing reports can be injected through compromised nodes, which can lead not only to false alarms but also to the depletion of the limited energy resources of battery-powered networks. Ye et al. proposed a statistical en-route filtering scheme (SEF) to detect such false reports during the forwarding process. In this scheme, the choice of the security threshold value is important, since it trades off detection power against overhead. In this paper, we propose a fuzzy logic approach for determining the security threshold value in SEF-based sensor networks. The fuzzy logic determines the security threshold by considering the number of partitions in the global key pool, the number of compromised partitions, and the energy level of the nodes. The fuzzy-based threshold value conserves energy while providing sufficient detection power.
Keywords: Fuzzy logic, security, sensor network.
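A sketch of a fuzzy threshold controller using two of the paper's three inputs (the compromised-partition ratio and the node energy level, both on [0, 1]); the triangular membership functions and the rule base are invented for illustration and are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_threshold(compromised_ratio, energy_level, max_threshold=10):
    """Illustrative fuzzy controller for the SEF security threshold.

    A higher compromised-partition ratio pushes the threshold up
    (more detection power); a low energy level pushes it down
    (conserve energy). Output is a weighted average of rule outputs.
    """
    # Membership degrees for the two linguistic inputs.
    low_c = tri(compromised_ratio, -0.5, 0.0, 0.5)
    high_c = tri(compromised_ratio, 0.5, 1.0, 1.5)
    low_e = tri(energy_level, -0.5, 0.0, 0.5)
    high_e = tri(energy_level, 0.5, 1.0, 1.5)
    # Rule base: (firing strength, recommended threshold).
    rules = [
        (min(high_c, high_e), max_threshold),        # heavy compromise, energy to spare
        (min(high_c, low_e), max_threshold * 0.6),   # threat high, but conserve energy
        (min(low_c, high_e), max_threshold * 0.5),
        (min(low_c, low_e), max_threshold * 0.2),    # quiet network, save energy
    ]
    num = sum(w * t for w, t in rules)
    den = sum(w for w, t in rules)
    return num / den if den else max_threshold * 0.5
```

The weighted-average defuzzification keeps the output continuous, so small changes in the inputs never cause abrupt jumps in the threshold.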