Search results for: statistical computing
4494 Georgia Case: Tourism Expenses of International Visitors on the Basis of Growing Attractiveness
Authors: Nino Abesadze, Marine Mindorashvili, Nino Paresashvili
Abstract:
At present, key tourism indicators cannot be calculated in Georgia, making their quantitative analysis impossible; the study is therefore important from both a theoretical and a practical standpoint. The main purpose of the article is to carry out a comprehensive statistical analysis of the tourism expenses of foreign visitors and to calculate statistical attractiveness indices of Georgia's tourism potential. During the research, a method involving random and proportional selection was applied, and the statistical data were processed with SPSS. The methodology of tourism statistics was implemented according to international standards. Data were collected and grouped at major Georgian airports, and a representative population of foreign visitors and a rule for selecting respondents were determined. The results show a growing trend in tourist numbers, with the share of tourists from post-Soviet countries constantly increasing. The level of satisfaction with tourist facilities and service quality has improved, but a disparity between service quality and prices remains. The structure of foreign visitors' tourism expenses is diverse, and the competitiveness of Georgian tourist companies' products has risen. The attractiveness of popular Georgian cities has increased by 43%.
Keywords: tourist, expenses, indexes, statistics, analysis
Procedia PDF Downloads 333
4493 A Fully Automated New-Fangled VESTAL to Label Vertebrae and Intervertebral Discs
Authors: R. Srinivas, K. V. Ramana
Abstract:
This paper presents a novel method called VESTAL to label vertebrae and intervertebral discs. Each vertebra has certain statistical properties. To label vertebrae and discs, a new equation modelling the path of the spinal cord is derived from statistical properties of the spinal canal, and VESTAL uses this equation for labeling. For each vertebra and intervertebral disc, both the posterior and interior width and height are measured. The calculated values are compared with real values measured using vernier calipers, and the comparison yielded 95% efficiency with accurate results. VESTAL was applied to 350 MR images from 50 patients and obtained 100% accuracy in labeling.
Keywords: spine, vertebrae, intervertebral disc, labeling, statistics, texture, disc
Procedia PDF Downloads 363
4492 Risks in Forestry Operations, Analysis of Fatal Accidents
Authors: Rino Gubiani, Gianfranco Pergher
Abstract:
The work focused on the statistical analysis of accidents in the forestry sector (2000-2020) in the Friuli-Venezia Giulia region in north-east Italy. The aim was to analyse how casualties evolved over time and to evaluate possible improvements in the sector. It was shown that even nowadays the accident rate in forestry work is higher than in all other sectors, including agriculture; moreover, some accident types remained present throughout the whole analysed period, such as slipping on the soil, being hit by trees, and falling from trees. The results showed that an increase in forestry exploitation could even increase the total number of accidents if advanced technological machines, such as cable cranes, were not implemented, given that a significant share of workers in the sector are older than 50.
Keywords: safety, forestry work, accidents, risk analysis, casualties, statistical analysis
Procedia PDF Downloads 131
4491 Non-Destructive Visual-Statistical Approach to Detect Leaks in Water Mains
Authors: Alaa Al Hawari, Mohammad Khader, Tarek Zayed, Osama Moselhi
Abstract:
In this paper, an effective non-destructive, non-invasive approach for leak detection is proposed. The process relies on analyzing thermal images collected by an IR viewer device that captures thermograms. A statistical analysis of the collected thermal images of the ground surface along the expected leak location, followed by a visual inspection of the thermograms, was performed in order to locate the leak. To verify the applicability of the proposed approach, the predicted leak location was compared with the real leak location. The results showed that the leak location was successfully identified with an accuracy of more than 95%.
Keywords: thermography, leakage, water pipelines, thermograms
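For illustration, the statistical step can be sketched as a simple anomaly test on a thermogram. This is a plausible reading of the approach, not the authors' exact procedure; the temperature field, leak location and z-score threshold below are all hypothetical.

```python
# Hedged sketch: flag ground-surface pixels whose temperature deviates
# strongly from the image mean (leaked water cools the surface above it).
import numpy as np

rng = np.random.default_rng(0)
thermogram = rng.normal(20.0, 0.3, (120, 160))  # hypothetical surface temps (deg C)
thermogram[60:70, 80:95] -= 1.5                 # cooler patch above a leak

z = (thermogram - thermogram.mean()) / thermogram.std()
ys, xs = np.nonzero(z < -3.0)                   # strongly cold anomalies
print("suspected leak centroid (row, col):", ys.mean(), xs.mean())
```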
Procedia PDF Downloads 355
4490 Drying Kinetics of Soybean Seeds
Authors: Amanda Rithieli Pereira Dos Santos, Rute Quelvia De Faria, Álvaro De Oliveira Cardoso, Anderson Rodrigo Da Silva, Érica Leão Fernandes Araújo
Abstract:
The study of drying kinetics is of great importance for mathematical modelling, providing insight into the heat and mass transfer processes between product and air and supporting the design of dryers and new drying technologies. The present work studied the drying kinetics of soybean seeds, fitting different statistical models to the experimental data while varying cultivar and temperature. Soybean seeds were pre-dried in a natural environment in order to reduce and homogenize the water content to 14% (dry basis). Drying was then carried out in a forced-air circulation oven at controlled temperatures of 38, 43, 48, 53 and 58 ± 1 °C, using two soybean cultivars, BRS 8780 and Sambaíba, until hygroscopic equilibrium was reached. The experimental design was completely randomized in a 5 x 2 factorial (temperature x cultivar) with 3 replicates. Eleven statistical models used to describe the drying of agricultural products were fitted to the experimental data. Regression analysis was performed using the Gauss-Newton least-squares algorithm to estimate the parameters. Goodness of fit was evaluated through the coefficient of determination (R²), the adjusted coefficient of determination (adjusted R²) and the standard error (SE). The models that best represent the drying kinetics of soybean seeds are the Midilli and the Logarithmic models.
Keywords: curve of drying seeds, Glycine max L., moisture ratio, statistical models
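For illustration, the fitting step for one of the winning models can be reproduced in a few lines. The sketch below fits the Midilli model MR(t) = a·exp(-k·t^n) + b·t to hypothetical moisture-ratio data; note that SciPy's curve_fit defaults to Levenberg-Marquardt rather than the plain Gauss-Newton algorithm named in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def midilli(t, a, k, n, b):
    """Midilli et al. thin-layer drying model: MR = a*exp(-k*t**n) + b*t."""
    return a * np.exp(-k * t**n) + b * t

t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])          # drying time (h)
mr = np.array([0.84, 0.70, 0.50, 0.35, 0.25, 0.12, 0.06])  # moisture ratio

popt, _ = curve_fit(midilli, t, mr, p0=[1.0, 0.3, 1.0, 0.0], maxfev=10000)
pred = midilli(t, *popt)

ss_res = np.sum((mr - pred) ** 2)
ss_tot = np.sum((mr - mr.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                       # coefficient of determination
n_obs, n_par = t.size, 4
r2_adj = 1 - (1 - r2) * (n_obs - 1) / (n_obs - n_par - 1)
se = np.sqrt(ss_res / (n_obs - n_par))         # standard error of estimate
print(f"a={popt[0]:.3f} k={popt[1]:.3f} n={popt[2]:.3f} b={popt[3]:.4f}")
print(f"R2={r2:.4f} adjR2={r2_adj:.4f} SE={se:.4f}")
```

The same R², adjusted R² and SE computations apply unchanged to the other ten candidate models.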
Procedia PDF Downloads 628
4489 Automated Facial Symmetry Assessment for Orthognathic Surgery: Utilizing 3D Contour Mapping and Hyperdimensional Computing-Based Machine Learning
Authors: Wen-Chung Chiang, Lun-Jou Lo, Hsiu-Hsia Lin
Abstract:
This study aimed to improve the evaluation of facial symmetry, which is crucial for planning and assessing outcomes in orthognathic surgery (OGS). Facial symmetry plays a key role in both aesthetic and functional aspects of OGS, making its accurate evaluation essential for optimal surgical results. To address the limitations of traditional methods, a different approach was developed, combining three-dimensional (3D) facial contour mapping with hyperdimensional (HD) computing to enhance precision and efficiency in symmetry assessments. The study was conducted at Chang Gung Memorial Hospital, where data were collected from 2018 to 2023 using 3D cone beam computed tomography (CBCT), a highly detailed imaging technique. A large and comprehensive dataset was compiled, consisting of 150 normal individuals and 2,800 patients, totaling 5,750 preoperative and postoperative facial images. These data were critical for training a machine learning model designed to analyze and quantify facial symmetry. The machine learning model was trained to process 3D contour data from the CBCT images, with HD computing employed to power the facial symmetry quantification system. This combination of technologies allowed for an objective and detailed analysis of facial features, surpassing the accuracy and reliability of traditional symmetry assessments, which often rely on subjective visual evaluations by clinicians. In addition to developing the system, the researchers conducted a retrospective review of 3D CBCT data from 300 patients who had undergone OGS. The patients' facial images were analyzed both before and after surgery to assess the clinical utility of the proposed system. The results showed that the facial symmetry algorithm achieved an overall accuracy of 82.5%, indicating its robustness in real-world clinical applications. Postoperative analysis revealed a significant improvement in facial symmetry, with an average score increase of 51%. The mean symmetry score rose from 2.53 preoperatively to 3.89 postoperatively, demonstrating the system's effectiveness in quantifying improvements after OGS. These results underscore the system's potential for providing valuable feedback to surgeons and aiding in the refinement of surgical techniques. The study also led to the development of a web-based system that automates facial symmetry assessment. This system integrates HD computing and 3D contour mapping into a user-friendly platform that allows for rapid and accurate evaluations. Clinicians can easily access this system to perform detailed symmetry assessments, making it a practical tool for clinical settings. Additionally, the system facilitates better communication between clinicians and patients by providing objective, easy-to-understand symmetry scores, which can help patients visualize the expected outcomes of their surgery. In conclusion, this study introduced a valuable and highly effective approach to facial symmetry evaluation in OGS, combining 3D contour mapping, HD computing, and machine learning. The resulting system achieved high accuracy and offers a streamlined, automated solution for clinical use. The development of the web-based platform further enhances its practicality, making it a valuable tool for improving surgical outcomes and patient satisfaction in orthognathic surgery.
Keywords: facial symmetry, orthognathic surgery, facial contour mapping, hyperdimensional computing
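For readers unfamiliar with hyperdimensional computing, the core operation can be sketched briefly. The encoding below is illustrative only; the paper's actual mapping from 3D contours to hypervectors is not reproduced here, and the feature values are hypothetical.

```python
# Hedged HD-computing sketch: bundle per-feature random hypervectors and
# compare left/right facial-contour encodings by a normalized dot product.
import numpy as np

rng = np.random.default_rng(0)
D = 10_000                                         # hypervector dimensionality
n_features = 64
item = rng.choice([-1, 1], size=(n_features, D))   # random bipolar item vectors

def encode(features):
    hv = (features[:, None] * item).sum(axis=0)    # weighted bundling
    return np.sign(hv)

def similarity(a, b):
    return float(np.dot(a, b)) / D                 # in [-1, 1]

left = rng.random(n_features)                      # hypothetical contour stats
right = left + rng.normal(0, 0.05, n_features)     # near-symmetric other side
print(f"symmetry score ~ {similarity(encode(left), encode(right)):.3f}")
```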
Procedia PDF Downloads 27
4488 Characteristic Sentence Stems in Academic English Texts: Definition, Identification, and Extraction
Authors: Jingjie Li, Wenjie Hu
Abstract:
Phraseological units in academic English texts have been a central focus in recent corpus linguistic research. A wide variety of phraseological units have been explored, including collocations, chunks, lexical bundles, patterns, semantic sequences, etc. This paper describes a special category of clause-level phraseological units, namely Characteristic Sentence Stems (CSSs), with a view to describing their defining criteria and extraction method. CSSs are contiguous lexico-grammatical sequences that contain a subject-predicate structure and are frame expressions characteristic of academic writing. The extraction of CSSs consists of six steps: part-of-speech tagging, n-gram segmentation, structure identification, significance-of-occurrence calculation, text range calculation, and overlapping sequence reduction. The significance-of-occurrence calculation is the crux of this study: it involves computing both the internal association and the boundary independence of a CSS, testing its significance of occurrence from both inside and outside perspectives. A new normalization algorithm is also introduced into the calculation of LocalMaxs for reducing overlapping sequences. It is argued that many sentence stems are so recurrent in academic texts that the most typical of them have become the habitual ways of making meaning in academic writing. Therefore, studies of CSSs could have implications and reference value for academic discourse analysis, English for Academic Purposes (EAP) teaching, and writing.
Keywords: characteristic sentence stem, extraction method, phraseological unit, statistical measure
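The significance-of-occurrence step invites a small worked example. The sketch below scores candidate word sequences with an SCP-style glue measure and ranks them; it stands in for, but does not reproduce, the paper's full pipeline (POS tagging, boundary-independence testing, and the normalized LocalMaxs algorithm).

```python
# Toy internal-association ("glue") scoring for n-gram candidates.
from collections import Counter
from itertools import chain

corpus = ("it is argued that a it is argued that b "
          "the results show that c the results show that d").split()

def ngrams(tokens, n):
    return [tuple(tokens[i:i+n]) for i in range(len(tokens) - n + 1)]

counts = Counter(chain.from_iterable(ngrams(corpus, n) for n in range(1, 5)))
total = len(corpus)
p = lambda g: counts[g] / total          # crude probability estimate

def glue(g):
    """Average-split SCP: p(g)^2 over the mean of p(left)*p(right)."""
    splits = [(g[:i], g[i:]) for i in range(1, len(g))]
    denom = sum(p(l) * p(r) for l, r in splits) / len(splits)
    return p(g) ** 2 / denom

candidates = [g for g in counts if len(g) >= 3]
for g in sorted(candidates, key=glue, reverse=True)[:3]:
    print(" ".join(g), round(glue(g), 3))
```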
Procedia PDF Downloads 166
4487 128-Multidetector CT for Assessment of Optimal Depth of Electrode Array Insertion in Cochlear Implant Operations
Authors: Amina Sultan, Mohamed Ghonim, Eman Oweida, Aya Abdelaziz
Abstract:
Objective: To assess the diagnostic reliability of multidetector CT in the pre- and post-operative evaluation of cochlear implant candidates. Material and Methods: The study includes 40 patients (18 males and 22 females) with a mean age of 5.6 years, classified into two groups: Group A (20 patients), in whom the cochlear implant device was the Nucleus-22, and Group B (20 patients), in whom the device was the MED-EL. Cochlear length (CL) and cochlear height (CH) were measured pre-operatively by 128-multidetector CT; electrode length (EL) and insertion depth angle (α) were measured post-operatively by MDCT. Results: For Group A, mean CL was 9.1 mm ± 0.4 SD; mean CH was 4.1 ± 0.3 SD; mean EL was 18 ± 2.7 SD; mean α angle was 299.05 ± 37 SD. A statistically significant correlation (P < 0.05) was found between pre-operative CL and post-operative EL (r² = 0.6), as well as between EL and α angle (r² = 0.7). For Group B, mean CL was 9.1 mm ± 0.3 SD; mean CH was 4.1 ± 0.4 SD; mean EL was 27 ± 2.1 SD; mean α angle was 287.6 ± 41.7 SD. Significant correlations were found between CL and EL (r² = 0.6) and α angle (r² = 0.5), and a strong correlation was found between EL and α angle (r² = 0.8). A statistically significant difference was detected between the two devices with regard to electrode length. Conclusion: Multidetector CT is a reliable tool for pre-operative planning and post-operative evaluation of the outcomes of cochlear implant operations. Cochlear length is a valuable prognostic parameter for predicting the depth of electrode array insertion, which can influence device selection criteria.
Keywords: angle of insertion (α angle), cochlear implant (CI), cochlear length (CL), Multidetector Computed Tomography (MDCT)
Procedia PDF Downloads 194
4486 Validation of the Linear Trend Estimation Technique for Prediction of Average Water and Sewerage Charge Rate Prices in the Czech Republic
Authors: Aneta Oblouková, Eva Vítková
Abstract:
The article deals with water and sewerage charge rate prices in the Czech Republic. The research focuses on the development of the average water and sewerage charge rate prices in the Czech Republic in the years 1994-2021 and on the validation of a chosen methodology for predicting their development. The data for this research were obtained from the Czech Statistical Office. The aim of the paper is to validate the relevance of the linear trend estimation technique for calculating predicted average water and sewerage charge rate prices. The real average prices for the years 1994-2018, obtained from the Czech Statistical Office, were converted into a mathematical equation; the same type of real data was obtained for the years 2019-2021. Predictions of the average prices for 2019-2021 were then calculated using the chosen method, the linear trend estimation technique. The values obtained from the Czech Statistical Office and the values calculated using the chosen methodology were subsequently compared. The result of the research is a validation of the chosen mathematical technique as suitable for this purpose.
Keywords: Czech Republic, linear trend estimation, price prediction, water and sewerage charge rate
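The validation logic is easy to make concrete. The sketch below fits a least-squares line to a calibration period and extrapolates to the validation years; the prices are synthetic stand-ins, since the real series belongs to the Czech Statistical Office.

```python
# Hedged sketch of the linear trend estimation step on hypothetical data.
import numpy as np

years = np.arange(1994, 2019)                 # calibration period 1994-2018
prices = (8.0 + 3.1 * (years - 1994)
          + np.random.default_rng(1).normal(0, 2, years.size))  # synthetic

slope, intercept = np.polyfit(years, prices, deg=1)  # least-squares line

for y in (2019, 2020, 2021):                  # validation period
    print(y, round(slope * y + intercept, 2))
```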
Procedia PDF Downloads 120
4485 Understanding the Damage Evolution and the Risk of Failure of Pyrrhotite Containing Concrete Foundations
Authors: Marisa Chrysochoou, James Mahoney, Kay Wille
Abstract:
Pyrrhotite is an iron-sulfide mineral that releases sulfuric acid when exposed to water and oxygen. Its presence in concrete foundations across Connecticut and Massachusetts in the US is, in some cases, causing premature failure, resulting in a devastating crisis for the affected parties; it can take 15-25 years before internal damage becomes visible on the surface. This study shares laboratory results aimed at investigating the fundamental mechanisms of the pyrrhotite reaction and furthering the understanding of its deterioration kinetics within concrete. The work includes the following analyses: total sulfur, wavelength-dispersive X-ray fluorescence, expansion, reaction rate combined with ion chromatography, and damage evolution using electro-chemical acceleration. This information is coupled with a statistical analysis of over 150 analyzed concrete foundations. The samples were obtained and processed using a developed and validated sampling method that is minimally invasive to the foundation in use, provides representative samples of the concrete matrix across the entire foundation, and is time- and cost-efficient. The processed samples were analyzed using a developed modular testing method based on total sulfur and wavelength-dispersive X-ray fluorescence analysis to quantify the amount of pyrrhotite. As part of the statistical analysis, the results were grouped into three categories: no damage observed and no pyrrhotite detected, no damage observed and pyrrhotite detected, and damage observed and pyrrhotite detected. As expected, a strong correlation between the amount of pyrrhotite, the age of the concrete and damage is observed. The laboratory investigation and the statistical analysis of field samples will together form a scientific basis to support the decision process towards sustainable financial and administrative solutions by state and local stakeholders.
Keywords: concrete, pyrrhotite, risk of failure, statistical analysis
Procedia PDF Downloads 68
4484 Screen Method of Distributed Cooperative Navigation Factors for Unmanned Aerial Vehicle Swarm
Authors: Can Zhang, Qun Li, Yonglin Lei, Zhi Zhu, Dong Guo
Abstract:
Aiming at the problem of factor screening in the distributed collaborative navigation of dense UAV swarms, an efficient distributed collaborative navigation factor screening method is proposed. The method considers the balance between computing load and positioning accuracy. The proposed algorithm uses a factor graph model to implement a distributed collaborative navigation algorithm, with the GNSS information of each UAV and the inter-UAV ranging information serving as positioning factors. In this distributed scheme, a local factor graph is established for each UAV, and the positioning factors of nodes with good geometric position distribution and small variance are selected to participate in the navigation calculation. To demonstrate and verify the proposed method, simulations and experiments in different scenarios were performed. The simulation results show that the proposed scheme achieves a good balance between computing load and positioning accuracy in the distributed cooperative navigation calculation of a UAV swarm. The proposed algorithm has important theoretical and practical value for both industry and academia.
Keywords: screen method, cooperative positioning system, UAV swarm, factor graph, cooperative navigation
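As a rough illustration of the screening idea, the sketch below keeps ranging factors from neighbours with small range variance and diverse bearing geometry. The gating thresholds and the neighbour data are invented for the example; the paper's actual selection criterion may differ.

```python
# Hedged sketch: select up to k ranging factors with low variance and
# bearings spread at least ~30 degrees apart (geometry diversity).
import numpy as np

def screen_factors(neighbors, max_var=4.0, k=4):
    """neighbors: list of (id, bearing_rad, range_variance)."""
    ok = [n for n in neighbors if n[2] <= max_var]   # variance gate
    ok.sort(key=lambda n: n[2])                      # favour small variance
    chosen, bearings = [], []
    for nid, brg, var in ok:
        if all(abs((brg - b + np.pi) % (2 * np.pi) - np.pi) > np.radians(30)
               for b in bearings):
            chosen.append(nid)
            bearings.append(brg)
        if len(chosen) == k:
            break
    return chosen

nbrs = [(1, 0.1, 1.2), (2, 0.2, 0.8), (3, 2.1, 3.5), (4, 4.0, 9.0), (5, 3.3, 2.0)]
print(screen_factors(nbrs))   # e.g. [2, 5, 3]
```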
Procedia PDF Downloads 79
4483 Statistical Estimation of Ionospheric Energy Dissipation Using Østgaard's Empirical Relation
Authors: M. A. Ahmadu, S. S. Rabia
Abstract:
During the past few decades, energy dissipation in the ionosphere resulting from geomagnetic activity has caused an increasing number of major disruptions of important power and communication services, malfunctions, and losses of expensive facilities. Here, the electron precipitation energy w(ep) and the Joule heating energy w(jh) were used to compute this dissipation with Østgaard's empirical relation and hourly geomagnetic indices of 2012, under the assumption that the magnetosphere stores no energy, so that the activity begins at t1 = 0 and ends at t2 = t. The statistical results obtained show that ionospheric dissipation varies from month to month, day to day and hour to hour, and is estimated at a value of ~3.6 w(ep), in agreement with experimental results.
Keywords: Østgaard's relation, ionospheric dissipation, joule heating, electron precipitation, geomagnetic indices, empirical relation
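The hourly-integration step can be sketched as follows, assuming hourly AE/AL indices as the inputs. The coefficient forms inside w_ep and w_jh are placeholders standing in for Østgaard's published relations and should be replaced with the actual coefficients before use; the index values are hypothetical.

```python
# Hedged sketch: sum hourly power estimates over an activity interval.
import numpy as np

def w_ep(al):   # electron-precipitation power (GW) from the AL index (nT)
    return 4.4 * np.sqrt(np.abs(al)) - 7.6   # placeholder coefficients

def w_jh(ae):   # Joule-heating power (GW) from the AE index (nT)
    return 0.33 * ae                          # placeholder coefficient

ae = np.array([120, 340, 560, 410, 220])      # hypothetical hourly AE (nT)
al = np.array([-80, -260, -430, -300, -150])  # hypothetical hourly AL (nT)

# hourly power (GW) summed over the interval -> energy in GW*h
total = np.sum(w_ep(al) + w_jh(ae))
print(f"dissipated energy ~ {total:.1f} GWh")
```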
Procedia PDF Downloads 294
4482 Experimental Assessment of Alkaline Leaching of Lepidolite
Authors: António Fiúza, Aurora Futuro, Joana Monteiro, Joaquim Góis
Abstract:
Lepidolite is an important lithium mineral that, to the authors' best knowledge, has not been used to produce lithium hydroxide, which is needed for the energy transition to electric vehicles. Alkaline leaching of lithium concentrates allows the establishment of a production diagram that avoids most of the environmental drawbacks associated with the use of acid reagents. The tested processes involve a pretreatment by digestion at high temperature with additives, followed by hot leaching at atmospheric pressure. The solutions obtained must be compatible with solutions from the leaching of spodumene concentrates, allowing the development of a common treatment diagram, an important accomplishment for the feasible exploitation of Portuguese resources. Statistical design and interpretation techniques minimize the laboratory effort required by conventional approaches and allow phenomenological comprehension.
Keywords: alkaline leaching, lithium, research design, statistical interpretation
Procedia PDF Downloads 97
4481 Fault Tolerant and Testable Designs of Reversible Sequential Building Blocks
Authors: Vishal Pareek, Shubham Gupta, Sushil Chandra Jain
Abstract:
With the increasing demand for high-speed computation, power consumption, heat dissipation and chip size are posing challenges for logic design with conventional technologies. Recovery from bit loss and bit errors is a further issue, requiring reversibility and fault tolerance in computation. Reversible computing is emerging as an alternative to conventional technologies to overcome these problems and is helpful in diverse areas such as low-power design, nanotechnology and quantum computing. The bit-loss issue can be solved through a unique input-output mapping, which requires reversibility, while the bit-error issue requires fault tolerance in the design. To incorporate reversibility, a number of combinational reversible-logic circuits have been developed; however, very few sequential reversible circuits have been reported in the literature. To make circuits fault tolerant, a number of fault models and test approaches have been proposed for reversible logic. In this paper, we incorporate fault tolerance into sequential reversible building blocks, namely the D flip-flop, T flip-flop, JK flip-flop, R-S flip-flop, master-slave D flip-flop and double edge triggered D flip-flop, by making them parity preserving. The importance of this work lies in the fact that it provides designs of reversible sequential circuits that are completely testable for any stuck-at fault and single-bit fault. In our opinion, our designs of reversible building blocks are superior to existing designs in terms of quantum cost, hardware complexity, constant inputs, garbage outputs and number of gates, and a design of an online-testable D flip-flop is proposed for the first time. We hope this work can be extended to building complex reversible sequential circuits.
Keywords: parity preserving gate, quantum computing, fault tolerance, flip-flop, sequential reversible logic
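The two properties the designs rest on, reversibility (a bijective input-output mapping) and parity preservation (the XOR of all lines is unchanged), can be checked exhaustively for small gates. The sketch below uses the Fredkin (controlled-swap) gate as a stand-in example; it is not one of the paper's proposed flip-flop designs.

```python
# Exhaustive check of reversibility and parity preservation for a 3-bit gate.
from itertools import product

def fredkin(a, b, c):
    """Controlled swap: if a == 1, exchange b and c."""
    return (a, c, b) if a else (a, b, c)

outputs = [fredkin(*bits) for bits in product((0, 1), repeat=3)]

is_reversible = len(set(outputs)) == 8        # bijection on 3-bit space
is_parity_preserving = all(
    (a ^ b ^ c) == (x ^ y ^ z)
    for (a, b, c), (x, y, z) in zip(product((0, 1), repeat=3), outputs)
)
print(is_reversible, is_parity_preserving)    # True True
```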
Procedia PDF Downloads 545
4480 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework
Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi
Abstract:
There is a huge amount of lecture video data available for public use, and many more lecture videos are created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task; therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, processing in a centralized computation framework is very inefficient; hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection, which offers a visual guideline for navigating the video content. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology to the key-frames. The OCR output and the detected slide text-line types are used for keyword extraction, by which both video-level and segment-level keywords are extracted for content-based video browsing and search. For a large database, the performance of the indexing process can be improved by distributed computing on the Hadoop framework.
Keywords: video lectures, big video data, video retrieval, hadoop
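The indexing stage maps naturally onto the MapReduce model. The sketch below imitates, in plain Python, a Hadoop-style job that turns OCR'd slide text into segment-level keyword postings; the records and field layout are hypothetical.

```python
# Hedged map-reduce sketch: keyword -> (video, segment) posting lists.
from collections import defaultdict

records = [                                  # hypothetical OCR output
    ("lec01", 0, "gradient descent convergence"),
    ("lec01", 1, "stochastic gradient descent"),
    ("lec02", 0, "fourier transform convergence"),
]

def mapper(rec):
    vid, seg, text = rec
    for word in text.split():                # emit (keyword, location) pairs
        yield word, (vid, seg)

def reducer(pairs):
    index = defaultdict(list)                # keyword -> posting list
    for word, loc in pairs:
        index[word].append(loc)
    return index

index = reducer(p for r in records for p in mapper(r))
print(index["gradient"])                     # [('lec01', 0), ('lec01', 1)]
```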
Procedia PDF Downloads 534
4479 A Real-World Roadmap and Exploration of Quantum Computers Capacity to Trivialise Internet Security
Authors: James Andrew Fitzjohn
Abstract:
This paper discusses and explores the practical aspects of cracking encrypted messages with quantum computers. The theory of this process has been well described in both academic papers and headline-grabbing news articles, but amid the theory and hyperbole, we must carefully assess the practicality of these claims. We therefore use real-world devices and proof-of-concept code to support or refute the notion that quantum computers will render the encryption technologies used by many websites unfit for purpose. With the many recent advances in quantum computing hardware and software, it is time to examine and implement the practical aspects of the process. This paper sets expectations regarding the useful lifespan of RSA and cipher lengths and proposes alternative encryption technologies. We set out comprehensive roadmaps describing when and how encryption schemes can be used, including when they can no longer be trusted. Cost is also factored into our investigation; for example, it would make little financial sense to spend millions of dollars on a quantum computer to factor a private key in seconds when a commodity GPU could perform the same task in hours. It is hoped that the real-world results depicted in this paper will help influence the owners of websites, who can take appropriate actions to improve the security of their provisions.
Keywords: quantum computing, encryption, RSA, roadmap, real world
Procedia PDF Downloads 131
4478 Comparison of Safety Factor Evaluation Methods for Buckling of High Strength Steel Welded Box Section Columns
Authors: Balazs Somodi, Balazs Kovesdi
Abstract:
In civil engineering research practice, the statistical evaluation of experimental and numerical investigations is an essential task for comparing the experimental and numerical resistances of a specific structural problem with the resistances proposed by the standards. However, the standards and the international literature offer several different safety factor evaluation methods that can be used to check the necessary safety level (e.g., 5% quantile level, 2.3% quantile level, 1‰ quantile level, γM partial safety factor, γM* partial safety factor, β reliability index). Moreover, different calculation methods can be found in the literature even for the same safety factor. In the present study, the flexural buckling resistance of high strength steel (HSS) welded closed sections is analyzed. The authors investigated the flexural buckling resistances of the analyzed columns by laboratory experiments. The safety levels of the obtained experimental resistances are calculated based on several safety approaches and compared with EN 1990. The results of the different safety approaches are compared and evaluated; based on the evaluation, tendencies are identified and the differences between the statistical evaluation methods are explained.
Keywords: flexural buckling, high strength steel, partial safety factor, statistical evaluation
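One of the compared approaches, the 5% characteristic (quantile) value, can be sketched per EN 1990 Annex D as X_k = m_X(1 - k_n·V_X). The k_n values below are transcribed for the "V_X unknown" case and should be verified against Table D.1 of the standard; the test resistances are invented.

```python
# Hedged sketch of the EN 1990 Annex D 5% characteristic value.
import numpy as np

KN_VX_UNKNOWN = {3: 3.37, 4: 2.63, 5: 2.33, 6: 2.18, 8: 2.00,
                 10: 1.92, 20: 1.76, 30: 1.73}   # verify against Table D.1

def characteristic_value(samples):
    x = np.asarray(samples, dtype=float)
    m, s = x.mean(), x.std(ddof=1)
    kn = KN_VX_UNKNOWN[len(x)]
    return m * (1 - kn * s / m)                  # 5% fractile estimate

# hypothetical buckling resistances (kN) from repeated column tests
print(round(characteristic_value([812, 845, 798, 830, 821, 809]), 1))
```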
Procedia PDF Downloads 160
4477 Explore Urban Spatial Density with Boltzmann Statistical Distribution
Authors: Jianjia Wang, Tong Yu, Haoran Zhu, Kun Liu, Jinwei Hao
Abstract:
The underlying pattern of the modern city is agglomeration, and to some degree the distribution of urban spatial density can be used to describe the status of this assemblage. Three intrinsic characteristics measure urban spatial density: the Floor Area Ratio (FAR), the Building Coverage Ratio (BCR), and the Average Storeys (AS). The underlying mechanism that gives rise to these quantities, however, is still vague in statistical urban studies. In this paper, we explore the extrinsic factors related to spatial density, which can in turn influence the intrinsic quantities. Taking the Shanghai Inner Ring Area and Manhattan in New York as examples, we analyse the potential impacts of six selected extrinsic elements on urban spatial density. Every single factor correlates with the spatial distribution, but the overall global impact of all of them remains implicit. To handle this issue, we develop a Boltzmann statistical model to explain the mechanism behind it explicitly. We derive a novel quantity, called capacity, that measures the global effect of all the extrinsic factors on the three intrinsic characteristics. The distribution of capacity presents a pattern similar to real measurements, revealing the nonlinear, multi-factor influences on urban spatial density under agglomeration.
Keywords: urban spatial density, Boltzmann statistics, multi-factor correlation, spatial distribution
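The Boltzmann-style model can be illustrated with a one-parameter fit. The sketch below treats a density measure (here FAR) as an "energy" with p(E) proportional to exp(-E/T); calling the fitted temperature-like parameter a capacity analogue is this note's assumption, not the paper's exact definition.

```python
# Hedged sketch: fit an exponential (Boltzmann-like) law to parcel FARs.
import numpy as np

rng = np.random.default_rng(7)
far = rng.exponential(scale=2.1, size=5000)   # hypothetical parcel FARs

T_hat = far.mean()                            # MLE of T for p(E) ~ exp(-E/T)
print(f"fitted temperature-like parameter T ~ {T_hat:.2f}")

# sanity check: empirical vs model tail probability P(FAR > 4)
print((far > 4).mean(), np.exp(-4 / T_hat))
```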
Procedia PDF Downloads 150
4476 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate
Authors: Angela Maria Fasnacht
Abstract:
The detection of anomalies due to the presence of contaminants, also known as early detection, in water treatment plants has become a critical point that deserves in-depth study for improvement and adaptation to current requirements. The design of these systems requires detailed analysis and processing of the data in real time, so it is necessary to apply statistical methods appropriate to the generated data, such as Spearman's correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical analysis allows the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment can be considered a vital step in developing advanced machine-learning models that allow optimized real-time data management in early detection systems for water treatment processes. These techniques also facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify possible correlations between the measured parameters and the presence of the contaminant glyphosate in a single-pass system. The interaction between the initial glyphosate concentration and the location of the sensors on the readings of the reported parameters was studied.
Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive
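Two of the named methods can be demonstrated on synthetic probe data: Spearman's correlation between a sensor reading and the glyphosate level, and k-fold cross-validation of a simple predictive model. All readings below are simulated; no real sensor behaviour is implied.

```python
# Hedged sketch: Spearman correlation and 5-fold CV on synthetic probe data.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(3)
n = 200
glyphosate = rng.uniform(0, 5, n)                       # spiked concentration
conductivity = 400 + 30 * glyphosate + rng.normal(0, 15, n)
ph = 7.2 - 0.05 * glyphosate + rng.normal(0, 0.2, n)

rho, pval = spearmanr(conductivity, glyphosate)
print(f"Spearman rho={rho:.2f}, p={pval:.1e}")

X = np.column_stack([conductivity, ph])
scores = cross_val_score(LinearRegression(), X, glyphosate,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0))
print("5-fold R2 scores:", np.round(scores, 2))
```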
Procedia PDF Downloads 122
4475 GPU Accelerated Fractal Image Compression for Medical Imaging in Parallel Computing Platform
Authors: Md. Enamul Haque, Abdullah Al Kaisan, Mahmudur R. Saniat, Aminur Rahman
Abstract:
In this paper, we have implemented both sequential and parallel versions of fractal image compression algorithms using the CUDA (Compute Unified Device Architecture) programming model, parallelizing the program on the Graphics Processing Unit for medical images, as they are highly self-similar. There are several improvements in the implementation of the algorithm as well. Fractal image compression is based on the self-similarity of an image, meaning the image has similar regions throughout. We take this opportunity to implement the compression algorithm and monitor its behaviour in both parallel and sequential implementations. Fractal compression offers a high compression rate and a resolution-independent (dimensionless) scheme. The scheme has two parts, encoding and decoding; encoding is computationally very expensive, while decoding is much cheaper. Applying fractal compression to medical images would allow much higher compression ratios, and fractal magnification, an inseparable feature of fractal compression, is very useful for presenting the reconstructed image in a highly readable form. However, like all irreversible methods, fractal compression is affected by information loss, which is especially troublesome in medical imaging; a very time-consuming encoding process, which can last even several hours, is another drawback.
Keywords: accelerated GPU, CUDA, parallel computing, fractal image compression
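The computational core of fractal encoding is a block-matching search, and this is exactly the loop a CUDA kernel would parallelize across range blocks. The sketch below is a sequential toy with illustrative image and block sizes, omitting isometries and quantization.

```python
# Hedged sequential toy of the fractal-encoding block-matching search.
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((32, 32))
B = 4                                               # range-block size

def blocks(a, size, step):
    for y in range(0, a.shape[0] - size + 1, step):
        for x in range(0, a.shape[1] - size + 1, step):
            yield (y, x), a[y:y+size, x:x+size]

def match_error(domain, range_block):
    """Fit contrast s and brightness o by least squares, return residual."""
    s, o = np.polyfit(domain.ravel(), range_block.ravel(), 1)
    return np.sum((s * domain + o - range_block) ** 2)

# domain pool: 2B x 2B blocks shrunk to B x B by 2x2 averaging
domains = [(pos, d.reshape(B, 2, B, 2).mean(axis=(1, 3)))
           for pos, d in blocks(img, 2 * B, B)]

code = []                                           # the "fractal code"
for rpos, r in blocks(img, B, B):
    err, dpos = min((match_error(d, r), pos) for pos, d in domains)
    code.append((rpos, dpos))
print(len(code), "range blocks encoded")
```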
Procedia PDF Downloads 336
4474 A Statistical Approach to Classification of Agricultural Regions
Authors: Hasan Vural
Abstract:
Turkey is a favorable country for producing a great variety of agricultural products because of its varied geographic and climatic conditions, which have been used to divide the country into four main and seven sub-regions. This seven-region classification has traditionally been used for data collection and publication, especially for agricultural production; afterwards, nine agricultural regions were considered. Recently, the governmental body responsible for data collection and dissemination (Turkish Institute of Statistics, TIS) has used 12 classes comprising 11 sub-regions and Istanbul province. This study evaluates these classification efforts based on the acreage of ten main crops over a ten-year period (1996-2005). The panel data grouped into 11 sub-regions were evaluated by cluster analysis and multivariate statistical methods. It was concluded that, from the agricultural production point of view, it is more meaningful to consider three main and eight sub-agricultural regions throughout the country.
Keywords: agricultural region, factorial analysis, cluster analysis
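The grouping step can be sketched with hierarchical clustering on crop-acreage profiles. The region names follow Turkey's conventional main regions, but the acreage shares below are invented for the example.

```python
# Hedged sketch: Ward clustering of regions by crop-acreage profiles.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

regions = ["Marmara", "Aegean", "Black Sea", "C. Anatolia", "E. Anatolia"]
# rows: regions; columns: hypothetical acreage shares of ten main crops
X = np.array([
    [0.30, 0.10, 0.05, 0.20, 0.05, 0.10, 0.05, 0.05, 0.05, 0.05],
    [0.25, 0.15, 0.05, 0.15, 0.10, 0.10, 0.05, 0.05, 0.05, 0.05],
    [0.05, 0.05, 0.40, 0.10, 0.10, 0.05, 0.10, 0.05, 0.05, 0.05],
    [0.45, 0.05, 0.02, 0.08, 0.05, 0.15, 0.05, 0.05, 0.05, 0.05],
    [0.40, 0.05, 0.02, 0.08, 0.05, 0.20, 0.05, 0.05, 0.05, 0.05],
])
Z = linkage(X, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")
for r, c in zip(regions, labels):
    print(r, "-> cluster", c)
```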
Procedia PDF Downloads 416
4473 Evaluation of the Factors Affecting Violence Against Women (Case Study: Couples Referring to Family Counseling Centers in Tehran)
Authors: Hassan Manouchehri
Abstract:
The present study aimed to identify and evaluate the factors affecting violence against women. The statistical population included all couples referred to family counseling centers in Tehran due to domestic violence during the past year. A total of 305 people were selected as the statistical sample using simple random sampling and Cochran's formula for an unlimited population. A researcher-made questionnaire of 110 items was used for data collection. Its face and content validity were confirmed by 30 experts, and reliability above 0.7 was obtained for all studied variables in a preliminary test with 30 subjects, which is acceptable. Data were analyzed with descriptive statistics in SPSS version 22, while inferential analysis used structural equation modeling in Smart PLS version 2. A review of the theoretical framework and of domestic and foreign studies indicated that, in general, four main factors underlie violence against women: cultural and social, economic, legal, and medical. The structural equation modeling findings confirmed that cultural and social, economic, legal, and medical factors all affect violence against women.
Keywords: violence against women, cultural and social factors, economic factors, legal factors, medical factors
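Cochran's formula for an unlimited population is n0 = z²p(1-p)/e². A quick sketch is below; with the textbook defaults (95% confidence, p = 0.5, 5% margin) it yields about 385, so the study's n = 305 implies different inputs, which the abstract does not state.

```python
# Hedged sketch of Cochran's sample-size formula for unlimited populations.
from math import ceil
from statistics import NormalDist

def cochran(confidence=0.95, p=0.5, e=0.05):
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return ceil(z**2 * p * (1 - p) / e**2)

print(cochran())   # ~385 at 95% confidence and 5% margin of error
```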
Procedia PDF Downloads 141
4472 Extended Kalman Filter and Markov Chain Monte Carlo Method for Uncertainty Estimation: Application to X-Ray Fluorescence Machine Calibration and Metal Testing
Authors: S. Bouhouche, R. Drai, J. Bast
Abstract:
This paper is concerned with a method for uncertainty evaluation of steel sample content using the X-ray fluorescence (XRF) method. The considered method of analysis is a comparative technique based on XRF; the calibration step assumes adequate knowledge of the chemical composition of the analyzed metallic sample. This work proposes a new combined approach using the Kalman filter and Markov chain Monte Carlo (MCMC) for uncertainty estimation of steel content analysis. The Kalman filter algorithm is extended to identify the model of the chemical analysis process using the main factors affecting the analysis results; in this case, the estimated states are reduced to the model parameters. MCMC is a stochastic method that computes the statistical properties of the considered states, such as the probability distribution function (PDF), from the initial state and the target distribution using a Monte Carlo simulation algorithm. The conventional approach is based on linear correlation; the uncertainty budget is established for steel Mn (wt%), Cr (wt%), Ni (wt%) and Mo (wt%) content, respectively. A comparative study between the conventional procedure and the proposed method is given. Such approaches are applied to construct an accurate computing procedure for uncertainty measurement.
Keywords: Kalman filter, Markov chain Monte Carlo, x-ray fluorescence calibration and testing, steel content measurement, uncertainty measurement
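The MCMC idea can be made concrete with a minimal random-walk Metropolis sampler for a calibration slope's posterior. The linear model, noise level and data below are hypothetical, not the XRF system's.

```python
# Hedged sketch: Metropolis sampling of a calibration slope's posterior.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.2, 2.0, 10)                  # certified Mn content (wt%)
y = 1.05 * x + rng.normal(0, 0.03, x.size)     # simulated instrument response

def log_post(theta):                           # flat prior, Gaussian noise
    return -np.sum((y - theta * x) ** 2) / (2 * 0.03**2)

samples, theta = [], 1.0
for _ in range(20000):
    prop = theta + rng.normal(0, 0.01)         # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

post = np.array(samples[5000:])                # discard burn-in
print(f"slope = {post.mean():.4f} +/- {post.std():.4f}")
```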
Procedia PDF Downloads 283
4471 Roasting Process of Sesame Seeds Modelling Using Gene Expression Programming: A Comparative Analysis with Response Surface Methodology
Authors: Alime Cengiz, Talip Kahyaoglu
Abstract:
The roasting process is of major importance for obtaining the desired aromatic taste of nuts. In this study, two kinds of roasting were applied to hulled sesame seeds: vacuum oven and hot air roasting. The efficiency of Gene Expression Programming (GEP), a new soft-computing evolutionary technique that describes cause-and-effect relationships in data modelling, and of response surface methodology (RSM) was examined in modelling the roasting processes over a range of temperatures (120-180°C) for various times (30-60 min). Color attributes (L*, a*, b*, Browning Index (BI)), textural properties (hardness and fracturability) and moisture content were evaluated and modelled by RSM and GEP. The GEP-based formulations and the RSM approach were compared with experimental results and evaluated according to correlation coefficients. The results showed that both GEP and RSM were able to adequately learn the relation between roasting conditions and the physical and textural parameters of roasted seeds. However, GEP had better prediction performance than RSM, with high correlation coefficients (R² > 0.92) for all quality parameters. This result indicates that soft-computing techniques have a better capability of describing the physical changes occurring in sesame seeds during roasting.
Keywords: genetic expression programming, response surface methodology, roasting, sesame seed
Procedia PDF Downloads 418
4470 Predictive Maintenance of Industrial Shredders: Efficient Operation through Real-Time Monitoring Using Statistical Machine Learning
Authors: Federico Pittino, Thomas Arnold
Abstract:
The shredding of waste materials is a key step in the recycling process towards the circular economy. Industrial shredders for waste processing operate in very harsh conditions, leading to the need for frequent maintenance of critical components. Maintenance optimization is particularly important also to increase the machine's efficiency, thereby reducing operational costs. In this work, a monitoring system was developed and deployed on an industrial shredder located at a waste recycling plant in Austria. The machine was monitored for one year, and methods for predictive maintenance were developed for two key components: the cutting knives and the drive belt. The large amount of collected data is leveraged by statistical machine learning techniques, which do not require very detailed knowledge of the machine or its live operating conditions. The results show that, despite the wide range of operating conditions, a reliable estimate of the optimal time for maintenance can be derived. Moreover, the trade-off between the cost of maintenance and the increase in power consumption due to the wear state of the monitored components is investigated. This work proves the benefits of a real-time monitoring system for the efficient operation of industrial shredders.
Keywords: predictive maintenance, circular economy, industrial shredder, cost optimization, statistical machine learning
Procedia PDF Downloads 125
4469 Development of Geo-computational Model for Analysis of Lassa Fever Dynamics and Lassa Fever Outbreak Prediction
Authors: Adekunle Taiwo Adenike, I. K. Ogundoyin
Abstract:
Lassa fever, a neglected tropical disease caused by the Lassa virus, has become a significant public health issue in Nigeria, the country with the greatest burden in Africa. This paper presents a geo-computational model for the analysis and prediction of Lassa fever dynamics and outbreaks in Nigeria. The model investigates the dynamics of the virus with respect to environmental factors and human populations, confirming the role of the rodent host in virus transmission and identifying the influence of climate and human population. The proposed methodology is carried out on a Linux operating system using the OSGeoLive virtual machine for geographical computing, which serves as a base for spatial-ecology computing. The model design uses the Unified Modeling Language (UML), and performance evaluation uses machine learning algorithms such as random forests, fuzzy logic, and neural networks. The study aims to contribute to the control of Lassa fever, which is achievable through the combined efforts of public health professionals and geo-computational and machine learning tools. The research findings will potentially be more readily accepted and utilized by decision-makers working towards the elimination of Lassa fever.
Keywords: geo-computational model, Lassa fever dynamics, Lassa fever, outbreak prediction, Nigeria
Procedia PDF Downloads 94
4468 Improved Multi–Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation
Authors: Shonak Bansal, Prince Jain, Arun Kumar Singh, Neena Gupta
Abstract:
Recently, nature-inspired algorithms have come into widespread use for tough and time-consuming multi-objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One major application of OGRs is as an unequally spaced channel-allocation algorithm in optical wavelength division multiplexing (WDM) systems, minimizing the adverse four-wave mixing (FWM) crosstalk effect. The simulation results conclude that the proposed optimization algorithm has superior performance compared to existing conventional computing and nature-inspired optimization algorithms in finding OGRs, in terms of ruler length, total optical channel bandwidth and computation time.
Keywords: channel allocation, conventional computing, four–wave mixing, nature–inspired algorithm, optimal Golomb ruler, lévy flight distribution, optimization, improved multi–objective firefly algorithms, Pareto optimal
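The objective being optimized is easy to state in code: a ruler is Golomb if all pairwise mark distances are distinct, and an OGR is the shortest such ruler for a given number of marks. A small check, using the known optimal 4-mark ruler:

```python
# Sketch of the Golomb property check used to score candidate rulers.
from itertools import combinations

def is_golomb(marks):
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))   # all pairwise distances unique

ruler = (0, 1, 4, 6)                       # known optimal 4-mark ruler
print(is_golomb(ruler), "length =", max(ruler) - min(ruler))
```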
Procedia PDF Downloads 322
4467 Dicotyledon Weed Quantification Algorithm for Selective Herbicide Application in Maize Crops: Statistical Evaluation of the Potential Herbicide Savings
Authors: Morten Stigaard Laursen, Rasmus Nyholm Jørgensen, Henrik Skov Midtiby, Anders Krogh Mortensen, Sanmohan Baby
Abstract:
This work contributes a statistical model and simulation framework yielding the best possible estimate of the potential herbicide reduction when using the MoDiCoVi algorithm, while requiring an efficacy comparable to conventional spraying. In June 2013, a maize field located in Denmark was seeded. The field was divided into parcels assigned to one of two main groups: 1) control, consisting of subgroups of no spray and full-dose spray; 2) the MoDiCoVi algorithm, subdivided into five different leaf-cover thresholds for spray activation. In addition, approximately 25% of the parcels were seeded with additional weeds perpendicular to the maize rows. In total, 299 parcels were randomly assigned to the 28 different treatment combinations. In the statistical analysis, bootstrapping was used to balance the number of replicates. The achieved potential herbicide saving was found to be 70% to 95%, depending on the initial weed coverage. However, additional field trials covering more seasons and locations are needed to verify the generalisation of these results. There is potential for further herbicide savings, as the time interval between the first and second spraying sessions was not long enough for the weeds to turn yellow; instead, they only stagnated in growth.
Keywords: herbicide reduction, macrosprayer, weed crop discrimination, site-specific, sprayer boom
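The bootstrapping step can be sketched directly: resample parcel-level outcomes with replacement and read off a percentile confidence interval. The savings values below are invented; only the mechanics are illustrated.

```python
# Hedged sketch of a percentile bootstrap on parcel-level herbicide savings.
import numpy as np

rng = np.random.default_rng(42)
savings = np.array([0.72, 0.81, 0.95, 0.88, 0.69, 0.93, 0.77])  # hypothetical

boot_means = [rng.choice(savings, size=savings.size, replace=True).mean()
              for _ in range(10_000)]
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean saving {savings.mean():.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```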
Procedia PDF Downloads 298
4466 A Molding Surface Auto-inspection System
Authors: Ssu-Han Chen, Der-Baau Perng
Abstract:
The molding process in IC manufacturing secures chips against harm from heat, moisture, or other external forces. While a chip is being molded, defects like cracks, dilapidation, or voids may be embedded in the molding surface. The molding surfaces this study addresses differ from those on the market in that texture similar to defects is present everywhere on the surface. Manual inspection usually passes over low-contrast cracks or voids; hence an automatic optical inspection system for the molding surface is necessary. The proposed system consists of a CCD, a coaxial light, a back light, and a motion control unit. Based on the statistical texture properties of the molding surface, a series of digital image processing and classification procedures is carried out. After training the parameters associated with the above algorithm, experimental results suggest that the accuracy rate is up to 93.75%, contributing to the inspection quality of the IC molding surface.
Keywords: molding surface, machine vision, statistical texture, discrete Fourier transformation
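A plausible form of the statistical-texture step can be sketched with Fourier-domain features; the exact feature set is not spelled out in the abstract, so the features, patch and synthetic defect below are assumptions.

```python
# Hedged sketch: DFT-based statistical texture features for a surface patch.
import numpy as np

def spectral_texture_features(patch):
    f = np.fft.fftshift(np.fft.fft2(patch))
    power = np.abs(f) ** 2
    power /= power.sum()                       # normalized power spectrum
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)       # radial frequency
    mean_freq = (r * power).sum()              # spectral centroid
    entropy = -(power * np.log2(power + 1e-12)).sum()
    return mean_freq, entropy

rng = np.random.default_rng(0)
texture = rng.normal(0, 1, (64, 64))           # stand-in for a camera patch
crack = texture.copy()
crack[30:34, :] += 4.0                         # synthetic low-contrast defect

print(spectral_texture_features(texture))
print(spectral_texture_features(crack))
```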
Procedia PDF Downloads 431
4465 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution
Authors: Masomeh Jamshid Nejad
Abstract:
Nowadays, statistical literacy is no longer merely a useful skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called a bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in economics and business management to visualize and work with data following a normal distribution. Since technology is now interconnected with education, it is important to teach statistics topics in the context of Python, R-Studio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners' knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students from the Business Management program were compared: one group underwent Excel-based instruction, while the other relied only on traditional teaching methods. We analyzed experimental data and BBA participants' responses to statistics questions focusing on the normal distribution, including its key attributes, the mean and standard deviation. The results indicate that exposing students to Excel-based learning supports them in comprehending statistical concepts more effectively than traditional teaching does. In addition, students receiving Excel-based instruction showed greater ability in picturing and interpreting data following a normal distribution.
Keywords: statistics, excel-based instruction, data visualization, pedagogy
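The quantities students compute with Excel's NORM.DIST and NORM.INV can be reproduced with any statistics library, which is one way to check classroom worksheets. The mean and standard deviation below are hypothetical.

```python
# Hedged sketch mirroring typical Excel normal-distribution exercises.
from statistics import NormalDist

dist = NormalDist(mu=100, sigma=15)      # hypothetical mean and std dev
print(dist.cdf(130) - dist.cdf(70))      # P(70 < X < 130), ~0.954 (2 sigma)
print(dist.inv_cdf(0.95))                # 95th percentile, ~124.7
```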
Procedia PDF Downloads 53