Search results for: accurate data
25591 Side Effects of Dental Whitening: Published Data from the Literature
Authors: Ilma Robo, Saimir Heta, Emela Dalloshi, Nevila Alliu, Vera Ostreni
Abstract:
The dental whitening process, although a minimally invasive dental treatment, has effects on the dental structure and on the pulp of the tooth where it is applied. An electronic search was performed using keywords to find articles published within the last 10 years about the side effects, assessed as such, of minimally invasive dental bleaching treatment. Methodology: In the selected articles, the other aim of the study was to evaluate the side effects of bleaching according to the percentage and type of solution used, the latter being assessed on the basis of the base solution used for bleaching. Results: The side effects of bleaching are evaluated in the selected articles depending on the method of application, that is, whether bleaching is carried out with recommended solutions or with mixtures of alternative solutions or substances based on information found on the Internet. Short conclusion: The dental bleaching process has side effects which have not yet been definitively evaluated experimentally in large samples of individuals or animals (mice or cattle) to arrive at accurate numerical conclusions. The number of publications on this topic has been increasing in recent years, as has the demand for aesthetic facial treatments, including dental ones.
Keywords: teeth whitening, side effects, permanent teeth, formed dental apex
Procedia PDF Downloads 61
25590 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform
Authors: Sadam Alwadi
Abstract:
Outlier values are a problem that frequently occurs in the data observation or recording process, so data imputation has become an essential matter. This work makes use of the methods described in prior work to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey method and the Maximum Overlapping Discrete Wavelet Transform (MODWT) are used to detect and impute the outlier values.
Keywords: outlier values, imputation, stock market data, detecting, estimation
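As a sketch of the Tukey step, the snippet below flags values outside the interquartile fences and imputes them with the median of the remaining data; the fence multiplier of 1.5 and the toy price series are assumptions, not values from the paper, and the MODWT stage is omitted.

```python
import numpy as np

def tukey_outliers(prices, k=1.5):
    """Flag values outside the Tukey fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.percentile(prices, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return (prices < lower) | (prices > upper), (lower, upper)

# Hypothetical closing-price series with two injected spikes.
prices = np.array([5.1, 5.2, 5.0, 5.3, 9.8, 5.2, 5.1, 0.7, 5.4])
mask, fences = tukey_outliers(prices)

# Simple imputation: replace flagged values with the median of the clean values.
imputed = np.where(mask, np.median(prices[~mask]), prices)
print("fences:", fences, "outlier indices:", np.where(mask)[0])
print("imputed series:", imputed)
```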
Procedia PDF Downloads 80
25589 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage
Authors: P. Jayashree, S. Rajkumar
Abstract:
With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction comes as a handy solution to meet both requirements. Most data compression techniques are based on data statistics and may result in either lossy or lossless data reduction. Though lossy reduction produces better compression ratios than lossless methods, many applications require data accuracy and minute details to be preserved. A variety of data compression algorithms exist in the literature for different forms of data such as text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is projected as an enhancement over the irrational number storage coding technique to address the storage issues of increasing data volumes as a cost-effective solution, which also offers data security as a secondary outcome to some extent. The proposed work reveals cost effectiveness in terms of a better compression ratio with no deterioration in compression time.
Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding
Procedia PDF Downloads 292
25588 IoT Device Cost Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework
Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe
Abstract:
This paper focused on a cost-effective storage architecture using a fog and cloud data storage gateway and presented the design of the framework for the data privacy model and the data analytics framework for real-time analysis using a machine learning method. The paper began with the system analysis, the system architecture and its component design, as well as the overall system operations. The results obtained from this study on the data privacy model show that when two or more data privacy models are combined, the data enjoy stronger privacy protection, and that a fog storage gateway has several advantages over traditional cloud storage: our results show that fog offers reduced latency/delay and lower bandwidth and energy consumption compared with cloud storage and will therefore help to lessen excessive cost. This paper dwelt mostly on the system descriptions; the researchers focused on the research design and the framework design for the data privacy model, data storage, and real-time analytics. This paper also shows the major system components and their framework specification. Lastly, the overall research system architecture is shown, along with its structure and interrelationships.
Keywords: IoT, fog, cloud, data analysis, data privacy
Procedia PDF Downloads 96
25587 Support Vector Regression Combined with Different Optimization Algorithms to Predict Global Solar Radiation on Horizontal Surfaces in Algeria
Authors: Laidi Maamar, Achwak Madani, Abdellah El Ahdj Abdellah
Abstract:
The aim of this work is to use Support Vector Regression (SVR) combined with the dragonfly, firefly, bee colony, and particle swarm optimization algorithms to predict global solar radiation on horizontal surfaces in some cities in Algeria. Combining these optimization algorithms with SVR aims principally to enhance accuracy by fine-tuning the parameters, speeding up the convergence of the SVR model, and exploring a larger search space efficiently; these parameters are the regularization parameter (C), the kernel parameters, and the epsilon parameter. By doing so, the aim is to improve the generalization and predictive accuracy of the SVR model. Overall, the aim is to leverage the strengths of both SVR and the optimization algorithms to create a more powerful and effective regression model for various cities and under different climate conditions. Results demonstrate close agreement between predicted and measured data in terms of different metrics. In summary, SVR has proven to be a valuable tool in modeling global solar radiation, offering accurate predictions and demonstrating versatility when combined with other algorithms or used in hybrid forecasting models.
Keywords: support vector regression (SVR), optimization algorithms, global solar radiation prediction, hybrid forecasting models
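As a sketch of how one of the named optimizers can tune the SVR parameters, the snippet below runs a minimal particle swarm over (C, gamma, epsilon) using cross-validated error as the fitness; the synthetic data, swarm settings, and search ranges are illustrative assumptions rather than the configuration used in the study.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: daily meteorological features -> global solar radiation.
X = rng.normal(size=(200, 4))
y = 15 + X @ np.array([3.0, -1.0, 0.5, 2.0]) + rng.normal(scale=1.0, size=200)

def fitness(params):
    """Negative cross-validated MAE of an RBF SVR with the given (C, gamma, epsilon)."""
    C, gamma, eps = params
    model = SVR(kernel="rbf", C=C, gamma=gamma, epsilon=eps)
    return cross_val_score(model, X, y, cv=3,
                           scoring="neg_mean_absolute_error").mean()

# Minimal particle swarm over log10(C), log10(gamma), epsilon.
low, high = np.array([-1.0, -3.0, 0.01]), np.array([3.0, 1.0, 1.0])
pos = rng.uniform(low, high, size=(15, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness([10**p[0], 10**p[1], p[2]]) for p in pos])
gbest = pbest[pbest_val.argmax()]

for _ in range(20):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, low, high)
    vals = np.array([fitness([10**p[0], 10**p[1], p[2]]) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()]

print("best log10(C), log10(gamma), epsilon:", gbest, "CV score:", pbest_val.max())
```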
Procedia PDF Downloads 34
25586 Comparison of Selected Pier-Scour Equations for Wide Piers Using Field Data
Authors: Nordila Ahmad, Thamer Mohammad, Bruce W. Melville, Zuliziana Suif
Abstract:
Current methods for predicting local scour at wide bridge piers were developed on the basis of laboratory studies, and very few scour predictions have been tested against field data. A laboratory wide-pier scour equation from previous findings is compared with field data. A wide range of field data was used, consisting of both live-bed and clear-water scour. A method for assessing the quality of the data was developed and applied to the data set. Three other wide-pier scour equations from the literature were used to compare the performance of each predictive method. The best-performing scour equation was identified using statistical analysis. Comparisons of computed and observed scour depths indicate that the equation from the previous publication produced the smallest discrepancy ratio and RMSE value when compared with the large amount of laboratory and field data.
Keywords: field data, local scour, scour equation, wide piers
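To make the two reported metrics concrete, the sketch below computes RMSE and a discrepancy ratio (here taken as predicted over observed scour depth; the exact definition and the sample depths are assumptions) for a set of predicted versus observed values.

```python
import numpy as np

# Hypothetical observed and predicted local scour depths (m).
observed = np.array([1.8, 2.4, 0.9, 3.1, 1.2])
predicted = np.array([2.0, 2.1, 1.1, 2.8, 1.5])

rmse = np.sqrt(np.mean((predicted - observed) ** 2))
discrepancy_ratio = predicted / observed  # values near 1.0 indicate good agreement

print(f"RMSE = {rmse:.3f} m")
print("discrepancy ratios:", np.round(discrepancy_ratio, 2),
      "mean:", round(discrepancy_ratio.mean(), 2))
```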
Procedia PDF Downloads 411
25585 A Mathematical Model Approach Regarding the Children’s Height Development with Fractional Calculus
Authors: Nisa Özge Önal, Kamil Karaçuha, Göksu Hazar Erdinç, Banu Bahar Karaçuha, Ertuğrul Karaçuha
Abstract:
The study aims to use a mathematical approach based on fractional calculus, developed to continuously analyze the factors related to children’s height development. Tracking the development of the child is becoming ever more important and meaningful. Knowing and determining the factors related to the physical development of the child at any desired time would provide better, more reliable and more accurate results for childcare. In this frame, the 7 height percentile curves (3rd, 10th, 25th, 50th, 75th, 90th, and 97th) of Turkey are used. By using discrete height data of 0-18 year-old children and the least squares method, a continuous curve valid for any time interval is developed. By doing so, at any desired instant, it is possible to find the percentage and location of the child in the percentile chart. Here, with the help of fractional calculus theory, a mathematical model is developed. The outcomes of the proposed approach are quite promising compared to the linear and the polynomial methods. The approach also makes it possible to predict the expected height of children.
Keywords: children growth percentile, children physical development, fractional calculus, linear and polynomial model
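The polynomial baseline mentioned as a comparator can be sketched as a simple least-squares fit of discrete percentile data to a continuous curve; the ages, heights, and polynomial degree below are illustrative assumptions, and the fractional-calculus model itself is not reproduced here.

```python
import numpy as np

# Hypothetical 50th-percentile heights (cm) sampled at discrete ages (years).
ages = np.array([0, 1, 2, 4, 6, 9, 12, 15, 18], dtype=float)
heights = np.array([50, 75, 87, 103, 116, 134, 151, 170, 176], dtype=float)

# Least-squares fit of a cubic polynomial gives a continuous curve h(t).
coeffs = np.polyfit(ages, heights, deg=3)
curve = np.poly1d(coeffs)

# The fitted curve can now be evaluated at any (possibly non-integer) age.
for t in (2.5, 7.3, 16.0):
    print(f"age {t:4.1f} y -> estimated height {curve(t):6.1f} cm")
```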
Procedia PDF Downloads 147
25584 Recognizing and Prioritizing Effective Factors on Productivity of Human Resources Through Using Technique for Order of Preference by Similarity to Ideal Solution Method
Authors: Amirmehdi Dokhanchi, Babak Ziyae
Abstract:
Studying and prioritizing the factors that affect the productivity of human resources using the TOPSIS method is the main aim of the present research study. For this reason, while reviewing the concepts of productivity, the effective factors were studied. Managers, supervisors, staff and personnel of the Tabriz Tractor Manufacturing Company are the subjects of this study. Of the total population, 160 individuals were selected as subjects through random sampling. Two questionnaires were used for collecting data in this study. The factors with the highest effect on productivity were recognized through the application of software packages. The TOPSIS method was used for prioritizing the recognized factors. For this reason, the second questionnaire was made available to the statistical sample to study the effect of each factor on predetermined indicators, and the decision-making matrix was thereby obtained. The result of prioritizing the factors shows that the existence of an accurate organizational strategy, a high level of occupational skill, application of a partnership and contribution system, on-the-job training services, high quality of occupational life, dissemination of an appropriate organizational culture, encouragement of creativity and innovation, and environmental factors are prioritized, respectively.
Keywords: productivity of human resources, productivity indicators, TOPSIS, prioritizing factors
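A minimal sketch of the TOPSIS ranking step is shown below: the decision matrix is vector-normalized, weighted, compared with the ideal and anti-ideal solutions, and closeness coefficients are computed. The matrix values and criterion weights are invented for illustration and do not come from the questionnaires.

```python
import numpy as np

# Hypothetical decision matrix: rows = factors, columns = benefit-type indicators.
X = np.array([[7.0, 8.0, 6.5],
              [6.0, 7.5, 7.0],
              [8.0, 6.0, 7.5],
              [5.5, 7.0, 6.0]])
weights = np.array([0.5, 0.3, 0.2])  # assumed criterion weights, sum to 1

# 1. Vector normalization and weighting.
V = weights * X / np.linalg.norm(X, axis=0)

# 2. Ideal and anti-ideal solutions (all criteria treated as benefits here).
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)

# 3. Euclidean distances and closeness coefficients.
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_neg / (d_pos + d_neg)

ranking = np.argsort(-closeness)  # higher closeness = higher priority
print("closeness:", np.round(closeness, 3), "ranking (best first):", ranking)
```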
Procedia PDF Downloads 331
25583 Interaction with Earth’s Surface in Remote Sensing
Authors: Spoorthi Sripad
Abstract:
Remote sensing is a powerful tool for acquiring information about the Earth's surface without direct contact, relying on the interaction of electromagnetic radiation with various materials and features. This paper explores the fundamental principle of "Interaction with Earth's Surface" in remote sensing, shedding light on the intricate processes that occur when electromagnetic waves encounter different surfaces. The absorption, reflection, and transmission of radiation generate distinct spectral signatures, allowing for the identification and classification of surface materials. The paper delves into the significance of the visible, infrared, and thermal infrared regions of the electromagnetic spectrum, highlighting how their unique interactions contribute to a wealth of applications, from land cover classification to environmental monitoring. The discussion encompasses the types of sensors and platforms used to capture these interactions, including multispectral and hyperspectral imaging systems. By examining real-world applications, such as land cover classification and environmental monitoring, the paper underscores the critical role of understanding the interaction with the Earth's surface for accurate and meaningful interpretation of remote sensing data.
Keywords: remote sensing, earth's surface interaction, electromagnetic radiation, spectral signatures, land cover classification, archeology and cultural heritage preservation
Procedia PDF Downloads 56
25582 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol
Authors: Inkyu Kim, SangMan Moon
Abstract:
The IEEE 802.11b protocol provides a data rate of up to 11 Mbps, whereas the aerospace industry seeks a higher-data-rate COTS data link system for the UAV. The Total Maximum Throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation flight data link using existing 802.11b performance theory. We operate the UAV formation flight with more than 30 quadcopters using the 802.11b protocol. We predict that the number of UAVs in formation flight is bounded by the data link protocol's performance limitations.
Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application
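As an illustration of a theoretical maximum-throughput calculation for 802.11b, the sketch below sums the fixed per-frame overheads around a maximum-size payload sent at 11 Mbps; the timing constants are commonly cited 802.11b values (long PLCP preamble, no RTS/CTS, mean backoff of half the minimum contention window) and should be treated as assumptions rather than the exact figures used in the paper.

```python
# Commonly cited IEEE 802.11b timing parameters (assumed; long PLCP preamble, DSSS).
SLOT = 20e-6          # slot time (s)
SIFS = 10e-6          # short interframe space (s)
DIFS = 50e-6          # DCF interframe space (s)
PLCP = 192e-6         # PLCP preamble + header (s)
CW_MIN = 31           # minimum contention window (slots)

DATA_RATE = 11e6      # payload/MAC rate (bit/s)
ACK_RATE = 2e6        # control-frame rate (bit/s), assumed
MAC_HEADER = 28 * 8   # MAC header + FCS (bits)
ACK_FRAME = 14 * 8    # ACK frame (bits)
PAYLOAD = 1500 * 8    # maximum-size payload (bits)

mean_backoff = (CW_MIN / 2) * SLOT
cycle = (DIFS + mean_backoff
         + PLCP + (MAC_HEADER + PAYLOAD) / DATA_RATE
         + SIFS + PLCP + ACK_FRAME / ACK_RATE)

tmt = PAYLOAD / cycle
print(f"per-frame cycle: {cycle * 1e6:.0f} us, "
      f"theoretical max throughput: {tmt / 1e6:.2f} Mbps")
```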
Procedia PDF Downloads 391
25581 Methods for Distinction of Cattle Using Supervised Learning
Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl
Abstract:
Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction, based on models derived from existing data. The data can present identification patterns which are used to classify it into groups. The result of the analysis is a pattern which can be used to identify a data set without the need to obtain the input data used to create this pattern. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In case of missing pedigree information, other methods can be used for tracing an animal's origin. The genetic diversity captured in genetic data holds relatively useful information for identifying animals originating from individual countries. We can conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.
Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning
Procedia PDF Downloads 549
25580 Router 1X3 - RTL Design and Verification
Authors: Nidhi Gopal
Abstract:
Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device, its top-level architecture, and how the various sub-modules of the router, i.e. register, FIFO, FSM and synchronizer, are synthesized, simulated and finally connected to the top module.
Keywords: data packets, networking, router, routing
Procedia PDF Downloads 811
25579 A Convolution Neural Network Approach to Predict Pes-Planus Using Plantar Pressure Mapping Images
Authors: Adel Khorramrouz, Monireh Ahmadi Bani, Ehsan Norouzi, Morvarid Lalenoor
Abstract:
Background: Plantar pressure distribution measurement has long been used to assess foot disorders. Plantar pressure is an important component affecting foot and ankle function, and changes in plantar pressure distribution can indicate various foot and ankle disorders. Morphologic and mechanical properties of the foot may be important factors affecting the plantar pressure distribution. Accurate and early measurement may help to reduce the prevalence of pes planus. With recent developments in technology, new techniques such as machine learning have been used to assist clinicians in predicting patients with foot disorders. Significance of the study: This study proposes a neural-network-based flat foot classification methodology using static foot pressure distribution. Methodologies: Data were collected from 895 patients who were referred to a foot clinic due to foot disorders. Patients with pes planus were labeled by an experienced physician based on clinical examination. Then all subjects (with and without pes planus) were evaluated for static plantar pressure distribution. Patients who were diagnosed with flat foot in both feet were included in the study. In the next step, the leg length was normalized and the network was trained on the plantar pressure mapping images. Findings: From a total of 895 image data, 581 were labeled as pes planus. A convolutional neural network (CNN) was run to evaluate the performance of the proposed model. The prediction accuracy of the basic CNN-based model was assessed and the prediction model was derived through the proposed methodology. In the basic CNN model, the training accuracy was 79.14%, and the test accuracy was 72.09%. Conclusion: This model can be easily and simply used by patients with pes planus and doctors to predict the classification of pes planus and to prescreen for possible musculoskeletal disorders related to this condition. However, more models need to be considered and compared for higher accuracy.
Keywords: foot disorder, machine learning, neural network, pes planus
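A minimal sketch of a CNN classifier of the kind described is given below; the image size, layer sizes, and random placeholder data are assumptions, since the paper's actual architecture is not specified in the abstract.

```python
import numpy as np
import tensorflow as tf

# Hypothetical plantar pressure maps: 64x64 single-channel images, binary label (pes planus or not).
rng = np.random.default_rng(0)
X = rng.random((32, 64, 64, 1)).astype("float32")   # placeholder for real pressure maps
y = rng.integers(0, 2, size=32)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of pes planus
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=8, validation_split=0.25)
```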
Procedia PDF Downloads 358
25578 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have become a major security threat, and it is important to impede such intrusions. The hindrance of such intrusions entirely relies on their detection, which is the primary concern of any security tool like an Intrusion Detection System (IDS). Therefore, it is imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance. The performance of an IDS can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw data set for classification. The classifier may get confused due to redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Pattern (LBP) can be applied to transform raw features into a principal feature space and to select features based on their sensitivity. Eigenvalues can be used to determine the sensitivity. To further refine the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is used to perform classification. For classification purposes, a Support Vector Machine (SVM) and a Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms the existing approaches and has the capability to minimize the number of features and maximize the detection rates.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP)
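The feature-reduction-then-classify idea can be sketched with a PCA + SVM pipeline as below; the synthetic feature matrix and the number of retained components are assumptions, and the PSO-based feature selection stage is omitted for brevity.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in for KDD'99-style records: 41 numeric features, binary label (attack / normal).
X = rng.normal(size=(500, 41))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Scale, project onto the leading principal components, then classify with an SVM.
pipeline = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf", C=1.0))
pipeline.fit(X_train, y_train)
print("detection accuracy on held-out data:", round(pipeline.score(X_test, y_test), 3))
```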
Procedia PDF Downloads 364
25577 Applying Neural Networks for Solving Record Linkage Problem via Fuzzy Description Logics
Authors: Mikheil Kalmakhelidze
Abstract:
The record linkage (RL) problem has become more and more important in recent years due to the growing interest in big data analysis. The problem can be formulated in a very simple way: given two entries a and b of a database, decide whether they represent the same object or not. There are two classical ways of solving the RL problem, deterministic and probabilistic. Using a simple Bayes classifier in many cases produces useful results, but sometimes the results turn out to be poor. In recent years, several successful approaches have been made towards solving specific RL problems with neural network algorithms, including the single layer perceptron, the multilayer back-propagation network, etc. In our work, we model the RL problem for a specific dataset of student applications in fuzzy description logic (FDL), where the linkage of a specific pair (a, b) depends on the truth value of the corresponding formula A(a, b) in a canonical FDL model. As a main result, we build a neural network for deciding the truth value of FDL formulas in a canonical model and thus link the RL problem to machine learning. We apply the approach to a dataset with 10000 entries and also compare it to classical RL solving approaches. The results show it to be more accurate than the standard probabilistic approach.
Keywords: description logic, fuzzy logic, neural networks, record linkage
Procedia PDF Downloads 272
25576 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information in relation to their dynamic interests. Current research works consider noise as any data that does not form part of the main web page and propose noise web data reduction tools which mainly focus on eliminating noise in relation to the content and layout of web data. This paper argues that not all data that forms part of the main web page is of interest to a user, and not all noise data is actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile, but also a decrease in the loss of useful information, and hence improves the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented. The results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
Procedia PDF Downloads 263
25575 The Profitability Management Mechanism of Leather Industry-Based on the Activity-Based Benefit Approach
Authors: Mei-Fang Wu, Shu-Li Wang, Tsung-Yueh Lu, Feng-Tsung Cheng
Abstract:
Strengthening core competitiveness is the main goal of enterprises in a fiercely competitive environment. Accurate cost information is a great help for managers in dealing with operational strategies. This paper establishes a profitability management mechanism that applies the Activity-Based Benefit Approach (ABBA) to determine the profitability of each customer from the market. ABBA provides not only financial and non-financial information for the operation but also indicates what resources have expired in the operational process. The customer profit management model shows the level of profitability of each customer for the company. The empirical data were gathered from a case company operating in the leather industry in Taiwan. The research findings indicate that 30% of customers create little profit for the company as a result of asking for sales discounts of over 5%. Those customers ask for sales discounts because of color differences in the leather products. This paper provides a customer profitability evaluation mechanism to help enterprises greatly improve operating effectiveness and promote operational activity efficiency and overall operating profitability.
Keywords: activity-based benefit approach, customer profit analysis, leather industry, profitability management mechanism
Procedia PDF Downloads 304
25574 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study
Authors: Zeba Mahmood
Abstract:
Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in the field of information technology systems have changed the way of conducting business today. Business operations today rely more on the data they obtain, and this data is continuously increasing in volume. The data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain and then convert data into useful formats for decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketers and customer relationship departments of firms use data mining techniques to make relevant decisions; this paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues of executing these techniques are also discussed and critically analyzed in this paper.
Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining
Procedia PDF Downloads 537
25573 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of data sets so large and complex that it becomes difficult to process using database management tools. Operations like search, analysis and visualization on big data are performed using data mining, which is the process of extracting patterns or knowledge from large data sets. In recent years, data mining applications have become stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results; it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also more sophisticated iterative computation, which is widely used in data mining applications, and incorporates a set of novel techniques to reduce I/O overhead for accessing preserved fine-grain computation states. To optimize the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
Keywords: big data, map reduce, incremental processing, iterative computation
Procedia PDF Downloads 349
25572 Pathological Disparities in Patients Diagnosed with Prostate Imaging Reporting and Data System 3 Lesions: A Retrospective Study in a High-Volume Academic Center
Authors: M. Reza Roshandel, Tannaz Aghaei Badr, Batoul Khoundabi, Sara C. Lewis, Soroush Rais-Bahrami, John Sfakianos, Reza Mehrazin, Ash K. Tewari
Abstract:
Introduction: Prostate biopsy is the most reliable diagnostic method for choosing the appropriate management of prostate cancer. However, discrepancies between the Gleason grade groups (GG) of different biopsies remain a significant concern. This study aims to assess the association of radiological factors with GG discrepancies in patients with index Prostate Imaging Reporting and Data System (PI-RADS) 3 lesions, using radical prostatectomy (RP) specimens as the most accurate and informative pathology. Methods: This single-institutional retrospective study was performed on a total of 2289 consecutive prostate cancer patients with combined targeted and systematic prostate biopsy followed by radical prostatectomy (RP). The database was explored for patients with index PI-RADS 3 lesions under versions 2 and 2.1. Cancers with PI-RADS 4 or 5 scoring were excluded from the study. Patient characteristics and radiologic features were analyzed by multivariable logistic regression. Number-density of lesions was defined as the number of lesions per prostatic volume. Results: Of the 151 prostate cancer cases with PI-RADS 3 index lesions, 27% and 17% had upgrades and downgrades at RP, respectively. Analysis of grade changes showed no significant associations between discrepancies and the number or the number-density of PI-RADS 3 lesions. Moreover, the study showed no significant association of GG changes with race, age, location of the lesions, or prostate volume. Conclusions: This study demonstrated that in PI-RADS 3 cancerous nodules, the chance of a grade change in the final pathology of RP specimens was low. Furthermore, having multiple PI-RADS 3 nodules did not change this conclusion, as the possibility of grade changes in patients with multiple nodules was similar to that in those with solitary lesions.
Keywords: prostate, adenocarcinoma, multiparametric MRI, Gleason score, robot-assisted surgery
Procedia PDF Downloads 131
25571 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Currently, in analyzing large-scale recurrent event data, there are many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data are randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method. This approach is applied to a large real dataset of repeated heart failure hospitalizations.
Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
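The divide-and-combine step can be sketched as below for a simple exponential-rate model: each subset yields its own maximum likelihood estimate, and the estimates are combined with inverse-variance weights. The exponential model and synthetic event times are stand-ins, not the parametric frailty model used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical full data: exponentially distributed gap times between recurrent events.
full_data = rng.exponential(scale=2.0, size=100_000)

# Divide: split the data into K random subsets.
K = 10
subsets = np.array_split(rng.permutation(full_data), K)

# Conquer: per-subset MLE of the rate (1/mean) and its approximate variance (rate^2 / n).
rates = np.array([1.0 / s.mean() for s in subsets])
variances = rates**2 / np.array([len(s) for s in subsets])

# Combine: inverse-variance weighted average of the subset estimators.
weights = (1.0 / variances) / np.sum(1.0 / variances)
combined = np.sum(weights * rates)

print("combined rate estimate:", round(combined, 4),
      "full-data estimate:", round(1.0 / full_data.mean(), 4))
```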
Procedia PDF Downloads 163
25570 Improving Grade Control Turnaround Times with In-Pit Hyperspectral Assaying
Authors: Gary Pattemore, Michael Edgar, Andrew Job, Marina Auad, Kathryn Job
Abstract:
As critical commodities become scarcer, significant time and resources have been devoted to better understanding complicated ore bodies and extracting their full potential. These challenging ore bodies present several pain points for geologists and engineers to overcome; poor handling of these issues flows downstream to the processing plant, affecting throughput rates and recovery. Many open-cut mines utilise blast hole drilling to extract additional information to feed back into the modelling process. This method requires samples to be collected during or after blast hole drilling. Samples are then sent for assay, with turnaround times varying from 1 to 12 days. This method is time-consuming, costly, requires human exposure on the bench, and collects elemental data only. To address this challenge, research has been undertaken to utilise hyperspectral imaging across a broad spectrum to scan samples and collars, or to take down-hole measurements, for mineral and moisture content and grade abundances. Automation of this process using unmanned vehicles and on-board processing reduces human in-pit exposure to ensure ongoing safety. On-board processing allows data to be integrated into modelling workflows with immediacy. The preliminary results demonstrate numerous direct and indirect benefits from this new technology, including rapid and accurate grade estimates, moisture content and mineralogy. These benefits allow for faster geological model updates, better informed mine scheduling, and improved downstream blending and processing practices. The paper presents recommendations for implementation of the technology in open-cut mining environments.
Keywords: grade control, hyperspectral scanning, artificial intelligence, autonomous mining, machine learning
Procedia PDF Downloads 111
25569 Optimization in Locating Firefighting Stations Using GIS Data and AHP Model; A Case Study on Arak City
Authors: Hasan Heydari
Abstract:
In recent decades, the location of urban services has been one of the significant topics in urban planning. Among these considerations, cities require more accurate planning in order to meet citizens' needs, especially in the area of urban safety. In order to attain this goal, one of the main tasks of urban planners and managers is specifying suitable sites for firefighting stations, and this study was carried out for that purpose. Therefore, the effective criteria, consisting of coverage radius, population density, proximity to the pathway network, and land use (compatible and incompatible neighborhoods), were specified. After that, descriptive and locational information for the criteria was provided and their layers were created in ArcGIS 9.3. Using the Analytic Hierarchy Process (AHP), these criteria and their sub-criteria were assigned weights. The layers were classified according to their weights and finally overlaid using the Index Overlay Model to produce the final site-selection map for the firefighting stations of Arak city. The results gained by the analysis in the GIS environment indicate that the existing fire stations do not cover the whole city sufficiently, and that some of the stations have been established on unsuitable sites. The output map indicates the best sites at which to locate the firefighting stations of Arak.
Keywords: site-selection, firefighting stations, analytic hierarchy process (AHP), GIS, index overlay model
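A minimal sketch of the AHP weighting step is given below: a pairwise comparison matrix over the criteria is reduced to priority weights (here via the normalized column-mean approximation) and checked for consistency. The comparison values are invented for illustration and are not the judgments used in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four criteria:
# coverage radius, population density, proximity to pathways, land use.
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

# Approximate priority weights: mean of the column-normalized matrix.
weights = (A / A.sum(axis=0)).mean(axis=1)

# Consistency ratio using the principal eigenvalue and a random index for n = 4.
n = A.shape[0]
lambda_max = np.real(np.linalg.eigvals(A)).max()
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.90  # Saaty's random index for n = 4

print("criterion weights:", np.round(weights, 3))
print("consistency ratio:", round(CR, 3), "(acceptable if below ~0.10)")
```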
Procedia PDF Downloads 347
25568 Utilization of Informatics to Transform Clinical Data into a Simplified Reporting System to Examine the Analgesic Prescribing Practices of a Single Urban Hospital’s Emergency Department
Authors: Rubaiat S. Ahmed, Jemer Garrido, Sergey M. Motov
Abstract:
Clinical informatics (CI) enables the transformation of data into a systematic organization that improves the quality of care and the generation of positive health outcomes. Innovative technology through informatics that compiles accurate data on analgesic utilization in the emergency department (ED) can enhance pain management in this important clinical setting. We aim to establish a simplified reporting system through CI to examine and assess analgesic prescribing practices in the ED through the execution of a U.S. federal grant project on opioid reduction initiatives. Queried data points of interest from a level-one trauma ED's electronic medical records were used to create data sets and develop informational/visual reporting dashboards (in Microsoft Excel and Google Sheets) concerning analgesic usage across several pre-defined parameters and performance metrics using CI. The data were then qualitatively analyzed to evaluate ED analgesic prescribing trends by departmental clinicians and leadership. During a 12-month reporting period (Dec. 1, 2020 – Nov. 30, 2021) for the ongoing project, about 41% of all ED patient visits (N = 91,747) were for pain conditions, of which 81.6% received analgesics in the ED and at discharge (D/C). Of those treated with analgesics, 24.3% received opioids compared to 75.7% receiving opioid alternatives in the ED and at D/C, including non-pharmacological modalities. Demographics showed that, among patients receiving analgesics, 56.7% were aged between 18-64, 51.8% were male, 51.7% were white, and 66.2% had government-funded health insurance. Ninety-one percent of all opioids were prescribed in the ED, with intravenous (IV) morphine, IV fentanyl, and morphine sulfate immediate release (MSIR) tablets accounting for 88.0% of ED-dispensed opioids. With 9.3% of all opioids prescribed at D/C, MSIR was dispensed 72.1% of the time. Hydrocodone, oxycodone, and tramadol usage amounted to only 10-15% of the time, and hydromorphone to 0%. Of the opioid alternatives, non-steroidal anti-inflammatory drugs were utilized 60.3% of the time, local anesthetics and ultrasound-guided nerve blocks 23.5%, and acetaminophen 7.9%, as the primary non-opioid drug categories prescribed by ED providers. Non-pharmacological analgesia included virtual reality and other modalities. An average of 18.5 ED opioid orders and 1.9 opioid D/C prescriptions per 102.4 daily ED patient visits was observed for the period. Compared to other specialties within our institution, 2.0% of opioid D/C prescriptions are given by ED providers, versus the national average of 4.8%. Opioid alternatives accounted for 69.7% and 30.3% of usage, versus 90.7% and 9.3% for opioids, in the ED and at D/C, respectively. There is a pressing need for concise, relevant, and reliable clinical data on analgesic utilization for ED providers and leadership to evaluate prescribing practices and make data-driven decisions. Basic computer software can be used to create effective visual reporting dashboards with indicators that convey relevant and timely information in an easy-to-digest manner. We accurately examined our ED's analgesic prescribing practices using CI through dashboard reporting. Such reporting tools can quickly identify key performance indicators and prioritize data to enhance pain management and promote safe prescribing practices in the emergency setting.
Keywords: clinical informatics, dashboards, emergency department, health informatics, healthcare informatics, medical informatics, opioids, pain management, technology
Procedia PDF Downloads 144
25567 Non-Parametric Changepoint Approximation for Road Devices
Authors: Loïc Warscotte, Jehan Boreux
Abstract:
The scientific literature on changepoint detection is vast. Today, many methods are available to detect abrupt changes or slight drift in a signal, based, for example, on CUSUM or EWMA charts. However, these methods rely on strong assumptions, such as the stationarity of the underlying stochastic process, or even independent and Gaussian distributed noise at each time. Recently, breakthrough research on locally stationary processes has widened the class of studied stochastic processes, with almost no assumptions on the signals and the nature of the changepoint. Despite the accurate description of the mathematical aspects, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection if the characteristics of the process are completely unknown. In this paper, we therefore address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes. Both abrupt changes and drift are tested. Finally, this relaxed methodology is tested on real signals coming from an HS-WIM device in Belgium, collected over several months.
Keywords: changepoint, weigh-in-motion, process, non-parametric
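For reference, the classical CUSUM chart mentioned as a baseline can be sketched in a few lines: cumulative sums of deviations from a target mean are tracked and an alarm is raised when they exceed a threshold. The target, slack, and threshold values below are assumptions for illustration, not the parameters of the weigh-in-motion application.

```python
import numpy as np

def cusum(signal, target, slack=0.5, threshold=5.0):
    """One-sided upper/lower CUSUM; returns the index of the first alarm, or None."""
    s_hi = s_lo = 0.0
    for i, x in enumerate(signal):
        s_hi = max(0.0, s_hi + (x - target - slack))
        s_lo = max(0.0, s_lo + (target - x - slack))
        if s_hi > threshold or s_lo > threshold:
            return i
    return None

rng = np.random.default_rng(0)
# Hypothetical signal with a mean shift (abrupt change) halfway through.
signal = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.5, 1.0, 200)])
print("first alarm at index:", cusum(signal, target=0.0))
```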
Procedia PDF Downloads 78
25566 Comparative Study of Skeletonization and Radial Distance Methods for Automated Finger Enumeration
Authors: Mohammad Hossain Mohammadi, Saif Al Ameri, Sana Ziaei, Jinane Mounsef
Abstract:
Automated enumeration of the number of fingers on a hand is widely used in several motion gaming and distance control applications, and is discussed in several published papers as a starting block for hand recognition systems. An automated finger enumeration technique should not only be accurate, but must also have a fast response for a moving-picture input. Demanding video processing in motion games or distance control limits the program's overall speed, so image processing software such as Matlab needs to produce results at high computation speeds. Since automated finger enumeration with minimum error and processing time is desired, a comparative study between two finger enumeration techniques is presented and analyzed in this paper. In the pre-processing stage, various image processing functions were applied to a real-time video input to obtain the final cleaned, auto-cropped image of the hand to be used for the two techniques. The first technique uses the known morphological tool of skeletonization and counts the skeleton's endpoints as fingers. The second technique uses a radial distance method to enumerate the number of fingers, obtaining a one-dimensional hand representation. For both methods, the different steps of the algorithms are explained. Then, a comparative study analyzes the accuracy and speed of both techniques. Through experimental testing in different background conditions, it was observed that the radial distance method was more accurate and more responsive to a real-time video input than the skeletonization method. All test results were generated in Matlab and were based on displaying a human hand in three different orientations on top of a plain color background. Finally, the limitations surrounding the enumeration techniques are presented.
Keywords: comparative study, hand recognition, fingertip detection, skeletonization, radial distance, Matlab
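Although the study was implemented in Matlab, the endpoint-counting idea behind the first technique can be sketched in Python with scikit-image as below; the toy binary mask is an assumption standing in for the cleaned, auto-cropped hand image, and palm-side endpoints may add spurious counts that real pre-processing would remove.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def count_skeleton_endpoints(binary_hand):
    """Skeletonize a binary hand mask and count endpoint pixels (one skeleton neighbor)."""
    skeleton = skeletonize(binary_hand.astype(bool))
    neighbors = convolve(skeleton.astype(int), np.ones((3, 3), dtype=int),
                         mode="constant", cval=0) - skeleton
    return int(np.sum(skeleton & (neighbors == 1)))

# Hypothetical binary mask standing in for a cleaned, auto-cropped hand image.
mask = np.zeros((64, 64), dtype=bool)
mask[30:60, 20:44] = True            # palm
for c in (22, 27, 32, 37, 42):       # five crude "fingers"
    mask[10:30, c:c + 2] = True
print("skeleton endpoint count:", count_skeleton_endpoints(mask))
```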
Procedia PDF Downloads 381
25565 Adoption of Big Data by Global Chemical Industries
Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta
Abstract:
The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Given the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the importance of professional competencies and data science in the adoption of BD in chemical industries, to help them move towards intelligent manufacturing quickly and reliably. This article utilizes a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.
Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science
Procedia PDF Downloads 84
25564 Secure Multiparty Computations for Privacy Preserving Classifiers
Authors: M. Sumana, K. S. Hareesha
Abstract:
Secure computations are essential when performing privacy-preserving data mining. Distributed privacy-preserving data mining involves two or more sites that cannot pool their data with a third party due to laws protecting the individual. Hence, in order to model the private data without compromising privacy or losing information, secure multiparty computations are used. Secure computations of the product, mean, variance, dot product, and sigmoid function using the additive and multiplicative homomorphic properties are discussed. The computations are performed on vertically partitioned data, with a single site holding the class value.
Keywords: homomorphic property, secure product, secure mean and variance, secure dot product, vertically partitioned data
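The sketch below illustrates a secure-sum building block using additive secret sharing, a simpler alternative to the homomorphic-encryption-based computations described in the abstract: each site splits its private value into random shares, and only the recombined total (and hence the mean) is ever revealed. The prime modulus and the three private values are assumptions for illustration.

```python
import secrets

PRIME = 2**61 - 1  # arithmetic is done modulo a large prime

def share(value, n_parties):
    """Split a private value into n additive shares that sum to it modulo PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Hypothetical private values held by three sites (e.g., local sums of an attribute).
private_values = [120, 75, 305]
n = len(private_values)

# Each site distributes one share to every site; each site sums what it receives.
all_shares = [share(v, n) for v in private_values]
partial_sums = [sum(all_shares[site][holder] for site in range(n)) % PRIME
                for holder in range(n)]

# Only the combined total is reconstructed; individual inputs stay hidden.
total = sum(partial_sums) % PRIME
print("secure sum:", total, "secure mean:", total / n)
```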
Procedia PDF Downloads 410
25563 The Relation between Vitamin C and Oral Health
Authors: Mai Ashraf Talaat
Abstract:
Background: Vitamin C (ascorbic acid) is an essential nutrient for the development and repair of all body tissues. It can be obtained from a healthy diet or through supplementation. Due to its importance, vitamin C has become a mainstay in the treatment and prevention of many diseases and in maintaining immune, skin, bone and overall health. This review article aims to discuss the studies and case reports conducted to evaluate the effect of vitamin C on oral health and the recent advances in oral medicine that involve the use of vitamin C. Data/Sources: The review covered clinical studies, case reports and published literature in the English language that address this topic. An extensive search in the electronic databases of PubMed, PubMed Central, Web of Science, National Library of Medicine and ResearchGate was performed. Conclusion: Vitamin C is thought to treat periodontal diseases and gingival enlargement. It also affects biofilm formation and therefore helps in reducing caries incidence. Recently, vitamin C mesotherapy has been used to treat inflamed gingiva, bleeding gums and gingival hyperpigmentation. More research and randomized controlled trials are needed on this specific topic for more accurate judgment. Clinical significance: As a minimally invasive approach, the use of vitamin C in dental care could drastically reduce the need for surgical intervention.
Keywords: oral health, periodontology, vitamin C, gingivitis
Procedia PDF Downloads 77
25562 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer
Authors: Nabil Saad, David Morgan, Manish Gupta
Abstract:
Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and the Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of the aerosols' climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, the accurate assessment of this effect on global warming, climate change, and air quality is made difficult by uncertainties in the calculation of the single scattering albedo (SSA). Experimental complications arise in the determination of the single scattering albedo of an aerosol particle, since it requires the simultaneous measurement of both scattering and extinction. In fact, aerosol optical absorption in particular is a difficult measurement to perform, and it is often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path Optical Extinction Analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that co-employment of the OEA offers for deriving the complex refractive index of aerosols and their single scattering albedo. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value proposition of this novel open-path OEA.
Keywords: aerosols, extinction, visibility, albedo
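As a worked illustration of how extinction and scattering measurements combine, the snippet below computes the single scattering albedo (SSA = scattering / extinction) and a Koschmieder-type visual range estimate (approximately 3.912 divided by the extinction coefficient); the coefficient values are invented for illustration, not instrument data.

```python
# Hypothetical co-located measurements at 532 nm, in inverse megameters (Mm^-1).
b_scat = 85.0    # scattering coefficient from the nephelometer
b_ext = 110.0    # extinction coefficient from the open-path extinction analyzer

b_abs = b_ext - b_scat          # absorption is obtained by difference
ssa = b_scat / b_ext            # single scattering albedo

# Koschmieder relation for visual range (2% contrast threshold), b_ext converted to km^-1.
visual_range_km = 3.912 / (b_ext * 1e-3)

print(f"absorption: {b_abs:.1f} Mm^-1, SSA: {ssa:.3f}, visual range: {visual_range_km:.0f} km")
```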
Procedia PDF Downloads 88