Search results for: quantification accuracy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4146


2826 Extracting Attributes for Twitter Hashtag Communities

Authors: Ashwaq Alsulami, Jianhua Shao

Abstract:

Various organisations often need to understand discussions on social media, such as what the trending topics are and the characteristics of the people engaged in them. A number of approaches have been proposed to extract attributes that characterise a discussion group. However, these approaches are largely based on supervised learning, and as such they require a large amount of labelled data. In this paper we propose an approach that does not require labelled data, relying instead on lexical sources to detect meaningful attributes for online discussion groups. Our findings show an acceptable level of accuracy in detecting attributes for Twitter discussion groups.

Keywords: attributed community, attribute detection, community, social network

Procedia PDF Downloads 155
2825 Natural and Construction/Demolition Waste Aggregates: A Comparative Study

Authors: Debora C. Mendes, Matthias Eckert, Claudia S. Moço, Helio Martins, Jean-Pierre Gonçalves, Miguel Oliveira, Jose P. Da Silva

Abstract:

Disposal of construction and demolition waste (C&DW) in embankments on the periphery of cities causes both environmental and social problems. To achieve proper management of C&DW, a detailed analysis of the properties of these materials should be carried out. In this work we report a comparative study of the physical, chemical and environmental properties of natural and C&DW aggregates from 25 different origins. Assays were performed according to European Standards. Analyses of heavy metals and organic compounds, namely polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs), were performed. Finally, properties of concrete prepared with C&DW aggregates are reported. Physical analyses of C&DW aggregates indicated lower quality properties than natural aggregates, particularly for concrete preparation and unbound layers of road pavements. Chemical properties showed that most samples (80%) meet the values required by European regulations for concrete and unbound layers of road pavements. Analyses of the heavy metals Cd, Cr, Cu, Pb, Ni, Mo and Zn in the C&DW leachates showed levels below the limits established by the Council Decision of 19 December 2002. Identification and quantification of PCBs and PAHs indicated that only a few samples show the presence of these compounds, and the measured levels of PCBs and PAHs are also below the limits. Other compounds identified in the C&DW leachates include phthalates and diphenylmethanol. The characterized C&DW aggregates show lower quality properties than natural aggregates, but most samples proved to be environmentally safe. Continuous monitoring of the presence of heavy metals and organic compounds should be carried out to ensure the safe use of C&DW aggregates. C&DW aggregates provide a good economic and environmental alternative to natural aggregates.

Keywords: concrete preparation, construction and demolition waste, heavy metals, organic pollutants

Procedia PDF Downloads 355
2824 Deciphering Orangutan Drawing Behavior Using Artificial Intelligence

Authors: Benjamin Beltzung, Marie Pelé, Julien P. Renoult, Cédric Sueur

Abstract:

To this day, it is not known whether drawing is a specifically human behavior or whether this behavior finds its origins in ancestor species. An interesting window onto this question is to analyze drawing behavior in species genetically close to humans, such as non-human primates. A good candidate for this approach is the orangutan, which shares 97% of our genes and exhibits multiple human-like behaviors. Focusing on figurative aspects may not be suitable for orangutans’ drawings, which may appear as scribbles but may still carry meaning. Likewise, manual feature selection would introduce an anthropocentric bias, as the features selected by humans may not match those relevant to orangutans. In the present study, we used deep learning to analyze the drawings of a female orangutan named Molly († 2011), who produced 1,299 drawings in the last five years of her life as part of a behavioral enrichment program at the Tama Zoo in Japan. We investigate multiple ways to decipher Molly’s drawings. First, we demonstrate the existence of differences between seasons by training a deep learning model to classify Molly’s drawings according to season. Then, to understand and interpret these seasonal differences, we analyze how the information spreads within the network, from shallow to deep layers, where early layers encode simple local features and deep layers encode more complex and global information. More precisely, we investigate the impact of feature complexity on classification accuracy through feature extraction fed to a Support Vector Machine. Last, we leverage style transfer to dissociate features associated with drawing style from those describing the representational content, and analyze the relative importance of these two types of features in explaining seasonal variation. Content features were relevant for the classification, showing the presence of meaning in these non-figurative drawings and the ability of deep learning to decipher these differences.
The style of the drawings was also relevant, as style features encoded enough information to support classification better than chance. The accuracy of style features was higher for deeper layers, highlighting the variation of style between seasons in Molly’s drawings. Through this study, we demonstrate how deep learning can help find meaning in non-figurative drawings and interpret such differences.
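The layer-features-to-SVM step can be sketched as follows. This is a minimal illustration with synthetic Gaussian features standing in for Molly's CNN-layer activations (the "season" classes, dimensionality, and class separation are all invented), using a simple Pegasos-style linear SVM rather than the authors' actual pipeline:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=50, seed=0):
    """Pegasos-style subgradient descent for a linear SVM.

    Labels must be in {-1, +1}. No bias term: the synthetic classes
    below are symmetric about the origin, so none is needed."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            w *= (1.0 - eta * lam)            # regularization shrink
            if y[i] * (X[i] @ w) < 1.0:       # hinge-loss violation
                w += eta * y[i] * X[i]
    return w

def accuracy(w, X, y):
    return float(np.mean(np.sign(X @ w) == y))

# Synthetic stand-ins for one layer's activations: two "seasons"
# whose feature means differ (hypothetical data, not Molly's drawings).
rng = np.random.default_rng(1)
d = 64                                   # feature dimension of the layer
X = np.vstack([rng.normal(+0.5, 1.0, (100, d)),   # season A
               rng.normal(-0.5, 1.0, (100, d))])  # season B
y = np.array([1] * 100 + [-1] * 100)

w = train_linear_svm(X, y)
print("training accuracy:", accuracy(w, X, y))
```

Repeating this per layer, with features drawn from progressively deeper layers, is how the abstract's accuracy-versus-depth comparison would be obtained.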

Keywords: cognition, deep learning, drawing behavior, interpretability

Procedia PDF Downloads 160
2823 Evaluation of the Impact of Telematics Use on Young Drivers’ Driving Behaviour: A Naturalistic Driving Study

Authors: WonSun Chen, James Boylan, Erwin Muharemovic, Denny Meyer

Abstract:

In Australia, drivers aged between 18 and 24 have remained at high risk of road fatality over the last decade. Despite the successful implementation of the Graduated Licensing System (GLS), which supports young drivers in the early phases of driving, the road fatality statistics for these drivers remain high. In response, studies conducted in Australia before the start of the COVID-19 pandemic demonstrated the benefits of using telematics devices for improving driving behaviour. However, the impact of COVID-19 lockdowns on young drivers’ driving behaviour has emerged as a global concern. This naturalistic study therefore aimed to evaluate and compare the driving behaviour (such as acceleration, braking, and speeding) of young drivers following the adoption of in-vehicle telematics devices. Forty-two drivers aged between 18 and 30 and residing in the Australian state of Victoria participated in this study between May and October 2022. All participants drove with the telematics devices during the first 30 days. At the start of the second 30-day period, twenty-one participants were randomised to an intervention group and provided with an additional telematics ray device that gave visual feedback, especially when they committed aggressive driving behaviour. The remaining twenty-one participants continued their driving journeys without the extra telematics ray device (control group). The telematics data enabled the assessment of changes in the driving behaviour of these young drivers using a machine learning approach in Python. Results are expected to show that participants in the intervention group improve their driving behaviour compared with those in the control group. Furthermore, the telematics data enable such improvements in driving behaviour to be assessed and quantified.
The findings of this study are anticipated to help guide the development of customised campaigns and interventions that further address the high road fatality rate among young drivers in Australia.

Keywords: driving behaviour, naturalistic study, telematics data, young drivers

Procedia PDF Downloads 119
2822 Data Model to Predict Customize Skin Care Product Using Biosensor

Authors: Ashi Gautam, Isha Shukla, Akhil Seghal

Abstract:

Biosensors are analytical devices that use a biological sensing element to detect and measure a specific chemical substance or biomolecule in a sample. These devices are widely used in various fields, including medical diagnostics, environmental monitoring, and food analysis, owing to their high specificity, sensitivity, and selectivity. In this research paper, a machine learning model is proposed for predicting the suitability of skin care products based on biosensor readings. The proposed model takes features extracted from biosensor readings, such as biomarker concentration, skin hydration level, inflammation presence, sensitivity, and free radicals, and outputs the most appropriate skin care product for an individual. The model is based on supervised learning: it is trained on a labelled dataset of biosensor readings and corresponding skin care product information, from which it learns the patterns and relationships between the two. Once trained, the model can predict the most suitable skin care product for an individual from their biosensor readings. Its performance is evaluated using several metrics, including accuracy, precision, recall, and F1 score. The aim of this research is to develop a personalised skin care product recommendation system from biosensor data; such personalised recommendations are particularly useful in the skin care industry, where they can lead to better outcomes for consumers.
The results of this study show that the proposed model accurately predicts the most appropriate skin care product for an individual based on their biosensor readings, and the evaluation metrics demonstrate its effectiveness. The model therefore has significant potential for practical use in personalised skin care product recommendation, and represents a promising development for the skin care industry. Further research can be done to improve its accuracy and effectiveness.
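A much simpler stand-in for the proposed supervised recommender illustrates the input/output contract the abstract describes. The feature ordering follows the abstract, but the readings, the 1-nearest-neighbour rule, and the product names are all hypothetical:

```python
import math

# Hypothetical labelled biosensor readings; feature order follows the
# abstract: (biomarker, hydration, inflammation, sensitivity, free_radicals).
# Values and product names are illustrative only.
TRAINING_DATA = [
    ((0.2, 0.8, 0.1, 0.2, 0.1), "light moisturiser"),
    ((0.7, 0.3, 0.6, 0.5, 0.4), "barrier-repair cream"),
    ((0.4, 0.5, 0.2, 0.8, 0.2), "sensitive-skin serum"),
    ((0.6, 0.4, 0.3, 0.3, 0.9), "antioxidant serum"),
]

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recommend(reading):
    """1-nearest-neighbour stand-in for the paper's supervised model:
    return the product attached to the closest labelled reading."""
    return min(TRAINING_DATA, key=lambda pair: euclidean(pair[0], reading))[1]

print(recommend((0.25, 0.75, 0.15, 0.25, 0.1)))  # -> light moisturiser
```

A real system would replace the lookup with a trained classifier evaluated on accuracy, precision, recall, and F1, as the abstract describes.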

Keywords: biosensors, data model, machine learning, skin care

Procedia PDF Downloads 93
2821 Tracking Maximum Power Point Utilizing Artificial Immunity System

Authors: Marwa Ahmed Abd El Hamied

Abstract:

In this paper, a new technique based on the Artificial Immunity System (AIS) has been developed to track the Maximum Power Point (MPP). The AIS is implemented in a photovoltaic system that is subjected to variable temperature and insolation conditions. The proposed model is simulated in MATLAB, and the simulation results are compared with those generated by a Perturbation and Observation controller. The proposed model shows promising results, as it provides better accuracy than the classical model.

Keywords: component, artificial immunity technique, solar energy, perturbation and observation, power based methods

Procedia PDF Downloads 425
2820 Hand Gesture Recognition Interface Based on IR Camera

Authors: Yang-Keun Ahn, Kwang-Soon Choi, Young-Choong Park, Kwang-Mo Jung

Abstract:

Vision-based user interfaces for controlling TVs and PCs have the advantage of enabling natural control without being limited to a specific device. Accordingly, various studies on hand gesture recognition using RGB cameras or depth cameras have been conducted. However, such cameras suffer either from a lack of accuracy or from high construction cost. The proposed method uses a low-cost IR camera to accurately differentiate between the hand and the background. Moreover, complicated learning and template matching methodologies are not used; instead, the correlation between fingertips extracted through curvature analysis is utilized to recognize Click and Move gestures.

Keywords: recognition, hand gestures, infrared camera, RGB cameras

Procedia PDF Downloads 403
2819 Simultaneous Determination of Cefazolin and Cefotaxime in Urine by HPLC

Authors: Rafika Bibi, Khaled Khaladi, Hind Mokran, Mohamed Salah Boukhechem

Abstract:

A high performance liquid chromatographic method with ultraviolet detection at 264 nm was developed and validated for the quantitative determination and separation of cefazolin and cefotaxime in urine. The mobile phase consisted of acetonitrile and phosphate buffer at pH 4.2 (15:85, v/v), pumped through an ODB 250 × 4.6 mm, 5 µm column at a flow rate of 1 mL/min with a 20 µL injection loop. Under these conditions, validation showed that the method is linear over the range 0.01 to 10 µg/mL with a good correlation coefficient (R > 0.9997). The retention times of cefotaxime and cefazolin were 9.0 and 10.1 min, respectively. The statistical evaluation of the method, examined by means of within-day (n=6) and day-to-day (n=5) assays, was found to be satisfactory, with high accuracy and precision.
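The linearity claim (R > 0.9997 over 0.01 to 10 µg/mL) rests on an ordinary least-squares calibration curve, which can be reproduced as follows. The concentration/peak-area pairs here are illustrative, not the paper's data:

```python
import math

def linear_fit(x, y):
    """Least-squares slope, intercept and correlation coefficient R."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r

# Hypothetical calibration points spanning the validated range
# (0.01-10 µg/mL); the peak areas are made up, roughly 100 x conc.
conc = [0.01, 0.1, 0.5, 1.0, 5.0, 10.0]        # µg/mL
area = [0.9, 10.2, 50.1, 99.5, 501.0, 998.0]   # detector response

slope, intercept, r = linear_fit(conc, area)
print(f"slope={slope:.2f}, intercept={intercept:.2f}, R={r:.5f}")
```

Unknown urine samples are then quantified by inverting the fitted line: concentration = (area − intercept) / slope.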

Keywords: cefazolin, cefotaxime, HPLC, bioscience, biochemistry, pharmaceutical

Procedia PDF Downloads 359
2818 Analysis of Different Classification Techniques Using WEKA for Diabetic Disease

Authors: Usama Ahmed

Abstract:

Data mining is the process of analysing data to extract useful information for prediction. It is a field of research that addresses various types of problems. In data mining, classification is an important technique for categorising different kinds of data. Diabetes is one of the most common diseases. This paper applies different classification techniques to a diabetes dataset using the Waikato Environment for Knowledge Analysis (WEKA) and identifies which algorithm is most suitable. The best classification algorithm for the diabetic data is Naïve Bayes, with an accuracy of 76.31% and a model build time of 0.06 seconds.
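As a sketch of the winning classifier, here is a minimal Gaussian Naïve Bayes written from scratch. The toy readings below are invented and only loosely shaped like diabetes features (glucose, BMI), so the numbers do not reproduce WEKA's 76.31%:

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian Naive Bayes: per-class priors plus per-feature
    Gaussian likelihoods, the same model family WEKA found best here."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for row, label in zip(X, y):
            groups[label].append(row)
        n = len(X)
        self.stats = {}
        for label, rows in groups.items():
            prior = len(rows) / n
            cols = list(zip(*rows))
            means = [sum(c) / len(c) for c in cols]
            variances = [max(sum((v - m) ** 2 for v in c) / len(c), 1e-9)
                         for c, m in zip(cols, means)]
            self.stats[label] = (prior, means, variances)
        return self

    def predict(self, row):
        best, best_lp = None, -math.inf
        for label, (prior, means, variances) in self.stats.items():
            lp = math.log(prior)   # log prior + sum of log likelihoods
            for v, m, var in zip(row, means, variances):
                lp += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

# Made-up (glucose, BMI) readings -- NOT the actual diabetes dataset.
X = [(85, 22.0), (90, 24.5), (95, 23.1), (160, 33.0), (155, 35.2), (170, 31.8)]
y = ["negative", "negative", "negative", "positive", "positive", "positive"]

model = GaussianNB().fit(X, y)
print(model.predict((158, 34.0)))  # -> positive
```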

Keywords: data mining, classification, diabetes, WEKA

Procedia PDF Downloads 142
2817 Improvements in OpenCV's Viola Jones Algorithm in Face Detection–Skin Detection

Authors: Jyoti Bharti, M. K. Gupta, Astha Jain

Abstract:

This paper proposes a new, improved approach to filtering false positives among face images detected by OpenCV’s Viola-Jones algorithm. In this approach, skin detection in two colour spaces, HSV (hue, saturation, value) and YCrCb (Y is the luma component; Cr and Cb are the red-difference and blue-difference chroma components), is used to filter the false positives. As a result, false detections are reduced. Our proposed method reaches an accuracy of about 98.7%; thus, a better recognition rate is achieved.
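A per-pixel version of the dual-colour-space skin test might look like this. The threshold ranges are commonly cited rules of thumb, since the paper does not report its exact values:

```python
import colorsys

def is_skin(r, g, b):
    """Skin test combining HSV and YCrCb range checks on one RGB pixel.

    Threshold ranges are textbook rules of thumb, not the paper's
    exact values (which it does not report)."""
    # HSV check: hue in degrees, saturation in [0, 1]
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hsv_ok = (h * 360.0 <= 50.0) and (0.23 <= s <= 0.68)

    # YCrCb check (ITU-R BT.601 conversion; Y itself is not thresholded)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    ycrcb_ok = (77.0 <= cb <= 127.0) and (133.0 <= cr <= 173.0)

    # A detected face region survives filtering only if enough of its
    # pixels pass BOTH colour-space checks; here we test one pixel.
    return hsv_ok and ycrcb_ok

print(is_skin(220, 170, 140))  # typical skin tone -> True
print(is_skin(0, 0, 255))      # pure blue -> False
```

In the full filter, a Viola-Jones detection would be rejected when the fraction of skin pixels inside its bounding box falls below some tuned threshold.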

Keywords: face detection, Viola Jones, false positives, OpenCV

Procedia PDF Downloads 401
2816 Automated Localization of Palpebral Conjunctiva and Hemoglobin Determination Using Smart Phone Camera

Authors: Faraz Tahir, M. Usman Akram, Albab Ahmad Khan, Mujahid Abbass, Ahmad Tariq, Nuzhat Qaiser

Abstract:

The objective of this study was to evaluate the degree of anemia from pictures of the palpebral conjunctiva taken with a smartphone camera. We first localized the region of interest in the image, then extracted features from that region and trained an SVM classifier on them; as a result, our system classifies images in real time according to hemoglobin level. The proposed system achieved an accuracy of 70%. The classifier was trained on a locally gathered dataset of 30 patients.

Keywords: anemia, palpebral conjunctiva, SVM, smartphone

Procedia PDF Downloads 501
2815 Potential Risk Assessment Due to Groundwater Quality Deterioration and Quantifying the Major Influencing Factors Using Geographical Detectors in the Gunabay Watershed of Ethiopia

Authors: Asnakew Mulualem Tegegne, Tarun Kumar Lohani, Abunu Atlabachew Eshete

Abstract:

Groundwater quality has deteriorated due to natural and anthropogenic activities, and poor water quality poses a potential risk to human health and the environment. This study therefore aimed to assess the potential risk of groundwater quality contamination and the associated public health risks in the Gunabay watershed. For this task, seventy-eight groundwater samples were collected from thirty-nine locations in the dry and wet seasons during 2022. The groundwater contamination index was applied to assess the overall quality of the groundwater. Six major driving forces (temperature, population density, soil, land cover, recharge, and geology) were considered, and the quantitative impact of each factor on groundwater quality deterioration was demonstrated using Geodetector. The results showed that low groundwater quality was detected in urban and agricultural land. Nitrate contamination in particular was strongly linked to groundwater quality deterioration and public health risks, and a medium contamination level was observed in the area. This indicates that the inappropriate application of fertilizer on agricultural land and wastewater from urban areas have a great impact on the shallow aquifers of the study area. Furthermore, the major influencing factors are ranked as soil type (0.33–0.31) > recharge (0.17–0.15) > temperature (0.13–0.08) > population density (0.1–0.08) > land cover type (0.07–0.04) > lithology (0.05–0.04). The interaction detector revealed that the interactions soil ∩ recharge, soil ∩ temperature, soil ∩ land cover, and temperature ∩ recharge are the most influential in deteriorating groundwater quality in both seasons. Identification and quantification of the major influencing factors may provide new insight into groundwater resource management.
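The Geodetector factor detector behind the reported rankings computes the q-statistic, q = 1 − Σ_h N_h σ_h² / (N σ²), where h indexes the strata of a driving factor. A sketch with made-up data (not the watershed samples):

```python
from statistics import pvariance

def q_statistic(values, strata):
    """Geodetector factor detector: q = 1 - sum_h(N_h * var_h) / (N * var).

    `values` is the indicator (e.g. a groundwater quality index) and
    `strata` the category of the driving factor (e.g. soil type) at each
    sample point. q near 1: the factor explains most spatial variation;
    q near 0: it explains almost none."""
    n = len(values)
    total = n * pvariance(values)
    groups = {}
    for v, s in zip(values, strata):
        groups.setdefault(s, []).append(v)
    within = sum(len(g) * pvariance(g) for g in groups.values())
    return 1 - within / total

# Illustrative data: the indicator is almost fully explained by the
# stratum, so q should be close to 1 (values invented for the sketch).
values = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
strata = ["clay", "clay", "clay", "sand", "sand", "sand"]
print(round(q_statistic(values, strata), 3))  # -> 0.996
```

Computing q for each of the six driving forces and sorting gives exactly the kind of factor ranking reported above; the interaction detector repeats this on the overlay of two factors' strata.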

Keywords: groundwater contamination index, geographical detectors, public health, influencing factors, water resources management

Procedia PDF Downloads 6
2814 New High Order Group Iterative Schemes in the Solution of Poisson Equation

Authors: Sam Teek Ling, Norhashidah Hj. Mohd. Ali

Abstract:

We investigate the formulation and implementation of new explicit group iterative methods in solving the two-dimensional Poisson equation with Dirichlet boundary conditions. The methods are derived from a fourth order compact nine point finite difference discretization. The methods are compared with the existing second order standard five point formula to show the dramatic improvement in computed accuracy. Numerical experiments are presented to illustrate the effectiveness of the proposed methods.
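For the Laplace special case (f = 0), the fourth-order compact nine-point stencil that the methods are derived from can be sketched with plain point-Jacobi iteration, rather than the authors' explicit group iteration:

```python
import numpy as np

def solve_laplace_compact9(u, iterations=4000):
    """Jacobi iteration on the fourth-order compact nine-point stencil
    for Laplace's equation (f = 0): 20*u_C = 4*(edge sum) + (corner sum).
    Boundary values of `u` are held fixed (Dirichlet conditions)."""
    u = u.copy()
    for _ in range(iterations):
        edges = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:])
        corners = (u[:-2, :-2] + u[:-2, 2:] + u[2:, :-2] + u[2:, 2:])
        u[1:-1, 1:-1] = (4.0 * edges + corners) / 20.0
    return u

# Test problem on the unit square with the harmonic exact solution
# u(x, y) = x^2 - y^2 imposed on the boundary; the compact stencil is
# exact for this polynomial, so only the iteration error remains.
n = 17
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
exact = X**2 - Y**2

u0 = np.zeros((n, n))
u0[0, :], u0[-1, :], u0[:, 0], u0[:, -1] = (
    exact[0, :], exact[-1, :], exact[:, 0], exact[:, -1])

u = solve_laplace_compact9(u0)
print("max error:", np.abs(u - exact).max())
```

The group methods in the paper update small blocks of points simultaneously from the same stencil, which accelerates convergence; the stencil itself, and hence the fourth-order accuracy, is what this sketch shows.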

Keywords: explicit group iterative method, finite difference, fourth order compact, Poisson equation

Procedia PDF Downloads 429
2813 Phytochemical Exploration of Plectranthus stocksii Hook. F. for Antioxidant and Cytotoxic Properties

Authors: Kasipandi Muniyandi, Parimelazhagan Thangaraj

Abstract:

Plants are an important prospective wealth of a country; combining local health-care knowledge about a specific plant with data published by several groups of scientists can help in deciding whether it should be considered acceptable for medicinal use. In the developed countries, too, plant-derived drugs may be of importance. The wide variety of ailments treated with Plectranthus is an indication of the medicinal value of the genus. A number of species are not toxic and so may be taken orally, whilst others are used topically on the skin or as enemas. This study was designed to evaluate the properties of Plectranthus stocksii: the aerial parts were collected and extracted with petroleum ether, chloroform, ethyl acetate, acetone and methanol in a Soxhlet apparatus, and finally macerated with hot water. The quantification assays revealed that the leaf methanol extract showed the highest total phenolic (415.41 mg GAE/g extract) and tannin (177.53 mg GAE/g extract) contents, whereas the leaf ethyl acetate extract exhibited the highest flavonoid content (777.11 mg RE/g extract). The antioxidant efficiency of the extracts was analysed by various radical scavenging assays. Among these, the leaf ethyl acetate extract showed the highest free radical scavenging activities in the DPPH (IC50 = 3.46 µg/mL), ABTS (27417.65 µM TE/g extract), FRAP (152.17 mM Fe(II)E/mg extract), NO• radical (21.46%) and superoxide radical (IC50 = 24.16 µg/mL) assays. Extracts from all parts of P. stocksii showed significant protection against OH•-induced DNA damage at 50 µg concentration. HPLC analysis of the leaf ethyl acetate extract revealed quercetin (30.29 µg/mg of extract) as the major compound. In anticancer assays, the leaf ethyl acetate extract gave IC50 values of 48.87 and 36.08 µg/mL against MCF-7 and Caco-2 cells, respectively. From this study, P. stocksii can act as a potent antioxidant and cytotoxic agent.
The scope for drug development from this plant is considerable, and there is undoubtedly a call for further research in the pharmaceutical industry.

Keywords: antioxidant, cytotoxicity, phenolics, plectranthus stocksii

Procedia PDF Downloads 379
2812 A Physically-Based Analytical Model for Reduced Surface Field Laterally Double Diffused MOSFETs

Authors: M. Abouelatta, A. Shaker, M. El-Banna, G. T. Sayah, C. Gontrand, A. Zekry

Abstract:

In this paper, a methodology for physically modeling the intrinsic MOS part and the drift region of the n-channel Laterally Double-diffused MOSFET (LDMOS) is presented. The basic physical effects like velocity saturation, mobility reduction, and nonuniform impurity concentration in the channel are taken into consideration. The analytical model is implemented using MATLAB. A comparison of the simulations from technology computer aided design (TCAD) and that from the proposed analytical model, at room temperature, shows a satisfactory accuracy which is less than 5% for the whole voltage domain.

Keywords: LDMOS, MATLAB, RESURF, modeling, TCAD

Procedia PDF Downloads 192
2811 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion

Authors: Ali Kazemi

Abstract:

In volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning techniques have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study presents a method for financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This era, marked by considerable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that accurately reflects market intricacies. We collect daily opening, closing, high and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate key macroeconomic indicators, including interest rates, inflation rates, GDP growth, and unemployment rates, into our model. The GCN is adept at learning the relational patterns among the financial instruments, represented as nodes in a comprehensive market graph.
Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling our model to grasp the complex network of influences governing market moves. Complementing this, the LSTM is trained on sequences of the spatial-temporal representation learned by the GCN, enriched with historical price and volume data. This lets the LSTM capture and predict temporal market trends accurately. In a comprehensive evaluation of our GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared with conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting day-by-day price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that depend on the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. Our findings promise to inform investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
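The node-update rule of a single GCN layer, which underlies the market-graph learning described above, can be sketched in a few lines. The toy graph, feature sizes, and random weights here are stand-ins, not the paper's trained model:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step in the standard symmetric-normalized
    form: H' = relu(D^(-1/2) (A + I) D^(-1/2) H W)."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                      # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)     # ReLU

# Toy market graph: 4 hypothetical instruments, with edges where the
# text's co-movement / sentiment criteria would place them.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))    # 8 input features per node (e.g. OHLCV stats)
W = rng.normal(size=(8, 16))   # weights; random here, learned in training

H_out = gcn_layer(A, H, W)
print(H_out.shape)  # (4, 16)
```

Stacking a few such layers per trading day and feeding the resulting node embeddings, day by day, into an LSTM is the spatial-temporal pipeline the abstract describes.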

Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting

Procedia PDF Downloads 58
2810 Using Digitally Reconstructed Radiographs from Magnetic Resonance Images to Localize Pelvic Lymph Nodes on 2D X-Ray Simulator-Based Brachytherapy Treatment Planning

Authors: Mohammad Ali Oghabian, Reza Reiazi, Esmaeel Parsai, Mehdi Aghili, Ramin Jaberi

Abstract:

In this project, a new procedure is introduced for utilizing digitally reconstructed radiographs from MRI images in brachytherapy treatment planning. This procedure enables us to localize the tumor volume and delineate the extent of critical structures in the vicinity of the tumor volume. The aim of this project was to improve the accuracy of the dose delivered to targets of interest in a 2D treatment planning system.

Keywords: brachytherapy, cervix, digitally reconstructed radiographs, lymph node

Procedia PDF Downloads 526
2809 Software Architecture Optimization Using Swarm Intelligence Techniques

Authors: Arslan Ellahi, Syed Amjad Hussain, Fawaz Saleem Bokhari

Abstract:

Software architecture can be optimized with respect to quality attributes (QAs). In this paper, we analyse, from different dimensions, multiple research papers that have been used to classify those attributes. We propose a swarm intelligence metaheuristic, the ant colony optimization algorithm, as a contribution to solving this critical optimization problem of software architecture. We rank the quality attributes, run our algorithm on each QA, and then rank the QAs on the basis of accuracy, finally selecting the most accurate ones. The ant colony algorithm is effective and performs well in optimizing and ranking the QAs.

Keywords: complexity, rapid evolution, swarm intelligence, dimensions

Procedia PDF Downloads 256
2808 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity

Authors: Hoda A. Abdel Hafez

Abstract:

Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance and privacy. In the telecommunications industry, mining big data is like mining for gold: it represents a big opportunity for maximizing the revenue streams of the industry. This paper discusses the characteristics of big data (volume, variety, velocity and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunications, and the benefits and opportunities gained from them.

Keywords: mining big data, big data, machine learning, telecommunication

Procedia PDF Downloads 403
2807 Comparison of Finite Difference Schemes for Numerical Study of Ripa Model

Authors: Sidrah Ahmed

Abstract:

River and lake flows are modeled mathematically by the shallow water equations, which are depth-averaged Reynolds-averaged Navier-Stokes equations under the Boussinesq approximation. Temperature stratification dynamics influence the water quality and mixing characteristics, mainly through atmospheric conditions including air temperature, wind velocity, and radiative forcing. Experimental observations are commonly taken along vertical scales and are not sufficient to estimate the small turbulence effects that temperature variations induce in shallow flows. Wind shear stress over the water surface also influences the flow patterns, heat fluxes and thermodynamics of water bodies. Hence it is crucial to couple temperature gradients with the shallow water model to estimate atmospheric effects on flow patterns. The Ripa system was introduced to study ocean currents as a variant of the shallow water equations with the addition of temperature variations within the flow. The Ripa model is a hyperbolic system of partial differential equations, because all the eigenvalues of the system's Jacobian matrix are real and distinct, and the time steps of a numerical scheme are estimated from these eigenvalues. The solution to the Riemann problem for the Ripa model is composed of shocks, contact discontinuities and rarefaction waves, and solving the model with Riemann initial data using central schemes is difficult due to the eigenstructure of the system. This work presents a comparison of four finite difference schemes for the numerical solution of the Riemann problem for the Ripa model: the Lax-Friedrichs scheme, the Lax-Wendroff scheme, the MacCormack scheme, and a higher order finite difference scheme with the WENO method. The numerical flux functions in both dimensions are approximated according to these methods, and temporal accuracy is achieved by employing a TVD Runge-Kutta method. Numerical tests are presented to examine the accuracy and robustness of the applied methods.
It is revealed that the Lax-Friedrichs scheme produces results with oscillations, while the Lax-Wendroff and higher order difference schemes produce considerably better results.
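Of the four schemes, Lax-Friedrichs is the simplest to sketch. The following applies it to scalar advection with periodic boundaries as a stand-in for the Ripa fluxes (it is not the authors' code, and the Ripa system would carry a vector of conserved variables instead of one scalar):

```python
import numpy as np

def lax_friedrichs_step(u, flux, lam):
    """One Lax-Friedrichs step with periodic boundaries:
    u_j^{n+1} = (u_{j+1} + u_{j-1})/2 - (lam/2) * (f_{j+1} - f_{j-1}),
    where lam = dt/dx."""
    f = flux(u)
    up, um = np.roll(u, -1), np.roll(u, 1)   # u_{j+1}, u_{j-1}
    fp, fm = np.roll(f, -1), np.roll(f, 1)
    return 0.5 * (up + um) - 0.5 * lam * (fp - fm)

# Scalar advection f(u) = a*u; the CFL condition lam*a <= 1 mirrors the
# eigenvalue-based time-step restriction mentioned for the Ripa system.
a = 1.0
nx = 200
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = 1.0 / nx
dt = 0.4 * dx / a                  # CFL number 0.4
u = np.exp(-200.0 * (x - 0.5) ** 2)   # smooth initial bump

mass0 = u.sum() * dx
for _ in range(100):
    u = lax_friedrichs_step(u, lambda v: a * v, dt / dx)

print("mass conserved:", np.isclose(u.sum() * dx, mass0))  # -> True
```

The conservative flux-difference form is what guarantees discrete mass conservation here; the scheme's well-known numerical diffusion is also why it is the least accurate of the four compared.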

Keywords: finite difference schemes, Riemann problem, shallow water equations, temperature gradients

Procedia PDF Downloads 200
2806 Methodology of Automation and Supervisory Control and Data Acquisition for Restructuring Industrial Systems

Authors: Lakhoua Najeh

Abstract:

Introduction: In most situations, an industrial system already existing, conditioned by its history, its culture and its context are in difficulty facing the necessity to restructure itself in an organizational and technological environment in perpetual evolution. This is why all operations of restructuring first of all require a diagnosis based on a functional analysis. After a presentation of the functionality of a supervisory system for complex processes, we present the concepts of industrial automation and supervisory control and data acquisition (SCADA). Methods: This global analysis exploits the various available documents on the one hand and takes on the other hand in consideration the various testimonies through investigations, the interviews or the collective workshops; otherwise, it also takes observations through visits as a basis and even of the specific operations. The exploitation of this diagnosis enables us to elaborate the project of restructuring thereafter. Leaving from the system analysis for the restructuring of industrial systems, and after a technical diagnosis based on visits, an analysis of the various technical documents and management as well as on targeted interviews, a focusing retailing the various levels of analysis has been done according a general methodology. Results: The methodology adopted in order to contribute to the restructuring of industrial systems by its participative and systemic character and leaning on a large consultation a lot of human resources that of the documentary resources, various innovating actions has been proposed. These actions appear in the setting of the TQM gait requiring applicable parameter quantification and a treatment valorising some information. The new management environment will enable us to institute an information and communication system possibility of migration toward an ERP system. 
Conclusion: Technological advancements in process monitoring, control and industrial automation over the past decades have contributed greatly to improving the productivity of virtually all industrial systems throughout the world. This paper identifies the principal characteristics of process monitoring, control and industrial automation in order to provide tools that help in the decision-making process.

Keywords: automation, supervision, SCADA, TQM

Procedia PDF Downloads 172
2805 The Evaluation of the Performance of Different Filtering Approaches in Tracking Problem and the Effect of Noise Variance

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

The performance of different filtering approaches depends on the modeling of the dynamical system and on the algorithm structure. For modeling and smoothing the data, the evaluation of the posterior distribution in each filtering approach should be chosen carefully. In this paper, different filtering approaches, namely the Kalman filter (KF), the extended Kalman filter (EKF), the unscented Kalman filter (UKF), the extended Kalman smoother (EKS) and the Rauch-Tung-Striebel (RTS) smoother, are simulated on several trajectory tracking problems, and the accuracy and limitations of these approaches are explained. The probability of the model under the different filters is then compared, and finally the effect of the noise variance on the estimation is described with simulation results.
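As a minimal illustration of the simplest of the approaches compared above, the sketch below runs a linear Kalman filter on a 1-D constant-velocity tracking problem. The synthetic trajectory, matrices and noise levels are illustrative assumptions, not the paper's actual simulation setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Constant-velocity state-space model: state x = [position, velocity]
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])              # observe position only
Q = 0.01 * np.eye(2)                    # process noise covariance (assumed)
R = np.array([[4.0]])                   # measurement noise variance (assumed)

# Simulate a true trajectory and noisy position measurements
n_steps = 50
true_x = np.zeros((n_steps, 2))
true_x[0] = [0.0, 1.0]
for k in range(1, n_steps):
    true_x[k] = F @ true_x[k - 1]
z = true_x[:, 0] + rng.normal(0.0, np.sqrt(R[0, 0]), n_steps)

# Kalman filter recursion (predict / update)
x = np.array([0.0, 0.0])
P = 10.0 * np.eye(2)
est = np.zeros(n_steps)
for k in range(n_steps):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z[k]
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z[k:k + 1] - H @ x)
    P = (np.eye(2) - K @ H) @ P
    est[k] = x[0]

rmse_filtered = np.sqrt(np.mean((est - true_x[:, 0]) ** 2))
rmse_raw = np.sqrt(np.mean((z - true_x[:, 0]) ** 2))
```

The filtered track should show a clearly lower RMSE than the raw measurements; the EKF/UKF variants replace the linear predict/update steps with linearized or sigma-point versions of the same recursion.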

Keywords: Gaussian approximation, Kalman smoother, parameter estimation, noise variance

Procedia PDF Downloads 435
2804 Classification of Germinatable Mung Bean by Near Infrared Hyperspectral Imaging

Authors: Kaewkarn Phuangsombat, Arthit Phuangsombat, Anupun Terdwongworakul

Abstract:

Hard seeds will not grow and can cause mold during the sprouting process. Thus, the hard seeds need to be separated from the normal seeds. Near infrared hyperspectral imaging in the range of 900 to 1700 nm was implemented to develop a model by partial least squares discriminant analysis (PLS-DA) to discriminate the hard seeds from the normal seeds. The orientation of the seeds was also studied to compare the performance of the models. The model based on the hilum-up orientation achieved the best result, giving a coefficient of determination of 0.98, a root mean square error of prediction of 0.07, and a classification accuracy of 100%.

Keywords: mung bean, near infrared, germinatability, hard seed

Procedia PDF Downloads 300
2803 Effects of Mental Skill Training Programme on Direct Free Kick of Grassroot Footballers in Lagos, Nigeria

Authors: Mayowa Adeyeye, Kehinde Adeyemo

Abstract:

The direct free kick is considered a great opportunity to score a goal, but this is not always the case among Nigerian and other elite footballers. This study, therefore, examined the extent to which an eight-week mental skills training programme is effective for improving accuracy in the direct free kick in football. Sixty (n = 60) students of the Pepsi Football Academy participated in the study. They were randomly distributed into two groups: a positive self-talk group (intervention, n = 30) and a control group (n = 30). The instrument used in the collection of data was a standard football goal post, while the research materials included a dummy soccer wall, a cord, an improvised vanishing spray, a clipboard, writing materials, a recording sheet, a self-talk log book, six standard size-5 footballs, cones, an audiotape and a compact disc. The Weinberg and Gould (2011) mental skills training manual was used. The reliability coefficient of the apparatus following a pilot study stood at 0.72. Before the commencement of the mental skills training programme, the participants were asked to take six simulated direct free kicks. At the end of each physical skills training session after the pre-test, the researcher spent at least 15 minutes with the groups, exposing them to the intervention. The mental skills training programme, alongside physical skills training, took place in two different locations for the two groups under study: the Agege Stadium main bowl football pitch (intervention group) and Ogba Ijaye (control group). The mental skills training programme lasted for eight weeks. After its completion, all the participants were asked to take another six simulated direct free kicks on the same field used for the pre-test to determine the efficacy of the treatment. The pre-test and post-test data were analysed using the inferential statistics of the t-test, with the alpha level set at 0.05. 
The results revealed significant differences in the t-test between the positive self-talk and control groups. Based on the findings, it is recommended that athletes be exposed to positive self-talk alongside their normal physical skills training for quality delivery of accurate direct free kicks during training and competition.
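The group comparison at alpha = 0.05 described above is an independent-samples t-test; a minimal sketch follows. The score vectors are fabricated stand-ins (goals scored out of six attempts), not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical post-test scores out of 6 free kicks (illustrative only)
self_talk_group = np.clip(rng.normal(4.0, 1.0, 30), 0, 6)
control_group = np.clip(rng.normal(2.5, 1.0, 30), 0, 6)

# Two-tailed independent-samples t-test at alpha = 0.05
t_stat, p_value = stats.ttest_ind(self_talk_group, control_group)
significant = p_value < 0.05
```

With 30 participants per group, a difference of this size yields a comfortably significant result; the study's actual effect size would have to be read from its own score distributions.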

Keywords: accuracy, direct free kick, pepsi football academy, positive self-talk

Procedia PDF Downloads 345
2802 Effectiveness of the Lacey Assessment of Preterm Infants to Predict Neuromotor Outcomes of Premature Babies at 12 Months Corrected Age

Authors: Thanooja Naushad, Meena Natarajan, Tushar Vasant Kulkarni

Abstract:

Background: The Lacey Assessment of Preterm Infants (LAPI) is used in clinical practice to identify premature babies at risk of neuromotor impairments, especially cerebral palsy. This study attempted to establish the validity of the Lacey assessment for predicting neuromotor outcomes of premature babies at 12 months corrected age and to compare its predictive ability with that of brain ultrasound. Methods: This prospective cohort study included 89 preterm infants (45 females and 44 males) born below 35 weeks gestation who were admitted to the neonatal intensive care unit of a government hospital in Dubai. The initial assessment was done using the Lacey assessment after the babies reached 33 weeks postmenstrual age. Follow-up assessment of neuromotor outcomes was done at 12 months (± 1 week) corrected age using two standardized outcome measures, i.e., the Infant Neurological International Battery and the Alberta Infant Motor Scale. Brain ultrasound data were collected retrospectively. Data were statistically analyzed, and the diagnostic accuracy of the LAPI was calculated both when used alone and in combination with brain ultrasound. Results: In comparison with brain ultrasound, the Lacey assessment showed superior specificity (96% vs. 77%), a higher positive predictive value (57% vs. 22%), and a higher positive likelihood ratio (18 vs. 3) for predicting neuromotor outcomes at one year of age. The sensitivity of the Lacey assessment was lower than that of brain ultrasound (66% vs. 83%), whereas specificity was similar (97% vs. 98%). A combination of Lacey assessment and brain ultrasound results showed higher sensitivity (80%), positive (66%) and negative (98%) predictive values, positive likelihood ratio (24), and test accuracy (95%) than the Lacey assessment alone in predicting neurological outcomes. The negative predictive value of the Lacey assessment was similar to that of its combination with brain ultrasound (96%). 
Conclusion: Results of this study suggest that the Lacey assessment of preterm infants can be used as a supplementary assessment tool for premature babies in the neonatal intensive care unit. Due to its high specificity, Lacey assessment can be used to identify those babies at low risk of abnormal neuromotor outcomes at a later age. When used along with the findings of the brain ultrasound, Lacey assessment has better sensitivity to identify preterm babies at particular risk. These findings have applications in identifying premature babies who may benefit from early intervention services.
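Every diagnostic statistic reported above (sensitivity, specificity, predictive values, likelihood ratio, test accuracy) derives from a single 2x2 confusion table; a minimal sketch with hypothetical counts (not the study's raw data):

```python
# Hypothetical 2x2 counts (illustrative, not the study's data):
# rows = test result (abnormal / normal), columns = true outcome
tp, fp = 9, 2     # abnormal test: outcome abnormal / outcome normal
fn, tn = 3, 75    # normal test:   outcome abnormal / outcome normal

sensitivity = tp / (tp + fn)                   # true positive rate
specificity = tn / (tn + fp)                   # true negative rate
ppv = tp / (tp + fp)                           # positive predictive value
npv = tn / (tn + fn)                           # negative predictive value
lr_positive = sensitivity / (1 - specificity)  # positive likelihood ratio
test_accuracy = (tp + tn) / (tp + fp + fn + tn)
```

Note how a low-prevalence sample drives the negative predictive value high even with modest sensitivity, which is exactly why the abstract recommends the LAPI for ruling out low-risk infants.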

Keywords: brain ultrasound, lacey assessment of preterm infants, neuromotor outcomes, preterm

Procedia PDF Downloads 136
2801 Mapping of Forest Cover Change in the Democratic Republic of the Congo

Authors: Armand Okende, Benjamin Beaumont

Abstract:

Introduction: Deforestation is a change in the structure and composition of flora and fauna that leads to a loss of biodiversity and of the production of goods and services, and to an increase in fires. It particularly concerns vast territories in tropical zones; this is the case of the territory of Bolobo in the current province of Maï-Ndombe in the Democratic Republic of Congo. The overall objective of this study is to show and quantitatively analyze the important forest changes that occurred between 2001 and 2018, because this area is witnessing significant deforestation. Methodology: Mapping and quantification are the methodological approaches we put forward to assess deforestation and forest changes through satellite images and raster layers. These satellite data from Global Forest Watch are integrated into GIS software (GRASS GIS and Quantum GIS) to represent the loss of forest cover that has occurred and the various changes recorded (e.g., forest gain) in the territory of Bolobo. Results: The results obtained quantify deforestation for the periods 2001-2006, 2007-2012 and 2013-2018 as the loss of forest area in hectares each year. The change maps produced for these study periods show that the loss of forest areas is gradually increasing. Conclusion: In light of this study, knowledge of forest management and protection is a challenge for ensuring good management of forest resources. To this end, it is advisable to carry out further studies that would optimize the monitoring of forests to guarantee the ecological and economic functions they provide in the Congo Basin, particularly in the Democratic Republic of Congo. 
In addition, the cartographic approach, coupled with the geographic information system and remote sensing proposed by Global Forest Watch using raster layers, provides interesting information to explain the loss of forest areas.
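The per-period quantification described above amounts to differencing binary forest masks between two dates and converting pixel counts to hectares. A toy numpy sketch follows; the tiny masks are invented, and the 30 m pixel size is an assumption matching Global Forest Watch's Landsat-based layers:

```python
import numpy as np

# Toy binary forest masks (1 = forest) for two dates; real layers would be
# Global Forest Watch rasters loaded via GRASS GIS / Quantum GIS.
forest_2001 = np.array([[1, 1, 1, 1],
                        [1, 1, 1, 0],
                        [1, 1, 0, 0]])
forest_2006 = np.array([[1, 1, 0, 0],
                        [1, 0, 0, 0],
                        [1, 1, 0, 0]])

pixel_size_m = 30.0                              # Landsat-class resolution
pixel_area_ha = (pixel_size_m ** 2) / 10_000.0   # 0.09 ha per pixel

loss_mask = (forest_2001 == 1) & (forest_2006 == 0)
gain_mask = (forest_2001 == 0) & (forest_2006 == 1)

loss_ha = loss_mask.sum() * pixel_area_ha
gain_ha = gain_mask.sum() * pixel_area_ha
```

Summing `loss_ha` per period (2001-2006, 2007-2012, 2013-2018) and dividing by the number of years gives the annual loss figures the study reports.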

Keywords: deforestation, loss year, forest change, remote sensing, drivers of deforestation

Procedia PDF Downloads 130
2800 A New Internal Architecture Based On Feature Selection for Holonic Manufacturing System

Authors: Jihan Abdulazeez Ahmed, Adnan Mohsin Abdulazeez Brifcani

Abstract:

This paper suggests a new internal architecture of the holon based on a feature selection model using a combination of the Bees Algorithm (BA) and an Artificial Neural Network (ANN). BA is used to generate candidate feature subsets, while the ANN is used as a classifier to evaluate the produced features. The proposed system is applied to the Wine data set, and the statistical results show that the proposed system is effective and has the ability to choose informative features with high accuracy.
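The wrapper loop at the heart of this approach can be sketched as follows on the Wine data set. This is a deliberately simplified stand-in: a single-site local search plays the role of the full Bees Algorithm, and a nearest-centroid classifier replaces the ANN evaluator:

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

def evaluate(subset):
    """Nearest-centroid accuracy on a feature subset (stand-in for the ANN)."""
    cols = np.flatnonzero(subset)
    if cols.size == 0:
        return 0.0
    centroids = np.array([X_tr[y_tr == c][:, cols].mean(axis=0)
                          for c in np.unique(y_tr)])
    d = ((X_te[:, cols][:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return float((d.argmin(axis=1) == y_te).mean())

# Bees-style search: start from a random scout site, then explore its
# neighborhood by toggling one feature at a time, keeping improvements.
n_feat = X.shape[1]
best = rng.integers(0, 2, n_feat)
best_acc = evaluate(best)
for _ in range(60):
    cand = best.copy()
    cand[rng.integers(n_feat)] ^= 1              # flip one feature in/out
    acc = evaluate(cand)
    if acc > best_acc:
        best, best_acc = cand, acc
```

The full BA would maintain multiple scout sites and recruit more bees to the better ones; the subset encoding and the evaluate-then-keep loop are the same.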

Keywords: artificial neural network, bees algorithm, feature selection, Holon

Procedia PDF Downloads 453
2799 Evaluation of Two DNA Extraction Methods for Minimal Porcine (Pork) Detection in Halal Food Sample Mixture Using Taqman Real-time PCR Technique

Authors: Duaa Mughal, Syeda Areeba Nadeem, Shakil Ahmed, Ishtiaq Ahmed Khan

Abstract:

The identification of porcine DNA in Halal food items is critical to ensuring compliance with dietary restrictions and religious beliefs; in Islam, porcine material is prohibited, as clearly stated in the Quran (Surah Al-Baqarah, Ayah 173). The purpose of this study was to compare two DNA extraction procedures for detecting 0.001% porcine DNA in processed Halal food sample mixtures containing chicken, camel, veal, turkey and goat meat, using the TaqMan Real-Time PCR technique. Two different commercial kit protocols were compared. The processed sample mixtures were prepared by spiking known concentrations of porcine DNA into non-porcine food matrices. Afterwards, the TaqMan Real-Time PCR technique was used to target a particular porcine gene in the extracted DNA samples, which were quantified after extraction. The amplification results were evaluated for sensitivity, specificity, and reproducibility. The results of the study demonstrated that both DNA extraction techniques can detect 0.01% porcine DNA in Halal food sample mixtures. However, compared to the alternative approach, the Eurofins GeneScan GeneSpin DNA Isolation kit showed better sensitivity and specificity. Furthermore, the commercial kit-based approach showed great repeatability, with minimal variance across repeats. Quantification of DNA was done using a fluorometric assay. In conclusion, this comparison of DNA extraction methods for detecting porcine DNA in Halal food sample mixtures using the TaqMan Real-Time PCR technique reveals that the commercial kit-based approach outperforms the other method in terms of sensitivity, specificity, and repeatability. This research helps to promote the development of reliable and standardized techniques for detecting porcine DNA in Halal food items, supporting religious conformity and nutritional assurance.
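Quantification in a TaqMan assay is typically done by reading a sample's Ct value off a standard curve regressing Ct against log10 quantity; a slope of about -3.32 corresponds to 100% amplification efficiency. The sketch below uses idealized, invented values rather than the study's runs:

```python
import numpy as np

# Idealized 10-fold dilution series of the porcine target (illustrative values)
quantities = np.array([1e4, 1e3, 1e2, 1e1, 1e0])   # copies per reaction
ct_values = np.array([18.0, 21.32, 24.64, 27.96, 31.28])

# Standard curve: Ct = slope * log10(quantity) + intercept
slope, intercept = np.polyfit(np.log10(quantities), ct_values, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0          # 1.0 means 100% efficient

# Quantify an unknown sample from its measured Ct (hypothetical Ct)
ct_unknown = 26.3
copies = 10.0 ** ((ct_unknown - intercept) / slope)
```

Detecting 0.001% spikes means the target Ct sits near the top of this curve, which is why extraction yield and inhibitor removal (where the two kits differed) dominate the limit of detection.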

Keywords: real time PCR (qPCR), DNA extraction, porcine DNA, halal food authentication, religious conformity

Procedia PDF Downloads 73
2798 The Use of Correlation Difference for the Prediction of Leakage in Pipeline Networks

Authors: Mabel Usunobun Olanipekun, Henry Ogbemudia Omoregbee

Abstract:

Anomalies such as leakages and bursts in water, hydraulic or petrochemical pipeline networks have significant implications for economic conditions and the environment. In order to ensure pipeline systems are reliable, they must be efficiently controlled. Wireless Sensor Networks (WSNs) have become a powerful critical-infrastructure monitoring technology for water, oil and gas pipelines. The loss of water, oil and gas is strongly linked to financial costs and environmental problems, and its avoidance leads to savings of economic resources; substantial repair costs and the loss of precious natural resources are part of the financial impact of leaking pipes. Pipeline systems experts have implemented various methodologies in recent decades to identify and locate leakages in water, oil and gas supply networks, including, among others, acoustic sensors, pressure measurements and abrupt-change statistical analysis. The issue of leak quantification is to estimate, given some observations about the network, the size and location of one or more leaks in a water pipeline network. In detecting background leakage, however, there is greater uncertainty in using these methodologies, since their output is not very reliable. In this work, we present a scalable concept and simulation in which a pressure-driven model (PDM) was used to determine water pipeline leakage in a network. Pressure data were collected with acoustic sensors located at node points a predetermined distance apart. Using the correlation difference, we were able to locate a leakage introduced at a predetermined point between two consecutive nodes, which causes a substantial pressure difference in the pipeline network. 
After de-noising the signal from the sensors at the nodes, we successfully obtained the exact point where we introduced the local leakage using the correlation difference model we developed.
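The localization step described above can be illustrated with the classic cross-correlation time-delay estimate between two acoustic sensors bracketing the leak: the peak lag of the cross-correlation gives the difference in arrival times, from which the leak position follows. The pipe length, wave speed and sampling rate below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

fs = 10_000.0          # sampling rate, Hz (assumption)
c = 1200.0             # acoustic propagation speed in the pipe, m/s (assumption)
L = 120.0              # distance between the two sensors, m (assumption)
true_leak_pos = 42.0   # metres from sensor 1 (what we hope to recover)

# Leak noise reaches sensor 2 later than sensor 1 by tau = (L - 2*x) / c
delay_samples = int(round((L - 2 * true_leak_pos) / c * fs))   # 300 samples
leak_noise = rng.standard_normal(5000)
s1 = leak_noise
s2 = np.concatenate([np.zeros(delay_samples), leak_noise])[: s1.size]

# Cross-correlate; the peak lag estimates the inter-sensor delay
xcorr = np.correlate(s2, s1, mode="full")
lag = int(np.argmax(xcorr)) - (s1.size - 1)
tau = lag / fs
leak_pos = (L - c * tau) / 2.0
```

In practice the de-noising step mentioned above matters precisely because broadband pipe noise flattens this correlation peak; on a clean signal the recovered position matches the introduced leak exactly.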

Keywords: leakage detection, acoustic signals, pipeline network, correlation, wireless sensor networks (WSNs)

Procedia PDF Downloads 102
2797 Ensemble Methods in Machine Learning: An Algorithmic Approach to Derive Distinctive Behaviors of Criminal Activity Applied to the Poaching Domain

Authors: Zachary Blanks, Solomon Sonya

Abstract:

Poaching presents a serious threat to endangered animal species, environmental conservation, and human life. Additionally, some poaching activity has even been linked to supplying funds to terrorist networks elsewhere around the world. Consequently, agencies dedicated to protecting wildlife habitats face a near-intractable task: adequately patrolling an entire area (spanning several thousand square kilometers) given the limited resources, funds, and personnel at their disposal. Thus, agencies need predictive tools that are both high-performing and easily implementable, to help in learning how the significant features (e.g., animal population densities, topography, behavior patterns of the criminals within the area) interact with each other, in hopes of abating poaching. This research develops a classification model using machine learning algorithms to aid in forecasting future attacks that is both easy to train and performs well compared to other models. We demonstrate how data imputation methods (specifically predictive mean matching, gradient boosting, and random forest multiple imputation) can be applied to analyze data and create significant predictions across a varied data set. Specifically, we apply these methods to improve the accuracy of the adopted prediction models (logistic regression, support vector machines, etc.). Finally, we assess the performance of the model and the accuracy of our data imputation methods by learning on a real-world data set constituting four years of imputed data and testing on one year of non-imputed data. This paper provides three main contributions. First, we extend work done by the Teamcore and CREATE (Center for Risk and Economic Analysis of Terrorism Events) research group at the University of Southern California (USC), working in conjunction with the Department of Homeland Security, to apply game theory and machine learning algorithms to develop more efficient ways of reducing poaching. 
This research introduces ensemble methods (random forests and stochastic gradient boosting) and applies them to real-world poaching data gathered from Ugandan rain forest park rangers. Second, we consider the effect of data imputation on both the performance of various algorithms and the general accuracy of the method itself when applied to a dependent variable where a large number of observations are missing. Third, we provide an alternate approach to predicting the probability of observing poaching both by season and by month. The results from this research are very promising. We conclude that by using stochastic gradient boosting to predict observations for non-commercial poaching by season, we are able to produce statistically equivalent results while being orders of magnitude faster in computation time and complexity. Additionally, when predicting potential poaching incidents by individual month rather than entire seasons, boosting techniques produce a mean area-under-the-curve increase of approximately 3% relative to previous prediction schedules by entire seasons.
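The impute-then-boost workflow described above can be sketched as follows. Mean imputation here is a simple stand-in for the multiple-imputation schemes the paper uses (predictive mean matching, random forest imputation), and the synthetic features are invented, not the poaching data set:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for terrain / animal-density / patrol features
X, y = make_classification(n_samples=400, n_features=8, n_informative=5,
                           class_sep=2.0, random_state=0)

# Knock out ~10% of entries to mimic missing field observations
mask = rng.random(X.shape) < 0.10
X[mask] = np.nan

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# Impute, then fit a stochastic gradient boosting classifier
model = make_pipeline(SimpleImputer(strategy="mean"),
                      GradientBoostingClassifier(random_state=0))
model.fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)
```

Wrapping imputation inside the pipeline keeps the test fold's statistics out of the imputer, mirroring the paper's train-on-imputed, test-on-non-imputed evaluation split.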

Keywords: ensemble methods, imputation, machine learning, random forests, statistical analysis, stochastic gradient boosting, wildlife protection

Procedia PDF Downloads 288