Search results for: diagnostic accuracy
2437 Navigating Uncertainties in Project Control: A Predictive Tracking Framework
Authors: Byung Cheol Kim
Abstract:
This study explores a method for the signal-noise separation challenge in project control, focusing on the limitations of traditional deterministic approaches that use single-point performance metrics to predict project outcomes. We detail how traditional methods often overlook future uncertainties, resulting in tracking biases when reliance is placed solely on immediate data without adjustments for predictive accuracy. Our investigation led to the development of the Predictive Tracking Project Control (PTPC) framework, which incorporates network simulation and Bayesian control models to adapt more effectively to project dynamics. The PTPC introduces controlled disturbances to better identify and separate tracking biases from useful predictive signals. We will demonstrate the efficacy of the PTPC with examples, highlighting its potential to enhance real-time project monitoring and decision-making, marking a significant shift towards more accurate project management practices.
Keywords: predictive tracking, project control, signal-noise separation, Bayesian inference
Procedia PDF Downloads 16
2436 Designing a Model for Measuring the Components of Good Governance in the Iranian Higher Education System
Authors: Maria Ghorbanian, Mohammad Ghahramani, Mahmood Abolghasemi
Abstract:
Universities and institutions of higher education in Iran, like higher education institutions elsewhere in the world, carry a weighty mission: to educate students according to the needs of the country. Taking on such a serious responsibility requires a good governance system for planning, formulating executive plans, evaluating them, and finally modifying them in accordance with current conditions and the challenges ahead. In this regard, the present study aimed to identify the components of good governance in the Iranian higher education system using a survey method with a quantitative approach. Data were collected with a researcher-made questionnaire comprising two parts: personal and professional characteristics (5 questions) and the three components of good governance in the Iranian higher education system, namely good management and leadership (8 items), continuous and effective evaluation (of university performance, finance, and university appointments) (8 items), and civic responsibility and sustainable development (7 items). These variables were measured and coded on a five-level Likert scale from "Very Low = 1" to "Very High = 5". First, the validity and reliability of the research model were examined. The reliability of the questionnaire was calculated using two methods: Cronbach's alpha and composite reliability. The Fornell-Larcker criterion was also used to assess discriminant validity. The statistical population comprised all faculty members of public universities in Tehran (N = 4429). The sample size was estimated at 340 using Cochran's formula, drawn by stratified random sampling with proportional allocation. The data were analyzed using structural equation modeling with the least-squares approach.
The results showed that the component of civic responsibility and sustainable development, with a factor loading of 0.827, is the most important element of good governance.
Keywords: good governance, higher education, sustainable development
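The questionnaire reliability check described in this abstract (Cronbach's alpha over Likert items) can be sketched numerically. The scores below are made up for illustration, not the study's survey data:

```python
import numpy as np

def cronbach_alpha(items):
    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

# three perfectly consistent 5-point Likert items -> alpha = 1.0
scores = np.column_stack([[1, 2, 3, 4, 5]] * 3)
alpha = cronbach_alpha(scores)
```

In practice, an alpha above roughly 0.7 is usually taken as acceptable internal consistency for a scale of this kind.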
Procedia PDF Downloads 170
2435 Evaluating Radiation Protection Aspects for Pediatric Chest Radiography: Imaging Standards and Radiation Dose Measurements in Various Hospitals in Kuwait
Authors: Kholood Baron
Abstract:
Chest radiography (CXR) is one of the most important diagnostic examinations in pediatric radiography for diagnosing various diseases. Since chest X-rays use ionizing radiation to obtain images, radiographers should follow strict radiation protection strategies and the ALARA principle to ensure that pediatric patients receive the lowest dose possible [1][2]. The aim is to evaluate different criteria related to pediatric CXR examinations performed in the radiology departments of five hospitals in Kuwait. Methods: Data were collected from a questionnaire and from Entrance Skin Dose (ESD) measurements during CXR. 100 questionnaire responses were collected and analyzed to highlight issues related to immobilization devices, radiation protection, and repeat rates, while thermoluminescent dosimeters (TLDs) were used to measure ESD during 25 pediatric CXR examinations. In addition, other aspects of radiographer skills and the information written in patient requests were collected and recorded. Results: Questionnaire responses showed that most radiographers follow most radiation protection guidelines but need to improve their skills in collimating to the region of interest (ROI), handling immobilization tools, and setting exposure factors, since the first issue was least applied to young pediatric patients and the latter two were the most common reasons for repeating an image. The ESD measurements revealed that the average dose involved in pediatric CXR is 143.9 µGy, which is relatively high but still within the recommended limits [2-3]. The data suggest that these relatively high ESD values may result from using higher mAs, and it is therefore recommended to lower the mAs in accordance with the ALARA principle. In conclusion, radiographers have the knowledge and the tools to reduce the radiation dose to pediatric patients, but a few lack the skills to optimize collimation, immobilization, and exposure factors. The ESD values were within recommended limits.
This research recommends that future efforts focus on improving radiographers' commitment to radiation protection and their skills in dealing with pediatric patients. This involves lowering the mAs used during digital radiography (DR).
Keywords: pediatric radiography, dosimetry, ESD measurements, radiation protection
Procedia PDF Downloads 27
2434 An Alternative Approach for Assessing the Impact of Cutting Conditions on Surface Roughness Using Single Decision Tree
Authors: S. Ghorbani, N. I. Polushin
Abstract:
In this study, an approach to identifying the factors affecting surface roughness in a machining process is presented. The study is based on 81 observations of surface roughness over a wide range of cutting tools (conventional, cutting tool with holes, cutting tool with composite material), workpiece materials (AISI 1045 steel, AA2024 aluminum alloy, A48-class30 gray cast iron), spindle speeds (630-1000 rpm), feed rates (0.05-0.075 mm/rev), depths of cut (0.05-0.15 mm), and tool overhangs (41-65 mm). A single decision tree (SDT) analysis was performed to identify the factors for predicting a model of surface roughness, and the CART algorithm was employed for building and evaluating the regression tree. Results show that the single decision tree outperforms traditional regression models, with higher forecast accuracy and stronger predictive value.
Keywords: cutting condition, surface roughness, decision tree, CART algorithm
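A CART regression tree of the kind this abstract describes can be sketched as follows. The data here are synthetic stand-ins generated over the study's stated parameter ranges (the roughness formula is invented purely so that one factor dominates), not the authors' 81 measurements:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 81
spindle = rng.uniform(630, 1000, n)    # rpm
feed = rng.uniform(0.05, 0.075, n)     # mm/rev
depth = rng.uniform(0.05, 0.15, n)     # mm
overhang = rng.uniform(41, 65, n)      # mm
X = np.column_stack([spindle, feed, depth, overhang])
# synthetic roughness: tool overhang dominates by construction
y = 0.02 * overhang + 5.0 * feed + rng.normal(0, 0.02, n)

# CART: binary recursive partitioning minimizing squared error
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
top_factor = int(np.argmax(tree.feature_importances_))  # expected: overhang (index 3)
```

The `feature_importances_` vector is how such a tree ranks which cutting conditions drive roughness, which is the kind of factor identification the abstract reports.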
Procedia PDF Downloads 371
2433 The Influence of Machine Tool Composite Stiffness to the Surface Waviness When Processing Posture Constantly Switching
Authors: Song Zhiyong, Zhao Bo, Du Li, Wang Wei
Abstract:
Aircraft structures generally have complex surfaces. Because the motion axes constantly switch postures, a five-axis CNC machine's composite stiffness changes during CNC machining. This gives rise to different vibration amplitudes in the processing system, which in turn affect surface waviness differently. To address this problem, we take the CNC machining of the "S"-shape test specimen as the object: by calculating the five-axis CNC machine's composite stiffness and establishing a vibration model, we analyze the mechanism by which vibration amplitude influences surface waviness. Surface quality measurement experiments verify the validity and accuracy of the theoretical analysis. The research results provide a theoretical basis for surface waviness control.
Keywords: five axis CNC machine, "S" shape test specimen, composite stiffness, surface waviness
Procedia PDF Downloads 389
2432 Binarization and Recognition of Characters from Historical Degraded Documents
Authors: Bency Jacob, S.B. Waykar
Abstract:
Degradations in historical document images appear due to aging of the documents. It is very difficult to understand and retrieve text from badly degraded documents because of the variation between the document foreground and background. Thresholding such document images either results in broken characters or in the detection of false text. Numerous algorithms exist that can separate text and background efficiently in the textual regions of a document, but portions of the background are mistaken for text in areas that hardly contain any text. This paper presents a way to overcome these problems with a robust binarization technique that recovers the text from severely degraded document images and thereby increases the accuracy of optical character recognition systems. The proposed document recovery algorithm efficiently removes degradations from document images. We use Otsu's method together with local and global thresholding; after binarization, the characters in the degraded documents are trained and recognized.
Keywords: binarization, denoising, global thresholding, local thresholding, thresholding
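Otsu's method, one of the global thresholding steps named above, picks the threshold that maximizes between-class variance of the grayscale histogram. A minimal NumPy sketch, tested on a tiny synthetic "page" (dark text on a light background), not on real historical scans:

```python
import numpy as np

def otsu_threshold(gray):
    # exhaustive search over 8-bit thresholds maximizing between-class variance
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()   # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0          # class means
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2                 # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# synthetic bimodal page: light background (200) with a dark text block (50)
page = np.full((20, 20), 200, dtype=np.uint8)
page[5:15, 5:15] = 50
t = otsu_threshold(page)
```

A robust pipeline of the kind the paper proposes would combine this global threshold with local (windowed) thresholds in regions where degradation makes the global histogram misleading.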
Procedia PDF Downloads 342
2431 Attention-Based Spatio-Temporal Approach for Fire and Smoke Detection
Authors: Alireza Mirrashid, Mohammad Khoshbin, Ali Atghaei, Hassan Shahbazi
Abstract:
In various industries, smoke and fire are two of the most important threats in the workplace. One of the common methods for detecting smoke and fire is the use of infrared thermal and smoke sensors, which cannot be used in outdoor applications; the use of vision-based methods therefore seems necessary. The problem of smoke and fire detection is spatiotemporal and requires spatiotemporal solutions. This paper presents a method that uses spatial features along with temporal features to detect smoke and fire in the scene. It consists of three main parts; the task of each part is to reduce the error of the previous part so that the final model has robust performance. The method also uses transformer modules to increase the accuracy of the model. The results show that the proposed approach performs well on the smoke and fire detection problem and can be used to increase workplace safety.
Keywords: attention, fire detection, smoke detection, spatio-temporal
Procedia PDF Downloads 201
2430 Development of a Computer Aided Diagnosis Tool for Brain Tumor Extraction and Classification
Authors: Fathi Kallel, Abdulelah Alabd Uljabbar, Abdulrahman Aldukhail, Abdulaziz Alomran
Abstract:
The brain is an important organ in our body since it is responsible for most of our functions, such as vision and memory. However, diseases such as Alzheimer's and tumors can affect the brain and lead to partial or full disorder. Regular diagnosis is necessary as a preventive measure and can help doctors detect possible trouble early and apply the appropriate treatment, especially in the case of brain tumors. Different imaging modalities are used for the diagnosis of brain tumors; the most powerful and most widely used modality is Magnetic Resonance Imaging (MRI). MRI images are analyzed by doctors in order to locate an eventual tumor in the brain and prescribe the appropriate treatment. Diverse image processing methods have also been proposed to help doctors identify and analyze tumors. In fact, a range of Computer Aided Diagnosis (CAD) tools incorporating image processing algorithms are exploited by doctors as a second opinion to analyze and identify brain tumors. In this paper, we propose a new advanced CAD for brain tumor identification, classification, and feature extraction. Our proposed CAD includes three main parts. First, the brain MRI is loaded. Second, a robust technique for brain tumor extraction is proposed, based on both the Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). The DWT is characterized by its multiresolution analytic property, which is why it was applied to MRI images at different decomposition levels for feature extraction. Nevertheless, this technique suffers from a main drawback: it requires huge storage and is computationally expensive. To decrease the dimensions of the feature vector and the computation time, PCA is applied. In the last stage, according to the extracted features, the brain tumor is classified as either benign or malignant using the Support Vector Machine (SVM) algorithm.
A CAD tool for brain tumor detection and classification, including all the above-mentioned stages, is designed and developed using the MATLAB GUIDE user interface.
Keywords: MRI, brain tumor, CAD, feature extraction, DWT, PCA, classification, SVM
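The DWT → PCA → SVM pipeline can be sketched compactly. The images below are random toy arrays with an artificial bright blob standing in for a tumor (not MRI data), and the one-level Haar approximation band is implemented directly rather than via a wavelet library:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

def haar_level1(img):
    # one-level 2-D Haar approximation (LL band): average of 2x2 blocks
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(1)
# toy "slices": class 1 carries a bright blob, class 0 does not (hypothetical data)
X_img = rng.normal(0, 1, (60, 16, 16))
y = np.repeat([0, 1], 30)
X_img[30:, 4:12, 4:12] += 3.0

feats = np.array([haar_level1(im).ravel() for im in X_img])  # DWT features
feats = PCA(n_components=5).fit_transform(feats)             # dimensionality reduction
clf = SVC(kernel="rbf").fit(feats, y)                        # benign/malignant stand-in
acc = clf.score(feats, y)
```

PCA here plays exactly the role the abstract assigns it: the 64 LL coefficients per image are compressed to 5 components before the SVM sees them.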
Procedia PDF Downloads 246
2429 Research on Measuring Operational Risk in Commercial Banks Based on Internal Control
Authors: Baobao Li
Abstract:
Operational risk covers all operations of commercial banks and is closely related to a bank's internal control. In commercial banks' management practice, however, internal control is usually separated from operational risk measurement. With the increase in operational risk events in recent years, operational risk is receiving more and more attention from regulators and bank managements. The paper first discusses the relationship between internal control and operational risk management and uses a CVaR-POT model to measure operational risk; it then puts forward a modified measurement method that uses operational risk assessment results to adjust the measurement results of the CVaR-POT model. The paper also analyzes the necessity and rationality of this method. The method takes the influence of internal control into consideration, improves the accuracy and effectiveness of operational risk measurement, and saves economic capital for commercial banks, avoiding the drawbacks of applying mainstream models one-sidedly.
Keywords: commercial banks, internal control, operational risk, risk measurement
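The peaks-over-threshold (POT) part of a CVaR-POT model fits a generalized Pareto distribution to losses above a high threshold and reads VaR and CVaR off the fitted tail. A sketch on a simulated heavy-tailed loss sample (the data, threshold quantile, and confidence level are illustrative choices, not the paper's):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
# hypothetical operational-loss sample (heavy-tailed)
losses = rng.pareto(2.5, 5000) * 1e5

u = np.quantile(losses, 0.95)          # POT threshold
exceed = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0.0)   # GPD shape and scale

p = 0.999
n, n_u = len(losses), len(exceed)
# standard POT quantile estimator and its expected-shortfall (CVaR) companion
var = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)
cvar = (var + beta - xi * u) / (1 - xi)        # valid for xi < 1
```

The gap between `cvar` and `var` is the tail-heaviness premium; the paper's modification would then scale such figures by an internal-control assessment score.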
Procedia PDF Downloads 396
2428 Cryptic Diversity: Identifying Two Morphologically Similar Species of Invasive Apple Snails in Peninsular Malaysia
Authors: Suganiya Rama Rao, Yoon-Yen Yow, Thor-Seng Liew, Shyamala Ratnayeke
Abstract:
Invasive snails in the genus Pomacea have spread across Southeast Asia, including Peninsular Malaysia. Apart from significant economic costs to wetland crops, very little is known about the snails' effects on native species and on wetland function through their alteration of macrophyte communities. This study was conducted to establish diagnostic characteristics of Pomacea species in the Malaysian environment using genetic and morphological criteria. Snails were collected from eight localities in the northern and central regions of Peninsular Malaysia. The mitochondrial COI gene of 52 adult snails was amplified and sequenced. Maximum likelihood analysis was used to analyse species identity and assess phylogenetic relationships among snails from different geographic locations. Shells of the two species were compared using geometric morphometric analysis and covariance analyses. Shell height accounted for most of the observed variation between P. canaliculata and P. maculata, with the latter possessing a smaller mean ratio of shell height to aperture height (p < 0.0001) and of shell height to shell width (p < 0.0001). Genomic and phylogenetic analysis demonstrated the presence of two monophyletic taxa, P. canaliculata and P. maculata, in samples from Peninsular Malaysia. P. maculata co-occurred with P. canaliculata in five localities, but samples from three localities contained only P. canaliculata. This study is the first to confirm the presence of two of the most invasive species of Pomacea in Peninsular Malaysia using a genomic approach. P. canaliculata appears to be the more widespread species. Despite statistical differences, both quantitative and qualitative morphological characteristics demonstrate much interspecific overlap and intraspecific variability; thus morphology alone cannot reliably verify species identity.
Molecular techniques for distinguishing between these two highly invasive Pomacea species are needed to understand their specific ecological niches and develop effective protocols for their management.
Keywords: Pomacea canaliculata, Pomacea maculata, invasive species, phylogenetic analysis, geometric morphometric analysis
Procedia PDF Downloads 261
2427 Dissolved Gas Analysis Based Regression Rules from Trained ANN for Transformer Fault Diagnosis
Authors: Deepika Bhalla, Raj Kumar Bansal, Hari Om Gupta
Abstract:
Dissolved Gas Analysis (DGA) has been widely used for fault diagnosis in transformers. Artificial neural networks (ANN) have high accuracy but are regarded as black boxes that are difficult to interpret. For many problems it is desirable to extract knowledge from trained neural networks (NN) so that the user can gain a better understanding of the solution arrived at by the NN. This paper applies a pedagogical approach for rule extraction from function approximating neural networks (REFANN) to incipient fault diagnosis, using the concentrations of the dissolved gases within the transformer oil as the input to the NN. The input space is split into subregions, and for each subregion there is a linear equation that predicts the type of fault developing within the transformer. Experiments on real data indicate that the approach can extract simple and useful rules and gives fault predictions that match the actual fault and are at times better than those predicted by the IEC method.
Keywords: artificial neural networks, dissolved gas analysis, rule extraction, transformer
Procedia PDF Downloads 534
2426 Using TRACE, PARCS, and SNAP Codes to Analyze the Load Rejection Transient of ABWR
Authors: J. R. Wang, H. C. Chang, A. L. Ho, J. H. Yang, S. W. Chen, C. Shih
Abstract:
The purpose of the study is to analyze the load rejection transient of the ABWR using the TRACE, PARCS, and SNAP codes. The study proceeds in steps. First, the ABWR model is established using the TRACE, PARCS, and SNAP codes. Second, key parameters are identified to further refine the TRACE/PARCS/SNAP model in the frame of a steady-state analysis. Third, the model is used to perform the load rejection transient analysis. Finally, the analysis results are compared with the FSAR data. The TRACE/PARCS results are consistent with the FSAR data for the important parameters, indicating that the TRACE/PARCS/SNAP model of the ABWR has good accuracy for the load rejection transient.
Keywords: ABWR, TRACE, PARCS, SNAP
Procedia PDF Downloads 195
2425 Efficient Fake News Detection Using Machine Learning and Deep Learning Approaches
Authors: Chaima Babi, Said Gadri
Abstract:
Fake news continues to spread at a very fast rate; this requires implementing efficient techniques for testing the reliability of online content. To that end, the current research addresses the fake news problem using deep learning (DL) and machine learning (ML) approaches. We developed the traditional LSTM (Long Short-Term Memory) model and the bidirectional BiLSTM model. The process is to train on most of the samples in the dataset, validate the model on a held-out subset called the test set to provide an unbiased evaluation of the final model fit on the training dataset, then compute the classification accuracy and compare the results. For the programming stage, we used the TensorFlow and Keras libraries in Python, with Graphical Processing Units (GPUs) supporting the development of the deep learning applications.
Keywords: machine learning, deep learning, natural language, fake news, Bi-LSTM, LSTM, multiclass classification
Procedia PDF Downloads 93
2424 Urban Growth Prediction Using Artificial Neural Networks in Athens, Greece
Authors: Dimitrios Triantakonstantis, Demetris Stathakis
Abstract:
Urban areas have expanded throughout the globe. Monitoring and modeling urban growth have become a necessity for sustainable urban planning and decision making. Urban prediction models are important tools for analyzing the causes and consequences of urban land use dynamics. The objective of this research paper is to analyze and model the urban change that occurred from 1990 to 2000, using CORINE land cover maps. The model was developed from drivers of urban change (such as road distance, slope, etc.) under an Artificial Neural Network modeling approach. Validation was achieved using a prediction map for 2006, which was compared with the real Urban Atlas map of 2006. The accuracy yielded a Kappa index of agreement of 0.639 and a Cramer's V value of 0.648. These encouraging results indicate the value of the developed urban growth prediction model, which, using a set of commonly available biophysical drivers, could serve as a management tool for the assessment of urban change.
Keywords: artificial neural networks, CORINE, urban atlas, urban growth prediction
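The two agreement statistics reported above, Cohen's kappa and Cramér's V, are both computed from the confusion matrix of predicted versus observed land cover. A self-contained sketch (the 2×2 matrix below is invented for illustration, not the Athens validation data):

```python
import numpy as np

def kappa_and_cramers_v(confusion):
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                          # observed agreement
    pe = (cm.sum(0) * cm.sum(1)).sum() / n ** 2    # chance agreement
    kappa = (po - pe) / (1 - pe)
    expected = np.outer(cm.sum(1), cm.sum(0)) / n
    chi2 = ((cm - expected) ** 2 / expected).sum()
    v = np.sqrt(chi2 / (n * (min(cm.shape) - 1)))  # Cramér's V
    return kappa, v

# hypothetical urban/non-urban confusion matrix: rows = observed, cols = predicted
kappa, v = kappa_and_cramers_v([[40, 10], [10, 40]])
```

For this symmetric example both statistics come out equal; in general kappa penalizes chance agreement on the diagonal while V measures overall association.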
Procedia PDF Downloads 527
2423 Decentralized Peak-Shaving Strategies for Integrated Domestic Batteries
Authors: Corentin Jankowiak, Aggelos Zacharopoulos, Caterina Brandoni
Abstract:
In a context of increasing stress put on the electricity network by the decarbonization of many sectors, energy storage is likely to be the key mitigating element, acting as a buffer between production and demand. In particular, the potential of storage is highest when it is connected close to the loads. Yet low-voltage storage struggles to penetrate the market at a large scale due to the novelty and complexity of the solution and the competitive advantage of fossil-fuel-based technologies under current regulations. Strong and reliable numerical simulations are required to show the benefits of storage located near loads and to promote its development. The present study excludes aggregated control of storage: it is assumed that the storage units operate independently of one another without exchanging information, as is currently mostly the case. A computationally light battery model is presented in detail and validated by direct comparison with a domestic battery operating in real conditions. This model is then used to develop Peak-Shaving (PS) control strategies, as this is the decentralized service from which beneficial impacts are most likely to emerge. The aggregation of flatter, peak-shaved consumption profiles is likely to lead to flatter profiles at higher voltage levels. Furthermore, voltage fluctuations can be expected to decrease if spikes of individual consumption are reduced. The crucial part of achieving PS lies in the charging pattern: peaks depend on the switching on and off of appliances in the dwelling by the occupants and are therefore impossible to predict accurately. A performant PS strategy must therefore include a smart charge recovery algorithm that ensures enough energy is present in the battery in case it is needed, without generating new peaks by charging the unit. Three categories of PS algorithms are introduced in detail.
The first uses a constant threshold or power rate for charge recovery; the second uses the State of Charge (SOC) as a decision variable; and the third uses a load forecast (the impact of whose accuracy is discussed) to generate PS. A set of performance metrics was defined in order to quantitatively evaluate the algorithms' operation regarding peak reduction, total energy consumption, and self-consumption of domestic photovoltaic generation. The algorithms were tested on load profiles with a 1-minute granularity over a 1-year period, and their performance was assessed against these metrics. The results show that a constant charging threshold or power is far from optimal: a single fixed value is unlikely to fit the variability of a residential profile. As could be expected, forecast-based algorithms show the highest performance; however, they depend on the accuracy of the forecast. On the other hand, SOC-based algorithms also present satisfying performance, making them a strong alternative when a reliable forecast is not available.
Keywords: decentralised control, domestic integrated batteries, electricity network performance, peak-shaving algorithm
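An SOC-based peak-shaving rule of the second category can be sketched in a few lines: discharge above the threshold, and recover charge below it at a power that shrinks as the battery fills, so recovery itself never creates a new peak. All parameter values and the toy load profile are invented for illustration, not the paper's:

```python
def peak_shave(load, threshold, capacity, soc0=0.5, p_max=3.0, dt=1 / 60):
    """SOC-aware peak shaving. load: kW per time step; dt: step length in hours.
    Returns the grid import profile seen upstream of the battery."""
    soc = soc0 * capacity  # stored energy, kWh
    grid = []
    for p in load:
        if p > threshold:
            d = min(p - threshold, p_max, soc / dt)      # discharge power, kW
            soc -= d * dt
            grid.append(p - d)
        else:
            headroom = min(threshold - p, p_max)
            c = headroom * (1 - soc / capacity)          # SOC-scaled recovery
            c = min(c, (capacity - soc) / dt)            # never overfill
            soc += c * dt
            grid.append(p + c)
    return grid

# toy profile: 1 kW base load with a 10-minute 5 kW spike
load = [1.0] * 30 + [5.0] * 10 + [1.0] * 30
grid = peak_shave(load, threshold=3.0, capacity=2.0)
```

Because the recovery power is capped by the remaining headroom under the threshold, the grid profile can never exceed the threshold while charging, which is the property the abstract's "smart charge recovery" requirement demands.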
Procedia PDF Downloads 116
2422 Assessment of Acquired Language Disorders in Bilingual French-English Adults in Ontario: Current Practice and Challenges
Authors: Sophie Laurence, Catherine Rivard
Abstract:
The assessment of acquired language disorders in the adult population, whether bilingual or monolingual, is a complex process that requires the speech-language pathologist (SLP) to make judicious choices of assessment methods and tools. The task is even more complex with Ontario's bilingual population due to the lack of linguistically and culturally appropriate tools for this population. Numerous studies have examined language assessment in the bilingual pediatric population; however, few have focused on assessing acquired language disorders in bilingual adults. This study's main objective is to identify the challenges SLPs encounter when assessing language in the bilingual English-French adult population in Ontario, in order ultimately to serve this population better. An online questionnaire was made available to 1325 members of the College of Audiologists and Speech-Language Pathologists of Ontario (CASLPO) who work with the adult population. The answers to this questionnaire (n = 71) allowed us to identify the tools and strategies most commonly used by SLPs in current practice, identify the assessment challenges they face, and determine the causes of these challenges as well as potential solutions. For assessment in English and French, the Western Aphasia Battery, the Boston Diagnostic Aphasia Examination, and the Boston Naming Test were the three tools respondents deemed most relevant. In addition, the results revealed that limited access to SLPs and interpreters who speak the client's language, and the lack of standardized and normalized assessment tools for Ontario's French-speaking and bilingual English-French clientele, are at the heart of the challenges of current SLP practice.
Consistent with these findings, respondents highlighted two potential solutions to address these challenges: giving SLPs access to standardized and normalized tools for the population under study, and better access to SLPs and interpreters who speak the client's language.
Keywords: assessment, acquired language disorders, bilingualism, speech-language pathology, adult population
Procedia PDF Downloads 136
2421 Estimation of Slab Depth, Column Size and Rebar Location of Concrete Specimen Using Impact Echo Method
Authors: Y. T. Lee, J. H. Na, S. H. Kim, S. U. Hong
Abstract:
In this study, experimental research on the estimation of slab depth, column size, and rebar location in concrete specimens is conducted using the Impact Echo (IE) method, a stress-wave-based non-destructive test. The slab specimen for depth estimation had a plan size of 1800×300 mm and six different depths: 150 mm, 180 mm, 210 mm, 240 mm, 270 mm, and 300 mm. The concrete column specimens were manufactured in three sizes: 300×300×300 mm, 400×400×400 mm, and 500×500×500 mm. For rebar location estimation, ∅22 mm rebar was embedded in a 300×370×200 mm specimen at 130 mm and 150 mm from the top surface to the top of the rebar. The resulting error rate was an overall mean of 3.1% for slab depth and 1.7% for column size. The mean error rate for rebar location was 1.72% for the top bars, 1.19% for the bottom bars, and 1.5% overall, showing good relative accuracy.
Keywords: impact echo method, estimation, slab depth, column size, rebar location, concrete
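In its simplest form, the impact-echo thickness estimate divides the P-wave speed by twice the dominant reflection frequency (the commonly used shape correction factor of about 0.96 is omitted here for clarity). The wave speed and peak frequency below are assumed values, not the study's measurements:

```python
def ie_thickness(cp_m_s, f_hz):
    # simplified impact-echo relation: T = Cp / (2 * f)
    # Cp: P-wave speed in concrete (m/s), f: dominant echo frequency (Hz)
    return cp_m_s / (2.0 * f_hz)

# e.g. Cp = 4000 m/s and a 13.3 kHz spectral peak suggest a ~150 mm slab,
# matching the thinnest slab depth tested in the study
t = ie_thickness(4000.0, 13333.0)
```

In practice `f_hz` is read off as the peak of the FFT of the recorded surface displacement, and the few-percent error rates reported above come largely from uncertainty in Cp.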
Procedia PDF Downloads 349
2420 Indoor Robot Positioning with Precise Correlation Computations over Walsh-Coded Lightwave Signal Sequences
Authors: Jen-Fa Huang, Yu-Wei Chiu, Jhe-Ren Cheng
Abstract:
The visible light communication (VLC) technique has become a useful method based on LED light blinking. Several issues in indoor mobile robot positioning with LED blinking are examined in this paper. In the transmitter, we control the transceiver's blinking message. Orthogonal Walsh codes are adopted for this purpose, with the auto-correlation function (ACF) used to detect signal sequences. In the robot receiver, we set the time frame to 1 ns for the signal passing from the transceiver to the mobile robot, and after many periods the peak value of the ACF is detected at the mobile robot; the transceiver then immediately transmits the signal again. By capturing three peak values, we can obtain the time difference of arrival (TDOA) between two peak-value intervals and finally analyze the accuracy of the robot position.
Keywords: visible light communication, auto-correlation function (ACF), peak value of ACF, time difference of arrival (TDOA)
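The correlation-peak detection at the heart of this scheme can be sketched with NumPy: generate orthogonal Walsh codes from the Sylvester-Hadamard construction, embed one at an unknown delay, and locate it by the correlation peak. Code length, delay, and the noiseless channel are illustrative simplifications:

```python
import numpy as np

def walsh_codes(order):
    # Sylvester-Hadamard recursion: rows are mutually orthogonal Walsh codes
    H = np.array([[1]])
    for _ in range(order):
        H = np.block([[H, H], [H, -H]])
    return H

H = walsh_codes(3)          # 8 codes of length 8
code = H[5]

delay = 4                   # unknown to the receiver
rx = np.concatenate([np.zeros(delay), code, np.zeros(4)])
corr = np.correlate(rx, code, mode="valid")
est_delay = int(np.argmax(corr))   # correlation peak locates the code
```

The peak position times the 1 ns frame duration gives the propagation delay, and differences between successive peak positions give the TDOA values used for positioning.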
Procedia PDF Downloads 325
2419 Fast and Robust Long-term Tracking with Effective Searching Model
Authors: Thang V. Kieu, Long P. Nguyen
Abstract:
Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, the algorithm is not robust in cases where the object is lost through a sudden change of direction, occlusion, or leaving the field of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning that analyzes the response map of each frame, and a classification algorithm based on random ferns for a reliable target re-locating mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the proposed algorithm achieved precision and success rates 2.92 and 2.61 times higher, respectively, than the original KCF algorithm. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.
Keywords: correlation filter, long-term tracking, random fern, real-time tracking
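One common way to analyze a correlation-filter response map for target-loss warning is the peak-to-sidelobe ratio (PSR): a confident track produces a sharp, isolated peak, while a lost target flattens the map. This is a generic sketch of that idea, not the paper's specific anomaly detector; the exclusion window size and toy maps are assumptions:

```python
import numpy as np

def psr(response, exclude=5):
    # Peak-to-Sidelobe Ratio: (peak - mean of sidelobe) / std of sidelobe
    r0, c0 = np.unravel_index(np.argmax(response), response.shape)
    peak = response[r0, c0]
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, r0 - exclude):r0 + exclude + 1,
         max(0, c0 - exclude):c0 + exclude + 1] = False  # exclude peak region
    side = response[mask]
    return (peak - side.mean()) / (side.std() + 1e-12)

# confident frame: single sharp peak; lost frame: noise only
sharp = np.zeros((50, 50)); sharp[25, 25] = 1.0
flat = np.random.default_rng(4).normal(0, 1, (50, 50))
psr_sharp, psr_flat = psr(sharp), psr(flat)
```

A tracker would raise the loss warning when PSR drops below a tuned threshold and then hand control to the re-detection classifier.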
Procedia PDF Downloads 135
2418 Predictors of Non-Alcoholic Fatty Liver Disease in Egyptian Obese Adolescents
Authors: Moushira Zaki, Wafaa Ezzat, Yasser Elhosary, Omnia Saleh
Abstract:
Non-alcoholic fatty liver disease (NAFLD) has increased in conjunction with obesity. The accuracy of risk factors for detecting NAFLD in obese adolescents has not undergone a formal evaluation. The aim of this study was to evaluate predictors of NAFLD among Egyptian obese female adolescents. The study included 162 obese female adolescents. All were subjected to anthropometry, biochemical analysis, and abdominal ultrasonographic assessment. Metabolic syndrome (MS) was diagnosed according to the IDF criteria. A significant association between the presence of MS and NAFLD was observed. Obese adolescents with NAFLD had significantly higher levels of ALT, triglycerides, fasting glucose, insulin, blood pressure, and HOMA-IR, and lower HDL-C levels, compared with obese cases without NAFLD. Receiver operating characteristic (ROC) curve analysis shows that ALT is a sensitive predictor of NAFLD, confirming that ALT can be used as a marker of NAFLD.
Keywords: obesity, NAFLD, predictors, adolescents, Egyptians, risk factors, prevalence
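The ROC analysis used to rate ALT as a predictor can be sketched as follows. The ALT distributions below are simulated stand-ins (means, spreads, and group sizes are invented), not the study's patient data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
# hypothetical ALT values (U/L): higher on average in the NAFLD group
alt_no_nafld = rng.normal(22, 6, 100)
alt_nafld = rng.normal(40, 12, 62)
y = np.r_[np.zeros(100), np.ones(62)]
alt = np.r_[alt_no_nafld, alt_nafld]

auc = roc_auc_score(y, alt)            # area under the ROC curve
fpr, tpr, thresholds = roc_curve(y, alt)
cutoff = thresholds[np.argmax(tpr - fpr)]  # Youden's J optimal cut-off
```

The AUC summarizes how well ALT separates the groups, and Youden's J picks the cut-off a clinician might use to flag likely NAFLD for ultrasound follow-up.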
Procedia PDF Downloads 387
2417 Optimization Techniques for Microwave Structures
Authors: Malika Ourabia
Abstract:
A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The discontinuities are characterized using a hybrid spectral/numerical technique. The structure presents an arbitrary number of ports, each with a different orientation and dimensions. This article presents a hybrid method based on multimode contour integral and mode matching techniques. The process is based on segmentation, dividing the structure into key building blocks. We use the multimode contour integral method to analyze the blocks containing irregularly shaped discontinuities. Finally, the multimode scattering matrix of the whole structure is found by cascading the blocks. The new method is therefore suitable for the analysis of a wide range of waveguide problems and can be applied easily to any multiport junction and cascaded blocks. The accuracy of the method is validated by comparison with results for several complex problems found in the literature. CPU times are also included to show the efficiency of the proposed method.
Keywords: segmentation, s parameters, simulation, optimization
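The cascading step can be illustrated for the simplest case, a single-mode two-port: each block's S-matrix is converted to a chain (transfer) matrix, the chain matrices are multiplied, and the product is converted back. This is a generic textbook sketch, not the paper's multimode formulation (which cascades block matrices over many modes per port), and it assumes S21 is nonzero for every block.

```python
import numpy as np

def s_to_t(S):
    """Chain scattering matrix of a two-port, convention
    [a1, b1]^T = T [b2, a2]^T. Assumes S21 != 0."""
    S11, S12 = S[0, 0], S[0, 1]
    S21, S22 = S[1, 0], S[1, 1]
    return np.array([[1 / S21, -S22 / S21],
                     [S11 / S21, S12 - S11 * S22 / S21]])

def t_to_s(T):
    """Inverse conversion, back to the scattering matrix."""
    T11, T12 = T[0, 0], T[0, 1]
    T21, T22 = T[1, 0], T[1, 1]
    return np.array([[T21 / T11, T22 - T21 * T12 / T11],
                     [1 / T11, -T12 / T11]])

def cascade(*s_matrices):
    """S-matrix of two-port blocks connected in cascade."""
    T = np.eye(2, dtype=complex)
    for S in s_matrices:
        T = T @ s_to_t(np.asarray(S, dtype=complex))
    return t_to_s(T)
```

As a sanity check, cascading two matched transmission-line sections of electrical lengths 0.3 and 0.8 rad should give a matched section of total length 1.1 rad.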
Procedia PDF Downloads 526
2416 Prevalence of Gestational Diabetes Mellitus in Western Australia from 2015 until 2020
Authors: Kumaressan Ragunathan, Arisudhan Anantharachagan
Abstract:
Gestational diabetes mellitus (GDM) is the subtype of diabetes whose numbers have been rising most rapidly in Australia; the annual percentage of GDM has increased by more than 50 percent in the last decade. According to Diabetes Australia, more than five hundred thousand women in Australia will be diagnosed with GDM. Globally, the prevalence of GDM ranges from single digits to more than 45%. The prevalence of GDM has increased significantly in the last five years, following the introduction of new diagnostic criteria. We therefore investigated the trend in GDM prevalence in a tertiary maternity unit in Western Australia and compared it to the national prevalence. Data were derived from the STORK Perinatal Database, which is used by maternity services in Western Australia to record information on pregnancy and labour. We selected data from 2015 until 2020, covering 17508 women, of whom 3890 were diagnosed with GDM. In 2015, there were 2213 deliveries, 345 of them complicated by GDM: a prevalence of 15.6%, against an Australian national prevalence of 12%. In 2016, deliveries increased to 2759, with 590 GDM cases: a prevalence of 21.4%, against a national prevalence of 12%. In 2017, deliveries rose to 3049, with 675 GDM cases: a prevalence of 22.1%, against a national prevalence of 13%. In 2018, deliveries reached 3231, with 749 GDM cases: a prevalence of 23.2%, against a national prevalence of 14%. In 2019, there were 3110 deliveries, 712 of them with GDM: a prevalence of 22.9%, against a national prevalence of 14%. In 2020, there were 3146 deliveries, 819 complicated by GDM; prevalence increased to 26%, and no comparison was possible because the national figure had not yet been released. Among the 3890 women with GDM, 2482 (64%) required insulin.
In addition, 1642 (42%) of the GDM group were delivered by Caesarean section, and 2121 (55%) required induction of labour. Overall, we demonstrated an increase in the prevalence of GDM in our unit from 2015 until 2020. Our prevalence is also higher than the national prevalence. This could be driven by rising obesity; in addition, our unit accepts referrals of women with a body mass index (BMI) of more than 40. Further studies are required to examine other risk factors, such as ethnicity, socio-economic status, health literacy and age, which could contribute to this high prevalence.
Keywords: gestational diabetes mellitus, prevalence, Western Australia, Australia
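The yearly prevalence figures quoted above follow directly from the delivery and GDM counts in the abstract; the arithmetic can be reproduced in a few lines:

```python
# Yearly (deliveries, GDM cases) as reported in the abstract.
counts = {
    2015: (2213, 345), 2016: (2759, 590), 2017: (3049, 675),
    2018: (3231, 749), 2019: (3110, 712), 2020: (3146, 819),
}

total_deliveries = sum(d for d, _ in counts.values())
total_gdm = sum(g for _, g in counts.values())

for year, (deliveries, gdm) in counts.items():
    print(f"{year}: prevalence {100 * gdm / deliveries:.1f}%")
print(f"Total: {total_deliveries} deliveries, {total_gdm} with GDM")
```

The totals (17508 deliveries, 3890 GDM cases) and the per-year percentages (15.6% in 2015 through 26.0% in 2020) match the figures reported in the text.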
Procedia PDF Downloads 162
2415 Smoker Recognition from Lung X-Ray Images Using Convolutional Neural Network
Authors: Moumita Chanda, Md. Fazlul Karim Patwary
Abstract:
Smoking is one of the most common recreational drug use behaviors, and it contributes to birth defects, COPD, heart attacks, and erectile dysfunction. To curb these harms, smoking must be identified and treated. Numerous smoking cessation programs have been created, and they demonstrate how beneficial it can be to help someone stop smoking at the right time. A tomography meter is an effective smoking detector. Other wearables, such as RF-based proximity sensors worn on the collar and wrist to detect when the hand is close to the mouth, have been proposed in the past, but they are not impervious to confounding factors. In this study, we build a system that can discriminate between smokers and non-smokers in real time with high sensitivity and specificity by collecting human lung X-ray images and analyzing them with machine learning. Given sufficiently high accuracy, such a system could be used in hospitals, in the selection of candidates for the army or police, or in university admission.
Keywords: CNN, smoker detection, non-smoker detection, OpenCV, artificial intelligence, X-ray image detection
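The CNN building blocks that such a classifier stacks — convolution, non-linearity, pooling — can be sketched in plain NumPy. This is a generic single-stage forward pass on a dummy image, not the paper's architecture; the Sobel-like kernel and the 64x64 input size are illustrative assumptions.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale image with one kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear unit: zero out negative activations."""
    return np.maximum(x, 0)

def max_pool(x, size=2):
    """Non-overlapping max pooling, halving each spatial dimension."""
    h, w = (x.shape[0] // size) * size, (x.shape[1] // size) * size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# One conv -> ReLU -> pool stage on a dummy 64x64 "X-ray".
image = np.random.default_rng(0).random((64, 64))
edge_kernel = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]])  # Sobel-like
features = max_pool(relu(conv2d(image, edge_kernel)))
print(features.shape)
```

A real network learns many such kernels per layer and stacks several stages before a dense classification head; frameworks like TensorFlow or PyTorch would replace these loops in practice.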
Procedia PDF Downloads 83
2414 Comparative Evaluation of Seropositivity and Patterns Distribution Rates of the Anti-Nuclear Antibodies in the Diagnosis of Four Different Autoimmune Collagen Tissue Diseases
Authors: Recep Kesli, Onur Turkyilmaz, Cengiz Demir
Abstract:
Objective: Autoimmune collagen diseases arise from immune reactions against the body's own cells or tissues, causing inflammation and damaging tissues and organs. This study aimed to compare the seropositivity rates and patterns of anti-nuclear antibodies (ANA) in the diagnosis of four different autoimmune collagen tissue diseases: rheumatoid arthritis (RA), systemic lupus erythematosus (SLE), scleroderma (SSc) and Sjogren syndrome (SS). Methods: One hundred eighty-eight patients who presented to different clinics of Afyon Kocatepe University ANS Practice and Research Hospital between 11.07.2014 and 14.07.2015 with suspected collagen diseases such as RA, SLE, SSc and SS were included retrospectively. All data obtained from the participating patients were evaluated according to the inclusion criteria. The patients' historical archives were screened and assessed for ANA positivity. The data were analysed using descriptive statistics, the chi-squared test and Fisher's exact test. Evaluations were performed with SPSS version 20.0, and p < 0.05 was considered significant. Results: The distribution of the 188 patients by diagnosis was as follows: 82 (43.6%) RA, 38 (20.2%) SLE, 22 (11.7%) SSc, and 46 (24.5%) SS. ANA positivity rates by disease were: RA 54 (65.9%), SLE 36 (94.7%), SSc 18 (81.8%), and SS 43 (93.5%). Rheumatoid arthritis should be evaluated and classified separately from the other three autoimmune illnesses investigated: ANA positivity was markedly higher (91.5%) in SLE, SSc and SS combined than in RA (65.9%), and the difference between RA and the other three diseases was statistically significant (p = 0.015).
Conclusions: Systemic autoimmune illnesses show a broad spectrum. ANA positivity was found to be an important predictor in the diagnosis of rheumatologic illnesses, and it should be regarded as a more valuable and sensitive diagnostic marker among the laboratory findings of SLE, SSc and SS than of RA.
Keywords: antinuclear antibody (ANA), rheumatoid arthritis, scleroderma, Sjogren syndrome, systemic lupus erythematosus
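The RA-versus-others comparison can be reproduced from the counts given in the abstract (RA: 54 of 82 positive; SLE 36/38, SSc 18/22 and SS 43/46, i.e. 97 of 106 combined = 91.5%). The sketch below runs a standard chi-squared test of independence on that 2x2 table; the exact p-value it produces may differ from the reported 0.015, since the abstract does not state precisely how its test was set up.

```python
from scipy.stats import chi2_contingency

# ANA-positive / ANA-negative counts from the abstract.
ra = [54, 82 - 54]                                   # RA: 54 pos, 28 neg
others = [36 + 18 + 43,                              # SLE + SSc + SS positive
          (38 - 36) + (22 - 18) + (46 - 43)]         # combined negative

chi2, p, dof, expected = chi2_contingency([ra, others])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
```

Either way, the conclusion stands: ANA positivity in RA is significantly lower than in the other three diseases.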
Procedia PDF Downloads 243
2413 Sleep Tracking AI Application in Smart-Watches
Authors: Sumaiya Amir Khan, Shayma Al-Sharif, Samiha Mazher, Neha Intikhab Khan
Abstract:
This research paper evaluates the effectiveness of sleep-tracking AI applications in smartwatches. It compares the sleep analyses of two smartwatch brands, Samsung and Fitbit, measuring sleep at three stages: REM (rapid eye movement), NREM (non-rapid eye movement), and deep sleep. The methodology involves the participation of different users and the analysis of their sleep data. The results reveal that although light sleep is the longest stage, deep sleep is higher than average in the participants. The study also suggests that light sleep is not uniform, and that higher levels of deep sleep can help prevent debilitating health conditions. Based on the findings, individuals should aim for higher levels of deep sleep to maintain good health. Overall, this research contributes to the growing literature on the effectiveness of sleep-tracking AI applications and their potential to improve sleep quality.
Keywords: sleep tracking, lifestyle, accuracy, health, AI, AI features, ML
Procedia PDF Downloads 79
2412 Molecular Characterization of Ovine Herpesvirus 2 Strains Based on Selected Glycoprotein and Tegument Genes
Authors: Fulufhelo Amanda Doboro, Kgomotso Sebeko, Stephen Njiro, Moritz Van Vuuren
Abstract:
The ovine herpesvirus 2 (OvHV-2) genome, obtained from a lymphoblastoid cell line of a BJ1035 cow, was recently sequenced in the United States of America (USA). However, sequences of OvHV-2 genes from South African bovine strains, or from other African countries, and the molecular characterization of OvHV-2 are not documented. The present investigation provides information on the nucleotide and derived amino acid sequences and the genetic diversity of the Ov 7, Ov 8 ex2, ORF 27 and ORF 73 genes of OvHV-2 strains circulating in South Africa. Gene-specific primers were designed and used for PCR of DNA extracted from 42 bovine blood samples that had previously tested positive for OvHV-2. The expected PCR products of 495 bp, 253 bp, 890 bp and 1632 bp for the Ov 7, Ov 8 ex2, ORF 27 and ORF 73 genes, respectively, were sequenced, and multiple sequence analysis was performed on selected regions of the sequenced products. Two genotypes were identified for the ORF 27 and ORF 73 gene sequences and three for the Ov 7 and Ov 8 ex2 gene sequences, with similar groupings of the derived amino acid sequences for each gene. The sequence variations that defined the genotypes included SNPs, deletions and insertions. Sequence analysis of the Ov 7 and ORF 27 genes revealed variations that distinguished the South African sequences from reference OvHV-2 strains. The effect of geographic origin among the South African sequences was difficult to evaluate because, for each gene, the genotypes were randomly distributed across the provinces. Socio-economic factors, such as the migration of people with their animals or the transportation of animals between provinces for agricultural or business purposes, are the most likely explanation for this observation.
The sequence variations observed in this study have no impact on the antibody-binding activities of the glycoproteins encoded by the Ov 7, Ov 8 ex2 and ORF 27 genes, as determined by predicting B cell epitopes with BepiPred 1.0. The findings of this study will be used to select gene candidates for the development of diagnostic assays and vaccines.
Keywords: amino acid, genetic diversity, genes, nucleotide
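The variant-calling step that distinguishes SNPs from insertions/deletions in aligned sequences can be sketched as follows. The fragments below are toy data for illustration only, not OvHV-2 sequences, and the function assumes the sequences have already been aligned (gaps marked with '-').

```python
def find_snps(reference, query):
    """Report differences between two aligned nucleotide sequences.

    Both sequences are assumed to be pre-aligned and of equal length;
    '-' marks an alignment gap (insertion/deletion), which is reported
    separately from substitutions (SNPs).
    """
    if len(reference) != len(query):
        raise ValueError("sequences must be aligned to the same length")
    snps, indels = [], []
    for pos, (r, q) in enumerate(zip(reference, query), start=1):
        if r == q:
            continue
        if r == "-" or q == "-":
            indels.append(pos)          # gap position (1-based)
        else:
            snps.append((pos, r, q))    # (position, ref base, query base)
    return snps, indels

# Toy aligned fragments (illustrative only):
ref = "ATGGCGTAC-GATTGA"
query = "ATGACGTACCGATCGA"
snps, indels = find_snps(ref, query)
print(snps, indels)
```

In practice, a multiple-alignment tool (e.g. ClustalW or MUSCLE) would produce the alignments, and a comparison of this kind across all 42 isolates yields the genotype groupings.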
Procedia PDF Downloads 488
2411 Wireless Sensor Anomaly Detection Using Soft Computing
Authors: Mouhammd Alkasassbeh, Alaa Lasasmeh
Abstract:
We live in an era of rapid development driven by significant scientific growth, and wireless sensor networks (WSNs), like other technologies, are playing one of the main roles. On top of WSNs, ZigBee adds many features to devices, such as minimal cost and power consumption and increased range and connectivity of sensor nodes. ZigBee technology has come to be used in various fields, including science, engineering and networks, and even in medical and intelligent-building applications. In this work, we generated two main datasets, the first based on a tree topology and the second on a star topology. The datasets were evaluated with three machine learning (ML) algorithms: J48, meta.J48 and multilayer perceptron (MLP). The traffic of each topology was classified into normal and abnormal (attack) classes. The datasets used in our work contained simulated data from Network Simulator 2 (NS2). On each dataset, the meta.J48 classifier achieved the highest accuracy among the classifiers, at 99.7% and 99.2%, respectively.
Keywords: IDS, machine learning, WSN, ZigBee technology
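The normal-versus-attack classification workflow can be sketched with a decision tree, since scikit-learn's CART tree is a close analogue of Weka's J48 (C4.5). The traffic features and their distributions below are invented stand-ins for the NS2-simulated datasets, chosen only to make the example self-contained.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for WSN traffic: hypothetical per-node features
# (packet rate, drop ratio, RSSI); "attack" traffic is simulated with
# inflated packet rates and drop ratios.
rng = np.random.default_rng(1)
normal = np.column_stack([rng.normal(20, 3, 500),       # packets/s
                          rng.normal(0.02, 0.01, 500),  # drop ratio
                          rng.normal(-60, 5, 500)])     # RSSI (dBm)
attack = np.column_stack([rng.normal(80, 10, 500),
                          rng.normal(0.30, 0.05, 500),
                          rng.normal(-60, 5, 500)])
X = np.vstack([normal, attack])
y = np.array([0] * 500 + [1] * 500)  # 0 = normal, 1 = attack

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print(f"accuracy = {clf.score(X_te, y_te):.3f}")
```

On cleanly separable synthetic traffic like this the tree scores near 100%; on real NS2 traces the features overlap more, which is where ensemble wrappers such as meta.J48 help.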
Procedia PDF Downloads 543
2410 Naïve Bayes: A Classical Approach for the Epileptic Seizures Recognition
Authors: Bhaveek Maini, Sanjay Dhanka, Surita Maini
Abstract:
Electroencephalography (EEG) is used worldwide to classify epileptic seizures. Identifying an epileptic seizure by manual EEG analysis is a crucial task for the neurologist, and it takes a great deal of effort and time. The risk of human error in EEG is always high, as acquiring the signals requires manual intervention. Disease diagnosis using machine learning (ML) has been explored continuously since its inception, and where large numbers of records have to be analyzed, ML is a boon for doctors. In this paper, the authors propose two ML models, logistic regression (LR) and Naïve Bayes (NB), to predict epileptic seizures from general parameters. The two techniques are applied to the epileptic seizure recognition dataset available in the UCI ML repository. The algorithms are trained with an 80:20 train-test split (80% for training and 20% for testing), and the performance of each model is validated by 10-fold cross-validation. The proposed study achieved accuracies of 81.87% and 95.49% for LR and NB, respectively.
Keywords: epileptic seizure recognition, logistic regression, Naïve Bayes, machine learning
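The described pipeline — LR and NB, an 80:20 split, and 10-fold cross-validation — can be sketched with scikit-learn. The breast-cancer dataset bundled with scikit-learn is used here purely as a stand-in so the example runs offline; the paper's UCI epileptic seizure recognition dataset must be downloaded separately, and the accuracies below will not match the paper's figures.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in data: any labelled tabular dataset fits this workflow.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "LR": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "NB": make_pipeline(StandardScaler(), GaussianNB()),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)                      # 80:20 hold-out evaluation
    cv = cross_val_score(model, X, y, cv=10)   # 10-fold cross-validation
    print(f"{name}: test acc {model.score(X_te, y_te):.3f}, "
          f"10-fold CV mean {cv.mean():.3f}")
```

Reporting both the hold-out accuracy and the cross-validated mean, as the paper does, guards against a single lucky split.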
Procedia PDF Downloads 58
2409 Cricket Shot Recognition using Conditional Directed Spatial-Temporal Graph Networks
Authors: Tanu Aneja, Harsha Malaviya
Abstract:
Capturing pose information in cricket shots poses several challenges, such as low-resolution videos, noisy data, and joint occlusions caused by the nature of the shots. In response to these challenges, we propose a CondDGConv-based framework specifically for cricket shot prediction. By analyzing the spatial-temporal relationships in batsman shot sequences from an annotated 2D cricket dataset, our model achieves 97% accuracy in predicting shot types. This performance is made possible by conditioning the graph network on batsman 2D poses, allowing precise prediction of shot outcomes from pose dynamics. Our approach highlights the potential for enhancing shot prediction in cricket analytics, offering a robust solution for overcoming pose-related challenges in sports analysis.
Keywords: action recognition, cricket, sports video analytics, computer vision, graph convolutional networks
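The core propagation step such pose-graph models build on can be illustrated with a single generic graph-convolution layer over a skeleton adjacency matrix (the symmetric-normalized rule of Kipf and Welling). This is a plain GCN sketch, not the paper's conditional directed CondDGConv layer; the toy skeleton, coordinates, and weights are all assumptions for illustration.

```python
import numpy as np

def graph_conv(X, A, W):
    """One generic graph-convolution step: ReLU(D^-1/2 (A+I) D^-1/2 X W).

    X: (nodes, in_features) joint coordinates, A: (nodes, nodes) skeleton
    adjacency, W: (in_features, out_features) learned weights.
    """
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))         # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0)

# Toy 5-joint "arm" chain: shoulder-elbow-wrist-hand-bat tip.
A = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    A[i, j] = A[j, i] = 1
X = np.random.default_rng(0).random((5, 2))   # 2D pose coordinates
W = np.random.default_rng(1).random((2, 8))   # hypothetical layer weights
H = graph_conv(X, A, W)
print(H.shape)
```

A spatial-temporal model stacks layers like this across both the skeleton graph and the frame sequence, with the conditioning mechanism modulating the weights per pose.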
Procedia PDF Downloads 16
2408 Static vs. Stream Mining Trajectories Similarity Measures
Authors: Musaab Riyadh, Norwati Mustapha, Dina Riyadh
Abstract:
Trajectory similarity can be defined as the cost of transforming one trajectory into another under a given similarity method. It is at the core of numerous mining tasks such as clustering, classification, and indexing. Various approaches have been suggested to measure similarity based on the geometric and dynamic properties of trajectories, the overlap between trajectory segments, and the area confined between entire trajectories. In this article, these approaches are evaluated in terms of computational cost, memory usage, accuracy, and the amount of data needed in advance, to determine their suitability for stream mining applications. The evaluation results show that stream mining applications favour similarity methods with low computational cost and memory usage, a single scan over the data, and no mathematical complexity, because of the high speed at which stream data are generated.
Keywords: global distance measure, local distance measure, semantic trajectory, spatial dimension, stream data mining
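One classical example of a "transformation cost" measure is dynamic time warping (DTW); the sketch below is a textbook implementation, offered only to make the trade-off concrete — its quadratic time and memory and its need for the full trajectory up front are exactly the properties that disqualify such measures for single-scan stream mining. The toy trajectories are assumptions for illustration.

```python
import math

def dtw(traj_a, traj_b):
    """Dynamic time warping distance between two 2D trajectories.

    The minimum cumulative point-to-point distance over all monotone
    alignments of the two trajectories. O(len(a) * len(b)) time and
    memory, and it needs both trajectories in full before computing.
    """
    n, m = len(traj_a), len(traj_b)
    inf = float("inf")
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(traj_a[i - 1], traj_b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# Two toy trajectories of different lengths (DTW handles the mismatch).
a = [(0, 0), (1, 0), (2, 0)]
b = [(0, 1), (1, 1), (1, 1), (2, 1)]
print(dtw(a, b))
```

Stream-friendly alternatives trade this exactness for incremental, bounded-memory updates, which is the distinction the evaluation in this article draws.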
Procedia PDF Downloads 392