Search results for: image dataset
1367 Real-Time Course Recommendation System for Online Learning Platforms
Authors: Benabbess Anja
Abstract:
This research presents the design and implementation of a real-time course recommendation system for online learning platforms, leveraging user competencies and expertise levels. The system begins by extracting and classifying the complexity levels of courses from Udemy datasets using semantic enrichment techniques and resources such as WordNet and BERT. A predictive model assigns complexity levels to each course, adding columns that represent the course category, sub-category, and complexity level to the existing dataset. Simultaneously, user profiles are constructed through questionnaires capturing their skills, sub-skills, and proficiency levels. The recommendation process involves generating embeddings with BERT, followed by calculating cosine similarity between user profiles and courses. Courses are ranked based on their relevance, with the BERT model delivering the most accurate results. To enable real-time recommendations, Apache Kafka is integrated to track user interactions (clicks, comments, time spent, completed courses, feedback) and update user profiles. The embeddings are regenerated, and similarities with courses are recalculated to reflect users' evolving needs and behaviors, incorporating a progressive weighting of interactions for more personalized suggestions. This approach ensures dynamic and real-time course recommendations tailored to user progress and engagement, providing a more personalized and effective learning experience. This system aims to improve user engagement and optimize learning paths by offering courses that precisely match users' needs and current skill levels.Keywords: recommendation system, online learning, real-time, user skills, expertise level, personalized recommendations, dynamic suggestions
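A minimal sketch of the similarity step described above — encoding a user profile and course descriptions and ranking courses by cosine similarity — assuming a sentence-transformers BERT-style encoder as a stand-in for the authors' BERT pipeline (the model name, profile text and course titles are illustrative only):

```python
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed stand-in for the BERT encoder

user_profile = "intermediate Python, beginner machine learning, interested in data visualisation"
courses = [
    "Python for Data Science - beginner",
    "Advanced Deep Learning with PyTorch - expert",
    "Data Visualisation with Matplotlib - intermediate",
]

# Embed the user profile and the courses, then rank courses by cosine similarity.
user_vec = model.encode([user_profile])
course_vecs = model.encode(courses)
scores = cosine_similarity(user_vec, course_vecs)[0]

for title, score in sorted(zip(courses, scores), key=lambda t: t[1], reverse=True):
    print(f"{score:.3f}  {title}")
```

In the real-time setting described in the abstract, the user profile text would be rebuilt from Kafka-tracked interactions before re-encoding and re-ranking.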
1366 Non-Destructing Testing of Sandstones from Unconventional Reservoir in Poland with Use of Ultrasonic Pulse Velocity Technique and X-Ray Computed Microtomography
Authors: Michał Maksimczuk, Łukasz Kaczmarek, Tomasz Wejrzanowski
Abstract:
This study concerns high-resolution X-ray computed microtomography (µCT) and ultrasonic pulse analysis of Cambrian sandstones from a borehole located on the Baltic Sea coast of northern Poland. µCT and the ultrasonic technique are non-destructive methods commonly used to determine the internal structure of reservoir rock samples. The spatial resolution of the µCT images obtained was 27 µm, which enabled the authors to create accurate 3-D visualizations of the structure geometry and to calculate the ratio of pore volume to total sample volume. A copper X-ray source filter was used to reduce image artifacts. Furthermore, the samples' Young's modulus and Poisson's ratio were obtained using the ultrasonic pulse technique. Together, µCT and the ultrasonic pulse technique provide comprehensive information which can be used for the exploration and characterization of reservoir rocks.
Keywords: elastic parameters, linear absorption coefficient, northern Poland, tight gas
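For illustration, the dynamic elastic constants can be computed from measured P- and S-wave velocities, and porosity from a segmented µCT volume, as in the hedged sketch below (the density, velocities and the random volume are placeholder values, not the paper's data):

```python
import numpy as np

rho = 2650.0              # bulk density [kg/m^3], assumed
vp, vs = 4200.0, 2500.0   # P- and S-wave velocities [m/s], assumed

# Dynamic elastic constants from ultrasonic pulse velocities
nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))              # Poisson's ratio
E = rho * vs**2 * (3 * vp**2 - 4 * vs**2) / (vp**2 - vs**2)   # Young's modulus [Pa]

# Porosity as the ratio of pore voxels to all voxels in a binarised µCT volume
volume = np.random.rand(100, 100, 100) < 0.08   # stand-in for a segmented scan (True = pore)
porosity = volume.sum() / volume.size

print(f"nu = {nu:.3f}, E = {E / 1e9:.1f} GPa, porosity = {porosity:.2%}")
```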
1365 Performance Comparison of Deep Convolutional Neural Networks for Binary Classification of Fine-Grained Leaf Images
Authors: Kamal KC, Zhendong Yin, Dasen Li, Zhilu Wu
Abstract:
Intra-plant disease classification based on leaf images is a challenging computer vision task due to similarities in the texture, color, and shape of leaves with only slight variation in leaf spots, and external environmental changes such as lighting and background noise. The deep convolutional neural network (DCNN) has proven to be an effective tool for binary classification. In this paper, two methods for binary classification of diseased plant leaves using DCNNs are presented: models created from scratch and transfer learning. Our main contribution is a thorough evaluation of 4 networks created from scratch and transfer learning of 5 pre-trained models. Training and testing of these models were performed on a plant leaf image dataset belonging to 16 distinct classes, containing a total of 22,265 images from 8 different plants, consisting of pairs of healthy and diseased leaves. We introduce a deep CNN model, Optimized MobileNet. This model, with the depthwise separable convolution as its building block, attained an average test accuracy of 99.77%. We also present a fine-tuning method by introducing the concept of a convolutional block, which is a collection of different deep neural layers. Fine-tuned models proved to be efficient in terms of accuracy and computational cost. Fine-tuned MobileNet achieved an average test accuracy of 99.89% on 8 pairs of [healthy, diseased] leaf image sets.
Keywords: deep convolution neural network, depthwise separable convolution, fine-grained classification, MobileNet, plant disease, transfer learning
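A minimal PyTorch sketch of the depthwise separable convolution block that MobileNet-style models are built from (channel sizes are illustrative; this is not the authors' Optimized MobileNet):

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise 3x3 convolution per channel followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride=stride, padding=1,
                                   groups=in_ch, bias=False)   # one filter per input channel
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)  # mixes channels
        self.bn1, self.bn2 = nn.BatchNorm2d(in_ch), nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.relu(self.bn1(self.depthwise(x)))
        return self.relu(self.bn2(self.pointwise(x)))

block = DepthwiseSeparableConv(32, 64)
print(block(torch.randn(1, 32, 128, 128)).shape)  # -> torch.Size([1, 64, 128, 128])
```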
1364 Comparison of the Chest X-Ray and Computerized Tomography Scans Requested from the Emergency Department
Authors: Sahabettin Mete, Abdullah C. Hocagil, Hilal Hocagil, Volkan Ulker, Hasan C. Taskin
Abstract:
Objectives and Goals: An emergency department is a place where people can come for a multitude of reasons 24 hours a day, as it is an easily accessible place thanks to the self-sacrificing people who work there. However, the workload and overcrowding of emergency departments are increasing day by day. Under these circumstances, it is important to choose a quick, easily accessible and effective test for diagnosis; laboratory and imaging tests account for more than 40% of all emergency department costs. Despite all of the technological advances in imaging methods and the availability of computerized tomography (CT), the chest X-ray, the older imaging method, has not lost its appeal and effectiveness for nearly all emergency physicians. Progress in imaging methods is very convenient, but physicians should consider the radiation dose, cost, and effectiveness, and imaging methods should be carefully selected and used. The aim of the study was to investigate the effectiveness of the chest X-ray in immediate diagnosis against the advancing technology by comparing chest X-ray and chest CT scan results of patients in the emergency department. Methods: Patients who presented to the emergency department of Bulent Ecevit University Faculty of Medicine between 1 September 2014 and 28 February 2015 were investigated retrospectively. Data were obtained via MIAMED (Clear Canvas Image Server v6.2, Toronto, Canada), the information management system in which patients' files are saved electronically in the clinic, and were retrospectively scanned. The study included 199 patients who were 18 or older and had both chest X-ray and chest CT imaging. Chest X-ray images were evaluated by the emergency medicine senior assistant in the emergency department, and the findings were saved to the study form. CT findings were obtained from data already reported by the radiology department in the clinic. The chest X-ray was evaluated with seven questions in terms of technique and dose adequacy. Patients' age, gender, presenting complaints, comorbid diseases, vital signs, physical examination findings, diagnosis, chest X-ray findings and chest CT findings were evaluated. Data were saved and statistical analyses were performed using SPSS 19.0 for Windows, and a value of p < 0.05 was accepted as statistically significant. Results: 199 patients were included in the study. Pneumonia was the most common diagnosis, found in 38.2% (n=76) of all patients. The chest X-ray imaging technique was appropriate in 31% (n=62) of all patients. There was no statistically significant difference (p > 0.05) between the two imaging methods (chest X-ray and chest CT) in determining the rates of displacement of the trachea, pneumothorax, parenchymal consolidation, increased cardiothoracic ratio, lymphadenopathy, diaphragmatic hernia, free air levels in the abdomen (in sections included in the image), pleural thickening, parenchymal cyst, parenchymal mass, parenchymal cavity, parenchymal atelectasis and bone fractures. Conclusions: When the imaging findings of cases that needed to be diagnosed quickly were investigated, chest X-ray and chest CT findings matched at a high rate in patients imaged with an appropriate technique. However, chest X-rays evaluated in the emergency department were frequently taken with an inappropriate technique.
Keywords: chest x-ray, chest computerized tomography, chest imaging, emergency department
1363 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images
Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge
Abstract:
Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored and transferred. Dimensionality reduction techniques can be used to reduce the volume of data. In this paper, an approach to band selection based on clustering algorithms is presented, which makes it possible to reduce the volume of data. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. Attributes that are new in relation to other studies in the literature, such as kurtosis and low correlation, are also considered. A comparison of the results of the approach using Fuzzy C-Means and K-Means with different attributes is performed. Both algorithms show similarly good results, particularly when the variance and kurtosis attributes are used in the clustering process, and are therefore applicable to hyperspectral images.
Keywords: band selection, fuzzy c-means, k-means, hyperspectral image
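A hedged sketch of the band-selection idea — clustering bands by variance, kurtosis and mean correlation and keeping one representative band per cluster — using scikit-learn's K-Means as a stand-in (the NWHFC step and the Fuzzy C-Means variant are omitted, and the data cube is synthetic):

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.cluster import KMeans

cube = np.random.rand(64, 64, 120)          # stand-in hyperspectral cube (rows, cols, bands)
bands = cube.reshape(-1, cube.shape[2]).T   # one row per spectral band

corr = np.corrcoef(bands)
features = np.column_stack([
    bands.var(axis=1),               # variance of each band
    kurtosis(bands, axis=1),         # kurtosis of each band
    np.abs(corr).mean(axis=1),       # mean absolute correlation with the other bands
])

n_selected = 15
km = KMeans(n_clusters=n_selected, n_init=10, random_state=0).fit(features)

# Keep the least-correlated band from each cluster as its representative.
selected = []
for c in range(n_selected):
    members = np.where(km.labels_ == c)[0]
    selected.append(int(members[np.argmin(features[members, 2])]))
print(sorted(selected))
```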
1362 A Study on Websites of Public and Private Hospitals in Konya
Authors: H. Nur Görkemli, Mehmet Fidan
Abstract:
After the first acquaintance with the internet in April 1993, the number of internet users increased rapidly in Turkey. According to the Turkish Statistical Institute's 2013 data, internet usage in Turkey in the 16-74 age group is 48.9%. Hospitals, like many other businesses, are one of the areas where the internet is being used intensively. As part of public relations practice, websites are important tools for hospitals to reach a wide range of target audiences within and outside the organization. With their websites, hospitals have opportunities to give information about their organization, strengthen their image, compete with their rivals, interact with shareholders, reflect their transparency and meet new audiences. This study examines the websites of a total of 31 hospitals located in Konya. The institutions are categorized as public and private hospitals, and three main research categories are determined: content, visual and technical. Main and sub-categories are examined using the content analysis method. Results are interpreted in terms of public and private institutions.
Keywords: websites, hospital, health communication, internet, webpages
1361 A Thorough Analysis of the Literature on the Airport Service Quality and Patron Satisfaction
Authors: Mohammed Saad Alanazi
Abstract:
Satisfaction of travelers with the services provided in airports is a sign of the competitiveness and corporate image of the airport. This study conducted a systematic literature review of recent studies published after 2017 regarding the factors that positively influence travelers' satisfaction and encourage them to post positive reviews online. Considerable variation was found among the reviewed studies: they used several research methodologies and datasets and focused on different airports, yet they commonly categorized airport services into seven categories that should receive high attention, because their quality was found to increase review rates and positivity. It was found that studies targeting travelers' satisfaction and intention to revisit tended to use primary sources of data (surveys), while studies concerned with the positivity and negativity of comments towards airport services often used online reviews provided by travelers.
Keywords: business intelligence, airport service quality, passenger satisfaction, thorough analysis
1360 Medical Diagnosis of Retinal Diseases Using Artificial Intelligence Deep Learning Models
Authors: Ethan James
Abstract:
Over one billion people worldwide suffer from some level of vision loss or blindness as a result of progressive retinal diseases. Many patients, particularly in developing areas, are misdiagnosed or not diagnosed at all due to unconventional diagnostic tools and screening methods. Artificial intelligence (AI) based on deep learning (DL) convolutional neural networks (CNN) has recently gained high interest in ophthalmology for computer-based imaging diagnosis, disease prognosis, and risk assessment. Optical coherence tomography (OCT) is a popular imaging technique used to capture high-resolution cross-sections of retinas. In ophthalmology, DL has been applied to fundus photographs, optical coherence tomography, and visual fields, achieving robust classification performance in the detection of various retinal diseases including macular degeneration, diabetic retinopathy, and retinitis pigmentosa. However, there is no complete diagnostic model to analyze these retinal images that provides a diagnostic accuracy above 90%. Thus, the purpose of this project was to develop an AI model that utilizes machine learning techniques to automatically diagnose specific retinal diseases from OCT scans. The algorithm consists of a residual neural network architecture with cyclic pooling, trained on a dataset of over 20,000 real-world OCT images to obtain a robust model. This DL model can ultimately aid ophthalmologists in diagnosing patients with these retinal diseases more quickly and more accurately, therefore facilitating earlier treatment, which results in improved post-treatment outcomes.
Keywords: artificial intelligence, deep learning, imaging, medical devices, ophthalmic devices, ophthalmology, retina
1359 Preoperative Weight Management Education and Its Influence on Bariatric Surgery Patient Weights
Authors: Meghana Pandit, Abhishek Chakraborty
Abstract:
There are a multitude of factors that influence the clinical success of bariatric surgery. This study seeks to determine the efficacy of preoperative weight management education. The Food and Fitness Program at Mount Sinai serves to educate patients on topics such as stress management, sleep habits, body image, nutrition, and exercise 5-6 months before their surgeries to slowly decrease their weight. Each month, patients are weighed, and a different topic is presented. To evaluate the longitudinal effects of these lectures, patients' weights are recorded at the first appointment, before an informative lecture is presented. Weights are then reevaluated at the last appointment before the surgery. The weights were statistically analyzed using a paired t-test, and the results demonstrated a statistically significant difference (p < .0001, n=55). Thus, it is reasonable to conclude that the education paradigm employed successfully empowered patients to maintain and reduce their gross BMI before clinical intervention.
Keywords: bariatric, surgery, weight, education
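The pre/post comparison described above can be reproduced in outline with a paired t-test; the sketch below uses hypothetical weights, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical BMI values for the same patients at the first and last preoperative visit.
first_visit = np.array([46.1, 42.3, 51.0, 39.8, 44.5, 48.2])
last_visit = np.array([44.0, 41.1, 48.9, 39.0, 43.2, 46.5])

t_stat, p_value = stats.ttest_rel(first_visit, last_visit)  # paired t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```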
1358 Jordan Curves in the Digital Plane with Respect to the Connectednesses given by Certain Adjacency Graphs
Authors: Josef Slapal
Abstract:
Digital images are approximations of real ones and, therefore, to be able to study them, we need the digital plane Z2 to be equipped with a convenient structure that behaves analogously to the Euclidean topology on the real plane. In particular, it is required that such a structure allows for a digital analogue of the Jordan curve theorem. We introduce certain adjacency graphs on the digital plane and prove a digital Jordan curve theorem for them, thus showing that the graphs provide convenient structures on Z2 for the study and processing of digital images. Further convenient structures, including the well-known Khalimsky and Marcus-Wyse adjacency graphs, may be obtained as quotients of the graphs introduced. Since digital Jordan curves represent borders of objects in digital images, the adjacency graphs discussed may be used as background structures on the digital plane for solving those problems of digital image processing that are closely related to borders, such as border detection, contour filling, pattern recognition, thinning, etc.
Keywords: digital plane, adjacency graph, Jordan curve, quotient adjacency
1357 Bipolar Impulse Noise Removal and Edge Preservation in Color Images and Video Using Improved Kuwahara Filter
Authors: Reji Thankachan, Varsha PS
Abstract:
Both image capturing devices and human visual systems are nonlinear. Hence, nonlinear filtering methods outperform their linear counterparts in many applications. Linear methods are unable to remove impulsive noise from images while preserving their edges and fine details. In addition, linear algorithms are unable to remove signal-dependent or multiplicative noise in images. This paper presents an approach to denoise and smoothen images and videos corrupted by bipolar impulse noise using an improved Kuwahara filter. It involves a two-stage algorithm consisting of noise detection followed by filtering. Numerous simulations demonstrate that the proposed method outperforms the existing method by eliminating the painting-like flattening effect along the local feature direction while preserving edges, with improvements in PSNR and MSE.
Keywords: bipolar impulse noise, Kuwahara, PSNR, MSE, PDF
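For reference, a minimal implementation of the classic Kuwahara filter on a grayscale image (the paper's improved variant adds a separate bipolar impulse noise detection stage, which is not reproduced here):

```python
import numpy as np

def kuwahara(img, radius=2):
    """Classic Kuwahara filter: for every pixel, split the (2r+1)x(2r+1) window into
    four overlapping (r+1)x(r+1) quadrants and output the mean of the quadrant with
    the smallest variance, which smooths noise while preserving edges."""
    img = np.asarray(img, dtype=np.float64)
    r = radius
    padded = np.pad(img, r, mode="reflect")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = padded[i:i + 2 * r + 1, j:j + 2 * r + 1]
            quadrants = [w[:r + 1, :r + 1], w[:r + 1, r:], w[r:, :r + 1], w[r:, r:]]
            variances = [q.var() for q in quadrants]
            out[i, j] = quadrants[int(np.argmin(variances))].mean()
    return out

noisy = np.random.rand(64, 64)
print(kuwahara(noisy).shape)  # (64, 64)
```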
1356 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record
Authors: Raghavi C. Janaswamy
Abstract:
In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data, the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged due to its close representation of real-world data and its volume. The graph model is built from the EHR heterogeneous data using Python modules, namely pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually performing the node classification using those embeddings. The model predicts patient conditions ranging from common to rare situations. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
Keywords: electronic health record, graph neural network, heterogeneous data, prediction
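A minimal PyTorch Geometric sketch of node classification on a toy patient graph; it uses a plain supervised GCN rather than the paper's autoencoder-based self-supervised embeddings, and the graph is synthetic rather than drawn from TigerGraph/Synthea:

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy graph: 6 patient nodes with 4 features each, binary condition label.
x = torch.randn(6, 4)
edge_index = torch.tensor([[0, 1, 1, 2, 3, 4, 4, 5],
                           [1, 0, 2, 1, 4, 3, 5, 4]], dtype=torch.long)
y = torch.tensor([0, 0, 0, 1, 1, 1])
train_mask = torch.tensor([True, True, False, True, True, False])
data = Data(x=x, edge_index=edge_index, y=y, train_mask=train_mask)

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(4, 16)
        self.conv2 = GCNConv(16, 2)

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(100):
    optimizer.zero_grad()
    out = model(data)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()

print(out.argmax(dim=1))  # predicted condition per patient node
```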
1355 Using Priority Order of Basic Features for Circumscribed Masses Detection in Mammograms
Authors: Minh Dong Le, Viet Dung Nguyen, Do Huu Viet, Nguyen Huu Tu
Abstract:
In this paper, we present a new method for circumscribed mass detection in mammograms. Our method is evaluated on 23 mammographic images of circumscribed masses and 20 normal mammograms from the public Mini-MIAS database. The method is quite promising, with a sensitivity (SE) of 95% at only about 1 false positive per image (FPpI). To achieve the above results we carry out the following procedure: first, the input images are preprocessed with the aim of enhancing the key information of circumscribed masses; next, we calculate and statistically evaluate the basic features of abnormal regions in the training database; then, the mammograms in the testing database are divided into equal blocks for which the corresponding features are calculated; finally, the priority order of the basic features is used to classify blocks as abnormal or normal regions.
Keywords: mammograms, circumscribed masses, evaluated statistically, priority order of basic features
1354 Smoker Recognition from Lung X-Ray Images Using Convolutional Neural Network
Authors: Moumita Chanda, Md. Fazlul Karim Patwary
Abstract:
Smoking is one of the most popular recreational drug use behaviors, and it contributes to birth defects, COPD, heart attacks, and erectile dysfunction. To completely eradicate this problem, it is imperative that it be identified and treated. Numerous smoking cessation programs have been created, and they demonstrate how beneficial it may be to help someone stop smoking at the ideal time. A tomography-based meter is an effective smoking detector. Other wearables, such as RF-based proximity sensors worn on the collar and wrist to detect when the hand is close to the mouth, have been proposed in the past, but they are not impervious to deceptive variables. In this study, we build a system that can discriminate between smokers and non-smokers in real time with high sensitivity and specificity by acquiring lung X-ray images and analyzing the X-ray data using machine learning. If it achieves sufficiently high accuracy, this system could be utilized in hospitals, in the selection of candidates for the army or police, or in university admissions.
Keywords: CNN, smoker detection, non-smoker detection, OpenCV, artificial intelligence, X-ray image detection
1353 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study
Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos
Abstract:
This article presents a method to analyze the use of indoor spaces based on data analytics obtained from in-built digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data over the internet, which the research uses to analyze the real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution to analyze building interior spaces without incorporating external data collection systems such as sensors. The methodology is applied to a real coliving case study: a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the different devices' platforms; the first step is to curate the data and understand what insights each device can provide according to the objectives of the study. This generates an analysis framework that can be scaled up for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available from the IoT network of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
Keywords: in-place devices, IoT, human-centred data-analytics, spatial design
1352 A Construct to Perform in Situ Deformation Measurement of Material Extrusion-Fabricated Structures
Authors: Daniel Nelson, Valeria La Saponara
Abstract:
Material extrusion is an additive manufacturing modality that continues to show great promise in the ability to create low-cost, highly intricate, and exceedingly useful structural elements. As more capable and versatile filament materials are devised, and the resolution of manufacturing systems continues to increase, the need to understand and predict manufacturing-induced warping will gain ever greater importance. The following study presents an in situ remote sensing and data analysis construct that allows for the in situ mapping and quantification of surface displacements induced by residual stresses on a specified test structure. This proof-of-concept experimental process shows that it is possible to provide designers and manufacturers with insight into the manufacturing parameters that lead to the manifestation of these deformations and a greater understanding of the behavior of these warping events over the course of the manufacturing process.Keywords: additive manufacturing, deformation, digital image correlation, fused filament fabrication, residual stress, warping
1351 Predictive Modeling of Bridge Conditions Using Random Forest
Authors: Miral Selim, May Haggag, Ibrahim Abotaleb
Abstract:
The aging of transportation infrastructure presents significant challenges, particularly concerning the monitoring and maintenance of bridges. This study investigates the application of Random Forest algorithms for predictive modeling of bridge conditions, utilizing data from the US National Bridge Inventory (NBI). The research is significant as it aims to improve bridge management through data-driven insights that can enhance maintenance strategies and contribute to overall safety. Random Forest is chosen for its robustness, ability to handle complex, non-linear relationships among variables, and its effectiveness in feature importance evaluation. The study begins with comprehensive data collection and cleaning, followed by the identification of key variables influencing bridge condition ratings, including age, construction materials, environmental factors, and maintenance history. Random Forest is utilized to examine the relationships between these variables and the predicted bridge conditions. The dataset is divided into training and testing subsets to evaluate the model's performance. The findings demonstrate that the Random Forest model effectively enhances the understanding of factors affecting bridge conditions. By identifying bridges at greater risk of deterioration, the model facilitates proactive maintenance strategies, which can help avoid costly repairs and minimize service disruptions. Additionally, this research underscores the value of data-driven decision-making, enabling better resource allocation to prioritize maintenance efforts where they are most necessary. In summary, this study highlights the efficiency and applicability of Random Forest in predictive modeling for bridge management. Ultimately, these findings pave the way for more resilient and proactive management of bridge systems, ensuring their longevity and reliability for future use.Keywords: data analysis, random forest, predictive modeling, bridge management
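A hedged scikit-learn sketch of the workflow described above; the file name and NBI-style column names are illustrative assumptions, not the study's actual feature set:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical NBI-style extract; column names are illustrative only.
df = pd.read_csv("nbi_bridges.csv")
features = ["age_years", "material_code", "avg_daily_traffic",
            "climate_zone", "years_since_maintenance"]
X = pd.get_dummies(df[features], columns=["material_code", "climate_zone"])
y = df["condition_rating"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
model = RandomForestClassifier(n_estimators=300, random_state=42)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
# Feature importance evaluation, one of the reasons Random Forest was chosen.
print(pd.Series(model.feature_importances_, index=X.columns)
        .sort_values(ascending=False).head(10))
```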
1350 Adjusting Electricity Demand Data to Account for the Impact of Loadshedding in Forecasting Models
Authors: Migael van Zyl, Stefanie Visser, Awelani Phaswana
Abstract:
The electricity landscape in South Africa is characterized by frequent occurrences of loadshedding, a measure implemented by Eskom to manage electricity generation shortages by curtailing demand. Loadshedding, classified into stages ranging from 1 to 8 based on severity, involves the systematic rotation of power cuts across municipalities according to predefined schedules. However, this practice introduces distortions in recorded electricity demand, posing challenges to accurate forecasting essential for budgeting, network planning, and generation scheduling. Addressing this challenge requires the development of a methodology to quantify the impact of loadshedding and integrate it back into metered electricity demand data. Fortunately, comprehensive records of loadshedding impacts are maintained in a database, enabling the alignment of Loadshedding effects with hourly demand data. This adjustment ensures that forecasts accurately reflect true demand patterns, independent of loadshedding's influence, thereby enhancing the reliability of electricity supply management in South Africa. This paper presents a methodology for determining the hourly impact of load scheduling and subsequently adjusting historical demand data to account for it. Furthermore, two forecasting models are developed: one utilizing the original dataset and the other using the adjusted data. A comparative analysis is conducted to evaluate forecast accuracy improvements resulting from the adjustment process. By implementing this methodology, stakeholders can make more informed decisions regarding electricity infrastructure investments, resource allocation, and operational planning, contributing to the overall stability and efficiency of South Africa's electricity supply system.Keywords: electricity demand forecasting, load shedding, demand side management, data science
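A minimal pandas sketch of the adjustment idea — adding the estimated curtailed load back onto metered demand and comparing forecast error on the original and adjusted series; file names, column names and the naive weekly forecast are illustrative assumptions only:

```python
import pandas as pd

# Hypothetical inputs: metered hourly demand and a loadshedding log with the
# estimated MW curtailed per hour (column names are illustrative).
demand = pd.read_csv("hourly_demand.csv", parse_dates=["timestamp"])       # timestamp, metered_mw
shedding = pd.read_csv("loadshedding_log.csv", parse_dates=["timestamp"])  # timestamp, curtailed_mw

adjusted = demand.merge(shedding, on="timestamp", how="left").fillna({"curtailed_mw": 0.0})
adjusted["true_demand_mw"] = adjusted["metered_mw"] + adjusted["curtailed_mw"]

# Naive same-hour-last-week forecast as a stand-in for the two models compared in the paper.
for col in ["metered_mw", "true_demand_mw"]:
    forecast = adjusted[col].shift(168)  # 168 hours = one week
    mape = ((adjusted[col] - forecast).abs() / adjusted[col]).dropna().mean() * 100
    print(f"{col}: MAPE of naive weekly forecast = {mape:.1f}%")
```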
1349 Mean Shift-Based Preprocessing Methodology for Improved 3D Buildings Reconstruction
Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour
Abstract:
In this work we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data, acquired from airborne scanners, from densely built urban areas. On one hand, high resolution image data corrupted by noise caused by lossy compression techniques are appropriately smoothed while at the same time preserving the optical edges and, on the other, low resolution LiDAR data in the form of normalized Digital Surface Map (nDSM) is upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology for 3D reconstruction of buildings of a pilot region of Athens, Greece results in a significant visual improvement of the 3D building block model.Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift
1348 Recognition of Grocery Products in Images Captured by Cellular Phones
Authors: Farshideh Einsele, Hassan Foroosh
Abstract:
In this paper, we present a robust algorithm to recognize extracted text from grocery product images captured by mobile phone cameras. Recognition of such text is challenging since text in grocery product images varies in size, orientation, style, and illumination, and can suffer from perspective distortion. Pre-processing is performed to make the characters scale- and rotation-invariant. Since text degradations cannot be appropriately defined using well-known geometric transformations such as translation, rotation, affine transformation and shearing, we use all of the character's black pixels as our feature vector. Classification is performed with a minimum distance classifier using the maximum likelihood criterion, which delivers a very promising Character Recognition Rate (CRR) of 89%. We achieve a considerably higher Word Recognition Rate (WRR) of 99% when using lower-level linguistic knowledge about product words during the recognition process.
Keywords: camera-based OCR, feature extraction, document image processing, grocery products
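A toy sketch of the nearest-class-mean (minimum distance) classification step on flattened character pixel vectors; the 3x3 "characters" are illustrative, and the maximum likelihood weighting used by the authors is not reproduced:

```python
import numpy as np

def train_class_means(samples, labels):
    """samples: (n, d) array of flattened binary character images."""
    labels = np.array(labels)
    return {c: samples[labels == c].mean(axis=0) for c in set(labels)}

def classify(means, sample):
    # Minimum Euclidean distance to the class mean vectors.
    return min(means, key=lambda c: np.linalg.norm(sample - means[c]))

# Toy 3x3 "characters": class 'I' (vertical bar) vs class '-' (horizontal bar).
X = np.array([[0, 1, 0, 0, 1, 0, 0, 1, 0],
              [0, 1, 0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 0, 0, 0],
              [0, 0, 0, 1, 1, 1, 1, 0, 0]], dtype=float)
y = ["I", "I", "-", "-"]

means = train_class_means(X, y)
print(classify(means, np.array([0, 1, 0, 0, 1, 0, 1, 1, 0], dtype=float)))  # -> 'I'
```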
1347 A Comparative Analysis of (De)legitimation Strategies in Selected African Inaugural Speeches
Authors: Lily Chimuanya, Ehioghae Esther
Abstract:
Language, a versatile and sophisticated tool, is fundamentally sacrosanct to mankind especially within the realm of politics. In this dynamic world, political leaders adroitly use language to engage in a strategic show aimed at manipulating or mechanising the opinion of discerning people. This nuanced synergy is marked by different rhetorical strategies, meticulously synced with contextual factors ranging from cultural, ideological, and political to achieve multifaceted persuasive objectives. This study investigates the (de)legitimation strategies inherent in African presidential inaugural speeches, as African leaders not only state their policy agenda through inaugural speeches but also subtly indulge in a dance of legitimation and delegitimation, performing a twofold objective of strengthening the credibility of their administration and, at times, undermining the performance of the past administration. Drawing insights from two different legitimation models and a dataset of 4 African presidential inaugural speeches obtained from authentic websites, the study describes the roles of authorisation, rationalisation, moral evaluation, altruism, and mythopoesis in unmasking the structure of political discourse. The analysis takes a mixed-method approach to unpack the (de)legitimation strategy embedded in the carefully chosen speeches. The focus extends beyond a superficial exploration and delves into the linguistic elements that form the basis of presidential discourse. In conclusion, this examination goes beyond the nuanced landscape of language as a potent tool in politics, with each strategy contributing to the overall rhetorical impact and shaping the narrative. From this perspective, the study argues that presidential inaugural speeches are not only linguistic exercises but also viable weapons that influence perceptions and legitimise authority.Keywords: CDA, legitimation, inaugural speeches, delegitmation
1346 An Investigation of the University Council’s Image: A Case of Suan Sunandha Rajabhat University
Authors: Phitsanu Phunphetchphan
Abstract:
The purpose of this research was to investigate the opinions of Rajabhat University staff towards the performance of the university council committee, focusing on (1) the personal characteristics of the committee members; (2) the duties designated by the university council; and (3) the relationship between the university council and staff. The population of this study included all high-level management staff of Suan Sunandha Rajabhat University, making a total of 200 respondents. Data analysis included frequency, percentage, mean and standard deviation. The findings revealed that the majority of staff rated the performance of the university council at a high level. The 'overall appropriate qualification of the university council' received the highest score, while 'good governance' received the lowest mean score. Moreover, the findings also revealed that the relationship between the university council's members and the staff was rated at a high level, while 'the integrity of policy implementation' received the lowest score.
Keywords: investigation, performance, university council, management
1345 Imaging Based On Bi-Static SAR Using GPS L5 Signal
Authors: Tahir Saleem, Mohammad Usman, Nadeem Khan
Abstract:
GPS signals are used for navigation and positioning purposes by a diverse set of users. However, this project intends to utilize reflected GPS L5 signals to locate targets in a region of interest by generating an image that highlights the positions of targets in that area. The principle of bi-static radar is used to detect the targets or any movement or changes. The idea is confirmed by the results obtained from MATLAB simulations. A matched-filter-based technique is employed in the signal processing to improve the system resolution. The simulation is carried out under different conditions with a moving receiver and targets. Noise and attenuation are also introduced, and atmospheric conditions that affect the direct and reflected GPS signals have been simulated to generate a more practical scenario. A realistic GPS L5 signal has been simulated, and the simulation results verify that the detection and imaging of targets is possible by employing reflected GPS L5 signals and a matched filter processing technique with acceptable spatial resolution.
Keywords: GPS, L5 signal, SAR, spatial resolution
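A hedged sketch of the matched-filter step: correlating a noisy, delayed replica of a known spreading code with the reference to recover the delay (the code, delay and noise level are synthetic stand-ins for a true GPS L5 signal):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 10.23e6                                 # sampling rate; GPS L5 chips at 10.23 Mcps
code = rng.choice([-1.0, 1.0], size=10230)   # stand-in for one millisecond of spreading code

delay = 2500                                           # true delay in samples
received = np.zeros(3 * code.size)
received[delay:delay + code.size] += 0.5 * code        # attenuated reflected replica
received += rng.normal(scale=1.0, size=received.size)  # receiver noise

# Matched filtering = correlation of the received signal with the known reference code.
correlation = np.correlate(received, code, mode="valid")
estimated_delay = int(np.argmax(np.abs(correlation)))
print(f"estimated delay: {estimated_delay} samples "
      f"(extra path length ~{estimated_delay / fs * 3e8:.0f} m)")
```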
1344 Voxel Models as Input for Heat Transfer Simulations with Siemens NX Based on X-Ray Microtomography Images of Random Fibre Reinforced Composites
Authors: Steven Latré, Frederik Desplentere, Ilya Straumit, Stepan V. Lomov
Abstract:
A method is proposed in order to create a three-dimensional finite element model representing fibre reinforced insulation materials for the simulation software Siemens NX. VoxTex software, a tool for quantification of µCT images of fibrous materials, is used for the transformation of microtomography images of random fibre reinforced composites into finite element models. An automatic tool was developed to execute the import of the models to the thermal solver module of Siemens NX. The paper describes the numerical tools used for the image quantification and the transformation and illustrates them on several thermal simulations of fibre reinforced insulation blankets filled with low thermal conductive fillers. The calculation of thermal conductivity is validated by comparison with the experimental data.Keywords: analysis, modelling, thermal, voxel
1343 Scalable and Accurate Detection of Pathogens from Whole-Genome Shotgun Sequencing
Authors: Janos Juhasz, Sandor Pongor, Balazs Ligeti
Abstract:
Next-generation sequencing, especially whole genome shotgun sequencing, is becoming a common approach to gain insight into microbiomes in a culture-independent way, even in clinical practice. It not only gives us information about the species composition of an environmental sample but also opens the possibility of detecting antimicrobial resistance and novel, or currently unknown, pathogens. Accurately and reliably detecting the microbial strains is a challenging task. Here we present a sensitive approach for detecting pathogens in metagenomic samples, with special regard to detecting novel variants of known pathogens. We have developed a pipeline that uses fast, short-read aligner programs (i.e., Bowtie2/BWA) and comprehensive nucleotide databases. Taxonomic binning is based on the lowest common ancestor (LCA) principle; each read is assigned to a taxon covering the most significantly hit taxa. This approach helps in balancing between sensitivity and running time. The program was tested on both experimental and synthetic data. The results indicate that our method performs as well as the state-of-the-art BLAST-based ones; furthermore, in some cases, it even proves to be better, while running two orders of magnitude faster. It is sensitive and capable of identifying taxa present only in small abundance. Moreover, it needs two orders of magnitude fewer reads to complete the identification than MetaPhlAn2 does. We analyzed an experimental anthrax dataset (B. anthracis strain BA104). The majority of the reads (96.50%) were classified as Bacillus anthracis; a small portion, 1.2%, was classified as other species from the Bacillus genus. We demonstrate that the evaluation of high-throughput sequencing data is feasible in a reasonable time with good classification accuracy.
Keywords: metagenomics, taxonomy binning, pathogens, microbiome, B. anthracis
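A minimal sketch of lowest-common-ancestor (LCA) binning: a read hitting several taxa is assigned to the deepest taxon shared by all of its significant hits (the lineages below are a toy subset, not the full NCBI taxonomy):

```python
# Toy lineages from root to leaf, as would be extracted from a reference taxonomy.
LINEAGE = {
    "Bacillus anthracis": ["Bacteria", "Firmicutes", "Bacillales", "Bacillaceae",
                           "Bacillus", "Bacillus anthracis"],
    "Bacillus cereus":    ["Bacteria", "Firmicutes", "Bacillales", "Bacillaceae",
                           "Bacillus", "Bacillus cereus"],
    "Escherichia coli":   ["Bacteria", "Proteobacteria", "Enterobacterales",
                           "Enterobacteriaceae", "Escherichia", "Escherichia coli"],
}

def lowest_common_ancestor(hit_taxa):
    """Assign a read to the deepest taxon shared by all of its significant hits."""
    lineages = [LINEAGE[t] for t in hit_taxa]
    lca = "root"
    for ranks in zip(*lineages):
        if len(set(ranks)) > 1:   # lineages diverge here; stop one level above
            break
        lca = ranks[0]
    return lca

print(lowest_common_ancestor(["Bacillus anthracis"]))                     # species-level call
print(lowest_common_ancestor(["Bacillus anthracis", "Bacillus cereus"]))  # falls back to genus
```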
1342 Fast Tumor Extraction Method Based on Nl-Means Filter and Expectation Maximization
Authors: Sandabad Sara, Sayd Tahri Yassine, Hammouch Ahmed
Abstract:
The development of science has allowed computer scientists to contribute to medicine and bring aid to radiologists, as we present in this article. Our work focuses on the detection and localization of tumor areas in the human brain; this is completely automatic, without any human intervention. Faced with the huge volume of MRI scans to be processed per day, the radiologist can spend many hours and a tremendous effort. This burden becomes less heavy with the automation of this step. In this article we present an automatic and effective tumor detection method that consists of two steps: the first is image filtering using the NL-means filter; the second is applying the expectation maximization (EM) algorithm to retrieve the tumor mask from the brain MRI. The tumor area is then extracted using the mask obtained in the second step. To prove the effectiveness of this method, multiple evaluation criteria are used, so that we can compare our method to extraction methods frequently used in the literature.
Keywords: MRI, EM algorithm, brain, tumor, NL-means
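A hedged sketch of the two steps using off-the-shelf components — scikit-image's non-local means denoising followed by an EM-fitted Gaussian mixture to obtain an intensity-based mask; the test image and the choice of the brightest component as the "tumor" class are illustrative assumptions:

```python
import numpy as np
from skimage import data, img_as_float
from skimage.restoration import denoise_nl_means, estimate_sigma
from sklearn.mixture import GaussianMixture

# Stand-in for a brain MRI slice; in practice this would be the loaded scan.
image = img_as_float(data.camera())

# Step 1: non-local means denoising.
sigma = float(np.mean(estimate_sigma(image)))
denoised = denoise_nl_means(image, h=1.15 * sigma, patch_size=5,
                            patch_distance=6, fast_mode=True)

# Step 2: EM via a Gaussian mixture over intensities; the brightest component
# is taken here as the candidate "tumor" mask (an assumption for illustration).
pixels = denoised.reshape(-1, 1)
gmm = GaussianMixture(n_components=3, random_state=0).fit(pixels)
labels = gmm.predict(pixels).reshape(denoised.shape)
mask = labels == int(np.argmax(gmm.means_.ravel()))

extracted = denoised * mask        # region isolated by the mask
print(f"mask covers {mask.mean():.1%} of the image")
```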
1341 Adaptive Energy-Aware Routing (AEAR) for Optimized Performance in Resource-Constrained Wireless Sensor Networks
Authors: Innocent Uzougbo Onwuegbuzie
Abstract:
Wireless Sensor Networks (WSNs) are crucial for numerous applications, yet they face significant challenges due to resource constraints such as limited power and memory. Traditional routing algorithms like Dijkstra, Ad hoc On-Demand Distance Vector (AODV), and Bellman-Ford, while effective in path establishment and discovery, are not optimized for the unique demands of WSNs due to their large memory footprint and power consumption. This paper introduces the Adaptive Energy-Aware Routing (AEAR) model, a solution designed to address these limitations. AEAR integrates reactive route discovery, localized decision-making using geographic information, energy-aware metrics, and dynamic adaptation to provide a robust and efficient routing strategy. We present a detailed comparative analysis using a dataset of 50 sensor nodes, evaluating power consumption, memory footprint, and path cost across AEAR, Dijkstra, AODV, and Bellman-Ford algorithms. Our results demonstrate that AEAR significantly reduces power consumption and memory usage while optimizing path weight. This improvement is achieved through adaptive mechanisms that balance energy efficiency and link quality, ensuring prolonged network lifespan and reliable communication. The AEAR model's superior performance underlines its potential as a viable routing solution for energy-constrained WSN environments, paving the way for more sustainable and resilient sensor network deployments.Keywords: wireless sensor networks (WSNs), adaptive energy-aware routing (AEAR), routing algorithms, energy, efficiency, network lifespan
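A small sketch of the kind of energy-aware shortest-path metric such routing relies on: a Dijkstra search whose link cost combines distance-dependent transmission energy with a penalty for low residual energy at the receiving node (the topology, energies and cost weighting are illustrative assumptions, not the AEAR model itself):

```python
import networkx as nx

# Toy WSN: node attribute = residual battery energy (J); edge attribute = distance (m).
G = nx.Graph()
G.add_nodes_from([(n, {"energy": e}) for n, e in
                  [("S", 5.0), ("A", 0.8), ("B", 4.0), ("C", 3.5), ("Sink", 10.0)]])
G.add_weighted_edges_from([("S", "A", 10), ("S", "B", 18), ("A", "Sink", 12),
                           ("B", "C", 9), ("C", "Sink", 8)], weight="distance")

def link_cost(u, v, d):
    # Energy-aware metric (illustrative): transmission energy grows with distance
    # squared, and low residual energy at the receiving node is penalised.
    tx_energy = 1e-3 * d["distance"] ** 2
    return tx_energy + 1.0 / G.nodes[v]["energy"]

path = nx.shortest_path(G, "S", "Sink", weight=link_cost)
print(path)  # routes via B-C, avoiding the energy-depleted node A
```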
1340 Generalized Additive Model for Estimating Propensity Score
Authors: Tahmidul Islam
Abstract:
The Propensity Score Matching (PSM) technique has been widely used for estimating the causal effect of treatment in observational studies. One major step in implementing PSM is estimating the propensity score (PS). A logistic regression model with additive linear terms of the covariates is the most commonly used technique in many studies. Logistic regression is also used with cubic splines to retain flexibility in the model. However, choosing the functional form of the logistic regression model has remained an open question, since the effectiveness of PSM depends on how accurately the PS is estimated. In many situations, the linearity assumption of linear logistic regression may not hold, and a non-linear relation between the logit and the covariates may be appropriate. One can estimate the PS using machine learning techniques such as random forests, neural networks, etc., for more accuracy in non-linear situations. In this study, an attempt has been made to assess the efficacy of the Generalized Additive Model (GAM) in various linear and non-linear settings and compare its performance with the usual logistic regression. GAM is a non-parametric technique in which the functional form of the covariates can be left unspecified and a flexible regression model can be fitted. In this study, various simple and complex treatment models have been considered under several situations (small/large sample, low/high number of treatment units) to examine which method leads to better covariate balance in the matched dataset. It is found that the logistic regression model is impressively robust against the inclusion of quadratic and interaction terms and reduces the mean difference between the treatment and control sets as efficiently as GAM does. GAM provided no significantly better covariate balance than logistic regression in either simple or complex models. The analysis also suggests that a larger proportion of controls than treatment units leads to better balance for both methods.
Keywords: accuracy, covariate balances, generalized additive model, logistic regression, non-linearity, propensity score matching
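A hedged sketch comparing propensity scores from a GAM with smooth terms against a plain linear logistic model on synthetic data with a non-linear treatment assignment; it assumes the pygam package and does not reproduce the study's matching or balance diagnostics:

```python
import numpy as np
from pygam import LogisticGAM, s
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 2))
# Treatment assignment with a non-linear dependence on the covariates.
logit = 0.8 * X[:, 0] ** 2 - 1.2 * np.sin(2 * X[:, 1]) - 0.5
treated = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Propensity scores from a GAM with smooth terms vs. a plain linear logistic model.
ps_gam = LogisticGAM(s(0) + s(1)).fit(X, treated).predict_proba(X)
ps_log = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

for name, ps in [("GAM", ps_gam), ("logistic", ps_log)]:
    print(f"{name}: mean PS treated = {ps[treated == 1].mean():.3f}, "
          f"control = {ps[treated == 0].mean():.3f}")
```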
1339 Enhancing the Work of Art through Fashion Attire
Authors: A. N. Roslen, S. A. Syed-Sahil, A. Musavir
Abstract:
In Malaysia, there are only a few fashion designers who are inspired by the work of artists when creating their collections. The researchers confirmed this statement by interviewing fashion experts in Malaysia. The objectives of this study are to: 1. investigate the acceptance of fashion inspired by works of art among consumers; 2. encourage more designers to use works of art as their inspiration; 3. promote Malaysian artists through fashion. Thus, the researchers interviewed Malaysian fashion designers, image consultants, and one famous Malaysian artist (Awang Damit). All of them agreed that fashion inspired by works of art in Malaysia has a long way to go. Therefore, the researchers' aim is to attract more fashion designers to use the work of local artists in their creations. The researchers used interviews, a survey and experimentation as the methods of this study. In the experimentation procedure, paintings by the local artist Awang Damit were used as a source of inspiration in creating a design line. The results of this study show that fashion inspired by works of art is acknowledged and accepted by designers and consumers.
Keywords: art, fashion, inspiration, local artist
1338 Studying the Role of Teachers’ Self-Acceptance in the Development of Their Self-Esteem and Efficacy Level: A Case Study Applied to 37 Teachers at the English Department, Sidi Bel Abbes, Algeria
Authors: Asmaa Baghli
Abstract:
Self-acceptance is one of the most pertinent notions that has attracted the attention of many scholars. These scholars believe that a sense of self-acceptance contributes to the emergence of people's self-esteem and helps to improve their efficacy level. Simply defined, self-acceptance stands for the ability of a person to appreciate and accept themselves and their potential. This is believed to contribute to the creation of a personal image based on the qualities and features possessed. The following paper aims, first, to provide a brief and concise definition of self-acceptance, self-esteem and self-efficacy. It then tries to explain the correlation between the three concepts along with their linkage to language teaching. Next, it examines teachers' acceptance level and its influence on their classroom actions. For that purpose, the main methodology undertaken is the mixed method, that is, the combination of both quantitative and qualitative research methods. The primary tools selected are a questionnaire and a self-acceptance test for teachers. Finally, it suggests some techniques for developing teachers' self-acceptance.
Keywords: competence, development, efficacy, self-acceptance, self-esteem, teachers