Search results for: automatic classification
875 Identification System for Grading Banana in Food Processing Industry
Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan
Abstract:
In the food industry, high-quality production is required within a limited time to meet societal demand. In this research work, we developed a model that can replace the human operator, whose output is low and whose decisions are slow owing to individual differences in judging defective and healthy bananas. The model performs the visual task of the human operator in deciding whether a banana is defective or healthy for food production. The work is divided into two phases. The first phase is image processing, in which several techniques such as colour conversion, edge detection, thresholding and morphological operations were employed to extract features for training and testing the network in the second phase. The features extracted in the first phase were used in the second phase, the classification system, in which a multilayer perceptron trained with the backpropagation algorithm was employed. After the network had learned and converged, it was tested in feedforward mode to determine its performance. From this experiment, a recognition rate of 97% was obtained within a limited processing time, which makes the system suitable for use in the food industry.
Keywords: banana, food processing, identification system, neural network
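The classification phase described above can be illustrated with a small feed-forward network trained by backpropagation. The sketch below is a minimal Python outline using scikit-learn; the feature vectors and labels are synthetic stand-ins for the image-derived banana features, not the authors' data or code.

```python
# Illustrative sketch only: an MLP classifier on image-derived feature vectors.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row per banana image; columns could be
# area, perimeter, mean hue, defect-pixel ratio, etc. extracted in phase one.
rng = np.random.default_rng(0)
X = rng.random((200, 4))
y = (X[:, 3] > 0.5).astype(int)  # 1 = defective, 0 = healthy (synthetic labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Multilayer perceptron trained with backpropagation (one hidden layer).
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

print("recognition rate:", clf.score(X_test, y_test))
```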
Procedia PDF Downloads 471
874 Study on Optimization Design of Pressure Hull for Underwater Vehicle
Authors: Qasim Idrees, Gao Liangtian, Liu Bo, Miao Yiran
Abstract:
In order to improve the efficiency and accuracy of pressure hull structure optimization for underwater vehicles, a design optimization method based on response surface methodology was studied. Five dimensions of the pressure shell were chosen as design variables, and the preliminary design was carried out by applying thin-shell theory and the China Classification Society (CCS) rules. To explore the feasible region of the design variables, several methods were studied and implemented: the optimal Latin hypercube design (Opt LHD) method to determine the sample points in the feasible design space, a parametric ABAQUS solution for the response at each sample point, and a second-order polynomial response surface model of the structural limit load. Based on the ultimate load of the structure and the mass of the shell, a second-generation genetic algorithm was applied to the response surface, and the Pareto optimal solution set was obtained. The final optimized result was 41.68% higher than that of the initial design, and the shell mass was reduced by about 27.26%. The parametric method ensures the accuracy of the analysis and improves the efficiency of the optimization.
Keywords: parameterization, response surface, structure optimization, pressure hull
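The surrogate-plus-optimizer workflow can be sketched in a few lines. The snippet below is an assumption-laden illustration: synthetic sample points stand in for the Opt LHD/ABAQUS results, a second-order polynomial is fitted as the response surface, and a single-objective evolutionary search (a scalarized stand-in for the multi-objective genetic algorithm) explores it.

```python
# Minimal sketch of the response-surface step on synthetic data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(50, 5))   # 5 normalized shell dimensions (sample points)
limit_load = 1.0 + X @ np.array([0.5, 0.3, 0.8, 0.2, 0.4]) - 0.6 * X[:, 0] * X[:, 2]
mass = 0.5 + X.sum(axis=1)                # hypothetical shell mass

# Second-order polynomial response surfaces for limit load and mass.
load_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, limit_load)
mass_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, mass)

def objective(x):
    x = x.reshape(1, -1)
    # Weighted scalarization: maximize predicted limit load, minimize mass.
    return -load_model.predict(x)[0] + 0.5 * mass_model.predict(x)[0]

result = differential_evolution(objective, bounds=[(0.0, 1.0)] * 5, seed=1)
print("optimal design variables:", result.x)
```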
Procedia PDF Downloads 233
873 Using Satellite Images Datasets for Road Intersection Detection in Route Planning
Authors: Fatma El-Zahraa El-Taher, Ayman Taha, Jane Courtney, Susan Mckeever
Abstract:
Understanding road networks plays an important role in navigation applications such as self-driving vehicles and route planning for individual journeys. Intersections of roads are essential components of road networks. Understanding the features of an intersection, from a simple T-junction to larger multi-road junctions, is critical to decisions such as crossing roads or selecting the safest routes. The identification and profiling of intersections from satellite images is a challenging task. While deep learning approaches offer the state-of-the-art in image classification and detection, the availability of training datasets is a bottleneck in this approach. In this paper, a labelled satellite image dataset for the intersection recognition problem is presented. It consists of 14,692 satellite images of Washington DC, USA. To support other users of the dataset, an automated download and labelling script is provided for dataset replication. The challenges of constructing and fine-grained feature labelling of a satellite image dataset are examined, including the issue of how to address features that are spread across multiple images. Finally, the accuracy of the detection of intersections in satellite images is evaluated.
Keywords: satellite images, remote sensing images, data acquisition, autonomous vehicles
Procedia PDF Downloads 145
872 Encryption and Decryption of Nucleic Acid Using Deoxyribonucleic Acid Algorithm
Authors: Iftikhar A. Tayubi, Aabdulrahman Alsubhi, Abdullah Althrwi
Abstract:
The deoxyribonucleic acid text provides a single source of high-quality cryptography for structural biologists working with DNA sequences. We provide an intuitive, well-organized and user-friendly web interface that allows users to encrypt and decrypt deoxyribonucleic acid (DNA) sequence text, using an algorithm to secure the sequence data. The utility of this DNA sequence text tool is that it provides a user-friendly interface for users to encrypt, decrypt and store information about DNA sequences. The interfaces created in this project satisfy the demands of the scientific community by fully encrypting DNA sequences within the website. We adopted a methodology using C# and ASP.NET for programming, which is efficient and secure. DNA sequence text is well suited to encrypting large quantities of data efficiently. Users can thus navigate between encodings and stored text, depending on their field of interest. The algorithm also allows a user to protect the DNA sequence from change, whether an alteration or error occurs during DNA sequence data transfer, and it checks the integrity of the DNA sequence data during access.
Keywords: algorithm, ASP.NET, DNA, encrypt, decrypt
Procedia PDF Downloads 234
871 The Impact of Legislation on Waste and Losses in the Food Processing Sector in the UK/EU
Authors: David Lloyd, David Owen, Martin Jardine
Abstract:
Introduction: European weight regulations for food products require a full understanding of the regulatory guidelines to assure compliance. It is suggested that the complexity of the regulation leads to practices in which food processors overfill food packages. Purpose: To establish current practices among food processors and the financial, sustainability and societal impacts of ineffective food production practices on the food supply chain. Methods: An analysis of food packing controls with 10 companies across varying food categories, and quantitative research with a further 15 food processors on the confidence in weight control analysis of finished food packs within their organisations. Results: A process-floor analysis of manufacturing operations focusing on 10 products found overfill of packages ranging from 4.8% to 20.2%. Standard deviation figures for all products showed the potential to reduce the average pack weight while still retaining the legal status of the product. In 20% of cases, an automatic weight analysis machine was in situ, yet packs were still significantly overweight. Collateral impacts noted included the effect of overfill on raw material purchases and added food miles, often on a global basis, with one raw material alone generating 10,000 extra food miles due to the poor weight control of the processing unit. Case studies of a meat product and a bakery product will be discussed, along with the impact of poor controls resulting from complex legislation. The case studies will highlight the extra energy costs in production and the impact of the extra weight on fuel usage. If successful, a risk assessment model used primarily for food safety but adapted to identify waste and sustainability risks will be discussed within the presentation.
Keywords: legislation, overfill, profile, waste
Procedia PDF Downloads 407
870 A Paradigm Shift towards Personalized and Scalable Product Development and Lifecycle Management Systems in the Aerospace Industry
Authors: David E. Culler, Noah D. Anderson
Abstract:
Integrated systems for product design, manufacturing, and lifecycle management are difficult to implement and customize. Commercial software vendors, including CAD/CAM and third-party PDM/PLM developers, create user interfaces and functionality that allow their products to be applied across many industries. The result is that systems become overloaded with functionality, difficult to navigate, and use terminology that is unfamiliar to engineers and production personnel. For example, manufacturers of automotive, aeronautical, electronics, and household products use similar but distinct methods and processes. Furthermore, each company tends to have its own preferred tools and programs for controlling work and information flow and for connecting design, planning, and manufacturing processes to business applications. This paper presents a methodology and a case study that address these issues and suggest that in the future more companies will develop personalized applications that fit the natural way their business operates. A functioning system has been implemented at a highly competitive U.S. aerospace tooling and component supplier that works with many prominent aircraft manufacturers around the world, including The Boeing Company, Airbus, Embraer, and Bombardier Aerospace. During the last three years, the program has produced significant benefits such as the automatic creation and management of component and assembly designs (parametric models and drawings), the extensive use of lightweight 3D data, and changes to the way projects are executed from beginning to end. CATIA (CAD/CAE/CAM) and a variety of programs developed in C#, VB.Net, HTML, and SQL make up the current system. The web-based platform is facilitating collaborative work across multiple sites around the world and improving communications with customers and suppliers. This work demonstrates that the creative use of Application Programming Interface (API) utilities, libraries, and methods is a key to automating many time-consuming tasks and linking applications together.
Keywords: PDM, PLM, collaboration, CAD/CAM, scalable systems
Procedia PDF Downloads 174
869 An Application for Risk of Crime Prediction Using Machine Learning
Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento
Abstract:
The increase of the world population, especially in large urban centres, has resulted in new challenges, particularly in the control and optimization of public safety. In the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation are presented, starting with data collection from its original source, the treatment and transformations applied to the data, the choice, evaluation and implementation of the Machine Learning model, and finally the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine Learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformations used. The results show that the use of Machine Learning techniques helps to anticipate criminal occurrences, which contributes to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API to enable other entities to make requests for predictions in real time. An application is also presented where criminal predictions can be shown visually.
Keywords: crime prediction, machine learning, public safety, smart city
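A minimal sketch of the classification step is given below: a random-forest model scores criminal risk for a (location, time-window) pair. The feature names and synthetic values are illustrative assumptions, not the authors' actual schema or data.

```python
# Hedged sketch: random-forest risk classification on synthetic grid-cell features.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical engineered features per grid cell and hour.
df = pd.DataFrame({
    "hour":        [0, 3, 12, 18, 22, 2, 14, 20] * 25,
    "weekday":     [0, 1, 2, 3, 4, 5, 6, 0] * 25,
    "population":  [500, 1200, 900, 3000, 2500, 700, 1500, 2100] * 25,
    "past_crimes": [0, 2, 1, 5, 4, 0, 1, 3] * 25,
})
df["high_risk"] = (df["past_crimes"] >= 3).astype(int)   # synthetic label

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="high_risk"), df["high_risk"], test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```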
Procedia PDF Downloads 112
868 Stock Market Prediction Using Convolutional Neural Network That Learns from a Graph
Authors: Mo-Se Lee, Cheol-Hwi Ahn, Kee-Young Kwahk, Hyunchul Ahn
Abstract:
Over the past decade, deep learning has been in the spotlight among various machine learning algorithms. In particular, CNN (Convolutional Neural Network), which is known as an effective solution for recognizing and classifying images, has been popularly applied to classification and prediction problems in various fields. In this study, we apply CNN to stock market prediction, one of the most challenging tasks in machine learning research. Specifically, we propose to apply CNN as a binary classifier that predicts stock market direction (up or down) by using a graph as its input. That is, our proposal is to build a machine learning algorithm that mimics a person who looks at the graph and predicts whether the trend will go up or down. Our proposed model consists of four steps. In the first step, it divides the dataset into intervals of 5, 10, 15, and 20 days. It then creates graphs for each interval in step 2. In the next step, CNN classifiers are trained using the graphs generated in the previous step. In step 4, the hyperparameters of the trained model are optimized using the validation dataset. To validate our model, we apply it to the prediction of KOSPI200 for 1,986 days over eight years (from 2009 to 2016). The experimental dataset includes 14 technical indicators, such as CCI, Momentum and ROC, and the daily closing price of KOSPI200 of the Korean stock market.
Keywords: convolutional neural network, deep learning, Korean stock market, stock market prediction
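The third step (training a CNN on rendered chart images) can be sketched as below. The input size, layer sizes and number of layers are illustrative assumptions in Keras, not the authors' architecture.

```python
# Sketch of a binary CNN classifier that takes a rendered price-chart image
# and predicts market direction (up/down). Architecture is illustrative only.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 1)),        # grayscale chart image
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # P(market goes up)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels))
```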
Procedia PDF Downloads 425
867 Using Closed Frequent Itemsets for Hierarchical Document Clustering
Authors: Cheng-Jhe Lee, Chiun-Chieh Hsu
Abstract:
Due to the rapid development of the Internet and the increased availability of digital documents, the excess of information on the Internet has led to an information overload problem. To address this problem and enable effective information retrieval, document clustering in text mining has become a popular research topic. Clustering is the unsupervised classification of data items into groups without the need for training data. Many conventional document clustering methods perform inefficiently for large document collections because they were originally designed for relational databases; they are therefore impractical for real-world document clustering and require special handling of high dimensionality and high volume. We propose the FIHC (Frequent Itemset-based Hierarchical Clustering) method, a hierarchical clustering method developed for document clustering, where the intuition of FIHC is that some common words exist for each cluster. FIHC uses such words to cluster documents and builds a hierarchical topic tree. In this paper, we combine the FIHC algorithm with an ontology to address the semantic problem and mine the meaning behind the words in documents. Furthermore, we use closed frequent itemsets instead of all frequent itemsets, which increases efficiency and scalability. The experimental results show that our method is more accurate than well-known document clustering algorithms.
Keywords: FIHC, documents clustering, ontology, closed frequent itemset
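The distinction between frequent and closed frequent itemsets is easy to illustrate on a toy document collection: an itemset is closed if no proper superset has the same support. The brute-force sketch below shows only that notion; it is not the FIHC algorithm or the ontology step of the paper.

```python
# Minimal sketch: mine closed frequent itemsets from documents reduced to term sets.
from itertools import combinations

docs = [{"cluster", "data", "mining"},
        {"cluster", "data"},
        {"cluster", "tree"},
        {"data", "mining"}]
min_support = 2

def support(itemset):
    return sum(1 for d in docs if itemset <= d)

items = sorted({t for d in docs for t in d})
frequent = [frozenset(c)
            for k in range(1, len(items) + 1)
            for c in combinations(items, k)
            if support(set(c)) >= min_support]

# Closed = no proper superset with the same support.
closed = [s for s in frequent
          if not any(s < t and support(t) == support(s) for t in frequent)]

for s in closed:
    print(sorted(s), "support =", support(s))
```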
Procedia PDF Downloads 399
866 A Comparative Analysis of Machine Learning Techniques for PM10 Forecasting in Vilnius
Authors: Mina Adel Shokry Fahim, Jūratė Sužiedelytė Visockienė
Abstract:
With the growing concern over air pollution (AP), it is clear that this issue has gained more prominence than ever before. The level of awareness has increased, and knowledge now has to be passed on as a duty by those informed enough to disseminate it to others. This realisation often comes after an understanding of how poor air quality indices (AQI) damage human health. The study focuses on assessing air pollution prediction models specifically for Lithuania, addressing a substantial need for empirical research within the region. Concentrating on Vilnius, it specifically examines concentrations of particulate matter 10 micrometres or less in diameter (PM10). Utilizing Gaussian Process Regression (GPR), Regression Tree Ensemble, and Regression Tree methodologies, predictive forecasting models are validated and tested using hourly data from January 2020 to December 2022. The study explores the classification of AP data into anthropogenic and natural sources, the impact of AP on human health, and its connection to cardiovascular diseases. The study revealed varying levels of accuracy among the models, with GPR achieving the highest accuracy, indicated by an RMSE of 4.14 in validation and 3.89 in testing.
Keywords: air pollution, anthropogenic and natural sources, machine learning, Gaussian process regression, tree ensemble, forecasting models, particulate matter
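A compact sketch of the best-performing approach (GPR on hourly PM10) is given below. The synthetic series, the lag-based features and the kernel choice are assumptions for illustration, not the authors' configuration.

```python
# Hedged sketch: Gaussian Process Regression on lagged hourly PM10 values.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Synthetic hourly PM10 trace with a daily cycle plus noise.
pm10 = 20 + 5 * np.sin(np.arange(500) / 24 * 2 * np.pi) + rng.normal(0, 2, 500)

# Lag features: the previous three hourly readings predict the next hour.
X = np.column_stack([pm10[0:-3], pm10[1:-2], pm10[2:-1]])
y = pm10[3:]

split = 400
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X[:split], y[:split])

pred = gpr.predict(X[split:])
rmse = mean_squared_error(y[split:], pred) ** 0.5
print("test RMSE:", round(rmse, 2))
```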
Procedia PDF Downloads 53
865 Facility Data Model as Integration and Interoperability Platform
Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes
Abstract:
Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this involves increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to provide such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process, starting from input data acquisition, through the definition of ontology concepts, to the population of those concepts, is described. At the beginning, a core facility ontology was developed representing the generic facility infrastructure, comprised of the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was first extended and then populated. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system is analyzed as well.
Keywords: airport ontology, energy management, facility data model, ontology modeling
Procedia PDF Downloads 448
864 A Web Service Based Sensor Data Management System
Authors: Rose A. Yemson, Ping Jiang, Oyedeji L. Inumoh
Abstract:
The deployment of wireless sensor networks has rapidly increased; however, with the increased capacity and diversity of sensors, applications ranging from biological and environmental to military generate a tremendous volume of data, and most attention has been placed on the distributed sensing and little on how to manage, analyze, retrieve and understand the data generated. This makes it quite difficult to process live sensor data and to run concurrent control and updates, because sensor data are heavyweight, complex, and slow to handle. This work focuses on developing a web service platform for the automatic detection of sensors, acquisition of sensor data, storage of sensor data in a database, and processing of sensor data using reconfigurable software components. This work also creates a web-service-based sensor data management system to monitor the physical movement of an individual wearing a wireless network sensor device (SunSPOT). The sensor detects the movement of the individual by sensing the acceleration along the X, Y and Z axes and then sends the sensed readings to a database that is interfaced with an Internet platform. The collected data determine the posture of the person, such as standing, sitting and lying down. The system is designed using the Unified Modeling Language (UML) and implemented using Java, JavaScript, HTML and MySQL. This system allows real-time monitoring of an individual and obtains details of their physical activity without the need to be physically present for in-situ measurement, which enables remote work instead of the time-consuming checking of an individual. These details can help in evaluating an individual's physical activity and generating feedback on medication. It can also help in keeping track of any mandatory physical activities required of the individual. These evaluations and this feedback can help in maintaining a better health status for the individual and providing improved health care.
Keywords: HTML, java, javascript, MySQL, sunspot, UML, web-based, wireless network sensor
Procedia PDF Downloads 212
863 Energy Management System and Interactive Functions of Smart Plug for Smart Home
Authors: Win Thandar Soe, Innocent Mpawenimana, Mathieu Di Fazio, Cécile Belleudy, Aung Ze Ya
Abstract:
Intelligent electronic equipment and automation networks are the brain of high-tech energy management systems and play a critical role in smart homes. A smart home integrates technology for greater comfort, autonomy, reduced cost, and energy saving. These services can be provided to home owners for managing their home appliances locally or remotely and consequently allow them to automate their consumption intelligently and responsibly through individual or collective control systems. In this study, three smart plugs are described, and one of them is tested on typical household appliances. This article proposes to collect data over the wireless technology and to extract smart data for an energy management system. This smart data quantifies three kinds of load: intermittent load, phantom load and continuous load. Phantom load is wasted, unnoticed power drawn by an appliance while it is connected to the mains but not in active use. Intermittent load and continuous load take into consideration the power and usage time of home appliances. By analysing this classification of loads, the smart data can be used to reduce the communication of the wireless sensor network for the energy management system.
Keywords: energy management, load profile, smart plug, wireless sensor network
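The three load classes can be illustrated with a simple rule over a plug-level power trace. The sketch below is a minimal, threshold-based stand-in (the thresholds and rules are assumptions) and not the authors' extraction method.

```python
# Illustrative sketch: classify a plug power trace into phantom,
# intermittent or continuous load using standby power and duty cycle.
import numpy as np

def classify_load(power_w, on_threshold_w=5.0):
    """power_w: array of sampled active power (W) for one appliance."""
    power_w = np.asarray(power_w, dtype=float)
    duty_cycle = np.mean(power_w > on_threshold_w)      # fraction of time "on"
    off_samples = power_w[power_w <= on_threshold_w]
    standby = float(np.median(off_samples)) if off_samples.size else 0.0

    if duty_cycle == 0 and standby > 0:
        return "phantom load"        # draws power while nominally off
    if duty_cycle >= 0.9:
        return "continuous load"     # e.g. base load of a fridge
    return "intermittent load"       # e.g. kettle, washing machine

print(classify_load([1.2, 1.1, 1.3, 1.2]))            # phantom
print(classify_load([60, 61, 59, 62, 60, 61]))        # continuous
print(classify_load([0.5, 900, 0.4, 0.5, 850, 0.6]))  # intermittent
```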
Procedia PDF Downloads 273
862 The Use of Boosted Multivariate Trees in Medical Decision-Making for Repeated Measurements
Authors: Ebru Turgal, Beyza Doganay Erdogan
Abstract:
Machine learning aims to model the relationship between a response and features. Medical decision-making researchers would like to make decisions about patients' course and treatment by examining repeated measurements over time. The boosting approach is now being used in machine learning as an influential tool for these aims. The aim of this study is to show the usage of multivariate tree boosting in this field. The main reason for utilizing this approach in decision-making is the ease with which it models complex relationships. To show how the multivariate tree boosting method can be used to identify important features and feature-time interactions, we used data collected retrospectively from Ankara University Chest Diseases Department records. The dataset includes repeated PF ratio measurements, with a planned follow-up time of 120 hours. A set of different models is tested. In conclusion, the main idea of classification with a weighted combination of classifiers is a reliable method, as has been shown several times in simulations. Furthermore, time-varying variables are taken into consideration within this concept, and it may be possible to make accurate decisions about regression and survival problems.
Keywords: boosted multivariate trees, longitudinal data, multivariate regression tree, panel data
Procedia PDF Downloads 203
861 Phylogenetic Relationships of Common Reef Fish Species in Vietnam
Authors: Dang Thuy Binh, Truong Thi Oanh, Le Phan Khanh Hung, Luong thi Tuong Vy
Abstract:
One of the greatest environmental challenges facing Asia is the management and conservation of marine biodiversity threatened by fisheries overexploitation, pollution, habitat destruction, and climate change. To date, few molecular taxonomic studies have been conducted on the marine fauna of Vietnam. The purpose of this study was to clarify the phylogeny of economically and ecologically important reef fish species in Vietnam. Reef fish species covering the Labridae, Scaridae, Nemipteridae, Serranidae, Acanthuridae, Lutjanidae, Lethrinidae, Mullidae, Balistidae, Pseudochromidae, Pinguipedidae, Fistulariidae, Holocentridae, Synodontidae, and Pomacentridae, representing 28 genera, were collected from southern and central Vietnam. Combined with GenBank sequences, a phylogenetic tree was constructed based on the 16S gene of mitochondrial DNA using maximum parsimony, maximum likelihood, and Bayesian inference approaches. The phylogram showed well-resolved clades at the genus and family level. Perciformes is the major order of reef fish species in Vietnam. The monophyly of Perciformes is not strongly supported, as it clustered in the same clade with Tetraodontiformes, Syngnathiformes and Beryciformes. Continued sampling of commercial fish species and classification based on morphology and genetics to build DNA barcoding of fish species in Vietnam is very necessary.
Keywords: reef fish, 16s rDNA, Vietnam, phylogeny
Procedia PDF Downloads 438
860 Response of Yield and Morphological Characteristic of Rice Cultivars to Heat Stress at Different Growth Stages
Authors: Mohammad Taghi Karbalaei Aghamolki, Mohd Khanif Yusop, Fateh Chand Oad, Hamed Zakikhani, Hawa Zee Jaafar, Sharifh Kharidah, Mohamed Hanafi Musa, Shahram Soltani
Abstract:
High temperatures during sensitive growth phases change rice morphology and influence yield. In this glasshouse study, the treatments were: growing conditions [normal growing (32±2 °C) and heat stress (38±2 °C) during the day, with 22±2 °C at night], growth stages (booting, flowering and ripening) and four cultivars (Hovaze, Hashemi and Fajr as exotic cultivars, and MR219 as an indigenous one). The heat chamber was covered with plastic, and an automatic heater was set at 38±2 °C (day) and 22±2 °C (night) for two weeks at each growth stage. Rice morphology and yield under the influence of heat stress during the various growth stages showed taller plants in Hashemi, owing to its tall character. Total tillers per hill were significantly higher in Fajr when heat stress was applied during the booting stage. Under all growing conditions and growth stages, Hashemi recorded greater panicle exsertion and flag leaf length. The flag leaf width was greater in Hovaze in all situations. Total tillers per hill were higher in Fajr, even when heat stress was imposed during the booting and flowering stages. The indigenous MR219 recorded higher grain yield under all growing conditions and growth stages; however, its grain yield decreased slightly when heat stress was imposed during booting and flowering. Similar results were found for all other exotic cultivars, which recorded lower grain yields under heat stress during booting and flowering. Plants were not affected by heat stress during the ripening stage.
Keywords: rice, growth, heat, temperature, stress, morphology, yield
Procedia PDF Downloads 276
859 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and detection techniques for humans and their outer shapes. Next, we extract valuable information in the form of cues. We extract two distinct features: fuzzy local binary patterns and a sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification we use a random forest. We tested our model on two benchmark datasets, AAMAZ and the KTH multi-view football dataset. Our HMR framework significantly outperforms the other state-of-the-art approaches and achieves better recognition rates of 91% and 89.6% on the AAMAZ and KTH multi-view football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
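One piece of this pipeline, local binary pattern histograms fed to a random forest, can be sketched as follows. The fuzzy LBP variant, the sequence features and the GRASP optimization step of the paper are not reproduced; the frames and labels below are synthetic.

```python
# Hedged sketch: LBP texture histograms classified with a random forest.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

def lbp_histogram(image, p=8, r=1.0):
    lbp = local_binary_pattern(image, p, r, method="uniform")
    hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2), density=True)
    return hist

rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(40, 64, 64)).astype(np.uint8)  # synthetic frames
labels = rng.integers(0, 2, size=40)                               # two action classes

features = np.array([lbp_histogram(f) for f in frames])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```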
Procedia PDF Downloads 37
858 Breast Cancer Diagnosing Based on Online Sequential Extreme Learning Machine Approach
Authors: Musatafa Abbas Abbood Albadr, Masri Ayob, Sabrina Tiun, Fahad Taha Al-Dhief, Mohammad Kamrul Hasan
Abstract:
Breast Cancer (BC) is considered one of the most frequent causes of cancer death in women between the ages of 40 and 55. BC is diagnosed by using digital images of fine needle aspirates (FNA) of both benign and malignant tumors of the breast mass. Therefore, this work proposes the Online Sequential Extreme Learning Machine (OSELM) algorithm for diagnosing BC by using the tumor features of the breast mass. The current work used the Wisconsin Diagnosis Breast Cancer (WDBC) dataset, which contains 569 samples (357 samples of the benign class and 212 samples of the malignant class). Further, numerous assessment measures were used to evaluate the proposed OSELM algorithm, such as specificity, precision, F-measure, accuracy, G-mean, MCC, and recall. According to the outcomes of the experiment, the highest performance of the proposed OSELM was accomplished with 97.66% accuracy, 98.39% recall, 95.31% precision, 97.25% specificity, 96.83% F-measure, 95.00% MCC, and 96.84% G-mean. The proposed OSELM algorithm demonstrates promising results in diagnosing BC. Moreover, the performance of the proposed OSELM algorithm was superior to all of its comparatives with respect to the classification rate.
Keywords: breast cancer, machine learning, online sequential extreme learning machine, artificial intelligence
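The OS-ELM idea, random fixed hidden weights plus a recursively updated output layer, can be outlined in a few lines of numpy. The hidden-layer size, sigmoid activation and synthetic data below are assumptions; the sketch follows the standard OS-ELM recursive update rather than the authors' exact implementation on the WDBC features.

```python
# Minimal numpy sketch of Online Sequential ELM (OS-ELM) for binary classification.
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden = 30, 40
W = rng.normal(size=(n_features, n_hidden))   # random input weights (fixed)
b = rng.normal(size=n_hidden)                 # random biases (fixed)

def hidden(X):
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden layer

# Initialization phase on a first batch (X0, labels t0 in {0, 1}).
X0 = rng.normal(size=(100, n_features)); t0 = rng.integers(0, 2, 100).astype(float)
H0 = hidden(X0)
P = np.linalg.inv(H0.T @ H0 + 1e-3 * np.eye(n_hidden))   # regularized
beta = P @ H0.T @ t0

# Sequential learning phase: recursive update for each new chunk of samples.
def oselm_update(P, beta, X_chunk, t_chunk):
    H = hidden(X_chunk)
    K = np.linalg.inv(np.eye(len(X_chunk)) + H @ P @ H.T)
    P = P - P @ H.T @ K @ H @ P
    beta = beta + P @ H.T @ (t_chunk - H @ beta)
    return P, beta

X1 = rng.normal(size=(20, n_features)); t1 = rng.integers(0, 2, 20).astype(float)
P, beta = oselm_update(P, beta, X1, t1)
pred = (hidden(X1) @ beta > 0.5).astype(int)
print("chunk accuracy:", (pred == t1).mean())
```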
Procedia PDF Downloads 111
857 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis
Authors: Shriya Shukla, Lachin Fernando
Abstract:
Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.
Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning
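A typical way to use a lightweight MobileNetV2 backbone for this kind of binary CXR task is transfer learning with a small classification head, as sketched below. The input size, freezing strategy and hyperparameters are assumptions, and the VAE-GAN augmentation step is not shown.

```python
# Sketch of a MobileNetV2 classifier head for pneumonia vs. normal CXR images.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False   # frozen lightweight backbone as feature extractor

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # P(pneumonia)
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```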
Procedia PDF Downloads 126
856 An Analysis of the Representation of the Translator and Translation Process into Brazilian Social Networking Groups
Authors: Érica Lima
Abstract:
In the digital era, in which we face an avalanche of information, it is nothing new that the Internet has brought new modes of communication and knowledge access. Characterized by a multiplicity of discourses, opinions, beliefs and cultures, the web is a space with political-ideological dimensions where people (who often do not know each other) interact, create representations, deconstruct stereotypes, and redefine identities. Currently, the translator needs to be able to deal with digital spaces ranging from specific software to social media, which inevitably impact their professional life. One of the most impactful ways of being seen in cyberspace is participation in social networking groups. In addition to their ability to disseminate information among participants, social networking groups allow significant personal and social exposure. Such exposure is due to the visibility each participant achieves not only on their personal profile page but also in each comment or post the person makes in the groups. The objective of this paper is to study the representations of translators and the translation process on the Internet, more specifically in publications in two Brazilian Facebook groups of great influence: "Translators/Interpreters" and "Translators, Interpreters and Curious". These chosen groups represent the changes the network has brought to the profession, including the way translators are seen and see themselves. The analyzed posts allowed a reading of what common sense seems to think about the translator as opposed to what translators seem to think about themselves as a professional class. The results of the analysis lead to the conclusion that these two positions are antagonistic and sometimes represent a conflict of interests: on the one hand, society in general considers the translator's work easy and therefore not deserving of good remuneration; on the other hand, translators know how complex a translation process is and how much it takes to be a good professional. The results also reveal that social networking sites such as Facebook provide more visibility, but a more active role from the translator is needed to achieve greater appreciation of the profession and more recognition of the translator's role, especially in the face of the increasing development of automatic translation programs.
Keywords: Facebook, social representation, translation, translator
Procedia PDF Downloads 148
855 The Analysis of Differential Item and Test Functioning between Sexes by Studying on the Scholastic Aptitude Test 2013
Authors: Panwasn Mahalawalert
Abstract:
The purpose of this research was to analyze differential item functioning and differential test functioning of the SWUSAT aptitude test, classified by sex. The data used in this research are secondary data from the Srinakharinwirot University Scholastic Aptitude Test 2013 (SWUSAT). The SWUSAT test consists of four subjects: a verbal ability test, a number ability test, a reasoning ability test and a spatial ability test. The data analysis was carried out in two steps. The first step was the analysis of descriptive statistics. In the second step, differential item functioning (DIF) and differential test functioning (DTF) were analyzed using the DIFAS program. The research results were as follows: for all 10 tests in the year 2013, gender was the characteristic for which DIF was found in all 10 tests. The percentage of items showing DIF was between 6.67% and 60%. There are 5 tests in which most items favor the female group and 2 tests in which most items favor the male group. In 3 tests, the number of items favoring the female group equals the number favoring the male group. For differential test functioning (DTF), 8 tests show a small DTF effect.
Keywords: aptitude test, differential item functioning, differential test functioning, educational measurement
Procedia PDF Downloads 412
854 An Investigation of Differential Item and Test Functioning of Scholastic Aptitude Test 2011 (SWUSAT 2011)
Authors: Ruangdech Sirikit
Abstract:
The purpose of this study was to analyze differential item functioning and differential test functioning of the SWUSAT aptitude test, classified by sex. The data used in this research are secondary data from the Srinakharinwirot University Scholastic Aptitude Test 2011 (SWUSAT 2011). The SWUSAT test consists of four subjects: a verbal ability test, a number ability test, a reasoning ability test and a spatial ability test. The data analysis was carried out in two steps. The first step was the analysis of descriptive statistics. In the second step, differential item functioning (DIF) and differential test functioning (DTF) were analyzed using the DIFAS program. The research results were as follows: for all 10 tests in the year 2011, sex was the characteristic for which DIF was found in all 10 tests. The percentage of items showing DIF was between 10% and 46.67%. There are 4 tests in which most items favor the female group, 3 tests in which most items favor the male group, and 3 tests in which the number of items favoring the female group equals the number favoring the male group. For differential test functioning (DTF), 8 tests have a small DIF effect variance.
Keywords: differential item functioning, differential test functioning, SWUSAT, aptitude test
Procedia PDF Downloads 611
853 Dido: An Automatic Code Generation and Optimization Framework for Stencil Computations on Distributed Memory Architectures
Authors: Mariem Saied, Jens Gustedt, Gilles Muller
Abstract:
We present Dido, a source-to-source auto-generation and optimization framework for multi-dimensional stencil computations. It enables a large programmer community to easily and safely implement stencil codes on distributed-memory parallel architectures with Ordered Read-Write Locks (ORWL) as an execution and communication back-end. ORWL provides inter-task synchronization for data-oriented parallel and distributed computations. It has been proven to guarantee equity, liveness, and efficiency for a wide range of applications, particularly for iterative computations. Dido consists mainly of an implicitly parallel domain-specific language (DSL) implemented as a source-level transformer. It captures domain semantics at a high level of abstraction and generates parallel stencil code that leverages all ORWL features. The generated code is well-structured and lends itself to different possible optimizations. In this paper, we enhance Dido to handle both Jacobi and Gauss-Seidel grid traversals. We integrate temporal blocking into the Dido code generator in order to reduce the communication overhead and minimize data transfers. To increase data locality and improve intra-node data reuse, we coupled the code generation technique with the polyhedral parallelizer Pluto. The accuracy and portability of the generated code are guaranteed thanks to a parametrized solution. The combination of ORWL features, the code generation pattern and the suggested optimizations makes Dido a powerful code generation framework for stencil computations in general, and for distributed-memory architectures in particular. We present a wide range of experiments over a number of stencil benchmarks.
Keywords: stencil computations, ordered read-write locks, domain-specific language, polyhedral model, experiments
Procedia PDF Downloads 127
852 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods
Authors: Bandar Alahmadi, Lethia Jackson
Abstract:
Machine learning models are used today in many real-life applications. The safety and security of such models are important, so that the results of the model are as accurate as possible. One challenge in machine learning model security is the adversarial examples attack. Adversarial examples are designed by an attacker to cause a machine learning model to misclassify its input. We propose a method to generate adversarial examples that attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back in the original image, and then run it through a classifier. The algorithm modifies the detected region using two methods. First, it adds an abstract image matrix behind the detected image matrix. Then, it performs a rotation attack to rotate the detected region around its axes and embeds the trace of the image in the image background. Finally, the attacked region is placed in its original position, from where it was removed, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier, and the algorithm is efficient: the classifier's confidence dropped to almost zero. We also tried it on a CNN (convolutional neural network) with higher settings, and the algorithm also worked successfully.
Keywords: adversarial examples, attack, computer vision, image processing
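The rotate-and-paste part of the region modification can be sketched as follows. The sketch is an assumption-heavy illustration: the region here is a fixed box rather than an automatically detected one, and the abstract-matrix and trace-embedding steps of the paper are omitted.

```python
# Hedged sketch: cut out a region, rotate it in place, paste back, and smooth.
import numpy as np
from scipy.ndimage import rotate, gaussian_filter

def region_rotation_attack(image, box, angle=25):
    """image: 2-D grayscale array; box: (row, col, height, width)."""
    r, c, h, w = box
    attacked = image.astype(float).copy()
    region = attacked[r:r + h, c:c + w]
    # Rotate the selected region around its own centre, keeping its shape.
    rotated = rotate(region, angle, reshape=False, mode="nearest")
    attacked[r:r + h, c:c + w] = rotated
    # Smoothing filter to blend foreground and background.
    return gaussian_filter(attacked, sigma=1.0)

img = np.random.default_rng(0).random((64, 64))
adv = region_rotation_attack(img, box=(20, 20, 24, 24))
print("mean absolute perturbation:", float(np.abs(adv - img).mean()))
```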
Procedia PDF Downloads 339
851 Comparative Study of sLASER and PRESS Techniques in Magnetic Resonance Spectroscopy of Normal Brain
Authors: Shin Ku Kim, Yun Ah Oh, Eun Hee Seo, Chang Min Dae, Yun Jung Bae
Abstract:
Objectives: The commonly used PRESS technique in magnetic resonance spectroscopy (MRS) has a limitation of incomplete water suppression. The recently developed sLASER technique is known for its improved effectiveness in suppressing the water signal. However, no prior study has compared both sequences in the normal human brain. In this study, we aimed to compare the performances of both techniques in brain MRS. Materials and methods: From January 2023 to July 2023, thirty healthy participants (mean age 38 years, 17 male, 13 female) without underlying neurological diseases were enrolled in this study. All participants underwent single-voxel MRS using both PRESS and sLASER techniques on 3T MRI. Two regions-of-interest were allocated in the left medial thalamus and left parietal white matter (WM) by a single reader. The SpectroView Analysis (SW5, Philips, Netherlands) provided automatic measurements, including signal-to-noise ratio (SNR) and peak height of water, N-acetylaspartate (NAA)-water/Choline (Cho)-water/Creatine (Cr)-water ratios, and NAA-Cr/Cho-Cr ratios. The measurements from the PRESS and sLASER techniques were compared using paired T-tests and Bland-Altman methods, and the variability was assessed using coefficients of variation (CV). Results: The SNR and peak heights of water were significantly lower with sLASER compared to PRESS (left medial thalamus: sLASER SNR/peak height 2092±475/328±85 vs. PRESS 2811±549/440±105; left parietal WM: 5422±1016/872±196 vs. 7152±1305/1150±278; all P<0.001, respectively). Accordingly, the NAA-water/Cho-water/Cr-water ratios and NAA-Cr/Cho-Cr ratios were significantly higher with sLASER than with PRESS (all P<0.001, respectively). The variabilities of the NAA-water/Cho-water/Cr-water ratios and the Cho-Cr ratio in the left medial thalamus were lower with sLASER than with PRESS (CV, sLASER vs. PRESS: 19.9 vs. 58.1, 19.8 vs. 54.7, 20.5 vs. 43.9, and 11.5 vs. 16.2). Conclusion: The sLASER technique demonstrated enhanced background water suppression, resulting in increased signals and reduced variability in brain metabolite measurements of MRS. Therefore, sLASER could offer a more precise and stable method for identifying brain metabolites.
Keywords: magnetic resonance spectroscopy, brain, sLASER, PRESS
Procedia PDF Downloads 46
850 Queuing Analysis and Optimization of Public Vehicle Transport Stations: A Case of South West Ethiopia Region Vehicle Stations
Authors: Mequanint Birhan
Abstract:
Modern urban environments present a dynamically growing field where, notwithstanding shared goals, several mutually conflicting interests frequently collide. Waiting lines and queues are common occurrences, and they have a large impact on a city's socioeconomic standing. The result is extremely long lines for both vehicles and people on incongruous routes, service coagulation, customer murmuring, unhappiness, complaints, and a search for other options that is sometimes illegal. A root cause of this is corruption, which leads to traffic jams, to vehicles stopping and being packed beyond their safe carrying capacity, and to violations of the human rights and freedoms of passengers. This study focused on optimizing the time passengers have to wait in public vehicle stations. This applied research employed multiple data-gathering sources and mixed approaches; 166 key informants at the transport stations were sampled using the Slovin sampling formula. The length of time vehicles, including their drivers and auxiliary drivers ('Weyala'), had to wait was also studied. To maximize the service level at vehicle stations, a queuing model was subsequently devised ('Menaharya'). Time, cost, and quality encompass performance, scope, and suitability for the intended purposes. The minimal response times for passengers and vehicles queuing to reach their final destinations at the stations of the Tepi, Mizan, and Bonga towns were determined. A new bus station system was modeled and simulated with Arena simulation software for the chosen study area, yielding an 84% overall improvement: costs were reduced by 56.25%, waiting time fell from 4 hours to 1.5 hours, and quality, safety and design-load performance calculations were employed. Stakeholders are asked to put the model into practice and monitor the results obtained.
Keywords: Arena 14 automatic rockwell, queue, transport services, vehicle stations
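For intuition, the waiting-time quantities involved can be illustrated with a closed-form M/M/1 queue at a single boarding point. The arrival and service rates below are made up for illustration; the study itself relied on an Arena simulation model rather than this analytical approximation.

```python
# Illustrative M/M/1 queueing calculation for a single boarding point.
def mm1_metrics(arrival_rate, service_rate):
    """Rates in passengers per hour; requires arrival_rate < service_rate."""
    rho = arrival_rate / service_rate          # utilization
    lq = rho**2 / (1 - rho)                    # mean queue length
    wq = lq / arrival_rate                     # mean waiting time in queue (hours)
    w = wq + 1 / service_rate                  # mean time in system (hours)
    return {"utilization": rho, "avg queue length": lq,
            "avg wait (h)": wq, "avg time in system (h)": w}

print(mm1_metrics(arrival_rate=40, service_rate=50))
```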
Procedia PDF Downloads 78
849 Identifying Artifacts in SEM-EDS of Fouled RO Membranes Used for the Treatment of Brackish Groundwater Through Raman and ICP-MS Analysis
Authors: Abhishek Soti, Aditya Sharma, Akhilendra Bhushan Gupta
Abstract:
Fouled reverse osmosis membranes are primarily characterized by Scanning Electron Microscopy (SEM) and Energy Dispersive X-ray Spectrometry (EDS) for a detailed investigation of foulants; however, this has severe limitations on several accounts. Apart from inaccuracy in spectral properties and inevitable interferences and interactions between sample and instrument, misidentification of elements due to overlapping peaks is a significant drawback of EDS. This paper discusses this limitation by analyzing fouled polyamide RO membranes derived from community RO plants of Rajasthan treating brackish water via a combination of results obtained from EDS and Raman spectroscopy and cross-corroborating with ICP-MS analysis of water samples prepared by dissolving the deposited salts. The anomalous behavior of different morphic forms of CaCO₃ in aqueous suspensions tends to introduce false reporting of the presence of certain heavy metals and rare earth metals in the scales of the fouled RO membranes used for treating brackish groundwater when analyzed using the commonly adopted techniques like SEM-EDS or Raman spectrometry. Peaks of CaCO₃ reflected in the EDS spectra of the membrane were found to be misinterpreted as scandium due to the automatic assignment of elements by the software. Similarly, the morphic forms merged with the dominant peak of CaCO₃ might be reflected as a single peak of molybdenum in the Raman spectrum. A subsequent ICP-MS analysis of the deposited salts showed that both Sc and Mo were below detectable levels. It is always essential to cross-confirm the results through a destructive analysis method to avoid such interferences. It is further recommended to study different morphic forms of CaCO₃ scales, as they exhibit anomalous properties like reverse solubility with temperature and hence altered precipitation tendencies, for an accurate description of the composition of scales, which is vital for the smooth functioning of RO systems.
Keywords: reverse osmosis, foulant analysis, groundwater, EDS, artifacts
Procedia PDF Downloads 105
848 Design Criteria Recommendation to Achieve Accessibility In-House to Different Users
Authors: Claudia Valderrama-Ulloa, Cristian Schmitt, Juan Pablo Marchetti, Viviana Bucarey
Abstract:
Access to adequate housing is a fundamental human right and a crucial factor for health. Housing should be inclusive, accessible, and able to meet the needs of all its inhabitants at every stage of their lives without hindering their health, autonomy, or independence. This article addresses the importance of designing housing for people with disabilities, which varies depending on individual abilities, preferences, and cultural considerations. Based on the components of the International Classification of Functioning, Disability and Health, wheelchair users, little people (achondroplasia), children with autism spectrum disorder and Down syndrome were characterized, and six domains of activities related to daily life inside homes were defined. The article describes the main barriers homes present for this group of people. It proposes a list of architectural and design aspects to reduce barriers to housing use. The aspects are divided into three main groups: space management, building services, and supporting facilities. The article emphasizes the importance of consulting professionals and users with experience designing for diverse needs to create inclusive, safe, and supportive housing for people with disabilities.
Keywords: achondroplasia, autism spectrum disorder, disability, down syndrome, wheelchair user
Procedia PDF Downloads 107
847 Towards Human-Interpretable, Automated Learning of Feedback Control for the Mixing Layer
Authors: Hao Li, Guy Y. Cornejo Maceda, Yiqing Li, Jianguo Tan, Marek Morzynski, Bernd R. Noack
Abstract:
We propose an automated analysis of the flow control behaviour from an ensemble of control laws and associated time-resolved flow snapshots. The input may be the rich database of machine learning control (MLC) optimizing a feedback law for a cost function in the plant. The proposed methodology provides (1) insights into the control landscape, which maps control laws to performance, including extrema and ridge-lines, (2) a catalogue of representative flow states and their contribution to the cost function for investigated control laws and (3) visualization of the dynamics. Key enablers are classification and feature extraction methods of machine learning. The analysis is successfully applied to the stabilization of a mixing layer with sensor-based feedback driving an upstream actuator. The fluctuation energy is reduced by 26%. The control replaces unforced Kelvin-Helmholtz vortices with subsequent vortex pairing by higher-frequency Kelvin-Helmholtz structures of lower energy. These efforts target a human-interpretable, fully automated analysis of MLC identifying qualitatively different actuation regimes, distilling corresponding coherent structures, and developing a digital twin of the plant.
Keywords: machine learning control, mixing layer, feedback control, model-free control
Procedia PDF Downloads 223
846 The Effect of Penalizing Wrong Answers in the Computerized Modified Multiple Choice Testing System
Authors: Min Hae Song, Jooyong Park
Abstract:
Even though assessment using information and communication technology will most likely lead the future of educational assessment, there is little research on this topic. Computerized assessment will not only cut costs but also measure students' performance in ways not possible before. In this context, this study introduces a tool that can overcome the problems of multiple-choice tests. Multiple-choice (MC) tests are efficient for automatic grading; however, their structure allows students to find the correct answer from the options even when they do not know the answer. A computerized modified multiple-choice testing system (CMMT) was developed using the interactivity of computers; it presents the question first and the options later, for a short time, when the student requests them. This study was conducted to find out whether penalizing wrong answers in CMMT could lower random guessing. We checked whether students knew the answers by having them respond to short-answer tests before choosing from the given options in the CMMT or MC format. Ninety-four students were tested with the instruction that they would be penalized for wrong answers but not for no response. There were 4 experimental conditions: two conditions of high or low penalty percentage, each in the traditional multiple-choice or CMMT format. In the low-penalty condition, the penalty rate was the probability of getting the correct answer by random guessing. In the high-penalty condition, students were penalized at twice the percentage of the low-penalty condition. The results showed that the number of no responses was significantly higher for the CMMT format and the number of random guesses was significantly lower for the CMMT format. There were no significant differences between the two penalty conditions. This result may be due to the fact that the actual score difference between the two conditions was too small. In the discussion, the possibility of applying CMMT-format tests while penalizing wrong answers in actual testing settings is addressed.
Keywords: computerized modified multiple choice test format, multiple-choice test format, penalizing, test format
Procedia PDF Downloads 167