Search results for: automatic verification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1393

523 Autism Spectrum Disorder Classification Algorithm Using Multimodal Data Based on Graph Convolutional Network

Authors: Yuntao Liu, Lei Wang, Haoran Xia

Abstract:

Machine learning has been widely applied to developing classification models for autism spectrum disorder (ASD) from neuroimaging data. This paper proposes a multimodal fusion classification network based on a graph neural network. First, the brain is segmented into 116 regions of interest using the Anatomical Automatic Labeling (AAL) medical segmentation template. Image features from sMRI and signal features from fMRI are extracted to build the node and edge embedding representations of the brain map. We then construct a dynamically updated brain map neural network and propose a method, based on a dynamic adjacency-matrix update mechanism and a learnable graph, to further improve the accuracy of autism diagnosis and recognition. On the Autism Brain Imaging Data Exchange I dataset (ABIDE I), we reached a prediction accuracy of 74% in distinguishing ASD from typically developing (TD) subjects. In addition, to identify biomarkers that can help clinicians analyze the disease and to support interpretability, we examined the features with the five largest and five smallest ROI weights. This work provides a meaningful approach to brain disorder identification.
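A minimal sketch of one step the abstract describes, building a brain-map adjacency from ROI signals: edges connect regions whose fMRI time series are strongly correlated. The threshold value and plain Pearson correlation are illustrative assumptions, not the paper's actual (learnable) update mechanism.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def build_adjacency(roi_signals, threshold=0.5):
    """Edge (i, j) exists when |corr| of the two ROI signals exceeds the threshold."""
    n = len(roi_signals)
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if abs(pearson(roi_signals[i], roi_signals[j])) > threshold:
                adj[i][j] = adj[j][i] = 1
    return adj
```

In the paper's setting there would be 116 ROI rows (one per AAL region) and the adjacency would be updated dynamically during training rather than fixed once.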

Keywords: autism spectrum disorder, brain map, supervised machine learning, graph network, multimodal data, model interpretability

Procedia PDF Downloads 49
522 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)

Authors: Gule Teri

Abstract:

The effective validation of analytical methods is a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can fail to fully account for inherent variation within datasets, which may result in inconsistent outcomes. This shortfall in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, putting at risk the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach explicitly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. It provides a systematic, statistically grounded validation technique and offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this method, pharmaceutical manufacturers can substantially advance their validation processes and thereby improve the overall quality and safety of their products. This paper details the development, application, and advantages of the tolerance-based DoE approach, demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification, and discusses the potential implications and future applications of the method in enhancing pharmaceutical manufacturing practices and outcomes.
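A toy illustration of the core idea of judging variability against design margins: compute a two-sided tolerance interval (mean ± k·s) and accept the method only if the interval sits inside the specification limits. The k factor is treated as a given here; in practice it comes from standard tolerance-factor tables for the chosen coverage and confidence, and this is not the paper's exact procedure.

```python
import statistics

def tolerance_interval(data, k):
    """Two-sided tolerance interval mean ± k*s; k is taken from
    standard tolerance-factor tables (assumed given here)."""
    m = statistics.mean(data)
    s = statistics.stdev(data)
    return m - k * s, m + k * s

def within_margins(data, k, lower_spec, upper_spec):
    """Method passes when the tolerance interval lies inside the spec limits."""
    lo, hi = tolerance_interval(data, k)
    return lower_spec <= lo and hi <= upper_spec
```

For example, replicate HPLC measurements tightly clustered around a nominal value pass a wide specification but fail a narrow one, which is exactly the distinction a purely mean-based check would miss.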

Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing

Procedia PDF Downloads 68
521 A Deep-Learning Based Prediction of Pancreatic Adenocarcinoma with Electronic Health Records from the State of Maine

Authors: Xiaodong Li, Peng Gao, Chao-Jung Huang, Shiying Hao, Xuefeng B. Ling, Yongxia Han, Yaqi Zhang, Le Zheng, Chengyin Ye, Modi Liu, Minjie Xia, Changlin Fu, Bo Jin, Karl G. Sylvester, Eric Widen

Abstract:

Predicting the risk of pancreatic adenocarcinoma (PA) in advance can improve quality of care and potentially reduce population mortality and morbidity. The aim of this study was to develop and prospectively validate a risk prediction model to identify patients at risk of new-incident PA as early as 3 months before onset in a statewide general population in Maine. The PA prediction model was developed using deep neural networks, a deep learning algorithm, with a 2-year electronic health record (EHR) cohort. Prospective results showed that our model identified 54.35% of all inpatient episodes of PA, and 91.20% of all PA cases that required subsequent chemoradiotherapy, with a lead time of up to 3 months and a true-alert rate of 67.62%. The risk assessment tool attained an improved discriminative ability. It can be immediately deployed in the health system to provide automatic early warnings to adults at risk of PA, and it has the potential to identify personalized risk factors to facilitate customized PA interventions.

Keywords: cancer prediction, deep learning, electronic health records, pancreatic adenocarcinoma

Procedia PDF Downloads 145
520 Estimating PM2.5 Concentrations Based on Landsat 8 Imagery and Historical Field Data over the Metropolitan Area of Mexico City

Authors: Rodrigo T. Sepulveda-Hirose, Ana B. Carrera-Aguilar, Francisco Andree Ramirez-Casas, Alondra Orozco-Gomez, Miguel Angel Sanchez-Caro, Carlos Herrera-Ventosa

Abstract:

High concentrations of particulate matter in the atmosphere pose a threat to human health, especially over densely populated areas; however, field air-pollution monitoring is expensive and time-consuming. Remote sensing can reduce costs and provide coverage of the whole urban area. In this study, PM2.5 concentrations over Mexico City's metropolitan area are estimated using atmospheric reflectance from Landsat 8 satellite imagery and historical PM2.5 measurements from the Automatic Environmental Monitoring Network of Mexico City (RAMA). Through processing of the available satellite images, a preliminary model was generated to identify the optimal bands for the final model for Mexico City. Work on the final model continues from the results of the preliminary one. Infrared bands have helped to model PM2.5 in other cities, but their effectiveness under the geographic and climatic conditions of Mexico City is still being evaluated.
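The band-evaluation step can be pictured as regressing station PM2.5 readings on the reflectance of a candidate band. A minimal ordinary-least-squares sketch under assumed, made-up numbers (the paper's actual model and coefficients are not given in the abstract):

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (one reflectance band)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical band reflectances and co-located RAMA PM2.5 readings (ug/m3)
reflectance = [0.10, 0.15, 0.20, 0.25, 0.30]
pm25 = [18.0, 25.0, 33.0, 40.0, 47.0]
a, b = fit_line(reflectance, pm25)
estimate = a + b * 0.22  # predicted PM2.5 at a new reflectance value
```

Comparing the fit quality of such regressions across bands is one simple way to rank "optimal bands" before building the final multi-band model.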

Keywords: air pollution modeling, Landsat 8, PM2.5, remote sensing

Procedia PDF Downloads 180
519 Right Solution of Geodesic Equation in Schwarzschild Metric and Overall Examination of Physical Laws

Authors: Kwan U. Kim, Jin Sim, Ryong Jin Jang, Sung Duk Kim

Abstract:

108 years have passed since physicists began explaining astronomical and physical phenomena by solving the geodesic equations in the Schwarzschild metric. However, when solving these equations, they did not correctly solve one branch of the spatial component, among the spatial and temporal components of the four-dimensional force, and they did not derive the physical laws correctly through physical analysis of the results. In addition, they did not treat astronomical and physical phenomena on the basis of the correct physical laws obtained from the solution of the geodesic equations in the Schwarzschild metric. Some earlier scholars noted that the theoretical basis of Einstein's general theory of relativity was obscure and incorrect, but they did not give a correct physical solution to the problems. Furthermore, since the general theory of relativity has not given a quantitative solution to these obscure and incorrect problems, the generalization of gravitational theory has not yet been successfully completed, although earlier scholars considered and attempted it. Solving these problems requires exploring the obscure and incorrect points of general relativity on the basis of physical laws and finding a methodology for resolving them. Therefore, as a first step toward this purpose, the correct solution of the geodesic equation in the Schwarzschild metric is presented. Next, the physical laws found by analyzing the results are presented, the obscure and incorrect problems are identified and analyzed on the basis of those laws, and an experimental verification of the physical laws we found is given.
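For reference, the objects the abstract discusses are, in standard textbook form (this is the conventional statement, not the authors' proposed re-derivation): the geodesic equation and the Schwarzschild line element,

```latex
\frac{d^{2}x^{\mu}}{d\tau^{2}}
  + \Gamma^{\mu}_{\alpha\beta}\,
    \frac{dx^{\alpha}}{d\tau}\frac{dx^{\beta}}{d\tau} = 0,
\qquad
ds^{2} = \left(1-\frac{r_{s}}{r}\right)c^{2}\,dt^{2}
  - \left(1-\frac{r_{s}}{r}\right)^{-1}dr^{2}
  - r^{2}\!\left(d\theta^{2}+\sin^{2}\theta\,d\varphi^{2}\right),
\quad r_{s} = \frac{2GM}{c^{2}}.
```

The paper's claim concerns how one spatial branch of these coupled equations has conventionally been solved.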

Keywords: equivalence principle, general relativity, geometrodynamics, Schwarzschild, Poincaré

Procedia PDF Downloads 62
518 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network

Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang

Abstract:

As a branch of artificial neural networks, deep learning is widely used in the field of image recognition, but a shortage of training data leads to imperfect model learning. By analysing the data-scale requirements of deep learning with a view to its application in GUI generation, it is found that collecting a GUI dataset is a time-consuming and labor-intensive project, which makes it difficult to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on an original small-scale dataset to produce a large number of reliable samples. By combining a recurrent neural network with a generative adversarial network, the recurrent network learns the sequence relationships and characteristics of the data so that the adversarial network generates plausible samples, thereby expanding the Rico dataset. Relying on this network structure, the characteristics of the collected data can be analysed thoroughly, and a large amount of reasonable data can be generated according to those characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning.

Keywords: GUI, deep learning, GAN, data augmentation

Procedia PDF Downloads 172
517 A Speeded up Robust Scale-Invariant Feature Transform Currency Recognition Algorithm

Authors: Daliyah S. Aljutaili, Redna A. Almutlaq, Suha A. Alharbi, Dina M. Ibrahim

Abstract:

Currencies around the world look very different from each other; for instance, the size, color, and pattern of the paper differ. With the development of modern banking services, automatic methods for paper currency recognition have become important in many applications, such as vending machines. One phase of a currency recognition architecture is feature detection and description. Many algorithms are used for this phase, but they still have some disadvantages. This paper proposes a feature detection algorithm that merges the advantages of the current SIFT and SURF algorithms, which we call the Speeded up Robust Scale-Invariant Feature Transform (SR-SIFT) algorithm. The proposed SR-SIFT algorithm overcomes the problems of both SIFT and SURF, aiming to speed up SIFT feature detection while keeping it robust. Simulation results demonstrate that the proposed SR-SIFT algorithm decreases the average response time, especially for small and minimal numbers of best key points, and improves the distribution of best key points over the surface of the currency. Furthermore, it increases the accuracy of the true best-point distribution inside the currency edge compared with the other two algorithms.

Keywords: currency recognition, feature detection and description, SIFT algorithm, SURF algorithm, speeded up and robust features

Procedia PDF Downloads 226
516 Compared Psychophysiological Responses under Stress in Patients of Chronic Fatigue Syndrome and Depressive Disorder

Authors: Fu-Chien Hung, Chi‐Wen Liang

Abstract:

Background: People who suffer from chronic fatigue syndrome (CFS) frequently complain of continuous tiredness, weakness, or lack of strength without an apparent organic etiology. The prevalence of CFS ranges from roughly 3% to 20%, yet more than 80% of cases go undiagnosed or are misdiagnosed as depression. The biopsychosocial model has suggested associations among CFS, depressive syndrome, and stress. This study aimed to investigate the differences between individuals with CFS and those with depressive syndrome in psychophysiological responses under stress. Method: There were 23 participants in the CFS group, 14 in the depression group, and 23 in the healthy control group. All participants first completed measures of demographic data, CFS-related symptoms, daily life functioning, and depressive symptoms, and were then asked to perform a stressful cognitive task. Psychophysiological responses, including heart rate (HR), blood volume pulse (BVP), and skin conductance (SC), were measured during the task; these indexes were used to assess the reactivity and recovery rates of the autonomic nervous system. Results: The stress reactivity of the CFS and depression groups did not differ from that of the healthy control group. However, the stress recovery rate of the CFS group was worse than that of the healthy controls. Conclusion: The results suggest that CFS is a syndrome that can be independent of depressive syndrome, although depressive syndrome may include fatigue symptoms.

Keywords: chronic fatigue syndrome, depression, stress response, misdiagnosis

Procedia PDF Downloads 449
515 A Novel Breast Cancer Detection Algorithm Using Point Region Growing Segmentation and Pseudo-Zernike Moments

Authors: Aileen F. Wang

Abstract:

Mammography has been one of the most reliable methods for early detection and diagnosis of breast cancer. However, mammography misses about 17%, and up to 30%, of breast cancers owing to the subtle and unstable appearance of breast cancer in its early stages. Recent computer-aided diagnosis (CADx) technology using Zernike moments has improved detection accuracy, but it has several drawbacks: it uses manual segmentation, Zernike moments are not robust, and it still has a relatively high false negative rate (FNR) of 17.6%. This project focuses on the development of a novel breast cancer detection algorithm that automatically segments the breast mass and further reduces the FNR. The algorithm consists of automatic segmentation of a single breast mass using point region growing segmentation, reconstruction of the segmented mass using pseudo-Zernike moments, and classification of the mass using the root mean square (RMS). A comparative study among the various algorithms on the segmentation and reconstruction of breast masses was performed on randomly selected mammographic images. The results demonstrated that the newly developed algorithm is the best in terms of accuracy and cost-effectiveness. More importantly, the new RMS classifier has the lowest FNR, at 6%.
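The segmentation step named in the abstract, point region growing, can be sketched in a few lines: starting from a seed pixel, the region absorbs 4-connected neighbours whose intensity is close to the seed's. This is a generic textbook version on a toy integer grid, not the paper's tuned implementation.

```python
def region_grow(image, seed, tol):
    """Grow a region from a seed pixel, adding 4-connected neighbours
    whose intensity differs from the seed value by at most tol."""
    rows, cols = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    region, stack = set(), [seed]
    while stack:
        r, c = stack.pop()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if abs(image[r][c] - seed_val) <= tol:
            region.add((r, c))
            stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region
```

On a mammogram the seed would be a point inside the suspected mass, and the grown region is what the pseudo-Zernike reconstruction and RMS classification then operate on.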

Keywords: computer aided diagnosis, mammography, point region growing segmentation, pseudo-zernike moments, root mean square

Procedia PDF Downloads 443
514 Automatic Tagging and Accuracy in Assamese Text Data

Authors: Chayanika Hazarika Bordoloi

Abstract:

This paper is an attempt to work on Assamese, a highly inflectional language. Assamese is one of the national languages of India, and very little has been achieved on it in terms of computational research. Building a language processing tool for a natural language is not straightforward, as the standard and language representation change at various levels. This paper presents the inflectional suffixes of Assamese verbs and shows how statistical tools, along with linguistic features, can improve tagging accuracy. Conditional random fields (the CRF tool) were used to automatically tag and train the text data; accuracy improved after linguistic features were fed into the training data. Because Assamese is highly inflectional, standardizing its morphology is challenging. Inflectional suffixes are used as a feature of the text data. To analyze the inflections of Assamese word forms, a list comprising all possible suffixes that the various categories can take was prepared. Assamese words can be classified into inflected classes (noun, pronoun, adjective, and verb) and uninflected classes (adverb and particle). The corpus used for this morphological analysis is a mixed corpus with a large number of tokens, and it has given satisfactory accuracy. The accuracy rate of the tagger gradually improved with the modified training data.
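A small sketch of how suffix information can be turned into per-token features for a CRF tagger: look up the longest known suffix the word ends with and emit it alongside the surface form. The suffix list and example words here are placeholders, not the paper's actual Assamese suffix inventory.

```python
def suffix_features(word, suffix_list):
    """Feature dict for one token: the surface form plus any known
    inflectional suffix it ends with (longest match first)."""
    feats = {"word": word}
    for suf in sorted(suffix_list, key=len, reverse=True):
        if word.endswith(suf) and len(word) > len(suf):
            feats["suffix"] = suf
            feats["stem"] = word[: -len(suf)]
            break
    return feats
```

In a CRF pipeline, dicts like these (one per token, typically with neighbouring-token features as well) form the training instances that carry the linguistic knowledge the abstract credits for the accuracy gain.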

Keywords: CRF, morphology, tagging, tagset

Procedia PDF Downloads 185
513 Multipurpose Agricultural Robot Platform: Conceptual Design of Control System Software for Autonomous Driving and Agricultural Operations Using Programmable Logic Controller

Authors: P. Abhishesh, B. S. Ryuh, Y. S. Oh, H. J. Moon, R. Akanksha

Abstract:

This paper discusses the conceptual design and development of control system software using a programmable logic controller (PLC) for the autonomous driving and agricultural operations of a Multipurpose Agricultural Robot Platform (MARP). Based on initial conditions given by field analysis and the desired agricultural operations, the structural design of MARP is carried out using a modelling and analysis tool. A PLC, being robust and easy to use, is used to design the autonomous control system of the robot platform for the desired parameters. The robot is capable of autonomous driving and three automatic agricultural operations, namely hilling, mulching, and sowing of seeds, in that order. Input received from various sensors in the field is transmitted to the controller via a ZigBee network to adjust the control program and obtain the desired field output. The research aims to assist farmers by reducing the labor hours required for agricultural activities through automation. This study provides an alternative, at an effective cost, to existing systems of machinery towed behind tractors and rigorous manual operations in the field.
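The fixed operation order (hilling, then mulching, then sowing) is naturally a small sequential state machine, of the kind PLC ladder logic expresses with step flags. A hedged Python analogue, purely illustrative of the control flow rather than the actual PLC program:

```python
# Sequential operation controller: hilling -> mulching -> sowing
OPERATIONS = ["hilling", "mulching", "sowing"]

class OperationController:
    """Steps through the three field operations in the fixed order the
    paper describes; in the real system a sensor event arriving over
    ZigBee would be what triggers advance()."""
    def __init__(self):
        self.index = 0

    @property
    def current(self):
        return OPERATIONS[self.index] if self.index < len(OPERATIONS) else "done"

    def advance(self):
        if self.index < len(OPERATIONS):
            self.index += 1
        return self.current
```

Each `advance()` corresponds to the completion condition of one operation being satisfied, mirroring the step-transition structure of a PLC sequence.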

Keywords: agricultural operations, autonomous driving, MARP, PLC

Procedia PDF Downloads 353
512 Influence of Environmental Temperature on Dairy Herd Performance and Behaviour

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, S. Harapanahalli, J. Walsh

Abstract:

The objective of this study was to determine the effects of environmental stressors on the performance of lactating dairy cows and to discuss some future trends. A relationship exists between meteorological data and milk yield prediction accuracy in pasture-based dairy systems. New precision technologies are available, and more are being developed, to improve the sustainability of the dairy industry. Some of these technologies focus on the welfare of individual animals on dairy farms. They allow the automatic identification of animal behaviour and health events, greatly increasing overall herd health and yield while reducing the demand for animal health inspections and long-term animal healthcare costs. The dataset consisted of records from 489 dairy cows at two dairy farms, together with temperatures measured at the nearest meteorological weather station in 2018. The effects of temperature on milk production and animal behaviour were analyzed. The statistical results indicate different effects of temperature on milk yield and behaviour. The "comfort zone" for the animals is in the range of 10 °C to 20 °C; dairy cows outside this zone had to decrease or increase their metabolic heat production, which affected their milk production and behaviour.

Keywords: behavior, milk yield, temperature, precision technologies

Procedia PDF Downloads 102
511 Functionality Based Composition of Web Services to Attain Maximum Quality of Service

Authors: M. Mohemmed Sha Mohamed Kunju, Abdalla A. Al-Ameen Abdurahman, T. Manesh Thankappan, A. Mohamed Mustaq Ahmed Hameed

Abstract:

Web service composition is an effective approach to completing web-based tasks with the desired quality. A single web service with limited functionality is inadequate to execute a specific task requiring a series of actions, so multiple web services with different functionalities must be combined to reach the target. Composition becomes even more challenging when these services come from different providers with identical functionalities but varying QoS; the overall QoS is therefore the major factor when composing web services. Moreover, the expected QoS is not always attained when the task is completed: a single web service in the composed chain may affect the overall performance of the task, so care should be taken with aspects such as the functionality of each service during composition. Dynamic and automatic service composition is one of the main options available, but to achieve the intended functionality of the task, the quality of the individual web services is also important. Normally, the QoS of an individual service can be evaluated using non-functional parameters such as response time, throughput, reliability, and availability. At the same time, the QoS need not be at the same level for all composed services. This paper therefore proposes a framework for composing services in terms of QoS by setting appropriate weights on the non-functional parameters of each individual web service involved in the task. Experimental results show that the importance given to the non-functional parameters during composition definitely improves the performance of the web services.
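The weighting idea can be sketched as a scalar score per candidate service: a weighted sum of its normalized non-functional parameters, with lower-is-better metrics (response time) inverted so that higher always means better. The parameter names, the [0, 1] normalization, and the selection rule are assumptions for illustration, not the paper's exact framework.

```python
def qos_score(service, weights):
    """Weighted sum of normalized non-functional parameters (all in [0, 1]).
    Response time is lower-is-better, so it is inverted."""
    return (weights["reliability"] * service["reliability"]
            + weights["availability"] * service["availability"]
            + weights["throughput"] * service["throughput"]
            + weights["response_time"] * (1.0 - service["response_time"]))

def best_composition(candidates, weights):
    """For each step of the task, pick the candidate with the highest score."""
    return [max(step, key=lambda s: qos_score(s, weights)) for step in candidates]
```

Setting different weights per step captures the observation that the QoS need not be at the same level for every service in the chain.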

Keywords: composition, non-functional parameters, quality of service, web service

Procedia PDF Downloads 319
510 Autonomous Landing of UAV on Moving Platform: A Mathematical Approach

Authors: Mortez Alijani, Anas Osman

Abstract:

Recently, the popularity of unmanned aerial vehicles (UAVs) has skyrocketed amidst unprecedented events and the global pandemic, as they play a key role in both the security and health sectors through surveillance, taking test samples, transporting crucial goods, and spreading awareness among civilians. However, the process of designing and producing such aerial robots is constrained by internal and external factors that pose serious challenges. Landing is one of the key operations during flight, and the autonomous landing of a UAV on a moving platform in particular is a scientifically complex engineering problem. A successful automatic landing on a moving platform typically requires accurate localization of the landing site, fast trajectory planning, and robust control planning. To achieve these goals, information about the autonomous landing process, such as the intersection point, the positions of the platform and the UAV, and the inclination angle, is necessary. This study presents a mathematical approach to this problem in the X-Y plane based on the inclination angle and the position of the UAV during the landing process. The experimental results capture the position of the UAV, the intersection between the UAV and the moving platform, and the inclination angle during landing, allowing prediction of the intersection point.

Keywords: autonomous landing, inclination angle, unmanned aerial vehicles, moving platform, X-Y axis, intersection point

Procedia PDF Downloads 158
509 SAMRA: Dataset in Al-Soudani Arabic Maghrebi Script for Recognition of Arabic Ancient Words Handwritten

Authors: Sidi Ahmed Maouloud, Cheikh Ba

Abstract:

Much of West Africa's cultural heritage is written in the Al-Soudani Arabic script, which was widely used in West Africa before the time of European colonization. The Al-Soudani script is an African version of the Maghrebi script, in particular the Al-Mebssout script; local African qualities were incorporated into it in a way that gave it a unique African diversity and character. Despite the existence of several Arabic datasets in Oriental scripts, allowing the analysis, layout, and recognition of texts written in those calligraphies, many Arabic scripts and written traditions remain understudied. In this paper, we present a dataset of words in the Al-Soudani calligraphic script. The dataset consists of 100 images selected from three different manuscripts written in the Al-Soudani Arabic script by different copyists; the primary sources for this database were the libraries of Boston University and Cambridge University. The dataset highlights the unique characteristics of the Al-Soudani Arabic script as well as the new challenges it presents for the automatic word recognition of Arabic manuscripts. An HTR system based on a hybrid ANN (CRNN-CTC) is also proposed to test the dataset. SAMRA is thus a dataset of annotated Arabic manuscript words in the Al-Soudani script that can help researchers automatically recognize and analyze manuscript words written in this script.

Keywords: dataset, CRNN-CTC, handwritten words recognition, Al-Soudani Arabic script, HTR, manuscripts

Procedia PDF Downloads 107
508 Application of Grey Theory in the Forecast of Facility Maintenance Hours for Office Building Tenants and Public Areas

Authors: Yen Chia-Ju, Cheng Ding-Ruei

Abstract:

This study took a case office building as its subject and explored responsive work-order repair requests for facilities and equipment in offices and public areas using grey theory, with the purpose of providing future office building owners, executive managers, property management companies, and mechanical and electrical companies with a reference for selecting and assessing forecast models. The important conclusions of this study are summarized as follows. 1. Grey relational analysis ranks the repair numbers of six categories in the order power systems, building systems, water systems, air conditioning systems, fire systems, and manpower dispatch; in terms of facility maintenance importance, the order is power systems, building systems, water systems, air conditioning systems, manpower dispatch, and fire systems. 2. GM(1,N) and a regression method took maintenance hours as the dependent variable and repair number, leased area, and tenant number as independent variables, and conducted single-month forecasts based on 12 data points from January to December 2011. From the verification results, the mean absolute error and average accuracy of GM(1,N) were 6.41% and 93.59%, and those of the regression model were 4.66% and 95.34%, indicating that both have highly accurate forecast capability.
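The grey forecasting family the paper draws on is easiest to see in its single-variable form, GM(1,1): accumulate the series, fit the grey differential equation by least squares, and forecast from the exponential time-response function. This is the standard textbook GM(1,1), shown with made-up numbers, not the paper's multivariate GM(1,N) fit to its maintenance data.

```python
import math

def gm11_forecast(series, steps):
    """GM(1,1) grey forecast: fit on `series`, return `steps` forecasts."""
    n = len(series)
    x1 = [sum(series[:i + 1]) for i in range(n)]           # accumulated (AGO) series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]   # background values
    y = series[1:]
    m = n - 1
    # Least squares for the grey equation x0(k) = -a*z(k) + b
    sz, sy = sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(v * w for v, w in zip(z, y))
    slope = (m * szy - sz * sy) / (m * szz - sz * sz)
    a, b = -slope, (sy - slope * sz) / m

    def x1_hat(k):  # time-response function of the whitened equation
        return (series[0] - b / a) * math.exp(-a * k) + b / a

    # Forecasts are first differences of the accumulated prediction
    return [x1_hat(n + i) - x1_hat(n + i - 1) for i in range(steps)]
```

GM(1,N) extends the same construction with N-1 driving series (here, repair number, leased area, and tenant number) on the right-hand side of the grey equation.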

Keywords: grey theory, forecast model, Taipei 101, office buildings, property management, facilities, equipment

Procedia PDF Downloads 433
507 Research the Causes of Defects and Injuries of Reinforced Concrete and Stone Construction

Authors: Akaki Qatamidze

Abstract:

Implementation of the project will be a step forward for construction reliability in Georgia and for the improvement and development of construction. Completion of the project is expected to yield a complete body of knowledge for assessing the technical condition of reinforced concrete and stone structures. The method is based on a detailed examination of the structure in order to establish the injuries present and the possibility of changing the structural scheme to meet new requirements and architectural preservation concerns. An important approach of this research project is systematic analysis, which optimizes the research process and develops new knowledge in neighboring areas. In addition, the problem of rational agreement between physical and mathematical models is addressed: the main pillar is the physical (in-situ) data, while mathematical calculation models and physical experiments are used only for the specification and verification of the calculation model. To enhance the effectiveness of research into the causes of defects and failures of reinforced concrete and stone constructions, to maximize automation, and to reduce the expenditure of resources, a methodological concept based on system analysis is recommended; as a major particularity of modern science and technology, it allows all families of structures to be treated with the same work stages and procedures, which makes it possible to exclude subjectivity and to address the problem in the optimal direction. The methodology of the project is discussed; it establishes a major step forward in the construction trades and offers practical assistance to engineers, supervisors, and technical experts in settling construction problems.

Keywords: building, reinforced concrete, expertise, stone structures

Procedia PDF Downloads 327
506 An Automatic Generating Unified Modelling Language Use Case Diagram and Test Cases Based on Classification Tree Method

Authors: Wassana Naiyapo, Atichat Sangtong

Abstract:

The processes of software development under object-oriented methodology have many stages that take time and incur high cost. An undetected error in the system analysis process will affect the design and implementation processes, and unexpected output forces revision of the previous process; each rollback adds expense and delay. With a good test process from the early phases, by contrast, the implemented software is efficient and reliable and also meets the user's requirements. The Unified Modelling Language (UML) is a tool that uses symbols to describe the work process in object-oriented analysis (OOA). This paper presents an approach for automatically generating UML use case diagrams and test cases. The UML use case diagram is generated from the event table, and test cases are generated from use case specifications and graphical user interfaces (GUIs). Test cases are derived using the classification tree method (CTM), which classifies data into nodes arranged in a hierarchical structure. The paper also describes the program that generates the use case diagram and test cases. As a result, the approach can reduce working time and increase work efficiency.
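The combinatorial core of the classification tree method can be sketched directly: each input is partitioned into equivalence classes (the leaves of the tree), and test cases are the combinations of one class per classification. The login-style classifications below are a hypothetical example, not taken from the paper.

```python
from itertools import product

def generate_test_cases(classifications):
    """Classification tree method core: each input is partitioned into
    classes; test cases are combinations of one class per classification."""
    names = list(classifications)
    return [dict(zip(names, combo))
            for combo in product(*(classifications[n] for n in names))]

# Hypothetical classifications for a login use case
cases = generate_test_cases({
    "username": ["valid", "unknown", "empty"],
    "password": ["correct", "wrong"],
})
```

Full pairwise or constrained selection would prune this Cartesian product, but the exhaustive form shows where the test cases come from.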

Keywords: classification tree method, test case, UML use case diagram, use case specification

Procedia PDF Downloads 155
505 Multimedia Firearms Training System

Authors: Aleksander Nawrat, Karol Jędrasiak, Artur Ryt, Dawid Sobel

Abstract:

The goal of this article is to present a novel multimedia firearms training system, developed to compensate for major problems of existing shooting training systems. The designed and implemented solution has five major advantages: an algorithm for automatic geometric calibration, an algorithm for photometric recalibration, firearm hit-point detection using a thermal imaging camera, an IR laser-spot tracking algorithm for after-action review analysis, and an implementation of ballistics equations. The combination of these advantages in a single multimedia firearms training system creates a comprehensive solution for detecting and tracking the target point, usable in shooting training systems and for improving the intervention tactics of uniformed services. The introduced geometric and photometric recalibration algorithms allow economically viable, commercially available projectors to be used in systems that require long and intensive use, without most of the negative impacts on color mapping seen in existing multi-projector multimedia shooting range systems. The article presents the results of the developed algorithms and their application in real training systems.

Keywords: firearms shot detection, geometric recalibration, photometric recalibration, IR tracking algorithm, thermography, ballistics

Procedia PDF Downloads 213
504 Parametric Study for Optimal Design of Hybrid Bridge Joint

Authors: Bongsik Park, Jae Hyun Park, Jae-Yeol Cho

Abstract:

Mixed structure, which is a kind of hybrid system, is incorporating steel beam and prestressed concrete beam. Hybrid bridge adopting mixed structure have some merits. Main span length can be made longer by using steel as main span material. In case of cable-stayed bridge having asymmetric span length, negative reaction at side span can be restrained without extra restraining devices by using weight difference between main span material and side span material. However angle of refraction might happen because of rigidity difference between materials and stress concentration also might happen because of abnormal loading transmission at joint in the hybrid bridge. Therefore the joint might be a weak point of the structural system and it needs to pay attention to design of the joint. However, design codes and standards about the joint in the hybrid-bridge have not been established so the joint designs in most of construction cases have been very conservative or followed previous design without extra verification. In this study parametric study using finite element analysis for optimal design of hybrid bridge joint is conducted. Before parametric study, finite element analysis was conducted based on previous experimental data and it is verified that analysis result approximated experimental data. Based on the finite element analysis results, parametric study was conducted. The parameters were selected as those have influences on joint behavior. Based on the parametric study results, optimal design of hybrid bridge joint has been determined.

Keywords: parametric study, optimal design, hybrid bridge, finite element analysis

Procedia PDF Downloads 416
503 Exploratory Analysis of A Review of Nonexistence Polarity in Native Speech

Authors: Deawan Rakin Ahamed Remal, Sinthia Chowdhury, Sharun Akter Khushbu, Sheak Rashed Haider Noori

Abstract:

Native Speech to text synthesis has its own leverage for the purpose of mankind. The extensive nature of art to speaking different accents is common but the purpose of communication between two different accent types of people is quite difficult. This problem will be motivated by the extraction of the wrong perception of language meaning. Thus, many existing automatic speech recognition has been placed to detect text. Overall study of this paper mentions a review of NSTTR (Native Speech Text to Text Recognition) synthesis compared with Text to Text recognition. Review has exposed many text to text recognition systems that are at a very early stage to comply with the system by native speech recognition. Many discussions started about the progression of chatbots, linguistic theory another is rule based approach. In the Recent years Deep learning is an overwhelming chapter for text to text learning to detect language nature. To the best of our knowledge, In the sub continent a huge number of people speak in Bangla language but they have different accents in different regions therefore study has been elaborate contradictory discussion achievement of existing works and findings of future needs in Bangla language acoustic accent.

Keywords: TTR, NSTTR, text to text recognition, deep learning, natural language processing

Procedia PDF Downloads 120
502 Abdominal Organ Segmentation in CT Images Based On Watershed Transform and Mosaic Image

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

Accurate Liver, spleen and kidneys segmentation in abdominal CT images is one of the most important steps for computer aided abdominal organs pathology diagnosis. In this paper, we have proposed a new semi-automatic algorithm for Liver, spleen and kidneys area extraction in abdominal CT images. Our proposed method is based on hierarchical segmentation and watershed algorithm. In our approach, a powerful technique has been designed to suppress over-segmentation based on mosaic image and on the computation of the watershed transform. The algorithm is currency in two parts. In the first, we seek to improve the quality of the gradient-mosaic image. In this step, we propose a method for improving the gradient-mosaic image by applying the anisotropic diffusion filter followed by the morphological filters. Thereafter we proceed to the hierarchical segmentation of the liver, spleen and kidney. To validate the segmentation technique proposed, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with the manual segmentation performed by an expert. The experimental results are described in the last part of this work.

Keywords: anisotropic diffusion filter, CT images, morphological filter, mosaic image, multi-abdominal organ segmentation, mosaic image, the watershed algorithm

Procedia PDF Downloads 484
501 Biosignal Recognition for Personal Identification

Authors: Hadri Hussain, M.Nasir Ibrahim, Chee-Ming Ting, Mariani Idroas, Fuad Numan, Alias Mohd Noor

Abstract:

A biometric security system has become an important application in client identification and verification system. A conventional biometric system is normally based on unimodal biometric that depends on either behavioural or physiological information for authentication purposes. The behavioural biometric depends on human body biometric signal (such as speech) and biosignal biometric (such as electrocardiogram (ECG) and phonocardiogram or heart sound (HS)). The speech signal is commonly used in a recognition system in biometric, while the ECG and the HS have been used to identify a person’s diseases uniquely related to its cluster. However, the conventional biometric system is liable to spoof attack that will affect the performance of the system. Therefore, a multimodal biometric security system is developed, which is based on biometric signal of ECG, HS, and speech. The biosignal data involved in the biometric system is initially segmented, with each segment Mel Frequency Cepstral Coefficients (MFCC) method is exploited for extracting the feature. The Hidden Markov Model (HMM) is used to model the client and to classify the unknown input with respect to the modal. The recognition system involved training and testing session that is known as client identification (CID). In this project, twenty clients are tested with the developed system. The best overall performance at 44 kHz was 93.92% for ECG and the worst overall performance was ECG at 88.47%. The results were compared to the best overall performance at 44 kHz for (20clients) to increment of clients, which was 90.00% for HS and the worst overall performance falls at ECG at 79.91%. It can be concluded that the difference multimodal biometric has a substantial effect on performance of the biometric system and with the increment of data, even with higher frequency sampling, the performance still decreased slightly as predicted.

Keywords: electrocardiogram, phonocardiogram, hidden markov model, mel frequency cepstral coeffiecients, client identification

Procedia PDF Downloads 272
500 Aerodynamic Analysis by Computational Fluids Dynamics in Building: Case Study

Authors: Javier Navarro Garcia, Narciso Vazquez Carretero

Abstract:

Eurocode 1, part 1-4, wind actions, includes in its article 1.5 the possibility of using numerical calculation methods to obtain information on the loads acting on a building. On the other hand, the analysis using computational fluids dynamics (CFD) in aerospace, aeronautical, and industrial applications is already in widespread use. The application of techniques based on CFD analysis on the building to study its aerodynamic behavior now opens a whole alternative field of possibilities for civil engineering and architecture; optimization of the results with respect to those obtained by applying the regulations, the possibility of obtaining information on pressures, speeds at any point of the model for each moment, the analysis of turbulence and the possibility of modeling any geometry or configuration. The present work compares the results obtained on a building, with respect to its aerodynamic behavior, from a mathematical model based on the analysis by CFD with the results obtained by applying Eurocode1, part1-4, wind actions. It is verified that the results obtained by CFD techniques suppose an optimization of the wind action that acts on the building with respect to the wind action obtained by applying the Eurocode1, part 1-4, wind actions. In order to carry out this verification, a 45m high square base truncated pyramid building has been taken. The mathematical model on CFD, based on finite volumes, has been calculated using the FLUENT commercial computer application using a scale-resolving simulation (SRS) type large eddy simulation (LES) turbulence model for an atmospheric boundary layer wind with turbulent component in the direction of the flow.

Keywords: aerodynamic, CFD, computacional fluids dynamics, computational mechanics

Procedia PDF Downloads 132
499 Ultrasonic Spectroscopy of Polymer Based PVDF-TrFE Composites with CNT Fillers

Authors: J. Belovickis, V. Samulionis, J. Banys, M. V. Silibin, A. V. Solnyshkin, A. V. Sysa

Abstract:

Ferroelectric polymers exhibit good flexibility, processability and low cost of production. Doping of ferroelectric polymers with nanofillers may modify its dielectric, elastic or piezoelectric properties. Carbon nanotubes are one of the ingredients that can improve the mechanical properties of polymer based composites. In this work, we report on both the ultrasonic and the dielectric properties of the copolymer polyvinylidene fluoride/tetrafluoroethylene (P(VDF-TrFE)) of the composition 70/30 mol% with various concentrations of carbon nanotubes (CNT). Experimental study of ultrasonic wave attenuation and velocity in these composites has been performed over wide temperature range (100 K – 410 K) using an ultrasonic automatic pulse-echo tecnique. The temperature dependences of ultrasonic velocity and attenuation showed anomalies attributed to the glass transition and paraelectric-ferroelectric phase transition. Our investigations showed mechanical losses to be dependent on the volume fraction of the CNTs within the composites. The existence of broad hysteresis of the ultrasonic wave attenuation and velocity within the nanocomposites is presented between cooling and heating cycles. By the means of dielectric spectroscopy, it is shown that the dielectric properties may be tuned by varying the volume fraction of the CNT fillers.

Keywords: carbon nanotubes, polymer composites, PVDF-TrFE, ultrasonic spectroscopy

Procedia PDF Downloads 334
498 The Theory of the Mystery: Unifying the Quantum and Cosmic Worlds

Authors: Md. Najiur Rahman

Abstract:

This hypothesis reveals a profound and symmetrical connection that goes beyond the boundaries of quantum physics and cosmology, revolutionizing our understanding of the fundamental building blocks of the cosmos, given its name ‘The Theory of the Mystery’. This theory has an elegantly simple equation, “R = ∆r / √∆m” which establishes a beautiful and well-crafted relationship between the radius (R) of an elementary particle or galaxy, the relative change in radius (∆r), and the mass difference (∆m) between related entities. It is fascinating to note that this formula presents a super synchronization, one which involves the convergence of every basic particle and any single celestial entity into perfect alignment with its respective mass and radius. In addition, we have a Supporting equation that defines the mass-radius connection of an entity by the equation: R=√m/N, where N is an empirically established constant, determined to be approximately 42.86 kg/m, representing the proportionality between mass and radius. It provides precise predictions, collects empirical evidence, and explores the far-reaching consequences of theories such as General Relativity. This elegant symmetry reveals a fundamental principle that underpins the cosmos: each component, whether small or large, follows a precise mass-radius relationship to exert gravity by a universal law. This hypothesis represents a transformative process towards a unified theory of physics, and the pursuit of experimental verification will show that each particle and galaxy is bound by gravity and plays a unique but harmonious role in shaping the universe. It promises to reveal the great symphony of the mighty cosmos. The predictive power of our hypothesis invites the exploration of entities at the farthest reaches of the cosmos, providing a bridge between the known and the unknown.

Keywords: unified theory, quantum gravity, mass-radius relationship, dark matter, uniform gravity

Procedia PDF Downloads 73
497 Studying the Possibility to Weld AA1100 Aluminum Alloy by Friction Stir Spot Welding

Authors: Ahmad K. Jassim, Raheem Kh. Al-Subar

Abstract:

Friction stir welding is a modern and an environmentally friendly solid state joining process used to joint relatively lighter family of materials. Recently, friction stir spot welding has been used instead of resistance spot welding which has received considerable attention from the automotive industry. It is environmentally friendly process that eliminated heat and pollution. In this research, friction stir spot welding has been used to study the possibility to weld AA1100 aluminum alloy sheet with 3 mm thickness by overlapping the edges of sheet as lap joint. The process was done using a drilling machine instead of milling machine. Different tool rotational speeds of 760, 1065, 1445, and 2000 RPM have been applied with manual and automatic compression to study their effect on the quality of welded joints. Heat generation, pressure applied, and depth of tool penetration have been measured during the welding process. The result shows that there is a possibility to weld AA1100 sheets; however, there is some surface defect that happened due to insufficient condition of welding. Moreover, the relationship between rotational speed, pressure, heat generation and tool depth penetration was created.

Keywords: friction, spot, stir, environmental, sustainable, AA1100 aluminum alloy

Procedia PDF Downloads 190
496 ECG Based Reliable User Identification Using Deep Learning

Authors: R. N. Begum, Ambalika Sharma, G. K. Singh

Abstract:

Identity theft has serious ramifications beyond data and personal information loss. This necessitates the implementation of robust and efficient user identification systems. Therefore, automatic biometric recognition systems are the need of the hour, and ECG-based systems are unquestionably the best choice due to their appealing inherent characteristics. The CNNs are the recent state-of-the-art techniques for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grows. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using CNN's dense learning framework. The proposed research explores explicitly the calibre of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNN which are trained on a dataset of recordings collected from eight popular ECG databases. With the highest FAR of 0.04 percent and the highest FRR of 5%, the best performing network achieved an identification accuracy of 99.94 percent. The best network is also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient. Thus, they might also be implemented in real-time ECG-based human recognition systems.

Keywords: Biometrics, Dense Networks, Identification Rate, Train/Test split ratio

Procedia PDF Downloads 155
495 Fight against Money Laundering with Optical Character Recognition

Authors: Saikiran Subbagari, Avinash Malladhi

Abstract:

Anti Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess and mitigate the risks associated with money laundering and report any suspicious transactions to governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. In the rise of financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting the data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based, neural network-based, natural language processing (NLP), hidden markov models (HMMs), conditional random fields (CRFs), binarizations, pattern matching and stroke width transform (SWT). We evaluate each technique, discussing their strengths and constraints. Also, we emphasize on how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the office of foreign assets control (OFAC) watchlist. We will also discuss how OCR helps to overcome language barriers in AML compliance. We also address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on the data from previous research studies, which illustrate the effectiveness of OCR-based AML.

Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition

Procedia PDF Downloads 130
494 Automated Classification of Hypoxia from Fetal Heart Rate Using Advanced Data Models of Intrapartum Cardiotocography

Authors: Malarvizhi Selvaraj, Paul Fergus, Andy Shaw

Abstract:

Uterine contractions produced during labour have the potential to damage the foetus by diminishing the maternal blood flow to the placenta. In order to observe this phenomenon labour and delivery are routinely monitored using cardiotocography monitors. An obstetrician usually makes the diagnosis of foetus hypoxia by interpreting cardiotocography recordings. However, cardiotocography capture and interpretation is time-consuming and subjective, often lead to misclassification that causes damage to the foetus and unnecessary caesarean section. Both of these have a high impact on the foetus and the cost to the national healthcare services. Automatic detection of foetal heart rate may be an objective solution to help to reduce unnecessary medical interventions, as reported in several studies. This paper aim is to provide a system for better identification and interpretation of abnormalities of the fetal heart rate using RStudio. An open dataset of 552 Intrapartum recordings has been filtered with 0.034 Hz filters in an attempt to remove noise while keeping as much of the discriminative data as possible. Features were chosen following an extensive literature review, which concluded with FIGO features such as acceleration, deceleration, mean, variance and standard derivation. The five features were extracted from 552 recordings. Using these features, recordings will be classified either normal or abnormal. If the recording is abnormal, it has got more chances of hypoxia.

Keywords: cardiotocography, foetus, intrapartum, hypoxia

Procedia PDF Downloads 211