Search results for: rapid and specific detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12787

11917 An in Situ DNA Content Detection Enabled by Organic Long-Persistent Luminescence Materials with Tunable Afterglow-Time in Water and Air

Authors: Desissa Yadeta Muleta

Abstract:

Purely organic long-persistent luminescence materials (OLPLMs) have been developed as emerging organic materials due to their simple production process, low preparation cost, and good biocompatibility. Notably, OLPLMs with afterglow-time-tunable long-persistent luminescence (LPL) characteristics enable higher-level protection applications and have great prospects in biological applications. The realization of these advanced performances depends on our ability to gradually tune the LPL duration under ambient conditions; however, few strategies exist to achieve this because the underlying mechanisms remain ambiguous. Here, we propose a two-step strategy to gradually tune the LPL duration of OLPLMs over a wide range of seconds in water and air, by using derivatives as the guest and introducing a third-party material into the host-immobilized host–guest doping system. Based on this strategy, we develop an analysis method for deoxyribonucleic acid (DNA) content detection without DNA separation in aqueous samples, which circumvents the influence of chromophores, fluorophores, and other interferents in vivo, enabling a degree of in situ detection that is difficult to achieve using today’s methods. This work will expedite the development of afterglow-time-tunable OLPLMs and expand new horizons for their applications in data protection, bio-detection, and bio-sensing.

Keywords: deoxyribonucleic acid, long-persistent luminescence materials, water, air

Procedia PDF Downloads 75
11916 From Electroencephalogram to Epileptic Seizure Detection Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding-window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
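
As a rough illustration of the pipeline described above (sliding-window feature extraction followed by a multilayer perceptron), the sketch below uses scikit-learn on synthetic data. The window length, the feature set, and the simulated recording are placeholder assumptions; this is not the Training Builder tool itself.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def window_features(eeg, labels, fs=256, win_s=2):
    """Slide a fixed window over a 1-D EEG channel and extract simple features."""
    step = fs * win_s
    X, y = [], []
    for start in range(0, len(eeg) - step, step):
        w = eeg[start:start + step]
        X.append([w.mean(), w.std(), w.min(), w.max(),
                  np.abs(np.diff(w)).mean()])      # crude line-length feature
        y.append(labels[start + step // 2])        # window label = center label
    return np.array(X), np.array(y)

# Synthetic stand-in for a real EEG recording (0 = interictal, 1 = seizure).
rng = np.random.default_rng(0)
eeg = rng.normal(size=256 * 600)
labels = np.zeros(eeg.size, dtype=int)
labels[256 * 200:256 * 260] = 1                    # pretend seizure segment
eeg[labels == 1] *= 5                              # higher amplitude during seizure

X, y = window_features(eeg, labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
```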

Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing

Procedia PDF Downloads 187
11915 Detection of Cardiac Arrhythmia Using Principal Component Analysis and XGBoost Model

Authors: Sujay Kotwale, Ramasubba Reddy M.

Abstract:

Electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart disease that can lead to the death of the patient when left untreated. Early detection of cardiac arrhythmia helps doctors provide proper treatment of the heart. In the past, various algorithms and machine learning (ML) models were used for early detection of cardiac arrhythmia, but few of them achieved good results. In order to improve the performance, this paper implements principal component analysis (PCA) along with an XGBoost model. PCA was applied to the raw ECG signals to suppress redundant information and extract significant features. The obtained significant ECG features were fed into the XGBoost model, and the performance of the model was evaluated. In order to validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
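
A minimal sketch of the PCA-plus-XGBoost combination follows, with a random matrix standing in for segmented ECG beats; the component count, tree parameters, and dummy labels are assumptions, not the paper's tuned configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Placeholder beat matrix: rows = fixed-length ECG beats, cols = samples.
# A real study would segment beats from MIT-BIH records instead.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 180))
y = rng.integers(0, 2, size=1000)        # 0 = normal, 1 = arrhythmic (dummy)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

pca = PCA(n_components=20).fit(X_tr)     # keep 20 components (assumed)
model = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
model.fit(pca.transform(X_tr), y_tr)
print(f"test accuracy: {model.score(pca.transform(X_te), y_te):.3f}")
```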

Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost

Procedia PDF Downloads 117
11914 Monocular 3D Person Tracking via Demographic Classification and Projective Image Processing

Authors: McClain Thiel

Abstract:

Object detection and localization have historically required two or more sensors due to the loss of information from 3D to 2D space; however, most surveillance systems currently in use in the real world have only one sensor per location. Generally, this consists of a single low-resolution camera positioned above the area under observation (mall, jewelry store, traffic camera). This is not sufficient for robust 3D tracking for applications such as security or, of more recent relevance, contact tracing. This paper proposes a lightweight system for 3D person tracking that requires no additional hardware, based on compressed object-detection convolutional nets, facial landmark detection, and projective geometry. This approach involves classifying the target into a demographic category, making assumptions about the relative locations of facial landmarks from the demographic information, and from there using simple projective geometry and known constants to find the target's location in 3D space. Preliminary testing, although limited, suggests reasonable success in 3D tracking under ideal conditions.
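
The projective step can be sketched with the pinhole camera model: if a facial distance of known real-world size (assumed here from demographic averages, e.g. interpupillary distance) spans p pixels in the image, the depth follows from similar triangles. The focal length and landmark coordinates below are illustrative assumptions.

```python
import numpy as np

def depth_from_landmarks(p_left, p_right, focal_px, real_width_m):
    """Estimate target depth Z from two pupil pixel coordinates.

    Pinhole model: pixel_width / focal_px = real_width_m / Z
    => Z = focal_px * real_width_m / pixel_width
    """
    pixel_width = np.linalg.norm(np.asarray(p_right) - np.asarray(p_left))
    return focal_px * real_width_m / pixel_width

def backproject(u, v, Z, focal_px, cx, cy):
    """Recover the 3-D camera-frame point for pixel (u, v) at depth Z."""
    X = (u - cx) * Z / focal_px
    Y = (v - cy) * Z / focal_px
    return np.array([X, Y, Z])

# Illustrative numbers: 800 px focal length, 63 mm mean adult interpupillary distance.
Z = depth_from_landmarks((310, 240), (350, 242), focal_px=800, real_width_m=0.063)
print(backproject(330, 241, Z, focal_px=800, cx=320, cy=240))
```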

Keywords: monocular distancing, computer vision, facial analysis, 3D localization

Procedia PDF Downloads 137
11913 FlexPoints: Efficient Algorithm for Detection of Electrocardiogram Characteristic Points

Authors: Daniel Bulanda, Janusz A. Starzyk, Adrian Horzyk

Abstract:

The electrocardiogram (ECG) is one of the most commonly used medical tests, essential for correct diagnosis and treatment of the patient. While ECG devices generate a huge amount of data, only a small part of it carries valuable medical information. To deal with this problem, many compression algorithms and filters have been developed over the past years. However, the rapid development of new machine learning techniques poses new challenges. To address this class of problems, we created the FlexPoints algorithm, which searches for characteristic points on the ECG signal and ignores all other points that do not carry relevant medical information. The conducted experiments proved that the presented algorithm can significantly reduce the number of data points representing the ECG signal without losing valuable medical information. These sparse but essential characteristic points (flex points) can be a perfect input for some modern machine learning models, which work much better using flex points as input instead of raw data or data compressed by many popular algorithms.
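
The abstract does not spell out the selection rule, so the sketch below is only a stand-in: it keeps samples where the local slope changes sharply, one simple way of retaining "flex" points while discarding the rest. It is not the authors' FlexPoints algorithm.

```python
import numpy as np

def flex_points(signal, slope_tol=0.02):
    """Keep indices where the first derivative changes noticeably.

    A generic inflection/turning-point picker; the exact FlexPoints
    criterion is not given in the abstract, so this is illustrative only.
    """
    d = np.diff(signal)
    change = np.abs(np.diff(d))                    # second difference
    keep = np.where(change > slope_tol)[0] + 1     # shift back to sample index
    return np.union1d(keep, [0, len(signal) - 1])  # always keep endpoints

t = np.linspace(0, 1, 360)
ecg_like = np.sin(2 * np.pi * 3 * t) + 0.8 * np.exp(-((t - 0.5) / 0.01) ** 2)
idx = flex_points(ecg_like)
print(f"kept {idx.size} of {t.size} samples "
      f"({100 * idx.size / t.size:.1f}% of the data)")
```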

Keywords: characteristic points, electrocardiogram, ECG, machine learning, signal compression

Procedia PDF Downloads 160
11912 Video Foreground Detection Based on Adaptive Mixture Gaussian Model for Video Surveillance Systems

Authors: M. A. Alavianmehr, A. Tashk, A. Sodagaran

Abstract:

Modeling the background and moving objects is a significant technique for video surveillance and other video processing applications. This paper presents a foreground detection algorithm, robust against illumination changes and noise, based on an adaptive Gaussian mixture model (GMM), and provides a novel and practical choice for intelligent video surveillance systems using static cameras. In previous methods, the image of still objects (the background image) plays no significant role. By contrast, this method is based on forming a meticulous background image and exploiting it to separate moving objects from their background. The background image is specified either manually, by taking an image without vehicles, or is detected in real time by forming an arithmetic or exponential average of successive images. The proposed scheme offers low image degradation. The simulation results demonstrate a high degree of performance for the proposed method.
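
OpenCV ships an adaptive-GMM background subtractor in the same spirit (Zivkovic's MOG2); the snippet below is a generic usage sketch, not the authors' implementation, and the video path and parameters are placeholders.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")           # placeholder input video
subtractor = cv2.createBackgroundSubtractorMOG2(
    history=500, varThreshold=16, detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)              # per-pixel GMM update + classify
    mask = cv2.morphologyEx(                    # clean up noise in the mask
        mask, cv2.MORPH_OPEN,
        cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
    cv2.imshow("foreground", mask)
    if cv2.waitKey(30) & 0xFF == 27:            # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```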

Keywords: image processing, background models, video surveillance, foreground detection, Gaussian mixture model

Procedia PDF Downloads 514
11911 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection

Authors: S. Delgado, C. Cerrada, R. S. Gómez

Abstract:

This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges in voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times. These repeated voxels incur costly memory operations that add no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing the triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, our method's computational efficiency is complemented by its simplicity and portability. Written as a single compute shader in Graphics Library Shader Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency but also its ability to produce accurate, 26-tunnel-free voxelizations. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces. Furthermore, we introduce the Slope Consistency Value metric, quantifying the alignment of each triangle with its primary axis. This metric provides insights into the impact of triangle orientation on scan-line-based voxelization methods. It also aids in understanding how the Gap Detection technique effectively improves results by targeting specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods. The Gap Detection technique fills a critical gap in the voxelization process. By addressing these gaps, our algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.
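
The core idea, each cell touched exactly once by parallel equidistant scan-lines, can be shown in a 2D analogue (the paper's actual method is a 3D GLSL compute shader, not reproduced here). The triangle and grid below are invented.

```python
import numpy as np

def scanline_fill(tri, grid_h, grid_w):
    """2-D analogue of equidistant scan-line voxelization.

    Walks horizontal scan-lines through a triangle and marks every cell in
    each line's span exactly once -- no cell is ever revisited.
    """
    grid = np.zeros((grid_h, grid_w), dtype=bool)
    (x0, y0), (x1, y1), (x2, y2) = sorted(tri, key=lambda p: p[1])
    for y in range(int(np.ceil(y0)), int(np.floor(y2)) + 1):
        xs = []
        for (xa, ya), (xb, yb) in [((x0, y0), (x1, y1)),
                                   ((x1, y1), (x2, y2)),
                                   ((x0, y0), (x2, y2))]:
            if ((ya <= y <= yb) or (yb <= y <= ya)) and ya != yb:
                xs.append(xa + (y - ya) * (xb - xa) / (yb - ya))
        if len(xs) >= 2:
            lo, hi = int(np.ceil(min(xs))), int(np.floor(max(xs)))
            grid[y, lo:hi + 1] = True        # each cell written exactly once
    return grid

filled = scanline_fill([(2, 1), (28, 6), (10, 18)], 20, 32)
print(filled.sum(), "cells filled")
```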

Keywords: voxelization, GPU acceleration, computer graphics, compute shaders

Procedia PDF Downloads 70
11910 Regulatory Frameworks and Bank Failure Prevention in South Africa: Assessing Effectiveness and Enhancing Resilience

Authors: Princess Ncube

Abstract:

In the context of South Africa's banking sector, the prevention of bank failures is of paramount importance to ensure financial stability and economic growth. This paper focuses on the role of regulatory frameworks in safeguarding the resilience of South African banks and mitigating the risks of failures. It aims to assess the effectiveness of existing regulatory measures and proposes strategies to enhance the resilience of financial institutions in the country. The paper begins by examining the specific regulatory frameworks in place in South Africa, including capital adequacy requirements, stress testing methodologies, risk management guidelines, and supervisory practices. It delves into the evolution of these measures in response to lessons learned from past financial crises and their relevance in the unique South African banking landscape. Drawing on empirical evidence and case studies specific to South Africa, this paper evaluates the effectiveness of regulatory frameworks in preventing bank failures within the country. It analyses the impact of these frameworks on crucial aspects such as early detection of distress signals, improvements in risk management practices, and advancements in corporate governance within South African financial institutions. Additionally, it explores the interplay between regulatory frameworks and the specific economic environment of South Africa, including the role of macroprudential policies in preventing systemic risks. Based on the assessment, this paper proposes recommendations to strengthen regulatory frameworks and enhance their effectiveness in bank failure prevention in South Africa. It explores avenues for refining existing regulations to align capital requirements with the risk profiles of South African banks, enhancing stress testing methodologies to capture specific vulnerabilities, and fostering better coordination among regulatory authorities within the country. Furthermore, it examines the potential benefits of adopting innovative approaches, such as leveraging technology and data analytics, to improve risk assessment and supervision in the South African banking sector.

Keywords: banks, resolution, liquidity, regulation

Procedia PDF Downloads 86
11909 Vehicle Timing Motion Detection Based on Multi-Dimensional Dynamic Detection Network

Authors: Jia Li, Xing Wei, Yuchen Hong, Yang Lu

Abstract:

Detecting vehicle behavior has always been a focus of intelligent transportation, but with the explosive growth in the number of vehicles and the complexity of the road environment, the vehicle behavior videos captured by traditional surveillance are no longer sufficient for the study of vehicle behavior. The traditional method of manually labeling vehicle behavior is too time-consuming and labor-intensive, while existing object detection and tracking algorithms have poor practicality and a low behavior localization rate. This paper proposes a vehicle behavior detection algorithm based on a dual-stream convolution network and a multi-dimensional video dynamic detection network. In the videos, straight-line driving is treated as background behavior, while changing lanes, turning, and turning around are set as target behaviors. The purpose of this model is to automatically mark the target behaviors of vehicles in untrimmed videos. First, the target behavior proposals in the long video are extracted through the dual-stream convolution network. The model uses the dual-stream convolutional network to generate a one-dimensional action-score waveform, and then extracts segments with scores above a given threshold M as preliminary vehicle behavior proposals, as sketched below. Second, the preliminary proposals are pruned and identified using the multi-dimensional video dynamic detection network. Drawing on hierarchical reinforcement learning, the multi-dimensional network includes a Timer module and a Spacer module, where the Timer module mines temporal information in the video stream and the Spacer module extracts spatial information in the video frames. The Timer and Spacer modules are implemented with Long Short-Term Memory (LSTM) networks and start from an all-zero hidden state. The Timer module uses the Transformer mechanism to extract timing information from the video stream and extracts features by linear mapping and other methods. Finally, the model fuses the temporal and spatial information and obtains the location and category of the behavior through the softmax layer. This paper uses recall and precision to measure the performance of the model. Extensive experiments show that, on the dataset of this paper, the proposed model has obvious advantages compared with existing state-of-the-art behavior detection algorithms. When the Time Intersection over Union (TIoU) threshold is 0.5, the mean Average Precision (mAP) reaches 36.3% (the mAP of the baselines is 21.5%). In summary, this paper proposes a vehicle behavior detection model based on a multi-dimensional dynamic detection network. The paper introduces spatial and temporal information to extract vehicle behaviors in long videos. Experiments show that the proposed algorithm is advanced and accurate in vehicle timing behavior detection. In the future, the focus will be on simultaneously detecting the timing behavior of multiple vehicles in complex traffic scenes (such as a busy street) while ensuring accuracy.
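
The first stage reduces to a 1-D operation: threshold the per-frame action-score waveform and merge consecutive above-threshold frames into temporal proposals. A minimal sketch (the threshold value, minimum length, and scores are made up):

```python
import numpy as np

def score_to_proposals(scores, threshold=0.5, min_len=5):
    """Turn a per-frame action-score waveform into [start, end) proposals."""
    proposals, start = [], None
    for i, s in enumerate(scores):
        if s > threshold and start is None:
            start = i                        # rising edge: proposal begins
        elif s <= threshold and start is not None:
            if i - start >= min_len:
                proposals.append((start, i)) # falling edge: proposal ends
            start = None
    if start is not None and len(scores) - start >= min_len:
        proposals.append((start, len(scores)))
    return proposals

rng = np.random.default_rng(2)
scores = rng.random(300) * 0.4
scores[120:160] += 0.5                       # pretend a lane change happens here
print(score_to_proposals(scores))            # -> roughly [(120, 160)]
```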

Keywords: vehicle behavior detection, convolutional neural network, long short-term memory, deep learning

Procedia PDF Downloads 129
11908 Fabrication of Poly(Ethylene Oxide)/Chitosan/Indocyanine Green Nanoprobe by Co-Axial Electrospinning Method for Early Detection

Authors: Zeynep R. Ege, Aydin Akan, Faik N. Oktar, Betul Karademir, Oguzhan Gunduz

Abstract:

Early detection of cancer by advanced biomedical imaging techniques could save lives and preserve quality of life in insidious cases. Designing a targeted detection system is necessary in order to protect healthy cells. Electrospun nanofibers are efficient and targetable nanocarriers with important properties such as nanometric diameter, favorable mechanical properties, elasticity, porosity, and a high surface-area-to-volume ratio. In the present study, the organic dye indocyanine green (ICG) was stabilized and encapsulated in a polymer matrix of polyethylene oxide (PEO) and chitosan (CHI) multilayer nanofibers via the co-axial electrospinning method in one step. The co-axially electrospun nanofibers were characterized morphologically (SEM) and molecularly (FT-IR), and the entrapment efficiency of ICG was assessed by confocal imaging. The controlled-release profile of the PEO/CHI/ICG nanofibers was also evaluated over 40 hours.

Keywords: chitosan, coaxial electrospinning, controlled releasing, drug delivery, indocyanine green, polyethylene oxide

Procedia PDF Downloads 168
11907 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection

Authors: Muhammad Ali

Abstract:

Cyber-attack and anomaly detection on Internet of Things (IoT) infrastructure is an emerging concern in the domain of data-driven intrusion detection. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data-type probing, malicious operation, DDoS, scanning, spying, and wrong setup are attacks and anomalies that can cause an IoT system failure. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction. IoT devices expose a wide variety of new cyber security attack vectors in network traffic. For further IoT development, and mainly for smart and IoT applications, there is a necessity for intelligent processing and analysis of data; our approach aims to provide this security. We train and compare several machine learning models for accurately predicting attacks and anomalies on IoT systems, using ANOVA-based feature selection to reduce the features the prediction models must evaluate in network traffic and thereby help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, decision tree (DT), and random forest (RF), evaluated for test accuracy and detection speed. The evaluation metrics include precision, recall, F1 score, FPR, NPV, geometric mean (GM), MCC, and ROC AUC. The random forest algorithm achieved the best results, with an accuracy of 99.98% and a low prediction time.
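
ANOVA-based feature selection is commonly done with the F-statistic, which scikit-learn exposes as f_classif inside SelectKBest. A hedged sketch with a synthetic stand-in for the IoT traffic dataset (the feature count k and forest size are assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for labeled IoT network-traffic records.
X, y = make_classification(n_samples=5000, n_features=40, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(
    SelectKBest(f_classif, k=10),            # keep 10 features by ANOVA F-score
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.4f}")
```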

Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection

Procedia PDF Downloads 122
11906 Design and Development of an Autonomous Beach Cleaning Vehicle

Authors: Mahdi Allaoua Seklab, Süleyman BaşTürk

Abstract:

In the quest to enhance coastal environmental health, this study introduces a fully autonomous beach cleaning machine, a breakthrough in leveraging green energy and advanced artificial intelligence for ecological preservation. Designed to operate independently, the machine is propelled by a solar-powered system, underscoring a commitment to sustainability and the use of renewable energy in autonomous robotics. The vehicle's autonomous navigation is achieved through a sophisticated integration of LIDAR and a camera system, utilizing an SSD MobileNet V2 object detection model for accurate and real-time trash identification. The SSD framework, renowned for its efficiency in detecting objects in various scenarios, is coupled with the lightweight and highly precise MobileNet V2 architecture, making it particularly suited to the computational constraints of on-board processing in mobile robotics. Training of the SSD MobileNet V2 model was conducted on Google Colab, harnessing cloud-based GPU resources to facilitate a rapid and cost-effective learning process. The model was refined with an extensive dataset of annotated beach debris, optimizing the parameters using the Adam optimizer and a cross-entropy loss function to achieve high-precision trash detection. This capability allows the machine to intelligently categorize and target waste, leading to more effective cleaning operations. This paper details the design and functionality of the beach cleaning machine, emphasizing its autonomous operational capabilities and the novel application of AI in environmental robotics. The results showcase the potential of such technology to fill existing gaps in beach maintenance, offering a scalable and eco-friendly solution to the growing problem of coastal pollution. The deployment of this machine represents a significant advancement in the field, setting a new standard for the integration of autonomous systems in the service of environmental stewardship.
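
For illustration, an SSD MobileNet V2 detector exported from the TensorFlow Object Detection API can be run with OpenCV's DNN module; the file names below are placeholders for whatever frozen graph and config the project actually produced, and the confidence threshold is assumed.

```python
import cv2

# Placeholder paths: a frozen SSD MobileNet V2 graph and its matching config.
net = cv2.dnn.readNetFromTensorflow("frozen_inference_graph.pb",
                                    "ssd_mobilenet_v2.pbtxt")

img = cv2.imread("beach_frame.jpg")               # placeholder camera frame
h, w = img.shape[:2]
blob = cv2.dnn.blobFromImage(img, size=(300, 300), swapRB=True)
net.setInput(blob)
detections = net.forward()                        # shape: (1, 1, N, 7)

for det in detections[0, 0]:
    _, class_id, score, x1, y1, x2, y2 = det
    if score > 0.5:                               # confidence threshold (assumed)
        box = (int(x1 * w), int(y1 * h), int(x2 * w), int(y2 * h))
        print(f"class {int(class_id)} at {box} (score {score:.2f})")
```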

Keywords: autonomous beach cleaning machine, renewable energy systems, coastal management, environmental robotics

Procedia PDF Downloads 23
11905 Quality and Shelf Life of UHT Milk Produced in Tripoli, Libya

Authors: Faozia A. S. Abuhtana, Yahia S. Abujnah, Said O. Gnann

Abstract:

Ultra High Temperature (UHT) processed milk is widely distributed and preferred in numerous countries all over the world due to its relatively high quality and long shelf life. Because of the notably high consumption rate of UHT milk in Libya, in addition to the scarcity of local studies on such products, this study was designed to assess the shelf life of locally produced as well as imported reconstituted sterilized whole milk samples marketed in Tripoli, Libya. Four locally produced and three imported brands were used in this study. All samples were stored at room temperature (25 ± 2 °C) for an 8-month period and subjected to physical, chemical, microbiological, and sensory tests. These tests included measurement of pH, specific gravity, and percent acidity, and determination of fat, protein, and melamine content. Microbiological tests included total aerobic count, total psychrotrophic bacteria, total spore-forming bacteria, and total coliform counts. Results indicated no detection of microbial growth of any type during the study period, and no melamine was detected in any sample. On the other hand, a gradual decline in pH accompanied by a gradual increase in percent acidity of both locally produced and imported samples was observed. Such changes in both pH and percent acidity reached their lowest and highest values, respectively, during the 24th week of storage. For instance, pH values were (6.40, 6.55, 6.55, 6.15) and (6.30, 6.50, 6.20) for local and imported brands, respectively, while percent acidity reached (0.185, 0.181, 0.170, 0.183) and (0.180, 0.180, 0.171) at the 24th week for local and imported brands, respectively. A similar pattern of decline was also observed in specific gravity, fat, and protein content in some local and imported samples, especially at later stages of the study. In both cases, some of the recorded pH, percent acidity, specific gravity, and fat content values were in violation of the accepted limits set by Libyan standard no. 356 for sterilized milk. These changes in pH, percent acidity, and other UHT sterilized milk constituents during storage coincided with a gradual decrease in the degree of acceptance of the stored milk samples of both types, as shown by the sensory scores recorded by the panelists. In either case, the degree of acceptance was significantly low at late stages of storage, and most milk samples became relatively unacceptable after the 18th and 20th weeks for untrained and trained panelists, respectively.

Keywords: UHT milk, shelf life, quality, gravity, bacteria

Procedia PDF Downloads 337
11904 Multiaxial Fatigue Analysis of a High Performance Nickel-Based Superalloy

Authors: P. Selva, B. Lorraina, J. Alexis, A. Seror, A. Longuet, C. Mary, F. Denard

Abstract:

Over the past four decades, the fatigue behavior of nickel-based alloys has been widely studied. In recent years, however, significant advances in the fabrication process leading to grain size reduction have been made in order to improve the fatigue properties of aircraft turbine discs. Indeed, a change in particle size affects the initiation mode of fatigue cracks as well as the fatigue life of the material. The present study aims to investigate the fatigue behavior of a newly developed nickel-based superalloy under biaxial-planar loading. Low Cycle Fatigue (LCF) tests are performed at different stress ratios so as to study the influence of the multiaxial stress state on the fatigue life of the material. Full-field displacement and strain measurements as well as crack initiation detection are obtained using Digital Image Correlation (DIC) techniques. The aim of this presentation is first to provide an in-depth description of both the experimental set-up and protocol: the multiaxial testing machine, the specific design of the cruciform specimen, and the performance of the DIC code are introduced. Second, results for sixteen specimens related to different load ratios are presented. Crack detection, strain amplitude, and number of cycles to crack initiation vs. triaxial stress ratio for each loading case are given. Third, from fractographic investigations by scanning electron microscopy, it is found that the mechanism of fatigue crack initiation does not depend on the triaxial stress ratio and that most fatigue cracks initiate from subsurface carbides.

Keywords: cruciform specimen, multiaxial fatigue, nickel-based superalloy

Procedia PDF Downloads 293
11903 Multi-Criteria Evaluation of IDS Architectures in Cloud Computing

Authors: Elmahdi Khalil, Saad Enniari, Mostapha Zbakh

Abstract:

Cloud computing promises to increase innovation and the velocity with which applications are deployed, all while helping any enterprise meet most IT service needs at a lower total cost of ownership and a higher return on investment. As the march of the cloud continues, it brings both new opportunities and new security challenges. To take advantage of those opportunities while minimizing the risks, we think that Intrusion Detection Systems (IDS) integrated into the cloud are one of the best existing solutions in the field today. The concept of intrusion detection has been known for a long time; it was first proposed by the well-known researcher Anderson in 1980, and IDSs have been evolving ever since. Although several efforts have been made in the area of intrusion detection systems for cloud computing environments, many attacks still prevail. Therefore, the work presented in this paper proposes a multi-criteria analysis and a comparative study of several IDS architectures designed to work in cloud computing environments. To achieve this objective, we first survey the state of the art of consistent IDS architectures designed to work in a cloud environment. In a second step, we establish the criteria that will be useful for the evaluation of the architectures. Then, using the multi-criteria decision analysis approach MACBETH (Measuring Attractiveness by a Categorical Based Evaluation Technique), we evaluate the criteria and assign each one an appropriate weight according to its importance in the field of IDS architectures in cloud computing. The last step is to evaluate the architectures against the criteria and collect the results of the model constructed in the previous steps.
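
The final aggregation step of such an analysis is a weighted sum of per-criterion scores. The sketch below shows only that step, with invented architectures, criteria, and weights; MACBETH itself derives the weights from pairwise attractiveness judgments, which is not reproduced here.

```python
import numpy as np

criteria = ["detection rate", "scalability", "overhead", "deployability"]
weights = np.array([0.4, 0.25, 0.2, 0.15])   # assumed MACBETH-derived weights

# Rows = candidate IDS architectures, columns = 0-100 scores per criterion.
scores = np.array([
    [85, 60, 70, 80],   # hypothetical hypervisor-based IDS
    [75, 90, 55, 65],   # hypothetical distributed network IDS
    [70, 80, 85, 90],   # hypothetical VM-level host IDS
])

overall = scores @ weights                   # weighted sum per architecture
for name, s in zip(["hypervisor", "distributed", "host"], overall):
    print(f"{name:12s} weighted score: {s:.1f}")
```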

Keywords: cloud computing, cloud security, intrusion detection/prevention system, multi-criteria decision analysis

Procedia PDF Downloads 468
11902 PLA Plastic as Biodegradable Material for 3D Printers

Authors: Juraj Beniak, Ľubomír Šooš, Peter Križan, Miloš Matúš

Abstract:

Many types of materials are used within Rapid Prototyping technologies. Many of them are recyclable, but they are still plastics, so in practice they do not degrade in a landfill. Polylactic acid (PLA) is one of the special plastic materials that are biodegradable and also available for 3D printing within Fused Deposition Modelling (FDM) technology. The question is whether the mechanical properties of the produced models are comparable to those of similar technical plastic materials commonly used for prototype production. This paper presents experimental results of tensile strength measurements for specimens prepared with different 3D printer settings and model orientations. The paper also compares the tensile strength values with values measured on specimens produced by conventional technologies such as injection moulding.

Keywords: 3D printing, biodegradable plastic, fused deposition modeling, PLA plastic, rapid prototyping

Procedia PDF Downloads 414
11901 Eliminating Arm, Neck and Leg Fatigue of United Asia International Plastics Corporation Workers through Rapid Entire Body Assessment

Authors: John Cheferson R. De Belen, John Paul G. Elizares, Ronald John G. Raz, Janina Elyse A. Reyes, Charie G. Salengua, Aristotle L. Soriano

Abstract:

Plastic is a type of synthetic or man-made polymer that can readily be molded into a variety of products. Its usage over the past century has enabled society to make huge technological advances. The workers of United Asia International Plastics Corporation (UAIPC), a plastic manufacturing company, perform manual packaging, which causes fatigue and stress in their arms, necks, and legs due to extended periods of standing and repetitive motions. With the use of the Fishbone Diagram, Five-Why Analysis, Rapid Entire Body Assessment (REBA), and Anthropometry, the stressful tasks and activities were identified and analyzed. Given the anthropometric measurements obtained from the workers, improved dimensions for the tables and chairs should be used, and a new packaging machine should be provided. The validation of this proposal shall follow after its implementation. By eliminating fatigue during working hours in production, the workers will be at ease performing their work properly, and productivity will increase, which will lead to more profit. Further areas for study include measurement and comparison of the workers’ anthropometric measurements with the industry standard.

Keywords: anthropometry, fishbone diagram, five-why analysis, rapid entire body assessment

Procedia PDF Downloads 263
11900 Correlation between the Ratios of House Dust Mite-Specific IgE/Total IgE and Asthma Control Test Score as a Biomarker of Immunotherapy Response Effectiveness in Pediatric Allergic Asthma Patients

Authors: Bela Siska Afrida, Wisnu Barlianto, Desy Wulandari, Ery Olivianto

Abstract:

Background: Allergic asthma, caused by IgE-mediated allergic reactions, remains a global health issue with high morbidity and mortality rates. Immunotherapy is the only etiology-based approach to treating asthma, but no standard biomarkers have been established to evaluate the therapy’s effectiveness. This study aims to determine the correlation between the ratio of serum HDM-specific IgE to total IgE and the Asthma Control Test (ACT) score as a biomarker of the response to immunotherapy in pediatric allergic asthma patients. Patients and Methods: This retrospective cohort study involved 26 pediatric allergic asthma patients who underwent HDM-specific subcutaneous immunotherapy for 14 weeks at the Pediatric Allergy Immunology Outpatient Clinic at Saiful Anwar General Hospital, Malang. Serum levels of HDM-specific IgE and total IgE were measured before and after immunotherapy using chemiluminescence immunoassay and enzyme-linked immunosorbent assay (ELISA) methods. Changes in asthma control were assessed using the ACT score. The Wilcoxon signed-rank test and the Spearman correlation test were used for data analysis. Results: There were 14 boys and 12 girls with a mean age of 6.48 ± 2.54 years. The study showed a significant decrease in serum HDM-specific IgE levels, from [9.88 ± 5.74 kuA/L] before immunotherapy to [4.51 ± 3.98 kuA/L] 14 weeks after immunotherapy, p = 0.000. Serum total IgE levels decreased significantly, from [207.6 ± 120.8 IU/ml] before immunotherapy to [109.83 ± 189.39 IU/mL] 14 weeks after immunotherapy, p = 0.000. The ratio of serum HDM-specific IgE/total IgE decreased significantly, from [0.063 ± 0.05] before immunotherapy to [0.041 ± 0.039] 14 weeks after immunotherapy, p = 0.012. There was also a significant increase in ACT scores before vs. after immunotherapy (15.5 ± 1.79 and 20.96 ± 2.049, respectively; p = 0.000). The correlation test showed a weak negative correlation between the HDM-specific IgE/total IgE ratio and the ACT score (p = 0.034 and r = -0.29). Conclusion: This study showed that a decrease in HDM-specific IgE levels, total IgE levels, and the HDM-specific IgE/total IgE ratio, together with an increase in ACT score, was observed after 14 weeks of HDM-specific subcutaneous immunotherapy. The weak negative correlation between the HDM-specific IgE/total IgE ratio and the ACT score suggests that this ratio can serve as a potential biomarker of the effectiveness of immunotherapy in pediatric allergic asthma patients.
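
The two tests named above are available in SciPy; a hedged sketch with made-up paired measurements standing in for the 26 patients' pre/post values:

```python
import numpy as np
from scipy.stats import wilcoxon, spearmanr

rng = np.random.default_rng(3)

# Placeholder paired data for 26 patients (not the study's actual values).
ratio_before = rng.normal(0.063, 0.02, size=26).clip(0.01)
ratio_after = ratio_before * rng.uniform(0.4, 0.9, size=26)   # simulated decrease
act_score_after = rng.normal(21, 2, size=26)

# Paired pre/post comparison of the IgE ratio.
stat, p = wilcoxon(ratio_before, ratio_after)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.4f}")

# Correlation between post-treatment ratio and ACT score.
rho, p = spearmanr(ratio_after, act_score_after)
print(f"Spearman: rho = {rho:.2f}, p = {p:.3f}")
```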

Keywords: HDM-specific IgE/total IgE ratio, ACT score, immunotherapy, allergic asthma

Procedia PDF Downloads 65
11899 Sensitivity, Specificity and Efficiency of Real-Time PCR Using the SYBR Green Method to Determine Porcine and Bovine DNA Using Specific Primers for the Cytochrome B Gene

Authors: Ahlam Inayatullah Badrul Munir, M. Husaini A. Rahman, Mohd Sukri Hassan

Abstract:

Real-time PCR is a molecular biology technique that is currently widely used in halal services to differentiate between porcine and bovine DNA. The usefulness of the technique makes it important for students or laboratory workers to learn how it can be run smoothly without failure. As with conventional PCR, real-time PCR requires a DNA template, primers, polymerase enzyme, dNTPs, and buffer. The difference is that real-time PCR has an additional component, namely a fluorescent dye. The fluorescent dye most commonly used in real-time PCR is SYBR Green. The purpose of this study was to find out how sensitive, specific, and efficient the real-time PCR technique is when combined with the SYBR Green method and specific CYT b primers. The results showed that the real-time PCR technique using SYBR Green was capable of detecting porcine and bovine DNA at concentrations down to 0.0001 ng/µl. The level of efficiency for both types of DNA was 91% (within the accepted 90-110% range). Moreover, the CYT b primers were specific: the bovine primer detected only bovine DNA, and the porcine primer detected only porcine DNA. It can therefore be concluded that the real-time PCR technique, combined with specific CYT b primers and the SYBR Green method, is sensitive, specific, and efficient for detecting porcine and bovine DNA.
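
Amplification efficiency in qPCR is conventionally derived from the slope of a standard curve (Cq vs. log10 of template amount) via E = 10^(-1/slope) - 1, with 90-110% usually considered acceptable. A short sketch using that standard formula; the dilution-series values are invented, not the study's data.

```python
import numpy as np

# Placeholder standard curve: 10-fold dilutions and measured Cq values.
log10_conc = np.array([0, -1, -2, -3, -4])        # log10 of relative concentration
cq = np.array([15.1, 18.6, 22.0, 25.5, 29.1])     # invented Cq measurements

slope, intercept = np.polyfit(log10_conc, cq, 1)
efficiency = 10 ** (-1 / slope) - 1               # standard qPCR efficiency formula
print(f"slope = {slope:.3f}, efficiency = {efficiency * 100:.1f}%")
# A slope near -3.32 corresponds to ~100% efficiency (perfect doubling per cycle).
```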

Keywords: sensitivity, specificity, efficiency, real-time PCR, SYBR green, Cytochrome b, porcine DNA, bovine DNA

Procedia PDF Downloads 313
11898 Combining in vitro Protein Expression with AlphaLISA Technology to Study Protein-Protein Interaction

Authors: Shayli Varasteh Moradi, Wayne A. Johnston, Dejan Gagoski, Kirill Alexandrov

Abstract:

The demand for a rapid and more efficient technique to identify protein-protein interactions, particularly in the areas of therapeutics and diagnostics development, is growing. The method described here is a rapid in vitro protein-protein interaction analysis approach based on AlphaLISA technology combined with the Leishmania tarentolae cell-free protein production (LTE) system. Cell-free protein synthesis allows the rapid production of recombinant proteins in a multiplexed format. Among available in vitro expression systems, LTE offers several advantages over other eukaryotic cell-free systems. It is based on a fast-growing fermentable organism that is inexpensive in cultivation and lysate production. The high integrity of proteins produced in this system and the ability to co-express multiple proteins make it a desirable method for screening protein interactions. Following the translation of protein pairs in the LTE system, the physical interaction between the proteins of interest is analysed by the AlphaLISA assay. The assay is performed using unpurified in vitro translation reactions and can therefore be readily multiplexed. This approach can be used in various research applications such as epitope mapping, antigen-antibody analysis, and protein interaction network mapping. The intra-viral protein interaction network of Zika virus was studied using the developed technique. The viral proteins were co-expressed pair-wise in LTE, and all possible interactions among the viral proteins were tested using AlphaLISA. The assay resulted in the identification of 54 intra-viral protein-protein interactions, of which 19 binary interactions were found to be novel. The presented technique provides a powerful tool for rapid analysis of protein-protein interactions with high sensitivity and throughput.

Keywords: AlphaLISA technology, cell-free protein expression, epitope mapping, Leishmania tarentolae, protein-protein interaction

Procedia PDF Downloads 234
11897 Rapid Evidence Remote Acquisition in High-Availability Server and Storage System for Digital Forensic to Unravel Academic Crime

Authors: Bagus Hanindhito, Fariz Azmi Pratama, Ulfah Nadiya

Abstract:

Nowadays, digital systems, including but not limited to computers and the internet, have penetrated the education system widely. Critical information such as students’ academic records is stored in servers off- or on-campus. Although several countermeasures have been taken to protect these vital resources from outside attack, defense against insider threats has not received serious attention. At the end of 2017, a security incident involving the academic information system of one of the most respected universities in Indonesia affected not only the reputation of the institution and its academia but also academic integrity in Indonesia. In this paper, we explain our efforts in investigating this security incident, in which we implemented a novel rapid evidence remote acquisition method for a high-availability server and storage system, so that our data collection did not disrupt the academic information system and could be conducted remotely minutes after the incident report was received. The acquired evidence was analyzed during digital forensics by constructing a model of the system in an isolated environment, which allows multiple investigators to work together. In the end, the suspect was identified as a student (an insider), and the investigation results were used by prosecutors to charge the suspect with an academic crime.

Keywords: academic information system, academic crime, digital forensic, high-availability server and storage, rapid evidence remote acquisition, security incident

Procedia PDF Downloads 147
11896 R-Killer: An Email-Based Ransomware Protection Tool

Authors: B. Lokuketagoda, M. Weerakoon, U. Madushan, A. N. Senaratne, K. Y. Abeywardena

Abstract:

Ransomware has become a common threat in the past few years, and recent threat reports show continued growth in ransomware infections. Researchers have identified different variants of ransomware families since 2015. Users' lack of knowledge about the threat is a major concern. Ransomware detection methodologies are still maturing across the industry. Email is the easiest method of delivering ransomware to its victims. Uninformed users tend to click on links and attachments without much consideration, assuming the emails are genuine. As a solution, this paper introduces the R-Killer ransomware detection tool, which can be integrated with existing email services. The Core Detection Engine (CDE) discussed in the paper focuses on separating suspicious samples from emails and handling them until a decision is made regarding the suspicious mail. It has the capability of preventing the execution of identified ransomware processes. In addition, the sandboxing and URL-analyzing system can communicate with public threat intelligence services to gather known threat intelligence. R-Killer has its own mechanism, developed in its Proactive Monitoring System (PMS), which can monitor the processes created by downloaded email attachments and identify potential ransomware activities. R-Killer is capable of gathering threat intelligence without exposing the user’s data to public threat intelligence services, hence protecting the confidentiality of user data.
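
The first step such an engine performs, separating suspicious samples from an email, can be sketched with Python's standard email library. The extension blocklist and file names are invented examples, not R-Killer's actual rules, and a real engine would use many more signals.

```python
import email
from email import policy
from pathlib import Path

# Invented blocklist for illustration only.
SUSPICIOUS_EXT = {".exe", ".js", ".vbs", ".scr", ".docm", ".zip"}

def extract_suspicious(raw_bytes, quarantine_dir="quarantine"):
    """Pull attachments out of a raw RFC 822 message and quarantine risky ones."""
    msg = email.message_from_bytes(raw_bytes, policy=policy.default)
    Path(quarantine_dir).mkdir(exist_ok=True)
    flagged = []
    for part in msg.iter_attachments():
        name = part.get_filename() or "unnamed"
        if Path(name).suffix.lower() in SUSPICIOUS_EXT:
            target = Path(quarantine_dir) / name
            target.write_bytes(part.get_payload(decode=True))
            flagged.append(name)               # held until a verdict is reached
    return flagged

with open("sample.eml", "rb") as f:            # placeholder input message
    print("quarantined:", extract_suspicious(f.read()))
```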

Keywords: ransomware, deep learning, recurrent neural networks, email, core detection engine

Procedia PDF Downloads 207
11895 A Low-Complexity Deep Learning Method for Drone Detection

Authors: Mohamad Kassab, Amal El Fallah Seghrouchni, Frederic Barbaresco, Raed Abu Zitar

Abstract:

Detecting objects such as drones is a challenging task, as their relative size and maneuvering capabilities deceive machine learning models and cause them to misclassify drones as birds or other objects. In this work, we investigate applying several deep learning techniques to benchmark real data sets of flying drones. A deep learning paradigm is proposed for the purpose of mitigating the complexity of those systems. The proposed paradigm consists of a hybrid between the AdderNet deep learning approach and the Single Shot Detector (SSD) framework. The goal was to minimize the number of multiplication operations in the filtering layers within the proposed system and, hence, reduce complexity. Standard machine learning techniques, such as SVM, were also tested and compared to the deep learning systems. The data sets used for training and testing were either complete or filtered in order to remove images with small objects. The data were either RGB or IR. Comparisons were made between all these variants, and conclusions were presented.
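
AdderNet replaces the multiply-accumulate of convolution with a negative L1 distance between each input patch and the filter, so the filter response itself needs no multiplications. Below is a minimal, deliberately unoptimized sketch of that layer using unfold; it illustrates the arithmetic only and is not the paper's hybrid SSD system.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Adder2d(nn.Module):
    """Adder 'convolution': output = -sum |patch - weight|.

    Naive reference version for clarity; efficient AdderNet kernels
    avoid the large intermediate tensors created here.
    """

    def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch * kernel_size * kernel_size))
        self.k, self.stride, self.padding = kernel_size, stride, padding

    def forward(self, x):
        n, _, h, w = x.shape
        cols = F.unfold(x, self.k, stride=self.stride, padding=self.padding)
        # cols: (n, in_ch*k*k, L). Compare every patch with every filter by L1.
        dist = torch.cdist(self.weight.unsqueeze(0).expand(n, -1, -1),
                           cols.transpose(1, 2), p=1)      # (n, out_ch, L)
        out_h = (h + 2 * self.padding - self.k) // self.stride + 1
        out_w = (w + 2 * self.padding - self.k) // self.stride + 1
        return -dist.view(n, -1, out_h, out_w)

layer = Adder2d(3, 8, kernel_size=3, padding=1)
print(layer(torch.randn(2, 3, 32, 32)).shape)   # -> torch.Size([2, 8, 32, 32])
```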

Keywords: drones detection, deep learning, birds versus drones, precision of detection, AdderNet

Procedia PDF Downloads 178
11894 Dynamic Background Updating for Lightweight Moving Object Detection

Authors: Kelemewerk Destalem, Joongjae Cho, Jaeseong Lee, Ju H. Park, Joonhyuk Yoo

Abstract:

Background subtraction and temporal differencing are often used for moving object detection in video. Both approaches are computationally simple and easy to deploy in real-time image processing. However, while background subtraction is highly sensitive to dynamic backgrounds and illumination changes, the temporal difference approach is poor at extracting the relevant pixels of a moving object and at detecting stopped or slowly moving objects in the scene. In this paper, we propose a moving object detection scheme based on adaptive background subtraction and temporal differencing that exploits dynamic background updates. The proposed technique consists of histogram equalization and a linear combination of background and temporal differences, followed by novel frame-based and pixel-based background updating techniques. Finally, morphological operations are applied to the output images. Experimental results show that the proposed algorithm overcomes the drawbacks of both the background subtraction and temporal difference methods and provides better performance than either method alone.
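
The combination step can be sketched directly: equalize each frame, compute both a background-difference mask and a frame-difference mask, blend them linearly, and fold the current frame slowly back into the background model. The blend weight, learning rate, and threshold below are assumptions, not the paper's tuned values.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("scene.mp4")             # placeholder video
ok, prev = cap.read()
prev = cv2.equalizeHist(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY))
background = prev.astype(np.float32)
ALPHA, BLEND, THRESH = 0.02, 0.6, 25            # assumed parameters

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    bg_diff = cv2.absdiff(gray, background.astype(np.uint8))
    tmp_diff = cv2.absdiff(gray, prev)
    combined = cv2.addWeighted(bg_diff, BLEND, tmp_diff, 1 - BLEND, 0)
    _, mask = cv2.threshold(combined, THRESH, 255, cv2.THRESH_BINARY)
    # Frame-based update: fold the new frame slowly into the background model.
    cv2.accumulateWeighted(gray, background, ALPHA)
    prev = gray
    cv2.imshow("moving objects", mask)
    if cv2.waitKey(30) & 0xFF == 27:            # Esc to quit
        break
cap.release()
```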

Keywords: background subtraction, background updating, real time, light weight algorithm, temporal difference

Procedia PDF Downloads 340
11893 Modeling of Digital and Settlement Consolidation of Soil under Oedometer

Authors: Yu-Lin Shen, Ming-Kuen Chang

Abstract:

In addition to a considerable amount of machinery and equipment, petrochemical plants contain intricate transmission pipelines. Long-term corrosion may lead to pipeline thinning and rupture, causing serious safety concerns. With advances in non-destructive testing technology, rapid and long-range ultrasonic detection techniques are often used for pipeline inspection. The Electromagnetic Acoustic Transducer (EMAT) requires no couplant; it is a non-contact ultrasonic technique suitable for inspecting lines at elevated temperatures or with roughened surfaces. In this study, we prepared artificial defects in a pipeline for EMAT testing to survey the relationship between defect location, sizing, and the EMAT signal. It was found that the EMAT signal amplitude exhibited greater attenuation with larger defect depth and length. In addition, a larger flat-hole diameter produced greater amplitude attenuation. In summary, the EMAT signal amplitude attenuation was affected by defect depth, defect length, and hole diameter.

Keywords: EMAT, artificial defect, NDT, ultrasonic testing

Procedia PDF Downloads 330
11892 An Experimental Study on the Thermal Properties of Concrete Aggregates in Relation to Their Mineral Composition

Authors: Kyung Suk Cho, Heung Youl Kim

Abstract:

The analysis of the petrologic characteristics and thermal properties of crushed aggregates for concrete, such as granite, gneiss, dolomite, shale, and andesite, found that the rock-forming minerals determined the thermal properties of the aggregates. The thermal expansion coefficients of aggregates containing a lot of quartz increased rapidly at 573 °C due to the quartz transition. The mass of aggregates containing carbonate minerals decreased rapidly at 750 °C due to decarbonation, while their specific heat capacity increased relatively. The mass of aggregates containing hydrated silicate minerals decreased more significantly, and their specific heat capacities were greater when compared with aggregates containing feldspar or quartz. It is deduced that the hydroxyl group (OH) in the hydrated silicates was released as its bonds loosened at high temperatures. Aggregates containing mafic minerals turned red at high temperatures due to oxidation. Moreover, the comparison of cooling methods showed that rapid cooling using water resulted in a greater reduction in aggregate mass than slow cooling at room temperature. In order to observe the fire resistance performance of concrete composed of the identical coarse aggregates, mass loss and the compressive strength reduction factor at 200, 400, 600, and 800 °C were measured. It was found from the analysis of granite and gneiss that the difference in thermal expansion coefficients between cement paste and aggregates caused by the quartz transition at 573 °C resulted in thermal stress inside the concrete and thus triggered concrete cracking. The ferromagnesian hydrated silicates in andesite and shale caused greater reductions in both initial stiffness and mass compared with other aggregates. However, the thermal expansion coefficients of andesite and shale were similar to that of cement paste. Since they were low in thermal conductivity and high in specific heat capacity, concrete cracking was relatively less severe. Being slow in heat transfer, they were judged to be materials of high heat capacity.

Keywords: crush-aggregates, fire resistance, thermal expansion, heat transfer

Procedia PDF Downloads 226
11891 Application of Semantic Technologies in Rapid Reconfiguration of Factory Systems

Authors: J. Zhang, K. Agyapong-Kodua

Abstract:

The digital factory, based on visual design and simulation, has emerged as a mainstream approach to reducing the digital development life cycle. Some basic industrial systems are being integrated via semantic modelling, and products (P) matching process (P) and resource (R) requirements are designed to fulfill current customer demands. Nevertheless, product design is still limited to fixed product models and the existing knowledge of product engineers. Therefore, this paper presents a rapid reconfiguration method based on semantic technologies with PPR ontologies to reuse both known and unknown knowledge. In order to avoid the influence of big data, our system uses a cloud manufactory and a distributed database to improve the efficiency of querying for matches to PPR requirements.

Keywords: semantic technologies, factory system, digital factory, cloud manufactory

Procedia PDF Downloads 486
11890 Genetic Instabilities in Marine Bivalve Following Benzo(α)pyrene Exposure: Utilization of Combined Random Amplified Polymorphic DNA and Comet Assay

Authors: Mengjie Qu, Yi Wang, Jiawei Ding, Siyu Chen, Yanan Di

Abstract:

Marine ecosystems are facing intensified, multiple stresses caused by environmental contaminants from human activities. Xenobiotics such as benzo(α)pyrene (BaP) have been discharged into the marine environment and cause hazardous impacts on both marine organisms and human beings. As filter-feeders, marine mussels, Mytilus spp., have been extensively used to monitor the marine environment. However, the genomic alterations induced in them by such xenobiotics remain unknown. In the present study, gills, as the first defense barrier in mussels, were selected to evaluate the genetic instability alterations induced by exposure to BaP both in vivo and in vitro. Both the random amplified polymorphic DNA (RAPD) assay and the comet assay were applied as rapid tools to assess environmental stresses due to their low cost and time consumption. All mussels were identified to be the single species Mytilus coruscus before being used in BaP exposure at a concentration of 56 μg/l for 1 and 3 days (in vivo exposure) or 1 and 3 hours (in vitro). Both RAPD and comet assay results showed significantly increased genomic instability with time-specific alteration patterns. After the recovery period in the in vivo exposure, the genomic status was the same as in the control condition. However, relatively higher genomic instability was still observed in gill cells after recovery from the in vitro exposure condition. Different repair mechanisms or signaling pathways might be involved in the isolated gill cells in comparison with intact tissues. The study provides robust and rapid techniques to examine genomic stability in marine organisms in response to marine environmental changes and provides basic information for further mechanistic research on stress responses in marine organisms.

Keywords: genotoxic impacts, in vivo/vitro exposure, marine mussels, RAPD and comet assay

Procedia PDF Downloads 278
11889 Financial Statement Fraud: The Need for a Paradigm Shift to Forensic Accounting

Authors: Ifedapo Francis Awolowo

Abstract:

The unrelenting series of embarrassing audit failures should stimulate a paradigm shift in accounting. In this age of information revolution, there is a need for constant improvement of the products or services one offers to the market in order to remain relevant. This study explores the perceptions of external auditors, forensic accountants, and accounting academics on whether a paradigm shift to forensic accounting can reduce financial statement fraud. Through a neo-empiricist/inductive analytical approach, the findings reveal that a paradigm shift to forensic accounting might be a step in the right direction to increase the chances of fraud prevention and detection in financial statements. This research has implications for accounting education regarding the need to incorporate forensic accounting into the present-day accounting curriculum. Accounting professional bodies, accounting standard setters, and accounting firms all have roles to play in incorporating forensic accounting education into the accounting curriculum. In particular, there is a need to alter ISA 240 to make the prevention and detection of fraud the responsibility of both those charged with the management and governance of companies and statutory auditors.

Keywords: financial statement fraud, forensic accounting, fraud prevention and detection, auditing, audit expectation gap, corporate governance

Procedia PDF Downloads 365
11888 Detection and Classification of Strabismus Using Convolutional Neural Networks and Spatial Image Processing

Authors: Anoop T. R., Otman Basir, Robert F. Hess, Eileen E. Birch, Brooke A. Koritala, Reed M. Jost, Becky Luu, David Stager, Ben Thompson

Abstract:

Strabismus refers to a misalignment of the eyes. Early detection and treatment of strabismus in childhood can prevent the development of permanent vision loss due to abnormal development of visual brain areas. We developed a two-stage method for strabismus detection and classification based on photographs of the face. The first stage detects the presence or absence of strabismus, and the second stage classifies the type of strabismus. The first stage comprises face detection using a Haar cascade, facial landmark estimation, face alignment, aligned-face landmark detection, segmentation of the eye region, and detection of strabismus using a VGG 16 convolutional neural network. Face alignment transforms the face to a canonical pose to ensure consistency in subsequent analysis. Using the facial landmarks, the eye region is segmented from the aligned face and fed into the VGG 16 CNN model, which has been trained to classify strabismus. The CNN determines whether strabismus is present and classifies the type of strabismus (exotropia, esotropia, and vertical deviation). If stage 1 detects strabismus, the eye region image is fed into stage 2, which starts with the estimation of pupil center coordinates using a Mask R-CNN deep neural network. Then, the distance between the pupil coordinates and the eye landmarks is calculated, along with the angle that the pupil coordinates make with the horizontal and vertical axes. The distance and angle information is used to characterize the degree and direction of the strabismic eye misalignment. This model was tested on 100 clinically labeled images of children with (n = 50) and without (n = 50) strabismus. The True Positive Rate (TPR) and False Positive Rate (FPR) of the first stage were 94% and 6%, respectively. The classification stage produced a TPR of 94.73%, 94.44%, and 100% for esotropia, exotropia, and vertical deviation, respectively. The method also had an FPR of 5.26%, 5.55%, and 0% for esotropia, exotropia, and vertical deviation, respectively. The addition of one more feature related to the location of corneal light reflections may reduce the FPR, which was primarily due to children with pseudo-strabismus (the appearance of strabismus due to a wide nasal bridge or skin folds on the nasal side of the eyes).
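
Stage 2's geometric step is simple enough to sketch: given a pupil center and the two eye-corner landmarks, compute the offset distance and its angle with the horizontal axis as a proxy for deviation direction. The landmark coordinates below are invented, and this is only an interpretation of the step described above.

```python
import numpy as np

def deviation_metrics(pupil, inner_corner, outer_corner):
    """Describe pupil offset relative to the eye-corner midline.

    Returns the offset distance normalized by eye width (scale invariant)
    and the angle of the offset with the horizontal axis, in degrees.
    """
    pupil, inner, outer = map(np.asarray, (pupil, inner_corner, outer_corner))
    eye_center = (inner + outer) / 2
    eye_width = np.linalg.norm(outer - inner)
    offset = pupil - eye_center
    distance = np.linalg.norm(offset) / eye_width
    angle = np.degrees(np.arctan2(offset[1], offset[0]))
    return distance, angle

# Invented pixel coordinates for one eye.
d, a = deviation_metrics(pupil=(212, 143), inner_corner=(190, 145),
                         outer_corner=(240, 141))
print(f"normalized offset = {d:.3f} eye widths, angle = {a:.1f} deg")
# Large horizontal offsets suggest eso-/exotropia; vertical ones, vertical deviation.
```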

Keywords: strabismus, deep neural networks, face detection, facial landmarks, face alignment, segmentation, VGG 16, mask R-CNN, pupil coordinates, angle deviation, horizontal and vertical deviation

Procedia PDF Downloads 92