Search results for: processing map
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3692

1652 The Impact of the General Data Protection Regulation on Human Resources Management in Schools

Authors: Alexandra Aslanidou

Abstract:

The General Data Protection Regulation (GDPR), concerning the protection of natural persons within the European Union with regard to the processing of personal data and the free movement of such data, became applicable in the European Union (EU) on 25 May 2018 and transformed the way personal data were treated under the Data Protection Directive (DPD) regime, generating sweeping organizational changes in both the public sector and business. A social practice whose day-to-day operations are considerably influenced is Human Resource (HR) management, for which the importance of the GDPR cannot be overstated, because HR processes personal data of all shapes and sizes coming from many different systems and sources. The proper functioning of an HR department is decisive in human-centered, service-oriented environments such as education: HR operations in schools, conducted effectively, determine the quality of the services provided and consequently have a considerable impact on the success of the educational system. The purpose of this paper is to analyze the decisive role that the GDPR plays in HR departments operating in schools and, in order to evaluate the practical aftermath of the Regulation during the first months of its applicability, a comparative use-case analysis in five highly dynamic schools across three EU Member States was undertaken.

Keywords: general data protection regulation, human resource management, educational system

Procedia PDF Downloads 100
1651 Automatic Tagging and Accuracy in Assamese Text Data

Authors: Chayanika Hazarika Bordoloi

Abstract:

This paper is an attempt to work on Assamese, a highly inflectional language that is also one of the national languages of India and for which very little computational research has been done. Building a language processing tool for a natural language is not straightforward, as the standard and the language representation change at various levels. This paper presents the inflectional suffixes of Assamese verbs and shows how statistical tools, combined with linguistic features, can improve tagging accuracy. A conditional random field (CRF) tool was used to train and automatically tag the text data; accuracy improved further after linguistic features were fed into the training data. Because Assamese is highly inflectional, standardizing its morphology is challenging, so inflectional suffixes are used as a feature of the text data. In order to analyze the inflections of Assamese word forms, a list comprising all possible suffixes that the various word categories can take was prepared. Assamese words can be classified into inflected classes (noun, pronoun, adjective, and verb) and un-inflected classes (adverb and particle). The mixed corpus used for this morphological analysis contains a large number of tokens and has given satisfactory accuracy. The accuracy rate of the tagger gradually improved with the modified training data.
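
As an illustration of the general approach described above (not the authors' implementation), the sketch below shows how suffix-based features can be fed to a CRF tagger. It assumes the sklearn-crfsuite library; the toy suffix list, feature names, and example sentence are placeholders.

```python
# Hedged sketch: CRF POS tagging with suffix features (sklearn-crfsuite assumed).
import sklearn_crfsuite

# Toy suffix list standing in for the Assamese verb-suffix inventory (hypothetical).
SUFFIXES = ["iso", "ise", "ilo", "ibo", "e", "a"]

def word_features(sentence, i):
    word = sentence[i]
    feats = {
        "word.lower": word.lower(),
        "prefix2": word[:2],
        "is_first": i == 0,
        "is_last": i == len(sentence) - 1,
    }
    # Linguistic feature: mark any known inflectional suffix the word ends with.
    for suf in SUFFIXES:
        if word.endswith(suf):
            feats["suffix"] = suf
            break
    return feats

def sent_features(sentence):
    return [word_features(sentence, i) for i in range(len(sentence))]

# Tiny illustrative training set (tokens and tags are placeholders, not real annotations).
train_sents = [(["mai", "bhat", "khaiso"], ["PRON", "NOUN", "VERB"])]
X_train = [sent_features(tokens) for tokens, _ in train_sents]
y_train = [tags for _, tags in train_sents]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=100)
crf.fit(X_train, y_train)
print(crf.predict(X_train))
```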

Keywords: CRF, morphology, tagging, tagset

Procedia PDF Downloads 194
1650 FPGA Implementation of a Marginalized Particle Filter for Delineation of P and T Waves of ECG Signal

Authors: Jugal Bhandari, K. Hari Priya

Abstract:

The ECG signal provides important clinical information that can be used to predict heart-related diseases; accordingly, delineation of the ECG signal is an important task, and delineation of the P and T waves in particular is a complex one. This paper deals with the study and analysis of the ECG signal by means of a Verilog design of efficient filters together with MATLAB. It includes the generation and simulation of the ECG signal from real-time ECG data, and ECG signal filtering and processing through the analysis of different algorithms and techniques. We design a basic particle filter which generates a dynamic model depending on the present and past input samples and then produces the desired output. Afterwards, the output is processed in MATLAB to obtain the actual shape and accurate values of the ranges of the P-wave and T-wave of the ECG signal. QuestaSim, a Mentor Graphics tool, is used for simulation and functional verification. The same design is verified again using Xilinx ISE, which is also used for synthesis, mapping, and bit-file generation. A Xilinx FPGA board is used for implementation of the system, and the final FPGA results are verified with ChipScope Pro, where the output data can be observed.
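
For readers unfamiliar with the technique, the following minimal sketch illustrates the generic bootstrap particle filter loop (predict, weight, resample) in Python/NumPy; the state model, noise levels, and observation are toy placeholders, not the authors' ECG model or their hardware design.

```python
# Hedged sketch: generic bootstrap particle filter on a toy signal.
import numpy as np

rng = np.random.default_rng(0)
N = 500                      # number of particles
T = 100                      # number of time steps

# Toy latent signal and noisy observations (placeholder for an ECG-like waveform).
true_x = np.sin(0.2 * np.arange(T))
obs = true_x + rng.normal(0, 0.2, T)

particles = rng.normal(0, 1, N)      # initial particle cloud
weights = np.full(N, 1.0 / N)
estimates = np.zeros(T)

for t in range(T):
    # Predict: propagate particles through a simple random-walk dynamic model.
    particles += rng.normal(0, 0.1, N)
    # Update: weight particles by the likelihood of the current observation.
    weights *= np.exp(-0.5 * ((obs[t] - particles) / 0.2) ** 2)
    weights += 1e-300                 # guard against all-zero weights
    weights /= weights.sum()
    # Estimate: weighted mean of the particle cloud.
    estimates[t] = np.sum(weights * particles)
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles = particles[idx]
        weights = np.full(N, 1.0 / N)

print("RMSE:", np.sqrt(np.mean((estimates - true_x) ** 2)))
```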

Keywords: ECG, MATLAB, Bayesian filtering, particle filter, Verilog hardware descriptive language

Procedia PDF Downloads 367
1649 The Advancements of Transformer Models in Part-of-Speech Tagging System for Low-Resource Tigrinya Language

Authors: Shamm Kidane, Ibrahim Abdella, Fitsum Gaim, Simon Mulugeta, Sirak Asmerom, Natnael Ambasager, Yoel Ghebrihiwot

Abstract:

The call for natural language processing (NLP) systems for low-resource languages has become more apparent than ever in the past few years, yet arduous challenges remain in preparing such systems. This paper presents an improved version of the Nagaoka Tigrinya Corpus for a part-of-speech (POS) classification system in the Tigrinya language. The initial Nagaoka dataset was extended, bringing the new tagged corpus to 118K tokens annotated with the 12 basic POS tags used previously. The additional content was annotated manually in a stringent manner, following rules similar to those of the former dataset, and was formatted in the CoNLL format. The system makes use of recent approaches in NLP, employing the monolingually pre-trained TiELECTRA, TiBERT, and TiRoBERTa transformer models. The highest achieved score is an impressive weighted F1-score of 94.2%, which surpasses previous systems by a significant margin. The system will prove useful for the progress of NLP-related tasks for Tigrinya and similar low-resource languages, with room for cross-referencing higher-resource languages.
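
A minimal sketch of the kind of fine-tuning pipeline this implies, using the Hugging Face Transformers and Datasets libraries. The checkpoint path, label subset, toy example, and hyperparameters are assumptions, not the authors' setup.

```python
# Hedged sketch: fine-tuning a pre-trained transformer for POS (token classification).
from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                          TrainingArguments, Trainer)
from datasets import Dataset

labels = ["NOUN", "VERB", "ADJ", "ADP"]            # subset of a 12-tag POS set (assumed)
checkpoint = "path/to/tigrinya-checkpoint"         # placeholder for the monolingual model

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint, num_labels=len(labels))

# Tiny illustrative CoNLL-style example (tokens plus tag indices), not real annotations.
raw = Dataset.from_dict({"tokens": [["ሰላም", "ዓለም"]], "tags": [[0, 0]]})

def encode(example):
    enc = tokenizer(example["tokens"], is_split_into_words=True, truncation=True)
    # Align word-level tags to sub-word tokens; special tokens get the ignore index -100.
    word_ids = enc.word_ids()
    enc["labels"] = [-100 if w is None else example["tags"][w] for w in word_ids]
    return enc

dataset = raw.map(encode, remove_columns=["tokens", "tags"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tigrinya-pos", num_train_epochs=3,
                           per_device_train_batch_size=8),
    train_dataset=dataset,
)
trainer.train()
```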

Keywords: Tigrinya POS corpus, TiBERT, TiRoBERTa, conditional random fields

Procedia PDF Downloads 103
1648 Effect of Citric Acid and Clove on Cured Smoked Meat: A Traditional Meat Product

Authors: Esther Eduzor, Charles A. Negbenebor, Helen O. Agu

Abstract:

Smoking enhances the taste and look of meat, increases its shelf life, and helps preserve it by slowing the spoilage of fat and the growth of bacteria. Lean meat from the forequarter of a beef carcass was obtained from the Maiduguri abattoir. The meat was cut into four portions with weights ranging from 525 to 545 g, then into bits about 8 cm long and 3.5 cm thick, each weighing 64.5 g. Meat samples were washed, cured with various concentrations of sodium chloride, sodium nitrate, citric acid, and clove for 30 min, drained, and smoked in a smoking kiln at a temperature range of 55-60 °C for 8 h a day for 3 days. The products were stored at ambient temperature and evaluated microbiologically and organoleptically. During processing and storage there were increases in pH and free fatty acid content, and a decrease in the water-holding capacity and microbial count of the cured smoked meat. The panelists rated the control samples significantly (p < 0.05) higher in terms of colour, texture, taste, and overall acceptability. The following organisms were isolated and identified during storage: Bacillus species, Bacillus subtilis, Streptococcus, Pseudomonas, Aspergillus niger, Candida, and Penicillium species. The study forms a basis for new product development in the meat industry.

Keywords: citric acid, cloves, smoked meat, bioengineering

Procedia PDF Downloads 445
1647 Study of Aerosol Deposition and Shielding Effects on Fluorescent Imaging Quantitative Evaluation in Protective Equipment Validation

Authors: Shinhao Yang, Hsiao-Chien Huang, Chin-Hsiang Luo

Abstract:

The leakage of protective clothing is an important issue in the occupational health field, and there is no established quantitative method for measuring the leakage of personal protective equipment. This work aims to measure such leakage quantitatively using a fluorochrome aerosol tracer. Fluorescent aerosols were employed as airborne particulates in a controlled chamber together with ultraviolet (UV) light-detectable stickers. After an exposure-and-leakage test, the protective equipment was removed and photographed under UV scanning to evaluate the areas, colour depth ratio, and aerosol deposition and shielding effects of the regions where fluorescent aerosols had penetrated the protective equipment and adhered to the body. On this basis, calculation software was built for the quantitative leakage ratio of protective clothing, based on the fluorescent illumination depth/aerosol concentration ratio, the illumination/Fa ratio, aerosol deposition and shielding effects, and the leakage area ratio of each segment. The results indicated that the two-repetition total leakage rates of the X, Y, and Z type protective clothing for subject T were about 3.05, 4.21, and 3.52 mg/m², respectively; for five repetitions, the leakage rates were about 4.12, 4.52, and 5.11 mg/m².
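
A minimal sketch of how a fluorescent leakage-area ratio could be pulled out of a UV photograph with OpenCV; the colour thresholds, file name, and ratio definition are illustrative assumptions, not the authors' calibration or software.

```python
# Hedged sketch: estimating a fluorescent leakage-area ratio from a UV photograph.
import cv2
import numpy as np

img = cv2.imread("uv_photo.png")                  # hypothetical UV photograph
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Assume the fluorochrome glows green-yellow under UV; keep bright, saturated pixels.
lower = np.array([25, 80, 120])
upper = np.array([95, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# Leakage-area ratio: fluorescent pixels over total evaluated pixels.
leak_ratio = np.count_nonzero(mask) / mask.size

# Mean intensity of the fluorescent pixels as a crude proxy for colour depth.
v = hsv[:, :, 2]
mean_depth = float(v[mask > 0].mean()) if np.any(mask) else 0.0

print(f"leakage area ratio = {leak_ratio:.3f}, mean fluorescence intensity = {mean_depth:.1f}")
```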

Keywords: fluorochrome, deposition, shielding effects, digital image processing, leakage ratio, personal protective equipment

Procedia PDF Downloads 323
1646 Statistical Tools for SFRA Diagnosis in Power Transformers

Authors: Rahul Srivastava, Priti Pundir, Y. R. Sood, Rajnish Shrivastava

Abstract:

For the interpretation of sweep frequency response analysis (SFRA) signatures of transformers, different statistical techniques serve as effective tools for either phase-to-phase comparison or sister-unit comparison. In this paper, along with a discussion of SFRA, several statistical indicators such as the cross-correlation coefficient (CCF), root square error (RSQ), comparative standard deviation (CSD), absolute difference (DABS), mean square error (MSE), and min-max ratio (MM) are presented through several case studies. These methods require the sample data size and the spot frequencies of the SFRA signatures being compared. The techniques are based on power signal processing tools that simplify the results, and limits can be set for the severity of faults occurring in the transformer due to short-circuit forces or ageing. The advantages of using statistical techniques for analysing SFRA results are illustrated through several case studies, and the results obtained determine the state of the transformer.
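
A minimal sketch of these indicators computed on two magnitude traces sampled at the same spot frequencies; the formulas below are commonly used forms and the exact definitions in the paper may differ, and the synthetic traces are placeholders.

```python
# Hedged sketch: common statistical indicators for comparing two SFRA traces.
import numpy as np

def sfra_indicators(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    ccf = np.corrcoef(x, y)[0, 1]                              # cross-correlation coefficient
    mse = np.mean((x - y) ** 2)                                # mean square error
    rsq = np.sqrt(np.sum((x - y) ** 2))                        # root square error (assumed form)
    dabs = np.sum(np.abs(x - y)) / len(x)                      # absolute difference (assumed form)
    csd = np.std(x - y, ddof=1)                                # comparative standard deviation (assumed)
    mm = np.sum(np.minimum(x, y)) / np.sum(np.maximum(x, y))   # min-max ratio (assumed form)
    return {"CCF": ccf, "MSE": mse, "RSQ": rsq, "DABS": dabs, "CSD": csd, "MM": mm}

# Two synthetic magnitude responses at the same spot frequencies (reference vs. test).
freqs = np.logspace(1, 6, 200)
ref = 100.0 / (1.0 + freqs / 1e4)
test = ref * (1.0 + np.random.default_rng(1).normal(0, 0.02, ref.size))

print(sfra_indicators(ref, test))
```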

Keywords: absolute difference (DABS), cross correlation coefficient (CCF), mean square error (MSE), min-max ratio (MM-ratio), root square error (RSQ), standard deviation (CSD), sweep frequency response analysis (SFRA)

Procedia PDF Downloads 697
1645 Damage Identification Using Experimental Modal Analysis

Authors: Niladri Sekhar Barma, Satish Dhandole

Abstract:

Damage identification, in the context of safety, has become a fundamental research area for mechanical, civil, and aerospace engineering structures. This research aims to identify damage in a mechanical beam structure, quantify its severity or extent in terms of loss of stiffness, and obtain an updated analytical finite element (FE) model. An FE model is used for the analysis, and the location of damage for single and multiple damage cases is identified numerically using the modal strain energy method and the mode shape curvature method. Experimental data were acquired with an accelerometer; a Fast Fourier Transform (FFT) algorithm was applied to the measured signal, and post-processing was done in MEscopeVES software. The two data sets, from the numerical FE model and the experiments, are compared to locate the damage accurately. The extent of the damage is identified via modal frequencies using a mixed numerical-experimental technique, and mode shapes are compared using the Modal Assurance Criterion (MAC). The analytical FE model is adjusted by the direct method of model updating. The same study has been extended to real-life structures such as plate and GARTEUR structures.
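
The MAC comparison mentioned above has a standard closed form, MAC(i,j) = |φᵢᵀφⱼ|² / ((φᵢᵀφᵢ)(φⱼᵀφⱼ)); the sketch below evaluates it on synthetic mode shapes (the example data are invented, not the beam results).

```python
# Hedged sketch: Modal Assurance Criterion (MAC) between analytical and experimental
# mode shapes, in its standard form.
import numpy as np

def mac_matrix(phi_a, phi_e):
    """phi_a, phi_e: (n_dof, n_modes) arrays whose columns are mode shape vectors."""
    num = np.abs(phi_a.T @ phi_e) ** 2
    den = np.outer(np.einsum("ij,ij->j", phi_a, phi_a),
                   np.einsum("ij,ij->j", phi_e, phi_e))
    return num / den

# Toy example: three analytical modes vs. slightly perturbed "experimental" modes.
rng = np.random.default_rng(0)
phi_fe = np.linalg.qr(rng.normal(size=(10, 3)))[0]      # orthonormal columns
phi_exp = phi_fe + 0.05 * rng.normal(size=phi_fe.shape)

mac = mac_matrix(phi_fe, phi_exp)
print(np.round(mac, 3))   # values near 1 on the diagonal indicate well-correlated modes
```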

Keywords: damage identification, damage quantification, damage detection using modal analysis, structural damage identification

Procedia PDF Downloads 116
1644 Graph Codes - 2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval

Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje

Abstract:

Multimedia indexing and retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection: a higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based similarity algorithms are inefficient and computation-intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a projection of graphs into a 2D space (Graph Codes) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelization. In consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness, especially for multimedia indexing and retrieval, and can be applied to images, videos, audio, and text information.
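
The abstract does not define the Graph Code format itself; purely as an illustration of the underlying idea, replacing graph traversal with vectorized operations on a 2D matrix projection, the sketch below uses a made-up adjacency-matrix encoding over a fixed feature vocabulary.

```python
# Hedged sketch: comparing two feature graphs via 2D matrix encodings instead of
# traversal. The encoding is a made-up projection, not the paper's Graph Code format.
import numpy as np

VOCAB = ["person", "car", "tree", "sky", "road"]          # fixed node vocabulary (assumed)
IDX = {term: i for i, term in enumerate(VOCAB)}

def encode(nodes, edges):
    """Project a small feature graph onto a |V| x |V| matrix."""
    code = np.zeros((len(VOCAB), len(VOCAB)), dtype=np.int8)
    for n in nodes:
        code[IDX[n], IDX[n]] = 1                          # node present
    for a, b in edges:
        code[IDX[a], IDX[b]] = 1                          # directed relationship present
    return code

def similarity(code_a, code_b):
    """Fraction of matching non-zero cells -- one vectorized pass, no traversal."""
    inter = np.count_nonzero(np.logical_and(code_a, code_b))
    union = np.count_nonzero(np.logical_or(code_a, code_b))
    return inter / union if union else 1.0

g1 = encode(["person", "car", "road"], [("person", "car"), ("car", "road")])
g2 = encode(["person", "tree", "road"], [("person", "road")])
print(f"similarity = {similarity(g1, g2):.2f}")
```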

Keywords: indexing, retrieval, multimedia, graph algorithm, graph code

Procedia PDF Downloads 161
1643 Physical Properties and Elastic Studies of Fluoroaluminate Glasses Based on Alkali

Authors: C. Benhamideche

Abstract:

Fluoroaluminate glasses have been reported as the earliest heavy-metal fluoride glasses. In comparison with fluorozirconate glasses, they offer a similar set of optical features, but also some differences in their elastic and chemical properties. In practice they have been less developed because their stability against devitrification is lower than that of the most stable fluorozirconates. The purpose of this study was to investigate glass formation in the systems AlF3-YF3-PbF2-MgF2-MF2 (M = Li, Na, K). Synthesis was carried out in a room atmosphere using ammonium fluoride processing. After fining, the liquid was cast into a preheated brass mold and then annealed below the glass transition temperature for several hours. The samples were polished for optical measurements. Glass formation was investigated in a systematic way, using pseudo-ternary systems in order to allow several parameters to vary at the same time. We chose the most stable glass compositions for the determination of the physical properties, including characteristic temperatures, density, and elastic properties. Glass stability increases in multicomponent glasses. Bulk samples were prepared for physical characterization. These glasses are of potential interest for passive optical fibers because they are less sensitive to water attack than ZBLAN glass and mechanically stronger, and they are expected to have a larger damage threshold for laser power transmission.

Keywords: fluoride glass, aluminium fluoride, thermal properties, density, elastic properties

Procedia PDF Downloads 241
1642 Aerodynamics of Spherical Combat Platform Levitation

Authors: Aelina Franz

Abstract:

In recent years, the scientific community has witnessed a paradigm shift in the exploration of unconventional levitation methods, particularly in the domain of spherical combat platforms. This paper explores aerodynamics and levitational dynamics inherent in these spheres by examining interactions at the quantum level. Our research unravels the nuanced aerodynamic phenomena governing the levitation of spherical combat platforms. Through an analysis of the quantum fluid dynamics surrounding these spheres, we reveal the crucial interactions between air resistance, surface irregularities, and the quantum fluctuations that influence their levitational behavior. Our findings challenge conventional understanding, providing a perspective on the aerodynamic forces at play during the levitation of spherical combat platforms. Furthermore, we propose design modifications and control strategies informed by both classical aerodynamics and quantum information processing principles. These advancements not only enhance the stability and maneuverability of the combat platforms but also open new avenues for exploration in the interdisciplinary realm of engineering and quantum information sciences. This paper aims to contribute to levitation technologies and their applications in the field of spherical combat platforms. We anticipate that our work will stimulate further research to create a deeper understanding of aerodynamics and quantum phenomena in unconventional levitation systems.

Keywords: spherical combat platforms, levitation technologies, aerodynamics, maneuverable platforms

Procedia PDF Downloads 57
1641 Identifying the Structural Components of Old Buildings from Floor Plans

Authors: Shi-Yu Xu

Abstract:

The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be identified directly from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human error. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information is used to construct a digital structural model of the building; this information, particularly the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy, enabling a large-scale evaluation of structural vulnerability for numerous old buildings and providing ample time to arrange structural retrofitting for those at risk of significant damage or collapse during earthquakes.

Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence

Procedia PDF Downloads 89
1640 A Review on Existing Challenges of Data Mining and Future Research Perspectives

Authors: Hema Bhardwaj, D. Srinivasa Rao

Abstract:

Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets is commonly termed "big data." Big data mining and analysis are extremely helpful for business activities such as decision making, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations; these unique problems call for new computational and statistical paradigms. This paper offers an overview of the literature on big data mining, its process, and its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. The study presents the idea of data mining and analysis and the knowledge discovery techniques that have recently been developed, together with practical application systems. The conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses management's main big data and data mining challenges.

Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges

Procedia PDF Downloads 110
1639 Random Subspace Neural Classifier for Meteor Recognition in the Night Sky

Authors: Carlos Vera, Tetyana Baydyk, Ernst Kussul, Graciela Velasco, Miguel Aparicio

Abstract:

This article describes the Random Subspace Neural Classifier (RSC) for the recognition of meteors in the night sky. We used images of meteors entering the atmosphere at night between 8:00 p.m. and 5:00 a.m. The objective of this project is to classify meteor and star images (with stars as the image background); the monitoring of the sky and the classification of meteors are intended for future applications by scientists. The image database was collected from different websites. We worked with RGB images of 220x220 pixels stored in the BitMap (BMP) format. Window scanning and processing were then carried out for each image: the scan window from which the characteristics were extracted had a size of 20x20 pixels and a scanning step of 10 pixels. Brightness, contrast, and contour orientation histograms were used as inputs for the RSC. The RSC worked with two classes: 1) with meteors and 2) without meteors. Different tests were carried out by varying the number of training cycles and the number of images used for training and recognition, and the percentage error of the neural classifier was calculated. The results show a good RSC classifier response, with 89% correct recognition. The results of these experiments are presented and discussed.
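
A minimal sketch of the window-scanning stage described above (20x20 windows, step 10) with simple brightness and contrast statistics per window; the histogram definitions and the synthetic image are assumptions, not the RSC input coding used by the authors.

```python
# Hedged sketch: sliding-window feature extraction over a 220x220 night-sky image.
import numpy as np

def window_features(gray, win=20, step=10):
    """gray: 2D array. Returns one feature vector per 20x20 window (step 10)."""
    feats = []
    h, w = gray.shape
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            patch = gray[y:y + win, x:x + win].astype(float)
            brightness_hist, _ = np.histogram(patch, bins=8, range=(0, 255))
            contrast = patch.std()                      # simple contrast measure
            feats.append(np.concatenate([brightness_hist, [patch.mean(), contrast]]))
    return np.array(feats)

# Synthetic 220x220 "night sky" with a bright streak standing in for a meteor trail.
rng = np.random.default_rng(0)
img = rng.integers(0, 30, size=(220, 220))
for i in range(60):
    img[80 + i, 40 + i] = 255

features = window_features(img)
print(features.shape)       # (number of windows, features per window)
```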

Keywords: contour orientation histogram, meteors, night sky, RSC neural classifier, stars

Procedia PDF Downloads 139
1638 Multi-Modal Feature Fusion Network for Speaker Recognition Task

Authors: Xiang Shijie, Zhou Dong, Tian Dan

Abstract:

Speaker recognition is a crucial task in the field of speech processing, aimed at identifying individuals based on their vocal characteristics. However, existing speaker recognition methods face numerous challenges. Traditional methods primarily rely on audio signals, which often suffer from limitations in noisy environments, variations in speaking style, and insufficient sample sizes. Additionally, relying solely on audio features can sometimes fail to capture the unique identity of the speaker comprehensively, impacting recognition accuracy. To address these issues, we propose a multi-modal network architecture that simultaneously processes both audio and text signals. By gradually integrating audio and text features, we leverage the strengths of both modalities to enhance the robustness and accuracy of speaker recognition. Our experiments demonstrate significant improvements with this multi-modal approach, particularly in complex environments, where recognition performance has been notably enhanced. Our research not only highlights the limitations of current speaker recognition methods but also showcases the effectiveness of multi-modal fusion techniques in overcoming these limitations, providing valuable insights for future research.

Keywords: feature fusion, memory network, multimodal input, speaker recognition

Procedia PDF Downloads 33
1637 Solving Process Planning, Weighted Apparent Tardiness Cost Dispatching, and Weighted Processing plus Weight Due-Date Assignment Simultaneously Using a Hybrid Search

Authors: Halil Ibrahim Demir, Caner Erden, Abdullah Hulusi Kokcam, Mumtaz Ipek

Abstract:

Process planning, scheduling, and due-date assignment are three important manufacturing functions that are usually studied independently in the literature. There are hundreds of works on the IPPS and SWDDA problems but only a few on the IPPSDDA problem. Integrating these three functions is crucial because of the strong relationship between them; since the scheduling problem is NP-hard even without any integration, the integrated problem is even harder to solve. This study focuses on the integration of these functions. The sum of weighted tardiness, earliness, and due-date-related costs is used as the penalty function. Random search and hybrid metaheuristics are used to solve the integrated problem. Marginal improvement in random search is very high in the early iterations and drops sharply in later iterations; at that point, directed search contributes more to the marginal improvement than random search. In this study, random and genetic search methods are therefore combined to find better solutions. Results show that overall performance improves as the integration level increases.
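
A minimal sketch of the hybrid idea, random search for cheap early exploration, then a genetic search seeded with the best random solutions, on a toy single-machine weighted-tardiness objective; the objective, operators, and parameters are illustrative, not the authors' algorithm or integrated cost function.

```python
# Hedged sketch: hybrid random + genetic search on a toy weighted-tardiness problem.
import random

random.seed(0)
JOBS = 12
proc = [random.randint(1, 10) for _ in range(JOBS)]          # processing times
due = [random.randint(5, 60) for _ in range(JOBS)]           # due dates
wt = [random.randint(1, 5) for _ in range(JOBS)]             # tardiness weights

def cost(seq):
    """Sum of weighted tardiness for a job sequence."""
    t, total = 0, 0
    for j in seq:
        t += proc[j]
        total += wt[j] * max(0, t - due[j])
    return total

def random_search(iters):
    best = list(range(JOBS))
    random.shuffle(best)
    for _ in range(iters):
        cand = list(range(JOBS))
        random.shuffle(cand)
        if cost(cand) < cost(best):
            best = cand
    return best

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in parent b's order."""
    i, j = sorted(random.sample(range(JOBS), 2))
    child = a[i:j]
    child += [g for g in b if g not in child]
    return child

def genetic_search(seed_pop, generations=200):
    pop = [s[:] for s in seed_pop]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: len(pop) // 2]
        children = [crossover(random.choice(parents), random.choice(parents))
                    for _ in range(len(pop) - len(parents))]
        for c in children:                                    # swap mutation
            if random.random() < 0.2:
                i, j = random.sample(range(JOBS), 2)
                c[i], c[j] = c[j], c[i]
        pop = parents + children
    return min(pop, key=cost)

# Phase 1: cheap random exploration; Phase 2: directed genetic refinement.
seeds = [random_search(200) for _ in range(10)]
best = genetic_search(seeds)
print("best weighted tardiness:", cost(best))
```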

Keywords: process planning, genetic algorithm, hybrid search, random search, weighted due-date assignment, weighted scheduling

Procedia PDF Downloads 362
1636 Electrochemical Treatment and Chemical Analyses of Tannery Wastewater Using Sacrificial Aluminum Electrode, Ethiopia

Authors: Dessie Tibebe, Muluken Asmare, Marye Mulugeta, Yezbie Kassa, Zerubabel Moges, Dereje Yenealem, Tarekegn Fentie, Agmas Amare

Abstract:

The performance of electrocoagulation (EC) with aluminium electrodes for the treatment of chromium-containing effluent in a fixed-bed electrochemical batch reactor was studied. In the present work, the efficiency of EC in removing physicochemical parameters and heavy metals from real industrial tannery wastewater in the Amhara region, collected from Bahir Dar, Debre Brihan, and Haik, was evaluated. The treated and untreated samples were analysed by AAS and ICP-OES spectrophotometers. The results indicated that the selected heavy metals were removed in all experiments with high removal percentages. The optimal results in terms of both cost and electrocoagulation efficiency were obtained with initial pH = 3, initial concentration = 40 mg/L, electrolysis time = 30 min, current density = 40 mA/cm², and temperature = 25 °C, which favored metal removal. The maximum removal percentages of the selected metals were 84.42% for Haik, 92.64% for Bahir Dar, and 94.90% for Debre Brihan. The sacrificial electrode and sludge were characterized by FT-IR, SEM, and XRD. After treatment, some metals such as chromium can be reused as a tanning agent in leather processing, promoting a circular economy.

Keywords: electrochemical, treatment, aluminum, tannery effluent

Procedia PDF Downloads 110
1635 Investigation of Glacier Activity Using Optical and Radar Data in Zardkooh

Authors: Mehrnoosh Ghadimi, Golnoush Ghadimi

Abstract:

Precise monitoring of glacier velocity is critical in determining glacier-related hazards. Zardkooh Mountain, in the Zagros mountainous region of Iran, was studied in terms of its glacial activity rate. In this study, we assessed the ability of optical and radar imagery to derive glacier-surface velocities in mountainous terrain, processing Landsat 8 for optical data and Sentinel-1A for radar data. We used methods that are commonly applied to measure glacier surface movements, such as cross-correlation of optical and radar satellite images, SAR tracking techniques, and multiple aperture InSAR (MAI). We also assessed the time series of glacier surface displacement using our modified method, Enhanced Small Baseline Subset (ESBAS). ESBAS has been implemented in the StaMPS software, with several aspects of the processing chain modified, including filtering prior to phase unwrapping, topographic correction within three-dimensional phase unwrapping, reduction of atmospheric noise, and removal of the ramp caused by ionospheric turbulence and/or orbit errors. Our findings indicate an average surface velocity rate of 32 mm/yr in the Zardkooh mountainous areas.

Keywords: active rock glaciers, Landsat 8, Sentinel-1A, Zagros mountainous region

Procedia PDF Downloads 77
1634 A Comparison between Underwater Image Enhancement Techniques

Authors: Ouafa Benaida, Abdelhamid Loukil, Adda Ali Pacha

Abstract:

In recent years, the growing interest of scientists in the processing and analysis of underwater images and videos has been strengthened by the emergence of new underwater exploration techniques, such as autonomous underwater vehicles and underwater image sensors, facilitating the exploration of underwater mineral resources as well as the search for new species of aquatic life by biologists. Underwater images and videos have several defects and must be preprocessed before analysis. Underwater scenes are usually darkened due to the interaction of light with the marine environment: light is absorbed as it travels through deep water, depending on its wavelength, and it does not follow a straight path but is scattered by microparticles in the water, resulting in low contrast, low brightness, colour distortion, and restricted visibility. Improving the underwater image is therefore more than necessary in order to facilitate its analysis. The research presented in this paper implements and evaluates a set of classical techniques used to improve the quality of underwater images in several colour representation spaces. These methods have the advantage of being simple to implement and of requiring no prior knowledge of the physical model underlying the degradation.
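
A minimal sketch of two of the classical techniques named in the keywords, global histogram equalization and CLAHE, applied on the lightness channel of the LAB colour space with OpenCV; the parameter values and file names are illustrative, not the settings evaluated in the paper.

```python
# Hedged sketch: histogram equalization and CLAHE on the L channel of an underwater frame.
import cv2

img = cv2.imread("underwater.jpg")                     # hypothetical input frame

# Work on the lightness channel so that chroma is left untouched.
lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
l, a, b = cv2.split(lab)

# Global histogram equalization of the L channel.
l_he = cv2.equalizeHist(l)
he = cv2.cvtColor(cv2.merge((l_he, a, b)), cv2.COLOR_LAB2BGR)

# Contrast Limited Adaptive Histogram Equalization (CLAHE) of the L channel.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
l_cl = clahe.apply(l)
cl = cv2.cvtColor(cv2.merge((l_cl, a, b)), cv2.COLOR_LAB2BGR)

cv2.imwrite("underwater_he.jpg", he)
cv2.imwrite("underwater_clahe.jpg", cl)
```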

Keywords: underwater image enhancement, histogram normalization, histogram equalization, contrast limited adaptive histogram equalization, single-scale retinex

Procedia PDF Downloads 89
1633 Motion Detection Method for Clutter Rejection in the Bio-Radar Signal Processing

Authors: Carolina Gouveia, José Vieira, Pedro Pinho

Abstract:

Cardiopulmonary signal monitoring without contact electrodes or any type of in-body sensor has several applications, such as sleep monitoring and the continuous monitoring of vital signs in bedridden patients. Such a system also has applications in the vehicular environment to monitor the driver, in order to avoid accidents in case of cardiac failure. The bio-radar system proposed in this paper can measure vital signs accurately by using the Doppler effect, which relates the received signal properties to the change in distance between the radar antennas and the person's chest wall. Since the bio-radar aims to monitor subjects in real time and over long periods, it is impossible to guarantee patient immobilization, and their random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, as well as its simulation in MATLAB. The algorithm used for breathing rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted to show that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.

Keywords: bio-signals, DC component, Doppler effect, ellipse fitting, radar, SDR

Procedia PDF Downloads 141
1632 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks

Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam

Abstract:

In recent years, convolutional neural networks (CNN) have demonstrated high performance in image analysis, but oftentimes only structured data are available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. Applying a single neural network to multimodal data, e.g., both structured and unstructured information, offers significant advantages in terms of time complexity and energy efficiency. Converting structured data into images and merging them with existing visual material is therefore a promising solution for applying CNNs to multimodal datasets, as they often occur in a medical context. By employing suitable preprocessing techniques, structured data are transformed into image representations in which the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information, and the combined image is then analyzed using a CNN.
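
A minimal sketch of the transformation step, scaling a row of tabular features, tiling it into an image, and stacking it with an existing image as an extra channel for a CNN; the scaling and tiling scheme, image size, and example values are assumptions, not the paper's preprocessing.

```python
# Hedged sketch: turning a tabular row into an image tile and fusing it with an image.
import numpy as np

def row_to_tile(row, size=32):
    """Min-max scale a feature vector and tile it into a size x size grayscale image."""
    row = np.asarray(row, dtype=float)
    scaled = (row - row.min()) / (row.max() - row.min() + 1e-9)
    reps = int(np.ceil(size * size / scaled.size))
    flat = np.tile(scaled, reps)[: size * size]
    return (flat.reshape(size, size) * 255).astype(np.uint8)

# Hypothetical inputs: one grayscale image and one row of tabular measurements.
image = np.random.default_rng(0).integers(0, 256, size=(32, 32), dtype=np.uint8)
tabular_row = [37.2, 80.0, 120.0, 1.0, 0.0, 5.4]        # e.g. clinical measurements

tile = row_to_tile(tabular_row, size=32)
fused = np.stack([image, tile], axis=-1)                 # (32, 32, 2) multimodal input
print(fused.shape)                                       # feed this tensor to a CNN
```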

Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion

Procedia PDF Downloads 123
1631 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic

Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi

Abstract:

In the genomics field, huge amounts of data are produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been postulated that more than one billion bases would be produced per year by 2020, a growth rate much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, for example storing the data, searching for information, and finding hidden information, so an analysis platform for genomics big data is required. Cloud computing enables us to deal with big data more efficiently; Hadoop is one of the distributed computing frameworks and forms the core of Big Data as a Service (BDaaS). Although many services, e.g. Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g. sequencing data. Our algorithm consists of two parts: first, BDaaS is applied for handling the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing, and this step can be parallelized in the implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g. de novo genome assembly and sequence similarity search. We discuss our algorithm and its feasibility.

Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing

Procedia PDF Downloads 299
1630 Analysing Waste Management Options in the Printing Industry: Case of a South African Company

Authors: Stanley Fore

Abstract:

The case study company is one of the leading newsprint companies in South Africa. The company has achieved this status through operational expansion, diversification, and investment in cutting-edge technology, and has a reputation for the highest quality and personalised service that transcends borders and industries. The company offers a wide variety of small- and large-scale printing services. It faces the challenge of significant waste production during normal operations, generating 1200 kg of plastic waste and 60-70 tonnes of paper waste per month. The company currently operates a waste management process whereby waste paper is sold, at low cost, to recycling firms for further processing. Given the quantity of waste being generated, the company has embarked on a venture to find a more profitable solution to its current waste production. As waste management and recycling are not the company's core business, the aim of the venture is to implement a secondary, profitable waste-processing business, expedited as a strategic project. This research aims to estimate the financial feasibility of a selected solution as well as the impact of non-financial considerations. The financial feasibility is analysed using metrics such as payback period, internal rate of return, and net present value.
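
A minimal sketch of the three feasibility metrics named above, payback period, net present value (NPV), and internal rate of return (IRR), computed for a hypothetical cash-flow series; the figures are invented, not the company's data.

```python
# Hedged sketch: payback period, NPV, and IRR for an invented investment cash flow.
import numpy as np

cash_flows = [-500_000, 120_000, 150_000, 180_000, 200_000, 220_000]  # year 0..5
rate = 0.10                                                           # discount rate (assumed)

def npv(rate, flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=-0.99, hi=10.0, tol=1e-6):
    """IRR by bisection: the discount rate at which NPV crosses zero."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(flows):
    """Years until cumulative (undiscounted) cash flow turns positive."""
    cumulative = np.cumsum(flows)
    year = int(np.argmax(cumulative >= 0))
    return year if cumulative[year] >= 0 else None

print(f"NPV @ {rate:.0%}: {npv(rate, cash_flows):,.0f}")
print(f"IRR: {irr(cash_flows):.2%}")
print(f"Payback period: {payback_period(cash_flows)} years")
```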

Keywords: waste, printing industry, up-cycling, management

Procedia PDF Downloads 262
1629 Encapsulation of Satureja khuzestanica Essential Oil in Chitosan Nanoparticles with Enhanced Antifungal Activity

Authors: Amir Amiri, Naghmeh Morakabati

Abstract:

In recent years, the six-fold growth of cancer in Iran has made the production of healthy products a challenge for the food industry. Due to the young population of the country, the consumption of fast food is growing, and chemical carcinogenic preservatives are used in these products beyond the standard limits, so an appropriate alternative seems important. On the one hand, plant essential oils show high antimicrobial potential against pathogenic and spoilage microorganisms; on the other hand, they are highly volatile and decompose under processing conditions. This study aims to produce chitosan nanoparticles loaded with different concentrations of savory essential oil in order to improve the antimicrobial property and increase the resistance of the essential oil to oxygen and heat. The encapsulation efficiency was in the range of 32.07% to 39.93%, the particle size distribution of the samples was in the range of 159 to 210 nm, and the zeta potential ranged from -11.9 to -23.1 mV. The essential oil loaded in chitosan showed stronger antifungal activity against Rhizopus stolonifer. The results showed that the antioxidant property is directly related to the concentration of loaded essential oil: the antioxidant property increases with increasing essential oil concentration. In general, it appears that savory essential oil loaded in chitosan particles can be used as a food preservative.

Keywords: chitosan, encapsulation, essential oil, nanogel

Procedia PDF Downloads 274
1628 Plasma Lipid Profiles and Atherogenic Indices of Rats Fed Raw and Processed Jack Fruit (Artocarpus heterophyllus) Seeds Diets at Different Concentrations

Authors: O. E. Okafor, L. U. S. Ezeanyika, C. G. Nkwonta, C. J. Okonkwo

Abstract:

The effects of processing on the plasma lipid profile and atherogenic indices of rats fed Artocarpus heterophyllus seed diets at different concentrations were investigated. Fifty-five rats were used for this study, divided into eleven groups of five rats each (one control group and ten test groups); the test groups were fed raw, boiled, roasted, fermented, and soaked seed diets at 10% and 40% concentrations. The study lasted thirty-five days. The diets led to a significant decrease (p < 0.05) in the plasma cholesterol and triacylglycerol of rats fed both the 10% and 40% diets, and a significant increase (p < 0.05) in high-density lipoprotein (HDL) levels at the 40% concentration. The diets also produced decreases in low-density lipoprotein (LDL), very-low-density lipoprotein (VLDL), cardiac risk ratio (CRR), atherogenic index of plasma (AIP), and atherogenic coefficient (AC) at the 40% concentration, except for the soaked group, which showed a slight elevation of LDL, CRR, AC, and AIP at 40%. Artocarpus heterophyllus seeds could be beneficial to health because of their ability to increase plasma HDL and reduce plasma LDL, VLDL, cholesterol, triglycerides, and atherogenic indices at the higher diet concentration.
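
The abstract names the indices but does not define them; the sketch below uses the commonly cited definitions (the paper's exact formulas may differ), with purely illustrative values.

```python
# Hedged sketch: commonly used definitions of the atherogenic indices named above.
import math

def atherogenic_indices(tc, tg, hdl, ldl=None):
    """tc, tg, hdl (and optionally ldl) in the same concentration units (e.g. mmol/L)."""
    if ldl is None:
        ldl = tc - hdl - tg / 2.2          # Friedewald estimate for mmol/L (assumed)
    return {
        "CRR (cardiac risk ratio)": tc / hdl,
        "AC (atherogenic coefficient)": (tc - hdl) / hdl,
        "AIP (atherogenic index of plasma)": math.log10(tg / hdl),
        "LDL/HDL": ldl / hdl,
    }

# Illustrative plasma values in mmol/L (not data from the study).
print(atherogenic_indices(tc=2.1, tg=0.9, hdl=1.1))
```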

Keywords: artocarpus heterophyllus, atherogenic indices, concentrations, lipid profile

Procedia PDF Downloads 302
1627 Development of Colorimetric Based Microfluidic Platform for Quantification of Fluid Contaminants

Authors: Sangeeta Palekar, Mahima Rana, Jayu Kalambe

Abstract:

In this paper, a microfluidic-based platform for the quantification of contaminants in water is proposed. The proposed system uses microfluidic channels with an embedded environment for contaminant detection in water. Microfluidics-based platforms represent an evident stage of innovation for fluid analysis, with various applications benefiting from minimal effort and simplicity of fabrication. A polydimethylsiloxane (PDMS)-based microfluidic channel was fabricated using a soft-lithography technique, and vertical and horizontal connections for fluid dispensing with the microfluidic channel were explored. The principle of colorimetry, which incorporates the use of the Griess reagent for the detection of nitrite, has been adopted. Nitrite has high water solubility and water retention, giving it a greater potential to remain in groundwater and endanger aquatic life as well as human health; it is therefore taken as a case study in this work. The developed platform also compares detection methodologies, using photodetectors to measure absorbance and image sensors to measure colour change for the quantification of contaminants such as nitrite in water. The use of image processing techniques offers the advantage of operational flexibility, as the same system can be used to identify other contaminants present in water with only minor software changes.

Keywords: colorimetric, fluid contaminants, nitrite detection, microfluidics

Procedia PDF Downloads 199
1626 Application of Box-Behnken Response Surface Design for Optimization of Essential Oil Based Disinfectant on Mixed Species Biofilm

Authors: Anita Vidacs, Robert Rajko, Csaba Vagvolgyi, Judit Krisch

Abstract:

With the optimization of a new disinfectant, the number of tests and the cost of processing can be decreased. Good sanitizers are eco-friendly and do not allow resistance to develop in bacteria. Essential oils (EOs) are natural antimicrobials, and most of them have Generally Recognized As Safe (GRAS) status. In our study, the effect of the EOs of cinnamon, marjoram, and thyme was investigated against mixed-species bacterial biofilms of Escherichia coli, Listeria monocytogenes, Pseudomonas putida, and Staphylococcus aureus. The optimal concentration of EOs, disinfection time, and pH level were evaluated with the aid of a Box-Behnken response surface design (RSD) on 1-day and 7-day-old biofilms on metal, plastic, and wood surfaces. The variable factors were in the range of 1-3 times the minimum bactericidal concentration (MBC), 10-110 minutes acting time, and pH 4.5-7.5. The optimized EO disinfectant was compared to chemicals used in industry (HC-DPE, Hypo). The natural-based disinfectants were applicable, with acting times below 30 minutes, and the EOs were able to eliminate the biofilm from the surfaces used, except for wood. The disinfection effect of the EO-based natural solutions was in most cases equivalent to or better than that of the chemical sanitizers used in the food industry.

Keywords: biofilm, Box-Behnken design, disinfectant, essential oil

Procedia PDF Downloads 220
1625 Role of Geomatics in Architectural and Cultural Conservation

Authors: Shweta Lall

Abstract:

The intent of this paper is to demonstrate the role of computerized auxiliary sciences in advancing the desired and necessary alliance of historians, surveyors, topographers, and analysts in architectural conservation and management. Digital-era practices of recording architectural and cultural heritage with a view to its preservation, dissemination, and planned development are discussed. Geomatics includes practices such as remote sensing, photogrammetry, surveying, Geographic Information Systems (GIS), and laser scanning technology. These resources support architectural and conservation applications, which are identified through the various case studies analysed in this paper. The standardised outcomes and methodologies are listed and described using relevant case studies. The main components of the geomatics methodology adopted in conservation are data acquisition, processing, and presentation. Geomatics is used in a wide range of activities involved in architectural and cultural heritage: damage and risk assessment, documentation, 3-D model construction, virtual reconstruction, spatial and structural decision-making analysis, and monitoring. This paper summarises the capabilities and limitations of geomatics in architectural and cultural conservation. Policy-makers, urban planners, architects, and conservationists not only need answers to these questions but also need to apply them in a predictable, transparent, spatially explicit, and inexpensive manner.

Keywords: architectural and cultural conservation, geomatics, GIS, remote sensing

Procedia PDF Downloads 147
1624 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control

Authors: Ming-Yen Chang, Sheng-Hung Ke

Abstract:

This research focuses on the development of a speed bump identification system for the real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were used for training and recognition with the deep learning object detection algorithm YOLOv5. The trained speed bump identification program was then integrated with an in-vehicle camera system for live image capture during driving; these images were instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined, and the appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In combination with the electronically adjustable shock absorbers equipped in the vehicle, a shock absorber control system was devised to dynamically adapt the damping force just before encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
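
A minimal sketch of the monocular ranging step under a pinhole-camera similar-triangles model, distance ≈ f[px]·H[m]/h[px]; the focal length, bump height, control distance, and detection box are invented calibration values, not the paper's measurements.

```python
# Hedged sketch: monocular distance estimation from a YOLO-style detection box.
def monocular_distance(focal_length_px, real_height_m, bbox_height_px):
    """Distance (m) ~= f[px] * H[m] / h[px] for a pinhole camera model."""
    return focal_length_px * real_height_m / bbox_height_px

# Hypothetical calibration values and a detection box (x1, y1, x2, y2) in pixels.
FOCAL_PX = 1000.0          # from camera calibration (assumed)
BUMP_HEIGHT_M = 0.05       # typical rubber speed bump height (assumed)
CONTROL_DISTANCE_M = 8.0   # distance at which the damping force is adjusted (assumed)

bbox = (420, 650, 880, 670)
h_px = bbox[3] - bbox[1]

d = monocular_distance(FOCAL_PX, BUMP_HEIGHT_M, h_px)
print(f"estimated distance: {d:.1f} m")
if d <= CONTROL_DISTANCE_M:
    print("soften adjustable shock absorbers now")
```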

Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride

Procedia PDF Downloads 66
1623 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry generates large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratories, and pharmacies. Healthcare data are so big and complex that they cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data are accumulated, and the different varieties of data: structured, semi-structured, and unstructured. Despite this complexity, if the trends and patterns that exist within the big data are uncovered and analyzed, higher-quality healthcare can be provided at lower cost. Hadoop is an open-source software framework for the distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System (HDFS), which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), and HBase (a columnar data store). This paper analyses how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
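
A minimal sketch of the MapReduce programming model on a healthcare-flavoured example, counting diagnosis codes across patient records; the CSV record layout is an assumption, not a real EPR schema, and the local loop merely stands in for Hadoop's distributed shuffle-and-sort.

```python
# Hedged sketch: MapReduce-style counting of diagnosis codes, written so the same
# map and reduce functions could back a Hadoop Streaming job.
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    """Emit (diagnosis_code, 1) for every record line."""
    for line in records:
        fields = line.strip().split(",")
        if len(fields) >= 2:
            yield fields[1], 1

def reduce_phase(pairs):
    """Sum counts per key; assumes pairs arrive sorted by key, as Hadoop guarantees."""
    for key, group in groupby(pairs, key=itemgetter(0)):
        yield key, sum(count for _, count in group)

# Local, single-process run standing in for the distributed execution.
sample = ["p001,E11.9,2024-01-03", "p002,I10,2024-01-04", "p003,E11.9,2024-01-05"]
shuffled = sorted(map_phase(sample), key=itemgetter(0))
for code, total in reduce_phase(shuffled):
    print(code, total)
```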

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 413