Search results for: automatic identification system
19482 Scientific Investigation for an Ancient Egyptian Polychrome Wooden Stele
Authors: Ahmed Abdrabou, Medhat Abdalla
Abstract:
The studied stele dates back to the Third Intermediate Period (1075-664 B.C.) of ancient Egypt. It is made of wood and covered with painted gesso layers. This study aims to use a combination of multispectral imaging {visible, infrared (IR), visible-induced infrared luminescence (VIL), visible-induced ultraviolet luminescence (UVL) and ultraviolet reflected (UVR)} along with portable X-ray fluorescence in order to map and identify the pigments as well as to provide a deeper understanding of the painting techniques. Moreover, the authors were significantly interested in the identification of the wood species. Multispectral images were acquired in three spectral bands - ultraviolet (360-400 nm), visible (400-780 nm) and infrared (780-1100 nm) - using ultraviolet-induced luminescence (UVL), ultraviolet reflected (UVR), visible (VIS), visible-induced infrared luminescence (VIL) and infrared photography. False color images were made by digitally combining the VIS with IR or UV images using Adobe Photoshop. Optical microscopy (OM), portable X-ray fluorescence spectroscopy (p-XRF) and Fourier transform infrared spectroscopy (FTIR) were used in this study. Mapping and imaging techniques provided useful information about the spatial distribution of pigments; in particular, visible-induced luminescence (VIL) allowed the spatial distribution of Egyptian blue pigment to be mapped, and every region containing Egyptian blue, even down to single crystals in some instances, is clearly visible as a bright white area; however, complete characterization of the pigments requires the use of p-XRF spectroscopy. Based on the elemental analysis obtained by p-XRF, we conclude that the artists used mixtures of the basic mineral pigments to achieve a wider palette of hues. Microscopic identification of the wood species indicated that the wood used was sycamore fig (Ficus sycomorus L.), which is recorded as being native to Egypt and has been used to make wooden artifacts since at least the Fifth Dynasty. Keywords: polychrome wooden stele, multispectral imaging, IR luminescence, Wood identification, Sycamore Fig, p-XRF
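As a brief illustration of the false-color step described above, the sketch below assembles an infrared false-color composite from registered visible and IR captures in Python; the file names, channel assignment and scaling are assumptions for illustration, not the authors' Photoshop workflow.

```python
import numpy as np
from PIL import Image

# Load registered captures of the same stele region (file names are assumed).
vis = np.asarray(Image.open("stele_vis.png").convert("RGB"), dtype=np.float32)
ir = np.asarray(Image.open("stele_ir.png").convert("L"), dtype=np.float32)

# Classic IR false-color convention: IR -> red, visible red -> green, visible green -> blue.
false_color = np.stack([ir, vis[:, :, 0], vis[:, :, 1]], axis=-1)

# Rescale to 8-bit for display/saving.
false_color = 255 * (false_color - false_color.min()) / (false_color.ptp() + 1e-9)
Image.fromarray(false_color.astype(np.uint8)).save("stele_irfc.png")
```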
Procedia PDF Downloads 264
19481 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories
Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos
Abstract:
Due to the high reliability reached by DNA tests, since the 1980s this kind of test has allowed the identification of a growing number of criminal cases, including old cases that were unsolved and now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles of a growing number of samples, which requires time and great storage capacity. Therefore, it is essential to develop methodologies capable of organizing the work and minimizing the time spent on both biological sample processing and the analysis of genetic profiles, using software tools. Thus, the present work aims at the development of a software system for forensic genetics laboratories that allows sample, criminal case and local database management, minimizes the time spent in the workflow and helps to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows and the requirements incorporated into the system have been considered. The system uses the following software languages: HTML, CSS, and JavaScript in Web technology, with the NodeJS platform as the server, which has great efficiency in the input and output of data. In addition, the data are stored in a relational database (MySQL), which is free, allowing wider acceptance among users. The software system developed here brings more agility to the workflow and analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to an increased resolution of crimes. The next step of this research is its validation, so that it operates in accordance with current Brazilian national legislation. Keywords: database, forensic genetics, genetic analysis, sample management, software solution
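The abstract describes a NodeJS/MySQL web application; purely as an illustration of the kind of relational schema such a sample and case management system rests on, here is a minimal sketch using Python's sqlite3. All table and column names are hypothetical and not taken from the described system.

```python
import sqlite3

con = sqlite3.connect("forensic_lab.db")
cur = con.cursor()

# Minimal relational schema: one criminal case holds many biological samples,
# and each sample may yield one genetic profile.
cur.executescript("""
CREATE TABLE IF NOT EXISTS criminal_case (
    case_id INTEGER PRIMARY KEY,
    description TEXT,
    opened_on TEXT
);
CREATE TABLE IF NOT EXISTS sample (
    sample_id INTEGER PRIMARY KEY,
    case_id INTEGER REFERENCES criminal_case(case_id),
    sample_type TEXT,
    received_on TEXT
);
CREATE TABLE IF NOT EXISTS genetic_profile (
    profile_id INTEGER PRIMARY KEY,
    sample_id INTEGER REFERENCES sample(sample_id),
    str_markers TEXT   -- serialized STR loci / allele calls
);
""")

# Register a case and a sample linked to it.
cur.execute("INSERT INTO criminal_case (description, opened_on) VALUES (?, ?)",
            ("example case", "2017-03-01"))
cur.execute("INSERT INTO sample (case_id, sample_type, received_on) VALUES (?, ?, ?)",
            (cur.lastrowid, "blood swab", "2017-03-02"))
con.commit()
```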
Procedia PDF Downloads 370
19480 Comparing Russian and American Students’ Metaphorical Competence
Authors: Svetlana L. Mishlanova, Evgeniia V. Ermakova, Mariia E. Timirkina
Abstract:
The paper is concerned with the study of metaphor production in essays written by Russian and English native speakers within the framework of cognitive metaphor theory. It considers metaphorical competence as an individual's ability to recognize, understand and use metaphors in speech. The work analyzes the influence of visual metaphor on the production and density of conventional and novel verbal metaphors. The main research methods include an experiment connected with image interpretation, the metaphor identification procedure (MIPVU) and the visual conventional metaphor identification procedure proposed by the VisMet group. The research findings will be used in a project aimed at comparing the metaphorical competence of native and non-native English speakers. Keywords: metaphor, metaphorical competence, conventional, novel
Procedia PDF Downloads 286
19479 Combined Heat and Power Generation in Pressure Reduction City Gas Station (CGS)
Authors: Sadegh Torfi
Abstract:
Realization of the anticipated energy efficiency of recuperative run-around energy recovery (RER) systems requires identification of the influential parameters of the system components. Because simulation modeling is considered an integral part of the design and economic evaluation of RER systems, it is essential to calibrate the developed models and validate the performance predictions by comparison with data from experimental measurements. Several theoretical and numerical analyses of RER systems have been carried out by researchers, but the effect of the distance between the hot and cold flows is generally ignored. The objective of this study is to develop a thermohydraulic model for a typical RER system that accounts for energy loss from the interconnecting piping and for the effect of interconnecting pipe length on the performance of run-around energy recovery systems. Numerical simulation shows that the energy loss from the interconnecting piping changes linearly with pipe length and that, if the pipes are properly insulated, the maximum reduction in the effectiveness of RER systems is 2% in typical piping systems. Keywords: combined heat and power, heat recovery, effectiveness, CGS
Procedia PDF Downloads 200
19478 Vehicular Speed Detection Camera System Using Video Stream
Authors: C. A. Anser Pasha
Abstract:
In this paper, a new vehicular Speed Detection Camera System (SDCS), applicable as an alternative to traditional radars with the same or even better accuracy, is presented. The real-time measurement and analysis of various traffic parameters such as speed and number of vehicles are increasingly required in traffic control and management. Image processing techniques are now considered an attractive and flexible method for automatic analysis and data collection in traffic engineering. Various algorithms based on image processing techniques have been applied to detect multiple vehicles and track them. The SDCS process can be divided into three successive phases. The first phase is the object detection phase, which uses a hybrid algorithm combining an adaptive background subtraction technique with a three-frame differencing algorithm, rectifying the major drawback of using adaptive background subtraction alone. The second phase is object tracking, which consists of three successive operations: object segmentation, object labeling, and object center extraction. The object tracking operation takes into consideration the different possible scenarios of the moving object, such as simple tracking, the object leaving the scene, the object entering the scene, the object being crossed by another object, and one object leaving while another enters the scene. The third phase is the speed calculation phase, in which speed is calculated from the number of frames consumed by the object to pass through the scene. Keywords: radar, image processing, detection, tracking, segmentation
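A minimal OpenCV sketch of the detection phase described above, combining adaptive background subtraction with three-frame differencing; the input file name, thresholds and parameter values are assumptions, not the SDCS implementation.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")          # assumed input video
backsub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

ok, prev2 = cap.read()
ok, prev1 = cap.read()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in (prev2, prev1, frame)]

    # Three-frame differencing: a pixel is moving only if it differs from both temporal neighbours.
    d1 = cv2.absdiff(gray[1], gray[0])
    d2 = cv2.absdiff(gray[2], gray[1])
    motion = cv2.bitwise_and(cv2.threshold(d1, 25, 255, cv2.THRESH_BINARY)[1],
                             cv2.threshold(d2, 25, 255, cv2.THRESH_BINARY)[1])

    # Adaptive background subtraction, then combine the two masks.
    fg = backsub.apply(frame)
    combined = cv2.bitwise_or(motion, cv2.threshold(fg, 127, 255, cv2.THRESH_BINARY)[1])

    # Contours of `combined` give vehicle blobs; speed would follow from the number of
    # frames a tracked blob needs to cross the calibrated scene.
    prev2, prev1 = prev1, frame
```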
Procedia PDF Downloads 467
19477 Steady State Analysis of Distribution System with Wind Generation Uncertainty
Authors: Zakir Husain, Neem Sagar, Neeraj Gupta
Abstract:
Due to the increased penetration of renewable energy resources in the distribution system, the system is no longer passive in nature. In this paper, a steady state analysis of the distribution system has been carried out with the inclusion of wind generation. The wind turbine generator system and the wind generator have been modeled to obtain the average active and reactive power injections into the system. The study has been conducted on an IEEE 33-bus system with two wind generators. The present research work is useful not only to utilities but also to customers. Keywords: distributed generation, distribution network, radial network, wind turbine generating system
Procedia PDF Downloads 405
19476 The Clustering of Multiple Sclerosis Subgroups through L2 Norm Multifractal Denoising Technique
Authors: Yeliz Karaca, Rana Karabudak
Abstract:
Multifractal denoising techniques are used in the identification of significant attributes by removing the noise from the dataset. Magnetic resonance (MR) imaging is the most sensitive method for identifying chronic disorders of the nervous system such as multiple sclerosis. MRI and Expanded Disability Status Scale (EDSS) data belonging to 120 individuals who have one of the subgroups of MS (Relapsing Remitting MS (RRMS), Secondary Progressive MS (SPMS), Primary Progressive MS (PPMS)), as well as 19 healthy individuals in the control group, have been used in this study. The study comprises the following stages: (i) the L2 norm multifractal denoising technique, one of the multifractal techniques, has been applied to the MS data (MRI and EDSS), yielding a new dataset; (ii) the new MS dataset, obtained from the original MS dataset through the L2 multifractal denoising technique, has been applied to the K-Means and Fuzzy C-Means (FCM) clustering algorithms, which are among the unsupervised methods, and the clustering performances have been compared; (iii) in the identification of significant attributes in the MS dataset through multifractal denoising (L2 norm) using the K-Means and FCM algorithms on the MS subgroups and the control group of healthy individuals, excellent performance has been obtained. According to the clustering results based on the MS subgroups obtained in the study, successful clustering has been achieved with the K-Means and FCM algorithms by applying the L2 norm multifractal denoising technique to the MS dataset. Clustering performance has been more successful with the dataset (L2_Norm MS dataset) in which significant attributes are obtained by applying the L2 norm denoising technique. Keywords: clinical decision support, clustering algorithms, multiple sclerosis, multifractal techniques
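The clustering stage can be illustrated with a short scikit-learn sketch; the multifractal L2-norm denoising itself is not reproduced here, and the feature matrix below is simulated purely for illustration (the study uses denoised MRI/EDSS attributes and also compares Fuzzy C-Means, which follows the same pattern).

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import silhouette_score

# X: denoised feature matrix (rows = subjects), produced upstream by the L2-norm
# multifractal denoising step; simulated here for illustration only.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(35, 6)) for c in (0.0, 2.0, 4.0, 6.0)])
X = StandardScaler().fit_transform(X)

# Four clusters: RRMS, SPMS, PPMS and healthy controls.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
print("silhouette:", silhouette_score(X, km.labels_))
```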
Procedia PDF Downloads 168
19475 In-Context Meta Learning for Automatic Designing Pretext Tasks for Self-Supervised Image Analysis
Authors: Toktam Khatibi
Abstract:
Self-supervised learning (SSL) includes machine learning models that are trained on one aspect and/or one part of the input to learn other aspects and/or parts of it. SSL models are divided into two categories: pretext task-based models and contrastive learning ones. Pretext tasks are auxiliary tasks that learn from pseudo-labels, and the trained models are further fine-tuned for downstream tasks. However, one important disadvantage of SSL based on pretext task solving is the need to define an appropriate pretext task for each image dataset across a variety of image modalities. Therefore, it is required to design an appropriate pretext task automatically for each dataset and each downstream task. To the best of our knowledge, the automatic design of pretext tasks for image analysis has not been considered yet. In this paper, we present a framework based on in-context learning that describes each task based on its input and output data using a pre-trained image transformer. Our proposed method combines the input image and its learned description for optimizing the pretext task design and its hyper-parameters using meta-learning models. The representations learned from the pretext tasks are fine-tuned for solving the downstream tasks. We demonstrate that our proposed framework outperforms the compared ones on unseen tasks and image modalities, in addition to its superior performance on previously known tasks and datasets. Keywords: in-context learning (ICL), meta learning, self-supervised learning (SSL), vision-language domain, transformers
Procedia PDF Downloads 80
19474 Tool for Maxillary Sinus Quantification in Computed Tomography Exams
Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina
Abstract:
The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, serve thermoregulation, impart resonance to the voice, and more. Thus, the real function of the MS is still uncertain. Furthermore, the MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathologies in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and if possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) with features such as pixel value, spatial distribution, shape and others. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. In the statistical analyses for the comparison between both methods, the linear regression showed a strong association and low dispersion between variables. The Bland-Altman analyses showed no significant differences between the analyzed methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases. Keywords: maxillary sinus, support vector machine, region growing, volume quantification
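A minimal sketch of the SVM-seeded region-growing idea on a single CT slice, using scikit-learn and scikit-image; the features, tolerance and training data are assumptions and much simpler than the published Matlab tool.

```python
import numpy as np
from sklearn.svm import SVC
from skimage.segmentation import flood

# slice_img: one axial CT slice (2-D array); clf: an SVC trained beforehand on labelled
# pixels, e.g. clf = SVC(kernel="rbf").fit(X_train, y_train). Features here are only
# intensity and normalized position -- the published tool also uses shape descriptors.
def segment_sinus_slice(slice_img, clf):
    h, w = slice_img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    feats = np.column_stack([slice_img.ravel(), yy.ravel() / h, xx.ravel() / w])
    candidate = clf.predict(feats).reshape(h, w).astype(bool)   # SVM-detected MS pixels
    if not candidate.any():
        return np.zeros_like(candidate)
    # Use one detected pixel as the seed of a region-growing (flood) segmentation.
    seed = tuple(np.argwhere(candidate)[0])
    return flood(slice_img, seed, tolerance=150.0)
```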
Procedia PDF Downloads 504
19473 Touching Interaction: An NFC-RFID Combination
Authors: Eduardo Álvarez, Gerardo Quiroga, Jorge Orozco, Gabriel Chavira
Abstract:
Ambient Intelligence (AmI) proposes a new way of thinking about computers, which follows the ideas of the Ubiquitous Computing vision of Mark Weiser. Within this vision lies what is known as the Disappearing Computer initiative, with users immersed in intelligent environments. Hence, technologies need to be adapted so that they are capable of replacing the traditional inputs to the system by embedding them in every-day artifacts. In this work, we present an approach that uses Radio Frequency Identification (RFID) and Near Field Communication (NFC) technologies. With the latter, a new form of interaction by contact appears. We compare both technologies by analyzing their requirements and advantages. In addition, we propose using a combination of RFID and NFC. Keywords: touching interaction, ambient intelligence, ubiquitous computing, interaction, NFC and RFID
Procedia PDF Downloads 505
19472 The Marker Active Compound Identification of Calotropis gigantea Roots Extract as an Anticancer
Authors: Roihatul Mutiah, Sukardiman, Aty Widyawaruyanti
Abstract:
Calotropis gigantea (L.) R. Br (Apocynaceae), commonly called "Biduri" or "giant milkweed", is a weed well known to many cultures for treating various disorders. Several studies have reported that C. gigantea roots have anticancer activity. The main aim of this research was to isolate and identify an active marker compound of C. gigantea roots for quality control of its extract during development as an anticancer natural product. The isolation methods were bioactivity-guided column chromatography, TLC, and HPLC. The anticancer activity of these substances was evaluated using the MTT assay method. The structure of the active compound was identified by UV, 1H-NMR, 13C-NMR, HMBC and HMQC spectra and other references. The results showed that the marker active compound was identified as calotropin. Keywords: calotropin, Calotropis gigantea, anticancer, marker active
Procedia PDF Downloads 334
19471 Mammographic Multi-View Cancer Identification Using Siamese Neural Networks
Authors: Alisher Ibragimov, Sofya Senotrusova, Aleksandra Beliaeva, Egor Ushakov, Yuri Markin
Abstract:
Mammography plays a critical role in screening for breast cancer in women, and artificial intelligence has enabled the automatic detection of diseases in medical images. Many of the current techniques used for mammogram analysis focus on a single view (mediolateral or craniocaudal view), while in clinical practice, radiologists consider multiple views of mammograms from both breasts to make a correct decision. Consequently, computer-aided diagnosis (CAD) systems could benefit from incorporating information gathered from multiple views. In this study, we introduce a method based on a Siamese neural network (SNN) model that simultaneously analyzes mammographic images from three views: bilateral and ipsilateral. In this way, when a decision is made on a single image of one breast, attention is also paid to two other images: a view of the same breast in a different projection and an image of the other breast. Consequently, the algorithm closely mimics the radiologist's practice of paying attention to the entire examination of a patient rather than to a single image. Additionally, to the best of our knowledge, this research represents the first experiments conducted using the recently released Vietnamese dataset of digital mammography (VinDr-Mammo). On an independent test set of images from this dataset, the best model achieved an AUC of 0.87 per image. This suggests that such a system can provide a valuable automated second opinion in the interpretation of mammograms and breast cancer diagnosis, which in the future may help to alleviate the burden on radiologists and serve as an additional layer of verification. Keywords: breast cancer, computer-aided diagnosis, deep learning, multi-view mammogram, siamese neural network
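A minimal PyTorch sketch of a tri-view network with a shared (Siamese) encoder; the backbone, head and input sizes are assumptions and do not reproduce the authors' architecture or training setup.

```python
import torch
import torch.nn as nn
from torchvision import models

class TriViewSiamese(nn.Module):
    """Shared encoder applied to the main view plus its ipsilateral and bilateral views."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)   # backbone choice is an assumption
        backbone.fc = nn.Identity()                # expose the 512-d feature vector
        self.encoder = backbone
        self.head = nn.Sequential(nn.Linear(3 * 512, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, main, ipsilateral, bilateral):
        feats = [self.encoder(v) for v in (main, ipsilateral, bilateral)]  # shared weights
        return torch.sigmoid(self.head(torch.cat(feats, dim=1)))          # malignancy score

model = TriViewSiamese()
x = torch.randn(2, 3, 224, 224)      # toy batch of 3-channel mammogram crops
print(model(x, x, x).shape)          # torch.Size([2, 1])
```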
Procedia PDF Downloads 137
19470 Evaluation of the Trauma System in a District Hospital Setting in Ireland
Authors: Ahmeda Ali, Mary Codd, Susan Brundage
Abstract:
Importance: This research focuses on devising and improving Health Service Executive (HSE) policy and legislation and thereby improving patient trauma care and outcomes in Ireland. Objectives: The study measures components of the trauma system in the district hospital setting of the Cavan/Monaghan Hospital Group (CMHG), HSE, Ireland, and uses the collected data to identify the strengths and weaknesses of the CMHG trauma system organisation, including governance, injury data, prevention and quality improvement, scene care and facility-based care, and rehabilitation. The information will be made available to local policy makers to provide an objective situational analysis to assist future trauma service planning and provision. Design, setting and participants: From 28 April to 28 May 2016, a cross-sectional survey using the World Health Organisation (WHO) Trauma System Assessment Tool (TSAT) was conducted among healthcare professionals directly involved in the level III trauma system of CMHG. Main outcomes: Identification of the strengths and weaknesses of the trauma system of CMHG. Results: The proportion of participants who reported inadequate funding for prehospital (62.3%) and facility-based trauma care at CMHG (52.5%) was high. Thirty-four (55.7%) respondents reported that a national trauma registry (TARN) exists, but electronic health records are still not used in trauma care. Twenty-one respondents (34.4%) reported that there are system-wide protocols for determining patient destination and that adequate, comprehensive legislation governing the use of ambulances is enforced; however, there is a lack of a reliable advisory service. Over 40% of the respondents reported uncertainty about the injury prevention programmes available in Ireland, as well as about the government funding allocated for injury and violence prevention. Conclusions: The results of this study contributed to a comprehensive assessment of the trauma system organisation. The major findings of the study identified three fundamental areas: the inadequate funding at CMHG, the QI techniques and corrective strategies used, and the unfamiliarity with existing prevention strategies. The findings indicate the need for further research to guide future development of the trauma system at CMHG (and in Ireland as a whole) in order to maximise best practice and to improve functional and life outcomes. Keywords: trauma, education, management, system
Procedia PDF Downloads 243
19469 Lateral Control of Electric Vehicle Based on Fuzzy Logic Control
Authors: Hartani Kada, Merah Abdelkader
Abstract:
Aiming at the high nonlinearities and unmatched uncertainties of the dynamic system of intelligent electric vehicles, this paper presents a lateral motion control algorithm for intelligent electric vehicles with four in-wheel motors. A fuzzy logic procedure is presented and formulated to realize lateral control in lane changes. The vehicle dynamics model and a desired target tracking model were established in this paper. A fuzzy logic controller was designed for integrated active front steering (AFS) and direct yaw moment control (DYC) in order to improve vehicle handling performance and stability, together with a fuzzy controller for the automatic steering problem. The simulation results demonstrate the strong robustness and excellent tracking performance of the proposed control algorithm. Keywords: fuzzy logic, lateral control, AFS, DYC, electric car technology, longitudinal control, lateral motion
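A minimal, hand-rolled sketch of Mamdani-style fuzzy inference for a yaw-moment correction, to illustrate the kind of rule base such a controller uses; the membership functions, rules and scaling are assumptions, not the paper's tuned AFS/DYC controller.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_yaw_moment(yaw_rate_error):
    """Map a normalized yaw-rate error in [-1, 1] to a corrective yaw moment in [-1, 1]."""
    # Antecedent memberships: negative, zero, positive error.
    mu = {"N": tri(yaw_rate_error, -1.5, -1.0, 0.0),
          "Z": tri(yaw_rate_error, -1.0, 0.0, 1.0),
          "P": tri(yaw_rate_error, 0.0, 1.0, 1.5)}
    # Rule consequents (singletons): negative, zero, positive corrective moment.
    consequents = {"N": -1.0, "Z": 0.0, "P": 1.0}
    # Weighted-average defuzzification over the fired rules.
    num = sum(mu[k] * consequents[k] for k in mu)
    den = sum(mu.values()) + 1e-9
    return num / den

print(fuzzy_yaw_moment(0.4))   # moderate positive corrective moment
```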
Procedia PDF Downloads 610
19468 Way to Successful Enterprise Resource Planning System Implementation in Developing Countries: Case of Public Sector Unit
Authors: Suraj Kumar Mukti
Abstract:
An Enterprise Resource Planning (ERP) system is a management tool that integrates all departments in an organization. It integrates business processes, manages resources efficiently and provides an appropriate decision support system to management. ERP system implementation is typically a time-consuming as well as costly process. Articles related to the key success factors of ERP system implementation are available in the literature, but few authors have focused on a roadmap for successful ERP system implementation. Postponement is better if the organization is not ready to implement the ERP system properly; hence, checking the organization's readiness to adopt the new system is an important prerequisite to ensure the success of ERP system implementation. Then comes the question of what counts as success of ERP system implementation. Benefits achieved by an ERP system may be categorized into two groups, viz. tangible and intangible benefits. This research article presents a roadmap to ensure the success of ERP system implementation, with the benefits achieved through the new system as a success indicator. A case study is presented to evaluate the success and benefits achieved through the new system. The article gives a comprehensive approach to academicians and a roadmap to organizations seeking to implement an ERP system. Keywords: ERP system, decision support system, tangible, intangible
Procedia PDF Downloads 332
19467 Identification of Wiener Model Using Iterative Schemes
Authors: Vikram Saini, Lillie Dewan
Abstract:
This paper presents iterative schemes based on least squares, hierarchical least squares and the stochastic approximation gradient method for the identification of a Wiener model with parametric structure. A gradient method based on stochastic approximation is presented for the parameter estimation of the Wiener model under noise conditions. Simulation results are presented for the Wiener model structure with different static non-linear elements in the presence of colored noise to show a comparative analysis of the iterative methods. The stochastic gradient method shows improvement in the estimation performance and provides fast convergence of the parameter estimates. Keywords: hard non-linearity, least square, parameter estimation, stochastic approximation gradient, Wiener model
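A minimal sketch of a normalized stochastic-gradient update for a Wiener structure (linear dynamic block followed by a static polynomial nonlinearity); the system, noise level and the simplification of treating the intermediate state as a known regressor are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed Wiener system: first-order linear block x_k = a*x_{k-1} + b*u_k followed by
# a static nonlinearity y = c0*x + c1*x^2, with additive output noise.
a_true, b_true, c_true = 0.7, 1.0, np.array([1.0, 0.5])
u = rng.uniform(-1, 1, 2000)
x = np.zeros_like(u)
for k in range(1, len(u)):
    x[k] = a_true * x[k - 1] + b_true * u[k]
y = c_true[0] * x + c_true[1] * x**2 + 0.05 * rng.standard_normal(len(u))

# Normalized stochastic-gradient estimation of the nonlinearity coefficients.
theta = np.zeros(2)
lr = 0.05
for k in range(len(u)):
    phi = np.array([x[k], x[k] ** 2])
    err = y[k] - phi @ theta
    theta += lr * err * phi / (1.0 + phi @ phi)   # one stochastic gradient step per sample
print("estimated c:", theta)                       # should approach [1.0, 0.5]
```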
Procedia PDF Downloads 405
19466 Development of an Autonomous Automated Guided Vehicle with Robot Manipulator under Robot Operation System Architecture
Authors: Jinsiang Shaw, Sheng-Xiang Xu
Abstract:
This paper presents the development of an autonomous automated guided vehicle (AGV) with a robot arm attached on top of it within the framework of the Robot Operating System (ROS). ROS provides libraries and tools, including hardware abstraction, device drivers, visualizers, message-passing, package management, etc. For this reason, this AGV can provide automatic navigation, parts transportation and pick-and-place tasks using the robot arm for typical industrial production line use. More specifically, this AGV is controlled by an on-board host computer running ROS software. Command signals for vehicle and robot arm control and measurement signals from various sensors are transferred to the respective microcontrollers. Users can operate the AGV remotely through the TCP/IP protocol and perform SLAM (Simultaneous Localization and Mapping). An RGBD camera and LIDAR sensors are installed on the AGV, and their data are used to perceive the environment. For SLAM, Gmapping is used to construct the environment map with a Rao-Blackwellized particle filter, and the AMCL method (Adaptive Monte Carlo Localization) is employed for mobile robot localization. In addition, the current AGV position and orientation can be visualized with the ROS toolkit. As for robot navigation and obstacle avoidance, A* is implemented for global path planning and the dynamic window approach for local planning. The developed ROS AGV with a robot arm on it has been tested in the university factory. 2-D and 3-D maps of the factory were successfully constructed by the SLAM method. Based on these maps, robot navigation through the factory with and without dynamic obstacles is shown to perform well. Finally, pick-and-place of parts using the robot arm and the ensuing delivery in the factory by the mobile robot are also accomplished. Keywords: automated guided vehicle, navigation, robot operation system, Simultaneous Localization and Mapping
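As a small illustration of the global planner mentioned above, here is a self-contained A* sketch on an occupancy grid; it is a generic textbook version, not the ROS navigation stack used on the AGV.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = obstacle); returns a path or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])      # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue                                             # already expanded
        came_from[cur] = parent
        if cur == goal:                                          # reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0 and g + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None

grid = [[0, 0, 0, 0], [1, 1, 0, 1], [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```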
Procedia PDF Downloads 149
19465 Constructed Wetlands with Subsurface Flow for Nitrogen and Metazachlor Removal from Tile Drainage: First Year Results
Authors: P. Fucik, J. Vymazal, M. Seres
Abstract:
Pollution from agricultural drainage is a severe issue for water quality, and it is a major reason for failure to achieve 'good chemical status' according to the Water Framework Directive, especially due to the high nitrogen and pesticide burden of receiving waters. Constructed wetlands were proposed as a suitable measure for the removal of nitrogen from agricultural drainage in the early 1990s. Until now, the vast majority of constructed wetlands designed to treat tile drainage have been free-surface constructed wetlands. In 2018, three small experimental constructed wetlands with horizontal subsurface flow were built in the Czech Highlands to treat tile drainage from a 15.73 ha watershed. The wetlands have surface areas of 79, 90 and 98 m² and were planted with Phalaris arundinacea and Glyceria maxima in parallel bands. The substrate in the first two wetlands is gravel (4-8 mm) mixed with birch woodchips (10:1 volume ratio). In one of those wetlands, the water level is kept 10 cm above the surface; in the second one, the water is kept below the surface. The third wetland has a 20 cm layer of birch woodchips on top of the gravel. The drainage outlet, as well as the wetland outlets, are equipped with automatic discharge-gauging devices, temperature probes, and automatic water samplers (Teledyne ISCO). During the monitored period (2018-2019), the flows were unexpectedly low due to a drop in the shallow groundwater level, the main source of water for the monitored drainage system, as experienced in many areas of the Czech Republic. The mean water residence time in the wetlands, determined with KBr, was 16, 9 and 27 days, respectively. The mean total nitrogen concentration eliminations during the one-year period were 61.2%, 62.6%, and 70.9% for wetlands 1, 2, and 3, respectively. The average load removals amounted to 0.516, 0.323, and 0.399 g N m-2 d-1, or 1885, 1180 and 1457 kg ha-1 yr-1, in wetlands 1, 2 and 3, respectively. Plant uptake and nitrogen sequestration in aboveground biomass contributed only marginally to the overall nitrogen removal. Among the three variants, the one with shallow water on the surface was revealed to be the most effective for the removal of nitrogen from drainage water. In August 2019, the herbicide metazachlor was experimentally dosed over a period of 2 hours at the drainage outlet at a concentration of 250 ug/l to find out the removal rates of the aforementioned wetlands. On the first day, water samples were taken every six hours, and for the next nine days, one water sample was taken per day. The removal rates were 94%, 69% and 99%, respectively; the most effective wetland was the one with the longest water residence time and the birch woodchip layer on top of the gravel. Keywords: constructed wetlands, metazachlor, nitrogen, tile drainage
Procedia PDF Downloads 149
19464 Algorithm Development of Individual Lumped Parameter Modelling for Blood Circulatory System: An Optimization Study
Authors: Bao Li, Aike Qiao, Gaoyang Li, Youjun Liu
Abstract:
Background: The lumped parameter model (LPM) is a common numerical model for hemodynamic calculation. An LPM uses circuit elements to simulate the human blood circulatory system, and physiological indicators and characteristics can be acquired through the model. However, because physiological indicators differ between individuals, the parameters in an LPM should be personalized in order to obtain convincing results that reflect individual physiological information. This study aimed to develop an automatic and effective optimization method to personalize the parameters in an LPM of the blood circulatory system, which is of great significance for the numerical simulation of individual hemodynamics. Methods: A closed-loop LPM of the human blood circulatory system applicable to most persons was established based on anatomical structures and physiological parameters. The patient-specific physiological data of 5 volunteers were collected non-invasively as personalization objectives for the individual LPMs. In this study, the blood pressure and flow rate of the heart, brain, and limbs were the main concerns. The collected systolic blood pressure, diastolic blood pressure, cardiac output, and heart rate were set as objective data, and the waveforms of carotid artery flow and ankle pressure were set as objective waveforms. For the collected data and waveforms, a sensitivity analysis of each parameter in the LPM was conducted to determine the sensitive parameters that have an obvious influence on the objectives. Simulated annealing was adopted to iteratively optimize the sensitive parameters, and the objective function during optimization was the root mean square error between the collected and simulated waveforms and data. Each parameter in the LPM was optimized 500 times. Results: In this study, the sensitive parameters in the LPM were optimized according to the collected data of 5 individuals. The results show only a slight error between the collected and simulated data. The average relative root mean square errors over all optimization objectives of the 5 samples were 2.21%, 3.59%, 4.75%, 4.24%, and 3.56%, respectively. Conclusions: The slight errors demonstrate the good effect of the optimization. The individual modeling algorithm developed in this study can effectively achieve the individualization of an LPM of the blood circulatory system. An LPM with individual parameters can output the individual physiological indicators after optimization, which is applicable to the numerical simulation of patient-specific hemodynamics. Keywords: blood circulatory system, individual physiological indicators, lumped parameter model, optimization algorithm
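A minimal sketch of the simulated-annealing loop around an RMSE objective; the LPM itself is replaced by a two-parameter surrogate waveform model, and the cooling schedule and step sizes are assumptions rather than the study's settings.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_lpm(params, t):
    """Stand-in for the closed-loop LPM solver: returns a pressure-like waveform.
    The real model would integrate the circuit ODEs; this surrogate is for illustration only."""
    r, c = params
    return 80 + 40 * np.exp(-t / (r * c)) * np.cos(2 * np.pi * t)

t = np.linspace(0, 1, 200)
measured = simulate_lpm((1.2, 0.8), t) + rng.normal(0, 1.0, t.size)   # assumed target waveform

def rmse(params):
    return np.sqrt(np.mean((simulate_lpm(params, t) - measured) ** 2))

# Simulated annealing over two "sensitive" parameters.
current = np.array([0.5, 0.5])
best = current.copy()
temp = 1.0
for _ in range(500):                          # 500 iterations, as in the study
    candidate = current + rng.normal(0, 0.05, 2)
    d = rmse(candidate) - rmse(current)
    if d < 0 or rng.random() < np.exp(-d / temp):   # accept downhill moves, sometimes uphill
        current = candidate
        if rmse(current) < rmse(best):
            best = current.copy()
    temp *= 0.99                              # geometric cooling schedule
print("best parameters:", best, "RMSE:", rmse(best))
```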
Procedia PDF Downloads 137
19463 Energy Management Method in DC Microgrid Based on the Equivalent Hydrogen Consumption Minimum Strategy
Authors: Ying Han, Weirong Chen, Qi Li
Abstract:
An energy management method based on the equivalent hydrogen consumption minimization strategy is proposed in this paper for a direct-current (DC) microgrid consisting of photovoltaic cells, fuel cells, energy storage devices, converters and DC loads. The rational allocation between the fuel cells and the battery devices is achieved by adopting the equivalent minimum hydrogen consumption strategy, with full use of the power generated by the photovoltaic cells. Considering the balance of the battery's state of charge (SOC), the optimal power of the battery under different SOC conditions is obtained and the reference output power of the fuel cell is calculated. A droop control method based on a time-varying droop coefficient is then proposed to realize automatic charge and discharge control of the battery, balance the system power and maintain the bus voltage. The proposed control strategy is verified on an RT-LAB hardware-in-the-loop simulation platform. The simulation results show that the designed control algorithm can realize the rational allocation of energy in the DC microgrid and improve the stability of the system. Keywords: DC microgrid, equivalent minimum hydrogen consumption strategy, energy management, time-varying droop coefficient, droop control
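A minimal sketch of a droop law whose coefficient varies with the battery state of charge; the particular shape of k(SOC) and the numerical gains are assumptions illustrating the idea, not the paper's control law.

```python
def droop_voltage(v_ref, i_out, soc, k_min=0.5, k_max=2.0):
    """Battery converter droop: V = V_ref - k(SOC) * I_out.
    In this assumed shape, the droop coefficient grows as the SOC moves away from
    mid-range, so a nearly full or nearly empty battery contributes less current."""
    k = k_min + (k_max - k_min) * abs(soc - 0.5) * 2.0
    return v_ref - k * i_out

# Same load current, different SOC -> different bus-voltage contribution.
print(droop_voltage(400.0, 5.0, soc=0.5))   # mid SOC, shallow droop
print(droop_voltage(400.0, 5.0, soc=0.9))   # high SOC, steep droop
```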
Procedia PDF Downloads 303
19462 Measuring Multi-Class Linear Classifier for Image Classification
Authors: Fatma Susilawati Mohamad, Azizah Abdul Manaf, Fadhillah Ahmad, Zarina Mohamad, Wan Suryani Wan Awang
Abstract:
A simple and robust multi-class linear classifier is proposed and implemented. For each pair of classes, the linear boundary is a collection of segments of hyperplanes created as perpendicular bisectors of line segments linking the centroids of the classes or parts of the classes. Nearest Neighbor and Linear Discriminant Analysis are compared in the experiments to see the performance of each classifier in discriminating the ripeness of oil palm. This paper proposes a multi-class linear classifier using Linear Discriminant Analysis (LDA) for image identification. The results prove that LDA is well capable of separating multi-class features for ripeness identification. Keywords: multi-class, linear classifier, nearest neighbor, linear discriminant analysis
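A short scikit-learn sketch comparing LDA with a nearest-neighbor classifier under cross-validation; the iris data set stands in for the oil-palm ripeness features, which are not reproduced here.

```python
from sklearn.datasets import load_iris                      # stand-in for the oil-palm features
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("1-NN", KNeighborsClassifier(n_neighbors=1))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```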
Procedia PDF Downloads 538
19461 Using Structural Equation Modeling to Analyze the Impact of Remote Work on Job Satisfaction
Authors: Florian Pfeffel, Valentin Nickolai, Christian Louis Kühner
Abstract:
Digitalization has disrupted the traditional workplace environment by allowing many employees to work from anywhere at any time. This trend of working from home was further accelerated by the COVID-19 crisis, which forced companies to rethink their workplace models. While in many companies this shift happened out of pure necessity, many employees were left more satisfied with their job due to the opportunity to work from home. This study focuses on employees’ job satisfaction in the service sector in dependence on the different work models, which are defined as a “work from home” model, the traditional “work in office” model, and a hybrid model. Using structural equation modeling (SEM), these three work models have been analyzed based on 13 factors influencing job satisfaction, which have been further summarized into the three groups “classic influencing factors”, “influencing factors changed by remote working”, and “new remote working influencing factors”. Based on these influencing factors, a survey has been conducted with n = 684 employees in the service sector. Cronbach’s alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. Additionally, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis has shown that the most significant influencing factor on job satisfaction is “identification with the work” with β = 0.540, followed by “Appreciation” (β = 0.151), “Compensation” (β = 0.124), “Work-Life-Balance” (β = 0.116), and “Communication and Exchange of Information” (β = 0.105). While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, it is the only significant influencing factor. The study shows that employees who work entirely remotely or have a hybrid work model are significantly more satisfied with their job, with a job satisfaction score of 5.0 on a scale from 1 (very dissatisfied) to 7 (very satisfied), than employees who do not have the option to work from home, with a score of 4.6. This is a result of the lower identification with the work in the model without any remote working. Furthermore, the responses indicate that it is important to consider the individual preferences of each employee when it comes to the work model in order to achieve higher overall job satisfaction. Thus, it can be argued that companies can profit from more motivation and higher productivity by considering individual work model preferences, thereby increasing identification with the respective work. Keywords: home-office, identification with work, job satisfaction, new work, remote work, structural equation modeling
Procedia PDF Downloads 82
19460 Inducible Trans-Encapsidation System for Temporal Separation of Hepatitis C Virus Life Cycle
Authors: Ovidiu Vlaicu, Leontina Banica, Dan Otelea, Andrei-Jose Petrescu, Costin-Ioan Popescu
Abstract:
Hepatitis C Virus (HCV) infects 170 million people worldwide. Major advances have been made recently in the HCV standard of care, with interferon-free therapy already approved. Despite major progress in HCV therapy, genotype-associated treatment efficacy and toxicity still represent issues to address. To identify endogenous factors involved in different stages of the HCV life cycle, we have developed a trans-packaging system for HCV subgenomic replicons lacking the core protein gene. Huh7 cells were used to generate a packaging cell line expressing the core protein in an inducible manner. The core packaging cell line was able to trans-complement various subgenomic replicons to secrete infectious trans-complemented HCV particles (HCV-TCP). Further, we constructed subgenomic replicons with foreign epitopes suitable for immunoaffinity purification or fluorescence microscopy studies. We have shown that the insertion has no effect on the efficacy of trans-complementation, yielding titers similar to the control subgenomic replicon. This system will be a valuable tool in studying pre- and post-assembly events in the HCV life cycle and for the fast identification of HCV assembly inhibitors. Keywords: assembly inhibitors, core protein, HCV, trans-complementation
Procedia PDF Downloads 292
19459 The Automatisation of Dictionary-Based Annotation in a Parallel Corpus of Old English
Authors: Ana Elvira Ojanguren Lopez, Javier Martin Arista
Abstract:
The aims of this paper are to present the automatisation procedure adopted in the implementation of a parallel corpus of Old English, as well as to assess the progress of automatisation with respect to tagging, annotation, and lemmatisation. The corpus consists of an aligned parallel text with word-for-word Old English-English comparison that provides the Old English segment with inflectional form tagging (gloss, lemma, category, and inflection) and lemma annotation (spelling, meaning, inflectional class, paradigm, word-formation and secondary sources). This parallel corpus is intended to fill a gap in the field of Old English, in which no parallel and/or lemmatised corpora are available, while the average amount of corpus annotation is low. With this background, this presentation has two main parts. The first part, which focuses on tagging and annotation, selects the layouts and fields of lexical databases that are relevant for these tasks. Most information used for the annotation of the corpus can be retrieved from the lexical and morphological database Nerthus and the database of secondary sources Freya. These are the sources of linguistic and metalinguistic information that will be used for the annotation of the lemmas of the corpus, including morphological and semantic aspects as well as references to the secondary sources that deal with the lemmas in question. Although substantially adapted and re-interpreted, the lemmatised part of these databases draws on the standard dictionaries of Old English, including The Student's Dictionary of Anglo-Saxon, An Anglo-Saxon Dictionary, and A Concise Anglo-Saxon Dictionary. The second part of this paper deals with lemmatisation. It presents the lemmatiser Norna, which has been implemented in FileMaker software. It is based on a concordance and an index to the Dictionary of Old English Corpus, which comprises around three thousand texts and three million words. In its present state, the lemmatiser Norna can assign a lemma to around 80% of textual forms on an automatic basis, by searching the index and the concordance for prefixes, stems and inflectional endings. The conclusions of this presentation stress the limits of the automatisation of dictionary-based annotation in a parallel corpus. While the tagging and annotation are largely automatic even at the present stage, the automatisation of alignment is pending for future research. Lemmatisation and morphological tagging are expected to be fully automatic in the near future, once the database of secondary sources Freya and the lemmatiser Norna have been completed. Keywords: corpus linguistics, historical linguistics, old English, parallel corpus
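A toy sketch of the lemma-assignment idea (stripping candidate inflectional endings and matching the remaining stem against an index); the entries and endings below are invented for illustration and stand in for the Nerthus/Freya databases and the DOEC index behind Norna.

```python
# Toy index: a handful of Old English lemmas with assumed stem spellings.
lemma_index = {
    "lufian": ["lufi", "lufo", "luf"],          # 'to love'
    "cyning": ["cyning", "cyninge", "cyninga"], # 'king'
    "god": ["god", "gode", "godes"],
}
inflectional_endings = ["est", "að", "ode", "as", "es", "an", "e", ""]

def assign_lemma(textual_form):
    """Assign a lemma by stripping candidate endings and matching the remaining stem."""
    form = textual_form.lower()
    for ending in inflectional_endings:
        stem = form[: len(form) - len(ending)] if ending else form
        for lemma, stems in lemma_index.items():
            if stem in stems or stem == lemma:
                return lemma
    return None   # left for manual lemmatisation

print(assign_lemma("lufode"))    # -> lufian
print(assign_lemma("cyninges"))  # -> cyning
```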
Procedia PDF Downloads 212
19458 The Influence of Covariance Hankel Matrix Dimension on Algorithms for VARMA Models
Authors: Celina Pestano-Gabino, Concepcion Gonzalez-Concepcion, M. Candelaria Gil-Fariña
Abstract:
Some estimation methods for VARMA models, and multivariate time series models in general, rely on the use of a Hankel matrix. It is known that if the data sample is large enough and the dimension of the Hankel matrix is unnecessarily large, this may result in an unnecessary number of computations as well as in numerical problems. In this sense, the aim of this paper is two-fold. First, we provide some theoretical results for these matrices which translate into a lower dimension for the matrices normally used in the algorithms. This contribution thus serves to improve those methods from a numerical and, presumably, statistical point of view. Second, we have chosen an estimation algorithm to illustrate our improvements in practice. The results we obtained in a simulation of VARMA models show that an increase in the size of the Hankel matrix beyond the theoretical bound proposed as valid does not necessarily lead to improved practical results. Therefore, for future research, we propose conducting similar studies using any of the linear system estimation methods that depend on Hankel matrices. Keywords: covariances Hankel matrices, Kronecker indices, system identification, VARMA models
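A short sketch of how a block Hankel matrix of sample covariances can be assembled for a multivariate series, which is the object whose dimension the paper bounds; the block convention Gamma(i + j + 1) used below is the usual one and is assumed here.

```python
import numpy as np

def block_hankel_of_covariances(y, p, f):
    """Build the p x f block Hankel matrix with blocks H[i, j] = Gamma(i + j + 1),
    where Gamma(k) is the sample cross-covariance of the centred series y (shape (T, m))."""
    T, m = y.shape
    yc = y - y.mean(axis=0)
    def gamma(k):
        return (yc[k:].T @ yc[: T - k]) / (T - k)
    return np.block([[gamma(i + j + 1) for j in range(f)] for i in range(p)])

y = np.random.default_rng(0).standard_normal((500, 2))   # toy bivariate series
H = block_hankel_of_covariances(y, p=3, f=3)
print(H.shape)   # (6, 6): 3 x 3 blocks of size 2 x 2
```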
Procedia PDF Downloads 243
19457 Chaotic Sequence Noise Reduction and Chaotic Recognition Rate Improvement Based on Improved Local Geometric Projection
Authors: Rubin Dan, Xingcai Wang, Ziyang Chen
Abstract:
A chaotic time series noise reduction method based on the fusion of the local projection method, the wavelet transform, and the particle swarm algorithm (referred to as the LW-PSO method) is proposed to address the problem of false recognition due to noise in the recognition of chaotic time series containing noise. The method first uses phase space reconstruction to recover the original dynamical system characteristics and removes the noise subspace by selecting the neighborhood radius; it then uses the wavelet transform to remove the D1-D3 high-frequency components to maximize the retention of signal information, while least-squares optimization is performed by the particle swarm algorithm. The Lorenz system containing 30% Gaussian white noise is simulated for verification, and the phase space, SNR value, RMSE value, and K value of the 0-1 test method before and after noise reduction with the Schreiber method, the local projection method, the wavelet transform method, and the LW-PSO method are compared and analyzed, which proves that the LW-PSO method has a better noise reduction effect than the other three common methods. The method is also applied to the classical system to evaluate the noise reduction effect of the four methods and the original system identification effect, which further verifies the superiority of the LW-PSO method. Finally, it is applied to the Chengdu rainfall chaotic sequence, and the results prove that the LW-PSO method can effectively reduce the noise and improve the chaos recognition rate. Keywords: Schreiber noise reduction, wavelet transform, particle swarm optimization, 0-1 test method, chaotic sequence denoising
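A minimal sketch of the wavelet step (discarding the D1-D3 detail levels) using PyWavelets; the wavelet family, decomposition depth and the toy signal are assumptions, and the local projection and PSO stages are not reproduced.

```python
import numpy as np
import pywt

def remove_d1_d3(signal, wavelet="db4"):
    """Discard the three finest detail levels (D1-D3), which carry most of the
    high-frequency noise, and reconstruct the series."""
    coeffs = pywt.wavedec(signal, wavelet, level=5)
    for i in (-1, -2, -3):                       # cD1, cD2, cD3 are the last three entries
        coeffs[i] = np.zeros_like(coeffs[i])
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Toy noisy series standing in for the noisy Lorenz x-component used in the paper.
t = np.linspace(0, 20, 4096)
x = np.sin(t) * np.cos(3.1 * t) + 0.3 * np.random.default_rng(0).standard_normal(t.size)
print(remove_d1_d3(x).shape)
```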
Procedia PDF Downloads 198
19456 Screening Deformed Red Blood Cells Irradiated by Ionizing Radiations Using Windowed Fourier Transform
Authors: Dahi Ghareab Abdelsalam Ibrahim, R. H. Bakr
Abstract:
Ionizing radiation, such as gamma radiation and X-rays, has many applications in medical diagnosis and cancer treatment. In this paper, we used the windowed Fourier transform to extract the complex image of the deformed red blood cells. The real values of the complex image are used to extract the best fit of the deformed cell boundary. Male albino rats were irradiated by γ-rays from ⁶⁰Co. The rats were anesthetized with ether, and blood samples were then collected from the eye vein in heparinized capillary tubes for studying the radiation-damaging effect in vivo by the proposed windowed Fourier transform. The peripheral blood films were prepared according to the Brown method. The peripheral blood film was photographed using an Automatic Image Contour Analysis system (SAMICA) from ELBEK-Bildanalyse GmbH, Siegen, Germany. The SAMICA system is provided with an electronic camera connected to a computer through a built-in interface card, and the image can be magnified up to 1200 times and displayed by the computer. The images of the peripheral blood films are then analyzed by the windowed Fourier transform method to extract the precise deformation from the best fit. Based on accurate deformation evaluation of the red blood cells, diseases can be diagnosed in their early stages. Keywords: windowed Fourier transform, red blood cells, phase wrapping, image processing
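A minimal sketch of a two-dimensional windowed Fourier transform response, computed as convolution with a Gaussian-windowed complex exponential; the window width, frequency and the random stand-in image are assumptions, not the paper's processing pipeline.

```python
import numpy as np
from scipy.signal import fftconvolve

def wft_response(image, fx, fy, sigma=8.0, half=24):
    """Windowed Fourier transform of a grey-level image at spatial frequency (fx, fy):
    convolution with a Gaussian-windowed complex exponential. The complex response is
    the 'complex image' whose real part can be used for boundary fitting."""
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    window = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    kernel = window * np.exp(1j * 2 * np.pi * (fx * x + fy * y))
    return fftconvolve(image, kernel, mode="same")

img = np.random.default_rng(0).random((256, 256))   # stand-in for a blood-film image
resp = wft_response(img, fx=0.05, fy=0.0)
real_part, magnitude = resp.real, np.abs(resp)
```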
Procedia PDF Downloads 85
19455 The Impact of the Number of Neurons in the Hidden Layer on the Performance of MLP Neural Network: Application to the Fast Identification of Toxics Gases
Authors: Slimane Ouhmad, Abdellah Halimi
Abstract:
In this work, we have applied a neural network method of the MLP type to a database from an array of six sensors for the detection of three toxic gases. As the choice of the number of neurons in the hidden layer and of the weight values has a great influence on the convergence of the learning algorithm, we propose, in this article, a mathematical formulation to determine the optimal number of hidden neurons and good weight values based on the back-propagation of errors. The results of this modeling improved the discrimination of these gases on the one hand and optimized the computation time on the other hand, in comparison to other results achieved for this case. Keywords: MLP Neural Network, back-propagation, number of neurons in the hidden layer, identification, computing time
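A short scikit-learn sketch of sweeping the number of hidden neurons in a single-hidden-layer MLP and comparing cross-validated accuracy; the synthetic data set stands in for the six-sensor gas database, which is not available here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Surrogate for the 6-sensor array / 3-gas database.
X, y = make_classification(n_samples=600, n_features=6, n_informative=5,
                           n_redundant=1, n_classes=3, random_state=0)

for n_hidden in (2, 5, 10, 20, 40):
    clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(n_hidden,), max_iter=2000,
                                      random_state=0))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{n_hidden:>3} hidden neurons: accuracy = {acc:.3f}")
```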
Procedia PDF Downloads 347
19454 The Effects of “Never Pressure Injury” on the Incidence of Pressure Injuries in Critically Ill Patients
Authors: Nuchjaree Kidjawan, Orapan Thosingha, Pawinee Vaipatama, Prakrankiat Youngkong, Sirinapha Malangputhong, Kitti Thamrongaphichartkul, Phatcharaporn Phetcharat
Abstract:
The Never Pressure Injury (NPI) mattress uses sensor technology based on the sensorization of things, with data processed by an AI system. Its main features are an individual interface pressure sensor system in contact with the mattress and a position management system in which the sensor detects the determined pressure and triggers automatic pressure reduction and redistribution. The role of the NPI is to monitor, identify the risk, and manage the interface pressure automatically when the determined pressure is detected. This study aims to evaluate the effects of the NPI, an innovative mattress, on the incidence of pressure injuries in critically ill patients. An observational case-control study was employed to compare the incidence of pressure injury between the case and the control group. The control group comprised 80 critically ill patients admitted to a critical care unit of Phyathai 3 Hospital, receiving standard care with the use of a memory foam mattress according to intensive care unit guidelines. The case group comprised 80 critically ill patients receiving standard care together with the Never Pressure Injury (NPI) innovation mattress. Patients who were over 20 years old, showed scores of less than 18 on the Risk Assessment Pressure Ulcer Scale - ICU, and stayed in the ICU for more than 24 hours were selected for the study. The patients' skin was assessed for the occurrence of pressure injury once a day for five consecutive days or until the patients were discharged from the ICU. The sample comprised 160 patients with ages ranging from 30 to 102 years (mean = 70.1 years) and a Body Mass Index ranging from 13.69 to 49.01 (mean = 24.63). The case and the control group did not differ in sex, age, Body Mass Index, Pressure Ulcer Risk Scores, or length of ICU stay. Twenty-two patients (27.5%) in the control group had pressure injuries, while no pressure injury was found in the case group. Keywords: pressure injury, never pressure injury, innovation mattress, critically ill patients, prevent pressure injury
Procedia PDF Downloads 124
19453 Effect of Automatic Self Transcending Meditation on Perceived Stress and Sleep Quality in Adults
Authors: Divya Kanchibhotla, Shashank Kulkarni, Shweta Singh
Abstract:
Chronic stress and poor sleep quality reduce mental health and increase the risk of developing depression and anxiety. There is increasing evidence for the utility of meditation as an adjunct clinical intervention for conditions like depression and anxiety. The present study is an attempt to explore the impact of Sahaj Samadhi Meditation (SSM), a category of Automatic Self Transcending Meditation (ASTM), on perceived stress and sleep quality in adults. The study design was a single-group pre-post assessment. The Perceived Stress Scale (PSS) and the Pittsburgh Sleep Quality Index (PSQI) were used in this study. Fifty-two participants filled in the PSS, and 60 participants filled in the PSQI at the beginning of the program (day 0), after two weeks (day 16) and at two months (day 60). Significant pre-post differences in the perceived stress level between Day 0 and Day 16 (p < 0.01; Cohen's d = 0.46) and between Day 0 and Day 60 (p < 0.01; Cohen's d = 0.76) clearly demonstrated that by practicing SSM, participants experienced a reduction in perceived stress. The effect size of the intervention observed on the 16th day of assessment was small to medium, but on the 60th day, a medium to large effect size was observed. In addition, significant pre-post differences in sleep quality between Day 0 and Day 16 and between Day 0 and Day 60 (p < 0.05) clearly demonstrated that by practicing SSM, participants experienced improvement in sleep quality. Compared with the Day 0 assessment, participants demonstrated significant improvement in the quality of sleep on Day 16 and Day 60. The effect size of the intervention observed on the 16th day of assessment was small, but on the 60th day, a small to medium effect size was observed. In the current study, we found that after practicing SSM for two months, participants reported a reduction in perceived stress; they felt more confident about their ability to handle personal problems, were able to cope with all the things that they had to do, felt that they were on top of things, and felt less angered. Participants also reported that their overall sleep quality improved; they took less time to fall asleep, had fewer disturbances in sleep, and had less daytime dysfunction due to sleep deprivation. The present study provides clear evidence of the efficacy and safety of non-pharmacological interventions such as SSM in reducing stress and improving sleep quality. Thus, ASTM may be considered a useful intervention to reduce psychological distress in healthy, non-clinical populations, and it can be an alternative remedy for treating poor sleep among individuals and decreasing the use of harmful sedatives. Keywords: automatic self transcending meditation, Sahaj Samadhi meditation, sleep, stress
Procedia PDF Downloads 134