Search results for: algebraic signal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5167

3547 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, the development of an advanced in-situ monitoring system is required to monitor process anomalies as input for further process control. Optical-based monitoring systems, such as CMOS cameras and NIR cameras, have proved to be effective ways to monitor geometrical distortion and exceptional thermal distribution. Therefore, many studies and applications focus on the availability of optical-based monitoring systems for detecting various types of defects. However, the capability of the monitoring setup itself is rarely quantified. In this study, a quantification model to evaluate the capability of monitoring setups for the LPBF machine, based on monitoring data acquired from a designed test artifact, is presented, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, the location of its integration, and the light condition. The methodology of data processing used to quantify the capability for each aspect is discussed. The minimal detectable size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model can quantify a monitoring system's performance, which makes it possible to compare monitoring systems with the same concept but different setups for the LPBF process, and provides direction for improving the setups.

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 198
3546 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission

Authors: Tingwei Shu, Dong Zhou, Chengjun Guo

Abstract:

Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission in situations of large data volume, low SNR, and restricted bandwidth. With the development of deep learning, semantic communication has matured further and is gradually being applied in fields such as the Internet of Things, unmanned aerial vehicle cluster communication, and remote sensing scenarios. We propose an improved semantic communication system for situations where the data volume is huge and spectrum resources are limited during the transmission of remote sensing images. At the transmitting end, the semantic information of remote sensing images must be extracted, but there are some problems. A traditional semantic communication system based on a Convolutional Neural Network (CNN) cannot take into account both the global and the local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder, instead of the mainstream CNN-based one, to extract the image semantic features. In this paper, we first perform pre-processing operations on the remote sensing images to improve their resolution in order to obtain images with more semantic information. We use the wavelet transform to decompose each image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally apply the inverse wavelet transform to obtain the preprocessed image. We adopt the improved Vision-Transformer structure as the semantic encoder to extract and transmit the semantic information of the remote sensing images. The Vision-Transformer structure can be trained better on huge data volumes and extracts better image semantic features, and it adopts a multi-layer self-attention mechanism to better capture the correlation between semantic features and reduce redundant features. Secondly, to improve the coding efficiency, we reduce the inherently quadratic complexity of the self-attention mechanism to linear so as to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and image coding methods such as BPG and JPEG to verify that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
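
A minimal sketch of the wavelet-based pre-processing stage described above, assuming PyWavelets and OpenCV are available; the wavelet family ("haar") and the 2x upscaling factor are illustrative assumptions, not values taken from the abstract.

```python
# Sketch of the described pre-processing: wavelet decomposition, separate
# interpolation of the sub-bands, then inverse wavelet reconstruction.
# Assumptions (not specified in the abstract): 'haar' wavelet, 2x upscaling.
import cv2
import numpy as np
import pywt

def wavelet_upscale(gray_image: np.ndarray, scale: int = 2) -> np.ndarray:
    # Decompose into low-frequency (LL) and high-frequency (LH, HL, HH) parts.
    ll, (lh, hl, hh) = pywt.dwt2(gray_image.astype(np.float32), "haar")
    size = (ll.shape[1] * scale, ll.shape[0] * scale)
    # Bicubic interpolation on the low-frequency component...
    ll_up = cv2.resize(ll, size, interpolation=cv2.INTER_CUBIC)
    # ...and bilinear interpolation on the high-frequency components.
    lh_up, hl_up, hh_up = (
        cv2.resize(band, size, interpolation=cv2.INTER_LINEAR)
        for band in (lh, hl, hh)
    )
    # Inverse transform yields the higher-resolution, preprocessed image.
    return pywt.idwt2((ll_up, (lh_up, hl_up, hh_up)), "haar")
```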

Keywords: semantic communication, transformer, wavelet transform, data processing

Procedia PDF Downloads 80
3545 The Differences and Similarities in Neurocognitive Deficits in Mild Traumatic Brain Injury and Depression

Authors: Boris Ershov

Abstract:

Depression is the most common mood disorder experienced by patients who have sustained a traumatic brain injury (TBI) and is associated with poorer cognitive functional outcomes. However, in some cases, similar cognitive impairments can also be observed in depression. There is not enough information about the features of the cognitive deficit in patients with TBI relative to patients with depression. TBI patients without depressive symptoms (TBInD, n = 25), TBI patients with depressive symptoms (TBID, n = 31), and 28 patients with bipolar II disorder (BP) were included in the study. There were no significant differences among participants with respect to age, handedness, and educational level. The patients' clinical status was determined using the Montgomery–Asberg Depression Rating Scale (MADRS). All participants completed a cognitive battery (the Brief Assessment of Cognition in Affective Disorders, BAC-A). Additionally, the Rey–Osterrieth Complex Figure (ROCF) was used to assess visuospatial construction abilities and visual memory, as well as planning and organizational skills. Compared to BP, TBInD and TBID showed significant impairments in visuomotor abilities and in verbal and visual memory. There were no significant differences between the BP and TBID groups in working memory, speed of information processing, or problem solving. The interference effect (cognitive inhibition) was significantly greater in TBInD and TBID compared to BP. Memory bias towards mood-related information in BP and TBID was greater in comparison with TBInD. These results suggest that depressive symptoms are associated with impairments in some executive functions in combination with a decrease in the speed of information processing.

Keywords: bipolar II disorder, depression, neurocognitive deficits, traumatic brain injury

Procedia PDF Downloads 348
3544 A Review on Cloud Computing and Internet of Things

Authors: Sahar S. Tabrizi, Dogan Ibrahim

Abstract:

Cloud Computing is a convenient model for on-demand network access to shared pools of virtual, configurable computing resources, such as servers, networks, storage devices, applications, etc. The cloud serves as an environment for companies and organizations to use infrastructure resources without making any purchases, and they can access such resources wherever and whenever they need them. Cloud computing is useful for overcoming a number of problems in various Information Technology (IT) domains such as Geographical Information Systems (GIS), scientific research, e-Governance systems, Decision Support Systems, ERP, web application development, mobile technology, etc. Companies can use Cloud Computing services to store large amounts of data that can be accessed from anywhere on Earth at any time. Such services are rented by client companies, where the actual rent depends upon the amount of data stored on the cloud and the amount of processing power used in a given time period. The resources offered by cloud service companies are flexible in the sense that user companies can increase or decrease their storage or processing power requirements at any time, thus minimizing the overall rental cost of the service they receive. In addition, Cloud Computing service providers offer fast processors and application software that can be shared by their clients. This is especially important for small companies with limited budgets which cannot afford to purchase their own expensive hardware and software. This paper is an overview of Cloud Computing, giving its types, principles, advantages, and disadvantages. In addition, the paper gives some example engineering applications of Cloud Computing and makes suggestions for possible future applications in the field of engineering.

Keywords: cloud computing, cloud systems, cloud services, IaaS, PaaS, SaaS

Procedia PDF Downloads 234
3543 An Image Processing Scheme for Skin Fungal Disease Identification

Authors: A. A. M. A. S. S. Perera, L. A. Ranasinghe, T. K. H. Nimeshika, D. M. Dhanushka Dissanayake, Namalie Walgampaya

Abstract:

Nowadays, skin fungal diseases are mostly found in people of tropical countries like Sri Lanka. A skin fungal disease is a particular kind of illness caused by fungus. These diseases have various dangerous effects on the skin and keep spreading over time. It is therefore important to identify them at an early stage in order to keep them from spreading. This paper presents an automated skin fungal disease identification system implemented to speed up the diagnosis process by identifying skin fungal infections in digital images. An image of the diseased skin lesion is acquired, and a comprehensive computer vision and image processing scheme is used to process the image for disease identification. This includes colour analysis using the RGB and HSV colour models; texture classification using the Grey Level Run Length Matrix, Grey Level Co-Occurrence Matrix, and Local Binary Pattern; object detection; shape identification; and more. This paper presents the approach and its outcome for the identification of four of the most common skin fungal infections, namely Tinea Corporis, Sporotrichosis, Malassezia, and Onychomycosis. The main intention of this research is to provide an automated skin fungal disease identification system that increases the diagnostic quality, shortens the time-to-diagnosis, and improves the efficiency of detection and successful treatment of skin fungal diseases.
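
A minimal sketch of the texture-feature stage named above, assuming scikit-image; the GLCM distances/angles and the LBP parameters are illustrative assumptions, not values reported by the authors.

```python
# Sketch: extract GLCM and LBP texture features from a grayscale lesion image.
# Parameter choices (distances, angles, LBP radius) are assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops, local_binary_pattern

def texture_features(gray: np.ndarray) -> np.ndarray:
    # Grey Level Co-Occurrence Matrix statistics at distance 1, four angles
    # (expects a uint8 image with 256 grey levels).
    glcm = graycomatrix(gray, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    glcm_stats = [graycoprops(glcm, prop).mean()
                  for prop in ("contrast", "homogeneity", "energy", "correlation")]
    # Local Binary Pattern histogram (uniform patterns, radius 1, 8 neighbours).
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.concatenate([glcm_stats, lbp_hist])
```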

Keywords: Circularity Index, Grey Level Run Length Matrix, Grey Level Co-Occurrence Matrix, Local Binary Pattern, Object detection, Ring Detection, Shape Identification

Procedia PDF Downloads 234
3542 Experimental Correlation for Erythrocyte Aggregation Rate in Population Balance Modeling

Authors: Erfan Niazi, Marianne Fenech

Abstract:

Red blood cells (RBCs), or erythrocytes, tend to form chain-like aggregates called rouleaux under low shear rates. This is a reversible process, and rouleaux disaggregate at high shear rates. Therefore, RBC aggregation occurs in the microcirculation, where low shear rates are present, but does not occur under normal physiological conditions in large arteries. Numerical modeling of RBC interactions is fundamental in analytical models of blood flow in the microcirculation. Population Balance Modeling (PBM) is particularly useful for studying problems where particles agglomerate and break up in two-phase flow systems in order to find flow characteristics. In this method, the elementary particles lose their individual identity due to continuous destruction and recreation by break-up and agglomeration. The aim of this study is to find the RBC aggregation rate in a dynamic situation. Simplified PBM was used previously to find the aggregation rate from a static observation of RBC aggregation in a drop of blood under the microscope. To find the aggregation rate in a dynamic situation, we propose an experimental setup testing RBC sedimentation. In this test, RBCs interact and aggregate to form rouleaux. In this configuration, disaggregation can be neglected due to the low shear stress. A high-speed camera is used to acquire video-microscopic pictures of the process. The sizes of the aggregates and the sedimentation velocity are extracted using image processing techniques. Based on data collected from 5 healthy human blood samples, the aggregation rate was estimated as 2.7×10³ (±0.3×10³) s⁻¹.

Keywords: red blood cell, rouleaux, microfluidics, image processing, population balance modeling

Procedia PDF Downloads 357
3541 Least Support Orthogonal Matching Pursuit (LS-OMP) Recovery Method for Invisible Watermarking Image

Authors: Israa Sh. Tawfic, Sema Koc Kayhan

Abstract:

In this paper, we first propose the least support orthogonal matching pursuit (LS-OMP) algorithm to improve the performance of the orthogonal matching pursuit (OMP) algorithm. The LS-OMP algorithm adaptively chooses the optimum L (least part of the support) at each iteration. This modification helps to reduce the computational complexity significantly, and the algorithm performs better than OMP. Second, we give the procedure for invisible image watermarking in the presence of compressive sampling. The image reconstruction based on a set of watermarked measurements is performed using LS-OMP.
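
For reference, a minimal NumPy sketch of the baseline OMP recovery loop that LS-OMP builds on; the stopping rule (fixed sparsity k) is an illustrative assumption, and the LS-OMP support-selection refinement itself is not reproduced here.

```python
# Sketch of baseline Orthogonal Matching Pursuit: greedily grow the support,
# then re-fit all selected coefficients by least squares at every iteration.
import numpy as np

def omp(A: np.ndarray, y: np.ndarray, k: int) -> np.ndarray:
    """Recover a k-sparse x from y = A @ x (A: m x n measurement matrix)."""
    residual = y.copy()
    support: list[int] = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # Least-squares fit on the chosen support only.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x[support] = coeffs
    return x
```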

Keywords: compressed sensing, orthogonal matching pursuit, restricted isometry property, signal reconstruction, least support orthogonal matching pursuit, watermark

Procedia PDF Downloads 339
3540 Using Artificial Intelligence Technology to Build the User-Oriented Platform for Integrated Archival Service

Authors: Lai Wenfang

Abstract:

This study describes how to use artificial intelligence (AI) technology to build a user-oriented platform for integrated archival services. The platform will be launched in 2020 by the National Archives Administration (NAA) in Taiwan. With the progression of information and communication technology (ICT), the NAA has built many systems to provide archival services. In order to cope with new challenges, such as new ICT, artificial intelligence, or blockchain, the NAA will use natural language processing (NLP) and machine learning (ML) skills to build a training model and propose suggestions based on the data sent to the platform. The NAA expects that the platform will not only automatically inform the sending agencies' staff which record catalogues are against the transfer or destruction rules, but will also use the model to find the details hidden in the catalogues and suggest to the NAA's staff whether or not the records should be kept, in order to shorten the auditing time. The platform keeps all of the users' browsing trails, so that it can predict which kinds of archives a user could be interested in, recommend search terms through visualization, and inform users of newly arrived archives. In addition, according to the Archives Act, the NAA's staff must spend a lot of time marking or removing personal data, classified data, etc., before archives are provided. To upgrade the archives access service process, the platform will use text recognition patterns to black out such data automatically; the staff only need to correct the errors and upload the corrected version, and as the platform learns, the accuracy will increase. In short, the purpose of the platform is to advance the government's digital transformation and implement the vision of a service-oriented smart government.

Keywords: artificial intelligence, natural language processing, machine learning, visualization

Procedia PDF Downloads 178
3539 Visco - Plastic Transition and Transfer of Plastic Material with SGF in case of Linear Dry Friction Contact on Steel Surfaces

Authors: Lucian Capitanu, Virgil Florescu

Abstract:

Modeling of specific tribological processes often raises special problems for laboratory studies. One such problem is the modeling of extremely high contact temperatures and pressures, allowing the simulation of the temperatures and pressures at which the injection or extrusion processing of thermoplastic materials takes place. Tribological problems occur mainly with thermoplastic materials reinforced with glass fibers, which produce advanced wear of the barrels and screws of processing machines in a short time. Obtaining temperatures around 210 °C and higher, as well as pressures around 100 MPa, is very difficult in the laboratory. This paper reports a simple and convenient solution for obtaining these conditions, using sliding friction couples with linear contact: a cylindrical plastic liner filled with glass fibers against flat steel samples, polished and super-finished. C120 steel, a steel for moulds, and Rp3 steel, a high-speed steel for tools, were used. The pressure was obtained by continuously loading the liner in rotational movement up to its elasticity limits, at which point the dry friction coefficient reaches or exceeds a value of 0.5. Through dissipation of the power lost by friction on the flat steel sample, contact temperatures are reached at the metal surface that meet and exceed 230 °C, which places them in the temperature range of injection processing. Contact pressures (under the load and material conditions used) ranging from 16.3 to 36.4 MPa were obtained, depending on the plastic material used and the glass fiber content.

Keywords: plastics with glass fibers, dry friction, linear contact, contact temperature, contact pressure, experimental simulation

Procedia PDF Downloads 303
3538 Preparation of Carbon Nanofiber Reinforced HDPE Using Dialkylimidazolium as a Dispersing Agent: Effect on Thermal and Rheological Properties

Authors: J. Samuel, S. Al-Enezi, A. Al-Banna

Abstract:

High-density polyethylene reinforced with carbon nanofibers (HDPE/CNF) has been prepared via melt processing using dialkylimidazolium tetrafluoroborate (an ionic liquid) as a dispersing agent. The prepared samples were characterized by thermogravimetric (TGA) and differential scanning calorimetric (DSC) analyses. The samples blended with the imidazolium ionic liquid exhibit higher thermal stability. DSC analysis showed clear miscibility of the ionic liquid in the HDPE matrix, with a single endothermic peak. The melt rheological analysis of the HDPE/CNF composites was performed using an oscillatory rheometer. The influence of CNF and ionic liquid concentration (0, 0.5, and 1 wt%) on the viscoelastic parameters was investigated at 200 °C over an angular frequency range of 0.1 to 100 rad/s. The rheological analysis shows shear-thinning behavior for the composites. An improvement in the viscoelastic properties was observed as the nanofiber concentration increased. The increase in the modulus values was attributed to the structural rigidity imparted by the high-aspect-ratio CNF. The modulus values and complex viscosity of the composites increased significantly at low frequencies. Composites blended with the ionic liquid exhibit slightly lower complex viscosity and modulus values than the corresponding HDPE/CNF compositions. This reduction in melt viscosity, a result of the wetting effect of the polymer-ionic liquid combination, is an additional benefit for polymer composite processing.

Keywords: high-density polyethylene, carbon nanofibers, ionic liquid, complex viscosity

Procedia PDF Downloads 128
3537 Task Scheduling and Resource Allocation in Cloud-based on AHP Method

Authors: Zahra Ahmadi, Fazlollah Adibnia

Abstract:

Scheduling of tasks and the optimal allocation of resources in the cloud must account for the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows are among the most widely used applications in this field and are characterized by high processing power and storage requirements. In order to increase their efficiency, it is necessary to schedule the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, which depend on various criteria such as time, cost, current workload, and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment, based on a modified AHP (Analytic Hierarchy Process) algorithm, is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy. Resource prioritization uses the criteria of main memory size, processor speed, and bandwidth. To modify the AHP algorithm, the Linear Max-Min and Linear Max normalization methods, which have a great impact on the ranking, are adopted as the best choice for the mentioned algorithm, as sketched below. The simulation results show a decrease in the average response time, turnaround time, and execution time of input tasks in the proposed method compared to similar (basic) methods.
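
A minimal sketch of AHP-style resource ranking with the two normalizations named above; the equal criteria weights and the sample VM values are illustrative assumptions, not taken from the paper.

```python
# Sketch: rank virtual machines by memory size, processor speed and bandwidth
# using AHP-style weighted scores with Linear Max and Linear Max-Min
# normalization. Criteria weights here are assumptions (equal importance).
import numpy as np

def linear_max(col: np.ndarray) -> np.ndarray:
    return col / col.max()

def linear_max_min(col: np.ndarray) -> np.ndarray:
    return (col - col.min()) / (col.max() - col.min())

def rank_resources(matrix: np.ndarray, weights: np.ndarray, norm) -> np.ndarray:
    # matrix: rows = VMs, columns = benefit criteria (larger is better).
    normalized = np.column_stack([norm(matrix[:, j]) for j in range(matrix.shape[1])])
    scores = normalized @ weights
    return np.argsort(-scores)  # indices of VMs, best first

vms = np.array([[8.0, 2.4, 100.0],    # [memory GB, CPU GHz, bandwidth Mbps]
                [16.0, 2.0, 250.0],
                [4.0, 3.2, 500.0]])
weights = np.array([1 / 3, 1 / 3, 1 / 3])
print(rank_resources(vms, weights, linear_max))
print(rank_resources(vms, weights, linear_max_min))
```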

Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow

Procedia PDF Downloads 147
3536 Geomatic Techniques to Filter Vegetation from Point Clouds

Authors: M. Amparo Núñez-Andrés, Felipe Buill, Albert Prades

Abstract:

More and more frequently, geomatic techniques such as terrestrial laser scanning or digital photogrammetry, either terrestrial or from drones, are being used to obtain digital terrain models (DTMs) for the monitoring of geological phenomena that cause natural disasters, such as landslides, rockfalls, and debris flows. One of the main multitemporal analyses developed from these models is the quantification of volume changes in slopes and hillsides, whether caused by erosion, fall, or land movement in the source area, or by sedimentation in the deposition zone. To carry out this task, it is necessary to filter from the point clouds all those elements that do not belong to the slopes. Among these elements, vegetation stands out, as it has the greatest presence and changes constantly, both seasonally and daily, being affected by factors such as wind. One of the best-known indices for detecting vegetation in an image is the NDVI (Normalized Difference Vegetation Index), which is obtained from the combination of the infrared and red channels and therefore requires a multispectral camera. These cameras generally have lower resolution than conventional RGB cameras, while their cost is much higher. We therefore have to look for alternative indices based on RGB. In this communication, we present the results obtained in the Georisk project (PID2019‐103974RB‐I00/MCIN/AEI/10.13039/501100011033) using the GLI (Green Leaf Index) and ExG (Excess Green) indices, as well as the change to the Hue-Saturation-Value (HSV) color space, in which the H coordinate is the one that gives the most information for vegetation filtering. These filters are applied both to the images, creating binary masks to be used when applying the SfM algorithms, and to the point cloud, whether obtained directly by the photogrammetric process without any previous filter or obtained by TLS (Terrestrial Laser Scanning). In this last case, we have also worked with a Riegl VZ400i sensor that allows the reception, as in aerial LiDAR, of several returns of the signal, information that can be used for classification of the point cloud. After applying all the techniques in different locations, the results show that the color-based filters allow correct filtering in those areas where the presence of shadows is not excessive and there is contrast between the color of the slope lithology and the vegetation. As anticipated, in the case of the HSV color space, it is the H coordinate that responds best for this filtering. Finally, the use of the various returns of the TLS signal allows filtering with some limitations.
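
A minimal sketch of the RGB-based vegetation masks described above, assuming OpenCV/NumPy; the threshold values are illustrative assumptions, since the communication does not state specific values.

```python
# Sketch: RGB-based vegetation masks. GLI = (2G - R - B) / (2G + R + B),
# ExG = 2g - r - b on chromaticity coordinates, plus an HSV hue window.
# All thresholds below are assumptions for illustration.
import cv2
import numpy as np

def vegetation_masks(bgr: np.ndarray):
    b, g, r = [c.astype(np.float32) for c in cv2.split(bgr)]
    gli = (2 * g - r - b) / (2 * g + r + b + 1e-6)
    total = r + g + b + 1e-6
    exg = 2 * (g / total) - (r / total) - (b / total)
    hue = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)[:, :, 0]  # OpenCV hue: 0-179
    # Return one binary mask per filter (GLI, ExG, H-coordinate window).
    return (gli > 0.1), (exg > 0.05), ((hue > 35) & (hue < 90))
```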

Keywords: RGB index, TLS, photogrammetry, multispectral camera, point cloud

Procedia PDF Downloads 156
3535 Fractional-Order Modeling of GaN High Electron Mobility Transistors for Switching Applications

Authors: Anwar H. Jarndal, Ahmed S. Elwakil

Abstract:

In this paper, a fractional-order model for the pad parasitic effect of a GaN HEMT on a Si substrate is developed and validated. An open de-embedding structure is used to characterize and de-embed substrate loading parasitic effects. Unbiased device measurements are used to extract the parasitic inductances and resistances. The model shows very good agreement with S-parameter measurements under different bias conditions. It has been found that this approach can improve the simulation of the intrinsic part of the transistor, which is very important for the small- and large-signal modeling process.

Keywords: fractional-order modeling, GaN HEMT, Si substrate, open de-embedding structure

Procedia PDF Downloads 356
3534 Towards Law Data Labelling Using Topic Modelling

Authors: Daniel Pinheiro Da Silva Junior, Aline Paes, Daniel De Oliveira, Christiano Lacerda Ghuerren, Marcio Duran

Abstract:

The Courts of Accounts are institutions responsible for overseeing and pointing out irregularities in Public Administration expenses. They have a high demand for processes to be analyzed, whose decisions must be grounded in the applicable laws. Despite the large number of existing processes, there are several cases reporting similar subjects, so previous decisions on already analyzed processes can be a precedent for current processes that refer to similar topics. Identifying similar topics is an open, yet essential, task for identifying similarities between processes. Since the actual number of topics is considerably large, it is tedious and error-prone to identify them using a purely manual approach. This paper presents a tool based on Machine Learning and Natural Language Processing to assist in building a labeled dataset. The tool relies on topic modeling with Latent Dirichlet Allocation to find the topics underlying a document, followed by the Jensen-Shannon distance metric to generate a probability of similarity between document pairs. Furthermore, in a case study with a corpus of decisions of the Rio de Janeiro State Court of Accounts, it was noted that data pre-processing plays an essential role in modeling relevant topics. The combination of topic modeling and a distance metric calculated over the document representations among the generated topics has also proved useful in helping to construct a labeled base of similar and non-similar document pairs.
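
A minimal sketch of the LDA-plus-Jensen-Shannon pipeline described above, assuming scikit-learn and SciPy; the topic count, vectorizer settings, and toy corpus are illustrative assumptions.

```python
# Sketch: per-document topic distributions via LDA, then Jensen-Shannon
# distance between document pairs. Topic count and corpus are assumptions.
from scipy.spatial.distance import jensenshannon
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = ["irregularity in public works contract",
        "contract overpricing in public works",
        "pension fund accounting review"]  # toy stand-in for court decisions
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)  # rows: per-document topic distributions
# Distance in [0, 1]; small values suggest the two decisions share topics.
print(jensenshannon(theta[0], theta[1], base=2))
```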

Keywords: courts of accounts, data labelling, document similarity, topic modeling

Procedia PDF Downloads 182
3533 Investigating the Effects of Data Transformations on a Bi-Dimensional Chi-Square Test

Authors: Alexandru George Vaduva, Adriana Vlad, Bogdan Badea

Abstract:

In this research, we conduct a Monte Carlo analysis of a two-dimensional χ² test, which is used to determine the minimum distance required for independent sampling in the context of chaotic signals. We investigate the impact on the χ² test of transforming initial data sets from an arbitrary probability distribution into new signals with a uniform distribution using the Spearman rank correlation. This transformation removes the randomness of the data pairs, and as a result, the observed distribution of χ² test values differs from the expected distribution. We propose a solution to this problem and evaluate it using another chaotic signal.
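
A minimal sketch of the underlying bi-dimensional χ² independence check on a chaotic signal, assuming SciPy; the logistic-map parameter, bin count, and lags tested are illustrative assumptions.

```python
# Sketch: test statistical independence of logistic-map samples separated by a
# lag d, using a 2-D contingency table and Pearson's chi-square test.
# Map parameter r = 4, the 8x8 binning and the lags tested are assumptions.
import numpy as np
from scipy.stats import chi2_contingency

def logistic_map(n: int, x0: float = 0.4, r: float = 4.0) -> np.ndarray:
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

x = logistic_map(100_000)
for d in (1, 5, 15):
    pairs, _, _ = np.histogram2d(x[:-d], x[d:], bins=8)
    chi2, p, dof, _ = chi2_contingency(pairs)
    print(f"lag {d}: chi2 = {chi2:.1f}, p = {p:.3f}")  # large p -> independent
```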

Keywords: chaotic signals, logistic map, Pearson’s test, Chi Square test, bivariate distribution, statistical independence

Procedia PDF Downloads 100
3532 The Reasons for Food Losses and Waste and the Trends of Their Management in Basic Vegetal Production in Poland

Authors: Krystian Szczepanski, Sylwia Łaba

Abstract:

Production of fruit and vegetables, food cereals, or oilseeds affects the natural environment through the intake of nutrients contained in the soil and the use of water, fertilizers, plant protection products, and energy. Limiting these effects requires the introduction of cultivation techniques and methods that are friendly to the environment and that counteract losses and waste of agricultural raw materials, as well as the appropriate management of food waste at every stage of the agri-food supply chain. The basic production link includes obtaining the vegetal raw material, its storage on the agricultural farm, and its transport to a collecting point. The starting point is when the plants are ready to be harvested; the stage before harvesting is not considered in the system of measuring and monitoring food losses. The successive stage is the transport of the collected crops to the collecting point, or their storage and transport. The moment at which the raw material enters the stage of processing, i.e., its receipt at the gate of the processing plant, is considered the final point of basic production. Processing is understood as the transformation of the raw material into food products. According to Regulation (EC) No 178/2002 of the European Parliament and of the Council of 28 January 2002, Art. 2, "food" means any substance or product intended to be, or reasonably expected to be, consumed by humans. For the needs of these studies, raw material is considered food from the moment the plants (fruit, vegetables, cereals, oilseeds), after being harvested, arrive at storehouses. The aim of the studies was to determine the reasons for loss generation and to analyze the directions of loss management in basic vegetal production in Poland in the years 2017 and 2018. The studies on food losses and waste in basic vegetal production were carried out in three sectors: fruit and vegetables, cereals, and oilseeds. The studies of basic production were conducted during the period March-May 2019 across the whole country on a representative sample of 250 farms in each sector. The surveys were carried out using questionnaires by the PAPI (Paper & Pen Personal Interview) method; the pollsters conducted direct questionnaire interviews. The studies show that in 19% of the examined farms, no losses were recorded during preparation, loading, and transport of the raw material to the manufacturing plant. In the farms where losses were indicated, the main reason in the production of fruit and vegetables was rotting, constituting more than 20% of the reported reasons, while in the case of cereal and oilseed production, the respondents identified damage, moisture, and pests as the most frequent reasons. The losses and waste generated in vegetal production, as well as in the processing and trade of fruit and vegetables or cereal products, should be appropriately managed or recovered. The respondents indicated composting (more than 60%) as the main direction of waste management in all categories; animal feed and landfill were the other indicated directions. Prevention and minimization of loss generation are important at every stage of production, including basic production. With knowledge of the reasons for loss generation, preventive measures can be introduced, mainly connected with appropriate storage conditions and methods.
Acknowledgement: The article was prepared within the project "Development of a waste food monitoring system and an effective program to rationalize losses and reduce food waste" (acronym PROM), implemented under the Strategic Scientific and Learning Program GOSPOSTRATEG, financed by the National Centre for Research and Development in accordance with the provisions of Gospostrateg1/385753/1/2018.

Keywords: food losses, food waste, PAPI method, vegetal production

Procedia PDF Downloads 118
3531 A Stepwise Approach for Piezoresistive Microcantilever Biosensor Optimization

Authors: Amal E. Ahmed, Levent Trabzon

Abstract:

Due to the low concentration of analytes in biological samples, the use of Biological Microelectromechanical System (Bio-MEMS) biosensors for biomolecule detection results in a minuscule output signal that is not good enough for practical applications. In response to this, a need has arisen for an optimized biosensor capable of giving a high output signal in response to the detection of few analytes in the sample; the ultimate goal is to convert the attachment of a single biomolecule into a measurable quantity. For this purpose, MEMS microcantilever-based biosensors emerged as a promising sensing solution because they are simple, cheap, and very sensitive, and, more importantly, do not need optical labeling of the analytes (they are label-free). Among the different microcantilever transduction techniques, piezoresistive microcantilever biosensors became prominent because they work well in liquid environments and have an integrated readout system. However, the design of piezoresistive microcantilevers is not a straightforward problem, due to coupling between the design parameters, constraints, process conditions, and performance. The parameters that can be optimized to enhance the sensitivity of piezoresistive microcantilever-based sensors are: cantilever dimensions, cantilever material, cantilever shape, piezoresistor material, piezoresistor doping level, piezoresistor dimensions, piezoresistor position, and the shape and position of the Stress Concentration Region (SCR). After a systematic analysis of the effect of each design and process parameter on the sensitivity, a step-wise optimization approach was developed in which almost all of these parameters were varied one at a time at each step while fixing the others, to obtain the maximum possible sensitivity at the end. At each step, the goal was to optimize the parameter so that it maximizes and concentrates the stress in the piezoresistor region for the same applied force, thus achieving higher sensitivity. Using this approach, an optimized sensor with 73.5× higher electrical sensitivity (ΔR/R) than the starting sensor was obtained. In addition, this piezoresistive microcantilever biosensor is more sensitive than other similar sensors previously reported in the open literature. The mechanical sensitivity of the final sensor is -1.5×10⁻⁸ (Ω/Ω)/pN, which means that for each 1 pN (roughly 10⁻¹⁰ g) of biomolecules attached to this biosensor, the piezoresistor resistance changes by a relative amount of 1.5×10⁻⁸. Throughout this work, COMSOL Multiphysics 5.0, a commercial Finite Element Analysis (FEA) tool, was used to simulate the sensor performance.
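
For context, a textbook first-order piezoresistive relation (not a formula quoted from the abstract) links the relative resistance change that these optimizations maximize to the stress in the resistor:

```latex
\frac{\Delta R}{R} = \pi_{l}\,\sigma_{l} + \pi_{t}\,\sigma_{t}
```

where π_l and π_t are the longitudinal and transverse piezoresistive coefficients and σ_l, σ_t the corresponding stresses averaged over the piezoresistor; concentrating stress in the SCR raises the stress seen by the resistor for the same applied force, and hence the sensitivity.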

Keywords: biosensor, microcantilever, piezoresistive, stress concentration region (SCR)

Procedia PDF Downloads 572
3530 Quantification of E-Waste: A Case Study in Federal University of Espírito Santo, Brazil

Authors: Andressa S. T. Gomes, Luiza A. Souza, Luciana H. Yamane, Renato R. Siman

Abstract:

The segregation of waste electrical and electronic equipment (WEEE) at the generating source, its quali-quantitative characterization, and the identification of its origin, besides being integral parts of classification reports, are crucial steps for the success of its integrated management. The aim of this paper was to quantify the WEEE generated at the Federal University of Espírito Santo (UFES), Brazil, as well as to determine sources, temporary storage sites, main transportation routes and destinations, the most generated types of WEEE, and their recycling potential. Quantification of the WEEE generated at the University between 2010 and 2015 was performed using data provided by UFES's sector of assets management. Information on the EEE and WEEE flow on the campuses was obtained through questionnaires applied to University workers. A total of 6,028 WEEE units of data processing equipment disposed of by the University between 2010 and 2015 was recorded. Among this waste, the most generated items were CRT screens, desktops, keyboards, and printers. Furthermore, it was observed that these WEEE units are temporarily stored in inappropriate places on the University campuses. In general, they are donated to NGOs of the city or sold through auctions (2010 and 2013). As for recycling potential, the primary processing and subsequent sale of printed circuit boards (PCBs) from the computers could yield as much as US$ 27,839.23. The results highlight the importance of a WEEE management policy at the University.

Keywords: solid waste, waste of electrical and electronic equipment, waste management, institutional solid waste generation

Procedia PDF Downloads 260
3529 Opinion Mining to Extract Community Emotions on Covid-19 Immunization Possible Side Effects

Authors: Yahya Almurtadha, Mukhtar Ghaleb, Ahmed M. Shamsan Saleh

Abstract:

The world witnessed a fierce attack from the Covid-19 virus, which affected public life socially, economically, physically, and psychologically. The world's governments tried to confront the pandemic by imposing a number of precautionary measures such as general closures, curfews, and social distancing. Scientists also made strenuous efforts to develop an effective vaccine to train the immune system to develop antibodies to combat the virus, thus reducing its symptoms and limiting its spread. Artificial intelligence, along with researchers and medical authorities, accelerated the vaccine development process through big data processing and simulation. On the other hand, one of the most important negative impacts of Covid-19 was the state of anxiety and fear caused by the spread of rumors through social media, which prompted governments to try to reassure the public by the available means. This study proposes using Sentiment Analysis (also known as Opinion Mining) and deep learning as efficient artificial intelligence techniques to retrieve the public's tweets from Twitter and then analyze them automatically to extract their opinions, expressions, and feelings, negative or positive, about the symptoms they may feel after vaccination. Sentiment analysis is characterized by its ability to access what the public posts on social media in record time and at a lower cost than traditional means such as questionnaires and interviews, not to mention the accuracy of the information, as it comes from what the public expresses voluntarily.

Keywords: deep learning, opinion mining, natural language processing, sentiment analysis

Procedia PDF Downloads 173
3528 Melatonin Improved Vase Quality by Delaying Oxidation Reaction and Supplying More Energies in Cut Peony (Paeonia Lactiflora cv. Sarah)

Authors: Tai Chen, Caihuan Tian, Xiuxia Ren, Jingqi Xue, Xiuxin Zhang

Abstract:

The herbaceous peony has become increasingly popular worldwide in recent years, especially as a cut flower with great economic value. However, peony has a very short vase life, usually only 3-5 d, which seriously affects its commodity value. In this study, we used the cut peony (Paeonia lactiflora cv. Sarah) as a material and found that melatonin treatment significantly improved its postharvest performance. In the control group, the vase life was 4.8 d, accompanied by petal dropping at the end; melatonin treatment (40 μM) increased this time to 6.9 d, without petal dropping at the end. Further study showed that melatonin treatment significantly increased the activity of antioxidant enzymes as well as the reducing sugar content in petals, whereas the starch content in petals decreased. These results indicate that melatonin treatment may delay the oxidation reaction caused by aging and also provide extra energy for maintaining flowering. Through full-length transcriptome sequencing, a total of 2,819 differentially expressed genes (DEGs) between the control and melatonin treatment groups were identified. KEGG enrichment analysis showed that these DEGs were mainly involved in three pathways: melatonin synthesis, starch and sucrose conversion, and plant disease resistance. After RT-qPCR verification, we identified three DEGs, named PlBAM3, PlWRKY22, and PlTIP1, which should play major roles in melatonin-improved postharvest performance. One possible reason is that PlBAM3 causes maltose production (by starch degradation), maintains proline biosynthesis, and then alleviates oxidative stress. Another reason is that both PlBAM3 and PlWRKY22 are key drought resistance regulators, which have the ability to alleviate osmotic stress and improve water absorption, which may also help to improve the postharvest quality of cut peony. In addition, PlTIP1 is involved in the sugar signaling pathway, indicating that sugar may also act as a signal substance during this process. Our work may give new ideas for developing new ways to prolong the vase life of cut peony and eventually improve its commodity value.

Keywords: cut peony, melatonin, vase life, oxidation reaction, energy supply, differentially expressed genes

Procedia PDF Downloads 55
3527 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPU(s) for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core; its characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of linear systems of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 366
3526 Processing Methods for Increasing the Yield, Nutritional Value and Stability of Coconut Milk

Authors: Archana G. Lamdande, Shyam R. Garud, K. S. M. S. Raghavarao

Abstract:

Coconut has two edible parts, that is, a white kernel (solid endosperm) and coconut water (liquid endosperm). The white kernel is generally used in fresh or dried form for culinary purposes. Coconut testa is the brown skin covering the coconut kernel. It is removed by paring of the wet coconut and is obtained as a by-product in coconut processing industries during the production of products such as desiccated coconut, coconut milk, whole coconut milk powder, and virgin coconut oil. At present, it is used as an animal feed component after drying and recovering the residual oil (by expelling). Experiments were carried out on expelling coconut milk from shredded coconut with and without testa removal, in order to explore the possibility of increasing the milk yield and adding value in terms of increased polyphenol content. The color characteristics of coconut milk obtained from grating without removal of testa were observed to be L* 82.79, a* 0.0125, b* 6.245, while those obtained from grating with removal of testa were L* 83.24, a* -0.7925, b* 3.1. A significant increase was observed in the total phenol content of coconut milk obtained from grating with testa (833.8 µl/ml) compared to that without testa (521.3 µl/ml). However, no significant difference was observed in the protein content of coconut milk obtained from grating with and without testa (4.9 and 5.0% w/w, respectively). Coconut milk obtained from grating without removal of testa showed a higher milk yield (62% w/w) compared to that obtained from grating with removal of testa (60% w/w). The fat content of the coconut milk was observed to be 32% (w/w), at which level it is unstable. Therefore, several experiments were carried out to examine its stability by adjusting the fat content to different levels (32, 28, 24, and 20% w/w). It was found that the coconut milk was more stable with a fat content of 24% (w/w). Homogenization and ultrasonication, and their combinations, were used to explore the possibility of further increasing the stability of coconut milk. A microscopic study was carried out to analyze the size of the fat globules and the degree of their uniform distribution.

Keywords: coconut milk, homogenization, stability, testa, ultrasonication

Procedia PDF Downloads 318
3525 ICAM-2, A Protein of Antitumor Immune Response in Mekong Giant Catfish (Pangasianodon gigas)

Authors: Jiraporn Rojtinnakorn

Abstract:

ICAM-2 (intercellular adhesion molecule 2), or CD102 (Cluster of Differentiation 102), is a type I transmembrane glycoprotein composed of 2-9 immunoglobulin-like C2-type domains. ICAM-2 plays a particular role in immune response and cell surveillance. It is involved in innate and specific immunity, cell survival signaling, apoptosis, and anticancer activity. An EST clone of ICAM-2 from P. gigas blood cell EST libraries showed high identity to human ICAM-2 (92%), with a conserved region of the ICAM N-terminal domain and part of the Ig superfamily. The ICAM-2 gene and protein have previously been found in mammals. This is the first report of ICAM-2 in fish.

Keywords: ICAM-2, CD102, Pangasianodon gigas, antitumor

Procedia PDF Downloads 229
3524 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world's top five gold and rare earth mining organizations by production, reserves, and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions which process data from over one hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud software architecture and edge computing, these technological developments enable standardized processes for machine learning applications to influence the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling is processed through edge computing and collectively stored within a data lake. Being involved in the digital transformation has necessitated standardizing the software architecture used to manage the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe, for the purposes of improved worker safety and production efficiency through big data applications.

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 113
3523 Contribution of Spatial Teledetection to the Geological Mapping of the Imiter Buttonhole: Application to the Mineralized Structures of the Principal Corps B3 (CPB3) of the Imiter Mine (Anti-atlas, Morocco)

Authors: Bouayachi Ali, Alikouss Saida, Baroudi Zouhir, Zerhouni Youssef, Zouhair Mohammed, El Idrissi Assia, Essalhi Mourad

Abstract:

The world-class Imiter silver deposit is located on the northern flank of the Precambrian Imiter buttonhole. This deposit is formed by epithermal veins hosted in the sandstone-pelite formations of the lower complex and in the basal conglomerates of the upper complex; these veins are controlled by a regional-scale fault cluster oriented N70°E to N90°E. The present work concerns the contribution of remote sensing to the geological mapping of the Imiter buttonhole, with an application to the mineralized structures of the Principal Corps B3. Mapping on satellite images is a very important tool in mineral prospecting. It allows the localization of zones of interest in order to orient field missions by helping to locate the major structures, which facilitates the interpretation, programming, and orientation of the mining works. The predictive map also allows the correction of field mapping work, especially the direction and dimensions of structures such as dykes, corridors, or scrapings. The use of a series of processing techniques, such as SAM, PCA, MNF, and unsupervised and supervised classification, on a Landsat 8 satellite image of the study area allowed us to highlight the main facies of the Imiter area. To improve the exploration research, we used additional processing to map the spatial distribution of alteration mineral indices, and we applied several filters to the different bands to obtain lineament maps.

Keywords: principal corps B3, teledetection, Landsat 8, Imiter II, silver mineralization, lineaments

Procedia PDF Downloads 97
3522 Comprehensive Study of X-Ray Emission by APF Plasma Focus Device

Authors: M. Habibi

Abstract:

Time-resolved studies of soft and hard X-rays were carried out over a wide range of argon pressures by simultaneously employing an array of eight filtered photo PIN diodes and a scintillation detector. In 50% of the discharges, the soft X-rays are seen to be emitted in multiple short pulses corresponding to different compressions, whereas the hard X-rays appear as a single pulse corresponding only to the first strong compression. It should be noted that multiple compressions occur predominantly at low pressures, while high pressures are mostly in the single-compression regime. In 43% of the discharges, at all pressures except the optimum pressure, the first period is characterized by two or more sharp peaks. The X-ray signal intensity during the second and subsequent compressions is much smaller than during the first compression.

Keywords: plasma focus device, SXR, HXR, Pin-diode, argon plasma

Procedia PDF Downloads 409
3521 Integrating Optuna and Synthetic Data Generation for Optimized Medical Transcript Classification Using BioBERT

Authors: Sachi Nandan Mohanty, Shreya Sinha, Sweeti Sah, Shweta Sharma

Abstract:

The advancement of natural language processing has greatly influenced the field of medical transcript classification, providing a robust framework for enhancing the accuracy of clinical data processing. It has enormous potential to transform healthcare and improve people's livelihoods. This research focuses on improving the accuracy of medical transcript categorization using Bidirectional Encoder Representations from Transformers (BERT) and its specialized variants, including BioBERT, ClinicalBERT, SciBERT, and BlueBERT. The experimental work employs Optuna, an optimization framework, for hyperparameter tuning to identify the most effective variant, concluding that BioBERT yields the best performance. Furthermore, various optimizers, including Adam, RMSprop, and layer-wise adaptive large batch optimization (LAMB), were evaluated alongside BERT's default AdamW optimizer. The findings show that the LAMB optimizer achieves performance on par with AdamW's. Synthetic data generation techniques from Gretel were utilized to augment the dataset, expanding the original dataset from 5,000 to 10,000 rows. Subsequent evaluations demonstrated that the model maintained its performance with synthetic data, with the LAMB optimizer showing marginally better results. The enhanced dataset and optimized model configurations improved classification accuracy, showcasing the efficacy of the BioBERT variant and the LAMB optimizer, resulting in accuracies of up to 98.2% and 90.8% for the original and combined datasets, respectively.
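
A minimal sketch of an Optuna-driven hyperparameter search of the kind described above; the search ranges and the `train_and_score` helper are illustrative assumptions standing in for the authors' actual BioBERT fine-tuning loop.

```python
# Sketch: Optuna study over learning rate and batch size for a BioBERT-style
# classifier. `train_and_score` is a hypothetical stand-in for the real
# fine-tuning loop; search ranges are assumptions.
import optuna

def train_and_score(lr: float, batch_size: int) -> float:
    """Hypothetical helper: fine-tune the classifier and return its eval
    accuracy. A synthetic score keeps this sketch runnable as-is."""
    return 1.0 - abs(lr - 3e-5) * 1e4 - (batch_size - 16) ** 2 * 1e-4

def objective(trial: optuna.Trial) -> float:
    lr = trial.suggest_float("learning_rate", 1e-5, 5e-5, log=True)
    batch_size = trial.suggest_categorical("batch_size", [8, 16, 32])
    return train_and_score(lr, batch_size)

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
print(study.best_params, study.best_value)
```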

Keywords: BioBERT, clinical data, healthcare AI, transformer models

Procedia PDF Downloads 6
3520 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data

Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone

Abstract:

The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms can support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore their ability to distinguish between controls and patients using the mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired by a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. The estimated total lesion load (ml) and the number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR images. All rsFMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding of the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance and the mean Euler angle. WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied an independent component analysis (ICA) with the GIFT toolbox, using the Infomax approach with number of components = 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in each network), with the R language. The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (RFE) for the SVM to obtain a ranking of the most predictive variables. We then built two new classifiers using only the most important features and evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and RFE-SVM was performed, the most important variable was the sensori-motor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best discriminant network between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum, and working memory networks. These findings, in accordance with the early manifestation of motor/sensory deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
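
A minimal scikit-learn sketch of the classification-with-feature-selection step described above (the study itself used R); the toy data shape matches the 37×15 design, while the estimator settings and the linear-kernel SVM used for RFE are illustrative assumptions.

```python
# Sketch: RF with Gini-based importances and SVM with recursive feature
# elimination on a 37-subjects x 15-networks matrix. Settings are assumptions;
# the original analysis was done in R with an RBF-SVM.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = rng.normal(size=(37, 15)), rng.integers(0, 2, size=37)  # toy stand-in
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75, random_state=0)

rf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
top_rf = np.argsort(rf.feature_importances_)[::-1]  # Gini-based ranking

# RFE needs a linear kernel here to expose coefficients for ranking.
rfe = RFE(SVC(kernel="linear"), n_features_to_select=1).fit(X_tr, y_tr)
top_svm = np.argsort(rfe.ranking_)

# Retrain on the single best network, as in the study, and score on test data.
best = [top_rf[0]]
print(RandomForestClassifier(random_state=0).fit(X_tr[:, best], y_tr)
      .score(X_te[:, best], y_te))
```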

Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine

Procedia PDF Downloads 241
3519 A U-Net Based Architecture for Fast and Accurate Diagram Extraction

Authors: Revoti Prasad Bora, Saurabh Yadav, Nikita Katyal

Abstract:

In the context of educational data mining, the use case of extracting information from images containing both text and diagrams is of high importance. Document analysis therefore requires extracting diagrams from such images and processing the text and diagrams separately. To the authors' best knowledge, none of the many existing approaches for extracting tables, figures, etc., satisfies the need for real-time processing with the high accuracy required in many applications. In the education domain, diagrams vary widely in character, e.g., line-based drawings such as geometric diagrams, chemical bonds, and mathematical formulas. Two broad categories of approaches address similar problems: traditional computer-vision-based approaches and deep learning approaches. The traditional computer-vision-based approaches mainly leverage connected components and distance-transform-based processing and hence perform well only in very limited scenarios. The existing deep learning approaches leverage either YOLO or Faster R-CNN architectures and suffer from a speed-accuracy tradeoff. This paper proposes a U-Net based architecture that formulates diagram extraction as a segmentation problem. The proposed method provides similar accuracy with much faster extraction than the mentioned state-of-the-art approaches. Further, the segmentation mask in this approach allows the extraction of diagrams of irregular shapes.
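The following is a minimal sketch of the post-processing step implied by the abstract, assuming a trained U-Net (not shown here) that outputs a per-pixel diagram-probability map; `prob_map`, `page`, and the thresholds are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch: crop diagram regions from a U-Net segmentation output.
import numpy as np
from scipy import ndimage

def extract_diagrams(prob_map: np.ndarray, page: np.ndarray,
                     threshold: float = 0.5, min_area: int = 100):
    """Binarize the U-Net probability map and crop each diagram region.

    Because the mask is per-pixel, irregularly shaped diagrams are
    recovered as whole connected components rather than forced into
    the rectangular anchor boxes of detector-based approaches.
    """
    mask = prob_map > threshold
    labels, _ = ndimage.label(mask)              # connected components
    crops = []
    for region in ndimage.find_objects(labels):  # bounding slices per label
        if region is None:
            continue
        ys, xs = region
        if (ys.stop - ys.start) * (xs.stop - xs.start) < min_area:
            continue                             # drop specks and noise
        crops.append(page[ys, xs])
    return crops
```

Recovering regions from the pixel mask via connected components, rather than from predicted boxes, is what allows irregularly shaped diagrams to be extracted whole, which is the advantage the abstract attributes to the segmentation formulation.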

Keywords: computer vision, deep-learning, educational data mining, faster-RCNN, figure extraction, image segmentation, real-time document analysis, text extraction, U-Net, YOLO

Procedia PDF Downloads 141
3518 Methotrexate Associated Skin Cancer: A Signal Review of Pharmacovigilance Center

Authors: Abdulaziz Alakeel, Abdulrahman Alomair, Mohammed Fouda

Abstract:

Introduction: Methotrexate (MTX) is an antimetabolite used to treat multiple conditions, including neoplastic diseases, severe psoriasis, and rheumatoid arthritis. Skin cancer is the uncontrolled growth of abnormal cells in the epidermis, the outermost skin layer, caused by unrepaired DNA damage that triggers mutations; these mutations cause the skin cells to multiply rapidly and form malignant tumors. The aim of this review is to evaluate the risk of skin cancer associated with the use of methotrexate and to suggest regulatory recommendations if required. Methodology: The Signal Detection team at the Saudi Food and Drug Authority (SFDA) performed a safety review using the National Pharmacovigilance Center (NPC) database as well as the World Health Organization (WHO) VigiBase, along with literature screening, to retrieve information for assessing the causality between skin cancer and methotrexate. The search was conducted in July 2020. Results: Four published articles identified in the literature search support the association. A randomized controlled trial published in 2020 revealed a statistically significant increase in skin cancer among MTX users. Another study reported that methotrexate increases the risk of non-melanoma skin cancer when used in combination with immunosuppressant and biologic agents. In addition, in a cohort study of rheumatoid arthritis patients, the incidence of melanoma among methotrexate users was 3-fold higher than in the general population. The last article, a cohort study estimating the risk of cutaneous malignant melanoma (CMM), observed a statistically significant risk increase for CMM in MTX-exposed patients. The WHO database (VigiBase) was searched for individual case safety reports (ICSRs) reported for 'skin cancer' and 'methotrexate' use, which yielded 121 ICSRs. The initial review revealed that 106 cases were insufficiently documented for proper medical assessment. The remaining fifteen cases were extensively evaluated by applying the WHO criteria of causality assessment. As a result, 30 percent of the cases showed that MTX could possibly cause skin cancer; five cases provided an unlikely association, and five cases were un-assessable due to lack of information. The Saudi NPC database was searched to retrieve any cases reported for the combined terms methotrexate/skin cancer; however, no local cases have been reported to date. The disproportionality between the observed and the expected reporting rates for the drug/adverse drug reaction pair was estimated using the information component (IC), a tool developed by the WHO Uppsala Monitoring Centre to measure the reporting ratio. A positive IC reflects a stronger statistical association, while negative values indicate a weaker association, with the null value equal to zero. Results showed that the combination of 'methotrexate' and 'skin cancer' was observed more often than expected when compared to other medications in the WHO database (IC = 1.2). Conclusion: The weight of cumulative evidence identified from global cases, data mining, and published literature is sufficient to support a causal association between the risk of skin cancer and methotrexate. Therefore, health care professionals should be aware of this possible risk and may consider monitoring for any signs or symptoms of skin cancer in patients treated with methotrexate.
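For context, the following is a simplified sketch of an information-component computation under one common shrinkage form, IC = log2((observed + 0.5) / (expected + 0.5)), with expected = n_drug × n_reaction / n_total under independence. The counts below are made-up illustrations, not VigiBase figures.

```python
# Simplified information component (IC) for disproportionality analysis.
from math import log2

def information_component(n_drug_reaction: int, n_drug: int,
                          n_reaction: int, n_total: int) -> float:
    # Expected count of the drug/reaction pair if drug and reaction
    # were reported independently of each other.
    expected = n_drug * n_reaction / n_total
    # +0.5 shrinkage damps the estimate for rarely reported pairs.
    return log2((n_drug_reaction + 0.5) / (expected + 0.5))

# Illustrative counts only: a positive IC means the pair is reported
# more often than expected under independence; zero is the null value.
print(information_component(121, 50_000, 1_000, 2_000_000))  # ~2.25
```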

Keywords: methotrexate, skin cancer, signal detection, pharmacovigilance

Procedia PDF Downloads 116