Search results for: cancer dataset
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 636

66 Faster Pedestrian Recognition Using Deformable Part Models

Authors: Alessandro Preziosi, Antonio Prioletti, Luca Castangia

Abstract:

Deformable part models achieve high precision in pedestrian recognition, but all publicly available implementations are too slow for real-time applications. We implemented a deformable part model algorithm fast enough for real-time use by exploiting information about the camera position and orientation. This implementation is both faster and more precise than alternative DPM implementations. These results are obtained by computing convolutions in the frequency domain and using lookup tables to speed up feature computation. This approach is almost an order of magnitude faster than the reference DPM implementation, with no loss in precision. Knowing the position of the camera with respect to the horizon, it is also possible to prune many hypotheses based on their size and location. The range of acceptable sizes and positions is set by looking at the statistical distribution of bounding boxes in labelled images. With this approach the entire feature pyramid need not be computed: for example, higher-resolution features are only needed near the horizon. This results in an increase in mean average precision of 5% and an increase in speed by a factor of two. Furthermore, to reduce misdetections involving small pedestrians near the horizon, input images are supersampled near the horizon. Supersampling the image at 1.5 times the original scale results in an increase in precision of about 4%. The implementation was tested against the public KITTI dataset, obtaining an 8% improvement in mean average precision over the best performing DPM-based method. By allowing for a small loss in precision, computational time can easily be brought down to our target of 100 ms per image, yielding a solution that is faster and still more precise than all publicly available DPM implementations.
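
As an illustration of the frequency-domain speedup described above, the following sketch cross-correlates a HOG-like feature map with a part filter using FFT-based convolution; the array shapes and random values are illustrative assumptions, not the paper's configuration.

    import numpy as np
    from scipy.signal import fftconvolve

    # Hypothetical 32-channel feature map (rows x cols x channels) and a 6x6 part filter
    feat = np.random.rand(64, 128, 32)
    filt = np.random.rand(6, 6, 32)

    # Cross-correlation is convolution with a flipped kernel; fftconvolve evaluates it
    # in the frequency domain, which is the source of the large speedup for big filters
    score = sum(fftconvolve(feat[:, :, c], filt[::-1, ::-1, c], mode='valid')
                for c in range(32))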

Keywords: Autonomous vehicles, deformable part model, DPM, pedestrian recognition.

65 Effects of Pressure and Temperature on the Extraction of Benzyl Isothiocyanate by Supercritical Fluids from Tropaeolum majus L. Leaves

Authors: Espinoza S. Clara, Gamarra Q. Flor, Marianela F. Ramos, Quispe S. Miguel, Flores R. Omar

Abstract:

Tropaeolum majus L. is a plant native to South and Central America, used since ancient times by our ancestors to combat different diseases. Glucotropaeolin is one of its main components; when hydrolyzed, it forms benzyl isothiocyanate (BIT), which promotes cellular apoptosis (programmed cell death) in cancer cells. Therefore, the present research aims to evaluate the effect of pressure and temperature on the extraction of BIT by supercritical CO2 from Tropaeolum majus L. The extraction was carried out in a Speed SFE Basic supercritical fluid extractor (PolyScience). The leaves of Tropaeolum majus L. were ground for one hour and lyophilized to a moisture content of 6%. The extraction with supercritical CO2 was carried out at pressures of 200 bar and 300 bar and temperatures of 50 °C, 60 °C, and 70 °C, giving six treatments in combination. BIT was identified by thin layer chromatography using 98% BIT as the standard and hexane:dichloromethane (4:2) as the mobile phase. Subsequently, BIT was quantified by high performance liquid chromatography (HPLC). The highest yield of oleoresin by supercritical CO2 extraction was obtained at a pressure of 300 bar and a temperature of 60 °C, and the highest BIT content (113.615 ± 0.03 mg BIT/100 g dry matter) at 200 bar and 70 °C for 30 minutes.

Keywords: Tropaeolum majus L., supercritical fluids, benzyl isothiocyanate.

64 Copper Price Prediction Model for Various Economic Situations

Authors: Haidy S. Ghali, Engy Serag, A. Samer Ezeldin

Abstract:

Copper is an essential raw material in the construction industry. During 2021 and the first half of 2022, the global market suffered a significant fluctuation in copper raw material prices due to the aftermath of the COVID-19 pandemic and the Russia-Ukraine war, which exposed consumers to unexpected financial risk. Therefore, this paper develops two hybrid price prediction models using artificial neural networks and long short-term memory (ANN-LSTM), implemented in Python, that can forecast the average monthly copper prices traded on the London Metal Exchange: the first is a multivariate model that forecasts the copper price one month ahead, and the second is a univariate model that predicts the copper prices of the upcoming three months. Historical data of average monthly London Metal Exchange copper prices were collected from January 2009 to July 2022, and potential external factors were identified and employed in the multivariate model. These factors fall under three main categories: energy prices and economic indicators of the three major copper-exporting countries, depending on data availability. Before developing the LSTM models, the collected external parameters were analyzed against the copper prices using correlation and multicollinearity tests in R; the parameters were then further screened to select those that influence the copper prices. The two LSTM models were then developed, and the dataset was divided into training, validation, and testing sets. The results show that the performance of the 3-month prediction model is better than that of the 1-month prediction model, but both models can act as prediction tools for diverse economic situations.
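
A minimal univariate sketch of the kind of LSTM forecaster described, in Keras; the 12-month window, layer size, and synthetic price series are illustrative assumptions, not the authors' configuration.

    import numpy as np
    from tensorflow.keras.layers import LSTM, Dense
    from tensorflow.keras.models import Sequential

    def make_windows(prices, window=12):
        # Each sample: the previous 12 monthly prices; target: the next month's price
        X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
        return X[..., np.newaxis], prices[window:]

    X, y = make_windows(np.random.rand(163))  # ~163 monthly points, Jan 2009 - Jul 2022
    model = Sequential([LSTM(32, input_shape=(12, 1)), Dense(1)])
    model.compile(optimizer='adam', loss='mse')
    model.fit(X, y, validation_split=0.2, epochs=10, verbose=0)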

Keywords: Copper prices, prediction model, neural network, time series forecasting.

63 Evaluating the Validity of Computational Fluid Dynamics Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements

Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck

Abstract:

This research presents the validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings at the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale high-resolution grid. The validation study was performed in two steps. First, the CFD model performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from the previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The numerical model showed maximum average percent errors of approximately 53% and 37% for wind incident from the north and northwest, respectively. Good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the mean geometric bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
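
The three dispersion metrics named above have standard definitions in the model validation literature; a sketch, assuming strictly positive observed and predicted concentrations:

    import numpy as np

    def validation_metrics(obs, pred):
        """Fractional bias, mean geometric bias, and normalized mean square error."""
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        fb = (obs.mean() - pred.mean()) / (0.5 * (obs.mean() + pred.mean()))
        mg = np.exp(np.log(obs).mean() - np.log(pred).mean())
        nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
        return fb, mg, nmse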

Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow.

62 Characteristics of Hemodynamics in a Bileaflet Mechanical Heart Valve Using an Implicit FSI Method

Authors: Tae-Hyub Hong, Choeng-Ryul Choi, Chang-Nyung Kim

Abstract:

Human heart valves damaged by congenital heart defects, rheumatic fever, bacterial infection, or cancer may develop stenosis or insufficiency. Treatment may be with medication but often involves valve repair or replacement (insertion of an artificial heart valve). Bileaflet mechanical heart valves (BMHVs) are widely implanted to replace diseased heart valves, but still suffer from complications such as hemolysis, platelet activation, tissue overgrowth, and device failure. These complications are closely related to both the flow characteristics through the valves and the leaflet dynamics. In this study, the physiological flow interacting with the moving leaflets in a bileaflet mechanical heart valve (BMHV) is simulated with a strongly coupled implicit fluid-structure interaction (FSI) method, newly organized based on the arbitrary Lagrangian-Eulerian (ALE) approach and the dynamic mesh (remeshing) method of FLUENT. The simulated results are in good agreement with previous experimental studies. This study shows the applicability of the present FSI model to the complicated physics of interaction between fluid flow and a moving boundary.

Keywords: Bileaflet Mechanical Heart Valve, Fluid-Structure Interaction.

61 A Conceptual Framework of Scheduled Waste Management in Highway Industry

Authors: Nurul Nadhirah Anuar, Muhammad Fauzi Abdul Ghani

Abstract:

Scheduled waste management is very important from environmental and health perspectives. In delivering its services, the highway industry is indirectly involved in producing scheduled wastes. This paper aims to define scheduled waste, to provide a conceptual framework of scheduled waste management in the highway industry, to highlight the effects of improper management of scheduled waste, and to encourage future researchers to identify and share the present practice of scheduled waste management in their countries. An understanding of effective scheduled waste management will help highway operators, academics, and future researchers, and encourage a friendly environment around the world. The study of scheduled waste management in the highway industry is crucial, as highways traverse and run for kilometers across various types of environment, including residential areas and schools. Using the Environmental Quality (Scheduled Waste) Regulations 2005 as a guide, this conceptual paper highlights several scheduled wastes produced by the highway industry in Malaysia and provides a conceptual framework of scheduled waste management focused on the highway industry. Understanding scheduled waste management is vital in order to preserve the environment. Besides that, the waste substances are hazardous to human beings: many diseases have been associated with the improper management of scheduled waste, such as cancer, throat irritation, and respiratory problems.

Keywords: Asia Region, Environment, Highway Industry, Scheduled Waste.

60 Remaining Useful Life Estimation of Bearings Based on Nonlinear Dimensional Reduction Combined with Timing Signals

Authors: Zhongmin Wang, Wudong Fan, Hengshan Zhang, Yimin Zhou

Abstract:

In data-driven prognostic methods, the prediction accuracy of the remaining useful life of bearings mainly depends on the performance of health indicators, which are usually fused from statistical features extracted from vibration signals. However, the existing health indicators have two drawbacks: (1) the different ranges of the statistical features contribute differently to the health indicators, and expert knowledge is required to extract the features; (2) when convolutional neural networks are used to tackle the time-frequency features of signals, the time-series nature of the signals is not considered. To overcome these drawbacks, this study proposes a method combining a convolutional neural network with a gated recurrent unit to extract time-frequency image features. The extracted features are used to construct a health indicator and predict the remaining useful life of bearings. First, the original signals are converted into time-frequency images by continuous wavelet transform so as to form the original feature sets. Second, with the convolutional and pooling layers of the convolutional neural network, the most sensitive features of the time-frequency images are selected from the original feature sets. Finally, these selected features are fed into the gated recurrent unit to construct the health indicator. The results show that the proposed method outperforms related studies that used the same bearing dataset provided by PRONOSTIA.
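
A minimal sketch of a CNN-plus-GRU health-indicator network of the kind described, in PyTorch; the layer sizes, image size, and sequence length are illustrative assumptions rather than the paper's architecture.

    import torch
    import torch.nn as nn

    class CnnGruHI(nn.Module):
        # The CNN encodes each time-frequency image; the GRU models their time order
        def __init__(self):
            super().__init__()
            self.cnn = nn.Sequential(
                nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4), nn.Flatten())      # 16*4*4 = 256 features
            self.gru = nn.GRU(256, 64, batch_first=True)
            self.head = nn.Linear(64, 1)                    # health indicator in (0, 1)

        def forward(self, x):                               # x: (batch, seq, 1, H, W)
            b, t = x.shape[:2]
            f = self.cnn(x.flatten(0, 1)).view(b, t, -1)
            out, _ = self.gru(f)
            return torch.sigmoid(self.head(out[:, -1]))

    hi = CnnGruHI()(torch.rand(2, 5, 1, 64, 64))   # two sequences of five CWT images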

Keywords: Continuous wavelet transform, convolutional neural network, gated recurrent unit, health indicators, remaining useful life.

59 A Deep-Learning Based Prediction of Pancreatic Adenocarcinoma with Electronic Health Records from the State of Maine

Authors: Xiaodong Li, Peng Gao, Chao-Jung Huang, Shiying Hao, Xuefeng B. Ling, Yongxia Han, Yaqi Zhang, Le Zheng, Chengyin Ye, Modi Liu, Minjie Xia, Changlin Fu, Bo Jin, Karl G. Sylvester, Eric Widen

Abstract:

Predicting the risk of pancreatic adenocarcinoma (PA) in advance can benefit the quality of care and potentially reduce population mortality and morbidity. The aim of this study was to develop and prospectively validate a risk prediction model to identify patients at risk of new incident PA as early as 3 months before the onset of PA in a statewide, general population in Maine. The PA prediction model was developed using deep neural networks, a deep learning algorithm, with a 2-year electronic-health-record (EHR) cohort. Prospective results showed that our model identified 54.35% of all inpatient episodes of PA, and 91.20% of all PA that required subsequent chemoradiotherapy, with a lead time of up to 3 months and a true-alert rate of 67.62%. The risk assessment tool attained an improved discriminative ability. It can be immediately deployed to the health system to provide automatic early warnings to adults at risk of PA. It also has the potential to identify personalized risk factors to facilitate customized PA interventions.
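
For orientation, the reported figures correspond to standard confusion-matrix ratios; a toy calculation with hypothetical counts (not the study's raw numbers):

    # Hypothetical confusion counts from one prospective validation batch
    tp, fn, fp = 54, 46, 26
    sensitivity = tp / (tp + fn)   # share of true PA episodes that were flagged
    true_alert = tp / (tp + fp)    # share of alerts that were correct (precision)
    print(f"sensitivity={sensitivity:.2%}, true alert rate={true_alert:.2%}")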

Keywords: Cancer prediction, deep learning, electronic health records, pancreatic adenocarcinoma.

58 Electro-Thermal Imaging of Breast Phantom: An Experimental Study

Authors: H. Feza Carlak, N. G. Gencer

Abstract:

To increase the temperature contrast in thermal images, the characteristics of the electrical conductivity and thermal imaging modalities can be combined. In this experimental study, the objective is to observe whether the temperature contrast created by the tumor tissue can be improved just by current application within medical safety limits. Various thermal breast phantoms are developed to simulate the female breast tissue. In vitro experiments are implemented using a thermal infrared camera in a controlled manner. Since the experiments are implemented in vitro, there is no metabolic heat generation or blood perfusion; only the effects and results of the electrical stimulation are investigated. The experimental study is implemented with two-dimensional models. Temperature contrasts due to the tumor tissues are obtained, and cancerous tissue is determined using the difference and ratio of healthy and tumor images. A single tumor tissue of 1 cm diameter causes a temperature contrast of almost 40 m°C on the thermal breast phantom. Electrode artifacts are reduced by taking the difference and ratio of background (healthy) and tumor images. The ratio of healthy and tumor images shows that the temperature contrast is increased by the current application.
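
A sketch of the difference- and ratio-image step in NumPy, with synthetic frames standing in for the measured thermal images; the temperatures and hot-spot size are assumptions.

    import numpy as np

    # Hypothetical thermal frames (degrees C) of healthy and tumor-bearing phantoms
    healthy = 33.0 + 0.1 * np.random.rand(240, 320)
    tumor = healthy.copy()
    tumor[100:120, 150:170] += 0.04        # a roughly 40 m-degree-C hot spot

    diff = tumor - healthy                 # difference image suppresses electrode artifacts
    ratio = tumor / healthy                # ratio image normalizes the shared background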

Keywords: Medical diagnostic imaging, breast phantom, active thermography, breast cancer detection.

57 A Review on the Mechanism Removal of Pesticides and Heavy Metal from Agricultural Runoff in Treatment Train

Authors: N. A. Ahmad Zubairi, H. Takaijudin, K. W. Yusof

Abstract:

Pesticides have been used widely around the world in agriculture to protect crops from pests and reduce crop losses. However, they release toxic chemicals into the environment, and an excess of toxic constituents in the ecosystem has harmful side effects. Pesticides and heavy metals are tied to the hydrological cycle: they can penetrate from a variety of sources into soil and water bodies, especially through runoff. Therefore, proper mechanisms of pesticide and heavy metal removal should be studied to keep the ecosystem free of, or at least to reduce, unwanted substances. This paper reviews the use of a treatment train and its mechanisms to minimize pesticides and heavy metals in agricultural runoff. Organochlorine (OCL) is a common pesticide found in agricultural runoff. OCL is a toxic chemical that can disturb the ecosystem, for example by inhibiting plant growth, and harm human health, with symptoms such as asthma, cancer, vomiting, and diarrhea. Thus, this unwanted contaminant damages the environment and needs a treatment system. A treatment train based on a bioretention system is suitable, as removal efficiencies of up to 90% of pesticides can be achieved with selected vegetation and additives.

Keywords: Pesticides, heavy metal, agricultural runoff, bioretention, mechanism removal, treatment train.

56 Chemical Composition of Essential Oil and in vitro Antibacterial and Anticancer Activity of the Hydroalcoholic Extract from Coronilla varia

Authors: Dehpour A. A., Eslami B., Rezaie S., Hashemian S. F., Shafie F., Kiaie M.

Abstract:

The aims of this study were to investigate the chemical composition of the essential oil and the antimicrobial and cytotoxic activity of the extract of Coronilla varia. The essential oil of Coronilla varia was obtained by hydrodistillation and analyzed by GC/MS to determine its chemical composition and identify its components. Antibacterial activity of the plant extract was determined by the disc diffusion method, and anticancer activity was measured by the MTT assay. The major components of the essential oil were caryophyllene oxide (60.19%), alpha-cadinol (4.13%), and homoadantaneca robexylic acid (3.31%). The extract from Coronilla varia had interesting activity against Proteus mirabilis at a concentration of 700 μg/disc and did not show any activity against Staphylococcus aureus, Bacillus subtilis, Klebsiella pneumoniae, or Enterobacter cloacae. The positive controls, ampicillin, chloramphenicol, and cephalothin, showed zones of inhibition against all bacteria. The ethanol extract of Coronilla varia inhibited the MCF7 cell line, and an IC50 of 0.6 mg/ml was the optimum concentration of extract for inhibition of cell line growth. The MCF7 cancer cell line and Proteus mirabilis were the most sensitive to the Coronilla varia ethanol extract.

Keywords: Coronilla varia, Essential oil, Antibacterial, Anticancer, HeLa cell line.

55 A Computer Aided Detection (CAD) System for Microcalcifications in Mammograms - MammoScan μCaD

Authors: Kjersti Engan, Thor Ole Gulsrud, Karl Fredrik Fretheim, Barbro Furebotten Iversen, Liv Eriksen

Abstract:

Clusters of microcalcifications in mammograms are an important sign of breast cancer. This paper presents a complete computer aided detection (CAD) scheme for automatic detection of clustered microcalcifications in digital mammograms. The proposed system, MammoScan μCaD, consists of three main steps. First, all potential microcalcifications are detected using a feature extraction method, VarMet, and adaptive thresholding. This will also give a number of false detections. The goal of the second step, Classifier level 1, is to remove everything but microcalcifications. The last step, Classifier level 2, uses learned dictionaries and sparse representations as a texture classification technique to distinguish single, benign microcalcifications from clustered microcalcifications, and to remove some remaining false detections. The system is trained and tested on real digital data from Stavanger University Hospital, and the results are evaluated by radiologists. The overall results are promising, with a sensitivity > 90% and a low false detection rate (approximately 1 unwanted detection per image, or 0.3 false positives per image).
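
The first step relies on adaptive thresholding to flag locally bright candidate pixels; a sketch using OpenCV's local-mean variant (the VarMet features are not reproduced here, and the file name, window size, and offset are assumptions):

    import cv2

    img = cv2.imread('mammogram.png', cv2.IMREAD_GRAYSCALE)  # hypothetical input
    # Keep pixels brighter than their local 15x15 mean by at least 10 gray levels
    mask = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY, 15, -10)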

Keywords: Mammogram, microcalcifications, detection, CAD, MammoScan μCaD, VarMet, dictionary learning, texture, FTCM, classification, adaptive thresholding.

54 Sea Level Characteristics Referenced to Specific Geodetic Datum in Alexandria, Egypt

Authors: Ahmed M. Khedr, Saad M. Abdelrahman, Kareem M. Tonbol

Abstract:

Two geo-referenced sea level datasets (September 2008 - November 2010 and April 2012 - January 2014) were recorded at Alexandria Western Harbour (AWH). An accurate re-definition of the tidal datum, referred to the latest International Terrestrial Reference Frame (ITRF-2014), was discussed and updated to improve our understanding of the old predefined tidal datum at Alexandria. Tidal and non-tidal components of sea level were separated with the use of the Delft-3D hydrodynamic model tide suite (Delft-3D, 2015). Tidal characteristics at AWH were investigated, and harmonic analysis showed the 34 most significant constituents with their amplitudes and phases. The tide was identified as semi-diurnal, as indicated by form factors of 0.24 and 0.25 for the two datasets, respectively. Principal tidal datums related to major tidal phenomena were recalculated referred to a meaningful geodetic height datum. The portion of residual (surge) energy out of the total sea level energy was computed for each dataset and found to be 77% and 72%, respectively. Power spectral density (PSD) showed accurate resolvability in the high band (1-6 cycles/day) for the nominated independent constituents, except for some neighbouring constituents that are too close in frequency. Wind and atmospheric pressure data recorded during the sea level measurements were analysed and cross-correlated with the surge signals, and a moderate association between surge and wind and atmospheric pressure was obtained. In addition, the long-term sea level rise trend at AWH was computed and showed good agreement with earlier estimated rates.
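
The quoted form factor is the standard ratio of the two main diurnal to the two main semidiurnal amplitudes; a sketch with illustrative amplitudes (not the AWH harmonic-analysis values):

    # Hypothetical constituent amplitudes (m) from a harmonic fit
    H = {'M2': 0.060, 'S2': 0.035, 'K1': 0.015, 'O1': 0.008}
    form_factor = (H['K1'] + H['O1']) / (H['M2'] + H['S2'])
    print(form_factor)   # < 0.25 indicates a semidiurnal tide, as reported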

Keywords: Alexandria, Delft-3D, Egypt, geodetic reference, harmonic analysis, sea level.

53 Hands-off Parking: Deep Learning Gesture-Based System for Individuals with Mobility Needs

Authors: Javier Romera, Alberto Justo, Ignacio Fidalgo, Javier Araluce, Joshué Pérez

Abstract:

Nowadays, individuals with mobility needs face a significant challenge when parking vehicles. In many cases, after parking, they encounter insufficient space to exit, leading to two undesired outcomes: either avoiding parking in that spot or settling for an improperly placed vehicle. To address this issue, this paper presents a parking control system employing gestural teleoperation. The system comprises three main phases: capturing body markers, interpreting gestures, and transmitting orders to the vehicle. The initial phase is centered around the MediaPipe framework, a versatile tool optimized for real-time gesture recognition. MediaPipe excels at detecting and tracing body markers, with a special emphasis on hand gestures; hand detection generates 21 reference points for each hand. Subsequently, after data capture, the project employs a multilayer perceptron (MLP) for in-depth gesture classification. This tandem of MediaPipe's extraction prowess and the MLP's analytical capability ensures that human gestures are translated into actionable commands with high precision. Furthermore, the system has been trained and validated on a purpose-built dataset. To prove domain adaptation, a framework based on the Robot Operating System 2 (ROS2) as a communication backbone, alongside the CARLA simulator, is used. Following successful simulations, the system was transitioned to a real-world platform, marking a significant milestone in the project. This real-vehicle implementation verifies the practicality and efficiency of the system beyond theoretical constructs.
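
A sketch of the landmark-capture phase using the MediaPipe Hands solution; the input file and confidence threshold are assumptions, and the 63-value vector is one plausible feature layout for the downstream MLP.

    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.5)
    frame = cv2.imread('frame.png')                       # hypothetical camera frame
    res = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if res.multi_hand_landmarks:
        lm = res.multi_hand_landmarks[0].landmark         # the 21 reference points
        features = [c for p in lm for c in (p.x, p.y, p.z)]  # 63-dim MLP input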

Keywords: Gesture detection, MediaPipe, multilayer perceptron, Robot Operating System.

52 Providing Emotional Support to Children under Long-Term Health Treatments

Authors: Ramón Cruzat, Sergio F. Ochoa, Ignacio Casas, Luis A. Guerrero, José Bravo

Abstract:

Patients under health treatments that involve long stays at a hospital or health center (e.g. cancer, organ transplants, and severe burns) tend to get bored or depressed because of the lack of social interaction with family and friends. Such a situation also affects the evolution and effectiveness of their treatments. In many cases, the solution to this problem involves extra challenges, since many patients need to rest quietly (or remain in bed) due to their being contagious. Considering the weak health condition these kids are usually in, keeping them motivated and quiet represents an important challenge for nurses and caregivers. This article presents a mobile ubiquitous game called MagicRace, which allows hospitalized kids to interact socially with one another without putting their sensitive health conditions at risk. The game does not require a communication infrastructure at the hospital; instead, it uses a mobile ad hoc network composed of the handheld devices the kids use to play. The usability and performance of this application were tested in two different sessions. The preliminary results show that users experienced positive feelings from this experience.

Keywords: Ubiquitous game, children's emotional support, social isolation, mobile collaborative interactions.

51 Variability in Near-Surface Ultraviolet Radiation and Its Dependence on Atmospheric Parameters

Authors: Yusuff Idowu Moshood, Sanni Mohammed

Abstract:

Natural radiation such as ultraviolet (UV) radiation from the sun is known to be a main cause of skin cancer, sunburn, eye damage, premature aging of the skin, and other skin-related diseases. The percentage of this radiation reaching the populace and its impacts are not well known, and its near-surface variability, and hence its impact on the populace, depends on several atmospheric parameters. Hence, this work set out to determine the variability in near-surface UV radiation and its dependence on atmospheric parameters at different times of the day in Offa, Nigeria. The variability was determined using data obtained from the meteorological garden of the Science Laboratory Technology Department, Federal Polytechnic Offa, Nigeria. The data obtained were solar UV radiation, solar radiation, temperature, humidity, and pressure at 30-minute intervals. Relationships were determined and correlations derived using the SPSS Pearson correlation tool. The results showed significant correlations at the 0.01 and 0.05 levels, revealing good relationships between solar UV radiation and the other atmospheric parameters. Inferentially, interdependent relationships were found to exist. Therefore, the relationships obtained could serve as a yardstick for decision-making in short-term environmental planning regarding solar UV radiation, depending on atmospheric parameters, within the Offa locality.
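
The study used SPSS; an equivalent Pearson test in SciPy, sketched with a hypothetical data file and column names:

    import pandas as pd
    from scipy.stats import pearsonr

    df = pd.read_csv('offa_weather.csv')     # hypothetical 30-minute readings
    r, p = pearsonr(df['uv'], df['temperature'])
    print(f"UV vs temperature: r={r:.2f}, p={p:.3g}")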

Keywords: Correlation, inferential, radiation, yardstick.

50 A Pairwise-Gaussian-Merging Approach: Towards Genome Segmentation for Copy Number Analysis

Authors: Chih-Hao Chen, Hsing-Chung Lee, Qingdong Ling, Hsiao-Jung Chen, Sun-Chong Wang, Li-Ching Wu, H.C. Lee

Abstract:

Segmentation, filtering out of measurement errors, and identification of breakpoints are integral parts of any analysis of microarray data for the detection of copy number variation (CNV). Existing algorithms designed for these tasks have had some success in the past, but they tend to be O(N²) in either computation time or memory requirement, or both, and the rapid advance of microarray resolution has practically rendered such algorithms useless. Here we propose an algorithm, SAD, that is much faster and far less memory-hungry (O(N) in both computation time and memory requirement) and offers higher accuracy. The two key ingredients of SAD are the fundamental assumption in statistics that measurement errors are normally distributed and the mathematical relation that the product of two Gaussians is another Gaussian (function). We have produced a computer program for analyzing CNV based on SAD. In addition to being fast and small, it offers two important features: quantitative statistics for predictions and, with only two user-decided parameters, ease of use. Its speed shows little dependence on genomic profile. Running on an average modern computer, it completes CNV analyses for a 262 thousand-probe array in about 1 second and a 1.8 million-probe array in 9 seconds.
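
The second ingredient has a closed form: the product of two Gaussian densities is, up to normalization, another Gaussian whose parameters follow from completing the square. A sketch:

    def merge_gaussians(mu1, var1, mu2, var2):
        """Mean and variance of the Gaussian proportional to N(mu1, var1) * N(mu2, var2)."""
        var = var1 * var2 / (var1 + var2)
        mu = (mu1 * var2 + mu2 * var1) / (var1 + var2)
        return mu, var

    # e.g. merging two noisy probe measurements of the same genomic segment
    print(merge_gaussians(0.20, 0.01, 0.30, 0.02))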

Keywords: Cancer, pathogenesis, chromosomal aberration, copy number variation, segmentation analysis.

49 Automatic Segmentation of Lung Areas in Magnetic Resonance Images

Authors: Alireza Osareh, Bita Shadgar

Abstract:

Segmenting the lungs in medical images is a challenging and important task for many applications. In particular, automatic segmentation of lung cavities from multiple magnetic resonance (MR) images is very useful for oncological applications such as radiotherapy treatment planning. However, distinguishing the lung areas is not trivial due to highly variable lung shapes, low contrast, and poorly defined boundaries. In this paper, we address the lung segmentation problem in pulmonary magnetic resonance images and propose an automated method based on a robust region-aided geometric snake that incorporates a modified diffused region force into the standard geometric model definition. The extra region force gives the snake a global complementary view of the lung boundary information within the image, which, along with the local gradient flow, helps detect fuzzy boundaries. The proposed method has been successful in segmenting the lungs in every slice of 30 magnetic resonance images, with 80 consecutive slices in each image. We present results by comparing our automatic method to manually segmented lung cavities provided by an expert radiologist and to those of previous works, showing encouraging results and the high robustness of our approach.

Keywords: Active contours, breast cancer, fuzzy c-means segmentation, treatment planning.

48 Energy Deposited by Secondary Electrons Generated by Swift Proton Beams through Polymethylmethacrylate

Authors: Maurizio Dapor, Isabel Abril, Pablo de Vera, Rafael Garcia-Molina

Abstract:

The ionization yield of ion tracks in polymers and bio-molecular systems reaches a maximum, known as the Bragg peak, close to the end of the ion trajectories. Along the path of the ions through the materials, many electrons are generated, which produce a cascade of further ionizations and, consequently, a shower of secondary electrons. Among these, very low energy secondary electrons can produce damage in the biomolecules by dissociative electron attachment. This work deals with the calculation of the energy distribution of electrons produced by protons in a sample of polymethylmethacrylate (PMMA), a material that is used as a phantom for living tissues in hadron therapy. PMMA is also of relevance for microelectronics in CMOS technologies and as a photoresist mask in electron beam lithography. We present a Monte Carlo code that, starting from a realistic description of the energy distribution of the electrons ejected by protons moving through PMMA, simulates the entire cascade of generated secondary electrons. By following in detail the motion of all these electrons, we find the radial distribution of the energy that they deposit in PMMA for several initial proton energies characteristic of the Bragg peak.

Keywords: Monte Carlo method, secondary electrons, energetic ions, ion-beam cancer therapy, ionization cross section, polymethylmethacrylate, proton beams, radial energy distribution.

47 Pre-Malignant Breast Lesions, Methods of Treatment and Outcome

Authors: Ahmed Mostafa, Mohamed Mahmoud, Nesreen H. Hafez, Mohamed Fahim

Abstract:

This retrospective study includes 60 patients with pre-invasive breast cancer. Aim of the study: evaluation of premalignant lesions of the breast (ductal carcinoma in situ, DCIS), different treatment methods, and outcomes. Patients and methods: 60 patients with DCIS were studied over the period from 2005 to 2012. For 38 patients (63.3%), the primary surgical method was wide local excision (WLE), while the other 22 patients (36.7%) had a mastectomy. Fourteen of those who underwent local excision received radiotherapy, while no adjuvant radiotherapy was given to those who underwent mastectomy. In cases of hormone-receptor-positive DCIS lesions, hormonal treatment (tamoxifen) was given after local control. Results: There was no difference in overall survival between mastectomy and breast-conserving therapy (wide local excision and adjuvant radiotherapy); however, the local recurrence rate was higher with breast-conserving therapy. Axillary evacuation has no role in DCIS. The use of hormonal therapy decreased the incidence of local recurrence by about 98%. Conclusion: The main management of DCIS is local treatment (wide local excision and radiotherapy), with hormonal treatment in cases of hormone-receptor-positive lesions.

Keywords: Ductal carcinoma in situ, surgical treatment, radiotherapy, breast conserving therapy, hormonal treatment.

46 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis

Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan

Abstract:

Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insights to decision-makers. The integration of big data technology into the tourism industry will allow companies to conclude where their customers have been and what they like. This information can then be used by businesses, such as those in charge of managing visitor centres or hotels, and tourists can get a clear idea of places before visiting. On the technical side, natural language is processed by analysing the sentiment features of online reviews from tourists, and we supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction from travel reviews. We constructed a web review database using a crawler and web scraping techniques for experimental validation to evaluate the effectiveness of our methodology. The sentences were first classified with the VADER and RoBERTa models to obtain the polarity of the reviews. In this paper, we studied feature extraction methods such as count vectorization and term frequency-inverse document frequency (TF-IDF) vectorization, and implemented a convolutional neural network (CNN) classifier for the sentiment analysis, to decide whether a tourist's attitude towards a destination is positive, negative, or simply neutral based on the review text posted online. The results demonstrate that, after preprocessing and cleaning the dataset, the CNN algorithm achieved an accuracy of 96.12% for positive and negative sentiment analysis.
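
A sketch of the two vectorization schemes compared, using scikit-learn; the review strings are invented examples:

    from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

    reviews = ["loved the harbour views", "room was noisy and dirty"]  # hypothetical
    counts = CountVectorizer().fit_transform(reviews)   # raw term counts
    tfidf = TfidfVectorizer().fit_transform(reviews)    # down-weights terms common to all reviews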

Keywords: Count vectorization, Convolutional Neural Network, Crawler, data technology, Long Short-Term Memory, LSTM, Web Scraping, sentiment analysis.

45 Spatial Clustering Model of Vessel Trajectory to Extract Sailing Routes Based on AIS Data

Authors: Lubna Eljabu, Mohammad Etemad, Stan Matwin

Abstract:

The automatic extraction of shipping routes is advantageous for intelligent traffic management systems to identify events and support decision-making in maritime surveillance. At present, there is a high demand for the extraction of maritime traffic networks that accurately resemble the real traffic of vessels, which is valuable for further analytical processing of vessel trajectories (e.g., naval routing and voyage planning, anomaly detection, destination prediction, time of arrival estimation). With the help of big data and the processing of huge amounts of vessel trajectory data, it is possible to learn these shipping routes from the navigation history of other, similar ships that were travelling in a given area. In this paper, we propose a spatial clustering model of vessel trajectories (SPTCLUST) to extract spatial representations of sailing routes from historical Automatic Identification System (AIS) data. The whole model consists of three main parts: data preprocessing, path finding, and route extraction, the last consisting of clustering and representative trajectory extraction. The proposed clustering method provides techniques to overcome the problems of (i) optimal input parameter selection; (ii) the high complexity of processing a huge volume of multidimensional data; and (iii) the spatial representation of complete representative trajectory detection in the context of trajectory clustering algorithms. The experimental evaluation showed the effectiveness of the proposed model on a real-world AIS dataset from the Port of Halifax. The results contribute to further understanding of shipping route patterns, which could aid surveillance authorities in stable and sustainable vessel traffic management.
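
SPTCLUST itself is not reproduced here; as a generic illustration of the spatial-clustering stage, the sketch below density-clusters AIS positions with DBSCAN (whose parameter-selection burden is exactly one of the problems the proposed model addresses). All coordinates are synthetic.

    import numpy as np
    from sklearn.cluster import DBSCAN

    # Synthetic AIS positions (latitude, longitude) scattered around one shipping lane
    rng = np.random.default_rng(0)
    lane = np.column_stack([np.linspace(44.60, 44.70, 200),
                            np.linspace(-63.60, -63.40, 200)])
    points = lane + rng.normal(scale=0.002, size=lane.shape)
    labels = DBSCAN(eps=0.01, min_samples=10).fit_predict(points)  # -1 marks noise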

Keywords: Vessel trajectory clustering, trajectory mining, spatial clustering, marine intelligent navigation, maritime traffic network extraction, sailing routes extraction.

44 Toward Indoor and Outdoor Surveillance Using an Improved Fast Background Subtraction Algorithm

Authors: A. El Harraj, N. Raissouni

Abstract:

The detection of moving objects in video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most used approach for moving object detection/tracking is background subtraction. Many background subtraction approaches have been suggested, but these are sensitive to illumination changes, and the solutions proposed to bypass this problem are time-consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and mainly focus on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: establishing illumination-change invariance, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the intensity of each pixel to a user-determined maximum. Thus, it mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence; then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K=5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation transformations to remove possible noise. For experimental testing, we used a standard dataset to assess the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
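
A sketch of the enhancement and modeling stages in OpenCV; here the library's GMM-based subtractor (MOG2) stands in for the per-channel K=5 mixture described, and the file name and parameters are assumptions:

    import cv2

    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    mog = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

    cap = cv2.VideoCapture('scene.mp4')      # hypothetical input video
    ok, frame = cap.read()
    while ok:
        gray = clahe.apply(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))  # illumination invariance
        mask = mog.apply(gray)                                       # foreground mask
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)        # erosion then dilation
        ok, frame = cap.read()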

Keywords: Video surveillance, background subtraction, Contrast Limited Adaptive Histogram Equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes.

43 Design Criteria for Achieving Acceptable Indoor Radon Concentration

Authors: T. Valdbjørn Rasmussen

Abstract:

Design criteria for achieving an acceptable indoor radon concentration are presented in this paper. The paper suggests three design criteria, which have to be considered at the early stage of the building design phase to meet the latest recommendations from the World Health Organization in most countries. The three design criteria are: first, establishing a radon barrier facing the ground; second, lowering the air pressure in the lower zone of the slab on ground facing downwards; third, diluting the indoor air with outdoor air. The first two criteria can prevent radon from infiltrating from the ground, and the third can dilute the indoor air. By combining these three criteria, the indoor radon concentration can be lowered to an acceptable level. In addition, a cheap and reliable method for measuring the radon concentration in the indoor air is described. The provision on radon in the Danish Building Regulations complies with the latest recommendations from the World Health Organization. Radon can cause lung cancer, and it is not known whether there is a lower limit below which it is not harmful to human beings. Therefore, it is important to reduce the radon concentration as much as possible in buildings. Airtightness is an important factor when dealing with buildings. It is important to avoid air leakages in the building envelope both facing the atmosphere, e.g. in compliance with energy requirements, and facing the ground, to ensure and control the indoor environment. Infiltration of air from the ground underneath a building is the main source of radon in the indoor air.

Keywords: Radon, natural radiation, barrier, pressure lowering, ventilation.

42 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter

Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal

Abstract:

Regulatory bodies have proposed limits on particulate matter (PM) concentrations in air; however, they do not explicitly indicate that the toxic effects of the constituents of PM were incorporated in developing those limits. This study aimed to provide a structured approach to incorporating the toxic effects of components in developing regulatory limits on PM. A four-step human health risk assessment framework consists of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health); (2) exposure assessment (parameters: concentrations of PM and constituents, information on the size and shape of PM, and the fate and transport of PM and constituents in the respiratory system); (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents); and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at every step were obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment to determine limits on PM values. The identified data gaps were: (1) the concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship of the toxicity of PM to its components.
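
For the non-carcinogenic branch of step (4), the hazard quotient is the ratio of the estimated dose to the reference dose; a toy calculation with hypothetical numbers:

    # Hypothetical daily intake vs. reference dose for one PM2.5 metal constituent
    average_daily_dose = 2.0e-4   # mg/kg-day, from the exposure assessment
    reference_dose = 3.0e-4       # mg/kg-day, from the dose-response assessment
    hazard_quotient = average_daily_dose / reference_dose   # > 1 flags potential concern
    print(hazard_quotient)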

Keywords: Air, component-specific toxicity, human health risks, particulate matter.

41 Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory

Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock

Abstract:

Detecting subjectively biased statements is a vital task. This is because this kind of bias, when present in text or other forms of information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflicts amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks like sentiment analysis, opinion identification, and bias neutralization. Having a system that can adequately detect subjectivity in text will significantly boost research in the above-mentioned areas. It can also come in handy for platforms like Wikipedia, where the use of neutral language is important. The goal of this work is to identify subjectively biased language in text at the sentence level. With machine learning, we can solve complex AI problems, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as the upstream model. BERT by itself can be used as a classifier; however, in this study, we use BERT as a data preprocessor as well as an embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network incorporating an attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model using the Wiki Neutrality Corpus (WNC), which was compiled from Wikipedia edits that removed various biased instances from sentences, as a benchmark dataset, on which we also compare our model to existing approaches. Experimental analysis indicates improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.
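
A minimal sketch of the BERT-embedding, BiLSTM, and attention stages in PyTorch with the Hugging Face transformers library; the hidden sizes, checkpoint, and example sentence are assumptions, not the paper's exact setup.

    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    tok = AutoTokenizer.from_pretrained('bert-base-uncased')
    bert = AutoModel.from_pretrained('bert-base-uncased')
    bilstm = nn.LSTM(768, 128, batch_first=True, bidirectional=True)
    attn = nn.Linear(256, 1)
    head = nn.Linear(256, 2)                        # biased vs. neutral

    enc = tok(["the senator bravely defended the bill"], return_tensors='pt')
    with torch.no_grad():
        emb = bert(**enc).last_hidden_state         # BERT as embedding generator
    out, _ = bilstm(emb)
    w = torch.softmax(attn(out), dim=1)             # attention weights over tokens
    logits = head((w * out).sum(dim=1))             # sentence-level classification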

Keywords: Subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing.

40 Contextual SenSe Model: Word Sense Disambiguation Using Sense and Sense Value of Context Surrounding the Target

Authors: Vishal Raj, Noorhan Abbas

Abstract:

Ambiguity in Natural Language Processing (NLP) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This results in various kinds of ambiguity, such as lexical, syntactic, semantic, anaphoric, and referential ambiguity. This study focuses mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we have used the lemma and Part of Speech (POS) tokens of words for training and testing. Lemma adds generality, and POS adds the word's properties to the token. We have designed a method to create an affinity matrix that holds the affinity between any pair of lemma_POS tokens (a token where the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we have devised an algorithm to create sense clusters of tokens using the affinity matrix under a hierarchy of the POS of the lemma. Furthermore, three different mechanisms to predict the sense of the target word using the affinity/similarity value are devised. Each contextual token contributes some value to the sense of the target word, and whichever sense receives the highest value becomes the sense of the target word. Contextual tokens thus play a key role in creating sense clusters and predicting the sense of the target word; hence, the model is named the Contextual SenSe Model (CSM). CSM is notably simple and easy to interpret, in contrast to contemporary deep learning models, which are intricate, time-intensive, and hard to explain. CSM is trained on the SemCor training data and evaluated on the SemEval test dataset. The results indicate that, despite the naivety of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) model.
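
A sketch of one simple instantiation of the affinity matrix, with sentence-level co-occurrence counts standing in for the paper's affinity values (the exact definition is not given in the abstract); the sentences are invented:

    from collections import Counter
    from itertools import combinations

    # Hypothetical sentences already reduced to lemma_POS tokens
    sents = [['bank_NOUN', 'raise_VERB', 'rate_NOUN'],
             ['river_NOUN', 'bank_NOUN', 'flood_VERB']]

    pair_counts = Counter()
    for s in sents:
        pair_counts.update(combinations(sorted(set(s)), 2))

    # Affinity of a token pair: how often the pair co-occurs in a sentence
    affinity = dict(pair_counts)
    print(affinity[('bank_NOUN', 'rate_NOUN')])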

Keywords: Word Sense Disambiguation, WSD, Contextual SenSe Model, Most Frequent Sense, part of speech, POS, Natural Language Processing, NLP, OOV, out of vocabulary, ELMo, Embeddings from Language Model, BERT, Bidirectional Encoder Representations from Transformers, Word2Vec, lemma_POS, Algorithm.

39 Study of Encapsulation of Quantum Dots in Polystyrene and Poly(ε-Caprolactone) Microreactors Prepared by Microvolcanic Eruption of Freeze-Dried Microspheres

Authors: Deepak Kukkar, Inderpreet Kaur, Jagtar Singh, Lalit M Bharadwaj

Abstract:

Polymeric microreactors have emerged as a new generation of carriers that hold tremendous promise in the areas of cancer therapy, controlled drug delivery, removal of pollutants, etc. The present work reports a simple and convenient methodology for the synthesis of polystyrene and poly(ε-caprolactone) microreactors. An aqueous suspension of carboxylated (1 μm) polystyrene latex particles was mixed with a toluene solution, followed by freezing with liquid nitrogen. The frozen particles were incubated at -20 °C and characterized for the formation of voids on the surface of the polymer microspheres by field emission scanning electron microscopy (FE-SEM). The hollow particles were then incubated overnight at 40 °C with unfunctionalized quantum dots (QDs) in a 5:1 ratio. The QD-encapsulated polystyrene microcapsules were characterized by fluorescence microscopy. Likewise, poly(ε-caprolactone) microreactors were prepared by micro-volcanic rupture of freeze-dried microspheres synthesized by emulsification of the polymer with aqueous poly(vinyl alcohol) and frozen with liquid nitrogen. The microreactors were examined by FE-SEM for size and morphology. The current study is an attempt to create hollow polymer particles that can be employed for the microencapsulation of nanoparticles and drug molecules.

Keywords: FE-SEM, Microreactors, Microvolcanic rupture, Poly(ε-caprolactone), Polystyrene.

38 Methods for Material and Process Monitoring by Characterization of (Second and Third Order) Elastic Properties with Lamb Waves

Authors: R. Meier, M. Pander

Abstract:

In accordance with the Industry 4.0 concept, manufacturing process steps as well as the materials themselves will be increasingly digitalized in the coming years. The "digital twin", representing the simulated and measured dataset of the (semi-finished) product, can be used to control and optimize the individual processing steps and help to reduce costs and expenditure of time in product development, manufacturing, and recycling. In the present work, two material characterization methods based on Lamb waves were evaluated and compared. For demonstration purposes, both methods were applied to a standard industrial product: copper ribbons, often used in photovoltaic modules as well as in high-current microelectronic devices. By numerically fitting the Rayleigh-Lamb dispersion model to measured phase velocities, second-order elastic constants (Young's modulus, Poisson's ratio) were determined. Furthermore, the effective third-order elastic constants were evaluated by applying elastic, "non-destructive" mechanical stress to the samples. In this way, small microstructural variations due to mechanical preconditioning could be detected for the first time. Both methods were compared with respect to precision and inline application capabilities. The microstructure of the samples was systematically varied by mechanical loading and annealing, and changes in the elastic ultrasound transport properties were correlated with results from microstructural analysis and mechanical testing. In summary, monitoring the elastic material properties of plate-like structures using Lamb waves is valuable for inline, non-destructive material characterization and manufacturing process control. Second-order elastic constant analysis is robust over a wide range of environmental and sample conditions, whereas the effective third-order elastic constants greatly increase the sensitivity to small microstructural changes. Both Lamb wave based characterization methods fit well into the Industry 4.0 concept.
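
For reference, the symmetric-mode Rayleigh-Lamb equation that is fitted to the measured phase velocities can be written as a residual whose roots in phase velocity trace the dispersion curve; the bulk wave speeds below are assumed nominal values for copper, not the paper's results.

    import numpy as np

    def s_mode_residual(freq_hz, c_phase, thickness, c_l=4700.0, c_t=2300.0):
        """Residual of the symmetric Rayleigh-Lamb equation (zero on the S modes)."""
        w = 2 * np.pi * freq_hz
        k = w / c_phase
        p = np.lib.scimath.sqrt((w / c_l) ** 2 - k ** 2)
        q = np.lib.scimath.sqrt((w / c_t) ** 2 - k ** 2)
        h = thickness / 2
        return np.real(np.tan(q * h) / q
                       + 4 * k ** 2 * p * np.tan(p * h) / (q ** 2 - k ** 2) ** 2)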

Keywords: Lamb waves, industry 4.0, process control, elasticity, acoustoelasticity.

37 Improved Computational Efficiency of Machine Learning Algorithms Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK

Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick

Abstract:

The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in the UK in late January 2020, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerable. The purpose of this research is to develop a predictive machine learning (ML) model that can forecast COVID-19 cases within the UK. This study concentrates on statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total COVID-19 cases registered, new cases encountered on a daily basis, total deaths registered, and daily deaths due to coronavirus is collected from the World Health Organization (WHO). Data preprocessing is carried out to identify any missing values, outliers, or anomalies in the dataset. The data are split in an 8:2 ratio for training and testing purposes to forecast future new COVID-19 cases. Support Vector Machine (SVM), Random Forest (RF), and linear regression (LR) algorithms are chosen to study the model performance in the prediction of new COVID-19 cases. From evaluation metrics such as the r-squared value and mean squared error, the statistical performance of the model in predicting new COVID-19 cases is evaluated. RF outperformed the other two ML algorithms, with a training accuracy of 99.47% and a testing accuracy of 98.26% when n = 30. The mean square error obtained for RF is 4.05e11, which is lower than that of the other predictive models used in this study. From the experimental analysis, the RF algorithm can perform more effectively and efficiently in predicting new COVID-19 cases, which could help the health sector take relevant control measures against the spread of the virus.
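
A sketch of the RF branch with scikit-learn, using the same 8:2 chronological split and reading "n = 30" as the number of trees; the lag-feature construction and synthetic series are assumptions.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error, r2_score
    from sklearn.model_selection import train_test_split

    # Synthetic daily new-case series; features: the previous 7 days of counts
    cases = 100 * np.abs(np.cumsum(np.random.default_rng(1).normal(size=430)))
    X = np.array([cases[i:i + 7] for i in range(len(cases) - 7)])
    y = cases[7:]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)
    rf = RandomForestRegressor(n_estimators=30, random_state=0).fit(X_tr, y_tr)
    pred = rf.predict(X_te)
    print(f"R2={r2_score(y_te, pred):.3f}  MSE={mean_squared_error(y_te, pred):.1f}")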

Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest.
