Search results for: Mike Gray
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 241

121 Early Detection of Breast Cancer in Digital Mammograms Based on Image Processing and Artificial Intelligence

Authors: Sehreen Moorat, Mussarat Lakho

Abstract:

A method of artificial intelligence using digital mammogram data is proposed in this paper for the detection of breast cancer. Many researchers have developed techniques for the early detection of breast cancer; early diagnosis helps to save many lives. Detecting breast cancer through mammography is an effective method that finds the cancer before it can be felt and increases the survival rate. In this paper, we have proposed an image processing technique for enhancing the image to detect the graphical table data and markings. Texture features based on the Gray-Level Co-Occurrence Matrix and intensity-based features are extracted from the selected region. For classification, a neural-network-based supervised classifier system is used which can discriminate between benign and malignant cases. 68 digital mammograms have been used to train the classifier. The obtained results show that automated detection of breast cancer is beneficial for early diagnosis and increases the survival rates of breast cancer patients. The proposed system will help radiologists in the better interpretation of breast cancer.
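
As a rough illustration of the texture features this abstract mentions, the sketch below computes a normalized gray-level co-occurrence matrix and two standard GLCM statistics in plain NumPy. The offset, number of gray levels, and choice of statistics are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalized Gray-Level Co-Occurrence Matrix for one pixel offset.

    img must contain integer gray levels in [0, levels). Entry (i, j) is
    the probability that level i co-occurs with level j at offset (dx, dy).
    """
    h, w = img.shape
    m = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def texture_features(p):
    """Contrast, homogeneity and energy statistics of a GLCM."""
    i, j = np.indices(p.shape)
    contrast = float(np.sum(p * (i - j) ** 2))
    homogeneity = float(np.sum(p / (1.0 + np.abs(i - j))))
    energy = float(np.sum(p ** 2))
    return contrast, homogeneity, energy
```

In a pipeline like the one described, such features (together with intensity statistics) would form the input vector of the supervised classifier.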

Keywords: medical imaging, cancer, processing, neural network

Procedia PDF Downloads 233
120 The Study of the Implications of Digital Communities on Modern Business Performance: Case of Data Leak

Authors: Asim Majeed, Anwar Ul Haq, Ayesha Asim, Mike Lloyd-Williams, Arshad Jamal, Usman Butt

Abstract:

This study aims to investigate the impact of the M&S customer data leak on digital communities. Modern businesses use digital communities as an important public relations tool for marketing purposes. This form of communication helps companies build better relationships with their customers and also acts as another source of information. Communication between customers and organizations is not regulated, so users may post positive and negative comments. New platforms are being developed on a daily basis, and it is crucial for businesses not only to get familiar with them but also to know how to reach their existing and prospective consumers. Digital communities are the driving force of marketing and communication in modern businesses, and they are continuously increasing and developing. This phenomenon is changing the way marketing is conducted. The current research discusses the implications for M&S business performance after the data was exploited on digital communities; users contacted M&S and raised security concerns. M&S closed down its website for a few hours to try to resolve the issue, and the next day M&S made a public apology about the incident. This information proliferated on various digital communities and negatively impacted the M&S brand name, sales, and customers. A content analysis approach is used to collect qualitative data from 100 digital bloggers, including social media communities such as Facebook and Twitter. The results and findings provide useful new insights into the nature and form of the security concerns of digital users. The findings have theoretical and practical implications. This research showcases a large corporation utilizing various digital community platforms and can serve as a model for future organizations.

Keywords: digital communities, performance, dissemination, implications, data exploitation

Procedia PDF Downloads 369
119 Digital Material Characterization Using the Quantum Fourier Transform

Authors: Felix Givois, Nicolas R. Gauger, Matthias Kabel

Abstract:

Efficient digital material characterization is of great interest to many fields of application. It consists of the following three steps. First, a 3D reconstruction of 2D scans must be performed. Then, the resulting gray-value image of the material sample is enhanced by image processing methods. Finally, partial differential equations (PDEs) are solved on the segmented image, and by averaging the resulting solution fields, effective properties like stiffness or conductivity can be computed. Due to the high resolution of current CT images, the latter step is typically performed with matrix-free solvers. Among them, a solver that uses the explicit formula of the Green-Eshelby operator in Fourier space has been proposed by Moulinec and Suquet. Its most complex algorithmic part is the Fast Fourier Transform (FFT). In our talk, we will discuss the potential quantum advantage that can be obtained by replacing the FFT with the Quantum Fourier Transform (QFT). We will especially show that the data transfer for noisy intermediate-scale quantum (NISQ) devices can be improved by using appropriate boundary conditions for the PDE, which also allows using semi-classical versions of the QFT. In the end, we will compare the results of the QFT-based algorithm for simple geometries with the results of the FFT-based homogenization method.

Keywords: maximum likelihood quantum amplitude estimation (MLQAE), numerical homogenization, quantum Fourier transformation (QFT), NISQ devices

Procedia PDF Downloads 46
118 A Mega-Analysis of the Predictive Power of Initial Contact within Minimal Social Networks

Authors: Cathal Ffrench, Ryan Barrett, Mike Quayle

Abstract:

It is accepted in social psychology that categorization leads to ingroup favoritism, without further thought given to the processes that may co-occur with or even precede categorization. These categorizations move away from the conceptualization of the self as a unique social being toward an increasingly collective identity. Subsequently, many individuals derive much of their self-evaluation from these collective identities. The seminal literature on this topic argues that it is primarily categorization that evokes instances of ingroup favoritism. In contrast to these theories, we argue that categorization acts to enhance and further intergroup processes rather than defining them. More precisely, we propose that categorization aids initial ingroup contact and that this first contact is predictive of subsequent favoritism on individual and collective levels. This analysis focuses on studies based on the Virtual Interaction APPLication (VIAPPL), a software interface that addresses the flaws of the original minimal group studies. The VIAPPL allows the exchange of tokens in an intra- and inter-group manner. This token exchange is how we operationalized first contact. The study involves binary longitudinal analysis to better understand the subsequent exchanges of individuals based on who they first interacted with. Studies were selected on the criteria of evidence of explicit first interactions and two-group designs. Our findings paint a compelling picture in support of a motivated contact hypothesis, which suggests that an individual’s first motivated contact toward another has strong predictive capability for future behavior. This contact can lead to habit formation and specific favoritism toward individuals with whom contact has been established. This has important implications for understanding how group conflict occurs and how intra-group individual bias can develop.

Keywords: categorization, group dynamics, initial contact, minimal social networks, momentary contact

Procedia PDF Downloads 123
117 Heuristic Classification of Hydrophone Recordings

Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas

Abstract:

An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-squared values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way both time- and frequency-domain information are contained in the features to be passed to a clustering algorithm. Classification is performed using the k-means algorithm and then a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. Hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', where the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
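
To give a flavor of the feature-extraction-plus-clustering pipeline this abstract describes, the sketch below computes one of the named features (the spectral centroid) and clusters recordings with a minimal k-means, both in plain NumPy. It is a toy illustration, not the paper's implementation: the real system also uses filter-bank RMS values and MFCCs, and a k-nearest-neighbors search.

```python
import numpy as np

def spectral_centroid(frame, sr):
    """Magnitude-weighted mean frequency (Hz) of one audio frame."""
    mag = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    return float(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: returns one cluster label per row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each point to the nearest center, then recompute centers
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

Low-frequency tones (e.g., engine noise as 'anthrophony') and high-frequency tones then fall into separate clusters purely from the centroid feature.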

Keywords: anthrophony, hydrophone, k-means, machine learning

Procedia PDF Downloads 134
116 Geographical Data Visualization Using Video Games Technologies

Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava

Abstract:

In this paper, we present the advances corresponding to the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the National Institute of Statistics and Geography of Mexico (INEGI). We select a place of interest to visualize from the Landsat platform and apply some processing to the image (rotation, atmospheric correction, and enhancement). The resulting image is our gray-scale color map to fuse with the LIDAR data, which was selected using the same coordinates as the Landsat image. The LIDAR data is translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They will download the software and images corresponding to a geological place of interest to a smartphone and can virtually visit and explore the site with a virtual reality visor such as Google Cardboard.

Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material

Procedia PDF Downloads 218
115 Screening for Diabetes in Patients with Chronic Pancreatitis: The Belfast Trust Experience

Authors: Riyas Peringattuthodiyil, Mark Taylor, Ian Wallace, Ailish Nugent, Mike Mitchell, Judith Thompson, Allison McKee, Philip C. Johnston

Abstract:

Aim of Study: The purpose of the study was to screen for diabetes through HbA1c in patients with chronic pancreatitis (CP) within the Belfast Trust. Background: Patients with chronic pancreatitis are at risk of developing diabetes; earlier diagnosis with subsequent multi-disciplinary input has the potential to improve clinical outcomes. Methods: Clinical and laboratory data of patients with chronic pancreatitis were obtained through the Northern Ireland Electronic Care Record (NIECR) and specialist hepatobiliary and gastrointestinal clinics. Patients were invited to have a blood test for HbA1c. Newly diagnosed patients with diabetes were then invited to attend a dedicated Belfast City Hospital (BCH) specialist chronic pancreatitis and diabetes clinic for follow-up. Results: A total of 89 chronic pancreatitis patients were identified: 54 male, 35 female; mean age 52 years, range 12-90 years. Aetiology of CP included alcohol 52/89 (58%), gallstones 18/89 (20%), and idiopathic 10/89 (11%); 2 were genetic, 1 post-ERCP, 1 IgG autoimmune, 1 medication-induced, 1 lipoprotein lipase deficiency, 1 mumps, 1 IVDU, and 1 pancreas divisum. No patients had pancreatic carcinoma. Mean duration of CP was nine years, range 3-30 years. 15/89 (16%) of patients had undergone previous pancreatic surgery/resections. Recent mean BMI was 25.1, range 14-40 kg/m². 62/89 (70%) patients had HbA1c performed. Mean HbA1c was 42 mmol/mol, range 27-97 mmol/mol; 42/62 (68%) had normal HbA1c (< 42 mmol/mol), 13/62 (21%) had pre-diabetes (42-47 mmol/mol), and 7/62 (11%) had diabetes (≥ 48 mmol/mol). Conclusions: Of those who participated in the screening program, around one-third of patients with CP had glycaemic control in the pre-diabetic or diabetic range. Potential opportunities for improving screening rates for diabetes in this cohort could include regular yearly testing at gastrointestinal and hepatobiliary clinics.
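
The HbA1c cut-offs reported in the abstract can be expressed as a simple bucketing rule; the helper below is a hypothetical illustration of those thresholds, not part of the study's methodology.

```python
def classify_hba1c(hba1c):
    """Bucket an HbA1c result (mmol/mol) using the study's cut-offs:
    below 42 normal, 42-47 pre-diabetes, 48 and above diabetes."""
    if hba1c < 42:
        return "normal"
    if hba1c < 48:
        return "pre-diabetes"
    return "diabetes"
```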

Keywords: pancreatogenic diabetes, screening, chronic pancreatitis, trust experience

Procedia PDF Downloads 130
114 Wear and Mechanical Properties of Nodular Iron Modified with Copper

Authors: J. Ramos, V. Gil, A. F. Torres

Abstract:

Nodular iron is a material that has shown great advantages with respect to other materials (steel and gray iron) in the production of machine elements. The engineering industry, especially the automotive sector, is a potential user of this material. As is known, alloying elements modify the properties of steels and castings. Copper has been investigated as a structural modifier of nodular iron, but studies of its mechanical and tribological implications still need to be addressed for industrial use. With the aim of improving the mechanical properties of nodular iron, alloying elements (Mn, Si, and Cu) are added in order to increase its pearlite (or ferrite) content according to the percentage of the alloying element. In this research, nodular iron with three different percentages of copper (residual, 0.5%, and 1.2%) was obtained using an induction furnace process. Chemical analysis was performed by optical emission spectrometry, and microstructures were characterized by optical microscopy (ASTM E3) and scanning electron microscopy (SEM). Mechanical behavior was studied in a mechanical test machine (ASTM E8), and a pin-on-disk tribometer (ASTM G99) was used to assess wear resistance. It is observed that copper increases the pearlite content, improving wear and tensile behavior. This improvement is observed in a higher proportion at 0.5%, since too great an increase in pearlite leads to a loss of ductility.

Keywords: copper, mechanical properties, nodular iron, pearlite structure, wear

Procedia PDF Downloads 360
113 Image Segmentation Techniques: Review

Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo

Abstract:

Image segmentation is the process of dividing an image into several sections, such as the object's background and foreground. It is a critical technique in both image-processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research and few algorithms have been developed for color images. Most image segmentation algorithms or techniques vary based on the input data and the application. Nearly all of the techniques are not suitable for noisy environments. Much of the work that has been done uses the Markov Random Field (MRF), which involves heavy computation and is said to be robust to noise. In recent years, image segmentation has been applied to problems such as easy processing of an image, interpretation of the contents of an image, and easy analysis of an image. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed in past years. The techniques include convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, thresholding techniques, and so on. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications of and potential future developments around image segmentation. This review article concludes with the fact that no technique is perfectly suitable for the segmentation of all different types of images, but the use of hybrid techniques yields more accurate and efficient results.
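
As a concrete instance of the thresholding family the review covers, the sketch below implements Otsu's classic method in plain NumPy: it picks the gray level that maximizes the between-class variance of the two resulting pixel classes. This is a textbook illustration, not code from any of the surveyed works.

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the gray level (0-255) that maximizes the
    between-class variance of foreground versus background pixels."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    total_mean = float(np.dot(np.arange(256), prob))
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 = prob[:t + 1].sum()          # class probabilities
        w1 = 1.0 - w0
        if w0 == 0.0 or w1 <= 0.0:
            continue
        mu0 = np.dot(np.arange(t + 1), prob[:t + 1]) / w0  # class means
        mu1 = (total_mean - w0 * mu0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Pixels above the returned threshold are labeled foreground, the rest background; hybrid pipelines typically refine such an initial split with region or edge information.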

Keywords: clustering-based, convolution-network, edge-based, region-growing

Procedia PDF Downloads 55
112 Secure Message Transmission Using Meaningful Shares

Authors: Ajish Sreedharan

Abstract:

Visual cryptography encodes a secret image into shares of random binary patterns. If the shares are printed onto transparencies, the secret image can be visually decoded by superimposing a qualified subset of transparencies, but no secret information can be obtained from the superposition of a forbidden subset. The binary patterns of the shares, however, have no visual meaning and hinder the objectives of visual cryptography. In secret message transmission through meaningful shares, a secret message to be transmitted is converted to a gray-scale image. Then (2,2) visual cryptographic shares are generated from this converted gray-scale image. The shares are encrypted using a chaos-based image encryption algorithm using the wavelet transform. Two separate color images of the same size as the shares are taken as cover images to hide the respective shares in them. The encrypted shares are thus covered by meaningful images, so that a potential eavesdropper won't know there is a message to be read. The meaningful shares are transmitted through two different transmission media. During decoding, the shares are fetched from the received meaningful images and decrypted using the same chaos-based image encryption algorithm. The shares are combined to regenerate the gray-scale image from which the secret message is obtained.

Keywords: visual cryptography, wavelet transform, meaningful shares, grey scale image

Procedia PDF Downloads 424
111 Earth Observations and Hydrodynamic Modeling to Monitor and Simulate the Oil Pollution in the Gulf of Suez, Red Sea, Egypt

Authors: Islam Abou El-Magd, Elham Ali, Moahmed Zakzouk, Nesreen Khairy, Naglaa Zanaty

Abstract:

The marine environment and coastal zone are rich in natural resources that contribute to the local economy of Egypt. The Gulf of Suez and Red Sea area accommodates diverse human activities that contribute to the local economy, including oil exploration and production, touristic activities, export and import harbors, etc.; however, it is always under the threat of pollution due to human interaction and activities. This research aimed at integrating in-situ measurements and remotely sensed data with a hydrodynamic model to map and simulate the oil pollution. High-resolution satellite sensors including Sentinel 2 and Planet Labs were used to trace the oil pollution. The spectral band ratio of band 4 (infrared) over band 3 (red) underpinned the mapping of the point-source pollution from the oil industrial estates. This ratio is supported by the absorption windows detected in the hyperspectral profiles. An ASD in-situ hyperspectral device was used to measure the oil pollution in the marine environment experimentally. The experiment measured water behavior in three cases: a) clear water without oil, b) water covered with raw oil, and c) water a while after the raw oil was introduced. The spectral curve clearly identifies absorption windows for oil pollution, particularly at 600-700 nm. The MIKE 21 model was applied to simulate the dispersion of the oil contamination and create scenarios for crisis management. The model requires precise preparation of the bathymetry, tide, wave, and atmospheric data, which were partially obtained from online modeled data and otherwise from historical in-situ stations. The simulation enabled projecting the movement of the oil spill and could support a warning system for mitigation. Details of the research results will be described in the paper.
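
The band-ratio mapping step reduces to elementwise arithmetic on two co-registered bands. The sketch below is a hedged illustration only: the comparison direction and threshold value are assumptions, and in practice both would be calibrated against the in-situ hyperspectral measurements (oil absorbs strongly around 600-700 nm).

```python
import numpy as np

def band_ratio_mask(nir, red, threshold=1.0):
    """Flag candidate oil pixels by the infrared/red band ratio
    described in the abstract (band 4 over band 3).

    nir and red are co-registered 2D reflectance arrays; the small
    clip avoids division by zero over dark pixels.
    """
    ratio = nir.astype(float) / np.clip(red.astype(float), 1e-6, None)
    return ratio < threshold
```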

Keywords: oil pollution, remote sensing, modelling, Red Sea, Egypt

Procedia PDF Downloads 322
110 Utilization of Two Kinds of Recycled Greywater in Irrigation of Syngonium sp. Plants Grown under Different Water Regimes

Authors: Sami Ali Metwally, Bedour Helmy Abou-Leila, Hussien I. Abdel-Shafy

Abstract:

The work was carried out at the greenhouse of the National Research Centre. A pot experiment was conducted during the 2020 and 2021 seasons to study the effect of two types of water (two recycled greywater treatments: SBR (Sequencing Batch Reactor) and MBR (Membrane Bioreactor)) and three watering intervals (15, 20, and 25 days) on the growth of Syngonium plants. Examination of the data showed that MBR recorded increases in vegetative growth parameters, osmotic pressure, transpiration rate, chlorophyll a and b, carotenoids, and carbohydrates compared with SBR. As for watering intervals, the highest values of most growth parameters were obtained from plants irrigated every 20 days compared with the other treatments. The 15-day irrigation interval recorded significant increases in osmotic pressure, transpiration rate, and photosynthetic pigments, while carbohydrate values decreased. Regarding the interaction between water type and watering interval, SBR recorded the highest values of most growth parameters with irrigation every 20 days, while MBR with irrigation every 25 days showed the highest values of leaf area and leaf fresh weight compared with the other treatments.

Keywords: grey water, water intervals, Syngonium plant, recycling water, vegetative growth

Procedia PDF Downloads 80
109 One Year after the Entry into Force of the Treaty on the Prohibition of Nuclear Weapons (TPNW): Reviewing Legal Impact and Implementation

Authors: Cristina Siserman-Gray

Abstract:

The Treaty on the Prohibition of Nuclear Weapons (TPNW) will mark in January 2022 one year since its entry into force. The TPNW provides that within one year of entry into force, the 86 countries that have signed it so far will convene to discuss and take decisions on the treaty’s implementation at the first meeting of states parties. Austria has formally offered to host the meeting in Vienna in the spring of 2022. At this first meeting, the states parties will need to work, among other things, on the interpretation of some of the provisions of the treaty and on disarmament timelines under Article 4, and to address the universalization of the treaty. The main objective of this paper is to explore the legal implications of the TPNW for states parties and to discuss how these will impact non-states parties, particularly the United States. In the first part, the article addresses the legal requirements that states parties to this treaty must adhere to, illustrating some of the progress made by these states regarding the implementation of the TPNW. In the second part, the paper addresses the challenges and opportunities for universalizing the treaty and focuses on the response of nuclear weapon states, particularly the current US administration. Since it has become clear that the TPNW has become a new and important element of the nonproliferation and disarmament architecture, the article provides a number of suggestions regarding ways the US administration could positively contribute to the international discourse on the TPNW.

Keywords: disarmament, arms control and nonproliferation, legal regime, TPNW

Procedia PDF Downloads 136
108 Image Multi-Feature Analysis by Principal Component Analysis for Visual Surface Roughness Measurement

Authors: Wei Zhang, Yan He, Yan Wang, Yufeng Li, Chuanpeng Hao

Abstract:

Surface roughness is an important index for evaluating surface quality and needs to be accurately measured to ensure the performance of the workpiece. Roughness measurement based on machine vision involves various image features, some of which are redundant. These redundant features affect the accuracy and speed of the visual approach. Previous research used correlation analysis methods to select the appropriate features. However, such feature analysis treats features independently and cannot fully utilize the information in the data. Besides, blindly reducing features loses a lot of useful information, resulting in unreliable results. Therefore, the focus of this paper is on providing a redundant-feature removal approach for visual roughness measurement. In this paper, statistical methods and the gray-level co-occurrence matrix (GLCM) are employed to effectively extract the texture features of machined images. Then, principal component analysis (PCA) is used to fuse all extracted features into a new one, which reduces the feature dimension and maintains the integrity of the original information. Finally, the relationship between the new features and roughness is established by a support vector machine (SVM). The experimental results show that the approach can effectively resolve the multi-feature information redundancy of machined surface images and provides a new idea for the visual evaluation of surface roughness.
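
The PCA fusion step can be sketched in a few lines of NumPy: standardize the feature matrix, then project it onto the top principal component(s). This is a generic illustration of the technique, not the paper's code; in the described pipeline, the fused feature would then be fed to an SVM regressor to predict roughness.

```python
import numpy as np

def pca_fuse(features, n_components=1):
    """Standardize feature vectors (rows = samples, columns = features),
    then project them onto the top principal component(s), fusing
    redundant features into a low-dimensional index."""
    X = (features - features.mean(axis=0)) / (features.std(axis=0) + 1e-12)
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)              # ascending eigenvalues
    top = vecs[:, np.argsort(vals)[::-1][:n_components]]
    return X @ top
```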

Keywords: feature analysis, machine vision, PCA, surface roughness, SVM

Procedia PDF Downloads 184
107 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework

Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim

Abstract:

Background modeling and subtraction in video analysis has been widely proven to be an effective method for moving object detection in many computer vision applications. Over the past years, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are two of the most frequently occurring issues in practical situations. This paper presents a new two-layer model based on the codebook algorithm incorporating a local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is designed as a block-based codebook combining an LBP histogram with the mean values of the RGB color channels. Because of the invariance of the LBP features with respect to monotonic gray-scale changes, this layer can produce block-wise detection results with considerable tolerance of illumination variations. A pixel-based codebook is employed to reinforce the precision of the outputs of the first layer and to further eliminate false positives. As a result, the proposed approach can greatly improve accuracy under circumstances of dynamic backgrounds and illumination changes. Experimental results on several popular background subtraction datasets demonstrate a very competitive performance compared to previous models.
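
The LBP invariance the first layer relies on is easy to demonstrate. The sketch below computes a basic 8-neighbour LBP code in NumPy (a generic textbook variant, not the paper's exact configuration): because the code depends only on whether each neighbour is at least as bright as the centre, any monotonic gray-scale change (such as a global illumination shift) leaves it unchanged.

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour local binary pattern code for interior pixels.

    Each pixel gets an 8-bit code, one bit per neighbour that is at
    least as bright as the centre pixel."""
    c = img[1:-1, 1:-1]
    neighbours = (img[:-2, :-2], img[:-2, 1:-1], img[:-2, 2:],
                  img[1:-1, 2:], img[2:, 2:], img[2:, 1:-1],
                  img[2:, :-2], img[1:-1, :-2])
    code = np.zeros(c.shape, dtype=int)
    for bit, n in enumerate(neighbours):
        code |= (n >= c).astype(int) << bit
    return code
```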

Keywords: background subtraction, codebook model, local binary pattern, dynamic background, illumination change

Procedia PDF Downloads 192
106 Optimization of Spatial Light Modulator to Generate Aberration Free Optical Traps

Authors: Deepak K. Gupta, T. R. Ravindran

Abstract:

Holographic optical tweezers (HOTs) in general use iterative algorithms such as weighted Gerchberg-Saxton (WGS) to generate multiple traps, which theoretically produce traps with 99% uniformity. But in experiments, it is the phase response of the spatial light modulator (SLM) that ultimately determines the efficiency, uniformity, and quality of the trap spots. In general, SLMs show nonlinear phase response behavior, and they may even have asymmetric phase modulation depth before and after π. This affects the resolution with which the gray levels are addressed before and after π, leading to degraded trap performance. We present a method to optimize the SLM for a linear phase response along with a symmetric phase modulation depth around π. Further, we optimize the SLM for its varying phase response over different spatial regions by optimizing the brightness/contrast and gamma of the hologram in different subsections. We show the effect of the optimization on an array of trap spots, resulting in improved efficiency and uniformity. We also calculate the spot sharpness metric and trap performance metric and show a tightly focused spot with reduced aberration. The trap performance is compared by calculating the trap stiffness of a trapped particle in a given trap spot before and after aberration correction. The trap stiffness is found to improve by 200% after the optimization.
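
A nonlinear SLM phase response is commonly compensated with a calibration lookup table. The sketch below illustrates that general idea only, under the assumption that the phase produced at each gray level has been measured; the paper's method additionally corrects brightness/contrast and gamma per spatial subsection, which is not modelled here.

```python
import numpy as np

def linearizing_lut(measured_phase, depth=2.0 * np.pi):
    """Build a gray-level lookup table that linearizes a measured,
    nonlinear SLM phase response.

    measured_phase[g] is the phase the SLM actually produces at gray
    level g; the LUT maps each desired linear phase step to the gray
    level whose measured phase is closest to it."""
    levels = len(measured_phase)
    target = np.linspace(0.0, depth, levels, endpoint=False)
    return np.argmin(np.abs(measured_phase[None, :] - target[:, None]), axis=1)
```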

Keywords: spatial light modulator, optical trapping, aberration, phase modulation

Procedia PDF Downloads 145
105 Multi-Scale Green Infrastructure: An Integrated Literature Review

Authors: Panpan Feng

Abstract:

The concept of green infrastructure originated in Europe and the United States. It aims to ensure smart growth of urban and rural ecosystems and to achieve sustainable urban and rural ecological, social, and economic development by combining green infrastructure with the gray infrastructure of traditional planning. Based on a literature review of the theoretical origins, value connotations, and measurement methods of green infrastructure, this study summarizes the research content of green infrastructure at the three spatial levels of region, city, and block and divides it into a functional dimension, a spatial dimension, and a strategic dimension. The results show that, in the functional dimension, research on green infrastructure gradually shifts from ecological function to social function as the scale moves from region to city to block. In the spatial dimension, research on the spatial form of green infrastructure has shifted from two-dimensional to three-dimensional, and the spatial structure of green infrastructure has shifted from single ecological elements to multiple composite elements. From a strategic perspective, green infrastructure research serves mainly as a spatial planning tool based on land management, environmental livability, and ecological psychology, providing decision-making support.

Keywords: green infrastructure, multi-scale, social and ecological functions, spatial strategic decision-making tools

Procedia PDF Downloads 22
104 Numerical Analysis of a Pilot Solar Chimney Power Plant

Authors: Ehsan Gholamalizadeh, Jae Dong Chung

Abstract:

The solar chimney power plant is a feasible solar thermal system that produces electricity from the Sun. The objective of this study is to investigate buoyancy-driven flow and heat transfer through a built pilot solar chimney system called the 'Kerman Project'. The system has a chimney with a height of 60 m and a diameter of 3 m, the average radius of its solar collector is about 20 m, and its average collector height is about 2 m. A three-dimensional simulation was conducted to analyze the system using computational fluid dynamics (CFD). In this model, the radiative transfer equation was solved using the discrete ordinates (DO) radiation model, taking into account non-gray radiation behavior. In order to model solar irradiation from the sun’s rays, a solar ray tracing algorithm was coupled to the computation via a source term in the energy equation. The model was validated by comparison with the experimental data of the Manzanares prototype and with the performance of the built pilot system. Then, based on the numerical simulations, the velocity and temperature distributions through the system, the temperature profile of the ground surface, and the system performance are presented. The analysis accurately shows the flow and heat transfer characteristics through the pilot system and predicts its performance.

Keywords: buoyancy-driven flow, computational fluid dynamics, heat transfer, renewable energy, solar chimney power plant

Procedia PDF Downloads 228
103 Predicting Radioactive Waste Glass Viscosity, Density and Dissolution with Machine Learning

Authors: Joseph Lillington, Tom Gout, Mike Harrison, Ian Farnan

Abstract:

The vitrification of high-level nuclear waste within borosilicate glass and its incorporation within a multi-barrier repository deep underground is widely accepted as the preferred disposal method. However, for this to happen, any safety case will require validation that the initially localized radionuclides will not be considerably released into the near/far-field. Therefore, accurate mechanistic models are necessary to predict glass dissolution, and these should be robust to a variety of incorporated waste species and leaching test conditions, particularly given substantial variations across international waste-streams. Here, machine learning is used to predict glass material properties (viscosity, density) and glass leaching model parameters from large-scale industrial data. A variety of different machine learning algorithms have been compared to assess performance. Density was predicted solely from composition, whereas viscosity additionally considered temperature. To predict suitable glass leaching model parameters, a large simulated dataset was created by coupling MATLAB and the chemical reactive-transport code HYTEC, considering the state-of-the-art GRAAL model (glass reactivity in allowance of the alteration layer). The trained models were then applied to the large-scale industrial, experimental data to identify potentially appropriate model parameters. Results indicate that ensemble methods can accurately predict viscosity as a function of temperature and composition across all three industrial datasets. Glass density prediction shows reliable learning performance, with predictions primarily being within the experimental uncertainty of the test data. Furthermore, machine learning can predict glass dissolution model parameter behavior, demonstrating potential value in GRAAL model development and in assessing suitable model parameters for large-scale industrial glass dissolution data.

Keywords: machine learning, predictive modelling, pattern recognition, radioactive waste glass

Procedia PDF Downloads 89
102 Filmic and Verbal Metaphors

Authors: Manana Rusieshvili, Rusudan Dolidze

Abstract:

This paper aims at 1) investigating the ways in which a traditional, monomodal written verbal metaphor can be transposed as a monomodal non-verbal (visual) or multimodal (aural-visual) filmic metaphor; and 2) exploring similarities and differences in the processes of encoding and decoding monomodal and multimodal metaphors. The empirical data on which the research is based embrace three sources: the novel by Harry Gray, ‘The Hoods’; the script of the film ‘Once Upon a Time in America’ (English version by David Mills); and the resultant film by Sergio Leone. In order to achieve the above-mentioned goals, the research focuses on the following issues: 1) identification of the verbal and non-verbal monomodal and multimodal metaphors in the above-mentioned sources; 2) investigation of the ways and modes in which specific written monomodal metaphors appearing in the novel and the script are enacted in the film and become visual, aural or visual-aural filmic metaphors; and 3) study of the factors which play an important role in the encoding and decoding of the filmic metaphor. The collection and analysis of the data were carried out in two stages. Firstly, the relevant data, i.e. the monomodal metaphors from the novel, the script and the film, were identified and collected. In the second, final stage, the metaphors taken from all three sources were analysed and compared, and two types of phenomena were selected for discussion: (1) monomodal written metaphors found in the novel and/or in the script which become monomodal visual or aural metaphors in the film; (2) monomodal written metaphors found in the novel and/or in the script which become multimodal, filmic (visual-aural) metaphors in the film.

Keywords: encoding, decoding, filmic metaphor, multimodality

Procedia PDF Downloads 485
101 Segmentation of the Liver and Spleen From Abdominal CT Images Using Watershed Approach

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

The phase of segmentation is an important step in the processing and interpretation of medical images. In this paper, we focus on the segmentation of the liver and spleen from abdominal computed tomography (CT) images. The importance of our study comes from the fact that segmenting regions of interest (ROIs) from CT images is usually a difficult task: their gray levels are similar to those of neighboring organs, and the ROIs are connected to the ribs, heart, kidneys, etc. Our proposed method is based on anatomical information and the mathematical morphology tools used in the image processing field. At first, we remove the surrounding and connected organs and tissues by applying morphological filters; this first step makes the extraction of the regions of interest easier. The second step consists of improving the quality of the image gradient. In this step, we propose a method for reducing gradient deficiencies by applying spatial filters followed by morphological filters. Thereafter, we proceed to the watershed segmentation of the liver and spleen. To validate the proposed segmentation technique, we have tested it on several images, evaluating our approach by comparing our results with the manual segmentation performed by an expert. The experimental results are described in the last part of this work. The system has been evaluated by computing the sensitivity and specificity between the semi-automatically segmented contours (liver and spleen) and the contours manually traced by radiological experts.
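The morphological pre-filtering step described above can be illustrated with a minimal binary-morphology sketch in plain Python (a toy stand-in for the actual filters used in the paper): an opening, i.e. erosion followed by dilation, removes small isolated structures and thin connections before segmentation:

```python
def erode(img):
    """Binary erosion with a 3x3 cross structuring element.
    A pixel survives only if it and its 4-neighbours are all set."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = min(img[i][j], img[i - 1][j], img[i + 1][j],
                            img[i][j - 1], img[i][j + 1])
    return out

def dilate(img):
    """Binary dilation with the same structuring element.
    A pixel is set if it or any 4-neighbour is set."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            neigh = [img[i][j]]
            if i > 0: neigh.append(img[i - 1][j])
            if i < h - 1: neigh.append(img[i + 1][j])
            if j > 0: neigh.append(img[i][j - 1])
            if j < w - 1: neigh.append(img[i][j + 1])
            out[i][j] = max(neigh)
    return out

def opening(img):
    """Opening = erosion then dilation; specks smaller than the
    structuring element vanish, larger regions are roughly preserved."""
    return dilate(erode(img))
```

On a CT-like binary mask, this is how small connected tissues can be stripped away while the bulk of an organ region survives for the subsequent gradient and watershed steps.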

Keywords: CT images, liver and spleen segmentation, anisotropic diffusion filter, morphological filters, watershed algorithm

Procedia PDF Downloads 463
100 A Combination of Anisotropic Diffusion and Sobel Operator to Enhance the Performance of the Morphological Component Analysis for Automatic Crack Detection

Authors: Ankur Dixit, Hiroaki Wagatsuma

Abstract:

Crack detection on concrete bridges is an important and recurring task in civil engineering. Conventionally, humans inspect the bridge for cracks to maintain its quality and reliability, but this process is long and costly. To overcome these limitations, we used a drone with a digital camera to take images of the bridge deck, which were then processed by morphological component analysis (MCA). MCA is a powerful application of sparse coding that exploits the separability of images. In this paper, MCA has been used to decompose the image into coarse and fine components using two dictionaries, namely anisotropic diffusion and the wavelet transform. Anisotropic diffusion is an adaptive smoothing process that adjusts the diffusion coefficient using gray level and gradient as features. Cracks are enhanced by subtracting the diffused coarse image from the original image, and the result is treated with a Sobel edge detector and binary filtering to exhibit the cracks clearly. Our results demonstrated that the proposed MCA framework, using anisotropic diffusion followed by the Sobel operator and binary filtering, may contribute to the automation of crack detection even in severe open-field conditions such as bridge decks.
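A minimal sketch of the enhancement pipeline described above (parameter values are assumed for illustration; the paper's actual dictionaries and thresholds are not reproduced here): Perona-Malik-style anisotropic diffusion yields the coarse image, the residual against the original is edge-detected with the Sobel operator, and the result is binarised:

```python
import math

def diffuse(img, n_iter=20, kappa=30.0, lam=0.2):
    """Perona-Malik-style anisotropic diffusion: the flux between a pixel and
    each 4-neighbour is damped by exp(-(g/kappa)^2), so flat regions are
    smoothed while strong edges (cracks) are preserved."""
    h, w = len(img), len(img[0])
    img = [row[:] for row in img]
    for _ in range(n_iter):
        nxt = [row[:] for row in img]
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                flux = 0.0
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    g = img[i + di][j + dj] - img[i][j]
                    flux += math.exp(-(g / kappa) ** 2) * g
                nxt[i][j] = img[i][j] + lam * flux
        img = nxt
    return img

def sobel_magnitude(img):
    """Gradient magnitude from the two 3x3 Sobel kernels."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = (img[i-1][j+1] + 2*img[i][j+1] + img[i+1][j+1]
                  - img[i-1][j-1] - 2*img[i][j-1] - img[i+1][j-1])
            gy = (img[i+1][j-1] + 2*img[i+1][j] + img[i+1][j+1]
                  - img[i-1][j-1] - 2*img[i-1][j] - img[i-1][j+1])
            out[i][j] = (gx * gx + gy * gy) ** 0.5
    return out

def enhance_cracks(img, threshold=50.0):
    """Subtract the diffused (coarse) image from the original, edge-detect
    the residual with Sobel, and binarise."""
    coarse = diffuse(img)
    residual = [[o - c for o, c in zip(ro, rc)] for ro, rc in zip(img, coarse)]
    mag = sobel_magnitude(residual)
    return [[1 if v > threshold else 0 for v in row] for row in mag]
```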

Keywords: anisotropic diffusion, coarse component, fine component, MCA, Sobel edge detector and wavelet transform

Procedia PDF Downloads 151
99 Implementation of A Treatment Escalation Plan During The Covid 19 Outbreak in Aneurin Bevan University Health Board

Authors: Peter Collett, Mike Pynn, Haseeb Ur Rahman

Abstract:

For the last few years, there has been a push across the UK to implement treatment escalation plans (TEPs) for every patient admitted to hospital. A TEP is a paper form completed by a junior doctor and countersigned by the consultant responsible for the patient's care. It is designed to address what level of care is appropriate for the patient at the point of entry to hospital, helping to decide whether the patient would benefit from ward-based, high-dependency or intensive care. TEPs are completed to ensure the patient's best interests are maintained and aim to facilitate difficult decisions which may be required at a later date. For example, if a frail patient with significant co-morbidities, unlikely to survive a pathology requiring intensive care, is admitted to hospital, the decision can be made early that the patient would not benefit from an ICU admission; this decision can be reversed depending on the clinical course of the admission. The form also promotes discussions with patients regarding their wishes to receive certain levels of healthcare. This poster describes the steps taken in the Aneurin Bevan University Health Board (ABUHB) when implementing the TEP form. The implementing team campaigned for its use to the board of directors, who were eager to hear of the experiences of other health boards that had already implemented TEP forms. The team presented data produced by a number of health boards and demonstrated the proposed form. Concern was raised regarding the legalities of the form and the possibility that it could upset patients and relatives if not explained properly; this delayed the introduction of the TEP form pending further research and discussion. When COVID-19 reached the UK, the National Institute for Health and Care Excellence issued guidance stating that every patient admitted to hospital should be issued a TEP form.
The TEP form was accelerated through the vetting process and approved with immediate effect. The TEP form in ABUHB has now been in circulation for a month, and an audit investigating its uptake and a survey gathering opinions have been conducted.

Keywords: acute medicine, clinical governance, intensive care, patient centered decision making

Procedia PDF Downloads 147
98 Electroencephalography (EEG) Analysis of Alcoholic and Control Subjects Using Multiscale Permutation Entropy

Authors: Lal Hussain, Wajid Aziz, Sajjad Ahmed Nadeem, Saeed Arif Shah, Abdul Majid

Abstract:

Brain electrical activity, as reflected in the electroencephalogram (EEG), has been analyzed and used for diagnosis with various techniques. Among them, measures of complexity, nonlinearity, disorder, and unpredictability play a vital role, owing to the nonlinear interconnections between functional and anatomical subsystems that emerge in the brain in the healthy state and during various diseases. Alcohol abuse has many social and economic consequences, including weakened memory, impaired decision-making, and poor concentration. Alcoholism not only harms brain function but is also associated with emotional, behavioral, and cognitive impairments, damaging the white and gray brain matter. A recently developed signal analysis method, Multiscale Permutation Entropy (MPE), is proposed to estimate the complexity of long-range temporal correlations in EEG time series of alcoholic and control subjects acquired from the UCI Machine Learning Repository, and the results are compared with Multiscale Sample Entropy (MSE). Using MPE, a coarse-grained series is first generated, and the PE is computed for each coarse-grained time series for the electrodes O1, O2, C3, C4, F2, F3, F4, F7, F8, Fp1, Fp2, P3, P4, T7, and T8. For each electrode, MPE gives higher significance values than MSE, as well as larger mean rank differences. Likewise, the ROC and the area under the ROC give better separation for each electrode using MPE in comparison to MSE.
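The MPE procedure described above, coarse-graining followed by permutation entropy at each scale, can be sketched in a few lines of plain Python (the parameter choices here are illustrative, not those of the study):

```python
import math

def coarse_grain(series, scale):
    """Average consecutive non-overlapping windows (the multiscale step)."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def permutation_entropy(series, order=3):
    """Shannon entropy of ordinal-pattern frequencies, normalised to [0, 1]
    by log(order!)."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))

def mpe(series, max_scale=5, order=3):
    """Permutation entropy of each coarse-grained series, one value per scale."""
    return [permutation_entropy(coarse_grain(series, s), order)
            for s in range(1, max_scale + 1)]
```

A perfectly monotone series produces a single ordinal pattern and hence zero entropy, while an irregular EEG-like series approaches the maximum of 1; the per-electrode comparison in the study applies exactly this kind of scale-by-scale profile.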

Keywords: electroencephalogram (EEG), multiscale permutation entropy (MPE), multiscale sample entropy (MSE), permutation entropy (PE), mann whitney test (MMT), receiver operator curve (ROC), complexity measure

Procedia PDF Downloads 462
97 Simulation of the FDA Centrifugal Blood Pump Using High Performance Computing

Authors: Mehdi Behbahani, Sebastian Rible, Charles Moulinec, Yvan Fournier, Mike Nicolai, Paolo Crosetto

Abstract:

Computational Fluid Dynamics blood-flow simulations are increasingly used to develop and validate blood-contacting medical devices. This study shows that numerical simulations can provide additional and accurate estimates of relevant hemodynamic indicators (e.g., recirculation zones or wall shear stresses), which may be difficult and expensive to obtain from in-vivo or in-vitro experiments. The most recent FDA (Food and Drug Administration) benchmark consisted of a simplified centrifugal blood pump model that contains fluid flow features as they are commonly found in these devices, with a clear focus on highly turbulent phenomena. The FDA centrifugal blood pump study is composed of six test cases with different volumetric flow rates, ranging from 2.5 to 7.0 liters per minute, different pump speeds, and Reynolds numbers ranging from 210,000 to 293,000. Within the frame of this study, different turbulence models were tested, including RANS models (e.g., k-omega, k-epsilon and a Reynolds stress model (RSM)) and LES. The partitioners Hilbert, METIS, ParMETIS and SCOTCH were used to create an unstructured mesh of 76 million elements and were compared in their efficiency. Computations were performed on the JUQUEEN BG/Q architecture using the highly parallel flow solver Code_Saturne, typically on 32,768 or more processors in parallel. Visualisations were produced with ParaView. All six flow situations could be successfully analysed with the different turbulence models and validated against analytical considerations and by comparison with other databases. The results showed that an RSM represents an appropriate choice for modeling high-Reynolds-number flow cases; in particular, the Rij-SSG (Speziale, Sarkar, Gatski) variant turned out to be a good approach. Visualisations of complex flow features could be obtained and the flow situation inside the pump characterized.
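As a back-of-the-envelope illustration of the Reynolds-number range quoted above, a rotational Reynolds number can be computed as Re = ρωD²/μ. This is one common convention for rotating machinery; the benchmark's exact definition, the impeller geometry, and the fluid properties used below are assumptions for the sketch, not values taken from the paper:

```python
import math

def rotational_reynolds(rho, mu, rpm, diameter):
    """Rotational Reynolds number Re = rho * omega * D^2 / mu, one common
    convention for rotating machinery (other definitions differ by
    constant factors)."""
    omega = rpm * 2.0 * math.pi / 60.0   # shaft speed in rad/s
    return rho * omega * diameter ** 2 / mu

# Assumed illustrative values: blood density ~1056 kg/m^3, dynamic
# viscosity ~3.5 mPa*s, and a 52 mm impeller diameter.
re_low = rotational_reynolds(1056.0, 0.0035, 2500, 0.052)
```

With these assumed inputs the result lands in the low end of the 10⁵ range, consistent in order of magnitude with the 210,000-293,000 span reported for the benchmark cases.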

Keywords: blood flow, centrifugal blood pump, high performance computing, scalability, turbulence

Procedia PDF Downloads 364
96 Prediction of PM₂.₅ Concentration in Ulaanbaatar with Deep Learning Models

Authors: Suriya

Abstract:

Rapid socio-economic development and urbanization have led to an increasingly serious air pollution problem in Ulaanbaatar (UB), the capital of Mongolia. PM₂.₅ pollution has become the most pressing aspect of UB air pollution. Therefore, monitoring and predicting the PM₂.₅ concentration in UB is of great significance for the health of the local people and for environmental management. To date, very few studies have used models to predict PM₂.₅ concentrations in UB. Using data from 0:00 on June 1, 2018, to 23:00 on April 30, 2020, we propose two deep learning models based on Bayesian-optimized LSTM (Bayes-LSTM) and CNN-LSTM. We utilized hourly observed data, including Himawari-8 (H8) aerosol optical depth (AOD), meteorology, and PM₂.₅ concentration, as input for the prediction of PM₂.₅ concentrations. The correlation strengths between meteorology, AOD, and PM₂.₅ were analyzed using the gray correlation analysis method; the performance improvement obtained by including the AOD input was tested; and the models were evaluated using the mean absolute error (MAE) and root mean square error (RMSE). The prediction accuracies of both the Bayes-LSTM and CNN-LSTM deep learning models improved when AOD was included as an input parameter. The improvement was particularly pronounced for the CNN-LSTM model in the non-heating season; in the heating season, the prediction accuracy of the Bayes-LSTM model improved slightly, while that of the CNN-LSTM model decreased slightly. This work pioneers the use of AOD data from H8 for UB and demonstrates that including AOD input data improves the performance of the two proposed deep learning models.
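The gray correlation (gray relational) analysis used above to rank candidate predictors against PM₂.₅ can be sketched as follows, with the conventional distinguishing coefficient ρ = 0.5; the series values are illustrative, not the study's data:

```python
def gray_relational_grade(reference, candidate, rho=0.5):
    """Gray relational grade between a reference series (e.g. hourly PM2.5)
    and a candidate predictor (e.g. AOD) after min-max normalisation.
    rho is the distinguishing coefficient; 0.5 is the conventional default."""
    def norm(s):
        lo, hi = min(s), max(s)
        return [(v - lo) / (hi - lo) for v in s]  # assumes s is not constant
    r, c = norm(reference), norm(candidate)
    deltas = [abs(a - b) for a, b in zip(r, c)]
    dmin, dmax = min(deltas), max(deltas)
    if dmax == 0:          # candidate tracks the reference exactly
        return 1.0
    coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in deltas]
    return sum(coeffs) / len(coeffs)
```

A grade close to 1 indicates a strongly related predictor; ranking candidates by grade is how inputs such as AOD and individual meteorological variables are compared.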

Keywords: deep learning, AOD, PM2.5, prediction, Ulaanbaatar

Procedia PDF Downloads 18
95 Analysis of Citation Rate and Data Reuse for Openly Accessible Biodiversity Datasets on Global Biodiversity Information Facility

Authors: Nushrat Khan, Mike Thelwall, Kayvan Kousha

Abstract:

Making research data openly accessible has been mandated by most funders over the last 5 years, as it promotes reproducibility in science and reduces duplication of the effort to collect the same data. There is evidence that articles that publicly share research data have higher citation rates in the biological and social sciences. However, how and whether shared data is being reused is not always apparent, as such information is not easily accessible from the majority of research data repositories. This study aims to understand the practice of data citation and how data is reused over the years, focusing on biodiversity, since research data is frequently reused in this field. Metadata for 38,878 datasets, including citation counts, were collected through the Global Biodiversity Information Facility (GBIF) API for this purpose. GBIF was used as a data source since it provides citation counts for datasets, a feature not commonly available in most repositories. Analysis of dataset types, citation counts, and the creation and update times of datasets suggests that the citation rate varies for different types of datasets: occurrence datasets, which contain more granular information, have higher citation rates than checklist and metadata-only datasets. Another finding is that biodiversity datasets on GBIF are frequently updated, which is unique to this field; all of the datasets from the earliest year, 2007, were updated within 11 years of creation. For each year between 2007 and 2017, we compared the correlations between update time and citation rate for four different types of datasets. While recent datasets do not show any correlation, 3- to 4-year-old datasets show a weak correlation, with more recently updated datasets receiving higher citations. The results suggest that it takes several years to accumulate citations for research datasets.
However, this investigation found that, when the same datasets are searched on Google Scholar or Scopus, the number of citations often differs from GBIF's count. Hence, a future aim is to further explore the citation count system adopted by GBIF, to evaluate its reliability and whether it can be applied to other fields of study as well.
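The correlation step described above, relating dataset age or update time to citation counts, is typically done with a rank correlation, which is robust to the skewed distributions citation counts follow. A minimal Spearman implementation (an illustrative sketch, not the authors' code) might look like:

```python
def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with tied values receiving their average rank."""
    def ranks(s):
        order = sorted(range(len(s)), key=lambda i: s[i])
        r = [0.0] * len(s)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and s[order[j + 1]] == s[order[i]]:
                j += 1
            avg = (i + j) / 2.0 + 1  # average rank for the tied run
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Applied per dataset type and per cohort year, a positive coefficient between recency of update and citation count is the "weak correlation" pattern reported for 3- to 4-year-old datasets.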

Keywords: data citation, data reuse, research data sharing, webometrics

Procedia PDF Downloads 147
94 China's New "Pivots" in the Indian Ocean: Towards "String of Pearls" Strategy 2.0

Authors: Mike Chia-Yu Huang

Abstract:

China’s port facility construction projects in the Indian Ocean (IO) region, the Gwadar Port and Djibouti Port projects in particular, have led to a heated debate among both Chinese and Western strategists over whether the country has literally been carrying out its “string of pearls” strategy, an alleged Chinese plan to challenge America’s military predominance in South Asia. Even though the Chinese government has repeatedly denied the existence of such a strategy and highlighted the civilian/commercial nature of its port projects, it has significantly enhanced its strategic cooperation with littoral countries in the IO region since the “One Belt One Road” initiative was introduced by Chinese President Xi Jinping in 2013. Whether China does have a plan to expand its sphere of military influence westward concerns the balance of power in the IO region; if the answer is positive, the security environment there will change drastically. This paper argues that rather than simply copying the U.S. model of developing overseas military bases along the IO periphery, Beijing has been deliberating a more sophisticated plan for its physical presence there: creating a new set of “overseas strategic pivots.” These “pivots,” semi-military and semi-commercial in nature, are designed to help Beijing sustain its anti-piracy operations in the Gulf of Aden and serve as forward stations for the transportation of China’s imported energy and merchandise. They can support the Chinese Navy’s operations overseas but are not intended to undertake face-to-face combat missions. This upgraded Chinese scheme can be identified as “string of pearls” strategy 2.0. Moreover, it is expected to help China deepen its roots in the IO region, implying that Beijing has to a large extent departed from its old diplomatic philosophy, which highlighted the merits of non-interference and nonalignment.
While a full-scale maritime confrontation between China and the U.S.-India security alliance is unlikely in the near future, an ambitious Chinese plan to step into the global maritime domain is clearly evident.

Keywords: Chinese navy, Djibouti, Gwadar, Indian Ocean, string of pearls strategy

Procedia PDF Downloads 307
93 Elaboration and Investigation of the New Ecologically Clean Friction Composite Materials on the Basis of Nanoporous Raw Materials

Authors: Lia Gventsadze, Elguja Kutelia, David Gventsadze

Abstract:

The purpose of this article is to show the possibility of developing a new generation of eco-friendly (asbestos-free) nano-porous friction materials on the basis of Georgian raw materials, to determine the technological parameters for their production, and to optimize the tribological properties and investigate the structural aspects of the wear peculiarities of the elaborated materials using scanning electron microscopy (SEM) and Auger electron spectroscopy (AES). The study investigated the tribological properties of polymer friction materials based on phenol-formaldehyde resin, using a porous diatomite filler modified by silane to improve thermal stability; the composition was further modified with iron phosphate, technical carbon and basalt fibre. As a result of testing, stable friction coefficients (0.3-0.45) were reached under both dry and wet friction conditions; the friction working parameters (friction coefficient and wear resistance) remained stable up to temperatures of 500 °C; the wear resistance of the gray cast-iron disk increased 3-4 times; and soundless operation without squeaking was achieved. It was also shown that a small number of ingredients (5-6) is enough to compose the nano-porous friction materials. The study explains the mechanism of action of the nano-porous-composition-based brake lining materials and their tribological efficiency on the basis of the triple-phase model of the tribo-pair.

Keywords: brake lining, friction coefficient, wear, nanoporous composite, phenolic resin

Procedia PDF Downloads 367
92 Spinach Lipid Extract as an Alternative Flow Aid for Fat Suspensions

Authors: Nizaha Juhaida Mohamad, David Gray, Bettina Wolf

Abstract:

Chocolate is a composite material with a high fraction of solid particles dispersed in a fat phase largely composed of cocoa butter. The viscosity of chocolate can be manipulated through the amount of fat: increased levels of fat lead to lower viscosity. However, a high content of cocoa butter increases the cost of the chocolate, so surfactants are used instead to manipulate the viscosity behaviour. Most commonly, lecithin and polyglycerol polyricinoleate (PGPR) are used: lecithin is a natural lipid emulsifier based on phospholipids, while PGPR is a chemically produced emulsifier based on a long continuous chain of ricinoleic acid. Lecithin and PGPR act to lower the viscosity and the yield stress, respectively. Recently, natural lipid emulsifiers with galactolipids as the functional ingredient have become of interest. Spinach lipid is found to contain a high amount of galactolipid, specifically MGDG and DGDG. The aim of this research is to explore the influence of spinach lipid, in comparison with PGPR and lecithin, on the rheological properties of sugar/oil suspensions, which serve as a chocolate model system. For that purpose, icing sugar was dispersed at 40%, 45% and 50% (w/w) in oil containing spinach lipid at concentrations of 0.1-0.7% (w/w). Based on the viscosity at 40 s⁻¹ and the yield value, reported as the shear stress measured at 5 s⁻¹, spinach lipid was found to show viscosity-reducing and yield-stress-lowering effects comparable to lecithin and PGPR, respectively. This characteristic demonstrates great potential for spinach lipid to act as a single natural lipid emulsifier in chocolate.
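The two rheological quantities reported above, viscosity at 40 s⁻¹ and a yield value read off at 5 s⁻¹, are often interpreted through the Casson model traditionally used for molten chocolate. The following sketch evaluates it with illustrative parameter values (not fitted to the study's data):

```python
def casson_stress(shear_rate, yield_stress, casson_viscosity):
    """Casson model, long used for molten chocolate:
    sqrt(sigma) = sqrt(sigma_0) + sqrt(eta_ca * gamma_dot),
    with sigma in Pa, gamma_dot in 1/s, eta_ca in Pa*s."""
    return (yield_stress ** 0.5 + (casson_viscosity * shear_rate) ** 0.5) ** 2

def apparent_viscosity(shear_rate, yield_stress, casson_viscosity):
    """Apparent viscosity sigma / gamma_dot at a given shear rate."""
    return casson_stress(shear_rate, yield_stress, casson_viscosity) / shear_rate

# Illustrative Casson parameters (assumed, not from the paper):
stress_at_5 = casson_stress(5.0, 10.0, 2.0)            # yield-value proxy
viscosity_at_40 = apparent_viscosity(40.0, 10.0, 2.0)  # flow-behaviour proxy
```

Lowering the Casson yield stress (the PGPR-like effect) mainly changes the low-shear stress reading, while lowering the Casson viscosity (the lecithin-like effect) mainly changes the apparent viscosity at 40 s⁻¹, which mirrors the complementary roles the two emulsifiers play.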

Keywords: chocolate viscosity, lecithin, polyglycerol polyricinoleate (PGPR), spinach lipid

Procedia PDF Downloads 224