Search results for: Randall Gray Underwood
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 186

96 Foundations for Global Interactions: The Theoretical Underpinnings of Understanding Others

Authors: Randall E. Osborne

Abstract:

In a course on International Psychology, 8 theoretical perspectives (Critical Psychology, Liberation Psychology, Post-Modernism, Social Constructivism, Social Identity Theory, Social Reductionism, Symbolic Interactionism, and Vygotsky’s Sociocultural Theory) are used as a framework for getting students to understand the concept of and need for globalization. One of critical psychology's main criticisms of conventional psychology is that it fails to consider, or deliberately ignores, the way power differences between social classes and groups can impact the mental and physical well-being of individuals or groups of people. Liberation psychology, also known as liberation social psychology or psicología social de la liberación, is an approach to psychological science that aims to understand the psychology of oppressed and impoverished communities by addressing the oppressive sociopolitical structure in which they exist. Postmodernism is largely a reaction to the assumed certainty of scientific, or objective, efforts to explain reality. It stems from a recognition that reality is not simply mirrored in human understanding of it but rather is constructed as the mind tries to understand its own particular and personal reality. Lev Vygotsky argued that all cognitive functions originate in, and must therefore be explained as products of, social interactions, and that learning is not simply the assimilation and accommodation of new knowledge by learners. Social Identity Theory discusses the implications of social identity for human interactions with, and assumptions about, other people. It suggests that people: (1) categorize—people find it helpful (humans might be perceived as having a need) to place people and objects into categories; (2) identify—people align themselves with groups and gain identity and self-esteem from them; and (3) compare—people compare themselves to others. Social reductionism argues that all behavior and experiences can be explained simply by the effect of groups on the individual. Symbolic interaction theory focuses attention on the way that people interact through symbols: words, gestures, rules, and roles. Meaning evolves from human interactions in the environment and with other people. Vygotsky’s sociocultural theory of human learning describes learning as a social process and locates the origin of human intelligence in society or culture. The major theme of Vygotsky’s theoretical framework is that social interaction plays a fundamental role in the development of cognition. This presentation will discuss how these theoretical perspectives are incorporated into a course on International Psychology, a course on the Politics of Hate, and a course on the Psychology of Prejudice, Discrimination and Hate to promote student thinking in a more ‘global’ manner.

Keywords: globalization, international psychology, society and culture, teaching interculturally

Procedia PDF Downloads 223
95 Development of Paper Based Analytical Devices for Analysis of Iron (III) in Natural Water Samples

Authors: Sakchai Satienperakul, Manoch Thanomwat, Jutiporn Seedasama

Abstract:

Paper-based analytical devices (PADs) for the analysis of Fe(III) ions in natural water samples are developed, using a reagent from guava leaf extract. The extraction is simply performed in deionized water at pH 7, where a tannin extract is obtained and used as an alternative natural reagent. The PADs are fabricated by ink-jet printing using alkenyl ketene dimer (AKD) wax. The quantitation of Fe(III) is carried out using the guava leaf extract reagent prepared in acetate buffer at a ratio of 1:1. A color change to gray-purple is observed by the naked eye when a sample containing Fe(III) ions is dropped onto the PAD channel. Reflective absorption measurement is performed to create a standard curve. A linear calibration range is observed over the concentration range of 2-10 mg L-1, with a detection limit for Fe(III) of 2 mg L-1. In its optimum form, the PAD is stable for up to 30 days under oxygen-free conditions. The small dimensions, low volume requirement, and alternative natural reagent make the proposed PADs attractive for on-site environmental monitoring and analysis.
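For illustration, a minimal sketch of how such a standard curve could be fitted and inverted to quantify Fe(III); the absorbance values and the `quantify` helper are invented for this example, not the authors' data:

```python
import numpy as np

# Hypothetical calibration data: reflective absorbance readings for Fe(III)
# standards across the reported 2-10 mg/L linear range (values invented).
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])       # mg/L
signal = np.array([0.11, 0.21, 0.32, 0.41, 0.52])  # arbitrary absorbance units

# Fit the standard curve: signal = slope * concentration + intercept
slope, intercept = np.polyfit(conc, signal, 1)

def quantify(sample_signal):
    """Invert the calibration line to estimate Fe(III) concentration."""
    return (sample_signal - intercept) / slope

print(f"Fe(III) in sample: {quantify(0.27):.1f} mg/L")
```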

Keywords: green chemical analysis, guava leaf extract, lab on a chip, paper based analytical device

Procedia PDF Downloads 218
94 Early Detection of Breast Cancer in Digital Mammograms Based on Image Processing and Artificial Intelligence

Authors: Sehreen Moorat, Mussarat Lakho

Abstract:

A method of artificial intelligence using digital mammogram data is proposed in this paper for the detection of breast cancer. Many researchers have developed techniques for the early detection of breast cancer; early diagnosis helps to save many lives. Detection of breast cancer through mammography is an effective method that finds the cancer before it can be felt and increases the survival rate. In this paper, we propose an image processing technique for enhancing the image to detect graphical table data and markings. Texture features based on the Gray-Level Co-Occurrence Matrix (GLCM) and intensity-based features are extracted from the selected region. For classification, a neural-network-based supervised classifier system is used that can discriminate between benign and malignant cases. In total, 68 digital mammograms were used to train the classifier. The obtained results indicate that automated detection of breast cancer is beneficial for early diagnosis and increases the survival rates of breast cancer patients. The proposed system will help radiologists in the better interpretation of breast cancer.
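As a sketch of the feature-extraction step, the following assumes scikit-image's `graycomatrix`/`graycoprops` for the GLCM texture features and a small scikit-learn MLP as the supervised neural classifier; the ROI arrays and labels are random placeholders, not the 68 mammograms used in the paper:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

def glcm_features(roi):
    """Texture features from a gray-level co-occurrence matrix of an 8-bit ROI."""
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    texture = [graycoprops(glcm, p).mean() for p in props]
    intensity = [roi.mean(), roi.std()]  # intensity-based features
    return np.array(texture + intensity)

# Placeholder training data: one feature vector per segmented ROI,
# with labels 0 = benign, 1 = malignant.
rois = [np.random.randint(0, 256, (64, 64), dtype=np.uint8) for _ in range(68)]
labels = np.random.randint(0, 2, 68)
X = np.vstack([glcm_features(r) for r in rois])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)
```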

Keywords: medical imaging, cancer, processing, neural network

Procedia PDF Downloads 234
93 Digital Material Characterization Using the Quantum Fourier Transform

Authors: Felix Givois, Nicolas R. Gauger, Matthias Kabel

Abstract:

Efficient digital material characterization is of great interest to many fields of application. It consists of the following three steps. First, a 3D reconstruction of 2D scans must be performed. Then, the resulting gray-value image of the material sample is enhanced by image processing methods. Finally, partial differential equations (PDEs) are solved on the segmented image, and by averaging the resulting solution fields, effective properties like stiffness or conductivity can be computed. Due to the high resolution of current CT images, the latter step is typically performed with matrix-free solvers. Among them, a solver that uses the explicit formula of the Green-Eshelby operator in Fourier space has been proposed by Moulinec and Suquet. Its algorithmically most complex part is the Fast Fourier Transform (FFT). In our talk, we will discuss the potential quantum advantage that can be obtained by replacing the FFT with the Quantum Fourier Transform (QFT). We will especially show that the data transfer for noisy intermediate-scale quantum (NISQ) devices can be improved by using appropriate boundary conditions for the PDE, which also allows using semi-classical versions of the QFT. In the end, we will compare the results of the QFT-based algorithm for simple geometries with the results of the FFT-based homogenization method.
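A minimal sketch of the substitution the talk discusses, applying Qiskit's `QFT` circuit to an amplitude-encoded vector and comparing it against the classical transform; the `qiskit` dependency and the normalization/ordering conventions noted in the comments are assumptions, not details from the abstract:

```python
import numpy as np
from qiskit.circuit.library import QFT
from qiskit.quantum_info import Statevector

n = 3                       # 3 qubits -> vector length N = 8
N = 2 ** n
x = np.random.rand(N) + 0j
x /= np.linalg.norm(x)      # amplitude encoding requires a unit vector

# Classical reference: Qiskit's QFT matches the inverse DFT up to a
# factor of sqrt(N) (assumed convention with do_swaps=True).
classical = np.fft.ifft(x) * np.sqrt(N)

# Quantum Fourier transform of the same amplitudes
quantum = Statevector(x).evolve(QFT(n)).data

print(np.allclose(classical, quantum))  # expected True under these conventions
```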

Keywords: maximum likelihood quantum amplitude estimation (MLQAE), numerical homogenization, quantum Fourier transform (QFT), NISQ devices

Procedia PDF Downloads 47
92 Heuristic Classification of Hydrophone Recordings

Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas

Abstract:

An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-squared values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way both time- and frequency-domain information are contained in the features to be passed to a clustering algorithm. Classification is performed using the k-means algorithm and then a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. Hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', where the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
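A simplified sketch of the feature-extraction and clustering pipeline, assuming `librosa` and scikit-learn; the file list is a placeholder and the 10-octave filter-bank RMS features are omitted for brevity:

```python
import numpy as np
import librosa
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

def clip_features(path):
    """Per-clip feature vector: spectral centroid plus frame-averaged MFCCs."""
    y, sr = librosa.load(path, sr=None, mono=True)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    return np.hstack([centroid, mfcc])

# Placeholder file list standing in for the 17,195 recordings.
paths = ["rec_0001.wav", "rec_0002.wav", "rec_0003.wav"]
X = np.vstack([clip_features(p) for p in paths])

# k-means clustering followed by a nearest-neighbor search against centroids,
# mirroring the two-stage assignment described in the abstract.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
nn = NearestNeighbors(n_neighbors=1).fit(kmeans.cluster_centers_)
_, assigned = nn.kneighbors(X)   # each clip's nearest cluster centroid
```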

Keywords: anthrophony, hydrophone, k-means, machine learning

Procedia PDF Downloads 135
91 Geographical Data Visualization Using Video Games Technologies

Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava

Abstract:

In this paper, we present advances in the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the National Institute of Statistics and Geography (INEGI) of Mexico. We select a place of interest to visualize from the Landsat imagery and apply some processing to the image (rotations, atmospheric correction, and enhancement). The resulting image is our gray-scale color-map, to be fused with the LIDAR data, which was selected using the same coordinates as the Landsat scene. The LIDAR data is translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They will download the software and images corresponding to a geological place of interest to a smartphone and can virtually visit and explore the site with a virtual reality visor such as Google Cardboard.
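A minimal sketch of the "LIDAR to 8-bit raw" translation step, using a random elevation grid as a stand-in for the INEGI data; the output format follows Unity's raw-heightmap import convention (an assumption, since the abstract does not specify the exporter):

```python
import numpy as np

# Placeholder LIDAR elevation grid (meters); in practice this would be read
# from the INEGI data covering the same coordinates as the Landsat scene.
dem = np.random.uniform(2200.0, 2600.0, size=(513, 513))

# Normalize elevations to 0-255 and export as an 8-bit raw heightmap
# that a Unity terrain can import.
lo, hi = dem.min(), dem.max()
heightmap = np.round((dem - lo) / (hi - lo) * 255).astype(np.uint8)
heightmap.tofile("terrain_heightmap.raw")
```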

Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material

Procedia PDF Downloads 219
90 Wear and Mechanical Properties of Nodular Iron Modified with Copper

Authors: J. Ramos, V. Gil, A. F. Torres

Abstract:

Nodular iron is a material that has shown great advantages with respect to other materials (steel and gray iron) in the production of machine elements. The engineering industry, especially the automotive sector, is a potential user of this material. As is known, alloying elements modify the properties of steels and castings. Copper has been investigated as a structural modifier of nodular iron, but studies of its mechanical and tribological implications still need to be addressed for industrial use. With the aim of improving the mechanical properties of nodular iron, alloying elements (Mn, Si, and Cu) are added in order to increase the pearlite (or ferrite) structure according to the percentage of the alloying element. In this research (using an induction furnace process), nodular iron with three different percentages of copper (residual, 0.5%, and 1.2%) was obtained. Chemical analysis was performed by optical emission spectrometry, and microstructures were characterized by optical microscopy (ASTM E3) and scanning electron microscopy (SEM). The mechanical behavior was studied in a mechanical test machine (ASTM E8), and a pin-on-disk tribometer (ASTM G99) was used to assess wear resistance. It is observed that copper increases the pearlite structure, improving the wear and tensile behavior. This improvement is observed in higher proportion with 0.5% copper, because too large an increase in pearlite leads to ductility loss.

Keywords: copper, mechanical properties, nodular iron, pearlite structure, wear

Procedia PDF Downloads 361
89 Image Segmentation Techniques: Review

Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo

Abstract:

Image segmentation is the process of dividing an image into several sections, such as an object's foreground and the background. It is a critical technique in both image-processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research has addressed color images. Most image segmentation algorithms or techniques vary based on the input data and the application, and nearly all of them are unsuitable for noisy environments. Much of the work that has been done uses the Markov Random Field (MRF), which is computationally demanding but is said to be robust to noise. In recent years, image segmentation has been applied to problems such as easy processing of an image, interpretation of the contents of an image, and easy analysis of an image. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed in past years. The techniques include convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, and thresholding techniques, among others. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications and potential future developments around image segmentation. This review concludes that no technique is perfectly suitable for the segmentation of all different types of images, but the use of hybrid techniques yields more accurate and efficient results.
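As a concrete instance of the thresholding family reviewed here, a minimal Otsu example with scikit-image (the test image is a library sample, not from the review):

```python
from skimage import data
from skimage.filters import threshold_otsu

image = data.camera()        # built-in gray-scale test image
t = threshold_otsu(image)    # global threshold maximizing between-class variance
segments = image > t         # boolean foreground/background mask
print(f"Otsu threshold: {t}, foreground fraction: {segments.mean():.2f}")
```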

Keywords: clustering-based, convolution-network, edge-based, region-growing

Procedia PDF Downloads 58
88 Secure Message Transmission Using Meaningful Shares

Authors: Ajish Sreedharan

Abstract:

Visual cryptography encodes a secret image into shares of random binary patterns. If the shares are printed onto transparencies, the secret image can be visually decoded by superimposing a qualified subset of transparencies, but no secret information can be obtained from the superposition of a forbidden subset. The binary patterns of the shares, however, have no visual meaning and hinder the objectives of visual cryptography. In secret message transmission through meaningful shares, a secret message to be transmitted is converted to a grey-scale image. Then (2,2) visual cryptographic shares are generated from this converted grey-scale image. The shares are encrypted using a chaos-based image encryption algorithm using the wavelet transform. Two separate color images of the same size as the shares are taken as cover images to hide the respective shares within them. The encrypted shares are thus covered by meaningful images, so that a potential eavesdropper won't know there is a message to be read. The meaningful shares are transmitted through two different transmission media. During decoding, the shares are fetched from the received meaningful images and decrypted using the same chaos-based image encryption algorithm. The shares are then combined to regenerate the grey-scale image from which the secret message is obtained.
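A minimal sketch of the (2,2) share-generation step for a binary image (the chaos-based encryption and cover-image embedding stages are omitted); the subpixel patterns follow the classic Naor-Shamir construction, which the abstract does not spell out:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_shares(secret):
    """(2,2) visual cryptography for a binary image (1 = black).
    Each secret pixel expands to a 2x2 block; stacking the shares (logical OR)
    renders black pixels as fully black blocks and white ones as half black."""
    patterns = np.array([[1, 0, 0, 1], [0, 1, 1, 0]])  # complementary pair
    h, w = secret.shape
    s1 = np.zeros((2 * h, 2 * w), dtype=np.uint8)
    s2 = np.zeros_like(s1)
    for i in range(h):
        for j in range(w):
            p = patterns[rng.integers(2)].reshape(2, 2)
            s1[2*i:2*i+2, 2*j:2*j+2] = p
            # white pixel: identical blocks; black pixel: complementary blocks
            s2[2*i:2*i+2, 2*j:2*j+2] = p if secret[i, j] == 0 else 1 - p
    return s1, s2

secret = rng.integers(0, 2, size=(16, 16), dtype=np.uint8)
share1, share2 = make_shares(secret)
stacked = share1 | share2   # superimposing transparencies = logical OR
```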

Keywords: visual cryptography, wavelet transform, meaningful shares, grey scale image

Procedia PDF Downloads 426
87 Utilization of Two Kinds of Recycled Greywater in Irrigation of Syngonium sp. Plants Grown under Different Water Regimes

Authors: Sami Ali Metwally, Bedour Helmy Abou-Leila, Hussien I.Abdel-Shafy

Abstract:

The work was carried out at the greenhouse of the National Research Centre. A pot experiment was conducted during the 2020 and 2021 seasons to study the effect of two types of recycled gray water (SBR, Sequencing Batch Reactor, and MBR, Membrane Bioreactor) and three watering intervals (15, 20, and 25 days) on the growth of Syngonium plants. Examination of the data showed that MBR water produced increases in vegetative growth parameters, osmotic pressure, transpiration rate, chlorophyll a and b, carotenoids, and carbohydrate compared with SBR. As for watering intervals, the highest values of most growth parameters were obtained from plants irrigated every 20 days compared with the other treatments. The 15-day irrigation interval significantly increased osmotic pressure, transpiration rate, and photosynthetic pigments, while carbohydrate values decreased. Regarding the interaction between water type and watering interval, SBR with irrigation every 20 days recorded the highest values of most growth parameters, while MBR with irrigation every 25 days showed the highest leaf area and leaf fresh weight compared with the other treatments.

Keywords: grey water, water intervals, Syngonium plant, recycling water, vegetative growth

Procedia PDF Downloads 82
86 One Year After the Entry into Force of the Treaty on the Prohibition of Nuclear Weapons (TPNW): Reviewing Legal Impact and Implementation

Authors: Cristina Siserman-Gray

Abstract:

The Treaty on the Prohibition of Nuclear Weapons (TPNW) will mark in January 2022 one year since its entry into force. The TPNW provides that within one year of entry into force, the 86 countries that have signed it so far will convene to discuss and take decisions on the treaty's implementation at the first meeting of states parties. Austria has formally offered to host the meeting in Vienna in the spring of 2022. At this first meeting, the states parties will need to work, among other things, on the interpretation of some of the provisions of the treaty, disarmament timelines under Article 4, and the universalization of the treaty. The main objective of this paper is to explore the legal implications of the TPNW for states parties and discuss how these will impact non-states parties, particularly the United States. In the first part, the article addresses the legal requirements that states parties to this treaty must adhere to, illustrating some of the progress made by these states regarding the implementation of the TPNW. In the second part, the paper addresses the challenges and opportunities for universalizing the treaty and focuses on the response of nuclear weapons states, particularly the current US administration. Since the TPNW has become a new and important element of the nonproliferation and disarmament architecture, the article provides a number of suggestions regarding ways the US administration could positively contribute to the international discourse on the TPNW.

Keywords: disarmament, arms control and nonproliferation, legal regime, TPNW

Procedia PDF Downloads 137
85 Image Multi-Feature Analysis by Principal Component Analysis for Visual Surface Roughness Measurement

Authors: Wei Zhang, Yan He, Yan Wang, Yufeng Li, Chuanpeng Hao

Abstract:

Surface roughness is an important index for evaluating surface quality and needs to be accurately measured to ensure the performance of the workpiece. Roughness measurement based on machine vision involves various image features, some of which are redundant. These redundant features affect the accuracy and speed of the visual approach. Previous research used correlation analysis methods to select the appropriate features. However, such feature analysis treats features independently and cannot fully utilize the information in the data. Besides, blindly reducing features loses a lot of useful information, producing unreliable results. Therefore, the focus of this paper is on providing a redundant-feature removal approach for visual roughness measurement. In this paper, statistical methods and the gray-level co-occurrence matrix (GLCM) are employed to effectively extract the texture features of machined images. Then, principal component analysis (PCA) is used to fuse all extracted features into a new one, which reduces the feature dimension and maintains the integrity of the original information. Finally, the relationship between the new features and roughness is established by a support vector machine (SVM). The experimental results show that the approach can effectively resolve the multi-feature information redundancy of machined surface images and provides a new idea for the visual evaluation of surface roughness.
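A sketch of the GLCM → PCA → SVM chain, assuming scikit-image and scikit-learn; support vector regression stands in for the SVM step since roughness is a continuous quantity, and the images and Ra values are random placeholders:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def texture_features(img):
    """Statistical + GLCM texture features of an 8-bit machined-surface image."""
    glcm = graycomatrix(img, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    feats = [graycoprops(glcm, p).ravel()
             for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.hstack([img.mean(), img.std(), *feats])

# Placeholder dataset: machined-surface images with measured Ra values.
imgs = [np.random.randint(0, 256, (128, 128), dtype=np.uint8) for _ in range(40)]
ra = np.random.uniform(0.2, 3.2, 40)   # surface roughness, placeholder values
X = np.vstack([texture_features(im) for im in imgs])

# PCA fuses the redundant features into a few components; SVR maps them to Ra.
model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(kernel="rbf"))
model.fit(X, ra)
```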

Keywords: feature analysis, machine vision, PCA, surface roughness, SVM

Procedia PDF Downloads 185
84 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework

Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim

Abstract:

Background modeling and subtraction in video analysis has been widely proven to be an effective method for moving object detection in many computer vision applications. Over the past years, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are two of the most frequently occurring issues in practice. This paper presents a new two-layer model based on the codebook algorithm incorporating a local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is designed as a block-based codebook combining an LBP histogram with the mean values of the RGB color channels. Because of the invariance of LBP features with respect to monotonic gray-scale changes, this layer can produce block-wise detection results with considerable tolerance of illumination variations. A pixel-based codebook is employed to refine the outputs of the first layer and further eliminate false positives. As a result, the proposed approach can greatly improve accuracy under dynamic background and illumination changes. Experimental results on several popular background subtraction datasets demonstrate very competitive performance compared to previous models.
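A minimal sketch of the first layer's block descriptor, combining a uniform-LBP histogram with RGB channel means via scikit-image's `local_binary_pattern`; the block size and LBP parameters are illustrative assumptions:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def block_descriptor(block_gray, block_rgb):
    """Block-wise descriptor: uniform-LBP histogram + mean RGB, as in layer one."""
    P, R = 8, 1
    lbp = local_binary_pattern(block_gray, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    rgb_means = block_rgb.reshape(-1, 3).mean(axis=0) / 255.0
    return np.hstack([hist, rgb_means])

# Placeholder 16x16 block from a video frame
gray = np.random.randint(0, 256, (16, 16)).astype(np.uint8)
rgb = np.random.randint(0, 256, (16, 16, 3)).astype(np.uint8)
desc = block_descriptor(gray, rgb)   # compared against codewords per block
```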

Keywords: background subtraction, codebook model, local binary pattern, dynamic background, illumination change

Procedia PDF Downloads 193
83 Optimization of Spatial Light Modulator to Generate Aberration Free Optical Traps

Authors: Deepak K. Gupta, T. R. Ravindran

Abstract:

Holographic optical tweezers (HOTs) generally use iterative algorithms such as weighted Gerchberg-Saxton (WGS) to generate multiple traps, which theoretically produce traps with 99% uniformity. In experiments, however, it is the phase response of the spatial light modulator (SLM) that ultimately determines the efficiency, uniformity, and quality of the trap spots. In general, SLMs show nonlinear phase response behavior, and they may even have asymmetric phase modulation depth before and after π. This affects the resolution with which the gray levels are addressed before and after π, leading to degraded trap performance. We present a method to optimize the SLM for a linear phase response along with a symmetric phase modulation depth around π. Further, we optimize the SLM for its varying phase response over different spatial regions by optimizing the brightness/contrast and gamma of the hologram in different subsections. We show the effect of the optimization on an array of trap spots, resulting in improved efficiency and uniformity. We also calculate the spot sharpness metric and trap performance metric and show a tightly focused spot with reduced aberration. The trap performance is compared by calculating the trap stiffness of a particle in a given trap spot before and after aberration correction. The trap stiffness is found to improve by 200% after the optimization.
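As a sketch of the linearization idea, a lookup-table correction built from a (here invented) measured phase-versus-gray-level curve; a real calibration would replace the synthetic `measured_phase` with interferometric measurements:

```python
import numpy as np

# Hypothetical measured phase response of an SLM (radians) versus the 256
# addressed gray levels; nonlinear and exceeding 2*pi, as described above.
gray_levels = np.arange(256)
measured_phase = 2.2 * np.pi * (gray_levels / 255.0) ** 1.3

# Inverse lookup table: for each desired phase in [0, 2*pi), find the gray
# level whose measured phase matches, linearizing the response.
desired = np.linspace(0, 2 * np.pi, 256, endpoint=False)
lut = np.interp(desired, measured_phase, gray_levels).round().astype(np.uint8)

def linearized(hologram_phase):
    """Map a phase hologram (radians) to corrected SLM gray levels."""
    idx = (hologram_phase % (2 * np.pi)) / (2 * np.pi) * 255
    return lut[idx.round().astype(int)]
```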

Keywords: spatial light modulator, optical trapping, aberration, phase modulation

Procedia PDF Downloads 147
82 Multi-Scale Green Infrastructure: An Integrated Literature Review

Authors: Panpan Feng

Abstract:

The concept of green infrastructure originated in Europe and the United States. It aims to ensure smart growth of urban and rural ecosystems and to achieve sustainable ecological, social, and economic development by combining green infrastructure with the gray infrastructure of traditional planning. Based on a literature review of the theoretical origins, value connotations, and measurement methods of green infrastructure, this study summarizes the research content of green infrastructure at three spatial levels (region, city, and block) and divides it into functional, spatial, and strategic dimensions. The results show that, in the functional dimension, from region to city to block, research on green infrastructure gradually shifts from ecological functions to social functions. In the spatial dimension, from region to city to block, research on the spatial form of green infrastructure has shifted from two-dimensional to three-dimensional, and the spatial structure of green infrastructure has shifted from single ecological elements to multiple composite elements. From a strategic perspective, green infrastructure research serves mostly as a spatial planning tool grounded in land management, environmental livability, and ecological psychology, providing decision-making support.

Keywords: green infrastructure, multi-scale, social and ecological functions, spatial strategic decision-making tools

Procedia PDF Downloads 23
81 Numerical Analysis of a Pilot Solar Chimney Power Plant

Authors: Ehsan Gholamalizadeh, Jae Dong Chung

Abstract:

The solar chimney power plant is a feasible solar thermal system that produces electricity from the Sun. The objective of this study is to investigate buoyancy-driven flow and heat transfer through a built pilot solar chimney system called the 'Kerman Project'. The system has a chimney with a height of 60 m and a diameter of 3 m, the average radius of its solar collector is about 20 m, and its average collector height is about 2 m. A three-dimensional simulation was conducted to analyze the system using computational fluid dynamics (CFD). In this model, the radiative transfer equation was solved using the discrete ordinates (DO) radiation model, taking into account non-gray radiation behavior. In order to model solar irradiation from the sun’s rays, a solar ray tracing algorithm was coupled to the computation via a source term in the energy equation. The model was validated by comparison with the experimental data of the Manzanares prototype and with the performance of the built pilot system. Then, based on the numerical simulations, velocity and temperature distributions through the system, the temperature profile of the ground surface, and the system performance are presented. The analysis accurately shows the flow and heat transfer characteristics through the pilot system and predicts its performance.

Keywords: buoyancy-driven flow, computational fluid dynamics, heat transfer, renewable energy, solar chimney power plant

Procedia PDF Downloads 228
80 Filmic and Verbal Metaphors

Authors: Manana Rusieshvili, Rusudan Dolidze

Abstract:

This paper aims at (1) investigating the ways in which a traditional, monomodal written verbal metaphor can be transposed as a monomodal non-verbal (visual) or multimodal (aural-visual) filmic metaphor, and (2) exploring similarities and differences in the processes of encoding and decoding monomodal and multimodal metaphors. The empirical data on which the research is based embrace three sources: the novel by Harry Gray ‘The Hoods’, the script of the film ‘Once Upon a Time in America’ (English version by David Mills), and the resultant film by Sergio Leone. In order to achieve the above-mentioned goals, the research focuses on the following issues: (1) identification of verbal and non-verbal monomodal and multimodal metaphors in the above-mentioned sources; (2) investigation of the ways and modes in which the specific written monomodal metaphors appearing in the novel and the script are enacted in the film and become visual, aural, or visual-aural filmic metaphors; and (3) study of the factors which play an important role in the encoding and decoding of the filmic metaphor. The collection and analysis of the data were carried out in two stages: first, the relevant data, i.e., the monomodal metaphors from the novel, the script, and the film, were identified and collected. In the second, final stage, the metaphors taken from all three sources were analysed and compared, and two types of phenomena were selected for discussion: (1) the monomodal written metaphors found in the novel and/or in the script which become monomodal visual/aural metaphors in the film; (2) the monomodal written metaphors found in the novel and/or in the script which become multimodal, filmic (visual-aural) metaphors in the film.

Keywords: encoding, decoding, filmic metaphor, multimodality

Procedia PDF Downloads 488
79 Segmentation of the Liver and Spleen From Abdominal CT Images Using Watershed Approach

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

Segmentation is an important step in the processing and interpretation of medical images. In this paper, we focus on the segmentation of the liver and spleen from abdominal computed tomography (CT) images. The importance of our study comes from the fact that the segmentation of regions of interest (ROIs) from CT images is usually a difficult task: the gray level of the ROI is similar to that of other organs, and the ROI is connected to the ribs, heart, kidneys, etc. Our proposed method is based on anatomical information and mathematical morphology tools used in the image processing field. At first, we remove the surrounding and connected organs and tissues by applying morphological filters. This first step makes the extraction of the regions of interest easier. The second step consists of improving the quality of the image gradient; here we apply spatial filters followed by morphological filters to reduce the gradient's deficiencies. Thereafter, we proceed to the segmentation of the liver and spleen. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with manual segmentation performed by an expert. The experimental results are described in the last part of this work. The system has been evaluated by computing the sensitivity and specificity between the semi-automatically segmented (liver and spleen) contour and the contour traced manually by radiological experts.
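A simplified sketch of the gradient-plus-watershed pipeline with scikit-image; the gray-level bounds encoding the anatomical prior and the placeholder CT slice are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.morphology import opening, disk
from skimage.segmentation import watershed

def segment_organ(ct_slice, low, high):
    """Marker-controlled watershed on a 2-D CT slice (gray levels 0-255).
    `low`/`high` bracket the expected organ gray range (anatomical prior)."""
    smoothed = opening(ct_slice, disk(3))        # morphological cleanup
    gradient = sobel(smoothed.astype(float))     # improved gradient image

    markers = np.zeros_like(ct_slice, dtype=np.int32)
    markers[(ct_slice > low) & (ct_slice < high)] = 2   # inside-organ seeds
    markers[ct_slice < low // 2] = 1                    # background seeds

    labels = watershed(gradient, markers)
    return ndi.binary_fill_holes(labels == 2)

ct = np.random.randint(0, 256, (256, 256)).astype(np.uint8)  # placeholder slice
liver_mask = segment_organ(ct, low=90, high=160)
```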

Keywords: CT images, liver and spleen segmentation, anisotropic diffusion filter, morphological filters, watershed algorithm

Procedia PDF Downloads 466
78 A Combination of Anisotropic Diffusion and Sobel Operator to Enhance the Performance of the Morphological Component Analysis for Automatic Crack Detection

Authors: Ankur Dixit, Hiroaki Wagatsuma

Abstract:

Crack detection on concrete bridges is an important and constant task in civil engineering. Traditionally, humans inspect bridges for cracks to maintain their quality and reliability, but this process is very long and costly. To overcome such limitations, we have used a drone with a digital camera, which took images of a bridge deck; these images are processed by morphological component analysis (MCA). The MCA technique is a very strong application of sparse coding, and it explores the possibility of separating images. In this paper, MCA has been used to decompose the image into coarse and fine components through the effectiveness of two dictionaries, namely anisotropic diffusion and the wavelet transform. Anisotropic diffusion is an adaptive smoothing process that adjusts the diffusion coefficient using gray level and gradient as features. The cracks in the image are enhanced by subtracting the diffused coarse image from the original image, and the results are treated by a Sobel edge detector and binary filtering to exhibit the cracks clearly. Our results demonstrate that the proposed MCA framework using anisotropic diffusion followed by a Sobel operator and binary filtering may contribute to the automation of crack detection even in severe open-field conditions such as bridge decks.
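A minimal sketch of the anisotropic-diffusion and Sobel stages (the full MCA decomposition with a wavelet dictionary is omitted); the Perona-Malik parameters and the threshold rule are illustrative choices, not the authors' settings:

```python
import numpy as np
from skimage.filters import sobel

def perona_malik(img, n_iter=20, kappa=30.0, lam=0.2):
    """Anisotropic (Perona-Malik) diffusion: smooths flat regions while the
    gradient-dependent coefficient preserves edges such as cracks."""
    u = img.astype(float)
    for _ in range(n_iter):
        # finite-difference gradients toward the four neighbours
        dn = np.roll(u, 1, 0) - u
        ds = np.roll(u, -1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        # conduction coefficients shrink where gradients are large
        u += lam * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return u

img = np.random.randint(0, 256, (128, 128)).astype(float)  # placeholder image
coarse = perona_malik(img)                 # diffused (coarse) component
residual = img - coarse                    # cracks live in the fine residual
edges = sobel(residual)
cracks = edges > edges.mean() + 2 * edges.std()   # simple binary filtering
```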

Keywords: anisotropic diffusion, coarse component, fine component, MCA, Sobel edge detector and wavelet transform

Procedia PDF Downloads 152
77 Electroencephalography (EEG) Analysis of Alcoholic and Control Subjects Using Multiscale Permutation Entropy

Authors: Lal Hussain, Wajid Aziz, Sajjad Ahmed Nadeem, Saeed Arif Shah, Abdul Majid

Abstract:

Brain electrical activity, as reflected in electroencephalography (EEG), has been analyzed and diagnosed using various techniques. Among them, complexity measures of nonlinearity, disorder, and unpredictability play a vital role, owing to the nonlinear interconnections between functional and anatomical subsystems that emerge in the brain in the healthy state and during various diseases. Alcohol abuse has many social and economic consequences, such as weakness of memory and impairments of decision-making and concentration. Alcoholism not only damages the brain but is also associated with emotional, behavioral, and cognitive impairments, damaging the white and gray brain matter. A recently developed signal analysis method, Multiscale Permutation Entropy (MPE), is proposed to estimate the complexity of the long-range temporally correlated EEG time series of alcoholic and control subjects, acquired from the University of California machine learning repository, and the results are compared with multiscale sample entropy (MSE). Using MPE, a coarse-grained series is first generated, and the PE is computed for each coarse-grained time series for the electrodes O1, O2, C3, C4, F2, F3, F4, F7, F8, Fp1, Fp2, P3, P4, T7, and T8. The results computed for each electrode using MPE give more significant values, as well as larger mean rank differences, than MSE. Likewise, the ROC and the area under the ROC curve also give higher separation for each electrode using MPE in comparison to MSE.
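A compact sketch of MPE, coarse-graining the series at each scale and computing normalized Bandt-Pompe permutation entropy; the embedding order, scale range, and placeholder EEG channel are illustrative assumptions:

```python
import numpy as np
from itertools import permutations
from math import factorial

def permutation_entropy(x, m=3, delay=1):
    """Normalized permutation entropy of order m (Bandt-Pompe)."""
    patterns = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - (m - 1) * delay):
        window = x[i:i + m * delay:delay]
        patterns[tuple(np.argsort(window))] += 1
    p = np.array([c for c in patterns.values() if c > 0], dtype=float)
    p /= p.sum()
    return -(p * np.log2(p)).sum() / np.log2(factorial(m))

def multiscale_pe(x, scales=range(1, 11), m=3):
    """Coarse-grain the series at each scale, then compute PE per scale."""
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)  # non-overlapping means
        out.append(permutation_entropy(coarse, m=m))
    return np.array(out)

eeg = np.random.randn(2560)   # placeholder for one EEG channel (e.g., O1)
print(multiscale_pe(eeg))
```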

Keywords: electroencephalogram (EEG), multiscale permutation entropy (MPE), multiscale sample entropy (MSE), permutation entropy (PE), Mann-Whitney test (MWT), receiver operating characteristic curve (ROC), complexity measure

Procedia PDF Downloads 462
76 Prediction of PM₂.₅ Concentration in Ulaanbaatar with Deep Learning Models

Authors: Suriya

Abstract:

Rapid socio-economic development and urbanization have led to an increasingly serious air pollution problem in Ulaanbaatar (UB), the capital of Mongolia. PM₂.₅ pollution has become the most pressing aspect of UB air pollution. Therefore, monitoring and predicting PM₂.₅ concentration in UB is of great significance for the health of the local people and for environmental management. As yet, very few studies have used models to predict PM₂.₅ concentrations in UB. Using data from 0:00 on June 1, 2018, to 23:00 on April 30, 2020, we propose two deep learning models based on Bayesian-optimized LSTM (Bayes-LSTM) and CNN-LSTM. We utilized hourly observed data, including Himawari-8 (H8) aerosol optical depth (AOD), meteorology, and PM₂.₅ concentration, as input for the prediction of PM₂.₅ concentrations. The correlation strengths between meteorology, AOD, and PM₂.₅ were analyzed using the gray correlation analysis method; the performance improvement obtained by including AOD as an input was tested, and the performance of these models was evaluated using the mean absolute error (MAE) and root mean square error (RMSE). The prediction accuracies of the Bayes-LSTM and CNN-LSTM deep learning models were both improved when AOD was included as an input parameter. The improvement in the prediction accuracy of the CNN-LSTM model was particularly pronounced in the non-heating season; in the heating season, the prediction accuracy of the Bayes-LSTM model slightly improved, while that of the CNN-LSTM model slightly decreased. We propose two novel deep learning models for PM₂.₅ concentration prediction in UB, pioneering the use of AOD data from H8 and demonstrating that including AOD input data improves the performance of both proposed models.
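A minimal sketch of the gray correlation (gray relational analysis) step used to rank candidate inputs against PM₂.₅; the distinguishing coefficient rho = 0.5, the per-factor normalization, and the placeholder series are conventional or illustrative assumptions:

```python
import numpy as np

def gray_relational_grade(reference, factors, rho=0.5):
    """Gray relational analysis: how closely each candidate series tracks the
    reference series (here, PM2.5), after min-max normalization."""
    def norm(v):
        return (v - v.min()) / (v.max() - v.min())

    ref = norm(reference)
    grades = []
    for f in factors:
        diff = np.abs(ref - norm(f))
        # gray relational coefficient per time step, then averaged to a grade
        coeff = (diff.min() + rho * diff.max()) / (diff + rho * diff.max())
        grades.append(coeff.mean())
    return np.array(grades)

# Placeholder hourly series standing in for the UB observations
pm25 = np.random.rand(500)
aod, temperature, wind = (np.random.rand(500) for _ in range(3))
print(gray_relational_grade(pm25, [aod, temperature, wind]))
```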

Keywords: deep learning, AOD, PM2.5, prediction, Ulaanbaatar

Procedia PDF Downloads 18
75 Elaboration and Investigation of the New Ecologically Clean Friction Composite Materials on the Basis of Nanoporous Raw Materials

Authors: Lia Gventsadze, Elguja Kutelia, David Gventsadze

Abstract:

The purpose of this article is to show the possibility of developing a new generation of eco-friendly (asbestos-free) nano-porous friction materials on the basis of Georgian raw materials, along with determining the technological parameters for their production, optimizing the tribological properties, and investigating the structural aspects of the wear peculiarities of the elaborated materials using scanning electron microscopy (SEM) and Auger electron spectroscopy (AES). The study investigated the tribological properties of polymer friction materials based on phenol-formaldehyde resin using a porous diatomite filler modified by silane to improve thermal stability, while the composition was modified by iron phosphate, technical carbon, and basalt fibre. As a result of testing, stable values of the friction coefficient (0.3-0.45) were reached in both dry and wet friction conditions, the friction working parameters (friction coefficient and wear stability) remained stable up to temperatures of 500 °C, the wear stability of the gray cast-iron disk increased 3-4 times, and soundless operation without squeaking was achieved. It was also shown that a small number of ingredients (5-6) is enough to compose the nano-porous friction materials. The study explains the mechanism of action of nano-porous composition-based brake lining materials and their tribological efficiency on the basis of the triple-phase model of the tribo-pair.

Keywords: brake lining, friction coefficient, wear, nanoporous composite, phenolic resin

Procedia PDF Downloads 367
74 Spinach Lipid Extract as an Alternative Flow Aid for Fat Suspensions

Authors: Nizaha Juhaida Mohamad, David Gray, Bettina Wolf

Abstract:

Chocolate is a composite material with a high fraction of solid particles dispersed in a fat phase largely composed of cocoa butter. The viscosity of chocolate can be manipulated by the amount of fat: increased levels of fat lead to lower viscosity. However, a high content of cocoa butter increases the cost of the chocolate, so surfactants are used instead to manipulate the viscosity behaviour. Most commonly, lecithin and polyglycerol polyricinoleate (PGPR) are used. Lecithin is a natural lipid emulsifier based on phospholipids, while PGPR is a chemically produced emulsifier based on the long continuous chain of ricinoleic acid. Lecithin and PGPR act to lower the viscosity and the yield stress, respectively. Recently, natural lipid emulsifiers with galactolipids as the functional ingredient have become of interest. Spinach lipid is found to have a high amount of galactolipids, specifically MGDG and DGDG. The aim of this research is to explore the influence of spinach lipid, in comparison with PGPR and lecithin, on the rheological properties of sugar/oil suspensions which serve as a chocolate model system. For that purpose, icing sugar was dispersed at 40%, 45%, and 50% (w/w) in oil containing spinach lipid at concentrations from 0.1-0.7% (w/w). Based on the viscosity at 40 s-1 and the yield value, reported as the shear stress measured at 5 s-1, it was found that spinach lipid shows viscosity-reducing and yield-stress-lowering effects comparable to lecithin and PGPR, respectively. This characteristic of spinach lipid demonstrates great potential for it to act as a single natural lipid emulsifier in chocolate.

Keywords: chocolate viscosity, lecithin, polyglycerol polyricinoleate (PGPR), spinach lipid

Procedia PDF Downloads 225
73 The Inversion of Helical Twist Sense in Liquid Crystal by Spectroscopy Methods

Authors: Anna Drzewicz, Marzena Tykarska

Abstract:

Chiral liquid crystal phases form a helicoidal structure, which is characterized by the helical pitch and the helical twist sense. In the anticlinic smectic phase with antiferroelectric properties, three types of helix temperature dependence have been observed: a helical pitch that increases with temperature with a right-handed helix, a helical pitch that decreases with temperature with a left-handed helix, and the inversion of both. The change of helical twist sense may be observed during the transition from one liquid crystal phase to another, or within one phase of the same substance. According to the theory of Gray and McDonnell, the helical handedness depends on the absolute configuration of the asymmetric carbon atom and its position relative to the rigid core of the molecule. However, this theory does not explain the inversion of the helical twist sense. It is supposed that it may be caused by the presence of different conformers with opposite handedness, whose concentrations may change with temperature. In this work, the inversion of helical twist sense in chiral liquid crystals differing in the length of the alkyl chain, in the substitution of the benzene ring by fluorine atoms, and in the type of helix handedness was tested by vibrational spectroscopy (infrared and Raman spectroscopy) and by nuclear magnetic resonance spectroscopy. The results obtained from vibrational spectroscopy confirm the presence of different conformers. Moreover, the analysis of nuclear magnetic resonance spectra is very useful for checking in which structural fragments the conformational changes matter for the change of helical twist sense.

Keywords: helical twist sense, liquid crystals, nuclear magnetic resonance spectroscopy, vibrational spectroscopy

Procedia PDF Downloads 253
72 Modification of the Risk for Incident Cancer with Changes in the Metabolic Syndrome Status: A Prospective Cohort Study in Taiwan

Authors: Yung-Feng Yen, Yun-Ju Lai

Abstract:

Background: Metabolic syndrome (MetS) is reversible; however, the effect of changes in MetS status on the risk of incident cancer has not been extensively studied. We aimed to investigate the effects of changes in MetS status on incident cancer risk. Methods: This prospective, longitudinal study used data from Taiwan’s MJ cohort of 157,915 adults recruited from 2002–2016 who had repeated MetS measurements 5.2 (±3.5) years apart and were followed up for the new onset of cancer over 8.2 (±4.5) years. A new diagnosis of incident cancer in study individuals was confirmed by their pathohistological reports. The participants’ MetS status included MetS-free (n=119,331), MetS-developed (n=14,272), MetS-recovered (n=7,914), and MetS-persistent (n=16,398). We used the Fine-Gray subdistribution method, with death as the competing risk, to determine the association between MetS changes and the risk of incident cancer. Results: During the follow-up period, 7,486 individuals had a new development of cancer. Compared with the MetS-free group, MetS-persistent individuals had a significantly higher risk of incident cancer (adjusted hazard ratio [aHR], 1.10; 95% confidence interval [CI], 1.03-1.18). Considering the effect of dynamic changes in MetS status on the risk of specific cancer types, MetS persistence was significantly associated with a higher risk of incident colon and rectum, kidney, pancreas, uterus, and thyroid cancer. The risk of kidney, uterus, and thyroid cancer in MetS-recovered individuals was higher than in those who remained MetS-free but lower than in MetS-persistent individuals. Conclusions: Persistent MetS is associated with a higher risk of incident cancer, and recovery from MetS may reduce the risk. The findings of our study suggest that it is imperative for individuals with pre-existing MetS to seek treatment for this condition to reduce their cancer risk.

Keywords: metabolic syndrome change, cancer, risk factor, cohort study

Procedia PDF Downloads 48
71 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the feature-space dimensions of pattern samples. In this method, each gray-scale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated. This ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. In order to match a test image with the training set, a cosine-similarity-based Bayesian classification is used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
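A rough sketch of the PCA-then-FDA pipeline with scikit-learn, where `LinearDiscriminantAnalysis` plays the FDA role; the final matching is simplified to nearest-neighbor cosine similarity rather than the paper's Bayesian formulation, and all data are random placeholders:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Placeholder data: flattened gray-scale face images with expression labels.
X_train = np.random.rand(120, 48 * 48)
y_train = np.random.randint(0, 6, 120)
X_test = np.random.rand(10, 48 * 48)

# PCA preserves the dominant spatial variance; LDA (the FDA step) then finds
# a subspace maximizing between-class over within-class scatter.
pca = PCA(n_components=40).fit(X_train)
lda = LinearDiscriminantAnalysis().fit(pca.transform(X_train), y_train)
train_emb = lda.transform(pca.transform(X_train))
test_emb = lda.transform(pca.transform(X_test))

def cosine_nearest(t, refs, labels):
    """Assign the label of the reference embedding with highest cosine similarity."""
    sims = refs @ t / (np.linalg.norm(refs, axis=1) * np.linalg.norm(t))
    return labels[np.argmax(sims)]

preds = [cosine_nearest(t, train_emb, y_train) for t in test_emb]
```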

Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, F-measure

Procedia PDF Downloads 404
70 From Conflicts to Synergies between Mitigation and Adaptation Strategies to Climate Change: The Case of Lisbon Downtown 2010-2030

Authors: Nuno M. Pereira

Abstract:

In the last thirty years, European cities have been addressing global climate change and its local impacts by implementing mitigation and adaptation strategies. Lisbon Downtown is no exception, with 10 plans under implementation since 2010, completion scheduled for 2030, and 1 billion euros of public investment. However, the gap between mitigation and adaptation strategies is not yet sufficiently studied, nor are its nuances: vulnerability and risk mitigation, resilience and adaptation. In Lisbon Downtown, these plans are being implemented separately, thereby compromising the effectiveness of public investment. The research reviewed the common ground of mitigation and adaptation strategies in the theoretical framework and analyzed the current urban development actions in Lisbon Downtown in order to identify potential conflicts and synergies. The empirical fieldwork, supported by a sounding board of experts, was carried out over two years, and the results suggest that the largest public investment in Lisbon in flood mitigation will conflict with the new cruise ship terminal and the old Downtown building stock, thereby increasing risk and vulnerability factors. The study concludes that the Lisbon Downtown blue infrastructure plan should be redesigned in some areas in a transdisciplinary and holistic approach, and that the current theoretical framework on climate change should focus more on mitigation and adaptation synergies, articulating the gray, blue, and green infrastructures and combining old knowledge tested by resilient communities with new knowledge emerging from the digital era.

Keywords: adaptation, climate change, conflict, Lisbon Downtown, mitigation, synergy

Procedia PDF Downloads 166
69 Study on the Mechanical Properties of Bamboo Fiber-Reinforced Polypropylene Based Composites: Effect of Gamma Radiation

Authors: Kamrun N. Keya, Nasrin A. Kona, Ruhul A. Khan

Abstract:

Bamboo fiber (BF) reinforced polypropylene (PP) based composites were fabricated by a conventional compression molding technique. In this investigation, bamboo composites were manufactured using different percentages of fiber, varying from 25-65% of the total weight of the composites. Both untreated and treated fibers were selected to fabricate the BF/PP composites. A systematic study was done to observe the physical, mechanical, and interfacial behavior of the composites, with mechanical properties such as tensile, impact, and bending properties observed precisely. The maximum tensile strength (TS) and bending strength (BS) were found for the 50 wt% fiber composites, at 65 MPa and 85.5 MPa respectively, whereas the highest tensile modulus (TM) and bending modulus (BM) were measured at 5.73 GPa and 7.85 GPa, respectively. The BF/PP-based composites were irradiated with gamma radiation (source strength 50 kCi, Cobalt-60) at various doses (10, 20, 30, 40, 50, and 60 kGy; kGy = kilogray). The effect of gamma radiation on the composites was investigated, and it was found that a 30.0 kGy gamma dose produced better mechanical properties than the other doses. After flexural testing, the fracture sides of both the untreated and treated composites were studied by scanning electron microscopy (SEM). The SEM results of the treated BF/PP-based composites showed better fiber-matrix adhesion and interfacial bonding than the untreated BF/PP-based composites. Water uptake and soil degradation tests of untreated and treated composites were also investigated.

Keywords: bamboo fiber, polypropylene, compression molding technique, gamma radiation, mechanical properties, scanning electron microscope

Procedia PDF Downloads 107
68 Case Study on Innovative Aquatic-Based Bioeconomy for Chlorella sorokiniana

Authors: Iryna Atamaniuk, Hannah Boysen, Nils Wieczorek, Natalia Politaeva, Iuliia Bazarnova, Kerstin Kuchta

Abstract:

Over the last decade, due to climate change and a strategy of natural resource preservation, interest in aquatic biomass has dramatically increased. Along with mitigating environmental pressure and connecting waste streams (including CO2 and heat emissions), the microalgae bioeconomy can supply food and feed, as well as the pharmaceutical and power industries, with a number of value-added products. Furthermore, in comparison to conventional biomass, microalgae can be cultivated under a wide range of conditions without compromising food and feed production, thus addressing issues associated with negative social and environmental impacts. This paper presents state-of-the-art technology for the microalgae bioeconomy, from the cultivation process to the production of valuable components and by-streams. The microalga Chlorella sorokiniana was cultivated in a pilot-scale innovation concept in Hamburg (Germany) using different systems, such as a raceway pond (5000 L) and flat panel reactors (8 x 180 L). In order to achieve optimum growth conditions along with a cellular composition suitable for the further extraction of value-added components, process parameters such as light intensity, temperature, and pH are continuously monitored. Metabolic needs in nutrients are met by the addition of micro- and macronutrients to the medium to ensure autotrophic growth of the microalgae. Cultivation is followed by downstream processing and extraction of lipids, proteins, and saccharides. Lipid extraction is conducted in a repeated-batch, semi-automatic mode using the hot extraction method according to Randall, with hexane and ethanol as solvents at ratios of 9:1 and 1:9, respectively. Depending on the cell disruption method and the solvent ratio, the total lipid content showed significant variation, between 8.1% and 13.9%. The highest percentage of extracted biomass was reached with a sample pretreated by microwave digestion, using 90% hexane and 10% ethanol as solvents. The protein content of the microalgae was determined by two different methods: Total Kjeldahl Nitrogen (TKN), which was subsequently converted to protein content, and the Bradford method using Brilliant Blue G-250 dye. The obtained results showed a good correlation between both methods, with the protein content being in the range of 39.8-47.1%. Characterization of neutral and acid saccharides from the microalgae was conducted by the phenol-sulfuric acid method at two wavelengths, 480 nm and 490 nm. The average concentrations of neutral and acid saccharides under the optimal cultivation conditions were 19.5% and 26.1%, respectively. Subsequently, the biomass residues are used as substrate for anaerobic digestion at laboratory scale. The methane concentration, measured on a daily basis, showed some variation between samples after the extraction steps but remained between 48% and 55%. The CO2 formed during the fermentation process and after combustion in the combined heat and power unit can potentially be reused within the cultivation process as a carbon source for the photoautotrophic synthesis of biomass.

Keywords: bioeconomy, lipids, microalgae, proteins, saccharides

Procedia PDF Downloads 222
67 Optimal Image Representation for Linear Canonical Transform Multiplexing

Authors: Navdeep Goel, Salvador Gabarda

Abstract:

Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means to perform transmission or storage of visual data in the most economical way. This paper explains how images can be encoded to be transmitted in a multiplexing time-frequency domain channel. Multiplexing involves packing together signals whose representations are compact in the working domain. In order to optimize transmission resources, each 4x4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Using fewer than 4x4 coefficients per block spares a significant amount of transmitted information, but some information is lost. Different approximations for the image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares with gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials, and singular value decomposition (SVD). Results have been compared in terms of nominal compression rate (NCR), compression ratio (CR), and peak signal-to-noise ratio (PSNR), in order to minimize the error function defined as the difference between the original pixel gray levels and the approximated polynomial output. The polynomial coefficients are later encoded and used to generate chirps at a target rate of about two chirps per 4x4 pixel block, and then submitted to a transmission multiplexing operation in the time-frequency domain.
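A minimal sketch of the SVD variant on 4x4 blocks together with the PSNR comparison; the rank, test image, and storage accounting in the comments are illustrative assumptions:

```python
import numpy as np

def svd_block_approx(img, rank=1):
    """Approximate each 4x4 block by a rank-`rank` SVD truncation, keeping
    rank*(4+4+1) numbers per block instead of 16 (rank 1: 9 numbers)."""
    h, w = (s - s % 4 for s in img.shape)
    out = np.empty((h, w))
    for i in range(0, h, 4):
        for j in range(0, w, 4):
            u, s, vt = np.linalg.svd(img[i:i+4, j:j+4].astype(float))
            out[i:i+4, j:j+4] = (u[:, :rank] * s[:rank]) @ vt[:rank]
    return out

def psnr(original, approx, peak=255.0):
    mse = np.mean((original.astype(float) - approx) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

img = np.random.randint(0, 256, (64, 64)).astype(float)  # placeholder image
approx = svd_block_approx(img, rank=1)
print(f"PSNR: {psnr(img, approx):.1f} dB")
```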

Keywords: chirp signals, image multiplexing, image transformation, linear canonical transform, polynomial approximation

Procedia PDF Downloads 393