Search results for: high resolution satellite image
22702 Nutritional Status and Body Image Perception among Thai Adolescents
Authors: Nareemarn Neelapaichit, Sookfong Wongsathapat, Noppawan Piaseu
Abstract:
Body image plays an important role in adolescence. Thai adolescents are highly concerned about their body image, and dissatisfaction with their body shape can lead to inappropriate weight management methods. This study examined the body image perception and nutritional status of Thai adolescents. Body mass index screening was done on 181 nursing students of Ramathibodi School of Nursing to categorize respondents as obese, overweight, normal weight, or underweight using the recommended body-mass index (BMI) cut-off points for Asian populations. A self-report questionnaire on demographics and body image perception was completed. Results showed that the respondents were mainly female (93.4%) and their mean age was 19.2 years. The prevalence of obesity, overweight, normal weight, and underweight among the nursing students was 5.5%, 7.2%, 55.2%, and 32.0%, respectively. Of all the respondents, 57.5% correctly perceived their weight status, with 37.0% overestimating and 5.5% underestimating it. Of those in the obese category, 20.0% correctly perceived themselves and 80.0% perceived themselves as overweight. In the overweight category, all respondents correctly perceived themselves. Fifty-two percent of the normal weight respondents perceived themselves as overweight and 2.0% perceived themselves as obese. Of the underweight respondents, 77.6% correctly perceived themselves and 20.7% perceived themselves as normal weight. These findings show a high occurrence of body image misperception among Thai adolescents. Addressing this situation can help adolescents maintain a healthy weight and practice appropriate weight management methods.
Keywords: nutritional status, body image perception, Thai adolescents, body-mass index (BMI)
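As a minimal illustration of the screening step, the BMI categorization described above can be sketched as follows. The abstract does not list its exact cut-off values, so the commonly cited WHO-proposed cut-off points for Asian populations are assumed here (<18.5 underweight, 18.5–22.9 normal, 23.0–24.9 overweight, ≥25 obese):

```python
def bmi_category(weight_kg, height_m):
    """Classify nutritional status from BMI using assumed WHO cut-off
    points for Asian populations: <18.5 underweight, 18.5-22.9 normal,
    23.0-24.9 overweight, >=25 obese."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    elif bmi < 23.0:
        return "normal"
    elif bmi < 25.0:
        return "overweight"
    return "obese"
```

With these cut-offs, a 50 kg, 1.60 m respondent (BMI ≈ 19.5) falls in the normal category.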
Procedia PDF Downloads 393
22701 Effective Corporate Image Management as a Strategy for Enhancing Profitability
Authors: Shola Haruna Adeosun, Ajoke F. Adebiyi
Abstract:
Business organizations in Nigeria have failed to realize the role of a good corporate image policy in business dealings, probably because they do not understand the concept of corporate image or the tools for promoting it. Corporate image goes beyond attractive products, quality services, advertising, or good salaries; it pervades every aspect of a business, from the personality of the least senior worker to dealings within the organization and with the larger society. Amid the societal dynamics brought by technology, especially in the business world, companies face such stiff competition that maintaining a competitive edge requires aggressive strategies. One such strategy for effective corporate image management is promotion. This study investigates strategies that could be deployed to build and promote an effective corporate image and enhance the profit margins of an organization, using Phinomar Nigeria Limited, Ngwo as a case study. The study reveals that Phinomar Nigeria Limited has a laid-down corporate image policy that is not effectively managed; that the strategies deployed to promote its corporate image are limited; and that responses to Phinomar products are fairly high. It therefore suggests that while the products are profitable, periodic improvement of employee welfare and the work environment is required, as well as an increase in the scope of Phinomar's social responsibility.
Keywords: corporate image, effective, enhancing, management, profitability, strategy
Procedia PDF Downloads 312
22700 Structural Analysis of Polymer Thin Films at Single Macromolecule Level
Authors: Hiroyuki Aoki, Toru Asada, Tomomi Tanii
Abstract:
The properties of a spin-cast film of a polymer material differ from those of the bulk material because the polymer chains are frozen in an unequilibrated state by the rapid evaporation of the solvent. However, there has been little information on the unequilibrated conformation and dynamics in a spin-cast film at the single chain level. Real-space observation of individual chains would provide direct information on the morphology and dynamics of single polymer chains. The recent development of super-resolution fluorescence microscopy methods allows the conformational analysis of single polymer chains. In the current study, the conformation of a polymer chain in a spin-cast film was examined by super-resolution microscopy. Poly(methyl methacrylate) (PMMA) with a molecular weight of 2.2 x 10^6 was spin-cast onto a glass substrate from toluene and chloroform. For super-resolution fluorescence imaging, a small amount of PMMA labeled with a rhodamine spiroamide dye was added. The radius of gyration (Rg) was evaluated from the super-resolution fluorescence image of each PMMA chain. The root-mean-square Rg was 48.7 and 54.0 nm in the spin-cast films prepared from the toluene and chloroform solutions, respectively. In contrast, the chain dimension in the bulk state (a thermally annealed 10-µm-thick sample) was observed to be 43.1 nm. This indicates that the PMMA chain in the spin-cast film takes an expanded conformation compared to the unperturbed chain and that the chain dimension depends on the solvent quality. In a good solvent, the PMMA chain takes an expanded conformation due to the excluded volume effect; rapid solvent evaporation freezes the chain before it can relax from this unequilibrated expanded conformation to the unperturbed one.
Keywords: chain conformation, polymer thin film, spin-coating, super-resolution optical microscopy
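The radius of gyration evaluated from a super-resolution image reduces, in its simplest form, to a statistic over the localized fluorophore coordinates of one chain. A minimal sketch of that computation (the authors' exact localization pipeline is not described in the abstract) is:

```python
import numpy as np

def radius_of_gyration(points):
    """Radius of gyration of a set of 2-D localization coordinates
    (e.g. in nm): Rg = sqrt(mean(|r_i - r_mean|^2))."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)                      # centre of mass
    return float(np.sqrt(((pts - centroid) ** 2).sum(axis=1).mean()))
```

For instance, four localizations at the corners of a 2 nm square give Rg = sqrt(2) nm.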
Procedia PDF Downloads 287
22699 Determination of Unknown Radionuclides Using High Purity Germanium Detectors
Authors: O. G. Onuk, L. S. Taura, C. M. Eze, S. M. Ngaram
Abstract:
The decay chains of radioactive elements in the laboratory and the natural radioactivity of the human body were investigated using a High Purity Germanium (HPGe) detector, whose properties were also examined. The efficiency and energy resolution of the HPGe detector used in the laboratory were found to be excellent. The detector was calibrated three times so as to cover a wider energy range, and its centroid C was found to have a linear relationship with the energies of the known gamma rays. Using the three calibrations, the energies of an unknown radionuclide were found to follow the decay chain of thorium-232 (232Th), and it was also found that an average adult has about 2.5 g of potassium-40 (40K) in the body.
Keywords: detector, efficiency, energy, radionuclides, resolution
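The linear centroid-to-energy relationship described above amounts to a least-squares line through calibration peaks of known energy. A sketch under assumed example data (the channel centroids below are illustrative, not the paper's measurements; the gamma energies are the well-known Cs-137 and Co-60 lines) is:

```python
import numpy as np

# Assumed calibration points: channel centroids of known gamma lines
centroids = np.array([1000.0, 1772.0, 2012.0])   # channel number (illustrative)
energies  = np.array([661.7, 1173.2, 1332.5])    # keV (Cs-137, Co-60, Co-60)

# Fit the linear relationship E = a*C + b between centroid and energy
a, b = np.polyfit(centroids, energies, 1)

def channel_to_energy(channel):
    """Convert a peak centroid (channel) to gamma energy (keV)."""
    return a * channel + b
```

An unknown peak's centroid can then be converted to energy and matched against the lines of a candidate decay chain such as 232Th.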
Procedia PDF Downloads 250
22698 Destination Image: A Case Study of International Tourists Who Revisit Thailand
Authors: Aticha Kwaengsopha, Kevin Wongleedee
Abstract:
Destination image can increase or decrease the growth rate of international tourist arrivals in Thailand. This paper drew upon data collected from an international tourist survey conducted in Bangkok, Thailand during January to March of 2014. The survey was structured primarily to obtain international tourists' opinions on the importance of the destination image factors they encountered during their trip in Thailand. A total of 200 responses were collected and analyzed using means, standard deviations, and t-tests. The findings revealed that the overall level of importance of these factors was not very high, and that the three most important factors were tourist experience, interesting places, and a pleasing destination. In addition, the t-test results revealed that gender differences had little effect on opinions concerning the importance of destination image factors.
Keywords: destination image, international tourists, Thailand, revisit
Procedia PDF Downloads 337
22697 Ending Communal Conflicts in Africa: The Relevance of Traditional Approaches to Conflict Resolution
Authors: Kindeye Fenta Mekonnen, Alagaw Ababu Kifle
Abstract:
The failure of international responses to armed conflict to address local preconditions for national stability has recently attracted what has been called the 'local turn' in peace building. This 'local turn' amplified a renewed interest in traditional/indigenous methods of conflict resolution, a field hitherto dominated by anthropologists and their focus on the procedures and rituals of such approaches. This notwithstanding, there is still limited empirical work on the relevance of traditional methods of conflict resolution for ending localized conflicts vis-à-vis hybrid and modern approaches. The few exceptions generally draw their conclusions from very few (almost all successful) cases, which makes it difficult to judge the validity and cross-case applicability of their results. This paper seeks to fill these gaps by undertaking a quantitative analysis of the trends and applications of different communal conflict resolution initiatives, their potential to usher in long-term peace, and the extent to which their outcomes are influenced by the intensity and scope of a conflict. The paper makes three tentative conclusions. First, traditional mechanisms and traditional actors still dominate the communal conflict resolution landscape, either individually or in combination with other methods. Second, traditional mechanisms of conflict resolution tend to be more successful in ending a conflict and preventing its recurrence than hybrid and modern arrangements. This notwithstanding, and probably due to the scholarly call for a local turn in peace building, contemporary communal conflict resolution approaches are becoming less and less reliant on traditional mechanisms alone and (therefore) less effective.
Third, there is as yet inconclusive evidence on whether hybridization is an asset or a liability in the resolution of communal conflicts and the extent to which this might be mediated by the intensity of a conflict.
Keywords: traditional conflict resolution, hybrid conflict resolution, communal conflict, relevance, conflict intensity
Procedia PDF Downloads 82
22696 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission
Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong
Abstract:
Medical images are an integral part of e-health care and e-diagnosis systems. Medical image watermarking is widely used to protect patients' information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients and primary and referring physicians. The images are highly prone to corruption in the wireless transmission medium due to various noises, deflections, and refractions. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address this issue, this paper incorporates an error correction code (ECC), the (8, 4) Hamming code, into an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and meet real-time requirements. Experimental results show that the GPU achieves considerable speedup over the sequential CPU implementation while maintaining 100% ECC efficiency.
Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU
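The (8, 4) Hamming code named above is the extended Hamming code: four data bits, three parity bits, plus one overall parity bit, correcting any single-bit error per codeword. A reference sketch of the encode/decode logic (a plain CPU illustration, not the paper's GPU kernel) is:

```python
def hamming84_encode(nibble):
    """Encode 4 data bits (int 0-15) into an 8-bit extended Hamming codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]   # data bits, LSB first
    p1 = d[0] ^ d[1] ^ d[3]                     # parity over positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]                     # parity over positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]                     # parity over positions 4,5,6,7
    bits = [p1, p2, d[0], p3, d[1], d[2], d[3]] # codeword positions 1..7
    p0 = 0
    for b in bits:                              # overall parity bit
        p0 ^= b
    bits.append(p0)
    code = 0
    for i, b in enumerate(bits):
        code |= b << i
    return code

def hamming84_decode(code):
    """Decode an 8-bit codeword, correcting any single-bit error.
    Returns the 4-bit data value."""
    bits = [(code >> i) & 1 for i in range(8)]
    s1 = bits[0] ^ bits[2] ^ bits[4] ^ bits[6]
    s2 = bits[1] ^ bits[2] ^ bits[5] ^ bits[6]
    s3 = bits[3] ^ bits[4] ^ bits[5] ^ bits[6]
    syndrome = s1 | (s2 << 1) | (s3 << 2)       # position of the flipped bit
    if syndrome:
        bits[syndrome - 1] ^= 1                 # correct it
    return bits[2] | (bits[4] << 1) | (bits[5] << 2) | (bits[6] << 3)
```

Each 8-bit pixel of a watermarked image would be split into two nibbles and encoded into two such codewords before transmission.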
Procedia PDF Downloads 290
22695 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm
Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin
Abstract:
Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as the detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. Its disadvantage is that the reconstructed image often has poor quality due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique in which the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper, the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by this method.
Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform
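The GS iteration described above can be sketched in a few lines with FFTs. This is a minimal illustration assuming a uniform amplitude in the hologram plane, as the abstract suggests; window sizes, padding, and quantisation of the kinoform are omitted:

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50, seed=0):
    """Gerchberg-Saxton iteration: find a phase-only hologram (kinoform)
    whose Fourier transform approximates the target amplitude. Uniform
    amplitude is assumed in the hologram plane."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        field = np.exp(1j * phase)                 # unit-amplitude kinoform
        recon = np.fft.fft2(field)                 # propagate to image plane
        # impose the target amplitude, keep the reconstructed phase
        field = target_amplitude * np.exp(1j * np.angle(recon))
        phase = np.angle(np.fft.ifft2(field))      # back-propagate, keep phase
    return phase
```

Iterating reduces the mismatch between the reconstructed and target amplitudes, which is exactly the reconstruction-quality improvement the paper quantifies.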
Procedia PDF Downloads 533
22694 Short Arc Technique for Baseline Determinations
Authors: Gamal F. Attia
Abstract:
The baselines are the distances and lengths of the chords between the projections of the laser stations' positions on the reference ellipsoid. For satellite geodesy, it is very important to determine the optimal length of the orbital arc along which laser measurements are to be carried out. For dynamical methods, long arcs (one month or more) are to be used; over such arcs, more errors arise from the modeling of different physical forces, such as the Earth's gravitational field, air drag, solar radiation pressure, and others, that may influence the accuracy of the estimation of the satellite's position, while at the same time the measurement errors can be almost completely excluded and high stability in the determination of the relative coordinate system can be achieved. It is possible to diminish the influence of modeling errors by using short arcs of the satellite orbit (several revolutions or days), but the station coordinates estimated from different arcs can differ from each other by a quantity larger than statistical zero. Under the semidynamical 'short arc' method, one or several passes of the satellite within the zone of simultaneous visibility from both ends of the chord are used, and the estimated parameter in this case is the length of the chord. The comparison of the same baselines calculated with the long and short arc methods shows good agreement and even speaks in favor of the latter. In this paper, the short arc technique is explained and three baselines are determined using it.
Keywords: baselines, short arc, dynamical, gravitational field
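Whatever arc length is used, the estimated baseline is ultimately a chord: the straight-line distance between two station positions. A minimal sketch of that geometry, assuming WGS-84 as the reference ellipsoid (the abstract does not name one), is:

```python
import math

# WGS-84 ellipsoid parameters (assumed; the paper's ellipsoid is unspecified)
A = 6378137.0                  # semi-major axis, m
F = 1 / 298.257223563          # flattening
E2 = F * (2 - F)               # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h=0.0):
    """Geodetic latitude/longitude (degrees) and height (m) to ECEF (m)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)   # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

def baseline_chord(station1, station2):
    """Straight-line chord length (m) between two station positions,
    each given as (lat_deg, lon_deg) on the ellipsoid."""
    return math.dist(geodetic_to_ecef(*station1), geodetic_to_ecef(*station2))
```

Two equatorial stations 90 degrees apart in longitude, for example, are separated by a chord of A·sqrt(2).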
Procedia PDF Downloads 463
22693 High-Resolution ECG Automated Analysis and Diagnosis
Authors: Ayad Dalloo, Sulaf Dalloo
Abstract:
Electrocardiogram (ECG) recordings are prone to complications on analysis by physicians due to noise and artifacts, creating ambiguity that can lead to diagnostic error. Such drawbacks may be overcome with high-resolution methods such as discrete wavelet analysis and digital signal processing (DSP) techniques. The ECG signal analysis is implemented in three stages: ECG preprocessing, feature extraction, and classification, with the aim of realizing high-resolution ECG diagnosis and improved detection of abnormal heart conditions. The preprocessing stage removes spurious artifacts (noise) due to factors such as muscle contraction, motion, and respiration. ECG features are extracted by applying DSP and the suggested sloping method. The measured features represent the peak amplitudes and intervals of the P, Q, R, S, R', and T waves on the ECG, together with features such as ST elevation, QRS width, heart rate, electrical axis, and QR and QT intervals. The classification is performed using these extracted features and the criteria for cardiovascular diseases. The ECG diagnostic system is successfully applied to 12-lead ECG recordings for 12 cases and is provided with information enabling it to diagnose 15 different diseases. The physician's and the computer's diagnoses agree in 90% of cases, with respect to the physician's diagnosis, and the time taken for diagnosis is 2 seconds. All of these operations are programmed in the Matlab environment.
Keywords: ECG diagnostic system, QRS detection, ECG baseline removal, cardiovascular diseases
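The heart-rate feature above rests on locating R peaks. A deliberately naive sketch of that step (illustrative only, not the authors' Matlab pipeline; the threshold ratio and refractory period are assumptions) is:

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_ratio=0.6, refractory_s=0.2):
    """Naive R-peak detector: keep local maxima above a fraction of the
    global maximum, enforcing a refractory period between peaks."""
    threshold = threshold_ratio * np.max(ecg)
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if (ecg[i] >= threshold and ecg[i] >= ecg[i - 1]
                and ecg[i] >= ecg[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    return peaks

def heart_rate_bpm(peaks, fs):
    """Mean heart rate from R-peak sample indices."""
    rr = np.diff(peaks) / fs          # R-R intervals in seconds
    return 60.0 / rr.mean()
```

On a signal with R peaks one second apart, this reports 60 bpm.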
Procedia PDF Downloads 297
22692 Image Classification with Localization Using Convolutional Neural Networks
Authors: Bhuyain Mobarok Hossain
Abstract:
Image classification and localization research is currently an important strategy in the field of computer vision. The evolution and advancement of deep learning and convolutional neural networks (CNNs) have greatly improved the capabilities of object detection and image-based classification. Target detection is important to research in the field of computer vision, especially in video surveillance systems. To solve this problem, we apply a convolutional neural network at multiple scales and multiple locations in the image using a sliding window. Most detection networks regress a bounding box around the area of interest; in contrast to such architectures, we treat the problem as a classification problem in which each pixel of the image is a separate section. Image classification is the task of predicting a single category for an image or a group of data points, including assigning labels throughout the image. An image can be classified as a day or night shot, or, likewise, images of cars and motorbikes can be automatically placed in their respective collections. Deep learning for image classification generally relies on convolutional layers, and a network built from them is referred to as a convolutional neural network (CNN).
Keywords: image classification, object detection, localization, particle filter
Procedia PDF Downloads 305
22691 Spatial Object-Oriented Template Matching Algorithm Using Normalized Cross-Correlation Criterion for Tracking Aerial Image Scene
Authors: Jigg Pelayo, Ricardo Villar
Abstract:
Building on the development of aerial laser scanning in the Philippine geospatial industry, research on remote sensing and machine vision technology has become a trend. Object detection via template matching is one application, characterized as fast and real-time. This paper provides an application of a robust pattern matching algorithm based on the normalized cross-correlation (NCC) criterion function within object-based image analysis (OBIA), utilizing high-resolution aerial imagery and low-density LiDAR data. The height information from laser scanning provides an effective partitioning order, improving the hierarchical class feature pattern and allowing unnecessary calculations to be skipped. Since detection is executed in the object-oriented platform, mathematical morphology and multi-level filter algorithms were established to effectively avoid the influence of noise, small distortions, and fluctuating image saturation that affect the recognition rate of features. Furthermore, the scheme is evaluated to assess its performance in different situations and to inspect the computational complexity of the algorithms. Its effectiveness is demonstrated in areas of Misamis Oriental province, achieving an overall accuracy above 91%. The results also portray the potential and efficiency of the implemented algorithm under different lighting conditions.
Keywords: algorithm, LiDAR, object recognition, OBIA
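The NCC criterion at the core of the method above can be sketched directly: subtract the means, correlate, and normalize by the energies. This is a brute-force reference version for small gray-scale arrays, not the paper's optimized OBIA pipeline:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between an image patch and a template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return 0.0 if denom == 0 else float((p * t).sum() / denom)

def match_template(image, template):
    """Slide the template over the image; return the (row, col) of the
    best NCC score together with that score."""
    th, tw = template.shape
    best, best_pos = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            score = ncc(image[r:r + th, c:c + tw], template)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best
```

An exact copy of the template embedded in the image scores 1.0 at its true location, which is what makes NCC robust to uniform brightness changes.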
Procedia PDF Downloads 244
22690 Hyperspectral Image Classification Using Tree Search Algorithm
Authors: Shreya Pare, Parvin Akhter
Abstract:
Remotely sensed image classification is a very challenging task owing to the high dimensionality of hyperspectral images. Pixel-wise classification methods fail to exploit the spatial structure information of an image; therefore, integrating spatial information into the classification process can improve performance. In this paper, a multilevel thresholding algorithm based on a modified fuzzy entropy (MFE) function is used to segment hyperspectral images. The fuzzy parameters of the MFE function are optimized using a new meta-heuristic based on the tree search algorithm. The segmented image is classified by a large margin distribution machine (LDM) classifier. Experimental results are shown on a hyperspectral image dataset and indicate that the proposed technique (MFE-TSA-LDM) achieves much higher classification accuracy for hyperspectral images than state-of-the-art classification techniques. The proposed algorithm provides accurate segmentation and classification maps, making it suitable for the classification of images with large spatial structures.
Keywords: classification, hyperspectral images, large margin distribution machine, modified fuzzy entropy function, multilevel thresholding, tree search algorithm
Procedia PDF Downloads 177
22689 City Image of Rio De Janeiro as the Host City of 2016 Olympic Games
Authors: Luciana Brandao Ferreira, Janaina de Moura Engracia Giraldi, Fabiana Gondim Mariutti, Marina Toledo de Arruda Lourencao
Abstract:
Developing countries, such as the BRICS (Brazil, Russia, India, China, and South Africa), host sports mega-events to promote socio-economic development and enhance their image. This paper aims to verify the image of Rio de Janeiro, Brazil, as the host city of the 2016 Olympic Games, considering the main cognitive and affective image dimensions. The research design uses exploratory factor analysis to find the most important factors in the city image dimensions. The data were collected by structured questionnaires from a sample of international respondents (n=274) with high international travel experience. The results show that Rio's image as a sports mega-event host city has two main factors in each dimension: cognitive ('General Infrastructure'; 'Services and Attractions') and affective ('Positive Feelings'; 'Negative Feelings'). The most important cognitive factor was 'Services and Attractions', which is closely related to tourism activities. In the affective dimension, 'Positive Feelings' was the most important factor, a good result considering that Rio is a city in an emerging country with many unmet social demands.
Keywords: Rio de Janeiro, 2016 olympic games, host city image, cognitive image dimension, affective image dimension
Procedia PDF Downloads 147
22688 Digital Image Forensics: Discovering the History of Digital Images
Authors: Gurinder Singh, Kulbir Singh
Abstract:
Digital multimedia contents such as images, video, and audio can be tampered with easily due to the availability of powerful editing software. Multimedia forensics is devoted to analyzing these contents using various digital forensic techniques in order to validate their authenticity. Digital image forensics investigates the reliability of digital images by analyzing the integrity of the data and by reconstructing the historical information of an image related to its acquisition phase. In this paper, a survey of forgery detection is carried out, considering the most recent and promising digital image forensic techniques.
Keywords: computer forensics, multimedia forensics, image ballistics, camera source identification, forgery detection
Procedia PDF Downloads 246
22687 Quantitative Risk Analysis for Major Subsystems and Project Success of a High Throughput Satellite
Authors: Ibrahim Isa Ali (Pantami), Abdu Jaafaru Bambale, Abimbola Alale, Danjuma Ibrahim Ndihgihdah, Muhammad Alkali, Adamu Idris Umar, Babadoko Dantala Mohammed, Moshood Kareem Olawole
Abstract:
This paper addresses the risk management required for a high throughput satellite (HTS) project and the major subsystems that pertain to the improved performance and reliability of the spacecraft. The paper gives a clear picture of high-throughput satellites and the associated technologies, showing how their performance aligns with and differs from traditional geostationary orbit, or geosynchronous equatorial orbit (GEO), communication satellites. The paper also highlights critical subsystems and processes in project conceptualization and execution and discusses the configuration of the payload. Optimizing resources for the HTS project and successfully integrating its critical subsystems require risk analysis and mitigation to be implemented from the preliminary design stage through the assembly, integration, and test (AIT), launch, and in-orbit management stages.
Keywords: AIT, HTS, in-orbit management, optimization
Procedia PDF Downloads 103
22686 Participatory Cartography for Disaster Reduction in Progreso, Yucatan, Mexico
Authors: Gustavo Cruz-Bello
Abstract:
Progreso is a coastal community in Yucatan, Mexico, highly exposed to floods produced by severe storms and tropical cyclones. A participatory cartography approach was conducted to help reduce flood disasters and assess social vulnerability within the community. The first step was to engage local authorities in risk management to facilitate the process. Two workshops were conducted. In the first, a poster-size printed high-spatial-resolution satellite image of the town was used to gather information from the participants: eight women and seven men, among them construction workers, students, government employees, and fishermen, with ages ranging between 23 and 58 years. For the first task, participants were asked to locate emblematic places on the image to familiarize themselves with it. They were then asked to locate areas that flood and the buildings they use as refuges, to list the actions they usually take to reduce vulnerability, and to collectively propose others that might reduce disasters. The spatial information generated at the workshops was digitized and integrated into a GIS environment. A printed version of the map was reviewed by local risk management experts, who validated the feasibility of the proposed actions. In the second workshop, the information was brought back to the community for feedback. Additionally, a survey was applied in one household per block in the community to obtain socioeconomic, prevention, and adaptation data. The information generated from the workshops was contrasted, through t and chi-squared tests, with the survey data in order to test the hypothesis that poorer or less educated people are less prepared to face floods (more vulnerable) and live near or among areas with a higher presence of floods. The results showed that a great majority of people in the community are aware of the hazard and are prepared to face it.
However, there was no consistent relationship between regularly flooded areas and people's average years of education, household services, or house modifications against heavy rains. We can say that the participatory cartography intervention made participants aware of their vulnerability and made them collectively reflect on actions that can reduce flood disasters. Participants also considered that the final map could be used as a communication and negotiation instrument with NGOs and government authorities. It was not found that poorer and less educated people are located in areas with a higher presence of floods.
Keywords: climate change, floods, Mexico, participatory mapping, social vulnerability
Procedia PDF Downloads 113
22685 Gray Level Image Encryption
Authors: Roza Afarin, Saeed Mozaffari
Abstract:
The aim of this paper is image encryption using a Genetic Algorithm (GA). The proposed encryption method consists of two phases. In the modification phase, pixel locations are altered to reduce the correlation among adjacent pixels. Then, pixel values are changed in the diffusion phase to encrypt the input image. Both phases are performed by a GA with binary chromosomes. For the modification phase, the binary patterns are generated by the Local Binary Pattern (LBP) operator, while for the diffusion phase the binary chromosomes are obtained by Bit Plane Slicing (BPS). The initial population in the GA includes the rows and columns of the input image. Instead of subjective selection of parents from this initial population, a random generator with a predefined key is utilized, since it is necessary to decrypt the coded image and reconstruct the initial input image. The fitness function is defined as the average number of transitions from 0 to 1 in the LBP image for the modification phase and as histogram uniformity for the diffusion phase. The randomness of the encrypted image is measured by entropy, correlation coefficients, and histogram analysis. Experimental results show that the proposed method is fast enough and can be used effectively for image encryption.
Keywords: correlation coefficients, genetic algorithm, image encryption, image entropy
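The LBP operator used to generate the binary patterns above compares each pixel's eight neighbours against the centre and packs the results into a byte. A straightforward reference version (borders skipped; the authors' exact neighbour ordering is an assumption here) is:

```python
import numpy as np

def lbp_image(img):
    """8-neighbour Local Binary Pattern of a gray-scale image. Each interior
    pixel becomes a byte whose bits record which neighbours are >= the
    centre value; border pixels are skipped."""
    img = np.asarray(img, dtype=int)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # neighbour offsets, clockwise from top-left (assumed ordering)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            centre, code = img[r, c], 0
            for bit, (dr, dc) in enumerate(offsets):
                if img[r + dr, c + dc] >= centre:
                    code |= 1 << bit
            out[r - 1, c - 1] = code
    return out
```

A centre pixel darker than all its neighbours maps to 255, and one brighter than all of them maps to 0; transitions in this pattern drive the modification-phase fitness function.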
Procedia PDF Downloads 330
22684 Data Hiding in Gray Image Using ASCII Value and Scanning Technique
Authors: R. K. Pateriya, Jyoti Bharti
Abstract:
This paper presents a data hiding method that provides secret communication between a sender and a receiver. The data is hidden in gray-scale images, and the boundary of the gray-scale image is used to store the mapping information. In this approach, the data is in ASCII format and the mapping is between the ASCII values of the hidden message and the pixel values of the cover image; since both the pixel values of an image and ASCII values lie in the range 0 to 255, the mapping information occupies only 1 bit per character of the hidden message, as compared to 8 bits per character, thus maintaining good quality of the stego image.
Keywords: ASCII value, cover image, PSNR, pixel value, stego image, secret message
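The core mapping idea can be sketched as follows: each message character is paired with a cover pixel whose gray value equals its ASCII code, and only the pixel indices are stored as mapping information. This is a simplified illustration, not the authors' exact scanning or boundary-embedding scheme:

```python
import numpy as np

def hide_message(cover, message):
    """For each character, record the flat index of a cover-image pixel whose
    gray value equals the character's ASCII code. The cover image itself is
    unchanged; the index list is the mapping information (which the paper
    stores along the image boundary)."""
    flat = cover.ravel()
    used, mapping = set(), []
    for ch in message:
        candidates = np.flatnonzero(flat == ord(ch))
        idx = next(i for i in candidates if i not in used)  # first unused match
        used.add(int(idx))
        mapping.append(int(idx))
    return mapping

def extract_message(cover, mapping):
    """Recover the message from the cover image and the stored mapping."""
    flat = cover.ravel()
    return "".join(chr(int(flat[i])) for i in mapping)
```

Because the pixel values are never modified, the stego image quality (PSNR) is preserved; the scheme relies on the cover containing at least one pixel of each required gray value.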
Procedia PDF Downloads 413
22683 In-Flight Radiometric Performances Analysis of an Airborne Optical Payload
Authors: Caixia Gao, Chuanrong Li, Lingli Tang, Lingling Ma, Yaokai Liu, Xinhong Wang, Yongsheng Zhou
Abstract:
Performances analysis of remote sensing sensor is required to pursue a range of scientific research and application objectives. Laboratory analysis of any remote sensing instrument is essential, but not sufficient to establish a valid inflight one. In this study, with the aid of the in situ measurements and corresponding image of three-gray scale permanent artificial target, the in-flight radiometric performances analyses (in-flight radiometric calibration, dynamic range and response linearity, signal-noise-ratio (SNR), radiometric resolution) of self-developed short-wave infrared (SWIR) camera are performed. To acquire the inflight calibration coefficients of the SWIR camera, the at-sensor radiances (Li) for the artificial targets are firstly simulated with in situ measurements (atmosphere parameter and spectral reflectance of the target) and viewing geometries using MODTRAN model. With these radiances and the corresponding digital numbers (DN) in the image, a straight line with a formulation of L = G × DN + B is fitted by a minimization regression method, and the fitted coefficients, G and B, are inflight calibration coefficients. And then the high point (LH) and the low point (LL) of dynamic range can be described as LH= (G × DNH + B) and LL= B, respectively, where DNH is equal to 2n − 1 (n is the quantization number of the payload). Meanwhile, the sensor’s response linearity (δ) is described as the correlation coefficient of the regressed line. The results show that the calibration coefficients (G and B) are 0.0083 W·sr−1m−2µm−1 and −3.5 W·sr−1m−2µm−1; the low point of dynamic range is −3.5 W·sr−1m−2µm−1 and the high point is 30.5 W·sr−1m−2µm−1; the response linearity is approximately 99%. Furthermore, a SNR normalization method is used to assess the sensor’s SNR, and the normalized SNR is about 59.6 when the mean value of radiance is equal to 11.0 W·sr−1m−2µm−1; subsequently, the radiometric resolution is calculated about 0.1845 W•sr-1m-2μm-1. 
Moreover, in order to validate the result, a comparison of the measured radiances with radiative-transfer-code predictions over four portable artificial targets with reflectances of 20%, 30%, 40%, and 50%, respectively, is performed. It is noted that the relative error of the calibration is within 6.6%.
Keywords: calibration and validation site, SWIR camera, in-flight radiometric calibration, dynamic range, response linearity
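The calibration and dynamic-range steps described above can be sketched as follows. The DN/radiance pairs and the 12-bit quantization depth are illustrative assumptions, not the study's measurements:

```python
import numpy as np

# Hypothetical DN / simulated-radiance pairs for three gray-scale targets
dn = np.array([410.0, 1650.0, 3220.0])
radiance = np.array([0.0, 10.2, 23.3])  # W sr^-1 m^-2 um^-1

# Least-squares fit of L = G * DN + B
G, B = np.polyfit(dn, radiance, 1)

# Dynamic range: low point is B (DN = 0), high point uses DN_H = 2^n - 1
n_bits = 12                     # assumed quantization depth
dn_max = 2 ** n_bits - 1
L_low, L_high = B, G * dn_max + B

# Response linearity as the correlation coefficient of the regressed line
linearity = np.corrcoef(dn, radiance)[0, 1]
```

With these illustrative inputs, the fitted G and B come out close to the values reported in the abstract, which is the intent of the sketch rather than a reproduction of the study.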
Procedia PDF Downloads 270
22682 Image Segmentation Techniques: Review
Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo
Abstract:
Image segmentation is the process of dividing an image into several sections, such as the background and the foreground of an object. It is a critical technique in both image-processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research has been devoted to color images. Most segmentation algorithms and techniques vary based on the input data and the application, and nearly all of them are unsuitable for noisy environments. Much of the work that has been done uses the Markov Random Field (MRF), which is computationally demanding but is said to be robust to noise. In recent years, image segmentation has been applied to problems such as simplifying the processing of an image, interpreting the contents of an image, and easing the analysis of an image. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed over the years, including convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, and thresholding. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed, along with the applications of image segmentation and potential future developments. This review concludes that no single technique is perfectly suitable for segmenting all types of images, but that hybrid techniques yield more accurate and efficient results.
Keywords: clustering-based, convolution-network, edge-based, region-growing
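As a minimal concrete example of one reviewed family, thresholding, the following sketch implements Otsu's method on a synthetic grayscale image. It is an illustration, not code from any surveyed paper:

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Return the threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(image, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    levels = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = 0.0, -1.0
    for t in range(1, bins):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (p[:t] * levels[:t]).sum() / w0   # mean of the background class
        mu1 = (p[t:] * levels[t:]).sum() / w1   # mean of the foreground class
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, levels[t]
    return best_t

# Synthetic image: dark background (~40) with a bright foreground square (~200)
rng = np.random.default_rng(0)
img = rng.normal(40, 5, (64, 64))
img[20:40, 20:40] = rng.normal(200, 5, (20, 20))
t = otsu_threshold(img)
mask = img > t          # foreground/background segmentation mask
```

On well-separated modes like these, the threshold lands between the two clusters, so the mask recovers the foreground square exactly.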
Procedia PDF Downloads 95
22681 Understanding the Classification of Rain Microstructure and Estimation of Z-R Relationship using a Micro Rain Radar in Tropical Region
Authors: Tomiwa, Akinyemi Clement
Abstract:
Tropical regions experience diverse and complex precipitation patterns, posing significant challenges for accurate rainfall estimation and forecasting. This study addresses the problem of effectively classifying tropical rain types and refining the Z-R (Reflectivity-Rain Rate) relationship to enhance rainfall estimation accuracy. Through a combination of remote sensing, meteorological analysis, and machine learning, the research aims to develop an advanced classification framework capable of distinguishing between different types of tropical rain based on their unique characteristics. This involves utilizing high-resolution satellite imagery, radar data, and atmospheric parameters to categorize precipitation events into distinct classes, providing a comprehensive understanding of tropical rain systems. Additionally, the study seeks to improve the Z-R relationship, a crucial aspect of rainfall estimation. One year of rainfall data was analyzed using a Micro Rain Radar (MRR) located at the Federal University of Technology, Akure, Nigeria, measuring rainfall parameters from ground level to a height of 4.8 km with a vertical resolution of 0.16 km. Rain rates were classified into low (stratiform) and high (convective) based on various microstructural attributes such as rain rates, liquid water content, Drop Size Distribution (DSD), average fall speed of the drops, and radar reflectivity. By integrating diverse datasets and employing advanced statistical techniques, the study aims to enhance the precision of Z-R models, offering a more reliable means of estimating rainfall rates from radar reflectivity data. This refined Z-R relationship holds significant potential for improving our understanding of tropical rain systems and enhancing forecasting accuracy in regions prone to heavy precipitation.
Keywords: remote sensing, precipitation, drop size distribution, micro rain radar
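The Z-R refinement described above typically amounts to fitting a power law Z = a·R^b in log-log space. The sketch below does this on synthetic data generated from a Marshall-Palmer-like relation (a = 200, b = 1.6) with noise, not the study's MRR measurements, and the 10 mm/h convective threshold is an assumed illustration:

```python
import numpy as np

# Synthetic rain rates (mm/h) and reflectivities (mm^6 m^-3), not MRR data
rng = np.random.default_rng(42)
R = np.sort(rng.uniform(0.5, 50, 200))
Z = 200 * R ** 1.6 * rng.lognormal(0, 0.1, R.size)

# Classify convective vs stratiform by an assumed rain-rate threshold
convective = R >= 10.0

# Fit log10(Z) = log10(a) + b * log10(R) by least squares
b, log_a = np.polyfit(np.log10(R), np.log10(Z), 1)
a = 10 ** log_a
```

The fitted a and b recover the generating coefficients to within the noise level, which is exactly what a Z-R calibration against disdrometer or MRR data aims to do.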
Procedia PDF Downloads 33
22680 Tourism Satellite Account: Approach and Information System Development
Authors: Pappas Theodoros, Mihail Diakomihalis
Abstract:
Measuring the economic impact of tourism in a benchmark economy is a global concern, and previous measurements have been partial and not fully integrated. Tourism is a phenomenon driven by the individual consumption of visitors, which should be observed and measured to reveal the overall contribution of tourism to an economy. The Tourism Satellite Account (TSA) is a critical tool for assessing the annual growth of tourism, providing reliable measurements. This article introduces a TSA information system that encompasses all stages of TSA work, including the input, storage, management, and analysis of data, as well as additional future functions, and that enhances the efficiency of tourism data management and the utility of TSA compilation. The methodology and results presented offer insights into the development and implementation of the TSA.
Keywords: tourism satellite account, information system, data-based tourist account, relation database
Procedia PDF Downloads 84
22679 Empirical Prediction of the Effect of Rain Drops on DBS System Operating in Ku-Band (Case Study of Abuja)
Authors: Tonga Agadi Danladi, Ajao Wasiu Bamidele, Terdue Dyeko
Abstract:
Recent advancements in microwave communications technologies, especially in telecommunications and broadcasting, have resulted in congestion at frequencies below 10 GHz. This has forced microwave designers to look to higher frequencies. Unfortunately, at frequencies above 10 GHz, rain becomes one of the main causes of attenuation in signal strength. From 10 GHz upwards, raindrops lead to outages that compromise availability and quality of service, making rain a critical factor in satellite link budget design. Rain rate and rain attenuation predictions are vital steps to consider when designing a microwave satellite communication link operating at Ku-band frequencies (12-18 GHz). Unreliable rain rate data for tropical regions of the world, such as Nigeria, from the Radiocommunication Sector of the International Telecommunication Union (ITU-R) makes it difficult for microwave engineers to determine a realistic rain margin to be accommodated in satellite link budget design for such regions. This work presents an empirical tool for predicting the signal attenuation due to rain on DBS signals operating in the Ku-band.
Keywords: attenuation, Ku-Band, microwave communication, rain rates
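A rain-fade margin of the kind discussed above is commonly built from the ITU-R power-law model for specific attenuation, γ = k·R^α (dB/km). The k and α values below are representative round numbers for the Ku-band, not the exact ITU-R P.838 table entries, and the path length is illustrative:

```python
def specific_attenuation(rain_rate_mm_h, k=0.0188, alpha=1.217):
    """Specific rain attenuation in dB/km for a given rain rate (mm/h).

    k and alpha are assumed, frequency/polarization dependent coefficients;
    the exact values come from the ITU-R P.838 tables."""
    return k * rain_rate_mm_h ** alpha

def path_attenuation(rain_rate_mm_h, effective_path_km, k=0.0188, alpha=1.217):
    """Total rain attenuation (dB) over an effective slant-path length."""
    return specific_attenuation(rain_rate_mm_h, k, alpha) * effective_path_km

# Example: 0.01%-exceeded rain rate of 100 mm/h over a 5 km effective path
margin_db = path_attenuation(100.0, 5.0)
```

The resulting margin_db is the order of magnitude of rain margin that tropical Ku-band link budgets must accommodate, which is why locally measured rain rates matter.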
Procedia PDF Downloads 484
22678 Estimating Leaf Area and Biomass of Wheat Using UAS Multispectral Remote Sensing
Authors: Jackson Parker Galvan, Wenxuan Guo
Abstract:
Unmanned aerial vehicle (UAV) technology is being increasingly adopted in high-throughput plant phenotyping for applications in plant breeding and precision agriculture. Winter wheat is an important cover crop for reducing soil erosion and protecting the environment in the Southern High Plains. Efficiently quantifying plant leaf area and biomass provides critical information for producers practicing site-specific management of crop inputs, such as water and fertilizers. The objective of this study was to estimate wheat biomass and leaf area index using UAV images. The study was conducted in an irrigated field in Garza County, Texas. High-resolution images were acquired on three dates (February 18, March 25, and May 15) using a multispectral sensor onboard a Matrice 600 UAV. On each date of image acquisition, 10 random plant samples were collected and measured for biomass and leaf area. Images were stitched using Pix4D, and ArcGIS was used to overlay the sampling locations and derive image data at those locations.
Keywords: precision agriculture, UAV plant phenotyping, biomass, leaf area index, winter wheat, southern high plains
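Before regressing biomass or leaf area index on imagery, a vegetation index is typically derived from the multispectral bands. The sketch below computes NDVI on a tiny synthetic raster; the reflectance values are illustrative, not the Garza County data:

```python
import numpy as np

# Illustrative red and near-infrared reflectance rasters (2x2 pixels)
red = np.array([[0.08, 0.10], [0.40, 0.09]])   # red band
nir = np.array([[0.45, 0.50], [0.42, 0.47]])   # near-infrared band

# NDVI = (NIR - Red) / (NIR + Red); dense green canopy pushes NDVI toward 1,
# bare soil sits near 0
ndvi = (nir - red) / (nir + red)
```

In a workflow like the one described, per-plot NDVI values extracted at the sampling locations would then be paired with the measured biomass and leaf area for regression.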
Procedia PDF Downloads 95
22677 A General Framework for Knowledge Discovery from Echocardiographic and Natural Images
Authors: S. Nandagopalan, N. Pradeep
Abstract:
The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, as well as image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.
Keywords: active contour, Bayesian, echocardiographic image, feature vector
Procedia PDF Downloads 445
22676 Impact of Data and Model Choices to Urban Flood Risk Assessments
Authors: Abhishek Saha, Serene Tay, Gerard Pijcke
Abstract:
The availability of high-resolution topography and rainfall information in urban areas has made it necessary to revise the modeling approaches used for flood risk assessments. Lidar-derived elevation models with resolutions of 1 m or finer are becoming widely accessible. Classical 1D-2D approaches, in which channel flow is simulated and coupled with a coarse-resolution 2D overland flow model, may not fully utilize the information provided by high-resolution data. In this context, a study was undertaken to compare three different modeling approaches for simulating flooding in an urban area. The first, base model is Sobek, which uses a 1D formulation together with hydrologic boundary conditions and couples it with a 2D overland flow model. The second model applies a full 2D shallow water model over the entire area at the resolution of the digital elevation model (DEM). These models are compared against another 2D shallow water equation solver, which uses a subgrid method for grid refinement. The models are run for DEM horizontal resolutions varying between 1 m and 5 m. The results show significant differences in inundation extents and water levels for different DEMs. They are also sensitive to the choice of numerical model even with the same physical parameters, such as friction. The study shows the importance of having reliable field observations of inundation extents and levels before a choice of model and data can be made for spatial flood risk assessments.
Keywords: flooding, DEM, shallow water equations, subgrid
Procedia PDF Downloads 141
22675 Development of the Analysis and Pretreatment of Brown HT in Foods
Authors: Hee-Jae Suh, Mi-Na Hong, Min-Ji Kim, Yeon-Seong Jeong, Ok-Hwan Lee, Jae-Wook Shin, Hyang-Sook Chun, Chan Lee
Abstract:
Brown HT is a bis-azo dye that is permitted in the EU as a food colorant. So far, many studies have focused on HPLC analysis with diode array detection (DAD) for this food colorant, using different columns and mobile phases. Even though these methods make it possible to detect Brown HT, low recovery, reproducibility, and linearity are still the major limitations for application in foods. The purpose of this study was to compare various methods for the analysis of Brown HT and to develop an improved analytical method, including pretreatment. Among the tested methods, the best resolution of Brown HT was observed with the following eluent: solvent A of the mobile phase was 0.575 g NH4H2PO4 and 0.7 g Na2HPO4 in 500 mL of water mixed with 500 mL of methanol, with the pH adjusted to 6.9 using phosphoric acid, and solvent B was methanol. The major peak for Brown HT appeared at the end of the separation, 13.4 min after injection. This method exhibited relatively high recovery and reproducibility compared with the other methods. The LOD (0.284 ppm), LOQ (0.861 ppm), resolution (6.143), and selectivity (1.3) of this method were better than those of the ammonium acetate solution method, which has been the most frequently used. Precision and accuracy were verified through inter-day and intra-day tests. Various sample pretreatment methods were developed for different foods, and relatively high recoveries over 80% were observed in all cases. This method exhibited high resolution and reproducibility for Brown HT compared with other previously reported official methods from the FSA and EU regulations.
Keywords: analytic method, Brown HT, food colorants, pretreatment method
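Figures such as an LOD of 0.284 ppm and an LOQ of 0.861 ppm are conventionally estimated from the calibration curve as LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope and σ the residual standard deviation. The sketch below shows the computation on illustrative concentration/peak-area pairs, not the Brown HT calibration data:

```python
import numpy as np

# Illustrative calibration data: concentration (ppm) vs chromatographic peak area
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
area = np.array([52.0, 101.0, 205.0, 498.0, 1003.0])

# Linear calibration fit: area = S * conc + intercept
S, intercept = np.polyfit(conc, area, 1)
residuals = area - (S * conc + intercept)
sigma = residuals.std(ddof=2)          # residual standard deviation of the fit

lod = 3.3 * sigma / S                  # limit of detection
loq = 10.0 * sigma / S                 # limit of quantification
```

By construction LOQ is about three times LOD (10/3.3), which matches the ratio seen in the reported Brown HT figures.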
Procedia PDF Downloads 478
22674 A Single Feature Probability-Object Based Image Analysis for Assessing Urban Landcover Change: A Case Study of Muscat Governorate in Oman
Authors: Salim H. Al Salmani, Kevin Tansey, Mohammed S. Ozigis
Abstract:
The study of the growth of built-up areas and settlement expansion is a major exercise that city managers undertake to establish previous and current development trends. This is to ensure that settlement expansion is matched with appropriate levels of services and infrastructure. This research aims to demonstrate the potential of satellite image processing, harnessing the single feature probability-object based image analysis (SFP-OBIA) technique to assess the urban growth dynamics of the Muscat Governorate in Oman for 1990, 2002, and 2013. This need is fueled by the continuous expansion of the Muscat Governorate beyond predicted levels of infrastructural provision. Landsat images for 1990, 2002, and 2013 were downloaded and preprocessed to ensure appropriate radiometric and geometric standards. A novel approach of probability filtering of the target feature segment was implemented to derive the spatial extent of the final built-up area of the Muscat Governorate for the three years. This proved to be a useful technique, as accuracy assessment results of 55%, 70%, and 71% were recorded for the urban land cover of 1990, 2002, and 2013, respectively. Furthermore, the Normalized Difference Built-Up Index (NDBI) was derived for the various images and used to consolidate the results of the SFP-OBIA through a linear regression model and visual comparison. The results show various hotspots where urbanization has taken place sporadically. Specifically, settlement in the districts (wilayat) of Al-Amarat, Muscat, and Qurayyat experienced tremendous change between 1990 and 2002, while the districts of Al-Seeb, Bawshar, and Muttrah experienced more sporadic changes between 2002 and 2013.
Keywords: urban growth, single feature probability, object based image analysis, landcover change
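The built-up index used to consolidate results of this kind is computed as NDBI = (SWIR − NIR)/(SWIR + NIR), exploiting the fact that built surfaces reflect more in the shortwave infrared than in the near infrared. The sketch below applies it to illustrative reflectance values, not Landsat pixels from Muscat:

```python
import numpy as np

# Illustrative reflectance rasters: column 0 is built-up, column 1 is vegetated
swir = np.array([[0.35, 0.12], [0.33, 0.10]])   # shortwave-infrared band
nir = np.array([[0.20, 0.40], [0.22, 0.45]])    # near-infrared band

# NDBI is positive over built-up surfaces and negative over vegetation
ndbi = (swir - nir) / (swir + nir)
built_up = ndbi > 0          # simple built-up mask
```

An NDBI raster like this can then be regressed against, or visually compared with, a classified built-up extent, as the abstract describes.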
Procedia PDF Downloads 274
22673 Seismic Inversion to Improve the Reservoir Characterization: Case Study in Central Blue Nile Basin, Sudan
Authors: Safwat E. Musa, Nuha E. Mohamed, Nuha A. Bagi
Abstract:
In this study, several crossplots of P-impedance against lithology logs (gamma ray, neutron porosity, deep resistivity, water saturation, and Vp/Vs curves) were made in three available wells drilled in the central part of the Blue Nile basin at depths varying from 1460 m to 1600 m. These crossplots successfully discriminated between sand and shale using P-impedance values, and between wet sand and pay sand using P-impedance and Vp/Vs together. Some impedance sections were also converted to porosity sections using a linear formula to characterize the reservoir in terms of porosity. The crossplots were created at log resolution, whereas seismic resolution can identify only the reservoir; if 3D seismic angle stacks were available, it would be easier to identify the pay sand with great confidence through high-resolution seismic inversion and a geostatistical approach using P-impedance and Vp/Vs volumes.
Keywords: basin, Blue Nile, inversion, seismic
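Converting an impedance section to porosity with a linear formula, as described above, can be sketched as follows. The well-log pairs and the resulting coefficients are illustrative, not the Blue Nile data:

```python
import numpy as np

# Illustrative P-impedance vs porosity pairs from a hypothetical well log
imp = np.array([7500.0, 8000.0, 8500.0, 9000.0, 9500.0])  # P-impedance
phi = np.array([0.24, 0.21, 0.18, 0.15, 0.12])            # porosity, fraction

# Least-squares linear relation: porosity = slope * impedance + intercept
slope, intercept = np.polyfit(imp, phi, 1)

def impedance_to_porosity(p_impedance):
    """Map a P-impedance value (or array/section) to porosity."""
    return slope * p_impedance + intercept

# Apply the calibrated relation to impedance samples from an inverted section
porosity_section = impedance_to_porosity(np.array([7800.0, 9200.0]))
```

Porosity decreases with increasing impedance here, which is the usual physical trend that makes such a linear conversion workable over a limited depth interval.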
Procedia PDF Downloads 430