Search results for: imaging analysis (NMR
27694 Using Photogrammetric Techniques to Map the Mars Surface
Authors: Ahmed Elaksher, Islam Omar
Abstract:
For many years, the surface of Mars has been a mystery for scientists. Lately, with the help of geospatial data and photogrammetric procedures, researchers have been able to capture some insights about this planet. Two of the most important data sources for exploring Mars are the High Resolution Imaging Science Experiment (HiRISE) and the Mars Orbiter Laser Altimeter (MOLA). HiRISE is one of six science instruments carried by the Mars Reconnaissance Orbiter, launched on August 12, 2005, and managed by NASA. The MOLA sensor is a laser altimeter carried by the Mars Global Surveyor (MGS), launched on November 7, 1996. In this project, we used MOLA-based DEMs to orthorectify HiRISE optical images in order to generate a more accurate and trustworthy surface of Mars. The MOLA data were interpolated using the kriging interpolation technique. Corresponding tie points were digitized from both datasets and employed in co-registering the datasets using GIS analysis tools. We employed three different 3D-to-2D transformation models: the parallel projection (3D affine) transformation model, the extended parallel projection transformation model, and the Direct Linear Transformation (DLT) model. The digitized tie points were split into two sets: Ground Control Points (GCPs), used to estimate the transformation parameters using least squares adjustment techniques, and check points (ChkPs), used to evaluate the computed transformation parameters. Results were evaluated using the RMSEs between the precise horizontal coordinates of the digitized check points and those estimated through the transformation models using the computed transformation parameters. For each set of GCPs, three different configurations of GCPs and check points were tested, and average RMSEs are reported. It was found that for the 2D transformation models, average RMSEs were in the range of five meters. Increasing the number of GCPs from six to ten points improved the accuracy of the results by about two and a half meters; further increasing the number of GCPs did not improve the results significantly. The 3D-to-2D transformation models provided two to three meters accuracy, with the best results reported for the DLT transformation model; here, increasing the number of GCPs did not have a substantial effect. The results support the use of the DLT model, as it provides the accuracy required by the ASPRS large-scale mapping standards, provided that a well-distributed set of GCPs is used. The model is simple to apply and does not need substantial computations.
Keywords: Mars, photogrammetry, MOLA, HiRISE
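As an illustration of the DLT step described in this abstract, the following is a minimal sketch (not the authors' code) of estimating the 11 DLT parameters from GCPs by least squares and reporting the RMSE at check points; the arrays `gcp_xyz`, `gcp_xy`, `chk_xyz` and `chk_xy` are assumed placeholders, and at least six well-distributed GCPs are needed.

```python
import numpy as np

def estimate_dlt(obj_xyz, img_xy):
    """Least-squares estimate of the 11 DLT parameters from 3D object points and 2D image points."""
    A, b = [], []
    for (X, Y, Z), (x, y) in zip(obj_xyz, img_xy):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z])
        b.extend([x, y])
    L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return L

def apply_dlt(L, obj_xyz):
    """Project 3D points into 2D image coordinates with the estimated DLT parameters."""
    X, Y, Z = np.asarray(obj_xyz, float).T
    den = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    x = (L[0] * X + L[1] * Y + L[2] * Z + L[3]) / den
    y = (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / den
    return np.column_stack([x, y])

def rmse(predicted, observed):
    """Horizontal RMSE over the check points."""
    diff = np.asarray(predicted, float) - np.asarray(observed, float)
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

# Usage with placeholder arrays:
# L = estimate_dlt(gcp_xyz, gcp_xy)
# print(rmse(apply_dlt(L, chk_xyz), chk_xy))
```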
Procedia PDF Downloads 57
27693 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis
Authors: N. R. N. Idris, S. Baharom
Abstract:
A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels. In this situation, both the IPD and AD should be utilised in order to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the levels of data on the overall meta-analysis estimates based on IPD-only, AD-only and the combination of IPD and AD (mixed data, MD), under different study scenarios. The percentage relative bias (PRB), root mean square error (RMSE) and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as it significantly increases the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD does not provide significant differences in terms of accuracy of the estimates. Additionally, combining the IPD and AD has a moderating effect on the bias of the estimates of the treatment effects, as the IPD tends to overestimate the treatment effects, while the AD has the tendency to produce underestimated effect estimates. These results may provide some guidance in deciding whether a significant benefit is gained by pooling the two levels of data when conducting a meta-analysis.
Keywords: aggregate data, combined-level data, individual patient data, meta-analysis
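A small sketch of how the three performance measures named above (PRB, RMSE and coverage probability) can be computed from simulated estimates; the inputs `estimates`, `std_errors` and `true_effect` are assumptions for illustration, not the authors' simulation code.

```python
import numpy as np

def performance_measures(estimates, std_errors, true_effect, z=1.96):
    """PRB, RMSE and 95% CI coverage of a set of simulated overall effect estimates."""
    estimates = np.asarray(estimates, float)
    std_errors = np.asarray(std_errors, float)
    prb = 100.0 * (estimates.mean() - true_effect) / true_effect        # percentage relative bias
    rmse = float(np.sqrt(np.mean((estimates - true_effect) ** 2)))      # root mean square error
    lower, upper = estimates - z * std_errors, estimates + z * std_errors
    coverage = float(np.mean((lower <= true_effect) & (true_effect <= upper)))
    return prb, rmse, coverage
```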
Procedia PDF Downloads 375
27692 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation
Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong
Abstract:
Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where an artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. It is known that CT images are inherently more prone to artefacts due to their image formation process, in which a large number of independent detectors are involved and assumed to yield consistent measurements. There are a number of different artefact types, including noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, which cause serious difficulties in reading images. Thus, it is desirable to remove nuisance factors from the degraded image, leaving the fundamental intrinsic information that can provide better interpretation of the anatomical and pathological characteristics. However, this is considered a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on the deep neural network framework, where denoising auto-encoders are stacked to build multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping using a non-linear activation function. The latent representation is then mapped back into a reconstruction whose size is the same as that of the input data. The reconstruction error can be measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme using residual-driven dropout is determined based on the gradient at each layer. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with the back-propagation algorithm. In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical Total Variation problem, which can be efficiently optimized by a convex optimization algorithm such as the primal-dual method. The intrinsic forms of the input images are provided to the deep denoising auto-encoders together with their original forms in the training phase. In the testing phase, a given image is first decomposed into its intrinsic form and then provided to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts. It is shown that our algorithm improves readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of the PSNR, and the qualitative evaluation shows significant improvement in reading images despite degrading artefacts. The experimental results indicate the potential of our algorithm as a prior solution to image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion).
Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation
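For illustration only, a minimal single denoising auto-encoder layer of the kind stacked in this work, written here with PyTorch as an assumption; the authors' residual-driven dropout and the Total Variation decomposition step are not reproduced.

```python
import torch
import torch.nn as nn

class DenoisingAutoEncoder(nn.Module):
    def __init__(self, n_in, n_hidden, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        # deterministic non-linear mapping to a hidden representation
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Sigmoid())
        # reconstruction has the same size as the input
        self.decoder = nn.Linear(n_hidden, n_in)

    def forward(self, x):
        corrupted = x + self.noise_std * torch.randn_like(x)  # corrupt the input
        return self.decoder(self.encoder(corrupted))

# Training with squared reconstruction error and SGD + back-propagation, as in the abstract:
model = DenoisingAutoEncoder(n_in=64 * 64, n_hidden=1024)
optimiser = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

def train_step(batch):
    """`batch`: tensor of flattened intrinsic image patches (assumed input)."""
    optimiser.zero_grad()
    loss = loss_fn(model(batch), batch)
    loss.backward()
    optimiser.step()
    return loss.item()
```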
Procedia PDF Downloads 190
27691 Use of the SWEAT Analysis Approach to Determine the Effectiveness of a School's Implementation of Its Curriculum
Authors: Prakash Singh
Abstract:
The focus of this study is on the use of the SWEAT analysis approach to determine how effectively a school, as an organization, has implemented its curriculum. To gauge the feelings of the teaching staff, unstructured interviews were employed in this study, asking the participants for their ideas and opinions on each of the three identified aspects of the school: instructional materials, media and technology; teachers’ professional competencies; and the curriculum. This investigation was based on the five key components of the SWEAT model: strengths, weaknesses, expectations, abilities, and tensions. The findings of this exploratory study evoke the significance of the SWEAT achievement model as a tool for strategic analysis to be undertaken in any organization. The findings further affirm the usefulness of this analytical tool for human resource development. Employees have expectations, but competency gaps in their professional abilities may hinder them from fulfilling their tasks in terms of their job description. Also, tensions in the working environment can contribute to their experiences of tobephobia (fear of failure). The SWEAT analysis approach detects such shortcomings in any organization and can therefore culminate in the development of programmes to address such concerns. The strategic SWEAT analysis process can provide a clear distinction between success and failure, and between mediocrity and excellence in organizations. However, more research needs to be done on the effectiveness of the SWEAT analysis approach as a strategic analytical tool.Keywords: SWEAT analysis, strategic analysis, tobephobia, competency gaps
Procedia PDF Downloads 507
27690 Effect of Infill’s in Influencing the Dynamic Responses of Multistoried Structures
Authors: Rahmathulla Noufal E.
Abstract:
Investigating the dynamic responses of high-rise structures under the effect of seismic ground motion is extremely important for the proper analysis and design of multistoried structures. Since the presence of infill walls strongly influences the behaviour of frame systems in multistoried buildings, there is an increased need for developing guidelines for the analysis and design of infilled frames under the effect of dynamic loads for the safe and proper design of buildings. In this manuscript, we evaluate the natural frequencies and natural periods of single-bay, single-storey frames considering the effect of infill walls by using eigenvalue analysis and validating the results with SAP 2000 (free vibration analysis). The parameters obtained from the diagonal strut model used for the free vibration analysis are then compared with those of the finite element model, in which the infill is modeled with four-noded shell elements. We also evaluate the effect of various parameters on the natural periods of vibration obtained by free vibration analysis in SAP 2000, comparing them with those obtained by the empirical expressions presented in I.S. 1893 (Part I)-2002.
Keywords: infilled frame, eigenvalue analysis, free vibration analysis, diagonal strut model, finite element model, SAP 2000, natural period
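The eigenvalue step described above can be sketched as follows for a small lumped-mass model; this is a hedged example with placeholder stiffness and mass matrices, not values from the study.

```python
import numpy as np
from scipy.linalg import eigh

K = np.array([[2.0e7, -1.0e7],
              [-1.0e7, 1.0e7]])      # N/m, assumed storey stiffnesses (infill raises these)
M = np.diag([2.0e4, 2.0e4])          # kg, assumed storey masses

eigvals, eigvecs = eigh(K, M)        # generalized eigenvalue problem: K phi = w^2 M phi
omega = np.sqrt(eigvals)             # circular natural frequencies (rad/s)
frequencies = omega / (2.0 * np.pi)  # natural frequencies (Hz)
periods = 1.0 / frequencies          # natural periods (s)
print(frequencies, periods)
```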
Procedia PDF Downloads 330
27689 Modeling Taxane-Induced Peripheral Neuropathy Ex Vivo Using Patient-Derived Neurons
Authors: G. Cunningham, E. Cantor, X. Wu, F. Shen, G. Jiang, S. Philips, C. Bales, Y. Xiao, T. R. Cummins, J. C. Fehrenbacher, B. P. Schneider
Abstract:
Background: Taxane-induced peripheral neuropathy (TIPN) is the most devastating survivorship issue for patients receiving therapy. Dose reductions due to TIPN in the curative setting lead to inferior outcomes for African American patients, as prior research has shown that this group is more susceptible to developing severe neuropathy. The mechanistic underpinnings of TIPN, however, have not been entirely elucidated. While it would be appealing to use primary tissue to study the development of TIPN, procuring nerves from patients is not realistically feasible, as nerve biopsies are painful and may result in permanent damage. Therefore, our laboratory has investigated paclitaxel-induced neuronal morphological and molecular changes using an ex vivo model of human-induced pluripotent stem cell (iPSC)-derived neurons. Methods: iPSCs are undifferentiated and endlessly dividing cells that can be generated from a patient’s somatic cells, such as peripheral blood mononuclear cells (PBMCs). We successfully reprogrammed PBMCs into iPSCs using the Erythroid Progenitor Reprograming Kit (STEMCell Technologiesᵀᴹ); pluripotency was verified by flow cytometry analysis. iPSCs were then induced into neurons using a differentiation protocol that bypasses the neural progenitor stage and uses selected small-molecule modulators of key signaling pathways (SMAD, Notch, FGFR1 inhibition, and Wnt activation). Results: Flow cytometry analysis revealed expression of core pluripotency transcription factors Nanog, Oct3/4 and Sox2 in iPSCs overlaps with commercially purchased pluripotent cell line UCSD064i-20-2. Trilineage differentiation of iPSCs was confirmed with immunofluorescent imaging with germ-layer-specific markers; Sox17 and ExoA2 for ectoderm, Nestin, and Pax6 for mesoderm, and Ncam and Brachyury for endoderm. Sensory neuron markers, β-III tubulin, and Peripherin were applied to stain the cells for the maturity of iPSC-derived neurons. Patch-clamp electrophysiology and calcitonin gene-related peptide (CGRP) release data supported the functionality of the induced neurons and provided insight into the timing for which downstream assays could be performed (week 4 post-induction). We have also performed a cell viability assay and fluorescence-activated cell sorting (FACS) using four cell-surface markers (CD184, CD44, CD15, and CD24) to select a neuronal population. At least 70% of the cells were viable in the isolated neuron population. Conclusion: We have found that these iPSC-derived neurons recapitulate mature neuronal phenotypes and demonstrate functionality. Thus, this represents a patient-derived ex vivo neuronal model to investigate the molecular mechanisms of clinical TIPN.Keywords: chemotherapy, iPSC-derived neurons, peripheral neuropathy, taxane, paclitaxel
Procedia PDF Downloads 122
27688 Spatio-Temporal Dynamics of Snow Cover and Melt/Freeze Conditions in Indian Himalayas
Authors: Rajashree Bothale, Venkateswara Rao
Abstract:
The Indian Himalayas, also known as the third pole, cover an area of about 0.9 million sq km, contain the largest reserve of ice and snow outside the poles, and affect global climate and water availability in the perennial rivers. Variations in the extent of snow are indicative of climate change. Snow melt is sensitive to climate change (warming) and is also a factor influencing climate change. A study of the spatio-temporal dynamics of snow cover and melt/freeze conditions was carried out using space-based observations in the visible and microwave bands. An analysis period of 2003 to 2015 was selected to identify and map the changes and trends in snow cover using Indian Remote Sensing (IRS) Advanced Wide Field Sensor (AWiFS) and Moderate Resolution Imaging Spectroradiometer (MODIS) data. For mapping wet snow, microwave data are used, as they are sensitive to the presence of liquid water in the snow. The present study uses Ku-band scatterometer data from the QuikSCAT and Oceansat satellites. The enhanced-resolution images at 2.25 km from the 13.6 GHz sensor are used to analyze the backscatter response to dry and wet snow for the period 2000-2013 using a threshold method. The study area is divided into three major river basins, namely the Brahmaputra, Ganges and Indus, which also represent the diversification of the Himalayas into the Eastern, Central and Western Himalayas. Topographic variations across the different zones show that a majority of the study area lies in the 4000–5500 m elevation range and that the maximum percentage of high-elevation areas (>5500 m) lies in the Western Himalayas. The effect of climate change can be seen in the extent of snow cover and also in the melt/freeze status in different parts of the Himalayas. The melt onset day increases from east (March 11 + 11) to west (May 12 + 15), with large variation in the number of melt days. The Western Himalayas have a shorter melt duration (120 + 15) in comparison to the Eastern Himalayas (150 + 16), providing less time for melt. Eastern Himalayan glaciers are prone to enhanced melt due to the large melt duration. The extent of snow cover coupled with the melt/freeze status, indicating solar radiation, can be used as a precursor for monsoon prediction.
Keywords: Indian Himalaya, Scatterometer, Snow Melt/Freeze, AWiFS, Cryosphere
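A hedged sketch of the threshold method mentioned for melt detection from Ku-band backscatter; the 3 dB margin and the winter reference window are assumptions, not the thresholds used in the study.

```python
import numpy as np

def melt_onset(day_of_year, sigma0_db, winter_days=60, threshold_db=3.0):
    """First day flagged as wet snow and the total number of melt days in a backscatter time series."""
    day_of_year = np.asarray(day_of_year)
    sigma0_db = np.asarray(sigma0_db, float)
    winter_ref = sigma0_db[:winter_days].mean()          # dry-snow (winter) reference backscatter
    is_melt = sigma0_db < (winter_ref - threshold_db)    # wet snow lowers Ku-band backscatter
    onset = day_of_year[np.argmax(is_melt)] if is_melt.any() else None
    return onset, int(is_melt.sum())
```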
Procedia PDF Downloads 260
27687 Analysis of Expression Data Using Unsupervised Techniques
Authors: M. A. I Perera, C. R. Wijesinghe, A. R. Weerasinghe
Abstract:
This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps in improving the efficacy and reducing the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high dimensional, with a large number of features compared to samples. Hierarchical clustering and K-means are often used in the analysis of gene expression data. There are several cluster validation techniques used in validating the clusters. Heatmaps are an effective external validation method that allows comparing the identified classes with clinical variables and visual analysis of the classes.
Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation
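As a brief illustration of the preprocessing and clustering steps listed above, a scikit-learn sketch; the variance-based gene filter and the cluster count are assumptions for illustration.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import silhouette_score

def cluster_expression(X, k=3, n_top_genes=2000):
    """X: expression matrix (samples x genes); returns labels and silhouette scores for two methods."""
    top = np.argsort(X.var(axis=0))[-n_top_genes:]            # keep the most variable genes
    Xs = StandardScaler().fit_transform(X[:, top])            # preprocessing
    km_labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Xs)
    hc_labels = AgglomerativeClustering(n_clusters=k, linkage="ward").fit_predict(Xs)
    return {"kmeans": (km_labels, silhouette_score(Xs, km_labels)),
            "hierarchical": (hc_labels, silhouette_score(Xs, hc_labels))}
```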
Procedia PDF Downloads 149
27686 Geographical Data Visualization Using Video Games Technologies
Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava
Abstract:
In this paper, we present the advances corresponding to the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the National Institute of Statistics and Geography of Mexico (INEGI). We select a place of interest to visualize from the Landsat platform and apply some processing to the image (rotations, atmospheric correction and enhancement). The resulting image is our grayscale color map, to be fused with the LIDAR data, which were selected using the same coordinates as in Landsat. The LIDAR data are translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They would download the software and the images corresponding to a geological place of interest to a smartphone and could virtually visit and explore the site with a virtual reality visor such as Google Cardboard.
Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material
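A small assumed sketch of the data preparation implied above: scaling the LIDAR elevations to 8-bit raw data and collapsing the Landsat bands to a grayscale color map before loading both into the game engine; file names and scaling choices are placeholders.

```python
import numpy as np

def elevation_to_8bit_raw(lidar_m, out_path="heightmap.raw"):
    """Normalize a LIDAR elevation grid (metres) and write it as headerless 8-bit raw bytes."""
    z = np.asarray(lidar_m, float)
    scaled = (z - z.min()) / (z.max() - z.min())
    raw = (scaled * 255).astype(np.uint8)
    raw.tofile(out_path)
    return raw

def landsat_to_gray(band_stack):
    """Average selected Landsat bands (bands x rows x cols) into a single grayscale color-map image."""
    gray = np.mean(np.asarray(band_stack, float), axis=0)
    return (255 * (gray - gray.min()) / (gray.max() - gray.min())).astype(np.uint8)
```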
Procedia PDF Downloads 246
27685 Unlocking Health Insights: Studying Data for Better Care
Authors: Valentina Marutyan
Abstract:
Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. Healthcare data mining is the process of examining huge amounts of data to extract useful information that can be applied in order to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories. To accomplish this, it uses advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for every person. Doctors can tailor a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Besides helping patients, it improves the efficiency of hospitals, for example by helping them determine the number of beds or doctors they require given the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks are used for predicting diseases and analyzing medical images. Patients were grouped by algorithms such as k-means, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more efficiently, and work more effectively. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
Keywords: data mining, healthcare, big data, large amounts of data
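As an illustration of the predictive-analysis idea described above, a hedged logistic-regression sketch on tabular patient records; the features, split and metric are assumptions, not the project's pipeline.

```python
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def fit_risk_model(X, y):
    """X: patient features (e.g. age, lab values); y: 1 if the condition of interest developed."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])   # discrimination on held-out patients
    return model, auc
```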
Procedia PDF Downloads 76
27684 Seawater Desalination for Production of Highly Pure Water Using a Hydrophobic PTFE Membrane and Direct Contact Membrane Distillation (DCMD)
Authors: Ahmad Kayvani Fard, Yehia Manawi
Abstract:
Qatar’s primary source of fresh water is seawater desalination. Amongst the major processes that are commercially available on the market, the most common large-scale techniques are Multi-Stage Flash distillation (MSF), Multi-Effect Distillation (MED), and Reverse Osmosis (RO). Although commonly used, these three processes are highly expensive owing to high energy input requirements and high operating costs associated with maintenance and the stress induced on the systems in harsh alkaline media. Besides the cost, the environmental footprint of these desalination techniques is significant: from damaging the marine ecosystem, to large land use, to the discharge of tons of GHG and a huge carbon footprint. A less energy-consuming technique based on membrane separation, sought to reduce both the carbon footprint and operating costs, is membrane distillation (MD). Having emerged in the 1960s, MD is an alternative technology for water desalination that has attracted more attention since the 1980s. The MD process involves the evaporation of a hot feed, typically below the boiling point of brine at standard conditions, by creating a water vapor pressure difference across a porous, hydrophobic membrane. The main advantages of MD compared to other commercially available technologies (MSF and MED) and especially RO are the reduction of membrane and module stress due to the absence of trans-membrane pressure, less impact of contaminant fouling on the distillate due to the transfer of water vapor only, the utilization of low-grade or waste heat from the oil and gas industries to heat the feed up to the required temperature difference across the membrane, superior water quality, and relatively lower capital and operating cost. To achieve the objective of this study, a state-of-the-art flat-sheet cross-flow DCMD bench-scale unit was designed, commissioned, and tested. The objective of this study is to analyze the characteristics and morphology of the membrane suitable for DCMD through SEM imaging and contact angle measurement, and to study the water quality of the distillate produced by the DCMD bench-scale unit. Comparison with available literature data is undertaken where appropriate, and laboratory data are used to compare DCMD distillate quality with that of other desalination techniques and standards. Membrane SEM analysis showed that the PTFE membrane used for the study has a contact angle of 127°, with a highly porous surface supported by a less porous, larger-pore-size PP membrane. The study of the effect of feed solution (salinity) and temperature on the water quality of the distillate, based on ICP and IC analysis, showed that for any salinity and different feed temperatures (up to 70 °C), the electric conductivity of the distillate is less than 5 μS/cm, with 99.99% salt rejection. The process proved to be feasible and effective, capable of consistently producing high-quality distillate from very high feed salinity solutions (i.e., 100,000 mg/L TDS), with a substantial quality advantage compared to other desalination methods such as RO and MSF.
Keywords: membrane distillation, waste heat, seawater desalination, membrane, freshwater, direct contact membrane distillation
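The salt-rejection figure quoted above follows from a simple ratio of distillate to feed concentration; a one-line sketch with placeholder numbers is shown below.

```python
def salt_rejection(feed_tds_mg_l, distillate_tds_mg_l):
    """Percent salt rejection R = (1 - Cd/Cf) * 100, with concentrations (or conductivity proxies)."""
    return (1.0 - distillate_tds_mg_l / feed_tds_mg_l) * 100.0

print(salt_rejection(100_000, 2.5))   # ~99.9975 % for a 100,000 mg/L TDS feed (example values)
```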
Procedia PDF Downloads 227
27683 Spatial Organization of Organelles in Living Cells: Insights from Mathematical Modelling
Authors: Congping Lin
Abstract:
Intracellular transport in fungi has a number of important roles in, e.g., filamentous fungal growth and cellular metabolism. Two basic mechanisms for intracellular transport are motor-driven trafficking along microtubules (MTs) and diffusion. Mathematical modelling has been actively developed to understand such intracellular transport and provides unique insight into cellular complexity. Based on live-cell imaging data in Ustilago hyphal cells, probabilistic models have been developed to study the mechanisms underlying the spatial organization of molecular motors and organelles. In particular, another mechanism, the stochastic motility of dynein motors along MTs, has been found to contribute half of their accumulation at the hyphal tip in order to support early endosome (EE) recycling. The EE trafficking not only facilitates the directed motion of peroxisomes but also enhances their diffusive motion. Considering the importance of the spatial organization of early endosomes in supporting peroxisome movement, computational and experimental approaches have been combined at the whole-cell level. Results from this interdisciplinary study promise insights into requirements for other membrane trafficking systems (e.g., in neurons), but may also inform future 'synthetic biology' studies.
Keywords: intracellular transport, stochastic process, molecular motors, spatial organization
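A toy sketch contrasting the two transport mechanisms named above, motor-driven runs along a microtubule versus diffusion, as a biased random walk; the velocity, diffusion coefficient and switching probability are illustrative assumptions, not fitted model parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cargo(n_steps=10_000, dt=0.01, v_motor=2.0, D=0.05, p_bound=0.5):
    """1D position of a cargo that is either motor-bound (drift v_motor) or freely diffusing."""
    x = np.zeros(n_steps)
    for t in range(1, n_steps):
        if rng.random() < p_bound:                       # motor-driven run along the MT
            x[t] = x[t - 1] + v_motor * dt
        else:                                            # diffusive step
            x[t] = x[t - 1] + np.sqrt(2 * D * dt) * rng.normal()
    return x

trajectory = simulate_cargo()
print("net displacement:", trajectory[-1])
```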
Procedia PDF Downloads 133
27682 Study and Calibration of Autonomous UAV Systems with Thermal Sensing Allowing Screening of Environmental Concerns
Authors: Raahil Sheikh, Abhishek Maurya, Priya Gujjar, Himanshu Dwivedi, Prathamesh Minde
Abstract:
UAVs have been part of our environment since they were first used by the Austrian military in an attack on Venice. At that stage, they were just pilotless balloons equipped with bombs to be dropped on enemy territory. Over time, technological advancements allowed UAVs to be controlled remotely or autonomously. This study focuses on the enhancement of pre-existing manual drones, equipping them with a variety of sensors, making them autonomous and capable, and purposing them for a variety of roles, including thermal sensing, data collection, tracking of creatures, forest fires, volcano detection, hydrothermal studies, urban heat island measurement, and other environmental research. The system can also be used for reconnaissance, research, 3D mapping, and search and rescue missions. This study mainly focuses on automating tedious tasks, reducing human errors as much as possible, reducing deployment time, and increasing the overall efficiency, efficacy, and reliability of the UAVs, together with the creation of a comprehensive Ground Control System UI (GCS) enabling less-trained professionals to use the UAV with maximum potency. With the inclusion of such an autonomous system, artificially intelligent paths can be planned and environmental gusts and concerns can be avoided.
Keywords: UAV, drone, autonomous system, thermal imaging
Procedia PDF Downloads 75
27681 Stress Analysis of Buried Pipes from Soil and Traffic Loads
Authors: A. Mohamed, A. El-Hamalawi, M. Frost, A. Connell
Abstract:
Design standards often do not provide guidance or formulae for the calculation of stresses on buried pipelines caused by external loads. Frequently, engineers rely on other methods and published sources of information to calculate such imposed stresses, and a variety of methods can be used. This paper reviews three current approaches to soil-pipeline interaction modelling for predicting stresses on buried pipelines subjected to soil overburden and traffic loading. The traditional approach is to use empirical stress formulas to calculate circumferential bending stresses on pipelines. The alternative approaches considered are the use of a finite element package to compute an estimate of circumferential bending stress and a proprietary stress analysis system (SURFLOAD) to estimate the circumferential bending stress. The results from analyses using these methods are presented and compared to experimental results in terms of predicted and measured circumferential stresses. This study shows that the approach used to assess externally generated stress is important and can lead to an over-conservative analysis. Using FE analysis, either through SURFLOAD or a general FE package, to predict circumferential stress is the most accurate way to undertake stress analysis due to traffic and soil loads. Although conservative, classical empirical methods will continue to be applied to the analysis of buried pipelines; an opportunity exists, therefore, in many circumstances, to use applied numerical techniques made possible by advances in finite element analysis.
Keywords: buried pipelines, circumferential bending stress, finite element analysis, soil overburden, soil pipeline interaction analysis (SPIA), traffic loadings
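As a hedged illustration of the external loads that the empirical route starts from, the sketch below adds soil overburden pressure to the Boussinesq vertical stress from a surface wheel load at the pipe crown; the unit weight, load and depth are placeholders, and the subsequent conversion to circumferential bending stress (e.g. a Spangler-type formula) is not shown.

```python
import math

def overburden_pressure(gamma_soil=19e3, depth=1.2):
    """Vertical soil pressure (Pa) = unit weight (N/m^3) x cover depth (m)."""
    return gamma_soil * depth

def boussinesq_point_load(P=80e3, depth=1.2, r=0.0):
    """Boussinesq vertical stress (Pa) at depth z and radial offset r from a surface point load P (N)."""
    R = math.hypot(r, depth)
    return 3.0 * P * depth**3 / (2.0 * math.pi * R**5)

total = overburden_pressure() + boussinesq_point_load()
print(f"estimated vertical pressure at pipe crown: {total / 1e3:.1f} kPa")
```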
Procedia PDF Downloads 441
27680 Interaction with Earth’s Surface in Remote Sensing
Authors: Spoorthi Sripad
Abstract:
Remote sensing is a powerful tool for acquiring information about the Earth's surface without direct contact, relying on the interaction of electromagnetic radiation with various materials and features. This paper explores the fundamental principle of "Interaction with Earth's Surface" in remote sensing, shedding light on the intricate processes that occur when electromagnetic waves encounter different surfaces. The absorption, reflection, and transmission of radiation generate distinct spectral signatures, allowing for the identification and classification of surface materials. The paper delves into the significance of the visible, infrared, and thermal infrared regions of the electromagnetic spectrum, highlighting how their unique interactions contribute to a wealth of applications, from land cover classification to environmental monitoring. The discussion encompasses the types of sensors and platforms used to capture these interactions, including multispectral and hyperspectral imaging systems. By examining real-world applications, such as land cover classification and environmental monitoring, the paper underscores the critical role of understanding the interaction with the Earth's surface for accurate and meaningful interpretation of remote sensing data.Keywords: remote sensing, earth's surface interaction, electromagnetic radiation, spectral signatures, land cover classification, archeology and cultural heritage preservation
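As a minimal illustration of exploiting spectral signatures for the land-cover work described above, an NDVI sketch with common illustrative thresholds (not values from the paper) is given below.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index from red and near-infrared reflectance."""
    red, nir = np.asarray(red, float), np.asarray(nir, float)
    return (nir - red) / (nir + red + 1e-9)

def classify(ndvi_img):
    """Coarse land-cover classes from NDVI; thresholds are illustrative assumptions."""
    classes = np.full(np.shape(ndvi_img), "bare/soil", dtype=object)
    classes[np.asarray(ndvi_img) < 0.0] = "water"
    classes[np.asarray(ndvi_img) > 0.3] = "vegetation"
    return classes
```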
Procedia PDF Downloads 59
27679 The Effectiveness of Communication Skills Using Transactional Analysis on the Dimensions of Marital Intimacy: An Experimental Study
Authors: Mehravar Javid, James Sexton, S. Taridashti, Joseph Dorer
Abstract:
Objective: Intimacy is among the most important factors in marital relationships and includes different aspects. Communication skills can enable couples to promote their intimacy. This experimental study was conducted to measure the effectiveness of communication skills training using Transactional Analysis (TA) on various dimensions of marital intimacy. Method: The participants in this study were female teachers. Analysis of covariance was employed with an experimental group (n = 15) and a control group (n = 15), with pre-test and post-test; random assignment was applied. The experimental group received the Transactional Analysis training program for 9 weekly sessions of 2 hours each. The instrument was the Marital Intimacy Questionnaire, with 87 items and 9 subscales. Result: The findings suggest that training in Transactional Analysis significantly increased the total score of intimacy on the post-test, except for spiritual intimacy. Discussion: According to the obtained data, it is concluded that communication skills training using Transactional Analysis (TA) can increase intimacy and improve marital relationships. The study highlights the differential effects on emotional, rational, sexual, and psychological intimacy compared to physical, social/recreational, and relational intimacy over the 9-week period.
Keywords: communication skills, intimacy, marital relationships, transactional analysis
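A sketch of the analysis of covariance mentioned in the Method section, with post-test scores modelled on group membership and the pre-test covariate via statsmodels; the data frame and column names are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def run_ancova(df: pd.DataFrame):
    """df columns (assumed): 'post' (post-test score), 'pre' (pre-test score), 'group' ('TA' or 'control')."""
    model = smf.ols("post ~ pre + C(group)", data=df).fit()
    return anova_lm(model, typ=2)     # Type II ANCOVA table for the group effect adjusted for pre-test
```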
Procedia PDF Downloads 95
27678 A Critical Discourse Analysis of President Muhammad Buhari's Speeches
Authors: Joy Aworo-Okoroh
Abstract:
Politics is about trust, and trust is challenged by a speaker’s ability to manipulate language before the electorate. Critical discourse analysis investigates the role of language in constructing social relationships between a political speaker and his audience. This paper explores the linguistic choices made by President Muhammad Buhari that enshrine his ideologies, as well as the socio-political relations of power between him and Nigerians, in his speeches. Two speeches of President Buhari, the inaugural and the Independence Day speeches, are analyzed using Norman Fairclough’s perspective on Halliday’s systemic functional grammar. The analysis is at two levels. The first level is the identification of transitivity and modality choices in the speeches and how they reveal covert ideologies. The second level, premised on Norman Fairclough’s model, analyzes the clauses to identify elements of power, hesitation, persuasion, threat and religious statements. It was discovered that Buhari is a dominant character who makes extensive use of material processes.
Keywords: politics, critical discourse analysis, Norman Fairclough, systemic functional grammar
Procedia PDF Downloads 551
27677 A Novel Paradigm in the Management of Pancreatic Trauma
Authors: E. Tan, O. McKay, T. Clarnette T., D. Croagh
Abstract:
Background: Historically with pancreatic trauma, complete disruption of the main pancreatic duct (MPD), classified as Grade IV-V by the American Association for the Surgery of Trauma (AAST), necessitated a damage-control laparotomy. This was to avoid mortality, shorten diet upgrade timeframe, and hence shorter length of stay. However, acute pancreatic resection entailed complications of pancreatic fistulas and leaks. With the advance of imaging-guided interventions, non-operative management such as percutaneous and transpapillary drainage of traumatic peripancreatic collections have been trialled favourably. The aim of this case series is to evaluate the efficacy of endoscopic ultrasound-guided (EUS) transmural drainage in managing traumatic peripancreatic collections as a less invasive alternative to traditional approaches. This study also highlights the importance of anatomical knowledge regarding peripancreatic collection’s common location in the lesser sac, the pancreas relationship to adjacent organs, and the formation of the main pancreatic duct in regards to the feasibility of therapeutic internal drainage. Methodology: A retrospective case series was conducted at a single tertiary endoscopy unit, analysing patient data over a 5-year period. Inclusion criteria outlined patients age 5 to 80-years-old, traumatic pancreatic injury of at least Grade IV and haemodynamic stability. Exclusion criteria involved previous episodes of pancreatitis or abdominal trauma. Patient demographics and clinicopathological characteristics were retrospectively collected. Results: The study identified 7 patients with traumatic pancreatic injuries that were managed from 2018-2022; age ranging from 5 to 34 years old, with majority being female (n=5). Majority of the mechanisms of trauma were a handlebar injury (n=4). Diagnosis was confirmed with an elevated lipase and computerized tomotography (CT) confirmation of proximal pancreatic transection with MPD disruption. All patients sustained an isolated single organ grade IV pancreatic injury, except case 4 and 5 with other intra-abdominal visceral Grade 1 injuries. 6 patients underwent early ERCP-guided transpapillary drainage with 1 being unsuccessful for pancreatic duct stent insertion (case 1) and 1 complication of stent migration (case 2). Surveillance imaging post ERCP showed the stents were unable to bridge the disrupted duct and development of symptomatic collections with an average size of 9.9cm. Hence, all patients proceeded to EUS-guided transmural drainage, with 2/7 patients requiring repeat drainages (case 6 and 7). Majority (n=6) had a cystogastrostomy, whilst 1 (case 6) had a cystoenterostomy due to feasibility of the peripancreatic collection being adjacent to duodenum rather than stomach. However, case 6 subsequently required repeat EUS-guided drainage with cystogastrostomy for ongoing collections. Hence all patients avoided initial laparotomy with an average index length of stay of 11.7 days. Successful transmural drainage was demonstrated, with no long-term complications of pancreatic insufficiency; except for 1 patient requiring a distal pancreatectomy at 2 year follow-up due to chronic pain. Conclusion: The early results of this series support EUS-guided transmural drainage as a viable management option for traumatic peripancreatic collections, showcasing successful outcomes, minimal complications, and long-term efficacy in avoiding surgical interventions. 
More studies are required before the adoption of this procedure as a less invasive management approach, with fewer complications, for traumatic peripancreatic collections.
Keywords: endoscopic ultrasound, cystogastrostomy, pancreatic trauma, traumatic peripancreatic collection, transmural drainage
Procedia PDF Downloads 47
27676 Study and Calibration of Autonomous UAV Systems With Thermal Sensing With Multi-purpose Roles
Authors: Raahil Sheikh, Prathamesh Minde, Priya Gujjar, Himanshu Dwivedi, Abhishek Maurya
Abstract:
UAVs have been part of our environment since they were first used by the Austrian military in an attack on Venice. At that stage, they were just pilotless balloons equipped with bombs to be dropped on enemy territory. Over time, technological advancements allowed UAVs to be controlled remotely or autonomously. This study focuses on the enhancement of pre-existing manual drones, equipping them with a variety of sensors, making them autonomous and capable, and purposing them for a variety of roles, including thermal sensing, data collection, tracking of creatures, forest fires, volcano detection, hydrothermal studies, urban heat island measurement, and other environmental research. The system can also be used for reconnaissance, research, 3D mapping, and search and rescue missions. This study mainly focuses on automating tedious tasks, reducing human errors as much as possible, reducing deployment time, and increasing the overall efficiency, efficacy, and reliability of the UAVs, together with the creation of a comprehensive Ground Control System UI (GCS) enabling less-trained professionals to use the UAV with maximum potency. With the inclusion of such an autonomous system, artificially intelligent paths can be planned and environmental gusts and concerns can be avoided.
Keywords: UAV, autonomous systems, drones, geo thermal imaging
Procedia PDF Downloads 85
27675 Increasing the Frequency of Laser Impulses with Optical Choppers with Rotational Shafts
Authors: Virgil-Florin Duma, Dorin Demian
Abstract:
Optical choppers are among the most common optomechatronic devices, utilized in numerous applications, from radiometry to telescopes and biomedical imaging. The classical configuration has a rotational disk with windows with linear margins. This research points out the laser signals that can be obtained with these classical choppers, as well as with another, novel, patented configuration of eclipse choppers (i.e., with rotational disks with windows with non-linear margins, oriented outwards or inwards). Approximately triangular laser signals can be obtained with eclipse choppers, in contrast to the approximately sinusoidal signals obtained with classical devices. The main topic of this work is another novel device: choppers with shafts of different shapes and with slits of various profiles (patent pending). A significant improvement that can be obtained (with regard to disk choppers) refers to the chop frequencies of the laser signals. Thus, while 1 kHz is the typical limit for disk choppers, a more than 20-fold increase in the chop frequency can be obtained with choppers with shafts. Their transmission functions are also discussed, for different types of laser beams. Acknowledgments: This research is supported by the Romanian National Authority for Scientific Research, through the project PN-III-P2-2.1-BG-2016-0297.
Keywords: laser signals, laser systems, optical choppers, optomechatronics, transfer functions, eclipse choppers, choppers with shafts
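The chop-frequency gain discussed above comes down to aperture count times rotation speed; a quick illustrative calculation with example values (not the devices from the study) is shown below.

```python
def chop_frequency(n_windows, rpm):
    """Chop frequency in Hz for a chopper with n_windows apertures spinning at `rpm`."""
    return n_windows * rpm / 60.0

print(chop_frequency(n_windows=30, rpm=2000))    # example classical disk: 1.0 kHz
print(chop_frequency(n_windows=60, rpm=20000))   # more slits at higher speed: 20 kHz
```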
Procedia PDF Downloads 191
27674 Study the Effect of Tolerances for Press Tool Assembly: Computer Aided Tolerance Analysis
Authors: Subodh Kumar, Ramkisan Pawar, Gopal D. Belurkar
Abstract:
This paper describes a study of a simple blanking tool. In a blanking or piercing operation, the punch and die should be concentric for proper cutting. In this study, a tolerance analysis method is used to analyze the variation in the press tool assembly. This variation results in eccentricity between the die and punch due to the cumulative tolerance of the parts used in the assembly. 1D variation analyses were performed with the CREO Parametric computer-aided design (CAD) software powered by the CETOL 6σ computer-aided tolerance analysis software. The use of CAD analysis software gave the opportunity to find the cause of variation in the tool assembly. Accordingly, new tolerance specifications and process settings for die set manufacturing were determined. Tolerance allocation and tolerance analysis were performed iteratively to conclude that the position and size tolerances of the hole in the top plate for the bush, as well as the size tolerance of the guide pillar, were most responsible for the eccentricity between punch and die. This work proposes optimum tolerances for the press tool assembly parts to achieve 100% yield for the specified 0.015 mm minimum tolerance zone.
Keywords: blanking, GD&T (geometric dimensioning and tolerancing), DPMU (defects per million units), press tool, stack-up analysis, tolerance allocation, yield percentage
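A short illustrative 1D stack-up of the kind performed in the study, comparing worst-case and root-sum-square (statistical) accumulation of the contributing tolerances; the values are placeholders, not the tool's actual tolerances.

```python
import math

# assumed contributors to punch-die eccentricity (mm): top-plate hole position, bush, pillar, ...
tolerances_mm = [0.010, 0.008, 0.005, 0.006]

worst_case = sum(tolerances_mm)                              # every part at its limit simultaneously
statistical = math.sqrt(sum(t**2 for t in tolerances_mm))    # RSS: independent, centred distributions

print(f"worst-case stack-up : {worst_case:.3f} mm")
print(f"RSS stack-up        : {statistical:.3f} mm")
```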
Procedia PDF Downloads 361
27673 Study of Aerosol Deposition and Shielding Effects on Fluorescent Imaging Quantitative Evaluation in Protective Equipment Validation
Authors: Shinhao Yang, Hsiao-Chien Huang, Chin-Hsiang Luo
Abstract:
The leakage of protective clothing is an important issue in the occupational health field, and there is no established quantitative method for measuring the leakage of personal protective equipment. This work aims to measure the leakage of personal protective equipment quantitatively by using a fluorochrome aerosol tracer. Fluorescent aerosols were employed as airborne particulates in a controlled chamber together with ultraviolet (UV) light-detectable stickers. After an exposure-and-leakage test, the protective equipment was removed and photographed under UV scanning to evaluate the areas, colour depth ratio, and aerosol deposition and shielding effects of the areas where fluorescent aerosols had adhered to the body through the protective equipment. On this basis, this work built calculation software for the quantitative leakage ratio of protective clothing based on the fluorescent illumination depth/aerosol concentration ratio, the illumination/Fa ratio, aerosol deposition and shielding effects, and the leakage area ratio on the segmentation. The results indicated that the two-repetition total leakage rates of the X, Y, and Z type protective clothing for subject T were about 3.05, 4.21, and 3.52 mg/m², respectively. For five repetitions, the leakage rates were about 4.12, 4.52, and 5.11 mg/m².
Keywords: fluorochrome, deposition, shielding effects, digital image processing, leakage ratio, personal protective equipment
Procedia PDF Downloads 322
27672 Investors' Ratio Analysis and the Profitability of Listed Firms: Evidence from Nigeria
Authors: Abisola Akinola, Akinsulere Femi
Abstract:
The stock market has continually been a source of economic development in most developing countries. This study examined the relationship between investors’ ratio analysis and the profitability of quoted companies in Nigeria using secondary data obtained from the annual reports of forty-two (42) companies. The study employed the multiple regression technique to analyze the relationship between investors’ ratio analysis (measured by dividend per share and earnings per share) and profitability (measured by return on equity). The results from the analysis show that investors’ ratio analysis, when measured by earnings per share, has a positive and significant impact on profitability. However, the study noted that investors’ ratio analysis, when measured by dividend per share, tends to have a positive but statistically insignificant impact on profitability. By implication, investors and other stakeholders that are interested in investing in stocks can predict the earning capacity of listed firms in the stock market.
Keywords: dividend per share, earnings per share, profitability, return on equity
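A sketch of the multiple regression described above (return on equity on earnings per share and dividend per share) using statsmodels; the data frame and column names are assumptions for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

def profitability_regression(df: pd.DataFrame):
    """df columns (assumed): 'roe' (return on equity), 'eps' (earnings per share), 'dps' (dividend per share)."""
    model = smf.ols("roe ~ eps + dps", data=df).fit()
    return model.summary()    # coefficients, t-statistics and p-values for EPS and DPS
```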
Procedia PDF Downloads 137
27671 Statistical Analysis of Natural Images after Applying ICA and ISA
Authors: Peyman Sheikholharam Mashhadi
Abstract:
Difficulties in analyzing real-world images in the classical image processing and machine vision framework have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has been adapted to the statistics of real-world images through the evolutionary process. There are two well-known successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. We also investigate the response of feature detectors to gratings with various parameters in order to find the optimal parameters of the feature detectors. Finally, the selectiveness of the feature detectors to phase in both models is considered.
Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images
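For illustration, a minimal sketch of the ICA stage on natural-image patches using scikit-learn's FastICA; the patch size and component count are assumptions, and ISA and the grating analysis are not shown.

```python
import numpy as np
from sklearn.decomposition import FastICA

def learn_ica_filters(images, patch=16, n_patches=20000, n_components=64, seed=0):
    """Learn Gabor-like ICA filters from random patches of grayscale natural images."""
    rng = np.random.default_rng(seed)
    patches = []
    for _ in range(n_patches):
        img = images[rng.integers(len(images))]
        r = rng.integers(0, img.shape[0] - patch)
        c = rng.integers(0, img.shape[1] - patch)
        patches.append(img[r:r + patch, c:c + patch].ravel())
    X = np.asarray(patches, float)
    X -= X.mean(axis=1, keepdims=True)                 # remove the DC component of each patch
    ica = FastICA(n_components=n_components, max_iter=500, random_state=seed)
    ica.fit(X)
    return ica.components_.reshape(n_components, patch, patch)
```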
Procedia PDF Downloads 339
27670 Surge Analysis of Water Transmission Mains in Una, Himachal Pradesh, India
Authors: Baldev Setia, Raj Rajeshwari, Maneesh Kumar
Abstract:
This paper presents an analysis of a water transmission main that failed due to surge, carried out using basic software known as the Surge Analysis Program (SAP). It is a real failure case study of a pipeline laid in Una, Himachal Pradesh. The transmission main is a 13-kilometre-long pipeline, with 7.9 kilometres as pumping main and 5.1 kilometres as gravity main. The analysis deals mainly with the pumping main. The results are available in two text files; in addition, several files are prepared specifically to present the results in graphical form. These results help to observe the pressure differences and surge occurrence at different locations along the pipe profile, which helps in redesigning the transmission main with different but suitable safety measures against possible surge. A technically viable and economically feasible design has been provided as per the relevant manual and standard code of practice.
Keywords: surge, water hammer, transmission mains, SAP 2000
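The surge pressure that such an analysis is concerned with can be bounded with the Joukowsky relation; a back-of-the-envelope sketch with assumed wave speed and velocity change (not data from the Una pipeline) follows.

```python
def joukowsky_surge(rho=1000.0, wave_speed=1000.0, delta_v=1.5):
    """Joukowsky pressure rise dP = rho * a * dV (Pa) for a sudden velocity change dV (m/s)."""
    return rho * wave_speed * delta_v

dp = joukowsky_surge()
print(f"surge pressure rise: {dp / 1e5:.1f} bar ({dp / 9810:.0f} m of head)")   # ~15 bar, ~153 m
```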
Procedia PDF Downloads 366
27669 Measurement of Nasal Septal Cartilage in Adult Filipinos Using Computed Tomography
Authors: Miguel Limbert Ramos, Joseph Amado Galvez
Abstract:
Background: The nasal septal cartilage is an autologous graft that is widely used in different otolaryngologic procedures of the different subspecialties, such as in septorhinoplasty and ear rehabilitation procedures. The cartilage can be easily accessed and harvested to be utilized for such procedures. However, the dimension of the nasal septal cartilage differs, corresponding to race, gender, and age. Measurements can be done via direct measurement of harvested septal cartilage in cadavers or utilizing radiographic imaging studies giving baseline measurement of the nasal septal cartilage distinct to every race. A preliminary baseline measurement of the dimensions of Filipino nasal septal cartilage was previously established by measuring harvested nasal septal cartilage in Filipino Malay cadavers. This study intends to reinforce this baseline measurement by utilizing computed tomography (CT) scans of adult Filipinos in a tertiary government hospital in the City of Manila, Philippines, which will cover a larger sampling population. Methods: The unit of observation and analysis will be the computed tomography (CT) scans of patients ≥ 18years old who underwent cranial, facial, orbital, paranasal sinus, and temporal bone studies for the year 2019. The measurements will be done in a generated best midsagittal image (155 subjects) which is a view through the midline of the cerebrum that is simultaneously viewed with its coronal and axial views for proper orientation. The view should reveal important structures that will be used to plot the anatomic boundaries, which will be measured by a DICOM image viewing software (RadiAnt). The measured area of nasal septal cartilage will be compared by gender and age. Results: The total area of the nasal septal cartilage is larger in males compared to females, with a mean value of 6.52 cm² and 5.71 cm², respectively. The harvestable nasal septal cartilage area is also larger in males with a mean value of 3.57 cm² compared to females with only a measured mean value of 3.13 cm². The total and harvestable area of the nasal septal cartilage is largest in the 18-30 year-old age group with a mean value of 6.47 cm² and 3.60 cm² respectively and tends to decrease with the advancement of age, which can be attributed to continuous ossification changes. Conclusion: The best time to perform septorhinoplasty and other otolaryngologic procedures which utilize the nasal septal cartilage as graft material is during post-pubertal age, hence surgeries should be avoided or delayed to allow growth and maturation of the cartilage. A computed tomography scan is a cost-effective and non-invasive tool that can provide information on septal cartilage areas prior to these procedures.Keywords: autologous graft, computed tomography, nasal septal cartilage, septorhinoplasty
Procedia PDF Downloads 158
27668 Viscoelastic Behaviour of Hyaluronic Acid Copolymers
Authors: Loredana Elena Nita, Maria Bercea, Aurica P. Chiriac, Iordana Neamtu
Abstract:
The paper is devoted to the behavior of gels based on poly(itaconic anhydride-co-3, 9-divinyl-2, 4, 8, 10-tetraoxaspiro (5.5) undecane) copolymers, with different ratio between the comonomers, and hyaluronic acid (HA). The gel formation was investigated by small-amplitude oscillatory shear measurements following the viscoelastic behavior as a function of gel composition, temperature and shear conditions. Hyaluronic acid was investigated in the same conditions and its rheological behavior is typical to viscous fluids. In the case of the copolymers, the ratio between the two comonomers influences the viscoelastic behavior, a higher content of itaconic anhydride favoring the gel formation. Also, the sol-gel transition was evaluated according to Winter-Chambon criterion that identifies the gelation point when the viscoelastic moduli (G’ and G”) behave similarly as a function of oscillation frequency. From rheological measurements, an optimum composition was evidenced for which the system presents a typical gel-like behavior at 37 °C: the elastic modulus is higher than the viscous modulus and they are not dependent on the oscillation frequency. The formation of the 3D macroporous network was also evidenced by FTIR spectra, SEM microscopy and chemical imaging. These hydrogels present a high potential as drug delivery systems.Keywords: copolymer, viscoelasticity, gelation, 3D network
Procedia PDF Downloads 287
27667 Satellite Interferometric Investigations of Subsidence Events Associated with Groundwater Extraction in Sao Paulo, Brazil
Authors: B. Mendonça, D. Sandwell
Abstract:
The Metropolitan Region of Sao Paulo (MRSP) has suffered from serious water scarcity. Consequently, the most convenient solution has been drilling wells to extract groundwater from local aquifers. However, this requires constant vigilance to prevent over-extraction and future events that could pose a serious threat to the population, such as subsidence. Radar imaging techniques (InSAR) have allowed continuous investigation of such phenomena. The data analysed in the present study consist of 23 SAR images dated from October 2007 to March 2011, obtained by the ALOS-1 spacecraft. Data processing was carried out with the software GMTSAR, using the InSAR technique to create pairs of interferograms with ground displacement over different time spans. First results show a correlation between the locations of 102 wells registered in 2009 and signals of ground displacement equal to or lower than -90 millimeters (mm) in the region. The longest time-span interferogram obtained covers October 2007 to March 2010. From that interferogram, it was possible to estimate the average displacement velocity in millimeters per year (mm/y) and to identify the areas in which strong signals have persisted in the MRSP. Four specific areas with subsidence signals of 28 mm/y to 40 mm/y were chosen to investigate the phenomenon: Guarulhos (Sao Paulo International Airport), the Greater Sao Paulo, Itaquera and Sao Caetano do Sul. The extent of the signals was between 0.6 km and 1.65 km in length. All areas are located above a sedimentary aquifer. Itaquera and Sao Caetano do Sul showed signals varying from 28 mm/y to 32 mm/y. On the other hand, the places most likely to be suffering from stronger subsidence are those in the Greater Sao Paulo and Guarulhos, right beside the International Airport of Sao Paulo, where the displacement rate ranges from 35 mm/y to 40 mm/y. Previous investigations of water use at the International Airport highlight the risks of the excessive water extraction that was being carried out through 9 deep wells. Therefore, subsidence events are likely to occur and to cause serious damage in the area. This study reveals a situation that has not been explored with proper importance in the city, given its social and economic consequences. Since the data were only available until 2011, the question that remains is whether the situation still persists. What can be reaffirmed, however, is a scenario of risk at the International Airport of Sao Paulo that needs further investigation.
Keywords: ground subsidence, Interferometric Synthetic Aperture Radar (InSAR), metropolitan region of Sao Paulo, water extraction
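A hedged sketch of the basic InSAR conversion behind such displacement maps, from unwrapped interferometric phase to line-of-sight displacement and average velocity; the L-band wavelength is the nominal ALOS PALSAR value and is stated here as an assumption.

```python
import numpy as np

WAVELENGTH_M = 0.236   # ~23.6 cm, nominal ALOS-1 PALSAR L-band wavelength (assumed value)

def los_displacement(unwrapped_phase_rad):
    """Line-of-sight displacement (m) from unwrapped interferometric phase (rad): d = -(lambda/4pi) * phi."""
    return -(WAVELENGTH_M / (4.0 * np.pi)) * np.asarray(unwrapped_phase_rad, float)

def mean_velocity_mm_per_year(unwrapped_phase_rad, span_days):
    """Average displacement rate (mm/y) over the interferogram time span."""
    d_m = los_displacement(unwrapped_phase_rad)
    return d_m * 1000.0 * 365.25 / span_days
```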
Procedia PDF Downloads 354
27666 Surge Analysis of Water Transmission Mains in Una, Himachal Pradesh (India)
Authors: Baldev Setia, Raj Rajeshwari, Maneesh Kumar
Abstract:
This paper presents an analysis of a water transmission main that failed due to surge, carried out using basic software known as the Surge Analysis Program (SAP). It is a real failure case study of a pipeline laid in Una, Himachal Pradesh. The transmission main is a 13-kilometre-long pipeline, with 7.9 kilometres as pumping main and 5.1 kilometres as gravity main. The analysis deals mainly with the pumping main. The results are available in two text files; in addition, several files are prepared specifically to present the results in graphical form. These results help to observe the pressure differences and surge occurrence at different locations along the pipe profile, which helps in redesigning the transmission main with different but suitable safety measures against possible surge. A technically viable and economically feasible design has been provided as per the relevant manual and standard code of practice.
Keywords: surge, water hammer, transmission mains, SAP 2000
Procedia PDF Downloads 403
27665 Using Structured Analysis and Design Technique Method for Unmanned Aerial Vehicle Components
Authors: Najeh Lakhoua
Abstract:
Introduction: Scientific developments and techniques for the systemic approach have generated several names for it: systems analysis, systemic analysis, structural analysis. The main purpose of these reflections is to find a multi-disciplinary approach which organizes knowledge, creates a universal design language and controls complex sets. In fact, system analysis is structured sequentially in steps: the observation of the system by various observers in various aspects, the analysis of interactions and regulatory chains, the modeling that takes into account the evolution of the system, the simulation, and the real tests carried out in order to obtain consensus. Thus the system approach allows two types of analysis, according to the structure and the function of the system. The purpose of this paper is to present an application of system analysis to Unmanned Aerial Vehicle (UAV) components in order to represent the architecture of this system. Method: Various analysis methods are proposed in the literature to carry out global analysis from different points of view, such as the SADT method (Structured Analysis and Design Technique) and Petri nets. The methodology adopted in order to contribute to the system analysis of an Unmanned Aerial Vehicle is proposed in this paper and is based on the use of SADT. In fact, we present a functional analysis, based on the SADT method, of the UAV components (body, power supply and platform, computing, sensors, actuators, software, loop principles, flight controls and communications). Results: In this part, we present the application of the SADT method for the functional analysis of the UAV components. This SADT model is composed exclusively of actigrams. It starts with the main function 'To analyse the UAV components'. Then, this function is broken into sub-functions, and this process is developed until the last decomposition level has been reached (levels A1, A2, A3 and A4). Recall that SADT techniques are semi-formal; however, for the same subject, different correct models can be built without knowing with certitude which model is the good one or, at least, the best one. In fact, this kind of model allows users sufficient freedom in its construction, and so the subjective factor introduces a supplementary dimension for its validation. That is why the validation step as a whole necessitates the confrontation of different points of view. Conclusion: In this paper, we presented an application of system analysis to Unmanned Aerial Vehicle components. This application of system analysis is based on the SADT method (Structured Analysis and Design Technique). The functional analysis demonstrated the usefulness of the SADT method and its ability to describe complex dynamic systems.
Keywords: system analysis, unmanned aerial vehicle, functional analysis, architecture
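As a lightweight illustration (an assumption, not the authors' tool), the actigram decomposition described in the Results section can be captured as a simple tree; the grouping of sub-functions below is chosen purely for demonstration.

```python
from dataclasses import dataclass, field

@dataclass
class Actigram:
    code: str                 # e.g. "A0", "A1"
    function: str             # the activity expressed as a verb phrase
    children: list = field(default_factory=list)

    def add(self, code, function):
        child = Actigram(code, function)
        self.children.append(child)
        return child

    def show(self, indent=0):
        print("  " * indent + f"{self.code}: {self.function}")
        for child in self.children:
            child.show(indent + 1)

# Hypothetical breakdown of the main function into first-level actigrams:
a0 = Actigram("A0", "Analyse the UAV components")
a0.add("A1", "Analyse body, power supply and platform")
a0.add("A2", "Analyse computing, sensors and actuators")
a0.add("A3", "Analyse software and loop principles")
a0.add("A4", "Analyse flight controls and communications")
a0.show()
```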
Procedia PDF Downloads 204