Search results for: synthetic aperture imaging
1656 Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees
Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel
Abstract:
Telemedicine services use a large amount of data, most of which are diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees to optimize the search for diagnostic images hosted on a cloud server. To analyze server performance, the following quality of service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency and throughput, in five test scenarios for a total of 26 experiments during the upload and download of DICOM images hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with sequential search, it was possible to evaluate the search times for diagnostic images on the server. The results show that using the metadata in decision trees substantially improves search times, optimizes computational resources and improves request management of the telemedicine image service. In the experiments carried out, search efficiency increased by 45% relative to sequential search, since false positives are avoided in the management and acquisition of the information when downloading a diagnostic image. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
Keywords: cloud storage, decision trees, diagnostic image, search, telemedicine
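The speed-up described in this abstract comes from indexing image metadata so that a query walks a small tree of attribute values instead of scanning every record. A minimal pure-Python sketch of the idea (the metadata fields and file names below are hypothetical illustrations, not the study's actual DICOM tags or data):

```python
# Hypothetical DICOM-like metadata records; fields and values are illustrative only.
RECORDS = [
    {"modality": "CT", "body_part": "CHEST", "file": "img001.dcm"},
    {"modality": "MR", "body_part": "HEAD",  "file": "img002.dcm"},
    {"modality": "CT", "body_part": "HEAD",  "file": "img003.dcm"},
    {"modality": "MR", "body_part": "CHEST", "file": "img004.dcm"},
]

def sequential_search(records, modality, body_part):
    """Scan every record (the baseline the study compares against)."""
    return [r["file"] for r in records
            if r["modality"] == modality and r["body_part"] == body_part]

def build_index(records, keys=("modality", "body_part")):
    """Nested dict acting as a fixed-depth decision tree over metadata keys."""
    tree = {}
    for r in records:
        node = tree
        for k in keys[:-1]:
            node = node.setdefault(r[k], {})
        node.setdefault(r[keys[-1]], []).append(r["file"])
    return tree

def tree_search(tree, modality, body_part):
    """Follow one branch per query attribute instead of scanning all records."""
    return tree.get(modality, {}).get(body_part, [])

INDEX = build_index(RECORDS)
```

Both searches return the same files; the tree visits one branch per attribute rather than every record, which is the kind of pruning the reported reduction in search time relies on.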
Procedia PDF Downloads 204
1655 Tractography Analysis and the Evolutionary Origin of Schizophrenia
Authors: Mouktafi Amine, Tahiri Asmaa
Abstract:
A substantial body of traditional medical research has been devoted to managing and treating mental disorders. At present, to the best of our knowledge, the underlying causes of most psychological disorders still need further exploration to inform early diagnosis, symptom management and treatment. The emerging field of evolutionary psychology is a promising prospect for addressing the origin of mental disorders, potentially leading to more effective treatments. Schizophrenia, as a topical mental disorder, has been linked to the evolutionary adaptation of the human brain, reflected in the brain connectivity and asymmetry directly linked to humans' higher cognition; other primates are our closest living representation of the structure and connectivity of our earliest common African ancestors. As proposed in the evolutionary psychology literature, the pathophysiology of schizophrenia is expressed in, and directly linked to, altered connectivity between the Hippocampal Formation (HF) and the Dorsolateral Prefrontal Cortex (DLPFC). This paper presents the results of tractography analysis using multiple open-access Diffusion Weighted Imaging (DWI) datasets of healthy subjects, schizophrenia-affected subjects and primates, illustrating the relevance of the connectivity of the aforementioned brain regions and the underlying evolutionary changes in the human brain. Deterministic fiber tracking and streamline analysis were used to generate connectivity matrices from the DWI datasets, which were overlaid to compute distances and highlight disconnectivity patterns in conjunction with other fiber tracking metrics: Fractional Anisotropy (FA), Mean Diffusivity (MD) and Radial Diffusivity (RD).
Keywords: tractography, diffusion weighted imaging, schizophrenia, evolutionary psychology
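The three tracking metrics named at the end of this abstract are all simple functions of the diffusion tensor's eigenvalues. A sketch of the standard definitions (the eigenvalues below are made-up illustrative values, not data from the study):

```python
import math

def diffusion_metrics(l1, l2, l3):
    """FA, MD and RD from the three diffusion-tensor eigenvalues
    (largest first), using the standard textbook formulas."""
    md = (l1 + l2 + l3) / 3.0        # Mean Diffusivity: average eigenvalue
    rd = (l2 + l3) / 2.0             # Radial Diffusivity: mean of the two minor axes
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    # Fractional Anisotropy: normalized spread of the eigenvalues, in [0, 1].
    fa = math.sqrt(1.5 * num / den) if den > 0 else 0.0
    return fa, md, rd

# Illustrative white-matter-like eigenvalues in mm^2/s.
fa, md, rd = diffusion_metrics(1.7e-3, 0.3e-3, 0.3e-3)
```

An isotropic voxel (all eigenvalues equal) gives FA = 0; a strongly fiber-like voxel such as the example gives FA close to 0.8.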
Procedia PDF Downloads 52
1654 Application of a Universal Distortion Correction Method in Stereo-Based Digital Image Correlation Measurement
Authors: Hu Zhenxing, Gao Jianxin
Abstract:
Stereo-based digital image correlation (also referred to as three-dimensional (3D) digital image correlation (DIC)) is a technique for measuring both the 3D shape and the surface deformation of a component, which has found increasing applications in academia and industry. The accuracy of the reconstructed coordinates depends on many factors, such as the configuration of the setup, stereo-matching and distortion. Most of these factors have been investigated in the literature. For instance, the configuration of a binocular vision system determines the systematic errors. The stereo-matching errors depend on the speckle quality and the matching algorithm, which can only be controlled within a limited range. The distortion is non-linear, particularly in a complex image acquisition system, so distortion correction must be carefully considered. Moreover, the distortion function is difficult to formulate with conventional models in complex image acquisition systems involving microscopes and other complex lenses, and the errors of the distortion correction propagate to the reconstructed 3D coordinates. To address the problem, an accurate mapping method based on 2D B-spline functions is proposed in this study. The mapping functions are used to convert the distorted coordinates onto an ideal plane without distortion. This approach is suitable for any image acquisition distortion model. It is used as a prior step that converts distorted coordinates to their ideal positions, which enables the camera to conform to the pin-hole model. A procedure for applying this approach to stereo-based DIC is presented. Using 3D speckle image generation, numerical simulations were carried out to compare the accuracy of the conventional method and the proposed approach.
Keywords: distortion, stereo-based digital image correlation, b-spline, 3D, 2D
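The distorted-to-ideal mapping this abstract describes can be sketched in its simplest form: interpolating a calibration grid of distorted-to-ideal correspondences. The sketch below uses bilinear interpolation, which is the degree-1 special case of the 2D B-spline mapping the study proposes; the grid and the toy "distortion" (a constant shift) are illustrative assumptions, not the paper's calibration data:

```python
def bilinear_correct(x, y, xs, ys, ideal):
    """Map a distorted coordinate (x, y) to its ideal position by bilinear
    interpolation over a control grid of distorted->ideal correspondences.
    xs, ys: ascending grid coordinates; ideal[j][i]: (u, v) ideal position
    measured at the distorted grid node (xs[i], ys[j])."""
    def cell(v, axis):
        # Index of the grid interval containing v (clamped to the last cell).
        for k in range(len(axis) - 2):
            if v <= axis[k + 1]:
                return k
        return len(axis) - 2
    i, j = cell(x, xs), cell(y, ys)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    out = []
    for c in (0, 1):  # interpolate the u and v components separately
        a = ideal[j][i][c] * (1 - tx) + ideal[j][i + 1][c] * tx
        b = ideal[j + 1][i][c] * (1 - tx) + ideal[j + 1][i + 1][c] * tx
        out.append(a * (1 - ty) + b * ty)
    return tuple(out)

# Toy calibration: the "distortion" shifts everything by (+2, -1), so the
# ideal position of distorted node (x, y) is (x - 2, y + 1).
xs = ys = [0.0, 10.0, 20.0]
ideal = [[(x - 2.0, y + 1.0) for x in xs] for y in ys]
u, v = bilinear_correct(7.0, 13.0, xs, ys, ideal)
```

A real implementation would use higher-degree B-spline basis functions over a denser grid, but the structure (fit correspondences once, then evaluate the map per point before 3D reconstruction) is the same.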
Procedia PDF Downloads 500
1653 Compact LWIR Borescope Sensor for Thermal Imaging of 2D Surface Temperature in Gas-Turbine Engines
Authors: Andy Zhang, Awnik Roy, Trevor B. Chen, Bibik Oleksandar, Subodh Adhikari, Paul S. Hsu
Abstract:
The durability of a combustor in gas-turbine engines is a strong function of its component temperatures and requires good control of these temperatures. Since the temperature of the combustion gases frequently exceeds the melting point of the combustion liner walls, an efficient air-cooling system with optimized cooling air flow rates is critically important for extending the lifetime of the liner walls. To determine the effectiveness of the air-cooling system, accurate two-dimensional (2D) surface temperature measurement of the combustor liner walls is crucial for advanced engine development. Traditional diagnostic techniques for temperature measurement in this application include thermocouples, thermal wall paints, pyrometry, and phosphors. They have several disadvantages: they are intrusive and affect local flame/flow dynamics, they risk flame quenching and physical damage to instrumentation in the harsh environment inside the combustor, and they suffer strong optical interference from combustion emission in the UV to mid-IR wavelengths. To overcome these drawbacks, a compact, small borescope long-wave-infrared (LWIR) sensor is developed to achieve high-spatial-resolution, high-fidelity thermal imaging of 2D surface temperature in gas-turbine engines, providing the desired engine component temperature distribution. The compact LWIR borescope sensor makes it feasible to improve the durability of combustors in gas-turbine engines and, furthermore, to develop more advanced gas-turbine engines.
Keywords: borescope, engine, long-wave-infrared, sensor
Procedia PDF Downloads 137
1652 Quantum Sieving for Hydrogen Isotope Separation
Authors: Hyunchul Oh
Abstract:
One of the challenges in modern separation science and technology is the separation of hydrogen isotope mixtures, since D2 and H2 have almost identical size, shape and thermodynamic properties. Recently, quantum sieving of isotopes by confinement in narrow spaces has been proposed as an alternative technique. Despite many theoretical suggestions, however, it has so far been difficult to find a feasible microporous material. Among porous materials, the novel class of microporous framework materials (COFs, ZIFs and MOFs) is considered promising for isotope sieving due to its ultra-high porosity and uniform, tailorable pore size. Hence, we experimentally investigate the fundamental correlation between the D2/H2 molar ratio and pore size at optimized operating conditions using different ultramicroporous frameworks. The D2/H2 molar ratio depends strongly on pore size, pressure and temperature. The experimentally determined optimum pore diameter for quantum sieving lies between 3.0 and 3.4 Å, which can serve as an important guideline for designing and developing feasible microporous frameworks for isotope separation. We then report a novel strategy for efficient hydrogen isotope separation at technologically relevant operating pressures through quantum sieving exploited by pore aperture engineering. The strategy involves installing flexible components in the pores of the framework to tune the pore surface.
Keywords: gas adsorption, hydrogen isotope, metal organic frameworks (MOFs), quantum sieving
Procedia PDF Downloads 265
1651 Diffusion MRI: Clinical Application in Radiotherapy Planning of Intracranial Pathology
Authors: Pomozova Kseniia, Gorlachev Gennadiy, Chernyaev Aleksandr, Golanov Andrey
Abstract:
In clinical practice, and especially in stereotactic radiosurgery planning, the significance of diffusion-weighted imaging (DWI) is growing. This calls for software capable of quickly processing and reliably visualizing diffusion data, equipped with tools for analyzing them in terms of different tasks. We are developing the «MRDiffusionImaging» software in standard C++. The domain logic has been moved to separate class libraries and can be used on various platforms. The user interface uses Windows WPF (Windows Presentation Foundation), a technology for building Windows applications with access to all components of the .NET 5 or .NET Framework platform ecosystem. One of its important features is a declarative markup language, XAML (eXtensible Application Markup Language), with which one can conveniently create, initialize and set properties of objects with hierarchical relationships. Graphics are rendered using the DirectX environment. The MRDiffusionImaging software package has been implemented for processing diffusion magnetic resonance imaging (dMRI) and allows loading and viewing images sorted by series. An algorithm for "masking" dMRI series based on T2-weighted images was developed using a deformable surface model to exclude tissues unrelated to the area of interest from the analysis. An algorithm for distortion correction using deformable image registration based on autocorrelation of local structure has been developed. The maximum voxel dimension was 1.03 ± 0.12 mm. In an elementary brain volume, the diffusion tensor is geometrically interpreted as an ellipsoid, an isosurface of the probability density of a molecule's diffusion. For the first time, non-parametric intensity distributions, neighborhood correlations and inhomogeneities are combined in one algorithm for segmenting white matter (WM), grey matter (GM) and cerebrospinal fluid (CSF).
A tool for calculating the mean diffusion coefficient and fractional anisotropy has been created, from which quantitative maps can be built for solving various clinical problems. Functionality has been created for clustering and segmenting images to individualize the clinical radiation treatment volume and subsequently assess the response (median Dice score = 0.963 ± 0.137). White matter tracts of the brain were visualized using two algorithms: a deterministic one (fiber assignment by continuous tracking) and a probabilistic one using the Hough transform. The latter tests candidate curves in each voxel, assigning each a score computed from the diffusion data, and then selects the curves with the highest scores as the potential anatomical connections. In the context of functional radiosurgery, it is possible to reduce the volume of the internal capsule receiving 12 Gy from 0.402 cc to 0.254 cc. «MRDiffusionImaging» will improve the efficiency and accuracy of diagnostics and of stereotactic radiotherapy for intracranial pathology. We are developing software with integrated, intuitive support for processing, analysis and inclusion in radiotherapy planning and evaluation of its results.
Keywords: diffusion-weighted imaging, medical imaging, stereotactic radiosurgery, tractography
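The segmentation quality in this abstract is reported as a Dice score; for binary masks it reduces to a one-line overlap measure. A minimal sketch on toy voxel sets (illustrative data, not the study's segmentations):

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks given as sets
    of voxel indices: 2|A ∩ B| / (|A| + |B|), ranging from 0 to 1."""
    if not mask_a and not mask_b:
        return 1.0
    return 2.0 * len(mask_a & mask_b) / (len(mask_a) + len(mask_b))

# Toy example: two segmentations of a 10x10x10 structure, shifted by one voxel.
a = {(x, y, z) for x in range(10) for y in range(10) for z in range(10)}
b = {(x + 1, y, z) for x in range(10) for y in range(10) for z in range(10)}
score = dice(a, b)  # overlap of 900 voxels out of 2000 total -> 0.9
```

A median Dice score of 0.963, as reported, means the automatic and reference volumes overlap almost completely for a typical case.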
Procedia PDF Downloads 85
1650 A Visualization Classification Method for Identifying the Decayed Citrus Fruit Infected by Fungi Based on Hyperspectral Imaging
Authors: Jiangbo Li, Wenqian Huang
Abstract:
Early detection of fungal infection in citrus fruit is one of the major problems in the postharvest commercialization process, and automatic, nondestructive detection of infected fruits is still a challenge for the citrus industry. At present, visual inspection for rotten citrus fruits is commonly performed by workers, either under ultraviolet-induced fluorescence or by manual sorting in citrus packinghouses, to remove fruit with fungal infection. However, the former is problematic because exposing people to this kind of lighting is potentially hazardous to human health, and the latter is very inefficient. Orange was used as the research object. This study focuses on this problem and proposes an effective method based on Vis-NIR hyperspectral imaging in the 400-1000 nm wavelength range with a spectroscopic resolution of 2.8 nm. In this work, three normalization approaches were applied prior to analysis to reduce the effect of sample curvature on spectral profiles, and mean normalization was found to be the most effective pretreatment for decreasing curvature-induced spectral variability. Then, principal component analysis (PCA) was applied to a dataset composed of average spectra from decayed and normal tissue, to reduce the dimensionality of the data and to assess the ability of Vis-NIR hyperspectra to discriminate the two classes. Normal and decayed spectra were found to be separable along the first principal component (PC1) axis. Subsequently, five wavelengths (bands) centered at 577, 702, 751, 808, and 923 nm were selected as the characteristic wavelengths by analyzing the loadings of PC1. A multispectral combination image was generated from the five selected characteristic wavelength images.
From the multispectral combination image, intensity-slicing pseudocolor image processing was used to generate a 2D visual classification image that enhances the contrast between normal and decayed tissue. Finally, an image segmentation algorithm for detecting decayed fruit was developed based on the pseudocolor image coupled with simple thresholding. For an independent test set of 238 samples, comprising fruits infected by Penicillium digitatum and normal fruits, the success rates were 100% and 97.5%, respectively; the proposed algorithm also identified oranges infected by Penicillium italicum with 100% accuracy. This indicates that the proposed multispectral algorithm is effective and has the potential to be applied in the citrus industry.
Keywords: citrus fruit, early rotten, fungal infection, hyperspectral imaging
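Selecting characteristic wavelengths from PC1 loadings, as described above, can be sketched in pure Python: compute the first principal component of a samples-by-wavelengths matrix (here via power iteration on the covariance matrix) and keep the wavelengths with the largest absolute loadings. The toy spectra and wavelength list below are fabricated for illustration; they are not the study's data:

```python
def pc1_loadings(spectra, iters=200):
    """First principal-component loadings of a samples x wavelengths matrix,
    via power iteration on the covariance matrix (a minimal PCA sketch)."""
    n, m = len(spectra), len(spectra[0])
    means = [sum(row[j] for row in spectra) / n for j in range(m)]
    x = [[row[j] - means[j] for j in range(m)] for row in spectra]
    cov = [[sum(x[s][i] * x[s][j] for s in range(n)) / (n - 1)
            for j in range(m)] for i in range(m)]
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

def top_wavelengths(loadings, wavelengths, k):
    """Pick the k wavelengths with the largest |PC1 loading|."""
    order = sorted(range(len(loadings)), key=lambda i: -abs(loadings[i]))
    return sorted(wavelengths[i] for i in order[:k])

# Toy spectra at hypothetical wavelengths (nm): the two classes differ
# only in the 702 nm and 808 nm bands, so PC1 loads heavily there.
wl = [577, 650, 702, 751, 808]
normal  = [[0.20, 0.5, 0.90, 0.5, 0.80], [0.21, 0.5, 0.88, 0.5, 0.82]]
decayed = [[0.20, 0.5, 0.30, 0.5, 0.20], [0.19, 0.5, 0.32, 0.5, 0.18]]
picked = top_wavelengths(pc1_loadings(normal + decayed), wl, 2)
```

The bands where the classes differ dominate the covariance, so they receive the largest loadings; this is the mechanism behind reducing a full hyperspectral cube to a handful of characteristic bands.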
Procedia PDF Downloads 304
1649 Characterization of Potato Starch/Guar Gum Composite Film Modified by Ecofriendly Cross-Linkers
Authors: Sujosh Nandi, Proshanta Guha
Abstract:
Synthetic plastics are preferred for food packaging due to their high strength, stretchability, good water vapor and gas barrier properties, transparency and low cost. However, the environmental pollution generated by these synthetic plastics is a major concern of modern civilization. Therefore, the use of biodegradable polymers as substitutes for synthetic non-biodegradable polymers is encouraged, even considering the drawbacks in the mechanical and barrier properties of the resulting films. Starch, considered one of the potential raw materials for biodegradable polymers, suffers from poor water barrier and mechanical properties due to its hydrophilic nature. In addition, recrystallization of starch molecules occurs during aging, which decreases the flexibility and increases the elastic modulus of the film. The recrystallization process can be minimized by blending other hydrocolloids with similar structural compatibility into the starch matrix. Incorporating guar gum, which has a similar structural backbone, into the starch matrix can therefore introduce a promising film into the realm of biodegradable polymers. However, because both starch and guar gum are hydrophilic, the water barrier property of the film is low. One prospective solution is to modify the potato starch/guar gum (PSGG) composite film using a cross-linker. Over the years, several cross-linking agents, such as phosphorus oxychloride and sodium trimetaphosphate, have been used to improve the water vapor permeability (WVP) of such films. However, these chemical cross-linking agents are toxic, expensive and slow to degrade. Naturally available carboxylic acids (tartaric acid, malonic acid, succinic acid, etc.) have therefore been used as cross-linkers and found to enhance the water barrier property substantially. To our knowledge, no work has been reported with tartaric acid and succinic acid as cross-linking agents blended into PSGG films.
The objective of the present study was therefore to examine the changes in the water vapor barrier and mechanical properties of PSGG films after cross-linking with tartaric acid (TA) and succinic acid (SA). The cross-linkers were blended into the PSGG film-forming solution at four concentrations (4, 8, 12 and 16%) and cast on a Teflon plate at 37°C for 20 h. Fourier-transform infrared spectroscopy (FTIR) of the developed films showed a band at 1720 cm⁻¹, attributed to the formation of ester groups. The tensile strength (TS) of the cross-linked films decreased compared to the non-cross-linked films, whereas the strain at break increased severalfold. Tensile strength diminished with increasing concentration of TA or SA, with the lowest TS (1.62 MPa) observed for 16% SA. The maximum strain at break was observed for TA at 16%, possibly due to the lower degree of crystallinity of the TA cross-linked films compared to SA. The water vapor permeability of the succinic acid cross-linked film was reduced significantly, but it was increased significantly by the addition of tartaric acid.
Keywords: cross linking agent, guar gum, organic acids, potato starch
Procedia PDF Downloads 115
1648 Monitoring Soil Moisture Dynamic in Root Zone System of Argania spinosa Using Electrical Resistivity Imaging
Authors: F. Ainlhout, S. Boutaleb, M. C. Diaz-Barradas, M. Zunzunegui
Abstract:
Argania spinosa is a tree endemic to southwest Morocco, occupying 828,000 ha distributed mainly between Mediterranean vegetation and the desert. This tree can grow in extremely arid regions of Morocco, where annual rainfall ranges between 100 and 300 mm and no other tree species can live, and its area has been designated a UNESCO Biosphere Reserve since 1998. The argan tree is of great importance for the human and animal feeding of the rural population as well as for oil production; it is considered a multi-use tree. The Admine forest, located in the suburbs of Agadir city, 5 km inland, was selected for this work. The aim of the study was to investigate the temporal variation in root-zone moisture dynamics in response to variation in climatic conditions and vegetation water uptake, using a geophysical technique called electrical resistivity imaging (ERI). This technique discriminates resistive woody roots from dry and moist soil. Time-dependent measurements (from April to July) of resistivity sections were performed along a 94 m surface transect at a fixed electrode spacing of 2 m. The transect included eight argan trees. The interactions between the trees and soil moisture were estimated by following the variations in tree water status accompanying the soil moisture deficit. For that purpose, we measured midday leaf water potential and relative water content on each sampling day for the eight trees. The first results showed that ERI can accurately quantify the spatiotemporal distribution of root-zone moisture content and woody roots. The section obtained shows three layers: a conductive (moist) middle layer; a moderately resistive top layer corresponding to relatively dry soil (a calcareous formation with intercalated marly strata), interspersed with a very resistive layer corresponding to woody roots; and, below the conductive layer, another moderately resistive layer.
Throughout the experiment there was a continuous decrease in soil moisture in the different layers. With ERI, we can clearly estimate the depth of the woody roots, which does not exceed 4 meters. In previous work on the same species, analyzing δ18O in xylem water and in the range of possible water sources, we argued that rain is the main water source in winter and spring but not in summer; the trees do not exploit deep water from the aquifer, as popularly assumed, but instead use soil water at a few meters' depth. The results of the present work confirm that the roots of Argania spinosa do not grow very deep.
Keywords: Argania spinosa, electrical resistivity imaging, root system, soil moisture
Procedia PDF Downloads 329
1647 Imaging 255nm Tungsten Thin Film Adhesion with Picosecond Ultrasonics
Authors: A. Abbas, X. Tridon, J. Michelon
Abstract:
In the electronics and photovoltaic industries, components are made from wafers, which are stacks of thin film layers from a few nanometers to several micrometers thick. Early evaluation of the bonding quality between different layers of a wafer is one of the challenges of these industries in avoiding malfunction of their final products. Traditional pump-probe experiments, developed in the 1970s, offer a partial solution to this problem, but with a non-negligible drawback. On one hand, these setups can generate and detect ultra-high ultrasound frequencies, which can be used to evaluate the adhesion quality of wafer layers. On the other hand, because of the rather long acquisition time needed for one measurement, these setups are restricted to point measurements of global sample quality. This can lead to misinterpretation of sample quality parameters, especially for inhomogeneous samples. Asynchronous Optical Sampling (ASOPS) systems can perform picosecond-acoustic characterization up to 10^6 times faster than traditional pump-probe setups. This allows picosecond ultrasonics to unlock acoustic imaging at the nanometric scale and to detect inhomogeneities in sample mechanical properties. This is illustrated by an image of the measured acoustical reflection coefficients obtained by mapping, with an ASOPS setup, a 255 nm tungsten thin film deposited on a silicon substrate. Interpretation of the reflection coefficient in terms of bonding quality is also presented, and the origin of zones exhibiting good and bad bonding is discussed.
Keywords: adhesion, picosecond ultrasonics, pump-probe, thin film
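The acoustical reflection coefficient being mapped here follows, for a perfectly bonded interface, from the acoustic impedances Z = ρv on either side. A sketch with textbook-order-of-magnitude longitudinal values for a tungsten film on silicon (illustrative numbers, not the study's measured data):

```python
def reflection_coefficient(rho1, v1, rho2, v2):
    """Amplitude reflection coefficient r = (Z2 - Z1) / (Z2 + Z1) for an
    acoustic wave travelling from medium 1 into medium 2, with Z = rho * v."""
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)

# Illustrative longitudinal values: tungsten (~19300 kg/m^3, ~5200 m/s)
# into silicon (~2330 kg/m^3, ~8400 m/s). Assumed, not measured, values.
r = reflection_coefficient(19300, 5200, 2330, 8400)
```

The negative r means the echo at the film/substrate interface is phase-inverted; in this hedged picture, a degraded bond changes the effective interface response and shifts the measured coefficient away from this ideal value, which is what a map of r can reveal spatially.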
Procedia PDF Downloads 159
1646 The Current Ways of Thinking Mild Traumatic Brain Injury and Clinical Practice in a Trauma Hospital: A Pilot Study
Authors: P. Donnelly, G. Mitchell
Abstract:
Traumatic brain injury (TBI) is a major contributor to the global burden of disease; despite its ubiquity, there is significant variation in diagnosis, prognosis, and treatment between clinicians. This study aims to examine the spectrum of approaches that currently exist at a Level 1 trauma centre in Australasia by surveying emergency physicians and neurosurgeons on these aspects of mTBI. A pilot survey of 17 clinicians (neurosurgeons, emergency physicians, and others who manage patients with mTBI) at a Level 1 trauma centre in Brisbane, Australia, was conducted. The objective was to examine the importance these clinicians place on various elements in their approach to the diagnosis, prognostication, and treatment of mTBI. The data were summarised and descriptive statistics reported. Loss of consciousness and post-traumatic amnesia were rated as the most important signs or symptoms in diagnosing mTBI (median importance 8). MRI was the most important imaging modality in diagnosing mTBI (median importance 7). 'Number of previous TBIs' and 'intracranial injury on imaging' were rated as the most important elements for prognostication (median importance 9). Education and reassurance were rated as the most important treatment modality for mTBI (median importance 7). There was statistically insignificant variation between the specialties in the importance they place on each of these components. In this Australian tertiary trauma centre, there appears to be variation in how clinicians approach mTBI; this study is underpowered to state whether the variation lies between clinicians within a specialty or between specialties. This variation is worth investigating as a step toward a unified approach to diagnosing, prognosticating, and treating this common pathology.
Keywords: mild traumatic brain injury, adult, clinician, survey
Procedia PDF Downloads 131
1645 Comparison of Radiation Dosage and Image Quality: Digital Breast Tomosynthesis vs. Full-Field Digital Mammography
Authors: Okhee Woo
Abstract:
Purpose: With increasing concern over individual radiation exposure doses, studies analyzing radiation dosage in breast imaging modalities are required. The aim of this study is to compare radiation dosage and image quality between digital breast tomosynthesis (DBT) and full-field digital mammography (FFDM). Methods and Materials: 303 patients (mean age 52.1 years) who underwent both DBT and FFDM were retrospectively reviewed. Radiation dosage data were obtained with a radiation dosage scoring and monitoring program, Radimetrics (Bayer HealthCare, Whippany, NJ). Entrance dose and mean glandular dose for each breast were obtained for both imaging modalities. To compare the image quality of DBT with two-dimensional synthesized mammogram (2DSM) and FFDM, lesion clarity was assessed on a 5-point scale and the better of the two modalities was selected. Interobserver performance was compared with kappa values, and diagnostic accuracy was compared using the McNemar test. Radiation dosages (entrance dose, mean glandular dose) and image quality were compared between the two modalities using the paired t-test and the Wilcoxon rank sum test. Results: For entrance dose and mean glandular dose for each breast, DBT had lower values than FFDM (p-value < 0.0001). Diagnostic accuracy did not differ statistically, but the lesion clarity score was higher for DBT with 2DSM, and DBT was chosen as the better modality compared with FFDM. Conclusion: DBT showed a lower radiation entrance dose and lower mean glandular doses to both breasts than FFDM. DBT with 2DSM also had better image quality than FFDM with similar diagnostic accuracy, suggesting that DBT has the potential to be performed as an alternative to FFDM.
Keywords: radiation dose, DBT, digital mammography, image quality
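The interobserver agreement this abstract summarizes with kappa values is a short computation for two readers making categorical calls. A sketch on made-up ratings (hypothetical reader calls, not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    pe = sum((rater_a.count(l) / n) * (rater_b.count(l) / n)  # chance agreement
             for l in labels)
    return (po - pe) / (1 - pe)

# Hypothetical binary lesion-visibility calls (1 = lesion seen) by two readers.
reader1 = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
reader2 = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
kappa = cohens_kappa(reader1, reader2)
```

Here the readers agree on 8 of 10 cases (0.8 observed) against 0.52 expected by chance, giving kappa ≈ 0.58, conventionally read as moderate agreement.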
Procedia PDF Downloads 350
1644 Understanding Chromosome Movement in Starfish Oocytes
Authors: Bryony Davies
Abstract:
Many cell and tissue culture practices ignore the effects of gravity on cell biology, and little is known about how cell components may move in response to gravitational forces. Starfish oocytes provide an excellent model for interrogating the movement of cell components due to their unusually large size, ease of handling, and high transparency. Chromosomes in starfish oocytes can be visualised by microinjecting a histone-H2B-mCherry plasmid into the oocytes, and their movement can then be tracked by live-cell fluorescence microscopy. The results of experiments using these methods suggest a replicable downward movement of centrally located chromosomes at a median velocity of 0.39 μm/min; chromosomes nearer the nuclear boundary showed more restricted movement. Chromosome density and shape could also be altered by microinjection of restriction enzymes, primarily AluI, before imaging. This altered the speed of chromosome movement: chromosomes from AluI-injected nuclei showed a median downward velocity of 0.60 μm/min. Overall, these results suggest a non-negligible movement of chromosomes in response to gravitational forces, a movement that can be altered by enzyme activity. Future work could interrogate whether this observed downward movement extends to other cell components and other cell types. Additionally, it may be important to understand whether the gravitational orientation and vertical positioning of cell components alter cell behaviour. The findings here may have implications for current cell culture practices, which do not replicate the cell orientations or external forces experienced in vivo. It is possible that failing to account for gravitational forces in 2D cell culture alters experimental results and the accuracy of conclusions drawn from them. Understanding possible behavioural changes in cells due to the effects of gravity would therefore be beneficial.
Keywords: starfish, oocytes, live-cell imaging, microinjection, chromosome dynamics
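A median velocity like the one quoted in this abstract comes from frame-to-frame displacements of tracked chromosome positions. A sketch on a synthetic track (illustrative positions and sampling interval, not the study's measurements):

```python
from statistics import median

def downward_velocities(track, dt_min):
    """Per-interval downward speed (um/min) from a list of vertical
    positions (um, increasing downward) sampled every dt_min minutes."""
    return [(b - a) / dt_min for a, b in zip(track, track[1:])]

# Synthetic track: y-position (um) of one chromosome, one frame every 2 min.
track = [10.0, 10.9, 11.6, 12.4, 13.2, 13.9]
v_med = median(downward_velocities(track, 2.0))
```

The median (rather than the mean) keeps one noisy frame-to-frame jump from dominating the estimate, which matters for the slow, small displacements involved here.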
Procedia PDF Downloads 104
1643 Natural Dyes: A Global Perspective on Commercial Solutions and Industry Players
Authors: Laura Seppälä, Ana Nuutinen
Abstract:
Environmental concerns are increasing the interest in the potential uses of natural dyes. Natural dyes are more safe and environmentally friendly option than synthetic dyes. However, one must be also cautious with natural dyes, because, for example, some dyestuff such as plants or mushrooms, as well as some mordants are poisonous. By natural dyes we mean dyes that are derived from plants, fungi, bark, lichens, algae, insects, and minerals. Different plant parts, such as stems, leaves, flowers, roots, bark, berries, fruits, and cones, can be utilized for textile dyeing and printing, pigment manufacture, and other processes depending on the season. They may be utilized to produce distinctive colour tones that are challenging to do with synthetic dyes. This adds value to textiles and makes them stand out. Synthetic dyes quickly replaced natural dyes, after being developed in the middle of the 19th century, but natural dyes have remained the dyeing method of crafters until recently. This research examines the commercial solutions for natural dyes in many parts of the world, such as Europe, the United States, South America, Africa, Asia, New Zealand, and Australia. This study aims to determine the commercial status of natural dyes. Each continent has its own traditions and specific dyestuffs. The availability of natural dyes can vary depending on several aspects, including plant species, temperature, and harvesting techniques, which poses a challenge to the work of designers and crafters. While certain plants may only provide dyes during specific seasons, others may do so continuously. To find the ideal time to collect natural dyes, it is critical to research various plant species and their harvesting techniques. Furthermore, to guarantee the quality and colour of the dye, plant material must be handled and processed properly. This research was conducted via an internet search, and results were searched systematically for commercial stakeholders in the field. 
The research question addressed commercial actors in the field of natural dyes. This qualitative case study interpreted the data using thematic analysis. Each webpage was screenshotted and analyzed in relation to the research question. Online content analysis means systematically coding and analyzing qualitative data. The most evident result was the interest in natural dyes in different parts of the world: there are clothing collections dyed with natural dyes, dyestuff stores, and courses on natural dyeing. This article presents the designers who work with natural dyes and the actors who are involved in the natural dye industry. Several websites emphasized the safety and environmental benefits of natural dyes. Many of them included eye-catching images of naturally dyed textiles, and the colours of such dyes are considered attractive because they are beautiful, natural hues. The search did not find large-scale industrial solutions for natural dyes, but there were several instances of dyeing with natural dyes. Understanding the players, designers, and stakeholders in the natural dye business is the purpose of this article. This overview of the current state of the art illustrates the direction the natural dye business is taking. Keywords: commercial solutions, environmental issues, key stakeholders, natural dyes, sustainability, textile dyeing
Procedia PDF Downloads 68
1642 Alumina Nanoparticles in One-Pot Synthesis of Pyrazolopyranopyrimidinones
Authors: Saeed Khodabakhshi, Alimorad Rashidi, Ziba Tavakoli, Sajad Kiani, Sadegh Dastkhoon
Abstract:
Alumina nanoparticles (γ-Al2O3 NPs) were prepared via a new and simple synthetic route and characterized by field emission scanning electron microscopy, X-ray diffraction, and Fourier transform infrared spectroscopy. The catalytic activity of the prepared γ-Al2O3 NPs was investigated for the one-pot, four-component synthesis of fused tri-heterocyclic compounds containing pyrazole, pyran, and pyrimidine. This procedure offers advantages such as high efficiency, simplicity, fast reaction rates, and environmental safety. Keywords: alumina nanoparticles, one-pot, fused tri-heterocyclic compounds, pyran
Procedia PDF Downloads 332
1641 Crystallization Based Resolution of Enantiomeric and Diastereomeric Derivatives of myo-Inositol
Authors: Nivedita T. Patil, M. T. Patil, M. S. Shashidhar, R. G. Gonnade
Abstract:
Cyclitols are cycloalkane polyols that have attracted attention because of their numerous biological and pharmaceutical properties. Among these, the inositols are important cyclitols that constitute a group of naturally occurring polyhydric alcohols. Myo-, scyllo-, muco-, neo-, and D-chiro-inositol are naturally occurring structural isomers of inositol, while the other four isomers (L-chiro-, allo-, epi-, and cis-inositol) are derived from myo-inositol by chemical synthesis. Myo-inositol, the most abundant isomer, plays an important role in signal transduction processes and in the treatment of type 2 diabetes, bacterial infections, stimulation of menstruation, ovulation in polycystic ovary syndrome, improvement of osteogenesis, and treatment of neurological disorders. Considering the vast applications of these derivatives, it becomes important to supply such compounds for further studies in quantitative amounts, but the synthesis of the suitably protected chiral inositol derivatives that serve as key intermediates in most of these syntheses is difficult. Chiral inositol derivatives are also of interest to synthetic organic chemists, as they could serve as potential starting materials for the synthesis of several natural products and their analogs. Obtaining chiral myo-inositol derivatives in a more eco-friendly way is therefore a pressing need in current inositol chemistry. The resolution of nonracemates by preferential crystallization of enantiomers has not previously been reported as a method for inositol derivatives. We are optimistic that this work might lead to the development of the two tosylate enantiomers as synthetic chiral pool molecules for organic synthesis. Resolution of racemic 4-O-benzyl 6-O-tosyl myo-inositol 1,3,5-orthoformate was successfully achieved on a multigram scale by preferential crystallization, which is a more scalable and eco-friendly method of separation than other reported methods.
The separation of the conglomeric mixture of tosylate was achieved by suspending the mixture in ethyl acetate till the level of saturation is obtained. To this saturated clear solution was added seed crystal of the desired enantiomers. The filtration of the precipitated seed was carried out at its filtration window to get enantiomerically enriched tosylate, and the process was repeated alternatively. These enantiomerically enriched samples were recrystallized to get tosylate as pure enantiomers. The configuration of the resolved enantiomers was determined by converting it to previously reported dibenzyl ether myo-inositol, which is an important precursor for mono- and tetraphosphates. We have also developed a convenient and practical method for the preparation of enantiomeric 4-O and 6-O-allyl myo-inositol orthoesters by resolution of diastereomeric allyl dicamphante orthoesters on multigram scale. These allyl ethers can be converted to other chiral protected myo-inositol derivatives using routine synthetic transformations. The chiral allyl ethers can be obtained in gram quantities, and the methods are amenable to further scale-up due to the simple procedures involved. We believe that the work described enhances the pace of research to understand the intricacies of the myo-inositol cycle as the methods described provide efficient access to enantiomeric phosphoinositols, cyclitols, and their derivatives from the abundantly available myo-inositol as a starting material.Keywords: cyclitols, diastereomers, enantiomers, myo-inositol, preferential crystallization, signal transduction
Procedia PDF Downloads 142
1640 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence
Authors: Gus Calderon, Richard McCreight, Tammy Schwartz
Abstract:
Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5” ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community is divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. 
To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners’ understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch’s defensible space maps was combined with Black Swan’s patented approach, which uses 39 other risk characteristics, into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education. Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk
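The abstract above mentions spectral vegetation index maps derived from the multispectral bands but does not name the index used. A common choice is the Normalized Difference Vegetation Index (NDVI); the sketch below is a hedged illustration only, and the band values and fuel threshold are assumptions, not FireWatch's actual processing:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index, computed per pixel."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance bands: live vegetation reflects strongly in near-infrared.
nir = np.array([[0.6, 0.5], [0.1, 0.4]])
red = np.array([[0.1, 0.1], [0.1, 0.2]])

index = ndvi(nir, red)
# Pixels above a chosen (hypothetical) threshold could be flagged as live fuel.
fuel_mask = index > 0.4
```

A per-parcel fuel estimate would then aggregate `fuel_mask` within each parcel polygon; the actual classification described in the abstract is object-based and machine-learning assisted, which this sketch does not attempt to reproduce.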
Procedia PDF Downloads 108
1639 Advanced Biosensor Characterization of Phage-Mediated Lysis in Real-Time and under Native Conditions
Authors: Radka Obořilová, Hana Šimečková, Matěj Pastucha, Jan Přibyl, Petr Skládal, Ivana Mašlaňová, Zdeněk Farka
Abstract:
Due to the spread of antimicrobial resistance, alternative approaches to combat superinfections are being sought, both in the field of lysing agents and in methods for studying bacterial lysis. Suitable alternatives to antibiotics are phage therapy and enzybiotics, for which it is also necessary to study the mechanism of action. Biosensor-based techniques allow rapid detection of pathogens in real time, verification of sensitivity to commonly used antimicrobial agents, and selection of suitable lysis agents. The detection of lysis takes place on the surface of the biosensor with immobilized bacteria, which has the potential to be used to study biofilms. An example of such a biosensor is surface plasmon resonance (SPR), which records the kinetics of bacterial lysis based on a change in the resonance angle. The bacteria are immobilized on the surface of the SPR chip, and the action of the phage is monitored as mass loss after the typical lytic cycle delay. Atomic force microscopy (AFM) is a technique for imaging samples on a surface. In contrast to electron microscopy, it has the advantage of real-time imaging under the native conditions of the nutrient medium. In our case, Staphylococcus aureus was lysed using the enzyme lysostaphin and phage P68 from the family Podoviridae at 37 °C. In addition to visualization, AFM was used to study changes in mechanical properties during lysis, which resulted in a reduction of Young’s modulus (E) after disruption of the bacterial wall. Changes in E reflect the stiffness of the bacterium. These advanced methods provide deeper insight into bacterial lysis and can help in the fight against bacterial diseases. Keywords: biosensors, atomic force microscopy, surface plasmon resonance, bacterial lysis, Staphylococcus aureus, phage P68
Procedia PDF Downloads 134
1638 Assessment of Hepatosteatosis Among Diabetic and Nondiabetic Patients Using Biochemical Parameters and Noninvasive Imaging Techniques
Authors: Tugba Sevinc Gamsiz, Emine Koroglu, Ozcan Keskin
Abstract:
Aim: Nonalcoholic fatty liver disease (NAFLD) is considered the most common chronic liver disease in the general population. The higher mortality and morbidity among NAFLD patients and the lack of symptoms make early detection and management important. In our study, we aimed to evaluate the relationship between noninvasive imaging and biochemical markers in diabetic and nondiabetic patients diagnosed with NAFLD. Materials and Methods: The study was conducted from September 2017 to December 2017 on adults admitted to Internal Medicine and Gastroenterology outpatient clinics with hepatic steatosis reported on ultrasound or transient elastography within the previous six months; patients with other liver diseases or alcohol abuse were excluded. The data were collected and analyzed retrospectively. The Number Cruncher Statistical System (NCSS) 2007 program was used for statistical analysis. Results: 116 patients were included in this study. Diabetic patients had significantly higher Controlled Attenuation Parameter (CAP), Liver Stiffness Measurement (LSM), and fibrosis values than nondiabetics. Hypertension, hepatomegaly, high BMI, hypertriglyceridemia, hyperglycemia, high HbA1c, and hyperuricemia were also found to be risk factors for NAFLD progression to fibrosis. Advanced fibrosis (F3, F4) was present in 18.6% of all patients: 35.8% of diabetic and 5.7% of nondiabetic patients diagnosed with hepatic steatosis. Conclusion: Transient elastography is now used in daily clinical practice as an accurate noninvasive tool during follow-up of patients with fatty liver. Early diagnosis of the stage of liver fibrosis improves the monitoring and management of patients, especially those meeting metabolic syndrome criteria. Keywords: diabetes, elastography, fatty liver, fibrosis, metabolic syndrome
Procedia PDF Downloads 153
1637 A Radiomics Approach to Predict the Evolution of Prostate Imaging Reporting and Data System Score 3/5 Prostate Areas in Multiparametric Magnetic Resonance
Authors: Natascha C. D'Amico, Enzo Grossi, Giovanni Valbusa, Ala Malasevschi, Gianpiero Cardone, Sergio Papa
Abstract:
Purpose: To characterize, through a radiomic approach, the nature of areas classified as PI-RADS (Prostate Imaging Reporting and Data System) 3/5, recognized in multiparametric prostate magnetic resonance with T2-weighted (T2w), diffusion, and perfusion sequences with paramagnetic contrast. Methods and Materials: 24 cases undergoing multiparametric prostate MR and biopsy were admitted to this pilot study. The clinical outcome of the PI-RADS 3/5 areas was established through biopsy, which found 8 malignant tumours. The analysed images were acquired with a Philips Achieva 1.5 T scanner using a contrast-enhanced T2-weighted sequence in the axial plane. Semi-automatic tumour segmentation was carried out on the MR images using the 3D Slicer image analysis software. 45 shape-based, intensity-based, and texture-based features were extracted and represented the input for preprocessing. An evolutionary algorithm (a TWIST system based on the KNN algorithm) was used to subdivide the dataset into training and testing sets and to select the features yielding the maximal amount of information. After this preprocessing, 20 input variables were selected, and different machine learning systems were used to develop a predictive model based on a training-testing crossover procedure. Results: The best machine learning system (a three-layer feed-forward neural network) obtained a global accuracy of 90% (80% sensitivity and 100% specificity) with an ROC area of 0.82. Conclusion: Machine learning systems coupled with radiomics show promising potential in distinguishing benign from malignant tumours in PI-RADS 3/5 areas. Keywords: machine learning, MR prostate, PI-RADS 3, radiomics
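The accuracy, sensitivity, and specificity figures reported above follow directly from the confusion matrix of a binary classifier. A minimal sketch of that computation (the labels below are synthetic, chosen only so the illustrative output matches the 90%/80%/100% pattern reported, not the study's actual predictions):

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity (true positive rate) and specificity (true negative rate)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# Illustrative labels (1 = malignant): 8 of 10 positives detected, no false alarms.
y_true = [1] * 10 + [0] * 10
y_pred = [1] * 8 + [0] * 2 + [0] * 10
acc, sens, spec = binary_metrics(y_true, y_pred)  # 0.90, 0.80, 1.00
```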
Procedia PDF Downloads 188
1636 Using Biopolymer Materials to Enhance Sandy Soil Behavior
Authors: Mohamed Ayeldeen, Abdelazim Negm
Abstract:
Nowadays, the strength characteristics of soils are of increasing importance due to increasing building loads. In some projects, the geotechnical properties of soils are improved using man-made materials ranging from cement-based to chemical-based products. These materials have proven successful in improving engineering properties of the soil such as shear strength, compressibility, permeability, and bearing capacity. However, the use of these artificial injection formulas often modifies the pH level of the soil and contaminates soil and groundwater, owing to their toxic and hazardous characteristics. Recently, an environmentally friendly soil treatment method, the Biological Treatment Method (BTM), has been used to bond particles of loose sandy soils. This paper presents the preliminary results of using biopolymers for strengthening cohesionless soil. Xanthan gum was identified for further study over a range of concentrations varying from 0.25% to 2.00%. Xanthan gum is a nontoxic polysaccharide secreted by the bacterium Xanthomonas campestris and used as a food additive. A series of direct shear, unconfined compressive strength, and permeability tests were carried out to investigate the behavior of sandy soil treated with xanthan gum at different concentration ratios and different curing times. Laser microscopy imaging was also conducted to study the microstructure of the treated sand. Experimental results demonstrated the suitability of xanthan gum for improving the geotechnical properties of sandy soil. Depending on the biopolymer concentration, it was observed that the biopolymer effectively increased the cohesion intercept and stiffness of the treated sand and reduced its permeability. The microscopy imaging indicates that the cross-links of the biopolymer through and over the soil particles increase with increasing biopolymer concentration. Keywords: biopolymer, direct shear, permeability, sand, shear strength, xanthan gum
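The cohesion intercept mentioned above comes from fitting the Mohr-Coulomb failure criterion, tau = c + sigma_n * tan(phi), to direct shear results. A minimal sketch of that fit (the stress values below are illustrative, not data from the study):

```python
import numpy as np

# Direct shear results: peak shear stress tau at several normal stresses sigma_n (kPa).
# These numbers are invented for illustration only.
sigma_n = np.array([50.0, 100.0, 200.0])
tau = np.array([45.0, 80.0, 150.0])

# Mohr-Coulomb: tau = c + sigma_n * tan(phi).
# A straight-line least-squares fit recovers cohesion c (intercept)
# and friction angle phi (from the slope).
tan_phi, c = np.polyfit(sigma_n, tau, 1)
phi_deg = np.degrees(np.arctan(tan_phi))  # friction angle in degrees
```

For a biopolymer-treated sand, repeating this fit at each concentration and curing time would show the cohesion intercept `c` increasing with concentration, as the abstract reports.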
Procedia PDF Downloads 277
1635 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients
Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho
Abstract:
Multiple sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of detail it provides, is the gold standard exam for diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for subsequent analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and is time-consuming due to various intra- and inter-operator variabilities. Nowadays, computerized methods for MRI segmentation have been extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate the brain volume in MRI scans of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was brain extraction by skull stripping from the original image. In the skull stripper for brain MRI, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (edge of the brain-skull border with dedicated expansion, curvature, and advection terms).
In the second step, brain volume quantification was performed by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5). Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper
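The voxel-counting step above reduces to multiplying the number of mask voxels by the physical volume of one voxel. A minimal sketch with a synthetic mask (the voxel spacing is an assumption for illustration; the study does not report it):

```python
import numpy as np

# Synthetic 3D segmentation mask: 1 inside the brain, 0 elsewhere.
mask = np.zeros((30, 100, 100), dtype=np.uint8)  # 30 slices, as in the study
mask[5:25, 20:80, 20:80] = 1

# Assumed voxel spacing in mm (slice thickness x in-plane resolution);
# in practice this is read from the DICOM/NIfTI header.
voxel_mm = (5.0, 1.0, 1.0)
voxel_volume_mm3 = voxel_mm[0] * voxel_mm[1] * voxel_mm[2]

# Volume = voxel count x voxel volume; 1 cc = 1000 mm^3.
volume_cc = mask.sum() * voxel_volume_mm3 / 1000.0
```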
Procedia PDF Downloads 147
1634 Subsurface Exploration for Soil Geotechnical Properties and its Implications for Infrastructure Design and Construction in Victoria Island, Lagos, Nigeria
Authors: Sunday Oladele, Joseph Oluwagbeja Simeon
Abstract:
Subsurface exploration, integrating geotechnical and geophysical methods, of a planned construction site in the coastal city of Lagos, Nigeria, has been carried out with the aim of characterizing the soil properties and their implications for the proposed infrastructural development. Six Standard Penetration Tests (SPT), fourteen Dutch Cone Penetrometer Tests (DCPT), and 2D electrical resistivity imaging employing dipole-dipole and pole-dipole arrays were implemented on the site. The topsoil (0-4 m) consists of highly compacted sandy lateritic clay (10 to 5595 Ωm) to 1.25 m in some parts and dense sand in other parts to a depth of 5.50 m. This topsoil was characterized as a material of very high shear strength (≤ 150 kg/m2) with an allowable bearing pressure of 54 kN/m2 to 85 kN/m2 at a safety factor of 2.5. Soft amorphous peat/peaty clay (0.1 to 11.4 Ωm), 3-6 m thick, underlies the lateritic clay to a depth of about 18 m. Grey, medium dense to very dense sand (0.37 to 2387 Ωm) with occasional gravels underlies the peaty clay down to 30 m depth. Within this layer, the freshwater-bearing zones are characterized by a high resistivity response (83 to 2387 Ωm), while the clayey sand/saline-water-intruded sand produced a subdued resistivity response (0.37 to 40 Ωm). The overall ground-bearing pressure for the proposed structure would be 225 kN/m2. Bored/cast-in-place piles at 18.00 m depth with any of these diameters and respective safe working loads, 600 mm/1,140 kN, 800 mm/2,010 kN, and 1000 mm/3,150 kN, are recommended for the proposed multi-story structure. Keywords: subsurface exploration, geotechnical properties, resistivity imaging, pile
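The allowable bearing pressure quoted above is tied to the ultimate bearing capacity through the stated safety factor of 2.5. As a back-of-envelope sketch (the ultimate values below are back-calculated from the reported allowable range, not measured quantities from the study):

```python
# Allowable bearing pressure = ultimate bearing capacity / factor of safety.
FS = 2.5  # factor of safety stated in the abstract

def allowable_pressure(q_ultimate_kn_m2, fs=FS):
    """Allowable bearing pressure (kN/m2) from an ultimate capacity and safety factor."""
    return q_ultimate_kn_m2 / fs

# The reported allowable range of 54-85 kN/m2 implies ultimate capacities of:
q_ult_low = 54 * FS    # 135 kN/m2
q_ult_high = 85 * FS   # 212.5 kN/m2
```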
Procedia PDF Downloads 94
1633 Right Cerebellar Stroke with a Right Vertebral Artery Occlusion Following an Embolization of the Right Glomus Tympanicum Tumor
Authors: Naim Izet Kajtazi
Abstract:
Context: Although rare, the glomus tumor (i.e., nonchromaffin chemodectomas and paragangliomas) is the most common middle ear tumor, with female predominance. Pre-operative embolization is often required to devascularize the hypervascular tumor for better surgical outcomes. Process: A 35-year-old female presented with episodes of frequent dizziness, ear fullness, and right ear tinnitus for 12 months. Head imaging revealed a right glomus tympanicum tumor. She underwent pre-operative endovascular embolization of the glomus tympanicum tumor with surgical, cyanoacrylate-based glue. Immediately after the procedure, she developed drowsiness and severe pain in the right temporal region. Further investigations revealed a right cerebellar stroke in the posterior inferior cerebellar artery territory. She was treated with intravenous heparin, followed by one year of oral anticoagulation. With rehabilitation, she significantly recovered from her post-embolization stroke. However, the tumor was resected at another institution. Ten years later, follow-up imaging indicated a gradual increase in the size of the glomus jugulare tumor, compressing nearby critical vascular structures. She subsequently received radiation therapy to treat the residual tumor. Outcome: Currently, she has no neurological deficit, but her mild dizziness, right ear tinnitus, and hearing impairment persist. Relevance: This case highlights the complex nature of these tumors, which often pose challenges to patients as well as treatment teams. A multi-disciplinary team approach is necessary to tailor the management plan for individual tumors. Although embolization is a safe procedure, careful attention and thorough anatomic knowledge of dangerous anastomoses are essential to avoid devastating complications. Complications occur due to encountered vessel anomalies, new anastomoses formed during the gluing, and changes in hemodynamics. Keywords: stroke, embolization, MRI brain, cerebral angiogram
Procedia PDF Downloads 71
1632 Recycled Cellulosic Fibers and Lignocellulosic Aggregates for Sustainable Building Materials
Authors: N. Stevulova, I. Schwarzova, V. Hospodarova, J. Junak, J. Briancin
Abstract:
Sustainability is becoming a priority for developers, and the use of environmentally friendly materials is increasing. Nowadays, the application of raw materials from renewable sources to building materials has gained significant interest in this research area. Lignocellulosic aggregates and cellulosic fibers come from many different sources, such as wood, plants, and waste. They are promising alternative materials to replace synthetic, glass, and asbestos fibers as reinforcement in the inorganic matrix of composites. Natural fibers are renewable resources, so their cost is relatively low in comparison to synthetic fibers. From the perspective of environmental consciousness, natural fibers are biodegradable, so their use can reduce CO2 emissions in building materials production. The use of cellulosic fibers in cementitious matrices has gained importance because they make the composites lighter at high fiber content, they have cost-performance ratios comparable to similar building materials, and they can be processed from waste paper, thus expanding the opportunities for waste utilization in cementitious materials. The main objective of this work is to explore the possibility of using different wastes, hemp hurds as a waste of hemp stem processing and recycled fibers obtained from waste paper, for making cement composite products such as mortars based on cellulose fibers. This material was made of cement mortar containing organic filler based on hemp hurds and recycled waste paper. In addition, the effects of the fibers and their contents on selected physical and mechanical properties of the fiber-cement plaster composites have been investigated. In this research, organic material was added to the mortars at 2.0, 5.0, and 10.0% replacement of cement by weight. A reference sample was made for comparison of the physical and mechanical properties of cement composites based on recycled cellulosic fibers and lignocellulosic aggregates.
The prepared specimens were tested after 28 days of curing in order to investigate density, compressive strength and water absorbability. Scanning Electron Microscopy examination was also carried out.Keywords: Hemp hurds, organic filler, recycled paper, sustainable building materials
Procedia PDF Downloads 223
1631 Comparison of Support Vector Machines and Artificial Neural Network Classifiers in Characterizing Threatened Tree Species Using Eight Bands of WorldView-2 Imagery in Dukuduku Landscape, South Africa
Authors: Galal Omer, Onisimo Mutanga, Elfatih M. Abdel-Rahman, Elhadi Adam
Abstract:
Threatened tree species (TTS) play a significant role in ecosystem functioning and services, land use dynamics, and other socio-economic aspects. Such aspects include ecological, economic, livelihood, security-based, and well-being benefits. The development of techniques for mapping and monitoring TTS is thus critical for understanding the functioning of ecosystems. The advent of advanced imaging systems and supervised learning algorithms has provided an opportunity to classify TTS over a fragmenting landscape. Recently, vegetation maps have been produced using advanced imaging systems such as WorldView-2 (WV-2) and robust classification algorithms such as support vector machines (SVM) and artificial neural networks (ANN). However, delineation of TTS in a fragmenting landscape using high-resolution imagery has widely remained elusive due to the complexity of the species structure and their distribution. Therefore, the objective of the current study was to examine the utility of the advanced WV-2 data for mapping TTS in the fragmenting Dukuduku indigenous forest of South Africa using the SVM and ANN classification algorithms. The results showed the robustness of the two machine learning algorithms, with an overall accuracy (OA) of 77.00% (total disagreement = 23.00%) for SVM and 75.00% (total disagreement = 25.00%) for ANN using all eight bands of WV-2 (8B). This study concludes that the SVM and ANN classification algorithms with WV-2 8B have the potential to classify TTS in the Dukuduku indigenous forest. The study offers relatively accurate information that is important for forest managers making informed decisions regarding management and conservation protocols for TTS. Keywords: artificial neural network, threatened tree species, indigenous forest, support vector machines
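The overall accuracy and total disagreement figures reported above are standard summaries of a classification error (confusion) matrix: OA is the proportion of reference samples on the matrix diagonal, and total disagreement is its complement. A minimal sketch (the error matrix below is invented for illustration, not the study's accuracy assessment):

```python
import numpy as np

# Illustrative error (confusion) matrix for a three-class TTS map:
# rows = reference class, columns = mapped class.
error_matrix = np.array([
    [50,  5,  5],
    [ 6, 40,  4],
    [ 5,  5, 30],
])

total = error_matrix.sum()
correct = np.trace(error_matrix)            # samples on the diagonal
overall_accuracy = correct / total          # OA
total_disagreement = 1.0 - overall_accuracy # complement of OA
```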
Procedia PDF Downloads 515
1630 Biodegradable Poly-ε-Caprolactone-Based Siloxane Polymer
Authors: Maria E. Fortună, Elena Ungureanu, Răzvan Rotaru, Valeria Harabagiu
Abstract:
Polymers are used in a variety of areas due to their unique mechanical and chemical properties. Natural polymers are biodegradable, whereas synthetic polymers are rarely biodegradable but can be modified. As a result, by combining the benefits of natural and synthetic polymers, biodegradable composite materials can be obtained with potential for biomedical and environmental applications. However, because of their strong resistance to degradation, it may be difficult to eliminate polymer waste, so interest in developing biodegradable polymers has risen significantly. This research involves obtaining and characterizing two biodegradable poly-ε-caprolactone-polydimethylsiloxane copolymers. A comparative study was conducted using aminopropyl-terminated polydimethylsiloxane macroinitiators with two distinct molecular weights. The copolymers were obtained by ring-opening polymerization of ε-caprolactone in the presence of aminopropyl-terminated polydimethylsiloxane as initiator and comonomer, with stannous 2-ethylhexanoate as catalyst. The materials were characterized using a number of techniques, including NMR, FTIR, EDX, SEM, AFM, and DSC. Additionally, the water contact angle and water vapor sorption capacity were assessed. Furthermore, the copolymers were examined for environmental susceptibility by conducting biological tests on tomato plants (Lycopersicon esculentum), with an emphasis on biological stability and metabolism. Following the copolymer's degradation, nitrogen dynamics change, confirming that the process proceeds with the release of organic nitrogen.
The biological tests performed (germination index, average seedling height, green and dry biomass) on Lycopersicon esculentum, San Marzano variety, tomato plants in direct contact with the copolymer indicated normal growth and development, suggesting a minimal toxic effect and, by extension, compatibility of the copolymer with the environment. The total chlorophyll concentration of plant leaves in contact with the copolymers was determined, considering the pigment's critical role in photosynthesis and, implicitly, in plant metabolism and physiological state. Keywords: biodegradable, biological stability, copolymers, polydimethylsiloxane
Procedia PDF Downloads 24
1629 Effect of Synthetic L-Lysine and DL-Methionine Amino Acids on Performance of Broiler Chickens
Authors: S. M. Ali, S. I. Mohamed
Abstract:
Reduction of feed cost for broiler production is of utmost importance in decreasing the cost of production. The objectives of this study were to evaluate the use of synthetic amino acids (L-lysine and DL-methionine) instead of super concentrate, and groundnut cake versus meat powder as protein sources. A total of 180 male broiler chicks (Cobb strain) at 15 days of age (DOA) were selected according to their average body weight (380 g) from a broiler flock at Elbashair Farm. The chicks were randomly divided into six groups of 30 chicks. Each group was further subdivided into three replicates of 10 birds. Six experimental diets were formulated. The first diet contained groundnut cake and super concentrate as the control (GNC + C); in the second diet, meat powder and super concentrate (MP + C) were used. The third diet contained groundnut cake and amino acids (GNC + AA); the fourth diet contained meat powder and amino acids (MP + AA). The fifth diet contained groundnut cake, meat powder, and super concentrate (GNC + MP + C), and the sixth diet contained groundnut cake, meat powder, and amino acids (GNC + MP + AA). The formulated rations were randomly assigned to the subgroups in a completely randomized design of six treatments and three replicates. Weekly feed intake, body weight, and mortality were recorded, and body weight gain and feed conversion ratio (FCR) were calculated. At the end of the experiment (49 DOA), nine birds from each treatment were slaughtered. Live body weight, carcass weight, head, shank, and some internal organ (gizzard, heart, liver, small intestine, and abdominal fat pad) weights were taken. For the overall experimental period, the (GNC + MP + C) group consumed significantly (P≤0.01) the highest cumulative feed, while the (MP + AA) group consumed the lowest amount of feed. The (GNC + C) and (GNC + AA) groups had the heaviest live body weights, while (MP + AA) had the lowest live body weight.
The overall FCR was significantly (P≤0.01) best for the (GNC + AA) group, while the (MP + AA) group showed the worst FCR. The (GNC + AA) group also had significantly (P≤0.01) the lowest abdominal fat pad weight. The (GNC + MP + C) group had the highest dressing percentage, while the (MP + AA) group had the lowest. It is concluded that amino acids can be used instead of super concentrate in broiler feeding with comparable performance at lower cost, and that using meat powder together with amino acids is not advisable.
Keywords: broiler chickens, L-lysine, DL-methionine, performance
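The FCR comparison above reduces to simple arithmetic; as a minimal illustration (the per-bird figures below are hypothetical placeholders, not measurements from this study):

```python
def feed_conversion_ratio(feed_intake_g, weight_gain_g):
    """FCR = feed consumed / body weight gained; a lower value is better."""
    return feed_intake_g / weight_gain_g

# Hypothetical cumulative per-bird figures (grams, 15-49 DOA) for two diet groups
groups = {
    "GNC + AA": {"feed": 4200.0, "gain": 2400.0},
    "MP + AA":  {"feed": 4300.0, "gain": 2000.0},
}
for name, g in groups.items():
    print(name, round(feed_conversion_ratio(g["feed"], g["gain"]), 2))
```

With these assumed numbers, the GNC + AA group would show the lower (better) FCR, mirroring the direction of the reported result.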
Procedia PDF Downloads 268
1628 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models
Authors: Ainouna Bouziane
Abstract:
The ability of electron tomography to recover the 3D structure of catalysts with spatial resolution at the subnanometer scale has been widely explored and reviewed in recent decades. A variety of experimental techniques, based on either Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of the imaging artifacts related to diffraction phenomena that arise with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using compressed sensing total variation minimization (CS-TVM) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is an important issue that has not yet been properly addressed, because a perfectly known reference is needed. The problem becomes particularly complicated for multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction and segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions that considers the influence of relevant experimental parameters such as the range of tilt angles, the image noise level and the object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.
Keywords: electron tomography, supported catalysts, nanometrology, error assessment
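The kind of phantom-based accuracy check described above can be sketched as follows. This is not the authors' pipeline: the spherical phantom, the simulated over-segmentation and the two metrics (Dice overlap and relative volume error) are generic stand-ins for comparing a segmented reconstruction against a perfectly known 3D reference.

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Overlap between a binary segmentation and the phantom ground truth (1.0 = perfect)."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    return 2.0 * inter / (seg.sum() + ref.sum())

def volume_error(seg, ref, voxel_volume=1.0):
    """Relative error of the segmented volume vs. the known phantom volume."""
    v_seg = seg.astype(bool).sum() * voxel_volume
    v_ref = ref.astype(bool).sum() * voxel_volume
    return (v_seg - v_ref) / v_ref

# Toy phantom: a sphere of radius 10 voxels in a 64^3 volume
z, y, x = np.mgrid[:64, :64, :64]
phantom = ((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2) <= 10 ** 2
# Simulated over-segmentation: the same sphere segmented one voxel too large
segmented = ((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2) <= 11 ** 2

print(f"Dice: {dice_coefficient(segmented, phantom):.3f}")
print(f"Relative volume error: {volume_error(segmented, phantom):+.1%}")
```

Since the phantom is analytically defined, any bias introduced by reconstruction or segmentation shows up directly in these numbers, which is precisely what makes a known reference indispensable.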
Procedia PDF Downloads 88
1627 Source-Detector Trajectory Optimization for Target-Based C-Arm Cone Beam Computed Tomography
Authors: S. Hatamikia, A. Biguri, H. Furtado, G. Kronreif, J. Kettenbach, W. Birkfellner
Abstract:
Three-dimensional cone beam CT (CBCT) has become a widespread routine clinical imaging modality for interventional radiology. In conventional CBCT, a circular source-detector trajectory is used to acquire a large number of 2D projections in order to reconstruct a 3D volume. However, the radiation dose accumulated through the repetitive use of CBCT during intraoperative procedures, as well as for daily pretreatment patient alignment in radiotherapy, has become a concern. It is of great importance for both health care providers and patients to decrease the radiation dose required for these interventional images. It is therefore desirable to find optimized source-detector trajectories with a reduced number of projections that lead to dose reduction. In this study, we investigate source-detector trajectories with optimal arbitrary orientations chosen to maximize the quality of the reconstructed image in particular regions of interest. To this end, we developed a box phantom consisting of several small polytetrafluoroethylene target spheres placed at regular distances throughout the phantom. Each sphere serves as a target inside a particular region of interest. We use the 3D point spread function (PSF) as a measure of reconstruction quality, quantifying the spatial variance of the local PSF associated with each target in terms of its full width at half maximum (FWHM). A lower FWHM indicates better spatial resolution of the reconstruction at the target area. One important feature of interventional radiology is that the imaging targets are very well known, since prior knowledge of the patient anatomy (e.g. a preoperative CT) is usually available. We therefore use a CT scan of the box phantom as this prior knowledge and treat it as the digital phantom in our simulations to find the optimal trajectory for a specific target.
Based on the simulation phase, we obtain the optimal trajectory, which can then be applied on the device in a real situation. We used a Philips Allura FD20 Xper C-arm geometry for both the simulations and the real data acquisition. Our experimental results, based on both simulated and real data, show that the proposed optimization scheme can find trajectories with a minimal number of projections that still localize the targets. The optimized trajectories localize the targets as well as a standard circular trajectory while using only one third of the projections. In conclusion, we demonstrate that a minimal dedicated set of projections with optimized orientations is sufficient to localize targets and may minimize the radiation dose.
Keywords: CBCT, C-arm, reconstruction, trajectory optimization
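The FWHM-of-the-local-PSF metric used above can be sketched in one dimension. This is a minimal illustration, not the authors' 3D measurement code: it assumes a sampled PSF profile and locates the half-maximum crossings by linear interpolation.

```python
import numpy as np

def fwhm(profile, spacing=1.0):
    """Full width at half maximum of a sampled 1D PSF profile.

    Half-maximum crossings are located by linear interpolation between
    the samples straddling each crossing; `spacing` converts from
    sample index to physical units (e.g. mm per voxel).
    """
    p = np.asarray(profile, dtype=float)
    half = p.max() / 2.0
    idx = np.where(p >= half)[0]
    left, right = idx[0], idx[-1]
    # Left crossing lies between sample left-1 (below half) and left (above)
    if left > 0:
        x_left = left - (p[left] - half) / (p[left] - p[left - 1])
    else:
        x_left = float(left)
    # Right crossing lies between sample right (above half) and right+1 (below)
    if right < p.size - 1:
        x_right = right + (p[right] - half) / (p[right] - p[right + 1])
    else:
        x_right = float(right)
    return (x_right - x_left) * spacing

# Gaussian PSF with sigma = 2 samples; analytic FWHM = 2*sqrt(2*ln 2)*sigma ≈ 4.71
x = np.arange(-20, 21)
psf = np.exp(-x ** 2 / (2 * 2.0 ** 2))
print(f"estimated FWHM: {fwhm(psf):.2f}")
```

In the study's setting, this measurement would be repeated along each axis of the local 3D PSF at every target sphere, and the per-target FWHM values compared between the optimized and the standard circular trajectory.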
Procedia PDF Downloads 132