Search results for: rocking curve imaging
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2270

1580 The Reliability Analysis of Concrete Chimneys Due to Random Vortex Shedding

Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta

Abstract:

Chimneys are generally tall and slender structures with circular cross-sections, due to which they are highly prone to wind forces. Wind exerts pressure on the wall of the chimneys, which produces unwanted forces. Vortex-induced oscillation is one such excitation, which can lead to the failure of the chimneys. Therefore, vortex-induced oscillation of chimneys is of great concern to researchers and practitioners, since many failures of chimneys due to vortex shedding have occurred in the past. As a consequence, extensive research has taken place on the subject over decades. Many laboratory experiments have been performed to verify the theoretical models proposed to predict vortex-induced forces, including aero-elastic effects. Comparatively few prototype measurement data have been recorded to verify the proposed theoretical models. For this reason, the theoretical models developed with the help of experimental laboratory data are utilized for analyzing chimneys for vortex-induced forces. This calls for a reliability analysis of the predicted responses of chimneys to vortex shedding. Although a considerable literature exists on the vortex-induced oscillation of chimneys, including code provisions, the reliability analysis of chimneys against failure due to vortex shedding is scarce. In the present study, the reliability analysis of chimneys against vortex shedding failure is presented, assuming the uncertainty in the vortex shedding phenomenon to be significantly greater than the other uncertainties, which are therefore ignored. The vortex shedding is modeled as a stationary random process and is represented by a power spectral density function (PSDF). It is assumed that the vortex shedding forces are perfectly correlated and act over the top one-third height of the chimney. The PSDF of the tip displacement of the chimney is obtained by performing a frequency-domain spectral analysis using a matrix approach.
For this purpose, both the chimney and the random wind forces are discretized over a number of points along the height of the chimney. The method of analysis duly accounts for the aero-elastic effects. The double-barrier threshold crossing rate, as proposed by Vanmarcke, is used for determining the probability of crossing different threshold levels of the tip displacement of the chimney. Assuming the annual distribution of the mean wind velocity to be a Gumbel type-I distribution, the fragility curve denoting the variation of the annual probability of threshold crossing against different threshold levels of the tip displacement of the chimney is determined. The reliability estimate is derived from the fragility curve. A 210 m tall concrete chimney with a base diameter of 35 m, a top diameter of 21 m, and a wall thickness of 0.3 m has been taken as an illustrative example. The terrain condition is assumed to be that corresponding to a city center. The expression for the PSDF of the vortex shedding force is taken from Vickery and Basu. The results of the study show that the threshold crossing reliability of the tip displacement of the chimney is significantly influenced by the assumed structural damping and the Gumbel distribution parameters. Further, the aero-elastic effect influences the reliability estimate to a great extent for small structural damping.
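The Gumbel-based exceedance step above can be sketched numerically. This is a minimal illustration, assuming hypothetical location and scale parameters (mu, beta) and a purely wind-speed-based crossing criterion; the actual study derives crossing probabilities from Vanmarcke's double-barrier formulation.

```python
import math

def gumbel_cdf(v, mu, beta):
    """Gumbel type-I CDF of the annual mean wind velocity."""
    return math.exp(-math.exp(-(v - mu) / beta))

def annual_exceedance(v_crit, mu, beta):
    """Annual probability that the wind velocity exceeds v_crit."""
    return 1.0 - gumbel_cdf(v_crit, mu, beta)

# Hypothetical distribution parameters (location mu, scale beta, in m/s)
mu, beta = 25.0, 4.0

# Fragility-style curve: annual exceedance probability vs. critical velocity
curve = [(v, annual_exceedance(v, mu, beta)) for v in (20.0, 30.0, 40.0)]
```

Sweeping `v_crit` over the velocities at which each displacement threshold is crossed traces out a fragility curve of the kind described in the abstract.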

Keywords: chimney, fragility curve, reliability analysis, vortex-induced vibration

Procedia PDF Downloads 144
1579 Understanding Chromosome Movement in Starfish Oocytes

Authors: Bryony Davies

Abstract:

Many cell and tissue culture practices ignore the effects of gravity on cell biology, and little is known about how cell components may move in response to gravitational forces. Starfish oocytes provide an excellent model for interrogating the movement of cell components due to their unusually large size, ease of handling, and high transparency. Chromosomes from starfish oocytes can be visualised by microinjection of a histone H2B-mCherry plasmid into the oocytes. The movement of the chromosomes can then be tracked by live-cell fluorescence microscopy. The results from experiments using these methods suggest that there is a replicable downward movement of centrally located chromosomes at a median velocity of 0.39 μm/min. Chromosomes nearer the nuclear boundary showed more restricted movement. Chromosome density and shape could also be altered by microinjection of restriction enzymes, primarily AluI, before imaging. This was found to alter the speed of chromosome movement, with chromosomes from AluI-injected nuclei showing a median downward velocity of 0.60 μm/min. Overall, these results suggest that there is a non-negligible movement of chromosomes in response to gravitational forces and that this movement can be altered by enzyme activity. Future directions based on these results could interrogate whether this observed downward movement extends to other cell components and to other cell types. Additionally, it may be important to understand whether gravitational orientation and vertical positioning of cell components alter cell behaviour. The findings here may have implications for current cell culture practices, which do not replicate the cell orientations or external forces experienced in vivo. It is possible that a failure to account for gravitational forces in 2D cell culture alters experimental results and the accuracy of conclusions drawn from them. Understanding possible behavioural changes in cells due to the effects of gravity would therefore be beneficial.

Keywords: starfish, oocytes, live-cell imaging, microinjection, chromosome dynamics

Procedia PDF Downloads 89
1578 Key Transfer Protocol Based on Non-invertible Numbers

Authors: Luis A. Lizama-Perez, Manuel J. Linares, Mauricio Lopez

Abstract:

We introduce a method to perform remote user authentication based on what we call non-invertible cryptography. It exploits the fact that the multiplication of an invertible integer and a non-invertible integer in a ring Zn produces a non-invertible integer, making it infeasible to compute a factorization. The protocol requires the smallest key size when compared with the main public-key algorithms such as Diffie-Hellman, Rivest-Shamir-Adleman, or elliptic curve cryptography. Since the only avenue we found for an eavesdropper is an exhaustive search over the keys, the protocol appears to be post-quantum secure.
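The ring-arithmetic fact the protocol relies on — multiplying an invertible element of Zn by a non-invertible one yields a non-invertible element — is easy to check numerically. A minimal sketch, using a small hypothetical modulus n = 15 chosen only for illustration:

```python
from math import gcd

def is_invertible(x, n):
    """x is invertible in the ring Z_n iff gcd(x, n) == 1."""
    return gcd(x % n, n) == 1

n = 15          # hypothetical modulus for illustration
a = 7           # invertible: gcd(7, 15) == 1
b = 6           # non-invertible: gcd(6, 15) == 3 > 1

# The product of an invertible and a non-invertible element is non-invertible,
# because gcd(b, n) > 1 still divides (a * b) mod n and n.
product = (a * b) % n
```

In a real deployment n would be a large integer; the gcd argument is independent of its size.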

Keywords: invertible, non-invertible, ring, key transfer

Procedia PDF Downloads 164
1577 Polymer Mediated Interaction between Grafted Nanosheets

Authors: Supriya Gupta, Paresh Chokshi

Abstract:

Polymer-particle interactions can be effectively utilized to produce composites that possess physicochemical properties superior to those of the neat polymer. The incorporation of fillers with dimensions comparable to the polymer chain size produces composites with extraordinary properties owing to a very high surface-to-volume ratio. The dispersion of nanoparticles is achieved by inducing steric repulsion, realized by grafting polymeric chains onto the particles. A comprehensive understanding of the interparticle interaction between these functionalized nanoparticles plays an important role in the synthesis of a stable polymer nanocomposite. With the focus on the incorporation of clay sheets in a polymer matrix, we theoretically construct the polymer-mediated interparticle potential for two nanosheets grafted with polymeric chains. The self-consistent field theory (SCFT) is employed to obtain the inhomogeneous composition field under equilibrium. Unlike continuum models, SCFT is built from a microscopic description, taking into account the molecular interactions contributed by both intra- and inter-chain potentials. We present the results of SCFT calculations of the interaction potential curve for two grafted nanosheets immersed in a matrix of polymeric chains of chemistry dissimilar to that of the grafted chains. The interaction potential is repulsive at short separation and shows depletion attraction at moderate separations induced by high grafting density. It is found that the strength of the attraction well can be tuned by altering the compatibility between the grafted and the mobile chains. Further, we construct the interaction potential between two nanosheets grafted with diblock copolymers, with one of the blocks being chemically identical to the free polymeric chains.
The interplay between the enthalpic interaction between the dissimilar species and the entropy of the free chains gives rise to a rich behavior in interaction potential curve obtained for two separate cases of free chains being chemically similar to either the grafted block or the free block of the grafted diblock chains.

Keywords: clay nanosheets, polymer brush, polymer nanocomposites, self-consistent field theory

Procedia PDF Downloads 241
1576 Proposals of Exposure Limits for Infrasound From Wind Turbines

Authors: M. Pawlaczyk-Łuszczyńska, T. Wszołek, A. Dudarewicz, P. Małecki, M. Kłaczyński, A. Bortkiewicz

Abstract:

Human tolerance to infrasound is defined by the hearing threshold. Infrasound that cannot be heard (or felt) is not annoying and is not thought to have any other adverse or health effects. Recent research has largely confirmed earlier findings. ISO 7196:1995 recommends the use of the G-weighting characteristic for the assessment of infrasound, and there is a strong correlation between the G-weighted SPL and annoyance perception. The aim of this study was to propose exposure limits for infrasound from wind turbines. Only a few countries have set limits for infrasound; these limits are usually no higher than 85-92 dBG, and none of them are specific to wind turbines. Over the years, a number of studies have been carried out to determine hearing thresholds below 20 Hz. It has been recognized that 10% of young people would be able to perceive 10 Hz at around 90 dB, and it has also been found that the difference in median hearing thresholds between young adults aged around 20 years and older adults aged over 60 years is around 10 dB, irrespective of frequency. This shows that people up to about 60 years of age retain good hearing in the low-frequency range, while their sensitivity to higher frequencies is often significantly reduced. In terms of exposure limits for infrasound, the average hearing threshold corresponds to a tone with a G-weighted SPL of about 96 dBG; in contrast, infrasound at Lp,G levels below 85-90 dBG is usually inaudible. The individual hearing threshold can therefore be 10-15 dB lower than the average threshold, so the recommended limits for environmental infrasound could be 75 dBG or 80 dBG. It is worth noting that the G86 curve has been taken as the threshold of auditory perception of infrasound reached by 90-95% of the population, so the G75 and G80 curves can be taken as the criterion curves for wind turbine infrasound.
Finally, two assessment methods and corresponding exposure limit values have been proposed for wind turbine infrasound, i.e. method I - based on G-weighted sound pressure level measurements and method II - based on frequency analysis in 1/3-octave bands in the frequency range 4-20 Hz. Separate limit values have been set for outdoor living areas in the open countryside (Area A) and for noise sensitive areas (Area B). In the case of Method I, infrasound limit values of 80 dBG (for areas A) and 75 dBG (for areas B) have been proposed, while in the case of Method II - criterion curves G80 and G75 have been chosen (for areas A and B, respectively).

Keywords: infrasound, exposure limit, hearing thresholds, wind turbines

Procedia PDF Downloads 60
1575 Hydraulic Characteristics of Mine Tailings by Metaheuristics Approach

Authors: Akhila Vasudev, Himanshu Kaushik, Tadikonda Venkata Bharat

Abstract:

Large quantities of mine tailings are produced every year as part of the extraction process of phosphates, gold, copper, and other materials. Mine tailings are high in water content and have very slow dewatering behavior. The efficient design of tailings dams and the economical disposal of these slurries require knowledge of the tailings' consolidation behavior. The large-strain consolidation theory closely predicts the self-weight consolidation of these slurries, as it accounts for the conservation of mass and momentum and treats the hydraulic conductivity as a function of the void ratio. Classical laboratory techniques, such as the settling column test, the seepage consolidation test, etc., are expensive and time-consuming for estimating the variation of hydraulic conductivity with void ratio. Inverse estimation of the constitutive relationships from measured settlement versus time curves is therefore explored. In this work, an inverse analysis based on metaheuristic techniques is explored for predicting the hydraulic conductivity parameters of mine tailings from the base excess pore water pressure dissipation curve and the initial conditions of the mine tailings. The proposed inverse model uses the particle swarm optimization (PSO) algorithm, which is based on the social behavior of animals searching for food sources. The finite-difference numerical solution of the forward analytical model is integrated with the PSO algorithm to solve the inverse problem. The method is tested on synthetic data of base excess pore pressure dissipation curves generated using the finite difference method. The effectiveness of the method is verified using a base excess pore pressure dissipation curve obtained from a settling column experiment and further ensured through comparison with available predicted hydraulic conductivity parameters.
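The inverse-analysis loop described above — a forward model nested inside a particle swarm search — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the forward finite-difference solver is replaced by a hypothetical two-parameter exponential dissipation model, and the PSO hyperparameters are illustrative defaults.

```python
import math
import random

random.seed(0)

def forward_model(t, k, c):
    """Hypothetical stand-in for the finite-difference forward solution:
    exponential dissipation of base excess pore pressure."""
    return c * math.exp(-k * t)

# Synthetic "observed" dissipation curve with true parameters k=0.5, c=100.
times = [i * 0.5 for i in range(20)]
observed = [forward_model(t, 0.5, 100.0) for t in times]

def misfit(params):
    """Sum of squared errors between simulated and observed curves."""
    k, c = params
    return sum((forward_model(t, k, c) - o) ** 2 for t, o in zip(times, observed))

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization with position clamping."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp to the search bounds to keep the forward model stable
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, err = pso(misfit, bounds=[(0.01, 2.0), (1.0, 200.0)])
```

In the paper's setting, `forward_model` would be the finite-difference solution of the large-strain consolidation equation and `params` the hydraulic conductivity parameters.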

Keywords: base excess pore pressure, hydraulic conductivity, large strain consolidation, mine tailings

Procedia PDF Downloads 122
1574 Curve Designing Using an Approximating 4-Point C^2 Ternary Non-Stationary Subdivision Scheme

Authors: Muhammad Younis

Abstract:

A ternary 4-point approximating non-stationary subdivision scheme is introduced that generates a family of $C^2$ limit curves. The theory of asymptotic equivalence is used to analyze the convergence and smoothness of the scheme. The proposed scheme is compared with existing 4-point ternary approximating schemes using different examples, which show that the limit curves of the proposed scheme behave more pleasingly and can also generate conic sections.
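The refinement structure of such a scheme — each step triples the number of control points, with each new point an affine combination of four consecutive coarse points — can be sketched generically. The masks below are hypothetical placeholders chosen only to sum to one (affine invariance); the actual scheme uses level-dependent, non-stationary weights derived in the paper.

```python
def ternary_refine(points, masks):
    """One ternary subdivision step: each window of four consecutive coarse
    points contributes three new points via the given affine masks."""
    fine = []
    for i in range(len(points) - 3):
        window = points[i:i + 4]
        for mask in masks:
            fine.append(sum(w * p for w, p in zip(mask, window)))
    return fine

# Hypothetical illustrative masks; each row sums to 1 so that the scheme
# reproduces constants (a necessary condition for convergence).
MASKS = [
    (0.50, 0.45, 0.04, 0.01),
    (0.15, 0.35, 0.35, 0.15),
    (0.01, 0.04, 0.45, 0.50),
]

coarse = [0.0, 1.0, 3.0, 2.0, 4.0, 5.0]
fine = ternary_refine(coarse, MASKS)   # 3 new points per interior window
```

Iterating `ternary_refine` drives the control polygon toward the limit curve; the smoothness of that limit depends entirely on the chosen masks.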

Keywords: ternary, non-stationary, approximation subdivision scheme, convergence and smoothness

Procedia PDF Downloads 459
1573 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence

Authors: Gus Calderon, Richard McCreight, Tammy Schwartz

Abstract:

Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5” ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community is divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. 
To assist homeowners contending with increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners’ understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch’s defensible space maps were combined with Black Swan’s patented approach, using 39 other risk characteristics, into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education.
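The spectral vegetation index maps mentioned above are typically built from per-pixel band ratios of the multispectral imagery. As a hedged illustration (the abstract does not name the specific index used), the widely used NDVI can be computed from hypothetical near-infrared and red reflectances:

```python
def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index for one pixel.
    Ranges from -1 to 1; higher values indicate denser green vegetation."""
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectance values: healthy vegetation reflects strongly in
# the near-infrared and absorbs red light, while bare soil does not.
vegetated = ndvi(0.50, 0.08)   # high NDVI -> dense vegetative fuel
bare_soil = ndvi(0.25, 0.20)   # low NDVI  -> sparse fuel
```

Thresholding such an index map, combined with distance-to-structure data, is one way vegetative fuels near buildings can be flagged for mitigation.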

Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk

Procedia PDF Downloads 93
1572 Spectroscopic Constant Calculation of the BeF Molecule

Authors: Nayla El-Kork, Farah Korjieh, Ahmed Bentiba, Mahmoud Korek

Abstract:

Ab initio calculations have been performed to investigate the spectroscopic constants of the diatomic compound BeF. Values of the internuclear distance Re, the harmonic frequency ωe, the rotational constant Be, the electronic transition energy with respect to the ground state Te, the eigenvalues Ev, the abscissas of the turning points Rmin and Rmax, the rotational constants Bv, and the centrifugal distortion constants Dv have been calculated for the molecule’s ground and excited electronic states. The results are in agreement with experimental data.

Keywords: spectroscopic constant, potential energy curve, diatomic molecule, spectral analysis

Procedia PDF Downloads 556
1571 Advanced Biosensor Characterization of Phage-Mediated Lysis in Real-Time and under Native Conditions

Authors: Radka Obořilová, Hana Šimečková, Matěj Pastucha, Jan Přibyl, Petr Skládal, Ivana Mašlaňová, Zdeněk Farka

Abstract:

Due to the spread of antimicrobial resistance, alternative approaches to combat superinfections are being sought, both in the field of lysing agents and in methods for studying bacterial lysis. A suitable alternative to antibiotics is phage therapy and enzybiotics, for which it is also necessary to study the mechanism of action. Biosensor-based techniques allow rapid detection of pathogens in real time, verification of sensitivity to commonly used antimicrobial agents, and selection of suitable lysis agents. The detection of lysis takes place on the surface of the biosensor with immobilized bacteria, which has the potential to be used to study biofilms. An example of such a biosensor is surface plasmon resonance (SPR), which records the kinetics of bacterial lysis based on a change in the resonance angle. The bacteria are immobilized on the surface of the SPR chip, and the action of the phage is monitored as a mass loss after the typical lytic-cycle delay. Atomic force microscopy (AFM) is a technique for imaging sample surfaces. In contrast to electron microscopy, it has the advantage of real-time imaging under the native conditions of the nutrient medium. In our case, Staphylococcus aureus was lysed using the enzyme lysostaphin and phage P68 from the family Podoviridae at 37 °C. In addition to visualization, AFM was used to study changes in mechanical properties during lysis, which resulted in a reduction of Young’s modulus (E) after disruption of the bacterial wall. Changes in E reflect the stiffness of the bacterium. These advanced methods provide deeper insight into bacterial lysis and can help in the fight against bacterial diseases.

Keywords: biosensors, atomic force microscopy, surface plasmon resonance, bacterial lysis, staphylococcus aureus, phage P68

Procedia PDF Downloads 122
1570 Application of Groundwater Level Data Mining in Aquifer Identification

Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen

Abstract:

Investigation and research are key to the conjunctive use of surface and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually based on geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system, reflecting the system response that combines the hydrogeological structure, groundwater injection, and extraction. This study applies analytical tools to the observation database to develop a methodology for the identification of confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression curve analysis, and a decision tree. The developed methodology is then applied to groundwater layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis uses the Fourier transform to process time-series groundwater-level observation data and analyze the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviors. The decision tree uses the information obtained from the abovementioned analytical tools and optimizes the best estimate of the hydrogeological structure.
The developed method achieves a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This high accuracy indicates that the developed methodology is an effective tool for identifying hydrogeological structures.
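The daily-frequency amplitude extraction can be sketched with a single-frequency discrete Fourier transform. The series below is synthetic — a hypothetical 0.3 m daily pumping signal on hourly data — whereas the real methodology processes observed groundwater levels.

```python
import math

def amplitude_at_frequency(series, cycles_per_sample):
    """Single-frequency DFT amplitude (2/N scaling) of a uniformly sampled series."""
    n = len(series)
    re = sum(x * math.cos(2 * math.pi * cycles_per_sample * t)
             for t, x in enumerate(series))
    im = sum(x * math.sin(2 * math.pi * cycles_per_sample * t)
             for t, x in enumerate(series))
    return 2.0 * math.hypot(re, im) / n

# Synthetic hourly groundwater levels: a 0.3 m daily pumping oscillation
# (period 24 samples) superposed on a constant level of 10 m.
hours = range(24 * 30)                                   # 30 days of hourly data
levels = [10.0 + 0.3 * math.sin(2 * math.pi * t / 24) for t in hours]

daily_amp = amplitude_at_frequency(levels, 1.0 / 24.0)   # recovers ~0.3 m
```

A strong daily-frequency amplitude in an observation well is the signature of artificial extraction used by the classification step.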

Keywords: aquifer identification, decision tree, groundwater, Fourier transform

Procedia PDF Downloads 143
1569 Assessment of Hepatosteatosis Among Diabetic and Nondiabetic Patients Using Biochemical Parameters and Noninvasive Imaging Techniques

Authors: Tugba Sevinc Gamsiz, Emine Koroglu, Ozcan Keskin

Abstract:

Aim: Nonalcoholic fatty liver disease (NAFLD) is considered the most common chronic liver disease in the general population. The higher mortality and morbidity among NAFLD patients and the lack of symptoms make early detection and management important. In our study, we aimed to evaluate the relationship between noninvasive imaging and biochemical markers in diabetic and nondiabetic patients diagnosed with NAFLD. Materials and Methods: The study was conducted from September 2017 to December 2017 on adults admitted to the Internal Medicine and Gastroenterology outpatient clinics with hepatic steatosis reported on ultrasound or transient elastography within the previous six months, excluding patients with other liver diseases or alcohol abuse. The data were collected and analyzed retrospectively. The Number Cruncher Statistical System (NCSS) 2007 program was used for statistical analysis. Results: 116 patients were included in this study. Diabetic patients had significantly higher Controlled Attenuation Parameter (CAP), Liver Stiffness Measurement (LSM), and fibrosis values than nondiabetics. Hypertension, hepatomegaly, high BMI, hypertriglyceridemia, hyperglycemia, high HbA1c, and hyperuricemia were also found to be risk factors for NAFLD progression to fibrosis. Advanced fibrosis (F3, F4) was present in 18.6% of all patients: 35.8% of diabetic and 5.7% of nondiabetic patients diagnosed with hepatic steatosis. Conclusion: Transient elastography is now used in daily clinical practice as an accurate noninvasive tool in the follow-up of patients with fatty liver. Early diagnosis of the stage of liver fibrosis improves the monitoring and management of patients, especially those with metabolic syndrome criteria.

Keywords: diabetes, elastography, fatty liver, fibrosis, metabolic syndrome

Procedia PDF Downloads 133
1568 A Radiomics Approach to Predict the Evolution of Prostate Imaging Reporting and Data System Score 3/5 Prostate Areas in Multiparametric Magnetic Resonance

Authors: Natascha C. D'Amico, Enzo Grossi, Giovanni Valbusa, Ala Malasevschi, Gianpiero Cardone, Sergio Papa

Abstract:

Purpose: To characterize, through a radiomic approach, the nature of areas classified as PI-RADS (Prostate Imaging Reporting and Data System) 3/5, recognized in multiparametric prostate magnetic resonance with T2-weighted (T2w), diffusion, and perfusion sequences with paramagnetic contrast. Methods and Materials: 24 cases undergoing multiparametric prostate MR and biopsy were admitted to this pilot study. The clinical outcome of the PI-RADS 3/5 areas was determined through biopsy, which found 8 malignant tumours. The analysed images were acquired with a Philips Achieva 1.5 T scanner with a CE T2-weighted sequence in the axial plane. Semi-automatic tumour segmentation was carried out on the MR images using the 3DSlicer image analysis software. 45 shape-based, intensity-based, and texture-based features were extracted and represented the input for preprocessing. An evolutionary algorithm (a TWIST system based on the KNN algorithm) was used to subdivide the dataset into training and testing sets and select the features yielding the maximal amount of information. After this preprocessing, 20 input variables were selected, and different machine learning systems were used to develop a predictive model based on a training-testing crossover procedure. Results: The best machine learning system (a three-layer feed-forward neural network) obtained a global accuracy of 90% (80% sensitivity and 100% specificity) with an area under the ROC curve of 0.82. Conclusion: Machine learning systems coupled with radiomics show promising potential in distinguishing benign from malignant tumours in PI-RADS 3/5 areas.
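The reported accuracy, sensitivity, and specificity follow directly from a confusion matrix. A minimal sketch, using a hypothetical 10-case test split chosen only to reproduce the reported percentages (the actual split is not given in the abstract):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Accuracy, sensitivity, and specificity from confusion-matrix counts."""
    total = tp + fn + tn + fp
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),   # true-positive rate on malignant cases
        "specificity": tn / (tn + fp),   # true-negative rate on benign cases
    }

# Hypothetical split consistent with the reported figures:
# 4 of 5 malignant areas detected, all 5 benign areas correctly rejected.
m = diagnostic_metrics(tp=4, fn=1, tn=5, fp=0)
```

With these counts the function returns 90% accuracy, 80% sensitivity, and 100% specificity, matching the abstract's reported results.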

Keywords: machine learning, MR prostate, PI-RADS 3, radiomics

Procedia PDF Downloads 174
1567 Using Biopolymer Materials to Enhance Sandy Soil Behavior

Authors: Mohamed Ayeldeen, Abdelazim Negm

Abstract:

Nowadays, the strength characteristics of soils have gained importance due to increasing building loads. In some projects, the geotechnical properties of soils are improved using man-made materials ranging from cement-based to chemical-based. These materials have proven successful in improving engineering properties of the soil such as shear strength, compressibility, permeability, bearing capacity, etc. However, the use of these artificial injection formulas often modifies the pH level of the soil and contaminates soil and groundwater, which is attributed to their toxic and hazardous characteristics. Recently, an environmentally friendly soil treatment approach, the Biological Treatment Method (BTM), has been used to bond particles of loose sandy soils. This research paper presents the preliminary results of using biopolymers for strengthening cohesionless soil. Xanthan gum was identified for further study over a range of concentrations varying from 0.25% to 2.00%. Xanthan gum is a polysaccharide secreted by the bacterium Xanthomonas campestris; it is used as a food additive and is nontoxic. A series of direct shear, unconfined compressive strength, and permeability tests were carried out to investigate the behavior of sandy soil treated with xanthan gum at different concentration ratios and different curing times. Laser microscopy imaging was also conducted to study the microstructure of the treated sand. The experimental results demonstrated the capability of xanthan gum to improve the geotechnical properties of sandy soil. Depending on the biopolymer concentration, it was observed that the biopolymers effectively increased the cohesion intercept and stiffness of the treated sand and reduced its permeability. The microscopy imaging indicates that the cross-links of the biopolymers through and over the soil particles increase with increasing biopolymer concentration.
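The cohesion intercept reported from the direct shear tests comes from fitting the Mohr-Coulomb envelope τ = c + σ tan φ to pairs of normal stress and peak shear strength. A minimal least-squares sketch with hypothetical stress values (not the paper's data):

```python
import math

def mohr_coulomb_fit(normal_stress, shear_strength):
    """Least-squares fit of tau = c + sigma * tan(phi) to direct shear data.
    Returns the cohesion intercept c (same units as input) and the
    friction angle phi in degrees."""
    n = len(normal_stress)
    sx = sum(normal_stress)
    sy = sum(shear_strength)
    sxx = sum(x * x for x in normal_stress)
    sxy = sum(x * y for x, y in zip(normal_stress, shear_strength))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # tan(phi)
    c = (sy - slope * sx) / n                            # cohesion intercept
    return c, math.degrees(math.atan(slope))

# Hypothetical direct shear results (kPa) for a biopolymer-treated sand;
# a nonzero intercept reflects the cohesion added by the biopolymer.
sigma = [50.0, 100.0, 200.0]
tau = [55.0, 90.0, 160.0]
c, phi = mohr_coulomb_fit(sigma, tau)
```

Untreated clean sand would plot through the origin (c ≈ 0); the growth of c with biopolymer concentration is the effect the abstract describes.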

Keywords: biopolymer, direct shear, permeability, sand, shear strength, Xanthan gum

Procedia PDF Downloads 250
1566 Carrying Capacity Estimation for Small Hydro Plant Located in Torrential Rivers

Authors: Elena Carcano, James Ball, Betty Tiko

Abstract:

Carrying capacity refers to the maximum population that a given level of resources can sustain over a specific period. In undisturbed environments, the maximum population is determined by the availability and distribution of resources, as well as the competition for their utilization. This information is typically obtained through long-term data collection. In regulated environments, where resources are artificially modified, populations must adapt to changing conditions, which can lead to additional challenges due to fluctuations in resource availability over time and throughout development. An example of this is observed in hydropower plants, which alter water flow and impact fish migration patterns and behaviors. To assess how fish species can adapt to these changes, specialized surveys are conducted, which provide valuable information on fish populations, sample sizes, and density before and after flow modifications. In such situations, it is highly recommended to conduct hydrological and biological monitoring to gain insight into how flow reductions affect species adaptability and to prevent unfavorable exploitation conditions. This analysis involves several planned steps that help design appropriate hydropower production while simultaneously addressing environmental needs. Consequently, the study aims to strike a balance between technical assessment, biological requirements, and societal expectations. Beginning with a small hydro project that requires restoration, this analysis focuses on the lower tail of the flow duration curve (FDC), where both hydrological and environmental goals can be met. The proposed approach involves determining the threshold condition that is tolerable for the most vulnerable species sampled (Telestes muticellus) by identifying a low flow value from the long-term FDC.
The results establish a practical connection between hydrological and environmental information and simplify the process by establishing a single reference flow value that represents the minimum environmental flow that should be maintained.
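Extracting a single low-flow reference value from the lower tail of a long-term FDC can be sketched as follows; the daily series and the 95% exceedance level below are hypothetical illustrations, not the study's data.

```python
def flow_duration_curve(flows):
    """Return (exceedance probability, flow) pairs: flows sorted descending,
    with Weibull plotting position p = rank / (n + 1)."""
    ordered = sorted(flows, reverse=True)
    n = len(ordered)
    return [((rank + 1) / (n + 1), q) for rank, q in enumerate(ordered)]

def low_flow(flows, exceedance=0.95):
    """Flow equalled or exceeded `exceedance` of the time (lower FDC tail)."""
    for p, q in flow_duration_curve(flows):
        if p >= exceedance:
            return q
    return min(flows)

# Hypothetical daily flow record (m^3/s): values 1..99 for illustration.
daily_flows = list(range(1, 100))
q95 = low_flow(daily_flows, exceedance=0.95)
```

The resulting Q95-style value plays the role of the single reference flow the study proposes as the minimum environmental flow.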

Keywords: carrying capacity, fish bypass ladder, long-term streamflow duration curve, eta-beta method, environmental flow

Procedia PDF Downloads 14
1565 Laser Beam Bending via Lenses

Authors: Remzi Yildirim, Fatih V. Çelebi, H. Haldun Göktaş, A. Behzat Şahin

Abstract:

This study concerns a single-component cylindrical lens with a gradient curve that we used for bending laser beams. It operates under atmospheric conditions and bends the laser beam independently of temperature, pressure, polarity, polarization, magnetic field, electric field, radioactivity, and gravity. A single-piece cylindrical lens that can bend laser beams is presented. The lenses are made of transparent, tinted, or colored glass and are used for attenuating or absorbing the energy of the laser beams.

Keywords: laser, bending, lens, light, nonlinear optics

Procedia PDF Downloads 469
1564 Laser Light Bending via Lenses

Authors: Remzi Yildirim, Fatih V. Çelebi, H. Haldun Göktaş, A. Behzat Şahin

Abstract:

This study concerns a single-component cylindrical lens with a gradient curve that we used for bending laser beams. It operates under atmospheric conditions and bends the laser beam independently of temperature, pressure, polarity, polarization, magnetic field, electric field, radioactivity, and gravity. A single-piece cylindrical lens that can bend laser beams is presented. The lenses are made of transparent, tinted, or colored glass and are used for attenuating or absorbing the energy of the laser beams.

Keywords: laser, bending, lens, light, nonlinear optics

Procedia PDF Downloads 680
1563 Comparative Efficacy of Angiotensin-Converting Enzyme Inhibitors and Angiotensin Receptor Blockers in Patients with Heart Failure in Tanzania: A Prospective Cohort Study

Authors: Mark P. Mayala, Henry Mayala, Khuzeima Khanbhai

Abstract:

Background: Heart failure has been a rising concern in Tanzania. New drugs have been introduced, including the angiotensin receptor-neprilysin inhibitors (ARNIs), but due to their high cost, angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin receptor blockers (ARBs) have been mostly used in Tanzania. However, to our knowledge, the efficacy of the two groups has not yet been compared in Tanzania. The aim of this study was to compare the efficacy of ACEIs and ARBs among patients with heart failure. Methodology: This was a hospital-based prospective cohort study done at the Jakaya Kikwete Cardiac Institute (JKCI), Tanzania, from June to December 2020. Patients fulfilling the inclusion criteria were enrolled consecutively. Clinical details were measured at baseline. We assessed the relationship between ARB and ACEI use and N-terminal pro-brain natriuretic peptide (NT pro-BNP) levels at admission and at 1-month follow-up using a chi-square test. A Kaplan-Meier curve was used to estimate the survival of the two groups. Results: 155 HF patients were enrolled, with a mean age of 48 years; 52.3% were male, and their mean left ventricular ejection fraction (LVEF) was 37.3%. 52 (33.5%) heart failure patients were on ACEIs, 57 (36.8%) on ARBs, and 46 (29.7%) were on neither ACEIs nor ARBs. Only 82 (52.9%) patients received guideline-directed medical therapy (GDMT). A drop in NT pro-BNP levels between admission and 1-month follow-up was observed in both groups, from 6389.2 pg/ml to 4000.1 pg/ml for ARB users and from 5877.7 pg/ml to 1328.2 pg/ml for ACEI users. There was no statistical difference between the two treated groups on the Kaplan-Meier estimate, though more deaths were observed in those on neither ACEIs nor ARBs, with a calculated P value of 0.01.
Conclusion: This study suggests that ACEIs had greater efficacy and better overall clinical outcomes than ARBs, but the choice should be made on a patient-by-patient basis, considering the side effects of ACEIs and patient adherence.
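The chi-square comparison used in the study can be illustrated with a minimal sketch. The 2x2 contingency table below is hypothetical, not the study's data; only the test statistic is computed, and in practice a p-value would be obtained from the chi-square distribution (e.g., via SciPy).

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical table: rows = ACEI/ARB users, cols = NT pro-BNP fell / did not fall
stat = chi_square_statistic([[10, 20], [20, 10]])
```

With one degree of freedom, a statistic this large (about 6.67) would correspond to p < 0.05.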

Keywords: angiotensin-converting enzyme inhibitors, angiotensin receptor blockers, guideline-directed medical therapy, N-terminal pro-brain natriuretic peptide

Procedia PDF Downloads 72
1562 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients

Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho

Abstract:

Multiple Sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of the information it provides, is the gold standard exam for diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the rates of normal aging. Thus, brain volume quantification becomes an essential task for analyzing the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and is time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation have been extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate the brain volume in MRI of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was brain extraction by skull stripping of the original image. In the skull stripper for brain MRI, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (edge of the brain-skull border with dedicated expansion, curvature, and advection terms).
In the second step, brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
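The voxel-counting step can be sketched as follows. This is a hedged, pure-Python illustration, not the authors' pipeline; it assumes a binary mask as nested lists and voxel spacing given in millimeters.

```python
def brain_volume_cc(mask, voxel_dims_mm):
    """Volume of a binary segmentation mask in cubic centimeters.

    mask: nested lists (slices x rows x cols) of 0/1 labels.
    voxel_dims_mm: (dx, dy, dz) voxel spacing in millimeters.
    """
    dx, dy, dz = voxel_dims_mm
    # Count voxels labelled 1, then scale by the single-voxel volume.
    n_voxels = sum(v for sl in mask for row in sl for v in row)
    return n_voxels * dx * dy * dz / 1000.0  # 1 cc = 1000 mm^3
```

With a real scan, the same count-and-scale logic would be applied to the eroded level-set mask, using the voxel spacing from the image header.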

Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper

Procedia PDF Downloads 130
1561 Subsurface Exploration for Soil Geotechnical Properties and Its Implications for Infrastructure Design and Construction in Victoria Island, Lagos, Nigeria

Authors: Sunday Oladele, Joseph Oluwagbeja Simeon

Abstract:

Subsurface exploration of a planned construction site in the coastal city of Lagos, Nigeria, integrating geotechnical and geophysical methods, has been carried out with the aim of characterizing the soil properties and their implications for the proposed infrastructural development. Six Standard Penetration Tests (SPT), fourteen Dutch Cone Penetrometer Tests (DCPT), and 2D Electrical Resistivity Imaging employing dipole-dipole and pole-dipole arrays were implemented on the site. The topsoil (0-4 m) consists of highly compacted sandy lateritic clay (10 to 5595 Ωm) to 1.25 m in some parts and dense sand in other parts to 5.50 m depth. This topsoil was characterized as a material of very high shear strength (≤ 150 kg/m²) with an allowable bearing pressure of 54 kN/m² to 85 kN/m² and a safety factor of 2.5. Soft amorphous peat/peaty clay (0.1 to 11.4 Ωm), 3-6 m thick, underlies the lateritic clay to about 18 m depth. Grey, medium dense to very dense sand (0.37 to 2387 Ωm) with occasional gravels underlies the peaty clay down to 30 m depth. Within this layer, the freshwater-bearing zones are characterized by a high resistivity response (83 to 2387 Ωm), while the clayey sand/saline-water-intruded sand produced a subdued resistivity output (0.37 to 40 Ωm). The overall ground-bearing pressure for the proposed structure would be 225 kN/m². Bored/cast-in-place piles at 18.00 m depth with any of these diameters and respective safe working loads, 600 mm/1,140 kN, 800 mm/2,010 kN, and 1000 mm/3,150 kN, are recommended for the proposed multi-story structure.

Keywords: subsurface exploration, geotechnical properties, resistivity imaging, pile

Procedia PDF Downloads 72
1560 Right Cerebellar Stroke with a Right Vertebral Artery Occlusion Following an Embolization of the Right Glomus Tympanicum Tumor

Authors: Naim Izet Kajtazi

Abstract:

Context: Although rare, the glomus tumor (i.e., nonchromaffin chemodectoma or paraganglioma) is the most common middle ear tumor, with a female predominance. Pre-operative embolization is often required to devascularize the hypervascular tumor for better surgical outcomes. Process: A 35-year-old female presented with episodes of frequent dizziness, ear fullness, and right ear tinnitus for 12 months. Head imaging revealed a right glomus tympanicum tumor. She underwent pre-operative endovascular embolization of the tumor with a surgical cyanoacrylate-based glue. Immediately after the procedure, she developed drowsiness and severe pain in the right temporal region. Further investigations revealed a right cerebellar stroke in the posterior inferior cerebellar artery territory. She was treated with intravenous heparin, followed by one year of oral anticoagulation. With rehabilitation, she significantly recovered from her post-embolization stroke. The tumor was later resected at another institution. Ten years later, follow-up imaging indicated a gradual increase in the size of the glomus jugulare tumor, compressing nearby critical vascular structures. She subsequently received radiation therapy to treat the residual tumor. Outcome: Currently, she has no neurological deficit, but her mild dizziness, right ear tinnitus, and hearing impairment persist. Relevance: This case highlights the complex nature of these tumors, which often pose challenges to patients as well as treatment teams. A multi-disciplinary team approach is necessary to tailor the management plan to the individual tumor. Although embolization is a safe procedure, careful attention and thorough anatomical knowledge of dangerous anastomoses are essential to avoid devastating complications. Complications can occur due to vessel anomalies, new anastomoses formed during gluing, and changes in hemodynamics.

Keywords: stroke, embolization, MRI brain, cerebral angiogram

Procedia PDF Downloads 59
1559 Comparison of Support Vector Machines and Artificial Neural Network Classifiers in Characterizing Threatened Tree Species Using Eight Bands of WorldView-2 Imagery in Dukuduku Landscape, South Africa

Authors: Galal Omer, Onisimo Mutanga, Elfatih M. Abdel-Rahman, Elhadi Adam

Abstract:

Threatened tree species (TTS) play a significant role in ecosystem functioning and services, land use dynamics, and other socio-economic aspects. Such aspects include ecological, economic, livelihood, security-based, and well-being benefits. The development of techniques for mapping and monitoring TTS is thus critical for understanding the functioning of ecosystems. The advent of advanced imaging systems and supervised learning algorithms has provided an opportunity to classify TTS over a fragmenting landscape. Recently, vegetation maps have been produced using advanced imaging systems such as WorldView-2 (WV-2) and robust classification algorithms such as support vector machines (SVM) and artificial neural networks (ANN). However, delineation of TTS in a fragmenting landscape using high-resolution imagery has widely remained elusive due to the complexity of the species structure and their distribution. Therefore, the objective of the current study was to examine the utility of the advanced WV-2 data for mapping TTS in the fragmenting Dukuduku indigenous forest of South Africa using SVM and ANN classification algorithms. The results showed the robustness of the two machine learning algorithms, with an overall accuracy (OA) of 77.00% (total disagreement = 23.00%) for SVM and 75.00% (total disagreement = 25.00%) for ANN using all eight bands of WV-2 (8B). This study concludes that SVM and ANN classification algorithms with WV-2 8B have the potential to classify TTS in the Dukuduku indigenous forest. This study offers relatively accurate information that is important for forest managers making informed decisions regarding management and conservation protocols for TTS.
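The overall accuracy and total disagreement figures follow directly from a confusion matrix. A minimal sketch is shown below; the matrix values are illustrative (chosen to reproduce the 77% OA reported for SVM), not the study's actual per-class counts.

```python
def overall_accuracy(confusion):
    """Overall accuracy (OA) and total disagreement from a confusion matrix.

    confusion[i][j] = number of samples of true class i predicted as class j.
    """
    total = sum(sum(row) for row in confusion)
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    oa = correct / total
    return oa, 1.0 - oa

# Illustrative 2-class matrix: 77 of 100 samples on the diagonal
oa, disagreement = overall_accuracy([[40, 10], [13, 37]])
```

The same computation extends unchanged to a multi-class matrix, one row per tree species.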

Keywords: artificial neural network, threatened tree species, indigenous forest, support vector machines

Procedia PDF Downloads 496
1558 Mapping Iron Content in the Brain with Magnetic Resonance Imaging and Machine Learning

Authors: Gabrielle Robertson, Matthew Downs, Joseph Dagher

Abstract:

Iron deposition in the brain has been linked with a host of neurological disorders such as Alzheimer’s, Parkinson’s, and multiple sclerosis. While some treatment options exist, there are no objective measurement tools that allow for monitoring iron levels in the brain in vivo. An emerging Magnetic Resonance Imaging (MRI) method has recently been proposed to deduce iron concentration through quantitative measurement of magnetic susceptibility. This is a multi-step process that involves repeated modeling of physical processes via approximate numerical solutions. For example, the last two steps of this Quantitative Susceptibility Mapping (QSM) method involve (I) mapping magnetic field into magnetic susceptibility and (II) mapping magnetic susceptibility into iron concentration. Process I involves solving an ill-posed inverse problem by using regularization via injection of prior belief. The end result of Process II depends strongly on the model used to describe the molecular content of each voxel (type of iron, water fraction, etc.). Due to these factors, the accuracy and repeatability of QSM have been an active area of research in the MRI and medical imaging community. This work aims to estimate iron concentration in the brain in a single step. A synthetic numerical model of the human head was created by automatically and manually segmenting the human head on a high-resolution grid (640x640x640, 0.4 mm³ voxels), yielding detailed structures such as microvasculature and subcortical regions as well as bone, soft tissue, cerebrospinal fluid, sinuses, arteries, and eyes. Each segmented region was then assigned tissue properties such as relaxation rates, proton density, electromagnetic tissue properties, and iron concentration. These tissue property values were randomly selected from a probability distribution function derived from a thorough literature review.
In addition to having unique tissue property values, different synthetic head realizations also possess unique structural geometry, created by morphing the boundary regions of different areas within normal physical constraints. This model of the human brain is then used to create synthetic MRI measurements. This is repeated thousands of times, for different head shapes, volumes, tissue properties, and noise realizations. Collectively, this constitutes a training set that is similar to in vivo data but larger than the datasets available from clinical measurements. A 3D convolutional U-Net neural network architecture was used to train data-driven deep learning models to solve for iron concentrations from raw MRI measurements. The performance was then tested both on synthetic data not used in training and on real in vivo data. Results showed that the model trained on synthetic MRI measurements is able to directly learn iron concentrations in areas of interest more effectively than existing QSM reconstruction methods. For comparison, models trained on random geometric shapes (as proposed in the DeepQSM method) are less effective than models trained on realistic synthetic head models. Such an accurate method for the quantitative measurement of iron deposits in the brain would be of important value in clinical studies aiming to understand the role of iron in neurological disease.

Keywords: magnetic resonance imaging, MRI, iron deposition, machine learning, quantitative susceptibility mapping

Procedia PDF Downloads 118
1557 Development and Validation of a Coronary Heart Disease Risk Score in Indian Type 2 Diabetes Mellitus Patients

Authors: Faiz N. K. Yusufi, Aquil Ahmed, Jamal Ahmad

Abstract:

Diabetes in India is growing at an alarming rate, and the complications caused by it need to be controlled. Coronary heart disease (CHD) is one such complication whose prediction is addressed in this study. India has the second largest number of diabetes patients in the world. To the best of our knowledge, there is no CHD risk score for Indian type 2 diabetes patients. Any form of CHD was taken as the event of interest. A sample of 750 was determined and randomly collected from the Rajiv Gandhi Centre for Diabetes and Endocrinology, J.N.M.C., A.M.U., Aligarh, India. Collected variables include patient data such as sex, age, height, weight, body mass index (BMI), fasting blood sugar (BSF), postprandial sugar (PP), glycosylated haemoglobin (HbA1c), diastolic blood pressure (DBP), systolic blood pressure (SBP), smoking, alcohol habits, total cholesterol (TC), triglycerides (TG), high density lipoprotein (HDL), low density lipoprotein (LDL), very low density lipoprotein (VLDL), physical activity, duration of diabetes, diet control, history of antihypertensive drug treatment, family history of diabetes, waist circumference, hip circumference, medications, central obesity, and history of CHD. Predictive risk scores for CHD events were designed by Cox proportional hazards regression. Model calibration and discrimination are assessed with the Hosmer-Lemeshow test and the area under the receiver operating characteristic (ROC) curve. Overfitting and underfitting of the model are checked by applying regularization techniques, and the best method is selected among ridge, lasso, and elastic net regression. Youden’s index is used to choose the optimal cut-off point from the scores. The five-year probability of CHD is predicted by both the survival function and a two-state Markov chain model, and the better technique is identified. The risk scores for CHD developed here can be calculated by doctors and patients for self-management of diabetes.
Furthermore, the five-year probabilities can be implemented as well to forecast and maintain the condition of patients.
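Youden's index selects the cut-off along the ROC curve that maximizes J = sensitivity + specificity - 1. The sketch below is a hedged illustration of that selection step with hypothetical risk scores, not the study's data or code.

```python
def youden_cutoff(scores_pos, scores_neg, candidate_cutoffs):
    """Choose the cut-off maximizing Youden's J = sensitivity + specificity - 1.

    scores_pos: risk scores of subjects who experienced the event (CHD).
    scores_neg: risk scores of subjects who did not.
    """
    best_cut, best_j = None, -1.0
    for c in candidate_cutoffs:
        sensitivity = sum(s >= c for s in scores_pos) / len(scores_pos)
        specificity = sum(s < c for s in scores_neg) / len(scores_neg)
        j = sensitivity + specificity - 1.0
        if j > best_j:
            best_cut, best_j = c, j
    return best_cut, best_j
```

In practice the candidate cut-offs would be the observed risk scores themselves, so every operating point on the empirical ROC curve is considered.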

Keywords: coronary heart disease, Cox proportional hazards regression, ROC curve, type 2 diabetes mellitus

Procedia PDF Downloads 207
1556 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of electron tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in recent decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (compressed sensing - total variation minimization) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not been properly addressed yet, because a perfectly known reference is needed. The problem becomes particularly complicated for multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters such as the range of tilt angles, the image noise level, and the object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 65
1555 Source-Detector Trajectory Optimization for Target-Based C-Arm Cone Beam Computed Tomography

Authors: S. Hatamikia, A. Biguri, H. Furtado, G. Kronreif, J. Kettenbach, W. Birkfellner

Abstract:

Nowadays, three-dimensional Cone Beam CT (CBCT) has become a widespread routine clinical imaging modality for interventional radiology. In conventional CBCT, a circular source-detector trajectory is used to acquire a high number of 2D projections in order to reconstruct a 3D volume. However, the accumulated radiation dose due to the repetitive use of CBCT needed for intraoperative procedures, as well as daily pretreatment patient alignment for radiotherapy, has become a concern. It is of great importance for both health care providers and patients to decrease the amount of radiation dose required for these interventional images. Thus, it is desirable to find optimized source-detector trajectories with a reduced number of projections that could lead to dose reduction. In this study, we investigate source-detector trajectories with optimal arbitrary orientations so as to maximize the performance of the reconstructed image at particular regions of interest. To this end, we developed a box phantom containing several small polytetrafluoroethylene target spheres at regular distances throughout the phantom. Each of these spheres serves as a target inside a particular region of interest. We use the 3D Point Spread Function (PSF) as a measure to evaluate the performance of the reconstructed image. We measured the spatial variance in terms of the Full-Width-at-Half-Maximum (FWHM) of the local PSF related to each target. A lower FWHM value indicates better spatial resolution of the reconstruction at the target area. One important feature of interventional radiology is that the imaging targets are very well known, as prior knowledge of patient anatomy (e.g., a preoperative CT) is usually available for interventional imaging. Therefore, we use a CT scan of the box phantom as the prior knowledge and treat it as the digital phantom in our simulations to find the optimal trajectory for a specific target.
The optimal trajectory obtained in the simulation phase can then be applied on the device in a real acquisition. We consider a Philips Allura FD20 Xper C-arm geometry for both the simulations and the real data acquisition. Our experimental results, based on both simulated and real data, show that the proposed optimization scheme has the capacity to find optimized trajectories with a minimal number of projections that localize the targets. The proposed optimized trajectories are able to localize the targets as well as a standard circular trajectory while using just one-third the number of projections. Conclusion: We demonstrate that a minimal dedicated set of projections with optimized orientations is sufficient to localize targets and may minimize radiation dose.
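The FWHM figure of merit used above can be sketched for a sampled 1D PSF profile, with linear interpolation locating the half-maximum crossings on either side of the peak. This is an illustrative helper, not the authors' code; it assumes a single-peaked profile that actually crosses half-maximum on both sides.

```python
def fwhm(profile, spacing=1.0):
    """Full-width at half-maximum of a sampled, single-peaked 1D profile.

    Linearly interpolates between samples to locate the two half-maximum
    crossings; `spacing` is the sample spacing (e.g. mm per voxel).
    """
    half = max(profile) / 2.0

    def crossing(indices):
        # First index pair (walking in the given order) that brackets the
        # half-maximum level; interpolate the fractional position.
        for a, b in zip(indices, indices[1:]):
            if (profile[a] - half) * (profile[b] - half) <= 0 and profile[a] != profile[b]:
                frac = (half - profile[a]) / (profile[b] - profile[a])
                return a + frac * (b - a)
        raise ValueError("profile never crosses half-maximum")

    idx = list(range(len(profile)))
    left = crossing(idx)         # crossing on the rising edge
    right = crossing(idx[::-1])  # crossing on the falling edge
    return (right - left) * spacing
```

For a 3D PSF, the same measurement would be taken along line profiles through the reconstructed target sphere, and a lower value indicates sharper localization.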

Keywords: CBCT, C-arm, reconstruction, trajectory optimization

Procedia PDF Downloads 124
1554 X-Ray Detector Technology Optimization in Computed Tomography

Authors: Aziz Ikhlef

Abstract:

Most multi-slice Computed Tomography (CT) scanners are built with detectors composed of scintillator-photodiode arrays. The photodiode arrays are mainly based on front-illuminated technology for detectors under 64 slices and on back-illuminated photodiodes for systems of 64 slices or more. Designs based on back-illuminated photodiodes were investigated for CT machines to overcome the challenge of the higher number of traces and connections required by front-illuminated diodes. In back-illuminated diodes, the electronic noise is improved because of the reduced load capacitance resulting from the shorter routing. This translates into better image quality in low-signal applications, improving low-dose imaging in a large patient population. With the fast development of multi-detector-row CT (MDCT) scanners and the increasing number of examinations, both the medical and regulatory communities have raised significant concerns about the radiation dose received by the patient. In order to reduce individual exposure, and in response to the recommendations of the International Commission on Radiological Protection (ICRP), which suggests that all exposures should be kept as low as reasonably achievable (ALARA), every manufacturer is trying to implement strategies and solutions to optimize dose efficiency and image quality based on x-ray emission and scanning parameters. Added demands on CT detector performance also come from the increased utilization of spectral or dual-energy CT, in which projection data at two different tube potentials are collected. One approach utilizes a technology called fast-kVp switching, in which the tube voltage is switched between 80 kVp and 140 kVp in a fraction of a millisecond. To reduce the cross-contamination of signals, the temporal response of the scintillator-based detector has to be extremely fast to minimize the residual signal from previous samples.
In addition, this paper presents an overview of detector technologies and image chain improvements investigated in the last few years to improve the signal-to-noise ratio and the dose efficiency of CT scanners in routine examinations and in energy discrimination techniques. Several parameters of the image chain in general, and of the detector technology in particular, contribute to the optimization of the final image quality. We will go through the properties of the post-patient collimation used to improve the scatter-to-primary ratio; the scintillator material properties, such as light output, afterglow, primary speed, and crosstalk, that affect spectral imaging; the photodiode design characteristics; and the data acquisition system (DAS), optimized for crosstalk, noise, and temporal/spatial resolution.

Keywords: computed tomography, X-ray detector, medical imaging, image quality, artifacts

Procedia PDF Downloads 183
1553 A Novel Study Contrasting Traditional Autopsy with Post-Mortem Computed Tomography in Falls Leading to Death

Authors: Balaji Devanathan, Gokul G., Abilash S., Abhishek Yadav, Sudhir K. Gupta

Abstract:

Background: As an alternative to the traditional autopsy, a virtual autopsy is carried out using scanning and imaging technologies, mainly post-mortem computed tomography (PMCT). This facility aims to supplement traditional autopsy results and reduce or eliminate internal dissection in subsequent autopsies. For emotional and religious reasons, the deceased's relatives have historically disapproved of such internal dissection, and friends and family would rather have the non-invasive, objective, and preservative PMCT than a traditional autopsy. Additionally, this study aids in the examination of the two technologies and the benefits and drawbacks of each, demonstrating the significance of contemporary imaging in the field of forensic medicine. Results: One hundred fatal falls were analysed by the authors. Before the autopsy, each case underwent a PMCT examination using a 16-slice multi-slice spiral CT scanner. Using specialised software, MPR and VR reconstructions were carried out from the raw images. Fractures of the skull, facial bones, clavicle, scapula, and vertebrae were detected more accurately than at routine autopsy. The detection of pneumothorax, pneumoperitoneum, pneumocephalus, and hemosinus is much enhanced by PMCT compared with the traditional autopsy. Conclusion: A PMCT-based virtual autopsy is useful for visualising skeletal damage in fall-from-height cases, making it an ideal tool in fatal trauma cases. When assessing trauma victims, PMCT should be viewed as an additional helpful tool to the traditional autopsy, because it can identify additional bone fractures in body parts that are challenging to examine during autopsy, such as posterior regions, which helps the pathologist reconstruct the events and determine the cause of death.

Keywords: PMCT, fall from height, autopsy, fracture

Procedia PDF Downloads 21
1552 Template-Assisted Synthesis of IrO2 Nanopores Membrane Electrode Assembly

Authors: Zhuo-Xin Lu, Yan Shi, Chang-Feng Yan, Ying Huang, Yuan Gan, Zhi-Da Wang

Abstract:

With TiO2 nanotube arrays (TNTA) as a template, an IrO2 nanopore membrane electrode assembly (MEA) was synthesized by a novel deposit-assemble-etch strategy. By analysing the morphology of IrO2/TNTA and the cyclic voltammetry (CV) curves at different deposition cycles, we propose a reasonable scheme for the process of IrO2 electrodeposition on TNTA. The current density of IrO2/TNTA at 1.5 V vs. RHE reaches 5.12 mA/cm² after 55 deposition cycles, which shows promising OER activity after template removal.

Keywords: electrodeposition, IrO2 nanopores, MEA, OER

Procedia PDF Downloads 433
1551 Clinical Manifestations, Pathogenesis and Medical Treatment of Stroke Caused by Basic Mitochondrial Abnormalities (Mitochondrial Encephalopathy, Lactic Acidosis, and Stroke-like Episodes, MELAS)

Authors: Wu Liching

Abstract:

Aim: This case discusses the pathogenesis, clinical manifestations, and medical treatment of strokes caused by mitochondrial gene mutations. Methods: Ischemic stroke caused by a mitochondrial gene defect was diagnosed by means of next-generation sequencing for mitochondrial DNA variants, imaging examination, neurological examination, and medical history; this study took as its subjects cases diagnosed with acute cerebral infarction in the neurology ward of a medical center in northern Taiwan. Results: This case is a 49-year-old married woman with a rare disease, a mitochondrial gene mutation inducing ischemic stroke. She has severe hearing impairment requiring hearing aids and a history of diabetes. During the patient’s hospitalization, blood tests showed a serum lactate of 7.72 mmol/L and a CSF lactate of 5.9 mmol/L. Through the collection of relevant medical history, neurological evaluation showed changes in consciousness and cognition and slow language expression, and brain magnetic resonance imaging showed subacute bilateral temporal lobe infarction, an atypical pattern of stroke. Mitochondrial DNA sequencing revealed the known pathogenic variant m.3243A>G at a heteroplasmy level of 24.6%. This variant is recorded in MITOMAP as a pathogenic locus for Mitochondrial Encephalopathy, Lactic Acidosis, and Stroke-like episodes (MELAS), Leigh syndrome, and other diseases, and is recorded in ClinVar as pathogenic (dbSNP: rs199474657); the case was therefore diagnosed as a stroke caused by a rare mitochondrial gene mutation. After medical treatment, there were no further seizures during hospitalization. After interventional rehabilitation, the patient's limb weakness, poor language function, and cognitive impairment all improved significantly.
Conclusion: Mitochondrial disorders can also be associated with abnormalities in psychological, neurological, cortical, and autonomic functions, as well as with internal medical diseases. The differential diagnosis therefore covers a wide range and is not easy to make. After neurological evaluation, medical history collection, imaging, and rare-disease serological examination, atypical ischemic stroke caused by a rare mitochondrial gene mutation was diagnosed. We hope that through this case, cerebral infarction due to rare mitochondrial gene variants will become more familiar to clinical medical staff, and that this case report may help improve the clinical diagnosis and treatment of patients with similar symptoms in the future.

Keywords: acute stroke, MELAS, lactic acidosis, mitochondrial disorders

Procedia PDF Downloads 55