Search results for: computational imaging
2006 Optimization of Surface Coating on Magnetic Nanoparticles for Biomedical Applications
Authors: Xiao-Li Liu, Ling-Yun Zhao, Xing-Jie Liang, Hai-Ming Fan
Abstract:
Owing to their unique properties, magnetic nanoparticles have been used as diagnostic and therapeutic agents in biomedical applications. Highly monodisperse magnetic nanoparticles with controlled particle size and surface coating have been successfully synthesized as a model system to investigate the effect of the surface coating on the T2 relaxivity and on the specific absorption rate (SAR) under an alternating magnetic field. In particular, by using mPEG-g-PEI to solubilize oleic-acid-capped 6 nm magnetic nanoparticles, the T2 relaxivity could be increased by up to 4-fold compared to PEG-coated nanoparticles. Moreover, this coating largely enhances cell uptake, with a T2 relaxivity of 92.6 mM⁻¹s⁻¹ for in vitro cell MRI. As a hyperthermia agent, the SAR value increases as the thickness of the PEG surface coating decreases. By careful optimization of the surface coating and particle size, a significant increase in SAR (up to 74%) could be achieved with minimal variation in the saturation magnetization (<5%). The 19 nm magnetic nanoparticles with 2000 Da PEG exhibited the highest SAR among the samples, 930 W•g⁻¹, which was maintained under various simulated physiological conditions. This systematic work provides a general strategy for optimizing the surface coating of a magnetic core for high-performance MRI contrast and hyperthermia agents.
Keywords: magnetic nanoparticles, magnetic hyperthermia, magnetic resonance imaging, surface modification
Procedia PDF Downloads 510
2005 Computational Fluid Dynamics Simulations of Air Pollutant Dispersion: Validation of the Fire Dynamics Simulator against the CUTE Experiments of the COST ES1006 Action
Authors: Virginie Hergault, Siham Chebbah, Bertrand Frere
Abstract:
Following in-house objectives, the Central Laboratory of the Paris Police Prefecture (LCPP) conducted a general review of the models and Computational Fluid Dynamics (CFD) codes used to simulate pollutant dispersion in the atmosphere. Starting from that review and considering the main features of Large Eddy Simulation, LCPP postulates that the Fire Dynamics Simulator (FDS) model, from the National Institute of Standards and Technology (NIST), should be well suited to air pollutant dispersion modeling. This paper focuses on the implementation and evaluation of FDS within the frame of the European COST ES1006 Action, which aimed at quantifying the performance of modeling approaches. The CUTE dataset, carried out in the city of Hamburg, and its mock-up have been used. We have compared FDS results with wind tunnel measurements from the CUTE trials on the one hand, and with the results of the models involved in the COST Action on the other. The most time-consuming part of creating input data for simulations is the transfer of obstacle geometry information to the format required by FDS. We have therefore developed Python codes to convert building and topographic data automatically into the FDS input file. To evaluate the predictions of FDS against observations, statistical performance measures have been used. These metrics include the fractional bias (FB), the normalized mean square error (NMSE), and the fraction of predictions within a factor of two of the observations (FAC2). Like the CFD models tested in the COST Action, FDS results demonstrate good agreement with the measured concentrations. Furthermore, the metric assessment indicates that FB and NMSE fall within the acceptable tolerances.
Keywords: numerical simulations, atmospheric dispersion, COST ES1006 Action, CFD model, CUTE experiments, wind tunnel data, numerical results
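The statistical performance measures named above (FB, NMSE, FAC2) are standard in dispersion-model evaluation and can be sketched as follows. This is an illustrative implementation, not the authors' code; note that the sign convention for FB varies between studies.

```python
import numpy as np

def fractional_bias(obs, pred):
    """FB = 2*(mean(Co) - mean(Cp)) / (mean(Co) + mean(Cp)); 0 is ideal."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def nmse(obs, pred):
    """NMSE = mean((Co - Cp)^2) / (mean(Co) * mean(Cp)); 0 is ideal."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return ((obs - pred) ** 2).mean() / (obs.mean() * pred.mean())

def fac2(obs, pred):
    """Fraction of pairs with 0.5 <= Cp/Co <= 2; 1 is ideal."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    ratio = pred / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))
```

Typical acceptance criteria in the literature are of the order |FB| < 0.3, NMSE < 1.5, and FAC2 > 0.5, though the exact thresholds depend on the evaluation protocol.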
Procedia PDF Downloads 133
2004 Assessing Image Quality in Mobile Radiography: A Phantom-Based Evaluation of a New Lightweight Mobile X-Ray Equipment
Authors: May Bazzi, Shafik Tokmaj, Younes Saberi, Mats Geijer, Tony Jurkiewicz, Patrik Sund, Anna Bjällmark
Abstract:
Mobile radiography, employing portable X-ray equipment, has become a routine procedure within hospital settings, with chest X-rays in intensive care units standing out as the most prevalent mobile X-ray examinations. This approach is not limited to hospitals alone, as it extends its benefits to imaging patients in various settings, particularly those too frail to be transported, such as elderly care residents in nursing homes. Moreover, the utility of mobile X-ray isn't confined solely to traditional healthcare recipients; it has proven to be a valuable resource for vulnerable populations, including the homeless, drug users, asylum seekers, and patients with multiple co-morbidities. Mobile X-rays reduce patient stress, minimize costly hospitalizations, and offer cost-effective imaging. While studies confirm its reliability, further research is needed, especially regarding image quality. Recent advancements in lightweight equipment with enhanced battery and detector technology provide the potential for nearly handheld radiography. The main aim of this study was to evaluate a new lightweight mobile X-ray system with two different detectors and compare the image quality with a modern stationary system. Methods: A total of 74 images of the chest (chest anterior-posterior (AP) views and chest lateral views) and pelvic/hip region (AP pelvis views, hip AP views, and hip cross-table lateral views) were acquired on a whole-body phantom (Kyotokagaku, Japan), utilizing varying image parameters. These images were obtained using a stationary system - 18 images (Mediel, Sweden), a mobile X-ray system with a second-generation detector - 28 images (FDR D-EVO II; Fujifilm, Japan) and a mobile X-ray system with a third-generation detector - 28 images (FDR D-EVO III; Fujifilm, Japan). Image quality was assessed by visual grading analysis (VGA), which is a method to measure image quality by assessing the visibility and accurate reproduction of anatomical structures within the images. 
A total of 33 image criteria were used in the analysis. A panel of two experienced radiologists, two experienced radiographers, and two final-term radiographer students evaluated the image quality on a 5-grade ordinal scale using the software ViewDEX 3.0 (Viewer for Digital Evaluation of X-ray images, Sweden). Data were analyzed using visual grading characteristics analysis. The dose was measured via the dose-area product (DAP) reported by the respective systems. Results: The mobile X-ray equipment (both detectors) showed significantly better image quality than the stationary equipment for the pelvis, hip AP, and hip cross-table lateral images, with AUC_VGA values ranging from 0.64 to 0.92, while chest images showed mixed results. The number of images rated as having sufficient quality for diagnostic use was significantly higher for the generation 2 and 3 mobile X-ray systems than for the stationary X-ray system. The DAP values were higher for the stationary system than for the mobile system. Conclusions: The new lightweight radiographic equipment provided image quality at least as good as that of a fixed system, at a lower radiation dose. Future studies should focus on clinical images and consider radiographers' viewpoints for a comprehensive assessment.
Keywords: mobile X-ray, visual grading analysis, radiographer, radiation dose
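The visual grading characteristics analysis used above reduces, at its core, to a nonparametric AUC estimate over the two systems' ordinal ratings. A minimal sketch (a hypothetical helper, not the study's software) is:

```python
def vgc_auc(scores_test, scores_ref):
    """AUC_VGA estimate: probability that a randomly chosen rating of the
    test system exceeds one of the reference system, counting ties as 0.5
    (equivalent to the Mann-Whitney U statistic divided by n1*n2)."""
    wins = ties = 0
    for t in scores_test:
        for r in scores_ref:
            if t > r:
                wins += 1
            elif t == r:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_test) * len(scores_ref))
```

An AUC of 0.5 indicates equal image quality; values above 0.5 favor the test system, consistent with the reported range of 0.64-0.92.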
Procedia PDF Downloads 65
2003 Numerical Analysis of NOₓ Emission in Staged Combustion for the Optimization of Once-Through-Steam-Generators
Authors: Adrien Chatel, Ehsan Askari Mahvelati, Laurent Fitschy
Abstract:
Once-Through-Steam-Generators (OTSGs) are commonly used in the oil-sand industry in the heavy fuel oil extraction process. They are composed of three main parts: the burner, the radiant section, and the convective section. Natural gas is burned in staged diffusive flames stabilized by the burner. The heat generated by the combustion is transferred to the water flowing through the piping system in the radiant and convective sections. The steam produced within the pipes is then directed into the ground to reduce the oil viscosity and allow its pumping. With the rapid development of the oil-sand industry, the number of OTSGs in operation has increased, as have the associated emissions of environmental pollutants, especially the nitrogen oxides (NOₓ). To limit environmental degradation, various international environmental agencies have established regulations on pollutant discharge and pushed to reduce NOₓ release. To meet these constraints, OTSG constructors have to rely on increasingly advanced tools to study and predict NOₓ emission. With the increase in computational resources, Computational Fluid Dynamics (CFD) has emerged as a flexible tool to analyze the combustion and pollutant formation processes. Moreover, to optimize the burner operating conditions with regard to NOₓ emission, field characterization and measurements are usually performed. However, such experimental campaigns are particularly time-consuming and sometimes even impossible for industrial plants with strict operation schedule constraints. Therefore, the application of CFD seems more adequate for providing guidelines on the NOₓ emission and reduction problem. In the present work, two software packages are employed to simulate the combustion process in an OTSG, namely the commercial software ANSYS Fluent and the open-source software OpenFOAM.
RANS (Reynolds-Averaged Navier-Stokes) equations, combined with the Eddy Dissipation Concept to model the combustion and closed by the k-epsilon turbulence model, are solved. A mesh sensitivity analysis is performed to assess the independence of the solution from the mesh. In the first part, the results given by the two software packages are compared and confronted with experimental data as a means of assessing the numerical modelling. Flame temperatures and chemical composition are used as reference fields to perform this validation. The results show fair agreement between the experimental and numerical data. In the last part, OpenFOAM is employed to simulate several operating conditions, and an Emission Characteristic Map of the combustion system is generated. The sources of high NOₓ production inside the OTSG are pinpointed and correlated with the physics of the flow. CFD is therefore a useful tool for providing insight into the NOₓ emission phenomena in OTSGs. Sources of high NOₓ production can be identified, and operating conditions can be adjusted accordingly. With the help of RANS simulations, an Emission Characteristics Map can be produced and then used as a guide for a field tune-up.
Keywords: combustion, computational fluid dynamics, nitrogen oxides emission, once-through-steam-generators
Procedia PDF Downloads 113
2002 Analysis of Human Mental and Behavioral Models for Development of an Electroencephalography-Based Human Performance Management System
Authors: John Gaber, Youssef Ahmed, Hossam A. Gabbar, Jing Ren
Abstract:
Accidents at Nuclear Power Plants (NPPs) occur due to various factors, notable among them poor safety management and poor safety culture. During abnormal situations, the likelihood of human error is many-fold higher due to the increased cognitive workload. The most common cause of human error under high cognitive workload is mental fatigue. Electroencephalography (EEG) is a method of recording the electrical activity of the human brain. We propose a safety system that monitors brainwaves for signs of mental fatigue using an EEG system. This requires an analysis of the mental model of the NPP operator, of the changes in brainwave power in response to certain stimuli, and of the risk factors for mental fatigue and attention that NPP operators face when performing their tasks. We analyzed these factors and developed an EEG-based monitoring system that aims to alert NPP operators when their level of mental fatigue or inattention hinders their ability to maintain safety.
Keywords: brain imaging, EEG, power plant operator, psychology
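EEG-based fatigue monitors commonly track spectral band power. The abstract does not specify the algorithm, so the following is only a plausible sketch, using a simple periodogram and the often-cited (theta + alpha) / beta ratio as a fatigue indicator:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean periodogram power in the band [f_lo, f_hi) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= f_lo) & (freqs < f_hi)
    return psd[band].mean()

def fatigue_index(signal, fs):
    """(theta + alpha) / beta ratio, one common EEG fatigue indicator."""
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return (theta + alpha) / beta
```

In a monitoring loop, the index would be computed over sliding windows and compared against a per-operator baseline before raising an alert.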
Procedia PDF Downloads 101
2001 An Ultrasonic Signal Processing System for Tomographic Imaging of Reinforced Concrete Structures
Authors: Edwin Forero-Garcia, Jaime Vitola, Brayan Cardenas, Johan Casagua
Abstract:
This research article presents the integration of electronic and computer systems into an ultrasonic signal processing system that performs signal capture, conditioning, and analog-to-digital conversion, followed by processing and visualization. Signal capture and conditioning were carried out through the design and implementation of an analog electronic system organized in stages: 1. impedance matching; 2. analog filtering; 3. signal amplification. After signal conditioning, the ultrasonic information was digitized using a digital microcontroller for subsequent processing. The digital processing of the signals was carried out in MATLAB to produce A-scan, B-scan, and D-scan ultrasonic images. Advanced processing was then performed using the SAFT technique to improve the resolution of the B-scan images. The information from the ultrasonic images was displayed in a user interface developed in .NET with Visual Studio. For the validation of the system, ultrasonic signals were acquired, enabling non-invasive inspection of the structures and identification of the existing pathologies in them.
Keywords: acquisition, signal processing, ultrasound, SAFT, HMI
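The SAFT step mentioned above is, at its core, a delay-and-sum refocusing of the recorded A-scans. The brute-force 2-D sketch below illustrates the idea only (assumed pulse-echo geometry and names, not the system's actual MATLAB implementation):

```python
import numpy as np

def saft(ascans, elem_x, fs, c, xs, zs):
    """Delay-and-sum SAFT: ascans[i] is the pulse-echo A-scan recorded at
    element position elem_x[i]; fs is the sampling rate, c the wave speed.
    Returns an image over pixel grids xs (lateral) and zs (depth)."""
    img = np.zeros((len(zs), len(xs)))
    for i, ex in enumerate(elem_x):
        for ix, x in enumerate(xs):
            for iz, z in enumerate(zs):
                r = np.hypot(x - ex, z)          # one-way distance to pixel
                k = int(round(2 * r / c * fs))   # round-trip sample index
                if k < len(ascans[i]):
                    img[iz, ix] += ascans[i][k]  # coherent summation
    return img
```

Samples from every element line up only at pixels containing a real reflector, so the summed image sharpens compared with a raw B-scan; production code would vectorize the loops and apply interpolation and apodization.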
Procedia PDF Downloads 107
2000 Comparative Study of Titanium and Polyetheretherketone Cranial Implant Using Finite Element Model
Authors: Khaja Moiduddin, Sherif Mohammed Elseufy, Hisham Alkhalefah
Abstract:
Recent advances in three-dimensional (3D) printing, medical imaging, and implant design may alter how craniomaxillofacial surgeons construct individualized treatments from patient data. By utilizing medical image data, medical professionals can obtain detailed information about a patient's injuries, enabling a thorough preoperative assessment while ensuring the implant's accuracy. However, selecting the right implant material requires careful consideration of various mechanical properties. This study compares two materials commonly used for cranial reconstruction: titanium (Ti6Al4V) and polyetheretherketone (PEEK). A biomechanical analysis was performed to study the implant behavior, keeping the implant design and fixation constant in both cases. A finite element model was created and analyzed under loading conditions. The finite element analysis shows that, although Ti6Al4V is stronger than PEEK, the mechanical strength of PEEK is adequate to bear the loads of the adjacent bone tissue.
Keywords: cranial reconstruction, titanium implants, PEEK, finite element model
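As a back-of-the-envelope companion to the FE comparison, one can check yield safety factors from handbook material properties. The values and the peak stress below are assumed, illustrative numbers, not those of the study's model:

```python
# Assumed handbook values (illustrative only; the FE study uses its own data).
materials = {
    "Ti6Al4V": {"youngs_modulus_gpa": 110.0, "yield_mpa": 880.0},
    "PEEK":    {"youngs_modulus_gpa": 3.6,   "yield_mpa": 100.0},
}

def safety_factor(material, peak_stress_mpa):
    """Yield strength divided by the peak stress predicted by the FE solution."""
    return materials[material]["yield_mpa"] / peak_stress_mpa

# Hypothetical peak implant stress of 20 MPa from an FE analysis:
for name in materials:
    print(name, "safety factor:", round(safety_factor(name, 20.0), 1))
```

Both factors come out well above 1, mirroring the abstract's conclusion that PEEK's lower strength can still be adequate for the loads involved.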
Procedia PDF Downloads 68
1999 Portfolio Risk Management Using Quantum Annealing
Authors: Thomas Doutre, Emmanuel De Meric De Bellefon
Abstract:
This paper describes the application of the local-search metaheuristic of quantum annealing to portfolio optimization. Heuristic techniques are particularly handy when Markowitz's classical mean-variance problem is enriched with additional realistic constraints. Once tailored to the problem, computational experiments on real collected data have shown the superiority of quantum annealing over simulated annealing for this constrained optimization problem, taking advantage of quantum effects such as tunnelling.
Keywords: optimization, portfolio risk management, quantum annealing, metaheuristic
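The simulated-annealing baseline against which quantum annealing is compared can be sketched for a cardinality-constrained mean-variance selection. This is an illustrative toy (binary equal weights, a linear cooling schedule, and made-up parameter names), not the authors' formulation:

```python
import math
import random

def cost(w, mu, cov, lam):
    """Mean-variance objective on binary weights: lam*risk - (1-lam)*return."""
    n = len(w)
    ret = sum(mu[i] * w[i] for i in range(n))
    risk = sum(cov[i][j] * w[i] * w[j] for i in range(n) for j in range(n))
    return lam * risk - (1 - lam) * ret

def anneal(mu, cov, k, lam=0.5, steps=2000, t0=1.0, seed=0):
    """Simulated annealing over portfolios holding exactly k of n assets
    (requires 1 <= k < n); moves swap a held asset for an unheld one."""
    rng = random.Random(seed)
    n = len(mu)
    held = set(rng.sample(range(n), k))
    w = [1 if i in held else 0 for i in range(n)]
    cur = best = cost(w, mu, cov, lam)
    best_w = w[:]
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-9      # linear cooling schedule
        i = rng.choice([a for a in range(n) if w[a] == 1])
        j = rng.choice([a for a in range(n) if w[a] == 0])
        w[i], w[j] = 0, 1                         # swap keeps cardinality k
        new = cost(w, mu, cov, lam)
        if new <= cur or rng.random() < math.exp(-(new - cur) / t):
            cur = new
            if new < best:
                best, best_w = new, w[:]
        else:
            w[i], w[j] = 1, 0                     # reject: revert the swap
    return best_w, best
```

A quantum annealer replaces the thermal acceptance rule with quantum fluctuations (tunnelling), which is what the paper credits for the improved results on the constrained problem.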
Procedia PDF Downloads 383
1998 A Novel Approach of Secret Communication Using Douglas-Peucker Algorithm
Authors: R. Kiruthika, A. Kannan
Abstract:
Steganography is the problem of hiding secret messages in 'innocent-looking' public communication so that the presence of the secret message cannot be detected. This paper introduces steganographic security in terms of computational indistinguishability from a channel of probability distributions on cover messages. The method first splits the cover image into two separate blocks using the Douglas-Peucker algorithm. The text message and the image are then hidden in the Least Significant Bit (LSB) of the cover image.
Keywords: steganography, LSB, embedding, Douglas-Peucker algorithm
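The Douglas-Peucker split step referenced above can be sketched as follows; this is an illustrative implementation of the standard simplification algorithm, not the paper's exact code:

```python
def perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, eps):
    """Recursively keep the farthest point if it deviates more than eps
    from the chord; otherwise collapse the run to its endpoints."""
    if len(points) < 3:
        return points[:]
    dists = [perpendicular_distance(p, points[0], points[-1]) for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] > eps:
        left = douglas_peucker(points[:i + 1], eps)
        right = douglas_peucker(points[i:], eps)
        return left[:-1] + right
    return [points[0], points[-1]]
```

In the steganographic scheme, the retained points define the split between the two cover-image blocks before LSB embedding.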
Procedia PDF Downloads 363
1997 Nanoscale Metal-Organic Framework Coated Carbon Nitride Nanosheet for Combination Cancer Therapy
Authors: Rui Chen, Jinfeng Zhang, Chun-Sing Lee
Abstract:
In the past couple of decades, nanoscale metal-organic frameworks (NMOFs) have been highlighted as promising delivery platforms for biomedical applications, combining many potent features such as high loading capacity, progressive biodegradability, and low cytotoxicity. While NMOFs have been extensively used as carriers for drugs of different modalities, so far there has been no report on exploiting the advantages of NMOFs for combination therapy. Herein, we prepared core-shell nanoparticles in which each nanoparticle contains a single graphitic-phase carbon nitride (g-C3N4) nanosheet encapsulated by a zeolitic imidazolate framework-8 (ZIF-8) shell. The g-C3N4 nanosheets are effective visible-light photosensitizers for photodynamic therapy (PDT). When hosting doxorubicin (DOX), the as-synthesized core-shell nanoparticles can realize combined photo-chemotherapy and provide dual-color fluorescence imaging. We therefore expect NMOF-based core-shell nanoparticles to provide a new way to achieve much-enhanced cancer therapy.
Keywords: carbon nitride, combination therapy, drug delivery, nanoscale metal-organic frameworks
Procedia PDF Downloads 425
1996 Making the Right Call for Falls: Evaluating the Efficacy of a Multi-Faceted Trust Wide Approach to Improving Patient Safety Post Falls
Authors: Jawaad Saleem, Hannah Wright, Peter Sommerville, Adrian Hopper
Abstract:
Introduction: Inpatient falls are the most commonly reported patient safety incidents and carry a significant burden on resources, morbidity, and mortality. Ensuring adequate post-falls management of patients by staff is therefore paramount to maintaining patient safety, especially in out-of-hours and resource-stretched settings. Aims: This quality improvement project aims to improve the current practice of falls management at Guy's and St Thomas' Hospital, London, compared with our 2016 quality improvement project findings. Furthermore, it looks to increase junior doctors' confidence in managing falls and their use of the new guidance protocols. Methods: The multifaceted interventions implemented included the development of new trust-wide guidelines detailing management pathways for patients after falls, available on the intranet, and the production of 2,000 lanyard cards, distributed among junior doctors and staff, summarising these guidelines. Additionally, a 'safety signal' email was sent by the Trust chief medical officer to all staff, raising awareness of falls and the guidelines. Formal falls teaching was also implemented for new doctors at induction. Using an established incident database, 189 consecutive falls in 2017 were retrospectively analysed electronically and compared with the variables measured in 2016, post intervention. A separate serious incident database was used to analyse 50 falls from May 2015 to March 2018 to ascertain the statistical significance of the impact of our interventions on serious incidents. A questionnaire similar to that used in 2016 was given to the 2017 cohort of foundation year one (FY1) doctors, and the results were compared. Results: Questionnaire data demonstrated improved awareness and use of the guidelines, increased confidence, and an increase in training. 97% of FY1 trainees felt that the interventions had increased their awareness of the impact of falls on patients in the trust.
Data from the incident database demonstrated that the time to review patients after a fall had decreased from an average of 130 to 86 minutes. Improvement was also demonstrated in the reduced times to order and schedule X-ray and CT imaging: 3 and 5 hours, respectively. Data from the serious incident database show that 'the time from fall until harm was detected' was statistically significantly lower (p = 0.044) post intervention. We also showed that the incidence of significant delays in detecting harm (>10 hours) was reduced post intervention. Conclusions: Our interventions have helped to significantly reduce the average times to assess patients and to order and schedule appropriate imaging after falls. Delays of over ten hours in detecting serious injuries after falls were commonplace; since the intervention, their frequency has markedly reduced. We suggest this will lead to harm being identified sooner, fewer clinical incidents relating to falls, and thus improved overall patient safety. Our interventions have also helped increase clinical staff confidence, management, and awareness of falls in the trust. Next steps include expanding teaching sessions and improving multidisciplinary team involvement to sustain this improvement.
Keywords: patient safety, quality improvement, serious incidents, falls, clinical care
Procedia PDF Downloads 124
1995 Functionalized DOX Nanocapsules by Iron Oxide Nanoparticles for Targeted Drug Delivery
Authors: Afsaneh Ghorbanzadeh, Afshin Farahbakhsh, Zakieh Bayat
Abstract:
Drug encapsulation is used for release and targeted delivery at a determined time, place, and temperature or pH. DOX nanocapsules were used to reduce and minimize the unwanted side effects of the drug. In this paper, methods of encapsulating doxorubicin (DOX) and labeling it with a magnetic iron oxide (Fe3O4) core have been studied. The Fe3O4 was conjugated with DOX via a hydrazone bond. The solution was encapsulated by heat- or pH-sensitive polymers such as chitosan-g-poly(N-isopropylacrylamide-co-N,N-dimethylacrylamide), dextran-g-poly(N-isopropylacrylamide-co-N,N-dimethylacrylamide), and mPEG-G2.5 PAMAM via hydrazone bonds. Drug release was very slow at temperatures below 38 °C, and rapid, controlled release occurred at temperatures above 38 °C. According to the experiments, the use of mPEG-G2.5 PAMAM is the best method of DOX nanocapsule synthesis, because with this method the drug delivery time to a given site is lower than with the other methods and the percentage of released drug is higher. The synthesized magnetic carrier system has potential applications in magnetically targeted drug delivery and magnetic resonance imaging.
Keywords: drug carrier, drug release, doxorubicin, iron oxide NPs
Procedia PDF Downloads 418
1994 A Review of Optomechatronic Ecosystem
Authors: Sam Zhang
Abstract:
The landscape of optomechatronics is viewed along the lines of light vs. matter, photonics vs. semiconductors, and optics vs. mechatronics. Optomechatronics is redefined as the integration of light and matter from the atom, device, and system levels to the application level. The markets and megatrends in optomechatronics are then listed. The author focuses on optomechatronic technology in the semiconductor industry as an example and reviews practical systems, characteristics, and trends. Optomechatronics, together with photonics and semiconductors, will continue producing the computational and smart infrastructure required for the 4th industrial revolution.
Keywords: photonics, semiconductor, optomechatronics, 4th industrial revolution
Procedia PDF Downloads 129
1993 Designing Stochastic Non-Invasively Applied DC Pulses to Suppress Tremors in Multiple Sclerosis by Computational Modeling
Authors: Aamna Lawrence, Ashutosh Mishra
Abstract:
Tremors occur in 60% of the patients who have Multiple Sclerosis (MS), the most common demyelinating disease that affects the central and peripheral nervous system, and are the primary cause of disability in young adults. While pharmacological agents provide minimal benefits, surgical interventions like Deep Brain Stimulation and Thalamotomy are riddled with dangerous complications which make non-invasive electrical stimulation an appealing treatment of choice for dealing with tremors. Hence, we hypothesized that if the non-invasive electrical stimulation parameters (mainly frequency) can be computed by mathematically modeling the nerve fibre to take into consideration the minutest details of the axon morphologies, tremors due to demyelination can be optimally alleviated. In this computational study, we have modeled the random demyelination pattern in a nerve fibre that typically manifests in MS using the High-Density Hodgkin-Huxley model with suitable modifications to account for the myelin. The internode of the nerve fibre in our model could have up to ten demyelinated regions each having random length and myelin thickness. The arrival time of action potentials traveling the demyelinated and the normally myelinated nerve fibre between two fixed points in space was noted, and its relationship with the nerve fibre radius ranging from 5µm to 12µm was analyzed. It was interesting to note that there were no overlaps between the arrival time for action potentials traversing the demyelinated and normally myelinated nerve fibres even when a single internode of the nerve fibre was demyelinated. The study gave us an opportunity to design DC pulses whose frequency of application would be a function of the random demyelination pattern to block only the delayed tremor-causing action potentials. 
The DC pulses could be delivered to the peripheral nervous system non-invasively by an electrode bracelet that would suppress any shakiness beyond it, thus paving the way for wearable neuro-rehabilitative technologies.
Keywords: demyelination, Hodgkin-Huxley model, non-invasive electrical stimulation, tremor
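The membrane dynamics underlying the model described above are the classic Hodgkin-Huxley equations. The sketch below integrates a single unmyelinated compartment with forward Euler using the standard squid-axon parameters; the authors' high-density, myelin-aware modifications are not reproduced here:

```python
import math

def hh_simulate(i_inj=10.0, t_max=50.0, dt=0.01):
    """Single-compartment Hodgkin-Huxley membrane, forward-Euler integration.
    Units: mV, ms, uA/cm^2, mS/cm^2; classic squid-axon parameters."""
    c_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3
    e_na, e_k, e_l = 50.0, -77.0, -54.387
    v, m, h, n = -65.0, 0.05, 0.6, 0.32          # resting state
    trace = []
    for _ in range(int(t_max / dt)):
        # Voltage-dependent rate constants for the gating variables.
        a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
        b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
        b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
        m += dt * (a_m * (1 - m) - b_m * m)
        h += dt * (a_h * (1 - h) - b_h * h)
        n += dt * (a_n * (1 - n) - b_n * n)
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_inj - i_ion) / c_m
        trace.append(v)
    return trace
```

In a multi-compartment cable version, demyelinated internodes would get altered capacitance and leak parameters, and the spike arrival time at a distal compartment would be compared against the healthy fibre, as done in the study.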
Procedia PDF Downloads 128
1992 Enhancing Residential Architecture through Generative Design: Balancing Aesthetics, Legal Constraints, and Environmental Considerations
Authors: Milena Nanova, Radul Shishkov, Martin Georgiev, Damyan Damov
Abstract:
This research paper presents an in-depth exploration of the use of generative design in urban residential architecture, with a dual focus on aligning aesthetic values with legal and environmental constraints. The study aims to demonstrate how generative design methodologies can innovate residential building designs that are not only legally compliant and environmentally conscious but also aesthetically compelling. At the core of our research is a specially developed generative design framework tailored for urban residential settings. This framework employs computational algorithms to produce diverse design solutions, meticulously balancing aesthetic appeal with practical considerations. By integrating site-specific features, urban legal restrictions, and environmental factors, our approach generates designs that resonate with the unique character of urban landscapes while adhering to regulatory frameworks. The paper explores how modern digital tools, particularly computational design, and algorithmic modelling, can optimize the early stages of residential building design. By creating a basic parametric model of a residential district, the paper investigates how automated design tools can explore multiple design variants based on predefined parameters (e.g., building cost, dimensions, orientation) and constraints. The paper aims to demonstrate how these tools can rapidly generate and refine architectural solutions that meet the required criteria for quality of life, cost efficiency, and functionality. The study utilizes computational design for database processing and algorithmic modelling within the fields of applied geodesy and architecture. It focuses on optimizing the forms of residential development by adjusting specific parameters and constraints. The results of multiple iterations are analysed, refined, and selected based on their alignment with predefined quality and cost criteria. 
The findings of this research will contribute to a modern, comprehensive approach to residential area design. The paper demonstrates the potential for integrating BIM models into the design process and their application in virtual 3D Geographic Information Systems (GIS) environments. The study also examines the transformation of BIM models into suitable 3D GIS file formats, such as CityGML, to facilitate the visualization and evaluation of urban planning solutions. In conclusion, our research demonstrates that a generative parametric approach based on real geodetic data and collaborative decision-making can be introduced in the early phases of the design process. This gives designers powerful tools to explore diverse design possibilities, significantly improving the quality of the investment over its entire lifecycle.
Keywords: architectural design, residential buildings, urban development, geodetic data, generative design, parametric models, workflow optimization
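A minimal parametric enumeration of building variants under cost and zoning constraints, of the kind described above, might look like this. All ranges, limits, and the unit cost are hypothetical placeholders, not values from the study:

```python
from itertools import product

# Hypothetical parameter ranges and constraints for a residential block.
WIDTHS = [12.0, 15.0, 18.0]    # m
DEPTHS = [10.0, 12.0]          # m
FLOORS = [3, 4, 5, 6]
FLOOR_HEIGHT = 3.0             # m
MAX_HEIGHT = 16.5              # m, e.g. a zoning limit
COST_PER_M2 = 950.0            # assumed unit cost

def generate_variants(budget):
    """Enumerate parameter combinations, keep the legally and economically
    feasible ones, and rank them by total floor area (largest first)."""
    variants = []
    for w, d, f in product(WIDTHS, DEPTHS, FLOORS):
        height = f * FLOOR_HEIGHT
        area = w * d * f
        cost = area * COST_PER_M2
        if height <= MAX_HEIGHT and cost <= budget:
            variants.append({"w": w, "d": d, "floors": f,
                             "area": area, "cost": cost})
    return sorted(variants, key=lambda v: -v["area"])
```

Real generative-design tooling would replace the brute-force product with an optimizer and evaluate richer criteria (orientation, daylight, setbacks), but the generate-filter-rank loop is the same.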
Procedia PDF Downloads 7
1991 Central Nervous System Lesion Differentiation in the Emergency Radiology Department
Authors: Angelis P. Barlampas
Abstract:
An 89-year-old woman came to the emergency department complaining of long-lasting headaches and nausea. A CT examination was performed, and a homogeneous midline anterior cranial fossa lesion was revealed, situated near the skull base and measuring 2.4 cm in diameter. The patient was allergic, so an intravenous contrast injection could not be performed on the spot, nor could an MRI examination because of metallic implants. How could someone narrow down the differential diagnosis? The interhemispheric meningioma is usually a silent midline lesion with no edema and most often presents as a homogeneous, solid, isodense or slightly hyperdense mass (usually the smallest lesions, as in this case). Of these, 20-30% have some calcifications. Hyperostosis is typical of meningiomas that abut the base of the skull but is absent in the current case, presumably because of a more cephalad location that is borderline away from the bone. Because further investigation could not be done, as the patient was allergic to the contrast media, other entities had to be considered. Regarding the site of the lesion, the most common alternatives to keep in mind are metastasis, tumor of the skull base, abscess, primary brain tumors, meningioma, giant aneurysm of the anterior cerebral artery, and olfactory neuroblastoma. A giant aneurysm of the anterior cerebral artery is a midline lesion whose appearance depends on whether it is non-thrombosed, partially thrombosed, or completely thrombosed; on non-contrast CT it appears as a slightly hyperdense, well-defined, round extra-axial mass and may demonstrate a peripheral calcified rim. An olfactory neuroblastoma is a midline mass of soft tissue attenuation that is relatively homogeneous; focal calcifications are occasionally present, and when intracranial extension is present, peritumoral cysts between the tumor and the overlying brain are often seen. Final diagnosis: interhemispheric meningioma (known from the patient's previous history).
Meningiomas come from the meningocytes, or arachnoid cells, of the meninges. They are usually found incidentally, have an indolent course, and their most common location is extra-axial, parasagittal, and supratentorial. Other locations include the sphenoid ridge, olfactory groove, juxtasellar region, infratentorial compartment, intraventricular space, pineal gland area, and optic nerve. They are clinically silent entities, except for large ones, which can present with headaches, personality changes, paresis, or symptoms according to their specific site, and which may cause edema of the surrounding brain tissue. Imaging findings include the presence of calcifications, the CSF cleft sign, hyperostosis of adjacent bone, the dural tail, and the white matter buckling sign. After intravenous contrast injection, they enhance brightly and homogeneously, except for large ones, which may exhibit necrotic areas or be heavily calcified. Malignant or cystic variants demonstrate more heterogeneity and less intense enhancement. Sometimes it is inevitable that the needed CT protocol cannot be performed, especially in the emergency department. In these cases, the radiologist must focus on the characteristic imaging features of the unenhanced lesion, as well as on previous examinations or a known lesion history, in order to reach the right conclusion in the report.
Keywords: computed tomography, emergency radiology, metastasis, tumor of skull base, abscess, primary brain tumors, meningioma, giant aneurysm of the anterior cerebral artery, olfactory neuroblastoma, interhemispheric meningioma
Procedia PDF Downloads 69
1990 A Fuzzy Approach to Liver Tumor Segmentation with Zernike Moments
Authors: Abder-Rahman Ali, Antoine Vacavant, Manuel Grand-Brochier, Adélaïde Albouy-Kissi, Jean-Yves Boire
Abstract:
In this paper, we present a new segmentation approach for liver lesions in regions of interest within MRI (Magnetic Resonance Imaging). This approach, based on a two-cluster Fuzzy C-Means methodology, treats cluster compactness as a variable parameter to handle uncertainty. Fine boundaries are detected by a local recursive merging of ambiguous pixels, using sequential forward floating selection with Zernike moments. The method has been tested on both synthetic and real images. When applied to synthetic images, the proposed approach provides good performance: the segmentations obtained are accurate, their shape is consistent with the ground truth, and the extracted information is reliable. The results obtained on MR images confirm these observations. Our approach makes it possible, even for difficult MR images, to extract a segmentation with good accuracy and shape, which means the geometry of the tumor is preserved for further clinical activities (such as automatic extraction of pharmacokinetic properties, lesion characterization, etc.).
Keywords: defuzzification, floating search, fuzzy clustering, Zernike moments
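The two-cluster Fuzzy C-Means step at the core of such an approach can be sketched as follows — a minimal 1-D Python illustration using the standard FCM update rules; the variable-compactness term and the Zernike-moment boundary refinement of the paper are omitted:

```python
import numpy as np

def fuzzy_cmeans(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Minimal Fuzzy C-Means on 1-D intensity values.

    Returns cluster centers and the fuzzy membership matrix
    (n_clusters x n_samples). `m` is the standard FCM fuzzifier.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    # random initial memberships, normalized so each sample's sum to 1
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1)            # fuzzy weighted means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        # membership update: inverse distances raised to 2/(m-1)
        u = 1.0 / (d ** (2.0 / (m - 1)))
        u /= u.sum(axis=0)
    return centers, u
```

Applied to pixel intensities inside a region of interest, the membership matrix `u` assigns each pixel a degree of belonging to each cluster; pixels with memberships near 0.5 are the ambiguous ones that a boundary-refinement step would revisit.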
Procedia PDF Downloads 452
1989 Fluorescent Imaging with Hoechst 34580 and Propidium Iodide in Determination of Toxic Changes of Cyanobacterial Oligopeptides in Rotifers
Authors: Adam Bownik, Małgorzata Adamczuk, Barbara Pawlik-Skowrońska
Abstract:
Certain strains of cyanobacteria, microorganisms forming water blooms, produce toxic secondary metabolites. Although various effects of cyanotoxins on aquatic animals are known, little data can be found on the influence of cyanobacterial oligopeptides beyond microcystins. The aim of the present study was to determine the toxicity of two novel pure cyanobacterial oligopeptides, microginin FR-1 (MGFR-1) and anabaenopeptin-A (ANA-A), on the transparent model rotifer Brachionus calyciflorus with the use of fluorescent double staining with Hoechst 34580 and propidium iodide. The obtained results showed that both studied oligopeptides decreased the fluorescence intensity of animals stained with Hoechst 34580 in a concentration-dependent manner. On the other hand, a concentration-dependent increase of propidium iodide fluorescence was noted in the exposed rotifers. The results suggest that MGFR-1 and ANA-A should be considered potent toxic agents for freshwater rotifers, and that fluorescent staining with Hoechst and propidium iodide may be a valuable tool for determining the toxicity of cyanobacterial oligopeptides in rotifers.
Keywords: cyanobacteria, brachionus, oligopeptides, fluorescent staining, hoechst, propidium iodide
Procedia PDF Downloads 129
1988 KCBA, A Method for Feature Extraction of Colonoscopy Images
Authors: Vahid Bayrami Rad
Abstract:
In recent years, the use of artificial intelligence techniques, tools, and methods in processing medical images and health-related applications has been highlighted, and a lot of research has been done in this regard. For example, colonoscopy and the diagnosis of colon lesions are cases in which the diagnostic process can be improved by using image processing and artificial intelligence algorithms, which greatly assist doctors. Due to the lack of accurate measurements and the variety of lesions in colonoscopy images, diagnosing the type of lesion is difficult even for expert doctors. Therefore, image processing software can help doctors increase the accuracy of their observations and ultimately improve their diagnoses, and automatic methods can further improve the diagnostic process. In this paper, a deep learning framework called KCBA, composed of several methods such as K-means clustering, a bag of features, and a deep auto-encoder, is proposed to classify colonoscopy lesions. Finally, according to the experimental results, the proposed method's performance in classifying colonoscopy images is evaluated in terms of accuracy.
Keywords: colorectal cancer, colonoscopy, region of interest, narrow band imaging, texture analysis, bag of feature
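The K-means plus bag-of-features stage of such a pipeline can be sketched in a few lines — a toy Python illustration of the standard bag-of-features formulation, not the KCBA implementation itself (the deep auto-encoder stage is omitted):

```python
import numpy as np

def build_codebook(descriptors, k=4, n_iter=50, seed=0):
    """Toy K-means codebook for a bag-of-features pipeline."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(n_iter):
        # assign each descriptor to its nearest codeword
        labels = np.argmin(((descriptors[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = descriptors[labels == j].mean(axis=0)
    return centers

def bag_of_features(image_descriptors, centers):
    """Normalized histogram of codeword assignments: a fixed-length image feature."""
    labels = np.argmin(((image_descriptors[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    hist = np.bincount(labels, minlength=len(centers)).astype(float)
    return hist / hist.sum()
```

Local descriptors extracted from many colonoscopy images are clustered into a codebook; each image is then represented by the histogram of its descriptors over the codewords, which a downstream classifier can consume.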
Procedia PDF Downloads 57
1987 Computational Modelling of Epoxy-Graphene Composite Adhesive towards the Development of Cryosorption Pump
Authors: Ravi Verma
Abstract:
A cryosorption pump is the best solution for achieving a clean, vibration-free ultra-high vacuum. Furthermore, the operation of a cryosorption pump is free from the influence of electric and magnetic fields. Due to these attributes, this pump is used in space simulation chambers to create ultra-high vacuum. The cryosorption pump comprises three parts: (a) a panel, which is cooled with the help of a cryogen or cryocooler; (b) an adsorbent, which is used to adsorb the gas molecules; and (c) an epoxy, which holds the adsorbent and the panel together, thereby aiding heat transfer from the adsorbent to the panel. The performance of a cryosorption pump depends on the temperature of the adsorbent and hence on the thermal conductivity of the epoxy. Therefore, we have attempted to increase the thermal conductivity of the epoxy adhesive by mixing in nano-sized graphene filler particles. The thermal conductivity of the epoxy-graphene composite adhesive is measured with the help of an indigenously developed experimental setup in the temperature range from 4.5 K to 7 K, which is generally the operating temperature range of a cryosorption pump for efficient pumping of hydrogen and helium gas. In this article, we present the experimental results for the epoxy-graphene composite adhesive in the temperature range from 4.5 K to 7 K. We also propose an analytical heat conduction model to find the thermal conductivity of the composite. In this case, the filler particles, such as graphene, are randomly distributed in a base matrix of epoxy. The developed model considers the complete spatial random distribution of filler particles, described by a binomial distribution. The results obtained by the model have been compared with the experimental results as well as with other established models. The developed model is able to predict the thermal conductivity in both the isotropic and anisotropic regions over the required temperature range from 4.5 K to 7 K. 
Due to the non-empirical nature of the proposed model, it will be useful for predicting other properties of composite materials involving a filler in a base matrix. The present studies will aid the understanding of low-temperature heat transfer, which in turn will be useful for the development of high-performance cryosorption pumps.
Keywords: composite adhesive, computational modelling, cryosorption pump, thermal conductivity
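For orientation, a filler-enhanced conductivity estimate can be sketched as follows. This uses the classic Maxwell effective-medium model for dilute spherical fillers — a well-known baseline of the kind the authors compare against, not the binomial-distribution model proposed in the abstract:

```python
def maxwell_keff(k_matrix, k_filler, phi):
    """Maxwell effective-medium estimate of composite thermal conductivity.

    k_matrix : conductivity of the base matrix (e.g. epoxy)
    k_filler : conductivity of the filler (e.g. graphene)
    phi      : filler volume fraction, 0 <= phi <= 1
    """
    num = k_filler + 2.0 * k_matrix + 2.0 * phi * (k_filler - k_matrix)
    den = k_filler + 2.0 * k_matrix - phi * (k_filler - k_matrix)
    return k_matrix * num / den
```

At phi = 0 the estimate reduces to the matrix conductivity, and at phi = 1 to the filler conductivity; for a high-conductivity filler like graphene, even a small volume fraction raises the composite value above the bare epoxy.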
Procedia PDF Downloads 89
1986 A Hybrid of BioWin and Computational Fluid Dynamics Based Modeling of Biological Wastewater Treatment Plants for Model-Based Control
Authors: Komal Rathore, Kiesha Pierre, Kyle Cogswell, Aaron Driscoll, Andres Tejada Martinez, Gita Iranipour, Luke Mulford, Aydin Sunol
Abstract:
Modeling of biological wastewater treatment plants requires several parameters for kinetic rate expressions, thermo-physical properties, and hydrodynamic behavior. The kinetics and associated mechanisms become complex due to several biological processes taking place in wastewater treatment plants at varying temporal and spatial scales. A dynamic process model that incorporated the complex model for activated sludge kinetics was developed using the BioWin software platform for an advanced wastewater treatment plant in Valrico, Florida. Due to the extensive number of tunable parameters, an experimental design was employed for judicious selection of the most influential parameter sets and their bounds. The model was tuned using both the influent and effluent plant data to reconcile and rectify the forecasted results from the BioWin model. The amount of mixed liquor suspended solids in the oxidation ditch, aeration rates, and recycle rates were adjusted accordingly. The experimental analysis and plant SCADA data were used to predict influent wastewater rates and composition profiles as a function of time for extended periods. The lumped dynamic model development process was coupled with Computational Fluid Dynamics (CFD) modeling of key units such as the oxidation ditches in the plant. Several CFD models that incorporate nitrification-denitrification kinetics as well as hydrodynamics were developed and are being tested using the ANSYS Fluent software platform. These realistic and verified models, developed using BioWin and ANSYS, were used to plan in advance the operating policies and control strategies for the biological wastewater plant, which further allows regulatory compliance at minimum operational cost. These models, with modest tuning, can be used for other biological wastewater treatment plants as well. 
The BioWin model mimics the existing performance of the Valrico plant, which allowed the operators and engineers to predict effluent behavior and take control actions to meet the discharge limits of the plant. With the help of this model, we were also able to identify the key kinetic and stoichiometric parameters that are most important for modeling biological wastewater treatment plants. Another important finding from this model was the effect of mixed liquor suspended solids and recycle ratios on the effluent concentrations of parameters such as total nitrogen, ammonia, nitrate, and nitrite. The ANSYS model showed, for example, that dead zone formation increases along the length of the oxidation ditches compared to the regions near the aerators. These profiles were also very useful in studying mixing patterns, the effect of aerator speed, and the use of baffles, which in turn helps in optimizing plant performance.
Keywords: computational fluid dynamics, flow-sheet simulation, kinetic modeling, process dynamics
Procedia PDF Downloads 209
1985 Enhancing Sensitivity in Multifrequency Atomic Force Microscopy
Authors: Babak Eslami
Abstract:
Bimodal and trimodal AFM have added new capabilities to scanning probe microscopy characterization techniques. These capabilities have specifically enhanced material characterization of surfaces and provided subsurface imaging in addition to conventional topography images. Bimodal and trimodal AFM, two multifrequency AFM techniques, are based on exciting the cantilever’s fundamental eigenmode simultaneously with its second, or second and third, eigenmodes. Although higher eigenmodes provide a larger number of observables that can yield additional information about the sample, they introduce experimental challenges. In this work, different experimental approaches for enhancing multifrequency AFM images for different characterization goals are provided. The trade-offs between eigenmodes, including the advantages and disadvantages of using each mode for different samples (ranging from stiff to soft matter) in both air and liquid environments, are discussed. Additionally, the advantage of performing conventional single-tapping-mode AFM with higher eigenmodes of the cantilever in order to reduce sample indentation is discussed. These analyses are performed on widely used polymers such as polystyrene and polymethyl methacrylate, as well as on air nanobubbles on different surfaces, in both air and liquid.
Keywords: multifrequency, sensitivity, soft matter, polymer
Procedia PDF Downloads 134
1984 Testing and Validation of Stochastic Models in Epidemiology
Authors: Snigdha Sahai, Devaki Chikkavenkatappa Yellappa
Abstract:
This study outlines approaches for testing and validating stochastic models used in epidemiology, focusing on the integration and functional testing of simulation code. It details methods for combining simple functions into comprehensive simulations, distinguishing between deterministic and stochastic components, and applying tests to ensure robustness. Techniques include isolating stochastic elements, utilizing large sample sizes for validation, and handling special cases. Practical examples are provided using R code to demonstrate integration testing, handling of incorrect inputs, and special cases. The study emphasizes the importance of both functional and defensive programming to enhance code reliability and user-friendliness.
Keywords: computational epidemiology, epidemiology, public health, infectious disease modeling, statistical analysis, health data analysis, disease transmission dynamics, predictive modeling in health, population health modeling, quantitative public health, random sampling simulations, randomized numerical analysis, simulation-based analysis, variance-based simulations, algorithmic disease simulation, computational public health strategies, epidemiological surveillance, disease pattern analysis, epidemic risk assessment, population-based health strategies, preventive healthcare models, infection dynamics in populations, contagion spread prediction models, survival analysis techniques, epidemiological data mining, host-pathogen interaction models, risk assessment algorithms for disease spread, decision-support systems in epidemiology, macro-level health impact simulations, socioeconomic determinants in disease spread, data-driven decision making in public health, quantitative impact assessment of health policies, biostatistical methods in population health, probability-driven health outcome predictions
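The testing pattern described above — isolating the stochastic element, validating with large samples, and programming defensively — can be sketched as follows. The study's examples use R; this is an equivalent Python sketch built around a hypothetical toy simulator:

```python
import random
import statistics

def simulate_outbreak(n, p_infect, seed=None):
    """Toy stochastic step: each of n contacts becomes infected independently."""
    rng = random.Random(seed)
    return sum(rng.random() < p_infect for _ in range(n))

def safe_simulate(n, p_infect, seed=None):
    """Defensive wrapper: reject invalid inputs before simulating."""
    if not 0.0 <= p_infect <= 1.0:
        raise ValueError("p_infect must be a probability in [0, 1]")
    return simulate_outbreak(n, p_infect, seed)

# 1. isolate the stochastic element: a fixed seed makes a run reproducible
assert simulate_outbreak(100, 0.3, seed=42) == simulate_outbreak(100, 0.3, seed=42)

# 2. validate with a large sample: the mean should approach n * p
runs = [simulate_outbreak(100, 0.3, seed=s) for s in range(2000)]
assert abs(statistics.mean(runs) - 30.0) < 1.0
```

The reproducibility check tests the deterministic plumbing exactly, while the large-sample check tests the stochastic component statistically against its known expectation, mirroring the split the abstract describes.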
Procedia PDF Downloads 6
1983 PEG@GdF3:Tb3+ – RB Nanocomposites for Deep-Seated X-Ray Induced Photodynamic Therapy in Oncology
Authors: E.A. Kuchma
Abstract:
Photodynamic therapy (PDT) is considered an alternative and minimally invasive cancer treatment modality compared to chemotherapy and radiation therapy. PDT includes three main components: a photosensitizer (PS), oxygen, and a light source. The PS is injected into the patient's body and then selectively accumulates in the tumor. However, the light used in PDT (spectral range 400–700 nm) is limited to superficial lesions, as the light penetration depth does not exceed a few cm. This problem of PDT (poor visible light transmission) can be solved by using X-rays: the penetration depth of X-rays is ten times greater than that of visible light, so X-ray radiation easily penetrates the tissues of the body. The aim of this work is to develop universal nanocomposites for X-ray photodynamic therapy of deep-seated and superficial tumors, using scintillation nanoparticles of gadolinium fluoride (GdF3) doped with Tb3+, coated with a biocompatible coating (PEG), and combined with the photosensitizer Rose Bengal (RB). PEG@GdF3:Tb3+(15%) – RB could be used as an effective X-ray, UV, and photoluminescent mediator to excite a photosensitizer, generating reactive oxygen species (ROS) that kill tumor cells via photodynamic therapy. GdF3 nanoparticles can also be used as contrast agents for computed tomography (CT) and magnetic resonance imaging (MRI).
Keywords: X-ray induced photodynamic therapy, scintillating nanoparticle, radiosensitizer, photosensitizer
Procedia PDF Downloads 80
1982 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches
Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez
Abstract:
Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. The simulation methods are computationally intensive and time-consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM), or computational mechanics could be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) could be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, the resulting moments of the LSF must be fitted to a probability density function (PDF). In the present study, a very simple alternative is employed that allows the assessment of reliability levels when no explicit LSF is available, without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of fitting the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method (the first order reliability method, FORM). 
The results in the present study are in good agreement with those computed with the MCS. Therefore, mixing the reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or when numerical schemes, the FEM, or computational mechanics are employed.
Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, monte carlo simulation
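The moment-estimation step of the PEM can be sketched as follows — a minimal Python illustration of Rosenblueth's two-point estimate for independent, symmetrically distributed variables, applied to a hypothetical linear limit state g = R - S (the paper's bridges require evaluating g through a numerical scheme instead):

```python
import itertools

def two_point_estimate(g, means, stds):
    """Rosenblueth's two-point estimate of the mean and std of g(X).

    g is evaluated at the 2^n combinations of mean +/- std, each with
    weight 1/2^n (independent variables, symmetric distributions).
    """
    n = len(means)
    vals = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        x = [mu + s * sd for mu, s, sd in zip(means, signs, stds)]
        vals.append(g(*x))
    mean = sum(vals) / len(vals)
    var = sum(v * v for v in vals) / len(vals) - mean ** 2
    return mean, var ** 0.5

# hypothetical limit state: resistance minus load
g = lambda r, s: r - s
m, sd = two_point_estimate(g, means=[10.0, 6.0], stds=[1.0, 1.5])
beta = m / sd  # Cornell reliability index from the PEM moments
```

For this linear g the PEM moments are exact (mean 4.0, standard deviation sqrt(3.25)), so only 2^n = 4 evaluations of g replace the thousands of runs a Monte Carlo estimate would need — the advantage that matters when each evaluation is an FEM solve.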
Procedia PDF Downloads 346
1981 Learning from a Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach
Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak
Abstract:
Computer vision systems have recently made a big leap thanks to deep neural networks. However, these systems require large, correctly labeled datasets in order to be trained properly, which are very difficult to obtain for medical applications. Two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes each sample very important in learning. As a result, if not handled properly, label noise significantly degrades performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on a retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves the classification algorithm's performance in the presence of noisy labels.
Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity
Procedia PDF Downloads 161
1980 A Rare Case of Dissection of Cervical Portion of Internal Carotid Artery, Diagnosed Postpartum
Authors: Bidisha Chatterjee, Sonal Grover, Rekha Gurung
Abstract:
Postpartum dissection of the internal carotid artery is a relatively rare condition and is considered an underlying aetiology in 5% to 25% of strokes in patients under 30 to 45 years of age. However, 86% of these cases recover completely, and 14% have mild focal neurological symptoms. Prognosis is generally good with early intervention. The risk quoted for a repeat carotid artery dissection in subsequent pregnancies is less than 2%. A 36-year-old Caucasian primipara presented with tachycardia on postnatal day one after a forceps delivery. In the intrapartum period, she had a history of prolonged rupture of membranes, developed intrapartum sepsis, and was treated with antibiotics. Postpartum ECG showed septal inferior T wave inversion and a troponin level of 19. Subsequently, an echocardiogram ruled out postpartum cardiomyopathy. A repeat ECG showed improvement of the previous changes, and in the absence of symptoms, no intervention was warranted. On day 4 post-delivery, she developed a droopy right eyelid, pain around the right eye, and itching in the right ear. On examination, she had right-sided ptosis and unequal pupils (right miotic pupil). Cranial nerve examination, reflexes, sensory examination, and muscle power were normal. Apart from migraine, there was no medical or family history of note. In view of the right-sided Horner’s syndrome, she had a CT angiogram and subsequently MRI/MRA and was diagnosed with dissection of the cervical portion of the right internal carotid artery. She was discharged on a course of aspirin 75 mg. By the 6-week postnatal follow-up, the patient had recovered significantly, with occasional episodes of unequal pupils and tingling of the right toes, which resolved spontaneously. Cervical artery dissections, including vertebral artery dissection (VAD) and carotid artery dissection, are rare complications of pregnancy, with an estimated annual incidence of 2.6–3 per 100,000 pregnancy hospitalizations. 
The aetiology remains unclear, though trauma from straining during labour, underlying arterial disease, and preeclampsia have been implicated. The hypercoagulable state during pregnancy and the puerperium could also be an important factor. 60-90% of cases present with severe headache and neck pain, which generally precede neurological symptoms like ipsilateral Horner’s syndrome, retroorbital pain, tinnitus, and cranial nerve palsy. Although rare, the consequences of delayed diagnosis and management can include severe and permanent neurological deficits. When there is a strong index of suspicion, patients should undergo MRI or MRA of the head and neck. Antithrombotic and antiplatelet therapy forms the mainstay of treatment, with selected cases needing endovascular stenting. Long-term prognosis is favourable, with either complete resolution or minimal deficit if treatment is prompt. Patients should be counselled about the recurrence risk and the possibility of stroke in a future pregnancy. Carotid artery dissection is rare and treatable but needs early diagnosis and treatment. Postpartum headache and neck pain with neurological symptoms should prompt urgent imaging followed by antithrombotic and/or antiplatelet therapy. Most cases resolve completely or with minimal sequelae.
Keywords: postpartum, dissection of internal carotid artery, magnetic resonance angiogram, magnetic resonance imaging, antiplatelet, antithrombotic
Procedia PDF Downloads 97
1979 Implementation of Edge Detection Based on Autofluorescence Endoscopic Image on Field Programmable Gate Array
Authors: Hao Cheng, Zhiwu Wang, Guozheng Yan, Pingping Jiang, Shijia Qin, Shuai Kuang
Abstract:
Autofluorescence imaging (AFI) is a technology developed in recent years for detecting early carcinogenesis of the gastrointestinal tract. Compared with traditional white light endoscopy (WLE), this technology greatly improves the detection accuracy of early carcinogenesis, because the colors of normal tissues differ from those of cancerous tissues; thus, edge detection can distinguish them in grayscale images. In this paper, the traditional Sobel edge detection method is optimized for the gastrointestinal environment with adaptive thresholding and morphological processing. All of the processing is implemented on our self-designed system based on the OV6930 image sensor and a Field Programmable Gate Array (FPGA). The system can capture the gastrointestinal image taken by the lens in real time and detect edges. The final experiments verified the feasibility of our system and the effectiveness and accuracy of the edge detection algorithm.
Keywords: AFI, edge detection, adaptive threshold, morphological processing, OV6930, FPGA
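The Sobel-plus-adaptive-threshold pipeline can be sketched in software as follows — a minimal Python/NumPy illustration in which a mean-plus-k·std threshold and an isolated-pixel cleanup stand in for the paper's FPGA-specific adaptive threshold and morphological processing:

```python
import numpy as np

def sobel_edges(img, k=1.0):
    """Sobel gradient magnitude, adaptive thresholding, and a crude
    morphological cleanup that drops isolated edge pixels."""
    img = np.asarray(img, dtype=float)
    gx_k = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    gy_k = gx_k.T
    h, w = img.shape
    mag = np.zeros((h, w))
    # 3x3 convolution over the valid region (1-pixel border skipped)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = img[i - 1:i + 2, j - 1:j + 2]
            mag[i, j] = np.hypot((win * gx_k).sum(), (win * gy_k).sum())
    # adaptive threshold derived from the image's own gradient statistics
    edges = mag > mag.mean() + k * mag.std()
    # morphological cleanup: drop edge pixels with no edge neighbour
    clean = edges.copy()
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            if edges[i, j] and edges[i - 1:i + 2, j - 1:j + 2].sum() == 1:
                clean[i, j] = False
    return clean
```

Because the threshold is recomputed from each frame's gradient statistics, it adapts to the varying illumination of endoscopic scenes rather than relying on a fixed cutoff.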
Procedia PDF Downloads 230
1978 CT Images Based Dense Facial Soft Tissue Thickness Measurement by Open-source Tools in Chinese Population
Authors: Ye Xue, Zhenhua Deng
Abstract:
Objectives: Facial soft tissue thickness (FSTT) data can be obtained from CT scans by measuring face-to-skull distances at sparsely distributed anatomical landmarks manually located on the face and skull. However, automated dense-point measurement using 3D facial and skull models in open-source software has become a viable option due to the development of computer-assisted imaging technologies. By utilizing dense FSTT information, it becomes feasible to generate plausible automated facial approximations. Therefore, establishing a comprehensive, detailed, and densely calculated FSTT database is crucial for enhancing the accuracy of facial approximation. Materials and methods: This study utilized head CT scans from 250 Chinese adults of Han ethnicity, with 170 participants born and residing in northern China and 80 participants in southern China. The age of the participants ranged from 14 to 82 years, and all samples were divided into five non-overlapping age groups. Additionally, samples were divided into three categories based on BMI information. The 3D Slicer software was utilized to segment bone and soft tissue based on different Hounsfield Unit (HU) thresholds, and surface models of the face and skull were reconstructed for all samples from the CT data. The following procedures were performed using MeshLab: converting the face models into hollowed, cropped surface models and automatically measuring the Hausdorff distance (referred to as FSTT) between the skull and face models. Hausdorff point clouds were colorized based on depth value and exported as PLY files. A histogram of the depth distribution could be viewed and subdivided into smaller increments. All PLY files were visualized with the Hausdorff distance value of each vertex. Basic descriptive statistics (i.e., mean, maximum, minimum, and standard deviation) and the distribution of FSTT were analyzed considering sex, age, BMI, and birthplace. 
Statistical methods employed included multiple regression analysis, ANOVA, and principal component analysis (PCA). Results: The distribution of FSTT is mainly influenced by BMI and sex, as further supported by the results of the PCA. Additionally, FSTT values exceeding 30 mm were found to be more sensitive to sex. Birthplace-related differences were observed in regions such as the forehead, orbital, mandibular, and zygoma regions. Specifically, there are distribution variances in the depth range of 20-30 mm, particularly in the mandibular region. Northern males exhibit thinner FSTT in the frontal region of the forehead compared to southern males, while females show fewer distribution differences between north and south, except in the zygoma region. The observed distribution variance in the orbital region could be attributed to differences in orbital size and shape. Discussion: This study provides a database of the FSTT distribution in Chinese individuals and suggests that open-source tools function well for FSTT measurement. By incorporating birthplace as an influential factor in the distribution of FSTT, a greater level of detail can be achieved in facial approximation.
Keywords: forensic anthropology, forensic imaging, cranial facial reconstruction, facial soft tissue thickness, CT, open-source tool
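The per-vertex distance that MeshLab's Hausdorff filter reports — the quantity treated as FSTT here — can be sketched as follows, a brute-force Python/NumPy illustration on hypothetical toy point clouds (the actual pipeline operates on the reconstructed surface models):

```python
import numpy as np

def per_vertex_depth(face_pts, skull_pts):
    """For each face vertex, the distance to the nearest skull vertex:
    the per-vertex quantity a one-sided Hausdorff computation yields."""
    d = np.linalg.norm(face_pts[:, None, :] - skull_pts[None, :, :], axis=2)
    return d.min(axis=1)

# toy example: a 'face' grid of points 5 mm in front of a 'skull' grid
skull = np.array([[x, y, 0.0] for x in range(5) for y in range(5)])
face = skull + np.array([0.0, 0.0, 5.0])
depths = per_vertex_depth(face, skull)
# descriptive statistics analogous to the paper's FSTT summaries
stats = {"mean": depths.mean(), "max": depths.max(), "std": depths.std()}
```

On real meshes with tens of thousands of vertices, the same idea is computed with spatial indexing rather than a full distance matrix, and the resulting per-vertex depths are what get colorized and exported as PLY files.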
Procedia PDF Downloads 58
1977 Theory of Mind and Its Brain Distribution in Patients with Temporal Lobe Epilepsy
Authors: Wei-Han Wang, Hsiang-Yu Yu, Mau-Sun Hua
Abstract:
Theory of Mind (ToM) refers to the ability to infer another’s mental state. With appropriate ToM, one can behave well in social interactions. A growing body of evidence has demonstrated that patients with temporal lobe epilepsy (TLE) may have impaired ToM due to damage to regions of the underlying neural network of ToM. However, the question of whether there is cerebral laterality for ToM functions remains open. This study aimed to examine whether there is cerebral lateralization of ToM abilities in TLE patients. Sixty-seven adult TLE patients and 30 matched healthy controls (HC) were recruited. Patients were classified into right (RTLE), left (LTLE), and bilateral (BTLE) TLE groups on the basis of a consensus panel review of their seizure semiology, EEG findings, and brain imaging results. All participants completed an intellectual test and four tasks measuring basic and advanced ToM. The results showed that, on all ToM tasks: (1) each patient group performed worse than HC; (2) there were no significant differences between the LTLE and RTLE groups; (3) the BTLE group performed the worst. It appears that the neural network responsible for ToM is distributed evenly between the cerebral hemispheres.
Keywords: cerebral lateralization, social cognition, temporal lobe epilepsy, theory of mind
Procedia PDF Downloads 420