Search results for: algebraic geometric imaging approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15309

14559 Iron Oxide Magnetic Nanoparticles as MRI Contrast Agents

Authors: Suhas Pednekar, Prashant Chavan, Ramesh Chaughule, Deepak Patkar

Abstract:

Iron oxide (Fe3O4) magnetic nanoparticles (MNPs) are among the most attractive nanomaterials for biomedical applications, and an important potential medical application of polymer-coated iron oxide nanoparticles (NPs) is as imaging agents. The composition, size, morphology and surface chemistry of these nanoparticles can now be tailored by various processes not only to improve their magnetic properties but also to control their behavior in vivo. MNPs are being actively investigated as the next generation of magnetic resonance imaging (MRI) contrast agents, and there is considerable interest in modifying their surfaces with therapeutic agents. Our study involves synthesizing iron oxide nanoparticles coated with a biocompatible cancer drug and evaluating their efficacy as MRI contrast agents. A simple, rapid microwave method for preparing Fe3O4 nanoparticles was developed, and the drug was successfully conjugated to the Fe3O4 nanoparticles, which can be used for various applications. The relaxivity R2 (the reciprocal of the spin-spin relaxation time T2) is an important measure of the efficacy of Fe nanoparticles as contrast agents in MRI experiments. R2 values of the coated magnetic nanoparticles were measured by MRI, and the results showed that R2 of the complex consisting of Fe3O4, polymer and drug was higher than that of bare and polymer-coated Fe nanoparticles; this is attributed to the increase in the hydrodynamic size of the Fe NPs. Results for various iron molar concentrations are also discussed: using MRI, the R2 relaxivity is seen to increase linearly with the concentration of Fe NPs in water.
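The linear dependence of R2 on Fe concentration described above can be sketched with a simple least-squares fit, where the slope of the fitted line is the relaxivity of the agent. The concentrations and R2 values below are illustrative placeholders, not the authors' measurements.

```python
# Hypothetical sketch: R2 (1/T2, in s^-1) rises linearly with Fe molar
# concentration; the slope of the fit is the relaxivity r2 of the agent.
# The data points are invented for illustration only.

def fit_relaxivity(conc_mM, R2_s):
    """Return (slope, intercept) of the least-squares line R2 = r2*[Fe] + R2_0."""
    n = len(conc_mM)
    mx = sum(conc_mM) / n
    my = sum(R2_s) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc_mM, R2_s))
    sxx = sum((x - mx) ** 2 for x in conc_mM)
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [0.1, 0.2, 0.4, 0.8]    # Fe concentration, mM (illustrative)
R2   = [4.0, 7.0, 13.0, 25.0]  # measured 1/T2, s^-1 (illustrative)
r2, R2_water = fit_relaxivity(conc, R2)  # slope = relaxivity, intercept = solvent R2
```

With these toy points the fit recovers a relaxivity of 30 s^-1 mM^-1 over a solvent baseline of 1 s^-1.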

Keywords: cancer drug, hydrodynamic size, magnetic nanoparticles, MRI

Procedia PDF Downloads 484
14558 Task Based Language Learning: A Paradigm Shift in ESL/EFL Teaching and Learning: A Case Study Based Approach

Authors: Zehra Sultan

Abstract:

The study is based on the task-based language teaching (TBLT) approach, which has been found to be very effective in the EFL/ESL classroom. This approach engages learners in acquiring authentic language skills by interacting with the real world through a sequence of pedagogical tasks, and its effectiveness is enhanced by the use of technology. This study sheds light on the historical background of TBLT and its efficacy in the EFL/ESL classroom. In addition, it describes the implementation of the approach in the General Foundation Programme (GFP) of Muscat College, Oman, and furnishes the list of pedagogical tasks embedded in the GFP language curriculum, which are skillfully aligned with the College Graduate Attributes. Moreover, the study discusses the challenges of this approach from the points of view of teachers and students and in its classroom application. The operational success of the methodology is gauged through the formative assessments of the GFP and is apparent in the students' progress.

Keywords: task-based language teaching, authentic language, communicative approach, real world activities, ESL/EFL activities

Procedia PDF Downloads 120
14557 Behind Fuzzy Regression Approach: An Exploration Study

Authors: Lavinia B. Dulla

Abstract:

This exploration study of the fuzzy regression approach attempts to show that fuzzy regression can be used as an alternative to classical regression. It likewise assesses the differences between simple linear regression and fuzzy regression using the width of the prediction interval, the mean absolute deviation, and the variance of the residuals. Based on the simple linear regression model, the fuzzy regression approach is worth considering as an alternative when the sample size is between 10 and 20. As the sample size increases, fuzzy regression becomes less appropriate, since the large-sample assumptions of simple linear regression then apply. Nonetheless, it can be suggested as a practical alternative when decisions must often be made on the basis of small data sets.
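The classical-regression side of the comparison can be sketched as follows: fit a simple linear model and report two of the residual diagnostics the abstract compares (mean absolute deviation and variance of residuals). The fuzzy-regression counterpart (a Tanaka-style linear program) is not reproduced here, and the data are illustrative.

```python
import statistics

# Fit y = slope*x + intercept by ordinary least squares, then compute the
# residual diagnostics used in the comparison. Data are invented (roughly
# y = 2x + 1 with small noise), not the study's samples.

def ols(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return b, my - b * mx

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.1, 8.9]
slope, intercept = ols(xs, ys)
residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
mad = sum(abs(r) for r in residuals) / len(residuals)  # mean absolute deviation
var = statistics.pvariance(residuals)                  # variance of residuals
```

For a small sample like this, the same diagnostics computed for the fuzzy model (prediction-interval width in particular) are what the study uses to judge which approach is preferable.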

Keywords: fuzzy regression approach, minimum fuzziness criterion, interval regression, prediction interval

Procedia PDF Downloads 293
14556 Characterization of the Worn Surfaces of Brake Discs and Friction Materials after Dynobench Tests

Authors: Ana Paula Gomes Nogueira, Pietro Tonolini, Andrea Bonfanti

Abstract:

Automotive braking systems convert kinetic energy into thermal energy by friction. The disc brake is currently the most widespread configuration on the automotive market, as its geometry provides very efficient heat dissipation. At the same time, both discs and pads wear out. Different wear mechanisms can act during braking, so understanding the phenomenon is essential when strategies for an increased lifetime of the components are required. In this study, a specific characterization approach was used to analyze the worn surfaces of commercial pad friction materials and their counterface cast-iron discs after dynobench tests. Scanning electron microscopy (SEM), confocal microscopy, and focused ion beam (FIB) microscopy were the main analysis tools, and they allowed imaging of the footprints of the different wear mechanisms present on the worn surfaces. Aspects such as temperature and the specific ingredients of the pad friction materials are discussed, since they play an important role in the wear mechanisms.

Keywords: wear mechanism, surface characterization, brake tests, friction materials, disc brake

Procedia PDF Downloads 50
14555 Evaluating Structural Crack Propagation Induced by Soundless Chemical Demolition Agent Using an Energy Release Rate Approach

Authors: Shyaka Eugene

Abstract:

The efficient and safe demolition of structures is a critical challenge in civil engineering and construction. This study focuses on the development of optimal demolition strategies by investigating the crack propagation behavior in beams induced by soundless chemical demolition agents (SCDAs), which are commonly used in controlled demolition and have gained prominence due to their non-explosive and environmentally friendly nature. This research employs a comprehensive experimental and computational approach to analyze crack initiation, propagation, and eventual failure in beams subjected to soundless cracking agents. Experimental testing involves the application of various cracking agents under controlled conditions to understand their effects on the structural integrity of beams, with high-resolution imaging and strain measurements used to capture the crack propagation process. In parallel, numerical simulations are conducted using advanced finite element analysis (FEA) techniques to model crack propagation in beams, considering parameters such as cracking agent composition, loading conditions, and beam properties. The FEA models are validated against experimental results, ensuring their accuracy in predicting crack propagation patterns. The findings provide valuable insights into optimizing demolition strategies, allowing engineers and demolition experts to make informed decisions regarding the selection of cracking agents, their application techniques, and structural reinforcement methods. Ultimately, this research contributes to enhancing the safety, efficiency, and sustainability of demolition practices in the construction industry, reducing environmental impact and ensuring the protection of adjacent structures and the surrounding environment.
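The energy-release-rate criterion named in the keywords can be illustrated with the classical Griffith/Irwin relation G = K_I^2/E (plane stress), with K_I = sigma*sqrt(pi*a) for a through crack. The stress, crack length and modulus below are generic example values, not taken from the paper's beams.

```python
import math

# Illustrative energy-release-rate check (Griffith/Irwin, plane stress).
# sigma: stress induced by the expanding agent (Pa), a: crack half-length (m),
# E: Young's modulus (Pa). All numbers are generic examples.

def energy_release_rate(sigma, a, E):
    K_I = sigma * math.sqrt(math.pi * a)  # mode-I stress intensity factor
    return K_I ** 2 / E                   # G, J/m^2 (plane stress)

def crack_grows(G, G_c):
    """Griffith criterion: the crack advances once G reaches the toughness G_c."""
    return G >= G_c

G = energy_release_rate(sigma=10e6, a=0.01, E=30e9)  # ~105 J/m^2
```

Comparing G against a measured critical value G_c is the decision step the abstract's energy-release-rate approach rests on.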

Keywords: expansion pressure, energy release rate, soundless chemical demolition agent, crack propagation

Procedia PDF Downloads 56
14554 Numerical Analysis of Heat Transfer in Water Channels of the Opposed-Piston Diesel Engine

Authors: Michal Bialy, Marcin Szlachetka, Mateusz Paszko

Abstract:

This paper discusses CFD results for heat transfer in the water channels of the engine body. The research engine was a newly designed opposed-piston Diesel engine with three cylinders, each containing a pair of opposed pistons. The engine is expected to generate 100 kW of mechanical power at a crankshaft speed of 3,800-4,000 rpm. The water channels run in the engine body along the axes of the three cylinders, surround the three combustion chambers, and transfer the combustion heat generated in the cylinders to the external radiator. This CFD research was based on the ANSYS Fluent software and aimed to optimize the geometry of the water channels so as to maximize the heat flow from the combustion chambers to the external radiator. Based on parallel simulation research, the boundary and initial conditions enabled us to specify average values of the key parameters for the numerical analysis. The simulations used the averaged momentum equations and the two-equation k-epsilon turbulence model; a realizable k-epsilon model with standard wall functions was applied, with a turbulence intensity of 10%. The working-fluid mass flow rate was set to a single typical value, chosen in line with research into the flow rates of cooling pumps used in automotive engines of similar power. The research uses a series of geometric models which differ, for instance, in the shape of the channel cross-section along the cylinder axis. The results are presented as colour distribution maps of temperature, velocity fields and heat flow through the cylinder walls. Due to space limitations, this paper presents the results for the most representative geometric model only. Acknowledgement: This work was realized in cooperation with The Construction Office of WSK ‘PZL-KALISZ’ S.A. and is part of Grant Agreement No. POIR.01.02.00-00-0002/15 financed by the Polish National Centre for Research and Development.
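A back-of-the-envelope coolant heat balance shows how a typical mass flow rate for such simulations can be estimated. The heat-rejection fraction and temperature rise below are assumptions for illustration, not values from the paper.

```python
# Rough coolant-side heat balance, Q = m_dot * c_p * dT. Assumes the heat
# rejected to the coolant is comparable to the 100 kW mechanical output
# (a common rule of thumb, not a figure from the paper).

Q_coolant = 100e3  # heat to coolant, W (assumed)
c_p = 4186.0       # specific heat of water, J/(kg*K)
dT = 8.0           # coolant temperature rise across the engine, K (assumed)

m_dot = Q_coolant / (c_p * dT)  # required coolant mass flow rate, kg/s
```

Under these assumptions the pump would need to deliver roughly 3 kg/s, which is the kind of single typical value the abstract describes fixing for the inlet boundary condition.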

Keywords: ANSYS Fluent, combustion engine, computational fluid dynamics (CFD), cooling system

Procedia PDF Downloads 213
14553 A Novel Approach of Power Transformer Diagnostic Using 3D FEM Parametrical Model

Authors: M. Brandt, A. Peniak, J. Makarovič, P. Rafajdus

Abstract:

This paper deals with a novel approach to power transformer diagnostics. The approach identifies the exact location and extent of a fault in the transformer and helps reduce the operating costs related to handling a faulty transformer, its disassembly and repair. Its advantage is the possibility of simulating a healthy transformer, as well as all faults which can occur during operation, without disassembling the transformer, which is very expensive in practice. The approach is based on creating the frequency-dependent impedance of the transformer by sweep frequency response analysis measurements and by 3D FE parametric modeling of the fault; the parameters of the 3D FE model are the position and extent of the axial short circuit. Then, by comparing the frequency-dependent impedances of the parametric models with the measured ones, the location and extent of the fault are identified. The approach was tested on a real transformer and showed high coincidence between the real fault and the simulated one.
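The final matching step can be sketched as follows: each candidate fault (position, extent) has a simulated frequency-dependent impedance curve, and the fault whose curve is closest to the measurement is reported. The curves below are synthetic stand-ins for the 3D FEM outputs, and RMS deviation is used as a simple closeness measure.

```python
import math

# Toy version of the fault-matching step. Keys are hypothetical
# (position, range) fault parameters; values are |Z(f)| samples in ohms,
# invented here in place of the 3D FEM parametric sweep.

def rms_deviation(z_meas, z_sim):
    return math.sqrt(sum((m - s) ** 2 for m, s in zip(z_meas, z_sim)) / len(z_meas))

simulated = {
    ("top", "2 turns"):    [110.0, 95.0, 80.0, 72.0],
    ("middle", "2 turns"): [120.0, 90.0, 70.0, 65.0],
    ("bottom", "5 turns"): [130.0, 85.0, 60.0, 50.0],
}
measured = [121.0, 89.0, 71.0, 64.0]  # SFRA measurement (toy, near "middle")

# Report the parametric model whose impedance curve best matches the measurement.
fault = min(simulated, key=lambda p: rms_deviation(measured, simulated[p]))
```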

Keywords: transformer, parametrical model of transformer, fault, sweep frequency response analysis, finite element method

Procedia PDF Downloads 477
14552 The Role of Eclectic Approach to Teach Communicative Function at Secondary Level

Authors: Fariha Asif

Abstract:

The main purpose of this study was to investigate the effectiveness of the eclectic approach in the teaching of communicative functions. The objectives were to gather information about the use of communicative functions through the eclectic approach and to identify the most effective way of teaching functional communication and social interaction with the help of communicative activities. The next step was to select a sample from the chosen population. As the research was descriptive, a questionnaire was developed on the basis of the hypothesis and distributed to selected schools in Lahore, Pakistan. The data were then tabulated, analyzed and interpreted by computing the percentages of the responses given by teachers. It was concluded that the eclectic approach is effective in teaching communicative functions, that communicative functions are taught better through the eclectic approach, and that communicative activities are the more appropriate way of teaching them. Teachers qualified in ELT gave more informed opinions than those without such a degree. Techniques like presentations, dialogues and role-play proved effective for teaching functional communication through communicative activities and also motivated the students not only to learn rules but also to use them to communicate with others.

Keywords: methodology, functions, teaching, ESP

Procedia PDF Downloads 566
14551 Flame Volume Prediction and Validation for Lean Blowout of Gas Turbine Combustor

Authors: Ejaz Ahmed, Huang Yong

Abstract:

The operation of aero engines is critically important in the vicinity of the lean blowout (LBO) limit. Lefebvre’s empirical-correlation model of LBO has been extended by the authors to a flame volume concept. The flame volume takes into account the geometric configuration and the complex spatial interaction of mixing, turbulence, heat transfer and combustion processes inside the gas turbine combustion chamber; for these reasons, flame-volume-based LBO predictions are more accurate. The improved accuracy, however, comes with the challenge of estimating Vf in real gas turbine combustors. This work extends the flame volume prediction approach, previously based on fuel-iterative approximation with cold flow simulations, to reactive flow simulations. Flame volumes for 11 combustor configurations have been simulated and validated against experimental data. To make the prediction methodology robust, as required in the preliminary design stage, reactive flow simulations were carried out with a combination of the probability density function (PDF) and discrete phase model (DPM) in FLUENT 15.0. A criterion for flame identification was defined, and two important parameters, i.e. the critical injection diameter (Dp,crit) and the critical temperature (Tcrit), were identified; their influence on the reactive flow simulation was studied for Vf estimation. The obtained results exhibit ±15% error in Vf estimation relative to experimental data.

Keywords: CFD, combustion, gas turbine combustor, lean blowout

Procedia PDF Downloads 265
14550 An Approach from Fichte as a Response to the Kantian Dualism of Subject and Object: The Unity of the Subject and Object in Both Theoretical and Ethical Possibility

Authors: Mengjie Liu

Abstract:

This essay aims at responding to Kant's problem of how to fit the self-caused subject into the deterministic object that follows natural laws. It mainly adopts an approach abstracted from Fichte’s Wissenschaftslehre (Doctrine of Science) to picture a possible resolution of the Kantian dualism. The Fichtean approach is based on the unity of theoretical and practical reason, which can be understood as a philosophical abstraction from ordinary experience combining both subject and object. The essay first discusses the general problem of Kantian dualism and Fichte’s unity approach. The second section elaborates on the achievement of this unity of subject and object through Fichte’s process of “the I posits itself.” The third section concerns the ethical unity of subject and object on the Fichtean approach. The essay also discusses the limitations of Fichte’s approach from two perspectives: (1) the theoretical possibility of the existence of the pure I, and (2) Schelling’s objection that the Absolute I is a result rather than the originating act. The essay thus demonstrates a possible approach to unifying subject and object supported by Fichte’s “Absolute I” and his ethical theories, while also pointing out the limitations of Fichte’s theories.

Keywords: Fichte, identity, Kantian dualism, Wissenschaftslehre

Procedia PDF Downloads 86
14549 Infrared Thermography as an Informative Tool in Energy Audit and Software Modelling of Historic Buildings: A Case Study of the Sheffield Cathedral

Authors: Ademuyiwa Agbonyin, Stamatis Zoras, Mohammad Zandi

Abstract:

This paper investigates the extent to which building energy modelling can be informed by preliminary information from infrared thermography, using a thermal imaging camera in a walkthrough audit. The case-study building is Sheffield Cathedral, built in the early 1400s. Based on a qualitative report generated from the thermal images taken at the site, the regions showing significant heat loss are input into a computer model of the cathedral within the Integrated Environmental Solutions (IES) Virtual Environment software, which performs an energy simulation to determine quantitative heat losses through the building envelope. Building data such as material thermal properties and building plans were provided by the architects, Thomas Ford and Partners Ltd. The modelling revealed the portions of the building with the highest heat loss, and these aligned with those suggested by the thermal camera. Retrofit options for the building are also considered; however, they may not see implementation due to the desire to conserve the building's architectural heritage. The results show that thermal imaging in a walkthrough audit serves as a useful guide for the energy modelling process. Hand calculations were also performed as a 'control' to estimate losses, providing a second set of data points for comparison.
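The 'hand calculation' control mentioned at the end is typically a steady-state fabric heat loss estimate, Q = U * A * dT summed over envelope elements. The U-values, areas and temperature difference below are illustrative assumptions, not the cathedral's surveyed figures.

```python
# Sketch of a fabric heat-loss hand calculation: Q = U * A * dT per element.
# U-values (W/m^2K) and areas (m^2) are invented for illustration only.

elements = [
    ("roof",    2.5, 400.0),
    ("walls",   1.8, 600.0),
    ("windows", 4.8, 120.0),
]
dT = 15.0  # indoor-outdoor temperature difference, K (assumed)

Q_total = sum(U * A * dT for _, U, A in elements)      # total fabric loss, W
worst = max(elements, key=lambda e: e[1] * e[2] * dT)  # biggest contributor
```

Ranking elements by U*A*dT gives the same kind of "highest-loss region" shortlist that the thermography walkthrough provides qualitatively.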

Keywords: historic buildings, energy retrofit, thermal comfort, software modelling, energy modelling

Procedia PDF Downloads 167
14548 Inversely Designed Chipless Radio Frequency Identification (RFID) Tags Using Deep Learning

Authors: Madhawa Basnayaka, Jouni Paltakari

Abstract:

Fully passive backscattering chipless RFID tags are an emerging wireless technology offering low cost, longer reading distance, and fast automatic identification without human interference, unlike already available technologies such as optical barcodes. Design optimization is crucial for chipless RFID tags because the integrated chips found in conventional RFID tags are replaced by printed geometric designs, which encode and decode data through their backscattered electromagnetic (EM) signatures. Applications of chipless RFID tags have been limited by constraints on data encoding capacity and by the difficulty of designing accurate yet efficient configurations. The traditional approach to finding design parameters for a desired EM response is to adjust the parameters iteratively and simulate until the desired EM spectrum is achieved; however, numerical simulation is slow and resource-intensive, which limits how efficiently the parameters can be optimized. In this work, a deep neural network (DNN) is used to establish the correlation between the EM spectrum and the dimensional parameters of nested concentric rings, specifically square and octagonal ones. The proposed bi-directional DNN comprises two simultaneously running networks, one for spectrum prediction and one for design parameter prediction. First, the spectrum prediction DNN was trained to minimize the mean squared error (MSE); once trained, it could accurately predict the EM spectrum for given design parameters within a few seconds. The trained spectrum prediction DNN was then connected to the design parameter prediction DNN, and the two networks were trained simultaneously. For the first time in chipless tag design, design parameters were accurately predicted for a desired EM spectrum after training the bi-directional DNN. The model was evaluated using a randomly generated spectrum, and a tag was manufactured using the predicted geometrical parameters; the manufactured tags were successfully tested in the laboratory. The number of iterative computer simulations is significantly decreased by this approach; the highly efficient, ultrafast bi-directional DNN thus enables rapid and complicated chipless RFID tag designs.
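The "train until the spectrum error is minimized" step can be illustrated with a drastically simplified stand-in: a one-layer linear surrogate fitted by gradient descent on MSE against a toy forward map. The function simulate() below is a hypothetical placeholder for the full-wave EM solver, and the real work uses a multi-layer DNN, not this linear model.

```python
# Minimal stand-in for the spectrum-prediction network: a linear surrogate
# trained by plain gradient descent to minimize MSE against a toy 'EM solver'.
# simulate() is invented; the paper uses full-wave simulations of nested
# concentric rings and a deep network.

def simulate(d):             # hypothetical solver: ring dimension -> feature
    return 3.0 * d + 0.5

xs = [0.1 * i for i in range(1, 11)]  # candidate ring dimensions
ys = [simulate(x) for x in xs]        # 'simulated spectrum' feature

def mse(w, b):
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

w = b = 0.0
lr = 0.05
initial = mse(w, b)
for _ in range(2000):                 # gradient descent on the MSE loss
    gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w, b = w - lr * gw, b - lr * gb
final = mse(w, b)
```

After training, the surrogate reproduces the forward map (w ≈ 3, b ≈ 0.5), which is the property that makes a fast learned predictor usable in place of repeated simulations.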

Keywords: artificial intelligence, chipless RFID, deep learning, machine learning

Procedia PDF Downloads 43
14547 Generalization of Tsallis Entropy from a Q-Deformed Arithmetic

Authors: J. Juan Peña, J. Morales, J. García-Ravelo, J. García-Martínez

Abstract:

It is known that, by introducing alternative forms of the exponential and logarithmic functions, the Tsallis entropy Sq becomes a generalization of the Shannon entropy S. In this work, starting from a deformation through a scaling function applied to the differential operator, it is possible to generate a q-deformed calculus as well as a q-deformed arithmetic, which allows generalizing not only the exponential and logarithmic functions but also any other standard function. The q-deformed differential operator leads to a q-deformed integral operator under which functions are integrated together with a weight function. For each differentiable function, it is possible to identify its q-deformed partner, which is useful for generalizing other algebraic relations satisfied by the original functions. As an application of this proposal, a generalization of the exponential and logarithmic functions is studied in such a way that their relationship with the thermodynamic functions, particularly the entropy, yields q-deformed expressions of these. As a result, from a particular scaling function applied to the differential operator, a q-deformed arithmetic is obtained, leading to the generalization of the Tsallis entropy.
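The standard one-parameter q-deformations behind this abstract can be written down directly. The paper constructs them via a scaling function, but the resulting functions agree in form with the usual Tsallis conventions: ln_q(x) = (x^(1-q) - 1)/(1-q), exp_q(x) = (1 + (1-q)x)^(1/(1-q)), and S_q = (1 - sum p_i^q)/(q - 1), which recovers the Shannon entropy as q -> 1.

```python
import math

# Tsallis-convention q-deformed logarithm, exponential, and entropy.
# At q = 1 each reduces to its standard counterpart.

def ln_q(x, q):
    return math.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

def exp_q(x, q):
    return math.exp(x) if q == 1 else (1 + (1 - q) * x) ** (1 / (1 - q))

def tsallis_entropy(p, q):
    if q == 1:  # Shannon limit, in nats
        return -sum(pi * math.log(pi) for pi in p if pi)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

x = exp_q(ln_q(2.0, 0.5), 0.5)                  # exp_q inverts ln_q
S_near_1 = tsallis_entropy([0.5, 0.5], 1.0001)  # approaches ln 2 as q -> 1
```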

Keywords: q-calculus, q-deformed arithmetic, entropy, exponential functions, thermodynamic functions

Procedia PDF Downloads 73
14546 A Unified Model for Predicting Particle Settling Velocity in Pipe, Annulus and Fracture

Authors: Zhaopeng Zhu, Xianzhi Song, Gensheng Li

Abstract:

Transport of solid particles through the drill pipe, the drill string-hole annulus and hydraulically generated fractures is an important dynamic process encountered in oil and gas well drilling and completion operations. Unlike particle transport in an infinite medium, the transport of cuttings, proppants and formation sand is hindered by finite boundaries. An accurate description of particle transport under the bounded-wall conditions encountered in drilling and hydraulic fracturing is therefore needed to improve drilling safety and efficiency. In this study, particle settling experiments were carried out to investigate settling behavior in a pipe, an annulus and between parallel plates filled with power-law fluids. The experimental conditions covered particle Reynolds numbers of 0.01-123.87, dimensionless diameters of 0.20-0.80 and fluid flow behavior indices of 0.48-0.69. First, the wall effect of the annulus is revealed by analyzing the settling process of particles in annular geometries with variable inner pipe diameter. Then, the geometric continuity among the pipe, annulus and parallel plates is established by introducing the ratio of the inner to the outer diameter of the annulus. Further, a unified dimensionless diameter is defined to relate the three geometries in terms of the wall effect. In addition, a dimensionless term independent of the settling velocity is introduced to establish a unified explicit settling velocity model applicable to pipes, annuli and fractures, with a mean relative error of 8.71%. An example case study demonstrates the application of the unified model for predicting particle settling velocity. This paper is the first study of annulus wall effects based on the geometric continuity concept, and the unified model presented here provides theoretical guidance for improved hydraulic design of cuttings transport, proppant placement and sand management operations.
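The wall effect the study quantifies can be illustrated with a much simpler Newtonian sketch: the unbounded Stokes settling velocity reduced by a classical bounded-wall correction. Ladenburg's axial-tube factor 1/(1 + 2.104 d/D) is used here as a stand-in for the paper's unified power-law model, and all numerical values are illustrative.

```python
# Newtonian sketch of hindered settling. v_inf is the unbounded Stokes
# velocity; the Ladenburg factor models the retardation by a pipe wall.
# Values are illustrative; the paper's model covers power-law fluids and
# pipe/annulus/fracture geometries, which this sketch does not.

g = 9.81        # m/s^2
d = 1.0e-3      # particle diameter, m
D = 5.0e-3      # pipe inner diameter, m
rho_p = 2650.0  # particle density, kg/m^3 (sand-like)
rho_f = 1000.0  # fluid density, kg/m^3
mu = 0.1        # dynamic viscosity, Pa*s (viscous enough for Stokes flow)

v_inf = g * d ** 2 * (rho_p - rho_f) / (18 * mu)  # unbounded Stokes velocity
v_wall = v_inf / (1 + 2.104 * d / D)              # hindered by the pipe wall
```

With d/D = 0.2 the wall cuts the settling velocity by roughly 30%, which is the kind of hindrance the unified dimensionless diameter is built to capture across the three geometries.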

Keywords: wall effect, particle settling velocity, cuttings transport, proppant transport in fracture

Procedia PDF Downloads 158
14545 Feature Selection Approach for the Classification of Hydraulic Leakages in Hydraulic Final Inspection using Machine Learning

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Manufacturing companies face global competition and enormous cost pressure. Machine learning applications can help reduce production costs and create added value. Predictive quality secures product quality through data-supported predictions, using machine learning models as a basis for decisions on test results. Machine learning methods can process large amounts of data, cope with unfavourable row-to-column ratios, detect dependencies between the covariates and the given target, and assess the multidimensional influence of all input variables on the target. Real production data are often subject to highly fluctuating boundary conditions and unbalanced data sets; changes in production data manifest themselves in trends, systematic shifts, and seasonal effects. Machine learning applications therefore require intensive pre-processing and feature selection. Data pre-processing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets. Within the real data set of Bosch hydraulic valves used here, time periods with comparable production conditions can be identified by applying a concept drift method. Furthermore, a classification model is developed to evaluate the feature importance in different subsets within the identified time periods. By selecting comparable and stable features, the number of features used can be significantly reduced without a strong loss of predictive power. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predicting the quality characteristics of workpieces. In this research, an AdaBoost classifier is used to predict the leakage of hydraulic valves based on geometric gauge blocks from machining, mating data from the assembly, and hydraulic measurement data from end-of-line testing. In addition, the most suitable methods are selected and accurate quality predictions are achieved.
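A minimal filter-style stand-in for the feature-selection step looks like this: score each feature by its absolute Pearson correlation with the leakage label and keep only the strong ones. The actual pipeline uses concept-drift subsetting, model-based feature importances and an AdaBoost classifier, none of which are reproduced here, and the data are invented.

```python
# Filter-style feature selection sketch: keep features whose absolute
# Pearson correlation with the target exceeds a threshold. Feature names
# and values are toy placeholders, not the Bosch data.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

y = [0, 1, 0, 1, 1, 0]  # leakage label (toy)
features = {
    "gauge_block": [0.1, 0.9, 0.2, 0.8, 0.7, 0.1],  # informative (toy)
    "noise":       [0.5, 0.5, 0.4, 0.6, 0.4, 0.6],  # uninformative (toy)
}

selected = [name for name, col in features.items() if abs(pearson(col, y)) >= 0.5]
```

Dropping weakly correlated columns is the simplest version of "reducing the number of features without a strong loss of predictive power".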

Keywords: classification, machine learning, predictive quality, feature selection

Procedia PDF Downloads 158
14544 Hybridization of Manually Extracted and Convolutional Features for Classification of Chest X-Ray of COVID-19

Authors: M. Bilal Ishfaq, Adnan N. Qureshi

Abstract:

COVID-19 is among the most infectious diseases of recent years. First reported in Wuhan, the capital of Hubei province, China, it spread rapidly throughout the world, and on 11 March 2020 the World Health Organisation (WHO) declared it a pandemic. Being highly contagious, COVID-19 has affected approximately 219M people worldwide and caused 4.55M deaths, bringing the importance of accurate diagnosis of respiratory diseases such as pneumonia and COVID-19 to the forefront. In this paper, we propose a hybrid approach for the automated detection of COVID-19 using medical imaging, based on the hybridization of manually extracted and convolutional features: our approach combines Haralick texture features and convolutional features extracted from chest X-rays and CT scans. We also employ a minimum redundancy maximum relevance (mRMR) feature selection algorithm to reduce computational complexity and enhance classification performance. The proposed model is evaluated on four publicly available datasets: Chest X-ray Pneumonia; COVID-19 Pneumonia and Normal Chest X-ray; COVID-19 CTMaster; and VinBig. The results demonstrate high accuracy and effectiveness, with 0.9925 on the Chest X-ray Pneumonia dataset, 0.9895 on the COVID-19 Pneumonia and Normal Chest X-ray dataset, 0.9806 on the COVID-19 CTMaster dataset, and 0.9398 on the VinBig dataset. We further evaluate the model using ROC curves, where the AUC for the best-performing model reaches 0.96. The proposed model provides a promising tool for the early detection and accurate diagnosis of COVID-19, which can assist healthcare professionals in making informed treatment decisions and improving patient outcomes, and the system can be deployed in a clinical or research setting to assist in the diagnosis of COVID-19.
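The mRMR selection named above can be sketched greedily: repeatedly pick the feature maximizing relevance(f, y) minus the mean redundancy of f with the already-selected features. Classical mRMR scores both terms with mutual information; the sketch below uses absolute Pearson correlation as a cheap stand-in, and the feature values are invented toy data.

```python
# Greedy mRMR sketch. Relevance and redundancy are scored with |Pearson
# correlation| here as a proxy for the mutual information used in mRMR
# proper. Feature names and values are toy placeholders.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def mrmr_select(features, y, k):
    chosen = []
    while len(chosen) < k:
        def score(name):
            rel = abs(pearson(features[name], y))
            if not chosen:
                return rel
            red = sum(abs(pearson(features[name], features[c])) for c in chosen) / len(chosen)
            return rel - red
        best = max((f for f in features if f not in chosen), key=score)
        chosen.append(best)
    return chosen

y = [0, 1, 0, 1, 0, 1]  # class label (toy)
features = {
    "haralick_1": [0.2, 0.8, 0.1, 0.9, 0.2, 0.8],    # relevant
    "haralick_2": [0.25, 0.85, 0.15, 0.95, 0.3, 0.8], # relevant but redundant
    "cnn_1":      [0, 0, 1, 1, 0, 1],                 # weaker but complementary
}

picked = mrmr_select(features, y, k=2)
```

Note that the redundant copy of the first feature is skipped in favor of the weaker but complementary one, which is exactly the trade-off mRMR encodes.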

Keywords: COVID-19, feature engineering, artificial neural networks, radiology images

Procedia PDF Downloads 74
14543 On the Solidness of the Polar of Recession Cones

Authors: Sima Hassankhali, Ildar Sadeqi

Abstract:

In the theory of Pareto efficient points, the existence of a bounded base for a cone K of a normed space X is of great importance. In this article, we study the geometric structure of a nonzero closed convex cone K with a bounded base. To this end, we study the structure of the polar cone K# of K. Furthermore, we obtain a necessary and sufficient condition on a nonempty closed convex set C for its recession cone C∞ to have a bounded base.
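The objects behind this abstract can be stated with their standard definitions; the notation here may differ slightly from the paper's.

```latex
% Recession cone of a convex set C and polar (dual) cone of K:
\[
C_\infty \;=\; \{\, d \in X : x + t\,d \in C \ \text{for all } x \in C,\ t \ge 0 \,\},
\qquad
K^{\#} \;=\; \{\, f \in X^{*} : f(k) \ge 0 \ \text{for all } k \in K \,\}.
\]
% A convex set $B \subseteq K$ with $0 \notin \overline{B}$ is a base of $K$ if every
% $k \in K \setminus \{0\}$ has a unique representation $k = t\,b$ with $t > 0$, $b \in B$.
% The abstract's condition relates boundedness of $B$ to solidness
% (nonempty interior) of the polar cone $K^{\#}$.
```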

Keywords: solid cones, recession cones, polar cones, bounded base

Procedia PDF Downloads 263
14542 Influence of Pretreatment Magnetic Resonance Imaging on Local Therapy Decisions in Intermediate-Risk Prostate Cancer Patients

Authors: Christian Skowronski, Andrew Shanholtzer, Brent Yelton, Muayad Almahariq, Daniel J. Krauss

Abstract:

Prostate cancer has the third highest incidence rate and is the second leading cause of cancer death for men in the United States. Of the diagnostic tools available for intermediate-risk prostate cancer, magnetic resonance imaging (MRI) provides superior soft tissue delineation serving as a valuable tool for both diagnosis and treatment planning. Currently, there is minimal data regarding the practical utility of MRI for evaluation of intermediate-risk prostate cancer. As such, the National Comprehensive Cancer Network’s guidelines indicate MRI as optional in intermediate-risk prostate cancer evaluation. This project aims to elucidate whether MRI affects radiation treatment decisions for intermediate-risk prostate cancer. This was a retrospective study evaluating 210 patients with intermediate-risk prostate cancer, treated with definitive radiotherapy at our institution between 2019-2020. NCCN risk stratification criteria were used to define intermediate-risk prostate cancer. Patients were divided into two groups: those with pretreatment prostate MRI, and those without pretreatment prostate MRI. We compared the use of external beam radiotherapy, brachytherapy alone, brachytherapy boost, and androgen depravation therapy between the two groups. Inverse probability of treatment weighting was used to match the two groups for age, comorbidity index, American Urologic Association symptoms index, pretreatment PSA, grade group, and percent core involvement on prostate biopsy. Wilcoxon Rank Sum and Chi-squared tests were used to compare continuous and categorical variables. Of the patients who met the study’s eligibility criteria, 133 had a prostate MRI and 77 did not. Following propensity matching, there were no differences between baseline characteristics between the two groups. 
There were no statistically significant differences in treatments pursued between the two groups: 42% vs 47% were treated with brachytherapy alone, 40% vs 42% with external beam radiotherapy alone, 18% vs 12% with external beam radiotherapy with a brachytherapy boost, and 24% vs 17% received androgen deprivation therapy in the non-MRI and MRI groups, respectively. This analysis suggests that pretreatment MRI does not significantly impact radiation therapy or androgen deprivation therapy decisions in patients with intermediate-risk prostate cancer. A pretreatment prostate MRI should be used judiciously and pursued only to answer a specific question whose answer is likely to impact the treatment decision. Further follow-up is needed to correlate MRI findings with their impacts on specific oncologic outcomes.
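The inverse probability of treatment weighting step described above can be sketched in code. The following is a minimal, illustrative implementation, not the authors' analysis: the single covariate (age), the simulated treatment assignment, and all numbers are hypothetical, and a real analysis would use many covariates and a vetted statistical package.

```python
import numpy as np

def iptw_weights(X, treated, lr=0.1, n_iter=2000):
    """Estimate propensity scores with a simple logistic regression
    (gradient ascent on the log-likelihood) and return IPTW weights:
    1/p for treated subjects, 1/(1-p) for controls."""
    Xb = np.column_stack([np.ones(len(X)), X])  # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (treated - p) / len(X)
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    return np.where(treated == 1, 1.0 / p, 1.0 / (1.0 - p))

rng = np.random.default_rng(0)
age = rng.normal(65, 8, 200)                 # hypothetical covariate
p_treat = 1 / (1 + np.exp(-(age - 65) / 8))  # treatment probability rises with age
treated = rng.binomial(1, p_treat)
weights = iptw_weights((age - age.mean()) / age.std(), treated)

# after weighting, the covariate distributions of the groups should align
m1 = np.average(age[treated == 1], weights=weights[treated == 1])
m0 = np.average(age[treated == 0], weights=weights[treated == 0])
```

The weighted group means `m1` and `m0` should sit closer together than the raw means, which is exactly the balance the study relies on before comparing treatment rates.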

Keywords: magnetic resonance imaging, prostate cancer, definitive radiotherapy, Gleason score 7

Procedia PDF Downloads 85
14541 Monitoring Memories by Using Brain Imaging

Authors: Deniz Erçelen, Özlem Selcuk Bozkurt

Abstract:

The course of daily human life calls for memories and for remembering the time and place of certain events. Recalling memories takes up a substantial amount of time for an individual. Unfortunately, scientists lack the proper technology to fully understand and observe the different brain regions that interact to form or retrieve memories. The hippocampus, a complex brain structure located in the temporal lobe, plays a crucial role in memory. The hippocampus forms memories as well as allows the brain to retrieve them by ensuring that neurons fire together, a process called “neural synchronization.” Sadly, the hippocampus is known to deteriorate with age. Proteins and hormones, which repair and protect cells in the brain, typically decline as an individual ages. With the deterioration of the hippocampus, an individual becomes more prone to memory loss. Memory loss often starts off as mild but may evolve into serious medical conditions such as dementia and Alzheimer’s disease. In their quest to fully comprehend how memories work, scientists have created many different kinds of technology to examine the brain and neural pathways. For instance, magnetic resonance imaging (MRI) is used to collect detailed images of an individual's brain anatomy. In order to monitor and analyze brain function, a different version of this machine, functional magnetic resonance imaging (fMRI), is used. The fMRI is a neuroimaging procedure that is conducted while the target brain regions are active. It measures brain activity by detecting changes in blood flow associated with neural activity: neurons need more oxygen when they are active, and the fMRI measures the difference in magnetization between oxygen-rich and oxygen-poor blood. This way, there is a detectable difference across brain regions, and scientists can monitor them. Electroencephalography (EEG) is also a significant way to monitor the human brain.
The EEG is more versatile and cost-efficient than fMRI. An EEG measures electrical activity generated by the numerous cortical layers of the brain and allows scientists to record brain processes that occur after external stimuli. EEGs have very high temporal resolution, which makes it possible to measure synchronized neural activity and almost precisely track the contents of short-term memory. Science has come a long way in monitoring memories using these kinds of devices, which have made the inspection of neurons and neural pathways more intensive and detailed.

Keywords: brain, EEG, fMRI, hippocampus, memories, neural pathways, neurons

Procedia PDF Downloads 77
14540 High-Dimensional Single-Cell Imaging Maps Inflammatory Cell Types in Pulmonary Arterial Hypertension

Authors: Selena Ferrian, Erin Mccaffrey, Toshie Saito, Aiqin Cao, Noah Greenwald, Mark Robert Nicolls, Trevor Bruce, Roham T. Zamanian, Patricia Del Rosario, Marlene Rabinovitch, Michael Angelo

Abstract:

Recent experimental and clinical observations are advancing immunotherapies to clinical trials in pulmonary arterial hypertension (PAH). However, comprehensive mapping of the immune landscape in pulmonary arteries (PAs) is necessary to understand how immune cell subsets interact to induce pulmonary vascular pathology. We used multiplexed ion beam imaging by time-of-flight (MIBI-TOF) to interrogate the immune landscape in PAs from idiopathic (IPAH) and hereditary (HPAH) PAH patients. Massive immune infiltration in I/HPAH was observed with intramural infiltration linked to PA occlusive changes. The spatial context of CD11c+DCs expressing SAMHD1, TIM-3 and IDO-1 within immune-enriched microenvironments and neutrophils were associated with greater immune activation in HPAH. Furthermore, CD11c-DC3s (mo-DC-like cells) within a smooth muscle cell (SMC) enriched microenvironment were linked to vessel score, proliferating SMCs, and inflamed endothelial cells. Experimental data in cultured cells reinforced a causal relationship between neutrophils and mo-DCs in mediating pulmonary arterial SMC proliferation. These findings merit consideration in developing effective immunotherapies for PAH.

Keywords: pulmonary arterial hypertension, vascular remodeling, indoleamine 2-3-dioxygenase 1 (IDO-1), neutrophils, monocyte-derived dendritic cells, BMPR2 mutation, interferon gamma (IFN-γ)

Procedia PDF Downloads 170
14539 Analysis of a Discrete-time Geo/G/1 Queue Integrated with (s, Q) Inventory Policy at a Service Facility

Authors: Akash Verma, Sujit Kumar Samanta

Abstract:

This study examines a discrete-time Geo/G/1 queueing-inventory system operating under an (s, Q) inventory policy. Customer arrivals follow a Bernoulli process, and each customer demands a single item with an arbitrarily distributed service time. The inventory is replenished by an outside supplier, and the lead time for replenishment follows a geometric distribution. There is a single server and infinite waiting space in this facility. Demands must wait in the designated waiting area during a stock-out period, and customers are served on a first-come-first-served basis. With the help of the embedded Markov chain technique, we determine the joint probability distribution of the number of customers in the system and the number of items in stock at the post-departure epoch using the matrix analytic approach. We relate the system length distributions at the post-departure and outside observer's epochs to determine the joint probability distribution at the outside observer's epoch, and we use probability distributions at random epochs to determine the waiting time distribution. We obtain the performance measures needed to construct the cost function. The optimum values of the order quantity and reorder point are found numerically for a variety of model parameters.
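The mechanics of such a queueing-inventory system can be illustrated with a small discrete-time simulation. This is a sketch of the model dynamics under assumed parameter values (the arrival probability, service probability, reorder point s, order quantity Q, and lead-time probability below are all hypothetical), not the paper's matrix-analytic solution, and it uses geometric rather than general service times for brevity.

```python
import random

def simulate_sq(p=0.3, mu=0.5, s=3, Q=7, lead_p=0.4, slots=100_000, seed=1):
    """Discrete-time sketch of a Geo/G/1 queue coupled to an (s, Q) policy:
    Bernoulli(p) arrivals, geometric(mu) service, geometric(lead_p) lead time.
    Each service completion consumes one inventory item; when the stock
    drops to the reorder point s and no order is outstanding, Q units
    are ordered."""
    random.seed(seed)
    queue = 0          # customers waiting or in service
    stock = s + Q      # on-hand inventory
    pending = False    # outstanding replenishment order
    served = 0
    for _ in range(slots):
        if random.random() < p:                 # Bernoulli arrival
            queue += 1
        if pending and random.random() < lead_p:
            stock += Q                          # replenishment arrives
            pending = False
        if queue > 0 and stock > 0 and random.random() < mu:
            queue -= 1                          # service completion...
            stock -= 1                          # ...consumes one item
            served += 1
        if stock <= s and not pending:
            pending = True                      # place an order of size Q
    return served, stock

served, stock = simulate_sq()
```

Note the key invariant of the (s, Q) policy: since an order of size Q is placed only when stock has dropped to s or below, on-hand inventory never exceeds s + Q.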

Keywords: discrete-time queueing inventory model, matrix analytic method, waiting-time analysis, cost optimization

Procedia PDF Downloads 36
14538 Effect of Digital Technology on Students Interest, Achievement and Retention in Algebra in Abia State College of Education (Technical) Arochukwu

Authors: Stephen O. Amaraihu

Abstract:

This research investigated the effect of computer-based instruction on students’ interest, achievement, and retention in algebra in Abia State College of Education (Technical), Arochukwu. Three research questions and two hypotheses guided the study. Two instruments, a Maths Achievement Test (MAT) and a Maths Interest Inventory, were employed to test a population of three hundred and sixteen (316) NCE 1 students in algebra. It is expected that this research will lead to the improvement of students’ performance and enhance their interest in, and retention of, basic algebraic concepts. It was found that the majority of students in the college are not proficient in the use of ICT as a result of a lack of trained personnel, and it was concluded that the state government was not ready to implement computer-based mathematics instruction in Abia State College of Education. The paper recommends, amongst others, the employment of mathematics lecturers with competent ICT skills and the training of existing mathematics lecturers.

Keywords: achievement, computer based instruction, interest, retention

Procedia PDF Downloads 203
14537 The Capabilities Approach as a Future Alternative to Neoliberal Higher Education in the MENA Region

Authors: Ranya Elkhayat

Abstract:

This paper aims at offering a futures study for higher education in the Middle East. Paying special attention to the negative impacts of neoliberalism, the paper demonstrates how higher education is now commodified and corporatized, and how the arts and humanities are eschewed in favor of science and technology. This conceptual paper argues against the neoliberal agenda and aims at providing an alternative exemplified in the Capabilities Approach, with special reference to Martha Nussbaum’s theory. The paper is divided into four main parts: the current state of higher education under neoliberal values, a prediction of the conditions of higher education in the near future, the future of higher education using the theoretical framework of the Capabilities Approach, and finally, some areas of concern regarding the approach. The implications of the study demonstrate that Nussbaum’s Capabilities Approach will ensure that the values of education are preserved while avoiding the pitfalls of neoliberalism.

Keywords: capabilities approach, education future, higher education, MENA

Procedia PDF Downloads 190
14536 Design of a 4-DOF Robot Manipulator with Optimized Algorithm for Inverse Kinematics

Authors: S. Gómez, G. Sánchez, J. Zarama, M. Castañeda Ramos, J. Escoto Alcántar, J. Torres, A. Núñez, S. Santana, F. Nájera, J. A. Lopez

Abstract:

This paper shows in detail the mathematical models of direct and inverse kinematics for a robot manipulator (welding type) with four degrees of freedom. Using D-H parameters, screw theory, and numerical, geometric, and interpolation methods, the theoretical and practical values of the robot's position were determined with an optimized inverse-kinematics algorithm that obtains the values of the individual joints, allowing the virtual paths to be determined in a relatively short time.
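The authors' optimized algorithm is not reproduced here, but the core idea of numerical inverse kinematics can be sketched for a simplified planar two-link arm: iterate forward kinematics and Jacobian pseudoinverse updates until the end effector reaches the target. The link lengths, initial guess, and target below are assumed values; the actual manipulator has four degrees of freedom and a 3D workspace.

```python
import numpy as np

def fk(theta, l1=1.0, l2=0.8):
    """Forward kinematics of a planar 2-link arm (a stand-in for the
    4-DOF manipulator; link lengths l1, l2 are assumed)."""
    x = l1 * np.cos(theta[0]) + l2 * np.cos(theta[0] + theta[1])
    y = l1 * np.sin(theta[0]) + l2 * np.sin(theta[0] + theta[1])
    return np.array([x, y])

def jacobian(theta, l1=1.0, l2=0.8):
    """Analytic Jacobian of the end-effector position w.r.t. the joints."""
    s1, c1 = np.sin(theta[0]), np.cos(theta[0])
    s12, c12 = np.sin(theta[0] + theta[1]), np.cos(theta[0] + theta[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def ik(target, theta0=np.array([0.3, 0.3]), tol=1e-8, max_iter=200):
    """Iterative inverse kinematics via the Jacobian pseudoinverse."""
    theta = theta0.astype(float)
    for _ in range(max_iter):
        err = target - fk(theta)
        if err @ err < tol:
            break
        theta += np.linalg.pinv(jacobian(theta)) @ err
    return theta

theta = ik(np.array([1.2, 0.7]))   # joint angles that reach the target
```

The pseudoinverse update is a standard Newton-like scheme; an optimized solver would add joint limits, damping near singularities, and a better initial guess from the geometric solution.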

Keywords: kinematics, degree of freedom, optimization, robot manipulator

Procedia PDF Downloads 460
14535 A Framework for Internet Education: Personalised Approach

Authors: Zoe Wong

Abstract:

The purpose of this paper is to develop a framework for internet education. This framework uses a personalized learning approach for everyone who wants to freely develop their qualifications and careers. The key components of the framework include students, teachers, assessments, and infrastructure. It removes the challenges and limitations of the current educational system and allows learners to keep pace with progressing learning materials.

Keywords: internet education, personalized approach, information technology, framework

Procedia PDF Downloads 353
14534 Learning from Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach

Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak

Abstract:

Computer vision systems have recently made a big leap thanks to deep neural networks. However, these systems require correctly labeled large datasets in order to be trained properly, which is very difficult to obtain for medical applications. Two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes each sample very important in learning. As a result, if not handled properly, label noise significantly degrades performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on a retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves the classification algorithm's performance in the presence of noisy labels.
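The noisy-label setting evaluated here can be reproduced with a simple symmetric noise injector. The sketch below models only the corruption process (the 68% rate mirrors the ROP experiment); the proposed meta-learning algorithm itself, which typically learns per-example weights from a small clean validation set, is not shown.

```python
import numpy as np

def inject_symmetric_noise(labels, noise_rate, n_classes, seed=0):
    """Flip a fraction `noise_rate` of labels uniformly to one of the
    OTHER classes (symmetric label noise). Returns the corrupted copy."""
    rng = np.random.default_rng(seed)
    noisy = labels.copy()
    flip = rng.random(len(labels)) < noise_rate
    for i in np.where(flip)[0]:
        others = [c for c in range(n_classes) if c != labels[i]]
        noisy[i] = rng.choice(others)
    return noisy

# hypothetical binary dataset: 10,000 labels, half per class
y = np.zeros(10_000, dtype=int)
y[5_000:] = 1
y_noisy = inject_symmetric_noise(y, noise_rate=0.68, n_classes=2)
observed_noise = np.mean(y_noisy != y)    # should be close to 0.68
```

Training a standard classifier directly on `y_noisy` is what degrades performance; a robust algorithm must down-weight or correct the flipped samples.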

Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity

Procedia PDF Downloads 156
14533 Measurement of Echocardiographic Ejection Fraction Reference Values and Evaluation between Body Weight and Ejection Fraction in Domestic Rabbits (Oryctolagus cuniculus)

Authors: Reza Behmanesh, Mohammad Nasrolahzadeh-Masouleh, Ehsan Khaksar, Saeed Bokaie

Abstract:

Domestic rabbits (Oryctolagus cuniculus) are an excellent model for cardiovascular research because their size is more suitable for study and experimentation than that of smaller animals. One of the most important diagnostic imaging methods is echocardiography, which is used today to evaluate the cardiovascular system anatomically and functionally and is one of the most accurate and sensitive non-invasive methods for examining heart disease. Ventricular function indices can be assessed with cardiac imaging techniques. One of these important cardiac parameters is the ejection fraction (EF), which has a valuable place alongside the other parameters involved. EF is a measure of the percentage of blood that leaves the heart with each contraction. For this study, 100 young adult standard domestic rabbits, six months to one year old and of both sexes (50 female and 50 male), were examined without anesthesia or sedation. The mean EF was 58.753 ± 6.889% in males and 61.397 ± 6.530% in females, values comparable to those reported in standard references; there is no significant difference between the mean EF measured in this study and that of other research. There was no significant difference in EF between most weight groups, but there was a significant difference (p < 0.05) between the 2161–2320 g and 2481–2640 g weight groups. Echocardiographic EF reference values for non-anesthetized domestic rabbits (Oryctolagus cuniculus) are presented, providing reference values for future studies.
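The volume-based definition of ejection fraction is straightforward to express in code. In practice, echocardiographic EF is usually derived from measured ventricular dimensions rather than volumes entered directly, so the values below are purely illustrative.

```python
def ejection_fraction(edv_ml, esv_ml):
    """EF (%) = stroke volume / end-diastolic volume * 100,
    where stroke volume = end-diastolic volume - end-systolic volume."""
    if edv_ml <= 0 or esv_ml < 0 or esv_ml > edv_ml:
        raise ValueError("volumes must satisfy 0 <= ESV <= EDV, EDV > 0")
    return (edv_ml - esv_ml) / edv_ml * 100.0

# hypothetical ventricular volumes (mL) chosen only to show the formula
ef = ejection_fraction(edv_ml=2.0, esv_ml=0.8)   # 60.0%
```

An EF around 60% is consistent with the means reported above for this species.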

Keywords: echocardiography, ejection fraction, rabbit, heart

Procedia PDF Downloads 88
14532 Finite Element Analysis of Human Tarsals, Metatarsals and Phalanges for Predicting Probable Location of Fractures

Authors: Irfan Anjum Manarvi, Fawzi Aljassir

Abstract:

Human bones have long been a keen area of research in the field of biomechanical engineering. Medical professionals, as well as engineering academics and researchers, have investigated various bones using medical, mechanical, and materials approaches to build the available body of knowledge. Their major focus has been to establish the properties of these bones and ultimately develop processes and tools either to prevent fracture or to recover from its damage. The literature shows that mechanical professionals conducted a variety of tests for hardness, deformation, and strain field measurement to arrive at their findings. However, they considered the accuracy of these results to be insufficient due to various limitations of tools and test equipment and the limited availability of human bones. They proposed the need for further studies to first overcome inaccuracies in measurement methods, testing machines, and experimental errors and then carry out experimental or theoretical studies. Finite element analysis is a technique that was developed for the aerospace industry due to the complexity of designs and materials. Over time, it has found applications in many other industries due to its accuracy and flexibility in the selection of materials and the types of loading that can be theoretically applied to an object under study. In the past few decades, the field of biomechanical engineering has also started to see its applicability. However, the work done in the area of tarsals, metatarsals, and phalanges using this technique is very limited. Therefore, the present research has focused on using this technique for the analysis of these critical bones of the human body. The technique requires a 3-dimensional geometric computer model of the object to be analyzed. In the present research, a 3D laser scanner was used to take accurate geometric scans of individual tarsals, metatarsals, and phalanges from a typical human foot and produce these computer geometric models.
These were then imported into finite element analysis software, and a mesh refinement process was carried out prior to analysis to ensure the computer models were true representatives of the actual bones. This was followed by analysis of each bone individually. A number of constraints and load conditions were applied to observe the stress and strain distributions in these bones under compressive and tensile loads or their combination. Results were collected for deformations along various axes, and stress and strain distributions were observed to identify critical locations where fracture could occur. A comparative analysis of the failure properties of all three types of bones was carried out to establish which of these could fail earliest, and this comparison is presented in this research. The results of this investigation could be used for further experimental studies by academics and researchers, as well as industrial engineers, for the development of various foot protection devices or tools for surgical operations and recovery treatment of these bones. Researchers could build on these models to carry out analysis of a complete human foot through finite element analysis under various loading conditions such as walking, marching, running, and landing after a jump.
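The assemble-solve-recover cycle at the heart of any finite element analysis can be illustrated with a one-dimensional axial bar, far simpler than the 3D bone models described above. The material and load values below are assumed for illustration only (the elastic modulus is roughly in the range reported for cortical bone).

```python
import numpy as np

def axial_bar_fe(n_elems, length, E, A, force):
    """Assemble and solve a 1D axial bar discretized into linear two-node
    elements, fixed at one end with a tip load: a minimal illustration of
    the stiffness-assembly / solve / stress-recovery cycle used in FEA."""
    le = length / n_elems
    k = E * A / le * np.array([[1.0, -1.0],
                               [-1.0, 1.0]])      # element stiffness matrix
    n_nodes = n_elems + 1
    K = np.zeros((n_nodes, n_nodes))
    for e in range(n_elems):
        K[e:e + 2, e:e + 2] += k                  # assemble global stiffness
    f = np.zeros(n_nodes)
    f[-1] = force                                 # point load at the free end
    u = np.zeros(n_nodes)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])     # fixed BC at node 0
    stress = E * np.diff(u) / le                  # constant stress per element
    return u, stress

# illustrative values: 100 mm bar, E ~ 17 GPa, A = 50 mm^2, 500 N tip load
u, stress = axial_bar_fe(n_elems=10, length=0.1, E=17e9, A=5e-5, force=500.0)
```

For this simple case the FE solution reproduces the analytic results exactly: tip displacement FL/(EA) and uniform stress F/A, which is a useful sanity check before trusting a complex 3D mesh.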

Keywords: tarsals, metatarsals, phalanges, 3D scanning, finite element analysis

Procedia PDF Downloads 326
14531 Visualising Charles Bonnet Syndrome: Digital Co-Creation of Pseudohallucinations

Authors: Victoria H. Hamilton

Abstract:

Charles Bonnet Syndrome (CBS) occurs when a person experiences pseudohallucinations that fill in visual information lost to any type of sight loss. CBS arises from an epiphenomenal process, with the physical actions of sight resulting in the mental formation of images. These pseudohallucinations, referred to as visions by the CBS community, manifest in a wide range of forms, from complex scenes to simple geometric shapes. To share these unique visual experiences, a remote co-creation website was created where CBS participants communicated their lived experiences. This created a reflexive process through which we worked to produce true representations of these interesting and little-known phenomena. Digital reconstruction of the visions is utilised because it echoes the vivid, experiential, movie-like nature of what is being perceived. This paper critically analyses co-creation as a method for making digital assets. The implications of the participants' vision impairments and the application of ethical safeguards are examined in this context. Importantly, this is research into a medical syndrome conducted through non-medical, practice-based design. CBS research to date has primarily been conducted by the ophthalmic, neurological, and psychiatric fields and approached with the primary concerns of these specialties. This research contributes a distinct approach incorporating practice-based digital design, autoethnography, and phenomenology. Autoethnography and phenomenology combine as a foundation, the first bringing understanding and insights, balanced by the second's philosophical, bigger-picture, and established approach. With further refinement, it is anticipated that the research may be applied to other conditions where articulating internal experiences proves challenging and the use of digital methods could aid communication. Both the research and CBS communities will benefit from the insights regarding the relationship between cognitive perceptions and the vision process.
This research combines the digital visualising of visions with an interest in the link between metaphor, embodied cognition, and image. The argument for a link between CBS visions and metaphor may appear evident due to the cross-category mapping of images that is necessary for comprehension. Both CBS visions and metaphors involve the experience of picturing images, often with lateral connections and imaginative associations.

Keywords: Charles Bonnet Syndrome, digital design, visual hallucinations, visual perception

Procedia PDF Downloads 38
14530 Urban Growth Analysis Using Multi-Temporal Satellite Images, Non-stationary Decomposition Methods and Stochastic Modeling

Authors: Ali Ben Abbes, ImedRiadh Farah, Vincent Barra

Abstract:

Remotely sensed data are a significant source for monitoring and updating land use/cover databases. Nowadays, change detection in urban areas is a subject of intensive research, and timely and accurate data on the spatio-temporal changes of urban areas are therefore required. The data extracted from multi-temporal satellite images are usually non-stationary; in fact, the changes evolve in time and space. This paper proposes a methodology for change detection in urban areas by combining a non-stationary decomposition method and stochastic modeling. We consider as input to our methodology a sequence of satellite images I1, I2, …, In acquired at different periods (t = 1, 2, ..., n). Firstly, preprocessing of the multi-temporal satellite images (e.g., radiometric, atmospheric, and geometric corrections) is applied. The systematic study of global urban expansion in our methodology can be approached in two ways. The first considers the urban area as a single object, as opposed to non-urban areas (e.g., vegetation, bare soil, and water); the objective is to extract the urban mask. The second aims to obtain more detailed knowledge of the urban area by distinguishing different types of tissue within it. In order to validate our approach, we used a database of Tres Cantos, Madrid, in Spain, derived from Landsat over the period from January 2004 to July 2013 by collecting two frames per year at a spatial resolution of 25 meters. The obtained results show the effectiveness of our method.
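As a point of comparison for the proposed methodology, the simplest bi-temporal change-detection baseline is thresholding the absolute difference image. The sketch below runs on synthetic data and is not the paper's non-stationary decomposition or stochastic model; the threshold rule (mean + k standard deviations) and all pixel values are illustrative.

```python
import numpy as np

def change_mask(img_t1, img_t2, k=2.0):
    """Flag pixels whose absolute difference between two epochs exceeds
    mean + k*std of the difference image: a minimal bi-temporal
    change-detection baseline."""
    diff = np.abs(img_t2.astype(float) - img_t1.astype(float))
    return diff > diff.mean() + k * diff.std()

rng = np.random.default_rng(0)
t1 = rng.normal(100, 2, (64, 64))        # synthetic epoch-1 reflectance
t2 = t1 + rng.normal(0, 2, (64, 64))     # epoch 2: sensor noise only...
t2[20:30, 20:30] += 40                   # ...plus a block of new "urban" pixels
mask = change_mask(t1, t2)               # True where change is detected
```

Real imagery violates the stationarity this global threshold assumes (illumination, season, sensor drift), which is precisely the motivation for the non-stationary treatment proposed in the abstract.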

Keywords: multi-temporal satellite image, urban growth, non-stationary, stochastic model

Procedia PDF Downloads 424