Search results for: inverse problem in tomography
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7818

7038 Characteristic Function in Estimation of Probability Distribution Moments

Authors: Vladimir S. Timofeev

Abstract:

This article considers the problem of estimating distributional moments. A new approach to moment estimation, based on the characteristic function, is proposed. Using statistical simulation, the author shows that the new approach has certain robustness properties. Numerical differentiation is used to calculate the derivatives of the characteristic function. The results obtained confirm that the idea works efficiently and can be recommended for statistical applications.
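As a rough illustration of the idea, the k-th raw moment can be read off the k-th derivative of the empirical characteristic function at t = 0, estimated by central finite differences. The step size h and the normal test sample below are illustrative assumptions, not the author's settings:

```python
import numpy as np

def ecf(t, sample):
    # empirical characteristic function: sample mean of exp(i t X)
    return np.mean(np.exp(1j * t * sample))

def moment_via_cf(sample, k=1, h=1e-3):
    # k-th raw moment m_k = phi^(k)(0) / i^k, with the derivative of the
    # empirical characteristic function taken by a central finite difference
    if k == 1:
        d = (ecf(h, sample) - ecf(-h, sample)) / (2 * h)
    elif k == 2:
        d = (ecf(h, sample) - 2 * ecf(0.0, sample) + ecf(-h, sample)) / h**2
    else:
        raise NotImplementedError("sketch covers k = 1, 2 only")
    return (d / 1j**k).real

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=100_000)
print(moment_via_cf(x, 1))  # close to E[X] = 2
print(moment_via_cf(x, 2))  # close to E[X^2] = mu^2 + sigma^2 = 5
```

On this seeded normal sample the first two estimated moments land close to the theoretical values 2 and 5; robustness to outliers would be probed by contaminating the sample, as the article does via simulation.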

Keywords: characteristic function, distributional moments, robustness, outlier, statistical estimation problem, statistical simulation

Procedia PDF Downloads 499
7037 Spherical Harmonic Based Monostatic Anisotropic Point Scatterer Model for RADAR Applications

Authors: Eric Huang, Coleman DeLude, Justin Romberg, Saibal Mukhopadhyay, Madhavan Swaminathan

Abstract:

High performance computing (HPC) based emulators can be used to model the scattering from multiple stationary and moving targets for RADAR applications. These emulators rely on the RADAR Cross Section (RCS) of the targets being available in complex scenarios. Representing the RCS using tables generated from electromagnetic (EM) simulations is often cumbersome, leading to large storage requirements. This paper proposes a spherical harmonic based anisotropic scatterer model to represent the RCS of complex targets. The problem of finding the locations and reflection profiles of all scatterers can be formulated as a linear least-squares problem with a special sparsity constraint, which this paper solves using a modified Orthogonal Matching Pursuit algorithm. The results show that the spherical harmonic based scatterer model can effectively represent the RCS data of complex targets.
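The greedy recovery step can be sketched with plain Orthogonal Matching Pursuit; the paper's modified variant and its special sparsity constraint are not reproduced here, and the matrix sizes and support below are made-up toy values:

```python
import numpy as np

def omp(A, y, k):
    """Plain Orthogonal Matching Pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit the selected columns by least
    squares, for k iterations."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((60, 200))
A /= np.linalg.norm(A, axis=0)          # unit-norm dictionary columns
x_true = np.zeros(200)
x_true[[5, 40, 123]] = [2.0, -1.5, 1.0]  # 3 "scatterers" with reflection weights
y = A @ x_true                           # noiseless measurements
x_hat = omp(A, y, k=3)
print(np.nonzero(x_hat)[0])              # typically recovers indices 5, 40, 123
```

With 60 measurements and only 3 active columns of a random unit-norm dictionary, the greedy selection recovers the true support and the residual drops to numerical zero.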

Keywords: RADAR, RCS, high performance computing, point scatterer model

Procedia PDF Downloads 185
7036 Network Connectivity Knowledge Graph Using Dwave Quantum Hybrid Solvers

Authors: Nivedha Rajaram

Abstract:

Hybrid quantum solvers have recently received strong attention in industrial problem-solving applications. D-Wave quantum computers are one such paragon of systems built on the quantum annealing mechanism. The Discrete Quadratic Model (DQM) is a hybrid quantum computing model class supplied by the D-Wave Ocean SDK, a real-time software platform for hybrid quantum solvers. These hybrid quantum computing modellers can be employed to solve classic problems. One such problem that we consider in this paper is finding a network connectivity knowledge hub in a huge network of systems. Using this quantum solver, we try to find the prime system hub, which acts as the supreme connection point for the set of connected computers in a large network. This paper establishes an innovative approach to generating a connectivity system hub plot for a set of systems using D-Wave Ocean SDK hybrid quantum solvers.
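The hub-finding task has a simple classical analogue: pick the system with the most direct connections. The sketch below is only that classical stand-in, with hypothetical host names; the paper's actual formulation is encoded as a Discrete Quadratic Model and run on D-Wave hybrid solvers:

```python
from collections import defaultdict

def connectivity_hub(edges):
    """Classical stand-in for the annealing formulation: the hub is the
    system with the highest degree (most direct connections)."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return max(degree, key=degree.get)

# hypothetical network of systems
edges = [("srv1", "srv2"), ("srv1", "srv3"), ("srv1", "srv4"), ("srv2", "srv3")]
print(connectivity_hub(edges))  # srv1
```

The quantum formulation becomes attractive when the hub criterion is richer than raw degree and the network is too large for exhaustive classical evaluation.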

Keywords: quantum computing, hybrid quantum solver, DWave annealing, network knowledge graph

Procedia PDF Downloads 118
7035 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status

Authors: Rosa Figueroa, Christopher Flores

Abstract:

Text categorization is the problem of assigning a new document to a set of predetermined categories, on the basis of a training set of free-text data that contains documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, documents belonging to the same category are expected to share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative for feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the Bag-of-Words approach. The score selected to represent the occurrence of tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first experiment used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences) and the last used a combination of unigrams and bigrams. To test the effectiveness of the extracted feature set for the four experiments, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram based feature extraction.
These results were confirmed using the remaining 80% of the dataset, where SW performed the best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best was obtained by the combination of unigrams-bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by the bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
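For reference, the core of the SW feature extractor is the standard local-alignment recurrence, here applied to word tokens rather than protein residues; the match/mismatch/gap scores and the two token sequences are illustrative assumptions:

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Smith-Waterman local alignment score: dynamic programming over a
    (len(a)+1) x (len(b)+1) score matrix, floored at zero so an alignment
    can start anywhere."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# Token sequences from two hypothetical documents; the score acts as a
# similarity feature between word sequences.
print(smith_waterman("obese patient with diabetes".split(),
                     "patient with morbid obesity".split()))  # 4
```

Here the shared run "patient with" produces the best local alignment (two matches at +2 each), so the score rewards common contiguous word sequences the way shared N-grams would.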

Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm

Procedia PDF Downloads 290
7034 CT Doses Pre and Post SAFIRE: Sinogram Affirmed Iterative Reconstruction

Authors: N. Noroozian, M. Halim, B. Holloway

Abstract:

Computed Tomography (CT) has become the largest source of radiation exposure in modern countries; however, recent technological advances have created new methods to reduce dose without negatively affecting image quality. SAFIRE has emerged as a new software package which utilizes full raw data projections for iterative reconstruction, thereby allowing a lower CT dose to be used. This audit was performed to compare CT doses in certain examinations before and after the introduction of SAFIRE at our radiology department. It showed that CT doses were significantly lower with SAFIRE compared with pre-SAFIRE software at the SAFIRE 3 setting for the following studies: CSKUH unenhanced brain scans (-20.9%), CABPEC abdomen and pelvis with contrast (-21.5%), CCHAPC chest with contrast (-24.4%), CCHAPC abdomen and pelvis with contrast (-16.1%), CCHAPC total chest, abdomen and pelvis (-18.7%).

Keywords: dose reduction, iterative reconstruction, low dose CT techniques, SAFIRE

Procedia PDF Downloads 280
7033 Vehicle Routing Problem with Mixed Fleet of Conventional and Heterogeneous Electric Vehicles and Time-Dependent Charging Costs

Authors: Ons Sassi, Wahiba Ramdane Cherif-Khettaf, Ammar Oulamara

Abstract:

In this paper, we consider a new real-life Heterogeneous Electric Vehicle Routing Problem with Time-Dependent Charging Costs and a Mixed Fleet (HEVRP-TDMF), in which a set of geographically scattered customers have to be served by a mixed fleet of vehicles composed of a heterogeneous fleet of Electric Vehicles (EVs), having different battery capacities and operating costs, and Conventional Vehicles (CVs). We include the possibility of charging EVs at the available charging stations during the routes in order to serve all customers. Each charging station offers a charging service with a known charger technology and time-dependent charging costs, and charging stations are also subject to operating time window constraints. EVs are not necessarily compatible with all available charging technologies, and partial charging is allowed. Intermittent charging at the depot is also allowed, provided that constraints related to the electricity grid are satisfied. The objective is to minimize first the number of employed vehicles and then the total travel and charging costs. In this study, we present a Mixed Integer Programming model and develop a Charging Routing Heuristic as well as a Local Search Heuristic based on the Inject-Eject routine with three different insertion strategies. All heuristics are tested on real data instances.

Keywords: charging problem, electric vehicle, heuristics, local search, optimization, routing problem

Procedia PDF Downloads 460
7032 A Two-Dimensional Problem in a Micropolar Thermoelastic Medium under the Effect of Laser Irradiation and Distributed Sources

Authors: Devinder Singh, Rajneesh Kumar, Arvind Kumar

Abstract:

The present investigation deals with the deformation of a micropolar generalized thermoelastic solid subjected to thermo-mechanical loading due to a thermal laser pulse. Laplace and Fourier transform techniques are used to solve the problem. Thermo-mechanical laser interactions are taken as distributed sources to describe the application of the approach. The closed-form expressions of normal stress, tangential stress, couple stress and temperature are obtained in the transformed domain. A numerical inversion technique for the Laplace and Fourier transforms has been applied to obtain the resulting quantities in the physical domain after developing a computer program. The normal stress, tangential stress, couple stress and temperature are depicted graphically to show the effect of relaxation times. Some particular cases of interest are deduced from the present investigation.

Keywords: pulse laser, integral transform, thermoelastic, boundary value problem

Procedia PDF Downloads 608
7031 Elephant Herding Optimization for Service Selection in QoS-Aware Web Service Composition

Authors: Samia Sadouki Chibani, Abdelkamel Tari

Abstract:

Web service composition combines available services to provide new functionality. Given the number of available services with similar functionalities and different non-functional aspects (QoS), the problem of finding a QoS-optimal web service composition is an optimization problem belonging to the NP-hard class. Thus, an optimal solution cannot be found by exact algorithms within a reasonable time. In this paper, a bio-inspired meta-heuristic is presented to address QoS-aware web service composition; it is based on the Elephant Herding Optimization (EHO) algorithm, which is inspired by the herding behavior of elephant groups. EHO is characterized by a process of dividing the population into sub-populations (clans) and recombining them; this process allows the exchange of information between local searches to move toward a global optimum, whereas with other evolutionary algorithms the problem of early stagnation in a local optimum cannot always be avoided. Compared with PSO, the results of the experimental evaluation show that our proposition significantly outperforms the existing algorithm, with a better fitness value and fast convergence.
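The clan-updating and separating operators of EHO can be sketched on a toy continuous objective; the population sizes, search bounds and the usual α (clan-updating) and β (matriarch-influence) factors below are illustrative assumptions, and the real problem would evaluate QoS of candidate compositions instead of a sphere function:

```python
import numpy as np

def sphere(x):
    # toy objective to minimize; stands in for a QoS fitness evaluation
    return float(np.sum(x ** 2))

def eho(f, dim=5, clans=3, per_clan=8, iters=200, alpha=0.5, beta=0.1, seed=0):
    """Minimal Elephant Herding Optimization sketch."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, size=(clans, per_clan, dim))
    for _ in range(iters):
        for c in range(clans):
            clan = pop[c]
            fit = np.array([f(x) for x in clan])
            best, worst = int(np.argmin(fit)), int(np.argmax(fit))
            center = clan.mean(axis=0)
            r = rng.random((per_clan, dim))
            # clan-updating: move each elephant toward the clan matriarch
            new = clan + alpha * (clan[best] - clan) * r
            new[best] = beta * center              # matriarch follows the clan centre
            new[worst] = rng.uniform(-5, 5, dim)   # separating operator: replace worst
            pop[c] = new
    flat = pop.reshape(-1, dim)
    return min(flat, key=f)

best = eho(sphere)
print(sphere(best))  # small value near the optimum f(0) = 0
```

The clan structure is what lets partially converged sub-populations keep exchanging information through their matriarchs instead of collapsing onto one local optimum.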

Keywords: bio-inspired algorithms, elephant herding optimization, QoS optimization, web service composition

Procedia PDF Downloads 324
7030 The Effectiveness of Adaptive Difficulty Adjustment in Touch Tablet App on Young Children's Spatial Problem Solving Development

Authors: Chenchen Liu, Jacques Audran

Abstract:

Using tablet apps with a specific educational purpose to promote young children's cognitive development is now quite common. Developing an educational app on an iPad-like tablet, especially for a young child (age 3-5), requires an optimal level of challenge to continuously attract the child's attention and obtain an educational effect. Adaptive difficulty adjustment, which dynamically sets the difficulty of the challenge according to the child's performance, seems to be a good solution. Since the concept of space plays an important role in young children's cognitive development, we made an experimental comparison in a French kindergarten between one group of 23 children using an educational app, 'Debout Ludo', with adaptive difficulty settings and another group of 20 children using the previous version of 'Debout Ludo' with a classic incremental difficulty adjustment. The experimental results on spatial problem solving indicated that a significantly higher learning outcome was achieved by the young children who used the adaptive version of the app.

Keywords: adaptive difficulty, spatial problem solving, tactile tablet, young children

Procedia PDF Downloads 433
7029 A New Approach for Preparation of Super Absorbent Polymers: In-Situ Surface Cross-Linking

Authors: Reyhan Özdoğan, Mithat Çelebi, Özgür Ceylan, Mehmet Arif Kaya

Abstract:

Super absorbent polymers (SAPs) are defined as materials that can absorb a huge amount of water or aqueous solution in comparison to their own mass and retain it in their lightly cross-linked structure. SAPs are produced from water-soluble monomers via polymerization followed by controlled crosslinking. SAPs are generally used for water-absorbing applications such as baby diapers, patient or elder pads and other hygienic products. The crosslinking density (CD) of the SAP structure is an essential factor for water absorption capacity (WAC): a low internal CD leads to high WAC values and vice versa. However, SAPs with low CD and high swelling capacity tend to disintegrate when pressure is applied upon them, so they cannot absorb liquids effectively under load. In order to prevent this undesired situation and to obtain SAP structures having both high swelling capacity and the ability to work under load, surface crosslinking can be the answer. In industry, these superabsorbent gels are mostly produced via solution polymerization and then need to be dried, ground, sized, post-polymerized and finally surface crosslinked (which involves spraying a crosslinking solution onto the dried and ground SAP particles and then curing by heat). These steps are time-consuming and must be handled carefully to obtain the desired final product. Synthesizing the desired final SAPs in fewer steps would reduce time and production costs, which is important for any industry. In this study, SAPs were successfully synthesized by inverse suspension (Pickering type) polymerization with subsequent in-situ surface crosslinking, using proper surfactants in high boiling point solvents.
Our one-pot synthesis of surface cross-linked SAPs involves only one preparation step, so this technique is more attractive to industry than the conventional multi-step methods. The effects of different surface crosslinking agents on the properties of poly(acrylic acid-co-sodium acrylate) based SAPs are investigated. Surface crosslink degrees are evaluated by the swelling under load (SUL) test. It was determined that the water absorption capacities of the obtained SAPs decrease with increasing surface crosslink density, while their mechanical properties are improved.

Keywords: inverse suspension polymerization, polyacrylic acid, super absorbent polymers (SAPs), surface crosslinking, sodium polyacrylate

Procedia PDF Downloads 318
7028 Flexural Properties of Carbon/Polypropylene Composites: Influence of Matrix Forming Polypropylene in Fiber, Powder, and Film States

Authors: Vijay Goud, Ramasamy Alagirusamy, Apurba Das, Dinesh Kalyanasundaram

Abstract:

Thermoplastic composites offer new opportunities as an effective processing technology, while also introducing new processing challenges. One notable challenge is achieving thorough wettability, which is significantly hindered by the high viscosity of the long molecular chains of thermoplastics. As a result of this high viscosity, it is very difficult to impregnate the resin into a tightly interlaced textile structure and fill the voids present in the structure. One potential solution to this problem is to pre-deposit resin on the fiber prior to consolidation. The current study compares DREF spinning, powder coating and film stacking as methods of pre-depositing resin onto fibers. An investigation into the flexural properties of unidirectional composites (UDCs) produced by blending carbon fiber and polypropylene (PP) matrix in the varying forms of fiber, powder and film is reported. DREF (Dr. Ernst Fehrer) yarns, or friction-spun hybrid yarns, were manufactured from PP fibers and carbon tows. The DREF yarns were consolidated to yield unidirectional composites referred to as UDC-D. PP in powder form was coated on carbon tows by electrostatic spray coating, and the powder-coated towpregs were consolidated to form UDC-P. For the sake of comparison, a third UDC, referred to as UDC-F, was manufactured by the consolidation of PP films stacked between carbon tows. The experiments were designed to yield a matching fiber volume fraction of about 50% in all three UDCs. The mechanical properties of the three composites were compared to understand the efficiency of matrix wetting and impregnation. Approximately 19% and 68% higher flexural strength was obtained for UDC-P than for UDC-D and UDC-F, respectively. Similarly, 25% and 81% higher modulus was observed for UDC-P than for UDC-D and UDC-F, respectively.
Results from micro-computed tomography, scanning electron microscopy, and short beam tests indicate better impregnation of the PP matrix in UDC-P, obtained through the electrostatic spray coating process, and thereby its higher flexural strength and modulus.

Keywords: DREF spinning, film stacking, flexural strength, powder coating, thermoplastic composite

Procedia PDF Downloads 219
7027 Wavelet Method for Numerical Solution of Fourth Order Wave Equation

Authors: A. H. Choudhury

Abstract:

In this paper, a highly accurate numerical method for the solution of the one-dimensional fourth-order wave equation is derived. This hyperbolic problem is solved using semidiscrete approximations: the space direction is discretized by the wavelet-Galerkin method, and the time variable is discretized using Newmark schemes.
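The time discretization can be sketched with the classical average-acceleration Newmark scheme for a semidiscrete system M d'' + K d = 0, where M and K would come from the wavelet-Galerkin space discretization; the single-oscillator test problem below is illustrative, not from the paper:

```python
import numpy as np

def newmark(M, K, d0, v0, dt, steps, beta=0.25, gamma=0.5):
    """Newmark time integration for M d'' + K d = 0. With beta = 1/4 and
    gamma = 1/2 (average acceleration) the scheme is unconditionally stable."""
    d, v = d0.copy(), v0.copy()
    a = np.linalg.solve(M, -K @ d)       # consistent initial acceleration
    lhs = M + beta * dt**2 * K           # constant effective matrix
    for _ in range(steps):
        d_pred = d + dt * v + dt**2 * (0.5 - beta) * a
        a_new = np.linalg.solve(lhs, -K @ d_pred)
        d = d_pred + beta * dt**2 * a_new
        v = v + dt * ((1 - gamma) * a + gamma * a_new)
        a = a_new
    return d, v

# single oscillator d'' + 4 d = 0, d(0) = 1, v(0) = 0, exact d(t) = cos(2t)
M = np.array([[1.0]])
K = np.array([[4.0]])
d, v = newmark(M, K, np.array([1.0]), np.array([0.0]), dt=0.01, steps=100)
print(d[0])  # close to cos(2.0) at t = 1.0
```

The same loop applies unchanged when M and K are the full Galerkin mass and stiffness matrices, which is why the semidiscrete formulation is convenient.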

Keywords: hyperbolic problem, semidiscrete approximations, stability, Wavelet-Galerkin Method

Procedia PDF Downloads 313
7026 A Mathematical Model for a Two-Stage Assembly Flow-Shop Scheduling Problem with Batch Delivery System

Authors: Saeedeh Ahmadi Basir, Mohammad Mahdavi Mazdeh, Mohammad Namakshenas

Abstract:

Manufacturers often dispatch jobs in batches to reduce delivery costs. However, sending several jobs in batches can have a negative effect on other scheduling-related objective functions, such as minimizing the number of tardy jobs, which is often used to rate managers' performance in many manufacturing environments. This paper aims to minimize the weighted number of tardy jobs and the sum of delivery costs of a two-stage assembly flow-shop problem in a batch delivery system. We present a mixed-integer linear programming (MILP) model to solve the problem. As this is an MILP model, the commercial solver (the CPLEX solver) is not guaranteed to find the optimal solution for large-size problems in a reasonable amount of time. We present several numerical examples to confirm the accuracy of the model.
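The objective itself is easy to state before building the MILP: jobs in a batch are delivered together when the last job of the batch completes, so a job is tardy if its batch's delivery time exceeds its due date. The sketch below evaluates that objective for one candidate schedule on a simplified single-machine version; the two-stage assembly structure is omitted and all job data are illustrative assumptions:

```python
def weighted_tardy_and_delivery(batches, proc, due, weight, delivery_cost):
    """Evaluate one candidate schedule for the batch-delivery objective.
    Jobs in the same batch are delivered together when the last of them
    finishes. Returns (weighted number of tardy jobs, total delivery cost)."""
    t = 0
    tardy = 0
    cost = 0
    for batch in batches:          # batches processed in the given order
        for j in batch:
            t += proc[j]           # sequential processing, no idle time
        cost += delivery_cost      # one delivery per dispatched batch
        for j in batch:
            if t > due[j]:         # delivered after the due date -> tardy
                tardy += weight[j]
    return tardy, cost

proc = {0: 2, 1: 3, 2: 1, 3: 4}
due = {0: 2, 1: 6, 2: 5, 3: 11}
weight = {0: 1, 1: 2, 2: 1, 3: 3}
print(weighted_tardy_and_delivery([[0, 1], [2, 3]], proc, due, weight,
                                  delivery_cost=5))  # (2, 10)
```

The MILP's job is to search over batch compositions and sequences so that this pair of quantities is optimized, rather than merely evaluated for a fixed schedule as here.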

Keywords: scheduling, two-stage assembly flow-shop, tardy jobs, batched delivery system

Procedia PDF Downloads 451
7025 BeamGA Median: A Hybrid Heuristic Search Approach

Authors: Ghada Badr, Manar Hosny, Nuha Bintayyash, Eman Albilali, Souad Larabi Marie-Sainte

Abstract:

The median problem is widely applied to derive the most reasonable rearrangement phylogenetic tree for many species. More specifically, the problem is concerned with finding a permutation that minimizes the sum of distances between itself and a set of three signed permutations. Genomes with an equal number of genes but different gene order can be represented as permutations. In this paper, an algorithm, namely BeamGA median, is proposed that combines a heuristic search approach (local beam search) as an initialization step to generate a number of solutions, after which a Genetic Algorithm (GA) is applied to refine the solutions, aiming to achieve a better median with the smallest possible reversal distance from the three original permutations. In this approach, any genome rearrangement distance can be applied; in this paper, we use the reversal distance. To the best of our knowledge, the proposed approach has not been applied before for solving the median problem. Our approach considers a true biological evolution scenario by applying the concept of common intervals during the GA optimization process. This allows us to imitate true biological behavior and improve the time convergence of the genetic approach. We were able to handle permutations with a large number of genes within an acceptable time performance and with the same or better accuracy compared to existing algorithms.
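The median objective can be sketched with a simple stand-in distance. Here the breakpoint distance replaces the reversal distance used in the paper, since it is a cheap lower-bound-style proxy that keeps the example short; the toy genomes are made up:

```python
def breakpoints(p, q):
    """Number of adjacencies in p that are broken in q, a cheap proxy for
    the reversal distance between unsigned permutations."""
    adj = {frozenset(pair) for pair in zip(q, q[1:])}
    return sum(1 for pair in zip(p, p[1:]) if frozenset(pair) not in adj)

def median_score(candidate, genomes):
    # the median objective: total distance from the candidate to all genomes
    return sum(breakpoints(candidate, g) for g in genomes)

# three toy genomes sharing the gene set {1..5} in different orders
g1 = [1, 2, 3, 4, 5]
g2 = [1, 3, 2, 4, 5]
g3 = [1, 2, 4, 3, 5]
print(median_score([1, 2, 3, 4, 5], (g1, g2, g3)))  # 4
```

BeamGA median searches the permutation space for the candidate minimizing exactly this kind of score, with the beam search seeding the GA population and reversal distance as the actual metric.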

Keywords: median problem, phylogenetic tree, permutation, genetic algorithm, beam search, genome rearrangement distance

Procedia PDF Downloads 261
7024 Effects of Fishbone Creative Thinking Strategy on Problem-Solving Skills of Teaching Personnel in Ogun State, Nigeria

Authors: Olusegun Adeleke Adenuga

Abstract:

The study examined the effect of the fishbone creative thinking strategy on the problem-solving skills of public teachers in Ogun State, Nigeria. A 2x2x2 factorial design was employed for the study, which consisted of 80 participants made up of 40 male and 40 female public teachers randomly selected from the public teaching personnel of the two local government area headquarters (Ijebu-Ode and Ijebu-Igbo) within Ogun East Senatorial District. Each treatment group received 45 minutes of instruction and training per week for 8 weeks. Data were collected from participants with a standardized instrument tagged the 'Problem Solving Inventory' (PSI), developed by the researchers, administered prior to the training as a pre-test and immediately after eight weeks of training as a post-test. One hypothesis was tested; the data obtained were analyzed using Analysis of Covariance (ANCOVA) at a significance level of 0.05. The result of the data analysis shows that there was a significant effect of the fishbone creative thinking technique on the participants (F(2,99) = 12.410; p < .05). Based on the findings, it is recommended that the report of this study be used to effect organizational change and development of the teaching service in Nigeria through teachers' retraining and capacity building.

Keywords: fishbone, creative thinking strategy, problem-solving skills, public teachers

Procedia PDF Downloads 345
7023 An Unusual Fracture Pattern: Fracture of the Distal Radius (Colles') along with Fracture of the Ipsilateral Scaphoid & Capitate Bones

Authors: Srikanta Tagore Sarkar, Prasanta Kumar Mandal, Dibakar Roy

Abstract:

The association of a capitate fracture with a scaphoid fracture has been termed the naviculocapitate syndrome. The existence of some nondisplaced fractures of the scaphoid and capitate, with or without fracture of the lunate or radius, suggests that there is a spectrum of these injuries, and this confuses the terminology. With our case, we report an unusual variety of this naviculocapitate syndrome: a distal radial Colles' fracture in addition to nondisplaced fractures of the scaphoid, the capitate and the dorsal lip of the radius. In the literature, no other Colles' fracture has been reported together with an undisplaced scapho-capitate syndrome. The coronal and sagittal images obtained from MDCT (multidetector computed tomography) make it a useful and effective imaging modality for diagnosing complex wrist fractures, with details that are not detected on X-rays.

Keywords: scaphoid, capitate, Colles’ fracture, syndrome, MDCT, unusual

Procedia PDF Downloads 387
7022 M-Number of Aortic Cannulas Applied During Hypothermic Cardiopulmonary Bypass

Authors: Won-Gon Kim

Abstract:

A standardized system to describe the pressure-flow characteristics of a given cannula has recently been proposed and has been termed ‘the M-number’. Using three different sizes of aortic cannulas in 50 pediatric cardiac patients on hypothermic cardiopulmonary bypass, we analyzed the correlation between experimentally and clinically derived M-numbers, and found this was positive. Clinical M-numbers were typically 0.35 to 0.55 greater than experimental M-numbers, and correlated inversely with a patient's temperature change; this was most probably due to increased blood viscosity, arising from hypothermia. This inverse relationship was more marked in higher M-number cannulas. The clinical data obtained in this study suggest that experimentally derived M-numbers correlate strongly with clinical performance of the cannula, and that the influence of temperature is significant.

Keywords: cardiopulmonary bypass, M-number, aortic cannula, pressure-flow characteristics

Procedia PDF Downloads 238
7019 Study of the Influence of the Type of Cast Iron Chips on the Quality of Briquettes Obtained with Controlled Impact

Authors: Dimitar N. Karastoianov, Stanislav D. Gyoshev, Todor N. Penchev

Abstract:

The preparation of briquettes of metal chips with good density and quality is of great importance for the efficiency of this process. In this paper, the results of impact briquetting of grey cast iron chips with a rectangular shape and dimensions 15x25x1 mm are presented. The density and quality of briquettes of these chips are compared with those obtained in another work of the authors using cast iron chips of smaller sizes. It has been found that using rectangular chips of large size produces briquettes with a very low density and poor quality. From the photographs taken by X-ray tomography, it is clear that the reason for this is the orientation of the chips in the peripheral wall of the briquettes, which does not allow the air to escape. It was concluded that in order to obtain briquettes from cast iron chips of large size, these chips must first be ground, for example in a small ball mill.

Keywords: briquetting, chips, impact, rocket engine

Procedia PDF Downloads 520
7020 Phantom and Clinical Evaluation of Block Sequential Regularized Expectation Maximization Reconstruction Algorithm in Ga-PSMA PET/CT Studies Using Various Relative Difference Penalties and Acquisition Durations

Authors: Fatemeh Sadeghi, Peyman Sheikhzadeh

Abstract:

Introduction: The Block Sequential Regularized Expectation Maximization (BSREM) reconstruction algorithm was recently developed to suppress excessive noise by applying a relative difference penalty. The aim of this study was to investigate the effect of various strengths of the noise penalization factor in the BSREM algorithm under different acquisition durations and lesion sizes, in order to determine an optimum penalty factor by considering both quantitative and qualitative image evaluation parameters in clinical use. Materials and Methods: The NEMA IQ phantom and 15 clinical whole-body patients with prostate cancer were evaluated. The phantom and patients were injected with Gallium-68 Prostate-Specific Membrane Antigen (68Ga-PSMA) and scanned on a non-time-of-flight Discovery IQ Positron Emission Tomography/Computed Tomography (PET/CT) scanner with BGO crystals. The data were reconstructed using BSREM with β-values of 100-500 at an interval of 100. These reconstructions were compared to OSEM as a widely used reconstruction algorithm. Following the standard NEMA measurement procedure, background variability (BV), recovery coefficient (RC), contrast recovery (CR) and residual lung error (LE) were measured from the phantom data, and signal-to-noise ratio (SNR), signal-to-background ratio (SBR) and tumor SUV were measured from the clinical data. Qualitative features of the clinical images were visually ranked by one nuclear medicine expert. Results: The β-value acts as a noise suppression factor, so BSREM showed decreasing image noise with an increasing β-value. BSREM with a β-value of 400 at a decreased acquisition duration (2 min/bp) produced approximately the same noise level as OSEM at an increased acquisition duration (5 min/bp). For the β-value of 400 at 2 min/bp, SNR increased by 43.7% and LE decreased by 62% compared with OSEM at 5 min/bp. In both phantom and clinical data, an increase in the β-value translates into a decrease in SUV.
The lowest levels of SUV and noise were reached with the highest β-value (β = 500), resulting in the highest SNR and lowest SBR, due to the greater reduction in noise than in SUV at the highest β-value. In comparing BSREM reconstructions with different β-values, the relative difference in the quantitative parameters was generally larger for smaller lesions. As the β-value decreased from 500 to 100, the increase in CR was 160.2% for the smallest sphere (10 mm) and 12.6% for the largest sphere (37 mm), and the trend was similar for SNR (-58.4% and -20.5%, respectively). BSREM was visually ranked higher than OSEM in all qualitative features. Conclusions: The BSREM algorithm using more iteration numbers leads to greater quantitative accuracy without excessive noise, which translates into higher overall image quality and lesion detectability. This improvement can be used to shorten acquisition time.
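The relative difference penalty at the heart of BSREM can be illustrated directly. The sketch below uses one common form of the penalty for a pair of neighbouring voxel values, with an assumed edge-preservation parameter γ; the β-value studied in the paper plays the role of the global weight multiplying this penalty in the reconstruction objective:

```python
def relative_difference_penalty(xj, xk, gamma=2.0, eps=1e-12):
    """Relative difference penalty between two neighbouring voxel values:
    it grows with the *relative* difference, so it smooths noise while
    penalising genuine high-contrast edges less than a quadratic prior."""
    diff = xj - xk
    return diff * diff / (xj + xk + gamma * abs(diff) + eps)

# Same absolute difference of 2.0, different background level:
print(relative_difference_penalty(10.0, 8.0))  # high-count region, small penalty
print(relative_difference_penalty(3.0, 1.0))   # low-count region, larger penalty
```

The same absolute difference costs more in a low-count region, which matches the study's observation that increasing β (the weight on this penalty) suppresses noise but also pulls down SUV, most strongly for small lesions.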

Keywords: BSREM reconstruction, PET/CT imaging, noise penalization, quantification accuracy

Procedia PDF Downloads 90
7019 Assessment of Breast, Lung and Liver Effective Doses in Heart Imaging by CT-Scan 128 Dual Sources with Use of TLD-100 in RANDO Phantom

Authors: Seyedeh Sepideh Amini, Navideh Aghaei Amirkhizi, Seyedeh Paniz Amini, Seyed Soheil Sayyahi, Mohammad Reza Davar Panah

Abstract:

CT is a cross-sectional imaging method that produces 3D images using an x-ray tube rotating around a central axis. This study evaluates and calculates the effective doses to organs around the heart, such as the breast, lung and liver, on a dual-source 128-slice CT scanner with TLD-100 dosimeters and a RANDO phantom, using the spiral, flash and conventional protocols. The results show that organs receive the maximum effective dose with the spiral protocol and the minimum with the flash protocol. The flash protocol is therefore advised for children and at-risk patients.

Keywords: X-ray computed tomography, dosimetry, TLD-100, RANDO, phantom

Procedia PDF Downloads 465
7018 A Hybrid Pareto-Based Swarm Optimization Algorithm for the Multi-Objective Flexible Job Shop Scheduling Problems

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a new hybrid particle swarm optimization algorithm is proposed for the multi-objective flexible job shop scheduling problem, which is an important and hard combinatorial problem. The Pareto approach is used for solving the multi-objective problem. Several new local search heuristics are integrated into an algorithm based on the critical block concept to enhance the performance of the algorithm. The algorithm is compared with recently published multi-objective algorithms on benchmarks selected from the literature. Several metrics are used for quantifying performance and comparing the achieved solutions. The algorithms are also compared based on the weighted summation of objectives approach. The proposed algorithm can find the Pareto solutions more efficiently than the compared algorithms, in less computational time.
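The Pareto approach used above reduces to a simple dominance test between objective vectors. The sketch below assumes minimization, and the candidate schedules' (makespan, total workload) pairs are made-up values:

```python
def dominates(a, b):
    """a Pareto-dominates b (minimization): a is no worse than b in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # keep every point that no other point dominates
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (makespan, total workload) pairs for hypothetical candidate schedules
print(pareto_front([(10, 7), (9, 9), (12, 6), (11, 8)]))
# [(10, 7), (9, 9), (12, 6)] -- (11, 8) is dominated by (10, 7)
```

The multi-objective algorithms compared in the paper maintain exactly such a non-dominated set, and the quality metrics score how well an algorithm's front approximates the true one.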

Keywords: swarm-based optimization, local search, Pareto optimality, flexible job shop scheduling, multi-objective optimization

Procedia PDF Downloads 363
7017 Multimodal Ophthalmologic Evaluation Can Detect Retinal Injuries in Asymptomatic Patients With Primary Antiphospholipid Syndrome

Authors: Taurino S. R. Neto, Epitácio D. S. Neto, Flávio Signorelli, Gustavo G. M. Balbi, Alex H. Higashi, Mário Luiz R. Monteiro, Eloisa Bonfá, Danieli C. O. Andrade, Leandro C. Zacharias

Abstract:

Purpose: To perform a multimodal evaluation, including optical coherence tomography angiography (OCTA), in patients with primary antiphospholipid syndrome (PAPS) without ocular complaints and to compare them with healthy individuals. Methods: A complete structural and functional ophthalmological evaluation using OCTA and microperimetry (MP) was performed in patients with PAPS followed at a tertiary rheumatology outpatient clinic. All ophthalmologic manifestations were recorded, and statistical analysis was then performed for comparative purposes; p < 0.05 was considered statistically significant. Results: 104 eyes of 52 subjects (26 patients with PAPS without ocular complaints and 26 healthy individuals) were included. Among PAPS patients, 21 (80.8%) were female and 21 (80.8%) were Caucasian. Thrombotic PAPS was the main clinical criteria manifestation (100%); 65.4% had venous and 34.6% had arterial thrombosis. Obstetric criteria were present in 34.6% of the thrombotic PAPS patients. Lupus anticoagulant was present in all patients. Ophthalmologic findings were present in 19.2% of PAPS patients versus none of the healthy individuals. The most common retinal change was paracentral acute middle maculopathy (PAMM) (3 patients, 5 eyes), followed by drusen-like deposits (1 patient, 2 eyes) and pachychoroid pigment epitheliopathy (1 patient, 1 eye). Systemic hypertension and hyperlipidemia were present in 100% of the PAPS patients with PAMM, whereas only six patients (26.1%) with PAPS without PAMM presented these two risk factors together. In the quantitative OCTA evaluation, we found significant differences between PAPS patients and controls in both the superficial vascular complex (SVC) and the deep vascular complex (DVC) in the high-speed protocol, as well as in the SVC in the high-resolution protocol. In the analysis of the foveal avascular zone (FAZ) parameters, the PAPS group had a larger FAZ area in the DVC using the high-speed method compared to the control group (p = 0.047). In the quantitative analysis of the MP, the PAPS group had lower central (p = 0.041) and global (p < 0.001) retinal sensitivity compared to the control group, as well as in the sector analysis, with the exception of the inferior sector. In the quantitative evaluation of fixation stability, there was a trend towards worse stability in the PAPS subgroup with PAMM in both studied methods. Conclusions: PAMM was observed in 11.5% of PAPS patients with no previous ocular complaints. Systemic hypertension concomitant with hyperlipidemia was the most commonly associated risk factor for PAMM in patients with PAPS. PAPS patients present lower vascular density and retinal sensitivity compared to the control group, even in patients without PAMM.

Keywords: antiphospholipid syndrome, optical coherence tomography angiography, optical coherence tomography, retina

Procedia PDF Downloads 76
7016 The Integration of Geographical Information Systems and Capacitated Vehicle Routing Problem with Simulated Demand for Humanitarian Logistics in Tsunami-Prone Area: A Case Study of Phuket, Thailand

Authors: Kiatkulchai Jitt-Aer, Graham Wall, Dylan Jones

Abstract:

As a result of the Indian Ocean tsunami in 2004, logistics applied to disaster relief operations has received great attention in the humanitarian sector. As learned from that disaster, preparing for and responding to the delivery of essential items from distribution centres to affected locations is of great importance for relief operations, since the impact of a disaster is uncertain, especially the casualty figures to which the quantity of supplies is normally proportional. Thus, this study proposes a spatial decision support system (SDSS) for humanitarian logistics by integrating Geographical Information Systems (GIS) and the capacitated vehicle routing problem (CVRP). The GIS is utilised for acquiring demands simulated from the tsunami flooding model of the affected area in the first stage and for visualising the simulation solutions in the last stage. The CVRP in this study encompasses designing the relief routes of a set of homogeneous vehicles from a relief centre to a set of geographically distributed evacuation points whose demands are estimated using both simulation and randomisation techniques. The CVRP is modelled as a multi-objective optimisation problem in which both the total travelling distance and the total transport resources used are minimised, while the demand-cost efficiency of each route is maximised in order to determine route priority. As the model is an NP-hard combinatorial optimisation problem, the Clarke and Wright savings heuristic is used to obtain near-optimal solutions. Real-case instances in the coastal area of Phuket, Thailand are studied to demonstrate the SDSS, which allows a decision maker to visually analyse the simulation scenarios through different decision factors.
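The Clarke and Wright savings heuristic cited in the abstract starts from the saving s(i, j) = d(0, i) + d(0, j) − d(i, j) obtained by serving two evacuation points on one route instead of two separate out-and-back trips from the relief centre. A minimal sketch of the savings computation, using an illustrative toy distance matrix rather than the Phuket data, is:

```python
def clarke_wright_savings(dist, depot=0):
    """Compute Clarke-Wright savings s(i,j) = d(0,i) + d(0,j) - d(i,j)
    for every customer pair, sorted in descending order of saving."""
    n = len(dist)
    savings = []
    for i in range(1, n):
        for j in range(i + 1, n):
            s = dist[depot][i] + dist[depot][j] - dist[i][j]
            savings.append((s, i, j))
    savings.sort(reverse=True)  # merge routes greedily in this order
    return savings

# Toy symmetric distance matrix: node 0 is the relief centre,
# nodes 1-3 are evacuation points
dist = [
    [0, 4, 5, 6],
    [4, 0, 2, 7],
    [5, 2, 0, 3],
    [6, 7, 3, 0],
]
savings = clarke_wright_savings(dist)  # [(8, 2, 3), (7, 1, 2), (3, 1, 3)]
```

The full heuristic would then merge routes in this savings order, skipping any merge that violates vehicle capacity; only the savings step is sketched here.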

Keywords: demand simulation, humanitarian logistics, geographical information systems, relief operations, capacitated vehicle routing problem

Procedia PDF Downloads 243
7015 Enhanced Imperialist Competitive Algorithm for the Cell Formation Problem Using Sequence Data

Authors: S. H. Borghei, E. Teymourian, M. Mobin, G. M. Komaki, S. Sheikh

Abstract:

The imperialist competitive algorithm (ICA) is a recent meta-heuristic, inspired by socio-political competition among empires, for solving NP-hard problems. The ICA is a population-based algorithm that has achieved great performance in comparison to other meta-heuristics. This study develops an enhanced ICA approach to solve the cell formation problem (CFP) using sequence data. In addition to the conventional ICA, an enhanced version, namely EICA, applies local search techniques to add intensification aptitude and to balance exploration and intensification more successfully. Suitable performance measures are used to compare the proposed algorithms with some other powerful solution approaches from the literature. To check the proficiency of the algorithms, forty test problems are presented: five benchmark problems have sequence data, and the others are based on 0-1 matrices modified into sequence-based problems. Computational results elucidate the efficiency of the EICA in solving CFP instances.
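In the ICA, candidate solutions ("countries") are grouped into empires, and each colony is drawn toward its imperialist in the search space (assimilation). The sketch below shows only that assimilation step, under the common continuous formulation x_new = x + U(0, β)·(imperialist − x); the function name and parameters are illustrative and not the EICA of the abstract, which works on a combinatorial encoding:

```python
import random

def assimilate(colony, imperialist, beta=2.0):
    """Move a colony toward its imperialist: per dimension,
    x_new = x + U(0, beta) * (imp - x). With beta > 1 the colony
    may overshoot the imperialist, which aids exploration."""
    return [x + random.uniform(0, beta) * (imp - x)
            for x, imp in zip(colony, imperialist)]

# Illustrative 3-dimensional positions
colony = [0.0, 0.0, 0.0]
imperialist = [1.0, 1.0, 1.0]
moved = assimilate(colony, imperialist)  # each coordinate lands in [0, 2]
```

An enhanced variant such as the EICA would follow a move like this with a local search pass around the new position to intensify the search.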

Keywords: cell formation problem, group technology, imperialist competitive algorithm, sequence data

Procedia PDF Downloads 449
7014 Applications of Artificial Intelligence (AI) in Cardiac Imaging

Authors: Angelis P. Barlampas

Abstract:

The purpose of this study is to inform the reader about the various applications of artificial intelligence (AI) in cardiac imaging. AI is growing fast, and its role is crucial in medical specialties that use large amounts of digital data, which are very difficult or even impossible for human beings, especially doctors, to manage. Artificial intelligence (AI) refers to the ability of computers to mimic human cognitive function, performing tasks such as learning, problem-solving, and autonomous decision-making based on digital data. Whereas AI describes the concept of using computers to mimic human cognitive tasks, machine learning (ML) describes the category of algorithms that enable most current applications described as AI. Some of the current applications of AI in cardiac imaging are as follows. Ultrasound: automated segmentation of the cardiac chambers across five common views, used to quantify chamber volumes/mass, ascertain ejection fraction, and determine longitudinal strain through speckle tracking; determining the severity of mitral regurgitation (accuracy > 99% for every degree of severity); identifying myocardial infarction; distinguishing between athlete's heart and hypertrophic cardiomyopathy, as well as between restrictive cardiomyopathy and constrictive pericarditis; predicting all-cause mortality. CT: reducing radiation doses; calculating the calcium score; diagnosing coronary artery disease (CAD); predicting all-cause 5-year mortality; predicting major cardiovascular events in patients with suspected CAD. MRI: segmenting cardiac structures and infarct tissue; calculating cardiac mass and function parameters; distinguishing between patients with myocardial infarction and control subjects; potentially reducing costs by precluding the need for gadolinium-enhanced CMR; predicting 4-year survival in patients with pulmonary hypertension. Nuclear imaging: classifying normal and abnormal myocardium in CAD; detecting locations with abnormal myocardium; predicting cardiac death. ML was comparable to or better than two experienced readers in predicting the need for revascularization. AI emerges as a helpful tool in cardiac imaging and for doctors who cannot manage the ever-increasing demand for examinations such as ultrasound, computed tomography, MRI, and nuclear imaging studies.

Keywords: artificial intelligence, cardiac imaging, ultrasound, MRI, CT, nuclear medicine

Procedia PDF Downloads 72
7013 An Investigation the Effectiveness of Emotion Regulation Training on the Reduction of Cognitive-Emotion Regulation Problem in Patients with Multiple Sclerosis

Authors: Mahboobeh Sadeghi, Zahra Izadi Khah, Mansour Hakim Javadi, Masoud Gholamali Lavasani

Abstract:

Background: Since there is a relation between psychological and physiological factors, the aim of this study was to examine the effect of emotion regulation training on cognitive-emotion regulation problems in patients with multiple sclerosis (MS). Method: In a randomized clinical trial, thirty patients diagnosed with multiple sclerosis who had been referred to the state welfare organization were selected. The sample was randomized into either an experimental group or a non-intervention control group. The subjects participated in 75-minute treatment sessions held three times a week for 4 weeks (12 sessions). All 30 individuals were administered the Cognitive Emotion Regulation Questionnaire (CERQ), which participants completed at pretest and post-test. Data obtained from the questionnaire were analyzed using MANCOVA. Results: Emotion regulation training significantly decreased cognitive-emotion regulation problems in patients with multiple sclerosis (p < 0.001). Conclusions: Emotion regulation training can be used for the treatment of cognitive-emotion regulation problems in multiple sclerosis.

Keywords: Multiple Sclerosis, cognitive-emotion regulation, emotion regulation, MS

Procedia PDF Downloads 452
7012 A Correlational Study between Parentification and Memory Retention among Parentified Female Adolescents: A Neurocognitive Perspective on Parentification

Authors: Mary Dorothy Roxas, Jeian Mae Dungca, Reginald Agor, Beatriz Figueroa, Lennon Andre Patricio, Honey Joy Cabahug

Abstract:

Parentification occurs when children are expected to provide instrumental or emotional caregiving within the family. Prior research has found that parentification affects adolescents’ cognitive and emotional vulnerability. Attachment theory helps clarify the process of parentification, as it involves the relationship between the child and the parent, while Carandang’s theory of “taga-salo” helps explain parentification in the Philippine setting. The present study examined the potential risk of parentification for adolescents’ memory retention by hypothesizing that there is a correlation between the two. The research was conducted with 249 female adolescents aged 12-24 residing in Valenzuela City. Results indicated that there is a significant inverse correlation between parentification and memory retention.

Keywords: memory retention, neurocognitive, parentification, stress

Procedia PDF Downloads 637
7011 From Problem Space to Executional Architecture: The Development of a Simulator to Examine the Effect of Autonomy on Mainline Rail Capacity

Authors: Emily J. Morey, Kevin Galvin, Thomas Riley, R. Eddie Wilson

Abstract:

The key challenges of integrating autonomous rail operations into the existing mainline railway environment have been identified through the understanding and framing of the problem space and through stakeholder analysis. This was achieved by completing the first four steps of Soft Systems Methodology, in which the problem space was expressed via conceptual models. Having identified these challenges, we investigated one of them, namely capacity, via the use of models and simulation. This paper examines the approach used to move from the conceptual models to a simulation that can determine whether the integration of autonomous trains can plausibly increase capacity. Within this approach, we developed an architecture and converted logical models into physical resource models and associated design features, which were used to build a simulator. With this simulator, we are able to analyse mixtures of legacy and autonomous operations and produce fundamental diagrams and trajectory plots describing the dynamic behaviour of mixed mainline railway operations.

Keywords: autonomy, executable architecture, modelling and simulation, railway capacity

Procedia PDF Downloads 73
7010 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language

Authors: Wenjun Hou, Marek Perkowski

Abstract:

The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, to the best of our knowledge no quantum circuit implementation of these algorithms has been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measure based on the number of oracle iterations, but to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q#, developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed that evaluates the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating Grover’s algorithm with an oracle that finds a successively lower cost each time transforms the decision problem into an optimization problem, finding the minimum cost over Hamiltonian cycles. N log₂ K qubits are put into an equiprobable superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, a node index calculator, a uniqueness checker, and a comparator, all created using only quantum Toffoli gates, including their special forms, the Feynman (CNOT) and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle checks that all the calculated nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is then modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
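To get a feeling for the scale involved, the N log₂ K register width described above and the standard Grover iteration count, ⌊(π/4)·√(N_states/M)⌋ for M marked states, can be computed directly. The function names below are illustrative, and the M = 1 default is an assumption for the sketch:

```python
import math

def tsp_register_width(num_nodes, degree_bound):
    """Qubits needed for the edge encoding: N * ceil(log2(K)),
    one K-way edge choice per node."""
    return num_nodes * math.ceil(math.log2(degree_bound))

def grover_iterations(num_qubits, num_solutions=1):
    """Standard optimal Grover iteration count floor((pi/4) * sqrt(N/M)),
    where N = 2**num_qubits search states and M marked solutions."""
    n_states = 2 ** num_qubits
    return math.floor((math.pi / 4) * math.sqrt(n_states / num_solutions))

# Example: 8 nodes with degree bound K = 4 -> a 16-qubit search register,
# so each oracle pass sits inside roughly 201 Grover iterations
width = tsp_register_width(8, 4)
iters = grover_iterations(width)
```

This back-of-the-envelope count covers a single Grover run; the cost-threshold procedure in the abstract repeats such runs with an updated oracle until the cost stops decreasing.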

Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language

Procedia PDF Downloads 180
7009 A Study on the Korean Connected Industrial Parks Smart Logistics IT Financial Enterprise Architecture

Authors: Ilgoun Kim, Jongpil Jeong

Abstract:

Recently, a connected industrial parks (CIPs) architecture has been proposed that uses new technologies such as RFID, cloud computing, CPS, Big Data, 5G, IIoT, VR-AR, and AI algorithms based on IoT. This researcher noted the vehicle junction problem (VJP) as a more specific detail of the CIPs architectural models. The VJP includes 'efficient AI physical connection challenges for vehicles', 'financial issues with complex vehicle physical connections', and 'the welfare and working conditions of the personnel involved in complex vehicle physical connections'. In this paper, we propose a public solution architecture for the 'electronic financial problem of complex vehicle physical connections' as a detailed task within the VJP. The researcher sought solutions to the problems of businesses, consumers, and Korean society through technological advancement, and studied how the beneficiaries of technological development can include the many consumers in Korean society and the many small and medium-sized Korean company managers, not only some specific companies. In order to implement the CIPs architecture more specifically using the new technology, we noted the vehicle junction problem within the smart factory industrial complex and the process of achieving VJP performance among several electronic processes. This researcher proposes a more detailed, integrated public finance enterprise architecture within the overall CIPs architecture. The main details of the public integrated financial enterprise architecture are organized into four main categories: 'business', 'data', 'technique', and 'finance'.

Keywords: enterprise architecture, IT Finance, smart logistics, CIPs

Procedia PDF Downloads 158