Search results for: Computation of treatment plan
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10037

10007 The Intersection/Union Region Computation for Drosophila Brain Images Using Encoding Schemes Based on Multi-Core CPUs

Authors: Ming-Yang Guo, Cheng-Xian Wu, Wei-Xiang Chen, Chun-Yuan Lin, Yen-Jen Lin, Ann-Shyn Chiang

Abstract:

With more and more Drosophila driver and neuron images available, finding the similarity relationships among them is an important task for functional inference. A general problem is how to find a Drosophila driver image that can cover a given set of Drosophila driver/neuron images. To solve this problem, the intersection/union region of the image set is computed first, and a comparison step then calculates the similarities between this region and other images. In this paper, three encoding schemes, namely Integer, Boolean, and Decimal, are proposed to encode each image as a one-dimensional structure. The intersection/union region of the images is then computed using comparison operations, Boolean operators, and a lookup-table method. Finally, the comparison step is performed in the same way as the union-region computation, and the similarity score is calculated using the Tanimoto coefficient. The region-computation methods are also implemented in a multi-core CPU environment with OpenMP. The experimental results show that, in the encoding phase, the Boolean scheme performs best, while in the region-computation phase the Decimal scheme performs best when the number of images is large. The speedup ratio reaches 12 with 16 CPUs. This work was supported by the Ministry of Science and Technology under grant MOST 106-2221-E-182-070.
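
The Boolean encoding, union-region computation, and Tanimoto similarity can be illustrated with a short sketch (a minimal single-threaded Python analogue of the OpenMP implementation; the random stand-in volumes and the threshold are assumptions):

```python
import numpy as np

def encode_boolean(image, threshold=0):
    """Encode a 3D image stack as a flat Boolean occupancy vector."""
    return (np.asarray(image) > threshold).ravel()

def tanimoto(a, b):
    """Tanimoto coefficient between two Boolean vectors: |A intersect B| / |A union B|."""
    intersection = np.count_nonzero(a & b)
    union = np.count_nonzero(a | b)
    return intersection / union if union else 0.0

def union_region(encoded_images):
    """Union region of a set of encoded driver/neuron images (element-wise OR)."""
    region = encoded_images[0].copy()
    for img in encoded_images[1:]:
        region |= img
    return region

# Example with random stand-in volumes (real data would be Drosophila image stacks)
rng = np.random.default_rng(0)
stack = [rng.random((64, 64, 32)) > 0.7 for _ in range(4)]
encoded = [encode_boolean(s) for s in stack]
region = union_region(encoded)
print(tanimoto(region, encoded[0]))   # similarity of the union region to one image
```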

Keywords: Drosophila driver image, Drosophila neuron images, intersection/union computation, parallel processing, OpenMP

Procedia PDF Downloads 202
10006 Deep Reinforcement Learning-Based Computation Offloading for 5G Vehicle-Aware Multi-Access Edge Computing Network

Authors: Ziying Wu, Danfeng Yan

Abstract:

Multi-Access Edge Computing (MEC) is one of the key technologies of the future 5G network. By deploying edge computing centers at the edge of the wireless access network, computation tasks can be offloaded to edge servers rather than to a remote cloud server, meeting the low-latency and high-reliability requirements of 5G application scenarios. Meanwhile, with the development of Internet of Vehicles (IoV) technology, various delay-sensitive and compute-intensive in-vehicle applications continue to appear. Compared with traditional internet traffic, these computation tasks have higher processing priority and stricter delay requirements. In this paper, we design a 5G-based Vehicle-Aware Multi-Access Edge Computing Network (VAMECN) and formulate a joint optimization problem that minimizes the total system cost. For this problem, a deep reinforcement learning-based joint computation offloading and task migration optimization (JCOTM) algorithm is proposed, considering the influence of multiple factors such as concurrent computation tasks, the distribution of system computing resources, and network communication bandwidth. The mixed-integer nonlinear programming problem is described as a Markov Decision Process. Experiments show that, compared with other computation offloading policies, the proposed algorithm effectively reduces task processing delay and device energy consumption, optimizes the offloading and resource allocation schemes, and improves system resource utilization.
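
As a rough illustration of the per-task delay/energy trade-off that the JCOTM algorithm optimizes jointly over many tasks, here is a minimal sketch; the cost model, weights, and parameter values below are assumptions, not the paper's formulation:

```python
def task_cost(cycles, data_bits, f_local, f_edge, bandwidth,
              tx_power, kappa=1e-27, w_delay=0.5, w_energy=0.5):
    """Compare weighted delay/energy cost of local execution vs. offloading."""
    # Local execution: delay = C/f, dynamic energy ~ kappa * C * f^2
    t_local = cycles / f_local
    e_local = kappa * cycles * f_local ** 2
    # Offloading: upload delay + edge execution delay, energy spent transmitting
    t_off = data_bits / bandwidth + cycles / f_edge
    e_off = tx_power * (data_bits / bandwidth)
    cost_local = w_delay * t_local + w_energy * e_local
    cost_off = w_delay * t_off + w_energy * e_off
    return ("offload", cost_off) if cost_off < cost_local else ("local", cost_local)

# A compute-intensive in-vehicle task: 1 Gcycle, 2 Mbit input, 10 Mbit/s uplink
print(task_cost(1e9, 2e6, f_local=1e9, f_edge=10e9, bandwidth=10e6, tx_power=0.5))
```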

Keywords: multi-access edge computing, computation offloading, 5th generation, vehicle-aware, deep reinforcement learning, deep q-network

Procedia PDF Downloads 78
10005 A Cephalometric Superimposition of a Skeletal Class III Orthognathic Patient on Nasion-Sella Line

Authors: Albert Suryaprawira

Abstract:

The Nasion-Sella Line (NSL) has been used for many years as a reference line in longitudinal growth studies. This line is therefore considered stable enough not only to evaluate treatment outcome and predict the possibility of relapse but also to manage prognosis. This report describes a radiographic superimposition for an adult male aged 19 years who complained of aesthetic problems and difficulty talking and chewing. The patient had a concave profile due to midface hypoplasia. He was diagnosed with a severe skeletal Class III pattern with Class III malocclusion, increased lower vertical height, and an anterior open bite. A pre-treatment cephalometric radiograph was taken to analyse the skeletal problem and to measure the planned amount of bone movement and the predicted soft-tissue response. A panoramic radiograph was also taken to analyse bone quality, bone abnormalities, third-molar impaction, and other findings. Before the surgery, a pre-surgical cephalometric radiograph was taken to re-evaluate the plan and to finalise the amount of bone to be cut. After the surgery, a post-surgical cephalometric radiograph was taken to compare the result with the plan. A superimposition of these radiographs using the NSL as a reference line was performed to analyse the outcome. It is important to describe the amount of hard- and soft-tissue movement and to predict the possibility of relapse after surgery. The patient also needs to understand the surgical plan, the expected outcome, and relapse prevention. The surgical management consisted of maxillary impaction and advancement with a Le Fort I osteotomy. Evaluation using the NSL as a reference proved a very useful method for determining outcome and prognosis.

Keywords: Nasion-Sella Line, midface hypoplasia, Le Fort I, maxillary advancement

Procedia PDF Downloads 120
10004 Reliability Analysis of Construction Schedule Plan Based on Building Information Modelling

Authors: Lu Ren, You-Liang Fang, Yan-Gang Zhao

Abstract:

In recent years, the application of BIM (Building Information Modelling) to construction schedule planning has attracted increasing research attention. To assess whether a BIM-based construction schedule plan is reasonable, that is, whether the schedule can be completed on time, some researchers have introduced reliability theory for the evaluation. In this evaluation, the uncertain factors affecting the construction schedule are treated as random variables, and their probability distributions are usually assumed to be normal, determined by two parameters estimated from the mean and standard deviation of statistical data. In practical engineering, however, most of the uncertain influence factors are not normally distributed, so evaluation results obtained under the normality assumption can be unreasonable. To obtain a more reasonable evaluation, the distributions of the random variables need to be described more comprehensively. For this purpose, the cubic normal distribution is introduced in this paper to describe the distribution of arbitrary random variables; it is determined by the first four moments (mean, standard deviation, skewness, and kurtosis). In this paper, the BIM model is first built according to the design information of the structure and the construction schedule plan is made based on BIM; the cubic normal distribution is then fitted to the statistical data collected for the random factors influencing the schedule; and the reliability analysis of the BIM-based construction schedule plan is then carried out on this more reasonable basis, yielding more accurate evaluation results that provide a reference for implementing the actual construction schedule. In the last part of the paper, the efficiency and accuracy of the proposed methodology are demonstrated through a practical engineering case.
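
A minimal sketch of the moment-matching step behind a cubic normal transformation, X = a + bZ + cZ^2 + dZ^3 with Z standard normal, followed by a Monte Carlo reliability estimate; the moment values and the 17-day threshold are placeholders, not data from the case study:

```python
import numpy as np
from scipy import optimize, stats

def fleishman_coeffs(skew, ex_kurt):
    """Coefficients (a, b, c, d) of the cubic transformation X = a + b*Z + c*Z**2 + d*Z**3
    of a standard normal Z, chosen so that X has zero mean, unit variance, and the
    given skewness and excess kurtosis (Fleishman's moment-matching equations)."""
    def equations(p):
        b, c, d = p
        return (b**2 + 6*b*d + 2*c**2 + 15*d**2 - 1,
                2*c*(b**2 + 24*b*d + 105*d**2 + 2) - skew,
                24*(b*d + c**2*(1 + b**2 + 28*b*d)
                    + d**2*(12 + 48*b*d + 141*c**2 + 225*d**2)) - ex_kurt)
    b, c, d = optimize.fsolve(equations, (1.0, 0.0, 0.0))
    return -c, b, c, d

# Placeholder first four moments for one uncertain activity duration (days),
# as would be estimated from collected schedule data
mean, std, skew, ex_kurt = 14.0, 2.0, 0.5, 0.6
a, b, c, d = fleishman_coeffs(skew, ex_kurt)

# Monte Carlo reliability: probability the activity finishes within 17 days
z = np.random.default_rng(1).standard_normal(200_000)
x = mean + std * (a + b*z + c*z**2 + d*z**3)
print("matched skew/kurtosis:", stats.skew(x).round(2), stats.kurtosis(x).round(2))
print("P(duration <= 17 days) ~", np.mean(x <= 17).round(3))
```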

Keywords: BIM, construction schedule plan, cubic normal distribution, reliability analysis

Procedia PDF Downloads 107
10003 Algorithmic Approach to Management of Complications of Permanent Facial Filler: A Saudi Experience

Authors: Luay Alsalmi

Abstract:

Background: Facial filler injection is the most common cosmetic procedure after botox. Permanent fillers are often preferred nowadays because the cost is lower when injection appointments do not recur. However, such fillers carry a higher risk of complications, with even greater adverse effects when the procedure is performed with unknown dermal filler products. Aim: This study aimed to establish an algorithm to categorize and manage patients who have received permanent fillers. Materials and Methods: Twelve participants presented to the service through the emergency department or as outpatients from November 2015 to May 2021. Demographics such as age, sex, date of injection, time of onset, and type of complication were collected. After examination, all cases were managed according to the established algorithm. FACE-Q was used to measure overall satisfaction and psychological well-being. Results: An algorithm to diagnose and manage these patients effectively, with a high satisfaction rate, was established in this study. All participants were non-smoking females with no known medical comorbidities. The algorithm determined the treatment plan when complications were encountered. High appearance-related psychosocial distress was observed prior to surgery, and it dropped significantly after surgery. FACE-Q provided evidence of satisfactory ratings among patients before and after surgery. Conclusion: This treatment algorithm can guide the surgeon in formulating a suitable plan with fewer complications and a high satisfaction rate.

Keywords: facial filler, FACE-Q, psycho-social stress, botox, treatment algorithm

Procedia PDF Downloads 57
10002 A Quantitative Plan for Drawing Down Emissions to Attenuate Climate Change

Authors: Terry Lucas

Abstract:

Calculations are performed to quantify the potential contribution of each greenhouse gas emission reduction strategy. This approach makes it possible to visualise the relative benefits of each strategy and provides a potential baseline for developing a plan of action rooted in quantitative evaluation. Emission reductions are converted into a potential de-escalation of global average temperature. A comprehensive plan is then presented that shows the potential benefits out to the year 2100. A target temperature de-escalation of 2°C was selected, but the plan shows a benefit of only 1.225°C. This disappointing result holds in spite of new and powerful technologies being introduced into the equation, including nuclear fusion and alternative nuclear fission processes. Current technologies such as wind, solar, and electric vehicles show surprisingly small contributions to the whole.

Keywords: climate change, emissions, drawdown, energy

Procedia PDF Downloads 94
10001 The Impact of Using Flattening Filter-Free Energies on Treatment Efficiency for Prostate SBRT

Authors: T. Al-Alawi, N. Shorbaji, E. Rashaidi, M. Alidrisi

Abstract:

Purpose/Objective(s): The main purpose of this study is to analyze SBRT planning for localized prostate cancer with 6FFF and 10FFF energies, to determine whether there is a dosimetric difference between the two energies and how plan efficiency can be increased and plan complexity reduced. A further aim is to introduce a planning method in our department for treating prostate cancer with high-energy photons without increasing patient toxicity, while fulfilling all dosimetric constraints for the organs at risk (OAR). The plans were evaluated for target coverage (PTV95), V5%, V2%, V1%, low-dose volumes for the OAR (V1Gy, V2Gy, V5Gy), monitor units (beam-on time), and the homogeneity index (HI), conformity index (CI), and gradient index (GI) of each treatment plan. Materials/Methods: Two treatment plans were generated retrospectively for 15 patients with localized prostate cancer, using the CT planning images acquired for radiotherapy. Each plan contains two or three complete arcs with two or three different collimator angle sets. The maximum dose rate available is 1400 MU/min for 6FFF and 2400 MU/min for 10FFF; therefore, to avoid changing the gantry speed during rotation, a third arc is used in the 6FFF plan to accommodate the high dose per fraction. The clinical target volume (CTV) consists of the entire prostate for organ-confined disease. The planning target volume (PTV) adds a margin of 5 mm, with a 3-mm margin favored posteriorly. Organs at risk identified and contoured include the rectum, bladder, penile bulb, femoral heads, and small bowel. The prescription is 35 Gy in five fractions to the PTV, with OAR constraints derived from those reported in the literature. Results: The indices CI = 0.99, HI = 0.7, and GI = 4.1 were the same for both 6FFF and 10FFF, but the total delivered MUs were much lower for the 10FFF plans (2907 for 6FFF vs. 2468 for 10FFF), and the total delivery time was 124 s for 6FFF vs. 61 s for 10FFF beams. There were no dosimetric differences between 6FFF and 10FFF in terms of PTV coverage; the mean doses for the bladder, rectum, femoral heads, penile bulb, and small bowel were collected and were in favor of 10FFF. The V1Gy, V2Gy, and V5Gy doses for all OAR were also lower with the 10FFF plans, as were the integral doses (ID, in Gy·L) recorded for all OAR. Conclusion: The 10FFF energy gives a lower treatment time and fewer delivered MUs, as well as lower integral and mean doses to the organs at risk. Based on this study, we suggest using a 10FFF beam for prostate SBRT, which has the advantage of shortening treatment time and reducing plan complexity compared with 6FFF beams.
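
Definitions of the conformity, homogeneity, and gradient indices vary between publications; a minimal sketch using one common set of definitions (Paddick CI, (D2-D98)/D50 homogeneity, and a 50%-isodose gradient index; these are assumptions, not necessarily the formulas used in this study) on a synthetic dose grid:

```python
import numpy as np

def plan_indices(dose, ptv_mask, rx):
    """Compute conformity, homogeneity and gradient indices from a dose grid.

    dose     : 3D array of dose values (Gy)
    ptv_mask : Boolean 3D array, True inside the PTV
    rx       : prescription (reference isodose) level in Gy
    Conventions used here (others exist):
      CI (Paddick) = TV_PIV^2 / (TV * PIV)
      HI           = (D2% - D98%) / D50% within the PTV
      GI           = V(50% of rx) / V(rx)
    """
    piv = dose >= rx                                   # prescription isodose volume
    tv_piv = np.count_nonzero(piv & ptv_mask)
    tv, pivn = np.count_nonzero(ptv_mask), np.count_nonzero(piv)
    ci = tv_piv ** 2 / (tv * pivn)

    ptv_dose = dose[ptv_mask]
    d2, d50, d98 = np.percentile(ptv_dose, [98, 50, 2])  # D2% = 98th dose percentile
    hi = (d2 - d98) / d50

    gi = np.count_nonzero(dose >= 0.5 * rx) / pivn
    return ci, hi, gi

# Toy example: a spherical "PTV" inside a synthetic dose grid
z, y, x = np.mgrid[-40:40, -40:40, -40:40]
r = np.sqrt(x ** 2 + y ** 2 + z ** 2)
dose = 35.0 * np.exp(-np.maximum(r - 15, 0) / 10)      # dose falls off outside r = 15 voxels
ptv = r <= 15
print(plan_indices(dose, ptv, rx=33.25))                # 95% of 35 Gy as the reference level
```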

Keywords: FFF beam, SBRT prostate, VMAT, prostate cancer

Procedia PDF Downloads 54
10000 Malpractice, Even in Conditions of Compliance With the Rules of Dental Ethics

Authors: Saimir Heta, Kers Kapaj, Rialda Xhizdari, Ilma Robo

Abstract:

Despite the existence of different dental specialties, the dentist-patient relationship is unique in that the treatment is performed by a single doctor, and the patient identifies any malpractice as part of that doctor's practice; this is in contrast to medical treatments, where the patient may be treated by a team of doctors for a specific pathology. The rules of dental ethics are almost the same as the rules of medical ethics. Dental malpractice affects precisely this two-party relationship, built on professionalism, between the dentist and the patient, but within much narrower individual boundaries than cases of medical malpractice. Main text: Malpractice can have different causes, ranging from professional negligence to a lack of professional knowledge on the part of the dentist undertaking the treatment. It should always be kept in perspective that we are not talking about a dentist who goes to work with the intention of harming patients. Malpractice can also result from the impossibility, for anatomical or physiological reasons of the tooth under treatment, of carrying out the predetermined dental treatment plan. On the other hand, the dentist is an individual who may be affected by health conditions, or have vices that affect his or her systemic health, which under these conditions can lead to malpractice. So, depending on the reason for the malpractice, the way it is treated from a legal point of view also varies for the dentist who committed it, assessing in particular whether the malpractice occurred under conditions in which the rules of dental ethics were followed. Conclusions: Deviation from the predetermined dental plan is the minimal sign of malpractice, and malpractice should not be associated only with cases of difficult dental treatments. Identifying the reason for the malpractice is the initial element that determines how it is treated from a legal point of view, and the involvement of the dentist in the assessment of the malpractice committed must be based on the legislation in force, which has its own specific variations in different states. Malpractice should be covered in lectures and in the continuing education of professionals, because it serves as a way of passing on professional experience so that the same mistake is not repeated by different professionals.

Keywords: dental ethics, malpractice, negligence, legal basis, continuing education, dental treatments

Procedia PDF Downloads 36
9999 Model Driven Architecture Methodologies: A Review

Authors: Arslan Murtaza

Abstract:

Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development in which different models are proposed and then converted into code. The main idea is to specify the system using a PIM (Platform Independent Model), transform it into a PSM (Platform Specific Model), and then convert it into code. This review paper describes some challenges and issues faced in MDA, the types and transformations of models (e.g., CIM, PIM, and PSM), and an evaluation of MDA-based methodologies.

Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies

Procedia PDF Downloads 414
9998 Investigating the Effect of Study Plan and Homework on Student's Performance by Using Web Based Learning MyMathLab

Authors: Mohamed Chabi, Mahmoud I. Syam, Sarah Aw

Abstract:

In Summer 2012, the Foundation Program Unit of Qatar University started implementing new ways of teaching mathematics by introducing MyMathLab (MML) as an innovative interactive tool to support standard teaching. In this paper, we focus on the effect of proper use of the Study Plan component of MML on students' performance. The authors investigated the results of students in a pre-calculus course during Fall 2013 in the Foundation Program at Qatar University. The results showed a strong correlation between study plan results and final exam results, as well as a strong relation between homework results and final exam results. In addition, attendance average affected students' results in general. A multiple regression model was determined with passing rate as the dependent variable and study plan and homework as independent variables.
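
A minimal sketch of the correlation and regression analysis described (made-up score vectors stand in for the actual student records, and final-exam score is used as the dependent variable in place of passing rate):

```python
import numpy as np

# Hypothetical per-student scores (0-100): study plan, homework, and final exam
study_plan = np.array([55, 72, 80, 64, 90, 45, 83, 70, 60, 95])
homework   = np.array([60, 75, 78, 66, 92, 50, 85, 72, 58, 97])
final_exam = np.array([52, 70, 76, 63, 88, 44, 82, 69, 57, 93])

# Pearson correlations with the final exam
r_sp = np.corrcoef(study_plan, final_exam)[0, 1]
r_hw = np.corrcoef(homework, final_exam)[0, 1]
print(f"r(study plan, final) = {r_sp:.3f}, r(homework, final) = {r_hw:.3f}")

# Multiple regression: final = b0 + b1*study_plan + b2*homework (least squares)
X = np.column_stack([np.ones_like(study_plan), study_plan, homework])
coeffs, *_ = np.linalg.lstsq(X, final_exam, rcond=None)
b0, b1, b2 = coeffs
print(f"final ~ {b0:.2f} + {b1:.2f}*study_plan + {b2:.2f}*homework")
```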

Keywords: MyMathLab, study plan, assessment, homework, attendance, correlation, regression

Procedia PDF Downloads 393
9997 An Improved Method to Compute Sparse Graphs for Traveling Salesman Problem

Authors: Y. Wang

Abstract:

The Traveling Salesman Problem (TSP) is an NP-hard problem in combinatorial optimization. Research shows that TSP algorithms running on sparse graphs have shorter computation times than those running on the corresponding complete graphs. We present an improved iterative algorithm that computes sparse graphs for the TSP from frequency graphs computed with frequency quadrilaterals. The iterative algorithm is enhanced by adjusting two of its parameters. Its computation time is O(C·Nmax·n²), where C is the number of iterations, Nmax is the maximum number of frequency quadrilaterals containing each edge, and n is the size of the TSP instance. The experimental results show that the computed sparse graphs generally have fewer than 5n edges for most of the Euclidean instances tested. Moreover, the maximum and minimum vertex degrees in the sparse graphs differ little. Thus, the computation time of methods that solve the TSP on these sparse graphs will be greatly reduced.
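
The frequency-quadrilateral idea can be illustrated with a heavily simplified sketch: edges are scored by how often they lie on the cheapest Hamiltonian cycle of randomly sampled quadrilaterals, and only the highest-scoring edges per vertex are kept. This scoring rule and the parameter values are illustrative assumptions, not the paper's exact frequency computation:

```python
import numpy as np
from itertools import combinations

def sparse_graph_by_quadrilateral_frequency(points, samples_per_edge=50,
                                            edges_per_vertex=10, rng=None):
    """Simplified proxy for frequency-quadrilateral edge scoring.

    For random quadrilaterals {a, b, c, d}, each of the three Hamiltonian cycles
    on the four vertices is evaluated; every edge lying on the cheapest cycle
    gets +1 frequency. Low-frequency edges are then dropped, keeping the top
    `edges_per_vertex` incident edges per vertex.
    """
    rng = np.random.default_rng(rng)
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    freq = np.zeros((n, n))
    for a, b in combinations(range(n), 2):
        for _ in range(samples_per_edge):
            c, d = rng.choice([v for v in range(n) if v not in (a, b)], 2, replace=False)
            cycles = [((a, b), (b, c), (c, d), (d, a)),   # a-b-c-d-a
                      ((a, b), (b, d), (d, c), (c, a)),   # a-b-d-c-a
                      ((a, c), (c, b), (b, d), (d, a))]   # a-c-b-d-a
            costs = [sum(dist[u, v] for u, v in cyc) for cyc in cycles]
            for u, v in cycles[int(np.argmin(costs))]:
                freq[u, v] += 1
                freq[v, u] += 1
    keep = set()
    for v in range(n):
        for u in np.argsort(-freq[v])[:edges_per_vertex]:
            keep.add((min(u, v), max(u, v)))
    return keep

points = np.random.default_rng(2).random((30, 2))
edges = sparse_graph_by_quadrilateral_frequency(points)
print(len(edges), "edges kept out of", 30 * 29 // 2)
```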

Keywords: frequency quadrilateral, iterative algorithm, sparse graph, traveling salesman problem

Procedia PDF Downloads 194
9996 Verification of Dosimetric Commissioning Accuracy of Flattening Filter Free Intensity Modulated Radiation Therapy and Volumetric Modulated Therapy Delivery Using Task Group 119 Guidelines

Authors: Arunai Nambi Raj N., Kaviarasu Karunakaran, Krishnamurthy K.

Abstract:

The purpose of this study was to create American Association of Physicists in Medicine (AAPM) Task Group 119 (TG 119) benchmark plans for flattening filter free (FFF) beam deliveries of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) in the Eclipse treatment planning system. The planning data were compared with the flattening filter (FF) IMRT and VMAT plan data to verify the dosimetric commissioning accuracy of FFF deliveries. AAPM TG 119 proposes a set of test cases, called multi-target, mock prostate, mock head and neck, and C-shape, to ascertain the overall accuracy of IMRT planning, measurement, and analysis. We used these test cases to investigate the performance of the Eclipse treatment planning system for flattening filter free beam deliveries. For each test case, we generated two sets of treatment plans, the first using 7-9 IMRT fields and the second using a two-arc VMAT technique, for each of the beam deliveries (6 MV FF, 6 MV FFF, 10 MV FF, and 10 MV FFF). The planning objectives and doses were set as described in TG 119. The dose prescriptions for multi-target, mock prostate, mock head and neck, and C-shape were 50, 75.6, 50, and 50 Gy, respectively. The point dose (mean dose to the contoured chamber volume) at the specified locations was measured using a compact (CC-13) ion chamber. The composite planar dose and per-field gamma analysis were measured with an IMatriXX Evaluation 2D array with OmniPro IMRT software (version 1.7b). FFF beam deliveries of IMRT and VMAT plans were comparable to flattening filter beam deliveries, and our planning and quality assurance results matched the TG 119 data. The AAPM TG 119 test cases are useful for generating FFF benchmark plans. From the data obtained in this study, we conclude that the commissioning of FFF IMRT and FFF VMAT delivery was within the limits of TG 119 and that the performance of the Eclipse treatment planning system for FFF plans was satisfactory.

Keywords: flattening filter free beams, intensity modulated radiation therapy, task group 119, volumetric modulated arc therapy

Procedia PDF Downloads 120
9995 A Two-Pronged Truncated Deferred Sampling Plan for Log-Logistic Distribution

Authors: Braimah Joseph Odunayo, Jiju Gillariose

Abstract:

This paper develops a sampling plan that uses information from preceding and succeeding lots for lot disposition, under the assumption that the lifetime of the product follows a log-logistic distribution. A Two-Pronged Truncated Deferred Sampling Plan (TTDSP) for the log-logistic distribution is proposed for tests truncated at a fixed time. The best possible sample sizes are obtained for given values of the Maximum Allowable Percent Defective (MAPD), Test Suspension Ratio (TSR), and acceptance number (c). A formula for calculating the operating characteristics of the proposed plan is also developed, and the operating characteristics and mean-ratio values are used to measure the performance of the plan. The findings show that the log-logistic distribution has a decreasing failure rate, and that the failure rate decreases further as the mean-life ratio increases; the sample size increases as the acceptance number, test suspension ratio, and maximum allowable percent defective increase. The study concludes that the minimum sample sizes are smaller, which makes the plan more economical to adopt when production cost and time are high and the testing is destructive.
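
The building block behind such plans is the lot acceptance probability for a time-truncated life test; a minimal single-lot sketch under a log-logistic lifetime is given below (the deferred logic that also consults preceding and succeeding lots is not reproduced, and the parameter values are placeholders):

```python
from math import comb

def loglogistic_cdf(t, alpha, beta):
    """CDF of the log-logistic distribution with scale alpha and shape beta."""
    return 1.0 / (1.0 + (t / alpha) ** (-beta))

def acceptance_probability(n, c, t0, alpha, beta):
    """Probability of accepting a lot when n units are tested until time t0
    and the lot is accepted if at most c failures are observed (binomial model)."""
    p = loglogistic_cdf(t0, alpha, beta)   # probability a unit fails before t0
    return sum(comb(n, d) * p ** d * (1 - p) ** (n - d) for d in range(c + 1))

# Example: 20 units, accept with at most 2 failures, test truncated at 500 h,
# log-logistic lifetime with scale 1000 h and shape 2
print(acceptance_probability(n=20, c=2, t0=500, alpha=1000, beta=2))
```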

Keywords: consumer's risk, mean life, minimum sample size, operating characteristics, producer's risk

Procedia PDF Downloads 105
9994 An Efficient Book Keeping Strategy for the Formation of the Design Matrix in Geodetic Network Adjustment

Authors: O. G. Omogunloye, J. B. Olaleye, O. E. Abiodun, J. O. Odumosu, O. G. Ajayi

Abstract:

The focus of this study is to provide an easy formulation and computation of the design matrix of the least-squares observation equations by using an efficient bookkeeping strategy. Usually, for a large network with many triangles and stations, a laborious task is involved in computing the differentials of each observation with respect to its station coordinates (latitude and longitude) and placing them in their respective rows and columns. The efficient bookkeeping strategy seeks to eliminate or reduce this laborious task, especially in large networks: with a simple, skilful arrangement and a short program written in the Matlab environment, the formulation and computation of the design matrix of the least-squares observation equations can be achieved easily.
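
A short Python sketch of the bookkeeping idea for distance observations in a small planar network (the original program is written in Matlab, and the station coordinates and observation list below are illustrative assumptions):

```python
import numpy as np

def design_matrix(coords, observations):
    """Assemble the least-squares design matrix A for distance observations.

    coords       : dict station -> (x, y) approximate coordinates
    observations : list of (station_i, station_j) observed distances
    Each row holds the partial derivatives of one distance with respect to the
    unknown coordinates; the column index is looked up from the station order,
    which is the bookkeeping that becomes tedious by hand for large networks.
    """
    stations = sorted(coords)                     # fixed column ordering
    col = {s: 2 * k for k, s in enumerate(stations)}
    A = np.zeros((len(observations), 2 * len(stations)))
    for row, (i, j) in enumerate(observations):
        xi, yi = coords[i]
        xj, yj = coords[j]
        d = np.hypot(xj - xi, yj - yi)
        # d(distance)/dx_i = -(xj - xi)/d, and so on for the other coordinates
        A[row, col[i]]     = -(xj - xi) / d
        A[row, col[i] + 1] = -(yj - yi) / d
        A[row, col[j]]     =  (xj - xi) / d
        A[row, col[j] + 1] =  (yj - yi) / d
    return A, stations

coords = {"P1": (0.0, 0.0), "P2": (1000.0, 0.0), "P3": (500.0, 800.0)}
obs = [("P1", "P2"), ("P2", "P3"), ("P3", "P1")]
A, order = design_matrix(coords, obs)
print(order)
print(A)
```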

Keywords: design, differential, geodetic, matrix, network, station

Procedia PDF Downloads 314
9993 Aerodynamic Coefficients Prediction from Minimum Computation Combinations Using OpenVSP Software

Authors: Marine Segui, Ruxandra Mihaela Botez

Abstract:

OpenVSP is an aerodynamic solver developed by the National Aeronautics and Space Administration (NASA) that allows a reliable model of an aircraft to be built. The software performs an aerodynamic simulation as a function of the aircraft's angle of attack with respect to the incoming airstream and of its speed. A reliable aerodynamic model of the Cessna Citation X was designed, but it required a large amount of computation time. As a consequence, a prediction method was established that allows lift and drag coefficients to be predicted for all Mach numbers and all angles of attack, excluding stall conditions, from a computation at only three angles of attack and a single Mach number. Aerodynamic coefficients given by the prediction method for the Cessna Citation X model were finally compared with the aerodynamic coefficients obtained from a complete OpenVSP study.

Keywords: aerodynamic, coefficient, cruise, improving, longitudinal, openVSP, solver, time

Procedia PDF Downloads 202
9992 Comparison of an Anthropomorphic PRESAGE® Dosimeter and Radiochromic Film with a Commercial Radiation Treatment Planning System for Breast IMRT: A Feasibility Study

Authors: Khalid Iqbal

Abstract:

This work presents a comparison of anthropomorphic PRESAGE® dosimeter and radiochromic film measurements with a commercial treatment planning system, to determine the feasibility of PRESAGE® for 3D dosimetry in breast IMRT. An anthropomorphic PRESAGE® phantom was created in the shape of a breast phantom. A five-field IMRT plan was generated with a commercially available treatment planning system and delivered to the PRESAGE® phantom. The anthropomorphic PRESAGE® was scanned with the Duke midsized optical CT scanner (DMOS-RPC), and the optical density distribution was converted to dose. Comparisons were performed between the dose distribution calculated with the Pinnacle3 treatment planning system, PRESAGE®, and EBT2 film measurements, using DVHs, gamma maps, and line profiles to evaluate the agreement. Gamma map comparisons showed that Pinnacle3 agreed with PRESAGE®, as more than 95% of comparison points for the PTV passed a ±3%/±3 mm criterion when the outer 8 mm of phantom data were excluded. Edge artifacts were observed in the optical CT reconstruction from the surface to approximately 8 mm depth. These artifacts resulted in dose differences between Pinnacle3 and PRESAGE® of up to 5% between the surface and a depth of 8 mm, and they decreased with increasing depth in the phantom. Line profile comparisons between all three independent measurements yielded a maximum difference of 2% within the central 80% of the field width. For the breast IMRT plan studied, the Pinnacle3 calculations agreed with the PRESAGE® measurements to within the ±3%/±3 mm gamma criterion. This work demonstrates the feasibility of fashioning PRESAGE® into an anthropomorphic shape and establishes the accuracy of Pinnacle3 for breast IMRT. Furthermore, these data lay the groundwork for future investigations into 3D dosimetry with more complex anthropomorphic phantoms.
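
A simplified global gamma computation of the kind used for the ±3%/±3 mm comparisons (a brute-force 2D sketch; clinical tools add sub-pixel interpolation and low-dose thresholds, and the synthetic dose planes below are placeholders):

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """Global 2D gamma pass rate (e.g. 3%/3 mm) between equal-shape dose planes.

    Brute-force search limited to a window of twice the DTA; the dose tolerance
    is taken as a fraction of the reference maximum (global normalization).
    Simplifications: no sub-pixel interpolation, wrap-around at the edges.
    """
    norm = dose_tol * ref.max()
    reach = int(np.ceil(2 * dta_mm / spacing_mm))
    gamma2 = np.full(ref.shape, np.inf)
    for dy in range(-reach, reach + 1):
        for dx in range(-reach, reach + 1):
            dist2 = (dy ** 2 + dx ** 2) * spacing_mm ** 2
            shifted = np.roll(np.roll(ev, dy, axis=0), dx, axis=1)
            cand = (shifted - ref) ** 2 / norm ** 2 + dist2 / dta_mm ** 2
            gamma2 = np.minimum(gamma2, cand)
    return np.count_nonzero(np.sqrt(gamma2) <= 1.0) / gamma2.size

# Synthetic check: a Gaussian dose plane vs. a 2% hotter copy shifted by 1 mm
y, x = np.mgrid[-40:40, -40:40].astype(float)
ref = 200.0 * np.exp(-(x ** 2 + y ** 2) / 400.0)
ev = 1.02 * np.roll(ref, 1, axis=1)
print(f"pass rate: {gamma_pass_rate(ref, ev, spacing_mm=1.0):.3f}")
```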

Keywords: 3D dosimetry, PRESAGE®, IMRT, QA, EBT2 GAFCHROMIC film

Procedia PDF Downloads 376
9991 Study on Disaster Prevention Plan for an Electronic Industry in Thailand

Authors: S. Pullteap, M. Pathomsuriyaporn

Abstract:

In this article, a study of employees' opinions on the factors affecting the flood prevention and corrective action plan of an electronics plant, Sharp Manufacturing (Thailand) Co., Ltd., is presented. Survey data from 175 workers and supervisors were selected for analysis. The results show that employees place high emphasis (77.8%) on the need for a subsidy at the time of a disaster, while the plan focusing on flood protection of rehabilitation equipment is rated at an intermediate level (79.8%). Hypothesis testing found that education level affects the needs factor at the time of a flood disaster. Moreover, most respondents give priority to the flood disaster risk management factor. Consequently, we found that the flood prevention plan is rated at a high level, especially regarding information monitoring, which reaches 93.4% for the supervisor item. The respondents largely expect the flood to have an impact on the industry, up to 80%, so the focus on flood management plans is considerable.

Keywords: flood prevention plan, flood event, electronic industrial plant, disaster, risk management

Procedia PDF Downloads 280
9990 Developing Medium Term Maintenance Plan For Road Networks

Authors: Helen S. Ghali, Haidy S. Ghali, Salma Ibrahim, Ossama Hosny, Hatem S. Elbehairy

Abstract:

Infrastructure systems are essential assets in any community; accordingly, authorities aim to maximize their life span while minimizing the life cycle cost. This requires studying the asset condition throughout its operation and forming a cost-efficient maintenance strategy. The objective of this study is to develop a highway management system that provides medium-term maintenance plans with the minimum life cycle cost subject to budget constraints. The model is applied to data collected for a highway network in India, with the aim of producing a 5-year maintenance strategy from 2019 to 2023. The main element considered is the surface course, either rigid or flexible pavement. The model outputs a 5-year maintenance plan for each segment given the budget constraint, maximizing the resulting pavement condition rating while minimizing its life cycle cost.
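
A toy sketch of the budget-constrained, multi-year selection problem described above (the actual model optimizes over the Indian network data; the segment figures, deterioration rate, and greedy rule here are illustrative assumptions rather than the paper's formulation):

```python
# Hypothetical segments: (id, current condition rating 0-100, treatment cost, rating gain)
segments = [
    ("S1", 45, 120_000, 30), ("S2", 60, 80_000, 20), ("S3", 35, 150_000, 40),
    ("S4", 70, 50_000, 10), ("S5", 50, 100_000, 25), ("S6", 40, 130_000, 35),
]
annual_budget = 250_000
deterioration_per_year = 3          # rating points lost each year if untreated

condition = {sid: rating for sid, rating, _, _ in segments}
plan = {}
for year in range(2019, 2024):
    # Greedy: treat the segments giving the most rating gain per unit cost this year
    candidates = sorted(segments, key=lambda s: s[3] / s[2], reverse=True)
    budget, treated = annual_budget, []
    for sid, _, cost, gain in candidates:
        if cost <= budget and condition[sid] < 80:      # only treat poorer segments
            budget -= cost
            condition[sid] = min(100, condition[sid] + gain)
            treated.append(sid)
    for sid in condition:
        if sid not in treated:
            condition[sid] = max(0, condition[sid] - deterioration_per_year)
    plan[year] = treated

for year, treated in plan.items():
    print(year, treated)
print("final ratings:", condition)
```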

Keywords: infrastructure, asset management, optimization, maintenance plan

Procedia PDF Downloads 185
9989 Motion Estimator Architecture with Optimized Number of Processing Elements for High Efficiency Video Coding

Authors: Seongsoo Lee

Abstract:

Motion estimation occupies the heaviest computation in HEVC (high efficiency video coding). Many fast algorithms, such as TZS (test zone search), have been proposed to reduce the computation. Still, the huge computation of motion estimation is a critical issue in the implementation of an HEVC video codec. In this paper, a motion estimator architecture with an optimized number of PEs (processing elements) is presented that exploits early termination. It also reduces hardware size by exploiting parallel processing. The presented motion estimator architecture has 8 PEs and can efficiently perform TZS with very high PE utilization.

Keywords: motion estimation, test zone search, high efficiency video coding, processing element, optimization

Procedia PDF Downloads 333
9988 Parallel Evaluation of Sommerfeld Integrals for Multilayer Dyadic Green's Function

Authors: Duygu Kan, Mehmet Cayoren

Abstract:

Sommerfeld integrals (SIs) are commonly encountered in electromagnetics problems involving the analysis of antennas and scatterers embedded in planar multilayered media. Generally speaking, an analytical solution of SIs is unavailable, and it is well known that numerical evaluation of SIs is very time-consuming and computationally expensive due to the highly oscillating and slowly decaying nature of the integrands. Therefore, fast computation of SIs is of paramount importance. In this paper, a parallel code has been developed to speed up the computation of SIs in the framework of the calculation of the dyadic Green's function in multilayered media. An OpenMP shared-memory approach is used to parallelize the SI algorithm and results in significant time savings. Moreover, accelerating the computation of the dyadic Green's function is discussed based on the parallel SI algorithm developed.
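
A minimal Python analogue of the approach: a semi-infinite, oscillatory Bessel-kernel integral is split into sub-intervals that are integrated in parallel and summed (the real implementation uses OpenMP in compiled code; the test kernel below is chosen because it has the closed form ∫0^∞ J0(kρ) e^(-kz) dk = 1/sqrt(ρ² + z²), which is convenient for checking):

```python
import numpy as np
from multiprocessing import Pool
from scipy.integrate import quad
from scipy.special import j0

RHO, Z = 2.0, 0.5

def integrand(k):
    """Sommerfeld-type integrand: oscillatory Bessel kernel times exponential decay."""
    return j0(k * RHO) * np.exp(-k * Z)

def integrate_segment(bounds):
    a, b = bounds
    val, _ = quad(integrand, a, b, limit=200)
    return val

if __name__ == "__main__":
    # Split [0, k_max] into segments roughly one Bessel half-period long, sum in parallel
    k_max = 200.0                              # exp(-k*Z) makes the tail negligible here
    edges = np.arange(0.0, k_max, np.pi / RHO)
    segments = list(zip(edges[:-1], edges[1:]))
    with Pool(4) as pool:
        total = sum(pool.map(integrate_segment, segments))
    exact = 1.0 / np.hypot(RHO, Z)
    print(f"parallel sum = {total:.6f}, closed form = {exact:.6f}")
```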

Keywords: Sommerfeld-integrals, multilayer dyadic Green’s function, OpenMP, shared memory parallel programming

Procedia PDF Downloads 213
9987 Multiparametric Optimization of Water Treatment Process for Thermal Power Plants

Authors: Balgaisha Mukanova, Natalya Glazyrina, Sergey Glazyrin

Abstract:

This article considers the problem of optimizing the water treatment process for thermal power plants, which is multiparametric in nature. To optimize the process, namely to reduce the amount of wastewater, a new technology was developed for reusing such water, and a mathematical model of this wastewater reuse technology was constructed. The optimization parameters were determined. The model consists of a material balance equation, an equation describing the kinetics of ion exchange in the non-equilibrium case, and an equation for the ion exchange isotherm; the material balance equation includes a nonlinear term that depends on the ion exchange kinetics. The direct problem of calculating the impurity concentration at the outlet of the water treatment plant was solved numerically, using an implicit point-to-point finite difference scheme. The inverse problem was formulated as the determination of the parameters of the mathematical model of a water treatment plant operating under non-equilibrium conditions, and it was solved. Based on the calculation results, the start time of the filter regeneration process was determined, as well as the duration of the regeneration process and the amount of regeneration and wash water. Multiparametric optimization of the water treatment process for thermal power plants decreased the amount of wastewater by 15%.

Keywords: direct problem, multiparametric optimization, optimization parameters, water treatment

Procedia PDF Downloads 353
9986 Exploiting Non-Uniform Utility of Computing: A Case Study

Authors: Arnab Sarkar, Michael Huang, Chuang Ren, Jun Li

Abstract:

The increasing importance of computing in modern society has brought substantial growth in the demand for more computational power. In some problem domains, such as scientific simulations, the available computational power still limits what can practically be explored in computation. For many types of code, there is non-uniformity in the utility of computation; that is, not every piece of computation contributes equally to the quality of the result. If this non-uniformity is understood well and exploited effectively, the available computing power can be used much more effectively. In this paper, we discuss a case study of exploring such non-uniformity in a particle-in-cell simulation platform. We find both that significant non-uniformity exists and that it is generally straightforward to exploit. We show the potential for an order-of-magnitude effective performance gain while keeping comparable output quality. We also discuss some challenges in both the practical application of the idea and the evaluation of its impact.

Keywords: approximate computing, landau damping, non uniform utility computing, particle-in-cell

Procedia PDF Downloads 228
9985 Method for Targeting Small Volume in Rat Brain by Gamma Knife and Dosimetric Control: Towards a Standardization

Authors: J. Constanzo, B. Paquette, G. Charest, L. Masson-Côté, M. Guillot

Abstract:

Targeted and whole-brain irradiation in humans can result in significant side effects that decrease the patient's quality of life. To adequately investigate structural and functional alterations after stereotactic radiosurgery, preclinical studies are needed, and the first step is to establish a robust, standardized method of targeted irradiation of small regions of the rat brain. Eleven euthanized male Fischer rats were imaged in a stereotactic bed by computed tomography (CT) to estimate positioning variations with respect to the bregma skull reference point. Using a rat brain atlas and the stereotactic bregma coordinates assessed from the CT images, various regions of the brain were delineated and a treatment plan was generated. A dose of 37 Gy at the 30% isodose, which corresponds to 100 Gy in 100% of the target volume (X = 98.1; Y = 109.1; Z = 100.0), was set in Leksell GammaPlan using sectors 4, 5, 7, and 8 of the Gamma Knife unit with the 4-mm diameter collimators. The effects of the positioning accuracy of the rat brain on the dose deposition were simulated with GammaPlan and validated with dosimetric measurements. Our results showed that 90% of the target volume received 110 ± 4.7 Gy, and the maximum deposited dose was 124 ± 0.6 Gy, which corresponds to an excellent relative standard deviation of 0.5%. The dose deposition calculated with GammaPlan was validated with dosimetric films, resulting in dose-profile agreement within 2% in both the X- and Z-axes. Our results demonstrate the feasibility of standardizing the irradiation procedure for a small volume in the rat brain using a Gamma Knife.

Keywords: brain irradiation, dosimetry, gamma knife, small-animal irradiation, stereotactic radiosurgery (SRS)

Procedia PDF Downloads 380
9984 A Parallel Computation Based on GPU Programming for a 3D Compressible Fluid Flow Simulation

Authors: Sugeng Rianto, P.W. Arinto Yudi, Soemarno Muhammad Nurhuda

Abstract:

The computation of 3D compressible fluid flow for a virtual environment with haptic interaction is a non-trivial issue, especially in achieving good performance and a balance between visualization, tactile feedback interaction, and computation. In this paper, we describe our computation methods based on parallel programming on a GPU. The 3D fluid flow solvers have been developed for smoke dispersion simulation by combining cubic interpolated propagation (CIP) based fluid flow solvers with the parallelism and programmability of the GPU. The fluid flow solver is organised in a GPU-CPU message passing scheme to allow rapid development of haptic feedback modes for fluid dynamic data. A fast solution is obtained by applying the CIP fluid flow solvers, with which multiphase fluid flow equations can be solved simultaneously. To accelerate the computation further, the Navier-Stokes equations (NSEs) are packed into texel channels, where the computation is performed on pixels that can be considered a grid of cells; therefore, despite the complexity of the obstacle geometry, multiple vertices and pixels can be processed simultaneously in parallel. The data are also shared in global memory for the CPU to control the haptics in providing kinaesthetic interaction and feeling. The results show that GPU-based parallel computation provides effective simulation of a compressible fluid flow model for real-time interaction in 3D computer graphics on a PC platform. This report has shown the feasibility of a new approach to solving the compressible fluid flow equations on the GPU. The experimental tests proved that compressible fluid flowing over various model obstacles with haptic interaction can be simulated effectively and efficiently at a reasonable frame rate with realistic visualization. These results confirm that good performance and a balance between visualization, tactile feedback interaction, and computation can be achieved.

Keywords: CIP, compressible fluid, GPU programming, parallel computation, real-time visualisation

Procedia PDF Downloads 403
9983 The Standardization of Colorado Schools to Offer Opportunity Through Equal Education

Authors: Heather Caldwell

Abstract:

In 1915, state superintendent Mary C. C. Bradford initiated a state standardization plan in order to improve the quality of schools and the educational experience for all children in Colorado. This plan would change the schools, improving them and offering more opportunities for children, teachers, and the community. In a state where geography limited the opportunity to make all schools equal and challenged state school leaders to improve education throughout the state, the leadership prevailed and worked together with local schools and school leaders to make drastic improvements in the curriculum. This paper discusses the plan and highlights key contributions to this standardization effort that improved opportunities for all students in the state of Colorado through these educational initiatives.

Keywords: history of education, standardization, curriculum, state superintendent, women in education

Procedia PDF Downloads 15
9982 Future Housing Energy Efficiency Associated with the Auckland Unitary Plan

Authors: Bin Su

Abstract:

The draft Auckland Unitary Plan outlines the future land use for new housing and businesses as Auckland's population grows over the next thirty years. According to the Auckland Unitary Plan, the population of Auckland is projected to increase by one million over the next 30 years, and up to 70% of the total new dwellings will occur within the existing urban area. Intensification will not only increase the number of medium- or higher-density houses, such as terraced houses and apartment buildings, within the existing urban area but will also change the mean housing design data, which can affect building thermal performance under the local climate. Based on the mean energy consumption and building design data of a number of Auckland sample houses, and on the relationships between them, this study estimates the future mean housing energy consumption associated with the change in mean housing design data and evaluates housing energy efficiency under the Auckland Unitary Plan.

Keywords: Auckland Unitary Plan, building thermal design, housing design, housing energy efficiency

Procedia PDF Downloads 352
9981 Endometriosis: The Optimal Treatment of Recurrent Endometrioma in Infertile Patients

Authors: Smita Lakhotia, C. Kew, S. H. M. Siraj, B. Chern

Abstract:

Up to 50% of those with endometriosis may suffer from infertility, due to distorted pelvic anatomy that impairs oocyte release or inhibits ovum pickup and transport, altered peritoneal function, endocrine and anovulatory disorders including LUF, impaired implantation, progesterone resistance, or decreased levels of cellular immunity. The dilemma continues as to whether surgery or IVF is the optimal management for recurrent endometriomas. The core question is whether surgery adds anything of value for infertile women with recurrent endometriosis. Complete and detailed information on the risks and benefits of the treatment alternatives must be offered to patients, giving a realistic estimate of the chances of success of repeated surgery and of multiple IVF cycles, in order to allow unbiased choices between the different possible options. An individualized treatment plan should be developed, taking into account patient age, duration of infertility, previous pregnancies, specific clinical conditions, and the patient's wishes.

Keywords: recurrent endometriosis, infertility, oocyte release, pregnancy

Procedia PDF Downloads 213
9980 Autonomic Recovery Plan with Server Virtualization

Authors: S. Hameed, S. Anwer, M. Saad, M. Saady

Abstract:

For autonomic recovery with server virtualization, a cogent plan that includes recovery techniques and backups with virtualized servers can be developed instead of assigning an idle server to backup operations. In addition to hardware cost reduction and data center trail, the disaster recovery plan can ensure system uptime and meet objectives for high availability, recovery time, recovery point, server provisioning, and quality of service. This autonomic solution would also support disaster management, testing, and development of the recovery site. In this research, a workflow plan is proposed for supporting disaster recovery with virtualization, providing virtual monitoring, requirements engineering, solution decision making, quality testing, and disaster management. This recovery model would make disaster recovery easier, faster, and less error prone.

Keywords: autonomous intelligence, disaster recovery, cloud computing, server virtualization

Procedia PDF Downloads 135
9979 A CORDIC Based Design Technique for Efficient Computation of DCT

Authors: Deboraj Muchahary, Amlan Deep Borah, Abir J. Mondal, Alak Majumder

Abstract:

A discrete cosine transform (DCT) is described, and a technique to compute it using the fast Fourier transform (FFT) is developed. In this work, the DCT of a finite-length sequence is obtained by incorporating the CORDIC methodology into a radix-2 FFT algorithm. The proposed methodology is simple to comprehend and maintains a regular structure, thereby reducing computational complexity. DCTs are used extensively in digital processing for pattern recognition, so efficient computation of the DCT while maintaining a transparent design flow is highly desirable.
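
One standard way to compute a DCT-II with a single same-length FFT is Makhoul's reordering; a sketch is shown below. The complex twiddle multiplications are the rotations that CORDIC iterations would replace in hardware, so this illustrates the general FFT-based route rather than the paper's exact radix-2/CORDIC design:

```python
import numpy as np

def dct2_via_fft(x):
    """DCT-II of a real sequence computed with a single length-N FFT (Makhoul's
    reordering); the post-FFT complex factors are the candidate CORDIC rotations."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    v = np.concatenate([x[0::2], x[1::2][::-1]])     # even samples, then reversed odd samples
    V = np.fft.fft(v)
    k = np.arange(n)
    return 2.0 * np.real(np.exp(-1j * np.pi * k / (2 * n)) * V)

def dct2_direct(x):
    """Direct O(N^2) definition, used only to check the FFT-based version."""
    n = len(x)
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    return 2.0 * (np.asarray(x) * np.cos(np.pi * k * (2 * m + 1) / (2 * n))).sum(axis=1)

x = np.random.default_rng(3).random(16)
print(np.allclose(dct2_via_fft(x), dct2_direct(x)))   # True
```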

Keywords: DCT, DFT, CORDIC, FFT

Procedia PDF Downloads 444
9978 Numerical Computation of Specific Absorption Rate and Induced Current for Workers Exposed to Static Magnetic Fields of MRI Scanners

Authors: Sherine Farrag

Abstract:

The MRI scanners currently used in Cairo City have static magnetic fields (SMF) ranging from 0.25 T up to 3 T, and more than half of them operate at 1.5 T. The SMF of the magnet determines the diagnostic power of a scanner, but not the worker's exposure profile. This research paper presents an approach for the numerical computation of induced electric fields and SAR values based on an estimation of the fringe static magnetic field. The iso-gauss lines of the MR scanner were mapped, and a 7th-degree polynomial function was fitted and tested. The induced current field due to worker motion in the SMF and the SAR values for organs and tissues were then calculated. The results illustrate that the computation tool used permits quick and accurate MRI iso-gauss mapping and calculation of SAR values, which can then be used for assessing the occupational exposure profile of MRI operators.
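
A sketch of the fringe-field fitting step and a crude induced-field estimate for a worker moving through the gradient (the distance/field pairs, walking speed, and tissue loop radius below are invented placeholders, not the measured Cairo scanner data):

```python
import numpy as np

# Hypothetical fringe-field measurements: distance from the bore (m) vs. field (T)
distance = np.array([0.0, 0.25, 0.5, 0.75, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0])
field_T  = np.array([1.50, 1.10, 0.70, 0.45, 0.30, 0.14, 0.07, 0.04, 0.02, 0.01])

# 7th-degree polynomial model B(d), as used for the iso-gauss mapping
coeffs = np.polyfit(distance, field_T, deg=7)
B = np.poly1d(coeffs)
dB_dd = B.deriv()

# Crude exposure estimate: a worker walking at 0.5 m/s through the field gradient
walking_speed = 0.5                      # m/s (assumed)
d = 0.75                                 # position of interest (m)
dB_dt = abs(dB_dd(d)) * walking_speed    # T/s experienced by moving tissue
# Rough induced E-field over a circular tissue loop of radius r: E ~ (r/2) * dB/dt
r_tissue = 0.1                           # m, order of a torso cross-section radius (assumed)
E_induced = 0.5 * r_tissue * dB_dt
print(f"B({d} m) = {B(d):.3f} T, dB/dt ~ {dB_dt:.3f} T/s, E ~ {E_induced * 1000:.1f} mV/m")
```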

Keywords: MRI occupational exposure, MRI safety, induced current density, specific absorption rate, static magnetic fields

Procedia PDF Downloads 403