Search results for: real time kernel preemption
17781 Assessing Project Performance through Work Sampling and Earned Value Analysis
Authors: Shobha Ramalingam
Abstract:
The majority of infrastructure projects are affected by time overruns, resulting in project delays and subsequently cost overruns. A time overrun may vary from a few months to as much as five or more years, placing project viability at risk. One of the probable reasons noted in the literature for this outcome is poor productivity. Researchers contend that productivity in construction has only marginally increased over the years. While studies in the literature have extensively focused on time and cost parameters in projects, there are limited studies that integrate time and cost with productivity to assess project performance. To this end, a study was conducted to understand the project delay factors concerning cost, time and productivity. A case-study approach was adopted to collect rich data from a nuclear power plant project site for two months through observation, interviews and document review. The data were analyzed using three different approaches for a comprehensive understanding. First, a root-cause analysis was performed on the data using Ishikawa's fish-bone diagram technique to identify the various factors impacting the delay with respect to time. Based on it, a questionnaire was designed and circulated to the concerned executives, including project engineers and contractors, to determine the frequency of occurrence of the delays, which was then compiled and presented to the management for a possible mitigating solution. Second, a productivity analysis was performed on selected activities, including rebar bending and concreting, through a time-motion study to analyze production performance. Third, data on the cost of construction for three years allowed analyzing the cost performance using the earned value management technique. All three techniques allowed us to systematically and comprehensively identify the key factors that deter project performance and cause productivity loss in the construction of the nuclear power plant project.
The findings showed that improper planning and coordination between multiple trades, concurrent operations, improper workforce and material management, and fatigue due to overtime were some of the key factors that led to delays and poor productivity. The findings are expected to act as a stepping stone for further research and have implications for practitioners.
Keywords: earned value analysis, time performance, project costs, project delays, construction productivity
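The earned value management calculations the abstract refers to reduce to a few standard indices. The sketch below is illustrative only; the monetary figures are made up and are not data from the case study.

```python
def earned_value_metrics(pv, ev, ac):
    """Compute basic earned value management indices.

    pv: planned value, ev: earned value, ac: actual cost (same currency units).
    """
    cv = ev - ac   # cost variance (negative -> over budget)
    sv = ev - pv   # schedule variance (negative -> behind schedule)
    cpi = ev / ac  # cost performance index (<1 -> cost overrun)
    spi = ev / pv  # schedule performance index (<1 -> delay)
    return {"CV": cv, "SV": sv, "CPI": cpi, "SPI": spi}

# Illustrative figures only (not from the nuclear power plant study):
m = earned_value_metrics(pv=120.0, ev=100.0, ac=125.0)
```

A CPI below 1 signals a cost overrun and an SPI below 1 signals a schedule delay, which is how earned value analysis flags the time and cost overruns discussed above.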
Procedia PDF Downloads 97
17780 Importance of Road Infrastructure on the Lives of People in Afghanistan
Authors: Mursal Ibrahim Zada
Abstract:
Since 2001, the new Government of Afghanistan has made the improvement of transportation in rural areas one of the key issues for the development of the country. Since then, about 17,000 km of rural roads have been planned for construction in the entire country. This thesis assesses the impact of rural road improvement on the development of rural communities and housing facilities. Specifically, this study aims to show that improved roads lead to an improvement in the community, which in turn has a positive effect on the lives of rural people. To obtain this goal, a questionnaire survey was conducted in March 2015 among the residents of four different districts of Kabul province, Afghanistan, where road projects had been constructed in recent years. The collected data were analyzed using regression analysis considering different factors such as land price, waiting time at the station, travel time to the city, number of employed family members and so on. Three models are developed to demonstrate the relationship between different factors before and after the improvement of rural transportation. The results showed significant positive changes in land prices and housing facilities, travel time to the city, waiting time at the station, number of employed family members, fare per trip to the city, and number of trips to the city per month after the pavement of the road. The results indicated that the improvement of transportation has a significant impact on the improvement of the community in different respects, especially the prices of land and housing facilities and the travel time to the city.
Keywords: accessibility, Afghanistan, housing facility, rural area, land price
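A regression of the kind described, relating an outcome such as land price to a road-improvement indicator and travel time, can be sketched with ordinary least squares. The observations and variable choices below are hypothetical, not the survey data.

```python
import numpy as np

# Hypothetical rows: [intercept, improved_road (0/1), travel_time_minutes]
X = np.array([
    [1, 0, 90], [1, 0, 80], [1, 0, 85],
    [1, 1, 40], [1, 1, 35], [1, 1, 45],
], dtype=float)
y = np.array([10.0, 11.0, 10.5, 18.0, 19.0, 17.5])  # land price, arbitrary units

# Ordinary least squares: beta = argmin ||X beta - y||^2
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ beta
```

With a binary improvement indicator in the design matrix, the fitted group means reproduce the observed group means, so a positive price effect after improvement shows up directly in the predictions.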
Procedia PDF Downloads 263
17779 Synthesis, Characterization, and Application of Novel Trihexyltetradecyl Phosphonium Chloride for Extractive Desulfurization of Liquid Fuel
Authors: Swapnil A. Dharaskar, Kailas L. Wasewar, Mahesh N. Varma, Diwakar Z. Shende
Abstract:
Stringent environmental regulations in many countries for the production of ultra-low-sulfur petroleum fractions, intended to reduce sulfur emissions, have resulted in enormous interest in this area among the scientific community. The requirement of zero sulfur emissions enhances the prominence of more advanced desulfurization techniques. Desulfurization by extraction is a promising approach having several advantages over conventional hydrodesulfurization. The present work deals with various new approaches for desulfurization of ultra-clean gasoline, diesel and other liquid fuels by extraction with ionic liquids. In the present paper, experimental data on extractive desulfurization of liquid fuel using trihexyltetradecyl phosphonium chloride are presented. The FTIR, 1H-NMR, and 13C-NMR spectra are discussed for the molecular confirmation of the synthesized ionic liquid. Further, conductivity, solubility, and viscosity analyses of the ionic liquid were carried out. The effects of reaction time, reaction temperature, sulfur compounds, ultrasonication, and recycling of the ionic liquid without regeneration on the removal of dibenzothiophene from liquid fuel were also investigated. In the extractive desulfurization process, the removal of dibenzothiophene from n-dodecane was 84.5% for a mass ratio of 1:1 in 30 min at 30 °C under mild reaction conditions. The phosphonium ionic liquid could be reused five times without a significant decrease in activity. Also, for the desulfurization of real fuels, multistage extraction was examined. The data and results provided in the present paper explore significant insights into phosphonium-based ionic liquids as novel extractants for extractive desulfurization of liquid fuels.
Keywords: ionic liquid, PPIL, desulfurization, liquid fuel, extraction
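Extraction performance of the kind reported (84.5% dibenzothiophene removal) is a simple before/after ratio. The initial sulfur level below is an assumed figure chosen only to illustrate the arithmetic.

```python
def sulfur_removal_percent(initial_ppm, final_ppm):
    """Percent of sulfur removed from the model fuel by extraction."""
    return 100.0 * (initial_ppm - final_ppm) / initial_ppm

# Assumed starting level of 500 ppm S; dropping to 77.5 ppm gives the
# 84.5% removal figure quoted in the abstract:
r = sulfur_removal_percent(500.0, 77.5)
```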
Procedia PDF Downloads 609
17778 Evaluating Construction Project Outcomes: Synergy Through the Evolution of Digital Innovation and Strategic Management
Authors: Mirindi Derrick, Mirindi Frederic, Oluwakemi Oshineye
Abstract:
The ongoing high rate of construction project failures worldwide is often blamed on the difficulties of managing stakeholders. This highlights the crucial role of strategic management (SM) in achieving project success. This study investigates how integrating digital tools into the SM framework can effectively address stakeholder-related challenges. This work specifically focuses on the impact of evolving digital tools, such as Project Management Software (PMS) (e.g., Basecamp and Wrike), Building Information Modeling (BIM) (e.g., Tekla BIMsight and Autodesk Navisworks), Virtual and Augmented Reality (VR/AR) (e.g., Microsoft HoloLens), drones and remote monitoring, and social media and web-based platforms, in improving stakeholder engagement and project outcomes. Through existing literature and examples of failed projects, the study highlights how the evolution of digital tools serves as a facilitator within the strategic management process. These tools offer benefits such as real-time data access, enhanced visualization, and more efficient workflows to mitigate stakeholder challenges in construction projects. The findings indicate that integrating digital tools with SM principles effectively addresses stakeholder challenges, resulting in improved project outcomes and stakeholder satisfaction. The research advocates for a combined approach that embraces both strategic management and digital innovation to navigate the complex stakeholder landscape in construction projects.
Keywords: strategic management, digital tools, virtual and augmented reality, stakeholder management, building information modeling, project management software
Procedia PDF Downloads 50
17777 New Machine Learning Optimization Approach Based on Input Variables Disposition Applied for Time Series Prediction
Authors: Hervice Roméo Fogno Fotsoa, Germaine Djuidje Kenmoe, Claude Vidal Aloyem Kazé
Abstract:
One of the main applications of machine learning is the prediction of time series. But a more accurate prediction requires a more optimal machine learning model. Several optimization techniques have been developed, but without considering the disposition of the system's input variables. Thus, this work aims to present a new machine learning architecture optimization technique based on the optimal disposition of input variables. The validations are done on the prediction of wind time series, using data collected in Cameroon. The number of possible dispositions with four input variables is twenty-four. Each of the dispositions is used to perform the prediction, with the main criteria being the training and prediction performances. The results obtained from a static architecture and a dynamic architecture of neural networks have shown that these performances are a function of the input variables' disposition, and in a way that differs between the architectures. This analysis revealed that it is necessary to take the input variables' disposition into account when developing a more optimal neural network model. Thus, a new neural network training algorithm is proposed by introducing the search for the optimal input variables disposition into the traditional back-propagation algorithm. The results of applying this new optimization approach to the two single neural network architectures are compared step by step with the previously obtained results. Moreover, the proposed approach is validated in a collaborative optimization method with a single-objective optimization technique, i.e., genetic algorithm back-propagation neural networks. From these comparisons, it is concluded that each proposed model outperforms its traditional counterpart in terms of training and prediction performance on time series. Thus, the proposed optimization approach can be useful in improving the accuracy of time series prediction based on machine learning.
Keywords: input variable disposition, machine learning, optimization, performance, time series prediction
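The core of the approach, enumerating all orderings (dispositions) of the input variables and keeping the best-performing one, can be sketched as follows. The scoring function here is a toy stand-in for training a network and measuring its prediction performance.

```python
import itertools

def best_disposition(variables, score):
    """Try every ordering (disposition) of the input variables and keep the
    highest-scoring one; with four variables there are 4! = 24 dispositions."""
    best, best_s = None, float("-inf")
    for perm in itertools.permutations(variables):
        s = score(perm)
        if s > best_s:
            best, best_s = perm, s
    return best, best_s

# Toy stand-in for "train the network and return validation performance":
target = ("u", "v", "w", "x")
score = lambda p: sum(a == b for a, b in zip(p, target))
disp, s = best_disposition(["x", "w", "v", "u"], score)
```

In the proposed algorithm this search is folded into back-propagation training rather than run as a separate outer loop, but the combinatorial structure is the same.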
Procedia PDF Downloads 109
17776 Automated Natural Hazard Zonation System with Internet-SMS Warning: Distributed GIS for Sustainable Societies Creating Schema and Interface for Mapping and Communication
Authors: Devanjan Bhattacharya, Jitka Komarkova
Abstract:
The research describes the implementation of a novel, stand-alone system for dynamic hazard warning. The system uses existing infrastructure already in place, such as mobile networks and a laptop/PC, together with a small software installation. The geospatial datasets are maps of a region, which are likewise frugal. Hence there is no need for investment, and the system reaches everyone with a mobile phone. A novel architecture for hazard assessment and warning is introduced, in which major ICT technologies are interfaced to give a unique WebGIS-based, dynamic, real-time geohazard warning communication system, integrating WebGIS with telecommunication technology in a way not done before. Existing technologies are interfaced in a novel architectural design to address a neglected domain through dynamically updatable WebGIS-based warning communication. The work presents this new architecture and its novelty in addressing hazard warning techniques in a sustainable and user-friendly manner. The coupling of hazard zonation and hazard warning procedures into a single system has been shown. A generalized architecture for deciphering a range of geohazards has been developed. Hence the developmental work presented here can be summarized as: the development of an internet-SMS based automated geohazard warning communication system; integrating a warning communication system with a hazard evaluation system; interfacing different open-source technologies in the design and development of a warning system; modularization of different technologies towards the development of a warning communication system; and automated data creation, transformation and dissemination over different interfaces. The architecture of the developed warning system has been functionally automated and generalized enough that it can be used for any hazard, and the setup requirement has been kept to a minimum.
Keywords: geospatial, web-based GIS, geohazard, warning system
Procedia PDF Downloads 408
17775 Predictive Analytics in Oil and Gas Industry
Authors: Suchitra Chnadrashekhar
Abstract:
Earlier regarded as a support function in an organization, information technology has now become a critical utility for managing daily operations. Organizations are processing huge amounts of data that were unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. The presence of IT has been a lever for the Oil & Gas industry to store, manage and process data in the most efficient way possible, thus deriving economic value in day-to-day operations. Proper synchronization between the operational data system and the information technology system is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity and security, and by increasing their utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic or systems approach towards asset optimization and thus have the functional information at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of this sector. The paper is also supported by real-time data and an evaluation of the data for a given oil production asset using an application tool, SAS. The reason for using SAS as the application for our analysis is that SAS provides an analytics-based framework to improve uptimes, performance and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions.
With state-of-the-art analytics and reporting, we can predict maintenance problems before they happen and determine root causes in order to update processes for future prevention.
Keywords: hydrocarbon, information technology, SAS, predictive analytics
Procedia PDF Downloads 360
17774 Alternating Expectation-Maximization Algorithm for a Bilinear Model in Isoform Quantification from RNA-Seq Data
Authors: Wenjiang Deng, Tian Mou, Yudi Pawitan, Trung Nghia Vu
Abstract:
Estimation of isoform-level gene expression from RNA-seq data depends on simplifying assumptions, such as uniform read distribution, that are easily violated in real data. Such violations typically lead to biased estimates. Most existing methods provide bias correction steps based on biological considerations, such as GC content, applied to single samples separately. The main problem is that not all biases are known. For example, new technologies such as single-cell RNA-seq (scRNA-seq) may introduce new sources of bias not seen in bulk-cell data. This study introduces a method called XAEM based on a more flexible and robust statistical model. Existing methods are essentially based on a linear model Xβ, where the design matrix X is known and derived from the simplifying assumptions. In contrast, XAEM considers Xβ as a bilinear model with both X and β unknown. Joint estimation of X and β is made possible by simultaneous analysis of multi-sample RNA-seq data. Compared to existing methods, XAEM automatically performs empirical correction of potentially unknown biases. XAEM implements an alternating expectation-maximization (AEM) algorithm, alternating between estimation of X and β. For speed, XAEM utilizes quasi-mapping for read alignment, leading to a fast algorithm. Overall, XAEM performs favorably compared to other recent advanced methods. For simulated datasets, XAEM obtains higher accuracy for multiple-isoform genes, particularly for paralogs. In a differential-expression analysis of a real scRNA-seq dataset, XAEM achieves substantially greater rediscovery rates in an independent validation set.
Keywords: alternating EM algorithm, bias correction, bilinear model, gene expression, RNA-seq
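The idea of treating Xβ as bilinear, with both factors unknown and estimated by alternation across samples, can be illustrated with a plain alternating least-squares loop. This is a simplified stand-in for XAEM's alternating EM; the real method uses a count-based noise model and biological constraints that are omitted here.

```python
import numpy as np

def alternating_bilinear_fit(Y, k, iters=50, seed=0):
    """Fit Y (features x samples) ~ X @ B with both X (n x k) and
    B (k x samples) unknown, by alternating least squares."""
    rng = np.random.default_rng(seed)
    n, _ = Y.shape
    X = rng.random((n, k))  # random start for the unknown "design" factor
    for _ in range(iters):
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)       # update B, X fixed
        Xt, *_ = np.linalg.lstsq(B.T, Y.T, rcond=None)  # update X, B fixed
        X = Xt.T
    return X, B

# Recover an exact rank-2 structure from noiseless toy data:
true_X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_B = np.array([[2.0, 1.0, 3.0], [1.0, 2.0, 1.0]])
Y = true_X @ true_B
X, B = alternating_bilinear_fit(Y, k=2)
```

As in the abstract, pooling several samples (columns of Y) is what makes the joint estimation identifiable up to the usual factorization ambiguity.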
Procedia PDF Downloads 142
17773 A Multi-Criteria Model for Scheduling of Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained Programming
Authors: Homa Ghave, Parmis Shahmaleki
Abstract:
This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Multiple jobs are processed in batches. For each batch, all of a job or a part of it can be outsourced. The jobs have stochastic processing and lead times and deterministic due dates, and they arrive randomly. Because of the stochastic nature of the processing and lead times, chance constrained programming is used to model the problem. First, the problem is formulated as a stochastic program and then converted into a deterministic mixed integer linear program. The objectives considered in the model are to minimize the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with the multi-criteria problem. In this paper, we utilize the concept of satisfaction functions to incorporate the manager's preferences. The proposed approach is tested on instances where the random variables are normally distributed.
Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function
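Under the normality assumption tested in the paper, a chance constraint has a standard deterministic equivalent via the normal quantile, which is how the stochastic formulation becomes a deterministic MILP. The numbers below are illustrative.

```python
from statistics import NormalDist

def deterministic_equivalent(mu, sigma, alpha):
    """A chance constraint P(T <= t) >= alpha with T ~ Normal(mu, sigma)
    is equivalent to the deterministic constraint t >= mu + z_alpha * sigma,
    where z_alpha is the standard normal quantile."""
    z = NormalDist().inv_cdf(alpha)
    return mu + z * sigma

# A job with mean processing time 10 and std 2 needs a time allowance of
# roughly 13.29 units to finish on time with 95% probability:
t_min = deterministic_equivalent(10.0, 2.0, 0.95)
```

Each stochastic processing-time or lead-time constraint can be replaced this way, leaving a linear constraint in the decision variables.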
Procedia PDF Downloads 264
17772 Comparison of Iodine Density Quantification through Three Material Decomposition between Philips iQon Dual Layer Spectral CT Scanner and Siemens Somatom Force Dual Source Dual Energy CT Scanner: An in vitro Study
Authors: Jitendra Pratap, Jonathan Sivyer
Abstract:
Introduction: Dual energy/spectral CT scanning permits simultaneous acquisition of two x-ray spectral datasets and can complement radiological diagnosis by allowing tissue characterisation (e.g., uric acid vs. non-uric acid renal stones), enhancing structures (e.g., boosting the iodine signal to improve contrast resolution), and quantifying substances (e.g., iodine density). However, the latter has shown inconsistent results between the two main modes of dual energy scanning (i.e., dual source vs. dual layer). Therefore, the present study aimed to determine which technology is more accurate in quantifying iodine density. Methods: Twenty vials with known concentrations of iodine solution were made using Optiray 350 contrast media diluted in sterile water. The concentrations of iodine utilised ranged from 0.1 mg/ml to 1.0 mg/ml in 0.1 mg/ml increments and 1.5 mg/ml to 4.5 mg/ml in 0.5 mg/ml increments, followed by further concentrations at 5.0 mg/ml, 7 mg/ml, 10 mg/ml and 15 mg/ml. The vials were scanned using the dual energy scan mode on a Siemens Somatom Force at the 80kV/Sn150kV and 100kV/Sn150kV kilovoltage pairings. The same vials were scanned using the spectral scan mode on a Philips iQon at 120 kVp and 140 kVp. The images were reconstructed at 5 mm thickness and 5 mm increment using the Br40 kernel on the Siemens Force and the B filter on the Philips iQon. Post-processing of the dual energy data was performed on vendor-specific Siemens Syngo VIA (VB40), and of the spectral data on Philips Intellispace Portal (ver. 12). For each vial and scan mode, the iodine concentration was measured by placing an ROI in the coronal plane. Intraclass correlation analysis was performed on both datasets. Results: The iodine concentrations were reproduced with a high degree of accuracy by the dual layer CT scanner. Although the dual source images showed a greater degree of deviation in measured iodine density for all vials, the dataset acquired at 80kV/Sn150kV had higher accuracy.
Conclusion: Spectral CT scanning by the dual layer technique has higher accuracy for quantitative measurements of iodine density compared to the dual source technique.
Keywords: CT, iodine density, spectral, dual-energy
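The accuracy comparison can be illustrated with a simpler agreement measure (mean absolute error against the prepared concentrations) in place of the intraclass correlation used in the study. All measured values below are hypothetical.

```python
# Prepared concentrations (mg/ml) and hypothetical ROI measurements:
known = [0.5, 1.0, 2.0, 5.0, 10.0]
dual_layer = [0.52, 0.98, 2.03, 4.97, 10.1]
dual_source = [0.60, 1.10, 1.80, 5.40, 9.20]

def mean_abs_error(truth, measured):
    """Average absolute deviation of measurements from the known values."""
    return sum(abs(t - m) for t, m in zip(truth, measured)) / len(truth)

mae_layer = mean_abs_error(known, dual_layer)
mae_source = mean_abs_error(known, dual_source)
```

A lower error against the known dilution series is what "higher accuracy" means in this in vitro setting, whichever agreement statistic is used.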
Procedia PDF Downloads 120
17771 Smart Campus Digital Twin: Basic Framework - Current State, Trends and Challenges
Authors: Enido Fabiano de Ramos, Ieda Kanashiro Makiya, Francisco I. Giocondo Cesar
Abstract:
This study presents an analysis of the Digital Twin concept applied to the academic environment, focusing on the development of a Digital Twin Smart Campus Framework. Using bibliometric analysis methodologies and literature review, the research investigates the evolution and applications of the Digital Twin in educational contexts, comparing these findings with the advances of Industry 4.0. Gaps in the existing literature were identified, and the need to adapt Digital Twin principles to meet the specific demands of a smart campus was highlighted. By integrating Industry 4.0 concepts such as automation, the Internet of Things, and real-time data analytics, we propose an innovative framework for the successful implementation of the Digital Twin in academic settings. The results of this study provide valuable insights for university campus managers, allowing for a better understanding of the potential applications of the Digital Twin for operations, security, and user experience optimization. In addition, our framework offers practical guidance for transitioning from a digital campus to a digital twin smart campus, promoting innovation and efficiency in the educational environment. This work contributes to the growing literature on Digital Twins and Industry 4.0, while offering a specific and tailored approach to transforming university campuses into smart and connected spaces, as demanded by Society 5.0 trends. It is hoped that this framework will serve as a basis for future research and practical implementations in the field of higher education and educational technology.
Keywords: smart campus, digital twin, industry 4.0, education trends, society 5.0
Procedia PDF Downloads 59
17770 The Effect of Extremely Low Frequency Magnetic Field on Rats Brain
Authors: Omar Abdalla, Abdelfatah Ahmed, Ahmed Mustafa, Abdelazem Eldouma
Abstract:
The purpose of this study is to evaluate the effect of an extremely low frequency magnetic field (ELF-MF) on the Wistar rat brain. Twenty-five rats were used in this study, divided into five groups of five rats each, as follows: Group 1: the control group, which was not exposed to the field; Group 2: rats exposed to a magnetic field with an intensity of 0.6 mT (2 hours/day); Group 3: rats exposed to a magnetic field of 1.2 mT (2 hours/day); Group 4: rats exposed to a magnetic field of 1.8 mT (2 hours/day); Group 5: rats exposed to a magnetic field of 2.4 mT (2 hours/day). All groups were exposed for seven days. A maze was designed, and the average time to reach the decoy under specified conditions was calculated. We found the average times before exposure for the groups were G2 = 330 s, G3 = 172 s, G4 = 500 s and G5 = 174 s, respectively. We exposed all groups to the ELF-MF, measured the times again, and found: G2 = 465 s, G3 = 388 s, G4 = 501 s, and G5 = 442 s. It was observed that the average time increased directly with field strength. Histological samples of the frontal lobe of the brain were taken for all groups, and we found lesions, atrophy, empty vacuoles and a disordered choroid plexus in the frontal lobe. Finally, we observed the disorder of the choroid plexus in the histological results, and Alzheimer's-like symptoms increased as the magnetic field increased.
Keywords: nonionizing radiation, biophysics, magnetic field, shrinkage
Procedia PDF Downloads 545
17769 Validation of the Linear Trend Estimation Technique for Prediction of Average Water and Sewerage Charge Rate Prices in the Czech Republic
Authors: Aneta Oblouková, Eva Vítková
Abstract:
The article deals with the issue of water and sewerage charge rate prices in the Czech Republic. The research is specifically focused on the analysis of the development of the average water and sewerage charge rate prices in the Czech Republic in the years 1994-2021 and on the validation of the chosen methodology for predicting the development of these average prices. The research is based on data collection; the data were obtained from the Czech Statistical Office. The aim of the paper is to validate the relevance of the mathematical linear trend estimation technique for calculating predicted average water and sewerage charge rate prices. The real values of the average water and sewerage charge rate prices in the Czech Republic in the years 1994-2018 were obtained from the Czech Statistical Office and converted into a mathematical equation. The same type of real data was obtained from the Czech Statistical Office for the years 2019-2021. Predictions of the average water and sewerage charge rate prices in the Czech Republic for the years 2019-2021 were also calculated using the chosen method, a linear trend estimation technique. The values obtained from the Czech Statistical Office and the values calculated using the chosen methodology were subsequently compared. The research result is a validation of the chosen mathematical technique as suitable for this research.
Keywords: Czech Republic, linear trend estimation, price prediction, water and sewerage charge rate
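The linear trend estimation procedure, fitting a line to the 1994-2018 series and extrapolating it to 2019-2021 for comparison with the actual values, can be sketched as follows. The price series here is synthetic, not the Czech Statistical Office data.

```python
import numpy as np

# Synthetic average prices for illustration: a rising trend plus noise.
years = np.arange(1994, 2019)
rng = np.random.default_rng(1)
prices = 10.0 + 2.5 * (years - 1994) + rng.normal(0.0, 1.0, years.size)

# Least-squares linear trend fitted to the historical window:
slope, intercept = np.polyfit(years, prices, 1)

# Extrapolate the fitted line to the validation years 2019-2021:
predicted = slope * np.array([2019, 2020, 2021]) + intercept
```

Validation then amounts to comparing `predicted` with the actual observed values for those years.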
Procedia PDF Downloads 120
17768 Effect of Preloading on Long-Term Settlement of Closed Landfills: A Numerical Analysis
Authors: Mehrnaz Alibeikloo, Hajar Share Isfahani, Hadi Khabbaz
Abstract:
In recent years, with developing cities and increasing populations, reconstruction on closed landfill sites has become unavoidable in some regions. Long-term settlement is one of the major concerns associated with reconstruction on landfills after closure. The purpose of this research is to evaluate the effect of preloading, in various patterns of height and duration, on the long-term settlement of closed landfills. In this regard, five scenarios of surcharge from 1 to 3 m high applied for 3, 4.5 and 6 months of preloading time have been modeled using the PLAXIS 2D software. Moreover, the numerical results have been compared to those obtained from analytical methods, and good agreement has been achieved. The findings indicate that there is a linear relationship between settlement and surcharge height. Although long-term settlement decreased with longer and higher preloading, preloading time was found to be a more effective factor than preloading height.
Keywords: preloading, long-term settlement, landfill, PLAXIS 2D
Procedia PDF Downloads 195
17767 Kemmer Oscillator in Cosmic String Background
Authors: N. Messai, A. Boumali
Abstract:
In this work, we aim to solve the two-dimensional Kemmer equation, including a Dirac oscillator interaction term, in the background space-time generated by a cosmic string subjected to a uniform magnetic field. The eigenfunctions and eigenvalues of our problem have been found, and the influence of the cosmic string space-time on the energy spectrum has been analyzed.
Keywords: Kemmer oscillator, cosmic string, Dirac oscillator, eigenfunctions
Procedia PDF Downloads 584
17766 Effects of Boiling Temperature and Time on Colour, Texture and Sensory Properties of Volutharpa ampullacea perryi Meat
Authors: Xianbao Sun, Jinlong Zhao, Shudong He, Jing Li
Abstract:
Volutharpa ampullacea perryi is a high-protein marine shellfish. However, few data are available on the effects of boiling temperature and time on the quality of the meat. In this study, the colour, texture and sensory characteristics of Volutharpa ampullacea perryi meat during boiling cooking processes (75-100 °C, 5-60 min) were investigated by colour analysis, texture profile analysis (TPA), scanning electron microscopy (SEM) and sensory evaluation. The cooking loss ratio gradually increased with increasing temperature and time. The colour of the meat became lighter and more yellow from 85 °C to 95 °C at short times (5-20 min), but it became brown after a 30 min treatment. TPA results showed that the meat was firmer and less cohesive after treatment at higher temperatures (95-100 °C), even for a short period (5-15 min). Based on the SEM analysis, it was readily found that the myofibril structure was destroyed at higher temperatures (85-100 °C). Sensory data revealed that the meat cooked at 85-90 °C for 10-20 min showed higher scores in overall acceptance, as well as colour, hardness and taste. Based on these results, it can be concluded that Volutharpa ampullacea perryi meat should be heated under suitable conditions (such as 85 °C for 15 min or 90 °C for 10 min) during boiling cooking to ensure better acceptability.
Keywords: Volutharpa ampullacea perryi meat, boiling cooking, colour, sensory, texture
Procedia PDF Downloads 281
17765 Improving 99mTc-tetrofosmin Myocardial Perfusion Images by Time Subtraction Technique
Authors: Yasuyuki Takahashi, Hayato Ishimura, Masao Miyagawa, Teruhito Mochizuki
Abstract:
Quantitative measurement of myocardial perfusion is possible with single photon emission computed tomography (SPECT) using a semiconductor detector. However, accumulation of 99mTc-tetrofosmin in the liver may make it difficult to assess perfusion accurately in the inferior myocardium. Our idea is to reduce the high accumulation in the liver by using dynamic SPECT imaging and a technique called time subtraction. We evaluated the performance of a new SPECT system with a cadmium-zinc-telluride solid-state semiconductor detector (Discovery NM 530c; GE Healthcare). Our system acquired list-mode raw data over 10 minutes for a typical patient. From these data, ten SPECT images were reconstructed, one for each minute of acquired data. Reconstruction with the semiconductor detector was based on an implementation of a 3-D iterative Bayesian reconstruction algorithm. We studied 20 patients with coronary artery disease (mean age 75.4 ± 12.1 years; range 42-86; 16 males and 4 females). In each subject, 259 MBq of 99mTc-tetrofosmin was injected intravenously. We performed both a phantom and a clinical study using dynamic SPECT. An approximation to a liver-only image is obtained by reconstructing an image from the early projections, during which time the liver accumulation dominates (0.5~2.5 minutes SPECT image minus 5~10 minutes SPECT image). The extracted liver-only image is then subtracted from a later SPECT image that shows both the liver and the myocardial uptake (5~10 minutes SPECT image minus liver-only image). The time subtraction of the liver was possible in both the phantom and the clinical study. The visualization of the inferior myocardium was improved. In past reports, apparently higher accumulation in the inferior myocardium due to the overlap of the liver was un-diagnosable. Using our time subtraction method, the image quality of the 99mTc-tetrofosmin myocardial SPECT image is considerably improved.
Keywords: 99mTc-tetrofosmin, dynamic SPECT, time subtraction, semiconductor detector
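The two subtraction steps described (early frame minus late frame to isolate the liver, then late frame minus the liver-only image) can be sketched on arrays. The 1-D "images" below are toy values, and clipping at zero is an added assumption here to avoid negative counts.

```python
import numpy as np

def liver_subtracted(early_img, late_img):
    """Time-subtraction sketch: the early frame is dominated by liver uptake,
    so (early - late) approximates a liver-only image, which is then removed
    from the late frame. Negative values are clipped to zero."""
    liver_only = np.clip(early_img - late_img, 0.0, None)
    return np.clip(late_img - liver_only, 0.0, None)

# Toy 1-D activity profile: myocardium at index 1, liver at index 3.
early = np.array([0.0, 2.0, 0.0, 9.0])  # liver dominates the early window
late = np.array([0.0, 5.0, 0.0, 6.0])   # myocardial uptake fills in later
corrected = liver_subtracted(early, late)
```

The myocardial value is preserved while the liver contribution is suppressed, which is the intended effect on the inferior-wall assessment.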
Procedia PDF Downloads 335
17764 Two Efficient Heuristic Algorithms for the Integrated Production Planning and Warehouse Layout Problem
Authors: Mohammad Pourmohammadi Fallah, Maziar Salahi
Abstract:
In the literature, a mixed-integer linear programming model for the integrated production planning and warehouse layout problem has been proposed. To solve the model, the authors proposed a Lagrangian relax-and-fix heuristic that takes a significant amount of time to terminate, with gaps above 5% for large-scale instances. Here, we present two heuristic algorithms to solve the problem. In the first one, we use a greedy approach: warehouse locations with lower reservation costs, lower transportation costs from the production area, and lower transportation costs from the location to the output point are allocated to items with higher demands. Then a smaller model is solved. In the second heuristic, we first sort items in descending order of the ratio of the sum of that item's demands over the time horizon plus its maximum demand over the horizon to the sum of all demands over the time horizon. Then we partition the sorted items into groups of 3, 4, or 5 and solve a small-scale optimization problem for each group, hoping to improve the solution of the first heuristic. Our preliminary numerical results show the effectiveness of the proposed heuristics.
Keywords: capacitated lot-sizing, warehouse layout, mixed-integer linear programming, heuristic algorithms
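One reading of the second heuristic's sort-and-group rule can be sketched as follows (the demand layout and group size below are assumptions made for illustration, not the authors' implementation):

```python
def sort_and_group(demands, group_size=4):
    """demands: dict mapping item -> list of per-period demands.
    Sort items in descending order of (sum of item's demands + its max
    demand) / (sum of all demands), then split the sorted list into
    small groups to be solved as separate subproblems."""
    total = sum(sum(d) for d in demands.values())
    key = lambda item: (sum(demands[item]) + max(demands[item])) / total
    order = sorted(demands, key=key, reverse=True)
    return [order[i:i + group_size] for i in range(0, len(order), group_size)]
```

Each returned group would then feed a small-scale optimization model, per the abstract.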
Procedia PDF Downloads 196
17763 Optimum Performance of the Gas Turbine Power Plant Using Adaptive Neuro-Fuzzy Inference System and Statistical Analysis
Authors: Thamir K. Ibrahim, M. M. Rahman, Marwah Noori Mohammed
Abstract:
This study deals with modeling and performance enhancement of a gas-turbine combined cycle power plant. Clean and safe energy is among the greatest challenges in meeting the requirements of a green environment. These requirements have eroded the long-standing dominance of the steam turbine (ST) in world power generation, and the gas turbine (GT) is replacing it. Therefore, it is necessary to predict the characteristics of the GT system and optimize its operating strategy by developing a simulation system. An integrated model and simulation code for evaluating the performance of gas turbine power plants was developed in MATLAB. The performance code for heavy-duty GT and CCGT power plants was validated against the real Baiji GT and MARAFIQ CCGT plants, and the results were satisfactory. A new correlation technique was applied to all types of simulation data; its coefficient of determination (R2) was calculated as 0.9825. Some recently published correlations were checked against the Baiji GT plant, and an error analysis was applied. GT performance was judged by particular parameters selected from the simulation model, and the Adaptive Neuro-Fuzzy Inference System (ANFIS), an advanced optimization technology, was also utilized. The best thermal efficiency and power output attained were about 56% and 345 MW, respectively. Thus, the operating conditions and ambient temperature strongly influence the overall performance of the GT. The optimum efficiency and power are found at higher turbine inlet temperatures. It can be concluded that the developed models are powerful tools for estimating the overall performance of GT plants.
Keywords: gas turbine, optimization, ANFIS, performance, operating conditions
Procedia PDF Downloads 425
17762 Quick Covering Machine for Grain Drying Pavement
Authors: Fatima S. Rodriguez, Victorino T. Taylan, Manolito C. Bulaong, Helen F. Gavino, Vitaliana U. Malamug
Abstract:
In sun drying, the quality of paddy grains is greatly reduced when they are caught by the rain unsacked and unstored, resulting in reduced profit. The objectives of this study were to design and fabricate a quick covering machine for a grain drying pavement; to test and evaluate the operating characteristics of the machine in terms of deployment speed, recovery speed, deployment time, recovery time, power consumption, and aesthetics of the laminated sack; and to conduct a partial budget and cost curve analysis. The machine was able to cover the grains in a 12.8 m x 22.5 m grain drying pavement in an average time of 17.13 s. It consumed 0.53 W-hr for the deployment and recovery of the cover. The machine entailed an investment cost of $1,344.40 and an annual cost charge of $647.32. Moreover, the savings per year using the quick covering machine was $101.83.
Keywords: quick covering machine, grain, drying pavement
Procedia PDF Downloads 373
17761 Introduction to Multi-Agent Deep Deterministic Policy Gradient
Authors: Xu Jie
Abstract:
As a key network security method, cryptographic services must cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workloads. Their complexity and dynamics also make it difficult for traditional static security policies to cope with ever-changing cyber threats and environments. Traditional resource scheduling algorithms are inadequate when facing complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective optimized job flow scheduling problem and using a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path planning length, system delay, and network load balancing, and effectively solves the problem of complex resource scheduling in cryptographic services.
Keywords: multi-agent reinforcement learning, non-stationary dynamics, multi-agent systems, cooperative and competitive agents
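As a toy illustration of the reinforcement-learning idea (a single-agent tabular Q-learning stand-in, not the paper's multi-agent deep deterministic policy gradient method), an agent can learn a low-cost assignment of jobs to cryptographic servers; the cost matrix and hyperparameters below are invented for the sketch:

```python
import random

def q_learning_alloc(n_tasks, n_servers, cost,
                     episodes=200, alpha=0.5, gamma=0.9, eps=0.2):
    """Toy Q-learning scheduler: state = task index, action = server
    choice, reward = negative assignment cost. Returns the greedy
    task -> server policy after training."""
    Q = [[0.0] * n_servers for _ in range(n_tasks)]
    for _ in range(episodes):
        for t in range(n_tasks):
            # epsilon-greedy action selection
            if random.random() < eps:
                a = random.randrange(n_servers)
            else:
                a = max(range(n_servers), key=lambda s: Q[t][s])
            r = -cost[t][a]
            nxt = max(Q[(t + 1) % n_tasks])  # bootstrap on the next task's value
            Q[t][a] += alpha * (r + gamma * nxt - Q[t][a])
    return [max(range(n_servers), key=lambda s: Q[t][s]) for t in range(n_tasks)]
```

A real multi-agent version would replace the table with per-agent actor and critic networks sharing observations during training.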
Procedia PDF Downloads 24
17760 Analysis of Silicon Controlled Rectifier-Based Electrostatic Discharge Protection Circuits with Electrical Characteristics for the 5V Power Clamp
Authors: Jun-Geol Park, Kyoung-Il Do, Min-Ju Kwon, Kyung-Hyun Park, Yong-Seo Koo
Abstract:
This paper analyzes SCR (Silicon Controlled Rectifier)-based ESD (Electrostatic Discharge) protection circuits with respect to their turn-on time characteristics. The structures are the LVTSCR (Low Voltage Triggered SCR), the ZTSCR (Zener Triggered SCR), and the PTSCR (P-Substrate Triggered SCR), all designed for the 5 V power clamp. In general, structures with a low trigger voltage can have faster turn-on characteristics than other structures. All three ESD protection circuits achieve a low trigger voltage: the LVTSCR by using an N+ bridge region, the ZTSCR by using a zener diode structure, and the PTSCR by increasing the trigger current. The turn-on time comparison was simulated with the Synopsys TCAD simulator. According to the simulation results, the LVTSCR has a turn-on time of 2.8 ns, the ZTSCR of 2.1 ns, and the PTSCR of 2.4 ns. The HBM simulation results, however, show that the PTSCR is the more robust structure, reaching 430 K under the HBM 8 kV standard versus 450 K for the LVTSCR and 495 K for the ZTSCR. Therefore, the PTSCR is the most effective ESD protection circuit for the 5 V power clamp.
Keywords: ESD, SCR, turn-on time, trigger voltage, power clamp
Procedia PDF Downloads 348
17759 Investigation of the Function of Chemotaxonomy of White Tea on the Regulatory Function of Genes in Pathway of Colon Cancer
Authors: Fereydoon Bondarian, Samira Shaygan
Abstract:
Today, many nutritionists recommend the consumption of plants, fruits, and vegetables to provide the antioxidants needed by the body, because plant antioxidants usually cause fewer side effects and provide better treatment. Natural antioxidants increase the power of plasma antioxidants and reduce the incidence of some diseases, such as cancer. Poor lifestyles and environmental factors play an important role in increasing the incidence of cancer. In this study, different extracts of white tea taken from two types of tea available in Iran (clone 100 and a Chinese hybrid) were examined for their ability to inhibit free radicals and for their anticancer properties, owing to the hydroxyl functional groups in their structure; three extraction methods (aqueous, methanolic, and aqueous-methanolic) were used. The total polyphenolic content was determined using the Folin-Ciocalteu method, and the percentage of inhibition and trapping of free radicals in each of the extracts was determined using the DPPH method. Using high-performance liquid chromatography, the amount of each catechin in the tea samples was obtained. Clone 100 white tea was found to be the best sample of tea in terms of all the examined attributes (total polyphenol content, antioxidant properties, and individual catechin content). The results showed that the aqueous and aqueous-methanolic extracts of clone 100 white tea have the highest total polyphenol content, with 27.59±0.08 and 36.67±0.54 (gallic acid equivalents per gram dry weight of leaves), respectively. Owing to their high levels of the different groups of catechin compounds, these extracts showed the strongest free-radical inhibition and trapping, with 66.61±0.27 and 71.74±0.27% (mg/l of extract against ascorbic acid).
Using the MTT test, the inhibitory effect of clone 100 white tea extract on the growth of HCT-116 colon cancer cells was investigated, and the best time and concentration treatments were 500, 150, and 1000 micrograms at 8, 16, and 24 hours, respectively. To investigate gene expression changes, selected genes, including tumorigenic genes, proto-oncogenes, tumor suppressors, and genes involved in apoptosis, were analyzed using the real-time PCR method in the presence of the concentrations obtained for white tea. White tea extract at a concentration of 1000 μg/ml at 8, 16, and 24 hours showed the highest growth inhibition in cancer cells, with 53.27, 55.8, and 86.06%, respectively. The 1000 μg/ml aqueous extract of white tea under 24-hour treatment increased the expression of tumor suppressor genes compared to the normal sample.
Keywords: catechin, gene expression, suppressor genes, colon cell line
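The DPPH radical-scavenging percentage reported above is conventionally computed from control and sample absorbances; a minimal sketch (the absorbance values in the test are hypothetical, not the study's readings):

```python
def dpph_inhibition(abs_control, abs_sample):
    """Standard DPPH inhibition formula:
    % inhibition = (A_control - A_sample) / A_control * 100,
    where absorbances are typically read at 517 nm."""
    return (abs_control - abs_sample) / abs_control * 100.0
```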
Procedia PDF Downloads 58
17758 Mapping the Pain Trajectory of Breast Cancer Survivors: Results from a Retrospective Chart Review
Authors: Wilfred Elliam
Abstract:
Background: Pain is a prevalent and debilitating symptom among breast cancer patients, impacting their quality of life and overall well-being. The experience of pain in this population is multifaceted, influenced by a combination of disease-related factors, treatment side effects, and individual characteristics. Despite advancements in cancer treatment and pain management, many breast cancer patients continue to suffer from chronic pain, which can persist long after the completion of treatment. Understanding the progression of pain in breast cancer patients over time and identifying its correlates is crucial for effective pain management and supportive care strategies. The purpose of this research is to understand the patterns and progression of pain experienced by breast cancer survivors over time. Methods: Data were collected from breast cancer patients at Hartford Hospital at four time points: baseline, 3, 6, and 12 weeks. Key variables measured include pain, body mass index (BMI), fatigue, musculoskeletal pain, sleep disturbance, and demographic variables (age, employment status, cancer stage, and ethnicity). Binomial generalized linear mixed models were used to examine changes in pain and symptoms over time. Results: A total of 100 breast cancer patients aged 18 years and older were included in the analysis. We found that the effects of time on pain (p = 0.024), musculoskeletal pain (p < 0.001), fatigue (p < 0.001), and sleep disturbance (p = 0.013) were statistically significant, indicating pain progression in breast cancer patients. Patients using aromatase inhibitors had worse fatigue (p < 0.05) and musculoskeletal pain (p < 0.001) compared to patients on tamoxifen. Patients who were obese (p < 0.001) or overweight (p < 0.001) were more likely to report pain compared to patients with normal weight. Conclusion: This study revealed the complex interplay between factors such as time, pain, and sleep disturbance in breast cancer patients. Specifically, pain, musculoskeletal pain, sleep disturbance, and fatigue exhibited significant changes across the measured time points, indicating dynamic pain progression in these patients. The findings provide a foundation for future research and targeted interventions aimed at improving pain outcomes in breast cancer patients.
Keywords: breast cancer, chronic pain, pain management, quality of life
Procedia PDF Downloads 31
17757 Source-Detector Trajectory Optimization for Target-Based C-Arm Cone Beam Computed Tomography
Authors: S. Hatamikia, A. Biguri, H. Furtado, G. Kronreif, J. Kettenbach, W. Birkfellner
Abstract:
Nowadays, three-dimensional Cone Beam CT (CBCT) has become a widespread clinical routine imaging modality for interventional radiology. In conventional CBCT, a circular source-detector trajectory is used to acquire a high number of 2D projections in order to reconstruct a 3D volume. However, the accumulated radiation dose due to the repetitive use of CBCT needed for intraoperative procedures, as well as daily pretreatment patient alignment for radiotherapy, has become a concern. It is of great importance for both health care providers and patients to decrease the amount of radiation dose required for these interventional images. Thus, it is desirable to find optimized source-detector trajectories with a reduced number of projections, which could therefore lead to dose reduction. In this study, we investigate source-detector trajectories with optimal arbitrary orientations that maximize the performance of the reconstructed image at particular regions of interest. To achieve this, we developed a box phantom containing several small polytetrafluoroethylene target spheres at regular distances throughout the phantom. Each of these spheres serves as a target inside a particular region of interest. We use the 3D Point Spread Function (PSF) as a measure to evaluate the performance of the reconstructed image. We measured the spatial variance in terms of the Full-Width-Half-Maximum (FWHM) of the local PSF related to each target. A lower FWHM value indicates better spatial resolution of the reconstruction at the target area. One important feature of interventional radiology is that the imaging targets are very well known, since prior knowledge of the patient anatomy (e.g., a preoperative CT) is usually available for interventional imaging. Therefore, we use a CT scan of the box phantom as the prior knowledge and treat it as the digital phantom in our simulations to find the optimal trajectory for a specific target.
Based on the simulation phase, we obtain the optimal trajectory, which can then be applied on the device in a real situation. We consider a Philips Allura FD20 Xper C-arm geometry for the simulations and real data acquisition. Our experimental results, based on both simulation and real data, show that the proposed optimization scheme has the capacity to find optimized trajectories that localize the targets with a minimal number of projections. Our results show the proposed optimized trajectories are able to localize the targets as well as a standard circular trajectory while using just one third the number of projections. Conclusion: We demonstrate that applying a minimal dedicated set of projections with optimized orientations is sufficient to localize targets and may minimize the radiation dose.
Keywords: CBCT, C-arm, reconstruction, trajectory optimization
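The FWHM figure of merit used to score each target's local PSF can be computed from a sampled profile; a simple sketch, assuming a 1-D slice through the PSF peak (the authors' actual 3-D PSF analysis is not specified here):

```python
import numpy as np

def fwhm(profile, spacing=1.0):
    """Full-Width-Half-Maximum of a sampled 1-D PSF profile, using linear
    interpolation at the half-maximum crossings on each side of the peak."""
    y = np.asarray(profile, dtype=float)
    half = y.max() / 2.0
    above = np.where(y >= half)[0]          # indices at or above half maximum
    left, right = above[0], above[-1]

    def cross(i, j):
        # linear interpolation of the half-maximum crossing between samples i and j
        return i + (half - y[i]) / (y[j] - y[i]) * (j - i)

    x_left = cross(left - 1, left) if left > 0 else float(left)
    x_right = cross(right + 1, right) if right < len(y) - 1 else float(right)
    return (x_right - x_left) * spacing
```

For a Gaussian PSF with standard deviation sigma, the result should approximate 2.3548 * sigma.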
Procedia PDF Downloads 132
17756 An Intelligent Transportation System for Safety and Integrated Management of Railway Crossings
Authors: M. Magrini, D. Moroni, G. Palazzese, G. Pieri, D. Azzarelli, A. Spada, L. Fanucci, O. Salvetti
Abstract:
Railway crossings are complex entities whose optimal management cannot be addressed without the help of an intelligent transportation system integrating information on both train and vehicular flows. In this paper, we propose an integrated system named SIMPLE (Railway Safety and Infrastructure for Mobility applied at level crossings) that, while providing unparalleled safety at railway level crossings, collects data on rail and road traffic and provides value-added services to citizens and commuters. Such services include, for example, alerts via variable message signs to drivers and suggestions for alternative routes, towards a more sustainable, eco-friendly, and efficient urban mobility. To achieve these goals, SIMPLE is organized as a System of Systems (SoS), with a modular architecture whose components range from specially designed radar sensors for obstacle detection to smart ETSI M2M-compliant camera networks for urban traffic monitoring. Computational units performing forecasts according to adaptive models of train and vehicular traffic are also included. The proposed system has been tested and validated during an extensive trial held in the mid-sized Italian town of Montecatini, a paradigmatic case where the rail network is inextricably linked with the fabric of the city. Results of the tests are reported and discussed.
Keywords: Intelligent Transportation Systems (ITS), railway, railroad crossing, smart camera networks, radar obstacle detection, real-time traffic optimization, IoT, ETSI M2M, transport safety
Procedia PDF Downloads 497
17755 Development of a Bus Information Web System
Authors: Chiyoung Kim, Jaegeol Yim
Abstract:
Bus service is often either the main or the only public transportation available in cities. In metropolitan areas, both subways and buses are available, whereas in medium-sized cities buses are usually the only type of public transportation available. Bus Information Systems (BIS) provide users with the current locations of running buses, efficient routes from one place to another, points of interest around a given bus stop, the series of bus stops making up a given bus route, and so on. Thanks to BIS, people do not have to waste time at a bus stop waiting for a bus, because BIS provides exact information on bus arrival times at a given stop. Therefore, BIS does a lot to promote the use of buses, contributing to pollution reduction and saving natural resources. BIS implementation requires a huge budget, as it calls for a lot of special equipment such as roadside equipment, automatic vehicle identification and location systems, trunked radio systems, and so on. Consequently, medium and small sized cities with a low budget cannot afford to install a BIS even though people in these cities need BIS service more desperately than people in metropolitan areas. It is possible to provide BIS service at virtually no cost under the assumption that everybody carries a smartphone and that every running bus has at least one person with a smartphone who is willing to reveal his/her location details while sitting in the bus. This assumption usually holds in the real world: the smartphone penetration rate is greater than 100% in developed countries, and there is no reason for a bus driver to refuse to reveal his/her location details while driving. We have developed a mobile app that periodically reads sensor values, including GPS, and sends GPS data to the server when the bus stops or when the elapsed time since the last send attempt exceeds a threshold. The app detects the bus-stop state by investigating the sensor values.
The server that receives GPS data from this app has also been developed. Under the assumption that the current locations of all running buses collected by the mobile app are recorded in a database, we have also developed a web site that provides, through the Internet, all kinds of information that most BISs provide to users. The development environment is: OS: Windows 7 64-bit, IDE: Eclipse Luna 4.4.1, Spring IDE 3.7.0, Database: MySQL 5.1.7, Web Server: Apache Tomcat 7.0, Programming Language: Java 1.7.0_79. Given a start and a destination bus stop, the system finds a shortest path from the start to the destination using the Dijkstra algorithm. Then, it finds a convenient route considering the number of transfers. For the user interface, we use the Google map. Template classes used by the Controller, DAO, Service, and Utils classes include BUS, BusStop, BusListInfo, BusStopOrder, RouteResult, WalkingDist, Location, and so on. We are now integrating the mobile app system and the web app system.
Keywords: bus information system, GPS, mobile app, web site
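The shortest-path step can be sketched with a textbook Dijkstra implementation over an adjacency list (the graph structure below is an assumption for illustration; the actual system is written in Java with its own data model):

```python
import heapq

def shortest_path(graph, start, dest):
    """Dijkstra shortest path over a bus-stop graph given as
    {stop: [(neighbor, travel_time), ...]}. Returns (path, total_time)."""
    dist, prev = {start: 0}, {}
    pq = [(0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dest:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    # reconstruct the path by walking predecessors backwards
    path, node = [], dest
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[dest]
```

A transfer-aware route, as described in the abstract, would add a penalty edge weight wherever consecutive legs belong to different bus routes.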
Procedia PDF Downloads 216
17754 Assessment of Physical Activity Patterns in Patients with Cardiopulmonary Diseases
Authors: Ledi Neçaj
Abstract:
Objectives: The aims of this paper are (1) to objectively describe physical activity patterns across three chronic cardiopulmonary conditions, and (2) to study the association of physical activity dimensions with disease severity, self-reported physical and emotional functioning, and exercise performance. Material and Methods: This is a cross-sectional study of patients in their home environment. Patients with cardiopulmonary diseases comprised: chronic obstructive pulmonary disease (COPD) (n=63), heart failure (n=60), and patients with an implantable cardioverter defibrillator (n=60). Main outcome measures: Seven ambulatory physical activity dimensions (total steps; percentage of time active; percentage of time ambulating at low, medium, and high intensity; maximum cadence for 30 non-stop minutes; and peak performance) were measured with an accelerometer. Results: Subjects with COPD had the lowest amount of ambulatory physical activity compared with subjects with heart failure and cardiac dysrhythmias (all 7 dimensions of interest, P<.05); total step counts were 5319 versus 7464 versus 9570, respectively. Six-minute walk distance was correlated (r=.44-.65, P<.01) with all physical activity dimensions in the COPD sample, the strongest correlations being with total steps and peak performance. In subjects with cardiac impairment, maximal oxygen uptake had only small to moderate correlations with five of the physical activity dimensions (r=.22-.40, P<.05). In contrast, correlations between 6-minute walk test distance and physical activity were higher (r=.48-.61, P<.01), albeit in a smaller sample of only patients with heart failure. For all three samples, self-reported physical and mental health functioning, age, body mass index, airflow obstruction, and ejection fraction had either relatively small or no significant correlations with physical activity.
Conclusions: Findings from this study provide a useful benchmark of physical activity patterns in individuals with cardiopulmonary diseases for comparison with future studies. All seven dimensions of ambulatory physical activity differed between subjects with COPD, heart failure, and cardiac dysrhythmias. Depending on the research or clinical goal, the use of one dimension, such as total steps, may be sufficient. Although physical activity had high correlations with performance on a six-minute walk test relative to other variables, accelerometer-based physical activity monitoring provides unique, important information about real-world behavior in patients with cardiopulmonary disease not already captured with existing measures.
Keywords: ambulatory physical activity, walking, monitoring, COPD, heart failure, implantable defibrillator, exercise performance
Procedia PDF Downloads 87
17753 Performance of the Aptima® HIV-1 Quant Dx Assay on the Panther System
Authors: Siobhan O’Shea, Sangeetha Vijaysri Nair, Hee Cheol Kim, Charles Thomas Nugent, Cheuk Yan William Tong, Sam Douthwaite, Andrew Worlock
Abstract:
The Aptima® HIV-1 Quant Dx Assay is a fully automated assay on the Panther system, based on Transcription-Mediated Amplification and real-time detection technologies. The assay is intended for monitoring HIV-1 viral load in plasma specimens and for the detection of HIV-1 in plasma and serum specimens. Nine hundred seventy-nine specimens selected at random from routine testing at St Thomas’ Hospital, London, were anonymised and used to compare the performance of the Aptima HIV-1 Quant Dx assay and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 Test, v2.0. Two hundred thirty-four specimens gave quantitative HIV-1 viral load results in both assays. The quantitative results reported by the Aptima assay were comparable to those reported by the Roche COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0, with a linear regression slope of 1.04 and an intercept of -0.097. The Aptima assay detected HIV-1 in more samples than the Roche assay. This was not due to a lack of specificity of the Aptima assay, because this assay showed 99.83% specificity when testing plasma specimens from 600 HIV-1 negative individuals. To understand the reason for this higher detection rate, a side-by-side comparison of low-level panels made from the HIV-1 3rd international standard (NIBSC10/152) and clinical samples of various subtypes was performed in both assays. The Aptima assay was more sensitive than the Roche assay. The good sensitivity, specificity, and agreement with other commercial assays make the HIV-1 Quant Dx Assay appropriate for both viral load monitoring and detection of HIV-1 infections.
Keywords: HIV viral load, Aptima, Roche, Panther system
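The slope/intercept agreement statistic quoted above comes from an ordinary least-squares fit of one assay's log10 viral loads against the other's; a minimal sketch with made-up values (not the study's data):

```python
import numpy as np

def compare_assays(log_vl_a, log_vl_b):
    """Least-squares slope and intercept of assay A results regressed
    on assay B results (both in log10 copies/mL). A slope near 1 and an
    intercept near 0 indicate quantitative agreement."""
    slope, intercept = np.polyfit(log_vl_b, log_vl_a, 1)
    return slope, intercept
```

Bland-Altman analysis would be a common complementary way to examine assay agreement.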
Procedia PDF Downloads 375
17752 Hourly Solar Radiations Predictions for Anticipatory Control of Electrically Heated Floor: Use of Online Weather Conditions Forecast
Authors: Helene Thieblemont, Fariborz Haghighat
Abstract:
Energy storage systems play a crucial role in decreasing building energy consumption during peak periods and expanding the use of renewable energies in buildings. To provide high building thermal performance, the energy storage system has to be properly controlled to ensure good energy performance while maintaining satisfactory thermal comfort for the building's occupants. In the case of passive discharge storage, the required amount of energy must be defined in advance to avoid overheating the building. Consequently, anticipatory supervisory control strategies have been developed that forecast future energy demand and production to coordinate systems. Anticipatory supervisory control strategies are based on predictions, mainly of the weather forecast. However, while the forecasted hourly outdoor temperature may be found online with high accuracy, solar radiation predictions are most of the time not available online. To estimate them, this paper proposes an approach based on the forecast of weather conditions. Several methods to correlate hourly weather condition forecasts to real hourly solar radiation are compared. Results show that using weather condition forecasts allows estimating next-day solar radiation with acceptable accuracy. Moreover, this technique yields hourly data that may be used in building models. As a result, this solar radiation prediction model may help implement model-based controllers such as Model Predictive Control.
Keywords: anticipatory control, model predictive control, solar radiation forecast, thermal storage
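Scoring how well a forecast-derived predictor tracks measured hourly radiation can be done with Pearson's correlation coefficient; a minimal sketch with toy arrays (the paper's actual correlation methods are not specified here):

```python
import numpy as np

def pearson_r(forecast, measured):
    """Pearson correlation between a forecast-derived predictor and
    measured hourly solar radiation: covariance over the product of
    standard deviations, computed on centered arrays."""
    f = np.asarray(forecast, dtype=float)
    m = np.asarray(measured, dtype=float)
    f, m = f - f.mean(), m - m.mean()
    return float((f * m).sum() / np.sqrt((f * f).sum() * (m * m).sum()))
```

An r close to 1 would support using the forecast-based estimate as input to a Model Predictive Control scheme.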
Procedia PDF Downloads 271