Search results for: accurate forecast
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2799

2379 Eco-Parcel As a Semi-Qualitative Approach to Support Environmental Impacts Assessments in Nature-Based Tourism Destinations

Authors: Halima Kilungu, Pantaleo, K. T. Munishi

Abstract:

Climate and land-cover change affect nature-based tourism (NBT) because its attractions are closely tied to natural environments and climate. Knowledge of how each attraction reacts to the changing environments, and simple yet science-based approaches for responding to these changes from a tourism perspective in space and time, are therefore timely. Nevertheless, no specific approaches exist to address this knowledge gap. The eco-parcel approach is devised to address the gap and operationalized in Serengeti and Kilimanjaro National Parks, the most climate-sensitive NBT destinations in Africa. The approach is partly descriptive and has three simple steps: (1) identify and define tourist attractions (i.e., biotic and abiotic attractions), which creates an important database of the most poorly kept information in NBT destinations, namely attraction types; (2) create a spatial and temporal link for each attraction and describe its characteristic environments (e.g., vegetation, soil, water, and rock outcrops), the scarcest yet most important attraction information, since it serves as a proxy for changes in attractions; and (3) assess the importance of individual attractions for tourism based on tourists' preferences, which enables an accurate assessment of each attraction's value for tourism. The strength of the eco-parcel approach is that it describes how each attraction emerges from and is connected to specific environments, which define its attractiveness in space and time. This information allows accurate assessment of the likely losses or gains of individual attractions when climate or environment changes in specific destinations and equips tourism stakeholders with informed responses.

Keywords: climate change, environmental change, nature-based tourism, Serengeti National Park, Kilimanjaro National Park

Procedia PDF Downloads 117
2378 Chemometric-Based Voltammetric Method for Analysis of Vitamins and Heavy Metals in Honey Samples

Authors: Marwa A. A. Ragab, Amira F. El-Yazbi, Amr El-Hawiet

Abstract:

The analysis of heavy metals in honey samples is crucial: when found in honey, they denote environmental pollution. Some heavy metals, such as lead, are considered toxic whether present at low or high concentrations. Others, such as copper and zinc, are safe, even vital, minerals at low concentrations but toxic at high concentrations. Their voltammetric determination in honey is challenging because of other electro-active components, such as vitamins, whose peaks may overlap with those of the metals and hinder their accurate and precise determination. The simultaneous analysis of the vitamins nicotinic acid (B3) and riboflavin (B2) and the heavy metals lead, cadmium, and zinc in honey samples was addressed. The analysis was done in 0.1 M potassium chloride (KCl) using a hanging mercury drop electrode (HMDE), followed by chemometric manipulation of the voltammetric data using the derivative method; the derivative data were then convoluted using discrete Fourier functions. The proposed method allowed the simultaneous analysis of vitamins and metals despite their varied responses and sensitivities. Although their peaks overlapped, the proposed chemometric method allowed their accurate and precise analysis. After the chemometric treatment of the data, metals were successfully quantified at low levels in the presence of vitamins (1:2000). The heavy metals' limit of detection (LOD) values after the chemometric treatment of the data decreased by more than 60% compared with those obtained from the direct voltammetric method. The method's applicability was tested by analyzing the selected metals and vitamins in real honey samples of different botanical origins.
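The derivative-then-Fourier-convolution step the abstract describes can be sketched numerically. The following is an illustrative Python sketch on synthetic overlapped peaks; the helper names, peak positions, and `keep_fraction` cutoff are assumptions, not the authors' implementation:

```python
import numpy as np

def derivative_voltammogram(potential, current):
    """First-derivative transform used to resolve overlapped peaks."""
    return np.gradient(current, potential)

def fourier_smooth(signal, keep_fraction=0.1):
    """Convolute the derivative via the discrete Fourier domain:
    zero out high-frequency coefficients and invert the transform."""
    coeffs = np.fft.rfft(signal)
    cutoff = max(1, int(len(coeffs) * keep_fraction))
    coeffs[cutoff:] = 0.0
    return np.fft.irfft(coeffs, n=len(signal))

# Synthetic voltammogram: two overlapped Gaussian peaks plus noise.
e = np.linspace(-1.2, -0.2, 500)                # potential (V)
peak = lambda e0, h, w: h * np.exp(-((e - e0) / w) ** 2)
i = peak(-0.75, 1.0, 0.08) + peak(-0.65, 0.6, 0.08)
i += np.random.default_rng(0).normal(0, 0.01, e.size)

d = derivative_voltammogram(e, i)
d_smooth = fourier_smooth(d, keep_fraction=0.08)
```

Differentiation sharpens the overlapped features, while discarding high-frequency Fourier coefficients suppresses the noise that differentiation amplifies.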

Keywords: chemometrics, overlapped voltammetric peaks, derivative and convoluted derivative methods, metals and vitamins

Procedia PDF Downloads 145
2377 A Nonlinear Dynamical System with Application

Authors: Abdullah Eqal Al Mazrooei

Abstract:

In this paper, a nonlinear dynamical system belonging to the bilinear class is presented. Bilinear systems are an important kind of nonlinear system because they have many applications in real life: they are used in biology, chemistry, manufacturing, engineering, and economics, where linear models are ineffective or inadequate, and they have recently been used to analyze and forecast weather conditions. Bilinear systems have three advantages. First, they describe many problems of great applied importance. Second, they provide approximations to nonlinear systems. Third, they have rich geometric and algebraic structures, which promise a fruitful field of research for scientists and applications. The type of nonlinearity treated and analyzed here consists of a bilinear interaction between the state vector and the system input. By using some properties of the tensor product, such systems can be transformed into linear systems. Here, however, we discuss the nonlinearity that arises when the state vector is multiplied by itself, so the model is able to handle evolutions of the Lotka-Volterra or Lorenz weather type, enabling a wider and more flexible application of such models. We apply the system with an estimator to estimate temperatures, and the results prove the efficiency of the proposed system.
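A minimal numerical illustration of a system whose state vector is multiplied by itself is the Lotka-Volterra model mentioned above. The sketch below integrates it with a plain Euler step; the parameter values and step size are assumed for illustration and are not from the paper:

```python
import numpy as np

def lotka_volterra_step(x, dt, a=1.1, b=0.4, c=0.4, e=0.1):
    """One Euler step of the predator-prey model, whose nonlinearity is
    the state multiplied by itself (the prey*pred interaction terms)."""
    prey, pred = x
    return np.array([
        prey + dt * (a * prey - b * prey * pred),
        pred + dt * (e * prey * pred - c * pred),
    ])

x = np.array([10.0, 5.0])        # initial prey and predator populations
traj = [x]
for _ in range(2000):            # simulate 20 time units
    x = lotka_volterra_step(x, dt=0.01)
    traj.append(x)
traj = np.array(traj)
```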

Keywords: Lorenz models, nonlinear systems, nonlinear estimator, state-space model

Procedia PDF Downloads 249
2376 A Dynamic Approach for Evaluating the Climate Change Risks on Building Performance

Authors: X. Lu, T. Lu, S. Javadi

Abstract:

A simple dynamic approach is presented for analyzing the thermal and moisture dynamics of buildings, which is of particular relevance to understanding climate change impacts on buildings, including the assessment of risks and the application of resilience strategies. To demonstrate the proposed modeling methodology, verify the model, and show that wooden materials provide a mechanism that can reduce moisture risks and be more resilient to global warming, a wooden church equipped with high-precision measurement systems was used as a test building for full-scale time-series measurements. Sensitivity analyses indicate a high degree of accuracy in the model's prediction of the indoor environment. The model is then applied to a future projection of the indoor climate, aiming to identify significant environmental factors, the changing temperature and humidity, and effective responses to climate change impacts. The paper suggests that wooden building materials offer an effective and resilient response to anticipated future climate change.
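The paper does not give its model equations. As a hedged illustration of what a "simple dynamic approach" to building thermal dynamics can look like, the sketch below steps a first-order lumped-RC model; all parameter values are assumed, not taken from the study:

```python
import numpy as np

def simulate_indoor_temperature(t_out, t0=18.0, r=0.005, c=2.0e6,
                                q=500.0, dt=3600.0):
    """First-order lumped-capacitance building model (a common
    simplification): C dT/dt = (T_out - T_in)/R + Q, stepped with
    Euler at interval dt. r (K/W), c (J/K), q (W) are assumed values."""
    t_in = [t0]
    for to in t_out:
        t = t_in[-1]
        t += dt * ((to - t) / (r * c) + q / c)
        t_in.append(t)
    return np.array(t_in[1:])

# One illustrative day of hourly outdoor temperatures (deg C).
outdoor = 10.0 + 5.0 * np.sin(np.linspace(0, 2 * np.pi, 24))
indoor = simulate_indoor_temperature(outdoor)
```

Climate-change scenarios are then explored by feeding the same model projected outdoor temperature and humidity series instead of historical ones.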

Keywords: dynamic model, forecast, climate change impact, wooden structure, buildings

Procedia PDF Downloads 148
2375 VISSIM Modeling of Driver Behavior at Connecticut Roundabouts

Authors: F. Clara Fang, Hernan Castaneda

Abstract:

The Connecticut Department of Transportation (ConnDOT) has constructed four roundabouts in the State of Connecticut within the past ten years. VISSIM traffic simulation software was utilized to analyze these roundabouts during their design phase. The queue length and level of service observed in the field appear to be better than predicted by the VISSIM model. The objectives of this project are to identify the VISSIM input variables most critical to accurate modeling, recommend VISSIM calibration factors, and provide other recommendations for roundabout traffic operations modeling. Traffic data were collected at these roundabouts using Miovision Technologies. Cameras were set up to capture vehicle circulating activity and entry behavior for two weekdays. A large sample of field data was analyzed to achieve accurate and statistically significant results. The data extracted from the videos include vehicle circulating speed, critical gap estimated by the maximum likelihood method, peak hour volume, follow-up headway, travel time, and vehicle queue length. A VISSIM simulation of the existing roundabouts was built to compare the queue length and travel time predicted by the simulation with those measured in the field. The research investigated a variety of simulation parameters as calibration factors for describing driver behavior at roundabouts. Among them, critical gap is the most effective calibration variable in roundabout simulation; it has a significant impact on queue length, particularly when the volume is higher. The results will improve the design of future roundabouts in Connecticut and provide decision makers with insights on the relationship between various choices and future performance.

Keywords: driver critical gap, roundabout analysis, simulation, VISSIM modeling

Procedia PDF Downloads 284
2374 E-Learning Recommender System Based on Collaborative Filtering and Ontology

Authors: John Tarus, Zhendong Niu, Bakhti Khadidja

Abstract:

In recent years, e-learning recommender systems have attracted great attention as a solution to the problem of information overload in e-learning environments, providing relevant recommendations to online learners. E-learning recommenders play an increasing educational role in helping learners find appropriate learning materials to support the achievement of their learning goals. Although general recommender systems have recorded significant success in solving the problem of information overload in e-commerce domains and providing accurate recommendations, e-learning recommender systems still face issues arising from differences in learner characteristics such as learning style, skill level, and study level. Conventional recommendation techniques such as collaborative filtering and content-based filtering deal with only two types of entities, namely users and items with their ratings, and do not take learner characteristics into account in the recommendation process. They therefore cannot make accurate and personalized recommendations in e-learning environments. In this paper, we propose a recommendation technique combining collaborative filtering and ontology to recommend personalized learning materials to online learners. The ontology is used to incorporate learner characteristics into the recommendation process alongside the ratings, while collaborative filtering predicts ratings and generates recommendations. Furthermore, ontological knowledge is used by the recommender system at the initial stages, in the absence of ratings, to alleviate the cold-start problem. Evaluation results show that our proposed recommendation technique outperforms collaborative filtering on its own in terms of personalization and recommendation accuracy.
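The hybrid idea, blending rating-based similarity with ontology-derived learner-profile similarity before neighbour-weighted prediction, can be sketched as follows. This is a minimal illustration under assumed data, an assumed numeric encoding of learner characteristics, and an assumed blending weight `alpha`; it is not the authors' exact algorithm:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity over co-rated (nonzero) positions."""
    mask = (a > 0) & (b > 0)
    if not mask.any():
        return 0.0
    num = float(a[mask] @ b[mask])
    den = np.linalg.norm(a[mask]) * np.linalg.norm(b[mask])
    return num / den if den else 0.0

def predict_rating(ratings, learners, target, item, alpha=0.5):
    """Hybrid neighbour prediction: similarity is a weighted blend of
    rating-based similarity and learner-profile similarity (the
    profiles stand in for ontology-derived characteristics such as
    learning style and skill level)."""
    sims, vals = [], []
    for u in range(ratings.shape[0]):
        if u == target or ratings[u, item] == 0:
            continue
        s = (alpha * cosine_sim(ratings[target], ratings[u])
             + (1 - alpha) * cosine_sim(learners[target], learners[u]))
        sims.append(s)
        vals.append(ratings[u, item])
    if not sims:
        return 0.0
    sims, vals = np.array(sims), np.array(vals)
    return float(sims @ vals / np.abs(sims).sum())

# Ratings (0 = unrated) and numeric learner-characteristic profiles.
R = np.array([[5, 3, 0, 4],
              [4, 3, 4, 0],
              [1, 2, 5, 0]], dtype=float)
P = np.array([[1, 0, 2], [1, 0, 2], [0, 1, 1]], dtype=float)
pred = predict_rating(R, P, target=0, item=2)
```

In the cold-start case (no ratings for the target), the same scheme can fall back to the profile similarity alone, which is how ontological knowledge alleviates the cold-start problem.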

Keywords: collaborative filtering, e-learning, ontology, recommender system

Procedia PDF Downloads 371
2373 Ordinal Regression with Fenton-Wilkinson Order Statistics: A Case Study of an Orienteering Race

Authors: Joonas Pääkkönen

Abstract:

In sports, individuals and teams are typically interested in final rankings. Final results, such as times or distances, dictate these rankings, also known as places. Places can be further associated with ordered random variables, commonly referred to as order statistics. In this work, we introduce a simple yet accurate order-statistical ordinal regression function that predicts relay race places from changeover times. We call this function the Fenton-Wilkinson Order Statistics model. The model is built on the educated assumption that individual leg times follow log-normal distributions. Moreover, our key idea is to utilize Fenton-Wilkinson approximations of changeover times alongside an estimator for the total number of teams, as in the well-known German tank problem. This original place regression function is sigmoidal and thus correctly predicts the existence of a small number of elite teams that significantly outperform the rest of the teams. Our model also describes how place increases linearly with changeover time at the inflection point of the log-normal distribution function. With real-world data from Jukola 2019, a massive orienteering relay race, the model is shown to be highly accurate even when the size of the training set is only 5% of the whole data set. Numerical results also show that our model exhibits smaller place-prediction root-mean-square errors than linear regression, mord regression, and Gaussian process regression.
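The core of the place regression, a log-normal CDF scaled by the estimated number of teams, can be written down directly. The parameter values below are assumed for illustration; they are not the Jukola 2019 fit:

```python
import math

def predict_place(changeover_time, mu, sigma, n_teams):
    """Sigmoidal place regression: expected place of a team with
    changeover time t, when times are log-normal(mu, sigma) and
    n_teams is the estimated field size. Place ~ N * F(t), with F the
    log-normal CDF (written via the error function)."""
    z = (math.log(changeover_time) - mu) / (sigma * math.sqrt(2.0))
    cdf = 0.5 * (1.0 + math.erf(z))
    return n_teams * cdf

# Assumed parameters: median leg time 60 min, 1500 teams.
mu, sigma, n = math.log(60.0), 0.25, 1500
mid = predict_place(60.0, mu, sigma, n)   # team exactly at the median
```

At the median time the function returns half the field, and its slope there is the "place increases linearly with changeover time at the inflection point" behaviour described above.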

Keywords: Fenton-Wilkinson approximation, German tank problem, log-normal distribution, order statistics, ordinal regression, orienteering, sports analytics, sports modeling

Procedia PDF Downloads 117
2372 The Cost of Solar-Centric Renewable Portfolio

Authors: Timothy J. Considine, Edward J. M. Manderson

Abstract:

This paper develops an econometric forecasting system of energy demand coupled with engineering-economic models of energy supply. The framework is used to quantify the impact of state-level renewable portfolio standards (RPSs) achieved predominantly with solar generation on electricity rates, electricity consumption, and environmental quality. We perform the analysis using Arizona’s RPS as a case study. We forecast energy demand in Arizona out to 2035 and find that by this time the state will require an additional 35 million MWh of electricity generation. If Arizona implements its RPS when supplying this electricity demand, we find there will be a substantial increase in electricity rates (relative to a business-as-usual scenario of reliance on gas-fired generation). Extending the current regime of tax credits can greatly reduce this increase, at the taxpayers’ expense. We find that by 2025 Arizona’s RPS will implicitly abate carbon dioxide emissions at a cost between $101 and $135 per metric ton, and by 2035 abatement costs are between $64 and $112 per metric ton (depending on the future evolution of natural gas prices).

Keywords: electricity demand, renewable portfolio standard, solar, carbon dioxide

Procedia PDF Downloads 478
2371 High Resolution Sandstone Connectivity Modelling: Implications for Outcrop Geological and Its Analog Studies

Authors: Numair Ahmed Siddiqui, Abdul Hadi bin Abd Rahman, Chow Weng Sum, Wan Ismail Wan Yousif, Asif Zameer, Joel Ben-Awal

Abstract:

Advances in data capture from outcrop studies have made possible the acquisition of high-resolution digital data, offering improved and economical reservoir modelling methods. Terrestrial laser scanning utilizing LiDAR (light detection and ranging) provides a new method for building outcrop-based reservoir models, which supply a crucial piece of information for understanding heterogeneities in sandstone facies through high-resolution images and data sets. This study presents the detailed application of an outcrop-based sandstone facies connectivity model, combining information gathered from traditional fieldwork with detailed digital point-cloud data from LiDAR, to develop an intermediate small-scale reservoir sandstone facies model of the Miocene Sandakan Formation, Sabah, East Malaysia. The software RiScan Pro (v1.8.0) was used for digital data collection and post-processing, with an accuracy of 0.01 m and a point acquisition rate of up to 10,000 points per second. We provide an accurate and descriptive workflow for triangulating point clouds of different sets of sandstone facies with well-marked top and bottom boundaries, in conjunction with field sedimentology. This yields a highly accurate qualitative sandstone facies connectivity model, which is a challenge to obtain from subsurface datasets (i.e., seismic and well data). Finally, by applying this workflow, we can build an outcrop-based static connectivity model that can serve as an analogue for subsurface reservoir studies.

Keywords: LiDAR, outcrop, high resolution, sandstone facies, connectivity model

Procedia PDF Downloads 217
2370 Electronic Payment Recording with Payment History Retrieval Module: A System Software

Authors: Adrian Forca, Simeon Cainday III

Abstract:

The Electronic Payment Recording with Payment History Retrieval Module was developed specifically for the College of Science and Technology. The software replaces the department's manual, slow, and time-consuming process of recording payments with a quick, reliable, and accurate electronic one that immediately generates a receipt for every transaction. Generation of recorded-payment reports is integrated, replacing manual reporting with easier, consolidated reports, and all recorded student payments can be retrieved immediately, making the system a transparent and reliable payment recording tool. Overall, the system shifts the manual process to organized software technology in which information is stored in a logically correct and normalized database. The software is developed in a modern programming language and implements strict programming methods to validate all users accessing the system and to evaluate all data passed into it and all information retrieved from it, ensuring data accuracy and reliability. In addition, the system identifies each user and limits their access privileges, establishing boundaries on the specific store, modify, and update operations allowed, so that information is secure against unauthorized manipulation. As a result, the system eliminates the manual procedure and replaces it with modern information technology, making the whole payment recording process fast, secure, accurate, and reliable.

Keywords: collection, information system, manual procedure, payment

Procedia PDF Downloads 160
2369 Assessment of Arterial Stiffness through Measurement of Magnetic Flux Disturbance and Electrocardiogram Signal

Authors: Jing Niu, Jun X. Wang

Abstract:

Arterial stiffness predicts mortality and morbidity independently of other cardiovascular risk factors and is a major risk factor for age-related morbidity and mortality. The non-invasive industry gold-standard measurement of arterial stiffness uses the pulse wave velocity method; however, the desktop device is expensive and requires a trained professional to operate. The main objective of this research is a proof of concept of the proposed non-invasive method, which uses measurement of magnetic flux disturbance and the electrocardiogram (ECG) signal to measure arterial stiffness. The method could enable accurate and easy self-assessment of arterial stiffness at home and help doctors with research, diagnosis, and prescription in hospitals and clinics. A platform for assessing arterial stiffness through acquisition and analysis of the radial artery pulse waveform and ECG signal has been developed based on the proposed method. The radial artery pulse waveform is acquired using magnetic-based sensing technology, while the ECG signal is acquired using two dry-contact single-arm ECG electrodes; the measurement only requires the participant to wear a wrist strap and an arm band. Participants were recruited for data collection using both the developed platform and the industry gold-standard system, and the results from the two systems underwent correlation analysis. A strong positive correlation between the results of the two systems is observed. This study presents the possibility of developing an accurate, easy-to-use, and affordable measurement device for arterial stiffness assessment.
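For reference, the gold-standard quantity the platform is compared against is computed simply as arterial path length over pulse transit time. A minimal sketch with illustrative numbers (the 0.5 m path and 62.5 ms transit time are assumptions):

```python
def pulse_wave_velocity(r_peak_time_s, pulse_foot_time_s, path_length_m):
    """PWV = arterial path length / pulse transit time, where the
    transit time runs from the ECG R peak to the arrival (foot) of the
    pulse wave at the measurement site."""
    ptt = pulse_foot_time_s - r_peak_time_s
    if ptt <= 0:
        raise ValueError("pulse must arrive after the R peak")
    return path_length_m / ptt

# Illustrative values: 0.5 m heart-to-wrist path, 62.5 ms transit.
pwv = pulse_wave_velocity(0.0, 0.0625, 0.5)   # m/s
```

Stiffer arteries conduct the pulse faster, so a higher PWV indicates higher arterial stiffness.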

Keywords: arterial stiffness, electrocardiogram, pulse wave velocity, magnetic flux disturbance

Procedia PDF Downloads 185
2368 Hybrid Wavelet-Adaptive Neuro-Fuzzy Inference System Model for a Greenhouse Energy Demand Prediction

Authors: Azzedine Hamza, Chouaib Chakour, Messaoud Ramdani

Abstract:

Energy demand prediction plays a crucial role in achieving next-generation power systems for agricultural greenhouses. As a result, high prediction quality is required for efficient smart grid management and therefore low-cost energy consumption. The aim of this paper is to investigate the effectiveness of a hybrid data-driven model in day-ahead energy demand prediction. The proposed model consists of the Discrete Wavelet Transform (DWT) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The DWT is employed to decompose the original signal into a set of subseries, and an ANFIS is then used to generate the forecast for each subseries. The proposed hybrid method (DWT-ANFIS) was evaluated using one week of greenhouse energy demand data and compared with ANFIS alone. The performances of the different models were evaluated by comparing the corresponding values of the Mean Absolute Percentage Error (MAPE). It was demonstrated that the discrete wavelet transform can improve agricultural greenhouse energy demand modeling.
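The models are compared by MAPE; a minimal sketch of the metric on illustrative demand values (the numbers are made up, not the greenhouse dataset):

```python
import numpy as np

def mape(actual, predicted):
    """Mean Absolute Percentage Error: mean of |error| / |actual|,
    expressed as a percentage. Assumes no zero actual values."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

demand   = [120.0, 135.0, 150.0, 140.0]   # kWh, illustrative values
forecast = [118.0, 130.0, 155.0, 137.0]
err = mape(demand, forecast)              # percent error of the forecast
```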

Keywords: wavelet transform, ANFIS, energy consumption prediction, greenhouse

Procedia PDF Downloads 81
2367 Accuracy of Trauma on Scene Triage Screen Tool (Shock Index, Reverse Shock Index Glasgow Coma Scale, and National Early Warning Score) to Predict the Severity of Emergency Department Triage

Authors: Chaiyaporn Yuksen, Tapanawat Chaiwan

Abstract:

Introduction: Emergency medical service (EMS) care for trauma patients must provide on-scene assessment and essential treatment, with appropriate transport to a trauma center. The shock index (SI), reverse shock index Glasgow Coma Scale (rSIG), and National Early Warning Score (NEWS) triage tools are easy to use in a prehospital setting, yet there is no standardized on-scene triage protocol in prehospital care. The primary objective was to determine the accuracy of SI, rSIG, and NEWS in predicting the severity of trauma patients in the emergency department (ED). Methods: This was a retrospective cross-sectional diagnostic study of injured patients who received prehospital care and were transported by the EMS team to the ED of Ramathibodi Hospital, a university-affiliated super-tertiary care hospital in Bangkok, Thailand, from January 2015 to September 2022. We compared the on-scene parameters (SI, rSIG, and NEWS) with the ED triage (Emergency Severity Index, ESI) using the area under the ROC curve. Results: 218 trauma patients were transported by EMS to the ED; 161 were ESI level 1-2 and 57 were level 3-5. NEWS discriminated the severity of trauma patients more accurately than rSIG and SI: the areas under the ROC curve were 0.743 (95% CI 0.70-0.79), 0.649 (95% CI 0.59-0.70), and 0.582 (95% CI 0.52-0.65), respectively (p < 0.001). The optimal NEWS cut-off was 6 points. Conclusions: NEWS was the most accurate on-scene triage tool for trauma patients in the prehospital setting.
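SI and rSIG have simple closed forms; NEWS, by contrast, aggregates several vital-sign thresholds from a published chart and is not reproduced here. A minimal sketch with illustrative vitals:

```python
def shock_index(heart_rate, systolic_bp):
    """SI = HR / SBP; higher values suggest haemodynamic compromise."""
    return heart_rate / systolic_bp

def reverse_shock_index_gcs(heart_rate, systolic_bp, gcs):
    """rSIG = (SBP / HR) * GCS; lower values indicate greater severity,
    since it falls with both shock and reduced consciousness."""
    return (systolic_bp / heart_rate) * gcs

# Illustrative on-scene vitals for a stable patient.
si   = shock_index(80, 120)                   # HR 80, SBP 120
rsig = reverse_shock_index_gcs(80, 120, 15)   # fully conscious (GCS 15)
```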

Keywords: on-scene triage, trauma patient, ED triage, accuracy, NEWS

Procedia PDF Downloads 119
2366 The Impact of Corporate Social Responsibility Information Disclosure on the Accuracy of Analysts' Earnings Forecasts

Authors: Xin-Hua Zhao

Abstract:

In recent years, the number of social responsibility reports disclosed by Chinese corporations has grown rapidly, and the economic effects of this growth have become a hot topic. Taking listed chemical engineering corporations in China that disclose social responsibility reports as a sample, and based on information asymmetry theory, this article examines the economic effect generated by corporate social responsibility disclosure using the method of ordinary least squares. The research is conducted from the perspective of analysts’ earnings forecasts and studies the impact of corporate social responsibility disclosure on the accuracy of those forecasts. The results show a statistically significant negative correlation between the corporate social responsibility disclosure index and analysts’ earnings forecast error. The conclusions confirm that enterprises can reduce the asymmetry of social and environmental information by disclosing social responsibility reports and thus improve the accuracy of analysts’ earnings forecasts, promoting the effective allocation of resources in the market.

Keywords: analysts' earnings forecasts, corporate social responsibility disclosure, economic effect, information asymmetry

Procedia PDF Downloads 153
2365 A Deep Learning Model with Greedy Layer-Wise Pretraining Approach for Optimal Syngas Production by Dry Reforming of Methane

Authors: Maryam Zarabian, Hector Guzman, Pedro Pereira-Almao, Abraham Fapojuwo

Abstract:

Dry reforming of methane (DRM) has sparked significant industrial and scientific interest, not only as a viable alternative for addressing the environmental concerns of two main contributors to the greenhouse effect, carbon dioxide (CO₂) and methane (CH₄), but also because it produces syngas, a mixture of hydrogen (H₂) and carbon monoxide (CO) used as a feedstock by a wide range of downstream chemical processes. In this study, we develop an AI-enabled syngas production model to tackle the problem of achieving an equivalent H₂/CO ratio [1:1] with the most efficient conversion. First, the unsupervised density-based spatial clustering of applications with noise (DBSCAN) algorithm removes outlier data points from the original experimental dataset. Then, random forest (RF) and deep neural network (DNN) models use the cleaned dataset to predict the DRM results. DNN models inherently cannot obtain accurate predictions without a huge dataset; to cope with this limitation, we employ approaches that reuse pre-trained layers, such as transfer learning and greedy layer-wise pretraining. Compared to the other deep models (a pure deep model and a transferred deep model), the greedy layer-wise pre-trained deep model provides the most accurate prediction, with accuracy similar to the RF model: R² values of 1.00, 0.999, 0.999, 0.999, 0.999, and 0.999 for the total outlet flow, H₂/CO ratio, H₂ yield, CO yield, CH₄ conversion, and CO₂ conversion outputs, respectively.
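The first stage, outlier removal, rests on DBSCAN's noise criterion. The sketch below keeps only that flagging step, which is a simplification: full DBSCAN also expands clusters via density reachability. The 2-D data points are made up for illustration:

```python
import numpy as np

def dbscan_noise_mask(points, eps, min_samples):
    """Simplified DBSCAN-style noise test: a point with fewer than
    min_samples neighbours (itself included) within radius eps is
    flagged as noise. Cluster expansion is omitted in this sketch."""
    points = np.asarray(points, dtype=float)
    # Pairwise Euclidean distance matrix via broadcasting.
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    neighbour_counts = (dist <= eps).sum(axis=1)
    return neighbour_counts < min_samples   # True = treat as outlier

# Dense cluster of experimental points plus one far outlier.
data = np.array([[1.0, 1.0], [1.1, 0.9], [0.9, 1.1], [1.0, 0.9],
                 [9.0, 9.0]])
noise = dbscan_noise_mask(data, eps=0.5, min_samples=3)
clean = data[~noise]                        # error-free dataset for RF/DNN
```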

Keywords: artificial intelligence, dry reforming of methane, artificial neural network, deep learning, machine learning, transfer learning, greedy layer-wise pretraining

Procedia PDF Downloads 82
2364 Short Term Distribution Load Forecasting Using Wavelet Transform and Artificial Neural Networks

Authors: S. Neelima, P. S. Subramanyam

Abstract:

The major tool for distribution planning is load forecasting, the anticipation of the load in advance. Artificial neural networks have found wide application in load forecasting as an efficient strategy for planning and management. This paper explores the application of neural networks to the design of short-term load forecasting (STLF) systems and presents a pragmatic methodology using a proposed two-stage model combining the wavelet transform (WT) and an artificial neural network (ANN). In the first stage, the input data are decomposed by the wavelet transform; in the second stage, the decomposed data, together with another input, are used to train a separate neural network to forecast the load, which is finally obtained by reconstruction of the decomposed data. The hybrid model has been trained and validated using load data from the Telangana State Electricity Board.
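The two-stage idea, decompose, forecast each subseries, then reconstruct, relies on an invertible wavelet transform. The paper does not name the wavelet used; the sketch below uses a one-level Haar DWT, whose perfect-reconstruction property is what makes the final recombination step valid:

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar DWT: split an even-length series into an
    approximation (low-frequency) and a detail (high-frequency)
    subseries, each half the original length."""
    x = np.asarray(x, dtype=float)
    pairs = x.reshape(-1, 2)
    approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
    return approx, detail

def haar_reconstruct(approx, detail):
    """Invert the one-level Haar DWT, as done after forecasting each
    subseries separately."""
    out = np.empty(2 * len(approx))
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

# Illustrative hourly load values (MW); not the Telangana data.
load = np.array([310.0, 305.0, 330.0, 420.0, 415.0, 390.0, 340.0, 320.0])
a, d = haar_decompose(load)       # each subseries would feed its own ANN
rebuilt = haar_reconstruct(a, d)  # recombining forecasts inverts exactly
```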

Keywords: electrical distribution systems, wavelet transform (WT), short term load forecasting (STLF), artificial neural network (ANN)

Procedia PDF Downloads 431
2363 Infusion Pump Historical Development, Measurement and Parts of Infusion Pump

Authors: Samuel Asrat

Abstract:

Infusion pumps have become indispensable tools in modern healthcare, allowing for precise and controlled delivery of fluids, medications, and nutrients to patients. This paper provides an overview of the historical development, measurement, and parts of infusion pumps. The historical development of infusion pumps can be traced back to the early 1960s when the first rudimentary models were introduced. These early pumps were large, cumbersome, and often unreliable. However, advancements in technology and engineering over the years have led to the development of smaller, more accurate, and user-friendly infusion pumps. Measurement of infusion pumps involves assessing various parameters such as flow rate, volume delivered, and infusion duration. Flow rate, typically measured in milliliters per hour (mL/hr), is a critical parameter that determines the rate at which fluids or medications are delivered to the patient. Accurate measurement of flow rate is essential to ensure the proper administration of therapy and prevent adverse effects. Infusion pumps consist of several key parts, including the pump mechanism, fluid reservoir, tubing, and control interface. The pump mechanism is responsible for generating the necessary pressure to push fluids through the tubing and into the patient's bloodstream. The fluid reservoir holds the medication or solution to be infused, while the tubing serves as the conduit through which the fluid travels from the reservoir to the patient. The control interface allows healthcare providers to program and adjust the infusion parameters, such as flow rate and volume. In conclusion, infusion pumps have evolved significantly since their inception, offering healthcare providers unprecedented control and precision in delivering fluids and medications to patients. Understanding the historical development, measurement, and parts of infusion pumps is essential for ensuring their safe and effective use in clinical practice.

Keywords: dip, ip, sp, is

Procedia PDF Downloads 55
2362 Audit of TPS photon beam dataset for small field output factors using OSLDs against RPC standard dataset

Authors: Asad Yousuf

Abstract:

Purpose: The aim of the present study was to audit a treatment planning system (TPS) beam dataset for small-field output factors against the standard dataset produced by the Radiological Physics Center (RPC) from a multicenter study. Such data are crucial for the validity of special techniques, i.e., IMRT or stereotactic radiosurgery. Materials/Methods: Multiple small-field output factor datasets were measured and calculated for 6 to 18 MV x-ray beams using the RPC-recommended methods. These beam datasets were measured at 10 cm depth for 10 × 10 cm² to 2 × 2 cm² field sizes, defined by collimator jaws at 100 cm. The measurements were made with Landauer nanoDot OSLDs, whose volume is small enough to gather a full ionization reading even for a 1 × 1 cm² field size. At our institute, the beam data, including output factors, were commissioned at 5 cm depth with an SAD setup. For comparison with the RPC data, the output factors were converted to an SSD setup using tissue-phantom ratios; the SSD setup also ensures coverage of the ion chamber in the 2 × 2 cm² field size. The measured output factors were also compared with those calculated by the Eclipse™ treatment planning software. Results: The measured and calculated output factors agree with the RPC dataset within 1% and 4%, respectively. The larger discrepancies in the TPS reflect the increased challenge of converting measured data into a commissioned beam model for very small fields. Conclusion: OSLDs are a simple, durable, and accurate tool to verify doses delivered using small photon beam fields down to a 1 × 1 cm² field size. The study emphasizes that the treatment planning system should always be evaluated for small-field output factors to ensure accurate dose delivery in the clinical setting.
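An output factor is the detector reading for the test field normalised to the reference-field reading, and the audit statistic is a percent difference. A minimal sketch with hypothetical readings (not the measured data of this study):

```python
def output_factor(reading_field, reading_reference):
    """Small-field output factor: detector reading for the test field
    normalised to the 10 x 10 cm2 reference-field reading."""
    return reading_field / reading_reference

def percent_difference(measured, reference):
    """Agreement metric used when auditing against a standard dataset."""
    return 100.0 * (measured - reference) / reference

# Hypothetical nanoDot readings in arbitrary units.
of_2x2 = output_factor(0.912, 1.000)          # 2 x 2 cm2 vs reference
dev = percent_difference(of_2x2, 0.905)       # vs a hypothetical RPC value
```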

Keywords: small field dosimetry, optically stimulated luminescence, audit treatment, radiological physics center

Procedia PDF Downloads 320
2361 Clinical Validation of C-PDR Methodology for Accurate Non-Invasive Detection of Helicobacter pylori Infection

Authors: Suman Som, Abhijit Maity, Sunil B. Daschakraborty, Sujit Chaudhuri, Manik Pradhan

Abstract:

Background: Helicobacter pylori is a common and important human pathogen and the primary cause of peptic ulcer disease and gastric cancer. Currently, H. pylori infection is detected both invasively and non-invasively, but the diagnostic accuracy is not satisfactory. Aim: To set up an optimal diagnostic cut-off value of the 13C-Urea Breath Test (13C-UBT) to detect H. pylori infection, and to evaluate a novel c-PDR methodology to overcome the inconclusive grey zone. Materials and Methods: All 83 subjects first underwent upper-gastrointestinal endoscopy followed by rapid urease test and histopathology; depending on these results, we classified 49 subjects as H. pylori positive and 34 as negative. After an overnight fast, patients were given 4 g of citric acid in 200 ml of water, and 10 minutes after ingestion of the test meal, a baseline exhaled breath sample was collected. Thereafter, an oral dose of 75 mg of 13C-Urea dissolved in 50 ml of water was given, and breath samples were collected up to 90 minutes at 15-minute intervals and analysed by laser-based, high-precision cavity-enhanced spectroscopy. Results: We studied the excretion kinetics of 13C isotope enrichment (expressed as δDOB13C ‰) in the exhaled breath samples and found maximum enrichment around 30 minutes for H. pylori positive patients; this is due to acid-mediated stimulation of urease enzyme activity, with maximum acidification occurring within 30 minutes. No such significant isotopic enrichment was observed for H. pylori negative individuals. Using a Receiver Operating Characteristic (ROC) curve, an optimal diagnostic cut-off value of δDOB13C ‰ = 3.14 was determined at 30 minutes, exhibiting 89.16% accuracy. To overcome the grey-zone problem, we explored the percentage dose of 13C recovered per hour, i.e. 13C-PDR (%/hr), and the cumulative percentage dose of 13C recovered, i.e. c-PDR (%), in exhaled breath samples for the present 13C-UBT.
We further explored the diagnostic accuracy of the 13C-UBT by constructing a ROC curve using c-PDR (%) values, and an optimal cut-off value was estimated to be c-PDR = 1.47 (%) at 60 minutes, exhibiting 100% diagnostic sensitivity, 100% specificity, and 100% accuracy of the 13C-UBT for detection of H. pylori infection. We also elucidated the gastric emptying process of the present 13C-UBT for H. pylori positive patients: the maximal emptying rate was found at 36 minutes, and the half-emptying time at 45 minutes. Conclusions: The present study demonstrates the importance of the c-PDR methodology in overcoming the grey-zone problem in the 13C-UBT for accurate determination of infection without risk of diagnostic error, making it a sufficiently robust and novel method for accurate and fast non-invasive diagnosis of H. pylori infection for large-scale screening purposes.
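Picking an optimal diagnostic cut-off from a ROC curve, as done above for δDOB13C and c-PDR, is commonly formulated as maximizing Youden's J = sensitivity + specificity − 1 over candidate thresholds. A minimal sketch on synthetic values follows; the numbers are not the study's data, and the study's own cut-off criterion is not stated in the abstract.

```python
# Sketch: choosing a diagnostic cut-off by maximizing Youden's
# J = sensitivity + specificity - 1 over candidate thresholds,
# one common way to pick the "optimal" point on a ROC curve.
# The values below are synthetic, not the study's measurements.

def best_cutoff(values, labels):
    """Return (cutoff, J) maximizing Youden's J; label 1 = infected,
    test positive if value >= cutoff."""
    best = (None, -1.0)
    pos = sum(labels)
    neg = len(labels) - pos
    for c in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if y == 1 and v >= c)
        tn = sum(1 for v, y in zip(values, labels) if y == 0 and v < c)
        j = tp / pos + tn / neg - 1.0
        if j > best[1]:
            best = (c, j)
    return best

# Synthetic c-PDR-like values: positives cluster high, negatives low.
vals = [0.2, 0.4, 0.6, 0.9, 1.5, 1.8, 2.1, 2.6]
labs = [0,   0,   0,   0,   1,   1,   1,   1]
cut, j = best_cutoff(vals, labs)
print(cut, j)  # 1.5 1.0 (perfect separation on this toy data)
```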

Keywords: 13C-Urea breath test, c-PDR methodology, grey zone, Helicobacter pylori

Procedia PDF Downloads 300
2360 Sea Level Characteristics Referenced to Specific Geodetic Datum in Alexandria, Egypt

Authors: Ahmed M. Khedr, Saad M. Abdelrahman, Kareem M. Tonbol

Abstract:

Two geo-referenced sea level datasets (September 2008 – November 2010 and April 2012 – January 2014) were recorded at Alexandria Western Harbour (AWH). An accurate re-definition of the tidal datum, referred to the latest International Terrestrial Reference Frame (ITRF-2014), was discussed and updated to improve our understanding of the old predefined tidal datum at Alexandria. Tidal and non-tidal components of sea level were separated using the Delft-3D hydrodynamic model tide suite (Delft-3D, 2015). Tidal characteristics at AWH were investigated, and harmonic analysis yielded the 34 most significant constituents with their amplitudes and phases. The tide was identified as semi-diurnal, as indicated by "Form Factor" values of 0.24 and 0.25 for the two datasets, respectively. Principal tidal datums related to major tidal phenomena were recalculated with reference to a meaningful geodetic height datum. The portion of residual (surge) energy out of the total sea level energy was computed for each dataset and found to be 77% and 72%, respectively. Power spectral density (PSD) showed accurate resolvability in the high band (1–6 cycles/day) for the nominated independent constituents, except for some neighbouring constituents that are too close in frequency. Wind and atmospheric pressure data recorded during the sea level measurements were analysed and cross-correlated with the surge signals, and a moderate association between surge and both wind and atmospheric pressure was obtained. In addition, the long-term sea level rise trend at AWH was computed and showed good agreement with earlier estimated rates.
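The "Form Factor" used above is conventionally F = (K1 + O1) / (M2 + S2), the ratio of the main diurnal to the main semi-diurnal constituent amplitudes. A minimal sketch follows; the amplitudes are illustrative, chosen only so that F lands near the paper's reported 0.24.

```python
# Sketch: the tidal Form Factor F = (K1 + O1) / (M2 + S2) computed from
# harmonic-constituent amplitudes, with the conventional classification bands.
# Amplitudes below are illustrative, not the AWH harmonic-analysis results.

def form_factor(amp: dict) -> float:
    """Ratio of main diurnal to main semi-diurnal amplitudes."""
    return (amp["K1"] + amp["O1"]) / (amp["M2"] + amp["S2"])

def tidal_regime(f: float) -> str:
    """Conventional classification bands for the form factor."""
    if f < 0.25:
        return "semi-diurnal"
    if f < 1.5:
        return "mixed, mainly semi-diurnal"
    if f < 3.0:
        return "mixed, mainly diurnal"
    return "diurnal"

amps = {"M2": 0.070, "S2": 0.030, "K1": 0.015, "O1": 0.009}  # metres (illustrative)
f = form_factor(amps)
print(round(f, 2), tidal_regime(f))  # 0.24 semi-diurnal
```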

Keywords: Alexandria, Delft-3D, Egypt, geodetic reference, harmonic analysis, sea level

Procedia PDF Downloads 162
2359 Design and Simulation of an Inter-Satellite Optical Wireless Communication System Using Diversity Techniques

Authors: Sridhar Rapuru, D. Mallikarjunreddy, Rajanarendra Sai

Abstract:

In this era of the internet, users need access to any multimedia file at any time with superior quality. To achieve this goal, it is very important to have a good network, without interruptions, between the satellites and the various earth stations. For that purpose, a high-speed inter-satellite optical wireless communication (IsOWC) system is designed with space and polarization diversity techniques. IsOWC offers high bandwidth, small size, and low power requirements, and is affordable compared with present microwave satellite systems. To improve efficiency and reduce propagation delay, an inter-satellite link is established between the satellites. Highly accurate tracking systems are required to establish a reliable connection between the satellites, as each has its own orbit. The main disadvantage of the IsOWC system is that the laser beam width is narrower than that of RF, so a highly accurate tracking system is needed to meet this requirement. A satellite uses 'ephemerides data' for rough pointing and a tracking system for fine pointing to the other satellite. In the proposed IsOWC system, laser light is used as the wireless link between source and destination, and free space acts as the channel carrying the message. The proposed system is designed, simulated, and analyzed for a 6000 km link, with an improvement in data rate over previously existing systems. The performance parameters of the system are the Q-factor, eye opening, bit error rate, etc. The proposed inter-satellite optical wireless communication system design using diversity techniques has broad scope for application in future-generation communication.
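The Q-factor and bit error rate named as performance parameters are linked by the standard relation BER = ½ · erfc(Q/√2), which is how simulation tools typically convert an eye-diagram Q into a BER. A minimal sketch:

```python
import math

# Sketch: the standard Q-factor to bit-error-rate relation for an optical
# link, BER = 0.5 * erfc(Q / sqrt(2)). The Q values below are illustrative,
# not results from the proposed IsOWC design.

def ber_from_q(q: float) -> float:
    return 0.5 * math.erfc(q / math.sqrt(2.0))

print(ber_from_q(6.0))  # ~1e-9, a commonly quoted acceptability threshold
print(ber_from_q(7.0))  # ~1e-12, i.e. a higher Q gives a sharply lower BER
```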

Keywords: inter-satellite optical wireless system, space and polarization diversity techniques, line of sight, bit error rate, Q-factor

Procedia PDF Downloads 263
2358 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error

Authors: Oscar Javier Herrera, Manuel Angel Camacho

Abstract:

This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, applicable when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology began with characterization and diagnostics of the demand-planning process as part of production management; new ways to predict demand through probability techniques, and to calculate the associated error using numerical methods, were then investigated, all based on the behavior of the data. The analysis was carried out considering the specific business circumstances of a company in the communications sector located in the city of Bogota, Colombia. In conclusion, using this application it was possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase the client satisfaction rate, reduce stock that had long been out of rotation, code its inventory, and plan reorder points for the replenishment of stock.
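The propagation-of-errors technique named above combines input uncertainties via the first-order rule σ_f² = Σᵢ (∂f/∂xᵢ)² σᵢ² for independent inputs. A minimal numeric sketch follows; the demand function (base demand × seasonal factor) and the uncertainty values are illustrative assumptions, not the paper's model.

```python
import math

# Sketch: first-order propagation of error for f(x1..xn) with independent
# inputs: sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2. Partial derivatives are
# estimated numerically; the demand function below is illustrative only.

def propagated_error(f, x, sigmas, h=1e-6):
    """Numeric first-order error propagation for independent inputs."""
    var = 0.0
    for i, s in enumerate(sigmas):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        dfdx = (f(xp) - f(xm)) / (2.0 * h)   # central-difference partial
        var += (dfdx * s) ** 2
    return math.sqrt(var)

# Example: forecast = base demand * seasonal factor (hypothetical model).
f = lambda v: v[0] * v[1]
err = propagated_error(f, [100.0, 1.2], [5.0, 0.1])
print(err)  # sqrt((1.2*5)^2 + (100*0.1)^2) = sqrt(136) ~ 11.66
```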

Keywords: demand forecasting, empirical distribution, propagation of error, Bogota

Procedia PDF Downloads 623
2357 The Dynamic Nexus of Public Health and Journalism in Informed Societies

Authors: Ali Raza

Abstract:

The dynamic landscape of communication has brought about significant advancements that intersect with the realms of public health and journalism. This abstract explores the evolving synergy between these fields, highlighting how their intersection has contributed to informed societies and improved public health outcomes. In the digital age, communication plays a pivotal role in shaping public perception, policy formulation, and collective action. Public health, concerned with safeguarding and improving community well-being, relies on effective communication to disseminate information, encourage healthy behaviors, and mitigate health risks. Simultaneously, journalism, with its commitment to accurate and timely reporting, serves as the conduit through which health information reaches the masses. Advancements in communication technologies have revolutionized the ways in which public health information is both generated and shared. The advent of social media platforms, mobile applications, and online forums has democratized the dissemination of health-related news and insights. This democratization, however, brings challenges, such as the rapid spread of misinformation and the need for nuanced strategies to engage diverse audiences. Effective collaboration between public health professionals and journalists is pivotal in countering these challenges, ensuring that accurate information prevails. The synergy between public health and journalism is most evident during public health crises. The COVID-19 pandemic underscored the pivotal role of journalism in providing accurate and up-to-date information to the public. However, it also highlighted the importance of responsible reporting, as sensationalism and misinformation could exacerbate the crisis. Collaborative efforts between public health experts and journalists led to the amplification of preventive measures, the debunking of myths, and the promotion of evidence-based interventions. 
Moreover, the accessibility of information in the digital era necessitates a strategic approach to health communication. Behavioral economics and data analytics offer insights into human decision-making and allow tailored health messages to resonate more effectively with specific audiences. This approach, when integrated into journalism, enables the crafting of narratives that not only inform but also influence positive health behaviors. Ethical considerations emerge prominently in this alliance. The responsibility to balance the public's right to know with the potential consequences of sensational reporting underscores the significance of ethical journalism. Health journalists must meticulously source information from reputable experts and institutions to maintain credibility, thus fortifying the bridge between public health and the public. As both public health and journalism undergo transformative shifts, fostering collaboration between these domains becomes essential. Training programs that familiarize journalists with public health concepts and practices can enhance their capacity to report accurately and comprehensively on health issues. Likewise, public health professionals can gain insights into effective communication strategies from seasoned journalists, ensuring that health information reaches a wider audience. In conclusion, the convergence of public health and journalism, facilitated by communication advancements, is a cornerstone of informed societies. Effective communication strategies, driven by collaboration, ensure the accurate dissemination of health information and foster positive behavior change. As the world navigates complex health challenges, the continued evolution of this synergy holds the promise of healthier communities and a more engaged and educated public.

Keywords: public awareness, journalism ethics, health promotion, media influence, health literacy

Procedia PDF Downloads 66
2356 A Mobile Application for Analyzing and Forecasting Crime Using Autoregressive Integrated Moving Average with Artificial Neural Network

Authors: Gajaanuja Megalathan, Banuka Athuraliya

Abstract:

Crime is one of our society's most intimidating and threatening challenges. With the majority of the population residing in cities, many experts, along with data provided by local authorities, suggest a rapid increase in the number of crimes committed in these cities in recent years; crime rates have been on an upward trend. People living in Sri Lanka have the right to know the current crime rates, and the likely future crime rates, of the places where they live. Due to the current economic crisis, crime rates have spiked, with many thefts and murders recorded within the last 6-10 months. Although there are many sources of such information, there is no solid way of searching for and assessing the safety of a place. For all these reasons, the public needs a way to feel safe when introduced to new places. Through this research, the authors aim to develop a mobile application as a solution to this problem. It mainly targets tourists, and people who have recently relocated will also benefit from the application. Moreover, the ARIMA model combined with an ANN is used to predict crime rates. From past researchers' work, it is evident that the ARIMA model has not previously been combined with artificial neural networks to forecast crime.
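The ARIMA-plus-ANN combination typically follows a hybrid decomposition: a linear autoregressive model captures the trend, and a small neural network is fitted to its residuals. A deliberately minimal stdlib sketch of that idea follows, in which an AR(1) fit stands in for ARIMA and a single tanh neuron stands in for the ANN; the crime-count series is synthetic, and a real pipeline would use libraries such as statsmodels and a proper network.

```python
import math

# Sketch of the hybrid decomposition: series = linear AR part + nonlinear
# residual part. AR(1) stands in for ARIMA; one tanh neuron stands in for
# the ANN. The series below is synthetic, not crime data.

def fit_ar1(y):
    """Least-squares AR(1): y[t] ~ a + b * y[t-1]."""
    x, z = y[:-1], y[1:]
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    b = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z)) / \
        sum((xi - mx) ** 2 for xi in x)
    return mz - b * mx, b

def fit_residual_neuron(r, epochs=2000, lr=0.05):
    """One tanh neuron on the lagged residual: r[t] ~ w2 * tanh(w1 * r[t-1])."""
    w1 = w2 = 0.1
    for _ in range(epochs):
        for rp, rt in zip(r[:-1], r[1:]):
            h = math.tanh(w1 * rp)
            e = w2 * h - rt
            w2 -= lr * e * h
            w1 -= lr * e * w2 * (1 - h * h) * rp
    return w1, w2

y = [10.2, 10.8, 11.1, 11.9, 12.2, 12.8, 13.1, 13.9, 14.2, 14.8]  # synthetic
a, b = fit_ar1(y)
resid = [zt - (a + b * xt) for xt, zt in zip(y[:-1], y[1:])]
w1, w2 = fit_residual_neuron(resid)
forecast = a + b * y[-1] + w2 * math.tanh(w1 * resid[-1])  # linear + nonlinear
print(round(forecast, 2))
```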

Keywords: arima model, ANN, crime prediction, data analysis

Procedia PDF Downloads 122
2355 Predictive Maintenance of Electrical Induction Motors Using Machine Learning

Authors: Muhammad Bilal, Adil Ahmed

Abstract:

This study proposes an approach to the predictive maintenance of electrical induction motors utilizing machine learning algorithms. On the basis of temperature data obtained from sensors mounted on the motor, the goal is to predict motor failures. The proposed models are trained to identify whether a motor is defective or not, utilizing machine learning algorithms such as Support Vector Machines (SVM) and K-Nearest Neighbors (KNN). According to a thorough review of the literature, earlier research has used motor current signature analysis (MCSA) and vibration data to forecast motor failures. The temperature-signal methodology, which has clear advantages over the conventional MCSA and vibration analysis methods in terms of cost-effectiveness, is the main subject of this research. The results obtained emphasize the applicability and effectiveness of the temperature-based predictive maintenance strategy by demonstrating the successful classification of defective motors using the suggested machine learning models.
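Of the two classifiers named, KNN is easy to show end to end: label a motor by the majority class of its nearest neighbours in feature space. A minimal stdlib sketch follows; the (mean, peak) temperature features, the training points, and the k = 3 choice are all illustrative assumptions, not the study's data or configuration.

```python
import math
from collections import Counter

# Sketch: k-nearest-neighbours classification of motors as healthy/faulty
# from temperature features (mean_temp_C, peak_temp_C). Training points and
# k=3 are illustrative, not data from the study.

def knn_predict(train, labels, point, k=3):
    """Majority label among the k nearest training points (Euclidean)."""
    dists = sorted((math.dist(point, t), y) for t, y in zip(train, labels))
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

X = [(52, 60), (55, 63), (54, 61), (78, 95), (82, 99), (80, 97)]
y = ["healthy", "healthy", "healthy", "faulty", "faulty", "faulty"]
print(knn_predict(X, y, (53, 62)))  # healthy
print(knn_predict(X, y, (79, 96)))  # faulty
```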

Keywords: predictive maintenance, electrical induction motors, machine learning, temperature signal methodology, motor failures

Procedia PDF Downloads 109
2354 Critical Appraisal of Different Drought Indices for Drought Prediction and Their Application in KBK Districts of Odisha

Authors: Bibhuti Bhusan Sahoo, Ramakar Jha

Abstract:

Mapping extreme events (droughts) is one adaptation strategy to the consequences of increasing climatic variability and climate change. There is no operational practice for forecasting drought; one suggestion is to update the mapping of drought-prone areas for developmental planning. Drought indices play a significant role in drought mitigation. Many scientists have worked on statistical analyses of droughts and other climatological hazards. Many researchers have studied droughts individually for different sub-divisions or for India as a whole, but very few have studied district-wise probabilities on a large scale. In the present study, district-wise drought probabilities over the KBK (Kalahandi-Balangir-Koraput) districts of Odisha, India, which are seriously prone to droughts, have been established using a hydrological drought index and a meteorological drought index, along with remote sensing drought indices, to develop a multidirectional approach to drought mitigation. Moderate and severe drought probabilities for the KBK districts have been mapped, and regions belonging to different class intervals of drought probability have been demarcated. Such information would be a useful tool for planning purposes and for input to modelling, through which more promising results can be achieved.

Keywords: drought indices, KBK districts, proposed drought severity index, SPI

Procedia PDF Downloads 441
2353 Analysis of Photic Zone’s Summer Period-Dissolved Oxygen and Temperature as an Early Warning System of Fish Mass Mortality in Sampaloc Lake in San Pablo, Laguna

Authors: Al Romano, Jeryl C. Hije, Mechaela Marie O. Tabiolo

Abstract:

The decline in water quality is a major factor in aquatic disease outbreaks and can lead to significant mortality among aquatic organisms. Understanding the relationship between dissolved oxygen (DO) and water temperature is crucial, as these variables directly impact the health, behavior, and survival of fish populations. This study investigated how DO levels, water temperature, and atmospheric temperature interact in Sampaloc Lake in order to assess the risk of fish mortality. By employing a combination of linear regression models and machine learning techniques, the researchers developed predictive models to forecast DO concentrations at various depths. The results indicate that while DO levels generally decrease with depth, the predicted concentrations are sufficient to support the survival of common fish species in Sampaloc Lake during March, April, and May 2025.
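The linear-regression component of such a model reduces to fitting DO against depth and extrapolating. A minimal least-squares sketch follows; the DO profile below is illustrative, not lake data from the study.

```python
# Sketch: simple linear regression of dissolved oxygen on depth, the kind of
# model the study combines with machine-learning techniques. The DO profile
# below is illustrative, not Sampaloc Lake data.

def linfit(x, y):
    """Ordinary least squares for y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

depth_m = [0, 1, 2, 3, 4, 5]
do_mgl  = [7.8, 7.2, 6.7, 6.1, 5.6, 5.0]   # DO falling with depth
a, b = linfit(depth_m, do_mgl)
predicted = a + b * 6                      # extrapolate to 6 m depth
print(round(b, 3), round(predicted, 2))    # slope ~ -0.554 mg/L per m; ~4.46 mg/L at 6 m
```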

Keywords: aquaculture, dissolved oxygen, water temperature, regression analysis, machine learning, fish mass mortality, early warning system

Procedia PDF Downloads 23
2352 Dual Electrochemical Immunosensor for IL-13Rα2 and E-Cadherin Determination in Cell, Serum and Tissues from Cancer Patients

Authors: Amira ben Hassine, A. Valverde, V. Serafín, C. Muñoz-San Martín, M. Garranzo-Asensio, M. Gamella, R. Barderas, M. Pedrero, N. Raouafi, S. Campuzano, P. Yáñez-Sedeño, J. M. Pingarrón

Abstract:

This work describes the development of a dual electrochemical immunosensing platform for the accurate determination of two target proteins, IL-13 Receptor α2 (IL-13Rα2) and E-cadherin (E-cad). The proposed methodology is based on sandwich immunosensing approaches (involving horseradish peroxidase-labeled detector antibodies) implemented onto magnetic microbeads (MBs) and amperometric transduction at screen-printed dual carbon electrodes (SPdCEs). The magnetic bioconjugates were captured onto SPdCEs, and the amperometric transduction was performed using the H2O2/hydroquinone (HQ) system. Under optimal experimental conditions, the developed bioplatform demonstrates linear concentration ranges of 1.0–25 and 5.0–100 ng mL-1 and detection limits of 0.28 and 1.04 ng mL-1 for E-cad and IL-13Rα2, respectively, with excellent selectivity against other non-target proteins. The developed immuno-platform also offers good reproducibility among the amperometric responses provided by nine different sensors constructed in the same manner (relative standard deviation values of 3.1% for E-cad and 4.3% for IL-13Rα2). Moreover, the results obtained confirm the practical applicability of this bioplatform for the accurate determination of the endogenous levels of both extracellular receptors in colon cancer cells (both intact and lysed) with different metastatic potential, and in serum and tissues from patients diagnosed with colorectal cancer at different grades. Interesting features in terms of simplicity, speed, portability, and the amount of sample required to provide quantitative results make this immuno-platform more compatible than conventional methodologies with clinical diagnosis and prognosis at the point of care.
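Detection limits like those quoted above are commonly estimated from a linear calibration as LOD = 3 · s_blank / slope. A minimal sketch of that convention follows; the calibration points and blank readings are illustrative, not the sensor's data, and the authors' exact LOD criterion is not stated in the abstract.

```python
import statistics

# Sketch: estimating a detection limit from a linear calibration using the
# common LOD = 3 * s_blank / slope convention. Calibration points and blank
# readings below are illustrative, not the immunosensor's data.

def slope(x, y):
    """Least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

conc  = [1.0, 5.0, 10.0, 15.0, 25.0]       # ng/mL standards
curr  = [0.9, 4.8, 10.1, 15.2, 24.9]       # nA, roughly linear response
blank = [0.10, 0.12, 0.09, 0.11, 0.10]     # nA, repeated blank measurements

m = slope(conc, curr)
lod = 3 * statistics.stdev(blank) / m
print(round(m, 3), round(lod, 3))          # slope ~1; LOD well below 1 ng/mL
```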

Keywords: electrochemistry, immunosensors, biosensors, E-cadherin, IL-13 receptor α2, colorectal cancer

Procedia PDF Downloads 132
2351 Heliport Remote Safeguard System Based on Real-Time Stereovision 3D Reconstruction Algorithm

Authors: Ł. Morawiński, C. Jasiński, M. Jurkiewicz, S. Bou Habib, M. Bondyra

Abstract:

With the development of optics, electronics, and computers, vision systems are increasingly used in various areas of life, science, and industry. Vision systems have a huge number of applications: they can be used in quality control, object detection, data reading (e.g., QR codes), etc. A large part of them are used for measurement purposes, and some make it possible to obtain a 3D reconstruction of the tested objects or measurement areas. 3D reconstruction algorithms are mostly based on creating depth maps from data that can be acquired by active or passive methods. For this specific application in airfield technology, only passive methods are applicable, because other systems operating on the site can be blinded at most spectral levels. Furthermore, the reconstruction is required to work over long distances, ranging from hundreds of meters to tens of kilometers, with low loss of accuracy even in harsh conditions such as fog, rain, or snow. In response to those requirements, HRESS (Heliport REmote Safeguard System) was developed, whose main part is a rotational head with a two-camera stereovision rig gathering images 360 degrees around the head, along with stereovision 3D reconstruction and point cloud combination. The sub-pixel analysis introduced in the HRESS system makes it possible to obtain increased distance measurement resolution, with accuracy of about 3% for distances over one kilometer. Ultimately, this leads to more accurate and reliable measurement data in the form of a point cloud. Moreover, the program algorithm introduces operations enabling the filtering of erroneously collected data from the point cloud. All activities on the programming, mechanical, and optical sides are aimed at obtaining the most accurate 3D reconstruction of the environment in the measurement area.
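Why sub-pixel disparity matters at kilometre range follows directly from the stereo depth relation Z = f · B / d: at long range the disparity is only a few pixels, so a fraction of a pixel of matching error translates into a large depth error. A minimal sketch, with focal length, baseline, and disparity values chosen for illustration (they are not HRESS parameters):

```python
# Sketch: depth from stereo disparity, Z = f * B / d, and the effect of
# sub-pixel disparity resolution on range accuracy. Focal length, baseline,
# and disparity values below are illustrative, not HRESS parameters.

def depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

f_px, base = 8000.0, 0.5      # hypothetical long-focal rig, 0.5 m baseline
d = 4.0                       # pixels of disparity for a target near 1 km
z = depth_m(f_px, base, d)

# Depth error from a 1 px matching error vs a 0.05 px (sub-pixel) error:
err_whole = abs(depth_m(f_px, base, d - 1.0) - z)
err_sub   = abs(depth_m(f_px, base, d - 0.05) - z)
print(z, round(err_whole), round(err_sub))  # sub-pixel matching shrinks range error
```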

Keywords: airfield monitoring, artificial intelligence, stereovision, 3D reconstruction

Procedia PDF Downloads 116
2350 Quantitative Evaluation of Mitral Regurgitation by Using Color Doppler Ultrasound

Authors: Shang-Yu Chiang, Yu-Shan Tsai, Shih-Hsien Sung, Chung-Ming Lo

Abstract:

Mitral regurgitation (MR) is a heart disorder in which the mitral valve does not close properly when the heart pumps out blood. MR is the most common form of valvular heart disease in the adult population. The diagnostic echocardiographic finding of MR is straightforward due to well-known clinical evidence. In determining MR severity, quantification of sonographic findings would be useful for clinical decision making. Clinically, the vena contracta is a standard for MR evaluation: the vena contracta is the point in a blood stream where the diameter of the stream is least and the velocity is maximum. Its quantification, i.e. the vena contracta width (VCW) at the mitral valve, can provide a numeric measurement for severity assessment. However, manually delineating the VCW may not be accurate enough; the result highly depends on operator experience. Therefore, this study proposed an automatic method to quantify the VCW to evaluate MR severity. In color Doppler ultrasound, blood flowing toward the probe appears as a red or yellow area whose brightness represents the flow rate, so the VCW can be observed directly. In the experiment, colors were first transformed into HSV (hue, saturation, and value) to align closely with the way human vision perceives red and yellow. By fitting an ellipse to the high-flow-rate area in the left atrium, the angle between the mitral valve and the ultrasound probe was calculated to obtain the vertical shortest diameter as the VCW. Taking the manual measurement as the standard, the method achieved differences of only 0.02 cm (0.38 vs. 0.36) to 0.03 cm (0.42 vs. 0.45). The results showed that the proposed automatic VCW extraction can be efficient and accurate for clinical use. The process also has the potential to reduce intra- and inter-observer variability when measuring subtle distances.
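The first step described, an RGB-to-HSV transform to isolate bright red-to-yellow (toward-probe, high-flow) pixels, can be sketched with the standard library. The hue band and value/saturation thresholds below are illustrative choices, not the study's parameters.

```python
import colorsys

# Sketch: flagging high-flow pixels in a colour-Doppler frame by moving to
# HSV, where the red-to-yellow jet colours map to low hue and brightness maps
# to value. Hue band and thresholds are illustrative, not the study's values.

def is_flow_pixel(r: int, g: int, b: int, hue_max=0.17, val_min=0.4) -> bool:
    """True for bright red-to-yellow pixels (hue 0..~60 deg, high value)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h <= hue_max and v >= val_min and s > 0.3

print(is_flow_pixel(220, 40, 30))    # bright red jet -> True
print(is_flow_pixel(230, 200, 40))   # yellow (higher velocity) -> True
print(is_flow_pixel(20, 20, 120))    # blue (flow away from probe) -> False
```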

Keywords: mitral regurgitation, vena contracta, color doppler, image processing

Procedia PDF Downloads 366