Search results for: time estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19511

17651 Energy Analysis of Seasonal Air Conditioning Demand of All Income Classes Using Bottom up Model in Pakistan

Authors: Saba Arif, Anam Nadeem, Roman Kalvin, Tanzeel Rashid, Burhan Ali, Juntakan Taweekun

Abstract:

Currently, the energy crisis is receiving serious attention. Globally, industry and buildings are the major consumers of energy: 72% of total global energy is consumed by residential houses, markets, and commercial buildings. Among appliances, air conditioners are the major consumers of electricity; about 60% of household energy is used for cooling by HVAC units. Energy demand analysis aids in determining what changes will be needed, whether it is estimating the energy required by households or instituting conservation measures. The bottom-up model is one of the best-known forecasting methods. In the current research, a bottom-up model of air conditioners' energy consumption across all income classes is calculated, accounting for seasonal variation and hourly consumption. By comparing the air-conditioning energy consumption of all income classes, total actual demand can be set against current availability.
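
The bottom-up aggregation described above can be sketched in a few lines: per-class appliance stock times usage hours, summed over classes and seasons. The income classes, ownership rates, power ratings, and usage hours below are hypothetical placeholders, not the survey figures used in the study.

```python
# Minimal bottom-up sketch of seasonal air-conditioning demand by income class.
# All stock, ownership, power, and usage figures are hypothetical placeholders.

income_classes = {
    # class: (households, AC ownership rate, avg. AC power in kW)
    "low":    (4_000_000, 0.10, 1.0),
    "middle": (3_000_000, 0.45, 1.5),
    "high":   (1_000_000, 0.85, 2.0),
}

season_hours = {"summer": 8.0, "spring_autumn": 3.0, "winter": 0.5}  # hours/day
season_days = {"summer": 150, "spring_autumn": 120, "winter": 95}

def seasonal_demand_gwh(cls):
    households, ownership, power_kw = income_classes[cls]
    units = households * ownership
    return {
        season: units * power_kw * season_hours[season] * season_days[season] / 1e6
        for season in season_hours
    }

total = {}
for cls in income_classes:
    demand = seasonal_demand_gwh(cls)
    print(cls, {k: round(v, 1) for k, v in demand.items()})
    for season, gwh in demand.items():
        total[season] = total.get(season, 0.0) + gwh

print("total demand (GWh):", {k: round(v, 1) for k, v in total.items()})
```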

Keywords: air conditioning, bottom up model, income classes, energy demand

Procedia PDF Downloads 249
17650 Stable Time Reversed Integration of the Navier-Stokes Equation Using an Adjoint Gradient Method

Authors: Jurriaan Gillissen

Abstract:

This work is concerned with stabilizing the numerical integration of the Navier-Stokes equation (NSE), backwards in time. Applications involve the detection of sources of, e.g., sound, heat, and pollutants. Stable reverse numerical integration of parabolic differential equations is also relevant for image de-blurring. While the literature addresses the reverse integration problem of the advection-diffusion equation, the problem of numerical reverse integration of the NSE has, to our knowledge, not yet been addressed. Owing to the presence of viscosity, the NSE is irreversible, i.e., when going backwards in time, the fluid behaves as if it had a negative viscosity. As a result, perturbations from the perfect solution, due to round off errors or discretization errors, grow exponentially in time, and reverse integration of the NSE is inherently unstable, regardless of using an implicit time integration scheme. Consequently, some sort of filtering is required, in order to achieve a stable, numerical, reversed integration. The challenge is to find a filter with a minimal adverse effect on the accuracy of the reversed integration. In the present work, we explore an adjoint gradient method (AGM) to achieve this goal, and we apply this technique to two-dimensional (2D), decaying turbulence. The AGM solves for the initial velocity field u0 at t = 0, that, when integrated forward in time, produces a final velocity field u1 at t = 1, that is as close as is feasibly possible to some specified target field v1. The initial field u0 defines a minimum of a cost-functional J that measures the distance between u1 and v1. In the minimization procedure, u0 is updated iteratively along the gradient of J w.r.t. u0, where the gradient is obtained by transporting J backwards in time from t = 1 to t = 0, using the adjoint NSE. The AGM thus effectively replaces the backward integration by multiple forward and backward adjoint integrations. Since the viscosity is negative in the adjoint NSE, each step of the AGM is numerically stable. Nevertheless, when applied to turbulence, the AGM develops instabilities, which limit the backward integration to small times. This is due to the exponential divergence of phase space trajectories in turbulent flow, which produces a multitude of local minima in J, when the integration time is large. As a result, the AGM may select unphysical, noisy initial conditions. In order to improve this situation, we propose two remedies. First, we replace the integration by a sequence of smaller integrations, i.e., we divide the integration time into segments, where in each segment the target field v1 is taken as the initial field u0 from the previous segment. Second, we add an additional term (regularizer) to J, which is proportional to a high-order Laplacian of u0, and which dampens the gradients of u0. We show that suitable values for the segment size and for the regularizer allow a stable reverse integration of 2D decaying turbulence, with accurate results for more than O(10) turbulent integral time scales.
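
As a highly simplified illustration of the adjoint-gradient idea (not the authors' NSE solver), the sketch below reconstructs the initial condition of a 1D heat equation; because the diffusion operator used here is symmetric, the adjoint run is simply a forward diffusion of the final-time residual. Grid size, viscosity, step counts, and the target field are arbitrary choices.

```python
import numpy as np

nx, nu, dt, nsteps = 128, 1e-3, 0.1, 200
x = np.linspace(0.0, 2 * np.pi, nx, endpoint=False)
dx = x[1] - x[0]

def step(u):
    # explicit diffusion step with periodic boundaries
    return u + nu * dt * (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2

def forward(u0):
    u = u0.copy()
    for _ in range(nsteps):
        u = step(u)
    return u

# "true" initial field and the resulting target field at the final time
u0_true = np.sin(x) + 0.5 * np.sin(3 * x)
v1 = forward(u0_true)

u0 = np.zeros(nx)                  # first guess for the initial field
for _ in range(500):
    residual = forward(u0) - v1    # misfit at the final time
    # adjoint of a symmetric diffusion operator: diffuse the residual forward
    grad = forward(residual)
    u0 -= 1.0 * grad               # gradient descent on J = ||u1 - v1||^2 / 2

print("final misfit:", np.linalg.norm(forward(u0) - v1))
```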

Keywords: time reversed integration, parabolic differential equations, adjoint gradient method, two dimensional turbulence

Procedia PDF Downloads 224
17649 Application of Seasonal Autoregressive Integrated Moving Average Model for Forecasting Monthly Flows in Waterval River, South Africa

Authors: Kassahun Birhanu Tadesse, Megersa Olumana Dinka

Abstract:

Reliable future river flow information is fundamental for the planning and management of any river system. For a data-scarce river system with only river flow records, like the Waterval River, univariate time series models are appropriate for river flow forecasting. In this study, a univariate Seasonal Autoregressive Integrated Moving Average (SARIMA) model was applied for forecasting Waterval River flow using GRETL statistical software. Mean monthly river flows from 1960 to 2016 were used for modeling. Different unit root tests and Mann-Kendall trend analysis were performed to test the stationarity of the observed flow time series. The time series was differenced to remove the seasonality. Using the correlogram of the seasonally differenced time series, different SARIMA models were identified, their parameters were estimated, and diagnostic checking of model forecasts was performed using white noise and heteroscedasticity tests. Finally, based on the minimum Akaike Information Criterion (AIC) and Hannan-Quinn Criterion (HQC), SARIMA (3, 0, 2) x (3, 1, 3)12 was selected as the best model for Waterval River flow forecasting. Therefore, this model can be used to generate future river flow information for water resources development and management in the Waterval River system. The SARIMA model can also be used for forecasting other similar univariate time series with seasonality characteristics.
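
A minimal sketch of fitting the selected SARIMA(3, 0, 2) x (3, 1, 3)12 structure is shown below, assuming Python's statsmodels rather than the GRETL software used in the paper; the monthly series is synthetic placeholder data standing in for the Waterval River record.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly "flows" with a seasonal cycle (placeholder for observed data)
rng = np.random.default_rng(0)
idx = pd.date_range("1960-01", "2016-12", freq="MS")
seasonal = 10 + 8 * np.sin(2 * np.pi * idx.month / 12)
flows = pd.Series(seasonal + rng.gamma(2.0, 2.0, len(idx)), index=idx)

# Fit the SARIMA(3,0,2)x(3,1,3)_12 structure selected in the study
model = SARIMAX(flows, order=(3, 0, 2), seasonal_order=(3, 1, 3, 12),
                enforce_stationarity=False, enforce_invertibility=False)
result = model.fit(disp=False)

print(result.aic, result.hqic)           # criteria used for model selection
print(result.forecast(steps=24).head())  # two years of forecast flows
```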

Keywords: heteroscedasticity, stationarity test, trend analysis, validation, white noise

Procedia PDF Downloads 205
17648 Some Integral Inequalities of Hermite-Hadamard Type on Time Scale and Their Applications

Authors: Artion Kashuri, Rozana Liko

Abstract:

In this paper, the authors establish an integral identity using delta differentiable functions. By applying this identity, some new results via a general class of convex functions with respect to two nonnegative functions on a time scale are given. Also, for suitable choices of nonnegative functions, some special cases are deduced. Finally, in order to illustrate the efficiency of our main results, some applications to special means are obtained as well. We hope that the current work, using our idea and technique, will attract the attention of researchers working in mathematical analysis, mathematical inequalities, numerical analysis, special functions, fractional calculus, quantum mechanics, quantum calculus, physics, probability and statistics, differential and difference equations, optimization theory, and other related fields in pure and applied sciences.
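
For context, the classical Hermite-Hadamard inequality for a convex function f on [a, b] is recalled below; the paper's results generalize estimates of this type to delta integrals on a time scale.

```latex
% Classical Hermite-Hadamard inequality for convex f : [a,b] -> R;
% the time-scale versions replace the Riemann integral by a delta integral.
f\!\left(\frac{a+b}{2}\right)
  \;\le\; \frac{1}{b-a}\int_{a}^{b} f(x)\,\mathrm{d}x
  \;\le\; \frac{f(a)+f(b)}{2}
```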

Keywords: convex functions, Hermite-Hadamard inequality, special means, time scale

Procedia PDF Downloads 150
17647 Perspectives of Renewable Energy in 21st Century in India: Statistics and Estimation

Authors: Manoj Kumar, Rajesh Kumar

Abstract:

With its favourable geographical conditions, the Indian subcontinent is well suited to flourishing renewable energy. Increasing dependence on coal and other conventional sources is driving the world into pollution and depletion of resources. This paper presents statistics of energy consumption and energy generation in the Indian subcontinent, which show that increasing energy demand is surpassing energy generation. With the growth in demand for energy, the usage of coal has increased, since the major portion of energy production in India is from thermal power plants. The increased usage of thermal power plants causes pollution and depletion of reserves; hence, a paradigm shift to renewable sources is inevitable. In this work, the capacity and potential of renewable sources in India are analyzed. Based on this analysis, the future potential of these sources is estimated.

Keywords: depletion of reserves, energy consumption and generation, emissions, global warming, renewable sources

Procedia PDF Downloads 432
17646 Electrochemical Studies of Si, Si-Ge- and Ge-Air Batteries

Authors: R. C. Sharma, Rishabh Bansal, Prajwal Menon, Manoj K. Sharma

Abstract:

The silicon-air battery is highly promising for electric vehicles due to its high theoretical energy density (8470 Whkg⁻¹), and its discharge products are non-toxic. For the first time, pure silicon and germanium powders are used as anode materials. Nickel wire mesh embedded with charcoal and manganese dioxide powder serves as the cathode, and concentrated potassium hydroxide is used as the electrolyte. Voltage-time curves are presented in this study for pure silicon and germanium powders and for 5% and 10% germanium mixed with silicon powder. The silicon powder cell assembly gives a stable voltage of 0.88 V for ~20 minutes, while Si-Ge provides a cell voltage of 0.80-0.76 V for ~10-12 minutes, and the pure germanium cell provides a cell voltage of 0.80-0.76 V for ~30 minutes. The cell voltage is higher for concentrated (10%) sodium hydroxide solution (1.08 V), and it is stable for ~40 minutes. A sharp decrease in cell voltage beyond 40 min may be due to rapid corrosion.

Keywords: silicon-air battery, germanium-air battery, voltage-time curve, open circuit voltage, anodic corrosion

Procedia PDF Downloads 238
17645 Time Series Simulation by Conditional Generative Adversarial Net

Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto

Abstract:

Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as dependent structures of different time series. It also has the capability to generate conditional predictive distributions consistent with training data distributions. We also provide an in-depth discussion on the rationale behind GAN and the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis including a backtesting to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis to calculate VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we have included an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
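
The Historical Simulation benchmark mentioned above reads VaR and ES directly off the empirical distribution of past returns, as in the sketch below; the return series is a synthetic heavy-tailed placeholder, not the data analysed in the paper.

```python
import numpy as np

# Historical Simulation (HS) baseline: quantile of empirical losses gives VaR,
# the mean loss beyond VaR gives Expected Shortfall (ES).
rng = np.random.default_rng(1)
returns = rng.standard_t(df=4, size=1000) * 0.01   # heavy-tailed placeholder P&L

def hs_var_es(returns, alpha=0.99):
    losses = -returns
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

var99, es99 = hs_var_es(returns)
print(f"HS VaR(99%) = {var99:.4f}, ES(99%) = {es99:.4f}")
```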

Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series

Procedia PDF Downloads 143
17644 Optimal Sequential Scheduling of Imperfect Maintenance Last Policy for a System Subject to Shocks

Authors: Yen-Luan Chen

Abstract:

Maintenance has a great impact on the capacity of production and on the quality of the products, and therefore, it deserves continuous improvement. A maintenance procedure performed before a failure is called preventive maintenance (PM). Sequential PM, which specifies that a system should be maintained at a sequence of intervals with unequal lengths, is one of the commonly used PM policies. This article proposes a generalized sequential PM policy for a system subject to shocks with imperfect maintenance and random working time. The shocks arrive according to a non-homogeneous Poisson process (NHPP) with a varied intensity function in each maintenance interval. As a shock occurs, the system suffers two types of failures with number-dependent probabilities: type-I (minor) failure, which is rectified by a minimal repair, and type-II (catastrophic) failure, which is removed by a corrective maintenance (CM). The imperfect maintenance is carried out to improve the system failure characteristic due to the altered shock process. The sequential preventive maintenance-last (PML) policy is defined such that the system is maintained before any CM occurs at a planned time Ti or at the completion of a working time in the i-th maintenance interval, whichever occurs last. At the N-th maintenance, the system is replaced rather than maintained. This article first takes up the sequential PML policy with random working time and imperfect maintenance in reliability engineering. The optimal preventive maintenance schedule that minimizes the mean cost rate of a replacement cycle is derived analytically and determined in terms of its existence and uniqueness. The proposed models provide a general framework for analyzing maintenance policies in reliability theory.

Keywords: optimization, preventive maintenance, random working time, minimal repair, replacement, reliability

Procedia PDF Downloads 275
17643 Distributed Cost-Based Scheduling in Cloud Computing Environment

Authors: Rupali, Anil Kumar Jaiswal

Abstract:

Cloud computing can be defined as one of the prominent technologies that lets a user change, configure, and access services online. It can be described as a computing paradigm that helps save a user's cost and time; in practice, the use of cloud computing can be found in various fields such as education, health, and banking. Cloud computing is an internet-dependent technology, thus it is the major responsibility of Cloud Service Providers (CSPs) to take care of the data stored by users at data centers. Scheduling in a cloud computing environment plays a vital role, as cloud providers need to schedule resources effectively to achieve maximum utilization and user satisfaction. Job scheduling for cloud computing is analyzed in the following work. CloudSim 3.0.3 is utilized to recreate the task computation and the distributed scheduling methods. This research work discusses job scheduling for a distributed processing environment, and by exploring this issue we find that it works with minimum time and less cost. In this work, two load balancing techniques have been employed, ‘Throttled stack adjustment policy’ and ‘Active VM load balancing policy’, with two brokerage services, ‘Advanced Response Time’ and ‘Reconfigure Dynamically’, to evaluate the VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.

Keywords: physical machines, virtual machines, support for repetition, self-healing, highly scalable programming model

Procedia PDF Downloads 168
17642 A Non-Linear Damage Model for the Annulus of the Intervertebral Disc under Cyclic Loading, Including Recovery

Authors: Shruti Motiwale, Xianlin Zhou, Reuben H. Kraft

Abstract:

Military and sports personnel are often required to wear heavy helmets for extended periods of time. This leads to excessive cyclic loads on the neck and an increased chance of injury. Computational models offer one approach to understand and predict the time progression of disc degeneration under severe cyclic loading. In this paper, we have applied an analytic non-linear damage evolution model to estimate damage evolution in an intervertebral disc due to cyclic loads over decade-long time periods. We have also proposed a novel strategy for the inclusion of recovery in the damage model. Our results show that damage grows by only 20% in the initial 75% of the life, growing exponentially in the remaining 25% of the life. The analysis also shows that it is crucial to include recovery in a damage model.
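
A phenomenological sketch of cyclic damage accumulation with a recovery term is given below; the growth law, exponent, and recovery rate are illustrative assumptions, not the calibrated constitutive model applied in the paper.

```python
import numpy as np

def simulate_damage(n_cycles, load_on, growth=4e-6, exponent=3.0, recovery=1e-5):
    """Accumulate a scalar damage variable d in [0, 1] over hourly cycles."""
    d = 0.0
    history = np.empty(n_cycles)
    for n in range(n_cycles):
        if load_on(n):
            # growth rate accelerates as damage approaches failure (d -> 1)
            d += growth / (1.0 - d) ** exponent
        else:
            d -= recovery * d            # partial recovery during rest periods
        if d >= 1.0:                     # failure: damage saturates at 1
            history[n:] = 1.0
            break
        history[n] = d
    return history

# e.g. 16 hours of loading followed by 8 hours of rest, repeated daily
daily = lambda n: (n % 24) < 16
damage = simulate_damage(24 * 365 * 10, daily)   # ten years of hourly steps
print("damage after 10 years:", round(damage[-1], 3))
```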

Keywords: cervical spine, computational biomechanics, damage evolution, intervertebral disc, continuum damage mechanics

Procedia PDF Downloads 568
17641 The Use of Boosted Multivariate Trees in Medical Decision-Making for Repeated Measurements

Authors: Ebru Turgal, Beyza Doganay Erdogan

Abstract:

Machine learning aims to model the relationship between the response and features. Medical decision-making researchers would like to make decisions about patients' course and treatment by examining repeated measurements over time. The boosting approach is now being used in machine learning as an influential tool for these aims. The aim of this study is to show the usage of multivariate tree boosting in this field. The main reason for utilizing this approach in the field of decision-making is the ease with which it handles complex relationships. To show how the multivariate tree boosting method can be used to identify important features and feature-time interactions, we used data collected retrospectively from Ankara University Chest Diseases Department records. The dataset includes repeated PF ratio measurements. The follow-up time is planned for 120 hours. A set of different models is tested. In conclusion, classification with a weighted combination of classifiers is a reliable method, which has been shown with simulations several times. Furthermore, time-varying variables are taken into consideration within this concept, and it could be possible to make accurate decisions about regression and survival problems.

Keywords: boosted multivariate trees, longitudinal data, multivariate regression tree, panel data

Procedia PDF Downloads 203
17640 ‘Ethical Relativism’ in Offshore Business: A Critical Assessment

Authors: Biswanath Swain

Abstract:

Ethical relativism, as an ethical perspective, holds that the moral worth of a course of action is dependent on a particular space and time. Moral rightness or wrongness of a course of action varies from space to space and from time to time. In short, ethical relativism holds that morality is relative to the context. If we reflect conscientiously on the scope of this perspective, we will find that it is wide-spread amongst the marketers involved in offshore business. However, the irony is that most of the marketers who have gone along with ethical relativism in their offshore business have been found to be unsuccessful in terms of loss of market share and bankruptcy. The upshot is purely self-defeating in nature for the marketers. GSK in China and Nestle Maggi in India are some of the burning examples of that sort. The paper argues and recommends that a marketer, as an alternative, should have recourse to the Kantian ethical perspective to deliberate courses of action sensitive to offshore business, as the Kantian ethical perspective is logically and methodologically sound in nature.

Keywords: business, course of action, Kant, morality, offshore, relativism

Procedia PDF Downloads 303
17639 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters

Authors: K. Parandhama Gowd

Abstract:

The aim of this research paper is to conceptualize, discuss, analyze and propose alternate design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe real time diagnostic tool for accident investigation and location of debris in real time. In this paper, an attempt is made to improve the existing flight data recording techniques and to improve upon design considerations for a futuristic FDR, to overcome the trauma of not being able to locate the black box. Since modern day communications and information technologies with large bandwidth are available, coupled with faster computer processing techniques, the attempt made in this paper to develop a failsafe recording technique is feasible. Further, data fusion/data warehousing technologies are available for exploitation.

Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)

Procedia PDF Downloads 572
17638 Prediction of Mechanical Strength of Multiscale Hybrid Reinforced Cementitious Composite

Authors: Salam Alrekabi, A. B. Cundy, Mohammed Haloob Al-Majidi

Abstract:

Novel multiscale hybrid reinforced cementitious composites based on carbon nanotubes (MHRCC-CNT) and carbon nanofibers (MHRCC-CNF) are new types of cement-based material fabricated with micro steel fibers and nanofilaments, featuring superior strain hardening, ductility, and energy absorption. This study focused on established models to predict the compressive strength and the direct and splitting tensile strengths of the produced cementitious composites. The analysis was carried out based on the experimental data presented in the authors' previous study, regression analysis, and the established models available in the literature. The obtained models showed small differences between predictions and target values; experimental verification indicated that the mechanical properties could be estimated with good accuracy.

Keywords: multiscale hybrid reinforced cementitious composites, carbon nanotubes, carbon nanofibers, mechanical strength prediction

Procedia PDF Downloads 161
17637 The Foundation Binary-Signals Mechanics and Actual-Information Model of Universe

Authors: Elsadig Naseraddeen Ahmed Mohamed

Abstract:

In contrast to the uncertainty and complementarity principles, it will be shown in the present paper that the probability of the simultaneous occupation of any definite values of coordinates by any definite values of momentum and energy at any definite instant of time can be described by a binary definite function. This function is equivalent to the difference between the numbers of occupation and evacuation epochs up to that time, and also equivalent to the number of exchanges between those occupation and evacuation epochs up to that time, modulo two. These binary definite quantities can be defined at all points of the time real-line, so they form a binary signal that represents a complete mechanical description of physical reality. The times of these exchanges represent the boundaries of the occupation and evacuation epochs, from which we can calculate these binary signals, using the fact that the time of universe events actually extends in the positive and negative of the time real-line in one direction of extension as the number of exchanges increases. Thus, there exists a noninvertible transformation matrix, defined as the matrix product of an invertible rotation matrix and a noninvertible scaling matrix, which change the direction and magnitude of the exchange event vector, respectively. These noninvertible transformations will be called actual transformations, in contrast to information transformations, by which we can navigate the universe's events transformed by actual transformations backward and forward on the time real-line; these information transformations will be derived as elements of a group associated with their corresponding actual transformations. The actual and information model of the universe will be derived by assuming the existence of a time instant zero, before and at which no coordinate is occupied by any definite values of momentum and energy, and after which the universe begins its expansion in spacetime. This assumption makes the existence of Laplace's demon, who at one moment could measure the positions and momenta of all constituent particles of the universe and then use the laws of classical mechanics to predict all future and past events of the universe, superfluous. We only need to establish analog-to-digital converters to sense the binary signals that determine the boundaries of the occupation and evacuation epochs of the definite values of coordinates, relative to their origin, by the definite values of momentum and energy, as the present events of the universe; from them, we can predict its past and future events approximately with high precision.

Keywords: binary-signal mechanics, actual-information model of the universe, actual-transformation, information-transformation, uncertainty principle, Laplace's demon

Procedia PDF Downloads 175
17636 Digital Transformation, Financing Microstructures, and Impact on Well-Being and Income Inequality

Authors: Koffi Sodokin

Abstract:

Financing microstructures are increasingly seen as a means of financial inclusion and improving overall well-being in developing countries. In practice, digital transformation in finance can accelerate the optimal functioning of financing microstructures, such as access by households to microfinance and microinsurance. Large-scale household access to finance can lead to a reduction in income inequality and an overall improvement in well-being. This paper explores the impact of access to digital finance and financing microstructures on household well-being and the reduction of income inequality. To this end, we use propensity score matching, double difference, and smooth instrumental quantile regression as estimation methods with two periods of survey data. The paper uses the FinScope consumer data (2016) and the Harmonized Living Standards Measurement Study (2018) from Togo in a comparative perspective. The results indicate that access to digital finance, as a cultural game changer, and to financing microstructures improves overall household well-being and contributes significantly to reducing income inequality.
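
A minimal propensity-score-matching sketch of the kind of estimator used above is shown below, assuming synthetic covariates, treatment (access to digital finance), and outcome (a well-being score) in place of the FinScope and harmonized survey variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=(n, 3))                        # household covariates
p_treat = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
treated = rng.binomial(1, p_treat)                 # access to digital finance
outcome = 2.0 * treated + x @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)

# 1) estimate propensity scores
ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

# 2) match each treated household to the untreated household with the closest score
t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

# 3) average treatment effect on the treated (ATT)
att = (outcome[t_idx] - outcome[matches]).mean()
print("estimated ATT:", round(att, 3))
```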

Keywords: financing microstructure, microinsurance, microfinance, digital finance, well-being, income inequality

Procedia PDF Downloads 89
17635 Urea Amperometric Biosensor Based on Entrapment Immobilization of Urease onto a Nanostructured Polypyrrole and Multi-Walled Carbon Nanotube

Authors: Hamide Amani, Afshin FarahBakhsh, Iman Farahbakhsh

Abstract:

In this paper, an amperometric biosensor based on surface-modified polypyrrole (PPy) has been developed for the quantitative estimation of urea in aqueous solutions. The incorporation of urease (Urs) into a bipolymeric substrate consisting of PPy was performed by entrapment in the polymeric matrix; PPy acts as the amperometric transducer in these biosensors. To increase the membrane conductivity, multi-walled carbon nanotubes (MWCNT) were added to the PPy solution. The MWCNT entrapped in the PPy film and the bipolymer layers were prepared for the construction of Pt/PPy/MWCNT/Urs. Two different configurations of working electrodes were evaluated to investigate the potential use of the modified membranes in biosensors. The evaluation of the two configurations suggested that the second configuration, which is composed of an electrode-mediator-(pyrrole and multi-walled carbon nanotube) structure and enzyme, is the best candidate for biosensor applications.

Keywords: urea biosensor, polypyrrole, multi-walled carbon nanotube, urease

Procedia PDF Downloads 329
17634 Applying Lean Six Sigma in an Emergency Department of a Private Hospital

Authors: Sarah Al-Lumai, Fatima Al-Attar, Nour Jamal, Badria Al-Dabbous, Manal Abdulla

Abstract:

Today, many commonly used industrial engineering tools and techniques are being used in hospitals around the world with the goal of producing a more efficient and effective healthcare system. A common quality improvement methodology known as Lean Six Sigma has been successful in manufacturing industries and, recently, in healthcare. The objective of our project is to use the Lean Six Sigma methodology to reduce waiting time in the Emergency Department (ED) of a local private hospital. Furthermore, a comprehensive literature review was conducted to evaluate the success of Lean Six Sigma in the ED. According to a study conducted at Ibn Sina Hospital in Morocco, the most common problem that patients complain about is waiting time. To ensure patient satisfaction, many hospitals, such as North Shore University Hospital, were able to reduce waiting time by up to 37% by using Lean Six Sigma. Other hospitals, such as the Johns Hopkins medical center, used Lean Six Sigma successfully to enhance the overall patient flow, which ultimately decreased waiting time. Furthermore, it was found that capacity constraints, such as staff shortages and lack of beds, were among the main reasons behind long waiting times. With the use of Lean Six Sigma and bed management, hospitals like Memorial Hermann Southwest Hospital were able to reduce patient delays. Moreover, in order to successfully implement Lean Six Sigma in our project, two common methodologies were considered, DMAIC and DMADV. After the assessment of both methodologies, it was found that DMAIC was a more suitable approach for our project because it is more concerned with improving an already existing process. Despite its many successes, Lean Six Sigma has its limitations, especially in healthcare, but these limitations can be minimized if properly approached.

Keywords: lean six sigma, DMAIC, hospital, methodology

Procedia PDF Downloads 496
17633 Electronic Media and Physical Activity of Primary School Children

Authors: Srna Jenko Miholic, Marta Borovec, Josipa Persun

Abstract:

The constant expansion of technology has further accelerated the development of media and vice versa. Although its promotion includes all kinds of interesting and positive sides, the negative sides of the media are still being researched and proven. Young people, as well as children from the earliest age, resort to the media the most, so it is necessary to defend the role of adults, such as parents, teachers, and the wider environment, against virtual co-educators such as the media. The research aim of this study was to determine how much time primary school children spend using electronic media, as well as their involvement in certain physical activities. Furthermore, it aimed to determine what happens when parents restrict their children's access to electronic media and encourage them to participate in alternative content during their leisure time. The results reveal that a higher percentage of parents restrict their children's access to electronic media and then encourage the children to socialize with family and friends, spend time outdoors, engage in physical activity, read books, or learn something unrelated to school content, even though these may not be the children's favorite activities. The results highlight the importance of parental control when it comes to children's use of electronic media and the positive effects that parental control has in terms of encouraging children towards useful, socially desirable, physically active, and healthy activities.

Keywords: elementary school, digital media, leisure time, parents, physical engagement

Procedia PDF Downloads 147
17632 The Fast Diagnosis of Acanthamoeba Keratitis Using Real-Time PCR Assay

Authors: Fadime Eroglu

Abstract:

The genus Acanthamoeba belongs to the kingdom Protozoa and is known as a free-living amoeba. The genus Acanthamoeba has been isolated from human bodies, swimming pools, bottled mineral water, contact lens solutions, dust, and soil. Members of the genus Acanthamoeba cause Acanthamoeba keratitis, which is a painful, sight-threatening disease of the eyes. In recent years, the prevalence of Acanthamoeba keratitis has been reported at a high rate. Eight different Acanthamoeba species are known to be involved in Acanthamoeba keratitis. These species include Acanthamoeba castellanii, Acanthamoeba polyphaga, Acanthamoeba griffini, Acanthamoeba hatchetti, Acanthamoeba culbertsoni and Acanthamoeba rhysodes. The conventional diagnosis of Acanthamoeba keratitis has relied on cytological preparations and growth of Acanthamoeba in culture. However, molecular methods such as real-time PCR have been found to be more sensitive. Real-time PCR has emerged in the last decade as an effective method for more rapid testing in the diagnosis of infectious disease. Therefore, a real-time PCR assay for the detection of Acanthamoeba keratitis and Acanthamoeba species has been developed in this study. The 18S rRNA sequences of Acanthamoeba species were obtained from the National Center for Biotechnology Information, and the sequences were aligned with the MEGA 6 program. Primers and a probe were designed using Custom Primers-OligoPerfectTM Designer (ThermoFisher Scientific, Waltham, MA, USA). They were also assayed for hairpin formation and degree of primer-dimer formation with Multiple Primer Analyzer (ThermoFisher Scientific, Waltham, MA, USA). The eight different ATCC Acanthamoeba species were obtained, and DNA was extracted using the Qiagen Mini DNA extraction kit (Qiagen, Hilden, Germany). The DNA of the Acanthamoeba species was analyzed using the newly designed primer and probe set in a real-time PCR assay. Early definitive laboratory diagnosis of Acanthamoeba keratitis and the rapid initiation of suitable therapy are necessary for clinical prognosis. The results of the study showed that the new primer and probe set could be used to detect and distinguish Acanthamoeba species. This newly developed method is helpful for the diagnosis of Acanthamoeba keratitis.

Keywords: Acanthamoeba keratitis, Acanthamoeba species, fast diagnosis, real-time PCR

Procedia PDF Downloads 120
17631 Towards an Effective Approach for Modelling near Surface Air Temperature Combining Weather and Satellite Data

Authors: Nicola Colaninno, Eugenio Morello

Abstract:

The urban environment affects local-to-global climate and, in turn, suffers global warming phenomena, with worrying impacts on human well-being, health, and social and economic activities. Physical-morphological features of the built-up space affect urban air temperature locally, causing the urban environment to be warmer compared to the surrounding rural areas. This occurrence, typically known as the Urban Heat Island (UHI), is normally assessed by means of air temperature from fixed weather stations and/or traverse observations, or based on remotely sensed Land Surface Temperatures (LST). The information provided by ground weather stations is key for assessing local air temperature. However, the spatial coverage is normally limited due to the low density and uneven distribution of the stations. Although different interpolation techniques such as Inverse Distance Weighting (IDW), Ordinary Kriging (OK), or Multiple Linear Regression (MLR) are used to estimate air temperature from observed points, such an approach may not effectively reflect the real climatic conditions of an interpolated point. Quantifying local UHI for extensive areas based on weather stations' observations only is not practicable. Alternatively, the use of thermal remote sensing has been widely investigated based on LST. Data from Landsat, ASTER, or MODIS have been extensively used. Indeed, LST has an indirect but significant influence on air temperatures. However, high-resolution near-surface air temperature (NSAT) is currently difficult to retrieve. Here we have experimented with Geographically Weighted Regression (GWR) as an effective approach to enable NSAT estimation by accounting for spatial non-stationarity of the phenomenon. The model combines on-site measurements of air temperature from fixed weather stations and satellite-derived LST. The approach is structured upon two main steps. First, a GWR model has been set up to estimate NSAT at low resolution, by combining air temperature from discrete observations retrieved by weather stations (dependent variable) and the LST from satellite observations (predictor). At this step, MODIS data from the Terra satellite, at 1 kilometer of spatial resolution, have been employed. Two time periods are considered according to the satellite revisit period, i.e., 10:30 am and 9:30 pm. Afterward, the results have been downscaled to 30 meters of spatial resolution by setting a GWR model between the previously retrieved near-surface air temperature (dependent variable), the multispectral information as provided by the Landsat mission, in particular the albedo, and the Digital Elevation Model (DEM) from the Shuttle Radar Topography Mission (SRTM), both at 30 meters. Albedo and DEM are now the predictors. The area under investigation is the Metropolitan City of Milan, which covers an area of approximately 1,575 km2 and encompasses a population of over 3 million inhabitants. Both models, low- (1 km) and high-resolution (30 meters), have been validated according to a cross-validation that relies on indicators such as R2, Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE). All the employed indicators give evidence of highly efficient models. In addition, an alternative network of weather stations, available for the City of Milano only, has been employed for testing the accuracy of the predicted temperatures, giving an RMSE of 0.6 and 0.7 for daytime and night-time, respectively.
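
A bare-bones geographically weighted regression is sketched below: at each target location, a weighted least-squares fit is computed with Gaussian distance weights. Station coordinates, temperatures, the LST predictor, and the fixed bandwidth are synthetic placeholders for the MODIS/Landsat and weather-station inputs, and no bandwidth cross-validation is performed.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
coords = rng.uniform(0, 50, size=(n, 2))              # station locations (km)
lst = 20 + 0.2 * coords[:, 0] + rng.normal(0, 1, n)   # satellite LST predictor
t_air = 0.8 * lst + 2 + rng.normal(0, 0.5, n)         # observed air temperature

def gwr_predict(target_xy, bandwidth=10.0):
    d = np.linalg.norm(coords - target_xy, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)            # Gaussian kernel weights
    X = np.column_stack([np.ones(n), lst])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ t_air)  # local coefficients
    lst_here = 20 + 0.2 * target_xy[0]                 # predictor value at target
    return beta[0] + beta[1] * lst_here

print("predicted NSAT at (25, 25):", round(gwr_predict(np.array([25.0, 25.0])), 2))
```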

Keywords: urban climate, urban heat island, geographically weighted regression, remote sensing

Procedia PDF Downloads 195
17630 Designing an App to Solve Surveying Challenges

Authors: Ali Mohammadi

Abstract:

Forming and equipping the surveying team for construction projects such as dams, roads, and tunnels is always one of the first challenges: hiring surveyors who are proficient in reading maps and controlling structures, purchasing appropriate surveying equipment that the employer can afford, and using methods that can save time. The bigger the project, the more these challenges show themselves: finding a surveying engineer who can lead the teams and train the surveyors, and buying total stations according to the company's budget, the surveyors' ability to use them, and the time available to each team. In the following, we introduce a surveying app and examine how to use it, showing how useful it can be for surveyors in projects.

Keywords: DTM CUTFILL, data transfer, section, tunnel, traverse

Procedia PDF Downloads 82
17629 Kinetic Parameter Estimation from Thermogravimetry and Microscale Combustion Calorimetry

Authors: Rhoda Afriyie Mensah, Lin Jiang, Solomon Asante-Okyere, Xu Qiang, Cong Jin

Abstract:

Flammability analysis of extruded polystyrene (XPS) has become crucial due to its utilization as insulation material for energy efficient buildings. Using the Kissinger-Akahira-Sunose and Flynn-Wall-Ozawa methods, the degradation kinetics of two pure XPS from the local market, red and grey ones, were obtained from the results of thermogravimetric analysis (TG) and microscale combustion calorimetry (MCC) experiments performed under the same heating rates. From the experiments, it was discovered that red XPS released more heat than grey XPS and both materials showed two mass loss stages. Consequently, the kinetic parameters for red XPS were higher than those for grey XPS. A comparative evaluation of activation energies from MCC and TG showed an insignificant degree of deviation, signifying an equivalent apparent activation energy from both methods. However, different activation energy profiles, as a result of the different chemical pathways, were presented when the dependencies of the activation energies on the extent of conversion for TG and MCC were compared.
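
The Flynn-Wall-Ozawa step can be sketched as an isoconversional linear fit: at each fixed conversion, the slope of ln(beta) versus 1/T across heating rates gives the apparent activation energy via the Doyle approximation Ea = -slope * R / 1.052. The temperatures below are illustrative placeholders, not the measured TG/MCC values.

```python
import numpy as np

R = 8.314                                    # J / (mol K)
betas = np.array([10.0, 20.0, 40.0])         # heating rates, K/min
# temperature (K) at which each conversion level is reached, per heating rate
T_at_conversion = {
    0.2: np.array([620.0, 632.0, 645.0]),
    0.5: np.array([650.0, 663.0, 677.0]),
    0.8: np.array([678.0, 692.0, 707.0]),
}

for alpha, T in T_at_conversion.items():
    # linear fit of ln(beta) against 1/T at this fixed conversion
    slope, _ = np.polyfit(1.0 / T, np.log(betas), 1)
    ea = -slope * R / 1.052 / 1000.0         # apparent activation energy, kJ/mol
    print(f"alpha = {alpha:.1f}: Ea ~ {ea:.0f} kJ/mol")
```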

Keywords: flammability, microscale combustion calorimetry, thermogravity analysis, thermal degradation, kinetic analysis

Procedia PDF Downloads 177
17628 Application of Simulation of Discrete Events in Resource Management of Massive Concreting

Authors: Mohammad Amin Hamedirad, Seyed Javad Vaziri Kang Olyaei

Abstract:

Project planning and control is one of the most critical issues in the management of construction projects. Traditional methods of project planning and control, such as the critical path method or the Gantt chart, are not widely used for planning projects with discrete and repetitive activities, and one of the problems faced by project managers is planning the implementation process and the optimal allocation of its resources. Massive concreting is also a project with discrete and repetitive activities. This study uses the concept of discrete-event simulation to manage resources, which includes finding the optimal number of resources considering various limitations, such as limitations of machinery, equipment, and human resources, and even technical, time, and implementation limitations, using analysis of the resource consumption rate, the project completion time, and critical points of the implementation process. For this purpose, the concept of discrete-event simulation has been used to model different stages of implementation. After reviewing the various scenarios, the optimal number of allocations for each resource is finally determined to reach the maximum utilization rate and also to reduce the project completion time or its cost according to the existing constraints. The results showed that with the optimal allocation of resources, the project completion time could be reduced by 90%, and the resulting costs can be reduced by up to 49%. Thus, allocating the optimal number of project resources using this method will reduce its time and cost.
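
A minimal discrete-event sketch of repetitive concreting cycles competing for a limited pump and truck fleet is shown below, assuming the SimPy library; the durations and resource counts are illustrative, not the project data analysed in the study.

```python
import simpy

POUR_BLOCKS, N_PUMPS, N_TRUCKS = 20, 1, 3
LOAD, HAUL, POUR = 15, 30, 25            # minutes per truck cycle

def pour_block(env, name, pumps, trucks, log):
    with trucks.request() as truck:
        yield truck                      # wait for a free mixer truck
        yield env.timeout(LOAD + HAUL)   # batch and haul the concrete
        with pumps.request() as pump:
            yield pump                   # wait for the single pump
            yield env.timeout(POUR)      # pour the block
    log.append((name, env.now))

env = simpy.Environment()
pumps = simpy.Resource(env, capacity=N_PUMPS)
trucks = simpy.Resource(env, capacity=N_TRUCKS)
log = []
for i in range(POUR_BLOCKS):
    env.process(pour_block(env, f"block-{i}", pumps, trucks, log))
env.run()
print("completion time (min):", max(t for _, t in log))
```

Re-running this loop over different capacities (N_PUMPS, N_TRUCKS) is the scenario comparison that identifies the allocation minimizing completion time or cost.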

Keywords: simulation, massive concreting, discrete event simulation, resource management

Procedia PDF Downloads 148
17627 A Virtual Electrode through Summation of Time Offset Pulses

Authors: Isaac Cassar, Trevor Davis, Yi-Kai Lo, Wentai Liu

Abstract:

Retinal prostheses have been successful in eliciting visual responses in implanted subjects. As these prostheses progress, one of their major limitations is the need for increased resolution. As an alternative to increasing the number of electrodes, virtual electrodes may be used to increase the effective resolution of current electrode arrays. This paper presents a virtual electrode technique based upon time offsets between stimuli. Two adjacent electrodes are stimulated with identical pulses whose pulse widths are too short to activate a neuron, but one has a time offset of one pulse width. A virtual electrode of twice the pulse width was then shown to appear in the center, with a total width capable of activating a neuron. This can be used in retinal implants by stimulating electrodes with pulse widths short enough not to elicit responses in neurons, but with their combined pulse width adequate to activate a neuron in between them.
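
The time-offset summation idea can be illustrated with two sub-threshold pulses of width w, one delayed by w, whose combined waveform midway between the electrodes spans 2w; the amplitudes, widths, and sampling step below are arbitrary illustrative values.

```python
import numpy as np

dt = 0.01                                 # ms per sample
t = np.arange(0, 2.0, dt)
w = 0.4                                   # sub-threshold pulse width (ms)

def pulse(start, width, amplitude=0.5):
    # rectangular stimulation pulse
    return amplitude * ((t >= start) & (t < start + width))

e1 = pulse(0.2, w)                        # electrode 1
e2 = pulse(0.2 + w, w)                    # electrode 2, offset by one pulse width
midpoint = e1 + e2                        # summed waveform between the electrodes

charge = midpoint.sum() * dt              # delivered charge at the midpoint
print("effective pulse width (ms):", (midpoint > 0).sum() * dt)
print("charge relative to a single pulse:", charge / (pulse(0.2, w).sum() * dt))
```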

Keywords: electrical stimulation, neuroprosthesis, retinal implant, retinal prosthesis, virtual electrode

Procedia PDF Downloads 303
17626 Preserved Relative Differences between Regions of Different Thermal Scans

Authors: Tahir Majeed, Michael Handschuh, René Meier

Abstract:

Rheumatoid arthritis patients have swelling and pain at the joints of the hand. The regions where the patient feels pain also show increased body temperature. Thermal cameras can be used to detect the rise in temperature of the affected regions. To monitor the disease progression of rheumatoid arthritis patients, they must visit the clinic regularly for scanning and examination. After scanning and evaluation, the dosage of the medicine is regulated accordingly. To monitor the disease progression over time, the correspondence between images from different visits must be established. It has been observed that, using low-cost thermal cameras, the thermal measurements do not remain the same over time, even within a single scan. In some situations, temperatures can vary by as much as 2°C within the same scanning sequence. In this paper, it has been shown that although the absolute temperature varies over time, the relative difference between the different regions remains similar. Results have been computed over four scanning sequences and are presented.

Keywords: relative thermal difference, rheumatoid arthritis, thermal imaging, thermal sensors

Procedia PDF Downloads 196
17625 Sparse Unmixing of Hyperspectral Data by Exploiting Joint-Sparsity and Rank-Deficiency

Authors: Fanqiang Kong, Chending Bian

Abstract:

In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. Joint-sparsity is the first property of the abundances, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank-deficiency, where the number of endmembers participating in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms the state-of-the-art algorithms with a better spectral unmixing accuracy.
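
The two priors described above lead to an objective of the following form (notation assumed here, not quoted from the paper: Y observed pixels, A spectral library, X abundance matrix), combining the data-fit term with an l2,p mixed norm for joint sparsity and a nuclear norm for rank deficiency.

```latex
% Sparse-unmixing objective combining joint sparsity (l_{2,p} mixed norm)
% and rank deficiency (nuclear norm), with non-negative abundances.
\min_{X \ge 0}\; \tfrac{1}{2}\,\lVert Y - A X \rVert_F^2
  \;+\; \lambda_1 \,\lVert X \rVert_{2,p}
  \;+\; \lambda_2 \,\lVert X \rVert_{*}
```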

Keywords: hyperspectral unmixing, joint-sparse, low-rank representation, abundance estimation

Procedia PDF Downloads 261
17624 Dynamic Measurement System Modeling with Machine Learning Algorithms

Authors: Changqiao Wu, Guoqing Ding, Xin Chen

Abstract:

In this paper, ways of modeling dynamic measurement systems are discussed. Specifically, a linear system with a single input and single output can be modeled with a shallow neural network. Then, gradient-based optimization algorithms are used to search for the proper coefficients. In addition, methods based on the normal equation and second-order gradient descent are proposed to accelerate the modeling process, and ways of obtaining better gradient estimates are discussed. It is shown that the mathematical essence of the learning objective is maximum likelihood under Gaussian noise. For conventional gradient descent, mini-batch learning and gradient with momentum contribute to faster convergence and enhance model ability. Lastly, experimental results proved the effectiveness of the second-order gradient descent algorithm and indicated that optimization with the normal equation was the most suitable for linear dynamic models.
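
A minimal normal-equation fit of a linear single-input single-output model with memory is sketched below; the signal, model order, and impulse response are illustrative stand-ins for the measurement-system data.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 5, 500
u = rng.normal(size=n)                               # input signal
h = np.array([0.5, 0.3, 0.15, 0.05, 0.02])           # "true" impulse response
y = np.convolve(u, h)[:n] + 0.01 * rng.normal(size=n)

# regression matrix of lagged inputs: output depends on the last k samples
X = np.column_stack([np.roll(u, i) for i in range(k)])[k:]
t = y[k:]

theta = np.linalg.solve(X.T @ X, X.T @ t)            # normal equation
print("estimated impulse response:", np.round(theta, 3))
```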

Keywords: dynamic system modeling, neural network, normal equation, second order gradient descent

Procedia PDF Downloads 127
17623 Implementation of Real-Time Multiple Sound Source Localization and Separation

Authors: Jeng-Shin Sheu, Qi-Xun Zheng

Abstract:

This paper mainly discusses a method of separating speech when using a microphone array without knowing the number and directions of the sound sources. In recent years, there have been many studies on separating signals by using masking, but most of the separation methods must be operated under the condition of a known number of sound sources. Such methods cannot be used for real-time applications. In our method, the Circular Integrated Cross-Spectrum is used to estimate the statistical histogram distribution of the direction of arrival (DOA), in order to obtain the number of sound sources and the direction of each source in the mixed signal. In calculating the relevant parameters of the circular integrated cross-spectrum, the phase of the cross-power spectrum and the phase rotation factors calculated from the cross-power spectrum of each microphone pair are used. In the speech separation stage, a DOA weighting and masking separation method is used to calculate the sound source direction (DOA) for each T-F unit (time-frequency point). The weight corresponding to each T-F unit can be used to strengthen the intensity of each sound source at that T-F unit and reduce the influence of the remaining sound sources, thereby achieving voice separation.
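
The sketch below is not the paper's Circular Integrated Cross-Spectrum; it is a simpler GCC-PHAT example for a single microphone pair, showing how the phase of the cross-power spectrum yields the time difference of arrival on which DOA estimates are built. The signals are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, true_delay = 16000, 12                 # samples of inter-mic delay
s = rng.normal(size=4096)
x1 = s + 0.05 * rng.normal(size=s.size)
x2 = np.roll(s, true_delay) + 0.05 * rng.normal(size=s.size)

X1, X2 = np.fft.rfft(x1), np.fft.rfft(x2)
cross = X2 * np.conj(X1)                   # cross-power spectrum of the pair
cross /= np.abs(cross) + 1e-12             # PHAT weighting: keep only the phase
cc = np.fft.irfft(cross)                   # generalized cross-correlation

# search lags in [-64, 63] for the peak; its position is the estimated delay
lag = np.argmax(np.concatenate([cc[-64:], cc[:64]])) - 64
print("estimated delay (samples):", lag)
```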

Keywords: real-time, spectrum analysis, sound source localization, sound source separation

Procedia PDF Downloads 155
17622 Spatio-Temporal Analysis and Mapping of Malaria in Thailand

Authors: Krisada Lekdee, Sunee Sammatat, Nittaya Boonsit

Abstract:

This paper proposes a GLMM with spatial and temporal effects for malaria data in Thailand. A Bayesian method is used for parameter estimation via Gibbs sampling MCMC. A conditional autoregressive (CAR) model is assumed to represent the spatial effects. The temporal correlation is presented through the covariance matrix of the random effects. The quarterly malaria data were extracted from the Bureau of Epidemiology, Ministry of Public Health of Thailand. The factors considered are rainfall and temperature. The results show that rainfall and temperature are positively related to the malaria morbidity rate. The posterior means of the estimated morbidity rates are used to construct the malaria maps. The top 5 highest morbidity rates (per 100,000 population) are in Trat (Q3, 111.70), Chiang Mai (Q3, 104.70), Narathiwat (Q4, 97.69), Chiang Mai (Q2, 88.51), and Chanthaburi (Q3, 86.82). According to the DIC criterion, the proposed model has a better performance than the GLMM with spatial effects but without temporal terms.
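
A sketch of the model structure described above is given below in assumed notation (not taken verbatim from the paper): quarterly counts follow a Poisson GLMM whose log-rate combines rainfall and temperature effects, a CAR spatial effect, and temporally correlated random effects.

```latex
% y_{it}: malaria cases in province i, quarter t; E_{it}: expected cases.
y_{it} \sim \mathrm{Poisson}(E_{it}\,\lambda_{it}), \qquad
\log \lambda_{it} = \beta_0 + \beta_1\,\mathrm{rain}_{it}
                  + \beta_2\,\mathrm{temp}_{it} + \phi_i + u_{it},
% CAR prior on the spatial effects and a full covariance for the temporal effects:
\phi_i \mid \phi_{j \ne i} \sim \mathcal{N}\!\Big(\tfrac{1}{n_i}\textstyle\sum_{j \sim i}\phi_j,\; \tfrac{\tau^2}{n_i}\Big), \qquad
(u_{i1},\dots,u_{iT})^{\top} \sim \mathcal{N}(0,\Sigma)
```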

Keywords: Bayesian method, generalized linear mixed model (GLMM), malaria, spatial effects, temporal correlation

Procedia PDF Downloads 454