Search results for: single well model experiments of vacuum preloading technology

27342 Technology Adoption Models: A Study on Brick Kiln Firms in Punjab

Authors: Ajay Kumar, Shamily Jaggi

Abstract:

In developing countries like India, the development of modern technologies has been a key determinant in accelerating industrialization and urbanization. In the pursuit of rapid economic growth, however, development is treated as a top priority while environmental protection is not given the same importance. Thus, a number of haphazardly sited industries have been established, leading to the deterioration of natural resources such as water, soil and air. As a result, environmental pollution is increasing tremendously due to the industrialization and mechanization that serve the demands of the growing population. With the increasing population, the demand for bricks for construction work is also rising, establishing brick making as a growing industry. Brick production requires two main resources: water, as a source of life, and soil, as a living environment. Water and soil conservation is a critical issue in areas facing scarcity of these resources. The purpose of this review paper is to provide a brief overview of the theoretical frameworks used in the analysis of the adoption and/or acceptance of soil and water conservation practices in the brick industry. Different frameworks and models have been used in the analysis of the adoption and/or acceptance of new technologies and practices; these include the technology acceptance model, the motivational model, the theory of reasoned action, innovation diffusion theory, the theory of planned behavior, and the unified theory of acceptance and use of technology (UTAUT). However, every model has some limitations, such as not considering environmental/contextual and economic factors that may affect an individual’s intention to perform a behavior. The paper concludes that, compared with the other models, the UTAUT appears to be the better model for understanding the dynamics of acceptance and adoption of water and soil conservation practices.

Keywords: brick kiln, water conservation, soil conservation, unified theory of acceptance and use of technology, technology adoption

Procedia PDF Downloads 89
27341 The Importance of Science and Technology Education in Skill Acquisition for Self Dependence

Authors: Olaje Monday Olaje

Abstract:

Science and technology have been proven to be the backbone of economic development for any country, and for Nigeria they have an even more critical role to play. This paper examines the importance of science and technology education for national development and the self-dependence of Nigerian citizens. A historical overview of the interconnection between science and technology and self-dependence is highlighted. The current situation and the challenges facing science and technology education are also discussed to bring out the theoretical importance of science and technology education for self-dependence, which has not yet been achieved in practice. Recommendations are made at the end of the study to promote skill acquisition through science and technology for self-dependence.

Keywords: acquisition, education, self-dependence, science, technology

Procedia PDF Downloads 489
27340 Communicative and Artistic Machines: A Survey of Models and Experiments on Artificial Agents

Authors: Artur Matuck, Guilherme F. Nobre

Abstract:

Machines can be tools, media, or social agents. Advances in technology have been delivering machines capable of autonomous expression, both through communication and through art. This paper deals with models (theoretical approach) and experiments (applied approach) related to artificial agents. On the one hand, it traces how social science scholars have worked on topics such as text automatization, man-machine writing cooperation, and communication. On the other hand, it covers how computer science scholars have built communicative and artistic machines, including the programming of creativity. The aim is to present a brief survey of artificially intelligent communicators and artificially creative writers, and to provide a basis for understanding meta-authorship as well as new and further man-machine co-authorship.

Keywords: artificial communication, artificial creativity, artificial writers, meta-authorship, robotic art

Procedia PDF Downloads 275
27339 Analytical Hierarchical Process for Multi-Criteria Decision-Making

Authors: Luis Javier Serrano Tamayo

Abstract:

This research makes a first approach to the selection of an amphibious landing ship with strategic capabilities through the implementation of a multi-criteria model using the Analytical Hierarchical Process (AHP), in which a significant group of latest-technology alternatives has been considered. The variables were grouped at different levels to match design and performance characteristics, which affect the lifecycle as well as the acquisition, maintenance and operational costs. The model yielded an overall measure of effectiveness and an overall measure of cost for each kind of ship, which were compared with one another within the model and shown in a Pareto chart. The modeling was developed using the Expert Choice software, based on the AHP method.
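
For readers unfamiliar with the mechanics behind such a model, the sketch below shows how criterion weights and a consistency check can be derived from a pairwise comparison matrix with NumPy. The criteria names and judgment values are illustrative assumptions, not the study's data, and the calculation stands in for the Expert Choice software used by the author.

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix for three assumed criteria
# (e.g., lift capacity, lifecycle cost, maintainability) -- values are illustrative only.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights are the normalized principal eigenvector of A
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency ratio (CR) checks whether the judgments are coherent (CR < 0.1 is the usual rule)
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58                      # 0.58 is Saaty's random index for n = 3
print(weights, cr)
```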

Keywords: analytic hierarchy process, multi-criteria decision-making, Pareto analysis, Colombian Marine Corps, projection operations, expert choice, amphibious landing ship

Procedia PDF Downloads 533
27338 A Pre-Assessment Questionnaire to Identify Healthcare Professionals’ Perception on Information Technology Implementation

Authors: Y. Atilgan Şengül

Abstract:

Health information technologies promise higher quality and safer care, and much more, for both patients and professionals. Despite their promise, they are costly to develop and difficult to implement. At the same time, user acceptance and usage determine the success of implemented information technology in healthcare. This study provides a model for understanding health professionals’ perceptions and expectations of health information technology. An extensive literature review was conducted to determine the main factors to be measured. A questionnaire was designed as a measurement model and administered to the personnel of an in vitro fertilization clinic. The respondents’ degree of agreement on a five-point Likert scale was 72% for convenient access to data and 69.4% for the importance of data security. There was a significant difference in the acceptance of electronic data storage for female respondents, and other significant differences between professions were also obtained.
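
As a hedged illustration of how such group differences on Likert items can be tested, the snippet below applies a non-parametric Mann-Whitney U test to two made-up sets of five-point responses; the actual responses and the test used in the study are not reported in the abstract.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Made-up five-point Likert responses for two respondent groups (illustration only)
female = np.array([5, 4, 5, 4, 5, 3, 4, 5, 4, 4])
male   = np.array([3, 4, 3, 2, 4, 3, 3, 4, 2, 3])

# Ordinal Likert data is commonly compared with a non-parametric test
stat, p_value = mannwhitneyu(female, male, alternative='two-sided')
print(f"U = {stat}, p = {p_value:.3f}")
```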

Keywords: healthcare, health informatics, medical record system, questionnaire

Procedia PDF Downloads 158
27337 Parametric Study and Design on under Reamed Pile - An Experimental and Numerical Study

Authors: S. Chandrakaran, Aarthy D.

Abstract:

Under reamed piles are piles of different types, such as bored cast in-situ piles or bored compaction concrete piles, in which one or more bulbs are provided. In this paper, the design procedure of the under reamed pile was studied both experimentally and numerically using the PLAXIS 3D Foundation software. The soil chosen for the study was M sand. Single and double under reamed pile models were fabricated from mild steel. Pile load tests were conducted in the laboratory, the ultimate compression load at 25 mm settlement was observed for the single and double under reamed piles, and the results were compared with a conventional pile (a pile without a bulb). The parametric influence on the under reamed pile was studied by varying geometrical parameters such as the diameter of the bulbs, the spacing between bulbs, the position of the bulbs and the number of bulbs. The results of the numerical model showed that when the bulb diameter was Du = 2.5D, the ultimate compression load of an under reamed pile with a single bulb increased by 55% compared to a pile without a bulb. When the spacing between the bulbs was S = 6Du, with three different bulb positions from the bottom of the pile of Du, 2Du and 3Du, the ultimate compression load increased by 88%, 94% and 73%, respectively, compared to the ultimate compression load at 25 mm settlement of the conventional pile; when the spacing exceeded 6Du, the ultimate compression load at 25 mm settlement started to decrease. It was observed that when the bucket length was more than 2Du, the ultimate compression

Keywords: load capcity, under remed bulb . sand, model study, sand

Procedia PDF Downloads 66
27336 Deep Learning for SAR Images Restoration

Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli

Abstract:

In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth's surface monitoring. SAR Systems are often considered to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings the properties of rotational invariance to geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area or swath of fully polarimetric images compared to the swath of dual or hybrid polarimetric images. The search for solutions to augment dual polarimetric data to full polarimetric data will therefore take advantage of full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature. Although the improvements achieved by the newly investigated and experimented reconstruction techniques are undeniable, the existing methods are, however, mostly based upon model assumptions (especially the assumption of reflectance symmetry), which may limit their reliability and applicability to vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses Deep Learning solutions to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images defined by the combination of different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known and standard approach from the literature. From the experiments, the reconstruction performance of the proposed framework is superior to conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
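
To make the augmentation idea concrete, the following is a minimal PyTorch sketch of a CNN that maps hybrid-polarimetric input channels to pseudo fully polarimetric output channels with a composite loss. The channel counts, depth, and loss terms are assumptions for illustration and do not reproduce the architecture or cost function used in the paper.

```python
import torch
import torch.nn as nn

class PolAugmentCNN(nn.Module):
    """Maps hybrid-polarimetric input channels to pseudo fully polarimetric channels (sketch)."""
    def __init__(self, in_ch=2, out_ch=4, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def composite_loss(pred, target, alpha=1.0, beta=0.1):
    # Combines a pixel-wise term with a term on the total backscattered power,
    # mimicking the idea of constraining several scattering properties at once.
    pixel = nn.functional.l1_loss(pred, target)
    span = nn.functional.l1_loss(pred.sum(dim=1), target.sum(dim=1))
    return alpha * pixel + beta * span

model = PolAugmentCNN()
x = torch.randn(1, 2, 128, 128)      # dummy hybrid-polarimetric patch
y = torch.randn(1, 4, 128, 128)      # dummy fully polarimetric reference
loss = composite_loss(model(x), y)
loss.backward()
```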

Keywords: SAR image, polarimetric SAR image, convolutional neural network, deep learnig, deep neural network

Procedia PDF Downloads 56
27335 Application of Digital Image Correlation Technique on Vacuum Assisted Resin Transfer Molding Process and Performance Evaluation of the Produced Materials

Authors: Dingding Chen, Kazuo Arakawa, Masakazu Uchino, Changheng Xu

Abstract:

Vacuum assisted resin transfer moulding (VARTM) is a promising manufacturing process for making large and complex fiber reinforced composite structures. However, the complexity of the resin flow in the infusion stage usually leads to a non-uniform property distribution in the produced composite part. In order to control the flow of the resin, the flow behaviour must first be understood, and for the safe use of the produced composite in practice, an understanding of the property distribution is essential. In this paper, we carried out trials on monitoring the resin infusion stage and on evaluating the fiber volume fraction distribution of the VARTM-produced composite using digital image correlation (DIC) methods. The results show that 3D-DIC is effective for monitoring the resin infusion stage and that 2D-DIC can be used to estimate the distribution of the fiber volume fraction in an FRP plate.
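
As background on the measurement principle, the sketch below shows the core subset-matching step of digital image correlation: tracking a small square subset between two grayscale frames with zero-normalized cross-correlation. It is a simplified, integer-pixel illustration under assumed inputs, not the authors' 2D/3D-DIC processing chain.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_subset(ref, cur, top_left, size=21, search=10):
    """Find the integer-pixel displacement of one square subset between two frames."""
    y0, x0 = top_left
    template = ref[y0:y0 + size, x0:x0 + size]
    best, best_dy, best_dx = -1.0, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            patch = cur[y0 + dy:y0 + dy + size, x0 + dx:x0 + dx + size]
            if patch.shape != template.shape:
                continue           # skip candidates that fall outside the image
            score = zncc(template, patch)
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return best_dy, best_dx, best
```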

Keywords: digital image correlation, VARTM, FRP, fiber volume fraction

Procedia PDF Downloads 319
27334 Analysis of Vertical Hall Effect Device Using Current-Mode

Authors: Kim Jin Sup

Abstract:

This paper presents a vertical Hall effect device using current-mode operation. Among the different geometries that have been studied and simulated using COMSOL Multiphysics, an optimized cross-shaped model displayed the best sensitivity. The cross-shaped model emerged as the optimum plate, combining the lowest noise and residual offset with the best sensitivity. The symmetrical cross-shaped Hall plate is widely used because of its high sensitivity and its immunity to the alignment tolerances resulting from the fabrication process. The Hall effect device has been designed using a 0.18-μm CMOS technology. The simulation uses a nominal bias current of 12 μA, and the applied magnetic field ranges from 0 mT to 20 mT. Simulation results were obtained in COMSOL and validated against the electrical behavior of an equivalent circuit in Cadence. Simulation results for one structure out of the 13 available samples show, for the best geometry, a current-mode sensitivity of 6.6 %/T at 20 mT. Acknowledgment: This work was supported by the Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (No. R7117-16-0165, Development of Hall Effect Semiconductor for Smart Car and Device).

Keywords: vertical hall device, current-mode, crossed-shaped model, CMOS technology

Procedia PDF Downloads 277
27333 Numerical Simulation of Wishart Diffusion Processes

Authors: Raphael Naryongo, Philip Ngare, Anthony Waititu

Abstract:

This paper deals with the numerical simulation of Wishart processes for a single-asset risky pricing model whose volatility is described by Wishart affine diffusion processes. The multi-factor specification of volatility makes the model flexible enough to fit stock market data over short or long maturities for better returns. The Wishart process is a stochastic process which is a positive semi-definite matrix-valued generalization of the square root process. The aim of the study is to model the log asset stock returns under the double Wishart stochastic volatility model. The solution of the log-asset return dynamics for bi-Wishart processes is obtained through Euler-Maruyama discretization schemes. The numerical results on the asset returns are compared to the returns of existing models such as the Heston stochastic volatility model and the double Heston stochastic volatility model.
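
A minimal sketch of the Euler-Maruyama discretization for a matrix-valued Wishart process is given below, with parameter values chosen purely for illustration rather than calibrated to market data. The symmetrization and eigenvalue clipping step is one simple way to keep the discretized state positive semi-definite and is an assumption of this sketch, not necessarily the scheme used by the authors.

```python
import numpy as np

# Euler-Maruyama for a d x d Wishart process
#   dX_t = (alpha * Q.T Q + M X_t + X_t M.T) dt + sqrt(X_t) dW_t Q + Q.T dW_t.T sqrt(X_t)
def sqrtm_psd(X):
    w, v = np.linalg.eigh(X)
    return v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.T

def simulate_wishart(X0, alpha, M, Q, T=1.0, n_steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    d = X0.shape[0]
    dt = T / n_steps
    X = X0.copy()
    path = [X.copy()]
    for _ in range(n_steps):
        dW = rng.normal(scale=np.sqrt(dt), size=(d, d))
        S = sqrtm_psd(X)
        drift = alpha * Q.T @ Q + M @ X + X @ M.T
        X = X + drift * dt + S @ dW @ Q + Q.T @ dW.T @ S
        # symmetrize and clip eigenvalues so the discretized state stays positive semi-definite
        X = 0.5 * (X + X.T)
        w, v = np.linalg.eigh(X)
        X = v @ np.diag(np.clip(w, 0.0, None)) @ v.T
        path.append(X.copy())
    return np.array(path)

d = 2
X0 = 0.04 * np.eye(d)            # initial variance matrix (illustrative)
M = -0.5 * np.eye(d)             # mean reversion
Q = 0.1 * np.eye(d)              # volatility of volatility
vol_path = simulate_wishart(X0, alpha=2.5, M=M, Q=Q)
```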

Keywords: euler schemes, log-asset return, infinitesimal generator, wishart diffusion affine processes

Procedia PDF Downloads 362
27332 Application of Lattice Boltzmann Method to Different Boundary Conditions in a Two Dimensional Enclosure

Authors: Jean Yves Trepanier, Sami Ammar, Sagnik Banik

Abstract:

The Lattice Boltzmann Method has been advantageous in simulating complex boundary conditions and solving for fluid flow parameters through streaming and collision processes. This paper includes the study of three different test cases in a confined domain using the Lattice Boltzmann model. 1. An SRT (Single Relaxation Time) approach in the Lattice Boltzmann model is used to simulate lid-driven cavity flow for different Reynolds numbers (100, 400 and 1000) with a domain aspect ratio of 1, i.e., a square cavity. A moment-based boundary condition is used for more accurate results. 2. A thermal lattice BGK (Bhatnagar-Gross-Krook) model is developed for Rayleigh-Benard convection for two test cases - horizontal and vertical temperature difference - considered separately for a Boussinesq incompressible fluid. The Rayleigh number is varied for both test cases (10^3 ≤ Ra ≤ 10^6), keeping the Prandtl number at 0.71. A stability criterion with a precise forcing scheme is used for a greater level of accuracy. 3. The phase change problem governed by the heat-conduction equation is studied using the enthalpy-based Lattice Boltzmann model with a single iteration per time step, thus reducing the computational time. A double distribution function approach with a D2Q9 (density) model and a D2Q5 (temperature) model is used for two different test cases - conduction-dominated melting and convection-dominated melting. The solidification process is also simulated using the enthalpy-based method with a single distribution function using the D2Q5 model to provide a better understanding of the heat transport phenomenon. The domain for the test cases has an aspect ratio of 2, with some exceptions for a square cavity. An approximate velocity scale is chosen to ensure that the simulations are within the incompressible regime. Different parameters like velocities, temperature, Nusselt number, etc., are calculated for a comparative study with the existing literature. The simulated results demonstrate excellent agreement with the existing benchmark solutions within an error limit of ±0.05, which indicates the viability of this method for complex fluid flow problems.
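
As a concrete reference for the first test case, the following is a minimal single-relaxation-time (SRT/BGK) D2Q9 sketch of the lid-driven cavity. The grid size, relaxation time and lid speed are illustrative, and the walls use plain half-way bounce-back with a moving-lid correction instead of the moment-based boundary condition used in the paper; the corner treatment is deliberately approximate.

```python
import numpy as np

nx = ny = 128
tau, u_lid = 0.6, 0.1
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
cx = np.array([0, 1, 0, -1, 0, 1, -1, -1, 1])
cy = np.array([0, 0, 1, 0, -1, 1, 1, -1, -1])
opp = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])

def equilibrium(rho, ux, uy):
    usq = ux**2 + uy**2
    feq = np.empty((9, ny, nx))
    for i in range(9):
        cu = cx[i]*ux + cy[i]*uy
        feq[i] = w[i]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    return feq

rho = np.ones((ny, nx)); ux = np.zeros((ny, nx)); uy = np.zeros((ny, nx))
f = equilibrium(rho, ux, uy)

for step in range(20000):
    feq = equilibrium(rho, ux, uy)
    f_post = f - (f - feq)/tau                          # BGK collision
    for i in range(9):                                  # streaming (periodic roll, corrected below)
        f[i] = np.roll(np.roll(f_post[i], cy[i], axis=0), cx[i], axis=1)
    for i in range(9):                                  # half-way bounce-back on the four walls
        j = opp[i]
        if cy[i] > 0:                                   # top wall (row -1) moves with u_lid
            f[j][-1, :] = f_post[i][-1, :] - 6*w[i]*rho[-1, :]*cx[i]*u_lid
        if cy[i] < 0:                                   # bottom wall
            f[j][0, :] = f_post[i][0, :]
        if cx[i] > 0:                                   # right wall
            f[j][:, -1] = f_post[i][:, -1]
        if cx[i] < 0:                                   # left wall
            f[j][:, 0] = f_post[i][:, 0]
    rho = f.sum(axis=0)                                 # macroscopic moments
    ux = (cx[:, None, None]*f).sum(axis=0)/rho
    uy = (cy[:, None, None]*f).sum(axis=0)/rho
```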

Keywords: BGK, Nusselt, Prandtl, Rayleigh, SRT

Procedia PDF Downloads 115
27331 Operation Strategies of Residential Micro Combined Heat and Power Technologies

Authors: Omar A. Shaneb, Adell S. Amer

Abstract:

The reduction of CO2 emissions has become a priority for several countries due to increasing concerns about global warming and climate change, especially in the developed countries. The residential sector is considered one of the most important sectors for a considerable reduction of CO2 emissions, since it represents a significant share of the total energy consumed in those countries. A significant CO2 reduction cannot be achieved unless some initiatives are adopted in the policy of these countries. Introducing micro combined heat and power (µCHP) systems into residential energy systems is one of these initiatives, since such a technology offers several advantages. Moreover, µCHP technology can be operated not only on natural gas but also on renewable fuels. However, this technology can be operated under different operation strategies, each with its own advantages and disadvantages. This paper provides a review of the different operation strategies of such a technology used in residential energy systems, especially for single dwellings. The review summarizes key points that outline the trend of previous research carried out in this field.

Keywords: energy management, µCHP systems, residential energy systems, sustainable houses, operation strategy.

Procedia PDF Downloads 413
27330 Validation of the Recovery of House Dust Mites from Fabrics by Means of Vacuum Sampling

Authors: A. Aljohani, D. Burke, D. Clarke, M. Gormally, M. Byrne, G. Fleming

Abstract:

Introduction: House dust mites (HDMs) are a source of allergen particles embedded in textiles and furnishings. Vacuum sampling is commonly used to recover HDMs and determine their abundance, but the efficiency of this method is less than standardized. Here, the efficiency of recovery of HDMs from home-associated textiles was evaluated using vacuum sampling protocols. Methods/Approach: Live mites (LMs) or dead mites (DMs) of the house dust mite Dermatophagoides pteronyssinus (FERA, UK) were separately seeded onto the surfaces of Smooth Cotton, Denim and Fleece (25 mites per 10x10 cm2 square) and left for 10 minutes before vacuuming. Fabrics were vacuumed (SKC Flite 2 pump) at a flow rate of 14 L/min for 60, 90 or 120 seconds, and the number of mites retained by the filter unit (0.4 μm x 37 mm) was determined. Vacuuming was carried out in a linear direction (Protocol 1) or in a multidirectional pattern (Protocol 2). Additional fabrics with LMs were frozen and then thawed, thereby euthanizing the live mites (now termed EMs). Results/Findings: While recovery was significantly greater (p=0.000; 76% greater) from fabrics seeded with DMs than with LMs, irrespective of vacuuming protocol or fabric type, the efficiency of recovery of DMs (72%-76%) did not vary significantly between fabrics. For fabrics containing EMs, recovery was greatest for Smooth Cotton and Denim (65-73% recovered) and least for Fleece (15% recovered). There was no significant difference (p=0.99) in the recovery of mites across all three mite categories from Smooth Cotton and Denim, but significantly fewer (p=0.000) mites were recovered from Fleece. Scanning electron microscopy images of HDM-seeded fabrics showed that live mites burrowed deeply into the Fleece weave, which reduced their efficiency of recovery by vacuuming. Research Implications: The results presented here have implications for the recovery of HDMs by vacuuming and for the choice of fabric to ameliorate HDM-dust sensitization.

Keywords: allergy, asthma, dead, fabric, fleece, live mites, sampling

Procedia PDF Downloads 122
27329 Green It-Outsourcing Assurance Model for It-Outsourcing Vendors

Authors: Siffat Ullah Khan, Rahmat Ullah Khan, Rafiq Ahmad Khan, Habibullah Khan

Abstract:

Green IT, or green computing, has emerged as a fast-growing business paradigm in recent years, aiming to develop energy-efficient software and peripheral devices. With the constant evolution of technology and the critical state of the world's environment, all private and public information technology (IT) businesses are moving towards sustainability. Through a systematic literature review and a questionnaire survey, we identified a total of 9 motivators faced by vendors in IT-outsourcing relationships. Amongst these motivators, 7 were ranked as critical. We also identified a total of 21 practices for addressing these critical motivators. Based on these inputs, we have developed the Green IT-Outsourcing Assurance Model (GITAM) for IT-outsourcing vendors. The model comprises four different levels, i.e., Initial, White, Green and Grey. Each level comprises different critical motivators and their relevant practices. We conclude that our model, GITAM, will assist IT-outsourcing vendors in gauging their level in order to manage IT-outsourcing activities in a green and sustainable fashion, to assist the environment and to reduce carbon emissions. The model will assist vendors in improving their current level by suggesting various practices, and it will contribute to the body of knowledge in the field of Green IT.

Keywords: green IT-outsourcing assurance model (GITAM), systematic literature review, empirical study, case study

Procedia PDF Downloads 234
27328 Deep Learning Based Polarimetric SAR Images Restoration

Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli

Abstract:

In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth's surface monitoring . SAR Systems are often considered to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings the properties of rotational invariance to geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area or swath of fully polarimetric images compared to the swath of dual or hybrid polarimetric images. The search for solutions to augment dual polarimetric data to full polarimetric data will therefore take advantage of full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature. Although the improvements achieved by the newly investigated and experimented reconstruction techniques are undeniable, the existing methods are, however, mostly based upon model assumptions (especially the assumption of reflectance symmetry), which may limit their reliability and applicability to vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses Deep Learning solutions to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images defined by the combination of different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known and standard approach from the literature. From the experiments, the reconstruction performance of the proposed framework is superior to conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.

Keywords: SAR image, deep learning, convolutional neural network, deep neural network, SAR polarimetry

Procedia PDF Downloads 71
27327 A Comprehensive Study on Cast NiTi and Ti64 Alloys for Biomedical Applications

Authors: Khaled Mohamed Ibrahim

Abstract:

A comprehensive study on two biomaterials, NiTi and Ti-6Al-4V (Ti64), was carried out. The materials were cast using the vacuum arc remelting technique. The as-cast structure of the Ni-Ti alloy consists of a NiTi matrix and some fine precipitates of Ni4Ti3. The Ti-6Al-4V alloy showed a structure composed of equiaxed β grains and varied α-phase morphologies. A maximum ultimate compressive strength of 2042 MPa and a reduction in height of 18% were reported for the cast Ti64 alloy, whereas a minimum ultimate compressive strength of 1804 MPa and a low reduction in height of 3% were obtained for the cast NiTi alloy. The wear rate of both the Ni-Ti and Ti-6Al-4V alloys increased significantly in saline solution (0.9% NaCl) compared to the dry testing condition; the saline solution reduced the wear resistance by a factor of about 2 to 4 relative to the dry condition. The corrosion rate of the NiTi alloy in saline solution (0.9% NaCl) (0.00038 mm/yr) was almost three times that of the Ti64 alloy (0.000171 mm/yr). The corrosion rate of Ti64 in SBF (0.00024 mm/yr) was lower than that of Ni-Ti (0.0003 mm/yr).

Keywords: NiTi, Ti64, vacuum casting, biomaterials

Procedia PDF Downloads 65
27326 Numerical and Experimental Studies on the Characteristic of the Air Distribution in the Wind-Box of a Circulating Fluidized Bed Boiler

Authors: Xiaozhou Liu, Guangyu Zhu, Yu Zhang, Hongwei Wu

Abstract:

The wind-box is one of the important components of a circulating fluidized bed (CFB) boiler, and the uniformity of the air flow in the wind-box is very important for highly efficient operation of the boiler. A non-uniform air flow distribution within the wind-box can reduce the boiler's thermal efficiency, leading to higher energy consumption. An effective measure to solve this problem is to install an air flow distributing device in the wind-box. In order to validate the effectiveness of the air flow distributing device, visual and velocity-distribution-uniformity experiments were carried out under five different test conditions using a 1:64 scale model of a 220 t/hr CFB boiler. It was shown that the z component of the flow velocity remains almost the same at the control cross-sections of the wind-box, with a maximum variation of less than 10%. Moreover, the same methodology was applied to a full-scale 220 t/hr CFB boiler. The hot test results show that the thermal efficiency of the boiler increased from 85.71% to 88.34% when tested with the air flow distributing device in place, which is equivalent to a saving of 5,000 tons of coal per year. The economic benefits of this energy-saving technology are very significant, which clearly demonstrates that the technology is worth applying and popularizing.

Keywords: circulating fluidized bed, CFB, wind-box, air flow distributing device, visual experiment, velocity distribution uniformity experiment, hot test

Procedia PDF Downloads 156
27325 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm, and its accurate prediction is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts into an overall, presumably better forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks: MOS, for instance, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast; to do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights when combining different forecast models. Third, we use these ensembles and compare them with several existing models in the literature to forecast the storm surge level. We then investigate whether developing a complex ensemble model is indeed needed; to achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial, so we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, we treat them as a single contiguous hurricane event. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast, and we also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
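
The sketch below illustrates, on made-up data, the kind of weighting described above: individual forecasts are combined with correlation-based weights estimated on a training window and compared against the simple-average benchmark by RMSE. The synthetic series stand in for the NYHOPS forecasts and observations, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
obs = np.sin(np.linspace(0, 6, 200)) + 0.1*rng.normal(size=200)          # synthetic "observed" surge
forecasts = np.stack([obs + 0.2*rng.normal(size=200),                    # three synthetic model forecasts
                      obs + 0.3*rng.normal(size=200) + 0.1,
                      obs + 0.5*rng.normal(size=200)])

def rmse(pred, truth):
    return np.sqrt(np.mean((pred - truth)**2))

# Correlation-based weights estimated on a training window, then tested out of sample
train = slice(0, 100)
corr = np.array([np.corrcoef(fc[train], obs[train])[0, 1] for fc in forecasts])
weights = np.clip(corr, 0, None)
weights /= weights.sum()

simple_avg = forecasts.mean(axis=0)                  # simple-average benchmark
weighted = (weights[:, None]*forecasts).sum(axis=0)  # correlation-weighted ensemble
print("simple average RMSE:", rmse(simple_avg[100:], obs[100:]))
print("correlation-weighted RMSE:", rmse(weighted[100:], obs[100:]))
```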

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 293
27324 Density Determination by Dilution for Extra Heavy Oil Residues Obtained Using Molecular Distillation and Supercritical Fluid Extraction as Upgrading and Refining Process

Authors: Oscar Corredor, Alexander Guzman, Adan Leon

Abstract:

Density is a bulk physical property that indicates the quality of a petroleum fraction. It is also a useful property for estimating various physicochemical properties of fractions and petroleum fluids; however, the determination of the density of extra heavy residual (EHR) fractions by standard methodologies (ASTM D70) shows limitations for samples with densities higher than 1.0879 g/cm3. For this reason, a dilution methodology was developed in order to determine the density of those particular fractions. 87 EHR fractions were obtained as products of the fractionation of typical Colombian vacuum distillation residual fractions using molecular distillation (MD) and extraction with solvent n-hexane under supercritical conditions (SFEF) in pilot plants. The proposed methodology showed reliable results, as demonstrated by standard deviations of repeatability and reproducibility of 0.0031 and 0.0061 g/ml, respectively. In the same way, it was possible to determine densities of EHR fractions up to 1.1647 g/cm3, and the °API values obtained were ten times lower than the water reference value.
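
A minimal sketch of the back-calculation behind such a dilution measurement is shown below, assuming ideal (volume-additive) mixing of the residue with a solvent of known density and using the standard °API relation. The masses and densities are illustrative, and real extra heavy residues may show excess-volume effects that this simple rule ignores.

```python
# Back out the residue density from the measured density of a residue/solvent blend,
# assuming volumes are additive (an idealization for illustration only).
def residue_density(m_res, m_sol, rho_sol, rho_mix):
    v_mix = (m_res + m_sol) / rho_mix      # total blend volume
    v_res = v_mix - m_sol / rho_sol        # volume attributed to the residue
    return m_res / v_res

def api_gravity(rho_15_6C, rho_water=0.999):
    sg = rho_15_6C / rho_water             # specific gravity at 60 degF
    return 141.5 / sg - 131.5              # standard API gravity relation

# Illustrative numbers: 2 g of residue diluted in 8 g of a toluene-like solvent
rho = residue_density(m_res=2.0, m_sol=8.0, rho_sol=0.8669, rho_mix=0.90)  # g/cm^3
print(rho, api_gravity(rho))
```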

Keywords: API, density, vacuum residual, molecular distillation, supercritical fluid extraction

Procedia PDF Downloads 255
27323 Impact of Task Technology Fit on User Effectiveness, Efficiency and Creativity in Iranian Pharmaceutical Oraganizations

Authors: Milad Keshvardoost, Amir Khanlari, Nader Khalesi

Abstract:

Background: Any firm in the pharmaceutical industry requires efficient and effective management information systems (MIS) to support managerial functions. Purpose: The aim of this study is to investigate the impact of task-technology fit on user effectiveness, efficiency, and creativity in Iranian pharmaceutical companies. Methodology: 345 reliable and valid Likert-scale questionnaires were distributed, through the cluster method, to selected samples of information system users at eight leading Iranian pharmaceutical companies. The proposed model relates task-technology fit to user performance, defined in terms of efficiency, effectiveness, and creativity, through the mediating effects of perceived usefulness and perceived ease of use. Results: This study confirmed that TTF, defined in terms of adequacy and compatibility, has a positive impact on user performance. Conclusion: We conclude that when pharmaceutical IS users are provided with a system built on precise and close observation of users' demands, the design of an exclusive IS framework is facilitated for them.

Keywords: information systems, user performance, pharmaceuticals, task technology fit

Procedia PDF Downloads 151
27322 Development of a Complete Single Jet Common Rail Injection System Gas Dynamic Model for Hydrogen Fueled Engine with Port Injection Feeding System

Authors: Mohammed Kamil, M. M. Rahman, Rosli A. Bakar

Abstract:

The modeling of the hydrogen fueled engine (H2ICE) injection system is a very important tool that can be used for explaining or predicting the effect of advanced injection strategies on combustion and emissions. In this paper, a common rail injection system (CRIS) is proposed for a 4-stroke, 4-cylinder hydrogen fueled engine with a port injection feeding system (PIH2ICE). For this system, a numerical one-dimensional gas dynamic model is developed considering a single injection event per injector per cycle. One-dimensional flow equations in conservation form are used to simulate the wave propagation phenomenon throughout the common rail (accumulator). Using this model, the effect of the common rail on the injection system characteristics is clarified. These characteristics include rail pressure, sound velocity, rail mass flow rate, injected mass flow rate and pressure drop across the injectors. The interaction effects of operational conditions (engine speed and rail pressure) and geometrical features (injector hole diameter) are illustrated, and the required compromise solutions are highlighted. The CRIS is shown to be a promising enhancement for the PIH2ICE.
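
To illustrate what a one-dimensional conservation-form wave propagation step can look like, the sketch below advances the 1D Euler equations along the rail with a first-order Lax-Friedrichs update. The gas model, geometry, initial states and boundary treatment are all illustrative assumptions, since the paper's actual discretization and injector source terms are not given in the abstract.

```python
import numpy as np

gamma = 1.4
nx, L = 200, 0.5                     # cells, rail length [m]
dx = L / nx
rho = np.full(nx, 1.0); u = np.zeros(nx); p = np.full(nx, 5e5)
p[:nx//10] = 6e5                     # local overpressure, e.g. near the supply inlet

def to_conserved(rho, u, p):
    E = p/(gamma - 1) + 0.5*rho*u**2
    return np.stack([rho, rho*u, E])

def flux(U):
    rho, mom, E = U
    u = mom/rho
    p = (gamma - 1)*(E - 0.5*rho*u**2)
    return np.stack([mom, mom*u + p, (E + p)*u])

U = to_conserved(rho, u, p)
for _ in range(400):
    pres = np.maximum((gamma - 1)*(U[2] - 0.5*U[1]**2/U[0]), 1e-9)
    a = np.sqrt(gamma*pres/U[0])                   # local sound speed
    dt = 0.4*dx/np.max(np.abs(U[1]/U[0]) + a)      # CFL-limited time step
    F = flux(U)
    Up = np.roll(U, -1, axis=1); Um = np.roll(U, 1, axis=1)
    Fp = np.roll(F, -1, axis=1); Fm = np.roll(F, 1, axis=1)
    U = 0.5*(Up + Um) - dt/(2*dx)*(Fp - Fm)        # Lax-Friedrichs update (periodic ends for brevity)
```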

Keywords: common rail, hydrogen engine, port injection, wave propagation

Procedia PDF Downloads 408
27321 Scheduling in a Single-Stage, Multi-Item Compatible Process Using Multiple Arc Network Model

Authors: Bokkasam Sasidhar, Ibrahim Aljasser

Abstract:

The problem of finding optimal schedules for each piece of equipment in a production process is considered, where the process consists of a single stage of manufacturing and can handle different types of products, and where the changeover from handling one type of product to another incurs certain costs. The machine capacity is determined by the upper limit on the quantity that can be processed for each product in a set-up. The changeover costs increase with the number of set-ups, and hence, to minimize the costs associated with product changeovers, the planning should be such that similar types of products are processed successively so that the total number of changeovers, and in turn the associated set-up costs, is minimized. The problem of cost minimization is thus equivalent to minimizing the number of set-ups or, equivalently, maximizing the capacity utilization between every set-up, i.e., maximizing the total capacity utilization. Further, production is usually planned against customers' orders, and different customers' orders are generally assigned one of two priorities - "normal" or "priority". The production planning problem in such a situation can be formulated as a Multiple Arc Network (MAN) model and solved sequentially using the algorithm for maximizing flow along a MAN and the algorithm for maximizing flow along a MAN with priority arcs. The model aims to provide an optimal production schedule with the objective of maximizing capacity utilization, so that the customer-wise delivery schedules are fulfilled, keeping the customer priorities in view. Algorithms are presented for solving the MAN formulation of production planning with customer priorities, and the application of the model is demonstrated through numerical examples.
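
As a toy illustration of the maximal-flow computation that underlies such a formulation, the snippet below builds a small source-to-sink network with NetworkX and computes its maximum flow; the node layout, capacities and order sizes are invented and do not reproduce the authors' MAN model or its priority-arc algorithm.

```python
import networkx as nx

G = nx.DiGraph()
# source -> set-ups -> (product, customer) demands -> sink; capacities are processable quantities
G.add_edge("source", "setup_A", capacity=100)
G.add_edge("source", "setup_B", capacity=80)
G.add_edge("setup_A", "prod1_cust1", capacity=60)
G.add_edge("setup_A", "prod1_cust2", capacity=60)
G.add_edge("setup_B", "prod2_cust1", capacity=80)
G.add_edge("prod1_cust1", "sink", capacity=50)   # order sizes
G.add_edge("prod1_cust2", "sink", capacity=40)
G.add_edge("prod2_cust1", "sink", capacity=70)

flow_value, flow_dict = nx.maximum_flow(G, "source", "sink")
print("total scheduled quantity:", flow_value)   # capacity utilization achieved
```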

Keywords: scheduling, maximal flow problem, multiple arc network model, optimization

Procedia PDF Downloads 389
27332 Computational Fluid Dynamics (CFD) Simulations of Air Pollutant Dispersion: Validation of the Fire Dynamics Simulator against the CUTE Experiments of the COST ES1006 Action

Authors: Virginie Hergault, Siham Chebbah, Bertrand Frere

Abstract:

Following in-house objectives, the Central Laboratory of the Paris Police Prefecture conducted a general review of the models and Computational Fluid Dynamics (CFD) codes used to simulate pollutant dispersion in the atmosphere. Starting from that review and considering the main features of Large Eddy Simulation, the Central Laboratory of the Paris Police Prefecture (LCPP) postulated that the Fire Dynamics Simulator (FDS) model, from the National Institute of Standards and Technology (NIST), should be well suited for air pollutant dispersion modeling. This paper focuses on the implementation and the evaluation of FDS in the frame of the European COST ES1006 Action, which aimed at quantifying the performance of modeling approaches. In this paper, the CUTE dataset, carried out in the city of Hamburg, and its mock-up have been used. We have performed a comparison of FDS results with wind tunnel measurements from the CUTE trials on the one hand, and with the results of the models involved in the COST Action on the other. The most time-consuming part of creating input data for the simulations is the transfer of obstacle geometry information to the format required by FDS; thus, we have developed Python codes to automatically convert building and topographic data to the FDS input file. In order to evaluate the predictions of FDS against observations, statistical performance measures have been used. These metrics include the fractional bias (FB), the normalized mean square error (NMSE) and the fraction of predictions within a factor of two of the observations (FAC2). Like the other CFD models tested in the COST Action, the FDS results demonstrate a good agreement with the measured concentrations. Furthermore, the metrics assessment indicates that FB and NMSE are within the acceptable tolerances.
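
The acceptance metrics named above have standard definitions in the dispersion model validation literature (e.g., FB, NMSE and FAC2 as used by Chang and Hanna); the snippet below computes them for paired observed and predicted concentrations, with made-up sample values and the assumption that no concentrations are zero.

```python
import numpy as np

def validation_metrics(c_obs, c_pred):
    c_obs, c_pred = np.asarray(c_obs, float), np.asarray(c_pred, float)
    # fractional bias (one common sign convention: positive means under-prediction)
    fb = 2.0*(c_obs.mean() - c_pred.mean())/(c_obs.mean() + c_pred.mean())
    # normalized mean square error
    nmse = np.mean((c_obs - c_pred)**2)/(c_obs.mean()*c_pred.mean())
    # fraction of predictions within a factor of two of the observations
    fac2 = np.mean((c_pred >= 0.5*c_obs) & (c_pred <= 2.0*c_obs))
    return fb, nmse, fac2

obs  = [1.2, 0.8, 2.5, 0.4, 1.9]   # made-up concentrations
pred = [1.0, 0.9, 2.0, 0.7, 1.6]
print(validation_metrics(obs, pred))
```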

Keywords: numerical simulations, atmospheric dispersion, cost ES1006 action, CFD model, cute experiments, wind tunnel data, numerical results

Procedia PDF Downloads 120
27319 The Evaluation Model for the Quality of Software Based on Open Source Code

Authors: Li Donghong, Peng Fuyang, Yang Guanghua, Su Xiaoyan

Abstract:

Using open source code is a popular method of software development, and evaluating the quality of such software has become increasingly important. This paper introduces an evaluation model. The model evaluates quality along four dimensions: technology, production, management, and development. Each dimension includes many indicators, and the weight of each indicator can be modified according to the purpose of the evaluation. The paper also introduces a method for using the model. The evaluation result can provide useful guidance for assessing or purchasing the software.
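
A toy sketch of the weighted, multi-dimension scoring idea is shown below; the dimension names match the abstract, but the indicators, scores and weights are invented for demonstration, and the paper's actual indicator set and aggregation rule are not given in the abstract.

```python
# Indicator scores on a 0-10 scale, grouped by the four dimensions named in the abstract
scores = {
    "technology": {"code_quality": 8, "test_coverage": 6},
    "production": {"release_cadence": 7},
    "management": {"issue_response": 5, "governance": 6},
    "development": {"contributor_activity": 9},
}
weights = {"technology": 0.4, "production": 0.2, "management": 0.2, "development": 0.2}

def evaluate(scores, weights):
    total = 0.0
    for dim, indicators in scores.items():
        dim_score = sum(indicators.values()) / len(indicators)   # average within a dimension
        total += weights[dim] * dim_score                        # weights can be tuned per purpose
    return total

print(evaluate(scores, weights))
```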

Keywords: evaluation model, software quality, open source code, evaluation indicator

Procedia PDF Downloads 368
27318 Optimized Weight Selection of Control Data Based on Quotient Space of Multi-Geometric Features

Authors: Bo Wang

Abstract:

The geometric processing of multi-source remote sensing data using control data of different scales and different accuracies is an important research direction for multi-platform Earth observation systems. In the existing block bundle adjustment methods, the approach of using a single observation scale and precision for the controlling information in the adjustment system is unable to screen the control information and to assign reasonable and effective weights, which reduces the convergence and reliability of the adjustment results. Referring to the relevant theory and technology of quotient space, several subjects are researched in this project. A multi-layer quotient space of multi-geometric features is constructed to describe and filter the control data. A normalized granularity merging mechanism for multi-layer control information is studied and, based on the normalized scale factor, a strategy to optimize the weight selection of control data that is less relevant to the adjustment system is realized. At the same time, geometric positioning experiments are conducted using multi-source remote sensing data, aerial images, and multiclass control data to verify the theoretical research results. This research is expected to overcome the limitation of single-scale, single-accuracy control data in the adjustment process and to expand the theory and technology of photogrammetry, so that the problem of processing multi-source remote sensing data can be solved both theoretically and practically.

Keywords: multi-source image geometric process, high precision geometric positioning, quotient space of multi-geometric features, optimized weight selection

Procedia PDF Downloads 270
27317 Segmental Motion of Polymer Chain at Glass Transition Probed by Single Molecule Detection

Authors: Hiroyuki Aoki

Abstract:

The glass transition phenomenon has been studied extensively for a long time. The glass transition of polymer materials is assigned to a transition in the dynamics of the chain backbone segments; however, the detailed mechanism of the transition in the segmental motion is still unclear. In the current work, the single molecule detection technique was employed to reveal the trajectory of the molecular motion of a single polymer chain. The center segment of a poly(butyl methacrylate) chain was labeled with a perylenediimide dye molecule and observed with a highly sensitive fluorescence microscope under defocus conditions. The translational and rotational diffusion of the center segment in a single polymer chain was analyzed near the glass transition temperature. The direct observation of individual polymer chains revealed intermittent behavior of the segmental motion, indicating spatial inhomogeneity.

Keywords: glass transition, molecular motion, polymer materials, single molecule

Procedia PDF Downloads 308
27316 Addressing Security and Privacy Issues in a Smart Environment by Using Block-Chain as a Preemptive Technique

Authors: Shahbaz Pervez, Aljawharah Almuhana, Zahida Parveen, Samina Naz, Hira Tariq, Seyed Hosseini, Muhammad Awais Azam

Abstract:

With the latest developments in cutting-edge technologies, there has been a rapid increase in the use of technology-oriented gadgets, and there is an increasing demand to fulfill our day-to-day routine tasks with their help. We are living in an era of technology in which trends keep changing and the race to introduce new gadgets has already begun. Smart cities are becoming more popular with every passing day; city councils and governments are under enormous pressure to provide the latest services for their citizens and to equip them with all the latest facilities. Thus, they are ultimately investing more in smart city infrastructure, providing services to their inhabitants at a single click from their smart devices. This trend is very exciting but, on the other hand, if a security breach occurs through any weak link, the results can be catastrophic. This paper addresses potential security and privacy breaches and proposes a possible solution using blockchain technology in an IoT-enabled environment.

Keywords: blockchain, cybersecurity, DDOS, intrusion detection, IoT, RFID, smart devices security, smart services

Procedia PDF Downloads 103
27315 Experimental Investigation on Activated Carbon Based Cryosorption Pump

Authors: K. B. Vinay, K. G. Vismay, S. Kasturirengan, G. A. Vivek

Abstract:

Cryosorption pumps are considered to be safe, quiet, ultra-high vacuum pumps, with applications ranging from the semiconductor industry to ITER [International Thermonuclear Experimental Reactor] units. Their operation is based on the physisorption of gases over highly porous materials like activated charcoal at cryogenic temperatures (below -150 °C), which determines the pumping speed of gases like helium, hydrogen, argon and nitrogen. This paper aims at providing a detailed overview of the development of a cryosorption pump, the modern ultra-high vacuum pump, and the characterization of different activated charcoal materials that optimize the performance of the pump. Different grades of charcoal were tested in order to determine the pumping speed of the pump and were compared with a commercially available Varian cryopanel. The results for the bare panel, the bare panel with adhesive, the cryopanel with pellets, and the cryopanel with granules were obtained and compared. The comparison showed that a cryopanel adhered with small granules gave better pumping speeds than one with large pellets.

Keywords: adhesive, cryopanel, granules, pellets

Procedia PDF Downloads 408
27314 Identification of High Stress Regions in Proximal Femur During Single-Leg Stance and Sideways Fall Using QCT-Based Finite Element Model

Authors: Hossein Kheirollahi, Yunhua Luo

Abstract:

Studying stress and strain trends in the femur and recognizing the femur failure mechanism are very important for preventing hip fractures in the elderly. The aim of this study was to identify high stress and strain regions in the femur during normal walking and falling in order to find the mechanical behavior and failure mechanism of the femur. We developed a finite element model of the femur from the subject's quantitative computed tomography (QCT) image and used it to identify potentially high stress and strain regions during the single-leg stance and the sideways fall. It was found that fracture may initiate from the superior region of the femoral neck and propagate to the inferior region during a high impact force such as a sideways fall. The results of this study showed that the femur bone is more sensitive to strain than to stress, which indicates that the effect of strain, in addition to the effect of stress, should be considered in failure analysis.

Keywords: finite element analysis, hip fracture, strain, stress

Procedia PDF Downloads 490
27313 Modelling of Cavity Growth in Underground Coal Gasification

Authors: Preeti Aghalayam, Jay Shah

Abstract:

Underground coal gasification (UCG) is the in-situ gasification of unmineable coals to produce syngas. In UCG, gasifying agents are injected into the coal seam, and a reactive cavity is formed due to coal consumption. The cavity formed is typically hemispherical, and this paper presents a MATLAB model of the UCG cavity to predict the composition of the output gases. The model consists of seven radial and two time-variant ODEs. A stiff MATLAB solver (ode15s) is used to solve the radial ODEs. Two for-loops are implemented in the model, one for the time variation and another for the radial variation; the radial loop is nested inside the time loop, where the radial ODEs are solved with the MATLAB solver and the density ODEs are solved numerically using the Euler method. The model is validated by comparing it with literature results from laboratory-scale experiments, and it predicts the radial and time variation of the product gases inside the cavity.
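
Purely as a structural sketch of the nested time/radius integration described above (written in Python rather than the authors' MATLAB), the snippet below uses a stiff solver for the radial ODEs at each time step, analogous to ode15s, and an explicit Euler update for the time-variant density equations; the right-hand sides, parameter values and state dimensions are placeholders, not the actual UCG kinetics.

```python
import numpy as np
from scipy.integrate import solve_ivp

def radial_rhs(r, y, rho_char, rho_ash):
    # Placeholder for the seven radial state equations (gas species, temperature, ...)
    return -0.1 * y * (rho_char / (rho_char + rho_ash + 1e-12))

def density_rate(rho_char, y_at_wall):
    # Placeholder consumption rate for one of the time-variant density ODEs
    return -0.05 * rho_char * y_at_wall[0]

r_span = (0.01, 1.0)                                # cavity radius range [m], illustrative
y0 = np.ones(7)                                     # seven radial ODE states
rho_char, rho_ash = 800.0, 200.0                    # kg/m^3, illustrative
dt = 1.0

for step in range(100):                             # time loop
    # stiff BDF solver plays the role of ode15s for the radial ODEs
    sol = solve_ivp(radial_rhs, r_span, y0, method="BDF", args=(rho_char, rho_ash))
    y_wall = sol.y[:, -1]                           # radial profile value at the cavity wall
    rho_char += dt * density_rate(rho_char, y_wall) # explicit Euler update of the time ODE
    rho_char = max(rho_char, 0.0)
```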

Keywords: gasification agent, MATLAB model, syngas, underground coal gasification (UCG)

Procedia PDF Downloads 187