Search results for: transformed gamma distribution
2770 Analyzing the Heat Transfer Mechanism in a Tube Bundle Air-PCM Heat Exchanger: An Empirical Study
Authors: Maria De Los Angeles Ortega, Denis Bruneau, Patrick Sebastian, Jean-Pierre Nadeau, Alain Sommier, Saed Raji
Abstract:
Phase change materials (PCM) present attractive features that make them a passive solution for thermal comfort in buildings during summertime. They offer a large storage capacity per unit volume in comparison with other structural materials like bricks or concrete. If their use is matched with the peak load periods, they can contribute to reducing the primary energy consumption related to cooling applications. Despite these promising characteristics, they present some drawbacks. Commercial PCMs, such as paraffins, have a low thermal conductivity that affects the overall performance of the system. In some cases, the material can be enhanced by adding other elements that improve the conductivity, but in general, a design of the unit that optimizes the thermal performance is sought. Material selection is the departure point of the design stage, and it does not leave much room for optimization. The PCM melting point depends highly on the atmospheric characteristics of the building location: it must lie between the maximum and minimum temperatures reached during the day. The geometry of the PCM container and the geometrical distribution of the containers are design parameters as well. They significantly affect the heat transfer, and therefore these phenomena must be studied exhaustively. During its lifetime, an air-PCM unit in a building must cool the space during the daytime, while the PCM melts. At night, the PCM must be regenerated to be ready for the next use. When the system is not in service, a minimal amount of thermal exchange is desired. The aforementioned functions result in both sensible and latent heat storage and release; hence, different types of mechanisms drive the heat transfer phenomena. An experimental test was designed to study the heat transfer phenomena occurring in a circular tube bundle air-PCM exchanger. An in-line arrangement was selected as the geometrical distribution of the containers. To allow visual identification, the container material and a section of the test bench were transparent. Instruments were placed on the bench to measure temperature and velocity. The PCM properties were also available through differential scanning calorimetry (DSC) tests. The evolution of the temperature during both cycles, melting and solidification, was obtained. The results showed phenomena both at a local level (tubes) and at an overall level (exchanger). Conduction and convection appeared as the main heat transfer mechanisms. From these results, two approaches to analyzing the heat transfer were followed. The first approach described the phenomena in a single tube as a series of thermal resistances, where purely conduction-controlled heat transfer was assumed in the PCM. For the second approach, the temperature measurements were used to compute significant dimensionless numbers and parameters such as the Stefan, Fourier and Rayleigh numbers, and the melting fraction. These approaches allowed us to identify the heat transfer phenomena during both cycles. The presence of natural convection during melting could be inferred from the influence of the Rayleigh number on the correlations obtained.
Keywords: phase change materials, air-PCM exchangers, convection, conduction
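The second approach above turns raw temperature measurements into dimensionless groups. As a hedged illustration (the property values below are generic paraffin-like placeholders, not the paper's DSC data), the three named numbers can be computed as:

```python
# Hedged sketch of the dimensionless numbers used in the second analysis
# approach. Property values are paraffin-like placeholders, not measured data.
def stefan(cp, dT, latent_heat):
    """Stefan number, Ste = cp*dT/L: sensible-to-latent heat ratio."""
    return cp * dT / latent_heat

def fourier(alpha, t, L_c):
    """Fourier number, Fo = alpha*t/L_c^2: dimensionless time."""
    return alpha * t / L_c**2

def rayleigh(g, beta, dT, L_c, nu, alpha):
    """Rayleigh number, Ra = g*beta*dT*L_c^3/(nu*alpha): buoyancy vs. diffusion."""
    return g * beta * dT * L_c**3 / (nu * alpha)

# Placeholder paraffin properties and a 20 mm characteristic length:
cp, L, alpha, nu, beta = 2.0e3, 2.0e5, 1.0e-7, 5.0e-6, 1.0e-3
print(stefan(cp, 10.0, L))                           # ~0.1
print(fourier(alpha, 3600.0, 0.02))                  # ~0.9 after one hour
print(rayleigh(9.81, beta, 10.0, 0.02, nu, alpha))   # ~1.6e6
```

A large Rayleigh number during melting is precisely what signals buoyancy-driven natural convection in the liquid PCM.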
Procedia PDF Downloads 178
2769 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors
Authors: Yaxin Bi
Abstract:
Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), generative models tailored to time-series data, to generate synthetic time series based on Swarm satellite data, which will be used for detecting seismic anomalies. LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the distribution of the time series. These findings highlight both the promise and the challenges of applying deep learning techniques to synthetic data generation, underscoring the potential of deep learning for generating synthetic electromagnetic satellite data.
Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors
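As a minimal sketch of the LSTM-based generation route described above (a stand-in, not the authors' architecture or their Swarm data), one can train a next-step predictor and roll it out autoregressively:

```python
# Minimal sketch: next-step LSTM trained on a stand-in signal, then rolled
# out autoregressively to synthesize a continuation. Not the paper's model.
import numpy as np
import tensorflow as tf

def make_windows(series, window):
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., None], y[..., None]   # add a feature axis

series = np.sin(np.linspace(0.0, 60.0, 2000)).astype("float32")  # stand-in
window = 50
X, y = make_windows(series, window)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Autoregressive rollout: feed each prediction back as the newest input.
buf = series[-window:].copy()
synthetic = []
for _ in range(200):
    nxt = model.predict(buf[None, :, None], verbose=0)[0, 0]
    synthetic.append(float(nxt))
    buf = np.append(buf[1:], nxt)
```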
Procedia PDF Downloads 32
2768 Thermal Buckling of Functionally Graded Panel Based on Mori-Tanaka Scheme
Authors: Seok-In Bae, Young-Hoon Lee, Ji-Hwan Kim
Abstract:
Due to the asymmetry of the material properties of Functionally Graded Materials (FGMs) in the thickness direction, the neutral surface of the model does not coincide with the mid-plane of a symmetric structure. In order to investigate the thermal buckling behavior of FGMs, the neutral surface is chosen as the reference plane. In the model, material properties are assumed to be temperature dependent and to vary continuously in the thickness direction of the plate. Further, the effective material properties, such as Young's modulus and Poisson's ratio, are homogenized using the Mori-Tanaka scheme, which considers the interaction among adjacent inclusions. In this work, the finite element method is used, and the first-order shear deformation theory of plates is adopted. The thermal loads are assumed to have uniform, linear and non-linear distributions through the thickness, respectively. The effects of various parameters on the thermal buckling behavior of the FGM panel are also discussed in detail.
Keywords: functionally graded plate, thermal buckling analysis, neutral surface
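For reference, a hedged sketch of the Mori-Tanaka homogenization step named above, for a two-phase composite with spherical inclusions assumed; the moduli below are placeholders, not the paper's temperature-dependent FGM data:

```python
# Hedged sketch: Mori-Tanaka estimate of effective moduli for a two-phase
# composite (spherical inclusions assumed; placeholder moduli in GPa).
def mori_tanaka(Km, Gm, Ki, Gi, ci):
    """Matrix (m) and inclusion (i) bulk/shear moduli, inclusion fraction ci."""
    cm = 1.0 - ci
    K = Km + ci * (Ki - Km) / (1.0 + cm * (Ki - Km) / (Km + 4.0 * Gm / 3.0))
    fm = Gm * (9.0 * Km + 8.0 * Gm) / (6.0 * (Km + 2.0 * Gm))
    G = Gm + ci * (Gi - Gm) / (1.0 + cm * (Gi - Gm) / (Gm + fm))
    E = 9.0 * K * G / (3.0 * K + G)                    # effective Young's modulus
    nu = (3.0 * K - 2.0 * G) / (2.0 * (3.0 * K + G))   # effective Poisson's ratio
    return E, nu

# Metal matrix with 30% ceramic inclusions (illustrative values):
print(mori_tanaka(Km=140.0, Gm=80.0, Ki=210.0, Gi=130.0, ci=0.3))
```

In an FGM plate, the inclusion fraction ci would itself be a function of the thickness coordinate, which is what shifts the neutral surface away from the mid-plane.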
Procedia PDF Downloads 401
2767 Banking and Accounting Analysis Researches Effect on Environment
Authors: Marina Magdy Naguib Karas
Abstract:
New methods of providing banking services to the customer have been introduced, such as online banking. Banks have begun to consider electronic banking (e-banking) as a way to replace some traditional branch functions, using the Internet as a new distribution channel. Some consumers have at least one account at multiple banks and access these accounts through online banking. To check their current net worth, clients need to log into each of their accounts, get detailed information, and work toward consolidation. Not only is this time-consuming, but it is also a repeated activity with a certain frequency. To solve this problem, the concept of account aggregation was introduced as a solution. Account aggregation in e-banking, as a form of electronic banking, appears to build a stronger relationship with customers. An account aggregation service is generally understood as a service that allows customers to manage their bank accounts held at different institutions via a common online banking platform that places a high priority on security and data protection. The article provides an overview of the account aggregation approach in e-banking as a new service in the area of e-banking.
Keywords: compatibility, complexity, mobile banking, observation, risk banking technology, Internet banks, modernization of banks, banks, account aggregation, security, enterprise development
Procedia PDF Downloads 50
2766 The Study on Treatment Technology of Fused Carbonized Blast Furnace Slag
Authors: Jiaxu Huang
Abstract:
Molten carbonized blast furnace slag containing TiC was produced by carbothermal reduction of high-titanium blast furnace slag. The treatment technology of molten carbonized blast furnace slag with TiC as the target product was studied, including the influence of different cooling methods, crushing atmospheres and sieving particle sizes on the TiC in the slag. The results show that air cooling and water cooling have little effect on the TiC content of the molten carbide blast furnace slag but a large effect on its crystal structure and grain size. The TiC content in the slag differs when the carbide blast furnace slag is crushed in an argon versus an air atmosphere. After sieving, the TiC content of the carbide blast furnace slag differs markedly across particle size fractions: the average TiC content of the 100-400 mesh fraction is 14%, while that of the fraction finer than 400 mesh is 10.5%.
Keywords: crushing atmosphere, cooling methods, sieving particle size, TiC
Procedia PDF Downloads 135
2765 A Comparative Study of Microstructure, Thermal and Mechanical Properties of A359 Composites Reinforced with SiC, Si3N4 and AlN Particles
Authors: Essam Shalaby, Alexander Churyumov, Malak Abou El-Khair, Atef Daoud
Abstract:
A comparative study of the thermal and mechanical behavior of squeeze-cast A359 composites containing 5, 10 and 15 wt.% SiC, (SiC+Si3N4) and AlN particulates was carried out. Stir casting followed by squeeze casting was used to produce the A359 composites. It was noticed that A359/AlN composites have high thermal conductivity compared to the A359 alloy and even to the A359/SiC or A359/(SiC+Si3N4) composites. Microstructures of the composites showed a homogeneous and even distribution of reinforcements within the matrix. Interfacial reactions between particles and matrix were investigated using X-ray diffraction and energy-dispersive X-ray analysis. The presence of particles not only increased the peak hardness of the composites but also accelerated the aging kinetics. Compared with the A359 matrix alloy, compression tests of the composites exhibited a significant increase in the yield and ultimate compressive strengths with a relative reduction in the failure strain. These lightweight composites have high potential for automotive and aerospace applications.
Keywords: metal-matrix composite, squeeze, microstructure, thermal conductivity, compressive properties
Procedia PDF Downloads 381
2764 Beyond Classic Program Evaluation and Review Technique: A Generalized Model for Subjective Distributions with Flexible Variance
Authors: Byung Cheol Kim
Abstract:
The Program Evaluation and Review Technique (PERT) is widely used for project management, but it struggles with subjective distributions, particularly due to its assumptions of constant variance and light tails. To overcome these limitations, we propose the Generalized PERT (G-PERT) model, which enhances PERT by incorporating variability in three-point subjective estimates. Our methodology extends the original PERT model to cover the full range of unimodal beta distributions, enabling the model to handle thick-tailed distributions and offering formulas for computing the mean and variance. This maintains the simplicity of PERT while providing a more accurate depiction of uncertainty. Our empirical analysis demonstrates that the G-PERT model significantly improves performance, particularly when dealing with heavy-tailed subjective distributions. In comparative assessments against alternative models such as the triangular and lognormal distributions, G-PERT shows superior accuracy and flexibility. These results suggest that G-PERT offers a more robust solution for project estimation while still retaining the user-friendliness of the classic PERT approach.
Keywords: PERT, subjective distribution, project management, flexible variance
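For context, the classic PERT three-point formulas that G-PERT generalizes are easily stated. The sketch below shows only this well-known baseline; the paper's extended mean/variance formulas for the full unimodal beta family are not reproduced here:

```python
# Classic PERT baseline (the fixed-variance model that G-PERT relaxes).
def pert_estimate(a, m, b):
    """a = optimistic, m = most likely, b = pessimistic duration."""
    mean = (a + 4.0 * m + b) / 6.0
    var = ((b - a) / 6.0) ** 2   # constant-variance assumption criticized above
    return mean, var

print(pert_estimate(a=2.0, m=5.0, b=14.0))   # (6.0, 4.0)
```

The constant `(b - a)/6` standard deviation is exactly the rigidity the abstract targets: two tasks with the same range but very different tail behavior get identical variances under classic PERT.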
Procedia PDF Downloads 18
2763 A Corpus-Based Study on the Lexical, Syntactic and Sequential Features across Interpreting Types
Authors: Qianxi Lv, Junying Liang
Abstract:
Among the various modes of interpreting, simultaneous interpreting (SI) is regarded as a 'complex' and 'extreme condition' among cognitive tasks, while consecutive interpreting (CI) does not require interpreters to share processing capacity between concurrent tasks. Given that SI exerts great cognitive demand, it makes sense to posit that the output of SI may be more compromised than that of CI in its linguistic features. The bulk of the research has stressed the varying cognitive demands and processes involved in different modes of interpreting; however, related empirical research is sparse. In keeping with our interest in investigating the quantitative linguistic factors discriminating between SI and CI, the current study examines potential lexical simplification, syntactic complexity and sequential organization mechanisms with a self-built inter-modal corpus of transcribed simultaneous and consecutive interpretation, translated speech and original speech texts, with a total of 321,960 running words. The lexical features are extracted in terms of lexical density, list head coverage, hapax legomena, and type-token ratio, as well as core vocabulary percentage. Dependency distance, an index of syntactic complexity reflective of processing demand, is employed. The frequency motif, a sequential unit not bound to grammar, is also used to visualize the local distribution of function across the interpreting output. While SI is generally regarded as multitasking with a high cognitive load, our findings show that CI may tax cognitive resources differently, and perhaps more heavily, and hence yields more lexically and syntactically simplified output. In addition, the sequential features show that SI and CI organize the sequences from the source text into the output in different ways, each minimizing the cognitive load in its own manner. We interpret the results within a framework in which cognitive demand is exerted on both the maintenance and coordination components of working memory. On the one hand, the information maintained in CI is inherently larger in volume compared to SI. On the other hand, time constraints directly influence the sentence reformulation process. The temporal pressure from the input in SI forces interpreters to keep only a small chunk of information in the focus of attention. Thus, SI interpreters usually produce the output by largely retaining the source structure so as to release information from working memory immediately after it is formulated in the target language. Conversely, CI interpreters receive at least a few sentences before reformulation, when they are more self-paced. CI interpreters may thus tend to retain and generate the information in a way that lessens the demand. In other words, interpreters cope with the high demand in the reformulation phase of CI by generating output with densely distributed function words, more content words of higher frequency values and fewer variations, simpler structures, and more frequently used language sequences. We consequently propose a revised effort model based on these results for a better illustration of cognitive demand during both interpreting types.
Keywords: cognitive demand, corpus-based, dependency distance, frequency motif, interpreting types, lexical simplification, sequential units distribution, syntactic complexity
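Several of the lexical indices named above reduce to simple frequency computations. A hedged toy illustration (the content-word set here stands in for proper POS tagging):

```python
# Toy illustration of three lexical indices from the abstract; the
# content-word set is an assumed stand-in for real POS tagging.
from collections import Counter

tokens = "the interpreter kept the structure and kept the pace".split()
content_words = {"interpreter", "kept", "structure", "pace"}  # assumed tags

counts = Counter(tokens)
ttr = len(counts) / len(tokens)                       # type-token ratio
hapax = sum(1 for w, c in counts.items() if c == 1)   # words occurring once
density = sum(1 for w in tokens if w in content_words) / len(tokens)

print(f"TTR={ttr:.2f}, hapax={hapax}, lexical density={density:.2f}")
# TTR=0.67, hapax=4, lexical density=0.56
```

Lower TTR and lexical density in one interpreting mode than the other is what the study reads as lexical simplification.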
Procedia PDF Downloads 178
2762 Synthesis and Characterization of Novel Hollow Silica Particle through DODAB Vesicle Templating
Authors: Eun Ju Park, Wendy Rusli, He Tao, Alexander M. Van Herk, Sanggu Kim
Abstract:
Hollow micro-/nano-structured materials have proven promising in a wide range of applications, such as catalysis, drug delivery and controlled release, biotechnology, and personal and consumer care. Hollow sphere structures can be obtained through various templating approaches: colloid templates, emulsion templates, multi-surfactant templates, and single-crystal templates. Vesicles are generally the self-directed assemblies of amphiphilic molecules, including cationic, anionic, and nonionic surfactants, in aqueous solutions. The directed silica capsule formation was performed at the surface of dioctadecyldimethylammonium bromide (DODAB) bilayer vesicles as a soft template. The size of the DODAB bilayer vesicles could be tuned by extrusion of a preheated dispersion of DODAB. The synthesized hollow silica particles were characterized by conventional TEM, cryo-TEM and SEM to determine the morphology and structure of the particles, and by dynamic light scattering (DLS) to measure the particle size and particle size distribution.
Keywords: characterization, DODAB, hollow silica particle, synthesis, vesicle
Procedia PDF Downloads 307
2761 Integration GIS–SCADA Power Systems to Enclosure Air Dispersion Model
Authors: Ibrahim Shaker, Amr El Hossany, Moustafa Osman, Mohamed El Raey
Abstract:
This paper explores an integration model between a GIS-SCADA system and an enclosure quantification model to assess the impact of a fail-safe event. There are real demands to identify spatial objects and improve control system performance. The employed methodology predicts electro-mechanical operations and the corresponding time to environmental incident variations. Open processing, as an object systems technology, is presented for integrating the enclosure database with minimal memory size and computation time via connectivity drivers such as ODBC/JDBC during the main stages of the GIS-SCADA connection. The function of the Geographic Information System is to manage power distribution in response to developing issues. In other words, GIS-SCADA systems integration requires numerical objects of the process to enable system model calibration and estimation, determination of past events for analysis, and prediction of emergency situations for response training.
Keywords: air dispersion model, environmental management, SCADA systems, GIS system, integration power system
Procedia PDF Downloads 369
2760 Representativity Based Wasserstein Active Regression
Authors: Benjamin Bobbia, Matthias Picard
Abstract:
In recent years, active learning methodologies based on the representativity of the data have seemed more promising for limiting overfitting. The presented query methodology for regression uses the Wasserstein distance to measure the representativity of the labelled dataset compared to the global distribution. In this work, crucial use is made of GroupSort neural networks, which brings a double advantage: the Wasserstein distance can be exactly expressed in terms of such networks, and one can provide explicit bounds for their size and depth together with rates of convergence. Moreover, heterogeneity of the dataset is taken into account by weighting the Wasserstein distance with the approximation error from the previous step of active learning. Such an approach leads to a reduction of overfitting and high prediction performance after few query steps. After detailing the methodology and algorithm, an empirical study is presented in order to investigate the range of our hyperparameters. The performance of this method is compared, in terms of the number of queries needed, with other classical and recent query methods on several UCI datasets.
Keywords: active learning, Lipschitz regularization, neural networks, optimal transport, regression
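As a hedged one-dimensional illustration of the representativity idea (the paper works in higher dimension, with the distance expressed through GroupSort networks and weighted by the previous approximation error), a greedy query step can be sketched as:

```python
# 1-D illustration only: greedily query the candidate whose addition most
# reduces the Wasserstein distance between the labelled set and the pool.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
pool = rng.normal(size=500)        # unlabelled candidates (one feature)
labelled = list(pool[:5])          # small initial labelled set

for _ in range(10):                # ten query steps
    scores = [wasserstein_distance(labelled + [x], pool) for x in pool]
    best = int(np.argmin(scores))  # most representativity-improving point
    labelled.append(pool[best])

print(wasserstein_distance(labelled, pool))  # distance after querying
```

Each query pulls the labelled empirical distribution toward the pool's distribution, which is the sense in which the labelled set stays "representative".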
Procedia PDF Downloads 80
2759 Coaxial Helix Antenna for Microwave Coagulation Therapy in Liver Tissue Simulations
Authors: M. Chaichanyut, S. Tungjitkusolmun
Abstract:
This paper is concerned with microwave (MW) ablation of liver cancer tissue using a helix antenna. The antenna structure supports the propagation of microwave energy at 2.45 GHz. A 1½-turn spiral catheter-based microwave antenna applicator has been developed. We utilize three-dimensional finite element method (3D FEM) simulations to analyze the tissue heat flux, lesion pattern and volume of destruction during MW ablation. Two helix antenna configurations were studied: a helix air-core antenna and a helix dielectric-core antenna. The 3D FEM solutions were based on Maxwell's equations and the bio-heat equation. The simulation protocol was power-controlled (10 W, 300 s). In our simulation results, both helix antennas produced heat flux concentrated around the antenna, inducing similar (teardrop-shaped) temperature distributions. In the region where the temperature exceeds 50°C, the microwave ablation is considered successful (i.e., complete destruction). For the helix air-core antenna and the helix dielectric-core antenna, the ablation-zone axial ratios (widest/length) were 0.82 and 0.85, respectively, and the complete destruction volumes were 4.18 cm³ and 5.64 cm³, respectively.
Keywords: liver cancer, Helix antenna, finite element, microwave ablation
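The bio-heat equation referred to above is commonly the Pennes formulation. The following hedged 1-D finite-difference sketch uses generic liver-like properties and an assumed deposition profile, not the paper's 3D FEM setup:

```python
# Hedged 1-D Pennes bio-heat sketch; properties and source are placeholders.
import numpy as np

rho, c, k = 1050.0, 3600.0, 0.51                      # liver-like tissue
rho_b, c_b, w_b, T_b = 1000.0, 4180.0, 0.0005, 37.0   # blood perfusion terms
dx, dt, n = 1e-3, 0.05, 100                           # 1 mm grid, 10 cm domain

T = np.full(n, 37.0)
x = np.arange(n) * dx
Q = 1.0e6 * np.exp(-((x - 0.01) / 0.005) ** 2)        # assumed MW deposition

for _ in range(int(300 / dt)):                        # 300 s protocol
    lap = np.zeros(n)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T += dt / (rho * c) * (k * lap + rho_b * c_b * w_b * (T_b - T) + Q)
    T[0], T[-1] = T[1], 37.0                          # insulated / far field

print(f"peak temperature after 300 s: {T.max():.1f} degC")
```

The 50°C contour of such a temperature field is what delimits the "complete destruction" volume quoted in the abstract.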
Procedia PDF Downloads 309
2758 Potentials for Change in the MENA Region: A Socioeconomic Perspective
Authors: Shaira Karishma Sheriff, Zarinah Hamid
Abstract:
The Arab Spring, which commenced at the end of 2010 and accelerated during 2011, was caused primarily by poverty, unemployment and a general recession in the Middle East and North Africa (MENA) region. The core motivation of this revolution could be said to be the need for the political, economic and social reforms that the region desires to experience. Though GDP growth has been significant in the region, the income distribution mechanism in MENA countries has been ineffective. This results in low levels of education, substandard health care facilities, unemployment, and poverty. This paper argues that MENA countries have great potential for socioeconomic development by becoming less dependent on oil exports and enhancing their services sector through better education, which would eventually lead to job creation. Furthermore, the region can encourage better trade and political integration by forming transparent and accountable governments. The notion of the nation-state needs to be addressed, and the countries in the region need to look for ways to develop effective supra-national institutions for better political and economic integration that goes beyond geographical borders.
Keywords: political reforms, social reforms, economic development, nation-state, economic integration
Procedia PDF Downloads 441
2757 Stochastic Analysis of Linux Operating System through Copula Distribution
Authors: Vijay Vir Singh
Abstract:
This work focuses on the Linux operating system connected in a LAN (local area network). A STAR topology (called subsystem-1) and a BUS topology (called subsystem-2) are considered, placed at two different locations and connected to a server through a hub. In both topologies, n clients are assumed. The system has two types of failures, i.e., partial failure and complete failure. Further, partial failure has been categorized as minor and major. It is assumed that a minor partial failure degrades the subsystems, while a major partial failure puts the subsystem into a breakdown mode. The system may completely fail due to server failure, hacking, blocking, etc. The system is studied using the supplementary variable technique and the Laplace transform, considering different types of failure and two types of repair. Various measures of reliability, for example, availability of the system, reliability of the system, MTTF, and the profit function, are discussed for different parametric values.
Keywords: star topology, bus topology, blocking, hacking, Linux operating system, Gumbel-Hougaard family copula, supplementary variable
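The Gumbel-Hougaard family copula named in the keywords has the closed form C(u, v) = exp(−[(−ln u)^θ + (−ln v)^θ]^{1/θ}) with θ ≥ 1, typically used in such models to couple two repair processes. A hedged sketch (the paper's parameter values are not reproduced):

```python
# Gumbel-Hougaard copula; theta = 1 recovers independence.
import math

def gumbel_hougaard(u, v, theta):
    assert theta >= 1.0 and 0.0 < u <= 1.0 and 0.0 < v <= 1.0
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

print(gumbel_hougaard(0.9, 0.8, 1.0))  # independence: 0.72 = 0.9 * 0.8
print(gumbel_hougaard(0.9, 0.8, 2.5))  # positive dependence: ~0.79
```

Raising θ above 1 increases the joint probability relative to the independent product, modeling positively dependent repair events.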
Procedia PDF Downloads 370
2756 Secure Network Coding against Content Pollution Attacks in Named Data Network
Authors: Tao Feng, Xiaomei Ma, Xian Guo, Jing Wang
Abstract:
Named Data Networking (NDN) is one of the future Internet architectures, in which all nodes (i.e., hosts, routers) are allowed to have a local cache used to satisfy incoming requests for content. However, reliance on caching allows an adversary to perform attacks that are very effective and relatively easy to implement, such as the content pollution attack. In this paper, we use a method of secure network coding based on a homomorphic signature system to solve this problem. Firstly, we use a dynamic public key technique, so that our scheme authenticates each generation without updating the initial secret key. Secondly, exploiting the homomorphism of the hash function, intermediate nodes and the destination node verify the signature of the received message. In addition, since the network topology of NDN is simple and fixed, the coding coefficients in our scheme are generated by a pseudorandom number generator in each node, so distributing the coefficients is also avoided. In short, our scheme can not only efficiently prevent Intra/Inter-GPAs, but also resist the content poisoning attack in NDN.
Keywords: named data networking, content pollution attack, network coding signature, internet architecture
Procedia PDF Downloads 337
2755 Constructing White-Box Implementations Based on Threshold Shares and Composite Fields
Authors: Tingting Lin, Manfred von Willich, Dafu Lou, Phil Eisen
Abstract:
A white-box implementation of a cryptographic algorithm is a software implementation intended to resist extraction of the secret key by an adversary. To date, most white-box techniques are used to protect block cipher implementations. However, a large proportion of white-box implementations have proven vulnerable to affine equivalence attacks and other algebraic attacks, as well as differential computation analysis (DCA). In this paper, we identify a class of block ciphers for which we propose a method of constructing white-box implementations. Our method is based on threshold implementations and operations in composite fields. The resulting implementations consist of lookup tables and a few exclusive-OR operations. All intermediate values (inputs and outputs of the lookup tables) are masked. The threshold implementation makes the distribution of the masked values uniform and independent of the original inputs, and the operations in composite fields reduce the size of the lookup tables. The white-box implementations can provide resistance against algebraic attacks and DCA-like attacks.
Keywords: white-box, block cipher, composite field, threshold implementation
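As a hedged illustration of the masking idea only (not the paper's threshold construction or its composite-field tables), a value can be split into XOR shares so that no single share reveals it:

```python
# Masking primitive only: split a value into n XOR-shares whose XOR-sum
# reconstructs it; individual shares are uniformly distributed.
import secrets

def share(x, n=3, bits=8):
    """Split x into n shares with XOR-sum x."""
    shares = [secrets.randbits(bits) for _ in range(n - 1)]
    last = x
    for s in shares:
        last ^= s
    return shares + [last]

def unshare(shares):
    out = 0
    for s in shares:
        out ^= s
    return out

v = 0xA7
sh = share(v)
assert unshare(sh) == v   # reconstruction; any n-1 shares look random
```

A threshold implementation additionally arranges the shared nonlinear operations so that each output share is computed without ever touching all input shares at once, which is what keeps intermediate values statistically independent of the secret.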
Procedia PDF Downloads 168
2754 Foreign Tourists’ Attitude toward Service Marketing Mix and Intention to Revisit in Boutique Hotel
Authors: Nattapong Techarattanased
Abstract:
This survey research aimed to study how attitudes toward services, products, and the marketing mix affect foreign travelers' intention to revisit boutique hotels in Bangkok, Thailand. A total of 400 closed-ended questionnaires were used to collect data from foreign tourists who stayed at boutique hotels and could communicate in English. Descriptive statistics and multiple regression analysis were used to analyze the data. The research found that tourists' attitudes towards the check-in and check-out process, food and beverage, guest rooms and other facilities affected the likelihood of revisiting, recommending to others, and the possibility of revisiting in the future at the 0.05 level of statistical significance. Tourists' attitudes towards service and the marketing mix in terms of people, physical evidence, price, process and channel of distribution could forecast intention to revisit, in terms of recommending to others and intention to revisit in the future, at the 0.05 level of statistical significance.
Keywords: boutique hotel, foreign tourists, intention to revisit, service marketing mix
Procedia PDF Downloads 247
2753 Analysis of Sediment Distribution around Karang Sela Coral Reef Using Multibeam Backscatter
Authors: Razak Zakariya, Fazliana Mustajap, Lenny Sharinee Sakai
Abstract:
A sediment map is quite important in the marine environment, as the sediment itself contains a wealth of information that can be used in other research. This study was conducted using a Reson T20 multibeam echo sounder on 15 August 2020 at Karang Sela (a coral reef area) at Pulau Bidong. The study aims to identify the sediment types around the coral reef using bathymetry and backscatter data. Sediment in the study area was collected as ground-truthing data to verify the seabed classification. A dry sieving method with a sieve shaker was used to analyze the sediment samples. PDS 2000 software was used for data acquisition, Qimera QPS version 2.4.5 for processing the bathymetry data, and FMGT QPS version 7.10 for processing the backscatter data. The backscatter data were then analyzed using the maximum likelihood classification tool in ArcGIS version 10.8. The results identified three types of sediment around the coral: very coarse sand, coarse sand, and medium sand.
Keywords: sediment type, MBES echo sounder, backscatter, ArcGIS
Procedia PDF Downloads 86
2752 Fault Detection of Pipeline in Water Distribution Network System
Authors: Shin Je Lee, Go Bong Choi, Jeong Cheol Seo, Jong Min Lee, Gibaek Lee
Abstract:
Water pipe networks are installed underground, and once in place, it is difficult to recognize the state of the pipes when a leak or burst happens. Accordingly, remedial action is often delayed after a fault occurs. A systematic fault management system for water pipe networks is therefore required to prevent accidents and minimize losses. In this work, we develop an online fault detection system for water pipe networks using pipe data such as flow rate and pressure. A transient model describing water flow in pipelines is presented and simulated using Matlab. Fault situations such as leaks or bursts are also simulated, and the flow rate and pressure data at the time of the fault are collected. Faults are detected using the statistical methods of the fast Fourier transform and the discrete wavelet transform, and the two are compared to determine which method shows the better fault detection performance.
Keywords: fault detection, water pipeline model, fast Fourier transform, discrete wavelet transform
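The detection step can be illustrated with a hedged sketch (in Python rather than the paper's Matlab, with illustrative signals and no claim to the paper's thresholds): compare FFT spectral content and DWT detail-coefficient energy of a pressure record against a fault-free baseline:

```python
# Hedged sketch: FFT and DWT features for flagging a leak-like change in a
# pressure signal. Signals and numbers are illustrative only.
import numpy as np
import pywt

def feature_energy(signal):
    """Total spectral power (FFT) and detail-coefficient energy (DWT)."""
    spec_energy = float(np.sum(np.abs(np.fft.rfft(signal)) ** 2))
    coeffs = pywt.wavedec(signal, "db4", level=4)
    detail_energy = sum(float(np.sum(c ** 2)) for c in coeffs[1:])
    return spec_energy, detail_energy

rng = np.random.default_rng(1)
baseline = 5.0 + rng.normal(0.0, 0.01, 1024)   # fault-free pressure record
faulty = baseline.copy()
faulty[512:] -= 0.5                            # simulated burst: pressure drop

for name, sig in (("baseline", baseline), ("faulty", faulty)):
    print(name, feature_energy(sig - sig.mean()))  # demean before transforms
```

A burst shows up as excess low-frequency power in the FFT and as localized large detail coefficients around the drop in the DWT, which is why the two transforms can be compared as competing detectors.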
Procedia PDF Downloads 512
2751 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit
Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic
Abstract:
Over the years, High Purity Germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, have established their wide use today in low-background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required. Thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where they would otherwise superimpose within a single-energy peak and, as such, could potentially compromise the analysis and produce wrongly assessed results. Naturally, this feature is of great importance when identifying radionuclides and their activity concentrations, where high precision is a necessity. In measurements of this nature, in order to reproduce good and trustworthy results, one first has to perform an adequate full-energy peak (FEP) efficiency calibration of the equipment used. However, experimental determination of the response, i.e., the efficiency curves for a given detector-sample configuration and geometry, is not always easy and requires a certain set of reference calibration sources in order to account for and cover broader energy ranges of interest. With the goal of overcoming these difficulties, many researchers have turned towards the application of different software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and well-described specifications of the detector. Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if not properly taken into account. In this study, the optimisation of two HPGe detectors through the implementation of the Geant4 toolkit developed by CERN is described, with the goal of improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). The optimised models were verified through comparison with experimental data obtained from measurements of a set of point-like radioactive sources. The acquired results for both detectors displayed good agreement with the experimental data, falling within average statistical uncertainties of ∼ 4.6% for the XtRa and ∼ 1.8% for the BEGe detector within the energy ranges of 59.4−1836.1 keV and 59.4−1212.9 keV, respectively.
Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method
Procedia PDF Downloads 120
2750 Study of Flow-Induced Noise Control Effects on Flat Plate through Biomimetic Mucus Injection
Authors: Chen Niu, Xuesong Zhang, Dejiang Shang, Yongwei Liu
Abstract:
Fish can secrete a high-molecular-weight fluid onto their skin to enable rapid movement through water. In this work, we employ a hybrid method that combines Computational Fluid Dynamics (CFD) and the Finite Element Method (FEM) to investigate the effects of different mucus viscosities and injection velocities on the fluctuating pressure in the boundary layer and the flow-induced structural vibration noise of a flat plate model. To accurately capture the transient flow distribution on the plate surface, we use Large Eddy Simulation (LES), while the mucus inlet is positioned at a sufficient distance from the model to ensure effective coverage. Mucus injection is modeled using the Volume of Fluid (VOF) method for multiphase flow calculations. The results demonstrate that mucus control of the pulsating pressure effectively reduces flow-induced structural vibration noise, providing an approach for controlling flow-induced noise in underwater vehicles.
Keywords: mucus, flow control, noise control, flow-induced noise
Procedia PDF Downloads 145
2749 Improvement of the 3D Finite Element Analysis of High Voltage Power Transformer Defects in Time Domain
Authors: M. Rashid Hussain, Shady S. Refaat
Abstract:
The high-voltage power transformer is the most essential part of electrical power utilities. Reliability of the transformers is the utmost concern, and any transformer failure can lead to catastrophic losses for an electric power utility. The causes of transformer failure include insulation failure by partial discharge, core and tank failure, cooling unit failure, current transformer failure, etc. For the study of power transformer defects, finite element analysis (FEA) can provide valuable information on the severity of defects. FEA provides a more accurate representation of complex geometries because it considers thermal, electrical, and environmental influences on the insulation models to obtain the basic characteristics of the insulation system during normal and partial discharge conditions. The purpose of this paper is the time-domain analysis of a 3D defect model of a high-voltage power transformer using FEA to study the electric field distribution at different points on the defects.
Keywords: power transformer, finite element analysis, dielectric response, partial discharge, insulation
Procedia PDF Downloads 157
2748 An Authentication Protocol for Quantum Enabled Mobile Devices
Authors: Natarajan Venkatachalam, Subrahmanya V. R. K. Rao, Vijay Karthikeyan Dhandapani, Swaminathan Saravanavel
Abstract:
Quantum communication technology is an evolving design that connects multiple quantum-enabled devices to the internet for secret communication or sensitive information exchange. In the future, the number of these compact quantum-enabled devices will increase immensely, making them an integral part of present communication systems. Therefore, the safety and security of such devices is also a major concern for us. To ensure that customer-sensitive information cannot be eavesdropped on or deciphered, we need strong authentication and encryption mechanisms. In this paper, we propose a mutual authentication scheme between these smart quantum devices and a server based on the secure exchange of information through a quantum channel, which gives better solutions to symmetric key exchange issues. An important part of this work is to propose a secure mutual authentication protocol over the quantum channel. We show that our approach offers a robust authentication protocol; furthermore, our solution is lightweight, scalable, and cost-effective with optimized computational processing overheads.
Keywords: quantum cryptography, quantum key distribution, wireless quantum communication, authentication protocol, quantum enabled device, trusted third party
Procedia PDF Downloads 174
2747 The Charge Exchange and Mixture Formation Model in the ASz-62IR Radial Aircraft Engine
Authors: Pawel Magryta, Tytus Tulwin, Paweł Karpiński
Abstract:
The ASz-62IR engine is a radial aircraft engine with 9 cylinders, produced by the Polish company WSK "PZL-KALISZ" S.A. The engine is currently being developed by the above company and Lublin University of Technology. In order to support the technological development of this unit effectively, it was decided to build a simulation model. The model of the ASz-62IR was developed with the AVL BOOST software, a tool dedicated to one-dimensional modeling of internal combustion engines. This model can be used to calculate the parameters of the air and fuel flow in the intake system, including charging devices, as well as combustion and exhaust flow to the environment. The main purpose of this model is the analysis of the charge exchange and mixture formation in this engine. For this purpose, the model consists of elements such as: the air inlet, throttle system, compressor connector, charging compressor, inlet pipes and injectors, outlet pipes, fuel injection, and a model of fuel mixing and evaporation. The model of charge exchange and mixture formation was based on the model of mass flow rate in the intake and exhaust pipes, and also on the calculation of gas property values such as the gas constant and thermal capacity. This model was based on the equations describing isentropic flow: the energy equation for flow under steady conditions was transformed into a mass flow equation. The model uses a flow coefficient μσ, which varies with the stroke/valve opening and was determined in a steady flow state. The geometry of the inlet channels and other key components was mapped with reference to the technical documentation of the engine and empirical measurements of the structural elements. The volume of the elements on the charge flow path between the air inlet and the exhaust outlet was measured by CAD mapping of the structure. The original characteristics of the engine compressor, taken from the technical documentation, were entered into the model. Additionally, the model uses a general model for the transport of the chemical compounds of the mixture; 7 compounds are used, i.e., fuel, O2, N2, CO2, H2O, CO, H2. A gasoline fuel with a calorific value of 43.5 MJ/kg and a stoichiometric air-fuel ratio of 14.5 was used. Indirect injection into the intake manifold is used in this model. The model makes the following simplifications: the mixture is homogeneous at the beginning of combustion; accordingly, the mixture stoichiometric coefficient A/F remains constant during combustion, and the combusted and non-combusted charges have identical pressures and temperatures although their compositions change. As a result of the simulation studies based on the model described above, the basic parameters of the combustion process, charge exchange, and mixture formation in the cylinders were obtained. The AVL BOOST software is very useful for piston engine performance simulations. This work has been financed by the Polish National Centre for Research and Development, INNOLOT, under Grant Agreement No. INNOLOT/I/1/NCBR/2013.
Keywords: aviation propulsion, AVL Boost, engine model, charge exchange, mixture formation
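The isentropic mass flow relation described above has a standard closed form: ṁ = μσ·A·p01/√(R·T01)·ψ, where ψ is the compressible flow function and the pressure ratio is clamped at its critical (choked) value. A hedged numerical sketch with illustrative values (an equivalent formulation underlies AVL BOOST's pipe-boundary model, though the numbers here are not from the paper):

```python
# Hedged sketch of the isentropic orifice mass-flow equation with the flow
# coefficient mu_sigma; input values are illustrative placeholders.
import math

def mass_flow(p01, T01, p2, A, mu_sigma, kappa=1.4, R=287.0):
    """m_dot = mu_sigma * A * p01 / sqrt(R*T01) * psi(p2/p01)."""
    pr_crit = (2.0 / (kappa + 1.0)) ** (kappa / (kappa - 1.0))
    pr = max(p2 / p01, pr_crit)   # clamp at the critical (choked) ratio
    psi = math.sqrt(2.0 * kappa / (kappa - 1.0)
                    * (pr ** (2.0 / kappa) - pr ** ((kappa + 1.0) / kappa)))
    return mu_sigma * A * p01 / math.sqrt(R * T01) * psi

# Example: 1 bar upstream, 0.9 bar downstream, 10 cm^2 area, mu_sigma = 0.8
print(mass_flow(p01=1.0e5, T01=300.0, p2=0.9e5, A=1.0e-3, mu_sigma=0.8))
# ~0.115 kg/s
```

The flow coefficient μσ folds the real valve's discharge losses into the ideal isentropic relation, which is why it is measured on a steady-flow bench as a function of valve lift.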
Procedia PDF Downloads 338
2746 Finite Element Simulation for Preliminary Study on Microorganism Detection System
Authors: Muhammad Rosli Abdullah, Noor Hasmiza Harun
Abstract:
A microorganism detection system has the potential to be used with advances in biosensor development. The detection system requires an optical sensing system, a microfluidic device and a biological reagent. Although biosensors are available in the market, a label-free, lab-on-chip approach promotes a flexible solution. As a preliminary study of microorganism detection, three mechanisms, Total Internal Reflection (TIR), Micro Fluidic Channel (MFC) flow, and magnetic-electric field propagation, were studied and simulated. The objectives are to identify the TIR angle, the MFC parabolic flow profile, and the wavelength for microorganism detection. The simulation results indicate that an evanescent wave is achieved when the TIR angle > 42°, that the corner and centre velocities of the parabolic flow profile are 0.02 m/s and 0.06 m/s respectively, and that a higher energy distribution of perfect electromagnetic scattering with dipole resonance radiation occurs at 500 nm. This simulation is beneficial for determining the components of a microorganism detection system that does not rely on classical microbiological, immunological and genetic methods, which are laborious, time-consuming procedures confined to specialized laboratories with expensive instrumentation.
Keywords: microorganism, microfluidic, total internal reflection, lab on chip
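The quoted TIR onset can be checked against Snell's law: the critical angle is θc = arcsin(n2/n1), which for an assumed glass-air interface (n1 ≈ 1.5, n2 ≈ 1.0; the paper's materials are not specified here) gives ≈ 41.8°, consistent with the simulated > 42° threshold:

```python
# Critical angle from Snell's law; refractive indices are assumed values.
import math

def critical_angle(n1, n2):
    """Smallest incidence angle (degrees) giving total internal reflection."""
    return math.degrees(math.asin(n2 / n1))

print(critical_angle(1.5, 1.0))   # ~41.8 degrees for glass-air
```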
Procedia PDF Downloads 277
2745 Performance Degradation for the GLR Test-Statistics for Spatial Signal Detection
Authors: Olesya Bolkhovskaya, Alexander Maltsev
Abstract:
Antenna arrays are widely used in modern radio systems, in sonar and in communications. The detection of a useful signal against a noise background is based on the GLRT method. There is a large class of such problems, depending on the a priori information known. In this work, in contrast to the majority of previously solved problems, only the differing spatial properties of the signal and noise are used for detection. We analyze the influence of the degree of signal non-coherence and of noise inhomogeneity on the performance characteristics of different GLRT statistics. The signal and noise are described by means of spatial covariance matrices C for cases with different amounts of known information. The partially coherent signal is simulated as a plane wave with a random angle of incidence relative to the array normal. Background noise is simulated as a random process with a uniform distribution in each element. The results of the investigation of the degradation of the performance characteristics for the different cases are presented in this work.
Keywords: GLRT, Neyman-Pearson criterion, test statistics, degradation, spatial processing, multielement antenna array
Procedia PDF Downloads 385
2744 Banking and Accounting Analysis Researches Effect on Environment and Income
Authors: Gerges Samaan Henin Abdalla
Abstract:
New methods of providing banking services to the customer have been introduced, such as online banking. Banks have begun to consider electronic banking (e-banking) as a way to replace some traditional branch functions, using the Internet as a new distribution channel. Some consumers have at least one account at multiple banks and access these accounts through online banking. To check their current net worth, clients need to log into each of their accounts, get detailed information, and work toward consolidation. Not only is this time-consuming, but it is also a repeated activity with a certain frequency. To solve this problem, the concept of account aggregation was introduced as a solution. Account aggregation in e-banking, as a form of electronic banking, appears to build a stronger relationship with customers. An account aggregation service is generally understood as a service that allows customers to manage their bank accounts held at different institutions via a common online banking platform that places a high priority on security and data protection. The article provides an overview of the account aggregation approach in e-banking as a new service in the area of e-banking.
Keywords: compatibility, complexity, mobile banking, observation, risk banking technology, Internet banks, modernization of banks, banks, account aggregation, security, enterprise development
Procedia PDF Downloads 57
2743 Research on the Development and Space Optimization of Rental-Type Public Housing in Hangzhou
Authors: Xuran Zhang, Huiru Chen
Abstract:
In recent years, China has made great efforts to cultivate and develop the housing rental market, especially rental-type public housing, which has received attention from all sectors of society. This paper takes Hangzhou rental-type public housing as the research object and divides it into three development stages according to the different supply modes of rental-type public housing. Through data collection and field research, the paper summarizes the spatial characteristics of rental-type public housing from the five perspectives of spatial planning, spatial layout, spatial integration, spatial organization and spatial configuration. On this basis, the paper proposes optimizations of the spatial layout. The study concludes that the spatial layout of rental-type public housing should be coordinated with the development of urban planning. In planning and construction, it is necessary to select more mixed construction modes, to centralize appropriately, and to improve the surrounding transportation service facilities. It is hoped that the recommendations in this paper will provide a reference for the further development of rental-type public housing in Hangzhou.
Keywords: Hangzhou, rental-type public housing, spatial distribution, spatial optimization
Procedia PDF Downloads 323
2742 Fuzzy Time Series- Markov Chain Method for Corn and Soybean Price Forecasting in North Carolina Markets
Authors: Selin Guney, Andres Riquelme
Abstract:
Among the main purposes of optimal and efficient forecasts of agricultural commodity prices is to guide firms in the economic decision-making process, such as planning business operations and marketing decisions. Governments are also beneficiaries and suppliers of agricultural price forecasts. They use this information to establish proper agricultural policy; hence, the forecasts affect social welfare, and systematic errors in forecasts could lead to a misallocation of scarce resources. Various empirical approaches using different methodologies have been applied to forecast commodity prices. The most commonly used approaches depend on classical time series models that assume the values of the response variables are precise, which is quite often not true in reality. Recently, this literature has largely evolved towards fuzzy time series models, which provide more flexibility with respect to classical time series assumptions such as stationarity and large sample size requirements. Besides, the fuzzy modeling approach allows decision making with estimated values under incomplete information or uncertainty. A number of fuzzy time series models have been developed and implemented over the last decades; however, most of them are not appropriate for forecasting repeated and nonconsecutive transitions in the data. The modeling scheme used in this paper eliminates this problem by introducing a Markov modeling approach that takes both repeated and nonconsecutive transitions into account. Also, the determination of the interval length is crucial for forecast accuracy. The problem of determining the interval length arbitrarily is overcome, and a methodology is proposed to determine the proper interval length based on the distribution or mean of the first differences of the series, in order to improve forecast accuracy. The specific purpose of this paper is to propose and investigate the potential of a new forecasting model that integrates this interval-length methodology with a fuzzy time series-Markov chain model. Moreover, the forecasting accuracy of the proposed integrated model is compared to that of different univariate time series models, and the superiority of the proposed method over competing methods, in terms of both modelling and forecasting, is demonstrated on the basis of forecast evaluation criteria. The application is to daily corn and soybean prices observed at three commercially important North Carolina markets: Candor, Cofield and Roaring River for corn, and Fayetteville, Cofield and Greenville City for soybeans. One main conclusion of this paper is that using fuzzy logic improves forecast performance and accuracy; the effectiveness and potential benefits of the proposed model are confirmed by small selection criteria values such as MAPE. The paper concludes with a discussion of the implications of integrating fuzzy logic and non-arbitrary determination of the interval length for the reliability and accuracy of price forecasts. The empirical results represent a significant contribution to our understanding of the applicability of fuzzy modeling to commodity price forecasts.
Keywords: commodity, forecast, fuzzy, Markov
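A hedged sketch of an average-based interval-length rule of the kind described above (a Huarng-style heuristic; the paper's exact criterion based on the distribution or mean of the first differences may differ): take half the mean absolute first difference and round it to a magnitude-appropriate base:

```python
# Hedged sketch: average-based interval length for fuzzy time series.
# The rounding base chosen here (power of 10 below the half-mean) is one
# common convention, not necessarily the paper's.
import math

def interval_length(prices):
    diffs = [abs(b - a) for a, b in zip(prices, prices[1:])]
    half_mean = 0.5 * sum(diffs) / len(diffs)
    base = 10.0 ** math.floor(math.log10(half_mean))   # 0.01, 0.1, 1, ...
    return max(base, round(half_mean / base) * base)

print(interval_length([3.10, 3.25, 3.18, 3.40, 3.35, 3.52]))   # -> 0.07
```

The resulting length partitions the universe of discourse into fuzzy intervals; the Markov chain is then estimated over transitions between those intervals, including repeated and nonconsecutive ones.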
Procedia PDF Downloads 217
2741 Deformation and Strength of Heat-Shielding Materials in a Long-Term Storage of Aircraft
Authors: Lyudmila L. Gracheva
Abstract:
A thermal shield is a multi-layer structure consisting of layers made of different materials. The use of composite materials (CM) reinforced with carbon fibers in rocket technologies (shells, bearings, wings, fairings, inter-stage compartments, etc.) is due to the possibility of reducing weight while increasing structural strength. Structures made of unidirectional carbon fiber reinforced plastic based on an epoxy resin are used as load-bearing skins for aircraft fairings. The results of an experimental study of the physical and mechanical properties of epoxy carbon fiber reinforced plastics as functions of temperature, for different product storage times, are presented. With increasing temperature, the physical and mechanical properties of CM are determined by the thermal and deformation properties of the components and the geometry of their distribution. Samples for the study were cut from the natural skins of the head fairings.
Keywords: composite material, thermal deformation, carbon fiber, heat shield, epoxy resin, thermal expansion
Procedia PDF Downloads 57