Search results for: point estimate method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23853

22863 Exposure Assessment for Worker Exposed to Heavy Metals during Road Marking Operations

Authors: Yin-Hsuan Wu, Perng-Jy Tsai, Ying-Fang Wang, Shun-Hui Chung

Abstract:

The present study was conducted to characterize exposure concentrations, concentrations deposited on the different respiratory regions, and the resultant health risks associated with heavy metal exposures for road marking workers. Road marking workers of three similar exposure groups (SEGs) were selected, including the paint pouring worker, the marking worker, and the preparing worker. Personal exposure samples were collected using an inhalable dust sampler (IOM), and the corresponding particle size distributions were determined using an eight-stage Marple personal cascade impactor over five working days. In total, 25 IOM samples and 20 Marple samples were collected. All collected samples were analyzed for their heavy metal contents using ICP-MS. The resultant heavy metal particle size distributions were also used to estimate the fractions of particles deposited on the head airways (Chead), tracheobronchial (Cthorac), and alveolar regions (Cresp) of the exposed workers. In addition, Pb and Cr were selected to estimate the incremental cancer risk, and Zn, Ti, and Mo were selected to estimate the corresponding non-cancer risk. Results show that three heavy metals, Pb, Cr, and Ti, had the highest concentrations for the SEG of the paint pouring worker (=0.585±2.98, 0.307±1.71, and 0.902±2.99 μg/m³, respectively). For the fraction of heavy metal particles deposited on the respiratory tract, the alveolar and head regions had the highest values (=23-43% and 39-61%, respectively). For the SEGs of paint pouring and marking, 51% of Cr, 59-61% of Zn, and 48-51% of Ti were deposited on the alveolar region, and 41-43% of Pb was deposited on the head region. Finally, the incremental cancer risks for the SEGs of paint pouring, marking, and preparing were 1.08×10⁻⁵, 2.78×10⁻⁶, and 2.20×10⁻⁶, respectively, while the estimated non-cancer risk for all three SEGs was consistently less than unity. In conclusion, although the estimated non-cancer risk was less than unity, all resultant incremental cancer risks were greater than 10⁻⁶, indicating that abatement of workers' exposure is necessary. It is suggested that strategies including control measures at the molten kettle, substituting the currently used paints with paints containing less heavy metal, and wearing fume-protective personal protective equipment be considered in the future to reduce workers' exposure.
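
For illustration, the sketch below shows the generic EPA-style inhalation risk formulas that this kind of assessment relies on; all numerical inputs (slope factor, reference concentration, exposure frequency and duration) are hypothetical placeholders rather than the values used in the study.

```python
# A minimal, screening-level sketch of an inhalation risk calculation of the kind
# described above. All parameter values below are hypothetical placeholders.

def incremental_cancer_risk(c_ugm3, slope_factor_per_mg_kg_day,
                            inhalation_rate_m3_day=20.0, body_weight_kg=70.0,
                            exposure_days_per_year=250, exposure_years=30,
                            averaging_years=70):
    """Incremental lifetime cancer risk from an airborne concentration (ug/m3)."""
    chronic_daily_intake = (
        c_ugm3 * 1e-3 * inhalation_rate_m3_day * exposure_days_per_year * exposure_years
    ) / (body_weight_kg * averaging_years * 365.0)          # mg/kg/day
    return chronic_daily_intake * slope_factor_per_mg_kg_day

def hazard_quotient(c_ugm3, reference_conc_ugm3):
    """Non-cancer hazard quotient; values below unity indicate acceptable risk."""
    return c_ugm3 / reference_conc_ugm3

# Example with hypothetical inputs for a paint pouring worker:
print(f"ILCR (Pb): {incremental_cancer_risk(0.585, slope_factor_per_mg_kg_day=0.042):.2e}")
print(f"HQ (Zn):  {hazard_quotient(0.2, reference_conc_ugm3=5.0):.2f}")
```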

Keywords: health risk assessment, heavy metal, respiratory tract deposition, road marking

Procedia PDF Downloads 169
22862 A Data Driven Methodological Approach to Economic Pre-Evaluation of Reuse Projects of Ancient Urban Centers

Authors: Pietro D'Ambrosio, Roberta D'Ambrosio

Abstract:

The upgrading of the architectural and urban heritage of historic urban centers almost always involves planning for the reuse and refunctionalization of existing structures. Such interventions are complex because they must take into account the urban and social context in which the structure is inserted, as well as its intrinsic characteristics, such as its historical and artistic value. To these, of course, must be added the need to make a preliminary estimate of recovery costs and, more generally, to assess the economic and financial sustainability of the whole project of re-socialization. Particular difficulties are encountered during the pre-assessment of costs, since it is often impossible to perform analytical surveys and structural tests, owing both to structural conditions and to obvious cost and time constraints. The methodology proposed in this work, based on a multidisciplinary and data-driven approach, is aimed at obtaining reasonably reliable economic evaluations of the interventions to be carried out, at very low cost. In addition, the specific features of the approach, derived from the predictive analysis techniques typically applied in complex IT domains (big data analytics), allow the evaluation process to produce, as an indirect result, a shared database that can be used on a generalized basis to estimate other, similar projects. This makes the methodology particularly suitable for cases in which large-scale interventions across entire areas of historical city centers are expected. The methodology has been partially tested during a study aimed at assessing the feasibility of a project for the reuse of the monumental complex of San Massimo, located in the historic center of Salerno, and is being further investigated.

Keywords: evaluation, methodology, restoration, reuse

Procedia PDF Downloads 187
22861 Three-Level Converters Back-To-Back DC Bus Control for Torque Ripple Reduction of Induction Motor

Authors: T. Abdelkrim, K. Benamrane, B. Bezza, Aeh Benkhelifa, A. Borni

Abstract:

This paper proposes a regulation method for back-to-back connected three-level converters in order to reduce the torque ripple in an induction motor. The first part is dedicated to the presentation of the feedback control of the three-level PWM rectifier. In the second part, a DC-bus balancing algorithm for the three-level NPC voltage source inverter is presented. A theoretical analysis with a complete simulation of the system is presented to demonstrate the performance of the proposed technique.

Keywords: back-to-back connection, feedback control, neutral-point balance, three-level converter, torque ripple

Procedia PDF Downloads 497
22860 Numerical Solutions of Generalized Burgers-Fisher Equation by Modified Variational Iteration Method

Authors: M. O. Olayiwola

Abstract:

Numerical solutions of the generalized Burgers-Fisher equation are obtained using a Modified Variational Iteration Method (MVIM) with minimal computational effort. The results computed with this technique have been compared with results from other methods. The present method is seen to be a reliable alternative to some existing techniques for such nonlinear problems.
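
As a rough illustration of the iteration behind such semi-analytic schemes, the sketch below performs one standard variational-iteration correction for the Burgers-Fisher equation (δ = 1) in sympy, using the usual Lagrange multiplier λ = -1; it does not reproduce the paper's specific modification, and the coefficients and initial kink profile are assumed example values.

```python
# A minimal sketch of one variational-iteration correction step for the
# Burgers-Fisher equation u_t + a*u*u_x = u_xx + b*u*(1 - u), assuming the
# standard Lagrange multiplier lambda = -1 for a first-order time operator.
# Illustrative only; the paper's specific MVIM is not reproduced here.
import sympy as sp

x, t, s = sp.symbols('x t s', real=True)
a, b = sp.Rational(1), sp.Rational(1)          # example coefficients (assumed)

# Initial approximation u0(x): a typical kink-type profile used in the literature.
u0 = sp.Rational(1, 2) - sp.Rational(1, 2) * sp.tanh(a * x / 4)

def residual(u):
    """Residual of the Burgers-Fisher operator applied to an approximation."""
    return sp.diff(u, t) + a * u * sp.diff(u, x) - sp.diff(u, x, 2) - b * u * (1 - u)

# VIM correction: u1 = u0 + Integral_0^t (-1) * residual(u0)|_{t->s} ds
r = residual(u0).subs(t, s)
u1 = sp.simplify(u0 - sp.integrate(r, (s, 0, t)))
print(u1)
```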

Keywords: Burgers-Fisher equation, modified variational iteration method, Lagrange multiplier, Taylor's series, partial differential equation

Procedia PDF Downloads 430
22859 Marketing of Turkish Films by Crowdfunding

Authors: Nurdan Tumbek Tekeoglu

Abstract:

With its rising importance all over the world, crowdfunding has become a new financing and marketing method for the film industry. Crowdfunding is a new practice in the film industry for funding a film project by raising monetary contributions from a large group of people. Through crowdfunding, an estimated 20 billion USD was raised in 2015. Through crowdfunding platforms, not only film makers but also entrepreneurs and non-governmental organizations finance and market their projects. Among the prominent crowdfunding platforms in Turkey, we can list Crowdfon, Fonlabeni, Kickstarter, Indiego, Bi Ayda, and Fongogo. In 2014, the Turkish film industry celebrated its 100th anniversary and reached its peak, producing around 150-200 films a year and recalling the brilliant years of the Yesilcam period. In general, feature films apply for crowdfunding. By April 2015, more than 190 films had applied to crowdfunding platforms. Crowdfunding has a promising future in Turkey, since donation has traditionally held an important place in Turkish culture. This paper explores the marketing of the crowdfunding platforms established in Turkey so that films can meet their target groups during the pre-production period.

Keywords: crowdfunding, marketing of films, Turkey, Turkish film industry

Procedia PDF Downloads 352
22858 Anthraquinone Labelled DNA for Direct Detection and Discrimination of Closely Related DNA Targets

Authors: Sarah A. Goodchild, Rachel Gao, Philip N. Bartlett

Abstract:

A novel detection approach using immobilized DNA probes labeled with anthraquinone (AQ) as an electrochemically active reporter moiety has been successfully developed as a new, simple, reliable method for the detection of DNA. This method represents a step forward in DNA detection as it can discriminate between multiple nucleotide polymorphisms within target DNA strands without the need for any additional reagents, reporters, or processes such as melting of the DNA strands. The detection approach utilizes single-stranded DNA probes immobilized on gold surfaces and labeled at the distal terminus with AQ. The immobilization has been monitored using techniques such as AC impedance and Raman spectroscopy. Simple voltammetry techniques (differential pulse voltammetry, cyclic voltammetry) are then used to monitor the reduction potential of the AQ before and after the addition of the complementary strand of target DNA. A reliable relationship between the shift in reduction potential and the number of base pair mismatches has been established and can be used to discriminate between DNA from highly related pathogenic organisms of clinical importance. This indicates that the approach may have great potential to be exploited within biosensor kits for the detection and diagnosis of pathogenic organisms in point-of-care devices.
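
As a small illustration of the calibration idea, the sketch below fits the shift in AQ reduction potential against the number of base-pair mismatches with a simple linear regression; the numbers are hypothetical placeholders, not measured voltammetry data.

```python
# A minimal sketch of relating potential shift to mismatch count; all values
# below are assumed for illustration only.
from scipy.stats import linregress

mismatches = [0, 1, 2, 3, 4]                 # base-pair mismatches in the target
potential_shift_mv = [62, 48, 37, 24, 11]    # shift on duplex formation (mV), assumed

fit = linregress(mismatches, potential_shift_mv)
print(f"slope = {fit.slope:.1f} mV per mismatch, r^2 = {fit.rvalue**2:.3f}")
# An unknown target's mismatch count could then be read off the fitted line.
```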

Keywords: Anthraquinone, discrimination, DNA detection, electrochemical biosensor

Procedia PDF Downloads 393
22857 A Lightweight Authentication and Key Exchange Protocol Design for Smart Homes

Authors: Zhifu Li, Lei Li, Wanting Zhou, Yuanhang He

Abstract:

This paper proposes a lightweight certificateless authentication and key exchange protocol (Light-CL-PKC) based on elliptic curve cryptography and the Chinese Remainder Theorem for smart home scenarios. Light-CL-PKC efficiently reduces the computational cost on both sides of the authentication by forgoing time-consuming bilinear pairing operations and making full use of point-addition and point-multiplication operations on elliptic curves. The authentication and key exchange processes in this system are also completed in a single round of communication between the two parties. The analysis demonstrates that it reduces the communication overhead by more than 32.14% compared with the referenced protocols, while the runtime for both authentication and key exchange is also significantly reduced.
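
For context, the sketch below shows a plain elliptic-curve Diffie-Hellman exchange, i.e. the kind of point-multiplication-based key agreement that such certificateless schemes build on; it is not the paper's Light-CL-PKC protocol (which adds identity binding and the Chinese Remainder Theorem), and it assumes the third-party Python `cryptography` package.

```python
# A minimal ECDH sketch illustrating point multiplication-based key agreement.
# NOT the certificateless Light-CL-PKC protocol described above.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral key pair (scalar + scalar*G point).
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# Each side performs one point multiplication with the peer's public point.
alice_shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
bob_shared = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert alice_shared == bob_shared

# Derive a symmetric session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"smart-home-session").derive(alice_shared)
print(session_key.hex())
```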

Keywords: authentication, key exchange, certificateless public key cryptography, elliptic curve cryptography

Procedia PDF Downloads 98
22856 Efficient Antenna Array Beamforming with Robustness against Random Steering Mismatch

Authors: Ju-Hong Lee, Ching-Wei Liao, Kun-Che Lee

Abstract:

This paper deals with the problem of using antenna sensors for adaptive beamforming in the presence of random steering mismatch. We present an efficient adaptive array beamformer that is robust against the considered mismatch. The robustness of the proposed beamformer comes from the efficient designation of the steering vector. From the received array data vector, we construct a correlation matrix of the array data and a correlation matrix associated with the signal sources. Then, the eigenvector associated with the largest eigenvalue of the constructed signal correlation matrix is taken as the estimate of the steering vector. Finally, the adaptive weight vector required for adaptive beamforming is obtained from the estimated steering vector and the constructed correlation matrix of the array data vector. Simulation results confirm the effectiveness of the proposed method.
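
A minimal numerical sketch of this idea (principal eigenvector of a sample correlation matrix as the steering estimate, plugged into a standard MVDR/Capon weight) is given below; the array geometry, source angle, and SNR are illustrative assumptions.

```python
# A minimal sketch of eigenvector-based steering estimation feeding an MVDR
# (Capon) beamformer. Scenario parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
M, N = 8, 2000                      # sensors, snapshots
d = 0.5                             # element spacing in wavelengths
theta_true = np.deg2rad(12.0)       # actual direction (unknown to the beamformer)

a_true = np.exp(1j * 2 * np.pi * d * np.arange(M) * np.sin(theta_true))
s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
noise = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = np.outer(a_true, s) * 10**(10 / 20) + noise      # snapshots at ~10 dB SNR

R = X @ X.conj().T / N              # sample correlation matrix of the array data

# Estimate the steering vector as the principal eigenvector of R (a simple
# stand-in for the constructed signal correlation matrix in the paper).
eigvals, eigvecs = np.linalg.eigh(R)
a_hat = eigvecs[:, -1] * np.sqrt(M)  # scale so ||a||^2 = M like an ideal steering vector

# MVDR / Capon weights built from the estimated steering vector.
R_inv_a = np.linalg.solve(R, a_hat)
w = R_inv_a / (a_hat.conj() @ R_inv_a)
print("beamformer gain toward the true source:", np.abs(w.conj() @ a_true))
```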

Keywords: adaptive beamforming, antenna array, linearly constrained minimum variance, robustness, steering vector

Procedia PDF Downloads 199
22855 Real-Time Compressive Strength Monitoring for NPP Concrete Construction Using an Embedded Piezoelectric Self-Sensing Technique

Authors: Junkyeong Kim, Seunghee Park, Ju-Won Kim, Myung-Sug Cho

Abstract:

Recently, demand for the construction of nuclear power plants (NPP) using high strength concrete (HSC) has increased. However, HSC can be susceptible to brittle fracture if the curing process is inadequate. To prevent unexpected collapse during and after the construction of HSC structures, it is essential to confirm the strength development of HSC during the curing process. However, traditional strength-measuring methods are neither effective nor practical. In this study, a novel method to estimate the strength development of HSC based on electromechanical impedance (EMI) measurements using an embedded piezoelectric sensor is proposed. The EMI of an NPP concrete specimen was tracked to monitor the strength development. In addition, the cross-correlation coefficient was applied in sequence to examine the trend of the impedance variations more quantitatively. The results confirmed that the proposed technique can be successfully applied to monitoring the strength development during the curing process of HSC structures.
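
A minimal sketch of the cross-correlation metric commonly used to compare electromechanical impedance signatures is given below with synthetic spectra; the frequency range and signals are placeholders, not the NPP specimen measurements.

```python
# A minimal sketch of the cross-correlation (CC) comparison of impedance
# signatures; the data here are synthetic placeholders.
import numpy as np

def cross_correlation_coefficient(baseline, current):
    """Normalized cross-correlation between two impedance signatures."""
    b = baseline - baseline.mean()
    c = current - current.mean()
    return float(np.sum(b * c) / (np.sqrt(np.sum(b**2)) * np.sqrt(np.sum(c**2))))

freq = np.linspace(10e3, 400e3, 500)                     # Hz, assumed sweep range
baseline = np.sin(freq / 3e4) + 0.02 * np.random.randn(freq.size)
cured = np.sin(freq / 3e4 + 0.15) + 0.02 * np.random.randn(freq.size)  # shifted peaks

print(f"CC after curing step: {cross_correlation_coefficient(baseline, cured):.3f}")
```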

Keywords: concrete curing, embedded piezoelectric sensor, high strength concrete, nuclear power plant, self-sensing impedance

Procedia PDF Downloads 515
22854 Spectral Domain Fast Multipole Method for Solving Integral Equations of One and Two Dimensional Wave Scattering

Authors: Mohammad Ahmad, Dayalan Kasilingam

Abstract:

In this paper, a spectral domain implementation of the fast multipole method is presented. It is shown that the aggregation, translation, and disaggregation stages of the fast multipole method (FMM) can be performed using spectral domain (SD) analysis. The spectral domain fast multipole method (SD-FMM) has the advantage of eliminating the near-field/far-field classification used in the conventional FMM formulation. The study focuses on the application of SD-FMM to the one-dimensional (1D) and two-dimensional (2D) electric field integral equation (EFIE). The cases of a perfectly conducting strip and of circular and square cylinders are numerically analyzed and compared with results from the standard method of moments (MoM).

Keywords: electric field integral equation, fast multipole method, method of moments, wave scattering, spectral domain

Procedia PDF Downloads 406
22853 Stability Analysis of Stagnation-Point Flow past a Shrinking Sheet in a Nanofluid

Authors: Amin Noor, Roslinda Nazar, Norihan Md. Arifin

Abstract:

In this paper, a numerical and theoretical study has been performed for the stagnation-point boundary layer flow and heat transfer towards a shrinking sheet in a nanofluid. The mathematical nanofluid model in which the effect of the nanoparticle volume fraction is taken into account is considered. The governing nonlinear partial differential equations are transformed into a system of nonlinear ordinary differential equations using a similarity transformation which is then solved numerically using the function bvp4c from Matlab. Numerical results are obtained for the skin friction coefficient, the local Nusselt number as well as the velocity and temperature profiles for some values of the governing parameters, namely the nanoparticle volume fraction Φ, the shrinking parameter λ and the Prandtl number Pr. Three different types of nanoparticles are considered, namely Cu, Al2O3 and TiO2. It is found that solutions do not exist for larger shrinking rates and dual (upper and lower branch) solutions exist when λ < -1.0. A stability analysis has been performed to show which branch solutions are stable and physically realizable. It is also found that the upper branch solutions are stable while the lower branch solutions are unstable.
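
For illustration, the sketch below solves the classical similarity problem behind this class of flows, f''' + f f'' + 1 - (f')² = 0 with f(0) = 0, f'(0) = λ, and f'(∞) = 1, using SciPy's solve_bvp as a Python analogue of bvp4c; the nanofluid volume-fraction terms of the actual model are omitted, and the shrinking parameter value is an assumed example.

```python
# A minimal sketch of the plain stagnation-point similarity BVP (no nanofluid
# terms), with an assumed shrinking parameter, solved via scipy's solve_bvp.
import numpy as np
from scipy.integrate import solve_bvp

lam = -0.5            # shrinking parameter (assumed value)
eta_inf = 10.0        # numerical "infinity" for the far-field condition

def ode(eta, y):
    # y = [f, f', f'']
    f, fp, fpp = y
    return np.vstack([fp, fpp, -(f * fpp + 1.0 - fp**2)])

def bc(ya, yb):
    return np.array([ya[0], ya[1] - lam, yb[1] - 1.0])

eta = np.linspace(0.0, eta_inf, 200)
y0 = np.zeros((3, eta.size))
y0[1] = 1.0 - (1.0 - lam) * np.exp(-eta)   # smooth initial guess between lam and 1

sol = solve_bvp(ode, bc, eta, y0)
print("converged:", sol.status == 0, " skin-friction f''(0) =", sol.sol(0.0)[2])
```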

Keywords: heat transfer, nanofluid, shrinking sheet, stability analysis, stagnation-point flow

Procedia PDF Downloads 381
22852 Analytical Method Development and Validation of Stability-Indicating RP-HPLC Method for Determination of Atorvastatin and Methylcobalamin

Authors: Alkaben Patel

Abstract:

An easy, rapid, economical, precise, and accurate stability-indicating RP-HPLC method for the simultaneous estimation of atorvastatin and methylcobalamin in their combined dosage form has been developed. The separation was achieved on an LC-20 AT system with a C18 column (250 mm × 4.6 mm, 2.6 µm) using water (pH 3.5):methanol (70:30) as the mobile phase at a flow rate of 1 ml/min. The detection wavelength for this dosage form is 215 nm. The drug was subjected to the stress conditions of hydrolysis, oxidation, photolysis, and thermal degradation.

Keywords: RP-HPLC, atorvastatin, methylcobalamin, method development, validation

Procedia PDF Downloads 335
22851 Statistical Study and Simulation of 140 kV X-Ray Tube by Monte Carlo

Authors: Mehdi Homayouni, Karim Adinehvand, Bakhtiar Azadbakht

Abstract:

In this study, the Monte Carlo code MCNP4C, which is a general-purpose method, was used for the simulation. For the electron source and electric field, a disc source with a 0.05 cm radius directed at the anode was used; the radius of the disc source represents the focal spot of the X-ray tube, which here is 0.05 cm. In this simulation, the anode is made of tungsten with a density of 18.9 g/cm³, and the anode angle is 18°. The X-ray tube was simulated for 140 kV. To increase the speed of data acquisition, the F5 tally was used. By determining the exact position of the F5 tally in the program, the outputs were acquired. In this spectrum, the starting point is about 0.02 MeV, the absorption edges are at about 0.06 MeV and 0.07 MeV, and the average energy is about 0.05 MeV.

Keywords: X-spectrum, simulation, Monte Carlo, tube

Procedia PDF Downloads 722
22850 Correlation between Cephalometric Measurements and Visual Perception of Facial Profile in Skeletal Type II Patients

Authors: Choki, Supatchai Boonpratham, Suwannee Luppanapornlarp

Abstract:

The objective of this study was to find a correlation between cephalometric measurements and the visual perception of the facial profile in skeletal type II patients. In this study, 250 lateral cephalograms of female patients aged 20 to 22 years were analyzed. The profile outlines of all the samples were hand traced and transformed into silhouettes by the principal investigator. Profile ratings were done by 9 orthodontists on a visual analogue scale from one to ten (increasing level of convexity). 37 hard tissue and soft tissue cephalometric measurements were analyzed by the principal investigator. All the measurements were repeated after a 2-week interval for error assessment. Finally, the rankings of visual perception were correlated with the cephalometric measurements using the Spearman correlation coefficient (P < 0.05). The results show that an increase in facial convexity was correlated with higher values of ANB (A point, nasion and B point), AF-BF (distance from A point to B point in mm), L1-NB (distance from lower incisor to NB line in mm), anterior maxillary alveolar height, posterior maxillary alveolar height, overjet, H angle hard tissue, H angle soft tissue, and lower lip to E plane (absolute correlation values from 0.277 to 0.711). In contrast, the increase in facial convexity was correlated with lower values of Pg to N perpendicular and Pg to NB (mm) (absolute correlation values -0.302 and -0.294, respectively). Among the soft tissue measurements, the H angles had a higher correlation with visual perception than the facial contour angle, nasolabial angle, and lower lip to E plane. In conclusion, the findings of this study indicated that the correlation of cephalometric measurements with visual perception was less than expected. Only 29% of the cephalometric measurements had a significant correlation with visual perception. Therefore, diagnosis based solely on cephalometric analysis can result in failure to meet the patient's esthetic expectations.
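
A minimal sketch of the correlation step (Spearman rank correlation between one cephalometric measurement and the visual ratings) is shown below with hypothetical values, not the study data.

```python
# A minimal sketch of the Spearman correlation step; arrays are hypothetical.
from scipy.stats import spearmanr

anb_angle = [2.1, 4.5, 6.0, 3.2, 7.4, 5.1, 8.0, 2.8, 6.7, 4.0]   # degrees
visual_rating = [3, 5, 7, 4, 8, 6, 9, 3, 7, 5]                    # VAS scores, 1-10

rho, p_value = spearmanr(anb_angle, visual_rating)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")   # significant if p < 0.05
```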

Keywords: cephalometric measurements, facial profile, skeletal type II, visual perception

Procedia PDF Downloads 138
22849 Poster: Incident Signals Estimation Based on a Modified MCA Learning Algorithm

Authors: Rashid Ahmed, John N. Avaritsiotis

Abstract:

Many signal subspace-based approaches have already been proposed for determining the fixed Direction of Arrival (DOA) of plane waves impinging on an array of sensors. Two procedures for DOA estimation based on neural networks are presented. First, Principal Component Analysis (PCA) is employed to extract the maximum eigenvalue and its eigenvector from the signal subspace to estimate the DOA. Second, Minor Component Analysis (MCA) is a statistical method for extracting the eigenvector associated with the smallest eigenvalue of the covariance matrix. In this paper, we modify a Minor Component Analysis (MCA(R)) learning algorithm to enhance convergence, which is essential for bringing the MCA algorithm towards practical applications. A learning rate parameter that ensures fast convergence is also presented, since it has a direct effect on the convergence of the weight vector, and the error level is affected by this value. MCA is performed to determine the estimated DOA. Preliminary results are furnished to illustrate the convergence achieved.
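
For illustration, the sketch below runs a generic anti-Hebbian minor-component update with renormalization and compares it with the exact smallest-eigenvalue eigenvector; it is not the paper's modified MCA(R) algorithm, and the covariance matrix and learning rate are arbitrary assumptions.

```python
# A minimal sketch of a minor-component learning rule: anti-Hebbian update with
# explicit renormalization, driven by random snapshots. Generic illustration only.
import numpy as np

rng = np.random.default_rng(1)
# Build an example covariance with distinct eigenvalues.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
R = Q @ np.diag([5.0, 3.0, 2.0, 0.5]) @ Q.T

eta = 1e-3                                   # learning rate (small for stability)
w = rng.standard_normal(4)
w /= np.linalg.norm(w)

for _ in range(50000):
    x = rng.multivariate_normal(np.zeros(4), R)   # one data snapshot
    y = w @ x                                     # neuron output
    w = w - eta * y * x                           # anti-Hebbian (minor-component) step
    w /= np.linalg.norm(w)                        # renormalize the weight vector

# Compare with the exact eigenvector of the smallest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(R)
v_min = eigvecs[:, 0]
print("alignment |cos| with the true minor component:", abs(w @ v_min))
```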

Keywords: Direction of Arrival, neural networks, Principal Component Analysis, Minor Component Analysis

Procedia PDF Downloads 451
22848 Microgravity, Hydrological and Metrological Monitoring of Shallow Ground Water Aquifer in Al-Ain, UAE

Authors: Serin Darwish, Hakim Saibi, Amir Gabr

Abstract:

The United Arab Emirates (UAE) is situated within an arid zone where groundwater recharge is very low. Groundwater is the primary source of water in the UAE. However, rapid expansion, population growth, agriculture, and industrial activities have negatively affected these limited water resources. The shortage of water resources has become a serious concern due to the over-pumping of groundwater to meet demand. In addition to the groundwater deficit, the UAE has one of the highest per capita water consumption rates in the world. In this study, a combination of time-lapse microgravity measurements and depth-to-groundwater levels in selected wells in Al-Ain city was used to estimate the variations in groundwater storage. Al-Ain is the second largest city in the Emirate of Abu Dhabi and the third largest city in the UAE. The groundwater in this region has been overexploited. Relative gravity measurements were acquired using the Scintrex CG-6 Autograv. This latest-generation gravimeter from Scintrex Ltd provides fast, precise gravity measurements with automated corrections for temperature, tide, and instrument tilt, and rejection of noisy data. The CG-6 gravimeter has a resolution of 0.1 μGal. The purpose of this study is to measure groundwater storage changes in the shallow aquifers based on the application of the microgravity method. The gravity method is a nondestructive technique that allows collection of data at almost any location over the aquifer. Preliminary results indicate a possible relationship between microgravity and water levels, but more work needs to be done to confirm this. The results will help to develop the relationship between monthly microgravity changes and the hydrological and hydrogeological changes of the shallow phreatic aquifer. The study will be useful for water management considerations and additional future investigations.
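
As a back-of-the-envelope illustration of how a time-lapse gravity change maps to groundwater storage, the sketch below uses the infinite Bouguer slab approximation (about 41.9 µGal per metre of free-standing water); the gravity difference and specific yield are hypothetical, not measurements from the Al-Ain survey.

```python
# A minimal sketch of converting a gravity change to an equivalent change in
# groundwater level via the Bouguer slab approximation. Inputs are hypothetical.
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0       # kg/m^3
PI = 3.141592653589793

def water_table_change_from_gravity(delta_g_microgal, specific_yield=0.15):
    """Water-table change (m) implied by a gravity change (microGal)."""
    bouguer_slab = 2 * PI * G * RHO_WATER            # m/s^2 per metre of water
    microgal_per_m = bouguer_slab / 1e-8             # 1 microGal = 1e-8 m/s^2 (~41.9)
    equivalent_water = delta_g_microgal / microgal_per_m   # metres of free water
    return equivalent_water / specific_yield         # metres of water-table change

print(f"{water_table_change_from_gravity(10.0):.2f} m water-table rise for +10 microGal")
```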

Keywords: Al-Ain, arid region, groundwater, microgravity

Procedia PDF Downloads 152
22847 Analytical Solution for End Depth Ratio in Rectangular Channels

Authors: Abdulrahman Abdulrahman, Abir Abdulrahman

Abstract:

The free over-fall is an instrument for measuring discharge in open channels by measuring the end depth. Researchers have comprehensively investigated the brink phenomenon, theoretically and experimentally, with various approaches and for different cross-sectional shapes. Anderson's method, based on the Boussinesq approximation and an energy approach, was used to derive a pressure distribution factor at the end depth. Applying the one-dimensional momentum equation and the principles of limit slope analysis, a relevant analytical solution may be derived for the brink depth ratio (EDR) in a prismatic rectangular channel. Relationships between the end depth ratio and the slope ratio for a given non-dimensional normal or critical depth with an upstream supercritical flow regime are also presented. A simple indirect procedure is used to estimate the end depth discharge ratio (EDD) for subcritical and supercritical flow using the measured end depth. The comparison of this analysis with all previous theoretical and experimental studies showed excellent agreement.

Keywords: analytical solution, brink depth, end depth, flow measurement, free over fall, hydraulics, rectangular channel

Procedia PDF Downloads 181
22846 The Role of Academic Leaders at Jerash University in Crisis Management: 'Coronavirus as a Model'

Authors: Khaled M. Hama, Mohammed Al Magableh, Zaid Al Kuri, Ahmad Qayam

Abstract:

The study aimed to identify the role of academic leaders at Jerash University in crisis management from the faculty members' point of view, with the emerging Corona pandemic as a model; to identify differences in the role of academic leaders at Jerash University in crisis management at the significance level (α ≤ 0.05) according to the study variables of gender, academic rank, and years of experience; and to identify proposals that contribute to developing the performance of academic leaders at Jerash University in crisis management, with the Corona pandemic as a model. The study was applied to a randomly selected sample of (72) faculty members at Jerash University. The researcher designed a questionnaire as the study tool; it included two parts: the first part covered the personal data of the study sample members, and the second part was divided into five areas and (34) items to reveal the role of academic leaders at Jerash University in crisis management, with the Corona pandemic as a model. The validity and reliability of the tool were confirmed, and the study used the descriptive analytical method. The study reached the following results: the role of academic leaders at Jerash University in crisis management from the point of view of faculty members, with the emerging Corona pandemic as a model, was rated to a high degree, and there were no statistically significant differences at the level of statistical significance (α = 0.05) between the arithmetic means of the study sample's estimates of the role of academic leaders at Jerash University in crisis management attributed to the study variables (gender, academic rank, and years of experience).

Keywords: academic leaders, crisis management, corona pandemic, Jerash University

Procedia PDF Downloads 54
22845 Simulation of 140 kV X-Ray Tube by MCNP4C Code

Authors: Amin Sahebnasagh, Karim Adinehvand, Bakhtiar Azadbakht

Abstract:

In this study, the Monte Carlo code MCNP4C, which is a general-purpose method, was used for the simulation. For the electron source and electric field, a disc source with a 0.05 cm radius directed at the anode was used; the radius of the disc source represents the focal spot of the X-ray tube, which here is 0.05 cm. In this simulation, the anode is made of tungsten with a density of 18.9 g/cm³, and the anode angle is 18°. The X-ray tube was simulated for 140 kV. To increase the speed of data acquisition, the F5 tally was used. By determining the exact position of the F5 tally in the program, the outputs were acquired. In this spectrum, the starting point is about 0.02 MeV, the absorption edges are at about 0.06 MeV and 0.07 MeV, and the average energy is about 0.05 MeV.

Keywords: x-spectrum, simulation, Monte Carlo, MCNP4C code

Procedia PDF Downloads 646
22844 Total Controllability of the Second Order Nonlinear Differential Equation with Delay and Non-Instantaneous Impulses

Authors: Muslim Malik, Avadhesh Kumar

Abstract:

A stronger concept of exact controllability, called total controllability, is introduced in this manuscript. Sufficient conditions have been established for the total controllability of a control problem governed by a second-order nonlinear differential equation with delay and non-instantaneous impulses in a Banach space X. The results are obtained using the strongly continuous cosine family and the Banach fixed point theorem. The total controllability of an integrodifferential problem is also investigated. Finally, some numerical examples are provided to illustrate the analytical findings.

Keywords: Banach fixed point theorem, non-instantaneous impulses, strongly continuous cosine family, total controllability

Procedia PDF Downloads 298
22843 Design and Implementation of DC-DC Converter with Inc-Cond Algorithm

Authors: Mustafa Engin Başoğlu, Bekir Çakır

Abstract:

The most important component affecting the efficiency of photovoltaic power systems is the solar panel. The efficiency of these systems is significantly limited by the low efficiency of the solar panel. Therefore, solar panels should be operated at the maximum power point through a power converter. In this study, a boost converter with maximum power point tracking (MPPT) operation has been designed and implemented with the Incremental Conductance (Inc-Cond) algorithm using direct duty control. Furthermore, it is shown that the MPPT operation of the boost converter fails when a low load resistance is connected.
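
A minimal sketch of the Inc-Cond decision logic with direct duty control is given below; the step size, tolerance, toy samples, and the sign convention for a boost stage with the panel at its input are assumptions for illustration, not the actual controller implemented in the study.

```python
# A minimal sketch of the incremental-conductance decision rule with direct duty
# control for an MPPT boost converter with the PV panel at the converter input.
# All numeric values are assumed illustrative numbers.

def inc_cond_step(v, i, v_prev, i_prev, duty, step=0.005, tol=1e-3):
    """Return an updated duty cycle based on the Inc-Cond criterion."""
    dv, di = v - v_prev, i - i_prev
    if abs(dv) < 1e-9:                       # voltage unchanged
        if abs(di) < 1e-9:
            return duty                      # at the MPP, no change
        # di > 0 means the operating voltage should rise -> lower the duty cycle.
        duty += -step if di > 0 else step
    else:
        inc_conductance = di / dv
        if abs(inc_conductance + i / v) < tol:
            return duty                      # dP/dV ~ 0: at the MPP
        # dP/dV > 0 (left of the MPP): raise the panel voltage, which for a boost
        # stage with the panel at its input means lowering the duty cycle.
        duty += -step if inc_conductance > -i / v else step
    return min(max(duty, 0.0), 0.95)         # clamp to a safe duty range

# Tiny usage example with made-up consecutive panel samples (V, I):
duty = 0.5
samples = [(30.0, 5.0), (29.5, 5.1), (29.0, 5.2)]
v_prev, i_prev = samples[0]
for v, i in samples[1:]:
    duty = inc_cond_step(v, i, v_prev, i_prev, duty)
    v_prev, i_prev = v, i
print(f"duty cycle after updates: {duty:.3f}")
```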

Keywords: boost converter, incremental conductance (Inc-Cond), MPPT, solar panel

Procedia PDF Downloads 1046
22842 Analysis of Exponential Distribution under Step Stress Partially Accelerated Life Testing Plan Using Adaptive Type-I Hybrid Progressive Censoring Schemes with Competing Risks Data

Authors: Ahmadur Rahman, Showkat Ahmad Lone, Ariful Islam

Abstract:

In this article, the parameters of the failure times of units are estimated based on the adaptive type-I progressive hybrid censoring scheme under step-stress partially accelerated life tests with competing risks. The failure times of the units are assumed to follow an exponential distribution. The maximum likelihood estimation technique is used to estimate the unknown parameters of the distribution and the tampering coefficient. Confidence intervals are also obtained for the parameters. A simulation study using the Monte Carlo method is performed to check the validity of the model and its assumptions.
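
As a small illustration of the estimation step, the sketch below computes the closed-form maximum likelihood estimate of an exponential rate under simple type-I (time) censoring; the competing-risk structure, tampering coefficient, and adaptive progressive scheme of the paper are not reproduced, and the simulated rate, sample size, and censoring time are arbitrary assumptions.

```python
# A minimal sketch of exponential MLE under type-I censoring; parameters are
# arbitrary assumptions, not the design used in the paper.
import numpy as np

rng = np.random.default_rng(42)
true_rate, n, tau = 0.5, 200, 3.0            # hazard rate, units on test, censoring time

lifetimes = rng.exponential(scale=1.0 / true_rate, size=n)
observed = np.minimum(lifetimes, tau)        # censored observation times
failed = lifetimes <= tau                    # indicator of an observed failure

# For the exponential model the MLE has a closed form:
# rate_hat = (number of failures) / (total time on test).
rate_hat = failed.sum() / observed.sum()
se_hat = rate_hat / np.sqrt(failed.sum())    # large-sample standard error
ci = (rate_hat - 1.96 * se_hat, rate_hat + 1.96 * se_hat)
print(f"MLE rate = {rate_hat:.3f}, approx. 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```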

Keywords: adaptive type-I hybrid progressive censoring, competing risks, exponential distribution, simulation, step-stress partially accelerated life tests

Procedia PDF Downloads 343
22841 Discerning Divergent Nodes in Social Networks

Authors: Mehran Asadi, Afrand Agah

Abstract:

In data mining, partitioning is used as a fundamental tool for classification. With the help of partitioning, we study the structure of data, which allows us to envision decision rules that can be applied to classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of being a divergent node. We used the R statistical computing language to conduct the analyses in this report. The data were found on the UC Irvine Machine Learning Repository. This research introduces the basic concepts of classification in online social networks. In this work, we address overfitting and describe different approaches for the evaluation and performance comparison of different classification methods. In classification, the main objective is to categorize different items and assign them to different groups based on their properties and similarities. In data mining, recursive partitioning is used to probe the structure of a data set, which allows us to envision decision rules and apply them to classify data into several groups. Estimating densities is hard, especially in high dimensions with limited data. Of course, we do not know the densities, but we can estimate them using classical techniques. First, we calculated the correlation matrix of the dataset to see whether any predictors are highly correlated with one another. By calculating the correlation coefficients for the predictor variables, we see that density is strongly correlated with transitivity. We initialized a data frame to easily compare the quality of the resulting classification methods and utilized decision trees (with k-fold cross-validation to prune the tree). The method applied to this dataset is the decision tree. A decision tree is a non-parametric classification method that uses a set of rules to predict that each observation belongs to the most commonly occurring class label of the training data. Our method aggregates many decision trees to create an optimized model that is not susceptible to overfitting. When using a decision tree, however, it is important to use cross-validation to prune the tree in order to narrow it down to the most important variables.
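
A minimal sketch of this workflow (correlation check, then a pruned decision tree scored with k-fold cross-validation) is given below in Python with scikit-learn; the synthetic features and the 'divergent' label are stand-ins for the actual social-network attributes, and the study itself used R.

```python
# A minimal sketch of the correlation check + pruned decision tree with k-fold
# cross-validation; features and labels below are synthetic stand-ins.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({
    "degree": rng.poisson(8, n),
    "transitivity": rng.uniform(0, 1, n),
    "density": rng.uniform(0, 1, n),
})
# Hypothetical label: "divergent" nodes tend to have low transitivity.
df["divergent"] = (df["transitivity"] + 0.1 * rng.standard_normal(n) < 0.3).astype(int)

print(df.corr().round(2))                    # check for strongly correlated predictors

X, y = df[["degree", "transitivity", "density"]], df["divergent"]
tree = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0)   # cost-complexity pruning
scores = cross_val_score(tree, X, y, cv=5)                      # 5-fold cross-validation
print("mean CV accuracy:", scores.mean().round(3))
```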

Keywords: online social networks, data mining, social cloud computing, interaction and collaboration

Procedia PDF Downloads 157
22840 The Effect of Artificial Intelligence on Accounting and Finance

Authors: Evrime Fawzy Ishak Gadelsayed

Abstract:

This paper presents resource consumption accounting as an innovative approach to management accounting, which concentrates on managers as the key users of the information and offers adequate information relative to conventional management accounting. This approach emphasizes that the organization's resources drive costs; consequently, in costing frameworks, the emphasis should be on resources and their usage. Resource consumption accounting combines two costing methodologies, activity-based costing and the German cost accounting approach called GPK. This methodology, however, is a challenge for managers when making the management accounting task operational. The purpose of this article is to clarify the concept of resource consumption accounting, its elements and features, and the use of this approach in organizations. In the first section, we present resource consumption accounting, its basis, the reasons for its development, and the problems faced by past costing frameworks. We then give the requirements and assumptions of this approach; finally, we describe the implementation of this approach in organizations and its advantages over other costing techniques.

Keywords: financial statement fraud, forensic accounting, fraud prevention and detection, auditing, audit expectation gap, corporate governance, resource consumption accounting, management accounting, activity-based method, German cost accounting method

Procedia PDF Downloads 4
22839 A Novel Way to Create Qudit Quantum Error Correction Codes

Authors: Arun Moorthy

Abstract:

Quantum computing promises to provide algorithmic speedups for a number of tasks; however, as in classical computing, effective error-correcting codes are needed. Current quantum computers require costly equipment to control each particle, so having fewer particles to control is ideal. Although traditional quantum computers are built using qubits (2-level systems), qudits (systems with more than 2 levels) are appealing since they can provide an equivalent computational space using fewer particles, meaning fewer particles need to be controlled. Currently, qudit quantum error-correction codes are available for different-level qudit systems; however, these codes sometimes have overly specific constraints. When building a qudit system, it is important for researchers to have access to many codes to satisfy their requirements. This project addresses two methods to increase the number of quantum error-correcting codes available to researchers. The first method is generating new codes for a given set of parameters. The second method is generating new error-correction codes by using existing codes as a starting point to generate codes for another level (e.g., a 5-level system code based on a 2-level system code). This project therefore builds a website that researchers can use to generate new error-correction codes or codes based on existing codes.

Keywords: qudit, error correction, quantum, qubit

Procedia PDF Downloads 160
22838 Evaluation of Double Displacement Process via Gas Dumpflood from Multiple Gas Reservoirs

Authors: B. Rakjarit, S. Athichanagorn

Abstract:

The double displacement process is a method in which gas is injected at an updip well to displace the oil bypassed by the waterflooding operation carried out from a downdip water injector. As gas injection is costly and a large amount of gas is needed, gas dump-flood from multiple gas reservoirs is an attractive alternative. The objective of this paper is to demonstrate the benefits of the novel approach of the double displacement process via gas dump-flood from multiple gas reservoirs. A reservoir simulation model consisting of a dipping oil reservoir and several underlying layered gas reservoirs was constructed in order to investigate the performance of the proposed method. Initially, water was injected via the downdip well to displace oil towards the producer located updip. When the water cut at the producer became high, the updip well was shut in and perforated in the gas zones in order to dump gas into the oil reservoir. At this point, the downdip well was opened for production. In order to optimize oil recovery, the oil production and water injection rates and the perforation strategy for the gas reservoirs were investigated for different numbers of gas reservoirs having various depths and thicknesses. Gas dump-flood from multiple gas reservoirs can help increase the oil recovery after implementation of waterflooding by up to 10%. Although the amount of additional oil recovery is slightly lower than that obtained in the conventional double displacement process, the proposed process requires only a small completion cost for the gas zones and no operating cost, while the conventional method incurs a high capital investment in a gas compression facility and a high-pressure gas pipeline, as well as additional operating costs. From the simulation study, oil recovery can be optimized by producing oil at a suitable rate and perforating the gas zones with the right strategy, which depends on the depths, thicknesses, and number of the gas reservoirs. The conventional double displacement process has been studied and successfully implemented in many fields around the world. However, the method of dumping gas into the oil reservoir instead of injecting it from the surface during the second displacement process has never been studied. The study of this novel approach will help practicing engineers to understand the benefits of such a method and to implement it with minimum cost.

Keywords: gas dump-flood, multi-gas layers, double displacement process, reservoir simulation

Procedia PDF Downloads 408
22837 A Coupled System of Caputo-Type Katugampola Fractional Differential Equations with Integral Boundary Conditions

Authors: Yacine Arioua

Abstract:

In this paper, we investigate the existence and uniqueness of solutions for a coupled system of nonlinear Caputo-type Katugampola fractional differential equations with integral boundary conditions. Based upon the contraction mapping principle and Schauder's fixed point theorem, some new existence and uniqueness results for solutions of the given problems are obtained. As an application, some examples are given to illustrate the usefulness of our main results.
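
For reference, a commonly used form of the Katugampola fractional integral and the Caputo-type derivative built from it is sketched below; the exact conventions (normalization, parameter ranges) are assumed from the general literature and may differ slightly from those adopted in the paper.

```latex
% Katugampola fractional integral of order \alpha > 0 with parameter \rho > 0,
% and the Caputo-type (Caputo-Katugampola) derivative built from it;
% conventions as commonly stated in the literature (assumed here).
\[
\bigl({}^{\rho}I_{a^+}^{\alpha}f\bigr)(t)
  = \frac{\rho^{\,1-\alpha}}{\Gamma(\alpha)}
    \int_{a}^{t} \frac{\tau^{\rho-1}}{(t^{\rho}-\tau^{\rho})^{1-\alpha}}\, f(\tau)\,d\tau ,
\qquad
\bigl({}^{C}_{\rho}D_{a^+}^{\alpha}f\bigr)(t)
  = \Bigl({}^{\rho}I_{a^+}^{\,n-\alpha}
      \Bigl(t^{1-\rho}\tfrac{d}{dt}\Bigr)^{\!n} f\Bigr)(t),
\quad n-1<\alpha\le n .
\]
```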

Keywords: fractional differential equations, coupled system, Caputo-Katugampola derivative, fixed point theorems, existence, uniqueness

Procedia PDF Downloads 264
22836 A Dynamic Cardiac Single Photon Emission Computed Tomography Using Conventional Gamma Camera to Estimate Coronary Flow Reserve

Authors: Maria Sciammarella, Uttam M. Shrestha, Youngho Seo, Grant T. Gullberg, Elias H. Botvinick

Abstract:

Background: Myocardial perfusion imaging (MPI) is typically performed with static imaging protocols and visually assessed for perfusion defects based on the relative intensity distribution. Dynamic cardiac SPECT, on the other hand, is a new imaging technique based on time-varying information about the radiotracer distribution, which permits quantification of myocardial blood flow (MBF). In this abstract, we report the progress and current status of dynamic cardiac SPECT using a conventional gamma camera (Infinia Hawkeye 4, GE Healthcare) for the estimation of myocardial blood flow and coronary flow reserve. Methods: A group of patients at high risk of coronary artery disease was enrolled to evaluate our methodology. A low-dose/high-dose rest/pharmacologic-induced-stress protocol was implemented. A standard rest and a standard stress radionuclide dose of ⁹⁹ᵐTc-tetrofosmin (140 keV) were administered. The dynamic SPECT data for each patient were reconstructed using the standard 4-dimensional maximum likelihood expectation maximization (ML-EM) algorithm. The acquired data were used to estimate the myocardial blood flow (MBF). The correspondence between flow values in the main coronary vasculature and the myocardial segments defined by the standardized myocardial segmentation and nomenclature was derived. The coronary flow reserve, CFR, was defined as the ratio of stress to rest MBF values. CFR values estimated with SPECT were also validated against dynamic PET. Results: The range of territorial MBF in the LAD, RCA, and LCX was 0.44 ml/min/g to 3.81 ml/min/g. The MBF values estimated with PET and SPECT in an independent cohort of 7 patients showed a statistically significant correlation, r = 0.71 (p < 0.001), while the corresponding CFR correlation was moderate, r = 0.39, yet statistically significant (p = 0.037). The mean stress MBF value was significantly lower for angiographically abnormal than for normal territories (normal mean MBF = 2.49 ± 0.61, abnormal mean MBF = 1.43 ± 0.62, p < 0.001). Conclusions: Visually assessed image findings in clinical SPECT are subjective and may not reflect direct physiologic measures of a coronary lesion. The MBF and CFR measured with dynamic SPECT are fully objective and available only with the data generated by the dynamic SPECT method. A quantitative approach such as measuring CFR using dynamic SPECT imaging is a better mode of diagnosing CAD than visual assessment of stress and rest images from static SPECT.
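
As a small illustration of the flow-reserve arithmetic, the sketch below computes per-territory CFR as the stress-to-rest MBF ratio and correlates it with reference values; all numbers are hypothetical placeholders, not patient data from this study.

```python
# A minimal sketch of the CFR = stress MBF / rest MBF calculation and its
# comparison against reference values. All values below are hypothetical.
import numpy as np
from scipy.stats import pearsonr

rest_mbf_spect = np.array([0.60, 0.80, 0.70, 0.90, 0.75, 0.65, 0.85])    # ml/min/g
stress_mbf_spect = np.array([1.5, 2.4, 1.1, 2.7, 1.9, 1.0, 2.5])
cfr_spect = stress_mbf_spect / rest_mbf_spect

cfr_pet = np.array([2.3, 3.1, 1.7, 3.2, 2.4, 1.6, 3.0])                  # reference values
r, p = pearsonr(cfr_spect, cfr_pet)
print("CFR (SPECT):", np.round(cfr_spect, 2))
print(f"correlation with reference CFR: r = {r:.2f}, p = {p:.3f}")
```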

Keywords: dynamic SPECT, clinical SPECT/CT, selective coronary angiography, ⁹⁹ᵐTc-tetrofosmin

Procedia PDF Downloads 150
22835 The Structure and Composition of Plant Communities in Ajloun Forest Reserve in Jordan

Authors: Maher J. Tadros, Yaseen Ananbeh

Abstract:

The study area is located in Ajloun Forest Reserve in the northern part of Jordan. It consists of Mediterranean hills dominated by open woodlands of oak and pistachio. The aims of the study were to investigate the positive and negative relationships between the locals and the protected area and how these relationships can affect long-term forest conservation. The main research objectives are to review the impact of establishing Ajloun Forest Reserve on nature conservation and on the livelihood of local communities around the reserve. Ajloun Forest Reserve plays a fundamental role in the development of the Ajloun area. The existence of nature conservation initiatives in the area supports various socio-economic activities around the reserve that contribute towards the development of local communities in the Ajloun area. Part of this research was to conduct a survey of the impact of Ajloun Forest Reserve on biodiversity composition and to study the biodiversity content, especially of vegetation, in order to determine the economic impacts of the reserve on its surroundings. In this study, several methods were used to meet the objectives, including the point-centered quarter method, which involves randomly selecting 50 plots at the study site. The data collected from the field showed that the absolute density was 1031.24 plants per hectare. Density was found to be highest for Quercus coccifera, with a relative density of 73.7%; this was followed by Pistacia palaestina (relative density 10.5%), Arbutus andrachne (relative density 7.1%), and Crataegus azarolus (82.5 plants/ha, relative density 5.1%).
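
As a small illustration of how point-centered quarter data translate into the density figures quoted above, the sketch below applies the Cottam-Curtis estimator (total density = 1 / mean point-to-tree distance squared) to made-up distances and species counts, not the Ajloun field data.

```python
# A minimal sketch of absolute and relative density from point-centered quarter
# data; the distances and species tallies below are made-up illustrations.
import numpy as np

# One row per sampling point: distance (m) to the nearest tree in each of 4 quarters.
distances_m = np.array([
    [2.1, 3.4, 1.8, 2.9],
    [4.0, 2.2, 3.1, 2.7],
    [1.9, 2.8, 3.6, 2.4],
])
species_counts = {"Quercus coccifera": 9, "Pistacia palaestina": 2, "Arbutus andrachne": 1}

mean_distance = distances_m.mean()                     # metres
density_per_m2 = 1.0 / mean_distance**2                # trees per square metre
density_per_ha = density_per_m2 * 10_000               # trees per hectare

total = sum(species_counts.values())
relative_density = {sp: 100 * n / total for sp, n in species_counts.items()}

print(f"absolute density ~ {density_per_ha:.0f} trees/ha")
print(relative_density)
```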

Keywords: composition, density, frequency, importance value, point-centered quarter, structure, tree cover

Procedia PDF Downloads 278
22834 Numerical Simulations of Acoustic Imaging in Hydrodynamic Tunnel with Model Adaptation and Boundary Layer Noise Reduction

Authors: Sylvain Amailland, Jean-Hugh Thomas, Charles Pézerat, Romuald Boucheron, Jean-Claude Pascal

Abstract:

Noise requirements for naval and research vessels have become increasingly demanding, calling for quieter ships in order to fulfil current regulations and to reduce the effects on marine life. Hence, new methods dedicated to the characterization of propeller noise, which is the main source of noise in the far field, are needed. The study of cavitating propellers in a closed test section is useful for analyzing hydrodynamic performance but can involve significant difficulties for hydroacoustic study, especially due to reverberation and boundary layer noise in the tunnel. The aim of this paper is to present a numerical methodology for the identification of hydroacoustic sources on marine propellers using hydrophone arrays in a large hydrodynamic tunnel. The main difficulties are linked to the reverberation of the tunnel and the boundary layer noise, which strongly reduce the signal-to-noise ratio. In this paper, it is proposed to estimate the reflection coefficients using an inverse method and reference transfer functions measured in the tunnel. This approach reduces the uncertainties of the propagation model used in the inverse problem. In order to reduce the boundary layer noise, a cleaning algorithm taking advantage of the low-rank and sparse structure of the cross-spectral matrices of the acoustic signal and the boundary layer noise is presented. This approach recovers the acoustic signal even when it lies well below the boundary layer noise. The improvement brought by this method is visible on acoustic maps resulting from the beamforming and DAMAS algorithms.
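
For intuition, the sketch below runs a generic principal-component-pursuit (robust PCA) iteration that splits a toy real-valued matrix into low-rank and sparse parts; it only illustrates the low-rank-plus-sparse assumption, not the denoising algorithm of the paper, which operates on complex cross-spectral matrices.

```python
# A minimal sketch of a low-rank + sparse decomposition M = L + S via a generic
# robust-PCA (inexact ALM) iteration on a toy real-valued matrix.
import numpy as np

def shrink(x, tau):
    """Element-wise soft thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def rpca(M, n_iter=100, tol=1e-7):
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))
    mu = 1.25 / np.linalg.norm(M, 2)         # penalty parameter from the spectral norm
    Y = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Singular-value thresholding gives the low-rank part.
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(shrink(sig, 1.0 / mu)) @ Vt
        # Element-wise soft thresholding gives the sparse part.
        S = shrink(M - L + Y / mu, lam / mu)
        Y = Y + mu * (M - L - S)
        mu *= 1.5                            # inexact ALM: grow the penalty
        if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
            break
    return L, S

rng = np.random.default_rng(3)
low_rank = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 60))   # rank-5 part
sparse = np.zeros((60, 60))
idx = rng.choice(60 * 60, size=180, replace=False)
sparse.ravel()[idx] = 10 * rng.standard_normal(180)

L, S = rpca(low_rank + sparse)
print("rank of recovered L:", np.linalg.matrix_rank(L, tol=1e-3))
print("relative low-rank error:", np.linalg.norm(L - low_rank) / np.linalg.norm(low_rank))
```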

Keywords: acoustic imaging, boundary layer noise denoising, inverse problems, model adaptation

Procedia PDF Downloads 335