Search results for: probability formula

1250 Pure Economic Loss: A Trouble Child

Authors: Isabel Mousinho de Figueiredo

Abstract:

Pure economic loss can be brought into the 21st century and become a useful tool to keep the tort of negligence within reasonable limits, provided the concept is minutely reexamined. The term came about when wealth was physical, and Law wanted to be a modern science. As a tool to draw the line, it leads to satisfactory decisions in most cases, but needlessly creates distressing conundrums in others, and these are the ones parties bother to litigate about. Economic loss is deemed to be pure based on a blind negative criterion of physical harm, that inadvertently smelts vastly disparate problems into an indiscernible mass, with arbitrary outcomes. These shortcomings are usually dismissed as minor byproducts, for the lack of a better formula. Law could instead stick to the sound paradigms of the intended rule, and be more specific in identifying the losses deserving of compensation. This would provide a better service to Bench and Bar, and effectively assist everyone navigating the many challenges of Accident Law.

Keywords: accident law, comparative tort law, negligence, pure economic loss

Procedia PDF Downloads 95
1249 Optimum Dimensions of Hydraulic Structures Foundation and Protections Using Coupled Genetic Algorithm with Artificial Neural Network Model

Authors: Dheyaa W. Abbood, Rafa H. AL-Suhaili, May S. Saleh

Abstract:

A model using artificial neural networks and the genetic algorithm technique is developed for obtaining the optimum dimensions of the foundation length and protections of small hydraulic structures. The procedure involves optimizing an objective function comprising a weighted summation of the state variables. The decision variables considered in the optimization are the upstream and downstream cutoff lengths and their angles of inclination, the foundation length, and the length of the downstream soil protection. These were obtained for a given maximum difference in head, depth of impervious layer, and degree of anisotropy. The optimization was carried out subject to constraints that ensure a safe structure against the uplift pressure force and a sufficient protection length at the downstream side of the structure to overcome an excessive exit gradient. The Geo-Studio software was used to analyze 1200 different cases. For each case, the length of protection and volume of structure required to satisfy the safety factors mentioned previously were estimated. An ANN model was developed and verified using these cases' input-output sets as its database. A MATLAB code was written to perform genetic algorithm optimization coupled with this ANN model using a formulated optimization model. A sensitivity analysis was done for selecting the crossover probability, the mutation probability and level, the population size, the position of the crossover, and the weights of all the terms of the objective function. Results indicate that the factor that most affects the optimum solution is the required population size. The minimum value of this parameter that gives a stable global optimum solution is 30,000, while the other variables have little effect on the optimum solution.
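
To make the GA side concrete, here is a minimal sketch of a real-coded genetic algorithm driven by a crossover probability and a mutation probability, with a stand-in objective in place of the paper's trained ANN surrogate; the population size, gene count, and operators are illustrative, not the study's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Stand-in for the trained ANN surrogate; the paper's objective is a
    # weighted summation of state variables (safety factors, lengths, volume).
    return np.sum(x ** 2)

def run_ga(n_pop=100, n_genes=5, p_cross=0.8, p_mut=0.05, n_gen=200):
    pop = rng.uniform(0.0, 1.0, size=(n_pop, n_genes))
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        # Tournament selection (minimisation).
        idx = rng.integers(0, n_pop, size=(n_pop, 2))
        winners = np.where(scores[idx[:, 0]] < scores[idx[:, 1]],
                           idx[:, 0], idx[:, 1])
        pop = pop[winners]
        # Single-point crossover, applied with probability p_cross.
        for i in range(0, n_pop - 1, 2):
            if rng.random() < p_cross:
                cut = rng.integers(1, n_genes)
                pop[i, cut:], pop[i + 1, cut:] = (pop[i + 1, cut:].copy(),
                                                  pop[i, cut:].copy())
        # Per-gene mutation, applied with probability p_mut.
        mask = rng.random(pop.shape) < p_mut
        pop[mask] = rng.uniform(0.0, 1.0, size=mask.sum())
    return pop[np.argmin([fitness(ind) for ind in pop])]

best = run_ga()
```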

Keywords: inclined cutoff, optimization, genetic algorithm, artificial neural networks, geo-studio, uplift pressure, exit gradient, factor of safety

Procedia PDF Downloads 303
1248 Modification of Newton Method in Two Point Block Backward Differentiation Formulas

Authors: Khairil I. Othman, Nur N. Kamal, Zarina B. Ibrahim

Abstract:

In this paper, we present a modified Newton method as a new strategy for improving the efficiency of two-point Block Backward Differentiation Formulas (BBDF) when solving stiff systems of ordinary differential equations (ODEs). These methods are constructed to produce two approximate solutions simultaneously at each iteration. The detailed implementation of the predictor-corrector BBDF in PE(CE)² mode with modified Newton iteration is discussed. The proposed modification of BBDF is validated through numerical results on some standard problems found in the literature, and comparisons are made with the existing Block Backward Differentiation Formula. Numerical results show the advantage of using the new strategy for solving stiff ODEs in improving the accuracy of the solution.
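
For readers unfamiliar with the modified Newton economy, the sketch below applies it to a single backward-Euler step of a stiff scalar ODE: the iteration matrix is formed once per step and reused across iterations. This only illustrates the iteration itself; the paper applies it inside a two-point block BDF, not backward Euler:

```python
import numpy as np

def modified_newton_step(f, J, y_prev, t, h, tol=1e-10, max_iter=20):
    """One backward-Euler step y = y_prev + h*f(t+h, y), solved by a
    modified Newton iteration: the Jacobian-based iteration matrix is
    formed once per step and reused, the usual economy in stiff solvers."""
    y = y_prev.copy()                       # trivial predictor
    A = np.eye(len(y)) - h * J(t + h, y)    # frozen iteration matrix
    A_inv = np.linalg.inv(A)                # stand-in for an LU factorisation
    for _ in range(max_iter):
        g = y - y_prev - h * f(t + h, y)    # residual of the implicit step
        dy = -A_inv @ g
        y += dy
        if np.linalg.norm(dy) < tol:
            break
    return y

# Stiff test problem y' = -1000*(y - cos(t)), y(0) = 0.
f = lambda t, y: -1000.0 * (y - np.cos(t))
J = lambda t, y: np.array([[-1000.0]])
y = np.array([0.0])
for k in range(100):
    y = modified_newton_step(f, J, y, 0.01 * k, 0.01)
```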

Keywords: Newton method, two point, block, accuracy

Procedia PDF Downloads 332
1247 Investigation of Building Pounding during Earthquake and Calculation of Impact Force between Two Adjacent Structures

Authors: H. Naderpour, R. C. Barros, S. M. Khatami

Abstract:

Seismic excitation naturally causes large horizontal relative displacements, which can lead to collisions between two adjacent buildings when the separation distance is insufficient; the resulting impacts cause severe damage, especially in tall buildings. In this paper, an impact is numerically simulated and two key parameters are calculated: the impact force and the energy absorption. To calculate these parameters, the mathematical model uses an idealized link element, assumed to consist of a spring and a dashpot, to determine the lateral displacement and the damping ratio of the impact. For the determination of the dynamic response of the impact, a new equation of motion is proposed to evaluate the impact force and the energy dissipation. To confirm the proposed equation, a series of parametric studies are performed, and the accuracy of the formula is confirmed.
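
A minimal sketch of the generic spring-dashpot (Kelvin-Voigt) pounding element follows; this is the textbook element, not the paper's proposed equation of motion, and all structural and impact parameters are illustrative placeholders:

```python
import numpy as np

# Two SDOF buildings with gap 'gap'; on contact, the interface force is
# F = ki*pen + ci*rate, and the dashpot term is what dissipates energy.
m1, m2 = 2.0e5, 3.0e5            # storey masses [kg] (placeholders)
k1, k2 = 4.0e7, 9.0e7            # lateral stiffnesses [N/m]
ki, ci = 2.0e8, 1.0e5            # impact spring / dashpot
gap = 0.02                       # separation distance [m]
dt, n = 1.0e-4, 20000

x = np.zeros((n, 2)); v = np.zeros((n, 2))
e_diss = 0.0
for i in range(n - 1):
    ag = 2.0 * np.sin(2 * np.pi * 2.0 * i * dt)    # toy ground acceleration
    pen = x[i, 0] - x[i, 1] - gap                  # approach beyond the gap
    rate = v[i, 0] - v[i, 1]
    f = ki * pen + ci * rate if pen > 0 else 0.0   # contact force, never
    f = max(f, 0.0)                                # tensile
    a1 = (-k1 * x[i, 0] - f) / m1 - ag
    a2 = (-k2 * x[i, 1] + f) / m2 - ag
    v[i + 1] = v[i] + dt * np.array([a1, a2])      # symplectic Euler
    x[i + 1] = x[i] + dt * v[i + 1]
    if pen > 0:
        e_diss += ci * rate ** 2 * dt              # dashpot dissipation
print(f"dissipated energy ~ {e_diss:.1f} J")
```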

Keywords: pounding, impact, dissipated energy, coefficient of restitution

Procedia PDF Downloads 337
1246 Heuristic to Generate Random X-Monotone Polygons

Authors: Kamaljit Pati, Manas Kumar Mohanty, Sanjib Sadhu

Abstract:

A heuristic has been designed to generate a random simple monotone polygon from a given set of 'n' points lying in a 2-dimensional plane. Our heuristic generates a random monotone polygon in O(n) time after O(n log n) preprocessing time, which improves over previous work in which a random monotone polygon is produced in the same O(n) time but with O(k) preprocessing time for n < k < n². However, our heuristic does not generate all possible random polygons with uniform probability. The space complexity of the proposed heuristic is O(n).
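
For orientation, the sketch below shows the classic sort-based construction of a simple x-monotone polygon: split the sorted points by the line through the x-extremes into an upper and a lower chain, so each chain lies in one closed half-plane and the polygon cannot self-intersect. It illustrates the object being generated, not the authors' O(n)-after-preprocessing heuristic:

```python
import random

def random_x_monotone_polygon(points):
    """Build a simple x-monotone polygon from points with distinct x.
    The O(n log n) sort dominates; points exactly on the splitting line
    are assigned to a random chain, one source of randomness."""
    pts = sorted(points)                      # O(n log n) preprocessing
    (x0, y0), (x1, y1) = pts[0], pts[-1]
    upper, lower = [pts[0]], [pts[0]]
    for p in pts[1:-1]:
        # Signed area tells which side of the extreme-to-extreme line p is on.
        side = (x1 - x0) * (p[1] - y0) - (y1 - y0) * (p[0] - x0)
        if side > 0 or (side == 0 and random.random() < 0.5):
            upper.append(p)
        else:
            lower.append(p)
    upper.append(pts[-1])
    # Walk the upper chain left to right, then the lower chain right to left.
    return upper + lower[::-1][:-1]

poly = random_x_monotone_polygon([(random.random(), random.random())
                                  for _ in range(20)])
```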

Keywords: sorting, monotone polygon, visibility, chain

Procedia PDF Downloads 407
1245 Genotypic and Allelic Distribution of Polymorphic Variants of Gene SLC47A1 Leu125Phe (rs77474263) and Gly64Asp (rs77630697) and Their Association to the Clinical Response to Metformin in Adult Pakistani T2DM Patients

Authors: Sadaf Moeez, Madiha Khalid, Zoya Khalid, Sania Shaheen, Sumbul Khalid

Abstract:

Background: Inter-individual variation in response to metformin, which is considered a first-line therapy for T2DM treatment, is considerable. The current study aimed to investigate the impact of two genetic variants, Leu125Phe (rs77474263) and Gly64Asp (rs77630697), in gene SLC47A1 on the clinical efficacy of metformin in Pakistani T2DM patients. Methods: The study included 800 T2DM patients (400 metformin responders and 400 metformin non-responders) along with 400 ethnically matched healthy individuals. The genotypes were determined by allele-specific polymerase chain reaction. In-silico analysis was done to confirm the effect of the two SNPs on the structure of the genes. Association was statistically determined using SPSS software. Results: The minor allele frequencies for rs77474263 and rs77630697 were 0.13 and 0.12. For SLC47A1 rs77474263, carriers of one mutant allele 'T' (CT) were fewer among metformin responders than metformin non-responders (29.2% vs. 35.5%). Likewise, the efficacy was further reduced (7.2% vs. 4.0%) in homozygotes for the 'T' allele (TT). Remarkably, T2DM cases with two copies of allele 'C' (CC) were 2.11 times more likely to respond to metformin monotherapy. For SLC47A1 rs77630697, carriers of one mutant allele 'A' (GA) were fewer among metformin responders than metformin non-responders (33.5% vs. 43.0%). Likewise, the efficacy was further reduced (8.5% vs. 4.5%) in homozygotes for the 'A' allele (AA). Remarkably, T2DM cases with two copies of allele 'G' (GG) were 2.41 times more likely to respond to metformin monotherapy. In-silico analysis revealed that these two variants affect the structure and stability of their corresponding proteins. Conclusion: The present data suggest that the SLC47A1 Leu125Phe (rs77474263) and Gly64Asp (rs77630697) polymorphisms are associated with the therapeutic response to metformin in T2DM patients of Pakistan.
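
The reported effect sizes (2.11 and 2.41) are of the odds-ratio kind; a minimal sketch of such a calculation follows, with hypothetical genotype counts chosen only to show the arithmetic, not the study's tables:

```python
import math

# Hypothetical 2x2 counts: GG genotype vs. A-allele carriers,
# responders vs. non-responders (not the paper's data).
gg_resp, a_resp = 264, 136
gg_nonr, a_nonr = 178, 222

odds_ratio = (gg_resp * a_nonr) / (gg_nonr * a_resp)
se_log_or = math.sqrt(1/gg_resp + 1/gg_nonr + 1/a_resp + 1/a_nonr)
ci = (math.exp(math.log(odds_ratio) - 1.96 * se_log_or),
      math.exp(math.log(odds_ratio) + 1.96 * se_log_or))
print(f"OR = {odds_ratio:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```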

Keywords: diabetes, T2DM, SLC47A1, Pakistan, polymorphism

Procedia PDF Downloads 134
1244 Evolving Credit Scoring Models using Genetic Programming and Language Integrated Query Expression Trees

Authors: Alexandru-Ion Marinescu

Abstract:

There exists a plethora of methods in the scientific literature which tackle the well-established task of credit score evaluation. In its most abstract form, a credit scoring algorithm takes as input several credit applicant properties, such as age, marital status, employment status, loan duration, etc., and must output a binary response variable (i.e. “GOOD” or “BAD”) stating whether the client is susceptible to payment return delays. Data imbalance is a common occurrence among financial institution databases, with the majority being classified as “GOOD” clients (clients that respect the loan return calendar) alongside a small percentage of “BAD” clients. But it is the “BAD” clients we are interested in, since accurately predicting their behavior is crucial in preventing unwanted loss for loan providers. We add to this whole context the constraint that the algorithm must yield an actual, tractable mathematical formula, which is friendlier towards financial analysts. To this end, we have turned to genetic algorithms and genetic programming, aiming to evolve actual mathematical expressions using specially tailored mutation and crossover operators. As far as data representation is concerned, we employ a very flexible mechanism, LINQ expression trees, readily available in the C# programming language, enabling us to construct executable pieces of code at runtime. As the title implies, they model trees, with intermediate nodes being operators (addition, subtraction, multiplication, division) or mathematical functions (sin, cos, abs, round, etc.) and leaf nodes storing either constants or variables. There is a one-to-one correspondence between the client properties and the formula variables. The mutation and crossover operators work on a flattened version of the tree, obtained via a pre-order traversal. A consequence of our chosen technique is that we can identify and discard client properties which do not take part in the final score evaluation, effectively acting as a dimensionality reduction scheme. We compare ourselves with state-of-the-art approaches, such as support vector machines, Bayesian networks, and extreme learning machines, to name a few. The data sets we benchmark against amount to a total of eight, of which we mention the well-known Australian credit and German credit data sets, and the performance indicators are the following: percentage correctly classified, area under curve, partial Gini index, H-measure, Brier score and Kolmogorov-Smirnov statistic. Finally, we obtain encouraging results, which, although placing us in the lower half of the hierarchy, drive us to further refine the algorithm.
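
A minimal Python analogue of the expression-tree machinery described above (the paper uses C# LINQ expression trees): formulas are nested tuples, evaluation is recursive, and crossover swaps random subtrees located via the pre-order flattening; mutation is omitted for brevity:

```python
import random

# Trees are nested tuples: ('op', left, right), ('var', i) or ('const', c).
OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
       '*': lambda a, b: a * b,
       '/': lambda a, b: a / b if abs(b) > 1e-9 else 1.0}  # protected divide

def evaluate(node, x):
    kind = node[0]
    if kind == 'var':
        return x[node[1]]
    if kind == 'const':
        return node[1]
    return OPS[kind](evaluate(node[1], x), evaluate(node[2], x))

def flatten(node, out):
    """Pre-order traversal, the flattened form the operators work on."""
    out.append(node)
    if node[0] in OPS:
        flatten(node[1], out)
        flatten(node[2], out)
    return out

def crossover(t1, t2):
    """Replace a random subtree of t1 with a random subtree of t2."""
    donor = random.choice(flatten(t2, []))
    target = random.choice(flatten(t1, []))
    def replace(node):
        if node is target:
            return donor
        if node[0] in OPS:
            return (node[0], replace(node[1]), replace(node[2]))
        return node
    return replace(t1)

tree = ('+', ('*', ('var', 0), ('const', 0.7)), ('var', 1))
score = evaluate(tree, [35.0, 1.0])   # e.g. age, employment flag
```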

Keywords: expression trees, financial credit scoring, genetic algorithm, genetic programming, symbolic evolution

Procedia PDF Downloads 99
1243 Semiconductor Variable Wavelength Generator of Near-Infrared-to-Terahertz Regions

Authors: Isao Tomita

Abstract:

Power characteristics are obtained for laser beams of near-infrared and terahertz wavelengths produced by difference-frequency generation with a quasi-phase-matched (QPM) waveguide made of gallium phosphide (GaP). The refractive-index change of the QPM GaP waveguide is included in the computations through Sellmeier's formula for varying input wavelengths, and optical loss is also included. Although the output power decreases with decreasing photon energy as the beam wavelength changes from the near-infrared to the terahertz range, generation across such widely separated wavelengths, which is not achievable with an ordinary laser diode without replacing the semiconductor with one of a different bandgap, can be accomplished with the same semiconductor (GaP) by changing the QPM period; a way of changing the period is provided.
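
As a sketch of the wavelength dependence entering such a design, the snippet below evaluates a generic one-term Sellmeier form and the corresponding first-order QPM period for difference-frequency generation; the Sellmeier coefficients are placeholders, not GaP's published dispersion data:

```python
import numpy as np

def n_sellmeier(lam_um, B=10.0, C=0.09):
    """One-term Sellmeier form n^2 = 1 + B*lam^2/(lam^2 - C), lam in um.
    B and C are placeholder coefficients; substitute measured values."""
    lam2 = lam_um ** 2
    return np.sqrt(1.0 + B * lam2 / (lam2 - C))

def qpm_period_um(lam1, lam2):
    """QPM period Lambda = 2*pi/|dk| for DFG with 1/lam3 = 1/lam1 - 1/lam2,
    where dk = 2*pi*(n1/lam1 - n2/lam2 - n3/lam3)."""
    lam3 = 1.0 / (1.0 / lam1 - 1.0 / lam2)
    dk = 2 * np.pi * (n_sellmeier(lam1) / lam1
                      - n_sellmeier(lam2) / lam2
                      - n_sellmeier(lam3) / lam3)
    return 2 * np.pi / abs(dk)

print(qpm_period_um(1.03, 1.06))   # near-infrared pump/signal pair
```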

Keywords: difference-frequency generation, gallium phosphide, quasi-phase-matching, waveguide

Procedia PDF Downloads 94
1242 Dissociation of CDS from CVA Valuation Under Notation Changes

Authors: R. Henry, J-B. Paulin, St. Fauchille, Ph. Delord, K. Benkirane, A. Brunel

Abstract:

In this paper, the CVA computation of an interest rate swap is presented based on its rating. Ratings and default probabilities given by Moody's Investors Service are used to calculate the CVA for a specific swap with different maturities. This computation shows the influence of rating variation on CVA. The approach is applied to the analysis of Greek CDS variation during the Greek crisis between 2008 and 2011. The main point is the determination of the correlation between the fluctuation of the Greek CDS cumulative value and the variation of the swap CVA due to the change of rating.
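
A minimal sketch of the standard unilateral CVA sum that such a computation rests on, with illustrative exposure, discount, and hazard inputs (in the paper the default probabilities come from Moody's rating data for each maturity):

```python
import numpy as np

# CVA = (1 - R) * sum_i EE(t_i) * DF(t_i) * PD(t_{i-1}, t_i)
recovery = 0.4
times = np.arange(0.5, 5.5, 0.5)             # semi-annual grid, 5y swap
ee = 1e6 * np.maximum(0.0, 1.0 - times / 5)  # toy expected exposure profile
df = np.exp(-0.02 * times)                   # flat 2% discounting
h = 0.03                                     # flat hazard rate (placeholder)
surv = np.exp(-h * np.concatenate(([0.0], times)))
pd_marginal = surv[:-1] - surv[1:]           # P(default in (t_{i-1}, t_i])

cva = (1 - recovery) * np.sum(ee * df * pd_marginal)
print(f"CVA = {cva:,.0f}")
```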

Keywords: CDS, computation, CVA, Greek crisis, interest rate swap, maturity, rating, swap

Procedia PDF Downloads 284
1241 Quantum Teleportation Using W-BELL and Bell-GHZ Channels

Authors: Abhinav Pandey

Abstract:

Teleportation is the transfer of quantum information between two particles without their being in physical contact with each other. It is a well-established concept in quantum computation and theoretical physics. Using an entangled pair, teleportation can be achieved with success for up to 100% of the possible measurement outcomes. We introduce a 5-qubit general entanglement system using W-Bell and Bell-GHZ channel pairs and show its usefulness in teleportation. In this paper, we use these channels to achieve probabilistic teleportation through channels conventionally regarded as non-teleporting, which has not been achieved before. We compare the two channels and determine which is better in terms of the probabilistic results of teleporting single qubits over W-Bell and Bell-GHZ channels.

Keywords: entanglement, teleportation, no cloning theorem, quantum mechanics, probability

Procedia PDF Downloads 24
1240 New Formula for Revenue Recognition Likely to Change the Prescription for Pharma Industry

Authors: Shruti Hajirnis

Abstract:

In May 2014, the FASB issued Accounting Standards Update (ASU) 2014-09, Revenue from Contracts with Customers (Topic 606), and the International Accounting Standards Board (IASB) issued International Financial Reporting Standard (IFRS) 15, Revenue from Contracts with Customers, which will supersede virtually all revenue recognition requirements in IFRS and US GAAP. The FASB and the IASB have essentially achieved convergence with these standards, with only minor differences such as the collectability threshold, interim disclosure requirements, early application and effective date, impairment loss reversal, and nonpublic entity requirements. This paper discusses the impact of the five-step model prescribed in the new revenue standard on entities operating in the pharma industry. It also outlines the considerations for these entities while implementing the new standard.

Keywords: revenue recognition, pharma industry, standard, requirements

Procedia PDF Downloads 423
1239 Evaluation of a Piecewise Linear Mixed-Effects Model in the Analysis of Randomized Cross-over Trial

Authors: Moses Mwangi, Geert Verbeke, Geert Molenberghs

Abstract:

Cross-over designs are commonly used in randomized clinical trials to estimate the efficacy of a new treatment with respect to a reference treatment (placebo or standard). The main advantage of the cross-over design over the conventional parallel design is its flexibility: every subject becomes its own control, thereby reducing confounding effects. Jones & Kenward discuss in detail the more recent developments in the analysis of cross-over trials. We revisit the simple piecewise linear mixed-effects model proposed by Mwangi et al. (in press) for its first application in the analysis of cross-over trials. We compared the performance of the proposed piecewise linear mixed-effects model with two commonly cited statistical models, namely (1) the Grizzle model and (2) the Jones & Kenward model, in estimating the treatment effect in the analysis of a randomized cross-over trial. We estimated two performance measures (mean square error (MSE) and coverage probability) for the three methods, using data simulated from the proposed piecewise linear mixed-effects model. The piecewise linear mixed-effects model yielded the lowest MSE estimates compared to the Grizzle and Jones & Kenward models for both small (Nobs=20) and large (Nobs=600) sample sizes. Its coverage probability was the highest compared to the Grizzle and Jones & Kenward models for both small and large sample sizes. A piecewise linear mixed-effects model is a better estimator of the treatment effect than its two competitors (the Grizzle and Jones & Kenward models) in the analysis of cross-over trials. The data-generating mechanism used in this paper captures two time periods for a simple 2-treatment x 2-period cross-over design. Its application is extendible to more complex cross-over designs with multiple treatments and periods. In addition, it is important to note that, even for single-response models, adding more random effects increases the complexity of the model and thus may make it difficult or impossible to fit in some cases.
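
The two performance measures are straightforward to compute from simulation output; a minimal sketch with hypothetical per-replicate estimates and standard errors standing in for the three models' fits:

```python
import numpy as np

rng = np.random.default_rng(1)
true_effect = 1.0

# Hypothetical simulation output: 1000 replicate treatment-effect
# estimates and their standard errors (placeholders, not the paper's fits).
est = true_effect + rng.normal(0.0, 0.3, size=1000)
se = np.full(1000, 0.3)

mse = np.mean((est - true_effect) ** 2)
lo, hi = est - 1.96 * se, est + 1.96 * se
coverage = np.mean((lo <= true_effect) & (true_effect <= hi))
print(f"MSE = {mse:.4f}, coverage = {coverage:.3f}")   # ~0.09 and ~0.95
```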

Keywords: evaluation, Grizzle model, Jones & Kenward model, performance measures, simulation

Procedia PDF Downloads 102
1238 Characterization of Probability Distributions through Conditional Expectation of Pair of Generalized Order Statistics

Authors: Zubdahe Noor, Haseeb Athar

Abstract:

In this article, a relation for conditional expectation is first developed and then used to characterize the general class of distributions F(x) = 1 - e^(-a h(x)) through the conditional expectation of differences of pairs of generalized order statistics. Some results are deduced for particular cases. At the end, a table lists the distributions that are compatible with the given general class.

Keywords: generalized order statistics, order statistics, record values, conditional expectation, characterization

Procedia PDF Downloads 443
1237 Fire Safety Assessment of At-Risk Groups

Authors: Naser Kazemi Eilaki, Carolyn Ahmer, Ilona Heldal, Bjarne Christian Hagen

Abstract:

Older people and people with disabilities are recognized as at-risk groups when it comes to egress and travel from the hazard zone to safe places. A disability can negatively influence one's escape time, and this becomes even more important when people from this target group live alone. This research deals with the fire safety of such people's buildings by means of probabilistic methods. For this purpose, fire safety is addressed by modeling the egress of the target group from a hazardous zone to a safe zone. A common type of detached house with a prevalent plan has been chosen for the safety analysis, and a limit state function has been developed according to the timeline evacuation model, which is based on a two-zone smoke development model. An analytical computer model (B-Risk) is used to simulate smoke development. Since most of the parameters involved in the fire development model carry uncertainty, an appropriate probability distribution function has been considered for each of the non-deterministic variables. To assess the safety and reliability of the at-risk groups, the fire safety index method has been chosen to define the probability of failure (casualties) and the safety index (beta index). An improved harmony search meta-heuristic optimization algorithm has been used to compute the beta index. A sensitivity analysis has been done to identify the most important and effective parameters for the fire safety of the at-risk group. Results showed that the area of openings and the distances to egress exits are the more important building parameters, and the safety of occupants improves with increasing dimensions of the occupant space (building). Fire growth is more critical than the other parameters in a home without a detector and fire extinguishing system, but in a home equipped with these facilities it is less important. The type of disability has a great effect on the safety level of people living in the same home layout, and people with visual impairment face a higher risk of being trapped than those with movement disabilities.
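
A minimal sketch of the failure-probability and beta-index step, with placeholder ASET/RSET distributions standing in for the B-Risk smoke-model outputs and the disability-dependent egress times:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 200_000

# Failure when the required egress time (RSET) exceeds the available
# safe egress time (ASET). Both distributions are illustrative only.
aset = rng.lognormal(mean=np.log(300), sigma=0.25, size=n)   # seconds
rset = rng.lognormal(mean=np.log(180), sigma=0.40, size=n)

p_fail = np.mean(rset > aset)
beta = -norm.ppf(p_fail)          # safety (reliability) index
print(f"Pf = {p_fail:.4f}, beta = {beta:.2f}")
```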

Keywords: fire safety, at-risk groups, zone model, egress time, uncertainty

Procedia PDF Downloads 82
1236 Rapid Green Synthesis and Characterization of Silver Nanoparticles Using Eclipta prostrata Leaf Extract

Authors: Siva Prasad Peddi

Abstract:

Silver nanoparticles were successfully synthesized from silver nitrate through a rapid green synthesis method using Eclipta prostrata leaf extract as a combined reducing and stabilizing agent. The experimental procedure was readily conducted at room temperature and pressure and can easily be scaled up. The silver nanoparticles thus obtained were characterized using UV-visible spectroscopy (UV-VIS), which yielded an absorption peak at 416 nm. The biomolecules responsible for capping the bio-reduced silver nanoparticles were identified through FTIR analysis. Scanning electron microscopy (SEM) and X-ray diffraction (XRD) analysis showed that the silver nanoparticles were crystalline in nature and spherical in shape. The average particle size obtained using Scherrer's formula was 27.4 nm. The adopted technique for silver nanoparticle synthesis is suitable for large-scale production.
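
For reference, the Scherrer estimate is a one-line calculation; the peak position and width below are placeholders, not the paper's diffractogram, with the Cu K-alpha wavelength assumed:

```python
import math

def scherrer_size_nm(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, K=0.9):
    """Scherrer estimate D = K*lambda / (beta * cos(theta)), where beta
    is the peak FWHM in radians and theta half the diffraction angle."""
    theta = math.radians(two_theta_deg / 2.0)
    beta = math.radians(fwhm_deg)
    return K * wavelength_nm / (beta * math.cos(theta))

# Hypothetical Ag (111) peak near 38 degrees 2-theta:
print(f"D = {scherrer_size_nm(38.1, 0.32):.1f} nm")
```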

Keywords: silver nanoparticles, green synthesis, characterization, Eclipta prostrata

Procedia PDF Downloads 447
1235 Characteristic Function in Estimation of Probability Distribution Moments

Authors: Vladimir S. Timofeev

Abstract:

In this article, the problem of estimating distributional moments is considered. A new approach to moment estimation based on the characteristic function is proposed. Using a statistical simulation technique, the author shows that the new approach has certain robustness properties. The derivatives of the characteristic function are computed by numerical differentiation. The results obtained confirm that the proposed idea is workable and can be recommended for statistical applications.
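
A minimal sketch of the idea: estimate the empirical characteristic function and read moments off its numerical derivatives at zero, using φ'(0) = iE[X] and φ''(0) = -E[X²]:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_t(df=5, size=10_000)   # heavy-tailed sample

def ecf(t):
    """Empirical characteristic function phi(t) = mean(exp(i*t*X))."""
    return np.mean(np.exp(1j * t * x))

h = 1e-3
d1 = (ecf(h) - ecf(-h)) / (2 * h)            # central first difference
d2 = (ecf(h) - 2 * ecf(0.0) + ecf(-h)) / h ** 2  # central second difference
mean_est = d1.imag                           # since phi'(0) = i*E[X]
second_moment_est = -d2.real                 # since phi''(0) = -E[X^2]
print(mean_est, second_moment_est)           # ~0 and ~5/3 for t(5)
```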

Keywords: characteristic function, distributional moments, robustness, outlier, statistical estimation problem, statistical simulation

Procedia PDF Downloads 484
1234 Organic Paddy Production as a Coping Strategy to the Adverse Impact of Climate Change

Authors: Thapa M., J.P. Dutta, K.R. Pandey, R.R. Kattel

Abstract:

Nepal is extremely vulnerable to the impact of climate change. To mitigate the effects of climate change on agricultural production and productivity, a range of adaptive strategies needs to be considered. This study was conducted to assess organic paddy production as a coping strategy against the adverse impact of climate change in Phulbari VDC of Chitwan district. Altogether, 120 respondents (60 adopters of organic farming and 60 non-adopters) were selected using the snowball sampling technique. A pre-tested interview schedule, direct observation, focus group discussions, key informant interviews, and secondary data were used to collect the required information. Factors determining the adoption of organic farming were found to be age, years of schooling, training, frequency of extension contact, perception of climate change, number of economically active members, and poverty. A unit increase in these factors, except poverty, would increase the probability of adoption by 4.1%, 7.5%, 7.8%, 43.1%, 41.8% and 7%, respectively; for the poor, however, the probability of adopting organic farming decreases by 5.1%. The average organic matter content in the adopters' fields was higher (2.7%) than in the non-adopters' fields (2.5%). The regression results showed that type of farmer, price, and area under rice cultivation had a positive and significant relationship with income, whereas the dependency ratio had a negative relationship. As the years since adoption of organic farming increase, rice production declines in the first two years and thereafter keeps increasing, while the cost of production keeps decreasing with years since adoption. The respondents adapted to the changing climate through diversification of crops, use of resistant varieties, and sound cropping patterns. Growing consumer awareness about health and preference for quality food products are the strong points of organic farming, whereas the lack of bio-fertilizers, the lack of effective extension services, and the absence of price differentiation between organic and inorganic products are the weak points. There is a need for more training and education to change the attitude of farmers and enhance their confidence in the role of organic farming in coping with climate change impacts.

Keywords: organic farming, climate change, sustainable development

Procedia PDF Downloads 438
1233 Risk Factors Affecting Construction Project Cost in Oman

Authors: Omar Amoudi, Latifa Al Brashdi

Abstract:

Construction projects are always subject to risks and uncertainties due to their unique and dynamic nature, the outdoor work environment, the wide range of skills employed, and the various parties involved, in addition to the state of the construction business environment at large. Altogether, these risks and uncertainties affect project objectives and lead to cost overruns, delay, and poor quality. Construction projects in Oman often experience cost overruns and delay. Managing these risks and reducing their impacts on construction cost requires first identifying these risks and then analyzing their severity on project cost to obtain a deep understanding of them; this, in turn, assists construction managers in managing and tackling these risks. This paper aims to investigate the main risk factors that affect construction project cost in the Sultanate of Oman. To achieve this aim, a literature review was carried out, from which thirty-three risk factors affecting construction cost were identified. A questionnaire survey was then designed and distributed among construction professionals (client, contractor and consultant) to obtain their opinion on the probability of occurrence of each risk factor and its possible impact on construction project cost. The collected data were analyzed qualitatively and in several ways. The severity of each risk factor was obtained by multiplying its probability of occurrence by its impact. The findings reveal that the most significant risk factors, with high severity impact on construction project cost, are: change of oil price; delay of materials and equipment delivery; changes in laws and regulations; improper budgeting and contingencies; lack of skilled workforce and personnel; delays caused by the contractor; delays of owner payments; delays caused by the client; and funding risk. The results can be used as a basis for construction managers to make informed decisions and to produce risk response procedures and strategies to tackle these risks and reduce their negative impacts on construction project cost.
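
The severity computation itself is elementary; a sketch with illustrative scores follows (the factor names come from the abstract, the numbers do not come from the survey):

```python
# severity = probability x impact, both on the survey's ordinal scales.
factors = {
    "Change of oil price":             (4.2, 4.5),   # (probability, impact)
    "Delay of materials/equipment":    (4.0, 4.1),
    "Changes in laws and regulations": (3.6, 4.0),
    "Lack of skilled workforce":       (3.8, 3.7),
}
severity = {name: p * i for name, (p, i) in factors.items()}
for name, s in sorted(severity.items(), key=lambda kv: -kv[1]):
    print(f"{s:5.1f}  {name}")
```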

Keywords: construction cost, construction projects, Oman, risk factors, risk management

Procedia PDF Downloads 314
1232 Factorization of Computations in Bayesian Networks: Interpretation of Factors

Authors: Linda Smail, Zineb Azouz

Abstract:

Given a Bayesian network over a set I of discrete random variables, we are interested in computing the probability distribution P(S), where S is a subset of I. The general idea is to write the expression of P(S) as a product of factors, where each factor is easy to compute. More importantly, it is very useful to give an interpretation of each factor in terms of conditional probabilities. This paper considers a semantic interpretation of the factors involved in computing marginal probabilities in Bayesian networks. Establishing such semantic interpretations is indeed interesting and relevant in the case of large Bayesian networks.
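
For a chain-structured toy network A → B → C, the factorization reads P(C) = Σ_{a,b} P(a)P(b|a)P(c|b), and each contraction is itself an interpretable factor; a minimal sketch:

```python
import numpy as np

p_a = np.array([0.6, 0.4])             # P(A)
p_b_a = np.array([[0.7, 0.3],          # P(B|A), rows indexed by A
                  [0.2, 0.8]])
p_c_b = np.array([[0.9, 0.1],          # P(C|B), rows indexed by B
                  [0.4, 0.6]])

# Each einsum contraction is an interpretable factor (a marginal here):
p_b = np.einsum('a,ab->b', p_a, p_b_a)     # factor 1: P(B)
p_c = np.einsum('b,bc->c', p_b, p_c_b)     # factor 2: P(C)
print(p_c, p_c.sum())                      # [0.65, 0.35], sums to 1
```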

Keywords: Bayesian networks, D-Separation, level two Bayesian networks, factorization of computation

Procedia PDF Downloads 503
1231 Groundwater Recharge Suitability Mapping Using Analytical Hierarchy Process Based-Approach

Authors: Aziza Barrek, Mohamed Haythem Msaddek, Ismail Chenini

Abstract:

Excessive groundwater pumping due to increasing water demand, especially in the agricultural sector, causes groundwater scarcity. Groundwater recharge is the most important process contributing to the durability of the resource. This paper applies the Analytic Hierarchy Process (AHP) multicriteria analysis to establish a groundwater recharge suitability map. To delineate aquifer suitability for groundwater recharge, eight parameters were used: soil type, land cover, drainage density, lithology, NDVI, slope, transmissivity, and rainfall. The impact of each factor was weighted. The method was applied to the shallow aquifer of the El Fahs plain. Results suggest that 37% of the aquifer area has good or very good recharge suitability. The results were validated with the Receiver Operating Characteristic curve; the prediction accuracy obtained was 89.3%.
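
A minimal sketch of the AHP weighting step: the weights are the normalised principal eigenvector of a pairwise-comparison matrix, checked by the consistency ratio. The 4x4 matrix below is hypothetical; the paper uses eight criteria with its own expert judgments:

```python
import numpy as np

# Hypothetical pairwise comparisons for (rainfall, lithology, slope,
# land cover) on Saaty's 1-9 scale.
A = np.array([[1,   3,   5,   4],
              [1/3, 1,   3,   2],
              [1/5, 1/3, 1,   1/2],
              [1/4, 1/2, 2,   1]], dtype=float)

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                      # criterion weights

# Consistency check: CI = (lambda_max - n)/(n - 1); RI = 0.90 for n = 4.
n = A.shape[0]
ci = (vals[k].real - n) / (n - 1)
print(w, ci / 0.90)               # acceptable if CR < 0.1
```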

Keywords: AHP, El Fahs aquifer, empirical formula, groundwater recharge zone, remote sensing, semi-arid region

Procedia PDF Downloads 96
1230 Generating 3D Anisotropic Centroidal Voronoi Tessellations

Authors: Alexandre Marin, Alexandra Bac, Laurent Astart

Abstract:

New numerical methods for PDE resolution, such as Finite Volumes (FV) or the Virtual Elements Method (VEM), create new needs in terms of meshing the domains of interest, and polyhedral meshes in particular have many advantages. One way to build such meshes is to construct Restricted Voronoi Diagrams (RVDs) whose boundaries respect the domain of interest. By minimizing a function defined for RVDs, the shapes of cells can be controlled, e.g., elongated according to user-defined directions or adjusted to comply with given aspect ratios (anisotropy) and density variations. In this paper, our contribution is threefold: first, we introduce a new gradient formula for the Voronoi tessellation energy under a continuous anisotropy field; second, we describe a meshing algorithm based on the optimisation of this function, which we validate against state-of-the-art approaches; finally, we propose a hierarchical approach to speed up our meshing algorithm.
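
For orientation, the isotropic special case can be approximated by plain Lloyd iterations on a dense sample of the domain, a k-means-style stand-in for restricted Voronoi centroids; the paper's anisotropic energy and new gradient formula are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(4)

sample = rng.random((20_000, 2))   # dense sample of the unit square
gens = rng.random((64, 2))         # initial generators

for _ in range(50):
    # Assign each sample point to its nearest generator (its Voronoi cell).
    d2 = ((sample[:, None, :] - gens[None, :, :]) ** 2).sum(-1)
    cell = d2.argmin(axis=1)
    # Move each generator to the centroid of its cell (Lloyd step);
    # at the fixed point, the tessellation is centroidal.
    for j in range(len(gens)):
        pts = sample[cell == j]
        if len(pts):
            gens[j] = pts.mean(axis=0)
```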

Keywords: anisotropic Voronoi diagrams, meshes for numerical simulations, optimisation, volumic polyhedral meshing

Procedia PDF Downloads 75
1229 Radiation Dosimetry Using Sintered Pellets of Yellow Beryl (Heliodor) Crystals

Authors: Lucas Sátiro Do Carmo, Betzabel Noemi Silva Carrera, Shigueo Watanabe, J. F. D. Chubaci

Abstract:

Beryl is a silicate with chemical formula Be₃Al₂(SiO₃)₆, commonly found in Brazil. It has a few colored variations used as jewelry, such as aquamarine (blueish), emerald (green) and heliodor (yellow). The color of each variation depends on the dopant naturally present in the crystal lattice. In this work, heliodor pellets of 5 mm diameter and 1 mm thickness were produced and investigated using thermoluminescence (TL) to evaluate their potential for use as gamma-ray dosimeters. The results show that the pellets exhibit a prominent TL peak at 205 °C that grows linearly with dose when irradiated from 1 Gy to 1000 Gy. A comparison was made between powdered and sintered dosimeters: the sintered pellets have higher sensitivity than the powder dosimeter. The TL response of this mineral is satisfactory for radiation dosimetry applications in the studied dose range.

Keywords: dosimetry, beryl, gamma rays, sintered pellets, new material

Procedia PDF Downloads 74
1228 Adaptive CFAR Analysis for Non-Gaussian Distribution

Authors: Bouchemha Amel, Chachoui Takieddine, H. Maalem

Abstract:

Automatic detection of targets in modern radar systems is based primarily on the concept of the adaptive CFAR detector. To achieve effective detection, the influence of disturbances due to clutter must be minimized. The detection algorithm adapts the CFAR detection threshold, which is proportional to the average power of the clutter, maintaining a constant probability of false alarm. In this article, we analyze the performance of two variants of adaptive algorithms, CA-CFAR and OS-CFAR, and we compare the thresholds of these detectors in the marine (non-Gaussian) environment with a Weibull distribution.
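
A minimal CA-CFAR sketch follows; the threshold factor α = N(Pfa^(-1/N) - 1) used here is exact only for exponentially distributed (Gaussian-envelope) clutter power, which is precisely why the Weibull case above needs separate analysis:

```python
import numpy as np

def ca_cfar(power, n_train=16, n_guard=2, pfa=1e-4):
    """Cell-averaging CFAR: the threshold is alpha times the mean power
    of the training cells flanking the cell under test."""
    n = len(power)
    N = 2 * n_train
    alpha = N * (pfa ** (-1.0 / N) - 1.0)
    hits = np.zeros(n, dtype=bool)
    for i in range(n_train + n_guard, n - n_train - n_guard):
        lead = power[i - n_guard - n_train : i - n_guard]
        lag = power[i + n_guard + 1 : i + n_guard + 1 + n_train]
        noise = (lead.sum() + lag.sum()) / N
        hits[i] = power[i] > alpha * noise
    return hits

rng = np.random.default_rng(5)
clutter = rng.weibull(1.2, size=2000)   # non-Gaussian marine-like clutter
clutter[700] += 30.0                    # injected target
print(np.flatnonzero(ca_cfar(clutter)))
```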

Keywords: CFAR, threshold, clutter, distribution, Weibull, detection

Procedia PDF Downloads 562
1227 Analytical Modeling of Globular Protein-Ferritin in α-Helical Conformation: A White Noise Functional Approach

Authors: Vernie C. Convicto, Henry P. Aringa, Wilson I. Barredo

Abstract:

This study presents a conformational model of the helical structures of globular protein particularly ferritin in the framework of white noise path integral formulation by using Associated Legendre functions, Bessel and convolution of Bessel and trigonometric functions as modulating functions. The model incorporates chirality features of proteins and their helix-turn-helix sequence structural motif.

Keywords: globular protein, modulating function, white noise, winding probability

Procedia PDF Downloads 450
1226 Feigenbaum Universality, Chaos and Fractal Dimensions in Discrete Dynamical Systems

Authors: T. K. Dutta, K. K. Das, N. Dutta

Abstract:

This paper is primarily concerned with Ricker's population model f(x) = x e^(r(1-x/k)), where r is the control parameter and k is the carrying capacity, and some fruitful results are obtained with the following objectives: 1) determination of the bifurcation values leading to a chaotic region; 2) development of the statistical methods and analysis required for the measurement of fractal dimensions; 3) calculation of various fractal dimensions. These results also show that the invariant probability distribution on the attractor, when it exists, provides detailed information about the long-term behavior of the dynamical system. At the end, some open problems are posed for further research.
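
The Lyapunov-exponent side of such an analysis is easy to sketch for the Ricker map, using f'(x) = e^(r(1-x/k))(1 - rx/k); positive values signal the chaotic region:

```python
import numpy as np

def lyapunov_ricker(r, k=1.0, n_iter=2000, n_skip=500):
    """Average of ln|f'(x)| along an orbit of f(x) = x*exp(r*(1 - x/k))."""
    x = 0.3
    total = 0.0
    for i in range(n_iter):
        x = x * np.exp(r * (1 - x / k))
        if i >= n_skip:   # discard the transient
            total += np.log(abs(np.exp(r * (1 - x / k)) * (1 - r * x / k)))
    return total / (n_iter - n_skip)

for r in (1.5, 2.2, 2.7, 3.0):
    print(r, round(lyapunov_ricker(r), 3))   # negative, then positive
```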

Keywords: Feigenbaum universality, chaos, Lyapunov exponent, fractal dimensions

Procedia PDF Downloads 285
1225 Adaptation of Projection Profile Algorithm for Skewed Handwritten Text Line Detection

Authors: Kayode A. Olaniyi, Tola. M. Osifeko, Adeola A. Ogunleye

Abstract:

Text line segmentation is an important step in document image processing. It represents a labeling process that assigns the same label, using a distance-metric probability, to spatially aligned units. Text line detection techniques have been implemented successfully mainly for printed documents. However, processing handwritten text, especially unconstrained documents, has remained a key problem. This is because unconstrained handwritten text lines are often not uniformly skewed: the spaces between text lines may not be obvious, complicated by the nature of handwriting and by overlapping ascenders and/or descenders of some characters. Hence, text line detection and segmentation represent a leading challenge in handwritten document image processing. Text line detection methods that rely on a traditional global projection profile of the text document cannot efficiently handle variable skew angles between different text lines, so formulating a horizontal line as a separator is often not efficient. This paper presents a technique to segment a handwritten document into distinct lines of text. The proposed algorithm starts by partitioning the initial text image across its width into vertical strips of about 5% each. For each strip, the histogram of horizontal runs is projected, under the assumption that text appearing within a single strip is almost parallel. The algorithm slides a window through the first vertical strip on the left side of the page and identifies each new minimum corresponding to a valley in the projection profile. Each valley represents the starting point of an orientation line, whose ending point is the minimum point on the projection profile of the next vertical strip. The derived text lines traverse around any obstructing handwritten connected component by associating it with either the line above or the line below; the association is decided by the probability obtained from a distance metric. The technique outperforms the global projection profile for text line segmentation, and it is robust in handling skewed documents and those with lines running into each other.
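
A minimal sketch of the per-strip projection step only (strip partitioning and valley detection); the cross-strip line joining and connected-component assignment described above are omitted:

```python
import numpy as np

def strip_valleys(binary_img, n_strips=20, smooth=9):
    """Split the page into vertical strips (~5% of the width each), take
    each strip's horizontal projection profile, and return the valley
    rows that separate text lines within that strip."""
    h, w = binary_img.shape
    edges = np.linspace(0, w, n_strips + 1, dtype=int)
    kernel = np.ones(smooth) / smooth
    valleys = []
    for s in range(n_strips):
        profile = binary_img[:, edges[s]:edges[s + 1]].sum(axis=1)
        profile = np.convolve(profile, kernel, mode='same')
        # Local minima of the smoothed profile mark gaps between lines.
        is_valley = ((profile[1:-1] <= profile[:-2]) &
                     (profile[1:-1] <= profile[2:]) &
                     (profile[1:-1] < 0.1 * profile.max()))
        valleys.append(np.flatnonzero(is_valley) + 1)
    return valleys

page = (np.random.default_rng(6).random((400, 600)) > 0.95).astype(int)
print(strip_valleys(page)[0][:10])   # valley rows of the leftmost strip
```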

Keywords: connected-component, projection-profile, segmentation, text-line

Procedia PDF Downloads 100
1224 Astronomical Object Classification

Authors: Alina Muradyan, Lina Babayan, Arsen Nanyan, Gohar Galstyan, Vigen Khachatryan

Abstract:

We present a photometric method for identifying stars, galaxies and quasars in multi-color surveys, which uses a library of more than 65,000 color templates for comparison with observed objects. The method aims to extract the information content of object colors in a statistically correct way, and performs classification as well as redshift estimation for galaxies and quasars in a unified approach based on the same probability density functions. For the redshift estimation, we employ an advanced version of the Minimum Error Variance estimator, which determines the redshift error from the redshift-dependent probability density function itself. The method was originally developed for the Calar Alto Deep Imaging Survey (CADIS), but is now used in a wide variety of survey projects. We checked its performance by spectroscopy of CADIS objects, where the method provides high reliability (6 errors among 151 objects with R < 24), especially for the quasar selection, and redshifts accurate within σz ≈ 0.03 for galaxies and σz ≈ 0.1 for quasars. For an optimization of future survey efforts, a few model surveys are compared, which are designed to use the same total amount of telescope time but different sets of broad-band and medium-band filters. Their performance is investigated by Monte Carlo simulations as well as by analytic evaluation in terms of classification and redshift estimation. If photon noise were the only error source, broad-band and medium-band surveys should perform equally well, as long as they provide the same spectral coverage. In practice, medium-band surveys show superior performance due to their higher tolerance for calibration errors and cosmic variance. Finally, we discuss the relevance of color calibration and derive important conclusions for the issues of library design and choice of filters. The calibration accuracy poses strong constraints on an accurate classification, which are most critical for surveys with few, broad and deeply exposed filters, but less severe for surveys with many, narrow and less deep filters.

Keywords: VO, ArVO, DFBS, FITS, image processing, data analysis

Procedia PDF Downloads 52
1223 The Role of Public Education in Increasing Public Awareness through Mass Media with Emphasis on Newspapers and TV: Coping with Possible Earthquake in Tehran

Authors: Naser Charkhsaz, Ashraf Sadat Mousavi, Navvab Shamspour

Abstract:

This study aimed to evaluate the role of public education in increasing public awareness through mass media (with emphasis on newspapers and TV) in coping with a possible earthquake in Tehran. The study population comprised residents aged 15 to 65 living in the five regions of Tehran (North, South, East, West and Center) during the implementation of the plan. The required sample size in each region was calculated based on the Cochran formula (n = 380). A questionnaire with a reliability of 82% was used to collect the data, and a one-sample t-test to analyze them. The results showed that warnings related to the Tehran earthquake affected people in the pre-contemplation stage, while public education through mass media did not promote public awareness about prevention, preparedness and rehabilitation.
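
For reference, the Cochran calculation behind a sample of this size is as follows, with the usual z = 1.96, p = 0.5, e = 0.05 inputs assumed; the abstract's n = 380 suggests slightly different inputs or a finite-population correction:

```python
import math

def cochran_n(z=1.96, p=0.5, e=0.05):
    """Cochran's sample-size formula n0 = z^2 * p * (1 - p) / e^2."""
    return z ** 2 * p * (1 - p) / e ** 2

print(math.ceil(cochran_n()))   # 385, close to the 380 per region above
```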

Keywords: media, disaster, knowledge, Iranian Red Crescent society

Procedia PDF Downloads 297
1222 An Introduction to the Radiation-Thrust Based on Alpha Decay and Spontaneous Fission

Authors: Shiyi He, Yan Xia, Xiaoping Ouyang, Liang Chen, Zhongbing Zhang, Jinlu Ruan

Abstract:

As key systems of spacecraft, various propulsion systems have been developing rapidly, including ion thrusters, laser thrust, solar sails and other micro-thrusters. However, these systems still have shortcomings. The ion thruster requires a high voltage or magnetic field for acceleration, resulting in extra systems, greater mass and larger volume. Laser thrust is at present mostly ground-based and provides pulsed thrust, constrained by station distribution and laser capacity. The thrust direction of a solar sail is limited by its position relative to the Sun, so it is hard to propel toward the Sun or to adjust in shadow. In this paper, a novel nuclear thruster based on alpha decay and spontaneous fission is proposed, and the principle of this radiation-thrust with alpha particles is expounded. Radioactive materials with different released energies, such as 210Po (5.4 MeV) and 238Pu (5.29 MeV), attached to a metal film provide thrusts in the range 0.02-5 µN/cm². With this repulsive force, radiation is able to serve as a power source. With the advantages of low system mass, high accuracy and long active time, radiation thrust is promising in the fields of space debris removal, orbit control of nano-satellite arrays and deep space exploration. For further study, a formula relating the amplitude and direction of the thrust to the released energy and decay coefficient is set up. Using this initial formula, the alpha-emitting elements with half-lives longer than one hundred days are calculated and listed. As alpha particles are emitted continuously, the residual charge in the metal film grows and affects the energy distribution of the emitted alpha particles. The emission of alpha particles under residual charge or an external electromagnetic field behaves differently and is analyzed in this paper. Furthermore, three more complex situations are discussed: a radiation element generating alpha particles of several energies at different intensities, mixtures of various radiation elements, and cascaded alpha decay. In combination, these allow the thrust amplitude to be adjusted more efficiently and flexibly. The propulsion model for spontaneous fission is similar to that for alpha decay but has a more complex angular distribution. A new quasi-spherical space propulsion system based on the radiation thrust is introduced, together with the system for collecting and processing excess charge and reaction heat. The energy and spatial angular distributions of alpha particles emitted per unit area and by a given propulsion system are studied. Since alpha particles easily lose energy and self-absorb, the distribution is not a simple stacking of each nuclide's contribution. Exploiting the variation of the amplitude and angle of the radiation thrust, an orbital variation strategy for space debris removal is shown and optimized.
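
A back-of-envelope sketch of the thrust scale: each alpha of energy E carries momentum p = sqrt(2mE), so a film's thrust per unit area is roughly its areal emission rate times the average momentum leaving the surface. This is not the paper's formula with decay-chain and charging corrections, and the activity and efficiency below are hypothetical; with them, a 210Po film lands near the 0.02 µN/cm² end of the range quoted above:

```python
import math

def alpha_thrust_per_cm2(activity_bq_per_cm2, e_mev, efficiency=0.5):
    """Order-of-magnitude recoil thrust from an alpha-emitting film:
    (emission rate per cm^2) * (momentum per alpha) * (fraction of
    momentum usefully directed away from the surface)."""
    m_alpha = 6.644e-27                       # alpha particle mass [kg]
    e_j = e_mev * 1.602e-13                   # MeV -> J
    p = math.sqrt(2 * m_alpha * e_j)          # momentum per particle
    return activity_bq_per_cm2 * efficiency * p   # N per cm^2

# Hypothetical 210Po film (5.4 MeV alphas) at 10 Ci/cm^2:
print(alpha_thrust_per_cm2(10 * 3.7e10, 5.4))   # ~2e-8 N/cm^2
```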

Keywords: alpha decay, angular distribution, emitting energy, orbital variation, radiation-thruster

Procedia PDF Downloads 178
1221 Dielectric and Impedance Spectroscopy of Samarium and Lanthanum Doped Barium Titanate at Room Temperature

Authors: Sukhleen Bindra Narang, Dalveer Kaur, Kunal Pubby

Abstract:

Dielectric ceramic samples in the BaO-Re2O3-TiO2 ternary system were synthesized with the structural formula Ba2-xRe4+2x/3Ti8O24, where Re is a rare-earth metal (Re = Sm, La) and x varies from 0.0 to 0.6 in steps of 0.1. Polycrystalline samples were prepared by the conventional solid-state reaction technique. Dielectric, electrical and impedance analyses of all the samples were carried out in the frequency range 1 kHz-1 MHz at room temperature (25°C) to understand the electrical conduction, the dielectric relaxation, and their correlation. The dielectric response of the samples shows dielectric dispersion at lower frequencies and dielectric relaxation at higher frequencies. The AC conductivity is well fitted by the Jonscher law (σ_ac = σ_dc + Aω^n). The spectroscopic data in the impedance plane confirm the existence of a grain contribution to the relaxation. All the properties are found to be functions of frequency as well as of the amount of substitution.
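
A minimal sketch of fitting the Jonscher law to a conductivity sweep; the data here are synthetic stand-ins for the 1 kHz-1 MHz measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def jonscher(omega, sigma_dc, A, n):
    """Jonscher universal power law: sigma_ac = sigma_dc + A*omega^n."""
    return sigma_dc + A * omega ** n

# Synthetic conductivity data over the 1 kHz - 1 MHz sweep (placeholders).
omega = 2 * np.pi * np.logspace(3, 6, 40)
rng = np.random.default_rng(7)
sigma = jonscher(omega, 1e-7, 2e-11, 0.7) * (1 + 0.02 * rng.standard_normal(40))

popt, _ = curve_fit(jonscher, omega, sigma, p0=(1e-7, 1e-11, 0.8))
print(popt)   # recovered (sigma_dc, A, n)
```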

Keywords: dielectric ceramics, dielectric constant, loss tangent, AC conductivity, impedance spectroscopy

Procedia PDF Downloads 433