Search results for: random match probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3551

2591 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)

Authors: Robert Jacobsen

Abstract:

Flood inundation maps (FIMs) are an essential tool for communicating flood threat scenarios to the public as well as for floodplain governance. With an increasing demand for online raster FIMs, the FIM state of the practice (SOP) is rapidly advancing to meet the dual requirements of high resolution and high accuracy, i.e., high definition. Importantly, today's technology also enables the resolution of local (neighborhood-scale) bias errors that often occur in FIMs, even with the use of SOP two-dimensional flood modeling. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios from available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying RASP™ to prepare an HD-FIM for the August 2016 flood in Louisiana using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved with the application of RASP™ to the two kernel rasters is evaluated.
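
As a loose illustration of the raster-adjustment idea (shifting a kernel water-surface raster toward a refined scenario profile and recomputing depths against the terrain), consider the sketch below. It is a generic example with assumed array inputs and a simple along-stream interpolation, not the RASP™ procedure itself.

```python
import numpy as np

def adjust_fim(dem, kernel_wse, station, profile_station, profile_wse_old, profile_wse_new):
    """Shift a kernel water-surface raster by the profile refinement and remap depths.

    dem, kernel_wse, station : 2-D arrays (ground elevation, kernel water surface,
                               along-stream stationing assigned to each cell)
    profile_* arguments      : 1-D scenario profiles before and after refinement
    """
    # Interpolate the profile refinement (new minus old) to each cell's stationing
    delta = np.interp(station.ravel(), profile_station,
                      profile_wse_new - profile_wse_old).reshape(station.shape)
    adjusted_wse = kernel_wse + delta            # adjusted water-surface elevation
    depth = adjusted_wse - dem                   # candidate inundation depth per cell
    return np.where(depth > 0, depth, np.nan)    # cells above the water surface stay dry
```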

Keywords: hydrology, mapping, high-definition, inundation

Procedia PDF Downloads 71
2590 Risk Identification of Investment Feasibility in Indonesia’s Toll Road Infrastructure Investment

Authors: Christo Februanto Putra

Abstract:

This paper presents an identification of the risks that affect investment feasibility for toll road infrastructure in Indonesia, based on a qualitative survey of expert practitioners from investors, contractors, and state officials. A key problem with infrastructure investment in Indonesia, especially under the KPBU (public-private partnership) contract model, is that many risk factors in the investment plan are not calculated thoroughly and in detail. A risk factor is a value that summarizes the assessed risk level of an event as a function of the probability of its occurrence and the consequences that arise. The survey results show which risk factors directly affect investment feasibility and rank them by the size of their impact on the investment.
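
As a minimal illustration of the risk-factor formulation above (risk factor as a function of probability and consequence, here the common product form, followed by ranking), the sketch below uses hypothetical risk names and survey scores; the actual factors and values come from the authors' expert survey.

```python
# Hypothetical risk items and 1-5 survey scores; only the product-and-rank logic
# illustrates the abstract's definition of a risk factor.
risks = {
    "land acquisition delay":    {"probability": 4, "consequence": 5},
    "traffic volume shortfall":  {"probability": 3, "consequence": 4},
    "construction cost overrun": {"probability": 3, "consequence": 3},
}

ranked = sorted(((r["probability"] * r["consequence"], name) for name, r in risks.items()),
                reverse=True)
for score, name in ranked:
    print(f"{name}: risk factor = {score}")
```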

Keywords: risk identification, indonesia toll road, investment feasibility

Procedia PDF Downloads 272
2589 Large-Scale Photovoltaic Generation System Connected to HVDC Grid with Centralized High Voltage and High Power DC/DC Converter

Authors: Xinke Huang, Huan Wang, Lidong Guo, Changbin Ju, Runbiao Liu, Shanshan Meng, Yibo Wang, Honghua Xu

Abstract:

A large-scale photovoltaic (PV) generation system connected to an HVDC grid has many advantages over its AC-grid counterpart. DC connection can solve many of the problems that AC connection faces, such as grid connection and power transmission, and is the emerging trend. The DC/DC converter, as the most important device in the system, has recently become a research hot spot. The paper proposes a centralized DC/DC converter that uses the Boost Full-Bridge Isolated DC/DC Converter (BFBIC) topology, with modules combined through the input-parallel output-series (IPOS) method to increase power capacity and raise the output voltage to match the HVDC grid voltage. Meanwhile, an input-current-sharing control strategy is adopted to balance the module input currents and output voltages. A ±30 kV/1 MW system is modeled in MATLAB/Simulink, and a downscaled ±10 kV/200 kW DC/DC converter platform is built to verify the proposed topology and control strategy.
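
A back-of-the-envelope sketch of the IPOS combination follows. Treating the ±30 kV system as a 60 kV pole-to-pole bus, and the per-module output voltage (5 kV) and PV-side input voltage (1 kV), are assumptions chosen only to show how series outputs add up and how current sharing splits the parallel input.

```python
import math

v_grid = 60e3      # pole-to-pole bus voltage assumed for a +/-30 kV system (V)
p_total = 1e6      # total transferred power, 1 MW (W)
v_module = 5e3     # assumed output voltage of one BFBIC module (V)
v_input = 1e3      # assumed PV-side input voltage (V)

n_modules = math.ceil(v_grid / v_module)   # series-connected outputs add up to the grid voltage
i_input_total = p_total / v_input          # total current drawn from the PV side
i_per_module = i_input_total / n_modules   # input-parallel: the current-sharing target per module

print(f"modules in series: {n_modules}")
print(f"balanced input current per module: {i_per_module:.0f} A")
```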

Keywords: photovoltaic generation, cascaded DC/DC converter, galvanic isolation, high-voltage direct current (HVDC)

Procedia PDF Downloads 440
2588 Auto-Tuning of CNC Parameters According to the Machining Mode Selection

Authors: Jenq-Shyong Chen, Ben-Fong Yu

Abstract:

CNC (computer numerical control) machining centers have been widely used to machine different metal components for various industries. For a specific CNC machine, the everyday job is to cut different products with quite different attributes, such as material type, workpiece weight, geometry, tooling, and cutting conditions. Theoretically, the dynamic characteristics of the CNC machine should be properly tuned to match each machining job in order to obtain the optimal machining performance. However, most CNC machines are set with only a standard set of CNC parameters. In this study, we have developed an auto-tuning system which can automatically change the CNC parameters, and hence the machine's dynamic characteristics, according to the selection of machining modes, which are set by the mixed combination of three machine performance indexes: the HO (high surface quality) index, the HP (high precision) index, and the HS (high speed) index. The acceleration, jerk, corner error tolerance, oscillation, and dynamic bandwidth of the machine's feed axes are changed according to the selection of the machine performance indexes. The proposed auto-tuning system for the CNC parameters has been implemented on a PC-based CNC controller and a three-axis machining center. The measured experimental results show the promise of the proposed auto-tuning system.
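
A minimal sketch of the mode-to-parameter mapping idea is shown below; the parameter names and numerical values are hypothetical stand-ins, since real parameter sets are controller-specific and would be tuned experimentally as in the paper.

```python
# (HO, HP, HS) flags -> feed-axis dynamics settings (all values illustrative)
MODE_TABLE = {
    (1, 0, 0): {"accel_mps2": 2.0, "jerk_mps3": 20.0,  "corner_tol_mm": 0.002, "bandwidth_hz": 30},
    (0, 1, 0): {"accel_mps2": 3.0, "jerk_mps3": 40.0,  "corner_tol_mm": 0.001, "bandwidth_hz": 40},
    (0, 0, 1): {"accel_mps2": 6.0, "jerk_mps3": 120.0, "corner_tol_mm": 0.010, "bandwidth_hz": 60},
    (1, 1, 1): {"accel_mps2": 4.0, "jerk_mps3": 60.0,  "corner_tol_mm": 0.003, "bandwidth_hz": 45},
}

def auto_tune(ho: int, hp: int, hs: int) -> dict:
    """Return the CNC parameter set for the selected machining-mode combination."""
    return MODE_TABLE.get((ho, hp, hs), MODE_TABLE[(1, 1, 1)])

print(auto_tune(0, 0, 1))   # high-speed mode
```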

Keywords: auto-tuning, CNC parameters, machining mode, high speed, high accuracy, high surface quality

Procedia PDF Downloads 374
2587 Structure and Mechanics Patterns in the Assembly of Type V Intermediate-Filament Protein-Based Fibers

Authors: Mark Bezner, Shani Deri, Tom Trigano, Kfir Ben-Harush

Abstract:

Intermediate filament (IF) protein-based fibers are among the toughest fibers in nature, as has been shown for native hagfish slime threads and for synthetic fibers based on type V IF proteins, the nuclear lamins. It is assumed that their mechanical performance stems from two major factors: (1) the transition from elastic α-helices to stiff β-sheets during tensile loading; and (2) the specific organization of the coiled-coil proteins into a hierarchical network of nano-filaments. Here, we investigated the interrelationship between these two factors by using wet-spun fibers based on C. elegans (Ce) lamin. We found that Ce-lamin fibers, whether assembled in aqueous or alcoholic solutions, had the same nonlinear mechanical behavior, with the elastic region ending at ~5%. The pattern of the transition was, however, different: the ratio between α-helices and β-sheets/random coils was relatively constant up to a 20% strain for fibers assembled in an aqueous solution, whereas for fibers assembled in 70% ethanol, the transition ended at a 6% strain. This structural phenomenon in alcoholic solution probably occurred through the transition between compacted and extended conformations of the random coil, and not between α-helix and β-sheets, as cycle analyses had suggested. The different transition pattern can also be explained by the different higher-order organization of Ce-lamins in aqueous or alcoholic solutions, as demonstrated by introducing a point mutation at a conserved residue in the Ce-lamin gene that alters the structure of the Ce-lamins' nano-fibrils. In addition, biomimicking the layered structure of silk and hair fibers by coating the Ce-lamin fiber with a hydrophobic layer enhanced fiber toughness and led to a reversible transition between the α-helix and the extended conformation. This work suggests that different hierarchical structures, which are formed by specific assembly conditions, lead to diverse secondary-structure transition patterns, which in turn affect the fibers' mechanical properties.

Keywords: protein-based fibers, intermediate filaments (IF) assembly, toughness, structure-property relationships

Procedia PDF Downloads 105
2586 User-Friendly Task Creation Using a CAD Integrated Robotic System on a Real Workcell

Authors: Alireza Changizi, Arash Rezaei, Jamal Muhammad, Jyrki Latokartano, Minna Lanz

Abstract:

Offline programming (OLP) is a robot programming method now widely used in industry; it is a simulation-based method that produces robot motion code from a virtual model of the work cell built in simulation software. In this project, Delmia V5 is used as the simulation software. First, the work cell components were modelled in Catia V5, imported into a process file in Delmia, and placed roughly to form the virtual work cell. The robot was then added to the work cell from the Delmia library. The work cell was calibrated against the real-world work cell to obtain accurate code. Tool calibration is the first step of the calibration scheme; the work cell equipment can then be calibrated using a six-point calibration method. Finally, the generated code needs to be post-processed to match the instruction set of the related controller. In the last stage, I/O signals were set to accomplish cooperation between the robots and synchronize their motion. The presented results show the feasibility of the method and its effect on production-line efficiency; the positive and negative points of the implementation are also discussed.

Keywords: robotic, automated, production, offline programming, CAD

Procedia PDF Downloads 382
2585 Cross Matching: An Improved Method to Obtain Comprehensive and Consolidated Evidence

Authors: Tuula Heinonen, Wilhelm Gaus

Abstract:

At present, safety assessment starts with animal tests, although their predictivity is often poor. Even after extended human use, experimental data are often judged to be the core information for risk assessment. However, the best opportunity to generate true evidence is to match all available information. The cross matching methodology combines the different fields of knowledge and types of data (e.g. in-vitro and in-vivo experiments, clinical observations, clinical and epidemiological studies, and daily-life observations) and gives adequate weight to individual findings. To achieve a consolidated outcome, the information from all available sources is analysed and compared. If single pieces of information fit together, a clear picture becomes visible. If pieces of information are inconsistent or contradictory, careful consideration is necessary. 'Cross' can be understood as 'orthogonal' in geometry or as 'independent' in mathematics. Results coming from different sources are independent and therefore contribute new information. Independent information gives a larger contribution to evidence than results coming repeatedly from the same source. A successful example of cross matching is the assessment of Ginkgo biloba, where we were able to come to the conclusive result: Ginkgo biloba leaf extract is well tolerated and safe for humans.

Keywords: cross-matching, human use, safety assessment, Ginkgo biloba leaf extract

Procedia PDF Downloads 278
2584 Relaxation Dynamics of Quantum Emitters Resonantly Coupled to a Localized Surface Plasmon

Authors: Khachatur V. Nerkararyan, Sergey I. Bozhevolnyi

Abstract:

We investigate the relaxation dynamics of a quantum dipole emitter (QDE), e.g., a molecule or quantum dot, located near a metal nanoparticle (MNP) exhibiting a dipolar localized surface plasmon (LSP) resonance at the frequency of the QDE radiative transition. The analysis considers the regime in which the QDE-MNP characteristic relaxation time is much shorter than the free-space relaxation time of the QDE but much longer than the LSP lifetime. It is shown that energy dissipation in the QDE-MNP system is relatively weak, with the probability of photon emission being about 0.75, a number which, rather surprisingly, does not explicitly depend on the metal absorption characteristics. The degree of entanglement, measured by the concurrence, takes its maximum value when the distances between the QDEs and the metal ball are approximately equal.

Keywords: metal nanoparticle, localized surface plasmon, quantum dipole emitter, relaxation dynamics

Procedia PDF Downloads 446
2583 Making Sense of Adversity Triggers Using Organisational Resilience, a Systematic Literature Review

Authors: Luke McGowan, David Pickernell, Martini Battisti

Abstract:

In this paper, adversity triggers are explored through the lens of organisational resilience. Adversity triggers are contextualized by temporal factors, thus naturally aligning with the resilience literature. Resilience was chosen as the theoretical framework because risk management approaches are often not geared towards providing meaningful responses to high-impact, low-probability events. Adversity triggers and organisational resilience both consider temporal factors, which enables investigation of each phase of recovery. A systematic literature review was employed to assess previous literature and define further areas of research. The systematic literature review method was chosen to catalogue and identify gaps in the current literature.

Keywords: adversity triggers, crisis, extreme events, organisational resilience, resilience

Procedia PDF Downloads 139
2582 Proficient Estimation Procedure for a Rare Sensitive Attribute Using Poisson Distribution

Authors: S. Suman, G. N. Singh

Abstract:

The present manuscript addresses an estimation procedure for a population parameter using the Poisson probability distribution when the characteristic under study is a rare sensitive attribute. A generalized form of the unrelated randomized response model is suggested in order to acquire truthful responses from respondents. Estimators are proposed for two situations: when the information on an unrelated rare non-sensitive characteristic is known and when it is unknown. The properties of the proposed estimators are derived, and a measure of respondent confidentiality is also suggested. Empirical studies are carried out in support of the discussed theory.
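
For orientation, here is a minimal sketch of the classical unrelated-question randomized response estimator for a proportion; the authors' model generalizes this idea to a Poisson setting for rare attributes, which is not reproduced here. The design probability p and the unrelated prevalence pi_u below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100_000
pi_s_true = 0.01   # true prevalence of the rare sensitive attribute
pi_u = 0.20        # known prevalence of the unrelated, non-sensitive attribute
p = 0.70           # probability that the randomizing device selects the sensitive question

sensitive = rng.random(n) < pi_s_true
unrelated = rng.random(n) < pi_u
ask_sensitive = rng.random(n) < p
answers = np.where(ask_sensitive, sensitive, unrelated)   # only the yes/no answer is observed

lam_hat = answers.mean()                       # observed "yes" proportion
pi_s_hat = (lam_hat - (1 - p) * pi_u) / p      # unbiased estimator of the sensitive prevalence
print(f"estimated prevalence: {pi_s_hat:.4f}")
```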

Keywords: Poisson distribution, randomized response model, rare sensitive attribute, non-sensitive attribute

Procedia PDF Downloads 262
2581 Autonomous Quantum Competitive Learning

Authors: Mohammed A. Zidan, Alaa Sagheer, Nasser Metwally

Abstract:

Real-time learning is an important goal that much artificial intelligence research tries to achieve. Many problems and applications require low-cost learning, such as teaching a robot to classify and recognize patterns in real time and to perform real-time recall. In this contribution, we suggest a model of quantum competitive learning based on a series of quantum gates and an additional operator. The proposed model is able to recognize incomplete patterns, where we can increase the probability of recognizing the desired pattern at the expense of the undesired ones. Moreover, these undesired patterns could be utilized as new patterns for the system. The proposed model performs much better than classical approaches and is more powerful than current quantum competitive learning approaches.

Keywords: competitive learning, quantum gates, winner-take-all

Procedia PDF Downloads 463
2580 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients

Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi

Abstract:

Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time until death. A frailty model is a random-effects model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of a gamma frailty model with concomitant variables in order to identify the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year recruitment period (May 2008-May 2009), data were taken from the records of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years in Imam Khomeini Hospital in Iran. In order to determine the factors affecting cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, namely the exponential and Weibull distributions, were considered for survival time. Data analysis was performed using R software, and a significance level of 0.05 was considered for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second factor. Overall, mean survival over the 7-year follow-up was 28.44 months; for deceased and censored patients it was 19.33 and 31.79 months, respectively. Exponential and Weibull survival models incorporating the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the factors affecting the time to death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions seems desirable.
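
The core of the gamma frailty specification can be written as h_i(t) = z_i h_0(t), with z_i gamma-distributed with mean 1 and variance theta. The short simulation below illustrates this multiplicative structure with a Weibull baseline; the numbers are illustrative assumptions, not the fitted values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 305                            # cohort size, as in the abstract
theta = 0.5                        # assumed frailty variance
shape_k, scale_lam = 1.3, 60.0     # assumed Weibull baseline parameters (time in months)

z = rng.gamma(shape=1 / theta, scale=theta, size=n)   # frailty: E[z] = 1, Var[z] = theta
e = rng.exponential(size=n)                           # unit-exponential draws

# Survival times follow from inverting z * H0(T) = E, where H0(t) = (t / lambda)^k
t = scale_lam * (e / z) ** (1 / shape_k)
print(f"median simulated survival time: {np.median(t):.1f} months")
```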

Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution

Procedia PDF Downloads 257
2579 The Effect of Peer Pressure and Leisure Boredom on Substance Use Among Adolescents in Low-Income Communities in Cape Town

Authors: Gaironeesa Hendricks, Shazly Savahl, Maria Florence

Abstract:

The aim of the study is to determine whether peer pressure and leisure boredom influence substance use among adolescents in low-income communities in Cape Town. Non-probability sampling was used to select 296 adolescents aged 16-18 from schools located in two low-income communities. The measurement tools included the Drug Use Disorders Identification Test and the Resistance to Peer Influence and Leisure Boredom Scales. Multiple regression revealed that the combined influence of peer pressure and leisure boredom predicted substance use, with peer pressure emerging as a stronger predictor of adolescent substance use than leisure boredom.

Keywords: substance use, peer pressure, leisure boredom, adolescents, multiple regression

Procedia PDF Downloads 595
2578 InfoMiracles in the Qur’an and a Mathematical Proof to the Existence of God

Authors: Mohammad Mahmoud Mandurah

Abstract:

The existence of InfoMiracles in a scripture is evidence that the scripture has a divine origin; it is also evidence of the existence of God. An InfoMiracle is an information-based miracle. The basic component of an InfoMiracle is a piece of information that could not have been obtained by a human except through a divine channel. The existence of a sufficient number of convincing InfoMiracles in a scripture necessitates the existence of a divine source for these InfoMiracles. A mathematical equation is developed to prove that the Qur'an has a divine origin and, hence, to prove the existence of God. The equation depends on a single variable only, which is the number of InfoMiracles in the Qur'an. The Qur'an is rich with InfoMiracles. It is shown that the existence of fewer than 30 InfoMiracles in the Qur'an is sufficient proof of the existence of God and that the Qur'an is a revelation from God.

Keywords: InfoMiracle, God, mathematical proof, miracle, probability

Procedia PDF Downloads 209
2577 Reliability Analysis of Partial Safety Factor Design Method for Slopes in Granular Soils

Authors: K. E. Daryani, H. Mohamad

Abstract:

Uncertainties in the geo-structure analysis and design have a significant impact on the safety of slopes. Traditionally, uncertainties in the geotechnical design are addressed by incorporating a conservative factor of safety in the analytical model. In this paper, a risk-based approach is adopted to assess the influence of the geotechnical variable uncertainties on the stability of infinite slopes in cohesionless soils using the “partial factor of safety on shear strength” approach as stated in Eurocode 7. Analyses conducted using Monte Carlo simulation show that the same partial factor can have very different levels of risk depending on the degree of uncertainty of the mean values of the soil friction angle and void ratio.
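
As a minimal sketch of the kind of calculation involved, the Monte Carlo snippet below evaluates a dry, cohesionless infinite slope, for which FS = tan(phi)/tan(beta), alongside a deterministic check with a partial factor on tan(phi). The slope angle, friction-angle statistics, and the 1.25 partial factor are example values, not the paper's inputs.

```python
import numpy as np

rng = np.random.default_rng(42)

beta = np.radians(25.0)            # slope angle
phi_mean, phi_cov = 35.0, 0.10     # mean friction angle (deg) and coefficient of variation
gamma_phi = 1.25                   # example partial factor applied to tan(phi)

# Deterministic design check with the partial factor
fs_design = np.tan(np.radians(phi_mean)) / gamma_phi / np.tan(beta)

# Probabilistic check: probability that FS < 1 over random friction angles
phi = rng.normal(phi_mean, phi_cov * phi_mean, size=1_000_000)
fs = np.tan(np.radians(phi)) / np.tan(beta)
p_failure = np.mean(fs < 1.0)

print(f"design FS with partial factor: {fs_design:.2f}")
print(f"probability of failure:        {p_failure:.2e}")
```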

Keywords: safety, probability of failure, reliability, infinite slopes, sand

Procedia PDF Downloads 569
2576 Understanding and Political Participation in Constitutional Monarchy of Dusit District Residents

Authors: Sudaporn Arundee

Abstract:

The purposes of this research were (1) to study political understanding of and participation in the constitutional monarchy and (2) to study the level of participation. The paper draws on data collected from 395 Dusit residents using a questionnaire, with simple random sampling used to select respondents. The findings revealed that 94 percent of respondents had a very good understanding of the constitutional monarchy, with a mean score of 4.8. However, the respondents overall had a very low level of participation, with a mean score of 1.69 and a standard deviation of 0.719.

Keywords: political participation, constitutional monarchy, management and social sciences

Procedia PDF Downloads 248
2575 Effect of the Aluminum Fraction “X” on the Laser Wavelengths in GaAs/AlxGa1-xAs Superlattices

Authors: F. Bendahma, S. Bentata

Abstract:

In this paper, we study numerically the eigenstates existing in a GaAs/AlxGa1-xAs superlattice with structural disorder in the form of a trimer height barrier (THB). The aluminium concentration x takes two different values at random; one of them appears only in triples and remains lower than the second throughout the studied structure. In spite of the presence of disorder, the system exhibits two kinds of sets of propagating states lying below the barrier, due to the characteristic structure of the superlattice. This result allows us to note the existence of a single laser emission associated with the trimer, with wavelengths obtained in the mid-infrared.

Keywords: infrared (IR), laser emission, superlattice, trimer

Procedia PDF Downloads 439
2574 A Graph-Based Retrieval Model for Passage Search

Authors: Junjie Zhong, Kai Hong, Lei Wang

Abstract:

Passage retrieval (PR) plays an important role in many natural language processing (NLP) tasks. Traditional efficient retrieval models relying on exact term matching, such as TF-IDF or BM25, have nowadays been surpassed by pre-trained language models that match by semantics. Although they gain effectiveness, deep language models often require large memory and incur high time cost. To tackle the trade-off between efficiency and effectiveness in PR, this paper proposes the Graph Passage Retriever (GraphPR), a graph-based model inspired by the development of graph learning techniques. Different from existing works, GraphPR is end-to-end and integrates both term-matching information and semantics. GraphPR constructs a passage-level graph from BM25 retrieval results and trains a GCN-like model on the graph with graph-based objectives. Passages are regarded as nodes in the constructed graph and are embedded as dense vectors. PR can then be implemented using the embeddings and a fast vector-similarity search. Experiments on a variety of real-world retrieval datasets show that the proposed model outperforms related models in several evaluation metrics (e.g., mean reciprocal rank, accuracy, F1 scores) while maintaining relatively low query latency and memory usage.
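
The sketch below illustrates two of the ingredients described above in simplified form: building a passage graph from BM25 retrieval results (passages co-retrieved in the same top-k list are linked) and running a vector-similarity search over passage embeddings. The GCN training step is omitted, and the BM25 scores and embeddings are assumed to be precomputed; this is not the GraphPR implementation itself.

```python
import numpy as np

def build_graph(bm25_scores: np.ndarray, k: int = 10) -> np.ndarray:
    """bm25_scores: (num_queries, num_passages). Link passages that co-occur in a top-k list."""
    n = bm25_scores.shape[1]
    adj = np.zeros((n, n), dtype=np.float32)
    for scores in bm25_scores:
        top = np.argsort(scores)[::-1][:k]
        adj[np.ix_(top, top)] = 1.0          # fully connect the co-retrieved passages
    np.fill_diagonal(adj, 0.0)
    return adj

def search(query_vec: np.ndarray, passage_vecs: np.ndarray, k: int = 5) -> np.ndarray:
    """Cosine-similarity search over learned dense passage embeddings."""
    q = query_vec / np.linalg.norm(query_vec)
    p = passage_vecs / np.linalg.norm(passage_vecs, axis=1, keepdims=True)
    return np.argsort(p @ q)[::-1][:k]       # indices of the k most similar passages
```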

Keywords: efficiency, effectiveness, graph learning, language model, passage retrieval, term-matching model

Procedia PDF Downloads 130
2573 Multi-Criteria Evolutionary Algorithm to Develop Efficient Schedules for Complex Maintenance Problems

Authors: Sven Tackenberg, Sönke Duckwitz, Andreas Petz, Christopher M. Schlick

Abstract:

This paper introduces an extension of the well-established resource-constrained project scheduling problem (RCPSP) that applies it to complex maintenance problems. The problem is to assign technicians to a team which has to process several tasks with multi-level skill requirements during a work shift. Here, several alternative activities for a task allow both the temporal shift of activities and the reallocation of technicians and tools. As a result, switches from one valid work-process variant to another can be considered and may be selected by the developed evolutionary algorithm based on the present skill level of the technicians or the available tools. An additional complication of the observed scheduling problem is that the locations of the construction sites are only temporarily accessible during a day. Due to intensive rail traffic, the available time slots for maintenance and repair works are extremely short and are often distributed throughout the day. To identify efficient working periods, a first concept of a Bayesian network is introduced and integrated into the extended RCPSP with pre-emptive and non-pre-emptive tasks. The Bayesian network is used to calculate the probability of a maintenance task being processed during a specific period of the shift. Focusing on the maintenance of railway infrastructure in metropolitan areas, one of the least productive implementation processes on construction sites, the paper illustrates how the extended RCPSP can be applied to support maintenance planning. A multi-criteria evolutionary algorithm with a dedicated problem representation is introduced which is capable of revising technician-task allocations, where the duration of a task may be stochastic. The approach uses a novel activity-list representation to ensure easily describable and modifiable elements which can be converted into detailed shift schedules. The main objective is to develop a shift plan which maximizes the utilization of each technician by minimizing the waiting times caused by rail traffic. The results of the already implemented core algorithm show fast convergence towards an optimal team composition for a shift, an efficient sequence of tasks, and a high probability of subsequent implementation given the stochastic task durations. In the paper, the algorithm for the extended RCPSP is analyzed in an experimental evaluation using real-world example problems of various sizes, resource complexities, tightness, and so forth.
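
For readers unfamiliar with activity-list representations, the sketch below shows how such a chromosome is typically decoded with a serial schedule-generation scheme, the usual building block of RCPSP-style evolutionary algorithms. It covers precedence constraints and a single renewable resource only; the multi-skill, tool, pre-emption, and Bayesian time-window aspects of the paper are omitted, and the toy instance is invented.

```python
def decode(activity_list, duration, demand, predecessors, capacity):
    """Serial SGS: return start times for activities given in precedence-feasible list order."""
    finish, start, usage = {}, {}, {}          # usage = resource units consumed per time step
    for a in activity_list:
        t = max((finish[p] for p in predecessors[a]), default=0)
        # shift the activity right until the resource profile can accommodate it
        while any(usage.get(u, 0) + demand[a] > capacity for u in range(t, t + duration[a])):
            t += 1
        for u in range(t, t + duration[a]):
            usage[u] = usage.get(u, 0) + demand[a]
        start[a], finish[a] = t, t + duration[a]
    return start

# Toy instance: four tasks, two technicians available
preds = {1: [], 2: [1], 3: [1], 4: [2, 3]}
dur = {1: 2, 2: 3, 3: 2, 4: 1}
dem = {1: 1, 2: 2, 3: 1, 4: 2}
print(decode([1, 2, 3, 4], dur, dem, preds, capacity=2))   # {1: 0, 2: 2, 3: 5, 4: 7}
```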

Keywords: maintenance management, scheduling, resource constrained project scheduling problem, genetic algorithms

Procedia PDF Downloads 227
2572 A Novel Parametric Chaos-Based Switching System PCSS for Image Encryption

Authors: Mohamed Salah Azzaz, Camel Tanougast, Tarek Hadjem

Abstract:

In this paper, a new low-cost image encryption technique is proposed and analyzed. The developed chaos-based key generator provides complex behavior and can change it automatically via a random-like switching rule. The designed encryption scheme is called PCSS (Parametric Chaos-based Switching System). The performance of this technique was evaluated in terms of data security and privacy. Simulation results show the effectiveness of the technique, which is thereafter ready for a hardware implementation.
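
For context, the snippet below is a generic illustration of chaos-based stream encryption using a single logistic map; it is not the proposed PCSS generator, which switches between parametric chaotic systems via a random-like rule. The key here is simply the pair (x0, r), and applying the same call again decrypts.

```python
import numpy as np

def logistic_keystream(n_bytes: int, x0: float = 0.3141, r: float = 3.9999) -> np.ndarray:
    x = x0
    ks = np.empty(n_bytes, dtype=np.uint8)
    for i in range(n_bytes):
        x = r * x * (1.0 - x)         # logistic-map iteration
        ks[i] = int(x * 256) & 0xFF   # quantize the chaotic state to one byte
    return ks

def encrypt(image: np.ndarray, x0: float, r: float) -> np.ndarray:
    flat = image.astype(np.uint8).ravel()
    return (flat ^ logistic_keystream(flat.size, x0, r)).reshape(image.shape)

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)    # stand-in for an image
cipher = encrypt(img, 0.3141, 3.9999)
assert np.array_equal(encrypt(cipher, 0.3141, 3.9999), img)  # XOR keystream is self-inverse
```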

Keywords: chaos, encryption, security, image

Procedia PDF Downloads 467
2571 ML-Based Blind Frequency Offset Estimation Schemes for OFDM Systems in Non-Gaussian Noise Environments

Authors: Keunhong Chae, Seokho Yoon

Abstract:

This paper proposes frequency offset (FO) estimation schemes robust to non-Gaussian noise for orthogonal frequency division multiplexing (OFDM) systems. A maximum-likelihood (ML) scheme and a low-complexity estimation scheme are proposed by applying the probability density function of the cyclic prefix of OFDM symbols to the ML criterion. Simulation results confirm that the proposed schemes offer a significant FO estimation performance improvement over the conventional estimation scheme in non-Gaussian noise environments.
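
For reference, the sketch below implements the classical cyclic-prefix-based ML frequency-offset estimator for the Gaussian-noise case (the correlation-and-angle form); the paper's contribution is to modify the likelihood for non-Gaussian noise, which is not reproduced here. The symbol parameters and noise level are arbitrary example values.

```python
import numpy as np

rng = np.random.default_rng(7)

N, Ncp = 64, 16                        # FFT size and cyclic-prefix length
eps_true = 0.12                        # normalized carrier frequency offset
qpsk = rng.choice(np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]), size=N) / np.sqrt(2)

x = np.fft.ifft(qpsk) * np.sqrt(N)     # time-domain OFDM symbol
x = np.concatenate([x[-Ncp:], x])      # prepend the cyclic prefix

n = np.arange(N + Ncp)
r = x * np.exp(2j * np.pi * eps_true * n / N)                                  # apply the CFO
r += 0.05 * (rng.standard_normal(r.size) + 1j * rng.standard_normal(r.size))   # AWGN

# The CP repeats the last Ncp samples, so r[k] and r[k+N] ideally differ only by
# the phase rotation 2*pi*eps: correlate them and read the offset from the angle.
corr = np.sum(r[:Ncp] * np.conj(r[N:N + Ncp]))
eps_hat = -np.angle(corr) / (2 * np.pi)
print(f"true eps = {eps_true}, estimated eps = {eps_hat:.3f}")
```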

Keywords: frequency offset, cyclic prefix, maximum-likelihood, non-Gaussian noise, OFDM

Procedia PDF Downloads 468
2570 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia

Authors: Yuyun Wabula, B. J. Dewancker

Abstract:

In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media, attaching the user's location coordinates in the real world. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior. The paper aims to explore the relation between Twitter geolocation data and the presence of people in urban areas. First, the study analyzes the spread of people within particular areas of the city using Twitter data. Second, we match and categorize the existing places based on the individuals who visit them. We then combine the Twitter data from the tracking result with questionnaire data to capture the Twitter user profile. To do so, we use frequency distribution analysis to determine the visitors' percentages. To validate the hypothesis, we compare the results with local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show that there is a correlation between Twitter geolocation and the questionnaire data. Thus, integrating Twitter data and survey data can reveal the profile of social media users.

Keywords: geolocation, Twitter, distribution analysis, human mobility

Procedia PDF Downloads 308
2569 Kinetic Model to Interpret Whistler Waves in Multicomponent Non-Maxwellian Space Plasmas

Authors: Warda Nasir, M. N. S. Qureshi

Abstract:

Whistler waves are right-hand circularly polarized waves and are frequently observed in space plasmas. The low-frequency branch of whistler waves, with frequencies around 100 Hz, known as lion roars, is frequently observed in the magnetosheath. Another feature of the magnetosheath is the observation of flat-top electron distributions with one as well as two electron populations. In the past, lion roars were studied with a kinetic model employing the classical bi-Maxwellian distribution function; however, the observations could not be justified on either quantitative or qualitative grounds. We study whistler waves with a kinetic model employing a non-Maxwellian distribution function, the generalized (r,q) distribution, which is the generalized form of the kappa and Maxwellian distribution functions, with single or two electron populations. We compare our results with Cluster observations and find good quantitative and qualitative agreement between them. At times when lion roars are observed (not observed) in the data and the bi-Maxwellian could not provide sufficient growth (damping) rates, we show that when the generalized (r,q) distribution function is employed, the resulting growth (damping) rates exactly match the observations.

Keywords: kinetic model, whistler waves, non-maxwellian distribution function, space plasmas

Procedia PDF Downloads 310
2568 Monotonicity of the Jensen Functional for f-Divergences via the Zipf-Mandelbrot Law

Authors: Neda Lovričević, Đilda Pečarić, Josip Pečarić

Abstract:

The Jensen functional in its discrete form is brought into relation with the Csiszar divergence functional, this time via its monotonicity property. This approach presents a generalization of previously obtained results that made use of interpolating Jensen-type inequalities. The monotonicity property is thus combined with the Zipf-Mandelbrot law and applied to f-divergences for probability distributions that originate from the Csiszar divergence functional: the Kullback-Leibler divergence, the Hellinger distance, the Bhattacharyya distance, the chi-square divergence, and the total variation distance. The Zipf-Mandelbrot and Zipf laws are widely used in various scientific fields and interdisciplinary research; here the focus is on the aspect of mathematical inequalities.
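
For reference, the two central objects can be written out as follows; these are the standard definitions of the Zipf-Mandelbrot law and of the Kullback-Leibler divergence (one of the f-divergences listed), not results from the paper.

```latex
% Zipf-Mandelbrot law over N ranks, with parameters q \ge 0 and s > 0:
f(k; N, q, s) = \frac{1/(k+q)^{s}}{H_{N,q,s}}, \qquad
H_{N,q,s} = \sum_{i=1}^{N} \frac{1}{(i+q)^{s}},

% Kullback-Leibler divergence between probability distributions p and q:
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{i=1}^{N} p_i \log \frac{p_i}{q_i}.
```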

Keywords: Jensen functional, monotonicity, Csiszar divergence functional, f-divergences, Zipf-Mandelbrot law

Procedia PDF Downloads 135
2567 Approximation of the Time Series by Fractal Brownian Motion

Authors: Valeria Bondarenko

Abstract:

In this paper, we consider two problems related to fractional Brownian motion (fBm). The first problem is the simultaneous estimation of the two parameters that describe this random process, the Hurst exponent and the volatility. Numerical tests on simulated fBm show that the method is efficient. The second problem is the approximation of the increments of an observed time series, via a power function, by increments of fractional Brownian motion. Approximation and estimation are illustrated on real data, namely daily deposit interest rates.
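
As a sketch of the estimation idea, the increment variance of fBm scales as E|X(t+d) - X(t)|^2 = sigma^2 d^(2H), so H and sigma can be read off a log-log regression over several lags. This is a standard textbook approach offered for orientation, not necessarily the exact estimator developed in the paper.

```python
import numpy as np

def estimate_fbm_params(x: np.ndarray, lags=(1, 2, 4, 8, 16)):
    """Estimate (H, sigma) from a sample path observed on a unit-step grid."""
    lags = np.asarray(lags, dtype=int)
    msd = np.array([np.mean((x[d:] - x[:-d]) ** 2) for d in lags])   # mean squared increments
    slope, intercept = np.polyfit(np.log(lags), np.log(msd), 1)      # log msd = 2 log sigma + 2H log d
    return slope / 2.0, np.exp(intercept / 2.0)

rng = np.random.default_rng(3)
bm = np.cumsum(rng.standard_normal(10_000))   # ordinary Brownian motion: H = 0.5, sigma = 1
print(estimate_fbm_params(bm))                # approximately (0.5, 1.0)
```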

Keywords: fractional Brownian motion, Gaussian processes, approximation, time series, estimation of properties of the model

Procedia PDF Downloads 365
2566 Evaluating Daylight Performance in an Office Environment in Malaysia, Using Venetian Blind System: Case Study

Authors: Fatemeh Deldarabdolmaleki, Mohamad Fakri Zaky Bin Ja'afar

Abstract:

A daylit space with a view results in a pleasant and productive environment for office employees. A daylit space is a space which utilizes daylight as the basic source of illumination to fulfil users' visual demands and minimize electric energy consumption. The Malaysian climate is hot and humid all year round because of the country's location in the equatorial belt. However, because most commercial buildings in Malaysia are air-conditioned, huge glass windows are normally installed in order to keep the physical and visual relation between inside and outside. As a result of the climatic situation and this trend, an ordinary office experiences large heat gains, glare, and occupant discomfort. Balancing occupants' comfort and energy conservation in a tropical climate is a real challenge. This study concentrates on evaluating a venetian blind system using per-pixel analysis tools based on the cut-out metrics suggested in the literature. The workplace area in a private office room was selected as a case study. An eight-day measurement experiment was conducted to investigate the effect of different venetian blind angles in an office area under daylight conditions in Serdang, Malaysia. The study goal was to explore the daylight comfort of a commercially available venetian blind system, its daylight sufficiency and excess (8:00 AM to 5:00 PM), as well as glare. Recently developed software for analyzing high dynamic range images (HDRI, captured by a CCD camera), such as the Radiance-based Evalglare and hdrscope, helps investigate luminance-based metrics. The main key factors are illuminance and luminance levels, mean and maximum luminance, daylight glare probability (DGP), and the luminance ratio of the selected mask regions. The findings show that in most cases the morning session needs artificial lighting in order to achieve daylight comfort. However, in some conditions (e.g. 10° and 40° slat angles) the workplane illuminance level exceeds the maximum of 2000 lx in the second half of the day. Generally, a rising trend is observed in mean window luminance, and the most unpleasant cases occur after 2 P.M. Considering the luminance criteria rating, the uncomfortable conditions occur in the afternoon session. Surprisingly, in the no-blind condition, extreme window/task luminance ratios are not common. Regarding daylight glare probability, no DGP value higher than 0.35 was found in this experiment.

Keywords: daylighting, energy simulation, office environment, Venetian blind

Procedia PDF Downloads 248
2565 The Lopsided Burden of Non-Communicable Diseases in India: Evidences from the Decade 2004-2014

Authors: Kajori Banerjee, Laxmi Kant Dwivedi

Abstract:

India is part of the ongoing globalization, contemporary convergence, industrialization, and technical advancement taking place worldwide. Some of the manifestations of this evolution are rapid demographic, socio-economic, epidemiological, and health transitions. There has been a considerable increase in non-communicable diseases due to changes in lifestyle. This study aims to assess the direction of the burden of disease and to compare the pressure of infectious diseases against cardio-vascular, endocrine, metabolic, and nutritional diseases. The change in prevalence over a ten-year period (2004-2014) is further decomposed to determine the net contribution of various socio-economic and demographic covariates. The present study uses the recent 71st (2014) and 60th (2004) rounds of the National Sample Survey. The pressure of infectious diseases against cardio-vascular (CVD) and endocrine, metabolic, and nutritional (EMN) diseases during 2004-2014 is assessed through prevalence rates (PR), hospitalization rates (HR), and case fatality rates (CFR). The prevalence of non-communicable diseases is further used as the dependent variable in a logit regression to find the effect of various social, economic, and demographic factors on the chances of suffering from a particular disease. A multivariate decomposition technique further assists in determining the net contribution of socio-economic and demographic covariates. This paper presents evidence of stagnation in the burden of communicable diseases (CD) and a rapid increase in the burden of non-communicable diseases (NCD) uniformly across all population sub-groups in India. The CFR for CVD increased drastically over 2004-2014. Logit regression indicates that the chances of suffering from CVD and EMN are significantly higher among urban residents, older age groups, females, and widowed, divorced, or separated individuals. Decomposition provides ample proof that improvements in quality-of-life markers such as education, urbanization, and longevity have contributed positively to the increase in NCD prevalence. In India's current epidemiological phase, the compression theory of morbidity is in action, as a significant rise in the probability of contracting NCDs at older ages is observed over the period. Age is found to be a vital contributor to the increase in the probability of having CVD and EMN over the study decade 2004-2014 in the nationally representative National Sample Survey sample.
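
The logit specification described above can be sketched as follows; the variable names, simulated data, and coefficients are hypothetical stand-ins for the NSS microdata, shown only to illustrate the modelling step.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2014)
n = 2_000
df = pd.DataFrame({
    "age":    rng.integers(18, 80, n),
    "female": rng.integers(0, 2, n),
    "urban":  rng.integers(0, 2, n),
})
# simulate an outcome whose odds rise with age, female sex, and urban residence
logit_p = -6.0 + 0.07 * df["age"] + 0.3 * df["female"] + 0.4 * df["urban"]
df["cvd"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("cvd ~ age + female + urban", data=df).fit(disp=False)
print(np.exp(model.params))   # odds ratios for age, sex, and residence
```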

Keywords: cardio-vascular disease, case-fatality rate, communicable diseases, hospitalization rate, multivariate decomposition, non-communicable diseases, prevalence rate

Procedia PDF Downloads 307
2564 Self-serving Anchoring of Self-judgments

Authors: Elitza Z. Ambrus, Bjoern Hartig, Ryan McKay

Abstract:

Individuals' self-judgments might be malleable and influenced by comparison with a random value. On the one hand, self-judgments reflect our self-image, which is typically considered to be stable in adulthood. Indeed, people also strive hard to maintain a fixed, positive moral image of themselves. On the other hand, research has shown the robustness of the so-called anchoring effect on judgments and decisions. The anchoring effect refers to the influence of a previously considered comparative value (anchor) on a consecutive absolute judgment and reveals that individuals' estimates of various quantities are flexible and can be influenced by a salient random value. The present study extends the anchoring paradigm to the domain of the self. We also investigate whether participants are more susceptible to self-serving anchors, i.e., anchors that enhance participants' self-image, especially their moral self-image. In a pre-registered study via the online platform Prolific, 249 participants (156 females, 89 males, 3 other and 1 who preferred not to specify their gender; M = 35.88, SD = 13.91) ranked themselves on eight personality characteristics. In the anchoring conditions, respondents were asked to first indicate whether they thought they would rank higher or lower than a given anchor value before providing their estimated rank in comparison to 100 other anonymous participants. A high and a low anchor value were employed to differentiate between anchors in a desirable (self-serving) direction and anchors in an undesirable (self-diminishing) direction. In the control treatment, there was no comparison question. Subsequently, participants provided their self-rankings on the eight personality traits, with two personal characteristics for each combination of the factors desirable/undesirable and moral/non-moral. We found evidence of an anchoring effect for self-judgments. Moreover, anchoring was more efficient when people were anchored in a self-serving direction: the anchoring effect was enhanced when supporting a more favorable self-view and mitigated (even reversed) when implying a deterioration of the self-image. The self-serving anchoring was more pronounced for moral than for non-moral traits. The data also provided evidence in support of a better-than-average effect in general as well as a magnified better-than-average effect for moral traits. Taken together, these results suggest that self-judgments might not be as stable in adulthood as previously thought. In addition, considerations of constructing and maintaining a positive self-image might interact with the anchoring effect on self-judgments. Potential implications of our results concern the construction and malleability of self-judgments as well as the psychological mechanism shaping anchoring.

Keywords: anchoring, better-than-average effect, self-judgments, self-serving anchoring

Procedia PDF Downloads 174
2563 The Study of ZigBee Protocol Application in Wireless Networks

Authors: Ardavan Zamanpour, Somaieh Yassari

Abstract:

The ZigBee protocol was developed in industry and in an MIT laboratory in 1997. ZigBee is a wireless networking technology from the ZigBee Alliance, designed for low-power, low-data-rate applications. It is a protocol that connects electrical devices with very low energy consumption and cost. The first version of IEEE 802.15.4, on which ZigBee is based, operated in the 2.4 GHz, 915 MHz, and 868 MHz frequency bands. The name of the system recalls the random, zigzag paths that bees traverse during pollination, analogous to the ways in which information packets traverse the mesh network. This paper aims to study the performance and effectiveness of this protocol in wireless networks.

Keywords: ZigBee, protocol, wireless, networks

Procedia PDF Downloads 362
2562 TomoTherapy® System Repositioning Accuracy According to Treatment Localization

Authors: Veronica Sorgato, Jeremy Belhassen, Philippe Chartier, Roddy Sihanath, Nicolas Docquiere, Jean-Yves Giraud

Abstract:

We analyzed the image-guided radiotherapy method used by the TomoTherapy® System (Accuray Corp.) for patient repositioning in clinical routine. The TomoTherapy® System computes X, Y, Z and roll displacements to match the reference CT, on which the dosimetry has been performed, with the pre-treatment MV CT. The accuracy of the repositioning method was studied according to the treatment localization. For this, a database of 18774 treatment sessions performed during two consecutive years (2016-2017) was used. The database includes the X, Y, Z and roll displacements proposed by the TomoTherapy® System as well as the manual correction of these proposals applied by the radiation therapist. This manual correction aims to further improve the repositioning based on the clinical situation and depends on the structures surrounding the target tumor tissue. The statistical analysis performed on the database aims to define repositioning limits to be used as a safety and guidance tool for the manual adjustment applied by the radiation therapist. This tool will serve not only to flag potential repositioning errors but also to further improve patient positioning for optimal treatment.

Keywords: accuracy, IGRT MVCT, image-guided radiotherapy megavoltage computed tomography, statistical analysis, tomotherapy, localization

Procedia PDF Downloads 220