Search results for: high leverage points
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21622

21562 Lies of Police Interrogators in the Ultimatum Game

Authors: Eitan Elaad

Abstract:

The present study's purpose was to examine lying and pretend fairness by police interrogators in sharing situations. Forty police officers and 40 laypeople from the community, all males, self-assessed their lie-telling ability, rated the frequency of their lies, evaluated the acceptability of lying, and indicated using rational and intuitive thinking while lying. Next, according to the ultimatum game procedure, participants were asked to share 100 points with a virtual target, either a male police interrogator or a male layman. Participants allocated points to the target person bearing in mind that the other person must accept their offer. Participants' goal was to retain as many points as possible, and to this end, they could tell the target person that fewer than 100 points were available for distribution. The difference between the available 100 points and the sum of points designated for sharing defines lying. The ratio of offered to designated points defines pretend fairness. Results indicate that police officers lied more than laypeople. Similar results emerged even when the target person was a police interrogator. However, police interrogators presented higher pretend fairness than laypeople. The higher pretend fairness may be in line with the persuasion tactics used in criminal interrogation. The higher lying frequency reported by police interrogators compared with laypeople supports the present results. Finally, lie acceptability predicted lying in the ultimatum game: participants who rated lying as more acceptable tended to lie more than low-acceptability raters.
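The two measures defined above translate directly into code; a minimal sketch, with hypothetical point values and function names (not from the study):

```python
# lying = the 100 available points minus the points declared as available;
# pretend fairness = offered points / declared-available points.

def lying_score(declared_available: int, total: int = 100) -> int:
    """Concealment: gap between the true endowment and what was declared."""
    return total - declared_available

def pretend_fairness(offered: int, declared_available: int) -> float:
    """Ratio of the offer to the (possibly understated) declared pot."""
    return offered / declared_available

# A participant declares only 60 of the 100 points and offers 30 of them:
print(lying_score(60))           # 40 points concealed
print(pretend_fairness(30, 60))  # 0.5 -- an apparently even split of the declared pot
```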

Keywords: lying, police interrogators, lie acceptability, ultimatum game, pretend fairness

Procedia PDF Downloads 138
21561 Volatility Spillover Among the Stock Markets of South Asian Countries

Authors: Tariq Aziz, Suresh Kumar, Vikesh Kumar, Sheraz Mustafa, Jhanzeb Marwat

Abstract:

The paper provides an updated account of volatility spillover among the equity markets of South Asian countries, including Pakistan, India, Sri Lanka, and Bangladesh. The analysis uses both symmetric and asymmetric Generalized Autoregressive Conditional Heteroscedasticity models to investigate volatility persistence and the leverage effect. The bivariate EGARCH model is used to test for volatility transmission between pairs of equity markets. Weekly data for the period February 2013 to August 2019 are used for the empirical analysis. The findings indicate that the leverage effect exists in the equity markets of all the countries except Bangladesh. The volatility spillover from the equity market of Bangladesh to all other countries is negative and significant, whereas the volatility of the Sri Lankan equity market does not influence the volatility of any other country's equity market. The Indian equity market influences only the volatility of the Sri Lankan equity market, and there is bidirectional volatility spillover between the equity markets of Pakistan and Bangladesh. The findings are important for policy-makers and international investors.
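The leverage effect being tested can be illustrated with the EGARCH(1,1) log-variance recursion; the parameter values below are hypothetical, not estimates from the paper:

```python
import math

def egarch_next_logvar(log_var_prev, z_prev,
                       omega=-0.1, beta=0.95, alpha=0.1, gamma=-0.08):
    """One step of the EGARCH(1,1) recursion (illustrative parameters)."""
    expected_abs_z = math.sqrt(2 / math.pi)  # E|z| for a standard normal shock
    return (omega + beta * log_var_prev
            + alpha * (abs(z_prev) - expected_abs_z)
            + gamma * z_prev)

# With gamma < 0, a negative shock of the same magnitude raises next-period
# log-variance more than a positive one -- the leverage effect.
after_bad_news = egarch_next_logvar(0.0, -1.5)
after_good_news = egarch_next_logvar(0.0, +1.5)
print(after_bad_news > after_good_news)  # True
```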

Keywords: volatility spillover, volatility persistence, garch, egarch

Procedia PDF Downloads 124
21560 Augmented Tourism: Definitions and Design Principles

Authors: Eric Hawkinson

Abstract:

After designing and implementing several iterations of augmented reality (AR) in tourism, this paper takes a deep look into design principles and implementation strategies for using AR in destination tourism settings. The study seeks to define augmented tourism from past implementations as well as several use cases designed and implemented for tourism. The discussion leads to the formation of frameworks and best practices for AR, as well as virtual reality (VR), in tourism settings. Main affordances include guest autonomy, customized experiences, visitor data collection, and increased electronic word-of-mouth generation for promotional purposes. Challenges include the need for a high level of technology infrastructure, low adoption or 'buy-in' rates, high levels of calibration and customization, and the need for maintenance and support services. Suggestions are given on how to leverage the affordances and meet the challenges of implementing AR for tourism.

Keywords: augmented tourism, augmented reality, eTourism, virtual tourism, tourism design

Procedia PDF Downloads 353
21559 Eliminating Cutter-Path Deviation for Five-Axis NC Machining

Authors: Alan C. Lin, Tsong Der Lin

Abstract:

This study proposes a deviation control method to add interpolation points to numerical control (NC) codes of five-axis machining in order to achieve the required machining accuracy. Specific research issues include: (1) converting machining data between the CL (cutter location) domain and the NC domain, (2) calculating the deviation between the deviated path and the linear path, (3) finding interpolation points, and (4) determining tool orientations for the interpolation points. System implementation with practical examples will also be included to highlight the applicability of the proposed methodology.
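The deviation check in step (2) amounts to measuring how far the true tool path strays from the linear NC segment; a rough sketch, where the geometry and tolerance values are illustrative rather than the authors' algorithm:

```python
import math

def point_to_segment_dist(p, a, b):
    """Perpendicular distance from 2D point p to segment a-b (clamped)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def needs_interpolation(mid_point, start, end, tolerance):
    """Flag a linear NC move whose sampled mid-path point deviates too far,
    i.e. where an interpolation point should be inserted."""
    return point_to_segment_dist(mid_point, start, end) > tolerance

# The actual path passes through (0.5, 0.1), 0.1 away from the chord:
print(needs_interpolation((0.5, 0.1), (0.0, 0.0), (1.0, 0.0), 0.05))  # True
```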

Keywords: CAD/CAM, cutter path, five-axis machining, numerical control

Procedia PDF Downloads 408
21558 Lies and Pretended Fairness of Police Officers in Sharing

Authors: Eitan Elaad

Abstract:

The current study aimed to examine lying and pretended fairness by police personnel in sharing situations. Forty Israeli police officers and 40 laypeople from the community, all males, self-assessed their lie-telling ability, rated the frequency of their lies, evaluated the acceptability of lying, and indicated using rational and intuitive thinking while lying. Next, according to the ultimatum game procedure, participants were asked to share 100 points with an imagined target, either a male policeman or a male non-policeman. Participants allocated points to the target person bearing in mind that the other person could accept or reject their offer. Participants' goal was to retain as many points as possible, and to this end, they could tell the target person that fewer than 100 points were available for distribution. We defined concealment, or lying, as the difference between the available 100 points and the sum of points designated for sharing. Results indicated that police officers lied less to their fellow police targets than to non-police targets, whereas laypeople lied less to non-police targets than to imagined police targets. The ratio between the points offered to the imagined target person and the points presented by the participant as available for sharing defined pretended fairness. Enhanced pretended fairness indicates higher motivation to display fair sharing even if the fair sharing is fictitious. Police officers presented higher pretended fairness to police targets than laypeople, whereas laypeople displayed more fairness to non-police targets than police officers. We discuss the results with respect to occupational solidarity and loyalty among police personnel. Specifically, police work involves uncertainty, danger and risk, coercive authority, and the use of force, which isolates the police from the community and dictates strong bonds of solidarity between police personnel.
No wonder police officers shared more points (lied less) to fellow police targets than non-police targets. On the other hand, police legitimacy or the belief that the police are acting honestly in the best interest of the citizens constitutes citizens' attitudes toward the police. The relatively low number of points shared for distribution by laypeople to police targets indicates difficulties with the legitimacy of the Israeli police.

Keywords: lying, fairness, police solidarity, police legitimacy, sharing, ultimatum game

Procedia PDF Downloads 103
21557 Fusion of MOLA-based DEMs and HiRISE Images for Large-Scale Mars Mapping

Authors: Ahmed F. Elaksher, Islam Omar

Abstract:

In this project, we used MOLA-based DEMs to orthorectify HiRISE optical images. The MOLA data was interpolated using the kriging interpolation technique. Corresponding tie points were then digitized from both datasets. These points were employed in co-registering both datasets using GIS analysis tools. Different transformation models, including the affine and projective transformation models, were used with different sets and distributions of tie points. Additionally, we evaluated the use of the MOLA elevations in co-registering the MOLA and HiRISE datasets. The planimetric RMSEs achieved for each model are reported. Results suggested the use of 3D-2D transformation models.
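The planimetric RMSE reported for each transformation model can be sketched as follows; the affine parameterization and tie-point coordinates below are illustrative, not the project's actual data:

```python
import math

def apply_affine(params, x, y):
    """2D affine: (a, b, tx, c, d, ty) maps (x, y) -> (a*x + b*y + tx, c*x + d*y + ty)."""
    a, b, tx, c, d, ty = params
    return a * x + b * y + tx, c * x + d * y + ty

def planimetric_rmse(params, src_pts, dst_pts):
    """RMSE of transformed source tie points against their reference positions."""
    sq = 0.0
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        ux, vy = apply_affine(params, x, y)
        sq += (ux - u) ** 2 + (vy - v) ** 2
    return math.sqrt(sq / len(src_pts))

# Identity transform against perfectly matching tie points -> zero RMSE.
src = [(0, 0), (10, 0), (0, 10)]
identity = (1, 0, 0, 0, 1, 0)
print(planimetric_rmse(identity, src, src))  # 0.0
```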

Keywords: photogrammetry, Mars, MOLA, HiRISE

Procedia PDF Downloads 59
21556 Optimal Price Points in Differential Pricing

Authors: Katerina Kormusheva

Abstract:

Pricing plays a pivotal role in the marketing discipline as it directly influences consumer perceptions, purchase decisions, and overall market positioning of a product or service. This paper seeks to expand current knowledge in the area of discriminatory and differential pricing, a main area of marketing research. The methodology includes developing a framework and a model for determining how many price points to implement in differential pricing. We focus on choosing the levels of differentiation, derive a function form of the model framework proposed, and lastly, test it empirically with data from a large-scale marketing pricing experiment of services in telecommunications.

Keywords: marketing, differential pricing, price points, optimization

Procedia PDF Downloads 76
21555 Approximation of Intersection Curves of Two Parametric Surfaces

Authors: Misbah Irshad, Faiza Sarfraz

Abstract:

The problem of approximating surface-to-surface intersection is considered very important in computer-aided geometric design and computer-aided manufacturing. Although it is a complex problem to handle, its continuous need in industry makes it an active topic in research. A technique for approximating intersection curves of two parametric surfaces is proposed, which extracts boundary points and turning points from a sequence of intersection points and interpolates them with the help of rational cubic spline functions. The proposed approach is demonstrated with the help of examples and analyzed by calculating the error.
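A plain cubic Hermite segment, a simpler stand-in for the rational cubic splines used in the paper, illustrates how the extracted points are interpolated per coordinate:

```python
def hermite(p0, p1, m0, m1, t):
    """Cubic Hermite segment through values p0, p1 with end tangents m0, m1,
    evaluated at parameter t in [0, 1]."""
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1

# Interpolating one coordinate between two extracted intersection points:
print(hermite(0.0, 1.0, 1.0, 1.0, 0.0))  # 0.0 -- passes through p0
print(hermite(0.0, 1.0, 1.0, 1.0, 1.0))  # 1.0 -- passes through p1
```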

Keywords: approximation, parametric surface, spline function, surface intersection

Procedia PDF Downloads 247
21554 Comparison of Sports Massage and Stretching along the Cold on Pain Intensity in Elite Female Volleyball Players with Trigger Points in Shoulder Girdle Region

Authors: Sahar Mohammadyari Ghareh Bolagh, Behnaz Seyedi Aghdam, Jalal Shamlou

Abstract:

This study was done to compare the effects of sports massage and stretching along the cold on pain intensity in elite female volleyball players with trigger points in the shoulder girdle region. The study was conducted on 32 female volleyball players with latent trigger points in the shoulder girdle region. Patients were randomly assigned to three groups: sports massage (n=11), stretching along the cold (n=11), and a control group (n=10). A single 15-minute treatment session was performed. Pain intensity was assessed with a VAS and an algometer before and after the intervention and improved in both the massage and cold groups. After treatment, there was no significant difference between the two treatment groups (P < 0.050). Results of the present research showed that sports massage and stretching along the cold were effective on the pain intensity of myofascial trigger points.

Keywords: sports massage, stretching along the cold, pain intensity, trigger points, elite volleyball players, shoulder girdle region

Procedia PDF Downloads 349
21553 The Quality of Management: A Leadership Maturity Model to Leverage Complexity

Authors: Marlene Kuhn, Franziska Schäfer, Heiner Otten

Abstract:

Today's production processes experience a constant increase in complexity, paving new ways for progressive forms of leadership. In customized production, individual customer requirements drive companies to adapt their manufacturing processes constantly, while the pressure for smaller lot sizes, lower costs, and faster lead times grows simultaneously. As production processes become more dynamic and complex, conventional quality management approaches show certain limitations. This paper gives an introduction to complexity science from a quality management perspective. By analyzing and evaluating different characteristics of complexity, the critical complexity parameters are identified and assessed. We found that the quality of leadership plays a crucial role when dealing with increasing complexity. Therefore, we developed a concept for leadership quality customized for management within complex processes, based on a maturity model. The maturity model was then applied in industry to assess the leadership quality of several shop floor managers, with positive evaluation feedback. As a result, the maturity model proved to be a sustainable approach to leveraging the rising complexity in production processes more effectively.

Keywords: maturity model, process complexity, quality of leadership, quality management

Procedia PDF Downloads 353
21552 Electret: A Solution of Partial Discharge in High Voltage Applications

Authors: Farhina Haque, Chanyeop Park

Abstract:

The high efficiency, high field, and high power density provided by wide bandgap (WBG) semiconductors and advanced power electronic converter (PEC) topologies have enabled the dynamic control of power in medium- to high-voltage systems. Although WBG semiconductors outperform conventional silicon-based devices in terms of voltage rating, switching speed, and efficiency, the increased voltage handling, high dv/dt, and compact device packaging increase local electric fields, which are the main causes of partial discharge (PD) in advanced medium- and high-voltage applications. PD, which occurs actively in voids, triple points, and airgaps, is an inevitable dielectric challenge that causes insulation and device aging. The aging process accelerates over time and eventually leads to the complete failure of the application. Hence, it is critical to mitigate PD. Sharp edges, airgaps, triple points, and bubbles are common defects that exist in any medium- to high-voltage device. The defects are created during the manufacturing processes of the devices and are prone to high-electric-field-induced PD due to the low permittivity and low breakdown strength of the gaseous medium filling the defects. A contemporary approach to mitigating PD by neutralizing electric fields in high-power-density applications is introduced in this study. To neutralize the locally enhanced electric fields that occur around triple points, airgaps, sharp edges, and bubbles, electrets are developed and incorporated into high-voltage applications. Electrets are dielectric materials that emit electric fields from electrical charges embedded on their surface and in their bulk. In this study, electrets are fabricated by electrically charging polyvinylidene difluoride (PVDF) films based on the widely used triode corona discharge method.
To investigate the PD mitigation performance of the fabricated electret films, a series of PD experiments are conducted on both the charged and uncharged PVDF films under square voltage stimuli that represent PWM waveform. In addition to the use of single layer electrets, multiple layers of electrets are also experimented with to mitigate PD caused by higher system voltages. The electret-based approach shows great promise in mitigating PD by neutralizing the local electric field. The results of the PD measurements suggest that the development of an ultimate solution to the decades-long dielectric challenge would be possible with further developments in the fabrication process of electrets.

Keywords: electrets, high power density, partial discharge, triode corona discharge

Procedia PDF Downloads 187
21551 Comparison of Rainfall Trends in the Western Ghats and Coastal Region of Karnataka, India

Authors: Vinay C. Doranalu, Amba Shetty

Abstract:

In recent decades, due to climate change, there has been large variation in the spatial distribution of daily rainfall within small regions. Rainfall is one of the key climatic variables affecting spatio-temporal patterns of water availability. The real challenge posed by climate change is the identification, estimation, and understanding of the uncertainty of rainfall. This study analyzes the spatial variations and temporal trends of daily precipitation using high-resolution (0.25º x 0.25º) gridded data from the Indian Meteorological Department (IMD). For the study, 38 grid points were selected in the study area and analyzed as daily precipitation time series (113 years) over the period 1901-2013. Grid points were divided into two zones based on elevation and location: Low Land (exposed to the sea and low-elevation areas, the coastal region) and High Land (interior from the sea and high-elevation areas, the Western Ghats). The time series at each grid point were examined using the non-parametric Mann-Kendall test and the Theil-Sen estimator to determine the nature of the trend and the magnitude of the trend slope in rainfall. The Pettitt-Mann-Whitney test was applied to detect the most probable change point in the trends over the time period. Results revealed notable monotonic trends in daily precipitation for each grid point of the time series. In general, regional cluster analysis found an increasing precipitation trend in the shoreline region and a decreasing trend in the Western Ghats in recent years. The spatial distribution of rainfall can be partly explained by the heterogeneity in temporal rainfall trends identified by the change point analysis. The Mann-Kendall test shows significant variation, with weaker rainfall in the distribution over the eastern parts of the Western Ghats region of Karnataka.
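The two trend statistics named above can be sketched in a few lines; the rainfall values below are hypothetical, not IMD data:

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: positive for an increasing monotonic trend,
    negative for a decreasing one."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s

def theil_sen_slope(series):
    """Median of all pairwise slopes -- a robust trend magnitude."""
    slopes = sorted((series[j] - series[i]) / (j - i)
                    for i in range(len(series) - 1)
                    for j in range(i + 1, len(series)))
    mid = len(slopes) // 2
    return slopes[mid] if len(slopes) % 2 else (slopes[mid - 1] + slopes[mid]) / 2

rain = [100, 104, 103, 110, 115, 113, 120]  # hypothetical annual totals
print(mann_kendall_s(rain))   # positive -> increasing trend
print(theil_sen_slope(rain))  # positive median slope
```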

Keywords: change point analysis, coastal region India, gridded rainfall data, non-parametric

Procedia PDF Downloads 281
21550 Times2D: A Time-Frequency Method for Time Series Forecasting

Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan

Abstract:

Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduces significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as the autoregressive integrated moving average (ARIMA) and exponential smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with the more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, have been widely adopted for modeling sequential data. However, they often struggle to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle to capture relationships between distant time points due to the locality of one-dimensional convolution kernels.
Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that parallelly integrates 2D spectrogram and derivative heatmap techniques. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the utilization of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021) under the same modeling conditions. The initial results demonstrated that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
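The time-domain half of the idea, surfacing sharp fluctuations and turning points from derivatives, can be illustrated with a first-difference sign test; this is a minimal sketch of the concept, not the Times2D implementation:

```python
def turning_points(series):
    """Indices where the first difference changes sign -- the sharp
    direction changes that a derivative-based representation highlights."""
    idx = []
    for i in range(1, len(series) - 1):
        left = series[i] - series[i - 1]
        right = series[i + 1] - series[i]
        if left * right < 0:  # slope flips sign at i
            idx.append(i)
    return idx

print(turning_points([1, 3, 2, 4, 6, 5]))  # [1, 2, 4]
```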

Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation

Procedia PDF Downloads 25
21549 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 86
21548 X-Corner Detection for Camera Calibration Using Saddle Points

Authors: Abdulrahman S. Alturki, John S. Loomis

Abstract:

This paper discusses a corner detection algorithm for camera calibration. Calibration is a necessary step in many computer vision and image processing applications. Robust corner detection for an image of a checkerboard is required to determine intrinsic and extrinsic parameters. In this paper, an algorithm for fully automatic and robust X-corner detection is presented. Checkerboard corner points are automatically found in each image without user interaction or any prior information regarding the number of rows or columns. The approach represents each X-corner with a quadratic fitting function. Using the fact that the X-corners are saddle points, the coefficients in the fitting function are used to identify each corner location. The automation of this process greatly simplifies calibration. Our method is robust against noise and different camera orientations. Experimental analysis shows the accuracy of our method using actual images acquired at different camera locations and orientations.
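The saddle-point test behind the detector can be sketched from a quadratic fit f(x, y) = ax^2 + bxy + cy^2 + dx + ey + g: the Hessian determinant 4ac - b^2 is negative at a saddle, and setting the gradient to zero yields the sub-pixel corner location. The coefficients below are illustrative, not fitted from an actual checkerboard image:

```python
def classify_quadratic(a, b, c):
    """Classify the critical point of f = a*x^2 + b*x*y + c*y^2 + ...
    via the Hessian determinant 4ac - b^2 (negative -> saddle, i.e. X-corner)."""
    det = 4 * a * c - b * b
    if det < 0:
        return "saddle"
    return "extremum" if det > 0 else "degenerate"

def critical_point(a, b, c, d, e):
    """Solve grad f = 0, i.e. [2a b; b 2c][x y]^T = [-d -e]^T, by Cramer's rule."""
    det = 4 * a * c - b * b
    x = (-d * 2 * c + e * b) / det
    y = (-e * 2 * a + d * b) / det
    return x, y

# f = x*y has a saddle at the origin -- the ideal X-corner profile.
print(classify_quadratic(0, 1, 0))  # saddle
```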

Keywords: camera calibration, corner detector, edge detector, saddle points

Procedia PDF Downloads 392
21547 An Efficient Algorithm of Time Step Control for Error Correction Method

Authors: Youngji Lee, Yonghyeon Jeon, Sunyoung Bu, Philsu Kim

Abstract:

The aim of this paper is to construct an algorithm of time step control for the error correction method most recently developed by one of the authors for solving stiff initial value problems. It is achieved with the generalized Chebyshev polynomial and the corresponding error correction method. The main idea of the proposed scheme is the use of duplicated node points in the generalized Chebyshev polynomials of two different degrees, adding only the necessary sample points instead of re-sampling all points. At each integration step, the proposed method comprises two equations, for the solution and the error, respectively. The constructed algorithm controls both the error and the time step size simultaneously and offers good computational performance compared to the original method. Two stiff problems are numerically solved to assess the effectiveness of the proposed scheme.
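The duplicated-node idea can be illustrated with Chebyshev-Gauss-Lobatto points: the degree-n nodes are a subset of the degree-2n nodes, so refining the degree requires sampling only the new points. This is a sketch of the nesting property, not the authors' exact scheme:

```python
import math

def chebyshev_lobatto(n):
    """Chebyshev-Gauss-Lobatto nodes cos(k*pi/n), k = 0..n, on [-1, 1]."""
    return [math.cos(k * math.pi / n) for k in range(n + 1)]

n = 4
coarse = chebyshev_lobatto(n)      # degree-n node set
fine = chebyshev_lobatto(2 * n)    # degree-2n node set
# Every coarse node reappears in the fine set (fine[2k] == coarse[k]),
# so only the n new odd-index nodes need fresh samples.
nested = all(abs(fine[2 * k] - coarse[k]) < 1e-12 for k in range(n + 1))
print(nested)  # True
```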

Keywords: stiff initial value problem, error correction method, generalized Chebyshev polynomial, node points

Procedia PDF Downloads 552
21546 The Impact of Covid-19 on Anxiety Levels in the General Population of the United States: An Exploratory Survey

Authors: Amro Matyori, Fatimah Sherbeny, Askal Ali, Olayiwola Popoola

Abstract:

Objectives: The study evaluated the impact of COVID-19 on anxiety levels in the general population of the United States. Methods: The study used an online questionnaire that adopted the Generalized Anxiety Disorder Assessment (GAD-7) instrument, a self-administered scale with seven items used as a screening tool and severity measure for generalized anxiety disorder. The participants rated the frequency of anxiety symptoms in the last two weeks on a Likert scale ranging from 0-3; the item points are then summed to provide the total score. Results: Thirty-two participants completed the questionnaire, among them 24 (83%) females and 5 (17%) males. The 18-24-year-old age range represented the most respondents. Only one of the participants tested positive for COVID-19, and for 39% of them, one of their family members, friends, or colleagues tested positive for the coronavirus. Moreover, 10% had lost a family member, a close friend, or a colleague because of COVID-19. Among the respondents, ten scored approximately five points on the GAD-7 scale, which indicates mild anxiety. Furthermore, eight participants scored 10 to 14 points, placing them in the moderate anxiety category, and one individual scored 15 points, indicating severe anxiety. Conclusions: Most of the respondents scored points that placed them in the mild anxiety category during the COVID-19 pandemic. Severe anxiety was the least common among the participants, and people who tested positive and/or whose family members, close friends, or colleagues tested positive were more likely to experience anxiety. Additionally, participants who lost friends or family members were also at high risk of anxiety. The outcomes of COVID-19 and excessive rumination about the pandemic put people under stress, which led to anxiety. 
Therefore, continuous assessment and monitoring of psychological outcomes during pandemics will help to establish early, well-informed interventions.
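The GAD-7 scoring and severity banding described above can be sketched directly; the ratings below are hypothetical:

```python
def gad7_score(item_ratings):
    """Sum of the seven GAD-7 item ratings (each 0-3)."""
    assert len(item_ratings) == 7 and all(0 <= r <= 3 for r in item_ratings)
    return sum(item_ratings)

def gad7_severity(score):
    """Conventional GAD-7 severity bands (5 mild, 10 moderate, 15 severe)."""
    if score >= 15:
        return "severe"
    if score >= 10:
        return "moderate"
    if score >= 5:
        return "mild"
    return "minimal"

ratings = [1, 1, 0, 1, 1, 0, 1]  # hypothetical respondent
print(gad7_severity(gad7_score(ratings)))  # mild
```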

Keywords: anxiety and covid-19, covid-19 and mental health outcomes, influence of covid-19 on anxiety, population and covid-19 impact on mental health

Procedia PDF Downloads 192
21545 A Robust Digital Image Watermarking Against Geometrical Attack Based on Hybrid Scheme

Authors: M. Samadzadeh Mahabadi, J. Shanbehzadeh

Abstract:

This paper presents a hybrid digital image-watermarking scheme that is robust against a variety of attacks and geometric distortions. The image content is represented by important feature points obtained by an image-texture-based adaptive Harris corner detector. These feature points are extracted from the LL2 subband of the 2-D discrete wavelet transform using the Harris-Laplacian detector. We calculate the Fourier transform of circular regions around these points; the amplitude of this transform is rotation invariant. The experimental results demonstrate the robustness of the proposed method against geometric distortions and various common image processing operations such as JPEG compression, colour reduction, Gaussian filtering, median filtering, and rotation.
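The rotation invariance of the Fourier amplitude can be illustrated in one dimension: rotating the image circularly shifts the samples taken around a circle, which changes only the phase, not the magnitude, of the DFT. The values below are a toy example, not data from the paper:

```python
import cmath

def dft_magnitudes(samples):
    """Magnitudes of the discrete Fourier transform of a 1-D sequence."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

ring = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]  # intensities around a circle
rotated = ring[3:] + ring[:3]                     # image rotation = circular shift

mags = dft_magnitudes(ring)
mags_rot = dft_magnitudes(rotated)
print(all(abs(a - b) < 1e-9 for a, b in zip(mags, mags_rot)))  # True
```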

Keywords: digital watermarking, geometric distortions, geometrical attack, Harris Laplace, important feature points, rotation, scale invariant feature

Procedia PDF Downloads 487
21544 Disclosure Extension of Oil and Gas Reserve Quantum

Authors: Ali Alsawayeh, Ibrahim Eldanfour

Abstract:

This paper examines the extent of disclosure of oil and gas reserve quantum in the annual reports of international oil and gas exploration and production companies, particularly companies in untested international markets such as Canada, the UK, and the US, and seeks to determine the underlying factors that affect the level of disclosure on oil reserve quantum. The study is concerned with the usefulness of disclosure of oil and gas reserve quantum to investors and other users. Given the primacy of the annual report (10-K) as a source of supplemental reserves data about the company and as the channel through which companies disseminate information about their performance, the annual reports for one year (2009) were the central focus of the study. This comparative study seeks to establish whether differences exist between the sample companies, based on new disclosure requirements by the Securities and Exchange Commission (SEC) in respect of reserves classification and definition. The extent of disclosure of reserves is provided and compared among the selected companies. Statistical analysis is performed to determine whether any differences exist in the extent of disclosure of reserves under the determinant variables. This study shows that some factors affect the extent of disclosure of reserve quantum in the above-mentioned countries, namely company size, leverage, and auditor quality. Companies that disclose reserve quantum in detail tend to be larger. The findings also show that the level of leverage affects companies' reserve quantum disclosure. Indeed, companies that provide detailed reserve quantum disclosure tend to employ a 'high-quality auditor'. In addition, the study found a significant independent variable, Profit Sharing Contracts (PSCs). This factor could explain variations in the level of disclosure of oil reserve quantum between the contractor and host governments. 
The implementation of the SEC oil and gas reporting requirements does not enhance companies' valuation because the new rules are based only on past and present reserves information (proven reserves); hence, future valuation of oil and gas companies is missing for the market.

Keywords: comparison, company characteristics, disclosure, reserve quantum, regulation

Procedia PDF Downloads 390
21543 Variogram Fitting Based on the Wilcoxon Norm

Authors: Hazem Al-Mofleh, John Daniels, Joseph McKean

Abstract:

Within geostatistics research, effective estimation of the variogram points has been examined, particularly in developing robust alternatives. The parametric fit of these variogram points, which ultimately defines the kriging weights, however, has not received the same attention from a robust perspective. This paper proposes the use of the non-linear Wilcoxon norm over weighted non-linear least squares as a robust variogram fitting alternative. First, we introduce the concept of variogram estimation and fitting. Then, as an alternative to non-linear weighted least squares, we discuss the non-linear Wilcoxon estimator. Next, the robustness properties of the non-linear Wilcoxon are demonstrated using a contaminated spatial data set. Finally, under simulated conditions, spatial processes with increasing levels of contamination have their variogram points estimated and fitted. In the fitting of these variogram points, both non-linear weighted least squares and non-linear Wilcoxon fits are examined for efficiency. At all levels of contamination (including 0%), using a robust estimation and robust fitting procedure, the unweighted non-linear Wilcoxon outperforms weighted least squares.
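
As background for the fitting comparison, the classical (Matheron) empirical variogram estimator that produces the variogram points can be sketched in a few lines. The toy data, bin width and function name below are illustrative assumptions, not taken from the paper:

```python
import math

def empirical_variogram(points, values, bin_width, n_bins):
    """Classical (Matheron) estimator: for each distance bin,
    gamma(h) = mean of 0.5 * (z_i - z_j)^2 over pairs at lag ~h."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(points[i], points[j])
            b = int(d // bin_width)
            if b < n_bins:
                sums[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

# Toy 1-D transect: the value trends upward with position,
# so semivariance should grow with the lag distance.
pts = [(float(x), 0.0) for x in range(10)]
vals = [0.0, 1.0, 0.5, 1.5, 1.0, 2.0, 1.5, 2.5, 2.0, 3.0]
gamma = empirical_variogram(pts, vals, bin_width=2.0, n_bins=4)
```

A parametric model (e.g. spherical or exponential) would then be fitted to these points; the paper's contribution is to minimize the Wilcoxon norm of the fitting residuals rather than their weighted sum of squares.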

Keywords: non-linear wilcoxon, robust estimation, variogram estimation, wilcoxon norm

Procedia PDF Downloads 439
21542 Digitalization of Functional Safety - Increasing Productivity while Reducing Risks

Authors: Michael Scott, Phil Jarrell

Abstract:

Digitalization seems to be everywhere these days. So if one were to digitalize Functional Safety, what would that require?
• Ability to directly use data from intelligent P&IDs / process design in a PHA / LOPA
• Ability to directly use data from intelligent P&IDs in the SIS design to support SIL verification calculations, SRS, C&Es, and functional test plans
• Ability to create unit operation / SIF libraries to radically reduce engineering man-hours while ensuring consistency and improving the quality of SIS designs
• Ability to link data directly from a PHA / LOPA to SIS designs
• Ability to leverage reliability models and SRS details from SIS designs to automatically program the safety PLC
• Ability to leverage SIS test plans to automatically create safety PLC application logic test plans for a virtual FAT
• Ability to tie real-time data from process historians / CMMS to assumptions in the PHA / LOPA and SIS designs to generate leading indicators of protection layer health
• Ability to flag SIS bad actors for proactive corrective action prior to a near miss or loss-of-containment event
What if I told you all of this was available today? This paper will highlight how digitalization has transformed the way Safety Instrumented Systems are designed, configured, operated and maintained.

Keywords: IEC 61511, safety instrumented systems, functional safety, digitalization, IIoT

Procedia PDF Downloads 155
21541 Modelling Impacts of Global Financial Crises on Stock Volatility of Nigeria Banks

Authors: Maruf Ariyo Raheem, Patrick Oseloka Ezepue

Abstract:

This research aimed at determining the most appropriate heteroskedastic model for predicting the volatility of 10 major Nigerian banks: Access, United Bank for Africa (UBA), Guaranty Trust, Skye, Diamond, Fidelity, Sterling, Union, ETI and Zenith, using daily closing stock prices of each bank from 2004 to 2014. The models employed include ARCH(1), GARCH(1, 1), EGARCH(1, 1) and TARCH(1, 1). The results show that all the banks' returns are highly leptokurtic, significantly skewed and thus non-normal across the four periods, except for Fidelity bank during the financial crisis; these findings are similar to those of other global markets. There is also strong evidence for the presence of heteroscedasticity, and volatility persistence during the crisis is higher than before it across the 10 banks, with UBA taking the lead at about 11 times higher during the crisis. Findings further revealed that asymmetric GARCH models became dominant especially during and after the financial crisis, when the second round of reforms was introduced into the banking industry by the Central Bank of Nigeria (CBN). Generally, one could say that Nigerian banks' returns are volatility persistent during and after the crisis, and characterised by leverage effects of negative and positive shocks during these periods.
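
The volatility persistence the paper measures corresponds to α + β in the GARCH(1, 1) recursion. A minimal simulation sketch (the parameter values and function name are illustrative assumptions, not estimates from the Nigerian data) shows how persistence close to 1 produces the volatility clustering described above:

```python
import random

def simulate_garch11(omega, alpha, beta, n, seed=0):
    """Simulate returns r_t = sigma_t * z_t with the GARCH(1, 1) recursion
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    returns, variances = [], []
    for _ in range(n):
        r = var ** 0.5 * rng.gauss(0.0, 1.0)
        returns.append(r)
        variances.append(var)
        var = omega + alpha * r * r + beta * var
    return returns, variances

# Persistence alpha + beta = 0.95, i.e. shocks to volatility die out slowly.
rets, variances = simulate_garch11(omega=0.05, alpha=0.10, beta=0.85, n=2000)
```

Asymmetric variants (EGARCH, TARCH) extend this recursion so that negative shocks raise the next-period variance more than positive shocks of the same size, which is the leverage effect the paper tests for.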

Keywords: global financial crisis, leverage effect, persistence, volatility clustering

Procedia PDF Downloads 509
21540 Volatility and Stylized Facts

Authors: Kalai Lamia, Jilani Faouzi

Abstract:

Measuring and controlling risk is one of the most attractive issues in finance. With the persistence of uncontrolled and erratic stock movements, volatility is perceived as a barometer of daily fluctuations. An objective measure of this variable thus seems needed to control risks and cover those considered the most important. Non-linear autoregressive modeling is our first evaluation approach. In particular, we test for 'persistence' of the conditional variance and for the presence of a leverage effect. To address the 'asymmetry' in volatility, the retained specifications point to the importance of stock reactions in response to news. The effects of shocks on volatility also highlight the need to study the 'long-term' behaviour of the conditional variance of stock returns and to establish the presence of long memory and long-run dependence in the time series. We note that the fractionally integrated autoregressive model can represent time series exhibiting long-term conditional variance thanks to its fractional integration parameters. To capture the dynamics that drive the time series, a comparative study of the results of the different models will allow a better understanding of the volatility structure of the Tunisian stock market, with the aim of accurately predicting fluctuation risks.

Keywords: asymmetry volatility, clustering, stylised facts, leverage effect

Procedia PDF Downloads 286
21539 Searching k-Nearest Neighbors to be Appropriate under Gaming Environments

Authors: Jae Moon Lee

Abstract:

In general, algorithms for finding continuous k-nearest neighbors have been researched for location-based services, which periodically monitor moving objects such as vehicles and mobile phones. These studies assume an environment in which the number of query points is much smaller than the number of moving objects, and in which the query points are fixed rather than moving. In gaming environments, the problem arises when computing each object's next movement with respect to its neighbors, as in flocking, crowd and robot simulations. In this case, every moving object becomes a query point, so the number of query points equals the number of moving objects, and the query points themselves are moving. In this paper, we analyze how the existing algorithms designed for location-based services perform under gaming environments.
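
The gaming setting described above, in which every agent is simultaneously a moving object and a query point, reduces in its naive form to an all-pairs search. A brute-force baseline (the function name and toy positions are illustrative, not from the paper) looks like this:

```python
import math

def knn_all_agents(positions, k):
    """Brute-force k-nearest neighbors where every moving agent is
    itself a query point, as in flocking/crowd simulation."""
    result = []
    for i, p in enumerate(positions):
        # Sort all other agents by Euclidean distance and keep the k closest.
        dists = sorted(
            (math.dist(p, q), j) for j, q in enumerate(positions) if j != i
        )
        result.append([j for _, j in dists[:k]])
    return result

agents = [(0.0, 0.0), (1.0, 0.0), (5.0, 0.0), (5.0, 1.0)]
neighbors = knn_all_agents(agents, k=2)
```

This baseline costs O(n² log n) per frame, which is exactly why the continuous-kNN algorithms from the location-based-service literature are worth re-evaluating in this setting.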

Keywords: flocking behavior, heterogeneous agents, similarity, simulation

Procedia PDF Downloads 281
21538 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network

Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan

Abstract:

Data aggregation is a helpful technique for reducing the data communication overhead in wireless sensor networks. One of the important tasks in data aggregation is the positioning of the aggregation points. A lot of work has been done on data aggregation, but the efficient positioning of the aggregation points has received much less attention. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network. They propose an algorithm to select the aggregator positions for a scenario in which aggregator nodes are more powerful than sensor nodes.
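
The paper's selection algorithm is not reproduced here, but the underlying placement criterion can be illustrated with a minimal 1-medoid-style sketch: among a set of candidate positions, pick the one minimizing the total distance to all sensor nodes (the function name, candidates and coordinates are illustrative assumptions):

```python
import math

def best_aggregator_position(candidates, sensors):
    """Pick the candidate position minimizing total Euclidean distance
    to all sensor nodes (a 1-medoid style placement criterion)."""
    def total_dist(c):
        return sum(math.dist(c, s) for s in sensors)
    return min(candidates, key=total_dist)

# Four sensors at the corners of a square; the central candidate wins.
sensors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
candidates = [(0.0, 0.0), (2.0, 2.0), (4.0, 4.0)]
best = best_aggregator_position(candidates, sensors)
```

In a real deployment the objective would also weigh the aggregators' greater energy budget and the hop count to the sink, which is where a dedicated algorithm improves on this sketch.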

Keywords: aggregation point, data communication, data aggregation, wireless sensor network

Procedia PDF Downloads 144
21537 The Effect of Physical Therapy on Triceps Surae Myofascial Trigger Point

Authors: M. Simon, O. Peillon, R. Seijas, P. Alvarez, A. Pérez-Bellmunt

Abstract:

Introduction: Myofascial trigger points (MTrPs) are defined as hyperirritable areas within taut bands of skeletal muscle and are classified as either active or latent. Although they can be present in any muscle, the triceps surae is one of the most affected muscles of the lower limb. The aim of this study was to describe which treatments are most used and their principal results. Study design: We performed a systematic literature search in Medline using strategies for the concepts of 'Trigger Points and Gastrocnemius and Soleus not Trapezius'. Articles were screened by the authors and included if they contained a rehabilitation intervention for MTrPs in healthy subjects or patients. Results: The treatments used were mostly invasive interventions, and only a small proportion of the studies used non-invasive treatments. The methodology of these treatments (time or type of intervention, characteristics of treatment, etc.) was frequently undefined. Overall, the examination variables varied significantly among the included studies, but their parameters improved when the MTrPs were treated. Conclusions: There is a wide variety of physical therapy treatments to improve the symptomatology of MTrPs affecting the triceps surae muscle. Even so, not a single study has been found analyzing how the skeletal muscle contractile parameters (such as maximal displacement or delay time) change with MTrP therapy. Future investigations should specify the treatment methodology more precisely.

Keywords: fascia, myofascial trigger points, physical therapy, triceps surae

Procedia PDF Downloads 135
21536 Modeling Discrimination against Gay People: Predictors of Homophobic Behavior against Gay Men among High School Students in Switzerland

Authors: Patrick Weber, Daniel Gredig

Abstract:

Background and Purpose: Research has well documented the impact of discrimination and micro-aggressions on the wellbeing of gay men and, especially, adolescents. For the prevention of homophobic behavior against gay adolescents, however, the focus has to shift to those who discriminate: for the design and tailoring of prevention and intervention, it is important to understand the factors responsible for homophobic behavior such as, for example, verbal abuse. Against this background, the present study aimed to assess homophobic – in terms of verbally abusive – behavior against gay people among high school students. Furthermore, it aimed to establish the predictors of the reported behavior by testing an explanatory model. This model posits that homophobic behavior is determined by negative attitudes and knowledge. These variables are supposed to be predicted by the acceptance of traditional gender roles, religiosity, orientation toward social dominance, contact with gay men, and by the perceived expectations of parents, friends and teachers. These social-cognitive variables in turn are assumed to be determined by students' gender, age, immigration background, formal school level, and the discussion of gay issues in class. Method: From August to October 2016, we visited 58 high school classes in 22 public schools in a county in Switzerland and asked the 8th and 9th year students, across three formal school levels, to participate in a survey on gender and gay issues. For data collection, we used an anonymous self-administered questionnaire filled in during class. Data were analyzed using descriptive statistics and structural equation modelling (Generalized Least Squares estimation). The sample included 897 students, 334 in the 8th and 563 in the 9th year, aged 12–17; 51.2% were female, 48.8% male, and 50.3% had an immigration background.
Results: A proportion of 85.4% of participants reported having made homophobic statements in the 12 months before the survey, 4.7% often or very often. The analysis showed that respondents' homophobic behavior was predicted directly by negative attitudes (β=0.20), as well as by the acceptance of traditional gender roles (β=0.06), religiosity (β=–0.07), contact with gay people (β=0.10), the expectations of parents (β=–0.14) and friends (β=–0.19), gender (β=–0.22), and having a South-East European or Western and Middle Asian immigration background (β=0.09). These variables were predicted, in turn, by gender, age, immigration background, formal school level, and the discussion of gay issues in class (GFI=0.995, AGFI=0.979, SRMR=0.0169, CMIN/df=1.199, p>0.213, adj. R²=0.384). Conclusion: The findings evidence a high prevalence of homophobic behavior among the responding high school students. The tested explanatory model explained 38.4% of the assessed homophobic behavior. However, the data did not fully support the model. Knowledge did not turn out to be a predictor of behavior. Except for the perceived expectations of teachers and orientation toward social dominance, the social-cognitive variables were not fully mediated by attitudes. Equally, gender and immigration background predicted homophobic behavior directly. These findings demonstrate the importance of prevention and also provide leverage points for interventions against anti-gay bias in adolescents – including in social work settings such as school social work, open youth work or foster care.

Keywords: discrimination, high school students, gay men, predictors, Switzerland

Procedia PDF Downloads 314
21535 Human Posture Estimation Based on Multiple Viewpoints

Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo

Abstract:

This study aimed to address the problem of improving the confidence of key points by fusing multi-view information, thereby estimating human posture more accurately. We first obtained multi-view image information and then used the MvP algorithm to fuse this multi-view information to obtain a set of high-confidence human key points. We used these as the input for the Spatio-Temporal Graph Convolutional Network (ST-GCN). ST-GCN is a deep learning model for processing spatio-temporal data, which can effectively capture spatio-temporal relationships in video sequences. By using the MvP algorithm to fuse multi-view information and feeding the result into the spatio-temporal graph convolution model, this study provides an effective method to improve the accuracy of human posture estimation and provides strong support for further research and application in related fields.
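
MvP itself is a transformer-based model, but the basic idea of combining the same keypoint seen from several views into one higher-confidence estimate can be illustrated with a much simpler confidence-weighted average. This is a hedged stand-in for intuition only, not the MvP algorithm; the function name and values are illustrative:

```python
def fuse_keypoints(views):
    """Fuse one keypoint observed from several views by
    confidence-weighted averaging. Each view is ((x, y), confidence)."""
    total_w = sum(w for _, w in views)
    x = sum(p[0] * w for p, w in views) / total_w
    y = sum(p[1] * w for p, w in views) / total_w
    return (x, y), total_w / len(views)

# A confident view dominates an uncertain, slightly displaced one.
views = [((10.0, 20.0), 0.9), ((12.0, 20.0), 0.1)]
fused, mean_conf = fuse_keypoints(views)
```

The fused coordinate lands close to the high-confidence observation, which is the behavior one wants before passing the keypoint sequence on to ST-GCN.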

Keywords: multi-view, pose estimation, ST-GCN, joint fusion

Procedia PDF Downloads 53
21534 Customized Design of Amorphous Solids by Generative Deep Learning

Authors: Yinghui Shang, Ziqing Zhou, Rong Han, Hang Wang, Xiaodi Liu, Yong Yang

Abstract:

The design of advanced amorphous solids, such as metallic glasses, with targeted properties through artificial intelligence signifies a paradigmatic shift in physical metallurgy and materials technology. Here, we developed a machine-learning architecture that facilitates the generation of metallic glasses with targeted multifunctional properties. Our architecture integrates a state-of-the-art unsupervised generative adversarial network model with supervised models, allowing general prior knowledge, derived from thousands of data points across a vast range of alloy compositions, to be incorporated into the creation of data points for a specific type of composition; this overcame the issue of data scarcity typically encountered in the design of a given type of metallic glass. Using our generative model, we have successfully designed copper-based metallic glasses which display exceptionally high hardness or a remarkably low modulus. Notably, our architecture can not only explore uncharted regions in the targeted compositional space but also permits self-improvement once experimentally validated data points are added to the initial dataset for subsequent cycles of data generation, hence paving the way for the customized design of amorphous solids without human intervention.

Keywords: metallic glass, artificial intelligence, mechanical property, automated generation

Procedia PDF Downloads 27
21533 Continuous-Time Convertible Lease Pricing and Firm Value

Authors: Ons Triki, Fathi Abid

Abstract:

Along with the increase in the use of leasing contracts in corporate finance, multiple studies aim to model the credit risk of the lease in order to cover the lessor's losses on the asset if the lessee goes bankrupt. In the current paper, a convertible lease contract is developed in a continuous-time stochastic setting, aiming to ensure the financial stability of the firm and to quickly recover the counterparties' losses on the lease in case of default. This work examines the term structure of lease rates, taking into account credit default risk and the capital structure of the firm. The interaction between the lessee's capital structure and the equilibrium lease rate is assessed by applying the competitive lease market argument developed by Grenadier (1996) and the endogenous structural default model set forward by Leland and Toft (1996). The cumulative probability of default is calculated by reference to Leland and Toft (1996) and Yildirim and Huan (2006). Additionally, the link between lessee credit risk and the lease rate is addressed so as to explore the impact of convertible lease financing on the term structure of the lease rate, the optimal leverage ratio, the cumulative default probability, and the optimal firm value, by applying an endogenous conversion threshold. The numerical analysis suggests that the term structure of lease rates increases with the degree of the market price of risk. The maximal firm value decreases with the optimal leverage ratio. The results indicate that the cumulative probability of default increases with the maturity of the lease contract if the volatility of the asset service flows is significant. Introducing the convertible lease contract increases the optimal firm value as a function of asset volatility for a high initial service-flow level and a conversion ratio close to 1.
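
For orientation, the cumulative default probability in structural models of the Leland–Toft type is a first-passage probability. Assuming the asset value follows a geometric Brownian motion and default occurs when it first hits a constant barrier (generic notation introduced here for illustration, not reproduced from the paper), a standard closed form is:

```latex
P(\tau \le t)
  = \Phi\!\left(\frac{b - \mu t}{\sigma\sqrt{t}}\right)
  + e^{2\mu b/\sigma^{2}}\,
    \Phi\!\left(\frac{b + \mu t}{\sigma\sqrt{t}}\right),
\qquad
b = \ln\frac{V_B}{V_0},\quad
\mu = r - \delta - \tfrac{1}{2}\sigma^{2},
```

where $\Phi$ is the standard normal CDF, $V_0$ the initial asset value, $V_B$ the default barrier, $r$ the risk-free rate, $\delta$ the payout rate and $\sigma$ the asset volatility. The paper's endogenous conversion threshold operates alongside such a default barrier in determining the lease-rate term structure.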

Keywords: convertible lease contract, lease rate, credit-risk, capital structure, default probability

Procedia PDF Downloads 64