Search results for: credit scoring
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 610

520 The Sensitivity of Credit Default Swaps Premium to Global Risk Factor: Evidence from Emerging Markets

Authors: Oguzhan Cepni, Doruk Kucuksarac, M. Hasan Yilmaz

Abstract:

Changes in the global risk appetite cause co-movement in emerging market risk premiums. However, the sensitivity of changes in the risk premium to the global risk appetite may vary across emerging markets. In this study, how the global risk appetite affects Credit Default Swap (CDS) premiums in emerging markets is analyzed using Principal Component Analysis (PCA) and rolling regressions. The PCA results indicate that the first common component derived by the PCA accounts for almost 76 percent of the common variation in CDS premiums. Additionally, the explanatory power of this first factor remains high over the sample period. However, the sensitivity to the global risk factor tends to change over time and across countries. In this regard, fixed effects panel regressions are used to identify the macroeconomic factors driving the heterogeneity across emerging markets. The panel regression results point to the significance of government debt to GDP and international reserves to GDP in explaining this sensitivity. Accordingly, countries with lower government debt and higher reserves tend to be less subject to variations in the global risk appetite.
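As an illustrative sketch (not the authors' code), the first-common-component step can be reproduced with PCA via SVD on a synthetic panel of CDS premium changes driven by a shared global factor; all data and loadings below are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical panel: 120 monthly CDS premium changes for 10 emerging markets.
# A shared global factor drives co-movement, plus country-specific noise.
global_factor = rng.normal(size=120)
panel = np.outer(global_factor, rng.uniform(0.5, 1.5, size=10))
panel += 0.3 * rng.normal(size=panel.shape)

# PCA via SVD on the standardized panel.
z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
_, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)

first_component = z @ vt[0]  # scores on the first common component
print(f"share of variance explained by PC1: {explained[0]:.2f}")
```

The first component's explained-variance share plays the role of the "almost 76 percent" figure in the abstract; the component scores would then enter the rolling regressions.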

Keywords: credit default swaps, emerging markets, principal components analysis, sovereign risk

Procedia PDF Downloads 378
519 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering

Authors: Emiel Caron

Abstract:

Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g. to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references, i.e. they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Because of the scoring, different rules can be combined to join scientific references, i.e. the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set of highly cited papers, shows on average 99% precision and 95% recall. The method is therefore accurate but careful, i.e. it favors precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g. in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
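A minimal sketch of the pipeline's core idea, under assumed rules and weights (the actual Patstat rules and scores are not published here): metadata is extracted with regular expressions, pairs are scored so that rules reinforce each other, and pairs above a threshold are merged into connected components (single linkage via union-find).

```python
import re
from itertools import combinations

# Hypothetical raw reference strings; fields, rules, and weights are illustrative.
refs = [
    "Smith J, Nature, 2001, vol 410, p 120",
    "Smith J., Nature 410 (2001) 120",
    "Doe A, Science, 1999",
]

def features(ref):
    year = re.search(r"(19|20)\d{2}", ref)
    return {
        "year": year.group() if year else None,
        "tokens": set(re.findall(r"[A-Za-z]+", ref.lower())),
    }

def pair_score(a, b):
    # Rules reinforce each other: a matching year and token overlap both add points.
    score = 0.0
    if a["year"] and a["year"] == b["year"]:
        score += 2.0
    overlap = len(a["tokens"] & b["tokens"]) / max(len(a["tokens"] | b["tokens"]), 1)
    score += 3.0 * overlap
    return score

# Single-linkage clustering as connected components over high-scoring pairs.
parent = list(range(len(refs)))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

feats = [features(r) for r in refs]
THRESHOLD = 3.0
for i, j in combinations(range(len(refs)), 2):
    if pair_score(feats[i], feats[j]) >= THRESHOLD:
        parent[find(i)] = find(j)

clusters = {}
for i in range(len(refs)):
    clusters.setdefault(find(i), []).append(i)
print(list(clusters.values()))
```

With these toy rules, the two "Smith, Nature, 2001" variants merge while the unrelated reference stays in its own cluster, mirroring the precision-over-recall behaviour described above.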

Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics

Procedia PDF Downloads 194
518 Sensitivity of Credit Default Swaps Premium to Global Risk Factor: Evidence from Emerging Markets

Authors: Oguzhan Cepni, Doruk Kucuksarac, M. Hasan Yilmaz

Abstract:

Risk premiums of emerging markets move together, depending on momentum and shifts in the global risk appetite. However, the magnitude of these changes in the risk premiums of emerging market economies may vary. In this paper, we focus on how the global risk factor affects credit default swap (CDS) premiums of emerging markets using principal component analysis (PCA) and rolling regressions. PCA results indicate that the first common component accounts for almost 76% of the common variation in CDS premiums of emerging markets. Additionally, the explanatory power of the first factor remains high over the sample period. However, the sensitivity to the global risk factor tends to change over time and across countries. In this regard, fixed effects panel regressions are employed to identify the macroeconomic factors driving the heterogeneity across emerging markets. Two main macroeconomic variables affect the sensitivity: government debt to GDP and international reserves to GDP. Countries with lower government debt and higher reserves tend to be less subject to variations in the global risk appetite.

Keywords: emerging markets, principal component analysis, credit default swaps, sovereign risk

Procedia PDF Downloads 381
517 Scoring Approach to Identify High-Risk Corridors for Winter Safety Measures in the Iranian Road Network

Authors: M. Mokhber, J. Hedayati

Abstract:

From a managerial perspective, it is important to devise an operational plan based on top priorities, given the limited resources, the diversity of measures, and the high costs needed to improve infrastructure safety. Dealing with the high-risk corridors across Iran, this study prioritized the corridors according to statistical data on accidents involving fatalities, injury, or damage over three consecutive years. In collaboration with the Iranian Police Department, data were collected and modified. Then, the prioritization criteria were specified based on expert opinion and international standards. In this study, the prioritization criteria included accident severity and accident density. Finally, the criteria were standardized and weighted (with equal weights) to score each high-risk corridor. The prioritization phase involved this scoring and weighting procedure. The high-risk corridors were divided into twelve groups of 50. The results of the data analysis for a three-year span suggest that the first three groups (150 corridors), comprising about a quarter of the Iranian road network by length, account for nearly 60% of traffic accidents. In the next step, particular roads were extracted from the abovementioned categories for winter safety measures, according to variables including weather conditions. According to the resulting ranking, 9 roads with an overall length of about 1000 km of high-risk corridors are considered priorities for safety measures.
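The standardize-then-weight step can be sketched as follows; the corridor figures and the equal 0.5/0.5 weights are illustrative assumptions, not the study's data.

```python
import numpy as np

# Hypothetical corridor data: accident severity and accident density per corridor.
severity = np.array([12.0, 30.0, 7.0, 22.0, 15.0])
density  = np.array([0.8, 1.9, 0.4, 2.5, 1.1])

def standardize(x):
    return (x - x.mean()) / x.std()

# Equal weights on the standardized criteria, as in the study.
score = 0.5 * standardize(severity) + 0.5 * standardize(density)
ranking = np.argsort(score)[::-1]  # highest-risk corridors first
print(ranking)
```

The resulting ranking is what would then be cut into priority groups for winter safety measures.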

Keywords: high-risk corridors, HRCs, road safety rating, road scoring, winter safety measures

Procedia PDF Downloads 178
516 The Role of Microfinance in Economic Development

Authors: Babak Salekmahdy

Abstract:

Microfinance is often seen as a means of repairing credit markets and unleashing the potential contribution of impoverished people who rely on self-employment. Since the 1990s, the microfinance industry has expanded rapidly, opening the path for additional kinds of social entrepreneurship and social investment. However, current data indicate relatively small average effects on consumers, prompting pushback against microfinance. This research reconsiders the claims made for microfinance, stressing the variety of evidence on impacts and the essential (but limited) role of reimbursements. The report concludes by describing a shift in thinking: from microfinance as strictly defined enterprise finance to microfinance as more widely defined household finance. Under this perspective, microfinance provides advantages by supplying liquidity for various needs rather than just by increasing income.

Keywords: microfinance, small business, economic development, credit markets

Procedia PDF Downloads 82
515 The Role of Macroeconomic Condition and Volatility in Credit Risk: An Empirical Analysis of Credit Default Swap Index Spread on Structural Models in U.S. Market during Post-Crisis Period

Authors: Xu Wang

Abstract:

This research builds linear regressions of the investment grade and high yield Credit Default Swap index spreads on U.S. macroeconomic condition and volatility measures, using monthly data from March 2009 to July 2016, to study the relationship between different dimensions of the macroeconomy and overall credit risk quality. The most significant contribution of this research is systematically examining the individual and joint effects of macroeconomic condition and volatility on CDX spreads by including macroeconomic time series that capture different dimensions of the U.S. economy. Industrial production index growth, non-farm payroll growth, consumer price index growth, the 3-month treasury rate, and consumer sentiment are introduced to capture the condition of real economic activity, employment, inflation, monetary policy, and risk aversion, respectively. The conditional variance of each macroeconomic series is constructed using an ARMA-GARCH model and is used to measure macroeconomic volatility. A linear regression model is estimated to capture relationships between monthly average CDX spreads and the macroeconomic variables. The Newey–West estimator is used to control for autocorrelation and heteroskedasticity in the error terms. Furthermore, a sensitivity factor analysis and a standardized coefficients analysis are conducted to compare the sensitivity of CDX spreads to different macroeconomic variables and to compare the relative effects of macroeconomic condition versus macroeconomic uncertainty, respectively. This research shows that macroeconomic condition has a negative effect on the CDX spread while macroeconomic volatility has a positive effect. Macroeconomic condition and volatility variables can jointly explain more than 70% of the variation in the CDX spread. In addition, the sensitivity factor analysis shows that the CDX spread is most sensitive to the Consumer Sentiment index. Finally, the standardized coefficients analysis shows that both macroeconomic condition and volatility variables are important in determining the CDX spread, but the macroeconomic condition variables have more relative importance than the macroeconomic volatility variables. This research shows that the CDX spread reflects the individual and joint effects of macroeconomic condition and volatility, which suggests that individual investors and the government should carefully regard the CDX spread as a measure of overall credit risk, because the CDX spread is influenced by the macroeconomy. In addition, the significance of macroeconomic condition and volatility variables, such as the non-farm payroll growth rate and industrial production index growth volatility, suggests that the government should pay more attention to overall credit quality in the market when the macroeconomy is weak or volatile.
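A hedged sketch of the two building blocks, on synthetic data with assumed (not fitted) GARCH parameters: a GARCH(1,1) conditional-variance recursion as the volatility measure, and an OLS regression of a synthetic spread with a hand-rolled Newey–West (HAC) covariance.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly macro series (e.g. industrial production growth).
growth = rng.normal(0.2, 1.0, size=90)

# GARCH(1,1) conditional variance recursion with illustrative parameters:
#   sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1}
omega, alpha, beta = 0.05, 0.10, 0.85
eps = growth - growth.mean()
sigma2 = np.empty_like(eps)
sigma2[0] = eps.var()
for t in range(1, len(eps)):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]

# Regress a synthetic CDX spread on the condition and volatility measures.
spread = 100 - 5 * growth + 20 * sigma2 + rng.normal(0, 1, size=90)
X = np.column_stack([np.ones_like(growth), growth, sigma2])
beta_hat = np.linalg.lstsq(X, spread, rcond=None)[0]

# Newey-West (HAC) covariance with Bartlett weights and a small lag window.
u = spread - X @ beta_hat
XtX_inv = np.linalg.inv(X.T @ X)
L = 3
S = (X * u[:, None]).T @ (X * u[:, None])
for lag in range(1, L + 1):
    w = 1 - lag / (L + 1)
    G = (X[lag:] * u[lag:, None]).T @ (X[:-lag] * u[:-lag, None])
    S += w * (G + G.T)
cov = XtX_inv @ S @ XtX_inv
print(beta_hat, np.sqrt(np.diag(cov)))
```

The signs recovered here (negative on the condition variable, positive on volatility) match the paper's qualitative finding by construction of the synthetic spread.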

Keywords: autoregressive moving average model, credit spread puzzle, credit default swap spread, generalized autoregressive conditional heteroskedasticity model, macroeconomic conditions, macroeconomic uncertainty

Procedia PDF Downloads 167
514 Automatic Classification for the Degree of Disc Narrowing from X-Ray Images Using CNN

Authors: Kwangmin Joo

Abstract:

An automatic lumbar vertebra detection and classification method is proposed for evaluating the degree of disc narrowing. Prior to classification, deep learning based segmentation is applied to detect each individual lumbar vertebra. M-net is applied to segment the five lumbar vertebrae, and fine-tuning of the segmentation is employed to improve its accuracy. Using the features extracted in the previous step, a clustering technique, k-means, is applied to estimate the degree of disc space narrowing under a four-grade scoring system. As a preliminary study, the techniques proposed in this research could help build an automatic scoring system to diagnose the severity of disc narrowing from X-ray images.
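The grading step can be sketched with a simple 1-D k-means on a hypothetical disc-height feature; the feature values, cluster count, and the height-to-grade mapping are assumptions for illustration, not the paper's extracted features.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical disc-height ratios from segmented vertebrae:
# four underlying severity grades, from near-normal to severe narrowing.
heights = np.concatenate([
    rng.normal(1.00, 0.03, 30),   # grade 0
    rng.normal(0.75, 0.03, 30),   # grade 1
    rng.normal(0.50, 0.03, 30),   # grade 2
    rng.normal(0.25, 0.03, 30),   # grade 3
])

def kmeans_1d(x, k, iters=50):
    # Lloyd's algorithm on 1-D features, initialized at data quantiles.
    centers = np.quantile(x, np.linspace(0, 1, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return labels, centers

labels, centers = kmeans_1d(heights, k=4)
# Map clusters to grades: lower disc height -> more severe narrowing.
grade_of_cluster = np.argsort(np.argsort(-centers))
grades = grade_of_cluster[labels]
print(centers)
```

In practice the features would come from the M-net segmentation rather than a single synthetic height ratio.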

Keywords: disc space narrowing, degenerative disc disorders, deep learning based segmentation, clustering technique

Procedia PDF Downloads 125
513 The Impact of Financial Risk on Banks’ Financial Performance: A Comparative Study of Islamic Banks and Conventional Banks in Pakistan

Authors: Mohammad Yousaf Safi Mohibullah Afghan

Abstract:

This study of Islamic and conventional banks scrutinizes the impact of credit and liquidity risks on the profitability of Islamic and conventional banks operating in Pakistan. Among the banks, 4 Islamic and 18 conventional banks were selected to enrich the comparison of Islamic banks' performance with that of conventional banks. The selection of banks for the panel is based on quarterly unbalanced data ranging from the first quarter of 2007 to the last quarter of 2017. The data are collected from the banks' websites and the State Bank of Pakistan, and the delta method is used to derive the empirical results. Return on assets and return on equity are used as the main proxies for bank profitability, while the ratio of loan loss provisions to total loans and the ratio of liquid assets to total liabilities are used to measure credit and liquidity risk, respectively. Meanwhile, in line with the previous literature, other variables such as bank size, bank capital, bank branches, and bank employees are used to control for the impact of factors whose direct and indirect effects on profitability are understood. In conclusion, the study finds that credit risk affects return on assets and return on equity positively, and that there is no significant difference in terms of credit risk between Islamic and conventional banks. Similarly, liquidity risk has a significant impact on banks' profitability, though the marginal effect of liquidity risk is higher for Islamic banks than for conventional banks.

Keywords: Islamic and conventional banks, performance, return on equity, return on assets, Pakistan banking sector, profitability

Procedia PDF Downloads 164
512 Improving Detection of Illegitimate Scores and Assessment in Most Advantageous Tenders

Authors: Hao-Hsi Tseng, Hsin-Yun Lee

Abstract:

The Most Advantageous Tender (MAT) has been criticized for its susceptibility to dictatorial situations and for its handling of same-score, same-rank issues. This study applies the four criteria from Arrow's Impossibility Theorem to construct a mechanism for revealing illegitimate scores in scoring methods. While ranking methods are commonly used to mitigate problems resulting from extreme scores, they hide significant defects that adversely affect selection fairness. To address these shortcomings, this study relies mainly on the overall evaluated score method, using standardized scores plus a normal cumulative distribution function conversion to calculate the evaluation of vendor preference. This allows for free score evaluations, which reduces the influence of dictatorial behavior and avoids same-score, same-rank issues. Large-scale simulations confirm that this method outperforms currently used methods when judged against the Impossibility Theorem criteria.
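The standardized-score-plus-normal-CDF conversion can be sketched in a few lines; the raw committee scores below are hypothetical.

```python
from math import erf, sqrt
from statistics import mean, stdev

# Hypothetical raw committee scores for five vendors on one criterion.
raw = [62.0, 88.0, 75.0, 91.0, 59.0]

mu, sd = mean(raw), stdev(raw)

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Standardize, then map through the normal CDF so an extreme (possibly
# dictatorial) score is compressed toward 0 or 1 and cannot dominate.
converted = [norm_cdf((x - mu) / sd) for x in raw]
print(converted)
```

The conversion preserves the ordering of the raw scores while bounding every evaluation in (0, 1).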

Keywords: Arrow’s impossibility theorem, cumulative normal distribution function, most advantageous tender, scoring method

Procedia PDF Downloads 464
511 Callous-Unemotional Traits in Preschoolers: Distinct Associations with Empathy Subcomponents

Authors: E. Stylianopoulou, A. K. Fanti

Abstract:

Objective: Children scoring high on callous-unemotional (CU) traits exhibit a lack of empathy. More specifically, children scoring high on CU traits appear to exhibit deficits in affective empathy or deficits in other constructs. However, little is known about cognitive empathy and its relation to CU traits in preschoolers. Although empathy is measurable at a very young age, relatively little research has focused on empathy in preschoolers with CU traits compared with older children. The present study examines cognitive and affective empathy in preschoolers with CU traits. The aim was to examine the differences between cognitive and affective empathy in these individuals. Based on previous research in children with CU traits, it was hypothesized that preschoolers scoring high on CU traits would show deficits in both cognitive and affective empathy, with more deficits detected in affective than in cognitive empathy. Method: The sample consisted of 209 children, of whom 109 were male and 100 were female, between the ages of 3 and 7 (M=4.73, SD=0.71). Of those participants, only 175 completed all the items. The Inventory of Callous-Unemotional Traits was used to measure CU traits. Moreover, the Griffith Empathy Measure (GEM) Affective Scale and the Griffith Empathy Measure (GEM) Cognitive Scale were used to measure affective and cognitive empathy, respectively. Results: Linear regression was applied to examine the preceding hypotheses. The results showed a significant, moderate negative association between CU traits and empathy overall. More specifically, there was a significant, moderate negative relation between CU traits and cognitive empathy. Surprisingly, the results indicated no significant relation between CU traits and affective empathy. Conclusion: The current findings indicate that preschoolers high on CU traits show deficits in understanding others' emotions, reflecting a significant association between CU traits and cognitive empathy. However, no such relation was found between CU traits and affective empathy. The current results highlight the need to focus more on cognitive empathy in preschoolers with CU traits, a component that seems to have been underestimated until now.

Keywords: affective empathy, callous-unemotional traits, cognitive empathy, preschoolers

Procedia PDF Downloads 152
510 Bank Internal Controls and Credit Risk in Europe: A Quantitative Measurement Approach

Authors: Ellis Kofi Akwaa-Sekyi, Jordi Moreno Gené

Abstract:

Managerial actions which negatively profile banks and impair corporate reputation are addressed through effective internal control systems. Disregard for acceptable standards and procedures for granting credit has affected bank loan portfolios and could be cited as a factor in the crises in some European countries. The study intends to determine the effectiveness of internal control systems, investigate whether perceived agency problems exist on the part of board members, and establish the relationship between internal controls and credit risk among listed banks in the European Union. Drawing theoretical support from the behavioural compliance and agency theories, about seventeen internal control variables (drawn from the revised COSO framework) as well as bank-specific, country, stock market, and macroeconomic variables will be involved in the study. A purely quantitative approach will be employed to model internal control variables covering the control environment, risk management, control activities, information and communication, and monitoring. Panel data from 2005-2014 on listed banks from 28 European Union countries will be used for the study. Hypotheses will be tested, and Generalized Least Squares (GLS) regression will be run to establish the relationship between the dependent and independent variables. The Hausman test will be used to decide whether a random- or fixed-effects model is appropriate. It is expected that listed banks will have sound internal control systems, but their effectiveness cannot be confirmed. A perceived agency problem on the part of the board of directors is expected to be confirmed. The study expects a significant effect of internal controls on credit risk, and will uncover another perspective on internal controls as not only an operational risk issue but a credit risk issue too. Banks will be made aware that maintaining effective internal control systems is an ethical and socially responsible act, since the collapse of financial institutions as a result of excessive default is a major source of contagion. This study deviates from the usual primary data approach to measuring internal control variables and instead models internal control variables quantitatively for the panel data. Thus a grey area in approaching the revised COSO framework for internal controls is opened for further research. Most bank failures and crises could be averted if effective internal control systems were strictly adhered to.

Keywords: agency theory, credit risk, internal controls, revised COSO framework

Procedia PDF Downloads 316
509 Risk, Capital Buffers, and Bank Lending: The Adjustment of Euro Area Banks

Authors: Laurent Maurin, Mervi Toivanen

Abstract:

This paper estimates euro area banks' internal target capital ratios and investigates whether banks' adjustment to these targets had an impact on credit supply and securities holdings during the financial crisis of 2005-2011. Using data on listed banks and country-specific macro-variables, a partial adjustment model is estimated in a panel context. The results indicate, firstly, that an increase in the riskiness of banks' balance sheets has a positive influence on the target capital ratios. Secondly, the adjustment towards higher equilibrium capital ratios has a significant impact on banks' assets. The impact is found to be more sizeable on security holdings than on loans, thereby suggesting a pecking order.
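The mechanics of a partial adjustment model can be illustrated with a tiny simulation; the adjustment speed, target, and starting ratio below are assumed values, not estimates from the paper.

```python
import numpy as np

# Partial adjustment toward an internal target capital ratio k*:
#   k_t = k_{t-1} + lam * (k_star - k_{t-1})
# lam is the (assumed) adjustment speed per period.
lam, k_star, k0 = 0.25, 0.12, 0.08

k = [k0]
for _ in range(20):
    k.append(k[-1] + lam * (k_star - k[-1]))
k = np.array(k)

# The gap to target decays geometrically: gap_t = (1 - lam)^t * gap_0.
print(k[-1])
```

In the paper this recursion is embedded in a panel estimation, so lam and k_star are recovered from the data rather than assumed.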

Keywords: Euro area, capital ratios, credit supply, partial adjustment model

Procedia PDF Downloads 448
508 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios

Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu

Abstract:

Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is, in fact, more efficient to calculate the transform of the distribution function in the Fourier domain; inverting back to the real domain can then be done in just one step and semi-analytically, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills a niche in the literature, to the best of our knowledge, of accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are tested to be significantly superior to Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, primarily driven by the number of factors instead of the number of obligors, as in the case of Monte Carlo simulation.
The limitation of this method lies in the "curse of dimension" that is intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential application of this method has a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even to other risk types than credit risk.
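A minimal sketch of the COS inversion step, using the characteristic function of a standard normal as a stand-in for the (conditional) characteristic function of the portfolio loss; the truncation range and number of terms are illustrative choices.

```python
import numpy as np

a, b, N = -10.0, 10.0, 160
k = np.arange(N)
u = k * np.pi / (b - a)

# Characteristic function of a standard normal stands in for the
# characteristic function of the portfolio loss in the copula model.
phi = np.exp(-0.5 * u**2)

# COS coefficients; the k = 0 term is halved.
A = (2.0 / (b - a)) * np.real(phi * np.exp(-1j * u * a))
A[0] *= 0.5

def density(x):
    return np.sum(A * np.cos(u * (x - a)))

def cdf(x):
    # Integrate the cosine expansion term by term from a to x.
    terms = np.empty(N)
    terms[0] = A[0] * (x - a)
    terms[1:] = A[1:] * np.sin(u[1:] * (x - a)) / u[1:]
    return np.sum(terms)

print(density(0.0), cdf(0.0))  # ~0.3989, ~0.5
```

Once the CDF is recovered this way, Value-at-Risk and Expected Shortfall follow by inverting or integrating it, which is the semi-analytic one-step inversion the abstract refers to.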

Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method

Procedia PDF Downloads 167
507 External Validation of Established Pre-Operative Scoring Systems in Predicting Response to Microvascular Decompression for Trigeminal Neuralgia

Authors: Kantha Siddhanth Gujjari, Shaani Singhal, Robert Andrew Danks, Adrian Praeger

Abstract:

Background: Trigeminal neuralgia (TN) is a heterogeneous pain syndrome characterised by short paroxysms of lancinating facial pain in the distribution of the trigeminal nerve, often triggered by usually innocuous stimuli. TN has a low prevalence of less than 0.1%, and 80% to 90% of cases are caused by compression of the trigeminal nerve by an adjacent artery or vein. The root entry zone of the trigeminal nerve is most sensitive to neurovascular conflict (NVC), causing dysmyelination. Whilst microvascular decompression (MVD) is an effective treatment for TN with NVC, not all patients achieve long-term pain relief. Pre-operative scoring systems by Panczykowski and Hardaway have been proposed but have not been externally validated. These pre-operative scoring systems are composite scores calculated according to the subtype of TN, the presence and degree of neurovascular conflict, and the response to medical treatment. There is discordance between neurosurgeons and radiologists in the assessment of NVC identified on pre-operative magnetic resonance imaging (MRI). To the best of our knowledge, the prognostic impact for MVD of this difference of interpretation has not previously been investigated in the form of a composite scoring system such as those suggested by Panczykowski and Hardaway. Aims: This study aims to identify prognostic factors and externally validate the scoring systems proposed by Panczykowski and Hardaway for TN. A secondary aim is to investigate the prognostic difference between a neurosurgeon's interpretation of NVC on MRI and a radiologist's. Methods: This retrospective cohort study included 95 patients who underwent de novo MVD in a single neurosurgical unit in Melbourne. Data were recorded from patients' hospital records and neurosurgeons' correspondence from perioperative clinic reviews. Patient demographics, type of TN, distribution of TN, response to carbamazepine, and neurosurgeon and radiologist interpretation of NVC on MRI were clearly described prospectively and preoperatively in the correspondence. Scoring systems published by Panczykowski et al. and Hardaway et al. were used to determine composite scores, which were compared with the recurrence of TN recorded during follow-up over 1 year. Categorical data were analysed using Pearson chi-square testing; independent numerical and nominal data were analysed with logistic regression. Results: Logistic regression showed that a Panczykowski composite score of greater than 3 points was associated with a higher likelihood of a pain-free outcome 1 year post-MVD, with an OR of 1.81 (95% CI 1.41-2.61, p=0.032). The composite score using the neurosurgeon's impression of NVC had an OR of 2.96 (95% CI 2.28-3.31, p=0.048). A Hardaway composite score of greater than 2 points was associated with a higher likelihood of a pain-free outcome 1 year post-MVD, with an OR of 3.41 (95% CI 2.58-4.37, p=0.028). The composite score using the neurosurgeon's impression of NVC had an OR of 3.96 (95% CI 3.01-4.65, p=0.042). Conclusion: Composite scores developed by Panczykowski and Hardaway were validated for the prediction of response to MVD in TN. A composite score based on the neurosurgeon's interpretation of NVC on MRI had a greater correlation with pain-free outcomes 1 year post-MVD than one based on the radiologist's.

Keywords: de novo microvascular decompression, neurovascular conflict, prognosis, trigeminal neuralgia

Procedia PDF Downloads 74
506 An Attentional Bi-Stream Sequence Learner (AttBiSeL) for Credit Card Fraud Detection

Authors: Amir Shahab Shahabi, Mohsen Hasirian

Abstract:

Modern societies, marked by expansive Internet connectivity and the rise of e-commerce, are now integrated with digital platforms at an unprecedented level. The efficiency, speed, and accessibility of e-commerce have garnered a substantial consumer base. Against this backdrop, electronic banking has undergone rapid proliferation within the realm of online activities. However, this growth has inadvertently given rise to an environment conducive to illicit activities, notably electronic payment fraud, posing a formidable challenge to the domain of electronic banking. A pivotal role in upholding the integrity of electronic commerce and business transactions is played by electronic fraud detection, particularly in the context of credit cards, which underscores the imperative of comprehensive research in this field. To this end, our study introduces an Attentional Bi-Stream Sequence Learner (AttBiSeL) framework that leverages attention mechanisms and recurrent networks. By incorporating bidirectional recurrent layers, specifically bidirectional Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) layers, the proposed model adeptly extracts past and future transaction sequences while accounting for the temporal flow of information in both directions. Moreover, the integration of an attention mechanism accentuates specific transactions to varying degrees, as manifested in the output of the recurrent networks. The effectiveness of the proposed approach in automatic credit card fraud classification is evaluated on the European Cardholders' Fraud Dataset. Empirical results validate that the hybrid architectural paradigm presented in this study yields enhanced accuracy compared to previous studies.
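The attention-over-recurrent-outputs idea can be sketched in plain numpy, with random vectors standing in for the bidirectional LSTM/GRU hidden states (the actual AttBiSeL architecture and weights are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder for bidirectional recurrent outputs over a sequence of 8
# transactions: forward and backward hidden states concatenated (dim 2*16).
T, H = 8, 16
hidden = rng.normal(size=(T, 2 * H))

# Simple dot-product attention: score each time step, softmax to weights,
# then pool the sequence into a single context vector.
w = rng.normal(size=2 * H) / np.sqrt(2 * H)
scores = hidden @ w
weights = np.exp(scores - scores.max())
weights /= weights.sum()

context = weights @ hidden  # weighted sum emphasizes certain transactions
print(weights.round(3), context.shape)
```

In the full model, the context vector would feed a classification head, and the attention weights show which transactions the network accentuates.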

Keywords: credit card fraud, deep learning, attention mechanism, recurrent neural networks

Procedia PDF Downloads 14
505 Pudhaiyal: A Maze-Based Treasure Hunt Game for Tamil Words

Authors: Aarthy Anandan, Anitha Narasimhan, Madhan Karky

Abstract:

Word-based games are popular in helping people improve their vocabulary. Games like word search and crosswords provide a smart way of increasing vocabulary skills. Word search games are fun to play but also educational, actually helping players learn a language. Finding the words in a word search puzzle helps the player remember words more easily, and it also helps in learning their spellings. In this paper, we present a tile distribution algorithm for 'Pudhaiyal', a maze-based treasure hunt game for Tamil words, which describes how words can be distributed horizontally, vertically, or diagonally in a 10 x 10 grid. Along with the tile distribution algorithm, we also present an algorithm for the scoring model of the game. The proposed game has been tested with 20,000 Tamil words.
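A minimal sketch of such a tile distribution step (not the paper's algorithm): words are placed horizontally, vertically, or diagonally in a 10 x 10 grid, with a cell reusable only when it already holds the same letter. Latin letters stand in here for Tamil graphemes.

```python
import random

SIZE = 10
DIRECTIONS = [(0, 1), (1, 0), (1, 1)]  # horizontal, vertical, diagonal

def place_words(words, seed=0):
    rng = random.Random(seed)
    grid = [["" for _ in range(SIZE)] for _ in range(SIZE)]
    for word in words:
        placed = False
        for _ in range(200):  # retry random positions/directions
            dr, dc = rng.choice(DIRECTIONS)
            r = rng.randrange(SIZE - dr * (len(word) - 1))
            c = rng.randrange(SIZE - dc * (len(word) - 1))
            cells = [(r + dr * i, c + dc * i) for i in range(len(word))]
            # A cell may be reused only if it already holds the same letter.
            if all(grid[x][y] in ("", word[i]) for i, (x, y) in enumerate(cells)):
                for i, (x, y) in enumerate(cells):
                    grid[x][y] = word[i]
                placed = True
                break
        if not placed:
            raise ValueError(f"could not place {word!r}")
    return grid

grid = place_words(["MAZE", "WORD", "GAME"])
```

For Tamil, each word would be a list of graphemes rather than a string of ASCII letters, since Tamil characters can span multiple code points.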

Keywords: Pudhaiyal, Tamil word game, word search, scoring, maze, algorithm

Procedia PDF Downloads 441
504 A Product-Specific/Unobservable Approach to Segmentation for a Value Expressive Credit Card Service

Authors: Manfred F. Maute, Olga Naumenko, Raymond T. Kong

Abstract:

Using data from a nationally representative financial panel of Canadian households, this study develops a psychographic segmentation of the customers of a value-expressive credit card service and tests for effects on relational response differences. The variety of segments elicited by agglomerative and k-means clustering and the familiar profiles of individual clusters suggest that the face validity of the psychographic segmentation was quite high. Segmentation had a significant effect on customer satisfaction and relationship depth. However, when socio-demographic characteristics such as household size and income were accounted for in the psychographic segmentation, the effect on relational response differences was magnified threefold. Implications for the segmentation of financial services markets are considered.

Keywords: customer satisfaction, financial services, psychographics, response differences, segmentation

Procedia PDF Downloads 334
503 Early Warning System of Financial Distress Based On Credit Cycle Index

Authors: Bi-Huei Tsai

Abstract:

Previous studies of financial distress prediction adopt the conventional failing/non-failing dichotomy; however, the extent of distress differs substantially across financial distress events. To address this, “non-distressed”, “slightly-distressed” and “reorganization and bankruptcy” are used in our article to approximate the continuum of corporate financial health. This paper explains different financial distress events using a two-stage method. First, this investigation adopts firm-specific financial ratios, corporate governance and market factors to measure the probability of various financial distress events based on multinomial logit models. Specifically, bootstrapping simulation is performed to examine differences in estimated misclassification cost (EMC). Second, this work applies macroeconomic factors to establish a credit cycle index and determines the distressed cut-off indicator of the two-stage models using this index. Two different models, a one-stage and a two-stage prediction model, are developed to forecast financial distress, and the results acquired from the models are compared with each other and with the collected data. The findings show that the two-stage model incorporating financial ratios, corporate governance and market factors has the lowest misclassification error rate. The two-stage model is more accurate than the one-stage model because its distressed cut-off indicators are adjusted according to the macroeconomic-based credit cycle index.
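A multinomial logit model of this kind assigns each firm a probability for every distress state via a softmax over state-specific linear predictors. The sketch below uses invented coefficients and two invented predictors (leverage and a governance score); it illustrates the probability step only, not the estimation or the credit-cycle adjustment.

```python
# Hedged sketch of the multinomial-logit probability step for three distress
# states. Coefficients and predictors are invented for illustration.
import math

STATES = ["non-distressed", "slightly-distressed",
          "reorganization and bankruptcy"]

def softmax(zs):
    m = max(zs)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

def distress_probs(leverage, governance, betas):
    # one linear predictor per state; the first state is the baseline (z = 0)
    zs = [b0 + b1 * leverage + b2 * governance for b0, b1, b2 in betas]
    return dict(zip(STATES, softmax(zs)))

betas = [(0.0, 0.0, 0.0),    # baseline: non-distressed
         (-2.0, 3.0, -1.0),  # slightly-distressed
         (-4.0, 5.0, -2.0)]  # reorganization and bankruptcy
probs = distress_probs(leverage=0.8, governance=0.5, betas=betas)
```

With these toy coefficients, raising leverage shifts probability mass toward the more severe states, which is the qualitative behaviour such a model is meant to capture.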

Keywords: multinomial logit model, corporate governance, company failure, reorganization, bankruptcy

Procedia PDF Downloads 377
502 Digitalised Welfare: Systems for Both Seeing and Working with Mess

Authors: Amelia Morris, Lizzie Coles-Kemp, Will Jones

Abstract:

This paper examines how community welfare initiatives transform the way individuals use and experience an ostensibly universal welfare system. It argues that the digitalisation of welfare overlooks the complex reality of being unemployed or in low-wage work, and erects digital barriers to accessing welfare. Drawing on ethnographic research in food banks and community groups, the paper explores the ways in which Universal Credit has not abolished face-to-face support but relocated it to unofficial sites of welfare. The apparent efficiency and simplicity of the state’s digital welfare apparatus, therefore, is produced not by reducing the ‘messiness’ of welfare, but by rendering it invisible within the digital framework. Based on the analysis of the study’s data, this paper recommends three principles of service design that would render this messiness visible to the state.

Keywords: welfare, digitalisation, food bank, Universal Credit

Procedia PDF Downloads 152
501 Karyotyping the Date Palm (Phoenix dactylifera L.)

Authors: Abdullah M. Alzahrani

Abstract:

The karyotypes of the Khalas (KH), Sukkary (SK), Sheeshi (SS), Shibeebi (SB) and Sillije (SJ) date palm cultivars were investigated. Data showed no variation in chromosome number, 2n = 36: 34 autosomes plus XX in females and XY in males. Mean autosome length ranged from 3.85 to 9.93 μm, while the X and Y chromosomes measured 3.71 and 2.73 μm, respectively. The formula of the female date palm karyotype was 8m + 4sm + 2st + 4t, with a submedian Y chromosome. Relative chromosome length ranged from 3.3 to 9.38. The SS cultivar showed high asymmetry, scoring low values of Syi (45.51) and TF (42.8) and high values of A1 (0.53), A (0.41) and AI (0.29). Syi showed an inverse relation with A1 and A, while A exhibited a direct correlation with A1. Cultivars SK, SB and SJ scored intermediate values of Syi, A1, AI and A. The KH cultivar exhibited high symmetry, scoring the highest values of Syi (53.68) and TF (51.81) and the lowest values of A1 (0.44), A (0.34) and AI (0.18). The highest DI value was obtained in the SB cultivar (1.34), followed by SJ (1.15), while low DI scores of 0.99, 0.86 and 0.71 were detected in KH, SS and SK, respectively. Stebbins' classification assorted SS as 3B and the other cultivars as 2B, confirming the more derived, asymmetric karyotype of SS compared to the others. The scatter diagram of the Syi-A1 pair proved the most sensitive in depicting karyotype interrelationships, followed by the AI-A and CVCL-CVCI pairs.
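For readers unfamiliar with the indices, the sketch below shows how Syi, TF% and A1 are conventionally computed from chromosome arm lengths (Syi as mean short arm over mean long arm x 100; TF% as summed short arms over total complement length x 100; A1 after Romero Zarco). These are standard formulas assumed here rather than taken from this abstract, and the arm lengths are invented toy values, not measurements from the five cultivars.

```python
# Sketch of three common karyotype asymmetry indices from arm lengths.
# Arm lengths (μm) below are toy values for a small illustrative complement.

def karyotype_indices(short_arms, long_arms):
    n = len(short_arms)
    # Syi: mean short-arm length over mean long-arm length, as a percentage
    syi = 100 * (sum(short_arms) / n) / (sum(long_arms) / n)
    # TF%: total short-arm length over total complement length
    tf = 100 * sum(short_arms) / (sum(short_arms) + sum(long_arms))
    # A1 (intrachromosomal asymmetry, Romero Zarco): 1 - mean(short/long)
    a1 = 1 - sum(b / B for b, B in zip(short_arms, long_arms)) / n
    return {"Syi": syi, "TF": tf, "A1": a1}

idx = karyotype_indices(short_arms=[2.0, 1.8, 1.5], long_arms=[3.0, 3.2, 3.5])
```

A perfectly symmetric complement (all arms equal) gives Syi = 100, TF = 50 and A1 = 0, which is why lower Syi/TF and higher A1 signal asymmetry, as in the SS cultivar above.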

Keywords: karyotype, date palm, Khalas, Sukkary, Sheeshi

Procedia PDF Downloads 369
500 Analysis of the Effect of Farmers’ Socio-Economic Factors on Net Farm Income of Catfish Farmers in Kwara State, Nigeria

Authors: Olanike A. Ojo, Akindele M. Ojo, Jacob H. Tsado, Ramatu U. Kutigi

Abstract:

The study analysed the effect of farmers’ socio-economic factors on the net farm income of catfish farmers in Kwara State, Nigeria. Primary data were collected from selected catfish farmers with the aid of a well-structured questionnaire, and a multistage sampling technique was used to select 102 catfish farmers in the area. The analytical techniques involved the use of descriptive statistics and multiple regression analysis. The analysis of the socio-economic characteristics of the catfish farmers reveals that 60% were male, indicating a gender imbalance in the area. The mean age of 47 years indicates that they were at an economically productive age and could contribute positively to increased catfish production in the area. The mean household size was five, as was the mean number of years of experience. The latter implies that the farmers were experienced in fishing techniques, breeding and fish culture, which would assist in generating more revenue, reducing the cost of production and eventually increasing the farmers’ profit levels. The results also revealed that stock capacity (X3), accessibility to credit (X7) and labour (X4) were the main determinants of catfish production in the area. In addition, the farmer’s sex, household size, number of ponds, distance of the farm from the market, and access to credit were the main socio-economic factors influencing the net farm income of the catfish farmers in the area. The most serious constraints militating against catfish production in the study area were a high mortality rate, an insufficient market, inadequate credit facilities/finance and inadequate skilled labour for the daily production routine. Based on the findings, it is recommended that, to reduce catfish mortality, extension agents organize training workshops on improved methods and techniques of raising catfish from juvenile to market size.
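The regression step can be illustrated with ordinary least squares. The minimal sketch below fits a single invented predictor (stock capacity) against invented income figures, whereas the study estimated a multiple regression over several socio-economic factors.

```python
# Minimal least-squares sketch relating one socio-economic factor to net
# farm income. Data are invented toy values, not the survey data.

def ols_simple(xs, ys):
    """Return (intercept, slope) minimising the sum of squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

# toy data: income rises with stock capacity
stock = [100, 200, 300, 400, 500]
income = [55, 105, 150, 205, 255]
intercept, slope = ols_simple(stock, income)
```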

Keywords: credit, income, stock, mortality

Procedia PDF Downloads 332
499 Supervised-Component-Based Generalised Linear Regression with Multiple Explanatory Blocks: THEME-SCGLR

Authors: Bry X., Trottier C., Mortier F., Cornu G., Verron T.

Abstract:

We address component-based regularization of a Multivariate Generalized Linear Model (MGLM). A set of random responses Y is assumed to depend, through a GLM, on a set X of explanatory variables, as well as on a set T of additional covariates. X is partitioned into R conceptually homogeneous blocks X1, ..., XR, viewed as explanatory themes. Variables in each Xr are assumed to be many and redundant. Thus, Generalised Linear Regression (GLR) demands regularization with respect to each Xr. By contrast, variables in T are assumed to be selected so as to demand no regularization. Regularization is performed by searching each Xr for an appropriate number of orthogonal components that both contribute to modelling Y and capture relevant structural information in Xr. We propose a very general criterion to measure the structural relevance (SR) of a component in a block, and show how to take SR into account within a Fisher-scoring-type algorithm in order to estimate the model. We show how to deal with mixed-type explanatory variables. The method, named THEME-SCGLR, is tested on simulated data.
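The estimation loop mentioned above is of Fisher-scoring type. As a hedged illustration, the sketch below runs plain Fisher scoring for a single-predictor logistic GLM (where, with the canonical link, it coincides with Newton-Raphson) on invented data; the actual THEME-SCGLR algorithm additionally extracts structurally relevant components from each explanatory block, which is not shown here.

```python
# Illustrative Fisher-scoring iteration for a one-predictor logistic GLM.
# Data are invented; this shows only the scoring update, not SCGLR itself.
import math

def fisher_scoring_logistic(xs, ys, iters=25):
    b0 = b1 = 0.0
    for _ in range(iters):
        # score vector U and Fisher information I for (b0, b1)
        u0 = u1 = i00 = i01 = i11 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            w = p * (1.0 - p)          # GLM weight for the logistic link
            u0 += y - p
            u1 += (y - p) * x
            i00 += w
            i01 += w * x
            i11 += w * x * x
        det = i00 * i11 - i01 * i01
        # solve the 2x2 system I * delta = U and take the scoring step
        b0 += (i11 * u0 - i01 * u1) / det
        b1 += (i00 * u1 - i01 * u0) / det
    return b0, b1

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fisher_scoring_logistic(xs, ys)
```

In SCGLR, an analogous weighted-least-squares step is interleaved with the search for structurally relevant components in each block.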

Keywords: component model, Fisher scoring algorithm, GLM, PLS regression, SCGLR, SEER, THEME

Procedia PDF Downloads 396
498 Demographic Characteristics and Factors Affecting Mortality in Pediatric Trauma Patients Who Are Admitted to Emergency Service

Authors: Latif Duran, Erdem Aydin, Ahmet Baydin, Ali Kemal Erenler, Iskender Aksoy

Abstract:

Aim: In this retrospective study, we aim to contribute to the literature by examining the demographic characteristics of pediatric trauma patients and the factors associated with mortality, and by proposing measures to reduce it. Material and Method: This study was performed by retrospectively investigating data obtained from patient files and the hospital automation registration system for pediatric trauma patients who presented to the Adult Emergency Department of the Ondokuz Mayıs University Medical Faculty between January 1, 2016, and December 31, 2016. Results: Of the 415 patients in our study, 289 were male. The median age was 11.3 years. The most common trauma mechanism was a fall from height. A statistically significant association was found between trauma mechanism and gender. The number of trauma cases increased especially in the summer months. The study showed that thoracic and abdominal trauma was associated with increased mortality. Computerized tomography was the most common diagnostic imaging modality. The presence of subarachnoid hemorrhage increased the risk of mortality 62.3-fold. Eight of the patients (1.9%) died. Scoring systems were statistically significant predictors of mortality. Conclusion: Children are vulnerable to trauma because of their unique anatomical and physiological differences compared to adult patients. Mortality and the post-traumatic healing process can be improved by rapid prehospital triage to the most appropriate trauma center, management of critical patients with scoring systems, and adherence to standard treatment protocols.

Keywords: emergency service, pediatric patients, scoring systems, trauma, age groups

Procedia PDF Downloads 197
497 The Underground Ecosystem of Credit Card Frauds

Authors: Abhinav Singh

Abstract:

Point of Sale (POS) malware has been stealing the limelight this year. It has been the elemental factor in some of the biggest breaches uncovered in the past couple of years, including: • Target: the retail giant reported close to 40 million credit card records stolen • Home Depot: the home-product retailer reported a breach of close to 50 million credit records • Kmart: the US retailer recently announced a breach of 800 thousand credit card details. In 2014 alone, there were reports of over 15 major breaches of payment systems around the globe. Memory-scraping malware infecting point of sale devices has been the lethal weapon in these attacks. This malware is capable of reading payment information from the payment device's memory before it is encrypted, and later sends the stolen details to its parent server. It can record all the critical payment information, such as the card number, security number and owner, all delivered in raw format. This talk will cover what happens after these details have been sent to the malware authors. The entire ecosystem of credit card fraud can be broadly classified into three steps: • purchase of raw details and dumps • converting them to plastic cash/cards • shopping. The focus of this talk will be on these points and how they form an organized network of cyber-crime. The first step involves buying and selling the stolen details. The key points to emphasize are: • how this raw information is sold in the underground market • the buyer and seller anatomy • building your shopping cart and preferences • the importance of reputation and vouches • customer support and replacements/refunds. But the story does not end here: at this point, the buyer only has the raw card information. How will it be converted to plastic cash?
Here the second part of this underground economy comes into the picture, wherein the raw details are converted into actual cards. There are well-organized underground services that can convert these details into plastic cards, and we will discuss this technique in detail. Finally, the last step involves shopping with the stolen cards. Cards generated from the stolen details can easily be used to swipe-and-pay for goods at different retail shops; usually these purchases are of expensive items with good resale value. Apart from using the cards at stores, there are underground services that let buyers deliver online orders to dummy addresses; once the package is received, it is forwarded to the original buyer. These services charge based on the value of the item being delivered. The underground ecosystem of credit card fraud works in a bulletproof way, involving people working in close groups and making heavy profits. This is a brief summary of what I plan to present at the talk. I have done extensive research and collected a good deal of material to present as samples, including: • a list of underground forums • credit card dumps • IRC chats among these groups • personal chats with big card sellers • an inside view of these forum owners. The talk will conclude by throwing light on how these breaches are tracked during investigation: how credit card breaches are tracked down, and what steps financial institutions can take to build an incident response around them.

Keywords: POS malware, credit card fraud, enterprise security, underground ecosystem

Procedia PDF Downloads 439
496 A Breakthrough Improvement Brought by Taxi-Calling APPs for Taxi Operation Level

Authors: Yuan-Lin Liu, Ye Li, Tian Xia

Abstract:

Taxi-calling APPs have been used widely, bringing both benefits and a variety of issues to the taxi market. Many countries do not know whether the benefits outweigh the issues. This paper establishes a comparison between a baseline scenario (2009-2012) and a taxi-calling-software usage scenario (2012-2015) to explain the impact of taxi-calling APPs. The impacts illustrated by the comparison are: 1) the supply and demand distribution is more balanced, extending from the city center to the suburbs; the availability of taxi service has improved in low-density areas, and the thin-market attribute has also improved; 2) the ratio of short-distance taxi trips decreased, long-distance service increased, mileage utilization increased, and the empty-running rate decreased; 3) the popularity of taxi-calling APPs reduced the average empty distance, cruise time, empty mileage rate and average time to load passengers, and also raised the average operating speed, improved the taxi operating level, and reduced social cost, although there are some disadvantages. This paper argues that the taxi industry and government can establish an integrated third-party credit information platform, based on credit evaluated from data on drivers' driving behaviors, to supervise the drivers. Taxi-calling APPs under fully covered supervision in the mobile Internet environment will become a new trend.

Keywords: taxi, taxi-calling APPs, credit, scenario comparison

Procedia PDF Downloads 254
495 Comparison of Two Home Sleep Monitors Designed for Self-Use

Authors: Emily Wood, James K. Westphal, Itamar Lerner

Abstract:

Background: Polysomnography (PSG) recordings are regularly used in research and clinical settings to study sleep and sleep-related disorders. Typical PSG studies are conducted in professional laboratories and performed by qualified researchers. However, the number of sleep labs worldwide is disproportionate to the increasing number of individuals with sleep disorders like sleep apnea and insomnia. Consequently, there is a growing need for cheaper yet reliable means of measuring sleep, preferably autonomously by subjects in their own homes. Over the last decade, a variety of devices for self-monitoring of sleep became available on the market; however, very few have been directly validated against PSG to demonstrate their ability to perform reliable automatic sleep scoring. Two popular mobile EEG-based systems that have published validation results, the DREEM 3 headband and the Z-Machine, have never been directly compared to each other by independent researchers. The current study aimed to compare the performance of the DREEM 3 and the Z-Machine to help investigators and clinicians decide which of these devices may be more suitable for their studies. Methods: 26 participants completed the study for credit or monetary compensation. Exclusion criteria included any history of sleep, neurological or psychiatric disorders. Eligible participants arrived at the lab in the afternoon and received the two devices. They then spent two consecutive nights monitoring their sleep at home. Participants were also asked to keep a sleep log, indicating the time they fell asleep, the time they woke up, and the number of awakenings occurring during the night. Data from both devices, including detailed sleep hypnograms in 30-second epochs (differentiating Wake, combined N1/N2, N3, and Rapid Eye Movement sleep), were extracted and aligned upon retrieval. For the analysis, the number of awakenings each night was defined as four or more consecutive wake epochs between sleep onset and termination.
Total sleep time (TST) and the number of awakenings were compared to subjects' sleep logs to measure consistency with the subjective reports. In addition, the sleep scores from each device were compared epoch by epoch to calculate the agreement between the two devices using Cohen's Kappa. All analyses were performed using Matlab 2021b and SPSS 27. Results/Conclusion: Subjects consistently reported longer times spent asleep than the time reported by each device (M = 448 minutes for sleep logs compared to M = 406 and M = 345 minutes for the DREEM and Z-Machine, respectively; both ps < 0.05). Linear correlations between the sleep log and each device were higher for the DREEM than the Z-Machine for both TST and the number of awakenings; likewise, the mean absolute bias relative to the sleep logs was higher for the Z-Machine for both TST (p < 0.001) and awakenings (p < 0.04). There was some indication that these effects were stronger for the second night than the first. Epoch-by-epoch comparisons showed that the main discrepancies between the devices were in detecting N2 and REM sleep, while N3 had high agreement. Overall, the DREEM headband seems superior for reliably scoring sleep at home.
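The epoch-by-epoch agreement computation can be sketched as Cohen's kappa over two aligned hypnograms; the stage sequences below are invented toy epochs, not recordings from either device.

```python
# Sketch of epoch-by-epoch agreement via Cohen's kappa over two aligned
# 30-second hypnograms. The stage sequences are invented toy data.
from collections import Counter

def cohens_kappa(a, b):
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    labels = set(a) | set(b)
    pe = sum(ca[l] * cb[l] for l in labels) / (n * n)   # chance agreement
    return (po - pe) / (1 - pe)

dreem    = ["W", "N2", "N2", "N3", "N3", "REM", "REM", "W"]
zmachine = ["W", "N2", "N2", "N3", "N2", "REM", "N2",  "W"]
kappa = cohens_kappa(dreem, zmachine)
```

Kappa corrects the raw percentage agreement for the agreement expected by chance given each device's stage distribution, which matters here because sleep stages are heavily imbalanced across the night.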

Keywords: DREEM, EEG, sleep monitoring, Z-Machine

Procedia PDF Downloads 107
494 Continuous-Time Convertible Lease Pricing and Firm Value

Authors: Ons Triki, Fathi Abid

Abstract:

Along with the increase in the use of leasing contracts in corporate finance, multiple studies aim to model the credit risk of the lease in order to cover the losses of the lessor of the asset if the lessee goes bankrupt. In the current research paper, a convertible lease contract is developed in a continuous-time stochastic setting, aiming to ensure the financial stability of the firm and to quickly recover the losses of the counterparties to the lease in case of default. This work examines the term structure of lease rates, taking into account credit default risk and the capital structure of the firm. The interaction between the lessee's capital structure and the equilibrium lease rate is assessed by applying the competitive lease market argument developed by Grenadier (1996) and the endogenous structural default model set forward by Leland and Toft (1996). The cumulative probability of default is calculated by reference to Leland and Toft (1996) and Yildirim and Huan (2006). Additionally, the link between lessee credit risk and the lease rate is addressed so as to explore the impact of convertible lease financing on the term structure of the lease rate, the optimal leverage ratio, the cumulative default probability, and the optimal firm value, by applying an endogenous conversion threshold. The numerical analysis suggests that the duration structure of lease rates increases with the degree of the market price of risk. The maximal value of the firm decreases with the effect of the optimal leverage ratio. The results indicate that the cumulative probability of default increases with the maturity of the lease contract if the volatility of the asset service flows is significant. Introducing the convertible lease contract increases the optimal value of the firm as a function of asset volatility for a high initial service flow level and a conversion ratio close to 1.

Keywords: convertible lease contract, lease rate, credit risk, capital structure, default probability

Procedia PDF Downloads 98
493 Analysis of Technical Efficiency and Its Determinants among Cattle Fattening Enterprises in Kebbi State, Nigeria

Authors: Gona Ayuba, Isiaka Mohammed, Kotom Mohammed Baba, Mohammed Aabubakar Maikasuwa

Abstract:

The study examined the technical efficiency, and its determinants, of cattle fattening enterprises in Kebbi State, Nigeria. Data were collected from a sample of 160 fatteners between June 2010 and June 2011 using a multistage random sampling technique. A translog stochastic frontier production function was employed for the analysis. Results show that technical efficiency indices varied from 0.74 to 0.98, with a mean of 0.90, indicating that there was no wide gap between the efficiency of the best technically efficient fatteners and that of the average fattener. The results also showed that fattening experience and herd size influenced the level of technical efficiency at the 1% level. It is recommended that credit agencies monitor the credit made available to fatteners to ensure its appropriate utilization.

Keywords: technical efficiency, determinants, cattle, fattening enterprises

Procedia PDF Downloads 451
492 The Recorded Interaction Task: A Validation Study of a New Observational Tool to Assess Mother-Infant Bonding

Authors: Hannah Edwards, Femke T. A. Buisman-Pijlman, Adrian Esterman, Craig Phillips, Sandra Orgeig, Andrea Gordon

Abstract:

Mother-infant bonding is a term which refers to the early emotional connectedness between a mother and her infant. Strong mother-infant bonding promotes higher-quality mother-infant interactions, including prolonged breastfeeding, secure attachment, and increased sensitive parenting and maternal responsiveness. Strengthening all such interactions leads to improved social behavior and emotional and cognitive development throughout childhood, adolescence and adulthood. The positive outcomes observed following strong mother-infant bonding emphasize the need to screen new mothers for disrupted mother-infant bonding, and in turn the need for a robust, valid tool to assess it. A recent scoping review conducted by the research team identified four tools to assess mother-infant bonding, all of which employed self-rating scales. Thus, whilst these tools demonstrated both adequate validity and reliability, they rely on self-reported information from the mother. As such, they may reflect a mother's perception of bonding with her infant rather than her actual behavior. Therefore, a new tool to assess mother-infant bonding has been developed. The Recorded Interaction Task (RIT) addresses shortcomings of previous tools by employing observational methods to assess bonding. The RIT focusses on the common interaction between mother and infant of changing a nappy, at the target age of 2-6 months, which is visually recorded and then later assessed. Thirteen maternal and seven infant behaviors are scored on the RIT Observation Scoring Sheet, and a final combined score of mother-infant bonding is determined. The aim of the current study was to assess the content validity and inter-rater reliability of the RIT. A panel of six experts with specialized expertise in bonding and infant behavior was consulted. Experts were provided with the RIT Observation Scoring Sheet, a visual recording of a nappy change interaction, and a feedback form.
Experts scored the mother and infant interaction on the RIT Observation Scoring Sheet and completed the feedback form, which collected their opinions on the validity of each item on the RIT Observation Scoring Sheet and of the RIT as a whole. Twelve of the 20 items on the RIT Observation Scoring Sheet were scored ‘Valid’ by all (n=6) or most (n=5) experts. Two items received a ‘Not valid’ score from one expert. The remainder of the items received a mixture of ‘Valid’ and ‘Potentially Valid’ scores. A few changes were made to the RIT Observation Scoring Sheet following expert feedback, including rewording of items for clarity and the exclusion of an item focusing on behavior deemed not relevant for the target infant age. The overall ICC for single-rater absolute agreement was 0.48 (95% CI 0.28 – 0.71). Experts’ (n=6) ratings were less consistent for infant behavior (ICC 0.27 (-0.01 – 0.82)) than for mother behavior (ICC 0.55 (0.28 – 0.80)). Whilst previous tools employ self-report methods to assess mother-infant bonding, the RIT utilizes observational methods. The current study highlights adequate content validity and moderate inter-rater reliability of the RIT, supporting its use in future research. A convergent validity study comparing the RIT against an existing tool is currently being undertaken to confirm these results.
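A single-rater absolute-agreement ICC of the kind reported above is conventionally the two-way random ICC(2,1). The sketch below assumes that flavour and uses an invented subjects-by-raters matrix, not the experts' RIT scores.

```python
# Sketch of a two-way random, absolute-agreement, single-rater ICC(2,1)
# computed from the usual mean squares. The ratings matrix (rows = subjects,
# columns = raters) is an invented toy example.

def icc2_1(ratings):
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    msr = ss_rows / (n - 1)                                      # subjects MS
    msc = ss_cols / (k - 1)                                      # raters MS
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))   # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

ratings = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3]]
icc = icc2_1(ratings)
```

Because ICC(2,1) penalises systematic rater differences (the MSC term), it is the appropriate choice when, as here, absolute agreement rather than mere consistency between experts is of interest.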

Keywords: content validity, inter-rater reliability, mother-infant bonding, observational tool, recorded interaction task

Procedia PDF Downloads 181
491 A Copula-Based Approach for the Assessment of Severity of Illness and Probability of Mortality: An Exploratory Study Applied to Intensive Care Patients

Authors: Ainura Tursunalieva, Irene Hudson

Abstract:

Continuous improvement of both the quality and safety of health care is an important goal in Australia and internationally. The intensive care unit (ICU) receives patients with a wide variety and severity of illnesses. Accurately identifying patients at risk of developing complications or dying is crucial to increasing healthcare efficiency. Thus, it is essential for clinicians and researchers to have a robust framework capable of evaluating the risk profile of a patient. ICU scoring systems provide such a framework. The Acute Physiology and Chronic Health Evaluation III and the Simplified Acute Physiology Score II are ICU scoring systems frequently used for assessing the severity of acute illness. These scoring systems collect multiple risk factors for each patient, including physiological measurements, and then render the assessment outcomes of the individual risk factors into a single numerical value. A higher score indicates a more severe patient condition. Furthermore, the Mortality Probability Model II uses logistic regression based on independent risk factors to predict a patient’s probability of mortality. An important overlooked limitation of SAPS II and MPM II is that they do not, to date, include interaction terms between a patient’s vital signs. This is a prominent oversight, as it is likely there is an interplay among vital signs: the co-existence of certain conditions may pose a greater health risk than when these conditions exist independently. One barrier to including such interaction terms in predictive models is dimensionality, as variable selection becomes difficult. We propose an innovative scoring system which takes into account the dependence structure among a patient’s vital signs, such as systolic and diastolic blood pressures, heart rate, pulse interval, and peripheral oxygen saturation. Copulas will capture the dependence among normally distributed and skewed variables, since some of the vital sign distributions are skewed. The estimated dependence parameter will then be incorporated into the traditional scoring systems to adjust the points allocated for the individual vital sign measurements. The same dependence parameter will also be used to create an alternative copula-based model for predicting a patient’s probability of mortality. The new copula-based approach will accommodate not only a patient’s trajectories of vital signs but also the joint dependence probabilities among the vital signs. We hypothesise that this approach will produce more stable assessments and lead to more time-efficient and accurate predictions. We will use two data sets: (1) 250 ICU patients admitted once to the Chui Regional Hospital (Kyrgyzstan) and (2) 37 ICU patients’ agitation-sedation profiles collected by the Hunter Medical Research Institute (Australia). Both the traditional scoring approach and our copula-based approach will be evaluated using the Brier score to indicate overall model performance, the concordance (or c) statistic to indicate discriminative ability (the area under the receiver operating characteristic (ROC) curve), and goodness-of-fit statistics for calibration. We will also report discrimination and calibration values and establish visualization of the copulas and high-dimensional regions of risk interrelating two or three vital signs in so-called higher-dimensional ROCs.
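One common way to obtain the dependence parameter for a Gaussian copula, sketched below on invented systolic/heart-rate pairs, is to estimate Kendall's tau from the paired readings and map it to the copula correlation via rho = sin(pi * tau / 2); whether the study uses this particular estimator is an assumption here.

```python
# Sketch of estimating a Gaussian-copula dependence parameter from paired
# vital-sign readings via Kendall's tau. The readings are invented toy data.
import math

def kendall_tau(xs, ys):
    n = len(xs)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (xs[i] - xs[j]) * (ys[i] - ys[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def gaussian_copula_rho(xs, ys):
    # standard rank-based inversion: rho = sin(pi * tau / 2)
    return math.sin(math.pi * kendall_tau(xs, ys) / 2)

systolic   = [110, 125, 118, 140, 132, 150]
heart_rate = [62, 75, 70, 88, 90, 95]
rho = gaussian_copula_rho(systolic, heart_rate)
```

The rank-based route is attractive here precisely because, as the abstract notes, some vital-sign distributions are skewed: Kendall's tau is invariant to monotone transformations of the margins.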

Keywords: copula, intensive care unit scoring system, ROC curves, vital sign dependence

Procedia PDF Downloads 152