Search results for: conflicting claim on credit of discovery of ridge regression
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4686

4566 Structural Equation Modeling Semiparametric in Modeling the Accuracy of Payment Time for Customers of Credit Bank in Indonesia

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

The research was conducted to apply semiparametric SEM modeling to the timeliness of credit payment. Semiparametric SEM is structural modeling that combines two approaches, parametric and nonparametric. The analysis method in this research is semiparametric SEM with the nonparametric component estimated using a truncated spline. The data in the study were obtained through questionnaires distributed to Bank X mortgage debtors and are confidential. The study used three variables: one exogenous variable, one intervening endogenous variable, and one endogenous variable. The results showed that (1) the effect of the capacity and willingness-to-pay variables on timeliness of payment is significant, (2) modeling the capacity variable on willingness to pay also produces a significant estimate, (3) the effect of the capacity variable on the timeliness-of-payment variable is not mediated by willingness to pay as an intervening variable, and (4) the R² value of 0.763 (76.3%) indicates that the model has good predictive relevance.
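
As an illustration of the nonparametric component, a degree-1 truncated spline expands a predictor into a linear term plus hinge terms at chosen knots. The sketch below is not the study's code; the knot locations are assumed purely for illustration:

```python
def truncated_linear_basis(x, knots):
    """Design row of a degree-1 truncated spline: [1, x] + [(x - k)_+ for each knot k]."""
    return [1.0, x] + [max(0.0, x - k) for k in knots]

# Example: one observation at x = 2 with assumed knots at 1 and 3.
row = truncated_linear_basis(2.0, [1.0, 3.0])  # [1.0, 2.0, 1.0, 0.0]
```

Stacking such rows over all observations yields the design matrix on which the spline coefficients of the structural model are estimated.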

Keywords: structural equation modeling semiparametric, credit bank, accuracy of payment time, willingness to pay

Procedia PDF Downloads 40
4565 New Segmentation of Piecewise Linear Regression Models Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

Piecewise linear regression models are very flexible models for modeling data. If a piecewise linear regression model is matched against the data, its parameters are generally not known. This paper studies the problem of parameter estimation for piecewise linear regression models. The method used to estimate the parameters is the Bayesian method, but the Bayes estimator cannot be found analytically. To overcome this problem, the reversible jump MCMC algorithm is proposed. The reversible jump MCMC algorithm generates a Markov chain that converges to the limit distribution of the posterior distribution of the parameters of the piecewise linear regression model. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of the piecewise linear regression model.
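
The paper's Bayesian RJ-MCMC machinery is beyond a short sketch, but the underlying segmentation problem can be illustrated with a deliberately simplified non-Bayesian baseline: an exhaustive search for a single breakpoint that minimizes the combined least-squares error of two line segments. All names and data are illustrative, and the real method additionally infers the number of segments:

```python
def fit_line(xs, ys):
    """Closed-form simple linear regression; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

def sse(xs, ys):
    """Sum of squared residuals of the best-fit line on one segment."""
    a, b = fit_line(xs, ys)
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def best_breakpoint(xs, ys, min_seg=2):
    """Exhaustive single-breakpoint search; data assumed sorted by x."""
    best = None
    for i in range(min_seg, len(xs) - min_seg + 1):
        err = sse(xs[:i], ys[:i]) + sse(xs[i:], ys[i:])
        if best is None or err < best[0]:
            best = (err, xs[i])
    return best  # (total error, breakpoint location)
```

A tent-shaped dataset with a kink at x = 3, for instance, recovers that breakpoint with zero residual error.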

Keywords: regression, piecewise, Bayesian, reversible jump MCMC

Procedia PDF Downloads 517
4564 Factors Affecting the Ultimate Compressive Strength of the Quaternary Calcarenites, North Western Desert, Egypt

Authors: M. A. Rashed, A. S. Mansour, H. Faris, W. Afify

Abstract:

The calcarenite carbonate rocks of the Quaternary ridges, which extend along the northwestern Mediterranean coastal plain of Egypt, represent an excellent model for the transformation of loose sediments into real sedimentary rocks through the different stages of meteoric diagenesis. The depositional and diagenetic fabrics of the rocks, in addition to the strata orientation, strongly affect their ultimate compressive strength (UCS) and other geotechnical properties. There is a marked increase in UCS from the first to the fourth ridge rock samples. The lowest values are related to the loosely packed, weakly cemented aragonitic ooid sediments with high porosity, besides the irregular distribution of cement, which decreases the ability of these rocks to withstand crushing under direct pressure. The high UCS values are attributed to the low porosity, the presence of micritic cement, the reduction in grain size, and the occurrence of micritization and calcretization processes. The strata orientation has a notable effect on the measured UCS. The lowest values have been recorded for samples cored in the inclined direction, whereas the highest values have been noticed in most samples cored in the directions vertical and parallel to the bedding plane. In the case of the inclined direction, the bedding planes were oriented close to the plane of maximum shear stress. The lowest and highest anisotropy values have been recorded for the first and third ridge rock samples, respectively, which may be attributed to the relative homogeneity and well-sorted grainstone of the first ridge rock samples, and the relative heterogeneity in grain and pore size distribution and degree of cementation of the third ridge rock samples, besides the abundance of shell fragments with intra-particle pore spaces, which may produce lines of weakness within the rock.

Keywords: compressive strength, anisotropy, calcarenites, Egypt

Procedia PDF Downloads 368
4563 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their different modeling methodologies. The model-driven approaches are based on crop mechanistic modeling; they describe crop growth in interaction with the environment as dynamical systems. But calibrating such a dynamical system is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical process, but it has strict requirements on the dataset. A second contribution of the paper is the comparison of this model-driven method with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (ridge and lasso regression, principal components regression and partial least squares regression) and machine learning methods (random forest, k-nearest neighbors, artificial neural networks and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate prediction capacity.
The results show that among the data-driven approaches, random forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method of calibrating the mechanistic model from easily accessible data offers several side perspectives: the mechanistic model can potentially help to underline the stresses suffered by the crop or to identify the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine the two types of approaches.
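
The evaluation protocol described above, 5-fold cross-validation scored with RMSEP and MAEP, can be sketched in a few generic lines. This is an illustration of the protocol, not the authors' pipeline:

```python
import math
import random

def k_fold_indices(n, k=5, seed=0):
    """Shuffle indices 0..n-1 and deal them into k disjoint folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def rmsep(y_true, y_pred):
    """Root mean square error of prediction."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def maep(y_true, y_pred):
    """Mean absolute error of prediction."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
```

Each fold in turn serves as the held-out test set while a model is fit on the remaining four; the two metrics are then averaged over folds.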

Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest

Procedia PDF Downloads 227
4562 Application Difference between Cox and Logistic Regression Models

Authors: Idrissa Kayijuka

Abstract:

The logistic regression and Cox regression (proportional hazards) models are at present employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and the theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used in modeling time-to-event data in which censored cases exist, whereas the logistic regression model is applicable in cases where the independent variables consist of numerical as well as nominal values and the outcome variable is binary (dichotomous). The arguments and findings of many researchers have focused on overviews of the Cox and logistic regression models and their applications in different areas. In this work, the analysis is done on secondary data from the SPSS exercise dataset on breast cancer, with a sample size of 1121 women, where the main objective is to show the difference in application between the Cox regression model and the logistic regression model based on factors that cause women to die of breast cancer. Some of the analysis, e.g., on lymph node status, was done manually, and SPSS software was used to analyze the remaining data. This study found that there is an application difference between the Cox and logistic regression models: the Cox regression model is used to analyze data that include follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic model. A similarity between the two models is that both are applicable to the prediction of the outcome of a categorical variable, i.e., a variable that can accommodate only a restricted number of categories.
In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. Both models can be applied in many other studies since they are suitable methods for analyzing data, but the Cox regression model is the more recommended.
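
The contrast between the two measures of association can be made concrete with toy counts: the odds ratio ignores follow-up time, while a rate ratio (the simplest time-aware analogue of the Cox hazard ratio) divides events by person-time. The numbers below are invented purely for illustration:

```python
def odds_ratio(exposed_events, exposed_nonevents, unexposed_events, unexposed_nonevents):
    """Odds ratio from a 2x2 table, as reported by logistic regression."""
    return (exposed_events * unexposed_nonevents) / (exposed_nonevents * unexposed_events)

def rate_ratio(exposed_events, exposed_person_time, unexposed_events, unexposed_person_time):
    """Incidence-rate ratio: events per person-time, a time-aware analogue of the hazard ratio."""
    return (exposed_events / exposed_person_time) / (unexposed_events / unexposed_person_time)

# Illustrative: 10 deaths among 100 exposed vs. 5 among 100 unexposed women.
or_est = odds_ratio(10, 90, 5, 95)        # ≈ 2.11
rr_est = rate_ratio(10, 100.0, 5, 100.0)  # 2.0 with equal follow-up time
```

With unequal follow-up time the two quantities diverge, which is exactly why the Cox model is preferred when follow-up time is recorded.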

Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio

Procedia PDF Downloads 447
4561 Study of Evaluation Model Based on Information System Success Model and Flow Theory Using Web-scale Discovery System

Authors: June-Jei Kuo, Yi-Chuan Hsieh

Abstract:

Because of the rapid growth of information technology, more and more libraries introduce new information retrieval systems to enhance users' experience, improve retrieval efficiency, and increase the applicability of library resources. Nevertheless, few studies have discussed usability from the users' perspective. The aims of this study are to understand the scenario of information retrieval system utilization and to learn why users are willing to continuously use the web-scale discovery system, in order to improve the system and promote the use of university libraries. Besides questionnaires, observations and interviews, this study employs both the Information System Success Model introduced by DeLone and McLean in 2003 and flow theory to evaluate the system quality, information quality, service quality, use, user satisfaction, flow, and continued use of the web-scale discovery system by students of National Chung Hsing University. The results are analyzed through descriptive statistics and structural equation modeling using AMOS. They reveal that in the web-scale discovery system, the user's evaluation of system quality and information quality is positively related to use and satisfaction, whereas service quality only affects user satisfaction. User satisfaction and flow show a significant impact on continued use, and user satisfaction has a significant impact on flow. According to the results of this study, maintaining the stability of the information retrieval system, improving the information content quality, and enhancing the relationship between subject librarians and students are recommended for academic libraries.
Meanwhile, improving the system user interface, minimizing the number of layers at the system level, strengthening data accuracy and relevance, modifying the sorting criteria of the data, and supporting an auto-correct function are required of the system provider. Finally, establishing better communication with librarians is recommended for all users.

Keywords: web-scale discovery system, discovery system, information system success model, flow theory, academic library

Procedia PDF Downloads 98
4560 Stock Market Prediction by Regression Model with Social Moods

Authors: Masahiro Ohmura, Koh Kakusho, Takeshi Okadome

Abstract:

This paper presents a regression model with autocorrelated errors in which the inputs are social moods obtained by analyzing the adjectives in Twitter posts using a document topic model. The regression model predicts the Dow Jones Industrial Average (DJIA) more precisely than autoregressive moving-average models.
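
A minimal way to handle autocorrelated errors in regression is the Cochrane-Orcutt procedure: fit OLS, estimate the AR(1) coefficient from the residuals, quasi-difference the data, and refit. The abstract does not specify the paper's exact estimation procedure, so the sketch below is a generic illustration only:

```python
def ols_simple(xs, ys):
    """Closed-form simple linear regression; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

def cochrane_orcutt(xs, ys):
    """One Cochrane-Orcutt step: estimate AR(1) rho from OLS residuals,
    quasi-difference the data, and refit."""
    a, b = ols_simple(xs, ys)
    e = [y - (a + b * x) for x, y in zip(xs, ys)]
    num = sum(e[t] * e[t - 1] for t in range(1, len(e)))
    den = sum(e[t - 1] ** 2 for t in range(1, len(e)))
    rho = num / den
    xs2 = [xs[t] - rho * xs[t - 1] for t in range(1, len(xs))]
    ys2 = [ys[t] - rho * ys[t - 1] for t in range(1, len(ys))]
    a2, b2 = ols_simple(xs2, ys2)
    return rho, a2 / (1.0 - rho), b2  # intercept rescaled to the original units
```

In practice the rho estimation and refit are iterated until convergence; one step suffices to show the idea.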

Keywords: stock market prediction, social moods, regression model, DJIA

Procedia PDF Downloads 542
4559 Post-harvest Handling Practices and Technologies Harnessed by Smallholder Fruit Crop Farmers in Vhembe District, Limpopo Province, South Africa

Authors: Vhahangwele Belemu, Isaac Busayo Oluwatayo

Abstract:

Post-harvest losses pose a serious challenge to smallholder fruit crop farmers, especially in the rural communities of South Africa, affecting their economic livelihoods and food security. This study investigated the post-harvest handling practices and technologies harnessed by smallholder fruit crop farmers in the Vhembe district of Limpopo province, South Africa. Data were collected from a random sample of 224 smallholder fruit crop farmers selected from the four municipalities of the district using a multistage sampling technique. The analytical tools employed include descriptive statistics and the Tobit regression model. A descriptive analysis of the farmers' socioeconomic characteristics showed that a sizeable number of these farmers are still of active working age (mean = 52 years), with more males (63.8%) than females (36.2%). The distribution of respondents by educational status revealed that only a few had no formal education (2.2%), with the majority having secondary education (48.7%). Data analysis further revealed that the prominent post-harvest technologies and handling practices harnessed by these farmers include appropriate harvesting techniques (20.5%), selling at a reduced price (19.6%), transportation consideration (18.3%), cleaning and disinfecting (17.9%), sorting and grading (16.5%), manual cleaning (15.6%) and packaging techniques (11.6%), among others. The Tobit regression analysis conducted to examine the determinants of the post-harvest technologies and handling practices harnessed showed that age, educational status, awareness of technology/handling practices, farm size, access to credit, extension contact, and membership of an association were the significant factors. The study suggests enhanced awareness creation, access to credit facilities and improved access to markets as important factors for relevant stakeholders to consider in assisting smallholder fruit crop farmers in the study area.
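
The Tobit model used above handles censored outcomes by mixing a probability mass at the censoring limit with a normal density elsewhere. The sketch below evaluates its log-likelihood for a single predictor censored below at zero; it is illustrative only and not the study's specification:

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def tobit_loglik(beta0, beta1, sigma, xs, ys, lower=0.0):
    """Log-likelihood of a Tobit model censored below at `lower`:
    censored observations contribute Phi((lower - mu)/sigma),
    uncensored ones contribute the scaled normal density."""
    ll = 0.0
    for x, y in zip(xs, ys):
        mu = beta0 + beta1 * x
        if y <= lower:
            ll += math.log(max(norm_cdf((lower - mu) / sigma), 1e-300))
        else:
            ll += math.log(norm_pdf((y - mu) / sigma) / sigma)
    return ll
```

Maximizing this function over (beta0, beta1, sigma) with a numerical optimizer yields the Tobit coefficient estimates.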

Keywords: fruit crop farmers, handling practices, post-harvest losses, smallholder, Vhembe District, South Africa

Procedia PDF Downloads 54
4558 Application of Data Mining Techniques for Tourism Knowledge Discovery

Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee

Abstract:

Five implementations of three data mining classification techniques were compared for extracting important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery from data process was used as a process model, and 10-fold cross-validation was used for testing. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models of the selected algorithms were built under different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was random forest (76%) before information gain based attribute selection, and J48 (C4.5) (75%) after selection of the attributes most relevant to the class (target) attribute. In terms of model building time, attribute selection improved the efficiency of all algorithms, with the artificial neural network (multilayer perceptron) showing the highest improvement (90%). The rules extracted from the decision tree model are presented; they reveal intricate, non-trivial knowledge/insights that would otherwise not be discovered by simple statistical analysis, despite the mediocre accuracy of the classification algorithms.
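
Information gain based attribute selection, mentioned above, ranks each attribute by how much knowing its value reduces the entropy of the class. A minimal sketch with invented discrete data:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(feature, labels):
    """Information gain of a discrete attribute with respect to the class."""
    n = len(labels)
    remainder = 0.0
    for v in set(feature):
        subset = [lab for f, lab in zip(feature, labels) if f == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(labels) - remainder

# A perfectly predictive attribute has gain equal to the class entropy;
# an uninformative one has gain zero.
```

Attributes are then sorted by this score and only the top-ranked ones are kept for model building.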

Keywords: classification algorithms, data mining, knowledge discovery, tourism

Procedia PDF Downloads 291
4557 Model-Based Software Regression Test Suite Reduction

Authors: Shiwei Deng, Yang Bao

Abstract:

In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and then the probability-driven greedy algorithm is adopted to select the minimum set of test cases from the reduced regression test suite that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
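
A plain greedy selection, shown here as a hedged simplification of the probability-driven algorithm described above, repeatedly picks the test case covering the most not-yet-covered interaction patterns. The test names and pattern IDs are invented:

```python
def greedy_reduce(test_cov):
    """Greedy set-cover reduction.
    test_cov: dict mapping test name -> set of covered pattern IDs.
    Returns a small list of tests that together cover every pattern."""
    remaining = set().union(*test_cov.values())
    selected = []
    while remaining:
        # Pick the test with the largest marginal coverage.
        best = max(test_cov, key=lambda t: len(test_cov[t] & remaining))
        if not test_cov[best] & remaining:
            break  # nothing left that any test can cover
        selected.append(best)
        remaining -= test_cov[best]
    return selected
```

The paper's variant weights this choice by probabilities derived from the dependence analysis rather than raw counts.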

Keywords: dependence analysis, EFSM model, greedy algorithm, regression test

Procedia PDF Downloads 422
4556 Conceptualizing Clashing Values in the Field of Media Ethics

Authors: Saadia Izzeldin Malik

Abstract:

Lack of ethics is the crisis of the 21st century. Today's global world is filled with economic, political, environmental, media/communication, and social crises, all generated by the eroding fabric of ethics and moral values that guide humans' decisions in all aspects of life. Our global world is guided by liberal western democratic principles and liberal capitalist economic principles that define and reinforce each other. In economic terms, capitalism has turned world economic systems into one marketplace of ideas and products controlled by big multinational corporations that not only determine the conditions and terms of commodity production and commodity exchange between countries, but also transform the political economy of media systems around the globe. The citizen (read: the consumer) today is the target of persuasion by all types of media at a time when her/his interests should, ethically and in principle, be the basic significant factor in the selection of media content. It is very important at this juncture of clashing media values, professional and commercial, and widespread ethical lapses of media organizations and media professionals, to think of a perspective that theorizes these conflicting values within a broader framework of media ethics. Thus, the aim of this paper is to, epistemologically, bring to the center a perspective on media ethics as a basis for reconciliation of the clashing values of the media. The paper focuses on conflicting ethical values in the current media debate, namely ownership of media vs. press freedom, the individual right to privacy vs. the public right to know, and global western consumerist values vs. media values. The paper concludes that a framework to reconcile the conflicting values of media ethics should focus on the individual journalist and his/her moral development, as well as on maintaining the ethical principles of the media as an institution with a primary social responsibility to the public it serves.

Keywords: ethics, media, journalism, social responsibility, conflicting values, global

Procedia PDF Downloads 483
4555 A Regional Analysis on Co-movement of Sovereign Credit Risk and Interbank Risks

Authors: Mehdi Janbaz

Abstract:

The global financial crisis and the credit crunch that followed magnified the importance of credit risk management and its crucial role in the stability of all financial sectors and of the system as a whole. Many believe that risks faced by the sovereign sector are highly interconnected with banking risks and are most likely to trigger and reinforce each other. This study aims to examine (1) the impact of banking and interbank risk factors on the sovereign credit risk of the Eurozone, and (2) how the EU Credit Default Swap spread dynamics are affected by crude oil price fluctuations. The hypotheses are tested by employing fitting risk measures and through a four-staged linear modeling approach. Sovereign senior 5-year Credit Default Swap spreads are used as the core measure of credit risk. Monthly time series of the variables used in the study are gathered from the DataStream database for the period 2008-2019. First, a linear model tests the impact of regional macroeconomic and market-based factors (STOXX, VSTOXX, oil, sovereign debt, and slope) on the CDS spread dynamics. Second, bank-specific factors, including the LIBOR-OIS spread (the difference between the Euro 3-month LIBOR rate and the Euro 3-month overnight index swap rate) and Euribor, are added to the most significant factors of the previous model. Third, global financial factors, including EUR/USD foreign exchange volatility, the TED spread (the difference between the 3-month T-bill rate and the 3-month LIBOR rate in US dollars), and the Chicago Board Options Exchange (CBOE) Crude Oil Volatility Index (OVX), are added to the major significant factors of the first two models. Finally, a model is generated from a combination of the major factors of each variable set, in addition to a crisis dummy.
The findings show that (1) the explanatory power of LIBOR-OIS on the sovereign CDS spread of the Eurozone is very significant, and (2) there is a meaningful adverse co-movement between the crude oil price and the CDS price of the Eurozone. Surprisingly, adding the TED spread alongside the LIBOR-OIS spread in the third and fourth models increased the predictive power of LIBOR-OIS. Based on the results, LIBOR-OIS, STOXX, the TED spread, slope, the oil price, OVX, FX volatility, and Euribor are the determinants of CDS spread dynamics in the Eurozone. Moreover, the positive impact of the crisis period on the creditworthiness of the Eurozone is meaningful.
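
Each stage of the linear modeling described above reduces to ordinary least squares on a set of regressors. A dependency-free sketch via the normal equations, with invented numbers standing in for the CDS spread and two of its factors:

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

def ols_fit(X, y):
    """OLS via the normal equations; each row of X starts with 1.0 (intercept)."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    return solve_linear(XtX, Xty)
```

In the staged procedure, the significant coefficients of one fit determine which regressors carry over to the next stage.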

Keywords: CDS, crude oil, interbank risk, LIBOR-OIS, OVX, sovereign credit risk, TED

Procedia PDF Downloads 140
4554 Segmentation of Piecewise Polynomial Regression Model by Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

The piecewise polynomial regression model is a very flexible model for modeling data. If a piecewise polynomial regression model is matched against the data, its parameters are generally not known. This paper studies the parameter estimation problem for the piecewise polynomial regression model. The method used to estimate the parameters is the Bayesian method. Unfortunately, the Bayes estimator cannot be found analytically, so the reversible jump MCMC algorithm is proposed to solve this problem. The reversible jump MCMC algorithm generates a Markov chain that converges to the limit distribution of the posterior distribution of the piecewise polynomial regression model parameters. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of the piecewise polynomial regression model.
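
Within a fixed model dimension, reversible jump MCMC reduces to an ordinary Metropolis step; the dimension-changing (jump) moves are what the paper adds and are omitted here. A fixed-dimension random-walk Metropolis sampler, as a hedged building-block sketch:

```python
import math
import random

def metropolis(log_post, init, step=1.0, iters=3000, seed=1):
    """Random-walk Metropolis sampler for a 1-D log-posterior.
    Proposes Gaussian steps and accepts with probability min(1, ratio)."""
    rng = random.Random(seed)
    x = init
    cur = log_post(x)
    chain = []
    for _ in range(iters):
        prop = x + rng.gauss(0.0, step)
        lp = log_post(prop)
        delta = lp - cur
        if delta >= 0 or rng.random() < math.exp(delta):
            x, cur = prop, lp
        chain.append(x)
    return chain
```

Posterior summaries such as the Bayes estimator are then ergodic averages over the chain, here illustrated on a standard normal target.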

Keywords: piecewise regression, Bayesian, reversible jump MCMC, segmentation

Procedia PDF Downloads 368
4553 Hardships Faced by Entrepreneurs in Marketing Projects for Acquiring Business Loans

Authors: Sudipto Sarkar

Abstract:

Capital is the primary fuel for starting and running a business. Since capital is crucial for every business, entrepreneurs must successfully acquire adequate capital for executing their projects. Sources of capital for entrepreneurs include their own personal funds, lines of credit or loans from banks or financial institutions, or equity funding from investors. The most commonly selected source of capital is a bank loan. However, acquiring a loan requires an entrepreneur to adhere to strict guidelines, conditions and norms, because they must show not only evidence of the viability of the project but also the means to repay the acquired loan. On the bank's part, every loan officer is required to perform a thorough credit appraisal of prospective borrowers and decide whether or not to lend money, how much to lend, and what conditions should be attached. Moreover, these credit decisions have often been based on biases, analytical techniques, or prior experience. A loan can turn out to be good or poor irrespective of the type of credit decision followed. However, based on prior experience, loan officers tend to differentiate between a good and a bad loan by examining the borrower's credit history, pattern of borrowing, volume of borrowing, frequency of borrowing, and reasons for borrowing. As per an article written by Maureen Wallenfang on postcrescent.com dated May 10, 2010, borrowers with good credit, solid business plans and adequate collateral security were able to procure loans very easily in the Fox Valley region. Since loans are required to run businesses, and given the propensity of loans to turn bad, loan officers tend to be very critical and cautious before approving and disbursing loans. The pressure to be critical and cautious is, at least partly, a result of increased scrutiny by the Securities and Exchange Commission.
As per the Wall Street Journal (Sidel & Eaglesham, March 3, 2011, online), the Securities and Exchange Commission scrutinized banks that restructured troubled loans in order to make them appear healthier than they really are. Therefore, loan officers' loan criteria are of immense importance for entrepreneurs and banks alike.

Keywords: entrepreneur, loans, marketing, banks

Procedia PDF Downloads 253
4552 The Critical Relevance of Credit and Debt Data in Household Food Security Analysis: The Risks of Ineffective Response Actions

Authors: Siddharth Krishnaswamy

Abstract:

Problem Statement: Currently, when analyzing household food security, the most commonly studied food access indicators are household income and expenditure. Larger studies do take into account other indices such as credit and employment, but these are baseline studies and by definition are conducted infrequently. Food security analysis of access is usually dedicated to analyzing income and expenditure indicators, and both of these indicators are notoriously inconsistent. Yet this data can very often end up being the basis on which household food access is calculated and, by extension, be used for decision making. Objectives: This paper argues that along with income and expenditure, credit and debt information should be collected so that an accurate analysis of household food security (and in particular food access) can be determined. Because this information is not routinely collected and analyzed, the actual situation is often masked: a household's food access and food availability patterns may appear adequate mainly as a result of borrowing, and may even reflect a long-term dependency (a debt cycle). In other words, such a household is in reality worse off than it appears, a factor masked by its performance on basic access indicators. Procedures/methodologies/approaches: Existing food security datasets collected in 2005 in Azerbaijan, in 2010 across Myanmar and in 2014-15 across Uganda were used to support the theory that analyzing the income and expenditure of households, versus analyzing the same in addition to data on credit and borrowing patterns, results in an entirely different picture of household food access. Furthermore, the data analyzed depict food consumption patterns across groups of households and then relate these to the extent of dependency on credit, i.e., households borrowing money in order to meet food needs.
Finally, response options based on analyzing only income and expenditure, and response options based on income, expenditure, credit, and borrowing, from the same geographical area of operation, are studied and discussed. Results: The purpose of this work was to see if existing methods of household food security analysis could be improved. It is hoped that food security analysts will collect household-level information on credit and debt and analyze it against income, expenditure and consumption patterns. This will help determine whether a household's food access and availability are dependent on unsustainable strategies such as borrowing money for food or carrying sustained debts. Conclusions: The results clearly show the amount of relevant information that is missing in food access analysis if the debt and borrowing of the household are not analyzed along with the typical food access indicators, and the serious repercussions this has on programmatic response and interventions.

Keywords: analysis, food security indicators, response, resilience analysis

Procedia PDF Downloads 329
4551 A Fuzzy Linear Regression Model Based on Dissemblance Index

Authors: Shih-Pin Chen, Shih-Syuan You

Abstract:

Fuzzy regression models are useful for investigating the relationship between explanatory variables and responses in fuzzy environments. To overcome the deficiencies of previous models and increase the explanatory power of fuzzy data, the graded mean integration (GMI) representation is applied to determine representative crisp regression coefficients. A fuzzy regression model is then constructed based on the modified dissemblance index (MDI), which can precisely measure the actual total error. On commonly used test examples evaluated with the proposed MDI and a distance criterion, the proposed fuzzy linear regression model shows higher explanatory power and forecasting accuracy than previous models.
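
For a triangular fuzzy number (l, m, u), the graded mean integration representation mentioned above collapses to the crisp value (l + 4m + u)/6, which can then serve as a representative regression input. A one-line sketch:

```python
def gmi_triangular(l, m, u):
    """Graded mean integration representation of a triangular fuzzy number (l, m, u)."""
    return (l + 4.0 * m + u) / 6.0

# A symmetric triangular number defuzzifies to its mode;
# skew pulls the representative value toward the longer tail.
center = gmi_triangular(1.0, 2.0, 3.0)  # 2.0
skewed = gmi_triangular(0.0, 1.0, 5.0)  # 1.5
```

The crisp coefficients of the fuzzy regression are then estimated from these representative values.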

Keywords: dissemblance index, fuzzy linear regression, graded mean integration, mathematical programming

Procedia PDF Downloads 431
4550 Study of Dermatoglyphics Pattern in Patient with Hypertension

Authors: Ajeevan Gautam, Gulam Anwer Khan, Pratibha Pokhrel

Abstract:

Introduction: Dermatoglyphics is the science that deals with the study of dermal ridge configurations on the digits, palms and soles. The skin there is grooved by ridges that form a variety of configurations. The aim of the study was to identify dermal ridge patterns on the fingertips of hypertensive patients and of a normal population, and to compare the patterns between them. Methods: The subjects of the study were 130 hypertensive and 130 non-hypertensive cases from the Kathmandu Valley, aged between 40 and 80 years. Case histories were recorded and, after consent, fingerprints were taken. Different parameters, i.e., whorl, loop, arch and composite patterns, were studied and analysed. Result: The study revealed an increased whorl pattern in hypertensives: 65.69% whorl, 29.23% loop and 5.07% arch patterns in the right hand of hypertensive subjects, against 34.46% whorl, 58.15% loop and 5.38% arch patterns in controls. Similarly, the left hand showed 63.69% whorl, 32% loop and 4.30% arch in the hypertensive group, against 60.15% loop, 35.69% whorl and 4.15% arch in the control group. Discussion: Based on these findings, it was concluded that whorl, loop and arch patterns were observed in 65.69%, 29.23% and 5.07% of hypertensive cases, respectively, in the right hand. Similarly, in the left hand they were found to be 63.69% whorl, 32% loop and 4.30% arch, whereas in normotensive subjects these patterns were recorded as 36.43% whorl, 58.15% loop and 5.38% arch in the right hand, and 35.69% whorl, 60.15% loop and 4.15% arch in the left hand.

Keywords: arch, dermatoglyphics, hypertension, loop, whorl

Procedia PDF Downloads 292
4549 Offline Signature Verification Using Minutiae and Curvature Orientation

Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee

Abstract:

A signature is a behavioral biometric that is used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers. Therefore, it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system that indicates whether a signature is genuine or not. The system comprises four phases: (1) the pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting of ridge breaks are applied; (2) the feature extraction phase, in which global and local features are extracted: the local features are minutiae points, curvature orientation, and curve plateau, while the global features are signature area, signature aspect ratio, and Hu moments; (3) the post-processing phase, in which false minutiae are removed; and (4) the classification phase, in which features are enhanced before being fed into the classifier. k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state-of-the-art. The accuracy of the proposed system is 92.3%.
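Of the features listed, the global ones are straightforward to compute. The sketch below derives signature area and aspect ratio from a binary image with numpy; the toy image and the function name are assumptions for illustration:

```python
import numpy as np

def global_features(binary):
    """Compute two of the global features named in the abstract
    (signature area and aspect ratio) from a binary signature image."""
    ys, xs = np.nonzero(binary)
    area = len(xs)                        # number of ink pixels
    height = ys.max() - ys.min() + 1      # bounding-box height
    width = xs.max() - xs.min() + 1       # bounding-box width
    aspect_ratio = width / height
    return area, aspect_ratio

# Toy 5x8 "signature": a diagonal stroke of 5 pixels.
img = np.zeros((5, 8), dtype=np.uint8)
for i in range(5):
    img[i, i] = 1
area, ar = global_features(img)
print(area, ar)  # 5 1.0
```

The Hu moments and minutiae-based local features would follow the same pattern: functions over the binary image, concatenated into one feature vector for the classifier.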

Keywords: signature, ridge breaks, minutiae, orientation

Procedia PDF Downloads 142
4548 Jalovchat Gabbroic Intrusive of the Caucasus: Petrological Study, Geochemical Peculiarities and Formation Conditions

Authors: Giorgi Chichinadze, David Shengelia, Tamara Tsutsunava, Nikoloz Maisuradze, Giorgi Beridze

Abstract:

The Jalovchat intrusive is built up of hornblende gabbros, gabbro-norites and norites. Hornblende-bearing gabbro-pegmatites, coarse-grained rocks with gigantic hornblende crystals, are widespread within the intrusive. By its unusual composition, the Jalovchat intrusive has no analogue in the Caucasus; petrologically and geochemically, however, its rocks had been insufficiently studied. For comprehensive investigation, the authors applied appropriate methodologies: microscopic study of thin sections, petro- and geochemical analyses of the samples, and various diagrams and spidergrams of petrogenic, rare and rare earth elements. The analytical study established that, by composition, the Jalovchat intrusive corresponds mainly to mid-ocean ridge basalts, while by geodynamic type it belongs to the subduction type. In general, this is an anomalous phenomenon, as crystallization of hornblende, and especially of its gigantic crystals, is atypical in rocks of such composition. The authors believe that the water-rich magma reservoir necessary for the crystallization of gigantic hornblende crystals appeared as a result of melting of water-rich mid-ocean ridge basaltic rocks during the subduction process in Bajocian time.

Keywords: gabbro-pegmatite, intrusive, petrogenesis, petrogeochemistry, the Caucasus

Procedia PDF Downloads 203
4547 Comparative Analysis of Yield before and after Access to Extension Services among Crop Farmers in Bauchi Local Government Area of Bauchi State, Nigeria

Authors: U. S. Babuga, A. H. Danwanka, A. Garba

Abstract:

The research was carried out to compare the yield of respondents before and after access to extension services on crop production technologies in the study area. Data were collected through questionnaires administered to seventy-five randomly selected respondents and analyzed using descriptive statistics, t-tests and regression models. The results disclosed that the majority (97%) of the respondents had attended one form of school or another, and most (78.67%) had farm sizes ranging between 1 and 3 hectares. The majority of the respondents adopted improved crop varieties, plant spacing, herbicides, fertilizer application, land preparation, crop protection, crop processing and storage of farm produce. The t-test between the yields of respondents before and after access to extension services showed a significant (p<0.001) difference in yield. It also indicated that farm size was significant (p<0.001), while household size, years of farming experience and extension contact were significant at p<0.005. The major constraints to adoption of crop production technologies were the shortage of extension agents, the high cost of technology and the lack of access to credit facilities. The major prerequisites for improving extension services are the employment of more extension agents and adequate training; adequate agricultural credit to farmers at low interest rates would also enhance their adoption of crop production technologies.

Keywords: comparative, analysis, yield, access, extension

Procedia PDF Downloads 359
4546 Enhancing Students’ Achievement, Interest and Retention in Chemistry through an Integrated Teaching/Learning Approach

Authors: K. V. F. Fatokun, P. A. Eniayeju

Abstract:

This study concerns the effects of a concept mapping-guided discovery integrated teaching approach on the learning styles and achievement of chemistry students. The sample comprised 162 senior secondary school (SS 2) students drawn from two science schools in Nasarawa State with equivalent pre-test mean scores of 9.68 and 9.49. Five instruments were developed and validated, while a sixth was adopted by the investigator for the study. Four null hypotheses were tested at the α = 0.05 level of significance. Chi-square analysis showed a significant shift in students' learning style from accommodating and diverging to converging and assimilating when exposed to the concept mapping-guided discovery approach. t-test and ANOVA results also showed that those in the experimental group achieved more and retained content learnt better. Results of Scheffé's test for multiple comparisons showed that boys in the experimental group performed better than girls. It is therefore concluded that the concept mapping-guided discovery integrated approach should be used in secondary schools to successfully teach electrochemistry. It is strongly recommended that chemistry teachers be encouraged to adopt this method for teaching difficult concepts.

Keywords: integrated teaching approach, concept mapping-guided discovery, achievement, retention, learning styles and interest

Procedia PDF Downloads 324
4545 Risk-Sharing Financing of Islamic Banks: Better Shielded against Interest Rate Risk

Authors: Mirzet SeHo, Alaa Alaabed, Mansur Masih

Abstract:

In theory, risk-sharing-based financing (RSF) is considered a cornerstone of Islamic finance and is argued to render Islamic banks more resilient to shocks. In practice, however, this feature of Islamic financial products is almost negligible. Instead, debt-based instruments with conventional-like features have overwhelmed the nascent industry. In addition, the framework of present-day economic, regulatory and financial reality inevitably exposes Islamic banks in dual banking systems to the problems of conventional banks, including, but not limited to, interest rate risk. Empirical evidence has, thus far, confirmed such exposures, despite Islamic banks' interest-free operations. This study applies system GMM in modeling the determinants of RSF and finds that RSF is insensitive to changes in interest rates. Hence, our results provide support to the "stability" view of risk-sharing-based financing. This suggests RSF as the way forward for risk management at Islamic banks in the absence of widely acceptable Shariah-compliant hedging instruments. Further support to the stability view is given by evidence of counter-cyclicality: unlike debt-based lending, which inflates artificial asset bubbles through credit expansion during the upswing of business cycles, RSF is negatively related to GDP growth. Our results also imply a significantly strong relationship between risk-sharing deposits and RSF. However, the pass-through of these deposits to RSF is economically low: only about 40% of risk-sharing deposits are channeled to risk-sharing financing. This raises questions about the validity of the industry's claim that depositors accustomed to conventional banking shun risk sharing, and signals potential for better balance sheet management at Islamic banks. Overall, our findings suggest that, on the one hand, Islamic banks can gain 'independence' from conventional banks and interest rates through risk-sharing products, the potential for which is enormous. On the other hand, RSF could enable policy makers to improve systemic stability and restrain excessive credit expansion through its countercyclical features.

Keywords: Islamic banks, risk-sharing, financing, interest rate, dynamic system GMM

Procedia PDF Downloads 315
4544 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Digitalisation in production technology is a driver for the application of machine learning methods. Through predictive quality, much of the otherwise necessary quality control can be saved via the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance; competitive leaders claim to have mastered their processes, and as a result much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which this data situation makes more difficult. The implementation of a machine learning application can itself be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of a data mining project. As in any process, the cost of eliminating errors increases significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In this work, the initial phase of CRISP-DM, business understanding, is critically examined for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach for predicting the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for classification of the inspection decision are applied. Classification proves clearly superior to regression and achieves promising accuracies.
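The regression-versus-classification question for a pass/fail inspection decision can be sketched on synthetic data: regress the leakage target and threshold the prediction, or classify the decision directly. Everything below (the data-generating process, the limit, the one-dimensional classifier) is an illustrative assumption, not Bosch Rexroth data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for hydraulic test data: one feature, a leakage
# volume flow target, and a pass/fail label at a hypothetical limit.
n = 400
x = rng.uniform(0, 1, n)
leakage = 2.0 * x + rng.normal(0, 0.2, n)
limit = 1.0
label = (leakage > limit).astype(int)   # 1 = reject

# Option A: regress leakage, then threshold the prediction at the limit.
A = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(A, leakage, rcond=None)
pred_reg = (A @ coef > limit).astype(int)

# Option B: classify the decision directly with a 1-D threshold chosen
# from the class means (a minimal stand-in for a trained classifier).
t = (x[label == 0].mean() + x[label == 1].mean()) / 2
pred_clf = (x > t).astype(int)

acc_reg = (pred_reg == label).mean()
acc_clf = (pred_clf == label).mean()
print(acc_reg, acc_clf)
```

Comparing such accuracies (and the cost of each error type) is exactly the kind of evidence the business understanding phase needs before committing to one problem formulation.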

Keywords: classification, CRISP-DM, machine learning, predictive quality, regression

Procedia PDF Downloads 140
4543 Feeling Sorry for Some Creditors

Authors: Hans Tjio, Wee Meng Seng

Abstract:

The interaction of contract and property has always been a concern in corporate and commercial law, where there are internal structures created that may not match the externally perceived image generated by the labels attached to those structures. We will focus, in particular, on the priority structures created by affirmative asset partitioning, which have increasingly come under challenge by those attempting to negotiate around them. The most prominent has been the AT1 bonds issued by Credit Suisse which were wiped out before its equity when the troubled bank was acquired by UBS. However, this should not have come as a surprise to those whose “bonds” had similarly been “redeemed” upon the occurrence of certain reference events in countries like Singapore, Hong Kong and Taiwan during their Minibond crisis linked to US sub-prime defaults. These were derivatives classified as debentures and sold as such. At the same time, we are again witnessing “liabilities” seemingly ranking higher up the balance sheet ladder, finding themselves lowered in events of default. We will examine the mechanisms holders of perpetual securities or preference shares have tried to use to protect themselves. This is happening against a backdrop that sees a rise in the strength of private credit and inter-creditor conflicts. The restructuring regime of the hybrid scheme in Singapore now, while adopting the absolute priority rule in Chapter 11 as the quid pro quo for creditor cramdown, does not apply to shareholders and so exempts them from cramdown. Complicating the picture further, shareholders are not exempted from cramdown in the Dutch scheme, but it adopts a relative priority rule. At the same time, the important UK Supreme Court decision in BTI 2014 LLC v Sequana [2022] UKSC 25 has held that directors’ duties to take account of creditor interests are activated only when a company is almost insolvent. All this has been complicated by digital assets created by businesses. 
Investors are quite happy to have them classified as property (like a thing) when it comes to their transferability, but then when the issuer defaults to have them seen as a claim on the business (as a choice in action), that puts them at the level of a creditor. But these hidden interests will not show themselves on an issuer’s balance sheet until it is too late to be considered and yet if accepted, may also prevent any meaningful restructuring.

Keywords: asset partitioning, creditor priority, restructuring, BTI v Sequana, digital assets

Procedia PDF Downloads 73
4542 Self-Organizing Maps for Credit Card Fraud Detection

Authors: ChunYi Peng, Wei Hsuan Cheng, Shyh Kuang Ueng

Abstract:

This study focuses on the application of self-organizing maps (SOM) technology in analyzing credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. SOM, as an artificial neural network, is particularly suited for pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
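A minimal numpy-only SOM illustrates the anomaly-scoring idea: after training on normal transactions, the quantization error (distance to the best matching unit) serves as a fraud score. The map size, features, and training schedule below are illustrative assumptions, not the study's configuration:

```python
import numpy as np

class MiniSOM:
    """Minimal self-organizing map (numpy only) for anomaly scoring."""
    def __init__(self, rows, cols, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(rows, cols, dim))      # neuron weights
        self.grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                         indexing="ij"), axis=-1).astype(float)

    def bmu(self, x):
        """Grid coordinates of the best matching unit for sample x."""
        d = np.linalg.norm(self.w - x, axis=-1)
        return np.unravel_index(d.argmin(), d.shape)

    def train(self, data, epochs=20, lr=0.5, sigma=1.5):
        for t in range(epochs):
            frac = t / epochs
            a, s = lr * (1 - frac), sigma * (1 - frac) + 0.1
            for x in data:
                b = np.array(self.bmu(x), dtype=float)
                # Gaussian neighborhood pulls nearby neurons toward x.
                h = np.exp(-np.sum((self.grid - b) ** 2, axis=-1) / (2 * s * s))
                self.w += a * h[..., None] * (x - self.w)

    def score(self, x):
        """Quantization error: distance to the best matching unit."""
        return np.linalg.norm(self.w - x, axis=-1).min()

rng = np.random.default_rng(1)
normal = rng.normal(0, 1, size=(200, 3))     # stand-in transaction features
som = MiniSOM(5, 5, 3)
som.train(normal)
outlier = np.array([8.0, 8.0, 8.0])          # far from normal behaviour
print(som.score(normal[0]), som.score(outlier))
```

A transaction whose quantization error far exceeds those of the training data falls outside every learned pattern cluster, which is the signal flagged for review.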

Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies

Procedia PDF Downloads 52
4541 The Belt and Road Initiative in a Spiderweb of Conflicting Great Power Interests: A Geopolitical Analysis

Authors: Csaba Barnabas Horvath

Abstract:

The Belt and Road Initiative of China is one that can change the face of Eurasia as we know it. Instead of four major, densely populated subcontinents defined by Mackinder (East Asia, Europe, the Indian Subcontinent, and the Middle East), isolated from each other by vast, sparsely populated and underdeveloped regions, the continent can at last start to function as a geographic whole, with a sophisticated infrastructure linking its different parts to each other. This initiative, however, unfolds not in a geopolitical vacuum but in a space of conflicting great power interests. In Central Asia, the influence of China and that of Russia are in a setting of competition where, despite considerable cooperation between the two powers, issues causing mutual mistrust emerge repeatedly. In Afghanistan, besides western military presence, even India's efforts can be added to the picture. In Southeast Asia, a key region for the maritime Silk Road, India's Act East policy meets China's Belt and Road, not always in consensus, not to mention US and Japanese interests in the region. The presentation aims to provide an overview of how conflicting great power interests are likely to influence the outcome of the Belt and Road Initiative. The findings show that the overall success of the initiative may not be as smooth as China hopes, but at the same time, in a limited number of strategically important countries (such as Pakistan, Laos, and Cambodia), this setting actually favors China, providing at least a selected number of reliable corridors where the initiative is likely to be successful.

Keywords: belt and road initiative, geostrategic corridors, geopolitics, great power rivalry

Procedia PDF Downloads 216
4540 Self-Organizing Maps for Credit Card Fraud Detection and Visualization

Authors: Peng Chun-Yi, Chen Wei-Hsuan, Ueng Shyh-Kuang

Abstract:

This study focuses on the application of self-organizing maps (SOM) technology in analyzing credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. SOM, as an artificial neural network, is particularly suited for pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.

Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies

Procedia PDF Downloads 55
4539 Sensitivity of Credit Default Swaps Premium to Global Risk Factor: Evidence from Emerging Markets

Authors: Oguzhan Cepni, Doruk Kucuksarac, M. Hasan Yilmaz

Abstract:

Risk premia of emerging markets move together, depending on momentum and shifts in global risk appetite. However, the magnitudes of these changes in the risk premia of emerging market economies might vary. In this paper, we focus on how the global risk factor affects credit default swap (CDS) premiums of emerging markets using principal component analysis (PCA) and rolling regressions. PCA results indicate that the first common component accounts for almost 76% of the common variation in CDS premiums of emerging markets. Additionally, the explanatory power of the first factor remains high over the sample period. However, the sensitivity to the global risk factor tends to change over time and across countries. In this regard, fixed effects panel regressions are employed to identify the macroeconomic factors driving the heterogeneity across emerging markets. Two main macroeconomic variables affect the sensitivity: government debt to GDP and international reserves to GDP. Countries with lower government debt and higher reserves tend to be less subject to variations in global risk appetite.
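The first-component calculation can be sketched with numpy: standardize the CDS premium changes and take the share of variance carried by the leading singular value. The one-factor data-generating process below is an illustrative assumption, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily CDS premium changes for 8 emerging markets, driven
# by one global risk factor plus idiosyncratic noise.
T, n = 500, 8
global_factor = rng.normal(size=T)
loadings = rng.uniform(0.8, 1.2, size=n)
X = np.outer(global_factor, loadings) + 0.5 * rng.normal(size=(T, n))

# Standardize each series, then PCA via SVD of the demeaned data matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, s, _ = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
print(f"first component explains {explained[0]:.0%} of common variation")
```

The first principal component recovered this way plays the role of the global risk factor; regressing each country's series on it in rolling windows gives the time-varying sensitivities the paper studies.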

Keywords: emerging markets, principal component analysis, credit default swaps, sovereign risk

Procedia PDF Downloads 377
4538 Digital Library Evaluation by SWARA-WASPAS Method

Authors: Mehmet Yörükoğlu, Serhat Aydın

Abstract:

Since the advent of the manuscript, mechanical methods for storing, transferring and using information have evolved into digital methods over time. In this process, libraries, the centers of information, have also become digitized and, by taking on a structure with no physical boundaries, accessible from anywhere in the world at any time. In this context, certain criteria for information obtained from digital libraries have become more important to users. This paper evaluates, from different perspectives, the user criteria that make a digital library more useful. The Step-Wise Weight Assessment Ratio Analysis-Weighted Aggregated Sum Product Assessment (SWARA-WASPAS) method, with its flexible and easy calculation steps, is used for the evaluation of digital library criteria. Three digital libraries are evaluated by information technology experts according to five conflicting main criteria: 'interface design', 'effects on users', 'services', 'user engagement' and 'context'. Finally, the alternatives are ranked in descending order.
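A compact numpy sketch of the SWARA-WASPAS pipeline: SWARA turns experts' comparative importance values into weights (k_j = 1 + s_j, q_j = q_{j-1}/k_j, w_j = q_j / Σq), and WASPAS ranks alternatives by mixing the weighted sum and weighted product models. The expert judgments and decision matrix below are hypothetical:

```python
import numpy as np

# SWARA weights from experts' comparative importance values s_j
# (criteria already ranked from most to least important; s_1 = 0).
s = np.array([0.0, 0.30, 0.20, 0.15, 0.10])   # hypothetical judgments
k = 1.0 + s
q = np.cumprod(1.0 / k)          # q_1 = 1, q_j = q_{j-1} / k_j
w = q / q.sum()                  # normalized SWARA weights

# WASPAS: decision matrix (3 digital libraries x 5 benefit criteria).
X = np.array([[7, 8, 6, 9, 7],
              [8, 6, 7, 7, 8],
              [6, 9, 8, 6, 6]], dtype=float)
N = X / X.max(axis=0)            # normalize benefit criteria by column max

lam = 0.5                        # mixing parameter between the two models
wsm = (N * w).sum(axis=1)                     # weighted sum model
wpm = np.prod(N ** w, axis=1)                 # weighted product model
Q = lam * wsm + (1 - lam) * wpm
ranking = np.argsort(-Q)         # alternatives in descending order of Q
print(Q, ranking)
```

The descending sort of the joint score Q corresponds to the final ranking step described in the abstract.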

Keywords: digital library, multi criteria decision making, SWARA-WASPAS method

Procedia PDF Downloads 149
4537 Knowledge-Driven Decision Support System Based on Knowledge Warehouse and Data Mining by Improving Apriori Algorithm with Fuzzy Logic

Authors: Pejman Hosseinioun, Hasan Shakeri, Ghasem Ghorbanirostam

Abstract:

In recent years, we have seen the increasing importance of research on knowledge sources, decision support systems, data mining and the procedure of knowledge discovery in databases, and each of these aspects affects the others. In this article, we merge an information source and a knowledge source to propose a knowledge-based system, within the limits of management, based on the storing and restoring of knowledge to manage information and improve decision making and resources. We use data mining and the Apriori algorithm in the procedure of knowledge discovery. One of the problems of the Apriori algorithm is that a user must specify the minimum support threshold. Imagine that a user wants to apply the Apriori algorithm to a database with millions of transactions. The user certainly does not have the necessary knowledge of all existing transactions in that database and therefore cannot specify a suitable threshold. Our purpose in this article is to improve the Apriori algorithm. To achieve this goal, we use fuzzy logic to put the data into different clusters before applying the Apriori algorithm, and we also try to suggest the most suitable threshold to the user automatically.
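For reference, the level-wise Apriori search that the article sets out to improve can be sketched in plain Python; the toy transactions and the min_support value are illustrative:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Frequent itemsets by level-wise Apriori search."""
    n = len(transactions)
    tx = [frozenset(t) for t in transactions]

    def support(itemset):
        return sum(1 for t in tx if itemset <= t) / n

    items = sorted({i for t in tx for i in t})
    level = [frozenset([i]) for i in items]
    frequent = {}
    k = 1
    while level:
        survivors = [c for c in level if support(c) >= min_support]
        frequent.update({c: support(c) for c in survivors})
        # Candidate generation: build (k+1)-itemsets from surviving items
        # and keep only those whose every k-subset is frequent (pruning).
        k += 1
        units = sorted({i for c in survivors for i in c})
        level = [frozenset(c) for c in combinations(units, k)
                 if all(frozenset(sub) in frequent
                        for sub in combinations(c, k - 1))]
    return frequent

tx = [{"milk", "bread"}, {"milk", "bread", "eggs"},
      {"bread", "eggs"}, {"milk", "eggs"}]
freq = apriori(tx, min_support=0.5)
print(sorted((tuple(sorted(c)), sup) for c, sup in freq.items()))
```

The article's proposed improvement would precede this search: fuzzy clustering partitions the transactions, and a support threshold is suggested per cluster instead of demanding one global value from the user.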

Keywords: decision support system, data mining, knowledge discovery, data discovery, fuzzy logic

Procedia PDF Downloads 328