Search results for: data interpolating empirical orthogonal function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29443


28243 Exploring the Effect of Accounting Information on Systematic Risk: An Empirical Evidence of Tehran Stock Exchange

Authors: Mojtaba Rezaei, Elham Heydari

Abstract:

This paper presents the empirical results of analyzing the correlation between accounting information and systematic risk. The association between financial ratios and systematic risk is examined using the financial statements of 39 companies listed on the Tehran Stock Exchange (TSE) over five years (2014-2018). Financial ratios were categorized into four groups, and the following were selected as representatives of accounting information: Return on Assets (ROA), Debt Ratio (total debt to total assets), Current Ratio (current assets to current liabilities), Asset Turnover (net sales to total assets), and Total Assets. The hypotheses were tested through simple and multiple linear regression and Student's t-test. The findings illustrate that there is no significant relationship between accounting information and market risk. This indicates that, in the selected sample, historical accounting information is not fully reflected in stock prices.

Keywords: accounting information, market risk, systematic risk, stock return, efficient market hypothesis, EMH, Tehran stock exchange, TSE

Procedia PDF Downloads 119
28242 Comparison between Two Software Packages GSTARS4 and HEC-6 about Prediction of the Sedimentation Amount in Dam Reservoirs and to Estimate Its Efficient Life Time in the South of Iran

Authors: Fatemeh Faramarzi, Hosein Mahjoob

Abstract:

Building dams on rivers to utilize water resources disturbs the hydrodynamic equilibrium and causes all or part of the sediment carried by the water to settle in the dam reservoir. This phenomenon also has significant impacts on the water and sediment flow regime and, in the long term, can cause morphological changes in the environment surrounding the river, reducing the useful life of the reservoir and threatening sustainable development through inefficient management of water resources. In the past, empirical methods were used to predict the amount of sedimentation in dam reservoirs and to estimate their useful lifetime, but recently mathematical and computational models have become widely used as suitable tools in reservoir sedimentation studies. These models usually solve the governing equations using the finite element method. This study compares the results from two software packages, GSTARS4 and HEC-6, in predicting the amount of sedimentation in the Dez Dam, southern Iran. Each model provides a one-dimensional, steady-state simulation of sediment deposition and erosion by solving the equations of momentum, flow and sediment continuity, and sediment transport. GSTARS4 (Generalized Sediment Transport Model for Alluvial River Simulation) is a one-dimensional mathematical model that simulates bed changes in both longitudinal and transverse directions by using flow tubes in a quasi-two-dimensional scheme; it was used to calibrate a 47-year period and forecast the next 47 years of sedimentation in the Dez Dam. This dam is among the highest dams in the world (203 m) and irrigates more than 125,000 hectares of downstream land, playing a major role in flood control in the region. The input data, including geometric, hydraulic, and sedimentary data, cover 1955 to 2003 on a daily basis. To predict future river discharge, the time series data were assumed to repeat after 47 years.
The results obtained were very satisfactory in the delta region, where the output from GSTARS4 was almost identical to the 2003 hydrographic profile. In the Dez Dam, however, because of the long (65 km) and large reservoir, vertical currents are dominant, making calculations by the above-mentioned method inaccurate. To solve this problem, we used the empirical reduction method to calculate sedimentation in the downstream area, which yielded very good results. Thus, we demonstrated that combining these two methods provides a very suitable sedimentation model for the Dez Dam over the study period, with the outputs of the two methods in close agreement.

Keywords: Dez Dam, prediction, sedimentation, water resources, computational models, finite element method, GSTARS4, HEC-6

Procedia PDF Downloads 302
28241 A Local Tensor Clustering Algorithm to Annotate Uncharacterized Genes with Many Biological Networks

Authors: Paul Shize Li, Frank Alber

Abstract:

A fundamental task of clinical genomics is to unravel the functions of genes and their associations with disorders. Although experimental biology has made efforts over the past decades to discover and elucidate the molecular mechanisms of individual genes, about 40% of human genes still have unknown functions, not to mention the diseases they may be related to. For biologists interested in a particular gene with unknown functions, a powerful computational method tailored to inferring the functions and disease relevance of uncharacterized genes is strongly needed. Studies have shown that genes strongly linked to each other in multiple biological networks are more likely to have similar functions, which indicates that the densely connected subgraphs in multiple biological networks are useful for the functional and phenotypic annotation of uncharacterized genes. Therefore, in this work, we have developed an integrative network approach to identify frequent local clusters, defined as densely connected subgraphs that frequently occur in multiple biological networks and contain the query gene, which has few or no disease or function annotations. This local clustering algorithm models multiple biological networks sharing the same gene set as a three-dimensional matrix, the so-called tensor, and employs a tensor-based optimization method to efficiently find the frequent local clusters. Specifically, massive public gene expression data sets that comprehensively cover dynamic, physiological, and environmental conditions are used to generate hundreds of gene co-expression networks. By integrating these networks, for a given uncharacterized gene of interest, the proposed method can identify the frequent local clusters containing that gene. Finally, these frequent local clusters are used for the function and disease annotation of the uncharacterized gene.
This local tensor clustering algorithm outperformed the competing tensor-based algorithm in both module discovery and running time. We also demonstrated the use of the proposed method on real data comprising hundreds of gene co-expression networks and showed that it can comprehensively characterize the query gene. Therefore, this study provides a new tool for annotating uncharacterized genes and has great potential to assist clinical genomic diagnostics.

Keywords: local tensor clustering, query gene, gene co-expression network, gene annotation

Procedia PDF Downloads 140
28240 Estimation of Population Mean Using Characteristics of Poisson Distribution: An Application to Earthquake Data

Authors: Prayas Sharma

Abstract:

This paper proposes a generalized class of estimators, an exponential class of estimators based on the adaptation of Sharma and Singh (2015) and Solanki and Singh (2013), and a simple difference estimator for estimating an unknown population mean in the case of a Poisson-distributed population under simple random sampling without replacement. The expressions for the mean square errors of the proposed classes of estimators are derived to the first order of approximation. It is shown that the adapted version of Solanki and Singh (2013), the exponential class of estimators, is always more efficient than the usual estimator and the ratio, product, exponential ratio, and exponential product type estimators, and as efficient as the simple difference estimator. Moreover, the adapted version of Sharma and Singh's (2015) estimator is always more efficient than all the estimators available in the literature. In addition, the theoretical findings are supported by an empirical study, with an application to earthquake data from Turkey, showing the superiority of the constructed estimators over others.

Keywords: auxiliary attribute, point bi-serial, mean square error, simple random sampling, Poisson distribution

Procedia PDF Downloads 139
28239 Conditions Required for New Sector Emergence: Results from a Systematic Literature Review

Authors: Laurie Prange-Martin, Romeo Turcan, Norman Fraser

Abstract:

The aim of this study is to identify the conditions required for, and to describe the process of, the emergence of a new economic sector created from new or established businesses. A systematic literature review of English-language studies published from 1983 to 2016 was conducted using the following databases: ABI/INFORM Complete, Business Source Premiere, Google Scholar, Scopus, and Web of Science. The two main terms, business sector and emergence, were used in the systematic literature search, along with seventeen synonyms for each of these main terms. From the search results, 65 publications met the requirement of being an empirical study discussing and reporting the conditions of new sector emergence. A meta-analysis of the literature examined suggests that there are six favourable conditions and five key individuals or groups required for new sector emergence. In addition, the meta-analysis showed that eighteen theories are used in the literature to explain the phenomenon of new sector emergence, which can be grouped into three study disciplines. Given such diversity in the theoretical frameworks used across the 65 empirical studies, the authors propose the development of a new theory of sector emergence.

Keywords: economic geography, new sector emergence, economic diversification, regional economies

Procedia PDF Downloads 256
28238 Implementation of an Associative Memory Using a Restricted Hopfield Network

Authors: Tet H. Yeap

Abstract:

An analog restricted Hopfield network is presented in this paper. It consists of two layers of nodes, visible and hidden, connected by directional weighted paths forming a bipartite graph with no intralayer connections. An energy (Lyapunov) function is derived to show that the proposed network converges to stable states. By introducing hidden nodes, the network can be trained to store patterns and has increased memory capacity. When trained as an associative memory, the proposed network outperforms a classical Hopfield network in simulations, achieving better recall when the input is noisy.
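The bipartite recall dynamics described above can be sketched in a few lines. This toy version stores a single pattern with a Hebbian outer product and reuses the transposed weights for the return path, whereas the paper trains directional weights (the keywords suggest via simultaneous perturbation stochastic approximation), so the storage rule here is an illustrative assumption only:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 64, 32                         # visible and hidden layer sizes
p = rng.choice([-1.0, 1.0], size=n)   # one stored visible pattern
h0 = rng.choice([-1.0, 1.0], size=m)  # its assigned hidden code
W = np.outer(h0, p) / n               # visible -> hidden weights (Hebbian outer product)

def recall(v, steps=5):
    """Alternate sign updates across the bipartite graph (no intralayer links)."""
    for _ in range(steps):
        h = np.sign(W @ v)    # visible layer drives hidden layer
        v = np.sign(W.T @ h)  # hidden layer reconstructs visible layer
    return v

noisy = p.copy()
noisy[:10] *= -1.0                    # corrupt ~15% of the input bits
recovered = recall(noisy)
```

Because the hidden code acts as an intermediate representation, the corrupted input still maps to the correct hidden state and the clean pattern is reconstructed on the way back.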

Keywords: restricted Hopfield network, Lyapunov function, simultaneous perturbation stochastic approximation

Procedia PDF Downloads 115
28237 Activity-Based Costing in the Hospitality Industry: A Case Study in a Hotel

Authors: Bita Mashayekhi, Mohammad Ara

Abstract:

The purpose of this study is to provide empirical evidence about implementing Activity-Based Costing (ABC) in the hospitality industry in Iran. For this purpose, we consider the Tabriz International Hotel as our sample hotel and gather the relevant data from its cost accounting system for 2012. We then apply ABC as the costing method and compare the cost of each service unit with the cost obtained under the traditional costing method. The results show a different cost per unit for the two methods. Also, because it provides more precise and detailed information, an ABC system facilitates managers' decision-making on profitability analysis, budgeting, pricing, and so on.
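The ABC mechanics the abstract refers to — pooling costs by activity, dividing each pool by its cost-driver volume to get a rate, then charging each service unit for the driver units it consumes — can be sketched as follows. All activity names and figures are hypothetical, not taken from the Tabriz International Hotel data:

```python
# Activity cost pools: activity -> (pooled cost, cost driver). Figures illustrative.
pools = {
    "housekeeping": (40000.0, "room_nights"),
    "front_desk": (15000.0, "check_ins"),
    "laundry": (12000.0, "kg_washed"),
}
# Annual volume of each cost driver.
driver_totals = {"room_nights": 8000, "check_ins": 2500, "kg_washed": 6000}

def abc_rates(pools, driver_totals):
    """Cost per driver unit for each activity pool."""
    return {a: cost / driver_totals[drv] for a, (cost, drv) in pools.items()}

def service_cost(consumption, rates, pools):
    """Cost assigned to one service unit given its driver consumption."""
    return sum(rates[a] * consumption.get(pools[a][1], 0) for a in pools)

rates = abc_rates(pools, driver_totals)
# One-night stay consuming 1 room-night, 1 check-in, and 2 kg of laundry.
stay_cost = service_cost({"room_nights": 1, "check_ins": 1, "kg_washed": 2}, rates, pools)
```

The contrast with traditional costing is that each service unit is charged only for the activities it actually drives, rather than receiving a share of overhead allocated on a single volume base.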

Keywords: Activity-Based Costing (ABC), activity, cost driver, hospitality industry

Procedia PDF Downloads 285
28236 Platelet Transfusion Thresholds for Pediatrics; A Retrospective Study

Authors: Hessah Alsulami, Majedah Aldosari

Abstract:

Introduction: A platelet threshold of 10 × 10⁹/L is recommended for clinically stable thrombocytopenic pediatric patients. Transfusion at a higher threshold (in the absence of research evidence, generally 40 × 10⁹/L, as determined by clinical circumstances) may be required for patients with signs of bleeding, high fever, hyperleukocytosis, a rapid fall in platelet count, concomitant coagulation abnormality, critical illness, or impaired platelet function (including drug-induced), as well as for patients undergoing invasive procedures. Method: This study is a retrospective observational analysis of platelet transfusion thresholds in a single secondary pediatric hospital in Riyadh. From the blood bank database, the list of patients who received platelet transfusions in the second half of 2018 was retrieved. Patients were divided into two groups: group A, those belonging to the category requiring a higher platelet level for transfusion (those with bleeding, high fever, a rapid fall in platelet count, impaired platelet function, or undergoing invasive procedures), and group B, those who were not. We then examined the pre- and post-transfusion platelet levels for each group. The data were analyzed using GraphPad software and expressed as mean ± SD. Result: A total of 112 transfusion episodes in 61 patients (38% female) were analyzed. Ages ranged from 24 days to 8 years. The distribution of platelet transfusion episodes was 64% (n = 72) for group A and 36% (n = 40) for group B. The mean pre-transfusion platelet count was 46 × 10³ ± 11 × 10³ for group A and 28 × 10³ ± 6 × 10³ for group B; the mean post-transfusion platelet count was 61 × 10³ ± 14 × 10³ and 60 × 10³ ± 24 × 10³ for groups A and B, respectively. The rise in mean platelet count after transfusion was significantly greater among stable patients (group B) than among unstable patients (group A) (P < 0.001).
Conclusion: The platelet count threshold for transfusion varied with the clinical condition and, as expected, was higher among the unstable patient group. For stable patients, the threshold was higher than it should be, which means that clinicians do not follow the guidelines in this regard. The rise in platelet count after transfusion was greater among stable patients.

Keywords: platelet, transfusion, threshold, pediatric

Procedia PDF Downloads 56
28235 A Semiparametric Approach to Estimate the Mode of Continuous Multivariate Data

Authors: Tiee-Jian Wu, Chih-Yuan Hsu

Abstract:

Mode estimation is an important task because it has applications to data from a wide variety of sources. We propose a semiparametric approach to estimating the mode of an unknown continuous multivariate density function. Our approach is based on a weighted average of a parametric density estimate using the Box-Cox transform and a nonparametric kernel density estimate. Our semiparametric mode estimate improves on both the parametric and nonparametric mode estimates. Specifically, it resolves the inconsistency of parametric mode estimates (at large sample sizes) and reduces the variability of nonparametric mode estimates (at small sample sizes). The performance of our method at practical sample sizes is demonstrated by simulation examples and two real examples from the fields of climatology and image recognition.
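The weighted-average idea can be illustrated with a stripped-down one-dimensional sketch. This version omits the Box-Cox step, uses a plain normal family for the parametric part, and fixes the weight at 0.5 — all simplifying assumptions of ours, not the paper's data-driven choices:

```python
import numpy as np

def semiparametric_mode(x, weight=0.5, grid_size=512):
    """Mode of a weighted average of a parametric (normal) density estimate
    and a Gaussian kernel density estimate, located by a grid search."""
    grid = np.linspace(x.min(), x.max(), grid_size)
    mu, sd = x.mean(), x.std()
    # Parametric component: normal density fitted by moments.
    parametric = np.exp(-0.5 * ((grid - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    # Nonparametric component: Gaussian KDE with Silverman's bandwidth.
    h = 1.06 * sd * len(x) ** (-1 / 5)
    kde = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))
    mixture = weight * parametric + (1 - weight) * kde
    return grid[np.argmax(mixture)]

rng = np.random.default_rng(0)
sample = rng.normal(3.0, 1.0, 2000)
mode_hat = semiparametric_mode(sample)
```

At large samples the KDE term keeps the estimate consistent even when the parametric family is wrong; at small samples the parametric term stabilizes the noisy KDE peak, which is the trade-off the abstract describes.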

Keywords: Box-Cox transform, density estimation, mode seeking, semiparametric method

Procedia PDF Downloads 269
28234 Filler Elastomers Abrasion at Steady State: Optimal Use Conditions

Authors: Djeridi Rachid, Ould Ouali Mohand

Abstract:

Finding a mechanism for studying the abrasive wear of elastomers remains an open issue. The practical difficulties are considerable, owing to the complexity of the deformation mechanism, the complex mechanism of material tearing, and the marked interactions between the tribological parameters. In this work, we present an experimental technique for studying the abrasive wear of filled elastomers. The elastomer/indenter interaction involves interdependent and time-varying tribological parameters; consequently, the phenomenon governing this interaction is not easy to explain. An optimal elastomer compounding and the adequate use conditions of these materials, which define their abrasion resistance, are discussed. The results are compared with theoretical models: the variation of weight loss as a function of blade angle or of cycle number agrees with rupture models and with the mechanism of crack propagation during material tearing in the abrasive wear of filled elastomers. The weight loss as a function of sliding velocity shows the existence of a critical velocity corresponding to maximal wear. The addition of silica or carbon black influences the abrasive wear behavior of filled elastomers in different ways.

Keywords: abrasion wear, filler elastomer, tribology, hyperelastic

Procedia PDF Downloads 303
28233 Experimental Investigation on Over-Cut in Ultrasonic Machining of WC-Co Composite

Authors: Ravinder Kataria, Jatinder Kumar, B. S. Pabla

Abstract:

Ultrasonic machining is one of the most widely used non-traditional machining processes for materials that are relatively brittle, hard, and fragile, such as advanced ceramics, refractories, crystals, and quartz. The present article investigates the impact of different experimental conditions (power rating, cobalt content, tool material, work piece thickness, tool geometry, and abrasive grit size) on over-cut in ultrasonic drilling of WC-Co composite material. Taguchi's L-36 orthogonal array was employed for conducting the experiments, and significant factors were identified using an analysis of variance (ANOVA) test. The experimental results revealed that abrasive grit size and tool material are the most significant factors for over-cut.

Keywords: ANOVA, abrasive grit size, Taguchi, WC-Co, ultrasonic machining

Procedia PDF Downloads 389
28232 Medication Errors in a Juvenile Justice Youth Development Center

Authors: Tanja Salary

Abstract:

This paper discusses a study of medication errors conducted in a juvenile justice facility. It introduces data on medication errors obtained from the facility's electronic incident records documented from 2011 through 2019 and explores contributing factors related to those errors. In addition, the presentation reviews both current and historical empirical research on patient safety standards and quality of care, comparing traditional health care facilities with juvenile justice residential facilities, and acknowledges a gap in the research. The theoretical/conceptual framework for the study was Bandura and Adams's self-efficacy theory of behavioral change and Mark Friedman's results-based accountability theory. Despite the lack of evidence from previous studies addressing medication errors in juvenile justice facilities, this presenter will share information that adds to the body of knowledge, including the potential relationship between medication errors and the contributing factors of race and age. Implications for future research include the effect that education and training will have on communication among juvenile justice staff, including the nurses who administer medications to juveniles, to ensure adherence to patient safety standards. There are several opportunities for future research on other factors that may affect medication administration errors within residential juvenile justice facilities.

Keywords: Juvenile justice, medication errors, juveniles, error reduction strategies

Procedia PDF Downloads 47
28231 Control the Flow of Big Data

Authors: Shizra Waris, Saleem Akhtar

Abstract:

Big data is a research area receiving attention from academia and the IT community. In the digital world, the amount of data produced and stored has grown enormously within a short period of time, and this fast-increasing rate of data has created many challenges. In this paper, we use the functionalism and structuralism paradigms to analyze the genesis of big data applications and their current trends. The paper presents a complete discussion of state-of-the-art big data technologies based on batch and stream data processing, and analyzes the strengths and weaknesses of these technologies. The study also covers big data analytics techniques, processing methods, reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies with respect to important limitations are also investigated, and emerging technologies are suggested as a solution to big data problems.

Keywords: computer, it community, industry, big data

Procedia PDF Downloads 178
28230 Adaptive Few-Shot Deep Metric Learning

Authors: Wentian Shi, Daming Shi, Maysam Orouskhani, Feng Tian

Abstract:

Whereas the most prevalent deep learning methods currently require a large amount of training data, few-shot learning tries to learn a model from limited data without extensive retraining. In this paper, we present a loss function based on the triplet loss for solving the few-shot problem using metric-based learning. Instead of setting the margin in the triplet loss to a constant chosen empirically, we propose an adaptive margin strategy that obtains an appropriate margin automatically. We implement the strategy in a deep Siamese network for deep metric embedding, using an optimization approach that penalizes the worst case and rewards the best. Our experiments on image recognition and a co-segmentation model demonstrate that the proposed triplet loss with adaptive margin can significantly improve performance.
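A minimal sketch of a margin that adapts to batch statistics instead of being a fixed constant. The specific adaptation rule below (a fraction of the mean anchor-negative vs. anchor-positive separation in the batch) is our assumption for illustration; the paper's exact strategy and its Siamese-network training loop are not reproduced here:

```python
import numpy as np

def adaptive_triplet_loss(anchor, positive, negative, scale=0.5):
    """Triplet loss whose margin is derived from batch statistics
    (hypothetical rule) rather than set to an empirical constant."""
    d_ap = np.linalg.norm(anchor - positive, axis=1)  # anchor-positive distances
    d_an = np.linalg.norm(anchor - negative, axis=1)  # anchor-negative distances
    # Adaptive margin: a fraction of the mean separation observed in the batch,
    # clipped at zero so the margin never becomes negative.
    margin = max(scale * float(np.mean(d_an - d_ap)), 0.0)
    # Standard triplet hinge with the adaptive margin.
    return float(np.mean(np.maximum(d_ap - d_an + margin, 0.0)))
```

When the batch is already well separated the margin grows and keeps pushing embeddings apart; when negatives sit close to positives the margin shrinks toward zero, avoiding an impossible target.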

Keywords: few-shot learning, triplet network, adaptive margin, deep learning

Procedia PDF Downloads 153
28229 Is There a Month Effect on the Deposits Interest Rates? Evidence from the Greek Banking Industry during the Period 2003-13

Authors: Konstantopoulos N., Samitas A., E. Vasileiou, Kinias I.

Abstract:

This article introduces a new view on the study of the month effect. Applying a Markov switching regime model to data from the Greek time deposits (TDs) market for the time span January 2003 to October 2013, we examine whether there is a month effect in the Greek banking industry. The empirical findings provide convincing evidence for a new kind of monthly anomaly. A possible explanation for this anomaly is upward deposit window dressing. Further research should examine whether this calendar effect exists in other countries or is solely a Greek phenomenon.

Keywords: calendar anomalies, banking crisis, month effect, Greek banking industry

Procedia PDF Downloads 357
28228 Popularization of Persian Scientific Articles in the Public Media: An Analysis Based on Experimental Meta-function View Point

Authors: Behnaz Zolfaghari

Abstract:

In civilized societies, linguists seek suitable equivalents for scientific terms in the common language of their society. Much research has surveyed the language of science on one hand and media discourse on the other, but the goal of this study is a comparative analysis of science discourse in Persian academic media and public discourse in the general Persian media, applying the experimental meta-function, one of the theoretical tools introduced by Halliday's Systemic Functional Grammar. The analysis aims to explore the processes that can convert the language in which scientific facts are published into a language well suited to the interested layman. The comparison shows that the two discourses use the six processes of the experimental meta-function differently. By comparing the frequency of the different processes, the researcher re-identified these differences in the two discourses and presents a model for converting science discourse into popularized discourse. This model can be useful for journalists and textbook authors who want to restate technical scientific texts in a simple style for inexpert addressees, including the general public and students.

Keywords: systemic functional grammar, discourse analysis, science language, popularization, media discourse

Procedia PDF Downloads 180
28227 Fast Algorithm to Determine Initial Tsunami Wave Shape at Source

Authors: Alexander P. Vazhenin, Mikhail M. Lavrentiev, Alexey A. Romanenko, Pavel V. Tatarintsev

Abstract:

One of the problems obstructing effective tsunami modelling is the lack of information about the initial wave shape at the source. The existing methods (geological surveys, sea radars, satellite images) carry a significant degree of uncertainty, so direct measurements of tsunami waves obtained by deep-water bottom pressure recorders are also used. In this paper we propose a new method to reconstruct the initial sea surface displacement at the tsunami source by approximating the measured signal (marigram) with a linear combination of synthetic marigrams from a selected set of unit sources, calculated in advance. The method has demonstrated good precision and very high performance. The mathematical model and the results of numerical tests are described here.
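The core of the approach — expressing the measured marigram as a least-squares combination of precomputed unit-source marigrams — can be illustrated with synthetic signals. The unit-source waveforms below are arbitrary stand-ins, not real precomputed Green's functions:

```python
import numpy as np

rng = np.random.default_rng(0)
T, K = 200, 6  # number of time samples, number of unit sources
t = np.linspace(0.0, 10.0, T)

# Precomputed synthetic marigrams, one column per unit source (stand-in waveforms).
unit = np.stack([np.sin((k + 1) * t) * np.exp(-0.1 * t) for k in range(K)], axis=1)

# "Measured" marigram: an unknown combination of unit sources plus recorder noise.
true_coef = np.array([0.0, 1.5, 0.0, -0.7, 0.0, 0.3])
measured = unit @ true_coef + 0.01 * rng.standard_normal(T)

# Recover the source decomposition by least squares; the coefficients then
# weight the unit initial displacements to reconstruct the source shape.
coef, *_ = np.linalg.lstsq(unit, measured, rcond=None)
```

Because the unit marigrams are computed in advance, the only work at reconstruction time is this small linear solve, which is what makes the method fast.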

Keywords: numerical tests, orthogonal decomposition, Tsunami Initial Sea Surface Displacement

Procedia PDF Downloads 454
28226 Study of Pulmonary Function Test of over the 40 Years Adults in Ulaanbaatar

Authors: D. Densenbal, Ts. Naidansuren, M. Oyunchimeg, Ts. Manaljav, D. Udval, L. Khosbayar, Kh. Solongo, D. Ichinnorov, B. Solongo

Abstract:

Background: Rapid economic growth and the common use of smoky fuels such as coal in small traditional houses (gers) in Mongolia are worsening its air pollution problem. In addition, the smoking rate is considered to be high. Despite these conditions, few studies of COPD epidemiology and diagnosis have been performed in Mongolia. Spirometry is a widely used diagnostic test for COPD. Aims: To evaluate pulmonary function tests in healthy adults over 40 years of age in Ulaanbaatar. Methods: Healthy residents over 40 years of age were recruited for this cross-sectional study from sub-district II, Khan-Uul district of Ulaanbaatar city. Health information was collected from 184 subjects between 1 and 3 July 2013; a Hichest-105 (Japan) spirometry device was employed. Tests followed the outlined acceptability standards, and data were compared with reference data generated on Asian subjects; abnormal results were graded according to the Global Initiative for Chronic Obstructive Lung Disease (GOLD). Data were analyzed using SPSS 20 software. Results: A total of 134 subjects (age 52.9 ± 9.8 years, 32.8% men) underwent PFT. Results were interpreted as normal in 73.9% (65.0% of men and 79.4% of women) and abnormal in 26.1%, classified as obstructive in 17.2% (23), restrictive in 6% (8), and mixed in 3% (4). Airflow obstruction was found in 25% (11) of men and 13.3% (12) of women, classified as mild in 43.4% (men 54.5%, women 33.3%), moderate in 52.2% (36.3% vs. 66.7%), and severe in 4.3% (one man); no very severe obstruction was found. Among subjects with normal PFT, compared by gender and age group, values in men were significantly higher than in women (p < 0.05). The decrease of PFT by age group did not differ by gender (p > 0.05) or by BMI (p > 0.05).
Compared with the predicted values for the Asian population, subjects with normal PFT had significantly lower FEV1 (by 0.15 ± 0.36 L) and PEF (by 1.92 ± 1.31 L); the same difference was observed in men (FEV1 0.19 ± 0.42 L, PEF 2.04 ± 1.64 L) and women (0.14 ± 0.33 L and 1.86 ± 1.15 L, respectively). The decrease in FEV1 was greater in the over-60 age group than in the other age groups. Conclusion: Not only was airflow limitation prevalent overall, but the prevalence of diagnosed COPD was higher in men than in women. Compared with predicted values for the Asian population, even subjects with normal PFT showed significant airflow limitation starting early.

Keywords: PFT, obstruction, FEV1, COPD

Procedia PDF Downloads 201
28225 Analysis of Unconditional Conservatism and Earnings Quality before and after the IFRS Adoption

Authors: Monica Santi, Evita Puspitasari

Abstract:

International Financial Reporting Standards (IFRS) developed a principle-based accounting standard, on the basis of which the IASB eliminated the conservatism concept from the accounting framework. The conservatism concept represents a prudent reaction to uncertainty, trying to ensure that the uncertainties and risks inherent in business situations are adequately considered. It has two ingredients: conditional conservatism, or ex-post (news-dependent) prudence, and unconditional conservatism, or ex-ante (news-independent) prudence. IFRS in substance disregards unconditional conservatism because it can cause understated assets or overstated liabilities, which would eventually make the financial statements irrelevant, since the information would not represent the real facts. For this reason, the IASB eliminated the conservatism concept. This, however, has not decreased the practice of unconditional conservatism in financial statement reporting. We therefore expected earnings quality to be affected by this situation, even though IFRS implementation was expected to increase earnings quality. The objective of this study was to provide empirical findings on unconditional conservatism and earnings quality before and after IFRS adoption. Earnings per accrual were used as the proxy for unconditional conservatism: if earnings per accrual were negative (positive), the company was classified as conservative (not conservative). Earnings quality was defined as the ability of earnings to reflect future earnings, considering earnings persistence and stability. We used the earnings response coefficient (ERC) as the proxy for earnings quality. The ERC measures the extent of a security's abnormal market return in response to the unexpected component of the reported earnings of the firm issuing that security; a higher ERC indicates higher earnings quality.
Manufacturing companies listed on the Indonesia Stock Exchange (IDX) were used as the sample, with the 2009-2010 period representing the condition before IFRS adoption and 2011-2013 the condition after. Data were analyzed using the Mann-Whitney test and regression analysis, with firm size as a control variable on the consideration that firm size affects a company's earnings quality. The study found that unconditional conservatism did not change, either before or after the IFRS adoption period. The findings for earnings quality differed, however: earnings quality decreased after the IFRS adoption period, implying that earnings quality before IFRS adoption was higher. The study also found that unconditional conservatism had a positive but insignificant influence on earnings quality. Thus, we conclude that the implementation of IFRS neither decreased the practice of unconditional conservatism nor increased the earnings quality of the manufacturing companies, and that unconditional conservatism did not significantly affect earnings quality.
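Operationally, the ERC proxy described above is the slope from regressing abnormal returns on the unexpected component of earnings. A sketch on simulated data (all numbers synthetic, not the IDX sample):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300

# Unexpected earnings component (earnings surprise) and the market's response.
unexpected_earnings = rng.standard_normal(n)
erc_true = 2.0
abnormal_return = erc_true * unexpected_earnings + 0.3 * rng.standard_normal(n)

# OLS regression of abnormal returns on earnings surprise; the slope is the ERC.
X = np.column_stack([np.ones(n), unexpected_earnings])
beta, *_ = np.linalg.lstsq(X, abnormal_return, rcond=None)
erc_hat = beta[1]
```

A firm whose earnings surprises move prices strongly (large slope) is read as having higher earnings quality, which is exactly how the study compares the pre- and post-adoption periods.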

Keywords: earnings quality, earnings response coefficient, IFRS Adoption, unconditional conservatism

Procedia PDF Downloads 245
28224 Generalized Correlation Coefficient in Genome-Wide Association Analysis of Cognitive Ability in Twins

Authors: Afsaneh Mohammadnejad, Marianne Nygaard, Jan Baumbach, Shuxia Li, Weilong Li, Jesper Lund, Jacob v. B. Hjelmborg, Lene Christensen, Qihua Tan

Abstract:

Cognitive impairment in the elderly is a key issue affecting the quality of life. Despite a strong genetic background in cognition, only a limited number of single nucleotide polymorphisms (SNPs) have been found. These explain a small proportion of the genetic component of cognitive function, thus leaving a large proportion unaccounted for. We hypothesize that one reason for this missing heritability is the misspecified modeling in data analysis concerning phenotype distribution as well as the relationship between SNP dosage and the phenotype of interest. In an attempt to overcome these issues, we introduced a model-free method based on the generalized correlation coefficient (GCC) in a genome-wide association study (GWAS) of cognitive function in twin samples and compared its performance with two popular linear regression models. The GCC-based GWAS identified two genome-wide significant (P-value < 5e-8) SNPs; rs2904650 near ZDHHC2 on chromosome 8 and rs111256489 near CD6 on chromosome 11. The kinship model also detected two genome-wide significant SNPs, rs112169253 on chromosome 4 and rs17417920 on chromosome 7, whereas no genome-wide significant SNPs were found by the linear mixed model (LME). Compared to the linear models, more meaningful biological pathways like GABA receptor activation, ion channel transport, neuroactive ligand-receptor interaction, and the renin-angiotensin system were found to be enriched by SNPs from GCC. The GCC model outperformed the linear regression models by identifying more genome-wide significant genetic variants and more meaningful biological pathways related to cognitive function. Moreover, GCC-based GWAS was robust in handling genetically related twin samples, which is an important feature in handling genetic confounding in association studies.
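The GCC statistic itself is not reproduced in this abstract; as an illustration of the model-free, rank-based flavor of such a per-SNP association scan, the sketch below uses Spearman's rank correlation as a stand-in measure (function and variable names are hypothetical, and this is not the authors' GCC):

```python
def _ranks(values):
    # Rank transform (1-based); ties are ignored for brevity.
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0.0] * len(values)
    for r, idx in enumerate(order):
        ranks[idx] = float(r + 1)
    return ranks

def spearman(x, y):
    # Pearson correlation of the rank-transformed series: a simple
    # model-free statistic that needs no distributional assumptions.
    rx, ry = _ranks(x), _ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

def association_scan(snp_dosages, phenotype):
    # One association statistic per SNP, keyed by SNP identifier.
    return {snp: spearman(dosages, phenotype)
            for snp, dosages in snp_dosages.items()}
```

A real GWAS would additionally convert each statistic to a P-value and adjust for covariates and relatedness, which is where the kinship and mixed-model approaches mentioned above come in.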

Keywords: cognition, generalized correlation coefficient, GWAS, twins

Procedia PDF Downloads 108
28223 BodeACD: Buffer Overflow Vulnerabilities Detecting Based on Abstract Syntax Tree, Control Flow Graph, and Data Dependency Graph

Authors: Xinghang Lv, Tao Peng, Jia Chen, Junping Liu, Xinrong Hu, Ruhan He, Minghua Jiang, Wenli Cao

Abstract:

Buffer overflows are among the most dangerous vulnerabilities, so detecting them effectively is essential. Traditional detection methods are not accurate enough, and they consume too many resources to cope with today's large and complex codebases. To address these problems, we propose BodeACD, a method for buffer overflow detection in C/C++ source code based on the Abstract syntax tree, Control flow graph, and Data dependency graph. BodeACD first collects function-level buffer overflow samples available on GitHub, then represents them as code representation sequences that fuse the control flow, data dependencies, and syntax structure of the source code to reduce information loss during representation. Finally, BodeACD learns vulnerability patterns through deep learning to perform detection. Experimental results show that BodeACD improves precision and recall by 6.3% and 8.5% respectively over the latest methods, effectively improving vulnerability detection while reducing the false-positive and false-negative rates.
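As a small illustration of the syntax-structure component of such code representations, a program can be flattened into a sequence of syntax-tree node types; the sketch below uses Python's built-in ast module on Python source (BodeACD itself targets C/C++ and also fuses control-flow and data-dependency information, which this sketch omits):

```python
import ast

def syntax_token_sequence(source):
    """Flatten a Python program's abstract syntax tree into a sequence
    of node-type names, one simple form of a syntax-structure
    representation that a sequence model could consume."""
    tree = ast.parse(source)
    # ast.walk yields the root first, then descendants breadth-first.
    return [type(node).__name__ for node in ast.walk(tree)]
```

Such sequences lose the graph edges that CFGs and DDGs preserve, which is precisely the information loss the fused representation in BodeACD is designed to reduce.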

Keywords: vulnerability detection, abstract syntax tree, control flow graph, data dependency graph, code representation, deep learning

Procedia PDF Downloads 158
28222 Entropy Risk Factor Model of Exchange Rate Prediction

Authors: Darrol Stanley, Levan Efremidze, Jannie Rossouw

Abstract:

We investigate the predictability of the USD/ZAR (South African rand) exchange rate with sample entropy analytics over the period 2004-2015. We calculate sample entropy from daily exchange rate data and empirically implement several market-timing rules based on these entropy signals. A dynamic investment portfolio based on the entropy signals produces better risk-adjusted performance than a buy-and-hold strategy. Returns are estimated on the portfolio values in U.S. dollars. These results are preliminary and do not yet account for reasonable transaction costs, although these are very small in currency markets.
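Sample entropy quantifies a series' irregularity as the negative log ratio of length-(m+1) to length-m template matches within a tolerance r. A minimal pure-Python sketch (the parameter choices m=2 and r = 0.2·SD are common defaults, not figures from the paper):

```python
import math

def sample_entropy(series, m=2, r_frac=0.2):
    """Sample entropy SampEn(m, r) = -ln(A/B), where B and A count pairs
    of length-m and length-(m+1) templates that match within tolerance
    r under the Chebyshev distance.  The template counts for m and m+1
    differ slightly here, a minor simplification of the strict
    definition."""
    n = len(series)
    mean = sum(series) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in series) / n)
    tol = r_frac * sd

    def match_pairs(length):
        templates = [series[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in
                       zip(templates[i], templates[j])) <= tol:
                    count += 1
        return count

    b = match_pairs(m)
    a = match_pairs(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

Lower values indicate a more regular, hence potentially more predictable, series; a timing rule of the kind described above would compare rolling-window entropy against a threshold.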

Keywords: currency trading, entropy, market timing, risk factor model

Procedia PDF Downloads 255
28221 Organization Development’s Role in Environmental, Social and Governance (ESG) Sustainability in the Private Organizations

Authors: Karmela Palma Samson

Abstract:

In recent years, there has been growing interest in the implementation of Environmental, Social, and Governance (ESG) frameworks in private organizations, and the COVID-19 pandemic and mounting global environmental concerns have further highlighted the importance of ESG practices in business. Developing and sustaining ESG implementation effectively requires specific organizational functions, one of which is Organization Development (OD). This study aims to identify the roles of OD in the development, monitoring, and evaluation of ESG in private organizations. Qualitative research was conducted, including interviews with OD practitioners, to understand their role and challenges in maintaining ESG programs and initiatives. The study found that OD practitioners have low participation in managing ESG programs, initiatives, and indicators; nevertheless, it also revealed that the OD function is crucial for the development, monitoring, and evaluation of ESG implementation in private organizations. With their expertise in organizational development, OD practitioners can contribute significantly to the development, implementation, and evaluation of ESG initiatives. Private organizations should therefore involve their OD departments in ESG implementation to ensure it is sustainable, effective, and aligned with organizational goals.

Keywords: ESG, organization development, private sector, sustainability

Procedia PDF Downloads 77
28220 Study and Analysis of a Susceptible Infective Susceptible Mathematical Model with Density Dependent Migration

Authors: Jitendra Singh, Vivek Kumar

Abstract:

In this paper, a susceptible-infective-susceptible (SIS) mathematical model is proposed and analyzed in which the migration of the human population is described by a migration function. It is assumed that the disease is transmitted by direct contact between the susceptible and infective populations at a constant contact rate. The equilibria and their stability are studied using the stability theory of ordinary differential equations and computer simulation. The model analysis shows that the spread of infectious disease increases when immigration into the habitat increases, but decreases when emigration increases.
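The paper's exact migration function is not given; the sketch below simulates an SIS model with an assumed density-dependent net migration term a − eN, with migrants entering the susceptible class (an illustrative form only, with hypothetical parameter values):

```python
def simulate_sis(beta=0.3, gamma=0.1, a=5.0, e=0.01,
                 s0=990.0, i0=10.0, dt=0.02, steps=120000):
    """Euler integration of an SIS model with standard incidence
    (beta*S*I/N), recovery rate gamma, and an assumed density-dependent
    net migration term a - e*N entering the susceptible class."""
    s, i = s0, i0
    for _ in range(steps):
        n = s + i
        infections = beta * s * i / n          # new infections per unit time
        ds = -infections + gamma * i + (a - e * n)
        di = infections - gamma * i
        s += dt * ds
        i += dt * di
    return s, i
```

With these parameters the basic reproduction number is beta/gamma = 3, so the infection persists; the total population settles at N = a/e and the infective fraction at 1 − gamma/beta, consistent with the stability analysis described above.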

Keywords: SIS (Susceptible Infective Susceptible) model, migration function, susceptible, stability

Procedia PDF Downloads 246
28219 Optimized Text Summarization Model on Mobile Screens for Sight-Interpreters: An Empirical Study

Authors: Jianhua Wang

Abstract:

To obtain key information quickly from long texts on the small screens of mobile devices, sight-interpreters need an optimized summarization model for fast information retrieval. Four summarization models based on previous studies were examined: title+key words (TKW), title+topic sentences (TTS), key words+topic sentences (KWTS) and title+key words+topic sentences (TKWTS). Psychological experiments were conducted on the four models for three genres of interpreting texts to establish the optimized summarization model for sight-interpreters. This empirical study shows that the optimized summarization model for sight-interpreters to quickly grasp the key information of the texts they interpret is title+key words (TKW) for cultural texts, title+key words+topic sentences (TKWTS) for economic texts, and key words+topic sentences (KWTS) for political texts.
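A TKW-style summary pairs a text's title with its extracted key words. The study does not specify its keyword-extraction procedure, so the sketch below uses a simple frequency count purely as a stand-in (stopword list and function name are hypothetical):

```python
import re
from collections import Counter

# A tiny illustrative stopword list; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is",
             "for", "on", "that", "with", "are"}

def tkw_summary(title, body, n_keywords=5):
    """Title + key words (TKW) summary built from raw word frequency."""
    words = re.findall(r"[a-z']+", body.lower())
    freq = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return {"title": title,
            "keywords": [w for w, _ in freq.most_common(n_keywords)]}
```

The other models (TTS, KWTS, TKWTS) differ only in which of the three elements, title, key words, and topic sentences, they combine.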

Keywords: different genres, mobile screens, optimized summarization models, sight-interpreters

Procedia PDF Downloads 300
28218 The Mediating Effect of Individual Readiness for Change in the Relationship between Organisational Culture and Individual Commitment to Change

Authors: Mohamed Haffar, Lois Farquharson, Gbola Gbadamosi, Wafi Al-Karaghouli, Ramadane Djbarni

Abstract:

A few recent studies, mostly conceptual in nature, have examined the relationship between organizational culture (OC), individual readiness for change (IRFC) and individual affective commitment to change (IACC). Surprisingly, there is a lack of empirical studies investigating the influence of all four OC types on IRFC and IACC, and very limited research on the mediating role of IRFC between OC types and IACC. This study therefore fills this gap by providing empirical evidence that advances the understanding of the direct and indirect influences of OC on IACC. To achieve this, a questionnaire-based survey was developed and self-administered to 226 middle managers in Algerian manufacturing organizations (AMOs). The results indicate that group culture and adhocracy culture positively affect IACC. Furthermore, the findings support the mediating roles of self-efficacy and personal valence in the relationship between OC and IACC.

Keywords: individual readiness for change, individual commitment to change, organisational culture, manufacturing organisations

Procedia PDF Downloads 494
28217 High Performance Computing and Big Data Analytics

Authors: Branci Sarra, Branci Saadia

Abstract:

Because of the enormous growth of data, many computer science tools have been developed to process and analyze Big Data. High-performance computing architectures have been designed to meet Big Data processing needs, from the standpoints of transaction processing and of strategic and tactical analytics. The purpose of this article is to provide a historical and global perspective on recent trends in high-performance computing architectures, particularly as they relate to analytics and data mining.

Keywords: high performance computing, HPC, big data, data analysis

Procedia PDF Downloads 505
28216 Cultural Heritage, War and Heritage Legislations: An Empirical Review

Authors: Gebrekiros Welegebriel Asfaw

Abstract:

The conservation of cultural heritage during times of war is a topic of significant importance and concern in the field of heritage studies. The destruction, looting, and illicit acts committed against cultural heritage have devastating consequences. International and national legislation has been put in place to address these issues and provide a legal framework for protecting cultural heritage during armed conflicts. The aim of this review is therefore to examine the existing heritage legislation and evaluate its effectiveness in protecting cultural heritage during times of war, with special insight into the Tigray war. The review is based on a comprehensive empirical analysis of existing heritage legislation on the protection of cultural heritage during war, with a special focus on the Tigray war. It reveals that several international and national instruments are in place to protect cultural heritage during times of war; however, their implementation has been insufficient and ineffective in the case of the Tigray war. The priceless cultural heritages of Tigray, once centers of investment and world pride, have been subjected to destruction, looting, and other illicit acts, in violation of both international conventions such as the UNESCO Convention and national legislation. There is therefore a need for consistent intervention and enforcement of legislation by the international community and organizations to rehabilitate, repatriate, and reinstitute the irreplaceable heritages of Tigray.

Keywords: cultural heritage, heritage legislations, Tigray, war

Procedia PDF Downloads 127
28215 Challenges and Opportunities in Computing Logistics Cost in E-Commerce Supply Chain

Authors: Pramod Ghadge, Swadesh Srivastava

Abstract:

A logistics company's revenue generation depends on how the logistics cost of a shipment is calculated. The logistics cost of a shipment is a function of the distance and speed of its travel through a particular network, its volumetric size, and its dead weight. Logistics billing is based mainly on consumption of the scarce resource: the space or weight-carrying capacity of a carrier. A shipment's size or dead weight is in turn a function of product and packaging weight, dimensions, and flexibility. Hence, to arrive at a standard methodology for computing an accurate cost to bill the customer, the interplay among the physical attributes mentioned above, along with their measurement, plays a key role. This becomes even more complex for an e-commerce company like Flipkart, which caters to shipments from both warehouses and the marketplace in an unorganized, non-standard market like India. In this paper, we explore methodologies for defining a standard way of billing non-standard shipments across a wide range of sizes, shapes, and dead weights. These include using historical volumetric and dead-weight data to derive a factor for computing the logistics cost of a shipment, and calculating the real (contour) volume of a shipment to address the problem of irregular shipment shapes, which cannot be solved by conventional bounding-box volume measurements. We also discuss key business practices and operational quality considerations needed to bring standardization and drive appropriate ownership in the ecosystem.
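The interplay between volumetric size and dead weight is commonly resolved by billing on the chargeable weight, the greater of the dead weight and a volumetric weight derived from the bounding-box dimensions. A minimal sketch (the 5000 cm³/kg divisor is a widespread industry convention, not a figure from the paper):

```python
def chargeable_weight(length_cm, width_cm, height_cm, dead_weight_kg,
                      divisor=5000.0):
    """Billable weight of a shipment: the greater of its dead weight
    and its volumetric weight (bounding-box volume / divisor)."""
    volumetric_weight_kg = (length_cm * width_cm * height_cm) / divisor
    return max(dead_weight_kg, volumetric_weight_kg)
```

The contour-volume idea discussed above would replace the bounding-box product with the shipment's measured real volume, so that irregular shapes are not billed for empty corner space.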

Keywords: contour volume, logistics, real volume, volumetric weight

Procedia PDF Downloads 251
28214 Discursive (Re/De)Construction of Objectivity-Subjectivity: Critiquing Rape/Flesh Trade-Documentaries

Authors: Muhammed Shahriar Haque

Abstract:

As an offshoot of journalistic discourse, the documentary should be objective in nature, without harbouring preconceived notions that foster ulterior motives. When it comes to a social issue like rape in South Asian countries, as media in recent times are inundated with this violent act in India, Pakistan, Myanmar, and Bangladesh, how does one document it in terms of objectivity and subjectivity? The objective of this study is twofold: to document the history of documentaries, and to critically analyze South Asian rape/flesh-trade documentaries. The overall goal is to trace the (re/de)construction of objectivity-subjectivity in documentaries. This paper adopts a qualitative approach to documentarist discourse through the lens of critical discourse analysis (CDA). Data were gathered from 10 documentaries on the theme of rape and/or flesh trade from eight South Asian countries, predominantly the South Asian Association for Regional Cooperation (SAARC) region. The documentaries were primarily categorised using three frameworks based on six modes, six subgenres, and four basic approaches of documentary, and the findings were subsequently critiqued from a CDA perspective. The outcome suggests that there are two schools of thought regarding documentaries. According to journalistic ethics, news and/or documentaries should be objective in orientation and focus on informing the audience and/or the common people. The empirical findings tend to challenge the ethical parameters of objectivity: at times, journalistic discourse seems discursively (re)constructed to give an augmented simulation of objectivity. Based on the findings, it may be recommended that if documentaries steer away from empirical facts and indulge in poetic naivety, their credibility could be questioned. Research of this nature is significant as it raises questions about the ethical and moral conscience of documentary filmmakers. Furthermore, it examines whether they uphold journalistic integrity or succumb to their biases and thereby depict subjective views, which could be tainted with political and/or propagandist ulterior motives.

Keywords: discursive (re/de)construction, documentaries, journalistic integrity, rape/flesh trade

Procedia PDF Downloads 137