Search results for: risk management indicators
14789 Financial Instruments Disclosure: A Review of the Literature
Authors: Y. Tahat, T. Dunne, S. Fifield, D. Power
Abstract:
Information about a firm's usage of Financial Instruments (FIs) plays a very important role in determining its financial position and performance. Yet accounting standard-setters have encountered problems when deciding on the FI-related disclosures which firms must make. The primary objective of this paper is to review the extant literature on FI disclosure. This objective is achieved by surveying the literature on: the corporate usage of FIs; the different accounting standards adopted concerning FIs; and empirical studies on FI disclosure. This review concludes that the current research on FI disclosure has generated a number of useful insights. In particular, the paper reports that: FIs are a very important risk management mechanism for ensuring that companies have the cash available to make value-enhancing investments, although without a clear set of risk management objectives, using such instruments can be dangerous; accounting standards concerning FIs have resulted in enhanced transparency about the usage of these instruments; and FI-related information is a key input into investors' decision-making processes. Finally, the paper provides a number of suggestions for future research in the area.
Keywords: financial instruments, financial reporting, accounting standards, value relevance, corporate disclosure
Procedia PDF Downloads 411
14788 Improving the Quantification Model of Internal Control Impact on Banking Risks
Authors: M. Ndaw, G. Mendy, S. Ouya
Abstract:
Risk management in the banking sector is a key issue linked to financial system stability, and its importance has been elevated by technological developments and the emergence of new financial instruments. In this paper, we improve the model previously defined for quantifying the impact of internal control on banking risks by automating the residual criticality estimation step of FMECA. To this end, we defined three equations and a maturity coefficient to obtain a mathematical model, which is tested on all banking processes and types of risk. The new model allows an optimal assessment of residual criticality and improves the correlation rate, which has reached 98%.
Keywords: risk, control, banking, FMECA, criticality
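The abstract does not reproduce the three equations or the maturity coefficient; as a minimal generic sketch of how a FMECA residual-criticality step can be automated, assuming a classic severity x occurrence x detectability risk priority number scaled by a maturity coefficient in [0, 1]:

```python
# Generic FMECA-style sketch of residual criticality estimation.
# The three equations and the maturity coefficient defined in the paper are
# not reproduced in the abstract; the scaling form below is an assumption.

def initial_criticality(severity: int, occurrence: int, detectability: int) -> int:
    """Classic FMECA risk priority number (RPN)."""
    return severity * occurrence * detectability

def residual_criticality(rpn: int, maturity: float) -> float:
    """Residual criticality after internal controls, scaled by a
    maturity coefficient in [0, 1] (1 = fully mature control)."""
    if not 0.0 <= maturity <= 1.0:
        raise ValueError("maturity coefficient must lie in [0, 1]")
    return rpn * (1.0 - maturity)

if __name__ == "__main__":
    rpn = initial_criticality(severity=8, occurrence=5, detectability=6)
    print(rpn, residual_criticality(rpn, maturity=0.7))
```

The model in the paper replaces this simple scaling with its own three equations and is validated against all banking processes and risk types.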
Procedia PDF Downloads 330
14787 Design and Application of NFC-Based Identity and Access Management in Cloud Services
Authors: Shin-Jer Yang, Kai-Tai Yang
Abstract:
In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming more important, especially with Software as a Service (SaaS). This in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag. This type of verification does not support mobile device login and user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) that addresses this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that the NFC-IAM not only takes less time in identity identification but also cuts the time required for two-factor authentication by 80% and improves verification accuracy to 99.9% or better. In functional performance analyses, NFC-IAM performed better in scalability and portability. The NFC-IAM App (Application Software) and back-end system to be developed and deployed on mobile devices will support IAM features and also offer users a more user-friendly experience and stronger security protection. In the future, our NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.
Keywords: cloud service, multi-tenancy, NFC, IAM, mobile device
Procedia PDF Downloads 433
14786 Risk Assessment in Construction of K-Span Buildings in United Arab Emirates (UAE)
Authors: Imtiaz Ali, Imam Mansoor
Abstract:
Investigations were undertaken as part of an academic study to identify and evaluate the significant risks associated with the construction of K-span buildings in the UAE. Primary field data were collected through questionnaires containing specific open- and close-ended questions, administered to carefully selected construction firms, civil engineers and construction managers, regarding the risks associated with K-span building construction. Historical data available for other regions using the same construction technique were compared to separate critical from non-critical risk parameters and to identify the important risks and potential measures for their control and minimization in K-span buildings, whose use is increasing in the region. The associated risks have been determined with their Relative Importance Index (RII) values, of which the risk involved in changes of design required by owners carries the highest value (RII=0.79), whereas delayed payment by the owner to the contractor carries one of the lowest values (RII=0.42). The overall findings suggest that most of the quantified risks originate from or are associated with the contractors. It may be concluded that project proponents undertaking K-span projects should, when planning and budgeting for cost and delays, give high weight to design-change risks; any delay in materials from the supplier would then become a major source of delay in a K-span project. Since such projects are less costly, owners have limited budgets and tend to hire small contractors that are not highly competent. The study therefore suggests that owners should be aware of these types of risks associated with the construction of K-span buildings in order to make them cost effective.
Keywords: k-span buildings, k-span construction, risk management, relative importance index (RII)
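As a brief illustration of how the Relative Importance Index quoted above is typically computed from survey responses (assuming a 5-point rating scale and hypothetical responses, since neither is given in the abstract):

```python
# Relative Importance Index (RII) as commonly computed from Likert-scale
# survey responses: RII = sum(W) / (A * N), where W are the ratings,
# A is the highest possible rating and N the number of respondents.
# The 5-point scale and the response vectors below are assumptions.

def relative_importance_index(ratings, highest_rating=5):
    n = len(ratings)
    return sum(ratings) / (highest_rating * n)

if __name__ == "__main__":
    design_change = [4, 5, 4, 3, 4, 4, 5, 3]     # hypothetical responses
    delayed_payment = [2, 3, 2, 2, 1, 3, 2, 2]   # hypothetical responses
    print(round(relative_importance_index(design_change), 2))
    print(round(relative_importance_index(delayed_payment), 2))
```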
Procedia PDF Downloads 373
14785 Demographic Assessment and Evaluation of Degree of Lipid Control in High Risk Indian Dyslipidemia Patients
Authors: Abhijit Trailokya
Abstract:
Background: Cardiovascular diseases (CVDs) are the major cause of morbidity and mortality in both developed and developing countries. Many clinical trials have demonstrated that low-density lipoprotein cholesterol (LDL-C) lowering reduces the incidence of coronary and cerebrovascular events across a broad spectrum of patients at risk. Guidelines for the management of patients at risk have been established in Europe and North America. The guidelines have advocated progressively lower LDL-C targets and more aggressive use of statin therapy. In Indian patients, comprehensive data on dyslipidemia management and its treatment outcomes are inadequate. There is a lack of information on existing treatment patterns, the profile of the patients being treated, and the factors that determine treatment success or failure in achieving desired goals. Purpose: The present study was planned to determine the lipid control status in high-risk dyslipidemic patients treated with lipid-lowering therapy in India. Methods: This cross-sectional, non-interventional, single-visit program was conducted across 483 sites in India and included male and female patients with high-risk dyslipidemia aged 18 to 65 years who had visited their respective physician at a hospital or healthcare center for a routine health check-up. The percentage of high-risk dyslipidemic patients achieving an adequate LDL-C level (< 70 mg/dL) on lipid-lowering therapy and the association of lipid parameters with patient characteristics, comorbid conditions, and lipid-lowering drugs were analysed. Results: 3089 patients were enrolled in the study, of which 64% were males. LDL-C data were available for 95.2% of the patients; only 7.7% of these patients achieved LDL-C levels < 70 mg/dL on lipid-lowering therapy, which may be due to an inability to follow therapeutic plans, poor compliance, or inadequate counselling by the physician. The physician's lack of awareness of recent treatment guidelines might also contribute to patients' poor adherence, together with inadequate explanation of the benefits and risks of a medication and insufficient consideration of the patient's lifestyle and the cost of medication. Statins were the most commonly used anti-dyslipidemic drugs across the population. A higher proportion of patients had the comorbid conditions of CVD and diabetes mellitus across all dyslipidemic patients. Conclusion: As per the European Society of Cardiology guidelines, the ideal LDL-C level in high-risk dyslipidemic patients should be less than 70 mg/dL. In the present study, only 7.7% of the patients achieved LDL-C levels < 70 mg/dL on lipid-lowering therapy, which is very low. Most high-risk dyslipidemic patients in India are on a suboptimal dosage of statin, so more aggressive, higher-dosage statin therapy may be required to achieve target LDL-C levels in high-risk Indian dyslipidemic patients.
Keywords: cardiovascular disease, diabetes mellitus, dyslipidemia, LDL-C, lipid lowering drug, statins
Procedia PDF Downloads 200
14784 The 10-year Risk of Major Osteoporotic and Hip Fractures Among Indonesian People Living with HIV
Authors: Iqbal Pramukti, Mamat Lukman, Hasniatisari Harun, Kusman Ibrahim
Abstract:
Introduction: People living with HIV have a higher risk of osteoporotic fracture than the general population. The purpose of this study was to predict the 10-year risk of fracture among people living with HIV (PLWH) using FRAX™ and to identify characteristics related to fracture risk. Methodology: This study consisted of 75 subjects. The ten-year probability of major osteoporotic fractures (MOF) and hip fractures was assessed using the FRAX™ algorithm. Cross-tabulation was used to identify the participant characteristics related to fracture risk. Results: The overall mean 10-year probability of fracture was 2.4% (1.7) for MOF and 0.4% (0.3) for hip fractures. For the MOF score, participants with a parental history of hip fracture, smoking behavior or glucocorticoid use showed a higher score than those without (3.1 vs. 2.5; 4.6 vs. 2.5; and 3.4 vs. 2.5, respectively). For the HF score, participants with a parental history of hip fracture, smoking behavior or glucocorticoid use also showed a higher score than those without (0.5 vs. 0.3; 0.8 vs. 0.3; and 0.5 vs. 0.3, respectively). Conclusions: The 10-year risk of fracture was higher among PLWH with several factors, including parental history of hip fracture, smoking behavior and glucocorticoid use. Further analysis of the determining factors using multivariate regression with a larger sample size is required to confirm the factors associated with high fracture risk.
Keywords: HIV, PLWH, osteoporotic fractures, hip fractures, 10-year risk of fracture, FRAX
Procedia PDF Downloads 48
14783 Importance of Infrastructure Delivery and Management in South Africa
Authors: Onyeka Nkwonta, Theo Haupt, Karana Padayachee
Abstract:
This study aims primarily to identify potential causes of the bottlenecks in the public sector that affect infrastructure delivery and to formulate evidence-based interventions to improve the delivery and management of infrastructure projects. An initial literature review was carried out on infrastructure development and delivery in South Africa, with the aim of formulating evidence-based interventions to improve delivery within the sector. An infrastructure delivery management model was developed to map out best-practice delivery processes. These will become the backbone on which improvement initiatives will be developed with participating stakeholders. The model will, in turn, support a range of methodologies, including a risk system and a knowledge management framework. It will also look at the key challenges facing departments and their ability to ensure knowledge and skills transfer across various sectors. The research is limited because the findings were based on existing literature. This study adopted an indirect approach to infrastructure management by focusing on the challenges faced and the approaches adopted to overcome them. This may narrow the consideration of some viewpoints, thereby limiting the richness of experience available to this research.
Keywords: infrastructure, management, challenges, South Africa
Procedia PDF Downloads 138
14782 Governance, Risk Management, and Compliance Factors Influencing the Adoption of Cloud Computing in Australia
Authors: Tim Nedyalkov
Abstract:
A business decision to move to the cloud brings fundamental changes in how an organization develops and delivers its Information Technology solutions. The accelerated pace of digital transformation across businesses and government agencies increases the reliance on cloud-based services. Collecting, managing, and retaining large amounts of data in cloud environments makes information security and data privacy protection essential. It becomes even more important to understand what key factors drive successful cloud adoption following the commencement of the Privacy Amendment (Notifiable Data Breaches) Act 2017 in Australia, as the regulatory changes impact many organizations and industries. This quantitative correlational research investigated the governance, risk management, and compliance factors contributing to cloud security success and influencing the adoption of cloud computing within an organizational context after the commencement of the NDB scheme. The results and findings demonstrated that corporate information security policies, data storage location, management understanding of data governance responsibilities, and regular compliance assessments are the factors influencing cloud computing adoption. The research has implications for organizations, future researchers, practitioners, policymakers, and cloud computing providers seeking to meet rapidly changing regulatory and compliance requirements.
Keywords: cloud compliance, cloud security, data governance, privacy protection
Procedia PDF Downloads 116
14781 Comparison Study of Capital Protection Risk Management Strategies: Constant Proportion Portfolio Insurance versus Volatility Target Based Investment Strategy with a Guarantee
Authors: Olga Biedova, Victoria Steblovskaya, Kai Wallbaum
Abstract:
In the current capital market environment, investors constantly face the challenge of finding a successful and stable investment mechanism. Highly volatile equity markets and extremely low bond returns create demand for sophisticated yet reliable risk management strategies. Investors are looking for risk management solutions that efficiently protect their investments. This study compares a classic Constant Proportion Portfolio Insurance (CPPI) strategy to a Volatility Target portfolio insurance (VTPI). VTPI is an extension of the well-known Option Based Portfolio Insurance (OBPI) to the case where an embedded option is linked not to a pure risky asset, such as the S&P 500, but to a Volatility Target (VolTarget) portfolio. The VolTarget strategy is a recently emerged rule-based dynamic asset allocation mechanism in which the portfolio's volatility is kept under control. As a result, a typical VTPI strategy allows higher participation rates in the market due to reduced embedded option prices. In addition, controlled volatility levels eliminate the volatility spread in option pricing, one of the frequently cited reasons why OBPI strategies fall behind CPPI. The strategies are compared within the framework of stochastic dominance theory based on numerical simulations, rather than on the restrictive assumption of Black-Scholes type dynamics of the underlying asset. An extended comparative quantitative analysis of the performance of the above investment strategies in various market scenarios and within a range of input parameter values is presented.
Keywords: CPPI, portfolio insurance, stochastic dominance, volatility target
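A minimal sketch of the classic CPPI allocation rule referenced above, with illustrative parameter values and a simulated return path (the paper's simulation design and the VTPI counterpart are not reproduced here):

```python
import numpy as np

# Minimal sketch of the classic CPPI rule: the risky exposure is a fixed
# multiple of the cushion (portfolio value minus the guaranteed floor).
# Parameter values and the simulated return path are illustrative only.

def cppi_path(risky_returns, rf_rate, v0=100.0, floor_frac=0.8, multiplier=4.0):
    value, floor = v0, v0 * floor_frac
    values = [value]
    for r in risky_returns:
        cushion = max(value - floor, 0.0)
        risky_alloc = min(multiplier * cushion, value)   # no leverage
        safe_alloc = value - risky_alloc
        value = risky_alloc * (1.0 + r) + safe_alloc * (1.0 + rf_rate)
        floor *= (1.0 + rf_rate)                         # floor accrues at the risk-free rate
        values.append(value)
    return np.array(values)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    daily_returns = rng.normal(0.0003, 0.01, size=252)   # hypothetical risky asset
    print(cppi_path(daily_returns, rf_rate=0.00005)[-1])
```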
Procedia PDF Downloads 167
14780 The Artificial Intelligence (AI) Impact on Project Management: A Destructive or Transformative Agent
Authors: Kwame Amoah
Abstract:
Artificial intelligence (AI) has the prospect of transforming project management, significantly improving efficiency and accuracy. By automating specific tasks with defined guidelines, AI can assist project managers in making better decisions and allocating resources efficiently, with possible risk mitigation. This study explores how AI is already impacting project management and AI's likely future impact on the field. Reaction to AI has been divided: some picture it as a destroyer of jobs, while others welcome it as an advocate of innovation. Both sides agree that AI will be disruptive and will revolutionize PM functions. If current research is to go by, AI in some form will replace one-third of all learning graduate PM jobs by as early as 2030. A recent survey indicates AI spending will reach $97.9 billion by the end of 2023. Considering such a profound impact, the project management profession will also see a paradigm shift driven by AI. The study examines what the project management profession will look like in the next 5-10 years after this technological disruption. The research methods incorporate existing literature, trend analysis, and structured interviews with project management stakeholders from North America to gauge the trend. PM professionals can harness the power of AI, ensuring a smooth transition and positive outcomes. AI adoption will maximize benefits, minimize adverse consequences, and uphold ethical standards, leading to improved project performance.
Keywords: project management, disruptive technologies, project management function, AI applications, artificial intelligence
Procedia PDF Downloads 82
14779 Integrative-Cyclical Approach to the Study of Quality Control of Resource Saving by the Use of Innovation Factors
Authors: Anatoliy A. Alabugin, Nikolay K. Topuzov, Sergei V. Aliukov
Abstract:
It is well known that when carrying out a quantitative evaluation of the quality control of economic processes (in particular, resource saving) with the help of innovation factors, three groups of problems arise: high uncertainty in the quality management indicators, their considerable ambiguity, and the high cost of providing large-scale research. These problems stem from the contradictory objectives of enhancing quality control in accordance with innovation factors while preserving the economic stability of the enterprise. Such factors are felt most acutely in countries lagging behind the developed economies of the world according to criteria of innovativeness and effectiveness of resource-saving management. In our opinion, the following two methods reconcile the above-mentioned objectives and reduce the conflict between the problems most effectively: 1) the use of paradigms and concepts of evolutionary improvement of the quality of resource-saving management over the cycle "from the design of an innovative product (technology) to its commercialization and the updating of customer-value parameters"; 2) the application of a so-called integrative-cyclical approach, consistent with the complexity and type of the concept, which allows a quantitative assessment of the stages of achieving consistency between these objectives (from a baseline of imbalance, through their compromise, to the achievement of positive synergies). For implementation, the following mathematical tools are included in the integrative-cyclical approach: index-factor analysis (to identify the most relevant factors); regression analysis of the relationship between quality control and the factors; the use of the results of the analysis in a fuzzy-set model (to adjust the feature space); and non-parametric statistics (to decide on the completion or repetition of the cycle depending on the strength and closeness of the rank correlation of the indicators of goal imbalance). Repetition is performed after partial substitution of technical and technological ("hard") factors by management ("soft") factors in accordance with our proposed methodology. Testing of the proposed approach has shown that, in comparison with world practice, there are opportunities to improve the quality of resource-saving management using innovation factors. We believe that the implementation of this promising research, which provides consistent management decisions for reducing the severity of the above-mentioned contradictions and increasing the validity of the choice of resource-development strategies in terms of quality management parameters and enterprise sustainability, is a worthwhile prospect. Our existing experience in the field of quality resource-saving management and the achieved level of scientific competence of the authors allow us to hope that the use of the integrative-cyclical approach to the study and evaluation of the resulting and factor indicators will help raise resource-saving characteristics up to the levels existing in the developed economies of the post-industrial type.
Keywords: integrative-cyclical approach, quality control, evaluation, innovation factors, economic sustainability, innovation cycle of management, disbalance of goals of development
Procedia PDF Downloads 245
14778 Risk Factors for Defective Autoparts Products Using Bayesian Method in Poisson Generalized Linear Mixed Model
Authors: Pitsanu Tongkhow, Pichet Jiraprasertwong
Abstract:
This research investigates risk factors for defective products in autoparts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, has a Poisson distribution is adopted. Its performance is compared with the Poisson GLM under a Bayesian framework. The factors considered are production process, machines, and workers. The products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products is Process 1; for the machine factor, the highest risk is Machine 5; and for the worker factor, the highest risk is Worker 6.
Keywords: defective autoparts products, Bayesian framework, generalized linear mixed model (GLMM), risk factors
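A generic form of such a Bayesian Poisson GLMM (the exact random-effect structure and priors used in the paper are not stated in the abstract) is:

```latex
y_{ijk} \sim \operatorname{Poisson}(\lambda_{ijk}), \qquad
\log \lambda_{ijk} = \beta_0 + \beta^{\text{process}}_{i}
                   + \beta^{\text{machine}}_{j}
                   + \beta^{\text{worker}}_{k} + u_{ijk}, \qquad
u_{ijk} \sim N(0, \sigma_u^{2}),
```

where y_{ijk} is the defective count for process i, machine j and worker k, priors are placed on the coefficients and on sigma_u under the Bayesian framework, and dropping the random effect u reduces the specification to the Poisson GLM used as the benchmark.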
Procedia PDF Downloads 566
14777 A Case Study of Brownfield Revitalization in Taiwan
Authors: Jen Wang, Wei-Chia Hsu, Zih-Sin Wang, Ching-Ping Chu, Bo-Shiou Guo
Abstract:
In the late 19th century, the Jinguashi ore deposit in northern Taiwan was discovered and was accompanied by flourishing mining activities. However, tons of contaminants, including heavy metals, sulfur dioxide, and total petroleum hydrocarbons (TPH), were released into the surroundings and caused environmental problems. Site T was a copper smelter located on the coastal hill near the Jinguashi ore deposit. Over more than ten years of operation, a variety of contaminants were emitted that polluted the surrounding soil and groundwater. In order to exhaust the fumes produced by the smelting process, three stacks were built along the hill behind the factory. The sediment inside the stacks contains high concentrations of heavy metals such as arsenic, lead and copper. Moreover, the soil around the abandoned stacks suffered serious contamination when deposits leached from ruptures in the stacks. Consequently, Site T (including the factory and its surroundings) was declared a pollution remediation site, and visiting the site and land-use activities on it are forbidden. However, the natural landscape and cultural attractions of Site T are spectacular and attract many visitors annually. Moreover, land resources are extremely precious in Taiwan, and the Taiwan Environmental Protection Administration (EPA) is actively promoting its contaminated land revitalization policy. Therefore, this study took Site T as a case study for brownfield revitalization planning, with the aim of activating and remediating the natural resources to the extent possible. Land-use suitability analysis and risk mapping were applied in this study to devise appropriate risk management measures and a redevelopment plan for the site. In the land-use suitability analysis, surrounding factors such as environmentally sensitive areas, biological resources, land use, contamination, culture, and landscapes were collected to assess the development potential of each area; health risk mapping was introduced to visualize the risk assessment results based on the site contamination investigation. According to the land-use suitability analysis, the site was divided into four zones: a priority area (for high-efficiency development), a secondary area (for co-development with the priority area), a conditional area (for reusing existing buildings) and a limited area (for eco-tourism and education). According to the investigation, polychlorinated biphenyls (PCB), heavy metals and TPH were considered the target contaminants, while oral, inhalation and dermal routes would be the major exposure pathways in the health risk assessment. According to the health risk map, the highest risk was found on the southwest and eastern sides. Based on these results, the development plan focused on zoning and land use. It was recommended that Site T be divided into a public facility zone, public architectonic art zone, viewing zone, existing building preservation zone, historic building zone, and cultural landscape zone for various purposes. In addition, risk management measures, including sustained remediation, elimination of exposure, and administrative management, are applied to ensure that particular places are suitable for visiting and to protect visitors' health. The consolidated results are corroborated by analyzing aspects of law, land acquisition methods, maintenance and management, and public participation. Therefore, this study has a certain reference value for promoting the contaminated land revitalization policy in Taiwan.
Keywords: brownfield revitalization, land-use suitability analysis, health risk map, risk management
Procedia PDF Downloads 182
14776 Risk Tolerance and Individual Worthiness Based on Simultaneous Analysis of the Cognitive Performance and Emotional Response to a Multivariate Situational Risk Assessment
Authors: Frederic Jumelle, Kelvin So, Didan Deng
Abstract:
A method and system for neuropsychological performance testing is presented, comprising a mobile terminal used to interact with a cloud server which stores user information and is logged into by the user through the terminal device; the user information is accessed directly through the terminal device and is processed by an artificial neural network, and it comprises the user's facial emotion information, performance test answers and chronometrics. This assessment is used to evaluate the cognitive performance and emotional response of the subject to a series of dichotomous questions describing various situations of daily life and challenging the user's knowledge, values, ethics, and principles. In industrial applications, the timing of this assessment will depend on the user's need to obtain a service from a provider, such as opening a bank account, getting a mortgage or an insurance policy, authenticating clearance at work, or securing online payments.
Keywords: artificial intelligence, neurofinance, neuropsychology, risk management
Procedia PDF Downloads 136
14775 Pricing the Risk Associated to Weather of Variable Renewable Energy Generation
Authors: Jorge M. Uribe
Abstract:
We propose a methodology for setting the price of an insurance contract targeted at managing the risk associated with weather conditions that affect variable renewable energy generation. The methodology relies on conditional quantile regressions to estimate the weather risk of a solar panel. It is illustrated using real daily radiation and weather data for three cities in Spain (Valencia, Barcelona and Madrid) from February 2, 2004 to January 22, 2019. We also adapt the concepts of value at risk and expected shortfall from finance to this context, to provide a complete panorama of what we label weather risk. The methodology is easy to implement and can be used by insurance companies to price a contract with the aforementioned characteristics when data about similar projects and accurate cash flow projections are lacking. Our methodology assigns a higher price to an insurance product with the stated characteristics in Madrid, compared to Valencia and Barcelona. This is consistent with Madrid showing the largest interquartile range of operational deficits, and it is unrelated to the average deficit value, which illustrates the importance of our proposal.
Keywords: insurance, weather, VRE, risk
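A minimal sketch of the quantile-regression step and the adapted value-at-risk and expected-shortfall measures, on simulated data with hypothetical covariate names (the study uses real radiation and weather records for the three Spanish cities):

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the quantile-regression step: estimate a low conditional quantile of
# daily solar generation given weather covariates, and read off VaR/ES-style
# measures of the generation shortfall. Data and covariate names are hypothetical.

rng = np.random.default_rng(1)
n = 1000
cloud_cover = rng.uniform(0, 1, n)
temperature = rng.normal(20, 8, n)
generation = 10 - 6 * cloud_cover + 0.05 * temperature + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([cloud_cover, temperature]))
q05 = sm.QuantReg(generation, X).fit(q=0.05)      # conditional 5% quantile
print(q05.params)

# Unconditional analogues of VaR and expected shortfall for the generation deficit
deficit = generation.mean() - generation           # shortfall relative to the mean
var_95 = np.quantile(deficit, 0.95)
es_95 = deficit[deficit >= var_95].mean()
print(var_95, es_95)
```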
Procedia PDF Downloads 146
14774 Minimum Data of a Speech Signal as Special Indicators of Identification in Phonoscopy
Authors: Nazaket Gazieva
Abstract:
Voice biometric data associated with physiological, psychological and other factors are widely used in forensic phonoscopy. There are various methods for identifying and verifying a person by voice. This article explores the minimum speech signal data as individual parameters of a speech signal. Monozygotic twins are believed to be genetically identical. Using the minimum data of the speech signal, we came to the conclusion that the voice imprint of monozygotic twins is individual. Based on the results of the experiment, we conclude that the minimum indicators of the speech signal are more stable and reliable for phonoscopic examinations.
Keywords: phonogram, speech signal, temporal characteristics, fundamental frequency, biometric fingerprints
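As an illustration of one such minimum indicator, the fundamental frequency can be estimated from a voiced frame with a simple autocorrelation method; the synthetic frame and parameter choices below are assumptions, not the authors' feature set:

```python
import numpy as np

# Illustrative estimation of the fundamental frequency via the autocorrelation
# method. The frame below is synthetic; the authors' actual feature set and
# estimation procedure are not specified in the abstract.

def estimate_f0(frame, fs, f_min=70.0, f_max=400.0):
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_min, lag_max = int(fs / f_max), int(fs / f_min)
    lag = lag_min + np.argmax(ac[lag_min:lag_max])   # lag of the strongest peak
    return fs / lag

if __name__ == "__main__":
    fs, f0_true = 16000, 150.0
    t = np.arange(0, 0.1, 1.0 / fs)
    frame = np.sin(2 * np.pi * f0_true * t) + 0.3 * np.sin(2 * np.pi * 2 * f0_true * t)
    print(round(estimate_f0(frame, fs), 1))          # close to 150 Hz
```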
Procedia PDF Downloads 141
14773 Correlations between Obesity Indices and Cardiometabolic Risk Factors in Obese Subgroups in Severely Obese Women
Authors: Seung Hun Lee, Sang Yeoup Lee
Abstract:
Objectives: To investigate associations between degrees of obesity using correlations between obesity indices and cardiometabolic risk factors. Methods: BMI, waist circumference (WC), fasting insulin, fasting glucose, lipids, and visceral adipose tissue (VAT) area measured from computed tomographic images were obtained in 113 obese females without cardiovascular disease (CVD). Correlations between obesity indices and cardiometabolic risk factors were analyzed in obese subgroups defined using sequential obesity indices. Results: Mean BMI and WC were 29.6 kg/m2 and 92.8 cm. BMI showed significant correlations with all five cardiometabolic risk factors until the BMI cut-off point reached 27 kg/m2, but when it exceeded 30 kg/m2, correlations no longer existed. WC was significantly correlated with all five cardiometabolic risk factors up to a value of 85 cm, but when WC exceeded 90 cm, correlations no longer existed. Conclusions: Our data suggest that moderate weight-loss goals may not be enough to ameliorate cardiometabolic markers in severely obese patients. Therefore, individualized weight-loss goals should be recommended to such patients to improve health benefits.
Keywords: correlation, cardiovascular disease, risk factors, obesity
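A sketch of the subgroup correlation analysis described above, using simulated values and a single risk factor for brevity (the study used measured clinical data and five risk factors):

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

# Sketch of the subgroup correlation analysis: correlate BMI with a
# cardiometabolic marker within subgroups defined by sequential BMI cut-offs.
# The data frame below is simulated, not the measured clinical data.

rng = np.random.default_rng(2)
n = 113
bmi = rng.normal(29.6, 3.5, n)
fasting_insulin = 2 + 0.4 * bmi + rng.normal(0, 3, n)
df = pd.DataFrame({"bmi": bmi, "fasting_insulin": fasting_insulin})

for cutoff in (25, 27, 30, 33):
    sub = df[df["bmi"] >= cutoff]
    if len(sub) > 10:
        r, p = pearsonr(sub["bmi"], sub["fasting_insulin"])
        print(f"BMI >= {cutoff}: n={len(sub)}, r={r:.2f}, p={p:.3f}")
```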
Procedia PDF Downloads 354
14772 Measuring Flood Risk concerning with the Flood Protection Embankment in Big Flooding Events of Dhaka Metropolitan Zone
Authors: Marju Ben Sayed, Shigeko Haruyama
Abstract:
Among all kinds of natural disasters, flooding is a common feature of the rapidly urbanizing Dhaka city. In this research, the flood risk of the Dhaka metropolitan area has been assessed using an integrated approach combining GIS, remote sensing and socio-economic data. The purpose of the study is to measure the flooding risk associated with the flood protection embankment in big flooding events (1988, 1998 and 2004) and with the urbanization of the Dhaka metropolitan zone. In this research, we divided Dhaka city into two parts: East Dhaka (outside the flood protection embankment) and West Dhaka (inside the flood protection embankment). Using statistical data, we explored the socio-economic status of the study area population by comparing population density, land price and income level. We have drawn the cross-section profile of the flood protection embankment at three different points to assess the flooding risk in the study area, especially in the big flood years (1988, 1998 and 2004). According to the physical condition of the study area, the land use/land cover map has been classified into five classes. By comparing each land cover unit with historical weather station data and the socio-economic data, the flooding risk has been evaluated. Moreover, we compared DEM data with each land cover unit to find out its relationship with flooding. It is expected that this study could contribute to effective flood forecasting, relief and emergency management for future flood events in Dhaka city.
Keywords: land use, land cover change, socio-economic, Dhaka city, GIS, flood
Procedia PDF Downloads 295
14771 An Analytical Approach to Assess and Compare the Vulnerability Risk of Operating Systems
Authors: Pubudu K. Hitigala Kaluarachchilage, Champike Attanayake, Sasith Rajasooriya, Chris P. Tsokos
Abstract:
Operating system (OS) security is a key component of computer security. Assessing and improving an OS's strength to resist vulnerabilities and attacks is a mandatory requirement, given the rate at which new vulnerabilities are discovered and attacks occur. The frequency and the number of different kinds of vulnerabilities found in an OS can be considered an index of its information security level. In the present study, five widely used OSs, Microsoft Windows (Windows 7, Windows 8 and Windows 10), Apple's Mac and Linux, are assessed for their discovered vulnerabilities and the risk associated with each. Each discovered and reported vulnerability has an exploitability score assigned in the CVSS score of the National Vulnerability Database. In this study, the risk from vulnerabilities in each of the five operating systems is compared. The risk indexes used are developed based on a Markov model to evaluate the risk of each vulnerability. The statistical methodology and underlying mathematical approach are described. Initially, parametric procedures were conducted and measured; there were, however, violations of some statistical assumptions observed, so the need for non-parametric approaches was recognized. 6838 recorded vulnerabilities were considered in the analysis. According to the risk associated with all the vulnerabilities considered, it was found that there is a statistically significant difference among average risk levels for some operating systems, indicating that, according to our method, some operating systems have been more exposed to risk than others given the assumptions and limitations. Relevant test results revealing a statistically significant difference in the risk levels of different OSs are presented.
Keywords: cybersecurity, Markov chain, non-parametric analysis, vulnerability, operating system
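A sketch of the non-parametric comparison step on simulated risk-index samples (the study used 6838 CVSS-derived indices; the Kruskal-Wallis test shown here is one standard rank-based choice, and the exact test used is not named in the abstract):

```python
import numpy as np
from scipy.stats import kruskal

# Sketch of a rank-based comparison of per-vulnerability risk indices across
# operating systems, appropriate when parametric assumptions are violated.
# The samples below are simulated, not the CVSS-derived indices from the study.

rng = np.random.default_rng(3)
risk_by_os = {
    "Windows 7": rng.gamma(2.0, 2.0, 300),
    "Windows 8": rng.gamma(2.2, 2.0, 300),
    "Windows 10": rng.gamma(2.1, 2.0, 300),
    "macOS": rng.gamma(1.8, 2.0, 300),
    "Linux": rng.gamma(1.9, 2.0, 300),
}

stat, p_value = kruskal(*risk_by_os.values())
print(f"H = {stat:.2f}, p = {p_value:.4f}")
```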
Procedia PDF Downloads 182
14770 Portfolio Selection with Active Risk Monitoring
Authors: Marc S. Paolella, Pawel Polak
Abstract:
The paper proposes a framework for large-scale portfolio optimization which accounts for all the major stylized facts of multivariate financial returns, including volatility clustering, dynamics in the dependency structure, asymmetry, heavy tails, and non-ellipticity. It introduces a so-called risk fear portfolio strategy which combines portfolio optimization with active risk monitoring. The former selects optimal portfolio weights. The latter, independently, initiates market exit in case of excessive risks. The strategy agrees with the stylized fact of stock market major sell-offs during the initial stage of market downturns. The advantages of the new framework are illustrated with an extensive empirical study. It leads to superior multivariate density and Value-at-Risk forecasting, and better portfolio performance. The proposed risk fear portfolio strategy outperforms various competing types of optimal portfolios, even in the presence of conservative transaction costs and frequent rebalancing. The risk monitoring of the optimal portfolio can serve as an early warning system against large market risks. In particular, the new strategy avoids all the losses during the 2008 financial crisis, and it profits from the subsequent market recovery.
Keywords: comfort, financial crises, portfolio optimization, risk monitoring
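A minimal sketch of the active risk-monitoring overlay, assuming a historical-simulation VaR forecast and an illustrative exit threshold (the paper uses a full multivariate model for its VaR forecasts):

```python
import numpy as np

# Minimal sketch of a "risk fear" overlay: hold the optimized portfolio while
# the forecast Value-at-Risk stays below a tolerance, otherwise exit to cash.
# The rolling-window historical-simulation VaR and the threshold below are
# simplifications of the multivariate forecasting model used in the paper.

def risk_fear_weights(portfolio_returns, window=250, alpha=0.01, var_limit=0.04):
    in_market = np.ones(len(portfolio_returns))
    for t in range(window, len(portfolio_returns)):
        hist = portfolio_returns[t - window:t]
        var_forecast = -np.quantile(hist, alpha)     # historical-simulation VaR
        if var_forecast > var_limit:
            in_market[t] = 0.0                       # market exit: move to cash
    return in_market

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    rets = rng.standard_t(5, size=2000) * 0.01       # hypothetical portfolio returns
    print(risk_fear_weights(rets).mean())            # fraction of days invested
```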
Procedia PDF Downloads 523
14769 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation
Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski
Abstract:
In problems of portfolio selection, the reward-risk ratio criterion is optimized to search for a risky portfolio offering the maximum increase of the mean return, in proportion to the risk measure increase, when compared to the risk-free investments. In the classical model, following Markowitz, the risk is measured by the variance, thus representing the Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization. In particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints in the case of real-life financial decisions based on several thousand scenarios, thus decreasing their computational efficiency and making them hard to solve with general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization and by taking advantage of the LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments, thus not seriously affecting the simplex method efficiency by the number of scenarios and therefore guaranteeing easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with the second-order stochastic dominance rules.
Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming
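In a generic scenario-based form consistent with the abstract (T equiprobable scenarios with returns r_{it}, mean returns mu_i and risk-free rate r_0), the ratio problem and its inverse-ratio linearization can be sketched as:

```latex
\max_{x\ge 0,\ \sum_i x_i=1}\ \frac{\mu^{\top}x - r_0}{\operatorname{MAD}(x)},
\qquad
\operatorname{MAD}(x)=\frac{1}{T}\sum_{t=1}^{T}\Bigl|\sum_i r_{it}x_i-\mu^{\top}x\Bigr|;
\\[6pt]
\text{with } \tilde{x}=\frac{x}{\mu^{\top}x-r_0},\ z=\frac{1}{\mu^{\top}x-r_0}:\qquad
\min_{\tilde{x},\,z,\,d}\ \frac{1}{T}\sum_{t=1}^{T} d_t
\quad\text{s.t.}\quad
d_t \ge \pm\sum_i (r_{it}-\mu_i)\tilde{x}_i \ (t=1,\dots,T),\ \
\mu^{\top}\tilde{x}-r_0 z = 1,\ \
\sum_i \tilde{x}_i = z,\ \
\tilde{x}\ge 0,\ z\ge 0.
```

The optimal portfolio is recovered as x = x̃/z; the further reduction of the number of structural constraints to the order of the number of instruments, as described above, is obtained by passing to the LP dual of this formulation.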
Procedia PDF Downloads 405
14768 The Impact of the Global Financial Crisis on the Performance of Czech Industrial Enterprises
Authors: Maria Reznakova, Michala Strnadova, Lukas Reznak
Abstract:
The global financial crisis that erupted in 2008 is associated mainly with the debt crisis. It quickly spread globally through financial markets, international banks and trade links, and affected many economic sectors. Measured by the index of the year-on-year change in GDP and industrial production, the consequences of the global financial crisis manifested themselves with some delay also in the Czech economy. This can be considered a result of the overwhelming export orientation of Czech industrial enterprises. These events offer an important opportunity to study how financial and macroeconomic instability affects corporate performance. Corporate performance factors have long been given considerable attention. It is therefore reasonable to ask whether the findings published in the past are also valid in the times of economic instability and subsequent recession. The decisive factor in effective corporate performance measurement is the existence of an appropriate system of indicators that are able to assess progress in achieving corporate goals. Performance measures may be based on non-financial as well as on financial information. In this paper, financial indicators are used in combination with other characteristics, such as the firm size and ownership structure. Financial performance is evaluated based on traditional performance indicators, namely, return on equity and return on assets, supplemented with indebtedness and current liquidity indices. As investments are a very important factor in corporate performance, their trends and importance were also investigated by looking at the ratio of investments to previous year's sales and the rate of reinvested earnings. In addition to traditional financial performance indicators, the Economic Value Added was also used. Data used in the research were obtained from a questionnaire survey administered in industrial enterprises in the Czech Republic and from AMADEUS (Analyse Major Database from European Sources), from which accounting data of companies were obtained. Respondents were members of the companies' senior management. Research results unequivocally confirmed that corporate performance dropped significantly in the 2010-2012 period, which can be considered a result of the global financial crisis and a subsequent economic recession. It was reflected mainly in the decreasing values of profitability indicators and the Economic Value Added. Although the total year-on-year indebtedness declined, intercompany indebtedness increased. This can be considered a result of impeded access of companies to bank loans due to the credit crunch. Comparison of the results obtained with the conclusions of previous research on a similar topic showed that the assumption that firms under foreign control achieved higher performance during the period investigated was not confirmed.
Keywords: corporate performance, foreign control, intercompany indebtedness, ratio of investment
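As a brief worked sketch of the performance indicators listed above, on illustrative figures rather than data from the surveyed enterprises:

```python
# Worked sketch of the performance indicators mentioned above, computed on
# illustrative figures (not data from the surveyed Czech enterprises).

net_income = 12.0          # hypothetical values, e.g. CZK million
equity = 100.0
total_assets = 250.0
nopat = 15.0
invested_capital = 180.0
wacc = 0.09

roe = net_income / equity                      # return on equity
roa = net_income / total_assets                # return on assets
eva = nopat - wacc * invested_capital          # Economic Value Added

print(f"ROE = {roe:.1%}, ROA = {roa:.1%}, EVA = {eva:.1f}")
```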
Procedia PDF Downloads 332
14767 Management of Nutritional Strategies in Prevention of Autism Before and During Pregnancy
Authors: Maryam Ghavam Sadri, Kimia Moiniafshari
Abstract:
Objectives: Autism is a neuro-developmental disorder that has negative effects on verbal, mental and behavioral development. Studies have shown the role of the maternal dietary pattern before and during pregnancy, and the relevance of applying nutritional management programs to the prevention of autism has been demonstrated. This review article was undertaken to investigate the role of nutritional management strategies before and during pregnancy in the prevention of autism. Methods: This review study was accomplished by using keywords related to the topic; 67 articles were found (2000-2015), and finally 20 articles meeting criteria such as the inclusion of maternal lifestyle, nutritional deficiencies and autism prevention were selected. Results: Maternal dietary pattern and health before and during pregnancy play important roles in the incidence of autism. Studies have suggested that high dietary fat intake and obesity can increase the risk of autism in offspring. Maternal metabolic conditions, especially gestational diabetes (GDM) (p-value < 0.04), and folate deficiency (p-value = 0.04) are associated with the risk of autism. Studies have shown that folate intake in mothers of autistic children is lower than in mothers who have typically developing children (TYP) (p-value < 0.01). As folate is an essential micronutrient for fetal mental development, consumption of 600 mcg/day on average, especially in the P1 phase of pregnancy, results in a significant reduction in the incidence of autism (OR: 1.53, 95% CI = 0.42-0.92, p-value = 0.02). Furthermore, essential fatty acid deficiency, especially omega-3 fatty acid deficiency, increases the rate of autism, and consumption of supplements and food sources of omega-3 can decrease the risk of autism by up to 34% (RR = 1.53, 95% CI = 1-2.32). Conclusion: Given the role of nutritional deficiencies and maternal metabolic conditions before and during pregnancy in the prevalence of autism, carrying out appropriate nutritional strategies, such as well-timed folate supplementation before pregnancy and adherence to a healthy lifestyle for the prevention of metabolic syndrome (GDM), seems to help prevent autism.
Keywords: autism, autism prevention, dietary inadequacy, maternal lifestyle
Procedia PDF Downloads 355
14766 KPI and Tool for the Evaluation of Competency in Warehouse Management for Furniture Business
Authors: Kritchakhris Na-Wattanaprasert
Abstract:
The objective of this research is to design and develop a prototype of a key performance indicator (KPI) system suitable for warehouse management in a case study, based on user requirements. In this study, we design a prototype KPI system for the warehouse of a furniture business case study following these steps: identify the scope of the research and study related papers, gather the necessary data and user requirements, develop key performance indicators based on the balanced scorecard, design the program and database for the key performance indicators, code the program and set up the database relationships, and finally test and debug each module. This study uses the Balanced Scorecard (BSC) for selecting and grouping the key performance indicators. Microsoft SQL Server 2010 is used to create the system database, and Microsoft Visual C# 2010 is chosen as the graphical user interface development tool. The system consists of six main menus: login, main data, financial perspective, customer perspective, internal perspective, and learning and growth perspective. Each menu consists of key performance indicator forms, and each form contains a data import section, a data input section, a data search and edit section, and a report section. The system generates five main reports: the KPI detail report, KPI summary report, KPI graph report, benchmarking summary report and benchmarking graph report. The user selects the conditions of the report and the time period. As the system has been developed and tested, it has been found to be one way of judging the extent to which warehouse objectives have been achieved; moreover, it helps warehouse functions proceed more efficiently. In order to be useful for other industries, the system can be adjusted appropriately. To increase the usefulness of the key performance indicator system, the recommendations for further development are as follows: the warehouse should periodically review the target values and set more suitable targets as the situation fluctuates in the future, and the warehouse should periodically review the key performance indicators and set more suitable ones in order to increase competitiveness and take advantage of new opportunities.
Keywords: key performance indicator, warehouse management, warehouse operation, logistics management
Procedia PDF Downloads 430
14765 Spatial Analysis for Wind Risk Index Assessment
Authors: Ljiljana Seric, Vladimir Divic, Marin Bugaric
Abstract:
This paper presents a methodology for the spatial analysis of GIS data that is used to assess the microlocation risk index for potential damage from high winds. The analysis is performed on freely available GIS data comprising information about wind load, terrain cover and the topography of the area. The methodology utilizes the Eurocode norms for the determination of the wind load on buildings and constructions. The core of the methodology is the assignment of the wind load parameters to locations on a geographical spatial grid. The presented work is part of the Wind Risk Project, supported by the European Commission under the Civil Protection Financial Instrument of the European Union (ECHO). The partners involved in the Wind Risk Project performed a wind risk assessment and proposed action plans for three European countries: Slovenia, Croatia and Germany. The proposed method is implemented in the GRASS GIS open source GIS software and demonstrated for the case study area of the wider area of Split, Croatia. The obtained Wind Risk Index is visualized and correlated with critical infrastructure such as buildings, roads and power lines. The results show a good correlation between a high Wind Risk Index and recent wind-related incidents.
Keywords: Eurocode norms, GIS, spatial analysis, wind distribution, wind risk
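A small sketch of the Eurocode EN 1991-1-4 quantities on which such a wind-load assignment is based (the exposure factor and wind velocity below are placeholder values; in the methodology they come from the GIS layers for each grid cell):

```python
# Sketch of the Eurocode EN 1991-1-4 wind-load quantities: basic velocity
# pressure from the fundamental wind velocity, scaled by a terrain-dependent
# exposure factor. The numeric inputs below are placeholders; in the described
# methodology they are taken from the wind-load, terrain and topography layers.

RHO_AIR = 1.25  # kg/m^3, recommended air density in EN 1991-1-4

def basic_velocity_pressure(v_b0, c_dir=1.0, c_season=1.0):
    v_b = c_dir * c_season * v_b0          # basic wind velocity [m/s]
    return 0.5 * RHO_AIR * v_b ** 2        # q_b [N/m^2]

def peak_velocity_pressure(q_b, exposure_factor):
    return exposure_factor * q_b           # q_p(z) = c_e(z) * q_b

if __name__ == "__main__":
    q_b = basic_velocity_pressure(v_b0=30.0)                  # value from the wind-load map
    q_p = peak_velocity_pressure(q_b, exposure_factor=2.4)    # assumed c_e(z) for the cell
    print(round(q_b, 1), round(q_p, 1))
```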
Procedia PDF Downloads 314
14764 A Cohort and Empirical Based Multivariate Mortality Model
Authors: Jeffrey Tzu-Hao Tsai, Yi-Shan Wong
Abstract:
This article proposes a cohort-age-period (CAP) model to characterize multi-population mortality processes using cohort, age, and period variables. Distinct from the factor-based Lee-Carter-type decomposition mortality models, this approach is empirically based and includes the age, period, and cohort variables in the equation system. The model not only provides fruitful intuition for explaining multivariate mortality change rates but also performs better in forecasting future patterns. Using US and UK mortality data and performing ten-year out-of-sample tests, our approach shows smaller mean square errors in both countries compared to the models in the literature.
Keywords: longevity risk, stochastic mortality model, multivariate mortality rate, risk management
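A generic cohort-age-period specification of the kind described, for population i, age x and calendar year t (the paper's exact functional form and dependence structure are not given in the abstract), is:

```latex
\log m^{(i)}_{x,t} = \alpha^{(i)}_{x} + \kappa^{(i)}_{t} + \gamma^{(i)}_{t-x} + \varepsilon^{(i)}_{x,t},
```

where m^{(i)}_{x,t} is the central death rate, alpha captures the age effect, kappa the period effect, gamma_{t-x} the cohort effect and epsilon an error term; cross-population dependence can then be introduced through jointly modelled period and cohort terms.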
Procedia PDF Downloads 51
14763 Behavioral and Cultural Risk Factor of Cardiovascular Disease in India: Evidence from SAGE-Study
Authors: Sunita Patel
Abstract:
Cardiovascular diseases (CVDs) are the leading cause of morbidity as well as mortality in India. The objective of this study is to examine the prevalence of CVDs and identify their behavioral and cultural risk factors with the help of the SAGE-2007 data collected in six states of India. Findings reveal that 18.3% of people were diagnosed with CVDs in India. The likelihood of disease rises with age: the odds are 2.45 (CI: 1.66-3.63) times higher in the 30-39 age group and 7.45 (CI: 4.82-11.49) times higher in the 70+ age group compared to the 18-29 age group. By wealth quintile, CVDs occur more frequently: 60% higher in the 3rd quintile (CI: 1.16-2.21) and 58% higher in the richest 5th quintile (CI: 1.13-2.21) in contrast to the lowest quintile. Relative risks show that those engaged in moderate and vigorous activity have a 22.4% and 44% lower chance of disease, respectively, compared to those who performed no work and those who consumed alcohol. The results suggest that appropriate policy measures should be recommended, which would be beneficial for people's awareness and their future.
Keywords: behavioral risk, cultural risk, cardio-vascular diseases, wealth quintile
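Age-group odds ratios of the kind reported above are typically obtained from logistic regression with the youngest group as reference; a sketch on simulated data (not the SAGE survey) follows:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Sketch of estimating age-group odds ratios for CVD via logistic regression
# (exponentiated coefficients relative to the 18-29 reference group).
# The data frame and effect sizes are simulated, not the SAGE-2007 data.

rng = np.random.default_rng(5)
n = 5000
age_group = rng.choice(["18-29", "30-39", "40-49", "50-59", "60-69", "70+"], size=n)
true_effect = {"18-29": 0.0, "30-39": 0.9, "40-49": 1.2,
               "50-59": 1.5, "60-69": 1.8, "70+": 2.0}
base_logit = -2.0 + pd.Series(age_group).map(true_effect).to_numpy()
cvd = rng.binomial(1, 1 / (1 + np.exp(-base_logit)))

X = pd.get_dummies(pd.Series(age_group), drop_first=True).astype(float)  # 18-29 as reference
X = sm.add_constant(X)
fit = sm.Logit(cvd, X).fit(disp=0)
print(np.exp(fit.params))        # odds ratios vs. the 18-29 reference group
```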
Procedia PDF Downloads 398
14762 The Evolving Customer Experience Management Landscape: A Case Study on the Paper Machine Companies
Authors: Babak Mohajeri, Sen Bao, Timo Nyberg
Abstract:
Customer experience is increasingly the differentiator between successful companies and those who struggle. Customer experiences are becoming more dynamic, and they advance with each interaction between the company and a customer. Every customer conversation, and any effort to evolve these conversations, would be beneficial and should ultimately result in a positive customer experience. The aim of this paper is to analyze the evolving customer experience management landscape and the relevant challenges and opportunities. A case study on "paper machine" companies is chosen. Hence, this paper analyzes the challenges and opportunities in customer experience management of paper machine companies for the case of the "road to steel". The road to steel shows the journey of steel from raw material to end product (i.e., the paper machine in this paper). ALPHA (a steel company) and BETA (a paper machine company) are chosen, and their efforts to evolve the customer experiences are investigated. Semi-structured interviews are conducted with experts in those companies to identify the challenges and opportunities of evolving customer experience management from their point of view. The findings of this paper contribute to the theory and business practices in the realm of the evolving customer experience management landscape.
Keywords: customer experience management, paper machine, value chain management, risk analysis
Procedia PDF Downloads 361
14761 Formulation of a Rapid Earthquake Risk Ranking Criteria for National Bridges in the National Capital Region Affected by the West Valley Fault Using GIS Data Integration
Authors: George Mariano Soriano
Abstract:
In this study, a Rapid Earthquake Risk Ranking Criteria was formulated by integrating various existing maps and databases from the Department of Public Works and Highways (DPWH) and the Philippine Institute of Volcanology and Seismology (PHIVOLCS). Utilizing Geographic Information System (GIS) software, the above-mentioned maps and databases were used to extract seismic hazard parameters and bridge vulnerability characteristics in order to rank the seismic damage risk rating of bridges in the National Capital Region.
Keywords: bridge, earthquake, GIS, hazard, risk, vulnerability
Procedia PDF Downloads 406
14760 Overview of Standard Unit System of Shenzhen Land Spatial Planning and Case Analysis
Authors: Ziwei Huang
Abstract:
The standard unit of Shenzhen land spatial planning has the characteristics of vertical transmission, horizontal evaluation, internal balance and supervision of implementation. It mainly serves as a basic geospatial unit, assists in promoting complex business development in Shenzhen, and undertakes the management and transmission between upper and lower levels of planning, as well as urban management functions such as gap analysis of public facilities, planning evaluation and dynamic monitoring of planning information. Combined with an application example of gap analysis of public facilities in Longgang District, it can be seen that the standard unit of land spatial planning in Shenzhen, as a small-scale basic geographic unit, has a strong urban spatial coupling effect. However, the system still lacks universal applicability, and it is necessary to propose more scientific and robust standards for delineating the standard units, together with planning function evaluation indicators, to guide the popularization and application of the system.
Keywords: Shenzhen city, land spatial planning, standard unit system, urban delicacy management
Procedia PDF Downloads 127