Search results for: Health Data Standards
7456 Acute Coronary Syndrome Prediction Using Data Mining Techniques - An Application
Authors: Tahseen A. Jilani, Huda Yasin, Madiha Yasin, C. Ardil
Abstract:
In this paper, we use data mining techniques to investigate factors that contribute significantly to enhancing the risk of acute coronary syndrome. We assume that the dependent variable is the diagnosis, with dichotomous values showing the presence or absence of disease. We have applied binary logistic regression to the factors affecting the dependent variable. The data set has been taken from two different cardiac hospitals in Karachi, Pakistan. We have a total of sixteen variables, of which one is the dependent variable and the other fifteen are independent variables. For better performance of the regression model in predicting acute coronary syndrome, data reduction techniques such as principal component analysis are applied. Based on the results of data reduction, we have considered only 14 of the 16 factors.
Keywords: Acute coronary syndrome (ACS), binary logistic regression analyses, myocardial ischemia (MI), principal component analysis, unstable angina (U.A.).
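As an illustration of the pipeline this abstract describes (principal component analysis for data reduction followed by binary logistic regression on a dichotomous diagnosis), the following minimal sketch uses synthetic data; the variable names, the scikit-learn library choice and the data are assumptions, not the authors' code.

```python
# Minimal sketch (assumed, not the authors' code): PCA-based data reduction
# followed by binary logistic regression on a dichotomous diagnosis variable.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 15))                                        # 15 hypothetical risk factors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=300) > 0).astype(int)  # diagnosis: 1 = disease present

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Keep the number of components suggested by the data reduction step (here 14).
model = make_pipeline(PCA(n_components=14), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))
```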
7455 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach
Authors: Elias K. Maragos, Petros E. Maravelakis
Abstract:
In Dynamic Data Envelopment Analysis (DDEA), which is a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as accepted by most researchers, there are outputs produced by a DMU in one period that are used as inputs in a future period. Those outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of those input, output or intermediate data, assuming that the distribution of their virtual value does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
Keywords: Data envelopment analysis, Dynamic DEA, Piecewise linear inputs, Piecewise linear outputs.
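For context, the sketch below solves the standard input-oriented CCR DEA envelopment program, the static building block that dynamic and piecewise extensions such as the one above generalise; the toy data and the scipy formulation are assumptions for illustration only, not the authors' model.

```python
# Illustrative sketch (not the authors' model): input-oriented CCR DEA efficiency
# score theta for one DMU, solved as a linear program with scipy.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU o, given inputs X (n x m) and outputs Y (n x s)."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta; variables = [theta, lambdas]
    A_in = np.c_[-X[o, :, None], X.T]                 # sum_j lam_j * x_ji <= theta * x_oi
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]    # sum_j lam_j * y_jr >= y_or
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[o, :]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0]])    # hypothetical inputs
Y = np.array([[1.0], [1.0], [1.0]])                   # hypothetical single output
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])   # efficient DMUs score 1.0
```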
7454 Fetal and Infant Mortality in Botucatu City, São Paulo State, Brazil: Evaluation of Maternal-Infant Health Care
Authors: Noda L. M., Salvador I. C., C. M. L. G. Parada, Fonseca C. R. B.
Abstract:
In Brazil, the neonatal mortality rate is considered incompatible with the country's development conditions and has been a public health concern. Reduction in infant mortality rates has also been part of the Millennium Development Goals, a commitment made by the member countries of the United Nations (UN), including Brazil. The fetal mortality rate is considered a highly sensitive indicator of health care quality. Suitable actions, such as good quality of, and access to, health services may contribute positively towards reducing these fetal and neonatal rates. With appropriate antenatal follow-up and health care during gestation and delivery, some causes of death could be reduced or even prevented by means of early diagnosis and intervention, as well as changes in risk factors. Objectives: To study the quality of maternal and infant health care based on fetal and neonatal mortality, as well as the possible actions to prevent those deaths in Botucatu (Brazil). Methods: Preventability was classified according to the International Classification of Diseases and the modified Wigglesworth classification. In order to evaluate adequacy, indicators of the quality of antenatal and delivery care were established by the authors. Results: Considering fetal deaths, 56.7% occurred before delivery, which reveals possible shortcomings in antenatal care, and 38.2% resulted from intra-labor changes, which could be prevented or reduced by adequate obstetric management. These findings differed from those in the group of early neonatal deaths, which was also studied. Assessment of health services showed that antenatal and childbirth care was appropriate for 24% and 33.3% of pregnant women, respectively, which corroborates the preventability results. These results revealed that shortcomings in obstetric and antenatal care could be the causes of the deaths in the study. Early and late neonatal deaths had similar characteristics: 76% could be prevented or reduced, mainly by adequate newborn care (52.9%) and adequate health care for pregnant women (11.7%). When adequacy of care was evaluated, childbirth and newborn care was adequate in 25.8% of cases and antenatal care in 16.1%. In conclusion, a direct relationship was found between the adequacy and quality of care provided to pregnant women and newborns and fetal and infant mortality. Moreover, our findings highlight that deaths could be prevented by adequate obstetric and neonatal management.
Keywords: Fetal Mortality, Infant Mortality, Maternal-Child Health Services, Program Evaluation.
7453 Data Mining Determination of Sunlight Average Input for Solar Power Plant
Authors: Fl. Loury, P. Sablonière, C. Lamoureux, G. Magnier, Th. Gutierrez
Abstract:
A method is proposed to extract faithful representative patterns from a set of observations when they suffer from non-negligible fluctuations. Supposing the time interval between measurements to be extremely small compared to the observation time, the method consists in first defining a subset of intermediate time intervals characterizing coherent behavior. Projecting the data onto these intervals gives a set of curves, out of which an ideally “perfect” one is constructed by taking their supremum. Comparison with the average real curve in the corresponding interval then gives an efficiency parameter expressing the degradation caused by the fluctuation effect. The method is applied to sunlight data collected at a specific place, where the ideal sunlight is that resulting from direct exposure at the location's latitude over the year, and the efficiency reflects the action of meteorological parameters, mainly cloudiness, at different periods of the year. The extracted information already provides interesting elements for decision-making, before being used for the analysis of plant control.
Keywords: Base Input Reconstruction, Data Mining, Efficiency Factor, Information Pattern Operator.
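The construction described above (a pointwise supremum as the "ideal" curve and an efficiency factor relating the real average to it) can be sketched in a few lines; the synthetic daily sunlight profiles and variable names below are assumptions used only to illustrate the idea.

```python
# Sketch of the idea in the abstract (synthetic data, assumed variable names):
# build an "ideal" curve as the pointwise supremum of per-interval curves and
# express the degradation due to fluctuations as an efficiency factor.
import numpy as np

hours = np.linspace(0, 24, 97)
clear_sky = np.clip(np.sin(np.pi * (hours - 6) / 12), 0, None)   # idealised daily shape

rng = np.random.default_rng(1)
days = clear_sky * rng.uniform(0.3, 1.0, size=(30, 1))           # 30 attenuated daily curves
days *= rng.uniform(0.8, 1.0, size=days.shape)                   # short-term fluctuations

ideal = days.max(axis=0)             # sup limit over the observed curves
average = days.mean(axis=0)          # average real curve on the same interval

efficiency = average.sum() / ideal.sum()    # degradation due to fluctuation effects
print(f"efficiency factor: {efficiency:.2f}")
```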
7452 Instruction and Learning Design Consideration for the Development of Mobile Learning Application
Authors: M. Sarrab, M. Elbasir
Abstract:
The use of information technology in education has changed not only learners' learning styles but also the way they are taught; nowadays, learners are connected to a diversity of information sources, with knowledge available everywhere. The advantages of wireless network and mobility technologies used in education and learning processes have led to mobile learning as a new model of learning technology. Currently, most mobile learning applications are developed for formal education and learning environments. Despite the long history of, and large amount of research on, mobile learning and instructional design models, there is still a need for a well-defined process for designing mobile learning applications. Based on this situation, this paper focuses on identifying the considerations and influencing factors of the instructional design phases in developing a mobile learning application. This set of instructional design steps, which includes analysis, design, development, implementation, evaluation and continuous improvement, has been built from a literature study, with a focus on standards for learning, mobile application software quality and guidelines. The effort is part of an Omani-funded research project investigating the development, adoption and dissemination of mobile learning in Oman.
Keywords: Instruction design, mobile learning, mobile application.
7451 Food Safety Management: Concerns from EU Tourists in Thailand
Authors: Kevin Wongleedee
Abstract:
Differences in culinary culture can cause health problems for international tourists in Thailand. This paper drew upon data collected from an international tourist survey conducted in Bangkok, Thailand, during the summer of 2012. Summer is the period when a variety of food safety issues and incidents are often publicized in Thailand. The survey targeted European Union tourists' concerns about a variety of food safety issues that they encountered during their trip in Thailand. A total of 400 respondents provided the data input for t-tests and one-way ANOVA tests. The findings revealed the striking result that 46.5 percent of respondents were sick at least once in Thailand. However, the majority of respondents trusted Thai hotels and Thai restaurants to ensure food safety, but they did not trust street vendors. The levels of food safety concern can be ranked from most to least concern by mean score as follows: 1) artificial coloring, 2) use of preservatives, 3) antibiotics, 4) growth hormones, 5) chemical residues, and 6) bacterial contamination. The overall mean score for the level of concern was 3.493 with a standard deviation of 1.677, which does not indicate a very high level of concern. In addition, the t-test and one-way ANOVA results revealed that demographic differences had little effect on the level of food safety concern.
Keywords: Concerns, European Union Tourists, Food Safety Management.
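The comparisons mentioned above rest on an independent-samples t-test and a one-way ANOVA; a minimal sketch of those two tests on synthetic concern scores (the groups and values are assumptions, not the survey data) is:

```python
# Hedged illustration (synthetic scores, not the survey data): the independent
# t-test and one-way ANOVA used to compare concern levels across demographic groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
male = rng.normal(3.5, 1.7, 200)                      # hypothetical concern scores
female = rng.normal(3.5, 1.7, 200)
age_groups = [rng.normal(3.5, 1.7, 130) for _ in range(3)]

t_stat, t_p = stats.ttest_ind(male, female)           # two-group comparison
f_stat, f_p = stats.f_oneway(*age_groups)             # multi-group comparison
print(f"t-test p = {t_p:.3f}, one-way ANOVA p = {f_p:.3f}")
```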
7450 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems
Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas
Abstract:
This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as are the recognition and management of context information. Measuring many parameters during the transportation period and properly monitoring driver work have become a problem. The number of vehicles passing a given point per unit of time can be evaluated in some situations. The collected data are mainly used to establish new trips. The data flow is more complex in urban areas, where the movement of freight is reported in detail, including information at street level. When traffic density is extremely high, as in congestion cases, and the traffic speed is very low, data transmission reaches its peak. Different data sets are generated, depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks and mode-based delivery networks; the last includes different modes, in particular railways, and other networks. When freight delivery is switched from one of the above network types to another, more data could be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services to drivers by including an assessment of the multi-component infrastructure needed for the delivery of freight according to the network type. Such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting. The results obtained show that the proposed methodological approach can support management and decision-making processes, incorporating networking specifics and helping to minimize the overloads in data reporting.
Keywords: Transportation networks, freight delivery, data flow, monitoring, e-services.
7449 Inefficiency of Data Storing in Physical Memory
Authors: Kamaruddin Malik Mohamad, Sapiee Haji Jamel, Mustafa Mat Deris
Abstract:
Memory forensics is important in digital investigation. The forensics is based on the data stored in physical memory and involves memory management and processing time. However, current forensic tools do not consider efficiency in terms of storage management and processing time. This paper shows the high redundancy of data found in physical memory, which causes inefficiency in processing time and memory management. The experiment is done using the Borland C compiler on Windows XP with 512 MB of physical memory.
Keywords: Digital Evidence, Memory Forensics.
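One simple way to quantify the kind of redundancy the abstract reports is to hash fixed-size pages of a raw memory dump and count repeats; the sketch below is an assumed illustration of that idea, not the paper's tool, and the file name is hypothetical.

```python
# Illustrative sketch (assumed approach, not the paper's tool): measure how much
# of a raw physical-memory dump consists of repeated 4 KiB pages by hashing pages.
import hashlib
from collections import Counter

PAGE = 4096

def page_redundancy(dump_path):
    counts = Counter()
    with open(dump_path, "rb") as f:
        while True:
            page = f.read(PAGE)
            if not page:
                break
            counts[hashlib.sha256(page).hexdigest()] += 1
    total = sum(counts.values())
    duplicates = total - len(counts)          # pages that repeat an earlier page
    return duplicates / total if total else 0.0

# Example (hypothetical dump file name):
# print(f"redundant pages: {page_redundancy('memory.dd'):.1%}")
```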
7448 Development of an Avionics System for Flight Data Collection of a UAV Helicopter
Authors: Nikhil Ramaswamy, S. N. Omkar, Kashyap H. Nathwani, Anil M. Vanjare
Abstract:
In the present work, the development of an avionics system for flight data collection of a Raptor 30 V2 is carried out. For data acquisition, both on-ground and onboard avionics systems are developed for testing a small-scale Unmanned Aerial Vehicle (UAV) helicopter. The onboard avionics record the helicopter state outputs, namely accelerations, angular rates and Euler angles, in real time, while the on-ground avionics system records, in real time, the inputs given to the radio-controlled helicopter through a transmitter. The avionics systems are designed and developed taking into consideration low weight, small size, anti-vibration, low power consumption, and easy interfacing. To mitigate the medium-frequency vibrations induced on the UAV helicopter during flight, a damper is designed and its performance is evaluated. A number of flight tests are carried out, the data obtained are then analyzed for accuracy and repeatability, and conclusions are inferred.
Keywords: Data collection, Flight Testing, On-ground and Onboard Avionics, UAV helicopter.
7447 Characteristics of Neonates and Child Health Outcomes after the Mamuju Earthquake Disaster
Authors: Dimas T. Anantyo, Zsa-Zsa A. Laksmi, Adhie N. Radityo, Arsita E. Rini, Gatot I. Sarosa
Abstract:
A magnitude 6.2 earthquake rocked Mamuju District, West Sulawesi Province, Indonesia, on 15 January 2021, causing significant health issues for the affected community, particularly among vulnerable populations such as neonates and children. The aim of this study is to examine and describe the diseases diagnosed in the pediatric population in Mamuju in the 14 days after the earthquake. This is a prospective observational study of the pediatric population presenting at West Sulawesi Regional Hospital, Mamuju Regional Public Hospital, and Bhayangkara Hospital during the 14 days after the earthquake. Demographic and clinical information was recorded. A total of 153 children were admitted to the health centers. Children younger than six years old made up the highest proportion (78%). Of the 153 children, 82 (54%) were male. The most frequently diagnosed diseases during the first and second weeks after the earthquake were respiratory problems, followed by gastrointestinal problems, which showed an increase in incidence in the second week. This study found that age is correlated with the common diseases seen in children after an earthquake. Respiratory and gastrointestinal problems were the most common diseases among the pediatric population in Mamuju after the earthquake.
Keywords: Health outcomes, pediatric population, earthquake, Mamuju.
7446 The Research of Fuzzy Classification Rules Applied to CRM
Authors: Chien-Hua Wang, Meng-Ying Chou, Chin-Tzong Pang
Abstract:
In an era of intense competition, understanding and satisfying customers' requirements are critical tasks for a company to make a profit. Customer relationship management (CRM) has thus become an important business issue. With the help of data mining techniques, a manager can explore and analyze a large quantity of data to discover meaningful patterns and rules. Among all methods, the well-known association rule is the most commonly used. This paper is based on the Apriori algorithm and uses genetic algorithms combined with a data mining method to discover fuzzy classification rules. The mined results can be applied in CRM to help decision makers make correct business decisions for marketing strategies.
Keywords: Customer relationship management (CRM), Data mining, Apriori algorithm, Genetic algorithm, Fuzzy classification rules.
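To make the Apriori starting point concrete, the sketch below mines frequent itemsets from a toy transaction set with crisp (not fuzzy) supports; the data, threshold and candidate-generation shortcut are assumptions for illustration, not the authors' genetic-fuzzy procedure.

```python
# Minimal Apriori-style sketch (toy transactions, crisp rather than fuzzy rules)
# showing the frequent-itemset step that the abstract's method builds on.
from itertools import combinations

transactions = [{"A", "B", "C"}, {"A", "C"}, {"A", "D"}, {"B", "C"}, {"A", "B", "C"}]
min_support = 0.4

def apriori(transactions, min_support):
    n = len(transactions)
    k_sets = {frozenset([i]) for t in transactions for i in t}   # candidate 1-itemsets
    frequent = {}
    while k_sets:
        counts = {s: sum(1 for t in transactions if s <= t) for s in k_sets}
        k_freq = {s: c / n for s, c in counts.items() if c / n >= min_support}
        frequent.update(k_freq)
        # candidate generation: join frequent k-itemsets into (k+1)-itemsets
        k_sets = {a | b for a in k_freq for b in k_freq if len(a | b) == len(a) + 1}
    return frequent

for itemset, support in sorted(apriori(transactions, min_support).items(),
                               key=lambda kv: -kv[1]):
    print(set(itemset), round(support, 2))
```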
7445 Physicochemical and Microbiological Assessment of Source and Stored Domestic Water from Three Local Governments in Ile-Ife, Nigeria
Authors: Mary A. Bisi-Johnson, Kehinde A. Adediran, Saheed A. Akinola, Hamzat A. Oyelade
Abstract:
Two of the main problems people contend with in Nigeria are the quantity (source and amount) and quality of water. Scarcity leads to water being obtained from various sources, and microbiological contamination of the water may thus occur between the collection point and the point of usage. This study therefore aims to assess the general and microbiological quality of domestic water sources and household stored water used within selected areas in Ile-Ife, in the south-western part of Nigeria. Physicochemical and microbiological examinations were carried out on 45 source and stored water samples collected from wells and springs in three different local government areas, i.e. Ife East, Ife South and Ife North. Physicochemical analysis included pH value, temperature, total dissolved solids, dissolved oxygen and biochemical oxygen demand. Microbiological analysis involved most probable number (MPN) analysis and total coliform, heterotrophic plate, faecal coliform and streptococcus counts.
The results of the physicochemical analysis showed anomalies compared to acceptable standards, with pH values of 7.20-8.60 for stored and 6.50-7.80 for source samples, total dissolved solids (TDS) of 20-70 mg/L for stored and 352-691 mg/L for source samples, dissolved oxygen (DO) of 1.60-9.60 mg/L for stored and 1.60-4.80 mg/L for source samples, and biochemical oxygen demand (BOD) of 0.80-3.60 mg/L for stored and 0.60-5.40 mg/L for source samples. The general microbiological quality indicated that both stored and source samples, with the exception of one sample, were not within the acceptable range, as shown by the MPN/100 mL values, which ranged from 290 to 1100 for stored and from 9 to 1100 for source samples. Apart from the high counts, most samples did not meet the World Health Organization standard for drinking water owing to the presence of some pathogenic bacteria and fungi such as Salmonella and Aspergillus spp. To address these problems, standard treatment methods should be adopted to make the water free from contaminants. This will help identify common and likely origins of water-related infection within the communities and thus help guide the interventions required to protect the general populace from such infections.
Keywords: Domestic, microbiology, physicochemical, quality, water.
7444 Equilibrium Modeling of Carbon Dioxide Adsorption on Zeolites
Authors: Alireza Behvandi, Somayeh Tourani
Abstract:
High-pressure adsorption of carbon dioxide on zeolite 13X was investigated in the pressure range (0 to 4) MPa and at temperatures of 298, 308 and 323 K. The data fitting is accomplished with the Toth, UNILAN, Dubinin-Astakhov and virial adsorption models, which are generally used for microporous adsorbents such as zeolites. Comparison with experimental data from the literature indicated that the virial model gives the best results. This may be partly attributed to the flexibility of the virial model, which can accommodate as many constants as the data warrant.
Keywords: adsorption models, zeolite, carbon dioxide.
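As an example of the fitting step, the sketch below fits the commonly quoted Toth isotherm form q = qm·b·P / (1 + (b·P)^t)^(1/t) to synthetic loading data with a nonlinear least-squares routine; the data points, parameter values and scipy choice are assumptions, not the study's results.

```python
# Illustrative isotherm fit (synthetic points, assumed parameter names) using the
# common Toth form, one of the models compared in the abstract.
import numpy as np
from scipy.optimize import curve_fit

def toth(P, qm, b, t):
    return qm * b * P / (1.0 + (b * P) ** t) ** (1.0 / t)

P = np.linspace(0.05, 4.0, 15)                                        # pressure, MPa
q_obs = toth(P, 5.0, 2.0, 0.7) + np.random.default_rng(3).normal(0, 0.05, P.size)

popt, _ = curve_fit(toth, P, q_obs, p0=[4.0, 1.0, 1.0], maxfev=10000)
print("fitted qm, b, t:", np.round(popt, 3))
```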
7443 Application of Java-based Pointcuts in Aspect Oriented Programming (AOP) for Data Race Detection
Authors: Sadaf Khalid, Fahim Arif
Abstract:
The wide applicability of concurrent programming practices in developing various software applications leads to different concurrency errors, amongst which data races are the most important. Java provides strong support for concurrent programming through its concurrency packages. Aspect-oriented programming (AOP) is a modern programming paradigm facilitating the runtime interception of events of interest, and it can be effectively used to handle concurrency problems. AspectJ, an aspect-oriented extension to Java, facilitates the application of AOP concepts to data race detection. Volatile variables are usually considered thread-safe, but they can become candidates for data races if non-atomic operations are performed on them concurrently. Various data race detection algorithms have been proposed in the past, but this issue of volatility and atomicity is still unaddressed. The aim of this research is to propose conditions for data race detection at volatile fields in Java programs, taking into account the support for atomicity in the Java concurrency packages and making use of pointcuts. Two simple test programs demonstrate the results of the research. The results are verified on two different Java Development Kits (JDKs) for the purpose of comparison.
Keywords: Aspect Bench Compiler (abc), Aspect Oriented Programming (AOP), AspectJ, Aspects, Concurrency packages, Concurrent programming, Cross-cutting Concerns, Data race, Eclipse, Java, Java Development Kits (JDKs), Pointcuts.
7442 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency
Authors: Rania Alshikhe, Vinita Jindal
Abstract:
Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles, such as taxis, through installed global positioning system (GPS)-enabled devices can be utilized. Such data offer an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper aims to explore big trajectory data to measure the travel efficiency of road networks using the proposed statistical travel efficiency measure (STEM) across an entire city. Further, it identifies the causes of low travel efficiency using the proposed least-squares approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to a point B in the road network. The obtained results show that our proposed approach outperforms the baseline algorithms for measuring the travel efficiency of the road network.
Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE
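The STEM and LANCE definitions are not reproduced in this listing; as a hedged stand-in, the sketch below computes a simple per-segment efficiency ratio (free-flow travel time over observed mean travel time) from GPS-derived trip records, only to show the kind of segment-level aggregation such measures involve. The data frame and column names are assumptions.

```python
# Not the authors' STEM/LANCE; a simple per-segment travel-efficiency ratio
# computed from trip records derived from GPS trajectories (hypothetical data).
import pandas as pd

trips = pd.DataFrame({
    "segment_id": [1, 1, 1, 2, 2, 3],
    "travel_time_s": [60, 75, 90, 120, 150, 45],       # observed times from GPS traces
    "free_flow_time_s": [50, 50, 50, 100, 100, 40],    # time at the posted speed
})

agg = trips.groupby("segment_id").agg(free_flow=("free_flow_time_s", "first"),
                                      observed=("travel_time_s", "mean"))
agg["efficiency"] = agg["free_flow"] / agg["observed"]   # values near 1 = efficient segment
print(agg["efficiency"].round(2))
```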
7441 Actionable Rules: Issues and New Directions
Authors: Harleen Kaur
Abstract:
Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown, hidden and interesting patterns from a huge amount of data stored in databases. Data mining is the stage of the KDD process that aims at selecting and applying a particular data mining algorithm to extract interesting and useful knowledge. Data mining methods are expected to find interesting patterns in databases according to some measures. It is of vital importance to define good measures of interestingness that would allow the system to discover only the useful patterns. Measures of interestingness are divided into objective and subjective measures. Objective measures are those that depend only on the structure of a pattern and can be quantified using statistical methods, while subjective measures depend on the subjectivity and understandability of the user who examines the patterns. These subjective measures are further divided into actionable, unexpected and novel. The key issue facing the data mining community is how to take action on the basis of discovered knowledge. For a pattern to be actionable, the user's subjectivity is captured by providing his or her background knowledge about the domain. Here, we consider the actionability of the discovered knowledge as a measure of interestingness and raise important issues which need to be addressed to discover actionable knowledge.
Keywords: Data Mining Community, Knowledge Discovery in Databases (KDD), Interestingness, Subjective Measures, Actionability.
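For contrast with the subjective measures discussed above, the objective measures can be computed directly from the data; the sketch below evaluates support, confidence and lift for a rule A -> B over a toy transaction set (the data are an assumption used only to illustrate).

```python
# Small sketch of objective interestingness measures: support, confidence and
# lift of a rule A -> B over a toy transaction set.
transactions = [{"A", "B"}, {"A", "B", "C"}, {"A"}, {"B", "C"}, {"A", "B"}]
A, B = {"A"}, {"B"}

n = len(transactions)
supp_A = sum(A <= t for t in transactions) / n
supp_B = sum(B <= t for t in transactions) / n
supp_AB = sum((A | B) <= t for t in transactions) / n

confidence = supp_AB / supp_A          # P(B | A)
lift = confidence / supp_B             # > 1 means A and B occur together more than by chance
print(f"support={supp_AB:.2f} confidence={confidence:.2f} lift={lift:.2f}")
```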
7440 Moving towards Zero Waste in a UK Local Authority Area: Challenges to the Introduction of Separate Food Waste Collections
Authors: C. Cole, M. Osmani, A. Wheatley, M. Quddus
Abstract:
EU and UK Government targets for minimising and recycling household waste have led the responsible authorities to research alternatives to landfill. In the work reported here, the local waste collection authority (Charnwood Borough Council) has adopted the aspirational strategy of becoming a “Zero Waste Borough” to lead the drive for public participation. The work concludes that the separate collection of food waste would be needed to meet the two regulatory standards on recycling and biologically active wastes.
An analysis was undertaken of a neighbouring authority, Newcastle-under-Lyme Borough Council (NBC), a similar-sized local authority that has a successful weekly food waste collection service. The results indicate that the main challenges for Charnwood Borough Council would be gaining householder co-operation, the extra costs of collection and organising alternative treatment. The analysis also demonstrated that there was potential offset value via anaerobic digestion for CBC to overcome these difficulties and improve its recycling performance.
Keywords: England, Food Waste Collections, Household Waste, Local Authority.
7439 Model Discovery and Validation for the QSAR Problem Using Association Rule Mining
Authors: Luminita Dumitriu, Cristina Segal, Marian Craciun, Adina Cocu, Lucian P. Georgescu
Abstract:
There are several approaches to solving the Quantitative Structure-Activity Relationship (QSAR) problem. These approaches are based either on statistical methods or on predictive data mining. Among the statistical methods, one should consider regression analysis, pattern recognition (such as cluster analysis, factor analysis and principal components analysis) or partial least squares. Predictive data mining techniques use neural networks, genetic programming or neuro-fuzzy knowledge. These approaches have low explanatory capability or none at all. This paper attempts to establish a new approach to solving QSAR problems using descriptive data mining. In this way, the relationship between the chemical properties and the activity of a substance can be comprehensibly modeled.
Keywords: association rules, classification, data mining, Quantitative Structure-Activity Relationship.
7438 Systematic Analysis of Dynamic Association of Health Outcomes with Computer Usage for Office Staff
Authors: Xiaoshu Lu, Esa-Pekka Takala, Risto Toivonen
Abstract:
This paper systematically investigates the time-dependent health outcomes of office staff during computer work using the developed mathematical model. The model describes time-dependent health outcomes in multiple body regions associated with computer usage. The association is explicitly presented by a dose-response relationship, which is parametrized by body-region parameters. Using the developed model, we perform extensive investigations of the health outcomes both statically and dynamically. We compare the at-risk body regions and provide various severity rankings of the discomfort rate changes with respect to computer-related workload, dynamically, for the study population. Application of the developed model reveals a wide range of findings; such a broad spectrum of investigations within a single report is lacking in the literature. Based upon the model analysis, the highest average severity levels of discomfort are found in the neck, shoulder, eyes, shoulder joint/upper arm, upper back, low back and head. The biggest weekly changes in discomfort rates are in the eyes, neck, head, shoulder, shoulder joint/upper arm and upper back. The fastest discomfort rate is found in the neck, followed by the shoulder, eyes, head, shoulder joint/upper arm and upper back. Most of our findings are consistent with the literature, which demonstrates that the developed model and results are applicable and valuable and can be utilized to assess the correlation between the amount of computer-related workload and health risk.
Keywords: Computer-related workload, health outcomes, dynamic association, dose-response relationship, systematic analysis.
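The abstract does not state the functional form of its dose-response relationship; purely as an illustration of the kind of region-parametrized curve it refers to, a generic logistic dose-response form (an assumption, not the authors' model) can be written as:

```latex
% Generic logistic dose-response form (illustrative only, not the authors' model):
% R_b(d) is the discomfort rate for body region b at computer-related workload d,
% with region-specific parameters R_{b,\max}, k_b and d_{b,50}.
R_b(d) = \frac{R_{b,\max}}{1 + e^{-k_b\,(d - d_{b,50})}}
```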
7437 The Key Challenges of the New Bank Regulations
Authors: Petr Teply
Abstract:
The New Basel Capital Accord (Basel II) influences how financial institutions around the world, and especially European Union institutions, determine the amount of capital to hold in reserve. However, as the recent global crisis has shown, a revision of Basel II is needed to reflect current trends in the world financial markets, such as increased volatility and correlation. The overall objective of Basel II is to increase the safety and soundness of the international financial system. Basel II builds on three main pillars: Pillar I deals with the minimum capital requirements for credit, market and operational risk, Pillar II focuses on the supervisory review process, and Pillar III promotes market discipline through enhanced disclosure requirements for banks. The aim of this paper is to provide the historical background, key features and impact of Basel II on financial markets. Moreover, we discuss new proposals for international bank regulation (sometimes referred to as Basel III), which include requirements for higher quality, consistency and transparency of banks' capital and risk management, regulation of OTC markets, and the introduction of new liquidity standards for internationally active banks.
Keywords: Basel II, Basel III, risk management, bank regulation
7436 Logistics and Its Importance in the Turkish Food Sector and an Analysis of the Logistics Sector in Turkey
Authors: Şule Turhan, Özlem Turan
Abstract:
For many global companies, permanence in international markets depends on being known for effective logistics that targets customer satisfaction and lower costs. Under competitive conditions, the necessity of delivering products to customers quickly and on time has increased the strategic importance of the logistics concept for companies that constantly aim to improve their profitability. Food logistics is one of the most difficult areas in logistics: in the process from manufacturer to final consumer, quality and hygiene standards must be maintained constantly. In food logistics, a reliable and extensive service network is of great importance, and on-time delivery is the target. A developing logistics industry enables the faster supply of food within the country and the faster development of export markets, and plays an important role in providing added value to the country's economy. Turkey, which forms a bridge between East and West, is an attractive market for logistics companies. In this study, by examining both the place and the importance of logistics in the Turkish food sector, recommendations are made for the food industry.
Keywords: Logistics, Turkish food industry, competition, food industry.
7435 From Modeling of Data Structures towards Automatic Programs Generating
Authors: Valentin P. Velikov
Abstract:
Automatic program generation saves time and human resources, and makes it possible to obtain syntactically clear and logically correct modules. Fourth-generation programming languages are oriented towards describing the data and the processes of the subject area, as well as obtaining a frame of the respective information system. The application can be separated into an interface and business logic. That means that, for interactive generation of the needed system, either an already existing toolkit is used or a new one is created.
Keywords: Computer science, graphical user interface, user dialog interface, dialog frames, data modeling, subject area modeling.
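A toy example of the idea behind such generation, emitting a syntactically correct, loadable module from a small data-model specification, is sketched below; the field specification and helper names are assumptions, not the author's toolkit.

```python
# Toy sketch (assumed field specification, not the author's toolkit): generate a
# syntactically correct record class from a simple data-model description.
def generate_record_class(name, fields):
    """Emit Python source for a record class given (field, type) pairs."""
    args = ", ".join(f"{f}: {t}" for f, t in fields)
    lines = [f"class {name}:",
             f"    def __init__(self, {args}):"]
    lines += [f"        self.{f} = {f}" for f, _ in fields]
    return "\n".join(lines) + "\n"

source = generate_record_class("Customer", [("name", "str"), ("balance", "float")])
print(source)                          # the generated module text

namespace = {}
exec(source, namespace)                # load the generated class and use it directly
print(namespace["Customer"]("Ada", 10.0).name)
```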
7434 Visual Analytics in K-12 Education - Emerging Dimensions of Complexity
Authors: Linnea Stenliden
Abstract:
The aim of this paper is to understand the emerging learning conditions when visual analytics is implemented and used in K-12 education. To date, little attention has been paid to the role visual analytics (digital media and technology that highlight visual data communication in order to support analytical tasks) can play in education, and to the extent to which these tools can process actionable data for young students. This study was conducted in three public K-12 schools, in four social science classes with students aged 10 to 13 years, over a period of two to four weeks at each school. Empirical data were generated using video observations and analyzed with the help of metaphors within Actor-network theory (ANT). The learning conditions are found to be distinguished by broad complexity, characterized by four dimensions that emerge from the actors' deeply intertwined relations in the activities. In relation to the dimensions found, the paper argues that novel approaches to teaching and learning could benefit students' knowledge building as they work with visual analytics, analyzing visualized data.
Keywords: Analytical reasoning, complexity, data use, problem space, visual analytics, visual storytelling, translation.
7433 An Energy Efficient Cluster Formation Protocol with Low Latency in Wireless Sensor Networks
Authors: A. Allirani, M. Suganthi
Abstract:
Data gathering is an essential operation in wireless sensor network applications, so energy-efficiency techniques are required to increase the lifetime of the network. Similarly, clustering is an effective technique to improve the energy efficiency and network lifetime of wireless sensor networks. In this paper, an energy-efficient cluster formation protocol is proposed with the objective of achieving low energy dissipation and latency without sacrificing application-specific quality. The objective is achieved by applying randomized, adaptive, self-configuring cluster formation and localized control for data transfers. It involves application-specific data processing, such as data aggregation or compression. The cluster formation algorithm allows each node to make independent decisions, so as to generate good clusters in the end. Simulation results show that the proposed protocol utilizes minimum energy and latency for cluster formation, thereby reducing the overhead of the protocol.
Keywords: Sensor networks, Low latency, Energy sorting protocol, data processing, Cluster formation.
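Randomized, adaptive, self-configuring cluster formation of the kind described above is commonly illustrated with a LEACH-style cluster-head election; the threshold formula and parameters in the sketch below are the standard LEACH ones, used only as an illustration of the class of protocols, not as the authors' algorithm.

```python
# LEACH-style randomized cluster-head election (illustration of the protocol class,
# not necessarily the authors' scheme): each eligible node elects itself with a
# probability that rises as the round within the current epoch advances.
import random

P = 0.05               # desired fraction of cluster heads per round
NODES = 100

def threshold(round_no):
    return P / (1 - P * (round_no % int(1 / P)))

def elect_cluster_heads(round_no, eligible):
    t = threshold(round_no)
    return {n for n in eligible if random.random() < t}

eligible = set(range(NODES))            # nodes not yet cluster head in this epoch
for r in range(3):
    heads = elect_cluster_heads(r, eligible)
    eligible -= heads                   # a head waits 1/P rounds before re-election
    print(f"round {r}: {len(heads)} cluster heads")
```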
7432 An Approach to Practical Determination of Fair Premium Rates in Crop-Hail Insurance Using Short-Term Insurance Data
Authors: Necati Içer
Abstract:
Crop-hail insurance plays a vital role in managing risk and reducing the financial consequences of hail damage to crop production. Predicting insurance premium rates from short-term data is a major challenge in numerous countries because of the unique characteristics of hailstorms. This study aims to suggest a feasible approach to establishing equitable premium rates in crop-hail insurance for countries with only short-term insurance data. The primary goal of the rate-making process is to determine premium rates for villages with high and zero loss costs and to enhance their credibility. To do this, a technique was created using the author's practical knowledge of crop-hail insurance. With this approach, the rate-making method was developed using a range of temporal and spatial factor combinations with both hypothetical and real data, including extreme cases. This article aims to show how to incorporate the temporal and spatial elements into determining fair premium rates using short-term insurance data. The article ends with a suggestion on the ultimate premium rates for insurance contracts.
Keywords: Crop-hail insurance, premium rate, short-term insurance data, spatial and temporal parameters.
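One standard actuarial device for stabilising village rates with high or zero observed loss costs is credibility weighting against a wider regional loss cost; the limited-fluctuation style weight below is offered only as an assumed illustration of that kind of blending, not as the author's procedure.

```python
# Illustrative credibility weighting (assumed mechanics, not the paper's method):
# blend a village's own short-term loss cost with a regional loss cost so that
# villages with few years of data do not get extreme (e.g. zero) premium rates.
def credibility_premium(village_loss_cost, regional_loss_cost, n_years, k=10.0):
    """Credibility weight Z = n / (n + k); k is a hypothetical stabilising constant."""
    z = n_years / (n_years + k)
    return z * village_loss_cost + (1 - z) * regional_loss_cost

# A village with zero observed losses over 3 years still receives a non-zero rate:
print(credibility_premium(0.0, 0.04, n_years=3))    # approx. 0.031
```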
7431 The Pixel Value Data Approach for Rainfall Forecasting Based on GOES-9 Satellite Image Sequence Analysis
Authors: C. Yaiprasert, K. Jaroensutasinee, M. Jaroensutasinee
Abstract:
This work develops a process for extracting pixel values from satellite remote sensing image data in Thailand, which is a very important and effective aid to forecasting rainfall. This paper presents an approach for forecasting possible rainfall areas based on pixel values from remote sensing satellite images. First, a method automatically extracts the pixel value data from the satellite image sequence. Then, a data process is designed to enable the inference of correlations between pixel values and possible rainfall occurrences. The results show that when the averaged pixel value of the daily water vapor data is high, the amount of daily rainfall is also high. This suggests that the averaged pixel value can be used as an indicator of rain events. There are positive associations between the pixel values of daily water vapor images and the amount of daily rainfall at each rain-gauge station throughout Thailand. The proposed approach proved to be a helpful guide for rainfall forecasting by meteorologists, using an automated process for analyzing and interpreting meteorological remote sensing data.
Keywords: Pixel values, satellite image, water vapor, rainfall, image processing.
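The core association described above, averaging pixel values per daily image and relating the series to daily rainfall, can be sketched as follows; the synthetic arrays stand in for the GOES-9 water-vapor images and gauge data and are assumptions, not the study's data.

```python
# Sketch (synthetic arrays standing in for daily water-vapor images and gauge data):
# average the pixel values of each daily image and correlate the series with rainfall.
import numpy as np

rng = np.random.default_rng(4)
daily_images = rng.uniform(0, 255, size=(30, 64, 64))        # 30 days of 64x64 images
mean_pixel = daily_images.mean(axis=(1, 2))                  # one averaged value per day
daily_rain = 0.3 * mean_pixel + rng.normal(0, 5, 30)         # hypothetical gauge series

r = np.corrcoef(mean_pixel, daily_rain)[0, 1]
print(f"correlation between averaged pixel value and daily rainfall: r = {r:.2f}")
```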
7430 Multistage Data Envelopment Analysis Model for Malmquist Productivity Index Using Grey's System Theory to Evaluate Performance of Electric Power Supply Chain in Iran
Authors: Mesbaholdin Salami, Farzad Movahedi Sobhani, Mohammad Sadegh Ghazizadeh
Abstract:
The evaluation of organizational performance is among the most important measures that help organizations and entities continuously improve their efficiency. Organizations can use existing data and the results of comparisons between the units under investigation to obtain an estimation of their performance. The Malmquist Productivity Index (MPI) is an important index in the evaluation of overall productivity, which considers technological developments and technical efficiency at the same time. This article proposes a model based on a multistage MPI that accounts for limited data (Grey's theory). This model can evaluate the performance of units using limited and uncertain data in a multistage process. It was applied by the electricity market manager to Iran's electric power supply chain (EPSC), for which the data are uncertain, to evaluate the performance of its actors. The results of solving the model showed an improvement in the accuracy of the estimated future performance of the units under investigation when Grey's system theory is used. This model can be used in any case study in which the MPI is used and the data are limited or uncertain.
Keywords: Malmquist Index, Grey's Theory, Charnes Cooper & Rhodes (CCR) Model, network data envelopment analysis, Iran electricity power chain.
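For reference, the standard two-period Malmquist productivity index, the building block that the multistage grey model above extends, is the geometric mean of two distance-function ratios estimated by DEA:

```latex
% Standard Malmquist productivity index between periods t and t+1, with distance
% functions D^t estimated by DEA (the conventional form, not the extended grey model):
M(x^{t+1},y^{t+1},x^{t},y^{t}) =
\left[
  \frac{D^{t}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})}
  \cdot
  \frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t},y^{t})}
\right]^{1/2}
```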
7429 Anti-Corruption Conventions in Nigeria: Legal and Administrative Challenges
Authors: Mohammed Albakariyu Kabir
Abstract:
There is a trend in development discourse to understand and explain the level of corruption in Nigeria, its anti-corruption crusade and why it is failing, as well as its level of compliance with the international standards of the United Nations Convention against Corruption (UNCAC) and the African Union Convention on Preventing and Combating Corruption, to which Nigeria is a signatory. This paper discusses the legal and constitutional provisions relating to corrupt practices and safeguards in Nigeria, as well as the obstacles to the implementation of these conventions. The paper highlights the challenges posed to the anti-corruption crusade by analysing the loopholes that exist both in the administrative structure and in the scope of the relevant laws. The paper argues that the Nigerian Constitution did not make adequate provisions for the implementation of the conventions, and hence puts forward a proposal that would ensure adequate provision for implementing the conventions to better the lives of Nigerians. The paper concludes that there is a need to build institutional parameters and adequate constitutional and structural safeguards, as well as to synergise strategies, collaborations and alliances, to facilitate the timely domestication and implementation of the conventions.
Keywords: Anti-Corruption, Corruption, Convention, domestication, poverty, State Parties.
7428 Comparative Analysis of the Public Funding for Greek Universities: An Ordinal DEA/MCDM Approach
Authors: Yiannis Smirlis, Dimitris K. Despotis
Abstract:
This study performs a comparative analysis of the 21 Greek universities in terms of the public funding awarded to cover their operating expenditure. First, it introduces a DEA/MCDM model that allocates the funds across four expenditure factors in the most favorable way for each university. Then, it presents a common, consensual assessment model to reallocate the amounts while keeping the total public budget at the same level. The analysis shows that a number of universities cannot justify their public funding in terms of their size and operational workload. For them, a sufficient reduction of their public funding is estimated as a future target. Due to the lack of precise data for a number of expenditure criteria, the analysis is based on a mixed crisp-ordinal data set.
Keywords: Data envelopment analysis, Greek universities, operating expenditures, ordinal data.
7427 Inversion of Electrical Resistivity Data: A Review
Authors: Shrey Sharma, Gunjan Kumar Verma
Abstract:
High-density electrical prospecting has been widely used in groundwater investigation, civil engineering and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of past and ongoing developments of the method. It includes reviews of the procedures used for the acquisition, processing and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently, resolving complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
Keywords: Resistivity, inversion, optimization.
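The linearized least-squares step referred to above is commonly written as a regularised (Occam/Gauss-Newton style) model update; the symbols below are generic and illustrative, not tied to any particular inversion code discussed in the review.

```latex
% One iteration of linearised, regularised least-squares resistivity inversion
% (generic Occam/Gauss-Newton form): m_k = model vector (e.g. log-resistivities),
% d = observed data, F = forward response, J = Jacobian (sensitivity) matrix,
% W = roughness (smoothing) matrix, \lambda = damping factor.
\Delta m_k = \left( J^{T} J + \lambda\, W^{T} W \right)^{-1}
             J^{T}\!\left( d - F(m_k) \right),
\qquad m_{k+1} = m_k + \Delta m_k
```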