Search results for: insurance research database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25866

25386 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network

Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

We consider a database that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. While models learned from this data to predict future traffic speed would be beneficial for applications such as car navigation systems, building predictive models for every link becomes a nontrivial job if the number of links in a given network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building. However, k-NN takes a long time to make a prediction because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale is that it may suffice to look only at the recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different sets of features, namely the current and past traffic speeds of the target link and of the neighbor links in its up/down-stream. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
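
As an illustration of the speed-up idea described above, the following sketch (not the authors' code; the data shapes, feature layout and parameters are hypothetical) restricts the k-NN neighbour search to the most recent records:

```python
# Minimal sketch: k-NN speed prediction where the neighbour search is
# restricted to the most recent part of the history (hypothetical data).
import numpy as np

def knn_predict(history_X, history_y, query, k=5, recent_rows=None):
    """history_X: past feature vectors (current/past speeds of the target and
    up/down-stream links); history_y: observed future speed for each record;
    recent_rows limits the search to the newest records only."""
    X, y = history_X, history_y
    if recent_rows is not None:            # search only recent data
        X, y = X[-recent_rows:], y[-recent_rows:]
    dists = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return y[nearest].mean()               # average speed of the k neighbours

# Hypothetical usage: 100k historical records, 8 features per record
rng = np.random.default_rng(0)
X_hist = rng.uniform(10, 80, size=(100_000, 8))
y_hist = rng.uniform(10, 80, size=100_000)
print(knn_predict(X_hist, y_hist, X_hist[-1], k=5, recent_rows=10_000))
```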

Keywords: big data, k-NN, machine learning, traffic speed prediction

Procedia PDF Downloads 357
25385 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison

Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo

Abstract:

Brain-computer interfaces belong to a research line of computer science within the study of Human-Computer Interaction (HCI) that seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them to control electronic devices. Affective computing research, in turn, applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware comprises the sensing stage and analog-to-digital conversion. The interface software comprises algorithms for pre-processing of the signal in the time and frequency domains and for the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately, using a database that is accessible to the public, together with a comparison among classifiers in order to determine the best-performing one.
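
A small illustrative sketch of the kind of classifier comparison described above, under stated assumptions (single-channel 1-second epochs at 128 Hz, synthetic signals and labels rather than the public dataset used in the paper):

```python
# Illustrative sketch: band-power features from EEG-like epochs, then a
# comparison of two common classifiers (assumed sampling rate and data).
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 128  # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch):
    """Mean spectral power per frequency band for one single-channel epoch."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

# Hypothetical data: 200 epochs with binary emotion labels
rng = np.random.default_rng(1)
epochs = rng.standard_normal((200, FS))
labels = rng.integers(0, 2, 200)
X = np.array([band_powers(e) for e in epochs])

for name, clf in [("LDA", LinearDiscriminantAnalysis()), ("SVM", SVC())]:
    print(name, cross_val_score(clf, X, labels, cv=5).mean())
```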

Keywords: affective computing, interface, brain, intelligent interaction

Procedia PDF Downloads 385
25384 Research Progress of the Relationship between Urban Rail Transit and Residents' Travel Behavior during 1999-2019: A Scientific Knowledge Mapping Based on CiteSpace and VOSviewer

Authors: Zheng Yi

Abstract:

Among the attempts made worldwide to foster urban and transport sustainability, transit-oriented development is certainly one of the most successful. Residents' travel behavior is a central concern in research on the impacts of transit-oriented development. The study takes 620 English-language journal papers in the Web of Science core collection database as its objects; it maps out the scientific knowledge of the field and draws the basic conditions through co-citation analysis, co-word analysis, citation network analysis and visualization techniques. This study teases out the research hotspots and the evolution of the relationship between urban rail transit and residents' travel behavior from 1999 to 2019. Based on the analysis of the time-zone view and burst detection, the paper discusses the trend of the next stage of international study. The results show that in the past 20 years, the research has focused on these keywords: land use, behavior, model, built environment, impact, travel behavior, walking, physical activity, smart card, big data, simulation, perception. According to different research contents, the key literature is further divided into these topics: attributes of the built environment, land use, transportation networks, and transportation policies. The results of this paper can help readers understand the related research and achievements systematically. These results can also provide a reference for identifying the main challenges that relevant research needs to address in the future.

Keywords: urban rail transit, travel behavior, knowledge map, evolution of researches

Procedia PDF Downloads 104
25383 Transcriptome Analysis of Saffron (Crocus sativus L.) Stigma Focusing on Identification of Genes Involved in the Biosynthesis of Crocin

Authors: Parvaneh Mahmoudi, Ahmad Moeni, Seyed Mojtaba Khayam Nekoei, Mohsen Mardi, Mehrshad Zeinolabedini, Ghasem Hosseini Salekdeh

Abstract:

Saffron (Crocus sativus L.) is one of the most important spice and medicinal plants. The three-branch style of C. sativus flowers is the most important economic part of the plant and is known as saffron, which has several medicinal properties. Despite the economic and biological significance of this plant, knowledge about its molecular characteristics is very limited. In the present study, we constructed, for the first time, a comprehensive dataset for the C. sativus stigma through de novo transcriptome sequencing. We performed de novo transcriptome sequencing of C. sativus stigma using the Illumina paired-end sequencing technology. A total of 52075128 reads were generated and assembled into 118075 unigenes, with an average length of 629 bp and an N50 of 951 bp. Of the assembled unigenes, 66171 (56%) were annotated in the non-redundant National Center for Biotechnology Information (NCBI) database, 30938 (26%) were annotated in the Swiss-Prot database, and 10273 (8.7%) were mapped to 141 Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways, while 52560 (44%) and 40756 (34%) unigenes were assigned to Gene Ontology (GO) categories and Eukaryotic Orthologous Groups of proteins (KOG), respectively. In addition, 65 candidate genes involved in three stages of crocin biosynthesis were identified. Finally, the transcriptome sequencing of saffron stigma was used to identify 6779 potential microsatellite (SSR) molecular markers. High-throughput de novo transcriptome sequencing provided a valuable resource of transcript sequences of C. sativus in public databases. In addition, most of the candidate genes potentially involved in crocin biosynthesis were identified and could be further utilized in functional genomics studies. Furthermore, the numerous SSRs obtained may help address open questions about the origin of this amphiploid species, which probably has little genetic diversity.
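
A toy sketch of SSR (microsatellite) detection of the kind mentioned above, assuming a simple regular-expression scan over assembled sequences (the motif-length and repeat-count thresholds below are illustrative, not the pipeline's actual settings):

```python
# Toy sketch: scan unigene sequences for simple sequence repeats (SSRs),
# here defined as 2-6 bp motifs repeated at least five times in tandem.
import re

SSR_PATTERN = re.compile(r"([ATGC]{2,6}?)\1{4,}")  # motif + >= 4 further copies

def find_ssrs(seq):
    """Return (motif, repeat_count, start_position) for each SSR in a sequence."""
    hits = []
    for m in SSR_PATTERN.finditer(seq.upper()):
        motif = m.group(1)
        repeats = len(m.group(0)) // len(motif)
        hits.append((motif, repeats, m.start()))
    return hits

# Hypothetical unigene fragment
print(find_ssrs("ttagcATATATATATATgcgcgTTAGGTTAGGTTAGGTTAGGTTAGGcc"))
```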

Keywords: saffron, transcriptome, NGS, bioinformatics

Procedia PDF Downloads 95
25382 A General Framework for Knowledge Discovery from Echocardiographic and Natural Images

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as the active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, the Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks such as clustering and classification with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.

Keywords: active contour, Bayesian, echocardiographic image, feature vector

Procedia PDF Downloads 437
25381 Frailty Patterns in the US and Implications for Long-Term Care

Authors: Joelle Fong

Abstract:

Older persons are at the greatest risk of becoming frail. As survival to age 80 and beyond continues to increase, the health and frailty of older Americans have garnered much recent attention among policy makers and healthcare administrators. This paper examines patterns in old-age frailty within a multistate actuarial model that characterizes the stochastic process of biological ageing. Using aggregate population-level U.S. mortality data, we implement a stochastic ageing model to examine cohort trends and gender differences in frailty distributions for older Americans born 1865-1894. The stochastic ageing model, which draws from the fields of actuarial science and gerontology, is well established in the literature. The implications for public health insurance programs are also discussed. Our results suggest that, on average, women tend to be frailer than men at older ages, and they reveal useful insights about the magnitude of the male-female differential at critical age points. Specifically, we note that the frailty statuses of males and females are actually quite comparable from ages 65 to 80. Beyond age 80, however, the frailty levels start to diverge considerably, implying that women move more quickly into worse states of health than men. Tracking average frailty by gender over 30 successive birth cohorts, we also find that frailty levels for both genders follow a distinct peak-and-trough pattern. For instance, frailty among 85-year-old American survivors increased in the years 1954-1963, decreased in 1964-1971, and again started to increase in 1972-1979. A number of factors may have accounted for these cohort differences, including differences in cohort life histories, differences in disease prevalence, differences in lifestyle and behavior, differential access to medical advances, as well as changes in environmental risk factors over time. We conclude with a discussion of the implications of our findings for spending on long-term care programs within the broader health insurance system.
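
A toy sketch of the kind of discrete-time stochastic ageing simulation the abstract refers to; the jump probabilities, mortality parameters and sex difference below are assumptions for illustration only, not the calibrated model fitted to U.S. mortality data:

```python
# Illustrative sketch: each year an individual's discrete frailty state can
# jump upward at random, and mortality rises with frailty (assumed parameters).
import numpy as np

def simulate_mean_frailty(n=50_000, ages=(65, 101), jump_prob=0.25,
                          base_mort=0.005, mort_slope=0.012, seed=0):
    rng = np.random.default_rng(seed)
    frailty = np.zeros(n)                       # frailty state per individual
    alive = np.ones(n, dtype=bool)
    mean_by_age = {}
    for age in range(*ages):
        jumps = rng.random(n) < jump_prob       # stochastic increment of frailty
        frailty[alive & jumps] += 1
        q = np.clip(base_mort + mort_slope * frailty, 0, 1)  # death probability
        alive &= rng.random(n) >= q
        mean_by_age[age] = frailty[alive].mean() if alive.any() else np.nan
    return mean_by_age

# Hypothetical sex difference: women accumulate frailty slightly faster
men = simulate_mean_frailty(jump_prob=0.24, seed=1)
women = simulate_mean_frailty(jump_prob=0.27, seed=2)
print(men[80], women[80], men[95], women[95])   # mean frailty among survivors
```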

Keywords: actuarial modeling, cohort analysis, frail elderly, health

Procedia PDF Downloads 242
25380 Real-World Comparison of Adherence to and Persistence with Dulaglutide and Liraglutide in UAE e-Claims Database

Authors: Ibrahim Turfanda, Soniya Rai, Karan Vadher

Abstract:

Objectives— The study aims to compare real-world adherence to and persistence with dulaglutide and liraglutide in patients with type 2 diabetes (T2D) initiating treatment in UAE. Methods— This was a retrospective, non-interventional study (observation period: 01 March 2017–31 August 2019) using the UAE Dubai e-Claims database. Included: adult patients initiating dulaglutide/liraglutide 01 September 2017–31 August 2018 (index period) with: ≥1 claim for T2D in the 6 months before index date (ID); ≥1 claim for dulaglutide/liraglutide during index period; and continuous medical enrolment for ≥6 months before and ≥12 months after ID. Key endpoints, assessed 3/6/12 months after ID: adherence to treatment (proportion of days covered [PDC; PDC ≥80% considered ‘adherent’], per-group mean±standard deviation [SD] PDC); and persistence (number of continuous therapy days from ID until discontinuation [i.e., >45 days gap] or end of observation period). Patients initiating dulaglutide/liraglutide were propensity score matched (1:1) based on baseline characteristics. Between-group comparison of adherence was analysed using the McNemar test (α=0.025). Persistence was analysed using Kaplan–Meier estimates with log-rank tests (α=0.025) for between-group comparisons. This study presents 12-month outcomes. Results— Following propensity score matching, 263 patients were included in each group. Mean±SD PDC for all patients at 12 months was significantly higher in the dulaglutide versus the liraglutide group (dulaglutide=0.48±0.30, liraglutide=0.39±0.28, p=0.0002). The proportion of adherent patients favored dulaglutide (dulaglutide=20.2%, liraglutide=12.9%, p=0.0302), as did the probability of being adherent to treatment (odds ratio [97.5% CI]: 1.70 [0.99, 2.91]; p=0.03). Proportion of persistent patients also favoured dulaglutide (dulaglutide=15.2%, liraglutide=9.1%, p=0.0528), as did the probability of discontinuing treatment 12 months after ID (p=0.027). Conclusions— Based on the UAE Dubai e-Claims database data, dulaglutide initiators exhibited significantly greater adherence in terms of mean PDC versus liraglutide initiators. The proportion of adherent patients and the probability of being adherent favored the dulaglutide group, as did treatment persistence.
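
A minimal sketch of the proportion-of-days-covered (PDC) calculation used as the adherence endpoint above; the fill records below are hypothetical, not drawn from the UAE Dubai e-Claims database:

```python
# Minimal sketch: PDC over a 12-month follow-up, flagging adherence at the
# PDC >= 80% threshold (hypothetical claims records).
from datetime import date, timedelta

def pdc(fills, followup_start, followup_days=365):
    """fills: list of (fill_date, days_supply). Each calendar day is counted
    once even if supplies overlap; PDC = covered days / follow-up days."""
    window_end = followup_start + timedelta(days=followup_days)
    covered = set()
    for fill_date, days_supply in fills:
        for d in range(days_supply):
            day = fill_date + timedelta(days=d)
            if followup_start <= day < window_end:
                covered.add(day)
    return len(covered) / followup_days

fills = [(date(2018, 1, 1), 30), (date(2018, 2, 5), 30), (date(2018, 4, 1), 90)]
score = pdc(fills, followup_start=date(2018, 1, 1))
print(round(score, 3), "adherent" if score >= 0.8 else "non-adherent")
```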

Keywords: adherence, dulaglutide, effectiveness, liraglutide, persistence

Procedia PDF Downloads 120
25379 The Study of the Socio-Economic and Environmental Impact on the Semi-Arid Environments Using GIS in the Eastern Aurès, Algeria

Authors: Benmessaoud Hassen

Abstract:

In this study, we propose to address the socio-economic and environmental impact on the physical environment, especially its spatiotemporal dynamics, in the semi-arid and arid eastern Aurès. Comprising 11 municipalities, the study area spreads over a relatively large surface of about 60,000 ha. The temporal depth is substantial, with three dates of analysis of environmental variation spread over the period between 1987 and 2007. The multi-source data acquired in this context are integrated into a geographic information system (GIS). This allows, among other indices, the calculation of areas and classes for each of the four thematic layers previously defined by a method inspired by MEDALUS (Mediterranean Desertification and Land Use). The database created is composed of four layers of information (population, livestock, farming and land use). Its analysis in space and time has been supplemented by ground-truth validation. Once corrected, the database was used to develop a comprehensive map through the calculation of a socio-economic and environmental index (ISCE). The resulting map and information do not consist only of figures on the present situation but can also be used to forecast future trends.

Keywords: socio-economic and environmental impact, spatiotemporal dynamics, semi-arid environments, GIS, Eastern Aurès

Procedia PDF Downloads 319
25378 Mapping and Database on Mass Movements along the Eastern Edge of the East African Rift in Burundi

Authors: L. Nahimana

Abstract:

The eastern edge of the East African Rift in Burundi shows many mass movement phenomena corresponding to landslides, mudflows, debris flows, spectacular erosion (mega-gullies), flash floods and alluvial deposits. These phenomena usually occur during the rainy season. Their extent and the consecutive damage vary widely. To manage these phenomena, it is necessary to adopt a methodological approach to their mapping, supported by a structured database. The elements of this database are: the three-dimensional extent of the phenomenon, natural causes and conditions (geological lithology, slope, weathering depth and products, rainfall patterns, natural environment) and the anthropogenic factors corresponding to the various human activities. The extent of the area provides information about the possibilities and opportunities for mitigation techniques. The lithological nature allows an understanding of the influence of the nature of the rock and its structure on the intensity of rock weathering, as well as the geotechnical properties of the weathering products. Slope influences land stability. The intensity of annual, monthly and daily rainfall helps to understand the conditions of water saturation of the terrain. Certain natural circumstances, such as the presence of streams and rivers, promote foot-slope erosion and thus the occurrence and activity of mass movements. The construction of some infrastructure, such as new roads and agglomerations, deeply modifies the flow of surface and underground water and is followed by mass movements. Using geospatial data selected along the East African Rift in Burundi, cases of mass movements are presented illustrating the nature, importance, various factors and extent of the damage. An analysis of these elements for each hazard can guide the options for mitigation of the phenomenon and its consequences.

Keywords: mass movement, landslide, mudflow, debris flow, spectacular erosion, mega-gully, flash flood, alluvial deposit, East African rift, Burundi

Procedia PDF Downloads 303
25377 Energy Intensity: A Case of Indian Manufacturing Industries

Authors: Archana Soni, Arvind Mittal, Manmohan Kapshe

Abstract:

Energy has been recognized as one of the key inputs for the economic growth and social development of a country. High economic growth naturally means a high level of energy consumption. However, in the present energy scenario, where there is a wide gap between energy generation and energy consumption, it is extremely difficult to match demand with supply. India being one of the largest and most rapidly growing developing countries, there is an impending energy crisis which requires immediate measures to be adopted. In this situation, the concept of Energy Intensity comes under special focus to ensure energy security in an environmentally sustainable way. Energy Intensity is defined as the energy consumed per unit output in the context of industrial energy practices. It is a key determinant of projections of future energy demand, which assists in policy making. Energy Intensity is inversely related to energy efficiency; the less energy required to produce a unit of output or service, the greater the energy efficiency. The Energy Intensity of Indian manufacturing industries is among the highest in the world and accounts for enormous energy consumption. Hence, reducing the Energy Intensity of Indian manufacturing industries is one of the best strategies to achieve a low level of energy consumption and conserve energy. This study attempts to analyse the factors which influence the Energy Intensity of Indian manufacturing firms and how they can be used to reduce it. The paper considers six of the largest energy-consuming manufacturing industries in India, viz. the Aluminium, Cement, Iron & Steel, Textile, Fertilizer and Paper industries, and conducts a detailed Energy Intensity analysis using data from the PROWESS database of the Centre for Monitoring Indian Economy (CMIE). A total of twelve independent explanatory variables based on various factors such as raw material, labour, machinery, repair and maintenance, production technology, outsourcing, research and development, number of employees, wages paid, profit margin and capital invested have been taken into consideration for the analysis.
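
A hedged sketch of the kind of analysis described above: energy intensity computed as energy consumed per unit output, then regressed on a few explanatory factors. The variable names and figures are hypothetical stand-ins, not the CMIE PROWESS schema or the twelve variables actually used:

```python
# Illustrative sketch: energy intensity = energy cost / output, regressed on
# a few explanatory factors (hypothetical firm-year panel).
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "energy_cost": [120.0, 95.0, 310.0, 80.0, 150.0],
    "output":      [900.0, 850.0, 1800.0, 700.0, 1000.0],
    "wages":       [60.0, 55.0, 120.0, 40.0, 70.0],
    "rnd_expense": [5.0, 3.0, 12.0, 1.0, 6.0],
    "capital":     [400.0, 350.0, 900.0, 250.0, 420.0],
})
df["energy_intensity"] = df["energy_cost"] / df["output"]

X = sm.add_constant(df[["wages", "rnd_expense", "capital"]])
model = sm.OLS(df["energy_intensity"], X).fit()
print(model.params)   # sign and size of each factor's association with intensity
```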

Keywords: energy intensity, explanatory variables, manufacturing industries, PROWESS database

Procedia PDF Downloads 327
25376 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework

Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi

Abstract:

There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task. Therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework. Hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology to the key-frames. The OCR output and the detected slide text line types are adopted for keyword extraction, by which both video-level and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework.
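
A small sketch of the segment-level keyword extraction step described above, using simple term-frequency scoring on OCR text; the OCR output and segment boundaries are hypothetical, and in a real deployment this step would run as a distributed map task rather than a single script:

```python
# Small sketch: extract segment-level keywords from slide text recognized by
# OCR, using simple term-frequency scoring (hypothetical OCR output).
from collections import Counter
import re

STOPWORDS = {"the", "of", "and", "a", "to", "in", "is", "for", "on"}

def segment_keywords(ocr_text, top_n=5):
    words = re.findall(r"[a-z]{3,}", ocr_text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(top_n)]

segments = {
    "00:00-05:00": "Introduction to distributed computing and the Hadoop framework",
    "05:00-12:00": "MapReduce: map tasks, shuffle, reduce tasks and fault tolerance",
}
index = {seg: segment_keywords(text) for seg, text in segments.items()}
print(index)   # segment-level keywords for content-based browsing and search
```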

Keywords: video lectures, big video data, video retrieval, hadoop

Procedia PDF Downloads 527
25375 Least-Square Support Vector Machine for Characterization of Clusters of Microcalcifications

Authors: Baljit Singh Khehra, Amar Partap Singh Pharwaha

Abstract:

Clusters of Microcalcifications (MCCs) are the most frequent symptom of Ductal Carcinoma in Situ (DCIS) recognized by mammography. The Least-Square Support Vector Machine (LS-SVM) is a variant of the standard SVM. In this paper, LS-SVM is proposed as a classifier for classifying MCCs as benign or malignant based on relevant features extracted from enhanced mammograms. To establish the credibility of the LS-SVM classifier for classifying MCCs, a comparative evaluation of the relative performance of the LS-SVM classifier for different kernel functions is made. For the comparative evaluation, the confusion matrix and ROC analysis are used. Experiments are performed on data extracted from mammogram images of the DDSM database. A total of 380 suspicious areas, containing 235 malignant and 145 benign samples, are collected from mammogram images of the DDSM database. A set of 50 features is calculated for each suspicious area. After this, an optimal subset of the 23 most suitable features is selected from the 50 features by Particle Swarm Optimization (PSO). The results of the proposed study are quite promising.
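
A compact sketch of a generic textbook LS-SVM classifier formulation (not the authors' implementation): the dual solution is obtained from a single linear system rather than the quadratic programme of the standard SVM. The 23-dimensional feature vectors below are synthetic stand-ins for the PSO-selected features:

```python
# LS-SVM classifier in NumPy: solves [[0, y^T],[y, Omega + I/gamma]][b; a] = [0; 1]
# with Omega_ij = y_i y_j K(x_i, x_j), then predicts sign(sum_i a_i y_i K(x, x_i) + b).
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = y, y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def lssvm_predict(X_train, y, b, alpha, X_new, sigma=1.0):
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ (alpha * y) + b)

# Hypothetical feature vectors for benign (-1) / malignant (+1) MCC regions
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 23)), rng.normal(1.5, 1, (40, 23))])
y = np.concatenate([-np.ones(40), np.ones(40)])
b, alpha = lssvm_train(X, y, gamma=10.0, sigma=5.0)
print((lssvm_predict(X, y, b, alpha, X, sigma=5.0) == y).mean())   # training accuracy
```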

Keywords: clusters of microcalcifications, ductal carcinoma in situ, least-square support vector machine, particle swarm optimization

Procedia PDF Downloads 350
25374 GIS-Based Identification of Overloaded Distribution Transformers and Calculation of Technical Electric Power Losses

Authors: Awais Ahmed, Javed Iqbal

Abstract:

Pakistan has for many years been facing extreme challenges from an energy deficit due to the shortage of power generation compared to increasing demand. A part of this energy deficit is also contributed by the power lost in the transmission and distribution network. Unfortunately, distribution companies are not equipped with modern technologies and methods to identify and eliminate these losses. According to estimates, the total energy lost in the early 2000s was between 20 and 26 percent. To address this issue, the present research study was designed with the objective of developing a standalone GIS application for distribution companies with the capability of loss calculation as well as identification of overloaded transformers. For this purpose, the Hilal Road feeder in the Faisalabad Electric Supply Company (FESCO) was selected as the study area. An extensive GPS survey was conducted to identify each consumer, linking it to the secondary pole of the transformer, geo-referencing equipment and documenting conductor sizes. To identify overloaded transformers, the accumulated kWh readings of the consumers on a transformer were compared with a threshold kWh. Technical losses of the 11kV and 220V lines were calculated using the data from the substation and the resistance of the network calculated from the geo-database. To automate the process, a standalone GIS application was developed using ArcObjects with engineering analysis capabilities. The application uses the GIS database developed for the 11kV and 220V lines to display and query spatial data and present results in the form of graphs. The results show a technical loss of about 14% on both the high tension (HT) and low tension (LT) networks, while about 4 out of 15 general duty transformers were found to be overloaded. The study shows that GIS can be a very effective tool for distribution companies in the management and planning of their distribution network.
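
A back-of-the-envelope sketch of a conductor-loss calculation of the kind performed above, where resistance is derived from conductor data in the geo-database (resistance per km times length); all figures below are illustrative, not FESCO measurements:

```python
# Sketch: technical loss on a 3-phase feeder section from load current and
# conductor resistance; I = P / (sqrt(3) * V * pf), loss = 3 * I^2 * R.
import math

def section_loss_kwh(load_kw, voltage_kv, pf, r_ohm_per_km, length_km, hours):
    current_a = (load_kw * 1000) / (math.sqrt(3) * voltage_kv * 1000 * pf)
    r_total = r_ohm_per_km * length_km
    loss_kw = 3 * current_a ** 2 * r_total / 1000
    return loss_kw * hours

# Hypothetical 11 kV section: 250 kW load, 0.9 pf, 0.55 ohm/km, 2.3 km, one day
print(round(section_loss_kwh(250, 11, 0.9, 0.55, 2.3, 24), 2), "kWh/day")
```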

Keywords: geographical information system, GIS, power distribution, distribution transformers, technical losses, GPS, SDSS, spatial decision support system

Procedia PDF Downloads 372
25373 Mapping and Measuring the Vulnerability Level of the Belawan District Community in Encountering the Rob Flood Disaster

Authors: Dessy Pinem, Rahmadian Sembiring, Adanil Bushra

Abstract:

Medan Belawan is one of the 21 subdistricts of Medan. The Medan Belawan subdistrict is directly adjacent to the Malacca Strait in the north. Due to this direct border with the Malacca Strait, the problem in this subdistrict, which has continued for many years, is rob (tidal) flooding. In 2015, rob floods inundated Sicanang urban village, Belawan I urban village, Belawan Bahagia urban village and Bagan Deli village. The extent of inundation in the rob flood that occurred in September 2015 reached 540,938 ha. A rob flood is a phenomenon in which sea water overflows onto the mainland. Rob floods can also be interpreted as a pooling of water on coastal land that occurs at high tide, inundating parts of the coastal plain or places lower than the high-tide sea level. Rob flooding is a daily disaster faced by the residents of the Medan Belawan district. Rob floods can happen every month and last for a week. The floods inundate not only the residents' houses; they also soak the main road to Belawan Port to a depth of up to 50 cm. To deal with the problems caused by the floods and to prepare coastal communities for the character of coastal areas, it is necessary to know the vulnerability of the people who are repeatedly the victims of rob floods. Are the people of the Medan Belawan subdistrict, especially in the flood-affected villages, able to cope with the consequences of the floods? To answer this question, it is necessary to assess the vulnerability of the Belawan District community in the face of the rob flood disaster. This research is descriptive, qualitative and quantitative. Data were collected by observation, interviews and questionnaires in 4 urban villages often affected by rob floods. The vulnerabilities measured are physical, economic, social, environmental, organizational and motivational vulnerabilities. For physical vulnerability, the data collected are the distance between buildings, floor area ratio, drainage, and building materials. For economic vulnerability, the data collected are income, employment, building ownership, and insurance ownership. For social vulnerability, the data collected are education, number of family members, children, the elderly, gender, training for disasters, and how waste is disposed of. For organizational vulnerability, the data collected are the existence of organizations that advocate for the victims and the policies and laws governing the handling of tidal flooding. Motivational vulnerability is assessed from the presence of an information center or question-and-answer service about rob floods, and the existence of an evacuation plan or route to avoid the disaster or reduce the number of victims. The results of this study indicate that most people in the Medan Belawan subdistrict have a high level of vulnerability in the physical, economic, social, environmental, organizational and motivational fields. They have no access to economic empowerment, no insurance, no motivation to solve problems beyond hoping for government action, no organizations that support and defend them, and physical buildings that are easily destroyed by rob floods.

Keywords: disaster, rob flood, Medan Belawan, vulnerability

Procedia PDF Downloads 122
25372 Management Support, Role Ambiguity and Role Conflict among Professional Nurses at National Health Insurance Pilot Sites in South Africa: An Interpretive Phenomenology

Authors: Nomcebo N. Mpili, Cynthia Z. Madlabana

Abstract:

The South African Primary Health Care (PHC) system has undergone a number of transformations, such as the introduction of National Health Insurance (NHI), to bring about easily accessible universal health coverage and to meet the health needs of all its citizens. This presents ongoing challenges to ensure that health workers are equipped with the appropriate knowledge, support, and skills to meet these changes. It is therefore crucial to understand the experiences and challenges of nurses, as the backbone of PHC, in providing quality healthcare services. In addition, there has been a need to understand nurses' experiences with management support, role ambiguity and role conflict, among other challenges, in light of the current reforms in healthcare. Indeed, these constructs are notorious for having a detrimental impact on the outcomes of change initiatives within any organisation; this is no different in healthcare. This motivates a discussion of professional nurses within the South African healthcare system, especially since they have been labelled the backbone of PHC, meaning any healthcare backlog falls on them. The study made use of semi-structured interviews and adopted the interpretative phenomenological approach (IPA), as the researcher aimed to explore the lived experiences of the participants (n=18). The study found that professional nurses experienced a lack of management support within PHC facilities and that management mainly played an administrative and disciplinary role. Although participants mainly held positive perceptions of the changes happening in healthcare, they also reported negative experiences of how change initiatives were introduced, resulting in role conflict and role ambiguity. Participants mentioned a shortage of staff, inadequate training, and a lack of management support as some of the key challenges faced in facilities. This study offers unique findings, as the participants have not only experienced the various reforms within the PHC system but have also been part of the NHI pilot. The authors are not aware of any other published studies that examine management support, role conflict and role ambiguity together, especially in South African PHC facilities. In conclusion, understanding these challenges may provide insight and opportunities to improve the current landscape of PHC, not only in South Africa but internationally.

Keywords: management support, professional nurse, role ambiguity, role conflict

Procedia PDF Downloads 139
25371 Investigating the Association between Escherichia Coli Infection and Breast Cancer Incidence: A Retrospective Analysis and Literature Review

Authors: Nadia Obaed, Lexi Frankel, Amalia Ardeljan, Denis Nigel, Anniki Witter, Omar Rashid

Abstract:

Breast cancer is the most common cancer among women, with a lifetime risk of one in eight for all women in the United States. Although breast cancer is prevalent throughout the world, the uneven distribution in incidence and mortality rates is shaped by variation in population structure, environment, genetics and known lifestyle risk factors. Furthermore, the bacterial profile in healthy and cancerous breast tissue differs, with a higher relative abundance of bacteria capable of causing DNA damage in breast cancer patients. Previous bacterial infections may change the composition of the microbiome and partially account for the environmental factors promoting breast cancer. One study found that higher amounts of Staphylococcus, Bacillus, and Enterobacteriaceae, of which Escherichia coli (E. coli) is a part, were present in breast tumor tissue. Based on E. coli's ability to damage DNA, it is hypothesized that there is an increased risk of breast cancer associated with previous E. coli infection. Therefore, the purpose of this study was to evaluate the correlation between E. coli infection and the incidence of breast cancer. Holy Cross Health, Fort Lauderdale, provided access to the Health Insurance Portability and Accountability Act (HIPAA) compliant national database for the purpose of academic research. International Classification of Diseases 9th and 10th revision codes (ICD-9, ICD-10) were then used to conduct a retrospective analysis using data from January 2010 to December 2019. All breast cancer diagnoses and all patients infected versus not infected with E. coli that underwent typical E. coli treatment were investigated. The obtained data were matched for age, Charlson Comorbidity Index (CCI) score, and antibiotic treatment. Standard statistical methods were applied to determine statistical significance, and an odds ratio was used to estimate the relative risk. A total of 81286 patients were identified and analyzed from the initial query and then reduced to 31894 antibiotic-specific treated patients in the infected and control groups, respectively. The incidence of breast cancer was 2.51%, present in 2043 patients in the E. coli group, compared to 5.996%, present in 4874 patients in the control group. The incidence of breast cancer was 3.84%, present in 1223 patients in the treated E. coli group, compared to 6.38%, present in 2034 patients in the treated control group. The decreased incidence of breast cancer in the E. coli and treated E. coli groups was statistically significant, with p-values of 2.2x10^-16 and 2.264x10^-16, respectively. The odds ratios in the E. coli and treated E. coli groups were 0.784 and 0.787, with 95% confidence intervals of 0.756-0.813 and 0.743-0.833, respectively. The current study shows a statistically significant decrease in breast cancer incidence in association with previous Escherichia coli infection. Researching the relationship between individual bacterial species and cancer is important, as only up to 10% of breast cancer risk is attributable to genetics, while the contribution of environmental factors, including previous infections, potentially accounts for the majority of the preventable risk. Further evaluation is recommended to assess the potential and mechanism of E. coli in decreasing the risk of breast cancer.
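
A generic sketch of the 2x2 odds-ratio calculation used in retrospective database comparisons of this kind; the counts below are hypothetical and are not the study's matched tables:

```python
# Generic sketch: odds ratio and 95% CI from a 2x2 table (hypothetical counts).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = cases/non-cases in the exposed group, c/d in the unexposed group."""
    oratio = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(oratio) + s * z * se_log) for s in (-1, 1))
    return round(oratio, 3), round(lo, 3), round(hi, 3)

print(odds_ratio_ci(a=200, b=9800, c=250, d=9750))   # hypothetical counts
```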

Keywords: breast cancer, escherichia coli, incidence, infection, microbiome, risk

Procedia PDF Downloads 251
25370 Correlation and Prediction of Biodiesel Density

Authors: Nieves M. C. Talavera-Prieto, Abel G. M. Ferreira, António T. G. Portugal, Rui J. Moreira, Jaime B. Santos

Abstract:

Knowledge of biodiesel density over large ranges of temperature and pressure is important for predicting the behavior of fuel injection and combustion systems in diesel engines, and for the optimization of such systems. In this study, cottonseed oil was transesterified into biodiesel and its density was measured at temperatures between 288 K and 358 K and pressures between 0.1 MPa and 30 MPa, with an expanded uncertainty estimated as ±1.6 kg.m^-3. Experimental pressure-volume-temperature (pVT) cottonseed biodiesel data were used along with literature data for 18 other biodiesels in order to build a database used to test the correlation of density with temperature and pressure using the Goharshadi-Morsali-Abbaspour equation of state (GMA EoS). To our knowledge, this is the first time that density measurements are presented for cottonseed biodiesel under such high pressures, and that the GMA EoS is used to model biodiesel density. The newly tested EoS allowed correlations within 0.2 kg.m^-3, corresponding to average relative deviations within 0.02%. The database built was then used to develop and test a new fully predictive model derived from the observed linear relation between density and degree of unsaturation (DU), which depends on the biodiesel FAME profile. The average density deviation of this method was only about 3 kg.m^-3 within the temperature and pressure limits of application. These results represent appreciable improvements in the context of density prediction at high pressure when compared with other equations of state.
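
A minimal sketch of the linear density-versus-DU relation mentioned above, fitted by ordinary least squares; the (DU, density) pairs are hypothetical values at an assumed fixed temperature and pressure, not the measured database:

```python
# Minimal sketch: least-squares fit of biodiesel density against degree of
# unsaturation (DU) at fixed T and p (hypothetical data points).
import numpy as np

du = np.array([0.55, 0.85, 1.10, 1.35, 1.55])       # degree of unsaturation
rho = np.array([874.0, 878.5, 882.0, 885.5, 888.0])  # density / kg.m^-3

slope, intercept = np.polyfit(du, rho, 1)
pred = intercept + slope * du
print(f"rho = {intercept:.1f} + {slope:.1f}*DU, "
      f"max deviation = {np.abs(pred - rho).max():.2f} kg.m^-3")
```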

Keywords: biodiesel density, correlation, equation of state, prediction

Procedia PDF Downloads 609
25369 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as the active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, the Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks such as clustering and classification with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.

Keywords: active contour, bayesian, echocardiographic image, feature vector

Procedia PDF Downloads 416
25368 Approaches to Estimating the Radiation and Socio-Economic Consequences of the Fukushima Daiichi Nuclear Power Plant Accident Using the Data Available in the Public Domain

Authors: Dmitry Aron

Abstract:

Major radiation accidents carry potential risks of negative consequences for public health not only because of exposure but also because of the large-scale emergency measures taken by the authorities to protect the population, which can lead to unreasonable social and economic damage. It is, as a rule, technically difficult to assess the possible costs and damages of decisions on evacuation or resettlement of residents in the shortest possible time, since this requires specially prepared information systems containing relevant information on demographic and economic parameters and incoming data on the radiation situation. Foreign observers also face difficulties in assessing the consequences of an accident on foreign territory, since they usually do not have official and detailed statistical data on the territory of the foreign state beforehand. They may also regard the use of unofficial data from open Internet sources as an unreliable and overly labor-intensive procedure. This paper describes an approach to the prompt creation of a relational database that contains detailed, actual data on the economics, demographics and radiation situation in Fukushima Prefecture during the Fukushima Daiichi NPP accident, collected by the author from open Internet sources. This database was developed and used to assess the number of evacuated residents, radiation doses, expected financial losses and other parameters of the affected areas. The costs for the areas with temporarily evacuated and long-term resettled populations were investigated, and the radiological and economic effectiveness of the measures taken to protect the population was estimated. Some of the results are presented in the article. The study showed that such a tool for analyzing the consequences of radiation accidents can be prepared in a short space of time for the entire territory of Japan, and that it can serve to model the social and economic consequences of hypothetical accidents at any nuclear power plant on its territory.
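
A schematic sketch of the kind of relational structure described above; the table and column names are assumptions for illustration, not the author's actual schema:

```python
# Schematic sketch: a small relational structure linking municipalities,
# demographic/economic attributes and radiation readings (assumed schema).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE municipality (
    id INTEGER PRIMARY KEY,
    name TEXT,
    population INTEGER,
    mean_income REAL
);
CREATE TABLE radiation_reading (
    id INTEGER PRIMARY KEY,
    municipality_id INTEGER REFERENCES municipality(id),
    reading_date TEXT,
    dose_rate_usv_h REAL
);
""")
conn.execute("INSERT INTO municipality VALUES (1, 'Example Town', 12000, 3.1e6)")
conn.execute("INSERT INTO radiation_reading VALUES (1, 1, '2011-03-20', 4.2)")

# e.g. average dose rate per municipality alongside its population
rows = conn.execute("""
SELECT m.name, m.population, AVG(r.dose_rate_usv_h)
FROM municipality m JOIN radiation_reading r ON r.municipality_id = m.id
GROUP BY m.id
""").fetchall()
print(rows)
```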

Keywords: Fukushima, radiation accident, emergency measures, database

Procedia PDF Downloads 190
25367 Fluid Prescribing Post Laparotomies

Authors: Gusa Hall, Barrie Keeler, Achal Khanna

Abstract:

Introduction: NICE guidelines have highlighted the consequences of IV fluid mismanagement. The main aim of this study was to audit fluid prescribing after laparotomies to identify whether fluids were prescribed in accordance with NICE guidelines. Methodology: A retrospective database search of eight specific laparotomy procedures (right and left colectomy, Hartmann's procedure, small bowel resection, perforated ulcer, abdominoperineal resection, anterior resection, panproctocolectomy, subtotal colectomy) identified 29 laparotomies between April 2019 and May 2019. Two of the 29 patients had secondary procedures during the same admission, giving n=27 patients. Database case notes were reviewed for date of procedure, length of admission, fluid prescribed and amount, nasogastric tube output, daily blood results for the electrolytes sodium and potassium, and operative losses. Results: Of the 27 patients identified between April 2019 and May 2019, 93% (25/27) received IV fluids, but only 19% (5/27) received the correct IV fluids in accordance with NICE guidelines; 93% (25/27) of those who received IV fluids had correct electrolyte levels (sodium and potassium); 100% (27/27) of patients received blood tests (U&Es) for the relevant electrolytes; 0% (0/27) had documentation of operative losses. IV fluids matched nasogastric tube output in 100% (3/3) of the patients who had a nasogastric tube in situ. Conclusion: A PubMed literature review on barriers to safer IV prescribing highlighted educational interventions focused on prescriber knowledge rather than on how to execute the prescribing task. This audit suggests IV fluids after laparotomies are not being prescribed consistently in accordance with NICE guidelines. Surgical management plans should be clearer on IV fluid and electrolyte requirements for the 24 hours after the plan has been initiated. In addition, further teaching and training around IV prescribing are needed, together with frequent surgical audits of IV fluid prescribing post-surgery to evaluate improvements.

Keywords: audit, IV Fluid prescribing, laparotomy, NICE guidelines

Procedia PDF Downloads 114
25366 Investigating the Side Effects in Patients with Severe COVID-19 and Choosing Appropriate Medication Regimens to Deal with Them

Authors: Rasha Ahmadi

Abstract:

In December 2019, a coronavirus, now identified as SARS-CoV-2, produced a series of acute atypical respiratory illnesses in Wuhan, Hubei Province, China. The illness induced by this virus was named COVID-19. The virus is transmissible between humans and has caused a pandemic worldwide. The death toll continues to climb, and a large number of countries have been obliged to impose social isolation and lockdowns. The lack of a focused therapy continues to be a problem. Epidemiological research has shown that senior patients are more susceptible to severe disease, whereas children tend to have milder symptoms. In this study, we focus on other possible side effects of COVID-19 and more detailed treatment strategies. Using bioinformatics analysis, we first obtained the gene expression profiles of patients with severe COVID-19 from the GEO database; patients' blood samples were used in the GSE183071 dataset. We then categorized the genes into those with high and low expression. In the next step, we uploaded the gene lists separately to the Enrichr database and evaluated our data for signs and symptoms as well as related medication regimens. The results showed that 138 genes with high expression and 108 genes with low expression were differentially observed in the severe COVID-19 versus control group. Symptoms and diseases such as embolism and thrombosis of the abdominal aorta, ankylosing spondylitis, suicidal ideation or attempt, and regional enteritis were associated with the highly expressed genes, while acute and subacute forms of ischemic heart disease, CNS infection and poliomyelitis, and synovitis and tenosynovitis were associated with the genes with low expression. Following the detection of diseases and possible signs and symptoms, Carmustine, Bithionol and Leflunomide were evaluated as more significant for the high-expression genes, and Chlorambucil, Ifosfamide, Hydroxyurea and Bisphenol for the low-expression genes. In general, examining the different and less visible aspects of COVID-19 and identifying possible treatments can help us significantly in the emergency care and hospitalization of patients.
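
A simple sketch of the high/low expression split described above; the expression table, gene names and thresholds are hypothetical, not GSE183071 itself, and each resulting list would then be submitted separately for enrichment analysis:

```python
# Simple sketch: split differentially expressed genes into up- and
# down-regulated lists (hypothetical expression table and thresholds).
import pandas as pd

deg = pd.DataFrame({
    "gene":  ["IL6", "CXCL10", "CD8A", "MKI67", "ALB"],
    "logFC": [2.4, 1.9, -1.7, 1.1, -2.2],
    "adj_p": [0.001, 0.004, 0.010, 0.20, 0.002],
})

sig = deg[deg["adj_p"] < 0.05]
up_genes = sig.loc[sig["logFC"] > 1, "gene"].tolist()     # "high expression"
down_genes = sig.loc[sig["logFC"] < -1, "gene"].tolist()  # "low expression"
print(up_genes, down_genes)   # each list is uploaded separately for enrichment
```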

Keywords: phenotypes, drug regimens, gene expression profiles, bioinformatics analysis, severe COVID-19

Procedia PDF Downloads 135
25365 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves

Authors: Shengnan Chen, Shuhua Wang

Abstract:

The successful production of hydrocarbons from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority for society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including reservoir geological data, reservoir geophysical data, well completion data and production data for thousands of wells is first established to discover valuable insights and knowledge related to the development of tight oil reserves. Several data analysis methods are introduced to analyze such a huge dataset. For example, k-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data. Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery and greater economic return for future wells in unconventional oil reserves.
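
An illustrative sketch of the PCA and k-means steps named above; the well records are synthetic and the attribute names are assumptions, not the proprietary database described in the abstract:

```python
# Illustrative sketch: standardize completion/production attributes, compress
# with PCA, then partition wells with k-means (synthetic well records).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# columns: lateral length, number of frac stages, proppant mass, 12-month oil
wells = np.column_stack([
    rng.normal(1800, 300, 500),
    rng.integers(15, 40, 500),
    rng.normal(3000, 600, 500),
    rng.normal(40_000, 9_000, 500),
])

X = StandardScaler().fit_transform(wells)
scores = PCA(n_components=2).fit_transform(X)       # emphasize dominant variation
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))                          # well counts per cluster
```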

Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves

Procedia PDF Downloads 281
25364 Attribute Index and Classification Method of Earthquake Damage Photographs of Engineering Structure

Authors: Ming Lu, Xiaojun Li, Bodi Lu, Juehui Xing

Abstract:

The earthquake damage phenomena of each large earthquake provide a comprehensive and profound real-world test of the dynamic performance and failure mechanisms of different engineering structures. Learning about the characteristics of engineering structures through seismic damage phenomena is often far superior to expensive shaking-table experiments. After an earthquake, people record a variety of different types of engineering damage photographs. However, a large number of earthquake damage photographs lack sufficient information, which reduces their value for use. To improve the research value and the efficiency of use of engineering seismic damage photographs, this paper aims to explore and present seismic damage background information, which includes the earthquake magnitude, earthquake intensity, and the characteristics of the damaged structures. Based on the research requirements in the earthquake engineering field, the authors use photographs of the 2008 Wenchuan M8.0 earthquake in China and provide four kinds of attribute indexes and classifications, which are seismic information, structure types, earthquake-damaged parts and disaster causation factors. The final objective is to set up an engineering structural seismic damage database based on these four attribute indicators and classifications, and eventually to build a website providing seismic damage photographs.

Keywords: attribute index, classification method, earthquake damage picture, engineering structure

Procedia PDF Downloads 760
25363 A Framework for an Automated Decision Support System for Selecting Safety-Conscious Contractors

Authors: Rawan A. Abdelrazeq, Ahmed M. Khalafallah, Nabil A. Kartam

Abstract:

Selection of competent contractors for construction projects is usually accomplished through competitive bidding or negotiated contracting, in which the contract bid price is the basic criterion for selection. The evaluation of contractors' safety performance is still not a typical criterion in the selection process, despite the existence of various safety prequalification procedures. There is a critical need for practical and automated systems that enable owners and decision makers to evaluate contractor safety performance, among other important contractor selection criteria. These systems should ultimately favor the selection of safety-conscious contractors by virtue of their good past safety records and current safety programs. This paper presents an exploratory sequential mixed-methods approach to develop a framework for an automated decision support system that evaluates contractor safety performance based on a multitude of indicators and metrics identified through a comprehensive review of construction safety research and a survey distributed to domain experts. The framework is developed in three phases: (1) determining the indicators that depict contractor current and past safety performance; (2) soliciting input from construction safety experts regarding the identified indicators, their metrics, and their relative significance; and (3) designing a decision support system using relational database models to integrate the identified indicators and metrics into a system that assesses and rates the safety performance of contractors. The proposed automated system is expected to hold several advantages, including: (1) reducing the likelihood of selecting contractors with poor safety records; (2) enhancing the odds of completing the project safely; and (3) encouraging contractors to exert more effort to improve their safety performance and practices in order to increase their bid-winning opportunities, which can lead to significant safety improvements in the construction industry. This should prove useful to decision makers and researchers alike and should help improve the safety record of the construction industry.

Keywords: construction safety, contractor selection, decision support system, relational database

Procedia PDF Downloads 276
25362 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data

Authors: M. Mueller, M. Kuehn, M. Voelker

Abstract:

In small and medium-sized enterprises (SMEs), the challenge is to create a well-grounded and reliable basis for process analysis, optimization and planning despite a lack of data. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology whose objective is to provide an SME-appropriate approach for efficient, temporarily feasible data collection and evaluation in flexible production and logistics systems as a basis for process analysis and optimization. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth Low Energy (BLE)-based transmitters, so-called beacons, and smart mobile devices (SMDs), e.g. smartphones, as receivers, between which distance data can be measured and motion profiles derived. The distance is determined using the Received Signal Strength Indicator (RSSI), which is a measure of the signal field strength between transmitter and receiver. The focus is the development of a software-based methodology for the interpretation of relative movements of transmitters and receivers based on distance data. The main research concerns the selection and implementation of pattern recognition methods for automatic process recognition as well as methods for the visualization of relative distance data. Because the database is already categorized by process type, classification methods (e.g. Support Vector Machines) from the field of supervised learning are used. The necessary data quality requires the selection of suitable methods and filters for smoothing the signal variations of the RSSI, the integration of methods for determining correction factors depending on possible signal interference sources (columns, pallets), as well as the configuration of the technology used. The parameter settings on which the respective algorithms are based have a further significant influence on the result quality of the classification methods, the correction models and the methods used for visualizing the position profiles. The accuracy of classification algorithms can be improved by up to 30% through selected parameter variation; this has already been proven in studies. Similar potential can be observed with parameter variation of the methods and filters for signal smoothing. Thus, there is increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized with a modular software architecture consisting of independent modules for data acquisition, data preparation and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. The data preparation, including the methods for signal smoothing, is Python-based, with the possibility of varying parameter settings and storing them in the database (SQLite). The evaluation is divided into two separate software modules with database connections: the automated assignment of defined process classes to distance data using selected classification algorithms, and visualization and reporting via a graphical user interface (GUI).
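
A small sketch of the signal-smoothing and distance-derivation steps discussed above: a moving-average filter over RSSI samples and the common log-distance path-loss model. The reference power, path-loss exponent and sample values are assumed for illustration, not the calibrated configuration of the demonstrator:

```python
# Small sketch: moving-average smoothing of RSSI samples and conversion to an
# approximate distance via the log-distance model (assumed parameters).
import numpy as np

def smooth(rssi, window=5):
    """Simple moving average over the last `window` RSSI samples."""
    kernel = np.ones(window) / window
    return np.convolve(rssi, kernel, mode="valid")

def rssi_to_distance(rssi_dbm, tx_power_at_1m=-59.0, path_loss_exponent=2.0):
    """Log-distance model: d = 10 ** ((P_1m - RSSI) / (10 * n))."""
    return 10 ** ((tx_power_at_1m - rssi_dbm) / (10 * path_loss_exponent))

raw = np.array([-62, -71, -65, -68, -74, -66, -69, -70])   # hypothetical samples
smoothed = smooth(raw)
print(np.round(rssi_to_distance(smoothed), 2))             # metres (approximate)
```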

Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing

Procedia PDF Downloads 125
25361 Anemia and Nutritional Status as Dominant Factor of the Event Low Birth Weight in Indonesia: A Systematic Review

Authors: Lisnawati Hutagalung

Abstract:

Background: Low birth weight (LBW) is one cause of newborn death. Babies with low birth weight tend to have slower cognitive development and growth retardation, are more at risk of infectious disease and are at greater risk of death. Objective: To identify the risk factors and dominant factors that influence the incidence of LBW in Indonesia. Method: This research used several public health databases, such as Google Scholar, UGM journals, UI journals and UNAND journals, for 2012-2015. Data were filtered using the keywords 'Risk Factors' AND 'Cause LBW', yielding 2757 studies. Filtering produced 5 public health studies that met the criteria. Results: The risk factors associated with LBW include environmental factors (exposure to cigarette smoke and residence), socio-demographic factors (age and socio-economic status) and maternal factors (anemia, placental abnormalities, maternal nutritional status, antenatal examinations, preeclampsia, parity, and complications in pregnancy). Anemia and nutritional status are the dominant factors affecting LBW. Conclusions: The risk factors that affect LBW are most commonly found among the maternal factors. The dominant factors with a large effect on LBW are anemia and the nutritional status of the mother during pregnancy.

Keywords: low birth weight, anemia, nutritional status, the dominant factor

Procedia PDF Downloads 362
25360 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques

Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev

Abstract:

Rapidly evolving data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation, so the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the performance indicators of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients and in-patient average length of stay were selected as the performance indicators and as measures of demand for the medical facility. Hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes and lithotripters) and physicians characterized the resource provision of the medical institutions in the developed models. The data source for the research was the open statistical database of Eurostat, chosen because it contains complete and open information necessary for research tasks in the field of public health and offers a user-friendly interface for quickly building analytical reports. The study covers 28 European countries for the period from 2007 to 2016. For all countries included in the study with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made by cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration in order to identify groups of similar countries and to construct separate regression models for them. Therefore, the original time series were used as the objects of clustering. The k-medoids algorithm was used, with sampled objects serving as the cluster centers, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, the predictive power of the models improved significantly: for example, in one of the clusters the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used to make decisions on the staffing and resource provision of hospitals. The research reveals strong dependencies between the demand for medical services and the modern medical equipment variables, which highlights the importance of the technological component for the successful development of a medical facility. Data analysis currently has huge potential to significantly improve health services, and medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
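
A small, self-contained Python sketch of the described workflow is given below: clustering country time series with a minimal k-medoids routine on a precomputed distance matrix, choosing the number of clusters by silhouette coefficient, and fitting a separate regression with MAPE per cluster. All data are random placeholders and the k-medoids routine is a simplified stand-in, not the exact variant used in the study.

```python
# Minimal sketch under assumed data shapes; not the authors' code.
import numpy as np
from sklearn.metrics import silhouette_score
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
series = rng.random((28, 10))                 # placeholder: 28 countries x 10 yearly observations

def k_medoids(D, k, n_iter=100):
    """Very small PAM-style k-medoids on a precomputed distance matrix D (assumes non-empty clusters)."""
    medoids = rng.choice(D.shape[0], size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = np.array([
            np.where(labels == c)[0][np.argmin(D[np.ix_(labels == c, labels == c)].sum(axis=1))]
            for c in range(k)
        ])
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return labels, medoids

# Euclidean distances between whole time series, i.e. similarity of joint behaviour over time.
D = np.linalg.norm(series[:, None, :] - series[None, :, :], axis=-1)

# Pick the number of clusters by silhouette coefficient, as mentioned in the abstract.
best_k, best_score = 2, -1.0
for k in range(2, 6):
    labels, _ = k_medoids(D, k)
    score = silhouette_score(D, labels, metric="precomputed")
    if score > best_score:
        best_k, best_score, best_labels = k, score, labels

# Separate regression per cluster; MAPE as the reported error measure.
X, y = rng.random((28, 3)), rng.random(28)    # placeholder resource features and demand indicator
for c in range(best_k):
    idx = best_labels == c
    model = LinearRegression().fit(X[idx], y[idx])
    mape = np.mean(np.abs((y[idx] - model.predict(X[idx])) / y[idx])) * 100
    print(f"cluster {c}: MAPE = {mape:.2f}%")
```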

Keywords: data analysis, demand modeling, healthcare, medical facilities

Procedia PDF Downloads 142
25359 Operational Excellence Performance in Pharmaceutical Quality Control Labs: An Empirical Investigation of the Effectiveness and Efficiency Relation

Authors: Stephan Koehler, Thomas Friedli

Abstract:

Performance measurement has evolved over time from a unidimensional, short-term, efficiency-focused approach into a balanced multidimensional approach. Today, integrated performance measurement frameworks are often used to avoid local optimization and to encourage continuous improvement of an organization. In the literature, the multidimensional character of performance measurement is often described by competitive priorities. At the same time, at the highest level of abstraction an effectiveness and an efficiency dimension of performance measurement can be distinguished. This paper aims at a better understanding of the composition of effectiveness and efficiency and their relation in pharmaceutical quality control labs. The research comprises a lab-specific operationalization of effectiveness and efficiency and examines how the two dimensions are interlinked. The basis for the analysis is a database of the University of St. Gallen covering a diverse set of 40 different pharmaceutical quality control labs. The research provides empirical evidence that labs with high effectiveness also exhibit high efficiency: lab effectiveness explains 29.5% of the variance in lab efficiency. In addition, labs with above-median operational excellence performance have statistically significantly higher lab effectiveness and lab efficiency compared to the below-median performing labs.
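
The reported relation can be illustrated with a short, hedged Python sketch: a simple linear regression whose squared correlation corresponds to the share of explained variance, and a t-test comparing above- and below-median performers. The scores below are random placeholders, since the 40-lab dataset is not public.

```python
# Illustrative only: synthetic effectiveness/efficiency scores, not the St. Gallen lab data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
effectiveness = rng.normal(size=40)                                  # placeholder lab effectiveness scores
efficiency = 0.5 * effectiveness + rng.normal(scale=0.8, size=40)    # placeholder lab efficiency scores

# r**2 of a simple linear regression equals the share of variance in efficiency
# explained by effectiveness (reported as 29.5% in the study).
res = stats.linregress(effectiveness, efficiency)
print(f"R^2 = {res.rvalue ** 2:.3f}")

# Compare above- vs. below-median operational excellence performance (placeholder OE score).
oe_score = effectiveness + efficiency
above = oe_score > np.median(oe_score)
for name, values in [("effectiveness", effectiveness), ("efficiency", efficiency)]:
    t, p = stats.ttest_ind(values[above], values[~above])
    print(f"{name}: t = {t:.2f}, p = {p:.4f}")
```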

Keywords: empirical study, operational excellence, performance measurement, pharmaceutical quality control lab

Procedia PDF Downloads 156
25358 A Cloud Computing System Using Virtual Hyperbolic Coordinates for Services Distribution

Authors: Telesphore Tiendrebeogo, Oumarou Sié

Abstract:

Cloud computing technologies have attracted considerable interest in recent years and have become increasingly important for many existing database applications. They provide a new mode of using and offering IT resources in general; such resources can be used “on demand” by anybody with access to the internet. In particular, the Cloud platform provides an easy-to-use interface between providers and users, allowing providers to develop and offer software and databases to users across locations. Currently, many Cloud platform providers support large-scale database services. However, most of them only support simple keyword-based queries and cannot respond to complex queries efficiently, due to the lack of efficient multi-attribute indexing techniques. Existing Cloud platform providers therefore seek to improve the performance of indexing techniques for complex queries. In this paper, we define a new cloud computing architecture based on a Distributed Hash Table (DHT) and design a prototype system. We then implement and evaluate our cloud indexing structure based on a hyperbolic tree using virtual coordinates taken in the hyperbolic plane. Our experimental results, compared with other cloud systems, show that our solution ensures consistency and scalability for the Cloud platform.
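
To make the idea of virtual hyperbolic coordinates more concrete, the following Python sketch shows a distance function for the Poincaré disk model, a simplified placement of tree children within angular sectors, and greedy next-hop selection by hyperbolic distance. The layout scheme is an assumed simplification for illustration, not the exact embedding or DHT protocol of the paper.

```python
# Conceptual sketch (assumed coordinate scheme): nodes of an addressing tree receive virtual
# coordinates in the Poincare disk and route greedily toward the destination's coordinates.
import math

def hyperbolic_distance(u, v):
    """Distance between two points of the Poincare disk model."""
    duv2 = (u[0] - v[0]) ** 2 + (u[1] - v[1]) ** 2
    denom = (1 - (u[0] ** 2 + u[1] ** 2)) * (1 - (v[0] ** 2 + v[1] ** 2))
    return math.acosh(1 + 2 * duv2 / denom)

def child_coordinates(sector_start, sector_end, n_children, radius=0.8):
    """Place children of a tree node inside its angular sector (simplified: one fixed radius)."""
    coords = []
    for i in range(n_children):
        angle = sector_start + (i + 0.5) * (sector_end - sector_start) / n_children
        coords.append((radius * math.cos(angle), radius * math.sin(angle)))
    return coords

def greedy_next_hop(neighbors, destination):
    """Greedy routing: forward to the neighbor hyperbolically closest to the destination."""
    return min(neighbors, key=lambda n: hyperbolic_distance(n, destination))

# Tiny usage example: a root at the origin with three children; route toward the third child.
root = (0.0, 0.0)
children = child_coordinates(0.0, 2 * math.pi, 3)
target = children[2]
print(greedy_next_hop(children, target))   # picks the child closest to the target coordinates
```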

Keywords: virtual coordinates, cloud, hyperbolic plane, storage, scalability, consistency

Procedia PDF Downloads 420
25357 The Impact of Corporate Social Responsibility and Knowledge Management Factors on University's Students' Learning Process

Authors: Naritphol Boonyakiat

Abstract:

This research investigates the effects of corporate social responsibility and knowledge management factors on the learning process of students at Silpakorn University. The goal of this study is to fill the literature gap by gaining an understanding of the corporate social responsibility and knowledge management factors that fundamentally relate to students’ learning process within the university context. The study therefore focuses on the outcomes derived from a set of quantitative data obtained from Silpakorn University’s database of 200 students. The results represent the perceptions of students regarding the impact of corporate social responsibility and knowledge management factors on their learning process within the university. The findings indicate that corporate social responsibility and knowledge management have significant effects on students’ learning process. This study may assist in gaining a better understanding of the integrated aspects of university and learning environments, and in discovering how to optimally allocate the university’s resources and management approaches so as to benefit students’ learning process through corporate social responsibility and knowledge management practices. There is therefore sufficient reason to believe that the findings can contribute to research in the areas of CSR, KM and students’ learning process as an essential aspect of the university’s stakeholders.

Keywords: corporate social responsibility, knowledge management, learning process, university’s students

Procedia PDF Downloads 314