Search results for: robust penalized regression
3946 Catastrophic Health Expenditures: Evaluating the Effectiveness of Nepal's National Health Insurance Program Using Propensity Score Matching and Doubly Robust Methodology
Authors: Simrin Kafle, Ulrika Enemark
Abstract:
Catastrophic health expenditure (CHE) is a critical issue in low- and middle-income countries like Nepal, exacerbating financial hardship among vulnerable households. This study assesses the effectiveness of Nepal’s National Health Insurance Program (NHIP), launched in 2015, in reducing out-of-pocket (OOP) healthcare costs and mitigating CHE. Conducted in Pokhara Metropolitan City, the study used an analytical cross-sectional design, sampling 1276 households through a two-stage random sampling method. Data were collected via face-to-face interviews between May and October 2023. The analysis was conducted using SPSS version 29, incorporating propensity score matching (PSM) to minimize biases and create comparable groups of households enrolled and not enrolled in the NHIP. PSM helped reduce confounding effects by matching households with similar baseline characteristics. Additionally, a doubly robust methodology was employed, combining propensity score adjustment with regression modeling to enhance the reliability of the results. This comprehensive approach ensured a more accurate estimation of the impact of NHIP enrollment on CHE. Among the 1276 sampled households, 534 (41.8%) were enrolled in the NHIP. Of these, 84.3% had renewed their insurance card, though some cited long waiting times, lack of medications, and complex procedures as barriers to renewal. Approximately 57.3% of households reported known diseases before enrollment, with 49.8% attending routine health check-ups in the past year. The primary motivation for enrollment was encouragement from insurance employees (50.2%). The data indicate that 12.5% of enrolled households experienced CHE versus 7.5% among non-enrolled households. Enrollment in the NHIP thus did not contribute to lower CHE; enrolled households in fact had higher adjusted odds of CHE (AOR: 1.98, 95% CI: 1.21-3.24).
Key factors associated with increased CHE risk were presence of non-communicable diseases (NCDs) (AOR: 3.94, 95% CI: 2.10-7.39), acute illnesses/injuries (AOR: 6.70, 95% CI: 3.97-11.30), larger household size (AOR: 3.09, 95% CI: 1.81-5.28), and households below the poverty line (AOR: 5.82, 95% CI: 3.05-11.09). Other factors such as gender, education level, caste/ethnicity, presence of elderly members, and under-five children also showed varying associations with CHE, though not all were statistically significant. The study concludes that enrollment in the NHIP does not significantly reduce the risk of CHE. The reason for this could be inadequate coverage, where high-cost medicines, treatments, and transportation costs are not fully included in the insurance package, leading to significant out-of-pocket expenses. We also considered the long waiting time, lack of medicines, and complex procedures for the utilization of NHIP benefits, which might result in the underuse of covered services. Finally, gaps in enrollment and retention might leave certain households vulnerable to CHE despite the existence of NHIP. Key factors contributing to increased CHE include NCDs, acute illnesses, larger household sizes, and poverty. To improve the program’s effectiveness, it is recommended that NHIP benefits and coverage be expanded to better protect against high healthcare costs. Additionally, simplifying the renewal process, addressing long waiting times, and enhancing the availability of services could improve member satisfaction and retention. Targeted financial protection measures should be implemented for high-risk groups, and efforts should be made to increase awareness and encourage routine health check-ups to prevent severe health issues that contribute to CHE.
Keywords: catastrophic health expenditure, effectiveness, national health insurance program, Nepal
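The doubly robust approach the abstract describes can be illustrated in outline. This is a minimal Python sketch (not the authors' SPSS workflow) of the augmented inverse-probability-weighting (AIPW) form of doubly robust estimation; the variable names and the toy data are hypothetical.

```python
import numpy as np

def aipw_effect(y, t, ps, mu1, mu0):
    """Augmented inverse-probability-weighted (doubly robust) estimate
    of the average treatment effect. ps: propensity scores;
    mu1/mu0: outcome-model predictions under treatment/control."""
    return np.mean(
        mu1 - mu0
        + t * (y - mu1) / ps
        - (1 - t) * (y - mu0) / (1 - ps)
    )

# Toy illustration: the outcome rises by exactly 2.0 under enrollment,
# so a correctly specified estimator should recover 2.0.
x = np.array([0.1, 0.4, 0.5, 0.9])     # a baseline covariate
t = np.array([1, 0, 1, 0])             # enrollment indicator
y = x + 2.0 * t                        # outcome; true effect = 2.0
mu1, mu0 = x + 2.0, x                  # correctly specified outcome model
ps = np.full(4, 0.5)                   # known propensity scores

print(round(aipw_effect(y, t, ps, mu1, mu0), 3))  # → 2.0
```

The estimator is "doubly robust" in the sense that it stays consistent if either the propensity model or the outcome model is correctly specified, which is why it is paired with PSM here.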
Procedia PDF Downloads 24
3945 Development of an Interactive and Robust Image Analysis and Diagnostic Tool in R for Early Detection of Cervical Cancer
Authors: Kumar Dron Shrivastav, Ankan Mukherjee Das, Arti Taneja, Harpreet Singh, Priya Ranjan, Rajiv Janardhanan
Abstract:
Cervical cancer is one of the most common cancers among women worldwide, and it can be cured if detected early. Manual pathology, which is typically utilized at present, has many limitations. The current gold standard for cervical cancer diagnosis is exhaustive and time-consuming because it relies heavily on the subjective knowledge of oncopathologists, which leads to misdiagnosis and missed diagnosis, resulting in false negatives and false positives. To reduce the time and complexity associated with early diagnosis, we require an interactive diagnostic tool for early detection, particularly in developing countries where cervical cancer incidence and related mortality are high. Incorporation of digital pathology in place of manual pathology for cervical cancer screening and diagnosis can increase precision and strongly reduce the chance of error in a time-specific manner. Thus, we propose a robust and interactive cervical cancer image analysis and diagnostic tool, which can categorically process both histopathological and cytopathological images to identify abnormal cells in the least amount of time and in settings with minimum resources. Furthermore, the incorporation of a set of specific parameters that are typically referred to for the identification of abnormal cells, with the help of the open-source software R, is one of the major highlights of the tool. The software has the ability to automatically identify and quantify morphological features, color intensity, sensitivity, and other parameters digitally to differentiate abnormal from normal cells, which may improve and accelerate screening and early diagnosis, ultimately leading to timely treatment of cervical cancer.
Keywords: cervical cancer, early detection, digital pathology, screening
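The quantification step the abstract alludes to (flagging and counting candidate abnormal regions by intensity) can be sketched in outline. The authors' tool is written in R and its actual parameters are not given here; this Python/numpy toy only illustrates the general idea of thresholding an intensity image and counting connected regions, with a hypothetical threshold and image.

```python
import numpy as np

def count_regions(img, thresh):
    """Count 4-connected regions of pixels darker than `thresh`
    (a crude stand-in for flagging densely stained nuclei)."""
    mask = img < thresh
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                count += 1
                stack = [(i, j)]           # flood-fill one region
                while stack:
                    a, b = stack.pop()
                    if 0 <= a < h and 0 <= b < w and mask[a, b] and not seen[a, b]:
                        seen[a, b] = True
                        stack += [(a+1, b), (a-1, b), (a, b+1), (a, b-1)]
    return count

# Toy 5x5 "image": two dark blobs on a bright background.
img = np.full((5, 5), 200)
img[0:2, 0:2] = 40
img[3:5, 3:5] = 30
print(count_regions(img, 100))  # → 2
```

A real pipeline would of course add color deconvolution, morphology filters, and per-region feature measurement, but the count-by-threshold core is the same.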
Procedia PDF Downloads 178
3944 Mapping of Urban Micro-Climate in Lyon (France) by Integrating Complementary Predictors at Different Scales into Multiple Linear Regression Models
Authors: Lucille Alonso, Florent Renard
Abstract:
The characterization of urban heat islands (UHI) and their interactions with climate change and urban climates is a major research and public health issue, given the increasing urbanization of the population. Addressing it requires better knowledge of the UHI and micro-climate in urban areas, combining measurements and modelling. This study contributes to this topic by evaluating microclimatic conditions in dense urban areas of the Lyon Metropolitan Area (France) using a combination of traditionally used data, such as topography, together with LiDAR (Light Detection And Ranging) data, Landsat 8 and Sentinel satellite observations, and ground measurements by bike. These bicycle-based weather data collections are used to build the database of the variable to be modelled, air temperature, over Lyon’s hyper-center. This study aims to model the air temperature, measured during 6 mobile campaigns in Lyon in clear weather, using multiple linear regressions based on 33 explanatory variables. They are of various categories, such as meteorological parameters from remote sensing, topographic variables, vegetation indices, the presence of water, humidity, bare soil, buildings, radiation, urban morphology, or proximity and density to various land uses (water surfaces, vegetation, bare soil, etc.). The acquisition sources are multiple: the Landsat 8 and Sentinel satellites, LiDAR points, and cartographic products downloaded from an open data platform in Greater Lyon. Regarding the presence of low, medium, and high vegetation, buildings, and ground, several buffer sizes around the measurement points were tested (5, 10, 20, 25, 50, 100, 200 and 500 m). The buffers best correlated with air temperature were 5 m for ground and for low and medium vegetation, 50 m for buildings, and 100 m for high vegetation.
The explanatory model of the dependent variable is obtained by multiple linear regression on the retained explanatory variables (pairwise Pearson |r| < 0.7 and VIF < 5), integrating a stepwise selection algorithm. Moreover, holdout cross-validation (80% training, 20% testing) is performed because of its ability to detect over-fitting of multiple regression, although multiple regression provides internal validation and randomization. Multiple linear regression explained, on average, 72% of the variance for the study days, with an average RMSE of only 0.20°C. Surface temperature was the most important variable in the estimation of air temperature. Other recurrent variables include distance to subway stations, distance to water areas, NDVI, digital elevation model, sky view factor, average vegetation density, and building density. Changing urban morphology influences the city's thermal patterns. The thermal atmosphere in dense urban areas can only be analysed at the microscale, so as to consider the local impact of trees, streets, and buildings. There is currently no network of fixed weather stations sufficiently deployed in central Lyon or in most major urban areas. Therefore, it is necessary to use mobile measurements, followed by modelling, to characterize the city's multiple thermal environments.
Keywords: air temperature, LIDAR, multiple linear regression, surface temperature, urban heat island
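The pipeline described (collinearity filter at |r| < 0.7, OLS fit, 80/20 holdout) can be sketched with numpy on synthetic data. The predictor names and data here are hypothetical stand-ins for the 33 remote-sensing variables; the point is only the mechanics of the filter and the holdout RMSE check.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
surface_t = rng.uniform(20, 40, n)               # e.g. surface temperature
ndvi = rng.uniform(0, 1, n)                      # e.g. vegetation index
dup = surface_t * 0.99 + rng.normal(0, 0.1, n)   # near-duplicate predictor
X = np.column_stack([surface_t, ndvi, dup])
y = 0.5 * surface_t - 2.0 * ndvi + 10 + rng.normal(0, 0.2, n)

# Drop one of each highly correlated pair (keep only |r| < 0.7).
r = np.corrcoef(X, rowvar=False)
keep = []
for j in range(X.shape[1]):
    if all(abs(r[j, k]) < 0.7 for k in keep):
        keep.append(j)
X = X[:, keep]

# 80/20 holdout split; OLS fit on the training part only.
cut = int(0.8 * n)
Xtr = np.column_stack([np.ones(cut), X[:cut]])
Xte = np.column_stack([np.ones(n - cut), X[cut:]])
beta, *_ = np.linalg.lstsq(Xtr, y[:cut], rcond=None)
rmse = np.sqrt(np.mean((Xte @ beta - y[cut:]) ** 2))
print(keep, round(float(rmse), 2))  # the redundant predictor is dropped
```

The holdout RMSE lands near the noise standard deviation (0.2 here), which is the over-fitting check the authors rely on.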
Procedia PDF Downloads 137
3943 Estimation of Missing Values in Aggregate Level Spatial Data
Authors: Amitha Puranik, V. S. Binu, Seena Biju
Abstract:
Missing data is a common problem in spatial analysis, especially at the aggregate level. Missingness can occur in the covariates, in the response variable, or in both at a given location. Many missing data techniques are available to estimate missing values, but not all of these methods can be applied to spatial data, since the data are autocorrelated. Hence there is a need to develop a method that estimates the missing values in both the response variable and the covariates in spatial data by taking account of the spatial autocorrelation. The present study aims to develop a model to estimate missing data points at the aggregate level in spatial data by accounting for (a) spatial autocorrelation of the response variable, (b) spatial autocorrelation of the covariates, and (c) correlation between the covariates and the response variable. Estimating the missing values of spatial data requires a model that explicitly accounts for the spatial autocorrelation. The proposed model not only accounts for spatial autocorrelation but also utilizes the correlation that exists among covariates and between the covariates and the response variable. Precise estimation of the missing data points in spatial data will result in increased precision of the estimated effects of the independent variables on the response variable in spatial regression analysis.
Keywords: spatial regression, missing data estimation, spatial autocorrelation, simulation analysis
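The simplest way to see why spatial autocorrelation matters for imputation is a spatially aware baseline. The sketch below is not the authors' proposed model (which additionally exploits covariate correlations); it is only inverse-distance-weighted imputation from neighbouring areal units, with hypothetical coordinates and values, illustrating that nearby units inform a missing value.

```python
import numpy as np

def idw_impute(coords, values, power=2):
    """Fill NaN entries with an inverse-distance-weighted average of the
    observed units -- a naive baseline that respects the tendency of
    nearby areal units to be more alike (spatial autocorrelation)."""
    out = values.copy()
    obs = ~np.isnan(values)
    for i in np.flatnonzero(np.isnan(values)):
        d = np.linalg.norm(coords[obs] - coords[i], axis=1)
        w = 1.0 / d ** power
        out[i] = np.sum(w * values[obs]) / np.sum(w)
    return out

# Toy areal units on a line; the value varies smoothly with location,
# so the midpoint's missing value should be imputed as the midpoint value.
coords = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 0.0]])
vals = np.array([0.0, 2.0, np.nan])
print(idw_impute(coords, vals))  # → [0. 2. 1.]
```

A model like the one proposed would replace the fixed distance weights with weights estimated jointly from the spatial structure and the covariate correlations.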
Procedia PDF Downloads 382
3942 Simultaneous Determination of Methotrexate and Aspirin Using Fourier Transform Convolution Emission Data under Non-Parametric Linear Regression Method
Authors: Marwa A. A. Ragab, Hadir M. Maher, Eman I. El-Kimary
Abstract:
Co-administration of methotrexate (MTX) and aspirin (ASP) can cause a pharmacokinetic interaction and a subsequent increase in blood MTX concentrations, which may increase the risk of MTX toxicity. Therefore, it is important to develop a sensitive, selective, accurate, and precise method for their simultaneous determination in urine. A new hybrid chemometric method has been applied to the emission response data of the two drugs. A spectrofluorimetric method for the determination of MTX through measurement of its acid-degradation product, 4-amino-4-deoxy-10-methylpteroic acid (4-AMP), was developed. Moreover, the acid-catalyzed degradation reaction enables the spectrofluorimetric determination of ASP through the formation of its active metabolite, salicylic acid (SA). The proposed chemometric method deals with convolution of the emission data using 8-point sin xi polynomials (discrete Fourier functions) after derivative treatment of these emission data. The first and second derivative curves (D1 & D2) were obtained first, and then convolution of these curves was performed to obtain the first and second derivative under Fourier function curves (D1/FF and D2/FF). This new application was used for the resolution of the overlapped emission bands of the degradation products of both drugs, to allow their simultaneous indirect determination in human urine. Not only was this chemometric approach applied to the emission data, but the obtained data were also subjected to non-parametric linear regression analysis (Theil’s method). The proposed method was fully validated according to the ICH guidelines, and it yielded linearity ranges of 0.05-0.75 and 0.5-2.5 µg mL-1 for MTX and ASP, respectively. It was found that the non-parametric method was superior to the parametric one in the simultaneous determination of MTX and ASP after the chemometric treatment of the emission spectra of their degradation products.
The work combines the advantages of derivative and convolution using discrete Fourier functions with the reliability and efficacy of non-parametric analysis of the data. The achieved sensitivity, along with the low values of LOD (0.01 and 0.06 µg mL-1) and LOQ (0.04 and 0.2 µg mL-1) for MTX and ASP respectively, by the second derivative under Fourier functions (D2/FF), was promising and supports its application for monitoring the two drugs in patients’ urine samples.
Keywords: chemometrics, emission curves, derivative, convolution, Fourier transform, human urine, non-parametric regression, Theil’s method
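The non-parametric (Theil) regression used for calibration has a compact definition: the slope is the median of all pairwise slopes and the intercept is the median residual, which makes the fitted line robust to a single aberrant reading. A minimal sketch with hypothetical calibration data:

```python
import numpy as np
from itertools import combinations

def theil_sen(x, y):
    """Theil's non-parametric line: slope = median of all pairwise
    slopes, intercept = median of y - slope*x."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)]
    slope = float(np.median(slopes))
    intercept = float(np.median(np.asarray(y) - slope * np.asarray(x)))
    return slope, intercept

# Hypothetical calibration: signal vs concentration with one gross error.
conc = np.array([0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65, 0.75])
sig = 100 * conc + 5.0
sig[3] += 40                 # a single outlier barely moves the fit
slope, intercept = theil_sen(conc, sig)
print(round(slope, 1), round(intercept, 1))  # → 100.0 5.0
```

An ordinary least-squares fit on the same data would be pulled noticeably by the outlier; the median-based fit recovers the underlying line, which is the practical argument for Theil's method in this abstract.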
Procedia PDF Downloads 430
3941 Novel Framework for MIMO-Enhanced Robust Selection of Critical Control Factors in Auto Plastic Injection Moulding Quality Optimization
Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari
Abstract:
Apparent quality defects such as warpage, shrinkage, weld lines, etc. are an unavoidable phenomenon in the mass production of auto plastic appearance parts. These frequently occurring manufacturing defects must be addressed concurrently so as to achieve a final product with acceptable quality standards. Determining the significant control factors that simultaneously affect multiple quality characteristics can significantly improve the optimization results by eliminating the deviating effect of so-called ineffective outliers. Hence, a robust quantitative approach needs to be developed upon which the major control factors and their levels can be effectively determined, to help improve the reliability of the optimal processing parameter design. The primary objective of the current study was therefore to develop a systematic methodology for the selection of significant control factors (SCF) relevant to multiple quality optimization of an auto plastic appearance part (APAP). An auto bumper was used as a specimen with quality and production characteristics most representative of the APAP group. A preliminary failure modes and effects analysis (FMEA) was conducted to nominate a database of candidate significant control factors prior to the optimization phase. Later, CAE simulation (Moldflow) analysis was implemented to model four common plastic injection quality defects of the APAP group: warpage deflection, volumetric shrinkage, sink marks, and weld lines. Furthermore, a step-backward elimination searching method (SESME) was developed for systematic pre-optimization selection of SCF based on a hierarchical orthogonal array design and priority-based one-way analysis of variance (ANOVA). The development of the robust parameter design in the second phase was based on the DOE module of Minitab v.16 statistical software.
Based on the F-test (F 0.05, 2, 14) one-way ANOVA results, it was concluded that for warpage deflection, material mixture percentage was the most significant control factor, yielding a 58.34% contribution, while for the other three quality defects, melt temperature was the most significant control factor, with 25.32%, 84.25%, and 34.57% contributions for sink mark, shrinkage, and weld line strength control, respectively. The results on the least significant control factors revealed injection fill time as the least significant factor for both warpage and sink mark, with respective contributions of 1.69% and 6.12%. On the other hand, for shrinkage and weld line defects, the least significant control factors were holding pressure and mold temperature, with overall contributions of 0.23% and 4.05%, respectively.
Keywords: plastic injection moulding, quality optimization, FMEA, ANOVA, SESME, APAP
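The percent-contribution figures quoted above are conventionally the factor's between-level sum of squares divided by the total sum of squares from the one-way ANOVA. A minimal sketch of that decomposition, with hypothetical warpage readings at three levels of a single control factor:

```python
import numpy as np

def percent_contribution(groups):
    """One-way ANOVA decomposition: between-group sum of squares as a
    percentage of the total sum of squares (percent contribution)."""
    allv = np.concatenate(groups)
    grand = allv.mean()
    ss_total = ((allv - grand) ** 2).sum()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    return 100.0 * ss_between / ss_total

# Hypothetical warpage (mm) at three levels of one control factor.
levels = [np.array([0.0, 2.0]), np.array([4.0, 6.0]), np.array([8.0, 10.0])]
print(round(percent_contribution(levels), 1))  # → 91.4
```

A factor whose level means barely differ would score close to 0%, which is how the "least significant" factors above (0.23%, 1.69%) are identified.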
Procedia PDF Downloads 348
3940 Losing Benefits from Social Network Sites Usage: An Approach to Estimate the Relationship between Social Network Sites Usage and Social Capital
Authors: Maoxin Ye
Abstract:
This study examines the relationship between social network site (SNS) usage and social capital. Because SNS usage can expand users' networks, and the people connected in these networks may become resources to SNS users and give them advantages in some situations, it is important to estimate the relationship between SNS usage and 'who' is connected, or what resources SNS users can access. Additionally, 'who' can be divided into two aspects: people who hold high positions and people who are different from the user. Hence, it is important to estimate the relationship between SNS usage and connections to both high-position and heterogeneous people. This study adopts Lin's definition of social capital and the position generator measurement, which tells us who is connected and can be divided into the same two aspects. A national dataset from the United States (N = 2,255) collected by the Pew Research Center is utilized for a general regression analysis of SNS usage and social capital. The results indicate that SNS usage is negatively associated with each factor of social capital, suggesting that, compared with non-users, although SNS users have more connections, the variety and resources of those connections are in fact fewer. For this reason, we could lose benefits through SNS usage.
Keywords: social network sites, social capital, position generator, general regression
Procedia PDF Downloads 262
3939 The Relation between Spiritual Intelligence and Organizational Health and Job Satisfaction among the Female Staff in Islamic Azad University of Marvdasht
Authors: Reza Zarei
Abstract:
The aim of the present study is to determine the relation between spiritual intelligence, organizational health, and job satisfaction among the female staff of the Islamic Azad University of Marvdasht. The population of the study includes the female staff and faculty of the Islamic Azad University of Marvdasht. The method is correlational, and the instruments are three questionnaires: the spiritual intelligence scale (ISIS) of Amram and Dryer, Feldman's organizational health questionnaire, and a job satisfaction questionnaire. In order to test the hypotheses, we used inferential statistics: Pearson correlation coefficients and regression analysis. The findings show that there is a significant relation between spiritual intelligence and organizational health among the female staff of this unit. In addition, organizational health has a significant relation with the elements of self-consciousness and social skills; job satisfaction, in turn, is significantly related to the elements of self-consciousness, self-control, self-motivation, sympathy, and social skills in the whole sample, regardless of the participants' gender. Finally, the results of multiple regression and variance analysis showed that the spiritual intelligence variables of the female staff could predict organizational health and job satisfaction.
Keywords: job satisfaction, spiritual intelligence, organizational health, Islamic Azad University
Procedia PDF Downloads 376
3938 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs
Authors: Junaid Mahmood Alam
Abstract:
Revalidating and harmonizing clinical chemistry analytical principles, and optimizing methods through quality control programs and assessments, is the preeminent means of attaining optimal outcomes within clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic, cardiac, and thyroid function tests, by optimization of precision analyses and processing through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I), and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunology analyzers, namely the Hitachi 912, Cobas 6000 e601, Cobas c501, and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electro-chemiluminescence immunoassay (ECLi) techniques. The validation and revalidation process was completed by plotting the precision-analyzed PreciControl data from the various instruments against each other and evaluating the regression coefficients (R2). Results showed that revalidation and optimization of the respective parameters, accredited through CAP, CLSI, and NEQAPP assessments, depicted 99.0% to 99.8% optimization, in addition to the methodology and instruments used for the analyses. The regression R2 for total bilirubin was 0.996, whereas those of ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 were 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of the analytical methods and instrumentation, thus revalidating the optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals.
This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care and will guarantee maximum patient confidence.
Keywords: revalidation, standardized, IFCC, CAP, harmonized
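The instrument-to-instrument regression R2 values quoted above come from plotting paired results from two analyzers against each other and fitting a line. A minimal method-comparison sketch with hypothetical paired ALT readings (the analyte, units, and values here are illustrative, not the study's data):

```python
import numpy as np

def regression_r2(x, y):
    """Least-squares slope/intercept and R^2 between paired results
    from two analyzers (a simple method-comparison check)."""
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical ALT (U/L) results for 6 samples on two instruments.
analyzer_a = np.array([12.0, 25.0, 40.0, 63.0, 88.0, 120.0])
analyzer_b = 1.02 * analyzer_a + 0.5      # near-identical calibration
slope, intercept, r2 = regression_r2(analyzer_a, analyzer_b)
print(round(slope, 2), round(r2, 3))  # → 1.02 1.0
```

An R2 near 1 with slope near 1 and intercept near 0, as reported for most analytes above, is what "harmonized" means operationally: the two instruments can be used interchangeably for that test.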
Procedia PDF Downloads 269
3937 An Application of Quantile Regression to Large-Scale Disaster Research
Authors: Katarzyna Wyka, Dana Sylvan, JoAnn Difede
Abstract:
Background and significance: Following disasters, population-based screening programs are routinely established to assess the physical and psychological consequences of exposure. These data sets are highly skewed, as only a small percentage of trauma-exposed individuals develop health issues. Commonly used statistical methodology in post-disaster mental health generally involves population-averaged models. Such models aim to capture the overall response to the disaster and its aftermath; however, they may not be sensitive enough to accommodate population heterogeneity in symptomatology, such as post-traumatic stress or depressive symptoms. Methods: We use an archival longitudinal data set from the Weill-Cornell 9/11 Mental Health Screening Program established following the World Trade Center (WTC) terrorist attacks in New York in 2001. Participants are rescue and recovery workers who participated in the site cleanup and restoration (n=2960). The main outcome is the post-traumatic stress disorder (PTSD) symptom severity score assessed via clinician interviews (CAPS). For a detailed understanding of the response to the disaster and its aftermath, we adapt quantile regression methodology, with particular focus on predictors of extreme distress and of resilience to trauma. Results: The response variable was defined as the quantile of the CAPS score for each individual under two different scenarios, specifying the unconditional quantiles based on: 1) clinically meaningful CAPS cutoff values and 2) the CAPS distribution in the population. We present graphical summaries of the differential effects. For instance, we found that WTC exposures, namely seeing bodies and feeling that one's life was in danger during rescue/recovery work, were associated with very high PTSD symptoms. A similar effect was apparent in individuals with a prior psychiatric history. Differential effects were also present for the age and education level of the individuals.
Conclusion: We evaluate the utility of quantile regression in disaster research in contrast to the commonly used population-averaged models. We focused on assessing the distribution of risk factors for post-traumatic stress symptoms across quantiles. This innovative approach provides a comprehensive understanding of the relationship between the dependent and independent variables and could be used for developing tailored training programs and response plans for different vulnerability groups.
Keywords: disaster workers, post traumatic stress, PTSD, quantile regression
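The first step the Results section describes, assigning each individual the unconditional quantile of their CAPS score, can be sketched directly. The scores and the quartile cut points below are hypothetical; the study's clinically meaningful cutoffs would replace the empirical quartiles in the first scenario.

```python
import numpy as np

def quantile_bin(scores, probs=(0.25, 0.5, 0.75)):
    """Assign each score the unconditional quantile bin it falls in
    (0 = below Q1, ..., 3 = above Q3) -- the response used in place of
    the raw score so that extremes can be modelled separately."""
    cuts = np.quantile(scores, probs)
    return np.searchsorted(cuts, scores, side="right")

caps = np.array([5, 12, 20, 33, 47, 61, 74, 90])  # hypothetical CAPS scores
print(quantile_bin(caps))  # → [0 0 1 1 2 2 3 3]
```

Quantile regression then estimates predictor effects at each of these levels separately, instead of a single mean effect, which is why it can expose risk factors that matter only in the upper tail.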
Procedia PDF Downloads 284
3936 Artificial Intelligence in the Design of High-Strength Recycled Concrete
Authors: Hadi Rouhi Belvirdi, Davoud Beheshtizadeh
Abstract:
The increasing demand for sustainable construction materials has led to a growing interest in high-strength recycled concrete (HSRC). Utilizing recycled materials not only reduces waste but also minimizes the depletion of natural resources. This study explores the application of artificial intelligence (AI) techniques to model and predict the properties of HSRC. In the past two decades, production levels in various industries, and consequently the amount of waste, have increased significantly. Continuing this trend will undoubtedly cause irreparable damage to the environment. For this reason, engineers have been constantly seeking practical solutions for recycling industrial waste in recent years. This research utilized 90-day compressive strength results for high-strength recycled concrete. The recycled concrete was produced by replacing sand with crushed glass and using glass powder instead of cement. Subsequently, a feedforward artificial neural network was employed to model the 90-day compressive strength results. The regression and error values obtained indicate that this network is suitable for modeling the compressive strength data.
Keywords: high-strength recycled concrete, feedforward artificial neural network, regression, construction materials
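A feedforward network of the kind described can be sketched in a few lines of numpy. The architecture (one tanh hidden layer), the glass-replacement input, and the smooth strength response below are all hypothetical placeholders, since the abstract does not specify the authors' network or data; the sketch only shows the train-and-check-error loop.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical input: glass-replacement ratio; target: normalized strength.
x = np.linspace(0, 1, 40).reshape(-1, 1)
y = 0.8 - 0.3 * (x - 0.4) ** 2            # smooth hypothetical response

W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
lr = 0.1

def forward(x):
    h = np.tanh(x @ W1 + b1)              # one hidden layer
    return h, h @ W2 + b2

_, out0 = forward(x)
loss0 = np.mean((out0 - y) ** 2)          # error before training
for _ in range(2000):                     # plain batch gradient descent
    h, out = forward(x)
    g = 2 * (out - y) / len(x)            # dLoss/dOut
    gh = (g @ W2.T) * (1 - h ** 2)        # backprop through tanh
    W2 -= lr * h.T @ g; b2 -= lr * g.sum(0)
    W1 -= lr * x.T @ gh; b1 -= lr * gh.sum(0)

_, out = forward(x)
loss = np.mean((out - y) ** 2)            # error after training
print(round(float(loss0), 3), round(float(loss), 5))
```

The before/after mean squared error is the "error value" a study like this reports alongside the regression of predicted versus measured strength.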
Procedia PDF Downloads 12
3935 Assessing the Impact of Covid-19 Pandemic on Waste Management Workers in Ghana
Authors: Mensah-Akoto Julius, Kenichi Matsui
Abstract:
This paper examines the impact of COVID-19 on waste management workers in Ghana. A questionnaire survey was conducted among 60 waste management workers in the Accra metropolis, the capital region of Ghana, to understand the impact of the COVID-19 pandemic on waste generation, workers' safety in collecting solid waste, and service delivery. To identify correlations between the pandemic and the safety of waste management workers, a regression analysis was used. Regarding waste generation, the results show that the pandemic coincided with the highest annual solid waste generation, 3,390 tons, in 2020. Regarding the safety of workers, the regression analysis shows a significant and inverse association between COVID-19 and waste management services. This means that contaminated waste may infect field workers with COVID-19 through their direct exposure. A rise in new infection cases would have a negative impact on the safety and service delivery of the workers. The results also show that an increase in economic activities negatively impacts waste management workers. The analysis, however, finds no statistical relationship between workers' service delivery and employees' salaries. The study then discusses how municipal waste management authorities can ensure safe and effective waste collection during the pandemic.
Keywords: Covid-19, waste management worker, waste collection, Ghana
Procedia PDF Downloads 203
3934 An Investigation of the Relevant Factors of Unplanned Readmission within 14 Days of Discharge in a Regional Teaching Hospital in South Taiwan
Authors: Xuan Hua Huang, Shu Fen Wu, Yi Ting Huang, Pi Yueh Lee
Abstract:
Background: In Taiwan, the Taiwan Healthcare Indicator Series regards the rate of hospital readmission as an important indicator of healthcare quality. Unplanned readmission not only affects the patient's condition but also increases healthcare utilization and costs. Purpose: The purpose of this study was to explore the factors associated with adult unplanned readmission within 14 days of discharge at a regional teaching hospital in southern Taiwan. Methods: A retrospective review design was used. A total of 495 participants with unplanned readmissions and 878 without readmission within 14 days were recruited from a regional teaching hospital in southern Taiwan. The instruments used included the Charlson Comorbidity Index, demographic characteristics, and disease-related variables. Statistical analyses were performed with SPSS version 22.0: descriptive statistics (means, standard deviations, and percentages) and inferential statistics (t-test, chi-square test, and logistic regression). Results: The rate of unplanned readmission within 14 days was 36%. The majority of readmitted patients were male (268, 54.1%) and aged >65 years (318, 64.2%); the mean age was 68.8±14.65 years (range 23-98 years). The mean comorbidity score was 3.77±2.73. The top three diagnoses at readmission were digestive diseases (32.7%), respiratory diseases (15.2%), and genitourinary diseases (10.5%). There were significant relationships among gender, age, marriage, comorbidity status, and discharge planning services (χ2: 3.816-16.474, p: 0.051-0.000). Logistic regression analysis showed that older age (OR = 1.012, 95% CI: 1.003, 1.021), multi-morbidity (OR = 0.712-4.040, 95% CI: 0.559, 8.522), and consultation with discharge planning services (OR = 1.696, 95% CI: 1.105, 2.061) were associated with a higher risk of readmission.
Conclusions: This study finds that multi-morbidity was an independent risk factor for unplanned readmission within 14 days. It is recommended that medical teams provide integrated care for multi-morbidity to improve patients' self-care ability and reduce the 14-day unplanned readmission rate.
Keywords: unplanned readmission, comorbidities, Charlson comorbidity index, logistic regression
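The odds ratios reported in this and the following abstract come from logistic regression. A minimal single-predictor sketch (Newton-Raphson on the log-likelihood) with a hypothetical 2x2 exposure table; for one binary predictor the fitted odds ratio equals the table's cross-product ratio, which makes the result easy to verify by hand.

```python
import numpy as np

def logistic_or(x, y, iters=25):
    """Fit logit P(y=1) = b0 + b1*x by Newton-Raphson and return the
    odds ratio exp(b1) for the binary predictor x."""
    X = np.column_stack([np.ones_like(x, dtype=float), x])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)                       # IRLS weights
        grad = X.T @ (y - p)
        hess = X.T @ (X * W[:, None])
        beta += np.linalg.solve(hess, grad)
    return np.exp(beta[1])

# Hypothetical cohort: 30/100 readmitted with a comorbidity, 10/100 without.
x = np.array([1] * 100 + [0] * 100)
y = np.array([1] * 30 + [0] * 70 + [1] * 10 + [0] * 90)
print(round(logistic_or(x, y), 2))  # cross-product OR = (30*90)/(70*10) ≈ 3.86
```

Adjusted odds ratios (AORs) such as those in these studies are obtained the same way, with the additional covariates appended as extra columns of X.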
Procedia PDF Downloads 147
3933 Exploring Factors Related to Unplanned Readmission of Elderly Patients in Taiwan
Authors: Hui-Yen Lee, Hsiu-Yun Wei, Guey-Jen Lin, Pi-Yueh Lee Lee
Abstract:
Background: Unplanned hospital readmissions increase healthcare costs and have been considered a marker of poor healthcare performance. The elderly face a higher risk of unplanned readmission due to elderly-specific characteristics such as deteriorating body functions and a relatively high incidence of complications after treatment of acute diseases. Purpose: The aim of this study was to explore the factors related to the unplanned readmission of the elderly within 14 days of discharge at our hospital in southern Taiwan. Methods: We retrospectively reviewed the medical records of patients aged ≥65 years who had been readmitted between January 2018 and December 2018. The Charlson Comorbidity score was calculated using a previously described method. Related factors affecting the rate of unplanned readmission within 14 days of discharge were screened and analyzed using the chi-squared test and logistic regression analysis. Results: This study enrolled 829 subjects aged 65 years or older. There were 318 cases of unplanned readmission within 14 days and 511 cases without unplanned readmission. In 2018, the rate of unplanned 14-day readmission among elderly patients was 38.4%. The majority of readmitted patients were female (166 cases, 52.2%), with an average age of 77.6 ± 7.90 years (range 65-98). The average Charlson Comorbidity score was 4.42±2.76. Using logistic regression analysis, we found that gastric or peptic ulcer (OR=1.917, p<0.002), hemiplegia (OR=2.292, p<0.015), metastatic solid tumor (OR=2.204, p<0.025), and skin ulcer/cellulitis (OR=2.747, p<0.022) carried a significantly higher risk of 14-day readmission, whereas diabetes (OR=0.722, p<0.043) and hypertension (OR=0.696, p<0.044) were associated with lower odds. Conclusion: The results of the present study may assist healthcare teams in understanding the factors that may affect unplanned readmission in the elderly.
We recommend that these teams adopt efficient approaches in their medical practice, provide timely health education for the elderly, and deliver integrative healthcare for chronic diseases in order to reduce unplanned readmissions. Keywords: unplanned readmission, elderly, Charlson comorbidity score, logistic regression analysis
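The analysis above screens risk factors with logistic regression and reports odds ratios. A minimal, self-contained sketch of that calculation on simulated data (the cohort, the predictor, and all coefficients below are invented for illustration, not the study's records):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 800
ulcer = rng.integers(0, 2, n)                 # hypothetical binary predictor
age = rng.normal(77.6, 7.9, n)                # ages echo the cohort's mean/SD
logit = -1.0 + 0.65 * ulcer + 0.02 * (age - 77.6)
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # simulated readmission flag

# Fit logistic regression by Newton-Raphson iterations.
X = np.column_stack([np.ones(n), ulcer, age - 77.6])
beta = np.zeros(3)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))       # predicted probabilities
    grad = X.T @ (y - p)
    hess = (X * (p * (1 - p))[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

odds_ratio = float(np.exp(beta[1]))           # OR for the ulcer predictor
print(f"OR(ulcer) = {odds_ratio:.2f}")
```

An odds ratio above 1 indicates higher odds of readmission; ratios below 1 (as the abstract reports for diabetes and hypertension) indicate lower odds.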
Procedia PDF Downloads 130
3932 The Relations between Spatial Structure and Land Price
Authors: Jung-Hun Cho, Tae-Heon Moon, Jin-Hak Lee
Abstract:
Land price encapsulates the comprehensive characteristics of urban space, representing the social and economic features of the city. Accordingly, land price can be utilized as an indicator that identifies changes in spatial structure and socioeconomic variations caused by urban development. This study explored the changes in land price brought about by a new road construction. Methodologically, it adopted Space Syntax, which can interpret urban spatial structure comprehensively, to identify the relationship between the form of road networks and land price. The regression analysis showed that the 'integration index' of Space Syntax is statistically significant and strongly correlated with land price: the higher the integration value, the higher the land price. Subsequently, using the regression equation, the study predicted the land price changes for each of the lots surrounding newly opened roads. The proposed method has the advantage of predicting changes in land price in a straightforward way. In addition, it will help planners and project managers establish relevant policies and smooth urban regeneration projects by enhancing residents' understanding of the likely changes in their land prices before the execution of urban regeneration and development projects. Keywords: space syntax, urban regeneration, spatial structure, official land price
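The regression step described above can be sketched as a simple one-predictor least-squares fit. The integration values and prices below are invented for illustration; the study's actual model and coefficients are not reproduced here:

```python
import numpy as np

integration = np.array([0.8, 1.1, 1.4, 1.7, 2.0, 2.3, 2.6])   # Space Syntax index
land_price = np.array([210.0, 260.0, 330.0, 360.0, 430.0, 470.0, 540.0])

# Ordinary least squares: price = b0 + b1 * integration.
A = np.column_stack([np.ones_like(integration), integration])
(b0, b1), *_ = np.linalg.lstsq(A, land_price, rcond=None)

# Predicted uplift for a lot whose integration rises after a new road opens.
price_before = b0 + b1 * 1.2
price_after = b0 + b1 * 1.9
print(f"slope = {b1:.1f}, uplift = {price_after - price_before:.1f}")
```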
Procedia PDF Downloads 328
3931 A Novel Method for Face Detection
Authors: H. Abas Nejad, A. R. Teymoori
Abstract:
Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised-learning-based facial expression recognition methods. This is because supervised methods cannot accommodate, within a limited amount of training data, all the appearance variability across faces with respect to race, pose, lighting, facial biases, etc. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in usual applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and thereby bypassing those frames in emotion classification saves computational power. In this work, we propose a lightweight neutral vs. emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of the KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. As a result, the proposed method improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases. Keywords: neutral vs. emotion classification, constrained local model, Procrustes analysis, local binary pattern histogram, statistical model
Procedia PDF Downloads 338
3930 Relationship of Religious Coping with Occupational Stress and the Quality of Working Life of Midwives in Maternity Hospitals in Zahedan
Authors: Fatemeh Roostaee, Zahra Nikmanesh
Abstract:
This study investigated the role of religious coping components in the occupational stress and quality of working life of midwives. The method of study was descriptive-correlational. The sample comprised all midwives in maternity hospitals in Zahedan during 1393 (Iranian calendar). Participants were selected through the census method. The instruments of data collection were three questionnaires: quality of working life, occupational stress, and religious coping. For statistical analysis, Pearson correlation and stepwise regression analysis were used. The results showed a significant negative relationship between the religious activities component (r = -0.454) and occupational stress, and regression analysis showed that the religious activities variable explained 45% of the variance in occupational stress. The Pearson correlation test showed no significant relationship between the religious coping components and quality of working life. Therefore, it is necessary to provide essential training on strengthening compatibility strategies and religious activities to reduce occupational stress. Keywords: quality of working life, occupational stress, religious coping, midwife
Procedia PDF Downloads 580
3929 Predictor Factors in Predictive Model of Soccer Talent Identification among Male Players Aged 14 to 17 Years
Authors: Muhamad Hafiz Ismail, Ahmad H., Nelfianty M. R.
Abstract:
This longitudinal study was conducted to identify predictive factors of soccer talent among male players aged 14 to 17 years. Convenience sampling involved elite respondents (n=20) and sub-elite respondents (n=20), all male soccer players. Descriptive statistics were reported as frequencies and percentages. Inferential statistical analysis was used to report reliability, independent-samples t-tests, paired-samples t-tests, and multiple regression analysis. Generally, there were differences in the means of height, muscular strength, muscular endurance, cardiovascular endurance, task orientation, cognitive anxiety, self-confidence, juggling skills, short pass skills, long pass skills, dribbling skills, and shooting skills between the 20 elite players and the 20 sub-elite players. There was also a significant difference between pre- and post-test for thirteen variables: height, weight, fat percentage, muscle strength, muscle endurance, cardiovascular endurance, flexibility, BMI, task orientation, juggling skills, short pass skills, long pass skills, and dribbling skills. Based on the first predictive factor (physical), second predictive factor (fitness), third predictive factor (psychological), and fourth predictive factor (soccer-playing skills), four multiple regression models were produced. The first predictive factor (physical), anchored by height and fat percentage, contributed 53.5 percent to soccer talent. The second predictive factor (fitness) contributed 63.2 percent, the third predictive factor (psychology) contributed 66.4 percent, and the fourth predictive factor (skills) contributed 59.0 percent of soccer talent. The four multiple regression models could be used as a guide for talent scouting for future soccer players. Keywords: soccer talent identification, fitness and physical test, soccer skills test, psychological test
Procedia PDF Downloads 157
3928 Exploring Syntactic and Semantic Features for Text-Based Authorship Attribution
Authors: Haiyan Wu, Ying Liu, Shaoyun Shi
Abstract:
Authorship attribution extracts features to identify the authors of anonymous documents. Many previous works on authorship attribution focus on statistical style features (e.g., sentence/word length) and content features (e.g., frequent words, n-grams). Modeling these features by regression or other transparent machine learning methods gives a portrait of the authors' writing style. But these methods do not capture syntactic (e.g., dependency relationships) or semantic (e.g., topic) information. In recent years, some researchers have modeled syntactic trees or latent semantic information with neural networks. However, few works take them together. Besides, predictions by neural networks are difficult to explain, which is vital in authorship attribution tasks. In this paper, we not only utilize the statistical style and content features but also take advantage of both syntactic and semantic features. Unlike an end-to-end neural model, feature selection and prediction are two separate steps in our method. An attentive n-gram network is utilized to select useful features, and logistic regression is applied to give predictions and an understandable representation of writing style. Experiments show that our extracted features improve on the state-of-the-art methods on three benchmark datasets. Keywords: authorship attribution, attention mechanism, syntactic feature, feature extraction
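As an illustration of the style-features idea (not the paper's attentive n-gram network or its logistic regression step), here is a much-simplified stand-in: normalized function-word frequencies compared by cosine similarity against per-author profiles. All texts and the vocabulary are invented:

```python
from collections import Counter
import math

def features(text, vocab):
    """Relative frequencies of the vocabulary words in the text."""
    counts = Counter(text.lower().split())
    total = sum(counts.values()) or 1
    return [counts[w] / total for w in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(b * b for b in v)) or 1.0
    return dot / (nu * nv)

vocab = ["the", "of", "and", "whilst", "upon", "very"]
author_a = "the ship sailed upon the sea whilst the crew waited upon deck"
author_b = "it was very cold and very dark and the wind was very strong"
unknown = "she gazed upon the water whilst the storm gathered upon the horizon"

profile_a = features(author_a, vocab)
profile_b = features(author_b, vocab)
query = features(unknown, vocab)
prediction = "A" if cosine(query, profile_a) >= cosine(query, profile_b) else "B"
print(prediction)
```

In practice such frequency features would feed a trained classifier; the centroid comparison here only illustrates why stylistic word frequencies separate authors.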
Procedia PDF Downloads 136
3927 The Influences of Accountants’ Potential Performance on Their Working Process: Government Savings Bank, Northeast, Thailand
Authors: Prateep Wajeetongratana
Abstract:
The purpose of this research was to study the influence of accountants’ potential performance on their working process, in a case study of Government Savings Banks in the northeast of Thailand. The independent variables were accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude, while the dependent variable was the success of the working process. A total of 155 accountants working for Government Savings Banks were selected by random sampling. A questionnaire was used as the tool for collecting data. The statistical analyses included percentages, means, and multiple regression analysis. The findings revealed that the majority of accountants were female, aged between 35 and 40 years old. Most of the respondents had an undergraduate degree and ten years of experience. Moreover, the factors of accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude were all rated at a high level. The regression analysis of the observation data revealed that these variables could explain at least 51 percent of the variance in the success of the accountants’ working process. Keywords: influence, potential performance, success, working process
Procedia PDF Downloads 227
3926 Application of Multilinear Regression Analysis for Prediction of Synthetic Shear Wave Velocity Logs in Upper Assam Basin
Authors: Triveni Gogoi, Rima Chatterjee
Abstract:
Shear wave velocity (Vs) estimation is an important approach in the seismic exploration and characterization of a hydrocarbon reservoir. There are various methods for predicting S-wave velocity when a recorded S-wave log is not available, but the available methods for Vs prediction are empirical mathematical models. Shear wave velocity can be estimated from P-wave velocity by applying Castagna’s equation, which is the most common approach; the constants used in Castagna’s equation vary for different lithologies and geological settings. In this study, multiple regression analysis has been used for the estimation of S-wave velocity. The EMERGE module of the Hampson-Russell software has been used for generation of the S-wave log. Both single-attribute and multi-attribute analyses have been carried out for generation of a synthetic S-wave log in the Upper Assam basin. The Upper Assam basin, situated in North Eastern India, is one of the most important petroleum provinces of India. The present study was carried out using four wells of the study area, of which three had S-wave velocity available. The main objective of the present study is the prediction of shear wave velocities for wells where S-wave velocity information is not available. The three wells with S-wave velocity were first used to test the reliability of the method, and the generated S-wave log was compared with the actual S-wave log. Single-attribute analysis was carried out for these three wells within the depth range 1700-2100 m, which corresponds to the Barail Group of Oligocene age. The Barail Group is the main target zone in this study and the primary producing reservoir of the basin. A system-generated list of attributes with varying degrees of correlation was produced, and the attribute with the highest correlation was selected for the single-attribute analysis. Crossplots between the attributes show the variation of points from the line of best fit.
The final result of the analysis was compared with the available S-wave log, showing a good visual fit with a correlation of 72%. Next, multi-attribute analysis was carried out on the same data using all the wells within the same analysis window. A high correlation of 85% was observed between the output log from the analysis and the recorded S-wave. The close fit between the synthetic and recorded S-wave logs validates the reliability of the method. For further authentication, the generated S-wave data from the wells were tied to the seismic data and correlated. A synthetic shear wave log was generated for well M2, where S-wave data are not available, and it shows a good correlation with the seismic data. Neutron porosity, density, AI, and P-wave velocity proved to be the most significant variables in this statistical method for S-wave generation. The multilinear regression method can thus be considered a reliable technique for generation of a shear wave velocity log in this study. Keywords: Castagna's equation, multilinear regression, multi-attribute analysis, shear wave logs
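A hedged sketch of the multilinear-regression idea on synthetic logs: Vs is regressed on the kinds of predictors the study found significant (P-wave velocity, density, neutron porosity). The well data and coefficients below are simulated, and Castagna's mudrock line is shown only for comparison:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 120
vp = rng.uniform(2500.0, 4500.0, n)       # P-wave velocity, m/s
rho = rng.uniform(2.2, 2.65, n)           # bulk density, g/cc
phi = rng.uniform(0.05, 0.30, n)          # neutron porosity, fraction
# Synthetic "true" relation plus noise (invented coefficients).
vs = 0.55 * vp + 400.0 * rho - 900.0 * phi - 300.0 + rng.normal(0.0, 40.0, n)

# Multilinear regression: Vs on Vp, density, and porosity.
X = np.column_stack([np.ones(n), vp, rho, phi])
coef, *_ = np.linalg.lstsq(X, vs, rcond=None)
vs_hat = X @ coef
corr = float(np.corrcoef(vs, vs_hat)[0, 1])

# Castagna's mudrock line (Vp = 1.16 Vs + 1360 m/s), for comparison only.
vs_castagna = (vp - 1360.0) / 1.16
print(f"regression correlation = {corr:.2f}")
```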
Procedia PDF Downloads 229
3925 Survival Analysis Based Delivery Time Estimates for Display FAB
Authors: Paul Han, Jun-Geol Baek
Abstract:
In the flat panel display industry, the scheduling and dispatching system for meeting production target quantities and production deadlines is the major production management system, controlling each facility's production orders and the distribution of WIP (Work in Process). In the dispatching system, delivery time, the time at which a lot can be supplied to a facility, is a key factor. In this paper, we use survival analysis methods to identify the main factors in, and build a forecasting model of, delivery time. Among survival analysis techniques, the Cox proportional hazards model is used to select important explanatory variables. To build the prediction model, the Accelerated Failure Time (AFT) model was used. Performance comparisons were conducted with two other models: a technical statistics model based on transfer history, and a linear regression model using the same explanatory variables as the AFT model. Under the Mean Square Error (MSE) criterion, the AFT model's error was 33.8% lower than that of the existing prediction model and 5.3% lower than that of the linear regression model. This survival analysis approach is applicable to implementing a delivery time estimator in display manufacturing, and it can contribute to improving the productivity and reliability of the production management system. Keywords: delivery time, survival analysis, Cox PH model, accelerated failure time model
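A minimal sketch of the AFT idea, under the simplifying assumption of no censoring (in which case a log-linear AFT model reduces to least squares on log delivery time). The lot attributes and coefficients are invented, not the FAB's data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300
queue_len = rng.uniform(0.0, 50.0, n)      # lots waiting ahead (invented)
priority = rng.integers(0, 2, n)           # hypothetical hot-lot flag
log_t = 1.0 + 0.03 * queue_len - 0.4 * priority + rng.normal(0.0, 0.1, n)
delivery_time = np.exp(log_t)              # hours

# Log-linear AFT fit: regress log(T) on the covariates.
X = np.column_stack([np.ones(n), queue_len, priority])
beta, *_ = np.linalg.lstsq(X, np.log(delivery_time), rcond=None)

# Acceleration factor: exp(coefficient) scales the expected delivery time.
accel_priority = float(np.exp(beta[2]))
print(f"priority lots finish in ~{accel_priority:.2f}x the baseline time")
```

A full AFT analysis would also handle censored lots and a chosen error distribution (e.g., Weibull or log-normal); this sketch shows only the covariate-scaling interpretation.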
Procedia PDF Downloads 543
3924 Predictors of Glycaemic Variability and Its Association with Mortality in Critically Ill Patients with or without Diabetes
Authors: Haoming Ma, Guo Yu, Peiru Zhou
Abstract:
Background: Previous studies show that dysglycemia, chiefly hyperglycemia, hypoglycemia, and glycemic variability (GV), is associated with excess mortality in critically ill patients, especially those without diabetes. Glycemic variability is an increasingly important measure of glucose control in the intensive care unit (ICU) because of this association. However, there are limited data on the relationship between different clinical factors, glycemic variability, and clinical outcomes categorized by diabetes (DM) status. This retrospective study of 958 ICU patients was conducted to investigate the relationship between GV and outcome in critically ill patients, and further to determine the significant factors that contribute to glycemic variability. Aim: We hypothesize that the factors contributing to mortality and to glycemic variability differ between critically ill patients with and without diabetes. The primary aim of this study was to determine which form of dysglycemia (hyperglycemia, hypoglycemia, or glycemic variability) is independently associated with an increase in mortality among critically ill patients in each group (DM/non-DM). The secondary objective was to investigate the factors affecting glycemic variability in the two groups. Method: A total of 958 diabetic and non-diabetic patients with severe diseases in the ICU were selected for this retrospective analysis. Glycemic variability was defined as the coefficient of variation (CV) of blood glucose. The main outcome was death during hospitalization; the secondary outcome was GV. A logistic regression model was used to identify factors associated with mortality, and the relationships between GV and other variables were investigated using linear regression analysis. Results: Information on age, APACHE II score, GV, gender, in-ICU treatment, and nutrition was available for 958 subjects.
Predictors remaining in the final logistic regression model for mortality differed significantly between the DM and non-DM groups. Glycemic variability was associated with an increase in mortality in both the DM (odds ratio 1.05; 95% CI: 1.03-1.08, p<0.001) and non-DM groups (odds ratio 1.07; 95% CI: 1.03-1.11, p=0.002). For critically ill patients without diabetes, factors associated with glycemic variability included APACHE II score (regression coefficient, 95% CI: 0.29, 0.22-0.36, p<0.001), mean BG (0.73, 0.46-1.01, p<0.001), total parenteral nutrition (2.87, 1.57-4.17, p<0.001), serum albumin (-0.18, -0.271 to -0.082, p<0.001), insulin treatment (2.18, 0.81-3.55, p=0.002), and duration of ventilation (0.006, 0.002-1.010, p=0.003). For diabetic patients, APACHE II score (0.203, 0.096-0.310, p<0.001), mean BG (0.503, 0.138-0.869, p=0.007), and duration of diabetes (0.167, 0.033-0.301, p=0.015) remained independent risk factors for GV. Conclusion: We found that the relation between dysglycemia and mortality differs between the diabetes and non-diabetes groups, and we confirm that GV was associated with excess mortality in both DM and non-DM patients. Furthermore, APACHE II score, mean BG, total parenteral nutrition, serum albumin, insulin treatment, and duration of ventilation were significantly associated with an increase in GV in non-DM patients, while APACHE II score, mean BG, and duration of diabetes (years) remained independent risk factors for increased GV in DM patients. These findings provide important context for further prospective trials investigating the effect of different clinical factors in critically ill patients with or without diabetes. Keywords: diabetes, glycemic variability, predictors, severe disease
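The study's variability metric, the coefficient of variation of blood glucose (CV = SD / mean), is straightforward to compute; a small sketch with invented glucose series:

```python
import numpy as np

# Two hypothetical patients' glucose readings (mmol/L), invented for illustration.
glucose = {
    "stable_patient":   np.array([6.1, 6.4, 5.9, 6.2, 6.0, 6.3]),
    "variable_patient": np.array([4.2, 11.8, 5.1, 13.4, 3.9, 9.6]),
}

def cv_percent(series):
    """Coefficient of variation as a percentage (sample SD over mean)."""
    return 100.0 * series.std(ddof=1) / series.mean()

cv = {name: cv_percent(s) for name, s in glucose.items()}
print(cv)
```

A per-patient CV computed this way would then enter the mortality model as a covariate, as in the study's logistic regression.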
Procedia PDF Downloads 189
3923 Comparing Skill, Employment, and Productivity of Industrial City Case Study: Bekasi Industrial Area and Special Economic Zone Sei Mangkei
Authors: Auliya Adzillatin Uzhma, M. Adrian Rizky, Puri Diah Santyarini
Abstract:
Bekasi Industrial Area in Kab. Bekasi and SEZ (Special Economic Zone) Sei Mangkei in Kab. Simalungun are two areas that share the same main economic activity, manufacturing. Manufacturing industry in Bekasi Industrial Area contributes more than 70% of Kab. Bekasi’s GDP, while manufacturing industry in SEZ Sei Mangkei contributes less than 20% of Kab. Simalungun’s GDP. The dependent variable in this research is labor productivity, while the independent variables are the size of the workforce, the level of labor education, the length of work, and salary. This research used the linear regression method to model the actual productivity conditions in the two industrial areas, with the two areas made comparable through a dummy variable on the labor education level variable. The initial hypothesis (Ho) was that labor productivity in Bekasi Industrial Area would be higher than labor productivity in SEZ Sei Mangkei. The variables supporting the accepted hypothesis were the larger workforce, higher education levels, longer work experience, and higher salaries in Bekasi Industrial Area. Keywords: labor, industrial city, linear regression, productivity
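A hypothetical sketch of the dummy-variable regression described above: productivity regressed on worker attributes plus a 0/1 area indicator. All data and coefficients are made up:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
bekasi = rng.integers(0, 2, n)            # 1 = Bekasi Industrial Area, 0 = Sei Mangkei
education = rng.integers(9, 17, n)        # years of schooling (invented)
tenure = rng.uniform(0.0, 20.0, n)        # years of work (invented)
productivity = (10.0 + 4.0 * bekasi + 1.5 * education
                + 0.8 * tenure + rng.normal(0.0, 2.0, n))

# OLS with the area dummy: its coefficient is the between-area premium
# after controlling for education and tenure.
X = np.column_stack([np.ones(n), bekasi, education, tenure])
coef, *_ = np.linalg.lstsq(X, productivity, rcond=None)
print(f"estimated Bekasi premium = {coef[1]:.2f}")
```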
Procedia PDF Downloads 179
3922 Artificial Neural Network and Statistical Method
Authors: Tomas Berhanu Bekele
Abstract:
Traffic congestion is one of the main transportation problems in developed as well as developing countries. Traffic control systems are based on the idea of avoiding traffic instabilities and homogenizing traffic flow in such a way that the risk of accidents is minimized and traffic flow is maximized. Lately, Intelligent Transport Systems (ITS) have become an important area of research for solving such road-traffic issues and making smart decisions. ITS links people, roads, and vehicles together using communication technologies to increase safety and mobility. Moreover, accurate prediction of road traffic is important for managing traffic congestion. The aim of this study is to develop an ANN model for the prediction of traffic flow and to compare it with a linear regression model of traffic flow prediction. Data extraction was carried out at 15-minute intervals from the video player: video of mixed traffic flow was recorded during office hours and the vehicles counted in order to determine the traffic volume. Vehicles were classified into six categories: car, motorcycle, minibus, mid-bus, bus, and truck. The average time taken by each vehicle type to travel the trap length was measured from the time displayed on the video screen. Keywords: intelligent transport system (ITS), traffic flow prediction, artificial neural network (ANN), linear regression
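To illustrate the ANN-versus-linear-regression comparison, here is a sketch on a synthetic peaked flow profile. The "ANN" is a deliberately simple random-feature (extreme-learning-machine style) network rather than the study's trained model, and the flow data are invented, not the study's 15-minute counts:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 96)[:, None]          # one day in 15-minute steps
# Synthetic morning and evening peaks (normalized flow).
flow = (np.exp(-((t - 0.35) ** 2) / 0.01)
        + 0.6 * np.exp(-((t - 0.75) ** 2) / 0.02))

# Linear regression baseline: flow as an affine function of time of day.
A = np.hstack([np.ones_like(t), t])
w_lin, *_ = np.linalg.lstsq(A, flow, rcond=None)
mse_lin = float(np.mean((A @ w_lin - flow) ** 2))

# Random-feature network: fixed random tanh hidden units, output weights by
# least squares; the linear basis is included, so the fit can only improve.
H = 40
W1 = rng.normal(0.0, 8.0, (1, H))
b1 = rng.uniform(-8.0, 8.0, H)
hidden = np.tanh(t @ W1 + b1)
Phi = np.hstack([A, hidden])
w_ann, *_ = np.linalg.lstsq(Phi, flow, rcond=None)
mse_ann = float(np.mean((Phi @ w_ann - flow) ** 2))
print(f"MSE linear={mse_lin:.5f}, MSE ANN={mse_ann:.5f}")
```

The nonlinear hidden units capture the two peaks that a straight line cannot, which is the same motivation for preferring an ANN over linear regression for daily traffic profiles.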
Procedia PDF Downloads 67
3921 Optimal Design of RC Pier Accompanied with Multi Sliding Friction Damping Mechanism Using Combination of SNOPT and ANN Method
Authors: Angga S. Fajar, Y. Takahashi, J. Kiyono, S. Sawada
Abstract:
The structural system concept of an RC pier accompanied by a multi sliding friction damping mechanism was developed based on a numerical analysis approach. In implementation, however, designing this kind of structural system takes considerable effort because of its high complexity. During design, the special behaviors of this structural system should be considered, including flexible small deformation, sufficient elastic deformation capacity, sufficient lateral force resistance, and sufficient energy dissipation. The confinement distribution of the friction devices has a significant influence on these behaviors. Optimization and prediction with multi-function regression of this structural system are expected to provide an easier and simpler design method. The confinement distribution of the friction devices is optimized with SNOPT in OpenSees, while some design variables of the structure are predicted using multi-function regression with an ANN. Based on the optimization and prediction, this structural system can be designed easily and simply. Keywords: RC pier, multi sliding friction device, optimal design, flexible small deformation
Procedia PDF Downloads 367
3920 Is Electricity Consumption Stationary in Turkey?
Authors: Eyup Dogan
Abstract:
The number of research articles analyzing the integration properties of energy variables has increased rapidly in the energy literature for about a decade. The stochastic behaviors of energy variables are worth knowing for several reasons. For instance, national policies to conserve or promote energy consumption, which should be treated as shocks to energy consumption, will have only transitory effects if energy consumption is found to be stationary in a country. Furthermore, it is also important to know the order of integration in order to employ an appropriate econometric model. Despite being an important subject for applied energy (economics) and having a huge volume of studies, several known limitations still exist in the literature. For example, many of the studies use aggregate energy consumption and national-level data. In addition, a large part of the literature consists either of multi-country studies or of studies solely focusing on the U.S. This is the first study in the literature that considers a form of energy consumption by sector at the sub-national level. This research study investigates the unit root properties of electricity consumption for 12 regions of Turkey across four sectors, in addition to total electricity consumption, for the purpose of filling the mentioned gaps in the literature. In this regard, we analyze the stationarity properties of 60 cases (12 regions, each with four sectoral series plus total consumption). Because the use of multiple unit root tests makes the results robust and consistent, we apply the Dickey-Fuller unit root test based on Generalized Least Squares regression (DFGLS), the Phillips-Perron unit root test (PP), and the Zivot-Andrews unit root test with one endogenous structural break (ZA). The main finding of this study is that electricity consumption is trend stationary in 7 cases according to DFGLS and PP, whereas it is a stationary process in 12 cases when we take the structural change into account by applying ZA.
Thus, shocks to electricity consumption have transitory effects in those cases, namely agriculture in regions 1, 4, and 7; industry in regions 5, 8, 9, 10, and 11; business in regions 4, 7, and 9; and total electricity consumption in region 11. Regarding policy implications, policies to decrease or stimulate the use of electricity have a long-run impact on electricity consumption in 80% of cases in Turkey, given that 48 cases are non-stationary processes. On the other hand, the past behavior of electricity consumption can be used to predict its future behavior in the 12 stationary cases only. Keywords: unit root, electricity consumption, sectoral data, subnational data
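A minimal sketch of the unit-root logic behind these tests: a plain Dickey-Fuller regression with a constant, without the GLS detrending, lag augmentation, or structural-break handling of DFGLS/PP/ZA. The series are simulated, not the Turkish electricity data:

```python
import numpy as np

def df_tstat(y):
    """t-statistic on the lagged level in: diff(y) = a + b * y[t-1] + e."""
    dy = np.diff(y)
    ylag = y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return float(beta[1] / np.sqrt(cov[1, 1]))

rng = np.random.default_rng(5)
random_walk = np.cumsum(rng.normal(size=500))      # unit root: non-stationary
ar1 = np.zeros(500)
for i in range(1, 500):
    ar1[i] = 0.5 * ar1[i - 1] + rng.normal()       # stationary AR(1)

CRIT_5PCT = -2.87   # approximate 5% Dickey-Fuller critical value, constant-only case
print(df_tstat(random_walk), df_tstat(ar1))
```

A t-statistic well below the critical value rejects the unit root (the series is stationary, so shocks are transitory); a value near zero, as for the random walk, fails to reject it.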
Procedia PDF Downloads 410
3919 Characterization of the Ignitability and Flame Regression Behaviour of Flame Retarded Natural Fibre Composite Panel
Authors: Timine Suoware, Sylvester Edelugo, Charles Amgbari
Abstract:
Natural fibre composites (NFC) are becoming very attractive, especially for automotive interior and non-structural building applications, because they are biodegradable, low cost, lightweight, and environmentally friendly. NFC are known to release highly combustible products during exposure to heat, and this behaviour has raised concerns among end users. To improve their fire response, flame retardants (FR) such as aluminium tri-hydroxide (ATH) and ammonium polyphosphate (APP) are incorporated during processing to delay the start and spread of fire. In this paper, APP was modified with Gum Arabic powder (GAP) and synergized with carbon black (CB) to form new FR species. Four FR species at 0, 12, 15 and 18% loading ratios were added to oil palm fibre polyester composite (OPFC) panels as follows: OPFC12%APP-GAP, OPFC15%APP-GAP/CB, OPFC18%ATH/APP-GAP and OPFC18%ATH/APP-GAP/CB. The panels were produced by hand lay-up compression moulding and cured at room temperature. Specimens cut from the panels were tested for ignition time (Tig), peak heat release rate (HRRp), average heat release rate (HRRavg), peak mass loss rate (MLRp), residual mass (Rm) and average smoke production rate (SPRavg) using a cone calorimeter, as well as for the available flame energy (ɸ) driving the flame using a radiant panel flame spread apparatus. The ignitability data obtained at a 50 kW/m2 heat flux (HF) show that the hybrid FR modified with APP, that is OPFC18%ATH/APP-GAP, exhibited superior flame retardancy relative to the panels without FR (OPFC0%): Tig = 20 s, HRRp = 86.6 kW/m2, HRRavg = 55.8 kW/m2, MLRp = 0.131 g/s, Rm = 54.6% and SPRavg = 0.05 m2/s, representing improvements of 17.6%, 67.4%, 62.8%, 50.9%, 565% and 62.5%, respectively. In terms of flame spread, the lowest flame energy (ɸ), 0.49 kW2/s3 for OPFC18%ATH/APP-GAP, caused early flame regression.
This was 39.6 kW2/s3 lower than the value for the panels without FR (OPFC0%). It can be concluded that hybrid FR modified with APP could be useful in the automotive and building industries to delay the start and spread of fire. Keywords: flame retardant, flame regression, oil palm fibre, composite panel
Procedia PDF Downloads 128
3918 Use of the Gas Chromatography Method for Hydrocarbons' Quality Evaluation in the Offshore Fields of the Baltic Sea
Authors: Pavel Shcherban, Vlad Golovanov
Abstract:
Currently, there is active geological exploration and development of the subsoil of the shelf of the Kaliningrad region. To carry out a comprehensive and accurate assessment of the volumes and degree of extraction of hydrocarbons from open deposits, it is necessary not only to establish a number of geological and lithological characteristics of the structures under study, but also to determine the oil quality, viscosity, density, and fractional composition as accurately as possible. For the work considered here, gas chromatography is one of the most productive methods, allowing rapid generation of a significant amount of initial data. The article examines aspects of applying the gas chromatography method to determine the chemical characteristics of the hydrocarbons of the Kaliningrad shelf fields, as well as a correlation-regression analysis of these parameters in comparison with the previously obtained chemical characteristics of hydrocarbon deposits located onshore in the region. In the course of the research, a number of methods of mathematical statistics and computer processing of large data sets have been applied, which make it possible to evaluate the identity of the deposits, refine the reserve estimates, and make a number of assumptions about the genesis of the hydrocarbons under analysis. Keywords: computer processing of large databases, correlation-regression analysis, hydrocarbon deposits, gas chromatography method
Procedia PDF Downloads 157
3917 Challenge of Baseline Hydrology Estimation at Large-Scale Watersheds
Authors: Can Liu, Graham Markowitz, John Balay, Ben Pratt
Abstract:
Baseline, or natural, hydrology is commonly employed for hydrologic modeling and for quantifying hydrologic alteration due to man-made activities. It can inform planning and policy-related efforts by various state and federal water resource agencies seeking to restore natural streamflow regimes. A common challenge faced by hydrologists is how to replicate unaltered streamflow conditions, particularly in large watershed settings prone to development and regulation. Three different methods were employed to estimate baseline streamflow conditions for 6 major subbasins of the Susquehanna River Basin: 1) adding consumptive water use and reservoir operations back into regulated gaged records; 2) using a map correlation method and flow duration (exceedance probability) regression equations; and 3) extending the pre-regulation streamflow records based on the relationship between concurrent streamflows at unregulated and regulated gage locations. Parallel analyses were performed for the three methods, and the limitations associated with each are presented. Results from these analyses indicate that generating baseline streamflow records for large-scale watersheds remains challenging, even with long-term continuous stream gage records available. Keywords: baseline hydrology, streamflow gage, subbasin, regression
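Method 3 above, extending a record from concurrent flows at an index gage, can be sketched as a log-space regression (a simplified stand-in for record-extension techniques such as MOVE; all flows below are invented):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 400
index_flow = np.exp(rng.normal(4.0, 0.8, n))            # unregulated index gage, cfs
target_flow = 2.0 * index_flow ** 0.9 * np.exp(rng.normal(0.0, 0.15, n))

# Log-log regression on the concurrent (overlapping) period of record.
A = np.column_stack([np.ones(n), np.log(index_flow)])
(b0, b1), *_ = np.linalg.lstsq(A, np.log(target_flow), rcond=None)

# Estimate target-gage flows for dates where only the index gage has data.
new_index = np.array([30.0, 80.0, 150.0])
estimated = np.exp(b0 + b1 * np.log(new_index))
print(f"exponent = {b1:.2f}, estimated flows = {np.round(estimated, 1)}")
```

Operational record extension would typically use a variance-preserving fit rather than plain OLS, since OLS shrinks the variance of the extended record; this sketch shows only the concurrent-flow relationship itself.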
Procedia PDF Downloads 324