Search results for: Cox proportional hazard regression
3885 The Effect of Mathematical Modeling of Damping on the Seismic Energy Demands
Authors: Selamawit Dires, Solomon Tesfamariam, Thomas Tannert
Abstract:
Modern earthquake engineering and design encompass performance-based design philosophy. The main objective in performance-based design is to achieve a system performing precisely to meet the design objectives, so as to reduce unintended seismic risks and associated losses. Energy-based earthquake-resistant design is one of the design methodologies that can be implemented in performance-based earthquake engineering. In energy-based design, the seismic demand is usually described as the ratio of the hysteretic to input energy. Once the hysteretic energy is known as a percentage of the input energy, it is distributed among the energy-dissipating components of a structure. The hysteretic to input energy ratio is highly dependent on the inherent damping of a structural system. In numerical analysis, damping can be modeled as stiffness-proportional, mass-proportional, or a linear combination of stiffness and mass. In this study, the effect of mathematical modeling of damping on the estimation of seismic energy demands is investigated by considering elastic-perfectly-plastic single-degree-of-freedom systems representing short- to long-period structures. Furthermore, the seismicity of Vancouver, Canada, is used in the nonlinear time history analyses. According to the preliminary results, the input energy demand is not sensitive to the type of damping model deployed; hence, consistent results are achieved regardless of the damping model utilized in the numerical analyses. On the other hand, the hysteretic to input energy ratios vary significantly across the different damping models.
Keywords: damping, energy-based seismic design, hysteretic energy, input energy
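The mass- and stiffness-proportional (Rayleigh) damping mentioned above can be sketched in a few lines. The sketch below is a hedged illustration with hypothetical anchor frequencies and a 5% target ratio (not values from the study); it computes the two Rayleigh coefficients and the resulting frequency-dependent damping ratio:

```python
import math

def rayleigh_coefficients(zeta, w1, w2):
    """Coefficients a0 (mass-proportional) and a1 (stiffness-proportional)
    such that C = a0*M + a1*K gives damping ratio zeta at frequencies w1, w2."""
    a0 = 2.0 * zeta * w1 * w2 / (w1 + w2)
    a1 = 2.0 * zeta / (w1 + w2)
    return a0, a1

def damping_ratio(a0, a1, w):
    """Effective damping ratio of a Rayleigh-damped mode at circular frequency w."""
    return a0 / (2.0 * w) + a1 * w / 2.0

# Hypothetical target: 5% damping anchored at 2 Hz and 10 Hz
w1, w2 = 2.0 * math.pi * 2.0, 2.0 * math.pi * 10.0
a0, a1 = rayleigh_coefficients(0.05, w1, w2)
```

Between the two anchor frequencies the effective ratio dips below the target and grows outside them, which is one reason the choice of damping model can change hysteretic-to-input energy ratios.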
Procedia PDF Downloads 168
3884 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model
Authors: Chaudhuri Manoj Kumar Swain, Susmita Das
Abstract:
This paper explores a detailed procedure for predicting a path loss (PL) model and its application in estimating the coverage probability of a WiMAX network. To this end, a hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of operation incorporated in this approach, and the importance of each phase is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental set-up is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent as well as the coverage probability of the network are evaluated. This research work may significantly assist in the deployment and optimisation of any cellular network.
Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis
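Estimating a PL exponent as described above reduces to a simple linear regression in log-distance. A minimal sketch with hypothetical, noiseless data (the function name and numbers are illustrative, not from the measurement campaign):

```python
import math

def fit_path_loss_exponent(distances_m, pl_db, d0=1.0):
    """Least-squares fit of the log-distance model PL(d) = PL(d0) + 10*n*log10(d/d0).
    Returns (PL(d0), n), using the plain simple-linear-regression formulas."""
    xs = [10.0 * math.log10(d / d0) for d in distances_m]
    ys = list(pl_db)
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    return ybar - slope * xbar, slope  # intercept PL(d0), exponent n

# Hypothetical noiseless measurements generated with PL(1 m) = 40 dB, n = 3.2
dists = [10.0, 50.0, 100.0, 500.0, 1000.0]
pls = [40.0 + 10.0 * 3.2 * math.log10(d) for d in dists]
pl0, n = fit_path_loss_exponent(dists, pls)
```

With real RSSI data the fit is the same; only the noise around the regression line changes, which is what the shadowing standard deviation captures.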
Procedia PDF Downloads 179
3883 Medical Surveillance Management
Authors: Jina K., Kittinan C. Athitaya J., Weerapat B., Amornrat T., Waraphan N.
Abstract:
Working in the exploration and production of petroleum exposes workers to various health risks, including but not limited to physical and chemical risks. Although many barriers have been put in place, e.g., hazard monitoring in the workplace, appropriate training on health hazards, and proper personal protective equipment (PPE), health hazards may harm the workers if the barriers are not effectively implemented. To prove the effectiveness of these barriers, it is necessary to monitor exposure by putting in place a medical surveillance program via biological monitoring of chemical hazards and physical check-ups for physical hazards. Medical surveillance management is the systematic assessment and monitoring of employees exposed or potentially exposed to occupational hazards, with the goal of reducing and ultimately preventing occupational illness and injury. The paper aims to demonstrate the effectiveness of medical surveillance management in mitigating health risks associated with physical and chemical hazards in the petroleum industry by focusing on implementing programs for biological monitoring and physical examinations, including defining procedures for biological monitoring, urine sample collection, physical examinations, and result management on offshore petroleum platforms. The implementation of medical surveillance management has proven effective in monitoring worker exposure to physical and chemical hazards, significantly reducing medical expenses and the risk associated with work-related diseases.
Keywords: medical surveillance, petroleum industry, occupational hazards, medical surveillance process
Procedia PDF Downloads 19
3882 Regression Analysis in Estimating Stream-Flow and the Effect of Hierarchical Clustering Analysis: A Case Study in Euphrates-Tigris Basin
Authors: Goksel Ezgi Guzey, Bihrat Onoz
Abstract:
The scarcity of streamflow gauging stations and the increasing effects of global warming make designing water management systems very difficult. This study is a significant contribution to assessing regional regression models for estimating streamflow. In this study, simulated meteorological data were related to the observed streamflow data from 1971 to 2020 for 33 stream gauging stations of the Euphrates-Tigris Basin. Ordinary least squares regression was used to predict flow for 2020-2100 with the simulated meteorological data. The CORDEX-EURO and CORDEX-MENA domains were used with 0.11 and 0.22 grids, respectively, to estimate climate conditions under certain climate scenarios. Twelve meteorological variables simulated by two regional climate models, RCA4 and RegCM4, were used as independent variables in the ordinary least squares regression, where the observed streamflow was the dependent variable. The variability of streamflow was then calculated with 5-6 meteorological variables and watershed characteristics such as area and height prior to the application. Following the regression analysis of 31 stream gauging stations' data, the stations were subjected to a clustering analysis, which grouped the stations into two clusters in terms of their hydrometeorological properties. Two streamflow equations were found for the two clusters of stream gauging stations for every domain and every regional climate model, which increased the efficiency of streamflow estimation by 10-15% for all the models. This study underlines the importance of the homogeneity of a region in estimating streamflow, not only in terms of geographical location but also in terms of the meteorological characteristics of that region.
Keywords: hydrology, streamflow estimation, climate change, hydrologic modeling, HBV, hydropower
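Ordinary least squares with several meteorological predictors, as used above, can be sketched via the normal equations. The coefficients and data below are hypothetical placeholders, not the study's streamflow equations:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved with naive Gaussian elimination (fine for small, well-posed fits).
    Each row of X should start with a 1 for the intercept."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                      # forward elimination
        for j in range(i + 1, k):
            f = A[j][i] / A[i][i]
            A[j] = [a - f * c for a, c in zip(A[j], A[i])]
            b[j] -= f * b[i]
    beta = [0.0] * k                        # back substitution
    for i in range(k - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Hypothetical relation: streamflow = 5 + 2*precipitation - 0.5*temperature
rows = [[1.0, p, t] for p in (10.0, 20.0, 30.0) for t in (5.0, 15.0, 25.0)]
flows = [5.0 + 2.0 * r[1] - 0.5 * r[2] for r in rows]
beta = ols(rows, flows)
```

Fitting one such equation per cluster of stations, rather than one for the whole basin, is exactly the regionalization step the abstract credits with the 10-15% efficiency gain.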
Procedia PDF Downloads 129
3881 The Impact of Governance on Happiness: Evidence from Quantile Regressions
Authors: Chiung-Ju Huang
Abstract:
This study utilizes quantile regression analysis to examine the impact of governance (including democratic quality and technical quality) on happiness in 101 countries worldwide, classified as “developed countries” and “developing countries”. The empirical results show that the impact of democratic quality and technical quality on happiness is significantly positive for developed countries, while it is insignificant for developing countries. The results suggest that the authorities in developed countries can enhance the level of individual happiness by improving democratic quality and technical quality. However, for developing countries, promoting the quality of governance in order to enhance the level of happiness may not be effective. Policy makers in developing countries may instead pay more attention to increasing real GDP per capita, rather than promoting the quality of governance, to enhance individual happiness.
Keywords: governance, happiness, multiple regression, quantile regression
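Quantile regression rests on minimizing the pinball (check) loss rather than squared error. As a minimal sketch, not the study's estimator, the code below shows that the constant minimizing the pinball loss over a sample is an empirical quantile:

```python
def pinball_loss(tau, sample, q):
    """Average check-function (pinball) loss of predicting the constant q."""
    return sum(tau * (v - q) if v >= q else (tau - 1.0) * (v - q)
               for v in sample) / len(sample)

def best_constant(tau, sample):
    """Among the observed values, the minimizer of the pinball loss
    is an empirical tau-quantile of the sample."""
    return min(sample, key=lambda q: pinball_loss(tau, sample, q))

sample = [2.0, 4.0, 5.0, 7.0, 9.0, 11.0, 15.0]
median = best_constant(0.5, sample)
lower_quartile = best_constant(0.25, sample)
```

Replacing the constant with a linear predictor and minimizing the same loss gives quantile regression, which is how effects can differ across the happiness distribution instead of only at the mean.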
Procedia PDF Downloads 282
3880 Application of Ground-Penetrating Radar in Environmental Hazards
Authors: Kambiz Teimour Najad
Abstract:
The basic methodology of GPR involves the use of a transmitting antenna to send electromagnetic waves into the subsurface, which then bounce back to the surface and are detected by a receiving antenna. The transmitter and receiver antennas are typically placed on the ground surface and moved across the area of interest to create a profile of the subsurface. The GPR system consists of a control unit that powers the antennas and records the data, as well as a display unit that shows the results of the survey. The control unit sends a pulse of electromagnetic energy into the ground, which propagates through the soil or rock until it encounters a change in material or structure. When the electromagnetic wave encounters a buried object or structure, some of the energy is reflected back to the surface and detected by the receiving antenna. The GPR data is then processed using specialized software that analyzes the amplitude and travel time of the reflected waves. By interpreting the data, GPR can provide information on the depth, location, and nature of subsurface features and structures. GPR has several advantages over other geophysical survey methods, including its ability to provide high-resolution images of the subsurface and its non-invasive nature, which minimizes disruption to the site. However, the effectiveness of GPR depends on several factors, including the type of soil or rock, the depth of the features being investigated, and the frequency of the electromagnetic waves used. In environmental hazard assessments, GPR can be used to detect buried structures, such as underground storage tanks, pipelines, or utilities, which may pose a risk of contamination to the surrounding soil or groundwater. GPR can also be used to assess soil stability by identifying areas of subsurface voids or sinkholes, which can lead to the collapse of the surface. 
Additionally, GPR can be used to map the extent and movement of groundwater contamination, which is critical in designing effective remediation strategies. In summary, the methodology of GPR in environmental hazard assessments involves the use of electromagnetic waves to create high-resolution images of the subsurface, which are then analyzed to provide information on the depth, location, and nature of subsurface features and structures. This information is critical in identifying and mitigating environmental hazards, and the non-invasive nature of GPR makes it a valuable tool in this field.
Keywords: GPR, hazard, landslide, rock fall, contamination
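The depth information described above comes from the two-way travel time of the reflected wave. A minimal sketch, assuming a uniform medium characterized only by its relative permittivity (the numbers are hypothetical):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def gpr_depth(two_way_time_ns, rel_permittivity):
    """Reflector depth from two-way travel time: d = v*t/2,
    with wave velocity v = c / sqrt(epsilon_r) in a uniform medium."""
    v = C / rel_permittivity ** 0.5   # wave speed in the medium, m/s
    t = two_way_time_ns * 1e-9        # nanoseconds -> seconds
    return v * t / 2.0

# Hypothetical 50 ns echo in dry sand (relative permittivity ~ 4)
depth = gpr_depth(50.0, 4.0)  # roughly 3.75 m
```

Since the permittivity of wet clay is far higher than that of dry sand, the same travel time maps to very different depths, which is one reason soil type governs GPR effectiveness.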
Procedia PDF Downloads 84
3879 Development of a 3D Model of Real Estate Properties in Fort Bonifacio, Taguig City, Philippines Using Geographic Information Systems
Authors: Lyka Selene Magnayi, Marcos Vinas, Roseanne Ramos
Abstract:
As the real estate industry continually grows in the Philippines, Geographic Information Systems (GIS) provide advantages in generating spatial databases for efficient delivery of information and services. The real estate sector is not only providing qualitative data about real estate properties but also utilizes various spatial aspects of these properties for different applications such as hazard mapping and assessment. In this study, a three-dimensional (3D) model and a spatial database of real estate properties in Fort Bonifacio, Taguig City are developed using GIS and SketchUp. Spatial datasets include political boundaries, buildings, road network, digital terrain model (DTM) derived from Interferometric Synthetic Aperture Radar (IFSAR) image, Google Earth satellite imageries, and hazard maps. Multiple model layers were created based on property listings by a partner real estate company, including existing and future property buildings. Actual building dimensions, building facade, and building floorplans are incorporated in these 3D models for geovisualization. Hazard model layers are determined through spatial overlays, and different scenarios of hazards are also presented in the models. Animated maps and walkthrough videos were created for company presentation and evaluation. Model evaluation is conducted through client surveys requiring scores in terms of the appropriateness, information content, and design of the 3D models. Survey results show very satisfactory ratings, with the highest average evaluation score equivalent to 9.21 out of 10. The output maps and videos obtained passing rates based on the criteria and standards set by the intended users of the partner real estate company. The methodologies presented in this study were found useful and have remarkable advantages in the real estate industry. 
This work may be extended to automated mapping and the creation of online spatial databases for better storage and access of real property listings, and to an interactive platform using web-based GIS.
Keywords: geovisualization, geographic information systems, GIS, real estate, spatial database, three-dimensional model
Procedia PDF Downloads 159
3878 The Impact of Female Education on Fertility: A Natural Experiment from Egypt
Authors: Fatma Romeh, Shiferaw Gurmu
Abstract:
This paper examines the impact of female education on fertility, using the change in the length of primary schooling in Egypt in 1988-89 as the source of exogenous variation in schooling. In particular, beginning in 1988, children had to attend primary school for only five years rather than six years. This change was applicable to all individuals born on or after October 1977. Using a nonparametric regression discontinuity approach, we compare the education and fertility of women born just before and after October 1977. The results show that female education significantly reduces the number of children born per woman and delays the time until first birth. Under a robust regression discontinuity approach, however, the impact of education on the number of children is no longer significant. The impact on the timing of first birth remains significant under the robust approach. Each year of female education postponed childbearing by three months, on average.
Keywords: Egypt, female education, fertility, robust regression discontinuity
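A sharp regression discontinuity estimate of the kind described above compares local linear fits on either side of the cutoff. The sketch below uses hypothetical, noiseless data with a built-in drop of 0.8 children at the cutoff (not the study's data or estimator):

```python
def line_fit(xs, ys):
    """Simple least-squares line y = a + b*x; returns (a, b)."""
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    return ybar - b * xbar, b

def rd_estimate(running, outcome, cutoff, bandwidth):
    """Sharp RD: difference of the two local-linear fits evaluated at the cutoff."""
    left = [(x, y) for x, y in zip(running, outcome)
            if cutoff - bandwidth <= x < cutoff]
    right = [(x, y) for x, y in zip(running, outcome)
             if cutoff <= x <= cutoff + bandwidth]
    aL, bL = line_fit([x for x, _ in left], [y for _, y in left])
    aR, bR = line_fit([x for x, _ in right], [y for _, y in right])
    return (aR + bR * cutoff) - (aL + bL * cutoff)

# Hypothetical noiseless data: fertility drops by 0.8 children at the cutoff
xs = [float(x) for x in range(-10, 11)]
ys = [3.0 + 0.05 * x + (-0.8 if x >= 0 else 0.0) for x in xs]
jump = rd_estimate(xs, ys, cutoff=0.0, bandwidth=10.0)
```

Here the running variable would be birth date relative to October 1977; robust RD methods additionally correct the bandwidth choice and the confidence intervals, which is why significance can change between the two approaches.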
Procedia PDF Downloads 338
3877 Breast Cancer Mortality and Comorbidities in Portugal: A Predictive Model Built with Real World Data
Authors: Cecília M. Antão, Paulo Jorge Nogueira
Abstract:
Breast cancer (BC) is the first cause of cancer mortality among Portuguese women. This retrospective observational study aimed at identifying comorbidities associated with BC female patients admitted to Portuguese public hospitals (2010-2018), investigating the effect of comorbidities on the BC mortality rate, and building a predictive model using logistic regression. Results showed that BC mortality in Portugal decreased in this period and reached 4.37% in 2018. Adjusted odds ratios indicated that secondary malignant neoplasms of the liver and of bone and bone marrow, congestive heart failure, and diabetes were associated with an increased chance of dying from breast cancer. Although the Lisbon district (the most populated area) accounted for the largest percentage of BC patients, the logistic regression model showed that, besides the patient’s age, being resident in the Bragança, Castelo Branco, or Porto districts was directly associated with an increase in the mortality rate.
Keywords: breast cancer, comorbidities, logistic regression, adjusted odds ratio
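Adjusted odds ratios come from exponentiating the fitted logistic regression coefficients. As a simpler, hedged illustration, the sketch below computes an unadjusted odds ratio from a hypothetical 2x2 table (the counts are invented, not from the Portuguese hospital data):

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Unadjusted odds ratio from a 2x2 table: (a*d) / (b*c).
    In a fitted logistic model, the adjusted analogue is exp(coefficient)."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical counts: deaths among patients with vs. without a given comorbidity
or_with_comorbidity = odds_ratio(30, 70, 10, 90)  # (30*90)/(70*10) ~ 3.86
```

An odds ratio above 1 means the comorbidity is associated with higher odds of dying; adjustment in the logistic model removes confounding by age and the other covariates, which the raw 2x2 table cannot do.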
Procedia PDF Downloads 89
3876 Assessing Relationships between Glandularity and Gray Level by Using Breast Phantoms
Authors: Yun-Xuan Tang, Pei-Yuan Liu, Kun-Mu Lu, Min-Tsung Tseng, Liang-Kuang Chen, Yuh-Feng Tsai, Ching-Wen Lee, Jay Wu
Abstract:
Breast cancer is the most prevalent malignant tumor in females. An increase in glandular density increases the risk of breast cancer. BI-RADS is a frequently used density indicator in mammography; however, it significantly overestimates the glandularity. Therefore, it is very important to assess the glandularity accurately and quantitatively by mammography. In this study, 20%, 30% and 50% glandularity phantoms were exposed using a mammography machine at 28, 30 and 31 kVp, and 30, 55, 80 and 105 mAs, respectively. Regions of interest (ROIs) were drawn to assess the gray level. The relationship between the glandularity and gray level under various compression thicknesses, kVp, and mAs was established by multivariable linear regression. A phantom verification was performed with automatic exposure control (AEC). The regression equation was obtained with an R-square value of 0.928. The average gray levels of the verification phantom were 8708, 8660 and 8434 for 0.952, 0.963 and 0.985 g/cm3, respectively. The percent differences of glandularity to the regression equation were 3.24%, 2.75% and 13.7%. We concluded that the proposed method could be clinically applied in mammography to improve glandularity estimation and further increase the importance of breast cancer screening.
Keywords: mammography, glandularity, gray value, BI-RADS
Procedia PDF Downloads 495
3875 Mobilizing Resources for Social Entrepreneurial Opportunity: A Framework of Engagement Strategy
Authors: Balram Bhushan
Abstract:
The emergence of social entrepreneurship challenges the strict categorization of not-for-profit, for-profit and hybrid organizations. Although the blurring of boundaries helps social entrepreneurial organizations (SEOs) make better use of emerging opportunities, it poses a significant challenge while mobilizing money from different sources. Additionally, for monetary resources, the legal framework of the host country may further complicate the issue by imposing strict accounting standards. Under such circumstances, resource providers fail to recognize the suitable engagement strategy with the SEO of their choice. Based on the process of value creation and value capture, this paper develops a guiding framework for resource providers to design an appropriate mix of engagement with the identified SEOs. Essentially, social entrepreneurship creates value at the societal level, but value capture is a characteristic of an organization. Additionally, SEOs prefer value creation over value capture. The paper argues that the nature of the relationship between value creation and value capture determines the extent of the blurred boundaries of the organization. Accordingly, synergistic, antagonistic and sequential relationships are proposed between value capture and value creation. When value creation is synergistically associated with value capture, the preferred organizational form falls within that of for-profit organizations under the strictest legal framework. Banks offering micro-loans are good examples of this category. Opposite to this, an antagonistic relationship between value creation and value capture, where value capture opportunities are sacrificed for value creation, dictates a non-profit organizational structure. Examples of this category include non-government organizations and charity organizations.
Finally, in the sequential relationship, value capture opportunities are pursued after value creation opportunities, guiding the action closer to the hybrid structure. Examples of this category include organizations where a not-for-profit unit controls the for-profit units of the organization, either legally or structurally. As an SEO may attempt to utilize multiple entrepreneurial opportunities falling across any of the three relationships between value creation and value capture, resource providers need to evaluate an appropriate mix of these relationships before designing their engagement strategies. The paper suggests three guiding principles for the engagement strategy. First, the extent of investment should be proportional to the synergistic relationship between value capture and value creation. Second, the subsidized support should be proportional to the sequential relationship. Third, the funding (charity contribution) should be proportional to the antagonistic relationship. Finally, resource providers need to keep a close watch on the evolving relationship between value creation and value capture, to introduce appropriate changes in their engagement strategy.
Keywords: social entrepreneurship, value creation, value capture, entrepreneurial opportunity
Procedia PDF Downloads 133
3874 An Analysis of the Regression Hypothesis from a Shona Broca’s Aphasia Perspective
Authors: Esther Mafunda, Simbarashe Muparangi
Abstract:
The present paper tests the applicability of the Regression Hypothesis to the pathological language dissolution of a Shona male adult with Broca’s aphasia. It particularly assesses the prediction of the Regression Hypothesis, which states that the process by which language is forgotten is the reversal of the process by which it was acquired. The main aim of the paper is to find out whether there are mirror symmetries between the L1 acquisition and L1 dissolution of tense in Shona and, if so, what might cause these regression patterns. The paper also seeks to highlight the practical contributions that linguistic theory can make to solving language-related problems. Data was collected from a 46-year-old male adult with Broca’s aphasia who was receiving speech therapy at St Giles Rehabilitation Centre in Harare, Zimbabwe. The primary data elicitation method was experimental, using the probe technique. The TART (Test for Assessing Reference Time) Shona version, in the form of sequencing pictures, was used to assess tense in the Broca’s aphasic participant and a 3.5-year-old child. Using SPSS (Statistical Package for the Social Sciences) and Excel analysis, it was established that the use of the future tense was impaired in the Shona Broca’s aphasic participant, whilst the present and past tenses were intact. However, though the past tense was intact in the adult with Broca’s aphasia, reference was made to the remote past. The use of the future tense was also found to be difficult for the 3.5-year-old Shona-speaking child. No difficulties were encountered in using the present and past tenses. This means that mirror symmetries were found between the L1 acquisition and L1 dissolution of tense in Shona. On the basis of the results of this research, it can be concluded that the use of tense by a Shona adult with Broca’s aphasia supports the Regression Hypothesis. The findings of this study are important in terms of speech therapy in the context of Zimbabwe.
The study also contributes to Bantu linguistics in general and to Shona linguistics in particular. Further studies could also be done focusing on the rest of the Bantu language varieties in terms of aphasia.
Keywords: Broca’s aphasia, regression hypothesis, Shona, language dissolution
Procedia PDF Downloads 97
3873 Apricot Insurance Portfolio Risk
Authors: Kasirga Yildirak, Ismail Gur
Abstract:
We propose a model to measure the hail risk of an agricultural insurance portfolio. Hail is one of the major catastrophic events that cause large losses to an insurer. Moreover, it is very hard to predict due to its erratic atmospheric characteristics. We make use of parcel-based claims data on apricot damage collected by the Turkish Agricultural Insurance Pool (TARSIM). As our ultimate aim is to compute the loadings assigned to specific parcels, we build a portfolio risk model that makes use of PD and the severity of the exposures. PD is computed by spherical-linear and circular-linear regression models, as the data carry coordinate information and seasonality. Severity is mapped into integer brackets so that probability generating functions can be employed. Individual regressions are run on each cluster, estimated on different criteria. The loss distribution is constructed by the Panjer recursion technique. We also show that the one risk-one crop model can easily be extended to the multi risk-multi crop model by assuming conditional independence.
Keywords: hail insurance, spherical regression, circular regression, spherical clustering
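The Panjer recursion mentioned above builds the aggregate loss distribution from a frequency distribution and a bracketed severity distribution. A minimal sketch for a compound Poisson case (the frequency parameter and severity brackets are hypothetical, not the TARSIM estimates):

```python
import math

def panjer_poisson(lam, severity_pmf, n_max):
    """Aggregate-loss pmf g on {0, ..., n_max} for a compound Poisson model
    via Panjer's recursion (Poisson frequency is in the (a, b, 0) class
    with a = 0, b = lam). severity_pmf[j] = P(one claim costs j brackets)."""
    f = severity_pmf
    g = [0.0] * (n_max + 1)
    g[0] = math.exp(-lam * (1.0 - f[0]))
    for n in range(1, n_max + 1):
        g[n] = sum((lam * j / n) * f[j] * g[n - j]
                   for j in range(1, min(n, len(f) - 1) + 1))
    return g

# Hypothetical bracketed severity: 1 unit w.p. 0.6, 2 units w.p. 0.4 (mean 1.4)
g = panjer_poisson(lam=2.0, severity_pmf=[0.0, 0.6, 0.4], n_max=60)
```

The mass of the resulting distribution sums to one and its mean equals the claim rate times the mean severity (here 2.0 x 1.4 = 2.8), which is a convenient sanity check before reading off tail quantiles for loadings.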
Procedia PDF Downloads 251
3872 Factors Controlling Durability of Some Egyptian Non-Stylolitic Marbleized Limestone to Salt Weathering
Authors: H. El Shayab, G. M. Kamh, N. G. Abdel Ghafour, M. L. Abdel Latif
Abstract:
Nowadays, marbleized limestone has become one of the most important sources of mineral wealth in Egypt, as its beautiful colors (white, grey, rose, yellow, creamy, etc.) make it very suitable for decoration purposes. Non-stylolitic marbleized limestone does not contain stylolitic surfaces. The current study aims to examine the different factors controlling the durability of non-stylolitic marbleized limestone against salt crystallization weathering. To achieve the aim of the research, nine representative samples were collected, three from each of the studied areas. The studied samples were characterized by various instrumental methods before salt weathering to determine their mineralogical composition, chemical composition and pore physical properties, respectively. The obtained results revealed that both the Duwi and Delga samples have nearly the same average ∆M%, 1.63 and 1.51 respectively, and consequently the same A.I. stage of deformation. On the other hand, the average ∆M% of the Wata samples is 0.29, i.e., lower than in the two other studied areas. The Wata samples are thus more durable against the salt crystallization test than those of Duwi and Delga. The difference in salt crystallization durability may result from one of the following factors: microscopic texture, as both micrite and skeletal percentages are directly proportional to the durability of stones against salt weathering; dolomite present as a secondary mineral, which is inversely proportional to durability; and an increase in MgO%, which is also associated with decreased durability in the salt crystallization test. Finally, all factors acting positively in the salt crystallization test are present in the Wadi Wata samples rather than in the other two areas.
Keywords: marbleized limestone, salt weathering, Wata
Procedia PDF Downloads 328
3871 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis
Authors: Wenbo Du, Xiaomei Ma
Abstract:
With the empowerment of Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to extract more diagnostic information. What is noticeable is that most of the extant empirical CDA-based research puts much emphasis on individual-level diagnostic purposes, with very little concerned with learners’ group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, in that it might be more practical in a classroom setting. Additionally, the group-level diagnostic information obtained via current CDA always results in a “flat pattern”, that is, the mastery/non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome offers little benefit beyond the original total score. To address these issues, the present study attempts to apply cluster analysis for group classification and quantile regression analysis to pinpoint learners’ performance at different proficiency levels (beginner, intermediate and advanced), thus enhancing the interpretation of the CDA results extracted from a group of EFL learners’ reading performance on a diagnostic reading test designed by the PELDiaG research team from a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than those of CDA, and quantile regression analysis does picture more insightful characteristics of learners with different reading proficiencies. The findings are helpful and practical for instructors in refining the EFL reading curriculum and tailoring instructional plans based on the group classification results and quantile regression analysis.
Meanwhile, these innovative statistical methods could also make up for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression
Procedia PDF Downloads 146
3870 Geoplanology Modeling and Applications Engineering of Earth in Spatial Planning Related with Geological Hazard in Cilegon, Banten, Indonesia
Authors: Muhammad L. A. Dwiyoga
Abstract:
The spatial condition of land in an industrial area needs special attention and deeper study. Geoplanology modeling can help arrange an area according to its land capability. The research method is to perform remote sensing analysis, Geographic Information System analysis, and a more comprehensive analysis to determine the geological characteristics and land capability of the research area and their relation to geological disasters. Cilegon is part of Banten province, located in western Java, bordered by the Sunda Strait, while the southern part of the province borders the Indian Ocean. The morphology of the study area ranges from highlands to lowlands. In the highlands, potential landslide-prone zones were identified, whereas low-lying areas have flooding potential. Moreover, the study area is prone to earthquakes, due to its proximity to Mount Krakatau and the subduction zone. The results of this study show that the study area has a susceptibility to landslides located around the Waringinkurung District, while areas with flood potential lie in the Cilegon District and surrounding areas. Based on the seismic data, this area includes zones with magnitudes ranging from 1.5 to 5.5 at depths of 1 to 60 km. As for its land capability, the analyses and studies carried out indicate the need to renew the existing Spatial Plan map, considering the fairly rapid development of the Cilegon area.
Keywords: geoplanology, spatial plan, geological hazard, Cilegon, Indonesia
Procedia PDF Downloads 504
3869 The Factors of Supply Chain Collaboration
Authors: Ghada Soltane
Abstract:
The objective of this study was to identify factors impacting supply chain collaboration. A quantitative study was carried out on a sample of 84 Tunisian industrial companies. To verify the research hypotheses and test the direct effect of these factors on supply chain collaboration, a multiple regression method was applied using SPSS 26 software. The results show that four factors have a significant and positive direct effect on supply chain collaboration: trust, engagement, information sharing and information quality.
Keywords: supply chain collaboration, factors of collaboration, principal component analysis, multiple regression
Procedia PDF Downloads 51
3868 Neural Network Supervisory Proportional-Integral-Derivative Control of the Pressurized Water Reactor Core Power Load Following Operation
Authors: Derjew Ayele Ejigu, Houde Song, Xiaojing Liu
Abstract:
This work presents a particle swarm optimization-trained neural network (PSO-NN) supervisory proportional-integral-derivative (PID) control method to monitor the pressurized water reactor (PWR) core power for safe operation. The proposed control approach is implemented on the transfer function of the PWR core, which is computed from the state-space model. The PWR core state-space model is designed from the neutronics, thermal-hydraulics, and reactivity models using perturbation around the equilibrium value. The proposed control approach computes the control rod speed to maneuver the core power to track the reference in a closed-loop scheme. The particle swarm optimization (PSO) algorithm is used to train the neural network (NN) and to tune the PID simultaneously. The controller performance is examined using the integral absolute error, integral time absolute error, integral square error, and integral time square error functions, and the stability of the system is analyzed using the Bode diagram. The simulation results indicate that the controller shows satisfactory performance in controlling and tracking the load power effectively and smoothly, compared to the PSO-PID control technique. This study will benefit the design of supervisory controllers for control applications in nuclear engineering research.
Keywords: machine learning, neural network, pressurized water reactor, supervisory controller
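The PID part of the scheme above can be sketched independently of the reactor model. The code below runs a discrete PID loop on a hypothetical first-order plant standing in for the core power dynamics (gains and time constant are invented, not the PSO-tuned values):

```python
def pid_step(state, error, kp, ki, kd, dt):
    """One update of a discrete PID controller; state = (integral, previous error)."""
    integral, prev = state
    integral += error * dt
    derivative = (error - prev) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

# Hypothetical first-order "core power" plant: dP/dt = (u - P) / tau
P, tau, dt = 0.0, 5.0, 0.1
state, setpoint = (0.0, 0.0), 1.0
for _ in range(2000):              # 200 s of simulated time
    u, state = pid_step(state, setpoint - P, kp=2.0, ki=0.5, kd=0.1, dt=dt)
    P += (u - P) / tau * dt        # explicit Euler step of the plant
```

In the supervisory scheme, the NN's role is to retune kp, ki and kd online as the operating point changes, rather than leaving them fixed as in this sketch.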
Procedia PDF Downloads 157
3867 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques
Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas
Abstract:
The purpose of this research is to develop an algorithm capable of classifying news articles from the automobile industry, according to the competitive actions they entail, using Text Mining (TM) methods. The data are preprocessed by preparing pipelines that best fit each algorithm. The pipelines are tested along with nine different classification algorithms spanning regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines: Logistic Regression (LR) and an Artificial Neural Network (ANN). These algorithms are optimized further by testing several parameters of each. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm reaches an accuracy of 94%. Keywords: artificial neural network, competitive dynamics, logistic regression, text classification, text mining
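A minimal sketch of the Logistic Regression pipeline idea, using an invented toy corpus in place of the automobile-industry articles (the texts, labels, and two competitive-action classes are illustrative only):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus standing in for the automobile-industry news articles;
# labels denote hypothetical competitive-action classes.
texts = [
    "Automaker cuts prices on its flagship sedan",
    "Discount offered across the entire model range",
    "New electric SUV launched with advanced driver assistance",
    "Company unveils a new hatchback model at the auto show",
    "Price reduction announced for fleet customers",
    "Launch of a redesigned pickup with hybrid drivetrain",
]
labels = ["pricing", "pricing", "launch", "launch", "pricing", "launch"]

# A pipeline bundles preprocessing (TF-IDF) with the classifier, mirroring
# the paper's "one pipeline per algorithm" idea for Logistic Regression.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

print(clf.predict(["Manufacturer announces discounts on sedans"]))
```

The ANN variant would swap the final estimator; the preprocessing pipeline stays the same.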
Procedia PDF Downloads 122
3866 Rapid Flood Damage Assessment of Population and Crops Using Remotely Sensed Data
Authors: Urooj Saeed, Sajid Rashid Ahmad, Iqra Khalid, Sahar Mirza, Imtiaz Younas
Abstract:
Pakistan, a flood-prone country, has experienced some of its worst floods in the recent past, which caused extensive damage to urban and rural areas through loss of lives and damage to infrastructure and agricultural fields. The country's weak flood management system amplifies these risks as the increasing frequency and magnitude of floods are felt as a consequence of climate change, affecting the national economy directly and indirectly. To meet the needs of flood emergencies, this paper presents a remotely sensed data based approach for rapid mapping and monitoring of flood extent and its damages, enabling fast dissemination of information from the local to the national level. In this study, the spatial extent of the flooding caused by the heavy rains of 2014 was mapped using spaceborne data to assess crop damages and the affected population in sixteen districts of Punjab. For this purpose, Moderate Resolution Imaging Spectroradiometer (MODIS) data were used to map the flood extent daily using the Normalised Difference Water Index (NDWI). The maximum flood extent was integrated with LandScan 2014, a 1 km x 1 km grid-based population dataset, to calculate the population within the flood hazard zone. It was estimated that the floods covered an area of 16,870 square kilometres, with 3.0 million people affected. Moreover, to assess the flood damages, Object Based Image Analysis (OBIA) aided with spectral signatures was applied to a Landsat image to obtain thematic layers of healthy (0.54 million acres) and damaged crops (0.43 million acres). The study found that Jhang district was affected the most (28% of its 2.5 million population). In terms of crops, Jhang and Muzaffargarh were the most damaged districts in the 2014 floods in Punjab.
This study was completed within 24 hours of the flood peak, demonstrating an effective methodology for rapid assessment of flood damages. Keywords: flood hazard, space borne data, object based image analysis, rapid damage assessment
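The NDWI thresholding and population overlay can be sketched as follows, using tiny synthetic green/NIR reflectance bands and a hypothetical uniform population grid in place of the MODIS and LandScan data:

```python
import numpy as np

# NDWI = (Green - NIR) / (Green + NIR); positive values flag open water.
# Synthetic 4x4 reflectance bands stand in for MODIS imagery: the right two
# columns mimic water (high green, low NIR), the left two mimic dry land.
green = np.array([[0.10, 0.12, 0.30, 0.32],
                  [0.11, 0.13, 0.31, 0.33],
                  [0.10, 0.12, 0.30, 0.31],
                  [0.09, 0.11, 0.29, 0.30]])
nir   = np.array([[0.40, 0.42, 0.05, 0.06],
                  [0.41, 0.43, 0.05, 0.05],
                  [0.40, 0.41, 0.06, 0.06],
                  [0.39, 0.40, 0.05, 0.05]])

ndwi = (green - nir) / (green + nir)
flood_mask = ndwi > 0.0           # simple threshold; real studies calibrate this

# Affected population: overlay the mask on a population grid (LandScan-style);
# here every cell is assumed to hold 1000 people.
pop_grid = np.full(green.shape, 1000)
affected = int(pop_grid[flood_mask].sum())
print(affected)
```

In the study the same overlay is done per district, so the masked population can be reported as a share of each district's total.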
Procedia PDF Downloads 330
3865 Optimized Real Ground Motion Scaling for Vulnerability Assessment of Building Considering the Spectral Uncertainty and Shape
Authors: Chen Bo, Wen Zengping
Abstract:
Building on the results of previous studies, we focus on real ground motion selection and scaling methods for structural performance-based seismic evaluation using nonlinear dynamic analysis. The input earthquake ground motions should be determined appropriately to make them compatible with the site-specific hazard level considered. Thus, an optimized selection and scaling method is established that uses not only a Monte Carlo simulation method to create a stochastic simulated spectrum based on the multivariate lognormal distribution of the target spectrum, but also a spectral shape parameter. Its application in structural fragility analysis is demonstrated through case studies. Compared to a previous scheme with no consideration of the uncertainty of the target spectrum, the method shown here ensures that the selected records are in good agreement with the median value, standard deviation, and spectral correlation of the target spectrum, and fully reveals the uncertainty of the site-specific hazard level. Meanwhile, it helps improve computational efficiency and matching accuracy. Given the important influence of the target spectrum's uncertainty on structural seismic fragility analysis, this work provides a reasonable and reliable basis for structural seismic evaluation under scenario earthquake environments. Keywords: ground motion selection, scaling method, seismic fragility analysis, spectral shape
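The Monte Carlo step, simulating spectra from a multivariate lognormal target, might look like the following sketch; the periods, medians, dispersions, and correlation matrix are illustrative assumptions, not the study's site-specific hazard values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative target: median spectral accelerations (g) at three periods,
# with log-standard deviations and an assumed inter-period correlation.
periods   = np.array([0.2, 1.0, 2.0])
median_sa = np.array([0.80, 0.35, 0.15])
sigma_ln  = np.array([0.6, 0.7, 0.75])
corr = np.array([[1.0, 0.7, 0.5],
                 [0.7, 1.0, 0.8],
                 [0.5, 0.8, 1.0]])

# Multivariate lognormal simulation: sample in log space, then exponentiate.
cov = np.outer(sigma_ln, sigma_ln) * corr
samples = np.exp(rng.multivariate_normal(np.log(median_sa), cov, size=20000))

# The simulated spectra should reproduce the target median and dispersion.
print(np.median(samples, axis=0).round(2))
print(np.std(np.log(samples), axis=0).round(2))
```

Candidate records would then be selected and scaled to match these simulated spectra rather than the median spectrum alone.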
Procedia PDF Downloads 295
3864 Study on Optimal Control Strategy of PM2.5 in Wuhan, China
Authors: Qiuling Xie, Shanliang Zhu, Zongdi Sun
Abstract:
In this paper, we analyzed the correlation between PM2.5 and five other Air Quality Indices (AQIs) based on the grey relational degree, and built a multivariate nonlinear regression model of PM2.5 against the five monitoring indexes. For the optimal control problem of PM2.5, we took the partially large Cauchy membership function as the satisfaction function. We established a nonlinear programming model with the goal of maximizing the performance-to-price ratio, and the optimal control scheme is given. Keywords: grey relational degree, multiple linear regression, membership function, nonlinear programming
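A grey relational degree computation of the kind described can be sketched as below; the series are synthetic stand-ins for the AQI data, and the distinguishing coefficient rho = 0.5 is the conventional default, not necessarily the paper's choice:

```python
import numpy as np

# Grey relational degree of candidate series (stand-ins for indexes such as
# PM10, SO2, NO2) against a reference series (PM2.5); values are synthetic.
rng = np.random.default_rng(2)
pm25 = rng.uniform(30, 150, size=30)
others = np.vstack([pm25 * f + rng.normal(0, s, 30)
                    for f, s in [(0.8, 5), (0.3, 20), (0.5, 10)]])

def grey_relational_degree(ref, seqs, rho=0.5):
    # Normalize each series to [0, 1] so scales are comparable.
    norm = lambda x: (x - x.min()) / (x.max() - x.min())
    ref_n = norm(ref)
    seqs_n = np.array([norm(s) for s in seqs])
    delta = np.abs(seqs_n - ref_n)                      # absolute differences
    d_min, d_max = delta.min(), delta.max()
    xi = (d_min + rho * d_max) / (delta + rho * d_max)  # relational coefficients
    return xi.mean(axis=1)                              # relational degrees

degrees = grey_relational_degree(pm25, others)
print(degrees.round(3))
```

Series that track the reference closely get degrees near 1; the ranking identifies which indexes are most strongly related to PM2.5.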
Procedia PDF Downloads 301
3863 Learning the Most Common Causes of Major Industrial Accidents and Apply Best Practices to Prevent Such Accidents
Authors: Rajender Dahiya
Abstract:
Investigation outcomes of major process incidents have been consistent for decades and confirm that the causes and consequences are often identical. The question remains why we continue to experience similar process incidents despite the enormous development of new tools, technologies, industry standards, codes, regulations, and learning processes. The objective of this paper is to investigate the most common causes of major industrial incidents and present industry challenges and best practices to prevent such incidents. The author, in his current role, performs audits and inspections of a variety of high-hazard industries in North America, including petroleum refineries, chemicals, petrochemicals, and manufacturing. In this paper, he shares real-life scenarios, examples, and case studies from high-hazard operating facilities, including key challenges and best practices. One case study illustrates the importance of near-miss incident investigation: the incident was a safe operating limit excursion. The case describes deficiencies in management programs, employee competency, and corporate culture, covering hazard identification and risk assessment, maintaining the integrity of safety-critical equipment, operating discipline, learning from process safety near misses, process safety competency, process safety culture, audits, and performance measurement. Failure to identify the hazards and manage the risks of highly hazardous materials and processes is one of the primary root causes of an incident, and failure to learn from past incidents is the leading cause of recurrence. Investigations of several major incidents found that each showed warning signs before occurring and, most importantly, that all were preventable. The author discusses why preventable incidents were not prevented and reviews the common causes of failures to learn from past major incidents.
The leading causes of past incidents are summarized below. First, management failure to identify the hazards and/or mitigate the risks of hazardous processes or materials; this process starts early in the project stage and continues throughout the life cycle of the facility (for example, a poorly executed hazard study such as a HAZID, PHA, or LOPA is one of the leading causes of failure). Second, management failure to maintain the integrity of safety-critical systems and equipment; in most incidents, the mechanical integrity of critical equipment was not maintained, and safety barriers were bypassed, disabled, or not maintained. Third, management failure to learn from and/or apply lessons from past incidents; there were several precursors before those incidents, which were either ignored altogether or not taken seriously. The paper concludes by sharing how a well-implemented operating management system, a good process safety culture, and competent leaders and staff contribute to managing risks and preventing major incidents. Keywords: incident investigation, risk management, loss prevention, process safety, accident prevention
Procedia PDF Downloads 57
3862 Evaluation of Patients’ Quality of Life After Lumbar Disc Surgery and Movement Limitations
Authors: Shirin Jalili, Ramin Ghasemi
Abstract:
Lumbar microdiscectomy is the most commonly performed spinal surgery; it is typically performed to relieve the signs and symptoms of sciatica in the lower back and leg caused by a lumbar disc herniation. This surgery aims to reduce leg pain, restore function, and enable a return to normal daily activities. Rates of lumbar disc surgery show significant geographic variation, suggesting differing treatment criteria among operating surgeons. Few population-based studies have investigated the risk of reoperation after disc surgery, and regional or inter-specialty variations in reoperation rates are unknown. The conventional approach to recovery from lumbar microdiscectomy has been to restrict bending, lifting, or twisting for a minimum of 6 weeks in order to prevent the disc from herniating again. Traditionally, patients were advised to limit postoperative activity, which was believed to reduce the risk of disc herniation and progressive instability. In modern practice, many surgeons do not restrict patients' postoperative activity because this practice is perceived as unnecessary. There is a lack of studies reporting outcomes, by different scores or parameters, after surgery for recurrent disc herniations of the lumbar spine at the initial herniation site. This study will evaluate quality of life after surgical treatment of recurrent herniations with different standardized, validated outcome instruments. Keywords: post-operative activity, disc, quality of life, treatment, movements
Procedia PDF Downloads 80
3861 SVM-Based Modeling of Mass Transfer Potential of Multiple Plunging Jets
Authors: Surinder Deswal, Mahesh Pal
Abstract:
The paper investigates the potential of a support vector machine based regression approach to model the mass transfer capacity of multiple plunging jets, both vertical (θ = 90°) and inclined (θ = 60°). The data set used in this study consists of four input parameters with a total of eighty-eight cases. For testing, tenfold cross-validation was used. Correlation coefficient values of 0.971 and 0.981 (root mean square error values of 0.0025 and 0.0020) were achieved using polynomial and radial basis kernel functions, respectively. The results suggest an improved performance by the radial basis function in comparison to polynomial kernel based support vector machines. The estimated overall mass transfer coefficient, by both kernel functions, is in good agreement with actual experimental values (within a scatter of ±15%), suggesting the utility of the support vector machine based regression approach. Keywords: mass transfer, multiple plunging jets, support vector machines, ecological sciences
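The tenfold cross-validation comparison of polynomial and RBF kernels can be sketched with scikit-learn on synthetic data (the response function and the C and epsilon values here are invented; the study's actual jet measurements and kernel settings are not reproduced):

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.svm import SVR

# Synthetic stand-in for the plunging-jet data set: four inputs, 88 cases,
# and a smooth invented response (not the experimental mass transfer data).
rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(88, 4))
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2] * X[:, 3] \
    + rng.normal(0, 0.02, 88)

cv = KFold(n_splits=10, shuffle=True, random_state=0)  # tenfold cross-validation
results = {}
for kernel in ("poly", "rbf"):
    scores = cross_val_score(SVR(kernel=kernel, C=10.0, epsilon=0.01),
                             X, y, cv=cv, scoring="r2")
    results[kernel] = scores.mean()
print({k: round(v, 3) for k, v in results.items()})
```

Averaging the fold scores gives a single comparable figure per kernel, analogous to the correlation coefficients reported in the abstract.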
Procedia PDF Downloads 464
3860 Assessment of Gamma Radiation Exposure of Soils Associated with Granitic Rocks in Kapıdağ Peninsula, Turkey
Authors: Buket Canbaz Öztürk, N. Füsun Çam, Günseli Yaprak, Osman Candan
Abstract:
External terrestrial radiation exposure is related to the types of rock from which soils originate. Higher radiation levels are associated with igneous rocks, such as granite, and lower levels with sedimentary rocks. Therefore, this study aims to assess the gamma radiation exposure of soils associated with granitic rocks on the Kapıdağ Peninsula, Turkey. In this ongoing study, a comprehensive survey was carried out systematically as part of an environmental monitoring program on the radiological impact of the granitoid areas of Western Anatolia. Activity measurements of the gamma emitters (238U, 232Th, and 40K) in the surface soil samples and the granitic rocks were carried out by means of a NaI(Tl) gamma-ray spectrometry system. To evaluate the radiological hazard of the natural radioactivity, the absorbed dose rate (D), the annual effective dose rate (AED), the radium equivalent activity (Raeq), and the external hazard index (Hex) were calculated according to the UNSCEAR 2000 report. The corresponding absorbed dose rates in air from all natural radionuclides were always much lower than 200 nGy h⁻¹ and did not exceed the typical range of worldwide average values reported by UNSCEAR (2000). Furthermore, the correlation between soil and granitic rock samples was evaluated, and the external gamma radiation exposure distribution was mapped for the Kapıdağ Peninsula. Keywords: external absorbed dose, granitic rocks, Kapıdağ Peninsula, soil
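The four indices can be computed directly from the standard UNSCEAR 2000 coefficients; the example input below uses the worldwide average soil activities (35, 30, and 400 Bq/kg for 238U, 232Th, and 40K) rather than the Kapıdağ measurements:

```python
# Radiological hazard indices from soil activity concentrations (Bq/kg),
# using the standard UNSCEAR 2000 dose coefficients.
def hazard_indices(c_u, c_th, c_k):
    d = 0.462 * c_u + 0.604 * c_th + 0.0417 * c_k  # absorbed dose rate, nGy/h
    aed = d * 8760 * 0.2 * 0.7 * 1e-6              # annual effective dose, mSv/y
                                                   # (8760 h/y, 0.2 occupancy,
                                                   #  0.7 Sv/Gy conversion)
    ra_eq = c_u + 1.43 * c_th + 0.077 * c_k        # radium equivalent, Bq/kg
    h_ex = c_u / 370 + c_th / 259 + c_k / 4810     # external hazard index
    return d, aed, ra_eq, h_ex

# World average soil activities per UNSCEAR 2000.
d, aed, ra_eq, h_ex = hazard_indices(35, 30, 400)
print(round(d, 1), round(aed, 3), round(ra_eq, 1), round(h_ex, 2))
```

Values of Raeq below 370 Bq/kg and Hex below 1 indicate the material is within the commonly used safety limits.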
Procedia PDF Downloads 236
3859 Supervised-Component-Based Generalised Linear Regression with Multiple Explanatory Blocks: THEME-SCGLR
Authors: Bry X., Trottier C., Mortier F., Cornu G., Verron T.
Abstract:
We address component-based regularization of a Multivariate Generalized Linear Model (MGLM). A set of random responses Y is assumed to depend, through a GLM, on a set X of explanatory variables, as well as on a set T of additional covariates. X is partitioned into R conceptually homogeneous blocks X1, ... , XR, viewed as explanatory themes. Variables in each Xr are assumed many and redundant. Thus, Generalised Linear Regression (GLR) demands regularization with respect to each Xr. By contrast, variables in T are assumed selected so as to demand no regularization. Regularization is performed by searching each Xr for an appropriate number of orthogonal components that both contribute to model Y and capture relevant structural information in Xr. We propose a very general criterion to measure structural relevance (SR) of a component in a block, and show how to take SR into account within a Fisher-scoring-type algorithm in order to estimate the model. We show how to deal with mixed-type explanatory variables. The method, named THEME-SCGLR, is tested on simulated data. Keywords: Component-Model, Fisher Scoring Algorithm, GLM, PLS Regression, SCGLR, SEER, THEME
Procedia PDF Downloads 397
3858 Parameter Estimation via Metamodeling
Authors: Sergio Haram Sarmiento, Arcady Ponosov
Abstract:
Based on appropriate multivariate statistical methodology, we suggest a generic framework for efficient parameter estimation in ordinary differential equations and the corresponding nonlinear models. In this framework, classical linear regression strategies are refined into a nonlinear regression by a locally linear modelling technique (known as metamodelling). The approach identifies those latent variables of the given model that accumulate the most information about it among all approximations of the same dimension. The method is applied to several benchmark problems, in particular to the so-called "power-law systems", nonlinear differential equations typically used in Biochemical Systems Theory. Keywords: principal component analysis, generalized law of mass action, parameter estimation, metamodels
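A minimal metamodelling sketch in the spirit described: trajectories of a hypothetical power-law ODE are reduced by PCA, and a locally linear regression on the latent scores recovers the parameter. The system, parameter range, and two-component choice are assumptions for illustration, not the authors' benchmarks:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical power-law system: dx/dt = a * x^0.5 - x, simulated by Euler.
def simulate(a, x0=0.1, dt=0.01, steps=300):
    x, traj = x0, []
    for _ in range(steps):
        x += dt * (a * x ** 0.5 - x)
        traj.append(x)
    return np.array(traj)

# Local design of trajectories around a nominal parameter value.
a_grid = np.linspace(1.8, 2.2, 40)
Y = np.array([simulate(a) for a in a_grid])

# PCA: the leading components are the latent variables of the metamodel.
Yc = Y - Y.mean(axis=0)
U, S, Vt = np.linalg.svd(Yc, full_matrices=False)
scores = Yc @ Vt[:2].T                       # 2 latent coordinates per trajectory

# Locally linear calibration: regress the parameter on the latent scores.
A_design = np.column_stack([np.ones(len(a_grid)), scores])
coef, *_ = np.linalg.lstsq(A_design, a_grid, rcond=None)

# Estimate the parameter of a "new" trajectory (true a = 2.05).
y_new = simulate(2.05)
s_new = (y_new - Y.mean(axis=0)) @ Vt[:2].T
a_hat = coef[0] + s_new @ coef[1:]
print(round(float(a_hat), 3))
```

Because the latent scores vary almost linearly with the parameter over a narrow range, the inverse regression recovers it accurately without ever fitting the full nonlinear model.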
Procedia PDF Downloads 518
3857 Development of Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum
Authors: Abdulrahman Sumayli, Saad M. AlShahrani
Abstract:
For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical to improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques. Temperature, pressure, and feedstock type were considered as the inputs to the models, while hydrogen solubility was the sole response. Specifically, we employed three different models: Support Vector Regression (SVR), Gaussian Process Regression (GPR), and Bayesian Ridge Regression (BRR). To achieve the best performance, the hyperparameters of these models were optimized using the whale optimization algorithm (WOA). We evaluated the models using a dataset of solubility measurements in various feedstocks, and we compared their performance based on several metrics. Our results show that the SVR model tuned with WOA achieves the best performance overall, with an RMSE of 1.38 × 10⁻² and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C–350 °C and 1.2 MPa–10.8 MPa, respectively. Keywords: temperature, pressure variations, machine learning, oil treatment
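One of the three models, Bayesian ridge regression, can be sketched on a synthetic solubility surrogate over the stated temperature and pressure ranges (the functional form, noise level, and train/test split are invented, and no WOA tuning is performed here):

```python
import numpy as np
from sklearn.linear_model import BayesianRidge
from sklearn.metrics import r2_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical solubility surrogate over the abstract's ranges
# (150-350 deg C, 1.2-10.8 MPa); the functional form is invented.
rng = np.random.default_rng(4)
T = rng.uniform(150, 350, 200)
P = rng.uniform(1.2, 10.8, 200)
sol = 0.02 * P + 5e-5 * T * P + rng.normal(0, 0.005, 200)

X = np.column_stack([T, P])
# Degree-2 polynomial features give the linear BRR model the T*P interaction.
model = make_pipeline(PolynomialFeatures(degree=2), BayesianRidge())
model.fit(X[:150], sol[:150])
r2 = r2_score(sol[150:], model.predict(X[150:]))
print(round(r2, 3))
```

In the paper's setup, a metaheuristic such as WOA would search over hyperparameters (and over SVR/GPR alternatives) to maximize exactly this kind of held-out score.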
Procedia PDF Downloads 69
3856 Flash Flood in Gabes City (Tunisia): Hazard Mapping and Vulnerability Assessment
Authors: Habib Abida, Noura Dahri
Abstract:
Flash floods are among the most serious natural hazards, with disastrous environmental and human impacts. They are associated with exceptional rain events characterized by short durations, very high intensities, rapid flows, and small spatial extent. Flash floods happen very suddenly and are difficult to forecast. They generally cause damage to agricultural crops, property, and infrastructure, and may even result in the loss of human lives. The city of Gabes (south-eastern Tunisia) has been exposed to numerous damaging floods because of its mild topography, clay soil, high urbanization rate, and erratic rainfall distribution. The risks associated with this situation are expected to increase further in the future because of climate change, deemed responsible for the increasing frequency and severity of this natural hazard. A major flooding event hit the region on June 2nd, 2014, causing human deaths and major material losses. It resulted in the stagnation of storm water in the numerous low-lying zones of the study area, endangering human health and causing disastrous environmental impacts. The characterization of flood risk in the Gabes watershed (south-eastern Tunisia) is therefore an important step for flood management. The Analytical Hierarchy Process (AHP) method, coupled with Monte Carlo simulation and a geographic information system, was applied to delineate and characterize flood-prone areas. A spatial database was developed based on a geological map, a digital elevation model, land use, and rainfall data in order to evaluate the different factors likely to affect the flood analysis. The results obtained were validated against remote sensing data for the zones that showed very high flood hazard during the extreme rainfall event of June 2014 that hit the study basin.
Moreover, a survey was conducted in different areas of the city in order to understand and explore the different causes of this disaster, its extent, and its consequences. Keywords: analytical hierarchy process, flash floods, Gabes, remote sensing, Tunisia
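The AHP weighting step can be sketched as follows; the pairwise-comparison matrix and the implied criteria are hypothetical (e.g., slope, land use, soil, rainfall as flood-hazard factors), and the Monte Carlo perturbation of judgments used in the study is omitted:

```python
import numpy as np

# AHP: derive factor weights from a pairwise-comparison matrix and check
# consistency; the judgments below are illustrative only.
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # principal eigenvector -> weights

lam_max = eigvals.real[k]
n = A.shape[0]
ci = (lam_max - n) / (n - 1)                 # consistency index
ri = 0.90                                    # Saaty's random index for n = 4
cr = ci / ri                                 # consistency ratio; < 0.10 accepted
print(weights.round(3), round(cr, 3))
```

The weights are then applied to the factor layers in the GIS overlay, and the Monte Carlo step would resample the pairwise judgments to propagate their uncertainty into the hazard map.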
Procedia PDF Downloads 109