Search results for: internal rate of return
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10888

6118 Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

Authors: J. R. Wang, S. W. Chen, Y. Chiang, W. S. Hsu, J. H. Yang, Y. S. Tseng, C. Shih

Abstract:

In this research, the HABIT analysis methodology was established for Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. To evaluate the control room habitability under the CO2 storage burst, the HABIT methodology was used to perform this analysis. The HABIT result was below the R.G. 1.78 failure criteria. This indicates that Maanshan NPP habitability can be maintained. Additionally, the sensitivity study of the parameters (wind speed, atmospheric stability classification, air temperature, and control room intake flow rate) was also performed in this research.
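The sensitivity parameters named above (wind speed, atmospheric stability, intake flow) are standard inputs to atmospheric dispersion estimates. As a hedged illustration only — this is a generic Gaussian-plume sketch, not HABIT's actual model, whose internals the abstract does not describe — the ground-level centreline concentration scales inversely with wind speed; the spread coefficients below are rough neutral-stability (class D) placeholder fits:

```python
import math

def plume_centerline_conc(q_release, wind_speed, x):
    """Ground-level centreline concentration of a Gaussian plume,
    C = Q / (pi * u * sigma_y * sigma_z).
    Coefficients are illustrative neutral-stability fits, not HABIT's."""
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)   # horizontal spread (m)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)   # vertical spread (m)
    return q_release / (math.pi * wind_speed * sigma_y * sigma_z)

# Lower wind speed -> less dilution -> higher concentration at the
# control-room intake, which is why wind speed enters the sensitivity study.
c_slow = plume_centerline_conc(q_release=1.0, wind_speed=2.0, x=100.0)
c_fast = plume_centerline_conc(q_release=1.0, wind_speed=5.0, x=100.0)
```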

Keywords: PWR, HABIT, Habitability, Maanshan

Procedia PDF Downloads 436
6117 Using HABIT to Estimate the Concentration of CO2 and H2SO4 for Kuosheng Nuclear Power Plant

Authors: Y. Chiang, W. Y. Li, J. R. Wang, S. W. Chen, W. S. Hsu, J. H. Yang, Y. S. Tseng, C. Shih

Abstract:

In this research, the HABIT code was used to estimate the concentration under the CO2 and H2SO4 storage burst conditions for Kuosheng nuclear power plant (NPP). The Final Safety Analysis Report (FSAR) and reports were used in this research. In addition, to evaluate the control room habitability for these cases, the HABIT analysis results were compared with the R.G. 1.78 failure criteria. The comparison results show that the HABIT results are below the criteria. Additionally, some sensitivity studies (stability classification, wind speed and control room intake rate) were performed in this study.

Keywords: BWR, HABIT, habitability, Kuosheng

Procedia PDF Downloads 484
6116 Humanitarian Emergency of the Refugee Condition for Central American Immigrants in Irregular Situation

Authors: María de los Ángeles Cerda González, Itzel Arriaga Hurtado, Pascacio José Martínez Pichardo

Abstract:

In Mexico, recognition of refugee status is a fundamental right that the host State is obliged to respect, protect, and fulfil for foreigners, including immigrants in an irregular situation, who cannot return to their country of origin for humanitarian reasons. Recognition of refugee status as a fundamental right under the Mexican legal system arises in three situations: 1. The immigrant applies for refugee status but lacks the evidence needed to prove the humanitarian character of the departure from the country of origin. 2. The immigrant does not apply for recognition because he does not know he has the right to do so, even though he fits the profile. 3. The immigrant applies, fulfils the requirements of the administrative procedure, and obtains recognition of refugee status. Of these three situations, only the last is reflected in national refugee-status statistics; the first two reveal the inefficiency of the governmental system, a consequence of its lack of sensitivity stemming from the absence of human rights education, which leaves immigrants in an irregular situation legally vulnerable because they lack access to the administration of justice. To determine the causes and consequences of the non-recognition of refugee status, this investigation was structured as a systemic analysis whose objective is to show the progress of research on the Central American humanitarian emergency, the Mexican State's actions to protect, respect, and fulfil the fundamental right to refugee status of immigrants in an irregular situation, and the social and legal vulnerabilities suffered by Central Americans in Mexico.
Therefore, to deduce the legal nature of the humanitarian emergency from human rights as a branch of public international law, a conceptual framework is structured using the inductive-deductive method. The problem statement proceeds from a legal framework to a theoretical scheme under the theory of social systems, analysing the lack of communication between the governmental and normative subsystems of the Mexican legal system with respect to the process undertaken by Central American immigrants to obtain recognition of refugee status as a human right. Accordingly, it is determined that fulfilling the State's obligation to grant recognition of refugee status would mark a new stage in Mexican law, because it would extend constitutional protections to everyone whose right to refugee recognition has been denied, and a major advance in human rights would thereby be achieved.

Keywords: central American immigrants in irregular situation, humanitarian emergency, human rights, refugee

Procedia PDF Downloads 285
6115 Liver and Liver Lesion Segmentation From Abdominal CT Scans

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and of liver lesions is a major first step in the computer-aided diagnosis of liver diseases, and precise liver segmentation in abdominal CT images is one of its most important prerequisites. In this paper, a semi-automated method for segmenting the liver and liver lesions in medical image data is presented, based on mathematical morphology. Our algorithm proceeds in two parts. In the first, we determine the region of interest by applying morphological filters to extract the liver; the second step detects the liver lesions. The proposed method relies on anatomical information and on the mathematical morphology tools used in the image processing field. First, we improve the quality of the original image and of the image gradient by applying a spatial filter followed by morphological filters. We then calculate the internal and external markers of the liver and of the hepatic lesions, and finally segment the liver and hepatic lesions with the watershed transform controlled by markers. The developed algorithm was validated on several images, and the results obtained show its good performance.
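The marker-controlled watershed step described above can be sketched with SciPy's morphology tools. This toy example is not the paper's pipeline (no CT data, no liver-specific marker computation); it only demonstrates how an internal and an external marker steer the watershed transform:

```python
import numpy as np
from scipy import ndimage

# A bright square "ridge" separates an inner region from the background.
# Marker 2 (internal) sits inside the ridge, marker 1 (external) outside,
# standing in for the liver/lesion markers computed in the real method.
image = np.zeros((7, 7), dtype=np.uint8)
image[1:6, 1] = image[1:6, 5] = 1   # vertical ridge walls
image[1, 1:6] = image[5, 1:6] = 1   # horizontal ridge walls

markers = np.zeros((7, 7), dtype=np.int8)
markers[0, 0] = 1                   # external (background) marker
markers[3, 3] = 2                   # internal (object) marker

# The watershed floods from each marker and meets at the ridge.
labels = ndimage.watershed_ift(image, markers)
```

Every pixel ends up assigned to one of the two markers; the ridge is where the two floods meet.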

Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm

Procedia PDF Downloads 445
6114 Effects of Supplementation of Nano-Particle Zinc Oxide and Mannan-Oligosaccharide (MOS) on Growth, Feed Utilization, Fatty Acid Profile, Intestinal Morphology, and Hematology in Nile tilapia, Oreochromis niloticus (L.) fry

Authors: Tewodros Abate Alemayehu, Abebe Getahun, Akewake Geremew, Dawit Solomon Demeke, John Recha, Dawit Solomon, Gebremedihin Ambaw, Fasil Dawit Moges

Abstract:

The purpose of this study was to examine the effects of supplementation of zinc oxide (ZnO) nanoparticles and mannan-oligosaccharide (MOS) on the growth performance, feed utilization, fatty acid profiles, hematology, and intestinal morphology of Chamo strain Nile tilapia Oreochromis niloticus (L.) fry reared at optimal temperature (28.62 ± 0.11 ⁰C). Nile tilapia fry (initial weight 1.45 ± 0.01 g) were fed a basal/control diet (Diet-T1), a 6 g kg-¹ MOS supplemented diet (Diet-T2), a 4 mg ZnO-NPs supplemented diet (Diet-T3), a 4 mg ZnO-Bulk supplemented diet (Diet-T4), a combination of 6 g kg-¹ MOS and 4 mg ZnO-Bulk (Diet-T5), and a combination of 6 g kg-¹ MOS and 4 mg ZnO-NPs (Diet-T6). Duplicate aquaria were randomly assigned to each diet, and fish were hand-fed to apparent satiation three times daily (08:00, 12:00, and 16:00) for 12 weeks. Fish fed the MOS, ZnO-NPs, and combined MOS and ZnO-Bulk supplemented diets had higher weight gain, daily growth rate (DGR), and specific growth rate (SGR) than fish fed the basal diet and the other feeding groups, although the effect was not significant. According to the GC analysis, Nile tilapia supplemented with 6 g kg-¹ MOS, 4 mg ZnO-NPs, or a combination of ZnO-NPs and MOS showed the highest contents of EPA and DHA and higher PUFA/SFA ratios than the other feeding groups. Mean villi length in the proximal and middle portions of the Nile tilapia intestine was affected significantly (p<0.05) by diet. Fish fed Diet-T2 and Diet-T3 had significantly longer villi in the proximal and middle portions of the intestine compared to the other feeding groups. The inclusion of additives significantly improved goblet cell numbers in the proximal, middle, and distal portions of the intestine. Supplementation of additives also improved some hematological parameters compared with the control group.
In conclusion, dietary supplementation of additives MOS and ZnO-NPs could confer benefits on growth performance, fatty acid profiles, hematology, and intestinal morphology of Chamo strain Nile tilapia.
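The growth metrics named above (DGR, SGR) have standard definitions. A small sketch using the abstract's initial weight (1.45 g) and 12-week duration, but with a purely hypothetical final weight chosen for illustration:

```python
import math

def specific_growth_rate(w_initial, w_final, days):
    """SGR (% body weight per day) = 100 * (ln Wf - ln Wi) / days."""
    return 100.0 * (math.log(w_final) - math.log(w_initial)) / days

def daily_growth_rate(w_initial, w_final, days):
    """DGR (g/day) = (Wf - Wi) / days."""
    return (w_final - w_initial) / days

# 1.45 g and 12 weeks come from the abstract; the 12.0 g final weight
# is hypothetical, used only to exercise the formulas.
sgr = specific_growth_rate(1.45, 12.0, 12 * 7)
dgr = daily_growth_rate(1.45, 12.0, 12 * 7)
```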

Keywords: chamo strain nile tilapia, fatty acid profile, hematology, intestinal morphology, MOS, ZnO-Bulk, ZnO-NPs

Procedia PDF Downloads 70
6113 Quantifying Automation in the Architectural Design Process via a Framework Based on Task Breakdown Systems and Recursive Analysis: An Exploratory Study

Authors: D. M. Samartsev, A. G. Copping

Abstract:

As with all industries, architects are using increasing amounts of automation within practice, with approaches such as generative design and the use of AI becoming more commonplace. However, the discourse on the rate at which the architectural design process is being automated is often subjective and lacking in objective figures and measurements. This results in confusion and creates barriers to effective discourse on the subject, in turn limiting the ability of architects, policy makers, and members of the public to make informed decisions in the area of design automation. This paper proposes the use of a framework to quantify the progress of automation within the design process. The use of a reductionist analysis of the design process allows it to be quantified in a manner that enables direct comparison across different times, locations, and projects. The methodology is informed by the design of this framework – taking on aspects of a systematic review but compressed in time to allow an initial set of data to verify the validity of the framework. Such a framework of quantification enables various practical uses, such as predicting which tasks in the architectural industry will be automated, as well as making more informed decisions on automation at multiple levels, ranging from individual decisions to policy making by governing bodies such as the RIBA. This is achieved by analyzing the design process as a generic task that needs to be performed, then using the principles of work breakdown systems to split the task of designing an entire building into smaller tasks, which can then be recursively split further as required. Each task is then assigned a series of milestones that allow for the objective analysis of its automation progress.
By combining these two approaches it is possible to create a data structure that describes how much various parts of the architectural design process are automated. The data gathered in the paper serves the dual purposes of providing the framework with validation, as well as giving insights into the current situation of automation within the architectural design process. The framework can be interrogated in many ways and preliminary analysis shows that almost 40% of the architectural design process has been automated in some practical fashion at the time of writing, with the rate at which progress is made slowly increasing over the years, with the majority of tasks in the design process reaching a new milestone in automation in less than 6 years. Additionally, a further 15% of the design process is currently being automated in some way, with various products in development but not yet released to the industry. Lastly, various limitations of the framework are examined in this paper as well as further areas of study.
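The recursive work-breakdown idea described above can be sketched directly: leaf tasks carry a milestone-based automation score, and parent tasks aggregate weighted child scores. The tree, weights, and scores below are invented for illustration and are not the paper's data:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A node in a work-breakdown tree. For leaves, `automation` in [0, 1]
    is the milestone score; internal nodes aggregate their children by
    weight. All names, weights, and scores here are hypothetical."""
    name: str
    weight: float = 1.0
    automation: float = 0.0
    subtasks: list = field(default_factory=list)

    def score(self) -> float:
        if not self.subtasks:
            return self.automation
        total = sum(t.weight for t in self.subtasks)
        return sum(t.weight * t.score() for t in self.subtasks) / total

design = Task("design building", subtasks=[
    Task("concept design", weight=1, subtasks=[
        Task("massing studies", automation=0.6),
        Task("site analysis", automation=0.4),
    ]),
    Task("technical design", weight=2, subtasks=[
        Task("drawing production", automation=0.8),
        Task("code compliance checks", automation=0.2),
    ]),
])
overall = design.score()   # weighted automation fraction of the whole process
```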

Keywords: analysis, architecture, automation, design process, technology

Procedia PDF Downloads 102
6112 Attributable Mortality of Nosocomial Infection: A Nested Case Control Study in Tunisia

Authors: S. Ben Fredj, H. Ghali, M. Ben Rejeb, S. Layouni, S. Khefacha, L. Dhidah, H. Said

Abstract:

Background: The Intensive Care Unit (ICU) provides continuous care and uses a high level of treatment technologies. Although developed country hospitals allocate only 5–10% of beds to critical care areas, approximately 20% of nosocomial infections (NI) occur among patients treated in ICUs, while in developing countries the situation is less well documented. The aim of our study is to assess mortality rates in ICUs and to determine their predictive factors. Methods: We carried out a nested case-control study in a 630-bed public tertiary care hospital in Eastern Tunisia. We included in the study all patients hospitalized for more than two days in the surgical or medical ICU during the entire period of the surveillance. Cases were patients who died before ICU discharge, whereas controls were patients who survived to discharge. NIs were diagnosed according to the definitions of the ‘Comité Technique des Infections Nosocomiales et les Infections Liées aux Soins’ (CTINLIS, France). Data collection was based on the protocol of Rea-RAISIN 2009 of the National Institute for Health Watch (InVS, France). Results: Overall, 301 patients were enrolled from the medical and surgical ICUs. The mean age was 44.8 ± 21.3 years. The crude ICU mortality rate was 20.6% (62/301). It was 35.8% for patients who acquired at least one NI during their ICU stay and 16.2% for those without any NI, yielding an overall crude excess mortality rate of 19.6% (OR = 2.9, 95% CI, 1.6 to 5.3). The population-attributable fraction due to ICU-NI in patients who died before ICU discharge was 23.46% (95% CI, 13.43%–29.04%). Overall, 62 case patients were compared to 239 control patients for the final analysis.
Case patients and control patients differed by age (p=0.003), simplified acute physiology score II (p<0.001), NI (p<0.001), nosocomial pneumonia (p=0.008), infection upon admission (p=0.002), immunosuppression (p=0.006), days of intubation (p<0.001), tracheostomy (p=0.004), days with urinary catheterization (p<0.001), days with CVC (p=0.03), and length of stay in the ICU (p=0.003). Multivariate analysis identified the following factors: age older than 65 years (OR 5.78 [95% CI 2.03-16.05], p=0.001), duration of intubation 1-10 days (OR 6.82 [95% CI 1.90-24.45], p=0.003), duration of intubation > 10 days (OR 11.11 [95% CI 2.85-43.28], p=0.001), duration of CVC 1-7 days (OR 6.85 [95% CI 1.71-27.45], p=0.007), and duration of CVC > 7 days (OR 5.55 [95% CI 1.70-18.04], p=0.004). Conclusion: While surveillance provides important baseline data, successful trials with more active intervention protocols adopting a multimodal approach to the prevention of nosocomial infection prompted us to consider the feasibility of a similar trial in our context. The implementation of an efficient infection control strategy is therefore a crucial step to improve the quality of care.
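Two of the abstract's crude figures can be reproduced from the reported mortality rates alone. A quick check (rates from the abstract, formulas standard):

```python
# ICU mortality was 35.8% with nosocomial infection (NI), 16.2% without.
p_ni, p_no = 0.358, 0.162

# Crude excess mortality: difference of the two rates (abstract: 19.6%).
excess = p_ni - p_no

# Crude odds ratio: ratio of the odds of death (abstract: OR = 2.9).
odds_ratio = (p_ni / (1 - p_ni)) / (p_no / (1 - p_no))
```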

Keywords: intensive care unit, mortality, nosocomial infection, risk factors

Procedia PDF Downloads 403
6111 Efficiency and Scale Elasticity in Network Data Envelopment Analysis: An Application to International Tourist Hotels in Taiwan

Authors: Li-Hsueh Chen

Abstract:

Efficient operation is increasingly important for hotel managers. Unlike the manufacturing industry, hotels cannot store their products. In addition, many hotels provide room service and food and beverage service simultaneously. When the efficiencies of hotels are evaluated, their internal structure should be considered. Hence, based on the operational characteristics of hotels, this study proposes a DEA model to simultaneously assess the efficiencies of the room production division, food and beverage production division, room service division, and food and beverage service division. However, not only the enhancement of efficiency but also the adjustment of scale can improve performance. In terms of the adjustment of scale, scale elasticity or returns to scale can help managers make decisions concerning expansion or contraction. In order to construct a reasonable approach to measuring the efficiencies and scale elasticities of hotels, this study builds an alternative variable-returns-to-scale-based two-stage network DEA model combining parallel and series structures to explore the scale elasticities of the whole system, the room production division, the food and beverage production division, the room service division, and the food and beverage service division, based on data from the international tourist hotel industry in Taiwan. The results may provide valuable information on operational performance and scale for managers and decision makers.
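The paper's model is a two-stage network DEA under variable returns to scale, which is considerably more involved than can be shown here. As a much simpler reference point, a single-stage input-oriented CCR (constant-returns) efficiency score can be computed with one linear program per decision-making unit (DMU); the toy hotel data are invented:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, k):
    """Input-oriented CCR efficiency of DMU k (a plain single-stage DEA
    sketch; the paper's network model adds divisional links and a
    VRS convexity constraint not shown here).

        min theta  s.t.  X @ lam <= theta * x_k,  Y @ lam >= y_k,  lam >= 0
    """
    X = np.asarray(inputs, float)    # shape (n_inputs, n_dmus)
    Y = np.asarray(outputs, float)   # shape (n_outputs, n_dmus)
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                          # minimise theta
    A_in = np.hstack([-X[:, [k]], X])                    # X lam - theta x_k <= 0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # -Y lam <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

# Three toy hotels: one input (staff), one output (room revenue).
eff_best = ccr_efficiency([[2, 4, 8]], [[2, 2, 4]], k=0)  # best ratio -> 1.0
eff_mid = ccr_efficiency([[2, 4, 8]], [[2, 2, 4]], k=1)   # half the best ratio
```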

Keywords: efficiency, scale elasticity, network data envelopment analysis, international tourist hotel

Procedia PDF Downloads 220
6110 Effects of Using a Recurrent Adverse Drug Reaction Prevention Program on Safe Use of Medicine among Patients Receiving Services at the Accident and Emergency Department of Songkhla Hospital Thailand

Authors: Thippharat Wongsilarat, Parichat Tuntilanon, Chonlakan Prataksitorn

Abstract:

Recurrent adverse drug reactions are harmful to patients, causing mild to fatal illness, and affect not only patients but also their relatives and organizations. The aim was to compare safe use of medicine among patients before and after using the recurrent adverse drug reaction prevention program. This was quasi-experimental research with a target population of 598 patients with a drug allergy history. Data were collected through an observation form tested for its validity by three experts (IOC = 0.87) and analyzed with descriptive statistics (percentages). The research was conducted jointly with a multidisciplinary team to analyze and determine the weak and strong points of the recurrent adverse drug reaction prevention system during the past three years, in which 546, 329, and 498 incidences, respectively, were found. Of these, 379, 279, and 302 incidences, or 69.4, 84.80, and 60.64 percent of the patients with drug allergy history, respectively, were found to have been caused by an incomplete warning system. In addition, differences in practice in caring for patients with drug allergy history were found that did not cover all steps of the patient care process, especially a lack of repeated checking and a lack of communication between multidisciplinary team members. Therefore, the recurrent adverse drug reaction prevention program was developed with complete warning points in the information technology system, a repeated checking step, and communication among the related multidisciplinary team members, starting from the hospital identity card room, patient history recording officers, nurses, the physicians who prescribe the drugs, and pharmacists. Included in the system were surveillance, nursing, recording, and linking the data to referring units.
There was also training by pharmacists concerning adverse drug reactions, monthly meetings to explain the process to practice personnel, creation of a safety culture, random checking of practice, motivational encouragement, supervising, controlling, following up, and evaluating the practice. The rate of prescribing drugs to which patients were allergic was 0.08 per 1,000 prescriptions, and the incidence rate of recurrent drug reactions was 0 per 1,000 prescriptions. Surveillance of recurrent adverse drug reactions covering all service providing points can ensure safe use of medicine for patients.
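The repeated-checking idea — every handover point re-screens the prescription against the same patient allergy record — can be sketched as a small function that each station (registration, physician, pharmacist) calls independently. Drug names and the cross-reaction map below are illustrative only, not a clinical reference:

```python
def check_prescription(prescription, allergy_history, cross_reactions=None):
    """Return the drugs in `prescription` that should trigger a warning,
    either because they appear in the allergy history or because a known
    cross-reacting drug does. Purely illustrative data model."""
    cross_reactions = cross_reactions or {}
    allergies = set(allergy_history)
    flagged = []
    for drug in prescription:
        related = {drug} | set(cross_reactions.get(drug, []))
        if related & allergies:
            flagged.append(drug)
    return flagged

# Hypothetical example: a penicillin-allergic patient prescribed a
# cross-reacting beta-lactam plus an unrelated analgesic.
alerts = check_prescription(
    ["amoxicillin", "paracetamol"],
    allergy_history=["penicillin"],
    cross_reactions={"amoxicillin": ["penicillin"]},
)
```

Because the function is stateless, every station in the care process can run the same check, which is the "repeated checking" the program adds.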

Keywords: recurrent drug, adverse reaction, safety, use of medicine

Procedia PDF Downloads 451
6109 Yawning Computing Using Bayesian Networks

Authors: Serge Tshibangu, Turgay Celik, Zenzo Ncube

Abstract:

Road crashes kill more than a million people every year and leave millions more injured or permanently disabled. Various annual reports reveal that the percentage of fatal crashes due to fatigue or the driver falling asleep comes directly after the percentage of fatal crashes due to intoxicated drivers, and is higher than the combined percentage of fatal crashes due to illegal/unsafe U-turns and illegal/unsafe reversing. Although a relatively small percentage of police reports on road accidents highlights drowsiness and fatigue, the importance of these factors is greater than we might think, hidden by the undercounting of such events. Some scenarios show that these factors are significant in accidents with killed and injured people. Hence the need for an automatic driver fatigue detection system in order to considerably reduce the number of accidents owing to fatigue. This research approaches the driver fatigue detection problem in an innovative way by combining cues collected from both temporal analysis of drivers’ faces and the environment. Monotony in the driving environment is inter-related with visual symptoms of fatigue on drivers’ faces to achieve fatigue detection. Optical and infrared (IR) sensors are used to analyse the monotony in the driving environment and to detect the visual symptoms of fatigue on the human face. Internal cues from drivers’ faces and external cues from the environment are combined using machine learning algorithms to automatically detect fatigue.
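One simple way to combine an internal cue (yawning) with an external cue (road monotony) probabilistically is a naive-Bayes update. This is a sketch of the fusion idea only, not the authors' actual Bayesian network, whose structure and probabilities the abstract does not give; all numbers below are invented:

```python
def posterior_fatigue(prior, likelihoods, evidence):
    """Posterior P(fatigue | cues) under a naive-Bayes assumption
    (cues independent given the fatigue state).
    likelihoods[cue] = (P(cue | fatigued), P(cue | alert))."""
    p_f, p_a = prior, 1.0 - prior
    for cue, present in evidence.items():
        l_f, l_a = likelihoods[cue]
        if not present:
            l_f, l_a = 1.0 - l_f, 1.0 - l_a
        p_f *= l_f
        p_a *= l_a
    return p_f / (p_f + p_a)

likelihoods = {
    "yawning": (0.7, 0.1),          # internal cue from the face camera
    "monotonous_road": (0.6, 0.3),  # external cue from environment sensors
}
p_both = posterior_fatigue(0.1, likelihoods,
                           {"yawning": True, "monotonous_road": True})
p_none = posterior_fatigue(0.1, likelihoods,
                           {"yawning": False, "monotonous_road": False})
```

Observing both cues raises the fatigue posterior well above the prior; observing neither pushes it below.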

Keywords: intelligent transportation systems, bayesian networks, yawning computing, machine learning algorithms

Procedia PDF Downloads 452
6108 In silico Model of Transamination Reaction Mechanism

Authors: Sang-Woo Han, Jong-Shik Shin

Abstract:

ω-Transaminase (ω-TA) is broadly used for synthesizing chiral amines with high enantiopurity. However, the reaction mechanism of ω-TA has not been well studied, in contrast to α-transaminases (α-TA) such as AspTA. Here, we propose an in silico model of the reaction mechanism of ω-TA. Based on the modeling results, which showed large free energy gaps between the external aldimine and the quinonoid on deamination (or between the ketimine and the quinonoid on amination), withdrawal of the Cα-H appears to be the critical step determining the reaction rate in both the amination and deamination reactions, which is consistent with previous research. Hyperconjugation was also observed in both the external aldimine and the ketimine, which weakens the Cα-H bond and facilitates Cα-H abstraction.
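The link between a free-energy gap and the reaction rate runs through transition-state theory. A sketch using the Eyring equation with placeholder barriers (not the paper's computed values) shows how a modest increase in the barrier suppresses the rate by orders of magnitude, which is why the step with the largest gap is rate-limiting:

```python
import math

# Eyring equation: k = (kB*T / h) * exp(-dG_activation / (R*T)).
KB = 1.380649e-23       # Boltzmann constant, J/K
H = 6.62607015e-34      # Planck constant, J*s
R = 8.314462618         # gas constant, J/(mol*K)

def eyring_rate(dg_kj_per_mol, temp=298.15):
    """First-order rate constant (1/s) from a free-energy barrier."""
    return (KB * temp / H) * math.exp(-dg_kj_per_mol * 1e3 / (R * temp))

# Hypothetical barriers: 60 vs 80 kJ/mol; not the paper's values.
k_small_gap = eyring_rate(60.0)
k_large_gap = eyring_rate(80.0)
```

A 20 kJ/mol larger barrier slows the step by roughly three orders of magnitude at room temperature.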

Keywords: computational modeling, reaction intermediates, ω-transaminase, in silico model

Procedia PDF Downloads 540
6107 Comparison of Two Strategies in Thoracoscopic Ablation of Atrial Fibrillation

Authors: Alexander Zotov, Ilkin Osmanov, Emil Sakharov, Oleg Shelest, Aleksander Troitskiy, Robert Khabazov

Abstract:

Objective: Thoracoscopic surgical ablation of atrial fibrillation (AF) can be performed with two technologies: the first strategy uses the AtriCure device (bipolar, non-irrigated, non-clamping), the second the Medtronic device (bipolar, irrigated, clamping). The study presents a comparative analysis of clinical outcomes of the two strategies in thoracoscopic ablation of AF using the AtriCure vs. Medtronic devices. Methods: In a two-center study, 123 patients underwent thoracoscopic ablation of AF in the period from 2016 to 2020. Patients were divided into two groups: the first group comprises patients treated with the AtriCure device (N=63) and the second group those treated with the Medtronic device (N=60). Patients were comparable in age, gender, and initial severity of the condition. Group 1 was 65% male with a median age of 57 years, group 2 75% and 60 years, respectively. Group 1 included patients with a paroxysmal form (14.3%), persistent form (68.3%), and long-standing persistent form (17.5%); in group 2 the proportions were 13.3%, 13.3%, and 73.3%, respectively. Median ejection fraction and indexed left atrial volume were 63% and 40.6 ml/m2 in group 1, and 56% and 40.5 ml/m2 in group 2. In addition, group 1 consisted of 39.7% patients with chronic heart failure (NYHA Class II) and 4.8% with chronic heart failure (NYHA Class III), versus 45% and 6.7% in group 2. Follow-up consisted of laboratory tests, chest X-ray, ECG, 24-hour Holter monitoring, and cardiopulmonary exercise testing. Duration of freedom from AF, the distant mortality rate, and the prevalence of cerebrovascular events were compared between the two groups. Results: Exit block was achieved in all patients. According to the Clavien-Dindo classification of surgical complications, the fraction of adverse events was 14.3% in the 1st group and 16.7% in the 2nd group.

The mean follow-up period was 50.4 (31.8; 64.8) months in the 1st group and 30.5 (14.1; 37.5) months in the 2nd group (P=0.0001). In group 1, total freedom from AF was achieved in 73.3% of patients, of whom 25% had additional antiarrhythmic drug (AAD) therapy or catheter ablation (CA); in group 2, 90% and 18.3%, respectively (for total freedom from AF, P<0.02). At follow-up, the distant mortality rate was 4.8% in the 1st group, with no fatal events in the 2nd. The prevalence of cerebrovascular events was higher in the 1st group than in the 2nd (6.7% vs. 1.7%). Conclusions: Despite the relatively shorter follow-up of the 2nd group, the strategy using the Medtronic device showed quite encouraging results. Further research is needed to evaluate the effectiveness of this strategy in the long-term period.

Keywords: atrial fibrillation, clamping, ablation, thoracoscopic surgery

Procedia PDF Downloads 106
6106 The Impact of Environmental Management System ISO 14001 Adoption on Firm Performance

Authors: Raymond Treacy, Paul Humphreys, Ronan McIvor, Trevor Cadden, Alan McKittrick

Abstract:

This study employed event study methodology to examine the role of institutions, resources, and dynamic capabilities in the relationship between Environmental Management System ISO 14001 adoption and firm performance. Utilising financial data from 140 ISO 14001 certified firms and 320 non-certified firms, the results of the study suggested that the UK and Irish manufacturers were not implementing ISO 14001 solely to gain legitimacy. In contrast, the results demonstrated that firms were fully integrating the ISO 14001 standard within their operations, as certified firms were able to improve both financial and operating performance when compared to non-certified firms. However, while there were significant and long-lasting improvements in employee productivity, manufacturing cost efficiency, return on assets, and sales turnover, the sample firms' operating cycle and fixed asset efficiency displayed evidence of diminishing returns in the long run, underlining the observation that no operating advantage based on incremental improvements can be everlasting. Hence, there is an argument for investing in dynamic capabilities that help renew and refresh the resource base and help the firm adapt to changing environments. Indeed, the results of the regression analysis suggest that dynamic capabilities for innovation acted as a moderator in the relationship between ISO 14001 certification and firm performance. This, in turn, will have a significant and symbiotic influence on sustainability practices within the participating organisations. The study not only provides new and original insights but demonstrates pragmatically how firms can take advantage of environmental management systems as a moderator to significantly enhance firm performance. However, while it was shown that firm innovation aided both short-term and long-term ROA performance, adaptive market capabilities only aided firms in the short term, at the marketing strategy deployment stage.
Finally, the results have important implications for firms operating in an economic recession, as they suggest that firms should scale back investment in R&D while operating in an economic downturn. Conversely, under normal trading conditions, consistent long-term investment in R&D was found to moderate the relationship between ISO 14001 certification and firm performance. Hence, the results of the study have important implications for academics and management alike.
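For readers unfamiliar with event study methodology, the core computation is a market-model fit over an estimation window followed by abnormal returns over the event window. A minimal sketch on synthetic data (not the study's sample of certified firms):

```python
import numpy as np

def abnormal_returns(firm_ret, market_ret, est_window, event_window):
    """Market-model event study: fit r_firm = a + b*r_mkt on the
    estimation window, then AR_t = r_firm - (a + b*r_mkt) over the
    event window; CAR is the sum of the ARs."""
    b, a = np.polyfit(market_ret[est_window], firm_ret[est_window], 1)
    ar = firm_ret[event_window] - (a + b * market_ret[event_window])
    return ar, ar.sum()

# Synthetic firm that follows the market model exactly in the estimation
# window, then shifts up by 0.5% per day after a hypothetical event.
rng = np.random.default_rng(0)
mkt = rng.normal(0.0, 0.01, 120)
firm = 0.001 + 1.2 * mkt
firm[100:] += 0.005                     # post-event abnormal performance
ar, car = abnormal_returns(firm, mkt, slice(0, 100), slice(100, 120))
```

The cumulative abnormal return recovers the injected 0.5%-per-day effect over the 20-day event window.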

Keywords: supply chain management, environmental management systems, quality management, sustainability, firm performance

Procedia PDF Downloads 305
6105 Efficient Use of Energy through Incorporation of a Gas Turbine in Methanol Plant

Authors: M. Azadi, N. Tahouni, M. H. Panjeshahi

Abstract:

A techno-economic evaluation of the efficient use of energy in a large-scale industrial methanol plant is carried out. The assessment is based on the integration of a gas turbine with an existing methanol plant, in which the outlet gas from the exothermic reactor is expanded for power generation. The methanol production rate is kept constant after the addition of the power generation system to the existing plant. Having incorporated a gas turbine into the existing plant, the economic results showed a total investment of MUSD 16.9 and energy savings of 3.6 MUSD/yr, with a payback period of approximately 4.7 years.
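The quoted payback period follows directly from the investment and saving figures; a one-line check of the abstract's arithmetic:

```python
def simple_payback_years(investment_musd, annual_saving_musd):
    """Undiscounted payback period in years."""
    return investment_musd / annual_saving_musd

# Figures from the abstract: MUSD 16.9 investment, 3.6 MUSD/yr saving.
payback = simple_payback_years(16.9, 3.6)   # ~4.7 years
```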

Keywords: energy saving, methanol, gas turbine, power generation

Procedia PDF Downloads 463
6104 Forecasting Regional Data Using Spatial VARs

Authors: Taisiia Gorshkova

Abstract:

Since the 1980s, spatial correlation models have been used increasingly often to model regional indicators. An increasingly popular method for studying regional indicators is modeling that takes into account spatial relationships between objects belonging to the same economic zone. In the 2000s, a new class of models, spatial vector autoregressions, was developed. The main difference between standard and spatial vector autoregressions is that in a spatial VAR (SpVAR), the values of indicators at time t may depend on the values of explanatory variables at the same time t in neighboring regions and on the values of explanatory variables at time t-k in neighboring regions. Thus, the VAR is a special case of the SpVAR in the absence of spatial lags, and the spatial panel data model is a special case of the spatial VAR in the absence of time lags. Two specifications of SpVAR were applied to Russian regional data for 2000-2017. The values of GRP and the regional CPI are used as endogenous variables. The lags of GRP, CPI, and the unemployment rate were used as explanatory variables. For comparison purposes, a standard VAR without spatial correlation was used as the “naïve” model. In the first specification of the SpVAR, the unemployment rate and the values of the dependent variables, GRP and CPI, in neighboring regions at the same moment of time t were included in the equations for GRP and CPI, respectively. To account for the values of indicators in neighboring regions, an adjacency weight matrix is used, in which regions with a common sea or land border are assigned a value of 1, and the rest 0. In the second specification, the values of the dependent variables in neighboring regions at time t were replaced by their values at the previous time moment t-1.

According to the results obtained, when the inflation and GRP of neighbors are added to the model, both inflation and GRP are significantly affected by their previous values; inflation is also positively affected by an increase in unemployment in the previous period and negatively affected by an increase in GRP in the previous period, which corresponds to economic theory. GRP is affected by neither the inflation lag nor the unemployment lag. When the model takes into account lagged values of GRP and inflation in neighboring regions, the results of inflation modeling are practically unchanged: all indicators except the unemployment lag are significant at the 5% significance level. For GRP, in turn, GRP lags in neighboring regions also become significant at the 5% significance level. The RMSE was calculated for both the spatial and the “naïve” VARs; the minimum RMSE is obtained with the SpVAR with lagged explanatory variables. Thus, according to the results of the study, it can be concluded that SpVARs can accurately model both the actual values of macro indicators (particularly CPI and GRP) and the general situation in the regions.
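The SpVAR structure described above — own lags plus spatially weighted neighbor lags — can be sketched on synthetic data. The three-region adjacency matrix, coefficients, and noise level below are invented; the point is only the estimation mechanics (OLS equation by equation on the stacked own and spatial lags):

```python
import numpy as np

# SpVAR(1) with one spatial lag: y_t = A y_{t-1} + rho * W y_{t-1} + e_t,
# where W is a row-normalised adjacency matrix (1 for a shared border).
rng = np.random.default_rng(42)
W = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], float)
W /= W.sum(axis=1, keepdims=True)       # row-normalise neighbor weights

A_true = np.diag([0.5, 0.4, 0.3])       # own-lag coefficients (synthetic)
rho = 0.2                                # spatial spillover (synthetic)
B_true = A_true + rho * W                # combined own + spatial lag matrix

T = 2000
y = np.zeros((T, 3))
for t in range(1, T):
    y[t] = B_true @ y[t - 1] + rng.normal(0, 0.1, 3)

# OLS equation by equation: regress y_t on y_{t-1} (own and neighbours).
X, Y = y[:-1], y[1:]
B_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
```

With enough observations the estimated lag matrix recovers both the own-lag diagonal and the spatial spillover terms.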

Keywords: forecasting, regional data, spatial econometrics, vector autoregression

Procedia PDF Downloads 137
6103 Establishment of a Nomogram Prediction Model for Postpartum Hemorrhage during Vaginal Delivery

Authors: Yinglisong, Jingge Chen, Jingxuan Chen, Yan Wang, Hui Huang, Jing Zhang, Qianqian Zhang, Zhenzhen Zhang, Ji Zhang

Abstract:

Purpose: The study aims to establish a nomogram prediction model for postpartum hemorrhage (PPH) in vaginal delivery. Patients and Methods: Clinical data were retrospectively collected from vaginal delivery patients admitted to a hospital in Zhengzhou, China, from June 1, 2022 to October 31, 2022. Univariate and multivariate logistic regression were used to filter out independent risk factors. A nomogram model for PPH in vaginal delivery was established based on the risk factor coefficients. Bootstrapping was used for internal validation. To assess discrimination and calibration, receiver operating characteristic (ROC) and calibration curves were generated in the derivation and validation groups. Results: A total of 1340 cases of vaginal delivery were enrolled, 81 (6.04%) of which had PPH. Logistic regression indicated that history of uterine surgery, induction of labor, duration of the first stage of labor, neonatal weight, WBC value (during the first stage of labor), and cervical lacerations were all independent risk factors for hemorrhage (P < 0.05). The areas under the curve (AUC) of the ROC curves of the derivation and validation groups were 0.817 and 0.821, respectively, indicating good discrimination. Two calibration curves showed that nomogram predictions and practical results were highly consistent (P = 0.105, P = 0.113). Conclusion: The developed individualized risk prediction nomogram model can assist midwives in recognizing and diagnosing high-risk groups for PPH and initiating early warning to reduce PPH incidence.
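Once the logistic coefficients behind a nomogram are fitted, scoring a new patient is a direct evaluation of the inverse-logit. The sketch below is purely illustrative: the coefficient values are hypothetical placeholders (the abstract does not report the fitted values), and only the mechanics of turning risk factors into a predicted PPH probability are shown:

```python
import math

# Hypothetical coefficients for the six reported risk factors. The abstract does
# not publish the fitted values, so these numbers are placeholders chosen only
# to demonstrate how a nomogram score maps to a predicted probability.
COEF = {
    "intercept": -6.0,
    "uterine_surgery_history": 0.9,  # yes = 1, no = 0
    "labor_induction": 0.6,          # yes = 1, no = 0
    "first_stage_hours": 0.10,       # per hour of first-stage labor
    "neonatal_weight_kg": 0.8,       # per kg
    "wbc_1e9_per_l": 0.12,           # per 1e9/L
    "cervical_laceration": 1.3,      # yes = 1, no = 0
}

def pph_probability(patient):
    """Inverse-logit of the linear predictor; omitted factors count as 0."""
    z = COEF["intercept"] + sum(COEF[k] * v for k, v in patient.items())
    return 1.0 / (1.0 + math.exp(-z))

p = pph_probability({
    "uterine_surgery_history": 1,
    "labor_induction": 1,
    "first_stage_hours": 8,
    "neonatal_weight_kg": 3.6,
    "wbc_1e9_per_l": 12,
    "cervical_laceration": 0,
})
print(round(p, 3))  # predicted PPH risk for this hypothetical patient
```

A printed nomogram performs exactly this arithmetic graphically: each axis converts one risk factor into points proportional to its coefficient, and the total-points axis maps back through the inverse-logit.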

Keywords: vaginal delivery, postpartum hemorrhage, risk factor, nomogram

Procedia PDF Downloads 70
6102 Legal Contestation of Non-Legal Norms: The Case of Humanitarian Intervention Norm between 1999 and 2018

Authors: Nazli Ustunes Demirhan

Abstract:

Norms of any nature are subject to pressures of change throughout their lifespans, as they are interpreted and re-interpreted every time they are used rhetorically or practically by international actors. The inevitable contestation of different interpretations may lead to erosion of a norm, as well as to its strengthening. This paper aims to question the role of formal legality in the change of norm strength, using a norm contestation framework and a multidimensional conceptualization of norm strength. It argues that the role of legality is not necessarily linked to the formal legal characteristics of a norm, but is about the legality of the contestation processes. In order to demonstrate this argument, the paper examines the evolutionary path of the humanitarian intervention norm as a case study. Humanitarian intervention, as a norm with very weak formal legal characteristics, has been subject to numerous cycles of contestation, demonstrating a fluctuating pattern of norm strength. With the purpose of examining the existence and role of legality in the selected contestation periods from 1999 to 2017, this paper uses the process-tracing method with a detailed document analysis of Security Council documents, including decisions, resolutions, meeting minutes, and press releases, as well as individual country statements. Through the empirical analysis, it is demonstrated that the legality of the contestation processes has a positive effect on at least the authoritativeness dimension of norm strength. This study tries to contribute to the developing dialogue between the international relations (IR) and international law (IL) disciplines with its better-tuned understanding of legality. It connects to further questions at the IR/IL nexus relating to the added value of norm legality and the politics of legalization, as well as better international policies for norm reinforcement.

Keywords: humanitarian intervention, legality, norm contestation, norm dynamics, responsibility to protect

Procedia PDF Downloads 140
6101 CFD Simulation of Forced Convection Nanofluid Heat Transfer in the Automotive Radiator

Authors: Sina Movafagh, Younes Bakhshan

Abstract:

Heat transfer of the coolant flow through automobile radiators is of great importance for the optimization of fuel consumption. In this study, the heat transfer performance of an automobile radiator is evaluated numerically. Different concentrations of nanofluids, obtained by the addition of Al2O3 nano-particles to water, have been investigated. The effect of the inlet temperature of the nanofluid on the performance of the radiator is also studied. Results show that with an increase in inlet temperature, the outlet temperature and the pressure drop along the radiator increase. It has also been observed that an increase in nano-particle concentration results in an increase in the heat transfer rate within the radiator.
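A common first-order sanity check for such simulations is an effective-medium estimate of the nanofluid's thermal conductivity. The sketch below applies Maxwell's classical model; the abstract does not state which property model the CFD study used, and the material constants are typical handbook values rather than the paper's inputs:

```python
# Maxwell's effective-medium model for the thermal conductivity of a dilute
# suspension -- a standard first-order estimate for nanofluids. The constants
# below are typical handbook values, not taken from the paper.
K_WATER = 0.613   # W/(m*K), base fluid at ~25 C
K_AL2O3 = 40.0    # W/(m*K), Al2O3 nanoparticles (approximate)

def k_maxwell(phi, k_f=K_WATER, k_p=K_AL2O3):
    """Effective conductivity at particle volume fraction phi."""
    num = k_p + 2 * k_f + 2 * phi * (k_p - k_f)
    den = k_p + 2 * k_f - phi * (k_p - k_f)
    return k_f * num / den

for phi in (0.01, 0.02, 0.04):
    print(phi, round(k_maxwell(phi), 4))  # conductivity rises with loading
```

The monotone increase of effective conductivity with volume fraction is consistent with the reported increase in heat transfer rate at higher nano-particle concentrations.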

Keywords: heat transfer, nanofluid, car radiator, CFD simulation

Procedia PDF Downloads 299
6100 Physical Tests on Localized Fluidization in Offshore Suction Bucket Foundations

Authors: Li-Hua Luu, Alexis Doghmane, Abbas Farhat, Mohammad Sanayei, Pierre Philippe, Pablo Cuellar

Abstract:

Suction buckets are promising innovative foundations for offshore wind turbines. They generally feature the shape of an inverted bucket and rely on a suction system as the driving agent for their installation into the seabed. Water is pumped out of the buckets, which are initially placed to rest on the seabed, creating a net pressure difference across the lid that generates a seepage flow, lowers the soil resistance below the foundation skirt, and drives the buckets effectively into the seabed. The stability of the suction mechanism, as well as the possibility of a piping failure (i.e., localized fluidization within the internal soil plug) during installation, are some of the key questions that remain open. The present work deals with an experimental study of localized fluidization by suction within a fixed bucket partially embedded in a submerged artificial soil made of spherical beads. The transient process, from the onset of granular motion until a stationary regime of fluidization at the embedded bucket wall is reached, is recorded using the combined optical techniques of planar laser-induced fluorescence and refractive index matching. To conduct a systematic study of the piping threshold for the seepage flow, we vary the bead size, the suction pressure, and the initial depth of the bucket. This experimental modelling, by dealing with erosion-related phenomena from a micromechanical perspective, shall provide qualitative scenarios for the local processes at work, which have so far been missing in offshore practice.
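The piping threshold that such experiments probe is classically estimated with Terzaghi's critical hydraulic gradient. The snippet below is a back-of-the-envelope sketch with illustrative bead properties (the specific gravity and porosity are assumptions, not measurements from the experiments described above):

```python
# Terzaghi's critical hydraulic gradient for the onset of piping/fluidization:
#   i_c = (G_s - 1) / (1 + e)
# where G_s is the grain specific gravity and e the void ratio. The values
# below are illustrative for spherical glass beads, not the paper's data.
def critical_gradient(G_s, e):
    return (G_s - 1.0) / (1.0 + e)

n = 0.38               # porosity of a loose random packing of spheres (assumed)
e = n / (1.0 - n)      # void ratio from porosity
print(round(critical_gradient(2.5, e), 3))  # -> 0.93 for these assumed beads
```

When the upward seepage gradient inside the soil plug exceeds this value, effective stress vanishes and localized fluidization can begin, which is the regime the optical measurements are designed to capture.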

Keywords: fluidization, micromechanical approach, offshore foundations, suction bucket

Procedia PDF Downloads 178
6099 Micro-Milling Process Development of Advanced Materials

Authors: M. A. Hafiz, P. T. Matevenga

Abstract:

Micro-level machining of metals is a developing field which has proven to be a promising approach to producing features on parts in the range of a few to a few hundred microns with acceptable machining quality. It is known that the mechanics (i.e., the material removal mechanism) of micro-machining and conventional machining differ significantly due to the scaling effects associated with tool geometry, tool material, and workpiece material characteristics. Shape memory alloys (SMAs) are metal alloys which display two exceptional properties: pseudoelasticity and the shape memory effect (SME). Nickel-titanium (NiTi) alloys are one such family of unique metal alloys. NiTi alloys are known to be difficult to cut, specifically using conventional machining techniques, due to their distinctive properties. Their high ductility, high degree of strain hardening, and unusual stress-strain behaviour are the main properties accountable for their poor machinability in terms of tool wear and workpiece quality. The motivation of this research work was to address the challenges and issues of micro-machining combined with those of machining NiTi alloy, which can affect the desired performance level of machining outputs. To explore the significance of a range of cutting conditions on surface roughness and tool wear, machining tests were conducted on NiTi. The influence of different cutting conditions and cutting tools on surface and sub-surface deformation in the workpiece was investigated. A design-of-experiments strategy (L9 array) was applied to determine the key process variables, and the dominant cutting parameters were determined by analysis of variance. These findings showed that feed rate was the dominant factor for surface roughness, whereas depth of cut was found to be the dominant factor as far as tool wear was concerned. The lowest surface roughness was achieved at a feed rate equal to the cutting edge radius, whereas the lowest flank wear was observed at the lowest depth of cut. Repeated machining trials have yet to be carried out in order to observe tool life, sub-surface deformation, and strain-induced hardening, which are also expected to be among the critical issues in micro-machining of NiTi. Machining performance using different cutting fluids and strategies has yet to be studied.
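The L9 design-of-experiments analysis mentioned above can be illustrated with a simple range (main-effect) analysis. The response values below are invented, constructed so that the feed-rate column dominates, mirroring the reported finding; only the bookkeeping of a Taguchi-style analysis is shown:

```python
# Range (main-effect) analysis on a Taguchi L9(3^4) design -- a minimal sketch
# of how the dominant factor is identified. Columns: feed level, depth-of-cut
# level, speed level (1..3). The roughness responses are invented so that the
# feed-rate factor dominates, as in the paper's ANOVA result.
L9 = [
    (1, 1, 1), (1, 2, 2), (1, 3, 3),
    (2, 1, 2), (2, 2, 3), (2, 3, 1),
    (3, 1, 3), (3, 2, 1), (3, 3, 2),
]
response = [0.21, 0.23, 0.22, 0.35, 0.37, 0.36, 0.52, 0.55, 0.53]  # Ra, um

def factor_ranges(design, y):
    """For each factor, the range of the mean response across its levels."""
    ranges = []
    for col in range(len(design[0])):
        means = [sum(y[i] for i, row in enumerate(design) if row[col] == lvl) / 3
                 for lvl in (1, 2, 3)]
        ranges.append(max(means) - min(means))
    return ranges

print(factor_ranges(L9, response))  # largest range -> dominant factor (feed)
```

A full Taguchi analysis would follow the same level-mean bookkeeping with ANOVA sums of squares, but the ranking of factors is already visible from the ranges.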

Keywords: nickel titanium, micro-machining, surface roughness, machinability

Procedia PDF Downloads 338
6098 ‘Undressed Star’, Sexual Scenes and Discourses in Mass Media: Exploring 1980s Taiwan Female Film Stars’ Onscreen Erotic Acting

Authors: Xinchen Zhu

Abstract:

In the history of Chinese-language film, female stars’ acting is connected with issues of national ideology, consumerism, and sexual politics. In the 1980s, Taiwan entered a period of ‘soft authoritarianism’ in which the economy prospered, politics became more democratic, and mass culture became more diverse. Film censorship became more flexible, and sexual scenes were increasingly shown on screen. Female stars’ bodies were eroticized and commercialized through sexual and nude scenes and, by challenging conservative film censorship and social taboos, became the focus of mass media. This article explores how discourses in mass media constructed the erotic images of female stars and, conversely, impacted film censorship, filmmakers, and film actresses in 1980s Taiwan. It regards the eroticized female film stars’ acting as a ‘field’ of internal interaction and continuous reproduction, where the ideology of male dominance and the voices of female film stars conflict with each other. Based on textual analysis of female stars’ sexual acting and the debate in mass media, the argument is that the eroticized female bodies were gazed upon both on and off the screen. In the discourses of mass media, the artistry of actresses’ erotic acting was not only ignored, devalued, and delegitimized; these stars were also labelled ‘undressed stars’ or ‘nude stars’ and construed as victims of the film industry. However, the female stars were able to speak through mass media platforms, emphasizing their efforts in erotic acting and highlighting modern female subjectivity.

Keywords: sexual scenes, Taiwan female stars, erotic acting, discourses in mass media, female subjectivity

Procedia PDF Downloads 182
6097 GynApp: A Mobile Application for the Organization and Control of Gynecological Studies

Authors: Betzabet García-Mendoza, Rocío Abascal-Mena

Abstract:

Breast and cervical cancer are among the leading causes of death of women in Mexico. The mortality rate for these diseases is alarming; even though there have been many campaigns to make people aware of the importance of undergoing gynecological studies for timely prevention and detection, these have not been enough. This paper presents a mobile application for organizing and controlling gynecological studies in order to help and encourage women to take care of their bodies and health. The process of analyzing and designing the mobile application is presented, along with all the steps carried out by following a user-centered design methodology.

Keywords: breast cancer, cervical cancer, gynecological mobile application, paper prototyping, storyboard, women health

Procedia PDF Downloads 302
6096 Breast Cancer Incidence Estimation in Castilla-La Mancha (CLM) from Mortality and Survival Data

Authors: C. Romero, R. Ortega, P. Sánchez-Camacho, P. Aguilar, V. Segur, J. Ruiz, G. Gutiérrez

Abstract:

Introduction: Breast cancer is a leading cause of death in CLM (2.8% of all deaths in women and 13.8% of deaths from tumors in women). It is the tumor with the highest incidence in the CLM region, accounting for 26.1% of all tumors excluding nonmelanoma skin cancer (Cancer Incidence in Five Continents, Volume X, IARC). Cancer registries are a good information source for estimating cancer incidence; however, their data are usually available with a lag, which makes them difficult for health managers to use. By contrast, mortality and survival statistics have less delay. In order to serve resource planning and respond to this problem, a method is presented to estimate incidence from mortality and survival data. Objectives: To estimate the incidence of breast cancer by age group in CLM in the period 1991-2013, and to compare the data obtained from the model with current incidence data. Sources: Annual number of women by single year of age (National Statistics Institute). Annual number of deaths from all causes and from breast cancer (Mortality Registry CLM). Breast cancer relative survival probabilities (EUROCARE, Spanish registry data). Methods: A parametric Weibull survival model was fitted to the EUROCARE data. From the survival model, the population data, and the mortality data, the Mortality and Incidence Analysis MODel (MIAMOD) regression model was obtained to estimate cancer incidence by age (1991-2013). Results: The resulting model is: I(x,t) = Logit[const + age1·x + age2·x² + coh1·(t − x) + coh2·(t − x)²], where I(x,t) is the incidence at age x in year t, and the parameter estimates are: const (constant term in the model) = -7.03; age1 = 3.31; age2 = -1.10; coh1 = 0.61; and coh2 = -0.12. It is estimated that 662 cases of breast cancer were diagnosed in CLM in 1991 (81.51 per 100,000 women) and 1,152 cases (112.41 per 100,000 women) in 2013, representing an increase of 40.7% in the gross incidence rate (1.9% per year). The annual average increases in incidence by age were: 2.07% in women aged 25-44 years, 1.01% (45-54 years), 1.11% (55-64 years), and 1.24% (65-74 years). Cancer registries in Spain that send data to the IARC declared for 2003-2007 an average annual incidence rate of 98.6 cases per 100,000 women; our model yields an incidence of 100.7 cases per 100,000 women. Conclusions: A sharp and steady increase in the incidence of breast cancer in the period 1991-2013 is observed. The increase was seen in all age groups considered, although it is more pronounced in young women (25-44 years). With this method a good estimate of the incidence can be obtained.
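The reported age-cohort model can be evaluated directly once the covariate scaling is fixed. The sketch below plugs the published parameter estimates into the inverse-logit form; note that the abstract does not specify how age and cohort are transformed before entering the polynomial (MIAMOD typically rescales them), so the inputs here are treated as already-transformed covariates and only the functional form is demonstrated:

```python
import math

# Published parameter estimates from the abstract; the scaling of age x and
# cohort (t - x) before entering the polynomial is not stated, so the arguments
# below are treated as already-transformed covariates (an assumption).
COEF = {"const": -7.03, "age1": 3.31, "age2": -1.10, "coh1": 0.61, "coh2": -0.12}

def incidence(x, c):
    """Inverse-logit of the age-cohort polynomial; returns a rate in (0, 1)."""
    eta = (COEF["const"] + COEF["age1"] * x + COEF["age2"] * x ** 2
           + COEF["coh1"] * c + COEF["coh2"] * c ** 2)
    return 1.0 / (1.0 + math.exp(-eta))

print(incidence(1.0, 0.5))  # rate at transformed age 1.0, transformed cohort 0.5
```

The quadratic age term (negative age2) makes the incidence curve rise and then flatten with age, while the cohort terms capture the steady drift across birth cohorts that produces the 1.9% annual increase reported above.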

Keywords: breast cancer, incidence, cancer registries, castilla-la mancha

Procedia PDF Downloads 306
6095 Discrimination during a Resume Audit: The Impact of Job Context in Hiring

Authors: Alexandra Roy

Abstract:

Building on the literature on cognitive matching and social categorization, and using the correspondence testing method, we test the interaction effect of personal characteristics (gender with physical attractiveness) and job context (client contact, industry status, coworker contact). As expected, while the findings show a strong impact of gender with beauty on hiring chances, job context characteristics also have a significant overall effect on this hiring outcome. Moreover, the rate of positive responses varies according to some of the recruiter’s characteristics. The results are robust to various sensitivity checks. Implications of the results, limitations of the study, and directions for future research are discussed.

Keywords: correspondence testing, discrimination, hiring, physical attractiveness

Procedia PDF Downloads 203
6094 Algorithm Development of Individual Lumped Parameter Modelling for Blood Circulatory System: An Optimization Study

Authors: Bao Li, Aike Qiao, Gaoyang Li, Youjun Liu

Abstract:

Background: The lumped parameter model (LPM) is a common numerical model for hemodynamic calculation. An LPM uses circuit elements to simulate the human blood circulatory system, and physiological indicators and characteristics can be acquired through the model. However, because physiological indicators differ between individuals, the parameters in an LPM should be personalized so that the calculated results are convincing and reflect individual physiological information. This study aimed to develop an automatic and effective optimization method to personalize the parameters in an LPM of the blood circulatory system, which is of great significance for the numerical simulation of individual hemodynamics. Methods: A closed-loop LPM of the human blood circulatory system, applicable to most individuals, was established based on anatomical structures and physiological parameters. The patient-specific physiological data of 5 volunteers were collected non-invasively as the personalization objectives of the individual LPMs. In this study, the blood pressure and flow rate of the heart, brain, and limbs were the main concerns. The collected systolic blood pressure, diastolic blood pressure, cardiac output, and heart rate were set as objective data, and the waveforms of carotid artery flow and ankle pressure were set as objective waveforms. For the collected data and waveforms, a sensitivity analysis of each parameter in the LPM was conducted to determine the sensitive parameters that have an obvious influence on the objectives. Simulated annealing was adopted to iteratively optimize the sensitive parameters, with the objective function during optimization being the root mean square error between the collected and simulated waveforms and data. Each parameter in the LPM was optimized over 500 iterations. Results: In this study, the sensitive parameters in the LPM were optimized according to the collected data of the 5 individuals. The results show only a slight error between the collected and simulated data: the average relative root mean square errors of all optimization objectives for the 5 samples were 2.21%, 3.59%, 4.75%, 4.24%, and 3.56%, respectively. Conclusions: The small errors demonstrate the good performance of the optimization. The individual modeling algorithm developed in this study can effectively achieve the individualization of an LPM for the blood circulatory system. An LPM with individual parameters can output the individual physiological indicators after optimization, which are applicable to the numerical simulation of patient-specific hemodynamics.
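The simulated-annealing loop described in the Methods can be sketched with a toy stand-in for the LPM. Here a two-parameter waveform plays the role of the circulatory model and a synthetic target plays the role of the measured waveforms; the annealing schedule, step sizes, and model form are illustrative choices, not the paper's settings:

```python
import math
import random

random.seed(1)

# Toy stand-in for the LPM: a two-parameter waveform. In the paper the "model"
# is the full circulatory LPM and the target is the measured carotid-flow and
# ankle-pressure waveforms; this stand-in only illustrates the optimizer.
def model(params, t):
    a, b = params
    return a * math.sin(t) + b * math.cos(2 * t)

target_params = (1.5, 0.7)
ts = [i * 0.1 for i in range(100)]
target = [model(target_params, t) for t in ts]

def rmse(params):
    return math.sqrt(sum((model(params, t) - y) ** 2
                         for t, y in zip(ts, target)) / len(ts))

# Simulated annealing over the two "sensitive parameters".
params = [0.0, 0.0]
cost = rmse(params)
T = 1.0
for step in range(500):
    cand = [p + random.gauss(0, 0.1) for p in params]  # random neighbour
    c = rmse(cand)
    # accept improvements always, worse moves with Boltzmann probability
    if c < cost or random.random() < math.exp(-(c - cost) / T):
        params, cost = cand, c
    T *= 0.99  # geometric cooling schedule

print(cost)  # final RMSE, far below the initial value
```

The real problem differs only in scale: the candidate step perturbs every sensitive LPM parameter, and each cost evaluation requires solving the circulatory circuit equations rather than evaluating a closed-form waveform.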

Keywords: blood circulatory system, individual physiological indicators, lumped parameter model, optimization algorithm

Procedia PDF Downloads 133
6093 The Evaluation of the Impact of Tobacco Heating System and Conventional Cigarette Smoking on Self Reported Oral Symptoms (Dry Mouth, Halitosis, Burning Sensation, Taste Changes) and Salivary Flow Rate: A Cross-sectional Study

Authors: Ella Sever, Irena Glažar, Ema Saltović

Abstract:

Conventional cigarette smoking is associated with an increased risk of oral diseases and oral symptoms such as dry mouth, bad breath, burning sensation, and changes in taste sensation. The harmful effects of conventional cigarette smoking on oral health have been extensively studied previously. However, there is a severe lack of studies investigating the effects of the Tobacco Heating System (THS) on oral structures. The THS has been developed as an alternative; according to the manufacturer, it has fewer potentially harmful and harmful constituents and, consequently, a lower risk of developing tobacco-related diseases. The aim is to analyze the effects of conventional cigarettes and THS on salivary flow rate (SFR) and self-reported oral symptoms. The stratified cross-sectional study included 90 subjects divided into three groups: THS smokers, conventional cigarette smokers, and nonsmokers. The subjects completed questionnaires on smoking habits and symptoms (dry mouth, bad breath, burning sensation, and changes in taste sensation). An SFR test was performed on each subject. Lifetime exposure to smoking was calculated using the Brinkman index (BI). Participants were 20-55 years old (median 31), and 66.67% were female. The study included three groups of equal size (n = 20), and no statistically significant differences were found between the groups in terms of age (p = 0.632), sex (p = 1.0), or lifetime exposure to smoking (BI) (p = 0.129). Participants had an average of 10 (2-30) years of smoking experience in the conventional cigarette group and 6 (1-20) years in the THS group. Daily consumption was the same for both smoker groups (12 (2-20) cigarettes/heets per day). Self-reported symptoms were present in 40% of participants in the smoker groups. There were significant differences in the presence of halitosis (p = 0.025) and taste sensation (p = 0.013). There were no statistical differences in the presence of dry mouth (p = 0.416) or burning sensation (p = 0.7). The SFR differed between groups (p < 0.001) and was significantly lower in the THS and conventional cigarette smoker groups than in the nonsmoker group; there was no significant difference between THS smokers and conventional cigarette smokers. The results of the study show that THS products have an effect on oral cavity structures similar to that of conventional cigarettes, especially in terms of SFR, self-reported halitosis, and changes in taste.

Keywords: oral health, tobacco products, halitosis, cigarette smoking

Procedia PDF Downloads 53
6092 Enhancing Solar Fuel Production by CO₂ Photoreduction Using Transition Metal Oxide Catalysts in Reactors Prepared by Additive Manufacturing

Authors: Renata De Toledo Cintra, Bruno Ramos, Douglas Gouvêa

Abstract:

There is huge global concern over the emission of greenhouse gases, the consequent environmental problems, and the increase in the average temperature of the planet, caused mainly by fossil fuels, of which petroleum derivatives represent a large part. One of the main greenhouse gases, in terms of volume, is CO₂. Recovering part of this product through chemical reactions that use sunlight as an energy source, and even producing renewable fuel (such as ethane, methane, and ethanol, among others), is a great opportunity. Artificial photosynthesis, the conversion of CO₂ and H₂O into organic products and oxygen using a metallic oxide catalyst under sunlight, is one of the promising solutions; this research is therefore of great relevance. For this reaction to take place efficiently, an optimized reactor was developed through simulation and prior analysis, so that the geometry of the internal channel provides an efficient route and allows the reaction to happen in a controlled and optimized way, in continuous flow, while offering the least possible resistance. The prototype of this reactor can be made of different materials, such as polymers, ceramics, and metals, and manufactured through different processes, such as additive manufacturing (3D printing) and CNC machining, among others. To carry out the photocatalysis in the reactors, different types of catalysts will be used, such as ZnO deposited by spray pyrolysis on the illumination window, possibly modified ZnO, TiO₂, and modified TiO₂, among others, aiming to increase the production of organic molecules with the lowest possible energy input.

Keywords: artificial photosynthesis, CO₂ reduction, photocatalysis, photoreactor design, 3D printed reactors, solar fuels

Procedia PDF Downloads 77
6091 In vitro Protein Folding and Stability Using Thermostable Exoshells

Authors: Siddharth Deshpande, Nihar Masurkar, Vallerinteavide Mavelli Girish, Malan Desai, Chester Drum

Abstract:

Folding and stabilization of recombinant proteins remain a consistent challenge for industrial and therapeutic applications. Proteins derived from thermophilic bacteria often have superior expression and stability qualities. To develop a generalizable approach to protein folding and stabilization, we tested the hypothesis that wrapping a thermostable exoshell around a protein substrate would aid folding and impart thermostable qualities to the internalized substrate. To test the effect of internalizing a protein within a thermostable exoshell (tES), we assessed in vitro folding and stability using green fluorescent protein (GFPuv), horseradish peroxidase (HRP), and Renilla luciferase (rLuc). The 8 nm interior volume of a thermostable ferritin assembly was engineered to accommodate foreign proteins and to present either a positive, neutral, or negative interior charge environment. We further engineered the tES complex to reversibly assemble and disassemble with pH titration. Template proteins were expressed as inclusion bodies, and an in vitro folding protocol was developed that forced proteins to fold inside a single tES. Functional yield was improved 100-fold, 100-fold, and 150-fold with use of the tES for GFPuv, HRP, and rLuc, respectively, and was highly dependent on the internal charge environment of the tES. After folding, functional proteins could be released from the tES folding cavity using size exclusion chromatography at pH 5.8. Internalized proteins were tested for improved stability against thermal, organic, urea, and guanidine denaturation. Our results demonstrate that thermostable exoshells can efficiently refold and stabilize inactive aggregates into functional proteins.

Keywords: thermostable shell, in vitro folding, stability, functional yield

Procedia PDF Downloads 240
6090 Horizontal Cooperative Game Theory in Hotel Revenue Management

Authors: Ririh Rahma Ratinghayu, Jayu Pramudya, Nur Aini Masruroh, Shi-Woei Lin

Abstract:

This research studies pricing strategy in a cooperative setting of a hotel duopoly selling a perishable product under a fixed capacity constraint, from the perspective of managers. In hotel revenue management, the competitor’s average room rate and occupancy rate should be taken into consideration when determining a pricing strategy that generates optimum revenue. This information is not provided by business intelligence and is not available on the competitor’s website, so Information Sharing (IS) among players might result in an improved pricing strategy. IS is widely adopted in the logistics industry, but IS within the hospitality industry has not been well studied. This research treats IS as one of the cooperative game schemes, alongside a Mutual Price Setting (MPS) scheme. In the off-peak season, hotel managers arrange pricing strategies offering promotion packages and various kinds of discounts of up to 60% off the full price to attract customers. A competitor selling a homogeneous product will react in the same way, triggering a price war. A price war, which generates lower revenue, may be avoided by collaborating on pricing strategy to optimize the payoff for both players. In the MPS cooperative game, players collaborate to set a room rate applied by both players; a cooperative game may thus avoid the unfavorable payoffs caused by a price war. Research on horizontal cooperative games in logistics shows better performance and payoffs for the players; however, horizontal cooperative games in hotel revenue management have not been demonstrated. This paper aims to develop hotel revenue management models under duopoly cooperative schemes (IS and MPS), which are compared to models under a non-cooperative scheme. Each scheme has five models: a Capacity Allocation Model, a Demand Model, a Revenue Model, an Optimal Price Model, and an Equilibrium Price Model. The Capacity Allocation and Demand Models employ the hotel's own and the competitor’s full and discounted prices as predictors, under a non-linear relation. The optimal price is obtained by assuming a revenue-maximization motive, and the equilibrium price is observed by interacting the hotel's own and the competitor’s optimal prices through reaction equations; the equilibrium is analyzed using a game theory approach. This sequence applies to all three schemes, except that the MPS scheme instead aims to optimize the total players' payoff. The case study in which the theoretical models are applied observes two hotels offering a homogeneous product in Indonesia during one year. The Capacity Allocation, Demand, and Revenue Models are built using multiple regression and statistically tested for validation. The case study data confirm that price behaves within the demand model in a non-linear manner, and the IS models represent the actual demand and revenue data better than the non-IS models. Furthermore, IS enables the hotels to earn significantly higher revenue; thus, duopoly hotel players in general might have reasonable incentives to share information horizontally. During the off-peak season, the MPS models are able to predict the optimal equal price for both hotels; however, a Nash equilibrium may not always exist, depending on the actual payoff of adhering to or betraying the mutual agreement. To optimize performance, a horizontal cooperative game may be chosen over a non-cooperative game. The mathematical models can also be used to detect collusion among business players, and empirical testing can serve as policy input for market regulators in preventing unethical business practices that potentially harm social welfare.
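The contrast between the non-cooperative equilibrium and the mutual-price-setting outcome can be illustrated with a deliberately simplified linear-demand duopoly. Everything below is an assumption made for illustration: the paper's own demand models are non-linear regressions on full and discounted prices, so neither the demand form nor the numbers are the paper's results:

```python
# Minimal sketch: Nash equilibrium via best-response iteration vs. the mutual
# price (MPS). Linear demand d_i = a - b*p_i + c*p_j is assumed purely for
# illustration, with illustrative parameters satisfying b > c > 0.
a, b, c = 100.0, 1.0, 0.5

# Non-cooperative case: maximizing r_i = p_i * (a - b*p_i + c*p_j) gives the
# reaction equation p_i* = (a + c*p_j) / (2b); iterate it to the fixed point.
p1 = p2 = 50.0
for _ in range(200):
    p1, p2 = (a + c * p2) / (2 * b), (a + c * p1) / (2 * b)

# MPS case: one common price p maximizing total revenue 2*p*(a - (b - c)*p),
# which gives p_mps = a / (2 * (b - c)).
p_mps = a / (2 * (b - c))

print(round(p1, 2), round(p_mps, 2))  # cooperative price exceeds the Nash price
```

The mutual price is higher than the Nash price precisely because unilateral undercutting (the price war described above) is ruled out by the agreement, which is also why each player has a payoff incentive to betray it, so a Nash equilibrium of the repeated game need not sustain the MPS price.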

Keywords: horizontal cooperative game theory, hotel revenue management, information sharing, mutual price setting

Procedia PDF Downloads 286
6089 Technical and Economical Feasibility Analysis of Solar Water Pumping System - Case Study in Iran

Authors: A. Gharib, M. Moradi

Abstract:

The technical feasibility of using solar energy and electricity for water pumping in the Khuzestan province of Iran is investigated. For this purpose, ecological conditions such as weather data, air clearness, and sunshine hours are analyzed. The nature of groundwater in the region was examined in terms of depth, static and dynamic head, and water pumping rate. Three configurations of solar water pumping system were studied: AC solar water pumping with a storage battery, AC solar water pumping with a storage tank, and DC direct solar water pumping.

Keywords: technical and economic feasibility, solar energy, photovoltaic systems, solar water pumping system

Procedia PDF Downloads 564