Search results for: comprehensive model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19166

18956 Debris Flow Mapping Using Geographical Information System Based Model and Geospatial Data in Middle Himalayas

Authors: Anand Malik

Abstract:

The Himalayas, with their high tectonic activity, pose a great threat to human life and property. Climate change is a further driver, triggering extreme events with multifold effects on the high-mountain glacial environment: rock falls, landslides, debris flows, flash floods and snow avalanches. One such extreme event, a cloudburst together with the breach of the moraine-dammed Chorabari Lake, occurred from June 14 to June 17, 2013, and triggered flooding of the Saraswati and Mandakini rivers in the Kedarnath Valley of Rudraprayag district of Uttarakhand state, India. The huge volume of water released at high velocity created a catastrophe of the century, resulting in the loss of many human and animal lives and heavy damage to pilgrimage, tourism, agriculture and property. A comprehensive assessment of debris flow hazards therefore requires GIS-based modeling using numerical methods. The aim of the present study is the analysis and mapping of debris flow movements using geospatial data with Flow-R (developed by a team at IGAR, University of Lausanne). The model is based on combined probabilistic and energetic algorithms for assessing the spreading of the flow and its maximum runout distance. An ASTER Digital Elevation Model (DEM) with 30 m x 30 m cell size (resolution) is used as the main geospatial input for the runout assessment, while Landsat data are used to analyze land use/land cover change in the study area. The results show that the model can be applied with great accuracy and is very useful in delineating debris flow areas. The results are compared with existing landslide/debris flow maps. ArcGIS software is used to prepare runout susceptibility maps, which can be used in debris flow mitigation and future land use planning.
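As an illustration of the probabilistic spreading idea behind Flow-R, the following minimal Python sketch distributes flow probability from a source cell to its downslope neighbours, weighted by slope raised to a Holmgren-style exponent. This is a hypothetical simplification, not the actual tool: Flow-R additionally applies inertial persistence and energy-based runout limits, and the DEM, source cell, and exponent below are illustrative only.

```python
# Toy multiple-flow-direction spread on a small DEM grid
# (Holmgren-style weighting; values are illustrative).

def spread_probabilities(dem, row, col, exponent=4.0):
    """Distribute flow probability from (row, col) to lower neighbours,
    proportional to (slope gradient)**exponent."""
    h = dem[row][col]
    weights = {}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            r, c = row + dr, col + dc
            if 0 <= r < len(dem) and 0 <= c < len(dem[0]):
                drop = h - dem[r][c]
                if drop > 0:  # only downslope cells receive flow
                    dist = (dr * dr + dc * dc) ** 0.5  # 1 or sqrt(2)
                    weights[(r, c)] = (drop / dist) ** exponent
    total = sum(weights.values())
    return {cell: w / total for cell, w in weights.items()} if total else {}

dem = [[10, 9, 8],
       [9,  8, 6],
       [8,  7, 5]]
probs = spread_probabilities(dem, 1, 1)
```

Iterating this step cell by cell over the whole grid, with a runout cutoff, yields the kind of susceptibility raster the study exports to ArcGIS.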

Keywords: debris flow, geospatial data, GIS-based modeling, Flow-R

Procedia PDF Downloads 274
18955 India and Space Insurance Policy: An Analytical Insight

Authors: Shreyas Jayasimha, Suneel Anand Sundharesan, Rohan Tigadi

Abstract:

Until recently, the United States of America and Russia were the two dominant players in the field of space exploration and held a virtual monopoly on space technology. This has changed over the past few years: many other nation states, such as India, China, and the UK, have made significant progress in this field. Among these nations, the growth and development of the Indian space program have been nothing short of remarkable. India has successfully launched a series of satellites, including its much-acclaimed Mangalyaan mission, which placed a satellite in Mars' orbit on the country's very first attempt; this feat demonstrates the enormous growth potential and promise that the Indian space program holds for the coming years. However, unlike other space-faring nations, India does not have a comprehensive and consolidated space insurance policy. In this regard, it is pertinent to note that the costs and risks involved in administering a space program are enormous. In the absence of a comprehensive space insurance policy, any losses from an unsuccessful mission will have to be borne by the state exchequer. Thus, in order to ensure that the Indian space program continues on its upward trajectory, the Indian establishment should seriously consider formulating a comprehensive insurance policy. This paper analyzes the international best practices followed by other space-faring nations in relation to space insurance policy. Thereafter, the authors examine the current Indian regime relating to space insurance. Finally, the authors conclude with a series of recommendations on the essential elements that should be part of any Indian space insurance policy regime.

Keywords: India, space insurance policy, space law, Indian space research organization

Procedia PDF Downloads 230
18954 Gas Flaring Utilization at KK Station

Authors: Abd Alati Ali Abushnaq, Malek Essnni, Abduraouf Eteer

Abstract:

The present study proposes a comprehensive approach to utilizing the associated gas from the KK remote station effectively, eliminating the practice of flaring and mitigating greenhouse gas (GHG) emissions. The proposed integrated system diverts the associated gas via a newly designed pipeline that connects to the existing 12-inch pipeline at the tie-in point. The proposed destination is the low-pressure system at A-100 (3rd stage), from where the associated gas will be channeled to the NGL (natural gas liquids) plant for processing. To ensure the system's efficacy under varying gas production scenarios, the study employs two industry-standard simulation packages, Aspen HYSYS and PIPESIM. The simulated results demonstrate the system's ability to handle the projected increase in gas production, up to 38 MMSCFD, confirming its robustness and adaptability to future production demands.

Keywords: associated gas, flaring mitigation, GHG emissions, pipeline diversion, NGL plant, KK remote station, production forecasting, Aspen HYSYS, PIPESIM

Procedia PDF Downloads 90
18953 Economic Integration in Eurasia: Modeling of the Current and Future Architecture

Authors: M. G. Shilina

Abstract:

The prospects for political and economic development of the Eurasian space are currently discussed at both governmental and expert levels. New concepts actively proposed by Eurasian governments require analysis and a search for effective implementation options. This paper attempts to identify effective solutions to the problems surrounding the current economic integration of the Eurasian states on the basis of an interdisciplinary, comprehensive, structured analysis. The phenomenon is considered through the prism of international law, the world economy and politics, combined with a study of existing intergovernmental practice. Modeling is the main research method, supplemented by legal and empirical methods. A detailed multi-level model for the practical construction of the 'Great Eurasia' (GE) concept is proposed, with a phased option for building interaction in Eurasia using the Eurasian Economic Union (EAEU) as the main tool. The Shanghai Cooperation Organization (SCO) is seen as the most promising element of the model: it is capable of streamlining the formation of the GE and of shaping the transformation of Eurasia into a common economic space. Developing economic integration between the Eurasian states within the framework of the SCO is therefore optimal, and an 'SCO+' format could serve as a platform for integration processes. The creation of stable financial ties could become the basis for an expanded transregional integration platform. The paper concludes that the implementation of the proposed model could entail a gradual economic rapprochement within Eurasia and beyond.

Keywords: economic integration, the Eurasian Economic Union, the European Union, the Shanghai Cooperation Organization, the Silk Road Economic Belt

Procedia PDF Downloads 121
18952 The Development of a Comprehensive Sustainable Supply Chain Performance Measurement Theoretical Framework in the Oil Refining Sector

Authors: Dina Tamazin, Nicoleta Tipi, Sahar Validi

Abstract:

The oil refining industry plays a vital role in the world economy. Oil refining companies operate in a more complex and dynamic environment than ever before, and both the companies and the public are becoming more conscious of crude oil scarcity and climate change. Hence, sustainability is becoming increasingly critical to the industry's long-term viability and to environmental sustainability, particularly with regard to measuring and evaluating a company's sustainable performance: such measurement supports the company in understanding its performance and its implications more objectively and in establishing sustainability development plans. Consequently, oil refining companies attempt to re-engineer their supply chains to meet sustainable goals and standards. A review of previous work on sustainable supply chain performance measurement in oil refining reveals a lack of studies that integrate sustainability into supply chain performance measurement practices in this industry. There is therefore a need for research that provides performance guidance which can be used to measure sustainability and assist in setting sustainable goals for oil refining supply chains. Accordingly, this paper presents a comprehensive oil refining sustainable supply chain performance measurement theoretical framework. In developing this framework, the main characteristics of the oil refining industry were identified, and a thorough review of relevant literature on performance measurement models and sustainable supply chain performance measurement models was conducted. The framework introduced in this paper aims to assist oil refining companies in measuring and evaluating their performance from a sustainability perspective in order to achieve sustainable operational excellence.

Keywords: oil refining industry, oil refining sustainable supply chain, performance measurement, sustainability

Procedia PDF Downloads 288
18951 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component. As such, the model assumes constant vegetation and hydraulic parameter values throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model capable of simulating all crops using unique parameter values for each crop. Simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model for evaluating vegetative resistance was employed, with its hydraulic and vegetative parameters incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow, further enhancing its capability for accurate hydrologic studies.
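The EPIC-style plant growth model referenced above is driven by accumulated heat units; the sketch below shows that driver in isolation. It is a hypothetical simplification (EPIC adds radiation interception, biomass partitioning and stress factors on top of this), and the base temperature and potential heat units used here are illustrative, not values from the paper.

```python
# Heat-unit index: the seasonal clock of an EPIC-style crop model.

def heat_unit_index(tmax, tmin, t_base, phu):
    """Fraction of the growing season completed, from daily temperatures.

    HU_day = max(0, (Tmax + Tmin)/2 - Tbase); HUI = cumulative HU / PHU,
    where PHU is the crop-specific potential heat units to maturity."""
    hu = 0.0
    out = []
    for hi, lo in zip(tmax, tmin):
        hu += max(0.0, (hi + lo) / 2.0 - t_base)
        out.append(min(1.0, hu / phu))  # capped at maturity
    return out

# Ten identical warm days for a maize-like crop
# (Tbase = 8 C and PHU = 100 are toy values).
hui = heat_unit_index([30] * 10, [18] * 10, t_base=8.0, phu=100.0)
```

Leaf area index, and hence the seasonal roughness coefficient, would then be computed as a function of this index rather than held constant.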

Keywords: crop yield, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 411
18950 Numerical Modeling of the Depth-Averaged Flow over a Hill

Authors: Anna Avramenko, Heikki Haario

Abstract:

This paper reports the development and application of a 2D depth-averaged model. The main goal of this contribution is to apply the depth-averaged equations to a wind park model in which the treatment of the geometry is introduced into the mathematical model through mass and momentum source terms. The depth-averaged model will be used in future work to find the optimal position of wind turbines in the wind park. k-ε and 2D LES turbulence models were considered in this article. 2D CFD simulations of flow over a single hill were carried out to check the depth-averaged model in practice.

Keywords: depth-averaged equations, numerical modeling, CFD, wind park model

Procedia PDF Downloads 603
18949 Interaction Between Task Complexity and Collaborative Learning on Virtual Patient Design: The Effects on Students’ Performance, Cognitive Load, and Task Time

Authors: Fatemeh Jannesarvatan, Ghazaal Parastooei, Jimmy Frerejan, Saedeh Mokhtari, Peter Van Rosmalen

Abstract:

Medical and dental education increasingly emphasizes the acquisition, integration, and coordination of complex knowledge, skills, and attitudes that can be applied in practical situations. Instructional design approaches have focused on using real-life tasks in order to facilitate complex learning in both real and simulated environments. The four-component instructional design (4C/ID) model has become a useful guideline for designing instructional materials that improve learning transfer, especially in health professions education. The objective of this study was to apply the 4C/ID model in the creation of virtual patients (VPs) that dental students can use to practice their clinical management and clinical reasoning skills. The study first explored the context and concept of complication factors and common errors for novices and how they can affect the design of a virtual patient program. It then selected key dental information and considered the content needs of dental students. The design of the virtual patients was based on the fundamental principles of the 4C/ID model: designing learning tasks that reflect real patient scenarios and applying different levels of task complexity to challenge students to apply their knowledge and skills in different contexts; creating varied learning materials that support students during the VP program and are closely integrated with the learning tasks and the students' curricula, with cognitive feedback provided at different levels of the program; and providing procedural information with which students follow a step-by-step process from history taking to writing a comprehensive treatment plan. Four virtual patients were designed using these principles, and an experimental design was used to test the effectiveness of the principles in achieving the intended educational outcomes. The 4C/ID model provides an effective framework for designing engaging and successful virtual patients that support the transfer of knowledge and skills for dental students. However, there are some challenges and pitfalls that instructional designers should take into account when developing these educational tools.

Keywords: 4C/ID model, virtual patients, education, dental, instructional design

Procedia PDF Downloads 82
18948 UBCSAND Model Calibration for Generic Liquefaction Triggering Curves

Authors: Jui-Ching Chou

Abstract:

Numerical simulation is a popular method for evaluating the effects of soil liquefaction on a structure or the effectiveness of a mitigation plan. Many constitutive models (the UBCSAND model, PM4 model, SANISAND model, etc.) have been presented to model the liquefaction phenomenon. In general, the inputs of a constitutive model need to be calibrated against the soil's cyclic resistance before being applied in a numerical simulation, so that simulation results can be compared with results from simplified liquefaction potential assessment methods. In this article, the inputs of the UBCSAND model, a simple elastic-plastic stress-strain model, are calibrated against several popular generic liquefaction triggering curves from simplified liquefaction potential assessment methods using the FLAC program. The calibrated inputs enable engineers to perform a preliminary evaluation of an existing structure or a new design project.
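Generic liquefaction triggering curves are commonly summarized by a power law of the form CSR = a·N^(−b), relating cyclic stress ratio to the number of loading cycles. Before tuning constitutive-model inputs against such a curve, one would first fit its coefficients; the sketch below does that by least squares in log space. The curve values are synthetic, purely to exercise the fit, and are not data from the paper.

```python
# Fit CSR = a * N**(-b) to a set of (N, CSR) points in log space.
import math

def fit_triggering_curve(n_cycles, csr):
    """Least-squares fit of log(CSR) = log(a) - b*log(N); returns (a, b)."""
    xs = [math.log(n) for n in n_cycles]
    ys = [math.log(c) for c in csr]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return math.exp(my - slope * mx), -slope

# Synthetic curve with a = 0.3, b = 0.2; the fit recovers it exactly
# because the points lie on the power law.
N = [1, 5, 15, 30]
CSR = [0.3 * n ** -0.2 for n in N]
a, b = fit_triggering_curve(N, CSR)
```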

Keywords: calibration, liquefaction, numerical simulation, UBCSAND Model

Procedia PDF Downloads 174
18947 Efficacy of a Comprehensive Diabetic Care Program in the Reduction of HbA1c in Overweight Type II Diabetes Mellitus Patients: A Retrospective Study

Authors: Rohit Sane, Pravin Ghadigaonkar, Purvi Ahuja, Suvarna Tirmare, Archana Kelhe, Kranti Shinde, Rahul Mandole

Abstract:

Objective: To retrospectively evaluate the efficacy of a Comprehensive Diabetic Care (CDC) Program in reducing HbA1c in overweight type II diabetes mellitus patients. Methods: A retrospective study was carried out on 34 overweight type II diabetic patients (mean age 54.58 ± 11.38 years), enrolled after screening of 68 patients (HbA1c 7-10%). The patients were on concomitant drugs, namely insulin (11.76%), DPP-4 inhibitors (17.64%), biguanides (55.88%), sulfonylureas (52.94%), thiazolidinediones (11.76%), other medications (20.58%) or no allopathic medications (14.70%). The patients underwent the Comprehensive Diabetic Care Program, consisting of the Panchakarma procedures snehana (external oleation), swedana (passive heat therapy) and basti (enema), completed in 15 sittings. During the therapy and for the next 90 days, the patients followed a low-carbohydrate, moderate-protein-and-fat diet. The primary endpoint of this study was the reduction in HbA1c at the end of the 90-day follow-up. Results: Among the 34 patients (67.64% male, 32.35% female), a significant reduction was observed in HbA1c levels (14.30%, p < 0.05) at the end of the 90-day follow-up compared to baseline, and BMI was reduced by 5.87%. There was also a reduction in the usage of concomitant drugs: insulin (2.94%), DPP-4 inhibitors (2.94%), biguanides (32.35%), sulfonylureas (35.29%), thiazolidinediones (5.88%), other medications (17.64%) and no allopathic medications (32.35%). Conclusion: The results highlight the efficacy of the CDC program not only in reducing HbA1c but also in reducing BMI and tapering concomitant drugs in overweight type II diabetic patients with HbA1c of 7-10%.

Keywords: HbA1c, low carb diet, Panchakarma therapy, Type II Diabetes

Procedia PDF Downloads 284
18946 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading

Authors: Peter Shi

Abstract:

Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volume in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the Efficient Market Hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for 'low' and 'high' are precisely derived using a max-min operation on the Bayes error. This explicit connection harmonizes the Efficient Market Hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics, and represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the buy-low-sell-high algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.
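For intuition on how a Bayes decision produces 'low' and 'high' thresholds, the toy sketch below finds the price at which the likelihoods of an 'undervalued' and an 'overvalued' regime cross, assuming equal-variance Gaussians with equal priors. This is only a special case for illustration; the paper's max-min construction on the Bayes error is more general, and all numbers here are made up.

```python
# Crossing point of two class likelihoods = Bayes decision threshold
# (equal priors). Buying below it and selling above it minimises the
# probability of misclassifying the price regime in this toy setup.
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_threshold(mu_low, mu_high, sigma, grid_step=1e-3):
    """Scan upward from the 'low' mean until the 'high' likelihood wins."""
    x = mu_low
    while gaussian_pdf(x, mu_low, sigma) >= gaussian_pdf(x, mu_high, sigma):
        x += grid_step
    return x

# With equal variances and equal priors, the crossing is the midpoint
# of the two means (here, 100).
t = bayes_threshold(mu_low=95.0, mu_high=105.0, sigma=4.0)
```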

Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market

Procedia PDF Downloads 72
18945 A Study on the Development of Social Participation Activity Scale for the Elderly

Authors: Young-Kwang Lee, Eun-Gu Ji, Min-Joo Kim, Seung-Jae Oh

Abstract:

The purpose of this study is to develop a social participation activity scale for the elderly. Following exploratory factor analysis, confirmatory factor analysis was conducted on bundled items using the maximum likelihood method. In conclusion, a thirteen-item social participation activity scale appeared appropriate. Finally, convergent validity and discriminant validity were verified for the scale with acceptable fit. Convergent validity was assessed on the basis of the average variance extracted; in other words, the hypothesis that the constructs are identical is rejected, and validity is thereby confirmed. This study extensively considered the measurement items used to measure social participation activities of the elderly. In the future, the scale can meaningfully be used as a tool to verify the effectiveness of services in organizations that provide social welfare services to elderly people, such as comprehensive social welfare centers and comprehensive social welfare centers for the elderly.

Keywords: elderly, social participation, scale development, validity

Procedia PDF Downloads 189
18944 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model 1: Description

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component. As such, the model assumes constant vegetation and hydraulic parameter values throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model capable of simulating all crops using unique parameter values for each crop. Simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model for evaluating vegetative resistance was employed, with its hydraulic and vegetative parameters incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow, further enhancing its capability for accurate hydrologic studies.

Keywords: runoff, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 378
18943 Stock Market Prediction by Regression Model with Social Moods

Authors: Masahiro Ohmura, Koh Kakusho, Takeshi Okadome

Abstract:

This paper presents a regression model with autocorrelated errors in which the inputs are social moods obtained by analyzing the adjectives in Twitter posts with a document topic model. The regression model predicts the Dow Jones Industrial Average (DJIA) more precisely than autoregressive moving-average models.
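A classic way to fit a regression with autocorrelated errors, the model family used above, is the Cochrane-Orcutt procedure: estimate the AR(1) coefficient from OLS residuals, quasi-difference the data, and refit. The single-input sketch below is illustrative only (the mood and index values are made up) and is not the authors' implementation.

```python
# One Cochrane-Orcutt step for a single-predictor regression
# with AR(1) errors.

def ols(x, y):
    """Ordinary least squares; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def cochrane_orcutt(x, y):
    """Estimate rho from OLS residuals, quasi-difference, refit."""
    a, b = ols(x, y)
    e = [yi - a - b * xi for xi, yi in zip(x, y)]
    den = sum(ei ** 2 for ei in e[:-1])
    rho = sum(e[i] * e[i - 1] for i in range(1, len(e))) / den if den > 1e-12 else 0.0
    xs = [x[i] - rho * x[i - 1] for i in range(1, len(x))]
    ys = [y[i] - rho * y[i - 1] for i in range(1, len(y))]
    a2, b2 = ols(xs, ys)
    return a2 / (1.0 - rho), b2, rho  # intercept transformed back

mood = [0.1, 0.4, 0.2, 0.5, 0.3, 0.6]   # toy daily mood scores
djia = [2 + 3 * m for m in mood]        # noise-free check case
a, b, rho = cochrane_orcutt(mood, djia)
```

On the noise-free check data the residuals vanish, so rho is estimated as zero and the procedure reduces to plain OLS, recovering the intercept 2 and slope 3 exactly.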

Keywords: stock market prediction, social moods, regression model, DJIA

Procedia PDF Downloads 549
18942 Identifying Metabolic Pathways Associated with Neuroprotection Mediated by Tibolone in Human Astrocytes under an Induced Inflammatory Model

Authors: Daniel Osorio, Janneth Gonzalez, Andres Pinzon

Abstract:

In this work, proteins and metabolic pathways associated with the neuroprotective response mediated by the synthetic neurosteroid tibolone under a palmitate-induced inflammatory model were identified by flux balance analysis (FBA). Three different metabolic scenarios ('healthy', 'inflamed' and 'medicated') were modeled over a gene-expression-data-driven, tissue-specific metabolic reconstruction of mature astrocytes. The astrocyte reconstruction was built, validated and constrained using three open-source software packages ('minval', 'g2f' and 'exp2flux') released through the Comprehensive R Archive Network repositories during the development of this work. From our analysis, we predict that tibolone exerts its neuroprotective effects through a reduction of L-glutamate-mediated neurotoxicity in astrocytes, inducing the activation of several metabolic pathways with associated neuroprotective actions, such as taurine metabolism, gluconeogenesis, calcium signaling and the peroxisome proliferator-activated receptor signaling pathway. We also found a tibolone-associated increase in growth rate, probably in concordance with previously reported side effects of steroid compounds in other human cell types.
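The core constraint in flux balance analysis is the steady-state mass balance S·v = 0 over a stoichiometric matrix, optimized subject to flux bounds. The toy sketch below applies it to a three-reaction chain (uptake → A → B → biomass), where the balance forces all fluxes to be equal, so the maximal biomass flux is simply the tightest bound. A genome-scale reconstruction like the astrocyte model above solves the same problem as a full linear program; the bounds here are made-up numbers.

```python
# Toy flux-balance calculation on a linear three-reaction pathway.

def max_biomass_chain(upper_bounds):
    """For a chain uptake -> A -> B -> biomass, steady state (S.v = 0)
    makes every flux identical, so the optimum equals the minimum
    upper bound along the chain."""
    v = min(upper_bounds)
    fluxes = [v] * len(upper_bounds)
    # Verify steady state for internal metabolites A and B:
    # production minus consumption must vanish at each node.
    assert fluxes[0] - fluxes[1] == 0 and fluxes[1] - fluxes[2] == 0
    return fluxes

# Caps on uptake, conversion and biomass reactions (illustrative).
fluxes = max_biomass_chain([10.0, 6.5, 8.0])
```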

Keywords: astrocytes, flux balance analysis, genome scale metabolic reconstruction, inflammation, neuroprotection, tibolone

Procedia PDF Downloads 224
18941 Project Progress Prediction in Software Development Integrating Time Prediction Algorithms and Large Language Modeling

Authors: Dong Wu, Michael Grenn

Abstract:

Managing software projects effectively is crucial for meeting deadlines, ensuring quality, and managing resources well. Traditional methods often struggle to predict project timelines accurately due to uncertain schedules and complex data. This study addresses these challenges by combining time prediction algorithms with Large Language Models (LLMs). It makes use of real-world software project data to construct and validate a model that takes detailed project progress data, such as task completion dynamics, team interaction and development metrics, as input and outputs predictions of project timelines. To evaluate the effectiveness of this model, a comprehensive methodology is employed, involving simulations and practical applications in a variety of real-world software project scenarios. This multifaceted evaluation strategy is designed to validate the model's role in enhancing forecast accuracy and elevating overall management efficiency, particularly in complex software project environments. The results indicate that the integration of time prediction algorithms with LLMs has the potential to optimize software project progress management, and the quantitative results suggest the effectiveness of the method in practical applications. In conclusion, this study demonstrates that integrating time prediction algorithms with LLMs can significantly improve the predictive accuracy and efficiency of software project management, offering the industry an advanced project management tool with the potential to improve operational efficiency, optimize resource allocation, and ensure timely project completion.

Keywords: software project management, time prediction algorithms, large language models (LLMs), forecast accuracy, project progress prediction

Procedia PDF Downloads 80
18940 Theoretical Evaluation of Minimum Superheat, Energy and Exergy in a High-Temperature Heat Pump System Operating with Low GWP Refrigerants

Authors: Adam Y. Sulaiman, Donal F. Cotter, Ming J. Huang, Neil J. Hewitt

Abstract:

Suitable low global warming potential (GWP) refrigerants that conform to F-gas regulations are required to extend the operational envelope of high-temperature heat pumps (HTHPs) used in industrial waste heat recovery processes. The thermophysical properties and characteristics of these working fluids need to be assessed to provide a comprehensive understanding of their operational effectiveness in HTHP applications. This paper presents the results of a theoretical simulation investigating a range of low-GWP refrigerants and their suitability to supersede refrigerants HFC-245fa and HFC-365mfc. A steady-state thermodynamic model of a single-stage HTHP with an internal heat exchanger (IHX) was developed to assess system cycle characteristics for heat source temperatures between 50 and 80 °C and heat sink temperatures between 90 and 150 °C. A practical approach to maximizing operational efficiency was examined to determine the effects of regulating minimum superheat within the process and the subsequent influence on energetic and exergetic efficiencies. A comprehensive map of minimum superheat across the HTHP operating variables was used to assess specific tipping points in performance at 30 K and 70 K temperature lifts. Based on initial results, refrigerants HCFO-1233zd(E) and HFO-1336mzz(Z) were found to be close matches for HFC-245fa and HFC-365mfc. The overall results show that effective performance occurs for HCFO-1233zd(E) between 5-7 K minimum superheat and for HFO-1336mzz(Z) between 18-21 K, dependent on temperature lift. This work provides a method for optimizing refrigerant selection based on operational indicators to maximize overall HTHP system performance.
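A quick thermodynamic benchmark for the energetic and exergetic comparisons described above is the Carnot heating COP between sink and source temperatures, against which a measured COP gives a second-law (exergetic) efficiency. The temperatures below match the abstract's upper operating corner, but the actual COP of 2.4 is an illustrative assumption, not a result from the paper.

```python
# Carnot heating COP and second-law efficiency for a heat pump.

def carnot_cop_heating(t_source_c, t_sink_c):
    """Ideal heating COP = T_sink / (T_sink - T_source), in kelvin."""
    t_sink = t_sink_c + 273.15
    t_source = t_source_c + 273.15
    return t_sink / (t_sink - t_source)

def second_law_efficiency(cop_actual, t_source_c, t_sink_c):
    """Exergetic efficiency: actual COP relative to the Carnot limit."""
    return cop_actual / carnot_cop_heating(t_source_c, t_sink_c)

cop_max = carnot_cop_heating(80.0, 150.0)          # 70 K lift
eta_ii = second_law_efficiency(2.4, 80.0, 150.0)   # assumed COP of 2.4
```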

Keywords: high-temperature heat pump, minimum superheat, energy & exergy efficiency, low GWP refrigerants

Procedia PDF Downloads 186
18939 Roundabout Implementation Analyses Based on Traffic Microsimulation Model

Authors: Sanja Šurdonja, Aleksandra Deluka-Tibljaš, Mirna Klobučar, Irena Ištoka Otković

Abstract:

Roundabouts are a common choice when reconstructing an intersection, whether to improve its capacity or its traffic safety, especially in urban conditions. Regulations for the design of roundabouts are often related to driving culture, the tradition of using this type of intersection, etc. Individual values in the regulations are usually recommended within a wide range (as is the case in the Croatian regulations), and the final design of a roundabout largely depends on the designer's experience and choice of design elements. Before-after analyses are therefore a good way to monitor the performance of roundabouts and possibly improve the recommendations of the regulations. This paper presents a comprehensive before-after analysis of a roundabout on the country road network near Rijeka, Croatia. The analysis is based on a thorough collection of traffic data (operating speeds and traffic load) and design element data, both before and after the reconstruction of the intersection into a roundabout. At the chosen location, the roundabout was intended to improve capacity and traffic safety, so the paper analyzes the collected data to see whether it achieved the expected effect. A traffic microsimulation model (VISSIM) of the roundabout was created based on the collected data, and the influence of increased traffic load, different traffic structures, and the selected design elements on the capacity of the roundabout was analyzed. The traffic safety effect of the roundabout was also analyzed through operating speeds and through potential conflicts identified with the Surrogate Safety Assessment Model (SSAM). The results of this research show the practical value of before-after analysis as an indicator of roundabout effectiveness at a specific location. The application of a microsimulation model provides a practical method for analyzing intersection functionality from a capacity and safety perspective under present and changed traffic and design conditions.

Keywords: before-after analysis, operating speed, capacity, design

Procedia PDF Downloads 24
18938 Structural Equation Modeling Semiparametric Truncated Spline Using Simulation Data

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

SEM analysis is a complex multivariate analysis because it involves a number of interconnected exogenous and endogenous variables that together form a model. The measurement model is divided into two types: the reflective model and the formative model. Before carrying out further tests on an SEM, assumptions must be met, notably the linearity assumption, which determines the form of the relationship. There are three modeling approaches to path analysis: parametric, nonparametric and semiparametric. The aim of this research is to develop semiparametric SEM and obtain the best model. The data used in the research are secondary data serving as the basis for generating simulation data. Simulation data were generated with sample sizes of 100, 300, and 500. In the semiparametric SEM analysis, the forms of the relationship studied were linear and quadratic, with one and two knot points, at various levels of error variance (EV = 0.5; 1; 5). Three levels of closeness of relationship were used in the measurement model: low (0.1-0.3), medium (0.4-0.6) and high (0.7-0.9). The best model was obtained for the linear form of the relationship between X1 and Y1. In the measurement model, a characteristic of the reflective model emerged: the higher the closeness of the relationship, the better the model obtained. The originality of this research lies in the development of semiparametric SEM, which has not been widely studied by researchers.
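The 'truncated spline' in the title refers to a truncated power basis: a polynomial part plus one term (x − κ)₊^degree per knot, which is what lets the path function bend at the knot points. The sketch below builds one design-matrix row for the linear case; the knot locations and evaluation point are illustrative, not values from the study.

```python
# Truncated power basis row for a semiparametric (spline) path function:
# f(x) = b0 + b1*x + sum_k c_k * max(0, x - knot_k)**degree.

def truncated_basis(x, knots, degree=1):
    """Design-matrix row for one observation: polynomial terms first,
    then one truncated term per knot."""
    row = [x ** d for d in range(degree + 1)]          # 1, x, ..., x**degree
    row += [max(0.0, x - k) ** degree for k in knots]  # truncated terms
    return row

# Linear spline with two knots, evaluated at x = 2.5:
row = truncated_basis(2.5, knots=[1.0, 2.0], degree=1)
```

Stacking such rows over all observations and estimating the coefficients by least squares gives the semiparametric path estimate; setting degree=2 yields the quadratic form also examined in the study.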

Keywords: semiparametric SEM, measurement model, structural model, reflective model, formative model

Procedia PDF Downloads 43
18937 Analyzing How Working From Home Can Lead to Higher Job Satisfaction for Employees Who Have Care Responsibilities Using Structural Equation Modeling

Authors: Christian Louis Kühner, Florian Pfeffel, Valentin Nickolai

Abstract:

Taking care of children, dependents, or pets can be a difficult and time-consuming task. Especially for part- and full-time employees, it can feel exhausting and overwhelming to meet these obligations on top of working a job. Thus, working mostly at home and not having to commute to the company can save valuable time and reduce stress. This study aims to show the influence that the working model has on the job satisfaction of employees with care responsibilities in comparison to employees who do not have such obligations. Using structural equation modeling (SEM), the three work models, “working from home”, “working from the office”, and a hybrid model, have been analyzed based on 13 constructs influencing job satisfaction. These 13 factors have been further summarized into three groups: “classic influencing factors”, “influencing factors changed by remote working”, and “new remote working influencing factors”. Based on the influencing factors on job satisfaction, an online survey was conducted with n = 684 employees from the service sector. Cronbach’s alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. In addition, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the most significant influencing factor on job satisfaction is “identification with the work” with β = 0.540, followed by “appreciation” (β = 0.151), “compensation” (β = 0.124), “work-life balance” (β = 0.116), and “communication and exchange of information” (β = 0.105).
While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, the only significant influencing factor. The study shows that among employees with care responsibilities, the higher the proportion of working from home compared to working from the office, the more satisfied the employees are with their job. Since work models that accommodate comprehensive care led to higher job satisfaction among employees with such obligations, adapting as a company to employees' private obligations can be crucial to sustained success. Conversely, satisfaction with the office-based working model is higher among workers without caregiving responsibilities.
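The convergent-validity thresholds cited above (AVE > 0.5, CR > 0.7) are simple functions of the standardized factor loadings. A minimal sketch, with hypothetical loadings (the survey's actual loadings are not reported in the abstract):

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum(lam))^2 / ((sum(lam))^2 + sum(1 - lam^2)) for standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum()**2
    return num / (num + (1.0 - lam**2).sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return (lam**2).mean()

# Hypothetical standardized loadings for one job-satisfaction construct
lam = [0.82, 0.78, 0.74, 0.69]
cr = composite_reliability(lam)
ave = average_variance_extracted(lam)
```

With these illustrative loadings the construct would pass both rules of thumb used in the study (AVE ≈ 0.58, CR ≈ 0.84).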

Keywords: care responsibilities, home office, job satisfaction, structural equation modeling

Procedia PDF Downloads 84
18936 A Comprehensive Study on Quality Assurance in Game Development

Authors: Maria Komal, Zaineb Khalil, Mehreen Sirshar

Abstract:

Due to recent technological advancements, games have become one of the most in-demand applications. The gaming industry is growing rapidly, and the key to success in this industry is the development of good-quality games, which is a highly competitive issue. The ultimate goal of game developers is to ensure player satisfaction by developing high-quality games. This research is a comprehensive survey of the techniques followed by game companies to ensure game quality. After analysis of various techniques, it has been found that quality simulation according to ISO standards and playtest methods are used to ensure game quality. Because game development requires a cross-disciplinary team, an increasing trend towards distributed game development has been observed. This paper evaluates the strengths and weaknesses of the methodologies currently used in the game industry and draws conclusions. We have also proposed quality parameters that can be used as a heuristic framework to identify the attributes with the highest testing priorities.

Keywords: game development, computer games, video games, gaming industry, quality assurance, playability, user experience

Procedia PDF Downloads 534
18935 Metabolic Predictive Model for PMV Control Based on Deep Learning

Authors: Eunji Choi, Borang Park, Youngjae Choi, Jinwoo Moon

Abstract:

In this study, a predictive model for estimating the metabolic rate (MET) of the human body was developed for optimal control of the indoor thermal environment. Images of human bodies performing indoor activities and the corresponding body-joint coordinate values were collected as the data sets used in the predictive model. A deep learning algorithm was used in the initial model, and its numbers of hidden layers and hidden neurons were optimized. Finally, the prediction performance was analyzed after the model was trained on the collected data. In conclusion, the feasibility of MET prediction was confirmed, and future work was proposed: collecting more varied data and further developing the predictive model.
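The architecture described (a feed-forward network mapping joint coordinates to a MET estimate, with tunable hidden layers and neurons) can be sketched with a minimal NumPy network. Everything here is illustrative: the feature dimension, the synthetic target, and the layer sizes are assumptions, not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for joint-coordinate features; the real model used body
# images and joint coordinates, with dimensions not given in the abstract.
X = rng.normal(size=(200, 10))           # 200 samples, 10 joint coordinates
true_w = rng.normal(size=10)
y = np.tanh(X @ true_w) + 1.5            # MET-like target in a plausible range

# One hidden layer (16 neurons), trained by full-batch gradient descent
W1 = rng.normal(scale=0.1, size=(10, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 1));  b2 = np.zeros(1)
lr = 0.05
losses = []
for _ in range(500):
    H = np.tanh(X @ W1 + b1)             # hidden activations
    pred = (H @ W2 + b2).ravel()
    err = pred - y
    losses.append(float((err**2).mean()))
    # Backpropagation of the mean-squared-error gradient
    g_pred = 2 * err[:, None] / len(y)
    gW2 = H.T @ g_pred; gb2 = g_pred.sum(0)
    gH = (g_pred @ W2.T) * (1 - H**2)
    gW1 = X.T @ gH; gb1 = gH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```

Optimizing the "number of hidden layers and hidden neurons," as the study does, amounts to repeating this training loop over candidate architectures and comparing held-out prediction error.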

Keywords: deep learning, indoor quality, metabolism, predictive model

Procedia PDF Downloads 258
18934 A Comprehensive Study on CO₂ Capture and Storage: Advances in Technology and Environmental Impact Mitigation

Authors: Oussama Fertaq

Abstract:

This paper investigates the latest advancements in CO₂ capture and storage (CCS) technologies, which are vital for addressing the growing challenge of climate change. The study focuses on multiple techniques for CO₂ capture, including chemical absorption, membrane separation, and adsorption, analyzing their efficiency, scalability, and environmental impact. The research further explores geological storage options such as deep saline aquifers and depleted oil fields, providing insights into the challenges and opportunities presented by each method. This paper emphasizes the importance of integrating CCS with existing industrial processes to reduce greenhouse gas emissions effectively. It also discusses the economic and policy frameworks required to promote wider adoption of CCS technologies. The findings of this study offer a comprehensive view of the potential of CCS in achieving global climate goals, particularly in hard-to-abate sectors such as energy and manufacturing.

Keywords: CO₂ capture, carbon storage, climate change mitigation, carbon sequestration, environmental sustainability

Procedia PDF Downloads 16
18933 Model Averaging in a Multiplicative Heteroscedastic Model

Authors: Alan Wan

Abstract:

In recent years, the body of literature on frequentist model averaging in statistics has grown significantly. Most of this work focuses on models with different mean structures but leaves out the variance consideration. In this paper, we consider a regression model with multiplicative heteroscedasticity and develop a model averaging method that combines maximum likelihood estimators of unknown parameters in both the mean and variance functions of the model. Our weight choice criterion is based on a minimisation of a plug-in estimator of the model average estimator's squared prediction risk. We prove that the new estimator possesses an asymptotic optimality property. Our investigation of finite-sample performance by simulations demonstrates that the new estimator frequently exhibits very favourable properties compared to some existing heteroscedasticity-robust model average estimators. The model averaging method hedges against the selection of very bad models and serves as a remedy to variance function misspecification, which often discourages practitioners from modeling heteroscedasticity altogether. The proposed model average estimator is applied to the analysis of two real data sets.
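The weight-choice step can be illustrated on two candidate mean models combined with a weight on [0, 1]. Note the deliberate simplification: the criterion below is plain in-sample squared error, not the paper's plug-in estimator of squared prediction risk, and the data-generating process is invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(-2, 2, n)
y = 1.0 + 0.8*x + 0.3*x**2 + rng.normal(scale=0.5, size=n)

# Two candidate mean models, linear and quadratic, each fit by OLS
X1 = np.column_stack([np.ones(n), x])
X2 = np.column_stack([np.ones(n), x, x**2])
f1 = X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
f2 = X2 @ np.linalg.lstsq(X2, y, rcond=None)[0]

# Choose the weight w in [0, 1] on model 1 minimizing the criterion
grid = np.linspace(0, 1, 201)
risks = [float(((w*f1 + (1-w)*f2 - y)**2).mean()) for w in grid]
w_star = grid[int(np.argmin(risks))]
best = min(risks)
```

Because the candidates are nested and the criterion is purely in-sample, all weight lands on the larger model (w_star = 0). This degeneracy is precisely why a plug-in estimator of out-of-sample squared prediction risk, which also accounts for the variance function, is needed to produce nontrivial averaging weights.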

Keywords: heteroscedasticity-robust, model averaging, multiplicative heteroscedasticity, plug-in, squared prediction risk

Procedia PDF Downloads 387
18932 An Educational Program Based on the Health Belief Model to Prevent Non-Alcoholic Fatty Liver Disease Among Iranian Women

Authors: Arezoo Fallahi

Abstract:

Background and purpose: Non-alcoholic fatty liver disease is one of the most common liver disorders and, as the most important cause of death from liver disease, has unpleasant consequences and complications. The aim of this study was to investigate the effect of an educational intervention based on the health belief model on preventing non-alcoholic fatty liver disease among women. Materials and methods: This experimental study was performed among 110 women referred to comprehensive health service centers in Malayer City, west of Iran, in 2023. Using the convenience sampling method, the 110 participants were divided into experimental and control groups. The data collection tool included demographic characteristics and a questionnaire based on the health belief model. In the experimental group, three one-hour training sessions were conducted using pamphlets, lectures, and group discussions. Data were analyzed using SPSS software version 21 with correlation tests, paired t-tests, and independent t-tests. Results: The mean age of participants was 38.07±6.28 years, and most of the participants were middle-aged, married housewives with academic education, middle income, and overweight. After the educational intervention, the mean scores of the constructs, including perceived susceptibility (p=0.01), perceived severity (p=0.01), perceived benefits (p=0.01), internal (p=0.01) and external (p=0.01) cues to action, and perceived self-efficacy (p=0.01), were significantly higher in the experimental group than in the control group. The perceived barriers score in the experimental group decreased after the training (15.2 ± 3.9 vs. 11.2 ± 3.3, p<0.01). Conclusion: The findings of the study showed that the design and implementation of educational programs based on the constructs of the health belief model can be effective in preventing non-alcoholic fatty liver disease among women.
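The before/after comparison within the experimental group uses a paired t-test. A minimal sketch with hypothetical pre/post perceived-barriers scores (the study reported a drop from 15.2 ± 3.9 to 11.2 ± 3.3; the individual scores below are invented for illustration):

```python
import math

# Hypothetical pre/post perceived-barriers scores for eight participants
pre  = [15, 18, 12, 16, 14, 17, 13, 15]
post = [11, 13, 10, 12, 11, 12, 10, 11]

d = [a - b for a, b in zip(pre, post)]       # paired differences
n = len(d)
mean_d = sum(d) / n
var_d = sum((x - mean_d)**2 for x in d) / (n - 1)  # sample variance
t = mean_d / math.sqrt(var_d / n)            # paired t statistic, n-1 df
```

A t value this large against n-1 = 7 degrees of freedom would correspond to p well below 0.01, matching the direction of the reported result.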

Keywords: health, education, belief, behaviour

Procedia PDF Downloads 53
18931 Reliability Prediction of Tires Using Linear Mixed-Effects Model

Authors: Myung Hwan Na, Ho- Chun Song, EunHee Hong

Abstract:

The normal linear mixed-effects model is widely used to analyze repeated-measurement data. When heteroscedasticity and non-normality of the population distribution are present at the same time, the normal linear mixed-effects model can give improper results. To achieve more robust estimation, we use a heavy-tailed linear mixed-effects model, which gives more exact and reliable conclusions than the standard normal linear mixed-effects model.
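The core intuition behind the heavy-tailed model, not the paper's full mixed-effects estimator, can be shown with a location estimate: a Student-t likelihood downweights gross outliers that drag the Gaussian estimate (the sample mean) away. Data, degrees of freedom, and the fixed unit scale below are all illustrative:

```python
import math

# Small sample with one gross outlier, as might arise in tire field data
data = [0.10, -0.20, 0.05, 0.15, -0.10, 10.0]

def t_neg_loglik(mu, xs, nu=3.0):
    """Negative log-likelihood of a Student-t location model (scale fixed at 1)."""
    return sum((nu + 1) / 2 * math.log(1 + (x - mu)**2 / nu) for x in xs)

# Grid search for the heavy-tailed location estimate
grid = [i / 100 for i in range(-100, 301)]
mu_t = min(grid, key=lambda m: t_neg_loglik(m, data))

mu_normal = sum(data) / len(data)   # Gaussian MLE = sample mean
```

The t-based estimate stays near the bulk of the data (~0), while the Gaussian estimate is pulled toward the outlier (~1.67); the same mechanism stabilizes the mixed-effects fit under non-normal errors.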

Keywords: reliability, tires, field data, linear mixed-effects model

Procedia PDF Downloads 564
18930 Development and Modeling of the Process of Narrow-seam Laser Welding of Ni-Superalloy in a Hard-to-Reach Place

Authors: Vladimir Isakov, Evgeniy Rykov, Lubov Magerramova, Nikolay Emmaussky

Abstract:

For the manufacture of critical hollow products, a laser narrow-seam welding scheme based on delivering the laser beam into the inner cavity has been developed. The report presents the results of comprehensive studies aimed at creating a sealed weld that follows the geometric shape of the inner cavity using a rotary mirror. Laser welding of hard-to-reach places requires preliminary modeling of the process to identify defect-free modes performed at the highest possible welding speed. The technological modes of the welded joint, with a seam width-to-depth ratio of 1/5 for a Ni superalloy 6.0 mm thick, were optimized using the Verhulst limited-growth model in a discrete representation. This mathematical model, in the form of a recurrence relation, made it possible to numerically investigate the full variety of laser melting modes: chaotic, self-oscillating, stationary, and attenuated. The control parameters and the order parameter, to which the other variables of the laser welding system are subordinated, were established. The coefficient of relative heat capacity of the melt bath was used as the control parameter, characterizing the competition between the heat input by the laser and the heat sink into the surrounding metal. The order parameter of the narrow-seam laser welding process, in this interpretation, is the dimensionless penetration depth, which serves as the argument of the desired logistic equation. Experimental studies of narrow-seam welding were performed with a copper, water-cooled mirror using radiation from a high-power fiber laser. The obtained results were used to validate the evolutionary mathematical model of the laser welding process.
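The discrete Verhulst recurrence referred to above is the logistic map, whose regimes (stationary, self-oscillating, chaotic) appear as the control parameter is varied. The sketch below uses the textbook map with generic parameter values; it illustrates the regime structure only, not the paper's physical calibration:

```python
def verhulst_orbit(r, x0=0.3, n_transient=500, n_keep=8):
    """Discrete Verhulst (logistic) map x_{n+1} = r * x_n * (1 - x_n)."""
    x = x0
    for _ in range(n_transient):       # discard the transient
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n_keep):            # record the attractor
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

stationary  = verhulst_orbit(2.5)      # converges to fixed point 1 - 1/r = 0.6
oscillating = verhulst_orbit(3.2)      # period-2 self-oscillation
chaotic     = verhulst_orbit(3.9)      # chaotic regime
```

In the paper's interpretation the state variable is the dimensionless penetration depth and the growth parameter encodes the relative heat capacity of the melt bath, so sweeping r corresponds to sweeping welding modes.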

Keywords: laser welding, internal cavity, limited growth model, Ni-superalloy

Procedia PDF Downloads 12
18929 Training AI to Be Empathetic and Determining the Psychotype of a Person During a Conversation with a Chatbot

Authors: Aliya Grig, Konstantin Sokolov, Igor Shatalin

Abstract:

The report describes the methodology for collecting data and building an ML model that determines a user's personality psychotype, using profiling and personality-trait methods, from several short messages sent while communicating on an arbitrary topic with a chitchat bot. In the course of the experiments, the minimum amount of text needed to confidently determine aspects of personality was identified. Model accuracy is 85%, and the users' language of communication is English. The goal is an AI that personalizes communication with a user based on their mood, personality, and current emotional state. Features investigated during the research include personalized communication, providing empathy, adaptation to the user, and predictive analytics. In the report, we describe a process that captures both structured and unstructured data pertaining to a user in large quantities and diverse forms. This data is then processed through ML tools to construct a knowledge graph and draw comprehensive inferences about the users behind the text messages. Specifically, the system analyzes users' behavioral patterns and predicts future scenarios based on this analysis. Building on these experiments, we outline further research on training AI models to be empathetic and on creating personalized communication for users.
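The text-to-trait step can be caricatured with a bag-of-words nearest-centroid classifier. This is a deliberately toy stand-in: the labels, snippets, and features below are invented, and the actual system uses a far richer profiling model and knowledge graph:

```python
from collections import Counter
import math

# Toy labeled snippets standing in for user chat messages (all invented)
train = [
    ("i love meeting new people and parties", "extravert"),
    ("we should all hang out together more", "extravert"),
    ("i prefer quiet evenings reading alone", "introvert"),
    ("crowds drain me, i need time by myself", "introvert"),
]

def vectorize(text):
    """Bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# One centroid per psychotype label: summed bag-of-words
centroids = {}
for text, label in train:
    centroids.setdefault(label, Counter()).update(vectorize(text))

def predict(text):
    v = vectorize(text)
    return max(centroids, key=lambda lbl: cosine(v, centroids[lbl]))
```

Even this crude sketch makes the abstract's central question concrete: how few words suffice before the prediction becomes confident, which the study answers empirically for its real model.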

Keywords: AI, empathetic, chatbot, AI models

Procedia PDF Downloads 94
18928 Compromising Relevance for Elegance: A Danger of Dominant Growth Models for Backward Economies

Authors: Givi Kupatadze

Abstract:

Backward economies face the challenge of achieving a sustainably high rate of economic growth. Dominant growth models represent a roadmap for framing economic development strategy. This paper examines the relevance of the dominant growth models for backward economies. The Cobb-Douglas production function, the Harrod-Domar model of economic growth, the Solow growth model, and the general formula of gross domestic product are examined in a comprehensive study of the dominant growth models. The deductive research method makes it possible to uncover major weaknesses of the dominant growth models and to derive practical implications for economic development strategy. The key finding of the paper shows, contrary to what is taught in economics textbooks, that the constant-returns-to-scale property of the dominant growth models is a mere coincidence, and its generalization over space and time can be regarded as one of the most unfortunate mistakes in the whole field of political economy. The major suggestion of the paper for backward economies is that understanding and applying a taxonomy of economic activities based on increasing and diminishing returns to scale is a cornerstone of a successful economic development strategy.
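The returns-to-scale property at issue can be checked numerically for the Cobb-Douglas function Y = A·K^α·L^β: scaling both inputs by λ scales output by λ^(α+β), so constant returns hold only in the special case α + β = 1. The parameter values below are generic, chosen only to show the two cases:

```python
def cobb_douglas(K, L, A=1.0, alpha=0.3, beta=0.7):
    """Y = A * K^alpha * L^beta; returns to scale are governed by alpha + beta."""
    return A * K**alpha * L**beta

K, L, lam = 4.0, 9.0, 2.0

# alpha + beta = 1.0: doubling both inputs exactly doubles output
crs = cobb_douglas(lam*K, lam*L) / cobb_douglas(K, L)

# alpha + beta = 1.2: doubling both inputs more than doubles output
irs = (cobb_douglas(lam*K, lam*L, alpha=0.5, beta=0.7)
       / cobb_douglas(K, L, alpha=0.5, beta=0.7))
```

The point of the paper is that nothing forces α + β = 1 for real activities; the check above makes clear how much of the textbook property hangs on that single parameter restriction.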

Keywords: backward economies, constant returns to scale, dominant growth models, taxonomy of economic activities

Procedia PDF Downloads 376
18927 Enhancing the Resilience of Combat System-Of-Systems Under Certainty and Uncertainty: Two-Phase Resilience Optimization Model and Deep Reinforcement Learning-Based Recovery Optimization Method

Authors: Xueming Xu, Jiahao Liu, Jichao Li, Kewei Yang, Minghao Li, Bingfeng Ge

Abstract:

A combat system-of-systems (CSoS) comprises various types of functional combat entities that interact to meet corresponding task requirements in the present and future. Enhancing the resilience of CSoS holds significant military value in optimizing the operational planning process, improving military survivability, and ensuring the successful completion of operational tasks. Accordingly, this research proposes an integrated framework called CSoS resilience enhancement (CSoSRE) to enhance the resilience of CSoS from a recovery perspective. Specifically, this research presents a two-phase resilience optimization model to define a resilience optimization objective for CSoS. This model considers not only task baseline, recovery cost, and recovery time limit but also the characteristics of emergency recovery and comprehensive recovery. Moreover, the research extends it from the deterministic case to the stochastic case to describe the uncertainty in the recovery process. Based on this, a resilience-oriented recovery optimization method based on deep reinforcement learning (RRODRL) is proposed to determine a set of entities requiring restoration and their recovery sequence, thereby enhancing the resilience of CSoS. This method improves the deep Q-learning algorithm by designing a discount factor that adapts to changes in CSoS state at different phases, simultaneously considering the network’s structural and functional characteristics within CSoS. Finally, extensive experiments are conducted to test the feasibility, effectiveness and superiority of the proposed framework. The obtained results offer useful insights for guiding operational recovery activity and designing a more resilient CSoS.
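The phase-adaptive discount idea can be illustrated, very loosely, on a toy tabular problem. The paper applies deep Q-learning to a full CSoS network; here the states, rewards, phase boundary, and discount values are all invented, and the update is a deterministic tabular sweep rather than a deep network:

```python
# Toy recovery chain: states 0..4; action 1 repairs the next entity,
# action 0 leaves the system as-is. State 4 means full capability.
N_STATES, GOAL = 5, 4

def step(s, a):
    s2 = s + 1 if (a == 1 and s < GOAL) else s
    r = 10.0 if s2 == GOAL else -1.0   # each step short of full recovery costs
    return s2, r

def phase_discount(s):
    """Hedged stand-in for the paper's phase-adaptive discount: emergency
    recovery (early states) is more myopic than comprehensive recovery."""
    return 0.8 if s < 2 else 0.95

Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha = 0.5
for _ in range(200):                   # deterministic full sweeps, no exploration
    for s in range(N_STATES):
        for a in (0, 1):
            s2, r = step(s, a)
            gamma = phase_discount(s)
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])

# Greedy recovery sequence implied by the learned Q-values
policy = [max((0, 1), key=lambda a: Q[s][a]) for s in range(N_STATES)]
```

The learned greedy policy repairs at every degraded state, which is the analogue of the method's output: a set of entities to restore and the order in which to restore them.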

Keywords: combat system-of-systems, resilience optimization model, recovery optimization method, deep reinforcement learning, certainty and uncertainty

Procedia PDF Downloads 18