Search results for: facility data model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 35670

34470 An Approach for Estimation in Hierarchical Clustered Data Applicable to Rare Diseases

Authors: Daniel C. Bonzo

Abstract:

Practical considerations lead to the use of units of analysis within subjects, e.g., bleeding episodes or treatment-related adverse events, in rare disease settings. This is coupled with data augmentation techniques such as extrapolation to enlarge the subject base. In general, extrapolation of data can be thought of as extending information and conclusions from one estimand to another. This approach induces hierarchical clustered data with varying cluster sizes. Extrapolation of clinical trial data is increasingly accepted by regulatory agencies as a means of generating data in diverse situations during the drug development process. Under certain circumstances, data can be extrapolated to a different population, a different but related indication, or a different but similar product. We consider here the problem of point and interval estimation using a mixed-models approach under extrapolation. It is proposed that estimators be constructed using weighting schemes for the clusters, e.g., equal weights or weights proportional to cluster size. Simulated data generated under varying scenarios are then used to evaluate the performance of this approach. The evaluation results show that the approach is a useful means of improving statistical inference in rare disease settings and thus aids not only signal detection but also risk-benefit evaluation.
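The two cluster-weighting schemes mentioned (equal weights and weights proportional to cluster size) can be sketched as follows. This is a minimal illustration, not the authors' mixed-model estimator; the function and variable names are ours, and each cluster is assumed to be a vector of within-subject observations:

```python
import numpy as np

def cluster_weighted_estimate(clusters, weighting="equal"):
    """Point estimate across clusters of unequal size.

    clusters: list of 1-D sequences, one per subject/cluster.
    weighting: 'equal' gives every cluster the same weight;
               'size' weights each cluster by its number of observations.
    """
    means = np.array([np.mean(c) for c in clusters])
    sizes = np.array([len(c) for c in clusters], dtype=float)
    if weighting == "equal":
        w = np.ones_like(sizes)
    elif weighting == "size":
        w = sizes
    else:
        raise ValueError("weighting must be 'equal' or 'size'")
    w = w / w.sum()  # normalize weights to sum to 1
    return float(np.dot(w, means))
```

With one large cluster of identical small values and one singleton large value, the two schemes visibly pull the estimate in opposite directions, which is the trade-off the abstract evaluates by simulation.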

Keywords: clustered data, estimand, extrapolation, mixed model

Procedia PDF Downloads 133
34469 Disintegration of Deuterons by Photons Reaction Model for GEANT4 with Dibaryon Formalism

Authors: Jae Won Shin, Chang Ho Hyun

Abstract:

A reaction model for the disintegration of deuterons by photons (dγ → np) is developed for GEANT4 in this work. In an effective field theory with dibaryon fields, introducing a dibaryon field allows us to take into account the effective-range contribution to the propagator up to infinite order, which consequently makes the convergence of the theory better than that of the pionless effective field theory without dibaryon fields. We develop a hadronic model for GEANT4 that is specialized for the disintegration of the deuteron by photons, dγ → np. For the description of two-nucleon interactions, we employ an effective field theory, the so-called pionless theory with dibaryon fields (dEFT). In spite of its simplicity, the theory has proven very effective and useful in applications to various two-nucleon systems and processes at low energies. We apply the new GEANT4 model (G4dEFT) to the calculation of total and differential cross sections in dγ → np and obtain good agreement with experimental data over a wide range of incoming photon energies.

Keywords: dγ → np, dibaryon fields, effective field theory, GEANT4

Procedia PDF Downloads 372
34468 Airbnb, Hotel Industry and Optimum Strategies: Evidence from European Cities, Barcelona, London and Paris

Authors: Juan Pedro Aznar Alarcon, Josep Maria Sayeras Maspera

Abstract:

Airbnb and other similar platforms offer a near substitute for the traditional accommodation service supplied by the hotel sector. In this context, hotels can try to compete by offering higher quality and additional services, which implies the need for new investments, or by reducing prices. The theoretical model presented in this paper analyzes the best response using a sequential game theory model. The main conclusion is that, due to their financial constraints, small and medium hotels have reduced prices, whereas hotels that belong to international groups or have easy access to financial resources have increased their investment to raise the quality of the service provided. To check the validity of the theoretical model, financial data from Barcelona, London, and Paris hotels have been used to analyze profitability, quality of the service provided, investment propensity, and the evolution of gross profit. The model and the empirical data provide the basis for industrial policy in the hospitality industry. Addressing the extra cost that small hotels in Europe face compared with bigger firms would help improve the level of quality provided and, to some extent, generate positive externalities in terms of job creation and increasing added value for the industry.

Keywords: Airbnb, profitability, hospitality industry, game theory

Procedia PDF Downloads 344
34467 Tackling the Value-Action-Gap: Improving Civic Participation Using a Holistic Behavioral Model Approach

Authors: Long Pham, Julia Blanke

Abstract:

An increasingly popular way of establishing citizen engagement within communities is through ‘city apps’. Currently, most of these mobile applications seem to be extensions of existing communication media, sometimes merely replicating the information available on classical city web sites, and therefore have minimal additional impact on citizen behavior and engagement. To overcome this challenge, we propose to use a holistic behavioral model to generate dynamic and contextualized app content by optimizing well-defined city-related performance goals constrained by the proposed behavioral model. In this paper, we show how the data collected by the CorkCitiEngage project in the Irish city of Cork can be used to calibrate aspects of the proposed model, enabling the design of a personalized citizen engagement app that aims to positively influence people’s behavior towards more active participation in their communities. We focus on the important aspect of intention to act, which is essential for understanding the reasons behind the common value-action gap responsible for the mismatch between good intentions and actual observable behavior, and we discuss how customized app design can be based on a rigorous model of behavior optimized towards maximizing well-defined city-related performance goals.

Keywords: city apps, holistic behaviour model, intention to act, value-action-gap, citizen engagement

Procedia PDF Downloads 224
34466 Combining the Dynamic Conditional Correlation and Range-GARCH Models to Improve Covariance Forecasts

Authors: Piotr Fiszeder, Marcin Fałdziński, Peter Molnár

Abstract:

The dynamic conditional correlation model of Engle (2002) is one of the most popular multivariate volatility models. However, this model is based solely on closing prices. It has been documented in the literature that the high and low prices of the day can be used for efficient volatility estimation. We therefore suggest a model which incorporates high and low prices into the dynamic conditional correlation framework. Empirical evaluation of this model is conducted on three datasets: currencies, stocks, and commodity exchange-traded funds. The use of realized variances and covariances as proxies for true variances and covariances allows us to reach a strong conclusion that our model outperforms not only the standard dynamic conditional correlation model but also a competing range-based dynamic conditional correlation model.
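As a concrete illustration of how high and low prices enter volatility estimation, the classical Parkinson (1980) range-based estimator uses only the daily high and low. This is a minimal sketch of that standard estimator, not the authors' full range-GARCH specification:

```python
import math

def parkinson_variance(high, low):
    """Parkinson (1980) range-based estimate of daily return variance.

    Uses only the day's high and low prices:
    sigma^2 ≈ (ln(high/low))^2 / (4 ln 2).
    """
    return (math.log(high / low) ** 2) / (4.0 * math.log(2.0))
```

A day with no range (high equal to low) yields zero variance, and wider ranges yield larger estimates, which is the extra information a range-based DCC model exploits over close-to-close returns.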

Keywords: volatility, DCC model, high and low prices, range-based models, covariance forecasting

Procedia PDF Downloads 180
34465 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana

Authors: Gautier Viaud, Paul-Henry Cournède

Abstract:

Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take genotype-by-environment interactions into account. One way of achieving such an objective is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model describes the phenotype of the plant as a function of individual parameters; the second level describes how these individual parameters are distributed within a plant population; the third level corresponds to the attribution of priors on population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters makes it possible to derive analytical expressions for the full conditional distributions of these population parameters. As plant growth models are of a nonlinear nature, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed. This allows for the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state-space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax with metaprogramming capabilities and exhibits high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale GreenLab model for the latter is thus presented, in which the surface area of each individual leaf can be simulated. It is assumed that the error made in measuring leaf areas is proportional to the leaf area itself; multiplicative normal noises for the observations are therefore used.
Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days using a two-step segmentation and tracking algorithm which notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, the data for a single individual need not be available at all times, nor do the times at which data are available need to be the same for all individuals. This makes it possible to discard data from image analysis when they are not considered reliable enough, thereby providing low-biased data in large quantities for leaf areas. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana’s growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects.
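The Metropolis step within such a Gibbs-Metropolis sampler can be sketched in a few lines. This is a generic random-walk Metropolis update, not the authors' Julia implementation; it assumes only a callable log-posterior and uses names of our own choosing:

```python
import math
import random

def metropolis_step(theta, log_post, step=0.5, rng=random):
    """One random-walk Metropolis update: propose, then accept or reject."""
    proposal = theta + rng.gauss(0.0, step)
    log_alpha = log_post(proposal) - log_post(theta)
    if math.log(rng.random()) < log_alpha:
        return proposal, True   # proposal accepted
    return theta, False         # proposal rejected, keep current value

def sample(log_post, theta0, n_steps, seed=0):
    """Run a short Metropolis chain and return the list of draws."""
    rng = random.Random(seed)
    theta, draws = theta0, []
    for _ in range(n_steps):
        theta, _ = metropolis_step(theta, log_post, rng=rng)
        draws.append(theta)
    return draws
```

In the hierarchical setting, such a step replaces the explicit draw for the nonlinear individual parameters, while the population parameters are still sampled from their closed-form full conditionals.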

Keywords: Bayesian estimation, genotypic differentiation, hierarchical models, plant growth models

Procedia PDF Downloads 297
34464 The Climate Impact Due to Clouds and Selected Greenhouse Gases by Short Wave Upwelling Radiative Flux within Spectral Range of Space-Orbiting Argus1000 Micro-Spectrometer

Authors: Rehan Siddiqui, Brendan Quine

Abstract:

The Radiance Enhancement (RE) and integrated absorption technique is applied to develop a synthetic model that determines the enhancement in radiance due to cloud scenes and the Shortwave upwelling Radiances (SHupR) of O2, H2O, CO2, and CH4. This new model is used to estimate the variation in magnitude of RE and SHupR over the spectral range of 900 nm to 1700 nm by varying surface altitude, mixing ratios, and surface reflectivity. In this work, we employ real satellite observations from the space-orbiting Argus 1000, especially for O2, H2O, CO2, and CH4, together with a synthetic model based on the line-by-line GENSPECT radiative transfer model. All radiative transfer simulations have been performed over a range of percentages of water vapor and carbon dioxide content, with fixed concentrations of oxygen and methane. We calculate and compare both the synthetic and the real measured data sets for different weekly passes of the Argus flight. Results are found to be comparable for both approaches, after allowing for the differences between the real and synthetic techniques. The methodology based on the RE and SHupR of space spectral data is promising for the instant and reliable classification of cloud scenes.

Keywords: radiance enhancement, radiative transfer, shortwave upwelling radiative flux, cloud reflectivity, greenhouse gases

Procedia PDF Downloads 331
34463 The Magnitude Scale Evaluation of Cross-Platform Internet Public Opinion

Authors: Yi Wang, Xun Liang

Abstract:

This paper introduces a model of internet public opinion waves, which describes message propagation and measures the influence of a detected event. We collect data on public opinion propagation from different platforms on the internet, including micro-blogs and news. Then, we compare the spread of public opinion to seismic waves and correspondingly define the P-wave and S-wave and other essential attributes and characteristics of the process. Further, a model is established to evaluate the magnitude scale of the events. In the end, a practical example is used to analyze the influence of network public opinion and to test the reasonability and effectiveness of the proposed model.
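By analogy with the seismic magnitude scale the paper invokes, a Richter-style event magnitude can be defined as the base-10 logarithm of the peak propagation intensity relative to a reference level. This is a minimal sketch of that analogy only; the authors' actual magnitude formula may differ:

```python
import math

def opinion_magnitude(peak_amplitude, reference_amplitude=1.0):
    """Richter-style magnitude: log10 of peak signal over a reference level.

    Each unit of magnitude corresponds to a tenfold increase in
    peak propagation intensity (e.g., posts or shares per hour).
    """
    return math.log10(peak_amplitude / reference_amplitude)
```

On this scale, an event whose peak intensity is 1000 times the reference registers magnitude 3, so a small set of integer magnitudes can summarize events whose raw intensities span several orders of magnitude.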

Keywords: internet public opinion waves (IPOW), magnitude scale, cross-platform, information propagation

Procedia PDF Downloads 284
34462 Preferred Service Delivery Options for Female Sex Workers in the Riverine Area of Lomé, Togo

Authors: Gbone Akou Sophie

Abstract:

Lomé State in Togo is considered to have the highest HIV prevalence in the country according to NAIIS 2023, with a prevalence of 5.5%. Female Sex Workers (FSW) are one of the most vulnerable populations, and they are vital in HIV programming. They have the highest HIV prevalence compared to other key populations such as HRM, PWID, and transgender people in Lomé State, Togo. Evidence from the Integrated Biological and Behavioral Surveillance Survey shows an increasing burden of HIV infection among FSW, from 13.7% in 2018 to 17.2% in 2020 and 22.9% in 2021. This shows their HIV prevalence has been rising over time. The vulnerability of FSW in the riverine areas of Lomé is heightened by cultural and economic issues, where sex is exchanged for commodities with cross-border traders, as well as by limited access to HIV prevention information. Methods: A cross-sectional study recruited 120 FSW from the two riverine LGAs of Agoe and Kpehenou in Lomé State using both snowballing and simple random sampling techniques. A semi-structured questionnaire was used as the instrument for data collection among the 120 FSW respondents. Additional information was also elicited from 10 FSW key opinion leaders and community members through in-depth interviews (IDI). Results: 44 (36%) of respondents were willing to receive regular HIV care and services, as well as STI check-up visits, at any service point. However, 47 (40%) were willing to receive services at private facilities alone, 10 (8%) at public facilities, and 6 (5%) in their homes rather than in a health facility. 13 (11%) were also willing to have peers assist them in getting HIV testing services. Conclusion: An integrated differentiated model of care for HIV services helps improve HIV service uptake among the FSW community, especially in hard-to-reach riverine areas, which will further contribute to epidemic control.
Targeted HIV information should also be designed to suit the learning needs of hard-to-reach communities like the riverine areas. More peer educators should be engaged to ensure information and other HIV services reach the riverine communities.

Keywords: female sex workers (FSW), human immunodeficiency virus (HIV), prevalence, service delivery

Procedia PDF Downloads 66
34461 LaPEA: Language for Preprocessing of Edge Applications in Smart Factory

Authors: Masaki Sakai, Tsuyoshi Nakajima, Kazuya Takahashi

Abstract:

In order to improve the productivity of a factory, it is common practice to create an inference model by collecting and analyzing operational data off-line and then to develop an edge application (EAP) that evaluates the quality of the products or diagnoses machine faults in real time. To accelerate this development cycle, an edge application framework for the smart factory is proposed, which makes it possible to create and modify EAPs based on prepared inference models. In the framework, the preprocessing component is the key part making it work. This paper proposes a language for the preprocessing of edge applications, called LaPEA, which can flexibly process several kinds of sensor data from machines into explanatory variables for an inference model, and proves that it meets the requirements for the preprocessing.

Keywords: edge application framework, edgecross, preprocessing language, smart factory

Procedia PDF Downloads 141
34460 Developing a Sustainable Business Model for Platform-Based Applications in Small and Medium-Sized Enterprise Sawmills: A Systematic Approach

Authors: Franziska Mais, Till Gramberg

Abstract:

The paper presents the development of a sustainable business model for a platform-based application tailored for sawing companies in small and medium-sized enterprises (SMEs). The focus is on the integration of sustainability principles into the design of the business model to ensure a technologically advanced, legally sound, and economically efficient solution. Easy2IoT is a research project that aims to enable companies in the prefabrication sheet metal and sheet metal processing industry to enter the Industrial Internet of Things (IIoT) with a low-threshold and cost-effective approach. The methodological approach of Easy2IoT includes an in-depth requirements analysis and customer interviews with stakeholders along the value chain. Based on these insights, actions, requirements, and potential solutions for smart services are derived. The structuring of the business ecosystem within the application plays a central role, whereby the roles of the partners, the management of the IT infrastructure and services, as well as the design of a sustainable operator model are considered. The business model is developed using the value proposition canvas, whereby a detailed analysis of the requirements for the business model is carried out, taking sustainability into account. This includes coordination with the business model patterns, according to Gassmann, and integration into a business model canvas for the Easy2IoT product. Potential obstacles and problems are identified and evaluated in order to formulate a comprehensive and sustainable business model. In addition, sustainable payment models and distribution channels are developed. In summary, the article offers a well-founded insight into the systematic development of a sustainable business model for platform-based applications in SME sawmills, with a particular focus on the synergy of ecological responsibility and economic efficiency.

Keywords: business model, sustainable business model, IIoT, IIoT platform, industry 4.0, big data

Procedia PDF Downloads 77
34459 Understanding Post-Displacement Earnings Losses: The Role of Wealth Inequality

Authors: M. Bartal

Abstract:

A large body of empirical evidence points to sizable lifetime earnings losses associated with the displacement of tenured workers. The causes of these losses are still not well understood. Existing explanations rely heavily on human capital depreciation during non-employment spells. In this paper, a new avenue is explored. Evidence on the role of household liquidity constraints in accounting for the persistence of post-displacement earnings losses is provided based on SIPP data. Then, a directed search and matching model with endogenous human capital and wealth accumulation is introduced. The model is computationally tractable thanks to its block-recursive structure and highlights a non-trivial, yet intuitive, interaction between wealth and human capital. Constrained workers tend to accept jobs with little firm-sponsored training because the latter are (endogenously) easier to find. This new channel provides a plausible explanation for why young (highly constrained) workers suffer persistent scars after displacement. Finally, the model is calibrated on US data to show that the interplay between wealth and human capital is crucial to replicate the observed lifecycle pattern of earnings losses. JEL: E21, E24, J24, J63.

Keywords: directed search, human capital accumulation, job displacement, wealth accumulation

Procedia PDF Downloads 203
34458 Assessment of Soil Erosion Risk Using the Soil and Water Assessment Tool Model: Case of Siliana Watershed, Northwest Tunisia

Authors: Sana Dridi, Jalel Aouissi, Rafla Attia, Taoufik Hermassi, Thouraya Sahli

Abstract:

Soil erosion is an increasing issue in Mediterranean countries. In Tunisia, the capacity of dam reservoirs continues to decrease as a consequence of soil erosion. This study aims to predict sediment yield to inform soil management practices using the Soil and Water Assessment Tool (SWAT) model in the Siliana watershed (1041.6 km²), located in the northwest of Tunisia. A database was constructed using remote sensing and a Geographical Information System. Climatic and flow data were collected from the water resources directorates in Tunisia. The SWAT model was built to simulate hydrological processes and sediment transport. Sensitivity analysis, calibration, and validation were performed using the SWAT-CUP software. The model calibration of streamflow simulations shows good performance, with NSE and R² values of 0.77 and 0.79, respectively. The model validation shows very good performance, with NSE and R² values of 0.8 and 0.88, respectively. After calibration and validation of the streamflow simulation, the model was used to simulate soil erosion and sediment load transport. The spatial distribution of soil loss rates for determining the critical sediment source areas shows that 63% of the study area has a low soil loss rate of less than 7 t ha⁻¹y⁻¹. The annual average soil loss rate simulated with the SWAT model in the Siliana watershed is 4.62 t ha⁻¹y⁻¹.
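The NSE goodness-of-fit statistic reported for calibration and validation compares simulated streamflow against observations; a value of 1 is a perfect fit, and 0 means the model is no better than predicting the observed mean. A minimal sketch of the standard definition:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.

    1.0 is a perfect fit; 0.0 means the simulation performs no better
    than the mean of the observed series; negative values are worse.
    """
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst
```

By this convention, the calibration value of 0.77 and validation value of 0.8 quoted above indicate that the model explains most of the observed streamflow variance.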

Keywords: water erosion, SWAT model, streamflow, SWAT-CUP, sediment yield

Procedia PDF Downloads 98
34457 Reaching the Goals of Routine HIV Screening Programs: Quantifying and Implementing an Effective HIV Screening System in Northern Nigeria Facilities Based on Optimal Volume Analysis

Authors: Folajinmi Oluwasina, Towolawi Adetayo, Kate Ssamula, Penninah Iutung, Daniel Reijer

Abstract:

Objective: Routine HIV screening has been promoted as an essential component of efforts to reduce incidence, morbidity, and mortality. The objectives of this study were to identify the optimal annual volume needed to realize the public health goals of HIV screening in AIDS Healthcare Foundation-supported hospitals and to establish an implementation process to realize that optimal annual volume. Methods: Starting in 2011, a program was established to routinize HIV screening within communities and government hospitals. In 2016, five years of HIV screening data were reviewed to identify the optimal annual proportions of age-eligible patients screened to realize the public health goals of reducing new diagnoses and ending late-stage diagnosis (tracked as concurrent HIV/AIDS diagnosis). The analysis demonstrated that rates of new diagnoses level off when 42% of age-eligible patients are screened, providing a baseline for routine screening efforts, and that concurrent HIV/AIDS diagnoses reach statistical zero at screening rates of 70%. Annual facility-based targets were restructured to meet these new target volumes. Restructuring efforts focused on right-sizing HIV screening programs to align and transition them to integrated HIV screening within standard medical care and treatment. Results: Over one million patients were screened for HIV during the five years; 16,033 received new HIV diagnoses, 82% (13,206) were successfully linked to care and treatment, and concurrent diagnosis rates went from 32.26% to 25.27%. While screening rates increased by 104.7% over the 5 years, volume analysis demonstrated that rates need to increase further by 62.52% to reach the desired 20% baseline and more than double to reach the optimal annual screening volume. In 2011, facility targets for HIV screening were increased to reflect the volume analysis, and in the third year, 12 of the 19 facilities reached or exceeded the new baseline targets.
Conclusions and Recommendations: Quantifying targets against routine HIV screening goals identified the optimal annual screening volume and allowed facilities to scale their program size and allocate resources accordingly. The program transitioned from non-evidence-based annual volume increases to annual targets based on optimal volume analysis. This has allowed efforts to be evaluated on their ability to realize quantified goals related to the public health value of HIV screening. Optimal volume analysis helps determine the size of an HIV screening program; it is a public health tool, not a tool to determine whether an individual patient should receive screening.

Keywords: HIV screening, optimal volume, HIV diagnosis, routine

Procedia PDF Downloads 261
34456 Model Based Simulation Approach to a 14-DOF Car Model Using MATLAB/Simulink

Authors: Ishit Sheth, Chandrasekhar Jinendran, Chinmaya Ranjan Sahu

Abstract:

A fourteen-degree-of-freedom (DOF) ride and handling control mathematical model is developed for a car using the generalized Boltzmann-Hamel equation, which creates a basis for the design of a ride and handling controller. The mathematical model developed yields equations of motion for non-holonomic constrained systems in quasi-coordinates. The governing differential equations integrate ride and handling control of the car. A model-based systems engineering approach is implemented for simulation using MATLAB/Simulink; the vehicle’s response in different DOF is examined and later validated using commercial software (ADAMS). This manuscript involves a detailed derivation of the full car vehicle model, which provides the response in longitudinal, lateral, and yaw motion to demonstrate the advantages of the developed model over the existing dynamic model. The dynamic behaviour of the developed ride and handling model is simulated for different road conditions.

Keywords: full vehicle model, MBSE, non-holonomic constraints, Boltzmann-Hamel equation

Procedia PDF Downloads 222
34455 Study on Water Level Management Criteria of Reservoir Failure Alert System

Authors: B. Lee, B. H. Choi

Abstract:

The loss of safety of reservoirs brought about by climate change and facility aging leads to reservoir failures, which result in loss of life and property damage in downstream areas. Therefore, it is necessary to provide a reservoir failure alert system that enables downstream residents to detect the early signs of failure (with sensors) in real time and to perform safety management to prevent and minimize possible damage. Ten case studies were carried out to verify the water level management criteria of the four levels (attention, caution, alert, serious). Peak changes in water level data were analysed. The results showed that ‘Caution’ and ‘Alert’ were close to 33% and 66% of the difference between the flood water level and the full water level. Therefore, it is adequate to use these initial water level management criteria for the reservoir failure alert system in its first year. Acknowledgment: This research was supported by a grant (2017-MPSS31-002) from the ‘Supporting Technology Development Program for Disaster Management’ funded by the Ministry of the Interior and Safety (MOIS).
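The reported thresholds (roughly 33% and 66% of the band between the full and flood water levels) can be turned into a simple level classifier. This is a hypothetical sketch of the criteria as described, not the deployed system's logic, and it assumes 'Serious' is reached at or above the flood level:

```python
def alert_level(water_level, full_level, flood_level):
    """Classify a water level on the four-stage scale.

    'Caution' and 'Alert' begin at roughly 33% and 66% of the difference
    between the flood water level and the full water level, per the study.
    """
    frac = (water_level - full_level) / (flood_level - full_level)
    if frac >= 1.0:
        return "serious"
    if frac >= 0.66:
        return "alert"
    if frac >= 0.33:
        return "caution"
    return "attention"
```

For example, with a full level of 100 m and a flood level of 200 m, a reading of 150 m sits halfway up the band and classifies as 'caution'.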

Keywords: alert system, management criteria, reservoir failure, sensor

Procedia PDF Downloads 196
34454 Pattern of Physical Activity and Its Impact on the Quality of Life: A Structural Equation Modelling Analysis

Authors: Ali Maksum

Abstract:

In a number of countries, including Indonesia, the incidence of non-communicable diseases is increasing. As a result, the health costs that must be paid by the state continue to increase as well. People’s lifestyles, including a lack of physical activity, are thought to have contributed significantly to the problem. This study aims to examine the impact of participation in sports on quality of life, which is reflected in three main indicators, namely health, psychological, and social aspects. The study was conducted in the city of Surabaya and its surroundings, with a total of 490 participants, consisting of 245 men and 245 women with an average age of 45.4 years. Data on physical activity and quality of life were collected by questionnaire and analyzed using structural equation modeling. The test results for the model give chi-square = 8.259 with p = .409, RMSEA = .008, NFI = .992, and CFI = 1, which means the model is compatible with the data. The model shows that physical activity has a significant effect on quality of life. People who exercise regularly are better able to cope with stress, have a lower risk of illness, and exhibit more pro-social behavior. Therefore, serious efforts are needed from stakeholders, especially the government, to create an ecosystem that allows a culture of movement to grow in the community.

Keywords: participation, physical activity, quality of life, structural equation modelling

Procedia PDF Downloads 118
34453 Comprehensive Risk Assessment Model in Agile Construction Environment

Authors: Jolanta Tamošaitienė

Abstract:

The article focuses on a comprehensive model developed for risk assessment and selection in an agile environment based on multi-attribute methods. The model relies on a multi-attribute evaluation of risk in construction, and the optimality criterion values of the attributes are calculated using complex multiple-criteria decision-making methods. The model may be further applied to risk assessment in an agile construction environment. The attributes of risk in a construction project are selected by applying the risk assessment conditions of the construction sector, and the efficiency of the construction process accounts for the agile environment. The paper provides a background, a description of the proposed model, and an analysis of the comprehensive risk assessment model in an agile construction environment together with the criteria.
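A multi-attribute evaluation of this kind can be illustrated with simple additive weighting (SAW), one of the basic MCDM methods. This is a sketch for intuition only; the paper's actual "complex" MCDM methods are not specified here, and positive attribute values are assumed:

```python
def saw_scores(matrix, weights, benefit):
    """Simple additive weighting (SAW) scores for multi-attribute evaluation.

    matrix:  rows = alternatives (e.g., risks), columns = attribute values.
    weights: attribute weights summing to 1.
    benefit: per-attribute flag, True if larger values are better,
             False for cost-type attributes (smaller is better).
    """
    cols = list(zip(*matrix))  # attribute columns, for normalization
    scores = []
    for row in matrix:
        s = 0.0
        for v, w, b, col in zip(row, weights, benefit, cols):
            # benefit attributes: v / max; cost attributes: min / v
            norm = v / max(col) if b else min(col) / v
            s += w * norm
        scores.append(s)
    return scores
```

Each alternative's attributes are normalized against the best value in its column, so the scores are comparable and the highest score identifies the preferred alternative.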

Keywords: assessment, environment, agile, model, risk

Procedia PDF Downloads 250
34452 Algorithm Development of Individual Lumped Parameter Modelling for Blood Circulatory System: An Optimization Study

Authors: Bao Li, Aike Qiao, Gaoyang Li, Youjun Liu

Abstract:

Background: The lumped parameter model (LPM) is a common numerical model for hemodynamic calculation. An LPM uses circuit elements to simulate the human blood circulatory system, and physiological indicators and characteristics can be acquired through the model. However, because physiological indicators differ across individuals, the parameters in an LPM should be personalized so that the calculated results are convincing and reflect the individual's physiological information. This study aimed to develop an automatic and effective optimization method to personalize the parameters in an LPM of the blood circulatory system, which is of great significance for the numerical simulation of individual hemodynamics. Methods: A closed-loop LPM of the human blood circulatory system, applicable to most persons, was established based on anatomical structures and physiological parameters. Patient-specific physiological data from 5 volunteers were collected non-invasively as the personalization objectives of the individual LPMs. In this study, the blood pressure and flow rate of the heart, brain, and limbs were the main concerns. The collected systolic blood pressure, diastolic blood pressure, cardiac output, and heart rate were set as objective data, and the waveforms of carotid artery flow and ankle pressure were set as objective waveforms. A sensitivity analysis of each parameter in the LPM was conducted against the collected data and waveforms to determine the sensitive parameters that have an obvious influence on the objectives. Simulated annealing was adopted to iteratively optimize the sensitive parameters, with the objective function during optimization being the root mean square error between the collected and simulated waveforms and data. Each parameter in the LPM was optimized 500 times. Results: In this study, the sensitive parameters in the LPM were optimized against the collected data of the 5 individuals. The results show only a slight error between the collected and simulated data. The average relative root mean square errors of all optimization objectives for the 5 samples were 2.21%, 3.59%, 4.75%, 4.24%, and 3.56%, respectively. Conclusions: The slight errors demonstrate the good effect of the optimization. The individual modeling algorithm developed in this study can effectively achieve the individualization of an LPM of the blood circulatory system. After optimization, an LPM with individual parameters can output individual physiological indicators, which are applicable to the numerical simulation of patient-specific hemodynamics.
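As a rough illustration of the optimization loop described above, the sketch below personalizes two hypothetical LPM parameters by simulated annealing, minimizing the root mean square error between simulated and measured indicators. The circuit solver is replaced by a toy placeholder, and the perturbation size, cooling schedule, and iteration count are assumptions, not the study's settings.

```python
import math
import random

def rmse(simulated, measured):
    """Root mean square error between simulated and measured indicators."""
    return math.sqrt(sum((s - m) ** 2 for s, m in zip(simulated, measured)) / len(simulated))

def simulate_lpm(params):
    # Toy placeholder for the lumped-parameter circuit solver: maps two
    # hypothetical parameters (e.g. a resistance and a compliance) to
    # predicted indicators such as systolic/diastolic pressure.
    r, c = params
    return [120.0 * r / c, 80.0 * r * c]

def anneal(measured, init_params, n_iter=500, temp0=1.0, cooling=0.99, seed=0):
    """Simulated annealing over the sensitive LPM parameters."""
    rng = random.Random(seed)
    params = list(init_params)
    best = list(params)
    best_err = err = rmse(simulate_lpm(params), measured)
    temp = temp0
    for _ in range(n_iter):
        # Propose a small multiplicative perturbation of each parameter.
        candidate = [p * (1 + rng.uniform(-0.05, 0.05)) for p in params]
        cand_err = rmse(simulate_lpm(candidate), measured)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if cand_err < err or rng.random() < math.exp((err - cand_err) / temp):
            params, err = candidate, cand_err
            if err < best_err:
                best, best_err = list(params), err
        temp *= cooling
    return best, best_err
```

In practice the proposal step, cooling rate, and stopping rule would be tuned per parameter, and the placeholder solver replaced by the full closed-loop circuit model.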

Keywords: blood circulatory system, individual physiological indicators, lumped parameter model, optimization algorithm

Procedia PDF Downloads 134
34451 Impact of Climate on Sugarcane Yield over Belagavi District, Karnataka Using a Statistical Model

Authors: Girish Chavadappanavar

Abstract:

The impact of climate on agriculture could result in problems with food security and may threaten the livelihood activities upon which much of the population depends. In the present study, a statistical yield forecast model has been developed for sugarcane production in Belagavi district, Karnataka, using weather variables of the crop growing season and past observed yield data for the period 1971 to 2010. The study shows that this type of statistical yield forecast model can efficiently forecast sugarcane yield 5 weeks and even 10 weeks in advance of the harvest within an acceptable limit of error. The performance of the model in predicting yields at the district level for the sugarcane crop is found quite satisfactory for both validation (2007 and 2008) and forecasting (2009 and 2010). In addition, the climate variability of the area has also been studied, and the data series was tested using the Mann-Kendall rank statistical test. The maximum and minimum temperatures were found to be significant, with opposite trends (a decreasing trend in maximum and an increasing trend in minimum temperature), while the other three variables were found insignificant, with different trends (rainfall and evening relative humidity increasing, and morning relative humidity decreasing).
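The Mann-Kendall rank test mentioned above can be sketched in a few lines. The implementation below omits the tie correction on the variance of S for brevity, so it is a simplified illustration rather than the exact procedure used in the study.

```python
import math

def mann_kendall(series):
    """Mann-Kendall rank test for a monotonic trend in a time series.
    Returns (trend, z): trend is 'increasing', 'decreasing' or 'no trend'
    at the two-sided 5% level. Tie correction on Var(S) is omitted."""
    n = len(series)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    z_crit = 1.96  # two-sided test at alpha = 0.05
    if abs(z) <= z_crit:
        return "no trend", z
    return ("increasing" if z > 0 else "decreasing"), z
```

Applied to annual series of maximum temperature, minimum temperature, rainfall, and relative humidity, such a routine yields the trend classifications reported in the abstract.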

Keywords: climate impact, regression analysis, yield and forecast model, sugar models

Procedia PDF Downloads 67
34450 Fair Federated Learning in Wireless Communications

Authors: Shayan Mohajer Hamidi

Abstract:

Federated Learning (FL) has emerged as a promising paradigm for training machine learning models on distributed data without the need for centralized data aggregation. In the realm of wireless communications, FL has the potential to leverage the vast amounts of data generated by wireless devices to improve model performance and enable intelligent applications. However, the fairness aspect of FL in wireless communications remains largely unexplored. This abstract presents an idea for fair federated learning in wireless communications, addressing the challenges of imbalanced data distribution, privacy preservation, and resource allocation. Firstly, the proposed approach aims to tackle the issue of imbalanced data distribution in wireless networks. In typical FL scenarios, the distribution of data across wireless devices can be highly skewed, resulting in unfair model updates. To address this, we propose a weighted aggregation strategy that assigns higher importance to devices with fewer samples during the aggregation process. By incorporating fairness-aware weighting mechanisms, the proposed approach ensures that each participating device's contribution is proportional to its data distribution, thereby mitigating the impact of data imbalance on model performance. Secondly, privacy preservation is a critical concern in federated learning, especially in wireless communications where sensitive user data is involved. The proposed approach incorporates privacy-enhancing techniques, such as differential privacy, to protect user privacy during the model training process. By adding carefully calibrated noise to the gradient updates, the proposed approach ensures that the privacy of individual devices is preserved without compromising the overall model accuracy. 
Moreover, the approach considers the heterogeneity of devices in terms of computational capabilities and energy constraints, allowing devices to adaptively adjust the level of privacy preservation to strike a balance between privacy and utility. Thirdly, efficient resource allocation is crucial for federated learning in wireless communications, as devices operate under limited bandwidth, energy, and computational resources. The proposed approach leverages optimization techniques to allocate resources effectively among the participating devices, considering factors such as data quality, network conditions, and device capabilities. By intelligently distributing the computational load, communication bandwidth, and energy consumption, the proposed approach minimizes resource wastage and ensures a fair and efficient FL process in wireless networks. To evaluate the performance of the proposed fair federated learning approach, extensive simulations and experiments will be conducted. The experiments will involve a diverse set of wireless devices, ranging from smartphones to Internet of Things (IoT) devices, operating in various scenarios with different data distributions and network conditions. The evaluation metrics will include model accuracy, fairness measures, privacy preservation, and resource utilization. The expected outcomes of this research include improved model performance, fair allocation of resources, enhanced privacy preservation, and a better understanding of the challenges and solutions for fair federated learning in wireless communications. The proposed approach has the potential to revolutionize wireless communication systems by enabling intelligent applications while addressing fairness concerns and preserving user privacy.
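A minimal sketch of two of the mechanisms described above — fairness-aware weighting that favors devices with fewer samples, and Gaussian noise on the updates in the spirit of differential privacy — might look as follows. The inverse-count weighting rule and the noise scale are illustrative assumptions; a formal DP guarantee would additionally require gradient clipping and calibrated noise.

```python
import random

def fair_aggregate(updates, sample_counts, noise_std=0.0, seed=0):
    """Fairness-aware aggregation: devices with FEWER samples get HIGHER
    weight (weights proportional to 1/n_k), the opposite of vanilla FedAvg.
    Optionally perturbs each update with Gaussian noise as a stand-in for
    differential privacy (no clipping, so no formal guarantee)."""
    rng = random.Random(seed)
    inv = [1.0 / n for n in sample_counts]
    total = sum(inv)
    weights = [w / total for w in inv]  # normalized inverse-count weights
    dim = len(updates[0])
    aggregated = [0.0] * dim
    for update, w in zip(updates, weights):
        for d in range(dim):
            noisy = update[d] + rng.gauss(0.0, noise_std)
            aggregated[d] += w * noisy
    return aggregated
```

With two devices holding 10 and 30 samples, the smaller device receives weight 0.75 and the larger 0.25, so the aggregate is pulled toward the under-represented client.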

Keywords: federated learning, wireless communications, fairness, imbalanced data, privacy preservation, resource allocation, differential privacy, optimization

Procedia PDF Downloads 74
34449 A Sharp Interface Model for Simulating Seawater Intrusion in the Coastal Aquifer of Wadi Nador (Algeria)

Authors: Abdelkader Hachemi, Boualem Remini

Abstract:

Seawater intrusion is a significant challenge faced by coastal aquifers in the Mediterranean basin. This study aims to determine the position of the sharp interface between seawater and freshwater in the aquifer of Wadi Nador, located in the Wilaya of Tipaza, Algeria. A numerical areal sharp interface model using the finite element method is developed to investigate the spatial and temporal behavior of seawater intrusion. The aquifer is assumed to be homogeneous and isotropic. The simulation results are compared with geophysical prospection data obtained through electrical methods in 2011 to validate the model. The simulation results demonstrate a good agreement with the geophysical prospection data, confirming the accuracy of the sharp interface model. The position of the sharp interface in the aquifer is found to be approximately 1617 meters from the sea. Two scenarios are proposed to predict the interface position for the year 2024: one without pumping and the other with pumping. The results indicate a noticeable retreat of the sharp interface position in the first scenario, while a slight decline is observed in the second scenario. The findings of this study provide valuable insights into the dynamics of seawater intrusion in the Wadi Nador aquifer. The predicted changes in the sharp interface position highlight the potential impact of pumping activities on the aquifer's vulnerability to seawater intrusion. This study emphasizes the importance of implementing measures to manage and mitigate seawater intrusion in coastal aquifers. The sharp interface model developed in this research can serve as a valuable tool for assessing and monitoring the vulnerability of aquifers to seawater intrusion.
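The study positions the interface with a finite element model; as a much simpler point of reference, the classical Ghyben-Herzberg hydrostatic approximation relates the depth of a sharp freshwater/seawater interface below sea level to the freshwater head. The sketch below implements only this textbook relation, not the paper's FEM model.

```python
def ghyben_herzberg_depth(head, rho_fresh=1000.0, rho_sea=1025.0):
    """Depth of the sharp freshwater/seawater interface below sea level,
    from the classical Ghyben-Herzberg hydrostatic approximation:
    z = rho_f / (rho_s - rho_f) * h, roughly 40 * h for typical densities.
    `head` is the freshwater table elevation above sea level (same units)."""
    return rho_fresh / (rho_sea - rho_fresh) * head
```

A 1 m freshwater head thus implies an interface about 40 m below sea level, which illustrates why even modest head declines from pumping can let the interface advance inland.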

Keywords: seawater intrusion, sharp interface, coastal aquifer, Algeria

Procedia PDF Downloads 116
34448 BIM-Based Tool for Sustainability Assessment and Certification Documents Provision

Authors: Taki Eddine Seghier, Mohd Hamdan Ahmad, Yaik-Wah Lim, Samuel Opeyemi Williams

Abstract:

Assessing building sustainability to achieve a specific green benchmark and preparing the documents required to receive a green building certification are both major, challenging tasks for a green building design team. However, this labor-intensive and time-consuming process can take advantage of available Building Information Modeling (BIM) features such as material take-off and scheduling. Furthermore, the workflow can be automated to track potentially achievable credit points and provide rating feedback for several design options by using integrated Visual Programming (VP) to handle the parameters stored within the BIM model. Hence, this study proposes a BIM-based tool that uses the Green Building Index (GBI) rating system requirements as its input case to evaluate building sustainability in the design stage of the building project life cycle. The tool comprises two key models: first, a model for the extraction, calculation, and classification of achievable credit points into a green template; second, a model for generating the documents required for green building certification. The tool was validated on a BIM model of a residential building, and it serves as proof of concept that the building sustainability assessment for GBI certification can be automatically evaluated and documented through BIM.
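The credit-tracking idea can be illustrated with a small sketch: parameters extracted from a BIM model (e.g., via material take-off) are checked against a table of credit criteria and the achievable points are tallied. The criteria names and thresholds below are hypothetical placeholders, not actual GBI items.

```python
def assess_credits(model_parameters, criteria):
    """Tally potentially achievable credit points from parameters extracted
    out of a BIM model. `criteria` maps a credit name to a (predicate,
    points) pair; a credit is earned when its predicate holds."""
    achieved = {}
    for name, (predicate, points) in criteria.items():
        if predicate(model_parameters):
            achieved[name] = points
    return achieved, sum(achieved.values())

# Illustrative criteria only -- NOT the real GBI checklist.
criteria = {
    "recycled_content": (lambda p: p.get("recycled_material_pct", 0) >= 30, 2),
    "daylighting":      (lambda p: p.get("window_wall_ratio", 0) >= 0.25, 1),
}
```

In the actual tool this tally would be driven by visual programming over the live BIM model, with the green template and certification documents generated from the achieved set.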

Keywords: green building rating system, GBRS, building information modeling, BIM, visual programming, VP, sustainability assessment

Procedia PDF Downloads 323
34447 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications: A Software Testing Approach

Authors: Theertha Chandroth

Abstract:

This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its own syntax and structure. The objective is to explore various techniques and methodologies for validating, comparing, and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers, and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
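One simple validation technique in this spirit, sketched with Python's standard library, flattens the leaf elements of an XML payload and checks that a flat JSON object carries matching values. Real test suites would handle nesting, attributes, and types; this is only a minimal illustration.

```python
import json
import xml.etree.ElementTree as ET

def xml_leaf_values(xml_string):
    """Flatten the leaf child elements of an XML document into {tag: text}."""
    root = ET.fromstring(xml_string)
    return {child.tag: child.text for child in root if len(child) == 0}

def json_matches_xml(json_string, xml_string):
    """Check that every key/value in a flat JSON object has a matching
    leaf element in the XML payload (values compared as strings)."""
    data = json.loads(json_string)
    leaves = xml_leaf_values(xml_string)
    return all(str(v) == leaves.get(k) for k, v in data.items())
```

A test could then assert that a service's JSON response and its legacy XML export describe the same record.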

Keywords: XML, JSON, data comparison, integration testing, Python, SQL

Procedia PDF Downloads 131
34446 The Location-Routing Problem with Pickup Facilities and Heterogeneous Demand: Formulation and Heuristics Approach

Authors: Mao Zhaofang, Xu Yida, Fang Kan, Fu Enyuan, Zhao Zhao

Abstract:

Nowadays, last-mile distribution plays an increasingly important role in the whole industrial-chain delivery link and accounts for a large proportion of total distribution cost. Promoting the upgrading of logistics networks and improving the layout of final distribution points have become trends in the development of modern logistics. Because customer demand is discrete and heterogeneous in both its needs and its spatial distribution, which leads to higher delivery failure rates and lower vehicle utilization, last-mile delivery has become a time-consuming and uncertain process. As a result, courier companies have introduced a range of innovative parcel storage facilities, including pick-up points and lockers. The introduction of pick-up points and lockers has not only improved the user experience but has also helped logistics and courier companies achieve economies of scale. Against the backdrop of the COVID-19 pandemic, contactless delivery has become a new hotspot, which has also created new opportunities for the development of collection services. Therefore, a key issue for logistics companies is how to design or redesign their last-mile distribution networks to create integrated logistics and distribution networks that consider pick-up points and lockers. This paper focuses on the introduction of self-pickup facilities in new logistics and distribution scenarios and on the heterogeneous demands of customers. We consider two types of demand, ordinary products and refrigerated products, as well as the corresponding transportation vehicles. We account for the constraints associated with self-pickup points and lockers and then address the location-routing problem with self-pickup facilities and heterogeneous demands (LRP-PFHD).
To solve this challenging problem, we propose a mixed integer linear programming (MILP) model that minimizes the total cost, which includes the facility opening cost, the variable transport cost, and the fixed transport cost. Due to the NP-hardness of the problem, we propose a hybrid adaptive large neighborhood search algorithm to solve LRP-PFHD. We evaluate the effectiveness and efficiency of the proposed algorithm on instances generated from benchmark instances. The results demonstrate that the hybrid adaptive large neighborhood search algorithm is more efficient than MILP solvers such as Gurobi for LRP-PFHD, especially on large-scale instances. In addition, we conducted a comprehensive analysis of some important parameters (e.g., facility opening cost and transportation cost) to explore their impact on the results and offer helpful managerial insights for courier companies.
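The objective described above (facility opening cost plus fixed and variable transport costs) can be sketched directly; the function below evaluates one candidate solution and could serve as the evaluation step inside a neighborhood-search heuristic. The input structure is an assumption for illustration, and capacity, vehicle-type, and locker constraints are not checked here.

```python
def total_cost(open_facilities, routes, opening_cost, fixed_vehicle_cost,
               unit_cost, dist):
    """Evaluate the LRP-style objective for a candidate solution:
    opening cost of each open facility + a fixed cost per route (vehicle)
    + a variable cost proportional to the distance travelled.
    `routes` are node sequences starting and ending at a facility;
    `dist` is a nested dict of pairwise distances."""
    cost = sum(opening_cost[f] for f in open_facilities)
    for route in routes:
        cost += fixed_vehicle_cost
        cost += unit_cost * sum(dist[a][b] for a, b in zip(route, route[1:]))
    return cost
```

An adaptive large neighborhood search would repeatedly destroy part of a solution (e.g., remove customers or close a facility), repair it, and keep the candidate when this evaluation improves.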

Keywords: city logistics, last-mile delivery, location-routing, adaptive large neighborhood search

Procedia PDF Downloads 76
34445 Application of a Generalized Additive Model to Reveal the Relations between the Density of Zooplankton with Other Variables in the West Daya Bay, China

Authors: Weiwen Li, Hao Huang, Chengmao You, Jianji Liao, Lei Wang, Lina An

Abstract:

Zooplankton are a central topic in ecology, as they make a great contribution to maintaining the balance of an ecosystem and are critical in promoting the material cycle and energy flow within ecosystems. A generalized additive model (GAM) was applied to analyze the relationships between the density (individuals per m³) of zooplankton and other variables in West Daya Bay. All data used in this analysis (survey month, survey station (longitude and latitude), depth of the water column, superficial concentration of chlorophyll a, benthonic concentration of chlorophyll a, number of zooplankton species, and number of zooplankton individuals) were collected through monthly scientific surveys from January to December 2016. A generalized linear model (GLM) was used to select the variables with a significant impact on the density of zooplankton, and the GAM was employed to analyze the relationship between the density of zooplankton and those significant variables. The results showed that the density of zooplankton increased with an increase in the benthonic concentration of chlorophyll a but decreased as the depth of the water column decreased. Both a high number of zooplankton species and a high total number of zooplankton individuals led to a higher density of zooplankton.
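As a stand-in for the GLM screening step described above, the sketch below fits a tiny Poisson regression (log link, appropriate for count-like densities) by gradient ascent in pure Python. The study's actual GLM/GAM machinery is far richer, and the data here are synthetic.

```python
import math

def fit_poisson_glm(X, y, lr=0.01, n_iter=5000):
    """Minimal Poisson GLM with log link, fitted by gradient ascent on the
    log-likelihood. X is a list of rows (first column = 1.0 for the
    intercept); y holds non-negative responses. Returns the coefficients."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            mu = math.exp(sum(b * x for b, x in zip(beta, xi)))
            # Score contribution of one observation: (y - mu) * x.
            for j in range(p):
                grad[j] += (yi - mu) * xi[j]
        beta = [b + lr * g / n for b, g in zip(beta, grad)]
    return beta
```

In a screening workflow, covariates whose fitted coefficients are clearly nonzero (judged against their standard errors) would be retained as the "significant variables" passed on to the GAM, which then relaxes the linear terms into smooth functions.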

Keywords: density, generalized linear model, generalized additive model, the West Daya Bay, zooplankton

Procedia PDF Downloads 146
34444 Fast Bayesian Inference of Multivariate Block-Nearest Neighbor Gaussian Process (NNGP) Models for Large Data

Authors: Carlos Gonzales, Zaida Quiroz, Marcos Prates

Abstract:

Several spatial variables collected at the same locations and sharing a common spatial distribution can be modeled simultaneously through a multivariate geostatistical model that takes into account both the correlation between these variables and the spatial autocorrelation. The main goal of such a model is to perform spatial prediction of these variables over the region of study. Here we focus on a multivariate geostatistical formulation that relies on shared spatial random effect terms. In particular, the first response variable is modeled by a mean that incorporates a shared random spatial effect, while the other response variables depend on this shared spatial term in addition to their own specific random spatial effects. Each spatial random effect is defined through a Gaussian process with a valid covariance function; however, to improve computational efficiency when the data are large, each Gaussian process is approximated by a Gaussian Markov random field (GMRF), specifically the block nearest neighbor Gaussian process (Block-NNGP). This approach involves dividing the spatial domain into several dependent blocks under certain constraints, where the cross blocks capture the spatial dependence on a large scale, while each individual block captures the spatial dependence on a smaller scale. The multivariate geostatistical model belongs to the class of latent Gaussian models; thus, to achieve fast Bayesian inference, the integrated nested Laplace approximation (INLA) method is used. The good performance of the proposed model is shown through simulations and applications to massive data.
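The essence of the NNGP approximation — conditioning each location only on a small neighbor set rather than on all observations — can be sketched for one-dimensional locations with an exponential covariance. The kriging-style conditional below is illustrative and leaves out the block structure and the multivariate shared effects of the actual model.

```python
import math

def exp_cov(d, sigma2=1.0, phi=1.0):
    """Exponential covariance function C(d) = sigma^2 * exp(-phi * d)."""
    return sigma2 * math.exp(-phi * d)

def solve(A, b):
    """Tiny Gauss-Jordan solver for the small neighbor systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def nngp_conditional(loc, neighbor_locs, neighbor_vals):
    """Kriging-style NNGP conditional for a 1-D location: w(loc) given its
    neighbor set has mean B . w_N and variance C(0) - B . c, where
    B = C_N^{-1} c and c = cov(loc, neighbors). Conditioning only on a
    small neighbor set is what makes the NNGP scale to large data."""
    c = [exp_cov(abs(loc - s)) for s in neighbor_locs]
    C_N = [[exp_cov(abs(si - sj)) for sj in neighbor_locs] for si in neighbor_locs]
    B = solve(C_N, c)
    mean = sum(b * w for b, w in zip(B, neighbor_vals))
    var = exp_cov(0.0) - sum(b * ci for b, ci in zip(B, c))
    return mean, var
```

The product of such sparse conditionals over all locations defines a valid joint density whose precision matrix is sparse, which is exactly the GMRF structure that INLA exploits.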

Keywords: Block-NNGP, geostatistics, Gaussian process, GMRF, INLA, multivariate models

Procedia PDF Downloads 93
34443 Development of Simple-To-Apply Biogas Kinetic Models for the Co-Digestion of Food Waste and Maize Husk

Authors: Owamah Hilary, O. C. Izinyon

Abstract:

Many existing biogas kinetic models are substrate specific and therefore difficult to apply to substrates they were not developed for. A biodegradability kinetic (BIK) model and a maximum biogas production potential and stability assessment (MBPPSA) model were therefore developed in this study for the anaerobic co-digestion of food waste and maize husk. The biodegradability constant (k) was estimated as 0.11 d⁻¹ using the BIK model. The results for the maximum biogas production potential (A) obtained using the MBPPSA model corresponded well with the results obtained using the popular but complex modified Gompertz model for digesters B-1, B-2, B-3, B-4, and B-5. The (If) value of the MBPPSA model also showed that digesters B-3, B-4, and B-5 were stable, while B-1 and B-2 were unstable. A similar stability observation was obtained using the modified Gompertz model. The MBPPSA model can therefore be used as an alternative model for anaerobic digestion feasibility studies and plant design.
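For reference, the modified Gompertz model mentioned above is commonly written as B(t) = A·exp(−exp(Rm·e/A·(λ − t) + 1)), with A the maximum biogas potential, Rm the peak production rate, and λ the lag phase. A direct implementation is shown below with illustrative parameter values, not those fitted in the study.

```python
import math

def modified_gompertz(t, A, Rm, lam):
    """Modified Gompertz model for cumulative biogas production:
    B(t) = A * exp(-exp(Rm * e / A * (lam - t) + 1)).
    A: maximum biogas production potential (e.g. mL); Rm: maximum
    production rate (mL/day); lam: lag phase (days). Parameter values
    used in the test are illustrative only."""
    return A * math.exp(-math.exp(Rm * math.e / A * (lam - t) + 1.0))
```

Fitting A, Rm, and λ to measured cumulative biogas curves (e.g., by nonlinear least squares) is what makes the model "complex" relative to the simpler BIK and MBPPSA formulations proposed in the study.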

Keywords: biogas, inoculum, model development, stability assessment

Procedia PDF Downloads 422
34442 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics

Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan

Abstract:

The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities remains a risk concern. Policymakers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and little or no historical data are stored on the devices. This paper highlights why IoT forensics is a unique adventure and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects in IoT forensics. The model analyses the effectiveness of the forensic investigation process against the admissibility and integrity of the evidence, taking into account user privacy and the providers' compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes and hence resolve the issues of data protection (privacy and confidentiality).

Keywords: cloud forensics, data protection laws, GDPR, IoT forensics, machine learning

Procedia PDF Downloads 149
34441 ANN Modeling for Cadmium Biosorption from Potable Water Using a Packed-Bed Column Process

Authors: Dariush Jafari, Seyed Ali Jafari

Abstract:

The recommended limit for cadmium concentration in potable water is less than 0.005 mg/L. A continuous biosorption process using the indigenous red seaweed Gracilaria corticata was performed to remove cadmium from potable water. The process was conducted under fixed conditions, and breakthrough curves were obtained for three consecutive sorption-desorption cycles. A model based on an artificial neural network (ANN) was employed to fit the experimental breakthrough data. In addition, a simplified semi-empirical model, the Thomas model, was employed for the same purpose. It was found that the ANN described the experimental data well (R² > 0.99), while the Thomas model predictions were slightly less successful, with R² > 0.97. The design parameters adjusted using the nonlinear form of the Thomas model were in good agreement with the experimentally obtained ones. The results confirm the capability of the ANN to predict the cadmium concentration in potable water.
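The Thomas model used above has a common closed form for the breakthrough curve, C(t)/C0 = 1 / (1 + exp(k_Th·q0·m/Q − k_Th·C0·t)). The sketch below implements it with illustrative parameter values, not the study's fitted ones.

```python
import math

def thomas_breakthrough(t, k_th, q0, m, c0, flow):
    """Thomas model for a packed-bed column breakthrough curve:
    C(t)/C0 = 1 / (1 + exp(k_Th * q0 * m / Q - k_Th * C0 * t)).
    k_th: rate constant (L/(mg*min)); q0: maximum sorption capacity (mg/g);
    m: sorbent mass (g); c0: inlet concentration (mg/L); flow: Q (L/min).
    Parameter values in the test are illustrative, not fitted values."""
    z = k_th * q0 * m / flow - k_th * c0 * t
    return 1.0 / (1.0 + math.exp(z))
```

In the nonlinear fitting route mentioned in the abstract, k_th and q0 would be adjusted (e.g., by least squares) so that this curve matches the measured effluent concentrations over each sorption cycle.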

Keywords: ANN, biosorption, cadmium, packed-bed, potable water

Procedia PDF Downloads 424