Search results for: management models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15480

13320 Towards Better Quality in Healthcare and Operations Management: A Developmental Literature Review

Authors: Marc Dorval, Marie-Hélène Jobin

Abstract:

This work presents the various perspectives, dimensions, components and definitions given to quality in the operations management (OM) and healthcare services (HCS) literature over time, highlighting gaps and learning opportunities between the two disciplines through a thorough search of their rich and distinct bodies of knowledge. Greater and new insights into the general nature of quality are obtained, with findings such as: in OM, quality has been approached through six fairly distinct paradigms (excellence, value, conformity to specifications, attributes, satisfaction, and meeting or exceeding customer expectations), whereas in HCS, two approaches are prominent (Donabedian’s structure, process and outcomes model, and Lohr and Schroeder’s circumscribed definition). The two disciplines’ views on quality seem to have progressed largely in parallel, with little cross-learning from each other. This work then proposes an encompassing definition of quality as a lever and suggests further research and development avenues for better use of the concept of quality by academics and practitioners alike, toward the goals of greater organizational performance and improved management in healthcare and possibly other service domains.

Keywords: healthcare, management, operations, quality, services

Procedia PDF Downloads 229
13319 Promoting Biofuels in India: Assessing Land Use Shifts Using Econometric Acreage Response Models

Authors: Y. Bhatt, N. Ghosh, N. Tiwari

Abstract:

Acreage response functions are modeled taking account of expected harvest prices, weather-related variables, and other non-price variables, allowing for the possibility of partial adjustment. At the outset, based on the literature on price expectation formation, we explored suitable formulations for estimating farmers’ expected prices. Assuming that farmers form expectations rationally, the prices of food and biofuel crops are modeled using time-series methods, testing for possible ARCH/GARCH effects to account for volatility. The prices projected on the basis of these models are then inserted to proxy for the expected prices in the acreage response functions. Food crop acreages in different growing states are found to be sensitive to their prices relative to those of one or more of the biofuel crops considered. The percentage improvement in food crop yields required to offset the acreage loss is worked out.
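The partial-adjustment acreage response described above can be sketched with synthetic data. Everything below — the Nerlove-style specification, the naive lagged-price proxy for expected prices (the study inserts ARCH/GARCH forecasts instead), and the coefficient values — is an illustrative assumption, not the paper's estimates.

```python
import numpy as np

# Illustrative Nerlove partial-adjustment acreage response:
#   A_t = a + b * E[p_t] + c * A_{t-1} + e_t
# E[p_t] is proxied naively by last season's price; the study instead
# uses ARCH/GARCH-based price forecasts. All data are synthetic.
rng = np.random.default_rng(0)
T = 60
price = 100 + np.cumsum(rng.normal(0, 2, T))      # synthetic price series
acreage = np.empty(T)
acreage[0] = 50.0
for t in range(1, T):
    acreage[t] = 5 + 0.1 * price[t - 1] + 0.7 * acreage[t - 1] + rng.normal(0, 0.5)

# Ordinary least squares on A_t ~ [1, E[p_t], A_{t-1}]
X = np.column_stack([np.ones(T - 1), price[:-1], acreage[:-1]])
y = acreage[1:]
(a_hat, b_hat, c_hat), *_ = np.linalg.lstsq(X, y, rcond=None)

# Under partial adjustment, the long-run price response is b / (1 - c)
long_run = b_hat / (1 - c_hat)
print(round(b_hat, 3), round(c_hat, 3))
```

The adjustment coefficient c_hat between 0 and 1 indicates partial (rather than instantaneous) adjustment of acreage toward its desired level.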

Keywords: acreage response function, biofuel, food security, sustainable development

Procedia PDF Downloads 301
13318 Communication Barriers in Disaster Risk Management

Authors: Pooja Pandey

Abstract:

Communication plays an integral part in the management of any disaster, whether natural or human-induced; both require effective and strategic delivery of information. The way information is conveyed carries the most weight when dealing with a disaster. Hence, integrating communication strategies into disaster risk management (DRM) is widely acknowledged; however, such integration and planning are largely missing from practical guidance. Researchers continuously exploring integrated DRM have identified substantial gaps between research and implementation of these strategies (gaps between science and policy). For this reason, this paper reviews the communication barriers that obstruct effective management of disasters. Communication between first responders (government agencies, police, medical services) and the public (people directly affected by the disaster) is most critical and often breaks down during a disaster. These challenges can only be resolved if the foundation of the problem is properly dealt with, which means resolving the issues within the organizations themselves. Through this study, it was found that it is necessary to bridge the communication gaps between the organizations themselves, as most of the hindrances occur during the mitigation, preparedness, response, and recovery phases of a disaster. The study's main aim is to review the communication barriers at the organizational, technological, and social levels that impact effective DRM. In the end, some suggestions are made to strengthen the knowledge base for future improvement in communication between responders and their organizations.

Keywords: communication, organization, barriers, first responders, disaster risk management

Procedia PDF Downloads 300
13317 The Use of Empirical Models to Estimate Soil Erosion in Arid Ecosystems and the Importance of Native Vegetation

Authors: Meshal M. Abdullah, Rusty A. Feagin, Layla Musawi

Abstract:

When humans mismanage arid landscapes, soil erosion can become a primary mechanism that leads to desertification. This study focuses on applying soil erosion models to a disturbed landscape in Umm Nigga, Kuwait, and identifying its predicted change under restoration plans. The northern portion of Umm Nigga, containing both coastal and desert ecosystems, falls within the boundaries of the Demilitarized Zone (DMZ) adjacent to Iraq and has been fenced off to restrict public access since 1994. The central objective of this project was to utilize GIS and remote sensing to compare the MPSIAC (Modified Pacific South West Inter Agency Committee), EMP (Erosion Potential Method), and USLE (Universal Soil Loss Equation) soil erosion models and determine their applicability to arid regions such as Kuwait. Spatial analysis was used to develop the necessary datasets for factors such as soil characteristics, vegetation cover, runoff, climate, and topography. Results showed that the MPSIAC and EMP models produced a similar spatial distribution of erosion, though the MPSIAC showed more variability. For the MPSIAC model, approximately 45% of the land surface ranged from moderate to high soil loss, while for the EMP model 35% ranged from moderate to high. The USLE model had contrasting results and a different spatial distribution of soil loss, with 25% of the area ranging from moderate to high erosion and 75% from low to very low. We concluded that MPSIAC and EMP were the most suitable models for arid regions in general, with the MPSIAC model performing best. We then applied the MPSIAC model to identify the amount of soil loss between coastal and desert areas, and between fenced and unfenced sites. In the desert area, soil loss differed between fenced and unfenced sites. In the desert fenced sites, 88% of the surface was covered with vegetation and soil loss was very low, while at the desert unfenced sites vegetation cover was only 3% and soil loss was correspondingly higher. In the coastal areas, the amount of soil loss was similar between fenced and unfenced sites. These results imply that vegetation cover plays an important role in reducing soil erosion, and that fencing is much more important in desert ecosystems to protect against overgrazing. When applying the MPSIAC model predictively, we found that vegetation cover could be increased from 3% to 37% in unfenced areas, and soil erosion would then decrease by 39%. We conclude that the MPSIAC model is best suited to predicting soil erosion in arid regions such as Kuwait.
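Of the three models compared above, the USLE has the simplest, purely multiplicative structure, which makes the role of vegetation cover easy to see. The sketch below uses hypothetical factor values, not the Umm Nigga estimates; only the cover-management factor C differs between the two scenarios.

```python
# A minimal sketch of the USLE soil-loss equation, A = R * K * LS * C * P.
def usle(R, K, LS, C, P=1.0):
    """Annual soil loss A from rainfall erosivity R, soil erodibility K,
    slope length-steepness LS, cover-management C, and support practice P."""
    return R * K * LS * C * P

# Hypothetical arid-site factors; only C (vegetation cover) changes.
fenced = usle(R=40, K=0.3, LS=1.2, C=0.05)    # dense native vegetation
unfenced = usle(R=40, K=0.3, LS=1.2, C=0.45)  # sparse cover after grazing
print(round(fenced, 2), round(unfenced, 2))
```

Because the equation is multiplicative, reducing C by a given factor reduces predicted soil loss by exactly that factor, which mirrors the study's finding that vegetation cover dominates the fenced/unfenced contrast.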

Keywords: soil erosion, GIS, Modified Pacific South West Inter Agency Committee model (MPSIAC), Erosion Potential Method (EMP), Universal Soil Loss Equation (USLE)

Procedia PDF Downloads 297
13316 Removal of Heavy Metal from Wastewater using Bio-Adsorbent

Authors: Rakesh Namdeti

Abstract:

The liquid waste (wastewater) is essentially the water supply of the community after it has been used in a variety of applications. In recent years, heavy metal concentrations, besides other pollutants, have increased to reach levels dangerous to the living environment in many regions. Among the heavy metals, lead has the most damaging effects on human health. It can enter the human body through the uptake of food (65%), water (20%), and air (15%). Against this background, a low-cost and easily available biosorbent was used and reported in this study. The scope of the present study is to remove lead from aqueous solution using Olea europaea resin as a biosorbent. The results showed that the Olea europaea resin biosorbent had a high biosorption capacity for lead removal. The Langmuir, Freundlich, Tempkin, and Dubinin-Radushkevich (D-R) models were used to describe the biosorption equilibrium of lead on the Olea europaea resin biosorbent, and the biosorption followed the Langmuir isotherm. The kinetic models showed that the pseudo-second-order rate expression represented the biosorption data well.
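The Langmuir isotherm mentioned above is commonly fitted through its linearized form; a minimal sketch follows, using synthetic data (the concentrations, q_max, and b below are made-up values, not the paper's measurements).

```python
import numpy as np

# Langmuir isotherm: q = (q_max * b * Ce) / (1 + b * Ce).
# Linearized form: Ce/q = Ce/q_max + 1/(q_max * b), a straight line in Ce.
q_max_true, b_true = 25.0, 0.4
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])   # equilibrium conc. (mg/L)
q = q_max_true * b_true * Ce / (1 + b_true * Ce)    # uptake at equilibrium (mg/g)

# Fit the linearized form and recover the isotherm parameters.
slope, intercept = np.polyfit(Ce, Ce / q, 1)
q_max = 1 / slope
b = 1 / (intercept * q_max)
print(round(q_max, 2), round(b, 2))
```

With real (noisy) data, goodness of fit of this line versus the linearized Freundlich or D-R forms is what identifies which isotherm the biosorption follows.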

Keywords: novel biosorbent, central composite design, Lead, isotherms, kinetics

Procedia PDF Downloads 78
13315 Online Learning Management System for Teaching

Authors: Somchai Buaroong

Abstract:

This research aims to investigate the strengths and challenges of applying an online learning management system to an English course. Data were collected from observation, learners’ oral and written reports, and the teacher’s journals. A questionnaire was also utilized as a data collection tool. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. The findings show that the system provided an additional channel to enhance English language learning through written class assignments that were digitally accessible by any group member, and through communication between the teacher and learners and among the learners themselves. Thus, the learning management system could be a promising tool for foreign language teachers. Difficulties in its use were also revealed in the study. The article ends with a discussion of the findings and their pedagogical implications for foreign language classes.

Keywords: English course, foreign language system, online learning management system, teacher’s journals

Procedia PDF Downloads 285
13314 BIM Application Research Based on the Main Entrance and Garden Area Project of Shanghai Disneyland

Authors: Ying Yuken, Pengfei Wang, Zhang Qilin, Xiao Ben

Abstract:

Based on the main entrance and garden area (ME&G) project of Shanghai Disneyland, this paper introduces the application of BIM technology to this kind of low-rise comprehensive building with complex facade, electromechanical, and decoration systems. BIM technology was applied to the whole process of design, construction, and completion of the project. With the construction of a BIM application framework for the whole project, the key points of the BIM modeling methods for the different systems, and the integration and coordination of the BIM models, are elaborated in detail. The specific application methods of BIM technology for similar complex low-rise building projects are systematically organized. Finally, the paper summarizes the benefits of applying BIM technology and puts forward some suggestions on the BIM management mode and practical application for similar projects in the future.

Keywords: BIM, complex low-rise building, BIM modeling, model integration and coordination, 3D scanning

Procedia PDF Downloads 172
13313 Refitting Equations for Peak Ground Acceleration in Light of the PF-L Database

Authors: Matevž Breška, Iztok Peruš, Vlado Stankovski

Abstract:

A systematic overview of existing Ground Motion Prediction Equations (GMPEs) has been published by Douglas. The number of earthquake recordings used for fitting these equations has increased over the past decades. The current PF-L database contains 3550 recordings. Since GMPEs frequently model the peak ground acceleration (PGA), the goal of the present study was to refit a selection of 44 of the existing equation models for PGA in light of the latest data. The Levenberg-Marquardt algorithm was used for fitting the coefficients of the equations, and the results are evaluated both quantitatively, by presenting the root mean squared error (RMSE), and qualitatively, by drawing graphs of the five best-fitted equations. The RMSE was found to be as low as 0.08 for the best equation models. The newly estimated coefficients vary from the values published in the original works.
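For a GMPE functional form that happens to be linear in its coefficients, the refitting step reduces to ordinary least squares (the study applies Levenberg-Marquardt because many published forms are nonlinear in their coefficients). The form, coefficient values, and records below are synthetic illustrations, not entries from the PF-L database.

```python
import numpy as np

# A simplified GMPE form: ln(PGA) = c1 + c2*M + c3*ln(R + 10),
# fitted to synthetic magnitude/distance records and scored by RMSE,
# the same quantitative criterion the study reports.
rng = np.random.default_rng(1)
n = 500
M = rng.uniform(4.0, 7.5, n)                    # magnitude
R = rng.uniform(1.0, 200.0, n)                  # distance (km)
ln_pga = -2.0 + 0.9 * M - 1.1 * np.log(R + 10) + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), M, np.log(R + 10)])
coef, *_ = np.linalg.lstsq(X, ln_pga, rcond=None)
rmse = np.sqrt(np.mean((X @ coef - ln_pga) ** 2))
print(np.round(coef, 2), round(rmse, 3))
```

The RMSE recovered here is close to the noise level injected into the synthetic data; on real records it measures how well the refitted equation explains the database.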

Keywords: Ground Motion Prediction Equations, Levenberg-Marquardt algorithm, refitting PF-L database, peak ground acceleration

Procedia PDF Downloads 462
13312 Finite Element Modeling Techniques of Concrete in Steel and Concrete Composite Members

Authors: J. Bartus, J. Odrobinak

Abstract:

The paper presents a nonlinear 3D finite element model of composite steel and concrete beams with web openings, analyzed using the Finite Element Method (FEM). The core of the study is the introduction of basic modeling techniques, comprehending the description of material behavior, appropriate element selection, and recommendations for overcoming problems with convergence. Results from various finite element models are compared in the study. The main objective is to observe the concrete failure mechanism and its influence on the structural performance of the numerical models of the beams at particular load stages. The bearing capacity of the beams and the corresponding deformations, stresses, strains, and fracture patterns were determined. The results show how load-bearing elements containing concrete parts can be analyzed using FEM software, with various options for creating the most suitable numerical model. The paper demonstrates the versatility of the Ansys software for structural simulations.

Keywords: Ansys, concrete, modeling, steel

Procedia PDF Downloads 121
13311 Generalization of Zhou Fixed Point Theorem

Authors: Yu Lu

Abstract:

Fixed point theory is a basic tool for the study of the existence of Nash equilibria in game theory. This paper presents a significant generalization of the Veinott-Zhou fixed point theorem for increasing correspondences, which serves as an essential framework for investigating the existence of Nash equilibria in supermodular and quasisupermodular games. To establish our proofs, we explore different conceptions of multivalued increasingness and provide comprehensive results concerning the existence of the largest/least fixed point. We provide two distinct approaches to the proof, each offering unique insights and advantages. These advancements not only extend the applicability of the Veinott-Zhou theorem to a broader range of economic scenarios but also enhance the theoretical framework for analyzing equilibrium behavior in complex game-theoretic models. Our findings pave the way for future research in the development of more sophisticated models of economic behavior and strategic interaction.
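The order-theoretic machinery behind these results can be illustrated in the simplest single-valued case: on a finite lattice, the least fixed point of a monotone map is reached by iterating from the bottom element (Kleene/Tarski iteration). The toy lattice and map below are far simpler than the increasing correspondences the paper treats; they only show the construction.

```python
# Least fixed point of a monotone map on a finite lattice, by iteration
# from the bottom element. Lattice: subsets of {0..4} ordered by inclusion.
def least_fixed_point(f, bottom):
    x = bottom
    while True:
        y = f(x)
        if y == x:           # stabilized: x is the least fixed point
            return x
        x = y

# A monotone map on subsets: always add 0, then add each element's
# successor mod 5. Monotone because a larger input yields a larger output.
def f(s):
    out = set(s) | {0}
    out |= {(n + 1) % 5 for n in out}
    return frozenset(out)

lfp = least_fixed_point(f, frozenset())
print(sorted(lfp))
```

Each iterate is above the previous one, and the chain must stabilize on a finite lattice; the supermodular-game application replaces this toy map with a best-response correspondence.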

Keywords: fixed-point, Tarski’s fixed-point theorem, Nash equilibrium, supermodular game

Procedia PDF Downloads 55
13310 Interpretive Structural Modeling Technique for Hierarchical Ranking of Barriers in Implementation of Green Supply Chain Management: Case of Indian Petroleum Industry

Authors: Kavish Kejriwal, Richa Grover

Abstract:

Consumer awareness and pending legislation have pushed environmental issues into the spotlight, making it imperative for organizations to have a plan of action for “going green.” This is the reason why Green Supply Chain Management (GSCM) has become an integral part of many organizations, with the goals of reducing cost, increasing efficiency, and being environmentally friendly. Implementation of GSCM involves many factors which act as barriers, making it a tedious task. These barriers have different relationships among themselves, creating different impacts on the implementation of GSCM. This work focuses on determining those barriers which essentially have to be removed in the initial stages of GSCM adoption. The authors have taken the case of the Indian petroleum industry in order to arrive at a solution; a DEMATEL approach is used to reach it.
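The DEMATEL step mentioned above works on an expert-scored direct-influence matrix: normalize it, compute the total-relation matrix, and read off each barrier's prominence and net cause/effect role. The 4-barrier matrix below is a hypothetical illustration, not the paper's survey data.

```python
import numpy as np

# Hypothetical direct-influence matrix among 4 barriers
# (0 = no influence .. 4 = very high influence).
D = np.array([
    [0, 3, 2, 4],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# Normalize by the largest row sum, then total relation T = N (I - N)^-1,
# which accumulates direct plus all indirect influence paths.
N = D / D.sum(axis=1).max()
T = N @ np.linalg.inv(np.eye(4) - N)

prominence = T.sum(axis=1) + T.sum(axis=0)   # R + C: overall importance
relation = T.sum(axis=1) - T.sum(axis=0)     # R - C: net cause (+) / effect (-)
print(np.round(prominence, 2), np.round(relation, 2))
```

Barriers with positive (R - C) are net causes; removing them first is the usual reading, matching the paper's goal of identifying barriers to remove in the initial stages of adoption.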

Keywords: barriers, environment, green supply chain management, impact, interpretive structural modeling

Procedia PDF Downloads 278
13309 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
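The baseline isotropic case of the Clarke model is easy to see by Monte Carlo: uniformly distributed arrival angles give Doppler shifts f = f_max·cos(θ), whose density piles up at ±f_max (the classic "bathtub" Jakes spectrum). The sketch below shows only this isotropic baseline; the paper's non-isotropic and partially developed scattering cases would replace the uniform angle distribution.

```python
import numpy as np

# Monte Carlo sketch of the Doppler-shift distribution under the isotropic
# Clarke model: theta ~ Uniform(0, 2*pi), shift = f_max * cos(theta).
rng = np.random.default_rng(2)
f_max = 100.0                                    # maximum Doppler shift (Hz)
theta = rng.uniform(0, 2 * np.pi, 200_000)
shift = f_max * np.cos(theta)

hist, edges = np.histogram(shift, bins=20, range=(-f_max, f_max), density=True)
# Density at the band edge greatly exceeds the density near zero shift.
print(round(float(hist[0]), 4), round(float(hist[10]), 4))
```

The empirical density approximates 1 / (pi * sqrt(f_max**2 - f**2)), the closed-form PDF whose shape matches the Jakes power spectral density.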

Keywords: Doppler shift, filtered Poisson process, generalized Clarke’s model, non-isotropic scattering, partially developed scattering, Rician distribution

Procedia PDF Downloads 372
13308 Cirrhosis Mortality Prediction as Classification using Frequent Subgraph Mining

Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride

Abstract:

In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models were applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests was analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) was used. The Model for End-stage Liver Disease (MELD) prediction of mortality was used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM together with a machine learning technique called ensembling further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients, and it builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
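The MELD comparator has a standard closed form; the sketch below shows the widely cited pre-MELD-Na UNOS formula, with laboratory values floored at 1.0 per convention. Whether the study used this exact variant is not stated, so treat this as the textbook formula rather than the paper's implementation.

```python
import math

# Standard (pre-MELD-Na) MELD formula:
#   MELD = 3.78*ln(bilirubin) + 11.2*ln(INR) + 9.57*ln(creatinine) + 6.43
# with each lab value floored at 1.0 so the logarithms stay non-negative.
def meld(bilirubin_mg_dl, inr, creatinine_mg_dl):
    b = max(bilirubin_mg_dl, 1.0)
    i = max(inr, 1.0)
    c = max(creatinine_mg_dl, 1.0)
    return 3.78 * math.log(b) + 11.2 * math.log(i) + 9.57 * math.log(c) + 6.43

print(round(meld(1.0, 1.0, 1.0), 2))   # healthy labs floor the score at 6.43
print(round(meld(3.0, 1.5, 2.0), 2))   # worse labs raise the score
```

A machine learning model beating this score by ~10% AUC, as the abstract reports, means it extracts signal beyond these three laboratory values.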

Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning

Procedia PDF Downloads 134
13307 A Non-Parametric Based Mapping Algorithm for Use in Audio Fingerprinting

Authors: Analise Borg, Paul Micallef

Abstract:

Over the past few years, online multimedia collections have grown at a fast pace. Several companies have shown interest in studying different ways to organize this amount of audio information without the need for human intervention to generate metadata. In the past few years, many applications have emerged on the market which are capable of identifying a piece of music in a short time. Different audio effects and degradations make it much harder to identify an unknown piece. In this paper, an audio fingerprinting system which makes use of a non-parametric based algorithm is presented. Parametric analysis is also performed using Gaussian Mixture Models (GMMs). The feature extraction methods employed are the Mel Spectrum Coefficients and the MPEG-7 basic descriptors. Bin numbers replaced the extracted feature coefficients during the non-parametric modelling. The results show that the non-parametric analysis offers results comparable to those reported in the literature.
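The bin-number substitution described above can be sketched in a few lines: each feature coefficient is replaced by the index of the histogram bin it falls into, giving a compact non-parametric fingerprint that tolerates small perturbations. The feature values below are random stand-ins for Mel-spectrum coefficients, and the bin layout is an assumed one.

```python
import numpy as np

# Synthetic stand-ins for per-frame feature coefficients
# (4 frames x 8 coefficients).
rng = np.random.default_rng(3)
features = rng.normal(0, 1, (4, 8))

edges = np.linspace(-3, 3, 17)               # 16 equal bins over [-3, 3]
fingerprint = np.digitize(features, edges)    # coefficient -> bin number

# A mildly degraded copy maps to mostly the same bin numbers,
# which is what makes bin-number fingerprints robust to degradation.
degraded = features + rng.normal(0, 0.05, features.shape)
match = float(np.mean(np.digitize(degraded, edges) == fingerprint))
print(fingerprint.shape, round(match, 2))
```

Matching two fingerprints then reduces to comparing small integer arrays rather than raw floating-point coefficients.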

Keywords: audio fingerprinting, mapping algorithm, Gaussian Mixture Models, MFCC, MPEG-7

Procedia PDF Downloads 421
13306 Risk Assessment Tools Applied to Deep Vein Thrombosis Patients Treated with Warfarin

Authors: Kylie Mueller, Nijole Bernaitis, Shailendra Anoopkumar-Dukie

Abstract:

Background: Vitamin K antagonists, particularly warfarin, are the most frequently used oral medication for deep vein thrombosis (DVT) treatment and prophylaxis. Time in therapeutic range (TITR) of the international normalised ratio (INR) is widely accepted as a measure of the quality of warfarin therapy. Multiple factors can affect warfarin control and the subsequent adverse outcomes, including thromboembolic and bleeding events. Predictor models have been developed to assess potential contributing factors and measure the individual risk of these adverse events. These predictive models have been validated in atrial fibrillation (AF) patients; however, there is a lack of literature on whether they can be successfully applied to other warfarin users, including DVT patients. Therefore, the aim of the study was to assess the ability of these risk models (HAS-BLED and CHADS2) to predict haemorrhagic and ischaemic incidences in DVT patients treated with warfarin. Methods: A retrospective analysis of DVT patients receiving warfarin management by a private pathology clinic was conducted. Data were collected from November 2007 to September 2014 and included demographics, medical and drug history, INR targets, and test results. Patients receiving continuous warfarin therapy with an INR reference range between 2.0 and 3.0 were included in the study, with mean TITR calculated using the Rosendaal method. Bleeding and thromboembolic events were recorded and reported as incidences per patient. The haemorrhagic risk model HAS-BLED and the ischaemic risk model CHADS2 were applied to the data, and patients were stratified into low, moderate, or high-risk categories. The analysis was conducted to determine whether a correlation existed between risk assessment tool and patient outcomes. Data were analysed using GraphPad Instat Version 3, with a p value of <0.05 considered statistically significant. Patient characteristics were reported as mean and standard deviation for continuous data, and as number and percentage for categorical data. Results: Of the 533 patients included in the study, there were 268 (50.2%) female and 265 (49.8%) male patients with a mean age of 62.5 years (±16.4). The overall mean TITR was 78.3% (±12.7), with an overall haemorrhagic incidence of 0.41 events per patient. For the HAS-BLED model, there was a haemorrhagic incidence of 0.08, 0.53, and 0.54 per patient in the low, moderate, and high-risk categories respectively, showing a statistically significant increase in incidence with increasing risk category. The CHADS2 model showed an increase in ischaemic events with risk category: no ischaemic events in the low category, an ischaemic incidence of 0.03 in the moderate category, and 0.47 in the high-risk category. Conclusion: An increasing haemorrhagic incidence correlated with an increasing HAS-BLED risk score in DVT patients treated with warfarin. Furthermore, a greater incidence of ischaemic events occurred in patients in higher CHADS2 categories. In an Australian population of DVT patients, HAS-BLED and CHADS2 accurately predict incidences of haemorrhagic and ischaemic events respectively.
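The Rosendaal method used above for TITR linearly interpolates the INR between consecutive tests and counts the fraction of days falling inside the therapeutic range. The sketch below implements that idea with illustrative test dates and values, not patient data.

```python
from datetime import date

# Rosendaal TITR: interpolate INR day-by-day between consecutive tests and
# count the proportion of days inside the 2.0-3.0 therapeutic range.
def rosendaal_titr(tests, low=2.0, high=3.0):
    """tests: list of (date, INR) tuples, sorted by date. Returns TITR in %."""
    in_range = total = 0
    for (d0, inr0), (d1, inr1) in zip(tests, tests[1:]):
        days = (d1 - d0).days
        for k in range(days):
            inr = inr0 + (inr1 - inr0) * k / days   # linear interpolation
            in_range += low <= inr <= high
            total += 1
    return 100.0 * in_range / total

tests = [(date(2024, 1, 1), 1.8), (date(2024, 1, 11), 2.8),
         (date(2024, 1, 21), 3.4)]
print(round(rosendaal_titr(tests), 1))
```

In this example the interpolated INR enters range partway through the first interval and leaves it partway through the second, so only a fraction of the 20 monitored days count as in range.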

Keywords: anticoagulant agent, deep vein thrombosis, risk assessment, warfarin

Procedia PDF Downloads 263
13305 Using Confirmatory Factor Analysis to Test the Dimensional Structure of Tourism Service Quality

Authors: Ibrahim A. Elshaer, Alaa M. Shaker

Abstract:

Several previous empirical studies have operationalized service quality as either a multidimensional or a unidimensional construct. While a few earlier studies investigated aspects of the assumed dimensional structure of service quality, no study was found to have tested the construct’s dimensionality using confirmatory factor analysis (CFA). To gain better insight into the dimensional structure of the service quality construct, this paper tests its dimensionality using three CFA models (a higher-order factor model, an oblique factor model, and a one-factor model) on a set of data collected from 390 British tourists who visited Egypt. The results of the three tested models indicate that the service quality construct is multidimensional. This result helps resolve the problems that might arise from the lack of clarity concerning the dimensional structure of service quality: without testing the dimensional structure of a measure, researchers cannot assume that a significant correlation is the result of factors measuring the same construct.

Keywords: service quality, dimensionality, confirmatory factor analysis, Egypt

Procedia PDF Downloads 591
13304 Colored Image Classification Using Quantum Convolutional Neural Networks Approach

Authors: Farina Riaz, Shahab Abdulla, Srinjoy Ganguly, Hajime Suzuki, Ravinesh C. Deo, Susan Hopkins

Abstract:

Recently, quantum machine learning has received significant attention. For various types of data, including text and images, numerous quantum machine learning (QML) models have been created and are being tested. Images are exceedingly complex data components that demand more processing power. Despite being mature, classical machine learning still has difficulties with big data applications. Furthermore, quantum technology has revolutionized how machine learning is thought of, by employing quantum features to address optimization issues. Since quantum hardware is currently extremely noisy, it is not practicable to run machine learning algorithms on it without risking the production of inaccurate results. To discover the advantages of quantum versus classical approaches, this research has concentrated on colored image data. Deep learning classification models are currently being created on quantum platforms, but they are still at a very early stage. Black and white benchmark image datasets like MNIST and Fashion-MNIST have been used in recent research. MNIST and CIFAR-10 were compared for binary classification, but the comparison showed that MNIST performed more accurately than colored CIFAR-10. This research evaluates the performance of a QML algorithm on the colored benchmark dataset CIFAR-10 to advance QML's real-time applicability. However, deep learning classification models such as the Quantum Convolutional Neural Network (QCNN) have not yet been developed for colored images to determine how much better they are than classical models; only a few models, such as quantum variational circuits, take colored images. The methodology adopted in this research is a hybrid approach using PennyLane as a simulator. To process the 10 classes of CIFAR-10, the image data were translated into greyscale 28 × 28-pixel images, and 50,000 training and 10,000 test images were used. The objective of this work is to determine how much the quantum approach can outperform a classical approach on a comprehensive dataset of color images. After pre-processing the 50,000 images on a classical computer, the QCNN model adopted a hybrid method and encoded the images into a quantum simulator for feature extraction using quantum gate rotations. The measurements were carried out on the classical computer after the rotations were applied. According to the results, the QCNN approach is ~12% more effective than traditional classical CNN approaches, and it is possible that applying data augmentation may increase the accuracy. This study has demonstrated that quantum machine and deep learning models can be superior to classical machine learning approaches in terms of processing speed and accuracy when used to perform classification on colored classes.

Keywords: CIFAR-10, quantum convolutional neural networks, quantum deep learning, quantum machine learning

Procedia PDF Downloads 129
13303 Dynamical Models for Environmental Effect Depuration for Structural Health Monitoring of Bridges

Authors: Francesco Morgan Bono, Simone Cinquemani

Abstract:

This research aims to enhance bridge monitoring by employing innovative techniques that incorporate exogenous factors into the modeling of sensor signals, thereby improving long-term predictability beyond traditional static methods. Using real datasets from two different bridges equipped with Linear Variable Displacement Transducer (LVDT) sensors, the study investigates the fundamental principles governing sensor behavior for more precise long-term forecasts. Additionally, the research evaluates performance on noisy and synthetically damaged data, proposing a residual-based alarm system to detect anomalies in the bridge. In summary, this novel approach combines advanced modeling, exogenous factors, and anomaly detection to extend prediction horizons and improve preemptive damage recognition, significantly advancing structural health monitoring practices.
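The residual-based alarm idea described above can be sketched simply: regress the LVDT displacement on an exogenous factor (temperature is assumed here for illustration), learn a threshold from healthy data, and flag readings whose residual exceeds it. All signals and coefficients below are synthetic, not the bridges' datasets.

```python
import numpy as np

# Healthy behavior: displacement tracks temperature (thermal expansion)
# plus small sensor noise. Both series are synthetic.
rng = np.random.default_rng(4)
temp = rng.uniform(-5, 35, 300)                  # deg C
disp = 0.02 * temp + rng.normal(0, 0.01, 300)    # mm

# Fit the exogenous model on healthy data, then set a 3-sigma alarm band
# on the residuals.
coef = np.polyfit(temp, disp, 1)
resid = disp - np.polyval(coef, temp)
threshold = 3 * resid.std()

# A new reading whose displacement cannot be explained by temperature
# trips the alarm, suggesting possible damage.
new_temp, new_disp = 20.0, 0.55
alarm = abs(new_disp - np.polyval(coef, new_temp)) > threshold
print(alarm)
```

Depurating the environmental effect first, as the paper argues, is what lets the residual carry structural information instead of seasonal temperature swings.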

Keywords: structural health monitoring, dynamic models, SINDy, railway bridges

Procedia PDF Downloads 38
13302 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings

Authors: Jude K. Safo

Abstract:

Knowledge Graphs (KGs) and their relation to Graph Embeddings (GEs) represent a unique data structure in the landscape of machine learning (relative to image, text, and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use-cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g., PageRank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of ’data retrieval’, which we also observe in Large Language Models. Notable attempts such as TransE, TransR, and other prominent industry standards have shown a peak performance just north of 57% on the WN18 and FB15K benchmarks, insufficient for practical industry applications. They are also limited in scope to next node/link predictions. Traditional linear methods like Tucker, CP, PARAFAC, and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for linear mappings between concepts in KG space and GE space that preserve cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures, and we demonstrate the performance this model achieves on the WN18 benchmark. This model does not rely on Large Language Models (LLMs), though the applications are certainly relevant there as well.
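The TransE baseline mentioned above scores a triple (head, relation, tail) by how closely h + r lands on t in embedding space. A toy sketch with hand-picked two-dimensional embeddings (not trained ones) shows the scoring rule:

```python
import numpy as np

# Toy 2-D embeddings; in practice these are learned by minimizing a
# margin-based ranking loss over the knowledge graph's triples.
emb = {
    "paris":      np.array([1.0, 0.0]),
    "france":     np.array([1.0, 1.0]),
    "berlin":     np.array([3.0, 0.0]),
    "capital_of": np.array([0.0, 1.0]),
}

# TransE plausibility score: -||h + r - t|| (higher is more plausible).
def transe_score(h, r, t):
    return -np.linalg.norm(emb[h] + emb[r] - emb[t])

good = transe_score("paris", "capital_of", "france")
bad = transe_score("paris", "capital_of", "berlin")
print(round(good, 2), round(bad, 2))
```

Link prediction then ranks all candidate tails by this score, which is exactly the next-node/link setting the abstract calls limited in scope.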

Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics

Procedia PDF Downloads 68
13301 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
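The Diffusion Map half of the proposed pipeline can be sketched compactly: build a Gaussian affinity matrix, row-normalize it into a Markov transition matrix, and embed with its leading non-trivial eigenvectors. The data below are points on a noisy circle (a 1-D manifold in 2-D), and the kernel bandwidth is an assumed value.

```python
import numpy as np

# Sample a 1-D manifold (a circle) embedded in 2-D, with small noise.
rng = np.random.default_rng(5)
t = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(t), np.sin(t)]) + rng.normal(0, 0.02, (200, 2))

# Gaussian affinity and row-normalized Markov (diffusion) matrix.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
K = np.exp(-d2 / 0.1)                                  # Gaussian kernel
P = K / K.sum(axis=1, keepdims=True)

# Embed with the eigenvectors of P after the trivial constant one
# (eigenvalue 1), which preserve the manifold's diffusion geometry.
vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
embedding = vecs[:, order[1:3]].real
print(embedding.shape)
```

In the full hybrid approach, a diffusion probabilistic model would then be trained in this low-dimensional coordinate system before mapping generated points back to the original space.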

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 108
13300 DUSP16 Inhibition Rescues Neurogenic and Cognitive Deficits in Alzheimer's Disease Mice Models

Authors: Huimin Zhao, Xiaoquan Liu, Haochen Liu

Abstract:

The major challenge facing Alzheimer's disease (AD) drug development is how to effectively improve cognitive function in clinical practice. Growing evidence indicates that stimulating hippocampal neurogenesis is a strategy for restoring cognition in animal models of AD. The mitogen-activated protein kinase (MAPK) pathway is a crucial factor in neurogenesis and is negatively regulated by dual-specificity phosphatase 16 (DUSP16). Transcriptome analysis of post-mortem brain tissue revealed up-regulation of DUSP16 expression in AD patients. Additionally, DUSP16 is involved in regulating the proliferation and neural differentiation of neural progenitor cells (NPCs). Nevertheless, whether DUSP16 ameliorates cognitive disorders by influencing NPC differentiation in AD mice remains unclear. Our study demonstrates an association between DUSP16 SNPs and clinical progression in individuals with mild cognitive impairment (MCI). In addition, we found that increased DUSP16 expression in both the 3×Tg and SAMP8 models of AD led to impaired NPC differentiation. Silencing DUSP16 produced cognitive benefits, the induction of adult hippocampal neurogenesis (AHN) and synaptic plasticity in AD mice. Furthermore, we found that DUSP16 is involved in the process of NPC differentiation by regulating c-Jun N-terminal kinase (JNK) phosphorylation. Moreover, the increase in DUSP16 may be regulated by the ETS transcription factor ELK1, which binds to the promoter region of DUSP16; loss of ELK1 resulted in decreased DUSP16 mRNA and protein levels. Our data uncover a potential regulatory role for DUSP16 in adult hippocampal neurogenesis and suggest DUSP16 as a possible target for AD intervention.

Keywords: alzheimer's disease, cognitive function, DUSP16, hippocampal neurogenesis

Procedia PDF Downloads 72
13299 Banking Risk Management between the Prudential and the Operational Approaches

Authors: Mustapha Achibane, Imane Allam

Abstract:

Since the nineties, all Moroccan banking institutions have had to respect an arsenal of prudential ratios. The respect of these prudential measures aims to ensure the stability of the financial system. To do so, regulatory authorities have tried to reduce the financial and operational risks incurred by banking entities. Meanwhile, regulatory authorities demanded balance-sheet management work from banks and asked them to establish a management control system to manage operational risk, as well as an effort in terms of risk-based commitments. The prudential approach therefore has a macroeconomic nature and is presented as a determinant of the operational, microeconomic approach. This operational approach takes the form of a strategy that each banking entity must develop to manage the different banking risks. This study analyses the problem of risk management between the prudential and the operational approaches. It proceeds through a literature review followed by an analysis of the Moroccan banking sector's performance, first following an inductive logic and then an analytical one. The first approach consists of analyzing the phenomenon from a normative and conceptual perspective, while the second considers the Moroccan banking system and analyses the behavior of Moroccan banking entities in terms of risk management and performance. The results identify favorable growth in terms of performance, despite the huge provisioning effort made to meet international standards and the harmonization of regulations.

Keywords: banking performance, financial intermediation, operational approach, prudential standards, risk management

Procedia PDF Downloads 142
13298 Strategic Leadership and Sustainable Project Management in Enugu, Nigeria

Authors: Nnadi Ezekiel Ejiofor

Abstract:

This study investigates the connection between strategic leadership and project management sustainability in Enugu, Nigeria, with an emphasis on building projects in the state. The study set out to accomplish two specific goals: first, to establish a link between creative project management and resource efficiency in construction projects in Enugu State, Nigeria; and second, to establish a link between innovative thinking and waste minimization in those same projects. A structured questionnaire was used to collect primary data from 45 registered construction enterprises in the study area as part of the study's descriptive research approach. Due to the nonparametric nature of the data, Spearman rank-order correlation was used to evaluate the acquired data. The findings demonstrate that creative project management had a significant positive impact on resource efficiency in construction projects carried out by architecture firms in Enugu State, Nigeria (r = .849; p < .001), and that innovative thinking had a significant impact on waste reduction in those same projects (r = .849; p < .001). It was determined that strategic leadership has a significant impact on the sustainability of project management. It is thus advised that project managers foresee, prepare for, and effectively communicate present and future developments to project staff in order to ensure that sustainable initiatives, such as recycling and reuse, are implemented in construction projects.
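Since the analysis rests on Spearman rank-order correlation, a minimal sketch of the statistic may be helpful: it is simply the Pearson correlation applied to the ranks of the observations, which is why it suits nonparametric questionnaire data. The Likert-style responses below are invented for illustration, and for simplicity ties are given distinct ranks rather than the usual average ranks.

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank-order correlation: Pearson correlation of the ranks.
    Ties receive distinct ranks here (a simplification of average-rank ties)."""
    rx = np.argsort(np.argsort(x, kind="stable"), kind="stable")
    ry = np.argsort(np.argsort(y, kind="stable"), kind="stable")
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical 1-5 Likert responses from eight firms (illustrative only)
creative_pm  = np.array([4, 5, 3, 4, 2, 5, 1, 3])
resource_eff = np.array([4, 5, 2, 4, 2, 5, 1, 2])
rho = spearman_rho(creative_pm, resource_eff)
assert 0.8 < rho <= 1.0  # strong positive monotonic association
```

A correlation of this magnitude, paired with a small p-value, is the kind of evidence the abstract reports for the creative-management / resource-efficiency link.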

Keywords: construction, project management, strategic leadership, sustainability, waste reduction

Procedia PDF Downloads 50
13297 Static and Dynamic Behaviors of Sandwich Structures With Metallic Connections

Authors: Shidokht Rashiddadash, Mojtaba Sadighi, Soheil Dariushi

Abstract:

Since sandwich structures are used in many areas ranging from ships, trains, automobiles and aircraft to bridges and buildings, connecting sandwich structures is necessary in almost all industries, and the application of metallic joints between sandwich panels is therefore increasing. Various joining methods are available, such as mechanically fastened joints (riveting or bolting) or adhesively bonded joints, and the choice between them depends on the application. In this research, sandwich specimens were fabricated with two types of metallic connections with dissimilar geometries. These specimens included beams and plates and were manufactured using glass-epoxy skins and an aluminum honeycomb core. After construction of the specimens, bending and low-velocity impact tests were executed, and the behaviors of the specimens are discussed. Numerical models were developed using the LS-DYNA software and validated against the test results. Finally, parametric studies were performed on the thicknesses and lengths of the two connections by employing the numerical models.

Keywords: connection, honeycomb, low velocity impact, sandwich panel, static test

Procedia PDF Downloads 56
13296 A Guide for Using Viscoelasticity in ANSYS

Authors: A. Fettahoglu

Abstract:

The theory of viscoelasticity is used by many researchers to represent the behavior of materials such as pavements on roads or bridges. Several studies have used analytical methods and rheology to predict the material behavior of simple models. Today, more complex engineering structures are analyzed using the Finite Element Method, in which material behavior is embedded by means of three-dimensional viscoelastic material laws. As a result, structures of complex geometry and domain can be analyzed by means of the Finite Element Method and three-dimensional viscoelastic equations. In the scope of this study, the rheological models embedded in ANSYS, namely the generalized Maxwell model and Prony series, which are the two methods used by ANSYS to represent viscoelastic material behavior, are presented explicitly. Afterwards, a guide is illustrated to ease the use of the viscoelasticity tools in ANSYS.
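The Prony series representation mentioned above can be illustrated with a short sketch of the generalized Maxwell relaxation modulus, E(t) = E_inf + sum_i E_i * exp(-t / tau_i), where E_inf is the long-time equilibrium modulus and each (E_i, tau_i) pair is one Maxwell branch. The coefficients below are made up for illustration and are not ANSYS defaults.

```python
import numpy as np

def prony_relaxation_modulus(t, E_inf, E_i, tau_i):
    """Generalized Maxwell (Prony series) relaxation modulus:
    E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
    t = np.asarray(t, dtype=float)
    return E_inf + sum(E * np.exp(-t / tau) for E, tau in zip(E_i, tau_i))

# Illustrative two-branch Prony series (hypothetical coefficients)
E_inf, E_i, tau_i = 1.0, [2.0, 0.5], [0.1, 10.0]
t = np.array([0.0, 1.0, 100.0])
E = prony_relaxation_modulus(t, E_inf, E_i, tau_i)

assert np.isclose(E[0], E_inf + sum(E_i))   # instantaneous modulus at t = 0
assert np.isclose(E[-1], E_inf, atol=1e-3)  # long-time modulus relaxes to E_inf
```

Curve fitting the (E_i, tau_i) pairs to measured relaxation or creep data is precisely the "viscoelastic material curve fitting" step named in the keywords.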

Keywords: ANSYS, generalized Maxwell model, finite element method, Prony series, viscoelasticity, viscoelastic material curve fitting

Procedia PDF Downloads 604
13295 Notions of Criticality in Strategic Management of Hospitality Services in Nigeria

Authors: Chigozie P. Ugochukwu Okoro

Abstract:

While the needs of the traveling public are becoming ever more dynamic due to the constantly changing tourism ecosphere, hospitality enterprises seem unable to sustain competitive advantage through service quality differentials and effective service delivery. Contending with these evolving needs demands a re-assessment of the notions that drive thinking about service evolution and the management of service delivery processes in the hospitality enterprise. The intent of this study was to explicate the trends in the evolving needs of the traveling public that are critical to hospitality enterprise service management. The hypothetical study used customer satisfaction to dissect the strategic implications of perception, experience, and socio-cultural engagement in the customization of hospitality enterprise services. The study found that customer perception is cognitive and does not shape service customization. It also elucidated that customer experience, which can be evaluated, is critical in determining service structure and delivery, while socio-cultural engagement is intrinsic to driving service diversification. The study recommends tourist audits and cognitive insights as strategic actions for re-designing service efficiency and delivery in hospitality enterprise service management.

Keywords: hospitality enterprise services, strategic management, quality service delivery, notions of criticality

Procedia PDF Downloads 174
13294 Evaluation of Free Technologies as Tools for Business Process Management

Authors: Julio Sotomayor, Daniel Yucra, Jorge Mayhuasca

Abstract:

The article presents an evaluation of free technologies for business process automation, with emphasis only on tools compatible with the GNU General Public License (GPL). The compendium of technologies was based on promoting a service-oriented enterprise architecture (SOA) and the establishment of a business process management system (BPMS). The methodology for the selection of tools was Agile UP. This proposal allows businesses to achieve technological sovereignty and independence, in addition to promoting service orientation and the development of free software based on components.

Keywords: BPM, BPMS suite, open-source software, SOA, enterprise architecture, business process management

Procedia PDF Downloads 288
13293 Measuring Resource Recovery and Environmental Benefits of Global Waste Management System Using the Zero Waste Index

Authors: Atiq Uz Zaman

Abstract:

Sustainable waste management is one of the major global challenges we face today. A poor waste management system not only symbolises the inefficiency of our society but also depletes valuable resources and emits pollution into the environment. Presently, we extract more natural resources than ever before in order to meet the demand of constantly growing resource consumption. It is estimated that around 71 tonnes of 'upstream' materials are used for every tonne of MSW. Therefore, resource recovery from waste can potentially offset a significant amount of the upstream resources being depleted. This study measures the environmental benefits of global waste management systems by applying a tool called the Zero Waste Index (ZWI). The ZWI measures waste management performance by accounting for the potential amount of virgin material that can be offset by recovering resources from waste. In addition, the ZWI tool also considers the energy, GHG and water savings from offsetting virgin materials and recovering energy from waste. This study analyses the municipal solid waste management systems of 172 countries from all over the globe, covering a population of 3.37 billion. It indicates that around 1.47 billion tonnes (436 kg/cap/year) of municipal solid waste are generated each year, and waste generation is increasing over time. The study also finds a positive and significant correlation (R² = 0.29, p < .001) between income (GDP/capita/year) and the amount of waste generated (kg/capita/year). About 84% of the waste is collected globally, and only 15% of the collected waste is recycled. The ZWI of the world is measured in this study as 0.12, which means that the current waste management system offsets only 12% of the total virgin-material substitution potential from waste. Annually, an average person saved around 219 kWh of energy, avoided around 48 kg of GHG emissions and saved around 38 l of water.
The findings of this study are important for measuring current waste management performance in a global context. In addition, the study analysed countries' waste management performance based on their income level.
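The ZWI logic described above, resources recovered from waste expressed as a fraction of the total potential for substituting virgin materials, can be sketched as a simple weighted calculation. The streams, tonnages and substitution factors below are hypothetical, not the published ZWI coefficients.

```python
# Illustrative Zero Waste Index calculation. Each managed stream substitutes
# virgin material according to a factor in [0, 1]; landfilled waste
# substitutes nothing. All figures are invented for this sketch.

waste_streams = {
    # stream: (tonnes managed, virgin-material substitution factor)
    "paper":    (100.0, 0.84),
    "plastics": (50.0,  0.75),
    "metals":   (20.0,  0.95),
    "landfill": (430.0, 0.0),
}

total_waste = sum(tonnes for tonnes, _ in waste_streams.values())
substituted = sum(tonnes * f for tonnes, f in waste_streams.values())
zwi = substituted / total_waste
print(round(zwi, 2))  # prints 0.23
```

A global ZWI of 0.12, as reported in the abstract, corresponds to this ratio computed over all countries' streams: only 12% of the substitution potential is actually realized.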

Keywords: global performance, material substitution; municipal waste, resource recovery, waste management, zero waste index

Procedia PDF Downloads 244
13292 Critical Success Factors Quality Requirement Change Management

Authors: Jamshed Ahmad, Abdul Wahid Khan, Javed Ali Khan

Abstract:

Managing software quality requirement changes is a difficult task in the field of software engineering. Refusing incoming changes results in user dissatisfaction, while accommodating too many requirement changes may delay product delivery. Poor requirements management is widely considered a primary cause of software failure, and the task becomes more challenging in global software outsourcing. Addressing success factors in quality requirement change management is desirable today due to the frequent change requests from end-users. In this research study, success factors are recognized and scrutinized with the help of a systematic literature review (SLR). In total, 16 success factors were identified that significantly impact software quality requirement change management. The findings show that proper requirement change management, rapid delivery, quality software product, access to market, project management, skills and methodologies, low cost/effort estimation, clear plan and road map, agile processes, low labor cost, user satisfaction, communication/close coordination, proper scheduling and time constraints, frequent technological changes, robust model, and geographical distribution/cultural differences are the key factors that influence software quality requirement change. The recognized success factors were validated with the help of various research methods, i.e., case studies, interviews, surveys and experiments, and were then scrutinized by continent, database, company size and time period. Based on these findings, requirement changes can be implemented in a better way.

Keywords: global software development, requirement engineering, systematic literature review, success factors

Procedia PDF Downloads 197
13291 Bayesian Meta-Analysis to Account for Heterogeneity in Studies Relating Life Events to Disease

Authors: Elizabeth Stojanovski

Abstract:

Associations between life events and various forms of cancer have been identified. The purpose of a recent random-effects meta-analysis was to identify studies that examined the association between adverse events related to changes in financial status, including decreased income, and breast cancer risk. The same association was studied in four separate studies whose traits, such as study design, location and time frame, were not consistent. It was of interest to pool information from the various studies to help identify characteristics that differentiated the study results. Two random-effects Bayesian meta-analysis models are proposed to combine the reported estimates of the described studies. The proposed models allow major sources of variation to be taken into account, including study-level characteristics, between-study variance and within-study variance, and illustrate the ease with which uncertainty can be incorporated using a hierarchical Bayesian modelling approach.
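A minimal sketch of the kind of normal-normal random-effects model described above can be fitted with a simple Metropolis sampler: each study estimate y_i is modeled as Normal(mu, s_i² + tau²), where tau² is the between-study variance. The study estimates, standard errors and prior scale below are invented for illustration and are not the breast-cancer data or the authors' exact models.

```python
import numpy as np

# Hypothetical study-level estimates (e.g., log relative risks) and their
# within-study standard errors, one per study.
y = np.array([0.20, 0.35, 0.10, 0.28])
s = np.array([0.10, 0.15, 0.12, 0.20])

def log_post(mu, tau):
    """Log posterior: flat prior on mu, half-normal(0.5) prior on tau,
    marginal likelihood y_i ~ Normal(mu, s_i^2 + tau^2)."""
    if tau <= 0:
        return -np.inf
    var = s**2 + tau**2                  # within- plus between-study variance
    loglik = -0.5 * np.sum(np.log(var) + (y - mu)**2 / var)
    return loglik - 0.5 * (tau / 0.5)**2

rng = np.random.default_rng(42)
mu, tau = 0.0, 0.1
samples = []
for i in range(20000):
    # Random-walk Metropolis proposal for both parameters
    mu_p, tau_p = mu + rng.normal(0, 0.05), tau + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_post(mu_p, tau_p) - log_post(mu, tau):
        mu, tau = mu_p, tau_p
    if i >= 5000:                        # discard burn-in
        samples.append(mu)

mu_hat = np.mean(samples)                # posterior mean of the pooled effect
assert 0.1 < mu_hat < 0.4
```

Extending this with study-level covariates (design, location, time frame) as regressors on mu is what turns it into the kind of hierarchical model the abstract proposes for explaining heterogeneity.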

Keywords: random-effects, meta-analysis, Bayesian, variation

Procedia PDF Downloads 160