Search results for: model driven architecture (MDA)

17080 Design of Orientation-Free Handler and Fuzzy Controller for Wire-Driven Heavy Object Lifting System

Authors: Bo-Wei Song, Yun-Jung Lee

Abstract:

This paper presents an intention interface and controller for a wire-driven heavy object lifting system that assists the operator in moving a heavy object. The handler is designed to allow a comfortable working posture for the operator. In addition, as a human-assistive system, the operator is included in the control loop, and a fuzzy control system is used to account for human control characteristics. The effectiveness and performance of the proposed system are verified by experiments.

Keywords: fuzzy controller, handler design, heavy object lifting system, human-assistive device, human-in-the-loop system

Procedia PDF Downloads 482
17079 Mathematical Model to Quantify the Phenomenon of Democracy

Authors: Mechlouch Ridha Fethi

Abstract:

This paper presents a recent mathematical model in political science concerning democracy. The model is represented by a logarithmic equation linking the Relative Index of Democracy (RID) to the Participation Ratio (PR). Firstly, the meanings of the different parameters of the model are presented, and the variation curve of the RID according to the PR, with its different critical areas, is discussed. Secondly, the model is applied to a virtual group, where we show that the model can be applied depending on gender. Thirdly, it is observed that the model can be extended to different language models of democracy and that it is of little use for assessing the state of democracy for some international organizations such as the UNO.

Keywords: democracy, mathematics, modelization, quantification

Procedia PDF Downloads 327
17078 The Achievement Model of University Social Responsibility

Authors: Le Kang

Abstract:

On the research question of 'how to achieve USR', this contribution reflects on the concept of university social responsibility and identifies three achievement models of USR: the society-diversified model, the university-cooperation model, and the government-compound model. It also conducts a case study to explore the characteristics of the Chinese achievement model of USR. The contribution concludes with a discussion of how the university, government, and society balance demands and roles and make the necessary strategic adjustments and innovative approaches to remedy the shortcomings of each achievement model.

Keywords: modern university, USR, achievement model, compound model

Procedia PDF Downloads 717
17077 Automated Driving Deep Neural Networks Model Accuracy and Performance Assessment in a Simulated Environment

Authors: David Tena-Gago, Jose M. Alcaraz Calero, Qi Wang

Abstract:

The evolution and integration of automated vehicles have become more and more tangible in recent years. State-of-the-art technological advances in camera-based Artificial Intelligence (AI) and computer vision greatly favor the performance and reliability of Advanced Driver Assistance Systems (ADAS), leading to greater knowledge of vehicular operation and a closer resemblance to human behavior. However, the exclusive use of this technology still seems insufficient to fully control vehicular operation. To reveal the degree of accuracy of current camera-based automated driving AI modules, this paper studies the structure and behavior of one of the main solutions in a controlled testing environment. The results obtained clearly outline the lack of reliability when the AI model is used exclusively in the perception stage, thereby necessitating additional complementary sensors to improve safety and performance.

Keywords: accuracy assessment, AI-driven mobility, artificial intelligence, automated vehicles

Procedia PDF Downloads 80
17076 Predicting Oil Spills in Real-Time: A Machine Learning and AIS Data-Driven Approach

Authors: Tanmay Bisen, Aastha Shayla, Susham Biswas

Abstract:

Oil spills from tankers can cause significant harm to the environment and local communities, as well as have economic consequences. Early predictions of oil spills can help to minimize these impacts. Our proposed system uses machine learning and neural networks to predict potential oil spills by monitoring data from ship Automatic Identification Systems (AIS). The model analyzes ship movements, speeds, and changes in direction to identify patterns that deviate from the norm and could indicate a potential spill. Our approach not only identifies anomalies but also predicts spills before they occur, providing early detection and mitigation measures. This can prevent or minimize damage to the reputation of the company responsible and the country where the spill takes place. The model's performance on the MV Wakashio oil spill provides insight into its ability to detect and respond to real-world oil spills, highlighting areas for improvement and further research.
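
As an illustration of flagging deviating ship behaviour from AIS records, here is a minimal sketch; it uses an Isolation Forest as a simple stand-in for the neural and graph models described in the abstract, and the file and column names are assumptions, not the authors' data schema.

```python
# Minimal sketch (not the authors' system): flag anomalous vessel behaviour from
# AIS-style records with an Isolation Forest, a simple stand-in for the neural
# models described in the abstract. Column names are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import IsolationForest

ais = pd.read_csv("ais_tracks.csv")              # hypothetical file: one row per AIS ping
ais = ais.sort_values(["mmsi", "timestamp"])

# Derive simple kinematic features per vessel: speed change and course change.
ais["speed_delta"] = ais.groupby("mmsi")["speed_knots"].diff().fillna(0.0)
ais["course_delta"] = ais.groupby("mmsi")["course_deg"].diff().abs().fillna(0.0)
# Wrap course changes into [0, 180] degrees.
ais["course_delta"] = ais["course_delta"].apply(lambda d: min(d, 360.0 - d))

features = ais[["speed_knots", "speed_delta", "course_delta"]]

# Fit an unsupervised anomaly detector; 'contamination' is a tunable assumption.
detector = IsolationForest(n_estimators=200, contamination=0.01, random_state=0)
ais["anomaly"] = detector.fit_predict(features)   # -1 = anomalous ping

suspect = ais[ais["anomaly"] == -1]
print(suspect[["mmsi", "timestamp", "speed_knots", "course_delta"]].head())
```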

Keywords: anomaly detection, oil spill prediction, machine learning, image processing, graph neural network (GNN)

Procedia PDF Downloads 38
17075 D3Advert: Data-Driven Decision Making for Ad Personalization through Personality Analysis Using BiLSTM Network

Authors: Sandesh Achar

Abstract:

Personalized advertising holds greater potential for higher conversion rates compared to generic advertisements. However, its widespread application in the retail industry faces challenges due to complex implementation processes. These complexities impede the swift adoption of personalized advertisement on a large scale. Personalized advertisement, being a data-driven approach, necessitates consumer-related data, adding to its complexity. This paper introduces an innovative data-driven decision-making framework, D3Advert, which personalizes advertisements by analyzing personalities using a BiLSTM network. The framework utilizes the Myers–Briggs Type Indicator (MBTI) dataset for development. The employed BiLSTM network, specifically designed and optimized for D3Advert, classifies user personalities into one of the sixteen MBTI categories based on their social media posts. The classification accuracy is 86.42%, with precision, recall, and F1-Score values of 85.11%, 84.14%, and 83.89%, respectively. The D3Advert framework personalizes advertisements based on these personality classifications. Experimental implementation and performance analysis of D3Advert demonstrate a 40% improvement in impressions. D3Advert’s innovative and straightforward approach has the potential to transform personalized advertising and foster widespread personalized advertisement adoption in marketing.
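
To illustrate the kind of classifier the abstract describes, the following is a hedged sketch of a BiLSTM text model that maps a social-media post to one of the sixteen MBTI types; it is not the D3Advert implementation, and the vocabulary size, sequence length, and layer sizes are assumptions.

```python
# Illustrative sketch only (not the D3Advert code): a BiLSTM text classifier
# that assigns a social-media post to one of the 16 MBTI categories.
import tensorflow as tf

MAX_TOKENS, SEQ_LEN, NUM_CLASSES = 20000, 200, 16   # assumed hyperparameters

vectorize = tf.keras.layers.TextVectorization(
    max_tokens=MAX_TOKENS, output_sequence_length=SEQ_LEN)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,), dtype=tf.string),
    vectorize,                                               # text -> integer ids
    tf.keras.layers.Embedding(MAX_TOKENS, 128),              # ids -> dense vectors
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)), # forward + backward context
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # one of 16 MBTI types
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Usage (posts: list of strings, labels: integer MBTI class ids 0..15):
# vectorize.adapt(tf.constant(posts))
# model.fit(tf.constant(posts)[:, None], tf.constant(labels),
#           validation_split=0.2, epochs=5)
```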

Keywords: personalized advertisement, deep learning, MBTI dataset, BiLSTM network, NLP

Procedia PDF Downloads 12
17074 Investigation of Software Integration for Simulations of Buoyancy-Driven Heat Transfer in a Vehicle Underhood during Thermal Soak

Authors: R. Yuan, S. Sivasankaran, N. Dutta, K. Ebrahimi

Abstract:

This paper investigates the software capability and computer-aided engineering (CAE) method for modelling the transient heat transfer process that occurs in the vehicle underhood region during the vehicle thermal soak phase. The heat retained from the soak period benefits the cold start, with reduced friction loss, for the second 14°C worldwide harmonized light-duty vehicle test procedure (WLTP) cycle, and therefore provides benefits in both CO₂ emission reduction and fuel economy. When the vehicle undergoes the soak stage, the airflow and the associated convective heat transfer around and inside the engine bay are driven by the buoyancy effect. This effect, along with thermal radiation and conduction, is the key factor in the thermal simulation of the engine bay needed to obtain accurate fluid and metal temperature cool-down trajectories and to predict the temperatures at the end of the soak period. Method development has been investigated in this study on a light-duty passenger vehicle using a coupled aerodynamic-heat transfer transient modelling method for the full vehicle under 9 hours of thermal soak. The 3D underhood flow dynamics were solved inherently transiently with the Lattice-Boltzmann Method (LBM) using the PowerFlow software. This was further coupled with heat transfer modelling using the PowerTHERM software provided by Exa Corporation. The particle-based LBM method was capable of accurately handling extremely complicated transient flow behavior on complex surface geometries. The detailed thermal modelling, including heat conduction, radiation, and buoyancy-driven heat convection, was solved in an integrated manner by PowerTHERM. The 9-hour cool-down period was simulated and compared with the vehicle testing data of the key fluid (coolant, oil) and metal temperatures. The developed CAE method was able to predict the cool-down behaviour of the key fluids and components in agreement with the experimental data and also visualised the air leakage paths and thermal retention around the engine bay. The cool-down trajectories of the key components obtained for the 9-hour thermal soak period provide vital information and a basis for the further development of reduced-order modelling studies in future work. This allows a fast-running model to be developed and further embedded within the holistic study of vehicle energy modelling and thermal management. It was also found that the buoyancy effect plays an important part in the first stage of the 9-hour soak, and that the flow development during this stage is vital to accurately predicting the heat transfer coefficients for the heat retention modelling. The developed method has demonstrated the software integration for simulating buoyancy-driven heat transfer in the vehicle underhood region during thermal soak with satisfying accuracy and efficient computing time. The CAE method developed will allow the design of engine encapsulations for improving fuel consumption and reducing CO₂ emissions to be integrated in a timely and robust manner, aiding the development of low-carbon transport technologies.

Keywords: ATCT/WLTC driving cycle, buoyancy-driven heat transfer, CAE method, heat retention, underhood modeling, vehicle thermal soak

Procedia PDF Downloads 121
17073 Analyzing the Job Satisfaction of Silver Workers Using Structural Equation Modeling

Authors: Valentin Nickolai, Florian Pfeffel, Christian Louis Kühner

Abstract:

In many industrialized nations, the demand for skilled workers is rising, causing the current labor market to be more candidate-driven than employer-driven. Losing highly skilled and experienced employees due to early or partial retirement therefore negatively impacts firms, and finding new ways to incentivize older employees (Silver Workers) to stay longer with the company and in their job can be crucial for the success of a firm. This study analyzes how working remotely can be a valid incentive for experienced Silver Workers to stay in their job and instead work from home with more flexible working hours. An online survey with n = 684 respondents, who are employed in the service sector, has been conducted based on 13 constructs that influence job satisfaction. These have been further categorized into three groups, 'classic influencing factors', 'influencing factors changed by remote working', and 'new remote working influencing factors', and were analyzed using structural equation modeling (SEM). Cronbach's alpha of the individual constructs was shown to be suitable. Furthermore, construct validity was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. Additionally, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the influencing factor on job satisfaction 'identification with the work' is the most significant, with β = 0.540, followed by 'Appreciation' (β = 0.151), 'Compensation' (β = 0.124), 'Work-Life Balance' (β = 0.116), and 'Communication and Exchange of Information' (β = 0.105). While the significance of each factor can vary depending on the work model, the SEM analysis also shows that identification with the work is the most significant factor in all three work models mentioned above and, in the case of the traditional office work model, it is the only significant influencing factor. The study shows that employees between the ages of 56 and 65 years have the highest job satisfaction when working entirely from home or remotely. Furthermore, their job satisfaction score of 5.4 on a scale from 1 (very dissatisfied) to 7 (very satisfied) is the highest amongst all age groups in any of the three work models. Due to the significantly higher job satisfaction, it can be argued that offering Silver Workers the option to work from home or remotely can incentivize them not to opt for early or partial retirement but to stay in their job full-time. Furthermore, these findings indicate that employees of Silver Worker age are much more inclined to leave their job for early retirement if they have to work entirely in the office.
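
As background for the convergent-validity thresholds quoted above (AVE > 0.5, CR > 0.7), the following are the standard formulas for composite reliability and average variance extracted, assuming standardized factor loadings; these are textbook definitions, not formulas taken from the paper.

```latex
% Standard convergent-validity measures for a construct with k indicators,
% standardized loadings \lambda_i and error variances \varepsilon_i:
\mathrm{CR} \;=\; \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}
                      {\left(\sum_{i=1}^{k}\lambda_i\right)^{2} + \sum_{i=1}^{k}\operatorname{Var}(\varepsilon_i)},
\qquad
\mathrm{AVE} \;=\; \frac{\sum_{i=1}^{k}\lambda_i^{2}}{k}.
% The thresholds quoted in the abstract correspond to CR > 0.7 and AVE > 0.5.
```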

Keywords: home office, remote work instead of early or partial retirement, silver worker, structural equation modeling

Procedia PDF Downloads 46
17072 Utilizing Computational Fluid Dynamics in the Analysis of Natural Ventilation in Buildings

Authors: A. W. J. Wong, I. H. Ibrahim

Abstract:

Increasing urbanisation has driven building designers to incorporate natural ventilation into the designs of sustainable buildings. This project utilises Computational Fluid Dynamics (CFD) to investigate the natural ventilation of an academic building, SIT@SP, using an assessment criterion based on daily mean temperature and mean velocity. The areas of interest are the pedestrian levels on the first and fourth floors of the building. A reference case recommended by the Architectural Institute of Japan was used to validate the simulation model. The validated simulation model was then used for coupled simulations of SIT@SP and neighbouring geometries under two wind speeds. Both steady and transient simulations were used to identify differences in results. Steady and transient results agree, with the transient simulation identifying peak velocities during flow development. Under the lower wind speed, the first level was sufficiently ventilated while the fourth level was not. Under the higher wind speed, the first level had excessive wind velocities while the fourth level was adequately ventilated. The fourth-level flow velocity was consistently lower than that of the first level. This is attributed to either simulation model error or poor building design. SIT@SP is concluded to have a sufficiently ventilated first level and an insufficiently ventilated fourth level. Future work for this project extends to modifying the urban geometry, improving the simulation model, evaluating other assessment metrics and extending the area of interest to the entire building.

Keywords: buildings, CFD simulations, natural ventilation, urban airflow

Procedia PDF Downloads 193
17071 Free Convection in a Darcy Thermally Stratified Porous Medium That Embeds a Vertical Wall of Constant Heat Flux and Concentration

Authors: Maria Neagu

Abstract:

This paper presents the heat- and mass-driven natural convection succession in a Darcy thermally stratified porous medium that embeds a vertical semi-infinite impermeable wall of constant heat flux and concentration. The scale analysis of the system determines the two possible maps of the heat- and mass-driven natural convection sequence along the wall as a function of the process parameters. These results are verified using the finite difference method applied to the conservation equations.

Keywords: finite difference method, natural convection, porous medium, scale analysis, thermal stratification

Procedia PDF Downloads 295
17070 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation

Authors: Jonathan Gong

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to the low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from the ones used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. The model is trained on 8577 images and validated on a validation split of 20%. These models are evaluated using the external dataset for validation, and the models' accuracy, precision, recall, F1-score, IOU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928. Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
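
To illustrate the transfer-learning step described above, here is a hedged sketch of a three-class chest X-ray classifier built on a pre-trained DenseNet201 backbone; it is not the author's code, the image size and classification head are assumptions, and the autoencoder component mentioned in the abstract is omitted.

```python
# Illustrative sketch (not the author's model): DenseNet201 transfer learning
# for 3-class chest X-ray classification (COVID-19 / pneumonia / normal).
import tensorflow as tf

IMG_SIZE, NUM_CLASSES = (224, 224), 3        # assumed input size and label count

base = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
base.trainable = False                       # keep ImageNet features fixed initially

model = tf.keras.Sequential([
    tf.keras.Input(shape=IMG_SIZE + (3,)),
    tf.keras.layers.Rescaling(1.0 / 255),    # scale pixel values to [0, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.4),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Usage: train_ds / val_ds would be tf.data.Dataset objects of (image, label) pairs,
# e.g. built with tf.keras.utils.image_dataset_from_directory(...).
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```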

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 91
17069 High Motivational Salient Face Distractors Slowed Target Detection: Evidence from Behavioral Studies

Authors: Rashmi Gupta

Abstract:

Rewarding stimuli capture attention involuntarily as a result of an association process that develops quickly during value learning, referred to as reward- or value-driven attentional capture. It is essential to compare reward with punishment processing to get a full picture of value-based modulation in visual attention processing. Hence, the present study manipulated both valence/value (reward as well as punishment) and motivational salience (probability of an outcome: high vs. low) together. A series of experiments was conducted, and there were two phases in each experiment. In phase 1, participants were required to learn to associate specific face stimuli with a high or low probability of winning or losing points. In the second phase, these conditioned stimuli then served as a distractor or prime in a speeded letter search task. Faces with high versus low outcome probability, regardless of valence, slowed the search for targets (specifically targets in the left visual field), suggesting that the costs to performance on non-emotional cognitive tasks were driven only by the motivational salience (high vs. low) associated with the stimuli rather than by the valence (gain vs. loss). It also suggests that the processing of motivationally salient stimuli is right-hemisphere biased. Together, the results of these studies strengthen the notion that our visual attention system is more strongly affected by motivational salience than by valence, which is termed here motivation-driven attentional capture.

Keywords: attention, distractors, motivational salience, valence

Procedia PDF Downloads 194
17068 Model Averaging for Poisson Regression

Authors: Zhou Jianhong

Abstract:

Model averaging is a desirable approach to deal with model uncertainty, which, however, has rarely been explored for Poisson regression. In this paper, we propose a model averaging procedure based on an unbiased estimator of the expected Kullback-Leibler distance for the Poisson regression. A simulation study shows that the proposed model average estimator outperforms some other commonly used model selection and model average estimators in some situations. Our proposed methods are further applied to a real data example, and the advantage of this method is demonstrated again.
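
For reference, the Kullback-Leibler distance between two Poisson models has a simple closed form, which is the kind of quantity the averaging criterion above estimates; this is a textbook identity, not an equation taken from the paper.

```latex
% Kullback-Leibler distance between two Poisson distributions with means
% \lambda_1 (candidate model) and \lambda_2 (reference):
\mathrm{KL}\big(\mathrm{Pois}(\lambda_1)\,\|\,\mathrm{Pois}(\lambda_2)\big)
  \;=\; \lambda_1 \log\frac{\lambda_1}{\lambda_2} \;+\; \lambda_2 \;-\; \lambda_1 .
% A model-averaging weight choice can then be framed as minimizing an unbiased
% estimate of the expected KL distance between the averaged model and the truth.
```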

Keywords: model averaging, Poisson regression, Kullback-Leibler distance, statistics

Procedia PDF Downloads 483
17067 Hybrid Data-Driven Drilling Rate of Penetration Optimization Scheme Guided by Geological Formation and Historical Data

Authors: Ammar Alali, Mahmoud Abughaban, William Contreras Otalvora

Abstract:

Optimizing the drilling process for cost and efficiency requires the optimization of the rate of penetration (ROP). ROP is the measurement of the speed at which the wellbore is created, in units of feet per hour. It is the primary indicator for measuring drilling efficiency. Maximization of the ROP can indicate fast and cost-efficient drilling operations; however, high ROPs may induce unintended events, which may lead to nonproductive time (NPT) and higher net costs. The proposed ROP optimization solution is a hybrid, data-driven system that aims to improve the drilling process, maximize the ROP, and minimize NPT. The system consists of two phases: (1) utilizing existing geological and drilling data to train the model prior to drilling, and (2) real-time adjustments of the controllable dynamic drilling parameters [weight on bit (WOB), rotary speed (RPM), and pump flow rate (GPM)] that directly influence the ROP. During the first phase of the system, geological and historical drilling data are aggregated. Then, the top-rated wells, as a function of high-instance ROP, are distinguished. Those wells are filtered based on NPT incidents, and a cross-plot is generated for the controllable dynamic drilling parameters per ROP value. Subsequently, the parameter values (WOB, GPM, RPM) are calculated as a conditioned mean based on physical distance, following the Inverse Distance Weighting (IDW) interpolation methodology. The first phase is concluded by producing a model of drilling best practices from the offset wells, prioritizing the optimum ROP value. This phase is performed before drilling commences. Starting with the model produced in phase one, the second phase runs an automated drill-off test, delivering live adjustments in real time. Those adjustments are made by directing the driller to deviate two of the controllable parameters (WOB and RPM) by a small percentage (0-5%), following the Constrained Random Search (CRS) methodology. These minor incremental variations reveal new drilling conditions not explored before through offset wells. The data are then consolidated into a heat map as a function of ROP. A more optimum ROP performance is identified through the heat map and amended in the model. The validation process involved the selection of a planned well in an onshore oil field with hundreds of offset wells. The first-phase model was built by utilizing the data points from the top-performing historical wells (20 wells). The model allows drillers to enhance decision-making by leveraging existing data and blending it with live data in real time. An empirical relationship between the controllable dynamic parameters and the ROP was derived using Artificial Neural Networks (ANN). The adjustments resulted in improved ROP efficiency by over 20%, translating to at least a 10% saving in drilling costs. The novelty of the proposed system lies in its ability to integrate historical data, calibrate based on geological formations, and run real-time global optimization through CRS. Those factors position the system to work for any newly drilled well in a developing field.
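
As a small illustration of the Inverse Distance Weighting step used in phase one, here is a hedged sketch that conditions a controllable parameter (e.g., WOB) on offset-well observations; the data structures and numbers are assumptions, not the authors' dataset.

```python
# Minimal sketch (not the authors' system): Inverse Distance Weighting (IDW),
# used to condition controllable parameters (WOB, RPM, GPM) on offset-well data.
import numpy as np

def idw(query_depth, offset_depths, offset_values, power=2.0):
    """Estimate a drilling parameter at query_depth as a distance-weighted
    mean of offset-well observations; closer observations weigh more."""
    offset_depths = np.asarray(offset_depths, dtype=float)
    offset_values = np.asarray(offset_values, dtype=float)
    dist = np.abs(offset_depths - query_depth)
    if np.any(dist == 0):                       # exact match: return it directly
        return float(offset_values[dist == 0].mean())
    weights = 1.0 / dist ** power               # w_i = 1 / d_i^p
    return float(np.sum(weights * offset_values) / np.sum(weights))

# Usage: hypothetical WOB observations (klbf) from top-performing offset wells
# at known depths (ft); the result is the conditioned WOB recommendation at 5500 ft.
depths = [5200, 5400, 5650, 5900]
wob    = [22.0, 24.5, 23.0, 25.5]
print(idw(5500, depths, wob))
```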

Keywords: drilling optimization, geological formations, machine learning, rate of penetration

Procedia PDF Downloads 94
17066 Geodesign Application for Bio-Swale Design: A Data-Driven Design Approach for a Case Site in Ottawa Street North in Hamilton, Ontario, Canada

Authors: Adele Pierre, Nadia Amoroso

Abstract:

Changing climate patterns are resulting in increased storm severity, challenging traditional methods of managing stormwater runoff. This research compares a system of bioswales to existing curb and gutter infrastructure in a post-industrial streetscape of Hamilton, Ontario. Using the geodesign process, including rule-based parameters and an integrated approach combining geospatial information with stakeholder input, a section of Ottawa St. North was modelled to show how green infrastructure can ease the burden on aging combined sewer systems. Qualitative data were gathered from residents of the neighbourhood through field notes, and quantitative geospatial data through GIS and site analysis. Parametric modelling was used to generate multiple design scenarios, each visualizing the resulting impacts on stormwater runoff along with their calculations. The selected design scenarios offered an aesthetically pleasing urban bioswale streetscape system while minimizing and controlling stormwater runoff. Interactive maps, videos and the 3D model were presented for stakeholder comment via ESRI's (Environmental Systems Research Institute) web scene. The results of the study demonstrate powerful tools that can assist landscape architects in designing, collaborating and communicating stormwater strategies.

Keywords: bioswale, data-driven and rule-based design, geodesign, GIS, stormwater management

Procedia PDF Downloads 147
17065 Assessing the Cumulative Impact of PM₂.₅ Emissions from Power Plants by Using the Hybrid Air Quality Model and Evaluating the Contributing Salient Factor in South Taiwan

Authors: Jackson Simon Lusagalika, Lai Hsin-Chih, Dai Yu-Tung

Abstract:

Particles with an aerodynamic diameter of 2.5 micrometers or less, referred to as fine particulate matter (PM₂.₅), are easily inhaled and can go deeper into the lungs than other particles in the atmosphere, where they may have detrimental health consequences. In this study, we use a hybrid model that combines CMAQ and AERMOD, as well as initial meteorological fields from the Weather Research and Forecasting (WRF) model, to study the impact of power plant PM₂.₅ emissions in South Taiwan, since the region frequently experiences higher PM₂.₅ levels. The specific date of March 3, 2022, was chosen as a result of a power outage that prompted the bulk of power plants to shut down. It is not conceivable anywhere in the world to turn off the power for the sole purpose of doing research; therefore, this event involving a power outage and the shutdown of power plants offers a rare opportunity to evaluate the impact of air pollution driven by the power sector. Accordingly, four numerical experiments were conducted in the study using the Continuous Emission Monitoring System (CEMS) data, assuming that the power plants had continued to function normally after the power outage. The hybrid model results revealed that the power plants have a minor impact in the study region. However, we examined the accumulation of PM₂.₅ and discovered that once the vortex at 925 hPa was established and moved to the north of Taiwan's coast, the study region experienced higher observed PM₂.₅ concentrations influenced by meteorological factors. This study recommends that decision-makers take into account not only control techniques, specifically emission reductions, but also the atmospheric and meteorological implications in future investigations.

Keywords: PM₂.₅ concentration, power plants, hybrid air quality model, CEMS, vorticity

Procedia PDF Downloads 42
17064 Neural Network Based Compressor Flow Estimator in an Aircraft Vapor Cycle System

Authors: Justin Reverdi, Sixin Zhang, Serge Gratton, Said Aoues, Thomas Pellegrini

Abstract:

In Vapor Cycle Systems, the flow sensor plays a key role for various monitoring and control purposes. However, physical sensors can be expensive, inaccurate, heavy, cumbersome, or highly sensitive to vibrations, which is especially problematic when embedded into an aircraft. The conception of a virtual sensor based on other standard sensors is a good alternative. In this paper, a data-driven model using a Convolutional Neural Network is proposed to estimate the flow of the compressor. To fit the model to our dataset, we tested different loss functions. We show in our application that a Dynamic Time Warping based loss function called DILATE leads to better dynamical performance than the vanilla mean squared error (MSE) loss function. DILATE allows choosing a trade-off between static and dynamic performance.
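
As an illustration of a convolutional virtual sensor of this kind, the following is a hedged sketch that regresses compressor flow from short windows of standard sensor channels; the window length, channel count, and layer sizes are assumptions, and it is trained with the plain MSE loss since the DILATE loss mentioned above is an external component not reproduced here.

```python
# Illustrative sketch only (not the paper's model): a small 1D CNN regressor
# estimating compressor flow from windows of standard sensor signals
# (e.g. pressures, temperatures, compressor speed).
import tensorflow as tf

WINDOW, CHANNELS = 64, 6   # 64 time steps of 6 standard sensor channels (assumed)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW, CHANNELS)),
    tf.keras.layers.Conv1D(32, kernel_size=5, padding="same", activation="relu"),
    tf.keras.layers.Conv1D(64, kernel_size=5, padding="same", activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),                  # estimated flow, one value per window
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Usage: x has shape (num_windows, WINDOW, CHANNELS), y has shape (num_windows, 1).
# model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=20)
```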

Keywords: deep learning, dynamic time warping, vapor cycle system, virtual sensor

Procedia PDF Downloads 116
17063 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities

Authors: Anudeep Appe, Bhanu Poluparthi, Lakshmi Kasivajjula, Udai Mv, Sobha Bagadi, Punya Modi, Aditya Singh, Hemanth Gunupudi, Spenser Troiano, Jeff Paul, Justin Stovall, Justin Yamamoto

Abstract:

The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework which helps identify factors impacting a healthcare provider facility or a hospital (from here on termed a facility) market share is of key importance. This pilot study aims at developing a data-driven machine learning regression framework which aids strategists in formulating key decisions to improve the facility's market share, which in turn helps improve the quality of healthcare services. The US (United States) healthcare business is chosen for the study, and data spanning 60 key facilities in Washington State and about 3 years of history are considered. In the current analysis, market share is defined as the ratio of the facility's encounters to the total encounters among the group of potential competitor facilities. The current study proposes a two-pronged approach of competitor identification and a regression approach to evaluate and predict market share, respectively. The model-agnostic technique SHAP is leveraged to quantify the relative importance of features impacting the market share. Typical techniques in the literature to quantify the degree of competitiveness among facilities use an empirical method to calculate a competitive factor to interpret the severity of competition. The proposed method identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust since it is data-driven, which minimizes the bias from empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (e.g., quantifying the patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict the market share. To identify key drivers of market share at an overall level, permutation feature importance of the attributes was calculated. For relative quantification of features at a facility level, SHAP (SHapley Additive exPlanations), a model-agnostic explainer, was incorporated. This helped to identify and rank the attributes at each facility which impact the market share. This approach proposes an amalgamation of two popular and efficient modeling practices, viz., machine learning with graphs and tree-based regression techniques, to reduce bias and thereby drive strategic business decisions.
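
To make the regression-plus-attribution part of the pipeline concrete, here is a hedged sketch combining a random-forest market-share model, permutation importance for overall drivers, and SHAP values for facility-level attribution; the feature table, column names, and hyperparameters are assumptions, not the study's data or settings.

```python
# Minimal sketch (not the study's pipeline): random-forest market-share model
# with permutation importance (overall drivers) and SHAP (facility-level attribution).
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

df = pd.read_csv("facility_features.csv")          # hypothetical feature table
X = df.drop(columns=["facility_id", "market_share"])
y = df["market_share"]                              # encounters / competitor-group encounters

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)

# Overall key drivers: permutation feature importance on held-out data.
pi = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
overall = pd.Series(pi.importances_mean, index=X.columns).sort_values(ascending=False)
print(overall.head(10))

# Facility-level attribution: SHAP values from a tree explainer.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)           # one attribution per feature per facility
print(pd.DataFrame(shap_values, columns=X.columns).head())
```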

Keywords: competition, DAGs, facility, healthcare, machine learning, market share, random forest, SHAP

Procedia PDF Downloads 49
17062 Analysis of NFC and Biometrics in the Retail Industry

Authors: Ziwei Xu

Abstract:

The increasing emphasis on mobility has driven the application of innovative communication technologies across various industries. In the retail sector, Near Field Communication (NFC) has emerged as a significant and transformative technology, particularly in the payment and retail supermarket sectors. NFC enables new payment methods, such as electronic wallets, and enhances information management in supermarkets, contributing to the growth of the trade. This report presents a comprehensive analysis of NFC technology, focusing on five key aspects. Firstly, it provides an overview of NFC, including its application methods and development history. Additionally, it incorporates Arthur's work on combinatorial evolution to elucidate the emergence and impact of NFC technology, while acknowledging the limitations of the model in analyzing NFC. The report then summarizes the positive influence of NFC on the retail industry along with its associated constraints. Furthermore, it explores the adoption of NFC from both organizational and individual perspectives, employing the Best Predictors of organizational IT adoption and UTAUT2 models, respectively. Finally, the report discusses the potential future replacement of NFC with biometrics technology, highlighting its advantages over NFC and leveraging Arthur's model to investigate its future development prospects.

Keywords: innovation, NFC, industry, biometrics

Procedia PDF Downloads 40
17061 Analyzing the Evolution of Adverse Events in Pharmacovigilance: A Data-Driven Approach

Authors: Kwaku Damoah

Abstract:

This study presents a comprehensive data-driven analysis to understand the evolution of adverse events (AEs) in pharmacovigilance. Utilizing data from the FDA Adverse Event Reporting System (FAERS), we employed three analytical methods: rank-based, frequency-based, and percentage change analyses. These methods assessed temporal trends and patterns in AE reporting, focusing on various drug-active ingredients and patient demographics. Our findings reveal significant trends in AE occurrences, with both increasing and decreasing patterns from 2000 to 2023. This research highlights the importance of continuous monitoring and advanced analysis in pharmacovigilance, offering valuable insights for healthcare professionals and policymakers to enhance drug safety.
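
To illustrate the three analytical views named above, here is a hedged sketch of rank-based, frequency-based, and percentage-change summaries of yearly adverse-event counts per active ingredient; the input file and column names are assumptions about a pre-extracted FAERS table, not the author's code.

```python
# Minimal sketch (not the author's code): yearly frequency, percentage-change,
# and rank views of adverse-event report counts per active ingredient.
import pandas as pd

reports = pd.read_csv("faers_reports.csv")          # hypothetical: one row per report
reports["year"] = pd.to_datetime(reports["receipt_date"]).dt.year

# Frequency-based view: report counts per active ingredient per year.
counts = (reports.groupby(["active_ingredient", "year"])
                 .size()
                 .unstack(fill_value=0)
                 .sort_index(axis=1))

# Percentage-change view: year-over-year change in report counts.
pct_change = counts.pct_change(axis=1) * 100

# Rank-based view: rank of each ingredient within each year (1 = most reported).
ranks = counts.rank(axis=0, ascending=False)

print(pct_change.round(1).tail())
print(ranks.tail())
```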

Keywords: event analysis, FDA adverse event reporting system, pharmacovigilance, temporal trend analysis

Procedia PDF Downloads 20
17060 Assessing the Implementation of Community Driven Development through Social Capital in Migrant and Indigenous Informal Settlements in Accra, Ghana

Authors: Beatrice Eyram Afi Ziorklui, Norihisa Shima

Abstract:

Community Driven Development (CDD) is now a widely recommended and accepted development strategy for informal communities across the continent. Centered on the utilization of social capital through community structures, CDD is affected by the fact that different informal settlements have different structures and different levels of social capital, which affect its implementation and the ability to overcome CDD challenges. Although CDD is known to be very successful, there are few perspectives on the implementation of CDD initiatives in different informal settlements. This study assesses the implementation of CDD initiatives in migrant and indigenous informal settlements and their ability to navigate challenges. The case study research design was adopted in this research, and respondents were chosen through simple random sampling. Using the Statistical Package for the Social Sciences (SPSS) for data analysis, the study found that migrant informal settlements implement CDD projects through a network of hierarchical structures based on government systems, whereas indigenous informal settlements implement them through the hierarchical social structure based on traditions and culture. The study also found that, with the exception of the challenge of land accessibility in migrant informal settlements, all other challenges, such as participation, resource mobilization, and maintenance, have a significant relationship with social capital, although indigenous informal settlements have higher levels of social capital than migrant informal settlements. The study recommends a framework that incorporates community characteristics and the underlying social capital to facilitate upgrading strategies in informal settlements in Ghana.

Keywords: community driven development, informal settlements, social capital, upgrading

Procedia PDF Downloads 58
17059 Chief Financial Officer Compensation in Mergers and Acquisitions Activities

Authors: Martin Bugeja, Helen Spiropolos

Abstract:

Using a sample of U.S. firms during the period 1993-2015, this study examines whether mergers and acquisitions (M&A) impact the compensation of the Chief Financial Officer (CFO) in the bidding and integration phases of M&As. The study finds that after controlling for CEO power, CFOs’ total compensation is higher during M&A years and is driven by higher equity incentives. These results are robust to controlling for self-selection. Furthermore, CFOs receive a greater bonus during the year of acquisition and the year prior. The study also investigates if CFO compensation during M&A years is driven by M&A characteristics and finds that deal size and diversification are positively related to total compensation while completion time is negatively related. The results are robust to a number of sensitivity tests and additional analyses.

Keywords: chief financial officer, compensation, mergers, acquisitions

Procedia PDF Downloads 21
17058 Influence of Ammonia Emissions on Aerosol Formation in Northern and Central Europe

Authors: A. Aulinger, A. M. Backes, J. Bieser, V. Matthias, M. Quante

Abstract:

High concentrations of particles pose a threat to human health. Thus, legal maximum concentrations of PM10 and PM2.5 in ambient air have been steadily decreased over the years. In central Europe, the inorganic species ammonium sulphate and ammonium nitrate make up a large fraction of fine particles. Many studies investigate the influence of emission reductions of sulfur and nitrogen oxides on aerosol concentrations. Here, we focus on the influence of ammonia (NH3) emissions. While emissions of sulphate and nitrogen oxides are quite well known, ammonia emissions are subject to high uncertainty. This is due to the uncertainty in the location, amount, and time of fertilizer application in agriculture, and in the storage and treatment of manure from animal husbandry. For this study, we implemented a crop growth model into the SMOKE emission model. Depending on temperature, local legislation, and crop type, individual temporal profiles for fertilizer and manure application are calculated for each model grid cell. Additionally, the diffusion from soils and plants and the direct release from open and closed barns are determined. The emission data were used as input for the Community Multiscale Air Quality (CMAQ) model. Comparisons to observations from the EMEP measurement network indicate that the new ammonia emission module leads to better agreement between model and observation (for both ammonia and ammonium). Finally, the ammonia emission model was used to create emission scenarios. These include emissions based on future European legislation, as well as a dynamic evaluation of the influence of different agricultural sectors on particle formation. It was found that a reduction of ammonia emissions by 50% led to a 24% reduction of total PM2.5 concentrations during winter time in the model domain. The observed reduction was mainly driven by reduced formation of ammonium nitrate. Moreover, emission reductions during winter had a larger impact than those during the rest of the year.

Keywords: ammonia, ammonia abatement strategies, CTM, seasonal impact, secondary aerosol formation

Procedia PDF Downloads 320
17057 Artificial Neural Network-Based Prediction of Effluent Quality of Wastewater Treatment Plant Employing Data Preprocessing Approaches

Authors: Vahid Nourani, Atefeh Ashrafi

Abstract:

Prediction of treated wastewater quality is a matter of growing importance in water treatment practice. In this regard, artificial neural networks (ANNs), as a robust data-driven approach, have been widely used for forecasting the effluent quality of wastewater treatment. However, developing an ANN model based on appropriate input variables is a major concern due to the numerous parameters collected from the treatment process, whose number is increasing with the development of electronic sensors. Various studies have been conducted, using different clustering methods, in order to classify the most related and effective input variables. This issue has often been overlooked when selecting dominant input variables among wastewater treatment parameters, which could effectively lead to more accurate prediction of water quality. In the presented study, two ANN models were developed with the aim of forecasting the effluent quality of Tabriz city's wastewater treatment plant. Biochemical oxygen demand (BOD) was utilized as the target water quality parameter. Model A used Principal Component Analysis (PCA), a linear variance-based clustering method, for input selection. Model B used the variables identified by the mutual information (MI) measure. With the optimal ANN structure, the result of model B compared with model A showed up to a 15% increment in the Determination Coefficient (DC). Thus, this study highlights the advantage of the PCA method in selecting dominant input variables for ANN modeling of wastewater treatment plant performance.
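
To illustrate the two input-selection routes compared above, here is a hedged sketch that builds PCA components (as in model A) and a mutual-information ranking (as in model B) and feeds either set into the same feed-forward ANN; the data file, variable names, and component counts are assumptions, not the study's setup.

```python
# Illustrative sketch only (not the study's models): PCA-based vs. mutual-information
# based input selection for an ANN predicting effluent BOD.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.feature_selection import mutual_info_regression
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("wwtp_data.csv")                   # hypothetical plant records
X_raw, y = df.drop(columns=["BOD_effluent"]), df["BOD_effluent"]
X = StandardScaler().fit_transform(X_raw)

# Model A inputs: leading principal components (linear, variance-based).
X_pca = PCA(n_components=5).fit_transform(X)

# Model B inputs: variables ranked by mutual information with effluent BOD.
mi = pd.Series(mutual_info_regression(X, y, random_state=0), index=X_raw.columns)
top_vars = mi.sort_values(ascending=False).head(5).index
X_mi = X_raw[top_vars].to_numpy()

# The same feed-forward ANN can then be fit on either input set (X_pca or X_mi).
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
ann.fit(X_mi, y)
print("MI-selected inputs:", list(top_vars))
```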

Keywords: artificial neural networks, biochemical oxygen demand, principal component analysis, mutual information, Tabriz wastewater treatment plant, wastewater treatment plant

Procedia PDF Downloads 95
17056 Investigation of Effects of Geomagnetic Storms Produced by Different Solar Sources on the Total Electron Content (TEC)

Authors: P. K. Purohit, Azad A. Mansoori, Parvaiz A. Khan, Purushottam Bhawre, Sharad C. Tripathi, A. M. Aslam, Malik A. Waheed, Shivangi Bhardwaj, A. K. Gwal

Abstract:

The geomagnetic storm represents the most outstanding example of solar wind-magnetospheric interaction, which causes global disturbances in the geomagnetic field as well as triggering ionospheric disturbances. We study the behaviour of the ionospheric Total Electron Content (TEC) during geomagnetic storms. For the present investigation we have selected 47 intense geomagnetic storms (Dst ≤ -100 nT) that were observed during solar cycle 23, i.e., during 1998-2006. We then categorized these storms into four categories depending upon their solar sources: Magnetic Cloud (MC), Co-rotating Interaction Region (CIR), SH+ICME, and SH+MC. We then studied the behaviour of ionospheric TEC at a mid-latitude station, Usuda (36.13N, 138.36E), Japan, during the storm events produced by these four different solar sources. During our study we found that the smooth variations in TEC are replaced by rapid fluctuations and the value of TEC is strongly enhanced during these storms for all four categories. However, the greatest enhancements in TEC are produced during those geomagnetic storms which are caused either by a sheath-driven magnetic cloud (SH+MC) or a sheath-driven ICME (SH+ICME). We also derived the correlation between the TEC enhancements produced during storms of each category and the minimum Dst. We found that the strongest correlation exists for the SH+ICME category, followed by SH+MC, MC, and finally CIR. Since the most intense storms were caused by either SH+ICME or SH+MC while the least intense storms were caused by CIR, the correlation was consequently strongest with SH+ICME and SH+MC and weakest with CIR.

Keywords: GPS, TEC, geomagnetic storm, sheath driven magnetic cloud

Procedia PDF Downloads 511
17055 Implementation and Validation of a Damage-Friction Constitutive Model for Concrete

Authors: L. Madouni, M. Ould Ouali, N. E. Hannachi

Abstract:

Two constitutive models for concrete are available in ABAQUS/Explicit, the Brittle Cracking Model and the Concrete Damaged Plasticity Model, and their suitability and limitations are well known. The aim of the present paper is to implement a damage-friction concrete constitutive model and to evaluate the performance of this model by comparing the predicted response with experimental data. The constitutive formulation of this material model is reviewed. In order to have consistent results, the parameter identification and calibration for the model have been performed. Several numerical simulations are presented in this paper, whose results allow for validating the capability of the proposed model for reproducing the typical nonlinear performances of concrete structures under different monotonic and cyclic load conditions. The results of the evaluation will be used for recommendations concerning the application and further improvements of the investigated model.

Keywords: Abaqus, concrete, constitutive model, numerical simulation

Procedia PDF Downloads 327
17054 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh

Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila

Abstract:

Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data they use are of high quality. This is where the concept of data mesh comes in. Data mesh is an organizational and architectural decentralized approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing the reliance on centralized data teams and allowing domain experts to take charge of their data. This paper intends to discuss how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which provides clarity in roles and responsibilities and improves data quality. Additionally, data mesh can contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can improve overall performance by enabling better business insights through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps identify and address data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, such as data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by feedback from AEKIDEN's experience. AEKIDEN is an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing their experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.

Keywords: data culture, data-driven organization, data mesh, data quality for business success

Procedia PDF Downloads 89
17053 Vertebrate Model to Examine the Biological Effectiveness of Different Radiation Qualities

Authors: Rita Emília Szabó, Róbert Polanek, Tünde Tőkés, Zoltán Szabó, Szabolcs Czifrus, Katalin Hideghéty

Abstract:

Purpose: Several features of zebrafish make them amenable to investigations of therapeutic approaches such as ionizing radiation. The establishment of a zebrafish model for comprehensive radiobiological research is the focus of our investigation, comparing the radiation effect curves of neutron and photon irradiation. Our final aim is to develop an appropriate vertebrate model in order to investigate the relative biological effectiveness of laser-driven ionizing radiation. Methods and Materials: After careful dosimetry, series of viable zebrafish embryos were exposed to single-fraction whole-body neutron irradiation (1.25, 1.875, 2, and 2.5 Gy) at the research reactor of the Technical University of Budapest and to a conventional 6 MeV photon beam at 24 hours post-fertilization (hpf). The survival and morphologic abnormalities (pericardial edema, spine curvature) of each embryo were assessed for each experiment at 24-hour intervals from the point of fertilization up to 168 hpf, defining the dose lethal for 50% of embryos (LD50). Results: In the zebrafish embryo model, an LD50 of 20 Gy was defined for the photon beam, and the same lethality was found at a 2 Gy dose from the reactor neutron beam, resulting in an RBE of 10. Dose-dependent organ perturbations were detected at the macroscopic level (shortening of the body length, spine curvature, microcephaly, micro-ophthalmia, micrognathia, pericardial edema, and inhibition of yolk sac resorption) and at the microscopic level (marked cellular changes in the skin, cardiac, and gastrointestinal systems) with the same magnitude of dose difference. Conclusion: From our observations, we found that the zebrafish embryo model can be used for investigating the effects of different types of ionizing radiation, and this system proved to be a highly efficient vertebrate model for preclinical examinations.
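
For reference, the figure quoted above follows the standard iso-effect definition of relative biological effectiveness; the definition is textbook, and only the 20 Gy and 2 Gy LD50 values come from the abstract.

```latex
% Relative biological effectiveness (RBE) at an iso-effect level:
\mathrm{RBE}
  \;=\; \left.\frac{D_{\text{reference (photon)}}}{D_{\text{test (neutron)}}}\right|_{\text{same biological effect}}
  \;=\; \frac{20\ \mathrm{Gy}}{2\ \mathrm{Gy}} \;=\; 10 ,
% using the LD50 doses reported for the photon and reactor-neutron beams.
```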

Keywords: ionizing radiation, LD50, relative biological effectiveness, zebrafish embryo

Procedia PDF Downloads 280
17052 Novel Low-cost Bubble CPAP as an Alternative Non-invasive Oxygen Therapy for Newborn Infants with Respiratory Distress Syndrome in a Tertiary Level Neonatal Intensive Care Unit in the Philippines: A Single Blind Randomized Controlled Trial

Authors: Navid P Roodaki, Rochelle Abila, Daisy Evangeline Garcia

Abstract:

Background and Objective: Respiratory Distress Syndrome (RDS) among premature infants is a major cause of neonatal death. The use of Continuous Positive Airway Pressure (CPAP) has become a standard of care for preterm newborns with RDS; hence, cost-effective innovations are needed. This study compared a novel low-cost bubble CPAP (bCPAP) device to ventilator-driven CPAP in the treatment of RDS. Methods: This is a single-blind, randomized controlled trial conducted from May 2022 to October 2022 in a Level III Neonatal Intensive Care Unit in the Philippines. Preterm newborns (<36 weeks) with RDS were randomized to receive the Vayu bCPAP device or ventilator-driven CPAP. Arterial blood gases, oxygen saturation, administration of surfactant, and CPAP failure rates were measured. Results: Seventy preterm newborns were included. No differences were observed between ventilator-driven CPAP and the Vayu bCPAP in PaO2 (97.51 mmHg vs. 97.37 mmHg), oxygen saturation (97.08% vs. 95.60%), or the amount of surfactant administered between groups. There were no observed differences in CPAP failure rates between the Vayu bCPAP (x̄ 3.23 days) and ventilator-driven CPAP (x̄ 2.98 days). However, a significant difference was noted in the CO2 level (40.32 mmHg vs. 50.70 mmHg), which was higher among those hooked to ventilator-driven CPAP (p = 0.004). Conclusion: This study has shown that the novel low-cost bubble CPAP (Vayu bCPAP) can be used as an efficacious alternative non-invasive oxygen therapy for preterm neonates with RDS; although the CO2 levels were higher among those hooked to ventilator-driven CPAP, the other outcome parameters measured showed that both devices are comparable. Recommendation: A multi-center or national study is recommended to account for geographic region, which may alter the outcomes of patients connected to different ventilatory support. A cost comparison between devices is also suggested. A mixed-methods study assessing the experiences of health care professionals in assembling and utilizing the device is a second consideration.

Keywords: bubble CPAP, ventilator-derived CPAP; infant, premature, respiratory distress syndrome

Procedia PDF Downloads 45
17051 A Framework for Successful TQM Implementation and Its Effect on the Organizational Sustainability Development

Authors: Redha Elhuni, M. Munir Ahmad

Abstract:

The main purpose of this research is to construct a generic model for the successful implementation of Total Quality Management (TQM) in the oil sector and to find out the effects of this model on the organizational sustainability development (OSD) performance of Libyan oil and gas companies using the structural equation modeling (SEM) approach. The research approach covers both quantitative and qualitative methods. A questionnaire was developed in order to identify the quality factors that are seen by Libyan oil and gas companies to be critical to the success of TQM implementation. Hypotheses were developed to evaluate the impact of TQM implementation on OSD. Data analysis reveals that there is a significant positive effect of TQM implementation on OSD. Twenty-four quality factors are found to be critical and absolutely essential for successful TQM implementation. The results generated a structure of the TQMSD implementation framework based on four major road map constructs (top management commitment, employee involvement and participation, customer-driven processes, and continuous improvement culture).

Keywords: total quality management, critical success factors, oil and gas, organizational sustainability development (SD), Libya

Procedia PDF Downloads 248