Search results for: software fault prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7341

4491 Early Impact Prediction and Key Factors Study of Artificial Intelligence Patents: A Method Based on LightGBM and Interpretable Machine Learning

Authors: Xingyu Gao, Qiang Wu

Abstract:

Patents play a crucial role in protecting innovation and intellectual property. Early prediction of the impact of artificial intelligence (AI) patents helps researchers and companies allocate resources and make better decisions. Understanding the key factors that influence patent impact can assist researchers in gaining a better understanding of the evolution of AI technology and innovation trends. Therefore, identifying highly impactful patents early and providing support for them holds immeasurable value in accelerating technological progress, reducing research and development costs, and mitigating market positioning risks. Despite the extensive research on AI patents, accurately predicting their early impact remains a challenge. Traditional methods often consider only single factors or simple combinations, failing to comprehensively and accurately reflect the actual impact of patents. This paper utilized the artificial intelligence patent database from the United States Patent and Trademark Office and the Lens.org patent retrieval platform to obtain specific information on 35,708 AI patents. Using six machine learning models, namely Multiple Linear Regression, Random Forest Regression, XGBoost Regression, LightGBM Regression, Support Vector Machine Regression, and K-Nearest Neighbors Regression, with early indicators of patents as features, the paper comprehensively predicted the impact of patents from three aspects: technical, social, and economic. These aspects correspond to the technical leadership of patents, the number of citations they receive, and their share value. SHAP (SHapley Additive exPlanations) values were used to explain the predictions of the best model, quantifying the contribution of each feature to the model's predictions. The experimental results on the AI patent dataset indicate that, for all three target variables, LightGBM regression shows the best predictive performance. Specifically, patent novelty has the greatest impact on predicting the technical impact of patents, with a positive effect. Additionally, the number of owners, the number of backward citations, and the number of independent claims are all crucial and have a positive influence on predicted technical impact. In predicting the social impact of patents, the number of applicants is the most critical input variable, but it has a negative effect on social impact. At the same time, the number of independent claims, the number of owners, and the number of backward citations are also important predictive factors, with a positive effect on social impact. For predicting the economic impact of patents, the number of independent claims is the most important factor and has a positive effect. The number of owners, the number of sibling countries or regions, and the size of the extended patent family also have a positive influence on economic impact. The study relies primarily on United States Patent and Trademark Office data for artificial intelligence patents; future research could consider more comprehensive data sources, including artificial intelligence patent data from a global perspective. While the study takes various factors into account, there may still be other important features not considered; in the future, factors such as patent implementation and market applications may be considered, as they could affect the influence of patents.
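
As a minimal sketch of the modelling pipeline described above — a LightGBM regressor trained on early patent indicators and explained with SHAP — the following Python fragment uses hypothetical feature and file names (novelty, n_owners, ai_patents.csv, etc.); the actual features and data are those of the paper's USPTO/Lens.org dataset, not reproduced here.

```python
import lightgbm as lgb
import pandas as pd
import shap
from sklearn.model_selection import train_test_split

# Hypothetical early indicators; the paper's dataset covers 35,708 AI patents.
df = pd.read_csv("ai_patents.csv")
features = ["novelty", "n_owners", "n_backward_citations",
            "n_independent_claims", "n_applicants", "family_size"]
X, y = df[features], df["technical_impact"]  # one of the three target variables

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = lgb.LGBMRegressor(n_estimators=500, learning_rate=0.05)
model.fit(X_tr, y_tr)

# SHAP quantifies each feature's contribution to individual predictions.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
shap.summary_plot(shap_values, X_te)
```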

Keywords: patent influence, interpretable machine learning, predictive models, SHAP

Procedia PDF Downloads 53
4490 Feature-Based Summarizing and Ranking from Customer Reviews

Authors: Dim En Nyaung, Thin Lai Lai Thein

Abstract:

With the rapid growth of the Internet, web opinion sources are emerging dynamically; they are useful to both potential customers and product manufacturers for prediction and decision-making purposes. These are user-generated contents written in natural language, in an unstructured free-text form. Opinion mining techniques have therefore become popular for automatically processing customer reviews to extract product features and the user opinions expressed about them. Since customer reviews may contain both opinionated and factual sentences, a supervised machine learning technique is applied for subjectivity classification to improve mining performance. In this paper, we dedicate our work to the task of opinion summarization. Product feature and opinion extraction is critical to opinion summarization because its effectiveness significantly affects the identification of semantic relationships. The polarity and numeric score of all the features are determined by the SentiWordNet lexicon. The problem of opinion summarization concerns how to relate the opinion words to a given feature. A probabilistic supervised learning model improves the results and is more flexible and effective.
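
As a minimal sketch of the SentiWordNet scoring step described above, the following Python fragment averages positive-minus-negative sense scores for an opinion word via NLTK's SentiWordNet interface; the (feature, opinion) pairs are invented examples, not data extracted from the paper's reviews.

```python
import nltk
from nltk.corpus import sentiwordnet as swn

nltk.download("sentiwordnet", quiet=True)
nltk.download("wordnet", quiet=True)

def polarity(opinion_word, pos="a"):
    """Average positive-minus-negative score over all senses of the word."""
    synsets = list(swn.senti_synsets(opinion_word, pos))
    if not synsets:
        return 0.0
    return sum(s.pos_score() - s.neg_score() for s in synsets) / len(synsets)

# Hypothetical (feature, opinion) pairs extracted from reviews.
for feature, opinion in [("battery", "excellent"), ("screen", "dim")]:
    print(feature, opinion, round(polarity(opinion), 3))
```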

Keywords: opinion mining, opinion summarization, sentiment analysis, text mining

Procedia PDF Downloads 332
4489 Renewable Energy Trends Analysis: A Patents Study

Authors: Sepulveda Juan

Abstract:

This article explains the elements and considerations taken into account when implementing patent evaluation and scientometric studies for the identification of technology trends, and the tools that led to the implementation of a software application for patent review. Univariate analysis helped recognize the technological leaders in the field of energy and paved the way for a multivariate analysis of this sample, which allowed for a graphical description of mature technologies as well as the detection of emerging ones. The article ends with a validation of the methodology as applied to the case of fuel cells.

Keywords: patents, scientometric, renewable energy, technology maps

Procedia PDF Downloads 312
4488 Modelling Fluoride Pollution of Groundwater Using Artificial Neural Network in the Western Parts of Jharkhand

Authors: Neeta Kumari, Gopal Pathak

Abstract:

Artificial neural networks have proved to be an efficient tool for non-parametric modeling of data in various applications where the output is non-linearly associated with the input. They are a preferred tool for many predictive data mining applications because of their power, flexibility, and ease of use. A standard feed-forward network (FFN) is used to predict the groundwater fluoride content. The ANN model is trained using the backpropagation algorithm with Tansig and Logsig activation functions and a varying number of neurons. The models are evaluated on the basis of statistical performance criteria such as Root Mean Squared Error (RMSE), regression coefficient (R2), bias (mean error), coefficient of variation (CV), Nash-Sutcliffe efficiency (NSE), and the index of agreement (IOA). The results of the study indicate that an artificial neural network (ANN) can be used for groundwater fluoride prediction in limited-data situations in hard rock regions like the western parts of Jharkhand with sufficiently good accuracy.
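
A minimal Python sketch of the workflow described above — a feed-forward network with tanh/logistic-style activations evaluated by RMSE, R2, and NSE — is given below; the hydrochemical feature names and data file are assumptions for illustration, not the study's data.

```python
import numpy as np
import pandas as pd
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical well-water chemistry features; column names are illustrative.
df = pd.read_csv("groundwater.csv")
X = df[["pH", "EC", "TDS", "Ca", "Mg", "HCO3"]].values
y = df["fluoride"].values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
scaler = StandardScaler().fit(X_tr)

# 'tanh' plays the role of Tansig and 'logistic' of Logsig in MATLAB terms.
ann = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                   solver="lbfgs", max_iter=2000, random_state=1)
ann.fit(scaler.transform(X_tr), y_tr)
pred = ann.predict(scaler.transform(X_te))

rmse = mean_squared_error(y_te, pred) ** 0.5
nse = 1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
print(f"RMSE={rmse:.3f}  R2={r2_score(y_te, pred):.3f}  NSE={nse:.3f}")
```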

Keywords: Artificial neural network (ANN), FFN (Feed-forward network), backpropagation algorithm, Levenberg-Marquardt algorithm, groundwater fluoride contamination

Procedia PDF Downloads 554
4487 Variable Frequency Converter Fed Induction Motors

Authors: Abdulatif Abdulsalam Mohamed Shaban

Abstract:

A.C. motors, in general, have performance characteristics superior to their d.c. counterparts. However, despite these advantages, a.c. motors lack the controllability and simplicity of d.c. motors, so d.c. motors retain a competitive edge where precise control is required. As part of an overall project to develop an improved cycloconverter control strategy for induction motors, simulation and modelling techniques have been developed. This contribution describes a method used to simulate an induction motor drive using the SIMULINK toolbox within MATLAB software. The cycloconverter-fed induction motor is principally modelled using the d-q axis equations. Results of the simulation for a given set of induction motor parameters are also presented.
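
The paper builds its model in SIMULINK; purely as an illustration of the stationary-frame d-q axis equations it refers to, the sketch below integrates the standard flux-linkage form of the induction machine model in Python, with assumed (illustrative) machine parameters and an ideal sinusoidal supply standing in for the cycloconverter.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed, illustrative machine parameters -- not from the paper.
rs, rr = 0.435, 0.816                 # stator/rotor resistance [ohm]
Lls, Llr, Lm = 2e-3, 2e-3, 69.3e-3    # leakage and magnetizing inductance [H]
Ls, Lr = Lls + Lm, Llr + Lm
D = Ls * Lr - Lm**2
P, J, TL = 4, 0.089, 0.0              # poles, inertia [kg m^2], load torque [N m]
V, f = 220 * np.sqrt(2 / 3), 60.0     # peak phase voltage [V], frequency [Hz]
we = 2 * np.pi * f

def dq_model(t, x):
    psi_qs, psi_ds, psi_qr, psi_dr, wr = x
    # Ideal sinusoidal stator voltages in the stationary frame.
    vqs = V * np.cos(we * t)
    vds = -V * np.sin(we * t)
    # Currents recovered from the flux linkages.
    iqs = (Lr * psi_qs - Lm * psi_qr) / D
    ids = (Lr * psi_ds - Lm * psi_dr) / D
    iqr = (Ls * psi_qr - Lm * psi_qs) / D
    idr = (Ls * psi_dr - Lm * psi_ds) / D
    # Flux-linkage state equations (rotor terms include speed voltages).
    dpsi_qs = vqs - rs * iqs
    dpsi_ds = vds - rs * ids
    dpsi_qr = -rr * iqr + wr * psi_dr
    dpsi_dr = -rr * idr - wr * psi_qr
    # Electromagnetic torque and rotor acceleration.
    Te = (3 * P / 4) * (psi_ds * iqs - psi_qs * ids)
    dwr = (P / (2 * J)) * (Te - TL)
    return [dpsi_qs, dpsi_ds, dpsi_qr, dpsi_dr, dwr]

sol = solve_ivp(dq_model, (0, 1.0), [0, 0, 0, 0, 0], max_step=1e-4)
print("final rotor electrical speed [rad/s]:", sol.y[4, -1])
```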

Keywords: simulation, converter, motor, cycloconverter

Procedia PDF Downloads 612
4486 A Survey of Dynamic QoS Methods in Software Defined Networking

Authors: Vikram Kalekar

Abstract:

Modern Internet Protocol (IP) networks deploy traditional and modern Quality of Service (QoS) management methods to ensure the smooth flow of network packets during regular operations. Software-defined networking (SDN) has also made headway into better service delivery by means of novel QoS methodologies. While many of these techniques are experimental, some have been tested extensively in controlled environments, and a few have the potential to be deployed widely in the industry. With this survey, we analyze the approaches to QoS and resource allocation in SDN and comment on possible improvements to QoS management in the context of SDN.

Keywords: QoS, policy, congestion, flow management, latency, delay, SDN

Procedia PDF Downloads 197
4485 New Design of a Broadband Microwave Zero Bias Power Limiter

Authors: K. Echchakhaoui, E. Abdelmounim, J. Zbitou, H. Bennis, N. Ababssi, M. Latrach

Abstract:

In this paper, a new design of a broadband microwave power limiter is presented and validated in simulation using ADS (Advanced Design System) software from Agilent Technologies. The final circuit is built on microstrip lines using identical zero bias Schottky diodes. The power limiter is designed by associating three stages of Schottky diodes. The simulation results obtained validate this circuit with a threshold input power level of 0 dBm up to a maximum input power of 30 dBm.

Keywords: limiter, microstrip, zero-bias, ADS

Procedia PDF Downloads 470
4484 Synoptic Analysis of a Heavy Flood in the Province of Sistan-Va-Balouchestan: Iran January 2020

Authors: N. Pegahfar, P. Ghafarian

Abstract:

In this research, the synoptic weather conditions during the heavy flood of 10-12 January 2020 in the Sistan-va-Balouchestan Province of Iran are analyzed. To this aim, reanalysis data from the National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR), NCEP Global Forecasting System (GFS) analysis data, and measured data from a surface station, together with satellite images from the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT), have been used for 9 to 12 January 2020. Atmospheric parameters in both the lower and upper troposphere have been examined, including absolute vorticity, wind velocity, temperature, geopotential height, relative humidity, and precipitation. Results indicated that both lower-level and upper-level currents were strong. In addition, the transport of a large amount of humidity from the Oman Sea and the Red Sea to the south and southeast of Iran (Sistan-va-Balouchestan Province) led to vast and unexpected precipitation and then a heavy flood.

Keywords: Sistan-va-Balouchestan Province, heavy flood, synoptic, analysis data

Procedia PDF Downloads 104
4483 On the Influence of Sleep Habits for Predicting Preterm Births: A Machine Learning Approach

Authors: C. Fernandez-Plaza, I. Abad, E. Diaz, I. Diaz

Abstract:

Births occurring before the 37th week of gestation are considered preterm births. A threatened preterm labour is defined as the beginning of regular uterine contractions, dilation, and cervical effacement between 23 and 36 gestation weeks. To the authors' best knowledge, the factors that determine the onset of birth are not completely defined yet. In particular, the influence of sleep habits on preterm births has been only weakly studied. The aim of this study is to develop a model to predict the factors affecting premature delivery in pregnancy, based on the above potential risk factors, including those derived from sleep habits and light exposure at night (introduced as 12 variables obtained by a telephone survey using two questionnaires previously used by other authors). Thus, three groups of variables were included in the study (maternal, fetal, and sleep habits). The study was approved by the Research Ethics Committee of the Principado de Asturias (Spain). An observational, retrospective, and descriptive study was performed on 481 births between January 1, 2015 and May 10, 2016 in the University Central Hospital of Asturias (Spain). A statistical analysis using SPSS was carried out to compare qualitative and quantitative variables between preterm and term delivery: a chi-square test was applied for qualitative variables and a t-test for quantitative variables. Statistically significant differences (p < 0.05) between preterm and term births were found for primiparity, multiparity, kind of conception, place of residence, premature rupture of membranes, and interruptions during the night. In addition to the statistical analysis, machine learning methods were tested to look for a prediction model. In particular, tree-based models were applied, as their trade-off between performance and interpretability is especially suitable for this study. C5.0, recursive partitioning, random forest, and tree bag models were analysed using the caret R package. Ten-fold cross-validation and parameter tuning to optimize the methods were applied. In addition, different noise reduction methods were applied to the initial data using the NoiseFiltersR package. The best performance was obtained by the C5.0 method, with accuracy 0.91, sensitivity 0.93, specificity 0.89, and precision 0.91. Some well-known preterm birth factors were identified: cervix dilation, maternal BMI, premature rupture of membranes, and nuchal translucency analysis in the first trimester. The model also identifies other new factors related to sleep habits, such as light through the window, bedtime on working days, usage of electronic devices before sleeping from Mondays to Fridays, or a change of sleeping habits reflected in the number of hours, the depth of sleep, or the lighting of the room. 'IF dilation <= 2.95 AND usage of electronic devices before sleeping from Mondays to Fridays = YES AND change of sleeping habits = YES THEN preterm' is one of the predictive rules obtained by C5.0. In this work, a model for predicting preterm births is developed, based on machine learning together with noise reduction techniques; the method maximizing the performance is the one selected. This model shows the influence of variables related to sleep habits on preterm prediction.
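
The paper's models were fitted with the caret R package; as a language-neutral illustration of the same workflow — a decision tree evaluated with ten-fold cross-validation — here is a Python sketch with hypothetical column names, not the study's data.

```python
import pandas as pd
from sklearn.model_selection import StratifiedKFold, cross_validate
from sklearn.tree import DecisionTreeClassifier

# Hypothetical columns standing in for the maternal/fetal/sleep variables.
df = pd.read_csv("births.csv")
X = df[["dilation", "bmi", "devices_before_sleep", "light_through_window"]]
y = df["preterm"]  # 1 = preterm (<37 weeks), 0 = term

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_validate(DecisionTreeClassifier(max_depth=4, random_state=0),
                        X, y, cv=cv,
                        scoring=["accuracy", "recall", "precision"])
# 'recall' on the preterm class corresponds to sensitivity.
for metric in ("accuracy", "recall", "precision"):
    print(metric, scores[f"test_{metric}"].mean().round(3))
```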

Keywords: machine learning, noise reduction, preterm birth, sleep habit

Procedia PDF Downloads 153
4482 Main Control Factors of Fluid Loss in Drilling and Completion in Shunbei Oilfield by Unmanned Intervention Algorithm

Authors: Peng Zhang, Lihui Zheng, Xiangchun Wang, Xiaopan Kou

Abstract:

Quantitative research on the main control factors of lost circulation has so far considered few factors and single data sources. Using the Unmanned Intervention Algorithm to find the main control factors of lost circulation allows all measurable parameters to be adopted. The degree of lost circulation is characterized by the loss rate as the objective function. Geological, engineering, and fluid data are used as layers, and 27 factors such as wellhead coordinates and weight on bit (WOB) are used as dimensions. Data classification is implemented to determine the independent variables of the function. The mathematical equation relating the loss rate to the 27 influencing factors is established by the multiple regression method, and the undetermined coefficients of the equation are solved by the undetermined coefficient method. Only three factors in the t-test are greater than the test value of 40, and the F-test value is 96.557%, indicating that the correlation of the model is good. The funnel viscosity, final shear force, and drilling time were selected as the main control factors by the elimination method, the contribution rate method, and the functional method. The calculated values of the two wells used for verification differ from the actual values by -3.036 m3/h and -2.374 m3/h, with errors of 7.21% and 6.35%. The influence of engineering factors on the loss rate is greater than that of funnel viscosity and final shear force, and the influence of the three factors is less than that of geological factors. The best combination of funnel viscosity, final shear force, and drilling time was calculated quantitatively: the minimum loss rate of lost circulation wells in the Shunbei area is 10 m3/h. It can be seen that man-made main control factors can only slow down the leakage but cannot fundamentally eliminate it. This is consistent with the characteristics of karst caves and fractures in the Shunbei fault-solution oil and gas reservoir.
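
As a sketch of the multiple-regression step described above — fitting the loss rate against candidate factors and reading off per-factor t-statistics and the overall F-test — the following Python fragment uses statsmodels; the file and column names are hypothetical, not the study's dataset.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical subset of the 27 candidate factors; names are illustrative.
df = pd.read_csv("drilling_records.csv")
X = df[["funnel_viscosity", "final_shear_force", "drilling_time", "wob"]]
y = df["loss_rate"]  # objective function, m3/h

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())   # per-factor t-statistics and the overall F-test
print(model.params)      # fitted (undetermined) coefficients
```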

Keywords: drilling and completion, drilling fluid, lost circulation, loss rate, main controlling factors, unmanned intervention algorithm

Procedia PDF Downloads 118
4481 A Neural Network Modelling Approach for Predicting Permeability from Well Logs Data

Authors: Chico Horacio Jose Sambo

Abstract:

Recently, neural networks have gained popularity for solving complex non-linear problems. Permeability is a fundamental reservoir characteristic that is anisotropic and distributed in a non-linear manner. For this reason, permeability prediction from well log data is well suited to neural networks and other computer-based techniques. The main goal of this paper is to predict reservoir permeability from well log data using a neural network approach. A multi-layered perceptron trained by the backpropagation algorithm was used to build the predictive model. The performance of the model was measured by the correlation coefficient, evaluated on the training, testing, validation, and complete data sets. The results show that the neural network was capable of reproducing permeability with accuracy in all cases: the calculated correlation coefficients for training, testing, and validation were 0.96273, 0.89991, and 0.87858, respectively. The generalization of the results to other fields can be made after examining new data, and a regional study might make it possible to study reservoir properties with cheap and very quickly constructed models.
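
As a compact sketch of a multilayer perceptron trained by backpropagation and scored by the correlation coefficient, the fragment below uses PyTorch with random placeholder arrays standing in for the normalized well-log features and core permeability targets; the choice of four input curves is an assumption for illustration.

```python
import torch
import torch.nn as nn

# Placeholders: four normalized log curves in, log-permeability out.
X = torch.randn(500, 4)
y = torch.randn(500, 1)

model = nn.Sequential(nn.Linear(4, 12), nn.Tanh(), nn.Linear(12, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(2000):        # backpropagation training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Correlation coefficient between predicted and measured permeability.
pred = model(X).detach().squeeze()
r = torch.corrcoef(torch.stack([pred, y.squeeze()]))[0, 1]
print(f"training R = {r:.4f}")
```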

Keywords: neural network, permeability, multilayer perceptron, well log

Procedia PDF Downloads 408
4480 Exploring Tweet Geolocation: Leveraging Large Language Models for Post-Hoc Explanations

Authors: Sarra Hasni, Sami Faiz

Abstract:

In recent years, location prediction on social networks has gained significant attention, with short and unstructured texts like tweets posing additional challenges. Advanced geolocation models have been proposed, increasing the need to explain their predictions. In this paper, we provide explanations for a geolocation black-box model using LIME and SHAP, two state-of-the-art XAI (eXplainable Artificial Intelligence) methods. We extend our evaluations to Large Language Models (LLMs) as post hoc explainers for tweet geolocation. Our preliminary results show that LLMs outperform LIME and SHAP by generating more accurate explanations. Additionally, we demonstrate that prompts with examples and meta-prompts containing phonetic spelling rules improve the interpretability of these models, even with informal input data. This approach highlights the potential of advanced prompt engineering techniques to enhance the effectiveness of black-box models in geolocation tasks on social networks.
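
Of the two XAI baselines named above, LIME has a widely used text interface; the sketch below shows how a black-box geolocation classifier could be wrapped for it. The classifier stub, the city label set, and the example tweet are all placeholders, not the paper's models or data.

```python
import numpy as np
from lime.lime_text import LimeTextExplainer

CITIES = ["London", "New York", "Lagos"]   # placeholder label set

def black_box_predict(texts):
    """Stand-in for the geolocation model: returns per-city probabilities."""
    rng = np.random.default_rng(0)
    p = rng.random((len(texts), len(CITIES)))
    return p / p.sum(axis=1, keepdims=True)

explainer = LimeTextExplainer(class_names=CITIES)
exp = explainer.explain_instance("gonna grab a cheeky nandos after the tube",
                                 black_box_predict, num_features=5)
print(exp.as_list())   # tokens most responsible for the predicted city
```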

Keywords: large language model, post hoc explainer, prompt engineering, local explanation, tweet geolocation

Procedia PDF Downloads 32
4479 The Accuracy of an In-House Developed Computer-Assisted Surgery Protocol for Mandibular Micro-Vascular Reconstruction

Authors: Christophe Spaas, Lies Pottel, Joke De Ceulaer, Johan Abeloos, Philippe Lamoral, Tom De Backer, Calix De Clercq

Abstract:

We aimed to evaluate the accuracy of an in-house developed low-cost computer-assisted surgery (CAS) protocol for osseous free flap mandibular reconstruction. All patients who underwent primary or secondary mandibular reconstruction with a free osseous flap (alone or composite), either a fibula free flap or an iliac crest free flap, between January 2014 and December 2017 were evaluated. The low-cost protocol consisted of a virtual surgical planning, a pre-bent custom reconstruction plate, and an individualized free flap positioning guide. The accuracy of the protocol was evaluated by comparing the postoperative outcome with the 3D virtual planning, based on measurement of the following parameters: intercondylar distance, mandibular angle (axial and sagittal), inner angular distance, anterior-posterior distance, length of the fibular/iliac crest segments, and osteotomy angles. A statistical analysis of the obtained values was done. Virtual 3D surgical planning and cutting guide design were performed with Proplan CMF® software (Materialise, Leuven, Belgium) and IPS Gate (KLS Martin, Tuttlingen, Germany). Segmentation of the DICOM data as well as outcome analysis were done with BrainLab iPlan® software (Brainlab AG, Feldkirchen, Germany). A cost analysis of the protocol was done. Twenty-two patients (11 fibula / 11 iliac crest) were included and analyzed. Based on voxel-based registration on the cranial base, 3D virtual planning landmark parameters did not significantly differ from those measured on the actual treatment outcome (p-values > 0.05). A cost evaluation of the in-house developed CAS protocol revealed a 1750 euro cost reduction in comparison with a standard CAS protocol using a patient-specific reconstruction plate. Our results indicate that an accurate transfer of the planning with our in-house developed low-cost CAS protocol is feasible at a significantly lower cost.

Keywords: CAD/CAM, computer-assisted surgery, low-cost, mandibular reconstruction

Procedia PDF Downloads 144
4478 Determination of Optical Constants of Semiconductor Thin Films by Ellipsometry

Authors: Aïssa Manallah, Mohamed Bouafia

Abstract:

Ellipsometry is an optical method based on the study of the behavior of polarized light. Light reflected from a surface undergoes a change in its polarization state which depends on the characteristics of the material (the complex refractive index and the thickness of the different layers constituting the device). The purpose of this work is to determine the optical properties of semiconductor thin films by ellipsometry. This paper describes the experimental aspects concerning the semiconductor samples, the principle of the SE400 ellipsometer, and the results obtained by direct measurement of the ellipsometric parameters and by modelling using appropriate software.
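
As a minimal illustration of the quantity an ellipsometer measures — the complex ratio rho = rp/rs = tan(psi)·exp(i·delta) of the Fresnel reflection coefficients — the following sketch computes psi and delta for a bare ambient/substrate interface; the refractive index and incidence angle are assumed values, and a real thin-film analysis (as with the SE400) would add layer-thickness terms.

```python
import numpy as np

def psi_delta(n_complex, theta_i_deg, n_ambient=1.0):
    """Ellipsometric angles for a single ambient/substrate interface."""
    ti = np.radians(theta_i_deg)
    n0, n1 = n_ambient, n_complex
    tt = np.arcsin(n0 * np.sin(ti) / n1)   # complex refraction angle (Snell)
    rp = (n1 * np.cos(ti) - n0 * np.cos(tt)) / (n1 * np.cos(ti) + n0 * np.cos(tt))
    rs = (n0 * np.cos(ti) - n1 * np.cos(tt)) / (n0 * np.cos(ti) + n1 * np.cos(tt))
    rho = rp / rs                          # rho = tan(psi) * exp(i*delta)
    return np.degrees(np.arctan(np.abs(rho))), np.degrees(np.angle(rho))

# Assumed refractive index for illustration (roughly silicon at 633 nm).
psi, delta = psi_delta(3.88 - 0.02j, theta_i_deg=70.0)
print(f"psi = {psi:.2f} deg, delta = {delta:.2f} deg")
```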

Keywords: ellipsometry, optical constants, semiconductors, thin films

Procedia PDF Downloads 313
4477 Feasibility of an Extreme Wind Risk Assessment Software for Industrial Applications

Authors: Francesco Pandolfi, Georgios Baltzopoulos, Iunio Iervolino

Abstract:

The impact of extreme winds on industrial assets and the built environment is gaining increasing attention from stakeholders, including the corporate insurance industry. This has led to progressively more in-depth study of building vulnerability and fragility to wind. Wind vulnerability models are used in probabilistic risk assessment to relate a loss metric to an intensity measure of the natural event, usually a gust or a mean wind speed. In fact, vulnerability models can be integrated with the wind hazard, which consists of associating a probability with each intensity level in a time interval (e.g., by means of return periods), to provide an assessment of future losses due to extreme wind. This has also given impetus to world- and regional-scale wind hazard studies. Another approach often adopted for the probabilistic description of building vulnerability to wind is the use of fragility functions, which provide the conditional probability that selected building components will exceed certain damage states, given the wind intensity. In fact, in the wind engineering literature, it is more common to find structural system- or component-level fragility functions than wind vulnerability models for an entire building. Loss assessment based on component fragilities requires some logical combination rules that define the building's damage state given the damage state of each component, and the availability of a consequence model that provides the losses associated with each damage state. When risk calculations are based on numerical simulation of a structure's behavior during extreme wind scenarios, the interaction of component fragilities is intertwined with the computational procedure. However, simulation-based approaches are usually computationally demanding and case-specific. In this context, the present work introduces the ExtReMe wind risk assESsment prototype Software, ERMESS, which is being developed at the University of Naples Federico II. ERMESS is a wind risk assessment tool for insurance applications to industrial facilities, collecting a wide assortment of available wind vulnerability models and fragility functions to facilitate their incorporation into risk calculations based on built-in or user-defined wind hazard data. The software implements an alternative method for building-specific risk assessment based on existing component-level fragility functions and on a number of simplifying assumptions for their interactions. The applicability of this alternative procedure is explored by means of an illustrative proof-of-concept example, which considers four main building components, namely: the roof covering, the roof structure, the envelope wall, and the envelope openings. The application shows that, despite the simplifying assumptions, the procedure can yield risk evaluations comparable to those obtained via more rigorous building-level simulation-based methods, at least in the considered example. The advantage of this approach is that a database of building component fragility curves can be put to use for the development of new wind vulnerability models covering building typologies not yet adequately addressed by existing works, and whose rigorous development is usually beyond the budget of portfolio-related industrial applications.
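
As a sketch of how a vulnerability model and a hazard curve combine into a loss estimate — the core calculation such a tool automates — the following fragment numerically evaluates the expected annual loss as the integral of the loss ratio against the hazard rate density; both functional forms below are placeholders, not ERMESS models.

```python
import numpy as np

# Placeholder hazard: annual exceedance rate of gust speed v (1/yr).
def exceedance_rate(v):
    return 0.02 * np.exp(-(v - 20.0) / 8.0)

# Placeholder vulnerability: mean loss ratio given gust speed v.
def loss_ratio(v):
    return 1.0 / (1.0 + np.exp(-(v - 45.0) / 5.0))

# Expected annual loss: integrate the loss against the hazard density
# |d(lambda)/dv|, obtained here by numerical differentiation.
v = np.linspace(20.0, 90.0, 2000)
lam = exceedance_rate(v)
density = -np.gradient(lam, v)          # rate density over wind speed
eal = np.trapz(loss_ratio(v) * density, v)
print(f"expected annual loss ratio: {eal:.5f}")
```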

Keywords: component wind fragility, probabilistic risk assessment, vulnerability model, wind-induced losses

Procedia PDF Downloads 183
4476 Traction Behavior of Linear Piezo-Viscous Lubricants in Rough Elastohydrodynamic Lubrication Contacts

Authors: Punit Kumar, Niraj Kumar

Abstract:

The traction behavior of lubricants with a linear pressure-viscosity response in EHL line contacts is investigated numerically for smooth as well as rough surfaces. The analysis involves the simultaneous solution of the Reynolds, elasticity, and energy equations, along with the computation of lubricant properties and surface temperatures. The temperature-modified Doolittle-Tait equations are used to calculate viscosity and density as functions of fluid pressure and temperature, while the Carreau model is used to describe the lubricant rheology. The surface roughness is assumed to be sinusoidal and is present on the nearly stationary surface in a near-pure sliding EHL conjunction. The linear P-V oil is found to yield much lower traction coefficients and slightly thicker EHL films than the synthetic oil for a given set of dimensionless speed and load parameters. Moreover, the increase in traction coefficient attributed to surface roughness is much lower in the former case. The present analysis emphasizes the importance of employing a realistic pressure-viscosity response for accurate prediction of EHL traction.
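
The Carreau model named above has a standard closed form; the sketch below evaluates it with illustrative parameters (not the paper's lubricant data) to show how viscosity falls with shear rate between its Newtonian limits.

```python
import numpy as np

def carreau_viscosity(shear_rate, eta0, eta_inf, lam, n):
    """Carreau model: generalized-Newtonian viscosity vs. shear rate."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

# Illustrative parameters only.
gamma = np.logspace(2, 8, 7)   # shear rate, 1/s
eta = carreau_viscosity(gamma, eta0=0.05, eta_inf=0.001, lam=1e-6, n=0.6)
for g, e in zip(gamma, eta):
    print(f"shear rate {g:.1e} 1/s -> viscosity {e:.4f} Pa.s")
```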

Keywords: EHL, linear pressure-viscosity, surface roughness, traction, water/glycol

Procedia PDF Downloads 386
4475 3D Multimedia Model for Educational Design Engineering

Authors: Mohanaad Talal Shakir

Abstract:

This paper proposes an educational design using multimedia technology for Computer Technology Engineering at Alma'ref University College in Iraq. The paper evaluates students' acceptance, cognition, and interactivity with the proposed model, using statistical relationships to determine the stage of the model. The objectives of the proposed educational design are to develop user-friendly software for educational purposes using multimedia technology and to develop a 3D model animation to simulate the assembling and disassembling process of a high-speed flow facility (shock tunnel).

Keywords: CAL, multimedia, shock tunnel, interactivity, engineering education

Procedia PDF Downloads 625
4474 Leveraging Digital Cyber Technology for Self-Care and Improved Management of DMPA-SC Clients

Authors: Oluwaseun Adeleke, Grace Amarachi Omenife, Jennifer Adebambo, Mopelola Raji, Anthony Nwala, Mogbonjubade Adesulure

Abstract:

Introduction: The incorporation of digital technology in healthcare systems is instrumental in transforming the delivery, management, and overall experience of healthcare, and it holds the potential to scale up access through the over 200 million active mobile phones used in Nigeria. Digital tools enable increased access to care, stronger client engagement, progress in research and data-driven insights, and more effective promotion of self-care and do-it-yourself practices. Since 2021, the Delivering Innovation in Self-Care (DISC) project has played a pivotal role in granting women greater autonomy over their sexual and reproductive health (SRH) through a variety of approaches, including information and training to self-inject contraception (DMPA-SC). To optimize its outcomes, the project also leverages digital technology platforms such as social media (Facebook, Instagram, and Meet Tina, a chatbot on WhatsApp), the Customer Relationship Management (CRM) application Freshworks, and Viamo. Methodology: The project has been successful at optimizing one-on-one digital interaction to sensitize individuals effectively about self-injection and provide linkages to SI services. This platform employs the Freshworks CRM software application, along with specially trained personnel known as Cyber IPC Agents and DHIS calling centers. Integration of the Freshworks CRM software with social media allows a direct connection with clients to address emerging issues, schedule follow-ups, send reminders to improve compliance with self-injection schedules, enhance the overall user experience for self-injection (SI) clients, and generate comprehensive reports and analytics on client interactions. Interactions cover a range of topics, including how to use SI, learning more about SI, side effects and their management, accessing services, fertility, ovulation, other family planning methods, and inquiries related to sexual and reproductive health; an address log is used to connect clients with nearby facilities or online pharmacies. Results: Between March and September, a total of 5,403 engagements were recorded. Among these, 4,685 were satisfactorily resolved. Since the program's inception, digital advertising has created 233,633,075 impressions, reached 12,715,582 persons, and resulted in 3,394,048 clicks. Conclusion: Leveraging digital technology has proven to be an invaluable tool in client management and improving client experience. The use of cyber technology has enabled the successful development and maintenance of client relationships, which have been effective at providing support, facilitating delivery of and compliance with DMPA-SC self-injection services, and ensuring overall client satisfaction. Concurrently, qualitative data, including user experience feedback, have yielded crucial insights that inform the decision-making process and guide the normalization of self-care behavior.

Keywords: selfcare, DMPA-SC self-injection, digital technology, cyber technology, freshworks CRM software

Procedia PDF Downloads 71
4473 Drug Design Modelling and Molecular Virtual Simulation of an Optimized BSA-Based Nanoparticle Formulation Loaded with Di-Berberine Sulfate Acid Salt

Authors: Eman M. Sarhan, Doaa A. Ghareeb, Gabriella Ortore, Amr A. Amara, Mohamed M. El-Sayed

Abstract:

Drug salting and nanoparticle-based drug delivery formulations are considered an effective means of rendering hydrophobic drugs dispersible at the nano-scale in aqueous media, thus circumventing the pitfalls of their poor solubility as well as enhancing their membrane permeability. The current study aims to increase the bioavailability of quaternary ammonium berberine through acid salting and a biodegradable bovine serum albumin (BSA)-based nanoparticulate drug formulation. Berberine hydroxide (BBR-OH), chemically synthesized by alkalization of the commercially available berberine hydrochloride (BBR-HCl), was then acidified to obtain di-berberine sulfate ((BBR)₂SO₄). The purified crystals were spectrally characterized. The desolvation technique was optimized for the preparation of size-controlled BSA-BBR-HCl, BSA-BBR-OH, and BSA-(BBR)₂SO₄ nanoparticles. Particle size, zeta potential, drug release, encapsulation efficiency, Fourier transform infrared spectroscopy (FTIR), tandem MS-MS spectroscopy, energy-dispersive X-ray spectroscopy (EDX), scanning and transmission electron microscopy (SEM, TEM), in vitro bioactivity, and in silico drug-polymer interaction were determined. The BSA (PDB ID: 4OR0) protonation state at different pH values was predicted using Amber12 molecular dynamics simulation. Blind docking was then performed using the Lamarckian genetic algorithm (LGA) in AutoDock4.2 software. Results proved the purity and size-controlled synthesis of the berberine-BSA nanoparticles. The possible binding poses and the hydrophobic and hydrophilic interactions of berberine with BSA at different pH values were predicted. The antioxidant, anti-hemolytic, and cell differentiation abilities of the tested drugs and their nano-formulations were evaluated. Thus, drug salting and potentially effective albumin-berberine nanoparticle formulations can be successfully developed using a well-optimized desolvation technique, exhibiting better in vitro cellular bioavailability.

Keywords: berberine, BSA, BBR-OH, BBR-HCl, BSA-BBR-HCl, BSA-BBR-OH, (BBR)₂SO₄, BSA-(BBR)₂SO₄, FTIR, AutoDock4.2 software, Lamarckian genetic algorithm, SEM, TEM, EDX

Procedia PDF Downloads 176
4472 Gender Differences in the Prediction of Smartphone Use While Driving: Personal and Social Factors

Authors: Erez Kita, Gil Luria

Abstract:

This study examines gender as a boundary condition for the relationship between the psychological variable of mindfulness and the social variable of income with regard to the use of smartphones by young drivers. The use of smartphones while driving increases the likelihood of a car accident, endangering young drivers and other road users. The study sample included 186 young drivers who were legally permitted to drive without supervision. The subjects were first asked to complete questionnaires on mindfulness and income. Next, their smartphone use while driving was monitored over a one-month period. This study is unique in that it used an objective smartphone monitoring application (rather than self-reporting) to count the number of times the young participants actually touched their smartphones while driving. The findings show that gender moderates the effects of social and personal factors (i.e., income and mindfulness) on the use of smartphones while driving. The pattern of moderation was similar for both social and personal factors. For men, mindfulness and income are negatively associated with the use of smartphones while driving. These factors are not related to smartphone use by women drivers. Mindfulness and income can be used to identify male populations that are at risk of using smartphones while driving. Interventions that improve mindfulness can be used to reduce smartphone use by male drivers.

Keywords: mindfulness, using smartphones while driving, income, gender, young drivers

Procedia PDF Downloads 175
4471 High School Gain Analytics From National Assessment Program – Literacy and Numeracy and Australian Tertiary Admission Rank Linkage

Authors: Andrew Laming, John Hattie, Mark Wilson

Abstract:

Nine Queensland independent high schools provided deidentified student-matched ATAR and NAPLAN data for all 1,217 ATAR graduates since 2020 who also sat NAPLAN at the school. Graduating cohorts from the nine schools contained a mean of 100 ATAR graduates with previous NAPLAN data from their school. Excluded were vocational students (mean = 27) and any ATAR graduates without NAPLAN data (mean = 20). Relative to Index of Community Socio-Educational Advantage (ICSEA) prediction, all schools had larger than predicted proportions of their students graduating with ATARs. There were an additional 173 students (14%) not releasing their ATARs to their school, requiring this data to be inferred by schools. Gain was established by first converting each student's strongest NAPLAN domain to a statewide percentile, then subtracting this result from the final ATAR. The resulting 'percentile shift' was corrected for plausible ATAR participation at each NAPLAN level. The strongest NAPLAN domain had the highest correlation with ATAR (R2 = 0.58). RESULTS: School mean NAPLAN scores fitted ICSEA closely (R2 = 0.97). Schools achieved a mean cohort gain of two ATAR rankings, but only 66% of students gained. This ranged from 46% of top-NAPLAN-decile students gaining, rising to 75% achieving gains outside the top decile. The 54% of top-decile students whose ATAR fell short of prediction lost a mean of 4.0 percentiles (or 6.2 percentiles before correction for regression to the mean). 71% of students in smaller schools gained, compared to 63% in larger schools. NAPLAN variability in each of the 13 ICSEA-1100 cohorts was 17%, with both intra-school and inter-school variation of these values extremely low (0.3% to 1.8%). Mean ATAR change between years in each school was just 1.1 ATAR ranks. This suggests consecutive school cohorts and ICSEA-similar schools share very similar distributions and outcomes over time. Quantile analysis of the NAPLAN/ATAR relationship revealed heteroscedasticity, but splines offered little additional benefit over simple linear regression. The NAPLAN/ATAR R2 was 0.33. DISCUSSION: Standardised data like NAPLAN and ATAR offer educators a simple no-cost progression metric to analyse performance in conjunction with their internal test results. Change is expressed in percentiles, or ATAR shift per student, which is intuitive to laypeople. Findings may also reduce ATAR/vocational stream mismatch, reveal the proportions of cohorts meeting or falling short of expectation, and demonstrate by how much. Finally, 'crashed' ATARs well below expectation are revealed, which schools can reasonably work to minimise. The percentile shift method is neither a value-added measure nor a growth percentile. In the absence of exit NAPLAN testing, this metric is unable to discriminate academic gain from legitimate ATAR-maximising strategies. But by controlling for ICSEA, ATAR proportion variation, and student mobility, it uncovers progression-to-ATAR metrics which are not currently publicly available. However achieved, ATAR maximisation is a sought-after private good. As long as standardised nationwide data are available, this analysis offers useful analytics for educators and reasonable predictivity when counselling subsequent cohorts about their ATAR prospects.
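
As an illustration of the percentile-shift computation described above (strongest NAPLAN domain converted to a statewide percentile, then subtracted from the final ATAR), here is a small pandas sketch; the records are invented, the sample itself stands in for the statewide distribution, and the ATAR-participation correction is omitted.

```python
import pandas as pd
from scipy.stats import percentileofscore

# Invented student-matched records; 'naplan_best' is the strongest domain score.
df = pd.DataFrame({
    "naplan_best": [620, 540, 575, 700, 498],
    "atar":        [88.0, 71.5, 80.0, 97.2, 65.3],
})

# A statewide percentile would use the full state distribution; here the
# sample stands in for it, and the participation correction is omitted.
state_scores = df["naplan_best"]
df["naplan_pct"] = df["naplan_best"].apply(
    lambda s: percentileofscore(state_scores, s))
df["percentile_shift"] = df["atar"] - df["naplan_pct"]  # gain per student

print(df[["naplan_pct", "percentile_shift"]].round(1))
print("share gaining:", (df["percentile_shift"] > 0).mean())
```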

Keywords: NAPLAN, ATAR, analytics, measurement, gain, performance, data, percentile, value-added, high school, numeracy, reading comprehension, variability, regression to the mean

Procedia PDF Downloads 70
4470 Lumped Parameter Models for Numerical Simulation of the Dynamic Response of Hoisting Appliances

Authors: Candida Petrogalli, Giovanni Incerti, Luigi Solazzi

Abstract:

This paper describes three lumped parameter models for the study of the dynamic behaviour of a boom crane. The models proposed here allow evaluating the fluctuations of the load arising from the rope and structure elasticity and from the type of motion command imposed by the winch. Calculation software was developed in order to determine the actual acceleration of the lifted mass and the dynamic overload during the lifting phase. Some application examples are presented, with the aim of showing the correlation between the magnitude of the stress and the type of motion command employed.

Keywords: crane, dynamic model, overloading condition, vibration

Procedia PDF Downloads 579
4469 Pharmacophore-Based Modeling of a Series of Human Glutaminyl Cyclase Inhibitors to Identify Lead Molecules by Virtual Screening, Molecular Docking and Molecular Dynamics Simulation Study

Authors: Ankur Chaudhuri, Sibani Sen Chakraborty

Abstract:

In humans, glutaminyl cyclase activity is highly abundant in neuronal and secretory tissues and is preferentially restricted to the hypothalamus and pituitary. The N-terminal modification of β-amyloid (Aβ) peptides by the generation of pyroglutamyl (pGlu)-modified Aβs (pE-Aβs) is an important process in the initiation of the formation of neurotoxic plaques in Alzheimer's disease (AD). This process is catalyzed by glutaminyl cyclase (QC). The expression of QC is characteristically up-regulated in the early stage of AD, and the hallmark of QC inhibition is the prevention of the formation of pE-Aβs and plaques. A computer-aided drug design (CADD) process was employed to guide the design of potentially active compounds and to understand their inhibitory potency against human glutaminyl cyclase (QC). This work elaborates ligand-based and structure-based pharmacophore exploration of glutaminyl cyclase (QC) using the known inhibitors. Three-dimensional (3D) quantitative structure-activity relationship (QSAR) methods were applied to 154 compounds with known IC50 values. All the inhibitors were divided into two sets: a training set and a test set. Generally, the training set was used to build the quantitative pharmacophore model based on the principle of structural diversity, whereas the test set was employed to evaluate the predictive ability of the pharmacophore hypotheses. A chemical feature-based pharmacophore model was generated from the 92 known training-set compounds by the HypoGen module implemented in the Discovery Studio 2017 R2 software package. The best hypothesis (Hypo1) was selected based upon the highest correlation coefficient (0.8906), the lowest total cost (463.72), and the lowest root mean square deviation (2.24 Å). A higher correlation coefficient indicates greater predictive ability of the hypothesis, whereas a lower root mean square deviation signifies a smaller deviation of experimental activity from the predicted one. The best pharmacophore model (Hypo1) of the candidate inhibitors comprised four features: two hydrogen bond acceptors, one hydrogen bond donor, and one hydrophobic feature. Hypo1 was validated by several parameters, such as test set activity prediction, cost analysis, Fischer's randomization test, the leave-one-out method, and a ligand profiler heat map. The predicted features were then used for virtual screening of potential compounds from the NCI, ASINEX, Maybridge, and Chembridge databases. More than seven million compounds were used for this purpose. The hit compounds were filtered by drug-likeness and pharmacokinetic properties. The selected hits were docked to the high-resolution three-dimensional structure of the target protein glutaminyl cyclase (PDB ID: 2AFU/2AFW) to filter them further. To validate the molecular docking results, the most active compound from the dataset was selected as a reference molecule. From the density functional theory (DFT) study, ten molecules were selected based on their highest HOMO (highest occupied molecular orbital) energies and the lowest band gap values. Molecular dynamics simulations with explicit solvation of the final ten hit compounds revealed that a large number of non-covalent interactions were formed with the binding site of human glutaminyl cyclase. The hit compounds reported in this study could help in the future design of potent lead inhibitors against human glutaminyl cyclase.

Keywords: glutaminyl cyclase, hit lead, pharmacophore model, simulation

Procedia PDF Downloads 134
4468 Mechanical Characterization of Brain Tissue in Compression

Authors: Abbas Shafiee, Mohammad Taghi Ahmadian, Maryam Hoviattalab

Abstract:

The biomechanical behavior of brain tissue is needed for predicting traumatic brain injury (TBI). Each year, over 1.5 million people sustain a TBI in the USA. The appropriate coefficients for injury prediction can be evaluated using experimental data. In this study, an experimental setup for brain soft tissue was developed to perform unconfined compression tests at quasistatic strain rates of 0.0004, 0.008, and 0.4 s-1, as well as a stress relaxation test under unconfined uniaxial compression with a 0.67 s-1 ramp rate. Visco-hyperelastic parameters were fitted using the obtained stress-strain curves. The experimental data were validated using finite element analysis (FEA) and previous findings. The influence of the friction coefficient on the unconfined compression and relaxation tests and the effect of the ramp rate in the relaxation test are also investigated. Results of the findings are implemented in the analysis of a human brain under high acceleration due to impact.

Keywords: brain soft tissue, visco-hyperelastic, finite element analysis (FEA), friction, quasistatic strain rate

Procedia PDF Downloads 660
4467 The Integrated Methodological Development of Reliability, Risk and Condition-Based Maintenance in the Improvement of the Thermal Power Plant Availability

Authors: Henry Pariaman, Iwa Garniwa, Isti Surjandari, Bambang Sugiarto

Abstract:

The availability of a complex system such as a thermal power plant is strongly influenced by the reliability of spare parts and maintenance management policies. Reliability-centered maintenance (RCM) is an established method of analysis and the main reference for maintenance planning. This method considers the consequences of failure in its implementation but does not deal with the further downtime risk associated with failures, loss of production, or high maintenance costs. The risk-based maintenance (RBM) technique provides support strategies to minimize the risks posed by failure and to obtain maintenance tasks considering cost effectiveness. Meanwhile, condition-based maintenance (CBM) focuses on monitoring the conditions that allow the planning and scheduling of maintenance, or other action that should be taken to avoid the risk of failure, prior to time-based maintenance. Implementation of RCM, RBM, or CBM alone, or of RCM combined with RBM or RCM combined with CBM, is a maintenance approach used in thermal power plants. Implementing these three techniques in an integrated maintenance scheme will increase the availability of thermal power plants compared to using the techniques individually or in combinations of two. This study uses reliability-, risk-, and condition-based maintenance in an integrated manner to increase the availability of thermal power plants. The method generates a Maintenance Priority Index (MPI), computed as the Risk Priority Number (RPN) multiplied by the Risk Index (RI), and Failure Defense Tasks (FDT), which can generate condition monitoring and assessment tasks in addition to maintenance tasks. Both the MPI and FDT, obtained from the development of a functional tree, failure mode and effects analysis, fault-tree analysis, and risk analysis (risk assessment and risk evaluation), were then used to develop and implement a maintenance plan and schedule, to monitor and assess the condition, and ultimately to perform availability analysis. The results of this study indicate that reliability-, risk-, and condition-based maintenance methods, applied in an integrated manner, can increase the availability of thermal power plants.

Keywords: integrated maintenance techniques, availability, thermal power plant, MPI, FDT

Procedia PDF Downloads 799
4466 Automated Machine Learning Algorithm Using Recurrent Neural Network to Perform Long-Term Time Series Forecasting

Authors: Ying Su, Morgan C. Wang

Abstract:

Long-term time series forecasting is an important research area for automated machine learning (AutoML). Currently, forecasting based on either machine learning or statistical learning is usually built by experts and requires significant manual effort, from model construction, feature engineering, and hyper-parameter tuning to the assembly of the final time series model. Automation is difficult because there are too many human interventions. To overcome these limitations, this article proposes using recurrent neural networks (RNN), through their memory state, to perform long-term time series prediction. We have shown that this proposed approach is better than the traditional Autoregressive Integrated Moving Average (ARIMA) model. In addition, we also found it to be better than other network systems, including Fully Connected Neural Networks (FNN), Convolutional Neural Networks (CNN), and Non-pooling Convolutional Neural Networks (NPCNN).
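
As a minimal sketch of the idea above — letting an RNN's memory state carry long-range temporal context, with iterated prediction for long horizons — the following Keras fragment trains an LSTM on a toy series; the window size, layer width, and the series itself are illustrative, not the paper's configuration.

```python
import numpy as np
import tensorflow as tf

# Toy series; in practice this would be the target time series.
series = np.sin(np.arange(2000) * 0.02).astype("float32")

WINDOW = 48
X = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
y = series[WINDOW:]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(32),          # memory state carries long-range context
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[..., None], y, epochs=5, batch_size=64, verbose=0)

# Iterated multi-step forecasting: feed each prediction back as input.
window = series[-WINDOW:].tolist()
for _ in range(10):
    nxt = model.predict(np.array(window[-WINDOW:])[None, :, None],
                        verbose=0)[0, 0]
    window.append(float(nxt))
print(window[-10:])
```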

Keywords: automated machine learning, autoregressive integrated moving average, neural networks, time series analysis

Procedia PDF Downloads 111
4465 Estimation of Fourier Coefficients of Flux Density for Surface Mounted Permanent Magnet (SMPM) Generators by Direct Search Optimization

Authors: Ramakrishna Rao Mamidi

Abstract:

For Surface Mounted Permanent Magnet (SMPM) generators, it is essential to predict performance and analyze the magnet's air gap flux density wave shape. The flux density wave shape is neither a pure sine wave nor a square wave, nor a simple combination of the two. This is due to the variation of air gap reluctance between the stator and the permanent magnets. The stator slot openings and the number of slots make the wave shape highly complicated. To reduce the complexity of analysis, the wave shape is approximated using Fourier analysis. In contrast to the traditional integration method, the Fourier coefficients, aₙ and bₙ, are obtained by direct search optimization. The wave shape with optimized coefficients is close to the desired wave shape. Harmonic amplitudes are worked out and compared with initial values. It can be concluded that the direct search method can be used for estimating Fourier coefficients for irregular wave shapes.
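
As an illustration of obtaining aₙ and bₙ by direct search rather than by the classical integrals, the sketch below fits a truncated Fourier series to a sampled wave shape with the Nelder-Mead method (a standard direct search); the target waveform is synthetic, not a measured flux-density plot.

```python
import numpy as np
from scipy.optimize import minimize

theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
# Placeholder flux-density wave shape: flat-topped wave with slot ripple.
B = np.clip(1.3 * np.sin(theta), -1, 1) + 0.05 * np.sin(11 * theta)

N = 7  # number of harmonics retained

def series(coeffs):
    a, b = coeffs[:N], coeffs[N:]
    n = np.arange(1, N + 1)[:, None]
    return a @ np.cos(n * theta) + b @ np.sin(n * theta)

def sse(coeffs):
    return np.sum((B - series(coeffs)) ** 2)

# Nelder-Mead is a classic direct search: no gradients or integrals needed.
res = minimize(sse, x0=np.zeros(2 * N), method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-9})
print("fundamental (a1, b1):", res.x[0].round(4), res.x[N].round(4))
```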

Keywords: direct search, flux plot, Fourier analysis, permanent magnets

Procedia PDF Downloads 220
4464 Applied Theory Building to Achieve Success in Iran Municipalities

Authors: Morteza Rahiminejad

Abstract:

There are over 1200 cities and municipalities all around Iran, including 30 mega cities, which are supervised by municipal organizations, the Interior Ministry, and city councils. Even so, there has been no research on either the process of success or performance assessment in municipalities. In this research, an attempt is made to build a comprehensive theory (or model) to show the reasons for, and the process of, success among local governments. The present research is based on the contingency approach, in which the relevant circumstances are important and both environments and situations call for their own management methods. The research methodology is grounded theory, with Atlas.ti software used as a tool.

Keywords: success, municipality, Iran, theory building

Procedia PDF Downloads 44
4463 Bubble Scrum: How to Run in Organizations That Only Know How to Walk

Authors: Zaheer A. Ali, George Szabo

Abstract:

SCRUM has roots in software and web development and works very well in that space. However, any technical person who has watched a typical waterfall-managed project spiral out of control or into an abyss has thought: "there must be a better way". I will discuss how that thought leads naturally to adopting Agile principles and SCRUM, as well as how Agile and SCRUM can be implemented in large institutions with long histories via a method I developed: Bubble Scrum. We will also see how SCRUM can be implemented in interesting places outside of the technical sphere, and discuss where and how to subtly bring Agility and SCRUM into large, rigid institutions.

Keywords: agile, enterprise-agile, agile at scale, agile transition, project management, scrum

Procedia PDF Downloads 168
4462 Websites for Hypothesis Testing

Authors: Frantisek Mosna

Abstract:

E-learning has become an efficient and widespread means of education in all branches of human activity. Statistics is no exception. Unfortunately, the main focus in statistics teaching is usually on substituting values into formulas. Suitable websites can simplify and automate calculation, freeing attention and time for the basic principles of statistics, the mathematization of real-life situations, and the subsequent interpretation of results. We introduce our own websites for hypothesis testing. Their didactic aspects, the technical possibilities of the tools used to create them, practical experience, and their advantages and disadvantages are discussed in this paper. These websites do not replace common statistical software but significantly improve the teaching of statistics at universities.
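
The sites themselves are built in PHP (see the keywords); as an illustration of the computation such a page automates, here is the equivalent one-sample t-test in Python with invented sample values.

```python
from scipy import stats

# Sample data a student might enter into the web form (illustrative values).
sample = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.4]
mu0 = 5.0                      # hypothesized population mean (H0: mu = mu0)

t_stat, p_value = stats.ttest_1samp(sample, popmean=mu0)
alpha = 0.05
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
print("reject H0" if p_value < alpha else "fail to reject H0")
```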

Keywords: e-learning, hypothesis testing, PHP, web-sites

Procedia PDF Downloads 426