Search results for: search algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3682

2782 Feature Analysis of Predictive Maintenance Models

Authors: Zhaoan Wang

Abstract:

Research in predictive maintenance modeling has improved in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the most important features contributing to the failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear SHAP (SHapley Additive exPlanations) explainer is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on the overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of more important features over less important ones.
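
As a rough illustration of the workflow described above (cross-validated multi-class failure classification followed by a SHAP explanation of feature roles), the following Python sketch is given. The file name, feature layout, choice of a random forest, and use of SHAP's tree explainer (rather than the linear explainer mentioned in the abstract) are all illustrative assumptions, not the authors' setup.

```python
# Minimal sketch: cross-validated failure classification plus SHAP explanation.
# File name, features and the model are hypothetical placeholders.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("sensor_readings.csv")          # hypothetical sensor log
X, y = df.drop(columns=["failure_type"]), df["failure_type"]

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)        # multi-class accuracy per fold
print("CV accuracy:", scores.mean())

clf.fit(X, y)
explainer = shap.TreeExplainer(clf)              # tree explainer for the fitted model
shap_values = explainer.shap_values(X)
shap.summary_plot(shap_values, X)                # global view of feature importance
```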

Keywords: automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation

Procedia PDF Downloads 120
2781 Autonomous Strategic Aircraft Deconfliction in a Multi-Vehicle Low Altitude Urban Environment

Authors: Loyd R. Hook, Maryam Moharek

Abstract:

With the envisioned future growth of low altitude urban aircraft operations for airborne delivery service and advanced air mobility, strategies to coordinate and deconflict aircraft flight paths must be prioritized. Autonomous coordination and planning of flight trajectories is the preferred approach to the future vision in order to increase safety, density, and efficiency over the manual methods employed today. Difficulties arise because any conflict resolution must be constrained by all other aircraft, all airspace restrictions, and all ground-based obstacles in the vicinity. These considerations make pair-wise tactical deconfliction difficult at best and unlikely to find a suitable solution for the entire system of vehicles. In addition, more traditional methods, which rely on long time scales and large protected zones, will artificially limit vehicle density and drastically decrease efficiency. Instead, strategic planning, which is able to respond to highly dynamic conditions and still account for high density operations, will be required to coordinate multiple vehicles in the highly constrained low altitude urban environment. This paper develops and evaluates such a planning algorithm, which can be implemented autonomously across multiple aircraft and situations. Data from this evaluation provide promising results, with simulations showing up to 10 aircraft deconflicted through a relatively narrow low-altitude urban canyon without any vehicle-to-vehicle or obstacle conflict. The algorithm achieves this level of coordination beginning with the assumption that each vehicle is controlled to follow an independently constructed flight path, which is itself free of obstacle conflicts and restricted airspace. Then, by preferring speed-change deconfliction maneuvers constrained by the vehicle's flight envelope, vehicles can remain as close as possible to the originally planned path and prevent cascading vehicle-to-vehicle conflicts. Performing the search for a set of commands which can simultaneously ensure separation for each pair-wise aircraft interaction and optimize the total velocities of all the aircraft is further complicated by the fact that each aircraft's flight plan could contain multiple segments. This means that relative velocities will change when any aircraft achieves a waypoint and changes course. Additionally, the timing of when that aircraft will achieve a waypoint (or, more directly, the order in which all of the aircraft will achieve their respective waypoints) will change with the commanded speed. Put together, the continuous relative velocity of each vehicle pair and the discretized change in relative velocity at waypoints resemble a hybrid reachability problem, a form of control reachability. This paper proposes two methods for finding solutions to these multi-body problems. First, an analytical formulation of the continuous problem is developed with an exhaustive search of the combined state space. However, because of computational complexity, this technique is only computable for pairwise interactions. For more complicated scenarios, including the proposed 10 vehicle example, a discretized search space is used, and a depth-first search with early stopping is employed to find the first solution that solves the constraints.
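
A highly simplified sketch of the discretized search mentioned at the end of the abstract follows: a depth-first search with early stopping over candidate speed commands, checking pairwise separation along straight constant-speed legs. The geometry, separation distance, sampling, and single-segment flight plans are invented for illustration and are not the authors' formulation.

```python
# Sketch: depth-first search with early stopping over discretized speed commands.
# Straight single-segment paths and all numeric values are illustrative assumptions.
import itertools
import numpy as np

def min_separation(starts, goals, speeds, horizon=600.0, dt=1.0):
    """Smallest pairwise distance over time for straight constant-speed legs."""
    t = np.arange(0.0, horizon, dt)
    positions = []
    for p0, p1, v in zip(starts, goals, speeds):
        d = np.linalg.norm(p1 - p0)
        frac = np.clip(v * t / d, 0.0, 1.0)            # progress along the leg
        positions.append(p0 + frac[:, None] * (p1 - p0))
    if len(positions) < 2:
        return float("inf")
    return min(np.linalg.norm(a - b, axis=1).min()
               for a, b in itertools.combinations(positions, 2))

def dfs_speeds(starts, goals, candidates, sep=50.0, chosen=()):
    """Return the first full speed assignment keeping every pair separated."""
    i = len(chosen)
    if i == len(starts):
        return chosen
    for v in candidates:
        trial = chosen + (v,)
        # early stopping: prune as soon as the partial assignment already conflicts
        if min_separation(starts[:i + 1], goals[:i + 1], trial) >= sep:
            result = dfs_speeds(starts, goals, candidates, sep, trial)
            if result is not None:
                return result
    return None

starts = [np.array([0.0, 0.0]), np.array([0.0, 100.0])]
goals = [np.array([1000.0, 100.0]), np.array([1000.0, 0.0])]
print(dfs_speeds(starts, goals, candidates=(20.0, 25.0, 30.0)))   # e.g. (20.0, 25.0)
```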

Keywords: strategic planning, autonomous, aircraft, deconfliction

Procedia PDF Downloads 85
2780 Factors Influencing Soil Organic Carbon Storage Estimation in Agricultural Soils: A Machine Learning Approach Using Remote Sensing Data Integration

Authors: O. Sunantha, S. Zhenfeng, S. Phattraporn, A. Zeeshan

Abstract:

The decline of soil organic carbon (SOC) in global agriculture is a critical issue requiring rapid and accurate estimation for informed policymaking. While it is recognized that SOC predictors vary significantly when derived from remote sensing data and environmental variables, identifying the specific parameters most suitable for accurately estimating SOC in diverse agricultural areas remains a challenge. This study utilizes remote sensing data to precisely estimate SOC and identify influential factors in diverse agricultural areas, such as paddy, corn, sugarcane, cassava, and perennial crops. Extreme gradient boosting (XGBoost), random forest (RF), and support vector regression (SVR) models are employed to analyze these factors' impact on SOC estimation. The results show that the key factors influencing SOC estimation include slope, vegetation indices (EVI), spectral reflectance indices (red index, red edge2), temperature, land use, and surface soil moisture, as indicated by their averaged importance scores across the XGBoost, RF, and SVR models. Using different machine learning algorithms for SOC estimation therefore reveals varying influential factors from remote sensing data and environmental variables, which emphasizes the importance of feature selection for accurate SOC estimation.
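
A compact sketch of the "averaged importance across XGBoost, RF and SVR" idea is shown below. Permutation importance is used here so that all three models, including SVR (which has no built-in importances), can be scored on a common scale; this choice, the file name, and the predictor names are assumptions, not the study's actual pipeline or covariates.

```python
# Sketch: averaging feature importance for SOC regression across three models.
# Permutation importance puts XGBoost, RF and SVR on a common scale; the file
# and predictor names below are placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from xgboost import XGBRegressor

df = pd.read_csv("soc_samples.csv")               # hypothetical training table
features = ["slope", "EVI", "red_index", "red_edge2",
            "temperature", "land_use", "soil_moisture"]
X_tr, X_te, y_tr, y_te = train_test_split(df[features], df["SOC"], random_state=0)

models = {"xgb": XGBRegressor(), "rf": RandomForestRegressor(), "svr": SVR()}
scores = []
for name, model in models.items():
    model.fit(X_tr, y_tr)
    imp = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
    scores.append(imp.importances_mean)

avg = pd.Series(np.mean(scores, axis=0), index=features).sort_values(ascending=False)
print(avg)                                        # averaged importance ranking
```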

Keywords: factors influencing SOC estimation, remote sensing data, environmental variables, machine learning

Procedia PDF Downloads 15
2779 Establishing a Computational Screening Framework to Identify Environmental Exposures Using Untargeted Gas-Chromatography High-Resolution Mass Spectrometry

Authors: Juni C. Kim, Anna R. Robuck, Douglas I. Walker

Abstract:

The human exposome, which includes chemical exposures over the lifetime and their effects, is now recognized as an important measure for understanding human health; however, the complexity of the data makes the identification of environmental chemicals challenging. The goal of our project was to establish a computational workflow for the improved identification of environmental pollutants containing chlorine or bromine. Using the “pattern.search” function available in the R package NonTarget, we wrote a multifunctional script that searches mass spectral clusters from untargeted gas-chromatography high-resolution mass spectrometry (GC-HRMS) for the presence of spectra consistent with chlorine- and bromine-containing organic compounds. The “pattern.search” function was incorporated into a new function that allows the evaluation of clusters containing multiple analyte fragments, has multi-core support, and provides a simplified output listing compounds containing chlorine and/or bromine. The new function was able to process 46,000 spectral clusters in under 8 seconds and identified over 150 potential halogenated spectra. We next applied our function to a deidentified dataset from patients diagnosed with primary biliary cholangitis (PBC), primary sclerosing cholangitis (PSC), and healthy controls. Twenty-two spectra corresponded to potential halogenated compounds in the PSC and PBC dataset, including six significantly different in PBC patients, while four differed in PSC patients. We have developed an improved algorithm for detecting halogenated compounds in GC-HRMS data, providing a strategy for prioritizing exposures in the study of human disease.
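
The following Python snippet is a conceptual sketch of the underlying idea only, not the R “pattern.search” implementation the authors used: it flags peak pairs whose M and M+2 intensities match chlorine- or bromine-like isotope ratios. The ratios, mass shift, and tolerances are approximate assumptions for a single halogen atom.

```python
# Conceptual sketch (not the R "pattern.search" implementation used in the paper):
# flag spectral clusters whose M and M+2 peaks match chlorine- or bromine-like
# isotope intensity ratios. Mass shift, ratios and tolerances are approximate.
M_PLUS_2 = 1.997                       # approx. mass shift for 37Cl/81Br isotopologues
RATIOS = {"Cl": 0.32, "Br": 0.97}      # approx. (M+2)/M intensity for one halogen atom

def halogen_hits(peaks, mz_tol=0.01, ratio_tol=0.15):
    """peaks: list of (mz, intensity) tuples in one spectral cluster."""
    hits = []
    for mz, inten in peaks:
        for mz2, inten2 in peaks:
            if abs((mz2 - mz) - M_PLUS_2) > mz_tol or inten == 0:
                continue
            observed = inten2 / inten
            for element, expected in RATIOS.items():
                if abs(observed - expected) <= ratio_tol:
                    hits.append((mz, element, round(observed, 2)))
    return hits

cluster = [(283.95, 1.0e5), (285.95, 3.3e4), (301.90, 8.0e4), (303.90, 7.8e4)]
print(halogen_hits(cluster))           # one Cl-like hit and one Br-like hit
```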

Keywords: exposome, metabolome, computational metabolomics, high-resolution mass spectrometry, exposure, pollutants

Procedia PDF Downloads 128
2778 Improving Self-Administered Medication Adherence for Older Adults: A Systematic Review

Authors: Mathumalar Loganathan, Lina Syazana, Bryony Dean Franklin

Abstract:

Background: The therapeutic benefit of self-administered medication for long-term use is limited by an average 50% non-adherence rate. Patient forgetfulness is a common factor in unintentional non-adherence. With a growing ageing population, strategies to improve self-administration of medication adherence are essential. Our aim was to review systematically the effects of interventions to optimise self-administration of medication. Method: Databases searched were MEDLINE, EMBASE, PsycINFO, and CINAHL from 1980 to 31 October 2013. Search terms included ‘self-administration’, ‘self-care’, ‘medication adherence’, and ‘intervention’. Two independent reviewers undertook screening and methodological quality assessment, using the Downs and Black rating scale. Results: The search strategy retrieved 6 studies that met the inclusion and exclusion criteria. Three intervention strategies were identified: self-administration of medication programme (SAMP), nursing education and medication packaging (pill calendar). A nursing education programme focused on improving patients’ behavioural self-management of drug prescribing; this was the most studied area, with three studies highlighting an improvement in self-administration of medication. Mixed results were found for SAMP. Medication packaging (pill calendar) was evaluated in one study, showing a significant improvement in self-administration of medication. A meta-analysis could not be performed due to heterogeneity in the outcome measures. Conclusion: Results are mixed and no single interventional strategy has proved to be effective. Nevertheless, the self-administration of medication programme seems to show the most promise. A multi-faceted approach and clearer policy guidelines are likely to be required to improve prescribing for these vulnerable patients.

Keywords: self-administered medication, intervention, prescribing, older patients

Procedia PDF Downloads 316
2777 Mindfulness and Mental Resilience Training for Pilots: Enhancing Cognitive Performance and Stress Management

Authors: Nargiza Nuralieva

Abstract:

The study delves into assessing the influence of mindfulness and mental resilience training on the cognitive performance and stress management of pilots. Employing a meticulous literature search across databases such as Medline and Google Scholar, the study used specific keywords to target a wide array of studies. Inclusion criteria were stringent, focusing on peer-reviewed studies in English that utilized designs like randomized controlled trials, with a specific interest in interventions related to mindfulness or mental resilience training for pilots and measured outcomes pertaining to cognitive performance and stress management. The initial literature search identified a pool of 123 articles, with subsequent screening resulting in the exclusion of 77 based on title and abstract. The remaining 54 articles underwent a more rigorous full-text screening, leading to the exclusion of 41. Additionally, five studies were selected from the World Health Organization's clinical trials database. A total of 11 articles were retained for the meta-analysis, underscoring the study's dedication to a meticulous and robust inclusion process. The interventions varied widely, incorporating mixed approaches, Cognitive Behavioral Therapy (CBT)-based, and mindfulness-based techniques. The analysis uncovered positive effects across these interventions. Specifically, mixed interventions demonstrated a Standardized Mean Difference (SMD) of 0.54, CBT-based interventions showed an SMD of 0.29, and mindfulness-based interventions exhibited an SMD of 0.43. Long-term effects at a 6-month follow-up suggested sustained impacts for both mindfulness-based (SMD: 0.63) and CBT-based interventions (SMD: 0.73), albeit with notable heterogeneity.

Keywords: mindfulness, mental resilience, pilots, cognitive performance, stress management

Procedia PDF Downloads 48
2776 Performance Study of Classification Algorithms for Consumer Online Shopping Attitudes and Behavior Using Data Mining

Authors: Rana Alaa El-Deen Ahmed, M. Elemam Shehab, Shereen Morsy, Nermeen Mekawie

Abstract:

With the growing popularity and acceptance of e-commerce platforms, users face an ever-increasing burden in actually choosing the right product from the large number of online offers. Thus, techniques for personalization and shopping guides are needed by users. For a pleasant and successful shopping experience, users need to know easily which products to buy with high confidence. Since selling a wide variety of products has become easier due to the popularity of online stores, online retailers are able to sell more products than a physical store. The disadvantage is that customers might not find the products they need. In this research, customers are able to find the products they are searching for, because recommender systems are used on some e-commerce websites. A recommender system learns from information about customers and products and provides appropriate personalized recommendations to customers to find the needed product. In this paper, eleven classification algorithms are comparatively tested to find the best classifier fit for consumer online shopping attitudes and behavior in the experimented dataset. The WEKA knowledge analysis tool, an open-source data mining workbench used for comparing conventional classifiers to find the best one, was used in this research. Using WEKA with the experimented classifiers, the results show that the decision table and filtered classifier give the highest accuracy, while classification via clustering and SimpleCart give the lowest.
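
The study uses the Java-based WEKA workbench; as an analogous sketch only, the snippet below compares a handful of classifiers with 10-fold cross-validation in Python. The dataset name, target column, and classifier set are illustrative and are not the eleven classifiers evaluated in the paper.

```python
# Sketch: comparing several classifiers with 10-fold cross-validation, analogous
# to the WEKA comparison in the paper. Dataset and classifier set are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("shopping_survey.csv")             # hypothetical survey export
X, y = df.drop(columns=["will_buy_online"]), df["will_buy_online"]

classifiers = {
    "decision_tree": DecisionTreeClassifier(),
    "naive_bayes": GaussianNB(),
    "knn": KNeighborsClassifier(),
    "logistic_regression": LogisticRegression(max_iter=1000),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=10).mean()  # 10-fold accuracy
    print(f"{name}: {acc:.3f}")
```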

Keywords: classification, data mining, machine learning, online shopping, WEKA

Procedia PDF Downloads 343
2775 The Search for the Self in Psychotherapy: Findings from Relational Theory and Neuroanatomy

Authors: Harry G. Segal

Abstract:

The idea of the “self” has been essential ever since the early modern period in western culture, especially since the development of psychotherapy, but advances in neuroscience and cognitive theory challenge traditional notions of the self. More specifically, neuroanatomists have found no location of “the self” in the brain; instead, consciousness has been posited to be a rapid combination of perception, memory, anticipation of future events, and judgment. In this paper, a theoretical model is presented to address these neuroanatomical findings and to revise the historical understanding of “selfhood” in the practice of psychotherapy.

Keywords: the self, psychotherapy, the self and the brain

Procedia PDF Downloads 96
2774 Development of Star Image Simulator for Star Tracker Algorithm Validation

Authors: Zoubida Mahi

Abstract:

A successful satellite mission in space requires a reliable attitude and orbit control system to command, control and position the satellite in appropriate orbits. Several sensors are used for attitude control, such as magnetic sensors, earth sensors, horizon sensors, gyroscopes, and solar sensors. The star tracker is the most accurate sensor compared to the others, and it is able to offer high-accuracy attitude control without the need for prior attitude information. There are mainly three approaches in star sensor research: digital simulation, hardware-in-the-loop simulation, and field tests of star observation. In the digital simulation approach, all of the processes are done in software, including star image simulation. Hence, it is necessary to develop star image simulation software that can simulate real space environments and various star sensor configurations. In this paper, we present a new stellar image simulation tool that is used to test and validate star sensor algorithms; the developed tool allows the simulation of stellar images with several types of noise, such as background noise, Gaussian noise, Poisson noise and multiplicative noise, and several scenarios that occur in space, such as the presence of the moon, optical system problems, illumination and false objects. On the other hand, we present in this paper a new star extraction algorithm based on a new centroid calculation method. We compared our algorithm with other star extraction algorithms from the literature, and the results obtained show the star extraction capability of the proposed algorithm.
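
A toy sketch of the simulation-plus-extraction idea follows: one star is rendered as a Gaussian point-spread function with background, Poisson, and Gaussian noise, and its position is then recovered with a plain intensity-weighted centre of mass. The parameters and the simple thresholding are illustrative assumptions; this is not the paper's proposed centroid method.

```python
# Toy sketch: simulate one noisy star and recover its centroid by an
# intensity-weighted centre of mass. Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
size, true_xy, sigma = 64, (30.3, 41.7), 1.5

yy, xx = np.mgrid[0:size, 0:size]
psf = 5000.0 * np.exp(-(((xx - true_xy[0]) ** 2 + (yy - true_xy[1]) ** 2)
                        / (2 * sigma ** 2)))          # Gaussian point-spread function
image = rng.poisson(psf + 20.0).astype(float)          # background + Poisson (shot) noise
image += rng.normal(0.0, 3.0, image.shape)             # additive Gaussian read noise

thresh = image.mean() + 5 * image.std()                # crude detection threshold
weights = np.where(image > thresh, image, 0.0)
cx = (weights * xx).sum() / weights.sum()              # intensity-weighted centroid
cy = (weights * yy).sum() / weights.sum()
print(f"true {true_xy}, estimated ({cx:.2f}, {cy:.2f})")
```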

Keywords: star tracker, star simulation, star detection, centroid, noise, scenario

Procedia PDF Downloads 85
2773 The Impact of Project-Based Learning on Underrepresented Minority Students

Authors: Shwadhin Sharma

Abstract:

As there has been increasing focus on the shorter attention span of millennial students, there is a relative absence of instructional tools on behavioral assessments in learning information technology skills within the information systems field and textbooks. This study uses project-based learning, in which students gain knowledge and skills related to information technology by working on an extended project that allows them to find a real business problem, design an information system based on information collected from the company, and develop an information system that solves the problem of the company. Eighty students from two sections of the same course engage in the project from the first week of the class until the sixteenth week to deliver a small business information system that allows them to employ all the skills and knowledge that they learned in the class in the systems they are creating. Computer Information Systems related courses are often difficult to understand and process, especially for underrepresented minority students who have limited computer or information systems related (academic) experience. Project-based learning demands the constant attention of the students and forces them to apply the knowledge learned in the class to a project, which helps retain knowledge. To make sure our assumption is correct, we started with a pre-test and post-test to test the students' learning (of skills) based on the project. Our test showed that almost 90% of the students from the two sections scored higher in the post-test as compared to the pre-test. Based on this premise, we conducted a further survey that measured students' job-search preparation, knowledge of data analysis, involvement with the course, satisfaction with the course, students' overall reaction to the course, and students' ability to meet the traditional learning goals related to the course.

Keywords: project-based learning, job-search preparation, satisfaction with course, traditional learning goals

Procedia PDF Downloads 201
2772 The Impact of Lower Health Literacy on the Self-Management of Patients with Multiple Sclerosis: A Literature Review

Authors: Helga Martins, Idália Matias

Abstract:

Background: Multiple sclerosis is a chronic inflammatory autoimmune demyelinating disease that affects young adults. Multiple sclerosis is a chronic disease in which the patient needs to self-manage the disease and the therapeutic regimen. Consequently, the promotion of health literacy assumes a relevant role for the accessibility, understanding, and use of information in order to promote and maintain the health of patients with multiple sclerosis. Aim: To determine the impact of lower health literacy on the self-management of patients with multiple sclerosis. Methods: Literature review based on a search of the following electronic databases: CINAHL and MEDLINE, comprising all results published between September 2016 and September 2021. The search strategy was: (“Self-management [MeSH]” AND “Multiple sclerosis [MeSH]” AND “Health literacy [MeSH]”). The inclusion criteria were: original papers reporting on multiple sclerosis patients, with participants above 18 years of age, written in English, Spanish, French, or Portuguese. Two independent reviewers performed the screening and analysis of the results. 38 citations were identified, and after duplicate removal, a total of 25 results were screened; 14 were included after the application of the inclusion criteria. Results: Lower health literacy in the self-management of patients with multiple sclerosis is related to less healthy choices, riskier health behavior, poor health outcomes, decreased adherence to the therapeutic regimen after discharge, less self-management of chronic illness, and increased time of hospitalization. Conclusion: Inadequate levels of health literacy contribute to poor health outcomes, unsuccessful self-management of chronic illness, and inadequate adherence to the therapeutic regimen. Therefore, health literacy is important for health policy and healthcare services, as it can be understood as a mediator of self-management of multiple sclerosis.

Keywords: health literacy, multiple sclerosis, review, self-management

Procedia PDF Downloads 143
2771 Performance Evaluation of Parallel Surface Modeling and Generation on Actual and Virtual Multicore Systems

Authors: Nyeng P. Gyang

Abstract:

Even though past, current and future trends suggest that multicore and cloud computing systems are increasingly prevalent and ubiquitous, this class of parallel systems is nonetheless underutilized, in general, and barely used for research on employing parallel Delaunay triangulation for parallel surface modeling and generation, in particular. The performances of actual (physical) and virtual (cloud) multicore systems at executing various algorithms, which implement various parallelization strategies of the incremental insertion technique of the Delaunay triangulation algorithm, were evaluated. T-tests were run on the data collected in order to determine whether various performance metric differences (including execution time, speedup and efficiency) were statistically significant. Results show that the actual machine is approximately twice as fast as the virtual machine at executing the same programs for the various parallelization strategies. Results, which furnish the scalability behaviors of the various parallelization strategies, also show that some of the differences between the performances of these systems, during different runs of the algorithms on the systems, were statistically significant. A few pseudo-superlinear speedup results, which were computed from the raw data collected, are not true superlinear speedup values. These pseudo-superlinear speedup values, which arise as a result of one way of computing speedups, disappear and give way to asymmetric speedups, which are the accurate kind of speedups that occur in the experiments performed.
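
A minimal sketch of the performance metrics and significance test mentioned above (speedup, efficiency, and a t-test between physical and virtual machine run times) is shown below. All timings, the core count, and the serial baseline are made-up placeholders, not the paper's measurements.

```python
# Sketch: computing speedup/efficiency and testing whether run-time differences
# between the physical and virtual machines are significant. Timings are made up.
import numpy as np
from scipy import stats

serial_time = 120.0                      # seconds, hypothetical single-core baseline
cores = 8
actual_runs = np.array([18.2, 18.9, 17.8, 18.5, 18.1])    # parallel times, physical
virtual_runs = np.array([36.5, 35.9, 37.2, 36.8, 36.1])   # parallel times, cloud VM

for label, runs in (("actual", actual_runs), ("virtual", virtual_runs)):
    speedup = serial_time / runs.mean()
    efficiency = speedup / cores
    print(f"{label}: speedup={speedup:.2f}, efficiency={efficiency:.2f}")

t, p = stats.ttest_ind(actual_runs, virtual_runs, equal_var=False)  # Welch's t-test
print(f"t={t:.2f}, p={p:.4f}")
```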

Keywords: cloud computing systems, multicore systems, parallel Delaunay triangulation, parallel surface modeling and generation

Procedia PDF Downloads 198
2770 A Distributed Smart Battery Management System (sBMS) for Stationary Energy Storage Applications

Authors: António J. Gano, Carmen Rangel

Abstract:

Currently, electric energy storage systems for stationary applications have attracted increasing interest, namely with the integration of local renewable energy power sources into energy communities. Li-ion batteries are considered the leading electric storage devices to achieve this integration, and Battery Management Systems (BMS) are decisive for their control and optimum performance. In this work, the development of a smart BMS (sBMS) prototype with a modular distributed topology is described. The system, still under development, has a distributed architecture with modular characteristics to operate with different battery pack topologies and charge capacities, integrating adaptive algorithms for functional state real-time monitoring and management of multicellular Li-ion batteries, and is intended for application in the context of a local energy community fed by renewable energy sources. This sBMS system includes different developed hardware units: (1) cell monitoring units (CMUs) for interfacing with each individual cell or module monitored within the battery pack; (2) a battery monitoring and switching unit (BMU) for global battery pack monitoring, thermal control and functional operating state switching; (3) a main management and local control unit (MCU) for local sBMS management and control, also serving as a communications gateway to external systems and devices. This architecture is fully expandable to battery packs with a large number of cells, or modules, interconnected in series, as the several units have local data acquisition and processing capabilities, communicate over a standard CAN bus, and will be able to operate almost autonomously. The CMU units are intended to be used with Li-ion cells but can be used with other cell chemistries, with output voltages within the 2.5 to 5 V range. The different units' characteristics and specifications are described, including the different implemented hardware solutions. The developed hardware supports both passive and active methods for charge equalization, considered fundamental functionalities for optimizing the performance and the useful lifetime of a Li-ion battery package. The functional characteristics of the different units of this sBMS system, including the acquisition of different process variables using a flexible set of sensors, can support the development of custom algorithms for estimating the parameters defining the functional states of the battery pack (State-of-Charge, State-of-Health, etc.) as well as different charge equalizing strategies and algorithms. This sBMS system is intended to interface with other systems and devices using standard communication protocols, like those used by the Internet of Things. In the future, this sBMS architecture can evolve to a fully decentralized topology, with all the units using Wi-Fi protocols and integrating a mesh network, making the MCU unit unnecessary. The status of the work in progress is reported, leading to conclusions on the system already executed, considering the implemented hardware solution not only as a fully functional, advanced and configurable battery management system but also as a platform for developing custom algorithms and optimizing strategies to achieve better performance of electric energy stationary storage devices.
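
As a very small illustration of the kind of equalization decision such a system has to make, the snippet below selects cells for passive bleed balancing based on a voltage spread threshold. The threshold, voltages, and logic are illustrative assumptions and do not represent the sBMS prototype's firmware.

```python
# Sketch of a passive charge-equalization decision: bleed any cell whose voltage
# exceeds the pack minimum by more than a threshold. Values are illustrative
# assumptions, not the sBMS prototype's firmware logic.
def cells_to_bleed(cell_voltages, threshold_mv=20):
    """Return indices of cells to connect to their bleed resistors."""
    v_min = min(cell_voltages)
    return [i for i, v in enumerate(cell_voltages)
            if (v - v_min) * 1000 > threshold_mv]

pack = [3.652, 3.661, 3.648, 3.699, 3.655]   # Li-ion cell voltages in volts
print(cells_to_bleed(pack))                  # -> [3]: only the high cell is bled
```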

Keywords: Li-ion battery, smart BMS, stationary electric storage, distributed BMS

Procedia PDF Downloads 88
2769 Antecedents of Regret and Satisfaction in Electronic Commerce

Authors: Chechen Liao, Pui-Lai To, Chuang-Chun Liu

Abstract:

Online shopping has become very popular recently. In today's highly competitive online retail environment, retaining existing customers is a necessity for online retailers. This study focuses on the antecedents and consequences of Internet buyer regret and satisfaction in the online consumer purchasing process. This study examines the roles that online consumers' purchasing process evaluations (i.e., search experience difficulty, service-attribute evaluations, product-attribute evaluations and post-purchase price perceptions) and alternative evaluation (i.e., alternative attractiveness) play in determining buyer regret and satisfaction in e-commerce. The study also examines the consequences of regret, satisfaction and habit in regard to repurchase intention. In addition, this study attempts to investigate the moderating role of habit in attaining a better understanding of the relationship between repurchase intention and its antecedents. Survey data collected from 431 online customers are analyzed using structural equation modeling (SEM) with partial least squares (PLS), and support is provided for the hypothesized links. These results indicate that online consumers' purchasing process evaluations (i.e., search experience difficulty, service-attribute evaluations, product-attribute evaluations and post-purchase price perceptions) have significant influences on regret and satisfaction, which in turn influence repurchase intention. In addition, alternative evaluation (i.e., alternative attractiveness) has a significant positive influence on regret. The research model can provide a richer understanding of online customers' repurchase behavior and contribute to both research and practice.

Keywords: online shopping, purchase evaluation, regret, satisfaction

Procedia PDF Downloads 276
2768 NANCY: Combining Adversarial Networks with Cycle-Consistency for Robust Multi-Modal Image Registration

Authors: Mirjana Ruppel, Rajendra Persad, Amit Bahl, Sanja Dogramadzi, Chris Melhuish, Lyndon Smith

Abstract:

Multimodal image registration is a profoundly complex task, which is why deep learning has been used widely to address it in recent years. However, two main challenges remain: firstly, the lack of ground truth data calls for an unsupervised learning approach, which leads to the second challenge of defining a feasible loss function that can compare two images of different modalities to judge their level of alignment. To avoid this issue altogether, we implement a generative adversarial network consisting of two registration networks G_AB, G_BA and two discrimination networks D_A, D_B connected by spatial transformation layers. G_AB learns to generate a deformation field which registers an image of modality B to an image of modality A. To do that, it uses the feedback of the discriminator D_B, which is learning to judge the quality of alignment of the registered image B. G_BA and D_A learn a mapping from modality A to modality B. Additionally, a cycle-consistency loss is implemented. For this, both registration networks are employed twice, resulting in images Â, B̂ which were registered to B̃, Ã, which in turn were registered to the initial image pair A, B. Thus the resulting and initial images of the same modality can be easily compared. A dataset of liver CT and MRI was used to evaluate the quality of our approach and to compare it against learning and non-learning based registration algorithms. Our approach leads to Dice scores of up to 0.80 ± 0.01 and is therefore comparable to and slightly more successful than algorithms like SimpleElastix and VoxelMorph.
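
A compressed PyTorch sketch of the cycle-consistency term follows. The "registration networks" here are stand-in modules that output a warped image directly, rather than a deformation field applied through spatial transformation layers as in the actual model, so the snippet only illustrates how the twice-registered images are compared back to the originals.

```python
# Compressed sketch of the cycle-consistency term. The registration networks are
# stand-ins that output a warped image directly; NANCY predicts deformation
# fields applied through spatial transformation layers.
import torch
import torch.nn as nn

class TinyRegNet(nn.Module):
    """Placeholder registration network: maps (moving, fixed) -> warped moving."""
    def __init__(self):
        super().__init__()
        self.net = nn.Conv2d(2, 1, kernel_size=3, padding=1)

    def forward(self, moving, fixed):
        return self.net(torch.cat([moving, fixed], dim=1))

g_ab, g_ba = TinyRegNet(), TinyRegNet()      # register B->A and A->B
l1 = nn.L1Loss()

a = torch.rand(4, 1, 64, 64)                 # batch of modality-A images (e.g. CT)
b = torch.rand(4, 1, 64, 64)                 # batch of modality-B images (e.g. MRI)

a_tilde = g_ba(a, b)                         # A registered towards B
b_tilde = g_ab(b, a)                         # B registered towards A
a_hat = g_ab(a_tilde, a)                     # registered back towards A
b_hat = g_ba(b_tilde, b)                     # registered back towards B

cycle_loss = l1(a_hat, a) + l1(b_hat, b)     # cycle-consistency penalty
cycle_loss.backward()
```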

Keywords: cycle consistency, deformable multimodal image registration, deep learning, GAN

Procedia PDF Downloads 120
2767 Stochastic Matrices and Lp Norms for Ill-Conditioned Linear Systems

Authors: Riadh Zorgati, Thomas Triboulet

Abstract:

In quite diverse application areas such as astronomy, medical imaging, geophysics or nondestructive evaluation, many problems related to calibration, fitting or estimation of a large number of input parameters of a model from a small amount of output noisy data can be cast as inverse problems. Due to noisy data corruption, insufficient data and model errors, most inverse problems are ill-posed in the Hadamard sense, i.e. existence, uniqueness and stability of the solution are not guaranteed. A wide class of inverse problems in physics relates to the Fredholm equation of the first kind. The ill-posedness of such an inverse problem results, after discretization, in a very ill-conditioned linear system of equations; the condition number of the associated matrix can typically range from 10⁹ to 10¹⁸. This condition number plays the role of an amplifier of uncertainties on data during inversion and thus renders the inverse problem difficult to handle numerically. Similar problems appear in other areas, such as numerical optimization, where using interior point algorithms for solving linear programs leads to ill-conditioned systems of linear equations. Devising efficient solution approaches for such systems of equations is therefore of great practical interest. Efficient iterative algorithms are proposed for solving a system of linear equations. The approach is based on a preconditioning of the initial matrix of the system with an approximation of a generalized inverse, leading to a stochastic preconditioned matrix. This approach, valid for non-negative matrices, is first extended to Hermitian positive semi-definite matrices and then generalized to any complex rectangular matrices. The main results obtained are as follows: 1) We are able to build a generalized inverse of any complex rectangular matrix which satisfies the convergence condition required in iterative algorithms for solving a system of linear equations. This completes the (short) list of generalized inverses having this property, after the Kaczmarz and Cimmino matrices. Theoretical results on both the characterization of the type of generalized inverse obtained and the convergence are derived. 2) Thanks to its properties, this matrix can be efficiently used in different solving schemes such as Richardson-Tanabe or preconditioned conjugate gradients. 3) By using Lp norms, we propose generalized Kaczmarz-type matrices. We also show how Cimmino's matrix can be considered as a particular case consisting in choosing the Euclidean norm in an asymmetrical structure. 4) Regarding numerical results obtained on some pathological well-known test cases (Hilbert, Nakasaka, …), some of the proposed algorithms are empirically shown to be more efficient on ill-conditioned problems and more robust to error propagation than the known classical techniques we have tested (Gauss, Moore-Penrose inverse, minimum residue, conjugate gradients, Kaczmarz, Cimmino). We end with a very early prospective application of our approach based on stochastic matrices, aiming at computing some parameters (such as the extreme values, the mean, the variance, …) of the solution of a linear system prior to its resolution. Such an approach, if it were to be efficient, would be a source of information on the solution of a system of linear equations.
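
For background on the family of methods the abstract builds on and compares against, a plain randomized Kaczmarz iteration on a Hilbert-like test matrix is sketched below; it is the classical row-projection scheme, not the authors' stochastic preconditioned construction, and the matrix size and iteration count are arbitrary.

```python
# Background sketch: classical randomized Kaczmarz iteration for Ax = b, one of
# the methods the proposed stochastic preconditioners are compared with.
import numpy as np

def randomized_kaczmarz(A, b, iters=20000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    row_norms = (A ** 2).sum(axis=1)
    probs = row_norms / row_norms.sum()           # sample rows by squared norm
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # project the current iterate onto the hyperplane of row i
        x = x + (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Mildly ill-conditioned test system (Hilbert-like matrix)
n = 6
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
b = A @ np.ones(n)
x_hat = randomized_kaczmarz(A, b)
print("residual norm:", np.linalg.norm(A @ x_hat - b))
```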

Keywords: conditioning, generalized inverse, linear system, norms, stochastic matrix

Procedia PDF Downloads 123
2766 Association between Substance Use Disorder, PTSD and the Effectiveness of Collaborative Care for Depression in Primary Care: A Systematic Literature Search and Narrative Review

Authors: J. Raub, H. Schillok, L. Kaupe, C. Jung-Sievers, G. Pitschel-Walz, M. Bühner, J. Gensichen, F. D. Pokal-Gruppe

Abstract:

Introduction: In Germany, depression ranks among the top ten diseases with the highest disease burden and often occurs with comorbidities. Collaborative Care (CC), a concept developed in the United States for the primary care management of chronic diseases, has been identified as an efficient model for the treatment of depression in general medicine. A recent meta-analysis highlights research gaps regarding CC in patients with psychiatric multimorbidity. The highest prevalence of psychiatric comorbidities in depression is observed in anxiety disorders, post-traumatic stress disorder (PTSD), and substance use disorders. Methods: We conducted a literature search following the PRISMA guidelines with three components, Collaborative Care, depression and randomized controlled trial, in common databases. We focused on the examination of psychiatric comorbidities in depression, specifically post-traumatic stress disorder (PTSD) and substance use disorder (SUD). Results: During the screening process, we identified nine relevant articles related to PTSD; the number of articles related to substance use disorder (SUD) was ten. We examined a total of 8,634 individuals. Our literature review did not reveal any overall significant superiority of the Collaborative Care model compared to usual care in patients with depression with comorbid substance use disorder (SUD) or post-traumatic stress disorder (PTSD). Discussion: Five studies demonstrate a faster and statistically significant improvement in depression outcomes among patients with substance use disorder (SUD) and post-traumatic stress disorder (PTSD). Currently, several randomized controlled trials on the topic of Collaborative Care in depression with psychiatric comorbidity are ongoing, such as miCare, Claro and COMET.

Keywords: depression, primary care, collaborative care, PTSD, substance use disorder

Procedia PDF Downloads 77
2765 TimeTune: Personalized Study Plans Generation with Google Calendar Integration

Authors: Chevon Fernando, Banuka Athuraliya

Abstract:

The purpose of this research is to provide a solution to students' time management, which usually becomes an issue because students must study while also managing their personal commitments. "TimeTune," an AI-based study planner that helps students organise study timeframes by combining modern machine learning algorithms with calendar applications, is presented as the proposed solution. The research is focused on the development of LSTM models that connect to the Google Calendar API to build learning paths fitted to a student's daily life and study history. A key finding of this research is the success in building an LSTM model to predict optimal study times which, integrated with real-time data from Google Calendar, generates personalized timetables automatically. The methodology encompasses Agile development practices and Object-Oriented Analysis and Design (OOAD) principles, focusing on user-centric design and iterative development. By adopting this method, students can significantly reduce the tension associated with poor study habits and time management. In conclusion, "TimeTune" represents a step forward in personalized education technology: its combination of ML algorithms and calendar integration offers students a practical way to maintain a balanced academic and personal life and to reduce the stress of managing their studies.
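
A minimal PyTorch sketch of the kind of sequence model described is given below: an LSTM trained on a student's recent daily records to score candidate study hours for the next day. The feature encoding, data shapes, and training targets are assumptions, and the Google Calendar integration is not shown; this is not the TimeTune implementation.

```python
# Minimal sketch of an LSTM scoring study hours from a student's recent history.
# Feature encoding and data are illustrative assumptions, not TimeTune itself.
import torch
import torch.nn as nn

class StudyTimeLSTM(nn.Module):
    def __init__(self, n_features=3, hidden=32, n_slots=24):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_slots)   # score for each hour of the next day

    def forward(self, history):                  # history: (batch, days, n_features)
        _, (h, _) = self.lstm(history)
        return self.head(h[-1])                  # logits over 24 hourly slots

model = StudyTimeLSTM()
history = torch.rand(8, 14, 3)                   # 8 students, 14 days, 3 daily features
target = torch.randint(0, 24, (8,))              # observed productive hour per student
loss = nn.CrossEntropyLoss()(model(history), target)
loss.backward()
print(float(loss))
```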

Keywords: personalized learning, study planner, time management, calendar integration

Procedia PDF Downloads 36
2764 Examining How Youth Use Mobile Devices for Health Information: Preliminary Findings of a Survey Study with High School Students in Croatia

Authors: Sung Un Kim, Ivana Martinović, Snježana Stanarević Katavić

Abstract:

As more and more youth use mobile devices, such as tablets and smartphones, for information seeking in their everyday lives, the purpose of this study is to understand the behaviors of youth seeking health information on mobile devices. The specific objectives of this study are to examine 1) for what health issues youth use mobile devices, 2) for what reasons youth use mobile devices to obtain health information, 3) in what ways youth use mobile devices for health information, and 4) the features of health applications that youth find useful. The researchers devised a questionnaire for this study. Four hundred eight students from two high schools, located in Osijek, Croatia, participated by answering the questionnaire (281 girls and 127 boys). The collected data were analyzed using descriptive statistics and content analysis. The results show that among all participants, about 85 percent (n = 344) reported having used mobile devices for health information. The most frequent health topic for which they had been using mobile devices is physical activity (n = 273), followed by eating issues and nutrition (n = 224), mental health (n = 160), sexual health (n = 157), alcohol, drugs, and tobacco (n = 125), safety (n = 96) and particular diseases (n = 62). They use mobile devices to obtain health information due to the ease of use (n = 342), the ease of sharing health information (n = 281), portability (n = 215), timeliness (n = 162), and the ease of tracking/recording/monitoring health status (n = 147). Of those who have used mobile devices for health information, three-quarters (n = 261) use mobile devices to search for health information, while 32.8% (n = 113) use applications and 31.7% (n = 109) browse information. Those who have used applications for health information (n = 113) consider the alert feature (n = 107) as the most useful, followed by the tracking/recording/monitoring feature (n = 92), the customized information feature (n = 86), the video feature (n = 58), and the sharing feature (n = 39). It is notable that although health applications have been actively developed and studied, a majority of the participants search for or browse information on mobile devices instead of using applications. The researchers will discuss reasons why some of them did not use mobile devices to obtain health information, students' concerns about using health applications, and features that they wish to have in health applications.

Keywords: Croatia, health information, information seeking behaviors, mobile devices, youth

Procedia PDF Downloads 390
2763 A Meta-Analysis on the Efficacy and Safety of TRC101/Veverimer 6g/Day in Increasing Serum Bicarbonate Levels of Chronic Kidney Disease Patients with Metabolic Acidosis

Authors: Hazel Ann Gianelli Cu, Stephanie Co, Radcliff Cobankiat

Abstract:

Objectives: TRC101/Veverimer is an orally administered, non-absorbed, sodium- and counterion-free hydrochloric acid binder for the treatment of metabolic acidosis associated with chronic kidney disease. The main objective of this study is to determine the efficacy of TRC101/Veverimer 6g/day in increasing serum bicarbonate levels of chronic kidney disease patients with metabolic acidosis. In this meta-analysis, we also aim to look at safety outcomes and adverse effects, and whether serum bicarbonate levels reached metabolic alkalosis when TRC101/Veverimer was given. Methodology: PubMed, Cochrane, Google Scholar and ScienceDirect were used to search for randomized controlled trials about TRC101/Veverimer use in chronic kidney disease patients with metabolic acidosis. A search strategy according to the PRISMA checklist was carried out, with evaluation of biases and synthesis of results using the Cochrane Review Manager software 5.4. Results: Two randomized controlled trials involving 371 chronic kidney disease patients were included in this study. Results show there was a significant increase in the serum bicarbonate level when TRC101/Veverimer was given compared to the placebo. Both studies had a significant number of participants who completed the studies until the end. A p-value of <0.00001 was reported in both studies, with a confidence interval of 95%. Conclusion: TRC101/Veverimer 6g/day was shown to effectively and safely increase serum bicarbonate or achieve normalization in chronic kidney disease patients with metabolic acidosis as compared with a placebo. This was associated with delayed progression of kidney disease and improvement of physical functioning; however, future studies of longer duration are ideal in order to further assess the long-term advantages and consequences of TRC101/Veverimer.

Keywords: chronic kidney disease, metabolic acidosis, Veverimer, TRC101

Procedia PDF Downloads 187
2762 Production Optimization under Geological Uncertainty Using Distance-Based Clustering

Authors: Byeongcheol Kang, Junyi Kim, Hyungsik Jung, Hyungjun Yang, Jaewoo An, Jonggeun Choe

Abstract:

It is important to figure out reservoir properties for better production management. Due to limited information, there are geological uncertainties in very heterogeneous or channelized reservoirs. One of the solutions is to generate multiple equi-probable realizations using geostatistical methods. However, some models have wrong properties, which need to be excluded for simulation efficiency and reliability. We propose a novel model selection scheme, based on distance-based clustering, for reliable application of a production optimization algorithm. Distance is defined as a degree of dissimilarity between the data. We calculate the Hausdorff distance to classify the models based on their similarity. The Hausdorff distance is useful for shape matching of the reservoir models. We use multi-dimensional scaling (MDS) to describe the models in a two-dimensional space and group them by K-means clustering. Rather than simulating all models, we choose one representative model from each cluster and find the best model, which has production rates similar to the true values. From this process, we can select good reservoir models near the best model with high confidence. We make 100 channel reservoir models using single normal equation simulation (SNESIM). Since oil and gas prefer to flow through the sand facies, it is critical to characterize the pattern and connectivity of the channels in the reservoir. After calculating Hausdorff distances and projecting the models by MDS, we can see that the models assemble depending on their channel patterns. These channel distributions affect the operation controls of each production well, so that the model selection scheme improves the management optimization process. We use one of the useful global search algorithms, particle swarm optimization (PSO), for our production optimization. PSO is good at finding the global optimum of an objective function, but it takes too much time due to its use of many particles and iterations. In addition, if we use multiple reservoir models, the simulation time for PSO soars. By using the proposed method, we can select good and reliable models that already match the production data. Considering the geological uncertainty of the reservoir, we can get well-optimized production controls for maximum net present value. The proposed method offers a novel solution for selecting good cases among the various possibilities. The model selection scheme can be applied not only to production optimization but also to history matching or other ensemble-based methods for efficient simulations.
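
A compact sketch of the selection pipeline described above (pairwise Hausdorff distances, MDS projection to two dimensions, K-means grouping, and one representative per cluster) follows. Random point sets stand in for the channel facies maps of real realizations, and the cluster count and sizes are arbitrary assumptions.

```python
# Sketch of the selection pipeline: Hausdorff distances between realizations,
# 2-D MDS projection, K-means clustering, one representative per cluster.
# Random point sets stand in for channel facies maps; sizes are arbitrary.
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.cluster import KMeans
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
realizations = [rng.random((200, 2)) for _ in range(30)]   # placeholder models

n = len(realizations)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        d = max(directed_hausdorff(realizations[i], realizations[j])[0],
                directed_hausdorff(realizations[j], realizations[i])[0])
        D[i, j] = D[j, i] = d                               # symmetric Hausdorff

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(coords)

# choose the realization closest to each cluster centre as its representative
for k in range(5):
    members = np.where(labels == k)[0]
    centre = coords[members].mean(axis=0)
    rep = members[np.argmin(np.linalg.norm(coords[members] - centre, axis=1))]
    print(f"cluster {k}: representative model {rep}")
```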

Keywords: distance-based clustering, geological uncertainty, particle swarm optimization (PSO), production optimization

Procedia PDF Downloads 139
2761 Mathematical Modelling and AI-Based Degradation Analysis of the Second-Life Lithium-Ion Battery Packs for Stationary Applications

Authors: Farhad Salek, Shahaboddin Resalati

Abstract:

The production of electric vehicles (EVs) featuring lithium-ion battery technology has substantially escalated over the past decade, demonstrating a steady and persistent upward trajectory. The imminent retirement of electric vehicle (EV) batteries after approximately eight years underscores the critical need for their redirection towards recycling, a task complicated by the current inadequacy of recycling infrastructures globally. A potential solution to such concerns involves extending the operational lifespan of electric vehicle (EV) batteries through their utilization in stationary energy storage systems during secondary applications. Such adoption, however, requires addressing the safety concerns associated with batteries' knee points and thermal runaway. This paper develops an accurate mathematical model representative of second-life battery packs at a cell-to-pack scale using an equivalent circuit model (ECM) methodology. Neural network algorithms are employed to forecast the degradation parameters based on the EV batteries' aging history in order to develop a degradation model. The degradation model is integrated with the ECM to reflect the impacts of the cycle aging mechanism on battery parameters during operation. The developed model is tested under real-life load profiles to evaluate the lifespan of the batteries in various operating conditions. The methodology and the algorithms introduced in this paper can be considered the basis for Battery Management System (BMS) design and techno-economic analysis of such technologies.
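
A toy sketch of the equivalent-circuit side of such a model is shown below: a first-order Thevenin ECM stepped in discrete time under a constant-current discharge. All parameter values and the linear open-circuit-voltage curve are made up, and the neural-network degradation model that would adjust these parameters with cycle ageing is not shown.

```python
# Toy first-order Thevenin equivalent circuit model (ECM) stepped in discrete
# time. Parameters and the linear OCV(SOC) curve are made-up placeholders.
import numpy as np

def simulate_ecm(current, dt=1.0, capacity_ah=50.0,
                 r0=0.0015, r1=0.0008, c1=12000.0):
    """current: array of pack current in A (positive = discharge)."""
    soc, v_rc = 1.0, 0.0
    voltages = []
    for i in current:
        ocv = 3.0 + 1.2 * soc                         # crude linear OCV(SOC) stand-in
        v_rc += dt * (i / c1 - v_rc / (r1 * c1))      # RC branch polarisation
        voltages.append(ocv - v_rc - r0 * i)          # terminal voltage
        soc -= i * dt / (capacity_ah * 3600.0)        # coulomb counting
    return np.array(voltages)

profile = np.full(3600, 25.0)                         # one hour at 25 A discharge
v = simulate_ecm(profile)
print(f"start {v[0]:.3f} V, end {v[-1]:.3f} V")
```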

Keywords: second life battery, electric vehicles, degradation, neural network

Procedia PDF Downloads 48
2760 Yawning Computing Using Bayesian Networks

Authors: Serge Tshibangu, Turgay Celik, Zenzo Ncube

Abstract:

Road crashes kill over a million people every year and leave millions more injured or permanently disabled. Various annual reports reveal that the percentage of fatal crashes due to fatigue or the driver falling asleep comes directly after the percentage of fatal crashes due to intoxicated drivers. This percentage is higher than the combined percentage of fatal crashes due to illegal/unsafe U-turns and illegal/unsafe reversing. Although a relatively small percentage of police reports on road accidents highlight drowsiness and fatigue, the importance of these factors is greater than we might think, hidden by the undercounting of their occurrence. Some scenarios show that these factors are significant in accidents with killed and injured people. Thus, there is a need for an automatic driver fatigue detection system in order to considerably reduce the number of accidents owing to fatigue. This research approaches the driver fatigue detection problem in an innovative way by combining cues collected from both temporal analysis of drivers' faces and the environment. Monotony in the driving environment is inter-related with visual symptoms of fatigue on drivers' faces to achieve fatigue detection. Optical and infrared (IR) sensors are used to analyse the monotony in the driving environment and to detect the visual symptoms of fatigue on the human face. Internal cues from drivers' faces and external cues from the environment are combined using machine learning algorithms to automatically detect fatigue.
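
A back-of-envelope sketch of the probabilistic cue-fusion idea is given below: a facial cue (yawning detected) and an environmental cue (monotonous road) are combined through Bayes' rule under a naive independence assumption. All probabilities are invented, and the paper builds full Bayesian networks rather than this two-cue shortcut.

```python
# Back-of-envelope sketch: fusing a facial cue and an environment cue with
# Bayes' rule under a naive independence assumption. Probabilities are invented;
# the paper uses full Bayesian networks rather than this shortcut.
P_FATIGUE = 0.10                       # prior probability the driver is fatigued

# likelihoods P(cue | fatigued) and P(cue | not fatigued)
LIKELIHOODS = {
    "yawning_detected": (0.70, 0.10),
    "monotonous_road":  (0.60, 0.30),
}

def posterior_fatigue(observed_cues):
    p_f, p_nf = P_FATIGUE, 1.0 - P_FATIGUE
    for cue in observed_cues:
        l_f, l_nf = LIKELIHOODS[cue]
        p_f, p_nf = p_f * l_f, p_nf * l_nf   # naive-Bayes evidence update
    return p_f / (p_f + p_nf)

print(posterior_fatigue(["yawning_detected"]))                      # ~0.44
print(posterior_fatigue(["yawning_detected", "monotonous_road"]))   # ~0.61
```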

Keywords: intelligent transportation systems, Bayesian networks, yawning computing, machine learning algorithms

Procedia PDF Downloads 450
2759 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection

Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye

Abstract:

Skew detection and correction form an important part of digital document analysis. This is because uncompensated skew can deteriorate document features and can complicate further document image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed even at a small angle. Once documents have been digitized through the scanning system and binarization has also been achieved, document skew correction is required before further image analysis. Research efforts have been put into this area, with algorithms developed to eliminate document skew. Skew angle correction algorithms can be compared based on performance criteria. The most important performance criteria are the accuracy of skew angle detection, the range of skew angles for detection, the speed of processing the image, computational complexity and, consequently, memory space used. The standard Hough Transform has successfully been implemented for the text document skew angle estimation application. However, the accuracy of the standard Hough Transform algorithm depends largely on how fine the angle step size is. This consequently consumes more time and memory space for increased accuracy, especially where the number of pixels is considerably large. Whenever the Hough Transform is used, there is always a trade-off between accuracy and speed. So a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough Transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified Hough Transform algorithm presents a solution to the conflict between memory space, running time and accuracy. Our algorithm starts with a first step of angle estimation accurate to zero decimal places using the standard Hough Transform algorithm, achieving minimal running time and space but lacking relative accuracy. Then, to increase accuracy, supposing the estimated angle found using the basic Hough algorithm is x degrees, we re-run the basic algorithm over a narrow range around x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. The procedure of our skew estimation and correction algorithm for text images is implemented using MATLAB. The memory space estimation and processing time are also tabulated, with the skew angle assumed to lie between 0° and 45°. The simulation results, demonstrated in MATLAB, show the high performance of our algorithms, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity.
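
The paper's implementation is in MATLAB; as an illustration of the coarse-to-fine idea only, the Python sketch below runs a 1-degree Hough sweep followed by a 0.1-degree sweep around the coarse estimate, using scikit-image's Hough line transform on a synthetic striped "document". The test image, angle range, and sign convention are assumptions.

```python
# Python sketch of the coarse-to-fine skew search (the paper uses MATLAB):
# a coarse 1-degree Hough sweep, then a 0.1-degree sweep around the coarse
# estimate. The test image and angle range are illustrative.
import numpy as np
from skimage.transform import hough_line, rotate

def dominant_theta(binary_img, thetas_deg):
    hspace, _, _ = hough_line(binary_img, theta=np.deg2rad(thetas_deg))
    _, best_col = np.unravel_index(np.argmax(hspace), hspace.shape)
    return thetas_deg[best_col]

def estimate_skew(binary_img):
    # coarse pass: 1-degree steps; near-horizontal text lines give theta ~ 90 deg
    coarse = dominant_theta(binary_img, np.arange(45.0, 135.0, 1.0))
    # fine pass: 0.1-degree steps in a narrow band around the coarse estimate
    fine = dominant_theta(binary_img, np.arange(coarse - 1.0, coarse + 1.0, 0.1))
    return 90.0 - fine            # skew of the text lines from horizontal

# toy "document": horizontal text-like stripes, rotated by a known skew
doc = np.zeros((200, 200))
doc[40:45, 20:180] = doc[90:95, 20:180] = doc[140:145, 20:180] = 1.0
skewed = rotate(doc, angle=3.4) > 0.5

print(estimate_skew(skewed))      # close to the 3.4-degree skew applied above
```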

Keywords: Hough transform, skew detection, skew angle, skew correction, text document

Procedia PDF Downloads 149
2758 Unveiling Comorbidities in Irritable Bowel Syndrome: A UK BioBank Study Utilizing Supervised Machine Learning

Authors: Uswah Ahmad Khan, Muhammad Moazam Fraz, Humayoon Shafique Satti, Qasim Aziz

Abstract:

Approximately 10-14% of the global population experiences a functional disorder known as irritable bowel syndrome (IBS). The disorder is defined by persistent abdominal pain and an irregular bowel pattern. IBS significantly impairs work productivity and disrupts patients' daily lives and activities. Although IBS is widespread, there is still an incomplete understanding of its underlying pathophysiology. This study aims to help characterize the phenotype of IBS patients by differentiating the comorbidities found in IBS patients from those in non-IBS patients using machine learning algorithms. In this study, we extracted samples coding for IBS from the UK BioBank cohort and randomly selected patients without a code for IBS to create a total sample size of 18,000. We selected the codes for comorbidities of these cases from 2 years before and after their IBS diagnosis and compared them to the comorbidities in the non-IBS cohort. Machine learning models, including Decision Trees, Gradient Boosting, Support Vector Machine (SVM), AdaBoost, Logistic Regression, and XGBoost, were employed to assess their accuracy in predicting IBS. The most accurate model was then chosen to identify the features associated with IBS. In our case, we used XGBoost feature importance as a feature selection method. We applied different models to the top 10% of features, which numbered 50. Gradient Boosting, Logistic Regression and XGBoost algorithms yielded a diagnosis of IBS with an optimal accuracy of 71.08%, 71.427%, and 71.53%, respectively. The comorbidities most closely associated with IBS included gut diseases (haemorrhoids, diverticular diseases), atopic conditions (asthma), and psychiatric comorbidities (depressive episodes or disorder, anxiety). This finding emphasizes the need for a comprehensive approach when evaluating the phenotype of IBS, suggesting the possibility of identifying new subsets of IBS rather than relying solely on the conventional classification based on stool type. Additionally, our study demonstrates the potential of machine learning algorithms in predicting the development of IBS based on comorbidities, which may enhance diagnosis and facilitate better management of modifiable risk factors for IBS. Further research is necessary to confirm our findings and establish cause and effect. Alternative feature selection methods and even larger and more diverse datasets may lead to more accurate classification models. Despite these limitations, our findings highlight the effectiveness of Logistic Regression and XGBoost in predicting IBS diagnosis.
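
A sketch of the feature-selection step described above follows: rank comorbidity features by XGBoost importance, keep the top 10%, and retrain simpler models on that subset. The file name, feature layout, and evaluation details are placeholders and do not reflect the actual UK BioBank extraction or training protocol.

```python
# Sketch of the selection step: rank comorbidity features by XGBoost importance,
# keep the top 10%, and retrain models on that subset. Data and columns are
# hypothetical placeholders, not the UK BioBank extraction.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

df = pd.read_csv("ibs_comorbidities.csv")        # hypothetical 0/1 comorbidity codes
X, y = df.drop(columns=["ibs"]), df["ibs"]

ranker = XGBClassifier(eval_metric="logloss").fit(X, y)
importance = pd.Series(ranker.feature_importances_, index=X.columns)
top = importance.sort_values(ascending=False).head(max(1, len(X.columns) // 10))

X_top = X[top.index]                             # top 10% of features (~50 in the paper)
for name, model in {"logreg": LogisticRegression(max_iter=1000),
                    "xgb": XGBClassifier(eval_metric="logloss")}.items():
    acc = cross_val_score(model, X_top, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```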

Keywords: comorbidities, disease association, irritable bowel syndrome (IBS), predictive analytics

Procedia PDF Downloads 103
2757 Transformer Fault Diagnostic Predicting Model Using Support Vector Machine with Gradient Descent Optimization

Authors: R. O. Osaseri, A. R. Usiobaifo

Abstract:

The power transformer, which is responsible for voltage transformation, is of great relevance in the power system, and oil-immersed transformers are widely used all over the world. Prompt and proper maintenance of the transformer is of utmost importance. The dissolved gas content in power transformer oil is of enormous importance in detecting incipient faults of the transformer. Accurate prediction of incipient faults from transformer oil is needed to facilitate prompt maintenance, reduce cost, and minimize error. Fault prediction and diagnosis have been the focus of many researchers, and many previous works have reported the use of artificial intelligence to predict incipient transformer faults. In this study, a machine learning technique employing gradient descent algorithms and a Support Vector Machine (SVM) is used for incipient fault diagnosis of transformers. The method focuses on creating a system that improves its performance based on previous results and historical data. The system design approach has two phases: training and testing. The gradient descent algorithm is trained on a training dataset, and the learned model is then applied to a set of new data; these two datasets are used to verify the accuracy of the proposed model. This study presents a transformer fault diagnostic model based on a Support Vector Machine (SVM) and gradient descent algorithms with satisfactory diagnostic capability, achieving a higher percentage in predicting incipient transformer faults than existing diagnostic methods.
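
The paper does not spell out its implementation, so the sketch below uses scikit-learn's SGDClassifier (a linear SVM with hinge loss fitted by stochastic gradient descent) as a stand-in; the gas features, split ratio, and hyperparameters are illustrative assumptions:

from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

def train_dga_svm(X, y):
    # X: dissolved-gas concentrations per oil sample (e.g. H2, CH4, C2H2, C2H4, C2H6),
    # y: fault class labels. Feature names are assumptions for illustration.
    # Linear SVM (hinge loss) fitted by stochastic gradient descent.
    model = make_pipeline(StandardScaler(),
                          SGDClassifier(loss="hinge", max_iter=2000, tol=1e-4))
    # Training and testing phases on separate splits of the data.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    model.fit(X_train, y_train)
    return model, model.score(X_test, y_test)   # held-out accuracy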

Keywords: diagnostic model, gradient descent, machine learning, support vector machine (SVM), transformer fault

Procedia PDF Downloads 314
2756 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies

Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk

Abstract:

Recently, the application of AI-powered algorithms in healthcare has continued to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount to the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has led to a significant focus on the security risks to healthcare data and the measures that protect its security and privacy, resulting in escalated analyses and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents the best-in-class techniques used to preserve the data privacy of AI-powered healthcare applications. Popular privacy-protecting methods such as federated learning, cryptographic techniques, differential privacy methods, and hybrid methods are discussed together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. This inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
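
As one concrete illustration of the differential privacy methods mentioned above, the following Python sketch releases a patient count under the Laplace mechanism; the record layout, predicate, and epsilon value are illustrative assumptions and not part of the project:

import numpy as np

def private_count(records, predicate, epsilon=0.5, rng=None):
    """Release a count under epsilon-differential privacy by adding Laplace
    noise calibrated to the query's sensitivity (1 for a counting query)."""
    rng = rng or np.random.default_rng()
    true_count = sum(1 for r in records if predicate(r))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)   # sensitivity / epsilon
    return true_count + noise

# e.g. the number of records carrying a hypothetical diagnosis code, released with noise:
# private_count(patients, lambda r: r["diagnosis"] == "E11", epsilon=0.5)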

Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)

Procedia PDF Downloads 74
2755 Integrated Intensity and Spatial Enhancement Technique for Color Images

Authors: Evan W. Krieger, Vijayan K. Asari, Saibabu Arigela

Abstract:

Video imagery captured for real-time security and surveillance applications is typically acquired in complex lighting conditions. These less-than-ideal conditions can produce imagery with underexposed or overexposed regions, and the video is often too low in resolution for certain applications. The purpose of security and surveillance video is to allow accurate conclusions to be drawn from the images seen in the video; therefore, if poor lighting and low-resolution conditions occur in the captured video, the ability to make accurate conclusions from the received information is reduced. We propose a solution to this problem that uses image preprocessing to improve these images before use in a particular application. The proposed algorithm integrates an intensity enhancement algorithm with a super resolution technique. The intensity enhancement portion consists of a nonlinear inverse sign transformation and an adaptive contrast enhancement. The super resolution portion is a single-image super resolution technique: a Fourier phase feature-based method that uses a machine learning approach with kernel regression. The proposed technique intelligently integrates these algorithms to produce a high-quality output while also being more efficient than applying them sequentially. This integration is accomplished by performing the proposed algorithm on the intensity image derived from the original color image. After enhancement and super resolution, a color restoration technique is employed to obtain an improved-visibility color image.
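
The sketch below shows the overall shape of such a pipeline in Python: only the intensity channel is enhanced and upscaled, and colour is restored afterwards. CLAHE and bicubic resizing are placeholders standing in for the paper's nonlinear enhancement and Fourier-phase super resolution, which are not reproduced here:

import numpy as np
from skimage import color, exposure, transform

def enhance_color_image(rgb, scale=2):
    """Enhance and upscale the intensity channel only, then restore colour
    by carrying over the original hue and saturation channels."""
    hsv = color.rgb2hsv(rgb)
    v = hsv[..., 2]
    v_enh = exposure.equalize_adapthist(v)                    # placeholder enhancement
    v_sr = transform.resize(v_enh, (v.shape[0] * scale,       # placeholder super resolution
                                    v.shape[1] * scale), order=3)
    hsv_up = transform.resize(hsv, v_sr.shape + (3,), order=3)
    hsv_up[..., 2] = v_sr                                      # colour restoration: keep H and S
    return color.hsv2rgb(np.clip(hsv_up, 0.0, 1.0))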

Keywords: dynamic range compression, multi-level Fourier features, nonlinear enhancement, super resolution

Procedia PDF Downloads 543
2754 Building User Behavioral Models by Processing Web Logs and Clustering Mechanisms

Authors: Madhuka G. P. D. Udantha, Gihan V. Dias, Surangika Ranathunga

Abstract:

Today's websites contain many interesting applications, but there are only a few methodologies for analyzing user navigation through a website and determining whether the website is being put to correct use. Web logs are usually examined only when a major attack or malfunction occurs, yet they contain a great deal of interesting information about users of the system. Analyzing web logs has become a challenge due to the huge log volume, and finding interesting patterns is not easy due to the size, distribution, and importance of the minor details in each log. Web logs contain very important data about users and the site that have not been put to good use. Retrieving interesting information from the logs gives an idea of what users need, allows users to be grouped according to their various needs, and helps improve the site into an effective and efficient one. The model we built is able to detect attacks or malfunctioning of the system and to perform anomaly detection. Logs become more complex as the volume of traffic and the size and complexity of the website grow. Unsupervised techniques are used in this fully automated solution; expert knowledge is used only in validation. In our approach, we first clean and purify the logs to bring them to a common platform with a standard format and structure. After the cleaning module, the web session builder is executed. It outputs two files: a Web Sessions file and an Indexed URLs file. The Indexed URLs file contains the list of URLs accessed and their indices, while the Web Sessions file lists the indices of the URLs in each web session. Then the DBSCAN and EM algorithms are applied iteratively and recursively to obtain the best clustering of the web sessions. Using homogeneity, completeness, V-measure, intra- and inter-cluster distance, and the silhouette coefficient as evaluation measures, these algorithms self-evaluate to select better parameter values for subsequent runs; if a cluster is found to be too large, micro-clustering is used. Using the Cluster Signature Module, the clusters are annotated with a unique signature called a fingerprint. In this module, each cluster is fed to the Associative Rule Learning Module; if it outputs a confidence and support of 1 for an access sequence, that sequence is a potential signature for the cluster. The occurrences of the access sequence are then checked in the other clusters, and if it is found to be unique to the cluster considered, the cluster is annotated with the signature. These signatures are used in anomaly detection, preventing cyber attacks, real-time dashboards that visualize users accessing web pages, predicting user actions, and various other applications in finance, university websites, news and media websites, etc.
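
A simplified Python sketch of the clustering stage follows: DBSCAN and EM (as a Gaussian mixture) are run over a small parameter grid and the labelling with the best silhouette coefficient is kept. The session-vector representation, the parameter grids, and the use of the silhouette alone (rather than the full set of measures above) are assumptions for illustration:

import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.mixture import GaussianMixture
from sklearn.metrics import silhouette_score

def cluster_sessions(session_vectors, eps_grid=(0.3, 0.5, 0.8), k_grid=(2, 4, 8)):
    """session_vectors: one numeric feature vector per web session,
    e.g. counts of indexed URLs. Returns (labels, silhouette)."""
    best = (None, -1.0)
    for eps in eps_grid:
        labels = DBSCAN(eps=eps, min_samples=5).fit_predict(session_vectors)
        if len(set(labels)) > 1:   # silhouette needs at least two clusters
            best = max(best, (labels, silhouette_score(session_vectors, labels)),
                       key=lambda t: t[1])
    for k in k_grid:
        labels = GaussianMixture(n_components=k).fit_predict(session_vectors)
        if len(set(labels)) > 1:
            best = max(best, (labels, silhouette_score(session_vectors, labels)),
                       key=lambda t: t[1])
    return best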

Keywords: anomaly detection, clustering, pattern recognition, web sessions

Procedia PDF Downloads 279
2753 Heuristics for Optimizing Power Consumption in the Smart Grid

Authors: Zaid Jamal Saeed Almahmoud

Abstract:

Our increasing reliance on electricity, combined with inefficient consumption trends, has resulted in several economic and environmental threats, including wasting billions of dollars, draining limited resources, and elevating the impact of climate change. As a solution, the smart grid is emerging as the future power grid, with smart techniques to optimize power consumption and electricity generation. Minimizing the peak power consumption under a fixed delay requirement is a significant problem in the smart grid, and matching demand to supply is a key requirement for the success of the future electricity grid. In this work, we consider the problem of minimizing the peak demand under appliance constraints by scheduling power jobs with uniform release dates and deadlines. As the problem is known to be NP-hard, we propose two versions of a heuristic algorithm for solving it. Our theoretical analysis and experimental results show that the proposed heuristics outperform existing methods by providing a better approximation to the optimal solution. In addition, we consider dynamic pricing methods to minimize the peak load and match demand to supply in the smart grid. Our contribution is the proposal of generic as well as customized pricing heuristics to minimize the peak demand and match demand with supply. We also propose optimal pricing algorithms that can be used when the maximum deadline period of the power jobs is relatively small. Finally, we provide theoretical analysis and conduct several experiments to evaluate the performance of the proposed algorithms.
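
As an illustration of the kind of heuristic involved (not the paper's algorithm), the Python sketch below greedily places jobs that share a release date and deadline wherever they raise the running load profile the least; the job representation and horizon are assumptions:

import numpy as np

def greedy_peak_schedule(jobs, horizon):
    """jobs: list of (power, duration) pairs, all released at time 0 with a
    common deadline `horizon` (uniform release dates and deadlines).
    Jobs are placed in decreasing order of power at the start time that
    minimizes the resulting peak load."""
    load = np.zeros(horizon)
    schedule = []
    for power, duration in sorted(jobs, reverse=True):
        # Peak that would result from each feasible start time.
        peaks = [max(load[s:s + duration].max() + power, load.max())
                 for s in range(horizon - duration + 1)]
        start = int(np.argmin(peaks))
        load[start:start + duration] += power
        schedule.append((power, duration, start))
    return schedule, load.max()

# Example: ten 1 kW jobs of 2 slots each in a 12-slot horizon get spread out,
# giving a far lower peak than starting them all at slot 0.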

Keywords: heuristics, optimization, smart grid, peak demand, power supply

Procedia PDF Downloads 82