Search results for: software fault prediction
6755 A Robust Software for Advanced Analysis of Space Steel Frames
Authors: Viet-Hung Truong, Seung-Eock Kim
Abstract:
This paper presents a robust software package for practical advanced analysis of space steel framed structures. The pre- and post-processors of the package are coded in C++, while the solver is written in FORTRAN. A user-friendly graphical interface facilitates the modeling process and the interpretation of results. The solver employs stability functions to capture second-order effects, minimizing modeling and computational time. Both plastic-hinge and fiber-hinge beam-column elements are available. The generalized displacement control method is adopted to solve the nonlinear equilibrium equations.
Keywords: advanced analysis, beam-column, fiber-hinge, plastic hinge, steel frame
Procedia PDF Downloads 308
6754 Stock Market Prediction by Regression Model with Social Moods
Authors: Masahiro Ohmura, Koh Kakusho, Takeshi Okadome
Abstract:
This paper presents a regression model with autocorrelated errors whose inputs are social moods obtained by analyzing the adjectives in Twitter posts with a document topic model. The regression model predicts the Dow Jones Industrial Average (DJIA) more precisely than autoregressive moving-average models.
Keywords: stock market prediction, social moods, regression model, DJIA
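To make the modeling idea concrete, the following is a minimal sketch of a regression with AR(1) errors of the kind the abstract describes, using statsmodels' GLSAR; the mood scores and DJIA series are synthetic placeholders, not the authors' data.

```python
# Minimal sketch: DJIA regressed on daily social-mood scores with an AR(1)
# error term. The mood columns and series are hypothetical placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 250                                    # roughly one trading year
moods = rng.normal(size=(n, 3))            # hypothetical topic-model mood scores
djia = 34000 + moods @ np.array([120.0, -80.0, 45.0]) + np.cumsum(rng.normal(size=n))

X = sm.add_constant(moods)                 # intercept + mood regressors
model = sm.GLSAR(djia, X, rho=1)           # regression with AR(1) errors
result = model.iterative_fit(maxiter=10)   # alternate OLS and rho estimation
print(result.params, model.rho)
```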
Procedia PDF Downloads 549
6753 Improving Software Technology to Support Release Process in Global Software Development Environment: An Experience Report
Authors: Hualter Barbosa, Bruno Bonifacio
Abstract:
The process of globalization and new business models have transformed the dynamics of software development. To meet the new demands, the software industry has adopted methodologies that shorten development cycles and ensure greater competitiveness. In this scenario, Global Software Development (GSD) has become a strategic element for the success of new products. However, reliability, timeliness, and perceived value can be influenced substantially by automating steps of the development process. In this sense, new technologies can help developers and managers improve the quality of development. This paper reports on the improvement of one of the release process activities of Sidia's mobile product area using software technology. The objective is to present the improvement of the CLCATCH tool, developed on the basis of experimental studies and qualitative analysis of the points of improvement for the release process in Android update projects for Samsung mobile devices. The results show improvement for the new version and approach of the tool, with points that can facilitate new features of the proposed technology.
Keywords: Android update, empirical studies, GSD, process improvement
Procedia PDF Downloads 143
6752 The Development of Portable Application Software for Cardiovascular Fitness Norms of NDUM Cadet Students
Authors: Mohar Kassim, Hardy Azmir, Rahmat Sholihin Mokhtar
Abstract:
The purpose of this study is to build portable application software to determine the level of cardiovascular fitness of cadet students of the National Defence University of Malaysia (NDUM). Fitness in the context of this study refers to physical fitness, specifically the cardiovascular endurance test battery in the form of a 2.4 km run test for UPNM cadet students. The run test will be conducted to measure, test, and evaluate the performance of UPNM cadet students. All run test results can be recorded electronically in the portable software, which will then show the level of cardiovascular fitness of every cadet student according to age and gender. The software can also calculate the body mass index (BMI). A normative survey method will be used in this study through the analysis of the 2.4 km run test results, with the run test scores classified on interval and ratio scales. Based on the findings of this study, portable application software will be produced. The software will directly assist the Military Training Academy (ALK), the Malaysian Armed Forces (ATM), and other relevant agencies in determining the level of cardiovascular fitness among their staff. The test can be done electronically and in portable mode. The next step is to have this application patented.
Keywords: development, software, application, portable, fitness norms, cardiovascular endurance
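As an illustration of the BMI feature mentioned above, a minimal sketch follows; the classification thresholds shown are the standard WHO adult cut-offs, which may differ from what the NDUM software uses.

```python
# Minimal sketch of the BMI computation; thresholds are the WHO adult
# categories, assumed here for illustration.
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal"
    if value < 30.0:
        return "overweight"
    return "obese"

print(bmi(70.0, 1.75))                 # ~22.9
print(bmi_category(bmi(70.0, 1.75)))   # "normal"
```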
Procedia PDF Downloads 550
6751 Stakeholder Management for Successful Software Projects
Authors: Kassem Saleh
Abstract:
An alarming number of software projects fail to deliver the required functionalities within the allotted budget and timeframe and with the required qualities. Among the main reasons for this are bad stakeholder management, poor communications, and informal change management, all rooted in informal processes for identifying, engaging, and controlling stakeholders. Recently, to emphasize its importance, the Project Management Institute (PMI) updated the Project Management Body of Knowledge (PMBoK) to explicitly include the stakeholder management knowledge area. This knowledge area consists of four processes: identify stakeholders, plan stakeholder management, manage stakeholder engagement, and control stakeholder engagement. The use of appropriate techniques for stakeholder management in software projects leads to higher-quality, more successful software. In this paper, we describe some of the proven techniques that can be used during the execution of the four stakeholder management processes. Development of collaboration tools for automating these processes is recommended, and such tools need to be integrated into available software project management tools.
Keywords: project management, stakeholder management, software development, project management body of knowledge
Procedia PDF Downloads 313
6750 A Comparative Analysis of the Performance of COSMO and WRF Models in Quantitative Rainfall Prediction
Authors: Isaac Mugume, Charles Basalirwa, Daniel Waiswa, Mary Nsabagwa, Triphonia Jacob Ngailo, Joachim Reuder, Schättler Ulrich, Musa Semujju
Abstract:
Numerical weather prediction (NWP) models are considered powerful tools for guiding quantitative rainfall prediction. Several NWP models exist and are used at many operational weather prediction centers. This study considers two of them: the Consortium for Small-scale Modeling (COSMO) model and the Weather Research and Forecasting (WRF) model. It compares the models' ability to predict rainfall over Uganda for the period 21 April 2013 to 10 May 2013 using the root mean square error (RMSE) and the mean error (ME), assessing in particular their ability to predict light rainfall events and extreme rainfall events. All experiments used the default parameterization configurations and the same horizontal resolution (7 km). The results show that the COSMO model had a tendency to predict no rain in many cases, which explains its under-prediction. The COSMO model (RMSE: 14.16; ME: -5.91) presented a significantly (p = 0.014) higher magnitude of error than the WRF model (RMSE: 11.86; ME: -1.09). However, the COSMO model (RMSE: 3.85; ME: 1.39) performed significantly (p = 0.003) better than the WRF model (RMSE: 8.14; ME: 5.30) in simulating light rainfall events. Both models under-predicted extreme rainfall events, with the COSMO model (RMSE: 43.63; ME: -39.58) presenting significantly higher error magnitudes than the WRF model (RMSE: 35.14; ME: -26.95). This study recommends additional diagnosis of the models' treatment of deep convection over the tropics.
Keywords: comparative performance, the COSMO model, the WRF model, light rainfall events, extreme rainfall events
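For reference, the two verification scores used in the comparison can be computed as below; the rainfall values are hypothetical. RMSE penalizes large misses quadratically, while ME is signed, so the negative values reported above indicate under-prediction.

```python
# Minimal sketch of the RMSE and ME (bias) verification scores.
import numpy as np

def rmse(forecast: np.ndarray, observed: np.ndarray) -> float:
    return float(np.sqrt(np.mean((forecast - observed) ** 2)))

def mean_error(forecast: np.ndarray, observed: np.ndarray) -> float:
    return float(np.mean(forecast - observed))

obs = np.array([0.0, 2.5, 10.0, 45.0])   # hypothetical daily rainfall (mm)
fc = np.array([0.0, 1.0, 6.0, 20.0])     # hypothetical model forecast (mm)
print(rmse(fc, obs), mean_error(fc, obs))  # ME < 0 indicates under-prediction
```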
Procedia PDF Downloads 262
6749 A Systematic Review of Process Research in Software Engineering
Authors: Tulasi Rayasa, Phani Kumar Pullela
Abstract:
A systematic review is a research method that involves collecting and evaluating the information on a specific topic in order to provide a comprehensive and unbiased review. This type of review aims to improve the software development process by ensuring that the research is thorough and accurate. To ensure objectivity, it is important to follow systematic guidelines and consider multiple sources, such as literature reviews, interviews, and surveys. The evaluation process should also be streamlined by incorporating research from journals and other sources, such as grey literature. The main goal of a systematic review is to identify the consistency of current models in the field of computer application and software engineering.
Keywords: computer application, software engineering, process research, data science
Procedia PDF Downloads 99
6748 Comparative Analysis of Automation Testing Tools
Authors: Amit Bhanushali
Abstract:
In the ever-changing landscape of software development, automated software testing has emerged as a critical component of the Software Development Life Cycle (SDLC). This research undertakes a comparative study of three major automated testing tools, UFT, Selenium, and RPA, evaluating them on usability, maintenance, and effectiveness. Leveraging existing Java-based applications as test cases, the study aims to guide testers in selecting the optimal tool for specific applications. By exploring key features such as source and licensing, testing expenses, object repositories, usability, and language support, the research provides practical insights into UFT, Selenium, and RPA. Acknowledging the pivotal role of these tools in streamlining testing processes amid time constraints and resource limitations, the study assists professionals in making informed choices aligned with their organizational needs.
Keywords: software testing tools, software development lifecycle (SDLC), test automation frameworks, automated software, Java-based, UFT, Selenium, RPA (robotic process automation), source and licensing, object repository
Procedia PDF Downloads 100
6747 Quantitative Structure-Property Relationship Study of Base Dissociation Constants of Some Benzimidazoles
Authors: Sanja O. Podunavac-Kuzmanović, Lidija R. Jevrić, Strahinja Z. Kovačević
Abstract:
Benzimidazoles are a group of compounds with significant antibacterial, antifungal, and anticancer activity. The studied compounds consist of the main benzimidazole structure with different combinations of substituents. This study is based on two-dimensional and three-dimensional molecular modeling and the calculation of molecular descriptors (physicochemical and lipophilicity descriptors) of structurally diverse benzimidazoles. Molecular modeling was carried out using ChemBio3D Ultra version 14.0 software. The obtained 3D models were subjected to energy minimization using the molecular mechanics force field method (MM2), with the cutoff for structure optimization set at a gradient of 0.1 kcal/(Å mol). The obtained set of molecular descriptors was used in a principal component analysis (PCA) of possible similarities and dissimilarities among the studied derivatives. After the molecular modeling, quantitative structure-property relationship (QSPR) analysis was applied in order to obtain mathematical models that can be used to predict the pKb values of structurally similar benzimidazoles. The obtained models are based on statistically valid multiple linear regression (MLR) equations, and the calculated cross-validation parameters indicate their high prediction ability. This study is financially supported by COST action CM1306 and project No. 114-451-347/2015-02 of the Provincial Secretariat for Science and Technological Development of Vojvodina.
Keywords: benzimidazoles, chemometrics, molecular modeling, molecular descriptors, QSPR
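As an illustration of the MLR-with-cross-validation step described above, a minimal sketch follows; the descriptor table and pKb values are hypothetical, not the paper's dataset.

```python
# Minimal sketch: QSPR-style multiple linear regression with leave-one-out
# cross-validation on hypothetical descriptor data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# rows: compounds; columns: e.g. logP, molar refractivity, polar surface area
X = np.array([[2.1, 42.0, 28.7],
              [2.8, 47.5, 31.2],
              [1.6, 39.8, 35.0],
              [3.3, 51.1, 26.4],
              [2.4, 44.9, 30.1]])
pkb = np.array([7.9, 8.4, 7.2, 8.9, 8.1])   # hypothetical pKb values

model = LinearRegression().fit(X, pkb)
pred = cross_val_predict(LinearRegression(), X, pkb, cv=LeaveOneOut())
ss_res = np.sum((pkb - pred) ** 2)
ss_tot = np.sum((pkb - pkb.mean()) ** 2)
print("R2 =", model.score(X, pkb), "Q2_LOO =", 1 - ss_res / ss_tot)
```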
Procedia PDF Downloads 290
6746 Pressure Gradient Prediction of Oil-Water Two Phase Flow through Horizontal Pipe
Authors: Ahmed I. Raheem
Abstract:
In this thesis, stratified and stratified wavy flow regimes have been investigated numerically for oil (1.57 mPa s viscosity and 780 kg/m3 density) and water two-phase flow in small and large horizontal steel pipes with diameters between 0.0254 and 0.508 m, using ANSYS Fluent software. The volume of fluid (VOF) method was applied to the two-phase flow together with a two-equation turbulence model (Realizable k-ε).
Keywords: CFD, two-phase flow, pressure gradient, volume of fluid, large diameter, horizontal pipe, oil-water stratified and stratified wavy flow
Procedia PDF Downloads 434
6745 Estimation of Transition and Emission Probabilities
Authors: Aakansha Gupta, Neha Vadnere, Tapasvi Soni, M. Anbarsi
Abstract:
Protein secondary structure prediction is one of the most important goals pursued by bioinformatics and theoretical chemistry; it is highly important in medicine and biotechnology. Some aspects of protein function and genome analysis can be predicted by secondary structure prediction, which is used to help annotate sequences, classify proteins, identify domains, and recognize functional motifs. In this paper, we represent protein secondary structure as a mathematical model. To extract and predict the protein secondary structure from the primary structure, we require a set of parameters. Any constants appearing in the model are specified by these parameters, which also provide a mechanism for efficient and accurate use of data. Many algorithms exist to estimate these model parameters; the most popular is the EM (Expectation-Maximization) algorithm. The model parameters are estimated from protein datasets such as RS126 by using the Bayesian probabilistic method (the data set being categorical). This work can then be extended to compare the efficiency of the EM algorithm with that of other algorithms for estimating the model parameters, which will in turn lead to an efficient component for protein secondary structure prediction. Further, this paper provides scope for using these parameters to predict the secondary structure of proteins using machine learning techniques such as neural networks and fuzzy logic. The ultimate objective is to obtain accuracy greater than previously achieved.
Keywords: model parameters, expectation maximization algorithm, protein secondary structure prediction, bioinformatics
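As a concrete illustration: when state labels are available, the Bayesian categorical estimate of transition and emission probabilities reduces to smoothed counting, sketched below on toy sequences (not the RS126 dataset); the EM algorithm generalizes this to unlabeled data.

```python
# Minimal sketch: HMM-style transition and emission probabilities estimated
# from labeled sequences by counting with pseudocounts (a simple
# Bayesian/Laplace estimate for categorical data). Toy data, not RS126.
from collections import defaultdict

states = ["H", "E", "C"]                    # helix, strand, coil
residues = "ACDEFGHIKLMNPQRSTVWY"

# (amino-acid sequence, per-residue secondary-structure labels)
data = [("MKTAYIA", "HHHEECC"),
        ("GAVLIPF", "CEEHHHC")]

trans = defaultdict(lambda: 1.0)            # pseudocount of 1 everywhere
emit = defaultdict(lambda: 1.0)
for seq, labels in data:
    for a, b in zip(labels, labels[1:]):
        trans[(a, b)] += 1
    for aa, s in zip(seq, labels):
        emit[(s, aa)] += 1

def transition_prob(a: str, b: str) -> float:
    total = sum(trans[(a, s)] for s in states)
    return trans[(a, b)] / total

def emission_prob(s: str, aa: str) -> float:
    total = sum(emit[(s, r)] for r in residues)
    return emit[(s, aa)] / total

print(transition_prob("H", "H"), emission_prob("H", "A"))
```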
Procedia PDF Downloads 482
6744 Transient Stability Improvement in Multi-Machine System Using Power System Stabilizer (PSS) and Static Var Compensator (SVC)
Authors: Khoshnaw Khalid Hama Saleh, Ergun Ercelebi
Abstract:
Increasingly complex modern power systems require stability, especially under transient and small disturbances. Transient stability plays a major role during faults and large disturbances. This paper compares a power system stabilizer (PSS) and a static Var compensator (SVC) for damping oscillations and enhancing transient stability. The effectiveness of a PSS connected to the exciter and/or governor in damping electromechanical oscillations of an isolated synchronous generator was tested. The SVC device is a member of the shunt FACTS (flexible alternating current transmission system) family, utilized in power transmission systems. The designed model was tested on a multi-machine system consisting of four machines and six buses, using MATLAB/SIMULINK software. The results obtained indicate that the SVC solution is better than the PSS.
Keywords: FACTS, MATLAB/SIMULINK, multi-machine system, PSS, SVC, transient stability
Procedia PDF Downloads 456
6743 Nonparametric Quantile Regression for Multivariate Spatial Data
Authors: S. H. Arnaud Kanga, O. Hili, S. Dabo-Niang
Abstract:
Spatial prediction is an appealing issue attracting several fields such as agriculture, environmental sciences, ecology, and econometrics. Although multiple nonparametric prediction methods exist for spatial data, they are based on the conditional expectation. This paper takes a different approach by examining a nonparametric spatial predictor of the conditional quantile. The study considers a stationary multidimensional spatial process over a rectangular domain. The proposed quantile is obtained by inverting the conditional distribution function, whose estimator depends on three kernels: one controls the distance between spatial locations, while the other two control the distance between observations. In addition, the almost complete convergence and the convergence in mean of order q of the kernel predictor are obtained when the sample considered is alpha-mixing. This approach to prediction gains accuracy because it overcomes sensitivity to extreme values and outliers.
Keywords: conditional quantile, kernel, nonparametric, stationary
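The inversion idea at the core of the predictor can be sketched in one covariate: estimate the conditional distribution function by kernel smoothing, then invert it on a grid. The illustration below omits the paper's spatial kernels and mixing conditions; bandwidth and data are assumptions.

```python
# Minimal sketch: kernel conditional-quantile prediction by inverting a
# Nadaraya-Watson estimate of the conditional CDF. One covariate only.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 300)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=300)

def cond_cdf(x0: float, t: float, h: float = 0.1) -> float:
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)        # Gaussian kernel in x
    return float(np.sum(w * (y <= t)) / np.sum(w))

def cond_quantile(x0: float, tau: float) -> float:
    grid = np.linspace(y.min(), y.max(), 400)
    cdf = np.array([cond_cdf(x0, t) for t in grid])
    return float(grid[np.searchsorted(cdf, tau)])  # first t with F(t|x0) >= tau

print(cond_quantile(0.25, 0.5))   # conditional median near sin(pi/2) = 1
```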
Procedia PDF Downloads 155
6742 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence
Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno
Abstract:
Missing data is a common challenge in statistical analyses of most clinical survey datasets. A variety of methods have been developed to enable the analysis of survey data with missing values, of which imputation is the most commonly used. However, to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we identify different types of missing values: missing data due to skip pattern (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and apply rough set imputation to only the GMD portion of the missing data. We used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we used p-values from the Wald test, and to evaluate the accuracy of the prediction, we considered the width of the 95% confidence interval for the probability of incontinence. Both imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed than the non-imputed datasets (28.7 vs. 23.4), and the average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index
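As an illustration of the evaluation step, the sketch below fits a logistic regression and reads off per-coefficient Wald statistics and the width of a 95% confidence interval for a predicted probability; the covariates are random placeholders, not the MESA variables.

```python
# Minimal sketch: logistic regression fit, per-coefficient Wald statistics,
# and the 95% CI width for a predicted probability. Placeholder data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
X = sm.add_constant(rng.normal(size=(500, 3)))
p = 1 / (1 + np.exp(-(X @ np.array([-1.0, 0.8, -0.5, 0.3]))))
y = rng.binomial(1, p)

fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
wald = (fit.params / fit.bse) ** 2            # per-coefficient Wald chi-square
print(wald)

# width of the 95% CI for the predicted probability of the first subject
ci = fit.get_prediction(X[:1]).conf_int()     # [[lower, upper]] on probability scale
print(float(ci[0, 1] - ci[0, 0]))
```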
Procedia PDF Downloads 169
6741 Optimal Tuning of RST Controller Using PSO Optimization for Synchronous Generator Based Wind Turbine under Three-Phase Voltage Dips
Authors: K. Tahir, C. Belfedal, T. Allaoui, C. Gerard, M. Doumi
Abstract:
In this paper, we present an RST controller optimized by the Particle Swarm Optimization (PSO) meta-heuristic for the active and reactive power regulation of a grid-connected wind turbine based on a wound-field synchronous generator. This regulation is achieved below the synchronous speed by means of a maximum power point tracking algorithm. The control of our system is tested by simulation under typical wind variations, parameter variations, and grid fault conditions. Results are presented and discussed to prove the simplicity and efficiency of the WRSG control for WECS. According to the simulation results, the variable-speed-driven WRSG is not significantly impacted by fault conditions.
Keywords: wind energy, particle swarm optimization, wound rotor synchronous generator, power control, RST controller, maximum power point tracking
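For readers unfamiliar with the optimizer, a minimal PSO loop of the kind used for the tuning is sketched below; the quadratic cost is a placeholder for the simulated tracking-error criterion the paper would minimize over the RST coefficients.

```python
# Minimal PSO sketch: particles encode candidate controller coefficients
# and are scored by a cost function (placeholder quadratic here).
import numpy as np

rng = np.random.default_rng(3)

def cost(params: np.ndarray) -> float:
    return float(np.sum((params - np.array([1.2, -0.4, 0.7])) ** 2))

n, dim, w, c1, c2 = 20, 3, 0.7, 1.5, 1.5
pos = rng.uniform(-2, 2, (n, dim))
vel = np.zeros((n, dim))
best_p = pos.copy()
best_p_cost = np.array([cost(p) for p in pos])
best_g = best_p[best_p_cost.argmin()].copy()

for _ in range(100):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (best_p - pos) + c2 * r2 * (best_g - pos)
    pos += vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < best_p_cost
    best_p[improved], best_p_cost[improved] = pos[improved], costs[improved]
    best_g = best_p[best_p_cost.argmin()].copy()

print(best_g)   # converges toward [1.2, -0.4, 0.7]
```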
Procedia PDF Downloads 453
6740 A Deep Learning Based Integrated Model for Spatial Flood Prediction
Authors: Vinayaka Gude, Divya Sampath
Abstract:
The research introduces an integrated prediction model to assess the susceptibility of roads in a future flooding event. The model consists of a deep learning algorithm for forecasting gauge height data and the Flood Inundation Mapper (FIM) for spatial flooding. An optimal architecture for a long short-term memory (LSTM) network was identified for the gauge located on the Tangipahoa River at Robert, LA. Dropout was applied to the model to evaluate the uncertainty associated with the predictions. The estimates are then used along with FIM to identify spatial flooding, and further geoprocessing in ArcGIS provides susceptibility values for different roads. The model was validated against the devastating flood of August 2016. The paper discusses the challenges of generalizing the methodology to other locations and to various types of flooding. The developed model can be used by transportation departments and other emergency response organizations for effective disaster management.
Keywords: deep learning, disaster management, flood prediction, urban flooding
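A minimal sketch of such an LSTM forecaster with Monte Carlo dropout for uncertainty follows; the window length, layer sizes, and synthetic series are assumptions, not the authors' configuration.

```python
# Minimal sketch: LSTM gauge-height forecaster with MC dropout.
# Synthetic series and hyperparameters are assumptions.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(4)
series = np.sin(np.linspace(0, 40, 1000)) + rng.normal(scale=0.1, size=1000)

window = 24
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# MC dropout: keep dropout active at inference and sample repeatedly
samples = np.stack([model(X[-1:], training=True).numpy()[0, 0] for _ in range(50)])
print(samples.mean(), samples.std())   # point forecast and its spread
```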
Procedia PDF Downloads 149
6739 Analysis, Design, and Implementation of Quality Management System for KSA Software Company
Authors: Omar Said Almushyt
Abstract:
Quality management has recently become necessary for companies all over the world to face competitive challenges. Software companies in KSA suffer from two problems: low customer satisfaction and low product quality. Implementing quality management in a software company can solve these problems by improving product quality and enhancing customer satisfaction, making the company competitive. These goals can be achieved by introducing a quality management system through system analysis, followed by system design, and finally implementation. Results of the present work showed that the proposed method can increase product quality by 10% and customer satisfaction by 20%.
Keywords: quality, management, software, information engineering
Procedia PDF Downloads 440
6738 Customer Acquisition through Time-Aware Marketing Campaign Analysis in Banking Industry
Authors: Harneet Walia, Morteza Zihayat
Abstract:
Customer acquisition has become one of the critical issues of any business in the 21st century; a healthy customer base is the essential asset of the banking business. Term deposits act as a major source of cheap funds for banks to invest and benefit from interest rate arbitrage. To attract customers, the marketing campaigns at most financial institutions consist of multiple outbound telephone calls, with more than one contact per customer, which is a very time-consuming process. Therefore, customized direct marketing has become more critical than ever for attracting new clients. As customer acquisition is becoming more difficult to achieve, an intelligent and refined contact list is necessary to sell a product smartly. The aim of this research is to increase the effectiveness of campaigns by predicting customers who will most likely subscribe to the fixed deposit and suggesting the most suitable month to reach out to them. We design a Time Aware Upsell Prediction Framework (TAUPF) using two different approaches, the Upsell Prediction Approach (UPA) and the Clustered Upsell Prediction Approach (CUPA), with the aim of finding the best approach and technique to build the prediction model. We also address the data imbalance problem by examining and comparing different methods of sampling (up-sampling and down-sampling). Our results have shown that building such a model is quite feasible and profitable for financial institutions. The TAUPF can be easily used in any industry, such as telecom, automobile, or tourism, where either of its approaches holds valid; in our case, CUPA proved more reliable. As proven in our research, one of the most important challenges is to define measures that have enough predictive power, as subscription to a fixed deposit depends on highly ambiguous situations and cannot be easily isolated. While we have shown the practicality of a time-aware upsell prediction model from which financial institutions can benefit by contacting customers in the specified month, further research needs to be done to determine the specific time of day. In addition, a further empirical/pilot study on live customers needs to be conducted to prove the effectiveness of the model in the real world.
Keywords: customer acquisition, predictive analysis, targeted marketing, time-aware analysis
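As an illustration of the imbalance handling the framework compares, the sketch below up-samples the minority class before fitting a subscription classifier; the features and the gradient-boosting stand-in are placeholders, and the TAUPF/UPA/CUPA pipelines themselves are not reproduced.

```python
# Minimal sketch: up-sampling the minority (subscriber) class before
# fitting a propensity model. Placeholder features and model.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.utils import resample

rng = np.random.default_rng(5)
X = rng.normal(size=(1000, 5))                       # customer features
y = (rng.random(1000) < 0.08).astype(int)            # ~8% subscribe

X_min, X_maj = X[y == 1], X[y == 0]
X_up = resample(X_min, n_samples=len(X_maj), random_state=0)  # up-sample minority
X_bal = np.vstack([X_maj, X_up])
y_bal = np.array([0] * len(X_maj) + [1] * len(X_up))

clf = GradientBoostingClassifier().fit(X_bal, y_bal)
scores = clf.predict_proba(X)[:, 1]                  # propensity to subscribe
print("top-decile cut-off:", np.quantile(scores, 0.9))
```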
Procedia PDF Downloads 125
6737 Comparison between Two Software Packages GSTARS4 and HEC-6 about Prediction of the Sedimentation Amount in Dam Reservoirs and to Estimate Its Efficient Life Time in the South of Iran
Authors: Fatemeh Faramarzi, Hosein Mahjoob
Abstract:
Building dams on rivers for the utilization of water resources disturbs the hydrodynamic equilibrium and results in all or part of the sediments carried by the water settling in the dam reservoir. This phenomenon also has significant impacts on the water and sediment flow regime and, in the long term, can cause morphological changes in the environment surrounding the river, reducing the useful life of the reservoir and threatening sustainable development through inefficient management of water resources. In the past, empirical methods were used to predict the sedimentation amount in dam reservoirs and to estimate their efficient lifetime, but recently mathematical and computational models, which usually solve the governing equations using the finite element method, have become widely used as a suitable tool in such studies. This study compares the results from two software packages, GSTARS4 and HEC-6, in the prediction of the sedimentation amount in the Dez dam, southern Iran. Each model provides a one-dimensional, steady-state simulation of sediment deposition and erosion by solving the equations of momentum, flow and sediment continuity, and sediment transport. GSTARS4 (Generalized Sediment Transport Model for Alluvial River Simulation) is based on a one-dimensional mathematical model that simulates bed changes in both longitudinal and transverse directions by using flow tubes in a quasi-two-dimensional scheme; it was used to calibrate a period of 47 years and forecast the next 47 years of sedimentation in the Dez dam. This dam is among the highest in the world (203 m height), irrigates more than 125000 hectares of downstream lands, and plays a major role in flood control in the region. The input data, including geometry, hydraulic, and sedimentary data, span 1955 to 2003 on a daily basis. To predict future river discharge, the time series data were assumed to repeat after 47 years. The obtained result was very satisfactory in the delta region, so that the output from GSTARS4 was almost identical to the hydrographic profile in 2003. In the Dez dam, because of the long (65 km) and large reservoir, vertical currents are dominant, making the calculations by the above-mentioned method inaccurate; to solve this problem, we used the empirical reduction method to calculate the sedimentation in the downstream area, which led to very good answers. Thus, we demonstrated that by combining these two methods, a very suitable model for sedimentation in the Dez dam for the study period can be obtained. The present study demonstrated successfully that the outputs of both methods are the same.
Keywords: Dez Dam, prediction, sedimentation, water resources, computational models, finite element method, GSTARS4, HEC-6
Procedia PDF Downloads 313
6736 Software Engineering Inspired Cost Estimation for Process Modelling
Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller
Abstract:
Up to this point, business process management projects in general, and business process modelling projects in particular, have not been able to rely on a practical and scientifically validated method to estimate cost and effort. The model development phase especially is not covered by any cost estimation method or model; later phases of business process modelling, starting with implementation, are covered by initial solutions discussed in the literature. This article proposes to fill this gap by deriving a cost estimation method from available methods in a closely similar domain, namely software development and software engineering. We derive this method from COCOMO II and Function Point, which are established methods of effort estimation in the domain of software development. For this, we lay out the similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle. After the proposition of this method, different ideas for its further analysis and validation are proposed.
Keywords: COCOMO II, business process modeling, cost estimation method, BPM COCOMO
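For reference, the COCOMO II post-architecture effort equation the method starts from can be sketched as follows; A = 2.94 and B = 0.91 are the published COCOMO II.2000 calibration constants, and the sample inputs are hypothetical.

```python
# Minimal sketch of the COCOMO II post-architecture effort equation:
# PM = A * Size^E * product(effort multipliers), E = B + 0.01 * sum(scale factors).
# A = 2.94, B = 0.91 per COCOMO II.2000; sample inputs are hypothetical.
def cocomo_ii_effort(ksloc: float, scale_factors, effort_multipliers,
                     a: float = 2.94, b: float = 0.91) -> float:
    e = b + 0.01 * sum(scale_factors)
    em = 1.0
    for m in effort_multipliers:
        em *= m
    return a * ksloc ** e * em          # effort in person-months

# e.g. 50 KSLOC, five scale-factor ratings, a few cost drivers
print(cocomo_ii_effort(50, [3.72, 3.04, 4.24, 3.29, 4.68], [1.1, 0.9, 1.0]))
```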
Procedia PDF Downloads 441
6735 Sensor and Actuator Fault Detection in Connected Vehicles under a Packet Dropping Network
Authors: Z. Abdollahi Biron, P. Pisu
Abstract:
Connected vehicles are one of the promising technologies for future Intelligent Transportation Systems (ITS). A connected vehicle system is essentially a set of vehicles communicating through a network to exchange their information with each other and the infrastructure. Although this interconnection of the vehicles can be potentially beneficial in creating an efficient, sustainable, and green transportation system, a set of safety and reliability challenges accompany this technology. The first challenge arises from information loss due to an unreliable communication network, which affects the control/management system of the individual vehicles and the overall system; such a scenario may lead to degraded or even unsafe operation, which could be potentially catastrophic. Secondly, faulty sensors and actuators can affect the individual vehicle's safe operation and in turn create a potentially unsafe node in the vehicular network. Further, sending faulty sensor information to other vehicles, and failures in actuators, may significantly affect the safe operation of the overall vehicular network. Therefore, it is of utmost importance to take these issues into consideration while designing the control/management algorithms of the individual vehicles as part of a connected vehicle system. In this paper, we consider a connected vehicle system under Co-operative Adaptive Cruise Control (CACC) and propose a fault diagnosis scheme that deals with these challenges. Specifically, the conventional CACC algorithm is modified by adding a Kalman filter-based estimation algorithm to suppress the effect of lost information under an unreliable network, and a sliding mode observer-based algorithm is used to improve sensor reliability under faults. The effectiveness of the overall diagnostic scheme is verified via simulation studies.
Keywords: fault diagnostics, communication network, connected vehicles, packet drop out, platoon
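The packet-drop handling can be sketched as a Kalman filter that skips the measurement update whenever a packet is lost, propagating the prediction instead; the constant-velocity model and drop probability below are assumptions for illustration, not the paper's CACC dynamics.

```python
# Minimal sketch: Kalman filter that performs the measurement update only
# when a packet arrives. Constant-velocity model; assumed parameters.
import numpy as np

rng = np.random.default_rng(6)
dt = 0.1
F = np.array([[1, dt], [0, 1]])            # position/velocity dynamics
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = 0.01 * np.eye(2)                       # process noise covariance
R = np.array([[0.25]])                     # measurement noise covariance

x_true = np.array([0.0, 1.0])
x_est, P = np.zeros(2), np.eye(2)

for _ in range(100):
    x_true = F @ x_true + rng.multivariate_normal([0, 0], Q)
    # predict
    x_est = F @ x_est
    P = F @ P @ F.T + Q
    # update only if the packet arrived (80% of the time here)
    if rng.random() < 0.8:
        z = H @ x_true + rng.normal(scale=0.5, size=1)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x_est = x_est + K @ (z - H @ x_est)
        P = (np.eye(2) - K @ H) @ P

print(x_true, x_est)
```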
Procedia PDF Downloads 239
6734 Cricket Shot Recognition using Conditional Directed Spatial-Temporal Graph Networks
Authors: Tanu Aneja, Harsha Malaviya
Abstract:
Capturing pose information in cricket shots poses several challenges, such as low-resolution videos, noisy data, and joint occlusions caused by the nature of the shots. In response to these challenges, we propose a CondDGConv-based framework specifically for cricket shot prediction. By analyzing the spatial-temporal relationships in batsman shot sequences from an annotated 2D cricket dataset, our model achieves 97% accuracy in predicting shot types. This performance is made possible by conditioning the graph network on batsman 2D poses, allowing for precise prediction of shot outcomes based on pose dynamics. Our approach highlights the potential for enhancing shot prediction in cricket analytics, offering a robust solution for overcoming pose-related challenges in sports analysis.
Keywords: action recognition, cricket, sports video analytics, computer vision, graph convolutional networks
Procedia PDF Downloads 21
6733 Design and Optimization Fire Alarm System to Protect Gas Condensate Reservoirs With the Use of Nano-Technology
Authors: Hefzollah Mohammadian, Ensieh Hajeb, Mohamad Baqer Heidari
Abstract:
In this paper, for the protection and safety of gas tanks (flammable materials), and also due to the considerable economic value of the reservoir, a new system for protection, conservation, and firefighting has been developed. The system consists of several parts: sensors to detect heat and fire built with nanotechnology (nano sensors); a barrier for isolation and protection from a range of two electronic zones; an analyzer for detecting and accurately locating the point of fire; and a main electronic board to announce fire, diagnose faults in different locations, raise the relevant alarms, and activate the devices for fire extinguishing and announcement. An important feature of this system is the high speed and capability of its fire detection, which triggers at an adjustable ambient temperature threshold. Another advantage is that the system is autonomous and does not require a human operator on site. Using nanotechnology, in addition to speeding up the work, reduces the cost of constructing the sensor as well as the notification and fire extinguishing systems.
Keywords: analyser, barrier, heat resistance, general fault, general alarm, nano sensor
Procedia PDF Downloads 456
6732 Data-Driven Simulation Tools for DER and Battery-Rich Power Grids
Authors: Ali Moradiamani, Samaneh Sadat Sajjadi, Mahdi Jalili
Abstract:
Power system analysis has long been a major research topic in the generation and distribution sectors, in both industry and academia. Several load flow and fault analysis scenarios are normally performed to study the performance of different parts of the grid in the context of, for example, voltage and frequency control, and software tools such as PSCAD, PSSE, and PowerFactory DIgSILENT have been developed to perform these analyses accurately. The distribution grid used to be the passive part of the grid, known as the grid of consumers. However, a significant paradigm shift has happened with the emergence of Distributed Energy Resources (DERs) at the distribution level. This means that the concept of power system analysis needs to be extended to the distribution grid, especially considering self-sufficient technologies such as microgrids. Compared to the generation and transmission levels, the distribution level includes significantly more generation/consumption nodes, thanks to rooftop PV solar generation and battery energy storage systems. In addition, a diverse set of consumption profiles is expected from household residents, resulting in a wide range of scenarios. The emergence of electric vehicles will make the environment even more complicated, considering their charging (and possibly discharging) requirements. These complexities, as well as the large size of distribution grids, create challenges for the available power system analysis software. In this paper, we study the requirements of simulation tools for the distribution grid and how data-driven algorithms are required to increase the accuracy of the simulation results.
Keywords: smart grids, distributed energy resources, electric vehicles, battery storage systems, simulation tools
Procedia PDF Downloads 105
6731 Geodynamic Evolution of the Tunisian Dorsal Backland (Central Mediterranean) from the Cenozoic to Present
Authors: Aymen Arfaoui, Abdelkader Soumaya, Noureddine Ben Ayed
Abstract:
The study region is located in the Tunisian Dorsal Backland (Central Mediterranean), which is the easternmost part of the Saharan Atlas mountain range, trending southwest-northeast. Based on our fieldwork, seismic tomography images, seismicity, and previous studies, we propose an interpretation of the relationship between the surface deformation and fault kinematics in the study area and the internal dynamic processes acting in the Central Mediterranean from the Cenozoic to the present. The subduction and dynamics of internal forces beneath the complicated Maghrebides mobile belt have an impact on the Tertiary and Quaternary tectonic regimes in the Pelagian and Atlassic foreland that is part of our study region. The left lateral reactivation of the major "Tunisian N-S Axis fault" and the development of a compressional relay between the Hammamet Korbous and Messella-Ressas faults are possibly a result of tectonic stresses due to the slab roll-back following the Africa/Eurasia convergence. After the slab segmentation and its eastward migration (5-4 Ma) and the formation of the Strait of Sicily "rift zone" further east, a transtensional tectonic regime has been installed in this area. According to seismic tomography images, the STEP fault of the "North-South Axis" at Hammamet-Korbous coincides with the western edge of the "slab windows" of the Sicilian Channel and the eastern boundary of the positive anomalies attributed to the residual slab of Tunisia. On the other hand, significant E-W Plio-Quaternary tectonic activity can be observed along the eastern portion of this STEP fault system in the Grombalia zone as a result of recent vertical lithospheric motion in response to the lateral slab migration eastward to the Sicily Channel. According to SKS fast splitting directions, the upper mantle flow pattern beneath the Tunisian Dorsal is parallel to the NE-SW to E-W orientation of the Shmin identified in the study area, similar to the Plio-Quaternary extensional orientation in the Central Mediterranean. Additionally, the removal of the lithosphere and the subsequent uplift of the sub-lithospheric mantle beneath the topographic highs of the Dorsal and its surroundings may be the cause of the dominant extensional to transtensional Quaternary regime. The occurrence of strike-slip and extensional seismic events in the Pelagian block reveals that the regional transtensional tectonic regime persists today. Finally, we believe that the geodynamic history of the study area since the Cenozoic is primarily influenced by the preexisting weak zones, the African slab detachment, and the upper mantle flow pattern in the central Mediterranean.
Keywords: Tunisia, lithospheric discontinuity (STEP fault), geodynamic evolution, Tunisian dorsal backland, strike-slip fault, seismic tomography, seismicity, central Mediterranean
Procedia PDF Downloads 80
6730 Free and Open Source Software for BIM Workflow of Steel Structure Design
Authors: Danilo Di Donato
Abstract:
The continuous new releases of free and open source software (FOSS) and the high costs of proprietary software, whose monopoly is characterized by closed code and a low level of implementation and customization by end-users, prompt a reflection on the tools that can be chosen and adopted for the design and representation of new steel constructions. The paper presents experimentation carried out to verify the actual potential and effective applicability of FOSS support for the BIM modeling of steel structures, particularly considering the goals of a workflow that achieves a high level of development (LOD) and allows effective interchange between different software. To this end, the examined software packages are those with open source or freeware licenses, in order to evaluate their use in architectural praxis. The test primarily involved experimentation with FreeCAD, the only open source software that allows a complete and integrated BIM workflow, and the results were compared with those of two proprietary packages, SketchUp and Tekla BIMsight, which are released in free versions not usable for commercial purposes. The experiments on open source and freeware software were then compared with the outcomes obtained with two proprietary packages, SketchUp Pro and Tekla Structures, the latter of which has modules specifically addressed to the design of steel structures. The evaluation applied comparative criteria defined in categories related to reliability, efficiency, potential, achievable LOD, and user-friendliness of the analyzed software packages. To verify the actual outcomes of FOSS BIM for steel structure projects, these results were compared with a simulation related to a real case study carried out with proprietary BIM modeling software: the same design theme, the project of a shelter for a public space, was developed using the different software. The purpose of the contribution is therefore to assess the developments and potentialities inherent in FOSS BIM, in order to estimate their effective applicability to professional practice, their limits, and the new fields of research they propose.
Keywords: BIM, steel buildings, FOSS, LOD
Procedia PDF Downloads 175
6729 Predicting Daily Patient Hospital Visits Using Machine Learning
Authors: Shreya Goyal
Abstract:
The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility in a given period by using a machine learning algorithm, the Support Vector Machine (SVM). Accurate prediction of patient arrivals allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving the patient experience. It allows for better allocation of staff, equipment, and other resources: if a surge in patients is projected, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another big advantage of this software is adherence to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share the data with any third party or upload it to the cloud: the software reads data locally from the machine. The data needs to be arranged in a particular format, and the software will be able to read it and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure; keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it on external servers, helping maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events. The algorithm uses a supervised learning method to optimize the objective function and find the global minimum; it stores the values of the local minima after each iteration and at the end compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better management planning of personnel and medical resources.
Keywords: machine learning, SVM, HIPAA, data
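A minimal sketch of an SVM-based visit predictor along the lines described follows, using support vector regression on calendar features; the feature encoding and synthetic counts are assumptions, not the study's data.

```python
# Minimal sketch: support vector regression of daily visit counts on
# calendar features. Synthetic data; assumed encoding.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(7)
n = 365
day_of_week = rng.integers(0, 7, n)
month = rng.integers(1, 13, n)
local_event = rng.integers(0, 2, n)               # 1 if a local event that day
visits = 120 + 15 * (day_of_week == 0) + 10 * local_event \
         + 8 * np.sin(2 * np.pi * month / 12) + rng.normal(0, 5, n)

X = np.column_stack([day_of_week, month, local_event])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, visits)

# forecast a Monday in July with a local event
print(model.predict([[0, 7, 1]]))
```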
Procedia PDF Downloads 66
6728 Uplift Segmentation Approach for Targeting Customers in a Churn Prediction Model
Authors: Shivahari Revathi Venkateswaran
Abstract:
Segmenting customers plays a significant role in churn prediction and helps the marketing team with both proactive and reactive customer retention. For reactive retention, the retention team reaches out to customers who have already shown intent to disconnect by giving them special offers. For proactive retention, the marketing team uses a churn prediction model, built with XGBoost, that ranks each customer from 1 to 100, where rank 1 means the highest risk of churn/disconnecting (high ranks correspond to high churn propensity). However, with churn ranks alone, the marketing team can only reach out to customers based on their individual ranks; profiling different groups of customers and framing different marketing strategies for targeted groups is not possible. For this, customers must be grouped into segments based on their profiles, such as demographics and other non-controllable attributes, which helps the marketing team frame different offer groups for the targeted audience and prevent them from disconnecting (proactive retention). For segmentation, machine learning approaches like k-means clustering will not form unique customer segments in which all customers share the same attributes. This paper finds an alternate approach that enumerates all combinations of unique segments that can be formed from the user attributes and then finds the segments that have uplift (a churn rate higher than the baseline churn rate). For this, search algorithms like fast search and recursive search are used. Further, within each segment, all customers can be targeted using their individual churn ranks from the churn prediction model. Finally, a UI (User Interface) is developed for the marketing team to interactively search for the meaningful segments that are formed, target the right audience for future marketing campaigns, and prevent them from disconnecting.
Keywords: churn prediction modeling, XGBoost model, uplift segments, proactive marketing, search algorithms, retention, k-mean clustering
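The segment search can be sketched as follows: enumerate attribute-value combinations and keep those whose churn rate exceeds the baseline. Exhaustive enumeration stands in for the paper's fast and recursive search; the attributes and churn flags are synthetic.

```python
# Minimal sketch: enumerate attribute-combination segments and keep the
# "uplift" segments whose churn rate exceeds the baseline. Synthetic data;
# exhaustive search stands in for the paper's search algorithms.
from itertools import combinations, product
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
df = pd.DataFrame({
    "region": rng.choice(["east", "west"], 2000),
    "plan": rng.choice(["basic", "premium"], 2000),
    "tenure": rng.choice(["<1y", "1-3y", ">3y"], 2000),
})
df["churned"] = rng.random(2000) < np.where(df["plan"] == "basic", 0.25, 0.10)

baseline = df["churned"].mean()
attrs = ["region", "plan", "tenure"]

uplift_segments = []
for r in range(1, len(attrs) + 1):
    for cols in combinations(attrs, r):
        for values in product(*[df[c].unique() for c in cols]):
            mask = np.ones(len(df), dtype=bool)
            for c, v in zip(cols, values):
                mask &= (df[c] == v).to_numpy()
            if mask.sum() >= 50:                      # ignore tiny segments
                rate = df.loc[mask, "churned"].mean()
                if rate > baseline:
                    uplift_segments.append((dict(zip(cols, values)), rate))

for seg, rate in sorted(uplift_segments, key=lambda s: -s[1])[:5]:
    print(seg, round(rate, 3), "baseline:", round(baseline, 3))
```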
Procedia PDF Downloads 71
6727 Damage-Based Seismic Design and Evaluation of Reinforced Concrete Bridges
Authors: Ping-Hsiung Wang, Kuo-Chun Chang
Abstract:
There has been a common trend worldwide in the seismic design and evaluation of bridges towards the performance-based method, where the lateral displacement or the displacement ductility of the bridge column is regarded as an important indicator for performance assessment. However, the seismic response of a bridge to an earthquake is a combined result of cyclic displacements and accumulated energy dissipation, both of which damage the bridge, and hence the lateral displacement (ductility) alone is insufficient to tell its actual seismic performance. This study proposes a damage-based seismic design and evaluation method for reinforced concrete bridges on the basis of newly developed capacity-based inelastic displacement spectra. These spectra, which comprise an inelastic displacement ratio spectrum and a corresponding damage state spectrum, were constructed by using a series of nonlinear time history analyses and a versatile, smooth hysteresis model. The smooth model takes into account the effects of various design parameters of RC bridge columns and correlates the column's strength deterioration with the Park and Ang damage index. It was shown that the damage index not only accurately predicts the onset of strength deterioration but is also a good indicator for assessing the actual visible damage condition of the column regardless of its loading history (i.e., a similar damage index corresponds to a similar actual damage condition for identically designed columns subjected to very different cyclic loading protocols or earthquake loading), providing better insight into the seismic performance of bridges. Besides, the computed spectra show that the inelastic displacement ratio for far-field ground motions approximately conforms to the equal displacement rule when the structural period is larger than around 0.8 s, but that for near-fault ground motions it departs from the rule over the whole considered spectral region. Furthermore, near-fault ground motions lead to significantly greater inelastic displacement ratios and damage indices than far-field ground motions, and most practical design scenarios cannot survive the considered near-fault ground motion when the strength reduction factor of the bridge is not less than 5.0. Finally, a spectrum formula is presented as a function of structural period, strength reduction factor, and various column design parameters for far-field and near-fault ground motions by means of regression analysis of the computed spectra. Based on the developed spectrum formula, a design example of a bridge is presented to illustrate the proposed damage-based seismic design and evaluation method, where the damage state of the bridge is used as the performance objective.
Keywords: damage index, far-field, near-fault, reinforced concrete bridge, seismic design and evaluation
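For reference, the Park and Ang damage index mentioned above is commonly written as follows (a standard form from the literature; the paper's calibration of the parameter beta is not reproduced here):

```latex
% Park and Ang damage index, standard form from the literature; the paper's
% calibration of \beta is not reproduced here.
\[
  D = \frac{\delta_m}{\delta_u} + \frac{\beta}{Q_y\,\delta_u} \int dE,
\]
% where $\delta_m$ is the maximum response deformation, $\delta_u$ the
% ultimate deformation under monotonic loading, $Q_y$ the yield strength,
% $\int dE$ the accumulated hysteretic energy, and $\beta$ a calibration
% parameter; $D \ge 1$ conventionally indicates collapse.
```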
Procedia PDF Downloads 125
6726 On the Homology Modeling, Structural Function Relationship and Binding Site Prediction of Human Alsin Protein
Authors: Y. Ruchi, A. Prerna, S. Deepshikha
Abstract:
Amyotrophic lateral sclerosis (ALS), also known as "Lou Gehrig's disease", is a neurodegenerative disease associated with degeneration of motor neurons in the cerebral cortex, brain stem, and spinal cord, characterized by distal muscle weakness, atrophy, normal sensation, pyramidal signs, and progressive muscular paralysis. ALS2 is a juvenile autosomal recessive disorder, slowly progressive, that maps to chromosome 2q33 and is associated with mutations in the alsin gene, a putative GTPase regulator. In this paper, we performed homology modeling of the alsin2 protein using multiple templates (3KCI_A, 4LIM_A, 402W_A, 4D9S_A, and 4DNV_A) with the Prime program in the Schrödinger software suite. The modeled structure was then used to identify effective binding sites on the basis of structural and physical properties using the SiteMap program in Schrödinger, and structural and functional analysis was carried out using the PROSITE and ExPASy servers, which give insight into conserved domains and motifs that can be used for protein classification. This paper summarizes the structural, functional, and binding site properties of the alsin2 protein. These binding sites can be potential drug target sites and can be used for docking studies.
Keywords: ALS, binding site, homology modeling, neuronal degeneration
Procedia PDF Downloads 390