Search results for: validation techniques
6992 Considering Partially Developed Artifacts in Change Impact Analysis Implementation
Authors: Nazri Kama, Sufyan Basri, Roslina Ibrahim
Abstract:
It is important to manage changes in software to meet the evolving needs of the customer. Accepting too many changes delays completion and incurs additional cost. One type of information that helps in making this decision comes from change impact analysis. Current impact analysis approaches assume that all classes in the class artifact are completely developed and that the class artifact is used as a source of analysis. However, these assumptions are impractical for impact analysis in the software development phase, as some classes in the class artifact are still under development or only partially developed, which leads to inaccuracy. This paper presents a novel impact analysis approach for use in the software development phase. The significant achievements of the approach are demonstrated through an extensive experimental validation using three case studies.
Keywords: software development, impact analysis, traceability, static analysis
Procedia PDF Downloads 608
6991 Evaluation of Short-Term Load Forecasting Techniques Applied for Smart Micro-Grids
Authors: Xiaolei Hu, Enrico Ferrera, Riccardo Tomasi, Claudio Pastrone
Abstract:
Load Forecasting plays a key role in making today's and tomorrow's Smart Energy Grids sustainable and reliable. Accurate power consumption prediction allows utilities to organize their resources in advance or to execute Demand Response strategies more effectively, which enables several benefits such as higher sustainability, better quality of service, and affordable electricity tariffs. Applying Load Forecasting at a smaller geographic scale, i.e., in Smart Micro Grids, is more demanding, as the lower available grid flexibility makes accurate prediction more critical for Demand Response applications. This paper analyses the application of short-term load forecasting in a concrete scenario, proposed within the EU-funded GreenCom project, which collects load data from single loads and households belonging to a Smart Micro Grid. Three short-term load forecasting techniques, i.e., linear regression, artificial neural networks, and radial basis function networks, are considered, compared, and evaluated through absolute forecast errors and training time. The influence of weather conditions on Load Forecasting is also evaluated. A new definition of Gain is introduced in this paper, which serves as an indicator of short-term prediction capability and time-span consistency. Two models, 24- and 1-hour-ahead forecasting, are built to comprehensively compare the three techniques.
Keywords: short-term load forecasting, smart micro grid, linear regression, artificial neural networks, radial basis function network, gain
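For orientation, a 1-hour-ahead linear regression forecaster can be sketched along the following lines. The 24-lag feature window and the function names are illustrative assumptions, not details taken from the GreenCom implementation:

```python
import numpy as np

def make_lagged_dataset(load, n_lags=24):
    """Turn an hourly load series into (X, y) pairs for 1-hour-ahead prediction."""
    X = np.array([load[i:i + n_lags] for i in range(len(load) - n_lags)])
    y = load[n_lags:]
    return X, y

def fit_linear_forecaster(load, n_lags=24):
    """Fit ordinary least squares weights (with intercept) on the lagged data."""
    X, y = make_lagged_dataset(load, n_lags)
    X1 = np.hstack([X, np.ones((len(X), 1))])  # append an intercept column
    w, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return w

def forecast_next_hour(w, recent_load):
    """Predict the next hourly load from the most recent n_lags readings."""
    return float(np.dot(np.append(recent_load, 1.0), w))
```

A 24-hour-ahead variant would shift the target by 24 steps instead of 1; the absolute forecast error of the paper's comparison is then just the absolute difference between prediction and the actual reading.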
Procedia PDF Downloads 471
6990 Modern Spectrum Sensing Techniques for Cognitive Radio Networks: Practical Implementation and Performance Evaluation
Authors: Antoni Ivanov, Nikolay Dandanov, Nicole Christoff, Vladimir Poulkov
Abstract:
Spectrum underutilization has made cognitive radio a promising technology both for current and future telecommunications. This is due to the ability to exploit the unused spectrum in the bands dedicated to other wireless communication systems and, thus, increase their occupancy. The essential function, which allows the cognitive radio device to perceive the occupancy of the spectrum, is spectrum sensing. In this paper, the performance of modern adaptations of the four most widely used spectrum sensing techniques, namely energy detection (ED), cyclostationary feature detection (CSFD), matched filter (MF) and eigenvalue-based detection (EBD), is compared. The implementation has been accomplished through the PlutoSDR hardware platform and the GNU Radio software package in very low Signal-to-Noise Ratio (SNR) conditions. The optimal detection performance of the examined methods in a realistic implementation-oriented model is found for the common relevant parameters (number of observed samples, sensing time and required probability of false alarm).
Keywords: cognitive radio, dynamic spectrum access, GNU Radio, spectrum sensing
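Of the four techniques, energy detection is the simplest to sketch: compare the average sample energy against a threshold derived from the noise power and the required false-alarm probability. The following is a hedged illustration of that classic decision rule, not the adapted PlutoSDR/GNU Radio implementation evaluated in the paper; the Gaussian-approximation threshold and parameter names are assumptions:

```python
import numpy as np
from statistics import NormalDist

def energy_detect(samples, noise_var, p_fa=0.01):
    """Classic energy detector: declare the band occupied when the average
    sample energy exceeds a threshold set by the target false-alarm rate."""
    n = len(samples)
    t = np.mean(np.abs(samples) ** 2)        # test statistic
    q = NormalDist().inv_cdf(1.0 - p_fa)     # Gaussian tail quantile
    # Central-limit approximation of the noise-only statistic (real-valued samples).
    threshold = noise_var * (1.0 + q * np.sqrt(2.0 / n))
    return bool(t > threshold)
```

At very low SNR the statistic concentrates near the noise floor, which is precisely the regime where the paper's comparison against CSFD, MF and EBD becomes interesting.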
Procedia PDF Downloads 246
6989 Safety Validation of Black-Box Autonomous Systems: A Multi-Fidelity Reinforcement Learning Approach
Authors: Jared Beard, Ali Baheri
Abstract:
As autonomous systems become more prominent in society, ensuring their safe application becomes increasingly important. This is clearly demonstrated by autonomous cars traveling through a crowded city or robots traversing a warehouse with heavy equipment. Human environments can be complex, having high-dimensional state and action spaces. This gives rise to two problems. One is that analytic solutions may not be possible. The other is that in simulation-based approaches, searching the entirety of the problem space could be computationally intractable, ruling out formal methods. To overcome this, approximate solutions may seek to find failures or estimate their likelihood of occurrence. One such approach is adaptive stress testing (AST), which uses reinforcement learning to induce failures in the system. Its premise is that a learned model can be used to help find new failure scenarios, making better use of simulations. Despite this, AST fails to find particularly sparse failures and can be inclined to find solutions similar to those found previously. To help overcome this, multi-fidelity learning can be used to alleviate this overuse of information. That is, information from lower-fidelity simulations can be used to build up samples less expensively and more effectively cover the solution space to find a broader set of failures. Recent work in multi-fidelity learning has passed information bidirectionally using “knows what it knows” (KWIK) reinforcement learners to minimize the number of samples in high-fidelity simulators (thereby reducing computation time and load). The contribution of this work, then, is the development of a bidirectional multi-fidelity AST framework. Such an algorithm uses multi-fidelity KWIK learners in an adversarial context to find failure modes.
Thus far, a KWIK learner has been used to train an adversary in a grid world to prevent an agent from reaching its goal, thus demonstrating the utility of KWIK learners in an AST framework. The next step is the implementation of the bidirectional multi-fidelity AST framework described. Testing will be conducted in a grid world containing an agent attempting to reach a goal position and an adversary tasked with intercepting the agent, as demonstrated previously. Fidelities will be modified by adjusting the size of a time-step, with higher fidelity effectively allowing for more responsive closed-loop feedback. Results will compare the single KWIK AST learner with the multi-fidelity algorithm with respect to the number of samples, distinct failure modes found, and the relative effect of learning after a number of trials.
Keywords: multi-fidelity reinforcement learning, multi-fidelity simulation, safety validation, falsification
Procedia PDF Downloads 157
6988 Intrusion Detection Using Dual Artificial Techniques
Authors: Rana I. Abdulghani, Amera I. Melhum
Abstract:
With the abnormal growth of computer usage over networks, and given the consensus among computer security experts that the goal of building a fully secure system is never effectively achieved, intrusion detection systems (IDS) were designed. This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of Swarm Intelligence (SI). Here, the algorithm is enhanced to obtain a minimum error rate by amending the cluster centers whenever a better fitness value is found during the training stages. Results show that this modification gives more efficient exploration than the original algorithm. The second technique uses a back-propagation neural network (BP NN). Finally, the results of the two methods were compared based on the NSL-KDD data sets for the construction and evaluation of intrusion detection systems. This research is only interested in clustering the given connection records into two categories (Normal and Abnormal). Experiments result in an intrusion detection rate of 99.183818% for the enhanced PSO and 69.446416% for the BP neural network.
Keywords: IDS, SI, BP, NSL-KDD, PSO
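For reference, the canonical PSO update that the enhanced algorithm builds on can be sketched as follows. The paper's specific enhancement, amending cluster centers when a better fitness value is found, is not reproduced here, and the coefficient values are illustrative assumptions:

```python
import numpy as np

def pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, r1=0.5, r2=0.5):
    """One canonical particle swarm update for a particle encoding cluster
    centers; r1 and r2 would normally be fresh uniform(0, 1) draws."""
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    return pos, vel
```

Each particle here would encode a candidate set of cluster centers; pbest and gbest are the particle's and swarm's best-scoring positions under the clustering error used as the fitness function.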
Procedia PDF Downloads 383
6987 Development of National Scale Hydropower Resource Assessment Scheme Using SWAT and Geospatial Techniques
Authors: Rowane May A. Fesalbon, Greyland C. Agno, Jodel L. Cuasay, Dindo A. Malonzo, Ma. Rosario Concepcion O. Ang
Abstract:
The Department of Energy of the Republic of the Philippines estimates that the country’s energy reserves for 2015 are dwindling, as observed in the rotating power outages in several localities. To aid in the energy crisis, a national hydropower resource assessment scheme is developed. Hydropower is a resource derived from flowing water and a difference in elevation. It is a renewable energy resource deemed abundant in the Philippines, an archipelagic country rich in bodies of water and water resources. The objective of this study is to develop a methodology for a national hydropower resource assessment using hydrologic modeling and geospatial techniques in order to generate resource maps for future reference and use by the government and other stakeholders. The methodology developed for this purpose is focused on two models: the implementation of the Soil and Water Assessment Tool (SWAT) for the river discharge, and the use of geospatial techniques to analyze the topography, obtain the head, and generate the theoretical hydropower potential sites. The methodology is tightly coupled with Geographic Information Systems to maximize the use of geodatabases and the spatial significance of the determined sites. The hydrologic model used in this workflow is SWAT integrated in the GIS software ArcGIS. The head is determined by a developed algorithm that utilizes a Synthetic Aperture Radar (SAR)-derived digital elevation model (DEM) with a resolution of 10 meters. The initial results of the developed workflow indicate hydropower potential in the river reaches ranging from pico (less than 5 kW) to mini (1-3 MW) theoretical potential.
Keywords: ArcSWAT, renewable energy, hydrologic model, hydropower, GIS
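Once SWAT supplies the discharge Q and the DEM-based algorithm supplies the head H, the theoretical potential at a candidate site follows the standard relation P = ρgQH. A minimal sketch (the function name and the unit-efficiency default are assumptions, not details from the paper's workflow):

```python
def hydropower_potential_kw(discharge_m3s, head_m, efficiency=1.0):
    """Theoretical hydropower potential P = rho * g * Q * H, in kilowatts.
    efficiency = 1.0 gives the purely theoretical (unconverted) potential."""
    RHO = 1000.0  # density of water, kg/m^3
    G = 9.81      # gravitational acceleration, m/s^2
    return RHO * G * discharge_m3s * head_m * efficiency / 1000.0
```

For example, a reach with Q = 0.1 m³/s and H = 5 m yields about 4.9 kW, i.e., pico scale in the classification quoted above.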
Procedia PDF Downloads 313
6986 The Journey of a Malicious HTTP Request
Authors: M. Mansouri, P. Jaklitsch, E. Teiniker
Abstract:
SQL injection on web applications is a very popular kind of attack. There are mechanisms, such as intrusion detection systems, for detecting this attack. These strategies often rely on techniques implemented at high layers of the application but do not consider the low level of system calls. The problem with considering only the high-level perspective is that an attacker can circumvent the detection tools using certain techniques, such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow catching the critical data at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program or Web application, “speaks” the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conduct an experiment in order to demonstrate the suitability of system call analysis for detecting SQL injection, and we are able to detect the attack. We therefore conclude that system calls are not only powerful in detecting low-level attacks but that they also enable us to detect high-level attacks such as SQL injection.
Keywords: Linux system calls, web attack detection, interception, SQL
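As a toy illustration of the idea, the payload visible in a traced write/send system call can be checked against SQL metacharacter patterns at the kernel boundary, where any URL encoding has already been undone by the application. This regex heuristic is our own illustrative sketch, not the detection logic of the paper:

```python
import re

# Naive signature set: quote characters, comment markers, tautologies,
# and UNION-based probes. A real detector needs far more than this.
SQLI_PATTERN = re.compile(
    r"('|--|\bOR\b\s+1\s*=\s*1|\bUNION\s+SELECT\b)", re.IGNORECASE
)

def looks_like_sqli(payload: str) -> bool:
    """Heuristic check on a decoded buffer captured from a system call."""
    return bool(SQLI_PATTERN.search(payload))
```

The point of inspecting at the system-call level is that the same check applied at the HTTP layer could be defeated by URL encoding, whereas the decoded buffer eventually crosses the kernel gate in the clear.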
Procedia PDF Downloads 359
6985 Concept Drifts Detection and Localisation in Process Mining
Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa
Abstract:
Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. While analyzing an event log, state-of-the-art techniques available in process mining treat the operational process as a static (stationary) entity. This is often not the case due to the possibility of a phenomenon called concept drift. During the period of execution, the process can experience concept drift and can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. The work presented in this paper discusses the main aspects to consider while addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of the process by using features extracted from the traces in the process log. Our experimental results are promising in the direction of efficiently detecting and localizing concept drift in the context of the process mining research discipline.
Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining
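A minimal flavor of sudden-drift detection on a stream of per-trace features: compare summary statistics of two adjacent sliding windows and flag the point where they diverge. This window-comparison sketch is a generic illustration with assumed parameter values, not the feature set or statistical test of the proposed method:

```python
import statistics

def detect_sudden_drift(series, window=50, threshold=3.0):
    """Return the first index where the mean of the upcoming window differs
    from the mean of the preceding window by more than `threshold` pooled
    standard deviations; None when no drift is flagged."""
    for i in range(window, len(series) - window + 1):
        left = series[i - window:i]
        right = series[i:i + window]
        spread = statistics.pstdev(left) or 1e-12  # guard constant windows
        if abs(statistics.fmean(right) - statistics.fmean(left)) > threshold * spread:
            return i
    return None
```

Localization falls out of the same comparison: the flagged index marks approximately where in the log the control-flow behavior changed.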
Procedia PDF Downloads 348
6984 Stress Field Induced by an Interfacial Edge Dislocation in a Multi-Layered Medium
Authors: Aditya Khanna, Andrei Kotousov
Abstract:
A novel method is presented for obtaining the stress field induced by an edge dislocation in a multilayered composite. To demonstrate the applications of the obtained solution, we consider the problem of an interfacial crack in a periodically layered bimaterial medium. The crack is modeled as a continuous distribution of edge dislocations, and the Distributed Dislocation Technique (DDT) is utilized to obtain numerical results for the energy release rate (ERR). The numerical results correspond well with previously published results, and the comparison serves as a validation of the obtained dislocation solution.
Keywords: distributed dislocation technique, edge dislocation, elastic field, interfacial crack, multi-layered composite
Procedia PDF Downloads 446
6983 Exhaustive Study of Essential Constraint Satisfaction Problem Techniques Based on N-Queens Problem
Authors: Md. Ahsan Ayub, Kazi A. Kalpoma, Humaira Tasnim Proma, Syed Mehrab Kabir, Rakib Ibna Hamid Chowdhury
Abstract:
Constraint Satisfaction Problems (CSPs) are observed in various applications, e.g., scheduling problems, timetabling problems, assignment problems, etc. Researchers adopt a CSP technique to tackle a certain problem; however, each technique follows different approaches and ways to solve a problem network. In our exhaustive study, it has been possible to visualize the processes of the essential CSP algorithms on a very concrete constraint satisfaction example, the N-Queens Problem, in order to gain a deep understanding of how a particular constraint satisfaction problem will be dealt with by our studied and implemented techniques. Besides, benchmark results (time vs. value of N in N-Queens) have been generated from our implemented approaches, which help understand at what factor each algorithm produces solutions, especially in the N-Queens puzzle. Thus, extended decisions can be made to instantiate a real-life problem within the CSP framework.
Keywords: arc consistency (AC), backjumping algorithm (BJ), backtracking algorithm (BT), constraint satisfaction problem (CSP), forward checking (FC), least constrained values (LCV), maintaining arc consistency (MAC), minimum remaining values (MRV), N-Queens problem
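As a concrete reference point, the plain backtracking (BT) baseline on N-Queens can be written in a few lines; the studied variants (FC, MAC, BJ, MRV, LCV) add pruning and ordering on top of this skeleton. This is the textbook formulation, not the authors' specific implementation:

```python
def solve_n_queens(n):
    """Count N-Queens solutions with plain backtracking (BT): place one queen
    per row and check the column and both diagonal constraints."""
    cols, diag1, diag2 = set(), set(), set()
    count = 0

    def place(row):
        nonlocal count
        if row == n:
            count += 1
            return
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue  # constraint violated, try the next column
            cols.add(col); diag1.add(row - col); diag2.add(row + col)
            place(row + 1)
            cols.discard(col); diag1.discard(row - col); diag2.discard(row + col)

    place(0)
    return count
```

Timing this function for growing N gives exactly the kind of time-vs-N benchmark curve the study reports for each technique.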
Procedia PDF Downloads 365
6982 Additive Friction Stir Manufacturing Process: Interest in Understanding Thermal Phenomena and Numerical Modeling of the Temperature Rise Phase
Authors: Antoine Lauvray, Fabien Poulhaon, Pierre Michaud, Pierre Joyot, Emmanuel Duc
Abstract:
Additive Friction Stir Manufacturing (AFSM) is a new industrial process that follows the emergence of friction-based processes. The AFSM process is a solid-state additive process using the energy produced by the friction at the interface between a rotating non-consumable tool and a substrate. Friction depends on various parameters like axial force, rotation speed or friction coefficient. The feeder material is a metallic rod that flows through a hole in the tool. Unlike in Friction Stir Welding (FSW), where abundant literature exists and addresses many aspects going from process implementation to characterization and modeling, there are still few research works focusing on AFSM. Therefore, there is still a lack of understanding of the physical phenomena taking place during the process. This research work aims at a better understanding and implementation of the AFSM process, thanks to numerical simulation and experimental validation performed on a prototype effector. Such an approach is considered a promising way to study the influence of the process parameters and to finally identify a relevant process window. The deposition of material through the AFSM process takes place in several phases; in chronological order, these phases are the docking phase, the dwell time phase, the deposition phase, and the removal phase. The present work focuses on the dwell time phase, which enables the temperature rise of the system composed of the tool, the filler material, and the substrate due to pure friction. Analytic modeling of heat generation based on friction considers the rotational speed and the contact pressure as main parameters. Another parameter considered influential is the friction coefficient, assumed to be variable due to the self-lubrication of the system with the rise in temperature or the smoothing of the roughness of the materials in contact over time.
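For orientation, an analytic friction-heating estimate of the kind mentioned above, for pure sliding at the tool/substrate interface, has the form P = μ·F·ω·r_eff. The lumped effective contact radius and the constant friction coefficient below are simplifying assumptions, since the study treats μ as variable:

```python
import math

def friction_heat_w(mu, axial_force_n, rpm, r_eff_m):
    """Sliding-friction heat generation P = mu * F * omega * r_eff, in watts,
    with the pressure-area product lumped into the axial force and an
    effective contact radius standing in for the tool geometry."""
    omega = 2.0 * math.pi * rpm / 60.0  # spindle speed in rad/s
    return mu * axial_force_n * omega * r_eff_m
```

The proportionality to both rotational speed and axial force is consistent with the observation below that fluctuations of these inputs strongly affect the temperature reached during the dwell time phase.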
This study proposes, through numerical modeling followed by experimental validation, to question the influence of the various input parameters on the dwell time phase. Rotation speed, temperature, spindle torque, and axial force are the main parameters monitored during the experiments and serve as reference data for the calibration of the numerical model. This research shows that the geometry of the tool, as well as fluctuations of the input parameters like axial force and rotational speed, are very influential on the temperature reached and/or the time required to reach the targeted temperature. The main outcome is the prediction of a process window, which is a key result for a more efficient process implementation.
Keywords: numerical model, additive manufacturing, friction, process
Procedia PDF Downloads 147
6981 The Ethics of Corporate Social Responsibility Statements in Undercutting Sustainability: A Communication Perspective
Authors: Steven Woods
Abstract:
The use of Corporate Social Responsibility (CSR) statements has become ubiquitous in society. Appealing to consumers by being a well-behaved social entity has become a strategy not just to ensure brand loyalty but also to further larger-scale projects of corporate interest. Specifically, the use of CSR to position corporations as good planetary citizens involves not just self-promotion but also a way of transferring responsibility from systems to individuals. By using techniques labeled as “greenwashing” and emphasizing ethical consumption choices as the solution, corporations present themselves as good members of the community pursuing sustainability. Ultimately, the primary function of corporate social responsibility statements is to maintain the economic status quo of ongoing growth and consumption while presenting an environmentally progressive image to the public, as well as reassuring them that corporate behavior is superior to government intervention. By analyzing the communication techniques utilized, through content analysis of specific examples along with an analysis of the frames of meaning constructed in the CSR statements, the practices of corporate responsibility and sustainability will be addressed from an ethical perspective.
Keywords: corporate social responsibility, ethics, greenwashing, sustainability
Procedia PDF Downloads 72
6980 Using Data Mining Techniques to Evaluate the Different Factors Affecting the Academic Performance of Students at the Faculty of Information Technology in Hashemite University in Jordan
Authors: Feras Hanandeh, Majdi Shannag
Abstract:
This research studies the different factors that could affect the accumulative average of students at the Faculty of Information Technology in Hashemite University. The paper examines the students’ information and background as well as their academic records, and how this information affects the students’ ability to get high grades. The student information used in the study is extracted from the students’ academic records. Data mining tools and techniques are used to decide which attribute(s) most affect the students’ accumulative average. The results show that the most important factor affecting the students’ accumulative average is the student acceptance type. We also built a decision tree model and rules to determine how a student can get high grades in their courses. The overall accuracy of the model is 44%, which is an acceptable rate.
Keywords: data mining, classification, extracting rules, decision tree
Procedia PDF Downloads 417
6979 [Keynote Talk]: Software Reliability Assessment and Fault Tolerance: Issues and Challenges
Authors: T. Gayen
Abstract:
Although several software reliability models exist today, there is still no versatile model that can be used for the reliability assessment of software. Complex software has a large number of states (unlike hardware), so it becomes practically difficult to completely test the software. Irrespective of the amount of testing one does, it sometimes becomes extremely difficult to assure that the final software product is fault-free. The black-box software reliability models are found to be quite uncertain for the reliability assessment of various systems. Since mission-critical applications need to be highly reliable, and since it is not always possible to ensure the development of a highly reliable system, one develops mechanisms to handle faults remaining in the system even after development in order to achieve fault-free operation of the software. Although several such techniques are currently in use to achieve fault tolerance, these mechanisms may not always be very suitable for various systems. Hence, this discussion focuses on analyzing the issues and challenges faced with the existing techniques for reliability assessment and fault tolerance of various software systems.
Keywords: black box, fault tolerance, failure, software reliability
Procedia PDF Downloads 426
6978 Secret Sharing in Visual Cryptography Using NVSS and Data Hiding Techniques
Authors: Misha Alexander, S. B. Waykar
Abstract:
Visual cryptography is a special unbreakable encryption technique that transforms the secret image into shares of random noisy pixels. These shares are transmitted over the network, and because of their noisy texture they attract hackers. To address this issue, a Natural Visual Secret Sharing (NVSS) scheme was introduced that uses natural shares, either in digital or printed form, to generate the noisy secret share. This scheme greatly reduces the transmission risk but causes distortion in the retrieved secret image through variation in the settings and properties of the digital devices used to capture the natural image during the encryption/decryption phase. This paper proposes a new NVSS scheme that extracts the secret key from multiple randomly selected, unaltered natural images. To further improve the security of the shares, data hiding techniques such as steganography and alpha channel watermarking are proposed.
Keywords: decryption, encryption, natural visual secret sharing, natural images, noisy share, pixel swapping
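The underlying principle of noisy shares can be illustrated with the simplest (2, 2) XOR construction: one share is pure randomness and the other is the secret masked by it, so each share alone is indistinguishable from noise. This is only the textbook baseline, not the proposed NVSS scheme, which derives its key material from natural images instead:

```python
import secrets

def make_shares(secret_bytes):
    """(2, 2) XOR secret sharing over raw pixel bytes: share1 is random noise
    and share2 = share1 XOR secret, so neither share alone reveals anything."""
    share1 = secrets.token_bytes(len(secret_bytes))
    share2 = bytes(a ^ b for a, b in zip(share1, secret_bytes))
    return share1, share2

def recover(share1, share2):
    """XOR the two shares back together to reconstruct the secret."""
    return bytes(a ^ b for a, b in zip(share1, share2))
```

The NVSS idea replaces the random share with key material extracted from natural images, which is exactly what removes the suspicious noisy-texture share from the channel.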
Procedia PDF Downloads 406
6977 Achieving Success in NPD Projects
Authors: Ankush Agrawal, Nadia Bhuiyan
Abstract:
The new product development (NPD) literature emphasizes the importance of introducing new products to the market for continuing business success. New products are responsible for employment, economic growth, technological progress, and high standards of living. Therefore, the study of NPD and the processes through which new products emerge is important. The goal of our research is to propose a framework of critical success factors, metrics, and tools and techniques for implementing metrics for each stage of the new product development (NPD) process. An extensive literature review was undertaken to investigate decades of studies on NPD success and how it can be achieved. These studies were scanned for factors common to firms that enjoyed success of new products on the market. While many studies have been conducted on critical success factors for NPD, they tend to be fragmented and focus on one or a few phases of the NPD process. This paper summarizes NPD success factors, suggests metrics that should be used to measure these factors, and proposes tools and techniques to make use of these metrics. This was done for each stage of the NPD process and brought together in a framework that the authors propose should be followed for complex NPD projects.
Keywords: new product development, performance, critical success factors, framework
Procedia PDF Downloads 399
6976 Special Features of Phacoemulsification Technique for Dense Cataracts
Authors: Shilkin A.G., Goncharov D.V., Rotanov D.A., Voitecha M.A., Kulyagina Y.I., Mochalova U.E.
Abstract:
Context: Phacoemulsification is a surgical technique used to remove cataracts, but it carries a higher number of complications when dense cataracts are present. The risk factors include a thin posterior capsule, dense nucleus fragments, and prolonged exposure to high-power ultrasound. To minimize these complications, various methods are used. Research aim: The aim of this study is to develop and implement optimal methods of ultrasound phacoemulsification for dense cataracts in order to minimize postoperative complications. Methodology: The study involved 36 eyes of dogs with dense cataracts over a period of 5 years. The surgeries were performed using a LEICA 844 surgical microscope and an Oertli Faros phacoemulsifier. The surgical techniques included the optimal technique for breaking the nucleus, bimanual surgery, and the use of Akahoshi prechoppers. Findings: The complications observed during surgery included rupture of the posterior capsule and the need for anterior vitrectomy. Complications in the postoperative period included corneal edema and uveitis. Theoretical importance: This study contributes to the field by providing insights into the special features of phacoemulsification for dense cataracts. It highlights the importance of using specific techniques and settings to minimize complications. Data collection and analysis procedures: The data for the study were collected from surgeries performed on dogs with dense cataracts. The complications were documented and analyzed. Question addressed: The study addressed the question of how to minimize complications during phacoemulsification surgery for dense cataracts. Conclusion: By following the optimal techniques and settings and using prechoppers, surgery for dense cataracts can be made safer and faster, minimizing the risks and complications.
Keywords: dense cataracts, phacoemulsification, phacoemulsification of cataracts in elderly dogs, complications of phacoemulsification
Procedia PDF Downloads 63
6975 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult to do, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with its prominent features, and the significant problems or issue domains that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.
Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
Procedia PDF Downloads 91
6974 Employing Visual Culture to Enhance Initial Adult Maltese Language Acquisition
Authors: Jacqueline Żammit
Abstract:
Recent research indicates that the utilization of right-brain strategies holds significant implications for the acquisition of language skills. Nevertheless, the utilization of visual culture as a means to stimulate these strategies and amplify language retention among adults engaging in second language (L2) learning remains a relatively unexplored area. This investigation delves into the impact of visual culture on activating right-brain processes during the initial stages of language acquisition, particularly in the context of teaching Maltese as a second language (ML2) to adult learners. By employing a qualitative research approach, this study convenes a focus group comprising twenty-seven educators to delve into a range of visual culture techniques integrated within language instruction. The collected data is subjected to thematic analysis using NVivo software. The findings underscore a variety of impactful visual culture techniques, encompassing activities such as drawing, sketching, interactive matching games, orthographic mapping, memory palace strategies, wordless picture books, picture-centered learning methodologies, infographics, the Face Memory Game, Spot the Difference, word search puzzles, the Hidden Object Game, educational videos, the Shadow Matching technique, Find the Differences exercises, and color-coded methodologies. These identified techniques hold potential for application within ML2 classes for adult learners. Consequently, this study not only provides insights into optimizing language learning through specific visual culture strategies but also furnishes practical recommendations for enhancing language competencies and skills.
Keywords: visual culture, right-brain strategies, second language acquisition, Maltese as a second language, visual aids, language-based activities
Procedia PDF Downloads 63
6973 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Models
Authors: Alam Ali, Ashok Kumar Pathak
Abstract:
Path analysis is a statistical technique used to evaluate the direct and indirect effects of variables in path models. One or more structural regression equations are used to estimate a series of parameters in path models to find a better fit to the data. However, sometimes the assumptions of classical regression models, such as ordinary least squares (OLS), are violated by the nature of the data, resulting in insignificant direct and indirect effects of exogenous variables. This article explores the effectiveness of a copula-based regression approach as an alternative to classical regression, specifically when variables are linked through an elliptical copula.
Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique
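In the classical OLS setting that the copula-based approach is meant to replace, the effect decomposition for a simple x → m → y path model reduces to products and sums of regression coefficients: indirect = a·b and total = c′ + a·b. A minimal sketch of that baseline (the copula-based estimation itself is not reproduced here; variable names are illustrative):

```python
import numpy as np

def ols(y, X):
    """OLS coefficients with the intercept prepended to the design matrix."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

def path_effects(x, m, y):
    """Classical path analysis for x -> m -> y: a from regressing m on x;
    b and c_prime from regressing y on (x, m) jointly."""
    a = ols(m, x.reshape(-1, 1))[1]
    _, c_prime, b = ols(y, np.column_stack([x, m]))
    return {"direct": c_prime, "indirect": a * b, "total": c_prime + a * b}
```

The copula-based variant replaces each OLS fit with a regression function derived from the joint (elliptical) copula of the variables, while the direct/indirect decomposition logic stays the same.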
Procedia PDF Downloads 43
6972 ViraPart: A Text Refinement Framework for Automatic Speech Recognition and Natural Language Processing Tasks in Persian
Authors: Narges Farokhshad, Milad Molazadeh, Saman Jamalabbasi, Hamed Babaei Giglou, Saeed Bibak
Abstract:
Persian is an inflectional subject-object-verb language, which makes it more ambiguous to process. However, techniques such as Zero-Width Non-Joiner (ZWNJ) recognition, punctuation restoration, and Persian Ezafe construction lead to more understandable and precise text. In most previous work on Persian, these techniques have been addressed individually; we believe, however, that text refinement in Persian requires all of these tasks together. In this work, we propose ViraPart, a framework that uses embedded ParsBERT at its core for text clarification. We first use the Persian variant of BERT followed by a classifier layer for the classification procedures, and then combine the models' outputs to produce clean text. The proposed models for ZWNJ recognition, punctuation restoration, and Persian Ezafe construction achieve averaged macro F1 scores of 96.90%, 92.13%, and 98.50%, respectively. Experimental results show that the proposed approach is highly effective for text refinement in Persian.
Keywords: Persian Ezafe, punctuation, ZWNJ, NLP, ParsBERT, transformers
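The output-combination step can be sketched as follows. This is a hedged illustration, not the ViraPart implementation: the tag names ("ZWNJ", "EZAFE", "O"), the romanized tokens, and the merge policy are all illustrative assumptions about what combining three token classifiers into clean text might look like.

```python
# Hedged sketch: merge per-token predictions from three hypothetical
# classifiers (ZWNJ recognition, punctuation restoration, Ezafe
# construction) into one refined string.

def refine(tokens, zwnj_tags, punct_tags, ezafe_tags):
    """Combine token-level predictions into a single refined string."""
    ZWNJ = "\u200c"  # zero-width non-joiner
    out = []
    for tok, z, p, e in zip(tokens, zwnj_tags, punct_tags, ezafe_tags):
        piece = tok
        if z == "ZWNJ":          # join this token to the next with a ZWNJ
            piece += ZWNJ
        if e == "EZAFE":         # append an (illustrative) Ezafe marker
            piece += "e"
        if p != "O":             # restore the predicted punctuation mark
            piece += p
        out.append(piece)
    # Tokens flagged with ZWNJ attach directly to the next token.
    text = ""
    for piece in out:
        text += piece
        if not piece.endswith(ZWNJ):
            text += " "
    return text.strip()

tokens = ["ketab", "ha", "khub", "ast"]
print(refine(tokens, ["ZWNJ", "O", "O", "O"], ["O", "O", "O", "."],
             ["O", "EZAFE", "O", "O"]))
```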
Procedia PDF Downloads 218
6971 Using Machine Learning Techniques for Autism Spectrum Disorder Analysis and Detection in Children
Authors: Norah Mohammed Alshahrani, Abdulaziz Almaleh
Abstract:
Autism Spectrum Disorder (ASD) is a condition related to brain development that affects how a person recognises and communicates with others, resulting in difficulties with social interaction and communication, and its prevalence is constantly growing. Early recognition of ASD allows children to lead safe and healthy lives and helps doctors make accurate diagnoses and manage the condition. It is therefore crucial to develop a method that measures ASD in children with good results and high accuracy. In this paper, ASD datasets of toddlers and children are analyzed. We employed four machine learning techniques to explore ASD: Random Forest (RF), Decision Tree (DT), Naïve Bayes (NB), and Support Vector Machine (SVM). Feature selection was then used to reduce the number of attributes in the ASD datasets while preserving model performance. As a result, we found that the best results were provided by the Support Vector Machine (SVM), achieving an accuracy of 0.98 on the toddler dataset and 0.99 on the children dataset.
Keywords: autism spectrum disorder, machine learning, feature selection, support vector machine
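The workflow of classifier plus feature selection can be sketched in miniature. This is a hedged illustration, not the paper's pipeline: it uses one of the four compared classifiers (Gaussian Naïve Bayes) with a simple variance-threshold selection step, on synthetic screening-style data that is purely illustrative.

```python
import math, random

def select_features(X, threshold=0.01):
    """Keep indices of features whose variance exceeds the threshold."""
    n, keep = len(X), []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        mu = sum(col) / n
        var = sum((v - mu) ** 2 for v in col) / n
        if var > threshold:
            keep.append(j)
    return keep

def fit_gnb(X, y):
    """Per-class feature means/variances and class priors."""
    model = {}
    for c in set(y):
        rows = [x for x, yc in zip(X, y) if yc == c]
        stats = []
        for j in range(len(X[0])):
            col = [r[j] for r in rows]
            mu = sum(col) / len(col)
            var = sum((v - mu) ** 2 for v in col) / len(col) + 1e-6
            stats.append((mu, var))
        model[c] = (len(rows) / len(X), stats)
    return model

def predict_gnb(model, x):
    """Class with the highest Gaussian log-likelihood plus log-prior."""
    best, best_lp = None, -math.inf
    for c, (prior, stats) in model.items():
        lp = math.log(prior)
        for v, (mu, var) in zip(x, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

random.seed(1)
# Two informative features plus one constant feature to be dropped.
X = [[random.gauss(c, 1), random.gauss(2 * c, 1), 0.0] for c in (0, 1) * 100]
y = [c for c in (0, 1) * 100]

keep = select_features(X)            # drops the constant third feature
Xs = [[row[j] for j in keep] for row in X]
model = fit_gnb(Xs, y)
acc = sum(predict_gnb(model, x) == c for x, c in zip(Xs, y)) / len(y)
print(f"kept features: {keep}, training accuracy: {acc:.2f}")
```

The same pattern, with a stronger classifier such as SVM, yields the comparison reported in the abstract.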
Procedia PDF Downloads 152
6970 Condition Assessment of Reinforced Concrete Bridge Deck Using Ground Penetrating Radar
Authors: Azin Shakibabarough, Mojtaba Valinejadshoubi, Ashutosh Bagchi
Abstract:
Catastrophic bridge failures occur due to lack of inspection, deficient design, and extreme events such as flooding or earthquakes. A Bridge Management System (BMS) is utilized to reduce the likelihood of such accidents through proper design and frequent inspection. Since visual inspection cannot detect subsurface defects, Non-Destructive Evaluation (NDE) techniques are used to remove this barrier as far as possible. Among all NDE techniques, Ground Penetrating Radar (GPR) has proved to be a highly effective device for detecting internal defects in reinforced concrete bridge decks; it is used to locate rebar and to detect rebar corrosion. A GPR profile is composed of a series of hyperbolas, in which a sharp hyperbola denotes sound rebar, while a blurred hyperbola or signal attenuation indicates corroded rebar. GPR images are interpreted either by numerical analysis or by visualization. Researchers have recently found that interpretation through visualization is more precise than interpretation through numerical analysis, but visualization is time-consuming and highly subjective; automating the interpretation of GPR images through visualization can solve these problems. After all scans of a bridge are interpreted, condition assessment is conducted based on the generated corrosion map. However, such a condition assessment is neither objective nor precise; basing it on structural integrity and strength parameters can make it more so. The main purpose of this study is to present an automated method for interpreting reinforced concrete bridge deck scans through a visualization technique. Finally, a combined analysis of the structural condition of a bridge is implemented.
Keywords: bridge condition assessment, ground penetrating radar, GPR, NDE techniques, visualization
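The corrosion-map step can be sketched generically. This is a hedged illustration, not the authors' algorithm: reflection amplitudes at rebar locations are normalized against the strongest return, and attenuated returns are flagged as likely corrosion. The grid values and the -6 dB threshold are assumptions chosen for illustration.

```python
import math

def corrosion_map(amplitudes, threshold_db=-6.0):
    """Classify each rebar reflection as 'sound' or 'corroded'."""
    a_max = max(max(row) for row in amplitudes)
    result = []
    for row in amplitudes:
        labels = []
        for a in row:
            db = 20 * math.log10(a / a_max)   # amplitude relative to max, in dB
            labels.append("corroded" if db < threshold_db else "sound")
        result.append(labels)
    return result

# Mock deck: amplitudes of rebar reflections on a 2 x 4 grid of scan points.
grid = [[1.00, 0.90, 0.40, 0.85],
        [0.95, 0.30, 0.88, 0.92]]
for row in corrosion_map(grid):
    print(row)
```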
Procedia PDF Downloads 149
6969 Standardization of Miniature Neutron Research Reactor and Occupational Safety Analysis
Authors: Raymond Limen Njinga
Abstract:
The comparator factors (Fc) for miniature research reactors are of great importance in the field of nuclear physics, as they provide an accurate basis for the evaluation of elements in all forms of samples via k0-NAA techniques. The Fc was initially simulated theoretically; thereafter, a series of experiments was performed to validate the results. The experimental values were obtained using an Au(0.1%)-Al alloy monitor foil and a neutron flux setting of 5.00E+11 cm-2 s-1. In the inner irradiation position, an average experimental value of 7.120E+05 was reported against the theoretical value of 7.330E+05, a percentage deviation of 2.86% from the theoretical value. In the outer irradiation position, an experimental value of 1.170E+06 was recorded against the theoretical value of 1.210E+06, a percentage deviation of 3.31% from the theoretical value. The equivalent dose rates at 5 m from a neutron flux of 5.00E+11 cm-2 s-1 at neutron energies of 1 keV, 10 keV, 100 keV, 500 keV, 1 MeV, 5 MeV, and 10 MeV were calculated to be 0.01 Sv/h, 0.01 Sv/h, 0.03 Sv/h, 0.15 Sv/h, 0.21 Sv/h, and 0.25 Sv/h, respectively, and the total dose over a period of one hour was obtained as 0.66 Sv.
Keywords: neutron flux, comparator factor, NAA techniques, neutron energy, equivalent dose
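The reported figures can be checked numerically. This is a brief verification sketch, not the authors' code: the percentage deviation of the experimental comparator factors from theory, and the one-hour total dose summed over the listed dose rates.

```python
def pct_deviation(experimental, theoretical):
    """Absolute deviation from the theoretical value, in percent."""
    return 100 * abs(theoretical - experimental) / theoretical

inner = pct_deviation(7.120e5, 7.330e5)   # inner irradiation position
outer = pct_deviation(1.170e6, 1.210e6)   # outer irradiation position

dose_rates_sv_per_h = [0.01, 0.01, 0.03, 0.15, 0.21, 0.25]
total_dose_sv = sum(dose_rates_sv_per_h)  # exposure over one hour

print(f"inner: {inner:.2f}%  outer: {outer:.2f}%  total: {total_dose_sv:.2f} Sv")
```

Both deviations and the 0.66 Sv total agree with the values quoted in the abstract.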
Procedia PDF Downloads 183
6968 Solid State Drive End to End Reliability Prediction, Characterization and Control
Authors: Mohd Azman Abdul Latif, Erwan Basiron
Abstract:
A flaw or drift from expected operational performance in one component (NAND, PMIC, controller, DRAM, etc.) may affect the reliability of the entire Solid State Drive (SSD) system. It is therefore important to ensure the required quality of each individual component through qualification testing specified by standards or user requirements. Qualification testing is time-consuming and comes at a substantial cost for product manufacturers. A highly technical team drawn from all the key stakeholders embarks on reliability prediction from the beginning of new product development, identifies critical-to-reliability parameters, performs full-blown characterization to embed margin into product reliability, and establishes controls to ensure that product reliability is sustained in mass production. This paper discusses a comprehensive development framework that comprehends the SSD end to end, from design to assembly, in-line inspection, and in-line testing, and is able to predict and validate product reliability at the early stage of new product development. During the design stage, the SSD goes through an intense reliability margin investigation focused on assembly process attributes, process equipment control, and in-process metrology, while also comprehending the forward-looking product roadmap. Once these pillars are complete, the next step is to perform process characterization and build a reliability prediction model. Next, for design validation, a reliability prediction tool, specifically a solder joint simulator, is established. The SSDs are stratified into non-operating and operating tests focused on solder joint reliability and connectivity/component latent failures, with prevention through design intervention and containment through Temperature Cycle Testing (TCT). Some of the SSDs are subjected to physical solder joint analyses, namely Dye and Pry (DP) and cross-section analysis. The results are fed back to the simulation team for any corrective actions required to further improve the design. Once the SSD is validated and proven to work, it is subjected to the monitor phase, in which the Design for Assembly (DFA) rules are updated. At this stage, the design changes and the process and equipment parameters are in control. Predictable product reliability early in product development enables on-time sample qualification delivery to the customer, optimizes product development validation and development resources, and avoids forced late investment to bandage end-of-life product failures. Understanding the critical-to-reliability parameters earlier allows focus on increasing the product margin, which increases customer confidence in product reliability.
Keywords: e2e reliability prediction, SSD, TCT, solder joint reliability, NUDD, connectivity issues, qualifications, characterization and control
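The abstract does not name its solder joint life model, so the following is a hedged sketch of one commonly used option: the Norris-Landzberg acceleration factor, which relates field thermal cycles to TCT cycles. The exponents, the activation constant, and all temperature/frequency values below are typical published assumptions, not the paper's parameters.

```python
import math

def norris_landzberg_af(dT_test, dT_field, f_test, f_field,
                        Tmax_test_K, Tmax_field_K,
                        n=1.9, m=1.0 / 3.0, Ea_k=1414.0):
    """Acceleration factor AF = N_field / N_test (typical SnPb-style constants)."""
    return ((dT_test / dT_field) ** n
            * (f_field / f_test) ** m
            * math.exp(Ea_k * (1.0 / Tmax_field_K - 1.0 / Tmax_test_K)))

# Example: TCT at -40/125 C, 2 cycles/hour, vs. field at 20/55 C, 24 cycles/day.
af = norris_landzberg_af(dT_test=165, dT_field=35,
                         f_test=48, f_field=24,       # cycles per day
                         Tmax_test_K=398.15, Tmax_field_K=328.15)
print(f"acceleration factor ~ {af:.0f}")
```

One TCT cycle then stands in for roughly AF field cycles, which is how a bounded test duration is mapped to the product's field lifetime.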
Procedia PDF Downloads 174
6967 Implementation and Validation of Therapeutic Tourism Products for Families with Children with Autism Spectrum Disorder in Azores Islands: "Azores All in Blue" Project
Authors: Ana Rita Conde, Pilar Mota, Tânia Botelho, Suzana Caldeira, Isabel Rego, Jessica Pacheco, Osvaldo Silva, Áurea Sousa
Abstract:
Tourism promotes well-being and health for children with ASD and their families. The literature indicates the need to provide tourist activities that integrate a therapeutic component in order to promote the development and well-being of children with ASD. This study aims to implement tourist offerings in the Azores that integrate this therapeutic component and to assess their suitability and impact on the well-being and health of the children and their caregivers. Using a mixed methodology, the study involves families that experience and evaluate the impact of tourism products developed specifically for them.
Keywords: autism spectrum disorder, children, therapeutic tourism activities, well-being, health, inclusive tourism
Procedia PDF Downloads 144
6966 Fault Diagnosis in Induction Motor
Authors: Kirti Gosavi, Anita Bhole
Abstract:
This paper demonstrates the simulation and steady-state performance of a three-phase squirrel cage induction motor and the detection of a broken rotor bar fault using MATLAB. The simulation model is successfully used to detect broken rotor bars in induction machines. A dynamic model using a PWM inverter, together with a mathematical model of the motor, is developed. Dynamic simulation of a small-power induction motor is one of the key steps in validating the design process of the motor drive system; it is needed to eliminate inadvertent design errors and the resulting errors in prototype construction and testing. The simulation model is helpful in detecting faults in a three-phase induction motor using Motor Current Signature Analysis (MCSA).
Keywords: squirrel cage induction motor, pulse width modulation (PWM), fault diagnosis, induction motor
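The MCSA principle behind broken rotor bar detection can be sketched as follows. A broken bar modulates the stator current, producing sidebands at (1 ± 2s)·f_s around the supply frequency f_s, where s is the slip. This hedged sketch (not the paper's MATLAB model) synthesizes such a current and measures the sideband level with the Goertzel algorithm; the slip and sideband amplitudes are illustrative assumptions.

```python
import math

def goertzel_mag(samples, fs, target_hz):
    """Magnitude of one frequency bin via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_hz / fs)
    w = 2 * math.pi * k / n
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return math.sqrt(s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2)

fs, f_supply, slip = 1000.0, 50.0, 0.03
lo, hi = (1 - 2 * slip) * f_supply, (1 + 2 * slip) * f_supply   # 47 Hz, 53 Hz

t = [i / fs for i in range(int(fs) * 10)]   # 10 s of stator current samples
current = [math.cos(2 * math.pi * f_supply * ti)
           + 0.05 * math.cos(2 * math.pi * lo * ti)    # fault sidebands
           + 0.05 * math.cos(2 * math.pi * hi * ti)
           for ti in t]

fund = goertzel_mag(current, fs, f_supply)
side = max(goertzel_mag(current, fs, lo), goertzel_mag(current, fs, hi))
ratio_db = 20 * math.log10(side / fund)
print(f"sideband level: {ratio_db:.1f} dB relative to fundamental")
```

In practice, a sideband level rising above a healthy-machine baseline flags the fault.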
Procedia PDF Downloads 633
6965 Safety of Built Infrastructure: Single Degree of Freedom Approach to Blast Resistant RC Wall Panels
Authors: Muizz Sanni-Anibire
Abstract:
The 21st century has witnessed growing concern for the protection of built facilities against natural and man-made disasters; studies of earthquake-, fire-, and explosion-resistant buildings now dominate the arena. To protect people and facilities from the effects of explosions, reinforced concrete walls have been designed to be blast resistant. Understanding the performance of these walls is a key step in ensuring the safety of built facilities. Blast walls are mostly designed using simple techniques such as the single degree of freedom (SDOF) method, despite the increasing use of multi-degree-of-freedom techniques such as the finite element method. This study is the first stage of continuing research into the safety and reliability of blast walls. It presents the SDOF approach applied to the analysis of a concrete wall panel under three representative bomb scenarios: a motorcycle carrying 50 kg, a car carrying 400 kg, and a van carrying 1,500 kg of TNT-equivalent explosive.
Keywords: blast wall, safety, protection, explosion
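The SDOF idealization can be sketched as an equivalent mass-spring system under a triangular pressure pulse, integrated with the central difference method. This is a hedged sketch of the general technique, not the study's analysis: the mass, stiffness, peak force, and duration below are illustrative, not the panel's design parameters.

```python
def sdof_blast_response(m, k, f_peak, t_d, dt=1e-5, t_end=0.05):
    """Peak displacement of m*x'' + k*x = F(t), with F a triangular pulse."""
    def force(t):
        return f_peak * (1 - t / t_d) if t < t_d else 0.0

    x_prev = 0.0
    # First step from rest: x1 = 0.5 * a0 * dt^2
    x = 0.5 * (force(0.0) / m) * dt ** 2
    x_max, t = x, dt
    while t < t_end:
        a = (force(t) - k * x) / m
        x_next = 2 * x - x_prev + a * dt ** 2   # central difference step
        x_prev, x = x, x_next
        x_max = max(x_max, abs(x))
        t += dt
    return x_max

# Equivalent SDOF of an RC panel (illustrative values): 500 kg, 2e7 N/m,
# 1 MN peak reflected force decaying over 2 ms.
x_peak = sdof_blast_response(m=500.0, k=2.0e7, f_peak=1.0e6, t_d=0.002)
print(f"peak displacement ~ {x_peak * 1000:.1f} mm")
```

Because the pulse duration is much shorter than the natural period here, the response is impulsive and the peak displacement approaches I/(m·ω), where I is the impulse of the pulse.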
Procedia PDF Downloads 263
6964 Building an Opinion Dynamics Model from Experimental Data
Authors: Dino Carpentras, Paul J. Maher, Caoimhe O'Reilly, Michael Quayle
Abstract:
Opinion dynamics is a sub-field of agent-based modeling that focuses on people's opinions and their evolution over time. Despite the rapid increase in the number of publications in this field, it is still not clear how to apply these models to real-world scenarios. Indeed, there is no agreement on how people update their opinions while interacting, and it is not clear whether different topics show the same dynamics (e.g., more polarized topics may behave differently). These problems are mostly due to the lack of experimental validation of the models. Some previous studies started bridging this gap by directly measuring people's opinions before and after an interaction. However, those experiments force people to express their opinion as a number instead of using natural language (which is then, eventually, encoded as a number). This is not how people normally interact, and it may strongly alter the measured dynamics. Another limitation of these studies is that they usually average all topics together, without checking whether different topics show different dynamics. In our work, we collected data from 200 participants on 5 unpolarized topics. Participants expressed their opinions in natural language ("agree" or "disagree"). We also measured the certainty of their answers, expressed as a number between 1 and 10; this value, however, was not shown to other participants, to keep the interaction based on natural language. We then showed each participant the opinion (but not the certainty) of another participant and, after a distraction task, repeated the measurement. To make the data compatible with opinion dynamics models, we multiplied opinion and certainty to obtain a new parameter (here called "continuous opinion") ranging from -10 to +10 (using agree = 1 and disagree = -1). 
We first checked the 5 topics individually, finding that all of them behaved similarly despite having different initial opinion distributions. This suggested that the same model could be applied to different unpolarized topics. We also observed that people tend to maintain similar levels of certainty, even when they change their opinion. This strongly violates what common models suggest, in which a person starting at, for example, +8 will first move towards 0 instead of jumping directly to -8. We also observed social influence: people exposed to "agree" were more likely to move to higher levels of continuous opinion, while people exposed to "disagree" were more likely to move to lower levels. However, the effect of influence was smaller than the effect of random fluctuations. This configuration, too, differs from standard models, where noise, when present, is usually much smaller than the effect of social influence. Starting from this, we built an opinion dynamics model that explains more than 80% of the data variance. The model was also able to show the natural emergence of polarization from unpolarized states. This experimental approach offers a new way to build models grounded in experimental data, and the model offers new insight into the fundamental terms of opinion dynamics models.
Keywords: experimental validation, micro-dynamics rule, opinion dynamics, update rule
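A micro-dynamics rule of the kind described can be sketched as follows. This is a hedged illustration inspired by the reported findings, not the authors' fitted model: continuous opinions in [-10, 10] (sign = agree/disagree, magnitude = certainty) receive a small social-influence pull toward the interlocutor's expressed sign plus a larger random fluctuation, mirroring the noise-dominated regime described above. All coefficients are illustrative assumptions.

```python
import random

def update(opinion, partner_sign, influence=0.3, noise_sd=1.0):
    """One interaction: nudge toward the partner's 'agree'/'disagree' sign."""
    pulled = opinion + influence * partner_sign      # weak social influence
    pulled += random.gauss(0.0, noise_sd)            # dominant random fluctuation
    return max(-10.0, min(10.0, pulled))             # clamp to the scale

random.seed(42)
agents = [random.uniform(-3, 3) for _ in range(500)]  # unpolarized start
for _ in range(200):                                  # random pairwise interactions
    i, j = random.sample(range(len(agents)), 2)
    sign_j = 1.0 if agents[j] >= 0 else -1.0          # only the sign is expressed
    agents[i] = update(agents[i], sign_j)

print(f"mean |opinion| after interactions: "
      f"{sum(abs(a) for a in agents) / len(agents):.2f}")
```

Note the key departure from bounded-confidence-style models: the update depends only on the partner's expressed sign, not on the distance between the two continuous opinions.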
Procedia PDF Downloads 110
6963 Psychometric Validation of Czech Version of Spiritual Needs Assessment for Patients: The First Part of Research
Authors: Lucie Mrackova, Helena Kisvetrova
Abstract:
Spirituality is an integral part of human life. In a secular environment, spiritual needs are often overlooked, especially in acute nursing care. The Spiritual Needs Assessment for Patients (SNAP), which also exists in a Czech version (SNAP-CZ), can be used for objective evaluation. The aim of this study was to measure the psychometric properties of SNAP-CZ and to find correlations between SNAP-CZ and sociodemographic and clinical variables. A cross-sectional study was performed with tools assessing spiritual needs (SNAP-CZ), anxiety (Beck Anxiety Inventory; BAI), depression (Beck Depression Inventory; BDI), pain (Visual Analogue Scale; VAS), self-sufficiency (Barthel Index; BI), and cognitive function (Montreal Cognitive Assessment; MoCA), together with selected sociodemographic data. The psychometric properties of SNAP-CZ were tested using factor analysis, reliability and validity tests, and correlations between the questionnaire and sociodemographic data and clinical variables. Internal consistency was established with Cronbach's alpha for the overall score, the respective domains, and individual items. Reliability was assessed by test-retest using the intraclass correlation coefficient (ICC). Data for the correlation analysis were processed using Pearson's correlation coefficient. The study included 172 trauma patients (mean age = 40.6 ± 12.1 years) who had experienced polytrauma or severe monotrauma. There were 106 (61.6%) male subjects, and 140 (81.4%) respondents identified themselves as non-believers. The full-scale Cronbach's alpha was 0.907. The test-retest ICC for the individual domains ranged from 0.924 to 0.960. Factor analysis resulted in a three-factor solution: psychosocial needs (alpha = 0.788), spiritual needs (alpha = 0.886), and religious needs (alpha = 0.841). Correlation analysis using Pearson's correlation coefficient showed that the domain of psychosocial needs significantly correlated only with gender (r = 0.178, p = 0.020). 
Males had a statistically significantly lower average value in this domain (mean = 12.5) than females (mean = 13.8). The domain of spiritual needs significantly correlated with gender (r = 0.199, p = 0.009), social status (r = 0.156, p = 0.043), faith (r = -0.250, p = 0.001), anxiety (r = 0.194, p = 0.011), and depression (r = 0.155, p = 0.044). The domain of religious needs significantly correlated with age (r = 0.208, p = 0.007), education (r = -0.161, p = 0.035), faith (r = -0.575, p < 0.0001), and depression (r = 0.179, p = 0.019). Overall, the whole SNAP scale significantly correlated with gender (r = 0.219, p = 0.004), social status (r = 0.175, p = 0.023), faith (r = -0.334, p < 0.0001), anxiety (r = 0.177, p = 0.022), and depression (r = 0.173, p = 0.025). The results of this study corroborate the reliability of SNAP-CZ and support its future use in the nursing care of trauma patients in a secular society. Acknowledgment: The study was supported by grant no. IGA_FZV_2020_003.
Keywords: acute nursing care, assessment of spiritual needs, patient, psychometric validation, spirituality
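The internal-consistency statistic used above can be sketched directly from its definition: Cronbach's alpha computed from an item-score matrix. The toy responses below are illustrative, not SNAP-CZ data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items = list of respondents, each a list of item scores."""
    k = len(items[0])                      # number of items

    def var(xs):                           # sample variance (n - 1 denominator)
        mu = sum(xs) / len(xs)
        return sum((x - mu) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Toy data: 6 respondents x 4 items on a 1-5 scale.
scores = [[4, 4, 5, 4],
          [2, 3, 2, 2],
          [5, 5, 4, 5],
          [3, 3, 3, 3],
          [1, 2, 1, 2],
          [4, 5, 4, 4]]
alpha = cronbach_alpha(scores)
print(f"Cronbach's alpha = {alpha:.3f}")
```

Values around 0.9 or above, as reported for the full SNAP-CZ scale, indicate high internal consistency.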
Procedia PDF Downloads 104